This is a transcript of The Resonance from February 2, 2025.
00:00: The Resonance. Post. Post. And the last one. Posting here. And we should have people trickling in.
00:19: Hello, hello. Okay, seems to be...
00:21: Hello.
00:22: Hello. Oh no, I had it on so I was hearing myself. There we go.
00:31: Oh, there we go.
00:34: Xnopyt.
00:35: Xnopyt.
00:35: Xnopyt.
00:36: The answer to that one is a shplblb.
00:40: I'm gonna mark it as answered.
00:43: Shplblb.
00:43: So hello again.
00:45:
00:46: Hello everyone. People should be piling in.
00:51: How's everyone doing today?
00:55: How's Cyro doing?
00:56: Oh, how am I doing?
00:58: Yes.
00:58: Oh, I'm, I'm eating, I am as eating cereal as I'll ever be right now.
01:04: I literally inhaled lunch before this.
01:07: I was like, I was playing a game and I was like, oh, like I should start preparing for a stream, but I'm hungry.
01:13: It's gonna be two hours.
01:15: So I heated lunch and as I'm heating the lunch, I'm prepping stuff and I'm like throwing into my mouth.
01:22: And then like I get on here and my controller was not working.
01:25: So I'm like, I guess I need to restart and like I have to speedrun the setup again.
01:32: I was kind of similar.
01:36: I woke up and I was like, man, I'm tired.
01:39: And then like I, I like took my meds and I like played on my phone for like an hour and then an hour passes.
01:45: And I'm like, oh my God, it's like 30 minutes before the stream.
01:48: And I have to, and I get up and I go brush my teeth and I get my Froot Loops and I'm a tornado of chaos.
01:54: Yeah, I mean, Froot Loops.
01:56: We'll see, we'll see what the chat is going to be, tornado of chaos today.
02:01: Let's see, let's see how many questions we get and how many interesting ones come in.
02:05: I simplified the setup today.
02:07: So like we're right in front of the board.
02:08: So if there's anything, you know, that requires drawing diagrams, it's like right there.
02:14: Yeah, we just, we just magnetized to the board.
02:18: We're there.
02:24: Grant says he's got a silly question.
02:28: Oh, that was the, the Xnopyt thing.
02:31: I don't know how to answer it, because I don't know what it means.
02:35: I mean, we should have a good amount of people in today.
02:37: So hello everyone, welcome to The Resonance.
02:40: I kind of stopped tracking which episode it is.
02:43: I think it's like ten or something, but.
02:45: What it is essentially, it's like my office hours combined with like a podcast.
02:50: So you can ask any question you want about Resonite.
02:54: We'll do the best of our ability to answer those questions.
02:58: Some of them we might just give like a general overview and like, you know, ask you to bring it to other office hours.
03:03: Like, especially for example, if it's like some moderation matter.
03:06: The moderation team, they're holding office hours like an hour and a half before this one.
03:12: So like some of you probably came from there, and if you missed it, you know, there's going to be another one next week.
03:18: But yeah, like whatever you want to ask about Resonite, like whether it's technical, whether it's like, you know, how the platform is going.
03:24: Like what's its history, what's its future, like what are we doing on the team, like anything related to Resonite, like feel free to ask.
03:31: The one thing, make sure you put a question mark at the end of the question.
03:37: That way, like it pops out on our thing and we don't lose the track of it.
03:42: If you do ask some follow-up questions or follow-up clarifications, please include the context in the message because sometimes it takes a bit to get to your follow-up and by the time we forgot what the original question was.
03:56: So with that, we should be ready to start it.
03:59: Also, for some of the questions, we might go a little bit deeper and kind of ramble about them, depending on how time goes.
04:10: I almost forgot, like, we have Cyro, one of our people from the engineering team, here as well, so he's kinda here to help me, like, answer some questions.
04:18: Especially not selected ones.
04:21: I forgot about me too, don't worry.
04:24: I also found that Xnopyt doesn't seem to really mean anything, but has gained the colloquial meaning of to suddenly disintegrate at the thought of something.
04:38: Oh my god. What would be the thought of something that would make you suddenly disintegrate?
04:45: Oh, it's a TOSC.
04:48: I guess, oh, I know what I disintegrate at sometimes. I disintegrate at the thought of ref-hacking when people are like,
04:57: I'm gonna put ref-hacking in this thing and I'm gonna make it public and then I'm gonna share it around and then I'm going to drink your tears when you cry when it breaks.
05:10: That's some... I don't know if it's dark, but that's...
05:18: Y'all better look out.
05:22: Yeah, ref-hacking is one of those things where it's like...
05:27: With Resonite, we have this philosophy that it's a platform for tinkerers.
05:32: So if you want to tinker, then tinker away, play around with things and explore.
05:38: But you should understand the impact of some things. Some features are designed to be supported long term.
05:45: Which means if you use those features for their intended purpose, we give you certain guarantees that your stuff is not gonna break.
05:52: And if it does break, then we're gonna spend some time fixing it.
05:56: But there's other things where a lot of times we'll tell you specifically, don't do this.
06:02: You're kind of exploiting an implementation quirk, or you're trying to get around some things.
06:10: And those bits can change, and often times they will change without warning.
06:13: Because we might change one system and it's influencing how some other system behaves.
06:19: And the outward behavior of the design parts stays the same, but how they work under the hood might change.
06:27: And if you depend on how they work under the hood, your stuff will explode.
06:29: And in those cases we will not necessarily spend the time fixing that, because this is something we don't intend to keep working the way it is.
06:42: And you shouldn't be relying on it.
06:44: So if you want to tinker, play with it, you're free to do so.
06:51: We're not going to forbid it, but at the same time you need to understand the risks that come along with it.
06:57: And we'd also appreciate it if, when you talk about it with other people, you share those risks with them, so they know what to expect.
07:05: Because it's one of those things where it feels like it's fine, and then it breaks, and you have some toolset or workflow that's built upon it, and suddenly that goes poof.
07:16: Because we have other kinds of priorities at the time, we might not make proper replacements for whatever you were using it for, which means you might be left without a solution, or might end up scrambling and so on.
07:27: So as long as you understand that's a risk of what you're doing, I feel it's fine, but the problem is oftentimes people don't talk about the risks of it, or don't really understand them, and then they get upset when stuff breaks.
07:43: So the best thing is to just stay informed about things, like know what good practice is and what practices are to be avoided.
07:53: We do have a section on the wiki that actually lists things to avoid, that specifically calls out some of these things and tells you these are risky things to use, these can explode on you at any moment without a valid replacement for what you were trying to do.
08:09: So just keep it in mind as you play and build systems.
08:14: Yeah, basically if you use something like ref-hacking in your creation and it breaks one day, we're probably not gonna auto-upgrade it for you.
08:23: Yeah, a lot of times we also can't, because when you depend on the specifics of how a certain system works, if we wanted to keep that, we would essentially be locked to how that particular system functions.
08:41: And doing that essentially ties our hands on upgrading those systems and improving them or even optimizing them.
08:49: One actual example I can give is people used to, or maybe they still do, I don't actually know, they used to parse the reference IDs to figure out the user.
09:01: And the way it works is, the reference IDs right now are partitioned so that 8 bits of them are the allocation ID of whichever user.
09:11: And they would parse it out to figure out a user that allocated an item.
09:15: The problem is, that's an implementation detail. The reference IDs, they're meant to be opaque.
09:21: Which means it's just an identifier, but it doesn't necessarily tell you more on itself. Its structure is subject to change.
09:30: So if you build something that assumes those 8 bits are the user, and then we change how they're represented, which we actually will in the future.
09:38: Because it being 8 bits limits it to 256 users, or 255, I forget which one.
09:48: But once we do the big server things, where we can do domains and so on, there might be a lot more users, and the allocation might be more dynamic.
09:56: It might not even have an allocated part of it, to be specifically the user.
10:01: It might do a system where there's an allocation manager, and other things, and it becomes more complicated.
10:07: And suddenly the system you built is not going to work, and it's just going to break.
10:11: One thing we actually did in this particular case, because we saw people want to figure out which user allocated something: we made a node, where you plug in, I think it takes, like, a world element?
10:25: You plug in a world element, and it tells you this is the allocating user.
10:29: What this does is, the node kind of abstracts it away.
10:33: It essentially abstracts away the idea that you want to figure out which user allocated this element.
10:40: So right now, internally, it will actually use the reference IDs because that matches.
10:44: Once we change that implementation, you know, how reference IDs work, there's going to be different ways to figure out who allocated something.
10:52: We can change the internals of the node so it uses the new system, but its outward behavior will not change.
11:00: From your perspective, you still give it a world element and you still get the user who allocated it.
11:05: You don't care what happens inside because we can upgrade inside.
11:10: But if you sort of build a system yourself where you parse it out and figure out the user that way, there's not really a reasonable way for us to upgrade that system for you.
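For illustration, here is a minimal C# sketch of the difference between the two approaches described above. All names in it (FragileGetUserIndex, IWorldElement.AllocatingUser, the 56-bit shift) are hypothetical stand-ins, not actual FrooxEngine API:

```csharp
// Hypothetical sketch -- illustrative names, not actual FrooxEngine API.
static class RefIdExample
{
    // Fragile: assumes the top 8 bits of a reference ID encode the
    // allocating user. That bit layout is an implementation detail and
    // can change at any moment, silently breaking this.
    public static byte FragileGetUserIndex(ulong refId) => (byte)(refId >> 56);

    // Robust: ask an abstraction instead. The engine can later swap the
    // internals (e.g. to an allocation manager) without this call site
    // ever noticing.
    public static IUser RobustGetUser(IWorldElement element) => element.AllocatingUser;
}

interface IUser { string UserName { get; } }

interface IWorldElement
{
    // Hypothetical property mirroring what the "Allocating User" node does.
    IUser AllocatingUser { get; }
}
```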
11:21: And that's kind of the general principle we try to ask people.
11:24: If you're abusing something for something that's not necessarily its intended purpose, it's better to ask for this to be a feature.
11:33: We'll look into it and see.
11:35: Sometimes we will tell you, you can actually already use this system, this is intended for this.
11:41: But if you just try to hack things out and so on, that's where it should give you a pause.
11:55: Sometimes we'll say we actually want to introduce a system for this, this is not going to keep working the way it is.
12:03: Or we tell you, you can feel free to use this system, the system is designed to be supported long term.
12:10: So this gives you an idea of how to make things so they support you long term and they keep working and they don't break on you.
12:21: Yeah, and I won't hold up this topic too much longer, but it's kind of funny to...
12:26: Because there's a similar parallel in C-Sharp with...
12:30: Oh, what was it? It was GetHashCode.
12:32: GetHashCode, yes, that's a good one.
12:36: There's a similar thing in C-Sharp where you can get the hash code of an object or whatever.
12:42: And it's kind of like, at that time, a unique identifier of what it is.
12:53: You can tell.
12:55: In C-Sharp, every object implements GetHashCode, which is a way to obtain a hash,
13:02: which is sort of like a fingerprint of an object.
13:04: It is not guaranteed to be unique, but it's oftentimes used for various collections,
13:09: like dictionaries, hash sets, and so on.
13:12: It's when you want to do a quick identification.
13:15: For example, if you're storing a bunch of objects in a dictionary and you want to have a fast lookup,
13:19: one way to do it is to get a hash, and then you store objects under that hash, and there can be multiple.
13:26: So when you want to query an object based on the key, you get a hash, you find whatever is under that hash,
13:31: and that can be either just one object, in which case your job is done, there can be multiple objects,
13:37: in which case you have to filter them through and figure out which one is the one you actually want,
13:42: because you can have hash collisions.
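For illustration, a small self-contained C# example of the lookup scheme being described here; this is just standard .NET, nothing Resonite-specific:

```csharp
using System;
using System.Collections.Generic;

class HashLookupDemo
{
    static void Main()
    {
        // A Dictionary uses GetHashCode() to pick a bucket quickly,
        // then Equals() to resolve collisions within that bucket.
        var table = new Dictionary<string, int>
        {
            ["test"] = 7,
            ["fruit"] = 42,
        };

        // Fast lookup: hash "fruit", jump to its bucket, compare keys.
        Console.WriteLine(table["fruit"]); // 42

        // Hashes are not guaranteed unique -- two different keys can
        // collide, which is why the key comparison still has to happen.
        Console.WriteLine("test".GetHashCode());
        Console.WriteLine("fruit".GetHashCode());
    }
}
```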
13:46: The problem was that if you for example take a piece of string, like text is your key,
13:54: and you have the word test, and you have the word fruit, say the word test would end up with a hash that's a number 7,
14:04: and then the fruit would end up with the number 42.
14:07: One thing the developers would do is they would build a dictionary with these hashes, and they would save it.
14:16: The problem is the actual number you get, the specific number, is implementation detail,
14:21: and it could change at any moment unexpectedly.
14:24: Which means if you as a developer save some data using those hashes and you assume they never change,
14:30: and then they do change, your stuff will explode, because suddenly the numbers don't match.
14:35: So in order to discourage people from doing this, what the developers of C-Sharp and the Common Language Runtime did,
14:42: they actually made every time you start a C-Sharp application, those hashes are randomized.
14:49: So you will essentially get a different one.
14:52: So on one startup, for the word test, you get 7.
14:55: On another startup, you get 1,735 or something.
14:59: The real numbers are probably going to be more complicated, but that kind of illustrates the point.
15:05: And because they kind of forcefully randomized it every startup,
15:10: if you were to use it, where you save to your disk the pairings of hashes with the objects,
15:18: your application will break immediately on the next startup and you realize,
15:22: oh, I shouldn't have been doing that, this doesn't stay the same,
15:26: I need to find a different mechanism that guarantees that it's persistent over multiple launches and implementation changes.
15:34: If they had just kept it as it was, the problem is a lot of developers would not realize until they upgraded .NET Framework,
15:41: to a new major version, and then the application would break.
15:45: And at that point, it's much harder to figure out why did it even break,
15:48: because you might have implemented a thing a year or two ago.
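You can see this for yourself, along with the kind of stable alternative you would reach for when a value has to survive restarts (SHA-256 below is just one option, not a recommendation from the stream):

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

class StableHashDemo
{
    static void Main()
    {
        // Run this twice: on modern .NET this value differs between runs,
        // because string hash codes are randomized per process.
        Console.WriteLine("test".GetHashCode());

        // If a hash has to be stable across launches (and machines),
        // compute one explicitly from the string's bytes instead.
        byte[] stable = SHA256.HashData(Encoding.UTF8.GetBytes("test"));
        Console.WriteLine(Convert.ToHexString(stable)); // same every run
    }
}
```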
15:54: And we've kind of done something similar with the reference IDs, we've kind of scrambled them a bit, and so on, to make it harder.
16:01: Oh, your microphone died, I don't know.
16:14: Technical difficulties, people, we'll wait for Froox to get his mic back.
16:18: Hello, hello. I don't know if my microphone's dying.
16:20: It was on the charger right before this.
16:23: But yeah, we kind of scrambled them, we kind of make it harder to use them, just so when you try to do stuff,
16:30: it gives you pause and you're like, is this something I'm supposed to be doing?
16:34: It's sort of there to tell you, this is maybe something that's hacky, and you shouldn't be relying on this.
16:46: Yeah, that's kind of why some of the ProtoFlux or older LogiX colors and stuff used to get screwed up.
16:55: Because some of the type colors used to rely on the hash code.
16:59:
16:59: If you were on a headless, slots would turn from green; they're pink now.
17:04: So like,
17:05: I mean, we can still use the hashes for some of them.
17:09: It's just like ones that like where it doesn't matter specifically which color it is.
17:14: The colors are there more just to like, so it's easier to tell just like at first glance.
17:20: But also the colors get kind of limited, because there's thousands or dozens of thousands of types, and you're not going to be able to distinguish that many colors.
17:29: So like one thing we do with that one is like some of the common types, like we assign them explicit colors.
17:34: So they're very vivid and very easy to distinguish.
17:36: And for all the rest, we just use this system that scrambles the colors and gives you different colors, just to make it easier to distinguish them.
17:46: But you should still check which type it is.
17:54: So it's kind of funny, because we literally went into our first ramble and the question was Xnopyt.
18:01: I just don't know how to pronounce it.
18:03: So congratulations, Grant, your meme question actually ended up in the longer ramble.
18:09: Yeah.
18:10: It's, it's, we're like peak efficiency today.
18:14: Yeah, we did actually also get some other questions coming in.
18:18:
18:19: There's some questions piling up, but like I'm kind of like letting them pile up for a bit because I feel, I feel this was an important PSA.
18:25: Plus it's funny, you know, given the original question.
18:28: But yes.
18:30: In TLDR, feel free to tinker around, but please be responsible with it, and make sure you understand which systems are designed for which purposes, which ones are designed to have long-term support, and which things will break on you.
18:45: So that's pretty much it.
18:46: And we should get to the questions.
18:50: So the first one, the first actual question, is from JackTheFoxOtter.
18:55: I also have a good first question.
18:58: Aha, trick's on you, the actual first question was Xnopyt.
19:04: It probably wasn't intended to be, but it turned out that way.
19:07: But anyways, do you notice MMC in the cloud statistics?
19:11: Oh, there was actually a thing like during the launch, like, let me see if you actually go to metrics.
19:22: Metrics.resonite.com.
19:25: There was an increase of like users around like where the ceremony was happening.
19:32: Let me actually grab this.
19:34: This is kind of a janky way, but I can just screenshot my web page on my dash and just bring it here.
19:42: So you see like around here, like when it sharply increases, that's like, you know, this is around like where the opening ceremony started.
19:54: And it kind of stayed a little bit and I think it was like the normal weekend thing.
19:58: So there are like noticeable things.
20:01: This is like accessible to everyone.
20:03: So if you go through the URL, you know, you can like play around with these graphs yourself and you can check for yourself.
20:10: I haven't checked like the other ones like super much.
20:14: If you zoom it out a bit more, it's not too, too crazy,
20:20: like out of what we normally get on weekends.
20:22: I'd have to poke around some more, but let me see if I can do the last 30 days.
20:32: There is a bit of a noticeable bump.
20:35: Let me see if I can do this one, just screenshot this.
20:41: Come on.
20:43: Come on, there we go.
20:47: So you can kind of see, like these are like, these are like usually weekends.
20:52: That's like where we get most people.
20:53: And this is, you know, the MMC opening.
20:55: So there's definitely noticeable bump in the users.
21:00: So yeah, this is noticeable.
21:04: So go around, go to metrics.resonite.com.
21:07: Maybe get, like, super fancy and play around with the graphs.
21:12: It's kind of fun, you know, looking for insights.
21:16: Next question is, do you want to do this one?
21:18: Yeah, so GameTheCopDog asks, question for Cyro.
21:22: How did you get around the issues preventing AnyCPU compilation?
21:26: In particular, I mean, I assume you mean for the headless, just for the rest of the people watching.
21:34: It actually wasn't that hard.
21:37: I might've just been screwing something up the first time.
21:42: Cause I, a while back, I ended up refactoring like our whole build process
21:46: to make it more like streamlined and more buildable on like other platforms like Linux and stuff.
21:53: But in doing so, since we switched to the new like project style,
21:58: it kind of messed up some of the dependencies for like the headless.
22:02: And, um, I didn't know how to fix it in that moment.
22:09: And so I was like, okay, I'm just going to make this compile for win-x64,
22:14: just so that it like retains the old behavior.
22:17: Um, so that at least we have a working headless.
22:20: Um, but yeah, I think I might've just been screwing something up there
22:24: because when I changed it to compile for any CPU,
22:28: it seems to work just fine now.
22:32: And from what I'm seeing in the logging, the headless works fine.
22:35: You know, it doesn't crash.
22:37: The only thing I really needed to do
22:42: was just include the Steam API native library.
22:46: And that was it.
22:47: And like the runtimes folder, because the headless can actually,
22:50: since the headless runs on .NET, like the proper modern .NET,
22:57: it can actually choose like which native libraries it needs to load
23:00: for the platform it's running on.
23:02: So if it's running on like ARM, it'll load all the ARM libraries.
23:06: If it's running on, you know, 64-bit machine, it'll load the 64-bit libraries.
23:10: If it's specifically Linux, it'll load the Linux, you know, 64-bit libraries
23:14: and stuff like that.
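As a rough illustration of what Cyro is describing: the `runtimes` folder layout is a standard .NET convention, while the exact library names below are just examples:

```csharp
using System;
using System.Runtime.InteropServices;

class RidDemo
{
    static void Main()
    {
        // Modern .NET identifies the current platform with a runtime
        // identifier (RID), e.g. "win-x64", "linux-x64", "linux-arm64".
        Console.WriteLine(RuntimeInformation.RuntimeIdentifier);

        // Native dependencies shipped under the conventional layout, e.g.
        //   runtimes/win-x64/native/steam_api64.dll
        //   runtimes/linux-x64/native/libsteam_api.so
        //   runtimes/linux-arm64/native/libsteam_api.so
        // are picked up automatically for whichever RID the app runs on.
    }
}
```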
23:16: But yeah, since it seems to work fine now,
23:23: we can compile it with that in mind.
23:27: And it should be much easier to make the headless run on different platforms
23:35: just by giving it like the native libraries it needs.
23:38: And especially if we get like ARM, you know, continuous integration,
23:41: we could actually just like, you know, officially support ARM at that point.
23:46: He's been making a bunch of PRs recently.
23:48: I haven't been able to look at them in detail yet,
23:50: but it feels like they're close.
23:56: So shout-out to J4 for that, too.
23:58: Yeah, big, big, big shout-out to J4.
24:01: He's been doing some great work with doing the continuous integration on GitHub
24:07: and making all the libraries build by themselves
24:10: once you push updates to them and whatnot.
24:13: And it sure saves us a lot of headaches, so thank you so much,
24:19: because I, oh my god, get reactions from TDU sometimes.
24:23: See, this is one of the reasons I don't like doing it either.
24:27: You just gotta mess around with it, and I'm like,
24:30: no, no, no, we have a Cyro.
24:31: I'm like, did you come with us?
24:35: Yeah, just give, see, I'm here to pick up all of the,
24:39: I'm here to pick up all the slop that Froox doesn't want to pick up.
24:43: And then, you know, I'm like, okay, Froox, I'll do this thing in like two days, no problem.
24:49: Two weeks later.
24:53: I did it, Froox.
24:55: It's done.
24:56: I'm like crawling to him, you know, with like a polished,
25:00: I'm crawling to him with like a polished marble in my hand,
25:03: but like the rest of my body, I'm like crawling through the mud and the dirt,
25:07: and no, it's actually, it's fine, it's fine.
25:09: I mean, I'm actually like, it's kind of funny because on my end it's like easy,
25:13: like kind of better because like, I'm like, I give you a task and you can focus on it for like a week or two,
25:18: and then like, I don't have to worry about like, you know, dealing with some PRs and such for a bit,
25:22: I can focus on the other stuff.
25:24: So like, there's no complaints for me, from me.
25:28: It's also like, you know, the two weeks that I would have maybe had to spend on it.
25:31: So it works out fine.
25:34: Yes.
25:37: Anyways, yeah, I hope that answers your question, game.
25:42: It's pretty exciting.
25:43: Like, I know people have been thinking about it, because the Oracle servers are on ARM.
25:47: Actually, it's kind of fun, I think, because, like,
25:50: I've been looking for an excuse to get a Raspberry Pi because I wanted to mess with it,
25:54: and I'm like, maybe once we have the headless... but I didn't have a super good reason.
25:57: But now I can maybe, you know, get one and see what it does and just, you know, have fun with it.
26:05: The headless is running on like .NET 8.
26:08: It runs so much better.
26:10: No, it's 9.
26:13: Not .NET 8. It's .NET 9. No, I'm out of date. I'm crusty.
26:18: Crusty.
26:19: It runs so much better.
26:23: You probably could run, you know, a little server for like 10 people off of a Pi.
26:28:
26:30: Anyways, let's move to the other questions.
26:32: No, there's a bunch. We should get to the questions.
26:35: Oh, yes, yes.
26:36: We've got plenty of time, so we'll see.
26:40: I was a little bit worried there were going to be not too many questions, because it was quiet at the beginning.
26:43: You know, there's a bunch and we're already getting into rambles.
26:47: Anyways, Marcia Kanark is asking,
26:52: Hello everyone. I've been exploring Resonite and I'm loving how great it is so far.
26:57: It's quite a lot to get through for a start, but it's all right.
26:59: I'd like to ask a high-level question.
27:03: Why not build from already well-known other engines, Unreal, Blender, Eevee, Cycles, etc.?
27:08: Easier to implement, easier to license, easier to have allies in development.
27:13: So there's actually multiple things in this question.
27:17: But first, thank you, I'm glad you enjoyed Resonite.
27:20: It can definitely be quite overwhelming at first, so we generally just advise people to take it slow.
27:26: There's a lot in here, so it's natural to take
27:32: your time until you absorb everything, and there might even be parts you're not super interested in.
27:37: I know there's people who've been here for over a year
27:43: and they're still discovering new stuff, so that's nothing bad.
27:50: For the question of the renderer, there are answers that are more specific to each one.
27:55: For example, for Unreal, we cannot use Unreal due to licensing, even if we wanted to.
28:02: Unreal doesn't allow you to build any applications that give you editor functionality in the game.
28:08: That's against their license, which means we cannot just use it based on that.
28:14: Blender is licensed as GPL, which is not compatible with our code and our own dependencies,
28:21: which means we also cannot use it based on that.
28:25: Cycles, specifically, is not really a real-time renderer.
28:29: It's more for offline rendering of a scene, and it can take anywhere from a few seconds to several minutes, so it's not really suitable.
28:40: There's a more general level though, if you just consider other engines, because there's other engines like Godot,
28:45: and we had conversations, like, would we use Godot, and so on.
28:49: We've kind of started using Unity as our rendering engine.
28:53: And over the years, we just kind of ran into so many issues where we don't really have control over things,
29:01: because Unity is designed a certain way, and oftentimes that doesn't mesh with the things we need.
29:09: Because Resonite, and specifically FrooxEngine, they're a little bit atypical compared to other games,
29:15: where FrooxEngine, it actually is like a game engine in itself,
29:18: it just relies on some functionality to be provided by other engines.
29:26: But for the most part, the stuff that Unity is doing, we're already doing with our own engine,
29:30: in a sort of different way, which allows us to do all the cool things that Resonite does.
29:37: All the interactivity, being able to edit anything, the implicit networking and so on,
29:43: that's given by the design of our own engine.
29:45: So, we've already got most of the engine as our own,
29:52: and what happens with the integration with Unity is that some of the bits get duplicated,
29:57: like the same work being done twice, by both FrooxEngine and Unity,
30:01: which decreases some efficiency.
30:05: There's some parts where we kind of bump into how Unity is designed,
30:10: because we sort of end up with this whole game engine within a game engine.
30:15: And there's just all kinds of trouble that comes from it.
30:19: It helps us a lot, like I don't think if we didn't do that,
30:23: we probably wouldn't be able to be here right now,
30:26: because it gave us like a head start,
30:29: but we do want to kind of replace it with our own,
30:32: because we've kind of built out like, you know, majority of FrooxEngine,
30:36: and the renderer is like one of the last big parts that like we kind of need to replace.
30:40: And by having an engine that's designed more for our needs,
30:45: we can make sure it fits really well with like what we need it to do.
30:51: versus relying on a third-party engine
30:55: that's not necessarily designed for that, you know,
30:57: and it might also come with lots of, you know, other bits
31:01: that we don't care about,
31:03: that are going to be slowing things down and bloating things up.
31:06: So it's the most efficient kind of approach,
31:08: it's the one that gives us most control.
31:12: And also, you know, it makes things easier, and
31:15: it solves all the licensing issues as well.
31:18: However, Sauce is actually,
31:21: it's not an engine that's made completely from scratch,
31:24: it's being built on an existing project called Bevy,
31:28: which is also a game engine, but also comes with its own renderer.
31:31: So we're essentially, you know, utilizing,
31:35: and wrapping like a third-party renderer.
31:38: And, you know, Geenz and other people
31:41: have been kind of adding more bits to it to bring it to feature parity.
31:45: So it's still kind of like, you know, using third-party project,
31:48: but like one where we can adapt it like, you know, pretty well,
31:51: you know, to our needs and use it sort of like a building block.
31:55: Before that, like, you know, Sauce was using the A-Forge framework,
31:59: which is also like, it's not an engine,
32:01: but it's sort of like a set of building blocks for building your own render engine.
32:05: So it's not like 100% made from scratch,
32:08: but it's using something where, also,
32:12: we're not just utilizing, you know, an entire fully fledged game engine,
32:16: because we already kind of have, you know, most parts of it.
32:19: So I hope that kind of, you know, answers the question.
32:24: The next one is Grand Decay. Quick question while I'm here.
32:27: Is there any way to get the components under a known slot,
32:30: and the values of these components,
32:32: like slot fields and reference fields, in a read and/or write way?
32:38: So like you want like a component access,
32:39: which is something we do want to provide eventually with like ProtoFlux.
32:42: So we can enumerate the components and access the fields.
32:45: Right now there's not really an official way to do it.
32:48: You'd have to like, you know, preset them,
32:50: set them like in advance with like dynamic variables,
32:53: you know, and stuff like that.
32:58: The next question.
33:02: JackTheFoxOtter.
33:03: Here's something I was discussing with some friends a bit ago.
33:06: Do you think source engine style 3D skybox
33:08: could have a place in Resonite?
33:10: There's a massive benefit to those:
33:12: being able to create objects that are very far away
33:14: without being affected by far clipping issues.
33:16: I mean, it is a system we could, you know, look into.
33:19: Like there's always like approaches and mechanisms
33:22: from other engines that, you know, like we might look into adapting.
33:27: So I would say so.
33:29: Like I would say, you know, it's a good idea to create
33:31: a GitHub issue about it and, you know,
33:32: we can like discuss it a little bit more, see what the benefits are,
33:36: how complicated it would be, and maybe, you know,
33:39: if it doesn't mesh with our approach, we'll see,
33:41: maybe there's a different way to do it.
33:43: But there's definitely something, you know, we can think about.
33:51: Next question is from ShadowX.
33:53: Now that MMC has begun, will you wait until the end of February
33:55: to release Gaussian splatting support?
33:59: You said that last time.
34:01: I don't think I said that.
34:02: I was waiting because I was traveling.
34:08: I was, like, at the sea, so I didn't want to release it,
34:10: you know, while I was traveling, but now I'm back,
34:12: so like I do want to release it pretty much as soon as I can
34:15: since it's kind of been sitting in a branch and like
34:18: there's only a few things to fix up, and then I need to
34:20: optimize the in-memory, you know, compression,
34:24: but I'm not really planning to wait until the end of February.
34:29: Like I don't really see much reason for that.
34:34: But yeah, I did want to release it over this past week,
34:38: you know, the one that's just over, but there are some other things
34:41: that kind of ended up popping up, so it kind of slid
34:44: in priority, because I needed to deal with those,
34:48: but hopefully next week.
34:53: The next question is from Bakedby.
34:55: I've been wondering, considering the recent implementation
34:57: of PhotonDust to replace Unity's built-in particle system,
34:59: is the eventual plan for Resonite to fully transition away
35:02: from using Unity as its main engine and instead run everything
35:04: on a completely custom-built engine for the sake of performance?
35:08: Yes, that's pretty much like something that we've been kind of working on
35:12: because like I mentioned earlier in the stream,
35:16: Resonite already, like, the majority of it is already a custom engine.
35:22: Unity has been handling mostly like the rendering part
35:25: plus, you know, a few other bits.
35:28: There have been, like, two things essentially, you know,
35:32: still kind of interweaved with Unity,
35:36: which is the particle system, which is now, you know, able to be removed,
35:41: and the audio system, which is going to be reworked next,
35:43: so we can separate out Froox Engine into its own process,
35:46: run it with .NET 9, which runs way faster than Mono does,
35:50: which is what Unity uses, and then, you know, use Unity as essentially just a renderer.
35:56: Once it's done, we also eventually replace Unity as the renderer for Sauce,
36:00: and we're going to have like, you know, our own kind of custom solution.
36:04: We kind of delved into this topic in one of the previous episodes.
36:08: If you go to our official YouTube channel,
36:11: there's a video on how the performance, you know,
36:14: upgrade is like essentially going to work,
36:16: where I, you know, do a bunch of diagrams and so on,
36:19: to visualize, you know, how is the change, like, you know,
36:23: how is the performance upgrade going to work.
36:25: So if you're kind of interested more in this topic,
36:27: I recommend checking that video out.
36:29: It goes into a lot more detail.
36:31: But yes, the gist of it is like, you know,
36:33: we're slowly, piece by piece, moving away from Unity.
36:36: That's going to help a lot with performance when we split it into a separate process,
36:40: and eventually we just snip Unity out and have our own renderer.
36:46: Next question is from Dusty Sprinkles.
36:49: New Josh question.
36:50: What is the rationale behind dynamic variables' strict naming requirements?
36:53: Disallowing slash abuse would make sense, but other than that, why anything else?
36:58: One of the main reasons for that is because it kind of gives us space for expansions.
37:05: So for example, say we want to add new features, you know,
37:07: say like, for example, wildcards.
37:11: I'm not saying we will add wildcards, it's just an example,
37:13: but say you want to do some more complex mapping,
37:17: we'll need symbols to use for that.
37:22: So by not allowing you to use any of them,
37:25: we can guarantee that the names are not going to contain them.
37:30: If we just allowed you to use absolutely any symbols for the dynamic variables,
37:36: what would happen is then we would have no symbols left for the syntax
37:41: because we wouldn't know if anybody's using them or not,
37:44: because if you were just using some random symbol and now we decided,
37:47: OK, the symbol actually has this function, it's some kind of syntax,
37:52: now your stuff breaks.
37:54: So it's pretty much giving us the space for potential future upgrades.
38:03: It's also so we don't have to deal with some of the complexities of Unicode within those names and so on.
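A hypothetical sketch of what a whitelist-style name check can look like, and why it leaves headroom. The exact allowed character set below is invented for illustration and is not Resonite's actual rule:

```csharp
using System.Linq;

static class DynVarNameCheck
{
    // Hypothetical whitelist: letters, digits, spaces, underscores, and '/'
    // as the space/name separator. Everything else is rejected, so symbols
    // like '*' or '?' stay free for possible future syntax (wildcards,
    // mappings, ...) without breaking any existing names.
    public static bool IsValid(string name) =>
        !string.IsNullOrEmpty(name) &&
        name.All(c => char.IsLetterOrDigit(c) || c == ' ' || c == '_' || c == '/');
}
```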
38:11: Now we're running out of questions again.
38:15: Next question is from IamBadAtNamesK.
38:19: I am currently working on setting up dozens of servers so that I may host dozens of sessions
38:23: and my friend is worried that hosting tons of worlds will be a burden on Resonite's servers.
38:27: Kind of like DDoSing.
38:28: Would it be a significant burden or would it be alright for me to move on with my project?
38:32: I don't really see it being a super big burden, the servers are kind of scalable.
38:38: It kind of depends on how many and what's happening on them,
38:40: but generally I don't think that should be a big of an issue.
38:43: There's already, like, 100 headlesses running at any given time,
38:51: so it's probably not going to be super big of an issue,
38:54: especially if there's not much happening on those servers as well,
38:57: because they do communicate with the cloud, for example the session information and so on,
39:01: but it also depends on the level of activity on those servers,
39:04: so if there's not much happening, it's going to be very low on the amount of traffic it causes.
39:12: And most of the actual heavy traffic that's happening between your server that you set up
39:18: and whoever is connected to it, so I think it should be good.
39:24: You'd probably need a lot more than that to cause problems.
39:29: The next question is from Eriang.
39:31: What is the rationale for disallo-
39:33: That's the same question.
39:35: What is the rationale for disallowing symbols other than slash in dynamic variable names?
39:39: We just answered this one earlier, so hopefully you got the answer there.
39:46: The next question is Ozzy is asking,
39:49: I got perhaps a bit of a sensitive question.
39:51: In past MMCs, I do know you are a bit careful about possibly breaking updates.
39:55: Given the next focus will be the audio rework, do you think there will be a delay to release it till after this month?
40:00: Assuming it could possibly be done this month.
40:04: We'll more than likely run a pre-release on it.
40:08: Once it's ready to replace things, we'll do a pre-release.
40:13: Since it's a big enough change, you know how that runs, we'll ask people to test if things are fine.
40:20: If we see there's not really too many issues,
40:23: we might just release it to the main build.
40:26: It also depends how it's going to work out timeline-wise.
40:32: If we would have to do the swap near the end of the month when everybody is finishing their project,
40:39: we might delay it, in which case it doesn't matter as much because it's not delaying it that much.
40:50: If it's sooner, we might be able to do the swap then, so we'll play it by ear.
40:53: But if we feel this is going to cause issues with people's projects, we're going to delay it.
41:00: If we feel it's not, and we're informed enough from running a pre-release
41:06: and getting enough feedback and enough people to test it, we might release it.
41:11: So we'll see how it ends up working out.
41:13: I've only been doing a little bit of research into that, so I don't know how long it's going to take.
41:23: So we'll see. I don't have a timeline for it right now.
41:27: We'll do what we can to avoid any major issues, but it's going to depend on how things work out.
41:33: Whether it releases or not.
41:37: NukiKoon is asking, but we can still use B-thing with a dot, right?
41:44: I don't know what that means.
41:47: Can I use that as a symbol in my dynamic variables' names?
41:50: Oh, then you can use it, if dots are allowed to be...
41:55: It essentially uses a whitelist system.
41:58: So the way it works, it looks for characters it specifically allows.
42:02: If it sees anything else, it's just going to be like, nope, this is the wrong name.
42:06: So if you can put a symbol in there, you can use it.
42:13: The next question is from Luca07.
42:15: Did the legacy particle system use Unity colliders for physics?
42:19: If so, does that mean moving into PhotonDust, are we completely free from Unity's physics engine?
42:23: No, it actually didn't use physics.
42:24: And it was like one of the kind of...
42:26: Well, it didn't use Unity's physics, I mean.
42:28: It was like one of the kind of more complicated bits,
42:30: and like one of the bits that made it like a weird hybrid between FrooxEngine and Unity.
42:36: Because we are not using Unity's physics engine at all.
42:38: We're using the Bepu physics engine, which is integrated directly with FrooxEngine.
42:43: The problem is that Unity has no idea that physics engine exists.
42:48: So what we had to do is this:
42:50: whenever Unity would run its own simulation of all the particles,
42:54: for any particle systems that had collisions enabled,
42:57: we would actually read out the simulation results,
43:01: then, you know, check how the particles move,
43:04: or where they want to move,
43:06: and resolve collisions, you know,
43:09: like raycast against Bepu physics,
43:13: update their velocities,
43:15: and, you know, write it back into the Unity system.
43:19: And what this did, you know, is it
43:23: added, you know, extra steps,
43:25: which decreased efficiency,
43:28: because, you know, we have to like read Unity's data out,
43:32: then we have to like, you know, do sort of our own simulation mini-step almost,
43:37: like, you know, where we kind of are like,
43:38: okay, this particle has moved like this way,
43:40: figure out like where it should be,
43:42: and write everything back.
43:46: With PhotonDust, it just resolves the collisions
43:49: as a normal part of our simulation,
43:51: so it's a little more efficient,
43:52: we have a little more control,
43:54: and it's less kind of nasty,
43:55: because before, we had a thing that kind of
43:58: plugged into, you know,
44:00: Unity's update pipeline to make sure the collisions got resolved at the right time.
44:09: So, yeah,
44:11: we pretty much
44:13: haven't been using Unity's physics engine at all.
44:16: I think it's probably...
44:17: I don't actually know if it's disabled,
44:18: or if we can disable it completely,
44:20: but we've been using...
44:22: we haven't been using any parts of it.
44:24: And even though the old system was doing collisions with Bepu too,
44:27: it was a little more convoluted
44:30: than it is with PhotonDust.
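Very roughly, the old hybrid loop had the following shape. Every type and method name here is a hypothetical stand-in; it just sketches the read-out/resolve/write-back dance being described:

```csharp
using System;
using System.Numerics;

// All names below are hypothetical stand-ins, sketching the shape of the
// old FrooxEngine/Unity particle-collision hybrid described above.
struct Particle { public Vector3 Position, Velocity; public float Bounciness; }

interface IUnityParticles          // stand-in for reading/writing Unity's sim
{
    Particle[] ReadParticles();
    void WriteParticles(Particle[] particles);
}

interface IBepuScene               // stand-in for the Bepu physics world
{
    bool Raycast(Vector3 from, Vector3 to, out Vector3 point, out Vector3 normal);
}

static class LegacyParticleCollisions
{
    public static void Resolve(IUnityParticles unity, IBepuScene physics, float dt)
    {
        // 1. Read Unity's simulation results out into our own buffer.
        Particle[] particles = unity.ReadParticles();

        for (int i = 0; i < particles.Length; i++)
        {
            ref Particle p = ref particles[i];

            // 2. Mini simulation step: where does this particle want to move?
            Vector3 target = p.Position + p.Velocity * dt;

            // 3. Resolve the collision against Bepu (Unity never sees it).
            if (physics.Raycast(p.Position, target, out var point, out var normal))
            {
                p.Position = point;
                p.Velocity = Vector3.Reflect(p.Velocity, normal) * p.Bounciness;
            }
        }

        // 4. Write the corrected positions/velocities back into Unity.
        unity.WriteParticles(particles);
    }
}
```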
44:34: Let's just check in on time.
44:36: Oh, still got a bit of time.
44:38: JackTheFoxOtter is asking,
44:40: a bit of a curious question,
44:42: a bit of a curious architecture question I've had:
44:45: are you planning to introduce a system that will allow local hierarchies on the world data model at some point?
44:50: Right now if you, for example, have a heads up display for a certain user,
44:54: it still needs to exist on the data model for all users,
44:56: you can drive the enabled state local already,
44:58: but it doesn't remove the network overhead for something others will never see.
45:03: So, internally,
45:05: FrooxEngine already supports this.
45:07: It's used for a lot of things that you might not even realize,
45:09: because, for example,
45:12: with UIX,
45:13: what actually happens is it creates local slots,
45:17: and it sets up, you know, mesh renderers,
45:18: and, you know, asset providers,
45:21: and all the things on those that are local to you,
45:24: because the components that are within the data model,
45:30: they are synchronized,
45:35: and those components fully define what the state should be.
45:41: Then the local components they create,
45:47: all the materials and everything,
45:50: those are essentially there to implement that shared state.
45:54: And that doesn't need to be synchronized,
45:55: because everybody has the same synchronized state,
45:58: you know, for example, the UIX canvas and all of its structure,
46:00: so the render components can be pure local,
46:03: because they are pretty much deterministically computed from the shared state.
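Loosely sketched, with invented names, the point is just that the local side is a pure function of the synced side, so it never has to go over the network:

```csharp
// Hypothetical sketch of the shared/local split -- all names invented.
class SyncedCanvas            // part of the shared data model, synchronized
{
    public float Width, Height;   // fully defines what should be rendered
    public string Text = "";
}

class LocalCanvasRenderer     // local-only; each client builds its own copy
{
    public int GlyphCount;

    public static LocalCanvasRenderer BuildFrom(SyncedCanvas canvas)
    {
        // Deterministically derived from the synced state: every client
        // runs the same computation on the same inputs, so the resulting
        // meshes/materials never need to be synchronized themselves.
        return new LocalCanvasRenderer { GlyphCount = canvas.Text.Length };
    }
}
```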
46:09: It's the similar thing with particle systems,
46:11: because what PhotonDust does,
46:15: the components actually set up local slots,
46:18: which have renders on them,
46:20: the actual components that are responsible for rendering out the particle buffers and so on.
46:27: So there's a number of things that kind of use it.
46:31: Even things like if you, for example, override people's name tags.
46:37: Let me just see if I can do that.
46:39: So we have these name tags, and these are actually in the world.
46:43: If you switch this, if I switch this to custom name tags, these are actually local.
46:50: You will not find these in the Inspector, because Inspector doesn't show local slots.
46:58: But this is technically an injected local slot, under whatever the hierarchy is.
47:05: So this allows us to show name tags that the user pretty much has no control over.
47:13: So it's kind of used to implement some features where we want to give a local control.
47:19: It creates some oddities, because for anything that is local, it cannot be referenced by the data model,
47:28: or by the user. So that kind of creates some kind of complexities.
47:32: The systems that are local, they can reference things that are in the data model, but not the other way around.
47:39: And that kind of creates a little bit of complexities for interaction.
47:44: We would want to provide systems to work with this.
47:49: The problem is, we probably need to introduce a bunch more generalized mechanisms to be able to utilize this properly.
47:56: One use, for example, might be hosting a server for a game.
48:02: And you want to have a state of the game and logic that you want to keep hidden from everyone in the session,
48:07: you can have that part be part of the local hierarchy.
48:10: That way it doesn't get synchronized, but it's still affecting the synchronized data model.
48:15: So that would be a mechanism where you have something hidden, and it never gets on the other users.
48:23: But we don't have the mechanisms to do that yet.
48:27: For stuff like heads-up display, it might be a little bit trickier,
48:34: because some things, even if it's something you can potentially see,
48:38: we might want to keep them part of the data model, because there might be other interactions with the system.
48:43: Maybe there's something on the avatar that wants to reference that, and so on.
48:46: So instead, in those cases, what we want to do is introduce things like the Cascading Asset Dependency system,
48:54: which will allow us to check that this is not enabled for this user,
48:58: so we don't need to load and update any assets within this hierarchy until it gets enabled.
49:03: Because one of the things you could do is say you want somebody's help.
49:09: Maybe you need to fix up something in your HUD, and you can unhide it,
49:14: and you can start messing with it, and other people can help you with it, and so on.
49:17: And if it was pure local, you would lose a lot of that.
49:21: So the way it's probably going to work is you're going to work on things that's part of the Shared Data Model,
49:25: and then you can decide, I'm just going to move this to the local one,
49:29: I just don't want this to be part of the Shared Data Model,
49:33: but then also limit the level of interaction you can have with it.
49:36: Because even something like selecting local stuff with the Developer Tooltip,
49:43: you can't do that, because the Developer Tooltip needs to reference it.
49:49: And it cannot reference it because you cannot reference local stuff from the Shared Data Model stuff.
49:57: So that stuff just wouldn't work.
49:58: But there's probably going to be mechanisms to do that kind of thing in the future.
50:03: We just need to formalize it and design good ones that allow for good workflows with this kind of thing.
50:14: The next questions...
50:17: Marcia Kanark is asking,
50:19: I recall you've mentioned that UDIMs won't be supported anytime soon.
50:23: Could you expand, if you have extra time: if a limited integration of, say, a model with up to 10 UDIMs
50:29: and only a single standard shader could be supported sooner, or would it just be double the work down the line?
50:36: Yeah, it's pretty much like...
50:37: Right now we're in a weird state where we want to move away from Unity and that's work being put into Sauce.
50:44: Which means everything, every shader that's going to be added needs to be re-implemented later.
50:50: And usually the limited integration, it just makes things more complicated because
50:56: when you do limited implementation, you essentially cut corners.
50:59: And when you cut corners, you kind of sort of bake things into how everything else works.
51:08: And when you do that, it actually makes it harder to move it to another system, because
51:13: instead of making a more generalized implementation, you've made it more baked in
51:20: with whatever you're using at the moment.
51:22: Which means if you change what you're using, like which renderer you're using,
51:26: that actually makes it harder to re-implement.
51:32: I don't think this is likely to happen.
51:35: Maybe if there's a lot of interest in it, like if there's a GitHub issue,
51:40: we get lots of upvotes, there's maybe some business use cases,
51:44: we might decide to eat that cost of implementing it now and do it.
51:51: But it depends. There would need to be significant interest in it.
51:54: Otherwise, we're probably just going to delay it until later.
52:01: And the other key word there is extra time. We never have extra time.
52:09: It's already like, often times, at least for me, I end up wishing,
52:15: I wish I had more normal time to work on the things I'm supposed to be working
52:21: on because there's things that kind of inject themselves in.
52:26: And I'm like, I need to do this thing that's urgent so I can work on this other thing that's urgent
52:32: so I can work on this thing that I was supposed to be working on in the first place,
52:34: because now a bunch of them have kind of stacked up instead.
52:38: So, I think, I mean, if there's not a GitHub issue, I recommend making one
52:44: seeing if there's a bunch of interest in it, but right now I think it's not super likely.
52:51: At least not until we move to the new engine.
52:55: Because, well, there's actually another aspect of this.
53:00: One of the reasons we want to move to a custom engine is because it allows for making custom shaders.
53:10: And with custom shaders, what that essentially does... instead of us having to, like,
53:15: say people from the community request 100 different shaders that they want for different things.
53:21: We could either spend time implementing every single one of them ourselves.
53:28: And when we implement them, we need to support them and we need to like,
53:31: if you need changes, or if things are not implemented the way you need them to work,
53:35: that's additional time we need to spend with each one of them.
53:39: And we will not have time for that many shaders.
53:43: But if we invest our time into making a solution that lets you implement shaders,
53:48: you or anyone else in the community, now we kind of give you the power,
53:53: you know, to like, essentially like if you need a shader, you're not reliant on us anymore,
53:58: you can just implement it yourself or you can ask somebody knowledgeable to do it for you,
54:04: because now there's a system to just make them on your own.
54:07: And that, like... even though, what's the word?
54:19: Even though it's, you know, a bigger chunk of work to, you know,
54:24: switch the engine and implement custom shader support, in the long run,
54:28: once you consider the hundreds and thousands and eventually dozens of thousands
54:32: of all the shaders that people could be implementing, or people could want,
54:39: it pays off long term. So that's kind of how we approach things a lot of times:
54:44: instead of us solving lots of individual problems, we instead, you know,
54:49: provide tooling to solve them generally, and give everybody in the community more power over how things work.
54:58: Next question is from Grant.OK.
55:01: Have you heard about stacked diffs as a tool to use in development?
55:04: It might be something you could look at.
55:05: I'm actually not sure about, like, stacked diffs.
55:10: Sorry, do you know like what it could be talking about maybe?
55:14: Well, let me let me see what it looks like.
55:18: Let's see.
55:19: Also, I'm just going to drink a little bit of soda because I'm getting thirsty.
55:27: One.
55:50: Find anything?
55:52: It might help, like, uh, if you could, you know, explain a little bit more what you mean by those.
55:59: Yeah, it would help to explain like how those differ or, or if they differ or actually just what the heck are those?
56:09: Yeah, like, I'm familiar with diffs, but I don't know exactly what stacked diffs are,
56:14: and, like, how they relate to development.
56:17: He says he can provide a video to watch after this.
56:19: Yeah, but yeah, we won't be able to answer it like during the stream.
56:25: The next question is from JackTheFoxOtter.
56:28: Are there any specific future plans you have for Resonite's avatar system?
56:32: Either new features or improvements to existing stuff?
56:35: Yeah, so, um, let's see.
56:44: So you know this thing.
56:46: Oh, that didn't spawn in the right spot.
56:49: You know, you know, you know this thing.
56:54: It's gonna get thrown into fire.
56:57: This thing is horrible.
56:58: Yes, it's very old. It makes, like, very big assumptions about things.
57:04: The workflow is like, you know, not where we want to be.
57:08: The thing I kind of want to do like, you know, with our avatar system is make it so it's more sort of dynamic.
57:14: There's really a few aspects to it.
57:16: One of them is to rely a lot more on heuristics, because the original wasn't really built with much heuristics.
57:23: And when we try to add some heuristics, it just kind of breaks, because it's kind of spread out across multiple parts.
57:30: There's the infamous, you know, "try to align hands", and it just oftentimes does a weird thing.
57:37: It's one of those things where the avatar creator is not designed to work with heuristics.
57:42: So it kind of just doesn't work right.
57:44: But we have heuristics that are part of the other systems.
57:46: So what we want to do for the new avatar system is make it, you know, where if you want to convert something to an avatar,
57:55: one, for the initial setup, it will just try to figure out as much as it can.
58:01: So ideally, you don't even need to do anything extra.
58:05: We just, you know, say make this an avatar, and it figures everything out.
58:11: The other part is going to make it so you can redo it.
58:16: If you need to adjust offsets for your hands, for your head, you can, you know, reopen the visuals on the avatar at any time,
58:26: and, you know, do more modifications and more tweaks.
58:29: That way, you know, it's not just a one-time process.
58:33: Parts of that system you can, you know, sort of re-invoke at later times.
58:38: So that will be one part of it.
58:41: The other one is, like, you know, how it also works with IK, making it so you can, you know, for example, select the bones.
58:47: If it doesn't figure the bones out, you know, during the import, like you can just say, you know, this is the, you know, this is the hip bone.
58:56: This is the spine bone. This is this bone.
58:57: So you can, you know, fix things up and, you know, kind of change them at a later time too.
59:01: Because IK is another component that like needs to be sort of initialized in the setup.
59:05: And then once you kind of initialize it, it's sort of baked in and it's kind of hard to change.
59:13: So that's going to be another thing to change.
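To make the bone-selection idea concrete, here is a minimal sketch of the kind of name-based heuristic being described; the patterns and API are hypothetical illustrations, not FrooxEngine's actual bone-matching code.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical name-based bone guesser; not the actual FrooxEngine code.
static class BoneGuesser
{
    // Common substrings seen in rig exports, matched case-insensitively.
    static readonly Dictionary<string, string[]> Patterns = new()
    {
        ["Hips"]  = new[] { "hips", "pelvis" },
        ["Spine"] = new[] { "spine", "chest" },
        ["Head"]  = new[] { "head" },
    };

    // Returns a guess per rig slot; unmatched slots stay null so the user can
    // assign them manually later, matching the "reinvoke the setup" workflow.
    public static Dictionary<string, string?> Guess(IEnumerable<string> boneNames)
    {
        var names = boneNames.ToList();
        return Patterns.ToDictionary(
            kv => kv.Key,
            kv => names.FirstOrDefault(n =>
                kv.Value.Any(p => n.Contains(p, StringComparison.OrdinalIgnoreCase))));
    }
}
```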
59:20: So, yeah, like these are like a bunch of things.
59:22: I mean, there's probably a bunch of other stuff too, like just generally, because the avatar creator, like, it's not a single thing.
59:29: There's like multiple systems, you know, one of them is just importing the mesh in the first place.
59:34: Which is its own system and its own heuristics.
59:37: And there's, you know, when you actually make it into an avatar, it kind of constructs it and sets up all the slots and components and everything.
59:44: So we're probably going to be kind of, like, unifying some of that too.
59:49: Yeah, just like the whole import pipeline in general is a little old and cobwebby still.
59:56: Yeah.
59:56: We're going to rework that as well; it's actually one of the things, because there was a question, like, you know, last week about open sourcing.
01:00:03: Part of the plan for the import pipeline is making it more modular, because right now it's kind of this big function that decides which type of asset it is.
01:00:12: And then it triggers various importers, but also like all the import functionality is sort of like baked into the engine.
01:00:19: Like it's kind of compiled with it.
01:00:21: What I want to do instead is provide an interface for generally specifying importers and exporters,
01:00:31: which then hook into the generalized system, which is going to invoke those.
01:00:37: And that'll make it much easier to, like, add more importers, you know, and more exporters to the system.
01:00:44: We'll also be able to, like, you know, separate them out of the code base and, you know, open source these.
01:00:49: So you can, you know, look at how they work.
01:00:51: You could make your own forks, you could make your own, like, you know, modified versions, which is pretty much what forks are.
01:00:57: You could build your own importers, you know, add them to the system, share them with others.
01:01:02: So just reworking that system is going to, I think, be, like, very beneficial.
01:01:05: There's like another thing that it's going to help with because one of the limitations that we have is sometimes like, you know, the system will try to figure out what type of asset something is.
01:01:16: And it usually uses like the combination of the header and extension.
01:01:20: The problem is, in some cases, that's not enough.
01:01:24: Like, for example, consider something like, you know, a GIF.
01:01:27: Because a GIF, it could either be a texture, when it's a static one, or it could be like, you know, like a video or something, like when it's an animated one.
01:01:35: Similar with, like, you know, .PLY, which is usually used for 3D meshes.
01:01:41: But also it's used for Gaussian Splats and you need to kind of read the header and stuff like to figure out what it is.
01:01:47: But right now the system doesn't have that flexibility, because the determination of the type of asset happens, you know, before the importers even run.
01:01:54: And then the importer is just stuck with that choice.
01:01:58: So what we want to do with the generalized system, make it so that each importer has a mechanism where it can be asked,
01:02:07: this is the data I have, is this something you're able to import?
01:02:11: And it's going to do a little bit of analysis and determine, okay, maybe I can do this one, maybe I can't do this one.
01:02:18: It's a similar thing also when you import, for example, some URLs.
01:02:23: For example, Bluesky, where sometimes it imports as a video, even though it's not.
01:02:31: And it's because the importer doesn't really have that flexibility.
01:02:37: So reworking it and adding that flexibility is going to help a lot in lots of ways.
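As an illustration of the generalized system being described, here is a hedged sketch of what such an importer interface could look like; every name here is hypothetical, not the actual FrooxEngine API.

```csharp
using System.Collections.Generic;
using System.IO;
using System.Threading.Tasks;

public sealed class ImportContext { /* world, import settings, progress... */ }

// Hypothetical modular importer: each implementation inspects the actual data
// (header bytes, not just the extension) and reports how confident it is.
public interface IImporter
{
    // e.g. a Gaussian-splat importer returns a high score for a .ply whose
    // header contains splat fields; the mesh importer claims ordinary .ply.
    float CanImport(string extension, byte[] header);

    Task ImportAsync(Stream data, ImportContext context);
}

public sealed class ImportDispatcher
{
    readonly List<IImporter> importers = new();

    public void Register(IImporter importer) => importers.Add(importer);

    // Ask every registered importer (built-in or community-made alike) and
    // pick the strongest claim; null means nothing can handle the data.
    public IImporter? Resolve(string extension, byte[] header)
    {
        IImporter? best = null;
        float bestScore = 0f;
        foreach (var importer in importers)
        {
            float score = importer.CanImport(extension, header);
            if (score > bestScore) { best = importer; bestScore = score; }
        }
        return best;
    }
}
```

The point of the confidence score is exactly the GIF and .PLY cases above: the type decision moves from a single up-front check into the importers themselves.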
01:02:44: Next question is from Ozzy.
01:02:47: When looking at grabbers in Flux, I noticed you can get a user's grabber by body node, not by chirality.
01:02:54: Is that future work for multiple grabbers?
01:02:56: Yes, actually it is.
01:02:58: It's literally there, so it can define grabbers.
01:03:03: Well, actually there's another reason for it.
01:03:04: It's because originally, FrooxEngine, it used to be very controller-centric.
01:03:11: Controller was the main node, and your avatars were mostly built around controllers.
01:03:16: And then virtual reality started moving towards, instead of the controller, representing your hands.
01:03:23: And this was actually added, so the system can distinguish whether the grabber is supposed to be on the controller, or it's supposed to be on the hand.
01:03:33: So that was one of the design decisions.
01:03:35: But also, it allows, you know, you can just say, the grabber is on the mouth.
01:03:39: If you want to make a system where you can have a grabber, and you chomp something, and you grab it that way.
01:03:46: And maybe you want it on your leg, you know, or something.
01:03:49: It makes it flexible.
01:03:57: GrandUK is actually elaborating, to describe stacked diffs: so, you do the work on feature A and finish it, but can't push it yet, and you want to work on feature B, which requires feature A.
01:04:07: Stacked diffs give you a workflow, backed by tools, to do this, so you can keep working on features while you wait for things to be merged or pushed.
01:04:13: I mean, we kind of do this sometimes. Like, I don't know if it's stacked diffs, but you can essentially do this with Git,
01:04:23: because one thing we do is, we make a branch to work on a feature, and then it needs another thing, so we make another branch off of it.
01:04:37: Which allows us to work on that while the first thing is not pushed yet. So, like, it does happen; if there's a layer on top of Git for this, we'd have to explore it.
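For reference, the plain-Git version of the workflow being described looks roughly like this; branch names are just examples:

```
git switch -c feature-a      # branch off main, implement feature A, open a PR
git switch -c feature-b      # branch off feature-a, keep working immediately

# feature A changes during review, so re-stack feature B on top of it:
git switch feature-b
git rebase feature-a
```

Stacked-diff tooling mostly automates that re-stacking step across many dependent branches.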
01:04:46: But the problem is, like, you know, usually while we wait for things to be merged or pushed, it's not like there aren't workarounds for it; we can kind of, you know, keep working on things,
01:05:02: but when things exist for too long on branches, and there's work on multiple branches, the problem is they get out of sync, and at some point this kind of becomes unmanageable.
01:05:13: It also becomes a problem, I think, because, say, you need to push feature B, and because it depends on feature A, you cannot push feature B without pushing feature A.
01:05:29: So, like, you know, I don't think that would really be solved this way, or by this mechanism.
01:05:37: So it gets a bit complicated. This is a thing you can run into sometimes, which is why, generally, when we collaborate on code, I prefer to keep things small, do them in phases, and merge them in as fast as possible,
01:05:54: because if something exists in a branch for too long, it diverges from main, and then we have to deal with those differences and merge conflicts, and maybe this thing is incompatible with that,
01:06:08: and that takes, you know, time to deal with. So oftentimes it's not that you physically can't keep working on things that way, but it starts costing a lot of time,
01:06:22: and that maybe makes it, you know, so you cannot meet deadlines for certain things, or it makes things take longer, or maybe things become too convoluted and it becomes harder to keep all those parallel things in sync.
01:06:44: Without fully understanding what it is, like, I don't know, you know, how this would help in this scenario, so we'll see; send the video and we'll try to have a look.
01:06:59: The next question's from Jack the Fox Author.
01:07:01: One thing I'd like to see at some point is the ability for objects to save items directly into your inventory on your behalf,
01:07:07: but the biggest thing I see standing in the way is that there currently is no mechanism for in-world objects to have a trusted context between you and the object.
01:07:14: Like, permissions to save on your behalf, and where to save to.
01:07:18: So there's two questions here. First, what kind of safety mechanisms for such things are you planning in the future?
01:07:24: That's a kind of tricky one. So, like, it depends...
01:07:28: So, like, one of the things that kind of comes to mind with this is, like, we want to add, as part, like, you know, of the hard permission system,
01:07:38: sort of a way to track, we want to, you know, track the sort of history of the object in a way, like, who owns it.
01:07:50: So that way, like, say you have an avatar or a tool, and the tool wants to have access, you know, to your inventory.
01:08:00: You only want to give it access if it's something, you know, you spawned from your own inventory and you've given it access.
01:08:06: So we need the system to be able to say, you are the user who spawned this thing, therefore it has those permissions to do those things, but only to your inventory.
01:08:20: The problem is, you know, people might end up doing stuff, like, where they spawn something, then make something spawn on your end, you know, to try to make it look to the system like it's yours.
01:08:39: But even if it's trying to spawn something on your end, you didn't spawn the original system, therefore whatever it spawned doesn't have your permissions; it has the permissions of the original user.
01:08:56: So I think that kind of system could help with these things, where some things will have certain privileges based on their origin in the world.
01:09:09: And some things, you know, maybe part of the data model says, like, you know, this is the original part of the world, this has been spawned by the host, this has permissions, you know, to do certain things.
01:09:19: Like, for example, with hard permissions, you might be able to say, users cannot, you know, apply force to each other.
01:09:25: Like, you cannot, like, you know, use a tool that bonks somebody, unless it's something that actually came from the original world and kind of, you know, keeps track of that.
01:09:34: And anything else that you introduce later doesn't have those same permissions.
01:09:40: There's still, like, one issue with that kind of system, where people could make gadgets that try to utilize things that do have permissions and make them do things, you know, they want.
01:09:55: Say, like, you made an item that, like, allows saving to the inventory, and then, like, somebody makes something that, like, hijacks the item and saves stuff that you didn't want to save.
01:10:08: In that case, you know, hard permissions would also kind of help, because we can say, you know, this part of the object can only be interacted with by objects that also have, like, you know, this origin.
01:10:22: If anything else tries to modify it, that will either not work, like, it will not allow it to make modifications, or it will make it lose its permissions.
01:10:31: So, I think those kinds of systems would help, like, with these.
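As a rough illustration of the origin-tracking idea, here is a speculative sketch; the hard permission system is still being designed, so none of this reflects an actual implementation.

```csharp
// Speculative sketch of origin-tracked permissions; not the actual design.
public enum OriginKind { WorldNative, HostSpawned, UserSpawned }

public sealed record ObjectOrigin(OriginKind Kind, string? SpawningUserId);

public static class InventoryPermission
{
    // An item may save into an inventory only if that same user spawned it.
    // Anything an item spawns indirectly inherits the original spawner's
    // identity, so a gadget spawning things "on your end" still carries
    // the other user's origin and fails this check.
    public static bool CanSaveToInventory(ObjectOrigin origin, string inventoryOwnerId) =>
        origin.Kind == OriginKind.UserSpawned &&
        origin.SpawningUserId == inventoryOwnerId;
}
```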
01:10:35: The other approach is, you know, like, if something wants to save something into inventory, it essentially shows a prompt.
01:10:41: This thing wants to save, do you want to give it permission to save this item or not?
01:10:45: You know, that way the user actually has a choice, they're informed, and you cannot get around it, because it will show the dialogue in the user space.
01:10:53: And it will have, like, you know, some protection, so you cannot spam it either.
01:10:57: Kind of like, you know, like, when you open Hyperlink.
01:10:58: So, that's, like, another approach, but it does require user input to let it through.
01:11:04: Still good on time.
01:11:09: Second, so this continuation, do you think this could be implemented?
01:11:13: Saving items on your behalf would open up arguably Resonite's most powerful feature, the inventory, to be utilized by user creations.
01:11:20: Yeah, I do think it's, like, a pretty powerful thing.
01:11:24: Like, you know, we could, like, make, like, lots of kind of systems for that.
01:11:28: But, you know, it is, like, one that's dangerous-ish.
01:11:32: Like, the other part is, like, you know, if, say, you want to save, like, items, we could have, like, a system, potentially,
01:11:42: where, like, you grant permission for the world to save to a specific part of your inventory and also give it, like, you know, some quota size.
01:11:49: But it also has complications because now we need to, like, you know, track that and so on.
01:11:54: So it is something we could add.
01:11:58: We'd also, like, think about, like, you know, the security implications.
01:12:02: So, things like that, we tend to be, like, cautious with it because, like, you know, the potential for abuse is, like, potentially big.
01:12:09: And if, like, you know, there is abuse and we need to change how the system works, that can end up breaking creations.
01:12:15: So we prefer to, like, you know, just kind of take it slower with those things rather than, like, you know, rush the solution and then cause a bunch of issues.
01:12:28: So, that kind of clears out, like, the questions we have for now.
01:12:35: I had a ramble, but I forgot about it, so I don't remember what it is.
01:12:44: Do you have a ramble?
01:12:46: Oh, there is one thing, actually, that comes to mind, though.
01:12:50: I don't know if it was already announced, like, at the moderation office hours, but one change we'll be making for office hours,
01:12:59: because we run these, like, you know, at specific times, and not everyone is able to, like, make it to these.
01:13:07: So, if you've got a question, and you're not able to ask it, because, you know, you're not able to be, you know, present for these livestreams,
01:13:14: what we'll do, at least for now, we'll make, like, a post on the Discord, where you can ask your questions in advance,
01:13:24: and then, like, when we have, you know, some free time, you know, during the office hours themselves,
01:13:30: we'll go through those questions, and get those answered.
01:13:34: So, this way, even for people who are not able to, like, you know, participate directly, you'll be able to, you know, ask your questions, and get, you know, answers to them, that way.
01:13:42: I think this is kind of, like, you know, going to help people in different, like, you know, time zones, or if you might be, you know, at work, and stuff like that,
01:13:49: and give, like, you know, access to these, like, Q&As.
01:13:53: So, there's probably going to be, like, a ping, you know, after this, once we make the post.
01:13:57: I don't know if... the moderation office hours might have already done this.
01:14:06: I'm actually not sure, but, yeah, you'll see a ping.
01:14:10: If you do have more questions, you know, we've still got about, like, 45 minutes.
01:14:17: Yeah, just about, I think.
01:14:21: Do you have a ramble?
01:14:21: Do I have a ramble?
01:14:24: I have a soda, I'm gonna get more soda.
01:14:30: Let's see, do I have a ramble?
01:14:47: Ah, there is one thing, ahh, we have a question.
01:14:53: I mean, we could also go on, like, we don't have to answer it, like, right away.
01:14:58: True.
01:15:03: One of the things that I'm, like, I'm really excited about is, like, the ability to...
01:15:14: Gosh, I feel like a broken record whenever I say this now.
01:15:16: I'm really excited to be able to do, like, shape sweeping and stuff.
01:15:22: I think that is super cool.
01:15:24: Being able to, like, throw a shape into the scene and, like, see where it hit and whatnot is really neat.
01:15:34: It opens the door for a lot of things, like, you know...
01:15:38: Making systems to open doors.
01:15:41: Yes, actually, making systems to open doors.
01:15:44: You know, like, swinging swords and stuff and having them actually be able to properly kind of collide, clash with, like, geometry.
01:15:57: Or even with each other, if they both have colliders on them, you can sweep.
01:16:05: It gives all more flexibility to things.
01:16:09: You get to, like, you know, probe the environment and see, like, you know, what happens.
01:16:12: That adds a lot more, kind of, interactivity.
01:16:18: Yeah. It's neat because you can actually give them, like, an angular velocity.
01:16:22: So as they proceed through the sweep, they can, like, tumble through the air.
01:16:27: So, like, they're good at approximating simple shapes. Like, let's say you have a grenade.
01:16:34: You can use a capsule sweep for that and have it, like, tumble through the air.
01:16:40: And, like, actually, you know, if the grenade is, like, sideways, you know, it's not going to hit the thing above it.
01:16:50: And stuff like that. Gosh, it's kind of hard to explain this.
01:16:54: And there's also, like, a thing, like, the physics engine in general, like, internally, it uses, like, sweeps to figure out, like, collisions between things.
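To make that concrete, here is roughly what a sweep query could look like; the API shape is a hypothetical sketch, not the actual Bepu or FrooxEngine interface.

```csharp
using System.Numerics;

public readonly record struct SweepHit(Vector3 Point, Vector3 Normal, float Time);

// Hypothetical sweep-query surface; not the actual Bepu/FrooxEngine API.
public interface IPhysicsWorld
{
    // Sweeps a capsule from `position`/`orientation` along `velocity` for
    // `dt` seconds. `angularVelocity` lets the shape tumble mid-sweep, so a
    // sideways grenade collides the way you would expect at impact time.
    bool SweepCapsule(
        float radius, float length,
        Vector3 position, Quaternion orientation,
        Vector3 velocity, Vector3 angularVelocity,
        float dt, out SweepHit hit);
}
```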
01:17:05: Yeah. I guess we do have some questions now.
01:17:09: We have questions, ramble's over.
01:17:11: I mean, this one's kind of rambly, too, because Marta Canard, I'm also not sure if I'm pronouncing your name correctly, sorry if I'm not, is asking,
01:17:22: What's your dream feature you'd love to implement, but for one reason or another it's not yet possible?
01:17:27: I mean, there's a bunch.
01:17:31: I would actually, the one I'm gonna say is custom shaders, because, like, you know, we kind of talked about it, so it's kind of fresh on the mind,
01:17:40: because I feel it's one of those things that's, you know, gonna be really fun, because, you know, you certainly have, like, full control over visuals,
01:17:47: and you have the power of the GPU, you know, to accelerate it.
01:17:51: So, if you're familiar with a website called ShaderToy, it's, like, this thing, you know, where,
01:18:01: it's this kind of thing, you know, where you can essentially just make, like, all kinds of shaders to do, like, you know,
01:18:08: raymarching stuff and cool visuals, and you can share them with the community, and people can play with them and see their code.
01:18:14: And I kind of expect there will be something similar-ish happening here, like, once we have that,
01:18:19: when people, like, you know, just make all kinds of, you know, crazy visuals that are, you know, super smooth,
01:18:26: because they're, like, you know, GPU accelerated and everything.
01:18:29: But right now it's not really possible, because, you know, we're still kind of tied, you know, to Unity.
01:18:37: But eventually it will be.
01:18:39: So it's, it's also, like, I mean, it kind of also depends, like, what do you mean by not possible?
01:18:43: Because, you know, like, some things are not possible because we don't have time for them right now,
01:18:48: some are not possible because we need, you know, some prerequisites and so on.
01:18:54: So it's, there's a lot of things I have to kind of think about, like, the others, like,
01:18:59: it also depends what would be, like, the new feature, because there's just a lot of features I really want us to have,
01:19:05: but it just takes time.
01:19:10: There's rigid body physics, that's actually another one, but this is gonna become, like, easier,
01:19:14: like, once we do the switch to .NET 9, because this is gonna give us, like, a big performance boost,
01:19:19: and we're gonna sync up with, like, the latest Bepu, so it kind of puts it back on the plate of possibilities.
01:19:27: But yeah, there's a bunch. Oh, one I kind of, like, want to do is, like, once the separation happens,
01:19:33: I've kind of talked about this, it's like we could run part of Resonite as an overlay.
01:19:38: So you could actually, you know, have your dash, you know, be on top of other games, you know,
01:19:44: for example, like, VRChat, or, you know, Minecraft in VR, like, with Vivecraft, and other VR games,
01:19:50: and create sort of interactions with, you know, other bits.
01:19:53: Actually, if we go even further, I would say, like, having Resonite for augmented reality,
01:19:58: because there's not very much commercially available, like, you know, augmented reality hardware,
01:20:06: imagine, like, you know, imagine you have something like this, you know, you have, like, a brush,
01:20:11: and this brush, you know, is already developed, it like, you know, already does things and so on,
01:20:15: it operates in this world, which is like a fully VR world, but it doesn't depend on it.
01:20:21: Like, if we get Resonite on AR hardware, you'll be able to, like, you know, spawn things like this,
01:20:31: and just, like, you know, draw on top of, like, the real world.
01:20:35: Because the brush, you know, is the same, it's just like, you know, instead of, like, the world being, like, you know, 3D,
01:20:42: it's a pass-through, but then you have stuff that renders on top of it, including these brushes,
01:20:47: and we already have interaction systems, we have all these, you know, complex things,
01:20:51: so we can use tools, you know, same with the developer tooltip, if you have, you know,
01:20:57: if I grab this, and I have some objects in the AR layer, I could still, you know, select them,
01:21:04: and I could do ProtoFlux, you know, all of this, all these gizmos,
01:21:08: they can still render on top of, like, a real world, instead of, like, you know, the virtual one.
01:21:13: So I think that's, like, one of the kind of dream things that I think is gonna be really fun,
01:21:16: but it's not possible, kind of, due to lack of, like, you know, good hardware,
01:21:21: and us not being, you know, there yet with some things.
01:21:27: The other cool thing is gonna be, like, you know, with AR hardware, it's like, you know, say,
01:21:31: you're in your room, and you could actually have somebody join in their avatar,
01:21:36: like, for example, in my room, and just see, like, you know, Cyro joins,
01:21:39: and because with the AR headsets, you know, they can have, like, a scan of the room,
01:21:43: he can see, like, you know, an approximation of my room, and he can move around in it,
01:21:46: and I'll just see him, like, moving around my room, looking at the things I have there, you know.
01:21:50: I think that could be a really cool feature, but not possible right now.
01:22:00: Maybe. Do we have a dream feature that's not possible yet?
01:22:06: I want to import 73,000 trees.
01:22:10: That's not that many. I think that's gonna be possible.
01:22:16: I guess more so, like, to what that's actually alluding to, is I want, like, a proper LOD system.
01:22:24: Yes.
01:22:27: Because we do have an LOD component, which is kind of, it is kind of like a carbon copy of Unity's right now.
01:22:36: Yeah. But unfortunately, it kind of keeps all, like, the LODs in memory too, so, like, it's...
01:22:42: I kind of regret doing that one.
01:22:45: Yeah, now we gotta support it.
01:22:48: I'm probably gonna convert to something.
01:22:52: But yeah, those are good features.
01:22:55: I'd also have to, like, think about them a bit more. I'm also a bit...
01:22:59: Nothing's coming to me right now. I have to, like...
01:23:02: Usually what I do is, like, sometimes I go to the GitHub and I'm like, okay, like, these are...
01:23:05: Actually, I do have a thing. I have, on my GitHub, I have, like, you know, fun features that I want to work on.
01:23:12: So I could, like, literally just check that.
01:23:18: Thank you, Tatsu. Yay!
01:23:22: Anyway, Tatsu's been activated by the trees.
01:23:29: He wants to live in a forest. That's kind of cool.
01:23:36: There have been systems where somebody made, like, a thing that sort of, like, just repositions the trees, and there's fog.
01:23:41: So, like, you can have, like, really big dense forest, but you cannot tell.
01:23:50: Let me see.
01:23:55: What was one of the things?
01:24:01: Let me just check this.
01:24:05: So on GitHub, I have a specific board of Froox Fun Feel-Good Issues.
01:24:13: Oh, there's a bunch. Well, some of these are, like, possible right away.
01:24:18: Spatial variables, DSP, in-game, oh, in-game vertex-level mesh editing, I think.
01:24:24: But these are kind of, like, technically possible. It's just a question of prioritization.
01:24:29: State Machine, generic collections, generic, ooh, generic terrain system, I talked about this one.
01:24:34: Face tracking, generic timeline system, cloud simulations.
01:24:37: All of these are kind of possible; this GitHub board is more for my own use.
01:24:44: They're possible, like, you know, if I decided I'm working on this now, like, it's possible, so it's more of a prioritization thing.
01:24:51: Anyway, let's move to the other questions.
01:24:54: The next one is from PowerPap.
01:24:56: How have things been going with the recent port of the asset variant system to Linux?
01:25:00: That one actually might be better to ask, like, in ProbablePrime's office hours, because he's the one who's been kind of, you know, working on that one.
01:25:08: As far as I know, like, it's working well.
01:25:10: Like, I think, he said there's, like, you know, some issues to wrap up there.
01:25:14: It's not been merged yet, like, to the mainline, he's still kind of working on it, so, he hasn't made, like, a PR for it yet, but it's probably going to be coming soon.
01:25:22: But, yeah, for, like, more details,
01:25:25: ask in his office hours; I don't have, like, super much info on this one.
01:25:29: It's very exciting, though, for me, because, like, one of the things that's been bugging me is, you know, that, like, on Linux, we just didn't have the Compressonator library.
01:25:38: Because they didn't provide precompiled libraries for it.
01:25:43: So, like, you know, the Linux headless and the Linux asset variant system can't actually compute all the variants they need to.
01:25:48: And this is going to bring it, you know, to feature parity.
01:25:54: So, this is a very exciting development.
01:25:58: The next question is from Navy3001.
01:26:01: How much work is there for the audio switch from Unity?
01:26:04: So, right now, it's a little bit hard to estimate, because, like, I haven't, like, got into the full swing of it yet.
01:26:11: But, there's, like, a few sort of, like, milestones that kind of need to be hit.
01:26:18: Because I did cover this in one of the previous Resonance office hours, like, you know, the technical details of it.
01:26:25: So, there's also a video on our YouTube channel, I recommend checking that one out, you know, there's graphs and visuals and everything.
01:26:32: Overall, I started doing some research on how, you know, to integrate with Steam Audio.
01:26:37: And there's, like, several parts of the system that need to be built.
01:26:40: One of them is having a way to query the space, you know, for audio sources.
01:26:45: Which we mostly have, because we're using the Bepu physics engine, and we already have, like, a sort of, like, API where we can, like, you know, register things in bounding boxes and update them and then query that.
01:26:56: Because Bepu has, like, very efficient algorithms for spatial queries.
01:27:02: So, I'm probably gonna, you know, adapt that, maybe generalize it a little bit more, because there's other systems I want to use it for, like, spatial variables.
01:27:10: The other part is I'm gonna be building, like, the actual audio engine, which is, like, when, once it queries, you know, all the audio sources that are within, that could be, like, you know, listened to within a specific location.
01:27:20: Then I'll need to, like, build a list of, like, okay, these are the audio sources, and it's gonna sort them by their priority.
01:27:27: So, like, you know, the ones with the highest priority are the ones gonna be playing, and if there's too many audio sources, it's gonna, you know, have a cutoff at a certain point, because it needs to limit, you know, the CPU usage on how many audio sources it can mix.
01:27:41: It needs to, like, you know, it needs to sample the source audio data, it needs to apply, you know, stereo panning, spatialization, you know, other effects, reverb, other things that all need to be resolved.
01:27:53: And then, like, you need to mix them together to produce final audio output.
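A hedged sketch of that prioritize-cutoff-mix loop, with illustrative types rather than FrooxEngine's actual audio engine:

```csharp
using System;
using System.Collections.Generic;

abstract class AudioSource
{
    public float Priority;
    public abstract void Sample(float[] buffer);                   // decode source audio data
    public abstract void SpatializeInto(float[] src, float[] mix); // pan/effects, add into mix
}

static class AudioMixer
{
    // One audio frame: sort audible sources by priority, cut off past the
    // voice limit to bound CPU cost, then spatialize and mix the rest.
    public static float[] MixFrame(List<AudioSource> audible, int maxVoices, int sampleCount)
    {
        audible.Sort((a, b) => b.Priority.CompareTo(a.Priority));
        int voices = Math.Min(audible.Count, maxVoices);

        var scratch = new float[sampleCount * 2]; // interleaved stereo
        var mix = new float[sampleCount * 2];
        for (int i = 0; i < voices; i++)
        {
            audible[i].Sample(scratch);
            audible[i].SpatializeInto(scratch, mix);
        }
        return mix;
    }
}
```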
01:27:58: So it's mostly that. I expect it to be way less work than the particle system, because the particle system in general just has a lot to it.
01:28:05: The audio system in comparison is relatively simple.
01:28:09: We already have, like, you know, most of the actual decoding of the audio data and so on that's already done in FrooxEngine.
01:28:17: So, you know, that kind of stays the way it is.
01:28:20: And it's mostly, you know, just, it's mostly just kind of like the resolving, you know, which audio sources are within a certain area and applying the appropriate effects.
01:28:31: And I've already, I've been kind of playing with the Steam audio library and a wrapper for it, so I can have a better feeling on how it works.
01:28:39: So I think it's probably just a few weeks of time.
01:28:42: We'll see how things go, sometimes they end up like rabbit holes, but I think it's gonna be relatively fast, hopefully.
01:28:51: We'll see.
01:28:52: That's kind of like the gist of the thing that kind of needs to be done.
01:28:55: And then one thing that needs to also happen is we need to decide if we're gonna pipe the final audio into Unity and have Unity output it, or if we're just gonna use CSCore and just output it ourselves.
01:29:11: In which case, you know, there are some things to figure out, like, does this work properly with, you know, Proton on Linux and so on. So there are things there that need to be investigated, and we'll see which one's less work or which one works better.
01:29:31: Next question is from Jack the Fox Author.
01:29:34: When we get proper rigid body support, how do you plan to handle network synchronization? I feel that if it's only simulated by one end of the network, it will create lots of traffic and be somewhat choppy depending on network throughput.
01:29:45: But simulating fully locally would open up the system to desyncs, which kind of would not work with Resonite's design principles, and also make it sad.
01:29:53: So, with physics, I don't have specific details for you right now, because figuring out how exactly it's going to be synchronized, that's part of working on the system.
01:30:04: And because we're not working on the system yet, we don't have those details.
01:30:09: In general though, how this tends to be approached, there's two major approaches you can use for synchronizing physics.
01:30:17: One is using something called lockstep synchronization, or lockstep simulation, where you make sure each user simulates with the exact same inputs at the same rate.
01:30:31: And essentially sync up all the inputs between them, make sure they're all the same, and you perform the simulation step locally.
01:30:38: The problem with this is, one, you need to make sure all the inputs are the same, which in itself is a challenge for a lot of things.
01:30:45: But also the simulation itself needs to be fully deterministic, which with Bepu Physics you can technically do, but it has additional performance costs, because multi-threading tends to be more stochastic.
01:31:00: And also it might not be deterministic anyways if you run it on different platforms, because different CPUs might produce slightly different values for floating point calculations which end up accumulating over time.
01:31:13: So we're very unlikely to use this approach.
01:31:17: The other approach is, you still run the simulation locally on each user, and you keep synchronizing corrections.
01:31:26: So you essentially keep sending the state, and usually when you encode the states of the rigid bodies, you use some mechanism to compress it.
01:31:36: You use stuff like quantization, and then delta compression, to reduce how much data you're sending.
01:31:44: And maybe you're not sending it super frequently, you're just sending corrections, because most of the time, because things are already synchronized, the behavior and movement is going to be the same.
01:31:59: And what happens is that you want to avoid drift over time.
01:32:04: Because if you just left it like that, over time it's going to drift, the error is going to get large, and then you're going to see this inconsistency between users.
01:32:12: But if you apply frequent enough corrections, it keeps the drift from becoming too big.
01:32:19: So that's probably the approach we're going to use.
01:32:24: But for exact details of it, I can't tell you right now, that's going to be part of the work itself.
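As a sketch of the corrections approach under those assumptions (the quantization scheme here is illustrative, not Resonite's actual wire format):

```csharp
using System.Numerics;

readonly record struct Correction(int BodyId, short X, short Y, short Z);

static class PhysicsSync
{
    // 16-bit fixed point: 1/512 m steps (~2 mm) over roughly a +/- 64 m range.
    const float Scale = 512f;

    // Only emit a correction once local simulation drifts past a threshold,
    // so bodies that stay in sync cost no bandwidth at all.
    public static Correction? MaybeCorrect(
        int id, Vector3 authoritative, Vector3 local, float maxDrift = 0.05f)
    {
        if (Vector3.Distance(authoritative, local) < maxDrift)
            return null;
        return new Correction(id,
            (short)(authoritative.X * Scale),
            (short)(authoritative.Y * Scale),
            (short)(authoritative.Z * Scale));
    }
}
```

A real protocol would quantize rotations too (for example, smallest-three encoding) and delta-compress against the last acknowledged state, as described above.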
01:32:30: Next question.
01:32:41: It's actually simple math, it's very efficient to compute.
01:32:46: I can actually even draw it maybe. How do you typically do intersection?
01:32:50: So, when you are rendering something... do I have a nice brush, or something brighter?
01:32:58: Can you see this easily? Does this show on the camera well?
01:33:03: Maybe I want my glowing one. How about this one?
01:33:07: I can punch in the FOV a little bit, maybe.
01:33:10: Hold on, let's end the thing. Actually, hold on.
01:33:14: Let me do this. I'm going to switch it to manual.
01:33:17: I set it up like for this board and I haven't drawn anything the whole time.
01:33:27: It's kind of ironic, isn't it?
01:33:33: Anyway, so if you're doing intersection, there's an easy...
01:33:41: Essentially, when you think about it, when the shader runs, it's just...
01:33:46: It's rendering pixels on your screen.
01:33:48: So you have some kind of object. I don't know what this is, it's some kind of blob.
01:33:53: You have an object and it's composed out of lots of tiny pixels.
01:33:59: So what are GPUs doing as it's rendering?
01:34:02: It's going to be computing the individual pixels on the screen.
01:34:07: So you have something called a fragment shader.
01:34:10: And fragment shader that's literally invoked for every single pixel on the screen that the object covers.
01:34:16: So there's going to be a function, your fragment shader that gets called.
01:34:23: And then you compute the color of the pixel.
01:34:25: Or you can tell the GPU, discard this pixel.
01:34:28: Which means that pixel is actually not going to render out, it's just going to be skipped.
01:34:35: Usually, the way you could do intersections is...
01:34:40: You have the intersection parameterized, for example, by a point.
01:34:45: So there's a point in space, and there's a normal.
01:34:49: Which defines a plane.
01:34:52: Like an infinite plane that kind of goes this way.
01:34:55: So when you're computing your pixel, what do you do?
01:35:01: Oftentimes, in the shader, say you consider a pixel here.
01:35:06: And you want everything that's on this side of the plane to be cut.
01:35:08: And you want everything on this side of the plane to be visible.
01:35:18: You don't want any changes to it.
01:35:20: So you take the pixel, and as part of the fragment invocation, you actually have its position.
01:35:27: So you have X, Y, Z, and technically there's a W, but let's just say X, Y, Z position.
01:35:33: And we can actually ignore the Z, because this is already aligned to screen space.
01:35:38: Let's just worry about X, Y.
01:35:41: Actually no, you need to kind of worry about X, Y, Z.
01:35:44: But for this purpose we don't have to, it's kind of the same principle.
01:35:48: So it has X, Y coordinate.
01:35:52: What do you do?
01:35:55: You literally take this plane.
01:35:58: The easiest way to think about it is you align each point, each pixel you're rendering,
01:36:02: you just align it to the plane so it's upwards.
01:36:05: So if you look at this whole thing from the perspective of the plane,
01:36:11: you have this point, you have this normal,
01:36:13: and this pixel, if you were to rotate it like this,
01:36:19: the pixel is here.
01:36:20: So I'm going to move this here, and the pixel is right here.
01:36:28: And this is the actual calculation you can make,
01:36:30: where you recalculate the X, Y position of the point
01:36:34: so it's aligned with the plane, and the way you do that is usually you subtract
01:36:38: well, you first rotate it. So like, you know, it's facing upwards.
01:36:43: For that you can use like a matrix calculation where
01:36:45: you know, we have the plane, you have like, say this is the other axis, and
01:36:50: you compute a rotation matrix that essentially
01:36:53: rotates this whole thing so this is upwards.
01:36:58: And you also offset it by the center.
01:37:02: And that way you end up with an X, Y coordinate where
01:37:07: you end up with an X, Y coordinate
01:37:12: where this is like, you know, the origin.
01:37:14: This is like 0, 0 instead of being, you know, an
01:37:18: arbitrary position. And then the position of the pixel is gonna be
01:37:21: say like this is, you know, say this is 2
01:37:26: and minus 2. And we have
01:37:30: another pixel that's over here.
01:37:33: You know, there's another pixel over here.
01:37:38: Once you do the rotation matrix, subtract the position
01:37:42: subtract the center of it. This pixel ends up with a coordinate
01:37:46: that's 2 and plus 2.
01:37:50: And then you have a very simple logic where you just ask
01:37:53: is the Y coordinate, is it above 0? If it is
01:37:57: discard the pixel. Don't render it. Which means this pixel gets discarded
01:38:02: and this pixel doesn't because it's negative.
01:38:05: And that's all math there is. Very simple math, very easy
01:38:09: to do like, you know, intersection this way. You can also do some additional
01:38:14: effects because one of the things you can do with Intersect Shader is, you know, at the
01:38:18: intersection you can make it so like, you know, for example this glows.
01:38:23: And literally the way that it works is you say
01:38:26: you know, you essentially interpolate the positions
01:38:29: for example from between 0 and say minus 6
01:38:34: you're just gonna, you know, interpolate the color.
01:38:38: So like, you know, this would be 0 intensity, and this would be
01:38:42: 1, maximum intensity, and you map this to a color
01:38:46: and you know, you have your glow that's aligned with the plane.
01:38:50: So, intersection, very easy.
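The rotate-and-offset construction described on the board boils down to a signed distance: project the pixel's position onto the plane normal. A sketch of the fragment logic, written in C# for illustration (a real shader would express the same thing in HLSL or GLSL):

```csharp
using System;
using System.Numerics;

static class PlaneCut
{
    // Positive means the pixel is on the cut side of the plane: discard it.
    public static float SignedDistance(Vector3 pixel, Vector3 planePoint, Vector3 planeNormal)
        => Vector3.Dot(planeNormal, pixel - planePoint);

    // Glow near the cut: full intensity at the plane, fading to zero at `width`.
    public static float GlowIntensity(float signedDistance, float width)
        => Math.Clamp(1f - MathF.Abs(signedDistance) / width, 0f, 1f);
}
```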
01:38:53: Where it gets complicated is like if you wanted the intersection
01:38:57: to close holes and that's this own whole thing
01:39:01: where it gets really complicated because now you have to like figure out
01:39:06: complex geometry because you essentially end up with a hole here
01:39:09: and you have to figure out like, you know, where the hole should be closed and then
01:39:13: you probably have to use something like a compute shader or something.
01:39:18: But for intersection itself, the way it is here, very simple
01:39:21: very light, like there's
01:39:25: like GPUs can do this like, you know, like nothing. It's peanuts to them.
01:39:32: There we go.
01:39:33: And I got at least one drawing
01:39:38: in this episode.
01:39:41: And we got 20 minutes so, but yeah.
01:39:45: Hopefully that answers the question. And also we got the subscription from Tatsu
01:39:49: thank you. Oh and we got, oh thank you
01:39:53: check the folks out there also subscribe with Prime, with Boa, thank you.
01:39:58: And also Tatsu donated 100 bits. Oh my god
01:40:01: thank you Tatsu. We need to mention trees more often.
01:40:06: And Tyra subscribed at tier 1. Thank you so much,
01:40:09: Tyra. Oh my god, we got this going.
01:40:15: Erm, Rasmus0211 is asking any word on a
01:40:17: native animated gif support. It's not really something we have planned
01:40:21: right now. It's like one of those things where it's like
01:40:25: it's kind of a weird situation because like I feel in most cases
01:40:29: nowadays like people don't actually use animated gifs. Even for things
01:40:33: they still call gifs, they tend to be just videos.
01:40:38: So the problem with animated
01:40:41: GIFs is, like, they're a bit different from videos, because, like,
01:40:45: they can do a lot of, like, weird things, where, like,
01:40:49: for example, each frame, instead of fully replacing
01:40:53: the previous one like videos do,
01:40:57: they can, you know, work additively, where they just kind of, like, you know,
01:41:01: draw on top but leave the old stuff, which makes decoding
01:41:05: them and, like, you know, doing, like, timeline seeking kind of weird.
01:41:10: Also, each frame can take, like, you know, a different amount of time. It's like,
01:41:18: they're just kind of weird, like, they don't kind of fit neatly
01:41:21: into like you know the video box but you can sort of think of them like a video
01:41:27: so it's like
01:41:28: so the question is, is it worth like you know spending a lot of
01:41:33: time adding support for them?
01:41:37: Because, oftentimes when we kind of think about these
01:41:41: features, it's like, how much benefit do we get for supporting certain
01:41:45: features? And
01:41:49: supporting animated gifs it's like
01:41:53: it won't take super long but also like you know
01:41:57: I don't think it's like you know like a super high priority because like it feels like it's
01:42:01: a very niche thing and a lot of times you just kind of bring a video instead.
01:42:05: So it kind of depends on, like, what do you want to use
01:42:09: animated GIFs for? Yeah, they're mostly saying low priority,
01:42:13: just for ancient memes. Like, one way to sort of support
01:42:17: them would be to kind of extract all the frames ahead of time and have, like, an atlas,
01:42:21: but like that's still quite a bit of work and if you have like a really long one that's gonna
01:42:25: balloon your memory.
01:42:30: So
01:42:34: yeah it's one of those things like I
01:42:37: think it's like one of those things that's just too niche to like spend
01:42:41: time on it right now unless there's like you know big surge like you know a lot of people
01:42:45: want it then we'll be like oh okay yeah like this is gonna bring
01:42:49: a lot of people, this is gonna make a lot of people happy. We'll add support, but
01:42:54: unfortunately I think it's one of those things where it's like
01:42:57: the amount of work required for it to have proper support
01:43:01: you know outweighs the utility of it.
01:43:07: NukiKon is asking,
01:43:08: how useful do you think these office hours things are? I know you've said they are,
01:43:12: but are they actually that productive? Does doing them get you excited to keep working, or is it a
01:43:16: drag? Just asking because you do more than anyone else with these. I think they're
01:43:20: pretty useful like I do like you know talking about like my work and oftentimes like you know
01:43:25: and like talking about it like you know with people in world like you know
01:43:28: in other places and so on but like you know that's kind of isolated like in its
01:43:32: impact because only few people get to hear it with these
01:43:37: one thing I've been kind of doing is like whenever we cover a big
01:43:40: or interesting topic I will cut it into its own
01:43:43: standalone video and we've actually got like a lot of good response on those
01:43:48: like people seem to like you know enjoy like having that information accessible
01:43:52: it also gives people material that they can use to refer someone
01:43:56: you know if people are asking about how the performance upgrade is working
01:43:59: like, why are you working on the particle system as part of the performance upgrade,
01:44:03: now we have a resource, a video that we can point people to,
01:44:07: and it helps a lot so I think it's one of those things you know that helps
01:44:11: with you know communication with the community so like there's more
01:44:16: transparency you know so like you know
01:44:20: in how things are going like you know why are we doing things the way we are
01:44:24: you know how things are designed so it gives
01:44:28: I think it's a good way to you know to get a lot of useful information out there
01:44:32: I do enjoy talking about it too like I enjoy like answering questions
01:44:35: and you know spreading the knowledge and one of the things like
01:44:39: I really like doing is just kind of bringing people on the same page with things
01:44:44: so that's one of the reasons I like you know explaining things it's like I just want
01:44:48: people to have the same understanding
01:44:52: you know, of things as I do, and understand,
01:44:56: like you know why decisions are being made certain way why you're approaching things certain way
01:45:00: how things are going to work because the more you understand
01:45:04: you know the more powerful you are because like you know the more
01:45:08: you know the more kind of tools you have you know in your repertoire to like
01:45:12: do all kinds of things so I do think like they are pretty like productive
01:45:17: and they're definitely not a drag, and it gets me
01:45:20: excited to talk about features I really want to work on, because one thing that helps is,
01:45:24: you know, like, if I talk about some features and people also get excited for them, then I'm more excited
01:45:28: to work on them, so it does help in that regard as well.
01:45:32: I don't know like what about you Cyro like is this a drag?
01:45:37: Is this a drag?
01:45:50: No. No it's not.
01:45:54: Let me just check the time, we've got 15 minutes left so
01:45:57: at this point we might not be able to, well there's a few questions but we might be able to
01:46:02: take a few more but if you want your question answered like you know
01:46:06: get it in now because we might run out of time
01:46:11: The next question is from NukiKoon as well
01:46:14: Follow-up if there's time: support for SVGs anytime soon?
01:46:18: Those can be animated too. Anytime...
01:46:21: I have a problem with whenever someone asks anytime soon because
01:46:24: I don't know what that means. Is anytime soon, you know, is that next
01:46:29: week? Is that next month? Is that within the next six months? Is that within a year?
01:46:34: If you can please specify what kind of
01:46:37: timeline you're thinking because that kind of makes it a bit easier to answer
01:46:42: But in general I would say probably no
01:46:44: SVGs are kind of weird because like there's
01:46:50: I feel they would be
01:46:51: more beneficial to support than animated GIFs
01:46:55: but also they're a lot more complicated than animated GIFs
01:46:59: so even supporting static SVGs is tricky; they can have all kinds of effects,
01:47:03: because you can have all kinds of different shapes but also effects like
01:47:07: they have their own CSS system, you can style them, they can do
01:47:11: effects like blurring and so on, so we would need some kind of library
01:47:15: to render them out, but often times those libraries are designed for 2D rendering
01:47:19: rather than rendering them in 3D space
01:47:24: so supporting them properly gets really
01:47:27: complicated and then having them animated is even
01:47:32: harder
01:47:34: that's what I mean by support because we could technically support
01:47:39: rendering them into a texture, so they get rendered into a texture
01:47:42: and then apply the texture to something, so it's gonna be kind of like video
01:47:46: the problem is if you render them out into a texture that's 2K
01:47:50: and you put it on this, and then you
01:47:55: look at it super close, then you're gonna
01:47:58: lose that smoothness of an SVG
01:48:03: because if you wanted to maintain that smoothness, it would need to be converted
01:48:07: to a geometry, so when you look close
01:48:10: the GPU is still filling in all the extra detail
01:48:14: so depending on how the support is implemented
01:48:17: it also affects the complexity
01:48:21: and I think if we wanted to do support in the form
01:48:24: it gets rendered into a texture, it's not rendered as part of the world
01:48:29: you just apply the texture to something, you can control the resolution of that
01:48:33: it would be way easier to support because there's already libraries for that
01:48:39: they pretty much do all the heavy lifting
01:48:45: for rendering, so we don't have to worry about that,
01:48:48: but also in some ways limits how they can be used
01:48:51: but if that's a potential thing that you could find useful
01:48:56: we could do it; that would definitely be faster.
01:49:02: I would say it depends how
01:49:05: do you imagine the support working
01:49:08: next question, one last thing, what are our next steps in optimization?
01:49:14: so the next steps right now
01:49:16: PhotonDust is pretty much nearly finished
01:49:19: I think there's a few issues that still need to be resolved, it's been brought out of experimental
01:49:23: so please use it for new creations from now on, I strongly recommend
01:49:29: avoiding the legacy particle system at this point
01:49:31: the legacy particle system is going to be removed very soon, so it's just going to go poof
01:49:36: which means the conversion is going to be
01:49:39: automatic, everything is going to automatically be converted to
01:49:44: PhotonDust, once that's done, I'll be working on the
01:49:47: audio system, that's going to be the last big part
01:49:51: that needs to be fully moved into FrooxEngine
01:49:54: and after that it's going to be the Splittening,
01:49:59: like we're going to be reworking how FrooxEngine communicates
01:50:03: with Unity and then pull FrooxEngine into its own process
01:50:07: which is going to finish this major optimization milestone
01:50:10: and from that step alone I think we'll get a huge performance boost
01:50:15: because now majority of FrooxEngine will run with .NET 9
01:50:19: so those are pretty much the next steps you can expect, the audio system is going to
01:50:23: happen first, and after that the actual
01:50:27: steps to pull out FrooxEngine are pretty much going to be the next thing
01:50:31: to work on. And that's going to finish, as I
01:50:35: always say, this part of the performance optimization,
01:50:39: and I think that's going to give us enough of a performance boost
01:50:43: that we can focus on other things for a while. There's still a lot of other things
01:50:47: that could be done for performance like data model
01:50:50: rework, reworking the update system, variable rate updates, adding more
01:50:55: multi-threading for things, so there's still a lot of performance that's
01:50:59: going to be unlocked, but we can do these
01:51:03: sometime later after, like we maybe work on the IK, maybe work on the UI,
01:51:07: work on some other things.
01:51:10: Oh, wait, that's the one I just answered. We also got
01:51:14: nine minutes.
01:51:18: Oh, yeah, sorry, that was part of the question.
01:51:24: PixelNator is asking,
01:51:26: what part of the world is virtual reality in, and why are all the cute people from there?
01:51:32: Do Tanuki don't know?
01:51:35: That's an emote.
01:51:38: I don't quite understand the question.
01:51:41: They're like, what part of the world is virtual reality in, and why are all the cute people...
01:51:46: It's everywhere. Wherever you are.
01:51:48: Yeah, it's all around you, it's inside of you.
01:51:52: It surrounds us.
01:51:55: It penetrates us.
01:51:59: We're making a matrix reference.
01:52:04: It what? Okay.
01:52:07: No, sorry, that's Star Wars.
01:52:11: That's the force.
01:52:13: Oh, dear.
01:52:19: I got my references mixed up.
01:52:24: I'm gonna blow up, man.
01:52:27: I'm gonna schnapp it.
01:52:34: Hopefully that answers that question.
01:52:39: NukiKon's asking, I'd like to ask about
01:52:40: scalable vector graphics or HTML pages, but we don't have time. We have a little bit.
01:52:44: We still have, like, seven minutes. Vector graphics,
01:52:48: that's kind of SVG, I'm assuming, unless it's something else. HTML pages, that's its own
01:52:52: thing. I feel for that, what we want to do first
01:52:56: is have video streams. Because
01:53:00: with web pages, say you want to bring one in, there's not
01:53:04: really a way to guarantee it's gonna be synced for everyone. But we
01:53:08: kind of, you know, have a philosophy of everything being synced for everyone. So the way we would
01:53:12: approach it is we would have one user render it, and
01:53:16: synchronize the rendered output to everyone else, so
01:53:20: everybody sees the same thing.
01:53:24: So that's something I'd like to add, but it needs the video synchronization
01:53:28: to be implemented, so this is like a prerequisite.
01:53:33: Next question is from Nitra.
01:53:35: How do the angular velocity modules work for PhotonDust?
01:53:40: So for the angular velocity, you need
01:53:43: the rotation simulation module. And what it does is actually similar to the
01:53:47: position module. The module keeps track
01:53:51: of the angular velocity for the particles, and then each frame
01:53:55: modifies the rotation by the angular velocity. So if your angular
01:53:59: velocity says we have a particle, and
01:54:03: the angular velocity says it rotates this much around this axis
01:54:08: per second, it's just going to
01:54:11: modify that particle's rotation
01:54:14: by the amount each frame. So if this is 1 second,
01:54:18: it's going to rotate this much in half a second, in a full second,
01:54:23: in a quarter second; whatever the frame delta is, it rotates it by that, and over time
01:54:27: it accumulates. And angular velocity can potentially be
01:54:31: changed, it can be modified, which means maybe it rotates this much, and then
01:54:34: the velocity changes, so it rotates this way,
01:54:38: whatever it is. So it can
01:54:42: essentially just integrate the angular velocity over
01:54:46: time on the particle's rotation. So hopefully that
01:54:51: explains things.
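A minimal sketch of that per-frame integration, assuming a world-space angular velocity vector whose direction is the axis and whose length is radians per second; PhotonDust's actual module internals may differ:

```csharp
using System;
using System.Numerics;

static class RotationSim
{
    public static Quaternion Integrate(Quaternion rotation, Vector3 angularVelocity, float dt)
    {
        float angle = angularVelocity.Length() * dt; // radians covered this frame
        if (angle < 1e-6f)
            return rotation;

        Vector3 axis = Vector3.Normalize(angularVelocity);
        Quaternion delta = Quaternion.CreateFromAxisAngle(axis, angle);

        // Apply this frame's increment after the accumulated rotation;
        // normalizing keeps floating point error from building up over time.
        return Quaternion.Normalize(Quaternion.Concatenate(rotation, delta));
    }
}
```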
01:54:55: We've got 5 minutes left. rasmus0211
01:54:59: is asking, admin related, any plans to make it possible to allow use
01:55:03: of particular assets or worlds for specified clients within a time frame? Think rental props
01:55:07: or worlds? I'm actually not fully sure
01:55:11: if I understand, but we do have a number of business uses,
01:55:15: so if you have anything like that, send us an email,
01:55:20: I think it's at yellowdogman.com, we have a form for businesses,
01:55:24: actually it might be at resonite.com right now, I don't fully remember,
01:55:28: check both of them.
01:55:31: We do have specific business emails, so if you have some business use,
01:55:35: send us a message, what do you want to do, and we can have a chat about it.
01:55:41: We do have some companies that use Resonite for
01:55:43: a number of different things,
01:55:47: so it is possible, we just need to know more details.
01:55:57: We just have
01:55:58: stuff like
01:56:00: with some companies we do white label licenses as well,
01:56:03: where you can remove the Resonite branding and we make
01:56:07: a very curated experience where you open Resonite
01:56:11: and it launches you into a specific world, maybe there's a lobby for the company,
01:56:15: and from there you can go to training applications and so on.
01:56:18: There's a lot of possibilities.
01:56:21: If you'd like to know more, I would recommend asking him about this
01:56:27: in his office hours. He'll be able to give you a lot of good info on this.
01:56:34: Nightstar is asking:
01:56:35: for the Unity split, I'm curious if the object serialization will be a limit
01:56:38: for performance. I don't think so.
01:56:42: It depends what you mean by object serialization too, because there are multiple
01:56:47: stages where it happens.
01:56:50: You have the real-time serialization: when I'm doing
01:56:55: this, this is getting serialized and transferred to others, and when I'm grabbing this
01:56:59: and so on. Then you have one where you save things: when I save this
01:57:03: to my inventory, it gets serialized. That's a different type of serialization
01:57:06: than the real-time one. I would say the saving one
01:57:11: is heavier, because the real-time serialization is
01:57:15: designed to be very efficient.
01:57:19: I think it's very rarely much of a performance
01:57:22: limitation. Saving it to inventory and
01:57:26: just in general for persistence is a bit heavier, but it also
01:57:30: only happens when you actually save, it doesn't happen every single frame.
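A purely conceptual contrast between the two paths (not FrooxEngine's actual formats): the real-time path encodes only what changed this frame, while the persistence path encodes the whole object, but only on an explicit save.

```python
# Toy illustration of the cost difference described above.

def serialize_delta(obj, dirty_fields):
    # Real-time path: tiny payload, but it runs whenever something changes.
    return {name: getattr(obj, name) for name in dirty_fields}

def serialize_full(obj):
    # Persistence path: the entire object, but only on an explicit save.
    return {name: getattr(obj, name) for name in vars(obj)}

class Slot:
    def __init__(self):
        self.position = (0.0, 0.0, 0.0)
        self.rotation = (0.0, 0.0, 0.0, 1.0)
        self.name = "Grabbed thing"

slot = Slot()
slot.position = (1.0, 2.0, 3.0)
print(serialize_delta(slot, {"position"}))  # small: just what moved
print(serialize_full(slot))                 # everything: done once, on save
```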
01:57:36: And with the Unity split, it doesn't really
01:57:38: change much, other than FrooxEngine being able to
01:57:42: run with .NET 9, which is a much
01:57:46: better runtime, so the code that does the serialization will now run way faster,
01:57:50: so it's going to be even better, you know.
01:57:54: I don't think they mean the network, I think they mean between FrooxEngine and Unity.
01:57:58: I think they mean when we
01:58:01: are talking about telling Unity what to do.
01:58:06: That doesn't use object serialization at all.
01:58:11: There's no object serialization between FrooxEngine and Unity,
01:58:14: so that's not really relevant.
01:58:18: Yeah, it's going to use shared memory or whatever.
01:58:22: There's just
01:58:26: no serialization happening between FrooxEngine and Unity, not even
01:58:30: now, so
01:58:33: that's not going to affect it or limit it. For the actual communication,
01:58:39: we're going to use shared memory, which
01:58:43: pretty much means you can share a big chunk of memory,
01:58:46: or smaller chunks of memory, between multiple processes, and each of them can access it.
01:58:50: No, shared memory doesn't require serialization at all.
01:58:54: The operating system literally maps the same
01:58:58: physical memory region to multiple processes, so each one of them
01:59:02: essentially treats it as its own memory, except multiple processes
01:59:06: can read and write to it, which means you don't
01:59:10: need to do any serialization, you don't need to do any copying, you can literally just
01:59:14: share data between the processes. So we can fill buffers
01:59:18: with stuff and then just read them out in the other process
01:59:22: without having to push them through some pipe or some serialization
01:59:26: format or anything. They literally just share a chunk
01:59:30: of memory. It's extremely fast, extremely efficient.
01:59:35: Each process treats it
01:59:36: as if it was its own memory.
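A minimal sketch of that idea using Python's standard shared-memory support; this is illustrative only, and the actual FrooxEngine-to-Unity channel isn't shown here.

```python
import numpy as np
from multiprocessing import shared_memory

# The OS maps the same physical memory into both processes, so data moves
# across with no serialization, no pipes, and no copying.

# Writer process: create a named block and fill it like ordinary memory.
shm = shared_memory.SharedMemory(name="froox_demo", create=True, size=1024)
buf = np.ndarray((256,), dtype=np.float32, buffer=shm.buf)
buf[:] = np.arange(256, dtype=np.float32)  # "filling buffers with stuff"

# Reader process (would normally be a separate program): attach by name
# and read the exact same bytes directly, with no decode step.
reader = shared_memory.SharedMemory(name="froox_demo")
view = np.ndarray((256,), dtype=np.float32, buffer=reader.buf)
print(view[0], view[-1])  # 0.0 255.0, straight out of the shared block

reader.close()
shm.close()
shm.unlink()  # free the block once both sides are done
```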
01:59:41: With that, we've pretty much run out of time.
01:59:45: So thank you very much everyone for joining, thank you Cyro for helping me
01:59:49: answer your questions. Thank you
01:59:53: for asking all the cool questions and
01:59:57: the general support of Resonite, being part of the platform, being part of the community,
02:00:01: building cool content. Whatever you do is appreciated.
02:00:13: Thank you
02:00:15: for being here.
02:00:16: And thank you for listening to me and Cyro
02:00:19: ramble for two hours. So we'll see you with the next one.
02:00:25: See you around guys.
02:00:28: Wait, is there anybody to raid? I need to figure out the raiding.
02:00:33: Let's see, let's see.
02:00:36: Creator Jam? Is there
02:00:37: anything other than Creator Jam? Not that Creator Jam is bad, like,
02:00:43: they're excellent, and they started with MMC. Actually, that's another thing I should have said:
02:00:47: good luck with MMC. I'm kinda excited to see whatever everybody's gonna be working on
02:00:51: and all the cool creations.
02:00:54: I do wanna see if there's anybody else
02:00:56: streaming Resonite, because we do like to support them.
02:01:02: Nope, it's just Creator Jam, so Creator Jam
02:01:04: is getting raided. If you're a streamer, definitely recommend
02:01:08: streaming around this time, because we will raid you, you're gonna get raided.
02:01:14: But you're not streaming right now, so Creator Jam
02:01:17: it is. Say hi to Madra
02:01:20: for me. So, Creator
02:01:24: Jam.
02:01:29: What I'm doing for MMC, I'm just
02:01:32: working on, like, Resonite things.
02:01:37: You know, people are busy with things, and I get time to work
02:01:40: on things.
02:01:44: Yeah, okay. Say hi to Madra.
02:01:48: Oh.
02:01:54: Oh.