This is a transcript of The Resonance from 2024 December 8.
00:09: Hi. I wonder how many people would jump-scare with death. Hello, hello everyone.
00:18: Just like the first three frames of... Yeah, the first three frames are just gonna
00:22: be death. It's gonna be pretty quick so, like, maybe nobody even saw
00:28: it, like, as we'll see in a sec. Hello? Hello? Is there anyone in the chat? Hello?
00:40: Don't see anyone posting it. Maybe nobody saw a jump-scare and it's gonna be a
00:47: mystery to people like what happened. Hello, Nordvig. Hello, ShilloJillo. Happy Holidays.
00:56: Yes, you're kind of... Well, I'm festive. You need festive stuff. Hello, Connor.
01:05: So before we blow the thumbnail, yes. Okay. So yeah, that's a Gmod color. Hello. Hello,
01:18: actually, before we start, I have a question. How many of you got jump-scared by like the
01:22: first 10 seconds of the stream? Did anyone see the thing that we did or was it too soon?
01:32: How many of you got a jump-scare? No, you didn't see. ShilloJillo, scariest moment of
01:39: my life. Perfect. We have traumatized at least one person, which means we're good to start.
01:44: So hello and welcome everyone. I'm Frooxius and I'm here with Cyro, our engineering intern.
01:50: This is the fourth episode of The Resonance, which is essentially, here's that word,
01:57: it's essentially sort of a hybrid between office hours and podcast. We're here to answer questions
02:04: about the Resonite, whether it's like technical questions, whether it's more general about the
02:08: Resonite. You can also ask us some personal stuff here and there if you'd like to get to know
02:14: us a bit. And we're also going to be doing a bit of, like, you know, kind of like rambling,
02:18: talking about like more general vision of Resonite, you know, like where the platform is going,
02:22: where it's coming from, sort of like, you know, very high level view of it. And also going like
02:28: into a lot of more technical details on things. We are actually streaming from the pre-release
02:34: build because, oh, thank you so much for your subscription, shalom, shalom.
02:41: And I lost my train of thought. We're on the pre-release build because the new particle
02:48: system that I've been working on called PhotonDust is now in pre-release testing.
02:54: And I kind of want to give you a little bit more kind of, you know, a little bit of a showcase,
02:57: you know, show like how it works and also a little bit like, you know, deeper dive. How does it
03:02: actually work under the hood, you know, to help you understand how is it put together. The system
03:09: is going to be released to the public builds starting next week. While we still do
03:15: testing, it's going to exist alongside the old system. So we can like, you know, kind of use
03:19: both and it helps like, you know, anybody do testing without having to switch builds.
03:24: Once we're kind of comfortable with, like, the system working properly, you know, the old system
03:29: is going to be removed and it's only going to be PhotonDust. So we can like use like, you know,
03:34: this kind of stream as a little bit of a primer on it as well. We are seeing like lots of people
03:39: playing with the system, making like a really, really cool, you know, making really cool like
03:45: visuals with it, like, some have been popping up on Twitter. I might actually show some of them as
03:48: well. I have like the video saved. But with that, let's get started on the questions and
03:56: once we kind of field some of them, we're gonna, we'll see like if it happens naturally or if we
04:01: just have a cut-off point and be like, you know, okay, we're gonna talk about PhotonDust and
04:05: showcase it. So before we start, actually not just before, just, you know, make sure your question, oh my
04:14: god, there's even more. Thank you Finesseformer for the subscription and thank you Ty Whitehall
04:19: for the subscription. Thank you. Just to make sure like, you know, your question pops up on
04:26: our thing here. We have like this cool panel built by Chuck the Fox author. Make sure your
04:32: question actually ends with a question mark. That way it pops up on our pin messages and, you know,
04:38: we don't lose it and make sure like it's answered. So with that we can get started. So the first
04:46: question is actually from Corey2099. They're saying, Gasp! VR Cyro, what's the occasion? Well,
04:56: I missed the last stream and so I might as well not miss this one and be extra here instead of
05:04: just a little guy in desktop. Yeah, and we got face tracking with all of this. It's actually kind of,
05:10: I'm gonna get to Drake's question next because this one's also kind of related.
05:14: Granduke is asking, is Cyro on Linux right now or Windows? I am actually on Linux right now. I'm using
05:26: WiVRn. That's W-I-V-R-N. It's really, really good actually. I actually just wrote the face
05:35: tracking to work on Linux too with the Resonite Steam Link integration. It sends it in the same format
05:44: so it can recognize it. Certainly cool. They're very extra expressive. So I still keep thinking
05:51: about looking at the lens instead of the preview. It's kind of messing with my brain.
05:56: We need to make it so it's kind of like in front or something. But anyways, yeah,
06:02: Cyro's been on Linux. You've been trying to get Linux to work with VR really well for a while
06:08: and you've been reaping the fruits of your labor. We've been literally talking about it right
06:23: now. So here's the problem. We don't have access to Unity's particle system code. So I
06:37: cannot look at how it is implemented. And in order to convert anything you've built with the
06:49: old system, there's conversion code that takes the settings and converts them to a PhotonDust equivalent. While doing that,
06:57: I've discovered some nasty rabbit holes where for some reason in certain modes,
07:05: Unity's own particle system coordinate system doesn't match the rest of its coordinate system.
07:12: It took a lot of pulling hair to figure out what's even happening and why things are looking
07:18: different and why things are misaligned. It's been a pain, but I've gotten through it.
07:27: And the hardest part about it is because we don't have access to the source code,
07:31: I can only mess with the Unity system, try different things, observe its behavior,
07:37: and from that infer how is it working. Which is an extra painful process, but
07:44: maintaining content compatibility is one of our highest principles.
07:51: We hate breaking content compatibility. If you build something, we want it to work.
07:58: Regardless, even if we change a complete system, we want it to work as much as possible. There's
08:03: a few gaps there, but most things will generally keep working, and it's one of the reasons why
08:10: we put so much effort into maintaining the compatibility. Part of the testing that
08:18: PhotonDust is going through right now is making sure the compatibility works. So people have been
08:23: throwing lots of different items at it, helping to isolate where the discrepancies and
08:28: differences are, and we'll be resolving most or all of these before the actual switch happens.
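(To give a rough idea of what that conversion step involves, here is a purely hypothetical sketch; none of the type or field names below are the real FrooxEngine ones. It only illustrates translating settings field by field and compensating for the coordinate quirks described above.)

```csharp
// Hypothetical sketch only: placeholder types standing in for the real
// legacy-particle-system and PhotonDust settings.
using System;

class LegacySettings
{
    public int MaxParticles;
    public float StartLifetime, StartSpeed, StartSize;
    public bool ConeEmitter;
    public float ConeAngleDegrees;
}

class PhotonDustSettings
{
    public int MaxParticles;
    public float Lifetime, Speed, Size;
    public float ConeAngleRadians;
}

static class LegacyParticleConverter
{
    public static PhotonDustSettings Convert(LegacySettings old)
    {
        var result = new PhotonDustSettings
        {
            MaxParticles = old.MaxParticles,
            Lifetime = old.StartLifetime,
            Speed = old.StartSpeed,
            Size = old.StartSize,
        };

        // The kind of quirk discussed above: some legacy modes use a different
        // convention (for example degrees vs. radians, or a rotated emitter axis),
        // so the converter compensates instead of copying the value verbatim.
        if (old.ConeEmitter)
            result.ConeAngleRadians = old.ConeAngleDegrees * MathF.PI / 180f;

        return result;
    }
}
```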
08:37: And also, this one I've been kind of hoping for this one, because Grant asked me this
08:45: one in the moderation office hours that were happening right before the stream. Grant UK
08:53: is asking, so Froox, what's your favorite pizza? And I do in fact have a very, I'm actually
09:00: gonna stand up for this one too, and I'm kind of out of frame with that. I guess, okay I'm not
09:08: gonna stand up, I'm gonna sit down. I didn't think that through. What I'm gonna tell you,
09:13: my favorite pizza is not favorite because of how it tastes. It's favorite because of how people
09:23: react to it, and I'm gonna spawn it in a second. Let's see, my favorite pizza is the USA pizza.
09:34: So there's this store that delivers pizza in my city, and on their menu I saw the USA pizza.
09:45: It even has a little USA flag over here, and look, it's french fries on the pizza.
09:53: And I've shown this picture to a bunch of my friends from the US, and everybody's like
09:58: what the hell is this? And then like two people independently of each other were like,
10:03: Froox you need to buy it so you can 3D scan it. And I was like, say no more. So one time I was
10:07: hungry and I was like, I'm gonna do it, I'm gonna buy the pizza, you know, see how it is.
10:14: Just so I can 3D scan it. And well, there it is. I opened the box and I was kind of expecting,
10:23: golden fries and everything. And I look at it, I'm like... I was hoping this pizza would strike
10:28: terror into people's, you know, hearts. And instead, you know, this is just going to make
10:32: them cry. But I guess, you know, this is what it is. But then like, you know, I started showing
10:38: this around to people and I've started getting, you know, reactions like this, like this is,
10:43: you know, Minx over here. And this is literally, this is literally him, like five seconds after
10:48: seeing the pizza for the first time. And I'm like, this, this is why this is my favorite
10:52: pizza because, you know, I can, I've only had to suffer eating it once, but now I get to
10:59: pester people with this forever. And this is why it's my favorite pizza. Also, another
11:05: interesting fact, you might be, you know, noticing there's some corn on the pizza. You see,
11:10: corn is heavily subsidized in the US, therefore it belongs on the US pizza. You're welcome, Grant.
11:16: I hope this answers your question to your satisfaction. And I'm going to delete this
11:22: abomination. I can say the reactions I've been getting have been well worth the $7 I
11:31: paid for the pizza. Its value keeps growing over time. It's like horrible and like stale
11:39: and milky and gross looking on the scan. I need to repurpose this with Gaussian Splatting.
11:46: Or yeah, you sure do. You sure that's an option you can take.
11:51: And this is also why I was looking forward to the question. Grant asked me this during the office hours and
11:56: I was like, ask me on the stream, because I'll show you. So I hope, Grant, that you are satisfied.
12:02: So hopefully going to a more serious question. Dusty Sprinkle is asking,
12:06: is there going to be a second round of phase two tests for PhotonDust?
12:09: It feels like the conversion process probably needs some work, but I don't know what the minimum
12:12: viable concept for it is. There's not really like any rounds to this thing. Like usually when we
12:17: do testing, it's sort of a continuous process. So we get, you know, we get the reports, you know,
12:23: of like compatibility issues. And then we just kind of keep releasing builds and fixing things
12:28: up and ask people, you know, please keep testing more, please keep testing more. And once things
12:33: kind of quiet down on the reporting and we're like, okay, like there doesn't seem to be really
12:37: any issues, you know everything like we tried like works, all the issues have been resolved or most
12:42: of them have been resolved because in some cases, you know, in some cases there's going to be some
12:49: discrepancies that like might be not worth fixing. Like if things look close enough, but you know,
12:54: there's a little bit discrepancy, we can let some things slide. But it's kind of, it's very case by
13:00: case basis. But the gist of the process is we just keep testing until we're comfortable with it.
13:08: Once we're comfortable with it, like, you know, we release it fully. So there's not really like
13:13: multiple phases. There's just going to be kind of continuous, like updates with it, like builds.
13:17: And we're going to always say, you know, this build fixes this and this, it fixes this and this,
13:21: you know, and then ask people this more. And once it kind of gets quieter, like, we're like, okay,
13:26: this is in a good state. Next question, Finesseformer. Question: it seems my avatar is more
13:35: expressive in VRC using VR-15 than in Resonite using Steam Link. Is it just my imagination or
13:40: is that true? So I think that's probably to do with, like, you know, what blend shapes you have
13:45: mapped. It's kind of hard to say in general, but all of the blend shapes that you get from
13:51: face tracking, depending on the device you use, should be available in Resonite.
13:55: So my guess would be that some of them might not be mapped on an avatar because there's like two
14:05: parts to the process. One of them is, you know, Resonite kind of feeding whatever face tracking
14:11: data it gets, you know, into the generalized input system. And the other part is actually mapping
14:17: those blend shapes, you know, mapping those, you know, weights to the blend shapes on an avatar.
14:23: And for that, like, Resonite, it has a heuristic process. It tries to like, you know,
14:26: guess which blend shapes correspond to which ones. But that process is not perfect. So like,
14:33: you know, sometimes it kind of misses a bunch and you have to like assign it manually.
14:36: It also might be just a question of strength. Like for example, the values, in some cases,
14:43: like they might be too weak. One of the reasons we kind of do it is because sometimes we had, like,
14:52: cases where, how do I put it, like, on some avatars, essentially, the blend shapes were
14:58: overdriven. So we kind of tend to, like, you know, make more conservative defaults.
15:05: And then like people, you know, kind of tweak it from there. So
15:09: this could be kind of a combination of these things.
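(For a rough sense of what that heuristic mapping step does, here is a hypothetical sketch; the names and aliases are made up and the real FrooxEngine logic is more involved. Tracked parameters that find no matching avatar blend shape simply stay unmapped, which is one likely cause of a less expressive avatar.)

```csharp
// Hypothetical sketch of name-based blend shape mapping; not the actual FrooxEngine heuristic.
using System;
using System.Collections.Generic;
using System.Linq;

static class BlendShapeMapper
{
    // Common aliases a face tracking parameter might appear under on an avatar.
    static readonly Dictionary<string, string[]> Aliases = new()
    {
        ["JawOpen"]    = new[] { "JawOpen", "MouthOpen", "jaw_open" },
        ["EyeClosedL"] = new[] { "EyeClosedL", "BlinkL", "eye_blink_left" },
        ["SmileL"]     = new[] { "SmileL", "MouthSmileLeft", "smile_l" },
    };

    public static Dictionary<string, string> Map(IEnumerable<string> avatarBlendShapes)
    {
        var shapes = avatarBlendShapes.ToList();
        var mapping = new Dictionary<string, string>();

        foreach (var (parameter, aliases) in Aliases)
        {
            // Pick the first avatar blend shape whose name matches a known alias;
            // anything that doesn't match needs to be assigned manually.
            var match = shapes.FirstOrDefault(s =>
                aliases.Any(a => s.Equals(a, StringComparison.OrdinalIgnoreCase)));

            if (match != null)
                mapping[parameter] = match;
        }
        return mapping;
    }
}
```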
15:13: Yeah, I mean, I don't know, I'm pretty, I think I look pretty expressive and I'm using like the
15:19: Steam Link stuff or the Steam Link standard at the very least for face tracking.
15:24: All the data like should be there. So it's probably just a question of like, you know,
15:28: how it's mapped on an avatar. So that might kind of require some tweaking to, like, get it looking the
15:32: way you like. Next question, Grant UK, have you considered having something like a community
15:39: manager and team to act as a primary point of contact for the team so they can focus on making
15:43: the game? Not as exclusive point of contact, but first point of contact. We do have like some
15:49: people like in those kind of roles. Most of the stuff it's kind of spread out because we have,
15:54: you know, different departments. So for example, you know, we have the moderation team, they've
15:58: just done, you know, their office hours and they kind of, you know, they're sort of the point of
16:03: contact for moderation like things. We do have, you know, people who handle our marketing, you
16:09: know, Chroma and Arial and they have been kind of like, you know, people like they've been like,
16:13: you know, kind of talking with the community, especially about like, you know, promoting Resonite
16:17: and so on. So it's kind of, it depends. We also have like, you know, business team. So for this
16:24: one, we actually have a business email and we have like a form on our website and that goes into
16:29: shared inbox. So multiple people on the business team can see those. And I know like Bob, Prime,
16:35: and sometimes Dean have been fielding lots of those. Canadian has been doing some as well.
16:40: So it's kind of like, you know, it's a spread responsibility, like depending on what kind of
16:45: contact it is, because if you want to, you know, talk about some kind of business thing,
16:49: there's people on the team who are better to talk to if you want to talk about, you know,
16:54: moderation issues, there's, like, different people. If you want to talk, like, in general, like,
16:58: it kind of depends. I would ask like, can you clarify a little bit more, like, you know, like,
17:04: what kind of kind of contact like are you thinking of?
17:10: Next question is GlavinVR. How long do you think it will take to get through the audio optimization
17:15: and what do we still need to go through, with all of this stuff, to get to .NET 9?
17:24: In regards to how long it'll take, we generally don't do time estimates because, like, it's
17:31: hard to do them and we don't want to commit to timelines like, you know, we are not confident
17:34: we can keep. My general expectation of the audio system is it's going to take less time than the
17:41: particle system. There's not as many kind of moving parts to it and there's not as many kind
17:47: compatibility things to get through. The main part of like doing the audio system, there's like
17:56: things like, you know, handling the audio rendering like spatialization.
18:02: There's probably going to be a big chunk of it. My plan there is we actually utilize the
18:07: Bepu Physics structures to like, you know, do stuff like query what, you know, what audio
18:12: sources are at this point, you know, in space. So we can efficiently get a list of them.
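(A rough sketch of the kind of query described here; this is not the actual BepuPhysics or Steam Audio API, and a real implementation would use an acceleration structure rather than the linear scan shown.)

```csharp
// Hypothetical sketch: collect the audio sources whose falloff radius reaches the
// listener, so only those get handed to the spatializer.
using System.Collections.Generic;
using System.Numerics;

record AudioSource(Vector3 Position, float MaxDistance);

static class AudioCulling
{
    public static List<AudioSource> GatherAudible(
        IReadOnlyList<AudioSource> allSources, Vector3 listenerPosition)
    {
        var audible = new List<AudioSource>();
        foreach (var source in allSources)
        {
            float radiusSquared = source.MaxDistance * source.MaxDistance;
            if (Vector3.DistanceSquared(source.Position, listenerPosition) <= radiusSquared)
                audible.Add(source);
        }
        return audible;
    }
}
```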
18:18: And then like, you know, we do whatever processing through Steam audio to handle the spatialization
18:23: and, you know, whatever else needs to happen. There's also like one part that Cyro already
18:28: worked on, which is the reverb zones. Because that's one of the features we kind of have and
18:34: we need to like, you know, preserve it in some way. So Cyro has done some research on like
18:39: libraries and has, like, integrated, you know, the Zita reverb library, so we can
18:46: actually, you know, process audio data with it. And also done some, like, mapping, where for
18:50: each of the presets we have for reverb zones, you found, like, something that sounds the closest.
18:55: Like, you know, you can even talk about that a little bit more.
18:58: Yeah, so I actually just recently went back in and touched it up again. And it's actually,
19:05: um, I implemented it such that like, you can process like batches of samples now rather
19:10: than having to do one at a time. So it's way faster. Like I can process like five minutes
19:17: of audio in like 10 seconds, like, on a standalone program. It's crazy. But yeah, I found it's not,
19:30: like... I looked into some solutions, like even using, like, training
19:37: a neural network to try and interpolate the parameters. But it's kind of just going
19:44: to be, like, a closest estimate. Like, for the presets we have for
19:50: the audio, I've gone through and I've made equivalent presets in the Zita Reverb, which
19:56: is provided by Soundpipe. And, just giving credit, the
20:03: library is made by someone named Paul Batchelor. They're cool. But, oh dear, I've lost my train
20:11: of thought. Where was I? Oh yes. Yeah, if you have a certain parameter
20:25: set, like, you know, Sewerpipe or Long Hallway or whatever, it'll choose the equivalent Zita
20:34: preset. But I couldn't find a good way to, like, interpolate them because they're just not directly translatable.
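(To illustrate the batching change being described, a hypothetical sketch; 'IReverb' and its methods stand in for whatever the actual Zita/Soundpipe binding exposes. Processing a whole buffer per call means far fewer crossings into the native library than one call per sample.)

```csharp
// Hypothetical sketch of per-sample vs. batched reverb processing.
using System;

interface IReverb
{
    float Process(float sample);        // stand-in: one sample at a time
    void Process(Span<float> samples);  // stand-in: a whole buffer at once
}

static class ReverbExample
{
    public static void ApplyPerSample(IReverb reverb, float[] samples)
    {
        // One call into the reverb per sample: correct, but slow.
        for (int i = 0; i < samples.Length; i++)
            samples[i] = reverb.Process(samples[i]);
    }

    public static void ApplyBatched(IReverb reverb, float[] samples)
    {
        // One call per buffer: the same result, dramatically faster in practice.
        reverb.Process(samples.AsSpan());
    }
}
```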
20:41: Yeah, this is, like, one of those things where, in part,
20:48: I kind of regret making the choice, like, you know, of making just the reverb zone,
20:55: because like now it's a kind of like a complicated point where the reverb is, you know, very specific
21:00: to the solution that Unity uses and makes it harder for us to translate, but we can kind of get
21:04: closer. We still kind of get, you know, the similar vibe for it. And, you know, like, and going
21:12: forward, we are kind of, you know, have our own audio system, which means like, you know, we now
21:16: fully control the process and we can make sure, you know, compatibility is maintained long-term
21:21: because even say, you know, say like we offer more reverb kind of solutions, you know, more different
21:26: like reverb filters, we can still keep this one because it's a relatively simple library. We have
21:29: the full integration with it and we can have, you know, there'll be a toggle. So that kind of gives
21:36: us, you know, a better way kind of going forward. But the audio system, it's one of the big
21:41: parts that kind of needs to be done, but I would expect like much smaller than like the particle
21:45: system because there's not as many moving parts. We also kind of avoided, like, you know, adding
21:50: more to it because if we added more to the existing audio system, then we would have more things
21:56: that need to be reworked and it would essentially make it take longer. And we knew, like, you know, this,
22:02: we knew that this would eventually happen. The good part is also, like, the audio system is
22:06: similar to the particle system. It's a hybrid, but most of the things for the audio system are
22:12: being handled in FrooxEngine. The part that needs to be reworked is, like, where we actually send
22:18: Unity, you know, the audio data for individual, like, audio sources, you know, to, like, play from a
22:25: particular point in space. But, like, all the decoding, you know, buffering, everything that already
22:32: happens within FrooxEngine. We have our own handling, you know, for, like, different audio
22:36: formats. We have handling, you know, for resampling, you know, things like that.
22:41: So those things we don't have to rework, we just have to like move the actual spatialization
22:46: and rendering of the audio, you know, to our, you know, to FrooxEngine, and that way, like,
22:50: it's not gonna be tied to Unity anymore. After the audio system, the actual integration with
22:56: Unity has to be reworked. I'm not gonna go super into details because I've done that on the
23:01: previous episode. I've also published, I've published like a video on our official Resonite
23:07: channel, which goes over the performance update, but I've kind of drawn, you know, some kind of
23:10: like diagrams and so on to give you a better understanding. So I don't want to like spend
23:14: time on this, on this one. If you're interested in more in depth, I recommend watching it.
23:19: It's going to give you a much better understanding how the performance update happens and what
23:22: steps need to happen for it. But reworking the Unity integration is the other part. That
23:29: one, I also don't have estimate on how long it's gonna take. It's one of those things,
23:33: you know, where it's kind of like, it's staying and then like, we need to spend time kind
23:36: of like pulling it apart, be like, okay, this, this, this, this. I have some general idea,
23:41: but like we'll try to get through it as fast as we can, essentially.
23:47: Yeah, I can, I can see why you like playing with audio stuff though. Cause it's like,
23:52: it's fun. I, I, I kept just listening to my music. Once I implemented it, I was like,
23:58: what does this setting do? Oh, it sounds like that. What does this do? Oh, it sounds
24:02: like that. Oh, this one makes my music explode. That's great too.
24:06: Yeah, it's like, it's one of the fun things like, you know, with like working with audio
24:08: visual stuff is like you poke something or you make even small change and like, you know,
24:12: suddenly does like this cool thing. And it's actually, I'm going to showcase this, you
24:15: know, with PhotonDust because PhotonDust is very easy to write modules for it. And like
24:20: I've written some modules that took me like five minutes to write, but then I spent a
24:25: while just playing around with them, making all kinds of cool effects. And it's like really
24:28: easy to do those, those kinds of systems. Also, thank you so much for the subscription
24:33: and Nyalov, thank you for the subscription too. I'm going to clear this out.
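(As a loose illustration of what a "module" means here, a hypothetical sketch; the real PhotonDust interfaces aren't shown on stream, so this only conveys the idea of a small pluggable per-particle update step.)

```csharp
// Hypothetical module shape: a tiny piece of per-particle update logic.
using System;
using System.Numerics;

struct Particle
{
    public Vector3 Position;
    public Vector3 Velocity;
}

interface IParticleModule
{
    void Update(Span<Particle> particles, float deltaTime);
}

// The "written in five minutes" kind of module: constant upward drift plus mild drag.
class DriftAndDragModule : IParticleModule
{
    public Vector3 Drift = new(0f, 0.5f, 0f);
    public float Drag = 0.2f;

    public void Update(Span<Particle> particles, float deltaTime)
    {
        for (int i = 0; i < particles.Length; i++)
        {
            particles[i].Velocity += Drift * deltaTime;
            particles[i].Velocity *= 1f - Drag * deltaTime;
            particles[i].Position += particles[i].Velocity * deltaTime;
        }
    }
}
```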
24:41: Stelonaro, what's the most interesting thing you've seen made with PhotonDust? Let me actually,
24:46: I'll just bring this in. There's actually kind of a cool thing because like whenever
24:52: we make an update and there's something, you know, there's like visual audio, this
24:58: one member from the like Japanese community, Orange, they always do these like super cool
25:03: videos. And like when I released PhotonDust to this thing, I was actually thinking in
25:07: my head, I wonder what Orange is going to do with this. It's probably going to make
25:11: something cool. And you know, and like not even 24 hours later, he just made like this
25:16: video that I'm going to bring. Give me a sec. And I'm like, I was looking forward to this.
25:25: And then Orange actually organized like an event with the Japanese community where they
25:29: were just, you know, messing around with the new particle system and making cool things.
25:34: So I have a few that I can show you. So this is the orange one. I'm going to pull this
25:41: one in. Let's see, how does it look? I'm going to click it. There's no sound to it. So this
25:49: is the thing that Orange did with PhotonDust. And he even specifically mentions in the tweet
25:54: that like you can have all these particles and FPS is not really dropping, which we'll
25:59: get to like, you know, later. There's another one. This one, this one's also like one of
26:05: my favorites. This is like a nebula like kind of looking thing. So if I play this one, you
26:12: can see this is like all particles and they're like, you know, moving.
26:19: Yeah, it's almost like, it's almost like there's so many, it's almost like a volumetric cloud.
26:25: Oh yeah. I mean, pretty much it's like a nebula thing. Like this one I really like,
26:29: there's sort of a kind of subtle motion. I'm not sure like, you know, if it's fully coming
26:32: through on the stream, but it's, it's very impressive.
26:36: Oh, there's also this one by Rapids. Is this the right one? I think this should be the
26:43: right one. This one I also like really liked. So I'm kind of pushing Cy around with this
26:49: one. I'm going to play this one. This is a super cool effect. People have been playing
26:54: a lot with the turbulent force, which is one of the things I added because it can make
26:59: really cool visuals with it. Oh yeah. Like this is, this is super appreciated. Like
27:06: it kind of makes me happy, you know, to see like people having fun with it and making,
27:10: making all these kind of cool visuals. I think one of the things that I asked you to implement
27:16: was like having a 3d texture to basically make a vector field for the particles. So
27:23: like if you make your 3d texture right, you can make a vector field for the particles
27:27: to flow through in the direction that you want them to go. Okay. I can make them like,
27:32: you know, do cycles and like loops. And like, it's, I've made like one of the devlog videos,
27:36: like where it was like messing around with it. I was like, this is like, it's really weird how
27:41: they behave, but also kind of fun to play with. So like once it goes on a main build, I expect
27:45: people to be just having, you know, having a field day with it. But yeah, like those videos,
27:50: like, you know, they're super cool. And like, to me, that's like one of the most exciting parts
27:53: of development is like, you know, just seeing people have fun with the new system and exploring
27:57: all the things, you know, it can do. Sometimes it's like, especially during a testing phase,
28:02: exploding themselves. And it's just kind of like, you know, it makes it feel like, you know,
28:09: like all the work, you know, put into the system was, like, really worth it. So yeah,
28:16: I expect, you know, to see a lot more. So thank you, Kaibes, for the
28:20: subscription as well. Thank you.
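(A rough sketch of the 3D-texture vector field idea mentioned a moment ago; the types are made up, and the real module works on engine assets rather than a raw array. Each particle samples the field at its position and treats the stored value as a velocity.)

```csharp
// Hypothetical sketch: advect particle positions through a velocity field stored in
// a 3D grid, standing in for a 3D texture whose RGB channels encode an XYZ direction.
using System;
using System.Numerics;

class VectorField
{
    readonly Vector3[,,] data;
    readonly Vector3 boundsMin;
    readonly Vector3 boundsSize;

    public VectorField(Vector3[,,] data, Vector3 boundsMin, Vector3 boundsSize)
    {
        this.data = data;
        this.boundsMin = boundsMin;
        this.boundsSize = boundsSize;
    }

    public Vector3 Sample(Vector3 worldPosition)
    {
        // Map the world position into grid coordinates and clamp to the edges.
        Vector3 uvw = (worldPosition - boundsMin) / boundsSize;
        int x = (int)Math.Clamp(uvw.X * (data.GetLength(0) - 1), 0, data.GetLength(0) - 1);
        int y = (int)Math.Clamp(uvw.Y * (data.GetLength(1) - 1), 0, data.GetLength(1) - 1);
        int z = (int)Math.Clamp(uvw.Z * (data.GetLength(2) - 1), 0, data.GetLength(2) - 1);
        return data[x, y, z];
    }

    // Each frame, push a particle along whatever direction the field stores there;
    // an authored field can produce loops, cycles, and other directed flows.
    public Vector3 Advect(Vector3 position, float deltaTime)
        => position + Sample(position) * deltaTime;
}
```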
28:25: Granny K is the next question. Granny K is asking, is there any intentions to make importing avatars
28:31: easier? For example, better hand and head detection, Unity package import, et cetera.
28:35: Yes. So generally the avatar creator, like it needs an overhaul. One of the, and there's like
28:42: several things on it. The original, it wasn't really designed with, like, heuristics much.
28:48: Um, so it's not, like, you know, it's not really good at, like, figuring things out automatically, which
28:52: causes manual steps, and adding heuristics tends to make it kind of worse in some parts,
28:58: at least, you know, for the alignment of things, but also it's sort of like, you know, it's,
29:02: it's one time process, uh, where like, you know, like you click it and you're done and everything's
29:07: kind of set up. Um, so my general kind of idea to like rework it is, uh, there's like multiple parts
29:16: to it. One of them is to build a new version that's just, from the ground up, like, you know, built
29:21: with, like, good heuristics. So it can actually figure most of the stuff out on its own, you know,
29:25: where the hands are, where the feet are, where the head is, where the eyes are and sort of like, you
29:29: know, set, set it by default. So most of the time you just have to like, you know, click a button
29:33: and like, you know, you're done. Um, the other part is, is, uh, instead of making it sort of
29:40: a one-time tool, make it something where you can essentially use it multiple times,
29:46: like, to make adjustments. So you would, like, turn something into an avatar. It does the heuristics
29:51: to do the initial positioning, but like, you know, you can adjust it on the fly and then, you know,
29:55: you hide the visuals for the adjustment. But then you're like, okay, on this avatar, the
30:00: hand positions feel off, the head position feels off. So you just activate it again, you
30:10: make the adjustments, and, uh, you know, and then, like, you hide them again. So at least for part of the process,
30:16: make it so it's, it can be repeated multiple times because right now, if you want to, you know,
30:21: make adjustments after using it, um, you need to, um, you essentially need to go to the inspector,
30:29: you know, and like make the changes there. Uh, the other part is the Unity package import.
30:33: That one, that one's a little bit complicated topic. Um, because Unity package, it's a format,
30:40: it's very specific to Unity. So in order, um, for us to fully support it, we would essentially
30:47: need to implement like a lot of Unity, which also is not going to be, you know, usable long-term
30:52: because if they change anything with a format, now we have to, you know, reverse engineer those
30:56: things again. We could also just, you know, extract the FBX, you know, or the files in the Unity
31:01: package. But it's also kind of iffy because like the Unity package might have the avatar already
31:05: fully set up. Uh, there's also like one approach that I feel would work the best, uh, is, uh,
31:13: introducing actual Unity SDK. Um, because that way you can just, you know, import Unity package into
31:19: Unity and, uh, then use the SDK to convert an avatar that's already fully set up in Unity into a
31:32: Resonite avatar, but, like, you know, from within Unity. Um, so that's, like, you know, a possible
31:38: approach there. Uh, there's also, like, where the, uh, reworked avatar creator might help,
31:45: because if you have like, you know, better heuristics, we can, you know, call them on
31:48: the imported data. So it just kind of sets it up for you and maybe give it more hints, you know,
31:53: from, from Unity. So it's able to figure most of it out on its own. Uh, next question. Oh,
32:01: chat question. Uh, oh, they had a follow-up somewhere. Let me see if I can find it real
32:10: quick. Just by, yeah. If you have like a follow-up, uh, please make sure like, you know,
32:17: there's a question mark. So Grand UK says, what I mean by a community manager is someone who can act
32:25: as a kind of filter to accumulate complaints and requests so that it's easier for the team to know
32:29: what to focus on. The team leads could still, can still be approached for specific queries,
32:34: but generic stuff to go through a community manager, who's already got connections in the
32:39: community to get the response of the community and their needs. So, uh, this one's, it's also
32:45: kind of like a shared role right now. There's not like, you know, single kind of point of contact.
32:49: Um, it's, um, oh, wait, don't, don't move it. Uh, oh, sorry. I have video. Oh, I'll hold it up for
33:00: you. Does it, is it, oh, it drops. I thought it was okay. Cool. Yeah. So we don't have like a very
33:10: like specific person, like, personally, you know, for this kind of stuff. Uh, we do have, like,
33:14: different people on the team; they will, like, you know, bring up certain kinds of
33:18: complaints and, uh, you know, requests, and they will kind of bring it up, like, you know, during
33:22: team meetings. And we kind of, like, discuss things. It's something we kind of, like, you know, consider
33:26: having, like, a very specific person to do it, but, um, you know, us kind of being a small team, like,
33:33: we tend to share a lot of responsibilities or spread a lot of responsibilities between multiple
33:38: people. Um, we've been having, you know, people from the marketing team that kind of, you know,
33:42: go around the community and they kind of relay some of the kind of general kind of feedback and
33:47: so on. Um, so we'll see, like maybe, maybe at some point, um, but yeah, right now, like there's
33:57: not like a single person, like you can kind of go to like, you know, for everything. And I'm not
34:01: sure how well it's going to like work as well because, um, it's, you know, like there's only
34:06: so much, like, a single person can kind of handle. So it can't really be, you know,
34:15: just a single person. Next question, uh, AK, AK underscore underscore 222 is asking,
34:24: Hey, can we import a Unity scene into the game? So right now you can't directly, uh, you need to
34:30: like, use something in Unity to export it as a glTF or, you know, FBX and import that. However,
34:36: we do have a Unity SDK on the roadmap. Uh, it's on our GitHub. Uh, if you'd like to see this,
34:42: you know, happen sooner, uh, I'd recommend giving that issue an upvote. Uh, what it'll
34:47: essentially do is give you a tool where you load it up in Unity and you can convert
34:53: a Unity scene or Unity object, you know, whatever you have, like, in Unity, you convert it into a
34:58: Resonite equivalent. And the way we want to approach this is by making it sort of like
35:05: a framework for making SDKs where it's like, you know, like if you're familiar with web browser
35:09: terminology, they have something called the DOM, the Document Object Model. And if you're,
35:16: for example, writing JavaScript, it's a way, you know, for you to manipulate, you know, the page.
35:21: We'd offer something that's kind of similar that you can kind of access over the network.
35:25: And that would represent the data model. And then there would be, like, you know, an API that's easy
35:30: to integrate into other solutions. So what would happen is Unity would have, like, you know, a part
35:36: that kind of connects to Resonite, and it uses that to, like, you know, build out a scene, you
35:40: know, based on its own scene and kind of, you know, kind of sync it up. The way we want to approach
35:48: it is, you know, build sort of the base of it where it converts a lot of common unity stuff,
35:53: but make it easy for anyone to add additional conversion scripts. So for example, if you want
35:58: convert specific materials or components or even like more complicated setups,
36:04: you can add extra code to the SDK to handle conversions, you know, of those bits and kind
36:10: of expand it or modify it to your needs. And that way we would like, you know, we would essentially
36:16: allow the community to expand this SDK and like, you know, if you have very specific components,
36:22: you know, we can build your own tooling around it. And also it would allow for other SDKs,
36:29: for example, you know, Godot SDK, Unreal SDK, which connects, you know, to the same Resonite,
36:34: like the same Resonite sort of, you know, DOM, Document Object Model, equivalent,
36:42: and use that, you know, manipulate the scene in Resonite and, you know, pipe data in and out.
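(To make the "add your own conversion scripts" idea concrete, a hypothetical sketch; no such Resonite SDK interface exists yet, and 'SceneNode' stands in for whatever remote document-model handle the SDK would expose.)

```csharp
// Hypothetical sketch of an extensible converter registry for a Unity-to-Resonite SDK.
using System;
using System.Collections.Generic;
using UnityEngine;

class SceneNode { /* stand-in for a remote handle into the Resonite scene */ }

interface IComponentConverter
{
    Type UnityType { get; }
    void Convert(Component source, SceneNode target);
}

static class ConverterRegistry
{
    static readonly Dictionary<Type, IComponentConverter> converters = new();

    // The base SDK would register converters for common Unity components; a community
    // project could register extra ones for its own custom components and materials.
    public static void Register(IComponentConverter converter)
        => converters[converter.UnityType] = converter;

    public static void ConvertObject(GameObject source, SceneNode target)
    {
        foreach (var component in source.GetComponents<Component>())
        {
            if (converters.TryGetValue(component.GetType(), out var converter))
                converter.Convert(component, target);
            // Components with no registered converter are simply skipped.
        }
    }
}
```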
36:47: Oh, Niall Love is asking, can you boost cybersound?
36:53: Let me go to my audio settings. Cy-Cyber, can you say something?
36:59: Hello? I can say a long sentence: meow, meow, meow, meow, meow, bark, woof, squeak.
37:06: I can't unfortunately, hold on, I need to add a filter because you're,
37:10: right now, you are like at 100%, so I need to add like a gain filter, so give me a second.
37:17: Oh.
37:19: Okay, try talking now.
37:20: Hello, I've been talking, meow, meow, meow, meow.
37:24: This is better.
37:27: But it seems to be like going about the same level.
37:31: Defhammer to Cyro.
37:34: Someone wanting, actually, do you want to read that one?
37:36: Yeah, I'll read this one.
37:39: Defhammer asks, to Cyro, as someone wanting to experiment with VR and Linux,
37:44: what are some pitfalls you ran into when trying to get VR to work with Resonite?
37:48: So, one of the... I actually didn't have a lot of problems making it work with Resonite.
37:58: So, with VR and Linux, a lot of the niceties that have kind of come of it
38:03: are still kind of being iterated on quite heavily.
38:09: And so, they're kind of still in the more bleeding edge distros.
38:16: Particularly, Arch is reaping a lot of the benefits right now, Arch-based distros,
38:22: especially due to the fact that they have the AUR and stuff, which is the user repository.
38:27: So, in terms of some of the difficulties, I guess one of the difficulties I experienced
38:34: recently was that the program that you use to get up and running, it's called Envision.
38:42: And what it does is it compiles a profile for you and builds Monado and Open Composite and stuff.
38:50: And Monado is an open source OpenXR runtime,
38:55: and Open Composite translates OpenVR calls into OpenXR calls, which is why I can use it with Resonite.
39:02: The problem with Envision right now is that it doesn't set your GPU to VR mode.
39:10: So I was experiencing a little bit of weird jitteriness in my headset,
39:16: but there's actually a website you can go to. I don't remember the link, but if you look it up
39:24: on Google or whatever search engine you want to use, you should be able to find that website.
39:30: And they actually have a couple scripts there to set VR mode on or off, which helps a lot.
39:39: It makes it not like jittery anymore. It works pretty good.
39:47: Oh yeah, another one was the eye tracking, because Steam Link doesn't really work on Linux.
39:57: The eye tracking provided by WiVRn, which is the open source streamer I mentioned earlier,
40:04: is only provided through OpenXR APIs. So I spent like a weekend or so trying to...
40:13: like I wrote... I basically wrote a driver for it, essentially. It just takes the OpenXR data
40:19: and transforms it into Steam Link formatted data, which then you can pipe into Resonite.
40:26: And if you... we should make an option to turn on the Steam Link driver, like force it on.
40:34: Oh yeah, there's some issues with that.
40:37: Yeah, but anyways, those were a couple of my difficulties with getting it to work,
40:44: is like I had to write my own proxy for the face tracking, but it works good,
40:50: and I'll probably put it on GitHub at some point once I clean it up a little bit.
40:54: I was about to ask, it feels like we've got to share it with the community a bit,
40:58: like it would help a lot of people as well.
41:00: Oh yeah, I will, definitely.
41:04: I hope that answers your question.
41:07: Thank you for answering that one, I was going to redirect it to you anyway.
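(Conceptually, the proxy described above is a remapping loop; the sketch below is entirely hypothetical, since neither the real parameter names nor the Steam Link wire format are shown here.)

```csharp
// Hypothetical sketch of a face tracking proxy: read expression weights from one
// source, rename them to what the receiving side expects, and forward the result.
using System.Collections.Generic;

static class FaceTrackingProxy
{
    // Mapping from source expression names to the names the receiver understands.
    static readonly Dictionary<string, string> Rename = new()
    {
        ["jaw_open"]        = "JawOpen",
        ["eye_closed_left"] = "EyeClosedLeft",
    };

    public static Dictionary<string, float> Translate(
        IReadOnlyDictionary<string, float> sourceWeights)
    {
        var output = new Dictionary<string, float>();
        foreach (var (name, weight) in sourceWeights)
        {
            if (Rename.TryGetValue(name, out var mappedName))
                output[mappedName] = weight;   // forward under the expected name
        }
        return output;
    }
}
```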
41:11: Oh yes, why is it that, so Dusty Sprinkles asks,
41:16: oh yeah, why is it that reverb zones seem to only affect non-spatialized audio?
41:21: And I thought this was kind of weird too, but a lot of these,
41:26: so a lot of reverb effects that you apply in like, you know, digital audio workstations,
41:31: or like, you know, by using like VSTs or, you know, whatever you want to use for reverb,
41:37: they're not like physically based, they don't take into account like the environment around you.
41:44: And so if you were to apply a reverb to spatialized audio, like let's say there's,
41:50: let's say there's something making noise right here, and you apply that to,
41:53: to the audio coming out of it, like the end result of like the spatialization,
41:58: it's going to mess up the spatialization, because the spatialization is a bit more
42:04: complex than I think most people realize, because it has to calculate the exact
42:09: difference between how the audio sounds in your left ear versus your right ear,
42:16: so that you can accurately determine where something is in the scene.
42:21: And if you apply a reverb on top of that, it's going to mess that up.
42:26: Um, in some cases, depending on like the effect, it might be fine.
42:32: Um, but it would probably mess it up.
42:38: Like one of the things like it does, like, uh, that's part of like the binaural audio
42:42: is the HRTF, which is, like, the head-related transfer function.
42:45: And pretty much what it is, like it, it subtly modifies the frequencies of the incoming sound
42:51: for each ear.
42:53: Because if you like, think about it, like when, when there's something, you know, on
42:56: the left and it comes, you know, into this ear, then like, you know, it's kind of having
43:00: a direct path, but for this ear, it actually has to go around and through the skull.
43:05: And that makes it sound subtly different.
43:07: Or when it comes in from the top, it's kind of like, you know, you know, coming from the
43:11: top.
43:11: So it's not going directly into the ear, but it's going, you know, through your skull and
43:15: through the shape of your ear and our brains, they're like very finely tuned to pick up
43:19: on these subtle frequency differences to determine which direction the sound is coming from.
43:25: It's also like why you can, you know, you can tell this sound is coming from the top
43:29: or from the bottom, because if it's coming from the bottom, that's, you know, it's going
43:32: through different parts of your head and your ears also shaped differently from that direction.
43:36: And all of that is, all of that, you know, our brains pick up on.
43:44: There's like other parts that don't mess with the frequency, which is like,
43:49: I forgot the term for it, I think it was, you know, like, it was like,
43:54: like the time difference. So like, like if something comes from the left,
43:57: it arrives at the left ear slightly sooner than on the right ear.
44:00: And it's another thing that our ears pick up on.
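(For a concrete feel of that time-difference cue, usually called the interaural time difference, a common spherical-head approximation is ITD = (a / c)(θ + sin θ), where a is the head radius, c the speed of sound, and θ the source angle from straight ahead. A quick sketch:)

```csharp
// Interaural time difference under a simple spherical-head model (Woodworth's
// approximation). The constants are ballpark values, not measurements.
using System;

static class Binaural
{
    const float HeadRadius = 0.0875f;   // meters
    const float SpeedOfSound = 343f;    // meters per second

    // angleRadians = 0 is straight ahead, PI / 2 is directly to one side.
    public static float InterauralTimeDifference(float angleRadians)
        => HeadRadius / SpeedOfSound * (angleRadians + MathF.Sin(angleRadians));
}

// A source directly to the left (angle = PI / 2) gives roughly 0.00065 seconds,
// i.e. about two thirds of a millisecond, the kind of tiny delay the brain uses.
```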
44:03: So that's kind of, you know, the core of sound spatialization, and, like, the reverb
44:07: can kind of mess with that. So I think, like, by default, you know, Steam Audio,
44:11: which we use for spatialization, it just doesn't work with the reverbs.
44:16: But also, usually the audio spatialization libraries, they have their own reverb systems
44:23: and those systems are more physically modelled. Which means like they're actually modelled,
44:28: you know, if a sound bounces, you know, off this carpet, the carpet modifies it,
44:33: and it bounces into the ear. So on top of having, you know, the initial kind of like
44:40: sound arriving to your ear, you have like, you know, we have primary and secondary and tertiary
44:45: reflections that kind of happen in the world. And that also actually gives you some idea,
44:50: you know, about the shape of the room you're in. It gives you some idea, you know, about the
44:53: structure of it, because like, if you're in a room with a lot of like, you know, soft padding,
44:57: that sounds very different from a room that has, you know, stone walls.
45:02: And the spatialization, like with Steam Audio, for example, you can geometrically model the scene
45:08: and say, you know, these surfaces have these, you know, properties for sound, and it's going
45:13: to simulate that. It's actually, you know, the sort of like, sort of like, it's almost like,
45:19: what's the term for it, not raycasting,
45:26: it's like path tracing, but instead of, you know, visuals, it's for audio, and it's kind of, like,
45:31: you know, what gives you the spatialization. So, like, usually you'd want to, like, use that to have it
45:34: more physically modeled, so it actually matches, you know, what our brains expect.
45:40: Yeah, and that isn't to say that you, oh sorry, I wanted to add one last thing on that, if that's
45:45: okay. That isn't to say that we couldn't have it on, like, partially unspatialized audio. So, like,
45:56: if you kind of wanted ambient echo, you could have it be, like, partially unspatialized,
46:02: and then the unspatialized part, which would be, you know, global, or around the object,
46:07: could have that little bit of reverb applied to it, especially in the cases of stereo reverb,
46:13: like zita reverb. And we definitely should allow partially unspatialized audio to have reverb
46:18: applied to it once we rework the audio system. And this is like another benefit of reworking the
46:22: system, because, like, right now with Unity, I don't know how we'd do it, like, we have to
46:26: kind of mess around with it, and maybe, you know, do some shenanigans to, like, make it happen.
46:31: But when we have our own audio system, like, you know, I've said a similar thing,
46:35: like, you know, with PhotonDust, like, you know, the Thanos meme of, like,
46:39: reality is whatever I, you know, whatever I make it, or whatever I want, we just kind of do the
46:43: thing. Like, that's one of the reasons why sometimes we rework systems, is because we can
46:47: just make them work the way we want. Whereas, if you're using, you know, another system,
46:53: sometimes it gets hard to, like, make it do certain things because it's not designed for that.
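(A rough sketch of the "reverb on the unspatialized portion" idea from above; the helper delegates are stand-ins, not a real audio pipeline. The spatialized share keeps its binaural cues untouched while only the flat share carries the reverb.)

```csharp
// Hypothetical sketch: split a source by its spatialization blend, reverberate only
// the unspatialized share, and mix the two into the stereo output buffers.
using System;

static class PartialSpatialReverb
{
    public static void Mix(
        float[] dryMono,                       // this frame's source audio
        float spatialBlend,                    // 1 = fully spatialized, 0 = fully flat
        Func<float[], float[]> spatialize,     // stand-in HRTF/panning stage, interleaved stereo out
        Func<float[], float[]> reverb,         // stand-in stereo reverb (e.g. a Zita-style one)
        float[] outLeft, float[] outRight)
    {
        float[] spatialized  = spatialize(dryMono);
        float[] reverberated = reverb(dryMono);

        for (int i = 0; i < dryMono.Length; i++)
        {
            outLeft[i]  = spatialBlend * spatialized[2 * i]
                        + (1f - spatialBlend) * reverberated[2 * i];
            outRight[i] = spatialBlend * spatialized[2 * i + 1]
                        + (1f - spatialBlend) * reverberated[2 * i + 1];
        }
    }
}
```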
46:58: Next question, unfortunately, like, I don't know the context for this one. AK222 was asking will
47:03: it work, but I don't know the context. If you have, if you have another, you know, follow-up
47:08: question, make sure to include context in your message because we only see, you know, the question
47:12: itself, so we can't answer that. Next one is KayoBikaru is asking what are the plans to improve
47:20: hiccups when spawning large objects like avatars? So it's not something that's complex ones. I mean,
47:25: generally anything performance is going to be a complex question. One thing that should help a lot
47:30: just on its own is once we make the switch to .NET 9, because part of the, part of the hiccup is,
47:38: you know, well, there's like multiple parts to it, but part of it is like, you know, the processing
47:43: time, which on itself is going to be improved by .NET 9. Another part is caused by the garbage
47:47: collector because the one that's used in Unity is very old, like it's older than I am. It's like
47:54: from 1988 and it's not been designed, you know, for this kind of use, but unfortunately Unity is
48:02: stuck with it. And one of the things it does, particularly when you allocate a lot of memory,
48:06: it essentially freezes the whole process and does, like, you know, a full collection. It does try to
48:11: do it incrementally, but, like, when you have a sudden memory spike, like, it tends to, like,
48:14: do those kind of freezes. So that's contributing to some of the hiccups as well. And once we switch
48:20: to .NET 9, it has much more modern garbage collector with much better performance that
48:24: should remove part of the hiccups. The other part is also loading of the assets themselves, because
48:30: right now for textures, when textures are being loaded, we use time slicing. What that means is
48:38: the upload of the texture to the GPU, we only spend a certain amount of time per frame to do it.
48:47: So like, you know, it doesn't cause long hiccups. So like if you're uploading a big texture,
48:51: it's going to happen, you know, over several frames and the impact of it is kind of spread out.
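(The time-slicing pattern being described is roughly a per-frame budget loop; a hypothetical sketch, where 'uploadChunk' stands in for whatever call actually pushes the data to the GPU.)

```csharp
// Hypothetical sketch of time-sliced asset upload: each frame, keep uploading chunks
// until a millisecond budget is spent, then yield and resume on the next frame.
using System;
using System.Collections.Generic;
using System.Diagnostics;

static class TimeSlicedUploader
{
    const double BudgetMilliseconds = 2.0;   // how much frame time uploads may take

    public static IEnumerator<object> Upload(
        IReadOnlyList<byte[]> chunks, Action<byte[]> uploadChunk)
    {
        var timer = new Stopwatch();
        int next = 0;

        while (next < chunks.Count)
        {
            timer.Restart();
            while (next < chunks.Count && timer.Elapsed.TotalMilliseconds < BudgetMilliseconds)
                uploadChunk(chunks[next++]);

            yield return null;   // hand the rest of the frame back; continue next frame
        }
    }
}
```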
48:56: But right now it doesn't happen for meshes. When the Unity integration gets reworked,
49:02: it's a part that I would like to touch where, you know, the upload of meshes is also kind of like,
49:08: you know, time-sliced. I still have to kind of explore available APIs and how complex that is,
49:13: but that could also help. And if that, you know, doesn't happen or it doesn't help us much,
49:19: once we switch to our custom rendering engine, which is going to be called Sauce,
49:25: we will have much more control over how data is uploaded to the GPU, because within Unity,
49:29: there's, like, certain kinds of limitations on that. And we actually, even for the textures,
49:33: we're doing a bit of a hack that Unity kind of makes harder to do. So, well, once we kind of
49:44: have, like, full control over the engine, like, this kind of gives us a lot more options
49:47: in regards to making things more, you know, time-sliced or asynchronous. So there's, like,
49:53: multiple things to it, but like overall, like there should be like also gradual improvements
49:57: on those. Next question is Stella Mares asking, was there a reason for swapping to .NET 9
50:02: relatively early in its release? To be kind of flippant, like, it's more like
50:08: there was not a reason not to do it. The swap was, it was literally just flipping, you know,
50:14: from .NET 8 to .NET 9 in the project build file and everything pretty much worked. There was like
50:19: one syntax like issue that I had to fix, which was one of the functions, and this was in the cloud,
50:27: where a function call was somewhat ambiguous because there was an overload. I fixed that,
50:32: it compiled and everything just kind of worked. And on the cloud, we actually
50:37: got bonus performance. Let me actually bring up the graphs. I posted them in the devlog a while
50:44: back. Well, actually the question is, can I find them quickly enough? I'm just gonna scroll through.
50:51: And it's kind of cool how, how even like the incremental upgrades still give you
50:57: performance benefits. It's literally, we haven't, we don't change anything with the code. We
51:02: literally change a number and suddenly free performance. So this one, this is our cloud
51:10: background worker, which generally has like a constant CPU load. And if you look, this is,
51:16: this is when we switch that worker from .NET 8 to .NET 9, and you see the CPU just kind of
51:23: dropped all of a sudden, which is literally free performance. They, every year they release a new
51:32: version of .NET. They have so many performance improvements. Like if you look at the blog post,
51:36: like they're ridiculously long. And like, I remember, I think it was like with .NET 7,
51:42: like the post was so long. If you loaded it on a mobile web browser, it would crash because it's
51:47: just too much. And the author of those posts, when they started doing them,
51:53: they were like, oh, we're getting so many performance
51:57: improvements early on; for the next one, we're going to do them all, and then I'll have
52:01: nothing to write about. But instead, every year they have, like, more and more and more to write
52:06: about. So like, we've been like very happy with this, but also sad because we were not able to
52:11: get those performance improvements because it requires a huge amount of work, but, you know,
52:17: thanks to everybody in the survey, like, voting for performance, we're kind of like, okay,
52:22: we're going to do it. We're going to get, you know, those big improvements.
52:28: This is, this is the API frontend one. So you can also see like the switch happened around here
52:34: and you see like this one has a little bit more variable usage because it depends like on the,
52:38: you know, how much stuff is happening, but you can see like overall it actually dropped
52:42: afterwards as well. Yeah, it's, it's, it's absolutely insane. Especially,
52:50: because, like, the headless, just the sheer, like, Grand Canyon gap between, like, the performance
53:00: of the headless and the graphical client now; the headless can run everyone,
53:06: you know, like a 40 person session, and it's at 60. The crazy thing is, like, so, like, the graphs
53:13: I just showed that was, you know, going from .NET 8 to .NET 9, which is a relatively small improvement,
53:19: it's still enough that it shows up. There's just more performance. But the one we're using right
53:24: now, Mono, you know, that's, that's like at least an order of magnitude. Like, if it
53:30: was on a graph, you know, Mono would be somewhere here and then like .NET 9 is somewhere here.
53:35: Like, there's much, much more drastic performance improvements between the two.
53:41: Like, half a decade's worth. Yeah, like, it's, it's insane. And that's one of the reasons, like,
53:46: we want to do it. It's also, like, why we did, like, you know, the headless first, so we can kind
53:49: of see how much it actually gives, because the headless, it runs, the headless runs, you know, the
53:55: majority of FrooxEngine. Like, it runs the same code it's running right now to, like, do all this,
54:00: except the, you know, rendering parts and a few extra bits, but majority of the CPU time is
54:06: spent in the same code. So it gives us a really good idea of like what kind of performance boost
54:10: we get. Next? Next one is also for you. Do you want to read this one? Yeah, I'll read this one.
54:21: Oh, this one's... Naya Love asks, Cyro, is the driver you made for using your face tracking
54:30: in Resonite available somewhere? They're asking if it's available somewhere; well, maybe they can try it next time. They
54:40: also want to ask about OpenXR support, Froox. To answer the first part, I will most likely put it on my
54:47: GitHub. I just need to clean it up and not hyper focus on funny little bits of it that don't matter,
54:57: but it will be on my GitHub at some point. So to answer, like, the second one, so OpenXR support
55:05: right now is actually a little bit difficult because the version of Unity we use doesn't really
55:10: support it and the ones that do, they actually break some of the stuff we use. So if we were to
55:17: do that, like we would have to break, you know, some content, unfortunately, which is one of the
55:21: reasons we're like, there's like many reasons, but this is one of the reasons we're moving away from
55:26: Unity to our custom rendering engine. So we actually have control of it and we can use modern
55:31: runtimes and also maintain long-term content compatibility. So it's not going to happen,
55:37: I can tell it's pretty much not going to happen with Unity, but like once we switch, you know,
55:41: to Sauce, like it might even be like, you know, out of the box kind of support, like I'm not 100%
55:48: up to date like on that part, but that one will make it, at the very least, it's going to make
55:52: it much easier to support it. And I think it's probably going to support it out of box.
55:58: There is a, there is a big, a really big asterisk technically with that. And that's technically,
56:08: if you use, like, if you're using Linux, like I'm using right now,
56:13: Monado and Open, or Open Composite, will actually translate OpenVR calls into OpenXR calls.
56:21: So I'm actually using OpenXR right now, and it's still faster than SteamVR.
56:26: Yeah. So I'm just going to try to get through the questions because we're coming up on an hour.
56:32: So I do want to, like, you know, do the PhotonDust thing, kind of, like, you know,
56:36: talk a little bit more and do a showcase. So next, we're kind of, like, you know,
56:42: we're going to do the showcase. We're going to cut it off; we're going to try to get through
56:45: the questions we have right now, then we're going to have a look at PhotonDust, and then we'll take a
56:53: lot more questions. So next, GrandUK: I feel a community volunteer could fit well for a community
56:59: manager to be the community voice in those meetings where appropriate. Also one person
57:04: can only manage so much connection; a team similar to moderation or mentors could be good on a volunteer
57:08: basis as well, to prevent a massive workload landing on just one person. What are the team's ideas?
57:13: So there's a few problems with that. One is, like, we generally kind of don't do community volunteers at
57:19: that kind of level because of, like, you know, a bunch of, like, company and legal stuff.
57:24: The other problem I have with that is like, you know, it's, it's like, it makes it one person in
57:29: general. Like we try to like avoid having like, you know, one person for this thing because that
57:33: introduces more bias into things. So if you actually have multiple people in the team,
57:39: each providing kind of different perspectives, I feel that's much healthier overall. We do similar
57:45: things, you know, with the moderation team where, like, anytime there's a
57:50: moderation ticket, multiple moderators see it and multiple actually kind of contribute to it,
57:55: even if there's, like, you know, one moderator handling, you know, that particular
57:58: ticket. And what it does is make sure, like, you know, no particular moderator, if anybody has
58:04: bias to the situation, there's multiple people kind of checking up on each other. So like, I,
58:13: I feel it's kind of better, like, you know, something that kind of comes from multiple
58:16: sources and multiple viewpoints than, like, you know, a single person.
58:26: And the mentors could like potentially help like, like one of the things mentors could potentially
58:30: do is, you know, make like, if there's like lots of common problems that people are hitting is,
58:35: you know, for example, write up GitHub issues, make like reports about it, and we can, you know,
58:41: that's like, you know, one way to kind of like help with those things. But
58:48: it kind of depends, you know, on what you imagine the interaction to kind of be
58:51: like. Next question's Nyalov. Are you planning to do the IK rework right after performance update
58:58: is considered finished or is it not set in stone and stuff? It's not set in stone. The IK is still
59:04: like one of the kind of high-priority things we want to rework. It's kind of up there with stuff like UI
59:09: and so on. So we'll, we'll see, like we'll cross the bridge when we get to it. Usually when we
59:15: finish kind of big chunk of work, we kind of re-evaluate and we're like, okay, where are we
59:19: at? What's the biggest pain point right now? You know, what's the most, what is the thing that's
59:26: going to have the most impact that we can work on right now? And this can very well be, you know,
59:30: the IK. But it could also, maybe it could be like UI. There might be also like, you know, some
59:36: smaller projects because one of the, one of them that we're considering is, you know, doing stuff
59:39: like the Unity SDK because that one's much quicker to do and we can like, you know, then let the
59:45: community expand and build around it. So that might kind of help, you know, but there's
59:52: nothing set in stone yet. There's like a bunch of kind of like ideas on that one and we'll, we'll see.
59:59: Next one is Dusty Sprinkles. When we have multi-process and FrooxEngine is out from Unity, would upgrading
01:00:03: Unity be easier or would there still be a risk of breaking existing content? Unfortunately, it doesn't make
01:00:08: upgrading Unity any easier. The problems we have with upgrading it that would break content are purely
01:00:13: on the rendering side and, like, what's supported. So we're very likely not going to upgrade Unity.
01:00:20: We also probably wouldn't get super many benefits from it. We're just going to, you know,
01:00:26: replace the rendering engine at that point. Next one is GrandUK. When you start the OpenXR work,
01:00:34: make sure to get Cyro to do some testing on Linux to make sure it doesn't break massively there.
01:00:39: Yeah, I think you're probably going to be like, you know, doing a bunch of like Linux testing
01:00:41: since you like daily drive it. Yeah. So next one, GrandUK, do you mind if I talk to you in
01:00:49: Resonite after a stream, talk over it before make official feature request? Probably not after the
01:00:53: stream. I generally tend to like, you know, go out of like kind of work mode and kind of like
01:00:58: just hang out and so on. So we, I like in general, like I will not want to do like much kind of work
01:01:06: talk like while I'm on Resonite because it makes it much harder to sort of like, you know, set
01:01:12: boundaries. And it's something like that's been kind of contributing to like, you know,
01:01:18: burnout issues and stuff like that. So like, I need to set a hard line there. I'm sorry. I'd
01:01:24: recommend just, you know, making the request, we look at it, you know, whenever we can and we either
01:01:28: say like, you know, yes, no, or maybe something different. As a good rule of thumb, it's best to
01:01:36: like take the issues that you're having with Resonite and mainly keep them on stuff like
01:01:41: the GitHub. Because, you know, when we come in game, we kind of just want to exist in the thing
01:01:51: like that we made. And we kind of want to just live in it like you guys. And being reminded of,
01:02:00: you know, like work all the time is kind of hard on the old psyche. It's already like difficult
01:02:05: to kind of like get like out of the work mode like a lot of the times. And the other problem
01:02:11: I used to do it a lot like when I kind of talked about this stuff like in game and I even take
01:02:16: like, I used to like, you know, take feature requests in game, and, you know, just kind of
01:02:20: write them down as people are doing them and do them. But like, as the community kept growing,
01:02:26: it became like unmanageable and like, and the problem is, you know, like, if like, if we make
01:02:32: exception for this person, then you know, then this person gets, you know, upset because like,
01:02:36: we told that person no. And it's also, it just becomes this kind of slippery slope kind of thing
01:02:43: that like, makes it hard to like set boundaries for things and makes it harder to like, you know,
01:02:49: be able to like, relax. So we generally like, you know, we want to be pretty strict about people
01:02:55: going through the official sources. With that, like, that's, we have like one hour minus three
01:03:03: minutes left. So there should be like time for like more questions, probably. But right now,
01:03:10: I want to do a bit of a showcase and talk about PhotonDust, which is our new particle system.
01:03:17: So I'm actually gonna, I'm gonna get this, this one is the one I get up to, because we're just
01:03:20: gonna be moving over there. So I'm just gonna, there we go. I'm gonna, there we go. So I'm gonna
01:03:30: switch the camera over here.
01:03:35: And I forgot my brush. Where's my brush? Oh, wait, I'm done. I have it on my tool shelf.
01:03:40: I'll grab the chat too. There we go. So, okay. And where's my streaming window?
01:03:50: Where's my streaming window? Oh, it's in the desk. There we go. There we go. Hello.
01:03:57: So just to kind of give it, you know, give a little bit of a context. PhotonDust, it's our
01:04:04: new, brand new particle system that's written in-house. It's written from scratch and it
01:04:10: replaces the legacy particle system that was a hybrid between our system and Unity, and which is
01:04:25: going to be in the main builds very soon. And you can already kind of play with it. People have been doing,
01:04:31: you know, lots of cool things with the system. I'm going to just kind of showcase you, you know,
01:04:37: how is it built. And one of the reasons we also did PhotonDust is because of the performance
01:04:44: updates, but also, you know, to kind of get more control over how the particle system works and,
01:04:52: you know, be able to, like, you know, add a lot of new features and
01:04:57: lots of tools for people to work with. And first I'm going to do a little primer on particle
01:05:03: systems, because if you think about it, particle systems, at their very core, they're, like,
01:05:08: you know, very simple. So say you have, like, a coordinate system, we're just going to do,
01:05:12: you know, a 2D one. What a particle system is, is, you know, we have like a bunch of particles.
01:05:18: Each particle can have, you know, a position, you know, so like this one has like position
01:05:23: on the X and you know, Y axis, I need to label my axis. You know, so there's like position,
01:05:32: you know, the position you might have, you know, rotation, the particle might be like, you know,
01:05:36: 2D, like it might be, you know, 3D rotation. It can also have stuff like, you know, size,
01:05:44: stuff like color, and they, you know, can have stuff like velocity. So for example,
01:05:53: it has, you know, this velocity, which means it's going to be moving this way.
01:05:57: So, you know, each particle can have a velocity.
01:06:03: And then you have, like, you know, a bunch of particles in the system, maybe there's like
01:06:05: another one there that has, like, you know, this velocity, you know, and there's like a whole bunch
01:06:15: of them, you know, and you can update the simulation based on, you know, for example, their velocities.
01:06:18: So if you advance it in time, this particle will move over here, and this will move over here,
01:06:25: and it's going to happen for every single particle in the system. You can also, you know, evolve
01:06:30: their size or color or rotation. They can, you know, change over time, or they can change based
01:06:36: on, you know, some equation. And the other part is you, when you have the simulation, the particles,
01:06:44: they're essentially just pieces of information, you know, like the position, size, I didn't add
01:06:48: the rotation, I'm just going to add the rotation here, you know, position, size, color, velocity,
01:06:54: which is used internally, and then you render it out, which means, you know, for each particle,
01:07:00: for example, you can get like, you know, a sprite that you render around this point in space,
01:07:04: or maybe a little, like, you know, 3D model or whatever you want. And that's, you know,
01:07:14: one gist, you know, of how particle systems work: you have points in space, and you're updating
01:07:20: their parameters over time. They also have something called lifetime, and essentially
01:07:31: that indicates, you know, how long does the particle live for. So like, when it starts,
01:07:36: maybe, you know, the lifetime is one second, so like, you know, it keeps moving, and each step,
01:07:42: its lifetime is dropping. So here it would be, you know, say it's like one second here,
01:07:46: and this is 0.9 seconds, and over here it's going to be 0.5 seconds, and once it runs out,
01:07:52: it's going to, you know, disappear. The lifetime is also something that you can use to drive other
01:08:00: parameters. So for example, say like you want the particle, you know, to fade in and out,
01:08:06: so what you're going to do is you're going to define, you know, some kind of function.
01:08:10: How does the color of the particle change over its lifetime? And you can essentially say,
01:08:15: at the beginning of its lifetime, like if you were, say like this is, you know, alpha,
01:08:26: when the particle starts living, this is when it dies, that's 0.1 seconds,
01:08:31: and you're going to say, you know, the particle, you make like a function that's like
01:08:34: when the particle starts living, it goes, you know, from transparent, it becomes visible,
01:08:39: then it kind of exists, and then sometime, you know, before it dies, it's going to fade out.
01:08:46: So that's, for example, one way, you know, to map its lifetime.
01:08:57: You know, to appropriately drive, like, the alpha or color. It could also be size, you know,
01:09:02: like it could, you know, start tiny, and then kind of, you know, get big,
01:09:08: and then get small again. You can also do multiple parameters. So for example,
01:09:13: so for example, you know, you can do alpha like this, so it actually fades in and then fades out.
01:09:18: But also, you could do something like size over time, where it just keeps growing.
01:09:23: So it can start, you know, at certain size, and it just keeps growing. And it dies, you know,
01:09:30: at a certain size. And you have created a mapping between the lifetime and the size of the particle.
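To make the mapping above concrete, here is a minimal C# sketch of the idea (the types and curves are illustrative, not PhotonDust's actual components): a particle is just a bundle of values, each step moves it along its velocity, and its remaining lifetime is turned into an alpha and a size.

```csharp
// Minimal particle: a bundle of values advanced every update.
using System;
using System.Numerics;

struct Particle
{
    public Vector3 Position;
    public Vector3 Velocity;
    public float StartLifetime;      // how long it is supposed to live, in seconds
    public float RemainingLifetime;  // counts down to zero
    public float Alpha;
    public float Size;
}

static class ParticleSketch
{
    // Fade in over the first 10% of life, fade out over the last 20%.
    static float AlphaOverLife(float t) =>
        Math.Clamp(Math.Min(t / 0.1f, (1f - t) / 0.2f), 0f, 1f);

    // Grow from 0.1 to 1.0 over the particle's life.
    static float SizeOverLife(float t) => 0.1f + 0.9f * t;

    public static void Step(ref Particle p, float dt)
    {
        p.Position += p.Velocity * dt;   // move along the velocity
        p.RemainingLifetime -= dt;       // tick toward death

        // 0 = just born, 1 = about to die
        float t = Math.Clamp(1f - p.RemainingLifetime / p.StartLifetime, 0f, 1f);
        p.Alpha = AlphaOverLife(t);
        p.Size = SizeOverLife(t);
    }
}
```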
01:09:35: There's lots of different ways, you know, to create mappings between these properties
01:09:40: in order to create all kinds of effect. Like if you think about it, for example,
01:09:44: like a smoke effect. So you have like, you know, you have a sprite that's a smoke,
01:09:49: you know, it's like a smoke sprite. And for each particle, it fades in when it appears,
01:09:56: so it doesn't just pop into existence, it stays, and then it fades out, and you also
01:10:01: combine it with a size, so it starts small, and it kind of expands. And that way, you know,
01:10:06: you can combine two effects to make, you know, like a smoke effect or some different ones.
01:10:12: You can also do, you know, lots of other effects. For example, you can have, you know,
01:10:17: can apply forces to them, like turbulent force, so instead of the velocity of the particle just
01:10:23: being the same, you know, let me actually do another one over here, so let's say one starts
01:10:27: here, and the particle starts moving this way, but maybe, you know, there's a force that, like,
01:10:34: when it moves over here, the force now moves here, which means next time it's going to be here,
01:10:40: and the force rotates this way, and next time it's going to be here, and the force rotates this way,
01:10:45: and it makes the particle follow, you know, some kind of turbulent path, because the velocity
01:10:51: itself is changing over time. So there's lots of things like that we can do, you know, with a particle
01:10:56: system. I'm going to clear this out. The way PhotonDust is designed is you have some base properties: you have position,
01:11:11: you have rotation, you have scale,
01:11:19: and you have color, and then you have also, like, lifetime, which is like how much lifetime it has
01:11:25: left out of its overall lifetime. So those are some of the kind of built-in properties that particles have,
01:11:34: and these, every single particle is always, like, going to have, some of them can be just,
01:11:39: default, but they are kind of part of PhotonDust. And there's starting versions of these.
01:11:45: When the particle is first created, when it's essentially emitted from an emitter,
01:11:50: it initializes the starting values. So, for example, the emitter can, you know, it gives the
01:11:56: particle position. So, like, if you have a sphere emitter, you know, like, that's emitting particles,
01:12:01: it can be emitting, you know, particles within its volume, it will give each particle a position
01:12:07: within that volume. And that's, you know, the job of the emitter to do that. You can have rotation,
01:12:13: you know, scale, color, they are kind of initialized. And, for example, for color,
01:12:17: you can have a module, I'm kind of, actually, I'm kind of skipping ahead a little bit,
01:12:24: or just kind of back a little bit. The way PhotonDust works is you have these basic
01:12:31: modules, and modules that can do multiple things. Some of them will initialize the starting
01:12:38: properties, some of them will calculate new properties during the particle's lifetime,
01:12:44: and some will kind of, you know, do a mix of both and do, like, more complex stuff.
01:12:49: So you have modules which initialize some of the properties. For example, you want each particle
01:12:53: to start with a unique color, you can have a module that computes a color for every particle, maybe
01:12:59: it's a mix between two different colors, or maybe it picks it from a texture, you know,
01:13:03: or maybe it does some equation to compute it, or maybe, you know, sometimes the emitter can,
01:13:09: like, you know, also contribute a color if it's like a mesh-emitter, or if it's like an emitter
01:13:13: with just some, you know, equation to compute the initial color for a particle. What PhotonDust
01:13:22: does is it has a bunch of these starting properties, but it also has sort of output
01:13:28: properties, and for those you also have position, you have rotation, you have scale or size,
01:13:40: and you have color. And there's also, like, another one that I haven't
01:13:45: really added here, which is like a frame index. You can think of it as sort of like a UV coordinate,
01:13:50: you don't have to worry about this one super much. These are like really the main ones,
01:13:55: because this is what is needed to render the particle. You're essentially gonna render
01:13:59: particle at certain position in space, you're gonna render it with certain orientation,
01:14:04: at certain size, the size is 3D by the way, and with certain color. And these then go
01:14:11: into a render module. So this goes into a renderer.
01:14:21: And the renderer is essentially responsible for taking this data and making the particle appear
01:14:28: in some way. And it can be like a billboard, you know, so it's like a billboard sprite.
01:14:33: So there's specifically a module called Billboard Renderer. There's also a mesh renderer, which
01:14:41: for each point is gonna render some kind of mesh. And there could be more in the future. One
01:14:49: that I kind of want to add at some point is one that actually computes a mesh using something
01:14:54: like marching cubes to sort of create a surface. So you can do stuff that looks like liquid.
01:15:01: And there's kind of lots of opportunities for additional renders there. But the gist of it,
01:15:07: they take these, they make sure it's rendered out. And then the PhotonDust itself,
01:15:12: it will take stuff like the emitters to make, to add new particles to the system. And then
01:15:18: you have a bunch of modules which take these and compute these. One of the important modules
01:15:26: is the position simulator. And this is a typical way particles are simulated.
01:15:33: But unlike the particle system that's in Unity, PhotonDust is very modular. So there's like,
01:15:39: you know, there's lots of different ways the initial properties can be transformed
01:15:43: into these output like render properties. But just to kind of keep things simple,
01:15:49: you have the position module. So I'm going to just be position module.
01:15:58: And the position module, what it actually has, you can notice like, you know, in the starting
01:16:03: properties, like you don't really have, you don't have the velocity. And it's because there's lots
01:16:11: of different ways to compute position. And the velocity is specific to the position simulator
01:16:17: module. So the module can actually add its own internal buffer, which is velocity.
01:16:26: That velocity can also be initialized from starting properties. You can have like module
01:16:33: that, you know, for example, assigns a speed that's, like, you know, between a minimum and maximum.
01:16:38: There's like one initial property that I omitted, called direction, or sort of like an initial
01:16:44: vector. So the position module, when a new particle is added, it will take the direction
01:16:54: to compute initial velocity, and that will then get multiplied by any initializer module
01:17:00: to kind of like, you know, compute the initial velocity. What the position module will then do
01:17:04: that every single update, it will essentially compute a new position for the particle,
01:17:11: and it makes the particle move around. And you can have, you know, another module like, you know,
01:17:17: color over lifetime, which is going to change the color over the lifetime. And what this module does,
01:17:24: it will take the starting color, and it'll do something with it, like it's going to multiply
01:17:30: with some value, and then it, you know, computes the output color.
01:17:36: And it can do whatever math it wants in here. You can also chain those modules, so like if you have
01:17:42: multiple modules, so you can have like, you know, one, and there's like, you know, another one that
01:17:46: like simulates color in a different way, or maybe just, you know, just the alpha channel.
01:17:51: And if you have multiple of them, it will take whatever is computed,
01:17:56: do whatever math, and then, you know, assign a color. It's sort of like a chain.
01:18:04: Which is like why, for PhotonDust, the order of modules matters. The same also happens for
01:18:10: initializers. If you have something that computes the starting color, it's gonna, if you have
01:18:18: multiple of them, they're all gonna contribute, and their contributions are multiplied with each
01:18:22: other. So this is the kind of general gist, you know, of how PhotonDust works, is we have
01:18:28: some starting properties, we have some sort of like lifetime kind of properties, like,
01:18:33: like, for example, lifetime, I kind of simplified this one a bit, because there is actually,
01:18:39: there's a starting lifetime, which is just, you know, how long the particle is supposed to live,
01:18:43: and then there's, like, its current lifetime, which is, which is how much it has left to live.
01:18:55: And some of these properties change during the lifetime, some of these are specifically
01:18:58: starting ones, and the ones that end up changing are like, you know, these.
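As a rough illustration of the pipeline described above, here is a hedged C# sketch of the module idea. All of the names are made up for the example, not PhotonDust's real components, but the flow is the same one described in the stream: initializer modules run once on emission and multiply their contributions together, simulation modules run every update in order, and the result becomes the output (render) properties.

```csharp
// Illustrative module pipeline (made-up names, not PhotonDust's API).
using System;
using System.Collections.Generic;
using System.Numerics;

class ParticleData
{
    public Vector3 StartPosition, Position;
    public Vector4 StartColor = Vector4.One, Color = Vector4.One;
    public float StartLifetime = 1f, RemainingLifetime = 1f;
}

interface IInitModule { void Initialize(ParticleData p, Random rng); }
interface ISimModule  { void Simulate(ParticleData p, float dt); }

// Initializer: picks a random color between two values; its contribution is
// multiplied with whatever the emitter already assigned.
class ColorRangeInit : IInitModule
{
    public Vector4 Min = Vector4.Zero, Max = Vector4.One;
    public void Initialize(ParticleData p, Random rng) =>
        p.StartColor *= Vector4.Lerp(Min, Max, (float)rng.NextDouble());
}

// Simulator: fades the output alpha over the particle's lifetime.
class FadeOverLifetime : ISimModule
{
    public void Simulate(ParticleData p, float dt)
    {
        float t = Math.Clamp(1f - p.RemainingLifetime / p.StartLifetime, 0f, 1f);
        p.Color *= new Vector4(1f, 1f, 1f, 1f - t);   // chained: multiplies what is already there
    }
}

class ParticleStyleSketch
{
    public List<IInitModule> Initializers = new();
    public List<ISimModule> Simulators = new();   // order matters, as in PhotonDust

    public void Emit(ParticleData p, Random rng)
    {
        foreach (var m in Initializers) m.Initialize(p, rng);
    }

    public void Update(ParticleData p, float dt)
    {
        p.RemainingLifetime -= dt;
        p.Position = p.StartPosition;   // reset, then let the modules transform these...
        p.Color = p.StartColor;
        foreach (var m in Simulators) m.Simulate(p, dt);
        // ...Position and Color are now the output properties handed to a renderer.
    }
}
```

The real system has many more properties and module types, but the multiply-and-chain behaviour demonstrated later with the color initializers follows this same pattern.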
01:19:04: So this kind of should cover, this should kind of cover, like, you know, kind of the basics,
01:19:10: you have, like, the starting properties, you have some lifetime properties, modules modify these
01:19:15: in various ways, or initialize these, it computes, you know, final properties, and then those go into
01:19:21: a renderer that, like, shows the output in some way. I'm actually gonna do a showcase, and I'm
01:19:28: gonna switch this one to first person camera. Oh, and you can see my hair, give me a second.
01:19:36: Wrong one.
01:19:40: There we go. So you should have the Froox view now, and you can see Cyro over there,
01:19:48: and I'm gonna spawn a developer tool from the inventory from Resonite Essentials.
01:19:57: So this is the developer tool for anyone who's not familiar with, and you can start creating.
01:20:02: So if I go open context menu, there's create new, and you see there's the legacy particle system.
01:20:09: During testing, both systems are available, and if I spawn this one, you see, this is
01:20:16: even called like legacy particle style, which is just gonna give people the idea this is gonna go away,
01:20:20: but this old system, it's very monolithic, you know, everything like is kind of bunched up in one
01:20:27: place, which makes it a little bit kind of harder to work with. It's also like harder to extend,
01:20:32: the size of the particles is initialized, you know, it's always min and max, and makes it
01:20:39: harder to do lots of different methods to initialize those. I'm gonna get rid of this one,
01:20:45: and instead I will create a new particle system, so this one's gonna be PhotonDust.
01:20:51: And you see over here, you have the particle system, that's the actual handler of the
01:20:57: simulation. So this is like, you know, what's triggering all the simulation of particles.
01:21:02: But in order to simulate them, it uses a particle style. Particle style, it defines, you know,
01:21:08: how the particle system behaves, and how it looks. Each particle style is going to have
01:21:15: specific renderer, and you can see it actually says Billboard Particle Renderer, which is
01:21:22: what I talked about. Oh, why can't I... Oh, my controller was weird, sorry.
01:21:31: Which is essentially what I was talking about here.
01:21:34: And that's responsible for rendering these particles, you know, in the world.
01:21:38: There's also a number of modules, and initialize it with a few basic modules,
01:21:42: so, you know, just to give it some initial behavior.
01:21:47: And you can see, you know, there's a few of them. And the modules are the ones that actually drive the
01:21:52: system. You see, there's the Position Simulator module. If I actually make it go away, you see
01:21:58: now it's emitting particles, but they're not moving around, because there's nothing that
01:22:02: would be moving them, so they only stay at their starting position. So I'm just gonna undo that.
01:22:09: Then you see this suddenly burst out. There's also a bunch of stuff that's been kind of organized
01:22:17: here. So all of these modules have been put under these slots. It's not necessary to put them
01:22:24: there. I made it this way, just so it's kind of easier so you don't have like, you know, a million
01:22:28: things in one place. So I can be, you know, this is the emitter, this is where the particles come from.
01:22:33: And if I move it around, it changes where the particles come from. I can change the rate of
01:22:38: emission. I can be like, you know, add more there, add a thousand, you know, and I have a bunch of
01:22:44: particles here. I can change the color for this one, for example. This is like a new thing
01:22:53: somebody mentioned. I just made it part of this emitter. And when I'm changing this,
01:22:58: this is actually the emitter itself, which is giving the particle its starting color. But you see
01:23:05: it's still being kind of combined because the particles, when they were white,
01:23:10: you see they go from white to black. And the reason they do that is because of initializers.
01:23:17: They're specifically color range initializer. And what this does, it picks a random color between
01:23:24: minimum and maximum value. So if I change this one, you see they're now getting a random color
01:23:30: that's a linear interpolation between these two. And this doesn't depend on the emitter at all.
01:23:36: That's the module giving them that value. I'm going to change this one back to black.
01:23:43: It has been combined. Not all emitters have this. The point emitter has a single color.
01:23:50: So you assign it, it multiplies the color, which means they still go from fully bright to black
01:23:58: because that color by the initializer is being combined with the emitter color.
01:24:04: I can also add other emitters, so I'm going to actually get rid of this one. And you can see
01:24:08: it stops emitting because now there's nothing that's introducing new particles into the system.
01:24:15: If I go under rendering, particle system, emitters, there's a whole number of them.
01:24:23: And I'm going to pick box emitter because then it has a bunch more color options.
01:24:28: So for each emitter, you need to assign it the system that it's adding particles to.
01:24:35: So I'm going to open this one next to this, and I'm going to take a reference to this,
01:24:43: and drop it here. And you see, now it starts emitting new particles.
01:24:48: And I'm going to increase it a bit. I'm going to do 1000. And you see now, instead of emitting from
01:24:53: a single point, the particles are actually emitting from a box volume.
01:24:59: It's a little bit harder to see, so I want to slow them down.
01:25:02: So I'm going to go back to the initializers, and there's a speed range initializer.
01:25:08: So I'm just going to drop it to lower values so they don't go too fast.
01:25:14: And you can better see the box shape of them.
01:25:18: But you can also see, if I go back to the emitter, this one has its own method to give them initial colors.
01:25:28: So I can for example color each vertex of the emitter.
01:25:32: So I can make this one red, and I'll make this one blue, and this one green.
01:25:41: I think that's not quite enough particles, I want a bit more.
01:25:47: On the particle system we also have a limit, you see we're actually reaching how many particles this system can have.
01:25:54: So let's just add an extra 0 there.
01:25:57: There we go, we can get more particles.
01:26:01: And let's see how many we have.
01:26:04: We have about 3000, we can bump it up a bit more.
01:26:08: Let's do like 5000.
01:26:12: There we go.
01:26:15: And now it's going.
01:26:17: Now it's going.
01:26:22: I might actually have set them to live a little bit too long, so what I'll do is go back to the initializers and this lifetime range initializer.
01:26:31: And I'll just set it so they live 1 second, because right now they're living between 1 and 5 seconds.
01:26:37: So now they don't live too long.
01:26:40: And let's see if we can get a kind of box shape.
01:26:43: And it's kind of volumetric, so if I go inside, it's full of particles.
01:26:49: Each of the emitters have a bunch of properties, so you can control how they move.
01:26:54: Like for example, I just want to emit from a shell.
01:26:56: So now instead of being emitted from the entire volume, it's sort of like a shell of a cube.
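As a sketch of what the emitter's job is here, assuming a hypothetical box emitter (not the actual FrooxEngine component), sampling start positions from the whole volume versus only the shell might look like this:

```csharp
// Illustrative box emitter sampling (not a real PhotonDust component).
using System;
using System.Numerics;

static class BoxEmitterSketch
{
    static readonly Random Rng = new();

    static float Rand(float min, float max) => min + (float)Rng.NextDouble() * (max - min);

    // Uniform point anywhere inside an axis-aligned box with the given half-extents.
    public static Vector3 SampleVolume(Vector3 half) =>
        new(Rand(-half.X, half.X), Rand(-half.Y, half.Y), Rand(-half.Z, half.Z));

    // Point on the surface only: pick one of the six faces, then sample within it.
    // (A real emitter would weight the face choice by area; uniform picking keeps the sketch short.)
    public static Vector3 SampleShell(Vector3 half)
    {
        var p = SampleVolume(half);
        int axis = Rng.Next(3);                     // which pair of opposite faces
        float sign = Rng.Next(2) == 0 ? -1f : 1f;   // which of the two
        if (axis == 0) p.X = sign * half.X;
        else if (axis == 1) p.Y = sign * half.Y;
        else p.Z = sign * half.Z;
        return p;
    }
}
```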
01:27:03: And they're still kind of being colored.
01:27:07: So what you have in this case is you have the emitter, and you have multiple of these,
01:27:13: which is adding new particles to the system.
01:27:15: The emitter can also provide some of the starting properties,
01:27:19: like the color in this case, and the direction.
01:27:24: And then you have initializers, which modify the starting properties.
01:27:28: For example, the color range initializer takes the color from the emitter,
01:27:31: and multiplies it with an additional color, the size is initialized,
01:27:36: the particles are different sizes, the speed is initialized,
01:27:39: lifetime is initialized to something.
01:27:42: And then you have simulators.
01:27:44: And the simulators, they are what is responsible for updating the particles over their lifetime.
01:27:51: And right now there's only the position simulator module.
01:27:53: That's what makes the particles move, because each update it computes
01:27:58: their new position based on their velocity.
01:28:02: It has a few properties on itself.
01:28:04: It can, for example, do collisions.
01:28:06: So if I enable that, you can actually see an interesting effect here,
01:28:09: because now all of the particles are computing collisions.
01:28:14: And what you see is, we might need a bigger collider, like a box or something.
01:28:22: What you see, though, is it's running at lower frame rate,
01:28:27: but it's actually the particle system itself is running at lower frame rate than I am.
01:28:32: So if I switch this here, you see I can still move at a faster rate
01:28:39: than the particle system is updating on.
01:28:42: And this is one of the big improvements that PhotonDust has
01:28:46: over the previous particle system, is it's asynchronous.
01:28:48: If you make the simulation too heavy, instead of this being your frame rate,
01:29:08: instead of us running at the frame rate of this particle system,
01:29:11: it's only the particle system that runs at a slower rate.
01:29:15: All of the simulation, it happens on multiple cores,
01:29:18: so this is going to be multi-threaded, which the previous system was mostly as well,
01:29:26: but it's also specifically asynchronous.
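Here is a hedged sketch of that asynchronous idea, with made-up names rather than FrooxEngine's actual scheduling: the simulation runs on a background task at whatever rate it manages, while rendering keeps reusing the last finished results every frame.

```csharp
// Decoupling sketch: simulation on a background task, rendering never waits for it.
using System;
using System.Numerics;
using System.Threading;
using System.Threading.Tasks;

class AsyncParticleSketch
{
    Task _simulation = Task.CompletedTask;
    Vector3[] _latestPositions = Array.Empty<Vector3>();  // last finished simulation output

    public void OnFrame(float dt)
    {
        // Start a new simulation step only when the previous one has finished;
        // if the simulation is slow, frames simply reuse the old positions.
        if (_simulation.IsCompleted)
            _simulation = Task.Run(() => _latestPositions = SimulateStep(dt));

        Render(_latestPositions);   // runs every frame, never blocks on the simulation
    }

    static Vector3[] SimulateStep(float dt)
    {
        Thread.Sleep(50);           // stand-in for a heavy, multi-threaded update
        return new Vector3[1000];
    }

    static void Render(Vector3[] positions) { /* draw whatever we last got */ }
}
```

If the simulated step takes longer than a frame, only the particles lag behind; the rest of the frame keeps its rate, which is the behaviour shown in the demo.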
01:29:33: Let me modify the properties a little bit.
01:29:35: So I would actually want the particles, you know, let's make them fall to the floor.
01:29:40: For that, we actually want something that's going to update the velocity.
01:29:47: So I'm going to add a gravity module.
01:29:50: So I'm going to add such a component,
01:29:53: because when you think about it, like what gravity is,
01:29:56: is it's modifying the velocity every frame.
01:29:59: Like if the particle starts moving, you know, this way,
01:30:02: let's see where did I leave my brush.
01:30:05: If the particle has some velocity, gravity is a force that's modifying that velocity.
01:30:13: It's essentially acceleration that's pulling it down.
01:30:17: So as the particle, you know, it moves over here
01:30:19: and the direction moves a little bit downwards.
01:30:22: And then like it moves over here and direction moves more downwards
01:30:25: and then moves over here and direction more downwards
01:30:29: and then moves here and directions now it's almost aligned.
01:30:32: Essentially that ends up like, you know, falling.
01:30:36: It's because like there's a force that's being applied to the velocity
01:30:39: that's, you know, making it move downwards.
01:30:41: And it's also accelerating because it's an additive force.
01:30:44: So every update, you know, the velocity is higher and higher and higher,
01:30:50: which means it moves faster and faster.
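Conceptually, the gravity module plus the position simulator boil down to something like this (an illustrative sketch, not the real modules):

```csharp
using System.Numerics;

static class GravitySketch
{
    static readonly Vector3 Gravity = new(0f, -9.81f, 0f);

    // Gravity module: accumulate acceleration into the velocity (so it keeps speeding up).
    // Position simulator module: integrate the velocity into the position.
    public static void Step(ref Vector3 position, ref Vector3 velocity, float dt)
    {
        velocity += Gravity * dt;
        position += velocity * dt;
    }
}
```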
01:30:54: So if I go to rendering, particle system, modules,
01:30:59: these are probably going to get more organized after like some time in the release.
01:31:05: But I'm just going to add a gravity force.
01:31:09: And then what I need to do is I need to open the system itself.
01:31:13: When you add a module, the module can actually,
01:31:16: each module can be shared between multiple particle systems.
01:31:19: So you need to be able to tell the particle style
01:31:24: that you want it to use this module.
01:31:26: So what I'll do, I'll just add a new empty space for it here
01:31:29: under the modules list and I'll drop it here.
01:31:34: And the particles start falling down.
01:31:38: And now, do we have a nice collider?
01:31:42: Just put it in.
01:31:44: Make it a little bit bigger.
01:31:48: I need to have them live for a bit longer because this is a little bit harder to see
01:31:51: and make them a little bit less dense.
01:31:56: So I'll do this.
01:31:57: This world is particularly bad, like, on collisions.
01:32:01: So this is just this menu.
01:32:03: And I'm gonna make them live, say for like four seconds.
01:32:11: There we go.
01:32:14: And you should be able to see they could, they should be like rolling off.
01:32:18: Well, this part is really bad for collisions.
01:32:21: I think it's because there's like a lot of colliders and stuff in here.
01:32:27: Is this, does this query the Bepu structures?
01:32:31: Yes.
01:32:33: Okay.
01:32:35: Yeah, all the collisions, this is against Bepu, because that's the physics engine we use.
01:32:39: I'm gonna make this smaller so it's a little bit easier to, so it's easier to see.
01:32:45: Let me also reset the colors so it's,
01:32:52: I should have been doing this.
01:32:54: Drop this here, drop this here, just so it's kind of nicer to see.
01:32:59: There we go.
01:33:00: And you see the kind of like, you know, colliding and flowing there.
01:33:06: And if I go back to the simulator,
01:33:11: disable the collisions, they will, you know, start falling through.
01:33:15: And you see that runs a fair bit faster, but it's, you know, not colliding anymore.
01:33:20: But that's one of the benefits is like, you know, if you had like worlds that are particularly not
01:33:24: friendly, you know, to collisions, this would like, you know, if this was the old system,
01:33:29: this would be killing our frame rate, but we can still kind of run.
01:33:33: And it's just, you know, it's only the particle system itself that degrades.
01:33:41: Lower this a little bit more, I'm just gonna do 100.
01:33:46: So if there's fewer of them, it has, like, fewer to simulate, you see,
01:33:49: like it actually starts running faster as they kind of bounce around.
01:33:56: I didn't expect this to be this bad for collisions.
01:34:01: It's actually one of the things that I kind of also expect that we're going to get like
01:34:05: a significant improvement on once we do the switch to .NET 9.
01:34:10: Bepu physics, it's specifically optimized, you know, for modern .NET
01:34:14: and it's running way slower than it can, you know, with the mono.
01:34:19: We even had to do some things to make it run reasonably fast,
01:34:23: but like we're not, we're not anywhere near tapping its full potential.
01:34:26: So that's going to be one of those things that's going to be interesting.
01:34:29: Once we make the switch and we're actually running with .NET 9,
01:34:32: I kind of expect the simulation to happen a lot faster when there's, you know,
01:34:35: lots of physics, like, you know, physics interactions, but also in general,
01:34:40: way faster code, but you can kind of see, you know, the particle system,
01:34:47: how it's kind of like, you know, doing its work.
01:34:50: If I switch back here, I can show you some more modules than I can do.
01:34:57: So one of the things, I'll change the initializer.
01:35:02: I don't want it to be like, you know, dead dark.
01:35:03: I just want it to be a little bit.
01:35:05: There we go.
01:35:07: And now they're bouncing over here.
01:35:09: There's lots of new effects that were not possible to do before with the old particle system.
01:35:14: And ones that particularly benefit from the modular nature of the system.
01:35:20: So I have added, like, you know, a module that simulates a force,
01:35:24: which essentially, which is modifying the velocity.
01:35:27: Let's add one that modifies the color.
01:35:30: So I'm going to do attach component rendering.
01:35:33: And I'm going to do modules.
01:35:38: One that's, like, my favorite because it makes cool visuals is Color HSV Over Lifetime Start End.
01:35:45: So there's lots of modules, you know, that will modify the color somehow.
01:35:50: But I particularly want this one because what it does,
01:35:54: it changes the color of the particle based on how long it has lived.
01:35:59: And it changes it in the hue saturation and value color space,
01:36:04: which means if it starts at zero hue and ends with one hue,
01:36:09: it goes through the entire color spectrum.
01:36:11: So I have added this module.
01:36:13: Now I just need to like add it to the particle style.
01:36:16: So I'm going to take this.
01:36:18: I'm going to drop it here, and whoop, you see it immediately starts coloring the particles.
01:36:26: I kind of like this effect.
01:36:29: Looks pretty like Skittles.
01:36:32: Skittles.
01:36:33: You could even do things like, you know, if I, if I set it to two,
01:36:36: then it's actually going to cycle.
01:36:39: I'm making them go all around.
01:36:41: It's actually going to, you know, go through the hue like twice.
01:36:44: So we see it starts red, goes, gets to red in the middle,
01:36:47: and then like goes through the spectrum again.
01:36:50: So like you can play, you know, with these properties.
01:36:53: You can also like, you know, just have it go through just part of the spectrum.
01:36:55: So, you know, this one just goes from red, you know, all the way to green.
01:37:01: The important part is also what this module is doing.
01:37:06: If you look, I'm gonna position this.
01:37:13: It's actually fair bit distracting.
01:37:15: So I'll move it.
01:37:16: I'll move the emitter from here.
01:37:19: Where do I move it?
01:37:22: Can we, can we make them have shadows?
01:37:24: Uh, you can with the right render.
01:37:28: I'm just gonna move it over here and have them bounce over there.
01:37:31: There we go.
01:37:34: Or maybe here.
01:37:35: Now they're going on the desk.
01:37:39: So what this module is doing, and I have a bunch of clutter here.
01:37:46: And I lost my brush.
01:37:47: Oh, there it is.
01:37:49: What the module is doing, it's taking the starting color from the
01:37:54: initializers, and then it's taking the current lifetime, seeing how long
01:38:00: the particle has lived, and it computes a new color based on that lifetime, multiplies
01:38:06: it with a starting color, and then, you know, writes it into the color to be rendered.
01:38:12: And that's kind of how it's achieving its work.
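As an illustration of what a module like that HSV-over-lifetime one conceptually does (the parameter names here are assumptions, not the real component's fields): interpolate the hue over the normalized lifetime, convert to RGB, and multiply the result into the starting color.

```csharp
// Illustrative hue-over-lifetime evaluation (not the actual PhotonDust module).
using System;

static class HsvOverLifetimeSketch
{
    // Compact HSV -> RGB (h, s, v in 0..1; an end hue of 2 simply wraps around twice,
    // which is the "cycle through the spectrum twice" behaviour from the demo).
    static float Channel(float h, float s, float v, int n)
    {
        float k = (n + h * 6f) % 6f;
        return v - v * s * Math.Max(0f, Math.Min(Math.Min(k, 4f - k), 1f));
    }

    public static (float r, float g, float b) Evaluate(
        float normalizedLife,                              // 0 = just emitted, 1 = about to die
        float startHue, float endHue, float saturation, float value,
        (float r, float g, float b) startColor)            // what the emitter/initializers produced
    {
        float h = startHue + (endHue - startHue) * normalizedLife;
        float r = Channel(h, saturation, value, 5);
        float g = Channel(h, saturation, value, 3);
        float b = Channel(h, saturation, value, 1);
        // Like chained modules in the stream, the result multiplies the starting color.
        return (r * startColor.r, g * startColor.g, b * startColor.b);
    }
}
```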
01:38:15: You can also combine multiple things.
01:38:17: I actually have a request.
01:38:18: So would you be able to like, bring me like a gradient texture, whatever gradient texture
01:38:24: you like?
01:38:25: Yes, I do have one actually.
01:38:27: Do you have a cool one?
01:38:29: I do have a cool one.
01:38:31: Let me see if I can find it here.
01:38:33: Just real quick.
01:38:38: Aha, here we go.
01:38:40: These are all the colors of the rainbow.
01:38:42: Beautiful.
01:38:43: So where's the colors of the spectrum?
01:38:46: Which is, yeah, that's pretty much right.
01:38:48: So what I can do, I'm going to switch this back into some POV.
01:38:54: What I can do here is add another module.
01:38:59: I'm going to add another simulator.
01:39:02: And if I go here, I'll add rendering particle system modules.
01:39:12: There's also color over lifetime texture.
01:39:16: And what this module does, it assigns it, instead of like, you know, the color being
01:39:22: computed mathematically, it will read it from the texture, you know, going from left to right.
01:39:28: One thing I need to do in order to use this texture, I need to make sure it is readable.
01:39:33: So I'm going to open it in inspector.
01:39:37: Let's see where the texture itself is.
01:39:40: I could have just opened from there.
01:39:43: I'm just going to open it here.
01:39:45: So I'll grab this one.
01:39:48: Get rid of this.
01:39:50: Oh, this is a procedural texture.
01:39:53: Okay.
01:39:53: I thought it was a big one.
01:39:54: So this one's already going to be readable because procedural textures are readable by default.
01:39:58: Yeah, so I can, I can literally just grab this texture and drop it here.
01:40:05: And I'm going to open, I should have probably just kept this.
01:40:07: I don't have it open over there, so I'll just grab this.
01:40:11: Let me just pull this out.
01:40:13: So I have the particle style.
01:40:16: And I'm actually going to replace the module first.
01:40:18: So I'll put this one here.
01:40:21: And you'll see it's actually, as the particles live, the colors are going, you know, through
01:40:28: the values in this texture.
01:40:31: So you can, you can add like, you know, whatever you want there.
01:40:33: Like I could literally even do this, you know, this is probably going to be weird, but I
01:40:38: could take a picture once it loads and drop that in there.
01:40:44: And it's going to, you know, it's going to use those values to color those particles.
01:40:50: I do need to make sure it's readable, so I need to check readable.
01:40:55: I need to check uncompressed.
01:40:58: There we go.
01:40:59: Now I can drop this here.
01:41:01: So I'm not going to use like a strip of it, so it's going to be a bit weird, but you can
01:41:03: kind of see, you know, it's getting the colors from this texture.
01:41:09: Oh yeah, it's just going along the top.
01:41:11: It's going along the top.
01:41:12: I do want to add a module that like lets you kind of like, you know, randomize like which
01:41:15: part of this also kind of comes through.
01:41:17: So there's going to be more options for that, but you can see kind of an action, but I'm
01:41:23: going to drop this back in and actually have like another question.
01:41:29: Can you find a texture that's also like a gradient, which is a color gradient, but it
01:41:39: doesn't go through all the colors.
01:41:40: There's only a few colors.
01:41:41: Find me that one.
01:41:43: And I'll show you like another effect, which comes in with combining the modules.
01:41:49: So this module, each one essentially is like, you know, doing something to the colors.
01:41:53: If I take this one and I also add it, you can see it doesn't quite like look like much
01:42:03: because now the colors are getting multiplied with each other.
01:42:05: So what I'll do here is instead of changing the hue, I want to keep the hue the same.
01:42:15: So now it's actually starting like, you know, it's literally just the red hue for the whole
01:42:18: lifetime.
01:42:20: And it gets multiplied with the hue on this one.
01:42:22: And because it's red, any other color gets masked out.
01:42:28: But what I can do is I can also remove the saturation because I don't want it to be,
01:42:35: kind of, saturated at all.
01:42:37: And now I can change the value.
01:42:40: So I can, for example, set the start value to be black.
01:42:43: And you see like it's kind of combining the contributions of these two to produce the
01:42:47: final color.
01:42:49: But what also matters is, you know, which order these modules are in.
01:42:53: So if I swap them...
01:42:56: Well, actually for this one, it's not going to matter because the multiplication is always
01:43:00: the same.
01:43:02: In some cases, it will matter, because depending on the order of operations, it's going to
01:43:09: produce different results, but in this case it doesn't because it's a multiplication.
01:43:14: But you can kind of, you know, you can stack multiple different modules to produce behaviors.
01:43:19: As I'm checking on the time, I'm actually going to speed this up a little bit and, like,
01:43:23: skip over the other texture stuff because there's a few more things I
01:43:26: wanted to showcase.
01:43:27: Um, one of them is, you know, some of that kind of turbulence stuff because there's
01:43:32: other really cool effects.
01:43:33: And I see we have a few questions as well, so I'll try to get those in a sec.
01:43:38: Um, let's switch this back to the camera and I'm going to bring the emitter over here.
01:43:50: And I will remove one of these systems.
01:43:54: I think I've closed it.
01:43:54: I always close things without thinking about it.
01:43:57: So get rid of this one.
01:44:00: I will change this.
01:44:05: Simulators.
01:44:06: I'll change this one to have the full saturation.
01:44:12: I can follow you.
01:44:14: No, it's okay.
01:44:15: Like, I'm gonna skip that part right now because we're kind of running short on time.
01:44:21: And also I'll get rid of the collisions.
01:44:27: Then I will lower the gravity.
01:44:30: I don't want it to be that strong.
01:44:33: There we go.
01:44:35: So one of the cool effects.
01:44:37: No, they cannot go in like this.
01:44:39: One of the cool effects also, there's a new turbulent force.
01:44:45: So you have the gravity force, but we can also add, under rendering, particle system
01:44:52: modules, there's a turbulent force.
01:44:57: Oh, actually it's called, I think it's called simplex turbulent force.
01:45:00: There we go.
01:45:01: So I add that one.
01:45:04: I will add this module simplex turbulent force.
01:45:09: And you see it kind of starts, the particles start kind of like, you know, flowing a bit
01:45:12: different and you can like, you know, mess with these settings.
01:45:16: You know, there's like different ways for it to work.
01:45:18: So like you can, for example, have it like, you know, alter direction.
01:45:21: And you see now the particles are going, you know, doing this thing and I can, maybe I
01:45:26: can make it stronger.
01:45:30: And maybe I want it to be like, you know, do like this kind of thing.
01:45:37: And maybe increase the scale so like it's a little bit more, it's a little bit more
01:45:43: kind of varied and this one actually might be too strong for this.
01:45:51: So you see now they're kind of like doing like this kind of more complex behavior.
01:45:54: I'm not gonna like mess with it too much because like there's like, you can literally
01:45:57: spend hours like, you know, just messing around with this, you know, producing all kinds of
01:46:01: cool effects.
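Conceptually, a turbulence module is just a smooth, position-dependent field added to the velocity each update. The sketch below uses a few sines as a stand-in for the simplex noise PhotonDust uses; it is an illustration of the idea, not the real module.

```csharp
// Illustrative turbulence: bend the velocity by a smooth field sampled at the position.
using System;
using System.Numerics;

static class TurbulenceSketch
{
    // Cheap stand-in for simplex noise: smooth, position-dependent, roughly zero-mean.
    static Vector3 NoiseField(Vector3 p, float scale) => new(
        MathF.Sin(p.Y * scale) * MathF.Cos(p.Z * scale),
        MathF.Sin(p.Z * scale) * MathF.Cos(p.X * scale),
        MathF.Sin(p.X * scale) * MathF.Cos(p.Y * scale));

    public static void ApplyTurbulence(ref Vector3 position, ref Vector3 velocity,
                                       float strength, float scale, float dt)
    {
        velocity += NoiseField(position, scale) * strength * dt;  // keep bending the velocity
        position += velocity * dt;                                // position simulator step
    }
}
```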
01:46:03: But you can see the particles, like, they can do a lot of kind of cool stuff.
01:46:06: There's like one more thing I'm gonna actually show you because I want to give a little bit
01:46:10: more kind of background on how it works.
01:46:14: You know, like each particle style, it has its own renderer, but there's
01:46:20: modules which are also their own renderers and add, like, additional rendering to the particle
01:46:26: system.
01:46:27: One of those is the Trails module.
01:46:30: So if I go back to Smooth POV, I'm gonna add, I'm just gonna add it to the root.
01:46:38: Rendering, Particle System, Modules, there's the Trail.
01:46:47: Trails module, where is it?
01:46:50: Trails, Particle, oh it's Particle Trails module.
01:46:53: There's Particle Lights module, Ribbons, Trails.
01:46:55: I'm gonna show you the Trails one.
01:46:57: So there's the Trails module, which has also a bunch of stuff.
01:47:00: I'm gonna, I'm gonna just give it the same material for the time being.
01:47:05: There's lots of options to play with and we don't really have time till I go through all
01:47:08: these, but I'll assign it here.
01:47:12: And now you see the particles actually start drawing trails, you know, which is like,
01:47:19: looks really neat.
01:47:22: And if I now start like, you know, messing with like stuff like the Forces,
01:47:27: I'm gonna go Simulators.
01:47:32: We don't make them, you know, stronger.
01:47:35: Oh, this is weird, this is just turned into a clump.
01:47:40: I'll change the scaling on it, on the turbulent force.
01:47:44: There we go, you see now it's doing all these cool things.
01:47:50: If I change the Force Offset, you know, it's gonna do like...
01:47:55: Oh, that is weird.
01:47:59: You need to mess around with it a fair bit.
01:48:04: I might need to like lower the Strength, so they're not like too constrained.
01:48:17: Yeah, I'm kind of getting into the... there we go, this one's supposed to go this way,
01:48:22: and I'll change this one.
01:48:26: There's lots of different things you can kind of, you know, do, like you can just mess around
01:48:29: with it and do all kinds of effects, but what I wanted to show you with this one, there's also
01:48:34: like, you know, modules where...
01:48:40: There's modules where...
01:48:42: Actually, sorry, can you move them a little bit somewhere, so they're not in the way, I'm
01:48:47: just going to clean up.
01:48:49: I think I kind of showcased this enough, there's like, I underestimated how long this is going
01:48:54: to take, so I do want to get like, you know, to some questions.
01:48:58: Okay, they're, they're, they are over there now, that's, that's fine.
01:49:02: Oh, they're going a little bit over here still, but it's fine, it's fine.
01:49:06: So there's also like, modules, you know, which kind of act after the main modules do.
01:49:12: So like, there's a trails module, trails, and what this module does, it will, this is
01:49:22: something called internally a follower module, and there's a few of these, and what it does,
01:49:27: for each particle, the particles can be followed by additional effects, in this case, a trail.
01:49:36: And what it will do, it essentially will look at the final simulated position,
01:49:39: and has its own internal data, you know, there's like the trail data.
01:49:43: I'm not gonna, like, go super into details on this one, because it's also like a rabbit hole.
01:49:48: But it's a trail data, it'll take, you know, the color, it'll take the position,
01:49:52: it'll combine it to compute its own data, and then this one has its own sort of built-in
01:49:57: render.
01:49:59: There's also the particle lights module, so it's gonna, you know,
01:50:03: some particles are gonna have like actual light in the scene, there's the ribbons module,
01:50:07: which is similar to trails, but instead of the trail kind of following the part of the particle,
01:50:12: it kind of goes through all the particles, and each of the modules also has its own additional
01:50:17: modules. For trails, there's also initializer modules, you know, which are gonna initialize
01:50:22: some of the properties, there can be simulation modules, which can also modify the trails,
01:50:27: and over the lifetime, there's not really many, but there's gonna be lots that will be added
01:50:30: over time. But overall, this is the general structure of the particle system. You start
01:50:36: with some, you know, built-in properties, there's emitters, which add new particles,
01:50:41: they can initialize some of these, and there's modules, some of the modules are gonna initialize,
01:50:46: you know, starting properties, some of them compute the properties, you know, every frame,
01:50:52: either using the other properties, or using whatever internal math, or, you know, whatever
01:50:56: structure, and then those then get computed to the final render properties, which are then rendered
01:51:01: out, you know, as individual points or meshes. And some of the modules, you know, they can follow
01:51:07: the particles, they can do some extra stuff, and they have their own renders as well. And this
01:51:13: kind of, you know, how the particle systems kind of put together, and one of the reasons is it makes
01:51:17: things, it makes everything a little more modular, because if we want to change how color is computed,
01:51:23: or how position is computed, we can just swap out a module. The reason I did this is I was actually inspired by the
01:51:30: Minecraft particle system, which was shown to me by, like, one of my friends, like, a while back,
01:51:36: Thorne, he showed me the Minecraft particle system, which has a very interesting approach
01:51:40: where you can define, instead of, you know, simulating velocity of the particle, you actually
01:51:47: define what the particle is based on the starting properties, and that's something I want to add
01:51:52: to PhotonDust at some point, so instead of, you know, velocity-based simulation, you define an
01:51:57: equation that's, for example, based on position, rotation, direction, and lifetime, and you just say,
01:52:02: based on these, this is the position of the particle, and it's going to open up, you know,
01:52:07: lots of kind of cool effects where you can sort of procedurally define, you know, how the particles
01:52:11: move around, and all you need to do is just swap out this position simulation module for the
01:52:17: equation-based simulation module. So the system is designed to be, like, you know, very flexible, very expandable,
01:52:23: you know, some of these modules, like, for example, the turbulent force, it literally, like, it took,
01:52:29: like, very little time, it was less than an hour to write, and everything that was, like, with the
01:52:34: simplex function, like, it was something like five minutes. It's, it's very, it's very easy to
01:52:39: write lots more modules, which means once the system is out, like, we can add lots of cool effects.
01:52:44: If you have, like, you know, specific way you want to, like, the particles to behave,
01:52:48: you can make requests, and now we can, we'll be able to fulfill those requests
01:52:52: much faster and much easier than we would have been with the old system.
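For the equation-driven idea mentioned above, a possible shape of such a module (purely hypothetical, since it is described as a future addition) would be a pure function from the starting values and the particle's age to a position, for example:

```csharp
// Hypothetical "position from an equation" module, in the spirit of the Minecraft-style
// approach described above; the spiral equation is just an arbitrary example.
using System;
using System.Numerics;

static class ProceduralPositionSketch
{
    public static Vector3 Evaluate(Vector3 startPosition, Vector3 startDirection, float age)
    {
        float angle = age * 4f;       // how fast it circles
        float radius = age * 0.5f;    // spiral outward as the particle ages
        var offset = new Vector3(MathF.Cos(angle), 0f, MathF.Sin(angle)) * radius;
        return startPosition + startDirection * age + offset;
    }
}
```

Because the position is a pure function of the starting properties and the age, nothing needs to be integrated frame by frame, which is what would let such a module simply replace the velocity-based position simulator.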
01:52:57: With this, we have, there's seven minutes left, so we might go a little bit over, maybe,
01:53:02: I do want to answer, like, some of these questions, so let's have a look.
01:53:09: Let's see, I'm gonna move this over here.
01:53:14: So, ShadowX, for if there's more question time after the demo:
01:53:17: if anything, will we get more data feed integration with things like component and node browsers?
01:53:21: Component and node searching without mods would be great.
01:53:24: That is something that's planned.
01:53:25: This isn't gonna happen before performance update, I would say sometime after performance
01:53:29: update, but we don't have a specific timeline on that.
01:53:32: Next, ShadowX: is lifetime encoded as remaining lifetime that decreases, or as total lifetime
01:53:37: and a start creation time?
01:53:39: I suppose the latter would make it more difficult to do things like lifetime loss on bounce.
01:53:44: So the lifetime is defined, actually, it is defined as the remaining lifetime.
01:53:49: So you have the starting lifetime, you have the remaining lifetime, but also the engine
01:53:53: does something whenever the lifetime updates, it computes a normalized lifetime.
01:53:57: And the reason that it is done is because it's used for lots of effects, which don't
01:54:03: care how much time the particle lives, but how far along it is in its actual lifetime progression.
01:54:12: And computing that in every module can be expensive, because it essentially requires
01:54:16: division, which is a simple operation, but it can be expensive, especially if you do
01:54:20: it a lot, so the system sort of pre-computes it.
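A small sketch of that precomputation (illustrative field names, not the actual engine code):

```csharp
// Whenever the remaining lifetime changes, store the normalized 0..1 progress once,
// so every module reads a field instead of doing its own division per particle per update.
struct LifetimeState
{
    public float StartLifetime;       // total time the particle is meant to live
    public float RemainingLifetime;   // counts down to zero
    public float NormalizedLifetime;  // 0 = just emitted, 1 = about to die (precomputed)

    public void Advance(float dt)
    {
        RemainingLifetime -= dt;
        // One division here, instead of one in every module that needs the progress.
        NormalizedLifetime = 1f - RemainingLifetime / StartLifetime;
    }
}
```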
01:54:23: Ah, she's asking fractals? Maybe you could do fractals? I'm sure somebody will figure
01:54:29: something cool like that. Nyalov is asking, I now need a meta meme that shows a particle
01:54:37: cube called bitrate or bitracer and applies an effect to simulate low bitrate. I don't know if
01:54:42: you can actually do that. A low bitrate effect can be a little bit difficult to do
01:54:45: right now, but you probably need to do either some shader stuff, or you could use the pixelate
01:54:53: distortion, but I don't think it's going to give quite the same effect. But yeah, these
01:54:58: tend to obliterate bitrate, especially if there's a lot of chaotic motion.
01:55:04: ShadowX, have you profiled Bepu performance on the headless compared to client?
01:55:08: Yes, actually Cyro ran some tests and it runs really fast, it's very speedy.
01:55:15: Do you want to give more?
01:55:16: Yeah, so just for a very quick context, I was originally playing with .NET 8 before the headless
01:55:24: was officially upgraded to the newer .NET, and I spawned like 900 character controllers,
01:55:32: and I made them all streamed, and on Mono, even the newest version of Mono, it was really struggling with what
01:55:46: was like 900 character controller cubes, and then I switched to .NET 8, and it was like pinned at 60
01:55:54: FPS, and it was this big gelatinous cube of character controllers, and the networking
01:55:59: actually broke before the physics did. That's a lot, like, it's a lot of data. I remember
01:56:06: joining the session and just seeing this blob around, and everybody was messing with
01:56:11: it. It was beautiful. But yeah, Bepu is optimized for this; like, you know,
01:56:17: right now with Mono it's kind of handicapped. It's usable enough,
01:56:22: but once we make the switch, we expect a significant boost from that alone.
01:56:30: Oh, I deleted the question. I have to paraphrase this one because I read it, but I deleted it by
01:56:37: accident. ShadowX was asking whether I wanted to do stacky particles and whether that's still on the table
01:56:42: for MVP or after MVP. Yes, that one's on the table. It's actually much easier with the new system
01:56:47: because now, you know, I have full control. I have some ideas how to approach it. We'll see,
01:56:52: it's not gonna be for MVP, but sometime after MVP I can, you know, do it. ShadowX, is it gonna be
01:56:58: Spectrum 1? I think this was relating to something we were showing. SNB8272, is it normal, when
01:57:07: exiting and saving homes, for it to be syncing for a very long time? I mean, it kind of depends;
01:57:12: if you change a lot of things it can take longer. So it kind of depends what you mean by very long
01:57:17: time because, you know, that can mean, you know, five minutes, can be 10 minutes, it could mean
01:57:22: an hour. So it kind of depends on specifics. Nukecon: a lot of this stuff is hidden behind
01:57:29: lack of array support. When are we going to get the ability to edit arrays in Resonite without a mod?
01:57:36: So this could happen sometime around when we rework the inspectors; we're actually gonna rework
01:57:40: them using the data feeds. This is gonna make it easier, you know, to actually build UIs for
01:57:45: working with arrays. But as for timelines, you know, we generally don't give those,
01:57:52: because they're too difficult to estimate, so we don't want to, you know,
01:57:56: promise a specific time which we might then miss. Check, for the folks out there: I see the colored names in
01:58:02: the chat are broken, did you figure out what is causing that? I didn't get a chance to look at it,
01:58:05: unfortunately. LickSquid, maybe this is more for debunking: so we know asset storage costs aren't
01:58:11: duplicated across items in the inventory. Are assets referenced from the Resonite Essentials folder
01:58:16: counted as free, unused storage quota? So it kind of depends where exactly they're stored,
01:58:22: because the folder does link to some shared folders, but if it's stored directly in the
01:58:27: Resonite Essentials, those should be free on the storage quota. Anything stored on the official
01:58:33: Resonite group, the system should treat it as a free asset. Even if you save it to your...
01:58:52: ...or I missed their reply: are you planning to do the IK rework right after the performance
01:58:56: update is finished, or is that not set in stone? Yeah, I did answer this one earlier and we didn't have
01:59:01: much time, so I can't get into details. It's one of the top things we want to do. It's gonna depend;
01:59:07: we're gonna evaluate after we're done with the performance update. I did answer it
01:59:10: earlier on the stream, so I do recommend rewinding after this to get the full answer.
01:59:16: Next one is SNB8272: is there an orbiting gravity system, for an orbiting moon map?
01:59:23: So if you mean for the particles, there's actually a radial force, so yes, you can do orbiting.
01:59:29: I don't know if I have time to showcase it. I can try it, like, you know, real quick.
01:59:34: See, like, you know, if I can open this up. So with the emitter I'm gonna add...
01:59:41: and this is like a speedrun, because we have only like a minute left.
01:59:47: Oh, we have 10 seconds left. Yeah, there's probably not enough time. I'm just gonna show you a radial
01:59:53: force. I'm gonna drop this one in, see what this does. If it doesn't work, it might need some
02:00:00: more messing around. The radial force... I'm gonna delete the trails, no, not the trails, the simplex one,
02:00:11: use this one, the radial force. That's gonna be stronger... and it might not work right. Yeah, it is not
02:00:19: working, so I need to mess around with this a little bit more, because we moved a bunch of stuff.
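The live demo ran out of time, but conceptually a radial force for orbit-like motion is just a constant pull toward a center point; combined with some initial tangential velocity, particles circle it instead of falling straight in. A generic sketch of that idea (not the actual PhotonDust radial force module; all names are illustrative):

```python
import math
from dataclasses import dataclass

@dataclass
class Body:
    position: tuple[float, float, float]
    velocity: tuple[float, float, float]

def apply_radial_force(body: Body, center: tuple[float, float, float],
                       strength: float, dt: float) -> None:
    # Accelerate toward the center; with an initial sideways velocity the
    # body circles the center instead of falling straight into it.
    dx = center[0] - body.position[0]
    dy = center[1] - body.position[1]
    dz = center[2] - body.position[2]
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    if dist < 1e-6:
        return
    vx, vy, vz = body.velocity
    body.velocity = (vx + dx / dist * strength * dt,
                     vy + dy / dist * strength * dt,
                     vz + dz / dist * strength * dt)

def integrate(body: Body, dt: float) -> None:
    # Simple Euler step to move the body along its current velocity.
    x, y, z = body.position
    vx, vy, vz = body.velocity
    body.position = (x + vx * dt, y + vy * dt, z + vz * dt)
```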
02:00:26: So that's pretty much all the time we have; we're a little bit over. Thank you
02:00:32: very much, everyone, for all your questions, and thank you very much for watching.
02:00:37: I hope you enjoyed this episode of The Resonance. I hope you enjoyed learning more
02:00:42: about the new particle system that we have, and that you're gonna enjoy messing around with
02:00:46: it and playing. So thank you very much, you know, for building cool stuff, and thank you everyone
02:00:52: who's helped participate in the testing. There's a bunch of issues that I'm going
02:00:56: to be going through, you know, over the next week, and hopefully get PhotonDust ready.
02:01:02: So thank you very much again for watching. Thank you, Cyro, you know, for helping me co-host.
02:01:07: Thank you for supporting this platform, and we'll see you next week. Bye-bye. Thank you very much.