The Resonance/2025-01-05/Transcript

From Resonite Wiki

This is a transcript of The Resonance from 2025 January 5.

This transcript is auto-generated from YouTube using Whisper. There may be missing information or inaccuracies reflected in it, but it is better to have searchable text in general than an unsearchable audio or video. It is heavily encouraged to verify any information from the source using the provided timestamps.

00:00: I'm going to post the announcement, posting, there we go, FSRs, post the livestream announcement,

00:14: where's livestream, there we go, livestream, there we go, let me make sure I have it open

00:30: make sure it's working, okay it is open, hello, hello, hello, yes I'm still a bit sick, hello,

01:19: can you hear me, hello, it's going like about the same levels as OBS so I assume it should

01:26: be okay, so hello everyone, hello and welcome to the first episode of The Resonance of 2025

01:36: of this new year.

01:39: The Resonance, it's a weekly sort of office hour slash podcast where you can come, you

01:45: know, ask any questions about Resonite, whether it's the development, engineering, whether

01:50: it's, you know, how the platform is going, where the goals are, you know, what the history

01:54: is, like whatever you want to know about the platform and so all the questions that can

01:59: end up like going into a little bit of a tangent or like longer rambles where we kind of dwell

02:05: a little bit more into, you know, the flaws of it, like where is it going, what do we

02:11: want to do with, you know, with this project, where it's heading to give you better and

02:16: sort of like, you know, higher level overview of like, you know, what the platform is and

02:21: what our goals are.

02:23: If you'd like to ask any questions, make sure you put a question mark at the end of it or,

02:29: you know, somewhere in the middle of it.

02:32: That way it pops up in our messages, so that makes sure we don't miss it.

02:38: Also, we are coming to you from this like beautiful world.

02:40: This is the new Creator Jam 2025 New Years World.

02:45: It's very precious.

02:46: So like, I'm like, it's going to be like very fitting to stream from here.

02:52: It's like going to be like kind of good backdrop.

02:54: I want to kind of get into the habit of trying to like use different worlds for these streams.

02:58: I'm just going to go around.

02:59: Hello to the folks out there.

03:01: Hello Fuzzy.

03:02: Hello Terska.

03:06: Terskale.

03:07: Terskale.

03:08: I'm sorry if I'm mispronouncing that.

03:12: So hello everyone.

03:14: So with that, like, I think we could get started.

03:17: The aim is like, you know, usually we go, usually these go like for two hours because

03:21: we get plenty of questions by the time it kind of gets going.

03:25: So hello blue bois.

03:27: Blue bois?

03:28: A bois?

03:29: Blue bois?

03:33: Blue bois.

03:34: I'm like, this sounds like blue bois, but it's a...

03:38: Blue bois.

03:43: That's a very different dialect.

03:46: It's a different dialect.

03:48: Anyway, we got a question.

03:51: Let me actually clear this one.

03:53: SMB8272 is asking, will sharing videos in game on the spot be improved?

03:58: Maybe an option to load it if another friend wants to watch it alongside.

04:01: For now, I have preloaded MP4 by saving it in the world itself.

04:05: I'm actually not really sure what you mean by that.

04:12: Like, when you bring a video into the world, it ultimately loads for everyone in the world,

04:18: at least like it should.

04:21: I'm actually not quite sure I understand this one.

04:29: Perhaps maybe you've just had bad luck.

04:31: There is currently a bug that happens where if there's a lot of asset transfers in a world,

04:38: there's a chance that asset transfers may end up getting stuck.

04:41: That is, transfers between, you know, peers, players.

04:46: Yeah, it could be that.

04:47: But it's like, you know, circumstantial.

04:51: So I'm not sure if maybe you've just been having bad luck with that,

04:53: but if you bring an MP4 into the world, it should upload to the host,

04:57: and then the host should distribute it to everyone else in the session.

04:59: Yeah, I do want to rewrite that system because it is prone to just getting clogged

05:04: and stuff that's locally imported doesn't transfer anymore.

05:13: But it's not something that's specific to videos.

05:16: It's pictures, whatever assets you bring into the session that are not saved to the cloud,

05:22: they will be synchronized between the clients through the session,

05:27: and when it clogs, it just stops transferring, so we want to rewrite that one at some point.

05:38: The question is, where are you importing the video from?

05:43: Are you importing, is it from YouTube? Is it local asset and so on?

05:49: Yeah, it might be just that. That system kind of needs a rewrite.

05:56: They did mention that the loading indicator on the video player only showed one out of two.

06:03: Or just some other bugs.

06:05: Well, the question is, is the video actually loaded? Because there's also another bug in the indicator

06:10: where it might not show it correctly, so does the video play for the other person?

06:16: But also maybe the assets are just clogged, so it could be either of the two bugs.

06:19: Yeah, it is a local file.

06:26: The system, it's one of those things I really want to re-write at some point.

06:31: Because it makes things kind of just weird, because once you start bringing stuff,

06:40: you know, pictures and memes and videos

06:46: and other stuff we want to show people, and it stops showing it,

06:50: it kind of gets weird because it usually requires restarting the session.

06:55: So I do want to touch that one at some point.

06:58: Olukazero is also asking, perhaps an option for uploading assets directly to the cloud inventory

07:02: so it can load faster for everyone?

07:04: And we can already do that, we just save it to the inventory and then spawn it from there.

07:09: So there's already an option.

07:10: And usually what people do is if this syncing is stuck, usually we just tell people

07:17: just save it to inventory, wait for it to sync and spawn it out.

07:20: That way it loads from the cloud because once you save it to your inventory,

07:25: the URLs are rewritten to the cloud instead of local.

07:30: So it kind of gets around it.

07:32: But the main thing is just to fix it so that syncing doesn't break.

07:38: We usually don't add features to work around a bug.

07:44: We just want to fix the bug.

07:45: Because otherwise we just end up with lots of bugs that still stay there

07:51: and you're just kind of working around them and it kind of makes the bugs fester,

07:55: which is not good long term.

08:01: The next question is from Jack the Fox author.

08:04: What's the scope of the upcoming audio system rework?

08:07: What things will get worked on, added, replaced?

08:11: So right now the audio system in Resonite is already like a hybrid.

08:17: Like FrooxEngine, it handles a lot of the bits of audio processing.

08:23: Like for example, we do our own audio decoding.

08:26: Like when it's decoding WAV files or FLAC files or OGG Vorbis.

08:33: All of that is our own.

08:34: Same with stuff like the voice that's being transmitted that's fully FrooxEngine.

08:40: The parts that Unity is handling right now is the spatialization.

08:45: So we actually just give Unity the audio buffers.

08:50: Actually, this might be a good... I might draw this a little bit just to give you a better idea.

08:55: I'm going to go over here. I made a spot for drawing. There we go.

09:01: It's not a whiteboard, but it's kind of nice.

09:11: It's nice like whiteboard in the sky.

09:16: So it's just kind of a bit weird because I don't have any reference to draw on, but we'll see how it works.

09:24: So right now, like you know, say you have a bunch of audio sounds.

09:28: You know, you have like audio source here. You have another one here.

09:33: You know, and you have another one here. And these are like, you know, all playing like different sounds.

09:42: You know, say like this one's like a wave.

09:46: This one's like... Actually, the camera is a little bit rotated. I might move it a little bit better.

09:56: I guess like having a whiteboard would be better, but this should work.

10:01: So a bunch of different audio effects, you know, say this one's OGG, you know, and this one's like FLAC.

10:06: It's just a bunch of sounds in the world, you know, and you have like a listener, you have like an ear.

10:11: I don't know if I can draw an ear, but you have like your ear here.

10:15: So what we do is we have a system that actually decodes the audio for this, you know.

10:23: So this is like the audio data, and like this is the buffer, and every like audio update, it like, you know, decodes the bit of that audio, and it's already done on the Froox Engine.

10:33: So this is, you know, on our side of Froox Engine.

10:42: Is it still kind of in frame? It's kind of weird.

10:46: I probably should have like found like a flat place or something.

10:53: Yeah.

10:56: It's one of the downsides. I'm gonna move the camera a bit.

11:00: I can actually move a little bit out of the way.

11:07: There we go.

11:08: So this is like, you know, Froox Engine.

11:15: And then like, you know, we have the Unity side, and right now what happens is Froox Engine will kind of, you know, synchronize the positions of them.

11:29: This is Unity.

11:31: There we go. We got a whiteboard.

11:34: There we go. Does that help?

11:36: I mean, it doesn't fully... There we go.

11:40: Is it flat with the camera?

11:42: There we go.

11:43: I think that's fine.

11:49: Maybe I'll adjust the camera a bit.

11:53: Let me just move it. Oh.

11:57: Oh, wait, did it?

11:58: Stand by, folks. We're a little out of it today.

11:59: Why is the camera attached to the whiteboard?

12:06: What did it do?

12:07: Oh, is it the frustum snapping to the whiteboard?

12:12: Oh, no. That's weird.

12:14: Ah, no.

12:18: Okay, this works. Let's just roll with this.

12:24: So then we have the Unity side.

12:28: And on the Unity side, we actually have to duplicate...

12:31: We can actually move out of the way so we don't...

12:34: On the Unity side, we actually say there's audio source here,

12:37: there's audio source here, there's audio source here.

12:42: And then we have to transmit.

12:44: We have to give Unity the audio data for each one of these.

12:47: So we give it buffer for this one.

12:49: We give it buffer for this one.

12:52: And then we give it buffer for this one.

12:55: And then we also tell Unity the ear is over here.

12:59: So Unity can figure out how this should sound based on the position of the ear with the spatialization.

13:10: And this is one of the things which makes it a little bit more entangled with Unity

13:14: because we're giving it audio buffers for each individual audio output.

13:20: So one of the things we'll be doing is all this spatialization,

13:23: because Unity resolves this and then it produces a final mix of the audio

13:30: that then goes to your actual speaker.

13:35: And then you can hear that.

13:38: So this then goes to your headphones, your speakers, whatever.

13:45: What we'll do by having our own audio system,

13:48: because say we have the ear here and these sounds are near you,

13:54: you will actually collect, these are the closest sound effects,

13:58: so I'm going to render these out, I'm going to spatialize them,

14:01: and maybe there's also a reverb zone here that we told it about.

14:10: There's one on here and we're just going to sync that.

14:14: There's the reverb zone here, so that's going to apply extra effects.

14:19: We want to resolve all of this here, so Unity is not even going to need to care about this.

14:24: So what our system will do, instead of having Unity,

14:29: or essentially FMOD, which is the system it uses,

14:32: do this part.

14:38: We'll have a system which is very likely going to be based on the Bepu Physics,

14:43: where each of the sound effects has a range,

14:50: where you can hear it, and then when we say,

14:52: there's a listener here, we want to collect all the sound effects,

14:56: or all the sounds that should be hearable from this point in space,

15:04: and we want to hear them,

15:08: we want to resolve them.

15:10: So it's going to collect all of them,

15:12: so it's going to be okay, this one's near, and this one's near.

15:14: We already have a system for recording audio,

15:17: so then it's just a matter of taking these,

15:20: and running it through spatialization,

15:22: which is, we're going to be still using Steam Audio,

15:26: except we're going to be using it directly now.

15:28: So we're going to ask Steam Audio to make the sound,

15:32: as if they know they are at this position relative to the listener,

15:35: and this one's at this position relative to the listener.

15:40: What's cool about this is once we're also resolving the audio for each of the sound effects,

15:45: we can now control stuff like attenuation.

15:48: So for example, we can say this is this far from the user,

15:52: so maybe it's going to use this kind of curve,

15:56: or this kind of curve, or maybe we're going to have a different type of curve,

16:04: or whatever we want, because we can just plug whatever function we want

16:08: to compute the actual attenuation.
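
The pluggable attenuation idea described above can be sketched like this. This is an illustrative Python toy, not Resonite's actual API; the function names and curve shapes are assumptions:

```python
# Sketch of pluggable attenuation curves: any function mapping
# (distance, max_range) to a gain in [0, 1] can be swapped in.
# Names and curve shapes are illustrative assumptions only.

def linear_falloff(distance, max_range):
    """Gain drops linearly, reaching silence at max_range."""
    return max(0.0, 1.0 - distance / max_range)

def inverse_falloff(distance, max_range):
    """Inverse-distance falloff, clamped to silence beyond max_range."""
    return 0.0 if distance >= max_range else 1.0 / (1.0 + distance)

def attenuate(buffer, distance, max_range, curve=linear_falloff):
    """Scale raw samples by whichever curve function was plugged in."""
    gain = curve(distance, max_range)
    return [s * gain for s in buffer]
```

The point being made in the stream is exactly this `curve=` parameter: because the engine computes the gain itself, any falloff function can be plugged in instead of whatever FMOD hardcodes.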

16:16: This also covers stuff like feature parity.

16:20: When there's also reverb zones, it's going to be like,

16:22: okay, the listener's within the reverb zone, we found they're inside,

16:27: so we're also going to apply extra sound effects to this audio data before mixing it together.

16:32: And once this is done, we actually compute our own final audio

16:36: that's all mixed and resolved for this listener.

16:40: So this is the final audio that's been resolved from all the sound effects,

16:44: whatever other effects are in there, and we're going to produce it ourselves

16:48: instead of having Unity produce it.

16:51: And it becomes a question because either we can just send this to Unity

16:56: and be like, can you pipe it to the speakers?

17:00: Or we just use the library, we might just pipe it to the speakers ourselves.

17:07: So we just pipe it in and Unity is not even going to know there's any audio being produced

17:12: because we just do it with our own process.
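
The collect-spatialize-mix flow described above, reduced to a toy sketch (assumed Python, not FrooxEngine code; the data layout and linear falloff are illustrative): gather every source within audible range of the listener, attenuate by distance, and sum into one output buffer.

```python
import math

# Toy resolve step: collect sources within range of a listener,
# attenuate each buffer by distance (simple linear falloff), and
# mix everything into one final buffer for that listener.

def resolve_listener(listener_pos, sources, buffer_len):
    mix = [0.0] * buffer_len
    for src in sources:
        d = math.dist(listener_pos, src["position"])
        if d > src["range"]:
            continue  # outside audible range: skip entirely
        gain = 1.0 - d / src["range"]
        for i in range(buffer_len):
            mix[i] += src["buffer"][i] * gain
    return mix
```

The resulting `mix` is the "final audio" the stream talks about: it can be handed to Unity to pipe to the speakers, or piped to the audio device directly.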

17:18: By having the control over the system, what does it allow?

17:23: There's going to be a few new features that are going to come in as part of the system.

17:28: The main goal, because this is done as part of the performance update,

17:31: is feature parity, which is making sure all the stuff that works right now

17:35: reasonably works for the new system.

17:40: But there's a few things that are relatively simple additions

17:44: that we're going to add along the way,

17:46: especially because they're also things that people have been asking about.

17:49: One of those is having multiple listeners.

17:52: So right now, I'm streaming to the camera.

17:55: If I switch the audio to spatialized,

17:57: actually I might even do a live demonstration for this.

18:01: So I'm going to switch Cyro back to spatialized.

18:05: So Cyro, if you say something,

18:09: you hear him coming from the wrong side,

18:12: because it's actually coming from my side.

18:15: I'm going to, like, now turn this way. Now if you say something.

18:19: So now he's coming a little from the right side,

18:20: because you're actually hearing it from my viewpoint.

18:23: We do have a feature where I can switch it on the camera.

18:28: There's a render audio from camera viewpoint.

18:33: So now, Cyro, if you say something.

18:35: Oh, that's so weird.

18:38: So now you hear it correctly from the camera,

18:40: but I hear him from over there, which is incorrect.

18:43: Again, it kind of, it messes with my brain.

18:47: Okay, switch it back.

18:48: I'm going to switch him back to broadcast.

18:50: So that's one of the limitations.

18:51: You can either have the audio coming from the camera's viewpoint

18:55: or from my viewpoint, but not like, you know,

18:57: you cannot have two separate ones because of Unity.

19:00: We can only have like one listener,

19:03: like one virtual ear in the scene.

19:09: With having our own system,

19:12: we can essentially have as many ears as we want.

19:15: So we can, you know, have another one that represents the camera

19:18: and just be like, we want to also resolve all the audio, you know,

19:21: from the viewpoint of this ear and it's got to be the camera.

19:24: And this is going to be piped into a different audio device.

19:28: So this is, you know, this one goes to your speakers.

19:31: So this is what you hear and this goes to the camera.

19:33: And maybe there's a different audio device. You pipe it to OBS.

19:36: And now we have accurate audio, you know,

19:40: from the camera's viewpoint.
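
The multiple-listener idea can be illustrated with a toy per-listener pan computation (assumed Python; constant-power panning in 2D stands in for real HRTF spatialization like Steam Audio): each listener, your head or the camera, computes its own left/right gains for the same source.

```python
import math

# Toy per-listener spatialization in 2D (x, z): each listener resolves
# the same source into its own stereo gains based on where the source
# sits relative to that listener's position and facing direction.
# Constant-power panning is a stand-in for real spatialization.

def stereo_gains(listener_pos, listener_forward, source_pos):
    """Return (left, right) gains from the source's azimuth."""
    dx = source_pos[0] - listener_pos[0]
    dz = source_pos[1] - listener_pos[1]
    azimuth = math.atan2(dx, dz) - math.atan2(listener_forward[0], listener_forward[1])
    pan = math.sin(azimuth)              # -1 = hard left, +1 = hard right
    angle = (pan + 1.0) * math.pi / 4.0  # constant-power pan law
    return math.cos(angle), math.sin(angle)
```

With two listeners facing different ways, the same source lands on different sides of each mix, which is exactly the discrepancy demonstrated with Cyro's voice above; each listener's mix would then be piped to its own audio device (speakers, OBS, etc.).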

19:42: We can do like a bunch of extra features as well.

19:44: Like we can, for example, render your own voice,

19:48: you could be hearing me spatialized relative to your viewpoint,

19:52: even for like, you know, I'm the one streaming,

19:55: which is also going to make it a little bit easier, you know, for the setup,

19:57: because with OBS, I won't have to like, you know,

20:00: I won't have to do my voice separate, capture my voice separately

20:04: from the game audio. My voice is still,

20:07: is actually going to come from the game audio with this system.

20:11: There's some other things we could do too,

20:13: because for example, when we are resolving the audio,

20:18: one of the things that's added,

20:23: that we're kind of like adding, like in the background,

20:26: is spherical harmonics. And spherical harmonics,

20:29: like I mentioned before, one of the things they're used for is ambisonics,

20:32: which is a way to encode directional audio.

20:36: That means, in a scene,

20:40: I'm going to like remove some of this clutter,

20:42: just so it's like not too cluttered.

20:46: It's a way of having audio

20:51: that sounds different based on the direction you're hearing it from.

20:56: So, say there's an audio source here,

20:59: and this is ambisonics one.

21:01: I'm just going to do like, I'm going to visualize it like this.

21:07: So, you might actually, this might sound different

21:10: if you're like hearing it from this angle,

21:12: and might sound different if you're hearing it from this angle.

21:16: Or this angle.

21:18: Or it can also be the other way, so it can be coming from everywhere,

21:22: and you're kind of inside of it, and as you turn your head,

21:25: depending on which angle your head is,

21:30: the audio is going to move around you.

21:32: It's a really good way to do environmental audio.

21:36: So, you can have sounds of environment,

21:40: and as you move around, it doesn't stay fixed to your head,

21:45: and as you move around, it's a good way to do ambience.

21:49: And because we have the code for resolving spherical harmonics,

21:53: and because we have full control over resolving the audio,

21:56: we can just pipe the direction into the ambisonics,

22:01: compute what the audio should be, and compute the audio buffer,

22:04: and then we do whatever processing and piping we need on that.
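
A toy first-order ambisonics round trip makes that direction-dependence concrete. This is an assumed Python sketch; the FuMa-style W weighting is an illustrative choice, not necessarily what Resonite will use:

```python
import math

# Toy first-order ambisonics: encode a mono sample arriving from one
# direction into B-format (W, X, Y, Z), then decode toward a listening
# direction. The decoded level depends on which way you "look", which
# is the direction-dependent behavior described above.

def encode_b_format(sample, direction):
    dx, dy, dz = direction  # unit vector toward the source
    return (sample / math.sqrt(2.0), sample * dx, sample * dy, sample * dz)

def decode_toward(bformat, look):
    """Virtual-cardioid decode: loudest when looking straight at the source."""
    w, x, y, z = bformat
    lx, ly, lz = look
    return 0.5 * (math.sqrt(2.0) * w + x * lx + y * ly + z * lz)
```

Decoding the same B-format signal toward different look directions yields different levels, so as you turn your head the ambience moves around you instead of staying fixed to your ears.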

22:09: So just having the full control of the system that's going to open up

22:13: all sorts of possibilities.

22:15: The one thing I do want to mention is the main focus is just kind of

22:19: collecting audio, and then it's like, you know,

22:25: resolving the spatialization.

22:29: People kind of expect some things like stuff like with audio decoding

22:32: and so on, but it's stuff that we already do.

22:35: That's why currently it's like a hybrid.

22:38: Because we already handle our own decoding of different audio files,

22:44: so it's not really going to get touched.

22:47: There's also things like doing audio processing, like DSP.

22:53: That's a separate project that's going to be the ProtoFlux audio DSP.

22:58: So you can do effects like splitting channels,

23:02: piping them to different things and so on.

23:04: That's going to handle stuff like that.

23:05: The audio system specifically is just resolving the spatialization,

23:12: culling the audio sources that are around you,

23:15: the system is going to build a priority list too.

23:18: For example, if you have lots of audio sounds in the same area,

23:31: and each one of these is an audio source,

23:37: there's so many around,

23:43: the system cannot reasonably play them all without losing performance.

23:49: So this is going to build a list of them.

23:55: These are all of the sound sources within the proximity of the ear,

24:03: and it's a sorted list, like this one has higher priority,

24:07: and it ranks them,

24:12: and decides I'm going to play only these ones,

24:16: and these ones are going to be quiet.

24:18: But we also have control over that logic now,

24:20: so we can add whatever rules we want,

24:22: we can add whatever behaviors we want.

24:26: Options for expansion for the system.
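
The priority list could look something like this toy sketch (assumed Python; the ranking rule, volume over distance, is just one example of the pluggable behaviors mentioned above):

```python
# Toy voice culling: rank all in-range sources by an estimated audible
# gain and play only the top N, muting the rest. The ranking heuristic
# is an illustrative assumption; the point is that the rule is pluggable.

def cull_sources(sources, max_voices, rank=None):
    rank = rank or (lambda s: s["volume"] / max(s["distance"], 1e-6))
    ranked = sorted(sources, key=rank, reverse=True)
    return ranked[:max_voices], ranked[max_voices:]  # (played, muted)
```

Because the engine owns this logic, a different `rank` function (or extra rules like per-user priorities) can be dropped in without touching the rest of the pipeline.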

24:31: So I don't know if this covers things,

24:33: like what the audio rework will involve,

24:37: and what you can expect from it.

24:39: If you have any follow-up questions, feel free to ask.

24:43: I'm going to move back to the other area.

24:49: There we go.

24:50: Oh.

24:53: That's another one of the rambles.

24:55: Let me position this.

25:00: There's also a bunch of things that we'll do in the future,

25:03: just like by implementing our own versions of some of these systems,

25:07: we get full control of our system,

25:09: because right now some of it is more of a black box inside of Unity,

25:14: and by having our own code that handles it,

25:17: it's no longer a black box, we control it,

25:20: and it has a lot of possibilities too,

25:21: and makes it easier to fix certain issues,

25:24: or add certain features that otherwise would be a lot more difficult.

25:27: So, that kind of covers that question.

25:33: Next question from snb8272.

25:37: Another question.

25:39: Will we be getting any linear curves for color alpha over lifetime,

25:42: or legacy ribbon particles back?

25:48: So, what do you mean by legacy ribbon particles?

25:55: We should already have the arrays for the curve keys should already be there,

26:01: so if you have any previous particles that ended up using those,

26:04: they should be fine.

26:06: There was previously no way to edit those,

26:10: and if you saw a way to edit those, then that was likely because

26:13: you joined someone's session and they had a mod that added a hack UI to edit them.

26:22: It should convert.

26:23: All the modules, they should be there for feature parity.

26:26: If something's not converting, make sure to make a GitHub issue so we can look into it.

26:34: All the modules for feature parity should be there,

26:38: so we probably need a little more clarification.

26:42: There's also new modules, so you can, for example, do color and alpha over lifetime using texture as well,

26:48: which can make it easier to edit, but if you're using a mod or something to edit the arrays,

26:56: they will require whoever made the mod or whoever made the thing to update it for the new system.

27:02: But the actual system is in there.

27:07: Next question is from the chip forge. Any plans to support Opus OGG file decoding?

27:13: I think last time I tried to upload an OGG file that had audio encoded in Opus,

27:16: it would just have a blank waveform and not play, which is funny because all the voices from users seem to be Opus.

27:23: So it's something we could do, it's just like there's not been too much, you know, requests for it.

27:29: I would recommend making a GitHub issue because that can be a good way to kind of gauge, like, you know, whether a lot of people are interested in that.

27:35: It's just a matter of, you know, putting time into it.

27:41: It's doable, it's just, you know, it needs many hours to happen.

27:46: For the voices of users, the reason we can kind of just use Opus there, like, you know, for voices,

27:52: is because while the voices, they use the Opus codec, they don't use the OGG container.

28:00: The data is like, you know, is essentially sent pretty much raw.

28:03: It's like raw Opus frames that are being transmitted through our custom system for handling audio buffering.

28:11: Which is one of the ways, like, you know, we achieve a really low audio latency with voices.

28:17: But it means, you know, like, we cannot just use that same code for, you know, Opus OGG playback.

28:23: Because what you need to do is, like, either, like, we would have to use code to, like, you know, decode the Opus frames from OGG.

28:33: And then, like, you know, use the Opus decoder to decode the actual audio data.

28:38: Or we just, you know, use a library that kind of does it, like, both at the same time.

28:41: We don't have to, like, worry, we just kind of give it and it gives us audio data back.

28:45: So, but essentially, you need, like, two pieces.

28:48: You need something to extract the OGG, you know, frames from, you know, the OGG stream.

28:55: And within those, you have, you know, the Opus frames, which is the actual kind of raw encoded audio data.

29:03: Because the codec, it also handles, like, you know, some other stuff.

29:07: Like, for example, seeking; you know, each of the OGG frames has, like, you know, stuff like a timestamp and so on.

29:15: If you want to, you know, move from the audio file, like, you know, like, you know, jump a minute ahead or jump a minute back.

29:22: Stuff like that. You cannot really do that with voices, because the voices is just a constant stream, and if you miss a packet, like, it's gone.

29:30: Like, and you cannot really go back.

29:32: So it's kind of different.
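
Those two pieces, container demuxing and packet decoding, can be sketched as a toy pipeline (assumed Python; the "container" here is just a list standing in for OGG pages, and the decoder is a stand-in callback, not a real Opus API):

```python
# Toy two-stage pipeline: a demuxer pulls timestamped packets out of
# the container, and a separate codec-level decoder turns each packet
# into PCM samples. Both stages are stand-ins, not real OGG/Opus APIs.

def demux(container):
    """Stand-in for OGG demuxing: yield (packet, timestamp) pairs."""
    for timestamp, packet in enumerate(container):
        yield packet, timestamp

def decode_stream(container, decode_packet):
    """Feed every demuxed packet through the codec decoder, in order."""
    pcm = []
    for packet, _timestamp in demux(container):
        pcm.extend(decode_packet(packet))
    return pcm
```

The timestamps carried by the container are what make seeking possible; a live voice stream has no container, so there is nothing to seek in, which is why the voice code path can't simply be reused for file playback.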

29:34: But yeah, like, if you want to, like, import those files, I would recommend making a GitHub issue, getting upvotes on it, and that way, like...

29:42: Oh, the keyboard is sounding.

29:48: They can, you know...

29:50: What?

29:51: I don't know.

29:53: I haven't used it.

29:57: But I do recommend, like, making a GitHub issue.

30:02: There's just another possibility. I do eventually want us to switch to a different kind of library that handles a lot of the decoding, because we're using...

30:10: We're using a lot of kind of C-sharp, like, implementations of these things, and there's, like, some, like, libraries that we're kind of looking at as a potential replacement.

30:18: Some of those libraries, they do support, like, Opus, like, with an OGG container.

30:23: So by making that switch in the future, like, you know, it might just happen implicitly.

30:29: But we'll see how that one kind of goes.

30:35: So this one, this one's not related.

30:41: Nookicoon. Isn't there an open standard for VR avatars and worlds? Why hasn't it been implemented in Resonite, and will it ever be?

30:50: I don't think there's, like, a standard for, like, VR, like, I mean, there's, like, some for stuff, like, you have, like, for example, VRM.

30:58: Which is sort of, like, you know, just glt files with some extra stuff.

31:03: One of the things that's been kind of, like, blocking us there is because it includes, like, licensing information, and right now we don't have, like, a good way to kind of preserve that.

31:11: Like, enforce that. So, like, we've been kind of, like, waiting on that.

31:15: Especially since, like, information we've gotten from people where, like, you know, it's better to, like, wait until VRM opens its licensing system before supporting that.

31:22: But, you know, we might want to reevaluate that.

31:25: I don't think there's, like, any standard for worlds. Like, at least I'm not, like, really aware of any.

31:32: The problem is also, like, you know, it's a little bit hard to, like, have a standard that kind of covers everything.

31:36: Like, you end up, like, with something that's essentially, like, the least common denominator.

31:46: Like, because there's a lot of stuff you could do in Resonite that you cannot do in other platforms, which means it cannot be part of any standard, you know, and vice versa with other platforms.

31:56: Like, every platform has its own different ways of doing stuff.

32:01: And if you want to, like, you know, distill the things that can be shown between platforms, like, it needs to be some common shared functionality which ends up being just a small subset of what the different platforms support.

32:16: We do support, like, you know, a number of, like, formats. Like, you can import, like, glTF files directly, that's, like, an open, you know, format for model files.

32:25: So we're looking to add, like, more formats as time goes, it's just, you know, like, with everything it's just a matter of, like, man hours that need to be put into it.

32:37: And also kind of, like, you know, kind of interest from people.

32:47: It is what helps to determine which things we invest man hours into to implement.

32:53: So if you're, like, aware of, like, any standards, like, you would like to see, you know, make sure there's, like, a GitHub issue and make sure there's, like, upvotes on it.

33:04: VZArtilius, how much is the Froox? How do I give Froox the coffee? A coffee? I don't know. How are you giving me a coffee?

33:13: I mean, you, like, go to Starbucks, I guess.

33:19: I don't care.

33:25: Zuan5250 is asking, do we have, or will we have, any interface in ProtoFlux to manipulate audio buffers for creating custom sound effects, audio filters, audio synthesizer algorithms, like the OnAudioFilterRead MonoBehaviour callback in the Unity Engine?

33:41: So there is a planned feature that's called the audio DSP for ProtoFlux.

33:48: There's a GitHub issue on it that kind of has a bunch of kind of details.

33:51: And essentially what it will do is, like, you know, what you're asking is so you can do a bunch of, like, audio processing.

33:57: That's going to include, you know, a bunch of existing filters.

33:59: So, like, you know, a lot of common filters you want to do with audio.

34:02: There are going to be, like, just nodes and you just plug it through them.

34:05: There's also going to be mechanisms where you can, like, you know, specify where you get the audio buffer.

34:17: And you can just kind of loop through it and do whatever you want with it to make your own filters as well.

34:23: So it's going to come at some point.

34:25: If you'd like to see that, like, give it a visit and an upvote; it's going to make it easier to justify prioritizing it sooner.

34:34: It's definitely one of the things I'm kind of looking forward to working on, because I think it'll open, like, you know, lots of new cool options.

34:41: But yeah, it's on the roadmap. It's going to happen eventually.
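
As an example of the kind of custom filter that looping over the raw buffer would allow, here's a one-pole low-pass (an assumed Python sketch; the actual ProtoFlux node shapes for the audio DSP are not final):

```python
# Toy custom filter: a one-pole low-pass written as a plain loop over
# the audio buffer, the kind of per-sample processing the planned
# ProtoFlux audio DSP would expose. Alpha controls the smoothing.

def one_pole_lowpass(buffer, alpha=0.5):
    out, prev = [], 0.0
    for sample in buffer:
        prev = prev + alpha * (sample - prev)  # smooth toward the input
        out.append(prev)
    return out
```

Smaller `alpha` values smooth harder (a lower cutoff); `alpha=1.0` passes the signal through unchanged.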

34:50: Next question from Jack the Fox Author.

34:53: I've been busy the past week getting into plugin development, and I'm curious, what are your long-term plans for improving and expanding the plugin system?

35:00: The ability to make userspace core plugins is powerful and fun to work with already, but some things still seem to be missing, like callbacks,

35:08: events that get called after the engine starts/stops, and I may make an issue about it.

35:14: It still kind of depends, because there's, like, several things with plugins.

35:25: How do I put it? Like, you can pretty much use them potentially to add anything into the engine.

35:32: The problem is, whatever you add, it makes you incompatible with other clients.

35:39: And one thing could be done is, you know, you have other users install the plugin, but it requires them to go to the website and install another thing and so on.

35:48: You could theoretically have the plugin automatically download and run, but that's a very, very bad idea, because it can be malicious code, and you know, you don't want to just run that automatically.

35:59: It can be malicious code if you download it, you know, from some websites too, and install it, but at least like, you know, that requires user action to do, so it's less likely to be exploited.

36:11: So in my view, like, I think plugins are most suited for things that don't touch the data model.

36:18: I think for those they're going to be, you know, great. Like, I do want to expand support for, you know, making it easier to add stuff like device drivers.

36:29: You want to add, you know, support for another device that can make it easier to do that.

36:33: You want to add stuff like new importers and exporters for file formats, you know, plugins can be like really great for that too.

36:44: So for adding those different types of things to the engine, plugins can expand it well.

36:50: And sort of like, you know, provide like different kind of hooks and callbacks that make it easy to plug into these systems.

36:56: That way, like, you know, we could also like, you know, take a lot of our own code.

37:02: Like one of the things I've kind of wanted to do for a while is, for example, you know, take the importer exporter system and rewrite it to be more modular, but also open source those modules.

37:12: So we can publish the code that does the importing of models, you know, and does importing, like, you know, exporting of other stuff.

37:19: Because that way you can both use it as a reference to make your own or make your forks that, you know, better suit your specific needs.

37:28: Or you can even, like, you know, make contributions to the official ones.

37:33: And if there's, you know, some issues with how something's imported and, like, you have fixes for it, you could, like, you know, help us with those and help, like, you know, improve things that way.

37:45: Plus, you know, you can just build your own importers. You want to, you know, support this, like, super obscure file format, you know, write an importer plugin for it.

37:56: The other thing is, like, you know, having sort of, like, batch processes. So, like, if you have, like, you know, tools that, like, I don't know, say, like, they go through the whole hierarchy of the world and they do a bunch of, like, optimizations on it.

38:08: That's another, I think, like, really good use for plugins because, like, that doesn't need to introduce new data into the data model.

38:16: So it can be, you know, another thing, like, we provide sort of nice interfaces for.

38:21: For things that, like, you know, add stuff to the data model, I feel it's better to use stuff like, you know, WebAssembly.

38:30: Because that's much, much easier to, like, sandbox. It's designed to be secure.

38:36: So we can, like, you know, we can, like, synchronize that code to other users without worrying, you know, it's gonna cause exploits.

38:44: Or, like, you know, the likelihood of an exploit is, like, way, like, way lower.

38:53: WebAssembly is, like, designed, you know, for the internet. It's designed to, like, you know, run in your web browser.

38:59: So it's already kind of, like, you know, hardened against that.

39:02: Whereas, like, you know, C# plugins, they're not.

39:08: But, yeah, like, there's probably gonna be, like, you know, things for supporting specific, you know, kind of, like, use cases for plugins.

39:16: One of the things is also gonna be so you can more easily, like, you know, like, when you do have stuff that adds to the data model, if you really want to do that,

39:33: because right now, like, you know, you can't really select that.

39:37: So when you start a session and you want to use, like, you know, some, like, components you made or something, you gotta select it, and then anybody else who has those, they'll still be able to join.

39:47: But also, like, you know, you're not forced to use those automatically, so you can have those plugins installed, but you can still join sessions hosted by people who don't have those plugins.

39:57: They're just not gonna be, you know, usable within that session. You won't be able to bring those in.

40:02: One other kind of improvement is going to be done to the system as well.

40:08: Oh, that kind of answers that one.

40:12: Let's see.

40:13: I actually wanted to comment real quick, actually.

40:16: Okay.

40:18: There is actually already a hook you can use to run, like, a function or whatever after the engine is initialized.

40:28: I think it's called, like, engine.current.onpostinit or postinit or something, I can't remember the exact name.

40:35: But if you give that a delegate, that will run after the engine has initialized.

40:40: So you do already have that one, I know for a fact.

40:45: I'm sorry, I forgot to plug in my headset.

40:48: The power.

40:51: Come on.

40:53: Yeah, there's kind of, like, a few events. Like, it's not super nice, but, like...

40:58: Oh jeez, there we go.

41:01: I can plug it in.

41:03: But you can, like, you know...

41:04: Once you kind of hook into, like, a method and you can access the engine, you can then, like, you know, hook into whatever other things you need.

41:13: Okay, so I should be good.

41:23: So next...

41:23: What sort of audio system are you looking at? Replace Unity ones, something in-house, or a pre-made one, like Steam Audio?

41:29: So, Steam Audio, that's not...

41:32: It's not an audio system, it's, like, an audio spatialization, so it does part of it.

41:37: I think this question was probably before, like, I did, like, the long explainer, so, like, I'm gonna go into details.

41:44: But, Steam Audio is...

41:48: It's part of what we're gonna be using for spatialization and stuff like that, but there's, you know, other parts that are gonna be custom.

41:54: We're gonna be using, you know, other things; like, for example, for the culling of the audio, we're gonna use, you know, Bepu.

42:02: Because it kind of has, like, you know, stuff like for accelerating spatial queries and stuff like that.

42:11: So, like, the gluing and everything.
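The culling mentioned here is at heart a spatial query: which audio sources are near enough to the listener to be worth mixing. Bepu (a C# physics library) answers that with accelerated structures; the brute-force Python sketch below only illustrates what the query computes, and the names are invented.

```python
def cull_audio_sources(listener, sources, max_distance):
    """Keep only the sources within hearing range of the listener.

    A spatial acceleration structure avoids testing every source;
    this brute-force version shows the result such a query returns.
    """
    def dist_sq(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    limit = max_distance ** 2
    return [s for s in sources if dist_sq(listener, s) <= limit]
```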

42:15: Next question.

42:16: Jack the Fox Author.

42:16: Duncan.

42:17: I wanna do the Dunkin', Dunkin' thing again.

42:19: Yes, Dunkin'.

42:20: We went, like, I mean, Jack, like, we went to Dunkin', because somehow they have Dunkin' in Germany.

42:27: And it was right at the train station.

42:30: So, my thing is freaking out.

42:31: Ooh, is the battery freaking out?

42:33: There we go.

42:37: S&B.

42:38: And it was going with your hair.

42:39: It's blue and kind of on my eyes a bit.

42:40: I think that's the transform override.

42:44: Maybe I just disable motion blur.

42:45: There we go.

42:47: Okay, that should make it better.

42:51: Next question is from shininghero1.

42:55: There's an idea.

42:57: How does a plugin store for Resonite sound?

43:01: Kind of like how the modding discord has a better list of mods, but officially run.

43:10: So, hmm.

43:11: Oh, do you mean like having like a place where you can find plugins and so on?

43:16: Like a workshop, I guess.

43:17: Yeah.

43:18: I don't know if it'll be the workshop.

43:20: Like the thing is like maybe the problem is like with plugins, it's like they're inherently less safe.

43:28: So like, we're a little bit like, if we make it like very easy to download them, it potentially opens a vector for abusing them.

43:37: And I'd rather much like, you know, implement, you know, like the WebAssembly-based ones, which we can be like, you know, more kind of certain like this is not going to cause like security concerns.

43:51: So I don't know if it'll be, I don't know if it'd do it for plugins.

43:54: It might be opening, like, a can of worms that we might not want to.

44:03: Nukecon is asking, will these custom modules be open format? Like, can we bring them to other programs?

44:03: No, like, I mean, generally no, because, like, what's essentially the job of an importer, you know, is it reads whatever format you want.

44:31: And it converts it into a Resonite equivalent.

44:34: The exporter takes, you know, the Resonite equivalent and translates it into whatever format you want.

44:40: Which means it's, you know, it's specific to Resonite by its nature.

44:45: So you cannot really take, like, an importer that's for Resonite and just bring it to a different program. You'd have to, like, kind of rewrite it for that program.

44:53: You could make, like, you know, an import and exporter that, like, you know, imports and exports format that works in a different program that supports, like, you know, an open format, but you'd have to make that a module.

45:05: But those modules themselves, they're still specific to Resonite because, you know, they work with its data structures. So there's not really a way to, like, have those work in different programs.

45:25: So the next question is, TheRedNeko asks, Hey Froox, what are some of the systems other than audio and particles that keep us from detaching FrooxEngine from Unity?

45:35: So for, like, any major systems, big systems, it's pretty much just the particles and audio.

45:43: There's a bunch of smaller bits, like, and particularly the integration, like, how the integration with Unity works.

45:50: So that's another system that needs to be kind of reworked before, like, we actually move it into a separate process.

45:57: But that's pretty much it.

46:01: Like, in the past, like, FrooxEngine, you know, it was tied into Unity in more ways.

46:06: Like, for example, like, you know, it was using its UI system, which was, like, you know, bad.

46:10: And then, like, it got, like, UIX, which is our own UI system, which handles all the UI generation completely, you know, completely, like, on the side of, like, you know, FrooxEngine.

46:25: And it just sends, like, you know, the generated meshes, you know, to Unity to render them out.

46:30: There's been, like, more systems, but, like, been kind of, like, slowly, like, moving them piece by piece.

46:35: Also, TheRedNeko was asking, so not the input system?

46:38: Yeah, we already kind of, like, have, like, our own input system that's kind of largely handled.

46:43: So we just, the main thing is, like, we need to, like, pipe data into it.

46:48: But, like, we already do, you know, we have our own input system, and we're just, like, piping data into it.

46:54: So we just keep doing that.

46:58: But it's not really, like, a big thing.

47:03: So can I clear out, like, all the questions right now?

47:08: Let's see what the time is.

47:11: I think, hm?

47:13: I'm, like, vibrating because I want to see a splat so bad.

47:17: Yes.

47:18: So, it's one thing, like, we're actually running on a build that supports Gaussian Splatting.

47:23: It's not super, like, tested yet because we're kind of just working on it.

47:28: That's sort of, like, a fun thing, but do you guys want to see, like, a Gaussian Splat?

47:34: Say yes.

47:37: Don't I think we'll have to go to a separate...

47:41: Oh, never mind, there's a question.

47:45: Can't do the splats.

47:48: Like you made Cyro sad.

47:51: We'll answer this question and I'll show you some splats.

48:00: Casterblades.

48:00: Off topic question, but did you see contact track?

48:06: If so, whatever it was.

48:07: I haven't seen it.

48:07: I don't know what it is.

48:09: Do you know what it is?

48:12: I'm not really sure what that word means.

48:14: Yeah, I don't know what it means.

48:16: The question is if we saw it, then let's know.

48:21: I don't know what it is.

48:23: Also, did you spot the shoot or not?

48:24: I did.

48:24: I don't have it in portrait.

48:27: So let's actually move.

48:28: Like, how do I put it, the splat is a little bit too heavy.

48:33: It's a little bit too heavy for like, you know, this world, especially because there's

48:37: also like, you know, this is like a mirror and everything.

48:39: So like it's like rendering the splat like multiple times and when I try spawning it,

48:43: it kind of killed me.

48:44: So we're going to move to a grid space for a second.

48:48: I'm just going to go to grid.

48:51: There we go.

48:53: There we go.

48:54: Come on.

48:57: Oh, I still have these.

48:58: I need to like get rid of them.

49:00: So you should be able to join me.

49:01: There we go.

49:01: It's public.

49:04: So let me also bring the chat here.

49:09: The chat should still be running in the other place.

49:11: So it's going to collect questions, but just so we can kind of see people's reactions.

49:17: We're going to be collecting questions in this world.

49:24: So I do have some, let's see, I do have some splats that I know I have already

49:40: I'm going to show you the import process.

49:47: Let me just make sure I got stuff ready.

49:53: Let's see.

49:56: I'll show how the importing works.

49:59: If I can find it.

50:03: 3D scans.

50:05: Give me a second.

50:10: So right now we support, like, importing splats in the .ply format.

50:19: It imports it into, like, our own kind of internal format, which, I actually tested it,

50:24: it's a little bit smaller, because we apply a bit of compression on the data.
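The stream doesn't specify what compression the internal format applies, but one standard reason a converted splat file ends up smaller than a raw .ply is quantization: storing values as small fixed-point integers instead of 32-bit floats. Purely as an illustration (the functions are invented, not Resonite's actual scheme):

```python
def quantize(values, bits=16):
    """Encode floats as fixed-point integers over their range.

    16-bit integers take half the space of 32-bit floats; decoding
    reintroduces at most half a quantization step of error.
    """
    lo, hi = min(values), max(values)
    scale = (2 ** bits - 1) / (hi - lo) if hi > lo else 0.0
    return [round((v - lo) * scale) for v in values], lo, hi

def dequantize(encoded, lo, hi, bits=16):
    """Map the stored integers back into the original range."""
    step = (hi - lo) / (2 ** bits - 1)
    return [lo + e * step for e in encoded]
```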

50:29: OK, so let me also switch from private UI so you can see my dash. There we go.

50:39: There we go.

50:43: So I have the Olego BoyKisser .ply. Yes, I'm importing that one. I'm just going to copy this file.

50:50: I'm going to paste it here.

50:53: Did I press it? I missed it. There we go.

50:56: And the thing is, you know, this format, PLY, it can also be 3D model, but this one's Gaussian Splat.

51:02: So I have to select Gaussian Splat, otherwise it won't work properly.

51:06: And you can see it's converting the splats now. There's three million of them.

51:11: Splatting.

51:12: Splatting. So it converts them and then it needs to encode them.

51:16: I actually went a little bit overboard and, like, made it report, you know, what it is encoding at the time too.

51:23: Right now, we also don't...

51:27: There we go, it's in kind of colors.

51:30: Right now, we don't do like any compression of data yet.

51:37: That's going to come, like, later. So they're kind of heavy on VRAM.

51:40: Like this is going to be like...

51:42: This is going to be like 700 megs of VRAM plus like the sorting, which is another, like it's going to be about a gig of VRAM.

51:53: Finalizing encode and it should appear in a second.

51:55: This might take a while to show for you actually, which I didn't realize.

51:58: Oh, there it is.

52:02: And it's going to hurt you because of the camera too.

52:07: Let me switch to third person.

52:09: Actually, I don't need the private UI anymore.

52:12: There we go.

52:15: But this is a Gaussian Splat.

52:17: Oh, this is much heavier with the camera too.

52:23: I think my laptop is kind of crying right now.

52:26: Actually, let me see if this is going to be...

52:33: This is kind of like...

52:39: I don't have much testing, so this is kind of, you know, just showing you off the cuff.

52:47: Wow, this is surprisingly heavy.

52:50: Like, it's big, so it's probably transferring, so even it's taking a while.

52:53: It should be...

52:55: Give it a bit.

52:59: Wait, it already transferred? It was fast.

53:01: Yeah, it finished really quick.

53:04: It might be processing the stuff, like give it...

53:07: Okay.

53:09: Keep an eye on my stats.

53:14: Oh, this is chunky.

53:15: Okay.

53:16: I wonder if I disable... I'm gonna disable the mirror to display for a bit and close the camera.

53:22: Oh yeah, that makes it nicer.

53:25: I might need to like add like a thing where it caps how many

53:28: sorts it does, you know, per frame.

53:32: But yeah, it's like...

53:34: Let's see if it works.

53:37: Still work to do, but...

53:41: So there's a question.

53:43: Will we be able to edit splats within Resonite?

53:46: Funny thing you ask.

53:48: So I actually did add like a basic editing functionality.

53:53: It also might be there's like a bug like with transferring them over network.

53:56: So like...

53:58: Oh, there is a bug.

54:01: It won't show to you because...

54:05: Because when it's imported locally...

54:09: It needs to sync the metadata for it.

54:11: But I haven't added code for that, which means it probably blew up.

54:15: And you're stuck not seeing it.

54:18: I'll spawn this one and...

54:21: I'm just gonna yeet this one.

54:25: I'm gonna spawn it from the cloud, which means you'll be forced to compute it locally.

54:29: So I'm just gonna grab some splats.

54:35: So this one should load for you.

54:39: There we go.

54:42: Do you see this one?

54:45: That is quite chunky, damn.

54:48: Yeah, they are...

54:49: They are heavier.

54:54: It would be faster if we were using a new version of Unity, so we could use more efficient sorting algorithms.

55:04: I still wanna do stuff where I can kind of limit how many...

55:09: Because when you render the splats, you essentially have to sort all of these based on the distance.

55:18: And you see there's a lot of them, and these are being constantly sorted based on where I'm viewing them from.

55:26: And because of the version of Unity, the sorting options are more limited to less efficient algorithms.

55:31: So I do wanna add mechanisms where you'll be able to limit how many sort events can happen per frame, so it doesn't hurt as much.
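What each of those sort events computes is simple to state: order every splat by its distance from the camera so they can be blended back to front. A minimal Python sketch (illustrative only; the real renderer does this for millions of splats, which is why it is expensive and worth rate-limiting):

```python
def sort_back_to_front(splats, camera):
    """Order splat centers by distance from the camera, farthest first.

    Gaussian splats are alpha-blended, so they must be drawn from
    back to front; moving the camera invalidates the order, which
    is why the sort has to keep re-running.
    """
    def dist_sq(p):
        return sum((a - b) ** 2 for a, b in zip(p, camera))

    return sorted(splats, key=dist_sq, reverse=True)
```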

55:43: Holy crap.

55:46: My GPU is at 100%.

55:49: Oh yeah, this will tax your GPU. This is very, very GPU.

55:55: I've got a 7900 XT just for context, so it's no slouch.

55:59: Yeah.

56:02: But anyways, there was a question about editing, and it's actually kind of like where it also, you know, it's gonna help.

56:06: So I'm gonna select this app, I'm gonna open it in the inspector, and you see there's a bunch of things.

56:13: There's like, you know, static Gaussian Splat, there's Gaussian Splat render.

56:19: You can actually also see, like, you can change the opacity, which is weird, and it can also make it way...

56:27: Oh, this is weird.

56:30: Or I can, you know, change the size of them, which I should be careful, because that hurts.

56:37: It can make, you know, all the splats be way smaller.

56:39: But you can kind of see, you know, how is it constructed?

56:42: Because, like, you know, you see, like, everything's made out of these, like, fuzzy color blobs.

56:49: And there's a lot of them in here.

56:51: You know, this whole thing is made out of the blobs, and now they're just smaller than they should be, so they don't cover the whole area.

56:55: But it gives you a better idea of the structure of it.

56:59: Let's make them back to normal size.

57:03: I'm sure it's just under, so it's perfectly in one.

57:06: There we go.

57:09: You can also make it sort less often, but, like, you know, if I do that, it still works.

57:19: But if I move too fast, it might take a little bit to adjust.

57:27: Let's circle it.

57:31: Like, if I, like...

57:33: Well, it kinda works.

57:36: It's actually not too bad.

57:37: It's not too bad.

57:39: It's definitely not in VR.

57:40: Because it makes your game run a little faster than it.

57:44: You have more...

57:46: Let me make it sort. There we go.

57:48: Oh, there we go, you see.

57:50: And now it's taking a while to sort, so, like, they're in the wrong order.

57:54: There we go, now it's sorted.

57:57: So you're gonna see the effect of it.

58:00: And if I go on the other side...

58:02: You know, now it's not sorted again, so it's...

58:05: There we go, now it's sorted.

58:08: I make it sort a little bit more often.

58:17: I'm not gonna complicate things right now.

58:18: I don't wanna showcase the other thing I was showing.

58:24: So, what I was saying is, you know, there's like a bunch of different things.

58:28: There's the Gaussian splat renderer in there.

58:29: On the static Gaussian splat, there's some methods exposed to ProtoFlux.

58:36: So if I go and grab a ProtoFlux tool...

58:43: There we go.

58:45: I can exu...

58:47: Something I wanted.

58:50: I'm just gonna...

58:51: There's the splat I was grabbing.

58:54: So I can expose this method.

58:57: I'm gonna proxy this.

59:00: And I need to feed it like, you know, a bounding box.

59:03: So I go...

59:05: Bounce, Math...

59:10: Let's see...

59:11: Where is it?

59:13: Do you know where the bounding box ones are?

59:19: Transform, bounds, bounds...

59:23: Complete bounding box.

59:27: Is that the one I want?

59:31: No, don't touch that one, it's the component menu.

59:34: Oh.

59:35: Accidentally, here you go.

59:36: Oh.

59:38: Is this the one? No, that's not the one.

59:42: No, that's not the one I want.

59:46: Let's see, transform, bounds, encapsulate bounds.

59:50: No.

59:52: Hold on, it was...

59:56: That's really weird, why is it like...

01:00:01: Where is the stuff? It's not showing everything.

01:00:08: It is...

01:00:09: What is under transform?

01:00:10: So transform, bounds...

01:00:15: There we go, from center size, that's the one I want.

01:00:19: From center size...

01:00:23: So I just compose like one.

01:00:25: And I kind of have to like, you know, just guesstimate it, but I'm gonna use like, you know, one that's 4x4x4.

01:00:32: And then I run it, and this will take a little bit.

01:00:35: But it's now running the processing that's gonna clip...

01:00:39: That's gonna clip all the...

01:00:42: All the Gaussians that are, you know, outside of this bounding box.

01:00:46: There we go.

01:00:47: And you see how it's clipped.

01:00:49: So there's like very basic editing options.

01:00:55: Or rather like, you know, there's like editing options.

01:00:57: They're not like exposed through like intuitive tools yet.

01:01:03: But they're there, and like we can add more.

01:01:05: And we can add like, you know, visual tools where instead of, you know, having to do this, the tool like lets you draw a box.

01:01:12: And then like, you know, triggers it, and it triggers this to process the Gaussian Splat.

01:01:16: So you can do basic editing, and I would like to add more as well in the future.

01:01:21: And, you know, make it more intuitive to use.

01:01:23: But there's like some things that you can, you know, you can build your own tools for editing.
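The clipping operation demonstrated above takes an axis-aligned bounding box (built from a center and a size, like the 4x4x4 one on stream) and discards every Gaussian outside it. A rough Python equivalent of what that processing does (names invented; the actual method runs on the splat asset itself):

```python
def clip_to_box(points, center, size):
    """Keep only points inside an axis-aligned box.

    The box is given as a center and a full size per axis, matching
    the FromCenterSize-style construction used in the demo.
    """
    half = [s / 2.0 for s in size]
    lo = [c - h for c, h in zip(center, half)]
    hi = [c + h for c, h in zip(center, half)]
    return [p for p in points
            if all(l <= x <= h for x, l, h in zip(p, lo, hi))]
```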

01:01:30: Also, someone was asking, is the bounding box clipping reversible or is it permanent?

01:01:36: I mean, as far as with anything, like it's permanent if you save this and delete the original.

01:01:43: So like as long as we have the original, we can always go back to it.

01:01:48: Because, you know, it's similar to processing any asset when there's a write.

01:01:51: Like if you don't delete the original, then you can always go back to it.

01:01:55: But if the original is gone, then, you know, it's gone.

01:01:58: You've deleted it.

01:02:00: This makes it much lighter, you know, the process.

01:02:03: Also, one thing you can do with this, you can actually export this.

01:02:06: So I can, you know, go files.

01:02:09: I've already tested this, but it does support exporting.

01:02:14: Actually, this one doesn't because this is an old import before that.

01:02:19: I'll have to set it up.

01:02:19: But you can export the edited Gaussian splats out of Resonite too.

01:02:25: There's actually one thing I can show you.

01:02:26: I can't show this one in the other world because it's going to be too heavy.

01:02:30: Let's switch back.

01:02:34: There we go.

01:02:35: Hello.

01:02:36: I'm going to switch.

01:02:38: I'm going to turn the camera back on as well.

01:02:44: So I also got a bunch of questions accumulated.

01:02:51: But where did Cyro go?

01:02:55: There's another thing I wanted to show you.

01:02:58: There is a procedural Gaussian splat.

01:03:01: And this is just to show you what Gaussian splats look like.

01:03:07: So if I create an empty object, I go into assets, there's procedural Gaussian splats, diagnostic Gaussian splat.

01:03:18: And this is one of the things I use when working with rendering.

01:03:21: You can literally add the splats one by one.

01:03:24: I'm going to set a render for this.

01:03:28: And I'm going to add a Gaussian splat.

01:03:30: There we go.

01:03:31: This is a single Gaussian splat.

01:03:34: And you can see it's essentially like a color blob.

01:03:43: And this is just the unit size.

01:03:46: So it's like one by one by one, like, in every dimension.

01:03:50: It's kind of grayish too.

01:03:51: And there's a bunch of different parameters for it.

01:03:53: You have the positions, so you have the Y position, so you can make it move up.

01:03:59: Just a rotation.

01:04:00: That's not going to do anything because it's a uniform size, but you have the scale.

01:04:03: So you can actually make it be 0.1 on the X axis, which means it's kind of thin like this.

01:04:10: But if I look at it from the side...

01:04:13: Oh, that's not a good one because there's a bunch of stuff there.

01:04:15: There we go.

01:04:15: You see like now, over this, it's still kind of like...

01:04:20: It's essentially more of a disc, but each splat has like a 3D dimension to it.

01:04:28: And if I change the rotation, you know, that's going to...

01:04:34: That's not the right axis.

01:04:37: What axis do I need?

01:04:39: This one? There we go.

01:04:41: There we go.

01:04:41: So now it's going to rotate it.

01:04:43: So each splat can also have like, you know, rotations.

01:04:46: So like it can be stretched, it can be thinned, it can be big.

01:04:50: I can, you know, make it squished on another axis.

01:04:53: So it can be 0.1 on this axis.

01:04:55: And now it's like, you know, this like thin splat.

01:05:00: And they also have color, and the color is specified using something called spherical harmonics.

01:05:06: So I can put color here, and then I can put also color on like, you know,

01:05:11: some of the other, you know, some of the other bands of the spherical harmonics.

01:05:17: I'm just going to put some random values there.

01:05:22: Just to kind of see what it does.

01:05:24: And now like if I like move around it, I might have done wrong bands.

01:05:33: I might have done like vertical ones or something.

01:05:36: I'll actually do some of the lower bands.

01:05:37: That's going to make it a little bit more obvious.

01:05:40: There we go.

01:05:42: Let me also change the default one.

01:05:45: So now when I move around, what did I do?

01:05:52: These are kind of hard to edit by hand, so I'm just going to like, you know.

01:05:55: Yeah, I think it's up and down for this.

01:05:58: Let me do this thing.

01:06:01: Two on here.

01:06:03: I'm going to do zero on this one.

01:06:07: This is hard to, like, get the effect.

01:06:15: I wasn't doing much.

01:06:16: Is it maybe these other bands?

01:06:23: Let me see if I fly up.

01:06:31: And let's see, let me just return this back to what it was.

01:06:38: So if I change this one, this one should be...

01:06:46: Which direction is this in?

01:06:53: These are puzzling.

01:06:56: Let me try this one.

01:06:59: It's not really super human readable for this.

01:07:05: That looks like red from this way and blue from this way.

01:07:09: Does it?

01:07:11: Did it not update for me?

01:07:12: Yeah, it's red from here.

01:07:15: And then, when I look at it from this way, it's blue.

01:07:18: That's weird, why is it not...

01:07:20: Oh, there's something...

01:07:22: Oh, I think there's a bug.

01:07:26: Okay, this looks like a bug because it's gray for me, but it's red on the camera.

01:07:34: So the camera's rendering it okay.

01:07:35: Yeah, so it's...

01:07:36: That is odd.

01:07:37: That's very odd.

01:07:38: There's something wrong.

01:07:40: Okay, more bugs.

01:07:43: Like, for my viewpoint, if I disable this, it's gray.

01:07:50: But on the camera, it's red.

01:07:53: So there's a bug with the rendering.

01:07:54: So I was kind of confused because most of it wasn't doing anything.

01:07:59: But you can see it on the camera now.

01:08:02: It changes color based on the angle we view it from.
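What this demo is poking at: a splat's color is stored as spherical harmonic coefficients, and the rendered color comes from evaluating the SH basis in the viewing direction, so nonzero degree-1 bands make the color shift as you move around it. A small Python sketch assuming the common real-SH degree-0/1 convention from Gaussian splatting implementations (sign conventions vary between implementations, which is part of why editing these by hand is confusing):

```python
def sh_color(view_dir, sh_coeffs):
    """Evaluate view-dependent RGB color from degree-0/1 harmonics.

    sh_coeffs holds four RGB triples: the constant band followed by
    the three degree-1 bands. With only the constant band set, the
    color is the same from every direction.
    """
    # Normalize the viewing direction.
    norm = sum(c * c for c in view_dir) ** 0.5
    x, y, z = (c / norm for c in view_dir)

    # Real spherical harmonic basis constants for bands 0 and 1.
    c0 = 0.28209479177387814
    c1 = 0.4886025119029199
    basis = [c0, -c1 * y, c1 * z, -c1 * x]

    return tuple(
        sum(b * coeff[ch] for b, coeff in zip(basis, sh_coeffs))
        for ch in range(3)
    )
```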

01:08:06: And essentially, all the Gaussian Splats, they're composed from these.

01:08:11: And this is like a new type of primitive.

01:08:14: You know, the way you have triangles for the traditional geometry, you have Gaussian Splats

01:08:21: for Gaussian Splatting.

01:08:24: And because of this representation, you can see all the Gaussian Splats, they have this

01:08:30: very kind of fuzzy edge on them, which means they're really good at representing fuzzy

01:08:36: things like fur or plants or transparent things and so on.

01:08:43: It's like this new primitive that kind of gives it the ability to make really photorealistic

01:08:48: looking scenes.
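The fuzzy edge comes straight from the math: a splat's contribution at a point falls off as exp(-0.5 * d^T Sigma^-1 d), so there is no hard boundary the way there is with a triangle edge. A simplified axis-aligned Python sketch (real splats carry a rotated 3D covariance; this version is illustrative only):

```python
import math

def splat_alpha(point, center, inv_scale):
    """Opacity contribution of an axis-aligned Gaussian at a point.

    inv_scale is 1/sigma per axis; the contribution is 1 at the
    center and decays smoothly, never reaching a hard edge.
    """
    d = [(p - c) * s for p, c, s in zip(point, center, inv_scale)]
    mahalanobis_sq = sum(x * x for x in d)
    return math.exp(-0.5 * mahalanobis_sq)
```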

01:08:49: And people sometimes ask, can we just make a model out of it?

01:08:53: And I always tell them, it kind of loses the point because the Gaussian Splats look so

01:08:58: good because they're Gaussian Splats, because of how they work, because of how the rendering

01:09:03: works.

01:09:03: So if you convert it to just the typical model, it's not going to look as good anymore.

01:09:11: Hopefully this kind of gives you a little better idea.

01:09:13: It still needs a little bit more time to work on, but I wanted to give you a little bit

01:09:18: of a sneak peek on things that have been worked on, and bugs and stuff that need to be sorted

01:09:23: through.

01:09:27: Let's go back to the other questions because we have got a bunch of them kind of piled

01:09:31: up.

01:09:32: I also need to be able to view the camera.

01:09:36: There we go.

01:09:38: So hopefully that kind of gives you a bit of a sneak peek and a little bit of a better

01:09:43: understanding of what Gaussian Splats are and how they're going to work in Resonite.

01:09:48: Just a little sneak.

01:09:52: So going back to the questions, let's also double check on time.

01:09:56: We've still got a bit of time.

01:10:02: TheRedNeko is asking, I had a second one that related to the audio system.

01:10:06: Will there be an increase in the number of audio outputs playing at once before the

01:10:10: audio breaks in the new system?

01:10:14: Potentially.

01:10:15: We'll have to see how it kind of performs.

01:10:20: It's essentially just kind of restricted by how much the hardware can handle before the

01:10:30: audio breaks up, so we could kind of increase the defaults.

01:10:32: It also might be kind of tricky because we could make it a set thing based on the hardware,

01:10:38: but then the problem is if you build a world and you assume everybody can hear as many

01:10:42: sounds as you can, maybe some people can't, and now the audio is kind of broken for them.

01:10:47: So that's one of the trickier things to approach.

01:10:51: So we'll see how it kind of goes.

01:10:54: I don't want to make any commitments there right now.

01:10:58: Next question is from Dviki Morph.

01:11:01: Speaking of splats, will compressed Gaussian Splat format be supported as well or just

01:11:05: .PLY files?

01:11:08: It depends what you mean by compressed format.

01:11:13: Which format do you mean specifically?

01:11:14: There's like one format I've been looking at called the SPZ.

01:11:18: We just need to like, you know, have a wrapper for it and make like, you know, have like

01:11:23: a compiled version of it, and then it should be relatively easy to add support for it.

01:11:30: So that one's probably going to be supported, but I don't know if that's the same format

01:11:33: you know, that you're thinking of.

01:11:34: So like if it's the SPZ one, then probably yes.

01:11:39: There's also like the other part, like, having the Gaussian Splats compressed in VRAM,

01:11:43: which is going to like, you know, make them a little more efficient to render because

01:11:47: it's going to use less VRAM.

01:11:49: That's one of the things I also want to add, just to like, because they take a lot of VRAM.

01:11:56: So that's also going to come at some point.

01:11:59: But it's kind of separate, you know, from the file format compression.

01:12:02: That's like, you know, how to compress it in memory.

01:12:08: Next question is from ShiningHero1.

01:12:10: How badly would Unity react to having the underlying Mono version replaced with even a slight update?

01:12:15: You can't really do that.

01:12:17: Like the Unity is built, you know, with its own like Mono version.

01:12:21: So like, and they have their own fork of it too.

01:12:24: So like, you know, like you can't, you can't just update it.

01:12:27: Even if you could, it would react badly to say the least.

01:12:31: Yeah.

01:12:31: Like that was one of the reasons they don't update it often, because they have made a lot of changes to it, which means like every time they want to update it, you know, they need to like backport all the changes and everything.

01:12:45: So it takes them like forever to do it.

01:12:47: And even then, like, you know, there's, you don't really get super much benefit from doing it because even the newer versions of Mono are like, they're not substantially faster than, you know,

01:12:57: the older ones.

01:13:00: The next question is from TheRedNeko.

01:13:04: Rescinding cast, not in crew.

01:13:05: Oh, this was, we already answered that one.

01:13:11: Alex2PI, Froox, will there ever be support for movies in Splats?

01:13:14: Oh no, you'd need a lot of VRAM for that.

01:13:18: So I don't know, like what would even movies, what would movies in Splats mean?

01:13:23: I don't even know what that might mean.

01:13:25: Like basically like just like think of like GIFs, but for Splats, I guess.

01:13:30: No, I thought it's like, it's a movie that's in each Splat.

01:13:33: Each Splat is a movie texture.

01:13:37: Oh God.

01:13:39: I don't know.

01:13:40: I don't know.

01:13:41: Like that was just the joke one, but like if you want like a more serious one, I mean, like

01:13:46: you could like do animation with Splats.

01:13:48: There's like some things for that, but like until there's something we can like,

01:13:52: you know, easily adapt and use,

01:13:54: we're not going to integrate it until then.

01:13:58: Probably we're going to do a bunch of kind of research because like they're already like

01:14:01: very heavy on memory and for movies, like you need a lot of data.

01:14:06: So like there needs to be a lot of work into those kinds of formats.

01:14:12: Next question is from KesterBlades.

01:14:16: I don't know much about Gaussian Splats.

01:14:19: Would you be able to like cut some of...

01:14:22: Yeah, like we actually showed that.

01:14:27: Actually, I think that is it.

01:14:29: Cut some of the background fuzzy bits of a Gaussian Splat,

01:14:31: only have the good bits of the models,

01:14:33: so it can be integrated with other stuff,

01:14:34: like a Gaussian Splat model as a centerpiece for social places.

01:14:38: Yeah, you kind of like cut them out.

01:14:39: The only thing is in some Splats you actually need the bits because like they do stuff like,

01:14:44: you know, like reflections because sometimes like oftentimes they do reflections where

01:14:48: it's literally sort of like a mirror version of the world.

01:14:52: So if you cut those pieces, oh, we actually lose a bunch of the visual.

01:14:57: But, you know, like we showed earlier, you can kind of cut them off.
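The cropping being described is, at its simplest, a spatial filter over the splat positions. Here is a toy sketch of the idea (illustrative only, not Resonite's actual tooling): keep the Gaussians whose mean falls inside an axis-aligned box and drop the background fuzz.

```python
# A toy sketch of cropping a Gaussian Splat: keep only the Gaussians whose
# mean falls inside an axis-aligned box, dropping the background fuzz.
# Real tools also consider each Gaussian's extent; this only tests the center.

def crop_splats(splats, box_min, box_max):
    """`splats` is a list of (x, y, z) means; return those inside the box."""
    return [p for p in splats
            if all(lo <= c <= hi for c, lo, hi in zip(p, box_min, box_max))]

# Two splats near the subject, one stray background blob far away.
scene = [(0.0, 0.0, 0.0), (0.2, 0.1, -0.1), (5.0, 0.0, 0.0)]
print(crop_splats(scene, (-1, -1, -1), (1, 1, 1)))  # keeps the first two
```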

01:15:04: Dwagimorph, yeah, like we showed this one.

01:15:06: Will we be able to edit Splats?

01:15:07: Yes, like you'll be able to.

01:15:09: And we already kind of can, like, you know, with a simple like ProtoFlux method,

01:15:13: but we want to add like nicer tools too.

01:15:16: ShiningHero1 asks, I hope there's a way to trim Gaussian Splats as well.

01:15:22: Like you can do it, like, you know, you can do it in some of the tools.

01:15:26: It's a little bit harder to do in desktop because like, you know, the cool thing in VR is like, you know,

01:15:31: I can just literally grab a tool and be like, you know, I'm just going to start like here.

01:15:35: I can be like, you know, I want to crop this and we just kind of draw the box around versus on desktop,

01:15:40: you have to be like, okay, I have to be like, I have to align here and then I have to make sure it's aligned here.

01:15:46: And in VR, it's like more intuitive, like, you know, to do it.

01:15:49: It's even harder if you want to, like, you know, just clean them up.

01:15:52: So like, you know, there's a bunch of like splats over here.

01:15:55: So like, in VR, you can just grab a tool and be like, okay, I'm going to remove here.

01:16:00: I'm going to remove here and a little bit here.

01:16:02: Like it gives you like, you know, that kind of like spatial ability to work like spatially,

01:16:07: which I think is like, you know, really beneficial and makes things a lot easier.

01:16:14: Next question, ShiningHero1.

01:16:15: Another one also, what was that glitch for doing the family death pose?

01:16:19: Yes, he was.

01:16:19: He does that a lot.

01:16:21: It's been cute.

01:16:26: Next question from Dwagimorph.

01:16:29: Also, what do you use to make your splats?

01:16:32: Direct with camera scanner or from images and videos?

01:16:35: I make mine from like photos I take.

01:16:38: I use a software called JAWSET Postshot to like reconstruct them.

01:16:43: I usually take like raw photos and I kind of process them, like do like denoise and such

01:16:47: pre-processing, and process them, you know, for the Gaussian Splat software.

01:16:55: Jack the Fox author is asking, are Gaussian Splats part of the performance update?

01:16:59: They seem to help with shifting the burden like to the GPU.

01:17:02: They're not.

01:17:03: It's just like a fun project kind of on the side.

01:17:07: They do like, you know, they're very, they use the GPU a lot, but they use it for themselves.

01:17:12: And they don't really help like, you know, they don't help with any other stuff.

01:17:18: So like I wouldn't consider it part of it.

01:17:20: There is like one.

01:17:22: Oh, sorry.

01:17:28: There's like one, but like there's like a few kind of things that are, that actually

01:17:32: end up like being related to the performance update.

01:17:35: One of them is, you know, I think the support for the spherical harmonics, which we'll use

01:17:40: like for a bunch of like different things, you know, like PhotonDust, there's going to

01:17:44: be new effects and so on, but also to kind of like migrate more stuff from Unity into

01:17:48: Froox Engine.

01:17:49: Like, you know, for example, the ambient lighting, which also uses spherical harmonics, the same

01:17:53: like Gaussian Splats do.

01:17:56: The spherical harmonics are also going to be used, you know, for audio systems, for

01:17:59: ambisonics and so on.

01:18:00: So there's like stuff that's kind of related there.

01:18:05: Actually, maybe we can also show that one.

01:18:07: I probably shouldn't right now, but we'll see.

01:18:13: Because like one of the things that's like, you know, the ambient lighting in the world,

01:18:16: it actually is also encoded using spherical harmonics, which is what Gaussian Splats,

01:18:22: you know, use for like, you know, the individual pieces.

01:18:25: So the code either for spherical harmonics, for the Gaussian Splats, that's also useful

01:18:30: for other things.

01:18:31: Some of them are related to, you know, the performance update, like the audio system

01:18:34: and like moving some of the lighting.

01:18:37: You know, from being computed on Unity side, you know, the Froox Engine side.

01:18:43: So there's like bits there.

01:18:47: One of the things that, like, actually ends up being sort of related is because

01:18:53: I had to like, you know, work with the Gaussian Splats, and they're rendered

01:18:59: using compute shaders.

01:19:03: So I've kind of like, you know, had to like integrate that like into the kind of pipeline

01:19:06: that we have, you know, especially with the one that Resonite uses, because it has to,

01:19:11: you know, do all the sorting of them, but also has to compute all the properties for

01:19:14: the Splats and evaluate the, you know, the spherical harmonics for each

01:19:19: Gaussian.

01:19:21: I might end up like, you know, writing a compute shader to do similar thing for the

01:19:25: particle system for PhotonDust.

01:19:28: Because there's still some like kind of performance issues with PhotonDust, like when it comes

01:19:34: to uploading data to Unity.

01:19:37: But I don't know if it's necessary yet, I still need to kind of look into it deeper,

01:19:41: but it's possible we might need to like, you know, bypass like just kind of not use Unity's

01:19:45: like particle rendering at all and just have our own using compute shaders.

01:19:51: So like working on the Gaussian Splats like kind of helps like, you know, get a bit more

01:19:57: familiar with that.

01:19:59: So it might help like, you know, in that regard, sort of like not super directly, but kind

01:20:04: of indirectly kind of helps.

01:20:08: Fuzzy Bipolar Bear, Columbia Blob Rude.

01:20:11: We weren't talking about it in specific.

01:20:14: Oh, we already answered that one.

01:20:20: Alex2PI.

01:20:21: I asked earlier about possible support for movie splats in Resonite.

01:20:25: I think we already answered that.

01:20:27: Yeah, we answered that.

01:20:29: Can we bake a set of these Gaussian Splats as one?

01:20:31: Could be interesting to make brushes of these.

01:20:34: You can bake it.

01:20:35: You couldn't really make brushes of it.

01:20:38: Like, you know what it would do?

01:20:41: Like, it's not really a thing like I would use for a brush.

01:20:44: I don't think it's like super useful for that.

01:20:47: Also, like, you know, like, you'd have to like, like, one of the things

01:20:52: with the Gaussian Splats, they will not blend with each other.

01:20:57: Like, they kind of like render it as its own, like, unit.

01:20:59: So you have two of them, like, they're not gonna blend, you know, in like super useful

01:21:04: ways.

01:21:05: Like, it's not useful for that.

01:21:10: Next question is from the Jeb Forest.

01:21:11: Is it possible to rig Gaussian Splats for photorealistic avatars?

01:21:15: So, for the rigging part, you could actually do it.

01:21:19: Like, like the basis of it, like, you know, you do have like, each Gaussian is determined

01:21:25: by the rotation and the scale.

01:21:29: And technically, you know, you could have, you know, a matrix that binds it to a bone,

01:21:34: and you just recalculate that, you know, and move it around, move it around, you know,

01:21:41: as the bones move, you recalculate positions.

01:21:44: Theoretically, yes, you can rig them.
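The bind-and-recalculate math described here can be sketched like linear blend skinning. This is a minimal illustration, not Resonite's code; a real rig would also transform each Gaussian's rotation and scale, and blend multiple weighted bones.

```python
# A minimal sketch of binding a Gaussian's mean position to a bone:
# capture the bone's inverse bind transform once, then re-apply the
# bone's current transform every frame, like linear blend skinning.

def mat_mul(a, b):
    """Multiply two 4x4 row-major matrices (nested lists)."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def mat_point(m, p):
    """Transform a 3D point by a 4x4 matrix."""
    x, y, z = p
    return tuple(m[i][0] * x + m[i][1] * y + m[i][2] * z + m[i][3]
                 for i in range(3))

def translation(x, y, z):
    return [[1, 0, 0, x], [0, 1, 0, y], [0, 0, 1, z], [0, 0, 0, 1]]

# Bone sat at (0, 1, 0) when the splat was captured (bind pose)...
inv_bind = translation(0, -1, 0)   # inverse of the bind transform
# ...and has since animated to (0, 2, 0).
bone_now = translation(0, 2, 0)

# A Gaussian near the bone follows it; its offset from the bone is preserved.
splat_mean = (0.1, 1.0, 0.0)
skinned = mat_point(mat_mul(bone_now, inv_bind), splat_mean)
print(skinned)  # (0.1, 2.0, 0.0)
```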

01:21:47: You would need like, you know, some kind of toolset to do it.

01:21:49: I don't know if there's any software that lets you do that right now.

01:21:54: In terms of like the math of it, like, you know, it's not a problem in terms of the math.

01:22:00: I don't know if there's any tooling for it.

01:22:02: But even if there was tooling for it, I don't think it's necessarily super useful for avatars.

01:22:09: Because there's been like, there's been like some projects with Gaussian Splats where they're like, you know,

01:22:13: they actually rig them, but it's like, you know, specific to that project.

01:22:16: And you can kind of move them around.

01:22:18: But the problem is Gaussian Splats, they don't interact with the lighting in the scene.

01:22:23: Because, you know, the lighting is baked, which means your avatar, for one, the avatar is going to be really heavy.

01:22:31: You know, like you're, you don't want to like, you know, like use huge amounts of GPU just to render your avatar.

01:22:40: It's not going to blend well with other avatars because like, you know, like I mentioned earlier, the Gaussian Splats, they don't blend with each other.

01:22:47: You could make them blend with each other, like with lots of extra effort, lots of extra performance, but you know, it doesn't right now.

01:22:54: It also doesn't interact with the lighting.

01:22:55: So like, you know, if you're in a dark environment, you're going to be glowing.

01:22:57: If you're in a bright environment, you know, it still might not fit in because the lighting is effectively baked into it.

01:23:05: So like, I don't think it's, other than like, you know, using it as a joke, I don't think it's useful for, you know, that kind of stuff.

01:23:12: It is really most useful right now for capturing, you know, real world scenes and then like, you know, rendering them in VR.

01:23:20: So we can kind of look at them and preserve, like, you know, their photographic fidelity.

01:23:26: But I don't know, like, how useful is it for interactive stuff, like avatars and so on.

01:23:38: Next question is from Alex2PI.

01:23:41: There's new papers. People are starting to create moving videos in splats, Froox.

01:23:46: Yeah, there's like, there's like some research. The only thing is, like, you know, until it makes it into some, like, code, you know, that's like public and it can be used in some format that we can, like, easily adapt.

01:24:00: Like, you're probably not gonna see it because we'd have to, like, you know, we have to do a lot of this kind of work ourselves and it can take months and we're not gonna, we cannot afford to dedicate, you know, months, like, to doing Gaussian Splatting kind of research ourselves, you know, to have it in here.

01:24:19: That's not our whole, like, main focus.

01:24:21: So, the best thing is, like, you know, like, once those make it into some kind of, like, you know, more standardized formats, once there's more, like, standardized kind of tooling, we can adapt it, but we're probably not gonna spend that amount of time, like, you know, implementing

01:24:35: all that kind of groundwork ourselves.

01:24:40: Next question is from Zwan5250.

01:24:43: I'm trying to research for technical details about Gaussian Splatting. May I ask, is this feature related to the paper, 3D Gaussian Splatting

01:24:52: for Real-Time Radiance Field Rendering, published in August 2023?

01:24:58: Yes, yes, it's related to that paper.

01:25:00: I mean, that one, I think that's the, like, the first one, maybe, that, like, you know, kind of published, or, like, one of the first ones.

01:25:12: And so, like, the rendering we have is, like, you know, since we're just rendering the basic, like, static Gaussian Splat.

01:25:17: And there's, like, a lot of research, like, doing, like, those other different things, but, like, you know, it's still within the realm of the research, or, like, you know, very specific apps.

01:25:28: So, next question is from snba3272.

01:25:33: Will there, or have there been any attempts on getting system for converting shaders from VRC to Resonite?

01:25:40: That's very unlikely to happen. Shaders, like, shaders, they're essentially code.

01:25:47: And converting code from one system to another, like, that's very complicated kind of task.

01:25:56: You know, like, you rarely get systems that will be able to convert, like, all possible code into a different system.

01:26:08: You know, even the issue is, like, you know, we don't support, like, custom shaders, which means all the shaders, like, are kind of, like, pre-built.

01:26:17: And one of the reasons for that, or the main reason for that, is because while we're still tied to Unity, we cannot guarantee that the code will keep working.

01:26:29: Like, it won't keep working, like, once we switch the graphics engine, because, you know, it's gonna work, like, very differently.

01:26:45: And, like, one of the things we want to do with everything on Resonite is make sure we can maintain compatibility.

01:26:50: Whatever you build, we want to make sure it can keep working, and we can adapt it to keep it working, like, you know, with whatever changes we make to the system.

01:27:00: You know, for example, with the PhotonDust, like, we have a system that converts the legacy particle system into the new one, and, like, you know, most of the times, like, it just works, and there's, like, still few cases that need to be fixed, but pretty much, like, you know, things will look and feel the same.

01:27:16: And we want to make sure we can do the same for shaders.

01:27:20: If we implemented custom shaders right now, we'd have to use Unity's system, but there's not a reasonable way to convert those shaders into whatever new engine we'll switch to eventually, which means all those custom shaders would break, and it's one of the reasons we don't have the support right now.

01:27:37: And the reason, like, you know, the reason, like, it's not possible to keep supporting them is because you can't really convert code automatically to a completely different system.

01:27:50: So there's not a system, I don't think anybody tried one, I don't think it's even possible. Not like with a reasonable kind of success rate.

01:28:10: Yeah, I think that's like, it's one of those things, like, where, like, you know, there's like some tools that can, you know, get you a certain amount of the way there, like they kind of pre-convert some kind of patterns and stuff, and you edit the rest, but it doesn't really fully work.

01:28:32: Like the closest, like the most successful you'd probably get with like the least amount of effort is like, like, let's say you had a physically based shader, right?

01:28:47: The closest you could probably get is just plugging the same like textures and parameters into the corresponding engine's also physically based shader.
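That "best effort" approach can be sketched as carrying over only the parameters that have a direct equivalent in the other engine's physically based shader. A hypothetical illustration follows: the Unity-side names mimic the Standard shader's property naming, the target-side names are made up here, and anything custom is simply lost.

```python
# A hypothetical sketch of a "best effort" material conversion: copy PBR
# inputs that have a direct equivalent, drop everything else. The property
# names on both sides are illustrative, not any engine's real API.

STANDARD_TO_TARGET = {
    "_MainTex": "albedo_texture",
    "_BumpMap": "normal_map",
    "_Metallic": "metallic",
    "_Glossiness": "smoothness",
    "_EmissionColor": "emissive_color",
}

def convert_material(source_props):
    """Carry over parameters with a known equivalent; report the rest,
    since actual shader code cannot be translated this way."""
    converted = {}
    dropped = []
    for name, value in source_props.items():
        if name in STANDARD_TO_TARGET:
            converted[STANDARD_TO_TARGET[name]] = value
        else:
            dropped.append(name)
    return converted, dropped

mat, lost = convert_material({"_MainTex": "brick.png", "_Metallic": 0.2,
                              "_MyCustomWobble": 3.0})
print(mat)   # {'albedo_texture': 'brick.png', 'metallic': 0.2}
print(lost)  # ['_MyCustomWobble']
```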

01:28:56: It's especially like asking, how do I run my Unity shaders in Unreal Engine? Like, it's the same level of difficulty.


01:29:06: It's also like, because like some shaders can be like, you know, built like for a specific way the rendering pipeline works and actually have certain effects, you know, based on that.

01:29:17: But if the pipeline works different, then you need to do the effect like a different way and maybe it's not even possible with a different pipeline, you know, to get the same effect.

01:29:29: Because, you know, one of the things is just it doesn't translate automatically.

01:29:33: Usually you have to like, have somebody figure out how to do similar effects, you know, with a different one.

01:29:38: It kind of depends on the shader, but like it is one of those tricky things.

01:29:43: So Fuzzy Bipolar Bear is asking, are you feeling any better, like, getting over the cold?

01:29:46: Yeah, it's like, it's better than it was like on Friday, like Friday I pretty much like was dead.

01:29:52: I had like a fever and such, so now it's just kind of like more mild stuff, you know.

01:29:59: But it is, it's getting better.

01:30:03: Everyone seems to have gotten hit by the frickin', by an annoying head cold.

01:30:09: We got the plague.

01:30:16: So that's pretty much all the questions right now.

01:30:21: We have about 13 minutes left, so there's still plenty of time for questions.

01:30:25: So like if you've got some more, make sure like, you know, to like get them in before, before the time runs out.

01:30:31: Just so we don't get like, you know, like any last-minute questions like, like last time.

01:30:38: But we can like ramble about some other stuff.

01:30:42: So the other thing I wanted to show you with the spherical harmonics.

01:30:47: Let me change this back to some POV.

01:30:51: So all the lighting in the world.

01:30:55: If I make like a, let me make like a sphere.

01:31:02: So there's like, you know, multiple things that kind of contribute to the lighting of the spheres, the actual lights in the scene.

01:31:08: But there's also like, you know, ambient lighting.

01:31:13: There's also ambient lighting that's, you know, computed typically from the skybox.

01:31:21: And normally, you know, in normal builds, we can't control that, but there's actually a new component.

01:31:26: This is not really polished yet, so this might actually look different.

01:31:31: But I can like, you know, give you a little bit of a peek.

01:31:33: If I go into rendering, there's ambient light SH2.

01:31:44: And this actually uses second order spherical harmonics, you know, to determine ambient lighting.

01:31:50: And if I plug red here, for example, now there's like a red ambient light everywhere.

01:31:56: And you see it immediately affects the scene, and everything's red.

01:32:03: It's all red.

01:32:05: So like the first band, you know, is literally just like ambient.

01:32:10: It covers everything.

01:32:11: But if I do the second band, this is going to be like one side.

01:32:15: So now it's also red, but it's red, you know, only from the top.

01:32:21: You see?

01:32:22: And the bottom is like dark, and same with everything.

01:32:24: The red is from the top.

01:32:25: We can also change it, you know, to be like, I'm going to do blue.

01:32:30: Blue, now it's blue from the top.

01:32:33: And if I actually reverse it, if I like, you know, plug minus one, like minus, now it's blue from the bottom.

01:32:43: If I plug it, you know, if I plug it here, it is going to be from one axis.

01:32:50: So now, you know, it's blue from this one.

01:32:52: So it's almost like, you know, there's like blue lights kind of coming from here.

01:32:56: And you can kind of see it on everything, you know, over there.

01:33:01: And then if I go into even higher ones, you're actually going to see this kind of creates this kind of like band effect.

01:33:10: See, like in my, like, let me actually use the different axis one so it's a little bit clearer.

01:33:15: I'm going to use this one.

01:33:18: Here we go.

01:33:20: This should be, there's a band here and there's a band here.

01:33:25: So this is the way, like, you know, the ambient lighting is kind of encoded, and I can actually just plug a bunch of random stuff here.

01:33:35: So, you know, I've kind of created like this kind of funky ambient lighting and I can actually control it.

01:33:42: And usually, like, you know, it's, it's made so it actually kind of matches the environment.

01:33:45: I'm just kind of plugging random colors in here, but actually looks kind of neat with this world.

01:33:53: It's kind of neat looking, but essentially it is using, you know, the spherical harmonics to control how the ambient lighting works.
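As a sketch of the math behind this demo: second-order ambient light is nine coefficients per color channel, and sampling it is just a dot product with the standard real spherical harmonic basis evaluated at a direction (such as a surface normal). This is only the math, not the component's actual internals; a single channel is shown.

```python
# A sketch of SH2 ambient lighting: nine coefficients, sampled by dotting
# them with the real SH basis evaluated at a unit direction. The basis
# constants are the standard ones; this is not Resonite's actual code.

def sh2_basis(x, y, z):
    """The 9 real SH basis functions for a unit direction (x, y, z)."""
    return [
        0.282095,                     # band 0: the same in every direction
        0.488603 * y,                 # band 1: varies along each axis
        0.488603 * z,
        0.488603 * x,
        1.092548 * x * y,             # band 2: the "band"-looking shapes
        1.092548 * y * z,
        0.315392 * (3 * z * z - 1),
        1.092548 * x * z,
        0.546274 * (x * x - y * y),
    ]

def sample_ambient(coeffs, direction):
    """Evaluate the encoded lighting toward `direction` (e.g. a normal)."""
    return sum(c * b for c, b in zip(coeffs, sh2_basis(*direction)))

# Only the first coefficient set: identical brightness everywhere.
flat = [1.0] + [0.0] * 8
print(sample_ambient(flat, (0, 0, 1)))  # 0.282095
print(sample_ambient(flat, (1, 0, 0)))  # 0.282095

# Add the z band: brighter "from the top", darker from below,
# like the red-from-the-top part of the demo.
top_lit = [1.0, 0.0, 0.5] + [0.0] * 6
print(sample_ambient(top_lit, (0, 0, 1)) >
      sample_ambient(top_lit, (0, 0, -1)))  # True
```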

01:34:04: So it's one of the things that's probably going to be exposed and probably I'd like to also, you can kind of compute it, you know, from a texture and then you can also blend it.

01:34:14: So like, you know, like if you, if you're like, you know, move around the environment, you could like, what is that?

01:34:21: That is not me, that is the, that is something in the world.

01:34:25: That's so weird. Anyways, we want to make it like, you know, so it's, so you can kind of blend it.

01:34:37: So like, you know, for example, if you move around the world, ambient lighting will also change.

01:34:40: So it's going to expose, like, you know, some new options, but the basic building block that is used for that, you know, is the Spherical Harmonics,

01:34:48: which originally, you know, I added them because of the Gaussian Splats because they used them to encode the color, but it's also useful, you know, for this other thing.

01:34:57: It kind of gives you like more control over stuff.

01:35:00: I'll actually keep this on because I kind of like how it looks.

01:35:02: I can also show you, if I create another empty object, I've added, I'm actually, I'm going to, no, this works.

01:35:16: I'm going to add a procedural asset, procedural meshes, there's an icosphere, icosphere meshes, SH2.

01:35:28: So what this does, this is like the icosphere procedural mesh.

01:35:33: Except the, both the colors, like vertex colors and the radius, you can modulate them using spherical harmonics.

01:35:47: So if I set up a render, it's actually using the ambient lighting as well of the world, so it's kind of...

01:35:56: I'm not going to show the colors because we kind of see that with that, but it's kind of funky if you play with the radius.

01:36:02: It's only using, you know, the first band, which is like, it's the same everywhere.

01:36:09: I will switch it to a different material so you can kind of see the positive and the negative bands better.

01:36:16: So go vertex color, metallic, there we go.

01:36:24: I don't want this on.

01:36:27: Plug in the vertex color, metallic, there we go.

01:36:32: I think it's like inverted by default.

01:36:35: It's dark by default.

01:36:37: So if I actually remove this one and I only give it like the positive band, you see like it kind of creates this kind of shape.

01:36:47: I'm actually going to increase the number of subdivisions so it's kind of smoother. There we go.

01:36:53: And what you're seeing is like, you know, it's still, you know, the base is the sphere.

01:36:58: But what is happening, the function, the radius is being modulated based on the direction.

01:37:05: And this band, you know, it goes like vertical like this, which means over here, the radius is zero.

01:37:12: As I keep increasing, it actually goes negative here.

01:37:15: That's why this part is, actually no, this is positive because the color is not inverted.

01:37:21: It goes like, you know, it goes positive here and negative here.

01:37:25: And it's visualized, you know, by the color.

01:37:29: And if I change this one to be on this one, this is kind of like rotated to zero here.

01:37:38: It's essentially the same thing, except it's, you know, on this axis.

01:37:44: And the way you can think about spherical harmonics, I'm going to grab my brush here.

01:37:52: Brush, give me brush.

01:37:57: I'm not used to these controllers, there we go.

01:38:01: The way you can think about spherical harmonics, it's like having a standing wave except it's in a sphere.

01:38:09: So, you know, over here, imagine you have like, you know, like you have like start and end and you have like a wave here.

01:38:17: You have like a wave that goes positive and it goes negative.

01:38:19: It's just like, you know, a sine wave.

01:38:21: It's essentially this except it's wrapped around a sphere.

01:38:26: So, you know, on one side it goes, you know, it goes positive and on the other side it goes negative.

01:38:35: And this one it just kind of like, you know, it flips it, it's like it's kind of, when it goes negative it still like flips it so it goes like this.

01:38:42: But essentially it's like, you know, it's this except following a sphere.

01:38:50: It kind of makes sense.

01:38:54: And this is, you know, this is the first order.

01:38:57: That's, you know, why it's just like one like this.

01:39:00: If I go and it's, you know, and you have like one band, like, you know, in each kind of direction.

01:39:07: If I go here, like, you know, it also does it like in this direction.

01:39:10: But then when I go into the higher order bands, so let's do this one.

01:39:15: One, two, you see now it does this kind of shape.

01:39:20: And the reason for this is because for the second order, the sine wave, it goes like this.

01:39:28: One, two.

01:39:31: And you know how it wraps around this?

01:39:33: It's like you literally have it like, you know, is literally this peak, you know, matches this peak.

01:39:40: Or maybe like in its reverse, but it doesn't matter.

01:39:42: This peak, you know, is going to match this one.

01:39:45: This one is going to match this one.

01:39:49: This one's going to match this one.

01:39:52: So it's kind of, you know, it's almost like it took the sine wave and like you kind of wrapped it around.

01:40:00: And it's also like in three dimensions makes things a bit more complicated, but essentially it's this kind of principle.

01:40:07: And what this does, it's like, it's a way of like sort of encoding information where, you know, it can be sampled directionally.

01:40:14: Because you can kind of combine these different bands, you know, to create all kinds of different, you know, functions on the surface of the sphere.

01:40:22: So I can like, you know, plug one here and maybe I'm going to put a negative one here and do like this thing and do a thing here.

01:40:32: We're going to do a thing here.

01:40:37: I don't want it to be that big.

01:40:39: You can create like, you know, all kinds of like weird shapes.

01:40:44: And like, you don't need that many parameters for this.

01:40:48: Like the SH2, which is like, you know, second order, which has nine coefficients.

01:40:55: It's enough, you know, to encode like the ambient, like ambient lighting information like this in the world.

01:41:01: And it doesn't need much data to do it.

01:41:04: These Gaussian Splats, they use third order, which has 16 channels and that can encode even more information.
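Those counts (9 for second order, 16 for third) fall straight out of the structure: order l contributes 2l + 1 basis functions, so orders 0 through n give (n + 1)² coefficients in total. A tiny sketch:

```python
# The counts quoted above follow from the SH structure: order l adds
# 2*l + 1 basis functions, so orders 0..n give (n + 1)**2 coefficients.

def sh_coefficient_count(max_order):
    return sum(2 * l + 1 for l in range(max_order + 1))

print(sh_coefficient_count(2))  # 9  -> the SH2 ambient-light case
print(sh_coefficient_count(3))  # 16 -> what the Gaussian Splats use
```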

01:41:15: So it's essentially like a mechanism, you know, to like encode information.

01:41:19: Because if you think about a sphere, each point of sphere, it's a direction from the center of the sphere, you know, to that point.

01:41:27: So if you, if you want to like, you know, sample point on the sphere, you're essentially, you know, taking something that's inside, like let's go from the center and have a directional vector.

01:41:38: And if the sphere is like unit sphere, this is, you know, just the unit direction, you sample the spherical harmonic and it gives you, it gives you color or it gives you, you know, radius.

01:41:48: Or it gives you, you know, audio amplitude for the ambisonics.

01:41:52: So like if you have like, you know, an audio source here, the spherical harmonics, it encodes the audio channels, the audio channels are the individual coefficients.

01:42:04: And then, you know, you sample it based on where the listener is relative to the audio source.
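That directional sampling idea can be sketched with first-order ambisonic B-format, where W is the omnidirectional channel and X, Y, Z are the directional ones. This is a toy illustration using one simple decoding convention; real ambisonic pipelines use specific normalization schemes and are more involved.

```python
# A toy sketch of sampling an encoded sound field by direction, using
# first-order ambisonic B-format (W = omnidirectional, X/Y/Z = directional
# channels) and one simple decoding convention.

def decode_first_order(w, x_ch, y_ch, z_ch, direction):
    """Sample the field toward a unit `direction` (listener relative to source)."""
    dx, dy, dz = direction
    return w + x_ch * dx + y_ch * dy + z_ch * dz

# Encode a source toward +X: the X channel carries the directional part.
w, x_ch, y_ch, z_ch = 1.0, 1.0, 0.0, 0.0

print(decode_first_order(w, x_ch, y_ch, z_ch, (1, 0, 0)))   # 2.0 toward the source
print(decode_first_order(w, x_ch, y_ch, z_ch, (-1, 0, 0)))  # 0.0 away from it
```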

01:42:14: And this is kind of a cool way, like, you know, to kind of be able to visualize it.

01:42:18: There's actually one thing I also wanted to bring in.

01:42:21: Let me, I'm not getting any more questions.

01:42:24: Like if you got any more questions, make sure like, you know, to post them in the next, like, three minutes.

01:42:30: I'll try to get to them before, like, we end the session.

01:42:33: But if there's no more questions, we'll probably end up like rambling about things for a bit.

01:42:48: I did want to showcase, so there's like this kind of diagram that you can kind of see a lot, like in different displays.

01:42:56: Because like spherical harmonics, they're one of the like building blocks, you know, like vectors and stuff that kind of comes up with a lot of other stuff.

01:43:04: If you look at this diagram, this is, this should be very familiar.

01:43:09: This is literally like, you know, what I've showed you with this.

01:43:14: Because I've always been like, what does this mean? Like, you know, what, what is this visualizing?

01:43:19: And this is literally just the value where the spherical harmonics modulates the radius.

01:43:29: So if I, if I change the node back to, say, zero.

01:43:34: If it's just the first band, you know, then it's just, it's literally just a sphere, you know, as the first one.

01:43:42: If we go for the second band, the first order, you see it makes that kind of shape.

01:43:51: That's exactly that shape. And there's like, you know, three of them.

01:43:56: Then the second, second order, you know, there's a bunch. So like, if we have one, two, three, four.

01:44:04: So this one, this one makes this shape, you know, and this matches our shape here.

01:44:11: And then also if we plug this one to this band, you know, this makes shape on a different axis.

01:44:23: Let me plug this one here. This makes this kind of shape. Does it look familiar? Is that the shape?

01:44:37: And then this one, this is going to, now they're kind of getting combined.

01:44:44: And this makes this shape. So like, it's literally these shapes.

01:44:48: And all this indicates is the radius modulated by the, you know, value of the spherical harmonics function, you know, at each point of the sphere.

01:45:01: Which, like the colors indicating, like where it goes positive and where it goes negative.

01:45:05: Because, you know, if you've got a sine wave, it goes, you know, it goes positive and it goes negative, positive, negative.

01:45:11: Same here. You know it goes positive, negative, positive, negative.

01:45:15: And if you add more bands, you know, this one, you can see like it now it goes like three times.

01:45:21: So the third order is, you know, positive, negative, positive, negative, positive, negative.

01:45:27: Like, you know, it goes through three, three waves, essentially.

01:45:33: And the more orders you add, like, you know, the more complex the shape gets, the more complex the information you can encode is.
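The "positive, negative, positive, negative" oscillation described above can be counted numerically: along a great circle through the poles, the zonal harmonic of band l (a Legendre polynomial in z) changes sign 2l times per revolution, i.e. it goes through l full waves. A small illustrative sketch, not tied to Resonite's implementation:

```python
import math

def zonal(l, z):
    """Unnormalized zonal harmonics Y(l, 0): Legendre polynomials P_l(z),
    hardcoded up to degree 3 for this demonstration."""
    if l == 0:
        return 1.0
    if l == 1:
        return z
    if l == 2:
        return 0.5 * (3.0 * z * z - 1.0)
    if l == 3:
        return 0.5 * (5.0 * z ** 3 - 3.0 * z)
    raise ValueError("only degrees 0..3 are hardcoded here")

def sign_changes(l, steps=360):
    """Count sign flips of Y(l, 0) around a full great circle
    through the poles (z = cos(theta))."""
    values = [zonal(l, math.cos(2.0 * math.pi * t / steps))
              for t in range(steps)]
    # compare each sample with the next, wrapping around
    return sum(1 for a, b in zip(values, values[1:] + values[:1]) if a * b < 0)
```

Band 0 never changes sign (it is just the sphere), band 1 flips twice (one wave), band 2 four times, band 3 six times: the three waves mentioned in the stream.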

01:45:41: I guess, like, you can almost think about it like, you can almost think about it like the different frequencies that, like, make up a sound wave, for example, because the data that they represent is constructed out of their interference with each other.

01:45:59: Okay, I mean, this kind of waves in general is like, like, these are sort of like, you know, the basis ones, and then like, you know, you combine them to, like, create whatever function you want.

01:46:09: Usually, like, this is like an extreme variation, you know, so like, this is gonna be like, this kind of like, you know, kind of showcase.

01:46:17: But you can use these kind of, you know, to constructively and destructively, like, create, like, some kind of, like, smooth function.

01:46:24: It's actually the same kind of, it's similar approach that's, like, used for encoding images with, like, JPEG.

01:46:29: Because with JPEG, with JPEG, what you do, you essentially vary, like, you know, kind of frequency information in a block of 8x8 pixels.

01:46:41: There we go.

01:46:43: You know, so, like, in JPEG, you're essentially overlaying, you know, like, a bunch of things.

01:46:48: And it can have, like, you know, very high frequency detail, you know, that goes with, like, low frequency detail.

01:46:52: And, like, it ends up, like, you know, making, you know, like, if you want, like, very big detail, you can have a function that's, you know, that's gonna be bright here and low here.

01:47:03: And what this is, you know, this is gonna be like this, you know.

01:47:07: And then you're gonna have, like, information that, you know, varies a lot more, so it's gonna create, like, you know, kind of bands.

01:47:14: And if you have enough of these, you can actually construct pretty much any, you can construct any image you want.

01:47:22: What it is, you're essentially using, you know, the Fourier transform, where you translate, you know, the information from spatial domain into frequency domain.

01:47:33: You can compose any wave out of, like, you know, any frequencies, as long as you have, like, enough of them.

01:47:38: And it's kind of a similar thing. If you had enough of these, you could, you know, have arbitrarily precise information on the surface of a sphere.
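The JPEG analogy above, decomposing a signal into overlaid frequency basis waves, can be sketched with a one-dimensional DCT (the transform JPEG applies to its 8x8 blocks, shown here in 1-D for brevity; illustrative, not anyone's production code):

```python
import math

def dct_ii(signal):
    """1-D DCT-II: express a signal as weights on cosine waves of
    increasing frequency, which is the idea behind JPEG's 8x8 blocks."""
    n = len(signal)
    return [
        sum(signal[i] * math.cos(math.pi * k * (2 * i + 1) / (2 * n))
            for i in range(n))
        for k in range(n)
    ]

def idct_iii(coeffs):
    """Inverse transform: overlay the cosine basis waves, weighted by
    their coefficients, to rebuild the original samples exactly."""
    n = len(coeffs)
    return [
        coeffs[0] / n + (2.0 / n) * sum(
            coeffs[k] * math.cos(math.pi * k * (2 * i + 1) / (2 * n))
            for k in range(1, n))
        for i in range(n)
    ]

samples = [3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0, 6.0]
rebuilt = idct_iii(dct_ii(samples))
```

With all coefficients kept the reconstruction is exact; JPEG gets its compression by quantizing away the high-frequency ones, just as a low-order SH keeps only the slowly varying detail on the sphere.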

01:47:47: And the spherical harmonics is what makes the encoding kind of, like, you know, efficient.

01:47:51: Especially, like, if you don't have too many bands, it makes it very efficient to encode low frequency detail.

01:47:58: So, like, you cannot, you know, have, like, something like a color that would be, you know, very different in a very small area.

01:48:09: The information's very low frequency, which means it kind of varies more slowly, but for stuff like ambient lighting, that's perfect, because that doesn't have very high frequency details.

01:48:21: We did get a couple more questions, actually.

01:48:23: Yeah, there's more questions.

01:48:28: So, Lucas is also asking, so far his examples use a color SH, will there be higher order ones?

01:48:34: So, Gaussian Splat actually has third order ones, so, like, if I switch to this, you see for the Gaussian Splat, it goes all the way to SH15.

01:48:45: There's also data structure for SH4.

01:48:50: So, if I switch to this, and create just an empty object, and I just go to data, and I'm just going to make a value field.

01:49:10: Oh, just another cool thing about these, like, I've implemented these in a very generic way.

01:49:17: So, I can replace spherical harmonics, and then, you know, I'm just going to use float.

01:49:27: Oh, actually, I forgot to add the order.

01:49:29: So, right now, the hardcoded ones go all the way up to fourth order.

01:49:34: So, I can add, you know, fourth order spherical harmonics, and these have, like, you know, 25 coefficients.

01:49:44: There's nothing that, like, uses these right now.

01:49:47: That actually generates the icosphere for all of these, all the way up to L4.

01:49:52: If there's interest, I can implement, like, you know, arbitrary level of support, but that, like, that's going to make it a little bit more complicated for the data model.

01:50:02: And I just don't think, like, you'll need more than the hardcoded ones, like, super often anyways, so I'll probably do that one later.

01:50:08: But right now, they go all the way to L4 within the data model.

01:50:12: Just another question, you know, how much can you change with essentially the ProtoFlux? Can you drive it, for example? Yes.

01:50:18: It's literally, like, it's not that different, you know, from other data types, like matrices, you know, or vectors.

01:50:25: So, if I grab this, let me actually do, I'm going to show it with this one, because it's going to look funky.

01:50:36: So, if we go back, you know, to this kind of thing.

01:50:40: This is just the value, like, you know, like any other. So, I can drive it, and there's a new set of nodes.

01:50:49: There's only a few basic ones that are probably going to need some more in the future.

01:50:55: Wait, what is happening? Sorry, I was selecting something.

01:51:01: So, if I go math, the spherical harmonics, you can pack them, and this is like evaluation.

01:51:08: So, you know, you can plug in a spherical harmonic and evaluate the value of the vector you want.

01:51:13: This one is second order, so I'm going to pack this one.

01:51:17: I'm going to pack the type float.

01:51:23: Essentially, just pack it.

01:51:26: I'm going to move this a little bit closer.

01:51:33: And say I just want to animate it, like do funky stuff with this, just kind of for visualization purposes.

01:51:40: I'll go time, first time.

01:51:46: And now, I'll do like simplex noise, I think.

01:51:54: Yeah, it works, simplex noise.

01:51:56: It's actually working because it goes, like, to the negative too.

01:51:58: I could also just use simplex noise, there we go, plug this in.

01:52:08: Come on, there we go.

01:52:10: Plug this in, just plug this into this.

01:52:13: Now I have this blubber thing, whatever this is.

01:52:19: That looks kind of neat.

01:52:23: There's, yeah, hold on, let me, let me change the color so it's also like...

01:52:32: That's too much.

01:52:39: There we go.

01:52:48: But I can like, you know, mess with it and do like whatever stuff you want.

01:52:55: I'm gonna do some more, I'm just gonna multiply.

01:53:03: Where's multiply my dummy?

01:53:04: It's in operators.

01:53:10: There we go, I'm just gonna plug this in so I can just vary it a little bit so they're not like all the same.

01:53:20: So I'm gonna do like, I don't know, like 3-point something, just a random value.

01:53:26: And let's do simplex2 and just go here.

01:53:30: Oh, that's too much actually.

01:53:36: There we go, I've got like, you know, this funky shape.

01:53:38: So like, you can drive them, you know, you can like mess around with them.

01:53:42: You can sample them.

01:53:44: I've kind of hinted at this a little bit earlier.

01:53:47: So if I go math, spherical harmonics, there's the evaluate.

01:53:55: And there's some kind of helpers, because, like, you need to specify the spherical harmonic type you want.

01:54:00: So one spherical harmonics, L2, of type float, and the type is float.

01:54:16: Take it out of it.

01:54:18: Float.

01:54:20: Did I do it wrong?

01:54:22: I did something wrong.

01:54:26: Spherical harmonics.

01:54:27: What did I do wrong?

01:54:32: This is spherical harmonics.

01:54:34: L2 float.

01:54:36: Oh, I misspelled spherical. Oh, dummy.

01:54:43: There we go.

01:54:45: So I get the evaluate node.

01:54:49: So I can actually now be like, I want to know what value is in, you know, this direction vector.

01:54:54: And right now there's not one, so I'm going to do it here.

01:54:58: Oh, it's not updated.

01:55:00: Oh, I need to mark it as continuous update.

01:55:01: No, that's weird.

01:55:02: Why is it not updating by default?

01:55:08: Something's clogged.

01:55:11: Whaaat?

01:55:14: What is wrong with this?

01:55:17: Did I bork something?

01:55:19: My stuff had bugs.

01:55:21: Yeah, it's kind of like real work in progress and stuff.

01:55:23: Also, like I think there's...

01:55:25: Oh, why is this not evaluating?

01:55:32: Yeah, I really should be able to evaluate them, but there's something wrong right now.

01:55:38: Eh, we'll get it working for a good build.

01:55:42: How do you...

01:55:44: Oh, there we go, animating...

01:55:46: There we go, animating the ambience.

01:55:48: Hooking up to ColorX Hue, there's a flat offset for a couple of them.

01:55:53: Yes, there we go.

01:55:55: That's really neat.

01:55:56: So I can create all kinds of effects with that.

01:56:01: Woah, it's kind of almost rotating, that's crazy.

01:56:06: It pretty much is, I think.

01:56:08: Like, there's an offset.

01:56:10: When you combine them, it essentially creates this kind of rotating effect.

01:56:14: You might have seen it earlier, too, when I was combining some of them.

01:56:20: It's kind of funky when you look at this, too.

01:56:22: So it's just going...

01:56:23: Oh wow, it's like a hole in the middle sometimes.

01:56:30: Woah.

01:56:32: One thing I'm hoping someone will make with this is an audio visualizer.

01:56:39: Yes.

01:56:41: But I hope that answers those extra questions.

01:56:44: It's like, you know, you can work with them.

01:56:47: My goal was to essentially add a new data primitive.

01:56:51: You know, the way vectors are, the way matrices are.

01:56:54: This is just another one, because it's useful for so many things.

01:56:59: So the Gaussian Splats, they were sort of like the initial impulse to do it,

01:57:02: but then I was like, oh wait, these are useful for this, and they're useful for that,

01:57:06: and for this other thing.

01:57:07: And I made sure the implementation is very generic.

01:57:11: So with any data type that does support multiplication or specifically scaling,

01:57:22: you can use them with spherical harmonics.

01:57:23: You could make spherical harmonics that has matrices on each point,

01:57:28: which I don't know if there's any practical use for it,

01:57:30: but somebody will make something crazy with it.
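The genericity described here, any element type that supports scaling and addition can sit in the coefficients, can be sketched as follows. The names and structure are illustrative only, mirroring the idea from the stream rather than Resonite's actual data model:

```python
import math

def sh_basis_l1(direction):
    """Real SH basis up to degree 1 for a unit direction (x, y, z)."""
    x, y, z = direction
    k0 = 0.5 * math.sqrt(1.0 / math.pi)
    k1 = math.sqrt(3.0 / (4.0 * math.pi))
    return [k0, k1 * y, k1 * z, k1 * x]

def scale(value, s):
    """Component-wise scaling; works for floats and tuples (colors, vectors)."""
    if isinstance(value, tuple):
        return tuple(v * s for v in value)
    return value * s

def add(a, b):
    """Component-wise addition for the same element types."""
    if isinstance(a, tuple):
        return tuple(x + y for x, y in zip(a, b))
    return a + b

def evaluate(coefficients, direction):
    """Evaluation only ever scales coefficients by basis values and sums,
    so the element type just needs scaling and addition."""
    basis = sh_basis_l1(direction)
    result = scale(coefficients[0], basis[0])
    for c, b in zip(coefficients[1:], basis[1:]):
        result = add(result, scale(c, b))
    return result

# Float coefficients work...
r = evaluate([1.0, 0.0, 0.5, 0.0], (0.0, 0.0, 1.0))
# ...and RGB-tuple coefficients work with no change to the evaluator.
rgb = evaluate([(1.0, 0.5, 0.2)] * 4, (0.0, 0.0, 1.0))
```

The same shape of container would accept matrices or any other scalable type, which is why exotic combinations like matrix-valued harmonics fall out of the design for free.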

01:57:34: We also have about two and a half minutes left,

01:57:38: so it's pretty much about time to wrap it up.

01:57:49: I think this kind of covers everything.

01:57:51: Do you have anything yourself?

01:57:55: No, I think that this is magical, and now I want to play with it for ten hours straight.

01:58:00: Yeah, people will have lots of fun with it.

01:58:04: Okay, I think we're going to be good to end it here.

01:58:11: Thank you very much everyone who has joined.

01:58:17: I hope you enjoyed this episode of The Resonance.

01:58:20: I hope everyone had a good New Year.

01:58:26: I enjoyed the holidays and everything.

01:58:28: Thank you everyone for asking questions.

01:58:31: And for staying with us, and for just kind of in general supporting Resonite.

01:58:40: Thank you all for being here, for joining on the stream.

01:58:45: I'm hoping to answer questions and showcase things.

01:58:48: I'm glad to have made this one.

01:58:52: Hopefully next time will be less diseased.

01:58:57: Yeah, it was kind of like just, it was still kind of like, bleh.

01:59:01: Today, so it's a little bit, bleh.

01:59:03: But hopefully the stream was good in general, regardless.

01:59:09: So thank you very much.

01:59:11: Thank you for being here, thank you for supporting Resonite.

01:59:15: And we'll see you with the next one.

01:59:17: Actually, what's the next one? It's going to be...

01:59:20: Okay, this is on the 12th, so I'll still be around.

01:59:27: The 19th I'll probably have to skip, because I'll be, like, at FC.

01:59:32: So I don't think I'll be able to do one then.

01:59:36: So the next one is going to happen, but the one after that is going to be skipped.

01:59:43: So, anyways, thank you very much, you know.

01:59:47: Thank you for watching, and we'll see you with the next one.

01:59:51: Bye.

01:59:55: Bye.

01:59:57: Where's the OBS line?

02:00:06: Okay, there we go. Bye.