This is a transcript of The Resonance from 2024 December 29.
00:01: Oh, I can hear myself, oh my god, here we go.
00:12: Posting livestreams and everything, posted, there we go, what, what are you doing over
00:23: there?
00:24: Come over here, hello?
00:30: Hello?
00:31: Hello?
00:35: Hello everyone?
00:38: We should be live.
00:40: We got messages, yeah, we got messages.
00:42: Huh, I'm bright. If it's too bright, let me know, I'll turn it down.
00:45: I'm just, like, glowing with the New Year's thing, I guess.
00:51: Hello!
00:52: Hello, Lexavo, hello JBiden, hello Fuzzy.
00:55: We got people coming in.
00:58: Ah, can you hear me fine?
00:59: Is the audio okay?
01:15: I guess we could get started. So hello everyone and welcome to, I believe, the seventh episode of The Resonance, which is sort of like a hybrid between my office hours, where you can ask me any questions about Resonite, whether it's technical, whether it's, you know, about the goals, like what I want Resonite to do, anything with business, essentially anything around Resonite, and I'll try to answer the best I can.
02:00: You know, sometimes I might not know the answer offhand, but I'll try to answer things the best I can.
02:08: But I might also, you know, go into longer-winded kinds of rambles about the high-level reasoning of why things are done the way they are.
02:16: This should give you, you know, a better sort of bird's-eye perspective on how Resonite is made, what we want to do with it, how we're kind of approaching everything, and so on.
02:27: And also, yes, this is the last episode of The Resonance in 2024.
02:33: For this last one, I actually picked this world. This one's from last year.
02:37: They haven't published, like, the new one yet, but it might be fitting.
02:41: This world has, what's it called, a solarpunk kind of style to it, and I really like its vibe.
02:50: So, hello everyone, and hello Karolza, hello Dedergan, and El Tremendo La Carca.
02:59: So if you want to ask me any question, make sure to end the message on Twitch chat with a question mark.
03:08: That way, oh no, no, it's not the last episode of The Resonance, it's the last of this year, we'll be doing a lot more.
03:20: So, yeah, make sure like the question ends with a question mark.
03:24: It doesn't actually need to be written in, but the question mark needs to be there somewhere.
03:27: Then it pops on this thing, like for example, this one, this one actually just popped in.
03:32: Learning, there we go. Like this one, you see, I can pull them off too.
03:39: So if you do that... ah, it didn't tag correctly, let me do that again.
03:45: I think it's Discord being Discord.
03:51: Yeah, because like Discord likes doing the thing where it doesn't show the full name.
03:57: I'm just going to send another tag, office hours.
04:04: I'm actually going to disable streamer mode.
04:08: Oh, I see, it's like a different thing.
04:21: There we go, there we go.
04:28: Thank you for letting me know.
04:33: So with that, we should be ready to start.
04:36: So if you have any questions, make sure to end it with a question mark.
04:43: And I'll do the best I can to answer them.
04:46: And when there's time, I'll probably ramble about a bunch of different things.
04:50: So with that, we actually have some questions already popping.
04:53: Well, there's one.
04:57: So there's a bunch popping.
05:04: So the first question is from Fuzzy Bipolar Bear.
05:08: I've been watching your Gaussian Splatting work with interest after trying to find ways to bring them into Resonite before but failing.
05:14: And I'm wondering how the progress is going and how does it feel to be working on something a little more fun than necessary for a change?
05:21: So yeah, in case you haven't seen, actually, let me see if I can bring the video.
05:27: Give me a second, I have it right here.
05:31: So just bring that out.
05:33: I'm on the quest controller so clicking also jumps.
05:38: There we go.
05:39: Put this in.
05:40: Make sure it's in the frame.
05:43: So this is pretty much like progress.
05:45: I actually made this video earlier today.
05:49: I've been working on adding support for rendering Gaussian Splats in Resonite.
05:53: It's not something, you know, you could implement yourself, because it
05:57: requires a tighter kind of integration with the render pipeline.
06:02: And I've been basing my work on existing Unity implementations.
06:05: The only problem is those existing implementations were designed for newer versions of Unity
06:11: using some features that are not available in the one we have right now.
06:15: So I had to kind of find alternate solutions.
06:18: Particularly the problem was the sorting algorithm.
06:23: Because when you're rendering Gaussian Splats, you need to sort them based on the distance from the camera.
06:30: The algorithm that the reference implementation uses, or like the project we're basing it on,
06:37: uses something called wave intrinsics.
06:40: And it's like a way where you're running like a compute shader.
06:43: Where the compute shader, like different threads can communicate with each other and exchange data.
06:50: That's not supported by the version of Unity we have.
06:52: So I had to find a different sorting algorithm, called bitonic sort, and adapt it.
06:56: And that actually ends up working fine.
06:58: And then I had to do a bunch of wrangling and adding our own data types for it and so on.
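For readers curious about the sorting step mentioned above, here's a rough sketch of bitonic sort in plain Python. This is purely illustrative, not Resonite's actual code: on the GPU, each pass runs as a compute-shader dispatch with one thread per index, but the compare-and-swap network is the same.

```python
def bitonic_sort(values):
    """In-place bitonic sort over a power-of-two-length list.

    This mirrors the compare-and-swap network a GPU compute shader runs:
    each (k, j) pass is one dispatch, and every index i can be handled by
    an independent thread. GPU versions pad to a power of two with
    sentinel values.
    """
    n = len(values)
    assert n and n & (n - 1) == 0, "length must be a power of two"
    k = 2
    while k <= n:              # size of the bitonic sequences being merged
        j = k // 2
        while j >= 1:          # comparison distance within this pass
            for i in range(n):
                partner = i ^ j
                if partner > i:
                    ascending = (i & k) == 0
                    if (values[i] > values[partner]) == ascending:
                        values[i], values[partner] = values[partner], values[i]
            j //= 2
        k *= 2
    return values

# e.g. sorting splats by their distance from the camera:
distances = [4.2, 0.5, 9.1, 3.3, 7.7, 1.0, 2.8, 6.4]
print(bitonic_sort(distances))  # ascending: nearest splat first
```

Unlike quicksort, the sequence of comparisons is fixed and data-independent, which is exactly what makes it practical on a GPU where threads can't easily coordinate.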
07:03: But pretty much here, this is rendering in our version of Unity.
07:09: You can see it's kind of working pretty well.
07:13: You can see there's like the preview, there's like this one.
07:21: This whole thing, this is already loaded to our own FrooxEngine class.
07:26: So like the loading of the Gaussian Splat and then like pushing the data.
07:31: That's our own code.
07:32: The rendering is adapted code from the open source project rendering them.
07:38: It's pretty much at the stage where I have it, like, hooked in.
07:42: It needs a little bit more cleanup.
07:44: And once that's all done, I'll properly add like a new asset type into Resonite.
07:49: So you can just, you know, drag and drop.
07:51: And it'll automatically set up stuff on the side of Unity.
07:54: So it's actually very close.
07:57: The one showstopper, which was the sorting algorithm, that's pretty much resolved.
08:01: The rendering, you know, you can see it works.
08:04: At this point, like, you know, it's just matter like, you know, probably a few more days.
08:08: And I'll have...
08:10: We might like...
08:11: Well, today, like I'm kind of visiting.
08:14: So I'll be like doing some things.
08:17: Like we're going like to the cinema to like watch the new Sonic movie.
08:20: So it might not happen today, but like it's very close.
08:23: Like it's gonna happen and it's gonna happen very soon.
08:27: And yeah, it's been like...
08:28: It's been a lot of fun because I've been kind of like, you know, doing this kind of on the side.
08:31: Like when we're kind of like, you know, hanging out and so on.
08:35: I mean, I'm kind of doing it like over the weekends as well over like past few weeks.
08:39: I'm just kind of nudging it over.
08:40: And I had a little bit more time to, like, you know, spend on something that's not really work,
08:45: but something that's like, you know, more fun, like hobby thing that I wanted to do.
08:49: And it's been kind of like, you know, refreshing because like I found myself to be like,
08:54: I really want to work on this.
08:55: Like I really want to see this.
08:56: And I've been, like, you know, going around 3D scanning everything and also capturing splats.
09:01: And I'm like, I want to bring them in here.
09:02: I want to, you know, show them off to people.
09:05: And I want to see them like in VR in a collaborative environment.
09:09: So like, I've been very excited, like, you know, to get this thing going and get this thing finished.
09:14: And I do believe like it's going to be like a cool thing for everyone as well,
09:17: because I think this is going to make us the first social VR platform to support these.
09:24: So you can, you know, just drag and drop them in and like, you know, show them to people.
09:28: So hopefully, you know, a lot of people can enjoy this as well.
09:30: But yeah, it's been a lot of fun.
09:32: It's been very refreshing to just kind of work on something where you have a natural drive to kind of do it.
09:39: So it's, it's been a lot of fun.
09:46: Yeah. And I'm excited to have this one in.
09:49: So, soon.
09:54: Next question we have from GlovinVR.
09:56: From testing, it appears that PhotonDust runs much slower than Unity Particles.
10:00: It runs at around 23 FPS max.
10:02: Is this a limitation of FrooxEngine or PhotonDust?
10:04: And will the update to .NET 9 allow it to run much more smoothly and faster?
10:08: Well, so it kind of depends.
10:12: Like, the problem is I don't know, like, you know, your testing setup for this.
10:15: And the performance is going to heavily depend on, you know, what you're doing with the system.
10:21: For example, with particles that, like, you know, resolve collisions,
10:25: they'll, like, run generally slower.
10:30: And especially with the version of Unity we have, where we're still running with Mono.
10:30: There's been some tests where we can kind of see, you know, where, essentially, like, in a complex environment with lots of colliders,
10:41: it can run pretty slow with, like, you know, when there's lots of collisions to resolve.
10:45: But if it's, like, when we run the same thing with .NET 9, it runs, like, 10 times faster.
10:52: So that alone is going to, like, help it a lot.
10:54: But also, it depends. Like, there's, like, no locking.
10:58: Like, it is not locked to, like, 20 or 30 FPS, and it definitely, like, with my own testing, it runs faster than that.
11:05: So, again, like, you know, it's hard to say generally when I don't know what your testing conditions are.
11:11: But I can tell you for sure there's, like, no locking.
11:13: If there is, we need, like, a report, like, you know, to kind of see, like, what is happening in our particular system,
11:18: because sometimes these things can be, like, system dependent.
11:21: However, there are still, like, some issues that need to be resolved.
11:28: Like, with, like, the submission system.
11:32: Because, like, when, there's, like, two parts to PhotonDust.
11:35: One is the actual simulation, and that tends to run really fast.
11:39: Even, like, with the Mono.
11:40: The other part is, like, you know, pushing the data to be rendered into Unity.
11:45: And from what I've done, like, in testing, that part actually ends up, you know, kind of slowing things down.
11:55: And there's, like, still parts of that that need to be kind of reworked.
11:59: Because right now, what can happen is, like, PhotonDust does the simulation,
12:04: and then sends the data to Unity, and essentially waits until there's another simulation step, until Unity integrates the result.
12:13: And that depends on, like, how fast you're running, how much other stuff is happening.
12:17: So, like, there's, you know, stuff that can be done, like putting it on, like, you know, a separate, higher-priority queue.
12:21: But also improving the performance of the submission.
12:25: Right now, we're still kind of, like, adapting Unity's particle rendering for, like, you know, the data we simulate.
12:32: And that might be kind of, like, a limiting factor.
12:35: So, like, one of the approaches I've been kind of, like, looking into is using, actually, compute shaders to, like, make our own things.
12:44: The problem is, you know, it will take extra time to implement that.
12:49: But it might be, you know, needed if we cannot get the submission through Unity's system to be fast enough.
12:57: The problem is, like, you know, that work probably cannot just be thrown away, but also we'd only get some use out of it,
13:02: since it's, like, you know, compute shaders.
13:03: It's also kind of funny because there's actually some overlap with how the Gaussian Splats work, because that's also using a compute shader to sort, like,
13:12: you know, the geometry to show the particles on your screen,
13:19: something like that kind of mechanism.
13:22: So there might be, like, an interesting synergy.
13:25: So it kind of depends, like, I would need, like, you know, more kind of information.
13:29: And there's, like, some reports, but, like, I'm aware of them only in passing; I haven't really, like, been paying super much attention to them.
13:35: I've been kind of just doing fun things, but I'll be, like, looking into it, you know, once the new year kind of rolls in.
13:41: And once I'm done with some of the fun stuff, I'll circle back to PhotonDust and look into those.
13:49: But yeah, like, the update to .NET 9 is definitely gonna make it run faster, but also, like, the goal is to make it run as fast as it can before we make the switch as well.
14:01: And there's things, like, you know, that, like, need to be kind of investigated still, so.
14:06: And, like, since I haven't, like, looked into those, like, super deep, like, I cannot give you a more specific answer right now.
14:12: But yeah, I do think, like, we can get it there; it's more a question of, like, the time-invested-versus-returns equation.
14:27: So it's kind of like, it's like, how much time do we want to spend now that's gonna be thrown away, you know, to bridge us until we make the switch?
14:40: And how bad, like, you know, it can be. And that's something, you know, I need to, like, look into how well we, like, sort of, evaluate how much time needs to be put into it, and how bad it is.
14:50: So, like, if we can, say, like, you know, we can get it to, like, even with the limitations of Unity, like, if we don't spend too much on it, maybe we get it to 90%, you know, of, like, the refresh rate.
15:04: Maybe that, you know, that's enough, and we don't need to spend, like, you know, extra time to get it over 100%, until, like, we make, like, you know, the switch with the rendering engine.
15:15: But maybe, like, you know, we need to get it from, say, like, 30% to those 90%, and that takes more time, so...
15:23: I'll have to look; like, it's really hard to answer these questions without, like, looking at it, you know, in detail.
15:29: Like, one of the things, it might even be something stupid, because what I think might be happening is just kind of the ordering of some of the submissions.
15:38: So, like, one thing I have a hunch might be happening is, like, when it finishes the simulation, it actually waits until the next frame, when it can do the submission, and it, like, ends up, like, halving the actual total submission rate.
15:53: But, like, I don't know for sure right now if that's what's happening or not. I need to, like, you know, run through the tests and investigate that.
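That hunch can be illustrated with a toy model (illustrative only, not actual Resonite code): if a finished simulation step always has to wait for the following render frame before its data is submitted, each update occupies two frames and the visible particle update rate halves.

```python
def effective_update_rate(render_fps, waits_extra_frame):
    """Toy model of the submission-timing hunch.

    If submission happens in the same frame the simulation finishes,
    updates arrive every frame. If it has to wait for the *next* frame,
    each update spans two frames, halving the visible update rate.
    """
    frames_per_update = 2 if waits_extra_frame else 1
    return render_fps / frames_per_update

print(effective_update_rate(60, waits_extra_frame=False))  # 60.0
print(effective_update_rate(60, waits_extra_frame=True))   # 30.0
```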
16:01: So, that's one thing. There's actually been another, like, performance issue where it locked your frame rate, like, you know, significantly lower.
16:10: Like, it would essentially, like, with some existing systems, when they converted to PhotonDust, you would start lagging.
16:15: And what it... I actually thought, like, it had to do with the submission mechanism slowing it down.
16:22: But what that turned out to be is that one of the trail, like, parameters was not converting properly.
16:30: So the converted system was actually generating 10 times more geometry data for trails.
16:37: Like, trails were, like, way longer than they should have been.
16:41: So it lagged as a result of that, because, like, you were, you know, rendering and processing 10 times the amount the original was.
16:49: So it wasn't, like, you know, a one-to-one comparison for that.
16:52: And fixing that conversion actually made it not lag the framerate.
16:57: So, there might be, you know, other kind of stupid things like that, which end up being relatively quick fixes and end up boosting the rate, like, quite a bit.
17:11: So, what I can tell you is, like, I'll essentially, you know, get it as fast as I can.
17:18: I'm gonna evaluate how much it takes.
17:20: It might be, you know, that, like, with some, like, simple fixes, it ends up being, like, something like 90% of the performance and I'll be, like, okay, that's fine.
17:28: It's close enough.
17:29: We're gonna keep it on that until we switch the graphics engine.
17:33: Or maybe it'll turn out to be, like, no, actually, I need to invest more work into the renderer.
17:38: Like, it's gonna take more time, but it's gonna bring it, like, you know, kind of closer.
17:42: But that, that still kind of remains to be determined.
17:47: But yeah, hopefully, like, that kind of answers it thoroughly.
17:51: And the next question is from Lexavo.
17:54: Gaussian Splatting.
17:55: I know you've had quite a bit of fun implementing a viewer into Resonite, but someone who doesn't know much about it,
18:00: apart from it just being a way to make and view 3D models, what is it and how can it be used in Resonite?
18:05: So, you cannot really use them in Resonite yet, like, you'll be able to soon,
18:11: but, to put it simply, they're sort of a new way of rendering, I don't even want to say models,
18:18: because it's not exactly a model in the traditional sense.
18:22: You can think of, like, a Gaussian Splat sort of like, almost like an extension of a point cloud,
18:30: where instead of each point being just a point, it's like a colored blob.
18:38: You know, it's like a three-dimensional kind of blob that's kind of fuzzy,
18:41: and it can be all kinds of different shapes.
18:44: Like, you can actually, like, over here.
18:54: With a point cloud, you have just points, and they're all kind of the same size and everything,
19:00: and you can represent something like that, but it's kind of...
19:04: So with Gaussian Splats, each point, there's this kind of fuzzy blob around it,
19:11: and it's going to be 3D, and it's going to be very opaque here in the center,
19:15: and the closer you get to the edge, the more transparent it gets.
19:21: And it's kind of following the Gaussian function.
19:25: And you have a bunch of these blobs, and some of these can be ground-ish,
19:31: some of them can be stretched, I'm just going to call it simplified,
19:34: some of them can be like this, and they can have different orientations.
19:39: And what the Gaussians end up doing is they end up approximating the shapes.
19:44: So say you have a shape like this, and you can end up having a Gaussian here,
19:51: and you have another Gaussian here, and they kind of approximate that shape.
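As a rough sketch of that falloff (simplified, with made-up numbers, and keeping the blobs axis-aligned rather than arbitrarily rotated as real splats are), each blob's opacity follows the Gaussian function:

```python
import math

def splat_opacity(point, center, scale, base_opacity=1.0):
    """Opacity contribution of one axis-aligned Gaussian blob at `point`.

    Fully opaque at the center, fading toward transparent at the edges
    following exp(-0.5 * d^2). The per-axis `scale` stretches the blob,
    so it can be round-ish, disc-like, or elongated.
    """
    d2 = sum(((p - c) / s) ** 2 for p, c, s in zip(point, center, scale))
    return base_opacity * math.exp(-0.5 * d2)

center = (0.0, 0.0, 0.0)
scale = (1.0, 0.2, 1.0)  # a flattened, disc-like blob

print(splat_opacity(center, center, scale))           # 1.0 at the center
print(splat_opacity((2.0, 0.0, 0.0), center, scale))  # fades toward the edge
```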
19:58: What's really good about them is, because they are fuzzy and they have different orientations,
20:06: they're really good at representing soft details, like fur or plants,
20:11: so you don't get that kind of traditional blobbiness you get with 3D models.
20:20: The other benefit of them is, oh, one more thing, is they also have directional information.
20:27: So each Gaussian, it has a color, but its color is encoded using something called Spherical Harmonics.
20:36: And what it essentially is, it's a way to encode the color where it depends on the direction.
20:41: So if I look at this Gaussian from this angle, it can have one color in this direction.
20:47: If I go over here and look at it from here, for example, say over here is red.
20:53: And this is an extreme example, it's going to be a similar color, but then I go over here and it's going to be green.
21:00: So it can kind of change color, and that's really good for capturing some of the reflections
21:06: and subtle changes in the scene based on the direction you view it from, which helps add extra realism.
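A sketch of what that encoding looks like, using only the first two spherical-harmonic bands (real splat files typically carry coefficients up to degree 3, and renderers clamp the result); the constants are the standard real-SH normalization factors, and the coefficient values here are made up for illustration:

```python
SH_C0 = 0.28209479177387814  # degree-0 normalization constant
SH_C1 = 0.4886025119029199   # degree-1 normalization constant

def sh_color(coeffs, direction):
    """View-dependent RGB from degree-0 and degree-1 SH coefficients.

    `coeffs` is four RGB triples: the DC (base color) term plus three
    linear terms that shift the color with the view direction.
    """
    x, y, z = direction
    color = []
    for ch in range(3):
        c = SH_C0 * coeffs[0][ch]
        c -= SH_C1 * y * coeffs[1][ch]
        c += SH_C1 * z * coeffs[2][ch]
        c -= SH_C1 * x * coeffs[3][ch]
        color.append(c)
    return tuple(color)

coeffs = [
    (1.0, 1.0, 1.0),   # DC: grayish base color
    (0.0, 0.0, 0.0),   # y term (unused here)
    (0.0, 0.0, 0.0),   # z term (unused here)
    (0.4, -0.4, 0.0),  # x term: redder from one side, greener from the other
]
print(sh_color(coeffs, (1.0, 0.0, 0.0)))   # viewed from +x
print(sh_color(coeffs, (-1.0, 0.0, 0.0)))  # viewed from -x: a different color
```

Same splat, two view directions, two colors: that's the directional information he's describing.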
21:15: But also the good thing is that they work really well with 3D reconstruction.
21:21: Because when you reconstruct stuff, it uses a process called gradient descent.
21:26: It's a form of machine learning where it's essentially learning to approximate your input data, which are typically photos.
21:37: So from the photos, if you have a shape like this, maybe it starts with a single Gaussian that tries to cover the whole shape, but that's not accurate enough.
21:46: So it's going to split this Gaussian and put one here and one here.
21:51: And it just kind of slowly kind of settles in place and then kind of splits into more.
21:55: And maybe there's a little blue blob here, so it adds another Gaussian here.
22:01: And it just ends up kind of, once it runs through enough steps of learning, it learns to represent the scene pretty well.
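The optimization loop being described can be boiled down to a toy example. This fits just one 1-D Gaussian's center to a few samples by gradient descent; real splat training also optimizes scale, rotation, opacity, and SH color, and splits or prunes Gaussians along the way, but the "nudge parameters downhill until they match the observations" loop is the same idea.

```python
def fit_center(samples, center=0.0, lr=0.1, steps=200):
    """Gradient descent on a squared-error loss, sliding one Gaussian's
    center toward the observed samples step by step."""
    for _ in range(steps):
        # loss = mean((center - s)^2); its gradient w.r.t. center:
        grad = sum(2.0 * (center - s) for s in samples) / len(samples)
        center -= lr * grad  # take a small step downhill
    return center

print(fit_center([2.8, 3.0, 3.2]))  # settles near 3.0, the samples' mean
```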
22:14: If I go back to the original camera, I'm going to show you some more videos.
22:20: Because of that, they're really, really, really good at just reconstructing 3D scenes captured from the real world.
22:31: If I go back to the Gaussian Splats.
22:36: So let me bring this over.
22:41: So I'm going to spawn this one.
22:44: So this one, for example, you see, it's one of the captures I did.
22:49: I'm going to import more in here.
22:51: But this is like rendered using Gaussian Splats.
22:53: And you see, like, the surface there is shiny, which traditional photogrammetry tends to kind of not be super good with.
23:00: And you can see the shine kind of changes, you know, when the camera pans around.
23:05: Like, it ends up, like, looking, you know, very realistic.
23:11: You see, like, how the shine kind of, like, changes based on the direction.
23:15: So the goal is essentially, you know, to be able to bring these into the scene.
23:19: So you can view them, you know, in Resonite, you can share them with other people.
23:25: So you have, like, you know, a way to kind of share the stuff you've captured.
23:32: It's almost like little fragments of reality.
23:34: This is a really, really good example too.
23:38: You see, like, look at the fur.
23:40: Because, oops, I grabbed it by accident.
23:43: It's kind of hard to point with this controller without grabbing.
23:45: Look at the fur.
23:46: The fur is, like, you know, there's, like, very fine detail.
23:48: And it's, like, it actually looks soft.
23:51: Which, you know, you don't get with traditional 3D scanning.
23:58: So this one came out, like, really well.
23:59: Like, it kind of captures, like, you know, that kind of, like, softness to it.
24:06: And it's because, like, you know, each strand of the fur, instead of, like, you know, being, like, geometry,
24:11: it's actually composed of the Gaussians.
24:13: And because they're fuzzy blobs, they end up representing that, like, really well.
24:17: Here, you should be able to see, like, the center of the Gaussians.
24:21: There's also, like, you know, a lot of, like, subtle effects.
24:23: Like, as you kind of look around, the fur, it actually has, like, you know, a sheen to it.
24:27: It's kind of shiny.
24:28: And because of the spherical harmonics, that ends up being represented as well,
24:32: and that makes it look a lot more real.
24:35: And I've got, like, another one that I want to show you.
24:38: This one's a really cool example.
24:41: So this one's, like, it's really good at new stuff plans as well.
24:45: So if I play this one, you see, like, it's just, like,
24:50: it can make 3D representations that just kind of look real.
24:54: Like, it doesn't have that kind of, like, machine-ness that you get with, like, models,
24:59: because with models, it has to, like, you know, reduce the scene to triangles.
25:05: It has to find what the surface is and then project a texture on it.
25:10: And with that, it actually ends up, like, having errors.
25:12: But with this, because the rendering primitives are the Gaussians,
25:17: the Gaussian Splats, instead of triangles, it ends up being much better at representing the scene.
25:25: And one of the things that, like, people end up asking is, like, can you convert these into models?
25:29: And I usually tell them, like, that would essentially defeat the entire point of it.
25:34: Because if you end up, like, with a model, it ends up having the machine-ness,
25:37: because the model is not as good at representing, you know, the scene.
25:43: I have, like, one more example I want to show you.
25:45: This one's kind of, like, an extreme one. This actually shows the difference, you know, between Gaussian Splats and traditional photogrammetry models.
25:54: So first, this is a traditional model. You see, it has, like, a surface.
25:59: Like, it's kind of missing parts. This actually shows where Gaussian Splats can be weak, though.
26:03: So this scene is made from just three photos that are, like, near to each other.
26:10: With photogrammetry, it actually can see the surface and look at it from the side.
26:13: But with Gaussian Splatting, the moment I look away, like, the scene just kind of falls apart.
26:20: And you see, like, the whole thing is composed of, like, these colorful fuzzy blobs.
26:26: And because the photos only see, the photos are only from, like, one particular angle,
26:32: they're aligned so, like, when you look at it from the angle of those original photos, it looks correct.
26:37: But the moment you look away, it just kind of diverges because they can do whatever.
26:42: Like, there are no views to kind of constrain how it's supposed to look.
26:46: But, like, they are an excellent way to kind of capture reality.
26:51: So, one of the things I would really want to do, once we also have them, is build, sort of, like, a virtual sort of museum.
26:59: Like, I've, like, over, like, this past year, I've had a chance to visit the Yellowstone National Park, for example, twice.
27:07: I took a whole lot of 3D scans there, like, I have, like, a few hundred gigabytes worth of, like, data from there.
27:13: And I've been reconstructing, you know, the geysers and stuff, and, like, it's been looking really good.
27:17: Because the Gaussians, they're really good at, like, you know, capturing stuff like the reflections in the water, even, like, the steam that's coming out of them.
27:25: I visited some other museums, like the Museum of the Rockies, where they have, like, dinosaur skeletons.
27:29: I have, like, 3D reconstructions of those.
27:32: So what I want to do is, like, you know, start, like, kind of publishing those, and it's a really good way to kind of, you know, share things with people.
27:39: It's almost like capturing a fragment of reality and being able to preserve it and show it to other people.
27:47: And that's, like, one of the things I really like about this, like, technique.
27:50: It kind of, like, you know, puts you in that place.
27:52: It lets you kind of feel how that place feels, you know, like, wherever you are in the world.
27:57: And that's why I think, like, you know, having those in here, it's going to be a really powerful feature.
28:03: And there's also, like, you know, lots of research going into those.
28:06: Like, the Gaussian Splatting is one of the kind of techniques that's been kind of, like, on the...
28:13: What's the word? Like, it's kind of on the rise, essentially.
28:18: So there's probably a lot more tooling appearing.
28:20: Like, you know, there's, like, tools you can just download and use.
28:22: And so being able to bring those in, I think it's going to be a lot of fun.
28:26: And it's going to have a lot of kind of cool applications.
28:29: And I can pester people with, like, scans of all this stuff.
28:33: I mean, like, I already do, but, like, now they're kind of big ass scenes.
28:37: So, next question is NukiKoon.
28:41: Will we at some point get a way to put a detailed texture on a triplanar?
28:44: It would be useful for adding variation to large landscapes using it.
28:48: I mean, it's potentially possible.
28:50: I would recommend making sure that's a GitHub issue.
28:52: And, you know, make sure, like, to get votes on it.
28:55: I don't think it would be particularly difficult to add.
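For context on what the question is about, here's a sketch of the idea behind triplanar mapping (illustrative Python, not shader code; the `sample_*` callables stand in for hypothetical texture lookups). A detail texture would just be a second, higher-frequency lookup blended with the same weights:

```python
def triplanar_weights(normal, sharpness=4.0):
    """Blend weights for the X/Y/Z planar projections, derived from the
    surface normal. Higher `sharpness` tightens the transition zones."""
    w = [abs(c) ** sharpness for c in normal]
    total = sum(w)
    return tuple(c / total for c in w)  # normalized so they sum to 1

def triplanar_sample(sample_x, sample_y, sample_z, position, normal):
    """Sample three axis-aligned projections and blend them by the normal.
    Each projection uses the two coordinates perpendicular to its axis."""
    wx, wy, wz = triplanar_weights(normal)
    x, y, z = position
    return wx * sample_x(y, z) + wy * sample_y(x, z) + wz * sample_z(x, y)

# A face pointing straight up gets its texture entirely from the Y projection:
print(triplanar_weights((0.0, 1.0, 0.0)))  # (0.0, 1.0, 0.0)
```

Because the projections come from world axes rather than UVs, this works on large terrain without unwrapping, which is why a detail layer on top of it would help break up tiling on big landscapes.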
28:58: We're in a state where we're not touching the shaders super much.
29:01: Because we know we want to, like, you know, we want to move to a different rendering engine.
29:07: We want to move away from Unity.
29:09: So, like, we don't want... everything we do with shaders, you know, we'll have to rework.
29:14: It's actually the thing, you know, with the Gaussian Splats as well.
29:18: We'll have to kind of re-implement it.
29:19: Although, in that particular case, it's actually kind of easier to port that.
29:24: But we had, like, you know, discussions about it.
29:27: Like, are we okay? Like, is this going to align with the new engine?
29:32: But, like, the new engine is actually going to use compute shaders a little more heavily.
29:35: So, we're like, yeah, that's fine. It's not going to be too much extra work.
29:40: Oh. Oh, it started the show. I forgot it does that.
29:43: I might turn it off. Sorry.
29:45: Uh, how do I...
29:52: So, I didn't quite realize, um, I didn't quite realize this was going to go off.
29:57: It's not quite the New Year's yet, but I think it just does it every hour.
30:02: Um, is midnight somewhere?
30:06: I think maybe it's actually midnight, like, at home.
30:08: But, like, um...
30:10: Okay, sorry, I kind of lost my train of thought.
30:12: But, yeah, like, we were kind of like, you know, that's not super much extra work.
30:26: But, like, so we can, like, make changes, and, like, Geenz has been working on the consolidation of the shaders.
30:31: So, like, it's always, like, possible to add, like, little things here and there, especially if they don't require too much rework.
30:37: But, like, we're kind of cautious with it.
30:39: So, just make sure, like, there's a GitHub issue for it, and, um, it can get, like, kind of prioritized at some point.
30:46: The next question is from El Tremendo La Carca.
30:50: Oh, a real question.
30:50: When will Resonite be at a stage where you can do everything on it, from developing Resonite itself to doing everyday work inside?
30:57: So, it kind of depends what you mean by that, because, like, you can, um, develop Resonite from inside itself. It kind of feels like a cheat way, but, like, there's the desktop tab.
31:08: And actually, a lot of the times, I will, you know, I will be, like, on Resonite, but also, like, I have the desktop tab open and can be, like, working on some stuff.
31:19: Um, and I even, like, you know, run one instance of Resonite while in Resonite, which is kind of always kind of fun.
31:25: Um, so it's, like, it depends what you mean, because it's very kind of open-ended. Because you could also think, like, can you, you know, update the assemblies that are running? But, like, at some point you have to kind of restart, depending on which functionality you touch.
31:43: So, it depends. Like, there's, and there's also, like, other parts, like, for example, you know, like, earlier this year, we reworked the settings UI.
31:52: And a lot of the settings UI, like, the pieces you actually see, as well as the facets, you know, on your home dash, they were built in Resonite.
31:59: So there's some parts of Resonite that were developed in Resonite itself, and we want to do more of it.
32:07: But, like, it's just, it's too, the question's a little bit too vague to answer it, like, you know, thoroughly.
32:15: Because, like, it, like, working in Resonite itself, inside of it, it depends. Also, like, doing everything, you know, like, I don't know what you mean by everything.
32:24: Because in some cases, everything can mean, like, you know, you go to another tool, you do stuff there, or in Resonite, and you import it in.
32:32: One thing I would probably say is, like, and it's also weird because, like, technically, if you put enough effort into things, you could do everything with ProtoFlux.
32:46: It might not run the fastest, but it is a Turing-complete language, which means, if you put the effort into it, you could technically even, like, you know, simulate a whole CPU, and you could just do everything with it, but, like, you know, it's not going to be practical.
33:02: So one thing that's going to make it a lot easier is, like, once we also have, like, you know, WebAssembly support, because in being able to import WebAssembly modules, you can write code pretty much in most major languages, compile them into WebAssembly module, bring them here, and, you know, have them do stuff.
33:20: And that can include, you know, being able to compile WebAssembly modules inside of Resonite, because at that point, you know, we can pretty much bring any code in.
33:32: So it can interact with the data model and stuff, because the WebAssembly is, you know, naturally sandboxed.
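[Editor's note: the sandboxing idea described here can be illustrated in miniature: a guest module only ever touches the host functions the host explicitly hands it. This is a hedged Python sketch of that capability-style pattern, not real WebAssembly and not Resonite's API; every name in it is made up for illustration.]

```python
# Toy illustration of the WASM-style sandbox idea: the guest module
# receives only an explicit set of host functions and cannot reach
# any other host state. Purely conceptual, not a real sandbox.

def host_run(guest, exports):
    """Run guest code, granting it only the listed host functions."""
    granted = dict(exports)
    return guest(granted)

# Host-side state the guest must never touch directly.
world = {"slot_name": "Root"}

def read_slot_name():
    return world["slot_name"]

def write_slot_name(name):
    world["slot_name"] = str(name)

# A "guest module": it sees nothing but the granted API dict.
def guest_module(api):
    old = api["read_slot_name"]()
    api["write_slot_name"](old + " (renamed)")
    return old

old_name = host_run(guest_module, {
    "read_slot_name": read_slot_name,
    "write_slot_name": write_slot_name,
})
print(old_name, "->", world["slot_name"])  # Root -> Root (renamed)
```

The point of the pattern: the host decides exactly which capabilities cross the boundary, which is what makes importing arbitrary compiled modules safe in principle.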
33:41: So it's very kind of, like, I hope, like, this kind of answers it, but if you have more kind of, like, kind of specific idea, like when, you know, like, or some examples would mean, like, you know, that could kind of help give, like, a little bit more specific answer.
33:58: Next question is from Lukikun.
34:02: I mean, it's also, it depends what you mean by support for HTML, because that can mean, like, when you say that, to me, that can mean a lot of different things.
34:12: One interpretation is, like, you know, support for HTML is we have a WebView.
34:16: So you can, like, you know, open web pages, we integrate, you know, some web browser, and that lets you, that will let you, you know, kind of, like, you know, browse web pages.
34:25: And you could call that HTML support.
34:27: The other way to have HTML support is, you know, to be, like, where, for example, there's modules in, you know, ProtoFlux that parse it.
34:37: And then you can, like, you know, do stuff, like, with that in ProtoFlux, you know, with the document object model or whatnot, and script with it or do something.
34:45: Or the other interpretation would be, like, you know, converting it to, like, native Resonite visuals, you know, like converting some HTML and rendering with UIX.
34:53: So, like, there's lots of ways to kind of interpret those questions.
34:58: So, like, usually I ask people, you know, be more specific.
35:02: Like, what exactly do you mean? Like, what kind of applications, what kind of use cases are you, like, thinking of?
35:07: And that way, like, I can answer things more specifically.
35:11: Next question is Epic Easton is asking, what do you mean by our own version of Unity?
35:16: Are you saying that it's running the same version or did you make your own Unity?
35:19: What I mean by our own version of Unity is that it's actually the version that Resonite uses.
35:25: Which is, like, 2019.4.19F, I think.
35:30: So it just means, you know, that version.
35:34: Because, like, I think it was, like, in relation to the...
35:38: It was in relation to the Gaussian Splats, because the sample project is for a much newer version of Unity.
35:45: So I had to, like, you know, make sure it works with our version.
35:49: J4 is asking, how performant is the rendering of the Splats?
35:53: So it's gonna depend a fair bit on the Splat.
35:55: The heaviest part that I found is the sorting.
35:59: But also, one of the things I do in my reference implementation...
36:03: In my, like, implementation right now, is, like, the sorting happens every frame.
36:08: You don't actually need to do it every frame.
36:09: Like, the sample project only does it, like, every few frames by default.
36:13: And I haven't implemented that part yet, so I will soon.
36:17: But even with it sorting every frame, it tends to go, like, it's at usable frame rates.
36:23: Like, even with an editor, it was, like, I think it was, like, running over 100 FPS.
36:28: Like, with the latest version I've been working on.
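[Editor's note: to make the sorting discussion concrete, here is a hedged Python sketch of the amortization being described: splats are alpha-blended, so they get drawn back-to-front, sorted by distance from the camera, but the sort result can be reused for several frames instead of recomputed every frame. The class, the interval, and all names are illustrative, not the actual implementation.]

```python
# Sketch of amortized splat sorting: sort by camera distance,
# back-to-front, but only re-sort every N frames.
import math

def sort_splats(splats, camera):
    # Farthest splat first, so nearer splats blend over it.
    return sorted(splats, key=lambda p: -math.dist(p, camera))

class SplatRenderer:
    def __init__(self, splats, sort_interval=8):
        self.splats = splats
        self.sort_interval = sort_interval  # re-sort every N frames
        self.frame = 0
        self.order = list(splats)

    def render(self, camera):
        # Amortize the expensive sort across frames.
        if self.frame % self.sort_interval == 0:
            self.order = sort_splats(self.splats, camera)
        self.frame += 1
        return self.order  # draw list, farthest first

splats = [(0, 0, 1), (0, 0, 5), (0, 0, 3)]
r = SplatRenderer(splats)
print(r.render(camera=(0, 0, 0)))  # farthest (z=5) first
```

The trade-off is visible popping if the camera moves a lot between sorts, which is why the interval is tunable.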
36:31: But it also kind of depends which part of the Splat you're viewing.
36:34: One of the things is they tend to be heavy on the overdraw.
36:37: And that's where it also depends on the particular splat you're rendering.
36:43: Because you might have, like, you know, lots of splats, like, overlapping each other.
36:50: So it depends, you know, how big they are and how many are overlapping for the particular splat you're looking at.
36:56: And that's also going to determine, you know, how heavy it is in that particular, you know, situation.
37:02: That's, like, another part, because right now, for the rendering, I'm actually not, like,
37:07: I'm just putting the data onto the GPU, like, into VRAM raw.
37:11: Pretty much, you know, how they come from the splat file.
37:13: But, you know, so I'm kind of shuffling it around, but, like, that's pretty much how it comes.
37:16: Like, it's just a bunch of floats.
37:20: One of the things I do want to implement once I have the first version working is the compression support,
37:26: which should also, like, speed up the rendering, and also, like, significantly decrease the VRAM usage.
37:32: So that's also probably going to affect things.
37:35: Overall, I would say they are heavier than, like, traditional 3D models.
37:41: But, you know, you have, like, the...
37:42: So, like, you probably don't want to have too many of them in a scene.
37:46: You can kind of think of them, you know, as kind of, like,
37:49: kind of analogous to having, like, a very heavy model in the scene.
37:51: Say you have a very heavy 3D scan, you know, that's, like, several million triangles.
37:57: You don't want to spawn too many of those because that's also going to hurt.
38:00: So the splats are going to be kind of similar, but it depends on the particular splats,
38:04: how many they have, how much overlap there is.
38:07: It tends to be, you know, very variable.
38:11: But the main thing is, like, you know, even with pretty heavy splats,
38:14: you can get them rendering at usable frame rates.
38:21: So, next question is Fuzzy Bipolar Bear.
38:23: How did you import previous scans of your food, etc.?
38:26: Those were not Gaussian splats, they were just 3D models.
38:28: They were reconstructed with, like, photogrammetry.
38:32: They're just, you know, normal triangle models.
38:36: And we already kind of support importing those.
38:39: Next question, a fun goofy question from Foxaberg.
38:42: What's the origin of the name Sauce for the custom rendering engine?
38:47: So, Geenz makes this thing called Scotch Sauce,
38:50: which is, like, you know, this kind of cooking sauce that's pretty, like, delicious.
38:57: And it's been kind of, like, you know, one of his kind of signature things when he kind of cooks things.
39:04: So I think he based it on that.
39:05: I think the other part is also it sounds like the Source engine.
39:09: So it's kind of, you know, probably, like, play on that.
39:13: It's also, the other part is also, like, I think he named it, like, around the time when I was working on ProtoFlux
39:21: because the codename for ProtoFlux was Spaghet.
39:30: And he was like, well, if you have Spaghet, you know, you need a sauce for it, you know.
39:33: So, like, it's just kind of, like, playful naming.
39:39: But also, like, the important thing, it's a codename, which means, like, it's likely not going to be named that once it actually releases.
39:47: We ended up, like, renaming Spaghet to ProtoFlux.
39:53: It just kind of sounds more professional.
39:58: So, some more stuff is going to happen to, like, you know, Sauce as well.
40:12: I'm not sure from memory, I'd have to kind of look it up.
40:16: There might be, like, anchor or interaction permissions, like, things, like, they can click on.
40:22: But I don't remember from memory, I'm sorry.
40:27: Also, what world are you in? I'm in the 2024 creator jam, like, New Year's World.
40:33: It's now a published world, so if you just search New Year's, you should be able to find it.
40:40: Fuzzy bipolar bear? Fuzzy blob? I feel called out, I don't know.
40:43: You can be represented by Gaussians.
40:46: I mean, they're, like, really good at, like, you know, representing other things, so yeah, they'd be good for that.
40:52: Projectivity IO. Can you share any plans for user inventory system rework, if any?
40:57: So, there's a bunch, really. You should be able to go on the GitHub; there's, like, an issue for the inventory rework.
41:05: I think it might have some stuff.
41:07: Just in short, though, well, just kind of, like, you know, kind of things I know from the top of my head,
41:12: is one of the big things is gonna be, you know, search.
41:16: Being able to, you know, search the items, so you don't have to, like, you know, navigate through a million folders.
41:21: People sometimes talk about this as, like, you know, a UI rework, but I feel like it's a little bit, like, what's the word?
41:33: It's a little bit inaccurate, like, misleading, because, like, it's not just reworking the UI.
41:40: For some of the parts, we do have to change the backend to how things work under the code.
41:45: You know, you can add, you know, a search field for inventory, but, like, if the system doesn't support searching, it's not going to do anything.
41:54: So there's, like, kind of a fair bit more work than just reworking the UI.
41:59: But yeah, search is going to be, like, one of those things.
42:02: The whole thing is going to be based on the data feeds, which is the mechanism that was first introduced with the settings UI.
42:10: What it will do is that, like, individual items in the inventory, the templates for it will actually build in-game.
42:16: And you'll be able to replace them with your own as well.
42:19: This will also mean you'll be able to have multiple, you know, views of the inventory.
42:23: So if you want, you know, for example,
42:29: like, you know, there's the asset anchors.
42:33: So if you want to have an inventory folder that's, like, you know, quick access on the end,
42:37: you'll be able to, like, just put one on your hand and, you know, you have, for example,
42:40: your favorite tools or materials or whatever, or, like, you know, quick storage.
42:43: So you can grab something, save it, you know, on your hand and, like, then pull it out.
42:48: So you'll be able to do stuff like that because it's just going to be a data feed.
42:52: And you'll have, like, you know, be able to have multiple views and filter them and do other things.
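[Editor's note: the "one feed, many views" idea being described can be sketched roughly like this. This is a hypothetical Python illustration of the concept: one underlying item feed, with independent search/filter views built on top of it. None of these names come from Resonite's actual data-feed API.]

```python
# Rough sketch of the data-feed idea: one underlying inventory feed,
# any number of independent filtered views over it.

class InventoryFeed:
    def __init__(self, items):
        self.items = items  # list of dicts: {"name", "kind"}

    def view(self, search=None, kind=None):
        """Produce one filtered view; many views can coexist."""
        out = self.items
        if kind is not None:
            out = [i for i in out if i["kind"] == kind]
        if search is not None:
            out = [i for i in out if search.lower() in i["name"].lower()]
        return out

feed = InventoryFeed([
    {"name": "Gradient Tool", "kind": "tool"},
    {"name": "Brick Material", "kind": "material"},
    {"name": "Glue Tool", "kind": "tool"},
])

# Two views over the same feed: a hand-mounted "tools" quick-access
# view, and a global search, both fed by the same data.
hand_view = feed.view(kind="tool")
search_view = feed.view(search="brick")
print([i["name"] for i in hand_view])    # the two tools
print([i["name"] for i in search_view])  # the material
```

Because the views are just queries over the same feed, adding search, a quick-access panel, or a custom template doesn't require changing how items are stored.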
42:57: So there's going to be other things.
42:59: It's also, like, having functionality, like, being able to, like, move and copy items and rename.
43:05: So it's going to be built into that.
43:07: I would also like to do, like, you know, a bunch of things, like, when I click on specific items,
43:11: like, for example, photos, you can preview them, you know, same thing with videos and so on.
43:15: So you get, like, a thing that pops out and says, like, you know, you have a bunch of actions you can do with that.
43:20: Maybe have, like, you know, a panel that kind of pops up and says, you know, this is an item, it has, like, you know,
43:27: this many triangles and so on.
43:33: So it's like, you know, like, there's, like, a lot of kind of, like, features as well.
43:38: And just kind of, like, reworking it in general to, like, use the new kind of modern systems for building, like, UI.
43:44: So both our team can actually make the UI in-game instead of, you know, it kind of being hard coded,
43:50: which is very painful and doesn't lead to, like, super good UI.
43:54: And also so you can, like, you know, mess with it and, like, build the function like that.
43:57: Part of the rework is also just modularizing the whole thing.
44:00: So you can, like, you know, build
44:09: your own UIs in a much easier manner, and the content team, or the art team, you know, they're gonna make, um,
44:17: they're essentially gonna make, like, you know, the actual visuals for the UI, while on the engineering side,
44:20: we focus more on the functionality.
44:22: So we're almost gonna use the same approach we use, you know, for the settings UI to do the inventory as well.
44:29: Um, this also allows, like, you know, little niceties, because you can also then feed it lots of data.
44:32: So for example, when you save an item, you know, the actual item can show, you know, this item is currently syncing
44:38: and it shows the progress in the inventory, because it'll just feed the template, you know, that data about the syncing.
44:45: So, um, there's gonna be lots of kind of like new cool features.
44:48: I'd recommend checking out, like, the inventory UI issue on GitHub; there should be information there.
44:52: I don't know if I posted one there, but, um, there should be some,
44:58: you know, some idea of, like, what's planned for it.
45:04: Next question is, uh, Edge is asking, uh, is it a fractal, basically getting smaller and smaller splats?
45:11: Well, I wouldn't call it a fractal because fractals like, um, they essentially, like they keep going no matter how much you zoom in.
45:21: With the Gaussian splats, it'll stop, you know, like there's, there's only so many Gaussians you can have.
45:28: So like, it doesn't really represent a fractal. Like, um, it's just a different kind of primitive.
45:35: Um, you could, like, you know, make something that renders a fractal using Gaussian splats, but that would need to be an extra thing.
45:42: You know, the same way, like think about it, like, you know, if we have a picture, you can have a picture of a fractal, but if it's just a picture and you zoom in, you're just going to see the pixels.
45:51: Like there's not going to be any more details because like, you know, that fractal got kind of baked in into this kind of primitive, which are in this case pixels.
45:58: The same things, you know, with Gaussians, if you represent the fractal through them, if you zoom in, you're just going to see fuzzy blobs.
46:05: There's not going to be any more detail unless you have another process.
46:09: One that, once you zoom, generates, like, you know, more Gaussians to kind of show you the detail that would have been there.
46:19: And Gaussians themselves, they're not fractals.
46:25: Next question, Aegis Wolf is asking, here's a good question.
46:28: Could you use Gaussian splats, like, as a base for regular model making from source material?
46:34: Sounds like a more realistic way of converting certain spots to a game engine or 3D-printed asset.
46:39: So they're actually not as good for making models unless you use them as a reference.
46:45: Like if you use them as a reference and you manually model things, then yes.
46:50: But the splats themselves, like, if you remember, I can actually just point at it again.
46:57: If you remember like this showcase, so this is like a really good sample of it.
47:04: The Gaussians, they don't actually need to follow the geometric model.
47:08: They just need to look right, you know, for specific angles.
47:12: So this, this is the normal, like, model, and you can see, if I look from the side, you actually, oh my god.
47:19: I'm not used to the Quest controllers, if I want to point.
47:22: You see, if I look from the side, you still have like the surface.
47:26: It's kind of rough because there's, like, so few photos, but it, it has, you know, the coherency, you know, of the model.
47:33: So that, that is, that those traditional models, they're kind of useful for that.
47:38: The Gaussian Splatting, it looks fine, but the moment you look at it from the side, you see there's no, there's no coherency.
47:44: It doesn't look anything like the object is meant to represent.
47:50: So like you couldn't like really get a model out of this because like Gaussians, by their nature, they don't necessarily represent the model.
48:00: Like they often can, like they, like, you know, the Gaussians can be in the right place for the model, but they're not guaranteed to.
48:05: It's not part, you know, of how they work.
48:08: So if you wanted to like do like 3D printing and so on, like if you want to 3D print your scan, doing traditional photogrammetry, that's probably a much better bet than using Gaussian Splatting.
48:22: So hopefully that can answer that question.
48:27: Next, let's see.
48:29: Can I happen to you?
48:31: I think that was when it was the New Year's thing.
48:37: Will issues with the current shaders be fixed, like being not able to override values for metallic smoothness after setting a metallic map?
48:44: If you make GitHub issues, like, anything, things can be fixed; we just need to, like, know about them. And sometimes we need to know enough people are interested in this for us to invest time into it, because, you know, there's lots of issues and we can only work on so many of them. So, like, some will be fixed, some won't be fixed.
49:04: Depends.
49:05: Make sure there's like a GitHub issue.
49:12: So next question from GrandUK is, after performance, what would you want to work on first?
49:18: So the question I want, I mean, there's multiple levels of want, because like, I always like want to work on lots of things for different purposes.
49:29: And there's things I want, I want to work on, because they'll be good for Resonite.
49:36: And there's things I want to work on, because they're also in a way good for Resonite, but also like, you know, they're more fun.
49:45: So it kind of depends, you know, which angle you look at it from.
49:51: If you look at it from the angle of like, you know, I want to work on things to improve Resonite for everyone.
49:57: I'm like, you know, this is what Resonite means to me, and I want it to be doing good.
50:01: And I want more people, you know, to come in and use Resonite so, you know, we can kind of grow.
50:07: I would say, like, UI is definitely on the list.
50:12: I kind of want to rework more UI, because like, I myself kind of like, you know, like ran into the issues with it.
50:19: Because I use, you know, I use Resonite a lot, and all the issues like you're having with UI, I'm having with UI.
50:25: And I do want to improve those, because it's just going to like, you know, make the quality of life easier.
50:31: And also it's going to make it more friendly to users, because that's one of the complaints, you know, we get.
50:36: The UI gets confusing, so that's one of the things I would really want us to spend time on.
50:44: The IK is probably a big one too, because I do, like, use full body a lot.
50:50: And then, you know, oftentimes have the issue like, you know, with the head moving and like it's kind of scrunching up weird.
50:55: So like, I want it to feel good when I use it myself.
51:00: But it's also, like, one of those things where it's more of a work thing, so it's like, I kinda want to work on it, but also, like, there's that work pressure on it.
51:11: So, and then it kind of makes it a bit different, like for me.
51:18: So it just depends, but, like, you know, the first things I would say are, like, these ones.
51:23: Those are, like, the things that I feel, like, you know, we need.
51:26: And I kind of want to work on them and I want to get them in as early as possible because I feel they're going to have a big impact.
51:34: One of them is also related to the UI and it's having the workshop.
51:38: Being able to, you know, essentially share any item.
51:43: Like, you know, avatars, tools, you know, like whatever gadgets like you make, you know, materials, sound effects, you know, textures, like whatever, like assets or even like entire complex objects.
51:57: Because we have a community that makes a lot of cool stuff all the time, but right now the sharing is kind of limited.
52:07: It's very, like, you know, like, you have to know a person who gives you a public folder, or, like, you know, maybe there's worlds with public folders, and it's kind of hard to discover things.
52:16: So having a workshop, I feel like would be a really big boon and I would want people to start like submitting things to it early to kind of build a big library of publicly shared stuff.
52:26: That's like searchable, that's filterable.
52:30: And I think that will also help, you know, lots of new users.
52:40: There's like, for example, you know, like sometimes when somebody comes in, they want to bring in an avatar and sometimes, you know, they have to like import one.
52:51: But, like, if we have a workshop, we can just be like, here's all the publicly available avatars.
52:56: You know, we just filter the category.
52:58: We make it easy to just click on it and be like, OK, we have it equipped now.
53:02: So I think, you know, having that will also make it make the kind of like initial flow for new users much better because it can make it easier for them to discover, you know, cool content, discover avatars, find the stuff they want.
53:16: So that's something I want to work on as well.
53:19: You know, for that purpose.
53:22: Those are like things I want to work on to like, you know, improve general kind of like, you know, usability of Resonite.
53:30: Like, one of them is going to be, you know, ProtoFlux collections support, because that's going to be huge.
53:35: That's going to like, you know, explode what you can do with it.
53:40: You know, one of the things I'm really looking forward to is having, like, you know, the audio DSP in, like, ProtoFlux.
53:48: And mesh processing, texture processing, because I really like those kinds of systems and some of the groundwork is already laid out.
53:55: It's also having, you know, the nested nodes and so on.
53:57: So there'll be like, you know, really good for people who develop stuff.
54:00: There's also like, you know, creative tools.
54:02: Like one of the things I really look forward to working on is like, you know, the terrain system.
54:08: So the terrain system that's going to be another big thing that like I think will be like a lot of fun to work on.
54:16: And it's actually similar with the particle system, because I was kind of looking forward to working on PhotonDust.
54:24: But it kind of brings it back, you know, to the thing like where it's like there's this work pressure on it, which kind of makes it a bit weirder.
54:35: You know, this is like the other thing like I really want to see is like, you know, timelines.
54:39: Actually, I might, I might do this one on camera, like to kind of show you some of the things that will be possible.
54:46: So I'm going to move here.
54:52: I'm kind of using this kind of as an opportunity to ramble, I need to clear these.
54:59: I'm not using the board, so like I have to clean it manually.
55:10: So, one more.
55:15: So one of the things I really do look forward to working on is the timeline system.
55:24: And what it will be, what it will essentially do is offer a generalized system to sequence anything.
55:31: Like pretty much anything in Resonite.
55:33: It's going to be built around, you know, the same core primitives that you already can access with Inspector,
55:39: that you can access with ProtoFlux.
55:42: And I think it's going to make it a really powerful mechanism that's just going to integrate really, really well with lots of different systems and open lots of new possibilities.
55:51: And the way it's going to work, like in general, let me make sure I'm like on the camera, is you'll be able to like, you know, create a timeline.
55:59: Let me actually unplug my headset because I'm kind of getting caught in the cable.
56:04: So I'm just going to draw a timeline.
56:06: So say like you create a timeline and it's like going to be like this object, it's going to have a start, you know, it's going to be some marks.
56:15: And what you can think of as a timeline is just, it's just a general, you know, it's concept.
56:23: You have like, you know, you have points in time and you can put things to happen or sync, you know, with a point in time and then you can play it back.
56:30: So for example, so you have like, you know, you have your play cursor here and you can play it, you know,
56:36: it plays, it kind of goes through the timeline and it can, you know, seek and move around, you know.
56:41: And that gives you information where you essentially have like, you know, this is the current time and this is, you know, the current speed of playback.
56:50: And it can be fed into lots of other systems.
56:54: The other systems also going to say, I'm at this point in the timeline, you know, for example, you know, like you're going to have a thing and it's going to be here.
57:05: And this is going to be, whatever this is, it's going to be active, you know, when the time cursor is here.
57:12: And then you're going to have like, you know, a position within this, you know, that's depending on where the cursor is relative to this item, wherever it is on timeline.
57:23: And the real power will be that like, you know, this can be anything.
57:28: To give you an example, say you want to animate, you know, some colors in the world.
57:34: Can be like a gradient. So you put a gradient on the timeline and then you have like, you know, so you have like an item in the world and you have like, you know, like you have a drive for its color.
57:45: I'm kind of grabbing it wrong.
57:47: And say, like, you know, you have another node so you can, or module, like, you know, there's probably multiple mechanisms, but let's think of it as a node.
57:56: So we have a gradient. This is, you know, some kind of color.
58:00: We're gonna, you know, just do whatever.
58:03: Like, imagine there's more colors in this than just yellow.
58:06: You plop this on the timeline and then like you have like a thing which actually feeds this and gives you whatever color is at the current playback.
58:17: And then you're going to drive a color, you know, of something.
58:20: And then you can, for example, say I want to drive a size of something or, you know, so you can have another thing and say the size is one here and it's going to be two over here.
58:29: And then, you know, you're going to feed this to another thing.
58:33: So, you know, you get the size, you feed it like, you know, you drive it.
58:38: And as the timer kind of goes, you know, it's going to change, you know, whatever values you get for here and you can map that to size.
58:47: And either, you know, there's going to be a component which maps it directly.
58:51: So like, you know, you don't need to like, you know, drive it specifically, just say like drive this part.
58:56: Or, you know, you can sample it with ProtoFlux. You can do extra stuff.
59:00: I didn't expect that to happen.
59:02: But, you know, you can, you can sample it and do whatever extra stuff with it.
59:05: Like, whatever processing, whatever mechanisms you want to do.
59:10: So that's like one of those things you'll be able to do.
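[Editor's note: the whiteboard example above — a value that is 1 at one point on the timeline and 2 at another, sampled at the play cursor and used to drive a property — can be sketched like this. Purely illustrative Python under assumed names; this is not Resonite's actual timeline design.]

```python
# Minimal sketch of the timeline idea: a track occupies a span of
# the timeline, is sampled at the play cursor, and the sampled
# value drives some property (size, color, etc.).

def lerp(a, b, t):
    return a + (b - a) * t

class Track:
    """A value that exists between `start` and `end` on the timeline."""
    def __init__(self, start, end, v0, v1):
        self.start, self.end = start, end
        self.v0, self.v1 = v0, v1

    def sample(self, time):
        if not (self.start <= time <= self.end):
            return None  # cursor is outside this track
        t = (time - self.start) / (self.end - self.start)
        return lerp(self.v0, self.v1, t)

# A size track: value 1.0 at t=0, value 2.0 at t=4.
size_track = Track(0.0, 4.0, 1.0, 2.0)

cursor = 2.0                      # current playback position
size = size_track.sample(cursor)  # would drive the object's scale
print(size)                       # 1.5
```

A gradient track works the same way, just interpolating colors instead of a scalar; the key design point is that the timeline only produces values, and anything drivable can consume them.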
59:14: If it works, you can undo things like this.
59:17: Here we go.
59:19: This can also be, you know, for example, audio.
59:22: So you can put like, you know, an audio here.
59:29: And then you're going to have something where you have like, you know, an audio output.
59:32: I'm just going to represent it with this speaker.
59:38: And you're going to have a module which, you know, feeds the, it blends these audio sources together.
59:45: And, you know, and feeds it, you know, into your audio output.
59:50: So it's going to, you know, play and blend whatever audio you have there.
59:55: So like, if you want, you know, audio to happen in synchronization with some value changes, you'll be able to do that.
01:00:00: You can place it on the same timeline.
01:00:04: So you can have like, you know, audio clips.
01:00:06: One of the things you'll also be able to do is, you know, if you have this, is, you know, it can be like just an output in the world.
01:00:15: Maybe, you know, it's like you're making like a virtual experience.
01:00:18: You know, you want sounds to happen at particular times, but also, you could just use this for audio compositing.
01:00:25: So we could have tools where if you have a timeline with audio clips on it, you're going to say, I want to render this out into single audio clip and you essentially bake it.
01:00:35: And it's going to render out a new audio clip.
01:00:38: That's, you know, literally whatever this is.
01:00:41: So you could essentially use Resonite for audio editing.
01:00:45: Because with those clips, you know, you'll be also able to like, you know, the more powerful the modules that go on the timeline, the more you'll be able to do with them.
01:00:52: So you could, for example, say this can be different sizes.
01:00:56: You know, I want to stretch this audio effect or maybe I want to like, you know, cut it.
01:01:00: So it's only part of the sound effects.
01:01:02: So you could, you could use the timeline not to drive something in the world, but use it as a workflow tool to, you know, do audio editing.
01:01:11: And, you know, bake audio and you can then export this, you know, use it even outside of Resonite or use it within Resonite.
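[Editor's note: the "bake it into a single audio clip" idea reduces to mixing: each clip sits at an offset on the timeline, overlapping samples are summed, and the result is one rendered buffer. This is a hedged Python sketch of that reduction with made-up names, not the actual renderer.]

```python
# Sketch of baking a timeline of audio clips into one clip:
# place each clip at a sample offset, sum where they overlap.

def bake_timeline(clips, length):
    """clips: list of (offset_in_samples, samples). Returns the mix."""
    out = [0.0] * length
    for offset, samples in clips:
        for i, s in enumerate(samples):
            if 0 <= offset + i < length:
                out[offset + i] += s  # blend by summing
    return out

# Two short "clips": one at the start, one overlapping it.
clip_a = (0, [0.5, 0.5, 0.5, 0.5])
clip_b = (2, [0.25, 0.25, 0.25])

mixed = bake_timeline([clip_a, clip_b], length=6)
print(mixed)  # [0.5, 0.5, 0.75, 0.75, 0.25, 0.0]
```

A real bake would also handle stretching, trimming, and clipping/normalization, but the core operation is this offset-and-sum.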
01:01:17: The main goal is like, you know, make the tool as powerful as it can be and as versatile as it can be.
01:01:24: The other thing, you know, with the timeline system, the idea of the approach is, like, whatever can be time-based, you can place it on the timeline and you can use it to drive whatever things you want.
01:01:39: You know, another thing you could place on the timeline is, actually, let me back up a little bit.
01:01:46: I'm going to mention another thing, audio thing, because you might have, you know, multiple audios, but also you're going to say, you know, this track is, you know, this track goes to this.
01:01:59: And maybe you have another speaker here in the world and you're going to say this track goes to this.
01:02:06: So when this whole timeline plays, when it gets to this point, it's going to play sound out of this one.
01:02:13: And once it gets to this point, it's going to play this sound out of this one.
01:02:15: So you can use it, you know, to synchronize multiple sources as well.
01:02:22: You might have, you know, not just audio clips, but also video clips.
01:02:26: Maybe you want to, you know, synchronize video there.
01:02:28: And once we have, like, you know, better video decoding, maybe you'll be able to, like, you know, composite video.
01:02:34: Like, you know, so you can use it for video editing.
01:02:37: It's probably going to be further down the line because it requires better, like, video kind of support.
01:02:41: But it's, you know, by making the timeline a very generic construct, it'll open those possibilities, you know.
01:02:48: The more functionality comes into Resonite itself, and into Froox Engine itself, the more it'll kind of resonate with the timeline.
01:02:58: And the more things you'll be able to do with the timeline.
01:03:03: These things on the timeline, you know, they can also be animations.
01:03:05: For example, say you have, like, you know, you have a holo rig.
01:03:10: So you have, like, a rig here.
01:03:13: This thing.
01:03:14: And then you have, like, you know, mocap animation.
01:03:16: You know, and this is already kind of, like, animation within itself.
01:03:21: And you're gonna say, at this point, this is gonna drive this thing.
01:03:26: So, you know, we can place, you can place animations there.
01:03:29: Like, you know, they're kind of baked animations.
01:03:31: And what I think is gonna be even more powerful is you'll be able to, like, you know, take a whole timeline, which has a bunch of stuff.
01:03:38: And say, this whole timeline is its own thing.
01:03:48: So, like, I have, like, a timeline, you know, that has a bunch of stuff.
01:03:52: I'm gonna put this timeline into another timeline, and then, you know, maybe I'll repeat it, you know, and do other things with it.
01:03:58: Or maybe I'll put, like, just part of it.
01:04:00: And then, like, use it to, you know, drive other stuff.
01:04:03: So you'll be able to, like, you know, nest the timelines.
01:04:06: And do, like, lots of stuff that way.
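The nesting described above boils down to resolving child timelines with a time offset; here is a minimal sketch with hypothetical names:

```python
# Illustrative sketch of nesting one timeline inside another, with a local
# time offset and repetition. Not actual Resonite code.
class Event:
    def __init__(self, name, time):
        self.name, self.time = name, time

class Timeline:
    def __init__(self):
        self.items = []  # (offset, child) where child is an Event or Timeline

    def add(self, offset, child):
        self.items.append((offset, child))

    def flatten(self, base=0.0):
        """Resolve nested timelines into a flat, sorted list of (time, name)."""
        out = []
        for offset, child in self.items:
            if isinstance(child, Timeline):
                out.extend(child.flatten(base + offset))
            else:
                out.append((base + offset + child.time, child.name))
        return sorted(out)

inner = Timeline()
inner.add(0.0, Event("kick", 0.0))
inner.add(0.0, Event("snare", 0.5))

outer = Timeline()
outer.add(0.0, inner)   # play the inner timeline once at the start...
outer.add(1.0, inner)   # ...and repeat the whole thing again at t=1.0
print(outer.flatten())  # [(0.0, 'kick'), (0.5, 'snare'), (1.0, 'kick'), (1.5, 'snare')]
```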
01:04:08: The other thing that's also gonna be powerful with timelines are sort of events.
01:04:13: So you could, for example, say, I'm gonna put an event at this point.
01:04:19: And say you have, like, ProtoFlux.
01:04:23: And this event is gonna send, you know, an impulse.
01:04:27: So, like, once the cursor reaches here, it's gonna send an impulse, and it's gonna trigger whatever you want.
01:04:32: So you can, you know, for example, I don't know, trigger a particle effect.
01:04:36: You know, or whatever mechanism you want.
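The event idea above, firing an impulse once the cursor passes a marker, can be sketched as a cursor sweep that fires callbacks exactly once; all names here are hypothetical:

```python
# Rough sketch of timeline events firing an "impulse" (here a callback)
# when the playback cursor sweeps over them.
class TimelineEvents:
    def __init__(self):
        self.events = []  # (time, callback)
        self.cursor = 0.0

    def add_event(self, time, callback):
        self.events.append((time, callback))

    def advance(self, dt):
        """Move the cursor forward and fire every event it passes over."""
        start, self.cursor = self.cursor, self.cursor + dt
        for t, cb in self.events:
            if start <= t < self.cursor:
                cb()

fired = []
tl = TimelineEvents()
tl.add_event(1.0, lambda: fired.append("spawn_particles"))
tl.advance(0.5)   # cursor at 0.5: nothing fires yet
tl.advance(1.0)   # cursor sweeps 0.5 -> 1.5, crossing the event at t=1.0
print(fired)      # ['spawn_particles']
```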
01:04:41: The whole timeline, you know, it's not gonna be tied into any specific, you know, system.
01:04:49: It's gonna be a general construct.
01:04:51: And the goal is to make it like, you know, so it kind of integrates naturally with lots of systems in Resonite.
01:04:57: Anything where you want something to change over time.
01:05:01: Once we, the other example is, you know, once we, for example, have the audio DSP.
01:05:07: You could, you know, have, you could have like a system, say, for example, you know, a node that's like, you know, accepting a bunch of inputs.
01:05:16: And it's like, you know, it's doing some kind of signal thing.
01:05:20: You could have a timeline with, you know, a sequence of values, and maybe within those, if we open this up, this could be, like, a curve or something, maybe a bezier.
01:05:35: You know, maybe there's one here, and then you have, like, a bunch more modules here to place, and, you know, they do stuff.
01:05:44: And then, like, there's another one here, another one here, and you say this plugs here, this plugs here, this plugs here.
01:05:52: And as the audio DSP is, you know, generating audio, it's actually sampling the timeline for its inputs.
01:06:00: So you can use it as a sequencer for making music or making sound effects.
01:06:06: The goal is to have the system be super versatile, you know, be able to plug pretty much into anything the way, you know, ProtoFlux plugs into anything, the way inspectors plug into anything.
01:06:22: Anytime you have a value that you want, you know, change over time, you can sequence it on the timeline and then, you know, wire it into whatever you need to do and create like, you know, lots of kind of complex behavior.
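Sampling a sequenced value lane continuously, the way a DSP node would, can be sketched as simple keyframe interpolation; this uses linear segments where the talk mentions curves like beziers, and all names are illustrative:

```python
# A minimal keyframe curve sampler: a timeline "lane" of (time, value)
# keys, sampled at any point in time with linear interpolation.
def sample_curve(keys, t):
    """keys: sorted list of (time, value); holds the end values outside the range."""
    if t <= keys[0][0]:
        return keys[0][1]
    if t >= keys[-1][0]:
        return keys[-1][1]
    for (t0, v0), (t1, v1) in zip(keys, keys[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)
            return v0 + (v1 - v0) * f

# A made-up "filter cutoff" automation lane: ramps 0 -> 1 over a second, then back.
lane = [(0.0, 0.0), (1.0, 1.0), (2.0, 0.0)]
print(sample_curve(lane, 0.5))   # 0.5
print(sample_curve(lane, 1.5))   # 0.5
print(sample_curve(lane, 3.0))   # 0.0 (held after the last key)
```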
01:06:33: So I think once we have this mechanism, it'll open up like lots of really cool things.
01:06:39: Like you could, for example, make an entire world, you know, just like music synchronization.
01:06:44: And what do you do? You just undo all of this.
01:06:50: I could really use the clear button. Oh, I don't have enough undo steps, so now I have to do this manually.
01:06:59: Let me get rid of this.
01:07:03: So say like, you know, you just plop an audio track, you know, and you have audio here and then like, you know, you sequence whatever you want to happen, you know, whatever stuff you want to drive.
01:07:14: You just, you know, sequence it with it using the timeline and plop whatever things.
01:07:19: And now it's going to be perfectly synchronized, you know, with the music.
01:07:23: Or maybe instead of music, this is a video and I use this, you know, to make meta memes.
01:07:27: You just say you want, we want this thing to activate, we want this thing to change.
01:07:33: Just sequence it. It's going to make stuff like that, like way easier to do.
01:07:39: You can use it as a workflow tool, you know, do audio editing, you know, clip the audio clips, arrange them, blend them.
01:07:47: And then like, you know, render it into an audio clip. Stuff like that.
01:07:53: The goal, like I said, is to make everything that's in Resonite as generic as possible.
01:07:58: So like, it both integrates with systems and lots of other systems can like, you know, build upon it.
01:08:06: So that's one of the things I really want to work on at some point, because I think it's just going to, you know, explode.
01:08:13: The number of things you can do with Resonite.
01:08:20: There's another... I was going to mention...
01:08:24: I was going to mention something that kind of slipped my mind right now.
01:08:30: But yeah, this is... I think this is going to be really powerful.
01:08:33: Oh, and remember, one of the things I would like to see...
01:08:36: Like, once this is in, one thing I'd like to see come in is, you know, people actually making...
01:08:41: People used to make, like, you know, these kind of stop-motion things.
01:08:43: Like, say we can build tooling to also help, you know, work with the timeline.
01:08:50: So you don't have to just, you know, work with the timeline manually sequencing things, but say...
01:08:54: Say you have like, you know, an avatar in the world, you know, so like a stick figure.
01:08:58: And there used to be like stick figure animations.
01:09:00: So you can have tooling which says, put the current state, you know, like I'm here on the timeline right now.
01:09:07: Put the state here and it's gonna, you know, it's gonna make a keyframe.
01:09:11: And then you're gonna move it here and you know, you're gonna like move this, you know, I didn't make it enough.
01:09:16: But like, you know, imagine his hand moved.
01:09:18: Put a keyframe here, you know, and you just kind of like do animation step by step.
01:09:22: And then you can, you know, play it back.
01:09:24: So we can make tools that let you, you know, do stuff in the world and then capture the state into the timeline.
01:09:32: So literally make the changes you want to happen and then make keyframes.
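The capture-state tooling described above boils down to snapshotting the current state into keyframes, then holding the latest one on playback; here is a minimal sketch with hypothetical names:

```python
# Sketch of "capture current state into the timeline": each capture stores
# a snapshot keyframe; step playback holds the most recent keyframe.
class KeyframeRecorder:
    def __init__(self):
        self.keyframes = []  # (time, snapshot dict)

    def capture(self, time, state):
        self.keyframes.append((time, dict(state)))
        self.keyframes.sort(key=lambda kv: kv[0])

    def playback(self, t):
        """Step playback: hold the most recent keyframe at or before t."""
        current = self.keyframes[0][1]
        for time, snap in self.keyframes:
            if time <= t:
                current = snap
        return current

rec = KeyframeRecorder()
rec.capture(0.0, {"hand_x": 0.0})
rec.capture(1.0, {"hand_x": 0.3})  # moved the hand, captured another frame
print(rec.playback(0.5))  # {'hand_x': 0.0}
print(rec.playback(1.2))  # {'hand_x': 0.3}
```

Holding the last keyframe gives exactly the stepped, stop-motion look described; smooth animation would interpolate between snapshots instead.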
01:09:36: And one thing I would love to see is like, you know, people making sort of like these stop-motion animations.
01:09:42: Like, you know, like the classic stick figure ones.
01:09:43: But now in VR, you know, with avatars and everything.
01:09:46: So like, and that's one of the reasons I really want to work on this because I think like, you know, that can be a lot of fun for a lot of people.
01:09:55: Like trying to make it not just, you know, a social platform, but something people use to make cool content that, you know, exists outside of Resonite.
01:10:08: So that's, I've kind of used your question for another thing I really wanted to talk about because this is one of the big things I really look forward to like implementing at some point because I think it's going to be big for Resonite.
01:10:22: There's a bunch of others too.
01:10:23: So, like, I should probably get to the other questions, but there's a lot of stuff I'm kind of working on, and there's stuff I want to work on, you know, for different reasons.
01:10:35: There's also like stuff that I just want to work on like for personal things.
01:10:42: I don't think I can remember them all. Like, one thing I do want to add is face tracking for desktop, because I want to, like, you know, have some face tracking there as well.
01:10:49: But that's one of the smaller things.
01:10:51: Actually, I do, I do... oh, I mean, there's a bunch of others, now that I think about it.
01:10:57: But if you're going on the GitHub, there's a project board called Froox Feel Good Issues and there's a bunch of them.
01:11:05: Like one of them, one of them that's a smaller one, but also that I think is going to be really powerful are spatial variables.
01:11:12: And it's actually something where you can define variables that change in space.
01:11:18: So you can, for example, say like there's an area and the variables, this value here and this value here, and then you can sample it at any point in space.
01:11:26: And you have, like, you know, helper modules, so you can just derive a value based on where the object is in 3D space.
01:11:32: And I think that's going to be a really powerful mechanism that's going to enable you to do lots of cool things.
01:11:36: And I really want to work on that one.
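A toy version of the spatial variables idea above: anchor values at points in space, then sample a blended value anywhere. Inverse-distance weighting is used here as an illustrative stand-in; the talk does not specify the actual interpolation scheme.

```python
# Sample a "spatial variable" at any point by blending anchored values,
# weighted by inverse squared distance. Purely illustrative.
import math

def sample_spatial(anchors, p):
    """anchors: list of ((x, y, z), value). Returns a distance-weighted blend."""
    weights, total = [], 0.0
    for pos, value in anchors:
        d = math.dist(pos, p)
        if d == 0.0:
            return value  # exactly on an anchor point
        w = 1.0 / (d * d)
        weights.append((w, value))
        total += w
    return sum(w * v for w, v in weights) / total

# "This value here and this value here," then sample anywhere in between.
anchors = [((0, 0, 0), 0.0), ((10, 0, 0), 1.0)]
print(sample_spatial(anchors, (0, 0, 0)))  # 0.0 (on the first anchor)
print(sample_spatial(anchors, (5, 0, 0)))  # 0.5 (halfway between them)
```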
01:11:42: There's a lot of things, it's kind of, there's a lot of things. I do recommend checking out that board.
01:11:51: I need to update it because I didn't really get to work on many things on that.
01:11:55: But the Gaussian Splats are kind of like one of those things that's kind of like, you know, more fun thing.
01:12:01: There's a few reasons, but, like, the main reason is it's just sort of a mental health break project.
01:12:08: So there's, you know, projects I want to work on like that.
01:12:11: There's also like projects in Resonite.
01:12:13: I really do want to get into, like, making some of the museum worlds to publish some of my scans.
01:12:18: And the Gaussian Splatting is kind of aligned with that.
01:12:21: So like I'd want to put work into those two.
01:12:23: So yeah, I hope, I hope like, I don't know if I'm drawing for this one, but it's also kind of part of these themes.
01:12:31: I hope that gave you some idea on some of the things that are going to be coming in the future.
01:12:36: And also Granuki is asking, wait, WebAssembly support in Resonite?
01:12:39: Yes, that's actually, that's the main plan now.
01:12:43: I've been kind of watching WebAssembly for a while, and it's pretty much reached a point where I'm like, yeah, this is it. Like, for a while I was thinking about, you know, traditional scripting in Resonite.
01:12:51: Maybe we'll integrate Lua, but I don't really like Lua. I was thinking Python, but Python is harder to integrate. And maybe C#, but that's also harder to, like, sandbox.
01:13:02: And, like, WebAssembly has pretty much reached a point where it is usable outside of the web and is meant to be used outside of the web.
01:13:12: It's designed to be secure, but it's like, what's really powerful is you can compile existing code into it.
01:13:18: Like you can take a library written in C or C++ and just, you know, compile it to WebAssembly and then like, you know, glue it with other stuff.
01:13:27: And like, you can now use all the existing code.
01:13:30: You could take, you know, emulators that are already coded and compile them into WebAssembly and then, like, you know, just integrate them with whatever you want.
01:13:40: So that's going to open up a lot of, like, really cool possibilities as well.
01:13:43: And this is actually one of the things I do want to work on too, but it's not a big one.
01:13:48: But yes, WebAssembly is how I want Resonite to support lots of other languages because you can use, you know, C and C++.
01:13:58: You can use, you know, like say like you want to run Python or Lua, you can compile, you know, Lua or Python like, you know, runtime into WebAssembly and then use that to run it.
01:14:11: There's lots of other languages that compile to it as well, you know.
01:14:15: There's, you know, also C# ones, so there's possibilities there, and the tooling, you know, keeps growing.
01:14:25: Next question is Clothosin.
01:14:27: With ProtoFlux, is there going to be an update where you hover or select the ProtoFlux node and there's information about it?
01:14:33: I mean, we don't have like super specific plans, but we do want to integrate documentation with Resonite so it's kind of accessible.
01:14:39: And that would be like, you know, a good way like where it can integrate into the component or like node browser so you can like easily access information.
01:14:47: So maybe eventually. No super specific plan, but generally, yes.
01:14:54: Next question, Granuke.
01:14:55: Are there any plans to support authorization and headers with HTTP GET/POST and WebSocket in a secure way for secrets?
01:15:03: So this one's a little bit tricky, because we cannot put these secrets in the world. It needs to be done in a certain way.
01:15:10: Where, you know, other people cannot abuse it, because you don't want your secrets to leak and you don't want other users to be able to, you know, abuse it to get access to secure stuff.
01:15:25: So one of the things that might be done for that is, like, the secrets are gonna be kept in userspace, and you're gonna have, you know, some kind of UI to maybe manage them or something.
01:15:39: And whatever you spawn will track whether it was, you know, spawned by you.
01:15:47: And maybe, you know, add mechanisms so the secure parts only run locally, so other people cannot mess with it.
01:15:52: So there's like general idea like how to handle stuff like that, but it requires extra care because of the potential for abuse and security holes.
01:16:05: Next question is, CanLVR... oh, thank you for the donation, by the way.
01:16:10: Got any new worlds to showcase? I don't actually have like, well, this one's kind of New Year, but this one's from the last New Year.
01:16:16: I don't, I haven't like, looked at many worlds like recently. I've been kind of like busy all the time.
01:16:22: So I don't have any to showcase, unfortunately.
01:16:27: Next question from, let me check the time.
01:16:33: Shenzao, are there plans to provide support for people to write their own fragment or vertex shaders down the line,
01:16:38: or enable interoperability with signed distance fields and things like Gaussian Splatting, and break away from traditional triangle meshes?
01:16:45: So you will have the ability to make your own shaders, probably initially with ProtoFlux.
01:16:53: Once we switch to the new graphics engine, because Unity is very limited.
01:16:57: We cannot really compile and upload new shaders at runtime, at least not in a sane way.
01:17:09: You might not be able to write specifically fragment or vertex shaders, because those are very specific,
01:17:16: and one of the things that's really important to us in Resonite is long-term compatibility.
01:17:24: And the shaders are probably going to be abstracted enough to make sure we can ensure that long-term compatibility.
01:17:33: The reason for it is actually like the new engine, right now it's planned to use meshlets, which means there's actually no vertex shader.
01:17:44: What it means is, if we let you write specifically vertex shaders, either one, any shaders you would write would break once we switch the engine,
01:17:57: which is something we don't want to happen, or we would not be able to switch to a new engine,
01:18:03: which means we'd be effectively stuck with development, which we also don't want.
01:18:10: So if we provide a more abstracted way, for example to modify vertex data, that's something we can wrap around and make it work with whatever pipeline is in place.
01:18:24: Probably there's going to be an abstraction layer, very likely, but you'll be able to do that kind of stuff eventually.
01:18:33: So hopefully that answers the question. This is going to be part of the general support for making shaders.
01:18:40: Next question, from Navy3001.
01:18:49: I mean, people use it. I don't really use it myself because I just use full body.
01:18:56: I don't know what you mean by using some of it for Resonite, like that'd be a very different kind of licensing, like if you mean integrating the code.
01:19:03: But I don't see the need for that, like we probably need to license it with a creator and it would be way more expensive than a standard license.
01:19:12: Assuming they're even willing to do something like that, but even like, I don't know the reasoning because people can just buy it and just use it.
01:19:22: Like it doesn't need to be like integrated like with Resonite.
01:19:30: snb8272, question, about when we'll be getting easier access to linear curves on fading colors alpha or color over lifetime.
01:19:37: So if you mean like curves, that requires the inspector UI to be reworked, because we need to actually make UI for working with curves, so probably sometime around then.
01:19:50: The inspectors, they're going to be reworked also with the data feeds, so it'll make it easier to make more modules, specifically ones for working with curves and so on.
01:19:59: The curves themselves are probably going to be data feeds too, so sometime around then is when most of that will happen.
01:20:09: Similar with the colors, you know, like it needs like a gradient editor, which is its own like unique piece of UI.
01:20:16: One thing that you can do with PhotonDust is you can actually use textures.
01:20:22: So if you already bake your transition into a texture, you can, you know, plug that in and it can work.
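The texture workaround described above amounts to precomputing the transition into a lookup table. A rough sketch of that baking step, with made-up names and a tiny table width for readability:

```python
# Bake a value-over-lifetime gradient into a small 1D lookup table
# (standing in for a texture), then sample it by normalized age.
def bake_gradient(stops, width=8):
    """stops: sorted (t, value) pairs in [0, 1]; returns `width` baked samples."""
    baked = []
    for i in range(width):
        t = i / (width - 1)
        v = stops[-1][1]  # default: hold the last stop
        for (t0, v0), (t1, v1) in zip(stops, stops[1:]):
            if t0 <= t <= t1:
                f = (t - t0) / (t1 - t0) if t1 > t0 else 0.0
                v = v0 + (v1 - v0) * f  # linear blend between stops
                break
        baked.append(v)
    return baked

# Alpha fading from opaque to transparent over a particle's lifetime.
alpha_tex = bake_gradient([(0.0, 1.0), (1.0, 0.0)])
print(alpha_tex[0], alpha_tex[-1])  # 1.0 0.0
```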
01:20:47: Yes, pretty much. Like, you know, for the common languages we'll need to provide
01:20:55: sort of, like, pretty much a reference library, which is, you know, a bunch of definitions of API functions and so on.
01:21:01: So you can just include that and make sure, you know, whatever code you write can, you know, interop with Resonite.
01:21:15: We might like do ones, you know, for common languages, you know, maybe like C, C++, Rust, but leave the rest of them, you know, for the community.
01:21:27: So, like, this is probably going to be something where, you know, we even make those open.
01:21:31: People can kind of look at them and use them as a reference.
01:21:34: And if you want support for more languages, you know, we'll probably need the community's help.
01:21:39: So for example, if somebody makes a Python integration, they'll have to, like, you know, make the Python side as well.
01:21:46: C or C++.
01:21:49: And Grant Decays asking, how much do you think collections will cause a wave of amazing creations?
01:21:55: A lot.
01:21:56: I think, I think it's one of those things like where people don't even realize, you know, some of this stuff.
01:22:01: Like, let me actually give you an example.
01:22:03: Like, you know, the brush I'm using, if I can, I don't think I can grab it from over there.
01:22:10: Let me go for the brush.
01:22:13: Where's the brush?
01:22:18: So one thing you might not even realize, let me just clear this up.
01:22:23: The way brushes work, it's a really, like, I'm kind of proud of like, you know, how the whole system is designed.
01:22:35: Let me delete things.
01:22:37: So whenever you use a brush, what a brush actually does under the hood.
01:22:45: It adds a new procedural mesh.
01:22:49: So like, this is literally just a procedural mesh.
01:22:51: And what the brush is doing, it just spawns it in the world.
01:22:54: But then as long as you keep pressing trigger, the procedural mesh, it has an array of points, potentially colors and other things.
01:23:03: And what the brush is doing is just adding new points to it.
01:23:07: And the mesh is like, you know, I've been modified, I'm going to update my own geometry.
01:23:12: So like the brush, it has no, like, it doesn't really have much idea about the brush stroke.
01:23:17: All it's doing is just adding new coordinates, you know, based on where the brush currently is.
01:23:23: So it's technically two systems that are mostly independent from each other.
01:23:28: It's the procedural mesh system, particularly this type of procedural mesh, and another system that's like, you know, just adding data to it.
01:23:36: And it doesn't know how, and this is how the brush kind of like works under the hood.
01:23:39: But the brush doesn't need to be the only thing, you know, it's, it's literally just adding elements to the array.
01:23:46: But like, once you have like, you know, collections for the ProtoFlux, you can, you can, you know, do stuff with this array.
01:23:52: Like you'll be able to like, you know, for example, you know, just use this mesh and like procedurally generate whatever visuals you want, you know, like you want to generate like a spiral.
01:24:03: So just make an algorithm that just adds a point to make a spiral and you get a spiral out of this.
01:24:10: Because you can just, you know, use the raw data tool and make your own brush.
01:24:17: You know, like, like just completely bypass the system if you like, if you want to, or you could make tools that, you know, for example, take this and do some processing to it.
01:24:25: Maybe, you know, they will smooth it out.
01:24:27: Maybe they'll, you know, process it. Like, you'll make a processing thing that just iterates over all the points and, you know, does something to each of the points.
01:24:40: And you'll be able to make tools like that.
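The point-array manipulation described above, procedurally generating a spiral and then running a smoothing pass over the points, could look roughly like this; the plain list of points is an illustrative stand-in for the procedural mesh's point array, not Resonite's actual data model:

```python
# Procedurally append spiral points (like a brush adding coordinates),
# then run one smoothing pass over them (like a point-processing tool).
import math

def spiral_points(turns=2, steps_per_turn=16, growth=0.1):
    """Archimedean spiral: radius grows linearly with the angle."""
    pts = []
    for i in range(turns * steps_per_turn + 1):
        a = 2 * math.pi * i / steps_per_turn
        r = growth * a
        pts.append((r * math.cos(a), r * math.sin(a)))
    return pts

def smooth(points):
    """One pass of neighbor averaging; endpoints are kept as-is."""
    out = [points[0]]
    for prev, cur, nxt in zip(points, points[1:], points[2:]):
        out.append(tuple((p + c + n) / 3 for p, c, n in zip(prev, cur, nxt)))
    out.append(points[-1])
    return out

stroke = smooth(spiral_points())
print(len(stroke))  # same number of points, now smoothed
```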
01:24:42: But also, you know, it's a prerequisite for lots of other functionality.
01:24:46: I'm going to move back now.
01:24:49: Weee.
01:24:51: Hello.
01:24:53: Weee.
01:24:54: Here we go.
01:24:56: You know, there's, there's lots of other functionality, like, like one that comes very often, JSON parsing.
01:25:03: Because when you parse JSON, you essentially get a collection.
01:25:08: You get, you know, like a dictionary of, you know, tokens, like, you know, keys.
01:25:13: And then, like, you know, those can be further collections, you know.
01:25:17: So that on its own, I think, is going to open up a lot of options and other stuff.
01:25:21: Like right now we have the raycast one node.
01:25:24: So you can like, you know, make a raycast, but like it gives you whatever is the first thing, which makes things difficult because sometimes you want to filter things.
01:25:34: So it's, what's the word, like it's sort of like a gateway for lots of other features.
01:25:45: Like it's going to open, like it's not only going to like create like possibility for lots of cool creations, but also it's going to open up doors for lots of features that we're not able to do properly, you know, until it's in place.
01:25:58: So once we have it in, we'll add, like, a JSON node. Like, you just plug JSON in and you get, you know, parsed output, and you can just enumerate the results.
01:26:06: Same way, you know, in the opposite direction: you construct a collection of things and you'll be able to, like, encode it to JSON, so it's going to work the other way as well.
01:26:14: With the raycast node that I was mentioning earlier, you'll be able to get a list of all hits and you can just iterate over them, find the hit you want, you know, and you'll have like a lot more flexibility that way.
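Here is a small sketch of both collection-powered workflows mentioned above: enumerating a parsed JSON document, and filtering a multi-hit raycast result. The hit records are made-up stand-ins, not a real Resonite node's output:

```python
# Collections as the common currency: parsed JSON is a collection, and a
# multi-hit raycast result is a collection you can filter and reduce.
import json

doc = json.loads('{"name": "spiral", "points": 33, "tags": ["brush", "demo"]}')
for key, value in doc.items():   # a parsed document is just a collection
    print(key, value)

# Pretend multi-hit raycast result: (distance, tag) per hit, nearest first.
hits = [(1.2, "glass"), (2.5, "player"), (4.0, "wall")]
solid = [h for h in hits if h[1] != "glass"]   # filter out what we ignore
nearest = min(solid, key=lambda h: h[0])       # then pick the nearest hit
print(nearest)  # (2.5, 'player')
```

With only a "first hit" primitive, the glass pane would have swallowed this raycast; getting the whole collection back is what makes the filtering possible.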
01:26:27: Pretty much same thing, like, you know, like for example, working with slots or once we add component access, that also gives you collections.
01:26:34: Like, lots of things are essentially collections, and once you have the native support, you'll be able to interact with those.
01:26:42: So I think that is, like, one of those things that's just going to make things explode.
01:26:48: And also in general, like if you're making your own code, oftentimes collections can be a really good way to make code.
01:26:54: Like, you know, like implement certain things because you work on collections of things internally, even if those collections don't go into the data model.
01:27:03: So it's just going to make, you know, programming a lot of things like way easier than it is now and more performant.
01:27:10: So yes, I think that it's going to be one of those things that's going to like, it's going to be one of those things that's just going to blow up.
01:27:21: Fuzzy Bipolar Bear is asking, Froox Resonite to talk when? I mean, that's now.
01:27:28: That's kind of weird. Maybe you could consider this the talk, I don't know.
01:27:38: So let me check on the time. We've got about half an hour left, so I think we're doing pretty okay.
01:27:46: Next question, Electorspy. VST plugin support when? Well, I mean, that won't...
01:27:52: So that is the thing I've been kind of thinking about. Like, you know, with the ProtoFlux DSP for audio, the one problem...
01:28:00: Well, there's multiple problems, but like, one that I see is compatibility?
01:28:05: Like, Resonite is designed to be multiplatform; VSTs, I don't think they are.
01:28:10: Like, they might be like, you know, Windows specific and so on.
01:28:13: The other one is security, because when you have, like, a running VST, it can be, like, arbitrary code.
01:28:20: And with Resonite, for, you know, that ProtoFlux DSP to work for everyone, unless you're rendering it on just a particular user
01:28:27: and then syncing the resulting audio stream, everybody will need to run that code.
01:28:32: Which means, do you synchronize the VST? What if it's malicious? You know, like, it's not safe to do that.
01:28:38: You know, that would open a vector for attacks.
01:28:44: So, the only way I see that working is, you know, like, if you run it in a mode where only a single user is actually rendering locally and then streaming audio,
01:28:51: but it doesn't fit well in Resonite, where, like, you know, everything's kind of synchronized, and everything can be also persisted.
01:28:57: So, like, say we have a VST module, and you save your thing, and then you load it on a different computer where you don't have that VST,
01:29:06: do we save it together? Like, you know, do we save it and then just install it on your system?
01:29:10: Like, and that opens up, you know, the whole thing. What if somebody else spawns it, and there's malicious code, you know?
01:29:17: And now, like, we run malicious code on there, and it makes things kind of complicated in a sense.
01:29:21: So, maybe, maybe, like, some way, like, limited way to make it work, but it's trickier.
01:29:30: Maybe if there's, like, a way to run it through WebAssembly, that would, like, make it safe to do that kind of stuff.
01:29:36: But I don't know enough right now. Oh, there we go again. Oh, I'm gonna turn it off, I'm gonna turn it off.
01:29:45: On a second thought, this might not have been the best, um, this might not have been the best on our version.
01:29:53: I mean, it's fine, it's fine. Um, but yes, uh, at least it lets me know that it's, like, our floor.
01:29:59: But yeah, like, maybe with WebAssembly. I didn't do, like, enough research right now to, like, you know,
01:30:10: really give you, like, a super clear answer.
01:30:13: So Cyro's asking, I'm outside your door, can I come in the room and say hi? I mean, you can, but I don't know if you're actually watching.
01:30:20: Cyro! Okay, how long have you been waiting?
01:30:28: Um, you need to go closer.
01:30:31: I can smell your breath.
01:30:36: Did, did you hear Cyro?
01:30:41: Yeah, Cyro, Cyro has that.
01:30:49: Oh no, Cyro, Cyro overslept, so, like, you just, you get, um, you get background, background noise.
01:30:55: Cyro!
01:31:00: I don't know it on my own.
01:31:02: My voice is dull.
01:31:03: I'm like, actually, I have soda.
01:31:06: How I have soda, I'm gonna grab a soda.
01:31:09: So this is, like, a little intermission, Cyro.
01:31:21: So there's about 20 minute lag on the questions, I guess.
01:31:25: I mean, Cyro's pretty...
01:31:27: There's quite a bit of persistence.
01:31:29: So yes, I hope that answers your question, Cyro, who's not here.
01:31:41: Did he answer his own question?
01:31:43: Well, I answered his question and...
01:31:46: I don't know what I'm saying.
01:31:49: But yeah, that is that question resolved.
01:31:54: So the next one is from Kayobi Yoru.
01:31:57: Would marching cubes terrain be possible in the future?
01:32:00: Yes, I mean, one of the previous Resonance episodes,
01:32:08: I've done a bit of an explainer on the terrain system,
01:32:11: so I don't want to go too much into detail on that one right now.
01:32:15: There's a video on the official YouTube channel.
01:32:17: I recommend watching it.
01:32:19: I go into a fair amount of detail on how the terrain system is going to work and so on.
01:32:24: So give that one a...
01:32:26: Sorry, give that one a watch.
01:32:28: But yeah, pretty much, yes.
01:32:31: What I actually want to do, so...
01:32:34: Sorry, I drank soda and I'm full of bubbles.
01:32:39: What I want to do is just generally integrate the marching cubes algorithm
01:32:45: into our own classes so we can run it on any data.
01:32:49: Because one of the cool things, you know,
01:32:51: once you have that algorithm, there's lots of cool features you can build with it.
01:32:55: One of them is, for example, you know, metaball support.
01:33:00: Because usually that's, you know, rendered, like...
01:33:02: it converts to a mesh using the marching cubes algorithm.
01:33:06: And what we could do, and what I really want to do,
01:33:10: is add a render module for the new particle system, for PhotonDust,
01:33:15: where each particle is, like, you know, like a metaball.
01:33:18: And then, you know, instead of, like,
01:33:21: rendering each particle as a mesh, it generates one mesh,
01:33:23: so it can look like a liquid.
01:33:26: So you could use it, you know, for some sort of a simple liquid simulation.
01:33:35: Having, you know, like,
01:33:40: having, like, you know, that algorithm sort of in the library,
01:33:44: we can just, you know, run the whole thing, run the particles through it,
01:33:49: you know, generate, like, the signed distance field,
01:33:51: and, you know, convert it into a mesh,
01:33:53: and now we can do, like, fake liquids with the particle system with PhotonDust.
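The metaball field that marching cubes would mesh can be sketched as a summed falloff with a threshold. This toy version only evaluates inside/outside at a point; the meshing itself is what the marching cubes step would add on top:

```python
# Toy metaball field: each particle contributes an inverse-square falloff,
# and the implicit surface is where the summed field crosses a threshold.
import math

def metaball_field(particles, p, threshold=1.0):
    """particles: list of (center, strength). True means 'inside' the blob."""
    total = 0.0
    for center, strength in particles:
        d2 = sum((a - b) ** 2 for a, b in zip(center, p))
        total += strength / (d2 + 1e-9)  # epsilon avoids division by zero
    return total >= threshold

blobs = [((0, 0, 0), 1.0), ((1, 0, 0), 1.0)]
print(metaball_field(blobs, (0.5, 0, 0)))  # True: merged region between blobs
print(metaball_field(blobs, (5, 0, 0)))    # False: far away from both
```

The merging behavior between nearby blobs is what gives metaballs their liquid look once the field is turned into a mesh.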
01:34:00: So having that, you know,
01:34:02: it's gonna open up a lot of options.
01:34:05: So I've been actually looking at, like, the implementation and possibly adapting one
01:34:10: because I wanted to do that one soon-ish, if there's time,
01:34:13: like, one of the fun things for PhotonDust,
01:34:15: but we'll see, like, no promises at this time.
01:34:19: Varyable: Froox's alarm clock. Yeah, I'm not,
01:34:22: I'm not a good alarm clock. I wasn't sure what Cyro was doing
01:34:25: because we were staying at a friend's house, and
01:34:32: and, like, there's stuff going on.
01:34:35: So, like, there's a lot of people around, so, sorry, sorry, I didn't know.
01:34:40: I sent him a message because I wasn't sure, like,
01:34:42: he might have been, like, working on some of his own stuff and such, but,
01:34:45: I don't know.
01:34:46: I feel like he was working on, like, getting
01:34:51: the Wi-Fi working on Linux, getting his eyes in there.
01:34:56: So next question, ShadowX.
01:34:59: When Gaussian Splatting is done,
01:35:01: is the audio system right away, or is there more particle work and other in-between projects?
01:35:05: There's still more particle work, like PhotonDust is still considered kind of in experimental phase.
01:35:11: We need to get it, like, you know, polished enough.
01:35:13: I know there's a bunch of compatibility issues that I need to look into.
01:35:17: People have been, you know, making reports.
01:35:19: So once I'm kind of done with, you know, the fun thing,
01:35:23: I'll circle back to those, fix up the issues.
01:35:26: We want to bring it out of experimental.
01:35:29: And essentially have it, like, replace, you know, the particle system.
01:35:33: So once those issues are resolved, PhotonDust is going to come out of experimental.
01:35:38: And the old particle system, the legacy one, is going to be removed completely.
01:35:43: Making, you know, PhotonDust essentially, like, you know, the only particle system we have.
01:35:50: So, but we're not going to, you know, do that until, like, we resolve, like, majority of, like, the issues.
01:35:55: Like, once we're kind of confident, it can, you know, preserve existing content and it serves as a good kind of replacement.
01:36:02: So there's still work to do on that one.
01:36:08: There's, I don't know about, like, in between projects. I'll see, like, how things kind of go.
01:36:15: But the next bigger thing, that's kind of part of the big performance update, is essentially, you know, going to be the audio system.
01:36:23: And also, like Joe was asking, isn't Cyro already working on audio?
01:36:27: Cyro, yes, he's been, he's working on a part of it. He's been specifically working on the reverb library.
01:36:34: Because the audio system, the one we have right now, you can have reverb zones and we need to preserve those in some way.
01:36:40: The problem is, you know, the reverb zones, they're specific to Unity, specifically with FMOD.
01:36:47: And we're not going to be using that, which means we need to provide an alternative.
01:36:52: Cyro's been integrating this library called Zita Reverb, which is part of a library called Soundpipe, and making sure, you know, it works within Resonite.
01:37:01: So that part of work, you know, is already pretty much done.
01:37:05: It's going to make, you know, the parts that I work on then, I don't have to worry about that part, and it saves me quite a good amount of time.
01:37:16: So that should answer Joe's questions.
01:37:19: So with that, we have about 23 minutes left, 22 minutes left.
01:37:26: There's no more questions right now, so if you have any questions, make sure...
01:37:31: make sure to ask them now, before the timer runs out.
01:37:36: Like, last time we got some questions, like, last minute, and at that point, it's too late to answer them.
01:37:41: So if you've got any questions, get them in, you know, sooner than later.
01:37:48: With that, I might end up doing a little bit of rambling about stuff.
01:37:51: I could actually talk about the Gaussian Splatting too, because...
01:37:56: I wonder if I have a video somewhere.
01:38:02: One of the things, I can't show you because I'm not running on the right build,
01:38:06: but one of the things I'm actually adding for the Gaussian Splatting are spherical harmonics.
01:38:15: And like I was mentioning earlier, Gaussian Splats, they use spherical harmonics.
01:38:25: I'm gonna grab my brush over there. Let's go over there.
01:38:29: Oh, wrong thing.
01:38:37: Spherical harmonics, it's sort of like a way to encode directional information on a sphere.
01:38:46: Imagine you have a defined start and end, and you can have a wave there.
01:38:57: And I'm just gonna look here.
01:39:00: And this wave, you know, it's just a single parameter to say how big this wave is.
01:39:08: And then you could also have a wave that goes in twice.
01:39:12: So it's, for example, one, two.
01:39:16: And you also have a parameter that says how big this wave is.
01:39:20: So it can be different scales.
01:39:22: And these then composite on each other.
01:39:25: So what spherical harmonics is, it's sort of a way of doing this on the surface of a sphere.
01:39:33: Essentially, you know, you have sort of like, almost like, what's the term for it, like standing waves?
01:39:39: Like on the surface of a sphere.
01:39:41: So like if you have something like this, imagine this is a sphere.
01:39:46: You know, imagine like you took like this, and you know, wrapped it like this.
01:39:51: So you're gonna have like, you know, a wave that's gonna be like, and then like, I cannot draw it well.
01:39:59: This is gonna be easier once I have these changes merged, because I can just show you on the procedural objects.
01:40:03: But essentially, we can have like, you know, the wave goes up and then wave goes down.
01:40:10: And it's kind of crazy, like, you know, it's kind of like visual.
01:40:12: So like, what it does, if you map this on a sphere and say, for example, this encodes brightness.
01:40:19: What this would do is, you know, like, the more you're over here, the brighter it's gonna be.
01:40:26: So like, this part is gonna be really bright and this part is gonna be darker.
01:40:28: And over here it's gonna be like the opposite, you know, there's gonna be negative brightness.
01:40:32: So this one's gonna be weak and this one's gonna be strong.
01:40:36: And then you can have like, you know, multiple levels of it.
01:40:38: So you can have like a wave, then you know, it goes, it sort of goes up, goes down and goes up and goes down.
01:40:46: I'm drawing it very poorly, but you essentially have like, you know, multiple kind of waves and then they kind of add together.
01:40:56: And what the end result of that is, is that you can sample the information.
01:41:02: Like this mathematical representation can be used for information that changes based on the direction.
01:41:12: So you can have a sphere and you have, like, information encoded with spherical harmonics that can be, for example, color.
01:41:18: And you say, you know, I have like a viewer here and I'm looking at a sphere from here.
01:41:25: So the direction is this.
01:41:27: So I'll sample whatever, whatever the wave, you know, is, because it can be kind of complicated, you know, with multiple of them.
01:41:35: I'll sample it and get, you know, whatever color is here.
01:41:37: And then I'll look over here.
01:41:40: If I have another viewer here and I look here, you sample at this direction and I get a different value.
01:41:46: So like the information changes based on the angle.
01:41:50: And this is useful for lots of things.
01:41:52: One of them is, you know, it's a very efficient way to encode color information like light, like ambient lighting information.
01:42:00: It's actually what Unity uses, you know, for the ambience of the world.
01:42:03: It encodes that information into second order spherical harmonics and then samples it, which is like, you know, very quick.
01:42:12: Similar with the Gaussian Splats.
01:42:14: Each Gaussian encodes color using spherical harmonics.
01:42:20: And like, you know, that way it can actually change color depending on the angle you view it from.
01:42:26: But it's also useful for lots of other things.
01:42:28: And the way I've implemented it into Resonite is a general structure.
01:42:32: So you can encode any data type that supports, like, addition and multiplication.
01:42:38: You can encode it in spherical harmonics and then you can actually sample it, you know, based on the direction.
01:42:43: So you'll be able to use it yourself, you know, in your own creations, where you specify your spherical harmonics.
01:42:49: And then you can sample it, just provide a direction and you get whatever value.
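For anyone who wants to play with the idea, here's a tiny sketch of what sampling spherical harmonics looks like, using just the first-order (four-coefficient) real SH basis. This is a generic illustration of the math, not Resonite's actual API or data types:

```python
# Real spherical-harmonics basis up to order 1 (four coefficients).
# The constants are the standard normalization factors for real SH.
def sh_basis_l1(x, y, z):
    return [
        0.282095,      # l=0: constant term (the average over the sphere)
        0.488603 * y,  # l=1, m=-1
        0.488603 * z,  # l=1, m=0
        0.488603 * x,  # l=1, m=+1
    ]

def sample_sh(coeffs, direction):
    """Evaluate SH-encoded data in a unit direction: a weighted sum of
    the basis functions, one weight per coefficient."""
    x, y, z = direction
    return sum(c * b for c, b in zip(coeffs, sh_basis_l1(x, y, z)))

# Encode a value that is "brighter toward +X": a constant base level
# plus an l=1 lobe pointing along the X axis.
coeffs = [1.0, 0.0, 0.0, 0.8]
toward_x = sample_sh(coeffs, (1.0, 0.0, 0.0))  # viewing along +X
away_x = sample_sh(coeffs, (-1.0, 0.0, 0.0))   # viewing along -X
# toward_x > away_x: the sampled value changes with viewing direction.
```

Higher orders (Unity's ambient probes use second order, nine coefficients) just add more basis functions to the same weighted sum.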
01:42:53: It's also interesting, and it's kind of funny, because sometimes things end up having unexpected, like, unexpected... I think I'm going to go back over there.
01:43:06: Sometimes things have unexpected synergies.
01:43:11: So I mentioned earlier, one of the things for the particle system I might end up doing is using compute shaders to generate things,
01:43:21: you know, to get around some of the performance issues that PhotonDust has with submitting data.
01:43:28: And it's got similar mechanisms that are being used for rendering the Gaussian Splats because there's a lot of kind of overlap.
01:43:33: So some of the work I'm doing now actually kind of helps with that.
01:43:37: Then also the spherical harmonics, that helps, like, you know, with exposing and preserving some of the stuff, like how Unity does lighting, so we can, you know, expose that.
01:43:45: But also, once I'm working on the audio system, spherical harmonics, they're used to encode ambisonics, which is directional audio.
01:43:55: And it's something that we had a number of people request.
01:43:58: And with the support already being added, it actually makes it trivial to decode those.
01:44:05: Because like, literally, with ambisonics, you have multiple channels and the channels are the coefficients for the spherical harmonics.
01:44:12: And we already have code to, like, you know, sample those directionally.
01:44:17: So funnily enough, it makes it very easy, you know, to add support for ambisonics.
01:44:22: Like as part of the audio rework, we might just get ambisonic support.
01:44:27: Just because, you know, all of these kind of boil down to basic primitives, like, they're used across a lot of different systems.
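As a rough illustration of why this is nearly free once SH sampling exists: in first-order ambisonics the four audio channels are exactly the four spherical-harmonic coefficients, so decoding a feed for a direction is a single SH evaluation per sample. Channel ordering and normalization conventions vary (ACN ordering with SN3D normalization is assumed here), and this is a conceptual sketch, not Resonite's audio code:

```python
# First-order ambisonics carries one audio channel per spherical-harmonic
# coefficient (ACN order: W, Y, Z, X; SN3D normalization assumed).
# Decoding a feed for a virtual direction is just evaluating the SH there.

def decode_ambisonic_sample(w, y, z, x, direction):
    dx, dy, dz = direction  # unit vector toward the virtual speaker
    return w + y * dy + z * dz + x * dx

# A source panned hard toward +X: the X channel carries the signal.
w, y, z, x = 1.0, 0.0, 0.0, 1.0
front = decode_ambisonic_sample(w, y, z, x, (1.0, 0.0, 0.0))
back = decode_ambisonic_sample(w, y, z, x, (-1.0, 0.0, 0.0))
# front comes out louder than back, matching the encoded direction.
```

Real decoders also weight the channels per speaker layout, but the core operation is this same directional sample.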
01:44:52: Oh, hello, are we back? Oh, sorry, the headset Wi-Fi died.
01:45:00: I'm on the coolest quest deck, can you hear me? Hello, Test 1-2.
01:45:09: Can you hear me fine? Okay, sweet, thank you.
01:45:12: Yeah, the headset Wi-Fi just completely died.
01:45:18: It's one of those things where I'm like, I originally added it for this thing that was just a fun thing,
01:45:25: and I'm like, now it's useful for this, this, and this, so I'm like, I'm happy.
01:45:29: I like having more, you know, general primitives and the FrooxEngine that can be used for other things.
01:45:36: With that, JackTheFoxOtter is asking, when will you make me hungry with the Gaussian food scans?
01:45:41: Once it's implemented very soon, like, the moment it's in the build, I'm just importing my hundreds of Gaussian splats
01:45:50: and annoying everybody with them and, you know, shoving them in people's faces to make them hungry.
01:45:56: I did already do a bunch of Gaussian splats of food and it looks, like, amazing.
01:45:59: So, well, it's gonna happen soon.
01:46:03: NukiKun is asking, last time I asked about using a system, like the particle system has, to build shaders from parts,
01:46:09: like, one part is an arbitrary texture, another is a normal map, and you can stack them.
01:46:13: It wasn't clear if you thought you could, but my understanding is that these are just pieces of code that you can stack.
01:46:18: What would be an issue with that?
01:46:22: I don't really understand the question.
01:46:29: Like, a particle system doesn't really have anything to do with building shaders.
01:46:35: Like, you could use materials with it, but I don't...
01:46:43: I'm sorry, I have to rephrase the question. I don't really understand what you're asking.
01:46:50: Next question, ShadowX is asking, in the future, could there be an eraser for geometry line brushes?
01:46:55: How expensive would it be to check every point in a stroke against an eraser?
01:47:00: Yeah, that's actually one of the things you could do.
01:47:01: And I would like to have more tools for working with the strokes so I can do stuff like smoothing them and also erasing them.
01:47:10: It's not like... well, it kind of depends on how many points there are.
01:47:14: Usually, if there's a certain amount, you can just kind of loop through all of them and do the distance check.
01:47:18: It's going to be fast enough, especially if it's only, like, a thousand points or something.
01:47:23: If it's a bigger one, one thing you can do is build acceleration structure,
01:47:28: which lets you more efficiently sample things in space.
01:47:32: That's something that, for example, the physics engine does.
01:47:36: In a world, there can be hundreds of colliders or thousands or even more,
01:47:42: and it's not going to loop through every single one of them to check the collision.
01:47:45: It uses acceleration structures, usually ones based on some form of tree,
01:47:51: that makes it way more efficient to sample things.
01:47:57: The question is, are we going to have a thing that does it
01:48:00: automatically, like, it happens when you bring the eraser close and stuff like that,
01:48:05: but it's possible.
01:48:07: It's just a matter of investing time into implementing it.
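The brute-force distance check described above is simple enough to sketch. For bigger strokes you would first bucket the points into an acceleration structure (a uniform grid or tree) and only test the cells near the eraser. Hypothetical names, just an illustration of the idea:

```python
import math

def erase_points(points, eraser_pos, eraser_radius):
    """Brute-force eraser: keep every stroke point whose distance from
    the eraser exceeds its radius. Fine for strokes of up to roughly a
    few thousand points; bigger strokes want a spatial grid or tree."""
    return [
        p for p in points
        if math.dist(p, eraser_pos) > eraser_radius
    ]

stroke = [(0.0, 0.0, 0.0), (0.05, 0.0, 0.0), (1.0, 0.0, 0.0)]
kept = erase_points(stroke, eraser_pos=(0.0, 0.0, 0.0), eraser_radius=0.1)
# The two points within 0.1 of the eraser are removed; the far one stays.
```

An acceleration structure turns the per-eraser-move cost from "every point in the stroke" into "only the points in nearby cells", which is what makes physics engines scale to thousands of colliders.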
01:48:11: JackTheFoxOtter.
01:48:12: Also, I do have a question if there's still time when you get to it.
01:48:16: I'm currently building a little spaceship game.
01:48:19: Currently, creations like this can only be shared in Resonite.
01:48:22: What are your plans on allowing external, standalone distribution of games made within Resonite?
01:48:28: We do have general ideas. We want to do it.
01:48:32: You can essentially say, this is a bunch of worlds I've built, export these as a standard executable.
01:48:40: That's something we want to give to other people.
01:48:43: It's very unlikely to happen while still in Unity because I don't think it works with their licensing.
01:48:50: Maybe if you generated a Unity project and you did compile it with your own version of Unity, that would work.
01:48:58: But once we're running on our own engine, like, we'll have licensing for everything, so we can just pop out an .exe and do it.
01:49:07: There's also things we need to figure out, like the business model for that.
01:49:11: So it might be a paid feature or something. That's going to depend.
01:49:18: We do have currently already some level of support.
01:49:21: We have a white-label kind of way, where we can make builds of Resonite that are branded as something else.
01:49:29: But this is mostly for really big companies because it does require some manual work.
01:49:33: But also, the commercial license,
01:49:36: if something is going to happen there, there's not too many details on the specifics yet.
01:49:40: So this is going to come with time.
01:49:43: Also, thank you so much for your subscription with Prime Project Boxer.
01:49:47: And no worries. Thank you for helping get users set up.
01:49:55: Borable: best thigh-highs and warmers you've got there.
01:49:58: I ordered Engie Paws in April.
01:50:04: And I'm still waiting on them because the shipping company is really bad and they screwed up.
01:50:13: I've been looking forward to those because they have ones that are like...
01:50:17: Those are my colors, they're like yellow and orange.
01:50:21: And I hear they're super comfy, they're super good, but like...
01:50:26: I don't know when they're going to arrive.
01:50:30: I've been waiting like...
01:50:34: How much is it? Like 8 months at this point?
01:50:39: The shipping company is pretty bad.
01:50:44: NukiKun is asking,
01:50:45: Will I be able to use Gaussian splats to take a scan of a section of my fursuit fur and somehow tile it on my avatar?
01:50:52: It seems like it would be way too unrealistic without deep simulation.
01:50:56: You can't really use it for that.
01:50:58: Like, you get like a model but you cannot really tile it easily and I don't think it's a good representation of it.
01:51:05: Because each Gaussian splat is rendered by itself; they don't really blend well with each other super much.
01:51:14: And it's also like, you know, there's like multiple problems with that because it's not gonna be...
01:51:21: Like it's not gonna move, it's not gonna respond to anything, it's not gonna respond to lighting in the world.
01:51:26: All the lighting is baked into it.
01:51:28: So like if you're in a dark world, the fur is just gonna be glowing.
01:51:31: You know, it's...
01:51:35: Like they're not useful for that kind of stuff, they're useful for showing things you've captured in the real world.
01:51:42: But I wouldn't use them for...
01:51:44: I wouldn't really use them for like, you know, representing avatars and stuff like that.
01:51:49: There will probably be some Gaussian splat avatars, just, you know, as a rigid avatar.
01:51:54: But like, they're not the best suited for that kind of thing.
01:52:01: Next question by Polybear.
01:52:03: Ambisonics and the Gaussian splat stuff
01:52:05: reminds me of the light field photography stuff from 3 years ago, the Lytro camera.
01:52:09: I wonder if all these things are interestingly linked and working on one will give use to others.
01:52:14: I mean, there's some overlap.
01:52:16: Like, some of the math is, like, you know, the more basic the math is, the more uses it has.
01:52:23: Like if you consider something like linear algebra, that's just used all the way across like lots of things with computer graphics, you know, game engines and so on.
01:52:34: Stuff like, you know, complex numbers, quaternions, you know, like trigonometry.
01:52:40: That's used in so much stuff that like, you know, having like more stuff there just opens up lots of possibilities.
01:52:48: So there's so much stuff, yes, you get like, you know, more.
01:52:53: You get kind of like, you know, like the more of the primitives you kind of have in the engine and your code, it makes it easier to implement those other things.
01:53:03: And sometimes they're kind of, you know, built from these kind of common building blocks, but with some you still need, you know, to do extra work.
01:53:08: So it kind of depends, you know, on some of them.
01:53:19: Because light fields are very, very memory heavy.
01:53:26: It's not as efficient as 3D models, but, you know, it's a different kind of trade-offs kind of thing.
01:53:39: I've got about six minutes left. So like, I think I have time for like a few more questions if there's some more.
01:53:45: Otherwise, I'll probably get some more rambles.
01:53:49: Might be, like, you know, like, non-work questions, depending how long they are.
01:53:53: So if you've got some, like, it's probably, like, your last chance.
01:53:58: But I would also like to just generally, maybe, ask, like, has everybody been doing good this year?
01:54:04: It's been, you know, it's pretty much the last episode of this year.
01:54:12: Like I hope like everybody's been enjoying like the Resonance podcast and learning more about Resonite.
01:54:20: I've been really, like, glad, like, I've got an opportunity to sort of, like, you know,
01:54:26: talk about a lot of these things, you know,
01:54:33: that sometimes I'd just kind of tell people randomly in Resonite.
01:54:36: But this kind of gives a way to share it like, you know, with the wider community.
01:54:40: And there's been lots of ideas that like I wanted to kind of share and just kind of
01:54:44: give you a little vision of like, where Resonite is going, what we want to do with it.
01:54:49: You know, where we're heading, and just give you a bit more, like, behind the scenes as well,
01:54:56: you know, sort of thinking that goes into things.
01:54:59: So thank you very much.
01:55:00: Like, you know, everyone who's been watching these and, you know, I hope like we'll make lots more.
01:55:05: JackTheFoxOtter is asking, would you play standalone games made with Resonite?
01:55:09: Like, all of them. Well, depends how many there are, but probably not all of them.
01:55:13: Like, things have already reached the point where I can't keep up with everything that there is on Resonite.
01:55:21: So if there's going to be a lot of them, then I don't think I'll be able to, which is in a way nice problem to have,
01:55:32: because it's like, you know, lots of people are using it, but also, like,
01:55:35: I can't experience everything, so it's kind of like, um, we'll see. Like, the first few I'll probably, you know, play,
01:55:42: be like, you know, this is stuff that people made, but like, once there's a lot of them, like, I don't think I'll have the time.
01:55:49: Um, yeah, um, I've been, like, enjoying, like, you know, doing, like, these Resonance podcasts and, like, you know,
01:55:56: cutting them, like, into, like, videos that people can watch. People have been enjoying, like, you know,
01:56:00: especially when it's cut into smaller videos on specific topics, at least, like, some of them.
01:56:05: It's also kind of interesting because it works as sort of, like, a rough gauge of, like, you know,
01:56:12: what things are people interested in. Like, for example, the Performance one, it got, like, lots of views.
01:56:17: The Gaussian is one thing that actually got quite a bit of views, so I think, like, it feels like a lot of people are interested in that.
01:56:25: It's kind of an interesting way, you know, to kind of measure, you know, how much people are interested in particular things.
01:56:32: And it just generally, you know, give people kind of bite-sized things to, like, be able to share, you know,
01:56:37: and have, like, sort of, like, central information, like, you know, in one place on how things are going.
01:56:42: Like, for example, with the Performance, you know, there's been lots of rumors and there's been lots of, kind of, like,
01:56:46: unknowns kind of floating around. People are like, you know, why is particle system being worked on, you know, as part of the Performance, you know?
01:56:55: And I think with that visual explanation, I feel it's gonna help, you know, get people a better understanding, you know,
01:57:00: the processes and the reasoning, you know, and why we do, you know, the things that we do in more of a, like, visual way as well.
01:57:12: So, thank you very much, you know, for, like, you know, the opportunities of, like, all of these.
01:57:15: Thank you everyone, you know, for, like, using Resonite and for supporting us and making it, you know,
01:57:22: so we can kind of keep going and make lots of cool stuff.
01:57:25: And also for, like, you know, just, you know, like, being part of the community and making lots of cool projects
01:57:29: and, you know, using Resonite, you know, whether it's, like, you know, your home, like,
01:57:35: whether it's, like, you know, work, or whether it's, you know, a place where, like, you know, you have fun.
01:57:41: It always, like, you know, makes me happy when people enjoy the platform for whatever kind of purpose they want to use it for,
01:57:46: or multiple of them, like, there's, you know, obviously lots of, lots of things to do
01:57:53: or whatever.
01:57:55: I've got, like, two minutes left.
01:57:57: I don't think... like, if there's a last-minute question, maybe I'll be able to answer it,
01:58:01: but at this point, like, I can't guarantee I'll be able to, like, answer them.
01:58:08: But let me actually ask you, like, in general, like, what kind of things would you like to see, like, you know,
01:58:12: or do you have, like, a New Year's resolution and stuff like that?
01:58:17: For me, there's, like, a bunch of stuff, like, I do want to, like... I hope, like, I get to, like, work on, like, myself.
01:58:23: Also, thank you very much everyone in the chat.
01:58:28: Um, but yes.
01:58:31: Thank you.
01:58:37: It's been, like, it's been a lot of fun doing this and, like, I hope, like, to get, like, to do a lot more of these, like, you know, over the next year as well.
01:58:43: Like, I'm glad I finally, like, started these streams. Like, it's something I wanted to start for so long, and I just never
01:58:51: got, like, around, like, you know, to getting everything together for it, because there's just so much
01:58:55: stuff all the time. But now I'm kind of glad, like, I did, and thank you very much, you know, everyone,
01:59:01: like, you know, for supporting me and supporting, like, this platform and I think we have, like, a lot of kind of cool
01:59:07: cool things in the future. So, with that, that's the last minute so thank you very much.
01:59:14: I hope, like, everybody has, like, you know, had a great, great 2024, and I'm sure we'll have an even better 2025.
01:59:22: I think the Creator Jam, they're gonna be hosting the New Year's event like they usually do, so
01:59:28: I hope, you know, it's gonna be a lot of fun. I'll try to pop in, but I'll be around, like, you know,
01:59:33: with, you know, with, like, people, like, we have, like, a bit of an event, like, a local gathering with
01:59:40: some nice people, so I don't know how much I'll be, like, around, but we'll probably be, like, running it on a projector or something.
01:59:47: I'll see if I can bring my, like, quest pro with me. So it's been, like, you know, pleasure, like, chatting with everyone,
01:59:56: answering your questions and, you know, just being a general part of the community. So thank you very much.
02:00:00: Thank you for supporting us. Thank you for watching these streams and have a happy new year, you know, have a happy 2025.
02:00:08: Thank you. Oh, did I hit the button? No.
02:00:16: Actually, wait, I'm gonna wait. I'm gonna, let me see who's streaming. Anybody streaming? Anybody streaming?
02:00:27: This is just Creator Jam. Is there anybody who streams Resonite?
02:00:35: No, it's just Creator Jam right now, so Creator Jam it is.
02:00:39: If you stream Resonite, like, especially around this time, like, you're gonna get raided, so I do recommend
02:00:48: streaming around this time if you wanna stream.
02:00:53: Creator Jam. I'm gonna do Creator Jam.
02:00:57: Hit the button.
02:01:01: Oh, is it running? Oh, I think I typed it in the wrong thing.
02:01:11: Come on.
02:01:16: This is hard to do in VR, come on.
02:01:18: Press this thing, there we go.
02:01:21: Getting raid ready.
02:01:24: Raid.
02:01:27: Okay.
02:01:30: Creator Jam.
02:01:33: Okay, raid is ready.
02:01:38: Okay, so, thanks again everyone.
02:01:41: Thank you very much, you know, for watching and we'll see you next year.
02:01:48: Bye.