The Resonance/2025-01-26/Transcript

From Resonite Wiki

This is a transcript of The Resonance from 2025 January 26.

This transcript is auto-generated from YouTube using Whisper. There may be missing information or inaccuracies reflected in it, but it is better to have searchable text in general than an unsearchable audio or video. It is heavily encouraged to verify any information from the source using the provided timestamps.

00:00: I'm going to record as well, there we go. I'm going to post announcements. Post. Post this. And post the live stream.

00:21: There we go. Hello, hello. Let me also make sure I have it open to make sure...

00:30: Everything is going okay. Oh, we got DustySprinkles, hello. Hello Jack.

00:38: It's good people are all in the chat. Let's see...

00:46: I'll make sure I open my stream... Oh no, I can't hear myself, there we go.

00:55: Hello everyone. Hello Grand. People are slowly trickling into the stream.

01:05: Hello. Hello, can you hear us all fine? Is the audio going okay?

01:09: Can you hear me? Am I... understandable?

01:17: Hello, I just had a question about... pineapple.

01:30: Yes, JackTheFoxOtter is asking, have you un-jetlagged properly?

01:34: Yeah, kind of. It's kind of weird, because my sleep was weird the day I got back.

01:41: And then it just kind of snapped back into my normal sleeping schedule,

01:45: which is aligned with the US one, so I wake up in the evening and go to sleep, like, in the early, like...

01:53: Actually, what's it called before afternoon?

01:56: What's the time period between, like, morning and afternoon?

02:02: Between morning and afternoon?

02:07: Like, we have a word for it in Czech that sort of translates as, like...

02:16: Until noon.

02:18: Until noon.

02:19: It's like... I don't know if I know, or maybe there's, like, some colloquialisms around the US, but, like...

02:27: I think that... I've always just called it either morning or afternoon.

02:32: But it's the thing, it's time that's before noon, so it cannot be afternoon, but also it's too late for it to be morning.

02:41: It's like, you know, like 10, 11 AM.

02:45: And, like, I wouldn't say 10, 11 AM. Like, morning, that's like 7 AM. Maybe 8.

02:53: You need a word for that.

03:00: Anyway, hello everyone.

03:05: So we got a bunch of people kind of in the chat, so we should be able to formally start.

03:11: So, hello everyone. Welcome to The Resonance.

03:17: We're back after, like, two weeks hiatus.

03:20: The last, the previous one, like, I kind of skipped it because we went to visit Snoqualmie Falls in Washington, which, like I said, was really pretty.

03:28: And the other one, I was at FC, at Further Confusion, so I wasn't available, but now I'm back home, and we're going to be doing these, like, a lot more.

03:37: So, hello everyone. For those of you who don't know, this is sort of a combination of, like, my office hours.

03:43: So you can ask, you know, any questions about Resonite, you know, whatever, like, whether it's a technical one, whether it's, like, you know, about the platform, about its past, about its future, its philosophy, or even just generally, you know, the team, and like, whatever thing you would like to know, we'll try to answer to the best of our ability.

04:00: Some questions we might punt over, like, you know, just give a generic answer and kind of punt it over to different office hours, like, for example, if it's anything to do specifically with moderation, they host office hours before this one, and some of you are, like, you know, probably coming from there as well, so we might, like, you know, suggest to kind of ask those, like, in the different ones, but we'll try to answer, like, you know, what we can.

04:24: I'm Frooxius, and I have Cyro here with me.

04:30: He's part of our, like, engineering team.

04:34: And in order to, like, you know, if you want to ask any questions, make sure to put a question mark in the chat, in your question, that way it kind of pops another thing in here, and we make sure, you know, we don't miss it.

04:47: Some of the questions we also might, it's sort of like a mix, you know, it's like the question and answers, but also we might go a little bit in-depth and explain, you know, some of the, like, further kind of details of the plans, goals, you know, of Resonite, and how the different features will shape its future, and how they're gonna, you know, gonna be constructed.

05:07: So some of them might end up, like, turning into these, like, more longer kind of explanations, rambles, and deep dives.

05:14: We usually run this for about two hours, so there's, like, you know, plenty of time for lots of questions, lots of, like, in-depth explanations.

05:22: With that, I think it should be kind of everything, so we should be ready to get started.

05:27: We already got a bunch of questions popping up, so we should probably just cut into those right away.

05:36: So let me scroll up, because there's a bunch.

05:39: There's the one from JackTheFoxOtter, have you un-jetlagged properly?

05:43: Yes, I'm kind of just back to my US sleeping schedule, which kind of works for me.

05:50: JackTheFoxOtter, I was just saying noon, I guess.

05:52: Well, isn't noon, like, the 12pm? That's, like, the exact point.

05:56: So we have, like, noon, then we have afternoon, which happens after the noon, but there's a period between morning and the noon that's, like,

06:07: Like, we have a word for it in Czech that, like, if you were to translate it literally, it's almost something like

06:15: until noon, or before noon.

06:19: It's kind of like one of those things where you kind of miss having a good equivalent for that, that kind of has the same meaning.

06:28: Anyway, the next question is from DustySprinkles.

06:31: I have a two-parter from Mirsko's, it's long.

06:38: So, execution context, like, there are sort of, like, different ways to use ProtoFlux, because ProtoFlux, what it is at the core,

06:52: it is a node-based programming language.

06:56: And nodes, they can, you know, represent anything.

06:58: Right now, there's only, like, one context you can really use, which is the execution context.

07:03: And that essentially, what it does, it makes, you know, whatever, it executes operations, you know, as it kind of goes.

07:11: And it can, you know, make, do stuff like, you know, read values from the world, modify values in the world, you know, trigger values, like, actions to happen, and so on.

07:22: And it's kind of like the primary context.

07:24: There's going to be a lot more contexts in the future, and they have, like, you know, ways to interact with each other.

07:30: Whether there are going to be extensions of the execution context or not, that's going to depend on a specific context, because they can be used for lots of different things.

07:38: For example, one of the contexts that's going to come at some point in the future is shader context.

07:44: Once we switch away from Unity, we will be able to use ProtoFlux to make shaders.

07:51: And that, instead of just executing directly on the CPU, it will actually be compiled into a shader that then gets uploaded to the GPU, so it's going to operate a bit differently.

08:03: It's also not going to be an extension of the execution context, because it cannot be, because you cannot really do the same kind of operations you do to the world and to the scene on the GPU.

08:14: It's a very constrained type of context.

08:19: So that one is not going to be an extension.

08:22: However, the state machine context is actually going to interact with the execution context, because what that one will do, it will provide a way where you have a node that represents your state, and it can flow the state from one to another, and that node can actually trigger things to happen in the execution context.

08:42: So for example, if you have a node that represents the state that your system is in, you can be getting impulses that say when the state is first transitioned into, when it's transitioned out of.

08:55: And then maybe every update you get an impulse saying this state is currently active, do whatever update thing.

09:04: And then you also have values, you can read out, is this state currently active, you can derive things or use it as part of equations.

09:13: So there's going to be contexts that do interact with execution contexts, there's going to be contexts which do not.
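
A minimal sketch of the kind of state-machine node described above, with impulses firing into an execution context on enter, exit, and update; all names here are hypothetical, not actual ProtoFlux or FrooxEngine API:

```python
# Hypothetical sketch of a state-machine node firing impulses into an
# execution context; invented names, not actual ProtoFlux/FrooxEngine API.

class StateNode:
    """One state in a state-machine context."""

    def __init__(self, name):
        self.name = name
        self.is_active = False  # readable value, usable in equations
        self.on_enter = []      # impulses fired when transitioned into
        self.on_exit = []       # impulses fired when transitioned out of
        self.on_update = []     # impulses fired each update while active

    def enter(self):
        self.is_active = True
        for impulse in self.on_enter:
            impulse()           # hands control to the execution context

    def exit(self):
        self.is_active = False
        for impulse in self.on_exit:
            impulse()

    def update(self):
        if self.is_active:
            for impulse in self.on_update:
                impulse()

# Wiring ordinary execution-context actions to the state's impulses:
idle = StateNode("idle")
idle.on_enter.append(lambda: print("entered idle"))
idle.on_update.append(lambda: print("idle is active, do update thing"))
idle.enter()
idle.update()
```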

09:20: There's a bunch more as well.

09:23: Actually, there's even more, even right now, because the context you're typically using is the FrooxEngine execution context.

09:32: And that's actually an extension of a more basic execution context.

09:37: Because ProtoFlux is developed as sort of a separate library, so it can technically be used outside of Resonite, outside of FrooxEngine.

09:46: And there's a bunch of things like, you know, core math operations, they have no dependence on FrooxEngine, they don't care about it.

09:53: So this is just the basic execution context. But then there's a bunch of nodes, you know, say, something like a node that duplicates slots.

10:00: That requires the FrooxEngine context, because they operate within FrooxEngine and with specific sorts of FrooxEngine objects.

10:08: So they use a context as an extension of the execution context.
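
A rough sketch of that layering, with a basic execution context that pure math nodes accept and a FrooxEngine-specific extension that scene-touching nodes require; the class and function names are invented for illustration:

```python
# Rough sketch of the context layering; invented names for illustration.

class ExecutionContext:
    """Basic execution context: enough for pure nodes like core math."""
    pass

class FrooxEngineContext(ExecutionContext):
    """Extension that also carries access to the scene/world."""
    def __init__(self, world):
        self.world = world

def add_node(ctx: ExecutionContext, a: float, b: float) -> float:
    # Core math node: no dependence on FrooxEngine at all.
    return a + b

def duplicate_slot_node(ctx: FrooxEngineContext, slot: str):
    # Scene-touching node: requires the FrooxEngine extension.
    ctx.world.append(slot + " (copy)")

world = ["root"]
ctx = FrooxEngineContext(world)
print(add_node(ctx, 1.0, 2.0))    # works with either context
duplicate_slot_node(ctx, "root")  # needs the engine-aware context
print(world)                      # ['root', 'root (copy)']
```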

10:13: There's a lot more into that, and we could maybe sometimes also get deeper into this, but overall, you know, there's going to be a bunch of them.

10:22: In some of the previous episodes, we talked about, you know, the future of ProtoFlux.

10:30: And I covered stuff like the DSP context, which is going to allow, you know, for processing of like audio and textures and meshes, instead of generating them.

10:41: So I do recommend giving that video a watch if you're interested in more.

10:45: It's on our YouTube, and I kind of go into details, you know, with some drawings as well.

10:51: It's another context that you'll be able to use.

10:53: And it's another one that's not going to be an extension of the execution context, because, say, if you're doing real-time audio processing, you're not really interacting with the scene,

11:02: and doing that would be kind of complicated, because you have to, like, you know, keep synchronizing, and it would, like, get really poor performance-wise.

11:11: So that's, like, running on its own, like, you know, thread, that's on the audio thread, and it's its own separate context.

11:19: That's going to be, like, a whole bunch, so, you know, keep tuned.

11:24: And do give that video a watch if you're interested in more.

11:28: The next question is also from DustySprinkles, part 2, from Josh.

11:32: All the action nodes requiring FrooxEngine context, such as the Duplicate Slot,

11:35: I literally talked about that one, and I hadn't even read that question at the time,

11:39: are implicitly execution context, which would make using them outside the execution context impossible,

11:43: yet having a state machine pulse one of these action nodes on a state change would be a pretty useful ability.

11:49: How would this be handled?

11:50: And that's pretty much, like, you know, what I covered, like, earlier,

11:53: it's like the different contexts that can actually interact with each other.

11:57: Like, it kind of gets even more complex, because, like, say you have the DSP context,

12:01: and that's, you know, sort of, it changes how the nodes kind of behave,

12:04: that instead of, like, actions, it sort of builds, like, you know, sequence of, like, operations that happen to audio buffers.

12:13: Like, some of the nodes, they might have, you know, if you want to, like, do some,

12:18: if you want to run, like, a for loop through, like, you know, a buffer of audio samples,

12:22: that DSP context node might actually trigger an execution context piece.

12:28: It wouldn't be the Froox Engine execution, it would be just the basic one,

12:32: which means you can only do stuff, you know, like, core math operations and some other stuff,

12:36: but it would let you, whenever that audio buffer needs to be processed within the DSP context,

12:40: the DSP context node triggers execution context to do whatever processing you want to the data,

12:48: and then continues along the DSP context.
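
A toy sketch of that interaction, where a DSP-context node hands each sample of a buffer to a user-supplied fragment of basic execution context (pure math only, no scene access) and then continues along the DSP chain; all names are invented:

```python
# Toy sketch: a DSP-context node that hands each sample of a buffer to a
# basic execution-context fragment (pure math only), then continues along
# the DSP chain. All names invented for illustration.

def dsp_process_buffer(samples, per_sample_flux):
    """DSP node: runs the user's fragment on every sample."""
    return [per_sample_flux(s) for s in samples]

# The user's "basic execution context" piece: core math, no scene access.
def user_gain(sample):
    return max(-1.0, min(1.0, sample * 0.5))  # halve volume, clamp

buffer = [0.1, 0.9, -1.5, 0.3]
print(dsp_process_buffer(buffer, user_gain))  # [0.05, 0.45, -0.75, 0.15]
```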

12:51: So, it's pretty much like, you know, it's a very flexible mechanism,

12:54: and we can, you know, build whatever we need for these to interact.

12:58: There's actually one more that I wanted to mention.

13:00: One of the recent episodes I also talked about the Terrain system,

13:04: and the Terrain system is going to introduce yet another context,

13:07: a way to use ProtoFlux to define how your Terrain works.

13:13: This has also been separated into another video that's available on our Resonite YouTube channel,

13:18: so give that one a watch as well if you're interested in more details there.

13:24: Next question is from Navy3001.

13:28: Will .NET 9 help with loading lag, aka spawning items?

13:32: I believe it will help.

13:34: I don't think it's going to solve it completely.

13:37: In general, any kind of update, it's not going to solve 100% of the problems,

13:41: but I do expect it's going to have a significant improvement.

13:47: The problem is, a lot of times when you have lag,

13:54: usually there's multiple sources for it.

13:57: So what does .NET 9 help with?

13:59: There's two main ones.

14:01: One is the actual code that needs to execute to decode the item, process it, and so on.

14:07: With .NET 9, it makes the code run faster,

14:12: which means if any time that is being spent processing the item,

14:17: decoding it and so on on the Froox Engine side,

14:19: that's going to happen faster,

14:22: which means it's going to speed up that part of it.

14:28: The other part it's going to help with significantly.

14:31: One of the issues is, because we're running within Unity,

14:34: Unity has a really poor garbage collector,

14:38: which means if we suddenly allocate a lot of memory, for example, to load an item,

14:43: that can trigger the garbage collector to run full collections,

14:47: which means it essentially pauses the process to run a collection,

14:50: and it causes some of the stutter, some of the lagging.

14:54: There's something that's generally called memory pressure.

14:57: It's like when you're allocating memory,

15:02: which puts pressure on the garbage collector,

15:04: and because the garbage collector is very unperformant,

15:09: that contributes to the stutters, and it contributes to the lag.

15:13: With .NET 9, a lot of the code will be running with a way better garbage collector,

15:19: which is going to help reduce additional source of stutters.

15:23: Now, there's some other things that contribute to the lag.

15:26: One of them is, for example, when you load an item,

15:31: those meshes need to be uploaded to the GPU memory.

15:35: Right now, that is a synchronous operation.

15:38: It has to be synchronized with the main Unity thread,

15:41: which means it's essentially going to lag everything,

15:44: until it's finished uploading that.

15:46: Which means if you have really complex meshes,

15:47: with lots of blend shapes, lots of geometry,

15:50: that's going to cause bigger lag than smaller ones.

15:52: We do have a system for textures,

15:57: where if you're loading lots of textures,

15:58: the cost is amortized over time.

16:02: Which means it uses a mechanism called time slicing,

16:06: where each frame we upload a small chunk of the texture,

16:10: but if it takes too long, we wait until the next frame.

16:13: Which means it's going to take several frames to fully load the texture,

16:16: but you're not lagging as much.
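
A minimal sketch of that time-slicing idea, uploading in small chunks under a per-frame budget and resuming the next frame; the chunk upload here is a stand-in, not a real engine call:

```python
# Minimal sketch of time-sliced uploading: do chunks until a per-frame
# budget is spent, then resume next frame. upload_chunk is a stand-in.
import time

CHUNK_SIZE = 64 * 1024   # bytes uploaded per step
FRAME_BUDGET = 0.002     # seconds of upload work allowed per frame

def upload_chunk(data, offset):
    _ = data[offset:offset + CHUNK_SIZE]  # pretend this goes to the GPU

def time_sliced_upload(texture_data):
    offset = 0
    while offset < len(texture_data):
        frame_start = time.perf_counter()
        # Upload as many chunks as fit in this frame's budget...
        while (offset < len(texture_data)
               and time.perf_counter() - frame_start < FRAME_BUDGET):
            upload_chunk(texture_data, offset)
            offset += CHUNK_SIZE
        yield offset  # ...then hand control back until the next frame

texture = bytes(4 * 1024 * 1024)  # a fake 4 MiB texture
for _progress in time_sliced_upload(texture):
    pass  # the engine would render a frame here
print("upload complete")
```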

16:19: We'd like to implement something similar for meshes.

16:21: It's a little bit tricky, given how Unity works.

16:24: I believe it's possible, it just requires some doing.

16:29: But also it's a thing where .NET 9 is not going to help with that part of the lag.

16:35: Because that's a limitation of Unity and the graphics API.

16:43: And it requires us to do more specific work,

16:47: to get around that limitation and remove that part of the lag.

16:51: So, in short, I believe it's going to help.

16:55: I don't think it's going to solve it in 100% of the cases.

16:58: But there's also some other things we can do to improve that.

17:04: Oh, a subscriber with Prime, hello, thank you.

17:06: Thank you very much for your subscription.

17:08: Oh my gosh, sorry.

17:12: I think what also will help, specifically with loading items,

17:17: will be having our own binary format.

17:22: Because right now we use Newtonsoft JSON, I believe,

17:28: to serialize the items into BSON, which is a binary representation of JSON.

17:34: And then we compress that using stuff like Brotli,

17:37: or sometimes we use LZ4 if the world crashes, whatever.

17:42: But having our own format that's tailored to work with our data model well,

17:49: and isn't powered by Newtonsoft, because Newtonsoft JSON is a little bit less memory efficient

17:53: compared to System.Text.Json, but System.Text.Json can't do BSON.

18:00: And so having our own binary format will help speed that up a lot, I think.
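
A toy comparison of why a schema-aware binary format can beat JSON text for this kind of data; this is purely about scale, not Resonite's actual encoding:

```python
# Toy comparison: a float field as JSON text versus a schema-aware binary
# encoding. Illustrative only; not Resonite's actual format.
import json
import struct

value = {"position_x": 1.2345678}

as_json = json.dumps(value).encode("utf-8")         # field name + text digits
as_binary = struct.pack("<f", value["position_x"])  # schema implies the field

print(len(as_json), "bytes as JSON")   # 25 bytes
print(len(as_binary), "bytes binary")  # 4 bytes
```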

18:09: It will definitely help the loading process to become faster.

18:13: I don't know if that necessarily helps, because one of the issues with Newtonsoft JSON is

18:19: right now it hurts because of the poor garbage collector, because that also causes a lot of little memory allocations.

18:25: But with .NET 9, .NET 9 has a much better garbage collector which actually has better per-thread handling.

18:34: Because one thing we already do is we load things on a background thread.

18:40: So when the Newtonsoft JSON decoding is happening, that's actually happening outside of the main thread.

18:46: So ideally it shouldn't cause lag at all.

18:49: One of the reasons it causes lag is because it causes a lot of memory pressure and it triggers the garbage collector.

18:56: So what I think will happen when I switch to .NET 9 is that because of the better garbage collector,

19:02: that's not going to contribute to the lag because the garbage collector doesn't really need to pause the process

19:07: and it can also just pause individual threads and collect their memory.

19:11: So I think the .NET 9 switch is going to help significantly with that particular part,

19:18: possibly to the point where it's not contributing to the actual lag, it just makes it load in the background.

19:28: But we definitely do want to switch to the custom format because there's a bunch of issues with it

19:32: and also it's just going to make the loading itself way faster.

19:35: So if you spawn an item, instead of having to wait 5 seconds for it to spawn, you just wait 1 second.

19:44: So that's going to be one of the biggest benefits of the format.

19:48: For the lag itself, it would help having that format while we're still with Unity

19:53: because we can rework it to reduce the memory pressure.

19:59: But I think the .NET 9 is going to make it so it doesn't really become an issue as much.

20:05: But we'll see how it behaves, and we do want to switch to the format anyway.

20:10: So it will definitely help.

20:14: Next question from GrandUK.

20:18: Hello, follow over from moderation office hours.

20:20: What do you think about custom clients for Resonite?

20:23: Either including FrooxEngine itself or reverse engineering FrooxEngine in another language?

20:28: So this one's kind of tricky because I remember you were asking during moderation hours

20:34: about is it okay to do this from moderation perspective and from legal perspective

20:41: and there might be some complications there.

20:44: But what I would like to focus on is more just the technical and philosophical perspective

20:50: of custom third-party clients.

20:54: The first thing is defining what is a custom client because you mention including FrooxEngine itself.

21:03: If you use FrooxEngine itself, is that like a custom client?

21:07: Because it's technically just using the same FrooxEngine as the rest.

21:11: So I think in that case it's not really something you'd call a third-party client.

21:17: It's more just like a modded client.

21:19: You modify bits of it.

21:22: Reverse engineering, that's another thing.

21:27: That all can be tricky from a legal perspective, but if you don't consider that part,

21:31: I think that's a significant undertaking.

21:36: Because you'd pretty much have to implement FrooxEngine and have it behave the same way the current one does.

21:44: Which is a huge other thing. It would take a long, long time to implement something like that.

21:51: At least if you want to achieve feature parity and have full compatibility.

21:58: Let me actually move over there because I do want to showcase...

22:04: Give you a little bit of visualization for this.

22:08: So if I go over here...

22:12: And actually I forgot my brush. I knew I forgot something.

22:21: So, let me grab a brush.

22:25: There we go.

22:26: Brush is, geometry line brush is.

22:29: Let me know how visible this one is because we can have different lighting.

22:32: Let me actually check.

22:34: Can you see this fine? Is this okay? That looks okay.

22:43: Typical thing, when people talk about third-party clients, they generally think about Second Life.

22:51: Because Second Life is one of the platforms that is known for having third-party clients.

22:57: And there's an important thing that I think a lot of people don't realize,

23:05: is how FrooxEngine works, how Resonite works compared to Second Life.

23:10: With Second Life, you have your servers, like the grid, that are running a lot of complex logic.

23:17: There's a lot of stuff here, there's complex things and so on.

23:22: And this is the Second Life server.

23:30: And this is running a lot of complex logic, like stuff that is actually happening on the server.

23:35: And then you have your relatively thin client, or viewer.

23:45: Which has a few bits to get the data of the scene that's happening and display it to you.

23:52: Which makes it much easier to replace this with a modified version, because there's this separation between the server and the viewer.

24:03: With Resonite, however, it uses a different architecture.

24:08: And it's an architecture where everything is within one.

24:12: So you can have this be a server, but it's also a client.

24:17: Pretty much the majority of the code runs on both and is the same.

24:24: So we have a lot of complexity here, we have all the engine stuff, we have audio systems with ProtoFlux, the scene system rendering.

24:33: A lot of the complexity is in here, of what is actually happening.

24:38: All the UI, pretty much everything that's happening in Resonite is all within the single thing.

24:43: Which means you cannot really swap out the viewer as easily.

24:48: And if you wanted to re-implement it, you would need to re-implement pretty much all of this.

24:54: Versus with Second Life, you might just need to re-implement a bunch of these bits.

24:59: And it's still compatible with the server infrastructure that just stays as it is.

25:05: So I think for Resonite, it doesn't make as much sense having these kinds of third-party clients.

25:10: What you would have to do for it to be compatible, because in order for these clients to connect to each other,

25:16: they rely on all of these bits behaving the same.

25:21: So if you wanted to reverse engineer all of this, and then implement another language,

25:27: and be compatible with each other, you would have to take the huge undertaking of re-implementing years and years and years worth of code.

25:37: I don't think it's easily achievable, because it would take a long, long time.

25:47: And in the meantime, we're also going to implement a lot of other things.

25:51: So to me, what makes more sense is just taking these clients, and maybe we have a modified version which adds extra bits here,

25:58: or maybe modifies a few bits here.

26:01: It's kind of the way mods are doing it, and use that.

26:05: So instead of that being a third-party client, it's more like using a modified one.

26:10: And I think that works much better with Resonite's, or specifically Froox Engine's architecture, compared to something like this.

26:21: I think it pretty much covers this thing.

26:28: Just to reiterate the point, it's essentially like you don't have a separate client and a thinner viewer, because everything is happening in one place.

26:38: And Resonite is based on everybody in the session potentially being able to execute any of the code.

26:44: And you can have one of the clients, you can have the responsibilities delegated, you can have only particular clients running certain updates and certain modifications,

26:53: and everybody has the potential capability to do so. That's more of a software thing.

26:58: So, I don't think reverse engineering, like, FrooxEngine is too feasible, it's going to take a really long time.

27:09: And doing it in a way that preserves the compatibility, because you have to make sure things kind of behave the same, is very difficult.

27:17: Because you probably want to stay compatible with other clients, otherwise you're just kind of developing a different platform that's similar, but you cannot connect to others.

27:34: Even just to give you an example, you could potentially implement a subset, but then a bunch of things will not be working.

27:41: Say for example, you implemented only mesh renderers, and static meshes. Now anything using procedural meshes is not going to render.

27:49: So anybody using avatars like that, that's just not going to work, and other tools are not going to work.

27:53: If you don't implement ProtoFlux, then anything that's driven, any materials, any behaviors, any avatars that use that, doesn't work.

28:04: You pretty much have to implement most of it to really be able to be compatible with other clients.

28:12: So I think for Resonite in particular, I don't think we'll see custom viewers, we'll mostly see modded versions of them.

28:20: You did mention in the moderation chat, modifying the renderer, which is actually one of the things we're doing.

28:31: This, we're going to move the renderer into its separate process, and this communicates with that.

28:37: So whatever the renderer bits are are going to be here, and it's a separate process, so you could potentially replace this bit.

28:47: And have your own renderer, which is also a relatively big undertaking, and also it only lets you do limited things, because you cannot change how any of this behaves.

28:59: And if you want to do it for performance reasons, that's not really going to help you unless you're GPU limited.

29:04: Because the majority of CPU time is being spent here, so if you replace this, that's not going to help with any of these bits.

29:12: And also fundamentally, I would say that it still doesn't make it a custom third-party client, it just makes it a custom renderer.

29:20: That's not too different from just modifying some bits in here, like if you just considered this to be one thing.

29:31: But I feel like we're getting a little bit philosophical with that one.

29:36: But yeah, you could replace some bits like the renderer, it's something we already plan to do.

29:41: We do want to switch to a custom renderer, but also, one thing you have to realize, the renderer is only responsible for the rendering.

29:54: But it isn't going to help with any of these bits.

29:57: So I do hope that kind of makes things a little bit clearer, and answers your question.

30:01: If you've got any follow-ups, let me know and we'll get to those after we get through a bunch of the other questions.

30:08: So we're going to return back.

30:27: Next question is from Navy3001, when are Resonite shorts returning?

30:32: I don't have any updates on this right now, I'm sorry.

30:36: The next question is, JackTheFoxOtter is asking, my question today is regarding the planned in-game workshop.

30:43: How will it work? What kind of functionality do you want to be able to through it?

30:48: I feel workshop is one of those things where it's really going to supercharge the community.

30:58: Because I feel right now, it's one of those features where there's a lot of need for it in the community.

31:06: From the things we see people doing.

31:09: Because people in our community, you make lots of really cool things.

31:14: You make cool worlds, but also make lots of cool items, avatars, gadgets to play with, or even tools for other people to use.

31:26: And the way people share these right now is through shared folders.

31:30: The big problem with shared folders is one, it's discoverability.

31:35: You kind of have to know somebody who has shared folders, so you can grab it from them and put it there.

31:40: It's also discoverability, figuring out when new items get put in and so on.

31:46: To solve this, we want to introduce a workshop.

31:51: It's actually going to use similar mechanisms to submitting worlds, because you can already publish worlds and share them with others.

31:59: But it's going to generalize that, so you can do that for any items.

32:03: So say you build a gadget that you want to play with, something like this.

32:11: And you want to build something like this and share it with others.

32:19: You'll be able to submit this item specifically, not having to put it in a world, you'll be able to submit it to the workshop.

32:25: You tag it, you provide pictures and so on, and then it becomes available for people to search and spawn out of the workshop.

32:34: You'll also be able to submit updates, so you can do improvements, everybody gets whatever the latest version is.

32:44: But the overall goal is to have the workshop be this kind of centralized repository of community creations.

32:52: And something that, you know, we'll build around.

32:58: Actually, I should go and draw some more sketches. I'm going to go and draw more sketches.

33:08: Let's go here, let's go here, there we go. Did I forget my brush? No, it's here, there we go.

33:11: I'm the guy pushing, like, the wheelie cart around with the TV on it. Oh, what happened to the camera?

33:17: Oh, what the heck? Uh, the camera got messed up.

33:22: Oh dear, I must have accidentally grabbed it, I apologize. Oh, that's good, just adjusted.

33:29: Okay, there we go, I think that's good enough.

33:34: Eh, sorry, the camera got grabbed.

33:41: There we go.

33:43: So, you know, with Workshop, like, you'll be able to, you know, submit, like, various items in there.

33:48: And on your dash, you know, there's gonna be, you know, you're gonna have, like, Workshop buttons,

33:54: so you're, like, you know, click it, and you have, like, you know, a feed of items.

33:56: So you have, like, you know, for example, this cool, like, you know, thing, whatever, whatever, you know,

34:03: whatever this thing, you know, that is, I don't know, I don't know what I'm drawing.

34:09: I'm drawing an avatar, you know, and it's, like, maybe, maybe it's, like, you know,

34:15: I'm spending way too much on these doodles.

34:19: So you have, like, you know, a feed of different items that people make, and there are, like, you know, tags,

34:23: so some of them can be avatars.

34:25: Then you're gonna have, you know, categories.

34:27: So you can, for example, be, like, you know, I want avatars.

34:31: And by the way, this is not, like, actual, necessarily how the UI is gonna look like.

34:34: This is just more, you know, straight principle.

34:37: But you have, like, you know, stuff like avatars.

34:38: You have, like, tools, you know, and gadgets.

34:41: Some of the popular tags.

34:43: Or maybe games.

34:46: One of our users, like, they were, like, actually just, literally yesterday,

34:49: he built a version of Cards Against Humanity in here,

34:53: and he gave me the shared folder, so, like, you know,

34:56: that's gonna make it easier distributing.

34:57: You just, you know, submit it there.

34:59: You're gonna have, you know, like, a search bar,

35:00: so you can, like, be, like, I wanna search for this thing,

35:04: you know, maybe find this thing.

35:06: You can browse the categories.

35:08: So it's gonna be, like, you know, the typical, like, workshop, like, UI.

35:11: Listen, like, when you browse all the items,

35:13: and then you can, you know, spawn it out,

35:14: or maybe you can equip the avatars, and so on.

35:17: But what I think is also gonna make the workshop really powerful is, you know,

35:20: like, when you click on the items, you know,

35:22: there's gonna be, like, a description of it, you know,

35:24: you can make, provide pictures and other things.

35:27: Eventually we'd like to have, you know,

35:30: other features built around it, like, for example, discussions.

35:34: If you have an item, and you open it up,

35:37: oop, my second drawing,

35:41: you open an item, you know, say, like,

35:44: you have the, you know, avatar,

35:48: and there's, you know, some description of it,

35:50: you know, maybe there's a little discussion, like, you know,

35:51: with people saying this is a cool avatar, you know,

35:54: like, this is such a cool thing, and, like, this is super cute,

35:56: and, like, you know, whatever people want to discuss.

36:04: Like this, we want to generalize it, so they're kind of like,

36:07: they're almost like, you know, modular pieces that kind of go together.

36:11: So when we add discussions, you know, so we can discuss items,

36:15: we also make it, like, you know, possible to have discussions for other things.

36:19: So, say, for example, once we add,

36:22: so we can actually see, you know, the update history in here,

36:26: you can maybe leave comments to the updates,

36:27: or maybe, you know, when we add profiles, you know,

36:31: so we have, like, a profile, maybe people can leave comments,

36:35: or maybe once we add events, like, you know, events UI,

36:38: you'll be able to add comments, you know, to whatever events are being planned.

36:41: So we want to make that, like, you know, functionality very generalized

36:44: and kind of, like, you know, modularly build pieces of functionality together.

36:50: The other part where I think Workshop is going to be a little important is

36:54: once we have, you know, people submitting avatars

36:57: and they're properly tagged as avatars,

36:59: it'll improve the initial user flow,

37:02: because oftentimes when a user comes to Resonite,

37:05: what they want to do, you know, they want an avatar.

37:10: So what we can do is during the initial setup, we'll be like,

37:13: we'll be like, you know, do you want an avatar?

37:20: You know, it's going to just ask them, do you want an avatar?

37:23: And then, like, you know, it shows them a feed of, like, you know,

37:26: some of the popular avatars that are free for use

37:29: and they can just pick one right there without having to search,

37:32: they just pick, it does everything for them.

37:34: They now have, like, you know, a starter avatar to walk around with,

37:37: instead of having to, like, you know, find a world and find an avatar in there.

37:42: So we can, you know, literally take a piece of the workshop

37:45: and put it into other people's, into other places, you know, like the setup UI.

37:52: Another example, say you have something, you know, like the material tool.

38:03: And you want to change some materials, we have to find them.

38:06: Like, you know, for example, here's some materials.

38:08: I can, like, grab this.

38:11: So you can find materials, you know, in the world if you want to change something,

38:14: and I'm grass for some reason.

38:17: You could, you know, submit materials and tag them into the workshop.

38:21: So you could search the workshop and say, like, you want, like, a wooden material.

38:26: And you find one there, you spawn it in, you pop it in.

38:30: But with our focus on making everything very modular and kind of, like, you know, interacting with each other,

38:36: what we'll be able to do is we can actually integrate part of the workshop into the tool itself.

38:42: So it's just going to be like, I want the material.

38:45: You select, you know, find material, and it pops out, you know, a little like UI.

38:52: That's technically the workshop, but it's filtered just for material.

38:57: Like, you know, wood, or maybe there's metal, or glass, you know.

39:02: There's a bunch of categories, and there's like, you know, showing just the materials.

39:08: And you can just browse it, you know, with your tool, and you'll be like, okay, this is the material I want.

39:12: You click on it, it pops in there, and you can, like, you know, use it, and you can, you know, apply it to things.

39:17: You know, now the wall is grass.

39:19: So being able, you know, to use the workshop as a building block for other features

39:24: is another of the things that I think will be really, really powerful.

39:28: Because, you know, it's gonna allow us to simplify the workflows.

39:34: And even, you know, doing something like that, because technically you can, you know,

39:36: you could go to this interface, you could search, filter by materials, you can, you know, find the material you want,

39:42: spawn it out, plop it into the tool, you can still do that.

39:46: But I think by having it integrated into the tool, it skips a bunch of steps.

39:49: I think that can help tremendously, because like when you're building worlds,

39:54: you know, that's finding a material that's, you know, something you do very often,

39:59: as you're working. So if we remove, you know, say, five steps from it,

40:05: and it's a thing you do, you know, a hundred times per session,

40:08: that's 500 steps, you know, removed, 500 clicks you don't have to do anymore,

40:12: and this can make like a huge difference to the workflow.

40:16: There's a bunch of other features as well that are going to integrate, you know, with the workshop.

40:21: One of the things we want to do is add a system for sort of like, you know, tracking licenses.

40:27: So you can say, you know, this asset, you know, has this license to use,

40:31: and maybe this one requires, you know, payment or proof of ownership,

40:35: because you have to purchase the asset.

40:38: What it will do, once we have that feature,

40:42: it's, you know, the workshop is going to become aware this item is, you know, something that requires payment.

40:50: So if you want to actually spawn that item and use it, you'll have to purchase it or prove that you own it.

40:56: And at that point, the workshop UI will also serve as sort of like a shop.

41:03: So you'll be able to like, you know, sell items and worlds, you know, and avatars.

41:07: Through Resonite.

41:10: The other part with the licensing system that I think is going to make things also easier for new users is

41:17: you might be able to, you know, prove ownership of certain assets by, for example, having a file.

41:23: So say you purchase, you know, like an avatar and you have the FBX file for it.

41:31: If you drag and drop, you know, the FBX into Resonite, we can actually detect,

41:35: okay, this file hash is associated with this paid asset, so that can ground you a license,

41:41: and we can just be, instead of setting up this avatar, you know, from scratch,

41:48: here's, you know, here's a list of all the setup versions of it, do you just want to spawn one of them?

41:54: And I think that might help, you know, a little bit kind of initial onboarding, like,

41:59: when you don't find the avatar you want, but you drag one in,

42:02: and the system is able to use that, you know, to verify ownership,

42:06: and of course it also depends, you know, on whether the author of the asset is, like, you know,

42:10: okay with this kind of proof of ownership, but assuming that they're okay with it,

42:14: we can just grant the user, you know, you now have the permission to use this,

42:19: you can spawn these already pre-setup versions without having to worry about setting it up from scratch.
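
A minimal sketch of that hash-based ownership check, with a hypothetical registry mapping file hashes to paid assets; Resonite's actual mechanism may differ:

```python
# Sketch of hash-based ownership: hash the dropped file, look it up in a
# registry of paid assets. Registry and names are hypothetical.
import hashlib

demo_fbx = b"example fbx contents"

# Hypothetical cloud-side registry: file hash -> pre-setup asset it unlocks
PAID_ASSET_HASHES = {
    hashlib.sha256(demo_fbx).hexdigest(): "example-avatar-setup",
}

def check_ownership(file_bytes):
    digest = hashlib.sha256(file_bytes).hexdigest()
    asset_id = PAID_ASSET_HASHES.get(digest)
    if asset_id is not None:
        return f"license granted: offer pre-setup versions of {asset_id}"
    return None  # unknown file: fall back to the normal import flow

print(check_ownership(demo_fbx))
```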

42:26: And there's going to be another thing, you know, like where the workshop,

42:30: it's sort of like, you know, this kind of central feature, and there's like other features that like interact with it.

42:37: Because one of the things we really want to avoid is like, you know, designing things,

42:41: but they're like, you know, big monoliths, because we could make it, you know,

42:44: that the shop itself is an integral part of the workshop.

42:48: We could make it, you know, the licensing too.

42:52: The problem is, you know, it makes these things into these big kind of monoliths,

42:57: which makes them harder, you know, to use, and it makes it harder to use that functionality outside.

43:03: Because, say for example, you know, for the licensing, the workshop can be one way to find paid assets,

43:10: you know, something like people are selling.

43:12: But say, you're just, you know, browsing the world, and like, you know, say Cyro is like, you know,

43:16: say Cyro is like, you know, playing with a thing, you know, he has like this cool collar, and say this is a paid asset.

43:21: And I will be like, you know, I want this thing, and like, I don't mean this thing specifically,

43:27: for example, because I think this is a personal one, but say, like, say this is a paid asset.

43:32: And I'm like, I would like this thing, I want it too.

43:34: And you'll be like, I want to save it.

43:36: And the system can be like, this is a paid asset, you could save it, but you know, you have to pay the creator of it.

43:42: So you could find items kind of organically outside of the workshop as well.

43:48: And it's going to use the same licensing system.

43:50: And that way having the licensing system be a technically separate part,

43:53: but one that integrates with the workshop is a lot more powerful than having it integrated into it.

44:02: Oh, there is one more system that is also going to be a technically separate thing,

44:09: but that I think will integrate really well with workshop and that's Molekule.

44:14: That's going to be our sort of like version tracking system.

44:16: And our primary goal is to use it for Resonite itself.

44:21: Essentially so we can push updates to it, track changes and so on.

44:26: You could switch different branches much more easily than through Steam.

44:30: And our own system for just handling that.

44:33: But the versioning functionality and tracking of the changelog,

44:37: what's changed between these and these versions,

44:40: that's useful for more than just Resonite.

44:42: That's going to be useful for items as well.

44:45: It's going to be another thing that's going to integrate.

44:47: So if you, for example, browse an item,

44:48: maybe there's a thing where you see this is the current version,

44:52: but you can see a version history and you can see changelog.

44:56: So, for example, version 0.0.2, and it can be: added this thing, fixed this thing.

45:03: And maybe we can spawn the old versions too.

45:07: So there's going to be another thing that will integrate with the workshop.

45:13: But technically that's also going to be a separate feature.

45:15: The other part of the workshop or other example that I want to mention before we move on

45:22: is I mentioned materials.

45:25: And there's avatars, gadgets, tools, whatever you want to share.

45:29: But I think one that's also important to mention is ProtoFlux.

45:35: Because one of the things that's going to come to ProtoFlux are going to be custom nodes.

45:39: And custom nodes, they're going to let you bundle common functionality, essentially, into nodes.

45:46: So if you have a complex thing that's a bunch of nodes.

45:50: And I kind of talked about this in a different video as well that's on YouTube.

45:54: So I do recommend checking that one out if you want to know a little bit more about this.

45:59: But so you have a bunch of nodes, you combine that into a single node that you can then reuse in a bunch of different creations.
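
A rough sketch of that bundling idea, packaging a pipeline of primitive nodes into a single reusable node; the structure is invented for illustration:

```python
# Rough sketch of bundling a graph of primitive nodes into one reusable
# custom node. Structure invented for illustration.
import math

def make_custom_node(inner_nodes):
    """Package a pipeline of primitive nodes as a single callable node."""
    def custom_node(x):
        for node in inner_nodes:
            x = node(x)
        return x
    return custom_node

# Two primitive "nodes" combined into one shareable unit:
normalize_angle = make_custom_node([
    lambda a: math.fmod(a, 2 * math.pi),        # wrap into (-2*pi, 2*pi)
    lambda a: a + 2 * math.pi if a < 0 else a,  # shift into [0, 2*pi)
])

print(normalize_angle(-math.pi / 2))  # ~4.712 (i.e. 3*pi/2)
```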

46:08: And you'll be able to take this custom node or maybe a whole library of them and submit it to the workshop.

46:14: And similar to how you could find materials with the material tooltip, there could be one for ProtoFlux where you're like,

46:21: I want a function that computes some complex math with a sphere and tangents or something.

46:30: And you just search it and be like, okay, there's a function and you just integrate it into a ProtoFlux.

46:34: And I think it's going to make sharing of code much easier and it's going to make it much more powerful building stuff with ProtoFlux,

46:42: because instead of having to find already pre-made things or just having to constantly build everything from scratch over and over,

46:51: you'll just have access to a repository of pre-made code and libraries and just easily integrate it into your own projects.

47:02: And it's also a part where Molekule, the system, is going to be very useful,

47:08: because say you integrate some set of nodes, like a library for something,

47:13: you might want to tell the system, whenever it's loaded, use the latest version of it, but only up to the minor version.

47:23: That way, if they make any fixes and improvements, your thing can automatically update,

47:28: but in major versions, you have to update manually, because usually, if you use semantic versioning,

47:34: major versions have breaking changes, whereas minor ones should not.
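
A minimal sketch of that versioning rule, auto-updating within a major version but not across one; this assumes plain semantic-versioning tuples:

```python
# Sketch of the update rule: auto-update on minor/patch bumps, require a
# manual update across major versions. Plain semantic-versioning tuples.

def should_auto_update(current, candidate):
    """Versions are (major, minor, patch) tuples."""
    return candidate[0] == current[0] and candidate > current

print(should_auto_update((1, 2, 0), (1, 3, 1)))  # True: minor bump
print(should_auto_update((1, 2, 0), (2, 0, 0)))  # False: breaking change
```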

47:40: So, in short, the workshop I feel is going to be one of the really, really powerful features,

47:45: and I think it's just going to empower the community and the creators and even regular users to do all of the more,

47:52: because for regular users, if you're just a casual user, you want to hang out, you want to socialize,

47:57: it's going to make finding avatars easier, it's going to make it easier to find gadgets and games to play with.

48:03: If you're a creator, the same repository can be used for materials, textures, sounds,

48:10: ProtoFlux nodes, and you can find it easily, and we can even build integrations with our tools,

48:17: and you'll be able to build integrations with your tools, because it's all going to be data feed based,

48:23: to supercharge them with having a repository to all the community-made content.
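
A toy sketch of the data-feed idea, with different tools querying the same workshop feed under different filters; the feed shape and field names are invented:

```python
# Toy sketch of tools querying one workshop feed with different filters.
# Feed shape and field names invented for illustration.
WORKSHOP_FEED = [
    {"name": "Wood Planks", "tags": ["material", "wood"]},
    {"name": "Brushed Steel", "tags": ["material", "metal"]},
    {"name": "Party Cannon", "tags": ["gadget"]},
]

def query_feed(feed, required_tag, search=""):
    return [item for item in feed
            if required_tag in item["tags"]
            and search.lower() in item["name"].lower()]

# The material tool shows the same feed, pre-filtered to materials:
print(query_feed(WORKSHOP_FEED, "material", search="wood"))
```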

48:29: So yeah, it's one of those features I'm really looking forward to adding,

48:35: like, I keep, like, every time I see people sharing shared folders, and then having shared folders off shared folders,

48:45: and having multiple nestings of these, I'm like, it feels like, you know, the community content, you know,

48:52: it's almost like bursting at its seams, like it really needs a workshop,

48:57: so introducing that I think will get lots of cool content, and it's also going to make it much cooler and easier to discover that, which helps new users and existing users who build content.

49:12: Actually, one more thing I'm going to mention is, if you have a creator that's publishing things,

49:18: one of the things that I would also like to integrate is having sort of feeds.

49:22: So if the creator pushes in one update to an item, or they make a new item, and you're following that creator,

49:29: it can pop in your feed and be like, this creator made this new item.

49:34: And that's going to help contribute to the discoverability.

49:39: Anyway, I think I've rambled long enough about this one, so hopefully that answers the questions,

49:43: gives you a better idea of what we want to do with the workshop, and why it's such an exciting feature,

49:48: and why I think... it's definitely like, after the performance update is done, it's like, you know,

49:55: one of the things that's kind of at the top of my list where I think this is going to help the community,

49:59: it's going to help the growth, you know, of the community and of the platform, and I would like to prioritize it,

50:05: like, soon. Yeah. Anyway, thank you very much for the question, we're going to move back to the other ones.

50:19: That's just the awkward silence as we move between locations.

50:29: Okay. Also, in terms of... so the next question we have is, with the in-game workshop, I want to say,

50:38: group projects very much needed, this is by TheRedNeko, by the way, I forgot to mention them.

50:44: Me and Critters do joint work on items, and only allowing one user to have credit would be annoying.

50:49: We do have groups, though, like you can request to have a group made, I think, if your Patreon tier is high enough.

50:55: Yeah. And you can then have a... it gives you a set amount of shared, like, group storage space that you both can access,

51:03: and when you publish worlds, or like, let's say you published a public folder from the group,

51:08: it will show as being owned by that group, rather than any individual user.

51:14: The groups mechanism is very useful for the collaboration, so you can just have a shared folder,

51:20: and you can have a shared credit.

51:22: So I would definitely recommend using that feature, and being able to publish things as a group,

51:27: that's definitely still going to be part of it.

51:29: I do think it might also be possible, with the licensing system, we could just say,

51:34: you don't have to create a group, we can just allow multiple credits for things,

51:39: but we'll have to see about that one, but the groups one is always going to be there,

51:44: definitely good.

51:48: GrandUK is asking, are you aware of what bus factor means when referring to a project?

51:52: If not, is it a measure of how many people could disappear before the project stops developing?

51:56: So in that context, what is the bus factor of Resonite in a technical sense,

52:00: and what can be done to make it better?

52:02: Yes, so that's actually something that we commonly talk about in the community,

52:06: not the core community, in the team.

52:08: Well, sometimes in the community as well, but I misspoke.

52:13: And it's something that we're continually looking to improve,

52:18: and generally what it involves is making sure that we document our processes,

52:24: we know how things are set up, what's our cloud infrastructure like,

52:29: how are things working together.

52:31: So having documentation so other people can pick up things,

52:33: that kind of helps with that kind of stuff.

52:38: The other part is making sure people actually have access to the things,

52:45: like with Azure Access, and then there's the part of the code.

52:52: Pretty much all the engineering teams have access to the code,

52:55: so if something happens to any one of them,

52:58: technically any of the people can continue on.

53:02: So it's something we do talk about, something we take measures to improve.

53:06: It's a gradual thing because there's a lot of things,

53:10: and sometimes if something happens to some people,

53:13: that's going to slow things down,

53:15: but there are ways to recover from that.

53:22: Next question is from GrandUK.

53:24: Are there any thoughts you have on open-sourcing Resonite,

53:27: either in parts or fully?

53:29: Oh, yes, actually.

53:30: It's actually one of the things I would like to do in,

53:35: not like the immediate future, well, soonish,

53:40: is start open-sourcing pieces of Resonite.

53:45: If we think about open-sourcing Resonite in full,

53:50: that one's a lot more complicated,

53:51: because we use code that's ported from third-party code

53:57: that we don't necessarily have a license to open-source.

53:59: We also use code from other things,

54:02: that we have a license to include in the code,

54:03: but not necessarily a license to re-license it.

54:07: There's stuff like where open-sourcing certain parts

54:11: might potentially open up some security issues,

54:18: because there's certain mechanisms,

54:21: like, say, stuff with ban evasion.

54:26: We specifically don't tell you how ban evasion works,

54:30: and part of it is on the cloud infrastructure,

54:33: where the code is not accessible to people.

54:37: It cannot be decompiled from the client,

54:39: and if people knew, or the more people knew how it works,

54:43: that makes it easier for them to get around it,

54:45: so open-sourcing that can cause harm.

54:51: So there's a lot of things like that.

54:54: There's also, the other part of it is,

54:56: in order to open-source it, that usually implies

54:59: that people will be able to submit some pull requests,

55:02: because they want to submit improvements.

55:05: We need to be able to handle those,

55:07: both having the bandwidth for it,

55:09: and having some sort of structure in place,

55:11: where we are like, these are our coding practices.

55:15: This is what we accept, this is what we don't accept.

55:18: Because we need to ensure,

55:22: that is, for our version of it,

55:24: we want to maintain a certain quality of the code,

55:28: and we want to maintain a certain feel of the code.

55:33: One of the things, one of the strong principles for Resonite,

55:37: is we always make sure, when possible, everything stays synchronized.

55:42: But you could just implement things in a way that doesn't follow that principle,

55:47: but it's something we would be very unlikely to accept.

55:51: But that's something that needs to be defined,

55:52: we need to say, this is how you need to write things.

55:54: So it adds additional overhead and complexity that we need to do,

56:01: to open source things.

56:03: And even generally, we might have, well, we do have an overall goal,

56:08: where we want to take Resonite,

56:10: and we want to head in a certain direction.

56:14: There also might be things where people might fix bugs

56:17: and things in a way that's not the highest quality,

56:22: or that could cause other issues and other problems,

56:25: so we might not accept those.

56:28: So there's a lot of complexities with open sourcing the whole thing,

56:32: that I don't think we are ready to handle.

56:36: It requires a lot more work and potentially more manpower

56:39: than we have right now to do that.

56:42: Unless we open source it and we're not accepting anything,

56:47: we just do whatever,

56:48: but I don't think that would be a good way to do it.

56:52: So in order to tackle this,

56:54: I want us to do the open sourcing more gradually.

56:59: Oh, actually, there's one more thing.

57:03: If we open source the thing fully,

57:06: one consideration or concern that we have is,

57:09: are we going to be able to continue doing it as a company?

57:12: Could somebody like META just come in,

57:16: take the code and just push us out,

57:18: and destroy us with infinite resources?

57:24: That's something we don't want to happen.

57:25: We want to keep working on this.

57:27: We want to make this our livelihood.

57:29: And there's a thing, we need to balance the benefits we would get from open sourcing.

57:37: So that's another kind of thing where it's a bit of an unknown.

57:40: And there's stuff like licenses that will help prevent that,

57:44: or at least minimize the risk.

57:46: But also we have to weigh those against other things.

57:49: So it becomes a more complicated issue.

57:54: So on the flip side, doing it by pieces I think is much more digestible,

58:00: much easier on our bandwidth right now,

58:03: while still getting a lot of the benefits of that.

58:07: So what I would want us to do is essentially take various subsystems

58:12: and make those open source.

58:13: One of them is being, for example, the import and export system.

58:21: Because I think that one's a really good candidate to open-source relatively early,

58:29: because it doesn't really modify the data model.

58:35: It doesn't add any types of components and things.

58:38: It's just a system where you take whatever external formats you have,

58:43: slice them into the data model,

58:45: or takes stuff that's in the data model and translates it out.

58:50: If we design it right, we can make sure that it's easy to write importer and exporter modules.

58:57: We can also take the existing importer and exporter functionality

59:00: and essentially just open source it,

59:05: so people can use it both as a reference or to make their own forks with modifications

59:09: to help better support the content you're making.

59:14: It's something where we don't have to worry super much about long-term issues.

59:20: If you're making a component, something that becomes part of a data model,

59:26: now there's a worry because that component can be saved into the cloud.

59:30: Now we have to worry, will that component keep working the same way it's working

59:35: five years down the line?

59:37: Is this maintainable? Is this going to cause issues?

59:40: And that adds a lot of overhead to those things.

59:43: Without importer and exporter, we don't have to worry about that

59:46: because it's not adding any new components, it's just translating stuff.

59:50: So at the moment you import something, it translates it into the Resonite data model,

59:56: and with the Resonite data model, we already know what's in there and that we can support it.

01:00:01: So the translation is essentially just a one-time process.
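
To illustrate the shape this could take, here is a minimal sketch of a pluggable importer interface, in Python for readability. Everything here is hypothetical (the real FrooxEngine API is C# and will look different); the point is only that an importer is a one-way translation into the existing data model:

```python
from abc import ABC, abstractmethod

class ImporterModule(ABC):
    """Hypothetical plug-in interface for a format importer.

    An importer only translates an external file into objects that
    already exist in the data model; it never defines new component
    types, which is what makes it safe to open source early.
    """

    # File extensions this module claims, e.g. [".obj", ".ply"]
    extensions: list[str] = []

    @abstractmethod
    def can_import(self, path: str) -> bool:
        """Cheap check (extension, magic bytes) before a full parse."""

    @abstractmethod
    def import_asset(self, path: str, target_slot: "Slot") -> None:
        """Parse the file and build equivalent meshes, materials and
        textures under target_slot, using only existing components."""
```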

01:00:07: Additionally, it helps with some of the security stuff, because say we open source stuff,

01:00:13: it might make it much, much easier to make rippers for content,

01:00:18: where you can steal people's avatars and things like that,

01:00:22: because now there's access to all of the code.

01:00:25: If we only open source the importer and exporter,

01:00:30: and the specific APIs that it interacts with,

01:00:33: we can make it so it's not possible to invoke an exporter on something you don't own.

01:00:40: So you cannot just use it to rip data out,

01:00:43: and it helps make things a little bit more contained.
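
A rough sketch of that containment idea, with hypothetical names: the engine, not the open-sourced exporter module, is the one that decides whether an export is allowed at all.

```python
def try_export(item, requesting_user, exporter):
    # Hypothetical gate kept on the closed engine side: the exporter
    # module never even sees content the user isn't allowed to export.
    if item.owner_id != requesting_user.id:
        raise PermissionError("cannot export content you don't own")
    return exporter.export(item)
```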

01:00:48: The other part that I think we might open source is stuff like the shaders.

01:00:54: Those would be mostly for reference.

01:00:55: One thing actually that I've been considering is also open sourcing the new procedural animation system,

01:01:05: because it's one of those things where I'm not really happy with it.

01:01:10: It's been kind of rushed and it needs a lot more work,

01:01:15: but I also want to spend time on some bigger features,

01:01:18: so one of the thoughts I was having is if we made this open source,

01:01:23: here's the code, we could start accepting improvements,

01:01:26: let the community help us polish it up some more,

01:01:30: and improve the platform.

01:01:32: I don't know if that's the way we'll do it.

01:01:34: Actually, there's a question for you.

01:01:35: Would you be willing to help us improve things like that,

01:01:41: so we can focus on other things?

01:01:43: Is that something you would want to do?

01:01:45: If you like this idea, let us know,

01:01:48: because that would definitely help influence the decision about that.

01:01:53: The other part that's also going to be great to open source

01:01:55: is the device drivers.

01:01:59: So if you want to support more devices,

01:02:02: more face tracking, more haptics hardware,

01:02:07: we can take existing drivers we have,

01:02:10: we open source them, again, as a reference,

01:02:12: so you can see this is how these things are implemented,

01:02:16: to make your own, or maybe you want to make a fork,

01:02:19: you don't like how a certain part of it works,

01:02:22: you want to modify it, run your own fork instead,

01:02:25: share it with the community, you'll be able to do it as well.

01:02:31: So this is going to be another great one,

01:02:33: it's also one of those that doesn't need to add new things to the data model.

01:02:38: So I do think we're going to be,

01:02:41: maybe this year, hopefully this year,

01:02:42: we'll start open-sourcing pieces of Resonite,

01:02:46: because we do have a great community,

01:02:48: and people who want to help make this platform better,

01:02:54: and I feel that it's going to provide a way to contribute more in that regard,

01:03:02: and also, even if you don't want to help contribute to the main platform,

01:03:06: it's going to give you an ability to better customise some of the aspects of the platform.

01:03:12: Say you want to import some super obscure format,

01:03:17: one that we would not really prioritize,

01:03:21: now you have easy frameworks to work with to add support for that format into Resonite.

01:03:29: By open-sourcing that part, we give you more power in there.

01:03:35: And over time, we open source more and more bits of Resonite,

01:03:38: and maybe eventually we do the whole thing,

01:03:41: but like I mentioned earlier, that is a much bigger task with a lot more considerations.

01:03:48: But it's definitely something we'd like to do, and it's going to happen in pieces hopefully this year.

01:03:57: So stay tuned, and hopefully soon-ish then.

01:04:06: The next question is...

01:04:10: ModernBarlon is asking, will component search be properly implemented anytime soon out of curiosity?

01:04:15: Well, it's one of those things, it depends what you mean anytime soon.

01:04:20: If you mean like, you know, because that could mean in a week, in which case the answer is pretty much no.

01:04:25: It could be in a month, in which case it's probably still no.

01:04:28: It could be within the next six months, which the answer is maybe.

01:04:34: The component search, like the component UI,

01:04:38: we're actually going to scrap the current one and it's going to be re-implemented with the data feeds.

01:04:43: Data feeds, as part of their system, how they work under the hood, they natively support search.

01:04:49: So once that UI gets re-implemented, it'll come with search.

01:04:53: This is also going to include ProtoFlux nodes, which are technically components in the system as well.

01:04:58: It's just a specific subset.

01:05:01: But yeah, depends what you mean anytime soon.

01:05:04: We generally don't give time estimates on things because things change.

01:05:09: Right now the focus is still finishing the performance...

01:05:14: Performance updates.

01:05:17: The UI update is one of the things that's high on the list.

01:05:22: But on certain specific timelines, I cannot give you a number right now.

01:05:30: Moonbase is asking, how does Second Life have third-party clients then?

01:05:32: Are they just modified versions of the original client?

01:05:34: I think that was asked during the explanation bit, so I think we already answered that one during that part.

01:05:42: I do believe they use modified versions, like modified clients that replace the original, more or less.

01:05:47: There's some open source implementations. I'm not super well versed, but...

01:05:55: Next question is from Jack the Fox Author. I'm just checking the time, we've still got about an hour left.

01:06:15: Thank you, Nikan132, for the subscription with Prime. Thank you.

01:06:21: Thank you.

01:06:21: It's probably kind of late because we're just chewing through the questions.

01:06:29: Sorry, I hope you're still in there. Thank you.

01:06:33: It's a good one.

01:06:37: Jack the Fox Author is asking, I remember under...

01:06:40: Wait, I already read that.

01:06:42: So yeah, as far as the nodes go, the nodes will generally not be modified.

01:06:48: The nodes that are in there are kind of going to stay the way they are.

01:06:51: There might be some new overloads.

01:06:53: But in terms of the specific support, we want to make it as generic as possible.

01:06:58: So for example, we're going to have nodes which let you access an element out of a collection by its index.

01:07:06: Or maybe by its key.

01:07:08: And that collection can be different types.

01:07:11: It can be an array, it can be a list.

01:07:15: So there's going to be probably some equivalent of for-each iteration, where you iterate over all the elements.

01:07:24: So you essentially just give it a collection, it just iterates over all of them.

01:07:29: So you're going to have a lot of very common operations.

01:07:32: Same thing when you want to add an element to a collection, or remove, or replace.

01:07:38: There's going to be a bunch of nodes for that.

01:07:40: And once you extract an element out of a collection, say it's a list of numbers, then you can just use addition nodes for working with a number.

01:07:51: Whatever you were to do, say, with an integer or a string or something, you just do that once you've extracted it from the collection.

01:07:59: So most of it is probably going to be ways to iterate over collections, access the elements in collections, and build and modify the collections, like mutate them.

01:08:11: There's also going to be a number of nodes which would accept collections or produce collections.

01:08:17: For example, there's going to be a node which accepts a string and gives you a dictionary collection that's JSON parsed,

01:08:28: and there's going to be a reverse which accepts a dictionary or other data types and produces a JSON string.

01:08:33: So it's easy to convert from one to the other.

01:08:36: You could technically build those on your own as well, if you want to do JSON parsing, which some people already do,

01:08:42: but for some of the common ones we're going to provide you with primitives.
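
As a plain-Python analogue of those two nodes, the parse and its reverse, assuming standard JSON semantics (not the actual node names):

```python
import json

payload = '{"name": "Lamp", "intensity": 1.5, "tags": ["prop", "light"]}'

# "parse" node: JSON string -> nested dictionaries/lists
data = json.loads(payload)
print(data["tags"][1])      # -> light

# reverse node: dictionary -> JSON string
data["intensity"] = 2.0
print(json.dumps(data))
```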

01:08:47: Same thing, once you get component access, you just get a collection of components on a particular slot,

01:08:58: you can iterate over it, access a specific element, whatever you want.

01:09:02: Another good example is the Raycast nodes, because right now you have Raycast One,

01:09:09: and I've specifically added the raycast to be Raycast One, so it doesn't need collections,

01:09:14: it just gives you the first hit, which in some cases, in a lot of cases it's sufficient,

01:09:19: and it's actually faster too, because with Bepu Physics, how we have it integrated,

01:09:25: that search is more efficient, because it knows it doesn't need to build a collection,

01:09:29: it doesn't need to sort them, it just literally finds whichever is the closest one that satisfies the criteria.

01:09:34: So using that one, if you don't need multiple hits, it's more efficient.

01:09:40: But we will add a RaycastAll node, where if you do a raycast, it gives you a collection of all the hits,

01:09:47: and you can iterate over them, you can sort them, do whatever you want with the results.

01:09:52: You essentially get a list, and then use the generic nodes for working with that list.
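
A sketch of the difference between the two node shapes, in Python with a hypothetical `intersect` helper: the single-hit version only keeps the best candidate, while the "all" version has to allocate and sort a collection.

```python
def raycast_one(origin, direction, colliders):
    """Closest hit only: no result collection is built, and a real
    spatial structure could stop the search early (like the current node)."""
    best = None
    for c in colliders:
        hit = c.intersect(origin, direction)   # hypothetical helper
        if hit and (best is None or hit.distance < best.distance):
            best = hit
    return best

def raycast_all(origin, direction, colliders):
    """Every hit, sorted by distance, like the planned RaycastAll;
    it pays for a list allocation plus a sort."""
    hits = [h for c in colliders
            if (h := c.intersect(origin, direction))]
    return sorted(hits, key=lambda h: h.distance)
```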

01:09:59: This also applies for collections that are part of the data model, like for example the brushes.

01:10:06: The way the brushes work, literally when you're drawing a brushstroke, the brush is just adding new elements to an array,

01:10:18: or a list, or whatever you call it, and it's updating the procedural mesh, you'll be able to access all the data with ProtoFlux.

01:10:25: So you can just iterate over it, do whatever modifications, you can add your own.

01:10:29: So you could build procedural things, whatever you want to do with a collection of data, you'll be able to do at that point.

01:10:41: The other part is we can add nodes, for example for accessing mesh data, or texture data,

01:10:47: so you can iterate over the pixels, or iterate over the triangles, build your own meshes.

01:10:53: Same with audio, audio is a collection of audio samples.

01:10:56: So you can iterate over the audio data, generate stuff procedurally; it's going to open up a lot of cool options.
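
For instance, treating audio as just a collection of samples, procedural generation is ordinary iteration (a sketch of the idea, not the actual node set):

```python
import math

SAMPLE_RATE = 48_000

# Audio as "just a collection of samples": one second of a 440 Hz sine.
samples = [math.sin(2 * math.pi * 440 * i / SAMPLE_RATE)
           for i in range(SAMPLE_RATE)]

# Mutation is iteration too: halve the volume in place.
for i, s in enumerate(samples):
    samples[i] = s * 0.5
```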

01:11:05: Granik is asking can you submit a public folder for Cards Against Humanity, if it's fine to be shared?

01:11:10: I think it's okay to be shared, I'll have to check with Senuar, he's the one who made it, but he was giving it out.

01:11:18: Next question is rasmus0211

01:15:23: Any plans for being able to display your rules up front prior to users joining a world?

01:15:28: I'm under the impression many users, especially new users, don't read them prior to joining a world.

01:15:33: Yes, there's been a feature in the works in the background for that.

01:11:37: I don't know when it's going to come, but it's probably something that's going to pop in at some time.

01:11:44: Next question is from konnager

01:11:48: Sorry if I'm not pronouncing your name right.

01:11:51: Not trying to make you self-conscious, love what you're doing, but you're also saying "you know" a lot.

01:11:55: It's okay to have a breather every once in a while, you don't need to be nervous.

01:11:58: Oh, I'm not really nervous, I'm just going to talk like this.

01:12:03: Glitchfur is asking nakin132

01:12:07: Yes, I guess Glitch is surprised to see nakin and also Glitch is cute.

01:12:14: It's probably screaming right now.

01:12:19: Next question, Navy3001

01:12:21: Will Resonite have something like prefabs? It would be nice to update one asset and have it update across projects.

01:12:27: Yes, we do have a github issue for it. It's a functionality that's going to appear at some point.

01:12:32: There's actually also something that's going to integrate into the workshop which we talked about earlier.

01:12:39: So yes, this will be a feature that will come.

01:12:41: It's a little bit of a tricky one because sometimes it depends how and when you want things to update.

01:12:49: Because you have a...

01:12:52: How do I put it?

01:12:58: You essentially build a world and use somebody else's prefab.

01:13:03: And they update that prefab that might break what you build with it.

01:13:08: So we'll need a system that's going to determine when you want your worlds to update and whatnot.

01:13:16: And that can be either just a fully manual process.

01:13:18: So you say I'm using this prefab, I want to update all the instances in this world right now.

01:13:26: Or we can have it semi-automated, and this is where a system like Molekule comes in.

01:13:32: Where whoever is making the prefab, they can version it, and they can do minor updates.

01:13:36: And you're going to say if it's just a minor update just update it automatically.

01:13:40: If it's a major version update then that requires a manual update.

01:13:47: So being able to define the rules for the prefabs, like when they update, is going to be important

01:13:52: on a social, kind of collaborative platform like this.
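
A toy sketch of that rule, assuming semver-style version strings (purely illustrative, not a committed design):

```python
def should_auto_update(installed: str, available: str) -> bool:
    """Hypothetical rule from the discussion: apply minor/patch
    updates automatically, require a manual step for major bumps."""
    inst_major, *_  = (int(p) for p in installed.split("."))
    avail_major, *_ = (int(p) for p in available.split("."))
    return avail_major == inst_major   # same major -> safe to auto-apply

print(should_auto_update("1.4.0", "1.5.2"))  # True  (minor update)
print(should_auto_update("1.4.0", "2.0.0"))  # False (manual review)
```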

01:13:55: The other part of it is we need to have a good way to define overrides.

01:14:01: Because with Resonite you don't really have a difference between an editor and player mode.

01:14:12: Usually in an editor, if you're editing, it kind of tracks the changes whenever you make some changes.

01:14:18: So if you have a prefab and for a specific instance you change its color, it tracks that.

01:14:24: But you know the tracking has an overhead.

01:14:26: So the prefab system will need to have an efficient way of handling this.

01:14:31: And determining what is an override on the prefab instance and what is just regular behavior that should be replaced.

01:14:41: So there's going to be probably some systems to deal with that.
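
One cheap way to express overrides, sketched with plain dictionaries (hypothetical, just to show the idea of storing only the divergence from the prefab):

```python
def collect_overrides(prefab_defaults: dict, instance_values: dict) -> dict:
    """Store only the fields where an instance diverges from the
    prefab, so an update can rewrite everything else while
    preserving the user's local changes."""
    return {
        key: value
        for key, value in instance_values.items()
        if prefab_defaults.get(key) != value
    }

prefab   = {"color": "white", "scale": 1.0, "label": "Chair"}
instance = {"color": "red",   "scale": 1.0, "label": "Chair"}
print(collect_overrides(prefab, instance))  # {'color': 'red'}
```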

01:14:44: Next question is from DustySprinkles.

01:14:55: So generally we don't answer "when" questions.

01:15:01: It's something we've been looking into in the background.

01:15:05: There's a number of complexities with that.

01:15:08: Because if you go with that kind of solution, how is it implemented?

01:15:15: What company do we use? Do they have good data privacy?

01:15:22: We, as Yellow Dog Man Studios, are a European company.

01:15:27: And generally there's much stronger focus on privacy and protecting people's privacy and identifying data.

01:15:38: The other part is how would the community work with it?

01:15:42: Which companies would the community be okay with us using?

01:15:48: So it's something that will probably show up in some form at some point.

01:15:52: There's not anything too specific right now.

01:15:58: So we'll keep you informed.

01:16:00: Once we're thinking more seriously and looking into specific companies for this kind of thing,

01:16:05: we might open up a discussion and be like, these are the companies we looked at,

01:16:10: which ones would you be comfortable with, which options would you be comfortable with,

01:16:14: and have that kind of discussion about this feature.

01:16:19: The next question is from TheRedNeko.

01:16:21: The issue right now is that groups is on the $20 tier, which is quite expensive for me,

01:16:26: so having a free or cheaper method for crediting multiple people would be great.

01:16:31: And you can always just put a credit on the item itself.

01:16:36: With the licensing system, we'll probably just have it so we can say these are the creators of this.

01:16:41: Because one of the powers of the license system is even if you're not selling anything,

01:16:46: you can just say, say this item.

01:16:50: I actually forgot which exact license we're using.

01:16:51: I think we're using some kind of Creative Commons ones, which means this is free,

01:16:55: you don't need to pay for this item.

01:16:57: But there still can be restrictions on how it can be used.

01:17:02: And you also want to track who works on this item, because there's multiple people that contribute to this.

01:17:08: So having the license system, you'll be able to place information, be like,

01:17:12: these are all the people that worked on this item, this is licensed with Creative Commons, this and this version.

01:17:20: So that system I think will provide that functionality.

01:17:23: For now, I would just recommend, you know, put a...

01:17:27: One thing that people do is just put like a slot which says like, you know, credits,

01:17:31: and put credits that way, or maybe integrate credits into the item itself.

01:17:38: The next questions...

01:17:42: This is pretty close, I kind of meant more like what it would be like, but IP, question marks or correction shows up.

01:17:50: Do you know which question they're talking about?

01:17:56: Hmm... I kind of lost context.

01:18:00: Yeah, I wouldn't know either.

01:18:02: If you're posting a correction, please include the context of the actual original questions,

01:18:07: because there might be a bunch of questions before we get to yours, and by that time we've forgotten what you originally asked.

01:18:16: Next question is from ModernBaloney.

01:18:19: So about the IK, do you have any ideas about what the new IK systems would be?

01:18:23: Yes, I mean we're going to build a custom one, it's going to be an in-house system,

01:18:26: we've got lots of ideas how it's going to work.

01:18:29: There already is a discussion on it on our GitHub, so I do recommend checking that out,

01:18:33: because it has a bunch of details.

01:18:38: I don't know if I should get too much into depth on this one, because we're going to be working on it right now,

01:18:44: There's a few parts, one of them is making sure it works well with lots of different avatars,

01:18:50: so it's easy to set up, so it also works really well with multiple trackers,

01:18:55: it doesn't do neck scrunching, it doesn't do weird things with the spine,

01:19:00: so there's kind of lots of things with that.

01:19:03: Also making it easy to adjust it, and to assign bones and stuff like that.

01:19:10: Next question, so just checking time, Navy3001, random question,

01:19:17: if you move components of assets into disabled slots, will those assets load?

01:19:25: Like if mesh is on it, will the mesh load?

01:19:28: So right now, well it kind of depends which components you move,

01:19:31: if you move just the asset provider, then it'll load as long as there's something referencing it.

01:19:40: Generally yes, the assets will still be loaded.

01:19:44: It kind of depends a little bit, because there's some things where,

01:19:48: actually no, even when disabled, they'll load.

01:19:51: There's a feature for our specific optimization that's planned,

01:19:55: called the Cascading Asset Dependency.

01:20:00: What it'll do is actually make it like, when you, say you have a mesh renderer,

01:20:05: and the mesh renderer is using a bunch of textures,

01:20:08: it's using a mesh. If you disable it, it means it doesn't need to be rendered,

01:20:12: it can tell those textures and the meshes, and the other assets,

01:20:17: I don't need you right now. I'm not being rendered, I'm disabled.

01:20:22: Which means, if there's nothing else using them, they can actually unload from memory.

01:20:27: Or if it starts that way, maybe they don't even load in the first place.

01:20:30: So that will change the behavior, but right now disabling it doesn't prevent it from being loaded.
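
A toy sketch of that cascading idea using reference counts (hypothetical names; the real system would also have to cover textures referenced by materials, and so on):

```python
class Asset:
    """Assets stay loaded only while something visible requests them."""
    def __init__(self, name):
        self.name, self.users, self.loaded = name, 0, False

    def acquire(self):
        self.users += 1
        if not self.loaded:
            self.loaded = True            # load from disk/cache here
            print(f"loading {self.name}")

    def release(self):
        self.users -= 1
        if self.users == 0:
            self.loaded = False           # free CPU/GPU memory here
            print(f"unloading {self.name}")

class MeshRenderer:
    def __init__(self, assets):
        self.assets, self.enabled = assets, False

    def set_enabled(self, enabled):
        if enabled == self.enabled:
            return
        self.enabled = enabled
        # A disabled renderer tells its assets "I don't need you right now";
        # if nothing else holds them, they can unload.
        for a in self.assets:
            (a.acquire if enabled else a.release)()
```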

01:20:41: Next question is from Dusty Sprinkles.

01:20:44: Do you think in the far-flowing future there will ever be some kind of way to pack components

01:20:48: with ProtoFlux into a custom component?

01:20:50: I think something like that would be useful, but I don't know if it fits any specific long-term plans.

01:20:55: Yes, there's actually a GitHub issue for it already.

01:20:57: It's called static ProtoFlux assets.

01:21:01: What we want to do is take a bunch of dynamic nodes that we have with ProtoFlux,

01:21:06: and you compile them into a static asset, and it will then get unloaded.

01:21:11: So that's something that's already on the roadmap, it's probably going to happen at some point.

01:21:15: It's also going to be useful, like I mentioned earlier with the workshop,

01:21:20: where you can essentially publish libraries of nodes,

01:21:28: and share them with people, there might be a more efficient way to share them

01:21:31: rather than having the whole nodes with all their positions and everything.

01:21:37: Next question is from Kyle Beyer.

01:21:41: For worldbuilding, a way to commit certain changes similar to how Git works,

01:21:44: would be nice, but probably not feasible?

01:21:47: That one's a little bit trickier, because you need to be able to do a diff for the world.

01:21:55: Which might be possible?

01:21:58: We could, you know, like...

01:22:00: One way to do it, we could add a system

01:22:04: where whatever changes you're making, it just serializes it into a text representation

01:22:10: and then we just do a diff on that.

01:22:13: And then we can use external tools, but it's not going to be as well integrated with the rest.

01:22:18: It also requires that textual representation to be stable, so it doesn't just detect that the whole thing changed.

01:22:25: It only detects some individual changes, which I think should be feasible if it copies the hierarchy.

01:22:33: So I think it's potentially feasible.
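
A minimal sketch of that text-diff route, assuming the world can be serialized with sorted keys so the representation stays stable across saves:

```python
import difflib, json

def serialize_world(world: dict) -> list[str]:
    """Stable text form: sorted keys and fixed indentation, so a
    small change doesn't make the whole file look different."""
    return json.dumps(world, sort_keys=True, indent=2).splitlines()

before = {"slots": {"Chair": {"color": "white"}, "Lamp": {"on": True}}}
after  = {"slots": {"Chair": {"color": "red"},   "Lamp": {"on": True}}}

# Only the changed line shows up, not the whole world.
for line in difflib.unified_diff(serialize_world(before),
                                 serialize_world(after),
                                 lineterm=""):
    print(line)
```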

01:22:35: You could also have it more integrated into the system.

01:22:39: But there's also a question, do we want to have it well integrated where...

01:22:43: How are you viewing the diff?

01:22:46: Are you looking at it in terms of looking at a text?

01:22:50: Because if you're looking at a text representation, that might be really tedious,

01:22:54: because there might be thousands of things that changed,

01:22:58: and it's hard to visualize what the changes are,

01:23:03: because it's all just a textual representation.

01:23:07: What might be more useful is having in-game visualization of the changes.

01:23:13: So say we remove this thing, or we move it,

01:23:17: there's something that shows that change,

01:23:20: and can just quickly flip between the two and compare them,

01:23:23: and maybe accept individual changes.

01:23:25: But this is a little more work,

01:23:28: because now we have to build some way to detect those changes,

01:23:32: and build tooling to visualize that those changes happened.

01:23:39: So that is feasible.

01:23:41: It's just a matter of how we want to implement it,

01:23:45: and how much time we want to invest into it.

01:23:50: We've still got like 35 minutes left, so we're good on time.

01:23:55: BotnPalony is asking,

01:23:57: Is there a way to override the size of haptic volume per avatar?

01:24:00: I got a giggle plug recently, I had to make flags to resize the haptic volume radius,

01:24:04: because it actually couldn't grow big enough for me.

01:24:08: Wait, so you're asking for haptic volume...

01:24:11: Do you mean the haptic triggers, or the haptic samplers?

01:24:14: Because you can change size of both, but depending which one you want, it's a little bit...

01:24:22: It's going to be a different process.

01:24:25: If you're using the giggle pack, that one actually should be in settings.

01:24:31: Let me actually check mine.

01:24:33: There should be a bunch, if I go manage giggle packs, there's...

01:24:38: You know, you've got like haptic point radius, and you've got a bunch of positioning options.

01:24:49: If you want the haptic triggers on the avatar, that's a little bit tricky because we need to modify it for each person.

01:24:58: Because generally it's going to be sized to their avatar.

01:25:02: You can put custom ones on the avatar.

01:25:04: For example, if I wanted my snoot to trigger things, I could put a haptic trigger in there.

01:25:09: So that's one way to kind of customize them.

01:25:11: There should be components to disable the default ones too if you want to build just a fully custom one as well.

01:25:18: But overall you should be able to modify them.

01:25:21: If you're using a bhaptics vest, there's actually components that specify how they're mapped.

01:25:29: The only thing is you need to have the vest on so it actually creates the objects.

01:25:32: Then you create, you know, enable the visualization and you can mess with the values and make sure it matches your avatar.

01:25:42: So hopefully that kind of answers it. If you have more questions, let me know.

01:25:47: And that kind of clears the questions we have for now. I still have a bunch of time left, so now what?

01:25:56: More questions. We're saved.

01:25:59: We're saved from awkward silence.

01:26:02: Warren Ballin is asking, yeah, but I can't really move them individually, should I make feature requests for that?

01:26:09: I'm not really sure. You should be able to move them individually because, well, it's just one for the giggle pack.

01:26:20: So I'm kind of confused what you mean.

01:26:23: But yeah, you can always make a feature request, we can look at it, include some details there because I'm a little bit uncertain if I'm understanding you correctly right now.

01:26:33: And now back to the awkward silence.

01:26:40: Got any more questions in there?

01:26:48: I say meow. Even though I'm not a cat.

01:27:24: Now I'm getting all the silly questions. Aegiswolf is asking, what is this? This is.

01:27:38: Can I hit that one? I missed. I failed.

01:27:48: Next question is from William Barlani. Is potato a construct of the mind?

01:27:57: One moment.

01:28:05: Mmhmm.

01:28:09: Mmhmm.

01:28:11: Mmhmm.

01:28:13: Mmhmm.

01:28:15: Where is it? Seriously, this is where inventory stage would be useful.

01:28:28: Aha.

01:28:31: Oh, this is the bleeding one.

01:28:35: We've got a bleeding potato.

01:28:38: There's actually a funny story about this one, because I was making a pulsating potato for some reason, and then the gear bell, she shot it with a gun as I was saving it, so it got saved with a gunshot.

01:28:53: So it just, you know, just make it...

01:28:56: Oh.

01:29:00: I deleted it, but it was supposed to explode.

01:29:05: And maybe this is the exploding one.

01:29:09: Maybe this is the exploding one? Let's see.

01:29:13: Yes, this is the exploding one.

01:29:16: I hope that answers your questions, Mother Melanie.

01:29:20: Also, thank you, RecommenderRocke, for the Prime subscription. Thank you!

01:29:31: Next question is from TheRedNeko.

01:29:33: Reposting cause I was gone when he responded last time.

01:29:36: I'm on bio, so I don't want to wait hours. Sorry, the issue is that...

01:29:39: Oh, I recommend, like, you know, just rewinding the stream, or like, you know, watching the recording.

01:29:48: Like, it should be up, like, you know, the day after.

01:29:51: Next question is from RustySprinklers.

01:29:53: Froox, how does it feel to look at the Gaussian splatting in VR?

01:29:56: It's kinda cool, like, it's, like, um...

01:30:00: It doesn't look like it's kinda, like, weird.

01:30:04: Like, if you've got a good Splat, like, it just, it looks so real, which is really weird.

01:30:12: Like, I don't know how to describe it.

01:30:14: You'll be able to see soon enough, because I do have the Gaussian Splatting rendering working,

01:30:19: and it works in VR, and since the last demo I've also fixed, you know, some issues.

01:30:24: But also, Cyro actually has been working, um, they've been making the port, the C-Sharp port of the library,

01:30:29: for, like, a file format for them, the SPZ, which makes them, like, way smaller for storage.

01:30:36: Yeah.

01:30:38: So, um, you'll be able to use them. I didn't want to push it because it touches the cloud,

01:30:42: and I was, like, at FC and then traveling, and I was, like, I don't wanna, in case something blows up,

01:30:48: I'll be making changes to it right now. But I'll probably release it, you know, sometime next week.

01:30:53: At least the initial version of it, so you'll be able to, like, look at Splats, you know, yourself.

01:30:58: The only thing is, they're, like, they're pretty heavy. There's gonna be some things that are gonna reduce the

01:31:04: heaviness, like in-memory compression, but generally they're kinda heavy.

01:31:09: You might wanna, like, you know, use them in a simpler world.

01:31:13: And you might need a beefy GPU to render them.

01:31:18: The reticule, no, we're going to Mamba, unfortunately.

01:31:21: You can, you know, just wait for the YouTube video, like, you know, to appear tomorrow.

01:31:28: Erasmus0211, any plans to replace the cloud home?

01:31:30: Users often have a very hard time loading it.

01:31:32: We don't really have any plans, like, to replace it.

01:31:36: And generally we can, there have been, like, some optimizations to it.

01:31:39: And we can always do more optimizations.

01:31:41: There's optimizations, you know, to the engine.

01:31:43: If there's, like, issues with, like, loading it, it will actually help us to know

01:31:48: what those issues are.

01:31:49: You know, is it, like, the assets are taking a long time to load?

01:31:51: Which might also be weird, because, like, you know, we pre-cache it.

01:31:55: So that's, that'd be actually weird.

01:31:58: Like, is it, like, you know, or is it, like, too heavy?

01:32:00: Or, like, what exactly do you mean, you know, by the users having a hard time loading it?

01:32:06: Because, of course, when we get, like, feedback like this, but it's very vague,

01:32:10: that actually makes it very difficult for us to do something about it,

01:32:13: because it's not enough for us to, you know, understand the problem that people are having.

01:32:18: If we understand the problem, then we can better, you know, engineer solutions for it,

01:32:24: you know, and figure out, like, what's the best approach.

01:32:27: But, you know, we're not gonna replace something, you know, just because, like,

01:32:34: if we just replace something without even understanding what the problem is in the first place,

01:32:39: you know, how would we also make sure we don't introduce the same problem with the replacement?

01:32:44: That's not really, you know, a good approach.

01:32:47: So, generally, we'll, you know, generally, like, you'll have to, like, give us more detail, essentially,

01:33:00: like, you know, what kind of issues are people running into, let us understand the problem,

01:33:04: and let us, you know, figure out how to fix it.

01:33:08: To next is Cyrena Kombovitch, how do I just think about that?

01:33:14: Okay, so this is also, like, all the questions right now.

01:33:16: We could actually, I forgot, but, like, we are running on the prerelease build,

01:33:20: so we could give a bit of a showcase of some of the new things,

01:33:24: because one of the things that this is, one of the things that, you know, this is running

01:33:31: is we have a new system for ambient light.

01:33:36: Specifically, like, you know, whenever you're in a world,

01:33:40: there's, like, multiple things that contributes to the lighting of, you know, of things,

01:33:43: so you have, like, you know, the direct light,

01:33:47: I actually switched this to a smooth POV,

01:33:50: you know, you've got the direct light that's coming, you know, from the sun over there,

01:33:54: but there's also, like, ambient light,

01:33:56: because I don't think there's, like, any light sources in this direction,

01:34:00: and it's actually taken from the skybox.

01:34:04: So, would you actually be able to, can you kill the main light?

01:34:08: Like, the directional light, so it's just the ambient light,

01:34:11: so we can kind of see what that looks like.

01:34:15: Yeah, let me, uh, let me grab the sunlight here and just turn it off.

01:34:20: Alright.

01:34:21: There we go.

01:34:22: Just turn off the sunlight.

01:34:23: So, what do you see on the sphere?

01:34:27: Let me make sure.

01:34:29: This is now illuminated purely, at least it should be, I don't think there's any light sources,

01:34:33: purely by the skybox.

01:34:36: Oh, there's this one right here.

01:34:38: There's this one, is it, is it doing anything?

01:34:41: Oh yeah, I see it.

01:34:42: Just turn off all the lights.

01:34:55: Trying to find where the heck the light is.

01:35:00: Whoops.

01:35:03: I found it.

01:35:06: I don't think it's contributing super much anyways... actually, a bit.

01:35:12: Hanging light, that one or that one.

01:35:17: Okay, there we go.

01:35:19: There we go. Oh yeah, it's darker.

01:35:21: So now this should be illuminated purely by the skybox.

01:35:27: And virtually what UNINT is using for this is something called Spherical Harmonics.

01:35:33: It's a, it's like a mathematical construct that allows you to encode low frequency information,

01:35:38: you know, that's directional.

01:35:40: So like in every direction that the surface is facing, it can sample a color and get this very smoothed out color information.
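
For the curious, the math being described looks roughly like this: these are the standard real second-order SH basis constants, and sampling is a weighted sum of nine coefficients per color channel (illustrative only, not engine code):

```python
def sh2_basis(x, y, z):
    """Real spherical-harmonics basis up to second order (L2): nine
    values for a unit direction. Only smooth, low-frequency detail
    survives, which is exactly what ambient light needs."""
    return [
        0.282095,                       # L0       (flat term)
        0.488603 * y,                   # L1, m=-1
        0.488603 * z,                   # L1, m= 0
        0.488603 * x,                   # L1, m=+1
        1.092548 * x * y,               # L2, m=-2
        1.092548 * y * z,               # L2, m=-1
        0.315392 * (3 * z * z - 1),     # L2, m= 0
        1.092548 * x * z,               # L2, m=+1
        0.546274 * (x * x - y * y),     # L2, m=+2
    ]

def sample_ambient(coeffs, normal):
    """coeffs: nine RGB triples; returns the smoothed color a surface
    facing 'normal' (a unit vector) receives from the environment."""
    basis = sh2_basis(*normal)
    return tuple(sum(b * c[ch] for b, c in zip(basis, coeffs))
                 for ch in range(3))
```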

01:35:51: And before, this was completely handled by Unity.

01:35:55: You know, we essentially were just letting Unity do its own thing.

01:36:01: We're just telling it like, you know, please recompute this now.

01:36:04: Now there are a few problems with that.

01:36:06: One of them is like, you know, you don't have direct control over it.

01:36:10: Say you want it to be like, you know, a bit brighter or darker than the skybox.

01:36:13: Or maybe you don't want it to match the skybox at all.

01:36:16: You are not able to do that.

01:36:18: Unless you are doing some shenanigans with fake skyboxes.

01:36:22: The other problem is like, it's very CPU heavy.

01:36:33: Because what Unity does is it just renders the texture, transfers it to the CPU and computes it there.

01:36:37: Which means it would cause a lot of stutters and freezes and drops in framerate.

01:36:42: So we had to throttle how often it actually computes it.

01:36:46: And because of that, if the lighting was changing in the world, it would be very jagged.

01:36:55: Let me, where's the tool?

01:36:58: I think I left it somewhere here.

01:37:02: Oh there it is.

01:37:05: So I grab the tool.

01:37:07: Which means if you made it dark, it would be still light for a little bit and then change after it.

01:37:13: But here, if I change the skybox, it's pretty much changing instantly when I make it dark.

01:37:20: I mean it's like, it's super duper dark blue on the top.

01:37:25: It's definitely not how that's supposed to look.

01:37:28: It might need some tweaking. There's higher frequency light information.

01:37:36: It's probably getting a lot of light from the bottom as well.

01:37:41: But pretty much what the principle for this is, it's now using a compute shader to compute the spherical harmonics.

01:37:50: Which is way faster, which means it can pretty much run every single frame.

01:37:59: So if I actually open up, you know, if I open up the inspector and I find the world assets,

01:38:07: and I think it's under skybox.

01:38:10: Oh my god, you're so, you are so blue.

01:38:14: I'm blue.

01:38:16: Under skybox materials, where's the skybox?

01:38:24: Oh there we go, skybox component is the one I want.

01:38:28: And there's like, you know, the spherical harmonics.

01:38:33: So what I can do is like, you know, if I change the lighting in the world, you can see those values change.

01:38:38: And the colors, they kind of roughly match, but they're, like, you know, they represent like coefficients,

01:38:44: so they're going to be a little bit like weird.

01:38:47: It's all dark.

01:38:50: Interesting that like, when you make it full day, like it gets really, really dark blue.

01:38:55: Yeah, it's probably coming up from something.

01:38:58: But the way it's doing it is actually rendering, you know, the reflection probe.

01:39:02: And then this component, the reflection probe ambient SH2, is, you know, computing the colors from that.

01:39:08: And one of the things, you know, the spherical harmonics here are using second order,

01:39:14: which means there's the flat, like, you know, just a single color band,

01:39:19: and then there's, like, the higher orders, which have like higher frequency information.

01:39:22: And you can actually tune these.

01:39:26: And like, you know, lower the lower frequency components and kind of, you know, adjust, like, you know, how it kind of looks.

01:39:34: From my experimenting, you know, tuning the parameters a bit, like, brings it closer to Unity,

01:39:39: because Unity, for its own calculation, seems to boost the, like, lower frequency and kind of smooth out the higher frequency information,

01:39:50: because the high frequency tends to, like, you know, add very strong, like, darks or brights.

01:39:56: So you can like mess with it, but you can also, like, you know, now have kind of freedom to adjust these, you know, to whatever fits your world.

01:40:02: Like if you want it to be really bright. No, that's really bright.

01:40:07: Actually see, like, if I boost the high frequency, high frequency, you know, bits, you see those are contributing to really dark tones.

01:40:17: So I might do this. But also, this is just, you know, a value.

01:40:22: So you can use this with ProtoFlux. You can, you know, compute your own, you can, like, you know, do whatever you want with it.

01:40:29: Or you can pre-compute a bunch and then, you know, lerp it between them.

01:40:32: So like if you're in an open world and you go into a cave, you can, like, lerp it to, like, make the ambience darker.

01:40:38: So now we have, like, you know, a lot more kind of control how this works.

01:40:42: This, the Ambient Light SH2, it's literally just being driven from this.

01:40:47: So I can actually just break the drive, you know, and then I can, like, pull whatever I want here.

01:40:53: So if I, if I, like, let me actually just, let me reset the value first.

01:41:00: Reset the default and now, you know, there's no Ambient Light at all and everything is absolutely dark.

01:41:05: Actually this makes it easy to see, like, the other light sources.

01:41:09: You can see there's a light here because now the only source of light is, you know, whatever the light sources are.

01:41:17: There's no Ambient Light, it's like pure darkness.

01:41:21: But then I can maybe, like, put, like, red here, and now everything's red.

01:41:27: Or if I put, like, red into the higher band, you see now it's red from the top.

01:41:32: So, like, the top is red but the bottom is gonna be darker.

01:41:37: And we can do, you know, green from the side.

01:41:40: So, you know, now it's almost like there's, like, a green light, you know, that's coming from this side.

01:41:45: And usually I wouldn't put extreme values like this but I'm just kind of doing these as a demonstration.

01:41:51: You know, you can do very funky things with the lighting.

01:41:57: Just full control over how this works.

01:42:01: I can show you some higher bands too. As you go, like, higher and higher, you see this one kind of has, like, two kind of waves that kind of happen on it.

01:42:10: Now, look around, Cyro.

01:42:12: It's kind of like very harsh lighting but it's good for visualization.

01:42:17: So, I can just put this back here, I can drive it.

01:42:20: And now it's being computed, you know, from the skybox again.

01:42:23: But, any existing worlds, or any new ones that you create, they're going to have this kind of setup.

01:42:29: Where the ambient light SH2, this is actually what determines what the ambient light is in the world.

01:42:35: Ooh, what was that?

01:42:38: I was touching it.

01:42:39: Oh, okay.

01:42:41: So, determines, you know, what this is.

01:42:45: And then, this component computes it from a reflection probe.

01:42:49: It also adds a bigger reflection probe, that just kind of covers everything.

01:42:53: And it provides skybox, you know, it only renders skybox, so all your ambient lighting is from that.

01:43:02: But you can make more reflection probes, compute this, save it into a field, and then make a bit of ProtoFlux that interpolates between the two.

01:43:15: We do have a bit of time, so I can actually show you how to kind of do that.

01:43:19: So if I grab ProtoFlux, let's make, I'm gonna, I'm just gonna add two spherical harmonics.

01:43:36: So the data type is SphericalHarmonics, and this one's specifically second order, so it's L2, and the data type of the coefficients is ColorX.

01:43:54: So I add a value field, there we go, and I'm just gonna duplicate it.

01:44:00: So say like this is, you know, one lighting I like, I'm just gonna grab this value.

01:44:10: Actually no, that's not a value, I'm gonna grab this value, and I'll drop it here, blip.

01:44:16: And say I wanna change the lighting, I'm gonna be like this, so it's kinda, you know, reddish.

01:44:25: And I grab this value, and drop it here, oops, I grabbed the wrong thing.

01:44:32: Grab this, drop it here, so I have these two values stored.

01:44:38: And now I can just, you know, source them.

01:44:41: So I have source for one, source for the other one.

01:44:46: And then I can just use, you can kinda think of these like, you know, if you work with vectors or matrices,

01:44:52: it's a little bit similar, you can interpolate them.

01:44:58: If I get this: interpolation, value, lerp.

01:45:03: And I'm actually not gonna type the value, just gonna let that overload.

01:45:06: So from 2, lerp.

01:45:10: So it's just gonna interpolate.

01:45:12: And now, for this one I'll just break the drive, don't care about it.

01:45:19: And I'm gonna drive it from ProtoFlux, I know it's dark.

01:45:23: And now I can actually control it.

01:45:26: And you see, now it's kind of like interpolates between the two of them.

01:45:30: And if I do 0.5 it's like, you know, one that's in between.

01:45:35: So now if I plug, you know, so I do time, and I do like a sine wave.

01:45:49: Sine wave, sine, is it trigonometry?

01:45:54: Sine.

01:45:58: And I just wanna remap it too, so it's a remap.

01:46:05: There we go.

01:46:06: So I have a value that goes, you know, from 0 to 1, I can plug it here.

01:46:10: And now, you know, my ambient lighting is, you know, interpolating between the two.

01:46:19: And with this, you can pretty much, you know, this is just a simple example.

01:46:24: You can do whatever logic you want, you know, save, you know, your spherical harmonics values,

01:46:29: and do whatever interpolation, and then drive, you know, this with pretty much whatever you want.

01:46:34: You have like, you know, now much kind of better control.
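
The whole time-driven setup he just wired up boils down to something like this sketch (placeholder coefficient sets, plain Python standing in for the ProtoFlux nodes):

```python
import math, time

def lerp_sh(a, b, t):
    """Component-wise blend of two SH coefficient sets, the same
    thing the Lerp node does with the two stored values."""
    return [[ca + (cb - ca) * t for ca, cb in zip(ta, tb)]
            for ta, tb in zip(a, b)]

def remap01(x):
    # Sine lives in [-1, 1]; the Remap node squeezes it into [0, 1].
    return (x + 1) / 2

day   = [[1.0, 0.95, 0.9]] * 9        # placeholder coefficient sets,
night = [[0.05, 0.05, 0.15]] * 9      # nine RGB triples each

t = remap01(math.sin(time.time()))    # the "time -> sine -> remap" chain
ambient = lerp_sh(day, night, t)      # drives the ambient light value
```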

01:46:36: The other thing you could do is you can also multiply these values.

01:46:40: So say like, I want this to be brighter.

01:46:45: You can use the, let's see, multiplication value.

01:46:55: It's just small.

01:46:56: I'm going to let it overload.

01:46:59: So I'll take this, like the result.

01:47:01: Oh wait, I cannot, it doesn't like this.

01:47:08: No wait.

01:47:10: Oh wait, do we need to have multiplication for this one?

01:47:16: Oh, I might need to add these.

01:47:21: There might be some operators still missing.

01:47:24: There's one too, if I go math.

01:47:27: Spherical harmonics is like one specifically.

01:47:29: There's scale orders, so I can go color X and just scale the values.

01:47:41: There we go, there we go, scale orders.

01:47:44: I can process this one, do this.

01:47:50: And I can just make it brighter if I want.

01:47:58: And you can do whatever kind of processing you want on these.

01:48:00: Or say you actually want it to be darker compared to Skybox.

01:48:04: So I do 0.5 and now the ambience is darker.
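
The ScaleOrders operation itself is simple per-band scaling, roughly like this (index layout assumed: one L0 term, three L1 terms, five L2 terms):

```python
def scale_orders(coeffs, l0=1.0, l1=1.0, l2=1.0):
    """Per-band scaling: coeffs[0] is the L0 term, [1:4] are L1,
    [4:9] are L2. Damping L2 softens harsh darks and brights."""
    factors = [l0] + [l1] * 3 + [l2] * 5
    return [[c * f for c in triple]
            for triple, f in zip(coeffs, factors)]

coeffs = [[1.0, 0.9, 0.8]] * 9                          # placeholder
darker = scale_orders(coeffs, l0=0.5, l1=0.5, l2=0.5)   # "0.5 -> darker"
```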

01:48:11: So this gives you a bit of an idea how these work and how you'll be able to utilize them.

01:48:16: How you'll be able to utilize them for controlling the ambient light in the scene.

01:48:25: So I'm going to move this, let's just leave this here.

01:48:31: It's going to be changing the lighting, but I think it's fine.

01:48:34: It looks a little bit funky.

01:48:36: The symmetry is a bit funky, there we go.

01:48:43: Oop, why is my default? I'm just going to do this.

01:48:50: There we go.

01:48:53: I've actually got a few more questions as well so let's have a look.

01:48:59: TheRedNeko is asking, we also have 10 minutes left.

01:49:03: So we might not be able to get through all the questions depending on how many there is.

01:49:14: So, we'll be able to see how many we can get through.

01:49:18: TheRedNeko is asking, actual question though.

01:49:20: Any plans to get the Linux build working again once Resonite switches to Sauce?

01:49:24: As its GPU performance is about 50% of Windows under Proton, at least in multiple worlds.

01:49:28: The worst example is GreatSpace, where Windows gets 140 FPS at 60% usage,

01:49:33: where Linux under Proton is a mere 80 to 90 FPS and maxing out my GPU.

01:49:39: As for any specific plans, that's something we'll have to evaluate at a later point.

01:49:43: It might make it easier to support, because the engine is being based on one that should have multi-platform support,

01:49:51: but we'll still have to see if there are any complications with that, what other things we have to deal with.

01:49:56: There might be other libraries, other things to consider, so we cannot make any promises right now.

01:50:02: I think with the performance it might depend on your setup, because I kind of hear conflicting things,

01:50:09: like some people are actually using Linux as well, and you...

01:50:13: Yeah, so the performance under Proton can vary depending on your setup.

01:50:19: I'm currently using a build of Proton that has better emulation, or not emulation, it's not an emulator.

01:50:34: It has a better implementation of, or a better re-implementation, I should say, of Windows synchronization primitives under the hood,

01:50:46: so that is how it synchronizes resources.

01:50:48: And so, that's called ntsync, so it's like a special branch of Proton that has this ntsync thing in it,

01:50:59: and it makes it run pretty good.

01:51:03: The performance difference between Windows at this point when I'm using this is pretty negligible,

01:51:09: but you can still get pretty good performance even on just stock Proton.

01:51:13: And some people say it's faster, some people say it's slower.

01:51:16: It really depends on what game you play, what hardware you have.

01:51:21: So, if you notice a difference, definitely don't consider that the be-all-end-all result of like,

01:51:29: oh, I guess Proton is just bad.

01:51:31: Because it really could just be an unfortunately poor combination of hardware or whatever, I don't even know.

01:51:42: We do have a Linux channel in the Discord, so you might poke there.

01:51:46: I know some of our active Linux users, they give people advice or share their experiences,

01:51:54: so they may have some pointers on how to improve performance as well.

01:51:59: So it might be worth trying.

01:52:03: Next question is, OSA and Game Club wonder if there's a bug in Shader Sphere calculated like concave versus convex

01:52:08: that would mirror inverted reflection like concave tri-cube shape?

01:52:12: I don't really know what you mean.

01:52:14: Yeah, I don't. I don't understand that, I'm sorry.

01:52:18: Yeah, it doesn't really relate to the things. I'm sorry.

01:52:26: DJ Prodigy Hunters. Can you apply spherical harmonics to shading independent objects like avatars?

01:52:31: No, you cannot apply it to individual objects. So, unfortunately not.

01:52:38: The lighting is kind of global, so it applies to everything in the world.

01:52:47: Erasmus0211. Not exactly a resonite, but I've seen anyone running around with resonite t-shirts.

01:52:53: Yes, a bunch. I actually saw some people post on Blue Sky recently, which is kind of neat.

01:52:58: We've also seen a bunch of BLFC, because there was a lot of people.

01:53:09: It's kind of like the de facto resonite con. And there was a lot of resonite t-shirt people.

01:53:15: We also had some people at FC. I had my resonite shirt, but I didn't actually wear that one, but some other people did.

01:53:22: This is why it's kind of cool to see them in the wild.

01:53:27: And next question is DJ Prodigy Hunters. Can we get a ProtoFlux node that displays the value of the USA national debt clock?

01:53:36: I think we don't have data types large enough for that.

01:53:44: I mean, there's WebSocket interface, you can plug things in from external APIs if you want to, assuming there's one for this.

01:53:57: It also covers all the questions again. We've got six minutes left, so we actually might be able to squeeze a few more questions.

01:54:04: I thought it was a few more, but if you've got any...

01:54:06: I have a question.

01:54:07: You have a question?

01:54:09: I have a question.

01:54:10: What is the question?

01:54:13: I don't anticipate this, but given that you can calculate the spherical harmonics from reflection probes, does that then open the door to potentially basic support for GI probes based on those spherical harmonics?

01:54:35: I mean, you could. Oh, thank you Nagisa Phantom for the first subscription, thank you.

01:54:40: I mean, technically, yes. What the light probes do is, instead of just a single global one, you have a bunch of them scattered in the world, and then when you're rendering things, it essentially samples that.

01:54:59: It finds the closest ones, and it uses a tetrahedral interpolation to compute a value between those probes.
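
A sketch of that tetrahedral interpolation: barycentric weights of the sample point inside the tetrahedron of the four nearest probes, then a weighted blend of their (here flattened) SH coefficients. Purely illustrative:

```python
def sub(a, b):   return tuple(x - y for x, y in zip(a, b))
def dot(a, b):   return sum(x * y for x, y in zip(a, b))
def cross(a, b): return (a[1]*b[2] - a[2]*b[1],
                         a[2]*b[0] - a[0]*b[2],
                         a[0]*b[1] - a[1]*b[0])

def tetra_weights(p, verts):
    """Barycentric weights of point p w.r.t. the tetrahedron's four
    vertices, via Cramer's rule; they sum to 1, all >= 0 inside."""
    e1, e2, e3 = (sub(v, verts[0]) for v in verts[1:])
    d  = dot(e1, cross(e2, e3))            # six times the signed volume
    r  = sub(p, verts[0])
    w1 = dot(r,  cross(e2, e3)) / d
    w2 = dot(e1, cross(r,  e3)) / d
    w3 = dot(e1, cross(e2, r )) / d
    return (1 - w1 - w2 - w3, w1, w2, w3)

def blend_probes(p, verts, probe_coeffs):
    """probe_coeffs: four flat lists of SH coefficients, one per probe."""
    w = tetra_weights(p, verts)
    return [sum(w[i] * probe_coeffs[i][k] for i in range(4))
            for k in range(len(probe_coeffs[0]))]
```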

01:55:06: So, technically, we could support that. We might not, because that adds complexity to moving to Sauce, because that is a thing that would need to be reimplemented.

01:55:20: And I've had a... before I actually implemented this, I had a conversation with Geenz as well, just making sure it doesn't cause complications there.

01:55:34: And that might cause complications, so it's one of those things we might not want to do, because it's gonna delay Sauce a lot more.

01:55:43: It's a similar thing with shaders. If we wanted to, we could add support for custom shaders right now.

01:55:52: Problem is, all the shaders you would make would break when we switched to Sauce, which means your content would break.

01:56:00: And that's one of our philosophies, we fight tooth and nail to avoid breaking content as much as possible.

01:56:08: And because of that, even though on the technical side it's easy to add right now, we could just let you build custom shaders in Unity.

01:56:17: There is not a path to maintaining Electrum compatibility, and because of that we don't do that feature.

01:56:23: And this is kind of similar. This one we could implement with Sauce, I don't see anything blocking it, other than time.

01:56:34: But sometimes just to kind of weigh it, is it really worth it having it now?

01:56:43: Is it worth it now, and then delaying the thing that's going to provide even more benefits later?

01:56:52: Probably not, I would say.

01:56:55: Our technical level is on version planning now.

01:57:02: Next question is from Alex Tucker-Anto.

01:57:05: How many things do you have to re-implement to get rid of Unity from the game? Thank you.

01:57:11: There's actually a longer talk on our YouTube channel, I recommend checking that one out.

01:57:15: Because it goes into detail, but there's pretty much two things left that are, like, features.

01:57:20: One of them is the particle system, PhotonDust, is our new custom particle system that pretty much does that.

01:57:28: That one's about to be finished, so we're gonna remove the old Unity one, we're gonna only have PhotonDust very soon.

01:57:36: And the remaining thing after that is the audio system, which there is also another video on our YouTube channel about that.

01:57:43: So once the audio system is done, then it's pretty much something I started jokingly called the splittening.

01:57:49: We're gonna be pulling FrooxEngine out of Unity.

01:57:52: We'll still be using Unity for rendering at that point, but we'll get the big performance improvement by using .NET 9 runtime, which has much better codegen.

01:58:02: At that point, we only have to replace the renderer to get completely rid of Unity, which is gonna be one of the next steps after that.

01:58:10: At some point after that.

01:58:14: And then we'll pretty much be completely rid of it.

01:58:16: But right now, PhotonDust is almost done, the audio system is gonna be worked on next, and then the actual work on the Unity communication and splitting it out into its own process is gonna be pretty much the next thing.

01:58:30: So next question is from Rasmus.

01:58:32: So they're asking about Resonite and operating systems, whether Resonite would move more towards Linux and SteamOS. Some people have been running Resonite on a Steam Deck.

01:58:40: Probably not in the near future, because Windows is still, as unfortunate as it is, it's still the majority of what people are on to run VR.

01:58:50: So until that changes, I don't see us moving.

01:58:55: It's kind of a thing, if there's a lot of people using a certain thing, then it's worthwhile for us to invest time into it.

01:59:04: But if there's not much pressure, we might not be able to dedicate time to it.

01:59:12: Anyway, with that, we pretty much have like 40 seconds left, so should be able to wrap it up.

01:59:18: So thank you everyone for joining the stream, thank you for asking all the questions.

01:59:23: We hope you enjoyed the answers, enjoyed learning more about Resonite, learning about what the future is, learning how we work.

01:59:30: Thank you as well for making really cool content, whatever you make in the community.

01:59:36: Or if you just hang out and socialize, we'd appreciate it too, because it helps the community keep going and keep this platform going.

01:59:45: And thank you everyone who supports us now on Patreon, or Stripe, or whatever other means.

01:59:52: It helps make sure we can keep doing this.

01:59:55: So thank you very much, thank you very much for watching, thank you Cyro for being here, helping me out with some other things.

02:00:01: And we'll see you with the next one next week.

02:00:05: So thank you very much, and bye-bye!

02:00:08: Alright.

02:00:10: I have to check if there's an ability to raid.

02:00:15: Let me see.

02:00:18: I see Creator Jam.

02:00:22: There's Bluejay.

02:00:24: Ooh, they're actually in the Creator Jam.

02:00:25: I'll raid them.

02:00:28: Bluejay, JB.

02:00:29: Everyone get ready.

02:00:31: It's kind of funny because there's literally two next to each other, like Creator Jam and Bluejay, and they're both showing the same thing.

02:00:37: So I'm gonna do raid.

02:00:43: Bluejay.

02:00:44: I like to support, you know, like streamers, so Bluejay.

02:00:52: Bluejay, JB.

02:00:55: Let me make sure I put that right.

02:00:58: Yes.

02:01:00: So I'm gonna raid.

02:01:05: The raid has been created.

02:01:07: I'm getting ready.

02:01:10: So everyone say hello to Bluejay from us.

02:01:15: And we're raiding now.

02:01:17: Whee!

02:01:22: Oh my god.

02:01:24: Did we just go from one viewer to a bunch?

02:01:37: Okay.