The Resonance/2024-11-17/Transcript

From Resonite Wiki

This is a transcript of The Resonance from 2024 November 17.

This transcript is auto-generated from YouTube using Whisper. There may be missing information or inaccuracies reflected in it, but it is better to have searchable text in general than an unsearchable audio or video. It is heavily encouraged to verify any information from the source using the provided timestamps.

00:00: Everything should be up.

00:02: I'm going to post the announcement.

00:10: Hello, hello, let's see if we get people in there, we need to move this one a little bit

00:15: so we can read it.

00:18: Hello, do we have people on the stream?

00:25: Hello, can you hear us?

00:26: Can you hear us?

00:32: I'm just going to wait for some people to come in.

00:37: Oh, there we go, we've got Shushio.

00:39: Got one person.

00:43: Hello, hello Shushio.

00:59: Hello, just a sprinkle, we've got a bunch more people piling in.

01:03: Hello and welcome everyone.

01:08: So this is the first episode of The Resonance, that's like a new podcast that I'm starting.

01:16: It's like a mix between office hours where you can kind of ask anything about Resonite,

01:21: you know, whether it's a technical thing, whether you want to ask more broad questions,

01:26: you know, more kind of open-ended as well.

01:29: But also I have Cyro with me, who's our engineering intern.

01:34: We spend a lot of time talking about Resonite and talking about cool technology,

01:41: talking about VR, talking about the big vision behind Resonite,

01:47: like, you know, which direction we want the platform to head and so on.

02:01: You know, about what this place is.

02:04: We see a bunch of people popping in, so hello everyone.

02:07: I see Dustus Sprinkles, ApexRxAI, LexiVoe, I see Ground, Fuzzy, Jack Forge, AlexDupi, I see Jack, and Birdo.

02:20: Hello, welcome everyone.

02:23: Could I also make a request of the chat before we start, since this is the first one,

02:29: I'm just kind of tuning things a little bit.

02:33: Is the audio level okay on your end? Can you hear me fine?

02:36: And, Cyro, can you say something?

02:38: Can you hear me okay, guys?

02:42: Let me know if I need to adjust the levels a little bit.

02:46: They look okay, like on the OBS side, but sometimes it's a little bit hard to tell.

02:53: Oh, oh my. It's public. Thank you.

02:56: We should maybe not do that.

02:59: Yes, I should have checked that. Thank you for letting us know.

03:03: Surprisingly, nobody joined, so I'm going to say hello.

03:07: I do have one more request.

03:10: For questions, we have a thing here that's going to show the questions,

03:16: so we're going to make sure we don't miss them.

03:18: What you need to do is, when you ask your question, make sure you end it with a question mark,

03:22: and it's going to get picked up.

03:24: Would anybody in the chat be able to... Perfect.

03:28: I have a question.

03:29: Perfect. It works. Thank you.

03:35: Great.

03:37: Thank you. So everything works.

03:40: With this, we've got a bunch of people in there.

03:43: I think we're ready to start.

03:45: Hello again, everyone. I'm Froox.

03:48: I have Cyro with me, our engineering intern.

03:50: This is the first episode of what we're calling The Resonance.

03:54: The idea is that this is going to be a mix of office hours,

03:57: so we can ask anything about Resonite,

04:00: whether it's a technical question, whether it's more philosophical about the platform,

04:05: whether it's more specific or open-ended,

04:09: and we'll try to answer those questions as best as we can.

04:15: We're also going to talk a little bit more in broader terms.

04:20: What is the direction of the platform? What's the big ideas behind it?

04:24: Because we don't want to keep things just to the wire,

04:28: where it's dealing with individual technical issues,

04:30: but also what are the driving forces?

04:35: What would we want the platform to do in general,

04:39: irrelevant to any kind of specific features?

04:43: So with that, we can start answering questions,

04:48: and if there's not too many of them, we can just talk about things.

04:54: We already have a few questions popping in.

04:56: Jack is asking, are you using MyChat?

05:01: I'm actually not. I don't know where I saved it.

05:04: I was kind of looking for it before the start,

05:05: and I was like, oh, I can't find it, so I'm using a little bit older one.

05:12: Then Ozzy is asking, of course, is Glitch cute? Yes, he's cute.

05:17: It's proven right here on the stream.

05:21: ChronicJoke, is Mayonnaise a ProtoFlux node?

05:25: No, but I actually have a list of ideas for April Fools,

05:31: and there's a food-related ProtoFlux node in there

05:35: that might pop up at some point, maybe.

05:39: Is Mayonnaise a ProtoFlux node?

05:42: The question is, what would it do if it's a ProtoFlux node?

05:45: Because Mayonnaise, that's got to be a data type.

05:48: That is true.

05:54: Or would it produce Mayonnaise?

05:56: Or maybe you have a number of inputs, you need to input eggs,

06:00: you need to input... actually, I don't know what goes into mayonnaise.

06:03: I think egg is an egg.

06:04: We have the leaky impulse bucket, maybe we could have the leaky mayonnaise bucket.

06:13: We need mayonnaise outputs.

06:16: Yes.

06:17: Hopefully that answers your joke question with more jokes.

06:22: Then we have...

06:24: Oh, sorry.

06:26: Oh, no, go ahead.

06:28: I was just going to read Jack's if that's fine.

06:30: Okay.

06:31: Jack says, I have a pretty broad question, but I assume it's going in the same direction you're already heading.

06:36: Where do you want to see Resonite positioned within the VR slash social VR space?

06:41: Ah, this is a good ramble-inducing question.

06:46: There's a few things that we know about Resonite.

06:48: One of the big ideas of this platform is that it's built of multiple layers.

06:56: At the base layer, you have things like automated networking.

07:02: Everything you build, even the engine itself, you always get everything synchronized by default.

07:09: You don't even have to think about it.

07:12: Everything is potentially persistent.

07:14: You can save everything into your inventory, into the cloud, onto your hard drive.

07:19: Everything that you get on the platform, you can persist.
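To illustrate the layer Froox is describing, here is a minimal sketch in the spirit of FrooxEngine's component model. The type names here (Component, Sync<T>, Slot, and the helper calls) are assumptions for illustration, not the exact engine API; the point is that declaring state through a sync wrapper is all it takes for it to replicate and persist.

```csharp
// Minimal sketch, assuming hypothetical engine types (Component,
// Sync<T>, Slot) -- not actual FrooxEngine source.
public class SpinBehavior : Component
{
    // State declared through the sync wrapper is replicated to every
    // user in the session and saved with the object automatically.
    public readonly Sync<float> DegreesPerSecond;

    protected override void OnUpdate(float deltaTime)
    {
        // Ordinary local-looking code; networking and persistence are
        // handled by the layer underneath, not by this component.
        Slot.RotateAround(Axis.Up, DegreesPerSecond.Value * deltaTime);
    }
}
```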

07:23: The way I see it is once you have this kind of layer, you can start building on top of it.

07:28: We also have layers for working with various devices, various interactions, grabbing stuff, touching stuff, pointing at things.

07:37: Those are things that I feel like are important to solve really well.

07:45: Do them properly.

07:47: When I started my work in VR, I was doing a lot of disparate applications,

07:54: where one application had these features and supported this hardware,

07:58: and the other application supported these things and this other hardware.

08:01: Sometimes I would like functionality from this one application and this other one,

08:07: but it was kind of difficult to bring them over.

08:10: Plus, I would also find myself solving the same kind of problems over and over.

08:17: For example, being able to grab stuff.

08:22: One of the driving forces was to create a framework, a layer,

08:27: where everything is part of the same shared universe,

08:32: and build an abstraction layer.

08:38: It's kind of analogous to programming languages,

08:43: where the really old ones had assembly programming,

08:48: and you had to do a lot of stuff like managing memory,

08:52: like where is this stuff, and managing your stack,

08:55: and doing a lot of manual work to make sure the state of everything is correct.

09:01: Then came high-level programming languages,

09:03: where they would essentially do it for you,

09:05: and they would let you focus more on the high level.

09:08: What do you want to do?

09:12: Personally, what I want Resonite to do in the VR social space

09:17: is do a similar paradigm shift for applications,

09:24: where no matter what you build, you always have real-time collaboration.

09:29: You don't even have to think about it.

09:32: You can always interact with multiple users,

09:35: and you always have persistence,

09:36: and you always have integration with lots of common hardware.

09:42: To me, the social VR layer is the basis.

09:48: You always have the social stuff.

09:50: You can join people, you can talk with them,

09:52: you can be represented as your avatar,

09:54: but then everyone can build lots of different things.

09:59: Some people will just socialize, some people will play games,

10:02: but some people will build a virtual studio.

10:07: Maybe they want to produce music, maybe they want to program stuff,

10:11: and they're able to use Resonite, a framework to do that,

10:18: and share whatever they make with other people.

10:23: If you're good at building tools, you can make tools,

10:28: like I mentioned, for example, producing music.

10:31: Say somebody makes really cool tools.

10:33: Other people who do like to produce music can take those tools made by the users,

10:38: and because they exist within the same universe,

10:40: you can build your own music studio,

10:42: and you have all the guarantees that I mentioned earlier.

10:46: With your music studio, you can invite people in and collaborate with them no matter where they are.

10:51: You can save the state of your work,

10:53: or maybe say you can make a really cool audio processing filter or something.

10:58: You save it, you can share it with other users,

11:00: and it kind of opens up this kind of interoperability.

11:04: I want Resonite to be general enough where you can build pretty much any application.

11:13: Whatever you can think of, you can build on here and get those guarantees.

11:19: Kind of similar to how you have a web browser.

11:23: Web browsers used to be just browsers for websites,

11:26: but now we have fully-fledged applications.

11:28: You have your office suite, like Google Docs.

11:33: There's a version of Photoshop.

11:35: We can play games.

11:36: There's so many applications on the web that it essentially becomes its own operating system in a way.

11:47: I want Resonite to do a similar thing,

11:50: where the platform itself is like the analog of the web browser.

11:54: You can build any kind of application in it,

11:57: but also you get the guarantees of the automated networking,

12:01: of the persistence, of the integration with the hardware,

12:04: and other things solved for you so you don't have to keep solving them.

12:09: That's pretty much in broad terms what I want Resonite to do.

12:13: I hope that ramble answers the question well.

12:18: I think it answered it pretty good.

12:24: When you were talking about this, I was thinking of way, way back,

12:31: before we had any sort of proper type of game engine.

12:36: You'd program all of your code, all of your games.

12:39: You would just program them raw.

12:41: You didn't have Unity, you didn't have Unreal.

12:44: If you wanted to collaborate with people,

12:45: you had your immediate vicinity of the people who you lived around.

12:52: And then now you have game engines and stuff,

12:55: which integrate a lot of the typical stuff that you would need to make a game.

13:03: But you're still limited to basically working over a Skype call,

13:08: or again with people close to you physically.

13:11: But now, this is kind of like a layer on top of that even.

13:17: Yes.

13:18: Where, as social creatures, we didn't really have something like this in that sort of space,

13:28: and now we do.

13:30: And being able to have that same sort of collaboration like you could have in real life,

13:35: with people working next to you, you can have from people who live a thousand miles away,

13:42: across the entire world, and you can work exactly as if you were right there,

13:51: and a lot of the things that you'd expect to work just kind of do like,

13:54: oh, you can see my context menu when it comes up, you can see this inspector opening.

13:59: It's just like putting a piece of paper down on a table

14:04: and working on it with someone standing right next to you.

14:07: Yeah, that's a really good point.

14:10: There's actually another thing that I've seen that inspired me,

14:14: is seeing engines like Unity and Unreal.

14:19: Because it used to be when you wanted to make a game,

14:21: you pretty much had to build your own engine, which in itself is a big undertaking,

14:26: and you needed bigger studios.

14:27: But then game engines came out, they were more generalized,

14:32: and what they essentially did, they erased the minimal part,

14:37: where suddenly everybody has access to a fully-fledged game engine,

14:40: and it's no longer a problem you have to solve on your own.

14:45: And now you have small studios, even just individuals,

14:49: who are able to build games and applications that previously would take entire teams of people to do.

14:56: And where I see Resonite is doing that same thing, just pushing it even further,

15:04: where we go from just the game engine,

15:12: where you don't have to worry about stuff like making a rendering pipeline,

15:18: making a system for updating your entities, and so on.

15:22: Now you have additional guarantees, like real-time collaboration, synchronization, persistence,

15:27: that all just kind of comes for free, and you don't have to solve those problems,

15:31: and you can focus even more of your time on what you actually want to do in the social VR space.

15:36: What do you want to build, how do you want to interact.

15:40: So that's definitely a very good point, too, with the game engines.

15:48: I think we're probably going to move to the next questions, because we kind of rambled about this one a bit.

15:55: So we have a...

15:55: I think that one went ahead.

15:58: But I think we can answer that one pretty thoroughly.

16:03: So next we have ShadowX.

16:05: Food-related April Fools joke? Shocking. I know, right?

16:11: Next we have MrDaboop123456.

16:17: What are some bugs that you have said are a feature?

16:21: Others?

16:23: The one that literally comes to the mind...

16:24: Actually, sorry, we've got to demonstrate it.

16:26: It's the fast crouch one.

16:29: You know, like when you...

16:31: Can you... can you... can you... there we go.

16:33: This.

16:35: This is technically a bug.

16:37: There's a bug report for this.

16:39: But I'm like...

16:41: We need to fix this one in a way...

16:43: Where you can still do this because it's just...

16:45: It's just funny and like, you know, it's like the language of desktop users.

16:53: It's... it's a bug returning into a feature.

16:56: So I think it's a good example of one.

17:05: Oh...

17:05: There have been so many updates that I can't think of any one in particular.

17:10: The obvious one, I guess, is Bulbul 3.0, which is just a typo, but...

17:13: Oh my god, yes.

17:15: I mean, it's more of a meme feature.

17:17: It's just kind of like, you know, like an easter egg.

17:20: But yeah.

17:22: Um...

17:23: Yeah, like, there's so much stuff that I don't really remember, but like...

17:26: This one is definitely like...

17:29: This one comes to the mind.

17:31: There's even a bunch of others, but...

17:33: I don't think I can think of any others myself.

17:38: So next we have Alex2PI.

17:41: I would think that mayonnaise...

17:42: We're going with the food thing.

17:44: I would think that mayonnaise would be a way to package information by coating it in mayonnaise.

17:49: Oh, I guess mayonnaise is like a wrapper type.

17:53: Hmm...

17:53: It's kind of like a nullable except mayonnaise.

17:56: Kind of like a...

17:58: Kind of like a .tar.gz.

18:00: Where it's like two layers of like, packaging.

18:03: Where one of them is the package and one of them is the compression or something.

18:06: Oh, it's more like...

18:07: So mayonnaise is a container format.

18:12: It's just kind of like...

18:15: It can contain other things in it.

18:18: .Mayo.
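Since the chat is designing this data type anyway, here is a purely joking sketch of mayonnaise as a wrapper type, "kind of like a nullable":

```csharp
// Joke sketch only: Mayo<T> as a wrapper/container type.
public readonly struct Mayo<T>
{
    public T Contents { get; }        // whatever got packaged in mayonnaise
    public bool HasContents { get; }  // an empty jar is still valid mayo

    public Mayo(T contents)
    {
        Contents = contents;
        HasContents = true;
    }
}
```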

18:20: So next we have GrandUK.

18:22: Have you thought about other ways to get audio-video out of Resonite other than simply mirror-to-display of camera and the audio output of Resonite?

18:31: It's quite jarring to hear Froox non-spatialized and then Cyro spatialized, as well as having the inverse spatialization of Cyro that the camera POV would suggest.

18:42: Actually, let me... I'm actually gonna switch Cyro to broadcast. That should make things easier for this.

18:47: You can also set the audio source to be from the camera.

18:53: I know, but that messes with my head too much.

18:57: I'm just gonna keep you on broadcast for now so it's easier for the stream.

19:01: However, I do actually have answers to that question.

19:06: So one of the big things that we're focusing on right now is a big performance upgrade.

19:11: And actually, I think I've seen a question so this might answer some of that too.

19:15: It's doing a big performance upgrade.

19:17: The two big things that need to be done...

19:20: Well, there's actually one more, but the two big systems that need to be done

19:25: is a particle system, which is being worked on right now, and the audio system,

19:30: which Cyro has actually been working on a part of, doing a reverb system.

19:37: Those two systems, they're essentially the last two big systems

19:43: that are sort of like a hybrid between FrooxEngine and Unity.

19:47: I'll go a little bit more into details on this one with a later question,

19:50: but we are going to be reworking the audio system, and with the current one,

19:55: the Unity one, it doesn't support multiple listeners.

20:01: The goal for reworking the audio system is so we can actually do that,

20:05: there's one listener that's for you, for your ears,

20:08: and there's additional listener that can be for camera

20:11: that you route to a different audio device, so you can actually kind of split it too.

20:15: Because you can switch to the camera, but then I'll be hearing everything from the camera's viewpoint,

20:20: and that kind of messes with my spatialization.

20:25: So yes, there's going to be a way to do it, we just need to get the new system out.
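A hypothetical sketch of the multiple-listener idea described here; none of these names are confirmed FrooxEngine API. Each listener renders its own spatialized mix of the same world, and each mix routes to its own output:

```csharp
// Hypothetical API sketch -- illustrative names only.
var earListener = audioSystem.AddListener(localUser.Head);
earListener.Output = AudioDevice.Default;             // what you hear in VR

var cameraListener = audioSystem.AddListener(camera.Slot);
cameraListener.Output = AudioDevice.ByName("Stream"); // what OBS picks up
```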

20:30: Next we have OrigamiVR.

20:35: I'm going back home very soon, I'll finally be able to reside again.

20:38: I was wondering, no social platform has this in official, I think.

20:42: What are the chances of implementing social events and gathering lists in-game

20:45: that notifies people about upcoming events and more?

20:49: Yes, that's actually one of the things I would like us to do.

20:52: We do have a GitHub issue for it, so if you search events UI, I kind of forget its name exactly.

20:59: On our GitHub, there's a bunch of details.

21:03: It would be really like adding server-generalized systems plus some UI

21:07: to be able to register events and see what's happening.

21:10: It's going to help people discover more things going on in the platform

21:14: and make it easier to socialize and join things.

21:18: It's probably going to happen sometime after we finish with the whole of the performance update,

21:24: because there's a bunch of UI improvements we want to do,

21:28: and we don't want to focus on too many things at a time.

21:31: So it's going to come at some point. No timeline yet.

21:36: At the very least, it's going to be sometime after the performance update.

21:44: It's one of the things that's definitely on my mind,

21:46: and that I think should be pretty high on the list,

21:49: because we want to help people drive socialization and engagement,

21:54: so it is a pretty important feature.

21:59: Actually, when you were talking about the performance,

22:03: I actually saw someone in the chat.

22:06: Yes.

22:08: And I actually wanted to say that the rendering engine in particular,

22:15: like using Unity, isn't necessarily like a blocker for the performance update.

22:25: I see there's two questions that are related to this,

22:29: so I'll go a little bit more in detail on this one.

22:31: We have SkywinKitsune asking,

22:34: Froox, could you explain the roadmap to a big optimization update?

22:38: Where are we at in that process?

22:40: And then we have GlovinVR asking,

22:41: what are some of the big milestones still needed

22:45: to move the client applications to .NET 8?

22:47: I know the particle system is one of the prerequisites,

22:50: but what are some other prerequisites that can be looked forward to before the shift?

22:55: So these two questions are pretty much the same kind of question,

23:00: so I'm going to cover this in one.

23:03: Let me actually bring my brush,

23:05: because I feel it would help if I draw a diagram.

23:11: I'm also going to turn the camera to manual mode,

23:15: so it's not moving around for this.

23:19: Where's my brush? Give me like a second.

23:24: Tools... I should have gotten one already, but...

23:28: Geometer, my brushes...

23:31: Yes, Cyro will do a dance or something while I look for the brush.

23:36: I think this one should be okay. There we go.

23:39: So, let me see... So this looks pretty visible on the camera.

23:43: So, just to kind of give you an idea, consider...

23:51: Let me actually make this a little bit bigger.

23:56: So consider this is Unity.

24:04: That might be glowing a little bit too much.

24:08: Consider this is Unity.

24:10: You have Unity stuff, whatever it's doing.

24:14: And within Unity, we have FrooxEngine.

24:18: So this is FrooxEngine.

24:25: So, right now, because of Unity, FrooxEngine is contained within Unity,

24:31: it's using Unity's runtime to run its code.

24:34: It's using the Mono. The Mono framework, it's very old,

24:40: and it's kind of slow.

24:42: Which is why we kind of want to move FrooxEngine to .NET 9,

24:47: because we were originally saying .NET 8, but I think it was this week,

24:51: or last week, that .NET 9 released, so we can target that one.

24:56: But the problem we have right now, in order to move,

24:59: there's two systems where this is sort of like a hybrid.

25:04: FrooxEngine, most of the stuff, most of all the interactions,

25:08: all the scripting, networking, physics, interactions,

25:11: most of it's fully contained within FrooxEngine.

25:15: We don't have to worry about it, that's already kind of nicely contained.

25:21: But then, what we have,

25:26: there's two systems which are sort of like a hybrid,

25:29: they kind of exist on both sides, and it's a particle system,

25:33: so we have a particle system,

25:36: and the second one is a sound system.

25:43: So, what the overall goal is, is we want to take these systems

25:47: and rework them into completely custom ones, so they're fully contained within FrooxEngine.

25:53: Once that kind of happens,

25:55: there's also interaction with all the Unity stuff.

25:58: And right now, that's also kind of like... this goes here,

26:02: this goes here, this goes here. It's kind of messy.

26:07: So once we move both of these systems fully into FrooxEngine,

26:11: we're going to rework how FrooxEngine actually communicates with Unity.

26:16: So it's a much simpler pipe,

26:19: where it sends a very self-contained package,

26:23: and be like, render this stuff for me, please.

26:26: Once this is done, what we can do

26:29: is we can take this entire thing, and I'm getting a bit ahead of myself here,

26:34: but we essentially move this out of Unity,

26:38: into its own process that's going to be

26:40: .NET 9, and this is going to

26:45: communicate with Unity using that same pipe.

26:50: And it's this switch, switching to the much more modern

26:53: .NET 9 runtime, that's going to provide a big

26:57: performance uplift. The reason for that is

27:01: because .NET 9, it has a much better JIT compiler,

27:04: that's essentially the component that takes our code, and it translates it

27:09: into machine code that your CPU runs. And the one that's in .NET 9

27:13: produces at least an order of magnitude

27:17: better code. It also has better,

27:20: more optimized libraries that are part of the .NET framework that are being used,

27:24: and also much better garbage collector, which is another thing on the Unity side

27:28: that's slowing things down. We've already done

27:32: a thing where, for the headless client,

27:37: we've moved it to .NET 8 a few months back,

27:41: because with the headless client you don't have the renderer, which means

27:44: it already exists outside of Unity.

27:47: And that made it much easier to actually move it

27:51: to the modern .NET runtime.

27:54: And the headless client, it's still this. It's the same code.

27:58: It's not a separate thing from what we're running right now.

28:03: 99% of the code is the same.

28:07: So by moving it first, we were able to see

28:11: how much of a big performance uplift we actually get.

28:16: What we found,

28:18: and we've had a number of communities that have been hosting

28:21: big events, and we've been able to get way more people

28:26: on those headlesses, even with those headlesses

28:30: computing everybody's avatars, computing everybody's IK,

28:33: and dynamic bones, while still maintaining a pretty high frame rate.

28:39: So thanks to that, we are confident that moving

28:43: the graphical client to .NET 9

28:45: is going to give us a really good performance upgrade.
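As a hedged illustration of why the runtime swap alone helps: the snippet below uses System.Numerics, which modern .NET JITs compile down to hardware SIMD instructions, while the old Mono runtime embedded in Unity generally falls back to scalar code. Exact speedups vary; this only shows the kind of code that benefits from better codegen.

```csharp
using System.Numerics;

// Scales a buffer in place. On modern .NET each Vector<float> operation
// becomes one wide SIMD instruction; on old Mono runtimes it typically
// does not, which is part of why identical code runs faster after the switch.
static void Scale(float[] values, float factor)
{
    int width = Vector<float>.Count;
    int i = 0;
    for (; i <= values.Length - width; i += width)
        (new Vector<float>(values, i) * factor).CopyTo(values, i);
    for (; i < values.Length; i++)    // scalar tail for the leftovers
        values[i] *= factor;
}
```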

28:50: The only problem is, it's a little bit more complicated process, because we do have to

28:54: rework those systems, and we have to rework integration before we can

28:58: move it out. This is also a little bit tangential, but

29:02: one of the things, once this happens, once we move it out,

29:06: we can actually replace Unity with Sauce, which is going to be our

29:09: custom rendering engine. And this whole process, it makes it easier

29:13: because we have this very nicely defined way to communicate

29:17: between the two, which means we can actually just swap this away, you know, and put

29:21: Sauce in here instead.
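A minimal sketch of that "simple pipe" between the two processes, assuming a message-based protocol; all of the names here (RenderPackage, IRendererConnection, the collection methods) are illustrative, not the actual design:

```csharp
// Illustrative sketch of a self-contained render package sent from the
// FrooxEngine process to whatever renderer sits on the other end.
public record RenderPackage(
    MeshHandle[] Meshes,       // what to draw
    MaterialState[] Materials, // how to draw it
    CameraPose[] Views);       // where to draw it from

void EngineTick(IRendererConnection renderer)
{
    world.Update();                                     // simulate in .NET 9
    RenderPackage package = world.CollectRenderState(); // snapshot for rendering
    renderer.Send(package);  // "render this stuff for me, please"
}
```

Because the boundary is just this message, the process on the other end is swappable: Unity today, Sauce later.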

29:25: Like, where we are right now. So right now,

29:30: if I move this back...

29:35: I'll move this back...

29:37: We have the sound here, there we go. So right now, the

29:41: sound system is still hybrid, the communication with Unity is still kind of like

29:47: messy, like there's a lot of

29:49: routes into everything, and the particle system is being reworked.

29:54: So we're essentially taking this, and moving

29:57: it in here. We are working on a new particle

30:01: system called PhotonDust. The work has been

30:05: kind of progressing over the past few weeks. It's actually getting close

30:10: to feature parity with the current system, which is a hybrid between Unity

30:13: and FrooxEngine. Because the goal is, we don't want to

30:17: break any content. We want to make sure that whatever

30:21: is built with existing particle system still works, and

30:25: looks the same, or at least very close to what it's supposed to look like.

30:31: Most of the things are already implemented.

30:33: If you go into devlog in our Discord,

30:37: you can see some of the updates and some of the progress.

30:41: The main thing that's missing right now as a major system is

30:45: implementing our own system for particle trails.

30:50: Once it's done, it's possible

30:53: that this is going to be sometime next week. I don't want to make any promises because things happen,

30:57: but it is getting close to there.

31:01: We can actually release public builds where we have

31:03: both systems at the same time. So we're going to have

31:08: legacy system and PhotonDust, with

31:11: conversion being something you trigger manually. We'll run a bunch of tests

31:16: with the community, so we'll essentially ask you to test your content,

31:20: test the new system, find any bugs with it. Once we are

31:24: confident that it works okay,

31:28: we are going to essentially remove the old system

31:32: and make the conversion to the new system automatic.

31:35: With that, this part is going to be done, and we're going to move on

31:39: to this sound part. The sound system

31:44: is essentially what handles stuff like audio

31:46: spatialization and so on. Right now, it's also a hybrid. So for example,

31:50: on Froox Engine's side, we do our own audio encoding and decoding.

31:55: Unity is not handling that, but what we're doing is we're feeding Unity

31:59: the audio data we want to play into individual sources

32:03: and Unity then handles spatialization and then outputs it to your

32:07: audio device. We're going to move that part into our own system

32:11: which is also what's going to allow us to take control of it and

32:15: build new features, like for example having multiple listeners.

32:19: And we're also going to move the system here. There's actually some work on

32:23: this that Cyro's been working on, that I asked for help

32:27: with, because one of the things in the system is a

32:31: reverb effect. And we essentially need to implement our own, because there's also

32:35: a thing that's currently handled by Unity, and Cyro has

32:39: made an integration with a reverb called the Zita reverb that we'll

32:43: use to replace the existing reverb zones.

32:48: Would you like to tell us a little bit more about that part?

32:50: Yeah, so we found

32:55: a nifty little... so let me actually back up a little bit.

32:59: So currently,

33:02: the reason why we can't just keep using this reverb or whatever,

33:07: like the one that we're using right now, is because it uses

33:11: Unity, but in turn the underlying reverb

33:15: uses FMOD or something.

33:17: And that costs at least four dollar signs

33:22: to use commercially, I think.

33:25: But we found a nifty library called Soundpipe

33:29: that includes a really nice sounding reverb effect.

33:33: And I have been working on

33:38: getting the library compiled and integrating it

33:41: with FrooxEngine.

33:45: You won't be able to do anything super duper fancy

33:48: with it right away, at least not until Froox reworks the whole audio system.

33:53: But you'll at least be able to process

33:55: audio clips with it and make them sound all echoey and stuff just to try it out.

34:00: Which I think will be pretty cool.

34:04: Then you can just reverbify

34:08: any audio clip in-game. You can make a reverb

34:12: baker, essentially, which I think is pretty cool.

34:16: It's kind of like expanding the audio processing, because you can already do some trimming,

34:20: you can do normalization, volume adjustments, fading, and so on.

34:25: Having that code integrated and ready,

34:28: we can already expose some of it,

34:31: play with it, and make tools that spin off of it

34:36: before we do the big integration.
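For a sense of what "getting the library compiled and integrating it" involves, here is a rough sketch of binding a native DSP library from C#. The entry point names are made up for illustration; they are not Soundpipe's or Zita's actual API:

```csharp
using System;
using System.Runtime.InteropServices;

// Hypothetical native bindings -- illustrative names, not the real API.
internal static class NativeReverb
{
    [DllImport("soundpipe")] internal static extern IntPtr reverb_create(int sampleRate);
    [DllImport("soundpipe")] internal static extern void reverb_process(
        IntPtr state, float[] input, float[] output, int frames);
    [DllImport("soundpipe")] internal static extern void reverb_destroy(IntPtr state);
}

// Offline use, as described: push a whole clip through the effect once
// and keep the echoey result as a new clip.
static float[] Reverbify(float[] clip, int sampleRate)
{
    IntPtr state = NativeReverb.reverb_create(sampleRate);
    var result = new float[clip.Length];
    NativeReverb.reverb_process(state, clip, result, clip.Length);
    NativeReverb.reverb_destroy(state);
    return result;
}
```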

34:39: Right now, the particle system

34:41: is the major one that's going to be fully pulled in.

34:45: Once that part is done, we're going to do the sound system, which I expect to be

34:49: faster than the particle system, because it doesn't have as many things,

34:53: but we'll see how that one goes.

34:56: Once the sound system happens, this is going to get reworked, the integration with Unity, so it's simpler.

35:02: Once this is done, we move the whole thing out, and it's going to be the big performance update.

35:08: I hope that it helps answer

35:12: the question.

35:16: I think I'm going to clean this up, just so it doesn't

35:19: clutter our space, and we can move to the next questions.

35:24: I'll mark these two as answered then.

35:27: There's actually one thing I was also going to mention. Even from the particle system, there's actually a few functions

35:32: that spun off extra things.

35:36: One of them being that we have access to

35:40: 4D Simplex noise, which you can use as a ProtoFlux node,

35:44: and there's also a 3D texture

35:46: with Simplex noise,

35:52: and I've seen people do really cool effects

35:56: with it, like this one for example. I think I actually got this one from Cyro.

36:00: So you see how it kind of evolves in time?

36:03: This is like a volumetric effect, so you can kind of push it through.
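The trick on display is sampling 4D simplex noise with time as the fourth axis, so a 3D volume evolves smoothly in place instead of just scrolling. A small sketch, where SimplexNoise4D stands in for whatever node or function you actually wire up (not a confirmed name):

```csharp
// Bake one frame of an animated volume; only `time` changes per frame.
float[,,] BakeVolume(int size, float time)
{
    var volume = new float[size, size, size];
    for (int x = 0; x < size; x++)
        for (int y = 0; y < size; y++)
            for (int z = 0; z < size; z++)
                // Fixed 3D coordinates, moving 4th coordinate: the
                // volume appears to "evolve" rather than translate.
                volume[x, y, z] = SimplexNoise4D(x, y, z, time);
    return volume;
}
```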

36:10: So people have been

36:11: already playing with it. And this is kind of generally how

36:15: we like to do development, where, I've got another version here.

36:19: This is super neat.

36:25: How we like to develop things

36:27: is like, you know, we want to add more building blocks. So even if we're building something

36:31: official, whatever building blocks we add, we try to make

36:35: as many of them available to everyone using the platform, because you can use them for a lot of other

36:40: things. So yeah, but that should

36:43: kind of cover those questions.

36:47: So next, we have a question from Navy3001.

36:52: Any idea of how in-game performance metrics for user content

36:55: would work? That's actually, that's a good question, and like, measuring

37:01: performance, that's a very kind of difficult

37:03: thing, because one of the things with performance is like, it depends.

37:08: Like, it depends on a lot of stuff.

37:11: So usually, like, you want to have like, you know, kind of a range of tools, you know, to kind of measure

37:16: like, measure things. One of them is, you know, you can measure how long

37:20: individual components take, you know, to execute, and sort of some way to

37:23: aggregate the data, so you can kind of see, okay, this is consuming a lot of time,

37:28: this is consuming a lot of time, but the performance impact

37:31: of something is not always like, you know, that direct, because something can, for example, the components

37:35: themselves, they can be quick to execute, but maybe the object is, you know,

37:40: has a really complex geometry, so it's taking a long time on the GPU

37:44: to render out. The other part

37:47: is like, performance can also differ depending on the scenario. Say,

37:51: you build an object, and the object is doing a raycast,

37:55: it's doing, you know, some kind of checks for collisions. If you have an object in a simple world,

38:00: maybe it doesn't like, you know, it runs pretty fast, but you bring

38:03: that object into a world with much more complex colliders, it suddenly, it starts

38:07: hurting performance, because now those collision checks are, like, you know, more

38:11: complex. The other example is like, say you use

38:15: like a node, like find child. You try to search for a child in a hierarchy.

38:20: And if you're in a simple world, maybe like, you know,

38:24: the hierarchy of objects is, you know,

38:27: it doesn't have too much in it. So it runs fast. But then you go into a world which has

38:31: way more, and now the performance kind of tanks. Now the thing

38:35: that was running reasonably fast in one world is

38:39: running slower in the other one. So, one of the ideas we

38:43: kind of had is, we would kind of build some sort of like, you know,

38:47: kind of like benchmark worlds. We would like, you know, like have like

38:51: different scenarios, complex worlds with complex hierarchies, you know, for this and that

38:55: and then have a system where you can essentially like run

38:59: that object in that world and sort of, you know, see how fast

39:03: it runs and how does it differ depending, you know, on a different kind of

39:07: scenario. Overall, I think this will

39:12: eventually end up with, you know, lots of different tools. So you have, like, you know,

39:15: the tools to measure how long the components take to execute,

39:19: how long the, you know, GPU takes to execute,

39:23: just sort of like lots of different tools to analyze different like, you know, performance things.

39:29: So I think that's overall like, you know, like

39:31: what you should expect, like once those tools come in

39:35: it's not going to be a single tool, but it's going to be like, you know,

39:39: a range of them that will probably keep like, you know, expanding and building upon.
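A minimal sketch of the first tool mentioned: timing component execution and aggregating per type so hot spots stand out. Component and its Update method are assumed stand-ins, not Resonite's actual profiler:

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;

static readonly Dictionary<string, TimeSpan> Totals = new();

// Wrap each component update in a stopwatch and bucket the time per type.
static void TimedUpdate(Component component)
{
    var stopwatch = Stopwatch.StartNew();
    component.Update();
    stopwatch.Stop();

    string key = component.GetType().Name;
    Totals[key] = Totals.TryGetValue(key, out var total)
        ? total + stopwatch.Elapsed
        : stopwatch.Elapsed;
}
```

As the answer notes, this only covers CPU-side component cost; GPU cost and world-dependent effects need their own tools.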

39:44: If I could

39:46: append to that

39:52: we probably also

39:54: because I know this is, I know this has like come up occasionally in relation

39:58: to questions like this, we probably also wouldn't

40:03: like give things like

40:06: we wouldn't do like an arbitrary limiting system like

40:10: oh you can only have 60,000 triangles, you can only have

40:14: X number of seconds of audio on you.

40:18: We do want to add tools so you can restrict things, because, like, it's not

40:23: it's not a perfect solution, but we want to add tools

40:26: so people can, like, you know, set some limits

40:30: on things. Because our whole kind of philosophy

40:34: is like, you know, we want to give people a lot of control.

40:38: And if you want to run a session like where you can spawn

40:42: object that has, you know, 50 million triangles and like everybody's going to be running at like

40:46: you know, 10 FPS, but you know, you want to be like I have a beefy

40:50: GPU, I want to look at this super detailed model, we always

40:54: want people to have ability to do that. At the same time

40:58: we want to add tools so like, you know, if you want to host like a chill world

41:02: if you want to keep it like, you know, more light, you have tools

41:06: to kind of like, you know, set certain limits on the users, how much they can

41:10: spawn in, how much they can bring in. So we're not going to make

41:14: them, you know, forced, but we're much more likely to add like tools where

41:18: you have the kind of control to decide what you want, you know, in your

41:22: world, what you want in your experience.

41:26: Other aspect of that is like, you know, we have the asset variant system and we already

41:30: use part of it, like you can go into your settings and you can lower

41:34: the resolution of textures. You can, for example, clamp it to like 2K.

41:38: So if you're, you know, low on VRAM, you can lower the textures

41:42: and if somebody has, you know, 8K texture on their avatar, you're

41:46: only going to load it up to 2K. You know, it's not going to hurt you, but other

41:50: people, like say somebody has, you know, one of the, you know, 4090s

41:54: with 24 gigs of VRAM and they don't care, they can keep it like, you know, kind of

41:57: unlocked. And it's kind of, you know, aligned with our kind of philosophy, which is like,

42:04: give people as many tools as possible to kind of control their experience. But also,

42:08: we don't want to enforce, like, you know, limits on people where possible.
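A small sketch of how the texture clamp mentioned above can work under the hood: skip top mip levels until the texture fits the cap, so the full-resolution data is never uploaded. Illustrative logic only:

```csharp
// With an 8192px texture and a 2048px cap this returns 2, and each
// skipped mip level cuts VRAM use by roughly 4x (so ~16x here).
static int MipsToSkip(int fullResolution, int maxResolution)
{
    int skip = 0;
    while ((fullResolution >> skip) > maxResolution)
        skip++;
    return skip;
}
```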

42:14: Yeah, that's kind of more

42:15: so where I was going with that, is that we wouldn't have like a sort of

42:20: hard and fast, these are the rules for the whole platform kind of

42:24: rules. Because, you know, not everybody's computers are

42:28: equal, and so maybe I don't want to render your 500 million

42:31: polygon model, right? But

42:37: we also don't want to

42:39: we want to present this stuff in a sort of like unbiased way.

42:43: Like, we don't want to, like, we wouldn't

42:48: I wouldn't want to color, like, 500, like, you know,

42:51: someone's polygon count in, like, red or something. Because

42:55: it gets into, like, a social kind of thing, but it also comes with...

43:03: I think that should, like, answer

43:05: this one in particular. We should probably, like, move to the other questions because we got a bunch of them piling up.

43:10: Can I answer the next one?

43:14: Uh, sure.

43:17: I haven't actually read it yet.

43:20: Okay, so, TheJebForge asks, would it even be possible

43:24: to multithread world processing in Resonite? Like, if the world is incredibly heavy in the amount

43:28: of CPU it uses, but since Resonite only uses one thread, it's not using all the CPU it could have been.

43:34: I know multithreading introduces a lot of problems with thread synchronization.

43:37: What do you think?

43:41: Alright guys, say it with me.

43:45: Oh gosh, the camera's moving.

43:48: Hold on, hold on, hold on, hold on.

43:51: Alright, say it with me. Resonite is not

43:56: single-threaded. This is a myth

44:00: that has somehow spread around that Resonite only runs on a single

44:04: thread. This is abjectly not true.

44:09: Yeah, this is a thing we kind of get a lot, because I think people are just

44:13: like, you know, it runs with poor performance, therefore it's single-threaded.

44:18: When it comes to multithreading, it's like way

44:21: more complex. It's not a black and white thing.

44:24: So, the way I kind of put it is, you know,

44:27: it's not like an on-off switch. Imagine you have

44:31: a city or something, and the city has poor roads.

44:36: Maybe there's areas where the roads

44:39: are very narrow, and it's kind of hard for cars to get through.

44:43: You can have areas of the city where you have highways, and

44:47: you can have lots of cars in there. It's not an on-off

44:51: switch where you just turn a switch and suddenly

44:55: every road is wide, but you can gradually rebuild

45:00: more of the city infrastructure to support

45:03: more of that high bandwidth. With Resonite,

45:07: there's a lot of things that are multithreaded.

45:11: There's also a lot of things that could be multithreaded, and they're going to be more

45:15: multithreaded in the future, but it's not

45:21: it's essentially not

45:23: a black and white thing, whether it's either multithreaded or not

45:27: multithreaded. You have to think about Resonite,

45:30: it's like lots of complex systems. There's so many systems, and

45:34: some of them are going to be multithreaded, some of them are not going to be

45:38: multithreaded. Some of them are not multithreaded, and they're

45:42: going to be multithreaded. Some of them are going to stay single-threaded, because there's not

45:46: much benefit to them being multithreaded. So we definitely

45:50: want to do more, but we already have a lot of things

45:54: running on multiple threads, like

45:59: asset processing that's multithreaded, the physics that's using

46:02: multiple threads, a lot of additional processing

46:06: spins off, does a bunch of background processing, and then integrates with the main thread.

46:11: So there's a lot of multithreading in the system already,

46:14: and there's going to be more.
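A sketch of the pattern just described, where heavy work spins off to background threads and only the final integration touches world state on the main thread. Download, DecodeMesh, and AttachToWorld are hypothetical stand-ins:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

static readonly ConcurrentQueue<Action> MainThreadQueue = new();

static void LoadAssetInBackground(string url)
{
    Task.Run(() =>
    {
        var mesh = DecodeMesh(Download(url));               // safe off-thread
        MainThreadQueue.Enqueue(() => AttachToWorld(mesh)); // deferred to main
    });
}

// Called once per frame by the world's update loop, so world state is
// only ever mutated from one thread and nothing gets corrupted.
static void DrainMainThreadQueue()
{
    while (MainThreadQueue.TryDequeue(out var action))
        action();
}
```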

46:18: It's not something that's like a magic silver bullet.

46:25: With performance,

46:27: there's a lot of complexity. There's a lot of things

46:32: that can be causing low performance,

46:34: and multithreading is not always the best answer.

46:38: So for example, the .NET 9 switch, that's actually not

46:42: going to change anything with multithreading,

46:46: but it essentially makes the code that we already have,

46:50: with whatever multithreading it has right now, run

46:54: several times faster, just by switching the runtime, just by having

46:58: better code gen. So there's a lot of different

47:02: things that can be done to improve performance, multithreading is just one of them.

47:08: I think I should cover a lot of it,

47:11: but yes.

47:15: One more thing is, it's also something like,

47:18: when there's a world that's very heavy, it depends what's making it

47:23: heavy, because some things you can multithread, but some things you cannot multithread.

47:26: If you have some user content that's doing lots of interactions with things,

47:31: if you just blatantly multithread it, it's going to end up

47:34: corrupting a bunch of stuff, because with every algorithm

47:38: there's always a part of it that's irreducible.

47:43: So we want to introduce more systems that use multithreading

47:46: where possible, but again, it's not

47:50: a silver bullet. It's more like

47:54: a gradual kind of process that happens over time.

48:00: Next we have GrandUK is asking

48:02: are there roadmaps with time estimates for both development and what do you want Resonite to be?

48:06: So for roadmaps, we generally don't do super

48:10: far-ahead roadmaps. Right now our focus is on performance

48:14: updates, and you can actually find on our GitHub

48:18: there's a project board, and there's a list of issues

48:22: that pertain to performance updates, and you can see how those

48:26: progress. We don't do time estimates

48:31: because the development varies a lot, and oftentimes

48:34: things come in that we have to deal with, the delay things

48:42: or maybe there's additional complexity, so we

48:42: avoid promising certain dates

48:46: when we are not confident we could actually keep them.

48:50: We can give you very rough ones, for example with the

48:54: performance, with the big performance upgrade

48:59: I roughly expect it to happen sometime in Q1

49:02: sometime early next year. We'll see how it goes

49:08: but that would be my rough estimate

49:10: on that one. After that, we usually

49:13: once we finish on a big task, we re-evaluate

49:17: what would be the next best step for the platform

49:22: at that point, and we decide are we going to focus on UI

49:25: are we going to implement this thing, are we going to implement that thing, because

49:30: we try to

49:33: keep our ear to the ground and be like this is what would

49:37: give the community and the platform most benefit right now

49:40: this is what's most needed right now, and we want to make the decision

49:46: as soon as possible

49:48: no, actually as late as possible.

49:54: Next question, we have Jack the Fox author

49:57: what are some examples of features you've implemented a particle you're proud about?

50:02: There's a whole bunch, I do a lot of

50:04: systems, the one I'm actually working on right now, the particle system

50:08: I'm pretty proud of that, it's

50:13: technically not out yet, but I'm very happy with how it's going

50:17: in part because it now

50:21: gives us control to very easily make new particle effects

50:25: and do stuff we were not able to do easily before

50:29: the one that came before that is the data feed system

50:35: that's a culmination of a lot of

50:37: approaches I've been developing to how we do UI

50:41: in Resonite.

50:44: So with that one, one of the big problems we've had with the UI is that Resonite

50:49: is building a lot of things from the ground up,

50:52: because of the layers I was talking about earlier in the stream,

50:58: but it also makes things difficult because

51:00: we cannot just take existing solution and use it

51:04: so a lot of the UI, we actually have to build those systems ourselves and build frameworks

51:08: to work with them, and the old UIs, they have

51:12: the problem where the code of them is like this big monolith

51:16: and it's really hard to work with, we have to

51:19: if there's a misaligned button or something,

51:23: we have to go to the code, change some numbers there, change some methods

51:27: that are called, compile, wait for it to compile

51:30: run the application, look at it, be like that's still wrong

51:34: go back to the code, make more changes, compile, wait for it

51:38: wait for it to launch, look at it, it's still wrong, go back to the code

51:44: sometimes people are like, oh this thing is misaligned

51:47: in this UI, and we're fixing that

51:51: sometimes it takes an hour, just messing around

51:55: and that's not very good use of our engineering time

52:00: but the data feeds

52:03: is a system that's very generalized

52:05: that essentially allows us to split the work on UI

52:10: between the engineering team and our content team,

52:16: so when we work the settings UI, on the code

52:19: side we only have to worry more about the functionality of it, like what's the structure, what's the data

52:23: interfaces, and then we have the rest of our team

52:27: like our content team, actually build the visuals

52:31: in-game, and put a lot of polish into each of the elements

52:36: and that process has made it much

52:39: simpler to rework the settings

52:43: UI, but what's an even bigger part of it is

52:47: the data feed system that this is built on

52:50: is very general, and it's been kind of designed to be general

52:54: so the settings UI, it was used as sort of like a pilot project for it

52:59: but now, we're going to use it

53:02: once we get to more UI work, to rework the inventory

53:07: rework the contacts, rework the world browser, file browser,

53:11: rework the inspectors, and it makes the work required

53:14: to rework those UIs be at least an order of magnitude

53:18: less, which means that before the data feeds

53:25: these are

53:26: rough estimates, but say it would have taken us two months to

53:31: rework the inventory UI, now it's going to take us two weeks

53:36: and those numbers are

53:39: more of an illustrative point, but it's

53:43: essentially on this kind of order, it makes it way simpler, it saves us so much time

53:47: which means we can rework a lot more UI

53:50: in a shorter time span.
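A rough sketch of the split the data feed system enables; the interface shapes here are illustrative, not the actual FrooxEngine types. Code defines what the data is, and in-game templates define how it looks:

```csharp
using System.Collections.Generic;

// Engineering side: expose structured data, no visuals.
public interface IDataFeedItem
{
    string Label { get; }
}

public interface IDataFeed
{
    // Settings, inventory, contacts, world browser: each just yields
    // items for a path; the visual templates are authored in-game.
    IEnumerable<IDataFeedItem> Enumerate(IReadOnlyList<string> path);
}
```

With this split, a misaligned button is fixed by editing the in-game template, not by recompiling the engine.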

53:55: There's lots of things I'm kind of proud of, I just kind of did two most recent

53:59: ones, so I could ramble for this for a while, but

54:03: we have a lot of questions, so I don't want to hold things up.

54:06: Sorry, do you actually have one we'd like to share with us?

54:10: Yeah, I'll try and be quick with it

54:14: since we're getting back to... How long have we been running actually?

54:19: We're coming up on an hour.

54:22: How long do we want to keep this going for?

54:24: So my aim was for one hour to two hours, depending on the questions. We got a lot of those questions, so I'm okay going through

54:30: all of these, but as we start getting toward two hours, we'll probably

54:34: stop it.

54:38: When you were talking about the build process, that kind of made me think of

54:42: something that I really enjoyed working on.

54:47: It's kind of one of those things where

54:49: it's really important, but it's just so invisible.

54:56: And what I did behind the scenes

54:58: is I basically reworked the entire

55:02: build process of FrooxEngine, almost.

55:08: Since FrooxEngine has been

55:10: around for a while, and it's been through

55:14: many updates to C Sharp and C Sharp's project system, we were still

55:18: using the legacy

55:22: C Sharp project format, the old MSBuild style.

55:27: And that really only works in

55:30: something like Visual Studio these days.

55:33: It's kind of hard to work with, it's not quite as robust as the newer

55:38: build system for .NET, and as a result

55:44: there would oftentimes be

55:49: like, you'd have like

55:50: weird issues if you wanted to add packages and stuff, and

55:54: you could only use something like Visual Studio as your IDE of choice to boot.

56:01: And I

56:03: saw that, and I

56:07: decided to poke at it, and it actually ended up being

56:11: a lot easier than I anticipated because Microsoft provides a nice

56:15: little tool to upgrade your projects, and so what I did is I

56:18: went through and I upgraded all of the projects to the new

56:22: C Sharp format, which means that we can take advantage of

56:26: the much nicer project files, which means it's easier

56:31: to edit them directly and add actions and stuff

56:35: and it also means the engine

56:39: can now be built in IDEs other than Visual Studio.

56:43: You could use... or, Visual Studio

56:47: proper is what I meant to say there. But now you can build it in like

56:50: VS Code, or like, you could build it in

56:56: you could probably build it in like

56:59: Rider if you pay for Rider, you could build it, you could even build the engine

57:02: from the command line now, which is really really good for

57:06: yeah, like automated builds. That's a big thing I did

57:11: that nobody saw, but I'm really really proud about.
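For context, this is roughly what the migration buys: a minimal SDK-style project file (values illustrative, not FrooxEngine's actual configuration). The legacy format's pages of boilerplate collapse to about this, and `dotnet build` works against it straight from the command line:

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net9.0</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Some.NuGet.Package" Version="1.2.3" />
  </ItemGroup>
</Project>
```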

57:14: It's one of those things where it doesn't show on the surface, but

57:18: it makes our lives as developers way easier, because I had

57:23: so many times where I would literally lose sometimes even hours

57:27: of time just trying to deal with some kind of problem, and

57:30: having those problems kind of resolved, and have the system kind of be nicer

57:34: it allows us to invest more of our time into actually

57:39: building the things we want to build, instead of dealing with project build issues.

57:43: One of the problems, for example, that's

57:46: kind of weird, like one of those weird things is with ProtoFlux.

57:50: Because for ProtoFlux, it's technically a separate system

57:54: and we have a project that actually analyses all the nodes

57:57: and generates C Sharp code that binds it to Resonite.

58:03: The problem is, with the old MSBuild,

58:06: for some reason, even if the

58:10: project that generates that code runs first

58:14: the build process doesn't see any of the new files

58:19: in that same build pipeline.

58:22: So if we ever added a new node, we would compile it and it would fail

58:26: because it's like, oh, this code doesn't exist

58:30: even though it actually exists at the time, it just doesn't see it.

58:34: With the changes Cyro made, the problem is gone. We don't have to deal with this whole thing.

58:39: But the really big thing is it prepares Resonite for more

58:43: automated build pipeline, which is something we've been trying to move towards

58:47: because it's going to be one of the things that's going to save us a lot more time as well,

58:51: that's going to make it so we can actually just push code

58:54: into the repository. There's automated tests that are going to run, there's going to be automated

58:58: builds, the binaries are automatically going to be uploaded and it's just going to

59:02: remove all of the manual work that happens all the time.

59:06: It makes bringing on people like me easier too.

59:09: It makes it easier to bring more engineers as well because now they don't have to deal with those weird

59:14: issues. I know Prime also lost time,

59:18: sometimes he lost a day, just dealing with project issues,

59:22: and a day you could spend working on other stuff

59:26: and instead you have to just make things work.

59:30: Thank you Cyro for making this.

59:34: Things like this, even though they're not visible to the community, they help a lot.

59:41: Next, we have

59:43: a question from FantasticMrFoxBox.

59:47: With sound system updates, can we get a way to capture a user's voice with

59:50: a permission and import audio streams dynamically into the world?

59:55: This would allow us to fully implement the ham radio stuff into Resonite and allow us

59:59: to ditch using external browser support to support audio.

01:00:04: So I'm not sure if I've

01:00:05: I don't understand enough about how you want to capture it

01:00:11: But since we'll be handling all the audio rendering

01:00:15: we'll be able to build a virtual microphone that actually captures

01:00:19: spatialized audio from its own point, wherever it is in the world.

01:00:23: So that's one of the things you'll be able to do. You'll be able to bring the camera

01:00:27: and have it stream to an audio device.

01:00:30: So I would say yes on that part, on the

01:00:35: kind of capture.

01:00:37: I don't know...

01:00:39: I think I know what they mean.

01:00:45: Am I correct in assuming

01:00:48: that you want a way to import multiple

01:00:51: streams into the world from a single user? Is that what you're talking about?

01:00:58: We'll probably have to wait for them.

01:01:00: Yeah, wait a second.

01:01:05: We might

01:01:05: get back to this question.

01:01:10: You'll essentially be able to render audio out

01:01:13: from any point in the game in addition to rendering for the user.

01:01:17: And then it becomes a question what do we want to do? Do we want to record an audio clip?

01:01:21: Do we want to output it into another audio device so we can stream it into something?

01:01:25: So that will work. If you want to import audio back in

01:01:31: that's probably a separate thing.

01:01:33: That's probably not going to come as part of it. We'll see.

01:01:37: If you have any kind of clarification just ask us more and we'll get back to this.

01:01:43: Next we have

01:01:46: EpicEston is asking, will the headless client be upgraded to .NET 9?

01:01:50: Yes. Plan to do this soon.

01:01:53: It should be mostly just a flip of a switch, we don't expect

01:01:57: big issues. One of the things we want to do is we're going to make announcements

01:02:01: so people know this is coming, you can prepare your tooling

01:02:05: make sure whatever scripts you're using

01:02:09: to update your headlesses don't just explode.

01:02:14: There's a GitHub issue on it and I'll try to make the announcement

01:02:17: in a bit, probably sometime next week.

01:02:22: Get people ready. Alex2PI is asking,

01:02:25: makes me wonder what's currently the culprit of most crashes; at least on my computer

01:02:29: I mostly see information that Unity crashes. Couldn't you just restart Unity?

01:02:35: We also had a discussion about couldn't you just

01:02:39: I mean, so

01:02:41: for the first part of the question, crashes, they can have lots of reasons

01:02:47: it's really hard to say, like in general

01:02:49: you pretty much have to send us the crash log, we look at it, we look at the call stack and be like

01:02:53: this is probably causing it, so it's kind of hard to say

01:02:58: in general, for the part where we just restart Unity

01:03:02: I mean, it's kind of what a crash is, it essentially breaks

01:03:07: and then it has to shut down and you have to start

01:03:09: it again, so in a way you're kind of restarting Unity

01:03:13: it's just that the restart is kind of forced

01:03:19: but this actually kind of ties into it,

01:03:25: because if you've been here earlier

01:03:28: we've been talking about how FrooxEngine is going to essentially be moved into

01:03:32: its own process, and then Unity is going to be handling the rendering

01:03:36: one of the things that I'm considering as part of the design is so

01:03:40: that Unity can actually be restarted,

01:03:44: maybe. So if Unity happens to crash, we can keep

01:03:48: running FrooxEngine, start a new Unity, and just

01:03:52: reinitialize everything. So I do want to make that part of it

01:03:55: just in general to make the system more robust, so it's possible

01:04:02: but TBD

01:04:03: we'll see how that kind of goes
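As a rough illustration of that design idea (not Resonite's actual code), the engine process could supervise the renderer process and relaunch it whenever it dies. The binary name and the reinitialization hook below are stand-ins:

```csharp
// Sketch of the "restartable renderer" idea: the engine process supervises a
// renderer process and relaunches it if it crashes. "Renderer.exe" and the
// reinitialization hook are stand-ins, not real Resonite binaries or APIs.
using System;
using System.Diagnostics;
using System.Threading;

static class RendererSupervisor
{
    public static void Run(CancellationToken engineShutdown)
    {
        while (!engineShutdown.IsCancellationRequested)
        {
            using var renderer = Process.Start("Renderer.exe");
            renderer.WaitForExit();

            if (engineShutdown.IsCancellationRequested)
                break; // normal shutdown, don't restart

            // The renderer died, but the engine (and the session) is still
            // alive; start a fresh renderer and resend the scene state.
            Console.WriteLine("Renderer exited unexpectedly, restarting...");
            // Engine.ReinitializeRenderer(); // hypothetical hook
        }
    }
}
```

The key property is that the session state lives in the engine process, so a renderer crash would only cost re-uploading meshes, textures, and other GPU state rather than the whole session.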

01:04:08: currentUK is asking, I have heard from someone complaints of headless being a

01:04:11: patron reward. This was particularly a complaint about communities that do want to host events,

01:04:15: essentially forced into it to keep events going if their host crashes. Are there

01:04:19: any plans later to remove the patron requirement for the headlesses when things are more stable

01:04:23: and performant? So at some point we'll probably

01:04:28: make it more open. Our

01:04:31: tentative goal, and this is not set in stone, so things

01:04:35: might change. Our tentative goal is we want to offer

01:04:39: a service where we make it easy to auto-spin headlesses

01:04:43: and move Patreon to that, so if you

01:04:47: support us financially you will get a certain amount of

01:04:51: hours for the headlesses and we're going to make it very easy to host, and if you want to self-host

01:04:55: we're going to give you the headless. We do have to be careful

01:04:59: from the business perspective, because Patreon is one of the

01:05:03: things that's supporting the platform and allowing us to work on it.

01:05:07: So we don't want to compromise that because

01:05:13: if we do something with that

01:05:15: it ends up hurting our revenue stream, then we're not able to

01:05:18: pay people on our team, and then we're not able to work on

01:05:23: things and things end up kind of bad.

01:05:28: We do want it to be accessible to as many people as possible, but we're sort of

01:05:31: balancing it with the business side of things.

01:05:37: Next one, TroyBorg.

01:05:39: Cyro also did an FFT node a while ago. With the audio system, could we make visualizers part of the game,

01:05:43: like waveform visuals, or be able to do better detection of bass for music effects?

01:05:47: That's actually separate, because that happens fully within Resonite.

01:05:52: The audio system is more about rendering the audio output

01:05:56: and pushing it to your audio device.

01:06:02: Next we have, I'm kind of just speeding through these questions because we have a bunch.

01:06:07: Skywakitsune. A few people have mentioned that they are not happy with the new working system and how

01:06:11: it looks. Are there plans to continue to improve that? There was a specialized update but people

01:06:15: are still not happy. We can always improve things. We just released

01:06:19: an update which

01:06:23: integrates some of the community settings which would make it look way better.

01:06:29: For things where, you know,

01:06:31: people still find issues with it, we will need reports on those because

01:06:35: right now

01:06:38: we're not sure

01:06:41: after the update, we're not sure what's making people not happy about it.

01:06:45: We'll have more concrete stuff to work with

01:06:48: as people make reports, so we can know

01:06:52: what to focus on. But yes, in general, we are always

01:06:56: willing to improve things. We

01:07:01: essentially

01:07:04: want to make it as polished

01:07:06: as it can be, but we also need more kind of hard

01:07:09: data to work with so we know where to invest our time.

01:07:18: Next we have TroyBorg. What is causing the violent bursts

01:07:21: sometimes when Froox moves? I'm not sure. It could be just the bloom doing its

01:07:25: thing, maybe.

01:07:27: It's his radiant yellow complexion.

01:07:33: Your resplendent visage.

01:07:35: This one is actually Erlage: what was the answer to this?

01:07:39: So these look like just questions within the chat.

01:07:43: Erlage86. Who is the second person here on the camera?

01:07:46: This is Cyro. He's our engineering intern.

01:07:50: Hello. Hi. How you doing guys? It's me.

01:07:57: Next we have

01:07:58: SkymoKitsum. Question from Tara Whitel, who can't watch the stream right now:

01:08:03: are video players going to be updated with a newer core of VLC? I have

01:08:06: heard from several builders that the players use a very outdated core.

01:08:10: Yes, the system we use right now, it's a plugin called UMP,

01:08:14: Universal Media Player, which is built on VLC; unfortunately it

01:08:18: hasn't been updated in years, which means it's using an older

01:08:22: version of it. We've been looking into upgrading

01:08:26: to the actual official VLC plugin. The problem is

01:08:30: it's still not mature enough in some ways.

01:08:35: The last I remember, there's issues where you cannot

01:08:38: have more than one video at a time. You can only have

01:08:42: one, and if you try to do another one, it just explodes.

01:08:47: There's other things we can look

01:08:49: into, like alternative rendering engines, but there's also

01:08:52: potential time and money investment. If the pros

01:08:57: are big enough, we can

01:09:00: consider that we might invest into one, but we need to do some testing

01:09:05: there and see how well it works.

01:09:09: It's unfortunately a difficult situation because the solutions

01:09:12: are limited.

01:09:17: It's something we want to improve,

01:09:20: but it's also difficult to work with, unfortunately.

01:09:26: Can I comment on the next one?

01:09:32: Rasmus0211

01:09:32: asks, thoughts on about 75% of all users being in private worlds

01:09:36: around the clock. Often new users mention they see practically no enticing

01:09:40: worlds. This is not a Resonite

01:09:44: problem. This is a problem of scale.

01:09:48: All platforms have a

01:09:53: pretty wide majority of people who just kind of want

01:09:56: to hang out and not really be bothered.

01:10:01: Unfortunately, we're not the biggest platform out

01:10:04: there. We're still kind of small.

01:10:09: And as we

01:10:10: grow, that problem will undoubtedly get better.

01:10:16: It's not really a

01:10:18: technical problem, it's more like a social one, because people

01:10:23: behave in a certain way, and it's really hard

01:10:26: to change that. There's some things we want to do to

01:10:30: entice people to make it easier to discover things, like

01:10:34: we were talking earlier, adding an events UI, so you can see these are the things

01:10:38: that are coming up, these are going to be public events that you can join. Right now, I

01:10:42: believe there's the Creator Jam event that's going on, and it's always

01:10:47: every weekend, it's public to everyone.

01:10:51: But it depends what people are coming in for, because people

01:10:54: might come in, and they don't actually want to join public events, they want to go into those

01:10:58: private worlds. But the question is, how do you make those people

01:11:03: discover the friend groups and hang out

01:11:07: in those worlds? It's a challenging problem,

01:11:11: especially from the platform perspective, because we can't just force

01:11:15: people into public worlds. People

01:11:17: will host whatever worlds they like, but we

01:11:22: always want to see what kind of tools we can give to entice people

01:11:27: and make worlds and socialization easier for

01:11:30: them to discover. But like Cyro said, it is a thing that

01:11:34: gets better with scale, once we can grow more.

01:11:40: There's a

01:11:42: number of events, though. If people go right now,

01:11:47: since we don't have the events UI in-game, if you go into the

01:11:51: Resonite Discord,

01:11:58: we have community news, and lots of

01:12:03: different communities post regular events there, so people can

01:12:06: find what's going on in the platform, it helps a bit in the meantime if

01:12:11: people are looking for things to do.

01:12:15: Next question from Baplar.

01:12:20: Yes, it's actually been worked on in parallel.

01:12:26: Geenz is one of the main people working on that.

01:12:30: We did have meetings

01:12:33: now and then, we're sort of synchronized on the status of it.

01:12:37: Last time, that was two weeks ago or so, we talked about

01:12:41: the multi-process architecture, how that's going to work, how it's going to integrate

01:12:45: with Froox Engine, and how

01:12:49: those systems are going to communicate. Geenz's approach was

01:12:53: to look at what the current Unity integration has

01:12:57: and what needs to be implemented on the Sauce end. However, there's a lot of things

01:13:01: that we're actually moving, like the particle system, audio system,

01:13:05: input system, lots of things that are going to be moved over into FrooxEngine,

01:13:09: so they don't need to be implemented on the Sauce side, and they're going to focus more

01:13:14: on other things. They have a list

01:13:17: of Sauce features, and specifically

01:13:20: Bevy features, because Sauce is being built around the Bevy

01:13:24: rendering engine, which

01:13:28: maps to the current features we have. For example, we have lights,

01:13:32: do they support shadows, we have reflection probes, do they support

01:13:36: this and that. So they're working on making sure there's feature

01:13:40: parity there. Once we're done with the performance upgrade,

01:13:44: we can work more on the integration. They also work

01:13:48: on the Resonite side, so, you know, what Geenz has been doing on consolidating

01:13:52: the shaders, because all the shaders we have right now,

01:13:56: they need to be rewritten for Sauce, because

01:14:00: the current ones are designed for Unity, so we need equivalents

01:14:05: the equivalents of those are

01:14:08: essentially going to be implemented for the new rendering engine.

01:14:14: Next, Epic Easton. How do you

01:14:16: make a walkie-talkie system? There's actually one thing you should be able to do

01:14:21: with the new audio system, you'll be able to

01:14:24: have a virtual microphone, put it on a thing

01:14:28: and then have it output from another audio source. That actually might be

01:14:32: a thing you'll be able to do once we rework that, because it shouldn't be too difficult to add

01:14:36: components for that.

01:14:41: Next, Relanche: rigid body Newtonian physics system, sooner or later?

01:14:44: So definitely sometime after the performance upgrade

01:14:49: we integrate a physics engine called Bepu Physics.

01:14:53: One of the things we want to do after we move the Froox engine

01:14:56: out of Unity and it's running on .NET 9, we want to synchronize

01:15:00: Bepu to the latest version, because right now we've kind of had to diverge,

01:15:04: because BepuPhysics used to work with

01:15:07: .NET Framework, which is what Resonite runs right now under Unity.

01:15:12: But now the newer versions they require, I think .NET 5

01:15:15: or maybe they even bumped it higher, which means we cannot

01:15:19: really use those, at least not without lots of backporting.

01:15:23: So one of the tasks is going to be to sync it up and then we're going to

01:15:27: be able to look at how much work it is, and when we want to

01:15:31: prioritize it and how we should get the simulation integrated with FrooxEngine. It's also

01:15:35: going to help because Bepu Physics is designed to work with

01:15:39: modern .NET to be really performant, which is why

01:15:43: I kind of consider the performance upgrade a prerequisite for

01:15:48: implementing that, so we're actually running it with

01:15:51: the runtime it's supposed to run with. But there's no specific

01:15:55: kind of prioritization right now. Once we're done with the performance update, we might focus

01:15:59: more on UI, or focus on IK, or maybe other things; we'll

01:16:03: reevaluate at that point.

01:16:07: Grant is asking

01:16:09: post move away from Unity to Sauce, could it be possible to dynamically connect and disconnect from the VR

01:16:13: runtime without restarting the game? That's not really a thing

01:16:17: that needs the move away from Unity. It's possible to implement it

01:16:22: with Unity. It just takes a fair amount of work.

01:16:27: So, possible yes, I would

01:16:29: say. The question is are we going to invest time into implementing that.

01:16:35: For that I don't know the answer right now.

01:16:39: Next we have a question, RustybotPrime

01:16:41: Would these audio rendering sources allow for spatial data for your own

01:16:45: voice? Example, if I want to record conversation between myself and someone

01:16:49: else from third person without it sounding like I'm right at the camera.

01:16:54: Yes, there wouldn't be an issue because we can just

01:16:58: have any sort of listener in the world and just

01:17:01: record that with binaural audio and everything.

01:17:06: Next, what flavor of Sauce, what does it

01:17:09: taste like? It's very salty. Mayonnaise.

01:17:13: Not mayonnaise. He actually makes his own kind of sauce,

01:17:17: which is why it's named Sauce. Actually, I forget what he calls it.

01:17:23: Scotch sauce. Scotch sauce, yes.

01:17:25: He makes this really delicious sauce, it's a very salty

01:17:29: one, but it has loads of flavors to it.

01:17:34: I think this next one's aimed at me.

01:17:37: Alex2pie says, Cyro, I heard that some people don't trust you and that you don't care.

01:17:42: Do you know where this comes from? I think I do.

01:17:45: I'm in desktop a lot, and I'm often

01:17:49: either working in Froox Engine these days, or

01:17:54: I'm kind of audio sensitive and I can get overstimulated

01:17:57: kind of easily, so sometimes I will just kind of stand there.

01:18:02: Or maybe I won't respond so colorfully.

01:18:05: But I like having people around, and so that's why

01:18:09: I exist despite that.

01:18:13: I also appreciate it when

01:18:18: I'll probably open up a lot more

01:18:23: if

01:18:25: ...how do I put this...

01:18:30: Basically, if you

01:18:32: want to interact with the Cyro creature well,

01:18:37: do things like

01:18:39: ask before poking my nose or patting my head and stuff.

01:18:46: And ask me

01:18:48: if you want to send me a contact request. Just don't come up

01:18:52: to me and be like, you're cute, and then click my name and add me. Because then I have to

01:18:56: explain, I'm probably not going to add you, man, we talked

01:19:00: for maybe two seconds. I need at least 45 seconds.

01:19:09: But I...

01:19:10: If you've come across me and I've been in that

01:19:12: sort of state where I'm not super talkative, or maybe I seem a little detached,

01:19:18: hopefully that sheds a little light on that.

01:19:20: I love this place very dearly, and

01:19:25: I love all of you very dearly.

01:19:28: Cyro is a good bean.

01:19:33: So next, we have a question from Dan Amos.

01:19:37: What's the current workflow for identifying performance bottlenecks?

01:19:41: So, generally, the workflow

01:19:44: is something like, you know,

01:19:47: it kind of depends, because there's lots of things that can

01:19:50: cause performance issues.

01:19:54: So usually it's a combination of different things, but usually it kind of starts more

01:19:58: with just observation. You know, kind of seeing what's running

01:20:02: slow, when am I lagging, and so on.

01:20:07: Once there's that initial observation, we will try to

01:20:12: narrow down to the root of the issue.

01:20:14: And for that, we can use a variety of tools. Some of them are in-game.

01:20:18: For example, we have stats on how

01:20:23: much time certain parts of the process are taking.

01:20:26: Once we need more detailed information, we can, for example, run

01:20:30: the headless client with Visual Studio profiling tools,

01:20:34: and they actually measure how long is spent in each method,

01:20:39: how long is spent in each call. That gives us some kind of data.

01:20:42: The other part of it is benchmarking. Once we have a suspicion that

01:20:47: this thing is causing a lot of performance problems,

01:20:50: We can write a test sample, and then run it

01:20:55: with different runtimes, run it with different settings,

01:20:58: do A-B test things, see how things change.

01:21:03: For example, I've done this with a lot of Resonite's

01:21:06: helper extension methods where, for example,

01:21:10: even with stuff like the base vector operations, I would try different ways to implement

01:21:13: certain operations, run a benchmark, and see how fast it runs.
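As an illustration of that A/B benchmarking style, here is a minimal sketch using BenchmarkDotNet, a common .NET micro-benchmarking library; the vector type is defined locally for the example and is not Resonite's own type:

```csharp
// A/B micro-benchmark sketch using BenchmarkDotNet; the vector struct here
// is local to the example, not Resonite's own math type.
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Running;

public struct Float3 { public float X, Y, Z; }

public class DotProductBenchmarks
{
    private Float3 _a = new Float3 { X = 1, Y = 2, Z = 3 };
    private Float3 _b = new Float3 { X = 4, Y = 5, Z = 6 };

    // Variant A: straightforward scalar math.
    [Benchmark(Baseline = true)]
    public float Scalar() => _a.X * _b.X + _a.Y * _b.Y + _a.Z * _b.Z;

    // Variant B: the same operation through System.Numerics, which the JIT
    // can lower to SIMD instructions on modern runtimes.
    [Benchmark]
    public float ViaNumerics() =>
        System.Numerics.Vector3.Dot(
            new System.Numerics.Vector3(_a.X, _a.Y, _a.Z),
            new System.Numerics.Vector3(_b.X, _b.Y, _b.Z));
}

public static class Program
{
    // Run the same suite under Mono and under modern .NET to A/B the runtimes.
    public static void Main() => BenchmarkRunner.Run<DotProductBenchmarks>();
}
```

Running the same suite under Mono and under modern .NET is exactly the kind of runtime comparison described next.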

01:21:20: One thing that it kind of depends on

01:21:22: there is which runtime it uses.

01:21:26: One thing I would, for example, find is certain implementations, they actually

01:21:30: run faster with Mono,

01:21:34: and then slower with the modern .NET runtime.

01:21:38: There's a lot of things in FrooxEngine where

01:21:42: sometimes people kind of decompile and say, why is this

01:21:45: done this weird way? And in some cases, it's because

01:21:49: it actually, even though you wouldn't do it with

01:21:53: more modern code, it interacts better with the runtime used at the time.

01:21:59: But for example, with these

01:22:01: general operations, I would find

01:22:04: if I compare them with the Mono in Unity

01:22:07: and compare them with the modern runtime, they would run

01:22:11: 10, sometimes even 100 times faster. There's some other things

01:22:15: that also speed up, some things that are the same.

01:22:19: But generally, it's just a combination of tools.

01:22:24: We observe something not performing well, we have a suspicion

01:22:27: that this might be causing it, and then we just use tools

01:22:31: to dig down and figure out the root cause of that problem.

01:22:38: So hopefully that answers that.

01:22:41: I think there are also

01:22:44: some manual profiling tools out there, like Tracy, I know there's some

01:22:47: Tracy bindings for C Sharp, which are really cool.

01:22:52: That's actually one of the cool things, because there's a bunch of libraries that we cannot even use

01:22:56: right now because of the old runtime. Tracy, I think it requires

01:23:01: .NET 8 or

01:23:03: some new version.

01:23:06: It's listed for .NET 7, but I think it's just interop, so it could work, but

01:23:11: it's better to just wait.

01:23:14: We do want to integrate more tools. Usually, you have a performance profiling toolset

01:23:20: so you just dig down and figure out where it could be coming from.

01:23:23: Sometimes it's easier to find, sometimes it's harder, sometimes you have to do a lot of work.

01:23:27: For example, the testing I've done before

01:23:31: comparing the .NET 5 or whatever version it was

01:23:35: and Mono, I saw this code is running way better

01:23:40: so I think it's going to help improve a lot, but

01:23:43: it's still usually testing bits and pieces, and it's hard to test the whole

01:23:47: thing because the whole thing doesn't run with the new runtime yet.

01:23:52: That's why for our current performance

01:23:55: update, we moved the headless first, because

01:23:59: moving the headless was much easier since it exists outside of Unity

01:24:04: and we could run sessions and compare

01:24:07: how does it perform compared to the Mono one.

01:24:11: And the results we got from that

01:24:16: are essentially beyond expectations; it's way faster.

01:24:20: That makes us more confident in doing all this work to move FrooxEngine

01:24:24: out of Unity, it's really going to be worth it.

01:24:33: In my perception, Resonite is somewhat being marketed

01:24:40: as a furry social VR platform, which is not the case at all. But every time

01:24:44: I ask somebody, hey do you want to try Resonite, I usually get answers like, oh that VR game

01:24:48: for furries. I have nothing against them, but in Resonite they are very publicly dominant.

01:24:52: Are there thoughts about this topic that could maybe bring in more people?

01:24:56: So, we don't really market like Resonite as a furry social

01:25:00: VR platform. Actually, specifically Chroma, who's

01:25:04: heading our marketing, makes sure that for our own official

01:25:08: marketing materials we show different, diverse avatars,

01:25:12: because yes, there's a lot of furries on this platform and

01:25:16: it's also a self-perpetuating thing where

01:25:20: because there's a lot of furries, they bring in a bunch more.

01:25:24: We do have lots of other communities as well, which are not

01:25:28: as big, but they are here as well.

01:25:32: So, we want Resonite to be for everyone. It's not designed

01:25:37: specifically for furries.

01:25:42: We want everyone to be welcome here.

01:25:50: It's sort of like a

01:25:50: complicated kind of thing, because

01:25:54: the marketing we make, we try to make as generalized as possible, but

01:25:58: the thing is, when you come to the platform, there are going to be lots of furries.

01:26:03: I think the only way to bring in

01:26:06: more people is to showcase lots of

01:26:10: different people on the platform, lots of different

01:26:14: kind of communities, but if there's lots of furries, it becomes

01:26:18: kind of difficult. It's self-perpetuating.

01:26:24: But I think it's also a thing of scale.

01:26:26: As we keep growing, there's going to be more different groups of people

01:26:30: and the communities that are different kind of fandoms or just different

01:26:36: demographics, they're going to get bigger and it's going to help

01:26:41: people who are from those

01:26:42: demographics find their groups much easier, once there's more of them.

01:26:49: Yeah, Resonite's all

01:26:50: about self-expression and stuff and being who you want to be

01:26:55: and building what you want to build, and furries kind of got

01:26:58: that down pat, and so that's probably why you see a lot of them, but

01:27:02: everybody can do that. It's not just those people, it's made for

01:27:08: every person to come together

01:27:10: and hang out and build and

01:27:14: just be you, no matter who you are.

01:27:19: Yeah, we try to make this platform kind of inclusive and for everyone.

01:27:23: It's our goal. We don't want

01:27:26: anybody to feel unwelcome.

01:27:30: I mean asterisk, because we don't want

01:27:34: hate groups, people like that.

01:27:36: So that one we would have an issue with, but generally

01:26:40: we want this platform to be for everyone.

01:27:43: Yeah, also we're coming up on the hour and a half mark.

01:27:46: Ok, so we have about 30 minutes left. We're getting to the end of the questions, so we'll see how they

01:27:52: keep piling, but we might need to stop them at a certain point.

01:27:57: So next question, Oran Moonclaw.

01:28:00: Is rendering performance being looked into before you move to Sauce as well? From my experience, when

01:28:04: the system is not CPU bound, the rendering can still be quite heavy at higher resolutions.

01:28:09: So there's actually a thing that Sauce will help with.

01:28:12: We don't want to invest super much time into the current rendering pipeline

01:28:16: with Unity, because the goal is to move away from it, which means

01:28:20: any time we invest

01:28:24: into improving Unity is essentially going to be wasted

01:28:28: and it's going to delay the eventual big switch. Sauce

01:28:32: is going to

01:28:34: use a much more modern rendering method. Right now we're using a deferred method,

01:28:39: which can be quite heavy on

01:28:43: memory bandwidth and so on.

01:28:47: With Sauce, it's going to use something called clustered forward rendering,

01:28:52: which allows lots of dynamic lights while also being much lighter

01:28:55: on the hardware. So that should improve

01:28:59: rendering performance on itself, and once we make the move we can

01:29:03: look for more areas to optimize things, introduce things like

01:29:07: impostors, more LOD systems and things like that.
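For intuition on why that helps: instead of shading from fat G-buffers the way a deferred pipeline does, clustered forward rendering divides the view frustum into a 3D grid of clusters and assigns lights to the clusters they touch, so each shaded point only iterates nearby lights. A toy CPU-side version of the lookup (conceptual only, not Sauce or Bevy code) might look like this:

```csharp
// Toy illustration of clustered light lookup: the view frustum is split into
// an X*Y*Z grid, and each cell stores only the lights touching it, so shading
// a point only iterates that cell's list instead of every light in the scene.
using System;
using System.Collections.Generic;

public sealed class LightClusters
{
    public const int X = 16, Y = 9, Z = 24;
    private readonly List<int>[] _cells = new List<int>[X * Y * Z];

    public LightClusters()
    {
        for (int i = 0; i < _cells.Length; i++) _cells[i] = new List<int>();
    }

    public void AddLightToCell(int cx, int cy, int cz, int lightIndex) =>
        _cells[cx + X * (cy + Y * cz)].Add(lightIndex);

    // Map normalized screen coordinates (0..1) plus view-space depth to the
    // list of light indices relevant for that point.
    public List<int> LightsAt(float u, float v, float viewZ, float near, float far)
    {
        int cx = Math.Clamp((int)(u * X), 0, X - 1);
        int cy = Math.Clamp((int)(v * Y), 0, Y - 1);
        // Exponential depth slicing keeps nearby slices thin for precision.
        float t = MathF.Log(viewZ / near) / MathF.Log(far / near);
        int cz = Math.Clamp((int)(t * Z), 0, Z - 1);
        return _cells[cx + X * (cy + Y * cz)];
    }
}
```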

01:29:14: So yeah,

01:29:15: it's pretty much like, unless there's any sort of

01:29:19: very obvious low-hanging fruit with rendering

01:29:23: that would take us less than a day

01:29:27: or maybe just a few days to get a significant boost in performance

01:29:31: we're probably not going to invest much time into it and instead want to invest

01:29:35: into the move away from Unity.

01:29:39: Next question, RustybotPrime.

01:29:41: How straightforward is conversion of our current particles to PhotonDust?

01:29:45: I assume the goal is seamless to the point of them looking and behaving identically, but is there anything current particles can do

01:29:51: that PhotonDust won't, or will it do it in a different enough way

01:29:55: that it will have to be manually fixed?

01:29:58: So the conversion, I can't really answer it exactly, because the conversion actually isn't written yet

01:30:04: however, the main focus right now is actually

01:30:07: feature parity. So I actually have a list, and I can post it

01:30:10: in the devlog if you're curious, where I have all the things that

01:30:15: the legacy system has, and I'll be working through that list

01:30:18: just making sure that PhotonDust has the same or equivalent functionality.

01:30:23: The goal is to make it so it's pretty much equivalent

01:30:27: so it converts and it will look either the same

01:30:31: or just very close

01:30:33: so hopefully there won't be things that are too different

01:30:39: however, sometimes those things

01:30:42: become apparent during the testing period

01:30:45: so once those things come out, we'll look at them and we'll be like

01:30:49: this is easy enough to fix, or maybe this one's a little bit more complicated

01:30:53: maybe we just bring it close enough and ask people to manually

01:30:57: fix things, but we'll have to see how this kind of goes

01:31:02: sometimes it's kind of hard to know these before it actually happens

01:31:06: but

01:31:08: it should have feature parity, well, it's going to have feature parity with the current

01:31:15: system, so there's a whole host of things that will just work

01:31:18: next we have Fuzzy Bipolar Bear

01:31:21: is there a way to stop the dash particles from being shown when streaming? I don't think there is

01:31:25: I think we would have to implement that, does it show?

01:31:29: it does show, yeah

01:31:33: next, Ekky Kadir

01:31:36: what things are currently planned for the whole performance update? I think .NET 8 is part of it, for example

01:31:40: so we actually answered this one earlier

01:31:43: I'm not going to go into details on this one

01:31:49: but essentially moving to .NET 9

01:31:52: we were originally going for .NET 8, but .NET 9 released

01:31:56: literally just like a week ago or so

01:31:59: in short, currently there's two main systems that need to be moved

01:32:04: fully into Froox Engine because they're hybrid systems: that's the particle system,

01:32:08: which is being worked on right now, there's the sound system, which Cyro did some work on

01:32:13: once those systems are fully in Froox Engine, we're going to rework

01:32:16: how Froox Engine interfaces with Unity, and then we're going to move it out

01:32:20: into its own process, to use .NET 9

01:32:23: and it's going to be the big performance uplift from there

01:32:28: we're going to post, this video is going to be archived

01:32:33: if you're curious in more details, I recommend

01:32:36: watching it later, because we went into quite a lot of detail on this

01:32:39: earlier on the stream

01:32:43: So, this question is from within the chat.

01:32:47: ShadowX, in the future, could there be a way to override values

01:32:51: not just per user, but in different contexts? For example, override active-enabled

01:32:56: state of a slot or component for a specific camera, basically the same concept

01:32:59: as RTO, but more flexible.

01:33:01: so probably not like this

01:33:07: the problem with RTO is

01:33:09: if you want to override certain things

01:33:15: for example, in rendering

01:33:18: when rendering is happening, all the

01:33:21: work on updating the world is already

01:33:25: complete, which means the renderer actually has much more limited functionality

01:33:29: on what it can change

01:33:33: probably the best way to handle situations like that

01:33:37: is you have multiple copies of whatever you want to change

01:33:42: or whatever system you want to have

01:33:45: and you mark each one to show in a different context

01:33:49: but you need to manually set them up
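As a purely hypothetical illustration of that manual pattern (Resonite has no components with these names), each copy would simply carry a flag saying which rendering contexts it appears in:

```csharp
// Purely hypothetical sketch of the "one copy per context" pattern described
// above; Resonite has no components with these names.
using System;

[Flags]
public enum RenderContext { UserView = 1, InWorldCamera = 2, Mirror = 4 }

public sealed class ShowOnlyInContext
{
    // Which render contexts this particular copy is visible in.
    public RenderContext Contexts;

    public bool IsVisibleIn(RenderContext current) => (Contexts & current) != 0;
}

// Manual setup: two copies of the same object, one the user sees directly
// and one that only an in-world camera renders.
// copyA.Contexts = RenderContext.UserView | RenderContext.Mirror;
// copyB.Contexts = RenderContext.InWorldCamera;
```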

01:33:52: consider a scenario where you override an active-enabled state

01:33:59: that component might have

01:34:02: a lot of complex functionality, maybe there's even

01:34:06: ProtoFlux or some other components that are reading the active state

01:34:09: and doing things based on being enabled or disabled

01:34:13: and once you get into that realm

01:34:16: the effect of that single enabled state can be very complex

01:34:21: where you can literally have a bunch of ProtoFlux that does a bunch of modifications

01:34:26: to the scene when that state changes

01:34:29: and it's too complex for something like the renderer to resolve,

01:34:32: because you would have to run another update

01:34:37: on the world just to resolve those differences

01:34:40: and the complexity of that system essentially explodes

01:34:46: so probably not in that sense

01:34:48: if you give us more details on what you want to achieve

01:34:52: we can give a more specific answer

01:34:55: but this is pretty much how much I can say on this in general

01:35:03: Was the locomotion animation system one of the Unity systems that needed to be reimplemented in Froox Engine, or was it something else?

01:35:09: that one was something else, it came

01:35:12: as a part of business contracts

01:35:19: it's not something I kind of wanted to prioritize myself

01:35:25: it's kind of a complicated situation

01:35:27: but unfortunately it was necessary at the time

01:35:32: and I'm not super happy with how

01:35:34: that whole thing went because

01:35:40: it came at the wrong time

01:35:45: and it's

01:35:47: it was essentially a lot of work, because we didn't have a lot of systems

01:35:51: for dealing with animation, which would have made these things much easier,

01:35:55: and we had never worked with IK itself, which would have also made things easier,

01:35:59: so there was a lot of foundational work that was not there

01:36:05: and also

01:36:07: the timeline was kind of really short

01:36:11: so it was pretty much like just a month of

01:36:14: constant crunch just kind of working on it and

01:36:17: there wasn't enough time to kind of get it through

01:36:23: so it is a complicated situation

01:36:26: unfortunately. And there's a thing that kind of happens sometimes with businesses

01:36:30: like you end up in a

01:36:34: situation where you don't really have a good

01:36:38: path so you just have to deal with it

01:36:43: we want to eliminate those kind of situations and we had

01:36:46: a number of conversations internally to see how do we prevent this

01:36:50: from happening again, how do we make sure we don't end up in a situation

01:36:54: where we have to do something like that

01:36:58: and we have a much better understanding of the problem

01:37:02: now where if a situation

01:37:06: like this were to occur again we're going to be

01:37:10: better equipped on the communication side

01:37:14: how do we deal with it and how do we make sure it doesn't mess with

01:37:18: our priorities and things we need to focus on

01:37:24: so it was like

01:37:25: it was a messy situation, I'm not happy with how I handled

01:37:30: some of the things with it

01:37:32: but it's pretty much

01:37:35: it is what it is and the best thing we can do right now is

01:37:41: learn from it and try to improve things

01:37:46: Next question is

01:37:48: How are you compiling the questions from the stream chats? I thought the Twitch nodes were

01:37:52: broken. No, they actually work.

01:37:56: We do have this thing here where we're going through the questions. This is an older one

01:38:00: I need to grab a bigger one, but it's sort of like, you know, sorting the questions

01:38:04: for us

01:38:11: The Twitch

01:38:12: nodes actually were broken, where

01:38:16: the display of them wouldn't work, but that got fixed very recently.

01:38:19: I pushed the update for it last week.

01:38:26: So next we have Epic Easton

01:38:28: He's asking, we were able to access the internal array to edit things

01:38:32: like color over lifetime, size over lifetime. Will those be

01:38:36: properly converted? Yes. Those systems have already been

01:38:40: implemented for PhotonDust, so they're going to be converted to equivalent ones.

01:38:44: So it's just going to work out of the box. The good news

01:38:48: is there's also new modules

01:38:52: because PhotonDust, the new particle

01:38:56: system, is designed to be way more modular

01:39:01: So there are modules where, instead of just

01:39:04: the internal array, you can also specify the color over lifetime

01:39:09: using a texture, or using

01:39:12: starting and ending color. You can also do starting and ending color in the

01:39:18: HSV color space, so there's

01:39:21: a lot of new color effects that it can do that's going to give you more control over the particle

01:39:25: system. And we can always add more, because we now have full control

01:39:29: of the system, so those modules are very easy to write.
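As a flavor of what such a module might look like (hypothetical names, not PhotonDust's actual code), a start/end HSV color module only has to map a particle's normalized age to a color:

```csharp
// Hypothetical sketch of a modular color-over-lifetime module (not
// PhotonDust's actual code): start/end colors interpolated in HSV space
// over each particle's normalized age.
using System;

public sealed class ColorOverLifetimeHsv
{
    public (float H, float S, float V) Start = (0.00f, 1f, 1f); // red
    public (float H, float S, float V) End   = (0.66f, 1f, 1f); // blue

    public (float H, float S, float V) Evaluate(float normalizedAge)
    {
        float t = Math.Clamp(normalizedAge, 0f, 1f);
        // Interpolate hue along the shortest arc around the color wheel so a
        // red-to-blue fade doesn't detour through green unless asked to.
        float dh = End.H - Start.H;
        if (dh > 0.5f) dh -= 1f;
        if (dh < -0.5f) dh += 1f;
        float h = (Start.H + dh * t + 1f) % 1f;
        return (h,
                Start.S + (End.S - Start.S) * t,
                Start.V + (End.V - Start.V) * t);
    }
}
```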

01:39:33: This next one is a little

01:39:37: moderation focused. Do you mind if I attempt to answer it?

01:39:43: Okay. Let me take a breath

01:39:45: for this one, because it's a long one. On the topic of the platform being

01:39:48: for everyone, why were nipples allowed, willy-nilly, if the majority of people in the world,

01:39:52: including me, are not going to want to see them in public sessions? I will admit

01:39:56: that it has been an extremely rare occurrence to see someone with them shown in a public session.

01:40:00: And will it be possible for me to request things like this, both to the team and to other people,

01:40:04: without having my morals-slash-beliefs questioned at every turn?

01:40:10: So, the reason why we

01:40:12: wanted to take a stand on top equality,

01:40:17: that's what this issue is called, by the way, it's called top equality,

01:40:19: is, um, because

01:40:25: ultimately

01:40:28: like, if a man can have a bare chest

01:40:31: you know, why can't a woman? The only difference is that on average

01:40:35: women have larger chests than men, and I think

01:40:39: we're also an EU-based company, right?

01:40:43: I'm from Europe. Okay, this is the stance in a lot of places

01:40:48: in Europe, too, where top equality is just sort of the norm,

01:40:52: and we want to normalize that, because

01:40:57: we do need this kind of equality, like why

01:40:59: can't a woman have, you know, their

01:41:03: why can't a woman be topless, you know, in a non-sexual context?

01:41:07: There's just no precedent for it.

01:41:12: And, let me see if I'm...

01:41:16: There's also a thing with this, it's like we

01:41:19: you know, we believe in equality and we believe in a lot of progress, so

01:41:23: we do take a stance on those things, but also we kind of give you tools

01:41:27: to kind of deal with those, so if it's something you really don't want to see

01:41:31: there's an avatar block function. You can block those people, they will not appear

01:41:35: for you. There's probably more things we can do in that

01:41:39: area as well, but ultimately we want to be, you know,

01:41:44: very open and very progressive as a company

01:41:46: when it comes to these things. Also, I would really recommend

01:41:51: asking this question in the moderation

01:41:54: office hours, because the moderation team is the one that deals

01:41:59: with this in a lot of detail, and they're going to have a lot more context for

01:42:03: these things. But also, I

01:42:08: don't necessarily believe that

01:42:11: the majority of the people on the internet have that stance.

01:42:15: There's a good chunk of

01:42:19: people who are very open about this, and

01:42:24: I feel like that chunk is growing. People are

01:42:26: getting more open with things.

01:42:30: I do recommend bringing this to the moderation office hours; they're going to be able to

01:42:35: give you a much better answer for this, because they've been

01:42:39: dealing with this topic for a while.

01:42:45: So take what we say with a little

01:42:47: bit of a grain of salt. I don't want to step on the moderation team's

01:42:51: toes with that.

01:42:54: Yeah, I was going to say something to, I was going to say something to wrap it up. What was I going to say?

01:43:05: Yeah, I was just going to say, I don't know what

01:43:09: you mean, because somebody commented,

01:43:13: I don't know what you mean by this rule being

01:43:17: exploited by transgender males and females, but

01:43:20: being transgender has nothing to do with this.

01:43:24: If you want to be a boy or want to be a girl

01:43:29: that has no bearing on this rule.

01:43:31: That's part of the goal too, because it kind of

01:43:35: erases that kind of disparity; it doesn't really

01:43:39: matter. If you do feel there's some exploit you can

01:43:43: always, you know, file moderation reports, or

01:43:48: you can bring these

01:43:51: to the moderation office hours and discuss these there.

01:43:55: Then we can kind of see what is happening and then we sort of evaluate does it fit

01:43:59: with our rules or does it not.

01:44:03: So you can, if you feel there's some issue

01:44:07: you can make us aware of it. We can't promise that we're going to

01:44:11: agree with you, or that we're going to have the same view on it, but

01:44:14: we can at the very least look at it and listen to what you have to say on that.

01:44:22: So next we have

01:44:23: GrandUK. Hearsay: I have heard from someone that they tried to report

01:44:27: someone to the moderation team, but because they were connected to the team nothing came

01:44:31: of it, and they ended up banned instead. I can't confirm

01:44:35: 100% that what was said happened and I know nothing can be said about moderation

01:44:39: cases but in case where there are conflicts of interest like above

01:44:43: what can be done, and how can we be sure we won't have wrongful consequences or bans for trying to

01:44:47: uphold the ToS and guidelines for everyone?

01:44:52: So, I understand there's not like super many details but I can kind of

01:44:55: talk in general. Sometimes we do have cases where

01:45:02: there's actually two things.

01:45:03: We do have cases where there's reports against people who are on the moderation

01:45:07: team or even on the Resonite team.

01:45:11: If it's a report against someone who's on the moderation team, that will usually go

01:45:15: to the moderation leads, and those people

01:45:19: can deal with it; they will investigate. We actually have multiple moderation

01:45:23: leads as well. That way it's not like

01:45:27: there's a single person who can just bury the thing but there's multiple people

01:45:31: who all can see the same data and then sort of check on each other.

01:45:36: If something happens with the team,

01:45:38: or if there's an issue with somebody on the actual Resonite team, usually

01:45:43: that goes to Canadian Git, who's handling

01:45:47: those things, and he brings these things to me.

01:45:54: We have had cases

01:45:56: where we had to deal with difficult situations before,

01:46:00: both on the team and in the moderation

01:46:03: team as well. I can't really go into details

01:46:08: because there's privacy issues

01:46:12: with that. I can tell you there's been

01:46:15: cases where people on the moderation team

01:46:19: they had to permanently ban some people who

01:46:22: were their friends, even long-time friends, because

01:46:27: they did something wrong.

01:46:31: This caused people on the moderation team

01:46:34: a lot of distress, but they still made the decision

01:46:38: to ban their friend because they

01:46:43: want to uphold the moderation rules

01:46:47: above all else. I've looked at

01:46:51: a few of those cases because I do want to make sure things are

01:46:56: going okay and there's no

01:46:59: favoritism happening. I've been involved in

01:47:03: a few of those cases as well.

01:47:07: Part of the discussion of it and so on.

01:47:12: There's been a number of difficult discussions on those

01:47:15: and every single one, if there was sufficient

01:47:19: evidence for somebody's wrongdoing,

01:47:23: even if we knew that person personally, even if they were connected to the team,

01:47:27: they were still banned.

01:47:31: There's one thing I've noticed in general: usually, when

01:47:35: people do get banned,

01:47:42: they're almost never

01:47:43: truthful about the reason, because we do make sure

01:47:47: as part of the moderation, if somebody ends up being banned, usually

01:47:51: they will receive warnings first, depending on the severity.

01:47:56: If they end up being banned,

01:47:59: the reasoning is explained to them.

01:48:03: Oftentimes there's somebody from the team who's actually going to

01:48:07: sit down with them and be like, we have this evidence, this

01:48:10: kind of happened, you're getting banned for these reasons.

01:48:15: They are made aware of it. And in a lot of cases,

01:48:19: those people will come out and

01:48:22: give completely different reasons for why they're banned.

01:48:28: And this kind of puts us in a difficult situation,

01:48:30: because we value privacy, and sometimes giving details to the public

01:48:34: could put innocent people who are involved in those incidents at risk.

01:48:41: So we cannot really say

01:48:42: the person was actually banned for these reasons.

01:48:48: But it is a thing that happens.

01:48:52: So the only thing I can request is

01:48:57: be more skeptical about what

01:48:59: people say about these things. If you see something

01:49:02: you believe is an issue, you can always send us a report; we will look at it, we will evaluate it,

01:49:07: we will see what evidence we have.

01:49:11: But ultimately, we will not necessarily tell you

01:49:14: the details of how it was resolved to protect the privacy

01:49:20: and potential security of people involved.

01:49:24: I will also... Oh, sorry.

01:49:27: No, go ahead. I was just going to say that we're

01:49:31: just about at the 10-minutes-left mark, so I think we should close questions.

01:49:34: Okay, so we're going to close the questions.

01:49:38: So if you send

01:49:42: questions right now, we have a few of them coming in,

01:49:46: if you send any questions after this point, we can't guarantee we're going to

01:49:50: answer them. We'll try to answer as many as we can that are still left,

01:49:54: but no guarantees at this point. But I will at the very least

01:49:58: try to get through

01:50:03: the ones that we have

01:50:04: on the list right now. So the next one,

01:50:09: EpicEston. Does the question mark need to be at the end of the question?

01:50:13: I think it doesn't need to be. I think you can put it

01:50:16: in the middle, but just to be sure, I would put it at the end...

01:50:20: Actually, no. I literally see a question that has a question mark in the middle

01:50:24: of it, so no, it doesn't need to be at the end.

01:50:31: Erasmus0211.

01:50:32: Any more flux nodes in the works? If yes, which

01:50:35: excites you the most?

01:50:39: You're working on some new ones.

01:50:46: Which ones am I working on again?

01:50:49: Just the one you just took on.

01:50:52: Oh yes, there is a

01:50:55: new ProtoFlux node I'm particularly excited about. So, you know how

01:51:00: for those of you who do ProtoFlux,

01:51:03: there is currently a node where you can perform a raycast

01:51:07: which shoots an infinitely thin line, and whenever it hits, you can get the position,

01:51:11: you can get the direction, stuff like that.

01:51:16: What I'm going to implement is I'm going to implement

01:51:20: sweeping, or I think it's

01:51:24: also been called shapecasting or whatever,

01:51:27: like on some other platforms, but it's essentially a way of doing thick

01:51:31: raycasts using a shape that you essentially

01:51:35: extrude in the direction that you want it to go.

01:51:38: So, if you wanted to shoot a sphere in a direction,

01:51:44: you would essentially be shooting a capsule

01:51:48: however long you want to shoot it, and anything within there

01:51:52: it would hit. Or in this case, you know, the first thing it hits

01:51:56: it will return basically exactly like a raycast,

01:51:59: but it's thick, and you can do that with different shapes like a sphere,

01:52:03: or a cube, or I think you

01:52:07: should also be able to do it with convex hulls, right?

01:52:12: I'm not sure if we have that one, maybe.

01:52:16: I thought we did, but I could be wrong. At the very least, you'll be able to do it

01:52:20: with spheres, and cubes, and cylinders, and capsules, and stuff.

01:52:25: But I think that will be very useful, especially for those

01:52:28: of you who make vehicles who don't want your raycasts to

01:52:32: shoot between two infinitely close triangles in geometry, and now your

01:52:35: car is flying across the map. Yeah. Thick raycasts.

01:52:40: Yeah, thick raycasts.

01:52:42: Because we do have a lot of the functionality, it's already part of the

01:52:50: core. We use it internally in our own engine. For example, the laser is actually

01:52:53: using sweeps to behave a bit better.

01:52:58: And this is going to expose them, so you can also use them from ProtoFlux.
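To sketch the difference with a hypothetical API (not the actual node or engine signatures): a raycast tests an infinitely thin line, while a sweep extrudes a shape along that line and reports the first hit of the whole volume.

```csharp
// Hypothetical query API to contrast a raycast with a sweep ("thick raycast");
// the real ProtoFlux nodes and FrooxEngine methods are not named like this.
using System.Numerics;

public interface IPhysicsWorld
{
    // Infinitely thin line test.
    bool Raycast(Vector3 origin, Vector3 dir, float maxDist, out Vector3 hit);

    // Extrude a sphere of `radius` along `dir` and return the first thing
    // the whole volume touches.
    bool SphereSweep(Vector3 origin, float radius, Vector3 dir, float maxDist,
                     out Vector3 hit);
}

public static class VehicleGroundCheck
{
    // A wheel check done with a sweep can't slip through hairline gaps
    // between triangles the way a thin ray can.
    public static bool IsGrounded(IPhysicsWorld world, Vector3 wheelPos) =>
        world.SphereSweep(wheelPos, radius: 0.15f, dir: -Vector3.UnitY,
                          maxDist: 0.3f, out _);
}
```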

01:53:06: This one seems to be asking something in the chat, so I'm going to

01:53:09: skip this one. Tribe Grade World VR. Question.

01:53:14: For the video of the scans you shared, what app are you using to do those scans?

01:53:18: Yes, some interstellar reality.

01:53:21: For most of my scans, I'm using a software called Agisoft Metashape.

01:53:27: It's a photogrammetry software, and essentially you take lots of pictures

01:53:30: of the subject from lots of different angles,

01:53:35: and it's able to do those reconstructions. It figures out

01:53:39: based on the patterns in the photos, where the photos are, and then

01:53:42: reconstructs a mesh. I also sometimes use additional

01:53:46: software, like I'll for example use Photoshop to like, with certain

01:53:50: photos, I will do like an AI denoise on them, which

01:53:54: helps increase the quality of the scans, and I will also do

01:53:58: some kind of tuning of the lighting and so on. But I guess Metashape

01:54:02: is the main one. There's also one that I kind of started experimenting with a few days ago,

01:54:07: and I literally turned my room into like,

01:54:11: it's a software called, actually

01:54:14: I forget the first part, it's called Postshot. Let me see

01:54:18: the full name. Jawset Postshot. And this one's for

01:54:22: Gaussian Splatting, which is sort of like this new technique, you know,

01:54:26: for 3D reconstruction, or more general like rendering, which can

01:54:30: reconstruct the scenes with much better fidelity. And I'm kind of

01:54:34: playing with it; because I have all my datasets, I've been just kind of throwing them at it to see

01:54:38: how it kind of works with different things.

01:54:42: So like I might

01:54:44: like integrate that one more into my workflow as I kind of like

01:54:48: go. I posted like a quick video and have like a bunch more

01:54:52: that I'll be posting soon-ish.

01:54:55: But yeah, I guess Metashape is the main one to use, like you know, it makes it easier to just

01:54:59: get a mesh, bring it in.

01:55:04: This one is continuing a moderation question

01:55:08: that we had a couple ago.

01:55:12: This one from Ralag86

01:55:16: again asks, continuing my previous question, will anything be done

01:55:20: regarding people who do not want to see bare-top females in public sessions? For non-hosts,

01:55:24: I am aware of the already in-place system where you can ask the person to switch avatars slash avi settings,

01:55:28: and for hosts, they can enforce a dress code, which I have no doubt they are making use of.

01:55:32: So in the future we do want to

01:55:36: implement stuff like content tagging

01:55:41: and that will come with

01:55:44: the ability to, you know, if things are tagged a certain way, you can

01:55:47: tick a checkbox and you won't see them anymore, right?

01:55:50: So you could make use of that.

01:55:55: That's something we will do in the future.

01:55:59: But other than that, for the time being

01:56:04: if you don't want to see that, don't go to those sessions.

01:56:08: Well, you can still go to those sessions because we do have

01:56:11: the ability to block somebody's avatar.

01:56:16: I can actually show you if I

01:56:20: click on Cyro's name...

01:56:23: Careful, it might ban me from the session.

01:56:25: Oh, it should be just block avatar. There we go, see now Cyro is gone.

01:56:30: I don't have to look at it. Well, I can still see him, but I don't have to look at his avatar anymore.

01:56:35: Yeah, that is something I forgot about.

01:56:39: This is one of the reasons we added it.

01:56:42: You have the power. If some avatar is legitimately upsetting you,

01:56:47: you can block it. The other part is if you host

01:56:50: your own sessions, you can enforce your own rules. We do allow for that,

01:56:54: with some caveats. So if you want to enforce a dress code,

01:56:58: that's completely up to you. You have that freedom.

01:57:03: You can always add additional rules to whatever

01:57:06: sessions you want to host.

01:57:10: That's another thing. Eventually the content tagging system

01:57:14: should make these things more generalized.

01:57:18: You don't even have to go and see it in the first place as long as the content is properly tagged.

01:57:23: We can filter certain things out. We can block certain avatars.

01:57:26: We do want to give you the tools, but

01:57:30: we don't want to make global decisions

01:57:34: just forbidding these things for everyone.

01:57:38: There is a nuance I was going to get to there

01:57:43: in that if you decide

01:57:46: to not allow, let's say you're like,

01:57:50: I don't want to see nipples in my world, that also has to apply to the men in the session

01:57:54: as well. It is universal, you cannot discriminate.

01:58:00: So it's either nipples allowed for all, or no nipples at all.

01:58:05: It actually reminds me, because there was one thing

01:58:07: that was particularly funny to me. With the Creator Jam, they actually made a nipple gun

01:58:11: they were shooting around the world, and people got upset, and they were like

01:58:15: oh no, it's okay, those are male nipples, they're not female nipples.

01:58:19: It was a funny way to point out to that

01:58:22: like, double standard, you know, for this kind of thing.

01:58:28: Uh, but yeah. Uh, next we have

01:58:30: Verailash86, my question being will anything be done past it all?

01:58:34: I don't know which one this one's actually related to.

01:58:39: It was related to the previous one they sent them in a row.

01:58:43: Um, so we're,

01:58:46: um, this might be the last one because we're at the last minute.

01:58:52: Yeah, we already answered that one.

01:58:55: Um, yeah.

01:58:58: I think that's pretty much it, we had a few more come in, but we got it.

01:59:02: Yeah, there's a few more, but this is pretty much the last minute, like we've been here for two hours

01:59:06: my throat is kinda sore from this, I should have brought some water.

01:59:10: Uh, but thank you everyone, you know, for joining, thank you for so many

01:59:14: kind of questions, like we're very happy to answer those, you know,

01:59:18: like let you know more about the platform, and just kind of like chat with you.

01:59:23: Thanks everyone, also like, you know, for playing, you know, Resonite

01:59:26: for enjoying this platform, you know, for supporting us and letting us do this kind of thing.

01:59:31: Um, I hope you enjoy the stream, like

01:59:34: my goal is, you know, make this every week. The format might kind of

01:59:38: change a little bit, we'll kind of see, you know, how many questions we get like next time and so on.

01:59:43: We might, you know, next time might be for example

01:59:46: outside of Resonite, you know, playing some kind of chill games while kind of chatting

01:59:49: with you, but we'll see how it kind of goes, because

01:59:53: this one there was a lot of questions, we're like, you know, kind of focused more on the Q&A

01:59:57: and we'll see like, you know, how it changes with the upcoming streams.

02:00:01: So we'll experiment with the format a little bit and see like, you know,

02:00:05: and also let us know, let us know like, you know, like what do you think, like do you like this?

02:00:09: Would you like to see some other things? Are there like any kind of issues?

02:00:13: You can like, you know, post, like

02:00:16: actually, where should you post? Maybe make a thread in the office hours

02:00:23: channel on Discord

02:00:25: to share your feedback. So thank you very much for joining,

02:00:28: you know, thank you for like spending time with us and asking us questions.

02:00:33: I'll try, like, you know, try to get like this video uploaded on

02:00:36: you know, our YouTube channel so you can, anybody who like missed these office hours

02:00:40: you can, you know, watch them afterwards, and we'll see you next week.

02:00:45: So thank you very much, and thank you also Cyro for, you know, helping me with this.

02:00:49: And being a good co-host, and we'll see you next week.

02:00:53: Bye!