The Resonance/2024-11-17/Transcript: Difference between revisions

Regenerate transcript using Whisper
{{OfficeHoursTranscriptHeader|The Resonance|2024-11-17|url=https://www.youtube.com/watch?v=wX19I0EXZN0&list=PLQn4R3khhxITNPmhpSJx5q7-PgeRFGlyQ|autogen=YouTube}}
{{OfficeHoursTranscriptHeader|The Resonance|2024-11-17|url=https://www.youtube.com/watch?v=wX19I0EXZN0&list=PLQn4R3khhxITNPmhpSJx5q7-PgeRFGlyQ|autogen=YouTube using Whisper}}


0:00: should be up I'm going to post the
00:00: Everything should be up.


0:09: announcement hello hello let's see we get people in there I need to move this one a little bit so I can read
00:02: I'm going to post the announcement.


0:17: it hello do we have people on the
00:10: Hello, hello, let's see if we get people in there, we need to move this one a little bit


0:24: stream hello can you hear us can you hear us hello
00:15: so we can read it.


0:35: just going to wait for some people to come in oh there we go we got one person got one
00:18: Hello, do we have people on the stream?


0:42: person hello hello shush uh there probably going to be a bunch more people kind of coming in in a
00:25: Hello, can you hear us?


0:52: bit some people to Pile in yes how we we go to get more people
00:26: Can you hear us?


0:59: hello spring call got a bunch more people piling in hello and welcome
00:32: I'm just going to wait for some people to come in.


1:07: everyone so this is uh this is the first episode of the resonance uh that's like
00:37: Oh, there we go, we've got Shushio.


1:13: a new podcast that I'm starting um it's like a mix between office hours where you can kind of you know um ask anything
00:39: Got one person.


1:20: about aite you know like whether it's a technical thing whether you want to ask like more broad questions you know more
00:43: Hello, hello Shushio.


1:27: kind of open-ended as well uh but also like I have like C with me uh who's our engineering intern we of times you know
00:59: Hello, just a sprinkle, we've got a bunch more people piling in.


1:34: like talking about like resonite and uh you know talking like you know about a
01:03: Hello and welcome everyone.


1:40: cool technology talking about VR talking about like you know um big Vision like
01:08: So this is the first episode of The Resonance, that's like a new podcast that I'm starting.


1:46: you know behind our I like where do we want to like you know which direction we want the platform to head and so on so
01:16: It's like a mix between office hours where you can kind of ask anything about Resonite,


1:53: um depending on how active we are with questions we're probably going to you know talk like between the each other
01:21: you know, whether it's a technical thing, whether you want to ask more broad questions,


1:59: and kind of you BL like a um you know dis places um we see a bunch of people
01:26: you know, more kind of open-ended as well.


2:05: kind popping in so hello everyone um s dust sprinkles apexx AI uh lexo Grand
01:29: But also I have Cyro with me, who's our engineering intern.


2:14: fuzzy uh je Forge Alex DPI Jack uh Ando hello welcome everyone um
01:34: We oftentimes are talking about Resonite and talking about cool technology,


2:23: could also request uh uh the chat before we start since this is the first one I'm
01:41: talking about VR, talking about big vision behind Resonite,


2:29: just going to tuning things a little bit um uh is the Audio Level okay on your
01:47: like where do we want to, you know, which direction we want the platform to head and so on.


2:34: end can you hear me fine and Sarah can you say something can you hear me okay
02:01: You know, about what this place is.


2:41: guys let let me know if I can even need to adjust the levels a little bit they look okay like on the on the OBS side
02:04: We see a bunch of people popping in, so hello everyone.


2:48: but uh sometimes it's a little bit hard to tell oh oh my public it's public thank
02:07: I see Dustus Sprinkles, ApexRxAI, LexiVoe, I see Ground, Fuzzy, Jack Forge, AlexDupi, I see Jack, and Birdo.


2:56: you we should we should maybe not do that yes I should have checked that thank you for
02:20: Hello, welcome everyone.


3:02: letting us know surprisingly nobody joined someone hello uh I do have like
02:23: Could I also request the chat before we start, since this is the first one,


3:08: one more request um uh for questions uh we have a thing here that's going to um
02:29: I'm just kind of tuning things a little bit.


3:15: show the questions so we like you to make sure we don't miss them what you need to do is when you ask your question
02:33: Is the audio level okay on your end? Can you hear me fine?


3:21: make sure it ends with a question mark and it's going to pick going to get picked up uh would anybody in the chat
02:36: And, Cyro, can you say something?


3:26: be able to uh perfect I have a question snop it perfect it works thank you
02:38: Can you hear me okay, guys?


3:34: grand um great thank you so everything
02:42: Let me know if I even need to adjust the levels a little bit.


3:39: works uh so with this like we got a bunch of people in there um I think we're like ready to start so hello again
02:46: They look okay, like on the OBS side, but sometimes it's a little bit hard to tell.


3:46: everyone I'm FKS uh I have S with me our engineering intern and this is the first
02:53: Oh, oh my. It's public. Thank you.


3:51: episode of the what we calling the resonance um the idea is like this is going to be a mix of office hours so we
02:56: We should maybe not do that.


3:57: can like you know um ask anything about resonite uh whether it's a technical question whether it's like you know more
02:59: Yes, I should have checked that. Thank you for letting us know.


4:03: philosophical about the platform whether it's like you know more specific or open ended and we'll try to like ask the you
03:03: Surprisingly, nobody joined, so I'm going to say hello.


4:11: know answer those questions as best as we can uh but we also going to like you
03:07: I do have one more request.


4:16: know talk a little bit more in you know broader terms uh what is the direction of the platform what's like you know the
03:10: For questions, we have a thing here that's going to show the questions,


4:22: Big Ideas behind it uh because we don't want to keep things you know just kind of like you know to the wire where it's
03:16: so we're going to make sure we don't miss them.


4:29: like you know dealing with indidual technical issues but also like what are sort of you know the driving forces like
03:18: What you need to do is, when you ask your question, make sure it ends with a question mark,


4:35: what what we want the platform to do like you know in general irrelevantly you have any kind
03:22: and it's going to get picked up.


4:41: of specific features um so with that we can we can start uh you know uh answering questions
03:24: Would anybody in the chat be able to... Perfect.


4:48: and if there's like you know if there's like not too many of them like we can like you know just talk about
03:28: I have a question.


4:53: things um we already have like a few questions popping in uh trck is asking
03:29: Perfect. It works. Thank you.
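The pickup rule described above amounts to a simple suffix check: a chat message is only treated as a question if it ends with a question mark. As a rough illustration (a Python sketch, not the actual in-world question board, whose implementation isn't shown here), it might look like this:

 # Illustrative sketch only -- the real question board is an in-world item.
 def pick_up_questions(messages):
     """Keep only (author, text) pairs whose text ends with a question mark."""
     return [(author, text.strip())
             for author, text in messages
             if text.strip().endswith("?")]
 
 chat = [("Grand", "I have a question?"), ("Ozzy", "hello everyone")]
 print(pick_up_questions(chat))   # [('Grand', 'I have a question?')]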


4:58: uh are you using uh my chat I'm actually not like I I uh I don't know where I
03:35: Great.


5:03: saved I was kind of looking for it before the start and I was like oh I can't find it so I'm using a little bit older one um I'll have to procure
03:37: Thank you. So everything works.


5:11: it uh then oie is asking of course is glitch cute yes he's cute uh it's proven
03:40: With this, we've got a bunch of people in there.


5:18: right here on the stream uh gr K joke is mayonnaise a per flx
03:43: I think we're ready to start.


5:24: note No but like there's I actually have I have a list of it is for April Fools
03:45: Hello again, everyone. I'm Froox.


5:31: and there's a food related um protox node in there that might pop up at some
03:48: I have Cyro with me, our engineering intern.


5:37: point maybe is mayonnaise a prolex node question is what would it do if it's
03:50: This is the first episode of what we're calling The Resonance.


5:43: like you know prolex node because may like that's going to be a data type that is
03:54: The idea is that this is going to be a mix of office hours,


5:53: true or would it Pur like mayonnaise or maybe like maybe you have like a number
03:57: so we can ask anything about Resonite,


5:58: of inputs like you know you need to input like eggs you need to input like actually I don't know what it goes into mayonnaise I think egg is in there we we
04:00: whether it's a technical question, whether it's more philosophical about the platform,


6:05: have the The Leaky impulse bucket maybe we could have the Leaky mayonnaise
04:05: whether it's more specific or open-ended,


6:12: bucket we need mayonnaise outputs yes but yeah that's uh hopefully
04:09: and we'll try to answer those questions as best as we can.


6:18: that answers your joke question with more jokes uh then we
04:15: We're also going to talk a little bit more in broader terms.


6:23: have sorry oh no go ahead I I was just going to read Jack if
04:20: What is the direction of the platform? What's the big ideas behind it?


6:29: that's fine okay Jack says I have a pretty broad question but I assume it's going in the
04:24: Because we don't want to keep things just to the wire,


6:35: same direction you're already heading where do you want to see resonate positioned within the vr/ social VR space ah this is a good uh Rumble
04:28: where it's dealing with individual technical issues,


6:43: inducing question um I just like a few things L on it like one of the kind of
04:30: but also what are the driving forces?


6:49: like big ideas of this platform is that like um is built of like you know
04:35: What would we want the platform to do in general,


6:55: multiple kind of layers and at the base layer you have uh things like automated
04:39: irrelevant to any kind of specific features?


7:01: networking like everything you build even the engine itself you always get like you know everything synchronized by
04:43: So with that, we can start answering questions,


7:08: default like you don't even have to think about it um everything's you know potentially persistent you can save
04:48: and if there's not too many of them, we can just talk about things.


7:15: everything you know into inventory like you know into the cloud on your hard drive everything that you get on the
04:54: We already have a few questions popping in.


7:20: platform you can persist and the way I kind of see it is like you know like once you have this kind of layer you can
04:56: Jack is asking, are you using MyChat?


7:26: start building on top of it so we also have like you know layers for you know working with various devices like you
05:01: I'm actually not. I don't know where I saved it.


7:32: know various interactions like grabbing stuff touching stuff you know pointing at things and those are things that um I
05:04: I was kind of looking for it before the start,


7:41: feel like you know like important to kind of like you know solve really well like do them like properly uh because
05:05: and I was like, oh, I can't find it, so I'm using a little bit older one.


7:48: when I started my work in VR I was doing a lot of sort of um you know disparate
05:12: Then Ozzy is asking, of course, is Glitch cute? Yes, he's cute.


7:53: like applications where one Applications had like you know these features and supported this hardware and the other
05:17: It's proven right here on the stream.


7:59: supported these things and this other hardware and sometimes I was like I would like functionality from this one application in this other one but like
05:21: ChronicJoke, is Mayonnaise a ProtoFlux node?


8:07: it was kind of difficult to bring them over plus I would also find myself you know solving the same kind of problems
05:25: No, but I actually have a list of ideas for April Fools,


8:14: like over and over um you know for example being able to grab stuff um so one of the driving
05:31: and there's a food-related ProtoFlux node in there


8:23: forces was you know to create sort of a framework create like a layer where everything's part of the same shared
05:35: that might pop up at some point, maybe.


8:31: universe and sort of like you know build like a pretty much like an obstruction
05:39: Is Mayonnaise a ProtoFlux node?


8:36: layer um it's kind of analogous to like if you consider stuff like programming
05:42: The question is, what would it do if it's a ProtoFlux node?


8:43: languages where the really old ones you know like you had like assembly programming and you have to like do a
05:45: Because Mayonnaise, that's got to be a data type.


8:48: lot of like you know stuff like managing memory like you know like if like where does this stuff and managing your stack
05:48: That is true.


8:55: and doing a lot of kind of like manual work to like you know make sure the state of everything is correct and then
05:54: Or would it produce Mayonnaise?


9:01: came like you know higher level programming languages where they would essentially do it for you and they would
05:56: Or maybe you have a number of inputs, you need to input eggs,


9:06: let you focus more on the high level what do you want to do what I want like personally what I
06:00: you need to input... actually, I don't know what goes into mayonnaise.


9:13: want like you know resonite like to do in the like you know the VR social space is do like similar kind of paradigm
06:03: I think egg is in there.


9:20: shift for um you know applications where no matter what you build you always have
06:04: We have the leaky impulse bucket, maybe we could have the leaky mayonnaise bucket.


9:27: like a you always have real time collabor you don't even have to think about it um you can always you know
06:13: We need mayonnaise outputs.


9:33: interact with multiple users uh and you always have like you know persistence and you always have like integration
06:16: Yes.


9:38: with lots of common Hardware um so to me like the social VR
06:17: Hopefully that answers your joke question with more jokes.


9:45: layer it's almost like you know it's the basis you always have like you know the social stuff like you can you know join
06:22: Then we have...


9:51: people you can talk with them you can be represented as your avatar but then uh everyone can build lots of
06:24: Oh, sorry.


9:58: different things you know some people will just socialize some people will play games but some people will you know
06:26: Oh, no, go ahead.


10:04: build like a virtual Studio you know maybe they want to produce content on here maybe they want to produce like
06:28: I was just going to read Jack's if that's fine.


10:09: music maybe they want to program stuff and they're able to like use um resonite
06:30: Okay.


10:16: a sort of framework to do that and um kind of share like you know whatever
06:31: Jack says, I have a pretty broad question, but I assume it's going in the same direction you're already heading.


10:22: they make like with other people so like if you're good at building tools you can make tools you know um you know like I
06:36: Where do you want to see Resonite positioned within the VR slash social VR space?


10:29: mentioned for example producing music say somebody makes really cool tools now other people who do like to produce
06:41: Ah, this is a good ramble-inducing question.


10:35: music can take those tools made by the by the users and because they exist within the same universe you can build
06:46: There's a few things that we know about Resonite.


10:40: your own music studio and you have all the guarantees you know that I mentioned earlier video
06:48: One of the big ideas of this platform is that it's built of multiple layers.


10:46: music studio can invite people in and collaborate with them no matter where they are you can you know save the state
06:56: At the base layer, you have things like automated networking.


10:52: of your work or maybe say you can make a really cool like you know audio processing filter or something you save
07:02: Everything you build, even the engine itself, you always get everything synchronized by default.


10:58: it you can share with other users and it kind of opens up you this kind of like interrup and I want like resonates to be
07:09: You don't even have to think about it.


11:05: General enough where you can build like you know pretty much any application you
07:12: Everything is potentially persistent.


11:12: know like whatever whatever you can kind of think of um you can build on here and
07:14: You can save everything into inventory, into the cloud, on your hard drive.


11:18: get those guarantees kind of similar you know how you have a web browser and web browsers they used to be you know just
07:19: Everything that you get on the platform, you can persist.


11:25: browsers for websites but now we have like you know fully fledged applications you have like um you know you have your office set
07:23: The way I see it is once you have this kind of layer, you can start building on top of it.


11:31: like you know if like Google Docs uh there's version of Photoshop there's you know we can play games there's like so
07:28: We also have layers for working with various devices, various interactions, grabbing stuff, touching stuff, pointing at things.


11:37: many applications on the web um that like it essentially becomes
07:37: Those are things that I feel like are important to solve really well.


11:42: like you know its own operating system in a way and I wonder as to kind of do
07:45: Do them properly.
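As a loose illustration of the "synchronized by default" and persistence ideas described above, here is a small conceptual sketch in Python: a field that announces every change to a replication layer and can be serialized for saving. It is not FrooxEngine's actual API; the class and callback names are made up for the example.

 import json
 
 class SyncField:
     """Conceptual sketch only -- not FrooxEngine's actual API."""
     def __init__(self, name, value, on_change):
         self.name = name
         self._value = value
         self._on_change = on_change   # replication hook, e.g. "send to peers"
 
     @property
     def value(self):
         return self._value
 
     @value.setter
     def value(self, new_value):
         self._value = new_value
         self._on_change(self.name, new_value)   # synchronization happens automatically
 
     def save(self):
         return json.dumps({self.name: self._value})   # persistable state
 
 # The person building with this field never writes any networking code.
 position = SyncField("position", [0, 0, 0],
                      on_change=lambda n, v: print("replicate", n, v))
 position.value = [1, 2, 3]    # prints: replicate position [1, 2, 3]
 print(position.save())        # {"position": [1, 2, 3]}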


11:49: similar thing where the platform itself it's like the analog of the web browser
07:47: When I started my work in VR, I was doing a lot of disparate applications,


11:54: you can build any kind of application in it but also you get the guarantee
07:54: where one application had these features and supported this hardware,


11:59: of the automated networking of the persistence and of the integration like with the hardware and and other things
07:58: and the other application supported these things and this other hardware.


12:05: solved for you so you don't have to keep solving them so that's that's pretty much like you know in Brams like where
08:01: Sometimes I would like functionality from this one application in this other one,


12:12: what I want there I to do so I hope that dble can answer that question
08:07: but it was kind of difficult to bring them over.


12:17: well I think it I think it I think it answered it pretty good it's kind of like
08:10: Plus, I would also find myself solving the same kind of problems over and over.


12:23: um cuz um when you were talking about this I was thinking of you know way like
08:17: For example, being able to grab stuff.


12:28: way way back um before like we had any sort of like proper like type of game
08:22: One of the driving forces was to create a framework, a layer,


12:35: engine you know you program all of your code all of your games you know you would just program them raw you wouldn't
08:27: where everything is part of the same shared universe,


12:41: you didn't have Unity you didn't have unreal and if you wanted to collaborate with people you know you had your
08:32: and build an abstraction layer.


12:47: immediate vicinity of like the people who you lived around um and then now you
08:38: It's kind of analogous to programming languages,


12:53: have like game engines and stuff which you know integrate a lot of the typical stuff that you would need to make a game
08:43: where the really old ones had assembly programming,


13:01: um but you know you're still limited to basically working you know over a Skype
08:48: and you had to do a lot of stuff like managing memory,


13:08: call or again with people you know close to you physically but
08:52: like where is this stuff, and managing your stack,


13:14: now this is kind of like a layer on top of that even yes where
08:55: and doing a lot of manual work to make sure the state of everything is correct.


13:19: um now you know as social creatures we kind of we don't really have something like
09:01: Then came high-level programming languages,


13:27: this in that sort of space and now we do and being able to have that same sort of
09:03: where they would essentially do it for you,


13:32: collaboration like you could have in in real life you know with people working next to you you can have from people who
09:05: and they would let you focus more on the high level.


13:40: live a thousand miles away you know across the entire world and you can work
09:08: What do you want to do?


13:46: as if you were yeah you can work exactly as if you were right there and a lot of the things that you'd expect to work
09:12: Personally, what I want Resonite to do in the VR social space


13:53: just kind of do like oh you can see my context menu when it comes up you can see this in Spectrum opening it's just
09:17: is do a similar paradigm shift for applications,


14:00: like you know putting a piece of paper down on a table and working on it with
09:24: where no matter what you build, you always have real-time collaboration.


14:05: someone standing right next to you yeah yeah that's like a really good point like uh like there's actually another
09:29: You don't even have to think about it.


14:11: thing you know that I've seen that like inspired me is like you know seeing like engines like unity and unreal because it
09:32: You can always interact with multiple users,


14:19: used to be like when you wanted to make a game you pretty much have to build your own engine which is like you know on itself this big undertaking they need
09:35: and you always have persistence,


14:26: like bigger Studios but then game engines you know came out they were like more generalized and what essentially
09:36: and you always have integration with lots of common hardware.


14:34: did they kind of you know they they raised the minimal bar like where suddenly everybody has access to a fully fledged game Manion and it's no longer a
09:42: To me, the social VR layer is the basis.


14:41: problem you have to you know solve on your own and now we have like you know small Studios like even just individuals
09:48: You always have the social stuff.


14:49: who are able to build games and applications that previously would take like entire teams of people to do and
09:50: You can join people, you can talk with them,


14:57: what I like see as night is is like you know doing that same thing like just
09:52: you can be represented as your avatar,


15:02: kind of pushing it even further where we like you know go from the game
09:54: but then everyone can build lots of different things.


15:07: engine um where we got know from from just the game engine instead of like you
09:59: Some people will just socialize, some people will play games,


15:13: know where you don't have to like you know worry about stuff like you know graphic like you know making a rendering pipeline making a system like you know
10:02: but some people will build a virtual studio.


15:19: for updating your entities and so on now you have additional guarantees like you know the real time collaboration
10:07: Maybe they want to produce music, maybe they want to program stuff,


15:26: synchronization persistence that's all just going of come for free and you don't have to solve those problems and
10:11: and they're able to use Resonite as a sort of framework to do that,


15:31: you can focus even more of your time on what you actually want to do in the social VR space what do you want to
10:18: and share whatever they make with other people.


15:37: build how do you want to interact so this definitely definitely like very
10:23: If you're good at building tools, you can make tools,


15:42: good point too with you know game mentions yeah um I think we're probably
10:28: like I mentioned, for example, producing music.


15:49: going to move to the next questions because we can R this one a bit um uh so
10:31: Say somebody makes really cool tools.


15:55: we have a I think that one ahead but I think can answer that one pretty
10:33: Other people who do like to produce music can take those tools made by the users,


16:02: terally uh so next we have a Shadow X food related appful joke shocking I know
10:38: and because they exist within the same universe,


16:08: is it uh next we have uh Mr deop uh 1 2 3 4
10:40: you can build your own music studio,


16:16: 5 six uh what are some bugs where we have said it's a feature oh there's the one that like
10:42: and you have all the guarantees that I mentioned earlier.


16:23: immedately comes to the Mind actually sorry able to demonstrate it is the fast Crouch one you know like when you can
10:46: With your music studio, you can invite people in and collaborate with them no matter where they are.


16:31: can you can you can you there we go this like this is technically a bug there's a
10:51: You can save the state of your work,


16:38: bug report for this but I'm like we need to fix this one in a way where you can
10:53: or maybe say you can make a really cool audio processing filter or something.


16:44: still do this because it's just it's just funny and like you know it's it's like the language of desktop
10:58: You save it, you can share it with other users,


16:52: users it's it's it's a b we're turning into a feature so I think it's a good
11:00: and it kind of opens up this kind of interoperability.


16:57: example of one do you have any like that you can think of
11:04: I want Resonite to be general enough where you can build pretty much any application.


17:02: yourself oh there there have been so many updates that I I can't think of any one in
11:13: Whatever you can think of, you can build on here and get those guarantees.


17:09: particular the obvious one I guess is Bob three all which is just a typo but oh my good he is I mean it was more of a
11:19: Kind of similar to how you have a web browser.


17:16: meme feature it's just kind of like you know like an e stke but yeah
11:23: Web browsers used to be just browsers for websites,


17:22: um yeah like there's so much stuff that I don't really remember but like this one is definitely like this this this
11:26: but now we have fully-fledged applications.


17:29: one comes through to mind this there's been a bunch of others but um I I don't
11:28: You have your office suite, like Google Docs.


17:34: think I can think of any others myself so next we have Alex 2pi I would
11:33: There's a version of Photoshop.


17:41: think that may we going with a food thing I would think that mayonnaise would be a way to package information by
11:35: We can play games.


17:47: con coting a mayonnaise oh like it's mayonnaise like a rapper type h it's
11:36: There's so many applications on the web that it essentially becomes its own operating system in a way.


17:54: like kind of like a noble except mayonnaise kind of like um kind of like a like tar gz where it's like two layers
11:47: I want Resonite to do a similar thing,


18:01: of like packaging where one of them is the package and one of them is the compression or something oh it's more like even how we're just going to so
11:50: where the platform itself is like the analog of the web browser.


18:09: mayonnaise is a container format we just kind to like it's like
11:54: You can build any kind of application in it,


18:14: like you can contain other things in it that
11:57: but also you get the guarantees of the automated networking,


18:19: Mayo so next we have a grand K have you thought about other ways to get audio
12:01: of the persistence, of the integration with the hardware,


18:25: video out of vers other than simply mirro display of camera and the audio output of resonite it's quite jarring to
12:04: and other things solved for you so you don't have to keep solving them.


18:32: hear frugs non specialized and the S specialized as well as having iners specializing of
12:09: That's pretty much in broad terms what I want Resonite to do.


18:38: Cyro that camera P would suggest actually let me I'm actually going to turn Cyro into broadcast uh that should
12:13: I hope that ramble can answer that question well.


18:45: make things easier for this you can also like set the audio source to I know be
12:18: I think it answered it pretty good.


18:52: like from the camera I know but that that messes with my head too much um
12:24: When you were talking about this, I was thinking of way, way back,


18:57: okay but yeah like I'm just going to and broadcast for now so it's easier for the stream however uh I do actually have
12:31: before we had any sort of proper type of game engine.


19:03: like answer to that question uh so one of the big things that we're focusing on right now is you know a big performance
12:36: You'd program all of your code, all of your games.


19:10: upgrade and actually I think I've seen like a question so this might answer some of the too um is doing a big performance upgrade uh the two big
12:39: You would just program them raw.


19:18: things that need to be done well there's actually one more but the two big systems that need to be done um is uh
12:41: You didn't have Unity, you didn't have Unreal.


19:26: particle system which is being worked on right now now and the audio system uh which s actually like has been working
12:44: If you wanted to collaborate with people,


19:32: on a part of it for like a doing like a reverb system um those two systems
12:45: you had your immediate vicinity of the people who you lived around.


19:38: they're the essentially the last two big systems um they are sort of like a
12:52: And then now you have game engines and stuff,


19:44: hybrid between FS engine and unity um I'll go little bit more into details on this one like with a later question but
12:55: which integrate a lot of the typical stuff that you would need to make a game.


19:51: um we are going to be reworking the audio system and with the current one the Unity One it doesn't support uh
13:03: But you're still limited to basically working over a Skype call,


19:59: multiple listeners the goal for reworking the audio system is so we can actually do that so we can like have
13:08: or again with people close to you physically.


20:04: like one listener that's you know for for you for your ears and there's additional listener that can be for
13:11: But now, this is kind of like a layer on top of that even.


20:10: camera that you route to a different audio device so you can actually kind of split it too uh because uh you can
13:17: Yes.


20:16: switch to camera uh but like then I'll be hearing everything from camera's Viewpoint that it kind of messes with my
13:18: Where now, as social creatures, we don't really have something like this in that sort of space,


20:22: kind of specialization um so yes there's going to be there's going to be a way to do it
13:28: and now we do.


20:28: which just need to get like it all to the system um next we have origam VR I'm
13:30: And being able to have that same sort of collaboration like you could have in real life,


20:35: going back home very soon I'll finally be able to Res it again I was wondering no social platform has doesn't official
13:35: with people working next to you, you can have from people who live a thousand miles away,


20:41: thing what are the chances of implementing social events and Gathering list in game that notifies people about
13:42: across the entire world, and you can work exactly as if you were right there,


20:47: upcoming events and War yes that's actually one of the things uh I would like us to do uh we do have a GitHub
13:51: and a lot of the things that you'd expect to work just kind of do like,


20:53: issue for it uh so like if you search like events UI I kind of forget its name exactly
13:54: oh, you can see my context menu when it comes up, you can see this in Spectrum opening.


20:59: um on our GitHub there's a bunch of details um I do believe like adding sort
13:59: It's just like putting a piece of paper down on a table


21:04: of generalized systems plus some like you know UI to be able to register events and you know see what's going to
14:04: and working on it with someone standing right next to you.


21:09: happening is going to help people discover more things going on the platform and make it kind of easier you
14:07: Yeah, that's a really good point.


21:15: know to socialize and join things um it's probably going to happen sometime
14:10: There's actually another thing that I've seen that inspired me,


21:20: after like you know we finish with the TR of the performance upgrade uh because there's a bunch of like kind of UI
14:14: is seeing engines like Unity and Unreal.


21:26: improvements we want to do but um we can only know focus on so many things at a time so it's going to come at some point
14:19: Because it used to be when you wanted to make a game,


21:33: uh no timeline yet uh at uh very least it's going to be know sometime after the
14:21: you pretty much had to build your own engine, which in itself is a big undertaking,


21:40: um after the performance update um with one of the things that's definitely like you know on my mind and that I think is
14:26: and you needed bigger studios.


21:47: should be like pretty high on the list because like we want to help people you know kind of drive like socialization
14:27: But then game engines came out, they were more generalized,


21:53: engagement so um I do think is a pretty important feature
14:32: and what they essentially did, they raised the minimal bar,


21:58: actually um I actually uh when you were talking about the performance I actually saw someone in the uh the chat um yes
14:37: where suddenly everybody has access to a fully-fledged game engine,


22:08: and uh I actually wanted to to say that like the the rendering engine in
14:40: and it's no longer a problem you have to solve on your own.


22:14: particular like using Unity isn't necessarily like a blocker for the
14:45: And now you have small studios, even just individuals,


22:20: performance update I'm actually going to like cover like this uh because I see like there's
14:49: who are able to build games and applications that previously would take entire teams of people to do.


22:26: like two questions that are sort of like related to this so I'll go like a little bit more like in detail on this one uh
14:56: And where I see Resonite is doing that same thing, just pushing it even further,


22:31: we have sky and kitsun asking FRS could you explain the road map to Big optimization update where are we at in
15:04: where we go from just the game engine,


22:39: that process and then we have gloen VR asking what are some uh some of the big Milestones still needed to move the
15:12: where you don't have to worry about stuff like making a rendering pipeline,


22:45: client applications to net8 uh I know the particle system is one of the prerequisits but there are some other
15:18: making a system for updating your entities, and so on.


22:51: prerequisits that can look forward before the shift um so these two questions are it's pretty much like you
15:22: Now you have additional guarantees, like real-time collaboration, synchronization, persistence,


22:56: know um is the same kind of question so like I'm going to cover this uh in one uh let me
15:27: that all just kind of comes for free, and you don't have to solve those problems,


23:04: actually bring my brush because I feel it would help um it would help if I kind
15:31: and you can focus even more of your time on what you actually want to do in the social VR space.


23:09: of like you know draw draw a diagram also going to I'm going to turn the camera to manual mode so like it's not
15:36: What do you want to build, how do you want to interact.


23:15: kind of moving around for this um where's my brush give me like a
15:40: So that's definitely a very good point, too, with the game engines.


23:22: second uh tools I should have gotten one already but uh J brushes yes s will
15:48: I think we're probably going to move to the next questions, because we kind of rambled about this one a bit.


23:32: dance the entertain while I look for the brush I think this one should be okay
15:55: So we have a...


23:37: there we go so let me see so this is yeah this looks pretty visible on the
15:55: I think that one went ahead.


23:44: camera so uh just to kind of give you like an idea um consider you
15:58: But I think we can answer that one pretty thoroughly.


23:50: know let me actually make this a little bit
16:03: So next we have ShadowX.


23:55: bigger so consider like you know this is
16:05: Food-related April Fools joke? Shocking. I know, is it?


24:03: Unity um how might be glowing a little bit too much um consider like you know
16:11: Next we have MrDaboop123456.


24:09: this is unity and like you know have like Unity stuff you know there's like you know whatever it's doing and then with an Unity we have FRS engine so this
16:17: What are some bugs where we have said it's a feature?


24:18: is you know FRS
16:21: Oh, there's...


24:24: engine so right now because unit uh FR engine is contained with unity is
16:23: The one that immediately comes to mind...


24:31: using unity's run time to run its code it's using the mono uh the mono
16:24: Actually, sorry, we've got to demonstrate it.


24:36: framework it's very old um and it's kind of slow uh which is why we kind of want
16:26: It's the fast crouch one.


24:44: to move F engine to do net 9 because we're originally saying do net 8 but uh
16:29: You know, like when you...


24:50: I think it was like this week or last week doet 9 release so we're going to talk at that one um but the problem we have right now
16:31: Can you... can you... can you... there we go.


24:58: like you know to do move there's like two systems uh where they sort of like a
16:33: This.


25:03: hybrid so FR engine most of the stuff most of like all the interactions all the scripting networking physics you
16:35: This is technically a bug.


25:10: know interactions most of it's like fully contained within FR engine so like
16:37: There's a bug report for this.


25:16: you know we don't have to like worry about it that's already kind of nicely contained uh but
16:39: But I'm like...


25:22: then um what we have um there's like two systems which are sort of like a hybrid
16:41: We need to fix this one in a way...


25:29: to kind of like exist on both sides and it's particle system so we have a particle system and
16:43: Where you can still do this because it's just...


25:36: the second one is sound
16:45: It's just funny and like, you know, it's like the language of desktop users.


25:42: system so what the overall goal is is like we want to take like these systems
16:53: It's... it's a bug we're turning into a feature.


25:48: and rework them into completely custom ones so they're fully contained within FS engine once that kind of happens um
16:56: So I think it's a good example of one.


25:55: there's also like you know interaction with like all the unit stuff and right now that's also kind of like
17:05: Oh...


26:00: you know like where this goes this you know this goes here this goes here it's kind of like it's
17:05: There have been so many updates that I can't think of any one in particular.


26:06: messy uh so like once we move both of these systems full into F engine we're going to rework how F engine actually
17:10: The obvious one, I guess, is Bulbul 3.0, which is just a typo, but...


26:13: communicates with unity uh so it's uh sort of like you know a much simpler
17:13: Oh my god, yes.


26:19: pipe where it almost like you know it sends like a very self-contained kind of like package and be like render this
17:15: I mean, it's more of a meme feature.


26:24: stuff for me please once this is done what we can do is we can take this
17:17: It's just kind of like, you know, like an easter egg.


26:30: entire thing and I kind of grab it like at the same time but we essentially move this out of
17:20: But yeah.


26:37: unity into its own process that's going to be net net
17:22: Um...


26:44: n and this is going to you know communicate with unity using that same
17:23: Yeah, like, there's so much stuff that I don't really remember, but like...


26:49: process and is this switch switching to like the much modern net 9 runtime
17:26: This one is definitely like...


26:55: that's going to provide a big performance like uplift um the reason for that is because
17:29: This one comes to the mind.


27:01: net9 it has much better jit compiler that's uh essentially the component that
17:31: There's even a bunch of others, but...


27:06: takes you our code and it translates it into like machine code that your CPU runs and the one that's in N produces
17:33: I don't think I can think of any others myself.


27:15: like at least an order of magnitude like better better code uh it also has better
17:38: So next we have Alex2PI.


27:21: like more optimized libraries that are part of like NET Framework that are being used and also much better garbage collector which is another thing like on
17:41: I would think that mayonnaise...


27:27: the decid it's slowing things down um we've already done a thing where uh for
17:42: We're going with the food thing.


27:34: the Headless client um we've moved it to net 8 uh like a few months back uh
17:44: I would think that mayonnaise would be a way to package information by coating it in mayonnaise.


27:41: because with the Headless client you don't have the render which means it already exists outside of
17:49: Oh, I guess mayonnaise is like a wrapper type.


27:47: unity uh that made it much easier to actually move it you know to the modern. net R time um and the Headless client is
17:53: Hmm...


27:56: pretty much like you know it's still this it's the same code like it's not a separate thing from you know what we're
17:53: It's kind of like a nullable except mayonnaise.


28:01: running right now like 99% of the code is the
17:56: Kind of like a...


28:07: same so by moving it first we were able to see how much of big performance
17:58: Kind of like a tar.gz.


28:13: uplift we actually get um what we found and we had like a
18:00: Where it's like two layers of like, packaging.


28:19: number of kind of community events that have been hosting you know kind of big events we've been able to get like way
18:03: Where one of them is the package and one of them is the compression or something.


28:25: more people on those heades even with those heades you know Computing everybody's avatars Computing
18:06: Oh, it's more like...


28:31: everybody's you know ik and dynamic bones while still maintaining like a
18:07: So mayonnaise is a container format.


28:37: pretty high frame rate so thanks to that we are confident that moving moving the
18:12: It's just kind of like...


28:43: graphic client to do net 9 is going to give us a really good performance
18:15: It can contain other things in it.


28:48: upgrade um the only problem is like you know it's a little bit more complicated process because we do have to rework
18:18: .Mayo.


28:54: those systems and we have to rework integration before can before you can move it out this is also a little bit
18:20: So next we have GrandUK.


29:01: tangential but one of the things once once this kind of happens once we move it out we can actually replace Unity
18:22: Have you thought about other ways to get audio-video out of Resonite other than simply mirror-to-display of camera and the audio output of Resonite?


29:08: with sauce which is going to be our custom rendering engine and this whole process it makes it easier because we
18:31: It's quite jarring to hear Froox non-spatialized and then Cyro spatialized, as well as having the inverse spatialization of Cyro that the camera POV would suggest.


29:14: have this like you know very nicely defined way to communicate between the two which means we can actually eat this
18:42: Actually, let me... I'm actually gonna turn Cyro into broadcast. That should make things easier for this.


29:20: away you know and put Source in here instead I so like where we are right now
18:47: You can also set the audio source to be from the camera.


29:28: uh so right now um if I move this
18:53: I know, but that messes with my head too much.


29:34: back oh move this back we have the sound here there we go so right now uh the
18:57: I'm just gonna keep you on broadcast for now so it's easier for the stream.


29:41: sound system is still hybrid the communication with the unit is still kind of like messy like it's this lot of kind of
19:01: However, I do actually have answers to that question.


29:49: like you know Roots into everything and the particle system is being reworked so we're essentially uh we're essentially
19:06: So one of the big things that we're focusing on right now is a big performance upgrade.


29:56: taking this and moving it into here um we are working on a new particle
19:11: And actually, I think I've seen a question so this might answer some of that too.


30:01: system called Photon dust uh the work has been kind of progressing over the past few weeks uh is actually getting
19:15: It's doing a big performance upgrade.


30:09: close uh to Feature Part of current system which is a hybrid between unity and for engine uh because the goal is we
19:17: The two big things that need to be done...


30:17: don't want to break any content we make we want to make sure that like you know whatever we've build with existing
19:20: Well, there's actually one more, but the two big systems that need to be done


30:22: particle system still works and looks the same or at least very close to like
19:25: is a particle system, which is being worked on right now, and the audio system,


30:28: you know what it's supposed to look um most of the things are implemented uh if
19:30: which Cyro actually has been working on a part of it for doing a reverb system.


30:34: you go into like devlog uh in our Discord you can see you know some of the updates and some of the
19:37: Those two systems, they're essentially the last two big systems


30:40: progress um the main thing that's missing right now as a major system is are implementing or implementing our own
19:43: that are sort of like a hybrid between FrooxEngine and Unity.


30:46: system for uh particle Trails once that's done um it's possible
19:47: I'll go a little bit more into details on this one with a later question,


30:53: like this is going to be sometime next week I don't want to make like any promises because things happen but um it
19:50: but we are going to be reworking the audio system, and with the current one,


30:58: is getting close like closer there um you can actually release public builds
19:55: the Unity one, it doesn't support multiple listeners.


31:03: where you have both systems at the same time so you're going to have like you know the leg like now Legacy system and
20:01: The goal for reworking the audio system is so we can actually do that,


31:09: Photon dust um with conversion being something you trigger like manually uh
20:05: there's one listener that's for you, for your ears,


31:15: we'll run a bunch of tests with the community so we'll essentially ask you to you know test your content test the
20:08: and there's additional listener that can be for camera


31:20: new system find any bags with it uh once we are con confident like that it works
20:11: that you route to a different audio device, so you can actually kind of split it too.


31:26: okay um we are going to uh essentially remove the all system and we make the
20:15: Because you can switch to camera, but then I'll be hearing everything from camera's viewpoint


31:32: conversion automatic to the new system uh with that the this part is going to be done and we're going to move on to
20:20: that it kind of messes with my kind of spatialization.


31:39: this sound part uh the sound system um this
20:25: So yes, there's going to be a way to do it, we just need to get it all into the system.
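The multiple-listener idea described above can be pictured as mixing the same set of audio sources once per listener, so the camera's mix can be routed to a different output device than your own ears. A very rough conceptual sketch in Python (not the actual Resonite audio system, and ignoring real spatialization):

 import math
 
 # Conceptual sketch only -- not the actual Resonite audio system.
 def mix_for_listener(listener_pos, sources):
     """Crude mono mix: attenuate each source by its distance to this listener."""
     mix = 0.0
     for src_pos, amplitude in sources:
         distance = math.dist(listener_pos, src_pos)
         mix += amplitude / (1.0 + distance)   # simple 1/(1+d) falloff
     return mix
 
 sources = [((0, 0, 2), 1.0), ((5, 0, 0), 0.5)]      # (position, amplitude)
 ears_mix   = mix_for_listener((0, 0, 0), sources)   # goes to your headphones
 camera_mix = mix_for_listener((5, 0, 5), sources)   # routed to a separate device
 print(round(ears_mix, 3), round(camera_mix, 3))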


31:44: essentially like you know what handles like stuff like audio specialization and so on right now it's also a hybrid so
20:30: Next we have OrigamiVR.


31:50: for example on FR engion side we do our own audio encoding and decoding unit is
20:35: I'm going back home very soon, I'll finally be able to Resonite again.


31:55: not like not handling that but what we're doing is we're feeding United the audio data we want to play into
20:38: I was wondering, no social platform has this as an official thing.


32:01: individual sour sources and United and handle specialization and then outputting it you know to your audio
20:42: What are the chances of implementing social events and gathering lists in-game


32:07: device uh we're going to move that part into our own system which is also what's going to allow us to you know take
20:45: that notifies people about upcoming events and more?


32:14: control of it and build you know new features like for example having multiple listeners and we also going to
20:49: Yes, that's actually one of the things I would like us to do.


32:19: move the system here uh there's actually some work on this uh that Sarah's been
20:52: We do have a GitHub issue for it, so if you search events UI, I kind of forget its name exactly.


32:24: working on that I've kind of um asking for help with uh because one of the things in the system is a Reverb effect
20:59: On our GitHub, there's a bunch of details.


32:32: and we essentially need to implement our own because there also a thing that's currently Hand by un and cro has uh made
21:03: I do believe adding sort of generalized systems plus some UI


32:39: like integration with a Reverb called Z Reverb uh that we're going to use to like you know replace the existing kind
21:07: to be able to register events and see what's happening.


32:45: of Reverb Zs um I don't would like to tell us a little bit more about that
21:10: It's going to help people discover more things going on in the platform


32:51: part yeah so um we found a Nifty little
21:14: and make it easier to socialize and join things.


32:57: so so let let me actually back up a little bit so currently um like the
21:18: It's probably going to happen sometime after we finish with the triangle of the performance update


33:03: reason why we can't just keep using like this Reverb or whatever like the one that we're using right now is because it
21:24: because there's a bunch of UI improvements we want to do,


33:10: uses well it uses unities but in turn it it the underlying Reverb uses like is it
21:28: and we can only focus on so many things at a time.


33:16: like f mod something and that costs at least $4
21:31: So it's going to come at some point. No timeline yet.


33:21: signs uh to use commercially I think uh but we found a Nifty Library called
21:36: At the very least, it's going to be sometime after the performance update.


33:27: sound pipe um that includes a really nice sounding Reverb effect and um I
21:44: With one of the things that's definitely on my mind,


33:35: have been working on uh getting the library compiled and integrating it um
21:46: and that I think should be pretty high on the list


33:42: with uh FRS engine uh you won't be able to do anything like super duper Fancy
21:49: because we want to help people drive socialization engagement,


33:48: with it right away at least not until FRS reworks the whole audio system but
21:54: so I do think it is a pretty important feature.


33:54: you'll you'll at least be able to process like audio clips with it and make them sound all echoey and stuff
21:59: Actually, when you were talking about the performance,


33:59: just to like try it out which I think will be pretty cool yeah cuz then like
22:03: I actually saw someone in the chat.


34:04: cuz then you could like just Reverb ify any audio clip in game yeah like you can
22:06: Yes.


34:11: make a you could make a Reverb Baker essentially which I think is pretty cool
22:08: And I actually wanted to say that the rendering engine in particular,


34:16: yeah it's kind of like expand like know AIO processing because you can already do you know like some like trimming you can do like you know normalization volum
22:15: like using Unity, isn't necessarily like a blocker for the performance update.


34:22: adjustments like fading and so on so like this is just having the code like integrated and red day we can already
22:25: I see there's two questions that are related to this,


34:28: kind of you know expose some of it uh to you know play with it and like make essentially sort of tools that kind you
22:29: so I'll go a little bit more in detail on this one.


34:35: know spin off of it before like we do the big integration um but yes like uh right now
22:31: We have SkywinKitsune asking,


34:40: the particle system is the major one that's going to be you know fully pulled in uh once that part is done uh we're
22:34: Froox, could you explain the roadmap to a big optimization update?


34:47: going to do the sound system which I expect to be faster than the particle system because it doesn't have like as
22:38: Where are we at in that process?


34:52: many things but we'll we'll see how that one goes uh once the sound systems happens this is going to get reworked
22:40: And then we have GlovinVR asking,


34:59: integration with unity so it's simpler and once this is done we move the whole thing out and it's going to be the big
22:41: what are some of the big milestones still needed


35:05: performance update uh so I hope that uh kind of you know helps like answer um it
22:45: to move the client applications to .NET 8?


35:11: helps answer you know kind of the question um I think I'm going to clean
22:47: I know the particle system is one of the prerequisites,


35:18: this up uh just so it doesn't clutter our space and we can move to the next questions I'll uh I'll mark these two as
22:50: but what are some other prerequisites to look forward to before the shift?


35:25: an insed that yes uh there's actually one thing I was also going to mention even from the particle system there's
22:55: So these two questions are pretty much the same kind of question,


35:31: actually a few functions that like sort of Spawn as extra things um one of them
23:00: so I'm going to cover this in one.


35:37: being uh that you know have access to 4D Simplex noise uh so you can use you know
23:03: Let me actually bring my brush,


35:43: a prolex node and there also like a texture there like a 3D texture for um
23:05: because I feel it would help if I draw a diagram.


35:49: uh 3D texture with Simplex noise and I've seen like people do like really cool effects
23:11: I'm also going to turn the camera to manual mode,


35:56: with it like this one for for example I think I actually got this one from Cyro so like you see how it kind of like
23:15: so it's not moving around for this.


36:02: evolves in time and this is like a volumetric effect so you can kind of like you know push it
23:19: Where's my brush? Give me like a second.


36:09: through so like people have been like you know already playing with it and this is kind of generally how we like to
23:24: Tools... I should have gotten one already, but...


36:15: do development where uh I've got like another version here it's like super
23:28: Geometer, my brushes...


36:22: neat um how we like like to like develop things is like you know we want to add
23:31: Yes, Cyro will dance to entertain while I look for the brush.


36:29: more building blocks so even if we're building Something official whatever building blocks we add we try to make as
23:36: I think this one should be okay. There we go.


36:35: many of them available you know to everyone using the platform because you can use them for a lot of other
23:39: So, let me see... So this looks pretty visible on the camera.


36:41: things um so yeah but that should kind of cover like those questions um so next uh we have a
23:43: So, just to kind of give you an idea, consider...


36:49: question from Navy 3001 any idea of how in game performance metrics for user
23:51: Let me actually make this a little bit bigger.


36:55: content would work there actually that's a good question and like measuring performance um that's a very kind of
23:56: So consider this is Unity.


37:03: difficult thing because one of the things with performance is like it depends like it depends a lot of stuff
24:04: That might be glowing a little bit too much.


37:11: um so usually like you want to have like you know kind of range of tools you know to kind of measure uh like measure
24:08: Consider this is Unity.


37:17: things one of them is you know you can measure how long individual components take you know to execute and sort of
24:10: You have Unity stuff, whatever it's doing.


37:22: some way to aggregate the the data so you can kind of see okay this is consuming of time this consume a lot of
24:14: And within Unity, we have FrooxEngine.


37:29: time but the performance impact of something is not always like you know that direct because something can for
24:18: So this is FrooxEngine.


37:34: example the components themselves they can be quick to execute but maybe the object is you know has a really complex
24:25: So, right now, because of Unity, FrooxEngine is contained within Unity,


37:41: geometry so it's taking a long time on the GPU to render out um the other part
24:31: it's using Unity's runtime to run its code.


37:47: is like performance can also differ depending on the scenario say you build an object and the object is doing a ray
24:34: It's using the Mono. The Mono framework, it's very old,


37:54: cast is doing you know some kind of checks for collisions if you have that object in a simple World maybe it
24:40: and it's kind of slow.


38:00: doesn't like you know it it runs pretty fast but you bring that object into world with much more complex colliders
24:42: Which is why we kind of want to move FrooxEngine to .NET 9,


38:06: it sudden it starts hurting performance because now those collision checks are needed like you know are more
24:47: because we're originally saying .NET 8, but I think it was this week,


38:12: complex the other example is like say use like node like find child you try to
24:51: or last week, .NET 9 release, so we can talk about that one.


38:18: search for a child in a hierarchy and if you're a simple World maybe like you know the the hierarchy of objects is you
24:56: But the problem we have right now, in order to move,


38:27: know it doesn't have too much in it so it runs fast but then you go into a world which has way more and now the
24:59: there's two systems where this is sort of like a hybrid.


38:33: performance kind of Tanks now now that thing that was running reasonably fast in one world is running slower in other
25:04: FrooxEngine, most of the stuff, most of all the interactions,


38:40: one so one of the ideas we kind of had is we would kind of build uh some sort
25:08: all the scripting, networking, physics, interactions,


38:47: of like you know set of like Benchmark worlds we would like you know like have like different scenarios complex worlds
25:11: most of it's fully contained within FrooxEngine.


38:53: with complex hierarchies you know with this and that and then have a system where you can and essentially like run
25:15: We don't have to worry about it, that's already kind of nicely contained.


38:59: that object in that world and sort of you know see how fast it runs and how
25:21: But then, what we have,


39:05: does it differ depending you know on a different kind of scenario um overall I think this will eventually
25:26: there's two systems which are sort of like a hybrid,


39:12: end up with you know lots of different tools so have like you know you don't have the tools to measure how much the
25:29: they kind of exist on both sides, and it's a particle system,


39:18: components take to execute how long the you know GPU takes to execute um just
25:33: so we have a particle system,


39:23: sort of like lots of different tools to analyze different like you know performance things
25:36: and the second one is a sound system.


39:28: so I think that's overall like you know like what you should expect like once
25:43: So, what the overall goal is, is we want to take these systems


39:34: once those tools come in it's not going to be a single tool but it's going to be like you know a range of them that will
25:47: and rework them into completely custom ones, so they're fully contained within FrooxEngine.


39:40: probably keep like you know expanding and building upon yeah if I could uh
25:53: Once that kind of happens,


39:48: add a pen to that um we probably also CU I know this is uh
25:55: there's also interaction with all the Unity stuff.


39:56: I know this is like come up occasionally in relation to questions like this we probably also wouldn't like
25:58: And right now, that's also kind of like where this goes this,


40:05: give things like we wouldn't do like an arbitrary limiting system like oh you
26:02: this goes here, this goes here, it's kind of like, it's messy.


40:10: can only have 60,000 triangles you can only have X number of seconds of audio
26:07: So once we move both of these systems fully into FrooxEngine,


40:16: on you cuz well we do want like other like tools so you can restrict because like
26:11: we're going to rework how FrooxEngine actually communicates with Unity.


40:21: it can like it's not it's not a perfect solution but like we want to add people like we want to add tools so people can
26:16: So it's a much simpler pipe,


40:29: like you know set some limits on things because um one of our kind of
26:19: where it sends a very self-contained package,


40:34: philosophies is is like you know we want to give people a lot of control and if you want to run a session like where you
26:23: and be like, render this stuff for me, please.


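To illustrate the "self-contained package" hand-off described above (FrooxEngine telling the renderer "render this stuff for me"), here is a minimal sketch of what one message across a process boundary could look like. The type, field names, and pipe name are hypothetical illustrations, not FrooxEngine's actual protocol.

<syntaxhighlight lang="csharp">
using System.IO.Pipes;
using System.Text.Json;
using System.Threading.Tasks;

// Hypothetical, illustrative message: one self-contained "render this for me" package
// sent from the engine process to the renderer process.
public record RenderPackage(string MeshId, float[] WorldMatrix, string MaterialId);

public static class RendererLink
{
    // Sends one package over a named pipe; the renderer needs no other engine state to draw it.
    public static async Task SendAsync(RenderPackage package)
    {
        using var pipe = new NamedPipeClientStream(".", "renderer-link", PipeDirection.Out);
        await pipe.ConnectAsync();
        await JsonSerializer.SerializeAsync(pipe, package);
    }
}
</syntaxhighlight>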
40:40: you can spawn object that has you know 50 million triangles and like everybody's going to be running at like
26:26: Once this is done, what we can do


40:46: you know 10 FPS but you know you want to be like I have a b GPU I want to look at the super detailed model we always want
26:29: is we can take this entire thing, and I kind of got a bit like at the same time,


40:54: people to have ability to do that and the same time we want to add tools so
26:34: but we essentially move this out of Unity,


41:00: like you know if you want to host like a chill world if you want to keep it like you know more light you have tools to
26:38: into its own process that's going to be


41:06: kind of like you know um set certain like limits on the users how much they can spawn in how much they can bring in
26:40: .NET 9, and this is going to


41:12: so we not going to make them you know forced but we're much more likely to add
26:45: communicate with Unity using that same process.


41:17: like tools where you have the kind of control to decide what you want you know in your world what do you want in your
26:50: And with this switch, switching to the much more modern


41:23: experience the yeah other aspect of that is like you know we have the asset
26:53: .NET 9 runtime, that's going to provide a big


41:28: variant system and we already use part of it like you can you can go into your settings and you can lower the
26:57: performance, like uplift. The reason for that is


41:34: resolution of textures you can for example clamp it to like 2K so if you're you know low on vrm you can lower your
27:01: because .NET 9, it has a much better JIT compiler,


41:41: textures and if somebody has you know 8K texture on their Avatar you're only going to load it up to 2K you know it's
27:04: that's essentially the component that takes our code, and it translates it


41:48: not going to hurt you but other people like say somebody has you know one of the uh you know 4090s with 24 gigs of VRAM
27:09: into machine code that your CPU runs. And the one that's in .NET 9


41:55: and they don't care they can keep keep it like you know kind of unlocked and it's kind of you know aligned with our
27:13: produces at least an order of magnitude


42:00: kind of like philosophies like we give you give people as many tools as possible to kind of control your
27:17: better code. It also has better,


42:06: experience but also we don't want to enforce like you know limits on people where
27:20: more optimized libraries that are part of the .NET framework that are being used,


42:13: possible yeah that's that's kind of more so where I was um going with that is that we wouldn't have like a sort of
27:24: and also much better garbage collector, which is another thing on the Unity side


42:20: hard and fast these are the rules for the whole platform kind of yes rules cuz
27:28: that's slowing things down. We've already done


42:26: you know not everybody these computers are equal and so maybe I don't want to render your 500 million polygon model
27:32: a thing where, for the headless client,


42:32: right um but uh we also don't want to we want to
27:37: we've moved it to a .NET 8 a few months back,


42:40: present this stuff in a sort of like unbiased way like we don't want
27:41: because with the headless client you don't have the renderer, which means


42:45: to like we we wouldn't uh I I wouldn't want to color like 5
27:44: it already exists outside of Unity.


42:51: like you know someone's polygon count in like red or something that cuz I we got
27:47: And that made it much easier to actually move it


42:56: to like social kind of also comes with that like we don't want to
27:51: to the modern .NET runtime.


43:01: invite but I think that they should like answer like this like pretty totally we should probably like move to the other
27:54: And the headless client, it's still this. It's the same code.


43:07: questions because we got a bunch of them piling up can I can I answer the next one
27:58: It's not a separate thing from what we're running right now.


43:14: uh uh sure I haven't actually read
28:03: 99% of the code is the same.


43:19: yet okay so the Jeb Forge asks would it even be possible to multi-thread World
28:07: So by moving it first, we were able to see


43:25: processing in Resonite like if the world is incredibly heavy in the amount of CPU it uses but since Resonite only uses one
28:11: how much of a big performance uplift we actually get.


43:31: thread it's not using all the CPU it could have been I know multi-threading introduces a lot of problems with thread synchronization what do you think I
28:16: What we found,


43:39: actually have a answer all right guys say it with me oh gosh's
28:18: and we've had a number of community events that have been hosting


43:47: moving for a sec hold on hold on hold on hold on okay here we go all right all right
28:21: big events, we've been able to get way more people


43:54: say it with me Resonite is not single threaded this is this is a myth that
28:26: on those headlesses, even with those headlesses


44:02: that has somehow spread around that Resonite only runs on a single thread this is abjectly not true yeah this is this
28:30: computing everybody's avatars, computing everybody's IK,


44:10: is the thing like we kind of get a lot like because I think like people are just like you know it runs with poor performance therefore it's single
28:33: and dynamic bones, while still maintaining a pretty high frame rate.


44:17: threaded it's like when it comes to multi threading it's like way more complex that like it's not a black and
28:39: So thanks to that, we are confident that moving


44:22: white thing so the way I kind of put it is like you know like
28:43: the graphical client to .NET 9


44:28: it's not like an onoff switch like imagine like you know like you have a city or something and the city has you
28:45: is going to give us a really good performance upgrade.


44:33: know po roads you know like maybe there's like areas like where like you know you know the roads are very kind of
28:50: The only problem is, it's a little bit more complicated process, because we do have to


44:40: narrow and like you know it's kind of hard for cars to get through you can have like you know areas of the city
28:54: rework those systems, and we have to rework integration before we can


44:45: like where you have like you know highways and like you can have lots of cars in there it's not an onoff switch
28:58: move it out. This is also a little bit tangential, but


44:51: like you know where you just like you turn a switch and suddenly every every road is like w
29:02: one of the things, once this happens, once we move it out,


44:57: but you can kind of gradually like rebuild like you know more of this infrastructure to support you know more
29:06: we can actually replace Unity with Sauce, which is going to be our


45:03: of this kind of like high bandwidth um with Resonite like there's a lot of things that are are
29:09: custom rendering engine. And this whole process, it makes it easier


45:09: multi-threaded um there's also L of things that could be multi-threaded and they're going to be more multi-threaded
29:13: because we have this very nicely defined way to communicate


45:15: in the future but it's not a it's like is essentially not like you
29:17: between the two, which means we can actually take this away, you know, and put


45:22: know a black and wine thing like whether it's like either multi or not multi it um you have to think about like you
29:21: Sauce in here instead.


45:30: know resonite it's like lots of complex systems there's like so many systems and
29:25: Like, where we are right now. So right now,


45:35: some of them you know are going to be multi-threaded some of them are going to not going to be multi-threaded some of them are not multi-threaded and they're
29:30: if I move this back...


45:42: going to be multi- threaded some of them are going to stay single threaded because there's not really much benefit to them being multi- threaded so we
29:35: I'll move this back...


45:50: definitely want to do more but we already have a lot of like things like running on multiple threats like
29:37: We have the sound here, there we go. So right now, the


45:57: um a lot of like you know the like like asset processing you know that's multi-threaded the physics like that's using multiple threads a lot of kind of
29:41: sound system is still hybrid, the communication with Unity is still kind of like


46:04: like you know additional processing kind of like spins off like you know does bunch of like background processing and then like integrates with the main
29:47: messy, like there's a lot of


46:10: thread so there's all of like multi threading to the system already there's
29:49: routes into everything, and the particle system is being reworked.


46:15: got to be more um it's also like not like you know it's not something that's
29:54: So we're essentially taking this, and moving


46:21: like a magic you know kind of Silver Bullet like you um with performance
29:57: it in here. We are working on a new particle


46:27: there's a lot of complexity like there's a lot of things um that can be causing like low
30:01: system called PhotonDust. The work has been


46:34: performance and multi- threading is not always the best answer so for example you know with .NET uh with the .NET 9 switch
30:05: kind of progressing over the past few weeks. It's actually getting close


46:42: that's actually not going to change anything with multi threading uh but it
30:10: to feature parity with the current system, which is a hybrid between Unity


46:47: essentially makes the code that we already have which is which has you know whatever multi trading has right now it
30:13: and FrooxEngine. Because the goal is, we don't want to


46:53: makes it run several times faster just by switching to the run time just by having better you know kind of codegen so
30:17: break any content. We want to make sure that whatever


47:02: there's a lot of different things that can be done to improve performance multi-threading is just one of
30:21: is built with existing particle system still works, and


47:07: them um I think I kind of should cover like a lot of it but uh yeah yes um one
30:25: looks the same, or at least very close to what it's supposed to look.


47:15: more thinging is like you know it's also something like when when there's like a world that's like you know very
30:31: Most of the things are already implemented.


47:21: heavy it depends what's making it heavy because some things you can multithread but some things you cannot multithread
30:33: If you go into devlog in our Discord,


47:26: like if you have like you know some user content that's like you know doing lots of interactions with things if you just
30:37: you can see some of the updates and some of the progress.


47:32: you know blatant multi-thread it like it's going to end up like corrupting bunch of stuff because with every
30:41: The main thing that's missing right now as a major system is


47:38: algorithm there's always like a part of it that's irreducible so we want to introduce like
30:45: implementing our own system for particle trails.


47:44: you know more systems that do use multi-threading um were possible but
30:50: Once it's done, it's possible


47:49: again it's like you know it's not a thing it's not a silver blet it's like a it's more like a gradual kind of process
30:53: that this is going to be sometime next week. I don't want to make any promises because things happen,


47:56: you know kind of happens over time uh next we have uh Grand UK is
30:57: but it is getting close to there.


48:02: asking are there road maps with time estimates for B development and what do you want there it to be so for road maps
31:01: We can actually release public builds where we have


48:08: we we generally don't do like super ahead of road maps um right now our focus is you know on performance updates
31:03: both systems at the same time. So we're going to have


48:16: and you can actually find uh on our GitHub uh there's a project board and there's like a list of issues uh you
31:08: legacy system and PhotonDust, with


48:22: know that pertain to like performance update and you can kind of see you know how those kind of pro progress um we
31:11: conversion being something you trigger manually. We'll run a bunch of tests


48:28: don't do generally super much like time estimates because the development varies a lot uh and often times you know kind
31:16: with the community, so we'll essentially ask you to test your content,


48:34: of things kind of come in that like we have to deal with the delay things or maybe there's like no additional complexity so we
31:20: test the new system, find any bugs with it. Once we are


48:41: don't uh we want to avoid like you know promising certain dates uh when we are
31:24: confident that it works okay,


48:47: not like confident we could actually keep them we can give you like very rough ones like for example with the
31:28: we are going to essentially remove the old system


48:54: performance um uh with the performance upgrade I I roughly expect it to happen
31:32: and we make the conversion automatic to the new system.


49:01: sometime like in Q like q1 sometime like early next year we'll see how it goes
31:35: With that, this part is going to be done, and we're going to move on


49:07: um uh but like you know there will be my R estimate on that one after that we
31:39: to this sound part. The sound system


49:13: usually like once we kind of finish kind of big task we sort of re-evaluate like we see like you know what would be the
31:44: is essentially what handles stuff like audio


49:19: next back what be the next best step for the platform at that point and we decide
31:46: spatialization and so on. Right now, it's also a hybrid. So for example,


49:24: you know are we going to focus on UI are we going to implement this thing are we going to implement that thing because um
31:50: on Froox Engine's side, we do our own audio encoding and decoding.


49:30: we try to you know keep our ear to the ground and be like you know this is what
31:55: Unity is not handling that, but what we're doing is we're feeding Unity


49:36: would give like you know the community and the platform most benefit right now this is like you know what's most needed
31:59: the audio data we want to play into individual sources


49:42: right now and we kind of like want to make the decision as um um you know as soon as possible or no actually as late
32:03: and Unity then handles spatialization and then outputting it to your


49:51: as possible uh next question uh we have a
32:07: audio device. We're going to move that part into our own system


49:56: check Fox auter what are some examples of features you've implemented you particularly proud about I mean there's
32:11: which is also what's going to allow us to take control of it and


50:02: a whole bunch like I do like like lot of systems like the one I'm actually working on right now like the particle
32:15: build new features, like for example having multiple listeners.


50:08: system like I'm pretty proud of that um it's I mean it's kind of technically not
32:19: And we're also going to move the system here. There's actually some work on


50:13: out yet but like I'm very happy like with kind of How It's Growing uh in part because you know um it now kind of gives
32:23: this that Cyro's been working on that I'm asking for help


50:21: us control to very easily make new particle effects you know and do stuff like we were not able to to do like
32:27: with, because one of the things in the system is a


50:27: easily before um the one like that kind of came before that is the data feed
32:31: reverb effect. And we essentially need to implement our own, because there's also


50:33: system um that's sort of like a culmination of like a lot of kind of approaches I've been kind of developing
32:35: a thing that's currently handled by Unity, and Cyro has


50:39: to how we do UI in Resonite um so with that one uh one of the big
32:39: made integration with a reverb called the Zita reverb that we


50:47: problems we kind of had with the UI is because the resonite is sort of it's It's building a lot of things from ground up because you know of the kind
32:43: use to replace the existing reverb zones.


50:53: of like you know layer I was talking at the you know ear in the Stream um but it also like makes things
32:48: Would you like to tell us a little bit more about that part?


51:00: difficult because we cannot just you know take existing solution and use it Sol of the UI we actually have to build
32:50: Yeah, so we found


51:06: those systems ourselves and build sort of Frameworks you know to work with them and the old uis they have the problem
32:55: a nifty little... so let me actually back up a little bit.


51:13: where the code of them it's like this B be kind of monolith and it's really hard to work with we have to like you know if
32:59: So currently,


51:19: if if there's like misaligned you know button or something we have to go to the code change you know some numbers there
33:02: the reason why we can't just keep using this reverb or whatever,


51:26: change methods that are called compile wait for it to compile run you know run
33:07: like the one that we're using right now, is because it uses


51:31: the application look at it be like that's still wrong go back to the code make more changes compile wait for it
33:11: Unity, but in turn the underlying reverb


51:38: wait for it to launch look at it it's still wrong go back to the code like
33:15: uses FMOD or something.


51:44: sometimes people are like you know like oh like this thing is like misaligned in this UI and we're
33:17: And that costs at least four dollar signs


51:49: like fixing that like you know that could like sometimes it takes like an hour just kind of like messing around
33:22: to use commercially, I think.


51:55: and that's not very kind of you know good use like of our engineering time but the
33:25: But we found a nifty library called Soundpipe


52:01: data feeds um is a system that's like very generalized uh that um essentially
33:29: that includes a really nice sounding reverb effect.


52:08: allows us to like split the work on UI between the engineering team and our
33:33: And I have been working on


52:13: content or now our team uh so for the when we work the settings UI on the code
33:38: getting the library compiled and integrating it


52:19: side we only have to worry more about the functionality of it like what's the structure what's like no data interfaces
33:41: with FrooxEngine.


52:24: and then we have uh there of our team uh like our team actually build the visuals
33:45: You won't be able to do anything super duper fancy


52:31: uh in game and build like you know kind of put a lot of Polish into each of the elements um and that process has made it
33:48: with it right away, at least not until Froox reworks the whole audio system.


52:38: much simpler to like rework you know rework like the settings UI but what's
33:53: But you'll at least be able to process


52:44: even bigger part of it is um the data feed system that this is built on it's
33:55: audio clips with it and make them sound all echoey and stuff just to try it out.


52:51: very general and it's been kind of designed to be General so the settings UI it was used as sort of like a pilot
34:00: Which I think will be pretty cool.


52:58: project for it but now um we're going to use it you know to like once we get to
34:04: Then you can just reverbify


53:04: more UI rework to rework the inventory rework the contacts rework the World Browser you know file browser rework the
34:08: any audio clip in-game. You can make a reverb


53:12: inspectors and it makes the work required to rework those UIs be at least an
34:12: baker, essentially, which I think is pretty cool.


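As a rough illustration of the kind of clip processing being described, here is a minimal sketch that makes a clip "sound all echoey" using a plain feedback delay. This is only a stand-in for the actual Zita reverb from Soundpipe mentioned above, not the real integration; the class and method names are hypothetical.

<syntaxhighlight lang="csharp">
// Illustrative stand-in only: a simple feedback echo applied to a clip's raw samples,
// not the Zita reverb / Soundpipe integration discussed above.
public static class ClipEffects
{
    public static void AddEcho(float[] samples, int sampleRate,
                               float delaySeconds = 0.25f, float feedback = 0.4f)
    {
        int delaySamples = (int)(delaySeconds * sampleRate);
        for (int i = delaySamples; i < samples.Length; i++)
        {
            // Mix in an attenuated copy of the signal from delaySamples ago.
            samples[i] += samples[i - delaySamples] * feedback;
        }
    }
}
</syntaxhighlight>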
53:17: or like order of magnitude less which means that like before the data feeds um
34:16: It's kind of like expanding the audio processing, because you can already do some trimming,


53:24: say like and I'm just kind of like these are rough estimates but say it would have taken us two months to rework the
34:20: you can do normalization, volume adjustments, fading, and so on.


53:31: inventory whereas now it's going to take us you know two weeks and like those numbers are kind
34:25: Having that code integrated and ready,


53:38: of you know very just kind of more to illustrate a point um but it's essentially you know on that kind of
34:28: we can already expose some of it,


53:44: order like it makes it way simpler it saves us so much time which means we can rework a lot more UI um you know in the
34:31: play with it, and make tools that spin off of it


53:51: shorter time span but overall like you know like there's lots of things I'm
34:36: before we do the big integration.


53:56: kind of proud of those are just kind of the two most recent ones so I kind of BR them up and I could like ramble for this
34:39: Right now, the particle system


54:02: for a while but we have a lot of questions so I don't want to like you know hold things up but uh Sor do
34:41: is the major one that's going to be fully pulled in.


54:08: actually have like one would you like to like share with us yours yeah um I I'll
34:45: Once that part is done, we're going to do the sound system, which I expect to be


54:13: try and be quick with it uh since we're getting back to how long have we been running actually we've been uh we're
34:49: faster than the particle system, because it doesn't have as many things,


54:19: coming in an hour how long do we want to keep this going for so my aim was for like 1 hour
34:53: but we'll see how that one goes.


54:25: to 2 hours uh depending on questions we got a lot questions so like I'm okay like going through all of these but uh
34:56: Once the sound system happens, this is going to get reworked, the integration with Unity, so it's simpler.


54:32: as we start getting out two hours uh we probably stop it okay anyways um the the
35:02: Once this is done, we move the whole thing out, and it's going to be the big performance update.


54:39: when you're were talking about the build process that kind of made me think of uh something that I'm I really enjoyed
35:08: I hope that it helps answer


54:44: working on that you know was kind of it's kind of one of those things where
35:12: the question.


54:50: it's really important uh but it's just so invisible yes um and
35:16: I think I'm going to clean this up, just so it doesn't


54:57: what I did behind the scenes is I actually I basically reworked the entire
35:19: clutter our space, and we can move to the next questions.


55:02: build process of like FrooxEngine almost um
35:24: I'll mark these two as unanswered then.


55:08: so since since FrooxEngine's been around for a while um and it's been through
35:27: There's actually one thing I was also going to mention. Even from the particle system, there's actually a few functions


55:14: many updates to like C# and C#'s project system we were still using um the Legacy
35:32: that spawned us extra things.


55:22: C# project format which is called um MSBuild and that really only works in
35:36: One of them being that we have access to


55:30: something like Visual Studio these days and it's kind of it's it's kind of hard
35:40: 4D Simplex noise, so you can use a ProtoFlux node,


55:35: to work with it's not quite as robust as the like newer build system for .NET and as
35:44: and there's also a 3D texture


55:41: a result um there would often times be like it
35:46: with Simplex noise,


55:48: would be like you'd have like weird issues if you wanted to add like packages and stuff and you could only
35:52: and I've seen people do really cool effects


55:55: use something like Visual Studio as your IDE of choice to boot um
35:56: with it, like this one for example. I think I actually got this one from Cyro.


56:01: and I saw that and I decided to uh poke
36:00: So you see how it kind of evolves in time?


56:09: at it and it actually ended up being a lot easier than I anticipated because Microsoft provides a nice little tool to
36:03: This is like a volumetric effect, so you can kind of push it through.


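For context on the 4D simplex noise mentioned above, the idea is to sample noise at a 3D position plus time as a fourth dimension, so a volumetric pattern evolves instead of just scrolling. A minimal sketch follows; the noise function here is a crude hash stand-in so the example is self-contained, not actual simplex noise and not the ProtoFlux node itself.

<syntaxhighlight lang="csharp">
using System;

public static class VolumeNoise
{
    // Crude sine-hash placeholder returning a value in [-1, 1]; real simplex noise
    // would be smoother and band-limited. This is NOT the actual ProtoFlux node.
    static float FakeNoise4D(float x, float y, float z, float w)
    {
        float n = MathF.Sin(x * 12.9898f + y * 78.233f + z * 37.719f + w * 3.141f) * 43758.5453f;
        return (n - MathF.Floor(n)) * 2f - 1f;
    }

    // Density in [0, 1] at a 3D position; using time as the 4th dimension makes the
    // whole volume evolve, which is what gives the "evolving in time" volumetric look.
    public static float Density(float x, float y, float z, float time, float scale = 0.1f)
    {
        float n = FakeNoise4D(x * scale, y * scale, z * scale, time);
        return (n + 1f) * 0.5f;
    }
}
</syntaxhighlight>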
56:15: upgrade your projects and so what I did is I went through and I upgraded all of the projects to the new uh C# format
36:10: So people have been


56:24: which means that we can take advantage of like the much nicer um project files
36:11: already playing with it. And this is kind of generally how


56:30: which means it's easier to like edit them directly and add actions and stuff
36:15: we like to do development, where, I've got another version here.


56:36: and it also means uh the engine can now be built in IDEs other than vs code you
36:19: This is super neat.


56:43: could use or vs code V Visual Studio proper is what I meant to say there but
36:25: How we like to develop things


56:49: now now you can build it in like vs code or like um you could build it in
36:27: is like, you know, we want to add more building blocks. So even if we're building something


56:56: um you could probably build it in like Rider if you pay for Rider you can build it you can even build the engine from
36:31: official, whatever building blocks we add, we try to make


57:03: the command line now which is really really good for um like yeah like
36:35: as many of them available to everyone using the platform, because you can use them for a lot of other


57:08: automated builds that's a that's a big thing I did that nobody saw but I'm really really proud about yeah like it's
36:40: things. So yeah, but that should


57:15: one of those things like where like you know it it doesn't show on the surface but like it makes you know our lives as
36:43: kind of cover those questions.


57:21: developers like way easier because I had so many times where I would literally lose like sometimes even hours of time
36:47: So next, we have a question from Navy3001.


57:27: just trying to like deal with some kind of problem and having those problems kind of resolved and have the system
36:52: Any idea of how in-game performance metrics for user content


57:33: kind of be like you know nicer it allows us to invest more of our time into actually building things like we want to
36:55: would work? That's actually, that's a good question, and like, measuring


57:40: build it and then dealing you know with project build issues one of the problems for example that's um kind of just kind
37:01: performance, that's a very kind of difficult


57:47: of weird like one of those weird things is with ProtoFlux because for ProtoFlux it's technically a separate system and we
37:03: thing, because one of the things with performance is like, it depends.


57:54: have a project that actually analyzes you know all the nodes and it generates C# code that sort of binds it to
37:08: Like, it depends on a lot of stuff.


58:01: resonite um the problem is with old like Ms build for some reason even if like
37:11: So usually, like, you want to have like, you know, kind of a range of tools, you know, to kind of measure


58:07: you know if um if the if the project that generates that code runs first the
37:16: like, measure things. One of them is, you know, you can measure how long


58:15: build process doesn't see any of the new files you know like in that same build
37:20: individual components take, you know, to execute, and sort of some way to


58:21: build pipeline so if we ever added like a new node like we would compile it and it would fail because um it's like oh
37:23: aggregate the data, so you can kind of see, okay, this is consuming a lot of time,


58:29: like this code doesn't exist even though it actually exists at the time it just doesn't see it with the change Cyro made
37:28: this is consuming a lot of time, but the performance impact


58:35: the problem is gone like we don't we don't have to kind you know this whole thing but the real big thing is like It
37:31: of something is not always like, you know, that direct, because something can, for example, the components


58:40: prepares Resonite for like more sort of automated like you know build pipeline which is something we've been trying to
37:35: themselves, they can be quick to execute, but maybe the object is, you know,


58:46: move towards too because that it's going to be one of the things that's going to save us a lot more time as well that's
37:40: has a really complex geometry, so it's taking a long time on the GPU


58:51: going to uh you know make it so we can actually just push code in the Repository there's automated tests
37:44: to render out. The other part


58:57: they're going to run there's going to be automated builds the binary is automatically goingon to be uploaded and like it's just gonna remove all of
37:47: is like, performance can also differ depending on the scenario. Say,


59:03: manual work that happens all the time um it makes it makes bringing on people like me easier too yeah it makes
37:51: you build an object, and the object is doing a raycast,


59:10: it easier to bring more like Engineers as well because now they don't they don't have to deal with those like weird like issues I know Prime Prime
37:55: it's doing, you know, some kind of checks for collisions. If you have an object in a simple world,


59:17: like he like also lost like sometimes he lost like a day just dealing with like project issues and you know there's a
38:00: maybe it doesn't like, you know, it runs pretty fast, but you bring


59:23: day you could like you could spend working on other stuff and instead we have to just make things work so uh
38:03: that object into a world with much more complex colliders, it suddenly, it starts


59:31: thank you s for making this like this this this things like this even though like they're not visible to the
38:07: hurting performance, because now those collision checks are needed, like, you know, are more


59:36: community they help a lot yeah of course um so next we have a question
38:11: complex. The other example is like, say you use


59:44: from uh Fantastic Mr foxbox the sound system updates can we get a way to
38:15: like a node, like find child. You try to search for a child in a hierarchy.


59:49: capture users voice with the with a permission and import audio streams dynamically into the world uh this would
38:20: And if you're in a simple world, maybe like, you know,


59:55: allow us to fully implement the ham radio stuff into Resonite and allow us to ditch using external browser support to
38:24: the hierarchy of objects is, you know,


1:00:01: support audio um so I'm not sure if I like um I feel like I don't understand
38:27: it doesn't have too much in it. So it runs fast. But then you go into a world which has


1:00:08: enough like about like how you want to kind of capture it um but since we'll be like you know handling all the audio
38:31: way more, and now the performance kind of tanks. Now the thing


1:00:14: rendering we'll be able to build like you know like a virtual microphone that actually captures spatialized audio you
38:35: that was running reasonably fast in one world is


1:00:20: know from its own like kind of like whever it is in world uh so that's one of the things you'll be able to do uh
38:39: running slower in the other one. So, one of the ideas we


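To make the find-child example above concrete, here is a small sketch of why the same search costs more in a bigger world: a depth-first search visits every node under the root in the worst case. The tree type here is a generic illustration, not Resonite's actual slot hierarchy or API.

<syntaxhighlight lang="csharp">
using System.Collections.Generic;

public sealed class Node
{
    public string Name = "";
    public List<Node> Children = new();
}

public static class HierarchySearch
{
    // Worst case visits every node under root, so the exact same "find child" logic
    // runs slower when it is dropped into a world with a much larger hierarchy.
    public static Node FindChild(Node root, string name)
    {
        if (root.Name == name) return root;
        foreach (var child in root.Children)
        {
            var found = FindChild(child, name);
            if (found != null) return found;
        }
        return null;
    }
}
</syntaxhighlight>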
1:00:25: you're going able you know to bring the camera and have it like you know streamed out your device so I would say
38:43: kind of had is, we would kind of build some sort of like, you know,


1:00:32: um yes on that part like on the kind of capture um I don't know about like the I
38:47: kind of like benchmark worlds. We would like, you know, like have like


1:00:40: think I know what they mean yeah I think I I think I'm I know they're trying to like
38:51: different scenarios, complex worlds with complex hierarchies, you know, for this and that


1:00:45: they am I correct in in assuming that you want a way to like import multiple
38:55: and then have a system where you can essentially like run


1:00:52: streams into the world from a single user is that what you're talking about
38:59: that object in that world and sort of, you know, see how fast


1:00:58: you probably have to wait uh for them yeah wait a
39:03: it runs and how does it differ depending, you know, on a different kind of


1:01:04: second you might get back to this question yeah but like you'll be you
39:07: scenario. Overall, I think this


1:01:10: essenti should be able to like you know render audio out from any point in the game in addition you know for the user
39:12: eventually end up with, you know, lots of different tools. So you have like, you know,


1:01:17: so like and then becomes question what do we want to do do want to record an audio clip do we want to Output it into
39:15: you'd have the tools to measure how much the components take to execute,


1:01:23: like another audio device so you can stream it into something so that will work if you want to import audio back
39:19: how long the, you know, GPU takes to execute,


1:01:29: in um that's probably a separate thing and that's probably not going to come as part of
39:23: just sort of like lots of different tools to analyze different like, you know, performance things.


1:01:35: it uh we'll see like if if you have like a no kind clarification just like ask us more and we'll get back to
39:29: So I think that's overall like, you know, like


1:01:42: this uh next we have epic e uh is asking
39:31: what you should expect, like once those tools come in


1:01:47: will the headless client be upgraded to .NET 9 yes uh plan to do this like soon it
39:35: it's not going to be a single tool, but it's going to be like, you know,


1:01:53: should be mostly just like um uh flip of a switch we don't kind of expect like big issues uh one of the things we want
39:39: a range of them that will probably keep like, you know, expanding and building upon.


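One of the "range of tools" described here could look roughly like the sketch below: time each component's update with a Stopwatch and aggregate the totals per component so hot spots show up over many frames. The IWorldComponent interface is a hypothetical stand-in, not FrooxEngine's actual component type.

<syntaxhighlight lang="csharp">
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;

// Hypothetical component interface used only for this sketch.
public interface IWorldComponent
{
    string Name { get; }
    void Update();
}

public sealed class ComponentProfiler
{
    readonly Dictionary<string, TimeSpan> _totals = new();

    // Run one update pass, recording how long each component took to execute.
    public void RunAndMeasure(IEnumerable<IWorldComponent> components)
    {
        foreach (var component in components)
        {
            var sw = Stopwatch.StartNew();
            component.Update();
            sw.Stop();
            _totals[component.Name] = _totals.GetValueOrDefault(component.Name) + sw.Elapsed;
        }
    }

    // Components sorted by total time, i.e. the aggregated view described above.
    public IEnumerable<KeyValuePair<string, TimeSpan>> HotSpots()
        => _totals.OrderByDescending(kv => kv.Value);
}
</syntaxhighlight>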
1:02:00: to do is we're going to make like announcement so people know this is coming you can prepare you know your tooling uh make sure like you know like
39:44: If I could


1:02:07: you're like whatever scripts you're using so like you know upload your you know run your headless like didn't just
39:46: append to that


1:02:13: like explode um there's a GitHub issue on it and I'll try to like you know make the announcement like in a bit um uh
39:52: we probably also


1:02:19: probably you know sometime like next week uh get people ready uh Alex TPI is asking makes me
39:54: because I know this is, I know this has like come up occasionally in relation


1:02:26: wonder what is currently the culprit of most crashes on my computer I mostly see information that Unity crashes
39:58: to questions like this, we probably also wouldn't


1:02:31: couldn't just restart Unity we s had a discussion about couldn't you just um I
40:03: like give things like


1:02:40: mean so for the first part the question crashes they can have lots of reasons um
40:06: we wouldn't do like an arbitrary limiting system like


1:02:47: it's really hard to say like in general like you pretty much have to send us the crash log we look at it you know we look at a call stack and be like this is probably
40:10: oh you can only have 60,000 triangles, you can only have


1:02:55: causing it um so it's kind of hard to say in general for the part couldn't just
40:14: X number of seconds of audio on you.


1:03:00: restart unit I mean kind what the crash is it's like it's essentially you know
40:18: We do want to add tools so you can restrict, because it can like, it's not


1:03:06: breaks um and then like it has to shut down and you have to start it again so in a way you're kind of
40:23: it's not a perfect solution, but like we want to add people


1:03:12: restarting Unity it's just that like you know the restart is kind of forced but um this actually kind of ties
40:26: like we want to add tools so people can like, you know, set some limits


1:03:20: you know uh to some of the stuff with the multiprocess architecture for the performance upgrades because if you've
40:30: on things. Because one of our kind of philosophies


1:03:26: been here earlier um we've been talking about you know how like FrooxEngine is
40:34: is like, you know, we want to give people a lot of control.


1:03:31: going to essentially be moved into its own process and then unit is going to be handling the rendering one of the things
40:38: And if you want to run a session like where you can spawn


1:03:37: that like I'm sort of considering as part of a design is so like the unity can actually be restarted so
40:42: object that has, you know, 50 million triangles and like everybody's going to be running at like


1:03:43: like maybe uh so if Unity happens to crash we can you know keep running FrooxEngine
40:46: you know, 10 FPS, but you know, you want to be like I have a beefy


1:03:49: start a new Unity and just you know reinitialize everything so like I do want to make that part of it just you
40:50: GPU, I want to look at this super detailed model, we always


1:03:55: know in general to make the system more robust so it's possible um but uh TBD like we'll see how that
40:54: want people to have ability to do that. At the same time


1:04:05: kind of goes uh kuk is asking I have heard from someone complaints of headless being
40:58: we want to add tools so like, you know, if you want to host like a chill world


1:04:11: patreon reward this was particularly a complaint about communities that do want to host events essentially forced into it to keep events going if their
41:02: if you want to keep it like, you know, more light, you have tools


1:04:17: host crashes are there plans later to remove the patreon requirement for the headless when things are more stable and
41:06: to kind of like, you know, set certain limits on the users, how much they can


1:04:24: performant so at some point we'll probably um make it kind of more open uh
41:10: spawn in, how much they can bring in. So we're not going to make


1:04:30: our tentative goal and this is like you know not set in stone so like things you know might change uh our tentative goal is
41:14: them, you know, forced, but we're much more likely to add like tools where


1:04:38: like we want to offer sort of like a service where we make it easy to auto-spin headlesses and kind of move you know
41:18: you have the kind of control to decide what you want, you know, in your


1:04:45: patreon to that so like if you support us kind of financially he will get like you know you get like you know certain
41:22: world, what you want in your experience.


1:04:51: amount of hours you know for headless and like we're going to make it very easy to host and if you want to self host we're going to give you the headless um we do have
41:26: Other aspect of that is like, you know, we have the asset variant system and we already


1:04:58: to look at it like you know from the business perspective because um it is like patreon is like one of the things
41:30: use part of it, like you can go into your settings and you can lower


1:05:03: that's like supporting the platform and that's allowing us to work on it so we you know we don't want to kind of
41:34: the resolution of textures. You can, for example, clamp it to like 2K.


1:05:10: compromise that because um if we do something with that you know it ends up
41:38: So if you're, you know, low on VRAM, you can lower the textures


1:05:15: like you know uh hurting our Revenue stream that we're not able to like you know pay people on our team and then
41:42: and if somebody has, you know, 8K texture on their avatar, you're


1:05:22: we're not able you know to work on things and you know things end up kind of bad um we want it to be like
41:46: only going to load it up to 2K. You know, it's not going to hurt you, but other


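As a tiny sketch of the texture-clamping idea above (not the actual asset variant system): each skipped mip level halves the resolution, so an 8K texture clamped to 2K skips two levels.

<syntaxhighlight lang="csharp">
public static class TextureClamp
{
    // Number of mip levels to skip so a texture no larger than maxSize gets loaded,
    // e.g. MipLevelsToSkip(8192, 2048) == 2.
    public static int MipLevelsToSkip(int sourceSize, int maxSize)
    {
        int skip = 0;
        while (sourceSize > maxSize)
        {
            sourceSize /= 2;
            skip++;
        }
        return skip;
    }
}
</syntaxhighlight>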
1:05:28: accessible to as many people as possible but we sort of like you know balancing it like with the business side of
41:50: people, like say somebody has, you know, one of the, you know, 4090s


1:05:36: things uh next one Troy Borg s also did FMOD a while ago having an audio system
41:54: with 24 gigs of VRAM and they don't care, they can keep it like, you know, kind of


1:05:41: that could maybe just be part of the game like waveform visualizers or be able better detection of beat music effects uh
41:57: unlocked. And it's kind of, you know, aligned with our kind of like philosophy is like


1:05:47: this is actually separate from that because like that happens fully like within Resonite uh the audio system is more like you
42:04: give people as many tools as possible to kind of control your experience. But also


1:05:53: know about rendering the audio output put you know and like pushing into to your audio
42:08: we don't want to enforce, like, you know, limits on people where possible.


1:06:01: device uh next we have I'm kind of just speeding through these questions because we have a bunch uh Sky soon a few people
42:14: Yeah, that's kind of more


1:06:08: have mentioned that they are not happy with the new Walking system and how it looks our plans to continue to improve
42:15: so where I was going with that, is that we wouldn't have like a sort of


1:06:13: that it go last update but people are still not happy and we can always like improve things so we just released you
42:20: hard and fast, these are the rules for the whole platform kind of


1:06:19: know an update which uh um weas an update like which integrates some kind
42:24: rules. Because, you know, not everybody's computers are


1:06:24: of community like settings which would make it like look way better um for things that are like you know that
42:28: equal, and so maybe I don't want to render your 500 million


1:06:31: people still find us an issues with it like we will need like reports on those because uh right now um right now like
42:31: polygon model, right? But


1:06:39: you know like we're not sure like after the update we're not sure what's making people not happy about it uh so we have
42:37: we also don't want to


1:06:46: like kind of more concrete stuff to work with uh I suggest people you know kind of like make make kind of like reports
42:39: we want to present this stuff in a sort of like unbiased way.


1:06:51: so we can know what do we focus on uh but yes in general like we are always
42:43: Like, we don't want to, like, we wouldn't


1:06:56: kind of you know willing to improve things we want to like um we want to make
42:48: I wouldn't want to color, like, 500, like, you know,


1:07:02: um um essentially want to like you know make like it as polished as it can be um
42:51: someone's polygon count in, like, red or something. Because


1:07:08: but we also need like you know more kind of hard data to kind of work with so we know like where to invest our
42:55: we get into, like, the social kind of thing, but it also comes with


1:07:16: time um next we have uh turbor what is causing vience for sometimes when moves
43:03: I think that should, like, answer


1:07:23: I'm not sure it could be just the Bloom on that thing maybe it's his radiant yellow complexion
43:05: like this, like, particularly, we should probably, like, move to the other questions because we got a bunch of them piling up.


1:07:32: with your resplendent visage uh oh this is actually very much what was the answer
43:10: Can I answer the next one?


1:07:38: to this uh so these are just looks like questions within the chat uh rash 86
43:14: Uh, sure.


1:07:44: who's the second person here on the camera uh this is Cyro he's our engineering intern hello hi how you doing guys it's
43:17: I haven't actually read it yet.


1:07:53: me uh next we have SkywinKitsune questions from
43:20: Okay, so, TheJebForge asks, would it even be possible


1:08:00: Tyra white tail who can't watch the stream now uh if video players are going to be updated with cor VLC I have heard
43:24: to multithread world processing in Resonite? Like, if the world is incredibly heavy in the amount


1:08:06: from several builders that the player uses a very outdated core yes the system we use right now uh it's a plugin called UMP
43:28: of CPU it uses, but since Resonite only uses one thread, it's not using all the CPU it could have been.


1:08:14: Universal Media Player which is built around VLC um unfortunately hasn't been updated in years uh which means it's
43:34: I know multithreading introduces a lot of problems with thread synchronization.


1:08:21: using like an older version of it um we have been kind of looking into upgrading
43:37: What do you think?


1:08:26: to actual official like VLC plugin the problem is it's still kind of not mature
43:41: Alright guys, say it with me.


1:08:32: enough in some ways like um um the last I remember there's like issues where you
43:45: Oh gosh, the camera's moving.


1:08:38: cannot like have more than like one video at a time you can only have like you know one and like if you try to do
43:48: Hold on, hold on, hold on, hold on.


1:08:43: another one it just explodes um there's other things like we can kind of look into like alternative
43:51: Alright, say it with me. Resonite is not


1:08:50: kind of like rendering engines but there also like you know potential like time and money investment um if the problems
43:56: single-threaded. This is a myth


1:08:56: kind of like you know bad like we can kind of are we kind of considering like
44:00: that has somehow spread around that Resonite only runs on a single


1:09:02: uh that we might invest into one but we need to kind of do some testing there and see see like you know how well it
44:04: thread. This is abjectly not true.


1:09:07: kind of works uh it's unfortunately kind of like difficult situations because the solutions you know are limited
44:09: Yeah, this is a thing we kind of get a lot, because I think people are just


1:09:16: um so it's something like it's something we want to like improve but it's also
44:13: like, you know, it runs with poor performance, therefore it's single-threaded.


1:09:21: kind of like difficult to work with unfortunately um can I can I comment on the next one
44:18: When it comes to multithreading, it's like way


1:09:29: yeah so Rasmus 0211 asks thoughts on about 75% of all users being in private
44:21: more complex. It's not a black and white thing.


1:09:36: worlds Around the Clock often new users mention they see practically no enticing worlds this is not a resonite problem
44:24: So, the way I kind of put it is, you know,


1:09:45: this is a problem of scale all platforms have people have
44:27: it's not like an on-off switch. Imagine you have


1:09:52: a have a pretty wide majority of people who just kind of want to hang out and
44:31: a city or something, and the city has poor roads.


1:09:58: not really you know be bothered and unfortunately you know we're not we're not the biggest platform out there we're
44:36: Maybe there's areas where the roads


1:10:04: still we're still kind of small um and as we grow that problem will
44:39: are very narrow, and it's kind of hard for cars to get through.


1:10:12: undoubtedly get better um it's like a it's sort of like you know like not like
44:43: You can have areas of the city where you have highways, and


1:10:18: really a technical problem it's more like a social one because you know it's like people behave you know a certain way and
44:47: you can have lots of cars in there. It's not an on-off


1:10:25: then like it's really hard to kind like you know change that um there's some things we want to do to kind of entice
44:51: switch where you just turn a switch and suddenly


1:10:31: people to make it easier to discover things like you know um like we were talking earlier adding an events UI so
44:55: every road is wide, but you can gradually rebuild


1:10:37: you can of see you know these are the things that are coming up these are going to be public events that you can join um right now I believe there's the
45:00: more of the city infrastructure to support


1:10:43: Creator Jam event that's like going on and it's always you know every weekend uh it's public to everyone um but it
45:03: more of that high bandwidth. With Resonite,


1:10:51: depends you know what people are kind of coming in for because people might come in and they don't actually want to join
45:07: there's a lot of things that are multithreaded.


1:10:57: public events they want to you know go into those private worlds but the question is you know how do you make
45:11: There's also a lot of things that could be multithreaded, and they're going to be more


1:11:02: those people you know discover like you know the friend groups and like you know who can hang out in those worlds and
45:15: multithreaded in the future, but it's not


1:11:08: it's a part kind of challenging problem especially you know from the platform perspective because we we can't just you
45:21: it's essentially not


1:11:14: know force people into public worlds you know people um will host whatever worlds
45:23: a black and white thing, whether it's either multithreaded or not


1:11:20: they like but I always like want to see what kind of tools we can give like you know to sort of entice people and make
45:27: multithreaded. You have to think about Resonite,


1:11:28: make like you know make worlds and socialization easier to discover but like Cy said like it is is is a thing
45:30: it's like lots of complex systems. There's so many systems, and


1:11:34: that kind of gets better like with scale once you kind of grow more um there's um like there's definitely like
45:34: some of them are going to be multithreaded, some of them are not going to be


1:11:42: number of events though so like if people go you know right now since we don't have the events you are in game if
45:38: multithreaded. Some of them are not multithreaded, and they're


1:11:49: you go into the resonite Discord um
45:42: going to be multithreaded. Some of them are going to stay single-threaded, because there's not


1:11:56: um like if you go like you know into the Resonite Discord we have like a community news and lots of different communities
45:46: much benefit to them being multithreaded. So we definitely


1:12:03: kind of post like regular events there so people can kind of find what's kind of going in the platform and kind of helps a bit like in the meantime if uh
45:50: want to do more, but we already have a lot of things


1:12:11: you know, if people are looking for things to do. Next, a question from bular: is Sauce
45:54: running on multiple threads, like


1:12:18: being worked on in parallel with the current performance-related updates, or has it been put on hold until the performance work is
45:59: asset processing that's multithreaded, the physics that's using


1:12:23: done? Yes, it's actually being worked on in parallel. Geenz is, like, one of the
46:02: multiple threads, a lot of additional processing


1:12:28: main people like working on that um we did have like um uh we do have like a meetings like now and then like we sort
46:06: spins off, does a bunch of background processing, and then integrates with the main thread.


1:12:34: of like you know synchronized on the status of it uh last time like a that was like a two weeks ago or so we talked
46:11: So there's a lot of multithreading to the system already,


1:12:41: about like the multiprocess architecture how this going going to work how this going to integrate you know with for
46:14: there's got to be more.


1:12:47: FrooxEngine, and how those systems are going to communicate. Geenz's approach
46:18: It's not something that's like a magic silver bullet.


1:12:53: was like you know to look at what current what the current unit integration has and implemented on
46:25: With performance,


1:12:59: the Sauce end. However, there's a lot of things that we're actually, you know, moving, like the particle system, audio
46:27: there's a lot of complexity. There's a lot of things


1:13:04: system, you know, input system, lots of things that are going to be moved into FrooxEngine, so they don't
46:32: that can be causing low performance,


1:13:10: need to be implemented on the Sauce side, and they're going to, you know, focus more on other things. They have a list of
46:34: and multithreading is not always the best answer.


1:13:17: like, you know, Sauce features, and specifically Bevy features, because Sauce
46:38: So for example, the .NET 9 switch, that's actually not


1:13:23: is being built around the Bevy rendering engine, which, you know, maps to, like, you
46:42: going to change anything with multithreading,


1:13:29: know current features we have like for example we have you know lights do they support Shadows you have like reflection
46:46: but it essentially makes the code that we already have,


1:13:35: probes you know do they support this and that so they're working kind of like you know on making sure like there's a
46:50: which, with whatever multithreading it has right now, it makes it run


1:13:40: feature part to there um once we have like a new performance upgrade like we can like work more like on the
46:54: several times faster, just by switching to the runtime, just by having


1:13:46: integration, and also, like, work on the Resonite side, sort of like the work that's been happening on consolidating the shaders,
46:58: better code gen. So there's a lot of different


1:13:53: because all the shaders we have right now, they need to be rewritten for Sauce,
47:02: things that can be done to improve performance, multithreading is just one of them.


1:13:59: um because the current ones they no designed for Unity so we need
47:08: I think I should cover a lot of it,


1:14:04: equivalents. The equivalents of those are essentially going to be implemented for the new rendering
47:11: but yes.


1:14:13: engine. Next, EpicEston: how do you make a walkie-talkie system? That's actually one of the
47:15: One more thing is, it's also something like,


1:14:19: things like you should be able to do you know with the new like audio system you'll be able to like you know have
47:18: when there's a world that's very heavy, it depends what's making it


1:14:24: like virtual virtual like microphone put it on a thing and then have it like output from like you know another audio
47:23: heavy, because some things you can multithread, but some things you cannot multithread.


1:14:30: source so there actually might be a thing you'll be able to do once we rework that because it shouldn't be too
47:26: If you have some user content that's doing lots of interactions with things,


1:14:35: difficult to add components for that. Erlage86: rigid body Newtonian
47:31: if you're just blatantly multithreaded, it's going to end up


1:14:42: physics system coming sooner or later? So, definitely sometime after the performance upgrade. We do have, like,
47:34: corrupting a bunch of stuff, because with every algorithm


1:14:49: a physics engine integrated called BepuPhysics. One of the things we want to do after we move FrooxEngine out of
47:38: there's always a part of it that's irreducible.


1:14:56: Unity and it's running on .NET 9 is we want to synchronize Bepu to the latest version,
47:43: So we want to introduce more systems that use multithreading


1:15:02: because right now we kind of have to diverge, because BepuPhysics, it used to work with, you know, .NET Framework,
47:46: where possible, but again, it's not


1:15:09: which is what the I is like right now for like Unity um but now the new
47:50: a silver bullet. It's more like


1:15:14: versions, they require I think .NET 5, or maybe they even bumped it higher,
47:54: a gradual kind of process that happens over time.
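
The pattern described here, where heavy work is spun off to background threads and the results are integrated back on the main thread, can be sketched roughly as below. This is a minimal, hypothetical illustration (the AssetProcessor name and structure are made up, not FrooxEngine code); it mainly shows why the integration step itself stays serial.

<syntaxhighlight lang="csharp">
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

// Hypothetical sketch: expensive work runs on the thread pool, finished
// results are queued and applied on the single-threaded update loop.
class AssetProcessor
{
    private readonly ConcurrentQueue<string> _finished = new();

    // Kick off background processing; returns immediately.
    public void ProcessInBackground(string assetPath)
    {
        Task.Run(() =>
        {
            // Stand-in for an expensive, parallelizable job (decoding, mesh processing, ...).
            var result = $"processed:{assetPath}";
            _finished.Enqueue(result);
        });
    }

    // Called once per frame from the main update loop.
    public void IntegrateResults()
    {
        while (_finished.TryDequeue(out var result))
        {
            // The order-dependent, irreducible part stays on the main thread.
            Console.WriteLine($"Integrated {result}");
        }
    }
}

class Program
{
    static void Main()
    {
        var processor = new AssetProcessor();
        processor.ProcessInBackground("texture.png");

        // Stand-in for a few frames of the main update loop.
        for (int frame = 0; frame < 10; frame++)
        {
            processor.IntegrateResults();
            Task.Delay(16).Wait(); // roughly one 60 fps frame
        }
    }
}
</syntaxhighlight>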


1:15:19: which means like we cannot really use those uh at least not like you know lots of back porting so one of the task is
48:00: Next we have GrandUK is asking


1:15:24: going to be going to sync it up and then like you know we're going to be able to look at like how much work is it you
48:02: are there roadmaps with time estimates for both development and what do you want Resonite to be?


1:15:30: know when do we want to prioritize rigid body simulation integrative with for extension it's also going to help
48:06: So for roadmaps, we generally don't do super


1:15:36: because BepuPhysics is designed to work, you know, with the modern .NET to be really performant, which is
48:10: far-ahead roadmaps. Right now our focus is on performance


1:15:43: why I would like person kind of considerer as a sort of prerequisite for you know implementing that is the
48:14: updates, and you can actually find on our GitHub


1:15:49: performance upgrade so we actually running it with the run time it's supposed to run with uh but there's not
48:18: there's a project board, and there's a list of issues


1:15:55: like specific kind of prioritization right now once we're done with the performance update we might Focus you know more on UI focus on ik maybe other
48:22: that pertain to performance updates, and you can see how those


1:16:02: things we will reevaluate at that point uh gr is asking once move away
48:26: progress. We don't do time estimates


1:16:10: from Unity to Sauce, could it be possible to dynamically connect/disconnect from VR at runtime without restarting the game?
48:31: because the development varies a lot, and oftentimes


1:16:16: that's not really even a thing like that needs to move away from un it's possible to implement it uh with unity it's just
48:34: things come in that we have to deal with that delay things,


1:16:24: you know takes a fair amount of work so possible yes I would say um question
48:38: or maybe there's additional complexity. So we


1:16:31: is you know are we going to invest time into implementing that and for that like you know I don't
48:42: avoid promising certain dates


1:16:36: know the answer right now. Next we have a question from Restot Prime: with these alternate audio
48:46: when we are not confident we could actually keep them.


1:16:42: rendering sources, will that allow for spatial data for your own voice? For example, if I want to record a conversation between myself and
48:50: We can give you very rough ones, for example with the


1:16:49: someone else from third person without it sounding like I'm right at the camera yes yeah like the that wouldn't be like
48:54: performance, with the big performance upgrade


1:16:55: an issue because we can just you know like I mentioned like you can have like any sort of listener in the world and
48:59: I roughly expect it to happen sometime in Q1


1:17:01: just record that but like you know P AIO and everything uh next what flavor of sauce
49:02: sometime early next year. We'll see how it goes


1:17:08: what does it taste like uh and it's very salty mayonnaise not mayonnaise it's
49:08: but that would be my rough estimate


1:17:15: actually made like his own like kind of sauce which is like a white named sauce actually I forget what what he call
49:10: on that one. After that, we usually


1:17:20: calls it uh Scotch sauce Scotch sauce yes he
49:13: once we finish on a big task, we re-evaluate


1:17:26: makes like you know this like a really delicious sauce so like it's it's very salty one so but it has like you know of
49:17: what would be the next best step for the platform


1:17:31: flavor to it I think this next one's aimed at me uh Alex tpie says sah I heard that
49:22: at that point, and we decide are we going to focus on UI


1:17:39: some people don't trust you and that you don't care you know where this comes from I think I do I'm in desktop a lot
49:25: are we going to implement this thing, are we going to implement that thing, because


1:17:47: um and I'm often either working in Fruit engion these days or I'm I I I'm kind of
49:30: we try to


1:17:55: audio sensitive and I can get over stimulated kind of easily so sometimes I will you know just kind of stand there
49:33: keep our ear to the ground and be like this is what would


1:18:01: or maybe I won't respond so colorfully um but I like having um
49:37: give the community and the platform most benefit right now


1:18:07: people around and so that's why you know I exist despite that
49:40: this is what's most needed right now, and we want to make the decision


1:18:13: um I also uh appreciate it when I I'll probably open up a lot more
49:46: as soon as possible


1:18:22: um if uh how do I put
49:48: no, actually as late as possible.


1:18:28: this basically um if you if you want to interact with the Cyber creature uh well
49:54: Next question, we have Jack the Fox author


1:18:36: you know do things like um you know just ask before you know like poking my nose
49:57: what are some examples of features you've implemented that you're particularly proud of?


1:18:42: or like patting my head and stuff um and uh you know ask me if you want to send
50:02: There's a whole bunch, I do a lot of


1:18:49: me like a contact request um just don't come up to me and be like you're cute
50:04: systems, the one I'm actually working on right now, the particle system


1:18:54: and then like click my name and add me cuz then I have to explain like I'm probably not going to add you man we
50:08: I'm pretty proud of that, it's


1:19:00: talked for like maybe 2 seconds I need at least 45 seconds
50:13: technically not out yet, but I'm very happy with how it's going


1:19:05: um but I uh and if you if you've come across me
50:17: in part because it now


1:19:11: and I've been in that sort of state where I'm not super talkative or maybe I seem a little detached um hopefully that sheds a
50:21: gives us control to very easily make new particle effects


1:19:19: little light on uh on that I I love this place very dearly and um
50:25: and do stuff we were not able to do easily before


1:19:25: I love all of you very dearly is a good
50:29: the one that came before that is the data feed system


1:19:32: be uh so next we have question from damos what's the current workflow for
50:35: that's a culmination of a lot of


1:19:38: identifying performance bottlenecks so um generally like the
50:37: approaches I've been developing to how we do UI


1:19:44: workflow is like something like you know um it's kind of depends because like
50:41: in the Resonite


1:19:49: there's like lots of things that can like you know cause like performance issues um so usually it's a combination
50:44: so with that one, one of the big problems we've had with the UI is because the Resonite


1:19:55: of those different things but usually it kind of starts more with just observation you know kind of
50:49: is building a lot of things from ground up


1:20:01: seeing what's running slow when am I liking you know and so on and once kind
50:52: because of the layer I was talking about in the stream


1:20:07: of like you know there's that initial observation we will try to you know narrow down to like the root of the
50:58: but it also makes things difficult because


1:20:14: issue and for that we can use variety of tools some of them are in gamees for example we have like you know stats on
51:00: we cannot just take existing solution and use it


1:20:21: how um you know how much like are certain part to the process like you know taking once we need more detailed
51:04: so a lot of the UI, we actually have to build those systems ourselves and build frameworks


1:20:28: information we can for example around you know headless uh the Headless client with like Visual Studio profiling tools
51:08: to work with them, and the old UIs, they have


1:20:34: and they actually measure you know like how long is spent like you know in each method how long is spent like in each
51:12: the problem where the code of them is like this big monolith


1:20:40: code it gives us you know some kind of data the other part of it like ining benchmarking once we kind of have like
51:16: and it's really hard to work with, we have to


1:20:46: suspicion this thing is you know causing a lot of you know performance problems we can write like a test sample and then
51:19: if there's misaligned button or something


1:20:52: you know run it um run with different run times r with different settings do sort of like you
51:23: we have to go to the code, change some numbers there, change some methods


1:20:58: know AB testing see how things change um I for example done this you know with
51:27: that are called, compile, wait for it to compile


1:21:04: like a lot of likeit um like of extensions kind of like methods where um
51:30: run the application, look at it, be like that's still wrong


1:21:09: for example even like you know stuff like the base Vector operations I would try different ways to implement certain operation run a benchmark and see how
51:34: go back to the code, make more changes, compile, wait for it


1:21:17: fast it runs um and there like one thing that kind of depends there is you know what
51:38: wait for it to launch, look at it, it's still wrong, go back to the code


1:21:23: the run time it uses uh so one thing I would for example find is like certain implementations
51:44: sometimes people are like, oh this thing is misaligned


1:21:29: they actually run faster with mono um and then slower for like you
51:47: in this UI, and we're fixing that


1:21:36: know, the modern .NET runtime. So there's a lot of things, you know, in FrooxEngine, like,
51:51: sometimes it takes an hour, just messing around


1:21:41: where like sometimes like people like kind of decompile and see like why is this like done this weird way and in
51:55: and that's not very good use of our engineering time


1:21:47: some cases It's like because it actually even though it's kind of like you wouldn't do it with more modern code
52:00: but the data feeds


1:21:54: it interacts better with their R time views at a time um but for example with
52:03: is a system that's very generalized


1:22:00: these like you know General pars I would find you know they if I compare them
52:05: that essentially allows us to split the work on UI


1:22:06: with the mono in unity and compar the mod runtime they would around like you know 10 like sometimes even like 100
52:10: between the engineering team and our content, or now, our team


1:22:13: times faster there's like some other things that you know also kind of like speed up some things that are kind of the same but General it's just a kind of
52:16: so when we work the settings UI, on the code


1:22:20: combination of like you know tools like like it's um we kind of observe you know something not performing well we have
52:19: side we only have to worry more about the functionality of it, like what's the structure, what's the data


1:22:26: like a suspicion like this might be causing this might be causing it and then we just kind of use tools to kind
52:23: interfaces, and then we have the rest of our team


1:22:32: of dig down and figure out the root cause of that problem
52:27: like our team, actually build the visuals


1:22:37: um so hopefully that kind you know answers that I think there are also um
52:31: in-game, and put a lot of polish into each of the elements
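
The benchmarking workflow described in the answer above, writing a small test sample for a suspected hotspot and running it under different runtimes or settings, could look roughly like the following sketch. It uses BenchmarkDotNet and a made-up vector-summing example; it is not the actual FrooxEngine benchmark code.

<syntaxhighlight lang="csharp">
using System.Numerics;
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Running;

// Hypothetical micro-benchmark comparing two implementations of the same
// operation; run it under Mono and under modern .NET to A/B test them.
public class VectorSumBenchmark
{
    private Vector3[] _data;

    [GlobalSetup]
    public void Setup()
    {
        _data = new Vector3[100_000];
        for (int i = 0; i < _data.Length; i++)
            _data[i] = new Vector3(i, i * 2, i * 3);
    }

    [Benchmark(Baseline = true)]
    public Vector3 ManualComponentSum()
    {
        float x = 0, y = 0, z = 0;
        foreach (var v in _data) { x += v.X; y += v.Y; z += v.Z; }
        return new Vector3(x, y, z);
    }

    [Benchmark]
    public Vector3 OperatorSum()
    {
        var sum = Vector3.Zero;
        foreach (var v in _data) sum += v;
        return sum;
    }
}

public class Program
{
    public static void Main() => BenchmarkRunner.Run<VectorSumBenchmark>();
}
</syntaxhighlight>

Which variant wins can differ between runtimes, which is exactly why the same sample gets re-run after a runtime switch.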


1:22:44: like some some manual profiling tools out there like Tracy I know there's some Tracy bindings for C which are really
52:36: and that process has made it much


1:22:50: cool yeah want to inte that's actually one of the cool things because there's a bunch of of libraries that we canot even
52:39: simpler to rework the settings


1:22:55: use right now because of the old runtime. Tracy, I think it requires, like,
52:43: UI, but what's an even bigger part of it is


1:23:01: what was it, like .NET 8 or some newer version? It's listed for .NET 7, but I
52:47: the data feed system that this is built on


1:23:08: think it's just interrupt so it could work but it's better to just wait yeah
52:50: is very general, and it's been kind of designed to be general


1:23:14: but we want to integrate like more tools like usually like you have like a know your performance profiling tool set so
52:54: so the settings UI, it was used as sort of like a pilot project for it


1:23:20: like you just kind of like you know dig down and figure out like where it could be coming from sometimes it's kind easier to find sometimes it's harder
52:59: but now, we're going to use it


1:23:26: sometimes you have to do a lot of work for example you know with the testing like I've done before comparing you know
53:02: once we get to more UI work, to rework the inventory


1:23:32: the net you know five or whatever version it was and mono I saw like you
53:07: rework the contacts, rework the world browser, file browser,


1:23:37: know this is this code is running way better so I think it's going to help like improve a lot but like you still
53:11: rework the inspectors, and it makes the work required


1:23:44: usually test you know bits and pieces and it's kind of hard to test the whole thing because the whole thing doesn't run with that new interface and that's
53:14: to rework those UIs at least an order of magnitude


1:23:52: like you know why uh for our current performance update we moved the Headless
53:18: less, which means that before the data feeds


1:23:58: first uh because moving the Headless was much easier since it already you know it
53:25: these are


1:24:03: exists outside of unity and we could you know run sessions and kind of compare how how does it perform compared to the
53:26: rough estimates, but say it would have taken us two months to


1:24:10: mono one and the results from that like we got like they're
53:31: rework the inventory UI, now it's going to take us two weeks


1:24:15: like it's essentially like you know beyond expectations is way faster so that makes us more confident than doing
53:36: and those numbers are


1:24:22: all this work to move fre engine out of unity it's really going to be worth
53:39: more of an illustrative point, but it's


1:24:27: it um next question we have uh let's make
53:43: essentially on this kind of order, it makes it way simpler, it saves us so much time


1:24:33: this a little bit bigger uh fitop as of my perception the Ison is
53:47: which means we can rework a lot more UI


1:24:39: somewhat being marketed as a furry social VR platform which is not case at all but every time as somebody hey do I
53:50: in a shorter time span.
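
A rough illustration of why a generalized data feed saves that much time: the code side only describes what data exists and how it is structured, and any in-game UI can walk the same interface and decide how to present it. The interface below is a heavily simplified, hypothetical stand-in, not the real Resonite data feed API.

<syntaxhighlight lang="csharp">
using System.Collections.Generic;

// Hypothetical, simplified data-feed shapes; not the actual Resonite API.
public record FeedItem(string Path, string Label, string Value);

public interface IDataFeed
{
    // The code side only answers "what items exist under this path?".
    IEnumerable<FeedItem> Enumerate(string path);
}

// Example feed: a settings-like tree backed by a dictionary.
public class SettingsFeed : IDataFeed
{
    private readonly Dictionary<string, List<FeedItem>> _tree = new()
    {
        ["/"] = new() { new FeedItem("/audio", "Audio", ""), new FeedItem("/video", "Video", "") },
        ["/audio"] = new() { new FeedItem("/audio/volume", "Volume", "0.8") },
    };

    public IEnumerable<FeedItem> Enumerate(string path) =>
        _tree.TryGetValue(path, out var items) ? items : new List<FeedItem>();
}

public static class Program
{
    public static void Main()
    {
        // The "visual" side: a settings UI, inventory, or contacts list can all
        // walk the same interface and build whatever presentation it wants.
        IDataFeed feed = new SettingsFeed();
        foreach (var item in feed.Enumerate("/"))
            System.Console.WriteLine($"{item.Label} -> {item.Path}");
    }
}
</syntaxhighlight>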


1:24:44: want to try Resonite, I usually get answers like, oh, that VR game for furries. I have nothing against them, but in Resonite they
53:55: There's lots of things I'm kind of proud of, I just kind of did two most recent


1:24:50: are very publicly dominant I about this topic that could maybe bring in more people so we don't really market like
53:59: ones, so I could ramble for this for a while, but


1:24:59: resonate as a for social VR platform we actually specifically um chrom like we know who's kind of hting our marketing
54:03: we have a lot of questions, so I don't want to hold things up.


1:25:06: we specifically for our own official marketing materials we show like you know different like diverse avatars
54:06: Sorry, do you actually have one we'd like to share with us?


1:25:13: because yes there's a lot of fies on this platform and you know it's also sort of sort of per self-perpetuating
54:10: Yeah, I'll try and be quick with it


1:25:19: thing where you know because there's lot of fies they bring in bunch more um we do have like you know lots of other
54:14: since we're getting back to... How long have we been running actually?


1:25:25: communities like as well uh which are not as big but you know they are here as well
54:19: We're coming in an hour.


1:25:31: um so we we want Rite to be for everyone it's not designed specifically for fues
54:22: How long do we want to keep this going for?


1:25:39: um um like we we pretty much like you know like we want everyone to be welcome
54:24: So my aim was for one hour to two hours, depending on the questions. We got a lot of those questions, so I'm okay going through


1:25:46: here and there's it's it's sort of like a complicated kind of thing because like
54:30: all of these, but as we start getting out of two hours, we'll probably


1:25:52: we we can adjust like you know we the marketing we make you know we try to make it as generalized but you know then
54:34: stop it.


1:25:59: question is like you know when you come to the platform you're going to have lots of Fes and I think the only way to
54:38: When you were talking about the build process, that kind of made me think of


1:26:05: like you know bring in more people is like you know you know showcase there's
54:42: something that I really enjoyed working on.


1:26:10: like lots of like you know kind of different people um like on the platform lots of different kind of communities but if
54:47: It's kind of one of those things where


1:26:16: there's like you know lots of Fes that becomes it becomes kind of difficult like you know like I mentioned earlier sort of like it self perpetuating uh but
54:49: it's really important, but it's just so invisible.


1:26:24: I think it's also like a thing of scale once we kind of like you know as we keep growing there's going to be more different groups of people and the
54:56: And what I did behind the scenes


1:26:31: communities you know that are different kind of fandoms or just different kind of like you know demographics uh they're
54:58: is I basically reworked the entire


1:26:37: going to get bigger and it's going to help bring you know uh this going to help like people who are from those
55:02: build process of FrooxEngine, almost.


1:26:42: demographics find their groups much easier once they're like once there's like more of them yeah res resonates uh all about
55:08: Since FrooxEngine has been


1:26:51: like self-expression and and stuff and being who you want to be and building what you want to build and F's kind of
55:10: around for a while, and it's been through


1:26:58: got that down pat and so that's probably why you see a lot of them but you know everybody can do that it's it's not just
55:14: many updates to C Sharp and C Sharp's project system, we were still


1:27:05: those people everyone it's made for every every person to come together and
55:18: using the legacy


1:27:11: uh hang out and build and just be you yes no matter who you are yeah we try to
55:22: C Sharp project format, which is called MSBuild.


1:27:19: make this platform kind of inclusive and for everyone um it's like you know it's our goal
55:27: And that really only works in


1:27:25: uh we don't want like any like we don't want anybody to feel unwelcome unless you know like well I
55:30: something like Visual Studio these days.


1:27:31: mean asterisk you know because there's like you know we don't want hate groups you know people like that so that that
55:33: It's kind of hard to work with, it's not quite as robust as the newer


1:27:37: one like we would have issue with but generally we want this platform TOA be
55:38: build system for .NET, and as a result


1:27:42: everyone yeah also we're coming up on the hour and a half mark okay so we have
55:44: there would oftentimes be


1:27:47: about like uh 30 minutes left uh think like to wor stud end every question so we'll see like how they keep piling but
55:49: like, you'd have like


1:27:53: we might need to stop them at a certain point uh so next question or Moon Claw
55:50: weird issues if you wanted to add packages and stuff, and


1:28:00: is rendering performance being looked into before the move to Sauce as well? From my experience, when it isn't CPU bound, the rendering can still be quite
55:54: you could only use something like Visual Studio as your IDE of choice to boot.


1:28:06: heavy, forcing me to drop resolution. So there's actually a thing that Sauce will help with. We don't want to, like,
56:01: And I


1:28:13: invest super much time into you know the current rendering Pipeline with unity because the goal is to move away from
56:03: saw that, and I


1:28:19: which means any time we invest you know in improving
56:07: decided to poke at it, and it actually ended up being


1:28:25: Unity is essentially going to be wasted, and it's going to delay the eventual big switch to Sauce. It's going to, like, you
56:11: a lot easier than I anticipated because Microsoft provides a nice


1:28:33: know, introduce... it's going to use a much more modern rendering method. Right now we're using a deferred method, which can be
56:15: little tool to upgrade your projects, and so what I did is I


1:28:40: quite heavy like a sort of like you know like lot of like memory bandd and so on
56:18: went through and I upgraded all of the projects to the new


1:28:46: With Sauce, it's going to use something called clustered forward rendering, which allows lots of
56:22: C Sharp format, which means that we can take advantage of


1:28:53: Dynamics lights while also being much lighter on the hardware so that should
56:26: the much nicer project files, which means it's easier


1:28:58: improve like you know rendering performance on itself uh and once we know make the move we can look for more
56:31: to edit them directly and add actions and stuff


1:29:04: areas to kind of optimize things you know introduce things like imposters you know like more LOD systems and things
56:35: and it also means the engine


1:29:10: like that um so yeah and that's pretty much like
56:39: can now be built in IDEs other than VS Code.


1:29:17: unless there's like you know any sort of like very obvious long hanging fruit uh with like you know rendering
56:43: You could use, or VS Code, Visual Studio


1:29:23: that like you know would take us like you know say less than a day or maybe just a few days to like you know get a
56:47: proper is what I meant to say there. But now you can build it in, like,


1:29:29: significant boost in performance we're probably not going to invest like you know much time into it and instead like
56:50: VS Code, or like, you could build it in


1:29:34: want to invest it into the move away from Unity. And next question, Restot Prime: how
56:56: you could probably build it in like


1:29:42: straightforward is the conversion of our current particles to PhotonDust? I assume the goal is seamless to the point of
56:59: Rider if you pay for Rider, you could build it, you could even build the engine


1:29:47: them looking and behaving identically, but is there anything the current particles can do that PhotonDust won't, or that they do in a
57:02: from the command line now, which is really really good for


1:29:54: different enough way that it will have to be manually fixed so the conversion I can't really answer it exactly because
57:06: yeah, like automated builds. That's a big thing I did


1:30:00: the conversion actually isn't written yet uh however uh the main focus right
57:11: that nobody saw, but I'm really really proud about.
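
For a sense of what that migration means in practice: an SDK-style project file can be as small as the sketch below, is easy to edit by hand, and builds from the command line with a plain dotnet build. This is a generic illustration, not FrooxEngine's actual project file; the package reference is only an example.

<syntaxhighlight lang="xml">
<!-- Minimal SDK-style project file (illustrative only). Source files are
     picked up automatically instead of being listed one by one. -->
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net9.0</TargetFramework>
    <LangVersion>latest</LangVersion>
  </PropertyGroup>
  <ItemGroup>
    <!-- NuGet packages are plain references; no packages.config needed. -->
    <PackageReference Include="Newtonsoft.Json" Version="13.0.3" />
  </ItemGroup>
</Project>
</syntaxhighlight>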


1:30:06: now is actually feature pity so I actually have like a list uh and I kind of posted this like you know inog if
57:14: It's one of those things where it doesn't show on the surface, but


1:30:12: you're curious where I have like you know all like the things that the Legacy system has and be kind of like working
57:18: it makes our lives as developers way easier, because I had


1:30:18: through that list, making sure that PhotonDust has, you know, the same or equivalent functionality. The goal is
57:23: so many times where I would literally lose sometimes even hours


1:30:25: to make it uh so it's pretty much equivalent um so like it converts and it will look either the same or like you
57:27: of time just trying to deal with some kind of problem, and


1:30:31: know just very close so hopefully you know there won't be like things that
57:30: having those problems kind of resolved, and have the system kind of be nicer


1:30:37: like are too different uh however you know sometimes they kind of those things
57:34: it allows us to invest more of our time into actually


1:30:42: kind of become apparent like during the testing period so once those things here come out we'll look at them and we'll be
57:39: building the things we want to build instead of dealing with project build issues.


1:30:48: like okay like this is easy enough you know to fix uh or maybe this one's a little bit more complicated maybe we
57:43: One of the problems, for example, that's


1:30:55: just bring it close enough and ask people you know to manually fix things but uh we'll we'll have to see how this
57:46: kind of weird, like one of those weird things is with ProtoFlux.


1:31:00: kind of goes uh sometimes it's kind of hard to like you know noce before it actually
57:50: Because for ProtoFlux, it's technically a separate system


1:31:05: happens but um it should have like a feature part
57:54: and we have a project that actually analyses all the nodes


1:31:10: like well it's going to have like a feature par of it like inov the current system so like I kind of expect most of things to just
57:57: and generates C-Sharp code that binds it to Resonite.


1:31:17: work uh next we have F bipolar bear is there a way to stop the dasd particles
58:03: The problem is, with all the MSBuild,


1:31:23: from being shown streaming uh I don't think there is I think we would have to like Implement that does it show it does
58:06: for some reason, even if the


1:31:29: show yeah uh next a key
58:10: project that generates that code runs first


1:31:35: CER: what things are currently planned for the whole performance update? I think .NET is part of it, for example. So, we
58:14: the build process doesn't see any of the new files


1:31:41: actually answered this one earlier uh so I'm not going to go um I'm not going to
58:19: in that same build pipeline.


1:31:47: like you know go into details on this one uh but essentially moving to do net 9 because uh we're originally going
58:22: So if we ever added a new node, we would compile it and it would fail


1:31:53: for .NET 8, but then .NET 9 released, you know, literally just a week ago or so. In short, currently there's two
58:26: because it's like, oh, this code doesn't exist


1:32:02: main systems that need to be moved fully into FrooxEngine because they're hybrid systems: that's the particle system, which
58:30: even though it actually exists at the time, it just doesn't see it.


1:32:08: is being worked on right now, and there's the sound system, which Cyro did some work on. Once those systems are fully in Froox-
58:34: With the changes Cyro made, the problem is gone. We don't have to talk about this whole thing.


1:32:14: Engine, we're going to rework how FrooxEngine interfaces with Unity, and then we're going to move it out into its own
58:39: But the really big thing is it prepares Resonite for more


1:32:21: process uh to use net 9 and it's going to be you know the big performance uplift from that um we're
58:43: automated build pipeline, which is something we've been trying to move towards


1:32:29: going to post you know this this is going to be like you know archived so like uh um if you're curious in more
58:47: to because it's going to be one of the things that's going to save us a lot more time as well


1:32:35: like details I recommend you know watching it later because we went into quite detail on this like earlier uh on
58:51: that's going to make it so we can actually just push code


1:32:41: the stream uh so this question with in chat uh Shadow X in the future could
58:54: into the repository. There's automated tests that are going to run, there's going to be automated


1:32:49: there be a could there be a way to override values not just per user but infering content
58:58: builds, the binaries are automatically going to be uploaded and it's just going to


1:32:54: for example override active enable state of a slot or component for specific camera basically same concept RTO but
59:02: remove all of the manual work that happens all the time.


1:33:00: more flexible um so probably not like
59:06: It makes bringing on people like me easier too.


1:33:05: this um the problem like with ar like is like um if you want to override certain
59:09: It makes it easier to bring more engineers as well because now they don't have to deal with those weird


1:33:11: like certain things um like for example in
59:14: issues. I know Prime also lost


1:33:17: rendering um when the rendering is happening all the you know work on updating the world is already kind of
59:18: sometimes he lost a day just dealing with project issues


1:33:24: you know it's complete which means like the the renderer actually has like much more limited functionality on what it
59:22: and a day you could spend working on other stuff


1:33:30: can change um what probably the best way to handle situations like that is you you
59:26: and instead you have to just make things work.


1:33:39: have like you know multiple kind of copies of whatever you want to change or like whatever system you want to have
59:30: Thank you Cyro for making this.


1:33:45: and you know you mark each one to show in a different context but like you actually you need to kind of manually
59:34: Things like this, even though they're not visible to the community, they help a lot.


1:33:51: set them up because consider a scenario where you know you override an active
59:41: Next, we have


1:33:57: enabled State um that component like you know it might have all kind of complex
59:43: a question from FantasticMrFoxBox.


1:34:03: functionality maybe there's even like you know protox you know or some other components that are reading the active
59:47: With sound system updates, can we get a way to capture a user's voice with


1:34:09: State and doing things you know based on it being enabled or disabled and once you kind of get into
59:50: a permission and import audio streams dynamically into the world?


1:34:15: that sort of Realm like you know the the effect of that single you know enabled State can be very complex where you can
59:55: This would allow us to fully implement the ham radio stuff into Resonite and allow us


1:34:22: literally have, like, you know, a bunch of ProtoFlux that does a bunch of modifications to the scene when that
59:59: to ditch using external browser support to support audio.


1:34:28: state changes and it's too complex for something like the render kind of resolve because you you would essentially have to make like you would
01:00:04: So I'm not sure if I've


1:34:35: have to like run a whole another update on the world just to kind of resolve those differences and the complexity of
01:00:05: I don't understand enough about how you want to capture it


1:34:42: the system essentially explodes um so probably not in that
01:00:11: But since we'll be handling all the audio rendering


1:34:47: sense um if it give us more details on what do you want to achieve um you can
01:00:15: we'll be able to build a virtual microphone that actually captures


1:34:53: give a more kind of specific answer but uh um this is pretty much know how much
01:00:19: spatialized audio from its own position, wherever it is in the world.


1:34:58: I can say, like, on this in general. GrandUK: was the locomotion
01:00:23: So that's one of the things you'll be able to do. You'll be able to bring the camera


1:35:04: animation system one of the Unity systems that needs to be implemented in FrooxEngine, or was it something else? That
01:00:27: and have it stream the audio device.


1:35:09: one was something else um it came like a as a sort of like a part of like kind of business business contracts um it's not
01:00:30: So I would say yes on that part, on the


1:35:18: something it's not something I kind of wanted to prioritize myself but um
01:00:35: kind of capture.


1:35:24: like it uh it's kind of complicated situation but unfortunately it was
01:00:37: I don't know...


1:35:29: like it was necessary at the time and I'm not like super happy with like how
01:00:39: I think I know what they mean.


1:35:35: the whole thing kind of like went like because it kind of came like you know at
01:00:45: Am I correct in assuming


1:35:42: like wrong time and it's it was essentially like you know a
01:00:48: that you want a way to import multiple


1:35:49: lot of because we don't have like a lot of systems for dealing with animation which would have made like you know
01:00:51: streams into the world from a single user? Is that what you're talking about?


1:35:54: these things much easier and like we haven't worked ik itself which would make things also easier so there was
01:00:58: You'll probably have to wait for them.


1:36:00: like a lot of kind of foundational war that was not there um and
01:01:00: Yeah, wait a second.


1:36:06: also like the timeline was like you know kind of really short um so it was pretty much like you
01:01:05: We might


1:36:13: know like just month of like constant crunch just kind of like working on it and that wasn't enough time to kind of
01:01:05: get back to this question.


1:36:19: you know get it through um so it's it is a complicated situation
01:01:10: You'll essentially be able to render audio out


1:36:27: unfortunately um and it's a thing you know it can happen sometimes with businesses like
01:01:13: from any point in the game in addition to rendering for the user.
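
A very rough sketch of the concept: once FrooxEngine renders audio itself, a listener is just a position in the world whose mix can be routed to a clip, a device, or a stream. Everything below is hypothetical (the class names, the crude distance attenuation); it only illustrates the idea, not the planned API.

<syntaxhighlight lang="csharp">
using System;
using System.Collections.Generic;
using System.Numerics;

// Hypothetical virtual listener that mixes world audio sources at an
// arbitrary position, independent of the user's own head and output device.
public record WorldAudioSource(Vector3 Position, Func<int, float> Sample);

public class VirtualListener
{
    public Vector3 Position { get; set; }
    private readonly List<WorldAudioSource> _sources = new();

    public void AddSource(WorldAudioSource source) => _sources.Add(source);

    // Render one block of mono samples as heard from this listener's position.
    public float[] Render(int sampleCount)
    {
        var buffer = new float[sampleCount];
        foreach (var src in _sources)
        {
            // Crude inverse-distance attenuation stands in for real spatialization.
            float gain = 1f / MathF.Max(1f, Vector3.Distance(Position, src.Position));
            for (int i = 0; i < sampleCount; i++)
                buffer[i] += src.Sample(i) * gain;
        }
        return buffer; // could be written to a clip, an audio device, or a network stream
    }
}

public static class Demo
{
    public static void Main()
    {
        var listener = new VirtualListener { Position = new Vector3(0, 0, 5) };
        listener.AddSource(new WorldAudioSource(Vector3.Zero,
            i => MathF.Sin(2 * MathF.PI * 440 * i / 48000f))); // 440 Hz test tone
        var block = listener.Render(480); // 10 ms at 48 kHz
        Console.WriteLine($"Rendered {block.Length} samples, first = {block[0]:F3}");
    }
}
</syntaxhighlight>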


1:36:32: you you end up like you know kind of like in a situation where like you don't really have a good path like you know
01:01:17: And then it becomes a question what do we want to do? Do we want to record an audio clip?


1:36:39: like so you you just have to kind of like you know deal with it um we want to like eliminate those kind of situations
01:01:21: Do we want to output it into another audio device so we can stream it into something?


1:36:45: and we had like a number of conversations kind of internally to see like how do we prevent this from
01:01:25: So that will work. If you want to import audio back in


1:36:50: happening again you know like like how do we make sure don't end up in a situation where we have to do something
01:01:31: that's probably a separate thing.


1:36:57: like that and we have like you know kind of like much better understanding you know of the problem now like we where if
01:01:33: That's probably not going to come as part of it. We'll see.


1:37:06: situation like this were to occur again we're going to be you know kind of better equipped like you know um on the
01:01:37: If you have any kind of clarification just ask us more and we'll get back to this.


1:37:13: communication side how do we like deal with it and how do we make sure it doesn't kind of like you know mess with our kind of priorities and things we
01:01:43: Next we have


1:37:20: need to focus on um um so I know it was like you know
01:01:46: EpicEston is asking, will the headless client be upgraded to .NET 9?


1:37:26: not it was kind of like M the situation I'm not like you know happy with like how I handled some of the things with it
01:01:50: Yes. Plan to do this soon.


1:37:33: um but it's it's pretty much it's it's it's it is what it is and like you know
01:01:53: It should be mostly just a flip of a switch, we don't expect


1:37:39: the best thing we can do right now is kind of you know learn from it and like try to like improve
01:01:57: big issues. One of the things we want to do is we're going to make announcements


1:37:45: things um next question is Shadow X uh how are you compiling the questions from
01:02:01: so people know this is coming, you can prepare your tooling


1:37:51: the stream chats? I thought Twitch nodes were broken. No, it actually works. We
01:02:05: make sure whatever scripts you're using


1:37:56: do have this thing here where we kind of like you know going through the questions this is an older one I need to grab a bigger one but it's sort of like
01:02:09: to upload your headlesses, they don't just explode.


1:38:03: you know sorting the questions for us um the question marking it
01:02:14: There's a GitHub issue on it and I'll try to make the announcement


1:38:10: yeah um the twitch noes also like what was actually broken where uh the
01:02:17: in a bit, probably sometime next week.


1:38:16: displays of them and it got fixed very recently like I think last I pushed to update for it like last week
01:02:22: Get people ready. Alex2PI is asking


1:38:26: So next we have EpicEston asking: mods were able to access the internal array to edit things like, uh,
01:02:25: makes me wonder what's currently a culprit of most crashes, at least on my computer


1:38:34: color over lifetime and other over-lifetime values. Will those be properly converted? Yes, those systems, they've been implemented
01:02:29: I must have seen information that Unity crashes, couldn't you just restart Unity?


1:38:40: for for Photon dust uh so they're going to be converted to equivalents uh so they just going to you know that should
01:02:35: We also had a discussion about couldn't you just


1:38:46: just work out of your box the good news is there's also new like uh new modules because uh um
01:02:39: I mean, so


1:38:54: PhotonDust, the new particle system, is designed to be way more
01:02:41: for the first part of the question, crashes, they can have lots of reasons


1:39:00: modular um so there's like modules that uh instead of just you know the internal
01:02:47: it's really hard to say, like in general


1:39:06: array you can also specify the color of your lifetime using a texture or using you know start and starting and ending
01:02:49: you pretty much have to send us the crash log, we look at it, we look at the call stack and be like


1:39:13: color you can also do starting ending color in the like a HSV like a color
01:02:53: this is probably causing it, so it's kind of hard to say


1:39:19: space so there's like a lot of new color effects that we can do that's going to give you more control over the particle
01:02:58: in general, for the part where we just restart Unity


1:39:25: system and we can always ear more because we now have full control over the system so those modules are very
01:03:02: I mean, it's kind of what a crash is, it essentially breaks


1:39:31: easy to write this next one is um a little like
01:03:07: and then it has to shut down and you have to start


1:39:37: moderation focused do you mind if I uh to answer
01:03:09: it again, so in a way you're kind of restarting Unity


1:39:42: it okay let me take a breath for this one because it's a long one on the topic
01:03:13: it's just that the restart is kind of forced


1:39:48: of the platform being for everyone why was the nipples allowed rule passed for the majority of people in the world/ intern including me are not going to
01:03:19: but this actually kind of ties


1:39:54: want to see them in public sessions I will admit that it has been extremely rare occurrence of seeing someone with them shown in a public session and will
01:03:25: because if you've been here earlier


1:40:01: it be possible for me to request things like this both to the team and other people without having my moo belief question at every turn
01:03:28: we've been talking about how FrooxEngine is going to essentially be moved into


1:40:10: so the the reason why we wanted to take a stand on uh top equality um that's
01:03:32: its own process, and then Unity is going to be handling the rendering


1:40:17: what this issue is called by the way it's called top equality is um because
01:03:36: one of the things that I'm considering as part of the design is so


1:40:24: is ultimately like if if a man can have a
01:03:40: the Unity can actually be restarted


1:40:31: be chest you know why why can't a woman the only difference is that on average women have larger chests than men and I
01:03:44: maybe. So if Unity happens to crash, we can keep


1:40:39: think we're also a an EU based company right so no I'm from Europe okay this is
01:03:48: running FrooxEngine, start a new Unity, and just


1:40:46: this is the stance in a lot of places in Europe too where um top equality is just sort of the norm um and we want to we
01:03:52: reinitialize everything. So I do want to make that part of it


1:40:53: want to normalize that because we we do need this kind of
01:03:55: just in general to make the system more robust, so it's possible


1:40:59: equality like why can't a woman have you know their why can't a woman be topless
01:04:02: but TBD


1:41:05: you know in a nonsexual context there's there's just no precedent for it
01:04:03: we'll see how that kind of goes
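
The crash-recovery idea being considered boils down to a watchdog pattern: the engine process owns all the state and supervises a separate renderer process, relaunching it if it dies. The sketch below is a generic illustration under that assumption; the executable name and restart policy are placeholders, not the actual planned architecture.

<syntaxhighlight lang="csharp">
using System;
using System.Diagnostics;
using System.Threading;

// Hypothetical watchdog: keep a separate renderer process alive while this
// (engine) process keeps running with its own state untouched.
public static class RendererWatchdog
{
    public static void Main()
    {
        while (true)
        {
            // "Renderer.exe" is a placeholder name for the separate rendering process.
            using var renderer = Process.Start(new ProcessStartInfo
            {
                FileName = "Renderer.exe",
                UseShellExecute = false,
            });

            if (renderer == null)
            {
                Console.WriteLine("Failed to start renderer, retrying in 5 s...");
                Thread.Sleep(5000);
                continue;
            }

            renderer.WaitForExit();
            Console.WriteLine($"Renderer exited with code {renderer.ExitCode}, restarting...");
            // Engine state lives in this process, so nothing is lost; the new
            // renderer instance just has to be re-initialized with the scene.
        }
    }
}
</syntaxhighlight>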


1:41:11: um and let me see if I'm question again
01:04:08: GrandUK is asking, I have heard complaints from someone about the headless being a


1:41:16: I there's also like a thing with this is like we like you know we believe like in equality and we believe like in a lot of
01:04:11: patron reward. This was particularly a complaint about communities that do want to host events


1:41:22: progress so we know to take stance on those things but also like we kind of give you tools to kind of deal with
01:04:15: essentially forced into it to keep events going if their host crashes. Are there


1:41:28: those so like if it's something you really don't want to see there's an avatar block function you can you know
01:04:19: any plans later to remove the Patreon requirement for the headlesses when things are more stable


1:41:33: block those people they will like not appear to you um there's probably know more things we can do in that area as
01:04:23: and performant? So at some point we'll probably


1:41:39: well um but ultimately we want to be like you know very kind of like open and
01:04:28: make it more open. Our


1:41:45: very kind of progressive as a company when it comes to these things um there's also like I would really recommend like
01:04:31: tentative goal, and this is not set in stone, so things


1:41:51: uh asking this question also in the moderation like office ours um because
01:04:35: might change. Our tentative goal is we want to offer


1:41:57: the moderation team is you know the one that kind of deals with this a lot of detail and they kind have like they're going to have a lot more kind of context
01:04:39: a service where we make it easy to auto-spin headlesses


1:42:02: for these things um but also like you know I don't necessarily believe that like you
01:04:43: and move Patreon to that, so if you


1:42:10: know it's majority of the people on the internet you know like having that stance like it's there's there's a good
01:04:47: support us financially you will get a certain amount of


1:42:18: CH of like you know kind of people like who are kind of like you know very open about this and um I feel like you know
01:04:51: hours for the headlesses and we're going to make it very easy to host, and if you want to self-host


1:42:25: that the trun is kind of grow people are kind of getting like you know more open with things but I do recommend like you know
01:04:55: we're going to give you the headlaces. We don't have to add it


1:42:31: bringing this like with moderation of his hours like they're going to be able to give you like kind of much much kind of better better answer to this because
01:04:59: from the business perspective because Patreon is one of the


1:42:38: they've been dealing with this topic uh you know for a while um so U you know
01:05:03: things that's supporting the platform and it's allowing us to work on it.


1:42:46: take take what we say like a bit a little bit of a ground of sell I don't want to you know kind of step on the moderation themes like those with that
01:05:07: So we don't want to compromise that because


1:42:53: yeah um I was going to say something to I was going to say something to wrap it
01:05:13: if we do something with that


1:42:58: up what was I going to say um so next one what you want
01:05:15: it ends up hurting our revenue stream, then we're not able to


1:43:06: to yeah I was just going to I was going to say I don't know what I don't know what you mean by um cuz I commented um I
01:05:18: pay people on our team, and then we're not able to work on


1:43:13: don't know what you mean by um this rule being exploited um by transgender males
01:05:23: things and things end up kind of bad.


1:43:19: and females but being transgender has nothing to do with this um if you if you're if you want to be a
01:05:28: We do want it to be accessible to as many people as possible, but we're sort of


1:43:28: boy or want to be a girl um that has no bearing on this Rule and that's part of the quoted too is like you know because
01:05:31: balancing it with the business side of things.


1:43:34: it kind of like erases that kind of despire to like so like it doesn't really you know matter if you do feel
01:05:37: Next one, TroyBorg.


1:43:41: there's like you know some like exploit you can always you know you can file moderation reports or you can file like
01:05:39: Cyro also did the FFT node a while ago. Could having the audio system make FFTs part of the game,


1:43:46: you know um you can um you know bring these like to the moderation kind of
01:05:43: like waveform visuals, or be able to do better detection of bass for music effects?


1:43:52: office hour or and discuss these there um and we can kind of you know see what is happening and then we sort of
01:05:47: That's actually separate from that, because that happens fully within Resonite.


1:43:58: evaluate does it fit like you know with our philosophy does it fit with our Like rules or does it not so you
01:05:52: The audio system is more about rendering the audio output


1:44:05: can if you're like in a field or some issue you can make us aware of it uh we can promise you know that we're going to
01:05:56: and pushing it to your audio device.


1:44:11: agree with you uh that we're going to have you know same view on it but we can at very least you know look at it and
01:06:02: Next we have, I'm kind of just speeding through these questions because we have a bunch.


1:44:17: like listen to what we have to say on that uh so next we have Grand UK uh hear
01:06:07: Skywakitsune. A few people have mentioned that they are not happy with the new working system and how good


1:44:24: say I have heard from someone that they tried to report someone to the moderation team but because they were connected to the team nothing happened
01:06:11: it looks. I have plans to continue to improve that. It will be a specialized update but people


1:44:31: of it and they ended up banned instead I can't confirm 100% that what was said
01:06:15: are still not happy. We can always improve things. We just released


1:44:37: happened and I know nothing can be said cases but in case where there are conflicts of interest like above what
01:06:19: an update which


1:44:43: can be done and how can be sure where we won't have wrongful consequences B for trying to uphold the US and guidelines
01:06:23: integrates some of the community settings which would make it look way better.


1:44:49: for everyone so um I doesn't have like super many details but I can kind of
01:06:29: For things that are like, you know,


1:44:55: talk you know in general sometimes we do have cases where
01:06:31: that people still find as issues with it, we will need reports on those, because


1:45:01: um there's actually two things like we to have cases you know where there's reports against you know people who are
01:06:35: right now


1:45:07: on the moderation team or even on the resonite theme um if it's a report
01:06:38: we're not sure


1:45:12: against someone who's on the moderation team that will usually go to the moderation Le uh and those will like you
01:06:41: after the update, we're not sure what's making people not happy about it.


1:45:18: know those people can of deal with it that will investigate uh we actually have multip moderation leads as well
01:06:45: We have more concrete stuff to work with


1:45:24: that way you know it's not like it's not like you know there's a single person who can you know just kind of bury the
01:06:48: as well as people make reports so we can know


1:45:30: thing but there's like multiple people who all can see the same data and then you know sort of check on each other uh
01:06:52: what do we focus on. But yes, in general, we are always


1:45:36: if it happen something with a team like or like uh you know is an issue with somebody on the actual resonite team uh
01:06:56: willing to improve things. We want to


1:45:42: usually that goes um um that goes like the Canadian kid who's like deal with these things and he brings these things
01:07:01: make


1:45:50: uh with me like if a you know try to kind of like involved in that we have
01:07:04: essentially want to make it as polished


1:45:55: cases like we kind of had to deal like with difficult situations before um uh both kind of like on the theme uh
01:07:06: as it can be, but we also need more kind of hard


1:46:03: but in the moderation team as well and I can't really go into details because um
01:07:09: data to work with so we know where to invest our time.


1:46:09: you know there's s of kind of like you know privacy kind of like issues with that uh I can tell you so there's been
01:07:18: Next we have Terborg. What is causing Virulence first


1:46:16: cases where people on the moderation team they had to permanently ban some
01:07:21: sometimes when Froox moves? I'm not sure. It could be just the bloom on death


1:46:22: people who who were their friends even long time friends because uh they did
01:07:25: thing, maybe.


1:46:28: something wrong and this caused like you know people on the moderation team like a lot
01:07:27: It's his radiant yellow complexion.


1:46:35: of kind of distress but they still like you know made the decision like to like you know B their friend because
01:07:33: Your resplendent visage.


1:46:43: they they want to like you know uphold like you know the moderation rules rules above all else
01:07:35: This is actually Erlage. What was the answer to this?


1:46:49: so I've kind of looked at you know a few of those cases because I don't want to like you know make sure like things are
01:07:39: So these are just looks like questions within the chat.


1:46:55: uh kind of going okay there's like no kind of like favoritism kind of
01:07:43: Erlage86. Who is the second person here on the camera?


1:47:00: happening um I've been kind of involved in F those cases as well um like kind of
01:07:46: This is Cyro. He's our engineering intern.


1:47:06: you know sort of like you know part of the kind of like know discussion of it and so on so those are there's been a
01:07:50: Hello. Hi. How you doing guys? It's me.


1:47:12: number of kind of difficult kind of discussions on those and every single one if there was sufficient evidence for
01:07:57: Next we have


1:47:20: somebody's like in wrongdoing even if we knew that person like personally even if they were connected
01:07:58: SkymoKitsum. Questions from Tara Whitel who can't watch the stream now.


1:47:26: to the team they were still banned um there's one thing I kind of
01:08:03: If video players are going to be updated with Core and VLC, I have


1:47:32: noticed is also kind of like in general is like usually when people do get banned
01:08:06: heard from several builders that players use very outdated Core.


1:47:39: um they will they're almost like never kind of fruitful about like the reason because
01:08:10: Yes, the system we use right now, it's a plugin called UMP


1:47:45: uh we do make sure as part of like the moderation if somebody ends up being banned usually they will receive like in
01:08:14: Universal Media Player, which is built on VLC, unfortunately


1:47:52: a warnings first depending in what the sity um if they end up being banned the
01:08:18: hasn't been updated in years, which means it's using an older


1:48:00: reasoning this often like you know this explained to them often times there's somebody from the team who's actually
01:08:22: version of it. We've been looking into upgrading


1:48:07: going to have sit down with them and be like we have this evidence you know this kind of happened you're getting banned you know
01:08:26: to the actual official VLC plugin. The problem is


1:48:13: for these reasons they are made aware of it and in a lot of cases they like those people would you
01:08:30: it's still not mature enough in some ways.


1:48:20: know come out and give completely different reason you know for why they're banned
01:08:35: The last I remember, there's issues where you cannot


1:48:27: because and it kind of puts us in a difficult situation because we value you know privacy and sometimes giving
01:08:38: have more than one video at a time. You can only have


1:48:33: details to the public could put innocent people who are involved in those incidents you know at
01:08:42: one, and if you try to do another one, it just explodes.


1:48:39: risk um so we cannot really know we cannot say you know the person was
01:08:47: There's other things we can look


1:48:45: actually banned for these reasons um but it is a thing that kind of
01:08:49: into, like alternative rendering engines, but there's also


1:48:50: happens so the only thing I can kind of request is you know like
01:08:52: potential time and money investment. If the pros


1:48:56: be a more kind of skeptical about like what people say about these things if
01:08:57: outweigh the cons, we can


1:49:01: you see something you know like if you like believe like you can always send us a report we will like look at it we will
01:09:00: consider that we might invest into one, but we need to do some testing


1:49:06: evaluate it we will see like what evidence we have um but ultimately like you know we will
01:09:05: there and see how well it works.


1:49:13: not necessarily tell you the like we will not tell you the details of how it was resolved to protect you know the
01:09:09: It's unfortunately a difficult situation because the solutions


1:49:18: Privacy um you know and potential security of people involved
01:09:12: are limited.


1:49:24: so I will also oh sorry no go ahead uh I was just going to say that we're we're
01:09:17: It's something we want to improve,


1:49:31: just about at 10 minute Mark so I think we should uh close questions okay uh so
01:09:20: but it's also difficult to work with, unfortunately.
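For reference, the official libVLC bindings on the .NET side (LibVLCSharp) look roughly like the sketch below. This is only to give a feel for what a maintained VLC-based backend offers; it is not what Resonite's current UMP-based player does, and whether the official VLC Unity plugin built on top of these bindings is mature enough is exactly the open question discussed above. The URL is a placeholder.

```csharp
using System;
using LibVLCSharp.Shared;

// Minimal playback sketch with LibVLCSharp, the official .NET bindings around
// libVLC. Purely illustrative; the VLC Unity plugin discussed above has its
// own (different) integration layer, and this is not Resonite's player code.
class Program
{
    static void Main()
    {
        Core.Initialize();                       // locate the native libvlc libraries
        using var libvlc = new LibVLC();
        using var media = new Media(libvlc, new Uri("https://example.com/video.mp4"));
        using var player = new MediaPlayer(media);

        player.Play();
        Console.WriteLine("Playing... press Enter to stop.");
        Console.ReadLine();
        player.Stop();
    }
}
```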


1:49:36: we're going to close the questions like will'll um uh so if you send question
01:09:26: Can I comment on the next one?


1:49:42: right now we have like a few of them like coming in uh if you send any questions after this point uh we can
01:09:32: Rasmus0211


1:49:49: guarantee we're going to answer at one uh we'll try to answer as many as we can that are still like left but uh no
01:09:32: asks, thoughts on about 75% of all users being in private worlds


1:49:55: guarantees at the point but I will at very least try to make it uh
01:09:36: around the clock. Often new users mention they see practically no enticing


1:50:01: um my well at least like you know the ones that we have on the list right now
01:09:40: worlds. This is not a Resonite


1:50:07: yeah so the next one uh epic e does the question mark need to be at the end of
01:09:44: problem. This is a problem of scale.


1:50:12: the question uh I think it doesn't need to be I think I can like put like put it in medal but I would just to be sure I I
01:09:48: All platforms have a


1:50:19: would put it like actually no yeah I literally see a question that has like question mark in the middle of it so no
01:09:53: pretty wide majority of people who just kind of want


1:50:25: it doesn't need to be at at the end um Rasmus uh
01:09:56: to hang out and not really be bothered.


1:50:31: 0211 any more Flex notes in the works if yes which excites you the
01:10:01: Unfortunately, we're not the biggest platform out


1:50:37: most so you want to take this one you're working on so new ones oh yeah um
01:10:04: there. We're still kind of small.


1:50:46: so yeah which ones am I working on again this the one I just
01:10:09: And as we


1:50:51: took when I just oh yes um there is a there is a new protox that I'm
01:10:10: grow, that problem will undoubtedly get better.


1:50:56: particularly excited about so you know how um for for those of you who do
01:10:16: It's not really a


1:51:03: protox there is currently a node where you can you know perform a ray cast which shoots an infinitely Thin Line and
01:10:18: technical problem, it's more like a social one, because people


1:51:09: whenever it hits you know you can get the position you can get the direction um stuff like that what I'm
01:10:23: behave in a certain way, and it's really hard


1:51:16: going to implement is I'm going to implement um sweeping or um
01:10:26: to change that. There's some things we want to do to


1:51:23: I think it's it's also been called like shape like shape casting or whatever um unlike some other platforms but it's
01:10:30: entice people to make it easier to discover things, like


1:51:30: essentially a way of doing thick Ray casts using a shape that you essentially
01:10:34: we were talking earlier, adding an events UI, so you can see these are the things


1:51:35: like extrude in the direction that you want it to go so uh like if you wanted
01:10:38: that are coming up, these are going to be public events that you can join. Right now, I


1:51:40: to shoot a sphere in a direction you know you would essentially be shooting a
01:10:42: believe there's the Creator Jam event that's going on, and it's always


1:51:47: capsule however long you want to shoot it and anything within there it would hit or in this case you know the first
01:10:47: every weekend, it's public to everyone.


1:51:54: thing it hits um you know it'll return like basically exactly like a ray cast
01:10:51: But it depends what people are coming in for, because people


1:51:59: but it's thick and you can do that with different shapes like a sphere or a cube
01:10:54: might come in, and they don't actually want to join public events, they want to go into those


1:52:05: or um I think you should also be able to do it with like convex holes right uh
01:10:58: private worlds. But the question is, how do you make those people


1:52:12: I'm actually not sure we have that one maybe okay that was going to be the very
01:11:03: discover the friend groups and hang out


1:52:18: at the very least you'll be able to do it with spheres and cubes and cylinders and capsules and and stuff um but I
01:11:07: in those worlds? It's a challenging problem,


1:52:25: think that'll be very useful especially for those of you who make Vehicles who don't want your rast to uh you know
01:11:11: especially from the platform perspective, because we can't just force


1:52:32: shoot between like two infinitely close triangles in geometry and now your car is flying across the map yeah thick Ray
01:11:15: people into public worlds. People


1:52:39: casts yeah thick Ray casts also should be like pretty easy one to because we do
01:11:17: will host whatever worlds they like, but


1:52:44: have like a lot of the functionality like it's already in the part of the be physics engine and we have exposed like
01:11:22: always want to see what kind of tools we can give to entice people


1:52:49: we use it internally in our own engine like for for example the laser is actually using like you know sweeps to
01:11:27: and make worlds and socialization easier for


1:52:56: like behave a bit better and this is going to expose them so you can also use them from
01:11:30: them to discover. But like Cyro said, it is a thing that


1:53:02: prolex uh this one this one seems to be asking
01:11:34: gets better with scale, once we can grow more.


1:53:08: something in the chat so I'm going to skip this one uh tribe gra W VR question
01:11:40: There's a


1:53:13: FRS absolutely love your steel the videos a genius what app are you using to do those scans yes s to to steal of
01:11:42: number of events, though. If people go right now,


1:53:20: reality so for most of my scans uh I'm using a software called agis of meta
01:11:47: since we don't have the events UI in-game, if you go into the


1:53:26: shape uh it's a photog gometry software and essentially you you take lots of pictures of the subject from alls of
01:11:51: Resonite Discord,


1:53:31: different angles um and like you know like it sort of it's able to like you know do those
01:11:57: if you go into


1:53:37: construction figures out you know based on the patterns and photos where the photos are and then reconstructs a mesh
01:11:58: Resonite Discord, we have community news, and lots of


1:53:44: um I also sometimes use like you know additional software like I'll for example use Photoshop to like um with
01:12:03: different communities post regular events there, so people can


1:53:50: certain photos I will do like an AR ID noise on them which helps like increase
01:12:06: find what's going on in the platform, it helps a bit in the meantime if


1:53:55: the quality of the scans and I also do you know some kind of tuning of the lighting and so on uh but I guess metab
01:12:11: people are looking for things to do.


1:54:02: is the main one there's like one that I kind of started experimenting with like few days ago and I Lally turned my room
01:12:15: Next question from Baplar.


1:54:08: into like um uh it's software called uh actually I
01:12:20: Yes, it's actually being worked on in parallel.


1:54:14: forget the first uh it's called post shot uh let me see the full name Jose
01:12:26: Geenz is one of the main people working on that.


1:54:20: said post shot and this one's for gassing spting which is sort of like this new technique you know for 3day
01:12:30: We did have meetings


1:54:26: reconstruction or more General like rendering uh which can reconstruct the scenes with much better Fidelity and
01:12:33: now and then, so we're sort of synchronized on the status of it.


1:54:33: we're kind of you know playing with it like because I have all my data sets I've been just kind of throwing into it and see like you know how it's kind of works with different
01:12:37: Last time, that was two weeks ago or so, we talked about


1:54:40: things um so like I might like you know integrate that one like U more into my
01:12:41: the multi-process architecture, how that's going to work, how it's going to integrate


1:54:46: workflow as I kind of like go um I posted like a quick video and have like a bunch of more be posting like soon
01:12:45: with Froox Engine, and how


1:54:54: L um but yeah like I guess met ship is the main one to use like you know it makes it easy to like you know just get
01:12:49: those systems are going to communicate. Geenz's approach was


1:55:00: a mesh bring it in this one is uh continuing on
01:12:53: to look at what the current Unity integration has


1:55:07: moderation question that we had a couple ago um this one from uh rale 86 again
01:12:57: and implement that on the Sauce end. However, there's a lot of things


1:55:17: asks continuing my previous question will anything be done regarding people who do not want to see b top f emails
01:13:01: that we're actually moving, like the particle system, audio system,


1:55:22: and public sessions for nonhosts I'm aware of the already Inplay system where you can ask the person to switch Avatar
01:13:05: input system, lots of things that are going to be moved forward into Froox Engine,


1:55:27: Avy settings and for hosts they can enforce address code which I'm no doubt making use of
01:13:09: so they don't need to be implemented on the Sauce side, and they're going to focus more


1:55:33: so in in the future we do want to implement stuff like content tagging um
01:13:14: on other things. They have a list


1:55:41: and that that will come with like um the ability to like you know if things are tagged a certain way you can like take a
01:13:17: of Sauce features, and specifically


1:55:48: check box and you won't see them anymore right so you could you could make use of that um that's something you know we
01:13:20: Bevy features, because Sauce is being built around the Bevy


1:55:56: will do in the future um but other than other than that um for the time being um
01:13:24: rendering engine, which


1:56:04: if you don't want to see that uh don't go to those sessions well you can still go through our sessions because we do
01:13:28: maps to the current features we have. For example, we have lights,


1:56:11: have like like we do have ability to block somebody's hard hard you literally will just you know like I could actually
01:13:32: do they support shadows, we have reflection probes, do they support


1:56:18: show you like you know if I click on cra's name
01:13:36: this and that. So they're working on making sure there's a feature


1:56:23: careful it might ban me from the session oh it should be just
01:13:40: parity there. Once we have the performance upgrade,


1:56:28: blar there we go see is gone I don't I don't have to look at well I can still see him but like you know I don't have
01:13:44: we can work more on the integration. They also work


1:56:33: to look at it like him anymore yeah that that is something I forgot about it's
01:13:48: on the Resonite side. So, you know, Geenz has been working on consolidating


1:56:39: like this is like one of the reasons like we added it you you have the power you know like if some Avatar is
01:13:52: the shaders, because all the shaders we have right now,


1:56:45: legitimately upsetting you you can block it uh the other part is if you host your
01:13:56: they need to be rewritten for Sauce, because


1:56:51: own sessions you can your own rules we do allow for that like you know with some caveat uh so if you want to enforce a
01:14:00: the current ones are designed for Unity, so we need equivalents;


1:56:58: dress code that's completely up to you you have that like you know you have that freedom yeah um you can always like
01:14:05: the equivalents of those are


1:57:04: you know add like additional kind of rules to like in whatever sessions you want to host um so you know that's kind of thing
01:14:08: essentially going to be implemented for the new rendering engine.
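As a purely illustrative aside on what two processes talking to each other can look like at the simplest level, here is a toy C# sketch of a simulation process sending a tiny "move this object" message to a renderer process over a named pipe. The real FrooxEngine/Sauce split will use its own protocol and transport; nothing here reflects that design.

```csharp
using System;
using System.IO;
using System.IO.Pipes;

// Toy illustration of cross-process communication between a simulation side
// and a renderer side using a named pipe. Not the actual architecture.
public static class RenderBridgeDemo
{
    public static void RunSimulationSide()
    {
        using var pipe = new NamedPipeServerStream("render-bridge", PipeDirection.Out);
        pipe.WaitForConnection();
        using var writer = new BinaryWriter(pipe);
        writer.Write(42);   // object id
        writer.Write(1.0f); // x
        writer.Write(2.0f); // y
        writer.Write(3.0f); // z
    }

    public static void RunRendererSide()
    {
        using var pipe = new NamedPipeClientStream(".", "render-bridge", PipeDirection.In);
        pipe.Connect();
        using var reader = new BinaryReader(pipe);
        int id = reader.ReadInt32();
        var pos = (reader.ReadSingle(), reader.ReadSingle(), reader.ReadSingle());
        Console.WriteLine($"Renderer: move object {id} to {pos}");
    }
}
```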


1:57:12: and eventually the content tacking system that should make these things you know more generalized so like you know
01:14:14: Next, Epic Easton. How do you


1:57:18: like you don't even have to like you know go and see it in the first place as long as the content is properly tacked
01:14:16: make a walkie-talkie system? There's actually one thing you should be able to do


1:57:23: we can filter certain things out automatically block certain avatars you know we do want to give you the tools
01:14:21: with the new audio system, you'll be able to


1:57:29: but we don't want to make you know we don't want to make like Global decisions just forbiding these things you know for
01:14:24: have a virtual microphone, put it on a thing


1:57:37: everyone yeah and there there is a there is a Nuance that was going to get to there um in that uh if you if you decide
01:14:28: and then have it output from another audio source. There actually might be


1:57:47: to not allow um like let's say you're like I don't want to see nipples in my world also has to apply to the men in
01:14:32: a thing you'll be able to do once we rework that, because it shouldn't be too difficult to add


1:57:54: the session as well um it is it is universal you cannot discriminate um so
01:14:36: components for that.
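To make the walkie-talkie idea above concrete, here is a minimal C# sketch of routing captured samples from a "virtual microphone" to a separate playback source through a shared buffer. All of the type names are invented for illustration; they are not Resonite's actual audio components.

```csharp
using System;
using System.Collections.Concurrent;

// Hypothetical sketch: a "virtual microphone" pushes captured sample blocks
// onto a shared bus, and a separate "speaker" pulls them for playback.
public sealed class AudioBus
{
    private readonly ConcurrentQueue<float[]> _blocks = new();
    public void Push(float[] samples) => _blocks.Enqueue(samples);
    public bool TryPull(out float[] samples) => _blocks.TryDequeue(out samples);
}

public sealed class VirtualMicrophone
{
    private readonly AudioBus _bus;
    public VirtualMicrophone(AudioBus bus) => _bus = bus;

    // Called by whatever captures audio near the microphone prop.
    public void OnAudioCaptured(float[] samples) => _bus.Push(samples);
}

public sealed class WalkieTalkieSpeaker
{
    private readonly AudioBus _bus;
    public WalkieTalkieSpeaker(AudioBus bus) => _bus = bus;

    // Called by the audio engine when it needs the next block to play.
    public void FillPlaybackBuffer(float[] output)
    {
        if (_bus.TryPull(out var block))
            Array.Copy(block, output, Math.Min(block.Length, output.Length));
        else
            Array.Clear(output, 0, output.Length); // silence if nothing queued
    }
}
```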


1:58:00: it's either nipples allowed for all or no nipples at all actually reminds me because there
01:14:41: Next, Relanche: rigid body Newtonian physics system, soon or later?


1:58:06: was like one thing that was particularly fun to me like uh with a Creator Jam they actually made like a nipple gun
01:14:44: So definitely sometime after the performance upgrade


1:58:11: they like you know shooting around the worlds and people get upset and they were like oh no it's okay those are male
01:14:49: we integrate a physics engine called Bepu Physics.


1:58:17: nipples they're not female nipples and like it kind of was like a funny way to you know point out how to that like
01:14:53: One of the things we want to do after we move the Froox engine


1:58:24: double standard you know for this kind of thing uh but yeah uh next we have R 8
01:14:56: out of Unity and it's running on .NET 9, we want to synchronize


1:58:30: to6 my question being will anything be done past at all that basically I don't know which
01:15:00: Bepu to the latest version, because right now we kind of have to diverge


1:58:36: one this one's actually related to it it was related to the previous one they sent them in a
01:15:04: because the Bepu physics, it used to work with


1:58:43: row um so basically um this might be the
01:15:07: .NET Framework, which is what Resonite uses right now because of Unity.


1:58:48: last one because we were last minute yeah uh there we already answered
01:15:12: But now the newer versions require, I think, .NET 5


1:58:54: that one um yeah uh I think I think that's I
01:15:15: or maybe they even bumped it higher, which means we cannot


1:58:59: think that's pretty much it we had a few more coming but we we yeah there's a few more but uh this is pretty much the last minute like we've been here for two
01:15:19: really use those, at least not with lots of backporting.


1:59:06: hours my throat is kind of sore from this I should have brought some water uh but thank you everyone you know for
01:15:23: So one of the tasks is going to be to sync it up and then we're going to


1:59:12: joining thank you like you know for so many kind of questions like we're like you know very happy to answer those you
01:15:27: be able to look at how much work it is, when do we want to


1:59:18: know like let you know more you know about the platform and just going to like you know chat with you and thanks
01:15:31: prioritize getting the simulation integrated with FrooxEngine. It's also


1:59:23: everyone also like you know like for playing like you know resonite for enjoying this platform you know for supporting us and letting us do this
01:15:35: going to help because Bepu Physics is designed to work with


1:59:29: kind of thing um I hope you enjoy the stream like U uh my goal is you know
01:15:39: modern .NET to be really performant, which is why


1:59:35: make this every every week uh the forite might kind of change a little bit uh uh we kind of see you know how many
01:15:43: I would personally consider the performance upgrade a prerequisite for


1:59:41: questions we get like next time and so on uh we might you know next time like be for example uh outside of is I you
01:15:48: implementing that, so we're actually running it with


1:59:47: know playing some kind of ch games while kind of chatting with you but um we'll see how it kind of goes because this one
01:15:51: the runtime it's supposed to run with. But there's no specific


1:59:53: there was a little lot of questions we like you know kind of focus more on the Q&A and we'll see like you know how it changes with the upcoming
01:15:55: kind of prioritization right now. Once we're done with the performance update, we might focus


2:00:00: streams um so we'll experiment with the forat a little bit uh and see like you know and also let us know um let us know
01:15:59: more on UI, or focus on IK, or maybe other things; we'll


2:00:07: like you know like what do you think like do like this would you like to see some other things are there like any kind of issues um you can like you know post
01:16:03: reevaluate at that point.


2:00:15: like uh actually why should I post uh make maybe make a thread into office hours um
01:16:07: Grant is asking


2:00:22: um like uh on the Discord uh to share your feedback so thank you very much for
01:16:09: With the move away from Unity to Sauce, could it be possible to dynamically connect/disconnect from the VR


2:00:28: joining you know thank you for like spending time with us and asking us questions um I'll try like you know try
01:16:13: runtime without restarting the game? That's not really even a thing


2:00:35: to get like this video uploaded on you know our YouTube channel so can anybody who like missed this these office hours
01:16:17: that needs the move away from Unity. It's possible to implement it


2:00:40: can you know watch them afterwards and we'll see you next week so thank you very much and thank you also s for you
01:16:22: with Unity. It just takes a fair amount of work.


2:00:47: know helping me with this and I'm being good co-host uh and we'll see you next
01:16:27: So, possible yes, I would


2:00:52: week bye bye guys miss you
01:16:29: say. The question is are we going to invest time into implementing that.
 
01:16:35: For that I don't know the answer right now.
 
01:16:39: Next we have a question, RustybotPrime
 
01:16:41: Would these audio rendering sources allow for spatial data for your own
 
01:16:45: voice? Example, if I want to record conversation between myself and someone
 
01:16:49: else from third person without it sounding like I'm right at the camera.
 
01:16:54: Yes, there wouldn't be an issue because we can just
 
01:16:58: have any sort of listener in the world and just
 
01:17:01: record that with binaural audio and everything.
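As a rough illustration of recording from an arbitrary listener position, the sketch below computes simple left/right gains for a voice source relative to a virtual listener. It uses constant-power panning with distance falloff rather than real binaural (HRTF) rendering, and none of it is Resonite's actual API.

```csharp
using System;
using System.Numerics;

// Minimal sketch of placing a "listener" at a camera instead of the user's head.
public static class VirtualListener
{
    public static (float left, float right) ComputeGains(
        Vector3 listenerPos, Vector3 listenerRight, Vector3 sourcePos)
    {
        Vector3 toSource = sourcePos - listenerPos;
        float distance = toSource.Length();
        if (distance < 1e-4f) return (0.707f, 0.707f); // source on top of listener

        // -1 = fully left, +1 = fully right, relative to the listener's own axes.
        float pan = Vector3.Dot(Vector3.Normalize(toSource), Vector3.Normalize(listenerRight));
        pan = Math.Clamp(pan, -1f, 1f);

        // Constant-power pan law plus a simple inverse-distance falloff.
        float angle = (pan + 1f) * 0.25f * MathF.PI;  // 0 .. pi/2
        float falloff = 1f / (1f + distance);
        return (MathF.Cos(angle) * falloff, MathF.Sin(angle) * falloff);
    }
}
```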
 
01:17:06: Next, what flavor of sauce, what does it
 
01:17:09: taste like? And it's very salty. Mayonnaise.
 
01:17:13: Not mayonnaise, it's actually made of his own kind of sauce
 
01:17:17: which is why it's named sauce. Actually, I forget what he calls it.
 
01:17:23: Scotch sauce. Scotch sauce, yes.
 
01:17:25: He makes this really delicious sauce, it's a very salty
 
01:17:29: one, but it has loads of flavors to it.
 
01:17:34: I think this next one's aimed at me.
 
01:17:37: Alex2pie says, Cyro, I heard that some people don't trust you and that you don't care.
 
01:17:42: Do you know where this comes from? I think I do.
 
01:17:45: I'm in desktop a lot, and I'm often
 
01:17:49: either working in Froox Engine these days, or
 
01:17:54: I'm kind of audio sensitive and I can get overstimulated
 
01:17:57: kind of easily, so sometimes I will just kind of stand there.
 
01:18:02: Or maybe I won't respond so colorfully.
 
01:18:05: But I like having people around, and so that's why
 
01:18:09: I exist despite that.
 
01:18:13: I also appreciate it when
 
01:18:18: I'll probably open up a lot more
 
01:18:23: if
 
01:18:25: ...how do I put this...
 
01:18:30: Basically, if you
 
01:18:32: want to interact with the Cyro creature well,
 
01:18:37: do things like
 
01:18:39: ask before poking my nose or patting my head and stuff.
 
01:18:46: And ask me
 
01:18:48: if you want to send me a contact request. Just don't come up
 
01:18:52: to me and be like, you're cute, and then click my name and add me. Because then I have to
 
01:18:56: explain, I'm probably not going to add you, man, we talked
 
01:19:00: for maybe two seconds. I need at least 45 seconds.
 
01:19:09: But I...
 
01:19:10: If you've come across me and I've been in that
 
01:19:12: sort of state where I'm not super talkative, or maybe I seem a little detached,
 
01:19:18: hopefully that sheds a little light on that.
 
01:19:20: I love this place very dearly, and
 
01:19:25: I love all of you very dearly.
 
01:19:28: Cyro is a good bean.
 
01:19:33: So next, we have a question from Dan Amos.
 
01:19:37: What's the current workflow for identifying performance bottlenecks?
 
01:19:41: So, generally, the workflow
 
01:19:44: is something like, you know,
 
01:19:47: it kind of depends, because there's lots of things that can
 
01:19:50: cause performance issues.
 
01:19:54: So usually it's a combination of different things, but usually it kind of starts more
 
01:19:58: with just observation. You know, kind of seeing what's running
 
01:20:02: slow, when am I lagging, and so on.
 
01:20:07: Once there's that initial observation, we will try to
 
01:20:12: narrow down to the root of the issue.
 
01:20:14: And for that, we can use a variety of tools. Some of them are in-game.
 
01:20:18: For example, we have stats on how
 
01:20:23: much are certain parts of the process taking.
 
01:20:26: Once we need more detailed information, we can, for example, run
 
01:20:30: the headless client with Visual Studio profiling tools,
 
01:20:34: and they actually measure how long is spent in each method,
 
01:20:39: how long is spent in each part of the code. That gives us some kind of data.
 
01:20:42: The other part of it is benchmarking. Once we can have suspicion,
 
01:20:47: this thing is causing a lot of performance problems.
 
01:20:50: We can write a test sample, and then run it
 
01:20:55: with different runtimes, run it with different settings,
 
01:20:58: do A-B test things, see how things change.
 
01:21:03: For example, I've done this with a lot of Resonite's
 
01:21:06: own extension methods where, for example,
 
01:21:10: even with stuff like the base vector operations, I would try different ways to implement
 
01:21:13: certain operations, run a benchmark, and see how fast it runs.
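The kind of quick A/B micro-benchmark described here can be as simple as the sketch below: implement the same small operation two ways, warm both up, and time them under the current runtime. The two dot-product variants are made up for illustration; serious measurements would use something like BenchmarkDotNet.

```csharp
using System;
using System.Diagnostics;

// Rough A/B micro-benchmark: time two implementations of the same small
// operation and compare. A hand-rolled sketch, not Resonite's tooling.
public static class DotBenchmark
{
    // Variant A: plain field math.
    static float DotA(in (float x, float y, float z) a, in (float x, float y, float z) b)
        => a.x * b.x + a.y * b.y + a.z * b.z;

    // Variant B: same math written through a local accumulator.
    static float DotB(in (float x, float y, float z) a, in (float x, float y, float z) b)
    {
        float acc = a.x * b.x;
        acc += a.y * b.y;
        acc += a.z * b.z;
        return acc;
    }

    public static void Main()
    {
        var a = (1.1f, 2.2f, 3.3f);
        var b = (4.4f, 5.5f, 6.6f);
        const int n = 50_000_000;
        float sink = 0;

        // Warm up both paths so the JIT has compiled them before timing.
        for (int i = 0; i < 1000; i++) { sink += DotA(a, b); sink += DotB(a, b); }

        var sw = Stopwatch.StartNew();
        for (int i = 0; i < n; i++) sink += DotA(a, b);
        Console.WriteLine($"Variant A: {sw.ElapsedMilliseconds} ms");

        sw.Restart();
        for (int i = 0; i < n; i++) sink += DotB(a, b);
        Console.WriteLine($"Variant B: {sw.ElapsedMilliseconds} ms (sink={sink})");
    }
}
```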
 
01:21:20: One thing that kind of depends
 
01:21:22: there is what runtime it uses.
 
01:21:26: One thing I would, for example, find is certain implementations, they actually
 
01:21:30: run faster with Mono,
 
01:21:34: and then slower with the modern .NET runtime.
 
01:21:38: There's a lot of things in FrooxEngine where
 
01:21:42: sometimes people kind of decompile and say, why is this
 
01:21:45: done this weird way? And in some cases, it's because
 
01:21:49: it actually, even though you wouldn't do it with
 
01:21:53: more modern code, it interacts better with the runtime used at the time.
 
01:21:59: But for example, with these
 
01:22:01: general operations, I would find
 
01:22:04: if I compare them with the Mono in Unity
 
01:22:07: and compare them with the modern runtime, they would run
 
01:22:11: 10, sometimes even 100 times faster. There's some other things
 
01:22:15: that also speed up, some things that are the same.
 
01:22:19: But generally, it's just a combination of tools.
 
01:22:24: We observe something not performing well, we have a suspicion
 
01:22:27: that this might be causing it, and then we just use tools
 
01:22:31: to dig down and figure out the root cause of that problem.
 
01:22:38: So hopefully that answers that.
 
01:22:41: I think there are also
 
01:22:44: some manual profiling tools out there, like Tracy, I know there's some
 
01:22:47: Tracy bindings for C Sharp, which are really cool.
 
01:22:52: That's actually one of the cool things, because there's a bunch of libraries that we cannot even use
 
01:22:56: right now because of the old runtime. Tracy, I think it requires
 
01:23:01: .NET 8 or
 
01:23:03: some new version.
 
01:23:06: It's listed for .NET 7, but I think it's just interop, so it could work, but
 
01:23:11: it's better to just wait.
 
01:23:14: We do want to integrate more tools. Usually, you have a performance profiling toolset
 
01:23:20: so you just dig down and figure out where it could be coming from.
 
01:23:23: Sometimes it's easier to find, sometimes it's harder, sometimes you have to do a lot of work.
 
01:23:27: For example, the testing I've done before
 
01:23:31: comparing the .NET 5 or whatever version it was
 
01:23:35: and Mono, I saw this code is running way better
 
01:23:40: so I think it's going to help improve a lot, but
 
01:23:43: it's still usually testing bits and pieces, and it's hard to test the whole
 
01:23:47: thing because the whole thing doesn't run with that new runtime yet.
 
01:23:52: That's why for our current performance
 
01:23:55: update, we moved the headless first, because
 
01:23:59: moving the headless was much easier since it exists outside of Unity
 
01:24:04: and we could run sessions and compare
 
01:24:07: how does it perform compared to the Mono one.
 
01:24:11: And the results from that, we got
 
01:24:16: it's essentially beyond expectations, it's way faster.
 
01:24:20: That makes us more confident in doing all this work to move FrooxEngine
 
01:24:24: out of Unity, it's really going to be worth it.
 
01:24:33: As of my perception, Resonite is somewhat being marketed
 
01:24:40: as a furry social VR platform, which is not the case at all. But every time
 
01:24:44: I ask somebody, hey do you want to try Resonite, I usually get answers like, oh that VR game
 
01:24:48: for furries. I have nothing against them, but in Resonite they are very publicly dominant.
 
01:24:52: Are there thoughts about this topic that could maybe bring in more people?
 
01:24:56: So, we don't really market like Resonite as a furry social
 
01:25:00: VR platform. We actually specifically, with Chroma, who's
 
01:25:04: heading our marketing, make sure that for our own official
 
01:25:08: marketing materials, we show different diverse avatars
 
01:25:12: because yes, there's a lot of furries on this platform and
 
01:25:16: it's also a self-perpetuating thing where
 
01:25:20: because there's a lot of furries, they bring in a bunch more.
 
01:25:24: We do have lots of other communities as well, which are not
 
01:25:28: as big, but they are here as well.
 
01:25:32: So, we want Resonite to be for everyone. It's not designed
 
01:25:37: specifically for furries.
 
01:25:42: We want everyone to be welcome here.
 
01:25:50: It's sort of like a
 
01:25:50: complicated kind of thing because
 
01:25:54: the marketing we make, we try to make it as generalized, but
 
01:25:58: the question is when you come to the platform, you're going to have lots of furries.
 
01:26:03: I think the only way to bring in
 
01:26:06: more people is to showcase lots of
 
01:26:10: different people on the platform, lots of different
 
01:26:14: kind of communities, but if there's lots of furries, it becomes
 
01:26:18: kind of difficult. It's self-perpetuating.
 
01:26:24: But I think it's also a thing of scale.
 
01:26:26: As we keep growing, there's going to be more different groups of people
 
01:26:30: and the communities that are different kind of fandoms or just different
 
01:26:36: demographics, they're going to get bigger and it's going to help
 
01:26:41: people who are from those
 
01:26:42: demographics find their groups much easier, once there's more of them.
 
01:26:49: Yeah, Resonite's all
 
01:26:50: about self-expression and stuff and being who you want to be
 
01:26:55: and building what you want to build, and furries kind of got
 
01:26:58: that down pat, and so that's probably why you see a lot of them, but
 
01:27:02: everybody can do that. It's not just those people, it's made for
 
01:27:08: every person to come together
 
01:27:10: and hang out and build and
 
01:27:14: just be you, no matter who you are.
 
01:27:19: Yeah, we try to make this platform kind of inclusive and for everyone.
 
01:27:23: It's our goal. We don't want
 
01:27:26: anybody to feel unwelcome.
 
01:27:30: I mean asterisk, because we don't want
 
01:27:34: hate groups, people like that.
 
01:27:36: So that one we would have an issue with, but generally
 
01:27:40: we want this platform to be for everyone.
 
01:27:43: Yeah, also we're coming up on the hour and a half mark.
 
01:27:46: Okay, so we have about 30 minutes left. We're getting to the end of the questions, so we'll see how they
 
01:27:52: keep piling, but we might need to stop them at a certain point.
 
01:27:57: So next question, Oran Moonclaw.
 
01:28:00: Is rendering performance being looked into before the move to Sauce as well? From my experience, when
 
01:28:04: the system is not CPU bound, the rendering can still be quite heavy for somewhat higher resolutions.
 
01:28:09: So that's actually a thing that Sauce will help with.
 
01:28:12: We don't want to invest super much time into the current rendering pipeline
 
01:28:16: with Unity, because the goal is to move away from it, which means
 
01:28:20: any time we invest in
 
01:28:24: improving Unity, it's essentially going to be wasted
 
01:28:28: and it's going to delay the eventual big switch. Sauce
 
01:28:32: is going to
 
01:28:34: use a much more modern rendering method. Right now we're using the deferred method,
 
01:28:39: which can be quite heavy on things like
 
01:28:43: memory bandwidth and so on.
 
01:28:47: With Sauce, it's going to use something called clustered forward rendering,
 
01:28:52: which allows lots of dynamic lights while also being much lighter
 
01:28:55: on the hardware. So that should improve
 
01:28:59: rendering performance by itself, and once we make the move we can
 
01:29:03: look for more areas to optimize things, introduce things like
 
01:29:07: impostors, more LOD systems and things like that.
 
01:29:14: So yeah,
 
01:29:15: it's pretty much like, unless there's any sort of
 
01:29:19: very obvious low-hanging fruit with rendering
 
01:29:23: that would take us less than a day
 
01:29:27: or maybe just a few days to get a significant boost in performance
 
01:29:31: we're probably not going to invest much time into it and instead want to invest
 
01:29:35: into the move away from Unity.
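For readers curious what "clustered forward" buys, the sketch below shows the core binning idea in a greatly simplified, CPU-side, tiles-only form: lights are assigned to screen tiles so each pixel only considers the lights listed for its tile. Real implementations also slice by depth and run on the GPU; this is not Sauce's actual code.

```csharp
using System;
using System.Collections.Generic;
using System.Numerics;

// Greatly simplified illustration of tiled/clustered light binning.
public sealed class TileLightBinner
{
    public const int TileSize = 16; // pixels per tile side

    public struct Light { public Vector2 ScreenPos; public float ScreenRadius; }

    public static List<int>[,] Bin(int width, int height, IReadOnlyList<Light> lights)
    {
        int tilesX = (width  + TileSize - 1) / TileSize;
        int tilesY = (height + TileSize - 1) / TileSize;
        var tiles = new List<int>[tilesX, tilesY];
        for (int y = 0; y < tilesY; y++)
            for (int x = 0; x < tilesX; x++)
                tiles[x, y] = new List<int>();

        for (int i = 0; i < lights.Count; i++)
        {
            var l = lights[i];
            // Conservative screen-space bounds of the light's influence circle.
            int minX = Math.Max(0, (int)((l.ScreenPos.X - l.ScreenRadius) / TileSize));
            int maxX = Math.Min(tilesX - 1, (int)((l.ScreenPos.X + l.ScreenRadius) / TileSize));
            int minY = Math.Max(0, (int)((l.ScreenPos.Y - l.ScreenRadius) / TileSize));
            int maxY = Math.Min(tilesY - 1, (int)((l.ScreenPos.Y + l.ScreenRadius) / TileSize));

            for (int ty = minY; ty <= maxY; ty++)
                for (int tx = minX; tx <= maxX; tx++)
                    tiles[tx, ty].Add(i); // this tile may be lit by light i
        }
        return tiles;
    }
}
```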
 
01:29:39: Next question, RustybotPrime.
 
01:29:41: How straightforward is conversion of our current particles to PhotonDust?
 
01:29:45: I assume the goal is seamless to the point of them looking and behaving identically, but is there anything the current particles can do
 
01:29:51: that PhotonDust won't, or will do in a different enough way
 
01:29:55: that it will have to be manually fixed?
 
01:29:58: So the conversion, I can't really answer it exactly, because the conversion actually isn't written yet
 
01:30:04: however, the main focus right now is actually
 
01:30:07: feature parity. So I actually have a list, and I can post it
 
01:30:10: in the devlog if you're curious, where I have all the things that
 
01:30:15: the legacy system has, and I'll be working through that list
 
01:30:18: just making sure that PhotonDust has the same or equivalent functionality.
 
01:30:23: The goal is to make it so it's pretty much equivalent
 
01:30:27: so it converts and it will look either the same
 
01:30:31: or just very close
 
01:30:33: so hopefully there won't be things that are too different
 
01:30:39: however, sometimes those things
 
01:30:42: become apparent during the testing period
 
01:30:45: so once those things do come out, we'll look at them and we'll be like
 
01:30:49: this is easy enough to fix, or maybe this one's a little bit more complicated
 
01:30:53: maybe we just bring it close enough and ask people to manually
 
01:30:57: fix things, but we'll have to see how this kind of goes
 
01:31:02: sometimes it's kind of hard to know these before it actually happens
 
01:31:06: but
 
01:31:08: it should have a feature parity, well it's going to have a feature parity with the current
 
01:31:15: system, so most things should just work.
 
01:31:18: next we have Fuzzy Bipolar Bear
 
01:31:21: is there a way to stop the dash particles from being shown when streaming? I don't think there is
 
01:31:25: I think we would have to implement that, does it show?
 
01:31:29: it does show, yeah
 
01:31:33: next, Ekky Kadir
 
01:31:36: what things are currently planned for the whole performance update? I think .NET 8 is part of it, for example.
 
01:31:40: so we actually answered this one earlier
 
01:31:43: I'm not going to go into details on this one
 
01:31:49: but essentially moving to .NET 9
 
01:31:52: we're originally going for .NET 8, but .NET 9 released
 
01:31:56: literally just like a week ago or so
 
01:31:59: in short, currently there's two main systems that need to be moved
 
01:32:04: fully into Froox Engine because they're a hybrid system, that's the particle system
 
01:32:08: which is being worked on right now, there's the sound system, which Cyro did some work on
 
01:32:13: once those systems are fully in Froox Engine, we're going to rework
 
01:32:16: how Froox Engine interfaces with Unity, and then we're going to move it out
 
01:32:20: into its own process, to use .NET 9
 
01:32:23: and it's going to be the big performance uplift from there
 
01:32:28: we're going to post, this video is going to be archived
 
01:32:33: if you're curious in more details, I recommend
 
01:32:36: watching it later, because we went into quite detail on this
 
01:32:39: earlier on the stream
 
01:32:43: so this question within chat
 
01:32:47: ShadowX, in the future, could there be a way to override values
 
01:32:51: not just per user, but in different contexts? For example, override active-enabled
 
01:32:56: state of a slot or component for a specific camera, basically same concept
 
01:32:59: of RTO, but more flexible.
 
01:33:01: so probably not like this
 
01:33:07: the problem with RTO is
 
01:33:09: if you want to override certain things
 
01:33:15: for example, in rendering
 
01:33:18: when rendering is happening, all the
 
01:33:21: work on updating the world is already
 
01:33:25: complete, which means the renderer actually has much more limited functionality
 
01:33:29: on what it can change
 
01:33:33: probably the best way to handle situations like that
 
01:33:37: is you have multiple copies of whatever you want to change
 
01:33:42: or whatever system you want to have
 
01:33:45: and you mark each one to show in a different context
 
01:33:49: but you need to manually set them up
 
01:33:52: consider a scenario where you override an active-enabled state
 
01:33:59: that component might have
 
01:34:02: a lot of complex functionality, maybe there's even
 
01:34:06: ProtoFlux or some other components that are reading the active state
 
01:34:09: and doing things based on being enabled or disabled
 
01:34:13: and once you get into that realm
 
01:34:16: the effect of that single enabled state can be very complex
 
01:34:21: where you can literally have a bunch of ProtoFlux that does a bunch of modifications
 
01:34:26: to the scene when that state changes
 
01:34:29: and it's too complex for something like the renderer to resolve
 
01:34:32: because you would have to run another update
 
01:34:37: on the world just to resolve those differences
 
01:34:40: and the complexity of that system essentially explodes
 
01:34:46: so probably not in that sense
 
01:34:48: if you give us more details on what you want to achieve
 
01:34:52: we can give a more specific answer
 
01:34:55: but this is pretty much how much I can say on this one in general.
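A minimal sketch of the "multiple copies, one per context" workaround described above: each variant registers a toggle, and whichever context is about to be rendered gets its variant enabled. The RenderContext enum and everything else here is invented for illustration, not Resonite's API.

```csharp
using System;
using System.Collections.Generic;

// Sketch: keep one copy of the content per context and activate only the copy
// that matches the context being rendered, instead of overriding one state.
public enum RenderContext { UserView, ScreenshotCamera, Mirror }

public sealed class ContextVariants
{
    private readonly Dictionary<RenderContext, Action<bool>> _setActive = new();

    // Register a callback that shows/hides one variant (e.g. toggles a slot).
    public void Register(RenderContext context, Action<bool> setActive)
        => _setActive[context] = setActive;

    // Called whenever we know which context is about to be rendered.
    public void Apply(RenderContext active)
    {
        foreach (var (context, set) in _setActive)
            set(context == active);
    }
}

// Usage sketch (hypothetical slot objects):
// variants.Register(RenderContext.UserView,         on => userViewCopy.Visible = on);
// variants.Register(RenderContext.ScreenshotCamera, on => cameraCopy.Visible = on);
// variants.Apply(RenderContext.ScreenshotCamera);
```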
 
01:35:03: Was the locomotion animation system one of the Unity systems that needed to be reimplemented in FrooxEngine, or was it something else?
 
01:35:09: that one was something else, it came
 
01:35:12: as part of a business contract.
 
01:35:16: it's not something
 
01:35:19: it's not something I kind of wanted to prioritize myself
 
01:35:25: it's kind of a complicated situation
 
01:35:27: but unfortunately it was necessary at the time
 
01:35:32: and I'm not super happy with how
 
01:35:34: that whole thing went because
 
01:35:40: it came at the wrong time
 
01:35:45: and it's
 
01:35:47: it was essentially a lot of, because we don't have a lot of systems
 
01:35:51: for dealing with animation which would have made these things much easier
 
01:35:55: and we have never worked with IK itself which would have made things also easier
 
01:35:59: so there was a lot of foundational work that was not there
 
01:36:05: and also
 
01:36:07: the timeline was kind of really short
 
01:36:11: so it was pretty much like just a month of
 
01:36:14: constant crunch just kind of working on it and
 
01:36:17: there wasn't enough time to kind of get it through
 
01:36:23: so it is a complicated situation
 
01:36:26: unfortunately. And there's a thing that kind of happens sometimes with businesses
 
01:36:30: like you end up in a
 
01:36:34: situation where you don't really have a good
 
01:36:38: path so you just have to deal with it
 
01:36:43: we want to eliminate those kind of situations and we had
 
01:36:46: a number of conversations internally to see how do we prevent this
 
01:36:50: from happening again, how do we make sure we don't end up in a situation
 
01:36:54: where we have to do something like that
 
01:36:58: and we have a much better understanding of the problem
 
01:37:02: now where if a situation
 
01:37:06: like this were to occur again we're going to be
 
01:37:10: better equipped on the communication side
 
01:37:14: how do we deal with it and how do we make sure it doesn't mess with
 
01:37:18: our priorities and things we need to focus on
 
01:37:24: so it was like
 
01:37:25: it was a messy situation, I'm not happy with how I handled
 
01:37:30: some of the things with it
 
01:37:32: but it's pretty much
 
01:37:35: it is what it is and the best thing we can do right now is
 
01:37:41: learn from it and try to improve things
 
01:37:46: Next question is
 
01:37:48: How are you compiling the questions from the stream chats? I thought the Twitch nodes were
 
01:37:52: broken. No, it actually works.
 
01:37:56: We do have this thing here where we're going through the questions. This is an older one
 
01:38:00: I need to grab a bigger one, but it's sort of like, you know, sorting the questions
 
01:38:04: for us
 
01:38:11: The Twitch
 
01:38:12: nodes actually were broken, where
 
01:38:16: the displays of them wouldn't work, and that got fixed very recently.
 
01:38:19: I pushed the update for it last week
 
01:38:26: So next we have Epic Easton
 
01:38:28: He's asking, most were able to access internal array to edit things
 
01:38:32: like color over lifetime, enough over lifetime. Will those be
 
01:38:36: properly converted? Yes. Those systems have already been
 
01:38:40: implemented for PhotonDust, so they're going to be converted to equivalent ones.
 
01:38:44: So it's just going to work out of the box. The good news
 
01:38:48: is there's also new modules
 
01:38:52: because PhotonDust, the new particle
 
01:38:56: system, is designed to be way more modular
 
01:39:01: So there's modules that instead of just
 
01:39:04: the internal array, you can also specify the color over lifetime
 
01:39:09: using a texture, or using
 
01:39:12: starting and ending color. You can also do starting and ending color in the
 
01:39:18: HSV color space, so there's
 
01:39:21: a lot of new color effects that it can do that's going to give you more control over the particle
 
01:39:25: system. And we can always add more, because we now have full control
 
01:39:29: of the system, so those modules are very easy to write.
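To illustrate the modular design being described, the sketch below shows what "color over lifetime" modules could look like as small interchangeable units: each one maps a particle's normalized age to a color. The interface and class names are invented; they are not PhotonDust's real types.

```csharp
using System;
using System.Numerics;

// Each module maps a particle's normalized age (0 = just spawned, 1 = about to
// die) to an RGBA color; the simulation queries whichever modules are attached.
public interface IColorOverLifetimeModule
{
    Vector4 Evaluate(float normalizedAge);
}

// Start/end color, interpolated linearly in plain RGBA.
public sealed class LerpColorModule : IColorOverLifetimeModule
{
    public Vector4 Start = new(1, 1, 1, 1);
    public Vector4 End   = new(1, 1, 1, 0);
    public Vector4 Evaluate(float t) => Vector4.Lerp(Start, End, Math.Clamp(t, 0f, 1f));
}

// Color sampled from a small lookup table (e.g. baked from a gradient texture).
public sealed class GradientTableColorModule : IColorOverLifetimeModule
{
    private readonly Vector4[] _table;
    public GradientTableColorModule(Vector4[] table) => _table = table;

    public Vector4 Evaluate(float t)
    {
        float f = Math.Clamp(t, 0f, 1f) * (_table.Length - 1);
        int i = (int)f;
        if (i >= _table.Length - 1) return _table[^1];
        return Vector4.Lerp(_table[i], _table[i + 1], f - i); // blend neighbors
    }
}
```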
 
01:39:33: This next one is a little
 
01:39:37: moderation focused. Do you mind if I attempt to answer it?
 
01:39:43: Okay. Let me take a breath
 
01:39:45: for this one, because it's a long one. On the topic of the platform being
 
01:39:48: for everyone, why were nipples allowed in the first place, if the majority of people in the world,
 
01:39:52: including me are not going to want to see them in public sessions. I will admit
 
01:39:56: that it has been an extremely rare occurrence of seeing someone with them shown in a public session
 
01:40:00: and will it be possible for me to request things like this both to the team and other people
 
01:40:04: without having my morals-slash-beliefs questioned at every turn?
 
01:40:10: So, the reason why we
 
01:40:12: wanted to take a stand on topfree equality
 
01:40:17: that's what this issue is called, by the way, it's called topfree equality,
 
01:40:19: is, um, because
 
01:40:25: ultimately
 
01:40:28: like, if a man can have a bare chest
 
01:40:31: you know, why can't a woman? The only difference is that on average
 
01:40:35: women have larger chests than men, and I think
 
01:40:39: we're also an EU-based company, right?
 
01:40:43: I'm from Europe. Okay, this is the stance in a lot of places
 
01:40:48: in Europe, too, where topfree equality is just sort of the norm,
 
01:40:52: and we want to normalize that, because
 
01:40:57: we do believe in this kind of equality, like why
 
01:40:59: can't a woman have, you know, their
 
01:41:03: why can't a woman be topless, you know, in a non-sexual context?
 
01:41:07: There's just no precedent for it.
 
01:41:12: And, let me see if I'm...
 
01:41:16: There's also a thing with this, it's like we
 
01:41:19: you know, we believe in equality and we believe in a lot of progress, so
 
01:41:23: we do want to take a stance on those things, but also we kind of give you tools
 
01:41:27: to kind of deal with those, so if it's something you really don't want to see
 
01:41:31: there's an avatar block function. You can block those people, they will not appear
 
01:41:35: for you. There's probably more things we can do in that
 
01:41:39: area as well, but ultimately we want to be like, you know,
 
01:41:44: very kind of like open and very kind of progressive as a company
 
01:41:46: when it comes to these things. There's also like, I would really recommend
 
01:41:51: like asking this question also in the moderation, like
 
01:41:54: office hours, because the moderation team is, you know, the one that kind of deals
 
01:41:59: with this in a lot of detail, and they're going to have a lot more kind of context for
 
01:42:03: these things. But also like, you know, I
 
01:42:08: don't necessarily believe that like, you know, it's
 
01:42:11: like the majority of the people on the internet, you know, like having that stance
 
01:42:15: like it's, there's, there's a good chunk of like, you know,
 
01:42:19: kind of people like who are kind of like, you know, very open about this and
 
01:42:24: I feel like, you know, that the chunk is kind of growing. People are kind of
 
01:42:26: getting like, you know, more open with things.
 
01:42:30: I do recommend like, you know, bringing this like with the moderation office hours, like they're going to be able to
 
01:42:35: give you like kind of much, much kind of a better answer for this because they've been
 
01:42:39: dealing with this topic, you know, for a while.
 
01:42:45: So, you know, take what we say like a little
 
01:42:47: bit of a grain of salt. I don't want to, you know, kind of step on the moderation team's
 
01:42:51: toes with that.
 
01:42:54: Yeah, I was going to say something to, I was going to say something to wrap it up. What was I going to say?
 
01:43:05: Yeah, I was just going to say, I don't know what
 
01:43:09: I don't know what you mean by, because I commented
 
01:43:13: I don't know what you mean by this rule being
 
01:43:17: exploited by transgender males and females, but
 
01:43:20: being transgender has nothing to do with this.
 
01:43:24: If you want to be a boy or want to be a girl
 
01:43:29: that has no bearing on this rule.
 
01:43:31: That's part of the goal too, you know, because it kind of like
 
01:43:35: erases that kind of disparity, like it doesn't really
 
01:43:39: matter. If you do feel there's some exploit you can
 
01:43:43: always, you know, you can file moderation reports or you can file, like you know
 
01:43:48: you can bring these
 
01:43:51: to the moderation office hours and discuss these there.
 
01:43:55: Then we can kind of see what is happening and then we sort of evaluate does it fit
 
01:43:59: with our rules or does it not.
 
01:44:03: So you can, if you feel there's some issue
 
01:44:07: you can make us aware of it. We can't promise that we're going to
 
01:44:11: agree with you, that we're going to have the same view on it, but
 
01:44:14: we can at the very least look at it and listen to what you have to say on that.
 
01:44:22: So next we have
 
01:44:23: Grand UK: Hearsay. I have heard from someone that they tried to report
 
01:44:27: someone to the moderation team but because they were connected to the team nothing happened
 
01:44:31: of it and they ended up banned instead. I can't confirm
 
01:44:35: 100% that what was said happened and I know nothing can be said about moderation
 
01:44:39: cases but in case where there are conflicts of interest like above
 
01:44:43: what can be done and how can we be sure we won't have wrongful consequences or bans for trying to
 
01:44:47: uphold the usage guidelines for everyone.
 
01:44:52: So, I understand there's not like super many details but I can kind of
 
01:44:55: talk in general. Sometimes we do have cases where
 
01:45:02: there's actually two things.
 
01:45:03: We do have cases where there's reports against people who are on the moderation
 
01:45:07: team or even on the Resonite team.
 
01:45:11: If it's a report against someone who's on the moderation team that will usually go
 
01:45:15: to the moderation leads and those people
 
01:45:19: can deal with it, and they will investigate. We actually have multiple moderation
 
01:45:23: leads as well. That way it's not like
 
01:45:27: there's a single person who can just bury the thing but there's multiple people
 
01:45:31: who all can see the same data and then sort of check on each other.
 
01:45:36: If it happens something with a team
 
01:45:38: or if there's an issue with somebody on the actual Resonite team, usually
 
01:45:43: that goes to the Canadian kid, who deals with
 
01:45:47: those things, and he brings these things up with me.
 
01:45:54: We have cases
 
01:45:56: where we had to deal with difficult situations before
 
01:46:00: both on the team and in the moderation
 
01:46:03: team as well. I can't really go into details
 
01:46:08: because there's privacy issues
 
01:46:12: with that. I can tell you there's been
 
01:46:15: cases where people on the moderation team
 
01:46:19: they had to permanently ban some people who
 
01:46:22: were their friends, even long-time friends, because
 
01:46:27: they did something wrong.
 
01:46:31: This caused people on the moderation team
 
01:46:34: a lot of distress, but they still made the decision
 
01:46:38: to ban their friend because they
 
01:46:43: want to uphold the moderation rules
 
01:46:47: above all else. I've looked at
 
01:46:51: a few of those cases because I do want to make sure things are
 
01:46:56: going okay, and that there's no
 
01:46:59: favoritism happening. I've been involved in
 
01:47:03: a few of those cases as well.
 
01:47:07: I've been part of the discussion of it and so on.
 
01:47:12: There's been a number of difficult discussions on those
 
01:47:15: and every single one, if there was sufficient
 
01:47:19: evidence for somebody's wrongdoing,
 
01:47:23: even if we knew that person personally, even if they were connected to the team,
 
01:47:27: they were still banned.
 
01:47:31: There's one thing I kind of noticed that's also kind of in general, is usually when
 
01:47:35: people do get banned,
 
01:47:42: they're almost never
 
01:47:43: truthful about the reason, because we do make sure
 
01:47:47: as part of the moderation, if somebody ends up being banned, usually
 
01:47:51: they will receive warnings first, depending on the severity.
 
01:47:56: If they end up being banned,
 
01:47:59: the reasoning is explained to them.
 
01:48:03: Oftentimes there's somebody from the team who's actually going to
 
01:48:07: sit down with them and be like, we have this evidence, this
 
01:48:10: kind of happened, you're getting banned for these reasons.
 
01:48:15: They are made aware of it. And in a lot of cases,
 
01:48:19: those people will come out and
 
01:48:22: give completely different reasons for why they're banned.
 
01:48:28: And this kind of puts us in a difficult situation,
 
01:48:30: because we value privacy, and sometimes giving details to the public
 
01:48:34: could put innocent people who are involved in those incidents at risk.
 
01:48:41: So we cannot really say
 
01:48:42: the person was actually banned for these reasons.
 
01:48:48: But it is a thing that happens.
 
01:48:52: So the only thing I can request is
 
01:48:57: be more skeptical about what
 
01:48:59: people say about these things. If you see something,
 
01:49:02: if you believe you can always send us a report, we will look at it, we will evaluate it,
 
01:49:07: we will see what evidence we have.
 
01:49:11: But ultimately, we will not necessarily tell you
 
01:49:14: the details of how it was resolved to protect the privacy
 
01:49:20: and potential security of people involved.
 
01:49:24: I will also... Oh, sorry.
 
01:49:27: No, go ahead. I was just going to say that we're
 
01:49:31: just about 10 minutes from the two-hour mark, so I think we should close questions.
 
01:49:34: Okay, so we're going to close the questions.
 
01:49:38: So if you send
 
01:49:42: questions right now, we have a few of them coming in,
 
01:49:46: if you send any questions after this point, we can't guarantee we're going to
 
01:49:50: answer that one. We'll try to answer as many as we can that are still left,
 
01:49:54: but no guarantees at this point. But I will at the very least
 
01:49:58: try to get through
 
01:50:03: the ones that we have
 
01:50:04: on the list right now. So the next one,
 
01:50:09: Epic Easton. Does the question mark need to be at the end of the question?
 
01:50:13: I think it doesn't need to be. I think I can put it
 
01:50:16: in the middle, but just to be sure, I would put it like...
 
01:50:20: Actually, no. I literally see a question that has a question mark in the middle
 
01:50:24: of it, so no, it doesn't need to be at the end.
 
01:50:31: Rasmus0211.
 
01:50:32: Any more flux nodes in the works? If yes, which
 
01:50:35: excites you the most?
 
01:50:39: You're working on some new ones.
 
01:50:46: Which ones am I working on again?
 
01:50:49: Is it the one I just took?
 
01:50:52: Oh yes, there is a
 
01:50:55: new ProtoFlux node I'm particularly excited about. So, you know how
 
01:51:00: for those of you who do ProtoFlux,
 
01:51:03: there is currently a node where you can perform a raycast
 
01:51:07: which shoots an infinitely thin line, and whenever it hits, you can get the position,
 
01:51:11: you can get the direction, stuff like that.
 
01:51:16: What I'm going to implement is I'm going to implement
 
01:51:20: sweeping, or I think it's
 
01:51:24: also been called shapecasting or whatever,
 
01:51:27: on some other platforms, but it's essentially a way of doing thick
 
01:51:31: raycasts using a shape that you essentially
 
01:51:35: extrude in the direction that you want it to go.
 
01:51:38: So, if you wanted to shoot a sphere in a direction,
 
01:51:44: you would essentially be shooting a capsule
 
01:51:48: however long you want to shoot it, and anything within there
 
01:51:52: it would hit. Or in this case, you know, the first thing it hits
 
01:51:56: it will return basically exactly like a raycast,
 
01:51:59: but it's thick, and you can do that with different shapes like a sphere,
 
01:52:03: or a cube, or I think you
 
01:52:07: should also be able to do it with convex hulls, right?
 
01:52:12: I'm not sure if we have that one, maybe.
 
01:52:16: I thought it was going to be better. At the very least, you'll be able to do it
 
01:52:20: with spheres, and cubes, and cylinders, and capsules, and stuff.
 
01:52:25: But I think that will be very useful, especially for those
 
01:52:28: of you who make vehicles who don't want your raycasts to
 
01:52:32: shoot between two infinitely close triangles in geometry, and now your
 
01:52:35: car is flying across the map. Yeah. Thick raycasts.
 
01:52:40: Yeah, thick raycasts.
 
01:52:42: Because we do have a lot of the functionality, it's already part of the
 
01:52:50: BEPU physics engine. We use it internally in our own engine. For example, the laser is actually
 
01:52:53: using sweeps to behave a bit better.
 
01:52:58: And this is going to expose them, so you can also use them from ProtoFlux.
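To make the sweep idea above concrete, here is a minimal, self-contained sketch of a sphere sweep ("thick raycast") done analytically against sphere colliders, by expanding each target by the swept sphere's radius and intersecting a ray against the expanded sphere. The names and signature are hypothetical and purely illustrative; the actual node will be backed by the engine's physics system rather than code like this.

<syntaxhighlight lang="csharp">
using System;
using System.Numerics;

// Illustrative only: a sphere sweep against sphere colliders via a Minkowski sum
// (expand each target by the swept radius, then do a ray-sphere intersection).
static class SweepSketch
{
    // Returns the smallest travel distance along dir (assumed normalized) at which
    // a sphere of the given radius, starting at origin, first touches any target,
    // or null if nothing is hit within maxDistance.
    public static float? SphereSweep(Vector3 origin, Vector3 dir, float radius, float maxDistance,
                                     (Vector3 Center, float Radius)[] targets)
    {
        float? best = null;
        foreach (var (center, targetRadius) in targets)
        {
            float combined = radius + targetRadius;          // Minkowski-expanded radius
            Vector3 toCenter = center - origin;
            float proj = Vector3.Dot(toCenter, dir);          // closest approach along the sweep
            float distSq = toCenter.LengthSquared() - proj * proj;
            if (distSq > combined * combined)
                continue;                                     // the sweep passes by, no hit
            float halfChord = MathF.Sqrt(combined * combined - distSq);
            float t = proj - halfChord;                       // entry distance along the sweep
            if (t < 0f || t > maxDistance)
                continue;                                     // behind the origin or too far away
            if (best == null || t < best)
                best = t;
        }
        return best;
    }
}
</syntaxhighlight>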
 
01:53:06: This one seems to be asking something in the chat, so I'm going to
 
01:53:09: skip this one. Tribe Grade World VR. Question.
 
01:53:14: For example, if you're still already on the video, say genius, what app are you using to do those scans?
 
01:53:18: Yes, some interstellar reality.
 
01:53:21: For most of my scans, I'm using a software called Agisoft Metashape.
 
01:53:27: It's a photogrammetry software, and essentially you take lots of pictures
 
01:53:30: of the subject from lots of different angles,
 
01:53:35: and it's able to do those reconstructions. It figures out
 
01:53:39: based on the patterns in the photos, where the photos are, and then
 
01:53:42: reconstructs a mesh. I also sometimes use additional
 
01:53:46: software, like I'll for example use Photoshop to like, with certain
 
01:53:50: photos, I will do like an AI denoise on them, which
 
01:53:54: helps increase the quality of the scans, and I will also do
 
01:53:58: some kind of tuning of the lighting and so on. But I guess Metashape
 
01:54:02: is the main one. There's also one that I kind of started experimenting with a few days ago,
 
01:54:07: and I literally turned my room into like,
 
01:54:11: it's a software called, actually
 
01:54:14: I forget the first, it's called PostShot. Let me see
 
01:54:18: the full name. Jawset Postshot. And this one's for
 
01:54:22: Gaussian Splatting, which is sort of like this new technique, you know,
 
01:54:26: for 3D reconstruction, or more general like rendering, which can
 
01:54:30: reconstruct the scenes with much better fidelity. And we're kind of
 
01:54:34: playing with it, like because I have all my datasets, I've been just kind of throwing at it and see like
 
01:54:38: how it kind of works with different things.
 
01:54:42: So like I might
 
01:54:44: like integrate that one more into my workflow as I kind of like
 
01:54:48: go. I posted like a quick video and have like a bunch more
 
01:54:52: that I'll be posting soon-ish.
 
01:54:55: But yeah, I guess Metashape is the main one I use, like you know, it makes it easier to just
 
01:54:59: get a mesh, bring it in.
 
01:55:04: This one is continuing a moderation question
 
01:55:08: that we had a couple of questions ago.
 
01:55:12: This one from Ralag86
 
01:55:16: again asks, continuing my previous question, will anything be done
 
01:55:20: regarding people who do not want to see topless females in public sessions? For non-hosts
 
01:55:24: I am aware of the already in-place system where you can ask the person to switch avatars slash avatar settings
 
01:55:28: and for hosts they can enforce a dress code, which I am no doubt making use of.
 
01:55:32: So in the future we do want to
 
01:55:36: implement stuff like content tagging
 
01:55:41: and that will come with
 
01:55:44: the ability to, you know, if things are tagged a certain way you can
 
01:55:47: tick a checkbox and you won't see them anymore, right?
 
01:55:50: So you could make use of that.
 
01:55:55: That's something we will do in the future.
 
01:55:59: But other than that, for the time being
 
01:56:04: if you don't want to see that, don't go to those sessions.
 
01:56:08: Well, you can still go to those sessions because we do have
 
01:56:11: the ability to block somebody's avatar.
 
01:56:16: I can actually show you if I
 
01:56:20: click on Cyro's name...
 
01:56:23: Careful, it might ban me from the session.
 
01:56:25: Oh, it should be just block avatar. There we go, see now Cyro is gone.
 
01:56:30: I don't have to look at him. Well, I can still see him, but I don't have to look at his avatar anymore.
 
01:56:35: Yeah, that is something I forgot about.
 
01:56:39: This is one of the reasons we added it.
 
01:56:42: You have the power. If some avatar is legitimately upsetting you,
 
01:56:47: you can block it. The other part is if you host
 
01:56:50: your own sessions, you can enforce your own rules. We do allow for that,
 
01:56:54: with some caveats. So if you want to enforce a dress code,
 
01:56:58: that's completely up to you. You have that freedom.
 
01:57:03: You can always add additional rules to whatever
 
01:57:06: sessions you want to host.
 
01:57:10: That's another thing. Eventually the content tagging system
 
01:57:14: should make these things more generalized.
 
01:57:18: You don't even have to go and see it in the first place as long as the content is properly tagged.
 
01:57:23: We can filter certain things out. We can block certain avatars.
 
01:57:26: We do want to give you the tools, but
 
01:57:30: we don't want to make global decisions
 
01:57:34: just forbidding these things for everyone.
 
01:57:38: There is a nuance I was going to get to there
 
01:57:43: in that if you decide
 
01:57:46: to not allow, let's say you're like,
 
01:57:50: I don't want to see nipples in my world, that also has to apply to the men in the session
 
01:57:54: as well. It is universal, you cannot discriminate.
 
01:58:00: So it's either nipples allowed for all, or no nipples at all.
 
01:58:05: It actually reminds me, because there was one thing
 
01:58:07: that was particularly funny to me. With the Creator Jam, they actually made a nipple gun
 
01:58:11: they were shooting around the world, and people got upset, and they were like
 
01:58:15: oh no, it's okay, those are male nipples, they're not female nipples.
 
01:58:19: It was a funny way to point out that
 
01:58:22: like, double standard, you know, for this kind of thing.
 
01:58:28: Uh, but yeah. Uh, next we have
 
01:58:30: Verailash86, my question being will anything be done past it all?
 
01:58:34: I don't know which one this one's actually related to.
 
01:58:39: It was related to the previous one they sent them in a row.
 
01:58:43: Um, so we're
 
01:58:46: um, this might be the last one because we're in the last minutes.
 
01:58:52: Yeah, we already answered that one.
 
01:58:55: Um, yeah.
 
01:58:58: I think that's pretty much it, we had a few more come in, but we got it.
 
01:59:02: Yeah, there's a few more, but this is pretty much the last minute, like we've been here for two hours
 
01:59:06: my throat is kinda sore from this, I should have brought some water.
 
01:59:10: Uh, but thank you everyone, you know, for joining, thank you for so many
 
01:59:14: kind of questions, like we're very happy to answer those, you know,
 
01:59:18: like let you know more about the platform, and just kind of like chat with you.
 
01:59:23: Thanks everyone, also like, you know, for playing, you know, Resonite
 
01:59:26: for enjoying this platform, you know, for supporting us and letting us do this kind of thing.
 
01:59:31: Um, I hope you enjoy the stream, like
 
01:59:34: my goal is, you know, make this every week. The format might kind of
 
01:59:38: change a little bit, we'll kind of see, you know, how many questions we get like next time and so on.
 
01:59:43: We might, you know, next time might be for example
 
01:59:46: outside of Resonite, you know, playing some kind of chill games while kind of chatting
 
01:59:49: with you, but we'll see how it kind of goes, because
 
01:59:53: this one there was a lot of questions, we're like, you know, kind of focused more on the Q&A
 
01:59:57: and we'll see like, you know, how it changes with the upcoming streams.
 
02:00:01: So we'll experiment with the format a little bit and see like, you know,
 
02:00:05: and also let us know, let us know like, you know, like what do you think, like do you like this?
 
02:00:09: Would you like to see some other things? Are there like any kind of issues?
 
02:00:13: You can like, you know, post, like
 
02:00:16: actually where should I post? Maybe make a thread in the office hours
 
02:00:23: like under Discord
 
02:00:25: to share your feedback. So thank you very much for joining,
 
02:00:28: you know, thank you for like spending time with us and asking us questions.
 
02:00:33: I'll try, like, you know, try to get like this video uploaded on
 
02:00:36: you know, our YouTube channel so you can, anybody who like missed these office hours
 
02:00:40: you can, you know, watch them afterwards, and we'll see you next week.
 
02:00:45: So thank you very much, and thank you also Cyro for, you know, helping me with this.
 
02:00:49: And being a good co-host, and we'll see you next week.
 
02:00:53: Bye!

Revision as of 09:36, 7 May 2025

This is a transcript of The Resonance from 2024 November 17.

This transcript is auto-generated from YouTube using Whisper. There may be missing information or inaccuracies reflected in it, but it is better to have searchable text in general than an unsearchable audio or video. It is heavily encouraged to verify any information from the source using the provided timestamps.

00:00: Everything should be up.

00:02: I'm going to post the announcement.

00:10: Hello, hello, let's see if we get people in there, we need to move this one a little bit

00:15: so we can read it.

00:18: Hello, do we have people on the stream?

00:25: Hello, can you hear us?

00:26: Can you hear us?

00:32: I'm just going to wait for some people to come in.

00:37: Oh, there we go, we've got Shushio.

00:39: Got one person.

00:43: Hello, hello Shushio.

00:59: Hello, just a sprinkle, we've got a bunch more people piling in.

01:03: Hello and welcome everyone.

01:08: So this is the first episode of The Resonance, that's like a new podcast that I'm starting.

01:16: It's like a mix between office hours where you can kind of ask anything about Resonite,

01:21: you know, whether it's a technical thing, whether you want to ask more broad questions,

01:26: you know, more kind of open-ended as well.

01:29: But also I have Cyro with me, who's our engineering intern.

01:34: We'll oftentimes end up talking about Resonite and talking about cool technology,

01:41: talking about VR, talking about big vision behind Resonite,

01:47: like where do we want to, you know, which direction we want the platform to head and so on.

02:01: You know, about what this place is.

02:04: We see a bunch of people popping in, so hello everyone.

02:07: I see Dustus Sprinkles, ApexRxAI, LexiVoe, I see Ground, Fuzzy, Jack Forge, AlexDupi, I see Jack, and Birdo.

02:20: Hello, welcome everyone.

02:23: Could I also request the chat before we start, since this is the first one,

02:29: I'm just kind of tuning things a little bit.

02:33: Is the audio level okay on your end? Can you hear me fine?

02:36: And, Cyro, can you say something?

02:38: Can you hear me okay, guys?

02:42: Let me know if I can even adjust the levels a little bit.

02:46: They look okay, like on the OBS side, but sometimes it's a little bit hard to tell.

02:53: Oh, oh my. It's public. Thank you.

02:56: We should maybe not do that.

02:59: Yes, I should have checked that. Thank you for letting us know.

03:03: Surprisingly, nobody joined, so I'm going to say hello.

03:07: I do have one more request.

03:10: For questions, we have a thing here that's going to show the questions,

03:16: so we're going to make sure we don't miss them.

03:18: What you need to do is, when you ask your question, make sure you include the question mark,

03:22: and it's going to get picked up.

03:24: Would anybody in the chat be able to... Perfect.

03:28: I have a question.

03:29: Perfect. It works. Thank you.

03:35: Great.

03:37: Thank you. So everything works.

03:40: With this, we've got a bunch of people in there.

03:43: I think we're ready to start.

03:45: Hello again, everyone. I'm Froox.

03:48: I have Cyro with me, our engineering intern.

03:50: This is the first episode of what we're calling The Resonance.

03:54: The idea is that this is going to be a mix of office hours,

03:57: so we can ask anything about Resonite,

04:00: whether it's a technical question, whether it's more philosophical about the platform,

04:05: whether it's more specific or open-ended,

04:09: and we'll try to answer those questions as best as we can.

04:15: We're also going to talk a little bit more in broader terms.

04:20: What is the direction of the platform? What's the big ideas behind it?

04:24: Because we don't want to keep things just to the wire,

04:28: where it's dealing with individual technical issues,

04:30: but also what are the driving forces?

04:35: What would we want the platform to do in general,

04:39: irrelevant to any kind of specific features?

04:43: So with that, we can start answering questions,

04:48: and if there's not too many of them, we can just talk about things.

04:54: We already have a few questions popping in.

04:56: Jack is asking, are you using MyChat?

05:01: I'm actually not. I don't know where I saved it.

05:04: I was kind of looking for it before the start,

05:05: and I was like, oh, I can't find it, so I'm using a little bit older one.

05:12: Then Ozzy is asking, of course, is Glitch cute? Yes, he's cute.

05:17: It's proven right here on the stream.

05:21: ChronicJoke, is Mayonnaise a ProtoFlux node?

05:25: No, but I actually have a list of ideas for April Fools,

05:31: and there's a food-related ProtoFlux node in there

05:35: that might pop up at some point, maybe.

05:39: Is Mayonnaise a ProtoFlux node?

05:42: The question is, what would it do if it's a ProtoFlux node?

05:45: Because Mayonnaise, that's got to be a data type.

05:48: That is true.

05:54: Or would it produce Mayonnaise?

05:56: Or maybe you have a number of inputs, you need to input eggs,

06:00: you need to input, actually, I don't know what else goes into mayonnaise.

06:03: I think egg is an egg.

06:04: We have the leaky impulse bucket, maybe we could have the leaky mayonnaise bucket.

06:13: We need mayonnaise outputs.

06:16: Yes.

06:17: Hopefully that answers your joke question with more jokes.

06:22: Then we have...

06:24: Oh, sorry.

06:26: Oh, no, go ahead.

06:28: I was just going to read Jack's if that's fine.

06:30: Okay.

06:31: Jack says, I have a pretty broad question, but I assume it's going in the same direction you're already heading.

06:36: Where do you want to see Resonite positioned within the VR slash social VR space?

06:41: Ah, this is a good ramble-inducing question.

06:46: There's a few things that we know about Resonite.

06:48: One of the big ideas of this platform is that it's built of multiple layers.

06:56: At the base layer, you have things like automated networking.

07:02: Everything you build, even the engine itself, you always get everything synchronized by default.

07:09: You don't even have to think about it.

07:12: Everything is potentially persistent.

07:14: You can save everything into inventory, into the cloud, onto your hard drive.

07:19: Everything that you get on the platform, you can persist.
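As a rough illustration of the "synchronized and persistent by default" layer described above, here is a hypothetical sketch of a field wrapper that records writes so an engine could both replicate and save them. This is not the real FrooxEngine API, just the shape of the idea.

<syntaxhighlight lang="csharp">
// Hypothetical sketch: a field wrapper that records writes so an engine could both
// replicate the value to other users and serialize it when the object is saved.
public sealed class SyncField<T>
{
    private T _value;

    // Set whenever the value changes; the engine would drain this each tick.
    public bool Dirty { get; private set; }

    public T Value
    {
        get => _value;
        set { _value = value; Dirty = true; }   // every write is a candidate for sync + save
    }

    // The networking layer reads dirty fields to replicate them; persistence walks
    // the same fields when writing the object to inventory or the cloud.
    public T ConsumeForSync()
    {
        Dirty = false;
        return _value;
    }
}

// A user-built component gets both behaviours for free just by using the wrapper.
public sealed class SpinningCube
{
    public SyncField<float> Speed { get; } = new() { Value = 90f };   // degrees per second
}
</syntaxhighlight>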

07:23: The way I see it is once you have this kind of layer, you can start building on top of it.

07:28: We also have layers for working with various devices, various interactions, grabbing stuff, touching stuff, pointing out things.

07:37: Those are things that I feel like are important to solve really well.

07:45: Do them properly.

07:47: When I started my work in VR, I was doing a lot of disparate applications,

07:54: where one application had these features and supported this hardware,

07:58: and the other application supported these things and this other hardware.

08:01: Sometimes I would like functionality from this one application and this other one,

08:07: but it was kind of difficult to bring them over.

08:10: Plus, I would also find myself solving the same kind of problems over and over.

08:17: For example, being able to grab stuff.

08:22: One of the driving forces was to create a framework, a layer,

08:27: where everything is part of the same shared universe,

08:32: and build an abstraction layer.

08:38: It's kind of analogous to programming languages,

08:43: where the really old ones had assembly programming,

08:48: and you had to do a lot of stuff like managing memory,

08:52: like where is this stuff, and managing your stack,

08:55: and doing a lot of manual work to make sure the state of everything is correct.

09:01: Then came high-level programming languages,

09:03: where they would essentially do it for you,

09:05: and they would let you focus more on the high level.

09:08: What do you want to do?

09:12: Personally, what I want Resonite to do in the VR social space

09:17: is do a similar paradigm shift for applications,

09:24: where no matter what you build, you always have real-time collaboration.

09:29: You don't even have to think about it.

09:32: You can always interact with multiple users,

09:35: and you always have persistence,

09:36: and you always have integration with lots of common hardware.

09:42: To me, the social VR layer is the basis.

09:48: You always have the social stuff.

09:50: You can join people, you can talk with them,

09:52: you can be represented as your avatar,

09:54: but then everyone can build lots of different things.

09:59: Some people will just socialize, some people will play games,

10:02: but some people will build a virtual studio.

10:07: Maybe they want to produce music, maybe they want to program stuff,

10:11: and they're able to use Resonite, a framework to do that,

10:18: and share whatever they make with other people.

10:23: If you're good at building tools, you can make tools,

10:28: like I mentioned, for example, producing music.

10:31: Say somebody makes really cool tools.

10:33: Other people who do like to produce music can take those tools made by the users,

10:38: and because they exist within the same universe,

10:40: you can build your own music studio,

10:42: and you have all the guarantees that I mentioned earlier.

10:46: With your music studio, you can invite people in and collaborate with them no matter where they are.

10:51: You can save the state of your work,

10:53: or maybe say you can make a really cool audio processing filter or something.

10:58: You save it, you can share it with other users,

11:00: and it kind of opens up this kind of interoperability.

11:04: I want Resonite to be general enough where you can build pretty much any application.

11:13: Whatever you can think of, you can build on here and get those guarantees.

11:19: Kind of similar to how you have a web browser.

11:23: Web browsers used to be just browsers for websites,

11:26: but now we have fully-fledged applications.

11:28: You have your office suite, like Google Docs.

11:33: There's a version of Photoshop.

11:35: We can play games.

11:36: There's so many applications on the web that it essentially becomes its own operating system in a way.

11:47: I want Resonite to do a similar thing,

11:50: where the platform itself is like the analog of the web browser.

11:54: You can build any kind of application in it,

11:57: but also you get the guarantees of the automated networking,

12:01: of the persistence, of the integration with the hardware,

12:04: and other things solved for you so you don't have to keep solving them.

12:09: That's pretty much in broad terms what I want Resonite to do.

12:13: I hope that ramble answered that question well.

12:18: I think it answered it pretty good.

12:24: When you were talking about this, I was thinking of way, way back,

12:31: before we had any sort of proper type of game engine.

12:36: You'd program all of your code, all of your games.

12:39: You would just program them raw.

12:41: You didn't have Unity, you didn't have Unreal.

12:44: If you wanted to collaborate with people,

12:45: you had your immediate vicinity of the people who you lived around.

12:52: And then now you have game engines and stuff,

12:55: which integrate a lot of the typical stuff that you would need to make a game.

13:03: But you're still limited to basically working over a Skype call,

13:08: or again with people close to you physically.

13:11: But now, this is kind of like a layer on top of that even.

13:17: Yes.

13:18: Where now, as social creatures, we don't really have something like this in that sort of space,

13:28: and now we do.

13:30: And being able to have that same sort of collaboration like you could have in real life,

13:35: with people working next to you, you can have from people who live a thousand miles away,

13:42: across the entire world, and you can work exactly as if you were right there,

13:51: and a lot of the things that you'd expect to work just kind of do like,

13:54: oh, you can see my context menu when it comes up, you can see the inspector opening.

13:59: It's just like putting a piece of paper down on a table

14:04: and working on it with someone standing right next to you.

14:07: Yeah, that's a really good point.

14:10: There's actually another thing that I've seen that inspired me,

14:14: is seeing engines like Unity and Unreal.

14:19: Because it used to be when you wanted to make a game,

14:21: you pretty much had to build your own engine, which in itself is a big undertaking,

14:26: and you needed bigger studios.

14:27: But then game engines came out, they were more generalized,

14:32: and what they essentially did is they removed that barrier,

14:37: where suddenly everybody has access to a fully-fledged game engine,

14:40: and it's no longer a problem you have to solve on your own.

14:45: And now you have small studios, even just individuals,

14:49: who are able to build games and applications that previously would take entire teams of people to do.

14:56: And where I see Resonite is doing that same thing, just pushing it even further,

15:04: where we go from just the game engine,

15:12: where you don't have to worry about stuff like making a rendering pipeline,

15:18: making a system for updating your entities, and so on.

15:22: Now you have additional guarantees, like real-time collaboration, synchronization, persistence,

15:27: that all just kind of comes for free, and you don't have to solve those problems,

15:31: and you can focus even more of your time on what you actually want to do in the social VR space.

15:36: What do you want to build, how do you want to interact.

15:40: So that's definitely a very good point, too, with the game engines.

15:48: I think we're probably going to move to the next questions, because we kind of rambled about this one a bit.

15:55: So we have a...

15:55: I think that one went ahead.

15:58: But I think we can answer that one pretty thoroughly.

16:03: So next we have ShadowX.

16:05: Food-related April Fools joke? Shocking, I know, isn't it?

16:11: Next we have MrDaboop123456.

16:17: What are some bugs that we have said it's a feature?

16:21: Others?

16:23: The one that literally comes to the mind...

16:24: Actually, sorry, we've got to demonstrate it.

16:26: It's the fast crouch one.

16:29: You know, like when you...

16:31: Can you... can you... can you... there we go.

16:33: This.

16:35: This is technically a bug.

16:37: There's a bug report for this.

16:39: But I'm like...

16:41: We need to fix this one in a way...

16:43: Where you can still do this because it's just...

16:45: It's just funny and like, you know, it's like the language of desktop users.

16:53: It's... it's a bug turning into a feature.

16:56: So I think it's a good example of one.

17:05: Oh...

17:05: There have been so many updates that I can't think of any one in particular.

17:10: The obvious one, I guess, is Bulbul 3.0, which is just a typo, but...

17:13: Oh my god, yes.

17:15: I mean, it's more of a meme feature.

17:17: It's just kind of like, you know, like an easter egg.

17:20: But yeah.

17:22: Um...

17:23: Yeah, like, there's so much stuff that I don't really remember, but like...

17:26: This one is definitely like...

17:29: This one comes to the mind.

17:31: There's even a bunch of others, but...

17:33: I don't think I can think of any others myself.

17:38: So next we have Alex2PI.

17:41: I would think that mayonnaise...

17:42: We're going with the food thing.

17:44: I would think that mayonnaise would be a way to package information by coating it in mayonnaise.

17:49: Oh, I guess mayonnaise is like a wrapper type.

17:53: Hmm...

17:53: It's kind of like a nullable except mayonnaise.

17:56: Kind of like a...

17:58: Kind of like a like, tar GZ.

18:00: Where it's like two layers of like, packaging.

18:03: Where one of them is the package and one of them is the compression or something.

18:06: Oh, it's more like...

18:07: So mayonnaise is a container format.

18:12: It's just kind of like...

18:15: It can contain other things in it.

18:18: .Mayo.

18:20: So next we have GrandUK.

18:22: Have you thought about other ways to get audio-video out of Resonite other than simply mirror-to-display of camera and the audio output of Resonite?

18:31: It's quite jarring to hear Froox non-spatialized and then Cyro spatialized, as well as having inverse spatialization of Cyro than the camera POV would suggest.

18:42: Actually, let me... I'm actually gonna turn Cyro into broadcast. That should make things easier for this.

18:47: You can also set the audio source to be from the camera.

18:53: I know, but that messes with my head too much.

18:57: I'm just gonna keep you on broadcast for now so it's easier for the stream.

19:01: However, I do actually have answers to that question.

19:06: So one of the big things that we're focusing on right now is a big performance upgrade.

19:11: And actually, I think I've seen a question so this might answer some of that too.

19:15: It's doing a big performance upgrade.

19:17: The two big things that need to be done...

19:20: Well, there's actually one more, but the two big systems that need to be done

19:25: is a particle system, which is being worked on right now, and the audio system,

19:30: which Cyro actually has been working on a part of it for doing a reverb system.

19:37: Those two systems, they're essentially the last two big systems

19:43: that are sort of like a hybrid between FrooxEngine and Unity.

19:47: I'll go a little bit more into details on this one with a later question,

19:50: but we are going to be reworking the audio system, and with the current one,

19:55: the Unity one, it doesn't support multiple listeners.

20:01: The goal for reworking the audio system is so we can actually do that,

20:05: there's one listener that's for you, for your ears,

20:08: and there's additional listener that can be for camera

20:11: that you route to a different audio device, so you can actually kind of split it too.

20:15: Because you can switch to camera, but then I'll be hearing everything from camera's viewpoint

20:20: that it kind of messes with my kind of spatialization.

20:25: So yes, there's going to be a way to do it, we just need to get it out of the system.

20:30: Next we have OrigamiVR.

20:35: I'm going back home very soon, I'll finally be able to Resonite again.

20:38: I was wondering, no social platform has this in official, I think.

20:42: What are the chances of implementing social events and gathering lists in-game

20:45: that notifies people about upcoming events and more?

20:49: Yes, that's actually one of the things I would like us to do.

20:52: We do have a GitHub issue for it, so if you search events UI, I kind of forget its name exactly.

20:59: On our GitHub, there's a bunch of details.

21:03: It would be really like adding server-side systems plus some UI

21:07: to be able to register events and see what's happening.

21:10: It's going to help people discover more things going on in the platform

21:14: and make it easier to socialize and join things.

21:18: It's probably going to happen sometime after we finish with the performance update

21:24: because there's a bunch of UI improvements we want to do,

21:28: and we don't want to focus on too many things at a time.

21:31: So it's going to come at some point. No timeline yet.

21:36: At the very least, it's going to be sometime after the performance update.

21:44: But it's one of the things that's definitely on my mind,

21:46: and that I think should be pretty high on the list

21:49: because we want to help people drive socialization and engagement,

21:54: so it is a pretty important feature.

21:59: Actually, when you were talking about the performance,

22:03: I actually saw someone in the chat.

22:06: Yes.

22:08: And I actually wanted to say that the rendering engine in particular,

22:15: like using Unity, isn't necessarily like a blocker for the performance update.

22:25: I see there's two questions that are related to this,

22:29: so I'll go a little bit more in detail on this one.

22:31: We have SkywinKitsune asking,

22:34: Froox, could you explain the roadmap to a big optimization update?

22:38: Where are we at in that process?

22:40: And then we have GlovinVR asking,

22:41: what are some of the big milestones still needed

22:45: to move the client applications to .NET 8?

22:47: I know the particle system is one of the prerequisites,

22:50: but what are some other prerequisites that can be looked forward to before the shift?

22:55: So these two questions are pretty much the same kind of question,

23:00: so I'm going to cover this in one.

23:03: Let me actually bring my brush,

23:05: because I feel it would help if I draw a diagram.

23:11: I'm also going to turn the camera to manual mode,

23:15: so it's not moving around for this.

23:19: Where's my brush? Give me like a second.

23:24: Tools... I should have gotten one already, but...

23:28: Geometer, my brushes...

23:31: Yes, Cyro will dance the other thing while I look for the brush.

23:36: I think this one should be okay. There we go.

23:39: So, let me see... So this looks pretty visible on the camera.

23:43: So, just to kind of give you an idea, consider...

23:51: Let me actually make this a little bit bigger.

23:56: So consider this is Unity.

24:04: That might be glowing a little bit too much.

24:08: Consider this is Unity.

24:10: You have Unity stuff, whatever it's doing.

24:14: And within Unity, we have FrooxEngine.

24:18: So this is FrooxEngine.

24:25: So, right now, because of Unity, FrooxEngine is contained within Unity,

24:31: it's using Unity's runtime to run its code.

24:34: It's using the Mono. The Mono framework, it's very old,

24:40: and it's kind of slow.

24:42: Which is why we kind of want to move FrooxEngine to .NET 9,

24:47: because we're originally saying .NET 8, but I think it was this week,

24:51: or last week, .NET 9 release, so we can talk about that one.

24:56: But the problem we have right now, in order to move,

24:59: there's two systems where this is sort of like a hybrid.

25:04: FrooxEngine, most of the stuff, most of all the interactions,

25:08: all the scripting, networking, physics, interactions,

25:11: most of it's fully contained within FrooxEngine.

25:15: We don't have to worry about it, that's already kind of nicely contained.

25:21: But then, what we have,

25:26: there's two systems which are sort of like a hybrid,

25:29: they kind of exist on both sides, and it's a particle system,

25:33: so we have a particle system,

25:36: and the second one is a sound system.

25:43: So, what the overall goal is, is we want to take these systems

25:47: and rework them into completely custom ones, so they're fully contained within FrooxEngine.

25:53: Once that kind of happens,

25:55: there's also interaction with all the Unity stuff.

25:58: And right now, that's also kind of like where this goes this,

26:02: this goes here, this goes here, it's kind of like, it's messy.

26:07: So once we move both of these systems fully into FrooxEngine,

26:11: we're going to rework how FrooxEngine actually communicates with Unity.

26:16: So it's a much simpler pipe,

26:19: where it sends a very self-contained package,

26:23: and be like, render this stuff for me, please.

26:26: Once this is done, what we can do

26:29: is we can take this entire thing, and I kind of got a bit like at the same time,

26:34: but we essentially move this out of Unity,

26:38: into its own process that's going to be

26:40: .NET 9, and this is going to

26:45: communicate with Unity using that same pipe.
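A hedged sketch of what that "simple pipe" split could look like in principle: the simulation process packages a self-contained frame description and ships it to a separate renderer process. The record shapes, the pipe name, and the serialization choice are all made up for this illustration and are not the actual implementation.

<syntaxhighlight lang="csharp">
using System;
using System.IO.Pipes;
using System.Text.Json;

// Hypothetical illustration of the "simple pipe" idea: build one self-contained
// description of a frame and send it to a separate renderer process.
record DrawCommand(string Mesh, string Material, float[] WorldMatrix);
record FramePacket(long FrameIndex, DrawCommand[] Commands);

class RenderLink : IDisposable
{
    private readonly NamedPipeClientStream _pipe = new(".", "renderer", PipeDirection.Out);

    public void Connect() => _pipe.Connect(5000);

    public void SubmitFrame(FramePacket frame)
    {
        // One self-contained, length-prefixed message per frame:
        // "render this stuff for me, please".
        byte[] payload = JsonSerializer.SerializeToUtf8Bytes(frame);
        Span<byte> length = stackalloc byte[4];
        BitConverter.TryWriteBytes(length, payload.Length);
        _pipe.Write(length);
        _pipe.Write(payload);
        _pipe.Flush();
    }

    public void Dispose() => _pipe.Dispose();
}
</syntaxhighlight>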

26:50: And it's this switch, switching to the much more modern

26:53: .NET 9 runtime, that's going to provide a big

26:57: performance uplift. The reason for that is

27:01: because .NET 9, it has a much better JIT compiler,

27:04: that's essentially the component that takes our code, and it translates it

27:09: into machine code that your CPU runs. And the one that's in .NET 9

27:13: produces at least an order of magnitude

27:17: better code. It also has better,

27:20: more optimized libraries that are part of the .NET framework that are being used,

27:24: and also much better garbage collector, which is another thing on the Unity side

27:28: that's slowing things down. We've already done

27:32: a thing where, for the headless client,

27:37: we've moved it to .NET 8 a few months back,

27:41: because with the headless client you don't have the renderer, which means

27:44: it already exists outside of Unity.

27:47: And that made it much easier to actually move it

27:51: to the modern .NET runtime.

27:54: And the headless client, it's still this. It's the same code.

27:58: It's not a separate thing from what we're running right now.

28:03: 99% of the code is the same.

28:07: So by moving it first, we were able to see

28:11: how much of a big performance uplift we actually get.

28:16: What we found,

28:18: and we've had a number of community events that have been hosting

28:21: big events, we've been able to get way more people

28:26: on those headlesses, even with those headlesses

28:30: computing everybody's avatars, computing everybody's IK,

28:33: and dynamic bones, while still maintaining a pretty high frame rate.

28:39: So thanks to that, we are confident that moving

28:43: the graphical client to .NET 9

28:45: is going to give us a really good performance upgrade.

28:50: The only problem is, it's a little bit more complicated process, because we do have to

28:54: rework those systems, and we have to rework integration before we can

28:58: move it out. This is also a little bit tangential, but

29:02: one of the things, once this happens, once we move it out,

29:06: we can actually replace Unity with Sauce, which is going to be our

29:09: custom rendering engine. And this whole process, it makes it easier

29:13: because we have this very nicely defined way to communicate

29:17: between the two, which means we can actually yeet this away, you know, and put

29:21: Sauce in here instead.

29:25: Like, where we are right now. So right now,

29:30: if I move this back...

29:35: I'll move this back...

29:37: We have the sound here, there we go. So right now, the

29:41: sound system is still hybrid, the communication with Unity is still kind of like

29:47: messy, like there's a lot of

29:49: routes into everything, and the particle system is being reworked.

29:54: So we're essentially taking this, and moving

29:57: it in here. We are working on a new particle

30:01: system called PhotonDust. The work has been

30:05: kind of progressing over the past few weeks. It's actually getting close

30:10: to feature parity with the current system, which is a hybrid between Unity

30:13: and FrooxEngine. Because the goal is, we don't want to

30:17: break any content. We want to make sure that whatever

30:21: is built with existing particle system still works, and

30:25: looks the same, or at least very close to what it's supposed to look.

30:31: Most of the things are already implemented.

30:33: If you go into devlog in our Discord,

30:37: you can see some of the updates and some of the progress.

30:41: The main thing that's missing right now as a major system is

30:45: implementing our own system for particle trails.

30:50: Once it's done, it's possible

30:53: that this is going to be sometime next week. I don't want to make any promises because things happen,

30:57: but it is getting close to there.

31:01: We can actually release public builds where we have

31:03: both systems at the same time. So we're going to have

31:08: legacy system and PhotonDust, with

31:11: conversion being something you trigger manually. We'll run a bunch of tests

31:16: with the community, so we'll essentially ask you to test your content,

31:20: test the new system, find any bugs with it. Once we are

31:24: confident that it works okay,

31:28: we are going to essentially remove the old system

31:32: and we make the conversion automatic to the new system.

31:35: With that, this part is going to be done, and we're going to move on

31:39: to this sound part. The sound system

31:44: is essentially what handles stuff like audio

31:46: spatialization and so on. Right now, it's also a hybrid. So for example,

31:50: on Froox Engine's side, we do our own audio encoding and decoding.

31:55: Unity is not handling that, but what we're doing is we're feeding Unity

31:59: the audio data we want to play into individual sources

32:03: and Unity then handles spatialization and then outputting it to your

32:07: audio device. We're going to move that part into our own system

32:11: which is also what's going to allow us to take control of it and

32:15: build new features, like for example having multiple listeners.
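To illustrate why owning the mixing step enables multiple listeners, here is a crude sketch where each source is attenuated and panned once per listener and each listener gets its own output buffer (for example, your ears versus the camera). The types and the falloff/pan math are simplified placeholders, not the planned implementation.

<syntaxhighlight lang="csharp">
using System;
using System.Numerics;

// Crude sketch of mixing for multiple listeners: every source is attenuated and
// panned once per listener, and each listener's mix can go to its own output device.
// StereoOut is assumed to be 2x the mono block length (interleaved L/R).
record SourceBlock(Vector3 Position, float[] Mono);
record ListenerMix(Vector3 Position, float[] StereoOut);

static class MultiListenerMixer
{
    public static void Mix(SourceBlock[] sources, ListenerMix[] listeners)
    {
        foreach (var listener in listeners)
        {
            Array.Clear(listener.StereoOut, 0, listener.StereoOut.Length);
            foreach (var source in sources)
            {
                Vector3 offset = source.Position - listener.Position;
                float distance = MathF.Max(offset.Length(), 1f);
                float gain = 1f / distance;                        // simple inverse falloff
                // Very crude pan from the horizontal offset; a real spatializer
                // accounts for listener orientation, HRTFs, and much more.
                float pan = Math.Clamp(offset.X / distance, -1f, 1f);
                float left = gain * (1f - pan) * 0.5f;
                float right = gain * (1f + pan) * 0.5f;
                for (int i = 0; i < source.Mono.Length; i++)
                {
                    listener.StereoOut[2 * i] += source.Mono[i] * left;
                    listener.StereoOut[2 * i + 1] += source.Mono[i] * right;
                }
            }
        }
    }
}
</syntaxhighlight>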

32:19: And we're also going to move the system here. There's actually some work on

32:23: this that Cyro's been working on that I'm asking for help

32:27: with, because one of the things in the system is a

32:31: reverb effect. And we essentially need to implement our own, because there's also

32:35: a thing that's currently handled by Unity, and Cyro has

32:39: made integration with a reverb called the Zita reverb that we

32:43: use to replace the existing reverb zones.

32:48: Would you like to tell us a little bit more about that part?

32:50: Yeah, so we found

32:55: a nifty little... so let me actually back up a little bit.

32:59: So currently,

33:02: the reason why we can't just keep using this reverb or whatever,

33:07: like the one that we're using right now, is because it uses

33:11: Unity, but in turn the underlying reverb

33:15: uses FMOD or something.

33:17: And that costs at least four dollar signs

33:22: to use commercially, I think.

33:25: But we found a nifty library called Soundpipe

33:29: that includes a really nice sounding reverb effect.

33:33: And I have been working on

33:38: getting the library compiled and integrating it

33:41: with FrooxEngine.

33:45: You won't be able to do anything super duper fancy

33:48: with it right away, at least not until Froox reworks the whole audio system.

33:53: But you'll at least be able to process

33:55: audio clips with it and make them sound all echoey and stuff just to try it out.

34:00: Which I think will be pretty cool.

34:04: Then you can just reverbify

34:08: any audio clip in-game. You can make a reverb

34:12: baker with it, essentially, which I think is pretty cool.

34:16: It's kind of like expanding the audio processing, because you can already do some trimming,

34:20: you can do normalization, volume adjustments, fading, and so on.

34:25: Having that code integrated and ready,

34:28: we can already expose some of it,

34:31: play with it, and make tools that spin off of it

34:36: before we do the big integration.
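As a rough sketch of the "bake a reverb into a clip" idea, assuming some reverb implementation is available behind a simple interface (the real integration binds the Zita reverb from the Soundpipe library): process the dry samples plus a tail of silence and keep the wet result as a new clip. The interface and names here are hypothetical stand-ins, not the actual API.

<syntaxhighlight lang="csharp">
// Hypothetical stand-in for whatever reverb implementation gets bound.
public interface IReverb
{
    // Processes one input sample and returns the reverberated output sample.
    float Process(float input);
}

public static class ReverbBaker
{
    // "Bakes" the reverb into a clip offline: run the dry samples through the effect,
    // then keep feeding silence so the reverb tail can ring out into the new clip.
    public static float[] Bake(float[] drySamples, IReverb reverb, float tailSeconds, int sampleRate)
    {
        int tail = (int)(tailSeconds * sampleRate);
        var wet = new float[drySamples.Length + tail];
        for (int i = 0; i < wet.Length; i++)
        {
            float input = i < drySamples.Length ? drySamples[i] : 0f;
            wet[i] = reverb.Process(input);
        }
        return wet;
    }
}
</syntaxhighlight>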

34:39: Right now, the particle system

34:41: is the major one that's going to be fully pulled in.

34:45: Once that part is done, we're going to do the sound system, which I expect to be

34:49: faster than the particle system, because it doesn't have as many things,

34:53: but we'll see how that one goes.

34:56: Once the sound system happens, this is going to get reworked, the integration with Unity, so it's simpler.

35:02: Once this is done, we move the whole thing out, and it's going to be the big performance update.

35:08: I hope that it helps answer

35:12: the question.

35:16: I think I'm going to clean this up, just so it doesn't

35:19: clutter our space, and we can move to the next questions.

35:24: I'll mark these two as unanswered then.

35:27: There's actually one thing I was also going to mention. Even from the particle system, there's actually a few functions

35:32: that spawned us extra things.

35:36: One of them being that we have access to

35:40: 4D Simplex noise, so you can use it as a ProtoFlux node,

35:44: and there's also a 3D texture

35:46: with Simplex noise,

35:52: and I've seen people do really cool effects

35:56: with it, like this one for example. I think I actually got this one from Cyro.

36:00: So you see how it kind of evolves in time?

36:03: This is like a volumetric effect, so you can kind of push it through.

36:10: So people have been

36:11: already playing with it. And this is kind of generally how

36:15: we like to do development, where, I've got another version here.

36:19: This is super neat.

36:25: How we like to develop things

36:27: is like, you know, we want to add more building blocks. So even if we're building something

36:31: official, whatever building blocks we add, we try to make

36:35: as many of them available to everyone using the platform, because you can use them for a lot of other

36:40: things. So yeah, but that should

36:43: kind of cover those questions.

36:47: So next, we have a question from Navy3001.

36:52: Any idea of how in-game performance metrics for user content

36:55: would work? That's actually, that's a good question, and like, measuring

37:01: performance, that's a very kind of difficult

37:03: thing, because one of the things with performance is like, it depends.

37:08: Like, it depends on a lot of stuff.

37:11: So usually, like, you want to have like, you know, kind of a range of tools, you know, to kind of measure

37:16: like, measure things. One of them is, you know, you can measure how long

37:20: individual components take, you know, to execute, and sort of some way to

37:23: aggregate the data, so you can kind of see, okay, this is consuming a lot of time,

37:28: this is consuming a lot of time, but the performance impact

37:31: of something is not always like, you know, that direct, because something can, for example, the components

37:35: themselves, they can be quick to execute, but maybe the object is, you know,

37:40: has a really complex geometry, so it's taking a long time on the GPU

37:44: to render out. The other part

37:47: is like, performance can also differ depending on the scenario. Say,

37:51: you build an object, and the object is doing a raycast,

37:55: it's doing, you know, some kind of checks for collisions. If you have an object in a simple world,

38:00: maybe it doesn't like, you know, it runs pretty fast, but you bring

38:03: that object into a world with much more complex colliders, it suddenly, it starts

38:07: hurting performance, because now those collision checks are needed, like, you know, are more

38:11: complex. The other example is like, say you use

38:15: like a node, like find child. You try to search for a child in a hierarchy.

38:20: And if you're a simple world, maybe like, you know,

38:24: the hierarchy of objects is, you know,

38:27: it doesn't have too much in it. So it runs fast. But then you go into a world which has

38:31: way more, and now the performance kind of tanks. Now the thing

38:35: that was running reasonably fast in one world is

38:39: running slower in the other one. So, one of the ideas we

38:43: kind of had is, we would kind of build some sort of like, you know,

38:47: kind of like benchmark worlds. We would like, you know, like have like

38:51: different scenarios, complex worlds with complex hierarchies, you know, for this and that

38:55: and then have a system where you can essentially like run

38:59: that object in that world and sort of, you know, see how fast

39:03: it runs and how does it differ depending, you know, on a different kind of

39:07: scenario. Overall, I think this will

39:12: eventually end up with, you know, lots of different tools. So you'd have like, you know,

39:15: you'd have the tools to measure how long the components take to execute,

39:19: how long the, you know, GPU takes to execute,

39:23: just sort of like lots of different tools to analyze different like, you know, performance things.

39:29: So I think that's overall like, you know, like

39:31: what you should expect, like once those tools come in

39:35: it's not going to be a single tool, but it's going to be like, you know,

39:39: a range of them that will probably keep like, you know, expanding and building upon.
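One of the tools described above, timing how long individual components take and aggregating per type, could look roughly like the sketch below. This is only illustrative; the component type is a placeholder string and none of this reflects an actual in-engine profiler API.

<syntaxhighlight lang="csharp">
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;

// Illustration of "measure and aggregate": time each component update with a
// Stopwatch and keep per-type totals, so you can see which component types dominate.
public sealed class ComponentProfiler
{
    private readonly Dictionary<string, (long Ticks, int Calls)> _totals = new();

    public void Measure(string componentType, Action update)
    {
        long start = Stopwatch.GetTimestamp();
        update();
        long elapsed = Stopwatch.GetTimestamp() - start;
        _totals.TryGetValue(componentType, out var entry);
        _totals[componentType] = (entry.Ticks + elapsed, entry.Calls + 1);
    }

    public void Report()
    {
        foreach (var (type, data) in _totals.OrderByDescending(kv => kv.Value.Ticks))
        {
            double ms = data.Ticks * 1000.0 / Stopwatch.Frequency;
            Console.WriteLine($"{type}: {ms:F3} ms total over {data.Calls} update(s)");
        }
    }
}
</syntaxhighlight>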

39:44: If I could

39:46: append to that

39:52: we probably also

39:54: because I know this is, I know this has like come up occasionally in relation

39:58: to questions like this, we probably also wouldn't

40:03: like give things like

40:06: we wouldn't do like an arbitrary limiting system like

40:10: oh you can only have 60,000 triangles, you can only have

40:14: X number of seconds of audio on you.

40:18: We do want to add tools so you can restrict things, because, like, it's not,

40:23: it's not a perfect solution, but like, we want to,

40:26: like we want to add tools so people can, you know, set some limits

40:30: on things. Because our whole kind of philosophy

40:34: is like, you know, we want to give people a lot of control.

40:38: And if you want to run a session like where you can spawn

40:42: object that has, you know, 50 million triangles and like everybody's going to be running at like

40:46: you know, 10 FPS, but you know, you want to be like I have a beefy

40:50: GPU, I want to look at this super detailed model, we always

40:54: want people to have ability to do that. At the same time

40:58: we want to add tools so like, you know, if you want to host like a chill world

41:02: if you want to keep it like, you know, more light, you have tools

41:06: to kind of like, you know, set certain limits on the users, how much they can

41:10: spawn in, how much they can bring in. So we're not going to make

41:14: them, you know, forced, but we're much more likely to add like tools where

41:18: you have the kind of control to decide what you want, you know, in your

41:22: world, what you want in your experience.

41:26: Other aspect of that is like, you know, we have the asset variant system and we already

41:30: use part of it, like you can go into your settings and you can lower

41:34: the resolution of textures. You can, for example, clamp it to like 2K.

41:38: So if you're, you know, low on VRAM, you can lower the textures

41:42: and if somebody has, you know, 8K texture on their avatar, you're

41:46: only going to load it up to 2K. You know, it's not going to hurt you, but other

41:50: people, like say somebody has, you know, one of the, you know, 4090s

41:54: with 24 gigs of VRAM and they don't care, they can keep it like, you know, kind of

41:57: unlocked. And it's kind of, you know, aligned with our kind of like philosophy is like

42:04: give people as many tools as possible to kind of control your experience. But also

42:08: we don't want to enforce, like, you know, limits on people where possible.
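A small sketch of the resolution-clamping idea from the asset variant system mentioned above: given a user-configured cap, compute how many mip levels to drop so the loaded variant fits under it. This is purely illustrative arithmetic, not the real asset pipeline.

<syntaxhighlight lang="csharp">
using System;

// Sketch of the resolution clamp: given the user's configured maximum, work out how
// many mip levels to skip so the loaded texture variant fits under the cap.
public static class TextureVariantPicker
{
    public static int MipLevelsToDrop(int width, int height, int maxResolution)
    {
        int largest = Math.Max(width, height);
        int drop = 0;
        while (largest > maxResolution)
        {
            largest /= 2;   // each mip level halves the resolution
            drop++;
        }
        return drop;
    }
}

// Example: an 8192x8192 avatar texture with a 2048 cap drops 2 mip levels
// (8192 -> 4096 -> 2048), so only the 2K variant is loaded.
</syntaxhighlight>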

42:14: Yeah, that's kind of more

42:15: so where I was going with that, is that we wouldn't have like a sort of

42:20: hard and fast, these are the rules for the whole platform kind of

42:24: rules. Because, you know, not everybody's computers are

42:28: equal, and so maybe I don't want to render your 500 million

42:31: polygon model, right? But

42:37: we also don't want to

42:39: we want to present this stuff in a sort of like unbiased way.

42:43: Like, we don't want to, like, we wouldn't

42:48: I wouldn't want to color, like, 500, like, you know,

42:51: someone's polygon count in, like, red or something. Because

42:55: it would get into, like, a social kind of thing, but it also comes with

43:03: I think that should, like, answer

43:05: like this, like, particularly, we should probably, like, move to the other questions because we got a bunch of them piling up.

43:10: Can I answer the next one?

43:14: Uh, sure.

43:17: I haven't actually read it yet.

43:20: Okay, so, TheJebForge asks, would it even be possible

43:24: to multithread world processing in Resonite? Like, if the world is incredibly heavy and the amount

43:28: of CPU it uses, but since Resonite only uses one thread, it's not using all the CPU it could have been.

43:34: I know multithreading introduces a lot of problems with thread synchronization.

43:37: What do you think?

43:41: Alright guys, say it with me.

43:45: Oh gosh, the camera's moving.

43:48: Hold on, hold on, hold on, hold on.

43:51: Alright, say it with me. Resonite is not

43:56: single-threaded. This is a myth

44:00: that has somehow spread around that Resonite only runs on a single

44:04: thread. This is abjectly not true.

44:09: Yeah, this is a thing we kind of get a lot, because I think people are just

44:13: like, you know, it runs with poor performance, therefore it's single-threaded.

44:18: When it comes to multithreading, it's like way

44:21: more complex. It's not a black and white thing.

44:24: So, the way I kind of put it is, you know,

44:27: it's not like an on-off switch. Imagine you have

44:31: a city or something, and the city has poor roads.

44:36: Maybe there's areas where the roads

44:39: are very narrow, and it's kind of hard for cars to get through.

44:43: You can have areas of the city where you have highways, and

44:47: you can have lots of cars in there. It's not an on-off

44:51: switch where you just turn a switch and suddenly

44:55: every road is wide, but you can gradually rebuild

45:00: more of the city infrastructure to support

45:03: more of that high bandwidth. With Resonite,

45:07: there's a lot of things that are multithreaded.

45:11: There's also a lot of things that could be multithreaded, and they're going to be more

45:15: multithreaded in the future, but it's not

45:21: it's essentially not

45:23: a black and white thing, whether it's either multithreaded or not

45:27: multithreaded. You have to think about Resonite,

45:30: it's like lots of complex systems. There's so many systems, and

45:34: some of them are going to be multithreaded, some of them are not going to be

45:38: multithreaded. Some of them are not multithreaded, and they're

45:42: going to be multithreaded. Some of them are going to stay single-threaded, because there's not

45:46: much benefit to them being multithreaded. So we definitely

45:50: want to do more, but we already have a lot of things

45:54: running on multiple threads, like

45:59: asset processing that's multithreaded, the physics that's using

46:02: multiple threads, a lot of additional processing

46:06: spins off, does a bunch of background processing, and then integrates with the main thread.

46:11: So there's a lot of multithreading to the system already,

46:14: there's got to be more.

46:18: It's not something that's like a magic silver bullet.

46:25: With performance,

46:27: there's a lot of complexity. There's a lot of things

46:32: that can be causing low performance,

46:34: and multithreading is not always the best answer.

46:38: So for example, the .NET 9 switch, that's actually not

46:42: going to change anything with multithreading,

46:46: but it essentially makes the code that we already have,

46:50: which, you know, with whatever multithreading it has right now, it makes it run

46:54: several times faster, just by switching to the runtime, just by having

46:58: better code gen. So there's a lot of different

47:02: things that can be done to improve performance, multithreading is just one of them.

47:08: I think I should cover a lot of it,

47:11: but yes.

47:15: One more thing is, it's also something like,

47:18: when there's a world that's very heavy, it depends what's making it

47:23: heavy, because some things you can multithread, but some things you cannot multithread.

47:26: If you have some user content that's doing lots of interactions with things,

47:31: if you're just blatantly multithreaded, it's going to end up

47:34: corrupting a bunch of stuff, because with every algorithm

47:38: there's always a part of it that's irreducible.

47:43: So we want to introduce more systems that use multithreading

47:46: where possible, but again, it's not

47:50: a silver bullet. It's more like

47:54: a gradual kind of process that happens over time.
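The "spin off background processing, then integrate with the main thread" pattern mentioned in this answer can be sketched like this: heavy work runs on the thread pool, and only the cheap apply step is queued back onto the single simulation thread, so shared state is never touched from two threads at once. The class and method names are hypothetical, not engine APIs.

<syntaxhighlight lang="csharp">
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

// Heavy work runs on the thread pool; only the cheap "apply the result" step is
// queued back onto the single simulation thread, so shared world state is never
// mutated from two threads at once.
public sealed class MainThreadIntegrator
{
    private readonly ConcurrentQueue<Action> _completions = new();

    // Called from anywhere: run the expensive part off the main thread.
    public void RunInBackground<T>(Func<T> heavyWork, Action<T> applyOnMainThread)
    {
        Task.Run(() =>
        {
            T result = heavyWork();   // e.g. asset decoding or mesh processing
            _completions.Enqueue(() => applyOnMainThread(result));
        });
    }

    // Called once per frame by the main update loop: apply any finished work safely.
    public void FlushCompletions()
    {
        while (_completions.TryDequeue(out var apply))
            apply();
    }
}
</syntaxhighlight>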

48:00: Next we have GrandUK is asking

48:02: are there roadmaps with time estimates for both development and what do you want Resonite to be?

48:06: So for roadmaps, we generally don't do super

48:10: ahead of roadmaps. Right now our focus is on performance

48:14: updates, and you can actually find on our GitHub

48:18: there's a project board, and there's a list of issues

48:22: that pertain to performance updates, and you can see how those

48:26: progress. We don't do time estimates

48:31: because the development varies a lot, and oftentimes

48:34: things come in that we have to deal with, the delay things

48:38: or maybe there's additional complexity, so we

48:42: avoid promising certain dates

48:46: when we are not confident we could actually keep them.

48:50: We can give you very rough ones, for example with the

48:54: performance, with the big performance upgrade

48:59: I roughly expect it to happen sometime in Q1

49:02: sometime early next year. We'll see how it goes

49:08: but that would be my rough estimate

49:10: on that one. After that, we usually

49:13: once we finish on a big task, we re-evaluate

49:17: what would be the next best step for the platform

49:22: at that point, and we decide are we going to focus on UI

49:25: are we going to implement this thing, are we going to implement that thing, because

49:30: we try to

49:33: keep our ear to the ground and be like this is what would

49:37: give the community and the platform most benefit right now

49:40: this is what's most needed right now, and we want to make the decision

49:46: as soon as possible

49:48: no, actually as late as possible.

49:54: Next question, we have Jack the Fox author

49:57: what are some examples of features you've implemented that you're particularly proud of?

50:02: There's a whole bunch, I do a lot of

50:04: systems, the one I'm actually working on right now, the particle system

50:08: I'm pretty proud of that, it's

50:13: technically not out yet, but I'm very happy with how it's going

50:17: in part because it now

50:21: gives us control to very easily make new particle effects

50:25: and do stuff we were not able to do easily before

50:29: the one that came before that is the data feed system

50:35: that's a culmination of a lot of

50:37: approaches I've been developing to how we do UI

50:41: in the Resonite

50:44: so with that one, one of the big problems we've had with the UI is because Resonite

50:49: is building a lot of things from the ground up

50:52: because of the layers I was talking about earlier in the stream

50:58: but it also makes things difficult because

51:00: we cannot just take existing solution and use it

51:04: so a lot of the UI, we actually have to build those systems ourselves and build frameworks

51:08: to work with them, and the old UIs, they have

51:12: the problem where the code of them is like this big monolith

51:16: and it's really hard to work with, we have to

51:19: if there's misaligned button or something

51:23: we have to go to the code, change some numbers there, change some methods

51:27: that are called, compile, wait for it to compile

51:30: run the application, look at it, be like that's still wrong

51:34: go back to the code, make more changes, compile, wait for it

51:38: wait for it to launch, look at it, it's still wrong, go back to the code

51:44: sometimes people are like, oh this thing is misaligned

51:47: in this UI, and we're fixing that

51:51: sometimes it takes an hour, just messing around

51:55: and that's not very good use of our engineering time

52:00: but the data feeds

52:03: is a system that's very generalized

52:05: that essentially allows us to split the work on UI

52:10: between the engineering team and our content or art team

52:16: so when we work the settings UI, on the code

52:19: side we only have to worry more about the functionality of it, like what's the structure, what's the data

52:23: interfaces, and then we have the rest of our team

52:27: like our team, actually build the visuals

52:31: in-game, and put a lot of polish into each of the elements

52:36: and that process has made it much

52:39: simpler to rework the settings

52:43: UI, but what's an even bigger part of it is

52:47: the data feed system that this is built on

52:50: is very general, and it's been kind of designed to be general

52:54: so the settings UI, it was used as sort of like a pilot project for it

52:59: but now, we're going to use it

53:02: once we get to more UI work, to rework the inventory

53:07: rework the contacts, rework the world browser, file browser,

53:11: rework the inspectors, and it makes the work required

53:14: to rework those UIs be at least an order of magnitude

53:18: less, which means that before the data feeds

53:25: these are

53:26: rough estimates, but say it would have taken us two months to

53:31: rework the inventory UI, now it's going to take us two weeks

53:36: and those numbers are

53:39: more of an illustrative point, but it's

53:43: essentially on this kind of order, it makes it way simpler, it saves us so much time

53:47: which means we can rework a lot more UI

53:50: in a shorter time span.
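
To illustrate the split being described, here is a minimal, hypothetical sketch of the data-feed idea: the code side only describes structured items, and a separate presentation layer (which in Resonite is authored in-game) decides how each item looks. None of the type or template names below are the actual FrooxEngine data feed API; they are made up for illustration.

<syntaxhighlight lang="csharp">
// Hypothetical sketch of the data-feed idea (not the actual FrooxEngine API):
// the engineering side only enumerates structured items; how each item looks
// is decided separately by the presentation layer.
using System;
using System.Collections.Generic;

public interface IFeedItem
{
    string Label { get; }
}

public sealed class ToggleItem : IFeedItem
{
    public string Label { get; init; } = "";
    public bool Value { get; set; }
}

public sealed class SliderItem : IFeedItem
{
    public string Label { get; init; } = "";
    public float Min { get; init; }
    public float Max { get; init; }
    public float Value { get; set; }
}

public static class Demo
{
    // "Code side": describe what the settings are, nothing about visuals.
    static IEnumerable<IFeedItem> SettingsFeed()
    {
        yield return new ToggleItem { Label = "Show gizmos", Value = true };
        yield return new SliderItem { Label = "UI scale", Min = 0.5f, Max = 2f, Value = 1f };
    }

    // "Presentation side": pick a visual template per item type. In Resonite this
    // mapping is what the art side builds and polishes in-game, not in code.
    static void Main()
    {
        foreach (var item in SettingsFeed())
        {
            string template = item switch
            {
                ToggleItem => "CheckboxTemplate",
                SliderItem => "SliderTemplate",
                _          => "FallbackTemplate"
            };
            Console.WriteLine($"{template} -> {item.Label}");
        }
    }
}
</syntaxhighlight>

The point of this arrangement is that a misaligned button becomes a tweak to an in-game template rather than a recompile of the engine.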

53:55: There's lots of things I'm kind of proud of, I just kind of did two most recent

53:59: ones, so I could ramble for this for a while, but

54:03: we have a lot of questions, so I don't want to hold things up.

54:06: Cyro, do you actually have one you'd like to share with us?

54:10: Yeah, I'll try and be quick with it

54:14: since we're getting back to... How long have we been running actually?

54:19: We're coming up on an hour.

54:22: How long do we want to keep this going for?

54:24: So my aim was for one hour to two hours, depending on the questions. We got a lot of those questions, so I'm okay going through

54:30: all of these, but as we start getting out of two hours, we'll probably

54:34: stop it.

54:38: When you were talking about the build process, that kind of made me think of

54:42: something that I really enjoyed working on.

54:47: It's kind of one of those things where

54:49: it's really important, but it's just so invisible.

54:56: And what I did behind the scenes

54:58: is I basically reworked the entire

55:02: build process of FrooxEngine, almost.

55:08: Since FrooxEngine has been

55:10: around for a while, and it's been through

55:14: many updates to C Sharp and C Sharp's project system, we were still

55:18: using the legacy

55:22: C Sharp project format, which is called MSBuild.

55:27: And that really only works in

55:30: something like Visual Studio these days.

55:33: It's kind of hard to work with, it's not quite as robust as the newer

55:38: build system for .NET, and as a result

55:44: there would oftentimes be

55:49: like, you'd have like

55:50: weird issues if you wanted to add packages and stuff, and

55:54: you could only use something like Visual Studio as your IDE of choice to boot.

56:01: And I

56:03: saw that, and I

56:07: decided to poke at it, and it actually ended up being

56:11: a lot easier than I anticipated because Microsoft provides a nice

56:15: little tool to upgrade your projects, and so what I did is I

56:18: went through and I upgraded all of the projects to the new

56:22: C Sharp format, which means that we can take advantage of

56:26: the much nicer project files, which means it's easier

56:31: to edit them directly and add actions and stuff

56:35: and it also means the engine

56:39: can now be built in IDEs other than VS Code.

56:43: You could use... not VS Code, Visual Studio

56:47: proper is what I meant to say there. But now you can build it in like

56:50: VS Code, or like, you could build it in

56:56: you could probably build it in like

56:59: Rider if you pay for Rider, you could build it, you could even build the engine

57:02: from the command line now, which is really really good for

57:06: yeah, like automated builds. That's a big thing I did

57:11: that nobody saw, but I'm really really proud about.
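
As a generic illustration of the difference being described (this is not FrooxEngine's actual project file), an SDK-style C Sharp project collapses the old format's boilerplate into a few declarative lines:

<syntaxhighlight lang="xml">
<!-- Generic SDK-style project file for illustration only (not FrooxEngine's actual one). -->
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net9.0</TargetFramework>
    <LangVersion>latest</LangVersion>
  </PropertyGroup>
  <ItemGroup>
    <!-- NuGet packages become one-line references instead of separate packages.config entries. -->
    <PackageReference Include="Newtonsoft.Json" Version="13.0.3" />
  </ItemGroup>
</Project>
</syntaxhighlight>

A project in this form can be built from a terminal with a plain "dotnet build", which is what makes editor-agnostic workflows and automated builds straightforward.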

57:14: It's one of those things where it doesn't show on the surface, but

57:18: it makes our lives as developers way easier, because I had

57:23: so many times where I would literally lose sometimes even hours

57:27: of time just trying to deal with some kind of problem, and

57:30: having those problems kind of resolved, and have the system kind of be nicer

57:34: it allows us to invest more of our time into actually

57:39: building the things we want to build instead of dealing with project build issues.

57:43: One of the problems, for example, that's

57:46: kind of weird, like one of those weird things is with ProtoFlux.

57:50: Because for ProtoFlux, it's technically a separate system

57:54: and we have a project that actually analyses all the nodes

57:57: and generates C-Sharp code that binds it to Resonite.

58:03: The problem is, with the old MSBuild format,

58:06: for some reason, even if the

58:10: project that generates that code runs first

58:14: the build process doesn't see any of the new files

58:19: in that same build pipeline.

58:22: So if we ever added a new node, we would compile it and it would fail

58:26: because it's like, oh, this code doesn't exist

58:30: even though it actually exists at the time, it just doesn't see it.

58:34: With the changes Cyro made, the problem is gone. We don't have to deal with this whole thing.

58:39: But the really big thing is it prepares Resonite for more

58:43: automated build pipeline, which is something we've been trying to move towards

58:47: to because it's going to be one of the things that's going to save us a lot more time as well

58:51: that's going to make it so we can actually just push code

58:54: into the repository. There's automated tests that are going to run, there's going to be automated

58:58: builds, the binaries are automatically going to be uploaded and it's just going to

59:02: remove all of the manual work that happens all the time.

59:06: It makes bringing on people like me easier too.

59:09: It makes it easier to bring more engineers as well because now they don't have to deal with those weird

59:14: issues. I know Prime also lost

59:18: sometimes he lost a day just dealing with project issues

59:22: and a day you could spend working on other stuff

59:26: and instead you have to just make things work.

59:30: Thank you Cyro for making this.

59:34: Things like this, even though they're not visible to the community, they help a lot.

59:41: Next, we have

59:43: a question from FantasticMrFoxBox.

59:47: With sound system updates, can we get a way to capture a user's voice with

59:50: a permission and import audio streams dynamically into the world?

59:55: This would allow us to fully implement the ham radio stuff into Resonite and allow us

59:59: to ditch using external browser support to support audio.

01:00:04: So I'm not sure if I've

01:00:05: I don't understand enough about how you want to capture it

01:00:11: But since we'll be handling all the audio rendering

01:00:15: we'll be able to build a virtual microphone that actually captures

01:00:19: spatialized audio from wherever it is in the world.

01:00:23: So that's one of the things you'll be able to do. You'll be able to bring the camera

01:00:27: and have it stream to an audio device.

01:00:30: So I would say yes on that part, on the

01:00:35: kind of capture.

01:00:37: I don't know...

01:00:39: I think I know what they mean.

01:00:45: Am I correct in assuming

01:00:48: that you want a way to import multiple

01:00:51: streams into the world from a single user? Is that what you're talking about?

01:00:58: You'll probably have to wait for them.

01:01:00: Yeah, wait a second.

01:01:05: We might

01:01:05: get back to this question.

01:01:10: You'll essentially be able to render audio out

01:01:13: from any point in the game in addition to rendering for the user.

01:01:17: And then it becomes a question what do we want to do? Do we want to record an audio clip?

01:01:21: Do we want to output it into another audio device so we can stream it into something?

01:01:25: So that will work. If you want to import audio back in

01:01:31: that's probably a separate thing.

01:01:33: That's probably not going to come as part of it. We'll see.

01:01:37: If you have any kind of clarification just ask us more and we'll get back to this.

01:01:43: Next we have

01:01:46: EpicEston is asking, will the headless client be upgraded to .NET 9?

01:01:50: Yes. We plan to do this soon.

01:01:53: It should be mostly just a flip of a switch, we don't expect

01:01:57: big issues. One of the things we want to do is we're going to make announcements

01:02:01: so people know this is coming, you can prepare your tooling

01:02:05: make sure whatever scripts you're using

01:02:09: to update your headlesses, so they don't just explode.

01:02:14: There's a GitHub issue on it and I'll try to make the announcement

01:02:17: in a bit, probably sometime next week.

01:02:22: Get people ready. Alex2PI is asking

01:02:25: makes me wonder what's currently a culprit of most crashes, at least on my computer

01:02:29: I have seen information that Unity crashes; couldn't you just restart Unity?

01:02:35: We also had a discussion about couldn't you just

01:02:39: I mean, so

01:02:41: for the first part of the question, crashes, they can have lots of reasons

01:02:47: it's really hard to say, like in general

01:02:49: you pretty much have to send us the crash log, we look at it, we look at the call stack and be like

01:02:53: this is probably causing it, so it's kind of hard to say

01:02:58: in general, for the part where we just restart Unity

01:03:02: I mean, it's kind of what a crash is, it essentially breaks

01:03:07: and then it has to shut down and you have to start

01:03:09: it again, so in a way you're kind of restarting Unity

01:03:13: it's just that the restart is kind of forced

01:03:19: but this actually kind of ties

01:03:25: because if you've been here earlier

01:03:28: we've been talking about how FrooxEngine is going to essentially be moved into

01:03:32: its own process, and then Unity is going to be handling the rendering

01:03:36: one of the things that I'm considering as part of the design is so

01:03:40: that Unity can actually be restarted,

01:03:44: maybe. So if Unity happens to crash, we can keep

01:03:48: running FrooxEngine, start a new Unity, and just

01:03:52: reinitialize everything. So I do want to make that part of it

01:03:55: just in general to make the system more robust, so it's possible

01:04:02: but TBD

01:04:03: we'll see how that kind of goes
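
A minimal sketch of the idea being floated here, assuming a hypothetical renderer executable name and a placeholder re-handshake step (neither is the actual Resonite design): the engine process supervises the renderer process and relaunches it if it dies.

<syntaxhighlight lang="csharp">
// Sketch of the idea being discussed: keep the engine process alive and, if the
// renderer process dies, start a fresh one and re-establish the connection.
// "Renderer.exe" and Reinitialize() are placeholders, not actual Resonite pieces.
using System;
using System.Diagnostics;
using System.Threading;

class RendererWatchdog
{
    static void Main()
    {
        while (true)
        {
            using Process? renderer = Process.Start("Renderer.exe");
            if (renderer == null) return;     // could not launch the renderer at all

            Reinitialize(renderer);           // hand the current scene/state to the new renderer
            renderer.WaitForExit();           // returns when the renderer exits or crashes

            Console.WriteLine("Renderer exited; restarting it while the engine keeps running...");
            Thread.Sleep(1000);               // small back-off before relaunching
        }
    }

    static void Reinitialize(Process renderer)
    {
        // Placeholder: a real system would re-upload assets and resume streaming
        // frame data over whatever IPC channel the engine and renderer share.
    }
}
</syntaxhighlight>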

01:04:08: currentUK is asking, I have heard from someone complaints of the headless being a

01:04:11: patron reward. This was particularly a complaint about communities that do want to host events

01:04:15: being essentially forced into it to keep events going if their event host crashes. Are there

01:04:19: any plans later to remove the patron requirement for the headlesses when things are more stable

01:04:23: and performant? So at some point we'll probably

01:04:28: make it more open. Our

01:04:31: tentative goal, and this is not set in stone, so things

01:04:35: might change. Our tentative goal is we want to offer

01:04:39: a service where we make it easy to auto-spin-up headlesses,

01:04:43: and move Patreon to that, so if you

01:04:47: support us financially you will get a certain amount of

01:04:51: hours for the headlesses and we're going to make it very easy to host, and if you want to self-host

01:04:55: we're going to give you the headlesses. We do have to be careful with it

01:04:59: from the business perspective because Patreon is one of the

01:05:03: things that's supporting the platform and it's allowing us to work on it.

01:05:07: So we don't want to compromise that because

01:05:13: if we do something with that

01:05:15: it ends up hurting our revenue stream, then we're not able to

01:05:18: pay people on our team, and then we're not able to work on

01:05:23: things and things end up kind of bad.

01:05:28: We do want it to be accessible to as many people as possible, but we're sort of

01:05:31: balancing it with the business side of things.

01:05:37: Next one, TroyBorg.

01:05:39: Cyro also did the FFT node a while ago. Could having the new audio system make FFT part of the game,

01:05:43: like waveform visuals, or be able to do better detection of bass for music effects?

01:05:47: That's actually separate from it, because that happens fully within Resonite.

01:05:52: The audio system is more about rendering the audio output

01:05:56: and pushing it to your audio device.

01:06:02: Next we have, I'm kind of just speeding through these questions because we have a bunch.

01:06:07: Skywakitsune. A few people have mentioned that they are not happy with the new walking system and how

01:06:11: it looks. Are there plans to continue to improve that? I know there was a specialized update, but people

01:06:15: are still not happy. We can always improve things. We just released

01:06:19: an update which

01:06:23: integrates some of the community settings which would make it look way better.

01:06:29: For things that are like, you know,

01:06:31: that people still find issues with, we will need reports on those, because

01:06:35: right now

01:06:38: we're not sure

01:06:41: after the update, we're not sure what's making people not happy about it.

01:06:45: We'll have more concrete stuff to work with

01:06:48: as people make reports, so we can know

01:06:52: what to focus on. But yes, in general, we are always

01:06:56: willing to improve things. We want to

01:07:01: make

01:07:04: essentially want to make it as polished

01:07:06: as it can be, but we also need more kind of hard

01:07:09: data to work with so we know where to invest our time.

01:07:18: Next we have TroyBorg. What is causing the brightness flare

01:07:21: sometimes when Froox moves? I'm not sure. It could be just the bloom on that

01:07:25: thing, maybe.

01:07:27: It's his radiant yellow complexion.

01:07:33: Your resplendent visage.

01:07:35: This is actually Erlage. What was the answer to this?

01:07:39: So these just look like questions within the chat.

01:07:43: Erlage86. Who is the second person here on the camera?

01:07:46: This is Cyro. He's our engineering intern.

01:07:50: Hello. Hi. How you doing guys? It's me.

01:07:57: Next we have

01:07:58: SkymoKitsum. Questions from Tara Whitel who can't watch the stream now.

01:08:03: Are video players going to be updated with a newer core of VLC? I have

01:08:06: heard from several builders that the players use a very outdated core.

01:08:10: Yes, the system we use right now, it's a plugin called UMP

01:08:14: Universal Media Player, which is built on VLC, and unfortunately it

01:08:18: hasn't been updated in years, which means it's using an older

01:08:22: version of it. We've been looking into upgrading

01:08:26: to the actual official VLC plugin. The problem is

01:08:30: it's still not mature enough in some ways.

01:08:35: The last I remember, there's issues where you cannot

01:08:38: have more than one video at a time. You can only have

01:08:42: one, and if you try to do another one, it just explodes.

01:08:47: There's other things we can look

01:08:49: into, like alternative rendering engines, but there's also

01:08:52: potential time and money investment. If the pros

01:08:57: outweigh the cons, we can

01:09:00: consider that we might invest in one, but we need to do some testing

01:09:05: there and see how well it works.

01:09:09: It's unfortunately a difficult situation because the solutions

01:09:12: are limited.

01:09:17: It's something we want to improve,

01:09:20: but it's also difficult to work with, unfortunately.

01:09:26: Can I comment on the next one?

01:09:32: Rasmus0211

01:09:32: asks, thoughts on about 75% of all users being in private worlds

01:09:36: around the clock. Often new users mention they see practically no enticing

01:09:40: worlds. This is not a Resonite

01:09:44: problem. This is a problem of scale.

01:09:48: All platforms have a

01:09:53: pretty wide majority of people who just kind of want

01:09:56: to hang out and not really be bothered.

01:10:01: Unfortunately, we're not the biggest platform out

01:10:04: there. We're still kind of small.

01:10:09: And as we

01:10:10: grow, that problem will undoubtedly get better.

01:10:16: It's not really a

01:10:18: technical problem, it's more like a social one, because people

01:10:23: behave in a certain way, and it's really hard

01:10:26: to change that. There's some things we want to do to

01:10:30: entice people to make it easier to discover things, like

01:10:34: we were talking earlier, adding an events UI, so you can see these are the things

01:10:38: that are coming up, these are going to be public events that you can join. Right now, I

01:10:42: believe there's the Creator Jam event that's going on, and it's always

01:10:47: every weekend, it's public to everyone.

01:10:51: But it depends what people are coming in for, because people

01:10:54: might come in, and they don't actually want to join public events, they want to go into those

01:10:58: private worlds. But the question is, how do you make those people

01:11:03: discover the friend groups and hang out

01:11:07: in those worlds? It's a challenging problem,

01:11:11: especially from the platform perspective, because we can't just force

01:11:15: people into public worlds. People

01:11:17: will host whatever worlds they like, but

01:11:22: we always want to see what kind of tools we can give to entice people

01:11:27: and make worlds and socialization easier for

01:11:30: them to discover. But like Cyro said, it is a thing that

01:11:34: gets better with scale, once we can grow more.

01:11:40: There's a

01:11:42: number of events, though. If people go right now,

01:11:47: since we don't have the events UI in-game, if you go into the

01:11:51: Resonite Discord,

01:11:57: if you go into

01:11:58: Resonite Discord, we have community news, and lots of

01:12:03: different communities post regular events there, so people can

01:12:06: find what's going on in the platform, it helps a bit in the meantime if

01:12:11: people are looking for things to do.

01:12:15: Next question from Baplar.

01:12:20: Yes, it's actually being worked on in parallel.

01:12:26: Geenz is one of the main people working on that.

01:12:30: We did have meetings

01:12:33: now and then, we're sort of synchronized on the status of it.

01:12:37: Last time, that was two weeks ago or so, we talked about

01:12:41: the multi-process architecture, how that's going to work, how it's going to integrate

01:12:45: with Froox Engine, and how

01:12:49: those systems are going to communicate. Geenz's approach was

01:12:53: to look at what the current Unity integration has

01:12:57: and what needs to be implemented on Sauce's end. However, there's a lot of things

01:13:01: that we're actually moving, like the particle system, audio system,

01:13:05: input system, lots of things that are going to be moved forward into Froox Engine,

01:13:09: so they don't need to be implemented on Sauce's side, and they're going to focus more

01:13:14: on other things. They have a list

01:13:17: of Sauce features, and specifically

01:13:20: Bevy features, because Sauce is being built around the Bevy

01:13:24: rendering engine, which

01:13:28: maps to the current features we have. For example, we have lights,

01:13:32: do they support shadows, we have reflection probes, do they support

01:13:36: this and that. So they're working on making sure there's feature

01:13:40: parity there. Once we have the performance upgrade,

01:13:44: we can work more on the integration. They also work

01:13:48: on the Resonite side, so, you know, what Geenz has been doing on consolidating

01:13:52: the shaders, because all the shaders we have right now,

01:13:56: they need to be rewritten for Sauce, because

01:14:00: the current ones are designed for Unity, so we need equivalents;

01:14:05: the equivalents of those are

01:14:08: essentially going to be implemented for the new rendering engine.

01:14:14: Next, Epic Easton. How do you

01:14:16: make a walkie-talkie system? That's actually one thing you should be able to do

01:14:21: with the new audio system, you'll be able to

01:14:24: have a virtual microphone, put it on a thing

01:14:28: and then have it output from another audio source. There actually might be

01:14:32: a thing you'll be able to do once we rework that, because it shouldn't be too difficult to add

01:14:36: components for that. Relanche asks,

01:14:41: rigid body Newtonian physics system, soon or later?

01:14:44: So definitely sometime after the performance upgrade

01:14:49: we integrate a physics engine called Bepu Physics.

01:14:53: One of the things we want to do after we move the Froox engine

01:14:56: out of Unity and it's running on .NET 9, we want to synchronize

01:15:00: Bepu to the latest version, because right now we've kind of had to diverge,

01:15:04: because Bepu Physics, it used to work with

01:15:07: .NET Framework, which is what Resonite is on right now because of Unity.

01:15:12: But now the newer versions they require, I think .NET 5

01:15:15: or maybe they even bumped it higher, which means we cannot

01:15:19: really use those, at least not with lots of backporting.

01:15:23: So one of the tasks is going to be to sync it up and then we're going to

01:15:27: be able to look at how much work it is, when we want to

01:15:31: prioritize it, and how we should get the simulation integrated with Froox Engine. It's also

01:15:35: going to help because Bepu Physics is designed to work with

01:15:39: modern .NET to be really performant, which is why

01:15:43: I would personally consider the performance upgrade a prerequisite for

01:15:48: implementing that, so we're actually running it with

01:15:51: the runtime it's supposed to run with. But there's no specific

01:15:55: kind of prioritization right now. Once we're done with the performance update, we might focus

01:15:59: more on UI, or be focused on IK, or maybe other things; we'll

01:16:03: reevaluate at that point.

01:16:07: Grant is asking

01:16:09: With the move away from Unity to Sauce, could it be possible to dynamically connect and disconnect from the VR

01:16:13: runtime without restarting the game? That's not really a thing

01:16:17: that needs the move away from Unity. It's possible to implement it

01:16:22: with Unity. It just takes a fair amount of work.

01:16:27: So, possible yes, I would

01:16:29: say. The question is are we going to invest time into implementing that.

01:16:35: For that I don't know the answer right now.

01:16:39: Next we have a question, RustybotPrime

01:16:41: Would these audio rendering sources allow for spatial data for your own

01:16:45: voice? Example, if I want to record conversation between myself and someone

01:16:49: else from third person without it sounding like I'm right at the camera.

01:16:54: Yes, there wouldn't be an issue because we can just

01:16:58: have any sort of listener in the world and just

01:17:01: record that with binaural audio and everything.

01:17:06: Next, what flavor of sauce, what does it

01:17:09: taste like? And it's very salty. Mayonnaise.

01:17:13: Not mayonnaise, it's actually named after his own kind of sauce,

01:17:17: which is why it's named Sauce. Actually, I forget what he calls it.

01:17:23: Scotch sauce. Scotch sauce, yes.

01:17:25: He makes this really delicious sauce, it's a very salty

01:17:29: one, but it has loads of flavors to it.

01:17:34: I think this next one's aimed at me.

01:17:37: Alex2pie says, Cyro, I heard that some people don't trust you and that you don't care.

01:17:42: You know where this comes from. I think I do.

01:17:45: I'm in desktop a lot, and I'm often

01:17:49: either working in Froox Engine these days, or

01:17:54: I'm kind of audio sensitive and I can get overstimulated

01:17:57: kind of easily, so sometimes I will just kind of stand there.

01:18:02: Or maybe I won't respond so colorfully.

01:18:05: But I like having people around, and so that's why

01:18:09: I exist despite that.

01:18:13: I also appreciate it when

01:18:18: I'll probably open up a lot more

01:18:23: if

01:18:25: ...how do I put this...

01:18:30: Basically, if you

01:18:32: want to interact with the Cyro creature well,

01:18:37: do things like

01:18:39: ask before poking my nose or patting my head and stuff.

01:18:46: And ask me

01:18:48: if you want to send me a contact request. Just don't come up

01:18:52: to me and be like, you're cute, and then click my name and add me. Because then I have to

01:18:56: explain, I'm probably not going to add you, man, we talked

01:19:00: for maybe two seconds. I need at least 45 seconds.

01:19:09: But I...

01:19:10: If you've come across me and I've been in that

01:19:12: sort of state where I'm not super talkative, or maybe I seem a little detached,

01:19:18: hopefully that sheds a little light on that.

01:19:20: I love this place very dearly, and

01:19:25: I love all of you very dearly.

01:19:28: Cyro is a good bean.

01:19:33: So next, we have a question from Dan Amos.

01:19:37: What's the current workflow for identifying performance bottlenecks?

01:19:41: So, generally, the workflow

01:19:44: is something like, you know,

01:19:47: it kind of depends, because there's lots of things that can

01:19:50: cause performance issues.

01:19:54: So usually it's a combination of different things, but usually it kind of starts more

01:19:58: with just observation. You know, kind of seeing what's running

01:20:02: slow, when am I lagging, and so on.

01:20:07: Once there's that initial observation, we will try to

01:20:12: narrow down to the root of the issue.

01:20:14: And for that, we can use a variety of tools. Some of them are in-game.

01:20:18: For example, we have stats on how

01:20:23: much are certain parts of the process taking.

01:20:26: Once we need more detailed information, we can, for example, run

01:20:30: the headless client with Visual Studio profiling tools,

01:20:34: and they actually measure how long is spent in each method,

01:20:39: how long is spent in each piece of code. That gives us some kind of data.

01:20:42: The other part of it is benchmarking. Once we have a suspicion that

01:20:47: this thing is causing a lot of performance problems.

01:20:50: We can write a test sample, and then run it

01:20:55: with different runtimes, run it with different settings,

01:20:58: do A-B test things, see how things change.

01:21:03: For example, I've done this with a lot of Resonite's

01:21:06: own extension methods where, for example,

01:21:10: even with stuff like the base vector operations, I would try different ways to implement

01:21:13: certain operations, run a benchmark, and see how fast it runs.
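
For illustration, this is the kind of A/B micro-benchmark being described, written with BenchmarkDotNet; the two implementations below are generic examples rather than Resonite's actual extension methods, and the same project can be run on Mono and on modern .NET to compare how much the runtime itself changes the results.

<syntaxhighlight lang="csharp">
// Sketch of an A/B micro-benchmark using BenchmarkDotNet (generic example,
// not Resonite's code): two ways of computing the same vector operation.
using System.Numerics;
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Running;

public class VectorDotBenchmark
{
    private readonly Vector3[] a = new Vector3[1024];
    private readonly Vector3[] b = new Vector3[1024];

    [Benchmark(Baseline = true)]
    public float ManualDot()
    {
        float sum = 0f;
        for (int i = 0; i < a.Length; i++)
            sum += a[i].X * b[i].X + a[i].Y * b[i].Y + a[i].Z * b[i].Z;
        return sum;
    }

    [Benchmark]
    public float BuiltInDot()
    {
        float sum = 0f;
        for (int i = 0; i < a.Length; i++)
            sum += Vector3.Dot(a[i], b[i]);
        return sum;
    }
}

public static class Program
{
    // BenchmarkDotNet runs each method many times and reports mean times and ratios.
    public static void Main() => BenchmarkRunner.Run<VectorDotBenchmark>();
}
</syntaxhighlight>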

01:21:20: One thing it kind of depends on

01:21:22: there is what runtime it uses.

01:21:26: One thing I would, for example, find is certain implementations, they actually

01:21:30: run faster with Mono,

01:21:34: and then slower with the modern .NET runtime.

01:21:38: There's a lot of things in FrooxEngine where

01:21:42: sometimes people kind of decompile and say, why is this

01:21:45: done this weird way? And in some cases, it's because

01:21:49: it actually, even though you wouldn't do it with

01:21:53: more modern code, it interacts better with the runtime used at the time.

01:21:59: But for example, with these

01:22:01: general operations, I would find

01:22:04: if I compare them with the Mono in Unity

01:22:07: and compare them with the modern runtime, they would run

01:22:11: 10, sometimes even 100 times faster. There's some other things

01:22:15: that also speed up, some things that are the same.

01:22:19: But generally, it's just a combination of tools.

01:22:24: We observe something not performing well, we have a suspicion

01:22:27: that this might be causing it, and then we just use tools

01:22:31: to dig down and figure out the root cause of that problem.

01:22:38: So hopefully that answers that.

01:22:41: I think there are also

01:22:44: some manual profiling tools out there, like Tracy, I know there's some

01:22:47: Tracy bindings for C Sharp, which are really cool.

01:22:52: That's actually one of the cool things, because there's a bunch of libraries that we cannot even use

01:22:56: right now because of the old runtime. Tracy, I think it requires

01:23:01: .NET 8 or

01:23:03: some new version.

01:23:06: It's listed for .NET 7, but I think it's just interop, so it could work, but

01:23:11: it's better to just wait.

01:23:14: We do want to integrate more tools. Usually, you have a performance profiling toolset

01:23:20: so you just dig down and figure out where it could be coming from.

01:23:23: Sometimes it's easier to find, sometimes it's harder, sometimes you have to do a lot of work.

01:23:27: For example, the testing I've done before

01:23:31: comparing the .NET 5 or whatever version it was

01:23:35: and Mono, I saw this code is running way better

01:23:40: so I think it's going to help improve a lot, but

01:23:43: it's still usually testing bits and pieces, and it's hard to test the whole

01:23:47: thing, because the whole thing doesn't run with that new runtime yet.

01:23:52: That's why for our current performance

01:23:55: update, we moved the headless first, because

01:23:59: moving the headless was much easier since it exists outside of Unity

01:24:04: and we could run sessions and compare

01:24:07: how does it perform compared to the Mono one.

01:24:11: And the results from that, we got

01:24:16: it's essentially beyond expectations, it's way faster.

01:24:20: That makes us more confident in doing all this work to move FrooxEngine

01:24:24: out of Unity, it's really going to be worth it.

01:24:33: As of my perception, Resonite is somewhat being marketed

01:24:40: as a furry social VR platform, which is not the case at all. But every time

01:24:44: I ask somebody, hey do you want to try Resonite, I usually get answers like, oh that VR game

01:24:48: for furries. I have nothing against them, but in Resonite they are very publicly dominant.

01:24:52: Are there thoughts about this topic that could maybe bring in more people?

01:24:56: So, we don't really market Resonite as a furry social

01:25:00: VR platform. We actually specifically, with Chroma, who's

01:25:04: heading our marketing, we specifically for our own official

01:25:08: marketing materials, we show different diverse avatars

01:25:12: because yes, there's a lot of furries on this platform and

01:25:16: it's also a self-perpetuating thing where

01:25:20: because there's a lot of furries, they bring in a bunch more.

01:25:24: We do have lots of other communities as well, which are not

01:25:28: as big, but they are here as well.

01:25:32: So, we want Resonite to be for everyone. It's not designed

01:25:37: specifically for furries.

01:25:42: We want everyone to be welcome here.

01:25:50: It's sort of like a

01:25:50: complicated kind of thing because

01:25:54: the marketing we make, we try to make it as generalized, but

01:25:58: the question is when you come to the platform, you're going to have lots of furries.

01:26:03: I think the only way to bring in

01:26:06: more people is to showcase lots of

01:26:10: different people on the platform, lots of different

01:26:14: kind of communities, but if there's lots of furries, it becomes

01:26:18: kind of difficult. It's self-perpetuating.

01:26:24: But I think it's also a thing of scale.

01:26:26: As we keep growing, there's going to be more different groups of people

01:26:30: and the communities that are different kind of fandoms or just different

01:26:36: demographics, they're going to get bigger and it's going to help

01:26:41: people who are from those

01:26:42: demographics find their groups much easier, once there's more of them.

01:26:49: Yeah, Resonite's all

01:26:50: about self-expression and stuff and being who you want to be

01:26:55: and building what you want to build, and furries kind of got

01:26:58: that down pat, and so that's probably why you see a lot of them, but

01:27:02: everybody can do that. It's not just those people, it's made for

01:27:08: every person to come together

01:27:10: and hang out and build and

01:27:14: just be you, no matter who you are.

01:27:19: Yeah, we try to make this platform kind of inclusive and for everyone.

01:27:23: It's our goal. We don't want

01:27:26: anybody to feel unwelcome.

01:27:30: I mean asterisk, because we don't want

01:27:34: hate groups, people like that.

01:27:36: So that one we would have an issue with, but generally

01:27:40: we want this platform to be for everyone.

01:27:43: Yeah, also we're coming up on the hour and a half mark.

01:27:46: Ok, so we have about 30 minutes left. We're getting to the end of the questions, so we'll see how they

01:27:52: keep piling, but we might need to stop them at a certain point.

01:27:57: So next question, Oran Moonclaw.

01:28:00: Is rendering performance being looked into before you move to Sauce as well? From my experience, when

01:28:04: the system is not CPU bound, the rendering can still be quite heavy at higher resolutions.

01:28:09: So that's actually a thing that Sauce will help with.

01:28:12: We don't want to invest super much time into the current rendering pipeline

01:28:16: with Unity, because the goal is to move away from it, which means

01:28:20: any time we invest

01:28:24: improving Unity, it's essentially going to be wasted

01:28:28: and it's going to delay the eventual big switch. Sauce

01:28:32: is going to

01:28:34: use a much more modern rendering method. Right now we're using the deferred method,

01:28:39: which can be quite heavy on things like

01:28:43: memory bandwidth and so on.

01:28:47: With Sauce, it's going to use something called clustered forward rendering,

01:28:52: which allows lots of dynamic lights while also being much lighter

01:28:55: on the hardware. So that should improve

01:28:59: rendering performance on itself, and once we make the move we can

01:29:03: look for more areas to optimize things, introduce things like

01:29:07: impostors, more LOD systems and things like that.

01:29:14: So yeah,

01:29:15: it's pretty much like, unless there's any sort of

01:29:19: very obvious low-hanging fruit with rendering

01:29:23: that would take us less than a day

01:29:27: or maybe just a few days to get a significant boost in performance

01:29:31: we're probably not going to invest much time into it and instead want to invest

01:29:35: into the move away from Unity.

01:29:39: Next question, RustybotPrime.

01:29:41: How straightforward is the conversion of our current particles to PhotonDust?

01:29:45: I assume the goal is seamless to the point of them looking and behaving identically, but is there anything current particles can do

01:29:51: that PhotonDust won't, or will it do it in a different enough way

01:29:55: that it will have to be manually fixed?

01:29:58: So the conversion, I can't really answer it exactly, because the conversion actually isn't written yet

01:30:04: however, the main focus right now is actually

01:30:07: feature parity. So I actually have a list, and I can post it

01:30:10: in the devlog if you're curious, where I have all the things that

01:30:15: the legacy system has, and I'll be working through that list

01:30:18: just making sure that PhotonDust has the same or equivalent functionality.

01:30:23: The goal is to make it so it's pretty much equivalent

01:30:27: so it converts and it will look either the same

01:30:31: or just very close

01:30:33: so hopefully there won't be things that are too different

01:30:39: however, sometimes those things

01:30:42: become apparent during the testing period

01:30:45: so once those things come out, we'll look at them and be like

01:30:49: this is easy enough to fix, or maybe this one's a little bit more complicated

01:30:53: maybe we just bring it close enough and ask people to manually

01:30:57: fix things, but we'll have to see how this kind of goes

01:31:02: sometimes it's kind of hard to know these before it actually happens

01:31:06: but

01:31:08: it should have feature parity, well, it's going to have feature parity with the current

01:31:15: system, so most things should just work.

01:31:18: next we have Fuzzy Bipolar Bear

01:31:21: is there a way to stop the dash particles from being shown when streaming? I don't think there is

01:31:25: I think we would have to implement that, does it show?

01:31:29: it does show, yeah

01:31:33: next, Ekky Kadir

01:31:36: what things are currently planned for the whole performance update? I think .NET 8 is part of it, for example

01:31:40: so we actually answered this one earlier

01:31:43: I'm not going to go into details on this one

01:31:49: but essentially moving to .NET 9

01:31:52: we were originally going for .NET 8, but .NET 9 released

01:31:56: literally just like a week ago or so

01:31:59: in short, currently there's two main systems that need to be moved

01:32:04: fully into Froox Engine because they're hybrid systems: that's the particle system,

01:32:08: which is being worked on right now, there's the sound system, which Cyro did some work on

01:32:13: once those systems are fully in Froox Engine, we're going to rework

01:32:16: how Froox Engine interfaces with Unity, and then we're going to move it out

01:32:20: into its own process, to use .NET 9

01:32:23: and it's going to be the big performance uplift from there

01:32:28: we're going to post, this video is going to be archived

01:32:33: if you're curious in more details, I recommend

01:32:36: watching it later, because we went into quite a bit of detail on this

01:32:39: earlier on the stream

01:32:43: So this question is from within the chat.

01:32:47: ShadowX, in the future, could there be a way to override values

01:32:51: not just per user, but in different contexts? For example, override active-enabled

01:32:56: state of a slot or component for a specific camera, basically same concept

01:32:59: of RTO, but more flexible.

01:33:01: so probably not like this

01:33:07: the problem with RTO is

01:33:09: if you want to override certain things

01:33:15: for example, in rendering

01:33:18: when rendering is happening, all the

01:33:21: work on updating the world is already

01:33:25: complete, which means the renderer actually has much more limited functionality

01:33:29: on what it can change

01:33:33: probably the best way to handle situations like that

01:33:37: is you have multiple copies of whatever you want to change

01:33:42: or whatever system you want to have

01:33:45: and you mark each one to show in a different context

01:33:49: but you need to manually set them up

01:33:52: consider a scenario where you override an active-enabled state

01:33:59: that component might have

01:34:02: a lot of complex functionality, maybe there's even

01:34:06: ProtoFlux or some other components that are reading the active state

01:34:09: and doing things based on being enabled or disabled

01:34:13: and once you get into that realm

01:34:16: the effect of that single enabled state can be very complex

01:34:21: where you can literally have a bunch of ProtoFlux that does a bunch of modifications

01:34:26: to the scene when that state changes

01:34:29: and it's too complex for something like the renderer to resolve

01:34:32: because you would have to run another update

01:34:37: on the world just to resolve those differences

01:34:40: and the complexity of that system essentially explodes

01:34:46: so probably not in that sense

01:34:48: if you give us more details on what you want to achieve

01:34:52: we can give a more specific answer

01:34:55: but this is pretty much how much I can say on this in general.

01:35:03: Was the locomotion animation system one of the Unity systems that needed to be reimplemented in Froox Engine, or was it something else?

01:35:09: that one was something else, it came

01:35:12: as a part of business contracts

01:35:16: it's not something

01:35:19: it's not something I kind of wanted to prioritize myself

01:35:25: it's kind of a complicated situation

01:35:27: but unfortunately it was necessary at the time

01:35:32: and I'm not super happy with how

01:35:34: that whole thing went because

01:35:40: it came at the wrong time

01:35:45: and it's

01:35:47: it was essentially a lot of, because we don't have a lot of systems

01:35:51: for dealing with animation which would have made these things much easier

01:35:55: and we have never worked with IK itself which would have made things also easier

01:35:59: so there was a lot of foundational work that was not there

01:36:05: and also

01:36:07: the timeline was kind of really short

01:36:11: so it was pretty much like just a month of

01:36:14: constant crunch just kind of working on it and

01:36:17: there wasn't enough time to kind of get it through

01:36:23: so it is a complicated situation

01:36:26: unfortunately. And there's a thing that kind of happens sometimes with businesses

01:36:30: like you end up in a

01:36:34: situation where you don't really have a good

01:36:38: path so you just have to deal with it

01:36:43: we want to eliminate those kind of situations and we had

01:36:46: a number of conversations internally to see how do we prevent this

01:36:50: from happening again, how do we make sure we don't end up in a situation

01:36:54: where we have to do something like that

01:36:58: and we have a much better understanding of the problem

01:37:02: now where if a situation

01:37:06: like this were to occur again we're going to be

01:37:10: better equipped on the communication side

01:37:14: how do we deal with it and how do we make sure it doesn't mess with

01:37:18: our priorities and things we need to focus on

01:37:24: so it was like

01:37:25: it was a messy situation, I'm not happy with how I handled

01:37:30: some of the things with it

01:37:32: but it's pretty much

01:37:35: it is what it is and the best thing we can do right now is

01:37:41: learn from it and try to improve things

01:37:46: Next question is

01:37:48: How are you compiling the questions from the stream chats? I thought the Twitch nodes were

01:37:52: broken. No, it actually works.

01:37:56: We do have this thing here where we're going through the questions. This is an older one

01:38:00: I need to grab a bigger one, but it's sort of like, you know, sorting the questions

01:38:04: for us

01:38:11: The Twitch

01:38:12: nodes actually were broken for a while, where

01:38:16: the displays of them didn't work, and that got fixed very recently.

01:38:19: I pushed the update for it last week

01:38:26: So next we have Epic Easton

01:38:28: He's asking, we were able to access the internal arrays to edit things

01:38:32: like color over lifetime and other over-lifetime values. Will those be

01:38:36: properly converted? Yes. Those systems have been already

01:38:40: implemented for PhotonDust, so they're going to be converted to equivalent ones.

01:38:44: So it's just going to work out of the box. The good news

01:38:48: is there's also new modules

01:38:52: because PhotonDust, the new particle

01:38:56: system, is designed to be way more modular

01:39:01: So there's modules that instead of just

01:39:04: the internal array, you can also specify the color over lifetime

01:39:09: using a texture, or using

01:39:12: starting and ending color. You can also do starting and ending color in the

01:39:18: HSV color space, so there's

01:39:21: a lot of new color effects that it can do that's going to give you more control over the particle

01:39:25: system. And we can always add more, because we now have full control

01:39:29: of the system, so those modules are very easy to write.
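
As a rough illustration of what a start/end colour module in HSV space does (this is not the actual PhotonDust module code, just a generic sketch), interpolating hue along the shorter arc of the colour wheel over the particle's normalised lifetime and converting back to RGB looks roughly like this:

<syntaxhighlight lang="csharp">
// Illustrative sketch (not the actual PhotonDust module): interpolate a particle's
// colour between a start and end value in HSV space over its normalised lifetime,
// taking the short way around the hue wheel, then convert back to RGB.
using System;

public readonly record struct Hsv(float H, float S, float V); // H in degrees [0, 360)

public static class ColorOverLifetime
{
    public static Hsv LerpHsv(Hsv start, Hsv end, float t)
    {
        // Interpolate hue along the shorter arc of the wheel.
        float dh = ((end.H - start.H + 540f) % 360f) - 180f;
        float h = (start.H + dh * t + 360f) % 360f;
        return new Hsv(h,
                       start.S + (end.S - start.S) * t,
                       start.V + (end.V - start.V) * t);
    }

    public static (float R, float G, float B) HsvToRgb(Hsv c)
    {
        float chroma = c.V * c.S;
        float hPrime = c.H / 60f;
        float x = chroma * (1f - MathF.Abs(hPrime % 2f - 1f));
        float m = c.V - chroma;
        (float r, float g, float b) = (int)hPrime switch
        {
            0 => (chroma, x, 0f),
            1 => (x, chroma, 0f),
            2 => (0f, chroma, x),
            3 => (0f, x, chroma),
            4 => (x, 0f, chroma),
            _ => (chroma, 0f, x),
        };
        return (r + m, g + m, b + m);
    }

    public static void Main()
    {
        var start = new Hsv(0f, 1f, 1f);    // red
        var end   = new Hsv(240f, 1f, 1f);  // blue
        for (float life = 0f; life <= 1f; life += 0.25f)
            Console.WriteLine($"{life:0.00} -> {HsvToRgb(LerpHsv(start, end, life))}");
    }
}
</syntaxhighlight>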

01:39:33: This next one is a little

01:39:37: moderation focused. Do you mind if I attempt to answer it?

01:39:43: Okay. Let me take a breath

01:39:45: for this one, because it's a long one. On the topic of the platform being

01:39:48: for everyone, why were nipples allowed to be visible if the majority of people in the world,

01:39:52: including me are not going to want to see them in public sessions. I will admit

01:39:56: that it has been an extremely rare occurrence of seeing someone with them shown in a public session

01:40:00: and will it be possible for me to request things like this both to the team and other people

01:40:04: without having my morals-slash-beliefs questioned at every turn?

01:40:10: So, the reason why we

01:40:12: wanted to take a stand on topless equality

01:40:17: that's what this issue is called, by the way, it's called topless equality

01:40:19: is, um, because

01:40:25: ultimately

01:40:28: like, if a man can have a bare chest

01:40:31: you know, why can't a woman? The only difference is that on average

01:40:35: women have larger chests than men, and I think

01:40:39: we're also an EU-based company, right?

01:40:43: I'm from Europe. Okay, this is the stance in a lot of places

01:40:48: in Europe, too, where topless equality is just sort of the norm,

01:40:52: and we want to normalize that, because

01:40:57: we do believe in this kind of equality, like, why

01:40:59: can't a woman, you know...

01:41:03: why can't a woman be topless, you know, in a non-sexual context?

01:41:07: There's just no precedent for it.

01:41:12: And, let me see if I'm...

01:41:16: There's also a thing with this, it's like we

01:41:19: you know, we believe in equality and we believe in a lot of progress, so

01:41:23: we do take a stance on those things, but we also kind of give you tools

01:41:27: to kind of deal with those, so if it's something you really don't want to see

01:41:31: there's an avatar block function. You can block those people, they will not appear

01:41:35: for you. There's probably more things we can do in that

01:41:39: area as well, but ultimately we want to be like, you know,

01:41:44: very kind of like open and very kind of progressive as a company

01:41:46: when it comes to these things. There's also, like, I would really recommend

01:41:51: asking this question also in the moderation

01:41:54: office hours, because the moderation team is the one that kind of deals

01:41:59: with this in a lot of detail, and they're going to have a lot more context for

01:42:03: these things. But also, you know, I

01:42:08: don't necessarily believe that

01:42:11: the majority of the people on the internet have that stance.

01:42:15: There's a good chunk of

01:42:19: people who are very open about this, and

01:42:24: I feel like that chunk is kind of growing. People are

01:42:26: getting more open with things.

01:42:30: I do recommend bringing this to the moderation office hours, they're going to be able to

01:42:35: give you a much better answer for this because they've been

01:42:39: dealing with this topic for a while.

01:42:45: So, you know, take what we say with a little

01:42:47: bit of a grain of salt. I don't want to step on the moderation team's

01:42:51: toes with that.

01:42:54: Yeah, I was going to say something to, I was going to say something to wrap it up. What was I going to say?

01:43:05: Yeah, I was just going to say, I don't know what

01:43:09: I don't know what you mean by, because I commented

01:43:13: I don't know what you mean by this rule being

01:43:17: exploited by transgender males and females, but

01:43:20: being transgender has nothing to do with this.

01:43:24: If you want to be a boy or want to be a girl

01:43:29: that has no bearing on this rule.

01:43:31: That's part of the equality too, because it kind of

01:43:35: erases that kind of disparity, it doesn't really

01:43:39: matter. If you do feel there's some exploit, you can

01:43:43: always, you know, you can file moderation reports or you can file, like you know

01:43:48: you can bring these

01:43:51: to the moderation office hours and discuss these there.

01:43:55: Then we can kind of see what is happening and then we sort of evaluate does it fit

01:43:59: with our rules or does it not.

01:44:03: So you can, if you feel there's some issue

01:44:07: you can make us aware of it. We can't promise that we're going to

01:44:11: agree with you, that we're going to have the same view on it, but

01:44:14: we can at the very least look at it and listen to what you have to say on that.

01:44:22: So next we have

01:44:23: Grand UK. Hearsay: I have heard from someone that they tried to report

01:44:27: someone to the moderation team but because they were connected to the team nothing happened

01:44:31: of it and they ended up banned instead. I can't confirm

01:44:35: 100% that what was said happened and I know nothing can be said about moderation

01:44:39: cases, but in cases where there are conflicts of interest like above,

01:44:43: what can be done and how can we be sure we won't have wrongful consequences or bans for trying to

01:44:47: uphold the usage guidelines for everyone?

01:44:52: So, I understand there's not like super many details but I can kind of

01:44:55: talk in general. Sometimes we do have cases where

01:45:02: there's actually two things.

01:45:03: We do have cases where there's reports against people who are on the moderation

01:45:07: team or even on the Resonite team.

01:45:11: If it's a report against someone who's on the moderation team that will usually go

01:45:15: to the moderation leads and those people

01:45:19: cannot deal with it, they will investigate. We actually have multiple moderation

01:45:23: leads as well. That way it's not like

01:45:27: there's a single person who can just bury the thing but there's multiple people

01:45:31: who all can see the same data and then sort of check on each other.

01:45:36: If something happens with the team,

01:45:38: or if there's an issue with somebody on the actual Resonite team, usually

01:45:43: that goes to Canadian Git, who's handling

01:45:47: those things, and he brings these things to me.

01:45:54: We have cases

01:45:56: where we had to deal with difficult situations before

01:46:00: both on the team and in the moderation

01:46:03: team as well. I can't really go into details

01:46:08: because there's privacy issues

01:46:12: with that. I can tell you there's been

01:46:15: cases where people on the moderation team

01:46:19: they had to permanently ban some people who

01:46:22: were their friends, even long-time friends, because

01:46:27: they did something wrong.

01:46:31: This caused people on the moderation team

01:46:34: a lot of distress, but they still made the decision

01:46:38: to ban their friend because they

01:46:43: want to uphold the moderation rules

01:46:47: above all else. I've looked at

01:46:51: a few of those cases because I do want to make sure things are

01:46:56: going okay, and that there's no

01:46:59: favoritism happening. I've been involved in

01:47:03: a few of those cases as well.

01:47:07: Part of the discussion of it and so on.

01:47:12: There's been a number of difficult discussions on those

01:47:15: and every single one, if there was sufficient

01:47:19: evidence for somebody's wrongdoing,

01:47:23: even if we knew that person personally, even if they were connected to the team,

01:47:27: they were still banned.

01:47:31: There's one thing I kind of noticed that's also kind of in general, is usually when

01:47:35: people do get banned,

01:47:42: they're almost never

01:47:43: truthful about the reason, because we do make sure

01:47:47: as part of the moderation, if somebody ends up being banned, usually

01:47:51: they will receive warnings first, depending on the severity.

01:47:56: If they end up being banned,

01:47:59: the reasoning is explained to them.

01:48:03: Oftentimes there's somebody from the team who's actually going to

01:48:07: sit down with them and be like, we have this evidence, this

01:48:10: kind of happened, you're getting banned for these reasons.

01:48:15: They are made aware of it. And in a lot of cases,

01:48:19: those people will come out and

01:48:22: give completely different reasons for why they're banned.

01:48:28: And this kind of puts us in a difficult situation,

01:48:30: because we value privacy, and sometimes giving details to the public

01:48:34: could put innocent people who are involved in those incidents at risk.

01:48:41: So we cannot really say

01:48:42: the person was actually banned for these reasons.

01:48:48: But it is a thing that happens.

01:48:52: So the only thing I can request is

01:48:57: be more skeptical about what

01:48:59: people say about these things. If you see something,

01:49:02: if you believe something is wrong, you can always send us a report, we will look at it, we will evaluate it,

01:49:07: we will see what evidence we have.

01:49:11: But ultimately, we will not necessarily tell you

01:49:14: the details of how it was resolved to protect the privacy

01:49:20: and potential security of people involved.

01:49:24: I will also... Oh, sorry.

01:49:27: No, go ahead. I was just going to say that we're

01:49:31: just about 10 minute mark, so I think we should close questions.

01:49:34: Okay, so we're going to close the questions.

01:49:38: So if you send

01:49:42: questions right now, we have a few of them coming in,

01:49:46: if you send any questions after this point, we can't guarantee we're going to

01:49:50: answer that one. We'll try to answer as many as we can that are still left,

01:49:54: but no guarantees at this point. But I will at the very least

01:49:58: try to make it through

01:50:03: the ones that we have

01:50:04: on the list right now. So the next one,

01:50:09: EpicEston. Does the question mark need to be at the end of the question?

01:50:13: I think it doesn't need to be. I think I can put it

01:50:16: in the middle, but just to be sure, I would put it like...

01:50:20: Actually, no. I literally see a question that has a question mark in the middle

01:50:24: of it, so no, it doesn't need to be at the end.

01:50:31: Erasmus0211.

01:50:32: Any more flux nodes in the works? If yes, which

01:50:35: excites you the most?

01:50:39: You're working on some new ones.

01:50:46: Which ones am I working on again?

01:50:49: Just the one you just took.

01:50:52: Oh yes, there is a

01:50:55: new ProtoFlux node I'm particularly excited about. So, you know how

01:51:00: for those of you who do ProtoFlux,

01:51:03: there is currently a node where you can perform a raycast

01:51:07: which shoots an infinitely thin line, and whenever it hits, you can get the position,

01:51:11: you can get the direction, stuff like that.

01:51:16: What I'm going to implement is I'm going to implement

01:51:20: sweeping, or I think it's

01:51:24: also been called shapecasting or whatever,

01:51:27: on some other platforms, but it's essentially a way of doing thick

01:51:31: raycasts using a shape that you essentially

01:51:35: extrude in the direction that you want it to go.

01:51:38: So, if you wanted to shoot a sphere in a direction,

01:51:44: you would essentially be shooting a capsule

01:51:48: however long you want to shoot it, and anything within there

01:51:52: it would hit. Or in this case, you know, the first thing it hits

01:51:56: it will return basically exactly like a raycast,

01:51:59: but it's thick, and you can do that with different shapes like a sphere,

01:52:03: or a cube, or I think you

01:52:07: should also be able to do it with convex hulls, right?

01:52:12: I'm not sure if we have that one, maybe.

01:52:16: I thought it was going to be better. At the very least, you'll be able to do it

01:52:20: with spheres, and cubes, and cylinders, and capsules, and stuff.

01:52:25: But I think that will be very useful, especially for those

01:52:28: of you who make vehicles who don't want your raycasts to

01:52:32: shoot between two infinitely close triangles in geometry, and now your

01:52:35: car is flying across the map. Yeah. Thick raycasts.

01:52:40: Yeah, thick raycasts.

01:52:42: Because we do have a lot of the functionality, it's already part of the

01:52:50: core. We use it internally in our own engine. For example, the laser is actually

01:52:53: using sweeps to behave a bit better.

01:52:58: And this is going to expose them, so you can also use them from ProtoFlux.
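For anyone who wants a concrete picture of what a sweep does, here is a small, purely illustrative Python sketch of a sphere sweep against a scene of spheres. This is not Resonite's or ProtoFlux's actual API; the names and the scene are made up, and it only demonstrates the "thick raycast" idea described above.

<syntaxhighlight lang="python">
import math
from dataclasses import dataclass

@dataclass
class Sphere:
    center: tuple   # (x, y, z)
    radius: float

def _dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def sphere_sweep(origin, direction, sweep_radius, max_distance, scene):
    """Sweep a sphere of `sweep_radius` from `origin` along normalized `direction`.

    Sweeping a sphere against another sphere is equivalent to casting an
    infinitely thin ray against that sphere inflated by `sweep_radius`
    (a Minkowski sum), which is why a "thick raycast" reports hits just
    like a regular raycast. Returns (hit_distance, hit_object) or None.
    """
    best = None
    for obj in scene:
        # Ray vs. inflated sphere: solve |origin + t*direction - center|^2 = R^2
        R = obj.radius + sweep_radius
        oc = tuple(o - c for o, c in zip(origin, obj.center))
        b = _dot(oc, direction)
        c = _dot(oc, oc) - R * R
        disc = b * b - c
        if disc < 0:
            continue  # the swept sphere misses this object entirely
        t = -b - math.sqrt(disc)  # nearest intersection along the sweep
        if 0.0 <= t <= max_distance and (best is None or t < best[0]):
            best = (t, obj)
    return best

# A thin raycast (sweep_radius=0) slips past nearby geometry;
# a thick one with sweep_radius=0.5 still catches it.
scene = [Sphere((0.0, 0.0, 5.0), 0.3)]
print(sphere_sweep((0.4, 0.0, 0.0), (0.0, 0.0, 1.0), 0.0, 100.0, scene))  # None (miss)
print(sphere_sweep((0.4, 0.0, 0.0), (0.0, 0.0, 1.0), 0.5, 100.0, scene))  # hit
</syntaxhighlight>

The same principle extends to boxes, capsules, cylinders, and convex hulls; a physics engine handles the harder shape math internally.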

01:53:06: This one seems to be asking something in the chat, so I'm going to

01:53:09: skip this one. Tribe Grade World VR. Question.

01:53:14: Regarding the scans in the video you posted earlier, what app are you using to do those scans?

01:53:18: Yes, so I've been scanning reality.

01:53:21: For most of my scans, I'm using a software called Agisoft Metashape.

01:53:27: It's a photogrammetry software, and essentially you take lots of pictures

01:53:30: of the subject from lots of different angles,

01:53:35: and it's able to do those reconstructions. It figures out

01:53:39: based on the patterns in the photos, where the photos are, and then

01:53:42: reconstructs a mesh. I also sometimes use additional

01:53:46: software, like I'll for example use Photoshop to like, with certain

01:53:50: photos, I will do like an AI denoise on them, which

01:53:54: helps increase the quality of the scans, and I will also do

01:53:58: some kind of tuning of the lighting and so on. But I guess Metashape

01:54:02: is the main one. There's also one that I kind of started experimenting with a few days ago,

01:54:07: and I literally turned my room into like,

01:54:11: it's a software called, actually

01:54:14: I forget the first part, it's called Postshot. Let me see

01:54:18: the full name. Jawset Postshot. And this one's for

01:54:22: Gaussian Splatting, which is sort of like this new technique, you know,

01:54:26: for 3D reconstruction, or more general like rendering, which can

01:54:30: reconstruct the scenes with much better fidelity. And we're kind of

01:54:34: playing with it, like because I have all my datasets, I've been just kind of throwing them at it and seeing like

01:54:38: how it kind of works with different things.

01:54:42: So like I might

01:54:44: like integrate that one more into my workflow as I kind of like

01:54:48: go. I posted like a quick video and have like a bunch more

01:54:52: that I'll be posting soon-ish.

01:54:55: But yeah, I guess Metashape is the main one I use, like you know, it makes it easier to just

01:54:59: get a mesh, bring it in.
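As a rough idea of what "figuring out where the photos are based on the patterns in them" involves, here is a minimal two-photo structure-from-motion sketch using OpenCV. Metashape does this (plus dense reconstruction, meshing, and texturing) internally; the file names and camera intrinsics below are placeholder assumptions, not anything from the actual workflow described above.

<syntaxhighlight lang="python">
import cv2
import numpy as np

# Two overlapping photos of the same subject (placeholder file names).
img1 = cv2.imread("photo_001.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("photo_002.jpg", cv2.IMREAD_GRAYSCALE)

# Assumed pinhole camera intrinsics; a real pipeline calibrates or estimates these.
K = np.array([[1200.0, 0.0, 960.0],
              [0.0, 1200.0, 540.0],
              [0.0, 0.0, 1.0]])

# 1. Detect repeatable feature points (the "patterns") in each photo.
orb = cv2.ORB_create(nfeatures=4000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# 2. Match features between the two photos.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

# 3. Recover where the second camera sits relative to the first.
E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
_, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)

# 4. Triangulate the matched points into a sparse 3D point cloud,
#    which a full photogrammetry tool then densifies and turns into a mesh.
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
points_h = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
points_3d = (points_h[:3] / points_h[3]).T
print(points_3d.shape)  # (N, 3) sparse reconstruction from just two photos
</syntaxhighlight>

A real photogrammetry run repeats this across dozens or hundreds of photos and refines everything jointly before densifying the result into a mesh.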

01:55:04: This one is continuing a moderation question

01:55:08: that we had a couple of questions ago.

01:55:12: This one from Ralag86

01:55:16: again asks, continuing my previous question, will anything be done

01:55:20: regarding people who do not want to see bare-topped females in public sessions? For non-hosts

01:55:24: I am aware of the already in-place system where you can ask the person to switch avatars/avatar settings,

01:55:28: and for hosts they can enforce a dress code, which I am no doubt making use of.

01:55:32: So in the future we do want to

01:55:36: implement stuff like content tagging

01:55:41: and that will come with

01:55:44: the ability to, you know, if things are tagged a certain way you can

01:55:47: tick a checkbox and you won't see them anymore, right?

01:55:50: So you could make use of that.

01:55:55: That's something we will do in the future.

01:55:59: But other than that, for the time being

01:56:04: if you don't want to see that, don't go to those sessions.

01:56:08: Well, you can still go to those sessions because we do have

01:56:11: the ability to block somebody's avatar.

01:56:16: I can actually show you if I

01:56:20: click on Cyro's name...

01:56:23: Careful, it might ban me from the session.

01:56:25: Oh, it should be just block avatar. There we go, see now Cyro is gone.

01:56:30: I don't have to look at it. Well, I can still see it, but I don't have to look at him anymore.

01:56:35: Yeah, that is something I forgot about.

01:56:39: This is one of the reasons we added it.

01:56:42: You have the power. If some avatar is legitimately upsetting you,

01:56:47: you can block it. The other part is if you host

01:56:50: your own sessions, you can enforce your own rules. We do allow for that,

01:56:54: with some caveats. So if you want to enforce a dress code,

01:56:58: that's completely up to you. You have that freedom.

01:57:03: You can always add additional rules to whatever

01:57:06: sessions you want to host.

01:57:10: That's another thing. Eventually the content tagging system

01:57:14: should make these things more generalized.

01:57:18: You don't even have to go and see it in the first place as long as the content is properly tagged.

01:57:23: We can filter certain things out. We can block certain avatars.

01:57:26: We do want to give you the tools, but

01:57:30: we don't want to make global decisions

01:57:34: just forbidding these things for everyone.
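As a purely hypothetical sketch (none of these names come from Resonite, and the actual content tagging design isn't final), a per-user tag filter of the kind described could look roughly like this:

<syntaxhighlight lang="python">
from dataclasses import dataclass, field

@dataclass
class ContentItem:
    name: str
    tags: set[str] = field(default_factory=set)       # tags applied to the content

@dataclass
class UserPreferences:
    hidden_tags: set[str] = field(default_factory=set)  # checkboxes this user ticked

def is_visible(item: ContentItem, prefs: UserPreferences) -> bool:
    # Hide the item if it carries any tag the user opted out of;
    # the decision stays local to this user, nothing is removed globally.
    return not (item.tags & prefs.hidden_tags)

prefs = UserPreferences(hidden_tags={"topless"})
avatar = ContentItem("SomeAvatar", tags={"topless", "glowing"})
print(is_visible(avatar, prefs))  # False -> hidden for this user only
</syntaxhighlight>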

01:57:38: There is a nuance I was going to get to there

01:57:43: in that if you decide

01:57:46: to not allow, let's say you're like,

01:57:50: I don't want to see nipples in my world, that also has to apply to the men in the session

01:57:54: as well. It is universal, you cannot discriminate.

01:58:00: So it's either nipples allowed for all, or no nipples at all.

01:58:05: It actually reminds me, because there was one thing

01:58:07: that was particularly funny to me. With the Creator Jam, they actually made a nipple gun

01:58:11: they were shooting around the world, and people got upset, and they were like

01:58:15: oh no, it's okay, those are male nipples, they're not female nipples.

01:58:19: It was a funny way to point out that

01:58:22: like, double standard, you know, for this kind of thing.

01:58:28: Uh, but yeah. Uh, next we have

01:58:30: Verailash86, my question being will anything be done past it all?

01:58:34: I don't know which one this one's actually related to.

01:58:39: It was related to the previous one; they sent them in a row.

01:58:43: Um, so we're,

01:58:46: um, this might be the last one because we're at the last minute.

01:58:52: Yeah, we already answered that one.

01:58:55: Um, yeah.

01:58:58: I think that's pretty much it, we had a few more come in, but we got it.

01:59:02: Yeah, there's a few more, but this is pretty much the last minute, like we've been here for two hours

01:59:06: my throat is kinda sore from this, I should have brought some water.

01:59:10: Uh, but thank you everyone, you know, for joining, thank you for so many

01:59:14: kind of questions, like we're very happy to answer those, you know,

01:59:18: like let you know more about the platform, and just kind of like chat with you.

01:59:23: Thanks everyone, also like, you know, for playing, you know, Resonite

01:59:26: for enjoying this platform, you know, for supporting us and letting us do this kind of thing.

01:59:31: Um, I hope you enjoy the stream, like

01:59:34: my goal is, you know, make this every week. The format might kind of

01:59:38: change a little bit, we'll kind of see, you know, how many questions we get like next time and so on.

01:59:43: We might, you know, next time might be for example

01:59:46: outside of Resonite, you know, playing some kind of chill games while kind of chatting

01:59:49: with you, but we'll see how it kind of goes, because

01:59:53: this one there was a lot of questions, we're like, you know, kind of focused more on the Q&A

01:59:57: and we'll see like, you know, how it changes with the upcoming streams.

02:00:01: So we'll experiment with the format a little bit and see like, you know,

02:00:05: and also let us know, let us know like, you know, like what do you think, like do you like this?

02:00:09: Would you like to see some other things? Are there like any kind of issues?

02:00:13: You can like, you know, post, like

02:00:16: actually where should I post? Maybe make a thread in the office hours

02:00:23: like under Discord

02:00:25: to share your feedback. So thank you very much for joining,

02:00:28: you know, thank you for like spending time with us and asking us questions.

02:00:33: I'll try, like, you know, try to get like this video uploaded on

02:00:36: you know, our YouTube channel so you can, anybody who like missed these office hours

02:00:40: you can, you know, watch them afterwards, and we'll see you next week.

02:00:45: So thank you very much, and thank you also Cyro for, you know, helping me with this.

02:00:49: And being a good co-host, and we'll see you next week.

02:00:53: Bye!