This is a transcript of The Resonance from 2025 April 6.
0:00: start recording okay we're live uh actually let me check i'm going
0:05: to post the announcements i got enough space on drive okay uh going to post an
0:13: announcement hello friends hello
0:18: hello hello hello hello beautiful people
0:29: and post the blue sky oh I didn't prepare blue sky post on my damn
0:34: uh Oh no I messed
0:40: up uh wait I'm going to just do a quick
0:46: one come ask anything about
0:54: Resonite. Uh,
1:00: http, twitch.tv, slash
1:08: right and that looks correct just typing this on the virtual
1:18: keyboard sorry for that i should have prepared the poster i forgot there we go
1:24: okay, post it. Hello everyone, I'm a little bit lunch today
1:30: because I ate way too much potato salad, and I'm made of potato salad now.
1:37: It was too delicious and I ate too much. And don't
1:43: squeeze him it'll come out of his ears no hello everyone hello Marty Sage
1:49: Harler Gar VR uh welcome to the resonance i'm just
1:56: also switching to the channel for the questions there we
2:01: go so we should be
2:06: live and people should be piling in hello everyone welcome welcome to
2:13: your first live yay we got a thing thank you oh my god thank you Nikki for the for the raid
2:20: thank you yay hello so hello everyone uh welcome to
2:26: another episode of The Resonance, which is sort of like an office hours/podcast where you can ask anything
2:32: about Resonite, whatever you kind of want to ask, whether it's to do with development, like the technical
2:38: side, community side, the philosophy of it, whatever you'd like to ask, feel free. Some of the topics
2:46: we might redirect you to some other ones, like the Moderation office
2:51: hours that happened like an hour and a half before this one, or ProbablePrime's, but you're always
2:59: free to ask; worst case scenario we'll just be like, okay, we recommend asking in that one instead.
3:07: If you do ask a question, make sure you put a question mark at the end of it so that it pops out on
3:13: another thing over here, like, well, this one's not a question, but it's going to pop out like this. This is the raid,
3:19: it also pops there uh yeah we should be I'm also here
3:26: I'm here with Cyro from our engineering team as well, who's going to be helping me answer some of the questions, and
3:32: we should be able to get started so um before we start
3:41: uh let's see we already got some questions popping but we're actually going to go to the questions from
3:48: Discord first uh there's a um we do have a post where people are able to ask
3:56: questions in advance so the first question we have actually
4:02: Cyro would be able to bring it in as well so we can kind of show it on the thing
4:07: yes yes yes let me grab the uh question from the thread real quick uh so the
4:13: question, the first question is from Mintshock: will Resonite become the metaverse? By Wikipedia's definition it
4:20: already is a metaverse the metaverse is loosely defined term referring to virtual worlds in which users
4:26: represented by avatars interact usually in 3D focused on social and economic
4:31: connection. But I'm thinking more in line with the Ready Player One's metaverse, so a widely adopted and all-encompassing
4:37: platform that shapes the lives of millions I haven't watched the movie or the books but that's what I gather from
4:43: others so I mean if you go by the Wikipedia as a definition then you already can you know answer your own
4:48: question it technically is a metaverse by that you know definition but um it
4:54: kind of depends uh and if it's more you know about like being used by millions
5:00: and shipping for millions like you know we hope we do get there that we make like you know platform that's really
5:06: powerful and it's very versatile so pretty much anybody can you know use it for like their daily lives you know
5:12: whether it's for socialization work education like you know learning things building things you know using it as
5:17: your virtual like you know workshop your virtual studio we want our design to be like very versatile and usable pretty
5:24: much in every aspect of your life whether we get there you know that is a question we're going to do our best so
5:31: we do get there but you know we can't you know predict the future so we do hope like we get to that point but um
5:38: I'm going to adjust this a little bit so we can also see it better where's
5:44: the I'm going to make it a little bit smaller there we go um so yeah it's a
5:53: um hope we get there um time will tell you know is one of
5:58: those kinds of things i do think there's like a few things that would definitely help it uh like one that's actually
6:04: getting to its final phases is the performance update which is going to let us switch to modern .NET 9 framework
6:11: um, that will kind of make Resonite much easier to use for people,
6:17: because it's going to run better on different machines, so that should help broaden its appeal. Uh, there's other
6:22: things you know like improving the UI like one of the things like having a workshop and then having a license
6:27: system which will allow people like you know to more easily share what you create
6:33: um and you know even sell them which would like you know go into the commerce
6:38: uh aspect of metaverse so there's definitely things that will kind of help you know on the road there um but yeah
6:47: like short answer is like we hope we'll get there we'll hope like you know to get to that kind of impact we'll do our best we can't see the future so can't
6:55: really give you like you know definitive answer but we'll do our best to make it a reality
7:03: just queuing some of these up they're in top down order if you want to grab them yeah thank you
7:09: so the next question is from Marty SH uh
7:14: how are particle systems synchronized across clients does every client just run the particle system for themselves
7:20: yeah pretty much so like the actual behavior of individual particles is not synchronized because in most cases there
7:25: will be like way way too much data
7:31: um it's like um you know it's it's one of those kind
7:36: of systems where typically the inputs to the particle system are you know
7:42: synchronized uh meaning like you know the properties of the particles how they're supposed to look uh the
7:48: emissions that's part of the data model so like you know that should be same for every user so generally the like people
7:56: will see very similar things if you push it to the like you know to the extreme
8:02: and say like you emit a particle and the particle can have like a random direction and you only have like one or
8:08: few particles one user might see like you know particle going one direction the other will see it doing other that's
8:14: where it will kind of show that like you know the individual particles are not synchronized but if we have like you
8:20: know spray of particles like say I do this this actually looks a little bit
8:25: different on Cyro's end like you know he sees the bubbles in different spots but
8:30: general like you know it doesn't matter in most cases and when it does matter particle system might not be the best
8:37: choice uh like but if it's like a visual effect like it doesn't matter where the each individual particle kind of is so
8:45: um you know it's it's that kind of like system like usually use it for things like
8:51: where you like that that is not an issue it uh essentially produces like
8:58: a similar result on everybody's end.
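A minimal C# sketch of the idea described above, with illustrative stand-in types rather than the actual FrooxEngine/PhotonDust API: the emitter's inputs live in the synchronized data model, so they're the same for everyone, while the individual particles are simulated purely locally on each client, which is why a single randomly-directed particle can fly differently for each person.

```csharp
using System;
using System.Collections.Generic;
using System.Numerics;

// Illustrative sketch only; not the actual FrooxEngine / PhotonDust API.

// The emitter's inputs are part of the shared data model, so every client agrees on them.
class SyncedEmitterSettings
{
    public float EmissionRate = 50f; // particles per second (synchronized)
    public float Speed = 2f;         // initial speed (synchronized)
    public float Lifetime = 3f;      // seconds (synchronized)
}

// Individual particles are purely local state and are never sent over the network.
struct LocalParticle
{
    public Vector3 Position;
    public Vector3 Velocity;
    public float Age;
}

class LocalParticleSimulation
{
    readonly SyncedEmitterSettings settings;
    readonly List<LocalParticle> particles = new();
    readonly Random rng = new(); // unseeded: each client rolls its own random directions
    float emitAccumulator;

    public LocalParticleSimulation(SyncedEmitterSettings settings) => this.settings = settings;

    public void Update(float deltaTime)
    {
        // Emit based on the shared settings, but with locally rolled directions.
        emitAccumulator += settings.EmissionRate * deltaTime;
        while (emitAccumulator >= 1f)
        {
            emitAccumulator -= 1f;
            particles.Add(new LocalParticle { Velocity = RandomDirection() * settings.Speed });
        }

        // Simulate and expire purely locally; nothing here touches the network.
        for (int i = particles.Count - 1; i >= 0; i--)
        {
            LocalParticle p = particles[i];
            p.Position += p.Velocity * deltaTime;
            p.Age += deltaTime;
            if (p.Age >= settings.Lifetime) particles.RemoveAt(i);
            else particles[i] = p;
        }
    }

    Vector3 RandomDirection()
    {
        var v = new Vector3(
            (float)(rng.NextDouble() * 2 - 1),
            (float)(rng.NextDouble() * 2 - 1),
            (float)(rng.NextDouble() * 2 - 1));
        return v.Length() > 0.0001f ? Vector3.Normalize(v) : Vector3.UnitY;
    }
}
```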
9:04: Um, the next one, I've just queued all of these up. Uh, the next question is from BD: how much of a performance impact
9:12: is IK calculation on .NET 9, and with the upcoming IK changes, is there any thought of transmitting computed bone poses
9:19: instead of running IK on all the clients? So, with .NET 9 I don't actually have the data on hand, because I don't have
9:27: like, specific profiling data with .NET 9 specifically. I did profile the IK several times in the past. Um, it does
9:35: have some impact, it's not as, um, it's not as slow as some
9:42: people think; like, it generally can do IK relatively fast. Uh, however, there is one aspect of it, I'm actually
9:48: going to grab a brush um there is like one aspect of
9:54: it that's not actually the IK itself that's causing a slowdown and it's sort of the interactions with the data model
10:01: because when the IK runs, what it has to do, it has to read out data from the data model and has to
10:07: write it back and one of the issues uh let me just grab my brush one of the
10:15: issues is, like, if you go into a profiler you will see, for example, IK is being executed,
10:21: and this is, you know, IK. I cannot write it backwards, so I'm just going to do this, uh, and also going to move it like
10:28: this you know so you have like your IK and then like the IK does like a whole bunch of stuff you know it does like
10:35: lots of different operations and some of these are like writing to the data model so you have like you know you have the
10:41: data model um just going to write data and put it
10:47: here save the data model and it's you know sort of interacting with that in
10:52: some of the phases and it's one of the things that actually kind of slows it down um it's one of the things we also do
10:58: want to optimize we kind of want to like make the data model more efficient for recurrent operations like things that
11:04: just run every single frame because right now the data model it tracks when things change and the tracking like
11:10: especially when there's a lot of interactions every frame, that adds up, and when you look at the
11:16: profile with, like, a flame graph, it will look like the IK is really heavy, but it's actually
11:22: accumulating lots of these kind of interactions with the data model um and this kind of makes the IK kind of look
11:28: bigger than it is um optimizing this like would kind of like reduce a little
11:33: bit of overhead, so that would also help. But just the switch to .NET 9, it
11:40: makes those interactions faster way faster it makes the math calculations faster so that kind of helps a lot you
11:47: know just on its own with existing code but there's still like you know room for
11:52: improvement. Um, right now we don't really have any plans to do it networked, because, um,
12:01: I don't believe it is necessary uh because there's like a lot of other things that can be done as well like one
12:06: of them being you know variable rate updates which is another like you know performance optimizations we want to
12:11: make in the future where some systems will um update at slower rate than
12:18: others so say for example you know I'm I have my own you know I'm kind of like moving and so on i want it to be match
12:24: my frame rate uh so that needs to update every single frame and if somebody like is near me you know maybe it also needs
12:31: to update like I want it to be smooth but say there's a person you know who's over distance and over there or maybe
12:36: like in over there I don't need to see them at the same fidelity because I can you know barely see them maybe I don't
12:41: see them at all so the system will be able to like lower the rate at which their IK updates
12:48: um proportionally you know how far away they are or like you know how you cannot see them and it will help you know
12:56: reduce the load.
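A small illustrative sketch of the variable rate update idea just described: distant or barely visible users get their IK stepped less often. The thresholds and names below are made up purely for illustration; this is not actual Resonite code.

```csharp
// Illustrative sketch of variable rate updates; not actual Resonite code.
// The further away (or less visible) a user is, the less often their IK gets stepped.
static class VariableRateIK
{
    // How many frames to wait between IK updates for a user at a given distance.
    // The distance thresholds are invented for illustration.
    public static int UpdateInterval(float distanceToViewer)
    {
        if (distanceToViewer < 5f)  return 1; // near: every frame, matches your framerate
        if (distanceToViewer < 15f) return 2; // mid range: every other frame
        if (distanceToViewer < 40f) return 4; // far: every 4th frame
        return 8;                             // barely visible: every 8th frame
    }

    public static bool ShouldUpdate(long frameIndex, float distanceToViewer)
        => frameIndex % UpdateInterval(distanceToViewer) == 0;
}
```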
13:04: Um, the other thing we want to do with the data model: we'll be able to execute the IK in parallel, using multi-threading for it, which is going to help even more, because right now the interactions with the data model
13:09: they need to be synchronized which makes it so you can really run it super well multi-threaded um there is actually a
13:17: system that like end up like being multi-threaded which is the dynamic bones all of the simulation of the dynamic bones all the heavy math heavy
13:24: stuff is multi-threaded uh however what is really funny is like you can actually
13:31: even look at it yourself if you if I go to the debug dialogue and I look at the focused world there's like a bunch of
13:37: statistics actually no it's under physics um the dynamic bones there's total update time for me is like half a
13:44: millisecond. Uh, the simulation is hovering around 0.2 milliseconds,
13:51: and then the finish is around 0.2 milliseconds. The finish part is literally just taking all the computed
13:57: values and copying them back to the data model, and that's taking about the same amount of time as all the math
14:06: simulation which kind of shows you you know how much overhead that has um which means you know that's something you know
14:13: that, like, the multi-threading helps because it takes less compute time,
14:19: but you know it uh it still kind like you know that that finish part where it has to write the results back that's
14:25: what ends up like you know kind of slowing it down and sort of like needs you know some of the data model kind of
14:31: changes. But just by switching to .NET 9, that code runs way faster, like it's able to generate much better machine code, so
14:39: that just you know bumps it into performance but there's a lot of you
14:44: know lot of like things that we can still do to help improve the performance and make it so you can just compute the
14:50: IK like locally um because if you do start transferring over network you
14:55: increase you know how much bandwidth is necessary uh you also introduce you know some effects like you know where maybe
15:01: like it needs to be smoother you like you know and like you don't get like much of a precision in the movement you
15:07: know which are kind of downside sides so we would like to avoid those if possible and I do believe they are possible
15:13: because like with the right systems at hand like your machine should be able to
15:18: run IK for like hundreds or thousands of like users like there's like um if it's
15:24: like you know optimized really well um there's a lot of games that like you know that do this kind of thing the
15:32: um the combination of like faster runtime and then like in the data model improvements and the multi multi-
15:38: threading for actual computations I think that'll make it so we don't need to do
15:44: it uh next questions also one more thing I'm going to mention is like if you know
15:50: any of this is also kind of subject to change so like if like it seems like even with all of that it would be
15:55: beneficial to network it we would eventually decide to do that but unless we have like you know a solid reason to
16:02: do that um like you know and we exhaust the other possible approaches like I don't think we'll go for it kind of
16:08: thing.
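A rough sketch of the pattern described above for the dynamic bones, and of what a similar approach for IK would look like: the heavy math runs multi-threaded against plain local buffers, and only a final "finish" pass copies results back into the synchronized data model, which is the part that currently costs about as much as the simulation itself. The types here are illustrative, not FrooxEngine's actual implementation.

```csharp
using System.Numerics;
using System.Threading.Tasks;

// Illustrative sketch; not FrooxEngine's actual dynamic bone or IK code.
class BoneChainSimulation
{
    readonly Vector3[] positions;       // local working buffers: no change-tracking overhead
    readonly Vector3[] velocities;
    readonly Vector3[] syncedPositions; // stand-in for synchronized data model fields

    public BoneChainSimulation(int boneCount)
    {
        positions = new Vector3[boneCount];
        velocities = new Vector3[boneCount];
        syncedPositions = new Vector3[boneCount];
    }

    public void Step(float deltaTime)
    {
        // 1) Simulation: the heavy math, safely parallel because it only touches local buffers.
        Parallel.For(0, positions.Length, i =>
        {
            velocities[i] += new Vector3(0f, -9.81f, 0f) * deltaTime; // toy gravity only
            positions[i] += velocities[i] * deltaTime;
        });

        // 2) Finish: a single serial pass copying results back into the data model.
        //    In the current engine this write-back costs roughly as much as the simulation
        //    itself, which is the overhead the planned data model changes aim to reduce.
        for (int i = 0; i < positions.Length; i++)
            syncedPositions[i] = positions[i];
    }
}
```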
16:14: Um, next question is from Mintshock. Just position it, there we go. Um, so
16:20: Mintshock is asking: at what point would you consider working on the Linux client again,
16:26: after the Splittening? After 1.0, or much later? So right now I don't really have
16:31: any specific kind of time like like you know in a specific kind of milestone i
16:36: know at the very least it wouldn't be before the Splittening. I don't know if it'd be before or after 1.0,
16:43: it might be after as well after we do like a whole bunch of other things because the core of the issue is you
16:51: know we don't have the bandwidth to it like you know we don't have like the time to kind of deal with those kind of
16:57: specific issues um and like we're essentially looking you know at all the things you know on our plate and there's
17:03: like you know stuff like you know like UI improvements there's stuff like you know the performance that we're working
17:10: with right now there's stuff like you know physics there's stuff like the IK and you essentially have to decide okay
17:15: so which one are we going to invest time into because we only like you know you essentially
17:20: end up a situation where you can pick one or the other because your time is you know our time is limited so we have
17:26: to decide which one we invest time in so it becomes a question is the native Linux client uh more important than you
17:34: know for example rigid body physics is the native Linux client more important than UI improvements is the native Linux
17:41: clients more important you know than having the workshop system so it kind of
17:46: becomes like you know that kind of a question is like when when does it become more important than all the other
17:52: things like I need to deal with so I would say at the point like if
17:57: there's a you know situation where we have enough bandwidth for it um because
18:03: the other aspect of this you know as we kind of grow the team we'll be able to have like people dedicated to specific
18:09: you know um more maintaining specific kind of areas so having an engineer
18:15: whose responsibility can be maintaining the Linux one that could also solve a problem but for that we need to grow and
18:21: in order to grow we need to improve a lot of things uh one of those being you know the performance that's being
18:27: offered right now another being the UI is adding kind of more features and there's also a question you know if we
18:33: offer native Linux client does that help us grow or not because if it doesn't
18:40: help us grow enough and you know get enough funding to afford more like engineers we're probably not going to
18:46: prioritize it and it sucks you know that it's it's that's the kind of reality of
18:52: things but we do have be very careful about like you know like
18:58: where we spend our time um because like we we need to make sure
19:04: you know we can kind of keep going we can keep paying you know people on the team and we can keep growing the team to kind of tackle more of the problems and
19:11: we want we want all the engineers to tackle problems that are most likely to
19:17: let us you know grow more bring more people on the platform get more support get more funding um so at some point we
19:24: might you know grow to the point like where like okay like we can focus on this again you know there's like good
19:29: reasons for it um we can afford it you know like like we're big enough but I
19:36: don't know when it's going to happen uh but we're probably going to focus on like some bigger like you know bigger
19:41: priorities before that happens
19:48: uh so the next question is uh from
19:54: Triple Helix which is actually three X's too in the name Triple X's um Triple
20:01: Helix is asking: with the new audio system and the upcoming performance boost after the Splittening, could we potentially
20:08: see audio ray tracing in the future? Of course I don't know the technical challenge behind it, but the system this
20:14: guy came up with seems really cool with a lot of practicality, and included a YouTube video. Um, we're not going to
20:19: watch the video now but I've I've seen this one before um well at least bits of it um what I can tell you is the we are
20:26: using Steam Audio for audio spatialization. Uh, what Steam Audio does,
20:32: it actually has support for ray tracing, where it casts a bunch of rays and figures out how the audio is bouncing around,
20:39: how it's reflecting and how it's being occluded by the world geometry so steam
20:44: audio already has that functionality what we want to do is integrate it uh
20:50: where we essentially you know we take there's like from what I've looked at the API there's essentially two ways to kind of do it one of the ways is um one
20:59: of the ways is like you know we can take you know the geometry that's in the
21:04: world that you want the audio to interact with and pass it to steam audio steam audio does its own like
21:09: acceleration structures. The other one is we expose to it sort of like a ray trace function,
21:15: so we could use the existing ray tracing that Bepu Physics has, so
21:20: we actually handle the ray tracing and it then uses that to compute all the properties. So there's two ways
21:26: to kind of implement it but we do need to integrate it somehow and it will take some time and part of me kind of wants
21:34: to do it now because I'm like I want cool you know audio like that kind of reacts to the environment but it also
21:40: mean, you know, delaying the Splittening, because, like, I would have to
21:45: spend time on it. So I don't know if we'll prioritize it now, because I feel we need the Splittening
21:52: more, we need the big performance update. But after that happens, I would
21:57: really love to like integrate it like give us like more realistic audio um I don't think it's going to take even that
22:03: much of a time like it might maybe take a week maybe two um and I think it's
22:08: going to be really cool just for the audio realism. Uh, we probably wouldn't implement our own system, especially since Steam Audio is already a
22:14: robust solution that handles it. So you'll definitely see it at some point in the future.
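A hypothetical sketch of the second integration approach mentioned above: the engine hands the audio library a ray-cast callback backed by its own physics (Bepu, in Resonite's case) instead of handing over the world geometry. The delegate and types below are invented for illustration and are not Steam Audio's actual API.

```csharp
using System.Numerics;

// Hypothetical sketch; not Steam Audio's real API, just the shape of the idea.
struct AudioRayHit
{
    public bool Hit;
    public float Distance;
    public Vector3 Normal;
    public int MaterialIndex; // so the audio side can look up absorption/scattering
}

// The audio library would invoke this delegate whenever it needs to probe the scene.
delegate AudioRayHit AudioRayCast(Vector3 origin, Vector3 direction, float maxDistance);

class AudioRayTracingBridge
{
    readonly AudioRayCast rayCast;

    public AudioRayTracingBridge(AudioRayCast engineRayCast)
    {
        // The engine supplies a callback backed by its existing physics ray tracing,
        // so the audio side never needs its own copy of the world geometry.
        rayCast = engineRayCast;
    }

    // Toy example of what the audio side might do with it: check direct occlusion.
    public bool IsOccluded(Vector3 source, Vector3 listener)
    {
        Vector3 toListener = listener - source;
        float distance = toListener.Length();
        AudioRayHit hit = rayCast(source, Vector3.Normalize(toListener), distance);
        return hit.Hit && hit.Distance < distance;
    }
}
```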
22:20: Uh, next question is from Ozie. We have a
22:27: lot of Discord questions today
22:32: um, there we go. Um, Ozie is asking: got some random questions I was wondering about
22:38: before going to sleep here is it planned to be able to create local slots components ourselves at some point yes
22:45: uh we would like to like provide kind of tooling for it um because there's like a lot of kind of use cases um you know
22:51: like being able to have like systems that like you know exist for example only on the server so you can have like
22:56: you know say like you're building a game and you want like you know a game controller that's like you know orchestrating everything interacting
23:03: with things but you don't want you don't want players to cheat for example like you know being able to read some
23:08: privileged state of the game that they could use for cheating in that game or you know even mess with it like you know
23:14: um so there's like lots of kind of like use cases um probably provide some like
23:20: some kind of tooling for it don't know when but it'll happen sooner or later
23:25: Next question: I believe a PhotonDust collision events node was turned down on GitHub, but are there planned ProtoFlux nodes
23:31: for programmatically interacting with PhotonDust? Uh, we don't really have anything super specific, because most of the
23:36: way you interact with PhotonDust is through the data model, and a lot of the way Resonite works is you interact
23:44: with things through the data model which means like a lot of the cases you don't really need to do specific things and
23:52: because like uh like a like the way like you know it works uh and we kind of talked about you know earlier question
23:58: is like everybody has their own local simulation so we have a node that for example adds a bunch of particles
24:04: somewhere um that node like you know would only do
24:09: it for the user who runs it, which would lead to desyncs unless you manually
24:14: handle synchronization, and one of the core principles of Resonite is avoiding having to manually
24:20: handle synchronization. That's one of the things that we will almost
24:25: never break like every systems that you want to design should work with implicit
24:30: synchronization by default and if you do want to override something to happen only locally that is an override like
24:37: that is essentially an exception, not the default. With a node like that, you
24:42: know the things being local would be the default which like you know breaks our
24:49: kind of philosophy so this one of the reasons probably don't do that and one of the ways we achieve you know that
24:55: everything is kind of synchronized is by modeling it in the data model because the data model is implicitly
25:01: synchronized um there might be some helper functions like that we add like to for example like you know for
25:06: particle burst and so on but generally a lot of the stuff like we wanted to use like you know the general things that
25:13: are in the data model just manipulating the values there uh and the last question from Ozie
25:21: is and totally unrelated to asking this question at 9:00 a.m do you have words of advice for fixing sleep
25:28: schedule i'm the wrong person to ask for that one i I I have the sleep schedule
25:34: of a US person with a bad sleep schedule in Europe so not a good person to ask
25:44: uh next question is
25:50: uh, from BD: what software would you recommend for someone who wants to play with creating Gaussian splats for the
25:56: first time so the one I use is called uh Jawset Postshot um it's currently in beta which
26:04: makes it's free you can play with it but it will eventually become a paid software i don't know how much the price
26:10: will be but if you want to mess with it like you know now is a good time uh and there's also like some other alternatives like I know like say we've
26:16: been kind of looking into some as well uh I'm not too familiar with those because I've been using I've been using
26:22: Postshot myself. Yeah, there are a couple solutions for, uh, well, actually there's one main
26:30: solution that I know of there's probably more but there's another solution for Linux called Open Splat it's a little
26:37: bit more of like a manual process you know you run it in like a Docker container or something and you I believe
26:44: you have to use some other separate software to do like the the initial like
26:49: sparse uh like yeah the alignment and like the sparse like uh point cloud like
26:55: generation and then you feed that data into like the open splat program and then it will start splatting it yeah
27:03: yeah, yeah, it's kind of more of a DIY kind of solution, but it's free, and I think that one will remain free. Jawset
27:09: Postshot has the benefit that it's very integrated: you pretty much just drag and drop your photos, and if you have good quality
27:14: photos you know can even leave everything at default and it's going to give you like a good splat um so it kind
27:20: of depends you know what you want to play with and the last question is from
27:27: modern
27:32: uh let's pick one yeah there we go um modern is asking so my question is about
27:38: dynamic bone squish compression GitHub issue, if you look at it and see my example on it. I've been meaning to give
27:44: this a couple Resonances now, but this, for example. So let me have a look. For
27:51: example uh so there's like a
27:57: video and it's just a video like um I don't know we can bring it here but it's just
28:02: ramming okay interesting i have to figure out how to
28:07: model that with it like it's kind of like pushing it like I
28:13: think that would be like doable like yeah I think this can be implemented
28:19: uh yeah I don't think like that would be like much we just kind of figure out a good way to kind of fit it with the
28:24: system so I think that could be implemented we just need to some time to think about
28:30: it so with that that's actually all of our Discord questions which means we can
28:35: finally get to the Twitch questions and there's a lot
28:41: um so let's get started so the first one is
28:46: from Fur. Uh, Fur is asking: Froox, are you ever jealous that Cyro has pointer ears than
28:53: you well I think he meant to say pointier but he said pointer and we say like we don't use pointers here we use
28:59: references we don't do that here we don't do that here only sometimes only only in unsafe
29:07: blocks but and even then only when we only when we enable the
29:12: allow unsafe blocks flag in the CS project file yes which now we don't need to because in most cases because of
29:18: spans it's beautiful yes spans and the unsafe class it's great
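A tiny example of the point being joked about: Span<T> gives pointer-style slicing over memory without unsafe blocks or the AllowUnsafeBlocks project flag. The helper below is purely illustrative, not code from the Resonite codebase.

```csharp
using System;

static class SpanExample
{
    // Sums a window of samples without copying and without any unsafe code.
    static float SumWindow(float[] samples, int start, int length)
    {
        Span<float> window = samples.AsSpan(start, length); // a view over the array, not a copy
        float sum = 0f;
        foreach (float s in window)
            sum += s;
        return sum;
    }

    // The old way needed `unsafe` and the AllowUnsafeBlocks flag in the .csproj:
    // unsafe static float SumWindowUnsafe(float* samples, int length) { ... }
}
```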
29:24: See, you ask a silly question, you get a silly answer. Uh, next question is from Garen
29:31: VR, uh, who is asking: "I've been wondering, who is that, who is the silent cutie in the
29:37: background whose whole purpose seems to be passing things forward and doing goofy notes?" Cyro isn't silent.
29:45: I am uh I am so I am okay i I got I got
29:51: paranoid that like the your volume was gone but like it seems like
29:58: Okay if you don't hear Syro let me know but like it looks like you should be able to
30:04: hear him oh I think they're talking about like just because I've been quiet
30:09: I am Cyro, I am, I'm just a little guy, I'm one of the engineers. Okay, hi. Yeah,
30:14: Gri is saying we do hear him, there we go. Okay, so the next question is from Marty
30:21: Sh um uh is there a way to make object have a trail something like particle
30:27: trail module but for individual objects i mean I use the trails like um like the
30:32: particle trails like that's pretty good way to do it um we could like build your own system
30:39: like we kind of position things and so on but like I feel that's just more effort and we can just use the particle
30:44: system for that like that's usually how people do
30:53: it uh next question is from
30:59: uh from cobalt uh side uh if the EU laws changed uh for the
31:05: minimum available rating for platforms like Resonite to 13 plus, would Resonite be forced to change to that
31:11: rating so I don't think we would be forced to do that um
31:16: aren't we 16 plus anyways? Yeah, we're 16 plus. So, like, one of the reasons we are 16 plus is because, uh, if you
31:23: go under 16 uh the laws for like you know processing and storing uh data of
31:30: users who are below 16 is much much much stricter uh and right now like we are
31:37: too small to be able to deal with the legality of that, you know, and sorting those
31:43: things out, because the laws are a lot more complicated. It's just simpler for us to be, you know, 16
31:50: plus we don't allow anybody under 16 that way like you know we're clear with the laws we make sure like you know
31:56: we're not really breaking anything if they change those laws so like you know those things were no longer necessary
32:01: which I don't really think is going to happen uh but if they did theoretically like we could still be like you know say
32:07: like we just allow 16 plus because that is our policy like you know they will not force us to be to allow users
32:14: younger than that unless they made laws specifically for that saying like you know you're not allowed to do that but
32:19: again like I I don't see this happening and I don't see that happening but more
32:24: realistically if they did change the law like so like you know we didn't need to do some specific stuff we might like you
32:30: know just lower it to 13 so but it's very like I think it's like a
32:36: hypothetical that like this is not going to happen so more I think what's going to happen is like once we kind of grow
32:41: big enough we'll have the resources you know um to actually be able to handle
32:47: the the complexities of like you know data of like users under 16 and at that point we would allow them to the
32:53: platform but uh we're not at that point yet uh next question uh from Navy
33:02: 3001 uh they're asking me just uh so I can type my question from
33:09: last week, now corrected: what do you think of the idea of, you click a wardrobe and it shows a list of active sessions
33:15: for the world and also button to create a session um I mean that could be
33:21: interesting like uh we do want to like rework the whole like world UI as well and kind of like improve the mechanisms
33:26: um, I would also say, like, make a GitHub issue for that so we can kind of consider it. Like, it's possible, I don't
33:32: know if it's going to like align with what we want to do but something we can definitely
33:38: consider. Uh, Nitra is asking: "Will spatialized reverb zones be possible
33:44: with the audio?" Uh, yes, they will be. Uh, Cyro, you actually did a test,
33:50: like, because you were implementing the Zita reverb, and it works pretty okay. Okay, I think you were doing
33:56: something like like half and half mix or something yeah so I was like uh I tested
34:03: the Zita reverb; like, I downloaded the actual Zita reverb effect, 'cause it's the same thing, and I put
34:09: it in my like audio mixer just to like see how it sounded um and with like a
34:17: 5050 like mix of like wet and dry that's like sound that's affected by reverb
34:22: versus sound that's not affected by reverb um or actually no wet and dry in terms
34:28: of how much audio is spatialized and how much audio is not spatialized so if you
34:34: partially unspatialize the audio so that you can pump some of that into the reverb um it sounds really cool cuz like
34:42: you still get the spatialization like I can still tell where people are in the world from like the dry signal cuz it's
34:47: still spatialized but the like global sort of like in my ears like uh like
34:54: echo sounds like it's coming from the room that I'm in, and that's also because the Zita reverb is a
35:00: stereo reverb so the reverb actually reverberates slightly differently in each ear which makes it sound a lot more
35:07: open and spatialized than it actually is yeah in short like we will allow it like
35:12: for for you to like do that because now we kind of control the system so you can you know mess around with it like see for yourself like what works for your
35:18: world it's also like once we do integrate like you know the environmental thing you might not even
35:24: want to use that because the you know the environmental probing it actually does a more physically based reverb so
35:31: like that might end up giving more realistic results but we like you know
35:36: we will offer all the options.
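A simplified conceptual sketch of the mix Cyro describes: the spatialized (dry) signal keeps its directionality, while an unspatialized copy of the same source is sent into a shared reverb and mixed back in. The IReverb interface and mixer below are invented placeholders, not the actual Resonite audio graph or the Zita reverb port.

```csharp
// Simplified conceptual sketch; not the actual Resonite audio mixer.
interface IReverb
{
    // Processes one mono input sample and returns the reverberated stereo output.
    (float left, float right) Process(float monoInput);
}

class ReverbSendMixer
{
    readonly IReverb reverb;
    public float WetAmount = 0.5f; // a 50/50 split, as in the test described above

    public ReverbSendMixer(IReverb reverb) => this.reverb = reverb;

    // drySpatialized: the per-source panned/HRTF signal (keeps directionality).
    // monoSource: the unspatialized copy of the same source, fed into the reverb.
    public (float left, float right) MixSample(
        (float left, float right) drySpatialized, float monoSource)
    {
        var wet = reverb.Process(monoSource * WetAmount);
        return (drySpatialized.left * (1f - WetAmount) + wet.left,
                drySpatialized.right * (1f - WetAmount) + wet.right);
    }
}
```

Because the reverb itself is stereo (it reverberates slightly differently in each ear), the result still reads as open and spatial even though the send is mono.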
35:43: Next question is from Darkest Sabertooth: um, what do you call Froot Loops made by Resonite? A Froot Loop. Got them, Froot
35:50: loop we don't we don't have we don't have any any Froot Loops Froot Loop
35:56: sponsorship maybe uh next question is from
36:04: BD_: um, I've noticed in Resonite, if you have multiple Gaussian splats overlapping,
36:10: that is, one will always render over the other rather than doing so on a splat-by-splat basis. Is this something that could be resolved in the future
36:16: that you just don't have the bandwidth for or are there some significant technical issues here so it could be
36:22: resolved in the future um one of the potential issues is like it might cost
36:29: too much performance to do at least like with the current pipeline but like once we do make some switches that'll make some things easier um essentially what
36:37: happens is like you know you you have to sort the splats and if you're like rendering you know if you are rendering
36:44: multiple splats what you have to do on the GPU you have to essentially collect and combine all of the splats and then
36:51: sort them, sort of the combined ones, which means you have to essentially build out a new buffer of Gaussian splats, you
37:00: know on the fly for each frame that you render right now the buffer is fixed you know because you have like each splat is
37:05: like its own buffer and you just sort that buffer you know and like you don't have
37:11: to worry about any other splats so it is possible it'll definitely take like a quite a bit like to implement that
37:17: because you'd have to change how the Gaussian splat rendering pipeline works, and I probably would not want to touch
37:23: it like until um wouldn't want to touch it until like you know we do make some switches for
37:30: the rendering engine uh it's also one of those things like you know like most of the times like you don't like the
37:36: Gaussian splats will not really blend well anyways, because they tend, like, you
37:42: know the way they tend to work is they might not model the surface of it like super well so like they might
37:48: blend in a kind of weird ways one thing we could do is add like a
37:54: thing that bakes splats together, but I don't know if that would really work.
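A conceptual C# sketch of why cross-object splat blending is costly, as described above: you would have to merge every object's splats into one combined buffer and re-sort it back to front every single frame, instead of sorting each object's fixed buffer on its own. In practice this sorting happens on the GPU; the types and CPU-side code here are purely illustrative.

```csharp
using System.Collections.Generic;
using System.Numerics;

// Conceptual sketch; not the actual Gaussian splat rendering pipeline.
struct Splat
{
    public Vector3 Position;
    // ...covariance, color, opacity, etc. omitted for brevity
}

static class SplatSorting
{
    // Today: each splat object has its own fixed buffer, sorted independently,
    // so one whole object always draws over the other.
    // Cross-object blending would instead need something like this every frame:
    public static Splat[] MergeAndSort(IEnumerable<Splat[]> splatObjects, Vector3 cameraPosition)
    {
        var combined = new List<Splat>();
        foreach (var obj in splatObjects)
            combined.AddRange(obj); // rebuild a combined buffer per frame

        // Back-to-front by distance to the camera so alpha blending composites correctly.
        combined.Sort((a, b) =>
            Vector3.DistanceSquared(b.Position, cameraPosition)
                .CompareTo(Vector3.DistanceSquared(a.Position, cameraPosition)));

        return combined.ToArray();
    }
}
```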
38:01: Uh, next question is from Darkest Sabertooth: uh, will audio support
38:08: 5.1 or 7.1 surround sound? Yes, I know not many VR headsets support it, however a lot of
38:14: cheaper gaming headsets do. Um, so from my check, Steam Audio supports it. Um, right
38:21: now like I'm focusing sort of like on uh feature parity so like it's written with stereo buffers however it can be
38:28: expanded in the future i don't think I would add the support for um MVP because
38:35: I feel it might add like you know too much kind of time and I don't want to like delay things um I would suggest
38:41: making a GitHub issue um you know asking for the support we can gauge you know how much people really want to see this
38:48: h sorry how much people want uh to see this happen and you know we'll decide
38:55: based on then when to prioritize it but probably not for MVP
39:02: Uh, next question is from platypus744: um, regarding the .NET 9 switch, how do you
39:09: expect it to be deployed? Will there be pre-release testing? Do you anticipate it leading to a lot of bugs that put it in
39:15: testing for a long time? Yeah, there will definitely be pre-release testing, like it's a very, very big change, so we
39:22: do want to make sure, you know, it's stable before it gets released to the main build. Um, I don't know how many
39:29: bugs to expect i don't think it will actually have that many bugs you know with like functionality of the worlds
39:35: because that's not really going to be changing super much um what I expect more is like you know stuff like stabil
39:41: like either stability bugs or bugs like where some of the state of the render maybe gets desynced or maybe like
39:48: corrupted or you know or it crashes or something like that so more more bugs along that line because it doesn't
39:55: really touch the data model much, like it's still doing the same kinds of things, it's just now being
40:00: sent over, you know, shared memory and some pipe. Um, so we'll see how it goes. We
40:07: usually kind of deal like we deal with those things as they kind of go but we've been also doing a lot of kind of preparatory work to make sure like you
40:14: know taking step by step um so like you know like we don't have like a million
40:19: things like kind of million variables that we have to deal with and it's hard to figure out what's happening but we kind of test things like gradually
40:28: uh next question is uh from dark saber-tooth uh dark is asking a real
40:33: question will headless ever support supplying their own audio stream for example in your config choose your audio
40:39: input for streams, like a mic or some other source? Uh, there's a GitHub issue for it. Like, if you want to,
40:45: you know stream audio it is possible right now if you wanted to do it it's a little bit trickier because we are using
40:50: the CSCore library to interact with, you know, the audio devices. Uh,
40:57: problem with that it's Windows only which um it it's fine on
41:06: um it's it's fine like you know like for the graphical client because that is Windows only but like most people use
41:12: Linux for the headless which means it wouldn't work there is a library uh that
41:18: I just discovered like very recently that does have multiplatform support for
41:23: audio devices one issue um actually it wouldn't be like much of an issue for
41:28: headless, but we're probably not going to touch it until the Splittening. The issue is that library is .NET 8 and up,
41:36: which means it does need the split. I would want to replace CSCore completely, just rip it out,
41:41: because it's kind of not really updated anymore, it's kind of deprecated, and has a lot of
41:48: cruft. So we'd switch to this library, like, globally, and at that point it's going to be very easy for headless
41:54: to be just okay I want to stream this audio device in so yeah that will become a thing at some point but probably after
42:03: Splittening. Next question, uh, Maria is asking: UX question, do certain UI
42:11: colors mean certain things? For example, I noticed cyan is warm/public/sociable and purple is invisible/private. It is also
42:18: one of the first things I assumed but there's a nice UI when I first joined yeah so there's there's some logic to it
42:24: i don't actually know it from top of my head this might be one of those things like to bring to the art uh to the art
42:31: team uh office hours or specifically like um Chroma because Chroma is the one
42:37: who designed the color scheme and he made like a document which actually I
42:42: think it should also even be part of our press kit he made like a document on like you know what do the different
42:48: colors mean and when do they when are they kind of like you know being used so there is a sort of like you know some
42:54: design language behind it but I can't tell you the details um yeah I mean when
43:00: we were initially building it uh we did try and code like pinky purple colors to like local kind of like the idea of
43:08: localization like local values and stuff and like that kind of thing
43:17: Yeah. Next question is from, uh, Tikon: hey Froox, I got a question about
43:23: Quest compatibility is it going to happen or too hard so better to assume
43:29: you mean like running natively on the Quest um we would like it to happen at some point there's a lot of stuff that
43:34: needs to happen for it two main things is like you know the performance improvements which is something really
43:40: working on because Quest it uses mobile hardware and has a fraction of resources that your PC has which makes it like you
43:47: know a lot harder to kind of run as also like limitations you know for example with Unity engine where um the garbage
43:55: collector it uses it actually doesn't behave super well on the quest with something of this type where like you
44:01: essentially get increasing memory fragmentation and then eventually ends up crashing the application
44:06: even if it, like, manages its memory and never allocates more than, you know, what's available. The problem with
44:12: fragmentation is like it reduces how much continuous memory is available and
44:17: it just crashes um so there's like a bunch of kind of issues the other thing we need for native support is there will
44:25: need to be some content segregation because um your quest even with the best
44:32: optimizations it won't be able to like you know handle as much as a PC can you
44:37: know handle as much geometry handle as much complexity so the content will need to be sort of separated uh so you only
44:44: allow like you know more optimized more like low poly content you know running
44:49: on the mobile hardware so it'll happen at some point we're kind of working towards it slowly
44:56: um and I hope like we get there like you know as soon as we can but there is a
45:02: number of things that will need to be sorted out next question is from Grand UK uh
45:10: GrandUK is asking: "I was attempting to create a plug-in for Resonite and needed to make ProtoFlux bindings, which I
45:16: eventually stumbled on the ProtoFlux bindings generator, added without support, however it doesn't work for non-YDMS
45:22: assemblies" - I'm actually going to blow it up because it's hard to read - "assemblies. Would it be possible to update
45:28: it to work with non-YDMS assemblies for plug-in development? Yes, I know it's provided as-is with no support, but it
45:33: doesn't work at all outside YDMS. Yes, there's already a GitHub issue and I have given it a thumbs up." Yeah, like, if you're giving it
45:40: thumbs up like you know we can kind of look at it um I don't know how much work that would be so like you know it's just
45:45: a question of bandwidth so this is pretty much like just a
45:51: prioritization question like you know we need somebody to look at it and fix up whatever needs to be fixed
45:57: up uh next question is from VT Arcelus
46:02: how many flavors of ice cream does Froox enjoy? All the fruit ones. I, I really
46:08: like the fruit ice creams, which are more like sorbets. But, um, I still like regular ice cream too. How about
46:16: this: what kind of ice cream, um, I
46:21: like... what ice cream do I like? I tend to like mint chocolate chip,
46:28: um cookie dough can be nice
46:36: Um, yeah, I mean, those are good. I mean, sometimes I just like plain old vanilla too. Well, can we get the times
46:43: two there's like one ice cream I had was super good which was kiwi flavored it
46:49: was delicious also frozen yogurt which I guess could treat us ice cream it was
46:55: fruity i think it was watermelon they're both dairy
47:01: yay so next question is from Dev Hammer um I saw your de uh dev blog and seems
47:09: you have made a lot of progress on the audio system at least on the surface what still needs to be done before we uh
47:16: before the audio system is in a good place? I actually literally have my
47:21: notes uh so I can just bring it out because like I usually test it and there's like a bunch of stuff um so this
47:29: might not be all of it i'm just going to screenshot it so it's kind of easy to bring in uh it might not be all of it
47:35: there might be like more things I kind of discover along the way but these are like the main things I have kind of on
47:40: my to-do list uh so uh how do I place it there we go um so one of them is like
47:48: you know sending the actual audio updates uh that's you know going to like send events that's not happening right
47:55: now, it's relatively simple. Uh, any non-spatialized audio
48:00: needs to be global, because right now the falloff and spatialization in audio,
48:05: they are actually separate concepts, so even if audio is not spatialized it will still become quieter as you get further
48:11: from the source um so that's another thing also relatively simpler um need to
48:17: add some filtering, clamping, you know, some sanitization of the buffers, relatively easy. Um, then need to, uh,
48:24: change the default device, because the previous system was built with the idea that Unity is handling the audio
48:30: device by default and we override the audio engine or audio output now Unity
48:36: essentially the builds that I've shown the Unity audio system is completely disabled like it's just gone and it
48:43: means like we're actually handling the interactions with the audio devices ourselves um and so that system kind of
48:49: needs to be adjusted so instead of like you know it being an override it actually just uses it by default because
48:54: right now, unless I override it in the test builds, there's just no audio, because it just thinks, okay, Unity is
49:01: handling it, but there's no Unity handling it, so it's just quiet. Um, needs
49:06: to handle some scaling so like you know needs to handle like when you actually scale up and down there needs to be few
49:12: things, like some other properties; for, um, usually you disable spatialization if you're really
49:18: close to a source, so I need to add that, that's actually not even on the list. Uh, and Doppler, I need to implement the Doppler
49:24: effect which is like one of the little bit weirder ones so I think that one will take a little bit so like have a
49:29: good system for it also I have like an idea um there's also like you know just a bunch of testing some optimizations
49:34: because right now it just allocates a lot of buffers and deallocate so I need to add like pooling throughout the
49:40: system um so yeah there's there's a fair amount of stuff like a lot of it's like
49:45: relatively quick and easy so it's just you know kind of knocking it down in the list i kind of hope I can knock down a
49:51: bunch of things tomorrow, but I'll see how it goes.
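For the buffer pooling item at the end of that list, .NET's standard ArrayPool<T> is one common way to do it. A small sketch of the rent-and-return pattern, hypothetical as far as the actual audio code is concerned:

```csharp
using System;
using System.Buffers;

static class AudioBufferPooling
{
    // Instead of allocating a new float[] for every audio callback (which churns the GC),
    // rent a buffer from the shared pool and return it when done.
    public static void ProcessBlock(int sampleCount, Action<float[], int> process)
    {
        float[] buffer = ArrayPool<float>.Shared.Rent(sampleCount); // may be larger than requested
        try
        {
            process(buffer, sampleCount); // only the first sampleCount entries are meaningful
        }
        finally
        {
            ArrayPool<float>.Shared.Return(buffer);
        }
    }
}
```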
49:57: Next question is, uh, from BD: when making official tools,
50:05: how do you decide how much to do in C# and how much to do in ProtoFlux? For example, the Gaussian splat tool has the logic
50:11: to create and adjust bounding boxes in C#, which means you can't make changes on how it's handled within Resonite very
50:16: easily. It kind of depends what is easier right now, like, what's the least amount of effort. Um, eventually our
50:23: goal is like you know so you can essentially have all the APIs and things that the tools do work with a very
50:29: general system, and they can pretty much build all the functionality within ProtoFlux, and they can kind of adjust it,
50:36: but right now for some things it would just take too much time so it's like faster to just implement sort of the
50:42: skeleton of it in C#, and then, you know, make it neat in ProtoFlux. Uh,
50:47: that way I like only need to worry about the core functionality of it
50:55: uh and then like you know have like the art team like handle like you know sort of some of the polish you know adding
51:00: like a lot of kind of cool things um that way like for example you know with
51:06: the gausian split tool one of the things it does like you know it has like a lot of dynamic behavior where it needs to it
51:12: needs to get like you know objects like you know there are overlap like uh if you draw something needs to figure out
51:18: are there any Gaussian splats within this; it's using stuff like, you know, physics queries, which give it a collection;
51:24: we don't have collection support, which means you can't really do that with ProtoFlux nicely. Um, it's doing, like,
51:31: you know it's calling like async methods like which you can kind of do but like you cannot do it like dynamically on
51:36: things. So we do need to implement more stuff in ProtoFlux for those things to
51:42: be easier to implement once it kind of happens I would like us to kind of transition more towards that like where
51:48: pretty much most of the tools are actually just built purely in FrooxEngine with ProtoFlux, eventually WebAssembly,
51:54: and it just become content and there's very little stuff on the C# side or the C# side is mostly just general APIs and
52:01: frameworks for it but until it happens like you know it's going to be a mix of
52:07: both uh next question is from Shy Loki um hello folks i'm so happy I found
52:14: Resonite really unique VR experience i am curious on the state of improving full body tracking is that on upgrade
52:20: radar? Yay! And hello, I really enjoy Resonite. Um, yes, it's something I
52:25: definitely want like we definitely want to improve uh right now the main focus is on performance um which is kind of
52:32: getting into its final stages. IK is relatively high on the list as well. There's a few other things, like the
52:38: UI and, uh, some other stuff. Um, like ProtoFlux collections, for example, that's
52:44: like very something that a lot of like creators want so I don't know for sure
52:49: which one we'll end up prioritizing yet. Um, these usually get decided, kind of, closer to when I'm
52:55: actually done with the major task but it is something I want to improve like I like to use like you know full body
53:01: myself and I want it to feel better so it's definitely something that uh is
53:07: going to be improved at some point i just uh I don't know if it's going to be the next immediate priority or it's going to be things in between um I was
53:14: just checking time um so yeah it it'll be improved uh
53:21: but I cannot tell you like in a specific timeline right now um never seen is asking I feel like
53:29: I've missed something, what exactly is the Splittening? So it's sort of
53:35: a it's sort of a joke name like uh for our for the last phase of the big
53:42: performance update uh there's actually a dedicated video on our YouTube channel that I do recommend checking out um it's
53:50: uh, it's, you know, on how the performance update is going to work with the multi-process architecture. In short,
53:58: we there's like a few systems that are essentially sort of you know actually let me back up a little bit in order to
54:04: get the big performance update, we need to take, you know, FrooxEngine, which is running all of the logic, all the
54:10: kind of complexity and pull it out of Unity because Unity is using uh very
54:17: inefficient kind of runtime that's making all the code much slower than it can be uh and we need to pull it out
54:24: into its own process, you know, that runs under .NET 9, and essentially that is what
54:29: the Splittening is: we split FrooxEngine from Unity into its own thing, and it's just
54:35: going to communicate to Unity, which is only going to be handling pretty much the rendering at that point. Uh, and
54:40: that's, you know, the Splittening. In order for that split to be able to happen, we kind of need to
54:46: disentangle FrooxEngine from Unity, so, like, you know, it can actually be pulled out. Uh, the two main systems that
54:53: had, you know, deeper entanglements were the particle system, which is now replaced with Photon
55:00: Dust, and the audio system, which is being worked on right now. Um, once those systems are done, then essentially the
55:07: sort of integration between FrooxEngine and Unity can be reworked so it can communicate over, uh, shared memory and,
55:15: like, you know, some IPC mechanism, and we can pull FrooxEngine out,
55:20: essentially split it out. So that's the Splittening, and once that happens, you know, that's when we get
55:27: the major performance boost, because we'll be able to run FrooxEngine with a much faster run
55:33: time. I do recommend watching the video on our YouTube channel. Um, it has
55:39: like, you know, all the graphs, and everything is very visual.
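A toy illustration of the shared-memory idea behind the Splittening: one process (the FrooxEngine side in the real design) publishes frame data into a memory-mapped region and the other process (the renderer) reads it. This uses .NET's MemoryMappedFile purely for illustration; it is not Resonite's actual IPC protocol, and named maps like this are Windows-specific.

```csharp
using System;
using System.IO.MemoryMappedFiles;

// Toy illustration of shared-memory IPC; not Resonite's actual protocol.
// Both processes open the same named map and keep it open for their lifetime.
class SharedFrameChannel : IDisposable
{
    const string MapName = "DemoFrameChannel"; // hypothetical name
    readonly MemoryMappedFile mmf;
    readonly MemoryMappedViewAccessor accessor;

    public SharedFrameChannel()
    {
        mmf = MemoryMappedFile.CreateOrOpen(MapName, 1024);
        accessor = mmf.CreateViewAccessor();
    }

    // "Engine" side: publish a frame number and some value for the renderer.
    public void WriteFrame(int frameNumber, float someValue)
    {
        accessor.Write(4, someValue);
        accessor.Write(0, frameNumber); // write the frame counter last as a simple "ready" signal
    }

    // "Renderer" side: read whatever the engine last published.
    public (int frame, float value) ReadFrame()
        => (accessor.ReadInt32(0), accessor.ReadSingle(4));

    public void Dispose()
    {
        accessor.Dispose();
        mmf.Dispose();
    }
}
```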
55:45: Uh, Darkest Sabertooth: should we get Froox a Ninja Creami so you can make
55:52: your own fruit-based ice cream sorbet? I don't know what a Ninja Creami is. Is
55:57: there like a kit to make ice creams i I I don't know what that one is
56:04: i fruit ice cream i'm I'm for it but if it requires a lot of work then probably
56:10: no, because I'm not good with that stuff. Uh, GrandUK, follow-up to the plug-in question: does the issue mentioned
56:15: say how someone patched it to work with other assemblies? If not, I will add it.
56:21: yeah this can help like if if you already like like identified issue like we can look at it be like okay this fix works and we do that so that might help
56:30: uh but we like we'll still have to kind of look at it and evaluate
56:36: it. Next question's from GarenVR. Uh, GarenVR is asking: how do you and the other devs
56:43: feel about how far you and the project have come in the past couple of years since rebranding as Resonite? So, uh,
56:51: technically we haven't actually rebranded like Resonite is a uh it's like a distinct project um but overall
56:59: like like with our work like with like you know any like past projects and this one like I feel like we've kind of like
57:07: come pretty far like like if you like it's kind of hard to see progress you
57:12: know like as you kind of like making it you know it feels kind of slowish
57:18: because like like it just it takes a lot of time but then if you're like you know look a year back and you look two years
57:23: back and three years back like where you were at the time that's usually when you see like much you know kind of bigger
57:29: change um but it's kind of hard to do because like really go like you know on old builds but sometimes you can kind of
57:37: casual glimpses of it like you know when people are like "Oh like you know this this is improved this used to be a thing this is no longer a
57:44: thing." But you always kind of end up like wishing you could like you know go faster anyways so there's that
57:55: The next question's from BD: uh, I started Res... sorry, I started
58:02: after Resonite became Resonite; what's the difference between ProtoFlux and the LogiX thing I see on some other
58:08: items? So they're pretty much different systems; there's, like, general similarities, because both
58:15: are node-based systems, which is not unique to Resonite or any other application. Node-
58:21: based systems have existed for decades at this point they're both like you know programming languages um so you know
58:31: even with very distinct languages like you know going to have like similarities in like you know how they kind of operate but if you look at the details
58:38: there's a lot of Oh my god thank you oh my god there's a lot of Thank you BD for
58:43: the for the gifts um uh if you kind of look at them like
58:50: uh like there's like a number of distinct things for example Protolex it's a uh technically it's a separate VM
58:57: it's like a um its own sort of like virtual machine that sort of like you know takes all the code you you takes
59:04: all the note code and sort of builds its own structures um which is something that like you know uh Logix doesn't do
59:11: um so Protolex builds its own acceleration structures it's sort of like a semicompilation step it's also
59:17: stack based so it has like you know its own execution like stack and so on um which is also very distinct in how it
59:23: behaves and because of that it supports a lot of other features like for example you can call into nested node which is
59:30: not integrated yet but it exists in proto flags you can have local variables you can have stores which is something
59:36: that also system didn't support um there's um like even like just the
59:43: compilation step it makes it actually faster because um it doesn't need to be figuring out
59:50: constantly on the fly you know what it needs to be doing it sort of like tries to figure out as much as it can ahead of
59:56: time and then just you know execute on it um so that kind of like you know makes it behave much faster it also is
1:00:02: more efficient on how it kind of propagates changes so like um it builds this acceleration
1:00:09: structure where uh you know it tracks if any inputs change on you know on events
1:00:16: or if they change continually and based on that it classifies things into two different distinct update categories if
1:00:23: once update continually it's just it it essentially skips all of the tracking for changes and it's just going to
1:00:29: update it every frame because it knows it's going to be changing every frame um but if it's only on changes it like
1:00:35: builds a list okay if this input changes you know then I know I need to update this part of code to like run and comput
1:00:43: a new result versus like with the the um logic system what that would do it would
1:00:50: um every time change happens it would essentially have to propagate through the entire setup of nodes
1:00:57: um which takes a lot of like CPU time so it would be actually very slow and would be happening every frame if thing is
1:01:03: changing every frame and it would be kind of very inefficient so Protolex has like a lot of like it's works very kind
1:01:10: of differently which um gives you a lot of kind of benefits and like uh both
1:01:17: like in terms of like its performance and also capabilities it also supports async uh so you can actually have like
1:01:24: you know something that delays execution until next you know until for example next frame or something else
1:01:34: happens uh so I hope that kind of like uh clears that one uh there's like a number of like other like kind of
1:01:40: differences too but like I feel those are like some of the major ones um next question is from
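To illustrate that update classification in plain C# (a rough sketch only; these are not ProtoFlux's actual types or APIs): inputs that only change on discrete events get a dependency list and are re-evaluated only when one of them fires, while anything fed by a continuous source is simply marked to re-evaluate every frame.

```csharp
using System;
using System.Collections.Generic;

// Rough sketch of the two update categories -- illustrative only, not ProtoFlux code.
class Node
{
    public bool IsContinuous;             // e.g. a time-driven node
    public Action Evaluate = () => { };   // recompute this node's output
}

class Executor
{
    readonly List<Node> everyFrame = new();                   // continuous chains
    readonly Dictionary<Node, List<Node>> onChange = new();   // input -> dependent nodes

    public void Register(Node input, Node dependent)
    {
        if (input.IsContinuous)
        {
            // Skip change tracking entirely: this chain just runs each frame.
            everyFrame.Add(dependent);
        }
        else
        {
            // Build the "if this input changes, update these nodes" list.
            if (!onChange.TryGetValue(input, out var deps))
                onChange[input] = deps = new List<Node>();
            deps.Add(dependent);
        }
    }

    // Called when a non-continuous input actually changes.
    public void NotifyChanged(Node input)
    {
        if (onChange.TryGetValue(input, out var deps))
            foreach (var d in deps) d.Evaluate();
    }

    // Called once per frame: only the continuous chains pay a per-frame cost.
    public void RunFrame()
    {
        foreach (var n in everyFrame) n.Evaluate();
    }
}
```

The alternative described above for the older system would be walking the whole node graph on every change, which is where the per-frame CPU cost comes from.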
1:01:48: The next question is from Tikon, another one about VR hardware: are you planning to get the Bigscreen Beyond 2? It seems kind of promising. Maybe; I use the V4, which is a relatively old headset, but there isn't really a headset I'd want more at this point. The Bigscreen Beyond 2 seems like it could be it, but I'm probably going to wait a bit for some reviews and such, see how people like it, and then maybe consider upgrading to it.
1:02:23: What do you think? So initially I was going to pre-order one, but I think I'm going to wait, because Ryu said they were going to get one, so I might wait and see what Ryu has to say about it. Yeah, we're kind of in the same boat.
1:02:48: The next question is from PD_: do you have any plans for access control or licensing for assets, in order to allow paid assets to be used in worlds but not exported or transferred to avatars, etc.?
1:03:00: Yes, we would like to introduce some sort of asset licensing or license-tracking system, where you can say "I made this asset," and then specify how people can obtain a license and what they can do with it. The system would then be able to enforce those rules, make sure only people who actually have a license can use the asset, and we could provide methods for people to purchase the license, or maybe import an external license if they purchased it somewhere like Gumroad.
1:03:36: So yes, we want to have a system like that, and it would fit really well with the workshop. There's actually a video on the workshop on our YouTube channel as well which covers a bunch of this, but in short, in the workshop you'll essentially be able to publish anything you make, and some items can require a license in order to be used, which essentially makes the workshop serve as a store where you can sell your stuff.
1:04:03: The other cool thing is you'll be able to discover things, because people like to go into worlds and spawn stuff, and when somebody spawns something cool and you go "okay, I want to save that," it will say this is a paid asset and you need to purchase a license for it. So you have to decide: do I want to buy it so I can save it, or not? That way you can also buy things more organically, as you discover people playing with them in a world. I think that could be a really cool thing if you're an asset creator: if you make something that people like to play with, that's how you can get a lot of sales on it.
1:04:47: And thank you again for all the gifts, all the gifted tiers.
1:04:57: The next question is from Grand K: I'm not sure if I've ever asked this before, but could it be possible to add a local API to the headless to send server commands, as you'd type them into the command line normally? This could help people make programs to manage headless servers better.
1:05:09: Yeah, this is something we've covered a few times. We essentially want to provide some kind of API so you can interact with it from your own scripts, write your own dashboards and frameworks and things to operate the headless. I think there's probably a GitHub issue for it; I think there should be one.
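There's no official API for this yet, but as a stopgap a manager program can already wrap the headless process and write console commands to its standard input. A rough, hypothetical sketch (the executable name and the command here are placeholders, not an official interface):

```csharp
using System;
using System.Diagnostics;

// Hypothetical wrapper around the headless client -- illustrative only,
// not an official API. The executable name is a placeholder.
var psi = new ProcessStartInfo("Resonite.Headless.exe")
{
    RedirectStandardInput = true,
    RedirectStandardOutput = true,
    UseShellExecute = false
};

using var headless = Process.Start(psi)!;

// Log whatever the headless prints to its console.
headless.OutputDataReceived += (_, e) => { if (e.Data != null) Console.WriteLine(e.Data); };
headless.BeginOutputReadLine();

// Send a console command exactly as you would type it interactively.
headless.StandardInput.WriteLine("status");

headless.WaitForExit();
```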
1:05:34: Just checking in on the time.
1:05:40: The next question is from Lucas7: wait, why do we need PhotonDust and the new audio system for the splittening? PhotonDust is great and I can't wait for improvements and features in audio, but would it be possible to use IPC and continue using the Unity particle and audio systems?
1:05:54: I kind of covered this earlier, and I do strongly recommend watching that video on our YouTube channel, because I can't go into full detail on why it needs to happen. But in short, the problem is: imagine this is FrooxEngine and this is Unity. For IPC to work, or to be feasible, things need to be done in bulk, so you can send one big bulk message saying "please do this stuff." The problem with those systems is that they're very entangled: this ties into this, this ties into that, over and over. That makes it very hard to split the process, because doing that over IPC would be a nightmare, and a nightmare to handle things like latency and synchronization, because switching to another process is a context switch, and context switches are expensive.
1:07:01: They add a lot of latency, because right now something in FrooxEngine can literally call a method, do something here, return, do stuff here, call another method, and so on. Now imagine that every single one of those has to switch from one process to another and then switch back so it can continue. It essentially becomes a nightmare. If we really, really wanted to, we could technically probably do it, but it would run very slowly, especially with audio, maybe even so slow that the audio would be clicking because it would be missing a lot of its calls. It would make the system way more complicated, it would take way more time to implement, and we wouldn't really gain any of the other benefits,
1:07:55: because it would be so entangled that we'd have to rework it anyway to do some of the other multiprocess architecture stuff. One of those benefits is better fault tolerance: say this is FrooxEngine and this is Unity; if Unity crashes, we can keep FrooxEngine running, relatively easily spin Unity up again, and just start sending it data again, like "please render this stuff." When it's more entangled, it's way harder for one not to bring down the other with it.
1:08:28: The other part is that one of the reasons for the multiprocess architecture is also security. Once the processes are sandboxed, if one of them is compromised, say the renderer somehow ends up compromised, the sandbox limits what it can do. But if you have a very wide API surface, that's hard to control; when it's more encapsulated, that's a much smaller surface, and it's much easier to reason about and make secure.
1:09:06: Plus, the other reason is we do want to eventually take Unity, yeet it out, and put in a new rendering system, which means we'd have to rework those systems anyway. So we would spend so much time making it work with those systems, just for it to be slower, harder to manage, buggier, with more latency, only to then need to make our own systems anyway. It's just not a very good way to go about it, and the result would very likely take way longer than the current approach is taking, for worse results.
1:09:53: The next question is from Modern Baloney, who is asking: is there any plan to make haptic colliders properly movable while they're being driven? It seems they disappear until they're done being driven, which makes moving haptics a little weird.
1:10:04: I don't actually know what you mean; that sounds like it should probably be a GitHub bug report. Yeah, I don't really know what you mean there either, but that sounds like a bug.
1:10:18: Yeah. The next question is from BD_: I heard a rumor that a ProtoFlux driver, for example an output driving a FrooxEngine value, is updated every frame even if unchanged. Is this true in the current implementation, or should I not worry too much about the number of drives I have, as long as they don't change too often?
1:10:38: If it's unchanged, it kind of depends where that value is sourced from, but the system actually has a mechanism where it tracks that. Actually, I answered this in the last question, I'm having déjà vu, but in short, with ProtoFlux,
1:11:01: when you have inputs and a drive, if you plug an input in directly and it isn't changing continually, it gets put into a list: if A changes, then update this drive. If you have A and B and you're doing stuff with them, it'll update whenever A or B changes, as long as neither is a continuous change.
1:11:38: But if you use something like the T node, the T node changes continuously, so it doesn't matter whether your other input changes frequently or not; the drive will need to update every frame, because you put something continuous in there. It's a similar thing if you put a SmoothLerp in there: a SmoothLerp can be changing continuously, and it's hard to track exactly when it stops changing, so it forces the drive to update every frame, because it's a continuous behavior.
1:12:17: So it depends what your setup is. If you have a setup that doesn't change continuously, it should only update when something actually changes, but if you have something that forces it to be continuous, then it's going to update every frame.
1:12:40: I would say if there are rumors like that, and people somehow notice that the driver changes every frame when it shouldn't, they should make a bug report and see if they can replicate it. It's one thing for it to be a rumor, but the question is whether there's anything solid behind it, because a lot of the time there might not be; it's just people speculating about things, which makes things a little frustrating, because then we don't know: is there a bug, or is it just people assuming things? We've seen both happen.
1:13:18: Until somebody makes a solid report, like "I have this setup, it has nothing continuous in it, it shouldn't be changing every frame, but I've noticed it updates every frame," then it's something actionable; we can look at it and say, okay, there's a bug happening here. When it's just a rumor, there's nothing to really work with, which means we can't do much about it, and it might not even be true. So if you hear something like that, ask for sources; ask people whether they have any solid data that this is actually happening.
1:13:57: Gener is asking: would bypassing the license be something that would get moderation involved? Almost definitely. If there's a license that says you can use something a certain way or not, and you were bypassing that, that would be some kind of violation. We don't have any specifics on how that would be dealt with, because it's not the current priority, but yeah, we probably wouldn't allow you to do that, because we can't let you break the license.
1:14:34: The next question is from Grand UK: can the session browser be changed to show the name of the session rather than the world name, or maybe a toggle for this? It can be hard to find sessions by session name when they use published worlds unmodified.
1:14:45: We're probably going to rework the worlds UI to use data feeds, so we'll probably do a bunch of changes; I don't know if we'll be making changes to the current one. You can always make a GitHub issue about it.
1:14:59: Next question is from Grand UK: will audio be tested under Proton to ensure the basic audio features will work? Well, yeah: Cyro runs on Linux, and there's going to be a pre-release too, so feel free to test it as well. I'll probably run it on my Steam Deck as well. Yes, in short.
1:15:31: The next question is from BD_: is there a built-in way to actually confirm when a ProtoFlux driver is being updated?
1:15:38: I don't think we have tooling for that. Hmm, I wonder... do we have a node that knows when something changes in the data model? I mean, we do have a Fire On Change node. Yeah, but that's going to filter out writes of the same value.
1:16:00: Yeah, I can't think of a mechanism we have either to know whether something is being updated, or whether something is continuously changing. But that's actually the thing: if there's no mechanism, how are people saying this is happening? Do they make their own tools? If somebody's saying it is changing continually, how did they find out? If there's no tooling, did they build their own that tracks it? If they did, they can share that. And if they don't have the tooling and they're just assuming, then they're assuming that's what's happening, and that doesn't really put much weight behind the claim.
1:16:53: So the thing is, if somebody's making the claim that it is updating continually, the burden to prove that it's actually happening is on that person. The question is: how did you find out, and what tooling did you use?
1:17:10: The next question is from Nitra: are you actually certain that the splittening will improve performance? Is there any worry that you implement everything and then find it actually runs worse?
1:17:27: You can never be 100% certain of anything, but we have pretty strong confidence that it will significantly improve things. The reason is that, as part of the big performance update, we actually switched the headless to .NET 9 (well, it was .NET 8 at the time). We switched it first because the headless is essentially the same codebase; it's literally all the same code, minus maybe a few little bits, it's like 99.9% the same code, the same engine, and it didn't require that much work to move it to .NET 8.
1:18:14: So we did that, and the reason we did it is we wanted to see how much of a difference that runtime switch makes. What we found is that it's actually pretty big: it literally runs several times faster. We had the community host sessions on the same hardware; for example, there's Grand's karaoke session that runs every Monday, and before, with Mono, the headless would struggle with 25 people, it would be running really badly. Then they ran it with the .NET 8 headless,
1:18:53: and the world got to 44 people. I even asked about it; Foxbox told me they don't use any culling on the headless. The headless is computing everybody's IK, it's computing dynamic bones, it's computing everything, none of it is being skipped, and it still stayed at 60 FPS with 44 people, where previously it would struggle with 25.
1:19:22: There are a number of tests that Cyro and a number of people in the community have run that consistently give the same result: the systems just run several times faster. Recently somebody even showed me an item they made that profiles the spawning of items: if you spawn it on the normal client it takes something like 3-4 seconds, and if you spawn the same item on the headless it takes about half a second. Again, several times faster.
1:19:53: So based on that I'm very confident this will improve things. If we hadn't seen that when we did the headless upgrade, I wouldn't be so confident; we might have said, okay, let's maybe change the plan, let's take a different approach. But the fact that switching the headless provided such a huge performance boost is why we're investing all this time and energy into this. It was done on purpose, so we could gather data on how much faster it runs even before the switch itself.
1:20:33: Even before that, I'd done a lot of microbenchmarks on different parts of the code, comparing between the different runtimes, and seen a similar thing: most of the code literally runs several times faster. So I think it's very unlikely that it would actually run worse; I think it's going to run way better. And again, we can never know anything with 100% certainty; maybe there's some really weird thing that happens and makes it run slower for some reason, but I think the chance of that is very low.
1:21:06: Yeah, and just as a cherry on top: even before Froox reworked some of the particle system to use C#'s native parallelization library, I tested the new particle system with some friends on a big mesh-collider map, a typical giant landscape where the whole thing is a mesh collider. We tested around 4,500 particles, with and without collisions.
1:21:43: Without collisions, with the particles all flying about, it was maybe 2.5ish milliseconds per frame on the graphical client, which would be... I don't know how high an FPS that is. How do I calculate FPS from milliseconds, is it one over the time or something? One over gives you seconds, so we need to multiply by a thousand.
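In other words, FPS is just 1000 divided by the frame time in milliseconds; as a quick check:

```csharp
using System;

// Frame time in milliseconds -> frames per second.
double Fps(double frameTimeMs) => 1000.0 / frameTimeMs;

Console.WriteLine(Fps(2.5));  // ~400 FPS
Console.WriteLine(Fps(80.0)); // 12.5 FPS
```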
1:22:14: Okay. So, about 2.5 milliseconds per frame on the graphical client, and 230 microseconds per frame on the server, because the server simulates the particle system too. Then we turned on collisions, and the server jumps up to about 7.3 to 7.5 milliseconds or so, with the particles all performing raycasts and doing their collision thing, and the graphical client jumps up to 80 milliseconds per frame with collisions on, because it's doing the raycasts, which query BEPU, which runs a lot slower in our current runtime.
1:23:00: So that alone is like 10 times faster. Yeah, 10 to 12 times faster on average in that scenario. I want to do more tests, because Froox has improved the parallelization of it a little bit since then, but that alone is already a "wow, that's a lot."
1:23:24: And if you needed even more: this is also great because some of this was with .NET 8, and now there's .NET 9. The performance difference between Mono and .NET 8, for example, is drastic, at least an order of magnitude, but they also keep making improvements with every .NET version.
1:23:48: Let me just pull up this graph. A while back, when .NET 9 released... our cloud infrastructure actually already uses the modern .NET, and I wonder if you can tell where we switched the server code from .NET 8 to .NET 9. This is literally the CPU usage for the same workload; we just swapped the server to the new version, and the CPU usage visibly dropped, just by changing .NET 8 to .NET 9 in the codebase. It's just free performance. They keep making drastic improvements, and with every release we can expect to get even more out of it, and that's with the same codebase.
1:24:35: The other part is that once we upgrade, we can switch to a more modern version of BEPU, which has even more optimizations; it utilizes a lot of features of modern .NET that the version we're on can't use. We'll be able to do the same for our own code: we'll be able to use a lot of the new performance features that just don't exist in the old Mono/.NET Framework runtime, which is going to give an even bigger performance boost.
1:25:07: So in short, I'm very confident it's going to help performance dramatically. If I didn't have this level of confidence, I wouldn't be putting this much effort into making it happen.
1:25:22: The next question is from Alex DPI: Froox, how do you feel? You sound a little sick, or like you have a cold. Yeah, I'm a little off; I don't think I'm sick, but part of it is that I ate way too much of the potato salad, so that's one of the reasons I'm just lounging and being kind of tired,
1:25:54: but hopefully still answering questions. Okay, Nitra is asking: I was mainly thinking about the Unity-FrooxEngine IPC connection, not necessarily the original performance question. Has the IPC stuff been tested at all? I have done some things with IPC; it was actually quite slow, but I may have just coded it badly.
1:26:14: We've done a bunch of research into IPC methods. The one we're going to be using heavily is called shared memory. What essentially happens is that each process has its own chunk of memory. Say you have a process, say this is FrooxEngine, and then we have Unity; I'm just going to draw a weird cube or something, pretend this is Unity, and this is a poorly drawn Gloopy. Each process has its own memory, and that maps to physical memory: maybe this chunk maps here, this chunk maps over there, this one has a chunk here that maps there.
1:27:09: What shared memory does is take a region in one process's memory, say this one, that maps to a physical memory region, and then the other application maps that same physical region into its own memory, so literally two processes share the same chunk of memory. And this mechanism is extremely fast.
1:27:41: There are other mechanisms where you go through the operating system: this process tells the OS "tell that process this," and the OS passes it over. That's usually what takes more time, because it has to switch to the operating system, do a thing, and so on. But even that can be workable.
1:28:05: Often, when you design IPC mechanisms, it's a bit similar to when you use P/Invoke in C#, for example, to call a native library: the switch from one runtime to another can be expensive. What you usually do to make that efficient is reduce how often you switch. With C#, if you use P/Invoke, you want to model your API so you call as infrequently as possible, but with each call you hand over a big chunk of data to work with; that way you reduce the cost. So if we were using a messaging mechanism, we would probably prepare one big update, pretty much everything it needs to do for whatever frame is being rendered, and just send that over. The cost of switching is then minimized to the point where it's negligible compared to the amount of time being saved.
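As a hedged illustration of that batching pattern, here's a sketch in C#; the native library and function names are made up for the example, not anything Resonite actually calls:

```csharp
using System.Runtime.InteropServices;

static class NativeRenderer
{
    // Costly pattern: one managed-to-native transition per particle.
    [DllImport("hypothetical_renderer")]
    public static extern void SubmitParticle(float x, float y, float z);

    // Better pattern: one transition per frame, handing over the whole batch at once.
    [DllImport("hypothetical_renderer")]
    public static extern void SubmitParticles(float[] positions, int count);
}

static class Example
{
    // positions is packed as xyz triplets; the per-call overhead is paid once
    // instead of once per particle.
    public static void RenderFrame(float[] positions) =>
        NativeRenderer.SubmitParticles(positions, positions.Length / 3);
}
```

The same shape applies to IPC messaging: one big "here is everything this frame needs" payload instead of many tiny calls.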
1:29:17: But with shared memory it's even better, because the two processes can literally share the same chunk of memory, so there's pretty much zero copying that needs to happen. Resonite will essentially fill a buffer with the stuff that needs to be rendered, and Unity just reads that as if it were its own memory, which means zero copies; it's extremely fast. There are even libraries for it. It's pretty much as if each process were accessing its own chunk of memory; there's essentially no overhead.
1:29:51: There might be some small messages that get sent over just to say "okay, this is ready, render this." That might also happen over the shared memory, there are mechanisms for that, it's not fully decided yet, but this is how blocks of data are going to be communicated.
1:30:11: A lot of those systems are also being designed to work around this. For example, the way PhotonDust works, when it finishes the simulation it has a lot of its own buffers, but the final result, the data necessary to render the particle system, is one continuous data buffer. So it can literally put that into the shared block and just say "the PhotonDust buffer is at this location within the shared block," and Unity will interpret it and render it.
1:30:43: So those systems are designed to work with this IPC mechanism, specifically with shared memory, and so will the integration. As part of the splittening I will be redesigning how FrooxEngine communicates with Unity, so it essentially prepares a big buffer of stuff that Unity needs to do and then tells it it's ready; Unity goes to town on it and does the rendering, and FrooxEngine can do its own stuff in the meantime.
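To make the mechanism concrete, here's a minimal, hedged sketch using .NET's MemoryMappedFile. This shows the general technique only, not Resonite's actual buffer layout or protocol; the two halves would run concurrently in two different processes, and named maps like this are Windows-specific (a file-backed map via CreateFromFile is the cross-platform route):

```csharp
using System.IO.MemoryMappedFiles;

// Producer process (think: the simulation side) writes a block of results.
using (var map = MemoryMappedFile.CreateOrOpen("demo-shared-block", capacity: 1 << 20))
using (var view = map.CreateViewAccessor())
{
    float[] positions = { 1f, 2f, 3f, 4f, 5f, 6f };               // e.g. packed particle data
    view.Write(0, positions.Length);                              // element count at offset 0
    view.WriteArray(sizeof(int), positions, 0, positions.Length); // payload right after it
    // A real setup would also signal "frame ready", e.g. with a named EventWaitHandle.
}

// Consumer process (think: the renderer side) maps the same region and reads the
// data back without any copy through the OS -- both sides see the same physical memory.
using (var map = MemoryMappedFile.OpenExisting("demo-shared-block"))
using (var view = map.CreateViewAccessor())
{
    int count = view.ReadInt32(0);
    var positions = new float[count];
    view.ReadArray(sizeof(int), positions, 0, count);
}
```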
1:31:13: So I'm pretty confident that stuff is going to run pretty fast, and I've done a bunch of research into it. There's also precedent for this: there are a number of projects that use IPC mechanisms and run really fast. For example, your web browser: web browsers use a multiprocess architecture and they do quite complex stuff. Often you have processes that handle the web pages themselves, and they need to communicate a lot; for example, the part that renders, like WebGL, the process that interacts with the GPU, is often a separate process from the one handling the page, so it's communicating over IPC. A lot of the complex stuff it does, if you design it well, can run really fast.
1:32:11: The other example is VR itself, because when a frame is being rendered, if you use SteamVR for example, the results are sent over IPC to the compositor, which is a separate process. So IPC mechanisms, if you use the right ones and use them the right way, are fast enough to run even VR stuff, and to do a lot of high-performance stuff in web browsers, even things like 3D WebGL.
1:32:47: So in short, I don't expect it to be an issue, especially with the shared memory mechanism and with building everything to take advantage of it. When I started investigating this, I did a whole bunch of research: what IPC mechanisms are there, how do they perform, which would be the best ones to use? I decided on shared memory quite early on and then designed PhotonDust and other things around it. The audio engine doesn't even need it, because it's not going to be sending anything to Unity at all, which is also a way to make it faster; for audio we don't need to worry about IPC because it doesn't touch Unity. So I hope that answers the question.
1:33:40: Yeah, I think even for some smaller bits of data you could take advantage of the atomicity of certain operations. Yeah, often with these things it's about how you use the APIs; there can be overhead with some, and if there's overhead you find a way to work around it. There definitely are ways to make this fast, and based on the research I've done, I'm confident this is going to run much faster.
1:34:18: Also checking the time: we've got about 25 minutes left. The next question is from Ozie: I know this is probably implementation details, but you mentioned before possibly restarting Unity when it crashes. With the splittening, Unity handles the SteamVR rendering and communication; would SteamVR complain massively about losing connection to the application?
1:34:41: I mean, it will, yeah. If the renderer crashes, from SteamVR's perspective the application is gone, but we just start it again and it connects to SteamVR again. SteamVR isn't interacting with the main FrooxEngine process directly anyway; it goes through Unity, the renderer, which is what interfaces with it. So from SteamVR's perspective the application shuts down and starts again, but it doesn't matter for the main process.
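That supervision pattern, in very general terms, looks something like this (a hedged sketch with a placeholder executable name, not Resonite's actual process management):

```csharp
using System;
using System.Diagnostics;

// Keep a child renderer process alive: if it dies, start a fresh one and resume
// feeding it work. The main process itself never goes down. Illustrative only.
Process StartRenderer()
{
    var renderer = Process.Start(new ProcessStartInfo("Renderer.exe"))!;
    renderer.EnableRaisingEvents = true;
    renderer.Exited += (_, _) =>
    {
        Console.WriteLine("Renderer exited -- restarting and re-sending scene data.");
        StartRenderer();
    };
    return renderer;
}

StartRenderer();
```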
1:35:20: The next question is also from Ozie, an extra question on top of that: could there be a way, after multiprocess, to switch from desktop to VR without starting up in SteamVR first?
1:35:27: Technically that doesn't need the multiprocess architecture; we could just implement that, we could initialize SteamVR later. There are a few changes that need to happen, because right now when FrooxEngine starts, the engine is initialized either as desktop-only, screen-only, or using VR. One of the approaches is to unify that so it just doesn't care: it always initializes everything in a way that assumes you could be using either, and then we just spin up SteamVR when needed.
1:36:08: So yeah, it technically can be done without the splittening; it mostly needs that work so the engine isn't making assumptions that you're going to be on the screen. The multiprocess maybe helps a little bit, in the sense that if we couldn't implement the renderer so it can be initialized after the fact, and I don't really see much reason we wouldn't be able to, we could just restart the renderer. But I don't think that's necessary; I think we can just implement this.
1:36:45: Mart is asking: does Chroma have his own office hours, or is he part of the art office hours? I haven't seen him in the art office hours and I have a lot of UI/UX questions.
1:36:52: I don't think he does, actually. Do you happen to know? I don't know if he runs his own right now. I don't know, let me check the events. It might be worth asking at the art office hours. Yeah, I don't know, I haven't kept up with that particular thing; I'd also go and ask over there and see what you can get, because they work with Chroma a lot more, since it's more intertwined. I know Chroma does run something sometimes, like streams on Discord and such; I don't know if he accepts questions during those, but it might be worth checking out. You can also ask UX questions over here, or ask someone on the art team, because a lot of people on the team have overlapping responsibilities there.
1:38:04: We have about 22 minutes left.
1:38:09: And there are no more questions right now, so if you've still got more questions, feel free to ask.
1:38:15: Actually, I just realized we did not get a schnuppet today. Yeah, there's no schnuppet, what happened? Is Grand sick?
1:38:28: Oh, there we go, we got a schnuppet from Grand, but now it's too late. It's too late for a schnuppet, unless... do you have one? Do I have one? Well, I guess I only have a kind of mild one that a question earlier made me think about: the myth that the IK is this big heavy thing, when it's really not.
1:39:05: Yeah, I mean, that's kind of a thing in general. I'm actually going to turn this into a bigger schnuppet: I feel like this is a thing in general, where a lot of people will just take things at face value, and that's how a lot of rumors spread, because somebody says something and... it's a little bit like, what's the name of it, Chinese whispers, where you tell one person one thing and tell them to pass it on, and they tell it to another person but morph it a bit. A game of telephone, yeah, a game of telephone. And it just keeps morphing and morphing, and we've noticed that happening with a lot of things in the community, where things propagate and people just take them at face value, but they also mutate over time.
1:40:01: Sorry, I need to reposition. There we go.
1:40:07: So a lot of people will just repeat things and not verify them. Some of my background is actually in science, and one of the things you do a lot there is figure out what your primary sources of information are. It wasn't just science; I also worked as an editor for a magazine, and a lot of the time we had to do research when reporting on things, and you do have to find your primary sources, because using secondary or tertiary sources can be unreliable; the information gets morphed.
1:40:48: So a lot of my thinking, whenever I hear somebody say something, is: where is that coming from? Where did you get that piece of information? Are you the source? Did you do a test, did you do something, did you get the information from somebody else, did they do the test? Where is that statement coming from?
1:41:12: It's also very important to do as an engineer, because if you want to solve problems you need to understand them. Some people solve problems by just poking at things until they start working, and you end up in a situation where you fixed it but you don't know why it's fixed. I really hate when that happens. Whenever there's a bug and it starts working and I don't know why, I will usually keep at it, because I need to understand what is happening, why it was not working before, and why it works now. If I don't understand it, I don't know whether it's actually fixed or whether it's just part of the behavior, because usually if I don't understand it and it seems to have fixed itself, it has a tendency to pop up again later, or it might just be hiding some more devious bug. So I'll keep drilling down, trying to get to the core of the thing, trying to find the source of it.
1:42:16: You also do things like applying Occam's razor, where you try to remove everything that doesn't matter, that doesn't make a difference for whatever the statement is. If there's a bug report, for example, and the person mentions this, this and this, I ask: does this thing matter? Does that thing matter? Because the more variables there are, the harder the problem is to solve, and anything you can eliminate helps. Imagine a bug is composed of loads of little pieces: there's stuff here, and here, and here, and you don't know where the bug is. What you want to do is ask: does this make a difference for the bug? No. Does this? No. Does this? No, okay, those aren't it. Does this make a difference? Yes, this is my core issue, and that is much easier to deal with than dealing with the whole pile.
1:43:25: And I kind of wish people would adopt more of this kind of thinking, not even just for Resonite but more generally, because I do see it happen where people just spread things without having any basis for them; somebody said something, so they repeat it.
1:43:58: The way I think about things, and it's similar to how you approach things in science, is: what is the strength of the evidence you have for something, and in how many ways can it be verified independently? In science you have the peer review process. Whenever you publish a study, you start with some kind of hypothesis, and what you actually do, or what you should do, is not go out trying to prove your hypothesis; you go out trying to disprove it. You try to find anything that could show your hypothesis is wrong, and if you can throw a lot of things at it and it still stands after you've genuinely tried to disprove it, that gives you confidence that it's true. It doesn't give you 100% certainty, but it gives you more confidence.
1:44:54: So maybe you publish your study: we did this, we tried this, and because we tried to disprove it in this many ways and it's still standing, that gives us good confidence. Then other people can look at it, and through the peer review process they may try to poke holes in it, and if it can withstand even that, that makes it stronger. And if other people run their own studies, approach it from different angles, and arrive at the same result, the hypothesis still standing after those attempts to essentially disprove it gives you even more confidence that it's a good model for what is happening.
1:45:44: Because if you instead go out trying to prove your hypothesis true, you end up suffering from a number of biases: you'll look for data that fits the hypothesis and you might ignore, or not even notice, data that doesn't, and that can lead you to false results. That's why it's so important to approach it from the other angle: you try to bring it down, you try to disprove it, and if it can withstand that, then it's a good model.
1:46:25: And it's the same with any piece of information: if somebody says something, then unless they have a source, to me it doesn't carry much weight. But if they say "I did this test and it showed this," that puts a lot more weight behind it. I'll still ask: did you check this, did you check that, could you be mistaking this for that? And if it withstands that, then it has even more weight and I'll take it even more seriously.
1:46:58: With most things you'll never be 100% sure; there might be specific edge cases where it's not quite correct. But the more you try to poke holes in it, the more you scrutinize it and it keeps standing, the more confidence you can have that it has some solid basis in reality.
1:47:32: But if it's a rumor, somebody just said it, how can you tell whether it's based in reality or just a game of telephone, where somebody said something, somebody repeated it, and somebody else repeated it again? It's almost like a butterfly effect: somebody says something small, somebody repeats it, it keeps spreading, and you end up with a whole rumor that started from something tiny and just snowballed, because nobody along that chain tried to dig into it and find the source: is this really true, is this really happening?
1:48:22: It's actually something that happened recently, and I think I've talked about this one before. It happened when I made some of the changes for join invites and made the "ask to join" button appear. Somebody said "it leaks invisible status," and people started panicking: "oh, this is horrible, the invisible status is getting leaked." And I looked at it and asked: does it really?
1:48:49: So my first thought was: leaking invisible status means you can tell the difference between a user who genuinely goes offline and one who goes invisible. If that were true, then a user going invisible would keep that button and a user going offline would not, meaning you could tell the difference; that would mean the invisible status is leaked.
1:49:17: So I did two tests. First, I started Resonite on two computers with two different accounts, both online, and then with one of them I went invisible and looked at the other: okay, the ask-to-join button is still there. The other important part: I shut both of them down, started cleanly, both online, and then I actually shut the other one down so it truly goes offline, and looked — the button was still there. Which means this has nothing to do with leaking invisible status.
1:50:00: And that's an easy test anybody could have done before reporting it, before saying things. But people saw a thing and jumped to a conclusion without verifying that the conclusion was actually valid, and that's potentially how a rumor could have started, that this thing leaks invisible status, even though it's demonstrably not true.
1:50:27: I hope more people can adopt this kind of mindset: if you see something, or think something, try to disprove it first. Think: how could I check whether this statement, this hypothesis, holds? Because that's pretty much what it is; my hypothesis was that the button leaks invisible status. How do I disprove it? It's very easy: literally just go offline and see if the same thing happens. The test took me maybe five minutes, maybe even less, and it immediately showed there's no basis in truth for that claim. There's still a bug that happens, but it's not what people were claiming it to be, and there was no reason for the panic.
1:51:32: So please, if you're working with something, try to think about where you're getting your information from, what the basis for it is, and how you would disprove it, and base your confidence in the statement on how solid the evidence is. Try to find the primary sources: where is this information originating from? Is it originating from a statement somebody made without any verification, or from an actual test that you can replicate as well? Because in science there's another thing: when you publish a study, part of the purpose is so other people can replicate what you did and see if they arrive at the same conclusion, and if multiple people repeat the experiment and end up with the same conclusion, that gives that piece of information, that hypothesis, even more strength.
1:52:36: I've rambled a bit there, but yeah, that's my schnuppet.
1:52:41: Well, we have eight minutes to answer these next two questions. We have eight minutes for these; I think it's actually one question they put in two messages. No? Oh, there are a few.
1:52:54: So the next one is from Bitkark IGN: I had a discussion about Resonite's performance compared to VRChat. Personally I think it's not fair to compare Resonite to VRChat because they're totally different, but despite the nice dynamic nature, I think it has some advantages over VRChat, or might have later on, like utilizing .NET 9, the new renderer and more... continuing... oh yeah, it's two questions; oops, I pulled out another one by accident, there we go... continuing: but I'm curious to hear what you think. Do you believe Resonite can ever come close to VRChat's performance when compared on similar gameplay? I mean things like hanging out in a static hangout world.
1:53:33: Yeah, I think it definitely can. I think .NET 9 is going to be a big improvement; we'll see how it runs with that, and there are still a lot of other performance optimizations we can do after it.
1:53:44: There are two things here. In some aspects I do feel it's a little bit unfair, because Resonite is doing a lot of things that VRChat isn't, like the in-world editing and so on, so you're comparing performance that includes doing things the other platform doesn't. But also, even in those cases, to the end user it might not matter: they won't care why it runs slower,
1:54:15: it just runs slower, and that's the main thing. But I do think, and it's not even just about coming close, I think we can run way better, especially with .NET 9 and some of the future performance updates, because with .NET 9 we'll be able to use much more modern technologies, and we already have a lot of systems designed for very high-performance usage. There are other things we want to do to scale better too; for example, we have the asset variant system, where we can generate lots of variants of assets on the fly, even for existing content, and optimize content for different kinds of devices, which by itself can help quite a bit. So I'm pretty confident it's doable.
1:55:04: I don't really have any side-by-side benchmarks, so it's hard to say, but I don't see a reason why it wouldn't be able to run faster, especially in the same scenario where people are just hanging out; we can optimize those scenarios. That's another part of performance: it depends on what you're doing, because you want to be comparing very similar scenarios. If you're doing very different things, like heavy editing, say using a node that's constantly scanning the hierarchy, that's not going to run well even with a lot of optimizations. So the context matters, but if you have well-optimized content on both sides, with all the optimizations in place, I do think it can run the same or even better, thanks to the modern technologies.
1:56:02: Uh, next question: Garin VR is asking, probably not the best
1:56:10: place to mention this, but I've heard rumors of varying certainty spread about Resonite on a couple of VR platforms. How
1:56:16: do you plan to deal with this, and what is the plan of action towards dealing with issues or accusations if they're actually true? So, I'm pretty much like
1:56:24: this: the best thing we can do is provide more sources of official information, but I don't think it's a
1:56:31: problem you can really 100% fix, because people will spread rumors, and a
1:56:39: lot of people will not care about whether they're true or
1:56:44: not. That's something that is hard to change, but what we can do is, by
1:56:50: providing these materials and talking about the issues, provide more sources of official information. And I've already
1:56:57: heard from a number of people that when they're talking with somebody, before, they would be like, "oh, I heard Froox saying this
1:57:03: and that" or "I heard this team member saying this and that", and that's kind of hard; but now they can just say, "there's a Resonance video",
1:57:09: they have something solid to point them to, they have a primary source of information, and that
1:57:16: can help make people more informed. And part of the
1:57:21: reasons why I'm doing this, why we're doing these office hours, is to help prevent rumors by
1:57:28: talking about things and giving you as accurate information as
1:57:34: we can. Usually, if there's a rumor, you can bring it up;
1:57:40: it's kind of hard to talk about it in general, because I don't know which ones you're talking about,
1:57:45: but if you bring one, we can talk about it; we can see where it might be coming from, what the official position is,
1:57:52: or whether we see something that's incorrect about it.
1:57:58: Um, so yeah, I would say bring more
1:58:04: specific things, but overall these office hours are a good way to combat some of
1:58:10: the rumors. Uh, next question is from Alex
1:58:15: Taco... uh, Anton... uh, questions from CFK. Oh,
1:58:21: we also might not have enough time to answer all of these questions, unfortunately. Yeah, we have two
1:58:26: minutes. Uh, will there be detailed node information on the node selection screen? Probably; that interface is going to
1:58:31: be redesigned. Uh, next question... where do we get these
1:58:37: big questions at the end? Uh, Alex, content feature: will there be integration of third-party modifications from users that improve the application,
1:58:44: with a process for additional installations? Probably not. Um, Alexandon: roadmap, at which stage will you
1:58:52: see Resonite released? We literally made a video on it last week; check out the YouTube channel. Um, Alexandon: world
1:58:58: preloading? I don't know what that means exactly. Um, Maria says: does the UI
1:59:06: design have a name? We call it RadiantUI.
1:59:11: Uh, Ages Wolf: is one of the reasons we're using an older version of Unity, compared to other places, that Unity isn't doing
1:59:17: much with real-time rendering anymore? Not really; it's a complicated issue, I don't have time to answer it in detail, but the newer
1:59:24: versions of Unity are missing some of the crucial features that we need, plus it takes a lot of time to
1:59:29: switch to those, and we want to switch to a completely different engine, so it's a question of where we invest our time.
1:59:35: Um, and I think we're just going to... yeah, there's a lot of questions where I recommend checking out
1:59:42: our YouTube channel, and if you have feature requests, make them on
1:59:47: our GitHub. Yeah, literally, for several of the things you posted there are already GitHub issues,
1:59:53: so I recommend checking that out, and check the videos on YouTube. We're going to have to cut it
1:59:58: here, because this is the last... uh, two seconds. So thank you very much for watching. Um, I hope you enjoyed
2:00:06: this episode of the Resonance; I hope you
2:00:12: enjoyed all the questions and all the answers. Um, thank you everyone for supporting Resonite. Also, a big
2:00:19: thank you to everyone supporting us on Patreon and Stripe, and especially to those
2:00:24: who switched to Stripe; this is already making big changes in our financial numbers. Instead of
2:00:30: being in a deficit, we're actually now cash flow positive. Um,
2:00:36: Bob had a nice recap of it on yesterday's... no, Friday's recap stream, so check out the
2:00:44: recording of that if you're interested. And if you're supporting us on Patreon, consider switching to Stripe, even at
2:00:51: the same tier, because we get about 10% more money, which means we
2:00:57: can invest it into making Resonite better. But whether you support us financially, or by just being part of the community, building cool
2:01:04: stuff, hanging out, just being part of the place, thank you very much. Um,
2:01:09: there's probably not going to be a Resonance next week, because I'll be
2:01:16: traveling, and after that I'm not certain either, so it's possible the next two might not happen; I'll have to
2:01:23: figure stuff out. But next week, very likely not. Yeah,
2:01:29: yeah, next week is definitely not going to happen, so we'll make posts whenever the next one happens. So
2:01:35: whenever that is, we'll see you at the next one. Um, and let me see if there's anybody
2:01:41: to raid. We like to raid people who are streaming; it might be only Creator Jam.
2:01:49: Yeah, Creator Jam. Guys, if you have questions you want to ask
2:01:55: when the stream's not happening, be sure to check for the office hours thread that Froox makes. Yes, uh, I might not make
2:02:02: one right now, because I don't know when the next one's going to be. So, um,
2:02:09: I do recommend bringing those bigger questions earlier, because at the end we always
2:02:15: have a whole bunch of big questions that we don't have time to answer. So,
2:02:24: um, we should be ready to raid Creator Jam. So thank you again, thank you very
2:02:31: much for watching, and we'll see you at the next one, wherever, whenever the next one happens. Bye-bye!
2:02:37: bye-bye