This is a transcript of The Resonance from 2025 April 27.
0:01: there we go we should be live let's see if this works okay oh it's like doing
0:09: 2.6 megabit so it should work with the connection i'm going to
0:17: post we're going to post uh the office
0:25: hours there we go and posting live
0:31: stream live streams there we
0:37: go okay we should be live hello do we have any people in the
0:43: stream hello oh jeez this is so hard to read in the quest bit crack
0:50: IGN oh the quest the quest Steam Link streaming
0:55: doesn't like that blue combination and it's so blurry we got Marsh
1:00: hello hello hello we have another episode of
1:06: the Resonance we got Ber we'll see like I kind of mentioned in the post but we'll see how this goes
1:13: because I've been like I'm currently away from home and I've been having issues with the like internet has been
1:19: like really slow for the upload but it seems it should be fine but if it
1:25: explodes we might need to end this one early but so far I think it's okay it's like it's showing green
1:32: so I think it should work green for great
1:40: green so hello everyone hello and let some people
1:45: gather in so let me actually open it can you hear us fine as well because I'm using a different setup
1:53: uh I'm using Zoom to Quest Pro right now sounds fine we got people piling in
2:02: hello yep just did I get enough sleep probably not but you know we're we're out here well
2:09: thanks for asking Jack so hello everyone and welcome to another
2:16: episode of the Resonance um we've missed I think two episodes because I was
2:21: like traveling and I was at a con and so on and this week also been kind of like
2:26: a little bit wasn't I wasn't sure we were going to do one this week either but um now we can uh but the
2:36: um I'm bouncing my leg um but now like the the internet's kind of weird like
2:43: the uploads like 10 times slower than it's supposed to be so we'll see how it goes let me know if
2:50: like the stream is having any issues but so far it seems okay so I think we're like we should be good to go
2:58: forward um so yes so welcome everyone I'm Frooxius
3:03: I'm here with Cyro we're doing another episode of the Resonance where you can ask pretty much anything you want about
3:08: Resonite, about the team, like you can even ask casual questions too if you want um the only thing make sure to put a
3:16: question mark at the end of your question that way it kind of pops on our thing here so we don't miss it in the chat
3:23: um so uh we should be able to get started uh we don't have any advanced
3:29: questions this week because I was like I wasn't sure we were going to do like resonance like I didn't make one um
3:35: probably do one for the next one or maybe it's going to be a bit frantic like for the next month or so um so
3:43: there might be some more kind of uh there will be some more like resonances oh thank you so much for the
3:49: subscription and and the the cheers the bit more frantic there might be some
3:55: that like escape like if there's any kind of things happening um so just bear with us for the next one
4:03: however I do actually have like a announcement like sort of announcement i'm going to make like a proper one but
4:09: we have a new subreddit that's at
4:15: reddit.com/r/Resonite and one of the things we'll be doing is like AMAs, you know, ask us, ask me
4:23: anything kind of threads um I'll do make some like sometime during the next week it's going to happen sometime during the
4:28: next week so you can like you know ask questions there as well so it's going to be another kind of format it's also
4:34: probably not going to be as regular as the resonances and as the office hours but we did talk with the rest of the team the uh moderation team is also
4:42: interested in doing one the art team they're interested in doing one so there's probably going to be more so if you want to ask questions in different
4:48: format uh there's going to be another opportunity for that um there's going to be you know kind of like
4:54: more text based you know like answering like where we can post a bunch of questions and then like there's going to
5:00: be time when we're going to be answering them live but we can also answer you know the ones that been like written there um and hopefully also getting like
5:06: more people using the subreddit, like just kind of opening more avenues of communication um but yeah uh we don't
5:14: have an advanced question so we can jump in straight into the live stream questions and again make sure to put a
5:20: question mark if you want to ask something uh that way it kind of pops in our thing uh with that we should be able
5:26: to get started am I forgetting anything and um
5:33: I don't know if I'm forgetting anything i don't think so we'll we'll figure it out as we go um but yeah u we're also
5:40: going to be showing the questions like I'm not used to the quest controllers uh
5:46: as I was asking questions I'm not I tend to bounce my legs sometimes i usually don't like because I don't have like
5:52: full body set up with me so I'm like you know just cannot lounge
5:57: um and also not super used to these controllers g is asking "Do you have a
6:04: schnoit or do you have a schnoit?" Do I have a schnoit today let me think
6:11: for a second
6:16: no I don't know if I do i think you guys have been good this week
6:21: this is This is This is your judgment one thank you i don't There was Nik thank you
6:29: i don't know if like one is like a minor one it's like but it's like um it's like like people like when people ask things
6:35: being kind of vague like where like often times I like have this thing like where somebody ask me something and I'm
6:42: like I can't see like 10 different ways to interpret what you're asking like provide more context with your
6:51: questions that makes kind of things a little bit easier because like usually have to kind of be like like well do you mean like this thing or do you mean like
6:57: this thing or do you mean like this other thing and it kind of becomes a So when you ask things provide more more
7:04: more context which actually helps us like get further because like this happens on the stream as well when
7:09: somebody ask a question I'm like I don't know the context what were you asking so if you're asking question like it helps
7:15: to provide more context uh and if you're asking follow-up question provide a context too because like sometimes like
7:21: there's like a whole bunch of questions between your previous one and the next one so it's a easier you know like if
7:29: you don't provide a context we might be like I don't remember like what was your like original thing anymore so we won't
7:35: be able to answer that question so context helps and we got like big one right from
7:41: the start from check the fox author um check the fox author is asking can you rant a bit about spatial variables i
7:48: already have a lot of potential use case ideas but I'm curious what applications you see for them so spatial variables
7:55: for those of you I'm gonna put this here I guess um for those of you uh who don't
8:02: know is a well it was a planned feature that was kind of separate uh it was
8:08: actually on my list of like things I really want to add because I feel it's a very powerful system for creators it's
8:15: going to open up a lot of new options so I really really wanted to add it and it doesn't like not very big like
8:20: implementation wise like there's not super much you know that it needs a lot of the kind
8:25: of lot of building blocks are kind of already there um and when I was when I'm
8:32: working on the audio system uh I needed a system you know for the audio
8:39: listeners in the scenes to figure out like what audio effect specifically what real effect is available in certain area
8:45: and then blend it and to do that I would have to implement system that would be kind of like spatial variables and I had
8:52: two choices we could just implement one that's specifically just for reverb specifically for audio that's the only
8:58: thing uh it can be used for or I could generalize it and just use spatial variables so I was like let's just let's
9:05: get like you know bigger bang for the buck like you know let's get more out of the time I invest into the system and
9:12: let's just I'm just going to implement the spatial variables and then use it for the system who would think you I
9:17: don't know what was considering again interesting I don't know we can do that
9:23: but yes what spatial variables are is like you can essentially define I'm
9:29: going to grab my brush um where's my brush They allow you to define values in
9:36: space and it happens in 3D space i'm going to draw it in 2D to make it a
9:41: little bit easier to understand let's see where's my brush there's my brush
9:48: so imagine like you know you can literally say okay at this point in space um there's a value like you know
9:56: this can be a sphere in this case it's a circle because 2D so like at this point in space there's a value and maybe
10:02: there's a different value over here you know and maybe there's another one here that's like a
10:08: cube and say like this is just a color or floating point it can be any type of
10:13: value like you know kind of similar to dynamic variables except instead of them being bound by hierarchy they are bound
10:21: by space so like when you want to sample spatial variables you literally just say
10:26: I want to sample what the v value is at this point so say like this sphere has
10:32: value you know 10 this one has like five and this one's three and by default it's
10:37: zero so like if you sample here like you know you sample here you're going to get zero if you sample here you're going to
10:44: get 10 and actually let me make this overlap as well so I can show you a thing if you sample here you get five if
10:50: you sample here you get three during the whole volume and the interesting thing
10:55: for value for like numeric values you can also determine how they're combined
11:00: so for example if you sample here where there's an overlap um you know depending on how you sample
11:07: it like you can configure it you can have it the values together so if you sample here you'll get 15 and I just
11:14: realized I'm drawing everything backwards for you 15 um
11:23: 10 three can do this there we go yeah there
11:29: we go so now it should be the right way um so you can you know example here you
11:36: get 15 or maybe you say uh I want this one to have higher priority than this one and the higher priority one wins
11:43: which means like if this one is high priority you get 10 until you move here and then you get five or you like you
11:49: know so like or you can for example say you want to blend between them so you get like you know something like 12 you
11:55: know or something depending on where you are specifically so it kind of interpolates based on you know how far
12:00: you are from Um so that's like you know the kind of the gist of this like you literally can
12:05: define values in space have them be kind of combined you know in different ways and the value can be anything like here
12:12: this is a number but it can be a color it can be a spherical harmonic you know if you want to like do like ambience it
12:18: can be a reference to a texture or like you know object or something you can you have like lots of pretty much like any
12:25: value that works in Resonite you can make a spatial variable out of it and then sample it there also named so you
12:31: can like you know these need to be like all have the same name so we can have like bunch of different ones that each
12:37: has a different name and they're kind of separate from each other um another cool thing with how the system works is you
12:44: know in this example I made the values be constant but you can actually also have
12:49: the value vary throughout the shape so instead of um let me actually just remove some of
12:56: these to make it a little bit clearer actually let me just clear this out i'm just going to redraw
13:02: everything so say
13:07: uh there we go say you have the sphere so even when the value is
13:13: constant one of the things you can you can say like you know the value is going to be five throughout the whole sphere
13:18: you can actually define a region where it blends so it can be you know there can be like a region and if you sample
13:24: here so say the value is you know five the value is five throughout the
13:31: whole thing but there's like a blend region so like if you're here you're going to get zero and then here you're
13:37: going to get you know five and I kind of want to add a brush
13:42: that lets me reverse things um you here you're going to get a five and you're like in the middle you get like you know
13:49: 2.5 so it's like a smooth lerp between the two and it's
13:55: actually how the how the reflection uh no no not reflection reverb zones are
14:01: going to work where you know there's going to be a sphere which defines a reference to the reverb effect and then
14:10: the blending area is going to be fed into that reverb to determine how strong it should be and you know and that's
14:16: like how the reverb uh zones are just a specific use case of spatial variables
14:22: but you can use it for anything any data type you know anything you want so blending is like one of the things you
14:29: can do like you know for example with constant value uh and if you have like you know if you have like multiple of
14:34: the special variables overlapping the blending can be combined multiple ways if you're adding the values together you
14:39: know you're literally going to be like if it's 2.5 it's going to be 2.5 plus whatever else is at this point you can
14:46: also say uh the higher priority wins so like if you have like you know two regions which have like both like blend
14:53: region then if you're like in the overlap then whichever one has like higher blend it's going to have higher
15:00: priority and going to get that value you can like you know uh do average like
15:05: weighted average of the values so like it's going to average them out like you know so you're going to get one that's
15:11: like blend of all of them with the weight of each value being determined you know by the
15:16: blend but you can also separately from the
15:21: blending you can also define you know that the value actually changes so you can for example say right at the edge of
15:27: the sphere uh the value is going to be 10 so you're going to get 10 and let's
15:33: say here it's going to be 50 so as you sample through the
15:39: volume you know here you're going to get 10 you're going to get like you know halfway between 10 and 50 which I don't
15:44: know it's 30 something I guess um and you're get like you know get 50 and there's also a bunch of different shapes
15:50: that define you know different ways to vary the value throughout the you know
15:56: throughout the thing um so for example you know like there's
16:03: also like a box spatial variable and this box spatial variable also has like a
16:08: blend region but also like you know there's multiple ways like there's like I've
16:14: already implemented one which uses a 3D texture so you can use 3D texture and the value you get is literally the value
16:20: from the 3D texture wherever you know wherever that box is so like we have a
16:27: texture you know that's going to give you the value the other one for the box is you know there can be a gradient so
16:33: like for example and the gradient can be aligned certain ways for example it's this way maybe say you know here it's 10
16:38: and here it's going to be 50 so there's lots of ways we can like you know make different shapes that
16:45: change how the value they they actually change what value you get throughout the specific shape and it's
16:53: on top of like you know having multiple shapes that can overlap with each other you know and give you like
16:58: stuff. So that's kind of the gist of, you know, what spatial variables can do.
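Mechanically, the ideas described so far can be sketched in a few lines. This is a hypothetical Python model, not Resonite's actual components or API: each sphere carries a value, an optional blend (falloff) margin, and a priority, and sampling a point combines whatever shapes overlap it using one of the combine modes mentioned above (add, priority wins, weighted average).

```python
import math

class SphereVariable:
    """A value anchored to a sphere, with an optional blend (falloff) margin."""
    def __init__(self, center, radius, value, blend=0.0, priority=0):
        self.center, self.radius = center, radius
        self.value, self.blend, self.priority = value, blend, priority

    def weight(self, point):
        """1 inside the core, fading linearly to 0 across the blend region."""
        d = math.dist(point, self.center)
        if d <= self.radius:
            return 1.0
        if self.blend > 0 and d <= self.radius + self.blend:
            return 1.0 - (d - self.radius) / self.blend
        return 0.0

def sample(variables, point, mode="add", default=0.0):
    hits = [(v.weight(point), v) for v in variables]
    hits = [(w, v) for w, v in hits if w > 0]
    if not hits:
        return default                     # nothing here: fall back to default
    if mode == "add":                      # sum the weighted values together
        return sum(w * v.value for w, v in hits)
    if mode == "priority":                 # the highest-priority shape wins
        w, v = max(hits, key=lambda h: h[1].priority)
        return w * v.value
    if mode == "average":                  # weighted average (blend) of all hits
        total = sum(w for w, _ in hits)
        return sum(w * v.value for w, v in hits) / total
    raise ValueError(mode)

# The 2D example from the talk: overlapping spheres valued 10 and 5.
a = SphereVariable((0.0, 0.0), 2.0, 10.0)
b = SphereVariable((3.0, 0.0), 2.0, 5.0)
print(sample([a, b], (-5.0, 0.0)))        # outside everything, default
print(sample([a, b], (0.0, 0.0)))         # only inside the first sphere
print(sample([a, b], (1.5, 0.0), "add"))  # in the overlap, values add up
```

The shape types, field names, and mode names here are made up for illustration; the point is just that a sample is a pure function of position, names, and combine mode.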
17:04: the other part is you know how can you use them you know that's already like the example I've given with uh you know
17:11: uh with reverb zones and one of the kind of like excuses I've used to like add it
17:16: um but there's lots of different use cases uh for example like say you want
17:21: to do environmental effects you know you want you want the ambient lighting to change based where the user is this
17:28: makes it super easy because if you have a world maybe say like you have a region here and this has like you know this has
17:35: like you know bright lighting outdoors lighting and you have like a room here and you say the room this is darker
17:41: lighting and you give this a higher maybe a higher you know higher um
17:47: priority so like when the user is in blend between the two is this actually this one ends up winning over this one
17:53: even though they overlap and maybe there's like you know and maybe there's like a super dark room here so you put
17:58: one here and maybe there's like you know a region here where you want different lighting and you just define these
18:03: spaces and then you sample whatever you know say spherical harmonic you get
18:09: depending you know um depending on where the user's head is so say as as the user
18:15: is walking around you just keep sampling and you literally just drive the ambient lighting of the world where the user is
18:21: and you don't need to worry about any spatial queries, this system uses the BEPU acceleration structure in the
18:27: background which makes the queries super fast so as you move around you just keep getting different values depending on where
18:33: you are. So you can use it for environmental effects, and it doesn't have to be just ambient lighting, it can be for example sounds.
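The ambient-lighting walk-through above, where a darker room wins over the bright outdoor lighting because it has higher priority, could look roughly like this. This is a hypothetical sketch with made-up regions and colors, not Resonite components:

```python
import math

# Hypothetical lighting regions: (center, radius, ambient_rgb, priority).
# The highest-priority region containing the sample point wins, so a dark
# room nested inside a bright outdoor area overrides it in the overlap.
regions = [
    ((0.0, 0.0),  50.0, (1.0, 1.0, 0.9), 0),    # bright outdoors
    ((10.0, 0.0),  4.0, (0.3, 0.3, 0.4), 1),    # dimmer indoor room
    ((12.0, 0.0),  1.5, (0.05, 0.05, 0.1), 2),  # very dark closet
]

def ambient_at(point):
    """Sample the ambient color at a point (e.g. the user's head position)."""
    inside = [r for r in regions if math.dist(point, r[0]) <= r[1]]
    if not inside:
        return (0.0, 0.0, 0.0)                  # default outside every region
    return max(inside, key=lambda r: r[3])[2]   # highest priority wins

print(ambient_at((30.0, 0.0)))   # outdoors
print(ambient_at((10.0, 0.0)))   # room overlaps outdoors, higher priority wins
print(ambient_at((12.0, 0.0)))   # closet wins over both
```

Per the transcript, you would just keep calling the equivalent of `ambient_at` with the user's head position each frame and drive the world's ambient lighting from the result.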
18:40: like you want you want to like modify how loud some sound is like say
18:46: um say like you're making a club you know we have a room and maybe
18:51: there's like speakers here you know there's like speakers and you want it to be quieter here you know
18:59: there's a room so like if the user is here there's a spatial variable that says you know the volume of these should
19:04: be really really high but once you get into this room you want you want it to be way lower like even lower than you
19:12: know the distance attenuation would make it or maybe you want to apply maybe you want to blend it into like muffled
19:18: version you have like you know a version of the track that's kind of like you know like pass through low pass filter
19:25: and once we also get like low pass filter so you can do it in real time you can feed that into the low pass filter
19:31: you know we can have a value that feeds it into the low pass filter and automatically blends the effect as you
19:36: kind of move around so that's another thing you could do uh you could have like you know it can be literally anything in the world like you know say
19:43: you want to change the lights in the world or another example you want to do a culling system you can literally define
19:49: bunch of shapes um in the world which essentially say
19:54: the user is the user in this room is the user in this room and then you just derive the boolean of like you know
20:01: different places based on what value you sample where the user is and and you know and then like you
20:08: enable and disable different parts of the world so a culling system is another good example another cool thing is um
20:15: say you want to make interactive systems that are very interoperable so like you make something like a torch you know
20:21: make a torch um and then like you have like you know somebody makes like a fireplace so
20:27: there's like you know a fire and what they do is they can define
20:32: this as a spatial variable named fire or hot or whatever you know they want to do
20:40: so there's fire and actually duplicate this one and this torch it samples
20:46: spatial variable you know at its tip and if it detects that fire is above a certain point it will set itself
20:52: on fire and then all the user has to do is just put it in the fire and now it works and if you have worlds where
20:59: people set up you know spatial variables with this now those are all interoperable you don't need to do
21:05: anything extra you just say is what what is the value of fire at this point in
21:11: space and if it's above a certain point, like, I'm going to set myself on fire and even cooler you can you know have
21:16: the torch have a special variable that's fire and once it's on fire that's also going to output you know that it's on
21:22: fire you put it next to another torch and that's going to set it on fire so like you can systems you know interact
21:29: with each other without having to like you know manually link them together uh so it allows for a lot
21:35: of really cool kind of spatial interactions like this another cool thing is you know you could
21:42: um for example you know have you have worlds where you want to determine you
21:47: know like you have like a when to switch to a swimming system so like you have like you know you have like say like a
21:53: pool here like you know this is a pool and there's like you know some water around say like this is a beach and this is the water so you just say this is
22:00: spatial variable you know water for example so you're just going to sample
22:07: what is the value of water here and maybe there's another water here so there's another water there's
22:13: another water and if the value is above a certain point you switch on the swimming system once it goes below a certain value you
22:20: switch the system off and it becomes as simple as just driving a value from a specifically named variable.
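That water example, sampling a named variable and switching a swimming system on above one threshold and off below another, might be sketched like this. The shapes, blend margin, and threshold values are all made up for illustration:

```python
import math

# Hypothetical "water" spatial variables: spheres that each contribute 1.0
# inside their core, fading to 0.0 across a blend margin.
waters = [((0.0, 0.0), 3.0), ((8.0, 0.0), 2.0)]  # (center, radius)
BLEND = 1.0

def water_amount(point):
    """Additively sample all 'water' variables at a point."""
    total = 0.0
    for center, radius in waters:
        d = math.dist(point, center)
        if d <= radius:
            total += 1.0
        elif d <= radius + BLEND:
            total += 1.0 - (d - radius) / BLEND
    return total

def update_swimming(point, swimming, on_at=0.8, off_at=0.2):
    """Switch the swimming system on above one threshold, off below another."""
    amount = water_amount(point)
    if amount >= on_at:
        return True
    if amount <= off_at:
        return False
    return swimming  # in between the thresholds: keep the current state

state = False
state = update_swimming((0.0, 0.0), state)   # deep in the pool: on
print(state)
state = update_swimming((20.0, 0.0), state)  # far away on the beach: off
print(state)
```

Using two thresholds instead of one avoids rapid on/off flickering inside the blend region; whether Resonite's system works that way is not stated in the talk, it is just one reasonable reading of "above a certain point on, below a certain value off".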
22:28: uh so there's a lot of like you know and this is just kind of scratching the surface like I think like once this is
22:34: like added and everybody can like work with it that's going to be huge like it's going to make so many
22:39: systems much easier to make and make them much more interoperable with each other i've already some like people were
22:46: talking about like you know making standards for like you know common things like you know water. Wetness is
22:52: another thing, you know, you can make a thing on your avatar which just samples the amount of water and once it
22:57: cross a certain point maybe you trigger effect which makes your avatar look wet like you know like and that helps like
23:03: enhance realism you go into the water your avatar is wet you step out and maybe like you know your wetness drops
23:09: you know I've already seen some people kind of use that kind of system but there's not a way there's not really a way to make it like very interoperable
23:16: with things and this system makes that much much much easier and I think like a lot once you'll able to play with it
23:23: which you'll be able to very soon um people are going to discover even
23:28: more like really cool like use cases for it and probably a lot of corresponds to questions are going to get like gravity
23:35: oh yeah that's another thing you could you could literally just say you know define a spatial variable that just is a
23:42: vector and like you know is like this This is the gravity that should be here you know so like if you're if you're if
23:48: you're here maybe the gravity is you know down but like once the player's here the gravity is maybe this way and
23:55: it's just like it makes the setup so much easier because like all you do is just ask what is the float3 you know
24:03: at this point of name gravity and once you go to here you get a different value you just drive whatever thing on
24:09: locomotion module you need and it just works, it just works, you don't have to do
24:14: math, you don't have to do a bunch of math to do it anymore, it just
24:20: makes all the things way simpler for interoperable systems and makes it much faster.
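The gravity example follows the same pattern: sample a named vector variable at the player's position, fall back to normal gravity outside every zone, and drive the locomotion module with the result. A hypothetical sketch; the zone shapes and vectors are made up:

```python
import math

# Hypothetical "gravity" spatial variable: each zone holds a float3-style
# direction vector; the highest-priority zone containing the player wins,
# with normal downward gravity as the default outside every zone.
DEFAULT_GRAVITY = (0.0, -9.81, 0.0)
zones = [
    ((0.0, 0.0, 20.0), 5.0, (0.0, 9.81, 0.0), 1),  # upside-down room
    ((10.0, 0.0, 0.0), 5.0, (9.81, 0.0, 0.0), 1),  # gravity pulls along +X
]

def gravity_at(point):
    """Sample the 'gravity' variable at the player's position."""
    inside = [z for z in zones if math.dist(point, z[0]) <= z[1]]
    if not inside:
        return DEFAULT_GRAVITY
    return max(inside, key=lambda z: z[3])[2]

# Each frame you would just drive the locomotion gravity with the sample:
print(gravity_at((0.0, 0.0, 0.0)))    # outside every zone: default gravity
print(gravity_at((0.0, 0.0, 20.0)))   # inside the upside-down room
```

No per-zone trigger logic or manual math is needed; moving between zones simply changes what the sample returns.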
24:25: you know this is the acceleration structure so there's a lot of like use cases for this and I think it's going to
24:32: be even more like once people like once you get you know your hands on it uh I think it's going to be like very very
24:37: powerful system like kind of like on the level like you know like when dynamic variables were introduced that kind of
24:43: changed a lot how people build things and I think this is going to be another really really good tool in the creators
24:49: toolbox that's going to make it easier to build things and make them more interactive and more interoperable with
24:55: each other oh yes I'm very excited there's like so
25:01: many I was talking um I was talking with like Jack earlier today and he he he was
25:09: like these are like so powerful that like I don't even there are things that like I don't even know I I could do with
25:15: them yet i'm sure like there's so many it's just such a a broad use case that I
25:21: I don't even know what all they could be used for and I'm eager to find out i really like these kinds of systems
25:27: because it's like like it's it's been relatively simple to implement it took me like a day or two i think I haven't
25:36: like super explicitly tracked it but like it didn't take like super long time and it's like such a simple system but like so
25:44: powerful i feel like the one one two days like it was kind of made easier because I had to like generalize some
25:50: like I made essentially a spatial collection built around the BEPU physics because it's also used for some other audio stuff which makes it easy to
25:57: just say there's like this object and it's like in this space and I can sample
26:02: like you know I can do spatial queries very easily um so there's
26:09: um that that made like the implementation also kind of quicker because I had to do that but also that
26:15: wasn't super big thing because there was already part of a generalization that I did for the dynamic bones while back so
26:21: um it's just kind of building on top of that um I was going to say another thing
26:26: I was but it is it is going to be like I think very
26:32: like gamechanging system for like building a lot of things on there tonight and that's kind of like one of the reasons I was like I should just
26:39: implement that one because that way like you know like I can use it to make you know
26:46: the reverb zone so it's something I needed and it's and usually when we build things for we kind of look you
26:52: know how do we implement things in a way that like you know it's not just that one particular system it doesn't
26:58: have just that one use case but it has like million other use cases and it was like a perfect opportunity to do so
27:05: um and that way you know we get like much more value from the time because like our time is very limited on what it
27:11: can do and there's something of that value that we can do like you know to like supercharge you
27:17: know like the powers we have like to build things on Resonite like you know we want to do it oh another cool thing
27:24: is like um there's also another sampler that I added um is like a boolean sampler so you can for example say I
27:31: want like this boolean to be true if there's any true value like you know in the space or if all of them are true or
27:38: if like you know none are true or even do like XOR like exclusive or like each boolean potentially flips it
27:46: it's like in the space it can also be used for other cool things like if you have like you know overlapping region so
27:52: say like you have like here and here which can be useful for example for culling systems is like you
27:57: know, you're going to get true in all of this.
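The boolean sampler modes described here (any, all, none, XOR) might behave like this; a hypothetical sketch, not the actual component:

```python
import math
from functools import reduce

# Hypothetical boolean spatial variables: spheres that each carry True/False.
bool_vars = [
    ((0.0, 0.0), 2.0, True),
    ((1.0, 0.0), 2.0, False),
    ((10.0, 0.0), 1.0, True),
]

def sample_bool(point, mode="any"):
    values = [v for center, r, v in bool_vars if math.dist(point, center) <= r]
    if mode == "any":   # True if any overlapping variable is True
        return any(values)
    if mode == "all":   # True only if there are values and all are True
        return bool(values) and all(values)
    if mode == "none":  # True if no True value is present here
        return not any(values)
    if mode == "xor":   # exclusive or: each True value flips the result
        return reduce(lambda acc, v: acc != v, values, False)
    raise ValueError(mode)

p = (0.5, 0.0)                    # inside both the True and the False sphere
print(sample_bool(p, "any"))      # some True value overlaps here
print(sample_bool(p, "all"))      # not all overlapping values are True
print(sample_bool(p, "xor"))      # exactly one True value present
```

For a culling system, "any" over regions named after a room is enough: sample at the user's head and enable or disable that part of the world from the result.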
28:03: spatial hashing spatial hashing oh my god but yeah it's it's it's it's I think
28:08: it's going to be like pretty big and I can't wait like what people will make like once once you get your hands on it which should like probably tomorrow
28:16: maybe tomorrow's Monday yeah probably tomorrow like on the pre-release build because I'll have
28:21: a pre-release build for more testing for audio which is another thing but I'm do the ramble for it
28:27: but it's uh spatial variables a pre-release just for us
28:35: well it's going to have like other audio soon audio base because like I did rework a bunch of things so that needs
28:41: to be tested but it also has spatial variables so you'll be able to play with
28:46: both um so the next question is from BitK IGN uh how's the audio rework going
28:53: well this is actually a good segue um how's the audio rework going what are you working on right now i'm pretty much
29:00: what I said like like uh I'm getting the reverb uh reverb zones like working i
29:05: have like most of it working i have like the new there's like a new system where you actually have a component which
29:11: determines where you hear from it's like a listener i've reworked all the things
29:17: uh and like the listener it has a list of effects that can be applied now the only thing that's missing is kind of
29:23: putting it together so like um it essentially has like a special variable
29:28: driver that drives the list of the effects based on the variables it samples um it's like super easy to set
29:34: up and the other thing is I need to convert the old reverb zones to the new system which already like s did like
29:41: bunch of work to like map the old settings to the new one so this already saves tons of time so like I don't think
29:47: it's going to take super much time um so I think I'll have like the reverb zones like fully working sometime tomorrow
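As described earlier in the stream, a reverb zone is just a spatial variable whose blend region is fed to the effect as its strength. A minimal sketch of that idea, hypothetical rather than the actual implementation:

```python
import math

# Hypothetical reverb zone: a sphere holding a reference to a reverb preset,
# whose blend region is fed into the effect as its strength (wet level).
class ReverbZone:
    def __init__(self, center, radius, blend, preset):
        self.center, self.radius = center, radius
        self.blend, self.preset = blend, preset

    def strength_at(self, listener):
        """Full strength inside the core, fading to 0 across the blend region."""
        d = math.dist(listener, self.center)
        if d <= self.radius:
            return 1.0
        if d <= self.radius + self.blend:
            return 1.0 - (d - self.radius) / self.blend
        return 0.0

cave = ReverbZone((0.0, 0.0, 0.0), 10.0, 5.0, "cave_reverb")
print(cave.strength_at((0.0, 0.0, 0.0)))    # deep inside the zone
print(cave.strength_at((12.5, 0.0, 0.0)))   # halfway through the blend region
print(cave.strength_at((20.0, 0.0, 0.0)))   # outside the zone entirely
```

The listener component described in the transcript would then sample zones like this each frame and drive its effect list and wet levels from the results; the `preset` field here just stands in for the reverb-effect reference mentioned in the talk.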
29:54: then the other micro thing that's missing is going to be Doppler effect uh I think it's not going to take super
30:00: long either but I'm still kind of like fuzzing some of the details of that so I'll see how that goes and then it's
30:05: like you know fixing up some other like minor you know issues some like problems that have been kind of reported so like
30:11: once those are fixed up like that should be it like we do like more testing see if there's any more problems but I think
30:17: it's possible we'll get like you know audio sometime next week already probably the week after if not you know
30:24: the next week so that I think it's getting close and once that is done it's going to be the splittening it's
30:31: going to be working on actually you know pulling FrooxEngine out of Unity and getting the big performance boost which is going
30:37: to take some time because there's like you know things need to be rework there but that's pretty much going to be the
30:44: final phase at the end of which we'll get a huge performance boost from .NET 9
30:50: like it's crazy to know that like we're basically we're pretty much just
30:56: like potentially a like a few weeks out from the splittening starting it's crazy yes yes it's very very close like it's
31:04: it's been a very very long project like like this is I I think this has been the biggest
31:10: undertaking for Resonite like you know or FrooxEngine in general like that I've
31:16: taken like ever but it's going to be really worth it because once we do the
31:21: switch like I feel that's going to it's going to be very freeing like one we get like a little more
31:27: performance and now we can also use much more modern language we can use better tooling we can use you know um modern
31:33: libraries for stuff that we're not able to and like we're just not going to be like you know held back as much anymore
31:39: so I think that's like it's going to be pretty big game changer
31:47: uh next questions from Oz oz is asking "Will there be a way to
31:53: visualize spatial variables also what shapes will be supported for them i'm really excited for them." So right now I
31:59: don't have a way to visualize we could maybe add like a quick one the same way like you know the colliders have them
32:06: um right now the shapes I implemented are spheres and a box but I would also
32:12: do like a bunch of others you know doing like a cylinder capsule um whatever the basic shapes are there
32:21: i can't think of basic shapes um uh a cylinder a capsule cone we could
32:29: do cone well cone is also like a special one we could do a conical frustum that's like a
32:34: generalization where just cone and cylinder are just special cases of it
32:41: yeah like we can it's also like not a thing you know like once we add them that's not the end of it like we can keep adding more shapes and more like
32:48: different ways to like you know vary the values in them so you know for example
32:53: like one thing for the um for the box where should I put my brush to delete it
33:00: one more thing for example for the box is you know there's the gradient um I'm drawing a very bad
33:08: box so you have like you know you have your box let me make it there we go so you
33:16: have your box and you can define a gradient but the gradient is you know just in one direction or you can define
33:21: a 3D texture but then you need a 3D texture we also want to have variants where like you know the texture
33:28: is defined on one of the faces and it's like you know actually there's another box where you can define value on each
33:34: of the vertices so the value is like kind of interpolated between them so that that's already implemented but
33:39: another one I can see us implementing is you know you define a texture on one of the axes so you have a texture you know
33:44: for example on here and then like you know the value you get in the volume is
33:50: going to be you know whatever the projection to the texture is Um so there's lots of different ways to add
33:57: you know more and like you'll be able to like make feature requests if you want a particular shape or particular way to sample it we can add it it is relatively
34:04: easy to add them into the system adding new shapes a little bit more difficult because there's like you know a little bit of extra math that needs to be
34:10: figured out like for example for the boxes I had to like I had to add an algorithm that figures out like you know the closest distance but also like it's
34:18: not a huge thing but like one that figures closest distance you know based on a point so um could it maybe be
34:25: convex hulls too typically like you know the what whatever basic shapes like you know you
34:33: you would have like in physics system like they're probably going to be spatial
34:40: variables uh next question is a long one um
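The box sampling modes just described (gradient along one axis, a value on each of the eight vertices, and the closest-point helper) can be sketched roughly like this. All names here are hypothetical illustrations, not Resonite's actual API.

```python
# Hedged sketch of box-shaped spatial variable sampling modes -- illustrative
# names only, not Resonite's real components.

def lerp(a, b, t):
    return a + (b - a) * t

def clamp01(t):
    return max(0.0, min(1.0, t))

def sample_axis_gradient(p, box_min, box_max, v0, v1, axis=0):
    """Value varies linearly along one axis of the box (the 'gradient' mode)."""
    t = clamp01((p[axis] - box_min[axis]) / (box_max[axis] - box_min[axis]))
    return lerp(v0, v1, t)

def sample_vertex_interpolated(p, box_min, box_max, corner_values):
    """Trilinear interpolation of 8 per-vertex values; corner_values is
    indexed [x][y][z] with 0 = min corner, 1 = max corner."""
    t = [clamp01((p[i] - box_min[i]) / (box_max[i] - box_min[i])) for i in range(3)]
    c = corner_values
    # interpolate along x, then y, then z
    c00 = lerp(c[0][0][0], c[1][0][0], t[0]); c01 = lerp(c[0][0][1], c[1][0][1], t[0])
    c10 = lerp(c[0][1][0], c[1][1][0], t[0]); c11 = lerp(c[0][1][1], c[1][1][1], t[0])
    c0 = lerp(c00, c10, t[1]); c1 = lerp(c01, c11, t[1])
    return lerp(c0, c1, t[2])

def closest_point_on_box(p, box_min, box_max):
    """The 'closest distance based on a point' helper mentioned above: for an
    axis-aligned box it reduces to a per-axis clamp of the query point."""
    return tuple(max(box_min[i], min(box_max[i], p[i])) for i in range(3))
```

The texture-projection variant would be the same idea with the interpolated value replaced by a texture lookup at the projected coordinates.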
34:47: uh next question is from actually I can't read that way
34:54: um there sorry I'm going to cover you uh next question is from Bitrack IGN uh
35:00: this is a what-if question i'm well aware the idea can change in the future this is just to satisfy my
35:05: burning curiosity for now if the workshop comes around what do you think you'd do in the event an asset material
35:12: sound someone used in their items has been deleted most likely because of copyright violation will you notify the
35:19: user give them time to replace it before deleting it replace the asset with a placeholder or missing one so this is
35:24: not even like a workshop thing this is more of a um this is more of like you know like uh
well I don't want to say like DMCA specifically but like DMCA is one of the mechanisms but essentially it's like a
35:38: copyright issue um and it kind of depends how the user goes about it if they file a DMCA that's like a bit of a
35:44: process and you know in that case we uh I don't know off the top of my head
35:50: but I think we probably need to like delete it right away or like make it inaccessible and then have like a system
for an appeal so we can kind of restore it uh so with the DMCA if it's an actual DMCA
36:02: claim then like you know the law we essentially have to follow the law like you know what it says um and I don't
36:09: know from the top of my head what it is right now exactly but I'm pretty like from the discussions we had like I'm
36:14: pretty sure like we essentially need to like block it right away and provide like you know appeal option so in that
36:20: case you know if somebody does do that filing uh our hands are kind of tied in how we approach it you know that's kind
36:26: of prescribed by the law itself if it's more of a support ticket claim you know
36:31: so you're not invoking specific processes that gives us more flexibility how to deal with a situation in which
36:36: case we can actually contact the user be like hey there's this claim you know do you have like anything you know can can
36:42: you replace it or do you have like anything to prove you own the license to it so it gives a little bit more kind of flexibility on how to deal with that um
36:51: we probably also like formalize it a bit so we have like you know our actual system where we can make um reports uh
36:57: we also plan to have a licensing system which is going to provide more explicit control on who owns what assets and how
37:04: can they be used so ideally like you know those assets would already be registered and that would determine how
37:09: they can be used but like you know having like still support tickets for disputes and stuff like
37:16: that but yeah it'll depend on exact process
37:22: um next questions from BLAR
37:27: uh are you keeping up with the new developments in the VR software space outside of Resonite of course if there's any interesting
37:33: feature stuff you've seen that you'd be interested in bringing to Resonite in the future um I mean so it kind of depends
what you mean like if you mean like you know stuff like standards like for example OpenXR you know engines and so
37:44: on or you mean like other like ideas from other platforms and so on um yeah
37:49: we kind of like keep an eye like you know on what's happening and so on i guess like the biggest one would be like you know having like OpenXR
37:56: support but it's also like you know we need to switch the rendering engine for that
38:01: um you know like in order to kind of— thank you to
38:06: terror man for the subscription thank you but yeah like for that one like I
38:12: would say that's the biggest one that like we kind of want like you know just kind of be able to switch to more modern kind of standards um but it's also like
38:20: you know a big thing is you know we need to switch to an engine so it's not a thing like we're just like yeah we're just going to flip a switch and
38:27: that's done uh Oz is asking to clarify any feature
38:35: in a software or game that made you think that's cool Resonite should have that too yeah there's like a whole bunch of stuff like
38:40: I mean like a lot of it's kind of inspired by um a lot of other software and games I played but
38:47: Minecraft is actually a big one it's actually one of the reasons I was really looking forward to like implementing a
38:52: terrain system because like um I want to you know be able to make big worlds and
38:58: make them editable and even like make like ones they can be like you know voxel based like where you see the boxes or it
39:03: can be like you know smoothed out terrain but kind of using similar things and I like when like while back when I
39:09: implemented the Minecraft importer I actually dug into the Anvil format that Minecraft uses and I got a
39:16: bunch of inspiration from that how they handle the terrain data how is the format like you know even evolved over
39:21: time um because there were like little interesting things you know how they handled like you know the blocks where
39:27: it just used to be an ID like a byte then that wasn't enough so they added another like you know byte and the
39:32: problem was people made mods so they would use a number for a specific block and then we have a collision like you
39:37: know when it became used officially so in Anvil they actually have a palette
39:42: where the palette like has strings where for example it says minecraft:stone minecraft:dirt you know something and the mods they can add like
39:49: you know their own blocks into the palette and that way they don't collide with each other um so it's like you know bits
39:55: like that where I'm like okay like this is interesting how they approached it and it gives me some ideas how approach ours and usually
40:02: uh we'll you know try to like generalize it a lot more so it's like you know it
40:08: can be used for a lot more use cases because like for a lot of the games a lot of features they made specifically for that game and it's kind of hard to
40:14: use it for other stuff but with Resonite we try to make systems as generic as possible so it can be used
40:21: for lots lots of different things because like we're not trying to be like you know in a specific game we try to be
40:26: more like an engine where you can build a lot of you know cool
40:31: things but it is there's there's a lot of them like like there's like small
40:37: things there's like big things um it's it's hard to like think of them
40:42: because just like you know there's there's a lot depends on like what you consider you know a big development because like some things can be like
40:48: really small and neat like another one actually also comes from Minecraft is like when in the Vivecraft mod which I
40:54: like to play like it automatically switches you between VR and desktop as you take the headset off and on i was
40:59: like that's cool i'm stealing that and it's a small thing but like you know I was like I like that so I'm going to add
41:05: it here as well um so yeah there's a
41:13: lot sometimes it's also a question of time because it's like you know like I'm seeing like oh that's cool thing like I would want us to have that but like we
41:19: don't have time right now for that nicon is asking uh you're not raidable
41:25: i'm not really sure like I don't know that seems like an issue it seems like like a bug or something like
41:33: I've seen you raided like twice so I think it worked like it showed on the thing
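The Anvil palette idea described a moment ago can be sketched like this: a simplified toy model, not the real packed NBT format, showing why namespaced palette strings avoid the numeric ID collisions mods used to cause.

```python
# Hedged sketch of Minecraft Anvil-style block palettes. The real format
# bit-packs the indices; this toy version just shows the collision-avoidance
# idea: terrain stores small indices into a per-chunk palette of name strings.

class ChunkSection:
    def __init__(self):
        self.palette = []   # e.g. ["minecraft:stone", "mymod:crystal"]
        self.blocks = []    # flat list of palette indices

    def palette_index(self, name):
        if name not in self.palette:
            self.palette.append(name)   # mods simply add their own entries
        return self.palette.index(name)

    def set_block(self, name):
        self.blocks.append(self.palette_index(name))

    def block_name(self, i):
        return self.palette[self.blocks[i]]

section = ChunkSection()
section.set_block("minecraft:stone")
section.set_block("mymod:crystal")      # no numeric collision with vanilla IDs
section.set_block("minecraft:stone")    # reuses the existing palette entry
```

Because names are namespaced strings rather than raw bytes, a mod's block can never claim the same identifier a later vanilla block gets assigned.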
41:41: um next question is from Alex boot 23 i recently made a voice command ring where
41:47: I can do certain things with like open inspector create cube and make it explode i made the explode only be used
41:53: if you are the host but that means it will not work on headlesses so I thought maybe I could check a user's role is
41:59: there a reason why there's no node for that or was it because it was never needed so actually one of the things
42:04: like we um we actually had a request for that and the problem
42:11: is like the roles they're essentially system designed for security
42:17: and checking the role you know like that like checking like the name of the role
42:23: it has a lot of pitfalls and when it comes to like security we try to avoid like you know
42:30: anything that has a lot of pitfalls because like it can lead to making systems that are fundamentally not
42:36: secure and that cannot be secured and it could even break because for example you
42:42: know even just a simple example some people can in their worlds they rename their roles so if they do that you know
42:50: now your system is kind of broken uh or uh you know it's not working right
42:57: or maybe somebody makes a system which like you know hijacks whatever check you're using for the security and
43:03: because like you know there's no real actual security check it's just sort of you know a soft one it's very easy to
43:10: bypass and now we're in a situation where like you know if we add it people
43:15: will have certain expectations you you know they have expectations that this is a secure
43:20: mechanism and we know it's not going to be a secure mechanism so we don't want to add stuff like that where there would
43:27: be expectation that we cannot fulfill so for security we actually want
43:33: to like introduce like more robust systems where we can make those guarantees we can be like you know okay
43:38: if you use this this gives you these guarantees um and it's going to
43:43: be a little bit more complicated than like you know a role check or something you know that cannot
43:48: be really tampered with whenever it comes like you know to security I feel like the standard
43:56: that like you know something needs to pass you know is much higher
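A toy illustration of the pitfall just described: a "security" check that compares role names breaks the moment a world renames its roles, and it never actually verifies any permission. This is a hypothetical sketch, not Resonite code.

```python
# Hedged toy example: why checking a role *name* is fragile. The name is just
# a display string the world owner can rename, so behavior tied to it breaks
# silently -- and it was never a real permission check to begin with.

world_roles = {"U1": "Admin", "U2": "Guest"}

def can_explode_fragile(user_id):
    # Fragile: ties behavior to a display string, not an actual capability.
    return world_roles.get(user_id) == "Admin"

assert can_explode_fragile("U1")        # works today...

# ...until the host renames "Admin" to "Moderator":
world_roles["U1"] = "Moderator"
assert not can_explode_fragile("U1")    # same user, same power, check broken
```

A robust design would instead query an explicit capability ("may this user trigger destructive actions?") that the permission system itself guarantees, which is the kind of system the host says they want to introduce.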
44:01: it's not just the uh the raid thing again the thing like we've seen the raid
44:06: happen like on the thing so I think it worked Uh Nikki's asking can you
44:15: uh can you give an explanation of using spherical harmonics for ambient light i have basics down but I still feel a
44:20: little confused so this is a this is a good example of like you know where it would help to have more context
44:27: like what are you confused by because right now I don't know what part is confusing you which makes it kind of
44:34: hard to like you know answer this question but the gist of it is like spherical harmonics they're like a way
44:41: to encode low-frequency directional information so like you know if I simplify it into
44:48: 2D you know like like it literally like if you if you have like you know this
44:54: can be sphere in 3D but like in 2D it's going to be a circle and if you sample it based on the direction you can get
45:00: different values from it so you know you get like one point here maybe like you get one this is bright over here you
45:06: sample it here this is going to be like you know half it's going to be 0.5 oops and I drew it a poor
45:15: way so over here you're going to get like you know I can draw backwards
45:21: um over here it's going to be 0.5 you know maybe over here it's going to be
45:27: bright again and you get you're going to get like you know 0.7 so it's just like depending on the
45:33: direction you sample it in you get different values and spherical harmonics
45:39: is just a mechanism to be able to encode that information very efficiently
45:45: uh on a sphere so when you look at a surface of an object
45:52: um for example if I make say a sphere actual sphere
45:58: um where is it I'm just going to make a
46:07: sphere sphere there we go so if you consider this sphere actually select
46:13: that for each point on the sphere you know you have something called surface normals and the normal is essentially a
46:20: vector that's perpendicular to the surface like one you know one that's here and like over here it's going to be like this and over it's going to be like
46:27: this and this these normals are essentially like a direction and this direction is used to sample that
46:33: spherical harmonics to get to get like you know what ambient lighting should be at this point on the mesh and
46:41: that's pretty much all it is is you know you have you have the spherical harmonics encoding what ambient light is in that
46:48: particular direction and then when the object is being rendered the normal is used to
46:54: sample the spherical harmonics to determine what ambient light you get in the direction
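The "sample the sphere by the normal direction" step just described can be sketched with the first two spherical harmonics bands (four coefficients per color channel), using the standard real SH basis constants. This is a generic sketch of the technique, not Resonite's shader code.

```python
# Hedged sketch: evaluating band-0/band-1 spherical harmonics in the direction
# of a surface normal to get an ambient light value. Constants are the
# standard real SH basis values for bands 0 and 1.

Y00 = 0.282095   # band 0 (constant term)
Y1  = 0.488603   # band 1 (linear in y, z, x)

def ambient_from_sh(coeffs, normal):
    """coeffs = (c00, c1m1, c10, c11); normal = unit vector (x, y, z)."""
    x, y, z = normal
    c00, c1m1, c10, c11 = coeffs
    return (c00  * Y00 +
            c1m1 * Y1 * y +
            c10  * Y1 * z +
            c11  * Y1 * x)

# Encode "brighter from above" with a positive z (c10) coefficient:
coeffs = (1.0, 0.0, 0.5, 0.0)
up   = ambient_from_sh(coeffs, (0.0, 0.0, 1.0))    # normal facing up
down = ambient_from_sh(coeffs, (0.0, 0.0, -1.0))   # normal facing down
```

In a renderer the same evaluation runs per pixel (or per vertex) with the interpolated surface normal, which is exactly the circle-sampling picture drawn above extended to 3D.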
47:00: so this kind of helps um it would help if you provide a little
47:06: bit more like you know what part is confusing you because I don't know whether this answers it or not but uh I hope it does anyways
47:14: uh there's also like another video we actually did on spherical harmonics on our YouTube channel so that also might be worth checking
47:20: out uh let's see next question
47:27: uh is from Powder Pop how resilient is Resonite against ripping i've heard that
47:33: there modex would you say this is a form of security uh pretty much every single game or
47:41: software platform like security by obscurity is the best you can get there's like no way
47:50: if you want the model to be displayed and rendered it needs to be on the
47:55: machine it needs to be you know present in some format but at some point it
48:01: needs to be decoded and put onto the GPU so it can be displayed which means the
48:06: code to actually decode it is also present on your system so the only thing
48:12: preventing someone from ripping it it's figuring out like where that system is and there's only a question of time
48:19: so there's no platform there's no software there's no game that can do
48:26: better than security by obscurity on this kind of thing like look at it this
48:31: way there's like you know games that took hundreds of millions of dollars to develop and they use you know they used
48:39: anti-che soft software they use like you know DRM that also cost like you know
48:46: millions dozens of millions hundreds of millions of dollars to implement and people still rip models out of these
48:52: games like there's not really way you can like you know prevent it from happening
48:59: so yes it is security by obscurity but you literally cannot get better than
49:04: that like the only way to get better than that would be for the user to not have the model and if you wanted to do
49:11: that like the only like the only way you could really realistically do that is
49:17: that you would actually never run you would never run the game on the user's computer you would run it on some cloud
49:23: server that you trust and then stream the video to the user but that like you know that would have like latency that
49:29: would have like huge infrastructure cost so like it's not realistic
49:35: um so yeah like it it's it's just a matter of like you know people taking time to crack the format and put
49:42: things together uh there is like one thing that we can do that it doesn't
49:48: prevent the ripping but it prevents you know misuse and it's going to be the license system because the license
49:54: system that we are planning to implement it's going to say you know this asset is
50:00: owned by this user and it's going to you know use like some hashing so it can like you know determine this is the same asset so if you rip something and then
50:07: you try to use it on the resonite again will be able to
50:12: determine this is the same asset you're not allowed to use it um and it can be
50:18: you know some mechanisms like you know even if you modify it a little bit you know it's going to be like okay like
50:23: it's similar asset I'm going to prevent you from using it
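The license-system idea just described, exact-duplicate detection via content hashing plus an ownership registry, can be sketched like this. It is an assumption-laden toy: real near-duplicate detection ("even if you modify it a little bit") would need perceptual hashing, which this deliberately omits.

```python
# Hedged sketch of hash-based asset ownership checks -- a toy model of the
# planned licensing idea, not Resonite's actual system. Only exact byte-level
# duplicates are caught here; fuzzy/perceptual matching is out of scope.

import hashlib

registry = {}   # content hash -> owner user id

def asset_hash(data):
    return hashlib.sha256(data).hexdigest()

def register_asset(data, owner):
    registry[asset_hash(data)] = owner

def may_use(data, user):
    owner = registry.get(asset_hash(data))
    # Unknown assets pass; registered assets must match their owner.
    return owner is None or owner == user

register_asset(b"model-bytes", owner="alice")
```

Since other clients in a session can run the same check independently, a modified client cannot simply skip it, which is the enforcement angle mentioned above.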
50:29: uh if you you know try to modify your client the other clients you try to join worlds you don't have control over those
50:35: so those can check and they can be okay this user is trying to use this asset do they have permission to do it they don't
50:40: I'm going to kick them out so that makes it much harder for the user to actually do something with it so it doesn't prevent the ripping but it helps you
50:49: know it makes the life more difficult for people ripping to actually use the model so it kind of decreases you
know sort of the incentive to rip things uh because now they have to deal with more and the other aspect is because
51:02: even this system is not going to be perfect like you know somebody can rip something and modify it enough so the
51:08: system will think it's a different you know asset it's a different texture different mesh
51:15: um we'll have like a system to like report it so like if somebody reports like this person's using a stolen asset
51:21: we can like you know look at them and be like okay these are actually the same even though the system doesn't think it's the same like you know we have a
51:27: human verify these they're the same ones we're going to we're going to mark this one as the same uh make it so the user
51:34: cannot use it and maybe they also you know get some kind of punishment they get like banned or something for you know ripping like uh ripping assets so
51:43: the main thing like we can really do is like you know make pe make people's life more difficult with stuff like that but
51:51: ultimately we cannot promise you know that the models cannot be ripped but also nobody else can make that promise
51:57: and if they do I don't think they're being honest like I'm sorry we're going to say
52:04: something oh no i'm I'm I I
52:09: I forgot i totally forgot um but no I I agree
52:17: like there there there really is no no other like platform that can say
52:23: something like that i I like can't think of one that says that in good faith yeah
52:30: yeah it's a thing like and for us like it's like we just want to be honest about it you know like like what what
52:36: options we offer and what we want to do in the future and one thing we know for sure is like we cannot prevent you know
52:44: we cannot prevent the ripping from happening altogether but we can do things to fight it and to make like you
52:50: know the life of people who rip things like more difficult but
52:56: there's nothing that would give you anything better than security through obscurity like sometime like for some
53:02: things that's the best you can get and even then like you know it becomes an arms race because it's like
53:09: if people figure stuff out okay maybe we can change things up but there's only so much we know we can do because if we
53:15: change it up it just takes time for somebody to figure out a different way you know how to attack the new system
53:21: and it just becomes this never ending like you know arms
53:27: race uh Nukun's asking can you walk us through basic setup with particles maybe
53:34: show us how to add some simple effects i understand the basics of what PhotonDust does but it's a bit more than the old system um I
53:42: reckon we could do like a quick demonstration maybe i don't know how like involved we want to get uh because there's like a lot you can do with just
53:49: how many questions we have it's quite a lot so we might just there's a
53:55: lot of questions oh my god this um how much time we have so we've been
54:02: for an hour okay we're going to do a very quick showcase um actually h
54:09: uh actually wait we we did a showcase we did a showcase like in a video
54:14: on our YouTube channel there's a video on PhotonDust that has a showcase
54:19: watch that one I'm sorry we're not going to do like demonstration we we I feel it's kind of
54:26: better because like we have a lot of questions in the queue and uh this might take a bit and we already kind of did a showcase while back so um I recommend
54:33: watching that yeah sorry uh noon's also asking also when does
54:39: PhotonDust become multi-threaded it's multi-threaded from the start it was never single-threaded like it's just
54:45: its design yeah well what they're what they're asking is like when does it start like making more and more like
54:53: threads at what what is the threshold for that i think is what they're asking there's no threshold it's just
54:58: multi-threaded like I cuz I know like if you have a few particles it's only going
55:03: to use you know one because it only needs one it's still not like using like other threads
55:09: but you also have like multiple like you know even multiple like it schedules things like like it's going to schedule
55:14: jobs like to always process a bunch of particles so like you're not going to get you know you're not going to get
55:20: like we have like 10 particles in the system you're not going to get like you know one thread for each particle like
55:25: it does a bunch I think it's like something like 2,048 like per batch but even that is like multi-threaded
55:32: because like there's multiple stages that happen and those also can happen on different threads and also if you have
55:38: multiple parallel systems it's going to be parallelized like it's just it's multi-threaded like it's
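The batch-based scheduling just described can be sketched like this: work is split into fixed-size batches (the ~2,048-per-batch figure mentioned above) and handed to a thread pool, so the thread count scales with particle count rather than one thread per particle. This is a generic illustration, not PhotonDust's actual scheduler.

```python
# Hedged sketch of batch-based particle job scheduling. Each batch touches a
# disjoint index range, so the workers can run concurrently without locking.

from concurrent.futures import ThreadPoolExecutor

BATCH = 2048

def simulate_batch(particles, start, end, dt):
    for i in range(start, end):
        particles[i] += dt          # stand-in for real per-particle simulation

def simulate(particles, dt, pool):
    n = len(particles)
    futures = [pool.submit(simulate_batch, particles, s, min(s + BATCH, n), dt)
               for s in range(0, n, BATCH)]
    for f in futures:
        f.result()                  # wait for every batch to finish

particles = [0.0] * 5000            # ~3 batches at 2048 per batch
with ThreadPoolExecutor() as pool:
    simulate(particles, 0.5, pool)
```

With only a handful of particles this degenerates to a single batch on one worker, matching the "it only needs one" behavior described above.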
55:49: um next question is bit cracken is asking I know so far with spatial
55:54: variables you can blend value between two shapes but could spatial variables allow for point shapes
56:00: So the sample value would be a blend between two points that doesn't really make much sense to me one is well I
56:09: don't quite understand what you're asking because like like so one you can like it's not
56:16: just blending between two shapes you can blend values between arbitrarily how how
56:21: many shapes you have in a particular point like if you have if you have a 100 shapes overlapping and a single point
56:27: you're going to have a blend of you know values for all the like you know for all
56:33: the 100 shapes um like you can you can do it with like a
56:39: radius on like the like the sphere like how close you are to the center of the sphere if that's what you mean yeah i
56:45: don't really know what they mean like they need to clarify it because when they say between two points I don't know i I don't know what
56:53: it means because like when you say two points I imagine like you know two points but like what's the shape volume
57:01: yeah there's like no volume and like if you you cannot have like a spatial value that's a point because you
57:08: know like even if you're like infinitesimally like you know in a different point than
57:14: this is then like you know you're not going to hit that point because it's infinitesimally small like it needs to have some volume to it and then you know we
57:22: can sample it within that but I'd also probably need them to clarify the question a bit
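The blending behavior described above, where every shape overlapping the query point contributes, not just two, can be sketched like this. The equal-weight average is this sketch's assumption; the actual falloff and blend rules are Resonite's own.

```python
# Hedged sketch of sampling a spatial variable at a point: every overlapping
# shape contributes, and the result blends all of them. Equal weights here;
# real shapes would have configurable falloff/blend modes.

def sphere_contains(center, radius, p):
    return sum((p[i] - center[i]) ** 2 for i in range(3)) <= radius ** 2

def sample(shapes, p, default=0.0):
    """shapes: list of (center, radius, value). Averages every overlap."""
    hits = [v for (c, r, v) in shapes if sphere_contains(c, r, p)]
    if not hits:
        return default
    return sum(hits) / len(hits)

shapes = [((0, 0, 0), 2.0, 1.0),
          ((1, 0, 0), 2.0, 3.0)]
```

A point inside both spheres above blends both values; a point inside only one gets that shape's value; a point inside neither falls back to the default, which is why a zero-volume "point shape" could never be hit.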
57:32: uh N was asking like when does it split from the main thread um pretty much like most of the work it does is like
57:39: separate from the main thread uh the main thing it does on the main thread is just kind of collect the values like it
57:45: has like a sort of like synchronization point so what it does it like you know it collects all the information from
57:51: data model that's all the setup you know the styles how many particles should emit and so on and actually it it sort
57:57: of creates a snapshot of the information and then you know it schedules jobs on the background threads and there's like
58:04: a whole bunch of processing you know it does all the simulation all the updates everything that needs to happen you know
58:09: until it finishes the buffer then it takes the buffer the buffer needs to be converted for rendering it also happens
58:15: multi-threaded and once like all that is done it synchronizes with the Unity thread and uh you know updates the
58:23: Unity data for rendering and once it's done then like in the next update cycle
58:29: um in the next update cycle you know it can you know it essentially tells it okay I'm done and then the update cycle
58:36: can be okay this is create a new snapshot of like everything and spin of like another update so there's actually
58:44: also like how it is asynchronous because um like for the main thread if
58:49: the main thread like looks at the particle system is like okay I'm about to update this particle system oh it's still doing things i'm just going to
58:55: skip it this update and the particle system can keep doing like you know it simulation and all the stuff it needs
59:00: and until the particle system goes okay I'm done i've uploaded the result to Unity it's now displayed uh at that
59:07: point you know whenever next update comes um you know that starts the new cycle
59:13: again so most of it is like you know kind of off the main thread but like essentially it happens like every
59:19: simulation cycle like it it it does like you know the main thread spins off the
59:24: all the computations to happen once they finish and upload the result and the next update point it's
59:31: going to spin off like another update so I hope this kind of like you know explains how things sort of like work
59:37: for that next question is from Alex 2PI uh
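The update cycle just described, snapshot the data model, spin the simulation off to a background job, and skip the system on frames where it is still busy, can be sketched like this. It is a minimal single-worker model, not FrooxEngine's actual scheduler.

```python
# Hedged sketch of the asynchronous particle update cycle described above:
# the main thread never blocks on the simulation; it either uploads a finished
# result and starts a new cycle, or skips the system for this frame.

from concurrent.futures import ThreadPoolExecutor

def simulate(snapshot):
    return snapshot["emit"] * 2          # stand-in for the real simulation

class ParticleSystem:
    def __init__(self, pool):
        self.pool = pool
        self.future = None
        self.rendered = None             # last result uploaded "to Unity"

    def main_thread_update(self, data_model):
        if self.future is not None:
            if not self.future.done():
                return                   # still busy: skip this frame
            self.rendered = self.future.result()    # upload finished result
        snapshot = dict(data_model)      # copy so the background job sees a
        self.future = self.pool.submit(simulate, snapshot)  # stable view

with ThreadPoolExecutor(max_workers=1) as pool:
    ps = ParticleSystem(pool)
    ps.main_thread_update({"emit": 10})  # frame 1: kicks off a simulation
    ps.future.result()                   # (force completion for the demo)
    ps.main_thread_update({"emit": 20})  # frame 2: uploads frame 1's result
```

The snapshot copy is the key detail: the background job works on frozen data, so the data model can keep changing on the main thread without locks.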
59:44: they're asking so spatial variables are like more advanced spatial triggers um I
59:50: wouldn't compare them to spatial triggers they're not really they're not really like
59:55: sending events it's like sampling values so you can have you know a bunch of like
1:00:02: shapes in the in the world you know that sort of define you
1:00:07: know define like what values are in space and you can like you can have
1:00:12: multiple objects and like you know one one is maybe sampling here one is sampling here one is sampling here one
1:00:17: is sampling here it's it's just a system to say there's this value in the space
1:00:24: and then you can ask the system what is the value at this point in space and it
1:00:29: gets a combination of whatever shapes are overlapping that space um you can like you know have a system
like you can have the value for example trigger you know on change or you can like drive a boolean and you can send
1:00:42: like impulses based on that but that's you know that's not part of the uh spatial variable system that's like you
1:00:48: know a thing you do on top of it with like you know whatever value you ended up like
sampling uh grandking uh so can I do arson in Resonite I feel like that happens on a
1:01:02: daily basis There's always you were talking about the fire the fire thing with the
1:01:07: variables earlier just well I mean this kind of happens like a lot like people have like all kinds of tools and like
often times you end up like with a session that just ends up like either
1:01:19: metaphorically or literally on fire so like you know that kind of happens but this makes it easier this makes it this
1:01:25: makes it easier to put the session literally on fire
1:01:31: but also check check with your host make sure make sure make sure the host is okay and the people around you are okay
with like setting everything on fire spatial variable for how hot a session is
1:01:44: yeah uh would it be possen is asking would it be possible to
1:01:50: have particles be spatial volumes possible yes um it's definitely going to
1:01:55: take more computational resources and it also depends what do you mean by that because like spatial volumes for what
1:02:02: like is it for collisions is it for like interacting with things is it like for the particles affecting each other like it can mean lots of different things i
1:02:09: think I think in there I think to make like an educated guess on what they're
1:02:14: talking about i think what they mean is being able to define a particle as being a like volume that is like a spatial
1:02:23: variable so like that particle would like you know if you're in the influence
1:02:28: of like that particle and you have a spatial variable on you that variable would get triggered by that particle
I guess so like I mean again like you know this is this is the thing I was talking about at the beginning provide context
with questions because like it's really hard to answer these um I mean possible yes
1:02:46: practical question mark I don't know we need to like details it might be like we
could implement a system like that if there's interest for it but like it
might not be the most efficient way to do it this also depends you know what is the use case we want to do because like if you're updating a
1:03:04: spatial structure that is like 100,000 particles that's not going
1:03:09: to be fast and there might be much better ways to like you know do things but there might be practical use cases I
1:03:16: would say, with stuff like that, once the system is out, make a
1:03:23: GitHub issue for it. I could potentially see a use case for the particles being able to sample the spatial variables
1:03:30: rather than be spatial variables. No, that's going to be a problem — spatial
1:03:36: variables work with the data model, um, but the background systems
1:03:43: are not designed for that kind of thing, because that runs asynchronously,
1:03:50: and it's kind of like what the effectors are going to be, because the effectors are going to be synced up with the
1:03:55: simulation itself. Spatial variables are not synced up with the particle simulation,
1:04:01: so that's probably going to be a separate system — or is likely going to be a separate system. It's actually a funny
1:04:07: thing, because the acceleration structures I implemented for the spatial
1:04:14: variables and for the audio system actually make it much easier to implement effectors for the particle
1:04:21: system too. Because one of the things is, when the particles are being simulated, the spatial
1:04:29: structure should not change mid-simulation, uh, but for
1:04:34: spatial variables that would mean we would have to lock all the spatial variables and sync them up with the particle system, which we
1:04:41: probably don't want to do — you don't want your spatial variables to be locked down by a particle simulation. So the particle
1:04:47: simulation is going to have its own system where it creates a snapshot, and it is used
1:04:54: specifically for that. Butler is asking: are spatial
1:04:59: variables always evaluated every frame, or do they evaluate lazily when conditions change, like many ProtoFlux nodes behave? Right
1:05:05: now they evaluate every frame. Uh, it depends how many active evaluators you have — they're pretty
1:05:10: fast, so it shouldn't be too bad. What is usually more costly is moving the shapes, but also that's
1:05:17: relatively cheap because of the acceleration structures. But the moving of the shapes only happens when the
1:05:23: shape changes, and that's kind of the more expensive part. I would like to eventually add another
1:05:28: optimization where the samplers are going to register themselves in a space, and if we move
1:05:34: spatial variables, it's going to say something in this region changed, and it's
1:05:40: only going to update those specific values. The only problem is we also need a system for saying this shape
1:05:46: changes continuously, because it can change in lots of different ways. One is that the shape's transform
1:05:52: changes; it can be that its value changes; but it could also be a texture or something that's
1:05:58: changing. So all of that needs to be able to say: something in this region changed, re-evaluate all the
1:06:04: spatial variables in this location. Um, but that will be a future optimization — I don't think it's
1:06:10: necessary for the system right now. Next question is from Unskilled Wolf:
1:06:18: they're asking: I think one of the things that makes Resonite so special is how open and accessible everything is. I'm a
1:06:24: bit worried that with the eventual introduction of a workshop, licensing systems, and paid assets, this openness might be diminished. Has there
1:06:30: been any consideration around how to avoid making it feel pay-to-win? Um, I mean,
1:06:36: like, I feel like there's not really — if you have paid assets,
1:06:42: it doesn't prevent free assets and free stuff from existing, so I don't
1:06:48: think like that's really going to diminish that like you're probably going to get more different stuff like you're
1:06:53: going to get like you know items that are you know paid but I don't think that's a detriment it's like you know
1:06:59: you're getting more stuff and some of it's like paid but you know also it can be reflected in its quality plus it also
1:07:06: helps like even if you have paid assets it helps like creators you know to support themselves which means they also
1:07:12: you know make more stuff for the platform so like I feel like overall it's like a a benefit to the platform
1:07:18: plus you know something like that will help fund the platform as well meaning we have more resources to actually
1:07:23: implement on the platform itself. I do think it's part of the nature and the culture to keep things
1:07:30: open, so I think a lot of people will still make free things,
1:07:36: or make free samples of their work to get people interested in their paid assets as well. Uh, but I don't — I
1:07:46: don't see how it's going to be pay-to-win, because I feel like to really diminish it we would need
1:07:51: to introduce things that prevent you from
1:07:57: doing free stuff, but we don't plan to do that. Making paid things and making licenses, that's up to
1:08:04: the creators and it's up to each individual creator like you know do you want your work to be on the platform do
1:08:10: you want it to be free do you want it to be paid so it's going to be the creator who makes you know the decision and
1:08:17: there's always you know going to be people who just make stuff for the fun of it uh so I think it's just it's I
1:08:23: think it's going to be fine. I think it's actually going to be better for the platform, especially for the long term, because by creating sort
1:08:31: of an economy for creators, you get more creators overall, so you get a bigger
1:08:37: absolute volume of free stuff as well, because more creators are interested in Resonite, so more people are making
1:08:44: stuff like to give you an example say like you know there's 100 creators uh
1:08:49: right now — it's just a made-up number, I'm not pulling it from anything — but say there's 100 — what was that? There's 100 —
1:08:59: uh oh, uh oh, the internet's starting to
1:09:05: blink out. Oh, okay. Sorry for that.
1:09:11: Um, so, say there's 100 creators that are making stuff and we
1:09:16: don't have a workshop: 100% of the creators are making free stuff. Say we introduce the workshop,
1:09:24: Resonite grows significantly, and now we have 1,000 creators, and say 25%
1:09:31: of them are making free stuff — so the majority are making paid stuff. 75%
1:09:36: are making paid things but 25% are making free things which means now you
1:09:42: get 250 creators making free things, even though
1:09:47: 750 are making paid stuff. So even if the majority of the creators are
1:09:53: making paid things, the overall number of creators who are making free stuff is
1:10:00: still higher than it was when everyone could only make free stuff. So I think having
1:10:07: the workshop and having like incentive for creators to build stuff it helps and
1:10:13: even if majority of the stuff ends up being paid you still end up with more
1:10:18: overall free stuff than you would have otherwise and it's the way you know I
1:10:23: kind of like to look at it. Next question is from Brard. Brard? I
1:10:32: don't know how to pronounce it, I'm sorry. Um: you mentioned using procedural textures as normal map input — does the system
1:10:39: distinguish between a real normal map and a procedural texture that's being used as a normal map, and might that
1:10:44: cause a bunch of weirdness? Could you elaborate on this? I don't know what the difference is between a real normal map and a texture with color space set to
1:10:51: linear, and I've not encountered any weirdness with them, but I want to make sure it's not some unsupported behavior.
1:10:56: So normal maps, they are a very specific type of texture. Essentially, for a basic
1:11:03: normal map, um, each pixel is encoding a vector. So you have, you
1:11:10: know, a vector, and that vector is a unit vector —
1:11:15: it's a unit vector, and each unit vector has
1:11:21: the X, Y, Z coordinates, and what you can do —
1:11:28: the simplest thing is you encode these into RGB values. The problem is
1:11:35: the vector, the XYZ, can go positive and negative as well, uh, so you sort of
1:11:41: offset it: instead of going from −1 to
1:11:48: +1, it has to go from zero to
1:11:54: one, so you just sort of shrink it by half and you offset it, and you encode it that way, and then
1:11:59: like on the GPU it gets decoded the other part is like for normals the
1:12:04: vectors should be normalized they need to be unit vectors uh and when you
1:12:11: generate a normal map, the system will guarantee this is a unit vector — its length is always one. The
1:12:18: problem is, if you use an arbitrary texture, the vector can actually be longer than that or shorter than that, uh, which will
1:12:26: still do something when you use it with a shader, but the shader is not
1:12:32: expecting that. Maybe it's not normalizing it, or maybe it's combining it in a way — like, the shader will assume it
1:12:39: is a unit vector, and when it's not, the math of it can go wonky in some cases,
1:12:46: and this might not be apparent right away — it may only happen later. But essentially, it's not fitting the criteria of
1:12:54: what makes a normal map, which is that the normal map specifically
1:13:00: encodes unit vectors that are in specific kind of you know
1:13:05: space so you can technically plug any texture
1:13:11: uh as a normal map but if it's not a normal map like you know you you might get graphical
1:13:17: weirdness you might also get weirdness when we make some changes say for example we change their rendering engine
1:13:23: and now like the normals are being processed differently and now your thing looks different because the normal is
1:13:29: being interpreted weird. You might also get normals that are kind of malformed, you get lighting artifacts — there can be lots of
1:13:36: weird things, but the main thing is it's not fitting the criteria of the normal map and how normal maps, you
1:13:43: know, how the vectors are actually encoded. So, um, I think that's pretty much it. Do you
1:13:50: have like a thing oh yeah i was uh I was just making a little thing that shows like how the how
1:13:55: the red channel moves it like left and right and how the green channel moves it up and down for like which way it's pointing
1:14:04: but if you want procedural normal maps, we can add that, and we can make sure those
1:14:10: procedural normal maps actually make
1:14:17: a valid normal map. Uh, but the current ones don't
1:14:22: have that guarantee, which means your things can look weird. Also, usually, if
1:14:28: you use a texture like that, um, you usually run it through some
1:14:35: algorithm that actually generates a normal map from it and makes sure it fits all those criteria.
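The encoding described here — shifting a unit vector from [−1, 1] into the [0, 1] RGB range, and renormalizing on decode — can be sketched in a few lines. This is an illustrative Python sketch, not the engine's actual code (which is C#), and the function names are made up:

```python
import math

def encode_normal(nx, ny, nz):
    # Shrink the [-1, 1] components by half and offset them into [0, 1]
    return (nx * 0.5 + 0.5, ny * 0.5 + 0.5, nz * 0.5 + 0.5)

def decode_normal(r, g, b):
    # Undo the offset, then renormalize. A real shader often assumes unit
    # length instead of renormalizing -- which is exactly where arbitrary
    # textures plugged in as normal maps start going wonky.
    v = (r * 2.0 - 1.0, g * 2.0 - 1.0, b * 2.0 - 1.0)
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)
```

A generated normal map guarantees every decoded vector has length one; an arbitrary texture routed into the normal map slot yields vectors of arbitrary length, which is the undefined behavior being described.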
1:14:42: So it's one of those cases where you can technically plug it in there,
1:14:48: but it can make the math go wonky, and the wonkiness is going to depend on the specific shader and specific
1:14:54: rendering engine. So you might want to avoid that and make sure you're
1:14:59: actually plugging normal maps in there; otherwise you're kind of relying on undefined behavior.
1:15:07: Yeah, sorry — yeah, essentially it is undefined behavior, which means it
1:15:12: can also change on you. Uh, next question is from Marty
1:15:20: SH. Uh, they're asking: I am working on a language inspired by ProtoFlux, and while researching I read that ProtoFlux builds
1:15:27: acceleration structures and runs in a stack-based VM. What kind of acceleration structures does it build, and how is the
1:15:33: VM structured? Um, so it kind of depends how in-depth you want to go, but it builds
1:15:39: like two main types of acceleration structures one of them is action sequences so that's for impulses uh if
1:15:47: you have a bunch of nodes that are plugged into each other — so you have a sequence of
1:15:53: actions. Um, what it'll do is — say you have
1:16:01: A, B, C. It'll look at this and be like, okay, I know that
1:16:07: after A always comes B, after B always comes C. I don't need to be figuring that out at runtime, so I'm just
1:16:14: going to make an execution list that's A, then B, then C,
1:16:20: and then when it runs, when you send an impulse there, it just follows this list that it has pre-built.
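The idea — resolve the chain once, then just walk a flat list on each impulse — might look like this. A hypothetical Python miniature; names like `build_sequence` are invented, and ProtoFlux itself is implemented in C#:

```python
def build_sequence(start, next_of):
    # "Compilation": flatten a linear impulse chain (A -> B -> C) into an
    # ordered list ahead of time, so runtime never walks node connections.
    order = []
    node = start
    while node is not None:
        order.append(node)
        node = next_of.get(node)
    return order

def run_impulse(order, actions):
    # Runtime: just follow the pre-built list.
    for node in order:
        actions[node]()
```

The point of the trade is that the graph walk happens once, at build time, and every impulse afterwards is a plain loop over a list.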
1:16:26: um the other is evaluation sequences uh this like when you evaluate values so
1:16:32: like, you know, we have values that plug into something, and it does a similar thing. You have, like,
1:16:39: values, and this can actually have branching too. So say this is — uh, let me replace —
1:16:46: oh, never mind, uh, let me make
1:16:53: this. So say this is a value — say we're adding two values together —
1:16:58: and it can also figure that out. At runtime it would be, okay, I want to evaluate this node,
1:17:05: this node evaluates this node this node evaluates this and this node um so it's you know these nodes are kind of
1:17:12: evaluating thing you know multiple nodes but it always goes in the same kind of sequence so it's like okay I'm going to
1:17:19: make a list i'm going to evaluate I'm going to evaluate this let's say this is A let's say this is B then this is plus
1:17:27: and this is C so it's going to be okay I'm going to evaluate A then I'm going
1:17:32: to evaluate B then I'm going to then I'm going to evaluate plus and then I'm going to evaluate C so
1:17:40: it can kind of goes through that sequence of operations and it figures it out it figures that out you know ahead
1:17:46: of time uh rather than doing it at runtime because like this order of operations never changes it's not
1:17:52: conditional. The other thing it's also doing with this kind of system is figuring out how to place these
1:17:58: values on a stack, because it is a stack-based machine. Uh, so each execution
1:18:03: has its own stack frame, which is essentially just a sequence of data on the stack, and it's
1:18:09: going to be okay when this one evaluates it puts its value here when this one evaluates it's going to put its value
1:18:14: here, and then when this one evaluates, it actually takes these values from the stack, computes a new value, and puts
1:18:21: it here these two go away then when this one evaluates you know it takes this value
1:18:27: and like puts this value here and then like this one accesses this value at this index of the stack so it figures
1:18:33: out all the addresses all the offsets like it figures all of that ahead of time and it just like you know when it
1:18:40: comes to actual execution it just executes that like all the offsets are calculated all like you know orders of
1:18:47: operations are calculated um it figures all of that out like you know during the sort of compilation step
1:18:55: the one step further that you could go is like you know when you actually generate machine code for the execution
1:19:00: so it's not sort of you know doing that by calling the you know methods and stuff um and that's something we would
1:19:08: like to do in the future too like so it actually makes it even faster but uh this is what the system does right now
1:19:14: So there are a lot more details — it does a lot of metadata and
1:19:20: reflection kind of pre-processing, and there are a lot of complexities to it, but um, I don't think
1:19:26: we have time to really go into that in depth, and I'd probably also need to
1:19:31: refresh on that myself, because I haven't worked with it in a
1:19:38: bit. Uh, next question is from Ozie: I
1:19:44: believe you mentioned being unsure how to implement the Doppler effect for audio. Has there been any advancement on it? I ask because when trying to look up
1:19:50: how it's done for game engines, it's a whole thing I just can't parse. Yeah, I have some ideas, but it's one of those
1:19:56: things sometimes like when I'm not sure it's because like there's kind of multiple ways to approach it and I'm
1:20:02: like I'm not fully sure which one's the best one i need to play with it a bit and see how it behaves and then maybe
1:20:08: change my mind you know based on that so I look at some ideas but um because like the main issue with it is the way it
1:20:15: works in Unity is: Unity will essentially sample ahead. Like — why did
1:20:21: I delete the brush again? — um, in Unity, say you
1:20:30: have an audio track and this is where you are in the audio track, and it will be playing at this speed
1:20:35: normally if you're moving really fast it's essentially just going to move really fast and then slow down you know and like um
1:20:42: it kind of changes the speed and can sample ahead. The problem is, with Resonite it doesn't quite mesh, because
1:20:48: its audio is synchronized, and this kind of leads to inconsistencies. Also, for a lot of the audio sources,
1:20:55: you cannot really go ahead, because there's no audio ahead. If it's a user's voice, you
1:21:01: only get a small section of the voice, and then you're waiting on the network for the user to
1:21:07: actually send you their voice, to plug in more data so you can keep playing. And this works
1:21:14: because like you know if you're playing at normal speed the new voice you know like you're playing you're playing
1:21:19: you're playing you're playing you're playing new voice arrives you're playing you're playing you're playing play playing new voice arrives you know it's
1:21:25: it's literally like building the audio track like or you can think of it as like you know building the audio track
1:21:30: you know right in front of the playback um and the problem is you know if if you
1:21:36: go way faster there's no audio because it it doesn't exist because the user hasn't made the sounds yet so like you
1:21:43: know, what do you do in that case? Do you just do silence, and so on? So one of the things I've been
1:21:48: thinking about doing: um, if the system tries to sample ahead and it's like, I don't have any more audio, I
1:21:55: cannot sample ahead — it's sort of going to loop this bit. So it's going to
1:22:02: kind of take the clip and interpolate it with itself — it's going to sort of repeat,
1:22:09: you know. Um, it's not going to be ideal, but that's the best you can really do in that case. Uh, or maybe
1:22:15: use an algorithm to stretch the clip. Like, I'm
1:22:20: kind of doing some research on doing pitch shifting without changing the length of the audio. One of the
1:22:27: possible approaches is you cut it into super tiny pieces and then you essentially
1:22:36: repeat those pieces, blending them with each other, to sort of stretch it.
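That cut-into-tiny-pieces-and-blend idea is roughly granular time-stretching. A naive sketch of the technique, assuming a plain list of samples — illustrative only; real implementations window and align the grains far more carefully:

```python
def granular_stretch(samples, factor, grain=64):
    # Advance the read position slower than we emit grains, crossfading
    # each grain's head with the tail already written, so the output is
    # roughly `factor` times longer without changing pitch.
    out = []
    pos = 0.0
    overlap = grain // 2
    while int(pos) + grain <= len(samples):
        start = int(pos)
        piece = samples[start:start + grain]
        if out:
            for i in range(overlap):  # crossfade the overlap region
                t = i / overlap
                out[-overlap + i] = out[-overlap + i] * (1 - t) + piece[i] * t
            out.extend(piece[overlap:])
        else:
            out.extend(piece)
        pos += overlap / factor  # slower advance -> longer output
    return out
```

With `factor > 1` the read head advances slower than grains are emitted, which is the "repeat the pieces, blending them with each other" effect described here.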
1:22:41: So that's one of the possible approaches. Another thing I'm kind of thinking: if it can sample ahead, then
1:22:47: for sounds that do have Doppler, it's going to keep a context where, if it samples more data, it keeps it around, you
1:22:55: know, like, I have more data. And then if it slows down, it's going to use that already
1:23:01: pre-sampled data. Uh, but it's only going to work with some things, like some of the audio clips, and if
1:23:07: it would be too offset — like, if based on the data model it should be here, and based on the Doppler
1:23:13: it ends up being here — then it's going to correct it; it's just going to blend back to where it's supposed to
1:23:18: be. So that's a lot of things, but I'll have to see how it works and how complex I really want to
1:23:23: make it, because a lot of times you don't really need it, because the
1:23:29: Doppler does only a little bit, and it really becomes more of a problem if you increase it to a crazy
1:23:35: high value. So the question is, is it worth the effort of
1:23:42: doing all these kinds of systems? Like, maybe I can just have it, even for the audio clips,
1:23:47: stretch the clip with that mechanism, and make the implementation
1:23:53: simpler and save a lot of time on the work. So I'll have to see — it's one of those things where I'll
1:23:59: see when I get to it
1:24:04: Uh, next question is from Papaline: how stable are Resonite's cloud
1:24:11: storage types — the Resonite package and the related formats? Are these formats subject to breaking
1:24:17: changes over time, like ref hacking is? I think there was a mention of a custom binary format as a future plan. I was
1:24:22: wondering if the existing formats would be deprecated in favor of an API for import/export of custom file formats
1:24:28: using external tools. So there's actually a lot in this one.
1:24:35: um where do I start so first thing I'm going to say
1:24:40: ref hacking: the stability of ref hacking has nothing to do with the file format.
1:24:46: You know, if we do change the file format, which we probably will, that has nothing to do with ref hacking. Ref hacking is more
1:24:52: of a runtime feature — well, not even a feature. I don't — actually, they're asking whether it's
1:25:00: akin to it. Okay, I know what they're asking. What I'm saying is, um, it has
1:25:05: nothing to do with the file format. For ref hacking, it doesn't matter what format you
1:25:12: use, because if we change the format, ref hacking just doesn't
1:25:18: interact with it. Similarly for other features that we do support: it
1:25:24: has nothing to do with the format itself. The stability of the format and the stability of features, that's a
1:25:29: separate thing, so you need to think about them as two separate things. For both we can talk about
1:25:35: their stability, but they are separate things.
1:25:41: Um, for the format itself — it is
1:25:48: considered sort of an internal format, so we
1:25:55: um, generally we're committed to long-term stability, and since this is how we store data,
1:26:02: even when we do introduce a new format, we are going to support at
1:26:08: least reading the old format and converting it. So what will probably happen is, we introduce the new
1:26:14: format, we keep the functionality to read the old format, but we might nuke the ability to store in it, because we
1:26:20: don't need it anymore uh so like if you load things with the old format and you save them again it's going to be saved
1:26:25: in the new format so that's kind of you know similar for example how Minecraft handles it where it can load the old
1:26:31: maps but saves them into the new format. That way, your stuff — the Resonite packages, the items — those are still
1:26:38: going to be supported, but maybe the ability to save in that format is nuked. So if you rely on the
1:26:47: format being writable, that can break. Also, we might —
1:26:53: like, if there are any new features that we introduce, they might not exist for the old format. So if you
1:26:59: make, for example, your custom exporter that exports in the
1:27:04: old data format, you might not be able to export some things, because the old format does
1:27:10: not support them anymore — it only supports old stuff. So at the very least, what we will do is have the ability to load all
1:27:18: things in a format. But relying on it for long-term support, to be able to import and export all of it,
1:27:25: might go away. We might just be like, this is now a legacy format, we're not saving any more data in this
1:27:30: format, we're actually removing the ability to save in it, we only support reading it, and we're not going to —
1:27:36: we are not going to upkeep it; we just have it there to support loading already-
1:27:42: saved things from when it was the main one. Um, when you look at the
1:27:50: features themselves, that is separate, because the format is just a
1:27:56: way to serialize the data, and the format doesn't care about any
1:28:01: specific features or non-features. It's just a way to take the structure, put it into a file, and then
1:28:08: load it back. What the specific features do and how those features
1:28:16: maintain their compatibility, that's opaque to the file format — it doesn't care about that. The format essentially is, you know, I
1:28:24: have, for example, data of this type — a type that is named,
1:28:30: you know, Grabbable, for example, for the Grabbable component — and it has a boolean value of this value, and this thing. But
1:28:37: it doesn't know what Grabbable is, it doesn't know what any of the components are; it just stores them, it kind of
1:28:44: maps them. And when the system loads it, we have a separate system where, for the components, if we make
1:28:51: breaking changes, we have an upgrade mechanism for those, but it's independent of the format. The format is literally
1:28:58: just, you know, I have this value of this name, of this thing — the format doesn't care what it does. It's
1:29:06: the component: when it loads, for example when we've made upgrades to the Grabbable component, it goes,
1:29:11: okay, I've been loaded from this version, I need to make these changes to these values to upgrade myself to keep working
1:29:18: with whatever functional changes we made. So this is a separate thing, and it doesn't depend on
1:29:23: the format. The new format is going to do the same thing — it's going to be, I have a component of this
1:29:28: name with these values; when it loads, it just gives the values, and then the
1:29:33: component decides how to upgrade itself. And for the upgrade mechanism, it doesn't matter either:
1:29:39: similarly to how the format doesn't care about the upgrade mechanism of the components, the component doesn't care what format it
1:29:45: was loaded from. It only cares, okay, I've been loaded from this
1:29:51: version of Resonite, I need to perform these upgrades — I don't care whether that was from this format or that
1:29:57: format. It's like a separation layer. The third part for this is, you
1:30:03: know, official API, because APIs also come with their own kind of
1:30:09: import/export things. Um, that kind of overlaps a little bit
1:30:14: with the components, but if there's an API, we will usually make
1:30:19: sure that API is maintained, so if some calls change, you can upgrade your
1:30:27: things. So I hope this kind of answers that. Yeah, I would
1:30:34: say, in short: don't depend on the format.
1:30:40: Like, you can make tooling, but we don't make guarantees that that specific
1:30:45: format is going to you know have all the capabilities going forward but we do
1:30:51: make the guarantee that stuff you've made and saved with versions where it was the format — that will keep
1:30:58: working. So you'll be able to import Resonite packages, you'll be able to spawn stuff from the cloud —
1:31:05: because the cloud actually uses the exact same formats as the Resonite packages — so you'll still be able to import those; we
1:31:11: maintain compatibility for those. But if you build tools that rely on that format
1:31:18: like supporting both import and export like forever we don't guarantee
1:31:29: that. We have a question for you — Jack the FoxOtter is asking: do you end up being the one fine-tuning the reverb conversion,
1:31:35: because of your big
1:31:41: ears? I might be muted — I am muted, thank you. I said, you also —
1:31:49: you also have big-ass ears, pal. Well, I guess you have holes in yours and mine don't, so I guess the sound goes into
1:31:54: them better. Yes, some of the sound goes through for mine.
1:32:00: Can you, like, hang things between those? Like, is wearing earrings really easy for you?
1:32:07: People keep wanting to put a rope through the ears and tie them together, or stick strings
1:32:14: through the holes. Well, what if you don't — you've got holes in your ears, you don't want people
1:32:20: to put things in them. It's for better Wi-Fi reception.
1:32:25: yeah yeah yeah yeah next question is from Jack the Fox
1:32:31: Uh, also a more serious question: will we eventually be able to create custom shapes for spatial variables and
1:32:37: parameterize how the value is determined at each point within the shape? Are there any specific plans for that? Um, I mean,
1:32:44: potentially yes, we could expose an API and make it so you can define it —
1:32:49: like, define it with ProtoFlux, or define it with WebAssembly — have an API. So I
1:32:55: see that as a possibility, but we don't have a specific plan for it right now.
1:33:00: I would say this is like a feature request for the future, and it will probably depend on a bunch of other things being done
1:33:06: first. Uh, next question is from G: will there be a mesh spatial variable to go along
1:33:12: with primitive shapes there won't be because meshes have no volume so um if
1:33:18: you think about a mesh you know like if you have like a sphere you know this is you can define a volume but if you have
1:33:24: a mesh — say, imagine these are triangles — there's
1:33:30: no volume. Meshes have no volume; a mesh is just an infinitely
1:33:37: thin shell, which means if you sample here, you're not going to get anything, because you're not on the shell.
1:33:42: The only thing we could do is a convex hull, maybe, because convex
1:33:48: hulls do have a volume, but meshes don't, which means they can't be used for spatial variables. Actually, the only way they
1:33:55: could potentially be used is in some limited, very specific way, where,
1:34:01: say, this is my volume, this is my mesh here, and
1:34:07: what you want to do is project it onto the mesh — but then the question is why
1:34:14: you would actually be using the mesh at all. Yeah, I'm just going to say
1:34:21: no uh next question
1:34:28: Uh, next question from Message: so would it be possible to plug a generic signed-distance-field function into spatial
1:34:34: variables once we get nested nodes? Um, I mean, it's the same answer as to Jack's question: we want to provide an
1:34:40: API because special variables they have an API where like it says like you know this is my bounding box and then it's
1:34:46: like a sample it's like okay I'm sampling you know this is the bounding
1:34:52: box and there's like you know whatever shape you want and you essentially get okay I'm sampling at this point and it's
1:34:58: up to your function to figure out you know what value you get and also figure out is this still within the shape or
1:35:03: not because the system will do basic culling because like even for example a sphere has a bounding box so if
1:35:10: it samples here it's like okay yes there's a value here if it samples here it's going to be okay it is within the bounding box but it's outside of the
1:35:15: sphere I'm outside the shape you shouldn't sample there so if you provide
1:35:21: those functions you can technically do whatever you want you know so if we do provide that API or when we provide
1:35:28: that API you'll be able to like you know do whatever logic you want oh my god we just got this
1:35:35: subscription we got like I don't know how long ago thank you again for the subscriptions it's been a
1:35:42: while since I've last been live um we have a lot of questions and 25
1:35:48: minutes left we might need to start speeding through these uh so I I would say if you're asking
1:35:55: questions at this point we might not be able to get to those but hey like you know there's going to be the Reddit AMA
1:36:01: like you know some during that once it's announced um next questions from Grand K
1:36:08: would foveated rendering be looked into for devices and runtimes that support it since it's needed to be implemented by runtime and game
1:36:13: for it to work we kind of looked into it the main thing is right now like one it doesn't work out of the box with
1:36:20: the rendering setup that we use which makes things a little bit more complicated the other part is right now we are CPU bound um in more cases than
1:36:27: GPU bound which means it doesn't have as big a benefit right now but like if that
1:36:34: kind of changes you know we might look into it but I would say probably not until we switch graphics
1:36:39: engine uh next question is from Kai Vus uh I'm curious about one thing if a
1:36:45: custom rendering engine does not come out and since you are using Unity as a rendering back end would you guys switch to using Unity 7 since Unity says they are
1:36:51: switching to CoreCLR in that version no we're going to be switching rendering engine like we're already very close to
1:36:57: switching to like .NET 9 because after audio is done then is the splitting uh
1:37:02: the problem is also the newer versions of Unity they actually break a lot of stuff as well that we use so like you
1:37:09: know and there's like we have like we're at the point like where I've said this
1:37:14: like earlier like if we had half the reasons to switch away from Unity we will still have way too many reasons to
1:37:20: switch away from Unity so like you know one one thing changing is not going to
1:37:25: change our mind about switching away we need like full control over the rendering engine we want like you know because we want to do we want to do
1:37:31: features like you know doing custom shaders again very difficult with Unity we need like to have control how the
1:37:37: rendering works we need specific rendering pipeline there's a lot of licensing issues with Unity where they
1:37:42: have like suddenly changed things you know and they literally wanted to charge companies
1:37:49: more than they make you know and after they said more than once that they're
1:37:55: not going to do certain things so like they're not a company we want to kind of keep working with like we don't trust
1:38:02: them um we need a stable API too yes so it's
1:38:07: like no we are 100% switching away from Unity the form
1:38:13: it might happen might change but we are switching away from
1:38:18: Unity ganuk is asking uh how do you get started using spherical harmonics depends what you want to use them for
1:38:24: because you can use them for lots of different things if you want it for ambient lighting I recommend checking the video on ambient lighting on our YouTube
1:38:30: channel because we did kind of cover that it also covers you know some of the other use cases where you can you know
1:38:36: sample it like you know with proto flags for arbitrary systems so I recommend
1:38:41: checking that one out um next questions from Alex boot23 uh the uh YouTube DLP
1:38:48: browser cookie thing is broken for most browsers because they encrypted all the cookies for safety against stealers which makes it so you can't fetch any video
1:38:56: opera and one other are the only two that still work should I make a report about it so you can track it uh I would say make that
1:39:01: report with YouTube DLP we are not implementing this functionality on our end we're just telling YouTube
1:39:06: DLP you know fetch the cookies so they are the ones who need to implement fixes
1:39:13: for that so make the report on their GitHub uh once they do fix it or make workarounds for it we can implement it
1:39:20: on our end but until YouTube actually supports it like we can't really do much
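[Editor's note: for reference, the division of labor being described: the app only tells yt-dlp which browser to pull cookies from via yt-dlp's real `--cookies-from-browser` option, and yt-dlp does the extraction itself, which is why decryption breakage has to be fixed on yt-dlp's side. The wrapper below is a hypothetical sketch, not Resonite's actual code.]

```python
def build_ytdlp_command(url, browser="firefox"):
    """Hypothetical helper: build a yt-dlp invocation that delegates
    cookie extraction to yt-dlp's own --cookies-from-browser option."""
    return [
        "yt-dlp",
        "--cookies-from-browser", browser,  # yt-dlp reads the browser's store
        "--get-url",  # only resolve the direct media URL
        url,
    ]

print(build_ytdlp_command("https://example.com/watch?v=abc"))
```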
1:39:27: uh next question from Mintchalk uh the only solution to completely prevent ripping would be if the game could uh be uh cloud
1:39:35: gaming stream exclusive right because the game would be rendered on server and just streamed to clients yeah that's
1:39:42: pretty much what I said like during the thing like you have to like host everything and make sure like the user never actually gets the game never gets
1:39:47: the access and gets video stream but like you know then you need latency you need like infrastructure all over the
1:39:53: world so people can play it and even then like is it probably not good for VR so it's not really feasible
1:40:00: unfortunately um oh jeez sorry K is
1:40:06: asking will there be ProtoFlux nodes to use a slot's position to read the spatial variable uh you just use the component
1:40:11: for that but there will be a ProtoFlux node where you plug a position in world space and just sample the value uh you
1:40:18: don't need a slot for that like the the component that drives the value like it just uses the slot position but if you
1:40:25: use a ProtoFlux node then you know plug whatever position you want and sample there you could even do a
1:40:31: for loop and sample a bunch of different points too uh next question is from Bitkai uh
1:40:39: follow-up question about point shapes with spatial variables i guess the easiest way to understand what I mean is like a sphere of infinite size like putting
1:40:46: multiple points on an infinite gradient texture sorry if mine made it more confusing yeah I don't
1:40:52: really like you could like make like shape that's really big i don't know like infinite shape by definition is
1:40:58: infinite like you know so like it makes it so you cannot really get two points within that shape because like if you
1:41:04: have a sphere like you know have a point here then you can like you know um say
1:41:10: like you interpolate the value between you know the center and this and it can be okay I'm I'm
1:41:16: 50% you know 0.5 like between the center and you know the outer sphere but if
1:41:23: this is at infinity Then this point is at infinity this point's at infinity you know every
1:41:29: single point other than zero is infinity and zero is just going to be nonsense so
1:41:36: like it doesn't really
work like you cannot really have gradients like that uh
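[Editor's note: the infinite-sphere problem above comes down to one line of arithmetic: the gradient factor is distance divided by radius, and with an infinite radius every finite point collapses to the same value. A hypothetical sketch:]

```python
import math

def gradient_t(dist_from_center, radius):
    """Interpolation factor between center (0.0) and surface (1.0)."""
    return dist_from_center / radius

print(gradient_t(5.0, 10.0))        # finite sphere: 0.5, a usable gradient
print(gradient_t(5.0, math.inf))    # infinite sphere: 0.0
print(gradient_t(1e300, math.inf))  # still 0.0 -- every point collapses to 0
```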
1:41:50: oop uh sorry i got what I meant why I have any good ideas for it i mean like could have used
1:41:57: [Music] uh like I don't quite understand because
1:42:03: like you I'm confused i'm actually even more confused because like you can do they're
1:42:09: talking about the sampling like if if the particle systems sampled the spatial variables so like if the embers of a
1:42:15: fire entered into like a space or something then the embers would light up when they're close to the variable we could make a
system for it i would say make a GitHub issue for it like you know once it comes
1:42:31: out it could be a cool use case um De Hummer I remember something about
PL up not being thread safe what is your tentative solution to deal with the limitation i'll literally just wrap it
1:42:43: in a lock like I'm thinking about like adding pooling to it but like right
1:42:49: now it's fine without it because like all the other parts are still multi-threaded so like and people have
1:42:54: tested audio with like huge amounts of audio sources and it's still pretty fast so like I'll
1:42:59: probably leave it that way until it becomes a problem which it might not
1:43:08: be uh next uh bit underscore Oh oh sorry
sorry sorry yeah don't uh BD is asking about the Doppler effect would this not be just a
1:43:18: matter of simulating the path delay of audio and the change when you move in air uh no like the
1:43:26: issue is not figuring out you know the amount of Doppler you want because that that's the easy part you know you just
1:43:31: like there's a formula for that we figure out you know how far you're moving the problem is if you're moving
1:43:37: towards the audio source it essentially sort of speeds up you know it goes faster and like I mentioned if you for
1:43:44: example voice audio data you know you might have this part of the buffer but this audio hasn't arrived yet and you
1:43:51: might speed up so fast that you go here and you literally don't have the audio data for where your playback is supposed
to be at the speed you're going at which means now I have to work with whatever audio data you have and
1:44:03: that's you know where the gist of the problem is so like figuring out like you know how fast it needs to be playing
that's the easy part figuring out what to do when you don't have the audio data there that's the hard
1:44:18: part um Navy 3001 how long it will take to
1:44:23: get on .NET 9 once the audio system is live i don't know yet i hope like it's going to be like a month or so but like it's one
1:44:30: of those things you know [Music] where we don't know it's like it's still
1:44:35: like a complex project and a lot of the work you know it can end up like having a lot of like undiscovered work so there
1:44:41: can be complexities that arise during that process so like we generally don't like giving estimates because we
1:44:47: literally don't know and I don't want to be like making up a number
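[Editor's note: returning to the Doppler question a couple of answers back: the hard part described there is buffer underrun, not the rate formula. A toy one-dimensional sketch, not the real audio engine: a playback cursor sped up by Doppler can outrun the samples that have actually arrived.]

```python
def doppler_playback(buffer_len, arrived, rate, steps):
    """Advance a playback cursor at `rate` samples per step through a
    buffer where only `arrived` samples exist so far; one new sample
    arrives per step. Moving toward a source pushes `rate` above 1.0,
    so the cursor can overrun the data that has actually arrived."""
    cursor = 0.0
    for _ in range(steps):
        cursor += rate
        arrived = min(arrived + 1, buffer_len)
        if cursor > arrived:
            return f"underrun at sample {cursor:.1f}, only {arrived} arrived"
    return "ok"

print(doppler_playback(1000, arrived=10, rate=1.0, steps=50))  # keeps up: ok
print(doppler_playback(1000, arrived=10, rate=2.5, steps=50))  # underrun
```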
1:44:55: um Daniel uh 361911 is asking are you all thinking about going to make mobile apps for
1:45:02: phone and quest is that hardware too limiting so do you mean like playing actual resonite or do you mean like for
1:45:08: example having like a messenger app because those are like you know different things I would say the messenger app that's much easier like
1:45:15: and like I would like us to have something like that officially so we can you know chat with people and like you know do things and so on um that would
be pretty neat uh having like the full client on it is definitely a lot harder it's kind of similar to like you know the
1:45:29: limitations for the quest we would essentially we definitely need to have like bunch of optimizations go through
1:45:34: and we also need content segregation because you you know your phone cannot handle the same amount of stuff that
1:45:41: your PC can no matter how many optimizations you do so the content needs to be sort of like you know separated between the two
1:45:47: platforms but I would definitely definitely love to have both i think the messenger one is going to be like you
know one that's more feasible to happen in a shorter timeframe but we also don't have people to work on it right now
1:46:01: so we'll see next one is from Ace on Twitch 17 is
asking in the event that something tragic happens uh in the long run and Frooxius is no longer with
1:46:12: us what would happen to Resonite's engine will someone else manage the engine or will it completely shut down i mean it's
1:46:18: kind of hard to say but like we do have like you know number of team members that have like full access to everything
so they could keep going with things i don't know how things would look
1:46:31: uh but technically you know it can keep going going uh next question is from Grand UK
uh is a combination of spatial variable and value user override possible where the spatial variable is different for each
1:46:44: user checking it yep you can do that um the value on the spatial variable it's
1:46:50: um you know it's just a field like any other so you can you can derive it you can like you know do values you override
1:46:55: so that's a perfectly valid use case assuming you know you of course have to like know that like I want this value to
1:47:01: be different for each user uh next question is from uh cave extra
space any plans to update the current way mesh collision is handled I clip through mesh colliders and with certain
1:47:15: geometry I cannot just use convex hull or primitive shapes having an option for giving mesh colliders volume like the primitives already have would help a
1:47:22: lot so you cannot really give meshes volume so again this is like one of those questions where there's like multiple things in it one you
1:47:28: cannot give meshes volume that's just not how it works mathematically if you want to do that you use convex hulls uh
1:47:36: those have volume but also you know more limitations the other part is you don't need to give them volumes there are
things in the Bepu physics engine that use like continuous collision
1:47:48: checking that improve things like you know so you don't clip through things at the cost of extra performance but it can be
1:47:54: done um so we don't plan on like changing because like we don't actually handle the mesh collision ourselves as
1:48:01: handled by the physics engine but there's also like you know configuration things for like how the collision is checked that can improve this behavior
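[Editor's note: the clipping and continuous-collision point above can be sketched in one dimension. This is a toy illustration, not how Bepu actually implements it: a discrete check only tests the end position of a step, so a fast object tunnels through a thin wall, while a swept (continuous) check tests the whole travel segment at some extra cost.]

```python
def overlaps_wall(x, wall_min, wall_max):
    """Point-in-wall test used by the discrete step."""
    return wall_min <= x <= wall_max

def discrete_step(x0, v, dt, wall=(10.0, 10.1)):
    """Discrete collision: only the end position is tested, so a fast
    object can tunnel straight through a thin wall."""
    x1 = x0 + v * dt
    return overlaps_wall(x1, *wall)

def swept_step(x0, v, dt, wall=(10.0, 10.1)):
    """Continuous (swept) collision: test the whole travel segment,
    at the cost of a little more work per step."""
    x1 = x0 + v * dt
    return min(x0, x1) <= wall[1] and max(x0, x1) >= wall[0]

print(discrete_step(0.0, v=100.0, dt=0.5))  # False: tunneled through the wall
print(swept_step(0.0, v=100.0, dt=0.5))     # True: the sweep catches it
```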
1:48:12: um next question is from Grand K how far have you gotten with IPC theory for splitting will it be a bit of theory
after audio or straight implementation definitely theory as well like I do have like some general idea um but there like
1:48:24: needs to be like you know I need to I didn't mean to do that um but there's definitely like you know
1:48:30: work I actually need to kind of jump into like you know test some things and like you know do full design of it how
1:48:36: exactly it's going to communicate uh so yeah there's like design work that
1:48:43: needs to happen and usually that happens very close like you know right before the implementation there's like general bits like things are sort of
1:48:49: architectured for other systems so it fits into that but the actual all the details of implementations is done
1:48:54: before the implementation so there's going to be a bunch of design phase um
actually kind of getting through these um uh any plans to implement baked lighting
1:49:06: after rendering engine switch yeah there's definitely something we would like introduce no specific timeline though uh
Aussie is asking hopefully I can squeeze one last question how many audio sources does the current audio system allow i
1:49:19: remember asking around and I keep getting mixed numbers uh sure right now on the builds it's configurable i think you can set it
1:49:25: all the way to 512 like concurrent um we'll see like if we keep that thing or
1:49:31: not in the vanilla oh I don't remember i think it's 32 i don't remember
1:49:39: yeah it's something it's something like I've seen it max out around like 50 something i don't know maybe they're
1:49:46: just some Yeah probably yeah I I don't remember unfortunately
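[Editor's note: on the source-count question: mixing N concurrent sources is roughly one addition per source per sample, so cost grows linearly with the count. A minimal hypothetical mixer, unrelated to the real engine's internals:]

```python
import math

def mix_sources(num_sources, num_samples=64, sample_rate=48000):
    """Sum `num_sources` sine waves into one buffer and normalize;
    the nested loop is why cost grows linearly with source count."""
    out = [0.0] * num_samples
    for s in range(num_sources):
        freq = 110.0 * (s + 1)  # arbitrary per-source frequency
        for i in range(num_samples):
            out[i] += math.sin(2 * math.pi * freq * i / sample_rate)
    return [x / num_sources for x in out]  # keep the mix within [-1, 1]

# e.g. the 512 concurrent sources mentioned as the configurable cap
buf = mix_sources(512)
print(len(buf), all(-1.0 <= x <= 1.0 for x in buf))
```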
uh G UK is asking are multiple audio outputs a thing yet audio channels and so on they're not but I've like implemented
1:50:01: some things that are necessary for that need a little bit more so you can actually select you know other audio devices but um it's very close to having
1:50:09: that so you'll actually be able to be like the audio you know for the camera it's going to output a different audio
device so the camera audio you know is like rendered from like its viewpoint
1:50:22: so you'll get me spatialized you'll get Cyro spatialized and for you in the headset you'll still hear
1:50:28: it normally so um
um so it's like um this is going to help make streaming a lot
1:50:41: easier i actually got through questions pretty quick we got like 10 minutes left uh so if you want you can ask more
1:50:47: questions we have time for more uh let's
1:50:53: see uh As 17 I heard in previous streams you're talking about UI redesign what are our current ideas for it and how
would you see it in the long run so we've already been doing a bunch of UI redesign this is like you know it's more
1:51:06: of a like on and off thing uh we have like big general plans but like the gist
1:51:13: of what we want to do with the UI is make it when you first get into make it
1:51:18: as simple as possible make it you know so you have your basic stuff you can start socializing with people you know
1:51:24: you can start like going into worlds you can like you know do all the basic stuff but like not have like much other than
1:51:30: that um but then as you kind of keep using the platform more and more make it
1:51:35: so you can make the UI kind of grow with you and adapt with you and in order to do that we actually have a system called
1:51:42: facets uh facets um they're sort of like you know these like little containers of UI and we want like the default UI to
1:51:50: have like you know simple configuration but like once you start doing things you'll be able to like okay I'm going to find this piece of UI you know this for
1:51:57: this thing I want to do and plop it in there maybe plop it in there um and what
1:52:02: we've been sort of doing is uh reworking all the UIs that we have to use that
1:52:08: kind of you know principle and actually I can give you like a live showcase of this um if I
1:52:16: render private UI and discovers the camera sorry I'm going to switch to small POV
1:52:21: um you can kind of you know see our stuff um if I open the dash like you know there's a bunch of like old UIs
like the inventory doesn't use facets but for example settings we reworked this to use uh facets so for example you
1:52:36: know say there's a setting I really like to use frequently there another thing this UI it uses something called data
1:52:42: feeds so instead of like uh you know the UI being sort of all generated by code a
1:52:47: lot of these pieces they are made by our our team and there's a system that just generates you know feed of like sort of
1:52:54: bindings like you know values that need to change and then the actual pieces of UI they are made in game which makes
1:53:00: them you know much easier and much fancier uh and this is like a mechanism we want
1:53:06: to like do for other UIs as well because like all of this for example inventory this is generated from code this is
1:53:11: generated from code and making UI in code it's painful it's very very painful
1:53:17: it's hard to make like a really good looking like UI like for things um and
1:53:22: it's hard to iterate versus this you know this is you know there like really nice animations you know there's like
1:53:28: all this stuff um we want to use this mechanism where inventory is also going to be using data feed for example like
1:53:34: this there going to be feed of items there's going to be feet of path you know um and we just have the content
1:53:40: team design individual pieces of UI to like you know put everything together and on the code side we're going to do
1:53:47: more of a um you know more just kind of the structure of
1:53:52: it you know the sort of like raw data that goes into it um but another cool
1:53:58: thing you know going back to the facets is like say you know you use some piece
1:54:04: of UI like you know some setting use it like often so what you can do is you
1:54:10: know you can literally take a piece if I switch this to UI edit mode I can be
1:54:16: like okay like I like my m you know master volume and I'm just going to go on my screen and I'm just going to you
1:54:22: know drag and drop it here there we go and I can change my volume and if I turn
1:54:27: this off I have like you know quick access to it or even or even cooler you
1:54:34: know if you enable um and I think it's user interface
uh there should be a setting for facet anchors if you enable facet anchors i
1:54:46: have this on the other button will then toggle these and you can see I have mine
1:54:51: already kind of customized so I have like you know I have a very quick access to a lot of the common functionality I
1:54:56: use you know I can like change the volume I can change you know my status you know I have uh my well these
controllers don't report battery but you know I have like lots of settings
1:55:10: want to use is because these start empty and you can start you know sort of populating them with stuff you use frequently and adjusted you know to like
1:55:18: how like you like to use resonite um will do you know all of that for like
1:55:25: the other pieces of UI as well like for example inventory once we rework the inventory with the new system what
1:55:30: you'll be able to do is you can literally like you know plop out a piece of inventory put it you know on your
1:55:35: hand so you have like quick access to it maybe put it like you know somewhere else you can kind of customize it um so
1:55:42: just checking the time got four minutes um you know that's that's one of the
1:55:48: kind of like you know principles we want to kind of follow is like make the UI very modular make it so the content
team is actually able to like you know build the visuals and we don't have to like you know painstakingly later try to make it in code um make it a little
1:56:03: more dynamic and make it a little more customizable so you can you can make your own versions of it too or you can you know customize it you know put if
1:56:10: you use pieces of UI frequently you know just put them in places make the UI kind
of grow with you because oftentimes people are like you should make you know a toggle like you know simple and
1:56:21: advanced mode and for me that feels like you know it feels too black and white
it's like you know because different people use it for
1:56:31: different reasons same like you know for example with a stream UI say you're a streamer you know like you might want to
1:56:38: like you know take like the streaming UI and maybe plop it on your hand so you have like quick access to some more common functions for a camera maybe you
1:56:45: want to reorganize it you know so like you adjust for that maybe you're a builder so like you put like you know
1:56:51: your inventory full of tools on your hand so you have like super quick access to them uh we want to make the UI so it
1:56:57: kind of grows with you and how you use Resonite and so it's also a more gradual
1:57:04: process to kind of customize it as you go everybody you know kind of starts with a simple one and then you kind of
1:57:09: keep adjusting and building it so there's a lot more like on the topic but we don't have like super much time so I
1:57:16: hope this kind of helps give a better idea on it
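[Editor's note: the data-feed idea described above, where code emits "what values exist" while the in-game-built UI decides "how they look", resembles a plain binding/observer pattern. A hypothetical sketch, not Resonite's actual API:]

```python
from dataclasses import dataclass, field

@dataclass
class SettingItem:
    """One entry emitted by the code side: a label plus a live value."""
    label: str
    value: float
    listeners: list = field(default_factory=list)

    def set(self, v: float):
        self.value = v
        for fn in self.listeners:
            fn(v)  # notify whatever UI piece is bound to this item

def settings_feed():
    """The 'data feed': code only yields bindings, no layout or visuals."""
    yield SettingItem("Master Volume", 0.8)
    yield SettingItem("Voice Volume", 1.0)

# The presentation side (the in-game-built UI in the facets analogy)
# binds to feed items however it likes -- here it just records updates.
updates = []
items = {item.label: item for item in settings_feed()}
items["Master Volume"].listeners.append(lambda v: updates.append(v))
items["Master Volume"].set(0.5)
print(items["Master Volume"].value, updates)  # 0.5 [0.5]
```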
1:57:22: uh next question oh there's another question
uh BD is asking the current inventory browser is a bit of a pain to reorganize currently are there any near-term plans to
1:57:36: improve it depends what you mean by near-term i don't know whether that means like you know months 3 months 6
1:57:41: months um we want to improve it no specific timeline we would like to do it as soon as we
1:57:46: can don't know if it's near-term or not uh but yeah like inventory is like one
1:57:51: of the oldest and crustiest pieces of code it's very painful to work with so like we're going to throw it into the
fire and like you know rework it with the data feeds
uh GR case is asking will moving copying symlinking of objects come to the inventory system to make
1:58:09: it more similar to normal file systems so again a lot in there um copying yes uh
1:58:14: moving yes symlinking don't know maybe I would say like make a github
1:58:20: request uh I mean you can actually kind of do sort of symlinking with
1:58:26: the folders but not quite um gray how do
you feel if someone made a whole game in there something like Amnesia: The Dark Descent or Legend of Zelda: Ocarina of Time that'd be really cool I I
would definitely love to see that because we're kind of already making it into like a game engine there are already some sort of simple games
1:58:45: but like we definitely want to see more uh next question from Ozie perhaps
subjective strange question but why does the inventory stutter so much when loading a lot of items the settings page is a lot more complex versus the inventory feels like it
1:58:57: hurts more uh lots of reasons like loading lots of assets like you know for thumbnails but it's also like doing a
1:59:03: lot of crusty stuff uh next question from Grand K
could there be a thing where you can turn off certain facet anchors you don't use so you don't have all of them turned on i
1:59:15: mean like if you don't use them like um I don't know what you mean by exactly turning them off because you can just
1:59:20: not put stuff on them um being able to turn on like hand facet
anchors but not other facet anchors i mean just not put stuff on them i think uh they'd probably need
1:59:35: clarification um K is asking did contacts get the rework like inventory yes yeah like so right now the old pieces of
1:59:42: UI is like inventory contact session file browser settings have been reworked
1:59:48: um world technically has been reworked but they also going to rework it a little bit again so like because the
worlds was sort of like a prototype of the concept of you know data feeds almost like you know it doesn't use data
2:00:01: feeds but like it uses some similar principles uh and we want to actually rework it so it just uses data
2:00:07: feeds so that's also going to be technically reworked so pretty much like most of the rest other than settings is going to be
reworked um with that we're out of time so sorry
2:00:19: like cannot answer any more questions at this point uh uh thank you very much like you know for joining thank you for
asking all the questions i hope like you know you enjoyed the answers
um we'll figure out somebody to raid as well let's see if anybody's streaming
2:00:35: tonight uh there actually I don't know if this going to be like
2:00:41: a resonance next week probably yes it might be okay gem um it's possible it
2:00:50: might not be uh because we might like go somewhere uh
2:00:59: so we'll see uh either it happens or it doesn't um
uh thank you very much everybody for like watching us thank you for supporting the
2:01:11: platform like helping with this and thanks S for being here helping me answer questions yeah uh and also like again
2:01:19: like um there's gonna be announcement but like we're going to start like doing some Reddit AMAs so that's going to be
2:01:25: another avenue to you know ask a bunch of questions um talk with us so like keep a keep an eye out on that one and
also there's like another pre-release build with more audio stuff and if you help us you know test audio like that's um it's
going to help us get it done sooner which means you know we get to the splitting soon
2:01:46: hopefully uh so yeah thank you so much um uh you
can always check you know more stuff like on our Discord thank you for supporting us you know whether it's on Patreon or Stripe and if you do support
us on Patreon consider switching to Stripe uh on the same tier because we actually get more money out of that like
2:02:03: Patreon takes about 15% Stripe takes around 5% so we get 10% more
2:02:09: you know from your payment uh which adds up and helps us a lot so thank you very
much uh I'm going to raid Creator Jam and start it
2:02:22: down there we go and uh I will probably see you next week we might not see you
2:02:28: next week depending how stuff goes uh we'll see uh thank you very much for
watching thank you for all the questions and everything and go say hi to Medra
2:02:40: bye-bye