This is a transcript of The Resonance from February 9, 2025.
0:00: We're live, just sending an announcement, another announcement. Where's the office hour? And... there we go. Questions are open, and we're going to post it to Bluesky. There we go. Hello everyone, people should be piling in.
0:27: Hello, hello, we have people, we have people. Hello J, hello Real, and hello hello Nuki. Oh, they're watching ads, so they can't hear our hellos yet.
0:50: People are piling in. Hello Vboy, hello everyone.
0:58: Hello, thanks for listening, and thank you for the subscription — yes, thank you. They've subscribed, they've subscribed. Hello Jayden, hello everyone, how's everyone doing today?
1:27: Ah, they're trying to set things on fire — no, that didn't work, that didn't work, we don't have that on this one. It's not conducive to answering questions, so... we could maybe set one up, maybe, if people are sensible with it. Cyro's giving me a side eye, so I don't think he likes that idea.
1:47: Oh my god, thank you Tyra, another subscription. There are little Resonite logos which unfortunately don't render here, but they do render on Twitch — thank you. Should I say it? This is what emojis look like to us — yes, this shows the text. Oh no, he's doing it again, Gran is doing the schnit thing.
2:16: So, before we actually get into the questions: hello and welcome everyone, this is another episode of The Resonance. It's essentially my office hours slash kind of a podcast, where we talk about essentially anything Resonite. There's a heavy focus on technical stuff, but you can ask anything — about the philosophy, how the company is doing, pretty much anything that has to do with the platform. Just ask whatever you want during the stream. If you're going to ask a question, make sure to end it with a question mark; that way it pops up in our tracker and we make sure we don't lose it. If you're asking a follow-up question, please include the context of the question, because often there's a bunch of questions in between and we kind of forget what the original question was, which makes it hard to answer.
3:04: I'm Froox, and I have Cyro with me from our engineering team, so we should be ready to get started. We also have some questions that were asked ahead of time — there are actually three now — in our Discord. One thing we started doing is, for anyone who's not able to attend — oh, also thank you Moonbase for the subscription, thank you for more subscriptions — if you're not able to attend, you can ask your questions in the Discord. There's a channel created before the stream starts, you can pile your questions in there, and we'll try to get through them as well.
3:47: I'll see how it goes, because I'm not sure if it's better to prioritize those at the beginning or at the end. We have a few questions there, so let's say we go for an hour: if there's free time, if there aren't too many questions, we'll go to the Discord ones; if it's too busy, we'll switch to the Discord questions after a bit, just so those get answered, since people asked ahead of time. But for now I'll try to keep them for the quieter moments and see how it goes.
4:21: Let's clear these things out. The first question is from Gran, asking: that ended up being like a 15-minute ramble about rcking last time, so other than that, what else makes you spontaneously disintegrate?
4:40: Let's see, what else makes me spontaneously disintegrate... monopacking. Yes, there we go. I don't think we're going to jump into that one in a big way.
5:00: The only thing I've got to say about that is: just don't do it, there's no reason to. Well, maybe a tiny bit. If you have two nodes, maybe, and you just don't care to pull out the ProtoFlux tip, that's okay, I don't really care about that — they might as well just be components at that point. But if you're packing a hundred-plus nodes onto a single slot, what are you doing? Re-evaluate life.
5:28: Well, the thing is, if somebody tries to open it, it's going to make them explode and disintegrate, so they become... It makes me explode. It makes me cry. Okay, this one makes you cry, but for the last one you said you're going to drink people's tears when stuff breaks. Different explosions, different disintegrations. Anyway, there are some other questions, so let's not make this the big one.
6:04: The next question is from GameTheCupDog: "I asked my question ahead of time because I didn't think I could make it — should I clear it from the thread?" Feel free to keep it there; we're going to get to the questions in the Discord, so I'll just leave it for now, I'll keep it there.
6:22: Next question is from Lun: question — pets? We don't have a pet right now. Yeah, no pets, unfortunately.
6:35: Another subscription. N is asking: Moonbase, tell us about your Saturday show. I think there's a question within the chat there; I'm not sure what the context of that one was.
6:49: Check the Fox Author asks: are there any specific procedural assets you want to add in the future? Yes, there are going to be a lot of procedural assets. Generally we're going to expand with more primitives, more common things, more procedural textures, sounds — lots of different kinds. It also depends what kind of procedural asset you mean, because there are a lot of things that are technically procedural assets even though you might not realize it. For example the UI: the way UIX works, it actually generates procedural meshes for the actual UI and then renders them out with the typical mesh renderer. That exists on a local slot you cannot see, but under the hood there's a bunch of procedural asset providers orchestrated by the Canvas component. So there are things like that which are also procedural assets.
7:47: One example where you might have a system that's actually using procedural assets under the hood, but where there isn't a single procedural asset component you use directly, is something like metaballs. With metaballs it's not a single thing — usually you don't have just a single metaball, it's one mesh that you generate from multiple things, and you can have a variable number of them, where each one generates a field. So that's an example where technically it's also a procedural asset, but it's not a single procedural asset component; it's more of a subsystem that uses those for more complex functionality. Even some other planned systems are actually going to use procedural assets in the background too.
8:42: But yeah, there are a lot of just small ones. One off the top of my head is a hemisphere — essentially half of a sphere.
8:57: Oh, actually, there's one thing I really want to add that's going to be a useful building block for lots of procedural assets. Right now we're missing an implementation of a method for triangulation of arbitrary... what's the word... essentially arbitrary kinds of curves. Let me grab a brush so I can sketch it quickly — we're not going to be moving to the board yet, but I'll make a quick sketch. There it is, my favorite brush; I should just save it to the root of my inventory.
9:38: So right now, for example, say you have something like this that is convex — you have this loop of vertices and we want to fill it. We do have a method that's able to do that triangulation using a triangle fan: it starts here and just goes around, and that pretty much fills it with triangles. The problem is what happens if you have a more complex shape that's not convex but concave, so you have something like this that goes inward and comes back out. Now if you did the triangle fan, it would actually break, because if you started here, you'd get a triangle covering this area that's supposed to be empty. So typically you have to use some algorithm — for example ear clipping, or some other algorithm that's a bit smarter about this — where you essentially give it the set of points and it figures out: okay, maybe I'll put a triangle here, another one here and another one here, and that covers the shape with triangles. Once we have that algorithm, it's going to open up options for a lot more procedural meshes, because it becomes a building block that can be used for lots of different things.
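To make the difference between a triangle fan and the smarter approach concrete, here is a minimal ear-clipping sketch in Python. It assumes a simple polygon (no holes, no self-intersections) with counter-clockwise winding; the function names are illustrative and are not Resonite's actual API.

```python
# Minimal ear-clipping sketch for a simple 2D polygon (CCW winding, no holes).
# Returns a list of (i, j, k) index triples into `points`.

def cross(o, a, b):
    # z-component of (a - o) x (b - o); > 0 means a convex (left) turn for CCW polygons
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def point_in_triangle(p, a, b, c):
    # p is inside (or on the edge of) triangle abc if it is on the same side of all edges
    d1, d2, d3 = cross(a, b, p), cross(b, c, p), cross(c, a, p)
    return (d1 >= 0 and d2 >= 0 and d3 >= 0) or (d1 <= 0 and d2 <= 0 and d3 <= 0)

def ear_clip(points):
    indices = list(range(len(points)))
    triangles = []
    while len(indices) > 3:
        for n in range(len(indices)):
            i, j, k = indices[n - 1], indices[n], indices[(n + 1) % len(indices)]
            a, b, c = points[i], points[j], points[k]
            if cross(a, b, c) <= 0:
                continue  # reflex vertex, not an ear
            # the ear is valid only if no other polygon vertex lies inside it
            if any(point_in_triangle(points[m], a, b, c)
                   for m in indices if m not in (i, j, k)):
                continue
            triangles.append((i, j, k))
            indices.pop(n)  # clip the ear and keep going
            break
    triangles.append(tuple(indices))
    return triangles

# A concave "notch" shape where a naive triangle fan from one vertex would cover
# the empty region, but ear clipping handles it correctly.
concave = [(0, 0), (4, 0), (4, 3), (2, 1), (0, 3)]
print(ear_clip(concave))
```

The convex case degenerates to something equivalent to a fan; the concave example is where the extra "is anything inside this ear?" check starts to matter.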
11:04: One of the cool things I want to add is 3D text. If you think about 3D text, the way fonts work is that glyphs are essentially outlines. Take the letter T, for example: the letter T might be literally just an outline like this. That's what's specified by the font file — you get this outline — and then we can feed it to that triangulation algorithm and it figures out how to fill it in (I don't even know if I'm drawing it right, but it figures out how to fill this out). And then we can also, for example, extrude it, so we can make it 3D — that's the simpler part — and make it a thick T. We can make that a procedural mesh: you just type whatever text you want, give it whatever font you want, and you get that letter as a 3D object.
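Building on the triangulation sketch above, here is a rough illustration of the extrusion step: triangulate the outline for the front cap, mirror it for the back cap, and stitch the sides with quads. The `ear_clip` helper is the one sketched earlier, and the blocky T outline is just an assumed example, not real font data.

```python
# Sketch of extruding a triangulated 2D outline (e.g. a font glyph) into a 3D prism.
# Assumes `ear_clip` from the previous sketch and CCW outlines; not Resonite's API.

def extrude(points, depth):
    front = [(x, y, 0.0) for x, y in points]          # cap at z = 0
    back  = [(x, y, -depth) for x, y in points]       # cap pushed back by `depth`
    vertices = front + back
    n = len(points)
    triangles = []
    for i, j, k in ear_clip(points):
        triangles.append((i, j, k))                   # front cap
        triangles.append((k + n, j + n, i + n))       # back cap, reversed winding
    for i in range(n):                                # side walls, one quad per edge
        j = (i + 1) % n
        triangles.append((i, j, j + n))
        triangles.append((i, j + n, i + n))
    return vertices, triangles

# a blocky "T" outline (CCW), extruded 0.2 units deep
t_outline = [(0, 2), (1.5, 2), (1.5, 0), (2.5, 0), (2.5, 2), (4, 2), (4, 3), (0, 3)]
verts, tris = extrude(t_outline, 0.2)
```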
12:11: Plus all kinds of other stuff — for example, we could add Bézier curves, so you can define some kinds of shapes and then have something that triangulates them and fills them out. So that's going to open up a lot of options. It's a relatively small thing, but it's one that I think is going to open up a load of options, and I've just really wanted to do it for a while and haven't gotten to it; I have it on my list of fun issues. There's also another one I do want to add, an algorithm that's a building block for a lot of these, called marching cubes.
12:52: Essentially, it lets you reconstruct a mesh surface for some kind of field, and it's actually used for implementing metaballs, because the way metaballs work, you have objects that each create a field around them, and in order to turn that into a mesh you feed it into marching cubes, and that's what gives you the final mesh. But it's useful for lots of other stuff too.
13:17: One example is the particle system: with particles, we could make each particle be almost like a little metaball, and then, instead of rendering them individually, we feed them into the marching cubes algorithm. If you've got a bunch of particles here, it creates this kind of blob — they get blob-ified; another one's going to be like this, and if there are two near each other, maybe they merge like this. What that can be used for is making things look a little bit like liquid — for example, if you've played Portal, specifically Portal 2, they have the liquid gels, and this is one way you can do it: you're just shooting particles, and instead of rendering each of them as individual billboards or meshes, you have them kind of goop together.
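As a rough illustration of the field that marching cubes would be fed, here is a toy metaball field in Python: each ball (or particle) contributes a falloff, nearby contributions add up, and the surface is wherever the summed field crosses a threshold. The falloff formula and names are illustrative, not PhotonDust's or any planned metaball system's.

```python
# Toy metaball scalar field; a marching cubes pass would extract the surface where
# this field crosses `threshold`. Illustrative only.
def metaball_field(point, balls):
    # classic inverse-square falloff: each ball contributes more the closer you are
    x, y, z = point
    total = 0.0
    for (bx, by, bz, radius) in balls:
        d2 = (x - bx) ** 2 + (y - by) ** 2 + (z - bz) ** 2
        total += (radius * radius) / (d2 + 1e-9)
    return total

balls = [(0.0, 0.0, 0.0, 1.0), (1.5, 0.0, 0.0, 1.0)]   # two nearby "particles"
threshold = 1.0

# Sample the field along a line; cells where the value crosses `threshold` are the
# ones a marching cubes pass would turn into triangles of the blobby surface.
for gx in range(-4, 9):
    x = gx * 0.5
    inside = metaball_field((x, 0.0, 0.0), balls) >= threshold
    print(f"x={x:+.1f} inside={inside}")
```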
14:06: So those are some of the things I'd like to add, because I think they're going to open up a lot of really cool effects. Plus, once the algorithm is in there, it can be used for lots of things. Take marching cubes: once we have a terrain system, that's probably going to be one of the building blocks you can use to make terrains, because it's also often used for destructible terrain, where the terrain is defined by some kind of field and you use marching cubes to construct a mesh from it. So hopefully that answers the question.
14:35: The next question is from Rasmus021: with Stripe age verification knocking on the door, are there plans for nodes to use with it? I think that's actually a bit of a misconception. We did introduce Stripe recently, but it's not for age verification — it's a payment method. If you want to support Resonite, until now the only way to do that was pretty much Patreon. The problem with Patreon is that they take pretty big fees, which means that of the money you give us, we actually lose a lot — we lose several thousand every month. With Patreon, on average the fees are around 15 percent, give or take; with Stripe, so far they're working out to around 5 percent, and with the amounts we get, that means something like an extra two to three thousand a month, which we can then invest into other parts of Resonite — people, marketing, services, and other things. So this gives us a lot more resources to work with.
15:51: So if you do support us on Patreon, consider switching to Stripe, because even if you switch to the same tier and give essentially the same amount of money, we get a lot more from it, since Stripe takes less. It should also be a nicer implementation: for example, with Patreon you may have to wait a bit, but if you subscribe via Stripe you get the benefits immediately — it's based on webhooks, they have a really modern API, so it's a little more powerful.
16:25: Plus one more benefit: if you want the lowest tier, which on Patreon is $1 a month, that works out to $12 a year. On Stripe we don't actually offer that as a per-month option — you have to pay for the whole year — but as a discount you pay only $10. The funny part is that we actually get more money as a result, because on really small payments, payment processors generally take a much bigger cut. For the $1 on Patreon we get something like 70 percent, because they take around 30 percent, so we get about 70 cents. So of the $1 we get about 70 cents, and what is that per year? I don't do math in my head super well... 0.7 times 12, let me calculate real quick — I'm terrible at this. We could also just pull it up, but you're in desktop so it's a little easier: 0.7 × 12 is 8.4, so that's $8.40. With Stripe they take about 5 percent, and because you pay the $10 at once, the fee is much smaller — around 5 percent, and 5 percent of $10 is about 50 cents — so we get about $9.50. So you actually pay less and we get more; it ends up working out a lot better for us, especially with the number of people in that tier.
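For reference, the back-of-the-envelope comparison above looks roughly like this (using the approximate fee rates mentioned; actual invoices will differ):

```python
# Rough comparison of what reaches us from the lowest tier, using the approximate
# fee rates mentioned above (Patreon ~30% on a $1/month pledge, Stripe ~5% on a
# $10/year payment). Ballpark numbers only.
patreon_monthly = 1.00
patreon_fee_rate = 0.30
patreon_per_year = 12 * patreon_monthly * (1 - patreon_fee_rate)   # ~ $8.40

stripe_yearly = 10.00
stripe_fee_rate = 0.05
stripe_per_year = stripe_yearly * (1 - stripe_fee_rate)            # ~ $9.50

print(f"Patreon $1/mo tier  -> ~${patreon_per_year:.2f}/year received")
print(f"Stripe  $10/yr tier -> ~${stripe_per_year:.2f}/year received")
```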
18:11: But yeah — sorry, I kind of derailed the question. Since you asked about Stripe: we use it specifically for payments; we're not using it for age verification right now. It's possible we might use it for that in the future, but we haven't really started that process yet — it's something we've had conversations about, but there's no solid plan to do it yet. If that were to happen, we would likely open up the discussion and say: this is what we plan to do, this is one of the providers we're potentially looking at, and at that point we'd figure out how exactly it would be integrated. I don't know if there are going to be nodes for it, because that's way too far ahead — right now we're not even at the stage where we're doing age verification with Stripe; that's not a thing that's happening, at least not now. So asking whether there will be nodes for it is getting too far ahead of that. But hopefully that answers your question and clears up some misconceptions that may exist.
19:22: The next question is from Modern Balloon: do you have an idea of what the new IK system will look like, or is it a bit early to say? There's actually a bunch of information — if you go on GitHub, we collected a bunch of input from people, and there are some general plans which go into the details of what is planned. Generally I have a rough idea of how it's going to be structured, but for the specifics it's a bit too early to say, because typically when new features are implemented, there's a design phase, and during the design phase we actually work out all the ideas: this is how this would work, and if this thing works this way, how does it affect that thing, and so on. It goes through lots of iterations to get to a much more solid and robust design. Because we haven't started working on that part yet, it's too early — we don't have the robust design, just a general rough idea of how it's going to be structured.
20:30: One of the things I do want is for it to be modular, so you can, for example, have an arbitrary number of arms and legs. I don't know if that's going to complicate the algorithm too much — it might not, but it also might — and that's something that will be revealed during the design phase. So right now it's more of a collection of goals.
20:58: The other part of it, another problem I want it to address, is some sort of retargeting, because often the current IK has issues if your avatar doesn't match your body proportions. With one that matches, it works pretty well, and the further away it is, the worse it gets: if my actual pelvis is here, but say the avatar has really short legs and its pelvis would be much lower, the current IK takes the pelvis that's really low and tries to pull it up. So one of the things I want the new system to have is a mechanism for retargeting, where you essentially compute the IK for your actual, real body proportions and then adjust those proportions to match the avatar, so that it behaves a lot better. That's another aspect of the new IK.
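A very rough sketch of the retargeting idea for a single limb, under the assumption that the solve happens in real proportions and only the fraction of the reach is transferred to the avatar; none of these names reflect the eventual IK design.

```python
# Sketch of retargeting: solve against the *real* body proportions first, then map the
# solved pose onto the avatar's proportions, instead of forcing the avatar's bones to
# reach the real-world tracker positions directly. Purely illustrative.

def retarget_leg(real_hip, real_foot_target, real_leg_length, avatar_leg_length):
    # direction and reach are computed in *real* proportions...
    dx = [t - h for t, h in zip(real_foot_target, real_hip)]
    dist = sum(c * c for c in dx) ** 0.5
    direction = [c / (dist + 1e-9) for c in dx]
    reach = min(dist, real_leg_length)           # clamp to what the real leg can do

    # ...then the *fraction* of the reach (not the absolute distance) is applied to
    # the avatar, so a short-legged avatar doesn't get its pelvis dragged down.
    fraction = reach / real_leg_length
    avatar_reach = fraction * avatar_leg_length
    return [h + d * avatar_reach for h, d in zip(real_hip, direction)]

# real legs ~0.9 m, avatar legs only ~0.5 m: the avatar foot lands proportionally
print(retarget_leg(real_hip=[0, 1.0, 0], real_foot_target=[0, 0.1, 0.2],
                   real_leg_length=0.9, avatar_leg_length=0.5))
```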
21:58: But yeah, there are a bunch more details on GitHub — there's both a discussion and the GitHub issue — so definitely check those out if you're interested. For the specifics, though, it is too early. Overall, the goal — and this is one of those things that's a little hard to define — is to make it feel good, because right now the IK doesn't always feel good, especially if you go into certain poses: things don't quite match, the neck gets scrunched, maybe the hips aren't doing what they're supposed to do, maybe it's offset weirdly, and you have to recalibrate often. So one of the primary goals is making it feel good, and for that it needs a lot of testing: once it's implemented, we'll just keep tuning it and tweaking it and iterating on how it works.
22:50: Sometimes during that process there might be some big design changes that need to happen — that's what happened with PhotonDust as well, where I had an initial idea of how it was going to work, but during the implementation I found, okay, this is an issue, I need to make some design changes to get through this. So things change throughout the development process, but this is generally the idea of how it's going to work: it'll be composed of multiple solvers, hopefully modular ones people can contribute to, and the little details will be worked out once it gets prioritized.
23:29: The next question is from DevHammer: how excited are you to have the PhotonDust implementation almost done and over with? Yes — it's kind of turned into a much bigger thing, particularly because there are a lot of compatibility issues. One thing I didn't expect is how much messiness there is with the Unity particle system. Some of it is on our end, in how we exposed things, and some of it is on Unity's end, doing things in a really weird way.
24:06: The problem is there have been a lot of bugs with PhotonDust where it's technically not a bug in PhotonDust. For example, PhotonDust uses a consistent coordinate system — did I remove my brush? no, I have two brushes now — so if you have an object, and this is its up axis and this is its forward axis, then with PhotonDust a particle follows the same convention; the axes match. But in Unity's particle system, for some reason, this would be the up axis and this the forward axis — it's all offset and weird — and it also behaves strangely: if the particle is aligned with this axis, then maybe it randomly flips over here. It's just very messy. And it only happens in a certain mode, because if you change it to another mode it suddenly matches again. So it feels like their system is just a bunch of different systems made by different teams that didn't really coordinate.
25:14: And that makes things really difficult, because in order to make things look pretty much the same as they used to after conversion, we have to replicate those bugs, and that takes a lot of effort. The most recent one I had to deal with was where somebody used a negative value for velocity scaling. What happens with Unity's system when you do that is: if you emit a particle — say this is your source of particles — and you stretch it, then normally, with a positive value, the particle stretches like this, centered. But if you put in a negative value, it stretches but also offsets, so it's no longer centered — and it only does that when you use negative scaling. So I had to add a whole new module, the pivot module, which lets you offset particles from their center, and set up the conversion so that if it detects that mapping, it configures that module to offset the particles, which makes it look the same. And that took a while.
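Conceptually, the conversion described above might look something like this; the exact offset factor and field names are assumptions made for illustration, not the actual PhotonDust converter.

```python
# Sketch of the compatibility fix: a legacy negative velocity-scale both stretched
# the particle and shifted it off its center, so the converter emits a positive
# stretch plus a pivot-offset module instead of carrying the negative value forward.
# The 0.5 factor and key names are illustrative, not the real conversion math.
def convert_stretch_settings(scale):
    settings = {"velocity_stretch": abs(scale), "pivot_offset_along_velocity": 0.0}
    if scale < 0:
        # reproduce the old off-center look via an explicit pivot offset
        settings["pivot_offset_along_velocity"] = 0.5 * abs(scale)
    return settings

print(convert_stretch_settings(1.5))    # plain centered stretch
print(convert_stretch_settings(-1.5))   # same stretch, but offset to match the old look
```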
26:19: This is also one of the reasons why, when we say we no longer support putting negative values into certain things and people say "just let me use it" — this is one of the reasons: it creates a lot of weird problems. In this case it was already out there, and we're committed to preserving that backwards compatibility, so I spent the time implementing those things.
26:45: It's been kind of a long journey. We've also had a lot of people from the community finding all these problems and filing reports, which helped a lot, because people helped isolate things and make sure that something like 99% of all content should just work and look the same. But it takes a lot of effort and a lot of mental drain, so I'm really looking forward to it finally being done. The other part is that I really want to finish the performance update as soon as possible, and until PhotonDust is done we cannot move to the next milestone, which is going to be the audio system — there's a question about that as well, so we'll talk about it a little later. It being nearly done means we can finally move on to something different, which also helps a bit mentally, because it changes up what I focus on — after a while it gets a little grindy working on the same system for so long. But hopefully people are enjoying it too, because whenever I work on something I try to throw in a bunch of extra things — some new modules, like one that lets you change the color of particles based on their velocity — so there are little sprinkles of fun stuff along with the compatibility work.
28:08: But yes, I'm very excited. We're actually running the last phase of pre-release testing, and the legacy particle system is very likely going to be removed sometime next week, assuming there isn't another big blocker, which means everything will be automatically converted to PhotonDust, the milestone will be done, and we can move to the audio system. But yes, very excited.
28:33: excited um Che the fox author is asking also we are the topic of moning how do
28:40: you plan to implement perlex static asset compilation um I'm pretty much going to be like a thing like I'm not
28:46: like again decided on the specific details yet but much it's like when when
28:52: you build uh prolex prolex itself it's technically a separate library and has its own kind of representation of all
28:58: the nodes and it actually it doesn't care about any of like you know stuff like position of the nodes and so on it
29:04: just cares like you know about how the nodes are connected to each other um so um what is probably going to
29:13: happen with a static asset we essentially take you know like that kind of dynamic representation with the nose
29:18: that builds the sort of linear representation and that's what's going to be serialized so it's going to get
29:23: saved and it's you know then going to get loaded so it kind of Skips that step like you like about it you know we have like a bunch of nodes and you know so
29:32: like they're connected and they're like doing things and this one here this one here this one here and this is like you
29:37: know you have all these objects and they're kind of spatial and Al it's kind of bright so it's hard to see um just
29:43: move it here you know so have like notes connected what Photon dust does not Photon dust Proto
29:50: flx um it essentially converts you to Ser list of nodes so we have like you know one node we have the other node and
29:57: have the third node and then each node has like you know that it connects here and connects here just kind of referencing things um so when you
30:04: compile to static asset like like this sort of happens in the background when you're connecting things it's building
30:10: this with Proto uh Proto flux uh if you compile into static asset we'll just
30:16: take this and we'll save this and this this gets thrown
30:21: away and that way you know it kind of Skips that like whole step like where it has to convert and figure out the connections like from The Binding side
30:28: on like you know resonite side and build this and we just load this directly which is GNA you know
30:35: help and can also get you know seriz into its own asset and then kind of like load it
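A toy illustration of the graph-to-list idea, with an assumed data layout — the real ProtoFlux representation and serialization format are different.

```python
# Sketch of "static asset" compilation: the visual graph (positions, wires) is
# flattened into a plain ordered list of nodes with index-based references, and that
# list is what gets serialized and loaded back, skipping the rebuild step.
import json

visual_graph = {
    "nodes": [
        {"id": "A", "type": "ValueInput<float>", "position": [0.0, 1.0, 0.0]},
        {"id": "B", "type": "Mul<float>",        "position": [0.5, 1.0, 0.0]},
        {"id": "C", "type": "Display<float>",    "position": [1.0, 1.0, 0.0]},
    ],
    "wires": [("A", "B"), ("B", "C")],
}

def compile_to_static(graph):
    order = [n["id"] for n in graph["nodes"]]                 # fixed linear order
    index = {node_id: i for i, node_id in enumerate(order)}
    return [
        {
            "type": n["type"],
            # connections become plain indices; positions and other visual-only
            # data are simply dropped from the compiled form
            "inputs": [index[src] for src, dst in graph["wires"] if dst == n["id"]],
        }
        for n in graph["nodes"]
    ]

static_asset = json.dumps(compile_to_static(visual_graph))
print(static_asset)   # this blob is what would be saved and loaded directly
```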
30:41: Nuki is asking: do you like getting pets? Do you mean pets as in animals, or being petted? Because getting a pet, you know, that's a bigger responsibility. But pets like this — this is fine, yes.
31:07: Moonbase is asking, "can I ask, will it be going" — I don't know what that's in the context of. Where will what be going? I'm trying to think of what we were talking about... yeah, I don't know. If you're asking something, please provide context.
31:21: Next question from Rots: hi, I have a question — with the addition of something like rigid bodies, what kinds of things can we do and how will it be implemented? So for rigid bodies, it's probably going to build on a lot of the stuff that's already in Bepu Physics, because usually with rigid bodies you have the rigid body itself — the body can be physically simulated and tumble around. There are going to be mechanisms for efficient synchronization, so if you have multiple bodies it all stays in sync for multiple users. There will also probably be components to work with those bodies, so you can grab them, toss them around, move them, apply forces and so on, and those will likely be designed so that there's some sort of handoff: for example, if you're the one interacting with a rigid body, it primarily simulates on your end — you're the authority on it — and if somebody else touches it, maybe the ownership goes to them and they're the one simulating it. That way, for anything you do with a rigid body you have low latency; you don't have to wait for the simulation to happen on somebody else's end and then get the data back.
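A small sketch of the authority-handoff idea described above, with invented class and method names rather than the planned components.

```python
# Sketch of interaction-based authority handoff: whoever last touched a rigid body
# simulates it locally and streams the result, so their interaction has no round-trip
# latency. Names and data layout are illustrative only.

class NetworkedRigidBody:
    def __init__(self, body_id, owner):
        self.body_id = body_id
        self.owner = owner          # user currently simulating this body
        self.state = {"position": [0, 0, 0], "velocity": [0, 0, 0]}

    def on_user_interact(self, user):
        # grabbing, tossing, or pushing the body transfers simulation authority
        if user != self.owner:
            self.owner = user

    def step(self, local_user, dt, incoming_state=None):
        if local_user == self.owner:
            # we are authoritative: simulate locally and broadcast the result
            self.state["position"] = [p + v * dt for p, v in
                                      zip(self.state["position"], self.state["velocity"])]
            return ("broadcast", self.state)
        # otherwise just apply whatever the current owner sent us
        if incoming_state is not None:
            self.state = incoming_state
        return ("receive", self.state)
```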
32:35: There are also going to be, like I mentioned, a bunch of components to work with it — mechanisms to apply forces, to read velocities back, to do things — and we're also going to integrate constraints, because physics engines usually come with a number of constraints. You can say, for example: I have one part here and another one here, and maybe I add a joint here, so it moves like this and the whole thing is connected by the joint; or maybe there's a spring in between. So there are a number of different constraints that let you tie bodies together and create more complex creations. There will be components, and probably also some tools, so you can work with all of that and make all kinds of contraptions. So that's very likely how it's going to happen: the rigid body component itself, a bunch of components to work with it, and also integrating constraints and other bits.
33:35: you know integrating constraints and other bits uh Navy uh 3001 is asking just
33:42: wondering do you know of upload VR yes uh actually there should have been uh an
33:48: article published like re like within the last few months because we also uh did an interview uh with the voices of
33:55: VR no no sorry not voices of VR um that's going to buy um my brain just
34:01: going to um is the show that ska and Alex the
34:07: host uh I just completely blanked on the
34:13: name uh it was like a VR VR podcast and they do like C out like with upload VR as
34:22: well what was the uh let's see
34:34: I completely blanked out on it there's a yeah they also like write an article
34:39: about us as well do remember oh between realities
34:48: yes uh it was the between realities podcast um and they did publish it on
34:53: upload VR because they do like cab with them uh next questions from moon base please
35:01: Next question from Moonbase... oh, they're talking about something from earlier. Next question: Tyra White is asking, if we want to move our support account off Patreon, what is the best way — sign up on the Stripe page first and then cancel on Patreon, or does it matter? It doesn't really matter how you do it. When you cancel Patreon, it doesn't drop your current perks, they just don't get renewed, so the order doesn't really matter. You can cancel now and then subscribe later. Say your Patreon was about to renew on March 1st: you can cancel it right now and then subscribe on Stripe around March 1st, because there's a little bit of a grace period — maybe a little before, maybe a little after. You just want to avoid a period where you don't have an active subscription. If you do cancel, the remainder of your subscription for that month still completes.
36:05: It actually creates an interesting effect, because the way our system works, if you subscribe and have both active, a lot of the perks stack: for example, if you're getting 25 GB from Patreon and you subscribe on Stripe, you get 50 GB total; once the Patreon expires, those 25 GB go away and you have 25 GB at that point. So yeah, it doesn't super matter; the only thing I'd make sure of is that if you don't want to lose your benefits, you subscribe before the Patreon one expires.
36:41: Navy3001 also asked: what are your thoughts on hiring a project manager? I mean, it would help with some things. The main thing is that hiring somebody requires paying them a wage, and we're a small company, so that can be quite costly for us. We have a number of people who are sort of in that role — both Bob the Good and ProbablePrime are taking on some project-manager responsibilities, making sure stuff goes through on different things — and I do a little bit of it on some things too, though I don't like doing it that much; I don't like poking people, "are you working on this thing," and so on.
37:25: So it's a helpful thing, especially having somebody dedicated to it, but I don't think right now it would be the best investment — I don't think it would bring enough benefit over something like an additional engineer. When you hire somebody, you have to think about what's going to bring the most value to the company and to the project at the current time, and sometimes there are a bunch of departments that need more work, and maybe we get more benefit out of an additional engineer than out of a project manager, if the engineer is a good fit and can work fairly independently on a lot of things. So it depends. We get a lot of questions like "why not hire this person, why not hire that person," and people don't realize there are only so many people we can actually afford to hire at any moment, so often we have to decide which hire is going to bring the biggest benefit, because we just cannot do them all — at least not all at once; you want to grow gradually.
38:52: The next question is from Famous Scout: what is your favorite Tetris piece? I would say — I forget their names — the long one, just because it's very satisfying when you have a whole column open, you put it there, and it clears four rows at once. Do you have a favorite Tetris piece? I like the T-shaped one; it's satisfying when you lock it in, like a key. Yeah, it feels very satisfying when I can fit it in.
39:30: Next question — Jayden is asking: what's the future of videos and live streams in Resonite? I think Resonite could be amazing for watch-togethers; we've been trying to watch stuff together in here and it's been fun, but it's difficult since the audio seems to desync consistently. Unity and libVLC both seem to have their quirks, and it makes me wonder if things would improve, for example through an MPV-based backend, or if the problem is deeper and can be fixed with the new audio rendering — thoughts?
40:00: So yeah, that's an issue. The thing is, we're not using libVLC directly. It's actually kind of interesting, because the Unity player shouldn't desync — we have more direct control there. The problem with the Unity one is that it's very fragile: we have to be very careful which video streams we feed it, because it doesn't do much validation. Early on in its implementation we would just let it try to load any stream, and if it failed we'd fall back to libVLC; the problem is that sometimes it says it can load a stream when it can't, and then what you get is horrible graphical glitches and screeching in your ears, which — no. No. So we had to add a sort of pre-filtering where we're very careful which data we feed it: if we figure it's probably not going to be able to play this one, we just fall back to VLC, even though maybe it could play it, because it's better to err on the side of caution than to have people get horrible screeching in their ears. Because that was awful.
41:12: For libVLC, the situation is a little more complicated, because we're not actually using libVLC directly — we're using a project called UMP, Universal Media Player, which is sort of a wrapper around libVLC. The problem is it doesn't expose a lot of the things we'd need from libVLC, particularly for audio synchronization. We sort of have to hack it even to get the audio data out, and we don't get any sort of timestamp with it — we just try to read it out as fast as we can — and there's no mechanism to determine how it's aligned with the video. There is a mechanism within VLC, because with libVLC itself, the way it gives you audio is that you register a callback method — you say "call this when there's new audio data" — and there's also a bunch of callback methods where it tells you the user has seeked to a different part of the video, or there's been a drop, so clear whatever audio data you were about to play and start fresh. So it controls that playback and makes sure it matches the video.
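The callback pattern being described might be sketched like this; the class is a stand-in for illustration, not the actual libVLC binding or UMP API.

```python
# Sketch of the callback pattern libVLC offers (and the UMP wrapper hides): the player
# *pushes* audio blocks to you with timing, and also tells you when to flush, which is
# what keeps audio aligned with video after seeks or drops. Stand-in class, not the
# real binding API.

class AudioSink:
    def __init__(self):
        self.buffer = []          # pending blocks waiting to be played

    def on_audio_block(self, samples, presentation_time):
        # each block arrives with timing info instead of being
        # "read out as fast as we can"
        self.buffer.append((presentation_time, samples))

    def on_flush(self):
        # called when the user seeks or playback drops: throw away queued audio
        # so stale samples don't play against the new video position
        self.buffer.clear()

sink = AudioSink()
sink.on_audio_block([0.0] * 480, presentation_time=0.010)
sink.on_flush()                   # e.g. user seeked to a different part of the video
```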
42:18: We wanted to switch to the official libVLC library; right now that's kind of stuck, because as far as I know there have been some issues with it. We could look into other solutions — there's actually one that I think was called AVPro, or something like that, that I wanted to explore — but the problem is it's a paid solution, so the question is whether we want to fork over a thousand bucks right now for a solution that we won't be able to port to the new engine. The main thing we pretty much need is for the solution to have an integration with Unity, because you can't just use it on its own: it needs to be integrated with the game engine to make the video data available as a texture that you can then project onto things, and that requires a specific integration.
43:12: One thing that might actually be worth it is opening this up, because if there's interest in the community in helping push some of these things through, that's something we could use — we could use the help. For example, if there are issues with libVLC, we figure out what they are, and if people want to look into them, that could help as a contribution. But it's a difficult question of how much time and money we want to invest into Unity when we want to switch away, because with the new rendering engine we're switching to, that integration needs to be done again for that engine. So the work done for Unity is eventually going to be wasted, which makes it trickier: it might take a fair amount of time, and if we waste that time, that might hurt us compared to putting it into making the switch. But yeah, it would definitely make things better overall and fix a lot of these kinds of issues.
44:23: Next question — GameTheCupDog is asking: how would making procedural meshes from scratch in ProtoFlux work, and what features would be needed? There are two ways that's going to be possible. One is the mesh DSP that's planned — sort of a data-processing pipeline where, for example, you have a node that's a mesh source: say it gives you a grid, and you say "I want this grid to be 10 by 10." Then maybe you feed it into a node that takes a texture — say you have some kind of noise texture — and it displaces the grid; and maybe you feed that into another node that does some further processing. As you change these parameters, it goes through this pipeline and spits out a mesh at the end. So there will be a bunch of meshes that act as sources, and you can feed those through and do various processing on them, various filtering.
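A tiny sketch of that pipeline idea — a grid source flowing through a displacement filter — with made-up function names standing in for the planned mesh DSP nodes.

```python
# Sketch of the "mesh DSP" idea: chainable stages where a source mesh (a grid) flows
# through filters (displace by a noise texture, ...) and a mesh falls out the end.
# Function names and signatures are illustrative, not the planned ProtoFlux nodes.
import math

def grid_source(width, height, spacing=1.0):
    return [(x * spacing, 0.0, y * spacing) for y in range(height) for x in range(width)]

def noise_texture(x, z):
    # stand-in for sampling a noise texture
    return 0.5 * math.sin(x * 1.3) * math.cos(z * 0.7)

def displace(vertices, sample, strength=1.0):
    return [(x, y + strength * sample(x, z), z) for x, y, z in vertices]

# "pipe" the grid through the displacement filter; changing the 10x10 size or the
# strength re-runs the pipeline and spits out a new mesh
mesh = displace(grid_source(10, 10), noise_texture, strength=2.0)
```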
45:34: The other aspect, which is also going to plug into this, is being able to just build a mesh from scratch from the primitives. That's very likely going to work in a way where you have some kind of ProtoFlux function, and what it's going to need is collections. Essentially you get an impulse, and then you have some kind of for loop — this is backwards for you — and you build out a list of triangles: you compute their positions and pretty much do whatever you want. You add vertices, you add triangles, you build a mesh literally from those primitives, do whatever math you want, and once it finishes, you have your procedural mesh. This mesh is parameterized by some things, so you can, for example, have whatever values plug into it, and you literally build it vertex by vertex, triangle by triangle, positioning them however you want. This will give you the ultimate flexibility to build procedural meshes.
46:45: And the cool thing is, once we have this, you can then wrap it into its own node — it becomes its own node that outputs the mesh it generates — and then you can put it into another filter, maybe apply subdivision to make it smoother, and so on. That way we can compose these things; the systems interact with each other. One of the goals for the procedural meshes — and I talked about this in one of the recent Resonances, there should be a video — is that once you have the definition of it, it actually becomes a component. So say you have two parameters, say A and B: this becomes a component you can just attach, your procedural mesh, and you have your A and B, you plug in values, and it just runs your code on a background thread to generate that mesh. So I think it's going to be a really cool, really powerful mechanism for building procedural assets on your own.
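As a concrete toy version of "build it triangle by triangle and parameterize it," here is a small disc builder; the two parameters play the role of the A and B inputs mentioned above, and nothing here is the actual planned node or component API.

```python
# Sketch of building a mesh triangle-by-triangle from primitives inside a loop, with a
# couple of parameters (radius, segment count) that could later be exposed as the
# inputs of a wrapped node / generated component. Illustrative only.
import math

def build_disc(radius, segments):
    vertices = [(0.0, 0.0, 0.0)]                      # center vertex
    triangles = []
    for i in range(segments):
        angle = 2.0 * math.pi * i / segments
        vertices.append((radius * math.cos(angle), 0.0, radius * math.sin(angle)))
    for i in range(segments):
        a = 1 + i
        b = 1 + (i + 1) % segments
        triangles.append((0, b, a))                   # one triangle per rim segment
    return vertices, triangles

# the same function, driven by two exposed parameters (the "A" and "B" above), is what
# a generated component could run on a background thread to refresh the mesh
verts, tris = build_disc(radius=0.5, segments=24)
```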
47:59: It'll work similarly for textures, where you literally just loop through all the pixels and do whatever math you want to compute each pixel, and that makes a procedural texture. Same with audio — you just compute the individual samples. You're working from the basic primitives these assets are composed of.
48:23: Nutopia is asking: in a previous stream you mentioned animation timelines may be happening in Resonite — this would of course be amazing, so I guess my question is, is something like this in the works, or just a thought at the moment? Thanks. It's something that's on the roadmap; there's an issue for it which describes what it will do. It's not being actively worked on right now in terms of fully designing and implementing it — it's one of those background things where we know generally how it's going to work. Whenever features are added to Resonite, there are a lot of features that aren't being actively worked on, because right now we're working on the performance update, for example, finishing PhotonDust and so on. But there are these little threads that happen in the background, just thinking "this is how this is going to work," and they sometimes influence how other things are done, because Resonite is designed so that it will work really well with the timeline. Some of the features being added right now — for example, when I was working on ProtoFlux — there were thoughts about how they would work with the timeline feature. So there are little nuggets and threads that go into that feature, little tiny pieces of design work, and at some point we'll say "okay, now we're implementing this feature," all those threads come together, and it becomes the major focus that's being implemented. So no, it's not being actively worked on right now, not in any major sense, but it's a thread that's kind of building.
50:12: The next question: Cyro, you were working in parallel on the audio system, right? How is that going?
50:20: So Cyro's been working specifically on the reverb implementation, because with the existing one in Unity, Unity has a reverb zone and we need to provide an equivalent, something that's similar. Cyro has been working on integrating a library for that — the Zita reverb — you can tell us a bit more about it. Yeah, so the actual library that we're using is called soundpipe, and it's made by this guy called Paul — I think his name is Paul Batchelor, I can't remember, I'd have to look it up. What it does is... basically, we were looking for a library that provided a nice set of sound effects, particularly the reverb, because we need that, and soundpipe seemed to be pretty good: it's a little C project, it's easy to build, and it's not too much hassle to integrate. One of the effects it provides is, of course, the Zita reverb, which I think is a pretty well-established type of reverb — I'm not really too well versed in audio, so I'm not quite sure on that one. But one of the nice things about it, compared to Unity's reverb, which uses FMOD as the underlying audio processing system, is that this one is actually stereo: the reverb is in both ears and there are slight differences between the ears, and while I was A/B comparing it, I realized it just sounds way better — it sounds a lot more full and rich, and the reverb is coming from all around me.
52:21: It's going to be cool once we have it, especially once we have audio zones people can play with too. Yeah — essentially, this is one of the components of the audio system that I specifically needed help with, so Cyro has been doing a lot of really good work, both on the research side — I think I asked you to look into what solutions exist — and then on integrating it and making the wrapper. But also, the really important thing you've done, Cyro, is donate your ears and probably some of your sanity, doing a lot of A/B testing — because the Zita reverb works differently from the FMOD reverb, which means we need to map the existing presets to the new one, so essentially you did a lot of A/B testing just to manually match them. Yeah.
53:14: So Unity has a lot of different parameters that control various aspects of how the sound is processed — like how much, quote unquote, room it has, or how much the lower frequencies resonate versus the higher frequencies, how long they do that, and how they sound. Zita has a lot of those same parameters, but it has a couple of parameters that are kind of rolled into one, so it technically has fewer inputs than the Unity one, though you can make it sound a lot better. The problem is making them sound the same. So what I had to do was take music — and I recorded my own voice — and over the course of a week or two I just sat in my home world, basically, with a little mockup reverb I made so I could listen to it in game, and I just had to compare them and tweak the values, and compare them and tweak the values. So I was like, all right, time to replicate the "sewer pipe" preset — oh boy — and I just sat there and tweaked the high frequencies and the low frequencies and how long they resonated, and oh my gosh, dude, I maybe listened to the same audio clip over 200 times.
54:51: We appreciate your sacrifice. A harrowing experience.
54:57: One of the reasons I asked, "Cyro, could you help with this thing?" is that it saves a lot of time, so I can focus on other things too. Yeah, but it's going to be worth it, and I'm kind of excited to finally get it integrated, with PhotonDust being nearly done. Yeah — and you'll be able to actually play with it right away, in the form of: I made it so you can process audio clips using the reverb parameters, so any audio clip you have in game, you'll be able to apply reverb to. So maybe you could make tools that automatically apply certain presets to specific audio clips for certain areas of your world, for example — I don't know. Yeah, I want people to do something cool with it.
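A toy picture of what "apply a reverb preset to a clip offline" could look like; the preset values and the single-delay "reverb" are purely illustrative and are not soundpipe's actual Zita reverb or its parameters.

```python
# Sketch of offline clip processing: run every sample of a clip through a reverb
# configured from a named preset. The preset fields and the toy comb-filter "reverb"
# below are illustrative stand-ins, not the real integration.

presets = {
    # hand-tuned values standing in for the A/B-matched presets described above
    "sewer_pipe": {"decay": 2.8, "wet": 0.5},
    "small_room": {"decay": 0.6, "wet": 0.2},
}

def apply_reverb(samples, sample_rate, preset):
    decay, wet = preset["decay"], preset["wet"]
    delay = int(0.03 * sample_rate)            # single feedback delay as a toy "reverb"
    out = list(samples)
    # smaller decay -> quieter echoes; this is only a rough stand-in for a real tail
    feedback = 0.5 ** (delay / (decay * sample_rate))
    for i in range(delay, len(out)):
        out[i] = samples[i] + wet * feedback * out[i - delay]
    return out

clip = [0.0] * 48000
clip[0] = 1.0                                   # an impulse, to hear the echo tail
wet_clip = apply_reverb(clip, 48000, presets["sewer_pipe"])
```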
55:47: And the cool thing is, once something like that is integrated, it's a building block, and it can be used and exposed in a lot of different ways. One thing I also want us to do is, once we have the audio DSP for ProtoFlux, turn it into a node, so you can actually pipe audio through it to do whatever audio effects you want. And there's also a whole bunch of other effects in the library we might integrate as well. That's always the way we approach things: make things into building blocks that can be used in lots of different ways. But I hope that answers the question.
56:29: I will say — when are we going to get to the questions that were in the Discord? Okay, I was going to do that — there are a few. I was going to do it at the full hour, because there's a whole bunch of them, so we're going to answer the questions from Discord in a second. We'll go through a few of these first, because they're very quick ones.
57:02: Nuki is — actually not asking — they give Froox and Cyro one pet each. One pet, thank you, I'm honored. Oh yeah, J the F — Between Realities, that was from earlier.
57:08: then moon base is asking is there a better Li VC library for CP so just to kind of clarify what I was like saying
57:14: earlier the official like live VC now has an official C implementation the
57:20: problem is we need a Unity integration on top of that uh and that's kind of you
57:25: know some of the kind of issues like lie because if there's an ination like you know we can kind of plug it in and
57:31: replace the existing one but as far as I know there's like you know some issues
57:36: like with that implementation uh last time like uh gin was kind of looking into this part so
57:42: like you might want to ask him like in his office hours for like any details on that
57:49: So, um, I'm going to go to the questions... this one's a bigger one, so I'll check the questions people
57:58: asked in the Discord. Red has been asking: what are
58:06: some of the currently planned extra features in the new audio system, kind of like how we got simplex turbulence in PhotonDust?
58:12: I actually covered some of this in a previous Resonance, where I
58:17: went into detail on how it's going to work and so on, and that's a published video, so I do recommend checking that one out. But in short,
58:25: one of the features I'm particularly looking forward to, that's actually relevant to this as well, is that
58:31: we can have multiple listeners. So instead of, right now, you having only one listener, where...
58:39: I pretty much set the audio to broadcast, because if I set the audio to
58:44: spatialized for Cyro, for example, you would hear him wrong, because you would be hearing him from my perspective. I can
58:51: switch it to the perspective of the camera, but then it would be wrong for me, and it really messes with my brain, so I
58:57: don't do it and I just do broadcast. And the reason we cannot do it is because
59:03: with Unity we can only have one listener, one sort of viewpoint for
59:09: the ears. With our own system we can have as many listeners as we need, which
59:16: means we can render audio for the user, and then we can actually render the audio again for the viewpoint of the
59:22: camera and send it to a different audio device. That way you have audio going to you, and you have other audio that's
59:28: going to the stream, and that has the audio spatialized fully from the camera's viewpoint. What's even cooler with that
59:35: one, we might make it so you don't need to capture your own microphone; we'll render your own audio
59:41: spatialized for the camera, so you're also not in broadcast, you're going to be properly spatialized from
59:48: the camera's viewpoint like everybody else.
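To make the multiple-listener idea concrete, here is a minimal sketch, assuming invented types rather than the real engine: the same set of sources gets mixed once per listener, so the user's ears and the in-world camera can each get their own output that goes to a different device.

```csharp
using System.Numerics;

record AudioSourceState(Vector3 Position, float[] Block);   // one block of mono samples
record ListenerState(string OutputDevice, Vector3 Position); // e.g. user's headset, stream camera

static class MultiListenerSketch
{
    // Renders one mix per listener; a real spatializer would also do HRTF and panning.
    public static float[] Render(ListenerState listener, AudioSourceState[] sources, int blockSize)
    {
        var mix = new float[blockSize];
        foreach (var src in sources)
        {
            float dist = Vector3.Distance(listener.Position, src.Position);
            float gain = 1f / (1f + dist);               // crude distance attenuation
            for (int i = 0; i < blockSize; i++)
                mix[i] += src.Block[i] * gain;
        }
        return mix; // send this buffer to listener.OutputDevice
    }
}
```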
59:54: There are other things I want to do with that; for example, make it so you can make microphone props in the world and just record things into spatialized audio, or maybe
1:00:00: make audio sources so you can make, you know, walkie-talkies that go through there. So that's one
1:00:07: of the things. The other part: we're going to have much better control over how the audio falloff works,
1:00:14: so there are probably going to be a bunch of features around that, exposing more control, maybe settings where you can
1:00:20: tweak the falloff locally, if you, for example,
1:00:25: want quieter audio overall, or maybe you have trouble hearing people. So there's
1:00:30: stuff like that. Some of these we might not do as part of it, because usually
1:00:36: for the switch the main goal is feature parity; there are a lot of features we could add, but if they take a
1:00:44: lot of time, that means they delay the performance update, which is the main focus. So usually we'll only add
1:00:51: things that are small enough. But we also might add,
1:00:56: for example, mechanisms where... because right now, an audio source, a source of
1:01:01: audio, is just a sphere: if you're here, it's the same volume as if you're over here,
1:01:09: because it's the same distance. But what I might do is,
1:01:15: you know, there are ways to define them to be different, so for example maybe the audio is shaped more like this, and it's very
1:01:22: quiet here, so it's very loud if you're in front of it but quiet if you're behind, so you
1:01:29: can do different shapes, because we'll pretty much control how the volume
1:01:34: is computed based on the positions, and we can plug whatever functions we want in there.
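A tiny sketch of that "volume can be any function of relative position" point, purely illustrative and not engine code: a cardioid-ish falloff that is loud in front of the source and quiet behind it, on top of a linear distance falloff.

```csharp
using System.Numerics;

static class DirectionalFalloffSketch
{
    public static float Gain(Vector3 sourcePos, Vector3 sourceForward,
                             Vector3 listenerPos, float maxDistance)
    {
        Vector3 toListener = listenerPos - sourcePos;
        float dist = toListener.Length();
        if (dist >= maxDistance) return 0f;
        if (dist < 1e-5f) return 1f;                        // listener on top of the source

        float distanceGain = 1f - dist / maxDistance;       // simple linear falloff
        float facing = Vector3.Dot(Vector3.Normalize(toListener),
                                   Vector3.Normalize(sourceForward));
        float cardioid = 0.5f * (1f + facing);              // 1 in front, 0 directly behind
        return distanceGain * cardioid;
    }
}
```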
1:01:43: The other part I was going to mention with that... what was I going to mention? I kind
1:01:49: of blanked out again. I blanked out.
1:01:57: There's another thing, I don't remember it
1:02:03: now, but there are a few. Oh, one of the
1:02:08: things I was looking into, that we want to add eventually but probably isn't going to happen for this one, for the MVP: we're going to
1:02:16: be integrating Steam Audio, and Steam Audio supports things like
1:02:21: actually using the world geometry for audio bouncing and occlusion. Oh, actually, I
1:02:27: remember, there's another thing that I found. One of the things that Steam Audio supports is simulating air
1:02:32: absorption, so if something's really far away, certain frequencies get absorbed more than others,
1:02:39: and that should be easy enough to integrate, so maybe you'll get that one.
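For intuition only, here is what distance-based air absorption roughly looks like in code: the further away the source, the lower the cutoff of a low-pass filter, so highs fade first. Steam Audio models this properly; the mapping and numbers below are made up for the sketch.

```csharp
using System;

static class AirAbsorptionSketch
{
    // In-place one-pole low-pass whose cutoff drops with distance (invented mapping).
    public static void Apply(float[] block, int sampleRate, float distanceMeters)
    {
        float cutoffHz = MathF.Max(2000f, 20000f - distanceMeters * 180f);
        float dt = 1f / sampleRate;
        float rc = 1f / (2f * MathF.PI * cutoffHz);
        float alpha = dt / (rc + dt);

        float state = 0f;
        for (int i = 0; i < block.Length; i++)
        {
            state += alpha * (block[i] - state);   // one-pole low-pass step
            block[i] = state;
        }
    }
}
```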
1:02:47: Oh, and the other one is, we might support loading custom SOFA files for the Steam
1:02:54: Audio spatialization. There's something called an HRTF,
1:02:59: a head-related transfer function, and usually there are some default ones that come with it, and
1:03:05: they work okay for most people, but you can actually get different ones; you can even get your
1:03:10: own ears measured so you get a custom one that matches your
1:03:16: physiology. So we might expose a mechanism to load a custom one, so you have more personalized audio
1:03:23: spatialization. I'm not making any promises on these things right now; like I said, the main goal is
1:03:30: feature parity, but there are also smaller things that might be easy enough to integrate along the
1:03:37: way. So this should answer that question. The next question is from
1:03:44: GameTheCupDog: when multiprocess happens, how viable would overriding or hijacking the FrooxEngine-
1:03:50: to-Unity IPC be for a custom render backend? I mean, one of
1:03:56: the things we do want to do ourselves is swap it for a custom rendering engine. So we have
1:04:02: FrooxEngine, with all this stuff in it, and then it communicates over IPC, and Unity does the
1:04:09: rendering, and having this well-defined interface makes it easier to
1:04:14: take this part away and put a different one in. The problem is, when we do this,
1:04:22: we're probably going to change it a fair bit, so it's not going to work 100% the same;
1:04:28: we're going to make it so it works really well with the new engine. It's going to work on similar principles, but we can
1:04:34: change how bits on this side also work so it fits the new engine better, which gives us more
1:04:40: flexibility. So if you mean swapping it on our end, that's
1:04:46: something that will happen. If you mean swapping it
1:04:52: by somebody from the community, that's harder, I feel, because, one,
1:04:58: you'd have to reverse engineer how this
1:05:04: works to make a different renderer, and the question is, are you going to replace it with a completely different renderer? Because
1:05:12: making a different renderer is a lot of work, like a lot; you pretty
1:05:18: much have to implement all the mechanics of how it works, all the intricacies, you have to implement all the shaders, all
1:05:24: the functionality. So it's something you could technically do, but it might be way more
1:05:31: work than you expect. Plus, if you were
1:05:37: doing that, you don't have the flexibility of adjusting this side to match the new renderer, which we do have for
1:05:44: our own development. So, practically, I don't think it's going to happen. I think it's more likely somebody makes a
1:05:50: modified version, where you take our Unity renderer and you make modifications to it, but you still
1:05:56: build on the same base. If you want to make one from scratch, that's going to be tough,
1:06:02: especially if you want full feature parity with everything. But it is going to happen on our side;
1:06:08: we might even ask for some community help with the official one, so we'll see how that goes.
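As a hypothetical illustration of why a narrow, well-defined boundary makes the renderer swappable, here is what such an interface could look like. These names and signatures are invented for the example; they are not Resonite's actual API.

```csharp
using System;

// The engine would only ever talk to this interface; a Unity-backed
// implementation and a future custom renderer could, in principle, sit behind it.
public interface IRenderBackend
{
    void UploadMesh(long assetId, ReadOnlyMemory<byte> vertexData);
    void UploadTexture(long assetId, int width, int height, ReadOnlyMemory<byte> pixels);
    void SubmitFrame(FrameDescription frame);   // what to draw this frame
}

// Minimal per-frame description for the sketch.
public sealed record FrameDescription(long[] VisibleObjects, float[] CameraMatrix);
```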
1:06:15: And the last question on the
1:06:20: channel is from Oie: I've been trying to understand some parts of the haptics system and have some questions about it, feel free to
1:06:25: go through them however you like, given I'm asking a lot. Is the haptic manager component that injects local haptics for the users
1:06:32: meant to have any debug visuals when checked? I'm actually not sure on that one.
1:06:38: There should be a component that visualizes the haptics; I
1:06:43: don't remember off the top of my head, I would have to get haptics set up. Actually, no, we should be able to
1:06:48: show it off, maybe. So, do you mind if I inspect you? Yeah, I don't mind, have a look,
1:06:57: because there should be a bunch of components that you can poke at. So
1:07:02: I'm actually going to switch the camera to POV, okay, and we're going to inspect
1:07:08: Cyro. So we're going to open him up, get him inspected. Let's see, let's
1:07:17: go. Is this the top, or is there
1:07:23: more? No, there's more, I think. Yeah, there's more.
1:07:29: Let's see, let's open... I forgot where exactly it's placed, because there should be a component... there's user,
1:07:36: controller, hand simulator, targeting,
1:07:42: interaction, photo capture manager...
1:07:48: freaking... there we go, there's the haptic manager. Oh yeah, there is "show debug visuals". So if I enable this... this might
1:07:56: not do much. Do you see anything? Yeah, I think
1:08:02: it's because I'm in desktop, I technically don't have any. Yeah, okay, let's inspect me instead. So I'm going
1:08:10: to open myself up, let's go
1:08:18: higher... so if I go here,
1:08:26: haptic manager, show debug visuals... I don't think... you might need
1:08:31: somebody... yeah, I just don't think you have anything that triggers haptics right now. So, oh wait, wait,
1:08:36: wait, I might actually, hang on, do I have a thing? Because this usually happens when there's something that
1:08:42: can activate that system. Does that work? Does it? Oh no, this
1:08:47: is just controllers. No, just the controller, darn. Yeah, I think I would need to have a device on so this
1:08:53: kind of shows. Because usually this will show... actually,
1:09:00: what if I rebuild it... actually, no, there we go, I kind of forced it to happen. So I
1:09:07: actually see... oh right, because this actually makes it happen for other users. So this
1:09:14: component, what it does is it looks at the other users and
1:09:19: injects haptic triggers into their avatars, and you can kind of
1:09:24: see how they're sized. Oh no, it has nothing to do with that one, because I don't think
1:09:31: you can see them; they should just be injected locally on my end. Yeah, they're not, I don't see any. And
1:09:38: there are some properties, but I don't think we have any way to persist these right now. However, there's a
1:09:48: way... Because they're asking a bunch of questions, let me actually check: can the haptics be externally disabled? If not, can
1:09:54: the avatar haptic source manager override other haptics? There's something called
1:10:00: haptic volume active zones, but disabling "active" doesn't actually do anything, is that unintended? And do different types of sensations mean
1:10:06: anything to bHaptics? I'd say... so let's just go over this, because you should be able to say that
1:10:13: you don't want the haptics to be injected for certain parts of the body on your avatar. So I'm going to,
1:10:21: let's click this... I'm going to open up Cyro and I'm going to add a component to
1:10:28: you. Let's see, I think it's under Input,
1:10:37: Haptics... there's a haptic sampler, haptic source manager... yeah, there
1:10:43: we go, avatar haptic source manager. It's been a while since I actually worked with this system, so I'm
1:10:49: not... there's a haptic point mapper, filters,
1:10:55: controller haptic point mapper... so some of these are mappers for receiving haptics, and some
1:11:02: are there to set you up. Hmm, I might need to look into the
1:11:09: system, because it's been a while since I worked with it, so I'm piecing it back together. Head... what does this one
1:11:20: do? Source... no, that's... oh wait, that's a different one.
1:11:28: Hmm, it might be under Common instead. Let's
1:11:34: see, do I have
1:11:41: haptics... pose... I'm actually not sure, I'd just have to
1:11:49: dig through this, because it's been a while since I worked with the system. Even though I wrote it, I don't
1:11:55: remember how it works; I'd have to check the source code.
1:12:02: Let me see if I can figure this
1:12:09: out. Yeah, I don't think... maybe a device... no, we don't have
1:12:15: any devices. Yeah, actually I'm not sure, because this
1:12:22: one, I don't know what this one does. Yeah, I'll have to do some research
1:12:30: for this. It's a thing where, when I'm in VR,
1:12:36: it's not that easy to check the source code, so I don't think I can answer those questions right
1:12:42: now. The haptic source manager, the way I
1:12:48: remember it roughly working, is you should be able to place haptic
1:12:54: triggers on the avatar and then say "I'm overriding these", and then when it generates the
1:13:00: mapping, you should be able to say "don't autogenerate the chest, don't
1:13:06: autogenerate the arms" and so on, and it would suppress them. You probably
1:13:11: cannot control the component that's on the root, because that gets injected, but you can control how it
1:13:18: behaves. There's one question I can
1:13:24: answer: do the different types of sensations mean anything to bHaptics? I noticed that, say, force and vibration
1:13:30: feel a little different on the controller haptics, but I wasn't sure if that was bHaptics-specific. Yeah, they generally have
1:13:35: different implementations for each, so if it's just a
1:13:42: vibration, then it'll try to do something different so it feels different. One of the reasons why those are
1:13:47: there is because, in case there are different haptic
1:13:53: devices, for example ones that can simulate hot and cold, they can actually, instead of vibrations, do
1:13:59: hot and cold; but if you don't have those, it still does something to indicate something is
1:14:05: there. I think, for example, for pain, bHaptics will do sort of a pulse thing, like a
1:14:11: heartbeat, and that was kind of inspired by Half-Life: Alyx, you know how when you're low on health your controllers
1:14:17: kind of pulse like that, because that's kind of what happens
1:14:22: when you're in pain in the game. But sorry, I've done most of this from the top of my head; I'd
1:14:28: have to check the source code, which means I probably need to figure out a good way to maybe
1:14:34: just have it open and have my keyboard with me so I can check things out.
1:14:40: But yeah, that's all the questions we got in Discord. Try
1:14:47: asking maybe in the next one, and I'll figure it out; the problem is this one popped up
1:14:54: really late, so I didn't know it was coming. So those were all the
1:15:01: questions from the Discord, which means we can move back to
1:15:08: the... back to the questions
1:15:14: from Twitch. Nikun is asking: what is the current progress of de-integrating Resonite
1:15:20: from Unity? What optimization updates should we expect, and what are you working on now? So right now PhotonDust
1:15:26: is essentially getting finalized; we're running the last
1:15:31: phase of pre-release testing, which should be relatively short. Essentially what it is, is removing the old particle
1:15:38: system completely, and then the
1:15:45: conversion essentially becomes automatic, which means anything you made gets automatically converted to PhotonDust.
1:15:51: Once that is done, the next part is the audio engine, so I'm going to work
1:15:57: on making a custom audio engine. That should go much faster than the
1:16:03: particle system did, because there's a lot less complexity to it, so
1:16:08: hopefully that one shouldn't take as long. Once the audio system is done,
1:16:14: it's essentially going to be the last major system that's still intertwined with
1:16:21: Unity in a way that makes it hard to separate. After that I'm going to be
1:16:27: reworking how FrooxEngine actually communicates with Unity, and then it's going to get pulled into its own
1:16:32: process, and that pulling into its own process is when we're going to get the major performance boost, because
1:16:38: we'll be able to run with the .NET 9 runtime. So it's pretty much getting
1:16:43: close, it's getting there. Right now we mainly just need to finish PhotonDust, delete the old particle
1:16:49: system, which is already done on the branch, then do the audio engine, and then it's pretty much going to be
1:16:56: the main work on actually pulling it apart, which is going to take a bit, because the mechanism for
1:17:01: communication has to be reworked and there are going to be some bits there, but it's going to be the
1:17:08: actual splitting. So it's getting close, and
1:17:14: I'm kind of excited for it. The next question is from GrandUK:
1:17:20: are there plans for audio and video processing, so that more full-featured productions can take place in Resonite, like compositing a video camera
1:17:27: into another camera view? Yes, I would definitely love Resonite to
1:17:32: be more of a production tool, because our overall goal is
1:17:38: having the social layer as a sort of foundation, a basis, but you can do a
1:17:45: lot on top of it. If you want to, you can just hang out and socialize, that's perfectly valid, but you can also
1:17:51: collaborate with other people, you can work with them, and you can already kind of do that,
1:17:57: building worlds in here. But the more functionality we add, the more you can use Resonite to build and produce things
1:18:03: for outside of Resonite. So one of the things I'm excited for, for example, is
1:18:08: the audio DSP, because the audio DSP will let you build a virtual
1:18:14: music and audio production studio, and you can collaborate. We could be here,
1:18:19: just building together, and he could be bringing in some sound effects and plopping them in, and we're passing them through filters, and
1:18:26: maybe somebody integrates... we have a keyboard, so you can sample some things
1:18:32: and feed them into the system and it makes sound, and you just make sounds and music,
1:18:38: and if you want to save your progress you literally just save the world. Or maybe you build a
1:18:44: really cool filter, you package it into a custom node, and you can share it with the community, and other people can
1:18:49: use it in their own productions. So there's going to be a lot of great synergy
1:18:56: with stuff like that. The other feature that I feel is going to help with that is the timeline, because the timeline,
1:19:03: and I went into this deeper in previous Resonances, there's a video covering it in depth, but
1:19:08: for example, with the timeline, the way it helps with audio production is we can plop sound
1:19:14: effects on it. You have your timeline, and say you have a
1:19:19: sound here, and you can plop it here, or maybe you cut it,
1:19:26: so you cut a piece out of it, or maybe you have another sound here and you composite them together, maybe you duplicate this one
1:19:33: so you can play it multiple times, or stretch it, or do whatever you want. You can place
1:19:39: all kinds of things on the timeline, and you say "I'm going to render this out into a new sound", so there are going to be
1:19:46: tools that do that, working over the timeline, and you get a new sound effect, and then you export it and use it
1:19:53: wherever. Or maybe, with the timeline, the other way it can work with the audio
1:19:58: side is you use it to sample things. So we have, for example, these
1:20:06: keys, and maybe this and this, and then, for example, for each row we have a thing that's generating
1:20:13: audio, and maybe this one goes into this one, this one goes into this one, and you mix it together, and
1:20:21: you use the timeline as a sequencer for audio production.
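Here is a small sketch of the "render the timeline out into a new sound" idea: each clip sits at a start time, and the mixdown just adds every clip's samples into one output buffer at the right offset. The types are invented for the example and are not the actual timeline system.

```csharp
using System.Collections.Generic;

record TimelineClip(float[] Samples, double StartSeconds);

static class TimelineMixdownSketch
{
    public static float[] Render(IEnumerable<TimelineClip> clips, int sampleRate, double lengthSeconds)
    {
        var output = new float[(int)(lengthSeconds * sampleRate)];
        foreach (var clip in clips)
        {
            int offset = (int)(clip.StartSeconds * sampleRate);
            // Add the clip's samples into the output, clamped to the output length.
            for (int i = 0; i < clip.Samples.Length && offset + i < output.Length; i++)
                output[offset + i] += clip.Samples[i];
        }
        return output;   // the "new sound" you could then export
    }
}
```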
1:20:28: And the goal is to essentially make the timeline a sort of building block, a base that lots of other systems can then be built around,
1:20:36: and it's going to serve as a glue that glues those systems together, and that way you
1:20:43: get lots of really cool synergies between all the functionality in Resonite. Same thing with video
1:20:50: processing. Video processing is a little bit trickier, because we would need to integrate some libraries for really efficient video decoding, but
1:20:58: it's something I would want us to have at some point in time, because, same as with audio, with the video timeline you can
1:21:05: just place video clips, and maybe this gets composited somewhere. Like, you have a
1:21:11: video clip here... because right now, with the video player, we don't have much control; we cannot say "decode
1:21:18: this exact video frame". If you have your timeline and you're
1:21:25: moving it around, if it's here, we need this exact video frame to be
1:21:31: output, and we don't have that level of control right now, so we would need something more robust for
1:21:37: that, and that will take some time. But if you have that functionality, then we can take this and
1:21:43: maybe you actually have a thing like... you know, it
1:21:49: used to be like that in the old movies, they would actually have mattes, where,
1:21:57: for example, there's some kind of terrain and you literally put this in
1:22:03: front of the camera, and imagine this is some pretty terrain or something, you put
1:22:09: this in front of the camera and you film like this. You're essentially doing real-world compositing by just
1:22:15: layering things, and maybe you could have another thing that's in front of your camera over
1:22:22: here, and maybe this one, and there's something that's moving this in front of the camera, and if
1:22:29: you look at it from a different viewpoint, if I switch this, it's
1:22:37: literally just things that are being moved in front of the camera that are
1:22:42: compositing this image. So what we could
1:22:48: do is make tools where it's easy to set up a
1:22:55: contraption where... oh, I pulled a thing, we can make it... I keep pulling things,
1:23:02: just grab this, like this, oh, there we go. We could make a
1:23:08: thing where we have a video on the feed, maybe there's a texture in
1:23:13: the world, and this video actually goes into it, and then we have
1:23:21: a camera here that's looking at it, and then you have
1:23:27: something behind, and maybe you have another layer over here, and this is another track, and maybe you composite it,
1:23:33: you put a filter in here in between, so you can sort of
1:23:38: physically composite things together in virtual reality, and you can also collaborate with other people, and
1:23:44: all sorts of other cool stuff. So I think there are a lot of really exciting workflows that can be unlocked with
1:23:52: these features, and I would love for Resonite to be something that can be used for
1:23:59: collaborative productions, where you use it as your virtual studio, and maybe you
1:24:05: make videos with it that are not even relevant to Resonite, you just use it as a tool, which gives
1:24:11: you real-time collaboration over the internet, gives you a sort of embodiment in the virtual world, and
1:24:18: gives more physicality to that whole production and editing process, where you're in the
1:24:24: world and you're doing these things in a way it actually used to be done, or similar, where they
1:24:33: used to literally put things in front of the camera to composite multiple effects together,
1:24:40: and now it's all digital, so you kind of lose that physical analogy. But with VR you
1:24:45: can bring that physical analogy back while keeping the benefits of the digital side. So yes,
1:24:53: I would definitely love for these things to become more of a thing. The audio
1:24:58: one is probably going to happen way sooner than the video, because audio is easier to process; for video we
1:25:04: need those more advanced
1:25:09: libraries. The next question is from Nikun: what is the lower limit for how many audio sources can be active before
1:25:16: there are issues, and what issues might we encounter? I mean, that mostly depends on the hardware, and also
1:25:23: on what you do with the audio effects, because if we have an audio effect that is, for example,
1:25:28: spatialized, that will take more CPU time than one that is not, because we have to spend CPU cycles doing the
1:25:35: spatialization. So there's not a specific number;
1:25:42: it's going to depend on the computer. It's a similar thing to, if you've got a beefy computer, you can
1:25:48: run way more complex worlds, you can have more geometry in the world before you start suffering a lot,
1:25:55: compared to somebody running on a low-end machine. So generally for these questions there's not a specific number; the number
1:26:01: is going to depend on the hardware, so then it becomes a question of what's the lowest-end hardware that somebody might be
1:26:07: running, and then we can maybe measure stuff on that, and then we also get into
1:26:13: the other problem, which is we have to benchmark this stuff, so we have to see,
1:26:18: we have to see how it runs, because right now
1:26:26: I can't really answer it without measuring it. Usually the issue you encounter, if the
1:26:33: audio engine is stalled, is that the audio essentially starts popping, because what happens when you're
1:26:39: rendering audio is, if you're at 44.1 kHz, you essentially need
1:26:44: to compute 44,100 samples every second, and it
1:26:51: usually happens in small chunks, maybe you're computing 1,024 samples at a time, or maybe
1:26:58: 2,048. So what happens is... I don't know exactly how it works out,
1:27:04: I think it's something like... let me do the math again: what is
1:27:10: 44,100 divided by 1,024?
1:27:16: 44,100 divided by 1,024 is about 43.07, so that's about 43
1:27:23: blocks per second, which works out to roughly 23
1:27:29: milliseconds per block. So say every 23 milliseconds you need to compute 23 milliseconds of
1:27:35: audio, which means it cannot take more than 23 milliseconds to compute, because then you're too late. So what
1:27:44: the system essentially needs to do is compute those 23 milliseconds of audio in less than 23
1:27:50: milliseconds. So it's a question of how much you can compute before that stops being true, because if you don't
1:27:56: compute it in time, you miss it, and now you have no audio to play to the user, which means essentially you're
1:28:03: playing audio and suddenly you get a burst of silence, and maybe then it runs late again and you need to
1:28:10: compute another block and get another burst of silence, and it just starts popping, and it doesn't sound good.
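The budget math above, written out as code for clarity (the numbers here are just the 44.1 kHz / 1,024-sample example from the discussion):

```csharp
using System;

const int SampleRate = 44_100;   // samples per second
const int BlockSize = 1_024;     // samples computed per block

double blocksPerSecond = (double)SampleRate / BlockSize;    // ~43.07 blocks/s
double blockMilliseconds = 1000.0 * BlockSize / SampleRate; // ~23.2 ms of audio per block

// Each ~23 ms block must be computed in less than ~23 ms, or playback pops.
Console.WriteLine($"{blocksPerSecond:F2} blocks/s, {blockMilliseconds:F1} ms per block");
```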
1:28:17: As a fun fact, I believe, if you were to compute one second of audio
1:28:24: at 44.1 kHz, I'm pretty sure it comes out to
1:28:31: roughly 176 kilobytes per channel in floating point. Yeah, I
1:28:37: mean, that's not much memory; it's usually the processing that costs. If you think about images,
1:28:43: if you calculate how much data is in every single frame, you sometimes get to gigabytes per second.
1:28:51: Yeah, so audio is generally nothing, generally that stuff is fast, but the problem is usually you have
1:28:57: more than one audio source, and the more you stack, the more you have to compute for every single sample, and
1:29:05: that's where it slows down. Because if you have, say, two audio
1:29:11: sources, you have to essentially mix them together, and what you do is, at every point, you take
1:29:18: the samples and you just add them together, add them together, add them together, and you do that in
1:29:23: a loop, so it's fast. Now, if you think about it, that's one
1:29:32: addition for each sample, but if we have, say, 32 of these, now I have 31
1:29:38: additions, and maybe you're doing more complex calculations; maybe for each one you're actually doing the spatialization, which adds
1:29:45: complexity. And the more you add, the more calculations you're doing in the
1:29:51: same unit of time, and eventually... essentially, the more you have,
1:29:58: the longer it takes, and longer, and longer, and at some point you're going to
1:30:04: hit a threshold where you're asking the system to compute
1:30:11: 23 milliseconds of audio and it needs more than 23 milliseconds to do it, because you just added so much, and then
1:30:18: it doesn't keep up and it just starts popping.
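A minimal sketch of that mixing loop, to show where the per-source cost comes from: one addition per source per sample, plus whatever per-source DSP (spatialization, filtering) you do on top, all inside the same per-block time budget.

```csharp
static class MixSketch
{
    // Mixes N mono source blocks into one output block by summing samples.
    public static float[] MixBlock(float[][] sourceBlocks, int blockSize)
    {
        var mix = new float[blockSize];
        foreach (var source in sourceBlocks)       // 32 sources means 32x this inner work
        {
            for (int i = 0; i < blockSize; i++)
                mix[i] += source[i];               // plus any per-source spatialization here
        }
        return mix;
    }
}
```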
1:30:25: The next question, from GrandUK: is there work or ideas already for Molecule, more specifically anything not in the initial GitHub issue for it? I
1:30:32: don't know what's in the original GitHub issue, so this one's kind of hard to answer. I would say it's
1:30:37: generally the stuff that's in the GitHub issue. The main point of Molecule is that it's
1:30:43: sort of a versioning system that we can use both for Resonite itself, so we can build our own distribution, but it can also be used
1:30:49: for components: if you build stuff, if you build items, you can use it to version them; and,
1:30:56: say, once we have libraries for ProtoFlux or WebAssembly, you
1:31:02: can use it to version them as well, with all the dependencies and stuff like that. So
1:31:09: the core ideas of it should be in
1:31:15: the GitHub issue, and if something's not there, it's probably not a very core idea. So I don't
1:31:20: know, this one's a little bit
1:31:27: hard to answer, especially without going back and reading through the issue.
1:31:33: Also, for Noon: Molecule is our planned versioning system, so it can manage the
1:31:38: builds of Resonite and also other items, and we can
1:31:44: essentially have more
1:31:50: control, because right now, when we publish a build on Steam, it sometimes doesn't want to update for people. It's also hard to
1:31:56: have multiple branches for testing things, because the team is working on stuff and it
1:32:01: would help if it was easy to access that build,
1:32:07: and for the community to run multiple things in
1:32:13: parallel. We cannot really do that with Steam, because we have just one release branch, and if we want multiple releases we
1:32:19: need to make more build scripts, which makes things hard to manage. So this is just going to give us a little
1:32:25: more control. Plus, you'll be able to easily switch, say, "I want to switch to this build",
1:32:31: this older build, to test something, because sometimes, when something breaks and
1:32:38: somebody makes a bug report, we're like, did this break recently, is this new? And people don't know, and
1:32:44: it's hard to say, go to this build and check if it's still broken on that one. So it can help us
1:32:50: figure some bugs out. So there are a lot of things it's going to allow us to
1:32:55: do. Also, thank you, It's L, for the subscription with Prime; that was probably a while back, it's just
1:33:01: taking time to get through the questions, but thank you very much for the subscription. The next question is from
1:33:09: Rabot: I understand that once particles and audio are separated from Unity, the .NET migration will begin. How challenging
1:33:16: do you think the migration to .NET will be, and how long might it take, or is it possible that it will simply be a matter of switching the runtime to .NET? So I
1:33:24: can tell you straight up, it's not as simple as switching the runtime
1:33:30: to .NET. The big part is essentially... making FrooxEngine
1:33:38: run on .NET is very
1:33:44: easy, because that's pretty much what the headless is; the headless is pretty much almost all of
1:33:49: the parts of FrooxEngine just running without the graphical output. The challenging part is the
1:33:56: communication with Unity, because we do need it to render
1:34:01: stuff out. So, you know, we have Froox-
1:34:08: Engine... oh, where did it go? That was weird,
1:34:14: where did it jump? Oh, I moved. Okay, yeah, you moved. I bumped the joystick, I was like, why did I
1:34:20: just suddenly jump to the right? So we have FrooxEngine, and then you have an IPC
1:34:28: mechanism, and the IPC communicates with Unity over
1:34:39: there. So this is probably the most complicated part: making it
1:34:46: communicate efficiently, and making sure these two stay synchronized, and this keeps feeding data in, and
1:34:52: this also keeps saying "the frame is ready, here's stuff for the next frame", and the communication is very efficient. We're
1:34:59: going to be using mainly shared memory for sharing the bulk of the data, because what shared memory is, is literally a
1:35:05: piece of memory that both have access to as if it was their own memory. So
1:35:12: this one has its own memory, and this one has its own memory, and this piece is shared, which
1:35:18: makes it very easy to exchange data, and then we just need to send tiny messages communicating when things happen,
1:35:24: and those might also happen over this, where it just puts a piece of data there, so maybe it's all going to be
1:35:30: shared memory; that's not decided yet, but we'll see.
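A minimal sketch of the shared-memory idea in C#, using the standard MemoryMappedFile API (named maps like this are Windows-only, and the real engine will use its own mechanism and memory layout; the map name and header format here are invented): one process writes a block of data into the mapped region, the other reads it, and only a tiny "it's ready" message needs to go over the IPC channel.

```csharp
using System.IO.MemoryMappedFiles;

static class SharedMemorySketch
{
    const string MapName = "froox-unity-demo";   // invented name for the example
    const long Capacity = 1024 * 1024;           // 1 MB shared region

    // Writer side: put a float array into shared memory with a small header.
    public static void Produce(float[] frameData)
    {
        using var map = MemoryMappedFile.CreateOrOpen(MapName, Capacity);
        using var view = map.CreateViewAccessor();
        view.Write(0, frameData.Length);                          // header: sample count
        view.WriteArray(sizeof(int), frameData, 0, frameData.Length);
        // ...then signal "data ready" over a small IPC message or pipe.
    }

    // Reader side: read the header and then the payload back out.
    public static float[] Consume()
    {
        using var map = MemoryMappedFile.OpenExisting(MapName);
        using var view = map.CreateViewAccessor();
        int count = view.ReadInt32(0);
        var data = new float[count];
        view.ReadArray(sizeof(int), data, 0, count);
        return data;
    }
}
```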
1:35:38: But for the actual splitting, FrooxEngine right now still has a bunch of
1:35:44: ties in how it communicates, so the biggest, longest part is probably going to be
1:35:51: unifying all of that into this interface. Once the communication is
1:35:58: streamlined, it becomes much easier to just split this up and run it
1:36:06: in separate processes, and the hardest part of that is going to be making it
1:36:11: efficient. I don't know how long it will take exactly; that's hard to estimate before actually starting to work on it, because usually with these
1:36:20: things you discover a bunch of the problems and issues and so on once you actually start working
1:36:25: on it, so I don't really want to make an estimate at this time. But yeah, it's going
1:36:33: to be... I don't think it's going to be a super hard
1:36:38: challenge; the hardest part was getting everything to the point where we can do it.
1:36:44: It started with the type system, then doing
1:36:51: PhotonDust, and the next one's going to be the audio system, which I think is going to be the simpler part, and then it's the
1:36:57: splitting, which is going to be figuring out all these mechanisms. There are also going to be some mechanisms
1:37:03: for the back-and-forth communication between them, and making that really efficient, reworking
1:37:08: it in an efficient manner. So hopefully that
1:37:14: answers the question. Someone's asking "that works?", I don't know what that's in reference
1:37:20: to. Um,
1:37:25: Rabot again: this is a repost because I forgot to include a question mark. I'm really looking forward to the Sauce rendering engine,
1:37:32: however I rarely get a chance to check its progress on the devlog and the Void Anchor Discord server; I'm curious about which
1:37:37: phase it's currently in and when the transition is expected to take place. So there's not a specific timeline for
1:37:43: it right now. You might want to ask in Geenz's office hours as
1:37:50: well, but the thing I know he's been working on most recently is actually
1:37:55: consolidating shaders on the Unity side for FrooxEngine, so they're going to be
1:38:01: much easier to port over to Sauce. He's also been working on some other bits of
1:38:06: Sauce, but I don't have the most up-to-date information on that
1:38:13: one. The J Forge: "I'm going to play some
1:38:20: Resonite to socialize and talk to people," right? Yes, that's one of the favorite memes. Yeah, you can do that, that's
1:38:28: the point, you can do both. The way I like to look at Resonite is,
1:38:34: I feel a lot of people put the software
1:38:42: into this box: it's social VR, you go there to socialize. The way I like to look at
1:38:48: Resonite is that it's a layer, it's a realm, kind of like the
1:38:54: real world. Because in the real world, when you think about it,
1:38:59: you can navigate, you can go into rooms, you can go into the city, you can move around, and you can talk
1:39:04: to people and socialize, and it's something you just take for granted. In the virtual world,
1:39:09: that's not something that is given for granted, but it should be, and that's kind of what Resonite tries to
1:39:15: be: it tries to provide a reality you can access, where you can talk with people and do stuff
1:39:20: together, and then the stuff that you can do, that's where it gets interesting. Same as in the
1:39:27: real world, you can just go visit somebody, just hang out, vibe, whatever, watch some videos
1:39:32: together, socialize, you can do that. Or you can meet somebody, say, in a hackerspace, and
1:39:38: maybe you're doing some hardware together, doing some engineering and things, or maybe you meet with
1:39:44: some people who are artists and you paint together, maybe have a class. You can
1:39:51: do so many different activities in the world, and you don't really think about
1:39:57: the fact that all these activities are built on our ability to communicate, our
1:40:04: ability to move in the world, which are sort of there, they're
1:40:09: always there, and we don't think about them. But in the virtual world they're
1:40:17: not, and Resonite is trying to make it so that's a thing that's always there, always something you
1:40:23: don't even have to think about; you can exist in a world that's fully synchronized, and whatever you do stays
1:40:30: in sync with people, and you don't have to think about it, it's just how that reality works, and
1:40:36: then it becomes more about what you do together with other people, what you do on your own;
1:40:42: you pick whatever activities you want to do, the same way you do in real
1:40:49: life. And that's just how I like to look at things.
1:40:55: Mr. Squarep is asking: are there any plans for a Quest build post-split between Unity and
1:41:01: Resonite? Probably not immediately; that's probably not going to happen
1:41:06: until we actually switch to Sauce, because there might be a
1:41:12: number of other optimizations... well, once the split happens, we'll very likely evaluate how much it
1:41:19: helps, how much it improves things. It might be a possibility, but
1:41:27: we kind of need to look at things and ask what's the best path forward, because doing
1:41:32: the multiprocess architecture also might be more difficult on Quest, so that might be a
1:41:39: hurdle. It's something we'll evaluate once the split happens, but there
1:41:47: isn't a plan to do it immediately after. If it becomes easy to do, then maybe; sometimes
1:41:53: when we do these things, once they happen, we're like, well, this is now much easier to do,
1:41:59: maybe we want to prioritize this. But it requires
1:42:06: evaluation to happen, so right now I'd say maybe not, maybe after Sauce, because that's also going to make the
1:42:13: rendering more efficient, which might be needed for Quest. But we'll
1:42:21: see. The J Forge: with all the different features, do you think it'll ever become an operating system, like Emacs? I
1:42:27: mean, an operating system is actually another way I like to look at it, because
1:42:33: when you have an operating system, it's sort of a common interface where multiple different apps and things can
1:42:40: coexist with each other and communicate, and you have your core mechanisms, like
1:42:46: windows, where an application is in a window, you can drag it, move it around, arrange it, and you have stuff like the clipboard
1:42:53: and things like that. Similarly, Resonite provides a bunch of things that are provided to everything, and you can
1:42:59: build stuff around it, on top of it, that can exist in that environment. So
1:43:04: I would say yes, and I would definitely like that too.
1:43:11: The next one... I'm also starting to speed through this a little bit, because we've got a bunch, all these
1:43:17: three... we have about 15
1:43:23: minutes left, so at this point it's possible we might not get to a question.
1:43:28: If you want, you can still ask it, but we might not get to it at this point. If we don't
1:43:37: get to it, remember, we now open a thread where you can ask questions
1:43:42: in advance for the next week, so you can also put it there. But we have a question from Our
1:43:49: Boy: there are many upcoming features that I have personally been really excited about and that I know have been talked about for a long time, like the new physics
1:43:55: system, workshop, hard permissions, but those feel like they're going on years of waiting. I've seen multiple times people
1:44:00: told they should wait until hard permissions are implemented to make permission-related systems, but it has been a long wait. I was wondering how
1:44:06: you prioritize features to work on, and whether any of those above have any updates? So usually we look...
1:44:13: I mean, there are going to be lots of plans, because Resonite is a long-term project, which means there are a lot of
1:44:20: things we want to do, and it'll take a while to get to some of them. But
1:44:26: that's kind of the point of the project: it's a very
1:44:33: long-term project and we're building a lot of things, and there are only so many things we can do at a time.
1:44:38: A lot of these things are kind of like milestones, but
1:44:45: one nice way to think about it is almost like having a skill tree: say you're here, and there are
1:44:50: all these things, and maybe there's something here...
1:44:56: let me actually move this a little bit... you're here, and for example these things need this thing, and this needs this thing, and this
1:45:02: needs this thing, and this needs this thing, and you're here and you're deciding which way you want to go.
1:45:08: And maybe we're like, okay, this is the most important thing that would help the project the most at this time, so we're
1:45:14: going to develop this, and then we're like, okay, this also helps this, but we need to do this, so we also do this,
1:45:21: and now we can do this. And now that we've explored this part, we're like, okay, at this point we have this,
1:45:27: this is helping us a lot; at the current state, with
1:45:32: the community, with the company, and with how everything else is going, we think this is the most
1:45:38: important bit, so we start developing here, and then maybe here, and then we go a little bit here, and
1:45:43: it goes even further... I'm actually making it...
1:45:49: oh boy, there we go... and it goes even further, and
1:45:56: maybe we decide, okay, we're not going to go all the way, we're going to pause and we're going to instead
1:46:01: develop this, and then this, and maybe at a later time we return back here, because this is good
1:46:09: enough. So it's almost like there are so many things to do, and at
1:46:15: every point when we're deciding what to prioritize, we ask,
1:46:21: what does Resonite need the most right now, what is the most important
1:46:29: thing? And that's a hard question to answer sometimes, but
1:46:35: there are a lot of things that go into it. There's actually a big post on our GitHub called "How we prioritize",
1:46:43: and it goes into a lot of the signals and thought that go into prioritizing
1:46:49: things. But for major features, it's about what will bring us
1:46:56: the most support, and what's going to help the most with other features, because,
1:47:02: for example, with the upgrade to .NET 9, we'll be unblocked on so many things, because it makes stuff like
1:47:09: rigid bodies way easier, since we can upgrade to a newer physics library and get the benefits of that; it lets us
1:47:16: clean up a lot of code with modern mechanisms, it makes so many features much
1:47:22: easier, so it makes sense to prioritize it as the thing to do. Because
1:47:30: if you implement certain things, they make
1:47:35: other things easier, and that can contribute to those things being prioritized. And specifically, now, with
1:47:43: performance: for the longest time I've felt that performance is probably one of the biggest blockers
1:47:50: for a lot of people staying on Resonite, and we need more people in the community, we need more support,
1:47:57: and a while back we did the survey and asked people what's preventing them from playing Resonite more,
1:48:04: and the overwhelming majority of people said it's the performance. That's
1:48:09: why that got prioritized, why we said this is going to be our major focus for
1:48:15: a while: make it way better, because that can help the community, that can help the future
1:48:22: development, and just make the software better. And once that's done we'll do a similar thing,
1:48:28: we'll ask what's the biggest thing. But also, in that process, we still make smaller things;
1:48:35: performance being the major focus doesn't mean we stop doing the smaller things, they just get
1:48:40: sprinkled around it. Sometimes there's other stuff that comes
1:48:45: up that we need to deal with; for example, if the servers are on fire, no more performance work for now
1:48:52: until that is resolved, that needs to be fixed. Same with
1:48:57: why we prioritized, for example, Stripe, which we had being worked on in parallel: it's because
1:49:05: we calculated that every day we don't have Stripe, we lose around
1:49:10: $50 to $60, every single day. So the longer it takes, the more money we lose
1:49:16: that we could have potentially had, that we could put into other things like hiring more people or maybe more marketing to bring
1:49:22: more support. But overall, prioritization is a complicated
1:49:30: thing: you have to weigh so many things and figure out what's the most important
1:49:36: one right now, what's going to help the project the most, what's going to bring us the most support, what's going to help
1:49:41: the community the most. You weigh all of these and then you make your decision, and maybe sometimes you
1:49:48: would have made a different decision or something, but there are so many
1:49:55: variables that it becomes difficult, because you don't quite see
1:50:01: where some things will go, you don't know how people are going to react to some things, so you might
1:50:07: have some expectations, and people also want different things; some people don't care about
1:50:13: the performance and they want, say, IK, and ultimately the decision comes down to
1:50:20: what the majority of people want, so we have to go with that, because that's
1:50:25: one of the things that's blocking this platform from growing
1:50:31: more. But yeah. One thing we're probably going to
1:50:36: prioritize, actually, that's changed a bit, is Molecule, because it's one of those things that I feel
1:50:42: people don't super care about, because it's not something super user-facing, but we as developers really
1:50:48: need it, because there have been so many cases where we've lost hours of time
1:50:54: dealing with issues that would be solved by having
1:50:59: it. So sometimes we end up prioritizing things because they
1:51:06: really help speed up the rest of the development, which means we can actually
1:51:11: do the other things faster than we would have otherwise, but it means we have to spend time on this
1:51:17: thing for a bit. But yeah, I think that
1:51:22: answers the question, and I do recommend reading the "How we prioritize" post on GitHub, because it goes into a
1:51:28: lot of detail on this. Check the Fox is asking: I know you can't say in advance when you'll get to
1:51:35: work on it, but how large is the scope of ProtoFlux collection support? I ran into a few situations last week where I
1:51:41: needed to write horribly inefficient code because I couldn't store collection data. It's the second-highest-upvoted GitHub
1:51:46: issue, but I'm not actually sure how large of a task it is compared to other stuff like PhotonDust. Sorry for the double post, forgot
1:51:52: the question mark. I think, actually, out of all the things,
1:51:57: ProtoFlux collections are not that big, because most of the stuff should
1:52:03: already be there; it's mostly just adding a lot of the...
1:52:09: like, the main thing is,
1:52:15: there are a few things, because one thing that I feel is going to make collections a little more complicated is we might want to have some
1:52:22: mechanisms to restrict how large you can make collections, because we don't want
1:52:27: somebody to just make a loop and just fill your memory. So from the technical
1:52:33: side it should be fine; it mostly just needs the localness tracking, but there are other mechanisms that
1:52:39: can be built on top of it. Then it's adding a bunch of nodes for working with them, iterating over them,
1:52:46: but that's relatively trivial. The biggest part would
1:52:51: be some sort of system to track how much data you're allocating and put some limits
1:52:57: and checks on that, and I think that's going to be the biggest part of it.
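A small sketch of the "put a budget on it" idea: a list wrapper that refuses to grow past an element budget, so a runaway loop can't just fill memory. The type and the behavior on hitting the limit are invented for the example, not the actual planned design.

```csharp
using System.Collections.Generic;

public sealed class BudgetedList<T>
{
    private readonly List<T> _items = new();
    private readonly int _maxElements;

    public BudgetedList(int maxElements) => _maxElements = maxElements;

    // Returns false instead of growing once the budget is hit.
    public bool TryAdd(T item)
    {
        if (_items.Count >= _maxElements)
            return false;
        _items.Add(item);
        return true;
    }

    public int Count => _items.Count;
    public T this[int index] => _items[index];
}
```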
1:53:04: prioritizing because that's going to help creators a lot um I would defin
1:53:09: like love to a because like there's so many other features is also blocking and I do feel if it's a feature we have
1:53:15: that's going to you know help with a lot of the content that people are building and then in turns also you know helps us
1:53:21: helps the platform because people now build a lot more kind of complex content that were not able to build before
1:53:27: The next question is from rabits: I
1:53:36: know you're a genius programmer duck, but do even genius ducks like
1:53:41: you receive support from AI tools such as ChatGPT or GitHub Copilot during development, and if you don't use them,
1:53:47: what is the reason? So, I don't really use GitHub Copilot; I kind of find it more
1:53:54: annoying than helpful. Sometimes I do use ChatGPT.
1:54:00: I mostly don't use it for code. I have used it for code when I needed,
1:54:05: what's it called, boilerplate code. So for example, you
1:54:12: know, I asked GPT: can you write C# code
1:54:18: that initializes a hashset with all the C# keywords? And it just put it together for me. Or if I need to
1:54:24: edit a bigger block of code in some predictable way, I'll ask it to do that.
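For a sense of the kind of boilerplate being described, a generated snippet might look roughly like the sketch below. It is not the code written on stream, and the keyword list is only a representative subset of the language specification.

```csharp
using System;
using System.Collections.Generic;

static class CSharpKeywords
{
    // Lookup set of C# reserved keywords (subset; the spec also defines
    // contextual keywords such as "async" and "await" that are omitted here).
    public static readonly HashSet<string> All = new(StringComparer.Ordinal)
    {
        "abstract", "as", "base", "bool", "break", "byte", "case", "catch",
        "char", "checked", "class", "const", "continue", "decimal", "default",
        "delegate", "do", "double", "else", "enum", "event", "explicit",
        "extern", "false", "finally", "fixed", "float", "for", "foreach",
        "goto", "if", "implicit", "in", "int", "interface", "internal", "is",
        "lock", "long", "namespace", "new", "null", "object", "operator",
        "out", "override", "params", "private", "protected", "public",
        "readonly", "ref", "return", "sbyte", "sealed", "short", "sizeof",
        "static", "string", "struct", "switch", "this", "throw", "true",
        "try", "typeof", "uint", "ulong", "unsafe", "ushort", "using",
        "virtual", "void", "volatile", "while"
    };

    public static bool IsKeyword(string token) => All.Contains(token);
}
```

This is exactly the sort of tedious, easily verified listing where a generator saves typing and a quick read-through is enough to confirm correctness.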
1:54:31: I don't like using it for making new code that requires more
1:54:38: complex reasoning, because usually when it generates something, I
1:54:45: comb through it and ask myself, does this make sense, does this code actually work?
1:54:51: I have trouble trusting it, because I've
1:54:57: asked it about a bunch of things that I know a bit about, and sometimes it gives good answers, but sometimes it just
1:55:03: gives completely bogus answers, and it's very confident about them,
1:55:08: and I'm like, I can't trust this; there's something like a 50% chance it's just going to give me something wrong, and I
1:55:15: don't want to put it in code without checking it, and if I spend that much time checking it, then
1:55:22: it doesn't really help me anyway. So for something like boilerplate, that's fine, because it's
1:55:29: simple enough, there's nothing really complex there, but for complex
1:55:35: code I don't really use it. I did ask it once to make a collection with probabilistic
1:55:41: sampling support, and it just made code that made no sense at all. It was
1:55:46: making internal queues and adding things to them and then removing them for no reason. It was
1:55:54: weird, so I was like, I don't know.
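For context, "a collection with probabilistic sampling support" generally just means items stored alongside weights and drawn in proportion to them, with no internal queues involved. A minimal sketch of that idea in C# follows; the WeightedBag name and API are made up for illustration and are not from Resonite or the stream.

```csharp
using System;
using System.Collections.Generic;

public class WeightedBag<T>
{
    private readonly List<(T Item, double Weight)> _entries = new();
    private readonly Random _random = new();
    private double _totalWeight;

    // Add an item with a positive weight; heavier items are sampled more often.
    public void Add(T item, double weight)
    {
        if (weight <= 0) throw new ArgumentOutOfRangeException(nameof(weight));
        _entries.Add((item, weight));
        _totalWeight += weight;
    }

    // Draw one item with probability proportional to its weight (O(n) scan).
    public T Sample()
    {
        if (_entries.Count == 0) throw new InvalidOperationException("Bag is empty.");
        double roll = _random.NextDouble() * _totalWeight;
        foreach (var (item, weight) in _entries)
        {
            roll -= weight;
            if (roll <= 0) return item;
        }
        // Floating-point rounding can leave a tiny remainder; fall back to last.
        return _entries[^1].Item;
    }
}
```

A weighted roll over the total is the no-frills version; fancier structures (alias tables, prefix-sum trees) only matter once the collection gets large or is sampled very frequently.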
1:55:59: The other way I sometimes use it is as a starting point for doing research on things. For
1:56:05: example, I'll ask it, do you know of any good libraries for C# for doing
1:56:12: this particular thing? And sometimes it actually gives me good pointers, like, this library is a
1:56:18: thing. But it does tend to hallucinate, so sometimes it'll say, oh, there's this library, and then I Google it and it doesn't exist.
1:56:25: But that's something I can easily check: I can Google it, and if the library doesn't exist, I know it gave me a
1:56:30: bogus answer, but if it does exist, then okay, I'm going to look into this one. So it can be useful for that kind
1:56:37: of thing. I will say,
1:56:44: I would almost consider it dangerous if you're trying to learn
1:56:49: how to code, for example. Yeah, because I've asked it some C# questions, and
1:56:55: maybe this is just a C# thing, maybe it's not good at C#, but I've asked
1:57:03: ChatGPT about some simple C# things, like, what's the most
1:57:09: efficient way to iterate over this thing and do this other thing, and it
1:57:15: just keeps doing it in the most suboptimal way, like
1:57:22: it just doesn't know the no-brainer of, just do it this way. Like,
1:57:27: sometimes it'll just be dumb. Yeah, and now it's taught you the dumb way to do
1:57:33: it. There's actually been a study, I think looking at one of the latest models from ChatGPT, and they
1:57:40: found it gives you the wrong answer on average 54% of the
1:57:46: time. More than half of the time it's going to give you something wrong, which means it's hard to trust it, and
1:57:54: if I have to spend a lot of energy figuring out whether what it gave me is good or not, then
1:58:01: it doesn't help much, unless it's very easy to verify. So if
1:58:06: I can just do a quick Google search and figure out that it gave me something bogus and I can't trust it,
1:58:12: then I'll use it for those things, to get pointers. But if it's something where I have to comb
1:58:18: through it and do a lot of complex analysis figuring out whether the answer is correct or not, then
1:58:25: it's not worth it, because it doesn't really save much
1:58:30: time, and it just makes me kind of paranoid that I'm going to put in something that's going to cause
1:58:36: issues. But yeah, that's pretty much it. We're also pretty much at the
1:58:42: last minute, so I think this was the last question. With that, thank you
1:58:47: everyone for joining; I hope you enjoyed learning more
1:58:52: about Resonite. Thank you all for all the questions. Also, just a reminder, we are
1:58:59: running the last phase of PhotonDust testing; there's an announcement in our Discord, so if you can, give it a
1:59:06: try, because we'd like to merge it in this upcoming
1:59:12: week. We also launched Stripe, so if you are supporting us on Patreon, which we appreciate a lot,
1:59:19: please consider switching to the same tier on Stripe, because we
1:59:24: actually get a much bigger cut that way: Patreon on average takes about 15%, Stripe takes about
1:59:32: 5%, so we get a little more money that we can invest back into Resonite.
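(As a rough worked example using those average rates, which vary per transaction and tier: on a $10/month pledge, roughly $8.50 would reach the team via Patreon versus roughly $9.50 via Stripe, about 12% more per pledge.)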
1:59:39: And with that, everyone, thank you very much, whether you support us or not; thank you for
1:59:46: just joining the stream, asking questions, being part of the community, making cool stuff, and in general being part of the platform and helping us
1:59:53: grow. And thank you, Cyro, for being here helping and answering some of the questions too. Yeah, I'm glad I could sit here
2:00:01: and nod along and such. But yes, I hope
2:00:07: everybody's having fun working on their MMC projects. I can't wait to see what everyone
2:00:14: makes at the end of the month. Hopefully people are not getting
2:00:19: too burned out; I know, from what I've heard, some people might be, and they're
2:00:26: doing scope creep and stuff like that, but as long as you're having fun. So thank you very much for watching, and
2:00:32: we'll see you at the next one. I'm going to check if there's anybody to
2:00:38: raid. I think it's only Creator Jam that I
2:00:45: can see. Let me check if there's anyone else streaming Resonite.
2:00:50: No, just me and Creator Jam. If you like to stream, I recommend streaming
2:00:57: around this time, because you're going to get a lot of viewers from us. So we're going to send everyone to
2:01:05: Creator Jam. So, getting the raid ready... oh, I just typed
2:01:10: "gr"... Creator... Creator... Creator Jam, there we
2:01:16: go. It should be getting ready in 7 seconds. So thank you very much, and say hello
2:01:24: to Creator Jam, say hello to them for me. Bye! Did I click it? Oh, I clicked it.