The Resonance/2025-05-11/Transcript

From Resonite Wiki

This is a transcript of The Resonance from 2025 May 11.

This transcript is auto-generated from YouTube. There may be missing information or inaccuracies in it, but searchable text is generally more useful than unsearchable audio or video. Readers are strongly encouraged to verify any information against the source using the provided timestamps.

00:00: starting. Okay, it's started. Um, I'm

00:04: going to post on

00:15: [Music]

00:20: Discord. Post the other one.

00:31: There we go. Okay. Should be

00:34: live.

00:36: Hello. Hello. Can you hear us?

00:41: Fine. We got people popping

00:45: in. Hello. Hello everyone. Welcome to

00:49: the resonance.

00:51: [Music]

00:54: also open it up so I can make sure

00:56: everything's

00:58: okay. Good message.

01:01: Welcome to the Resonance. I'm taking

01:03: over. This is my co-host, Fuks. Um,

01:08: things now.

01:10: All right. I'm in I'm in charge of the

01:12: Resonance. Um, today we're going to talk

01:15: about

01:17: um why you should always monopack your

01:21: ProtoFlux. Don't monopack your ProtoFlux. Don't do

01:23: that. Please don't.

01:26: Oh yeah, we got a schnopit and it's not

01:28: from from Grand. It's from the Red Neco.

01:32: Grand, you've you've failed your schnopit

01:35: for today. Uh, this is very

01:37: disappointing.

01:38: So request

01:42: question why the schnopit is that Grand

01:44: failed the

01:48: schnopit there we go today who knows

01:52: what do I have

01:56: today anyways welcome to The Resonance

02:00: um today like I'm a little bit sick so

02:03: like

02:05: um so it's going to be probably a bit

02:07: slower um however we get through

02:13: um you it's essentially like a podcast

02:16: slash ask us anything you can ask any

02:19: question you want about Resonite anything

02:21: you know with the development anything

02:23: with the platform team like what do you

02:25: want to ask the only thing make sure

02:27: that at the end of your question or

02:31: within the question there's a question

02:32: mark that it kind of pops in our in our

02:35: thing Um

02:40: um so uh if you do that it's going to

02:43: pop on our thing. We do have a bunch of

02:45: questions from Discord as well. So like

02:48: we're going to go through those first.

02:51: Um also it's possible we might like end

02:54: this one like a little bit early

02:55: depending how things go. So that like we

02:59: should be able to get started.

03:01: Yeah. I mean, would you like me to read

03:03: some of them if you if you want to split

03:05: up the work of reading and answer any? I

03:07: can read it out.

03:11: I just need to like put one for

03:13: myself. So, Ter is asking, "How does one

03:17: resist the urge to when you have no

03:19: other solutions left for your design or

03:21: desired output? Do we wait or accept

03:25: defeat?"

03:26: So one of the things that can happen

03:29: like this there there definitely are

03:32: cases like where like there's no

03:33: alternate solutions but one thing we find

03:36: a lot of times when people do want to

03:38: use their hacking there's actually a

03:40: different approach they could use but it

03:42: requires you to sort of step back and

03:44: sort

03:45: of look at like how you're designing

03:48: things because like you might you might

03:49: have like you know designed yourself

03:50: into like a particular hole where you do

03:53: need you where the only solution for you

03:55: is to you know use workarounds if you really

03:59: want to approach C like you know do

04:00: certain

04:01: thing but um a lot of the times if you

04:06: know take a few steps back and sort of

04:09: like look at a sort of bigger picture

04:10: there might be like different

04:15: approach yeah often times it can look

04:17: like a real hammer; everything starts looking

04:20: like a nail once you start doing that

04:22: you kind of forget how to do things

04:23: normally for a little bit.

04:25: This happens a lot like when I like

04:27: design stuff for Resonite too. It's like

04:29: I'll like make particular design decisions

04:32: and then like work out how's it going

04:34: to work for this and this and some of

04:36: the decisions they end up like

04:38: essentially putting you in a corner and

04:41: that the corner is literally hard to get

04:43: out of without like some really nasty

04:45: stuff. But often times the solution is

04:48: to like you know go further back and

04:51: change the design decision so you don't

04:53: end up in the corner in the first place.

04:56: And it happens a lot with engineering

04:57: and like this is kind of no

05:00: different. Uh the next question is from

05:05: Phoenix. Phoenix is asking last stream

05:08: you talked a bit about making Web

05:10: Assembly components. How do you imagine

05:12: sync/networking between those? I

05:15: imagine things like a texture painting

05:16: tool there as the painting engine, can it

05:19: do multiplayer or is it scoped much simpler,

05:23: more stateless things, more

05:24: like pure functions? So one of the

05:27: beauties of FrooxEngine is it handles

05:29: synchronization implicitly for you and

05:32: by declaring a component you're

05:34: essentially defining a data model like

05:36: you're defining how the component is

05:39: modeled and FrooxEngine will

05:41: automatically replicate that state for

05:43: you the code for that like either

05:46: mutates that state uh or reacts to that

05:48: state. You know, it doesn't need to care

05:50: about synchronization at all. Uh because

05:53: that's sort of like, you know, already

05:54: kind of solved problem. And this is

05:56: going to follow pretty much the same

05:58: principle. is like you're going to have

05:59: your data model definition how's your

06:01: component defined what data it has and

06:04: you have your code that either mutates

06:06: the state or um you know reacts to it

06:10: like you know does c make certain things

06:12: happen when like you know it's in

06:13: certain values uh and that pretty much

06:16: like you know makes it so you pretty

06:18: much most of the time you don't have to

06:19: worry about networking at all only

06:21: reason you would worry about networking

06:23: is more for like optimizing the flow and

06:25: so on there like certain things like for

06:30: example a console emulator where

06:30: we would probably introduce new

06:31: primitives because usually you don't

06:34: want to like you know synchronize the

06:35: entire state and there's like for

06:38: console emulators there's like two ways

06:40: to approach the synchronization like

06:42: consider like the primary ones. One of them

06:45: is you just have a single user run the

06:48: emulation and then you use video

06:51: streaming to you know synchronize the

06:54: video output to other

06:56: users. So that one like you know we

06:58: essentially set it up. So only for

07:00: example the host is simulating it and

07:01: other users see video texture and you

07:03: have like something that like uh you

07:05: have like data model that specifies the

07:07: inputs from other players which can

07:09: react to that.
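
A minimal conceptual sketch of the idea described above: the component declares its data model, the engine replicates field changes, and the component's own code only mutates or reacts to state. FrooxEngine's real API is C#; the names below (SyncField, SpinningCube) are purely illustrative, not actual Resonite code.

```python
class SyncField:
    """Marks a field as part of the component's replicated data model."""
    def __set_name__(self, owner, name):
        self.name = "_" + name
    def __get__(self, obj, objtype=None):
        return getattr(obj, self.name, None)
    def __set__(self, obj, value):
        setattr(obj, self.name, value)
        obj.dirty.add(self.name)   # a real engine would ship this change to peers

class SpinningCube:
    speed = SyncField()            # the data model: fields that get replicated
    angle = SyncField()

    def __init__(self):
        self.dirty = set()
        self.speed, self.angle = 1.0, 0.0

    def update(self, dt):
        # Component code only mutates state; replication is handled elsewhere.
        self.angle = self.angle + self.speed * dt

cube = SpinningCube()
cube.update(1 / 60)
print(cube.angle, sorted(cube.dirty))
```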

07:10: The other approach is using something

07:13: called lock step synchronization and we

07:16: will probably introduce you know more

07:17: primitives for that kind of thing

07:20: where there's like primitives that

07:22: guarantee that for each step like your

07:26: update all the values are the same for

07:28: each of the users and it's kind of the

07:30: same thing you just model it you say

07:32: this needs to be lock step and is you

07:34: know the update function it takes care

07:36: of the rest for you. The kind of downside is

07:39: it does introduce latency for

07:41: everyone because it needs to make sure

07:43: that every user has every other user's

07:45: input before simulating the next step of

07:48: the

07:49: simulation. Um but pretty much generally

07:52: synchronization is handled on very sort

07:54: of like high level, like on the engine. Um, it's

07:58: one of the kind of like biggest

08:00: strengths of the engine because even

08:02: when we make components and we make

08:04: stuff like 90 99.9% of the time we don't

08:08: even think about network synchronization

08:11: it just works

08:13: DM it's really useful for like when I'm

08:17: like on my own writing mods too because

08:20: I can just be like ah set this value to

08:22: like five and I know that that value is

08:25: going to be the same for everyone

08:26: because it just that it just happens And

08:27: it's just part of the world.
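
A toy sketch of the lock-step approach described above: the simulation only advances a step once every user's input for that step has arrived, so every peer applies identical inputs and computes identical state. The class and names are illustrative, not actual Resonite primitives.

```python
class LockstepSim:
    def __init__(self, users):
        self.users = set(users)
        self.step = 0
        self.state = 0
        self.pending = {}                  # step -> {user: input value}

    def submit_input(self, user, step, value):
        self.pending.setdefault(step, {})[user] = value
        self._advance()

    def _advance(self):
        # Only run a step once inputs from all users for that step are present.
        while set(self.pending.get(self.step, {})) == self.users:
            inputs = self.pending.pop(self.step)
            # Deterministic update: same inputs, same order, same result on every peer.
            self.state += sum(inputs[u] for u in sorted(inputs))
            self.step += 1

sim = LockstepSim({"A", "B"})
sim.submit_input("A", 0, 2)                # step 0 stalls; B's input hasn't arrived
sim.submit_input("B", 0, 3)                # now step 0 runs everywhere
print(sim.step, sim.state)                 # 1 5
```

That waiting is also where the latency mentioned above comes from: nobody simulates a step until the slowest user's input for that step has arrived.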

08:31: Yes. Uh, next question is also from

08:36: Phoenix. Uh, why does opening the dev

08:40: inspector cause so much hitching? So

08:43: there's like a bunch of things. Uh, it

08:45: essentially generates lots of UI

08:47: elements uh, which is heavy against the

08:49: data model. It creates a lot of GC

08:51: pressure triggering the garbage

08:53: collector. So that kind of like causes

08:55: slowness and it's like multiple ways.

08:58: One of the things it's also uses a lot

09:00: of reflection and one of the things we

09:02: noticed is with Unity's Mono the

09:04: more reflection is used the slower it

09:07: gets. Uh it feels like it's like uh we

09:10: haven't like looked at specifically why but

09:12: my guess is it kind of builds a list

09:15: that's just going to keep expanding and

09:16: is not implemented efficiently at all.

09:19: Uh so once we do the switch to .NET 9 that's

09:23: very likely going to help a fair bit

09:24: just making you know reflection go be

09:27: faster making the code run faster making

09:29: the garbage collector handle GC pressure

09:32: much better too and we also be reworking

09:34: the UI eventually to like use more

09:36: efficient

09:39: approaches. Uh, next question is

09:45: uh from

09:48: uh

09:50: Missing.

09:52: Uh, Missing is asking, "Would you allow

09:54: creators of assets that are currently

09:56: sold on Gumroad or Jinxxy to change the

09:59: usage rights to their asset after

10:01: original purchase? The original store

10:03: page didn't mention any limitations on

10:05: their usage rights. For example, if the original

10:08: page didn't mention anything about

10:10: public avatars or explicitly allowed it

10:12: to make public avatars, and that creator now

10:14: wanted to restrict the sharing of that

10:16: asset in avatars, would you comply and change

10:19: the licensing?" So this is probably going

10:22: to depend on case by case basis because

10:24: I think like this actually depends like

10:26: what uh does the because the creator

10:30: they have like you know rights on how

10:32: their asset gets distributed. they are

10:34: the ones who set the license. But

10:36: there's also like you know so if if the

10:39: creator has a right you know we need to

10:42: kind of comply with that uh because it's

10:44: their legal rights to like you know

10:45: ensure uh the assets are not being

10:48: distributed you know

10:50: contrary you know to like how they like

10:53: licensed it. Uh they are the ones who

10:55: made it they are the ones who owns the

10:57: right you know they are the ones like

10:58: who decide you know how that goes. The

11:02: only caveat is like you know there's

11:03: cases where they did specify some rights

11:05: and say somebody bought the asset

11:08: uh with certain you know certain like

11:11: usage rights and they decided to change

11:13: them then it's like you know this more

11:15: of a legal question is can they change

11:18: it afterwards? Can they like you know

11:19: forbid the user from using it after

11:21: buying it with a different set of terms?

11:24: Uh, and then like we probably need to

11:26: like consult like you know with like

11:28: legal stuff and see like who actually

11:31: has the right there and who doesn't.

11:33: Like I don't think I'm I don't really

11:36: know the answer to that one right now.

11:39: Um so because it feels like intuitively

11:42: like if you buy it with certain like

11:45: usage rights

11:47: um you know then they kind of like go

11:49: like after you've already paid them and

11:51: then they decide to change the rights

11:54: um on the usage like something you

11:57: haven't like essentially they change to

11:59: you know the sort of, not EULA, but you

12:03: know the terms of service they change it

12:05: afterwards you've already made your

12:06: purchase that kind of makes you able or

12:09: unable to use it. I don't know how that

12:11: kind of works legally. Uh so that would

12:13: kind of depend, you know, how however

12:15: that works

12:20: out. Next question

12:23: uh is from

12:25: Golduni. What are technical difficulties

12:27: of implementing proper SVG support for

12:30: this? So it kind of depends how you want

12:33: the SVG support to be approached. Um the

12:37: simple one would be you essentially just

12:39: similar to the PDF support where you

12:41: just like render it into a texture. You

12:44: know you get asset you can zoom in you

12:45: can zoom out but like eventually it's

12:47: essentially rendered to a

12:48: texture that would be pretty simple to

12:52: add; we'd just integrate a library that turns it

12:54: into a texture and it pipes in nicely.
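
A minimal sketch of the render-it-to-a-texture approach, assuming the third-party cairosvg library (not something Resonite ships); the file names are placeholders.

```python
import cairosvg

# Rasterize the SVG at a fixed resolution. Zooming in past that resolution
# shows pixels, which is exactly the limitation of this simple approach.
cairosvg.svg2png(url="icon.svg", write_to="icon_512.png",
                 output_width=512, output_height=512)
```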

12:57: Um the fancier way to implement it is

13:02: actually convert like you know render it

13:04: in VR like like render in real time as

13:06: actual vector graphics for example

13:08: converting it to like you know geometry

13:10: and a set of shaders and if you were to do

13:13: that it would like you know behave much

13:14: better because like like it wouldn't be

13:16: a texture so like if you like look

13:19: really close it would like still be like

13:21: probably

13:22: seamless but implementing that

13:25: integrated graphics pipeline That's like

13:27: significant amount of work especially in

13:29: the VR pipeline where you can be looking

13:31: from like you know different angles and

13:35: uh and so on. The problem is the SVG is

13:38: like it has a lot of features like

13:41: there's like you know blending modes you

13:42: can like it can apply like blurring to

13:45: the effects and so on and often times

13:46: like a lot of it is designed for 2D

13:49: screen presentation not necessarily for

13:51: 3D. Uh so making that work with 3D

13:54: pipeline that would be particularly like

13:56: kind of challenging.

13:59: One thing you can actually do in the

14:00: meantime as a sort of approximation at

14:03: least for monochrome SVGs is you can

14:05: actually convert them into a uh a

14:08: multi-channel signed distance field like I did

14:09: with one of our icons here. You can get

14:12: kind of vector-like graphic rendering

14:14: with this where it's like infinitely

14:16: sharp with a really really low uh

14:18: texture resolution. So that might be

14:21: something to look into. But even that

14:23: like that has you know limitations

14:25: because like yes we essentially

14:27: converting to a texture which gives you

14:29: sharp lines but you like if there's a

14:31: very fine detail that's less than like

14:33: you know the size of pixels. Um it

14:37: doesn't work you know you'll lose the

14:39: detail. So like there like approaches

14:42: like that but like they all have like

14:43: their like limitations.

14:45: Yeah. I just in the meantime Yeah. It's

14:49: like but the thing is like you know we

14:51: probably wouldn't do like limited

14:53: implementation like that ourselves

14:55: because like it's not it's very it's

14:58: very like you know use case specific.

15:02: Yeah. Like supporting the SVG spec as a

15:05: whole, it's a lot. Yeah.

15:09: Uh next question is from uh missing.

15:12: Will there ever be an official way to equip an

15:14: avatar tool directly from inventory via

15:16: Flux or components? Yes, very likely. Um,

15:21: one of the things we want to do with the

15:22: inventory when it gets reward is make it

15:25: so you can have, you know, your main

15:26: inventory, but you can also have like,

15:27: you know, sort of quick access hot bar.

15:30: So, you can, for example, put it on

15:31: your, you know, hand like you have like

15:33: a quick facet, you know, if you're in

15:35: desktop, you can literally have like a hot bar

15:37: at the bottom where you can like put

15:39: your favorite tools and have them bound

15:40: to keys. So, there's probably going

15:43: to be some mechanisms to handle that.

15:48: Um, next question

15:51: uh is from uh so the

15:55: panic socks the panic is asking if you

15:58: had to start over, or advise someone,

16:01: knowing everything you do now, would you

16:02: still start with Unity and work towards the

16:04: splittening, or would you try to build

16:06: everything from scratch to begin with? I

16:09: would probably still use I don't know

16:11: actually if Unity I might have like used

16:13: a different different engine but I would

16:16: have definitely based it on some engine

16:19: because one of the things building

16:21: everything from scratch it takes a lot of

16:24: time

16:26: um and even as with you know as many

16:28: issues we've had with Unity stuff like

16:30: we're dealing with

16:32: um it still was kind of beneficial to go

16:36: that way because we could get you know

16:38: very early versions

16:39: you know the software um you know kind

16:43: of bootstrap it we didn't have to like

16:45: you know worry about you know some of

16:47: the kind of basics like you know

16:49: rendering and so we could focus on what

16:50: made it unique and then over time

16:53: replace you know pieces that were in

16:56: Unity with engine it happened for

16:58: example with the UI framework it

17:00: happened with a lot of others um and

17:02: what it allows us to do is you know have

17:06: a functional platform that like you has

17:09: issues, but it's functional. You can

17:11: play it. Versus if you were building

17:13: everything from scratch, it would take

17:15: that amount of time to get something

17:18: functional. And for the entire length of

17:21: the development, we would not have

17:22: something you can download and play,

17:24: which means we wouldn't be able to get

17:26: like, you know, support from the

17:27: community to continue the development,

17:30: which means like it might not happen at

17:31: all.

17:32: So the only thing I would probably say

17:35: is like you know you maybe use a

17:36: different engine. It's most of the

17:39: things I would kind of like advise is

17:40: like you know just kind of the

17:41: accumulated knowledge and knowh how on

17:44: how to approach certain things and how

17:46: to kind of prepare them because like a

17:48: lot of development like you know there

17:50: were some like dead ends there were like

17:51: some approaches that didn't work right

17:53: and like I was like okay I'm going to

17:55: replace them with something that's you

17:56: know works way better. Uh, so it would

17:59: be mostly that, but would very likely

18:02: still be using some kind of like third

18:04: party kind of software like to handle or

18:07: engine to handle the parts that were

18:09: like not the important focus at the time

18:12: to get something working. Um, so yeah,

18:17: might not be Unity like I don't remember

18:19: like what actually was available like

18:21: you know back when I started but uh

18:25: um probably would have used something

18:31: Uh, next question is from sock

18:37: defic. Uh, so defend is asking, do you

18:40: think the dynamic nature of Resonite is

18:43: a trade-off with performance versus

18:44: something like the trade where

18:46: everything goes through a compilation and

18:48: optimization step before making it into

18:50: the game? I don't think it has to be.

18:52: Uh, the only thing you have to do is

18:55: change your approach to things.

18:58: um like a lot of the stuff like with

19:00: Resonite, it's kind of designed to work

19:02: with its like dynamic nature but also be

19:04: able to optimize stuff. And one of the

19:06: cool examples is you know the asset

19:08: variant system because the asset variant

19:10: system

19:12: um when you think about uh you know for

19:15: example Unity if you import a bunch of

19:18: assets you have that like a long

19:20: processing step like where you cannot

19:22: even use the editor that's because it's

19:26: making essentially variants of assets

19:26: for the specific you know uh specific

19:29: platform that you're targeting at the

19:31: time. uh and that's you know that's kind

19:35: of the process it it prevents some of

19:36: that interactivity because like you know

19:38: you cannot literally cannot use it for

19:41: time versus on Resonite the asset varant

19:44: system was designed in a way where it

19:47: has uh it's a lot more dynamic a little

19:50: more adaptive so if you import something

19:52: the asset varant system might decide to

19:54: do just very light compression on the

19:56: assets you brought in maybe even no

19:58: compression at all as you use them but

20:01: then the compression happens in the

20:02: background and especially if you upload

20:04: it to the cloud it happens in the cloud

20:06: and the cloud will generate all the

20:08: different variants kind of on the vi and

20:10: the client as it's like you know as

20:12: people are loading those assets it'll

20:14: look okay it'll be like I would like

20:16: this version you know like that's going

20:18: to be most optimal for the platform

20:20: maybe the cloud goes I don't have this

20:22: version but I have these versions and

20:23: the client's going to be okay I'm going

20:25: to load these versions for now uh or

20:27: maybe like you know I don't like any of

20:28: these versions I'm going to load the raw

20:30: version which is less efficient

20:33: But I know that like you know probably

20:35: within an hour there's going to be an

20:36: optimal version. So it loads that one

20:38: like you know like next time and the

20:41: system is kind of designed so you know

20:45: um things are still very dynamic but

20:47: they also like you know there's like

20:48: these processes that like make optimized

20:51: versions and we can do that for lots of

20:52: other features too where um you know you

20:56: build something and then like you know

20:57: apply like there's like an optimization

20:59: path that happens

21:02: um and a lot of like stuff in the

21:04: Resonite it's sort of designed a sort of

21:06: like you

21:07: being like a add-on. So like you know

21:10: you have the base of it and like then a

21:12: lot of the editing stuff is like kind of

21:13: on top that kind of interacts with

21:16: stuff. Uh but there like you know kind

21:18: of mechanisms to work with it. So I

21:21: don't think it has to be trade-off.
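
A rough sketch of the asset-variant selection described above: the client asks for the variants it would prefer, takes the best one the cloud currently has, and falls back to the raw asset until an optimized variant shows up. The variant names are hypothetical, not Resonite's actual identifiers.

```python
def choose_texture_variant(preferred, available):
    """Pick the best variant the cloud currently offers, else fall back to raw."""
    for variant in preferred:              # e.g. ["bc7_2k", "bc7_1k", "etc2_1k"]
        if variant in available:
            return variant
    return "raw"                           # less efficient, but usable right now

# First load: only the raw upload exists, so use it and check again later.
print(choose_texture_variant(["bc7_2k", "bc7_1k"], {"raw"}))              # raw
# Later, the cloud has generated optimized variants in the background.
print(choose_texture_variant(["bc7_2k", "bc7_1k"], {"raw", "bc7_2k"}))    # bc7_2k
```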

21:22: There's definitely places where it is

21:24: trade-off right now. Uh but I think um

21:28: there's you know approaches to making

21:30: you know applying essential

21:33: optimizations there as well. You also

21:36: have like you know things like

21:39: uh ProtoFlux which actually does you

21:41: know some basic kind of compilation.

21:43: There's like stuff we can do as well to

21:45: kind of um you know amortize a little

21:49: bit kind of cost like by for example

21:51: like um doing sort of like tiered

21:53: compilation where like you only get you

21:56: know basic one like when you're like

21:58: editing stuff. So like it's maybe it

22:01: doesn't run the most optimally but like

22:03: you can just you know mess with the code

22:05: and like you know have immediate results

22:07: and once you start messing with it it

22:09: gives the system time to make heavier

22:12: optimized version and then run that

22:16: uh and it's actually approach that's

22:18: even used you know in modern JIT

22:19: compilers where they do call it a tiered

22:22: compilation where the compiler at first

22:25: it will um just in order to be agile

22:29: It's going to do like very quick

22:31: compilation, a one with like minimal

22:33: optimizations, but one that gets the

22:34: program running as fast as possible. And it looks,

22:36: okay, this code is running a lot, this

22:39: code is still running. I'm going to make

22:41: much more heavily optimized version.

22:43: Once it's finished compiling that

22:45: version, it swaps it out and now it runs

22:47: much faster. So generally a lot of stuff

22:51: in Resonite will take that kind of approach.
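
A small illustration of that tiered idea: serve a quick, minimally optimized version immediately, build a heavier optimized version in the background, and hot-swap it in once it is ready. This is a toy in Python, not how Resonite or a .NET JIT actually implements it.

```python
import threading, time

def quick_version(x):
    # Tier 0: available immediately, minimal optimization.
    return sum(i * x for i in range(1000))

def build_optimized():
    time.sleep(0.5)                          # stand-in for background compilation
    return lambda x: x * (999 * 1000 // 2)   # same result, computed in closed form

class TieredFunction:
    def __init__(self):
        self.impl = quick_version
        threading.Thread(target=self._upgrade, daemon=True).start()

    def _upgrade(self):
        self.impl = build_optimized()        # hot-swap once "compilation" finishes

    def __call__(self, x):
        return self.impl(x)

f = TieredFunction()
print(f(3))        # served by the quick tier
time.sleep(1)
print(f(3))        # same answer, now served by the optimized tier
```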

22:53: And the cool thing is like you know with

22:54: the tiered JIT compilers they can like

22:58: there are benchmarks where they can beat

23:00: native code because one of the things

23:03: they can do is they can take advantage

23:05: of you know information that's only

23:07: known as runtime like for example

23:10: knowing that a particular method is being

23:12: called say 99% of the time is being

23:15: called with a specific argument. So the

23:17: compiler can actually compile a version

23:19: when that argument is a constant and it

23:22: lets it optimize away big pieces of code

23:25: and achieve better performance versus

23:27: the native code that's compiled ahead of

23:30: time. Um that wouldn't be able to make

23:33: that optimization because it doesn't

23:34: know because it like it all happens

23:36: before the code runs. So I would even

23:40: say in some cases like you know you

23:41: might end up

23:43: like the dynamic system might end up

23:46: like outperforming like if it's like

23:47: designed right and has right mechanisms

23:49: for

23:52: optimizations. Uh the next question is

23:56: from papal

23:57: team. Uh let's

24:01: see. Papin is asking in the previous

24:04: office hours you said that ProtoFlux

24:06: already supports custom nodes. Does this

24:09: mean that you can already run custom

24:10: nodes inside the ProtoFlux VM outside of

24:12: Resonite? I'm aware that there are nested node

24:14: types in the assemblies but I thought they

24:16: were more placeholder skeletons until

24:18: custom nodes are implemented in Resonite? No.

24:20: Yeah, this nested nodes it's fully

24:22: functional. Um there's in the ProtoFlux uh

24:26: project we actually have a bunch of unit

24:27: tests that use that like nested nodes

24:29: quite deeply. So that's fully

24:32: functional. What essentially you have

24:34: like right now is you like you have sort

24:37: of something we call implicit nodes

24:39: which like it's a node that like you

24:41: know it's just implicit. It's not

24:42: connected to any others but it's using

24:44: kind of the same mechanisms as nested

24:46: nodes do. So that's all fully

24:48: implemented. Like it works within the ProtoFlux

24:50: VM. It's just not integrated with.

24:54: So I So I couldn't hack in like a a

24:56: nested node into the data model or

24:58: anything. You can, but like it's not going

25:00: to do stuff like without a lot of extra

25:03: code. You need to like you need to like

25:06: finagle it so it like does things.

25:09: Okay. Um this question is from BD_. I

25:13: know you've mentioned in the past that

25:14: the goal for splittening TM is to run

25:17: one process per session. Do you

25:18: anticipate an intermediate milestone where

25:20: Unity is broken out but all sessions run in

25:22: the same process, or are you going directly for

25:24: the end process model? So yeah we're we're

25:27: just doing two process model uh for the

25:29: first phase we're sort of like following

25:31: approach that Firefox took like uh where

25:34: the web browser they first like you know

25:36: split up like one process and then like

25:38: over time they kept like breaking into

25:40: smaller processes. So for this phase,

25:42: it's just going to be the render and

25:44: then it's going to be the rest of the

25:45: engine. Over time, we do want to split

25:48: it. So like each world, for example, is

25:50: its own process. Then even some

25:51: subsystems are going to be their own

25:53: process. Uh so it's going to happen like

25:56: you know later in time. But right now

25:57: it's just going to be two processes. Um

26:01: like the renderer is going to be one

26:02: process and the engine's going to be in

26:04: another process.

26:06: It's going to sort of like gradual

26:07: approach because that lets us get you

26:09: know the results we care about which is

26:11: the use of .NET 9 faster but also the

26:15: additional splitting is going to help

26:17: like with performance too and with uh

26:19: optimizations too by um with security by

26:24: kind of sort of splitting each session

26:25: into its own

26:27: thing.

26:29: Uh next question is

26:33: uh from

26:34: BD: post-splittening, will asset load

26:38: status as reported by loaders or

26:39: presence of data in mesh statistics

26:42: immediately and accurately reflect the Unity

26:44: load status or only the engine-side load

26:47: status? I'm hoping it doesn't break systems

26:48: that try to avoid avatars appearing

26:50: naked during mesh loading by waiting for

26:52: clothes mesh to fully load before enabling

26:54: renderers. Yeah, that shouldn't change

26:56: at all. Like the splittening

26:59: shouldn't change any sort of like you

27:00: know

27:04: behaviors. Uh next

27:06: question is from uh

27:10: Orange. Uh Orange is asking what is the

27:13: purpose of the newly added global option on

27:15: audio output? Is there a specific use

27:17: case that cannot be achieved with other

27:19: parameter combinations?

27:21: I'm actually not sure uh because there's

27:24: like a lot of changes to output. Um so

27:27: it's possible you can just achieve

27:29: global with other modifications. But

27:33: what it essentially does it sort of like

27:35: overrides everything and says like you

27:37: know this audio effect is global meaning

27:40: it has the same volume everywhere. Like

27:43: you know there's like no attenuation. Uh

27:46: one of the things with uh audio is each

27:50: source it can like you know use any sort

27:53: of audio shape whereas like right now

27:55: there's only two shapes. There's like a

27:57: sphere shape and there's like infinite

27:58: shape but we can add more like you know

28:00: like a cone or cardioid and so on. So

28:03: eventually there's going to be actually

28:04: like selection where you can pick

28:05: different shapes and global is going to

28:08: be like one of them. Uh but I know it's

28:11: mostly just kind of remainder like of

28:13: like the development process. So it

28:15: might be sort of superfluous right now.

28:18: I'd have to kind of check it. Um but it

28:21: pretty much forces the shape to be like,

28:23: you know, the global one. So it kind of

28:25: skips a lot of kind of

28:27: calculations.

28:30: Um yeah, it's definitely going to be

28:32: more efficient if you actually want

28:34: global effects because you can sort of

28:35: achieve the other things with the other

28:36: shapes, but it still might run like you

28:38: know some calculations.

28:42: But it's going to make more sense over

28:43: time,

28:45: too. Next question is from

28:49: abysmal. Um, if you think about features

28:53: such as snappables, avatar anchors,

28:55: fistbones that have had a big impact on

28:57: how alive and interactive the engine feels

29:00: that have become staples for creators in

29:02: Resonite. What other small but significant

29:04: features have you thought of and would

29:06: like to implement in the future?

29:08: Well, one of them is actually the one

29:10: that got implemented now, which is the

29:12: spatial variables because I feel it's

29:14: going to help add a lot of like, you

29:16: know, sort of spatial interactivity

29:18: between avatars and things. Uh, but it

29:21: was already like, you know, implemented.

29:23: Uh, another one that kind of comes to

29:25: mind is uh it's like extension of the

29:27: snappable system. It's more like a

29:29: wearable system where you could for

29:32: example specify, you know, some points

29:34: on the body. For example, say this is

29:36: where the head is. This is the top of

29:38: the head. This is the sides of the

29:39: heads. And then like you know uh if I

29:43: grab uh I grab a

29:46: brush. Let's

29:48: see. Where's my brushes? Oh, there's

29:53: brushes. Uh geometry

29:56: line. So say like you know you make like

29:59: a prop. One of the things is like you

30:01: know right now props are not very

30:02: interchangeable. Like for example a

30:04: pair of glasses

30:06: um and like you know if you like want to

30:08: like just take them and put them on the

30:09: avatar you need to calibrate it per

30:11: avatar for each of the props but say

30:14: like you know you have like a hat and

30:16: you say like you know this goes to the

30:18: top of the head this is side of the head

30:19: side of the head and then on the avatar

30:22: itself you know you have like an avatar

30:26: um you know like your head

30:29: snoot have

30:31: eyes it's not very good drawing but

30:34: [Music]

30:35: um but you essentially say this is the

30:38: top of the head, this is the sides of

30:40: the heads. Uh the system would then be

30:42: able to figure out okay it needs to

30:43: position like this scale like this so it

30:46: kind of fits on that avatar. Um so there

30:48: be sort of like you know an advanced

30:50: snapping system where you can

30:54: parameterize you know different parts of

30:55: the body and like you know for example

30:57: like say shirts clothes you know you say

30:59: like this is where the arm is this is

31:01: how thick it is and the system will

31:03: figure out how to place it so it

31:04: properly fits on the avatar.
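
A toy version of the fitting math sketched above: given matching anchor points on the prop and on the avatar (top of head, sides of head), derive a uniform scale from the head width and an offset that lands the prop's top-of-head anchor on the avatar's. The proposed system doesn't exist yet; this only shows the geometry of the example.

```python
import numpy as np

def fit_prop(prop_top, prop_left, prop_right, avatar_top, avatar_left, avatar_right):
    """Uniform scale from head width, offset from the top-of-head anchor."""
    prop_width = np.linalg.norm(np.asarray(prop_right) - np.asarray(prop_left))
    avatar_width = np.linalg.norm(np.asarray(avatar_right) - np.asarray(avatar_left))
    scale = avatar_width / prop_width
    # Place the prop so its scaled top-of-head anchor coincides with the avatar's.
    offset = np.asarray(avatar_top) - np.asarray(prop_top) * scale
    return scale, offset

scale, offset = fit_prop(
    prop_top=(0, 1.0, 0), prop_left=(-0.5, 0.8, 0), prop_right=(0.5, 0.8, 0),
    avatar_top=(0, 1.6, 0), avatar_left=(-0.1, 1.5, 0), avatar_right=(0.1, 1.5, 0),
)
print(scale, offset)   # 0.2 [0.  1.4 0. ]
```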

31:07: Um, I think that kind of system like

31:09: would be really cool, you know, for

31:11: people who make props because you don't

31:13: need to calibrate per avatar. You just say

31:15: this is how it fits. You put it on and

31:17: it just kind of snaps in the right

31:18: place. All that needs to happen is that

31:21: the uh the prop is calibrated to that

31:24: system and then the avatar is calibrated

31:26: and then the avatar will just work with

31:28: any properly calibrated prop without

31:30: having to like you know calibrate it,

31:32: calibrate every single combination of

31:34: like you know avatar and prop. So, I

31:36: think that that one would be probably

31:38: one of the most impactful systems

31:41: uh that would

31:42: like allow, you know, people to just

31:44: like exchange clothes and just find

31:46: something, plop it on your avatar, make

31:48: it like very easy to like wear

31:52: things. Uh, next question

31:57: uh is from Mars

31:59: Mani. I don't I never know how to

32:01: pronounce the name. Sorry.

32:04: Barsman uh with Geenz having left the team,

32:07: what will happen to PBS consolidation?

32:09: Have any plans of moving to Sauce

32:11: changed in any way? Um it kind of has

32:14: changed a bit. We we're kind of like uh

32:18: like two things with the PBS

32:20: consolidation right now. Uh we don't

32:22: know for sure yet because the project

32:25: has been kind of in limbo for a long

32:28: time. Um, and we kind of need to like

32:31: look at it like see what state is it in

32:34: uh, and you know kind of decide based on

32:36: that. So I don't have an answer for that

32:39: right now. Um, for like plans to moving

32:42: sauce we're sort of like looking at

32:43: multiple options. Uh, we're like looking

32:46: you know at like different like renders

32:47: like from open source projects that we

32:49: could take. We might end up like you

32:51: know going with be. We might like if

32:54: Gins ends up like implementing sauce in

32:56: time for you know when we want to switch

32:58: to rendering engine uh we might end up

33:01: like adopting that work since uh he

33:03: plans for that to be open source but

33:06: depending you know what state that is in

33:08: uh we might go you know in a different

33:10: direction. We might either end up like

33:12: you know integrating directly against

33:14: like you know making our own integration

33:16: say for example with the be renderer

33:18: because it has a lot of features we

33:19: want. we were looking at stuff you know

33:21: like Godot and they're like see like can

33:23: we extract it from it and use it but

33:25: there's a lot of stuff that we need

33:27: that's currently missing uh there's

33:29: other like you know frameworks uh we

33:31: were looking at like for example the

33:32: stride engine which is like an engine

33:35: implemented fully in C# that also has

33:37: like a cluster rendering but there's a

33:39: lot of kind of variables there so uh

33:42: there's like

33:44: options and we'll kind of you know we'll

33:48: we'll keep kind of looking into those

33:50: and the closer we kind of get like you

33:52: know to actually want to switch to the

33:53: the rendering engine

33:56: uh we'll make like you know our decision

34:01: there. Next question is from uh Red. Oh,

34:07: this one's a big one.

34:09: Uh Red is asking uh why is one gray? Uh

34:13: what's the grave drive knowledge type?

34:15: How does it differ from typical drive

34:16: CPF idea? Why is it like the same as

34:18: piece just placed there? Real force

34:21: being weird. Um I'm not actually sure.

34:25: Usually gray is like means like it's

34:26: invalid or something is wrong. But also

34:29: my [ __ ] is a bug. So I don't really

34:32: know in this case. I don't know how like

34:34: how you're driving it or what is it

34:37: doing. It also looks like you might be

34:40: like like you're using mods in that one.

34:42: So like it might be messing with

34:43: something. So it's kind of hard to say.

34:49: Uh and we got last one from Discord. Bit

34:53: karak is asking do you think memory

34:56: sharing or just splitting it into

34:57: two processes in general could cause

34:59: issues with Proton? I don't think it's

35:02: really going to cause too many issues

35:03: like there's uh the memory sharing

35:05: mechanisms they exist in some form you

35:07: know in different processes and we'll

35:09: probably, like you know, use

35:12: Linux every day. So we'll have like you

35:15: know testing of these processes as we

35:16: kind of go.

35:19: Yep. Generally like the the thing with

35:22: like memory mapped files and whatnot

35:24: like memory shares on Linux is that uh

35:28: there's no such thing really as like a

35:29: named memory share. So like on Windows

35:32: you can declare like a memory map file

35:34: with like a name and then access that

35:36: file from a different process with the

35:38: same name. But on Linux, you actually

35:40: have to back it with a file first and

35:42: then anything that wants to access that

35:44: memory share will uh have to access the

35:47: file which will then just access the

35:50: memory share.
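
A small sketch of the file-backed approach described above, using Python's mmap module on Linux; the path is a placeholder and has nothing to do with Resonite's actual IPC layout. (On Windows, mmap.mmap(-1, size, tagname="...") gives the named mapping mentioned above, without an on-disk backing file.)

```python
import mmap, os

PATH = "/tmp/ipc_demo_share"        # placeholder backing file
SIZE = 4096

# Producer: create and size the backing file, then map it into memory.
fd = os.open(PATH, os.O_CREAT | os.O_RDWR)
os.ftruncate(fd, SIZE)
shared = mmap.mmap(fd, SIZE)
shared[:5] = b"hello"

# Consumer (normally a separate process): open the same file and map it too.
fd2 = os.open(PATH, os.O_RDWR)
view = mmap.mmap(fd2, SIZE)
print(view[:5])                     # b'hello'
```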

35:52: Yeah. Like mechanisms to make it work

35:55: and it's also like you know um web

35:58: browsers kind of use this mechanism a

36:00: lot. So um in certain like

36:03: approaches. So with this that's all the

36:06: questions from uh from Discord which

36:10: means we can get to the questions from

36:13: uh questions from the

36:15: chat. Uh so and I've got quite a bit so

36:19: it might take a little bit to go through

36:21: them. Uh there's a few so let's get

36:24: started on those. Let's see how we

36:29: time. So Red Neco is asking it. I said

36:34: my schnopit is that grand wasn't here to

36:36: make his [ __ ]

36:38: Yeah, that's my [ __ ] That's

36:41: the

36:43: u asking, can I ask a question? Yes, you

36:46: can. And this is how you ask

36:50: it. Uh, next one is from Reno: who is sick?

36:54: I'm kind of sick. I think I kind of

36:55: caught a bug from like some people and

36:57: plus it's like everything's blooming

36:59: right now and like everybody's cutting

37:01: grass which is also triggering my

37:03: allergies. So I'm like um I literally

37:06: just downed well not down but like I had

37:09: like a DayQuil like before the stream

37:12: because my throat was really

37:16: hurting. Uh next question is from Epigen.

37:19: When do you think work on the splittening

37:21: will occur? Uh I am expecting to

37:24: patch some issues with audio system

37:25: before then. Yeah, some probably

37:27: sometime next week. Um I'll probably

37:30: spend the first like day or two like

37:32: patching up some audio issues and adding

37:34: some things but pretty much the idea you

37:37: know to get started on the splittening as

37:39: soon as I

37:40: can. Uh those like same question

37:43: skipping that one. Um unskll wolf do you

37:46: or other developers use client mods?

37:49: Also, could any popular or very useful

37:52: mods be integrated officially? And so I

37:55: don't really use any myself. I'm like

37:57: more of a I'm like if I need something I

38:01: will add it like you know like I I want

38:03: if something annoys me enough I want to

38:05: be like okay I'm prioritizing that now

38:07: and implementing it. So that's kind of

38:10: my approach. I know some like do like s

38:12: make makes mods. So, yes, I use I use

38:16: mods myself. Um, mostly mostly most

38:19: mostly mods that like I make, but I do

38:22: use a couple that I know have have been

38:24: like vetted to be safe.

38:27: There's like one thing is like the mods

38:29: can potentially be like malicious. So,

38:31: like we like if you want to use them

38:33: officially, if anybody essentially we

38:35: have a process if anybody on the team

38:36: wants to use mods, we need our

38:39: engineering team to vet them. make sure

38:41: like you know there's nothing like

38:42: malicious

38:43: uh which adds a little bit like you know

38:45: kind of burden too. Um for integrating

38:48: officially very unlikely because usually

38:51: the way mods approach things is uh is

38:54: very hacky and for anything that

38:58: essentially goes into the engine like we

39:02: one of the big concern is a long-term

39:04: maintainability and of times mods don't

39:06: really care about that super much. So

39:08: like that kind of code like you know

39:10: wouldn't get accepted like because once

39:12: it's in we have to maintain it

39:15: essentially forever uh which means you

39:18: know like we need to make sure it works

39:20: with all the rest of the code. We need

39:21: to make sure it keeps working when we

39:23: change other parts of the code and that

39:26: can like you know add a lot of burden.

39:28: So usually for things to be implemented

39:30: officially like it needs to be designed

39:32: to be very longterm.

39:37: Oh, next question. Oh, this is the same

39:39: one from one about being sick. And next

39:43: question is from kite

39:46: vt and they're asking. So, I'm on quest

39:49: two and I was curious on if the controls

39:52: when editing works. I try moving around.

39:54: I can only move with one analog stick.

39:56: The rotation is mixed with the move

39:58: around one stick because the two

40:00: opposite analog stick will be smooth out

40:02: or fixed. So, this one's a tricky issue

40:05: because like with the Quest, you only

40:07: have the joysticks and the joystick can

40:10: either be used for moving or it can be

40:12: used for the tool because the if the

40:14: tool uses it, then if you if you try to

40:17: use the tool and it also used for

40:19: moving, then as you use the tool, you'll

40:21: be moving around. Uh, and vice versa,

40:23: you know, if you try to move around,

40:25: you'll be using the tool. So, it cannot

40:27: be used for both at the same time. So,

40:31: it's kind of tricky, you know, like what

40:34: do we do at that point because like only

40:36: one can gets to use them. For some

40:39: tools, we might move it like, you know,

40:40: to different buttons if it doesn't need

40:42: a joystick. So, that will kind of help.

40:44: But if the tool does need a joystick,

40:46: then there's not really super much you

40:49: can do. Uh the only thing um with

40:54: tools is if you you know if you take the

40:57: tool you can like put it down very

40:59: easily if you just you know click on

41:00: your if you click on your tool shelf. So

41:03: there's a way to quickly like you know

41:04: park it that frees up the thing and then

41:06: you click on it again and equip it

41:07: again. So you can kind of swap it back

41:09: and forth.

41:12: But essentially it's a it's an issue

41:14: with like you know having a limited

41:15: amount of

41:19: controls. Uh the chip forge is asking uh

41:23: uh will the IPC between FrooxEngine and the renderer

41:26: be documented so people can implement their own

41:28: heads? uh we don't have any plans for

41:31: like public facing documentation for

41:33: this like it's an internal kind of

41:34: mechanism and doing the documentation

41:37: would like add a lot of like kind of

41:39: overhead that would like you know make

41:41: the implementation take longer um I

41:44: might like share some of the design

41:46: notes and such like in devlog and so on

41:48: but like we don't have any plans for

41:50: official documentation for that it's not

41:52: really being designed as like you know

41:53: something that's community sort of

41:56: extensible it's like an internal part of

41:58: our engine So if you do want to like

42:01: implement you know own stuff like you're

42:03: mostly on your own for

42:06: that. Uh next question is from Juan V.

42:10: Uh my question is do you guys know if

42:13: there's a fix for Resonite crashing if

42:16: hand tracking is enabled on some systems like

42:18: Quest Pro? If there's no fix yet since

42:20: this has been happening for a long while

42:21: I suppose that there will be fix in the

42:23: future but not soon yet. Oh my headset

42:26: is also dying. Uh, this one's kind of

42:28: tricky because this actually is a bug in

42:31: Unity and it happens in native code that

42:34: like we don't have access to.

42:38: Um, so it is like one of those things

42:42: where we can't really fix

42:45: it and if we wanted like you know fixed

42:48: we would have to change like version of

42:50: Unity but we also can't change version

42:52: of Unity because that will break a lot

42:54: of other things. Um, so there's not

42:57: really clear solution to it right now,

43:00: unfortunately. Uh, and it's kind of like

43:03: a situation, but it's kind of what it

43:05: is.

43:07: Um, so yeah, if anybody like wants to

43:12: like look into it too, like in the

43:13: GitHub issue, I actually posted like you

43:15: know the stack trace of things. So, you

43:18: know, people are to investigate and if

43:19: you find something that could help that

43:21: would be helpful. But so far we haven't

43:23: had like anyone kind of come up with

43:25: anything.

43:27: Um the only approach is you know is like

43:30: we do move away from Unity because that

43:33: like you know get rid of the bug that

43:35: way or maybe we could get like you know

43:38: help like from Valve like from the from

43:40: their system like to like not because

43:42: one it it kind of happens when you do

43:45: switch to hand tracking there's

43:48: a it essentially like says like oh those

43:51: those controllers you're using they got

43:53: disconnected now there's new controllers

43:55: that connected

43:56: and Unity tries to fetch like uh

43:58: features for those controllers and it

44:00: does like invalid access exception and

44:02: that crashes the process. Um but we also

44:06: don't know like you know something that

44:08: Valve would be willing to do is like you

44:09: know make improvements to

44:11: how their kind of mechanism kind of

44:14: works or implementing that to kind of

44:17: work around this.

44:21: Uh, next question is from

44:24: Nuki. Uh, Nikki's asking, "So, I'm

44:27: having optimization issue in Avatar

44:29: Station. The mirrors all work fine for

44:31: me in VR, even at higher resolution VR.

44:33: In underwater range, however, the same

44:35: mirrors overwhelm my 1050 Ti, even at low

44:39: resolution leading to glitchy headset

44:41: view. Is there any idea what's going on

44:43: there? Any advice on how I might

44:45: optimize them beyond just far clip

44:47: override?" um you need to optimize

44:49: what's in your world essentially. Uh

44:51: mirrors like they're like one of those

44:53: features like you know where the mirror

44:56: itself doesn't have that much of a cost

45:00: but it costs what it renders and what it

45:02: renders depends on the world. So we have

45:05: like you know just a few basic things in

45:06: your world then the mirror is going to

45:08: be pretty fast. But if we have like you

45:10: know say thousands or dozens of

45:12: thousands of objects and lots of lights

45:14: in the world then the mirror is going to

45:16: get heavy because when it renders it has

45:18: to render all of those all over again

45:21: and actually two times you know for each

45:23: eye. Uh so the cost of mirror depends a

45:26: lot you know on the specific world and

45:28: stuff that's in it. One of the things

45:30: that can also affect it are the lights.

45:33: Um try disabling you know the per pixel

45:35: lighting because uh for most things like

45:38: you know in Resonite we'll use the deferred

45:41: pipeline which makes it efficient to

45:43: render lots of lights. However, mirrors

45:46: they don't work with Unity's deferred

45:48: because they use something called like a

45:50: skewed matrices essentially matrix that

45:53: like you know it's not like flat with

45:54: the camera but it's sort of like skewed

45:57: and that forcibly uses the forward

45:59: rendering pipeline which makes lights

46:02: way less efficient. So you have lots of

46:05: lights per pixel lights that's going to

46:07: make them run you know slower too. So

46:09: reducing those or disabling the per

46:11: pixel lights on the mirror can

46:18: help.

46:21: Uh next question is from uh Nick's

46:26: asking to what degree does the new audio

46:27: system utilize multi-threading? At what

46:30: point does this process kick in? Any other advice

46:31: for optimizing? So um it's multi-

46:35: threading is not really something that

46:36: kicks in. It just works. It's there. Uh,

46:39: audio is

46:41: very multi-threaded like very heavily

46:44: multi-threaded. I even implemented like

46:46: a a concurrent buffer mixing that like

46:49: is lock free mechanism to mix multiple

46:51: buffers. So you actually have like you

46:53: know the system when it starts it finds

46:55: out all the work it needs to do and then

46:57: like you know it runs a bunch of jobs

46:58: that run in parallel to like you know

47:01: render the audio apply the effects.

47:02: There's like some parts like the

47:05: binaural audio which uh that it's under

47:07: lock because unfortunately

47:10: um the steam audio is not thread safe

47:13: but pretty much every else every other

47:15: part of the process is

47:19: um and it's uh

47:23: uh it's um what's the word? is

47:28: essentially like you know we have

47:29: multiple threads all computing the audio

47:31: and like as the threads are finished

47:32: they will be mixing in their results

47:34: together and the mechanism is also very

47:37: efficient. It's like where it uses the

47:38: lock-free buffer mixing. Um the audio system is

47:42: not really something you

47:43: optimize. Uh it essentially has like a

47:46: fixed limit on the number of audio

47:48: sources.

47:49: So like you know it needs to be kept

47:52: like under certain time for every user

47:55: because if it takes too long you start

47:56: end up like hearing pops that is going

47:58: to you know pop. Um so there's like a

48:01: limit set uh and essentially you can't

48:04: really go beyond that limit. So like

48:05: it's not really something you optimize

48:07: on your end. That's something we do on

48:10: our

48:13: end.
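
A toy version of the pattern described above: each audio source is rendered into its own buffer by a pool of jobs, and the finished buffers are summed into the output block. The real system uses a lock-free concurrent mix and a fixed time budget; this sketch only shows the shape of the work, with made-up names.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

FRAMES = 1024                      # samples per mix block (illustrative)

def render_source(seed):
    """Stand-in for rendering one audio source into its own buffer."""
    rng = np.random.default_rng(seed)
    return rng.standard_normal(FRAMES).astype(np.float32) * 0.1

def mix_block(num_sources):
    # Each job renders independently; results are mixed as they come back,
    # so no shared output buffer is touched while sources are being rendered.
    out = np.zeros(FRAMES, dtype=np.float32)
    with ThreadPoolExecutor() as pool:
        for rendered in pool.map(render_source, range(num_sources)):
            out += rendered
    return out

print(mix_block(8)[:4])
```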

48:15: Uh asking probably said this already.

48:17: What world is this? Looks like one of the

48:19: islands of the cloud home. Uh this is

48:21: Dante's

48:23: uh sky island camp. Uh I think it was an

48:27: MMC entry. Technically a disqualified

48:29: one because like Dante is a developer. Um

48:33: but they just wanted to make like a nice

48:35: little submission. So it's very close to

48:37: small world. Um I kind of picked it

48:40: because like we're kind of driving like

48:42: I'm kind of not feeling super well. So I

48:44: was just I need something simple.

48:50: Uh noon is asking someone told me that

48:53: uh when interact with collider it

48:56: interacts with the ceramic colliders

48:57: too. Meaning intersecting mesh colliders

49:00: interact with each other when you

49:03: interact with one of them. Is this

49:04: correct? How to fix

49:06: optimize? I don't know what you mean

49:08: exactly by that. Like colliders will

49:12: colliders generally interact with nearby

49:14: colliders.

49:16: Uh, and it's not like you know when you

49:18: interact with them like you know they

49:20: just do like if like that's kind of

49:22: their point is like you know figure out

49:24: like are there any colliders nearby that

49:26: are kind of intersecting or touching. So

49:28: they'll be doing that.

49:31: Uh I don't

49:34: really I don't really quite understand

49:36: the question too. So like it's like

49:38: beyond that like if if you want to

49:40: optimize that like you know just have

49:41: fewer colliders or colliders that don't

49:43: interact with each

49:48: other.

49:52: Uh next question

49:54: is from I noticed some material types

49:58: don't have access to all the

49:59: transparency types or other aspects. For

50:02: example, dual-sided doesn't have

50:03: additive or multiply transparency, only alpha

50:06: clip and alpha blend. Why is that? Would it

50:08: be to implement those especially with a

50:10: split from Unity happening? So it's the

50:14: reason is pretty much because like it

50:15: takes time to implement it and we

50:17: haven't taken the time to do it for

50:19: those. Um but it kind of depend on the

50:22: specific shader. You're always free to

50:24: make the request and we'll kind of you

50:25: know look at it. Uh the split shouldn't

50:28: affect it much, but like the eventual

50:30: replacement of the renderer could. Uh but it

50:33: depends on the specific like you know

50:35: shader. Like we need to like look at the

50:37: code and see like you know how much this

50:39: would complicate adding these modes

50:46: in.

50:48: Uh, missing nerd is asking, "Do you look

50:51: at particular mods when trying to find new

50:53: features or are you fine with some

50:55: things staying external mods forever?"

50:58: Um, I didn't really look at mods super

51:00: much for features. It's more like, you

51:01: know, looking at a GitHub request, which

51:03: can be based on mods when people like,

51:05: you know, just for some function that's

51:07: popular. Uh, it kind of depends on the

51:09: thing. like I usually don't look like

51:11: you know if it's a mod or no more like

51:13: you know does this feature make sense uh

51:15: for us to implement like natively to

51:18: Resonite and is it aligned you know with

51:20: what we're doing so there like you know

51:22: some things where uh we will make

51:25: official implementations for stuff that

51:27: mods are doing but also like you know

51:29: some things where we're like like this

51:30: is not something we really do you know

51:33: just use the mod and it's kind of you

51:35: know like one of the things with the

51:37: mods is you know you're able to like

51:39: customize the experience. So, it kind of

51:40: fits with what you want. Uh, and it's

51:44: not, you know, just stuff that's ours.

51:47: Uh, you know, because why else like, you

51:51: know, have

51:52: mods? It lets you kind of like, you

51:55: know, kind of change the experience in a

51:57: way you want even if it like is not the

51:59: direction we want to go.

52:06: Uh, next question is from Nikon. Spatial

52:09: variables sound like a great tool for

52:10: culling and other optimizations. Is

52:13: there a way

52:15: uh is there a way you intend to

52:18: encourage world builders to do that? Um,

52:22: we don't really have like we don't I

52:24: don't think we have like any plans to

52:25: like encourage people to do that. Like I

52:27: think people just going to do it. Maybe

52:29: like some you should. Yeah.

52:33: It's there's like going to be some stuff

52:35: I know like our team is going to like

52:37: utilize it. So like maybe people can use

52:38: that as an example but like you know

52:40: it's just a feature. It's it's there.

52:41: You can use it. So we don't want to make

52:44: you use it.

52:49: Yeah. Next question is from media uh

52:52: posting will you bypass Unity for VR

52:54: input entirely in reference to handing

52:57: crash bug? Uh probably not. That's very

52:59: integrated with the VR SDK. So since

53:02: Unity is still going to be doing the VR

53:03: rendering that's going to be handling

53:06: the input

53:07: too. I don't think like we can kind of

53:09: read out the inputs like you know

53:11: without

53:13: that. Uh next question is from Rickon.

53:16: Uh is there a material maybe a way to

53:19: use fog uh which just applies a filter

53:22: that gets darker or more saturated the

53:24: further behind it things

53:26: are? you probably have to use fog

53:28: because that's like, you know, depth

53:30: based. Um, there's also like the depth

53:32: material. So, you could maybe try

53:34: experiment with, like, the blend modes on

53:37: that. You know, maybe it could like that

53:40: could look like a neat effect, but I

53:42: don't know from the top of my

53:50: head.

53:52: Uh, next questions from Ultra White

53:54: Gamer. Uh, with the Splittening coming soon,

53:57: Resonite having issues with Unity, and the

53:59: latest information related to Geenz, would

54:00: the future uh would

54:03: the would in the future because of IPC

54:06: implementation would it be relatively

54:08: easy to switch to something like the

54:09: latest LTS of Unity? No, that doesn't

54:12: really help with that. Uh the main issue

54:15: with switching to like you know latest

54:17: Unity is essentially all of the shaders

54:19: will break. a lot of rendering features

54:21: will break, like, you know, deferred

54:23: rendering doesn't work with VR like a

54:25: lot of the rendering features are just

54:27: straight up broken or don't work or

54:29: don't have equivalents

54:32: um which makes it I know hard for us to

54:34: switch so it the IPC doesn't really help

54:39: it unfortunately uh we'll probably still

54:42: be like switching to like you know a

54:43: custom rendering engine that's like

54:45: going to be based on something else um

54:47: we're very unlikely to be like investing

54:49: much time into switching like into new

54:51: version of Unity.

54:53: There's no amount of work we can do to make

54:54: Unity not suck. Yeah.

54:59: Uh complain we are trying to get like

55:01: that but I can't figure it out. I know

55:03: how to gel but doesn't have much gel. So

55:07: uh yeah I would suggest trying like the

55:10: things I suggested like I don't know

55:13: from the top of my head if it works or

55:15: no.

55:18: Uh, and last question is there right now

55:21: is from the Red Neko. The Red Neko is

55:24: asking, I was curious, is there any

55:25: chance we could get some more material

55:28: property blocks? There's two that would

55:30: be hugely helpful over the color and primary

55:32: texture scale/offset, or was that more of

55:34: an instinct before I left the team? Uh, we

55:36: can always make the request for it like

55:38: you know see if there's like a lot of

55:39: interest in that we can look into it. I

55:43: mean it's kind of like it's not too

55:44: different you know from any other

55:45: feature request like if if there's like

55:48: significant interest in something you

55:49: know we will um uh it's more likely you

55:53: know we'll pick it

55:56: up and like it doesn't like doesn't mean

56:01: like you know we cannot work on these

56:02: things like majority of the shaders that

56:04: are in here um were actually implemented

56:06: by me and I did a lot of the graphical

56:08: features so like uh other people you

56:11: know can do that kind of work as

56:15: Well, but that like actually that's all

56:17: the questions we have. Uh we've been

56:20: going for an hour right now.

56:23: Yeah. How are you feeling? Question.

56:30: Uh I'm kind of okay. Um I'm not like

56:35: much like I don't think I'll be able to

56:36: just blubber about stuff. So like if the

56:38: questions dry up like we might end up

56:41: early. Uh so we want to like you know

56:43: ask more questions like ask them now. Uh

56:46: otherwise we might end up like you know

56:47: ending this earlier than

56:49: usual. Uh Moonbase is asking not

56:52: necessarily asking for a release date

56:54: but is there or timeline on the split

56:56: ending update? Um I think at the very

57:01: least few

57:02: weeks I would hope like to have it done

57:04: you know within a month or two but like

57:06: you know you never know some of the

57:10: complexities that come with these things

57:13: so we'll

57:18: see. Uh next question is from Ace on

57:23: Twitch.

57:27: Uh as on Twitch is asking, "Do you guys

57:29: plan on hiring someone else to do the

57:31: rendering when you have money or now

57:32: that Gins is gone?" Uh we don't have any

57:35: specific plans for that right now. I

57:38: mean, possibly at some point, but

57:42: um we'll

57:44: see. Alo Gr is asking, "Uh you're too

57:49: late."

57:51: It's a very late

57:55: note. Okay, so that's all the questions

57:58: for now. So, we're gonna give it a bit,

58:00: but if there's no more, I might end up I

58:03: might end up

58:05: early. I'm kind of congested and

58:11: yeah, my throat hurts a little bit more

58:13: now, too. Yeah, I'm also sick. Hi.

58:17: Is everyone sick?

58:21: I went outside. That's how I got sick.

58:30: Well, I don't like being sick. So am

58:37: I. Oh god. As on Twitch is asking,

58:43: uh, question for B. What is your guys

58:46: favorite features on your

58:48: avatar?

58:51: Uh, where's the needles?

58:54: Maybe it's I like using

58:58: this. I'm dead

59:03: now.

59:10: Um, I also have the

59:15: needles. I fix it because the hair is

59:17: supposed to disappear.

59:19: I'd say uh I'd say child lock.

59:27: I'm a leader.

59:29: I have a child lock on my avatar so that

59:31: people can't take my glasses off if

59:33: they're being annoying with it.

59:38: Oh, next questions for BD_. What's the

59:42: opposite of Schnopet called? What is

59:44: your

59:45: anti?

59:49: Um I mean I guess that would be like you

59:52: know what makes you if you already

59:53: disintegrated what makes you

59:55: reintegrate. I would say it's like when

59:57: people, when people, like, when people

01:00:00: get what features are for and, like, they use

01:00:04: it, like, in precisely the way that this

01:00:07: was meant to be used to make something

01:00:09: really cool and powerful. And I'm like,

01:00:10: oh, that's the thing, you're using

01:00:13: it right.

01:00:15: That's uh that would be like my

01:00:16: intentional bit I guess also fruit I

01:00:18: will I will reintegrate with fruit

01:00:23: when people understand that the thing

01:00:25: that we're making is still in progress

01:00:27: and no, the straw man

01:00:31: that you're making, that we are actually

01:00:32: happy that your files are gone or

01:00:34: something when, like, an issue happens, is

01:00:36: uh, not a

01:00:40: real thing. Like, one of the things I like

01:00:42: is

01:00:44: um, like, I oftentimes, like, approach

01:00:46: things with the attitude of, like,

01:00:48: you know, things won't be perfect, there are

01:00:50: going to be issues, there's going to be

01:00:51: bugs, there are going to be things that don't

01:00:53: get done and so on, but that's the kind

01:00:55: of nature of things. We just try to, like,

01:00:58: do you know the best we can and make

01:01:01: something that's kind of fun and

01:01:03: like and like sometimes it feels like

01:01:06: people get very like you know black and

01:01:08: white about things you know it's either

01:01:11: like it's either perfect or it's like

01:01:14: you know it's useless. Um and most of

01:01:17: the time there's like a lot more kind of

01:01:19: nuance to things and there's a lot more

01:01:21: kind of complexities and things are you

01:01:24: know pretty much nothing in the world is

01:01:27: ever going to be perfect. Uh there's

01:01:29: always going to be some issues but also

01:01:31: doesn't, you know, take away enjoyment. Um,

01:01:34: and when people, you know, have that

01:01:36: kind of similar attitude, like that kind

01:01:37: of, like, you know, this is, like, one of

01:01:40: my antis, it's like, you know, like,

01:01:43: yeah, like I get it. Like, you know,

01:01:45: there's like issues, there's like

01:01:46: problems, there's like stuff, and like

01:01:48: we try to do fix as many as we can, but

01:01:52: also like, you know, it's just kind of

01:01:55: accepted as fact of

01:01:57: life. There's always going to be stuff

01:01:59: that we won't be able to, like, you know,

01:02:01: fix or

01:02:05: do.

01:02:09: Yeah.

01:02:12: Um, next questions. No, you cannot you

01:02:16: cannot eat the camera on me. Not this

01:02:19: one, but you could conceivably since

01:02:21: this is from Resonite Essentials, you

01:02:24: could What is that? which is a spooky

01:02:29: sound.

01:02:33: Um like

01:02:35: essentials so you could like you know

01:02:38: attach your edible system on it and can

01:02:39: use it. So theoretically

01:02:43: yes she's asking read this

01:02:50: one. How do you context menu? I just saw

01:02:52: you going through like seven menus to

01:02:54: get the child lock. Oh, no. That one's

01:02:57: only like one too deep. I was just

01:02:59: flicking through my menus to see like

01:03:01: what I had uh like as as features that

01:03:06: that I could consider my

01:03:08: favorite. I consider that one of my

01:03:10: favorites though. The child lock is so

01:03:12: that like people aren't just constantly

01:03:14: taking my collar off and jingling it or

01:03:16: taking my glasses off and stealing them

01:03:18: or whatever cuz it's funny but it's, you

01:03:22: know, I don't want it happening like all

01:03:23: the time.

01:03:30: Uh, next question is, well, should I

01:03:32: update your needle to have tracking

01:03:33: project files for a version of it? Maybe

01:03:38: sometime it's very

01:03:40: old. Uh, next question from Reno. I love

01:03:44: the spatial variable collectors. They

01:03:46: are the best addition in a while. They do

01:03:48: still have a few bugs which I'm actually

01:03:49: working to figure out. What's your favorite

01:03:52: new feature people have used spatial

01:03:55: variables for this past week? I haven't, so far I

01:03:58: haven't like seen actual like

01:04:00: applications of it like new ones. Mostly

01:04:03: I've seen like people just experimenting

01:04:05: like you know messing up with the

01:04:07: system. So I'm kind of looking forward

01:04:09: to seeing like you know actually like

01:04:10: build systems with it. Uh, I feel like

01:04:12: some people like, you know, red do like

01:04:15: a thing where you can like, you know,

01:04:18: uh, roast like marshmallows and people

01:04:21: do like stuff that kind of moves in a

01:04:23: vector field and things like that. But

01:04:25: it's like I feel like right now it's on

01:04:27: a stage where people are just sort of

01:04:29: experimenting with

01:04:32: it. So, do you have, like, any, like,

01:04:36: things you've

01:04:38: seen?

01:04:44: Um

01:04:46: h I saw some people trying to do like uh

01:04:49: gradient fields and whatever. That was

01:04:51: pretty

01:04:55: neat. That's pretty

01:04:58: cool. I I love I love spatial like

01:05:01: indexing stuff and whatever like spatial

01:05:04: hash maps. Like those are so cool. I

01:05:05: want to see someone make a spatial

01:05:07: hashmap. Very

01:05:10: cool. Next question is from

01:05:13: BD_. Uh, how does one get into finding

01:05:17: groups for Blood on the Clock Tower on

01:05:19: Resonite? Uh, so for that one, you

01:05:21: essentially need to know like right now

01:05:23: it's in closed beta. So, you need to

01:05:25: know a player. Uh, if you know a player,

01:05:28: they can invite you into the game. Uh,

01:05:31: they have like a system for it where

01:05:32: like they put your name and they send

01:05:35: out like invite waves now and then. So,

01:05:38: um, you could, like, poke somebody, you know,

01:05:39: and see if they can get you

01:05:45: in. Uh, next one is from as 17. Uh, how

01:05:51: many games like VRChat and Roblox have it, do

01:05:53: you guys ever plan on having a social

01:05:54: group system? Actually, I'm not sure what

01:05:57: exactly that does. Thank you, K,

01:06:00: for the subscription.

01:06:04: Do you know what it could mean by the

01:06:05: social group system? I I honestly I

01:06:10: honestly don't know. Let me see here. It

01:06:14: might help to elaborate what kind of

01:06:15: functionality you want. Like sometimes

01:06:17: like when you just like say like this

01:06:19: feature from other software. I'm like, I

01:06:23: don't actually know what it means

01:06:24: exactly because we do have a groups

01:06:26: system, but I don't know if that does,

01:06:29: you know, the same things that you

01:06:31: wanted

01:06:34: to

01:06:39: h I feel like I would need to be more

01:06:43: immersed in the culture to understand

01:06:44: it. On the surface, it just kind of

01:06:46: looks like the similar thing to like a

01:06:47: Steam group or something. Yeah,

01:06:50: elaborate a bit.

01:06:54: Like they were saying, one feature is messaging

01:06:55: the whole group. Yeah, that might be kind of

01:06:57: cool. Like that is one thing I would

01:06:59: like want to do with the messaging

01:07:00: system kind of expand it. So you can

01:07:02: kind of you can essentially send

01:07:03: messages on anything. You can send it

01:07:05: within groups. You could send them you

01:07:07: know, on worlds and items, like comments.

01:07:13: Um so that like you know that would be a

01:07:15: thing but like you know there probably

01:07:17: is more functionality. So, it kind of

01:07:19: helps, like, if you're more explicit, so we

01:07:20: know the functions or what they want,

01:07:22: because, like, we might not be familiar

01:07:24: with the feature you're asking

01:07:28: for. Londo 43: is Resonite on

01:07:32: PS4? No, it's not. It's probably not

01:07:35: going to

01:07:39: be. Any more questions? We're going to run

01:07:42: out again.

01:07:51: My throat

01:07:53: hurts my

01:08:00: sinuses. Uh was asking how's Prime doing

01:08:03: on the Excel version of Brooks Engine.

01:08:05: You have to ask

01:08:09: him and one was asking why are you so

01:08:12: cute? Oh

01:08:16: cuteness is in the eye of the beholder.

01:08:18: I was hold that way. I was born that

01:08:21: way.

01:08:24: It's just a natural attribute of being.

01:08:27: You can get it by reflection actually.

01:08:30: Reflection, like in C#? Yeah. Yeah. You

01:08:33: can reflect my cute attribute. Yeah.

01:08:35: There we go. It's a part of .NET

01:08:38: Standard.

01:08:40: Just try not to do it on mono or

01:08:42: otherwise it might be slow.

01:08:48: That's going to... I don't really have any more

01:08:50: questions. So, like, if, uh, more questions

01:08:53: they'll pop in in a bit. Oh, we probably have a

01:08:57: question: they elaborate on the social group

01:09:00: feature: people able to join a group, owners

01:09:02: managing stuff like event starts. Okay, so

01:09:06: there's a few things there.

01:09:09: Um so uh right now for events we don't

01:09:13: have like you know calendar but it's

01:09:14: something we want to add and once it's

01:09:17: added like that will also work with the

01:09:19: groups. Uh we definitely do want to

01:09:22: expand the group system so you can like

01:09:23: you know be like you can follow the

01:09:24: group like you can be like sort of like

01:09:26: subscriber to it or something like that.

01:09:30: um like

01:09:33: um essentially like um you know add more

01:09:38: kind of functionality where you don't

01:09:39: need to be actual managed member of the

01:09:41: group you can kind of sort of follow it.

01:09:43: So for example, if a group that's like

01:09:45: you know say like a this organization or

01:09:47: something the group has it like actual

01:09:49: members who are like you know part of

01:09:51: the group and have people who follow it

01:09:53: like you know the sort of public profile

01:09:54: of it. Um and you could like you know

01:09:57: have it like whatever way you want but

01:10:00: you could have you know sessions that

01:10:02: like you know are visible to members of

01:10:03: the group that are subscribers of the

01:10:05: group. You could like you know events

01:10:07: that are visible as well. So yes, that's

01:10:09: definitely like a lot of stuff we would

01:10:11: like want to add like along these lines

01:10:14: like I do want to expand the group

01:10:16: system a lot like and make it like a lot

01:10:17: more versatile. Uh so it's not you know

01:10:20: just for kind of collaboration but it's

01:10:22: also like you know this is like an

01:10:24: organization and we're like you know we

01:10:26: want to have like public facing like

01:10:28: worlds and posts and you know things

01:10:30: which is actually another thing like I

01:10:31: would want to add a system for like you

01:10:34: know making posts so you can have sort

01:10:36: of like a feed and like you can like you

01:10:37: know if you're creator you can post like

01:10:39: announce like when you make new stuff or

01:10:41: when you're like working on something

01:10:43: and people could like follow you and add

01:10:45: more kind of social aspects to resonite.

01:10:51: Uh, next question is from Ozie. Uh, Ozie is

01:10:54: asking, may I ask you to explain the new audio

01:10:57: rolloffs, like what is the difference

01:10:58: between clamped and falloff? Uh, so this

01:11:02: might be easiest if I draw

01:11:04: it. So they're all, um, they're all,

01:11:11: um, logarithmic falloffs. So generally they

01:11:14: will be logarithmic, but it depends how

01:11:15: they treat the max distance. So if I

01:11:20: draw like a

01:11:21: graph so we have like the minimum

01:11:24: distance it doesn't fall until then and

01:11:26: then it starts falling off. So, uh, for

01:11:30: the... actually, let me make it a bit thicker... so this

01:11:33: one, and it starts falling

01:11:35: off. Let me redraw that. You know, it's

01:11:39: like falling off logarithmically. I'm

01:11:41: not even drawing it right, right

01:11:43: now. So, if it's uh infinite, then it

01:11:46: just keeps falling off. It never quite

01:11:48: reaches zero, but like, you know, it

01:11:49: keeps falling off. Um, if it's uh

01:11:52: clamped clamped, if this is your your

01:11:55: maximum distance, it'll essentially stay

01:11:58: at this forever. It will never fall

01:12:00: below the value that it's at at the max

01:12:02: distance. And then fade off. What fade

01:12:05: off does is there's actually a separate

01:12:07: sort of linear fall

01:12:09: off, and as it gets there, like, you know,

01:12:12: it actually falls off, like, linearly at the

01:12:14: end. So, like, kind of, instead of it,

01:12:17: like you know just stopping abruptly

01:12:22: um it just kind of you know it fades out

01:12:24: and like you won't be able to hear it

01:12:26: past the maximum distance at all. So

01:12:29: that kind of explains

01:12:38: those.

01:12:42: Uh, Darkest Sabertooth is asking: will we eventually get an official Resonite app

01:12:44: for sending messages to other people on

01:12:46: the phone? I would love, like, to have

01:12:49: one. I think that's like a really good

01:12:51: kind of natural expansion to like make

01:12:52: people make it easier for people to

01:12:54: interact and turn this into more of a

01:12:56: social platform like more general one.

01:13:00: Uh right now like we don't have the

01:13:01: bandwidth for it.

01:13:03: So I don't know like when we're going to

01:13:05: work on it but I it it's one of those

01:13:07: things that I think is going to be very

01:13:09: important for us in the future.

01:13:15: Uh next questions is from menshock. Any

01:13:19: are there any plans to add more

01:13:20: accessibility options into Resonite? I

01:13:23: can imagine a lot of solutions that are

01:13:24: used in 2D applications are pretty hard to plan

01:13:26: for a platform as flexible and dependent on

01:13:29: user creations as Resonite. Yeah,

01:13:31: there's always stuff like we're looking

01:13:32: for. Like I

01:13:35: um like I don't like want like specific

01:13:38: kind of examples like it mostly depends

01:13:40: you know like what people kind of

01:13:41: suggest and what people ask for. Um we

01:13:44: would definitely want to add like you

01:13:46: know, things like, for example, for, uh,

01:13:48: photosensitive users, where there's, like,

01:13:50: stuff to, uh, selectively not render

01:13:53: some things, so as long as, like, you

01:13:55: know, content is properly tagged, anything

01:13:58: that's, for example, a flashy visual will

01:13:59: not render for that user.

01:14:03: Uh so you know things like that adding

01:14:05: like kind of features to make like

01:14:08: things easier like there's always like

01:14:09: little things here and there it's going

01:14:10: to be done. It's not like you know any

01:14:13: single thing, but we always, like, take it

01:14:16: into account, and we oftentimes, like,

01:14:17: prioritize accessibility features

01:14:21: um you know where we normally wouldn't

01:14:24: prioritize a feature.

01:14:29: Next question is from, uh, Ace on

01:14:32: Twitch. Uh, will the rendering switch

01:14:35: give more performance than Unity? Also,

01:14:37: how would the Unity SDK work if you

01:14:38: guys are running a different rendering

01:14:40: engine? And you guys were talking about

01:14:42: the SDK supporting multiple

01:14:44: engines; I'm curious how it will work.

01:14:46: Uh, it should give more performance

01:14:49: because, uh, we want to... like, Unity uses...

01:14:51: like, we're using deferred right now because

01:14:52: it works with lots of lights, but it,

01:14:55: like, has some drawbacks, uh, some

01:14:57: performance impacts, and there's, like,

01:14:58: much better approaches now, like clustered

01:15:01: forward for example, that can let us render

01:15:04: even more lights and has even more

01:15:05: flexibility,

01:15:07: um, having, like, you know, a more modern

01:15:09: pipeline, also, like, one, uh, that uses

01:15:11: Vulkan for example. So we would expect it

01:15:14: to, like, you know, have better performance.

01:15:17: Uh, the SDK... how is the SDK actually going to work? Uh,

01:15:21: it's going to be sort of like you know

01:15:23: like sort of like a document object

01:15:26: model like DOM that you can manipulate

01:15:28: over like a TCP or something like that.

01:15:31: It'll be like a library to kind of work

01:15:33: with it. So the Unity SDK will just you

01:15:36: know connect and just you know

01:15:39: communicate the data that way same like

01:15:41: any other engine. So it doesn't actually

01:15:44: need to be running the engine. Uh maybe

01:15:47: it comes in a bundle with like a

01:15:48: headless to kind of handle some stuff

01:15:50: for you. But uh the engine is no longer

01:15:52: actually running in that part like the

01:15:55: SDK doesn't need

01:15:58: that.

01:16:00: Uh next question I saw from K would a

01:16:04: Dvar field hook combo not be possible? I

01:16:07: don't know what do you mean by that? We

01:16:10: generally don't do node combos because

01:16:13: we just kind of like you know we can

01:16:15: just we can combine multiple nodes to

01:16:17: make a functionality you know we just

01:16:19: let you combine the nodes

01:16:30: I don't know what they would want from

01:16:31: that though.

01:16:34: Yeah, understand.

01:16:37: Um, next question is from Oussie. Was

01:16:39: there anything interesting or even

01:16:41: frustrating thing you learn when working

01:16:42: on audio?

01:16:45: Um, well, the frustrating part was like

01:16:48: the Steam audio not being thread safe,

01:16:51: which it looked like it was thread safe

01:16:52: from the documentation, but it actually

01:16:54: was

01:16:55: not. Um, and I kind of designed it with the

01:16:58: assumption that it's thread safe, and it,

01:17:00: like, wasn't.

01:17:02: The documentation wasn't very clear

01:17:03: about that. So like that was kind of

01:17:05: frustrating thing. Um interesting. I

01:17:08: don't super

01:17:10: know. It's

01:17:12: uh, the other part, like, of the setup, is

01:17:14: like, you know, doing all the timing and

01:17:16: everything. Make sure like it kind of

01:17:17: works well piping stuff in and so on

01:17:20: because now we're kind of like sending

01:17:22: the audio to audio device ourselves and

01:17:24: we need to make sure the timing works

01:17:26: right like you know the threads are like

01:17:28: not getting starved and um just kind of

01:17:31: messing around but like it didn't feel

01:17:33: any particular worse I guess.

01:17:38: The

01:17:39: interest I don't know interesting parts

01:17:41: is mostly just kind of learning some

01:17:43: kind of like differences like how some

01:17:47: stuff is handled. Actually, an interesting

01:17:49: part in Unity is, like, I originally

01:17:51: interpreted it wrong. I thought, like, audio

01:17:53: would actually have, like, a maximum range

01:17:56: and it would just kind of stop, but it

01:17:58: just kind of goes infinitely for most of

01:18:01: them which was like like I was like what

01:18:04: that's a little bit strange.

01:18:10: The other part is like you know some

01:18:11: parts like like for example the reverb

01:18:14: uh so really good job like matching like

01:18:16: you know uh integrating like zar reverb

01:18:21: um and it kind of replaces the reverb

01:18:23: zones but we cannot actually make it

01:18:25: sound the same, because what everyone

01:18:28: uses is, like, their own thing, and, uh,

01:18:31: there's not much documentation on it. So

01:18:34: we only found, like, you know, a reverb

01:18:36: that's free, that sounds good, and, like,

01:18:38: matches the sound, has a similar vibe, which

01:18:40: works well enough, but we weren't able to

01:18:42: get it, like, one to one.

01:18:44: Oh, another actually one thing that

01:18:46: worked interesting. Um, one thing I was

01:18:49: kind of thinking kind of tricky is the

01:18:51: Doppler effect because I thought like

01:18:52: you have to like you know have a

01:18:53: mechanism where it samples ahead and

01:18:55: behind and so on. But I was like why

01:18:57: don't I just use a pitch shifting

01:18:59: algorithm, and so I integrated one

01:19:02: for that. It actually works really well

01:19:05: like it sounds really good. It works

01:19:07: with any audio source and it kind of

01:19:09: just simplifies that whole process. So I

01:19:10: don't have to worry about like any of

01:19:12: the sampling ahead because for example

01:19:14: you know, with user voice you cannot

01:19:16: sample ahead, because the voice... the user

01:19:18: has literally not made the sound yet.

01:19:21: Uh and you can also you know other stuff

01:19:24: like pitch shift your voice now. So it's

01:19:26: kind of there you know funny stuff. Yeah

01:19:29: I've seen people like this sounding like

01:19:32: this when I turn my voice down. I can

01:19:34: sound like this when I turn it up.

01:19:38: Yeah. I also saw some people, like, make,

01:19:40: like, a helium thing, like, super breathe it

01:19:42: in, and it makes your voice high pitched,

01:19:44: so people have like fun with it and that

01:19:46: looks kind of surprising to me because I

01:19:47: was kind of expect like I was expecting

01:19:49: the pitch shift to not work for Doppler

01:19:52: but it just it works really well even

01:19:54: though it's it's not fully like

01:19:56: physically accurate. It just it works

01:19:58: fine like I would say even sounds better

01:20:00: than like actual Doppler. Yeah, it

01:20:03: sounds it sounds like real Doppler even

01:20:07: though it's like not changing the speed.

01:20:11: Yeah. So, I was like I'm just going to

01:20:14: like, leave it at this. I'm not going to, I'm

01:20:16: not going to bother with the other

01:20:17: approaches for the

01:20:19: play.

01:20:21: Uh next question is from Mshock. Uh,

01:20:25: Mitri is asking, "Would you consider

01:20:27: adding more nuanced uh levels of

01:20:29: contacts in Resonite? Having everyone

01:20:31: that I meet in Resonite that I want to connect

01:20:32: with again on the same priority as my

01:20:34: friends feels, like, kind of weird. Being

01:20:36: able to add notes to contacts like you

01:20:38: can in Discord would be a great feature.

01:20:40: For example, often have issues

01:20:42: remembering where I meet someone." Yeah,

01:20:44: this actually one of the reasons like we

01:20:45: named it contacts is because it's

01:20:47: neutral. Um, and what you want to do

01:20:49: with the contacts UI rework, which I do

01:20:52: recommend checking the GitHub issue

01:20:53: because it has a bunch of like ideas and

01:20:56: stuff, you know, that's going to happen

01:20:57: with it. Uh, one of the things is you'll

01:20:59: be able to tag people. So, you can tag

01:21:01: somebody as a close friend or

01:21:03: acquaintance, you know, or a workmate

01:21:05: or, you know, or something.

01:21:08: Uh, and then, uh, contact notes we also

01:21:11: think we would want to add. So, you can,

01:21:13: you know, add notes to each person. And

01:21:15: when you host sessions, you can for

01:21:16: example say this session is visible

01:21:19: not to all of my contacts but only to

01:21:22: contacts, you know, that are tagged close

01:21:24: friend, if you, like, don't feel like

01:21:25: hanging out with a wider group. So yes,

01:21:28: like we do we do want to like expand

01:21:30: this uh in the future that's going to

01:21:32: happen with the contacts UI

01:21:36: rework. Uh Oussie's asking uh combo

01:21:40: question. What is the first on the

01:21:42: chopping block for the Splittening?

01:21:43: Assuming this wasn't asked before.

01:21:46: Um I was so there's like a few things I

01:21:49: probably could be doing first. One of

01:21:50: the one of them actually started looking

01:21:52: into already is like mesh upload. So I'm

01:21:55: going to, like, use, like, more of the sort

01:21:57: of flattening the mesh into, like, a buffer so

01:21:59: it can be kind of swapped. But we're also

01:22:01: going to move some of the smaller pieces

01:22:03: like like motion library. move that like

01:22:05: protos engine but it's like relatively

01:22:07: small um so I'm going to move all the

01:22:09: smaller pieces again LOD component

01:22:14: yeah well it's not like a chopping block

01:22:17: like it's more like you know what gets

01:22:18: processed is probably later one

01:22:21: um like what will probably happen first

01:22:24: is just move all the remaining smaller

01:22:26: pieces over

01:22:28: um, and then, like, you know, redesign how the

01:22:33: engine,

01:22:34: um, you know, how the

01:22:36: engine and Unity need to sort of

01:22:40: communicate the data, and part of it, you

01:22:43: know is the mesh stuff uh textures and

01:22:45: all kind of like over buffers so that's

01:22:47: kind of fine

01:22:49: um and like you know designing the

01:22:52: designing the IPC mechanisms and so on

01:22:54: but moving the small pieces first some

01:22:57: some of them already got moved part of

01:22:59: audio, like, for example device analysis;

01:23:02: that got moved. Uh, that's now fully

01:23:04: FrooxEngine side. It also used to be done Unity

01:23:07: side before. Uh, I know there's, like,

01:23:09: some smaller pieces like that like we

01:23:11: can move. I mentioned the Leap Motion

01:23:14: for example, but I have to kind of dig

01:23:16: through it and figure it out.

01:23:23: Uh, next question is from, uh, Ace on

01:23:27: Twitch. Would UI rework on inventory

01:23:29: system also technically bring UI work on

01:23:31: file browser? they look exactly the same

01:23:33: even though they are different and no

01:23:36: they're going to be separate things. Uh

01:23:38: however, a lot of the things for the UI

01:23:40: rework are going to be based on data

01:23:42: feeds. So we're going to be using the

01:23:43: same mechanisms for both.

01:23:46: Um there might be some shared stuff

01:23:49: between them but like they're ultimately

01:23:51: two different systems that are going to

01:23:52: be reworked separately. They might be

01:23:55: reworked one after another but

01:23:57: um there's still like you know two

01:24:00: separate systems. are like

01:24:02: uh uh that's going to kind of change

01:24:08: stuff.

01:24:12: Uh next question is from Nicorn. Uh can

01:24:15: you talk about the rig animation system,

01:24:17: which I'm told lets you bring in animations?

01:24:19: What benefits versus doing by tweening

01:24:22: rotation values? Can the same be used for

01:24:24: multiple rigs, and work in NPC systems? I

01:24:28: don't actually know what you mean. Like

01:24:29: you can import animations, you know, if

01:24:32: it's just a rig animation, you just

01:24:34: import it as normal model and then you

01:24:36: get like, you know, hierarchy that's

01:24:37: moving, but we don't have tools to like

01:24:39: really utilize that super well. So,

01:24:44: [Music]

01:24:45: um, you might need to do a lot of like

01:24:47: work for it like on your own.

01:24:53: Uh, Gray Fox MB is asking, "Is it

01:24:56: possible to give audio ability to

01:24:57: automatically restart when it crashes or

01:24:59: could the user able to manually restart

01:25:00: it?" Probably not because like when it

01:25:03: crashes, it means like something

01:25:04: unexpected happened. And when something

01:25:06: unexpected happens, we don't know what

01:25:08: state is it in. So, we don't know like

01:25:10: is it able to restart like it might have

01:25:12: corrupted bunch of things. So, we don't

01:25:14: know. Uh, when it crashes, make sure to

01:25:17: report it and we'll, you know, fix the

01:25:19: crash. But generally generally as a

01:25:23: design you know like it doesn't

01:25:26: cringo it's still my because it's like

01:25:28: very early, but, like, that gets smoothed out

01:25:31: as time

01:25:35: goes. It's kind of like you know it's a

01:25:37: little bit similar like you know say

01:25:38: like you crash your car you know is it

01:25:42: safe to start it again? Like you don't

01:25:44: know maybe some critical part of the

01:25:46: engine was damaged and it's going to

01:25:48: blow up if you try to start it again.

01:25:53: Uh, next question is from

01:25:56: Oussie. Ozie is asking actually for

01:25:59: those little things you mentioned for

01:26:00: splittening, will we see those in an

01:26:02: update before splittening or will take

01:26:04: longer to integrate now compared to

01:26:06: everything properly separated? Some

01:26:08: small things we'll probably just see

01:26:09: happen like I try to like you know make

01:26:11: it like make functional builds as fast

01:26:14: as as possible but like the larger part

01:26:16: like where is the change in the

01:26:17: communication that's going to take a

01:26:19: bit. So the smaller things probably yes

01:26:21: as the bigger things will probably be

01:26:27: separate.

01:26:29: Uh Ace on Twitch is asking are there any

01:26:32: community made items that uh you are

01:26:34: excited for whenever you think about

01:26:36: future updates? For example, I'm excited

01:26:37: for video encoder screen and browser

01:26:40: sharing because that can open a

01:26:41: possibility of doing Jackbox game

01:26:43: nights on Resonite. Oh my god. Yeah, Jackbox

01:26:45: would be fun. I would want to do

01:26:46: Jackbox on here. That's actually one of

01:26:49: the reasons like you know for video

01:26:50: encoder and screen sharing would be

01:26:52: really good for that.

01:26:54: Um I there like so there definitely were

01:26:57: things like where I was like those are

01:26:59: cool feature like you know those are

01:27:00: cool items. Um I feel like you know we

01:27:04: need to add functionality to make it

01:27:05: easier to do this kind of stuff.

01:27:08: Um, I don't remember any off the top of

01:27:11: my head because like it's been like a

01:27:13: bunch outside of you know like any

01:27:15: yourself like we got an examples. I

01:27:18: probably need to, like, peruse through,

01:27:19: like, my items and things to jog

01:27:23: my memory on that.

01:27:29: I I definitely am I definitely would be

01:27:31: excited to like do like screen sharing

01:27:34: screen share fun. I don't think there's

01:27:36: any com like Well, there's like the ones

01:27:38: actually. Yeah, there's like a video

01:27:39: player that some people use. You can

01:27:40: just stream into

01:27:43: Yeah, I do also have like my own setup

01:27:46: that uses a nifty little software called

01:27:49: um Media MTX, which like streams an RTSP

01:27:53: stream straight from my computer that

01:27:54: I've like forwarded.

01:27:57: Yeah,

01:28:00: I want I want sub one millisecond screen

01:28:04: or Yeah, sub like sub

01:28:09: like tenth of a millisecond or tenth of

01:28:12: a second at least screen sharing cuz

01:28:14: right now it's really delayed by like a

01:28:16: second or two. Yeah. Oh, that's also

01:28:19: hard for like

01:28:24: obviously there's something more than

01:28:27: you know two seconds.

01:28:30: Yeah.

01:28:39: Yeah. I don't remember any specific

01:28:42: ones. I know there's been like some in

01:28:43: the past, but I just don't remember the

01:28:44: examples.

01:28:49: Yeah, the cocoa live is the one like

01:28:51: I've seen

01:28:53: before.

01:28:58: Uh, next. Oh, no. That's the same

01:29:01: question. I I would love to play

01:29:03: Jackbox. I like playing Jackbox. It's

01:29:04: one of the fun games. Could even stream

01:29:07: those.

01:29:10: Let me see if I find

01:29:13: any.

01:29:18: Uh if I remember any of

01:29:36: them. Oh, there's definitely like stuff

01:29:38: like you know that would like benefit

01:29:42: like there's like been things where I'm

01:29:44: like this would like benefit from

01:29:45: collections. It would make like things

01:29:47: so much easier to like implement to make

01:29:49: this more powerful. So like you know

01:29:51: some games and things and like things

01:29:53: like the photo slurper for example like

01:29:55: you know collects photos

01:29:58: so technically needed but

01:30:01: I wanted to make a spatial hashmap but

01:30:03: the fact that we don't have collections

01:30:04: makes that very hard. Yeah.

01:30:11: So this is asking does audio have any

01:30:14: concept of audio layers currently, uh, say,

01:30:17: similar to how we have render context for

01:30:19: mirrors and cameras? Possible for audio

01:30:22: listeners to only listen to certain sounds;

01:30:24: that would be really powerful. Uh, sort of,

01:30:26: just like concept of audio inlets that I

01:30:29: kind of talked about last time it's not

01:30:31: quite the same but I'm also going to be

01:30:33: adding like more mechanisms because

01:30:34: we'll need it for the camera thing

01:30:37: because we want to make it so the camera

01:30:38: can, you know, have its own audio. Um,

01:30:42: and so it's able to hear even the, you

01:30:44: know, local user fully spatialized, but

01:30:47: also, you know, the user still doesn't

01:30:49: hear themselves. So, there's going to be

01:30:51: more mechanisms added

01:30:54: there in for audio.

01:31:00: Uh, do you ever plan on posting content

01:31:03: whenever you're playing Resonite? For

01:31:05: example, when you're at FC. Um, I did post

01:31:08: like some of it on my blue sky. Posted

01:31:10: like some photos and videos and

01:31:12: things. I don't want to post more, but I

01:31:15: do post

01:31:29: some. Okay, real quick. Oh.

01:31:36: Okay, Fox is asking now that audio isn't

01:31:38: optional, uh, will the focus be on bug

01:31:40: fixing for the time

01:31:42: being.

01:31:44: Uh, like I said, like like there's

01:31:46: probably going to be like, you know,

01:31:47: some bug fixes, like, over next week and

01:31:48: then the focus is going to be on the

01:31:50: Splittening.

01:31:59: I was asking, how do you like reverb zones

01:32:00: affecting user voice, honestly turning

01:32:03: immersive for different sections of a

01:32:04: world? I haven't actually used it super

01:32:06: much myself. Like I've been since I'm

01:32:09: like kind of like visiting. Um I haven't

01:32:13: I haven't been on VR

01:32:17: much. So I don't know like I've been

01:32:19: around a

01:32:21: bunch.

01:32:23: Um what for like the reverb zones

01:32:26: affecting voice?

01:32:30: Yeah, I think it's an overall

01:32:31: improvement.

01:32:33: Um, I I really really really like it. I

01:32:38: know that there have

01:32:39: been there's like there's like some

01:32:43: people who like wanted like a a setting

01:32:46: like in your settings menu to turn it

01:32:47: off and I'm

01:32:49: like that's like the break this

01:32:52: component

01:32:54: setting. So maybe

01:33:04: Um, Ace is asking, uh, do you plan on

01:33:08: slightly improving the locomotion system

01:33:09: after the Splittening, since the engine will use

01:33:11: more modern libraries? Um, we don't

01:33:14: really use libraries there; it's, like, our own

01:33:16: system for

01:33:18: it. What do you mean by, like, a locomotion

01:33:20: system? Like, do you mean, like, movement, or

01:33:22: do you mean

01:33:24: animations? Um, like, improve how? Cuz

01:33:27: there's a lot of ways... there's a lot that could

01:33:30: be improved, but

01:33:31: like like are you talking about like

01:33:34: bugs? Cuz like we can fix bugs anytime.

01:33:39: You need to be a little bit more

01:33:41: specific though because there's like

01:33:43: multiple ways to interpret that.

01:33:57: Okay, we don't have any more questions

01:33:59: now.

01:34:00: We got like 25 minutes left

01:34:04: anyways.

01:34:06: There's not a question, but there are

01:34:07: like a couple messages on the

01:34:09: accessibility of the reverb. Um, if you

01:34:12: want to ask, we can ask.

01:34:16: Yeah, if you want to ask that, ask in

01:34:17: the question mark.

01:34:20: Oh, my headset is going to die soon.

01:34:26: I have to say I

01:34:38: can't

01:34:41: prove like direct the cable.

01:34:45: Uh for clarification, I remember you

01:34:49: talking about the procedural locomotion

01:34:51: being requested at the wrong time, and I

01:34:52: was wondering if the engine not using

01:34:55: .NET 9 would have something to do with

01:34:56: it. No, that has nothing to do with it.

01:34:58: Uh the wrong time is more like that like

01:35:01: we had to do it one before we reworked

01:35:03: the IK because the IK is kind of like

01:35:05: you know not handling things as well. We

01:35:07: don't have as control as much as we

01:35:08: want. The other thing is we don't have

01:35:10: animation system which makes the kind of

01:35:13: things complicated. So like there's like

01:35:14: a lot of kind of big things that made it

01:35:16: big at the wrong time. Um one of the

01:35:19: things we did suggest that like nobody

01:35:21: really reacted on is like we could open

01:35:26: you know the code for it and see like if

01:35:28: people because right now we don't have

01:35:30: the bandwidth for it. Uh but if people

01:35:33: wanted to like you know make their own

01:35:35: contributions and, see, like, take their

01:35:37: stab at, you know, improving it or, uh,

01:35:40: reworking some parts of it like you know

01:35:41: we'd be open to that but nobody seems to

01:35:45: like really be interested in

01:35:47: that. So it's going to stay, you know, the way

01:35:50: it is

01:35:52: unfortunately until like we get like

01:35:54: more bandwidth on that.

01:36:03: Hello. Uh, let's see if I can plug my

01:36:06: headset.

01:36:09: Maybe we might, like, end up, like,

01:36:11: ending earlyish.

01:36:18: I would not mind ending early to be

01:36:20: honest. Yeah, the questions are

01:36:24: like slowing

01:36:31: down. There we go. It's plugged

01:36:35: in. Yeah, I think we might end it

01:36:39: here. Uh

01:36:45: yeah. Yeah.

01:37:02: I I read that question. My my brain just

01:37:05: like glazed over. Which one?

01:37:10: I don't I don't really understand like

01:37:12: the wording of it. Oh no.

01:37:15: Oh. Oh jeez. My Wi-Fi died. Oh, okay.

01:37:18: I'm

01:37:21: back.

01:37:24: Uh, have no joystick lock with tools? Most

01:37:27: don't even touch the joystick to my

01:37:28: knowledge. None of the default tools use

01:37:30: joystick click; it's only used for jumping in game.

01:37:33: Would not locking the joystick or something similar

01:37:35: be reasonable, or is this just going to

01:37:37: be a "way too long time for some random

01:37:38: idea" type of thing? So, it depends on

01:37:41: like if the tools don't use the joystick

01:37:43: then they shouldn't lock it. If you feel

01:37:47: some of them lock it when they shouldn't

01:37:48: and they're like actually official tools

01:37:50: because there's also tools made in game

01:37:53: and there the user actually needs to

01:37:55: mark it like doesn't use the joystick

01:37:57: and then you know then it will not lock

01:37:59: it. But a lot of people don't check that

01:38:02: and you know then it locks it because uh

01:38:05: they haven't indicated it.

01:38:07: for official tools even like if you use

01:38:09: the joystick press. The problem there is

01:38:11: the system right now is designed to work

01:38:14: across different controllers and we

01:38:17: still have to support the Vive wands, and

01:38:20: the wands, you know, don't have a joystick

01:38:22: but a touchpad, and the touchpad you cannot

01:38:24: use without pressing it. So using you

01:38:27: know the joystick the directions and

01:38:29: pressing it is intrinsically linked

01:38:31: together in the system. Um that's kind

01:38:33: of what I said is you know we can

01:38:35: eventually rework the system move it to

01:38:36: other buttons make it more controller

01:38:39: specific

01:38:40: uh controller specific bindings. So if

01:38:43: you're on those type of controllers

01:38:45: maybe like you know it uses the other

01:38:46: button um and that could like improve

01:38:50: things. But if it does actually use the

01:38:51: joystick which there and there are some

01:38:53: tools that do then you know then it has

01:38:56: to lock it.

01:39:03: Uh, next question is from Ace on Twitch.

01:39:06: Uh, do you guys plan on supporting ARKit

01:39:08: blend shapes for face tracking in

01:39:10: the future? I know that some avatars

01:39:12: have different setups face tracking, but

01:39:14: don't use all shapes in Resonite. I would

01:39:16: like to add support for it. So, you have

01:39:18: like, you know, you can use it for face

01:39:19: tracking desktop actually. Well, it

01:39:22: depends. What do you mean uh AR blend

01:39:24: shapes? Like because I don't know if

01:39:27: you're I don't know if you're asking for

01:39:28: like you know using like an iPhone to

01:39:30: like a face tracking desktop kit or if

01:39:34: you just mean like using like remapping

01:39:36: stuff to ARKit blend shapes on the avatar, in

01:39:40: which case like we could also like you

01:39:41: know look into that

01:39:44: too. It depends you know what system

01:39:46: like the face tracking is coming from.

01:39:52: Reno's asking, "Every single official

01:39:54: tool locks the joystick on a quest due

01:39:56: to stick click being part of joystick

01:39:58: functionality."

01:40:01: Really? Every single one?

01:40:04: [Music]

01:40:06: uh or I think I think it's most like the

01:40:10: the raw data tool tip I know has an

01:40:12: option to disable the uh secondary, but

01:40:16: I think it's locked on all of the

01:40:17: official tools that aren't the raw data

01:40:22: tool. Okay, this one looks

01:40:25: it forget what the label is it

01:40:29: for. If there's like a tool that like

01:40:31: locks it, it like it doesn't use it.

01:40:34: Like if it doesn't even use the click,

01:40:37: then it shouldn't lock

01:40:46: it. So, like, make a GitHub request and we can

01:40:49: like, you know, check and make sure the

01:40:51: tools are properly, like, marked.

01:40:56: I wish we had one controller standard.

01:40:59: Yeah, unlocks it. Yeah, I'll probably

01:41:02: make GitHub issues, like, for the ones

01:41:04: that don't even use the click. If it

01:41:06: does use the click, then it needs to

01:41:07: lock it because then, you know, it uses

01:41:09: it. So, in that case, like there's not

01:41:11: really a choice unless, you know, the

01:41:13: system is reworked a bit. Um, yeah.

01:41:19: uh because that one it will need like

01:41:22: controller specific bindings because

01:41:23: like I said, on the Vive

01:41:25: wands you cannot use the secondary

01:41:28: without clicking

01:41:30: it. So the click like even if it is just

01:41:32: the click it does like it is the whole

01:41:41: thing. We can't we can't lock users out

01:41:44: of playing the game just because their

01:41:46: VR hardware is a little older.

01:41:48: Yeah, it's one of the issues, you know,

01:41:51: in general. It's also like this kind of,

01:41:54: you know, comes with like

01:41:57: um there's a similar thing that happens,

01:41:59: you know, with mods too is like, you

01:42:01: know, where it does something and it

01:42:03: works fine for a subset of people, but

01:42:05: there's like, you know, there's another

01:42:05: subset of people where it kind of breaks

01:42:07: things and we as developers, we can't do

01:42:11: that. Like, you know, we cannot just be

01:42:13: like, okay, like this percent of people,

01:42:15: you don't get to play the game. uh when

01:42:18: we implement things we need to make sure

01:42:20: they work for you know as many people as

01:42:22: we can uh pretty much everyone if

01:42:25: possible.

01:42:27: So, and it makes things a lot more

01:42:30: difficult a lot of the times and like

01:42:32: say a lot more painful. But it's also

01:42:34: like you know why on our end a lot of

01:42:36: things will take like you know longer

01:42:38: because we go through all of these kind

01:42:40: of considerations be like okay does it

01:42:42: work with this does it work with that

01:42:43: work this person and if you make them

01:42:45: all you don't care we just make it work

01:42:47: for yourself and you know that's it.

01:42:51: Um it's it's it's kind of tricky but

01:42:54: yeah. Like, we would want to, like, have a

01:42:57: system where it has more

01:42:59: nuance for things, so that would kind of,

01:43:01: like, you know, improve things. But we

01:43:04: still do have to consider like it like

01:43:05: even the new system it still has to work

01:43:07: with the wands, because there's people who

01:43:09: use them.

01:43:14: So yeah, like pretty much what you're

01:43:15: saying there like you know that's the

01:43:17: thing we keep saying is you know that

01:43:18: what's that's what we'll have to do. We

01:43:20: have to like redesign it so it has more

01:43:26: nuance. But it's also like you know it's

01:43:28: a it's a thing that's making it very

01:43:30: difficult because it needs to have the

01:43:32: nuance while still letting all the users

01:43:34: to do all the

01:43:38: functions. Uh next question is from

01:43:41: Twitch. Do you guys plan on letting

01:43:42: players be able to subscribe to Resonite

01:43:44: directly in game instead of on the

01:43:45: website. Um I'm actually not sure right

01:43:49: now if you can do that on Steam because

01:43:52: Steam has I think some rules against

01:43:54: that. Uh so we might not be able to do

01:43:57: that like you know at least not like

01:43:59: within the main in-game

01:44:02: UI. Um, that would probably be the

01:44:05: kind of ideal way to do it, but, uh, we

01:44:08: might not be able to for the Steam

01:44:15: downloads. Uh, we got 15 minutes left.

01:44:18: Uh, so if you got like any more

01:44:20: questions, like ask them pretty much

01:44:22: now. Otherwise, like we're going to end

01:44:24: the stream in a bit, but it's going to

01:44:27: be ending in 15 minutes

01:44:29: anyways. I just wonder if there's any

01:44:31: more questions.

01:44:40: Where am I?

01:44:42: Okay, I think I think I'm going to end

01:44:44: it here. So, yeah.

01:44:47: And thank you everyone for joining and

01:44:51: bearing with us being sick uh and

01:44:56: sniffly. Uh thank you for asking all the

01:44:58: questions. Thank you for supporting uh

01:45:01: supporting the

01:45:03: platform.

01:45:05: Um and like uh you know whether it's

01:45:08: like, on Patreon, whether it's on Stripe,

01:45:10: and if you're on Patreon, consider

01:45:11: switching to Stripe, because that gives us,

01:45:13: like, you know, we get more out of that.

01:45:16: Um, we're, like, you know, Patreon's taking

01:45:18: like 15%, but Stripe is around 5%. So

01:45:21: that's a huge difference. Uh but whether

01:45:24: like you know support us this way or

01:45:25: whether you're just, like, you know, part

01:45:26: of the platform, you know, socializing or

01:45:29: making cool things like thank you so

01:45:30: much for helping to keep this place

01:45:33: going and I guess we'll see you next

01:45:36: next week unless uh something happens

01:45:39: with the postponed one. So, thank you

01:45:42: very much for watching and see you next

01:45:45: time.

01:45:49: Bye. I war.