The Resonance/2025-06-29/Transcript

From Resonite Wiki

This is a transcript of The Resonance from 2025 June 29.

This transcript is auto-generated from YouTube. There may be missing information or inaccuracies reflected in it, but it is better to have searchable text in general than an unsearchable audio or video. It is heavily encouraged to verify any information from the source using the provided timestamps.

00:01: Hello.

00:04: So, should be streaming.

00:07: I'm going to post announcements. Uh,

00:09: let's see.

00:12: Post in Discord.

00:16: Where is it? Uh,

00:19: there we go.

00:22: Uh, post another one.

00:25: Live streams. There we go.

00:30: And then I got two more posts and

00:34: we should be good.

00:38: There we go.

00:40: Post it.

00:42: Hello. Do we have anybody in the chat

00:44: yet? Hello.

00:50: I'm actually going to open up the stream

00:51: too just to make sure my settings are

00:54: okay. Hello. Oh, there we go. Hello.

01:01: We're getting people.

01:03: Hello.

01:06: I'm going to give people a few minutes

01:07: to pile in. Hello, Jake. Ah, good more

01:11: people.

01:13: Hello.

01:16: Welcome to another Resonance. We are

01:19: back after two weeks. I say we, but we don't

01:23: have Cyro today. Um

01:27: but um

01:30: kind of forget how to do these

01:32: uh like so once you get like into the

01:35: habit like and you kind of like stop it

01:37: for even for a little bit now it's like

01:39: rust and I'm like how did I do this

01:41: again? Um I mean usually kind of like do

01:44: all of the stuff like off the top of my

01:46: head but they have like a checklist for

01:47: things but uh we should be good. The

01:49: announcements are out and everything. Um

01:53: the stream is live. We're getting some

01:55: first comments. So, hello and welcome.

01:58: Uh, welcome to another episode of the

01:59: resonance. Uh, this is, uh, helloist

02:03: bubbler bit. Uh, this is, uh, sort of

02:06: like a chill kind of like, you know,

02:08: office hours podcast where you can ask

02:10: anything about Resonite. Could be

02:12: technical questions, could be

02:13: philosophical questions, it could be

02:16: like, you know, even personal questions.

02:18: Oh, you can't hear me. Hello. Can you

02:21: hear?

02:23: What happened?

02:26: No audio. That is weird.

02:29: Oh,

02:31: okay. So, it's just one person. Okay. It

02:33: is like popping out like whenever I

02:35: talk. Uh, it's using the streaming

02:37: camera audio. So, like you should be

02:40: hearing me spatialized depending on the

02:42: position of the camera. Um, I see it

02:45: kind of going up, but if there's any

02:46: issues, like let me know. Uh, but a

02:49: bunch of people are saying the audio is

02:50: fine, so it should be good. Anyway, um

02:55: um this is like you know sort of like a

02:57: podcast slash office hours. You can ask

02:59: anything about Resonite, whether

03:00: it's technical, philosophical, about

03:01: the platform, the team. It can be, you

03:03: know, personal things, to kind of mix

03:05: things up a bit. Uh

03:08: so with that let's get started. Uh the

03:11: only thing make sure if you're going to

03:13: ask something make sure put a question

03:15: mark. Uh that way it's going to pop up

03:18: on the thing. Uh I'll be able to pull it

03:20: out. Uh and that way I don't miss it

03:21: because otherwise I will miss, you know,

03:23: some other things that are kind of

03:24: popping up through the thing. Um I'm

03:27: going to adjust the camera a little bit.

03:30: There we go. Yeah, there's no Cyro today.

03:33: Uh I think he's he's like recently

03:35: traveled quite far. So um I don't

03:40: I don't think um

03:43: Oh, thank you. Thank you, Nuki, for the

03:45: raid.

03:46: And she had a uh Nuki had a cool stream

03:49: before called filling the hole uh where

03:52: it's like between the moderation office

03:54: hours and on ends and she was

03:57: interviewing with her and it was kind of

03:58: fun. Um it was like a little bit before

04:02: like I started like setting up stuff. Uh

04:04: but yeah, he's been traveling so it's

04:06: just me unfortunately. Uh thank you

04:08: missing for the subscription too. Um and

04:12: yeah like uh we can get started. Uh make

04:15: sure

04:17: um you know make sure to put a question

04:19: mark if you're going to ask something.

04:21: Um

04:22: we also have questions from the discord

04:25: already prepared. Uh if you're asking

04:28: more questions in the discord they're

04:30: not going to be answered at this point.

04:32: Um make sure you know ask them in the

04:34: Twitch chat. Uh with that let's get

04:36: started. Uh, I'm going to do the Discord

04:38: questions first since it's like, you

04:40: know, fixed amount and then going to go

04:42: to the Twitch ones, uh, as they kind of

04:44: pile in. Uh, I'm actually going to put a

04:47: camera on the anchor so it's not

04:50: floating around. And I got this. And

04:52: also in this cool world, uh, Sigual

04:55: Island that it's, you know, it's getting

04:57: really hot out there. And I've got like,

04:58: you know, some nice pill.

05:01: Wait, I need to open it first. There we

05:03: go. Hopefully this is not one not one of

05:05: the ones that will like respawn me.

05:09: You got, you know, you got some of that

05:12: some of that pill. This is a nice um

05:15: nice summer world and kind of, you know,

05:18: even though it's night here, this is

05:19: kind of how it feels like. It's very

05:20: hot. Um anyways, uh let's go start with

05:25: a Discord questions.

05:28: I'm sorry. I'm a little bit lost with

05:30: this thing because I haven't done this

05:32: uh in two weeks and my sleep schedule is

05:34: still a bit weird. So, the first

05:37: question

05:39: uh this is a big one. I'm

05:42: actually going to duplicate it so I

05:43: can read it too. The first question we

05:45: got is from um uh Papa team. How do

05:50: synced network values handle concurrent

05:52: updates from multiple users? A simple example

05:54: would be if two users try to increment a data

05:56: model value store at the same time.

05:59: Could there be a race condition

06:00: affecting the value? Are the data model's

06:02: variables using simple

06:04: references or more concurrency

06:05: primitives like mutable vars or software

06:07: transactional memory? I came across this

06:09: video with Tim Sweeney recently that got me

06:11: wondering how FrooxEngine manages to scale

06:13: with many simultaneous updates. Uh is it

06:15: using something like event sourcing,

06:17: software transactional memory, version

06:18: timestamps, or is it just YOLOing updates?

06:21: Related to the question: would you say

06:22: FrooxEngine uses an eventual consistency

06:24: model where updates might be piecemeal

06:26: or out of order, or is the consistency

06:28: model more similar to something else? Sorry

06:29: if there's a lot of questions in one,

06:31: thanks again for the Splittening so far. Um so

06:34: yeah, it's it's definitely not YOLOing the

06:36: updates. Um I think as far as like the

06:38: data model goes like one of the big

06:40: parts of the data model is ensuring that

06:42: the model will be eventually kind of

06:44: consistent. So there's like you know

06:46: some eventual consistency. It is

06:48: possible for the model to be uh

06:50: temporarily inconsistent between the

06:52: users but it will kind of converge to

06:54: the same value if multiple users try to

06:57: change the same value. There's actually

06:59: multiple kind of primitives but like we

07:01: right now we only have um we mostly have

07:04: like primitives where the value is

07:07: essentially like you know whoever writes last wins.

07:10: Um, it is using something called like

07:12: optimistic concurrency

07:14: uh where like it assumes that multiple

07:16: users will not be updating the same

07:18: value at the same time. When it does

07:20: happen the data model essentially

07:22: decides which one of them wins and

07:25: that's sort of based on a version. So if

07:27: somebody updates a value and then like

07:29: the host uh receives that value from

07:31: that user and then it receives another

07:33: update from a different user but that

07:35: update was generated before they would

07:39: have received the value from the other

07:41: user it uh discards their value and

07:43: sends them a correction. Um so in most

07:47: cases that can happen. We also

07:50: eventually want to add like a bit more

07:51: primitives, uh, because with some

07:54: values for example say like you know the

07:55: integers you might want to like you know

07:58: combine them. So like if like one person

08:00: increments it another person increments

08:02: it those two operations um you know they

08:07: are mutually compatible like it it

08:11: doesn't matter in what order they happen

08:13: uh the end up value like you know

08:15: whatever order you know we would have

08:17: them happen the end value is going to be

08:19: the same. So you you know we would want

08:22: to introduce more primitives to kind of

08:25: simplify implementing those things. But

08:26: right now since it cannot make that

08:28: assumption, like say like one user writes

08:30: value nine, another user writes value 100,

08:34: which one is, like you know, correct? It

08:36: just takes you know whatever

08:39: whatever the most recent one is and then

08:42: corrects it for the other user. That way

08:44: like you know it stays kind of

08:45: consistent.
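
As a rough illustration of the version-based "optimistic concurrency" resolution described above, here is a minimal sketch (Python, purely illustrative; the class name, fields, and exact rule are assumptions, not FrooxEngine's actual implementation):

```python
# Minimal sketch: the host keeps a version per value, accepts an update only if
# the sender had seen the latest version, and otherwise discards the stale write
# and sends a correction back to that user.

class SyncValue:
    def __init__(self, value):
        self.value = value
        self.version = 0                  # bumped by the host on every accepted write

    def host_receive(self, new_value, base_version, sender):
        if base_version == self.version:
            self.value = new_value        # written against the latest state: accept
            self.version += 1
            return None
        # Written against stale state: reject and correct the sender.
        return ("correction", sender, self.value, self.version)


counter = SyncValue(0)
# Two users both saw version 0 and try to "increment" at the same time.
counter.host_receive(1, base_version=0, sender="A")          # accepted, version -> 1
print(counter.host_receive(1, base_version=0, sender="B"))   # stale -> correction for B
```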

08:46: Um so that's pretty much kind of it.

08:49: There's a few other like you know

08:51: mechanisms that are a little bit

08:53: more complex where you know we have

08:54: stuff like driving mechanisms and

08:56: driving is a way um thank you thank you

09:01: Bernardo 79 for for the subscription

09:06: um so the goal is like you know to

09:08: provide sort of like primitives

09:10: um to make it so you don't have to

09:14: think about network synchronization I

09:17: found generally this model it works like

09:19: really well. And something like you know

09:21: that also like modern databases will

09:22: often times use is like you make an

09:25: assumption that you're not going to be

09:27: like having multiple writers just

09:28: constantly trying to like you know

09:30: hammer the same value in the database or

09:33: this case in the data model. You're

09:35: going to design your mechanisms in a way

09:37: where that should be rare. Uh but you

09:40: still have like you know a mechanism to

09:42: detect if there's you know concurrent

09:44: writes and have like a way to redo them.

09:46: And with optimistic concurrency, one of

09:48: the things you can do is, you know, if

09:51: you write a value and you then check,

09:53: okay, like this didn't succeed, you sort

09:55: of like redo whatever you were doing, but

09:57: there aren't really many primitives

09:59: to do that right now. But most of the

10:00: times you don't need it as much. It also

10:03: gets a little bit more complicated

10:04: because there's like you know stuff like

10:05: the drives and the drives um there's

10:09: like um they usually kind of link you

10:12: know to values where like if you apply a

10:14: drive, but then if you release the drive it'll

10:17: you know bring that value back to a

10:19: consistent value because when you drive

10:20: something you're essentially telling the

10:21: data model I'm taking exclusive control

10:24: of this value

10:26: you know don't handle it I'm having

10:29: exclusive control but once you release that

10:32: drive you release that exclusive control

10:34: you bring it back to the data model so

10:36: it like kind of syncs it to the users

10:37: and there's mechanisms where it will

10:39: check like you know multiple people try

10:41: to like either set the same drive so for

10:44: example if two people will try to drive

10:46: the same thing at the same time um it

10:48: will sort of like revert like one of

10:50: those changes again depending on like

10:52: whoever was kind of first and then

10:54: correct for the other user so um I hope

10:57: this kind of like uh this kind of like

10:59: you know uh answers the question well

11:02: enough. There's a fair amount like that

11:03: goes into this and there's also like

11:05: more primitives that we would like to

11:07: introduce in the future. Um like one of

11:10: the things I would also want us to have

11:13: is primitives, so you can make it very

11:15: simple to implement stuff like uh lock

11:18: step synchronization where say like you

11:20: have like something updating you know

11:22: running some kind of code and you want to

11:23: make sure that all the inputs are the same

11:26: for every user. So every user can can do

11:28: their own local deterministic simulation

11:32: um without having to like you know worry

11:34: that like the values are going to be

11:35: different and the simulation will

11:36: diverge. Um, and that's going to be

11:39: useful for doing stuff like say

11:40: emulators. You want to implement an

11:42: emulator or some kind of complex

11:43: simulation that would be too heavy to

11:45: just constantly synchronize. Lock-step

11:48: synchronization can be very useful for

11:50: those. But right now we don't offer

11:52: primitives for that. Um, but it's

11:54: definitely something I want to expand on

11:56: in the future. But the data model we

11:58: have right now, it works like

12:00: surprisingly well for a lot of like use

12:02: cases and you can like make a lot of

12:04: things just work with it.
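
To make the lock-step idea mentioned above concrete, here is a tiny sketch (illustrative only; Resonite does not currently offer this primitive, and the function names are made up): only the ordered inputs are shared, and every user runs the same deterministic step locally, so the simulated state stays identical without ever being synchronized itself.

```python
def step(state, inp):
    # Must be fully deterministic (no local clocks, no unseeded randomness),
    # otherwise the per-user simulations diverge.
    return state + inp

shared_inputs = [3, -1, 4, 1, 5]          # agreed upon by all users, in the same order

def simulate(inputs):
    state = 0
    for inp in inputs:
        state = step(state, inp)
    return state

# Every user gets the same result from their own local run.
assert simulate(shared_inputs) == simulate(list(shared_inputs)) == 12
```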

12:09: Uh the next question uh we have uh let

12:14: me put this on the stream.

12:17: There we go.

12:19: The next question is also from uh

12:21: Papine.

12:22: Question. One of the next items on the

12:24: tech tree after the Splittening is the data model

12:26: rework. Have you considered converting

12:28: parts of the engine to use more data-

12:30: oriented design patterns as a method to

12:32: improve performance and composability, or

12:34: might that be overly ambitious? I know

12:37: Unity has been trying to rebase their whole engine on top of

12:38: DOTS while keeping some more OOP skin on

12:40: top of it,

12:43: but uh they have been at it for several

12:45: years and have nothing public to show

12:46: for it as far as I'm aware. So the engine

12:50: it actually is heavily sort of data-

12:53: based like the whole principle of the

12:56: engine is that data is king like the

12:59: data says what it is and then the code

13:01: is you know sort of trying to make it

13:03: reflect what the data is and actually

13:05: goes you know into the synchronization

13:07: mechanism where um the data

13:12: you know like you always trust the data

13:15: model, that's like the ultimate authority on

13:17: everything and then like the code you

13:20: know will make sure to reflect whatever

13:22: the data says there should be and that

13:25: does you know have like it heavily makes

13:29: things very like you know composable and

13:32: as you know the general philosophy we're

13:33: using for designing things where a lot

13:36: of the components you know they will

13:38: interact with the data model um and you

13:42: know that's that's what they care about

13:43: they interact with the data model they

13:44: modify a certain way and Then you can

13:46: have other components that you also like

13:48: listen to those values or maybe out like

13:50: other behaviors and you compose the

13:52: behaviors you do based on that. That's you

13:55: know how a lot of like FrooxEngine and

13:57: how like Resonite works, like it it is

14:01: very very very heavy on composability

14:05: like to the point like you know where

14:07: that's why the engine is as flexible as

14:11: it is is thanks to that you know it's

14:14: one of the reasons why you can open

14:16: inspector and you can edit anything or

14:19: you can you know take protoflux and you

14:20: can drive anything you can read anything

14:22: you can write anything that's because it

14:25: pretty much like you know takes the

14:26: composibility to like maximum. That's

14:28: how it's kind of designed and it's

14:30: because everything's designed around the

14:31: data. Um there is the other aspect of it

14:35: which is like the performance part. Um

14:38: that has to do more like you know with

14:39: like um doing updates like especially if

14:42: we have like lots of the same stuff

14:45: being able to like you know fill a

14:46: buffer with things and just like you

14:48: know loop through them

14:51: um which oftentimes has better

14:53: performance characteristics and that is

14:54: something that we want to do more.

14:56: There's other parts of the engine like

14:58: where that is used uh where it's going

15:01: to like you know essentially it fills a

15:03: buffer with like you know sort of state

15:05: runs the simulation like often times

15:07: like in parallel like for example bones

15:10: kind of do this um

15:13: and it's something like we want to like

15:14: you know embrace more but on the surface

15:16: it's always going to be you know that

15:18: core data model that's going to be more

15:20: of a background thing where how some of

15:22: these things execute is you know over

15:25: these kind of buffers rather than, you know,

15:26: being sort of individual like objects.

15:29: Uh but yeah, I hope it kind of answers

15:32: things like there's one thing is like

15:34: you know with how FrooxEngine works, it's

15:37: fairly

15:39: different like it's sort of like with

15:42: pretty much like every engine

15:46: like you know out there. Um their

15:48: typical approach is you know we have the

15:50: data model for like you know the scene

15:52: or whatever but it's not synchronized

15:56: and usually you tack on the

15:58: synchronization on top and this is one of

15:58: the reasons like you have limitations

15:59: with those and the way resonite you know

16:03: the way FrooxEngine is designed um it

16:06: essentially merges the two it merges

16:08: like the synchronization with like the

16:10: data model of the scene where pretty

16:12: much like the entire engine is made from

16:14: the same primitives for synchronization

16:16: and everything's kind of built around it

16:19: which it takes a lot of work to do like

16:21: we pretty much like it's one of the

16:22: reasons why we can't like you know take

16:24: a lot of existing solution for things we

16:26: have to make our own systems is because

16:29: those systems need to be built from the

16:31: data model primitives that we have and

16:34: that's how everything is designed but

16:35: the huge benefit is you know that you

16:37: get automatic synchronization pretty

16:39: much for everything

16:42: um it's going to make uh you know it

16:46: increases the composability. You can like

16:48: you know edit anything. You can like

16:49: ProtoFlux anything. Uh you can save

16:51: anything. Like everything's persistable.

16:54: So um there's like huge benefits to that

16:57: kind of approach.
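
A toy sketch of the "data is king" pattern described above (not FrooxEngine's actual API; the `Field` class and listener mechanism here are invented for illustration): components only read and write fields on the shared data model, and everything else reacts to those changes.

```python
class Field:
    """A single data-model value that notifies whoever is listening."""
    def __init__(self, value):
        self._value = value
        self._listeners = []

    def on_change(self, fn):
        self._listeners.append(fn)

    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, v):
        self._value = v
        for fn in self._listeners:
            fn(v)                 # the code just reflects what the data says

color = Field("white")
color.on_change(lambda v: print(f"renderer: repaint material as {v}"))
color.on_change(lambda v: print(f"another component reacting to {v}"))

color.value = "red"               # one data change drives all dependent behavior
```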

17:01: The next question is from missing. Um

17:07: I'm going to put this here.

17:12: Uh are there any plans to drop Patreon

17:15: at a later date? It doesn't seem like

17:17: the game currently makes it clear that

17:18: they want people to use Stripe. Um, we

17:21: don't plan to drop it. Uh, but we don't

17:24: want to promote it as much. Uh, we want

17:26: to like promote like Stripe, but um, we

17:28: kind of want to offer like, you know,

17:30: multiple alternatives because, um,

17:32: dropping Patreon

17:35: like, you know, maybe some people don't

17:36: want to use Stripe, maybe some people

17:38: want to use Patreon, maybe they're more

17:40: familiar with it, maybe they discover

17:41: Resonite there. Um it's I don't think

17:45: like dropping it would be a good idea,

17:46: but we're definitely going to especially

17:48: recently because we moved Stripe out of

17:50: beta. Um we're going to promote it more. So

17:54: it's going to be more at the forefront

17:56: and Patreon is going to be more in the

17:57: background. Uh it is like you know it's

18:00: helping us a lot because like we Patreon

18:03: takes roughly around 15%.

18:06: Um

18:08: like you know uh 15% uh whereas like

18:12: Stripe is something like around 6%. So

18:14: we actually get more resources. So if

18:16: you haven't switched from Patreon to

18:18: stripe consider like you know switching

18:20: over because that's going to help us a

18:21: lot.
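
For a rough sense of the difference (illustrative numbers only, assuming a $10 pledge and the approximate fee percentages quoted above; real fees vary by plan, payment method and region):

```python
pledge = 10.00
patreon_net = pledge * (1 - 0.15)   # ~ $8.50 reaches the project
stripe_net  = pledge * (1 - 0.06)   # ~ $9.40 reaches the project
print(patreon_net, stripe_net)
```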

18:24: The next question is from

18:28: um Mintshock.

18:32: Mintshock's asking, "How does session

18:33: discovery work? Essentially, how does my

18:36: computer find the list of all sessions

18:37: that I could join? Is there a

18:39: centralized system or do session host

18:41: like constantly announce, hey, I'm a

18:43: session host? Join me at this IP." So,

18:46: there's actually multiple systems to the

18:47: session announcement. Um there it

18:50: depends on the type of the session. So

18:52: for example on local network on LAN it

18:55: will use UDP broadcast on the network uh

18:57: to announce the session on the local

18:59: network. So you don't even need to touch

19:00: the cloud um it's just going to you know

19:03: is going to be listening

19:05: um is going to be like you know

19:07: listening to those packets and like you

19:09: know discover sessions that way. Uh

19:11: there's also public session

19:12: announcement. So like that's you know

19:14: sort of it's keeping you know any public

19:17: sessions uh in the database like they're

19:19: sort of like you know constantly

19:20: updating saying I'm still live and then

19:22: like when people like fetch those like

19:24: you know they're like okay these are the

19:26: public sessions and it's like a list

19:27: that anybody can access because those

19:28: sessions are public um and then you also

19:32: have a system for announcing

19:35: uh you know the less visible sessions

19:37: like contact sessions or contact plus uh

19:40: and the way that works is like over

19:42: SignalR when you send updates. Um

19:45: people like if you have a contact

19:47: session uh people essentially sort of

19:49: like listen on on a particular user like

19:51: you know you'll be registered to listen

19:52: on every one of your contacts. Um and if

19:58: they have a session they're essentially

19:59: going to broadcast a message to all

20:01: their contacts saying I'm running the

20:03: session and if you change something it's

20:05: going to broadcast it again like with

20:07: updated information. Um

20:10: if and then like you know it does like a

20:12: regular broadcast like now and then to

20:14: like keep it alive. Uh just to make sure

20:17: because like if they for example you

20:19: know drop like for whatever reason like

20:22: the session information is going to

20:23: expire and it's going to disappear.

20:25: Normally when they just shut down the

20:26: sessions or shut down, there's a message

20:28: that will broadcast that the session is closed

20:30: so that it can disappear from the

20:31: list right away. But um hope this kind

20:34: of explains it.
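
A minimal sketch of the LAN part of this (Python, illustrative only; the port number and packet format are made up and are not Resonite's actual protocol): the host periodically broadcasts a small announcement over UDP, clients listen on the same port, and entries expire if they stop being refreshed.

```python
import json, socket, time

PORT = 50000          # arbitrary port for this sketch

def announce_once(session_name):
    msg = json.dumps({"name": session_name, "time": time.time()}).encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(msg, ("255.255.255.255", PORT))

def listen_once(timeout=5.0, ttl=10.0, sessions=None):
    sessions = {} if sessions is None else sessions
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.bind(("", PORT))
        s.settimeout(timeout)
        try:
            data, addr = s.recvfrom(4096)
            sessions[addr[0]] = json.loads(data)     # remember who announced
        except socket.timeout:
            pass
    now = time.time()
    # Drop anything that has not re-announced within its time-to-live.
    return {ip: info for ip, info in sessions.items() if now - info["time"] < ttl}
```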

20:38: The next question is from Missing. Uh

20:45: let's see. There we go. Uh Missing's

20:47: asking, "Could domains allow users to

20:50: add their own domain on top of another

20:52: one without asking the host first?"

20:54: Asking since I keep thinking about

20:55: builder bubbles where a user could do

20:56: options that would usually require builder permissions

20:58: without being able to change the world

21:00: around them. For example, avatar things.

21:02: Yeah. So if you have a session, like,

21:05: builder bubbles are something we would

21:07: want to do that I think would help with

21:08: a lot of like the social scenarios

21:11: um and we would allow like it doesn't I

21:14: don't think even needs domains

21:16: specifically um we can implement it

21:18: before the domains uh in fact I think we

21:21: probably will um but yeah like if um if

21:25: you just like start one you know the I

21:28: don't see a reason why that wouldn't be

21:29: allowed since it doesn't affect the

21:31: session. And what wouldn't happen in

21:33: that case is you know it wouldn't

21:35: automatically be visible to the other

21:37: users. You could like invite them so

21:38: they could also see your bubble but that

21:40: will be per each individual user.

21:43: With domains, uh you know this can be

21:46: automated. So like you know the world

21:47: can say there should be this domain

21:50: above you know this world or connected

21:52: to this world and it's just going to be

21:53: loaded automatically because the you

21:55: know the main world says that's supposed

21:58: to be part of it. um and it will just

22:01: load it automatically. But if it's like

22:03: you know your own bubble, your own like

22:05: you know layer that you start on top of

22:06: another world like you know it's not

22:08: going to be like other users in the

22:09: session are not going to be

22:10: automatically loaded into it. Uh that

22:14: will have to be their choice.

22:18: The next question is a long one uh is

22:21: from Yosh.

22:25: Uh Yosh is asking hey I was poking

22:27: around in the Renderite DLL and

22:29: I noticed something that I find odd as

22:31: someone who has done a lot of

22:32: multiprocess architecture stuff before

22:34: while using shared memory for IPC, JSON is

22:36: being used as the structure of data that

22:38: gets passed to the shared memory, serialized

22:41: by the sender, deserialized by the receiver,

22:42: am I correct on this if I'm correct then

22:44: will JSON remain as the structure of

22:46: serialized data in the final release to

22:48: be clear I can understand why one would

22:49: want to use JSON serialization when

22:51: sketching out IPC architecture in C# is

22:53: right there standard library. It

22:55: automatically gets all the fields to

22:57: serialize. It's familiar human

22:58: observable to debug. So I'm not getting

23:00: on you anything for using during

23:02: development. I understand I'm poking and

23:03: looking at something that's unfinished.

23:05: However, it does end up used in the

23:07: final release. I have a small concern

23:09: for situations where speed is key. JSON

23:11: is usually poor choice for performance

23:12: in general since it requires not. Yeah.

23:16: Um I don't think I get the rest of it. I

23:19: can pretty much like up front say that's

23:21: used for development. Um one of the

23:24: reasons uh is you know pretty much like

23:26: you said like it's easy to human read

23:28: it's like available already in the

23:29: library. Um and right now like the bulk

23:32: of the data is being shared over shared

23:35: memory. So that's already using you know

23:37: sort of just data structures. Um which

23:39: pretty much it doesn't require any

23:41: parsing at all. It's just you know the

23:44: struct layout is shared on both ends.

23:47: So they just access the same they

23:48: interpret the same chunk of memory you

23:50: know as that struct and just operate

23:52: with the data as if it was local data

23:55: and there's like zero serialization

23:58: uh and that's like you know how the bulk of

23:59: the data is like exchanged because

24:01: that's fast. Then there are also

24:03: like structured messages or command

24:05: messages sent between you know the main

24:07: process and renderer to communicate, and right

24:10: now that uses JSON uh one of the

24:12: reasons is you know it's very quick to

24:15: like develop

24:16: and be very easy to debug. If you watched

24:18: like any of my devlogs, I actually

24:20: talked about this a fair bit. And you

24:22: can also see like, you know, I'm

24:23: literally dumping every message in like,

24:25: you know, into JSON into the log. Uh

24:28: because when I'm debugging I'll be like,

24:31: okay, like this, you know, asset or

24:33: this, you know, renderable has exploded.

24:36: It's not what it's supposed to be doing.

24:38: I'm going to check its ID. And then I

24:40: will check back in the history of the

24:41: messages, be like, okay, this happened

24:43: here, this happened here and figure out

24:45: like you know what happened. So that's

24:49: pretty much like the main reason uh for

24:51: the final build will probably use some kind

24:53: of like binary serialization. I'm not

24:55: decided on which one exactly yet but um

24:59: I'll use one of those. Uh the

25:02: system is also using TCP right

25:04: now but it's also going to be switched

25:05: to use the interprocess library which is

25:07: just going to serialize it into shared

25:09: memory like circular buffer and use like

25:11: system level like semaphores to like

25:14: indicate when it's ready to have like

25:16: you know process some data um but like

25:20: the library like it wasn't fully ready

25:21: initially like before I started adding

25:23: like the actual bulk of the data

25:26: um like there's like you know some stuff

25:28: that s was working on like to kind of

25:29: get it like working um

25:33: um and get it kind of compiled. Uh so

25:35: it's kind of easy enough to get

25:37: started with. Um there is possibility

25:41: depending you know how like once I start

25:43: doing some benchmarking there's a

25:44: possibility that the initial releases

25:47: might end up keeping the JSON in case

25:51: it's like you know already way faster

25:53: with that in place. But um that's not

25:56: going to remain that's not going to be a

25:58: permanent thing.

26:00: um like uh if if it doesn't get

26:04: replaced, you know, before the release,

26:06: it's going to get replaced after it

26:07: because once the

26:10: initial release happens,

26:12: um you know, there's going to be a whole

26:15: bunch of like cleanup task that are

26:17: going to happen to like, you know, get

26:18: even more bits of performance. But if we

26:21: have a build, you know, that's usable,

26:24: that's stable, and that already has

26:26: like, you know, significantly improved

26:29: performance, even though it's still

26:30: using the JSON in the background, I

26:33: wouldn't see a reason not to release it

26:34: in that state. Uh because we generally,

26:37: you know, try to release as early as we

26:39: can and then like iterate.

26:41: uh that way you know people start

26:43: getting the benefits of extra

26:44: performance and then as the build is

26:45: released we work on those additional

26:47: task which would be replacing you know

26:49: the JSON for like more binary um uh

26:53: serialization kind of algorithm uh and

26:55: that way like you know we release a

26:57: build and there's going to be a little

26:58: bit of extra performance and we'll

27:00: probably do like another build where we're

27:02: going to update the Bepu physics to the

27:04: latest, that's probably going to give

27:05: like another bunch of you know

27:07: performance boosts we're probably going

27:08: to retune you know how physics works

27:11: because one of the things I had to do is

27:15: I actually had to disable its multi-

27:16: threading system because with Unity's

27:20: Mono it ended up like being way slower and it

27:23: actually ended up like being faster single-

27:25: threaded but it doesn't scale as well.

27:27: So you know there's going to be another

27:28: of those things you know going to swap

27:31: it out get a little bit more performance

27:32: out of that. Uh we're going to you know

27:35: check like some of the core data types

27:37: and look at like you know switching

27:39: to System.Numerics for example which

27:41: might give us another boost. Um we'll

27:45: there's going to be a whole bunch of

27:46: like tasks. I'm kind of forgetting like

27:48: all of them but uh um once the

27:51: Splittening is released like that's not

27:52: going to be the end of it. There's going

27:54: to be sort of like polish phase

27:58: um and sort of like you know cleanup

28:00: phase where all of this stuff is going

28:02: to be done. So there is a possibility depending

28:04: how the benchmarking goes you know that

28:07: like the initial releases might have

28:08: the JSON in it um because uh you know

28:13: again if it's faster or like even with

28:16: that you know penalty I wouldn't see a

28:18: reason not to release it in that state

28:20: and then like you know replace it like

28:21: in the subsequent builds.

28:24: Um, I hope that kind of answers that.

28:28: But yeah, like the the important thing

28:30: is you know the goal isn't to keep using

28:32: that like the main goal of having that

28:34: is the debugging and still have to kind

28:37: of see how everything kind of performs

28:39: with everything in place.
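
A small sketch of the two channels described in this answer (Python, illustrative only; the field layout and message format are invented, and the real implementation is C# with the interprocess library): bulk data lives in shared memory with a fixed struct layout that both processes simply reinterpret, while small command messages are serialized separately (JSON for now, likely binary later).

```python
import json, struct
from multiprocessing import shared_memory

LAYOUT = struct.Struct("<I3f")            # e.g. a renderable id plus an xyz position

# "Engine side": write the bulk data straight into the shared block, no parsing.
shm = shared_memory.SharedMemory(create=True, size=LAYOUT.size)
LAYOUT.pack_into(shm.buf, 0, 42, 1.0, 2.0, 3.0)

# Command channel: a small structured message (JSON is easy to log and debug).
command = json.dumps({"op": "update_transform", "id": 42}).encode()

# "Renderer side": attach to the same memory and reinterpret the same bytes.
view = shared_memory.SharedMemory(name=shm.name)
obj_id, x, y, z = LAYOUT.unpack_from(view.buf, 0)
print(json.loads(command), obj_id, (x, y, z))

view.close(); shm.close(); shm.unlink()
```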

28:44: Uh let's see the next one is from

28:50: Vanport.

28:50: Uh I'm going to make it a little bit

28:52: bigger.

28:54: Oh, did I?

28:56: Oh, I I duplicated too many. Um, Vanport

28:58: is asking what is the intent uh behind

29:01: the object or component? It doesn't

29:04: appear to do anything, but there are

29:05: things inside for extension attached to

29:07: slot and fill its field. So, it feels

29:08: like it was meant to have some kind of

29:09: use, especially when one uh is created.

29:13: It has a snapper to it too. So, I don't

29:15: actually I don't actually remember

29:18: exactly what it was meant to do. What I

29:21: think that was I think I was working on

29:23: a tool sort of like you know to place

29:25: objects. So you have like you know the

29:26: material tool where you put like the

29:28: material orbs. You have the mesh tool

29:30: and I think I was like wanting to make a

29:32: tool where you can like you know just

29:33: put an object in it and it just be like

29:36: you know click around and just place it

29:38: in the world like very easily. So I

29:40: think that's what that was for and it

29:42: just never got finished and now there's

29:44: like you know just an initial fragment of

29:47: it that never got finished which we

29:50: still would like to I would like us to

29:52: have like you know proper tool for like

29:54: just placing objects very easily with

29:55: lots of options but people have already

29:57: built those kinds of tools in game too

29:59: so we might not do those you know in

30:02: code and just do them in game because

30:04: this might be more flexible

30:07: but there's a bunch of stuff like you

30:09: know like that in the engine where it's

30:10: just like

30:12: remainders of stuff that like you know

30:14: work started on something and then got

30:16: distracted or had like other things.

30:19: Thank you Climber Bear for the

30:20: subscription.

30:23: Okay, so the next question is from

30:25: Zizil. Uh just position it. Uh Zizil is

30:29: asking once the Splittening has occurred

30:31: is it going to be possible for us for

30:33: example users to be able to see scene

30:36: rendering statistics similar to debug

30:38: info in Unity in a way that we see in

30:40: devlogs. That kind of information would

30:42: be very useful for optimizing busier

30:44: scenes. So unfortunately not really

30:47: because um that's a Unity editor thing,

30:50: like for that you know

30:52: to happen in game, like,

30:54: we would have to actually implement that

30:58: um and problem is like Unity doesn't

31:01: give you those statistics on runtime

31:02: like from what I found like there's not

31:04: really a way to access most of them so

31:08: it might require like a significant

31:09: amount of like work that might have like

31:11: performance impacts um I would say like

31:14: you can

31:15: make it into a feature request. But like

31:17: you know that's not something that's

31:18: just going to happen automatically

31:19: because of the Splittening. Um

31:21: technically like you know the Splittening

31:23: doesn't even have any effect on that

31:25: because even before we were like you know

31:26: using Unity as well and it didn't really

31:30: give you that information.

31:33: And the last questions uh the last

31:35: question from uh Discord

31:38: uh we'll be getting to the

31:41: um be getting to the uh Twitch questions

31:44: in a second. But the last one is from

31:47: Moonbase is asking uh why can't, for

31:50: example, you drive a value with more than

31:51: one source? I have a feeling I know the

31:53: answer but I'm curious on how you put

31:54: it. So pretty much what drives are is

31:58: like it's a way to tell the data model

32:02: you know it's uh it's essentially being

32:04: like I'm taking exclusive control of

32:06: this value don't handle it and uh

32:13: you know that's pretty much it is like

32:14: you know it is an exclusive control

32:16: which means you can't have like you know

32:18: two things have exclusive control

32:19: because then by definition it's not

32:22: exclusive control. Um and like the

32:25: philosophy behind that is you know if

32:27: something has exclusive control that

32:29: system is fully responsible for you know

32:33: determining you know what that value is

32:36: and it also knows it can determine like

32:38: what that value is because it's the only

32:40: one that can actually control that

32:42: value. Like there's not going to be

32:43: something that's going to interfere with

32:44: it that it would have to worry about

32:47: that could make the value you know be

32:48: something that it's not under its

32:50: control. And if you had like you know

32:52: two systems for it um you know then

32:58: um

33:00: there's like you know if you had like

33:01: two systems like you know now they don't

33:04: really like one can change the value and

33:06: the other one changes the value but like

33:08: the other one doesn't know about the

33:09: change and now like you know the value

33:11: is inconsistent.

33:14: uh you know very much it it doesn't like

33:17: work by definition. If you have like

33:19: multiple thing if you want to like

33:20: combine multiple sources into a single

33:22: value then what do you have to do say

33:24: for example with prolex do whatever

33:26: logic you want to combine the values but

33:28: that becomes you know a single single

33:30: source that drives the value you have to

33:32: be very explicit about how multiple

33:35: sources are combined into the final

33:38: value.
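
A tiny sketch of that rule (illustrative Python, not the actual drive API): a drive is exclusive control, so a field accepts only one driver, and combining several sources means doing the combination explicitly (as you would in ProtoFlux) so the combined result is the single driver.

```python
class DrivableField:
    def __init__(self, value):
        self.value = value
        self.driver = None

    def try_drive(self, driver):
        if self.driver is not None:
            return False                  # already under exclusive control
        self.driver = driver
        return True

    def set_from_driver(self, driver, value):
        assert driver is self.driver, "only the exclusive driver may write"
        self.value = value


field = DrivableField(0.0)
sources = {"a": 2.0, "b": 3.0}

combiner = object()                       # the one explicit combiner owns the drive
assert field.try_drive(combiner)
field.set_from_driver(combiner, sum(sources.values()))   # explicit combination logic
assert not field.try_drive(object())                     # a second driver is rejected
print(field.value)                                       # 5.0
```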

33:41: So with this, all the um all the Discord

33:45: questions are done. So we can

33:48: um we can start getting into the Twitch

33:52: questions. And if you're like tuning in

33:54: late, uh there's a whole bunch of

33:55: questions already. Uh but if you want to

33:57: ask your question, make sure to put a

33:59: question mark uh somewhere in it,

34:01: usually at the end. That way it kind of

34:02: pops on my thing. I can actually show

34:04: you right here. Um it pops here. and I

34:09: will make sure that I don't miss it. So

34:12: the first questions from Epic Easton uh

34:15: Epic Easton's asking uh I asked Prime this

34:19: uh but I thought I would ask you as well

34:20: what is order offset and how does it

34:22: work? So order offset you find that on

34:25: slots and essentially what it does it

34:27: just it lets you tell what order the

34:30: slot should be in the hierarchy. like

34:32: when you have a bunch of children uh it

34:35: just determines you know where it is

34:37: like relative to the other children or

34:39: in a lot of cases like it doesn't matter

34:41: but there's like systems where it does

34:43: like for example you know UIX in UIX the

34:47: order offset will determine you know in

34:50: which order the actual UIX elements are

34:52: rendered so you'll often find you know the UIX

34:54: systems will assign orders similar thing

34:57: like you know with uh aligner systems

34:59: you have like you know say access

35:00: aligner or you know sphere aligner or

35:02: something. The order of the children

35:04: determines their order in, you know, in

35:07: the aligned like um aligned positioning

35:11: and the order offset lets you pretty

35:12: much control where that order is.
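
A tiny illustration of what that means in practice (illustrative values only): systems that care about child order, like UIX layouts or aligners, effectively consume the children sorted by their order offset.

```python
children = [
    {"name": "Header",  "order_offset": -100},
    {"name": "Content", "order_offset": 0},
    {"name": "Footer",  "order_offset": 100},
    {"name": "Banner",  "order_offset": -200},
]

for child in sorted(children, key=lambda c: c["order_offset"]):
    print(child["name"])    # Banner, Header, Content, Footer
```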

35:18: Uh next question

35:24: uh is from Shadow. Uh looking at devlogs

35:27: uh you don't seem to have touched VR

35:29: support yet uh with the Splittening. How

35:30: difficult do you expect it to be? And

35:32: how long do you expect it to take? Yeah,

35:34: I haven't like done that part yet

35:35: because like pretty much like right now

35:37: for everything I can just test it in

35:39: desktop. Like I don't need to be jumping

35:41: into VR. Uh and the stuff is, you know,

35:43: same between desktop and VR. So I'm kind

35:45: of not focusing that part yet. I don't

35:48: think it's going to take like really

35:49: long. Um it's mostly just, you know,

35:51: proxying a bunch of the VR inputs and

35:53: making sure the transitioning between desktop

35:55: and VR works fine. But um I don't think

35:59: it's going to I think it's probably just

36:00: like a day or so, maybe, we'll see

36:03: how it kind of goes once I get to it.

36:05: But right now the focus has been you

36:07: know just getting all the rendering

36:09: stuff working.

36:11: The next question is from Reen.

36:15: Uh Reen's asking good evening. Uh hope

36:17: you are doing fine and are maybe

36:19: unmelting a bit. Uh will you do some

36:21: more performance updates after the

36:23: Splittening, for example the cascading

36:24: asset dependencies thing with VRAM and

36:27: RAM usage. Um I'm not sure like on that

36:31: specific one that's one I would

36:32: definitely love to do because I think

36:34: that's going to help a lot but um

36:39: I like like I was talking like a little

36:41: bit earlier the goal is you know

36:46: um the goal is like you know like we

36:48: release the spinning and then like we do

36:50: a bunch of like additional kind of

36:51: tasks. So like we get a little bit more

36:53: like you know updating Bepu, switching

36:55: things, switching you know what

36:56: primitives we use like for example we

36:58: want to switch over to using a lot of

36:59: the concurrent collections because like

37:02: we actually used the concurrent

37:04: collections like in early versions and

37:07: then we had to like replace those um or

37:09: I had to replace those with like uh spin

37:11: collections because with Unity's garbage

37:14: collector the concurrent collections

37:15: actually end up like hurting performance

37:17: a lot causing a lot of freezes. Um but

37:20: with modern like you know .NET runtime

37:22: which is you know what they're

37:24: designed for they actually perform much

37:26: better. So there's going to be a lot of

37:28: like small tasks like that or medium

37:30: tasks. Uh so it's possible like there

37:33: might be you know some more other things

37:34: but also might be like we'll be at some

37:36: point okay like if the performance is

37:38: like been improved like significantly we

37:40: want to focus on a few other things for

37:42: the time being. Um, but we'll see. Like

37:44: the the cascading one I think is going

37:47: to help quite a bit with things and

37:48: especially with some of the stuff how um

37:52: how the Splittening is designed to work

37:54: because it makes it makes it like you

37:56: know when you have something that's not

37:58: visible to you. It doesn't actually send

38:02: it to the renderer, like, this is on the FrooxEngine

38:04: side so that we have like a good

38:04: synergy there. But I don't want to make

38:06: any promises right now

38:10: and someone's asking, no Cyro today? Uh yeah,

38:12: so he's been traveling so he's not

38:14: available today. Maybe next week.

38:19: And we got a schnopit from Grand UK. And

38:22: actually I do have like one schnopit. Um

38:26: there's um so there's like one thing

38:29: that like I kind of notice sometimes is

38:30: like where if we like what kind of like

38:35: you know bugs me is like when we don't

38:38: work on something of often times people

38:40: will assume that like you know we don't

38:42: know how to do it like we don't know how

38:43: to approach it or we don't know how to solve

38:45: the problem where in a lot of cases you

38:48: know it's more just like we literally

38:51: didn't have time to do it yet. There were

38:52: like other priorities and the reason we

38:54: haven't done it is just we haven't had

38:56: time because in a lot of the cases like we

38:59: know how we are going to solve certain

39:01: problems like and that's not like you

39:03: know what's blocking us but we need like

39:06: you know we can't prioritize everything

39:09: like there's only so many things we can

39:10: work on at a time and then you know we

39:12: kind of start getting into the thing

39:14: where if people are like you know like

39:15: oh we could do this we could do this we

39:16: could do this and and I find like often

39:19: times like people jump into like the how

39:22: of something

39:24: before asking should we because

39:29: like

39:31: one one of the most limited resources

39:32: you're going to have as a developer is

39:34: your time.

39:36: Um and you have to be like you know you

39:38: have to be like concerned with how you

39:40: spend the time and which things you

39:42: invest it into because you're not going

39:45: to be able to do everything. So we have to

39:46: kind of choose which things you know are

39:49: the most impactful. Uh and this kind of

39:52: happens with GitHub. You know sometimes

39:53: people will make issues and they'll be

39:55: like and we ask people like you know

39:57: what is your use case? Why would you

39:59: want us to implement this? And sometimes

40:01: people say like, "Oh, it would just be

40:02: nice to have, but um that's not really

40:07: going to help it get prioritized because

40:09: yeah, it might be nice to have,

40:12: but do we really need it right now over

40:15: like, you know, dozen other things? Like

40:18: is it is it more important?"

40:20: Because if we like prioritize one thing,

40:22: you know, then we have to then we have

40:24: to not prioritize another. And that's

40:27: you know often times like where the

40:28: thinking goes is you know like if why

40:31: should this thing be prioritized over

40:33: all the others that you know are

40:34: potentially on our plate. Um and you know

40:38: issues end up like being pushed into

40:40: like you know background but then people

40:42: will assume like oh just you know why

40:44: don't we just do this and it's it's it's

40:46: not really about like

40:48: not knowing what to do technically on

40:50: it. It's more about like we didn't

40:52: decide to invest time into it. And when

40:56: people start like you know pulling us

40:57: into like how working into the solutions

41:00: themselves

41:01: you know it feels like we're being kind

41:04: of forced you know into like dealing

41:07: with the issue we have like where we

41:09: have decided like we're not going to

41:10: prioritize this right now.

41:13: And in that case like you know the

41:16: conversation should be this should be

41:18: prioritized for this and this reason

41:19: like you should spend time on this for

41:21: this and this reason before we start

41:23: jumping into how. But when we get pulled

41:25: into the how of it, you know, it just

41:27: feels like we just skipped a bunch of

41:29: steps. Um, and it can be a bit jarring.

41:32: So that's that's that's that's my

41:35: schnopit.

41:39: Uh, we also got a Type One from

41:42: Ace and G. I um

41:46: you know I'm just I'm just going to I'm

41:47: going to be lazy and I'm going to be

41:48: like uh my Type One, which is like the opposite

41:51: of a schnopit, is when people do have like

41:55: you know good explanation of use case

41:57: when people go okay this could help me

41:58: like you know this project and this

42:00: project or maybe I notice you know users

42:02: being affected by this and being

42:03: affected by this because often times

42:05: when people do that um and sometimes you

42:07: know I will I'll be like you know people

42:09: will not provide their reason, and I'll be

42:11: like what's your use case for this, you

42:12: know why do you want it prioritized, and

42:14: then they explain it. I'm like okay now

42:16: it makes sense to me you know now um I

42:19: understand what you're trying to do

42:21: which not only helps you know to

42:22: formulate a good solution but it gives

42:25: um it gives me sort of context you know

42:28: for why is it needed and I understand

42:30: what kind of impact implementing that

42:32: feature can have uh because often times

42:34: prioritizing things it is about like you

42:37: know it is the ratio of like how much

42:40: effort we need to put into this issue

42:41: versus what impact it'll have on

42:44: Resonite, the community and like you know

42:47: the sustainability of the platform. So

42:49: any information that people whenever

42:51: people do give us that information that

42:53: makes our work much easier and we

42:56: appreciate that

43:00: uh, thank you for the subscription and also thank

43:03: you Jack for the cheers.

43:06: Uh, Grand is asking, um,

43:11: uh,

43:13: how confident are you with the work

43:14: you've done on the Splittening so far? So, at

43:17: this point, I'm pretty confident with

43:18: it. Um, if you haven't seen the latest

43:21: devlog, I pretty much got it like

43:23: working outside the Unity editor. Um,

43:25: and it just it it just it kind of works.

43:28: Like there's still, you know, stuff that

43:30: has to be implemented. like I like right

43:32: before the stream was actually working

43:33: on the particle systems and I I almost

43:36: have those working. Um, and then like

43:39: you know there's like some little bits.

43:40: Actually, I might actually be doing some

43:41: of the VR stuff after that too, but

43:44: we'll have to see. Um, but generally

43:47: it works

43:50: like

43:52: it's uh it's like you know it's it's

43:56: pretty much like past the phase like

43:57: where I would be worried like is this

44:00: going to work at all? like, you know, is

44:02: this

44:04: like is it like is this like, you know,

44:08: going to hit some like weird uh like,

44:10: you know, unexpected roadblock that's

44:12: just going to make the whole thing fall

44:14: apart. And

44:16: like I'm I know like that's not going to

44:18: happen at this point now because you

44:20: know the stuff is generally working and

44:22: most of what work I've been kind of

44:23: doing is just you know implementing more

44:26: of the things that need to be proxied

44:28: for rendering and implementing more

44:30: things for the input system. So it is

44:33: just kind of you know it's almost like

44:36: when you start a project like you know

44:38: it's like this big cloud of you know

44:40: unknown and you don't know where

44:42: anything will go and it's just

44:43: everything's floating everywhere and you

44:45: kind of start working on it. You start

44:47: putting pieces together and then it's

44:48: sort of like you hit that point where it

44:51: starts crystallizing and now like you

44:53: know it starts forming a pattern and

44:55: everything kind of nucleates

44:58: and you know

45:00: kind of falls in place and that's kind

45:03: of how it's been you know going for for

45:06: quite a while now with the with the

45:07: thing. So

45:11: um so yeah I'm I'm pretty confident. The

45:14: main thing still that's like still

45:16: unknown is like you know how much is it

45:18: going to help with performance because

45:20: um like I haven't been able like to

45:22: really get the builds to like work

45:25: without all the debugging stuff yet like

45:27: without running in debug mode. Um I

45:29: did like there's like a benchmark that

45:32: somebody sent me and I run it with the

45:33: debug mode but when um when the even net

45:37: 9 when it's running in the debug mode

45:39: it's actually compiled without

45:41: optimizations to the code it's compiled

45:43: with a lot of instrumentation so there's

45:46: like you know a lot of like things where

45:47: it's like waiting for break points and

45:48: where is like you know gathering data

45:50: about the runtime and that's adding a

45:52: lot of overhead too and on top of that

45:55: I'm also like you know logging literally

45:58: every like message that's happening and

46:01: even with the message sometimes I will

46:02: like log every individual operation like

46:04: this material is being updated to this

46:06: this material is being updated to this

46:07: and it's being logged all the time. Um,

46:11: so like with that like it's it doesn't

46:13: really give you like representative

46:14: performance and the focus has been you

46:17: know just kind of like getting getting

46:18: stuff to work and I'm still yet to get

46:20: to the phase but I'm like okay I'm just

46:22: going to remove all of that and I'm

46:24: going to let it run and see how it runs.

46:26: And I don't know how fast it's going to

46:28: be running. And right now like the

46:32: feeling is like you know this should be

46:33: better. Oh, actually the other

46:36: thing is there's also like you know um

46:41: um another big thing is you know that um

46:44: the actual loop I'm just like using like

46:48: a loop that's like made more for the

46:50: headless that has its own like frame

46:52: pacing that's set to 60 frames a second

46:57: and like then Unity has like its own

46:59: loop and now those two are kind of like

46:59: you know trying to like do a thing I

47:00: need to actually write a proper render

47:02: loop that like is not going to try to

47:04: keep its own frame pacing. It's just

47:06: going to let it be driven by the

47:10: renderer uh when the renderer is like you

47:10: know active.

47:12: Um so there's like still like a whole

47:14: bunch of bits and like right now that is

47:16: still unknown and I'm hoping like like

47:18: it is as big as it can be but until

47:21: all that is done it is an unknown or a

47:25: relative unknown and we mostly have an

47:27: educated guess and the educated guess is

47:32: you know it it's It's looking good based

47:35: you know on the data we got from the

47:37: headless which is one of the reasons

47:38: like we put like you know or I put like

47:40: so much like work into this uh is

47:43: because the headless running the same

47:45: code that like you know runs the

47:46: graphical client except all the visuals

47:48: it showed very substantial performance

47:50: improvement with net 9 compared to mono

47:53: so I think it's going to be substantial

47:56: but until like you know all is done I

48:00: there's you know still always like part

48:02: of your brain where it's like what if

48:03: what What if it's bad? What if what if

48:06: it's going to explode for some reason?

48:07: But um everything is like unlikely to

48:10: like be like, you know, dead. Like I

48:12: think it's going to be but like I'm one

48:15: of those people like you know where even

48:17: even if there's like

48:20: 1% chance of something kind of bad

48:23: happening, I'm going to be thinking

48:24: about that 1% and be anxious about it.

48:27: So um there's that. But even so like you

48:30: know all the data I have like is you

48:33: know like

48:36: it's very promising

48:38: but my brain keeps focusing in that 1%

48:41: but I think it will be good like

48:42: rationally I think it will be good.

48:47: So, I would say rationally I'm

48:48: confident emotionally

48:51: I'm an anxious person.

48:56: Next question is from Jet Forge: will you

48:59: be attending Eurofurence 29? Yes. So I

49:04: mean assuming like something doesn't

49:05: happen but like I should be there. Um

49:08: this also always confuses me because

49:10: like like a lot of cons they will like

49:12: use the year but this is actually just

49:14: the number of it. So this is like you

49:16: know this year but it's like it's um

49:19: it's weird.

49:22: But yeah, I I should be there and I hope

49:24: to see a bunch of people there.

49:28: Uh, next question

49:31: uh is from Jello

49:34: 20216_.

49:35: [Music]

49:37: Are you ever worried that VRChat is

49:38: going to add enough features to make Resonite

49:40: unappealing, as it would then just be an

49:41: alternative with less people? For

49:43: example, with the news you could change

49:45: height and interact with normal avatar

49:47: bones, something you can also do in VR-

49:49: Chat. Honestly, I don't think that's

49:51: going to happen. Like I think they can

49:53: like, you know, do a lot of like the

49:55: smaller kind of things, but big part of

49:57: Resonite is, you know, we're not trying

49:59: to copy VR chat. You know, we are making

50:02: Resonite to be what we want a social

50:05: platform to be, but not just a social

50:08: platform, but sort of like, you know,

50:09: one where you can pretty much do

50:10: anything.

50:12: And big part of Resonite is, you know,

50:15: the creativity of it, being able to like

50:17: edit things, being able to build in

50:18: game.

50:20: um and being able to like you know have

50:22: like this like amount of flexibility

50:24: with everything and even like more long

50:27: term, like, we want Resonite to be,

50:28: you know something where you could work

50:30: where you can build your virtual studio

50:31: where you can produce content um you

50:34: know like where you can do like 3D

50:36: modeling where you can like do

50:39: pretty much whatever you imagine you

50:41: know like something that goes beyond a

50:43: social VR platform and I don't

50:47: think like you know they're going to be

50:48: able to replicate that because in order

50:51: to do a lot of like what Resonite does

50:53: especially with like you know the

50:54: interactivity of the stuff you have to

50:57: build the engine from the fundamentals

51:00: and as you know the big part like you

51:02: know where we where our benefit is that

51:04: we put all the work into, you know, Froox-

51:06: Engine, um, into the data model and we

51:09: build the entire engine around the data

51:11: model to enable all of this and it's if

51:15: you wanted to do you know something that

51:17: Resonite does, like, to this

51:19: um with existing solutions like it's one

51:22: of those things where if you just tack

51:25: it on you know with the traditional

51:27: approach just like ends up like you know

51:29: falling under its own weight. So everything,

51:32: like, it is going to be, you know,

51:34: especially as time goes on, Resonite

51:36: is going to be more kind of distinct in

51:39: what it offers to users where the social

51:41: aspect is you know it's just part of it

51:44: but there's like you know lots of other

51:45: things to do that like are not possible

51:47: to do in other software and then

51:50: necessarily you know are not even

51:52: focused like on the social aspect but

51:54: where the social aspect is more just

51:58: making things natural Because one of the

52:01: things I like to think about is you know

52:02: in the real life you can you know you

52:05: can visit a friend and the friend can be

52:07: like you know uh they can be in their

52:09: workshop and it could be like you know

52:10: messing with things you know and you can

52:12: be hanging out with them while they kind

52:14: of work on their stuff and chatting

52:15: about things or like you know maybe the

52:18: friend is like you know painting

52:20: something or whatever task they're doing

52:24: they do not exist you know in a

52:26: completely different form of reality

52:28: where like you wouldn't be able to like

52:29: join them and interact with them. But

52:32: that's kind of like how things are a lot

52:34: like you know in

52:36: virtual software like if somebody's like

52:38: in some kind of like you know say

52:40: sculpting like you know VR sculpting

52:42: tool they're sort of unless the tool

52:44: actually has like multiplayer they're

52:46: going to be you know isolated they're

52:47: going to be on their own you cannot

52:48: visit them until they are finished with

52:51: a task. Um and the big vision of

52:54: Resonite is like you know that like

52:57: whatever application is developed here

53:00: you always have that social layer you

53:02: always have the persistence you always

53:04: you know have that interoperability

53:06: where you can have multiple people in

53:08: the same environment doing bunch of

53:10: different things

53:12: uh because

53:14: like I feel like we don't even like you

53:15: know think about it where in real life

53:17: like you know we can socialize in so

53:19: many different context doing so many

53:20: different things that like often times

53:23: it's not like, you know, the

53:24: socialization isn't something you think

53:25: about. It's just the it's just the way

53:27: the real world works. You know, you can

53:29: talk to people, you can see them, you

53:31: can move around them. Um it's not

53:34: something you have to think about. It's

53:35: not something that has to be, you know,

53:37: enabled in some way. It's just inherent

53:40: the reality.

53:41: But with like, you know, social VR

53:43: platforms, I feel like, you know, that

53:44: kind of becomes that forefront. It's

53:46: like, you know, that's that's that's

53:47: like the thing you do. You socialize.

53:50: Um but with Resonite I feel like you

53:54: know

53:55: like we try to make the social

53:58: socialization be more of, like, you know,

54:00: something that's just a given, that's, like,

54:03: you know, the base layer. Of course

54:07: you can socialize because that's how the

54:08: real world works. But then what can you

54:10: do on top of it? What kind of activities

54:12: can you do? Can you you can work? You

54:14: can educate. You can have fun. You can

54:16: play games. You can build things. You

54:18: can have multiple people hanging in a

54:20: room, everybody doing their own thing or

54:22: collaborating like it it doesn't matter.

54:25: um say like you know you're a musician

54:27: once we add stuff like, you know, ProtoFlux

54:29: DSP, you'll be able to, like, you know, say

54:32: compose music

54:34: and you know often times music musicians

54:36: they will collaborate and if you

54:38: collaborate in a real life you can

54:40: literally invite somebody into your

54:41: studio and you know you can be all

54:43: messing up with the knobs and plugging

54:45: wires and playing things you know and

54:47: doing stuff but like you can't do it

54:50: like easily in virtual world and that's

54:52: what Resonite is designed to, you know,

54:53: sort of enable

54:55: where that just becomes automatic. Like

54:57: the the social aspect of it, which like

55:00: if you invite somebody to your studio to

55:02: like work with them on something, you

55:03: don't think about, you know, needing to

55:06: have some system to be able to talk to

55:09: see each other and to see whatever

55:11: you're doing in the environment staying

55:13: in sync between each other. And that's

55:16: kind of like, you know, what Resonite is

55:17: and why it's designed the way it is: to

55:19: enable interactions like that. And I

55:22: think that's the biggest advantage like

55:23: you know over other platforms because

55:25: the other the other platforms they more

55:28: focus you know they make their social

55:30: thing the main thing

55:32: and then you tack on a few other things

55:34: about it, but like they will not be able

55:37: to do it in sort of you know

55:41: in such a fundamental way that we do. So

55:44: like long-term I'm not really

55:47: worried, because like so far, like,

55:48: what I've kind of seen is like you know

55:50: it's those kind of smaller things but I

55:53: don't think the fundamental aspect of

55:54: resonite is something that can be um I

55:59: don't think like I don't think you are

56:01: able to get there with feature creeping

56:03: like I think you can do like you know

56:05: some surface things those you can kind

56:07: of feature creep and mimic, but the

56:10: fundamental stuff like you need to

56:12: approach it fundamentally to get there

56:14: and doing that with an existing platform

56:17: that's going to be exceedingly like hard

56:21: like it was and I can tell it was

56:23: already hard to do it you know from

56:25: scratch uh because I've been working on

56:27: FrooxEngine for I don't even know how

56:30: long I think it's over a decade like

56:32: with a lot of the principles of it and

56:34: and stuff

56:36: um yeah, so yeah, I don't know what

56:42: else to like say on this one at this

56:44: point. But I hope that kind of answers

56:45: the question.

56:48: I could maybe like ramble, but I want to

56:51: get to the other questions, too.

56:55: Uh, next. Actually, how much time is

56:58: Okay, I got we got one hour left. So, I

57:01: think I should be good on time. There's

57:03: whole bunch of questions. So, oh yeah,

57:06: there's a lot. I should be

57:08: getting through these faster, I guess.

57:12: Um, JFT tools asking, uh, favorite indie

57:16: game releases of 2025 so far. Um, I

57:20: don't actually know many. I haven't

57:21: really played that many games. Like the

57:23: main big game I played this year was

57:25: Doom: The Dark Ages, but that's not indie.

57:29: Um, I don't know. There's like one I'm

57:31: looking forward to, but I don't know if

57:32: you consider it indie anymore, but I

57:34: guess so. Uh, it's Hollow Knight: Silk-

57:37: song. Like that one I'm really looking

57:39: forward to. So maybe that one.

57:44: Uh

57:48: uh Jake the Fox is asking on a scale of

57:50: one to mango, how are you excited for

57:53: the Splittening? I'm going to say

57:55: mango gelato.

57:57: That's final answer.

58:01: Um

58:04: let's see. Uh next question is from

58:06: Kaiobs. Uh, I just thought of something

58:09: after Stop Killing Games got to go back on

58:11: the radar. Would there be any plan for

58:13: local storage instead of just saving

58:15: out? So, this has been asked before, but

58:17: SKG did get me thinking how Resonite could

58:20: continue via community if development

58:21: stopped. So, you can already save some

58:24: things like locally like you can save

58:26: stuff to your local storage, but that's

58:27: definitely something we want to, like,

58:28: flesh out a bit more. There's like few

58:30: other things where um

58:34: um there's like few other things

58:37: like where

58:39: you know like stuff isn't cached or

58:41: loaded from cache properly that like we

58:43: need to kind of fix up to make this work

58:44: better. But I feel like if we were to

58:46: like you know like one of the things

58:48: that's been kind of on my brain like

58:49: even in the past um because like in the

58:52: past like you know some of the things

58:54: were kind of uncertain is like

58:57: if something like went really bad you

59:00: know like and we knew like we had to

59:02: like we wouldn't be able to work on this

59:03: project anymore. What we would probably

59:05: do or what I would probably do is do

59:10: something to transition into like very

59:12: self-hosted kind of thing. you know,

59:14: publish the code, maybe wrap it around,

59:16: maybe adopt some sort of like, you know,

59:18: very like peer-to-peer protocol and make

59:20: sure like, you know, it can it can live

59:21: on in some form. So, I will probably

59:24: make it that, you know, be my kind of

59:26: like last act or something like if if it

59:29: literally got to that point. And I don't

59:32: think it will like will kind of fight

59:34: like, you know, if if it were even

59:35: getting closer that we probably like,

59:37: you know, fight and look for like ways

59:38: to kind of keep going on. But um for

59:42: myself like I I love like you know

59:44: preserving things. Um it's one of the

59:46: reasons I also like 3D scanning. Um and

59:49: I like archiving things. So we'll

59:52: definitely do something kind of along

59:53: those lines to make sure it can keep

59:54: going.

59:57: And also the climber, thank you for the

59:59: subscription in case I missed that. Um

01:00:01: I'm going to pop that on the stream just

01:00:03: there. We go.

01:00:07: Uh, Mintshock is asking,

01:00:11: uh, with how parallelized and

01:00:13: multi-threaded everything is, there is a

01:00:15: general question: will you run into issues

01:00:17: where the CPU physically can't do any more

01:00:19: operations in parallel? I mean, CPUs aren't

01:00:21: that great at doing many things at once.

01:00:23: Uh, that's why you have GPUs. Um, it

01:00:26: kind of depends like usually with

01:00:27: parallelization like um there's like

01:00:30: multiple problems that you run into with

01:00:32: parallelization. One of the most

01:00:34: fundamental ones is probably, you know,

01:00:35: just making algorithms that are well

01:00:38: parallelizable, because you could

01:00:41: just, you know, take like some existing

01:00:43: code and parallelize it, but then like

01:00:47: it's not going to work. It's going to

01:00:48: break because like um you're going to

01:00:50: have the same pieces of code, try to

01:00:52: modify the same things at the same time

01:00:54: and just corrupt the state and explode.

01:00:57: So what do you do? You add locks around

01:00:59: it. So like you know they they lock on

01:01:01: the resources but now you have created a

01:01:03: source of contention. So the contention

01:01:06: you know one thread is going to try to

01:01:08: access something another's going to try

01:01:09: to access something another's going to

01:01:10: try to something but only one can work

01:01:12: at a time. So now you actually have the

01:01:14: cores waiting a lot of the time, and

01:01:17: that's not going to like you know

01:01:18: perform super well. In fact, sometimes

01:01:21: when you do that, the parallelized code,

01:01:24: especially if you parallelize it that way,

01:01:26: will perform worse than single threaded

01:01:28: because now there's a whole bunch of

01:01:29: like overhead with like, you know, the

01:01:31: locking mechanisms. So to get really a

01:01:34: lot out of parallelization,

01:01:36: you ideally need to design structure

01:01:39: everything so it benefits from it. And

01:01:41: usually the form it takes is where the

01:01:44: each core can take a chunk of work

01:01:46: that's pretty much independent of all

01:01:47: the other work. uh it doesn't need to

01:01:50: like you know communicate with the work.

01:01:51: The core can like you know spend a whole

01:01:53: bunch of time heavily processing its own

01:01:55: chunk of work and at some point later

01:01:57: like you know all the work gets kind of

01:01:59: pulled together and reintegrated and

01:02:01: that tends to perform like a really

01:02:03: well. Uh, the problem is not every single

01:02:06: task, you know,

01:02:09: can really be reworked that way.
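
A minimal C# sketch of the two approaches just described: a naive parallel loop where every worker contends on one lock, versus the pattern where each worker processes its own independent chunk and the partial results are merged once at the end (here via Parallel.ForEach with thread-local state). This is a generic illustration, not Resonite code.

```csharp
using System.Threading;
using System.Threading.Tasks;

static class ParallelSum
{
    // Naive parallelization: every iteration contends on a single lock,
    // so the cores spend much of their time waiting on each other.
    public static long WithLock(int[] data)
    {
        long total = 0;
        object gate = new object();
        Parallel.ForEach(data, value =>
        {
            lock (gate) { total += value; }
        });
        return total;
    }

    // Partitioned parallelization: each worker accumulates into its own
    // local subtotal and the subtotals are merged once at the end.
    public static long Partitioned(int[] data)
    {
        long total = 0;
        Parallel.ForEach(
            data,
            () => 0L,                                   // per-worker local state
            (value, _, subtotal) => subtotal + value,   // independent chunk of work
            subtotal => Interlocked.Add(ref total, subtotal)); // single merge step
        return total;
    }
}
```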

01:02:11: So it's going to depend on what kind of, uh,

01:02:14: what kind of problem you're dealing with

01:02:16: and how well it will parallelize. And there

01:02:18: is a theorem for it, or like, I

01:02:21: forget what it's called, or a law or

01:02:23: whatever, I forget its name, but pretty

01:02:25: much it says, you know, every single

01:02:27: algorithm has a point beyond which it

01:02:31: cannot be further parallelized.
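
The result being reached for here is most likely Amdahl's law: if a fraction p of the work can be parallelized and the rest is inherently serial, the speedup on N cores is bounded by

```latex
S(N) = \frac{1}{(1 - p) + \frac{p}{N}}, \qquad \lim_{N \to \infty} S(N) = \frac{1}{1 - p}
```

so, for example, if 90% of an algorithm parallelizes (p = 0.9), no number of cores can push it past a 10x speedup.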

01:02:33: It's also the thing with GPUs: you know, GPUs

01:02:35: are great at parallelizing things and

01:02:37: running, you know, huge things in parallel;

01:02:39: problem is, not everything parallelizes well.

01:02:43: So GPUs are great on bunch of tasks, but

01:02:45: they're also not going to be good, you

01:02:47: know, a task that are more

01:02:49: heterogeneous, you know, in how they

01:02:51: kind of work. Like like GPUs in

01:02:54: particular, they're like super designed

01:02:56: for when you're processing a huge block of data

01:02:59: and you're processing every piece pretty

01:03:01: much the same way as every other piece.

01:03:03: If you if you're doing lots of different

01:03:05: things for every piece, the GPU is not

01:03:07: going to run well with that. It's not

01:03:08: designed to run that well. Um, and it

01:03:11: can kind of deal with like some of it,

01:03:12: but like the more you kind of the more

01:03:14: you diverge, the worse it will kind of

01:03:16: get. Um,

01:03:19: So there's actually other aspects too, because,

01:03:21: like, you do have some resources that

01:03:23: are also shared. So if you parallelize things

01:03:26: and one of the problems we actually run

01:03:27: into right now with Unity is when some

01:03:29: of the stuff is paralyzed because it all

01:03:32: uses the same memory. It ends up running

01:03:34: into issues with a garbage collector

01:03:36: where if a lot of threads are just you

01:03:39: know going ham allocating tons of memory

01:03:42: processing tons of data loading tons of

01:03:44: assets it's going to make Unity and

01:03:46: specifically the garbage collector pause

01:03:48: the entire application just to like you

01:03:50: know collect memory and free it up. So

01:03:52: even though like you're doing bunch of

01:03:53: work in parallel and that work is not

01:03:56: even touching anything you're doing on

01:03:57: the main thread which is you know

01:03:58: updating all of this stuff it's going to

01:04:01: affect the main thread because it's

01:04:02: going to cause the garbage collector to

01:04:04: pause a lot and it's going to pause the

01:04:05: main thread and you're going to get

01:04:07: stutters and freezes even though we have

01:04:09: like task that's completely in the

01:04:11: background. So most systems you will

01:04:14: going to have you know some resources

01:04:16: that are shared between them um and that

01:04:20: are going to like you know affect each

01:04:21: other with uh more modern run times like

01:04:24: you know, for example with .NET 9, the

01:04:26: garbage collector is actually smarter

01:04:27: and often times threads will have their

01:04:29: own chunk of memory for all of the

01:04:31: processing

01:04:32: um so like when they do heavy work they

01:04:35: don't need to affect any others until

01:04:37: like you know they cross a certain

01:04:38: threshold. So yeah, it depends. Multi-

01:04:42: threading and parallelization

01:04:44: is like one of the big kind of problems

01:04:46: in computer science.
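
One generic way to reduce the kind of garbage-collector pressure described above is to stop background threads from allocating fresh buffers in the first place, for example by renting them from a pool. The sketch below illustrates that idea in plain C# using ArrayPool from the standard library; it is a hedged illustration only, not how Resonite's asset loading is actually written.

```csharp
using System;
using System.Buffers;

static class AssetDecodeExample
{
    // Allocation-heavy version: every call creates a large temporary buffer.
    // Many background threads doing this pressures the shared garbage
    // collector, whose pauses can stall every thread, including the main one.
    public static byte[] DecodeAllocating(byte[] input)
    {
        var scratch = new byte[1024 * 1024]; // fresh 1 MB of garbage per call
        Array.Copy(input, scratch, Math.Min(input.Length, scratch.Length));
        // ... decode into scratch ...
        return scratch;
    }

    // Pooled version: buffers are rented and returned, so heavy background
    // work produces far less garbage for the collector to deal with.
    public static void DecodePooled(byte[] input, Action<byte[]> consume)
    {
        byte[] scratch = ArrayPool<byte>.Shared.Rent(1024 * 1024);
        try
        {
            Array.Copy(input, scratch, Math.Min(input.Length, scratch.Length));
            // ... decode into scratch ...
            consume(scratch);
        }
        finally
        {
            ArrayPool<byte>.Shared.Return(scratch);
        }
    }
}
```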

01:04:48: So um

01:04:52: hope that answers that question.

01:04:55: Uh next one's from the bloody rash. Uh

01:04:58: bloody brash is asking uh with the

01:05:00: splitting occurring are there any

01:05:02: specific areas where we should see

01:05:04: significant performance improvements or

01:05:05: is there more overall improvement where

01:05:07: everything is about the same? So yeah,

01:05:10: it's definitely going to depend like

01:05:11: some areas are going to see more

01:05:13: improvement than others. Um I think like

01:05:16: overall like uh like on you know

01:05:19: everything like pretty much like

01:05:21: everything almost everything's going to

01:05:22: see some performance improvement when it

01:05:25: comes to the CPU processing. What's what

01:05:28: probably not going to see improvement is

01:05:30: any you know sort of graphical tasks. So

01:05:31: like for example if you are if you have

01:05:34: a world and you know it has tons of

01:05:36: geometry you know or you're rendering I

01:05:40: don't know like 50 million polygon like

01:05:42: you know 3D model it's not going to help

01:05:45: with that because that's you know on the

01:05:47: G largely on the GPU

01:05:49: uh, but like I stated, with CPU

01:05:51: specifically, you know, CPU processing

01:05:53: with any parts of FrooxEngine, overall

01:05:56: there should be like noticeable uplift

01:05:58: but even there like you know some things

01:06:00: are going to have more improvement than

01:06:02: others. One of the things that we

01:06:04: noticed the most improvement is stuff

01:06:06: like, you know, the Bepu physics. So like

01:06:08: anything has to do with the physics

01:06:09: engine stuff like the particle system

01:06:11: like when Cyro here ran, uh, some

01:06:14: benchmarks with the particle system and

01:06:16: he found like like the particle system

01:06:18: literally the simulation is 10 times

01:06:21: faster with .NET 9 versus Mono. Um so you

01:06:26: know some things like especially when

01:06:27: it's very math heavy and it's very

01:06:29: parallelized, that actually gets a pretty

01:06:32: significant boost but overall there

01:06:34: should be like a boost to everything.

01:06:37: It's going to depend on the specific

01:06:38: task. There's like other ones like for

01:06:40: example startup is faster um already

01:06:43: even with debug mode and one of the

01:06:45: reasons is um uh with normal startup

01:06:49: there's sort of like uh reflection where

01:06:51: like you know the code is sort of like

01:06:53: analyzing itself and its own kind of

01:06:55: structure and classes and there's also

01:06:56: bunch of other things um like

01:07:00: um

01:07:02: for example like you know um what I was

01:07:06: going to say

01:07:08: I got my train of thought. Um

01:07:13: yeah, I kind of lost my train of

01:07:14: thought. Oh, I remember. Yeah. So like

01:07:17: um you know there's like reflection uh

01:07:19: and when the like when when you use like

01:07:22: the inspector that's using reflection

01:07:23: like you know a bunch of stuff when

01:07:24: you're using like protolex that uses

01:07:26: some reflection and sometimes you get

01:07:27: hitches and freezes and one of the

01:07:29: reasons we found the reflection for some

01:07:32: reason, in Mono is very slow and it gets

01:07:34: slower the more it's used, but .NET 9 is

01:07:38: way faster. And one of the biggest uh

01:07:42: one of the biggest cases where it

01:07:45: happens

01:07:46: uh is like you know when the startup

01:07:48: happens and like the um the sort of

01:07:51: analysis the compatibility analysis it

01:07:53: has not been generated yet because on

01:07:55: the normal builds we generate it for you

01:07:57: and like then it's cached so it doesn't

01:08:00: have to do it, but whenever a build has to

01:08:02: do it with Unity's Mono, it takes about 40

01:08:04: seconds to do it, and then if you run it

01:08:07: with .NET 9, that same code doing the same

01:08:10: analysis, it completes in like 2

01:08:13: seconds.
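
As a hedged illustration of why repeated reflection hurts and why caching it helps: the usual pattern is to pay the reflection cost once per type, for example by compiling an accessor delegate and storing it in a dictionary, so every later lookup is just a dictionary hit plus a delegate call. The names below are hypothetical; this is not Resonite's actual compatibility-analysis code.

```csharp
using System;
using System.Collections.Concurrent;
using System.Linq.Expressions;
using System.Reflection;

static class ReflectionCache
{
    static readonly ConcurrentDictionary<(Type, string), Func<object, object>> Getters = new();

    // Reflecting over a type is comparatively slow (and notably slower under
    // Mono), so the compiled getter is built once per (type, field) pair and
    // reused on every subsequent access.
    public static Func<object, object> GetFieldGetter(Type type, string fieldName)
    {
        return Getters.GetOrAdd((type, fieldName), key =>
        {
            FieldInfo field = key.Item1.GetField(key.Item2,
                BindingFlags.Public | BindingFlags.NonPublic | BindingFlags.Instance)
                ?? throw new ArgumentException($"No field '{key.Item2}' on {key.Item1}");

            ParameterExpression instance = Expression.Parameter(typeof(object));
            Expression body = Expression.Convert(
                Expression.Field(Expression.Convert(instance, key.Item1), field),
                typeof(object));
            return Expression.Lambda<Func<object, object>>(body, instance).Compile();
        });
    }
}
```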

01:08:14: So there's like some things that have

01:08:16: like you know huge like improvement but

01:08:19: the overall improvement like that's

01:08:20: going to depend a lot on the scenario.

01:08:22: So, um there's going to be overall

01:08:25: boost. Uh some things are going to be

01:08:27: way faster, some going to be a little

01:08:28: bit less faster, but I think there's

01:08:31: overall should like feel things should

01:08:33: just feel a lot better.

01:08:36: But we'll we'll we'll see like you know

01:08:38: how much like overall as well as once

01:08:40: it's done.

01:08:44: The next questions from uh Ky Wops. Kops

01:08:48: is asking another question I have. Uh me

01:08:50: and a friend were talking about Resonite,

01:08:52: uh, with them saying they need to get on

01:08:54: Resonite more. However, they find having

01:08:56: to launch Resonite to edit and set up avatars

01:08:58: cumbersome. Would an external Unity-like editor be planned

01:09:01: after the splittening for those people? I think

01:09:04: more options to do things is nice. So it

01:09:07: kind of depends what exact issues

01:09:09: they're having because like often times

01:09:10: like our first instinct is going to be

01:09:12: what issues do they have with the tools?

01:09:14: We want to fix those issues like we you

01:09:19: know often times like we will not um

01:09:22: like if if you have a problem with a

01:09:24: tool you don't throw away the tool and

01:09:27: you know just get a different one you

01:09:29: try to fix the tool.

01:09:31: So knowing more that would kind of help

01:09:35: but ignoring that you know like it's

01:09:38: it's our initial approach uh there are

01:09:41: reasons for like you know external

01:09:43: editors

01:09:45: um, it wouldn't be a Unity-like editor; we

01:09:48: would do literally, like, you know, a Unity

01:09:49: SDK, where if you set up stuff in

01:09:52: Unity you can, you know, convert it to

01:09:56: Resonite, uh, and that's something we want

01:09:58: to do because there's like you know lots

01:09:59: of benefits to that You can take you

01:10:01: know existing content in Unity and just

01:10:03: convert it to Resonite, you know, with not

01:10:06: really much work that you need to do and

01:10:08: that's a I feel like really good use

01:10:10: case if the tool also works you know two

01:10:12: ways you could use Resonite to build

01:10:13: stuff move it to Unity do stuff there

01:10:15: move it back there might be like really

01:10:17: interesting workflow for people like you

01:10:19: know who build stuff with Unity and

01:10:21: could also be extended to other tools

01:10:22: like, you know, Unreal Engine or Godot, um

01:10:26: so we definitely want to do stuff like

01:10:28: that We

01:10:31: won't make like our own Unity like

01:10:33: editor, because instead of our own Unity-like editor

01:10:35: we're just going to, you know,

01:10:39: focus on improving the tools that are in

01:10:42: the engine rather than just building

01:10:44: completely separate tools because now

01:10:46: you know we're splitting our time

01:10:48: between two tools and you get like

01:10:51: instead of getting into one tool that

01:10:52: like, where we put all our effort into

01:10:55: it, um,

01:11:00: now we're sprinkling our attention

01:11:02: between two. So, like, you get

01:11:04: two worse ones. So we don't really

01:11:07: have like a reason to like you know do

01:11:08: that kind of approach but having an SDK

01:11:11: for existing tools and existing editors

01:11:14: like I do see benefit for that. There's

01:11:17: not any specific you know timeline for

01:11:19: things like we haven't decided what's

01:11:20: going to be the priority after

01:11:21: splittening. So it's going to depend.

01:11:24: It's definitely one of those things to

01:11:25: consider. Um, and there is a Unity SDK

01:11:30: GitHub issue. So, like if you are

01:11:32: interested in that one, give it an up

01:11:34: vote because that's going to help

01:11:35: influence um this being added.

01:11:45: The next question is from Czech the Fox.

01:11:48: I also just checking in on time. Uh, got

01:11:51: about what is that 50 minutes left. Um,

01:11:55: Check the Fox is asking: you've

01:11:56: mentioned a while ago on Bluesky that

01:11:58: you're thinking about adding in-game

01:12:00: share-on-social-media functions to

01:12:02: Resonite. This is a really cool thing I

01:12:04: was not thinking about before. How could

01:12:05: a potential implementation of it look

01:12:06: like? Wouldn't you need ability to

01:12:08: record video in Resonite first, or would we

01:12:10: also consider adding it for text/image

01:12:12: only first? Yeah, I think like we could

01:12:14: add it for images like pretty much the

01:12:16: idea is like you know say like you take

01:12:17: like screenshot. Well, I don't really

01:12:19: have anything to get screenshot. I'm

01:12:20: just I'm going to screenshot you chat.

01:12:22: Um, so you know, you get a screenshot.

01:12:26: Um, and then instead of, you know, this

01:12:28: being, you know, just an image, it

01:12:31: actually has like a nicer interface. So

01:12:33: you could, um, let me grab a brush. Uh,

01:12:38: essentials. I'm going to

01:12:42: brushes.

01:12:45: Uh, where's the

01:12:47: Oh, there we go.

01:12:50: Uh, what brush should I use? I hope this

01:12:52: is going to be visible. I'm going to use

01:12:54: this one because this Oh, this is this

01:12:56: might not be a good one. I think this is

01:12:58: um this is a very funky brush. It's not

01:13:01: this one.

01:13:06: Oh, my face tracker is weird. I might

01:13:09: need to clean my face tracker. Uh okay,

01:13:12: I think this works well enough. Uh this

01:13:14: should work well again for

01:13:15: demonstration. It's like you know take a

01:13:17: picture and now you have like you know a

01:13:18: bunch of things here you know where is

01:13:20: like share on things and you click it

01:13:22: and you get like you know stuff and the

01:13:25: other things we could add you know to to

01:13:27: these things is you know say like you

01:13:28: actually draw something on top and maybe

01:13:30: there's also a button to you know bake

01:13:32: it. So like you get a new image and make

01:13:34: bunch of things simpler. But you get

01:13:36: like you know say like literal like you

01:13:37: know Bluesky, you know, icon and you

01:13:40: click it and you like you know other

01:13:41: thing and it just posts. Um in order for

01:13:44: that to appear you would like you know

01:13:46: you would, like, link your Bluesky

01:13:47: account to Resonite so it can like you

01:13:48: know post on your behalf and stuff like

01:13:50: that. Once we have like video recording,

01:13:53: you know, then we could have to do the

01:13:55: same for videos, have like, you know, a

01:13:56: bunch of buttons pop up, make it super

01:13:58: easy for you to share the moments from

01:14:00: Resonite, which I think would help

01:14:02: like, you know, promote the platform.

01:14:04: Um, so I think like being able to like

01:14:07: record video like this super quick or

01:14:10: having, you know, a button so you can

01:14:11: like quickly turn it on and just start

01:14:14: recording. Uh I think it would be like

01:14:16: overall beneficial

01:14:19: and we could probably hopefully get like

01:14:20: you know people to share like all the

01:14:22: cool like stuff happening on the

01:14:24: platform and all the shenanigans a fair

01:14:25: bit more.

01:14:29: Uh next question is from uh Kra Blades

01:14:32: is asking uh what have your thoughts

01:14:34: been on the Bigscreen Beyond 2 so far?

01:14:37: So, uh, for me it looks very promising.

01:14:40: Like I'm right now I'm kind of like

01:14:42: waiting on some initial reviews, but

01:14:45: like I've if if it's if it's good, like

01:14:48: I might like consider like upgrading to

01:14:50: it because I've been needing like a new

01:14:51: headset for a while. I'm still using the

01:14:53: Very and I've used like, you know, other

01:14:55: headsets, but like, none

01:14:57: of them really fit my needs, and the Big-

01:15:00: screen Beyond, it sounds like it

01:15:03: might, but I kind of want to see it just

01:15:05: because before I make, like, you know, that

01:15:07: kind of investment into it.

01:15:11: Uh the next questions from also from

01:15:14: Kster Blades.

01:15:16: Uh Kub was asking do you expect the

01:15:19: procedural mesh system to be able to handle

01:15:21: something like parameterized procedural

01:15:23: furniture systems where you can for

01:15:25: example freely stretch a couch long ways

01:15:27: and the legs and arms everything will

01:15:28: stay proportional and when you hit

01:15:30: specific points it will start putting

01:15:31: new sections of the couch like in the

01:15:33: reverse for example I don't know since I

01:15:36: can't post things in chat I mean I don't

01:15:38: see why not like the procedural mesh

01:15:40: system it's uh like it there's no limits

01:15:45: on what kind of meshes it can generate. So

01:15:48: yeah, it's pretty much it's pretty much

01:15:50: just like you know just like building

01:15:52: that system like um and to give you some

01:15:55: idea UIX

01:15:58: you know, the UI that the

01:15:59: inspectors use, that is a procedural mesh and

01:16:02: is you know pretty complex. uh like the

01:16:05: like all it does under the hood like it

01:16:07: just generates you know vertices,

01:16:10: triangles, fills them with data. So

01:16:13: you can implement whatever you want like

01:16:15: there's not really limit to it. It's

01:16:18: just amount you know the amount of

01:16:19: effort like you put into a system.
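
To make the "it just generates vertices and triangles" point concrete, here is a small, self-contained C# sketch in the spirit of the stretchable-couch example: a flat slab that gains an extra segment for every unit of stretch. The types are plain C# stand-ins, not FrooxEngine's actual procedural mesh API.

```csharp
using System;
using System.Collections.Generic;
using System.Numerics;

static class StretchableSlab
{
    // Given a requested length, emit vertices and triangles for a flat slab
    // that adds one more segment each time the length crosses another
    // segmentLength threshold (a toy version of the stretchable couch).
    public static (List<Vector3> vertices, List<int> triangles) Generate(
        float length, float width, float segmentLength = 1.0f)
    {
        int segments = Math.Max(1, (int)MathF.Ceiling(length / segmentLength));
        var vertices = new List<Vector3>();
        var triangles = new List<int>();

        for (int i = 0; i <= segments; i++)
        {
            float z = length * i / segments;
            vertices.Add(new Vector3(0, 0, z));      // left edge of this row
            vertices.Add(new Vector3(width, 0, z));  // right edge of this row
        }

        for (int i = 0; i < segments; i++)
        {
            int a = i * 2;      // left, near row
            int b = a + 1;      // right, near row
            int c = a + 2;      // left, far row
            int d = a + 3;      // right, far row
            triangles.AddRange(new[] { a, c, b, b, c, d }); // two triangles per segment
        }

        return (vertices, triangles);
    }
}
```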

01:16:26: Um Mshock's asking um what is your time

01:16:30: distribution between like coding,

01:16:31: planning and organizing? So that's going

01:16:34: to kind of depend uh depends like which

01:16:36: phase of the project I'm in. Usually

01:16:39: when I'm like starting like you know

01:16:41: some big work usually like I will start

01:16:43: like doing design of it. So I'll be like

01:16:46: you know sketch out like ideas how is

01:16:48: this going to work? How is this going to

01:16:49: fit together and I spend a lot of time

01:16:51: on that? Um and I'll keep like

01:16:54: iterating. I'll keep like putting the

01:16:55: design through you know a bunch of

01:16:57: different scenarios try to like you know

01:16:59: see like how would it work? how would it

01:17:01: handle lots of things? If I wanted to do

01:17:03: this thing in the future, is it going to

01:17:04: be able to handle that? How's this going

01:17:06: to fit into everything? So, it's going

01:17:09: to be mostly pretty much like, you know,

01:17:11: sort of like the planning and organizing

01:17:13: like for the um for the thing. Um then

01:17:17: once I'm kind of more confident with the

01:17:19: design, I'll start transitioning into

01:17:21: actually coding it. Uh so I will you

01:17:25: know usually like as in the initial

01:17:28: phases when the code is kind of coming

01:17:29: coming together there's still going to

01:17:31: be some design changes. So often times

01:17:32: like be like okay this doesn't really

01:17:34: work so I'll change this aspect of

01:17:36: design and then kind of transitions

01:17:37: where I get like really heavy into the

01:17:39: coding and it's where I'm right now with

01:17:41: the splittening where pretty much like

01:17:43: majority of it is you know just coding

01:17:46: the things and there's still like a

01:17:47: little parts of like design but like now

01:17:49: like things are sort of like you know

01:17:51: fitting into it like so it's more like

01:17:53: deciding small things is more like

01:17:56: what's going to be the layout of this

01:17:57: track you know to communicate this thing

01:17:59: but even that like it just kind of then

01:18:01: following

01:18:03: uh following the structure everything

01:18:04: around it. So it it depends

01:18:09: and it kind of changes.

01:18:12: Uh Navy 2001 is asking what is the FPS

01:18:15: outside of debug stuff. Uh I haven't

01:18:16: been able to run it outside of debug

01:18:18: stuff yet. So uh I've kind of talked

01:18:20: about this like earlier like earlier. Um

01:18:24: so but so I'm not going to get like into

01:18:27: depth again. But pretty much like right

01:18:30: now the focus is on just getting things

01:18:31: to work. There's a lot of things that

01:18:33: need to be kind of removed. Um or and

01:18:36: also implement it to kind of get out of

01:18:38: the debug, you know, things like for

01:18:40: example, you know, the render loop. Um

01:18:44: so I haven't been able to like do that

01:18:46: yet, but it's getting there. So soon

01:18:49: hopefully

01:18:50: it's going to be the moment of truth.

01:18:53: Um

01:18:55: next question. Check the fox is asking

01:18:58: uh can you think of a potential future

01:19:00: use case for an idea you're even

01:19:02: thinking about yet? I mean by definition

01:19:05: no

01:19:07: if because if I'm if I'm thinking about

01:19:09: it then it doesn't fit your question.

01:19:12: So no

01:19:18: A better question may be: maybe you

01:19:21: have a use case that we aren't thinking

01:19:23: about yet

01:19:26: if so like then share

01:19:29: and nuko's asking I was trying to work

01:19:31: with bits at the beginning of

01:19:33: the week and ran into a weird issue where I

01:19:35: can't do powers of two in int math in

01:19:37: ProtoFlux; it made me cast to floats. How

01:19:40: are we supposed to work with bits here.

01:19:42: I'm not sure what exactly you're asking.

01:19:44: It might be like

01:19:48: Oh, you mean do you mean like Twitch

01:19:50: bits?

01:19:53: I'm not sure if I understand because I

01:19:54: don't know like what the bits have to do

01:19:56: with integer math.

01:19:59: Yeah, I don't you have to rephrase your

01:20:01: question. I don't really understand. I'm

01:20:04: sorry.

01:20:08: Don't don't have sire to have a look if

01:20:11: we can interpret the question this time.

01:20:13: Um, next question is from Koles. Uh,

01:20:16: that reminds me, I'm curious if the

01:20:18: splitting fixes the issue where Resonite

01:20:20: will drop to 5 FPS when I go to a virtual

01:20:22: desktop as when there's an adjusted

01:20:24: everything starts lagging. Do you think

01:20:25: splitting will fix that? I don't think

01:20:27: it will. That sounds like something that

01:20:29: happens with the renderer. Probably like

01:20:30: the process like restricts it to like

01:20:32: low frame rate and that should pretty

01:20:35: much be the same. So that's probably not

01:20:37: going to fix that.

01:20:44: Um,

01:20:45: next questions from modern balloon. Uh,

01:20:48: what are the remaining goals before

01:20:50: splitting would be considered ready for

01:20:51: a pre-release? So, um, pretty much like

01:20:54: getting a bunch of things to render and

01:20:56: removing a lot of the debugging kind of

01:20:58: stuff that's like, you know, very heavy

01:20:59: because right now like it runs for like

01:21:01: a minute and it makes like a 200-megabyte

01:21:04: log. So that's not good for the testing.

01:21:07: Um there's still like you know a bunch

01:21:09: of things to implement. I probably also

01:21:11: want to get like um

01:21:14: probably want to get like VR working but

01:21:16: I might also like start like doing

01:21:18: pre-release on just desktop because that

01:21:20: might be sufficient

01:21:22: uh to do initial testing because usually

01:21:25: my goal is to start testing as soon as

01:21:26: possible. Um you can

01:21:31: um

01:21:33: one thing I'm kind of concerning is you

01:21:34: know like the initial pre-release might

01:21:36: not have everything

01:21:38: real like it might not have everything

01:21:40: implemented yet like notably like blto

01:21:42: desktops for example camera that is

01:21:44: doing right now we don't need it to

01:21:46: start like testing the initial stability

01:21:48: of things um what's the other stuff um

01:21:53: uh some of the other components like the

01:21:55: col niche like maybe like like implement

01:21:58: like during the thing like desktop

01:22:00: texture like you know that that one's

01:22:02: not as crucial for the test thing. to um

01:22:09: like I'm I'll see I'll I'll see like

01:22:11: what state is it in like once I feel

01:22:14: this is you know I'm not really finding

01:22:16: too many bugs I might like be like okay

01:22:19: let's start doing pre-release testing

01:22:20: and we kind of I've done something

01:22:22: similar with audio where the first

01:22:24: pre-release, it was still missing a few

01:22:25: things it was missing reverb zones and

01:22:27: you know doppler but we could test

01:22:30: everything else and it was actually

01:22:32: really beneficial to test everything

01:22:33: else before I even started adding, you

01:22:35: know, the, uh, like, you know, uh,

01:22:39: Doppler and reverb zones. So

01:22:43: um

01:22:46: probably do something similar there like

01:22:48: once there's enough that it can be

01:22:50: tested and give like good data without

01:22:52: like you know dying within few next few

01:22:54: minutes. Uh I think that's enough to

01:22:57: start testing and that's kind of like

01:22:59: you know why because the VR

01:23:01: implementation is going to take a bit as

01:23:02: well. um we can start testing things in

01:23:04: desktop and start getting like data and

01:23:07: while people are because it also helps

01:23:09: you know pipeline things because once

01:23:11: you start testing things usually what I

01:23:13: do I start working on the other things

01:23:15: so I don't have to like you know just

01:23:17: kind of sit around and wait so if I just

01:23:19: go and implement everything and then I

01:23:21: like you know start testing I'll just

01:23:23: sit around and be waiting you know for

01:23:25: results to come in but it's more

01:23:26: efficient just be like okay here's the

01:23:28: pre start testing it I start working on

01:23:30: implementing the remainder of the things

01:23:31: and then the next thing, okay, there's a

01:23:33: bunch of these bugs. I'm going to work

01:23:35: on these. And it helps sort of make

01:23:37: things go faster overall. And you kind

01:23:40: of you get things earlier because like

01:23:41: if if the initial pre-release had

01:23:43: everything in it, you know, you would

01:23:46: get it like a week later, you know, than

01:23:48: you would otherwise. Um, so well, I'll

01:23:52: I'll see once it feels ready. Often

01:23:54: times like it's a little bit of like,

01:23:56: you know, intuition thing, too.

01:23:59: Uh Nikki Kun's asking uh performance

01:24:02: question how much faster is using

01:24:04: existing component or node versus coding

01:24:06: in protoflux like what's the overhead

01:24:08: there like how much faster does equation

01:24:10: like a squared + b squared run when it's

01:24:12: C# as opposed to ProtoFlux? Uh, it's going

01:24:15: to depend a lot on the specific

01:24:16: component there's not really a single

01:24:18: answer to that um it's probably going to

01:24:21: be faster like if it's a component

01:24:22: because like uh it can like the compiler

01:24:25: can like you know remove a lot of the

01:24:27: overhead but the specific numbers like

01:24:29: that. It all depend heavily on what

01:24:31: you're doing. But also you can have

01:24:33: scenarios where the ProtoFlux might do it

01:24:35: faster because you know maybe the

01:24:37: components doing it in particular way

01:24:39: and it's doing a lot of stuff that you

01:24:40: don't need. So you can code a ProtoFlux

01:24:42: thing that's specific for a use case and

01:24:45: maybe that's faster. So

01:24:48: it it it depends. It's one of those

01:24:50: things like you know we can't really say

01:24:52: generally.

01:24:56: Uh, next question is from Alex DPI

01:24:58: asking, uh, have you had a chance to

01:25:00: hear or do your own triangle splat thing

01:25:02: yet? So, I've seen it. It's It's very

01:25:05: like interesting looking. Um, I don't

01:25:07: think there's like any easy to use

01:25:09: software yet that you could use like,

01:25:11: you know, to, like, compute those and render

01:25:13: them. There's like one thing when reading

01:25:15: it that like bugged me a little bit

01:25:17: though because it feels a bit, like,

01:25:19: misleading to me is their sort of

01:25:21: performance claims because one of the

01:25:24: things they showcase is uh they show

01:25:27: that you can just put it into

01:25:28: traditional triangle pipeline just

01:25:30: render triangles and the performance

01:25:33: metrics they were like putting at the

01:25:34: forefront are from this. The problem is

01:25:38: that doesn't have like any of the

01:25:39: blending you know and other stuff for

01:25:41: the triangle. So it actually has a lot

01:25:43: of artifacts like you know it's you can

01:25:45: see the individual triangles. It doesn't

01:25:47: look like a splat. Um and they use the

01:25:51: performance metrics for that but then

01:25:53: like you know for the actual quality

01:25:55: they have a different render and I'm

01:25:58: like how was the performance of this

01:26:00: actual render versus this? Um,

01:26:04: so like I don't know. I have to like

01:26:05: look more into it. But like it that that

01:26:08: part was like a little bit like where I

01:26:10: was like hm are they like are they like

01:26:13: fudging this like trying to like be like

01:26:16: like we get the performance numbers of

01:26:18: the naive implementation that's like way

01:26:21: super fast but also doesn't look

01:26:23: correct.

01:26:25: And it's kind of

01:26:27: sometimes it kind of happens like

01:26:29: studies will they will try to like make

01:26:32: things look better than they are and so

01:26:35: end up like being a bit skeptical. But I

01:26:37: do hope it works well. Um with a lot of

01:26:39: these things like it's like you know

01:26:40: also question is this going to make it

01:26:42: into commercial software where I can

01:26:43: just use it easily because a lot of the

01:26:45: times you'd have to like you know

01:26:47: download a bunch of like Python stuff

01:26:49: and like you know download the sources

01:26:51: and the models and everything and like

01:26:52: it's not very user friendly process and

01:26:55: a lot of the times I don't really have

01:26:56: that much time to like mess with it or

01:26:59: get it working

01:27:01: and you know and the other part is you

01:27:03: know if we were to implement something

01:27:05: like that is this going to be widespread

01:27:07: enough to like you know, be really worth

01:27:08: it. Uh because it will be fun, you know,

01:27:11: to have it, but like if we're going to

01:27:12: spend a bunch of time like implementing

01:27:14: it, we want people to be using it.

01:27:19: Um

01:27:22: uh Nikki's asking, I might have missed

01:27:24: the answer while I was saying by defer,

01:27:26: but what world is this? Why'd you pick it? Uh

01:27:28: this is the sigil island by the new

01:27:30: project VIP. Um I pretty much like

01:27:32: picked it. One reason uh let me just

01:27:35: show you. I don't this showing reverse

01:27:38: on the camera. Oh, I can rotate this.

01:27:40: Let me just do this. Sual island by the

01:27:43: new project final final VIP.

01:27:49: So, um Robert shows it because it's like

01:27:52: nice. It's like light. Uh I can also

01:27:54: like, you know, show around. It's very

01:27:56: There's like a whole bunch of stuff.

01:27:57: There's bunch of like gags here as well.

01:28:00: Um you get like, you know, this thing. I

01:28:03: can like

01:28:05: Oh, I did it wrong. There we go. It can

01:28:08: It can be like fishing

01:28:11: and then like, you know,

01:28:16: I forget how this works. I don't think I

01:28:18: got anything.

01:28:21: Um, I think actually this was might have

01:28:22: been one of the MMCM3s, but I forget.

01:28:24: It's pretty old, but it's very looking

01:28:27: and it's light. Oh, there we go. There

01:28:30: we go. Hold on. What are we pulling?

01:28:31: We're getting something.

01:28:34: Come on. Get.

01:28:38: Come on.

01:28:42: What? Where'd it go? I didn't get

01:28:44: anything. I feel cheated. Um there's

01:28:49: also

01:28:51: if I go around,

01:28:54: I'm just going to... I'm too lazy to stand

01:28:56: up. I'm just going to float around. Um

01:28:58: it's very nice world. It's also like,

01:29:00: you know, kind of fitting the

01:29:01: temperature because it's pretty hot over

01:29:03: here. So,

01:29:06: there's these got like a sigil and this

01:29:08: is kind of cool to it.

01:29:12: And then you got

01:29:15: you got a bunch of stuff over here. It's

01:29:17: like some nature. You can get up on top

01:29:20: of here.

01:29:25: Oh, this is a different area. Uh there's

01:29:27: like a pond up here.

01:29:34: I'm pretty sure I wanted something light

01:29:35: because I haven't done this in a while

01:29:36: and like I don't want to load a heavy

01:29:38: world.

01:29:40: Oh

01:29:41: yeah, there's like a nice thing here and

01:29:43: this goes down there. Oh, this is the

01:29:45: place I wanted to go.

01:29:49: Oh boy,

01:29:51: I forgot. How do I get up there?

01:29:55: Oh, there we go. There. There go the

01:29:56: ramp.

01:29:57: So, I go up here and there's more like

01:30:01: fishing stuff here. And also, there's

01:30:04: like this thing with like a lot of

01:30:05: jokes.

01:30:07: Our fishing shack is in alpha state.

01:30:09: Thank you for understanding. I can see

01:30:11: it's this is in alpha state.

01:30:19: Oh, there's a bait or stuff. I forget

01:30:21: how this works. It's been a while. But

01:30:24: it's it's a nice world. I recommend it's

01:30:26: it's good social hangout, too.

01:30:29: Um,

01:30:31: so I'm going to go back here.

01:30:34: There we go. I'm going to go

01:30:38: clip and

01:30:40: put this back here. There we go.

01:30:45: Okay, this Oh, wait. I have the anchor.

01:30:47: Yeah, there we go. There we go.

01:30:51: I'm going to put this back here.

01:30:55: There we go. So, yeah, it's a nice

01:30:57: world. Oop, I just bumped my keyboard by

01:31:01: accident.

01:31:03: Okay,

01:31:05: let's see.

01:31:08: The

01:31:10: next question is from Asent Twitch. Uh

01:31:20: um as on Twitch 17 was proper debugging

01:31:23: and being able to use visual debugging

01:31:24: tools for Resonite in your "things that

01:31:26: will help the dev team" bingo card? Um, so

01:31:31: I don't know what bingo card like if

01:31:34: it's like it was something that like was

01:31:37: unexpected but it is one of one of the

01:31:40: reasons not like the the most major one

01:31:42: but it was pretty major reason like why

01:31:44: you know, we wanted to do the splittening in the

01:31:47: first place because the performance

01:31:50: that's like you know the major reason

01:31:52: but there's a lot of like you know

01:31:53: smaller reasons for it as well. One of

01:31:56: them is being able to debug things

01:31:58: because we were not really able to do

01:31:59: that before. So it was pretty much you

01:32:02: know it was expected you know this would

01:32:04: help a ton with that. Um even just the

01:32:08: general tooling like you know being able

01:32:09: to use the profiler that's another thing

01:32:11: you know I'll be like we'll be able to

01:32:13: around you know the visual studio

01:32:15: profiler just open a build open a world

01:32:17: do stuff invite people measure

01:32:20: performance statistics you know see like

01:32:22: where time is being spent.

01:32:24: That's going to help a lot. A lot of the

01:32:26: times when you need to this kind of

01:32:27: debugging, it would have to be done with

01:32:29: headless and we have to write some code

01:32:31: to sort of instrument the headless to do

01:32:33: whatever I need to replicate. But being

01:32:35: able to do it like this, that was one of

01:32:38: the things, you know, we wanted out of

01:32:39: the splittening. Um, so yeah, it's

01:32:44: like

01:32:46: there's lots of reasons to do it and why

01:32:48: we were like this is going to be really

01:32:50: important to prioritize. You know, the

01:32:51: other one is also being able to use a modern

01:32:53: runtime, you know, being able to

01:32:55: use modern language features, being able

01:32:57: to use modern libraries. There's lots of

01:33:00: things into one and like with the

01:33:02: performance being the main thing, you

01:33:03: know, that made it like pretty much

01:33:06: like, you know, be like the best choice

01:33:08: for prioritization we could make at a

01:33:10: time.

01:33:12: I'm going to move a little bit here. I

01:33:15: feel like the camera is a bit offset

01:33:16: now. Like it was uh

01:33:20: Let's see.

01:33:23: I'm going to clean this up. Uh

01:33:28: let's see. How's this?

01:33:35: Is this looking? Okay,

01:33:41: there we go.

01:33:43: I think it works. I think

01:33:47: I kind of messed around with things and

01:33:48: now it feels like things are misaligned.

01:33:51: Okay, so next question is from Nukun.

01:33:57: Uh, actually I'm going to move this out

01:33:58: of the way because it's kind of the way.

01:34:02: Uh, next question is from Nicon. Nukon

01:34:04: is asking, "What sort of thing would you

01:34:07: like to do in a sci-fi and mode type

01:34:09: game? FPS shooting, talking to NPCs,

01:34:11: investigating places, solving puzzles.

01:34:13: What gameplay do you enjoy in a sci-fi

01:34:15: game? Um, yes, all of the above. Um, I

01:34:20: really like like uh FPS shooting

01:34:22: probably like that's the main my main

01:34:24: part. Like I like when it's like a lot

01:34:26: of action stuff. Uh, but also like I

01:34:28: like investigating things like you know

01:34:30: when there's like some kind of like

01:34:31: mystery and you learn things behind it.

01:34:33: When there's like good like world

01:34:34: building, you know, lore stuff like

01:34:36: that. puzzles depends on the puzzle.

01:34:38: Like they have to be like good

01:34:40: challenging puzzles and have to not be

01:34:42: annoying puzzles. Um but yeah, like like

01:34:45: I depends on the game, but action is

01:34:49: definitely like good one. I don't like

01:34:50: when it's too much talking. Like if it

01:34:52: ends up like being like way too much

01:34:53: talking and not like it doesn't get

01:34:56: balanced enough, then it gets stale a

01:34:57: bit. But, um, I'm into stuff like

01:35:00: Mass Effect. Mass Effect is like a

01:35:01: really good one. But that is more like

01:35:03: third person shooting.

01:35:07: Uh Mr. Set's asking, "Will met support

01:35:10: be added to?" Um probably at some point.

01:35:14: Um I would say like, you know, uh there

01:35:17: should be a GitHub issue. If there's not

01:35:19: one, you know, make one and if there's

01:35:20: one, you know, make sure to keep it an

01:35:22: upload that's going to make it more

01:35:23: likely for us to prioritize it.

01:35:26: It also depends exactly what you mean by

01:35:28: med support because death can mean a lot

01:35:31: of different things, you know. It's

01:35:33: going to be anything from like being

01:35:35: able to do, you know, low level

01:35:37: interactions with the protocol to like,

01:35:40: you know, having like bunch of like

01:35:41: helpers and integration with things.

01:35:46: Uh, next questions from Nook. Also, I'm

01:35:48: checking on time. We got 25 minutes. So,

01:35:50: I might need to speed up through these

01:35:52: questions because we got a lot. Um

01:35:56: so the answers might be

01:35:59: I might need to like um

01:36:02: I might need to like uh speed a bit

01:36:05: faster. Um

01:36:08: so Ning is asking on the subject of open

01:36:10: source alternatives uh which use local

01:36:11: storage have you seen over do you think

01:36:13: there will be day when we can use stuff

01:36:15: developed for their JB and web assembly

01:36:16: based systems more natively? Um, I know

01:36:19: of Overte. I don't know exactly

01:36:21: what, like, the WebAssembly stuff would

01:36:23: be. I don't think you'd be very likely

01:36:26: to use the WebAssembly stuff, because if

01:36:28: it's something that has to work with a

01:36:30: specific API of a system, then that

01:36:33: usually doesn't port. Like, you have to

01:36:35: write some sort of translation layer, and

01:36:37: oftentimes they would, like, depend on,

01:36:39: like, you know, specifics of the engine,

01:36:41: and because it's a different engine, it

01:36:42: just doesn't translate. So I don't think,

01:36:44: like, you'd be able to, like, use that.

01:36:46: Maybe if it's just models, but I don't

01:36:48: know what that feature is exactly. So, um,

01:36:53: it's something we would, like, look into.

01:36:56: Uh, skipping that one. ModernBalloonie is

01:36:59: asking, uh, do you think Mono is still

01:37:02: going to be a problem on the Unity side

01:37:03: when the splittening happens? Um, so

01:37:07: it's definitely not going to be as fast,

01:37:08: but the main thing with the

01:37:09: splittening is, like, 99% of the code now

01:37:13: doesn't run under, like, you know, Unity's

01:37:15: Mono. So that 1% maybe is slower than it

01:37:19: would have been, but you know, we're

01:37:20: still, like, way better off. However, one

01:37:24: of the things I've been kind of

01:37:25: considering is that since there's

01:37:26: actually not as much code in Unity,

01:37:29: we could try compiling with IL2CPP

01:37:32: and see how that performs. So probably

01:37:35: post-splittening, but, like, one of the, you

01:37:37: know, sort of performance tasks afterwards

01:37:40: is going to be

01:37:42: um

01:37:44: oh, what? Oh no, I might have hit a

01:37:47: button by accident.

01:37:53: Uh, okay, this wasn't...

01:37:57: okay, should be good. Um, so

01:38:02: one of the things is probably going to

01:38:03: be, you know, trying to compile it all to

01:38:05: C++ and see what difference that makes.

01:38:08: Um,

01:38:10: and you know, we'll we'll see

01:38:14: and that might kind of help, like, you

01:38:15: know, with the Unity side of

01:38:17: things, like, get some additional

01:38:19: performance improvements.

01:38:22: Uh, next question is from Nik: Resonite

01:38:24: currently uses only 30 to 50% of my,

01:38:28: uh, 1050 Ti. After the splittening this might

01:38:31: be a bottleneck. What can be done to

01:38:33: optimize that? What can we do as

01:38:35: world builders to optimize post-splittening

01:38:36: worlds? So for graphics, like, one of the

01:38:38: things you want to optimize is, like, you

01:38:40: know, the geometry complexity. You want to

01:38:42: optimize VRAM usage. Um, you want to, you

01:38:46: know, optimize lights: you know, how many

01:38:48: lights are in the world, how big they are,

01:38:50: how much are they, like, affecting

01:38:51: things. Also stuff like overdraw. So for

01:38:54: example, if you have like you know a lot

01:38:55: of like if you have a lot of um

01:38:59: like overlaid effects, a lot of like you

01:39:01: know very heavy fog like with a lot of

01:39:03: particles you know that might tax your

01:39:05: overdraw. So there's a lot of things you

01:39:07: could do like there. Uh there's also

01:39:09: like you know um there's also like um

01:39:14: you know LOD systems impostors that we

01:39:16: could kind of do. So we'll see like how

01:39:18: it kind of runs with things and see if

01:39:20: we can add more features to to like

01:39:22: improve things.
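
A minimal, generic sketch of the distance-based LOD idea mentioned above; the thresholds and names here are made up for illustration, and this is not something Resonite exposes today:

```cpp
#include <cstdio>
#include <initializer_list>

// Generic distance-based LOD selection: closer objects get the detailed mesh,
// distant ones get cheaper meshes or an impostor billboard. Thresholds are arbitrary.
int SelectLod(float distanceToCamera)
{
    if (distanceToCamera < 10.0f)  return 0; // full-detail mesh
    if (distanceToCamera < 40.0f)  return 1; // reduced mesh
    if (distanceToCamera < 120.0f) return 2; // low-poly mesh
    return 3;                                // impostor (flat billboard)
}

int main()
{
    for (float d : {5.0f, 25.0f, 80.0f, 300.0f})
        printf("distance %.0f m -> LOD %d\n", d, SelectLod(d));
}
```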

01:39:24: Um, Nytra, uh, is asking, do you think more

01:39:28: about open sourcing the renderer? Um, a bit. I

01:39:32: mean, it's something, like, I think might

01:39:35: be beneficial to do, because, like, it

01:39:36: would allow people, like, you know, to make

01:39:38: contributions to the renderer, fix up

01:39:40: some issues. There's, you know, uh, probably

01:39:42: not, like, any major features, but, um, like, it

01:39:45: might be beneficial to do. There's,

01:39:47: like, some business things to consider,

01:39:49: like, and also, like, you know, can we

01:39:51: actually open source it? Like, there's

01:39:52: other pieces of code, like, that we can't

01:39:54: publish. So we have to kind of go over

01:39:56: that. So we'll see. But generally I

01:40:00: would want to do it. Um

01:40:04: but it might like you know like mess

01:40:06: with some of the business stuff too. So

01:40:07: like like there's definitely problems

01:40:09: that we need to like sort through before

01:40:10: we do it. So

01:40:13: don't want to make like any promises

01:40:14: right now.

01:40:16: Uh, Dusty Sprinkles is asking, about the

01:40:19: UI editor question: it might be

01:40:21: worth going into detail about how you

01:40:23: plan to improve the desktop experience and

01:40:24: overlay windows, similar to Blender's UI.

01:40:27: So, there's actually a video on that

01:40:28: already. Like, uh, I went into quite a

01:40:31: bit of detail in one of the past

01:40:33: Resonances, uh, and it's been made into a

01:40:36: video clip. Uh, probably, since

01:40:38: there's not much time left, I'm not

01:40:39: going to be going into the details right

01:40:41: now, but I recommend checking that clip

01:40:43: out. Like search for like, you know,

01:40:45: desktop on my YouTube channel and uh you

01:40:47: should find it.

01:40:50: Uh

01:40:52: Case is asking this long one. Uh, uh, if

01:40:56: someone were to do a concert, but there's

01:40:58: an idea that was trying to explain what

01:40:59: the platform is, would you prefer A, a day

01:41:02: of a group of creatives, like, only

01:41:03: putting together things for the concert, end

01:41:05: of day working on a virtual booth, uh, stage,

01:41:08: ending on the performance, or B, someone who wakes

01:41:10: up in Resonite and is taken by a mysterious

01:41:12: butler in an elevator between different

01:41:14: instances before taking the stage toward the

01:41:16: end? Um, I mean, I feel like either

01:41:19: could be like interesting video. Like I

01:41:21: I think like either would work as long

01:41:23: as it's like, you know, well done. It's

01:41:24: like I feel like it doesn't matter which

01:41:26: one it is.

01:41:29: It's just making cool videos, you know,

01:41:30: to help, like, sell the idea, like, uh,

01:41:32: that's what the important part is. Um,

01:41:37: next question's from Nuki, rephrasing an

01:41:39: earlier question about bits. I am talking

01:41:41: about computer variable bits. What?

01:41:44: What? Twitch bits was my initial thought when you

01:41:47: said that. At the beginning of the week I was

01:41:49: using bits of an int as flags, like an array of

01:41:52: tools. You know what I'm talking about.

01:41:54: I was trying to... oh, there's, like, more. I

01:41:56: was trying to get single bit values by

01:41:58: powers of two, but it made me cast to float

01:42:02: just to do an exponent operation. Why

01:42:03: does it make me cast to float? Why isn't

01:42:05: there an integer power?

01:42:09: Uh, oh, I see. So, like, power, like, you can,

01:42:13: um, one of the things is, like, you know, it

01:42:15: goes, it goes up very fast, so, like, it

01:42:18: doesn't become super useful. You can

01:42:20: usually, if, well, if you want to do,

01:42:24: um, if you want to do, like, you know, a power

01:42:26: of, like, two, like, you can just bit shift

01:42:28: it. That's the usual approach. Uh, if you

01:42:31: can't do that, so how are you supposed to work

01:42:32: with bitwise stuff like that? Yeah, if

01:42:34: you're working with bitwise stuff, you

01:42:35: want to use the bit shift operations.

01:42:37: Like, that's not really, like, a power

01:42:38: operation, because that's, um, more like

01:42:40: math, where you use, like, you know, floats and

01:42:42: doubles.

01:42:44: Um, for integer stuff, like, you do use,

01:42:46: like, the bit shift and bitwise

01:42:49: operators. I forget which category they're

01:42:52: in, though.
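
A quick, generic illustration of that point (plain C++ rather than ProtoFlux nodes, so the names here are just for the example): a power of two for flag work is a bit shift, with no float power or cast involved.

```cpp
#include <cstdint>
#include <cstdio>

int main()
{
    // Hypothetical tool flags packed into one integer, one bit per tool.
    const uint32_t FLAG_BRUSH  = 1u << 0; // 1
    const uint32_t FLAG_ERASER = 1u << 1; // 2
    const uint32_t FLAG_GRAB   = 1u << 2; // 4

    uint32_t active = FLAG_BRUSH | FLAG_GRAB;

    // "2 to the power of n" for bitwise work is just a shift; it stays an
    // exact integer, so no float pow() and no cast back is needed.
    int n = 2;
    uint32_t bit = 1u << n;
    bool grabOn = (active & bit) != 0u;

    printf("active = %u, bit %d set: %s\n", (unsigned)active, n, grabOn ? "yes" : "no");
    return 0;
}
```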

01:42:56: uh

01:43:04: So, Mother, uh, is asking, do you think

01:43:07: we'll, we'll be able to eventually get

01:43:09: game profiling features with the new

01:43:11: runtime as creators? Yeah, there's

01:43:12: actually one library to

01:43:14: integrate called Tracy. So, that might

01:43:17: actually help with that.

01:43:20: So, yes, not, not sure when, but I think

01:43:24: at some point it will happen.
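
Tracy is an open-source frame profiler; the sketch below is only a generic example of how Tracy zones are typically added to C++ code, assuming a standard Tracy client setup. The UpdateWorld/RenderWorld functions are hypothetical placeholders, and this is not Resonite's actual integration.

```cpp
// Build with -DTRACY_ENABLE and link the Tracy client to capture live traces.
#include <tracy/Tracy.hpp>

// Hypothetical per-frame work, just to show where the zone macros go.
void UpdateWorld()
{
    ZoneScopedN("UpdateWorld"); // named zone, timed and shown in the Tracy UI
    // ... simulation work ...
}

void RenderWorld()
{
    ZoneScopedN("RenderWorld");
    // ... rendering work ...
}

int main()
{
    for (int frame = 0; frame < 1000; ++frame)
    {
        UpdateWorld();
        RenderWorld();
        FrameMark; // marks the end of a frame so Tracy can group zones per frame
    }
}
```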

01:43:27: Uh, Nikki's asking...

01:43:30: oh, there's, like,

01:43:32: uh, if you're, uh, if you're talking, like,

01:43:35: within the chat, you know, between

01:43:37: yourselves, try to, like, avoid, like, the

01:43:39: question mark so it doesn't pop up there.

01:43:42: Uh, next one is from here defing: could

01:43:46: we at some point see a desktop view

01:43:47: outside of the Dash? If no, why not? Seems

01:43:50: like an interesting feature. I mean,

01:43:52: there's technically nothing confining it

01:43:54: to the Dash. You could make a facet, and

01:43:56: actually I've seen people do this, where

01:43:58: you make a facet that, like, you know, has

01:44:00: the desktop. Like, it's just a component

01:44:02: for accessing the desktop, so you can build

01:44:04: your own with it. Uh, we're probably

01:44:06: going to add, like, more features to it as

01:44:08: well in the, in the future, but, um, you can

01:44:10: already do that on your own.

01:44:13: Um

01:44:16: So, someone's asking: Mass Effect is great. I was

01:44:18: annoyed they pulled a main feature,

01:44:20: talking, to focus on the third-person

01:44:21: shooter stuff in the second and third one. I feel like

01:44:23: there was, like, a lot of talking in the

01:44:25: third one too, like, there was, like, a lot

01:44:27: of, like, cinematic things.

01:44:30: Uh, wait, culling versus deactivating?

01:44:36: I mean, depends what you mean by better,

01:44:37: but, like, overall the performance should

01:44:39: be better.

01:44:41: So, like, if you use the culling system,

01:44:42: you're going to get even better, uh,

01:44:44: because the deactivated slots won't even

01:44:46: be shown to Unity. Like, I mean, they

01:44:47: don't get shown to Unity right now

01:44:49: either, kind of. Well, depends what you

01:44:51: mean exactly. Like, they don't actually

01:44:52: get sent, but, like, once they're off, like,

01:44:55: they don't really have much of an

01:44:56: impact.

01:44:58: Um, also asking: we are going to do

01:45:02: another big overhaul of the avatar

01:45:04: station soon, uh, 3 to 3.1 stuff,

01:45:07: especially since the mentor program is

01:45:08: getting even more involved here. Have

01:45:09: you seen... Have you been there recently?

01:45:11: Any thoughts for us? I don't have any,

01:45:13: I'm sorry. I haven't, like, had much

01:45:15: chance to, like, look at things. Also, yes,

01:45:18: there's a big chip now. Like, it might be,

01:45:20: like... I don't remember the category,

01:45:22: unfortunately. I have to check on the wiki.

01:45:25: Mo is asking, are you Drew? I'm not Drew,

01:45:28: but there's Drew in the code.

01:45:33: Uh it's like one of the comments that

01:45:35: Brian found. Uh, this question's from

01:45:38: Nikki: missed a follow-up, that was in

01:45:40: my response to ModernBalloonie by the way, I

01:45:42: think we know. I, um, sorry, I don't know.

01:45:45: Like, if, if you're asking, like, follow-ups,

01:45:47: like, you need to, like, provide context,

01:45:49: otherwise, like, I won't remember. And also,

01:45:52: are you still jamming to Dadoo? We are

01:45:54: playing that Monday blues at the

01:45:56: beginning of the week. No, actually, I

01:45:57: haven't heard it in a bit.

01:46:02: Uh, it just was, for example, the desktop view.

01:46:04: I think they meant desktop streaming to

01:46:06: other people. If you mean to stream to

01:46:08: other people, then yeah, like, that's

01:46:09: something we would definitely love to

01:46:10: add. Uh, it'd be, I think, a really cool

01:46:12: feature for, like, you know, sharing

01:46:14: things. So, um, I would want us to, like,

01:46:17: have that, but it is a fairly... it's not a

01:46:20: trivial amount of work.

01:46:24: Also got uh 15 minutes left. So if you

01:46:27: have any more questions like now is the

01:46:29: time to, like, ask them. Uh, and please,

01:46:32: no, like, you know, really big, like,

01:46:34: involved ones at this point, because I

01:46:36: don't think I have time, like, to get into

01:46:38: those. Um,

01:46:41: let's see.

01:46:45: Uh, Dusty Sprinkles, asking... and also,

01:46:48: also, because we're out of questions now,

01:46:50: I will inform you that you snapped the

01:46:52: line while you were fishing. You're

01:46:54: supposed to let it rest while it's red.

01:46:55: Oh, I see. Okay. Yeah, I forget. I used

01:46:59: to, like...

01:47:00: I'd, like, mess around with this world

01:47:02: and I kind of forgot because it's been a

01:47:03: while.

01:47:07: Uh, Con's asking, why can't window mode go

01:47:10: below 800 by 600, to create a potentially

01:47:13: lower one? I mean, the simple answer is, like,

01:47:15: things don't really fit on screen

01:47:17: anymore. It gets, like, too cramped and

01:47:19: the text will not be readable. Um,

01:47:25: like, um, yeah, it's, it's pretty much that.

01:47:28: Or, like, like, technically you could go

01:47:29: lower, but, like, it's not really going to

01:47:31: be usable.

01:47:34: Uh, Nytra is asking, with the splittening,

01:47:36: are meshes, uh, being stored more than

01:47:39: once in memory? I think raw data is

01:47:41: stored both in the MeshX class and the

01:47:43: shared memory file, at least. Uh, well,

01:47:46: temporarily, like, so, um, when it uploads

01:47:50: the mesh, it needs to, like, generate the

01:47:52: mesh buffer, but that was also happening

01:47:54: before, because it needed to generate just a

01:47:56: different one, which actually was worse

01:47:58: than this one. However, that mesh buffer

01:48:02: is not, like, unless it's a dynamic mesh

01:48:04: that's being updated, that's not being

01:48:05: kept in memory. So the MeshX, like, it

01:48:08: actually generates the mesh buffer from

01:48:10: MeshX, sends it over, that gets uploaded,

01:48:12: once it's uploaded then the mesh buffer gets

01:48:15: freed, the shared memory gets released. Uh,

01:48:17: if it's a dynamic mesh that is updating,

01:48:19: it keeps it, because, like, otherwise it'd be

01:48:21: just constantly allocating and

01:48:23: deallocating memory, which is not fast. But

01:48:25: for most static meshes, once it's

01:48:27: uploaded to the GPU, that shared memory

01:48:29: is freed. Same with textures: when

01:48:31: textures are being uploaded, um, the in-

01:48:34: memory data, unless the texture is

01:48:36: marked readable, that gets freed, which is

01:48:38: actually, you know, one of the reasons

01:48:40: why, why is there a readable field. Um,

01:48:42: you might also have, like, you know, other

01:48:44: versions of the mesh, because we have, for

01:48:45: example, the mesh collider that has its own

01:48:47: data structure, like the acceleration

01:48:49: structure, that is kept in memory. Uh, and

01:48:51: the mesh oftentimes is kept in memory

01:48:54: for the colliders as well, because it

01:48:56: needs to access the data to figure out,

01:48:58: like, you know, stuff like raycasts and,

01:48:59: like, where exactly you hit.
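
A rough sketch of the lifecycle described above, with every type and function name hypothetical (this is not Resonite's actual MeshX or renderer code): build a temporary buffer, hand it to the renderer, then release it unless the mesh is dynamic.

```cpp
#include <cstddef>
#include <cstdint>
#include <cstring>
#include <vector>

// Hypothetical stand-ins: MeshData plays the role of the engine-side mesh,
// MeshBuffer plays the role of the flat blob placed in shared memory.
struct MeshData   { std::vector<float> vertices; std::vector<uint32_t> indices; };
struct MeshBuffer { std::vector<uint8_t> bytes; };

MeshBuffer BuildBuffer(const MeshData& mesh)
{
    // Serialize the mesh into one contiguous blob for the renderer process.
    MeshBuffer buffer;
    const std::size_t vertexBytes = mesh.vertices.size() * sizeof(float);
    const std::size_t indexBytes  = mesh.indices.size() * sizeof(uint32_t);
    buffer.bytes.resize(vertexBytes + indexBytes);
    std::memcpy(buffer.bytes.data(), mesh.vertices.data(), vertexBytes);
    std::memcpy(buffer.bytes.data() + vertexBytes, mesh.indices.data(), indexBytes);
    return buffer;
}

void UploadToGpu(const MeshBuffer&) { /* renderer side: create GPU buffers from the blob */ }

void SubmitMesh(const MeshData& mesh, bool isDynamic)
{
    MeshBuffer buffer = BuildBuffer(mesh); // temporary second copy while uploading
    UploadToGpu(buffer);

    if (!isDynamic)
    {
        // Static mesh: once the GPU has it, the intermediate blob is released,
        // so the CPU side does not keep the data twice.
        buffer.bytes.clear();
        buffer.bytes.shrink_to_fit();
    }
    // A dynamic mesh would keep the buffer to avoid reallocating it every update,
    // and a mesh collider may still hold its own acceleration structure for raycasts.
}

int main()
{
    MeshData quad{{0, 0, 0,  1, 0, 0,  1, 1, 0,  0, 1, 0}, {0, 1, 2,  0, 2, 3}};
    SubmitMesh(quad, /*isDynamic=*/false);
}
```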

01:49:05: Um, next question. Small B is asking,

01:49:07: would it be an issue updating the main

01:49:09: build right now because of the

01:49:10: splittening? I mean, we can do it.

01:49:13: It's just like switching between builds

01:49:15: is normally like like there's like

01:49:18: multiple components to it. Like

01:49:19: switching to another thing, especially

01:49:21: when you're like deep into something,

01:49:22: it's kind of harder to do because you

01:49:24: have to like do a mental context switch.

01:49:26: And like often times like when I have

01:49:29: like bunch of like

01:49:32: um

01:49:34: like, when, when I have, like, a bunch of

01:49:35: momentum on something and, like, were to do

01:49:38: something else, I have to stop that

01:49:39: momentum and I have to like you know

01:49:40: kind of get going again. So that's one

01:49:42: of the reasons like why often times I'll

01:49:44: stop like doing more regular builds when

01:49:47: something is like you know when I'm

01:49:48: really in depth of something. The other

01:49:51: aspect is you know switching the

01:49:53: branches is usually okay, but with the

01:49:55: splittening specifically, it actually sort

01:49:57: of restructures how the project, you know,

01:50:00: is arranged, and that creates a lot of kind

01:50:03: of work to deal with, like, when toggling

01:50:05: back and forth between the old and new systems.

01:50:06: So like I kind of tend to avoid it when

01:50:09: when I was actually releasing a build

01:50:10: when I was already working on the

01:50:11: splittening I would do it on my laptop

01:50:14: because on my laptop I still had the old

01:50:16: environment because I didn't want to

01:50:18: like mess with it on the on my PC. So,

01:50:22: Um, if, if it really needs to happen,

01:50:24: there's probably going to be, like, you

01:50:25: know, some more builds, but, like, I tend to,

01:50:27: like, avoid it, like, not do them as often,

01:50:30: because it's just, the overhead gets a

01:50:32: bit higher.

01:50:36: Okay, someone's asking, favorite elements in a

01:50:38: sci-fi plot? I really like, like, high,

01:50:40: like, hard sci-fi, like, mystery things,

01:50:43: like, something that's, like, really going

01:50:46: into, like, you know, the laws of physics

01:50:48: are messed up, like, you know, something's

01:50:51: wrong there. Um, there's, like, really good

01:50:53: stuff that happens in The Expanse books

01:50:55: with that, for example. Um, there's also,

01:50:57: like, some cool things with, like, Stargate,

01:50:59: like, it does a lot of, like, really

01:51:01: interesting stuff. Um,

01:51:08: uh, Nick King is asking, "Do you like to

01:51:11: role play?" It depends. What do you mean

01:51:13: by role play? There's like multiple ways

01:51:15: I can interpret that.

01:51:19: Uh,

01:51:22: Dusty Sprinkles is asking, "On a scale of one

01:51:24: to 10, how evil is the current build

01:51:27: system?" I feel like the expected

01:51:30: answer is, like, seven. I don't know.

01:51:34: I don't... how do you rate this? I don't

01:51:36: think it's the worst thing. I would

01:51:37: actually... and it's also, like, a lot better

01:51:39: after, like, a lot of improvements that

01:51:41: Cyro did to it. Um,

01:51:44: so I'm going to say, like, I don't know,

01:51:46: five. And then, like, um, there's a lot of

01:51:50: work that J4 is now doing to, like, sort

01:51:52: of automate things as well. So,

01:51:55: um,

01:51:57: so, like, you know, this, this is going to

01:51:58: improve it even further.

01:52:01: So that's all the questions right now. I

01:52:03: still got like No, there's another

01:52:04: question. Um,

01:52:08: Uh, I've got 8 minutes left, so I could

01:52:11: still answer a few more questions, but I'm

01:52:13: getting a bit tired and I need water.

01:52:16: So, um, I'm kind of considering ending

01:52:18: it a little early. So, if there's no

01:52:20: more questions, I'll probably end it

01:52:22: here. Um, Gw is asking, I just thought of

01:52:26: something: who maintains the wiki? Cuz I

01:52:28: would like examples of what the API

01:52:30: sends or expects. I'm not sure what you

01:52:32: mean by that. Uh, ProbablePrime is one of

01:52:34: the people who does a lot of the wiki, but,

01:52:35: like, it's a shared effort, too. So

01:52:38: there's a lot of people,

01:52:42: there's a lot of people, like, you know, uh,

01:52:44: contributing to it. And I don't quite

01:52:47: understand what you're asking with the

01:52:49: API, like, sends or expects. Like, do you mean,

01:52:51: like, documenting stuff for our API, or...

01:52:56: I'm not quite sure I'm understanding.

01:53:03: okay so no more questions so

01:53:12: Uh, Shadow is asking, "Have you done

01:53:15: anything uh with eye tracking other than

01:53:18: expressions? Some people have menus that

01:53:20: pop up if you look at specific objects."

01:53:22: I don't have anything myself right now.

01:53:24: Well, actually, no, I do. Um,

01:53:27: I have a thing. I I've shown this

01:53:29: before. Uh, so I have like these like

01:53:34: needles. Actually, one thing that it

01:53:37: does, it aims them where I look. So like

01:53:45: so you see like I'm looking over there

01:53:47: and shoot over there. I'm going to look

01:53:48: over there

01:53:52: and shoots over there. So like I've

01:53:54: done stuff like that. I've also done,

01:53:56: like, a bunch of, like, demos, like, um, one

01:53:58: that, like, went pretty viral.
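
The needle demo above essentially aims an object along the eye-tracking gaze; the snippet below is just the generic vector math for that idea (a made-up Vec3 type, not the actual Resonite components or eye-tracking API).

```cpp
#include <cmath>
#include <cstdio>

// Minimal vector type for the example; Resonite's own types are not used here.
struct Vec3 { float x, y, z; };

Vec3 Sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }

Vec3 Normalize(Vec3 v)
{
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / len, v.y / len, v.z / len};
}

// Direction the needle should fly: from its spawn point toward where the gaze ray hit.
Vec3 AimDirection(Vec3 needleOrigin, Vec3 gazeHitPoint)
{
    return Normalize(Sub(gazeHitPoint, needleOrigin));
}

int main()
{
    Vec3 dir = AimDirection({0.0f, 1.5f, 0.0f}, {3.0f, 1.0f, 5.0f});
    printf("aim direction: (%.2f, %.2f, %.2f)\n", dir.x, dir.y, dir.z);
}
```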

01:54:09: Looking is asking, ProbablePrime

01:54:11: maintains the wiki? Yeah, Prime is, like, one

01:54:12: of the main people who's, like, taking

01:54:14: care of a lot of things, but again, like,

01:54:16: it's not... a lot of people contribute to

01:54:18: the wiki, like, you know, and it's, that's

01:54:20: the whole point of it is like anybody

01:54:22: can contribute. So it's not you know

01:54:25: just a single person like who does all

01:54:26: the contributions

01:54:29: and if you want like you know please

01:54:30: contribute to the wiki.

01:54:34: Uh, Kro is asking, swag or cringe? I, um, I

01:54:39: don't really understand. I'm sorry. Uh,

01:54:47: Gw is asking, what I mean by examples is,

01:54:49: like, what fields does it send, because

01:54:51: does it... like, okay, what does it send,

01:54:53: otherwise I have to spend extra time

01:54:55: looking at what I see, but what... I, I'm

01:54:57: sorry, I don't have, like, I don't have

01:54:59: context for what you're asking. Like,

01:55:01: which API are you talking about?

01:55:04: Are you talking, like, the Resonite API, or

01:55:06: talking about the wiki API, like, something

01:55:08: else? Like, I, I, I don't understand.

01:55:19: Okay. Uh

01:55:21: Got, uh, 4 minutes left.

01:55:25: We'll probably, like, end, like, a few minutes

01:55:28: early because my throat is getting all

01:55:30: scratchy.

01:55:31: Also, thanks for watching my stream

01:55:34: earlier. It was pretty cool. Like I

01:55:36: really like the whole interview like

01:55:37: idea like get more people from the

01:55:39: community and show them like around. So

01:55:43: yeah, I kind of need water and questions

01:55:45: are kind of slowing down. So um

01:55:50: um so thank you very much. Um

01:55:54: oh

01:55:56: My De was asking, who do you main in

01:55:58: Smash? I, I don't play that game,

01:56:01: unfortunately. I'm not really a Nintendo

01:56:03: person. I've been like I had like a Sega

01:56:06: like but um I'm not really much for

01:56:10: consoles

01:56:11: right now.

01:56:13: Anyways, uh so thank you everyone for

01:56:16: watching. I hope like you enjoyed like

01:56:18: the resonance. Um

01:56:20: uh, and, like, asking all the

01:56:22: questions and like listening to answers,

01:56:24: listening to me rambling and so on. Um,

01:56:28: so what? My brain's kind of slow. Um,

01:56:34: anyway, thank thanks everyone, you know,

01:56:36: for, for, like, watching the stream. Thanks

01:56:39: for, like, you know, being part of the

01:56:40: community, doing kind of cool things,

01:56:42: supporting us, you know, whether it's

01:56:43: like just being part of our community,

01:56:45: like, you know, making this platform

01:56:46: kind of come alive or supporting us

01:56:48: financially, whether it's like, you

01:56:50: know, Patreon or Stripe. But if you're

01:56:52: still on Patreon, consider switching to

01:56:54: Stripe. Like, literally, even if you

01:56:56: switch on the same tier, we get more

01:56:57: money out of it, so that actually helps

01:57:00: us out more and we can then, like, you

01:57:01: know, invest, like, more into the

01:57:03: project. Um, uh, there's also, like, you

01:57:06: know, other office hours that you can,

01:57:08: like, check out. Like, Prime does his, like, on

01:57:10: Tuesday, there's art ones, like, on

01:57:12: Wednesday. Uh, there's the moderation

01:57:14: ones, they're, like, right before this one.

01:57:16: Uh, you can find, like, a bunch of

01:57:17: information in our Discord. So thank you

01:57:19: very much for watching and um we'll see

01:57:22: you with the next one. Next week,

01:57:24: like, it should be pretty normal. So, um, I

01:57:27: don't expect, like, those to be skipped.

01:57:29: So um thank you very much and see you

01:57:33: next time.