The Resonance/2026-01-18/Transcript

From Resonite Wiki

This is a transcript of The Resonance from 2026 January 18.

This transcript is auto-generated from YouTube. There may be missing information or inaccuracies reflected in it, but it is better to have searchable text in general than an unsearchable audio or video. It is heavily encouraged to verify any information from the source using the provided timestamps.

00:00: Start recording. Okay, we should be

00:02: live.

00:04: >> Hi.

00:05: >> Hello. Actually, let me double check I

00:07: have enough space on my drive so it

00:09: doesn't stop recording. Yeah, I got over

00:12: 200 gigs. So, that's good. I'm going to

00:14: post the announcements.

00:18: Oh, we already have people. We're

00:20: already

00:22: almost ready.

00:25: Here's the announcement.

00:29: And here's another announcement. Uh, if

00:33: I can find it. Where is it? Live

00:35: streams. There we go. Live streams. Have

00:37: it really prepared. There's another

00:39: Twitch. There we go. I'm going to post

00:43: socials announcements.

00:46: Post. Post.

00:49: And we're live. Okay.

00:52: Going to keep this open. Um, hello

00:55: everyone. We already got people. Uh, let

00:58: me turn this off. Hello everyone.

01:01: Let me adjust the camera too. So, it's

01:04: going to There we go. Um,

01:08: we got Grand Okay, we got chicken focus.

01:10: We got as twitch. Oh, as a different

01:12: color now. Oh, no. I'm concerned with

01:15: this first two. Um, we'll get to those

01:19: later. But, um, anyway, hello and

01:21: welcome to another episode of Resonance.

01:23: Oh, hello

01:26: U1F98A.

01:28: I'm sure I don't know how to pronounce

01:29: your name. Um, check.

01:34: >> Oh, interesting. Rendered. That's kind

01:36: of interesting.

01:41: >> Anyway,

01:42: >> hello and welcome to another another

01:45: episode of Resonance. Um, we're back in

01:48: VR, unfortunately from IRL. Um

01:52: but we are here to answer any questions

01:54: you might have like about Resonite

01:55: whether it's like you know technical

01:56: ones whether it's like you know about

01:58: the platform, philosophies of it, future, past,

02:00: present like whatever whatever you want

02:02: to ask even if you have like you know

02:04: any like um personal questions like you

02:06: want to get us know more feel free to

02:08: ask the only thing make sure to put a

02:10: question mark in the chat that way it

02:13: shows up on our uh on our list and like

02:15: you know we can go through the question

02:18: um we also have a bunch of questions

02:20: piled from the Discord like in advance

02:22: questions. So, we're going to go through

02:23: those first and we're going to go to

02:25: Twitch. Also, I've got Cyro. Um, I think

02:30: everything

02:32: >> I haven't done this in a few weeks and

02:33: so I'm like I'm always like,

02:37: >> how do I do this?

02:39: >> So, I should have probably made a camera

02:40: anchor. Uh, let me make a camera anchor.

02:44: Uh, put this here so it's the camera is

02:47: not floating around.

02:49: Uh, seems a good angle.

02:55: Come on. There we go.

02:58: So, we are here in the Creator Jam um

03:03: Creator Jam New Year's World. It's very

03:05: fitting. Um, and also, yes, uh, we're

03:10: doing a little bit of um, in advance

03:12: questions. Uh, Navy 3000 is asking, "Why

03:15: is Chairo?" What? I cannot talk. You're

03:18: Chyro now.

03:20: >> Why is Cyro?

03:22: >> Why is Cyro in the chair? Well, we told

03:24: him to get in the chair, so he got into

03:27: the chair.

03:28: >> Yeah, everybody always says to sit in

03:30: the chairs. So, like I'm sitting in the

03:32: chair like everyone asked me to. So,

03:35: >> yeah.

03:36: See, that's the that's the power of VR.

03:38: You can like literally sit in the chair.

03:41: When people say, you know, you can sit

03:43: in the chair, like what they mean is to

03:45: say you sit on the chair, but in VR, we

03:48: have the power to actually sit in the

03:50: chair. So,

03:52: >> yeah,

03:53: >> the power of VR,

03:55: >> not even in VR.

03:56: >> It's the power of VR.

03:59: R&B.

04:00: >> This makes me think of like, you know,

04:02: that like um

04:04: you know like in everything everywhere

04:06: all at once when they go like you say I

04:10: can't be here when you mean I shouldn't

04:12: be here because you see I can be here.

04:17: Anyway, we should get we should get into

04:19: the questions. Alo hello Rising and Hund

04:22: and BL

04:25: good people. Um, so

04:30: let's start with the questions uh coming

04:32: from the Discord. Uh, the first one, um,

04:35: I kind of wish I was actually more

04:36: prepared for this one. Um, Colin the cat

04:39: is asking. Actually, let me duplicate it

04:41: because it's kind of hard to read this

04:43: way. I'm a good thank you. Oh, we have

04:45: to do the

04:47: Yes, this is how you subscribe Confetti.

04:49: Thank you. Thank you for your

04:50: subscription.

04:52: Um, actually, let me spawn another chat

04:54: so we can be also confetted in case more

04:56: of this happens. Like, hint hint nudge

04:59: wink.

05:01: Uh, I'm just going to turn this off.

05:04: We don't need pin messages for this one.

05:06: And just show this under the floor.

05:08: There we go. So, if anything happens,

05:11: we'll be sprayed with confetti. Uh, but

05:13: yes, Colin is asking, uh, how were the

05:17: real-time, infinite-FPS,

05:19: uh, ray-traced New Year fireworks? They're

05:22: actually pretty cool. Um, we don't see

05:25: Oh, oh my god. Thank you.

05:28: Thank you, Cavior.

05:30: Um,

05:32: uh, went to like, um, like a New Year's

05:34: party. Uh, we like went with the community,

05:37: 10 people. Um, we didn't see too much

05:38: fireworks in person. There were like

05:40: some people like uh shooting them out

05:42: like in the like around the area. Um I

05:44: actually managed to get some really good

05:45: photos

05:47: uh where I was like trying to photo the

05:49: fireworks like real quick because they

05:51: were kind of happening in real time. The

05:53: camera happened to focus on like tree

05:54: branches in front and it actually looks

05:57: super cool. But I still need to process

05:59: those photos. Well, finish processing

06:00: them because I kind of started but I

06:02: didn't finish them and I wanted to show

06:03: you but I'll probably have to like bring

06:04: it to the next Resonance like um finish

06:07: those for that. Um, so I can show you,

06:09: but it was pretty neat like hanging out

06:11: with everyone. We also watched the

06:13: creator jam like New Year's like put it

06:15: like on the TV. So we're kind of like

06:17: watching it even though like we're not

06:18: able to be there like um in VR directly,

06:21: but we saw like you know people like

06:22: hang out in this world and um talking

06:25: about stuff and exploring.

06:27: Um so it was pretty cool.

06:33: >> They're very loud. A couple of them went

06:35: off like in the in the neighborhood like

06:37: those like concussion mortars. I think

06:39: uh some people shot them a little too

06:41: low and it sounded like a [ __ ] cannon

06:43: outside the window.

06:44: >> Yeah, it was loud. Um but they were

06:47: cool. I I do wish I had processed the

06:50: photos because I would like to show you but

06:52: I don't have those processed yet. Um ask

06:54: me like ask me again for the next 30

06:56: minutes. Hopefully I'll have them

06:58: processed by then. Um

07:01: let's see. Uh

07:06: uh which order are these in? Is this the

07:08: next one?

07:09: >> Uh top down left. Top down, left, right.

07:12: >> Okay. It kind of doesn't matter that

07:14: much anyways, but uh I'm going to put

07:17: this one here. This is a longer one. Um so

07:20: for this one, uh Stara is asking, only

07:23: tangentially related to Resonite, but what

07:25: is your technique for capturing images

07:27: for good Gaussian splats? I know you need

07:29: to take a lot of images, a lot of

07:30: angles, but what about the speed you

07:31: move your camera while taking photos or

07:33: how much should I be moving around the

07:35: subject or how close and far should I be

07:37: from the subject? Also, what would be a good

07:39: splat count for comfortably viewing a Gaussian

07:40: splat in VR in Resonite without tanking

07:42: performance too much for the majority of

07:44: users. So, um for 3D scanning in general,

07:48: I would generally recommend taking

07:50: like lots of photos uh making sure your

07:52: photos are good quality. You want to

07:54: have also like a good coverage. Um, so

07:58: let me actually grab a brush. Uh, so I

08:00: can kind of sketch this a little bit.

08:02: Um,

08:04: uh, brushes, brushes, brushes. Where's

08:06: brushes? Geometry line brushes. There we

08:08: go. Um, so say like you have like a

08:11: subject like you know, say like this is

08:13: like I don't know, I'm going to make

08:14: like a simple shape, but like say I'm

08:16: scanning this thing and this is kind of

08:18: gnarled. Um,

08:21: you want to take like lots of photos and

08:23: imagine this is like a photo. You want

08:25: like lots of parallax,

08:28: but also you want each parallax to have

08:29: like a lot of overlap because like the

08:32: cameras they you know you need like an

08:34: area where it's like overlapping because

08:36: it uses it to match those cameras to

08:38: where they are. You also want to do like

08:40: multiple angles. So sometimes you'll do

08:41: like you know angle like this and do

08:44: like you know bunch of them like this.

08:46: Uh you can also do this by doing like

08:50: sort of like semicircle around some

08:51: areas.

08:53: Uh so you can do like you know cameras

08:55: that are like this

08:58: so so

09:01: you know that kind of goes around here

09:03: but also another important part is like

09:04: you have to consider the geometry of

09:06: what you're scanning. Uh because

09:08: whenever you take a photo imagine like

09:10: you know imagine like the camera casts

09:12: light into the scene. So imagine like

09:14: this is you know casting light and it

09:16: sort of like illuminates the object and

09:18: only the illuminated parts you know

09:20: actually let me let me show it on this

09:21: camera. It's going to be easier. Imagine

09:23: this cast a light

09:26: and this cast a light. Uh so this is

09:28: like the sort of light that is casting.

09:31: Think about like where the shadow is

09:33: because like you know this part is going

09:34: to be obscured. So like if I go from

09:36: this camera here,

09:38: this this area here that's in shadow

09:41: which means it's not seen by this

09:43: camera. Um so the only surfaces that are

09:46: actually seen are you know here, here,

09:50: and here.

09:52: And for a scan to work like you know to

09:54: give you good reconstruction each piece

09:56: of the surface needs to be seen at least

09:59: from two cameras from different angles

10:02: and different positions uh because it

10:04: needs to use that to sort of estimate

10:06: the you know the depth. So if I have

10:08: this camera this me another view and

10:10: then you know the shadow is going to be

10:12: like this. So it doesn't see these but

10:14: like it it sees this part and it sees

10:16: this part which like you know helps for

10:18: this part. It helps correlate these two

10:20: cameras and it can reconstruct this part

10:22: of the object but this is only seen by

10:24: this camera because this camera doesn't

10:27: see this part which means you need

10:29: another camera that covers this part you

10:31: know so it it can also reconstruct. So

10:34: if you have like you know geometry like

10:35: this like you have more complex geometry

10:38: you might want to add more cameras from

10:40: you know different angles so you get

10:42: like full coverage and this applies both

10:45: for traditional photogrammetry and it

10:47: also applies for Gaussian splatting. Uh

10:50: essentially every piece

10:52: of the surface of the object that you

10:54: want to reconstruct needs to be seen

10:55: from at least two angles, ideally more. Um
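
To make the coverage rule concrete, here is a minimal Python sketch (illustrative only, not Resonite or photogrammetry-tool code; the camera layout, the facing-angle threshold, and the omission of an occlusion test are all assumptions) that counts how many planned camera positions can plausibly see each sampled surface point. Any point seen by fewer than two cameras is a spot that needs more photos.

```python
# Illustrative sketch only (not Resonite or photogrammetry-tool code): count,
# for a set of sampled surface points, how many planned camera positions can
# plausibly see each point, using a simple facing-angle test. Occlusion by
# other geometry is deliberately ignored to keep the sketch short; a real
# check would also ray-cast against the scene.
import numpy as np

def coverage(points, normals, cameras, max_angle_deg=75.0):
    """points, normals: (N, 3); cameras: (M, 3). Returns per-point view counts."""
    counts = np.zeros(len(points), dtype=int)
    cos_limit = np.cos(np.radians(max_angle_deg))
    for cam in cameras:
        to_cam = cam - points                       # vectors from point to camera
        to_cam /= np.linalg.norm(to_cam, axis=1, keepdims=True)
        facing = (to_cam * normals).sum(axis=1)     # cosine of the viewing angle
        counts += (facing > cos_limit).astype(int)
    return counts

# Example: a unit sphere sampled coarsely, photographed from one side only.
theta, phi = np.meshgrid(np.linspace(0, np.pi, 12), np.linspace(0, 2 * np.pi, 24))
pts = np.stack([np.sin(theta) * np.cos(phi),
                np.sin(theta) * np.sin(phi),
                np.cos(theta)], axis=-1).reshape(-1, 3)
cams = np.array([[3.0, 0.0, 0.0], [2.5, 1.5, 0.0], [2.5, -1.5, 0.0]])
seen = coverage(pts, pts, cams)                     # unit sphere: normal == position
print("points seen by fewer than 2 cameras:", int((seen < 2).sum()), "of", len(pts))
```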

11:00: there's another aspect of um of Gaussian

11:05: splats though. So let me move this. I'm

11:07: going to remove these.

11:10: Um there's one more aspect for like

11:12: Gaussian splats that's different from

11:13: photogrammetry. Gaussian splats in

11:16: general, um, they um

11:22: they sort of like learn how they should

11:24: look to match your source cameras. If

11:26: you do 3D scanning, um

11:30: actually do say like do this like say

11:33: like you're scanning this thing. Um if

11:36: you're scanning with uh f

11:39: work if you're scanning

11:40: photogrammetry, you can kind of

11:43: actually no let me let me do something

11:44: different. Um let's make this simple.

11:48: Let's say you're scanning this. With

11:50: photogrammetry you can you know just

11:53: take pictures from this side and like

11:56: you know it sees these cameras will see

11:59: the whole surface. You take picture here

12:01: and this sees you know like you cover

12:03: the entire surface and photogrammetry

12:06: it actually reconstructs the surface. It

12:08: will try to figure out okay this is how

12:10: far it is based on multiple cameras. It

12:12: figures out where this line is. With

12:15: gashion splats, they actually work

12:18: different like they gashion splits will

12:20: learn how they should look. So they

12:22: match your cameras from where they see

12:26: the image like where they match the

12:27: images. But everywhere you don't have

12:30: data, they can do whatever. So if you do

12:34: photogrammetry and you reconstruct

12:35: this with photogrammetry, you probably

12:38: like if if it's good quality, you're

12:40: going to get nice, you know, straight

12:42: surface. Uh, and if you if you then

12:45: look, you know, this is not a camera,

12:47: you look from this side, it'll still

12:50: look kind of correct. Maybe textures

12:52: will be a little bit smudged like maybe

12:53: not. Maybe like, you know, you have good

12:55: quality textures, but it will look

12:57: correct because you you reconstruct

12:58: actual geometry. Gaussian splats, they

13:01: don't actually reconstruct geometry. So,

13:03: if you reconstruct it as a Gaussian splat,

13:05: it will look correct from here, but if

13:07: you look at it from here, it might just

13:10: be like, you know, garbage. Um, and it

13:13: often times happens because like the

13:14: Gaussian splats like they don't care how

13:17: they look for like viewpoint angles that

13:21: are not in your input data. So for

13:23: Gaussian splats in particular, you have

13:25: to make sure you capture it from all the

13:27: different angles that you want the scan

13:30: to be viewed from. And this can be like

13:32: particularly tricky like when you're

13:33: doing like for example like a street 3D

13:36: scan because I've had like many scans

13:38: where like you know I'm on the street

13:39: like you know say like I don't know like

13:40: there's a lamp you know and people are

13:43: like this

13:45: and then I, you know, thank you, too,

13:48: for the subscription, and I'll be like

13:50: you know doing scanning you know from

13:52: the surface like this and I cover like

13:56: you know go both ways but then I like

13:59: process it and then I go with camera and

14:01: look straight down. This will look

14:03: transparent. It will look correct from

14:05: the angles you normally view the street

14:06: at, but you look at angle that like you

14:08: didn't capture and suddenly the scan

14:10: breaks apart. And actually, I have a 3D

14:14: scan that shows this like in in a very

14:17: stark way. Um, well, I have a video of

14:20: one.

14:21: Uh, give me a second. I'm going to spawn

14:25: it out.

14:27: So, this Oops.

14:31: Make sure this actually shows on the

14:33: camera properly. Um,

14:37: hold on. I'm going to

14:40: reset this. I wish I could just play

14:42: this from the back. So, this first scan,

14:44: this is recon. Oh, it's not playing.

14:47: This is reconstructed with like, you

14:49: know, photogrammetry. And you see like

14:50: it's just three cameras. The blue um,

14:53: let me actually move this back. Uh

14:56: the blue the blue like squares you see

14:59: that's the cameras. This is a scan from

15:01: just three photos.

15:05: Um

15:08: and uh you see like like even look at it

15:11: from the side it still looks kind of

15:13: correct. There's the shape. There's like

15:14: a lot of missing because I didn't see

15:15: parts of it but like the shape is

15:17: generally correct. Um and that's with

15:19: photogrammetry. Then I reconstructed

15:21: the same three photos, I

15:23: reconstructed them with Gaussian splats. And you

15:25: can see with the default like like this

15:29: it's fine. But the moment you look away,

15:31: it just breaks apart like completely.

15:33: There's no actual geometry. Um the Gaussian

15:37: splats have learned how they

15:38: should look. So they match the input

15:41: cameras. So if I look from like around

15:43: angles where you know the cameras are

15:45: from it looks fineish. But the moment I

15:48: like you know go away from them it just

15:50: completely breaks apart. And this is an

15:52: extreme example but like Gaussian splats

15:53: will do this. So like you have to be

15:55: very um cautious about like you know

15:58: which angles

16:00: you you know capture your thing from for

16:02: a good Gaussian splat like compared to

16:05: photogrammetry you might want to like

16:06: add additional angles that you wouldn't

16:08: do with photogrammetry. But in general

16:10: you want lots of parallax. You want lots

16:12: of you know like overlap between them.

16:15: Um, but also you want your photos, like

16:17: you know the other part of your

16:18: question, you want your photos to be

16:19: sharp. And usually the way I approach

16:21: this because I'm taking lots of photos

16:23: is I will literally be like holding the

16:24: camera and be like click move click move

16:27: click move click move click move because

16:30: you don't want to be moving the camera

16:31: all the time because then like you know

16:33: you get like if you're taking p pictures

16:36: uh you might get motion blur and also

16:38: depends you know on lighting like if you

16:40: if you like if it's daylight you can set

16:42: the shutter speed to be like super fast.

16:45: So like you know like you can be like

16:46: literally moving and you don't get

16:47: motion blur. In that case you can just

16:49: you know keep moving without stopping.

16:52: But if you have like lower light

16:53: settings you might want to like you know

16:55: to do like click move click move click

16:57: move click move click move kind of like

16:58: motion. Um it's also like one of the

17:01: reasons people sometimes ask can I use

17:03: video for the reconstruction which yes you

17:06: can but I don't recommend it because you

17:08: actually get worse uh results as like

17:11: you know from the video because like you

17:13: have you have the video motion blur we

17:15: have the video compression and the

17:18: frames like usually it's better to have

17:19: the frames be higher resolution where

17:22: they have more detail in them for the

17:24: software to use and videos typically are

17:27: um lower resolution compared to photos,

17:30: especially raw photos. Um,

17:33: and also like you know, um, there's like

17:37: way too many of them, so like it might

17:38: also take longer to process. So like it

17:41: kind of generally they're not a good

17:43: idea. Um,

17:46: the photos are kind of like better. Uh,

17:48: for how close and far is from the

17:49: subject depends on the geometry. You do

17:51: want to maintain the overlap. Sometimes

17:53: like when I'm 3D scanning something I

17:55: will like you know if I want to get more

17:57: details somewhere you know say like this

17:59: is like this thing and say there's like

18:00: a statue or something uh you know um I

18:05: will take a lot more photos close

18:07: around the statue and then like you know

18:10: fewer photos just for the general

18:13: oops wrong way uh for the general kind

18:15: of you know area. So, you want to take,

18:18: you know, photos where you're closer,

18:20: but like if you want more detail there,

18:22: uh, because you also get like texture

18:24: from it. So, like if you photograph

18:25: something like from really far away, um,

18:28: then like, you know, you're not going to

18:29: get as much detail and as much texture

18:31: because this is going to be smaller in

18:33: your photos. Um,

18:38: so I think that kind of covers most

18:40: everything um, except for the last part

18:42: of the question. I'll get to that in a

18:43: second. But uh yeah, like generally

18:47: for me it's like you know like when I'm

18:48: 3D scanning it's kind of largely now

18:50: intuition because I've like done lots of

18:52: scans so I know like what works what

18:54: doesn't. Generally I kind of follow

18:56: these kinds of rules. Um if I want more

18:59: detail, if I want like more kind of like

19:00: you know reconstruction I will take more

19:01: photos around that area where I want

19:03: more detail and where there's more

19:04: geometric complexity to capture. Um

19:09: for the other part of the question, what

19:11: would be a good splat count for

19:12: comfortably viewing a Gaussian splat in

19:14: VR? Um I would probably recommend like

19:17: not more than half a million splats. Um

19:20: keep it like you know in the hundreds of

19:22: thousands once you get like into like

19:24: million or like 2 million, 3 million,

19:26: that gets heavier. But also like it's

19:28: not that simple. Uh, it depends, like, a huge

19:32: factor with Gaussian splats is how big the

19:34: splats are. Um, and it kind of depends,

19:36: you know, what the construction you use

19:38: and what subject you're capturing.

19:39: Because some splats you will have a lot

19:41: of huge, you know, splats that are like

19:44: overlapping each other. And if you have

19:46: like fewer splats, like you can

19:47: literally have like just 10,000 splats.

19:49: And if you have like a lot of

19:50: overlapping ones, that's going to be

19:53: slow because overdraw like it's going to

19:55: it's going to kill that. Um, so it's not

19:58: as straightforward. sometimes like you

20:00: know you can also afford to have a

20:01: little more splats if the splats are

20:03: smaller and they're not like overlapping

20:04: as much. Um there's essentially two

20:08: parts to 3D scanning like sorry there's

20:11: two parts to like rendering Gaussian splats

20:13: where part is like the actual processing

20:15: of the individual splats where it

20:16: doesn't matter what their parameters and

20:19: their size are, where it has to like you know

20:20: do sorting and computations and that's

20:22: where the splat count matters and it

20:25: kind of sort of rises sort of

20:27: quadratically with the number of

20:28: splats. Like if you go like in a 3

20:29: million like that's going to take a lot

20:31: of performance for sorting. And the

20:32: other part is the actual rendering. Uh

20:35: and these are kind of like

20:37: semi-independent.

20:39: So you can you can have like you know

20:41: very few splats but they're very big and

20:43: overlapping and it's going to hurt on

20:45: the overdraw. Or you can have like you

20:47: know lots of splats and it's going to

20:48: take a lot of for sorting but then it's

20:50: like kind of quicker to render. Or you

20:52: can have both and it's going to really

20:53: hurt. Um so it kind of depends.

20:56: Sometimes you might want to like you

20:57: know clean clean those up. It also

21:00: depends on the hardware because it's

21:01: kind of you know spans are heavier to

21:03: render than traditional scans. Um

21:07: and they will also take a lot of VRAM,

21:09: although we have, like, Resonite will

21:11: compress them in VRAM to kind of keep

21:13: like their footprint lower but even then

21:17: like you know it's they're heavier than

21:19: traditional geometry. Uh so it's going

21:21: to depend on how much VRAM somebody has

21:22: and what kind of GPU they have. It is

21:24: pretty it can be pretty heavy on

21:26: overdraw and it can be heavy on compute.

21:29: So I think it kind of hopefully answers

21:32: this.
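
As a rough illustration of the two separate costs described above, here is a toy cost model in Python (the constants, and the log factor used for the sorting side, are made-up assumptions; only the shape of the trade-off is the point): per-splat processing and sorting grows with splat count, while overdraw grows with how many pixels the splats cover.

```python
# Rough, made-up cost model just to illustrate the two separate costs described
# above: per-splat processing/sorting (depends on splat count) and
# fill-rate/overdraw (depends on how much screen area the splats cover).
# All constants, and the log factor for sorting, are arbitrary assumptions;
# only the shape of the trade-off is meant to be realistic.
import math

def frame_cost(splat_count, avg_pixels_per_splat,
               sort_cost_per_splat=1.0, shade_cost_per_pixel=0.02):
    sort_cost = splat_count * math.log2(max(splat_count, 2)) * sort_cost_per_splat
    overdraw_cost = splat_count * avg_pixels_per_splat * shade_cost_per_pixel
    return sort_cost, overdraw_cost

# Many tiny splats: the sorting/processing side dominates.
print(frame_cost(3_000_000, avg_pixels_per_splat=4))
# Few but huge, heavily overlapping splats: overdraw dominates instead.
print(frame_cost(10_000, avg_pixels_per_splat=20_000))
```

Either extreme, many tiny splats or a few huge overlapping ones, can end up expensive, which is why splat count alone does not tell the whole story.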

21:37: Uh let's see

21:40: the next question

21:42: we got uh is from Mintshock. Uh Mshock

21:46: is asking what mechanism is used to

21:48: determine what should be drawn before

21:50: after something on canvas. I haven't

21:51: been able to figure that out on my own.

21:53: My theory is that adjacent elements

21:55: render from top to bottom uh items in

21:57: terms of ordering in the inspector so that

21:59: lower slots render over higher ones. Um

22:02: but I don't understand how rendering of

22:04: parent and children is determined

22:06: kind of close like it it depends on the

22:08: sorting in the uh hierarchy. So on the

22:11: children so generally um if you have

22:14: like hierarchy let me just draw a quick

22:17: one. Uh

22:19: so let's say this is hierarchy. Let's

22:21: say this is the root

22:23: and then I got another one.

22:26: So the way it's going to like lay things

22:28: out is essentially it goes literally

22:30: like top to bottom. So, it'll start here

22:32: and it'll go here and you know and like

22:35: then it goes here and draws this and

22:37: then this will draw on top of this and

22:39: then it goes here and all of this will

22:42: draw on top of this

22:44: and then like you know whatever is here

22:46: is going to draw on top of this and

22:48: whatever is here is going to draw on top

22:50: of this. So, it's kind of just going

22:52: down the hierarchy. It's assuming you

22:53: use the um UI material for those like

22:57: because it we will try to like order

22:58: them but if you use material that kind

23:00: of renders out of order then like it'll

23:02: kind of like take it sort of out of that

23:04: order. Um which I generally don't

23:06: recommend, like, I recommend using the

23:08: UI material so they can be like sorted

23:10: properly. Um but gen like it kind of

23:12: goes like top to bottom.
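
A minimal sketch of the draw-order rule described above (hypothetical structures, not Resonite's actual API): UI elements draw in a depth-first, top-to-bottom walk of the slot hierarchy, so later siblings and deeper children render over whatever came before.

```python
# Minimal sketch (hypothetical structures, not Resonite's actual API) of the
# draw-order rule described above: walk the slot hierarchy depth-first, top to
# bottom, so later siblings and deeper children draw over earlier ones.
from dataclasses import dataclass, field

@dataclass
class Slot:
    name: str
    children: list["Slot"] = field(default_factory=list)

def draw_order(root: Slot) -> list[str]:
    order = []
    def visit(slot: Slot):
        order.append(slot.name)        # the parent draws first...
        for child in slot.children:    # ...then each child, in listed order,
            visit(child)               # draws over everything before it
    visit(root)
    return order

canvas = Slot("Canvas", [
    Slot("Background"),
    Slot("Panel", [Slot("Text"), Slot("Icon")]),
    Slot("Overlay"),                   # last sibling ends up on top
])
print(draw_order(canvas))
# ['Canvas', 'Background', 'Panel', 'Text', 'Icon', 'Overlay']
```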

23:18: Uh,

23:21: next question we got is uh

23:24: from computer user fake. They're a fake

23:27: computer user. Um, have you considered

23:30: making uh an FAQ for common

23:33: issues and suggestions that have already

23:34: been addressed? Every now and then I

23:36: hear or see someone mention a topic that's already

23:37: been addressed in a previous office

23:39: hours talk. It could be

23:41: useful to have a place to point for

23:43: common issues suggestions. Um, we do

23:45: make clips from this. So on our YouTube

23:47: channel, if you go to the Resonite

23:50: YouTube channel, uh there's like whole

23:52: playlist of like resonance clips. Uh

23:54: that one has like a lot of commonly

23:56: asked questions. We can kind of, you

23:57: know, clip those out and usually we'll

23:58: kind of direct people to those. It might

24:00: be a good idea to like, you know,

24:01: collect some of those like, you know, um

24:03: into some document or somewhere, but

24:04: like you can also just, you know, go

24:06: there and search. Um

24:08: um also feel free like you know if you

24:10: want to like put them on wiki like um I

24:12: think some people have already done that

24:13: like for some of the clips like where

24:14: you kind of transcribe some parts or

24:16: include it. So um

24:21: yeah it might, it might help with

24:23: like maybe like some of the most common

24:25: questions but uh the YouTube like

24:27: playlist is a good place to start. I

24:29: feel. We

24:32: could maybe like make like a something

24:33: like on the wiki where there's like really like

24:35: common questions and then like put like

24:37: links to those clips there.

24:42: Uh

24:46: next question um from colat uh when is

24:51: phase one of rendering going to be

24:52: finalized? So generally we don't do like

24:55: when

24:57: um it depends a lot on uh

25:01: oh I think the show is starting

25:04: maybe it depends when like you know we

25:08: feel comfortable with it. Uh right now

25:10: like we're kind of focusing on other

25:11: things like the idea of that was you

25:12: know just kind of open that up and be

25:14: like okay like we're going to collect

25:15: like a bunch of like feedback. uh we're

25:17: going to let you know community kind of

25:18: go through it give it some time then

25:20: we're going to look at it you know

25:22: iterate and like once we feel

25:23: comfortable uh this is a good list and

25:26: we got like you know the feedback that

25:27: we've gotten um we'll say you know it's

25:30: finalized but right now I don't know

25:31: when it's going to happen we're kind of

25:32: focusing on other things still uh so at

25:34: some point like we're going to loop back

25:36: to it like you know we'll go through the

25:37: feedback evaluate then

25:44: uh next questions from Ozie Okay.

25:48: Uh Elsie is asking uh there's a

25:51: particular part of the upcoming sliding

25:53: module. I'm curious about the

25:54: orientation for sliding particle being

25:56: contractually facing away from the

25:58: normal of the collider. Uh it's sliding

26:01: off in non-view mode. Say you have

26:03: snowflake going on the surface. You

26:05: expect it to face away from surface as

26:07: it goes down. I know orient by velocity

26:10: can sort of do this. Uh but gravity will

26:12: always cause particle to face weirdly

26:14: through collider due to the velocity

26:16: being down. Um so it kind of happens

26:19: like what do you mean like facing like

26:20: what the module does um if you have like

26:23: you know surface here

26:26: um

26:28: and we've got like a particle and say

26:29: the particle is going this way and it

26:31: collides it's going to make it face like

26:33: kind of this way and it'll keep that

26:35: normal sort of like you know the it will

26:38: keep the velocity sort of aligned so it

26:41: doesn't face through the thing. So like

26:42: if you use like align by velocity you

26:44: could you know have it um if you want

26:47: the particle like you know to be facing

26:48: this way. So you configure it so like

26:51: when the normal is this way it's like

26:53: you know it's facing this way which

26:55: means when it's like this it would be

26:57: kind of you know facing this way and if

27:00: it's like snowflake like it's probably

27:01: going to be like you know going um it's

27:05: going to be doing kind of mostly

27:06: straight down which means like it should

27:08: kind of work out but um um generally

27:13: like like the sliding it doesn't affect

27:16: the

27:18: like it doesn't affect like the the

27:19: facing it just affects the velocity or

27:22: like direction. Um, so you can kind of

27:24: like work with that. But if there's like

27:27: some need for something more complex, we

27:28: have to kind of like, you know, look at

27:29: it more in detail and sort of evaluate

27:31: what could be done potentially. But, uh,

27:35: that's kind of like what it's going to

27:36: do out of the box.
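
A hedged sketch of that sliding behaviour in plain vector math (not the actual particle-module code): on contact, the velocity component pointing into the surface is removed, so the particle slides along the collider, and the surface normal gives the direction the particle should face away from.

```python
# Hedged sketch, in plain vector math rather than the actual particle-module
# code, of the sliding behaviour described above: remove the velocity component
# pointing into the surface so the particle slides along the collider, and use
# the surface normal as the direction the particle faces away from.
import numpy as np

def slide_on_surface(velocity, surface_normal):
    n = surface_normal / np.linalg.norm(surface_normal)
    v_slide = velocity - np.dot(velocity, n) * n   # strip the into-surface part
    return v_slide, n                              # n: facing direction for the particle

# A snowflake falling straight down onto a tilted surface keeps only the
# along-surface part of its velocity and faces away from the surface.
v, n = slide_on_surface(np.array([0.0, -9.8, 0.0]),
                        np.array([0.3, 1.0, 0.0]))
print("sliding velocity:", v.round(3), "facing:", n.round(3))
```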

27:43: Uh,

27:45: next question is from uh, Mshock. Uh is

27:49: there a technical reason for Resonite Link to

27:51: be exclusive to host especially for

27:53: stuff like the Unity SDK when that's a

27:54: thing at some point um so it kind of

27:58: makes some things a little bit easier

27:59: because like one it gives host the

28:01: control like you know it's their session

28:03: so they kind of decide how things kind

28:05: of work but also like you know uh for

28:07: host we can assume like the permissions

28:09: don't really change like you know if you

28:11: start the world and you have builder

28:12: permissions then like you have builder

28:13: permissions throughout the entire

28:15: session. um we can potentially expand it

28:18: to like users but it just adds a little

28:19: bit of additional kind of complexity for

28:21: some things and we probably want to like

28:23: add like permissions where the host can

28:25: determine that because like the host

28:26: might be like well I don't want people

28:28: like to use like Resonite Link in my

28:29: session. Um so there's like both like

28:34: technical and there's also kind of like

28:35: you know session kind of like moderation

28:37: like if the host doesn't want people to

28:38: use it then um they should be able to

28:40: say so.

28:46: Uh next question uh this one this one

28:50: has multiple things. So uh

28:54: this might be trickier to get uh put

28:57: this here I guess. Um,

29:00: uh, Kra is asking, "Hey, Froox, got a

29:02: few." One, hope you've been doing well

29:03: and taking care of your mental health?

29:06: Thank you. Oh, it is starting.

29:12: >> I'm actually going to lower that.

29:15: >> Yeah, that's really uh

29:19: >> Yeah.

29:19: >> Yeah, I thought this might happen.

29:22: >> I just I lowered the

29:24: >> It's the New Year's thing. It's

29:26: activating animation. I couldn't stop

29:28: it. I just lowered my multimedia so

29:30: like it's not like loud. Let me know if

29:32: it's too loud on the stream, but uh it

29:34: should be lowered.

29:36: Um

29:38: anyway, yes, like it's been kind of

29:40: better. I've been like taking like a

29:41: break from things and doing some more

29:42: like art things. Um

29:47: and say it's been it's kind of getting

29:49: better like having fun like with some

29:51: things like linking and particle systems

29:53: like sub emitters and so on. Ooh,

29:57: this is sly pretty now. Um

30:00: two, is material stacking still squarely in

30:03: "will probably break with the new render

30:05: stack," or has the problem's likelihood, or

30:07: how much it will still probably break if it's

30:09: left unhandled, been pushed into the "we need to find a

30:10: long-term solution" category, or is it

30:12: like, hey [ __ ], there is no viable

30:14: overall solution? Reading the things to

30:16: avoid, it wasn't clear if it was just

30:17: because it was a source planning thing or

30:19: it was generally another

30:22: kind of depends this might be actually a

30:23: good thing to bring into the discussion

30:25: for the render requirements because we

30:27: might want to consider is um we might

30:30: make it like a solid requirement since

30:32: we've been telling people like you know

30:33: this is like not necessarily a thing

30:35: that might be supported but like there

30:36: might be methods to like convert like

30:39: you know the existing behavior but what

30:41: we would need to have done is document

30:44: that behavior. Uh so we kind of like you

30:46: know properly understand like what does

30:48: material stacking actually do and how

30:50: can we emulate it through other means.

30:53: Uh because if we can potentially convert

30:56: the existing content into like you know

30:58: some more primitives um

31:02: like like for for this for example the

31:04: question would be in what ways does

31:07: material stacking behave differently

31:10: compared to just you know stacking just

31:12: the meshes. Um if you document that and

31:15: you know we can sort of isolate like

31:17: this is the behavior of this uh we can

31:19: say you know these behaviors will be

31:20: supported these behaviors will not be

31:22: supported. Um, so

31:26: I would recommend like you know like um

31:30: posting in the discussion channel and

31:31: maybe seeing like can we got you know

31:33: some people to look at it like like um

31:36: because that could kind of help you know

31:38: with those considerations.

31:40: Uh for three, depending on, with mesh

31:42: stacking, would a wrapper mesh be more

31:44: performant than duplicating the mesh in any

31:45: scenario, for example a skinned mesh renderer with a

31:47: ton of blend shapes, think body of an

31:48: avatar?

31:50: Uh potentially could be like this. This

31:52: is the thing like with the material

31:53: stacking maybe like um we can reuse you

31:56: know that like because like if it does

31:57: transformations. Oh am I going to play

32:01: if uh if it's like you know doing a lot

32:05: of computations for the mesh then like

32:07: yes it can it could like help to just

32:09: reuse the data and render it again. Uh

32:12: which is another kind of consideration

32:13: for that. Um yeah, I would recommend

32:17: like bring it into the discussion so we

32:18: can kind of like you know isolate and

32:20: once the biggest part for me right now

32:23: would be like you know just kind of

32:24: understanding actually what are the

32:26: differences of material stacking but

32:28: like just from not pure not performance

32:31: parts but purely from like how does it

32:33: look, how does it behave? Uh what are

32:36: the differences of like you know

32:37: material stacking

32:39: versus just like you know uh duplicating

32:42: the mesh ignoring you know like assuming

32:44: like all the mesh like stuff is kind of

32:46: the same properties like you know

32:47: ignoring the performance impact because

32:48: once we can understand that um we can

32:52: kind of better like you know reason

32:53: about how can we support this and how

32:55: can we convert this and maybe keep it

32:57: like you know performant.

33:03: Next questions from Ozie.

33:07: Uh Ozie is asking, with Resonite Link

33:09: being a way to expose and manipulate the data

33:11: model externally, is there concern that

33:13: future data model rework could change

33:14: API mess with our established libraries

33:16: in the future? No, that's actually the

33:19: part the big part of the data model. It

33:20: creates an abstraction layer. Um and it

33:23: kind of provides like certain guarantees

33:25: like you know on how the data model

33:26: operates and the data model rework is um

33:30: meant to like change how it works

33:31: internally but not change like you know

33:33: sort of the guarantees the data model

33:35: gives. So and that includes you know

33:38: because Resonite Link is built on those kind of

33:40: interactions that needs to be preserved

33:42: because if we broke those behaviors of

33:45: the data model um it would also break

33:47: the content because that also depends on

33:49: it like everything in there kind of

33:50: depends you know on the sort of

33:52: guarantees provided by the data model.

33:53: So the rework will preserve those kind

33:56: of behaviors but kind of change how they

33:58: work under the hood. Um which is a bit

34:01: like similar you know how like when we

34:02: rework other systems like when we

34:04: reworked the particle system you know

34:06: completely changes how it works under

34:07: the hood uh most things you know just

34:09: kind of it gets converted it works the

34:12: same uh and we've done that like for a

34:14: number of other systems. So um at least

34:17: unless like you know something is you

34:19: like using some sort of implementation

34:22: quirks like graph hacking for example um

34:24: that's something you know we don't

34:26: guarantee that will be preserved with

34:27: the rework um but for Resonite Link

34:30: specifically you know we avoid like

34:32: using anything that we know is hacky

34:34: which means like you know there's not

34:36: really a risk it'll break because it's

34:37: using the behaviors that we are that are

34:40: designed to be preserved long term.

34:46: Uh next questions from Colad. Um Colin

34:51: cat is asking uh how are driven lists

34:53: working in the data model and what like

34:55: what special variable collector is

34:56: doing? It shows all field items uh in

34:59: the list is being driven. Does it mean

35:01: their values are localized? How about the IDs of

35:03: those fields since a lot of items in the

35:05: list can vary between users? Are those

35:06: still synchronized to each user? Do each user's

35:08: fields in the list have distinct IDs or

35:10: are they somehow synchronized?

35:12: Um actually I have to look this up but

35:15: like I'm pretty sure like they they

35:17: shouldn't get synchronized. Uh if the

35:19: list it's always driven like is not

35:21: being synchronized.

35:23: um at least until like you know it

35:25: becomes undriven, at which point like

35:27: it does a full sync. But I would have to

35:29: kind of actually check.

35:32: Um

35:34: but yeah, they should each have their

35:36: like their own like ids are already sort

35:39: of localized to the user like you know

35:40: when you do allocation. So each user is

35:43: going to have like unique ids. Um

35:47: so but like those items are not going to

35:49: like you know exist for other users. So

35:53: yeah, we have to kind of look at the

35:54: details, but uh

35:58: I think like it it's uh like generally

36:02: driving something avoids like you know

36:03: the synchronization.

36:07: Uh next questions from Zenuru got a

36:10: technical question for you. What led you

36:12: to choose the Cloudtoid Interprocess

36:14: package over other crossplatform IPC

36:16: libraries? I started using it myself, but

36:17: the lack of any async methods could almost

36:19: make it feel inefficient to be

36:20: constantly polling in a while-true loop.

36:23: Um, so there's two parts of it. One, um,

36:26: you don't actually need to be constantly

36:28: polling in a while-true loop. Uh,

36:30: that's like it uses something called

36:31: like busy spinning. So that's the while-true

36:34: loop where it'll it'll like spin

36:36: and check like a few times, but if it

36:38: like doesn't get anything, then actually

36:40: waits on a semaphore. Uh and the

36:42: semaphore is like using like you know

36:44: one that's global between different

36:46: processes. So like if it doesn't get

36:47: anything for a few spins it'll sleep for

36:50: a bit and then like you know it'll wait

36:52: on the semaphore. So when the other end

36:55: like you know sends some data it also

36:56: triggers a semaphore and it wakes that

36:58: up and that actually is you know is what

37:01: makes it efficient. Um and Cyro actually

37:04: worked like on making this also work

37:05: like you know on Linux like across like

37:07: you know the Windows and Linux like I

37:09: don't know if you do you want to talk

37:10: about that part a bit. Uh yeah um so the

37:16: way that you can like so firstly I do

37:19: want to expand a little bit on like the

37:21: way that you can wait for a message on

37:23: the other side like you can you can

37:24: literally just like shove the uh like

37:28: read message like function. I forget

37:31: what it's called. um in like its own

37:34: thread and like Fuk said the semaphore

37:37: will actually like like it'll actually

37:41: halt the execution of the thread until

37:43: it receives more data. So you don't have

37:44: to like do the busy spinning yourself.

37:46: It already handles that super

37:48: efficiently. Um,

37:52: in terms of getting it to work on Linux,

37:54: that was definitely a little fun

37:56: because,

37:58: um, I had to make like a like a shim

38:02: library to let the wine process actually

38:06: call, uh,

38:09: and like get like data from native uh,

38:14: POSIX semaphores. uh because one it has to

38:18: implement a lot of these sync primitives

38:21: uh itself because Windows has very

38:24: specific behavior that it needs to mimic

38:26: for some of these primitives. Um but the

38:28: semaphores those work pretty much the

38:31: same honestly. Um, and so just letting

38:35: the Wine process call into the native

38:37: POSIX semaphores.

38:38: Uh,

38:40: was a kind of a

38:44: sorry, I'm I'm like losing my thought

38:46: here. Uh, it was definitely a little bit

38:49: of an adventure uh because trying to

38:52: make a a Windows process talk to a Linux

38:55: process and actually see all of like the

38:58: stuff. The shared memory actually worked

39:00: perfectly fine. It's just the semaphores

39:01: that needed to to work. And I uh I think

39:05: one of our community members, I can't

39:07: remember your name, I'm very sorry, um

39:11: made like a prototype like Rust library

39:14: uh for it that I referenced. Um

39:18: and uh yeah we have a very unique

39:20: architecture now where uh the like we

39:23: have shared memory and semaphore

39:25: communication between two entirely

39:27: different operating system architectures

39:29: on the same machine which is really cool

39:32: to think about. I don't

39:34: >> I can't think of any use like the like

39:38: >> yeah like very like unique I think to

39:40: use like the the like wine like thing

39:42: like where you can actually like expose

39:44: some like Linux like essentially Linux

39:47: like kernel like resources like to the

39:49: Windows applications.

39:51: >> Yeah. So like that was also fun because

39:53: I had to write this as a like a wine

39:57: like quote unquote native library

40:00: because wine will implement some of the

40:02: Windows libraries itself in POSIX code

40:04: directly um rather than relying on the

40:07: emulation or it's not an emulator. It's

40:09: not an emulator. I'm sorry. I'm sorry.

40:10: It's not an emulator. Sorry. You're

40:13: going to be expelled from the Linux

40:15: community now.

40:16: >> Yeah. Um

40:19: but yeah. No, like it's it's very it's

40:22: very cursed to reference both the POSIX uh

40:25: standard libraries and uh the Windows

40:28: libraries in the same project and make

40:31: like Windows function signatures with

40:33: like Linux code inside. It's very scary,

40:36: but it works and it works perfectly.

40:41: Yeah, but in general it's actually more

40:44: efficient than you think. Like that's

40:46: like the the while through loop is only

40:49: like because the assumption is sometimes

40:51: like you know the message will be being

40:53: pushed fast because waiting on a like

40:56: you know on like a semaphore um while

40:58: it's efficient is not as efficient as

41:00: not doing it in the first place because

41:02: it requires you know um transitioning

41:04: control from from the like you know

41:06: program like back to the operating

41:07: system thread and then like it needs to

41:09: wake it up and that will consume like a

41:10: fair amount of like CPU cycles. Not like

41:12: a huge one but like a lot if you do it a

41:14: lot. So if there's like messages coming,

41:16: that's why it does the business spinning

41:18: because say say like it takes um it

41:22: takes like you know uh I don't know say

41:25: 10,000 CPU cycles you know to transition

41:28: from the program like you know and wait

41:30: on the semaphore uh it might spend like

41:33: you know a thousand cycles just kind of

41:35: waiting because uh if something comes

41:37: within those thousand cycles which is

41:39: can be very likely for like something

41:40: that's like frequent communication then

41:42: it doesn't need to like you know you

41:44: don't need to take the cost of the you

41:45: know 10,000 cycles to like transition

41:48: but if nothing's coming in the time you

41:50: know then it's just going to it frees

41:51: the CPU resources uh it may be wasted

41:53: like you know thousand cycles but that's

41:55: not a huge deal um it waits on it and

41:57: then like you know it gets woken up uh

41:59: but if something comes within those you

42:01: know thousand cycles then like you saved

42:04: like even more and it's general like you

42:06: know philosophy be behind like busy

42:08: spinning uh but in terms of like what

42:11: led us to choosing this library it just

42:12: it seemed like a good fit like it was

42:14: like it had the stuff we needed. It

42:16: didn't have like a lot of bloat. We're

42:17: kind of looking for shared memory. A lot

42:18: of the IPC libraries actually didn't use

42:20: shared memory. They were like using like

42:22: you know pipes and other stuff which we

42:24: specifically didn't want to use. Um and

42:27: also like you know was open source was

42:28: the like license that we can use. So

42:31: there was a few things missing from the

42:32: library that we needed because we also

42:34: needed sort of like raw access to the

42:35: memory. So we modified the library to

42:38: kind of expose those things uh because

42:40: it has the cues but also has like you

42:42: know just general shared memory

42:44: primitives um and we just made like you

42:47: know some small modifications. So it was

42:50: um it just seemed like the best kind of

42:52: fit.
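
A simplified sketch of the "busy spin, then wait on a semaphore" pattern described above, using in-process Python primitives (the actual library works over a shared-memory queue and a cross-process semaphore; the spin count and all names here are made-up assumptions, and a single consumer is assumed):

```python
# Illustrative sketch of the "busy spin, then wait" pattern described above,
# using in-process Python primitives. The real setup uses a shared-memory
# queue and a cross-process semaphore; the spin count and all names here are
# made-up assumptions, and a single consumer is assumed.
import collections
import threading

queue = collections.deque()
available = threading.Semaphore(0)   # signalled once per enqueued message

def send(msg):
    queue.append(msg)
    available.release()              # wakes a sleeping receiver, if any

def receive(spin_iterations=1000):
    # Spin briefly first: if messages arrive frequently, this avoids the
    # comparatively expensive sleep/wake round trip through the OS scheduler.
    for _ in range(spin_iterations):
        if queue:
            available.acquire()      # consume the matching signal
            return queue.popleft()
    available.acquire()              # nothing yet: block until a sender signals
    return queue.popleft()

send("hello")
print(receive())
```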

42:57: Uh next question is from Gold Racing. Uh

43:03: let's see uh goldun is asking

43:07: uh what is the technical reason why

43:08: perflex data relays not impulse relays

43:10: incur performance penalty. Um I mean the

43:14: main reason is like it takes a little

43:16: bit of extra implementation to make it

43:18: not incur performance penalty and that

43:20: extra effort hasn't been put in yet. Um

43:24: that's pretty much it. like it's

43:25: possible like um when it's like you know

43:27: sort of building like the internal

43:30: structures where it sort of like will

43:32: consider a sort of pass through and will

43:33: sort of like you know try to eliminate

43:35: it but like it needs some a bit of

43:36: special code to be able to handle that

43:38: and handle it in all scenarios that it

43:40: can exist in and that code hasn't been

43:42: like implemented yet. Um,

43:47: that's pretty much that's pretty much

43:48: it.

43:51: And next question. Well, the last

43:53: question from Discord is also from

43:54: Oussie. Um, let's move this here. I'm

43:59: going to put this here. Um, with Resonite Link

44:03: saying that it will be used to help with

44:04: future SDK. Will it officially be done

44:06: by team members or will it be more

44:08: focused on being a community project?

44:10: Second question that I'm splitting. Oh um

44:16: I'll do the second question after. So um

44:19: with the Unity SDK it's going to be an

44:22: official project but we're also going to

44:26: make it open source the same way like

44:27: Resonite Link is so like you know all the

44:29: code it's going to be built on top of

44:30: Resonite Link and all the code for the uh Unity

44:32: SDK is also going to be open. Um our

44:35: goal is essentially to sort of provide

44:37: like you know the baseline where this is

44:39: like you know the unity SDK here's a

44:40: bunch of mechanisms to convert and

44:43: there's a mechanism to sync like and

44:45: push stuff to Resonite Link

44:47: uh but

44:49: it being open it will both allow you

44:51: know community to like improve it you

44:53: know build upon it and also add like

44:55: additional converters because one of the

44:56: things the Unity SDK will be doing is

44:59: taking Unity components like you know

45:01: for example unism and like you know its

45:03: collider system and like you know

45:05: particle system and have some logic that

45:07: converts it to the Resonite primitives

45:10: uh or components. Um and that's going to

45:14: be like designed to be modular where you

45:16: know you can likely write like this is

45:18: how this is converted. Um
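
As a purely illustrative sketch of that modular converter idea (the real SDK would be C# inside Unity; every name below is hypothetical), each source component type registers a function that emits the corresponding Resonite-side data, so projects can plug in their own converters:

```python
# Purely illustrative sketch of the "modular converter" idea described above
# (the real SDK would be C# inside Unity; all names here are hypothetical).
# Each Unity-side component type gets registered with a function that emits
# the corresponding Resonite-side data, so projects can plug in their own.
CONVERTERS = {}

def converter(unity_type_name):
    """Register a conversion function for one source component type."""
    def register(fn):
        CONVERTERS[unity_type_name] = fn
        return fn
    return register

@converter("BoxCollider")
def convert_box_collider(component):
    return {"type": "BoxCollider", "size": component["size"]}

@converter("Light")
def convert_light(component):
    return {"type": "Light", "color": component["color"], "intensity": component["intensity"]}

def convert_all(components):
    out = []
    for c in components:
        fn = CONVERTERS.get(c["unityType"])
        if fn:                      # unknown component types are simply skipped
            out.append(fn(c))
    return out

print(convert_all([{"unityType": "Light", "color": "#FFFFFF", "intensity": 1.2}]))
```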

45:21: so

45:23: um

45:26: there's a

45:28: like we're probably going to provide you

45:30: know converters like for like a lot of common

45:32: Unity components but you might also want

45:34: to you know convert components specific

45:36: to your project or maybe components

45:37: specific to other SDKs if you have like

45:39: you know content in those um and it

45:42: being community based it'll allow you

45:44: know people to contribute more

45:46: conversions and improve the conversions

45:48: like you like if there's for example

45:50: some edge cases or situations that are

45:52: not being converted well, people will be able

45:55: to like you know help us like improve

45:57: this or maybe you can you know make a

45:59: fork make it work with like your

46:00: particular project you know maybe a

46:02: bunch of specific converters that are

46:03: only like you know special to your

46:04: project so you'll be able to build upon

46:06: it. So it's going to be both where we

46:08: sort of want to provide you know the

46:10: like the fundamentals the foundation for

46:12: it but then also like allow uh everyone

46:14: in the community to sort of like expand

46:16: it like you know use it for their own

46:17: thing like do pretty much like whatever

46:19: we want with it and if some

46:21: contributions are like general that

46:22: would benefit other users we can like

46:24: you know then merge those in uh for the

46:27: second question um what is the goal of

46:30: the future SDK is it aimed to be a way

46:32: to develop things forite inside of unit

46:34: editor limitations so let ation is

46:37: pretty much going to be like you know

46:38: whatever tooling is there. Um with

46:41: Resonite Link you can access pretty much

46:43: the entire like not right now there's

46:44: still some pieces missing but it's

46:46: designed so you can like do work with the

46:47: data model. So anything you can

46:49: build in Resonite technically you can

46:51: build through Resonite Link; the bigger

46:53: question is going to be you know how

46:55: much tooling is implemented inside of

46:56: Unity. So for example, you could

46:59: with Resonite Link, with the Unity SDK, you

47:01: could build ProtoFlux um by just, you

47:05: know, setting up the components and

47:06: setting up the references, but you're

47:08: not going to have visual editor. We're

47:09: very unlikely to actually implement

47:11: visual editor for ProtoFlux for Unity,

47:15: but it being open opens the door. If

47:18: anybody in the community wants to do it,

47:19: you know, they'll they'll be free to do

47:21: so. Um so the limitations are going to

47:23: be like you know more along these lines

47:26: you know and how well the tooling is. Uh

47:29: it's also not going to do you know when

47:31: you're building stuff in Resonite you

47:32: get that sort of like immediate feedback

47:34: but for some things you will need to

47:37: like you know you will need to send it

47:39: over to Resonite

47:41: um just to like be able to see like you

47:43: know it actually behave because anything

47:45: that exists on the Unity SDK side is not

47:47: going to have like you know it's not

47:49: going to be able to actually run the

47:51: behaviors even something you know say

47:52: like like a spinner component you put a

47:54: sp you could with like with the Unity SDK

47:57: you could put a spinner component

47:59: but you won't be able to like, you know,

48:00: to kind of tweak it and see it actually

48:02: spin. Um, because it doesn't have any

48:04: actual code for the behavior. It's sort

48:06: of like just defining the structure. Um,

48:09: but again, if somebody wants to

48:10: contribute and like add a little bit of

48:12: code that like, you know, lets you

48:13: preview that. Um, that's a possibility.

48:17: Um it might like you know might be

48:20: complications where like it might not be

48:22: fully in sync with how the component

48:24: behaves in Resonite, you know, where

48:25: it's like actual diff like you know um

48:28: because that depends a lot of data model

48:29: stuff and other things so that might

48:30: make things a little bit more

48:31: complicated but in general if people

48:33: want to like use the Unity SDK to build

48:36: stuff they'll be able to um you can use

48:39: it to build new projects we generally

48:41: would recommend people like to build

48:43: stuff in Resonite. Um

48:47: but um

48:50: it um like you know we're not going to

48:52: like stop people from like using it if

48:54: they really want to build things in

48:56: Unity. Um this is a method to do it. And

48:59: one thing I think that will actually

49:00: also help is uh the way uh we plan to

49:03: implement the Unity SDK is, and this is

49:06: one of the reasons why why Resonite Link

49:08: you can just you know enable it, you

49:09: know, in a Resonite session um is that you

49:14: enable like you know Resonite Link and

49:15: then you build your stuff in unity you

49:18: send it over to resonite and then like

49:20: you can like you can literally have like

49:22: Resonite running and you can have Unity

49:23: running side by side and then you just

49:26: send your stuff from Unity into Resonite,

49:28: see how it behaves, you know,

49:30: and then like modify it in Unity and

49:33: sync it back into like, you know,

49:34: Resonite and it's going to, you know,

49:35: update whatever thing is in there,

49:37: whatever is already like sent, because

49:39: it's going to remember, you know, stuff

49:40: that's in there. Um,

49:43: so,

49:46: uh,

49:49: it's, um, what's the word

49:53: that's potentially, you know, a workflow

49:55: that's opened up with this. Um but

49:58: also like the one of the main goals for

50:01: the Unity SDK is also to ease like you

50:03: know ease sort of entry to Resonite for

50:07: a lot of existing users if they have

50:08: like existing avatars existing content

50:10: for other platforms. Uh this will make

50:13: it easier to bring that over you know to

50:15: sort of give those users a head start.

50:18: But hopefully like you know once people

50:20: bring their content over like you know um

50:22: they'll start also like using the

50:23: Resonite tools to you know build upon

50:25: that content, modify it, you know, do

50:27: more with it. Um and it's kind of like

50:30: you know one of the kind of motivating

50:31: factors but in general the motivation is

50:33: like you know open up new workflows that

50:35: were not possible before. Um the other

50:38: motivation is you know making it open so

50:40: people can expand and improve those

50:41: workflows. we kind of give you know

50:43: everyone the community more power uh

50:45: this way and also like it's sort of like

50:48: way for us like you know doing more like

50:50: testing waters like making more sort of

50:53: uh open-source kind of projects for Resonite

50:56: uh because we want to do more of this

50:57: and kind of seeing you know how this

51:00: works you know sort of testing the

51:01: waters this kind of gives us you know

51:03: some valuable experience and feedback

51:06: um on how we approach things and you

51:09: know maybe open up like more parts of

51:10: the engine you know for community

51:11: contrib contributions and expansions. So

51:14: there's like multiple sort of

51:15: motivations and goals and I feel like

51:17: the unit SDK and those are like are good

51:19: ones because they're very suited for

51:21: this and they're very you know um

51:24: there's not too much risk of like things

51:27: you know exploding because like if we

51:29: for example opened like you know say

51:30: some parts of the engine that are more

51:32: um like uh sensitive to compatibility

51:36: issues we might have like you know some

51:38: contributions and people contribute

51:39: something and suddenly lots of content

51:40: breaks. Um, with Resonite Link, you know, it's

51:43: not like modifying anything in how

51:45: things work in there; it's just allowing it to

51:47: send stuff over. Um so there's not like

51:50: you know risk of that happening and that

51:52: kind of makes it it's like a bit safer

51:54: way to kind of you know see how this

51:56: sort of works out.

52:00: So that covers all of our questions from

52:03: um covers all our questions from uh

52:06: Discord. Uh we should be able to start

52:08: going through the Twitch ones. Let's see

52:10: how many we actually have. Uh because we

52:12: also want to we want to showcase some of

52:16: the stuff uh from social media. I still

52:18: want to get like the embeds so we can

52:21: kind of do it more easily. But um uh

52:24: let's do a little bit of community

52:25: showcase. So uh, hopefully these are going to work,

52:29: though.

52:30: It's Bluesky, so it might work. Uh

52:34: oh, we actually got like one like

52:35: straight from one of our team. Uh Rusty

52:38: Bot got uh Bigscreen eye tracking.

52:41: Let's see if this works.

52:46: I guess that's the that's the question.

52:48: I know his YouTube videos have been like

52:49: having issues, but I think Bluesky

52:51: should be fine. Maybe. Is it loading?

52:56: >> Yeah, it seems to be loaded for me.

52:58: >> Is it loaded for me? Or is it? Hold on.

53:01: >> Oh,

53:03: maybe something's borked.

53:07: Does it use yt-dlp? Because

53:09: yt-dlp can download Bluesky and

53:12: like Twitter videos. So I wonder if

53:14: maybe

53:15: >> I mean this work before

53:17: like we've already done like you know

53:19: this thing. Um

53:21: so this is the

53:24: tweet like uh they got like his eye

53:26: tracking

53:28: but unfortunately the video's not

53:29: loading for me. I don't know why.

53:37: I know YouTube has been having some

53:38: issues, but sometimes YouTube has like

53:40: some problems.

53:47: Yeah, if the video doesn't load, we

53:48: might need to skip this for today and

53:50: like look into like why Bluesky is

53:51: being painful,

53:54: which sucks because there's like some

53:55: good videos um they wanted to share.

53:58: Yeah, this is not loading.

54:00: Let me see if this one works. Uh, I'm

54:03: going to try one more. If this doesn't

54:04: work, we'll defer this for next time and

54:08: hopefully get like the embeds too for

54:10: for that, because I want to, like,

54:12: just be able to bring in the Bluesky embed and

54:14: have like everything in one message so

54:15: it's kind of easy to share. Um,

54:22: let's see if this loads.

54:24: Nope. It's

54:27: Nope. It's being It's being

54:30: uncooperative.

54:42: Yeah. No.

54:42: >> Surprisingly surprisingly like these

54:45: Wait, is this actually Yeah,

54:47: surprisingly these are loading for me.

54:49: >> Yeah,

54:51: >> I'm the one streaming. Yeah, I'll have

54:54: to look into this. I'm sorry.

54:57: Um yeah, it's not being cooperative

55:00: right now. Uh first with you know like

55:02: we kind of want to do like more kind of

55:04: community showcase where we like you

55:05: know go through like some of the post

55:06: and showcase you know some cool stuff

55:08: that people have been doing over there

55:09: tonight. Um

55:13: so we'll we'll we'll see. I have to poke

55:15: around and see why it's not loading for

55:16: me and hope hopefully next week like it

55:19: will be working so we can kind of

55:20: showcase some really cool community

55:22: stuff. So,

55:24: >> with that, uh, we're going to

55:28: we're going to start going through the

55:30: Twitch questions. Uh, and we got a first

55:32: one from from Grand UK. Granuk is asking

55:36: uh, Schnop Schnopent.

55:41: I I don't even know how to pronounce

55:42: this. I don't

55:44: >> Is this is Schnop like Tongs?

55:48: Well, it's it it's

55:52: like what doesn't make us explode. I

55:54: Yeah, like it's just the inverse. It's

55:55: like stuff that just um doesn't make us

55:58: disintegrate.

56:00: >> Well, I I don't want to talk about the

56:02: good stuff and then the bad stuff. I

56:03: want to talk about the bad stuff and

56:04: then the good stuff. So, just leave it

56:06: on a good note. No,

56:07: >> this is this is this is different. This

56:09: is like what doesn't make you

56:14: disintegrate. Like it doesn't

56:15: necessarily make you integrate, but it

56:16: just doesn't make you disintegrate.

56:19: >> I don't know. Uh

56:21: >> apples.

56:23: >> I don't know. Existing?

56:26: >> This is a good one.

56:30: Um and then we got Tyunks from a on

56:34: Twitch. Um what doesn't make what

56:36: doesn't make us integrate but also like

56:37: it doesn't like you know silly I don't

56:39: know breathing air.

56:41: >> It's just it's just there. It just it's

56:43: a thing you just do.

56:44: These are very neutral. Uh

56:48: >> these really have the same answer.

56:50: >> Yeah, this is like just ne neutral. It

56:52: doesn't it doesn't affect us in

56:55: particular way.

56:58: >> I don't know.

57:00: I don't know.

57:01: >> I don't know.

57:04: >> I don't know, man. I just work here.

57:08: >> We got a question. I don't

57:12: so I kind of want to add like sort of

57:13: like a motion log or something so I

57:15: don't keep like doing this. Uh check the

57:17: fox author is asking why is JSON parsing

57:19: in C# so weird? I'm kind of starting to

57:22: implement derived type decoding in Python. I

57:23: think I have an approach that works now

57:25: too. I don't know. I don't like I don't

57:27: think it's weird.

57:29: >> I'm not sure what's weird about it. Like

57:31: I've not had an issue doing this in uh

57:35: Lua or even the wacky like um one of the

57:38: wacky programming languages in Gmod like

57:41: expression 2. So like I'm not sure where

57:45: you're getting hung up. It just like you

57:47: you get the message and you see its type

57:49: and then you just handle it based on its

57:51: type. I'm not sure. Uh

57:53: >> why it's weird. I think it's like the

57:56: derived types but like because like C#

57:59: like it's kind of made like you know

58:00: because C# is very strongly typed. Um,

58:03: so like for the messages we use like we

58:05: actually specify these are the derived

58:06: types and then like when it parses it,

58:08: it actually instantiates the right class

58:10: which means it needs to actually read it

58:11: and then decide which one it does and

58:14: sort of the um, like you know,

58:18: System.Text.Json is like you know has

58:20: this mechanism

58:22: but like I figured like Python would be

58:24: easier because Python is a lot more kind

58:26: of dynamically typed than strongly

58:28: typed. So like I don't know. I don't

58:31: work with Python much so like it's hard

58:33: for me like to say but um if there's if

58:37: there's like

58:39: I there's like anything specific that

58:41: could potentially help like feel free to

58:43: make issues. Resonite Link is still like in a

58:46: beta state which means like you know

58:47: we're okay like if something seems like a

58:49: generally good change that would kind of

58:51: help the like implementation.

58:54: We're still open to making it even

58:56: though like you know it will require

58:57: everyone to kind of rework the

58:59: integrations. Um yeah I would recommend

59:02: like maybe making an issue like now

59:05: because if if you like wait too long and

59:07: we kind of like move it from beta then

59:08: like we'll be like a lot less likely to

59:10: actually make that change. But also, I

59:12: don't know what

59:14: changes we could make to like, you know,

59:16: make it easier for Python.

59:19: >> Yeah, I'm not really

59:21: It would be good to see if this is like

59:23: a like an issue like that is just

59:27: exacerbated by Python not being well

59:29: suited for it or if it's just like

59:31: something that like you're personally

59:33: confused by cuz like I personally had no

59:35: issues um like interpreting like the

59:38: messages. I mean, I just I just like I

59:41: basically just handle the messages in my

59:44: stuff based on what type they come in

59:46: as, and that's pretty much it. Like, it

59:48: doesn't seem weird to me.

59:50: >> It's kind of funny because I was kind of

59:52: expecting more issues with like, you

59:53: know, strongly typed languages because

59:54: then the JSON serialization needs to

59:57: be able to like determine the type you

01:00:00: get, you know, during the parsing

01:00:03: process. But like I figured like you

01:00:05: know with like dynamic languages you can

01:00:07: just you know you get the data and then

01:00:08: you like oh look like oh this type so

01:00:11: you do stuff based on that. Um but like

01:00:15: we would need like more kind of context

01:00:17: for this

01:00:21: and also thank you again Kobe for the

01:00:23: subscription. Um uh this question in

01:00:26: chat by the way um since it's a question

01:00:29: in chat like um we don't use Newtonsoft.Json

01:00:32: for it. We use System.Text.Json

01:00:34: for Resonite Link.

01:00:36: >> Yeah, Newtonsoft is kind of crusty now.
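
For illustration, here is a minimal Python sketch of the dispatch-by-type approach described above. The "$type" discriminator and the message names are assumptions for the example only, not the actual Resonite Link schema:

```python
import json

# Dispatch polymorphic JSON messages based on a type discriminator field.
# "$type", "AddSlot" and "SetField" are hypothetical names for illustration.

def handle_add_slot(msg):
    print("add slot:", msg.get("name"))

def handle_set_field(msg):
    print("set field:", msg.get("path"), "=", msg.get("value"))

HANDLERS = {
    "AddSlot": handle_add_slot,
    "SetField": handle_set_field,
}

def dispatch(raw: str):
    msg = json.loads(raw)
    kind = msg.get("$type")
    handler = HANDLERS.get(kind)
    if handler is None:
        raise ValueError(f"unknown message type: {kind!r}")
    handler(msg)

dispatch('{"$type": "SetField", "path": "Slot/Position", "value": [0, 1, 0]}')
```

This is roughly the hand-rolled equivalent of what the C# side gets from registering derived types with System.Text.Json.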

01:00:41: >> so uh Dionos is asking have you heard

01:00:43: anything about the Steam Frame dev kit?

01:00:46: Um,

01:00:47: no.

01:00:49: Well, sort of actually. Like I got asked

01:00:51: uh, like, by, like, uh, one

01:00:54: of the editors at UploadVR,

01:00:56: um, if we got one, and like we said like

01:00:58: no, and like he pinged one of the Valve

01:01:01: people. So maybe we'll see if that does

01:01:04: anything

01:01:06: to like have one at the very least, you

01:01:07: know, to make sure things are compatible

01:01:09: because um probably need like to make

01:01:12: some adjustments for the controllers and

01:01:13: so on. But we do want to like you know

01:01:15: like uh see how we can get it to run

01:01:17: natively because we have like you know

01:01:19: native support for ARM for the headless

01:01:21: and we could have like interest like

01:01:22: interesting architecture where like you

01:01:24: know the main process runs natively as

01:01:26: ARM um because the headless will run on

01:01:29: the Raspberry Pi um thanks to like J4's

01:01:32: work um and some people from the

01:01:33: community too um and we want to see like

01:01:36: can we get like you know the main engine

01:01:38: running natively and then like you know

01:01:40: only have the renderer running under like you know

01:01:41: FEX and Proton.

01:01:45: But uh we'll we'll we'll we'll see.

01:01:52: Uh Grand K is asking, "Have you seen

01:01:54: Hytale?" And its almost in-game

01:01:56: modding abilities. I actually haven't.

01:01:58: I've heard like a little bit, but like I

01:02:00: haven't heard much.

01:02:03: >> Seems kind of neat.

01:02:05: Raid was talking to me about it a little

01:02:07: bit. Uh it seems like Hytale kind of

01:02:11: seems like if Minecraft was developed

01:02:14: with like modders as like a first-class

01:02:16: focus heading the development. Like, the

01:02:19: day one it already had

01:02:21: >> actually was it day one? I don't know.

01:02:23: Within the first like day or two of it

01:02:25: releasing it already had 240 mods for

01:02:27: it.

01:02:29: >> That's quite a bit. It was actually like

01:02:31: one of the things that inspired like FrooxEngine

01:02:34: like, like, I was like, because I

01:02:35: really like Minecraft and like like how

01:02:37: many mods there are for Minecraft and I

01:02:39: was like what if there was an engine you

01:02:41: can you know sort of build within and

01:02:42: have like a lot of control over it. Um,

01:02:45: so that's kind of interesting, but I

01:02:46: don't know any specifics for Hytale,

01:02:48: but it isn't really one of the

01:02:50: inspirations for Resonite itself. But

01:02:52: also Resonite was like designed to be

01:02:53: more of a general engine rather than like,

01:02:56: you know, a specific type of game. Uh, and

01:02:59: from what I've seen, from what I've seen

01:03:00: like from Hytale, it's still kind of

01:03:02: like, you know, kind of like a voxel

01:03:03: based kind of game.

01:03:06: the uh the TL;DR that I've been told

01:03:09: occasionally is that like if you like if

01:03:12: you like Terraria and Minecraft, it's

01:03:15: kind of it's kind of almost like a

01:03:16: fusion of the two. Oh,

01:03:19: >> interesting. Like gameplay wise or

01:03:22: >> Yeah. Like it has some elements from

01:03:23: Terraria where like you don't have a

01:03:25: crafting grid, but instead it will use

01:03:26: like the resources from the chests

01:03:28: around you when you're crafting and like

01:03:29: without having to make a shape. Um, so

01:03:32: it like doesn't it has more like a

01:03:34: Terraria crafting system.

01:03:36: >> Yeah,

01:03:38: that sounds interesting. I'm going to

01:03:40: play it sometime maybe, but we just

01:03:43: bucks.

01:03:43: >> We just we just we just got the um

01:03:46: Vintage Story and like haven't started

01:03:49: it yet. Wanted to play that one.

01:03:51: >> So many games.

01:03:54: Actually, I I um I recently played

01:03:57: through like Alan Wake 2 and it's made by

01:03:59: uh, Remedy, like a Finnish studio,

01:04:02: and one thing I found like funny in the

01:04:04: game because there's like so many

01:04:05: Finnish references in that um there's

01:04:07: like a poster in the game and it

01:04:09: mentioned glögi and I was like, it's the

01:04:11: thing!

01:04:12: >> Glögi makes sense.

01:04:14: >> Lucky.

01:04:19: >> Uh next question is from Yummy Deol. Uh

01:04:22: what are some things you wish you would

01:04:23: have known about conventions before

01:04:25: attending one? Um so there's actually a

01:04:28: thing like uh one of people from our

01:04:30: community told me at my first

01:04:31: convention. Um

01:04:34: when I attended one like I didn't kind

01:04:35: of know like what to do and like um

01:04:41: pretty much like like I remember the

01:04:42: first time I was like attending

01:04:43: convention I was kind of confused

01:04:44: because it just felt like we're just

01:04:46: going to constantly moving from one

01:04:47: place to another just like visiting

01:04:49: somebody in the room and felt like

01:04:50: nothing's really happening and I was

01:04:53: kind of getting like a little bit like

01:04:54: sad about that because it felt like you

01:04:55: know the days are kind of ticking away

01:04:56: and there's like not really much going

01:04:58: on. Um

01:05:01: and one of people in the community like

01:05:03: uh Mark like he actually told me like

01:05:05: it's kind of similarish how his first

01:05:06: convention was that like it was kind of

01:05:08: like you know, not knowing what to expect, and he

01:05:11: told me that like like conventions are

01:05:13: kind of like um it's kind of like an RPG

01:05:16: game like this kind of like more

01:05:17: open-ended where you can you know you

01:05:20: can have like and it's kind of like

01:05:22: conventions are a lot about like what you

01:05:24: make of it yourself

01:05:26: um and it's like you know you have like

01:05:28: sort of um main quests which are like

01:05:32: you know some of the official stuff you

01:05:33: know like the opening ceremony you know

01:05:35: like the dealers then you know like like

01:05:37: other kind of stuff and those kind of

01:05:39: the main quests and they have like you

01:05:40: know side quests which are like you know

01:05:41: room parties a lot of like side stuff

01:05:42: like you know happening in the

01:05:44: convention space um and people like you

01:05:47: know will take completely different

01:05:48: approaches for different conventions

01:05:49: like you know some people only do side

01:05:51: quests they will like go from room party

01:05:52: to room party some people go to lots of

01:05:55: panels some people kind of mix those uh

01:05:57: and something you won't be able to do

01:05:59: all of them. There's just like too much

01:06:01: stuff. You don't want to be too

01:06:02: constrained. But also it's like you know

01:06:04: make it your own thing. Like some people

01:06:06: will you know go and pester people from

01:06:08: AV and like you know maybe the AV people

01:06:09: will be like they'll let you like you

01:06:12: know press the like funny buttons. Um

01:06:14: and ever since then like you know I've

01:06:15: kind of had like more of stuff like this

01:06:17: kind of happening where I'm like okay

01:06:18: like I want to do these things and then

01:06:19: some things kind of happen. Um and like

01:06:23: you know like and you just kind of like

01:06:24: you know make it your own thing and like

01:06:26: you know find like the quests you kind

01:06:29: of want to do at the convention. But

01:06:30: also important thing is like you won't

01:06:32: be able to do everything like if you try

01:06:33: to plan too much stuff um especially

01:06:35: with furries like planning is difficult.

01:06:38: So like um you want to be like you know

01:06:40: kind of flexible and not consider that

01:06:44: everything will like you know happen

01:06:45: that you want like you know to do at a

01:06:48: convention. Um sometimes literally like

01:06:50: you can't because there will be like you

01:06:51: know two panels scheduled at the same

01:06:53: time that you want to attend. Um but

01:06:56: yeah it's it's it's kind of you know

01:06:57: this kind of like mindset like the

01:06:58: conventions kind of what you make it. It

01:07:00: also depends like how many people you

01:07:01: know there like usually I find like

01:07:02: conventions are better the more people

01:07:04: you have there. Like I went to some that

01:07:06: are like smaller just kind of like you

01:07:08: know wanted to see others but like then

01:07:10: there was a lot of like I don't know

01:07:11: what to do like like you know nothing's

01:07:14: happening. um nobody's around versus

01:07:17: like you know when it's like a bigger

01:07:18: convention there's a lot of people you

01:07:19: know then like it feels more like it's

01:07:22: the opposite it's like you know you're

01:07:23: going to hang out with these people I'm

01:07:24: going to hang out with this person but

01:07:26: like you know um a British group like

01:07:28: you know or this group is doing a thing

01:07:29: and this group wants to do a thing

01:07:33: so that kind of becomes like you know

01:07:34: the opposite where like you there can be

01:07:37: too much happening and like you won't be

01:07:39: able to like do all of it and then you

01:07:41: kind of feel bad you know because you're

01:07:42: like I want to do this thing too but

01:07:44: this is Sorry, this thing's happening.

01:07:46: So, um

01:07:48: yeah, it depends. It also depends like

01:07:50: conventions are very different like like

01:07:52: there's different types of conventions.

01:07:54: Um

01:07:57: yeah, I also do want to do you want to

01:08:01: talk about your experience too on this

01:08:03: one?

01:08:05: >> My first convention, I mean, I didn't

01:08:07: really know what to expect.

01:08:09: I uh I had never really gone out before

01:08:13: that. That was like in 2021 or so, I

01:08:16: think. Um like the last for context, the

01:08:20: last time I had really truly been like

01:08:22: even out of state at that point, like

01:08:26: not in like not just to freaking like

01:08:27: the state over cuz I live in like the

01:08:30: bottom left of my state, the bottom

01:08:31: right or whatever. Um,

01:08:35: I was in Resonite and uh my buddy uh

01:08:39: Green came up to me and was like, "Hey

01:08:41: Sarah, are you going to BLFC?" And I'm

01:08:42: like, "Yeah, probably not." He's like,

01:08:45: "You want help with that?" And a few

01:08:48: months later, I suddenly find myself in

01:08:49: the middle of an airport with an

01:08:50: American Airlines ticket and all of my

01:08:52: bags with me. And I'm like, "Huh? Where

01:08:54: am I? Oh." Uh, and then I proceeded to

01:08:58: have the best time of my life. Um,

01:09:02: >> it was very it was very good. I mean,

01:09:05: that goes without saying, of course. Um,

01:09:09: but it I definitely didn't know what to

01:09:11: expect. Uh, it opened my eyes and gave

01:09:14: me a lot of ambition and uh

01:09:19: like, huh, this uh this world thing

01:09:21: actually is pretty cool when when I'm

01:09:23: hanging out with people I like in it.

01:09:24: Cuz I didn't really like going out much

01:09:26: before.

01:09:27: um had a great time, you know, got used

01:09:31: to being at the convention and then uh

01:09:33: when it ended I uh

01:09:38: I was a big old crybaby and I uh ended

01:09:43: up uh

01:09:45: I like went to go like hug uh Ver at the

01:09:48: end and I just like sobbed into his

01:09:50: shoulder cuz I had such a great time.

01:09:52: Um, so yeah, also taught me it's okay to

01:09:57: cry in front of your friends.

01:10:00: >> Take a sponge and wipe them out.

01:10:03: >> Yes,

01:10:08: >> that's good. Another question is for

01:10:09: you. Uh, D uh Damos is asking, is that

01:10:13: your confisair?

01:10:15: >> Uh, yes. Thank you for asking.

01:10:18: It supports your

01:10:21: uh diaphragm.

01:10:23: >> Yes, it uh it I'm very tired, so I got

01:10:28: to have some way to sit up.

01:10:32: Next question is from computer user. Um

01:10:36: uh Nakon said this on Discord, but a

01:10:38: little too late. Got there's an answer.

01:10:40: I had to mult another day. It was

01:10:41: obvious inappropriate. It's part of why

01:10:43: the problem they've been trying to be

01:10:44: part of solution for. What can we do to

01:10:46: help stop the spread of rumors about

01:10:48: what is causing bugs? How can we help

01:10:50: when people are complaining about things

01:10:52: that uh either aren't happening or that

01:10:54: aren't your fault at Discord?

01:10:57: So, there's like a few things. Uh I feel

01:10:59: like generally one thing that helps is

01:11:02: like learning to direct people to

01:11:04: primary sources. So like if there's like

01:11:06: rumors like you know often times it can

01:11:08: be like you know where's this

01:11:09: information has like you know if people

01:11:11: are saying like this is this this this

01:11:13: and this is a thing be like okay is

01:11:15: there like a video clip of us you know

01:11:16: saying this is a thing is there like a

01:11:18: thing on the wiki is there like you know

01:11:19: is there something solid we can point to

01:11:22: that sort of you know backs that rumor.

01:11:26: Um if it's not if there's not uh often

01:11:29: times a good approach is you know bring

01:11:31: it you know ask us um bring it to like a

01:11:33: resonance you know bring it to like

01:11:35: office hours like anyone on the team so

01:11:37: we can actually kind of you know talk

01:11:38: about it. Um that's usually kind of you

01:11:41: know the best way sort of like know

01:11:42: directed energy into the proper

01:11:45: channels. Um,

01:11:49: so for BS it can also, you know, depend

01:11:52: like like because often times like it

01:11:53: feels there's like so many like side

01:11:55: channels and just stuff and there's like

01:11:56: rumors and like things kind of like you

01:11:58: know like spinning on those side

01:11:59: channels and it never touches any of the

01:12:01: official channels uh and just kind of

01:12:04: like you know spins and snowballs and so

01:12:06: on and like we know we don't even

01:12:08: sometimes we don't even like know that

01:12:09: it's kind of happening or we just kind

01:12:11: of hear things but we don't have

01:12:12: anything solid. Um I feel like just in

01:12:16: general it's good to always

01:12:19: like you know you have like uh whenever

01:12:20: you hear something have like some amount

01:12:22: of like skepticism towards it be like

01:12:24: you know like what

01:12:27: you know like where's this information

01:12:29: coming from like you know what is the

01:12:30: background why are people like you know

01:12:32: saying this is this actually the case

01:12:33: because I feel like people don't do that

01:12:36: enough um where

01:12:39: like you know like me kind you know like

01:12:41: I have like both like some science

01:12:42: background and like engineering

01:12:44: background like you know I I I have to

01:12:45: kind of do that a lot like every time

01:12:47: I'm presented with information like

01:12:49: usually my first instinct is be like is

01:12:52: that correct what is this based on like

01:12:53: where's the information, is there like backing

01:12:55: it and it's not necessarily being like

01:12:58: you know not that information is wrong

01:13:00: but like it's more like

01:13:03: what is it like based on because like

01:13:05: once I kind of understand what it's

01:13:06: based on then like you know I can better

01:13:08: understand the information and actually

01:13:09: have some confidence that this is

01:13:10: actually happening you know this

01:13:12: actually has you know some kind of

01:13:14: effect. So that kind of thinking I feel

01:13:16: like encouraging people to do that and

01:13:18: finding the proper channels finding the

01:13:20: proper sources um that's like the best

01:13:24: way to kind of go about it. Um if people

01:13:26: like you know experiencing bugs and they

01:13:28: think you know maybe they know why a bug

01:13:30: is happening you know post about it in

01:13:33: the GitHub issue maybe provide like

01:13:34: information often times like you know we

01:13:36: will talk like you know look at it and

01:13:38: sometimes it is you know people will be

01:13:39: like this is happening because of this

01:13:41: and we'll be like oh okay and then ends

01:13:43: up being fixed or maybe people are this

01:13:44: happening because of this and it will be

01:13:46: like no this actually the system doesn't

01:13:48: you know touch this you know this

01:13:49: doesn't affect this and we can you know

01:13:51: we can clear it up but when it never

01:13:53: touches the official channels kind of

01:13:55: lets things, you know, snowball and

01:13:58: morph. It's kind of like, you know, the

01:14:01: um what's the English word for it? the

01:14:05: game of telephone or Chinese whispers or

01:14:08: like where like it's a game like you

01:14:10: have like a bunch of people

01:14:12: >> and like one person like you know you

01:14:15: you tell them something and you can have

01:14:17: like you know just like few people the

01:14:20: first person you know gets like some

01:14:22: kind of sentence and they tell the other

01:14:23: person you know they just whisper it and

01:14:25: that person whispers to the next person

01:14:27: the next person you think

01:14:30: that like you know you would get pretty

01:14:32: much the same sentence at the end. And often

01:14:34: times you get something completely

01:14:35: different because every time the person

01:14:37: whispers it, they mutate it a little bit

01:14:39: and the mutations they add up, you know,

01:14:41: very quickly and you end up like with

01:14:44: completely different unrelated, you

01:14:46: know, statement at the end. And

01:14:50: that's spec and that's specifically when

01:14:51: it's a game where your goal is to just

01:14:54: you know pass the information to the other

01:14:57: person verbatim,

01:14:59: and you know and when you're very aware

01:15:01: this is what you're supposed to be doing

01:15:03: people still kind of you know fail at

01:15:04: that you know because of kind of going

01:15:06: into biases. Um the thing is that this

01:15:10: kind of game of telephone this happens a

01:15:12: lot you know like it like because um

01:15:16: like you know somebody says something

01:15:17: somebody like misunderstands it and

01:15:19: people aren't even trying to like

01:15:20: present information verbatim the

01:15:22: information it mutates really fast. Um

01:15:26: so to prevent that from happening you

01:15:28: you want something to kind of ground it

01:15:31: and you know the official channels

01:15:32: official socials, like you know that's

01:15:34: one of the best ways to kind of do that.

01:15:37: So hopefully that kind of that kind of

01:15:39: helps.

01:15:45: Uh yummy dev tools asking uh what would

01:15:49: need to be implemented for YouTube links

01:15:50: to work again reliably? Uh will it be

01:15:53: high priority fix or do we just have to

01:15:54: live with it for a bit? Um unfortunately

01:15:57: that's not on our end like that's like

01:15:59: we are using a utility called yt-dlp,

01:16:02: uh, which essentially, like, you know, is

01:16:05: used to fetch YouTube data um and the

01:16:08: problem is like you know YouTube they

01:16:10: keep breaking things you know like kind

01:16:13: of intentionally because um and it's

01:16:16: sort of like you know a space race where

01:16:17: like you know yt-dlp, they end up

01:16:20: like doing pretty much like a really good

01:16:21: job at it, but you know YouTube will keep

01:16:23: breaking things and then yt-dlp has to

01:16:25: find new ways to get around the

01:16:27: limitations, you know, and the changes

01:16:29: they've made. So, I don't think there's

01:16:31: ever like it's ever going to be 100%

01:16:34: reliable because of that. Um,

01:16:39: like it's like there's going to be, you know,

01:16:40: phases where the YouTube videos are just

01:16:42: kind of breaking until yt-dlp finds

01:16:44: a fix for it. Um, we do now have a

01:16:47: feature where yt-dlp will update

01:16:48: itself automatically. So once they fix

01:16:50: it um on their end which you know

01:16:53: depending on what YouTube did like you

01:16:55: know like they um it can take them you

01:16:58: know some time um you'll probably like

01:17:01: you know be unreliable and then might be

01:17:03: reliable again and then like YouTube

01:17:04: will change something and be unreliable

01:17:06: again. So it's kind of like this kind of

01:17:07: like you know cycle if you really want

01:17:09: to like um if you really want to like

01:17:12: you know kind of contribute like you can

01:17:14: look into contributing to the yt-dlp

01:17:15: project you know help them you know

01:17:17: with the with the work they're doing. Um

01:17:21: there's like some things you could

01:17:22: potentially do like kind of considering

01:17:24: it could like help things a bit because

01:17:26: like we could for example make yt-dlp

01:17:27: download a video and then like you

01:17:29: know sync it to other users within a

01:17:31: session. Um so if like it works for one

01:17:34: user you know uh then they can like you

01:17:36: know then um send the data to other

01:17:38: users and like that can help get around

01:17:40: things but like there's you know side

01:17:42: effects to that. Um

01:17:45: so it is unfortunately kind of a tricky

01:17:46: issue and there's not like a perfect fix

01:17:48: for it.

01:17:51: I would actually say like you know if if

01:17:53: if you really wanted this to be reliable

01:17:55: it would require everyone switching

01:17:57: to like you know some kind of like open

01:17:59: platform in place of YouTube but

01:18:04: like you know good luck like like

01:18:05: getting all the creators moving away

01:18:07: from YouTube.

01:18:09: So it's it's it's it's what it it's what

01:18:12: it is.
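
For context, this is roughly what resolving a video through yt-dlp from Python looks like. It is a minimal sketch assuming the yt-dlp package is installed; the options and the fallback logic are illustrative, not how Resonite itself invokes it:

```python
# Minimal sketch: resolve a direct media URL with the yt-dlp Python package
# (pip install yt-dlp). Options and fallback logic are illustrative only.
import yt_dlp

def resolve_stream_url(page_url: str) -> str:
    opts = {"format": "best", "quiet": True}
    with yt_dlp.YoutubeDL(opts) as ydl:
        # download=False only extracts metadata, including resolved stream URLs.
        info = ydl.extract_info(page_url, download=False)
        # With a single selected format the URL is usually on the top-level
        # dict; otherwise fall back to the last entry in "formats".
        return info.get("url") or info["formats"][-1]["url"]

if __name__ == "__main__":
    print(resolve_stream_url("https://www.youtube.com/watch?v=dQw4w9WgXcQ"))
```

This resolution step is also what silently breaks whenever YouTube changes something, until yt-dlp ships an update, which is why the automatic-update behavior mentioned above matters.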

01:18:16: Uh

01:18:18: Dynamos is asking uh were there any

01:18:20: thoughts about ordering gather jobs? I'm

01:18:21: on very slow connection. I'm frequently

01:18:23: falling through walls before colliders

01:18:24: load. Um yes, the system actually has

01:18:27: like some prioritization, but there's not

01:18:29: like a mechanism in the world to say,

01:18:31: okay, load this asset, you know, first

01:18:32: before this one. However, um there is a

01:18:36: solution where you can configure the

01:18:37: world and be like, don't initialize

01:18:39: locomotion until this collider is

01:18:42: loaded.

01:18:43: um a lot of people don't set it uh set

01:18:46: it up um which means like you know

01:18:49: you fall through the world. I would generally

01:18:51: recommend for people to set it up to

01:18:52: avoid those kind of situations we do

01:18:54: want to eventually add like you know

01:18:56: more systems where I can say like these

01:18:57: meshes these you know assets are high

01:18:59: priority download and load this first um

01:19:02: right now those mechanisms don't exist

01:19:06: you can also use a primitive collider

01:19:08: under spawn to stop people from falling

01:19:09: through the world and I would actually

01:19:11: recommend people to do proper collider

01:19:15: passes on their world rather than just

01:19:17: leave everything as mesh colliders

01:19:19: >> because while it does work, it is

01:19:21: inefficient and leads to the problems of

01:19:24: like people needing to actually load

01:19:26: them. Whereas with primitive colliders,

01:19:28: if you compose your stuff out of prims,

01:19:30: um then

01:19:33: uh that doesn't have to be downloaded

01:19:35: because it's just primitive colliders

01:19:37: and they'll load instantly.

01:19:39: Yeah, here's another thing is like it is

01:19:41: a little more work but like generally

01:19:43: like using proper collider pass is like

01:19:46: going to lead to much better results.

01:19:49: Also, oh no, we've got the uh

01:19:53: U1F98A is asking... type within the... I I don't

01:19:58: know man.

01:20:00: >> I want to go back to the regular ones

01:20:02: please.

01:20:03: >> Too many negatives.

01:20:08: That's just a reply to chat.

01:20:09: >> Does it reply?

01:20:17: >> Uh asking

01:20:20: >> I don't think that's actually a

01:20:21: question. I think that's just them

01:20:22: talking in the chat.

01:20:23: >> Oh yeah.

01:20:25: >> Yeah.

01:20:25: >> Yeah. Let's skip that. I was also kind

01:20:27: of getting into personal kind of things.

01:20:29: So,

01:20:31: okay. So, that's actually all the

01:20:32: questions we have from

01:20:34: uh from Twitch right now. Uh let's see

01:20:37: how much time we have. We still have

01:20:41: uh 40 minutes left. I think it's 14

01:20:43: minutes.

01:20:44: >> Yeah.

01:20:46: >> Um

01:20:47: so we should uh I mean feel free to ask

01:20:50: like more questions, but there's like

01:20:51: one thing I kind of wanted to talk about

01:20:53: like that's I just found like

01:20:55: interesting. I kind of started like

01:20:56: talking about this like on Bluesky.

01:20:57: It kind of started an interesting

01:20:58: thread. Um

01:21:01: um so for like feel free to ask more

01:21:04: questions. We got like 40 minutes. So

01:21:05: like we still got like you know time. Um

01:21:08: but one thing I find like interesting is

01:21:10: there was like the news like where meta

01:21:12: recently closed you know the Horizon

01:21:15: Workrooms thingy. Um and I just

01:21:19: like shared it and like every time I see

01:21:23: like you know news like that, because over

01:21:24: the time there's like been a number of

01:21:26: like you know social VR platforms and

01:21:29: like you know VR things, and over time

01:21:31: I've seen companies that like you know they

01:21:32: get like

01:21:34: you know dozens or hundreds of millions

01:21:36: of dollars of funding, you know, in Meta's case

01:21:39: probably billions, like you know they can

01:21:40: have like so many resources um and it's

01:21:43: so weird because like I remember years

01:21:45: and years ago Um, like I will see that

01:21:49: and I like I would have some people tell

01:21:50: me like be like, you know, like, well,

01:21:52: what if your building is just going to

01:21:53: be irrelevant? They're going to make it

01:21:54: irrelevant, you know, like you might as

01:21:56: well kind of give up now, you know,

01:21:58: because like you're not going to be able

01:21:59: to compete with like, you know, company

01:22:00: that has like thousands of people

01:22:02: working on these projects and there's

01:22:04: like, you know, billions of funding.

01:22:05: It's just going to completely destroy

01:22:07: whatever you're making. And it kind of

01:22:09: felt like kind of like, you know,

01:22:11: scary because it's like, you know, it's

01:22:13: like you're really like, you know, how

01:22:14: how how we going to compete like, you

01:22:16: know, with thousands of developers, how

01:22:17: are we going to compete with like

01:22:18: billions of dollars of funding, you

01:22:21: know, or even dozens of millions of

01:22:24: dollars of funding. Um,

01:22:28: and then like you know, you see you see

01:22:30: those news and then like you see like

01:22:31: you know, they try for a bit and then

01:22:33: like few years later like you know, they

01:22:36: closed the projects. Um

01:22:40: and it's

01:22:43: you it's so weird like you know being

01:22:45: like where we like still like small like

01:22:48: we're largely community funded. Um,

01:22:52: and we just kind of, you know, keep

01:22:54: going.

01:22:55: And there's like many times like, you

01:22:56: know, where it's like things are very

01:22:58: very hard and like things are like, you

01:23:00: know, like you feel like if there's like

01:23:02: many times like where if like this was

01:23:04: like a normal job, like like I feel like

01:23:06: we're like quit many times over, but

01:23:09: because this is like a passion project,

01:23:11: you know, it's something that's

01:23:12: important to us, we kind of keep going.

01:23:13: We kind of push through. But those kind

01:23:16: of be companies like you know it's all

01:23:17: about kind of it's all about numbers for

01:23:18: them and they just kind of kill the

01:23:20: project. Um

01:23:23: and it's always like so interesting to

01:23:25: see because it it shows that like it

01:23:28: doesn't

01:23:31: like having the like you know having in

01:23:35: like a huge amount of like monetary

01:23:37: resources doesn't matter as much as I

01:23:40: would think.

01:23:44: Um

01:23:47: like obviously it kind of matters in

01:23:49: some ways like we still like you know

01:23:50: even ourselves we need funding and we

01:23:52: kind of like you know we need more

01:23:53: funding than we have right now but even

01:23:56: like with what we have like you know we

01:23:57: keep going we keep kind of pushing

01:23:58: through, um, where I might have just kind of given up,

01:24:02: and always kind of gives me like bit

01:24:04: more kind of hope that like you know us

01:24:06: kind of being sort of the underdog you

01:24:09: know kind of theme like we

01:24:12: um we keep going like while such big

01:24:15: projects you know fail

01:24:17: and it kind of started like interesting

01:24:18: like discussion like on like socials too

01:24:21: and

01:24:23: I don't know it's interesting feeling

01:24:31: >> but it's also like in big part like you

01:24:33: know thanks to you and the community

01:24:34: because you helped you know um you help

01:24:38: us kind of keep going

01:24:41: And that's like, you know, like and I

01:24:43: feel this like one thing like we have

01:24:44: that like, you know, those projects

01:24:45: don't have is like they try to like

01:24:48: sometimes they will try to like force

01:24:49: things like really hard like you know um

01:24:53: and just like it doesn't work like it

01:24:55: like the stuff like you have to build

01:24:57: like you know it it has to be genuine.

01:25:02: Um

01:25:05: and it's it's good to know like you know

01:25:07: that matters like the genuinity of it.

01:25:11: >> Yeah.

01:25:12: >> And it can matter a lot more than just

01:25:15: throwing like you know millions at

01:25:17: something or even billions in case of

01:25:19: meta.

01:25:23: Anyway, we got we got some more uh

01:25:26: questions.

01:25:28: Uh,

01:25:29: unless you want to talk about this more.

01:25:33: >> Uh,

01:25:36: I'm thinking we can I mean, we also do

01:25:39: have 30 minutes left as well. So,

01:25:41: >> yeah, we got time.

01:25:47: >> I mean, I can do more questions just

01:25:49: like knock out these for you. Okay, so

01:25:53: as Twitch is asking, how about a normal

01:25:55: schnopid and tongs?

01:25:58: I actually might have some actually I do

01:26:02: have like I think I have a schnopid like

01:26:03: which is sort of like related to

01:26:06: um something we talked about earlier.

01:26:11: Um, it kind of goes like you know with

01:26:18: actually no hold on let me let me see

01:26:20: what I've been document

01:26:32: so I have Uh

01:26:41: yeah, actually I have like one that's

01:26:43: sort of like sort of related to like

01:26:45: what we talked about like earlier. Um, and

01:26:48: it kind of goes like, you know, from

01:26:49: that kind of sort of, you know, initial

01:26:50: kind of skeptical kind of mindset is

01:26:52: like where sometimes it kind of feels

01:26:54: like at least for me is like where

01:26:56: asking a lot of questions is kind of

01:26:59: looked down upon like people kind of

01:27:01: like you know like you're like oh we're

01:27:02: asking so many questions and so on. Um,

01:27:05: but like usually my instinct is like you

01:27:07: know if I don't understand something I

01:27:11: will just keep asking questions until I

01:27:13: do. Uh, and so that gets difficult

01:27:15: because like you know like I'm just kind

01:27:16: of not finding the understanding I want.

01:27:19: Um, and it kind of gets frustrating

01:27:20: because you know I just keep asking

01:27:22: questions like you know it and just kind

01:27:23: of keeps scrolling like and like I'm

01:27:24: just like and I'm like

01:27:27: you know but for me

01:27:30: I find like some people like they will

01:27:33: like you know at some like they will

01:27:36: like you know just like somebody say

01:27:37: something or ask something and I just

01:27:39: kind of like be like kind of like nod

01:27:40: along and like um

01:27:43: and like you know it might be based on

01:27:44: like their misunderstanding and

01:27:46: sometimes it takes a bit to find they

01:27:47: actually didn't quite understand like

01:27:48: you know what you meant or what you

01:27:50: wanted them to do. Um but it's just kind

01:27:52: of a nod like you know to finish this

01:27:53: conversation, it feels like you know quick

01:27:54: like you know you feel you feel like

01:27:56: you're in agreement. Um but for me like

01:27:59: you know if I like like I will generally

01:28:02: like not do that like I will like if I

01:28:04: feel like I don't understand something

01:28:06: I'll try like my hardest to like you

01:28:08: know really get down to like and really

01:28:10: like understand and get it. And

01:28:11: sometimes it happens like in threads

01:28:13: like where it's like um

01:28:16: it's like you know it just kind of makes

01:28:19: a very long discussion and like it feels

01:28:20: like you know like more like like you're

01:28:23: being combative but like even so like

01:28:25: the goal is to like you know to actually

01:28:27: get that you know understanding have it

01:28:29: like have it click in my head when it's

01:28:33: not clicking.

01:28:35: And I wish like it wasn't like you know

01:28:37: I wish it wasn't as much of a thing like

01:28:38: where it's like a thing where

01:28:41: um like it even happens like you know

01:28:43: for example in like you know when you're

01:28:45: in school and for times like you know

01:28:48: when the teachers like you know ask is

01:28:49: there any questions you know a lot of

01:28:51: students will not want to ask any

01:28:53: questions even though like they probably

01:28:54: need to there's like stuff they don't

01:28:56: understand because there's and I kind of

01:28:58: remember this myself from school like

01:29:00: where it's like if you ask a question

01:29:02: you know like you're showing other

01:29:03: people you don't understand something

01:29:05: and that's bad you know but it's

01:29:09: I wish it wasn't bad you know like like

01:29:11: the way it is where

01:29:14: if you ask questions it is a good thing

01:29:16: because you're trying to understand the

01:29:18: thing more and you're showing that you

01:29:20: don't understand something that's not a

01:29:22: bad thing because not like yes you're

01:29:25: showing like you don't understand

01:29:26: something but like for other people

01:29:27: probably also don't understand like you

01:29:29: know parts about it but you're showing

01:29:30: willingness to understand it more to

01:29:32: like you know to learn Um

01:29:36: I know like like there's many times like

01:29:37: when I was like at school where I wish

01:29:39: like I was very anxious and I would

01:29:41: often times not understand you know

01:29:42: something because of this sort of um uh

01:29:48: like like

01:29:52: you know feel this kind of like you know

01:29:54: embarrassment like from asking the

01:29:55: question. I wish that wasn't really much

01:29:56: of a thing and it was kind of more

01:29:59: normalized that like it's okay to ask

01:30:01: you know those questions. it's a like

01:30:04: good thing to do. And then some people

01:30:06: like you know like I kind of uh um later

01:30:10: in life like you know they kind of have

01:30:11: like you know this kind of mindset it's

01:30:12: like you know it's like that is

01:30:14: important like you know like you will

01:30:15: there will be a lot of things you don't

01:30:16: understand and the only thing you get

01:30:18: better understanding them is by asking

01:30:21: questions you know by sort of like

01:30:23: learning and that's kind of like you

01:30:24: know how I kind of got more comfortable

01:30:27: with it myself. Um, but it's still like

01:30:30: there's like places where I feel like

01:30:32: you know it's like

01:30:36: it's kind of like kind of sort of looked down

01:30:38: upon like like you know like where

01:30:40: people just don't want to ask things.

01:30:42: Um, and I just wish like there was like

01:30:45: you know less of that. I'm kind of

01:30:48: rambly about this one.

01:30:51: Do you have do you have a schnop?

01:30:52: >> Yeah. Um

01:30:56: uh yes,

01:30:59: if you see me in a world guys,

01:31:03: please ask before petting me because I

01:31:07: keep having people who come up and they

01:31:09: pet me and they keep not asking

01:31:13: and it really makes me uncomfortable

01:31:15: because you wouldn't go up to someone in

01:31:16: real life and pet them. So, I'm asking

01:31:19: very kindly to please ask me. And if you

01:31:23: ask me, I will more than likely say yes.

01:31:25: Uh,

01:31:27: but it's really starting to grind on me.

01:31:29: So, pretty please. Thank you.

01:31:32: I think that's a good etiquette. I mean,

01:31:34: it doesn't even have to be like, you

01:31:36: know, like

01:31:38: like I feel like like it's a very kind

01:31:40: of quick question. Like some people are

01:31:41: like, you know, you can literally be like,

01:31:42: can I pet?

01:31:44: >> Yeah. And sometimes like I feel like you

01:31:46: can even like be like you know you don't

01:31:48: have to be like verbal about it. You can

01:31:50: be like and sometimes people will be

01:31:52: like yeah or they'll be like no.

01:31:56: >> Yep. But it's it's it's it's a good

01:31:58: thing. Like, oh, is this what

01:32:02: happens when you don't ask? Explode.

01:32:05: >> like yes

01:32:07: >> like yes it is a it is you know

01:32:09: technically like we're in a game you

01:32:11: know like all the argument is like oh

01:32:13: you can just take the headset off or you

01:32:14: can just not look at the computer screen

01:32:16: but like

01:32:18: >> if I take the headset off I'm not in VR

01:32:21: anymore. Yeah. Uh, and so like you're

01:32:24: basically saying don't play the game,

01:32:28: you know, like I want to play the game,

01:32:29: but I also want to be comfortable while,

01:32:31: you know, hanging out with my friends.

01:32:34: So,

01:32:36: uh, whether, you know, whether you're my

01:32:38: friend or not, um, you please just ask.

01:32:42: That's all I'm That's all I'm asking.

01:32:44: There's also a thing is like you know if

01:32:46: somebody wants to like pet somebody in

01:32:47: VR usually it's because you want to like

01:32:48: you know it feels nice and you want to

01:32:50: make the person feels nice and like it's

01:32:53: good to like just get like you know

01:32:54: quick confirmation that wolf make them

01:32:56: feel nice and sometimes like you know it

01:32:58: depends on the context like if you don't

01:32:59: ask like you know it's going to make

01:33:00: them feel bad and why do you want to

01:33:02: make somebody feel bad?

01:33:04: >> Yep. Yeah, that's my schnopid.

01:33:08: >> I do have a tong that's sort of related

01:33:11: to what I was like saying. um is sort of

01:33:14: where it hits like where it like you

01:33:15: know goes where I really like you know

01:33:17: environments and places and spaces where

01:33:21: it is kind of you know encouraged to ask

01:33:22: questions. it's considered like a good

01:33:24: thing and also like ones where

01:33:27: um

01:33:29: it's okay you know like you know it's

01:33:31: okay to make mistakes as long as you

01:33:34: learn from them and like the focus isn't

01:33:36: you know on the thing like you made a

01:33:37: mistake the focus is on what you learn

01:33:39: from it and how you fix it and how you

01:33:40: improve and um there's actually been

01:33:44: like you know one one of the people I

01:34:45: know, like I've known for a while, um uh his

01:34:49: name is like Better Khan, uh he goes by

01:34:51: Peg, you can sometimes like see, he's like

01:34:53: a university teacher, and he does like he

01:33:55: did this like whole lecture where like

01:33:57: you know it is important to make

01:33:58: mistakes because mistakes is kind of how

01:34:00: you learn and it kind of comes down to

01:34:02: like you know where like mistakes are

01:34:04: kind of looked down upon like you know

01:34:05: like there's so many spaces and areas

01:34:07: like where it's like you know you cannot

01:34:08: make even a single mistake you know like

01:34:10: when you make mistakes like things are

01:34:11: bad um but like they're like a natural

01:34:15: part of the learning process

01:34:18: um and of course it depends you know it

01:34:21: depends on um

01:34:24: depends on the situation like there's

01:34:26: places where you really want to minimize

01:34:28: the mistakes but if you're learning you

01:34:30: know or trying something new like like

01:34:33: you the mistakes will be natural part of

01:34:36: the process and it's okay you know as

01:34:38: long as you're learning from them.

01:34:41: So I like those kind of places where

01:34:43: it's you know because it feels kind of

01:34:44: constructive. It's like you know it's

01:34:45: not like it's not necessarily like a bad

01:34:47: thing like you know something happens

01:34:49: and

01:34:52: um it's more like you know what can we

01:34:54: learn from this? How do we make sure

01:34:56: this mistake isn't happening again

01:34:57: because mistakes they do get bad like

01:35:00: the moment like you know you

01:35:03: um don't learn from them. you keep

01:35:05: making the same one like over and over

01:35:07: again. And sometimes, you know, it also

01:35:08: can be like, you know, like you don't want to

01:35:10: make mistakes, like say you're working on

01:35:13: something that does involve safety, you

01:35:16: know, say like um say like flying, you

01:35:19: know, um you're learning to be a pilot.

01:35:22: Obviously, you don't want to be making

01:35:24: mistakes when you're flying, you know, a

01:35:25: bunch of people on commercial airplane,

01:35:27: but that's why, you know, like you train

01:35:30: and when you train, you know, usually

01:35:32: like it's something in a simulator and you will

01:35:33: be making mistakes and it's okay to make

01:35:36: mistakes, you know, as part of the

01:35:37: process because that's kind of how you

01:35:39: learn like don't do this, don't do that,

01:35:40: you know, like if you do this, this

01:35:42: happens and it gets you better

01:35:43: understanding. So there's like a way to

01:35:45: structure things where the learning

01:35:47: process you know like you will make

01:35:50: mistakes and then like you know you

01:35:52: learn from them and then when you

01:35:54: actually do things you know where like

01:35:55: it kind of matters you know it's like

01:35:58: because you learn from them and you have

01:35:59: that understanding you don't make those

01:36:02: mistakes anymore. Of course it depends

01:36:04: you know sometimes you'll make mistakes

01:36:05: that will affect some things negatively.

01:36:07: Um

01:36:09: and the question is like you know could

01:36:10: even prevent it like you know like like

01:36:12: is like um what's the impact of that

01:36:15: mistake in some cases you know the

01:36:18: mistake can lead like to things that

01:36:20: don't really have that much of a

01:36:21: consequence, or the consequence is low, which is

01:36:22: like you know the main thing is you know

01:36:25: learn from it um but in general just

01:36:29: kind of like

01:36:32: I enjoy, like, you know, spaces where

01:36:34: it's like, you know,

01:36:37: just the mindset, like, you know, whenever

01:36:38: people's mindset is like, okay,

01:36:40: like, you know,

01:36:43: let's not dwell, like, on, you know,

01:36:45: just kind of this, um,

01:36:48: let's not, like, you know, dwell on, uh, this

01:36:51: mistake, like, you know,

01:36:54: kind of being made, but more

01:36:56: focus on, like, where do we go from here,

01:36:59: like how do we fix this, what do you

01:37:00: learn from it.

01:37:04: my points

01:37:12: Let me think.

01:37:21: That's not a good sign. I can't think of

01:37:22: anything good.

01:37:24: What? It's not a good bad.

01:37:27: >> I can only think of the bad things.

01:37:29: Help.

01:37:32: More

01:37:32: >> tons.

01:37:34: Well,

01:37:35: >> just just treats, and be like, like when

01:37:36: people ask before they pet you.

01:37:40: >> I mean, I guess that could be a that

01:37:42: could be a valid one.

01:37:44: >> Uh

01:37:49: I don't know. Resonite Link. That's my

01:37:51: type. It's cool. You can put stuff in the

01:37:53: game really easily.

01:37:56: >> I was actually going to ask chat that

01:37:57: thing, but um let's go through these

01:37:59: questions. Oh, we've actually got a few.

01:38:00: Um let's go through our questions first.

01:38:06: this one. Um,

01:38:11: um, Alexia is asking, "Paws, muscles,

01:38:13: claws, or tails?" Um, yes.

01:38:21: >> Kyle Wolves is asking, "Uh, a thought

01:38:23: popped in. I remember hearing at one

01:38:25: point that relays in ProtoFlux do a

01:38:27: performance hit. Is this true? Was this

01:38:28: true and no longer is? I don't actually

01:38:31: remember if that was like

01:38:34: >> we we we answered this earlier.

01:38:36: >> No, it was only the data ones. They specifically

01:38:39: asked about like not the impulse ones.

01:38:43: >> Oh,

01:38:43: >> actually wait. Oh, they're not

01:38:45: specifying. Yeah, some of them I don't I

01:38:47: think the impulse ones got optimized

01:38:49: away, but the data ones haven't. I

01:38:51: forget.

01:38:52: >> Yes, I don't remember.

01:38:54: >> It is. The impulse ones do get stitched

01:38:57: out. Um

01:38:58: >> Okay. So th those ones are actually

01:39:00: removed from the execution.

01:39:02: But the val like the the relays that

01:39:05: carry references and values around those

01:39:07: do have an impact albeit it's very

01:39:09: small. Um but if you notice like a

01:39:13: really really tight loop that you're

01:39:15: doing is running a little slow try

01:39:17: removing some relays from it.

01:39:18: >> Yeah. And we eventually want to

01:39:21: implement like optimization for this but

01:39:23: uh right now that's not a thing.

01:39:27: Uh Jacob Fox is asking uh

01:39:34: uh, uh, in the Link C# reference

01:39:36: implementation, is it correct that Slot

01:39:38: and Component are both defined

01:39:40: separately rather than as derivatives of Worker, despite both being

01:39:42: workers, or is that an oversight? I know

01:39:44: that's intentional. Um, the Link, it

01:39:47: sort of, it doesn't 100%, like, you

01:39:51: know, copy the exact hierarchy, because

01:39:52: like there's not really any other types of

01:39:54: workers that you can currently access through the

01:39:55: Link. Um, there's also not necessarily

01:39:58: stuff that needs to be shared between

01:40:00: those. We might maybe change that if it

01:40:02: like ends up like being shared, but like

01:40:05: it's not like, you know, the goal isn't

01:40:08: like to like mimic all the internals of

01:40:11: engine, but sort of like more like in

01:40:13: how you sort of interact with it.

01:40:18: But right now, right now, the fact that

01:40:20: they're separate is like intentional

01:40:21: just to kind of not like complicate

01:40:23: things.

01:40:29: Uh, next question is from computer user.

01:40:34: Uh, with the plan for multi-process

01:40:36: architecture, would that mean more

01:40:38: systems in game would act like how the

01:40:39: current renderer/engine system works?

01:40:42: If that is the case, would that mean

01:40:44: each system would effectively have its

01:40:45: own garbage collector? Sorry if I'm

01:40:47: asking this way too soon. No, that's

01:40:49: fine. Um so yeah like uh if we separate

01:40:53: like you know each world into its own

01:40:55: process then yes each world is going to

01:40:56: have its own garbage collector like you

01:40:58: know its own like memory space that

01:41:00: might help like you know some of the

01:41:01: like you know performance across the

01:41:02: worlds. Um

01:41:05: I mean, yeah, that's pretty much it, like

01:41:07: if it runs in its own process, then

01:41:09: each process has its own, like, you know,

01:41:11: runtime and its own, you know, garbage

01:41:13: collector. Um,

01:41:18: that's kind of it.

01:41:20: >> I I figured we were already on like the

01:41:23: multiprocess architecture, as

01:41:27: I had understood it.

01:41:29: >> Oh well we do have the multiprocess

01:41:31: architecture between the render and the

01:41:32: main process but we also want to in the

01:41:34: future we want to split the main process

01:41:35: like each world you run

01:41:37: >> becomes its own process as well. So it's

01:41:39: like there's like further levels of

01:41:40: isolation

01:41:41: >> and we can also like you know then for

01:41:42: example sandbox individual worlds and

01:41:44: even like for example if a world crashes

01:41:47: um it doesn't crash the entire engine

01:41:50: even if it's like you know hard crash.

01:41:53: >> Okay.
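A minimal Python sketch of the kind of per-world process isolation described here, as a general analogy rather than Resonite's actual implementation: each world gets its own OS process, and therefore its own memory space and garbage collector, so a hard crash in one world leaves the host and the other worlds running.

```python
# Minimal sketch (an analogy, not Resonite's actual code): each "world" runs
# in its own OS process, so a hard crash in one world does not take down the
# host process or the other worlds.
import multiprocessing as mp
import os

def run_world(name: str) -> None:
    """Stand-in for a world's update loop."""
    print(f"world '{name}' running in pid {os.getpid()}")
    if name == "unstable-world":
        # Simulate a hard crash of this world's process.
        os._exit(1)

if __name__ == "__main__":
    worlds = ["home-world", "unstable-world", "grid-world"]
    procs = [mp.Process(target=run_world, args=(w,)) for w in worlds]
    for p in procs:
        p.start()
    for w, p in zip(worlds, procs):
        p.join()
        status = "crashed" if p.exitcode != 0 else "exited cleanly"
        print(f"{w}: {status}; host process is still alive")
```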

01:41:54: >> So that's all the questions we got. Uh, we still

01:41:56: got about uh 18 minutes

01:42:00: 18 minutes left.

01:42:01: >> Um

01:42:03: so I actually have a question for chat

01:42:05: as well. Um, you know, since we like

01:42:07: released Resonite Link, have you seen

01:42:09: any cool things or like what are the

01:42:12: what are the things you want to see the

01:42:14: most with Resonite Link? Uh, or what

01:42:17: what are things that you want to do with

01:42:18: Resonite Link or have you seen any like

01:42:20: cool stuff that uh that you like?

01:42:24: >> I could bring in my picture.

01:42:25: >> Yeah, you can bring a picture like talk

01:42:27: about it a bit.

01:42:28: >> Bring my picture in.

01:42:30: Uh, let me Where did I put it? Where's

01:42:33: my picture?

01:42:34: Where' I put my picture? Where's my

01:42:37: picture?

01:42:38: Where is it? Uh,

01:42:42: did I save it to my inventory?

01:42:45: I swear I did.

01:42:47: >> Can copy it. I can maybe grab it from

01:42:49: this. Oh, I got it. There we go.

01:42:51: >> I got it. Um,

01:42:53: yeah. So, I was using uh Resonite Link

01:42:57: uh in Garry's Mod and I Where's the

01:43:00: camera? There it is. Um, I was using

01:43:03: Resonite Link in Garry's Mod and I

01:43:06: managed to there's a function over there

01:43:08: that lets you query like what the global

01:43:11: illumination looks like at a particular

01:43:13: point in the map. So, I just like

01:43:16: because I wanted to just like test the

01:43:18: the image sending over to Resonite, I

01:43:21: just in uh yeah, Garry's Mod um

01:43:25: I just like wrote a little like Lua

01:43:27: script to sample like a 2D grid of like

01:43:31: the lighting data. So, this is just like

01:43:32: a slice of the map. It's not like an

01:43:34: actual like light map. Um

01:43:37: >> and I I sampled the data directly. I

01:43:41: stored it in a table and I just uh I

01:43:43: sent the websocket message for the

01:43:45: texture uh to send it directly over the

01:43:48: websocket with just like raw bytes and

01:43:51: uh it gave me back a picture URL and I

01:43:54: pasted that in game and it worked.
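A rough Python sketch of the flow described here, with assumed placeholders: the websocket endpoint, the message name, and the field layout are illustrative guesses, not Resonite Link's documented schema. The idea is simply to describe the texture in a JSON message, send the raw pixel bytes as a binary websocket frame, and read back the resulting asset URL.

```python
# Rough sketch of the idea described above, in Python rather than Lua.
# NOTE: the endpoint, port, and message fields here are placeholders, NOT
# Resonite Link's documented protocol; consult the actual Link docs for the
# real message names. The point is just the shape of the flow.
import asyncio
import json
import websockets  # pip install websockets

async def upload_texture(raw_rgba: bytes, width: int, height: int) -> str:
    uri = "ws://localhost:8080"  # assumed local Resonite Link endpoint
    async with websockets.connect(uri) as ws:
        # Hypothetical header describing the binary payload that follows.
        await ws.send(json.dumps({
            "$type": "uploadTexture",   # placeholder message name
            "width": width,
            "height": height,
            "format": "RGBA8",
        }))
        # Send the raw pixel bytes as a binary websocket frame.
        await ws.send(raw_rgba)
        # Expect some JSON reply containing the resulting asset URL.
        reply = json.loads(await ws.recv())
        return reply.get("url", "")

if __name__ == "__main__":
    # 2x2 white test image, 4 bytes per pixel.
    pixels = b"\xff" * (2 * 2 * 4)
    print(asyncio.run(upload_texture(pixels, 2, 2)))
```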

01:43:59: >> Uh SDK1.

01:44:02: >> Yeah, man. Uh I mean I it could be

01:44:06: possible. Would we do would you be

01:44:09: showing that video with the physics

01:44:10: thing?

01:44:11: >> Yeah, I can I can bring it in.

01:44:13: >> I found it really cool.

01:44:17: Where did I put it? There it is.

01:44:20: Video.

01:44:27: Um, in this video, uh, I took a physics

01:44:32: object in Garry's Mod and I sent its

01:44:36: creation over to Resonite, and I'm just

01:44:39: continually syncing the position. Um,

01:44:43: Resonite Link's really not meant for

01:44:45: this. It uh it runs and it looks very

01:44:50: smooth, but I would imagine trying to do

01:44:52: this with any more than maybe like a few

01:44:55: dozen like over a hundred objects would

01:44:58: be very very very slow. Um

01:45:02: >> yeah, the most efficient way

01:45:04: >> already with one uh it's uh already kind

01:45:09: of hurting a little bit because it is so

01:45:12: slow. Um is it Oh. Oh, why did that

01:45:17: import like that?

01:45:20: Um, picture

01:45:24: import. Yeah, it does not run very fast.

01:45:27: That's half of a millisecond to uh just

01:45:30: do that one thing.

01:45:31: >> Do you know what's like spending time

01:45:33: on? Is it like the JSON serialization?

01:45:36: >> Um, it's likely the JSON serialization

01:45:39: coupled with a lot of memory pressure.

01:45:42: Um, it's very very very hard to not

01:45:45: allocate memory in Lua because all of

01:45:48: like the vectors and stuff and whatnot.

01:45:50: Like I have to make a new table every

01:45:52: time I want to serialize a vector.

01:45:54: >> Um,

01:45:56: and making a vector in and of itself is

01:45:59: an object on the heap that has to be a

01:46:00: garbage collected.

01:46:02: >> Yeah.

01:46:03: >> So it's very

01:46:06: it's it's quite you have to get very

01:46:08: crafty with making it run fast. And

01:46:10: that's not even Lua. That's Expression

01:46:12: 2, which is written on top of Lua.

01:46:17: >> So it is very cool just like seeing like

01:46:19: you know something coming from Garry's

01:46:21: Mod.

01:46:22: >> Yeah, you can get it to go a lot faster in

01:46:25: normal Lua.

01:46:26: >> Um

01:46:29: but uh yeah, this is this is literally

01:46:31: just me giving into the demons and

01:46:32: seeing if I could do it and I did. And

01:46:35: it's also like one of the things like we

01:46:36: we generally, like, it's

01:46:38: not designed, you know, for like

01:46:40: synchronizing, you know, lots

01:46:42: of real-time objects. Um, the goal is

01:46:44: like you know as much ease of

01:46:46: implementation to sort of like bring

01:46:47: data in but you can do it if you want

01:46:49: to. Um it's made like you know so you

01:46:52: can play with things like that. It's

01:46:53: just not particularly optimized for

01:46:55: that. But like have fun like this

01:46:57: because stuff like this is fun.
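A small Python sketch of one way to keep this kind of real-time syncing manageable, assuming batching and rate-limiting updates is acceptable for the use case; the endpoint and the "updatePositions" message shape are placeholders, not Resonite Link's actual protocol.

```python
# Sketch (assumptions, not the official API): rate-limit and batch position
# updates instead of sending one websocket message per physics tick. The
# "$type" name and field layout are placeholders, not Resonite Link's schema.
import asyncio
import json
import time
import websockets  # pip install websockets

SEND_HZ = 10  # cap updates at 10 per second instead of every physics tick

async def sync_positions(get_positions) -> None:
    """get_positions() should return {object_id: (x, y, z)} each call."""
    async with websockets.connect("ws://localhost:8080") as ws:  # assumed endpoint
        while True:
            started = time.monotonic()
            batch = get_positions()
            if batch:
                # One batched message per interval, not one per object.
                await ws.send(json.dumps({
                    "$type": "updatePositions",  # placeholder message name
                    "objects": [
                        {"id": oid, "position": list(pos)}
                        for oid, pos in batch.items()
                    ],
                }))
            elapsed = time.monotonic() - started
            await asyncio.sleep(max(0.0, 1.0 / SEND_HZ - elapsed))
```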

01:47:00: >> Yeah. if you can find a way to like

01:47:04: it's one thing it's one thing with like

01:47:06: you know ref hacking or whatever because

01:47:09: we tell you not to do that but like if

01:47:10: you find a creative way that we didn't

01:47:13: think of to use the engine or maybe use

01:47:16: it in a way that we didn't think it was

01:47:18: optimized for but it actually works

01:47:19: really well for then go for it surprise

01:47:22: us

01:47:27: >> let's uh some comments Gay Foxson's

01:47:31: asking in Garry's Mod. Yes.

01:47:33: >> Yes. Garry's Mod.

01:47:34: >> That's that's one of the cool things is

01:47:35: you know with like link is you can like

01:47:37: like pretty much the goal of the library

01:47:38: is like you can link it to almost

01:47:40: anything because JSON websockets are

01:47:42: like very widely supported.

01:47:45: Actually, the the only reason I was able

01:47:47: to do the texture upload in Garry's Mod

01:47:51: was because literally the websocket

01:47:54: module for Garry's Mod uh

01:47:59: the module for it um

01:48:05: had a beta version

01:48:07: that came out literally in November of

01:48:11: last year

01:48:13: >> um that added binary message

01:48:15: support. So I was like, "Oh my god, I'm

01:48:19: saved." And it works.

01:48:20: >> Yay. I saw a shark. How's that physics?

01:48:24: And I just put the physics objects in

01:48:26: Gmod. You can do that now.

01:48:29: >> And and you can make the objects clean

01:48:31: themselves up. That's what I did with um

01:48:34: that's what I did uh with like the uh

01:48:38: like a a further revision on it is I

01:48:41: made it clean itself up. So, Alex2Pi is

01:48:44: like, "I don't have anyone to play GM

01:48:45: modem mode with." Um, I have to answer

01:48:48: that.

01:48:49: >> That's because we're all playing

01:48:50: Resonite.

01:48:52: >> Just just synchronize Gmod to

01:48:54: Resonite and then play Resonite and you

01:48:55: can be playing

01:48:57: Gmod with others through Resonite.

01:49:01: >> Yeah,

01:49:03: it's great.

01:49:05: Oh my god. Uh

01:49:13: um U1F98A is asking, one thing I

01:49:16: noticed about Resonite Link:

01:49:18: setting a material on a MeshRenderer doesn't

01:49:19: seem to work.

01:49:21: >> I've seen like a report on that one but

01:49:23: I haven't been able to replicate it yet.

01:49:25: Um

01:49:26: >> are you sure you're actually setting the

01:49:28: data right? Cuz like I've been able to

01:49:30: set references and stuff correctly.

01:49:32: >> Yeah, that's fine.

01:49:34: I I haven't been able to like look into

01:49:36: the depth for that one, but like it

01:49:38: seems like it works for other people, so

01:49:40: it might be just something.

01:49:43: I don't know what yet.

01:49:46: >> I'm like a good replication case or

01:49:49: something.

01:49:52: >> So, computer user is asking,

01:49:54: >> it would be really fun if someone

01:49:55: manages to use Resonite Link to play Resonite

01:49:57: from Doom as opposed to playing Doom in

01:49:59: Resonite. Yes, please do that.

01:50:05: How about a little

01:50:07: How about How about Hightail?

01:50:11: Veryable.

01:50:13: >> Yeah, I guess so. If you could get a

01:50:15: websocket. Yeah.

01:50:18: >> Yeah. Like have fun with it. There's lots of

01:50:19: options.

01:50:21: Have you seen like any stuff like with

01:50:22: the meshes and audio?

01:50:25: Um,

01:50:27: I know somebody is uh

01:50:31: I think someone was excited about the

01:50:33: audio thing for some reason. I don't

01:50:35: remember why. Yeah, Fooy. I don't

01:50:37: remember why.

01:50:38: >> They work with a lot of audio stuff.

01:50:40: >> Yeah. Um,

01:50:44: I guess that's like I haven't seen

01:50:46: anybody do stuff with it. The only thing

01:50:48: I've seen people use with the assets is

01:50:50: like my own thing that I did and I

01:50:53: haven't seen anybody else use it yet.

01:50:58: There's also like a uh

01:51:01: some other people discovered having a

01:51:03: similar issue, maybe just something to

01:51:04: think about. The exact messages I

01:51:06: was sending are in the GitHub issue, for when

01:51:08: you get time. I mean this is the thing, I

01:51:10: did like look at it briefly, but like,

01:51:12: like the messages, they seemed like, I

01:51:14: have to like look at it more, but um, when

01:51:17: I just tried to do a quick replication

01:51:19: of it, like they were not, it doesn't seem to

01:51:20: be actual JSON, because they were like

01:51:22: using different symbols for things, and

01:51:25: like some of them, like, were missing,

01:51:28: like, because it was like referencing

01:51:29: some objects that were created in

01:51:31: messages that, like, they were not

01:51:33: included but I might have missed

01:51:36: something. I haven't like really had

01:51:37: time to like look at it fully,

01:51:43: but this this just kind of like what I

01:51:44: ran into like when I was like skimming

01:51:46: through stuff.

01:51:48: >> Gson is asking uh I wonder if we could

01:51:51: make a video portal between Resonite and a

01:51:53: Gmod world so players could see their

01:51:55: physics objects in there as well. Can

01:51:57: you bring videos to Gmod like that,

01:51:59: like video streams?

01:52:01: Uh yeah, there's

01:52:04: you can actually watch videos in Gmod

01:52:06: that are synced with each other.

01:52:08: >> Okay.

01:52:09: >> So do that.

01:52:11: >> That was actually a feature when I was

01:52:14: like 14 years old already.

01:52:18: >> Oh, that guy's been there a while.

01:52:20: >> Yeah, it's it's been there a while. It's

01:52:21: probably a lot better now. I used to

01:52:23: kind of suck back in the day.

01:52:25: Yeah,

01:52:28: >> you kind of take synced videos for

01:52:30: granted after playing Resonite.

01:52:34: Uh, Alexi is asking, "Sometime ago, I

01:52:37: said that importing Minecraft maps has

01:52:39: been updated in updates, but somehow I

01:52:41: couldn't get it to work on Linux." I

01:52:43: don't think it's going to work on Linux,

01:52:44: unfortunately. We use a tool called

01:52:46: Mineways um that like does the

01:52:49: exporting. I don't actually know if

01:52:50: Mineways does support Linux. I suppose

01:52:53: Linux, we might be able to get it to

01:52:55: work, but it will probably not work out

01:52:57: of the box.

01:53:00: >> Worst case scenario,

01:53:02: >> worst case scenario, you have to like

01:53:03: replace the .exe with like a script that

01:53:05: launches it in Wine or something.

01:53:09: >> Version.

01:53:12: Uh,

01:53:17: yeah. I don't I don't Oh, wait. They do

01:53:20: have like Linux stuff. Um,

01:53:26: did we have like for Mac? Um,

01:53:30: looks like you might need to build it

01:53:31: for Linux. Maybe

01:53:35: Linux. There's like

01:53:36: >> Oh. Oh, no. It just says for Linux,

01:53:39: download the Windows version and use Wine to

01:53:41: emulate Windows. We might actually proxy

01:53:44: that through

01:53:46: >> We might proxy that through the

01:53:49: bootstrapper then.

01:53:51: to launch it because like the main

01:53:53: process it runs like outside of wine. So

01:53:55: like if it tries to launch it, it's

01:53:56: probably not going to work. But we might

01:53:59: make a GitHub issue about it.

01:54:02: >> I mean the, the main proc, so the main .NET

01:54:05: process launches

01:54:08: >> so I run the bootstrapper, which launches

01:54:10: the native .NET process, and then the I

01:54:13: think the

01:54:14: >> I believe the native .NET process

01:54:16: launches the renderer.

01:54:18: >> No,

01:54:18: >> in back. No, it doesn't. No, it's the

01:54:20: bootstrapper. Does it?

01:54:21: >> I can't remember.

01:54:22: >> It's the bootstrapper because the main

01:54:24: process is running outside of wine. So

01:54:26: if it tries to launch something, it's

01:54:28: not going to run under wine. That's why

01:54:30: you have to route it through the bootstrapper.

01:54:33: >> I remember now.

01:54:34: >> Yeah. So one of the main reasons why the

01:54:37: bootstrapper even exists is like so it's

01:54:38: like

01:54:39: >> Yeah.

01:54:41: >> Yeah. I would say make an issue and like we

01:54:43: can potentially like look into that. I've

01:54:44: been kind of playing with like

01:54:46: >> I've been playing with like Linux

01:54:48: myself. So like maybe it could be

01:54:49: like fun to get it to work.

01:54:52: >> I uh

01:54:52: >> no promises but uh

01:54:55: >> I blocked the whole uh "bootstrapper, how

01:54:58: to actually start Resonite on Linux" ordeal

01:55:00: out of my mind. Trauma.

01:55:06: >> But it shouldn't be too much of an issue

01:55:07: like, like, just like another mechanism for the

01:55:09: bootstrapper to, like, launch um other

01:55:11: processes, because we want it to do it for

01:55:13: other things too. Um, and it would

01:55:16: launch them under, like, you know, under

01:55:19: Wine and then they should work. So

01:55:24: either way, make a GitHub issue so it's

01:55:26: like reported, like otherwise, like, you

01:55:27: know, we will forget,

01:55:29: and no promises, we might not like have

01:55:32: time for this one, but like without a

01:55:34: GitHub issue it's like 100% not getting

01:55:36: fixed.

01:55:37: >> Yeah.

01:55:39: Um, I would say with like we got what we

01:55:43: got 5 minutes left. I'd say we should

01:55:45: probably cut questions at this point.

01:55:48: Uh,

01:55:49: >> if people have quick questions or if

01:55:51: they have like quick comments on Link

01:55:52: and like kind of cool stuff like you would

01:55:53: want to see like

01:55:55: >> just uh don't don't ask your longass

01:55:59: questions now like uh we've gotten in

01:56:03: the past. It's like, can you explain the

01:56:05: philosophy behind Resonite in

01:56:08: excruciating detail, please?

01:56:11: >> Yeah.

01:56:16: I'm not really sure why.

01:56:19: Sorry.

01:56:19: >> I mean, they're asking, the main confusion in the C#

01:56:23: JSON derived models is that the attribute is

01:56:24: on the base class, not actually the

01:56:26: derived model class. It's like backwards.

01:56:28: Um

01:56:30: I don't like I mean it doesn't like this

01:56:33: shouldn't affect like your

01:56:34: implementations, like, because that's a

01:56:37: very C# thing, but also like for C# it

01:56:39: makes sense because when it's

01:56:40: deserializing it sees the base class, like

01:56:42: that's the only thing it knows so it

01:56:44: needs to like know to look it up. You

01:56:46: could potentially have implementation

01:56:48: where like you know it searches

01:56:49: everything but it's going to be very

01:56:50: inefficient because like the derived

01:56:52: classes can come from additional

01:56:54: assemblies and maybe like you know

01:56:55: they're loaded dynamically. So it also

01:56:59: could potentially lead to like issues

01:57:00: where something could like try to

01:57:02: intrude in introduce like you know under

01:57:04: the derived class that like the original

01:57:05: implementation doesn't account for and

01:57:07: now stuff explodes. So it kind of needs

01:57:09: to say okay this type of message you

01:57:12: know can have these subtypes and sort of

01:57:15: like define them so the implementation

01:57:17: knows ahead of time like all the types

01:57:18: that will exist but like that shouldn't

01:57:22: like that's like C# implementation

01:57:24: details so like you don't need to follow

01:57:26: that, for like, you know, if Python does it

01:57:28: on the derived classes, use that, like you

01:57:31: know, it should work the same. The main

01:57:33: part is, like, you know, you need to

01:57:34: deserialize the right type based on the

01:57:36: type property, but

01:57:39: where you put like you know how you do

01:57:41: that like you know that's specific to

01:57:42: your language.

01:57:44: >> Yeah. Like the way I've been doing it is

01:57:46: I literally just check the the dollar

01:57:48: sign type of the message and depending

01:57:51: on the dollar sign type that's how I

01:57:53: handle the logic. That's literally it.

01:57:56: >> That's all I've been doing. Uh I don't

01:57:59: really see why this is

01:58:01: >> it does also help because like you know

01:58:03: like if you look at it class you know

01:58:05: all the types you know that it will

01:58:07: have. So like you know you don't need to

01:58:10: go like hunting through the codebase for

01:58:12: all of them.

01:58:14: >> Yeah I guess I I guess I really am just

01:58:17: not grasping what is making this so

01:58:19: confusing.

01:58:22: And I guess can like

01:58:24: like I do feel like this way is like

01:58:26: kind of better because like it just you

01:58:28: know you have it in one place you know

01:58:30: what is you know what types are being

01:58:33: derived
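A short Python sketch of the dispatch pattern described here: read the "$type" discriminator from each incoming JSON message and route it to a handler, keeping the list of known message types in one place. The handler names and message type strings are placeholders, not the actual Resonite Link message names.

```python
# Sketch of dispatching on the "$type" discriminator rather than mirroring the
# C# class hierarchy. The message type names ("slotCreated", "error") are
# placeholders, not the actual Resonite Link message names.
import json
from typing import Any, Callable

def on_slot_created(msg: dict[str, Any]) -> None:
    print("slot created:", msg.get("id"))

def on_error(msg: dict[str, Any]) -> None:
    print("error from server:", msg.get("message"))

# One place that lists every message type this client understands,
# mirroring the "you have it all in one place" point above.
HANDLERS: dict[str, Callable[[dict[str, Any]], None]] = {
    "slotCreated": on_slot_created,
    "error": on_error,
}

def handle_message(raw: str) -> None:
    msg = json.loads(raw)
    handler = HANDLERS.get(msg.get("$type"))
    if handler is None:
        # Unknown/unsupported message type; ignore rather than crash.
        print("unhandled message type:", msg.get("$type"))
        return
    handler(msg)

if __name__ == "__main__":
    handle_message('{"$type": "slotCreated", "id": "abc123"}')
    handle_message('{"$type": "somethingNew"}')
```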

01:58:36: but yeah um now we only have like 1

01:58:38: minute left so we're going to like exit

01:58:39: it here. So thank you everyone for joining

01:58:42: for the stream. Uh, hope, like, you had, like,

01:58:44: you know, um, really, like, had, like, you

01:58:47: know, a good year, uh, good New Year's, and

01:58:50: this year is going to be good. Um, it should

01:58:53: be, like, you know, pretty regular now, like, um,

01:58:55: so there should be like another one next

01:58:57: Sunday. Um anyway thank you very much

01:59:00: for watching. Thank you very much and

01:59:01: everyone who's like supporting Resonite

01:59:03: uh whether it's like you know just by

01:59:04: playing the platform making cool things

01:59:07: um or like you know supporting us like

01:59:09: through Patreon or Stripe. Uh if you're

01:59:11: still on Patreon, we strongly like, you

01:59:14: know, recommend switching to Stripe. Um

01:59:16: we're going to get about like, you know,

01:59:17: 10% more from the same amount of money.

01:59:19: So that kind of helps us a lot the more

01:59:20: people switch. Uh but either way, like

01:59:22: you know, thank you for supporting us.

01:59:24: Thank you also for the like, you know,

01:59:25: subscribing on Twitch. Um that helps a

01:59:28: lot as well. Um

01:59:31: and like I guess we'll see you next

01:59:32: Sunday or like you know there's also

01:59:35: like other office hours so you can like

01:59:36: check out those as well. So,

01:59:39: um let's see if we also have anybody to

01:59:41: raid because we do like to raid people

01:59:44: uh from the community.

01:59:46: Uh

01:59:48: >> just to quickly answer seventh void's

01:59:50: question. Yes. Uh you feasibly could do

01:59:53: that if you made a script for it.

01:59:58: >> Person streaming right now is uh Creator

02:00:01: Jam. So, we're going to send you over to

02:00:02: Creator Jam. Uh,

02:00:06: which there's also, by the way, they're

02:00:08: going to be

02:00:10: starting MMC again. Uh, you might have

02:00:12: already seen some people with the MMC

02:00:14: sponsor patch.

02:00:17: Uh,

02:00:20: oh, come on. Twitch is not being

02:00:23: cooperative. There. There we go.

02:00:26: So, I'm actually kind of excited like

02:00:28: but we'll like we will see like cool

02:00:29: stuff. Um,

02:00:31: you know, now that there's Link.

02:00:37: Okay,

02:00:41: there we go.

02:00:45: All right, it's getting ready. So, thank

02:00:48: you again very much for watching and uh

02:00:51: we'll see you next time. Wow.

02:00:54: >> Bye guys. See you

02:00:59: >> and going to stop the recording and