This is a transcript of The Resonance from 2025 February 9.
00:00: Live. Sending announcement. Another announcement. Where's the office hours? I'm blind. There we go.
00:18: More... more sort of questions. And we're going to post the blue sky. There we go.
00:25: I know everyone, people should be piling in.
00:31: Hello? Hello? We have people, we have people!
00:33: Bwah! Hello! Hello Jack, hello Real Linus and hello Nuki. Oh, they're watching ads.
00:39: They can't hear our hellos yet.
00:43: Wah. Wah. Wah.
00:48: Hello.
00:51: Bwah, popping in.
00:53: Hello Vultboy!
00:54: Hello everyone.
00:56: Hello.
00:59: Don't be listening while we're looking at our MMC stuff.
01:02: Ah, thank you for your subscription, yes. Thank you.
01:06: We've got Columbia Jay.
01:08: It's been... I can't show it.
01:14: Thank you.
01:17: They've subscribed? Hello Jay Widen.
01:21: Hello everyone, how's everyone doing today?
01:27: Ah, they're trying to set it on fire.
01:31: It didn't work. It didn't work.
01:34: We don't have that on this one.
01:37: It's not conducive to answering questions.
01:41: So we can maybe set that one. Maybe if people are sensible with it.
01:46: Cyro's giving a side-eye, so I don't think he likes that idea.
01:50: Oh, thank you Tyra. Another subscription.
01:54: The Resonite logos, which unfortunately don't render here, but they do render at Twitch.
01:59: Thank you Tyra.
02:02: Or should I say, or dear.
02:04: This is what emojis look like to us.
02:06: Yes, it shows the raw text.
02:09: Oh no, he's doing it again.
02:12: Gerand is doing Xnopyt.
02:14: Before we actually get into the questions, hello and welcome everyone.
02:18: This is another episode of The Resonance.
02:20: It's essentially like my office hours, slash podcast, where we talk about essentially anything Resonite.
02:29: There's a heavy focus on technical stuff, but we can ask anything about philosophy, how's the company doing,
02:36: pretty much anything that has to do with the platform.
02:39: We can just ask, whatever you want to ask, you can ask during the stream.
02:44: If you're going to ask a question, make sure to end it with a question mark.
02:53: That way it pops up on our end and we make sure we don't lose it.
02:56: If you're asking a follow-up question, please include context of the question because oftentimes there's a bunch of questions in between
03:03: and then we forget what the original question was and it makes it hard to answer.
03:10: I also have Cyro here with me from our engineering team, so we should be ready to get started.
03:19: We also do have some questions that we ask ahead of time, there's actually three now.
03:24: In our Discord, one thing we started doing is for anyone who's not able to attend...
03:31: Ooh, also thank you Moonbase for the subscription, thank you.
03:34: More subscriptions.
03:36: If you're not able to attend, you can ask your questions in the Discord, there's a channel created before the stream starts
03:44: and we can pile your questions there and we'll try to get through them as well.
03:49: I'll see how stuff goes, because I'm not sure if it's better to prioritize them at the beginning or the end.
03:54: We have a few questions there, so let's say we go for an hour, and if there's free time,
04:02: like if there's not too many questions, we can go to the Discord ones.
04:05: If it's too busy, we'll switch to the Discord questions after a bit,
04:12: just so those get answered and people ask ahead of time.
04:16: But for now, I'll try to keep them for the quieter moments and see how stuff goes.
04:22: Let's clear these things out.
04:24: The first question is, like, Grandrick is asking Xnopyt.
04:30: That ended up being like a 15-minute ramble about ref hacking last time,
04:34: so like, other than ref hacking, what's the other thing that makes you spontaneously disintegrate?
04:41: Something that makes... let's see, what else makes me spontaneously disintegrate?
04:50: Other than ref hacking.
04:50: Mono-packing.
04:52: Yes. There we go. I don't think we're gonna jump into a big one on that.
04:57: Yeah, the only thing I gotta say about that is, like, just don't do it. There's no reason to.
05:07: There's no benefit.
05:11: Maybe a tiny one.
05:13: Yeah, like, if you have two nodes, maybe, and you just, like, don't care to pull out the ProtoFlux chip,
05:19: that's okay, I don't really care about that, they might as well just be components at that point.
05:23: But if you're packing, like, a hundred-plus nodes on a single slot, what are you doing? Re-evaluating life?
05:31: Well, the thing is, like, that's gonna make, like, you know, if somebody tries to open it, it's gonna make them explode and disintegrate.
05:38: So they become, they become shnuppets.
05:40: It makes me explode. It makes me cry.
05:44: Don't cry.
05:46: Well, this one makes you cry, but the last one, you said you're gonna drink the people's tears when stuff breaks.
05:54: Different explosions, different disintegrations.
05:58: Anyway, we've got some other questions piling up, so we don't want to make this one into a big one.
06:04: So, the next question is from GameTheCupDog.
06:07: I asked my question ahead of time because I didn't think I could make it.
06:10: Should I clear it from the thread?
06:12: Feel free to keep it there.
06:14: Like, we're gonna get to the questions in the Discord.
06:20: So for now, I'll keep it there.
06:25: Next question is from NukiKoon.
06:26: Question pats?
06:29: Oh, I don't have any pats right now.
06:33: No pats, unfortunately.
06:36: And subscription.
06:39: NukiKoon is asking, would you tell us about your Saturday show?
06:41: I think there's, like, a question within the chat, I'm not sure what the context of that one was.
06:48: Jack the Fox Otter is asking, are there any specific procedural assets you want to add in the future?
06:52: Uh, yes. There's gonna be a lot of procedural assets, like, just generally, like, I'm gonna expand, add, like, you know, more primitives, more kind of common things, add, like, more procedural textures, sounds, you know, like, lots, lots of different kinds.
07:06: It also kind of depends what kind of procedural asset you mean in general, because there's a lot of things that are technically procedural assets, even though you might not even, like, realize that.
07:17: The way UIX works, it actually generates procedural meshes for the actual UI, and then it's rendered out with, you know, the typical mesh renderer that exists on a local slot that you cannot see.
07:34: But under the hood, like, there's a bunch of procedural asset providers, you know, that are sort of orchestrated by the Canvas component, so there's things like that that are also procedural assets.
07:47: One of the examples where you might have such a system which is actually using procedural assets under the hood, but they might not be direct procedural asset components that you can use directly, is something like Metaballs.
08:00: Because Metaballs are not, like, a single thing - usually you don't have just a single Metaball, it's one mesh that you generate from multiple things, and it can have, you know, a variable number of them, where you have things that can generate fields.
08:15: So that's an example of one where technically it's also a procedural asset, but it's not, you know, a single procedural asset component that you use - it's more like a subsystem that's using those for more complex functionality.
08:33: Even something, you know, like, once we have, like, Vertex mesh editing, that's actually gonna use, you know, procedural assets in the background too.
08:40: But yeah, there's a lot of, like, small ones - one off the top of my head is a hemisphere, essentially, you know, half of a sphere.
08:53: Oh, actually, there's, there's one, there's one thing I really want to add that's gonna be a useful building block for lots of procedural assets.
09:04: Right now, we are missing implementation of method for, like, triangulation of arbitrary, you know, arbitrary kind of, like, what's the word?
09:18: Essentially arbitrary kind of, like, you know, kind of curves. Let me grab a brush so I can just draw it quick.
09:23: We're not gonna be moving through the board yet, but I'll make a quick sketch.
09:28: I should have brought my brush at the beginning. There it is. It's my favorite brush.
09:35: I should just save it to the root of my inventory.
09:37: So, like, right now, like, for example, if you have something like this that is convex, you know, say, like, you have, like, these little vertices and you want to fill this,
09:48: we do have, like, a method that's, you know, able to do, you know, that's able to do, like, triangulation that uses, like, a triangle fan, for example.
09:56: So it starts here, it just does like, oh, that's pretty much it.
10:01: And it goes, you know, fills this with triangles. The problem is, what happens if we have, like, a more complex shape that is not convex but concave, so, like, have something like this,
10:12: and it goes inside, and it goes back here, so, like, we have something like that.
10:20: So now, if you did the triangle fan, this would actually break, because if you started
10:24: here, and you did like, you know, now we have a triangle covering this area that's supposed
10:33: to be empty.
10:34: So, typically, you know, you have to use some algorithm, say for example like ear clipping,
10:39: or some other algorithm that's a bit smarter about this, where essentially you give it
10:47: points, and it figures out, okay like maybe I'll put triangle here, I'll put another here,
10:51: and another here, and this kind of covers this shape of triangles.
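A minimal sketch of the ear-clipping idea described above (Python, purely illustrative and not Resonite's actual implementation), assuming a simple, counter-clockwise polygon outline:

```python
# Ear clipping for a simple polygon given as a CCW list of (x, y) points.
# Illustrative sketch only - not Resonite's actual triangulation code.

def cross(o, a, b):
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def point_in_triangle(p, a, b, c):
    d1, d2, d3 = cross(a, b, p), cross(b, c, p), cross(c, a, p)
    return not (min(d1, d2, d3) < 0 and max(d1, d2, d3) > 0)

def ear_clip(points):
    """Return a list of index triples (triangles) covering the polygon."""
    idx = list(range(len(points)))
    triangles = []
    while len(idx) > 3:
        for i in range(len(idx)):
            prev, cur, nxt = idx[i - 1], idx[i], idx[(i + 1) % len(idx)]
            a, b, c = points[prev], points[cur], points[nxt]
            if cross(a, b, c) <= 0:          # reflex vertex, not an ear
                continue
            # an ear must not contain any other polygon vertex
            if any(point_in_triangle(points[j], a, b, c)
                   for j in idx if j not in (prev, cur, nxt)):
                continue
            triangles.append((prev, cur, nxt))
            idx.pop(i)                        # clip the ear and repeat
            break
    triangles.append(tuple(idx))
    return triangles

# Concave "arrow" outline: a naive triangle fan from vertex 0 would cover the
# notch that is supposed to stay empty; ear clipping fills only the interior.
outline = [(0, 0), (4, 0), (4, 3), (2, 1), (0, 3)]
print(ear_clip(outline))
```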
10:55: Once we have that algorithm, that's going to open up options for a lot more procedural
11:02: meshes, because now, you know, this building block can be used for lots of different things,
11:06: and one of the cool things I want to add is 3D text.
11:10: Because if you think about 3D text, the way fonts work is essentially they're sort of
11:17: like, say for example the letter T. So the letter T that might be literally just an outline
11:24: like this, so this is specified by the font file, you get this line, and then we can feed
11:35: it to the algorithm that as triangulation it figures out how to do these, I don't even
11:42: know if I'm doing it right, but it figures out how to kind of fill this out, and now
11:47: we can also extrude it, so we can make it 3D, this is like the simpler part, and make
11:54: this like a thick T, and we can make this a procedural mesh, you just type whatever
12:00: text you want, you give it whatever font you want, and you get that letter as a 3D object
12:08: in here. Plus, you know, all kinds of other stuff, like for example, we could add Bézier
12:15: curves, so we can define some kind of shapes, and then something that kind of triangulates
12:22: it and fills it out, so like, that's going to open up a lot of options, and it's like
12:26: one of the relatively small things, but it's like one that I think is going to open up
12:33: a lot of options, and I just really wanted to do it for a while, and didn't get to it.
12:37: I have it on my list of my kind of fun issues. There's also another one, I do want to add
12:42: also an algorithm that's kind of a building block for a lot of these, called Marching Cubes,
12:51: and essentially lets you reconstruct like a mesh surface for some kind of field, and
12:56: is actually used for implementing meta balls, because meta balls, the way they work, you
13:03: create a field around them, and in order to make it into a mesh, you actually feed
13:08: it into Marching Cubes, and that is what gives you the final mesh from that field. But it's useful
13:12: for a lot of other stuff too. One example is with a particle system, because with particles
13:19: we could make each particle be almost like a little meta ball, and then instead of those
13:24: being rendered individually, we feed it to that algorithm, for the Marching Cubes algorithm,
13:29: and we generate a bunch of particles here, it's going to create this blob, they're going
13:35: to blobify this, another one is going to be like this, and if there's two that are near
13:39: each other, maybe they'll be like this. And what that can be used for is just making it
13:45: look a little bit like liquid. For example, if you've played Portal, specifically Portal
13:49: 2, they have liquid gels, and this is one way you can do it, we're just shooting particles,
13:54: and instead of rendering them as each individual billboard or meshes, you have them goop together.
14:03: So those are some of the things that I would like to add, because I think that's going
14:07: to open up a lot of really cool effects. Plus, once the algorithm's in there, it can be used
14:11: for lots of things. Like, say with the Marching Cubes, once we have a terrain system, that's
14:17: probably going to be one of the building blocks you can use to make terrains, because it's
14:20: also oftentimes used for, you know, destructible terrains, like where the terrain is defined
14:25: by some kind of field, and you can use Marching Cubes to construct a mesh.
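For a rough picture of that idea, here is a small sketch that feeds a metaball-style scalar field into scikit-image's marching_cubes - the library and all values are stand-ins for illustration, not what Resonite would use:

```python
# Metaball-style field fed to marching cubes, purely as an illustration.
import numpy as np
from skimage import measure

# sample a scalar field on a grid: each "ball" contributes a 1/r^2 falloff
centers = np.array([[12.0, 16.0, 16.0], [20.0, 16.0, 16.0]])
grid = np.indices((32, 32, 32)).transpose(1, 2, 3, 0).astype(float)

field = np.zeros((32, 32, 32))
for c in centers:
    r2 = np.sum((grid - c) ** 2, axis=-1)
    field += 1.0 / (r2 + 1e-6)

# extract the isosurface: where the summed field crosses the threshold,
# nearby balls merge into one blobby mesh
verts, faces, normals, values = measure.marching_cubes(field, level=0.05)
print(len(verts), "vertices,", len(faces), "triangles")
```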
14:31: So I hope that answers that question.
14:37: Next question is from Erasmus0211. With Stripe age verification knocking on the door, is
14:44: there plans for nodes to be used with it?
14:46: I think there's actually a little bit of misconception. We did introduce Stripe recently.
14:53: It's not for age verification, it's a payment method. So if you want to support Resonite,
15:01: the only way to do that until now was pretty much Patreon. The problem with Patreon is they take
15:06: pretty big fees, which means, you know, kind of like on the money that you give us, we actually
15:11: lose a lot. We lose several thousand every month, pretty much. With
15:21: Patreon, on average, the fees are like 15-ish percent, give or take. With Stripe,
15:28: so far they're kind of working out to be around 5%. And with amounts like we get, that means
15:34: extra $2-3,000 a month, which we can then invest into other parts of Resonite - more people,
15:42: marketing, services and some other things. So that kind of gives us a lot more resources to work with.
15:49: So if you do support us on Patreon, consider switching to Stripe, because even if you switch
15:56: on the same tier and essentially give the same amount of money, we get a lot more from that
16:01: because Stripe takes less. It's also a more modern implementation. So for example, with Patreon,
16:07: you might have to wait a bit, but if you subscribe via Stripe, you get the benefits
16:10: immediately. It's based like on webhooks, they have like really modern API, so it's a lot more
16:16: powerful. Plus one more benefit, if you want the lowest tier, which on Patreon is $1 a month,
16:24: that works out to $12 a year. On Stripe, we don't actually offer that as a per-month option,
16:32: you have to pay for the whole year, but as a discount, you're paying only $10.
16:37: And the funny part about that is we actually get more money as a result because with the really
16:43: small payments, payment processors will generally take a much bigger cut. So for the $1
16:53: on Patreon, we get, you know, something like 70% because they take around 30%. So we get about
17:00: 70 cents, something like that. So, you know, of the $1, we get like 70 cents.
17:10: And like I said, I don't actually do math in my head super well.
17:14: I think it works out like, I just calculated it before, what's like 0.7 times 12?
17:23: It's like 0.7 times 12. Let me calculate real quick. I'm also terrible at mental math.
17:30: Yes. I could also just do it with ProtoFlux, but like you're on desktop, so it's a little bit easier.
17:37: 0.7 times 12, 8.4.
17:40: So that's 8.4. So with Stripe, they take 5%, and because you pay like $10 at once,
17:49: the fee is much smaller, it's around 5%, and 5% of $10 is like 50 cents. So we get about $9.50.
18:00: So we actually get, you know, more - you pay less and we get more. So it ends up
18:04: working out a lot better for us, especially with the amount of people subscribed.
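A quick back-of-the-envelope check of those numbers, assuming roughly a 30% effective fee on the $1/month Patreon tier and roughly 5% on the $10/year Stripe option:

```python
# Rough comparison of what the platform receives on the lowest tier,
# using the approximate fee rates mentioned above.
patreon_monthly = 1.00          # $1/month tier
patreon_fee = 0.30              # ~30% effective fee on very small payments
patreon_net_year = patreon_monthly * (1 - patreon_fee) * 12   # = $8.40

stripe_yearly = 10.00           # $10 charged once per year
stripe_fee = 0.05               # ~5% fee on the larger single payment
stripe_net_year = stripe_yearly * (1 - stripe_fee)            # = $9.50

print(f"Patreon net/year: ${patreon_net_year:.2f}")
print(f"Stripe  net/year: ${stripe_net_year:.2f}")
```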
18:12: Since you asked about Stripe: we do use it specifically for payments,
18:21: like we're not using the age verification right now. It's possible we might use it in the future,
18:26: we haven't like really started like that process yet, it's something we had conversations about,
18:30: but there's not a solid plan to do that yet. If that were to happen, we're likely,
18:37: you know, gonna open up a discussion and be like, you know, this is what we plan to do, this is
18:40: one of the providers like we're potentially looking at, so, you know, at that point,
18:47: you know, we'll figure out how it's gonna be like integrated exactly. I don't know if there are
18:52: gonna be nodes for it because, you know, that's like way too far ahead because right now,
18:56: right now we're not even at a stage where like, you know, we're doing age verification with Stripe,
19:01: that's not a thing that's happening, at least not now. So like, you know, asking if there's
19:07: gonna be nodes for it, like it's too far into that. But yeah, hopefully that kind of answers
19:14: the question and clears like some misconceptions that maybe exist.
19:21: The next question is from Modern Balloony. Hey, so a question. Do you have an idea of what the
19:26: new IK system would look like or is it a bit too early to say? There's actually a bunch of like
19:30: information, like if you go on GitHub, we collect a bunch of information from people and there's
19:35: some general plans which go into details of what is planned.
19:41: Generally, I have a rough idea how it's going to be structured. But for the specifics,
19:48: that is a bit too early to say, because typically when new features are implemented, the way it
19:54: works is there's like a design phase for it. And during the design phase, it's like, you know,
20:05: if this thing works, you know, this way, how is it going to affect this thing and how is it going
20:08: to do this thing? And it kind of goes through lots of iterations, you know, to get like much
20:13: more solid and robust kind of design. And because like, you know, like we haven't started working
20:20: on that part yet, like it's too early, so like we don't have the robust design. It's just kind of,
20:26: you know, just a general rough idea of how it's going to be structured.
20:32: One of the things, like I do want it to be kind of modular, so you can, you know, for example,
20:37: have like say arbitrary number of like arms and legs. I don't know if it's going to complicate
20:41: things a little bit too much, like it might not, but it also might. And that's something that
20:45: will be revealed during the design phase. So there's like, you know, it's like, you know,
20:51: it's more like a collection of goals. The other part of it is like, that's a problem I wanted to
20:57: talk about, is like some sort of like retargeting, because oftentimes, like, you know, IK has
21:01: issues if you have an avatar that doesn't match your body proportions. If it's one
21:05: that matches, it works pretty well, and the further away it is, the worse it kind
21:11: of gets. Because, you know, if my actual pelvis is here, but
21:16: say the avatar has really short legs and the pelvis would be, you know, much
21:19: lower, the current IK is going to, you know, try to take the pelvis that's, like,
21:24: really low and it's going to try to pull it up. So one of the things I want the IK system
21:29: to have is a mechanism for, you know, retargeting. And what that is, is essentially
21:37: we compute the IK for your actual real body proportions, and then we adjust
21:42: those proportions to match the avatar, so that, you know, it kind of behaves a
21:49: lot better. So that's like another aspect for the IK.
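A very rough sketch of that retargeting idea - solve for your real proportions, then remap onto the avatar's; the numbers and function are purely illustrative, not the planned solver:

```python
# Sketch: solve IK at the user's real proportions, then remap the solved
# vertical offsets by the ratio of avatar to real leg length, instead of
# forcing the avatar's pelvis to the real-world tracker height.
def retarget_height(real_value, real_leg_length, avatar_leg_length):
    """Scale a solved vertical offset by the ratio of leg lengths."""
    return real_value * (avatar_leg_length / real_leg_length)

real_pelvis_height = 0.95       # metres, solved from the user's trackers
real_legs = 0.90                # user's leg length
avatar_legs = 0.45              # avatar with very short legs

avatar_pelvis_height = retarget_height(real_pelvis_height, real_legs, avatar_legs)
print(avatar_pelvis_height)     # ~0.475 - pelvis sits where the avatar expects it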
21:58: But yeah, there's a bunch more details on the GitHub - there's both, like, discussion and the GitHub
22:01: issues, so definitely check those out if you're interested. But for the specifics, you know,
22:07: it is too early. Overall, like the goal is to make it, like this is one of the things
22:15: that's a little bit harder to find, but it's like, you know, it's to make it feel good.
22:20: Especially if you go into some poses, like it doesn't always feel good because it doesn't
22:24: quite match - like the neck is crunched, you know, maybe the hips are not doing
22:28: what they're supposed to do, you know, maybe it's offset weird even if you calibrate
22:32: often. So that's one of the primary goals, like making it feel good. And for that,
22:39: like, it kind of needs a lot of kind of testing where, you know, essentially it's going to
22:43: implement and it's just going to keep like tuning it and tweaking it. And, you know,
22:47: just kind of iterating on how it works. And sometimes during this kind of process, you
22:52: know, there might be some big design changes that need to happen. This is what happened
22:58: during PhotonDust as well, where I had like an initial idea how it's going to work, but
23:01: then actually during the implementation, I found like, okay, this is an issue. I need
23:05: to make some design changes to this. So, you know, things kind of change throughout the
23:13: process. This is generally like, you know, the idea how it's going to kind of work. It's
23:17: going to be composable kind of multiple solvers, hopefully like, you know, modular ones that
23:21: kind of can contribute. And a little bit of detail is going to be working out like once
23:28: it gets prioritized. The next question is from Dev Hummer. How excited are you to have the PhotonDust
23:35: implementation almost done and over with? Yes, it's been... it's kind of turned into
23:42: a little bigger thing, particularly because there's a lot of compatibility issues because
23:49: one thing I didn't expect is, like, there's a lot of messiness with the Unity particle
23:54: system. Some of it is, like, on our end, like on how we kind of expose things, and some of
24:00: it's on Unity's end, like they're doing things in a really weird way. And the problem
24:05: is like, there's been a lot of bugs like with PhotonDust where technically it's not a bug
24:10: of PhotonDust. Like for example, PhotonDust, it uses consistent coordinate systems. So
24:15: like, you know, if you have, did I remove my brush? No, I have two brushes now. Like
24:21: if you have, you know, for example, here, like say you have like an object and this
24:24: would be like its up axis and this would be like, you know, its forward axis. And then
24:30: if you have PhotonDust, you know, and you have a particle and the particle, it essentially
24:34: falls the same, you know, like the axis are going to match. But then in this particle
24:39: system, the particles, for some reason, this would be like, you know, the up axis and this
24:44: would be like, you know, the forward axis. It's like all offset and weird, but also it
24:49: behaves like weird. If the particles aligned with this axis, then, you know, maybe this
24:54: randomly flips here and it's just very messy. But also, this only happens in certain modes - if you
25:04: change it to a different mode, it suddenly matches again. So it feels like, you know,
25:08: their system is just bunch of different systems made by different teams that like didn't super
25:12: coordinate. And that made things very difficult because in order to make, in order to make
25:20: things look pretty much the same as they used to after conversion, we have to replicate
25:26: those bugs. And that takes a lot of effort. Like the most recent one I had to deal with
25:30: is like where somebody used negative value for velocity scaling. And what happens with
25:36: the unity system when you do that is like if you emit a particle, you know, say like,
25:42: this is your source of particles. If you stretch them normally, you know, the particle kind
25:45: of stretches like this, but if it's positive, you know, stretches like this. But if you,
25:51: if you put a negative value, it stretches, but only like, you know, it also offsets.
25:56: So it's no longer centered. And it only does that when you use negative scaling. So I had
26:02: to add a whole new module, like the pivot module, which allows, you know, to offset
26:06: particles from their center, and set up a thing where, if it detects that mapping,
26:10: it configures that module. So it offsets the particles and it makes it look the same.
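A tiny model of the behaviour being replicated - with a positive stretch the particle stays centred, with a negative one it also shifts, which is what the pivot offset compensates for; the numbers are illustrative, not the actual conversion math:

```python
def legacy_stretch_behaviour(stretch):
    """Model of the legacy behaviour described above (illustrative numbers).

    Returns (length_scale, pivot_offset): with a positive stretch the quad
    stays centred; with a negative one it also shifts off-centre, which the
    converter reproduces by configuring PhotonDust's pivot module."""
    length = abs(stretch)
    pivot = 0.0 if stretch >= 0 else stretch * 0.5
    return length, pivot

print(legacy_stretch_behaviour(2.0))    # (2.0, 0.0)   - centred
print(legacy_stretch_behaviour(-2.0))   # (2.0, -1.0)  - stretched and offset
```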
26:16: And that took a while. And, you know, this is also like one of the reasons, like when
26:19: people are like, like, for example, when we say like, we no longer like, we don't support,
26:24: you know, putting negative values for certain things, you know, and people are like, I just
26:27: let it like use it. This is one of the reasons it creates a lot of weird problems. In this case,
26:34: it was already there and we're kind of, you know, committed to preserving that backwards
26:38: compatibility. So I spent time implementing those things. And it's been, you know, kind of a
26:45: long journey. We also had a lot of people from our community, you know,
26:48: finding all these problems and reporting them, which helped a lot, because people helped, you know,
26:52: isolate things and make sure that, like, something like 99% of the content
26:59: should just like work and look the same. But it takes a lot of effort. It takes a lot of kind of
27:04: like, you know, mental drain and so on. So I'm really looking forward to it finally be done.
27:08: And the other part is, you know, like I really want to like, you know, finish the performance
27:13: update as soon as possible. And until PhotonDust is done, you know, we kind of move to the next
27:19: milestone, which is going to be the audio system. There's also like a question for it as well. So
27:23: we'll be talking about it a little bit later. But like, you know, it being nearly done means,
27:31: you know, we can finally like move on to like a different thing, which also helps a bit mentally
27:34: because it kind of changes up like, you know, what I focus on. Because it does get like, like,
27:41: after a bit, it gets a little bit grindy, you know, just kind of working on the same system
27:44: for so long. But hopefully people are still like, you know, like, enjoying it too. Because like,
27:50: whenever I work on it, I try to throw in a few extra things, you know, some
27:55: modules - like recently I've added some so you can, you know, change
27:59: the color of the particles based on velocity. So there's, you know, little sprinkles of
28:04: fun stuff with those compatibility things as well. But yes, I'm very excited. For those who are not
28:11: aware, like we're actually running the last phase of the pre-release testing, the legacy particle
28:16: system is very likely going to be removed sometime next week, assuming there isn't, you know,
28:22: another big blocker, which means everything's going to be automatically converted to PhotonDust.
28:26: And you know, the milestone is going to be done and can move to the audio system.
28:32: But yes, very excited.
28:35: Jack the Fox Otter is asking,
28:37: also, on the topic of mono-packing, how do you plan to implement ProtoFlux static
28:42: asset compilation? It's pretty much going to be a thing. Like, I haven't, again,
28:47: decided on the specific details yet, but pretty much it's like, when you build
28:53: ProtoFlux, ProtoFlux itself, it's technically a separate library and has its own kind of
28:57: representation of all the nodes. And it actually, it doesn't care about any of the, you know,
29:04: about how the nodes are connected to each other. So what is probably going to happen with a static
29:13: asset, we essentially take, you know, like that kind of dynamic representation for the
29:18: nodes that builds the sort of linear representation and that's, what's going to be serialized.
29:23: So it's going to get saved and it's, you know, then going to get loaded. So it kind of skips
29:26: that step. Like, you know, like think about it, you know, we have like a bunch of nodes
29:31: and, you know, so like they're connected and they're like doing things and this one here,
29:35: this one here, this one here, and this is like, you know, you have all these objects and they're
29:38: kind of spatial, and also it's kind of bright so it's harder to see. Just move it here, you know,
29:45: so you have like nodes connected. What PhotonDust does, not PhotonDust, ProtoFlux,
29:51: it essentially converts, you know, to a serial list of nodes. So you have like, you know,
29:55: one node, you have the other node, and you have the third node, and then each node has like,
29:59: it connects here, and connects here, just kind of referencing things. So when you compile
30:05: into static asset, like this sort of happens in the background, when you're connecting things,
30:09: it's building this with ProtoFlux. If you compile into static asset, we'll just take this,
30:16: and we'll save this, and this gets thrown away. And that way, you know, it kind of skips that like
30:24: whole step, like where it has to convert and figure out the connections, like from the binding
30:29: Resonite side, and build this, and we just load this directly, which is gonna, you know, help.
30:39: And then we'll load it in.
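As a sketch of that flattening idea - turning the visual graph into a linear, index-based list that can be saved and loaded directly - here is an illustrative example; the types and format are hypothetical, not the actual ProtoFlux representation:

```python
from dataclasses import dataclass, field
import json

# Hypothetical flattened representation: each node stores indices into the
# same list instead of references to scene objects, so it can be saved and
# loaded directly without rebuilding the graph from the visual wiring.
@dataclass
class FlatNode:
    op: str
    inputs: list = field(default_factory=list)   # indices of upstream nodes

def compile_graph(visual_nodes):
    """Assign each visual node an index and rewrite its connections as indices."""
    index = {node["id"]: i for i, node in enumerate(visual_nodes)}
    return [FlatNode(node["op"], [index[s] for s in node["sources"]])
            for node in visual_nodes]

graph = [
    {"id": "a", "op": "ValueInput<float>", "sources": []},
    {"id": "b", "op": "ValueInput<float>", "sources": []},
    {"id": "c", "op": "Add<float>", "sources": ["a", "b"]},
]
flat = compile_graph(graph)
print(json.dumps([vars(n) for n in flat]))   # what would get saved as the asset
```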
30:42: NukiKoon's asking, do you like getting pets? Do you mean pets as in animals, or like being petted?
30:53: It's fine, yes.
30:59: Moonbase is asking, can I ask where you'll be going? I don't know what it is in context of.
31:08: Where you'll be going? I'm trying to think of what we're talking about, like...
31:13: Yeah, like if you're asking something, provide context, please.
31:18: Next question we have from Rabbuts.
31:21: Hi, I have a question. With the addition of something like a rigidbody, what kinds of things can we do, and how it will be implemented?
31:28: So, for the rigidbody, it's probably gonna copy a lot of the stuff that's already kind of in Bepu Physics.
31:36: Because usually with rigidbodies, you have the rigidbody itself, so the body can simulate and tumble around, and it's physically simulated.
31:46: There's gonna be mechanisms for that, for efficient synchronization, so if you have multiple bodies, it all stays in sync for multiple users.
31:55: There's also probably gonna be components to work with those rigidbodies, so you can grab them, toss them around, move them, apply forces, and so on.
32:03: And those will be likely designed in a way so there's some sort of handoff.
32:09: For example, if you're the one interacting with a rigidbody, it's gonna primarily simulate on your end, you're gonna be the authority on it,
32:18: and if somebody else touches it, maybe that ownership goes to them, and they're the ones.
32:23: That way, it ensures that for anything you do with a rigidbody, you have the low latency, you don't have to wait for a simulation to happen on somebody else.
32:32: And then get the data back.
32:35: There's also gonna be, like I mentioned, a bunch of components that work with it, so mechanisms to apply forces, to read the values back, do things.
32:43: And we're also gonna integrate constraints, because oftentimes physics engines, they come with a number of constraints.
32:49: You can, for example, say, hey, I have one rigidbody here, and there's another one here, and maybe I add a joint here.
32:58: So it moves like this, and this whole thing is gonna move, and it's connected by the joint.
33:04: Or maybe there's a spring in between.
33:08: Maybe there's a spring.
33:11: So there's a number of different constraints which let you tie them together and create more complex creations.
33:17: So there's gonna be components, and also some tools, so you can work with all that and make all kinds of contraptions.
33:27: So that's very likely how that's gonna happen.
33:30: It's like the rigidbody component itself, a bunch of components to work with, and also like, you know, integrating constraints and other bits.
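A minimal sketch of the authority-handoff idea described above; the class and interfaces are illustrative, not actual Resonite components:

```python
# Sketch of authority handoff for networked rigidbodies: whoever interacted
# last simulates locally and broadcasts the result, so that user gets
# zero-latency physics while everyone else just applies the received state.
class NetworkedRigidbody:
    def __init__(self, body_id):
        self.body_id = body_id
        self.authority = None          # user id currently simulating this body

    def on_interact(self, user_id):
        self.authority = user_id       # ownership hands off to whoever touched it

    def step(self, local_user_id, physics, network):
        if self.authority == local_user_id:
            state = physics.simulate(self.body_id)    # simulate locally
            network.broadcast_state(self.body_id, state)
        else:
            state = network.latest_state(self.body_id)
            physics.apply_remote_state(self.body_id, state)
```

Here `physics` and `network` are assumed placeholder interfaces standing in for the physics engine and the sync layer.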
33:41: Navy3001 is asking, just wondering, do we know of UploadVR?
33:44: Yes.
33:45: Actually, there should have been an article published within the last few months,
33:51: because we also did an interview with Voices of VR.
33:56: No, sorry, not Voices of VR.
33:59: This is gonna bug me.
34:01: My brain just blanked...
34:04: Is this the show that Skeeva and Alex host?
34:10: I just completely blanked out the name.
34:15: It was like a VR podcast, and they do collab with UploadVR as well.
34:22: What was the...
34:32: Let's see...
34:33: I completely blanked out on it.
34:36: There's a...
34:38: Yeah, they also write an article about us as well.
34:44: Do you remember...
34:46: Oh, Between Realities, yes.
34:48: That was the Between Realities podcast.
34:52: And they did publish it on UploadVR because they do collab with them.
34:58: Next question's from Moonbase.
35:01: Isn't it... Oh, they're talking.
35:04: Next question, Tyre Whitefail is asking,
35:07: If we want to move our support account off Patreon, what is the best way?
35:11: Sign up on the...
35:13: oh, sorry, the linked
35:14: page first, and then cancel renewal on Patreon, or does it matter?
35:19: When you cancel Patreon, it doesn't actually...
35:22: It doesn't drop your current perks.
35:25: It's just they don't get renewed.
35:27: So in what order you do it, it doesn't really matter.
35:30: You can cancel now, and then even subscribe to Stripe later on.
35:35: Say if your Patreon was about to renew on March 1st,
35:39: you can cancel it right now, and then you can sub on Stripe on March 1st,
35:45: because there's a little bit of a grace period.
35:48: Or maybe a little bit before then, maybe a little bit after.
35:52: You just want to avoid a period where you don't have an active subscription.
35:55: But if you do cancel, the reminder of your subscription for that month,
35:59: it still completes.
36:01: This should create an interesting effect, because the way our system works,
36:05: if you subscribe and have both active, a lot of the perks, they will stack.
36:10: So for example, if you're getting, say, 25GB from Patreon,
36:13: and you subscribe to Stripe, you get 50GB total.
36:16: Once Patreon expires, then those 25GB go away, and you get 25GB at that point.
36:29: It doesn't super matter. The only thing you want to make sure is,
36:33: if you don't want to lose your benefits, you have to subscribe before the Patreon one expires.
36:43: Navy3000 was asking, what are your thoughts on hiring a project manager?
36:47: I mean, it would kind of help for some things.
36:49: The main thing is, you know, the hiring requires giving them a wage,
36:54: and we are a small company, so that can be quite costly for us.
37:00: We have a number of people who kind of are in that role -
37:05: both like Bob the Good and ProbablePrime are kind of like, you know,
37:08: taking some project manager responsibilities, making sure stuff kind of goes through on different things.
37:13: And I kind of do a little bit of it on some things, but I feel like I don't like doing it super much,
37:17: I don't like, you know, kind of poking people, being like, "are you working on this thing" and so on.
37:23: So it's like, it's, it's like helpful thing, like, especially like having somebody like dedicated to it,
37:32: but I don't think like, you know, right now that would be the best kind of like investment.
37:38: Like, I don't think like it would bring enough benefit over like, you know, something like additional engineer.
37:43: Because if you, if you're like going to think about these things when you hire somebody,
37:48: you have to think, you know, what is going to be,
37:51: what is going to bring the most value to the company and to the project at this current time.
37:57: And sometimes like, you know, there's like bunch of departments that need more work.
38:03: And maybe, you know, we get like more benefit out of additional engineer compared to like, you know, a project manager.
38:09: If the engineer is like, you know, if the engineer is kind of good fit and they can like work pretty like independently on a lot of things.
38:15: So it depends.
38:17: It's always like, you know, because we get like all of questions, you know, like, why not hire this person?
38:23: Why not hire this person? And people don't realize, you know, there's only so many people we can actually afford to hire.
38:34: We have to consider which ones can bring the biggest benefit.
38:38: Because we just, you know, we cannot do them all.
38:41: At least not at once. You want to like, you know, grow gradually.
38:46: The next question is from FemiScout.
38:49: Question, what is your favorite Tetris cube?
38:52: I would say, I forget, I don't know their names, but this one.
38:57: Just because it's very satisfying if you have like, you know, whole row and you put it there and it just goes.
39:03: Four rows at once.
39:06: Do you have a favorite Tetris cube?
39:11: I like the T-shape one, it's satisfying.
39:15: This one?
39:16: Yeah, like when you, it's like a key, like when you lock it in.
39:21: That's also a good one.
39:23: It feels like very satisfying when I can fit it.
39:26: Yeah.
39:31: Next question, JWiden4 is asking, what's the future of videos and livestreams in Resonite?
39:37: I ask because I think Resonite could be amazing for watch-togethers.
39:42: We're trying to watch Stargate in here and it's been fun, but it's difficult since audio seems to desync consistently.
39:47: Unity and libVLC both seem to have their quirks, and it makes me wonder if things would be improved, for example, through an MPV/FFmpeg-based backend.
39:54: Or if the problem is deeper and can't be fixed until the new audio rendering - thoughts?
40:01: So the problem is we're not using LBLC directly.
40:06: It's actually kind of interesting because Unity shouldn't desync because it has more...
40:11: It shouldn't desync audio because it has more direct control.
40:14: The problem is the Unity one, it's very fragile.
40:19: We have to be very careful what video streams we feed it, because it doesn't do much validation.
40:26: Early on in its implementation, we would essentially let it try to load any stream, and if it fails then we fall back to libVLC.
40:35: The problem is sometimes it says it can load a stream when it cannot, and then what you get is horrible graphical glitches and screeching in your ears.
40:43: Which is a no-no.
40:46: So we have to add pre-filtering where we are very cautious which data we feed it.
40:56: Because if we figure out it's probably not going to be able to play this one, we just fall back to libVLC, even though maybe it could play it.
41:03: But it's better to err on the side of caution than have people get horrible screeching in their ears, because that was awful.
41:13: For libVLC, the situation is a little bit complicated, because we are not actually using libVLC directly.
41:20: We are using a project called UMP, Universal Media Player, which is sort of a wrapper around libVLC.
41:27: Our problem is it doesn't expose a lot of things we would need from the VLC part for audio synchronization.
41:32: We sort of have to hack it to get good audio data and we don't have any sort of timestamp when we get it.
41:40: We just try to read it out as fast as we can and the problem is there is not a mechanism to determine how is it aligned to the video.
41:49: There is a mechanism within libVLC, because libVLC itself, the way it gives you audio, you actually register a callback method.
41:56: It calls this when there is new audio data; it also has a bunch of callback methods where it says the user has seeked to a different part of the video, or there was a drop:
42:07: clear whatever audio data you are about to play, start fresh.
42:11: So it kind of controls the playback and makes sure it matches the video.
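A generic sketch of that callback pattern - timestamped audio pushed by the decoder, plus a flush on seeks - with hypothetical names; this is not the UMP or libVLC API:

```python
import collections

# Generic sketch of the callback pattern described above: the decoder pushes
# timestamped audio blocks via a "play" callback and asks us to flush on
# seeks/drops, which is what keeps audio aligned with the video.
class AudioSink:
    def __init__(self):
        self.queue = collections.deque()

    def on_play(self, samples, pts):        # called by the decoder with a timestamp
        self.queue.append((pts, samples))

    def on_flush(self):                     # called on seek or dropped frames
        self.queue.clear()                  # throw away stale audio, start fresh

    def pull(self, video_pts, tolerance=0.05):
        """Return audio whose timestamp matches the current video time."""
        while self.queue and self.queue[0][0] < video_pts - tolerance:
            self.queue.popleft()            # drop audio that is already late
        return self.queue.popleft()[1] if self.queue else None
```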
42:16: We wanted to switch to the official libVLC library.
42:20: Right now it's kind of gotten stuck, because as far as I'm aware there have been some issues with running it.
42:29: We could look into other solutions. There is actually one that I think was called AVPro or something like that that I wanted to explore.
42:38: But the problem is it's a paid solution, so it's a question of do we want to fork over a thousand bucks for a solution right now that we won't be able to port to the new engine.
42:51: There might be an FFmpeg-based one, maybe. The main thing we pretty much need is for that solution to have integration with Unity.
43:02: Because you can't just use it on its own - it needs to be integrated with the game engine to make the video data available as a texture that you can then project on other things, and that requires specific integration.
43:16: There's actually one thing that might be worth it, if we open this up: if there's interest in the community to help push some of these things through, that's something we could use help with.
43:30: For example, the issues with libVLC - if people want to look into those, that could help as a contribution.
43:38: But it's a difficult thing of how much time do we want to invest and how much time and money do we want to invest into Unity when we want to switch, because the new rendering engine that we switch to, that integration needs to also be done with that engine.
43:55: So the work that's done for Unity is eventually going to be wasted.
44:00: Which makes it a little bit of a trickier thing, because it might take a fair amount of time, and then if we waste that time, that might hurt us,
44:11: versus just putting the time into making the switch.
44:17: But yeah, there are different little things that could make it better overall, or just fix a lot of these issues.
44:24: Next question, GameTheCupDog is asking: How would making procedural meshes from scratch in ProtoFlux work? What features would be needed?
44:33: So there's like two ways that's going to be possible. One, there's a mesh DSP that's planned.
44:39: It's like a data processing pipeline, where for example you have a node, and the node is a mesh source, maybe it gives you a grid.
44:51: This gives you a grid, and you say I want this grid to be 10x10.
44:58: And then maybe you feed it to a node, where it takes a texture, and you have some kind of noise texture, and it displaces the grid, it does a thing.
45:09: And maybe you feed it to another thing, which maybe voxelizes it, does processes.
45:14: And as you change these parameters, it kind of goes through this pipe, and spits out a mesh at the end of it.
45:21: So you can do a bunch of meshes, there are several sources, you can feed those, and do various processing on them, various filtering.
45:32: The other aspect to this, which is also going to plug into this, is being able to just build a mesh from scratch, from the primitives.
45:40: Which is very likely going to work in a way where you have some kind of ProtoFlux function.
45:52: And what it's going to need is collections. And essentially you get an impulse, and then what you do, you have some kind of for-loop.
46:00: And this is backwards for you. You have a for-loop, and you build out a list of triangles, and you compute their positions, and pretty much do whatever you want.
46:12: You add vertices, you add triangles, you build a mesh literally from those primitive pieces, do whatever math you want, and once it finishes, you have your procedural mesh.
46:23: And this mesh is parameterized by some things, so you can for example have whatever values, and you plug into this, and you build the vertex by vertex, triangle by triangle, you position them however you want.
46:38: This will give you the ultimate flexibility to build procedural meshes.
46:43: And the cool thing is, once you have this, you can then wrap it into its own node, and it outputs a mesh that it generates, and then you can actually put it into another filter, maybe applying a subdivision surface to make it smoother and so on.
47:07: So that way you can kind of compose these things, the systems will interact with each other.
47:15: One of the goals for the procedural meshes is, once we have the definition, and I talked about this in one of the recent resonances, there should be a video, but once we have the definition of it, this actually becomes a component.
47:31: So you have two parameterizations, like A and B, and then this becomes a component, you can just attach this procedural mesh, and you have your A and B, and you just plug values, and it just runs your code on a background thread to generate that mesh.
47:54: So I think it's going to be a really cool, really powerful mechanism for building procedural assets on your own.
48:02: It works similar for textures, where you literally just loop through all the pixels, do whatever math you want to compute each pixel; and the same way to make a procedural sound, you just compute the individual samples.
48:18: It's just working from the basic parameters that these assets are composed of.
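A small sketch of what "build it vertex by vertex, triangle by triangle" could look like, with the parameters playing the role of the inputs mentioned above; purely illustrative, not the planned ProtoFlux nodes:

```python
import math

def displaced_grid(size, amplitude):
    """Build a grid mesh vertex-by-vertex, displacing heights with a sine 'noise'.

    The parameters (size, amplitude) play the role of the parameterized inputs
    described above: change them and the mesh regenerates."""
    vertices, triangles = [], []
    for z in range(size + 1):
        for x in range(size + 1):
            y = amplitude * math.sin(x * 0.7) * math.cos(z * 0.7)
            vertices.append((x, y, z))
    for z in range(size):
        for x in range(size):
            i = z * (size + 1) + x
            triangles.append((i, i + size + 1, i + 1))
            triangles.append((i + 1, i + size + 1, i + size + 2))
    return vertices, triangles

verts, tris = displaced_grid(10, 0.5)
print(len(verts), "vertices,", len(tris), "triangles")
```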
48:24: Nanotopia is asking, in a previous OH, you mentioned animation timelines maybe happening in Resonite. This would of course be amazing, so I guess my question is, is something like this in the works, or is it just a thought at the moment? Thanks.
48:38: There's an issue for it, which describes what this will do. It's not being actively worked on right now in terms of fully designing it, implementing it. It's one of those background things that's like, you know, this is generally how it's going to work.
48:54: Because whenever features are being added into Resonite, the process is like, you know, there's a lot of features that are not being actively worked on, because the focus is
49:07: on the performance update, for example, specifically finishing PhotonDust and so on.
49:12: But there's like, you know, these little threads that kind of happen in the background, just thinking, you know, this is how this is going to work.
49:18: And they sometimes influence how other things are done. Because the way, you know, Resonite kind of works, it's also like, it's designed so it will work really well with the timeline.
49:31: And some of the features that are being added right now, like for example, when I was working in ProtoFlux, there are thoughts how this is going to work with the timeline feature.
49:39: So there's like little like nuggets and threads, you know, that sort of go into that feature. So there's like little like tiny pieces of work that kind of go into it, like, you know, design wise.
49:51: And at some point, we're going to be like, okay, now we're implementing this feature, and all those threads kind of come together, and that becomes, you know, the major focus that's being, you know, implemented right now.
50:02: So yeah, it's not actively being worked on right now, like not in any major sense, but it's like a thread that's kind of building.
50:14: The next question is, Cyro was working in parallel on the audio system, right? How's that going?
50:18: So Cyro's been working specifically on the reverb implementation, because the existing one in Unity - you know, Unity has like a reverb zone.
50:27: And we need to, you know, provide an equivalent, something that's similar, so Cyro has been working on integrating a library for, like, the Zita reverb.
50:35: You can actually talk a little bit more about it.
50:39: Yeah, so the the actual library that we're using is, it's called Soundpipe.
50:45: And it's made by this guy called Paul, I think it's like, I think his name is Paul Batchelor. I can't remember, I'd have to look it up.
50:53: But what it does is it provides a basically like we were looking for a library that provided like a nice set of sound effects, particularly the reverb because we need that.
51:08: And Soundpipe seemed to be pretty good. It's a little C project. It's easy to build. It's not too much hassle to really integrate.
51:16: And one of the effects it provides is, of course, the Zita Reverb, which I think it's a, I think it's kind of like a well established like type of reverb.
51:30: I'm not really like too well versed in audio, so I'm not quite sure on that one.
51:41: But one of the nice things about it is, is compared to Unity's reverb, which is, which uses FMOD as the underlying type of like audio processing system.
51:54: This one actually is like stereo. So the reverb is like in both ears and there's like slight differences between both ears.
52:04: And while I was like AB comparing it, I realized that like, it just, it sounds way better. Like it just it sounds a lot more like full and like rich and like the reverb is coming from like all around me.
52:19: Yeah, it's gonna be cool once I've kind of have it, especially once we have like a, as audio zone people can play with it too.
52:26: So yeah, it's essentially a thing, you know, because like, this is like one of the components of the audio system that like I specifically kind of needed like help with.
52:34: So I kind of know Cyron has been like, doing like a lot of kind of good work, like, you know, for both like finding because I think I can ask you to like, you know, look into like what solutions exist and what's and then like, you know, integrating making a wrapper, but also the really important thing that you've done.
52:50: Cyron has like donated like his ears and probably some of his sanity just doing a lot of A-B testing to like, because the Zita Reverb, it works different, you know, from the FMOD reverb, which means we need to kind of map the existing presets to the new one.
53:06: And Cyron essentially did like a lot of A-B testing, you know, just to kind of manually match them.
53:12: Yeah, so Unity has a lot of different parameters that control like various aspects about how like the sound is processed, like how much quote unquote room it has, or like how much the lower frequencies like Resonate versus the higher frequencies and how long they do that and how they sound.
53:35: And Zita has a lot of those same parameters, but it has like a couple parameters that are kind of like rolled into one.
53:49: So like, it technically has fewer inputs than the Unity one, but like you can make it sound a lot better.
53:56: The problem is making them sound the same.
53:59: And so what I had to do was I had to take like music and I like recorded my own voice.
54:05: And over the course of like a week or two, I just sat in my homeworld, basically with my, I like made a little like mock-up, like reverb so I can listen to it in game.
54:20: And I just had to compare them and tweak the values and compare them and tweak the values.
54:26: Time to replicate the sewer pipe preset. Oh boy.
54:31: And I just sat there and I like tweaked the like the high frequencies and the low frequencies and how long they resonated.
54:39: And like. Oh my gosh, dude, I maybe listen to the same audio clip over like 200 times.
54:51: We appreciate your sacrifice.
54:55: It was a harrowing experience.
54:57: One of the reasons, you know, why I also was like, you know, like Cyro, could you like help with this thing?
55:02: It like saves a lot of time, you know, like so I can kind of focus on the other things too.
55:07: But it's going to be worth it.
55:09: And it's going to be like, you know, I'm kind of excited to like get like final like integrated and, you know, with the PhotonDust like being done.
55:17: Yeah, and you'll be able to actually play with it right away in the form of,
55:24: I made it so that you can actually like process audio clips using the using the reverb parameters.
55:30: So any audio clip you have in game, you'll be able to apply reverb to.
55:34: So maybe like something you can make tools that, you know, automatically apply certain presets to specific audio clips for certain areas of your world, for example.
55:43: I don't know.
55:43: Yeah.
55:44: I want people to do something cool with it.
55:46: So this is a kind of cool thing is because like when something like that is like integrated, you know, it's a building block and it can be kind of used and exposed in lots of different ways.
55:54: One thing we also want to do is, once we have the audio DSP for ProtoFlux, we're going to turn it into a node, so you can actually pipe audio through it, you know, to kind of do whatever audio effects you want.
56:06: And there's also like, you know, a whole bunch of like other effects in the library that like, you know, might end up like integrating as well.
56:13: Yeah.
56:14: So yeah, it's going to be, there's always like the way we approach things and always like, make things more into like, you know, building blocks that can be like, you know, used in lots of different like ways.
56:26: But I hope that answers that question.
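As a rough illustration of "process an audio clip with reverb parameters", here is a tiny feedback-delay sketch - a stand-in for the idea, not Soundpipe's Zita reverb, and all parameter values are made up:

```python
import numpy as np

def simple_reverb(dry, sample_rate, delay_s=0.03, decay=0.6, wet=0.4, taps=4):
    """Very small feedback-delay 'reverb' - a stand-in for the Zita reverb,
    just to illustrate offline processing of a clip with reverb parameters."""
    out = dry.astype(float).copy()
    for t in range(1, taps + 1):
        d = int(delay_s * t * sample_rate)
        echo = np.zeros_like(out)
        echo[d:] = dry[:-d] * (decay ** t)   # each tap is a quieter, later copy
        out += echo
    return (1 - wet) * dry + wet * out

sr = 48000
clip = np.random.randn(sr)                   # one second of noise as a test clip
processed = simple_reverb(clip, sr)
```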
56:30: I will, I will say, were we... were we going to get to the questions that were in the Discord?
56:36: Okay.
56:37: Ah yes, I was going to do that, there's a few. I was going to do it at the full hour, because there's like a whole bunch of things.
56:45: So we're going to answer the questions from Discord in a second.
56:50: We're going to go through a few of these because there's like very quick ones.
56:55: NukiKoon is asking - well, actually not asking - I give Frooxius and Cyro one pet each.
57:01: One pet.
57:02: Thank you.
57:03: I'm honored.
57:04: Oh yeah, JacktheFroxius is asking, is there a better libvlc library for C-sharp?
57:13: So just to kind of clarify, for what I was saying earlier, the official, like libvlc now has an official C-sharp implementation.
57:20: The problem is, we need a Unity integration on top of that.
57:25: And that's kind of, you know, where some of the kind of issues lie.
57:27: Because if there's an integration, like, you know, we can kind of plug it in and replace the existing one.
57:33: Of course I know there's some issues with that implementation.
57:38: Last time Geenz was kind of looking into this part, so you might want to ask him in his office hours for any details on that.
57:51: So I'm going to go to the questions, this one's a bigger one, so I'll check the questions people ask in the Discord.
58:02: So Red has been asking, what are some of the currently planned extra features in the new audio system?
58:10: Kind of like how we got Simplex Turbulence in PhotonDust.
58:13: We actually kind of covered some of this in a previous Resonance, where we kind of went into details how it's going to work and so on.
58:20: And there's a published video, so I do recommend checking that one out.
58:23: But in short, one of the features I'm particularly looking forward to, that's actually kind of relevant to this as well, is where we can have multiple listeners.
58:34: So instead of, like... you know, right now you have only one listener, which is why I pretty much set the audio to broadcast.
58:43: If I set the audio to spatialized for Cyro, for example, you would hear him wrong, because you would be hearing him from my perspective.
58:50: I can switch it to the perspective of the camera, but then it would be wrong for me, and it really messes with my brain.
58:57: So I don't do it, and I just do broadcast.
59:00: And the reason we cannot do it is because with Unity, we can only have one sort of viewpoint for the ears.
59:10: With our own system, we can design as many listeners as we need to, which means we can render audio for the user,
59:18: and then we can actually render the audio again for the viewpoint of the camera, and send it to a different audio device.
59:25: That way you can have audio that's going to you, and you can have another audio that's going to the stream,
59:30: and that has the audio spatialized fully from the camera's viewpoint.
59:34: What's even cooler with that one, we might make it so you don't need to capture it on a microphone,
59:39: we'll render your own audio spatialized for the camera, so you're also not in broadcast,
59:45: you're going to be properly spatialized for the camera's viewpoint as everybody else.
59:52: There's other things I want to do with that one, for example make it so you can make a microphone in the world,
59:56: and you can just record things into spatialized audio.
59:59: Or maybe you can route that into an audio source, so you can make walkie-talkies and things like that.
01:00:07: That's really one of the things.
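To make the multiple-listeners idea concrete, here is a purely conceptual sketch; the types and method names are hypothetical, not FrooxEngine APIs. The point is that the same sources get spatialized once per listener viewpoint (the user, the stream camera, a world microphone) and written to that listener's own output:

```csharp
// Conceptual sketch: every listener gets its own spatialized mix of the same sources.
using System;
using System.Collections.Generic;
using System.Numerics;

interface IAudioSource
{
    // Spatialize this source relative to the given listener pose and mix it
    // into the listener's buffer.
    void RenderInto(float[] buffer, Vector3 listenerPosition, Quaternion listenerRotation);
}

class Listener
{
    public Vector3 Position;
    public Quaternion Rotation;
    public float[] OutputBuffer = new float[1024];   // routed to its own audio device/stream
}

static class MultiListenerMixer
{
    public static void RenderAll(IReadOnlyList<IAudioSource> sources, IReadOnlyList<Listener> listeners)
    {
        foreach (var listener in listeners)           // e.g. the user AND the stream camera
        {
            Array.Clear(listener.OutputBuffer, 0, listener.OutputBuffer.Length);
            foreach (var source in sources)
                source.RenderInto(listener.OutputBuffer, listener.Position, listener.Rotation);
        }
    }
}
```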
01:00:08: The other part, we're going to have much better control over how the audio falloff works,
01:00:13: so there's probably going to be a bunch of features with that, exposing more control,
01:00:18: maybe making settings where you can tweak the falloff locally,
01:00:24: for example if you get overwhelmed with audio, or maybe you have trouble hearing people.
01:00:30: So there's stuff like that.
01:00:32: Some of it we might not do as part of this, because for the switch, the main goal is feature parity.
01:00:40: There's a lot of features we could add,
01:00:43: if they take a lot of time, that means they delay the performance update, which is the main focus.
01:00:50: So usually we'll only add things that are small enough.
01:00:55: But we also might add, for example, mechanisms where you can have...
01:00:59: Because right now audio sources, the falloff around a source of audio, is just a sphere.
01:01:05: If you're here, it's the same volume as if you're here, because it's the same distance.
01:01:12: But what we might do is add ways to define them to be different.
01:01:17: For example, maybe the audio is more like this.
01:01:21: It's very quiet here.
01:01:24: It's very loud if you're in front of it, but quiet if you're behind, so you can do different shapes.
01:01:32: Because we'll control how the volume is computed based on the positions.
01:01:38: And we can plug whatever functions we want there.
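As a rough illustration of what "plugging whatever function we want" could mean (hypothetical names, not an actual Resonite API), a pluggable falloff could combine distance attenuation with a directivity pattern like the shape described above, loud in front and quiet behind:

```csharp
// Illustrative falloff: inverse distance combined with a cardioid directivity pattern.
using System;
using System.Numerics;

static class Falloff
{
    public static float Cardioid(Vector3 sourcePos, Vector3 sourceForward, Vector3 listenerPos)
    {
        Vector3 toListener = listenerPos - sourcePos;
        float distance = MathF.Max(toListener.Length(), 0.01f);        // avoid divide-by-zero
        float cosAngle = Vector3.Dot(Vector3.Normalize(toListener),
                                     Vector3.Normalize(sourceForward));
        float directivity = 0.5f * (1f + cosAngle);                    // 1 in front, 0 behind
        return directivity / distance;                                 // gain for this source
    }
}
```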
01:01:42: The other part I was gonna mention with that...
01:01:47: What was I gonna mention? I kinda blanked out again.
01:01:54: And I'm blanked out.
01:01:58: There's another thing...
01:02:01: I don't remember now.
01:02:04: But there's a few.
01:02:07: One of the things I was looking into is that we want to add it eventually, but it probably isn't gonna happen for this one for MVP.
01:02:16: We're gonna be integrating Steam Audio.
01:02:18: And Steam Audio supports...
01:02:21: You can actually use the world geometry for audio bouncing and occluding.
01:02:26: Oh! Actually I remembered, there's another thing that I found.
01:02:29: One of the things that Steam Audio supports is simulating air absorption.
01:02:33: So if something's really far away, certain frequencies get absorbed more than others.
01:02:38: And it should be easy enough to integrate, so maybe we'll get that one.
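A much-simplified sketch of the air absorption idea (illustrative coefficients, not Steam Audio's actual model): attenuate higher frequency bands more per meter of distance, so far-away sounds lose their highs first.

```csharp
// Toy model: per-band exponential attenuation with distance.
using System;

static class AirAbsorption
{
    // Low / mid / high band absorption per meter (illustrative values only).
    static readonly float[] AbsorptionPerMeter = { 0.0002f, 0.0017f, 0.0182f };

    public static void Apply(float[] bandGains, float distance)
    {
        for (int band = 0; band < bandGains.Length; band++)
            bandGains[band] *= MathF.Exp(-AbsorptionPerMeter[band] * distance);
    }
}
```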
01:02:45: Oh, and the other one is...
01:02:47: We might support loading custom SOFA files for...
01:02:53: Steam Audio uses something called an HRTF for the audio spatialization, which is a head-related transfer function.
01:03:03: And usually it's just some default ones that come with it, so they work okay for most people.
01:03:07: But you can actually get different ones, you can even get your own ears measured.
01:03:12: So you get a custom one that matches your physiology.
01:03:18: So we might expose mechanisms so you can just load a custom one and have more personalized audio spatialization.
01:03:24: I'm not making any promises on these things right now.
01:03:28: Like I said, the main goal is feature parity.
01:03:37: So this should answer that question.
01:03:42: Next question is from GameTheCupDog.
01:03:46: When multiprocess happens, how viable would hijacking the FrooxEngine-to-Unity IPC for a custom renderer be?
01:03:54: I mean, it kind of depends because one of the things we do want to do ourselves is swap it for a custom rendering engine.
01:04:01: So we have FrooxEngine, there's all this stuff, and then it's coming together over IPC, and it's rendering stuff.
01:04:10: Which makes it easier, because we now have this well-defined interface, it makes it easier to take this away and put a different one in.
01:04:19: The problem is, when we do this, we're probably going to change this a fair bit, so it's not going to work 100% the same.
01:04:28: And then we're going to make it so it works really well with the new engine.
01:04:31: It's going to work on the similar principles, but we can change how bits on this side also work, so it works better with the new engine.
01:04:39: Which gives us more flexibility.
01:04:42: So, if you mean swapping it on our end, like that's something that will happen.
01:04:49: If you mean swapping it by somebody from the community, that's actually harder, I feel.
01:04:58: One, you'll have to reverse engineer how this works to make a different renderer.
01:05:07: And the question is, are you going to replace it with a completely different renderer?
01:05:11: Because making a different renderer, that's a lot of work.
01:05:16: A lot.
01:05:17: You have to implement all the mechanics of how it works, all the intricacies, you have to implement all the shaders.
01:05:24: All the functionality.
01:05:28: It's something you could technically do, it might be way more work than you expect.
01:05:36: Plus, if you were doing that, you don't have the flexibility of adjusting this to match the new renderer.
01:05:43: Which we do for our own development.
01:05:47: Practically, I don't think that's going to happen; I think it's more likely somebody makes a modified version.
01:05:51: So you take our Unity renderer and you make modifications to it, but you still build around the same base.
01:05:57: If you want to make one from scratch, that's going to be tough.
01:06:02: Especially if you want to do full feature parity with everything.
01:06:07: But it's going to happen, we might even ask for some community help with the official ones.
01:06:13: We'll see how that goes.
01:06:15: And the last question in the channel is from Ozzy.
01:06:22: I've been trying to understand some of the intricacies of haptics and have some questions about it. Feel free to go over the questions given I'm asking a lot.
01:06:28: Is the HapticManager component that injects local haptics for other users meant to not have any debug visuals when checked?
01:06:36: I'm actually not sure on that one.
01:06:38: There should be a component that visualizes the haptics.
01:06:43: I don't remember off the top of my head. I would have to get haptics set up.
01:06:47: Actually, no, we should be able to show it off, maybe.
01:06:50: Sorry, do you mind if I inspect you?
01:06:54: Yeah, I don't mind.
01:06:55: Let me have a look, because there should be a bunch of components that you can poke at.
01:07:02: So I'm actually going to switch the camera to POV.
01:07:06: And we're going to inspect Cyro.
01:07:09: So we're going to open him up.
01:07:12: Good thing inspected.
01:07:14: Let's see, let's go...
01:07:18: Is this the top, or is there more?
01:07:22: No, there is more, I think.
01:07:26: Yeah, there is more.
01:07:30: Let's see, let's open...
01:07:31: I forgot where exactly is it placed, because there should be a component.
01:07:36: Controller, hand simulator...
01:07:40: Targeting...
01:07:40: Outliner, Interaction...
01:07:45: PhotocaptureManager...
01:07:46: There we go, there is HapticManager.
01:07:51: Oh, yeah, there is Show Debug Visuals, so if I enable this on, this might not do much.
01:07:59: Do you see anything?
01:08:01: Yeah, I think it's because I'm in desktop, I technically don't have any.
01:08:06: Okay, let's inspect me instead.
01:08:09: So I'm going to open myself up.
01:08:15: Let's go on higher.
01:08:19: So, if I go here...
01:08:26: HapticManager, Show Debug Visuals...
01:08:29: I don't think... you might need somebody...
01:08:32: I don't think you have anything that kind of triggers haptics right now, so...
01:08:36: Wait, wait, wait, I might actually... hang on.
01:08:38: Do you have a thing?
01:08:39: Because this usually happens when there's something present that can activate that system.
01:08:46: Does that work?
01:08:47: Oh, no, this is just controllers.
01:08:49: No.
01:08:50: Just the controllers? Darn.
01:08:51: Yeah, I think I would need to have a device on so this kind of shows.
01:08:57: Because usually this will kind of show...
01:09:00: Actually, what have I rebuilt?
01:09:01: Actually, no, there we go.
01:09:03: I kind of forced it to happen.
01:09:06: So now I actually see...
01:09:09: Oh, right, because this actually makes it happen for other users.
01:09:14: So this component, what it does is it looks at the other users and it injects haptic triggers into their avatars.
01:09:23: And it can kind of see how they're sized.
01:09:27: No, it has nothing to do with that one.
01:09:30: Because I don't think you can see them because they should be just injected locally on my end.
01:09:36: Yeah, but no, I don't see any.
01:09:38: And there's some kind of properties, but I don't think we have any way to persist this right now.
01:09:45: However, there is a way...
01:09:49: Because we're asking a bunch of questions, let me actually check.
01:09:52: Can the haptic injection be externally disabled? If not, can you inject haptic points on the avatar yourself?
01:09:56: And can the AvatarHapticSourceManager override those generated haptics?
01:09:59: However, the list is called haptic volume active zones, but disabling Active does not actually disable it; is that unintended?
01:10:04: Do the different types of sensations mean anything to bHaptics?
01:10:08: So let's just go over this, because you should be able to say that you don't want the haptics to be injected for certain parts of the body on your avatar.
01:10:17: So I'm going to, let's click this, I'm going to open up Cyro.
01:10:25: And I'm going to add a component; let's see, I think it's under Input, Haptics.
01:10:37: There's something like that... sampler, haptic source manager... yeah, there we go, AvatarHapticSourceManager.
01:10:47: It's been a while since I actually worked with this system... haptic source, haptic point mapper, filters, controller haptic point mapper.
01:10:58: So some of these are mappers, they're receiving haptics, and some of them are for setting things up.
01:11:07: I might need to like look into the system, because it's been a while since I've worked with it, so I'm kind of piecing it back.
01:11:16: What does this one do?
01:11:21: Haptics source, no that's, oh wait that's a different one.
01:11:28: It might be under common avatar instead, let's see.
01:11:35: Do I have haptics?
01:11:42: Avatar pose.
01:11:46: Yeah, I'm actually not sure. I'd just have to dig through this, because it's been a while since I worked with the system.
01:11:53: Even though I wrote it, I don't remember how it works; I'd have to check the source code.
01:12:02: Let me see if I can figure this out.
01:12:10: Yeah I don't think there's...
01:12:13: I mean device, no we don't have devices anymore.
01:12:18: Yeah I'm actually not sure because this one...
01:12:23: I don't know what this one...
01:12:26: Yeah, I'll have to like do some research for this.
01:12:32: It's actually a bit of a problem, because when I'm in VR it's not that easy to check the source code.
01:12:38: So I don't think I can answer those questions right now.
01:12:46: The way I remember it roughly working is that you should be able to place...
01:12:52: You can place haptic triggers on the avatar and say you're overriding these, and then when it generates the mapping...
01:13:02: You should be able to say, don't auto-generate the chest, don't auto-generate the arms, and so on.
01:13:08: And it would suppress those.
01:13:10: You probably cannot control...
01:13:12: Well, you cannot control the components on the root, because those get injected, but you can control how it behaves.
01:13:21: There's one question I can answer, though.
01:13:25: Do the different types of sensations mean anything to bHaptics?
01:13:28: I noticed that, say, force and vibration feel a little different on the controller haptics, but I wasn't sure if it was the same for bHaptics.
01:13:34: Yeah, generally they have different implementations for each, so even if the device only does vibration, it'll try to do something different for each type, so it feels different.
01:13:46: One of the reasons why those are there is in case there are different haptic devices; for example, ones that can simulate hot and cold can actually do hot and cold instead of vibrations.
01:14:00: But if you don't have those it still does something you know to indicate something is there.
01:14:06: I think for pain, for example, bHaptics will do this sort of pulse thing, like a heartbeat. That was kind of inspired by Half-Life: Alyx, you know how when you're low on health your controllers pulse like that.
01:14:21: Because that's kind of what happens when you're in pain in the game.
01:14:25: But sorry, I've done most of this off the top of my head; I'd have to check the source code.
01:14:31: Which means I probably need to figure out a good way to have the code open and my keyboard with me, so I can check things out.
01:14:41: But yeah, that's all the questions we got on Discord.
01:14:47: Try asking again for the next one and I'll figure it out.
01:14:52: The problem is this one popped up really late, so I didn't know it was coming.
01:14:56: So with that, that's all the questions from the Discord, which means we can move back to the questions from Twitch.
01:15:16: NukiKoon is asking what is the current progress of deintegrating Resonite from Unity?
01:15:21: What optimization updates should we expect before then? What are we working on now?
01:15:24: So right now PhotonDust is essentially getting finalized. We're running the last phase of pre-release testing, which should be relatively short.
01:15:35: Essentially what that means is removing the old particle system completely, and then the conversion becomes automatic, which means anything you made gets automatically converted to PhotonDust.
01:15:52: Once that is done, the next part is the audio engine. So I'm gonna work on like making a custom audio engine.
01:16:01: That should be way faster than the particle system was, because there's a lot less complexity to it.
01:16:08: So hopefully that one shouldn't take as long.
01:16:11: Once the audio system is done, that's the last major system that keeps us intertwined with Unity in a way that makes it hard to separate.
01:16:25: After that, I'm gonna be like reworking how FrooxEngine actually communicates with Unity and then it's gonna get pulled into its own process.
01:16:33: And that pulling into its own process is like when we're gonna get the major performance boost because we'll be able to run with .NET 9 runtime.
01:16:42: So it's pretty much, it's getting close, it's getting there.
01:16:45: Right now we mainly just need to finish PhotonDust and delete the old particle system, which is already done on the branch.
01:16:52: Then do the audio engine, and then it's pretty much going to be the main work, the actual pulling it apart.
01:16:58: It's also going to take a bit, because the mechanism for communication needs to be reworked, and there are going to be some bits there.
01:17:07: But it's gonna be the actual you know the splittening.
01:17:12: So it's getting close and I'm kinda excited for it.
01:17:17: The next question is from GrandUK.
01:17:20: Are there plans for audio video processing in Resonite so that more full feature productions can take place in Resonite like compositing video camera into another camera?
01:17:29: Yes. I'd definitely love Resonite to be more of a production tool.
01:17:35: Because our overall goal is having the social layer as a sort of foundation, as a basis.
01:17:44: But you can do a lot on top of it. If you want, you can just hang out and socialize, and that's perfectly valid.
01:17:50: But you can also you know collaborate with other people.
01:17:54: You know you can like work with them and you can already kind of do that you know building worlds in here.
01:17:58: And with the functionality we add, you can use Resonite to build and produce things for outside of Resonite.
01:18:05: So one of the things I'm excited for for example is the audio DSP.
01:18:10: Because the audio DSP will let you build a virtual music and audio production studio.
01:18:16: And you can collaborate; we could be here with Cyro, just building together.
01:18:21: We could be bringing in some sound effects, plopping them in, and passing them through filters.
01:18:26: And maybe somebody integrates a keyboard, so you can sample some things.
01:18:32: And feed it into the system, and it makes sound, and you just make sounds and music.
01:18:39: And if you want to save your progress, you literally just save the world. Or maybe you build a really cool filter.
01:18:46: You package it into a custom node, you can share it with the community, and other people can use it in their own productions.
01:18:52: So there's got to be like you know a lot of like great synergy with stuff like that.
01:18:57: The other feature that I feel is going to help with that is the timeline.
01:19:01: I went into the timeline in more depth in a previous Resonance; there's a video covering it in depth.
01:19:08: But for example, the way the timeline helps with audio production is that we can plop sound effects onto it.
01:19:15: You have your timeline, and say you have a sound here.
01:19:22: There's a sound, and you can plop it here, or maybe you cut it, so you cut a piece of it.
01:19:27: Or maybe you have another sound here and you composite them together.
01:19:31: Maybe you duplicate this one so you can play it multiple times, or stretch it, or do whatever you want.
01:19:38: You can place all kinds of things on the timeline and say, I'm going to render this out into a new sound.
01:19:45: So there are going to be tools that work over the timeline; you get a new sound effect, and then you export it and use it wherever.
01:19:56: Or the other way the timeline can work with the audio stuff is to use it to sequence things.
01:20:02: So you have, for example, these keys, and maybe for each row we have a thing that's generating audio.
01:20:14: And maybe this one goes into this one, which goes into this one, and you mix it together, and you use the timeline as a sequencer for audio production.
01:20:25: And the goal is to essentially add the timeline as a sort of building block, a base that lots of other systems can then be built around.
01:20:36: And it's going to serve as a glue that ties those systems together.
01:20:41: And that way you get lots of like really cool synergies between all the functionality you know in the Resonite.
01:20:49: Same thing with video processing. Video processing is a little bit trickier, because we would need to integrate some libraries for really efficient video decoding.
01:20:57: But it's something I would want us to have at some point in time.
01:21:02: Because, same thing, with a video timeline you can just place video clips and maybe this gets composited somewhere.
01:21:10: Say you have a video clip here; right now with the video player we don't have much control, we cannot say decode this exact video frame.
01:21:20: If you have your timeline and you're moving the playhead around, if it's here, we need this exact video frame to be outputted.
01:21:32: And we don't have that level of control right now, so we would need something more robust for that, and that will take some time.
01:21:40: If you have that functionality, then we can take this further, and maybe you actually have a thing...
01:21:47: It used to be like that with old movies: they would actually have a matte, for example a painting of some kind of terrain, and you'd literally put it in front of the camera.
01:22:06: Imagine this is some pretty terrain or something; you put this in front of the camera and you film like this.
01:22:12: You're essentially doing real-world compositing, just layering things.
01:22:17: And maybe you could have another thing in front of your camera over here, and maybe there's something moving this one in front of the camera.
01:22:32: If I switch this, it's literally just things being moved in front of the camera that are compositing this image.
01:22:46: So what we could do is like make tools where it's easy to set up a contraption.
01:22:56: Let me pull out a little thing... I'll keep pulling, just grab this like this.
01:23:03: Oh, there we go.
01:23:06: We could make a thing where you have a little video feed, maybe there's a texture in the world.
01:23:16: And this video actually goes into this, and then you have a camera here that's looking at it.
01:23:26: And then I have something behind, and maybe you have another layer over here.
01:23:30: And there's another track, and maybe you put a filter in here in between to composite.
01:23:36: So you can sort of physically composite things together in virtual reality.
01:23:42: And you can also you know collaborate with other people and also find the cool stuff.
01:23:45: So I think there's a lot of like a really exciting workflows that can be unlocked with these features.
01:23:54: And I would love for Resonite to be something that can be used for collaborative productions.
01:24:01: Where you use it as your virtual studio, and maybe you make videos with it.
01:24:06: Maybe videos that aren't even related to Resonite.
01:24:08: You just use it as a tool which gives you real-time collaboration over the internet.
01:24:15: It gives you embodiment in the virtual world and brings more physicality to that whole production and editing process.
01:24:25: Doing these things the way they actually used to be done, or something similar.
01:24:32: They used to literally put things in front of the camera to composite multiple effects together.
01:24:40: And now it's all digital, so you kind of lose that physical analogy.
01:24:43: But with VR you can bring that physical analogy back while getting the benefits of the physicality of it.
01:24:52: So yes, I would definitely love for these things to become more of a thing.
01:24:58: The audio one is probably going to happen way sooner than the video, because audio is easier to process.
01:25:03: For video we need those more advanced libraries.
01:25:10: The next question is from Nukekun.
01:25:13: What is the lower limit for how many audio sources can be active before there are issues? What issues might we encounter?
01:25:19: I mean, that mostly depends on the hardware, and also on what you're doing with the audio sources and effects.
01:25:26: Because if we have an audio source that's, for example, spatialized, that will take more CPU time than one that is not.
01:25:32: Because we have to spend CPU cycles doing the spatialization.
01:25:36: So there's not like a there's not a specific number you know it's gonna depend on a computer.
01:25:43: It's a similar thing to how, if you've got a beefy computer, you can run way more complex worlds.
01:25:50: You know you can have more geometry in the world before you start like suffering a lot.
01:25:55: Compared to like you know somebody running on the low-end machine.
01:25:58: So generally these questions you know there's not a specific number.
01:26:01: The number is going to depend on the hardware.
01:26:03: So then it becomes question you know what's the lowest hardware that somebody might be running.
01:26:08: And you know then we can like you know maybe measure stuff on that.
01:26:11: And then we get into the other problem, which is that we have to benchmark this stuff.
01:26:15: So we have to see how it runs.
01:26:26: Because right now we can't really answer it without measuring it.
01:26:30: Usually the issues you encounter if it's if the audio engine is stalled the audio essentially starts popping.
01:26:36: Because what happens you know like when you're rendering audio.
01:26:40: So like if you're at 44.1 kilohertz we essentially need to like you know compute 44,100 samples every second.
01:26:50: And that usually happens in small chunks; maybe you're computing 1024 samples at a time, or maybe 2048.
01:27:00: So what happens is... I don't remember exactly how that works out, so let me think. It's something like...
01:27:08: What is 44,100 divided by 1024?
01:27:15: 44,100 divided by 1024. About 43.06.
01:27:22: So that's about 43 of those chunks every second, which means each chunk is roughly 23 milliseconds of audio.
01:27:27: So it actually works like this: roughly every 23 milliseconds you need to compute 23 milliseconds of audio.
01:27:35: Which means you cannot take more than 23 milliseconds to compute it, because then you're too late.
01:27:43: So essentially, it needs to compute 23 milliseconds of audio in less than 23 milliseconds.
01:27:52: So the question is how much it can compute before that happens, because if you don't compute it in time, you miss the deadline and now you have no audio to play to the user in the middle of playback.
01:28:05: And suddenly you get a burst of silence, and maybe then it misses the next one and you get another burst of silence, and it just starts popping, and it doesn't sound good.
01:28:18: Fun fact, I believe if you were to compute one second of audio at 44.1 kilohertz, I'm pretty sure that comes out to about 176 kilobytes per channel in floating point.
01:28:35: Yeah, that's not much memory-wise; usually the cost is the processing. If you think about images, if you calculate how much data every single frame is, you sometimes get to gigabytes per second.
01:28:51: So audio is generally nothing; generally the stuff is fast. But the problem is you usually have more than one audio source, and the more you stack, the more you have to compute for every single sample.
01:29:05: That's where it slows down, because if you have, say, two audio sources, you have to essentially mix them together.
01:29:15: And what you do is, at every point you take the samples and you just add them together; add them, add them, add them, in a loop, so it's fast.
01:29:24: If you think about it, that's one addition for each sample, but if we have, say, 32 of these, now it's 31 additions.
01:29:39: And maybe you're doing more complex calculations; maybe for each one you're actually doing the spatialization, which adds complexity, and the more you add, the more calculations you're doing in the same unit of time.
01:29:52: And eventually, the more you have, the longer it takes, longer and longer.
01:30:01: And at some point you hit a threshold where you're asking the system to compute 23 milliseconds of audio and it needs more than 23 milliseconds to do it, because you just added so much, and then it doesn't keep up and it just starts popping.
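A small sketch of that budget (hypothetical code, just to make the arithmetic concrete): at 44.1 kHz with 1024-sample buffers you get roughly 23 ms to fill each buffer, and every extra source adds work inside that same fixed budget.

```csharp
using System;
using System.Collections.Generic;

static class AudioBudget
{
    const int SampleRate = 44100;
    const int BufferSize = 1024;

    // ~23.2 ms of audio per buffer, which is also the deadline to compute it.
    public static readonly double BudgetMs = 1000.0 * BufferSize / SampleRate;

    public static void MixBuffer(float[] output, IReadOnlyList<float[]> sources)
    {
        Array.Clear(output, 0, output.Length);
        foreach (var source in sources)              // every extra source adds work
            for (int i = 0; i < output.Length; i++)  // (plus spatialization, filters, ...)
                output[i] += source[i];
        // If this ever takes longer than BudgetMs, the device runs out of
        // samples and playback starts popping.
    }
}
```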
01:30:23: Next question from GrandUK. Is there work or ideas thrown around for Molecule? More specifically, anything not in the initial GitHub issue for it?
01:30:32: I don't know what's in the original GitHub issue, so this one's kind of hard to answer. I would say it's the general stuff that's in the GitHub issue.
01:30:40: The main point of Molecule is that it's sort of a versioning system that we can use both for Resonite itself, so we can build our own distribution, and also for components.
01:30:50: So if you build stuff, if you build items, you can use it to version them. Once we have libraries for ProtoFlux or WebAssembly, you can use it to version those as well and resolve dependencies and stuff like that.
01:31:06: So the core ideas of it should be in the GitHub issue, and if something's not there, it's probably not a very core idea, so I don't know.
01:31:25: Yeah, this one's a little bit hard to answer, especially since I can't read through the issue right now.
01:31:32: So for now: we plan Molecule as a sort of versioning system, so it can manage the builds of Resonite and also other items.
01:31:41: And it essentially gives us control, because when we publish builds to Steam, sometimes it doesn't want to update for people.
01:31:55: It's also hard to have multiple branches for testing things, because the team is working on lots of stuff.
01:32:02: It would help if it was easy to access those builds, for the team and for the community, to run multiple things in parallel.
01:32:15: And we cannot really do that with Steam, because we have just one branch per release, and if we want multiple releases we need to make more build scripts, which makes things hard to manage.
01:32:23: So that's just going to give us a lot more control.
01:32:26: Plus you'll be able to easily switch, say, I'm going to switch to this older build to test something, because sometimes you need to.
01:32:37: Like when something breaks and somebody makes a bug report, you're like, did this break recently, is this new? And people don't know.
01:32:44: And then it's hard to go back through builds and check whether it's still broken on a particular one.
01:32:49: That can help us figure some bugs out, so there are going to be a lot of things it allows us to do.
01:32:56: Also thank you it's lengthy for the subscription with Prime.
01:32:59: It was probably a while back; it's just taking time to get through the questions. But thank you very much for the subscription.
01:33:07: Next question is from Rabus.
01:33:10: I understand that once particles and audio are separated from Unity, the migration to .NET will begin.
01:33:15: How challenging do you think the migration to .NET will be and how long it might take?
01:33:20: Or is it possible that it will simply be a matter of switching the runtime to .NET?
01:33:23: So I can tell you straight up it's not as simple as switching the runtime to .NET.
01:33:32: The big part, essentially: making FrooxEngine itself run with .NET is very easy.
01:33:44: Because that's pretty much what the headless is.
01:33:46: The headless is pretty much almost all of the parts of FrooxEngine running without the graphical output.
01:33:53: The challenging part is the communication with Unity, because we do need it to render stuff out.
01:34:02: So, you know, we have the FrooxEngine.
01:34:11: Where did it go? That was weird.
01:34:14: Where did it jump? Oh, I moved. Okay.
01:34:17: Yeah, you moved.
01:34:18: I bumped the joystick. I was like, why did it just suddenly jump to the right?
01:34:22: So, you know, we have the FrooxEngine and then we have like an IPC mechanism.
01:34:29: You know, the IPC, and this communicates with Unity over there.
01:34:41: So, this is probably the most complicated part.
01:34:44: Making it, you know, communicate efficiently and making sure these two kind of stay synchronized.
01:34:50: And this keeps feeding data in, and this one also keeps saying, okay, the frame is ready, here's stuff for the next frame.
01:34:59: We're going to be using mainly shared memory for sharing the bulk of data.
01:35:03: Because what shared memory is, is like literally a piece of memory that both have access to.
01:35:10: As if it was their own memory.
01:35:11: So like, you know, this one has its own memory and this one has its own memory.
01:35:16: And this piece is shared, which makes it very easy to exchange data.
01:35:20: And then we just need to send tiny messages communicating when things happen.
01:35:23: And that might also happen over the shared memory, where it just puts a piece of data there.
01:35:29: So maybe it's all going to be shared memory, that's not decided yet, but we'll see.
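For a rough idea of what shared memory looks like in .NET (illustrative only, not FrooxEngine's actual IPC code, and the map name, size, and offsets are made up):

```csharp
// Minimal shared-memory sketch with MemoryMappedFile. Named maps like this
// are Windows-specific in .NET; a real implementation would pick a
// cross-platform mechanism.
using System.IO.MemoryMappedFiles;

// Engine process: create a region both processes can map, write frame data.
using var shared = MemoryMappedFile.CreateNew("FrameData", capacity: 16 * 1024 * 1024);
using var writer = shared.CreateViewAccessor();
writer.Write(0, 42);   // e.g. this frame's payload at a known offset
// ...then send a tiny "frame ready" message over a pipe or socket.

// Renderer process: open the same region by name and read directly.
using var opened = MemoryMappedFile.OpenExisting("FrameData");
using var reader = opened.CreateViewAccessor();
int value = reader.ReadInt32(0);
```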
01:35:35: But the thing is, for the actual splitting, FrooxEngine right now still has a bunch of ties in how it communicates.
01:35:47: So the biggest, the longest part is probably going to be unifying all of that communication into this interface.
01:35:56: Once the communication is streamlined, that makes it much easier to split this up and run it in separate processes.
01:36:09: And the hardest part on that one is going to be making it efficient.
01:36:11: I don't know how long it'll take exactly, like that's kind of hard to estimate before actually starting to work on it.
01:36:18: Because usually with these things, you know, you cannot discover a bunch of the problems and issues and so on, like once you actually start working on it.
01:36:25: So I don't really want to make an estimate at this time.
01:36:30: But yeah, I don't think it's going to be a super hard challenge.
01:36:39: The hardest part was getting everything to the point where we can do it: it started with the type system, then PhotonDust, and the next one is going to be the audio system, which I think is going to be the simpler part.
01:36:56: And then it's the splitting, which is going to be figuring out all these mechanisms.
01:37:01: There also need to be some mechanisms, some backend for communication between them, and making that run efficiently, reworking it in an efficient manner.
01:37:13: So hopefully that kind of like answers the question.
01:37:16: Ozzy is asking "that works?" I don't know what that's in reference to.
01:37:25: This is a repost because I forgot to include question mark.
01:37:29: Really quick one on the new rendering engine: I rarely get a chance to check its progress in the devlog on the Discord server.
01:37:37: I'm curious about which phase it's currently in and when the transition away from Unity is expected to take place.
01:37:42: So there's not like a specific timeline for it right now.
01:37:45: Also, this might be best to ask in Geenz's office hours as well.
01:37:50: But the thing I know has been worked on most recently is consolidating the shaders on the Unity side of FrooxEngine.
01:38:00: So they're going to be much easier to port over to Sauce.
01:38:05: He's also been working on some bits of Sauce, but I don't have the most up-to-date information on that one.
01:38:18: I'm going to play some Resonite to socialize and talk to people, right?
01:38:23: Yes, that's one of the favorite memes.
01:38:27: This is the point, I can do both.
01:38:30: The way I like to look at Resonite is that a lot of people put the software into this box.
01:38:44: It's social VR, you go there to socialize.
01:38:47: The way I like to look at Resonite is that it's like a realm, kind of like the real world.
01:38:54: Because in the real world, when you think about it, you can navigate.
01:39:00: You can go into rooms, you can go into the city, you can move around.
01:39:03: And you can talk to people and socialize.
01:39:05: And it's something that you take for granted.
01:39:08: In the virtual world, that's not something that's a given, but it should be.
01:39:14: And it's kind of what Resonite tries to be.
01:39:15: It tries to provide a realm that you can exist in.
01:39:18: You can talk with people and you can do stuff together.
01:39:21: And then the stuff that you can do, that's where it gets interesting.
01:39:25: So, same in the real world.
01:39:27: You can just go there, visit somebody, you can just hang out, vibe, whatever.
01:39:31: Watch some videos together, socialize, you can do that.
01:39:35: Or you can meet somebody, say, in a hacker space.
01:39:38: And you're maybe doing some hardware together, and doing some engineering and some things.
01:39:43: Or maybe you meet with some people who are artists and you paint together, maybe have a class.
01:39:48: You can do so many different activities in the real world and you don't really think about it.
01:39:54: And you don't think about that all these activities are built on our ability to communicate, our ability to move in the world.
01:40:07: Which are sort of there, they're always there.
01:40:12: And we don't think about them.
01:40:14: But in the virtual world, they're not.
01:40:18: And Resonite is trying to make it so that it's a thing that's always there.
01:40:23: You don't have to even think about it.
01:40:25: You can exist in a world that's fully synchronized, and whatever you do stays in sync with people, and you don't have to think about it.
01:40:33: It's just how reality works.
01:40:36: And then it becomes more about what do you do together with other people, what do you do on your own.
01:40:42: You pick whatever activities you want to do the same way you do in real life.
01:40:51: That's how I like to look at things.
01:41:02: Probably not immediately.
01:41:05: That's probably not going to happen until we actually switch to the Sauce.
01:41:11: Because there might be a number of other optimizations.
01:41:15: Once the split happens, we'll very likely evaluate how much this helps, how it improves things.
01:41:24: There might be a possibility, but we kind of need to look at things and be like, what's the best path forward.
01:41:31: Because doing the multi-process architecture also might be more difficult on Quest.
01:41:37: So that might be a hurdle.
01:41:42: It's something we'll evaluate once the split happens.
01:41:46: But there isn't a plan to do it immediately after.
01:41:49: If it becomes easy to do, then maybe.
01:41:53: Sometimes when we do these things, once it happens, we're like, this is not much easier to do.
01:41:59: Maybe we want to prioritize this.
01:42:03: But it requires evaluation to happen.
01:42:07: So right now, I'll say maybe not, maybe after Sauce.
01:42:11: Because that's also going to make the rendering more efficient, which might be needed for Quest.
01:42:16: But we'll see.
01:42:22: Did you ever forge... With all the different features you're planning, will Resonite ever become an operating system, like Emacs?
01:42:27: I mean, kinda.
01:42:29: An operating system is another way I like to look at it.
01:42:33: Because when you have an operating system, it's a common interface where you can have multiple different apps and things co-exist with each other and communicate.
01:42:43: And you have your core mechanisms, like windows: an application lives in a window.
01:42:49: You can drag it, you can move it around and arrange it.
01:42:52: You have stuff like Clipboard and things.
01:42:54: Similarly, Resonite provides a bunch of things that are provided to everything.
01:42:58: And you can build stuff around it, on top of it, that exists in that environment.
01:43:04: So I would say yes. I would definitely love it to be that.
01:43:12: The next one is, I'm also starting to speed through this a little bit, because we've got a bunch...
01:43:17: Well, there's three.
01:43:20: We have about 15 minutes left, so at this point it's possible we might not get to your question.
01:43:29: If you want, you can still ask it, but we might not get into it at this point.
01:43:36: If we don't get to it, remember, we now open a thread where you can ask questions in advance for the next week.
01:43:45: So you can also put it there.
01:43:48: But we have a question from OurBoy.
01:43:50: There are many upcoming features that I have personally been really excited about, and know I've been talking about for a long time,
01:43:54: like proper physics system, workshop, hard permissions, but those feel like they're going on years of waiting.
01:43:59: I've seen multiple times people told they should wait until hard permissions are implemented to make permissions system-related systems,
01:44:05: and that's been a long wait.
01:44:06: I was wondering how do you prioritize features to work on, and if any of those above have any updates?
01:44:11: So usually we look...
01:44:13: I mean, there's going to be lots of plans, because Resonite is a long-term project,
01:44:18: which means there's a lot of things we want to do,
01:44:22: it'll take years to get to some of them,
01:44:25: but that's the point of the project,
01:44:28: it's a very long-term project and we're just building a lot of things,
01:44:32: there's only so many things we can do at a time.
01:44:36: And it's almost like a lot of these things are kind of like milestones,
01:44:39: but if you think about it, one nice way to think about it is almost like having a skill tree,
01:44:47: and say you're here, and there's all these things,
01:44:51: and maybe there's some things here,
01:44:55: let me actually move this a little bit,
01:44:57: you're here, and for example these things need this thing,
01:45:01: and this needs this thing, this needs this thing, this needs this thing.
01:45:05: And then you're here, and you're kind of deciding which way do you want to go,
01:45:08: and maybe you're like, okay, this is the most important thing that would help the project the most at this time,
01:45:14: so we're going to develop this.
01:45:16: And then you're like, okay, this also helps this, but we need to do this, so we also do this,
01:45:21: and now we can do this.
01:45:23: And now that we've kind of explored this part, we'll be like, okay, at this point we have this,
01:45:27: this is helping us a lot.
01:45:30: At this current state, you know, with the community, and with the company, and with how everything else is going,
01:45:36: we think this is the most important bit, so we start developing here,
01:45:40: and then maybe here, and then we maybe go a little bit here,
01:45:43: and maybe there's like, it goes even further,
01:45:46: I'm actually making it to...
01:45:51: Oh boy.
01:45:54: There we go.
01:45:55: And it goes even further, and maybe we decide, okay, we're not going to go all the way,
01:45:59: you know, like we're going to pause, and we're going to instead develop this,
01:46:03: and then this, and maybe a later time we return back here,
01:46:05: because like this is, you know, good enough.
01:46:10: So it's almost like, you know, like there's so many things to do,
01:46:13: and at every point when we're kind of deciding what to prioritize,
01:46:19: we are like, you know, what Resonite needs the most right now.
01:46:25: What is like the most important thing?
01:46:31: And that's kind of a hard question to answer sometimes,
01:46:34: but, you know, there's a lot of things that go into it.
01:46:39: There's actually a big post on our GitHub called How We Prioritize,
01:46:43: and it kind of goes into a lot of the signals and thought that goes into prioritizing things.
01:46:51: But for major features, it's like, what will bring us the most support,
01:46:59: and what's going to help most with other features?
01:47:02: Because, for example, with the upgrade to .NET 9,
01:47:05: we can now be unblocked on so many things,
01:47:08: because it makes stuff like Rigidbody way easier,
01:47:11: because we can upgrade to this Bepu, and we get the benefits of that.
01:47:15: It lets us clean up a lot of code with modern mechanisms.
01:47:19: It makes so many features so much easier
01:47:23: that it makes sense to prioritize the things that do that.
01:47:29: Because if you implement certain things,
01:47:34: they make other things easier,
01:47:37: and that can contribute to those things being prioritized.
01:47:40: And specifically now, with performance,
01:47:44: for the longest time I felt that performance
01:47:46: is probably one of the biggest blockers
01:47:50: for a lot of people staying on Resonite.
01:47:52: And we need more people in the community,
01:47:54: we need more support.
01:47:57: And a while back, we did a survey,
01:48:00: and we asked people,
01:48:01: what's preventing you from playing Resonite more?
01:48:04: And the overwhelming majority of people said it's the performance.
01:48:08: And that's why it got prioritized,
01:48:11: why we said this is going to be our major focus:
01:48:16: making it way better,
01:48:18: because that can help the community,
01:48:20: that can help the future development,
01:48:23: and just make the software better.
01:48:26: And once that's done, we'll do a similar thing.
01:48:28: We'll be like, what's the biggest thing?
01:48:30: But also, in that process, we make smaller things.
01:48:35: The performance being the major focus
01:48:37: doesn't mean we stop doing the smaller things.
01:48:40: They're just sprinkled around it.
01:48:44: Sometimes there's other stuff that comes up
01:48:46: that we need to deal with.
01:48:47: For example, if servers are on fire,
01:48:50: no more performance work for now until that is resolved.
01:48:53: That needs to be fixed.
01:48:57: The reason we prioritized, for example, Stripe,
01:49:00: which we had being worked on in parallel,
01:49:03: is because we calculated that every day
01:49:07: that we don't have Stripe,
01:49:09: we lose $50 to $60.
01:49:12: So the longer it takes, the more money we lose
01:49:16: that we could potentially put into other things
01:49:19: like hiring more people or maybe more marketing
01:49:22: to bring more support.
01:49:26: But overall, it's like prioritization.
01:49:29: It is a complicated thing.
01:49:30: You have to weigh so many things
01:49:31: and figure out what's the most important one right now,
01:49:37: what's going to help the project the most,
01:49:39: what's going to bring us the most support,
01:49:40: what's going to help the community the most.
01:49:43: You weigh all of these and then you make your decision.
01:49:46: And maybe sometimes you want a different decision or something,
01:49:51: but there are so many variables,
01:49:56: it becomes difficult because you don't quite see
01:50:02: where some things will go.
01:50:03: You don't know how people are going to react to some things.
01:50:06: So you might have some expectations.
01:50:08: And people also have different priorities,
01:50:12: like some people don't care about the performance
01:50:14: and they want IK.
01:50:17: And ultimately the decision comes down to:
01:50:21: the majority of people do care, so you have to go with that,
01:50:24: because that's one of the things that's blocking this platform
01:50:28: from growing more.
01:50:34: One thing we're probably going to prioritize,
01:50:36: actually that's changed a bit, is Molecule,
01:50:39: because it's one of those things that people,
01:50:41: I feel like people don't super care about,
01:50:44: because there's not something super user-facing,
01:50:46: but we as developers really need it,
01:50:48: because there's been so many cases
01:50:52: where I've lost hours of time dealing with issues
01:50:56: that would be solved by having it.
01:51:03: Sometimes we end up prioritizing things
01:51:06: and that really helps speed up the rest of the development,
01:51:10: which means we can actually do the other things faster
01:51:13: than we otherwise would have,
01:51:15: but it means we have to spend time on this thing for a bit.
01:51:21: Hopefully that answers the question.
01:51:23: I do recommend reading the How We Prioritize post on GitHub,
01:51:27: because it goes into a lot of details on this.
01:51:32: CheckitheFoxAuthor is asking,
01:51:33: I know you can't say when you'll get to work on it yet,
01:51:36: how large the scope of Protoflux collection support is.
01:51:39: I ran into a few situations last week where I needed to write horribly inefficient code
01:51:42: because I couldn't store collection data.
01:51:44: It's the second highest upvoted GitHub issue,
01:51:46: but I'm not actually sure how large of a task it is
01:51:48: compared to other stuff like PhotonDust.
01:51:50: Sorry for double posts, forgot a question mark.
01:51:54: I think actually, out of all things,
01:51:57: I think ProtoFlux collections are not that big a task.
01:52:01: Because most of the stuff should already be there.
01:52:13: The main thing is, there are a few things;
01:52:16: one thing that I feel is going to make collections a little bit more complicated is that
01:52:20: we might want to have some mechanisms to restrict how large you can make collections.
01:52:27: Because we don't want somebody to just make a loop and fill your memory.
01:52:31: So from the technical side, it should be fine.
01:52:35: It mostly just needs the localness tracking, but there are other mechanisms that need to be built on top of that.
01:52:42: Then it's adding a bunch of nodes for working with them, but that's relatively trivial.
01:52:50: And the biggest part would be just some sort of system to track how much data you can allocate and put some limits and checks on that.
01:52:59: And I think that's going to be the biggest part of it.
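As a hedged sketch of the kind of guard being described (hypothetical, not an actual Resonite API), a collection could count elements against a fixed budget so a runaway loop can't fill memory:

```csharp
using System;
using System.Collections.Generic;

// Illustrative only: a list wrapper that enforces a per-collection element limit.
class BoundedList<T>
{
    readonly List<T> items = new();
    readonly int maxItems;

    public BoundedList(int maxItems) => this.maxItems = maxItems;

    public int Count => items.Count;
    public T this[int index] => items[index];

    public void Add(T item)
    {
        if (items.Count >= maxItems)
            throw new InvalidOperationException($"Collection limit of {maxItems} items exceeded.");
        items.Add(item);
    }
}
```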
01:53:01: So it is actually something we might end up prioritizing, because that's going to help creators a lot.
01:53:08: I would definitely love to add it, because there's so many other features it's also blocking,
01:53:13: and I do feel if it's a feature we have that's going to help with a lot of the content that people are building,
01:53:19: and then in turn it also helps us, helps the platform, because people can now build more complex content that they weren't able to build before,
01:53:27: because we just made it way simpler to do.
01:53:33: Next question is from Rabbids.
01:53:36: I know you're a genius programmer dawg, thank you.
01:53:39: But do even genius dawgs like you receive support from AI tools such as ChatGPT or GitHub Copilot during development?
01:53:46: And if you don't use them, what is their reason?
01:53:49: So I don't really just GitHub Co-pilot, like I kind of find it more annoying.
01:53:57: Sometimes I do use ChatGPT.
01:53:59: I don't use it mostly for code.
01:54:03: I do use it for code if I need boilerplate code.
01:54:11: So for example, I was like, I asked ChatGPT,
01:54:15: can you write C-Sharp code that initializes a hash set with all the C-Sharp keywords?
01:54:22: And it just kind of put it together for me.
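The kind of boilerplate being described looks something like this (only a subset of the keywords shown; exactly the sort of output that's trivial to verify by eye):

```csharp
using System.Collections.Generic;

static class CSharp
{
    // A hash set pre-filled with (a subset of) C# keywords.
    public static readonly HashSet<string> Keywords = new()
    {
        "abstract", "as", "base", "bool", "break", "byte", "case", "catch",
        "char", "checked", "class", "const", "continue", "decimal", "default",
        "delegate", "do", "double", "else", "enum", "event", "explicit",
        // ...remaining keywords omitted for brevity
    };
}
```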
01:54:23: Or if I need to edit a bigger block of code just in some predictable way, I ask it to do that.
01:54:29: I don't like using it for making new code that requires more complex reasoning.
01:54:41: Because usually when I generate something, I kind of comb it through and I'm like,
01:54:46: does this make sense?
01:54:50: Does this code...
01:54:51: Because I have trouble trusting it.
01:54:55: Because I've asked it about a bunch of things that I know a bit about.
01:55:00: And sometimes it gives good answers, but sometimes it just gives completely bogus answers.
01:55:06: But it's very confident about them.
01:55:08: And I'm like, I can't trust this.
01:55:11: There's like 50% chance it's just going to give me something wrong.
01:55:15: And I don't want to put it in code without checking it.
01:55:17: And if I spend that much time checking it, then it doesn't really help me anyways.
01:55:23: So for something like Boilerplate, that's fine.
01:55:28: Because I'm like, simple enough, that's not really complex.
01:55:32: But for complex code, I didn't really use it.
01:55:36: I did ask it once to make a collection for probabilistic sampling support.
01:54:42: And it just made code that made no sense at all.
01:55:46: It was just making internal queues and it was adding things to them and then removing them for no reason.
01:55:55: So I was like, I don't know.
01:55:59: The other way I kind of use it sometimes is as a starting point for doing research on some things.
01:56:04: I'll for example ask it, do you know of any good libraries for C Sharp for doing this thing?
01:56:13: And sometimes it actually gives me good pointers.
01:56:16: Whether the library actually exists is also a thing to check, because it does tend to hallucinate.
01:56:20: So sometimes it'll say there's this library, and then I Google it and it doesn't exist.
01:56:25: But you know, that's a thing I can easily check it. I can Google it.
01:56:28: And if the library doesn't exist, I know it gave me bogus answer.
01:56:31: But if it does exist and I'll be OK, I'm going to look into this one.
01:56:35: So it can be really useful for that kind of thing.
01:56:39: Um, yeah, I will say...
01:56:44: I would almost consider it dangerous if you're trying to learn how to code, for example.
01:56:51: Because I've asked it some C# questions, and maybe this is just a C# thing...
01:56:57: Maybe it's not good at C#. No, it's general.
01:57:01: I've asked ChatGPT about some simple C# things.
01:57:08: Like, what's the most efficient way to iterate over this thing and do this other thing.
01:57:13: And it just keeps doing it in the most suboptimal way,
01:57:22: like it doesn't know the no-brainer, just-do-it-this-way approach.
01:57:27: Sometimes it'll just be dumb, and now it's taught you the dumb way to do it.
01:57:34: There's actually been a study; I think they looked at one of the latest models from ChatGPT
01:57:39: and they found it gives you the wrong answer on average 54% of the time.
01:57:47: More than half of the times it's going to give you something wrong, which means it's hard to trust it.
01:57:53: And if I have to spend a lot of energy figuring out whether what it gave me is good or not,
01:58:00: then it doesn't help much, unless it's very easy to verify.
01:58:06: So if I can just do a quick Google search to figure out whether it gave me something bogus,
01:58:11: then I'll use it for those kinds of things to get pointers.
01:58:15: But if it's something where I have to comb through it and do
01:58:20: a lot of complex analysis, figuring out whether the answer is correct or not,
01:58:25: then it's not worth it, because it doesn't really save much time.
01:58:31: And I'm just kind of paranoid that I'm going to put in something that's going to cause issues.
01:58:38: But yeah, that's pretty much it.
01:58:40: We also, this is pretty much the last minute, so I think like this was the last question.
01:58:45: So with that, thank you everyone, you know, for like joining.
01:58:49: I hope you enjoyed learning more about Resonite.
01:58:54: Thank you all for all the questions.
01:58:57: As a reminder, we are running the last phase of PhotonDust testing.
01:59:02: There's an announcement in our Discord.
01:59:04: So if you can, give it a try, because we'd like to merge it in this upcoming week.
01:59:13: We also launched Stripe.
01:59:14: So if you are supporting us on Patreon, which we appreciate a lot.
01:59:19: Please consider switching, you know, on the same tier to Stripe.
01:59:24: Because we actually get a much bigger cut that way.
01:59:29: Patreon on average takes about 15%, Stripe takes about 5%.
01:59:32: So we get a lot more money that we can then invest into Resonite.
01:59:39: Thank you so much, whether you support us financially or not.
01:59:44: Thank you for joining the stream, asking questions, being part of our community, making cool stuff, and in general being part of this platform and helping us grow.
01:59:54: Thank you, Cyro, for being here and helping me answer some of the questions too.
01:59:59: Yeah, I'm glad I could sit here and nod along.
02:00:05: But yes, I hope everybody's having fun working on their MMC projects.
02:00:10: I can't wait to see what everyone makes at the end of the month.
02:00:18: I hope people aren't getting too burned out.
02:00:21: I know, from what I've heard, some of them might already be doing scope creep and stuff like that, but as long as you're having fun.
02:00:30: So, thank you very much for watching, and we'll see you for the next one.
02:00:33: And I'm gonna check if there's anybody to raid.
02:00:43: I think it's only Creator Jam that I can see. Let me check if there's anyone streaming Resonite.
02:00:51: No, it's just me and Creator Jam.
02:00:53: If you like to stream, I recommend streaming around this time, because you're gonna get like a lot of viewers from us.
02:01:02: So, we're gonna send everyone to Creator Jam.
02:01:08: So, we're getting the raid ready.
02:01:09: Oh, I just type raid Creator... Creator...
02:01:13: Creator Jam, there we go.
02:01:17: And it should be getting ready in seven seconds.
02:01:21: So, thank you very much.
02:01:23: And say hello to Creator Jam, say hello to Medra for us.
02:01:27: Bye. Bye. Bye.
02:01:30: Did I click it? Oh, I clicked.