The Resonance/2024-11-17/Q&A


Is Glitch cute?

Yes, he's cute. It's proven right here on the stream.

Is Mayonnaise a ProtoFlux node?

No, but I actually have a list of ideas for April Fools, and there's a food-related ProtoFlux node in there that might pop up at some point, maybe. We have the leaky impulse bucket, maybe we could have the leaky mayonnaise bucket. Hopefully that answers your joke question with more jokes.

Where do you want to see Resonite positioned within the VR / social VR space?

Ah, this is a good ramble-inducing question. There's a few things that we know about Resonite. One of the big ideas of this platform is that it's built of multiple layers. At the base layer, you have things like automated networking. Everything you build, even the engine itself, you always get everything synchronized by default. You don't even have to think about it. Everything is potentially persistent. You can save everything into your inventory, into the cloud, onto your hard drive. Everything that you get on the platform, you can persist. The way I see it is once you have this kind of layer, you can start building on top of it. We also have layers for working with various devices, various interactions, grabbing stuff, touching stuff, pointing at things. Those are things that I feel are important to solve really well, to do properly. When I started my work in VR, I was doing a lot of disparate applications, where one application had these features and supported this hardware, and another application supported these other things and this other hardware. Sometimes I would like functionality from this one application and this other one, but it was kind of difficult to bring them over. Plus, I would also find myself solving the same kind of problems over and over, for example, being able to grab stuff. One of the driving forces was to create a framework, a layer, where everything is part of the same shared universe, and build an abstraction layer. It's kind of analogous to programming languages: with the really old ones you had assembly programming, and you had to do a lot of stuff like managing memory, where is this stuff, managing your stack, doing a lot of manual work to make sure the state of everything is correct. Then came high-level programming languages, which would essentially do that for you and let you focus more on the high level: what do you want to do? Personally, what I want Resonite to do in the social VR space is a similar paradigm shift for applications, where no matter what you build, you always have real-time collaboration. You don't even have to think about it. You can always interact with multiple users, you always have persistence, and you always have integration with lots of common hardware. To me, the social VR layer is the basis. You always have the social stuff. You can join people, you can talk with them, you can be represented as your avatar, but then everyone can build lots of different things. Some people will just socialize, some people will play games, but some people will build a virtual studio. Maybe they want to produce music, maybe they want to program stuff, and they're able to use Resonite as a framework to do that, and share whatever they make with other people. If you're good at building tools, you can make tools, like I mentioned, for example, for producing music. Say somebody makes really cool tools. Other people who like to produce music can take those tools made by the users, and because they exist within the same universe, you can build your own music studio, and you have all the guarantees that I mentioned earlier. With your music studio, you can invite people in and collaborate with them no matter where they are. You can save the state of your work, or say you make a really cool audio processing filter or something: you save it, you can share it with other users, and it opens up this kind of interoperability. I want Resonite to be general enough where you can build pretty much any application.
Whatever you can think of, you can build on here and get those guarantees. It's kind of similar to how you have a web browser. Web browsers used to be just browsers for websites, but now we have fully-fledged applications. You have your office suite, like Google Docs. There's a version of Photoshop. We can play games. There's so many applications on the web that it essentially becomes its own operating system in a way. I want Resonite to do a similar thing, where the platform itself is the analog of the web browser. You can build any kind of application in it, but you also get the guarantees of the automated networking, of the persistence, of the integration with the hardware, and other things solved for you, so you don't have to keep solving them. That's pretty much, in broad terms, what I want Resonite to do. I hope that ramble answers the question well. I think it answered it pretty well. When you were talking about this, I was thinking of way, way back, before we had any sort of proper game engine. You'd program all of your code, all of your games, you would just program them raw. You didn't have Unity, you didn't have Unreal. If you wanted to collaborate with people, you had your immediate vicinity of the people who you lived around. And now you have game engines and stuff, which integrate a lot of the typical stuff that you would need to make a game, but you're still limited to basically working over a Skype call, or again with people close to you physically. But now, this is kind of like a layer on top of that even. Yes. Where now, as social creatures, we didn't really have something like this in that sort of space, and now we do. And being able to have that same sort of collaboration like you could have in real life, with people working next to you, you can have with people who live a thousand miles away, across the entire world, and you can work exactly as if you were right there, and a lot of the things that you'd expect to work just kind of do: oh, you can see my context menu when it comes up, you can see this inspector opening. It's just like putting a piece of paper down on a table and working on it with someone standing right next to you. Yeah, that's a really good point. There's actually another thing that I've seen that inspired me, which is seeing engines like Unity and Unreal. Because it used to be, when you wanted to make a game, you pretty much had to build your own engine, which in itself is a big undertaking, and you needed bigger studios. But then game engines came out, they were more generalized, and what they essentially did is remove that barrier, where suddenly everybody has access to a fully-fledged game engine, and it's no longer a problem you have to solve on your own. And now you have small studios, even just individuals, who are able to build games and applications that previously would take entire teams of people to do. And I see Resonite doing that same thing, just pushing it even further, where we go beyond the game engine, where you don't have to worry about stuff like making a rendering pipeline or making a system for updating your entities and so on. Now you have additional guarantees, like real-time collaboration, synchronization, persistence, that all just kind of comes for free, and you don't have to solve those problems, and you can focus even more of your time on what you actually want to do in the social VR space: what do you want to build, how do you want to interact.
So that's definitely a very good point, too, with the game engines.

What are some bugs that you've said are a feature?

This answer needs a clip.

Have you thought about other ways to get audio/video out of Resonite, other than simply mirroring the camera to the display and taking the audio output of Resonite?

So one of the big things that we're focusing on right now is a big performance upgrade, and I think I've seen a later question about that, so this might answer some of it too. For the performance upgrade, there's actually one more, but the two big systems that need to be done are the particle system, which is being worked on right now, and the audio system, part of which Cyro has been working on, the reverb system. Those two systems are essentially the last two big systems that are sort of a hybrid between FrooxEngine and Unity. I'll go a little bit more into details on this with a later question, but we are going to be reworking the audio system, and the current one, the Unity one, doesn't support multiple listeners. The goal for reworking the audio system is so we can actually do that: there's one listener that's for you, for your ears, and there's an additional listener that can be for the camera, which you route to a different audio device, so you can actually split it too. Because you can switch to the camera, but then I'll be hearing everything from the camera's viewpoint, and that kind of messes with my spatialization. So yes, there's going to be a way to do it, we just need to get the new system out.
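To make the multiple-listener idea a bit more concrete, here is a minimal standalone sketch. The class names and the simple distance-based attenuation are purely illustrative assumptions, not FrooxEngine's actual audio API; the point is only that once the engine mixes audio itself, the same set of sources can be rendered independently per listener, so a camera listener can feed a different output device than the one for your ears.

<syntaxhighlight lang="csharp">
using System;
using System.Collections.Generic;
using System.Numerics;

// Hypothetical sketch of per-listener audio mixing; names and the
// 1/(1+distance) attenuation are illustrative, not FrooxEngine code.
record AudioSource(Vector3 Position, float[] Samples);

class Listener
{
    public Vector3 Position;
    public string OutputDevice = "";   // e.g. your headphones vs. a virtual stream device

    // Mix every source into one buffer, attenuated by its distance to this listener.
    public float[] Render(IEnumerable<AudioSource> sources, int length)
    {
        var mix = new float[length];
        foreach (var src in sources)
        {
            float gain = 1f / (1f + Vector3.Distance(Position, src.Position));
            for (int i = 0; i < length && i < src.Samples.Length; i++)
                mix[i] += src.Samples[i] * gain;
        }
        return mix;
    }
}

class Demo
{
    static void Main()
    {
        var sources = new List<AudioSource>
        {
            new(new Vector3(1, 0, 0), new[] { 0.5f, 0.25f })
        };
        var ears   = new Listener { Position = Vector3.Zero,         OutputDevice = "Headphones" };
        var camera = new Listener { Position = new Vector3(5, 0, 0), OutputDevice = "Virtual stream output" };

        // Each listener renders the same world from its own position, to its own device.
        Console.WriteLine($"user mix:   {ears.Render(sources, 2)[0]:F3}");
        Console.WriteLine($"camera mix: {camera.Render(sources, 2)[0]:F3}");
    }
}
</syntaxhighlight>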

What are the chances of implementing social events and gathering lists in-game that notify people about upcoming events and more?

Yes, that's actually one of the things I would like us to do. We do have a GitHub issue for it, so if you search "events UI" (I kind of forget its name exactly) on our GitHub, there's a bunch of details. It would essentially be adding some generalized server systems plus some UI to be able to register events and see what's happening. It's going to help people discover more things going on in the platform and make it easier to socialize and join things. It's probably going to happen sometime after we finish with the performance update, because there's a bunch of UI improvements we want to do, and we don't want to focus on too many things at a time. So it's going to come at some point. No timeline yet. At the very least, it's going to be sometime after the performance update. It's one of the things that's definitely on my mind, and I think it should be pretty high on the list, because we want to help drive socialization and engagement, so it's a pretty important feature. Actually, when you were talking about the performance, I saw someone in the chat. Yes. And I wanted to say that the rendering engine in particular, like using Unity, isn't necessarily a blocker for the performance update. I see there's two questions that are related to this, so I'll go a little bit more in detail on those.

Could you explain the roadmap to a big optimization update?

This answer needs a clip.

Any idea of how in-game performance metrics for user content would work?

That's a good question, and measuring performance is a very difficult thing, because one of the things with performance is that it depends. It depends on a lot of stuff. So usually you want to have a range of tools to measure things. One of them is that you can measure how long individual components take to execute, with some way to aggregate the data, so you can see: okay, this is consuming a lot of time, this is consuming a lot of time. But the performance impact of something is not always that direct, because, for example, the components themselves can be quick to execute, but maybe the object has really complex geometry, so it's taking a long time on the GPU to render out. The other part is that performance can also differ depending on the scenario. Say you build an object, and the object is doing a raycast, doing some kind of checks for collisions. If you have that object in a simple world, maybe it runs pretty fast, but you bring that object into a world with much more complex colliders, and suddenly it starts hurting performance, because now those collision checks are more complex. The other example: say you use a node like find child. You try to search for a child in a hierarchy. If you're in a simple world, maybe the hierarchy of objects doesn't have too much in it, so it runs fast. But then you go into a world which has way more, and now the performance kind of tanks. Now the thing that was running reasonably fast in one world is running slower in the other one. So one of the ideas we kind of had is that we would build some sort of benchmark worlds. We would have different scenarios, complex worlds with complex hierarchies for this and that, and then have a system where you can essentially run that object in that world and see how fast it runs and how that differs depending on the scenario. Overall, I think this will eventually end up with lots of different tools. You'll have tools to measure how long the components take to execute, how long the GPU takes, lots of different tools to analyze different performance things. So that's overall what you should expect: once those tools come in, it's not going to be a single tool, but a range of them that we'll probably keep expanding and building upon. If I could append to that, because I know this has come up occasionally in relation to questions like this: we probably also wouldn't do an arbitrary limiting system, like "oh, you can only have 60,000 triangles", "you can only have X number of seconds of audio on you". We do want to add tools so you can restrict things. It's not a perfect solution, but we want to add tools so people can set some limits on things, because our whole philosophy is that we want to give people a lot of control.
And if you want to run a session where you can spawn an object that has 50 million triangles and everybody's going to be running at 10 FPS, but you're like, I have a beefy GPU, I want to look at this super detailed model, we always want people to have the ability to do that. At the same time, we want to add tools so that if you want to host a chill world and you want to keep it more light, you have tools to set certain limits on the users: how much they can spawn in, how much they can bring in. So we're not going to make those limits forced, but we're much more likely to add tools where you have the kind of control to decide what you want in your world, what you want in your experience. The other aspect of that is the asset variant system, and we already use part of it: you can go into your settings and lower the resolution of textures. You can, for example, clamp it to 2K. So if you're low on VRAM, you can lower the textures, and if somebody has an 8K texture on their avatar, you're only going to load it up to 2K. It's not going to hurt you. But other people, say somebody has one of those 4090s with 24 gigs of VRAM and they don't care, they can keep it unlocked. That's aligned with our philosophy: give people as many tools as possible to control their experience, but don't enforce limits on people where possible. Yeah, that's kind of more so where I was going with that, is that we wouldn't have a sort of hard and fast, these-are-the-rules-for-the-whole-platform kind of rules. Because not everybody's computers are equal, and so maybe I don't want to render your 500 million polygon model, right? But we also want to present this stuff in a sort of unbiased way. Like, I wouldn't want to color someone's polygon count in red or something.
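Circling back to the "find child" example from earlier in this answer, here is a rough standalone illustration. It's generic tree-search code with made-up names, not Resonite's actual slot API: a depth-first search by name costs time proportional to the number of slots it has to visit, so identical object logic gets slower as the world hierarchy grows.

<syntaxhighlight lang="csharp">
using System;
using System.Collections.Generic;
using System.Diagnostics;

// Standalone sketch: why a "find child by name" costs more in a larger world.
class Slot
{
    public string Name = "";
    public List<Slot> Children = new();

    public Slot? FindChild(string name)
    {
        foreach (var child in Children)
        {
            if (child.Name == name) return child;
            var found = child.FindChild(name);   // depth-first search: cost grows with slots visited
            if (found != null) return found;
        }
        return null;
    }
}

class Demo
{
    static Slot BuildWorld(int slots)
    {
        var root = new Slot { Name = "Root" };
        var current = root;
        for (int i = 0; i < slots; i++)
        {
            var child = new Slot { Name = $"Slot{i}" };
            current.Children.Add(child);
            if (i % 10 == 0) current = child;    // mix of wide and deep hierarchy
        }
        return root;
    }

    static void Main()
    {
        foreach (int size in new[] { 1_000, 100_000 })
        {
            var world = BuildWorld(size);
            var sw = Stopwatch.StartNew();
            world.FindChild("DoesNotExist");     // worst case: visits every slot
            sw.Stop();
            Console.WriteLine($"{size,7} slots: {sw.Elapsed.TotalMilliseconds:F2} ms");
        }
    }
}
</syntaxhighlight>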

Would it even be possible to multithread world processing in Resonite?

What do you think? Alright guys, say it with me. Oh gosh, the camera's moving. Hold on, hold on, hold on, hold on. Alright, say it with me. Resonite is not single-threaded. This is a myth that has somehow spread around, that Resonite only runs on a single thread. This is abjectly not true. Yeah, this is a thing we get a lot, because I think people are just like, it runs with poor performance, therefore it's single-threaded. When it comes to multithreading, it's way more complex. It's not a black and white thing. So, the way I kind of put it is, it's not like an on-off switch. Imagine you have a city or something, and the city has poor roads. Maybe there's areas where the roads are very narrow, and it's kind of hard for cars to get through. You can have areas of the city where you have highways, and you can have lots of cars in there. It's not an on-off switch where you just turn a switch and suddenly every road is wide, but you can gradually rebuild more of the city infrastructure to support more of that high bandwidth. With Resonite, there's a lot of things that are multithreaded. There's also a lot of things that could be multithreaded, and they're going to be more multithreaded in the future, but it's essentially not a black and white thing, whether it's either multithreaded or not. You have to think about Resonite as lots of complex systems. There's so many systems, and some of them are multithreaded, some of them are not multithreaded yet and are going to be, and some of them are going to stay single-threaded, because there's not much benefit to them being multithreaded. So we definitely want to do more, but we already have a lot of things running on multiple threads: asset processing is multithreaded, the physics uses multiple threads, and a lot of additional processing spins off, does a bunch of background processing, and then integrates with the main thread. So there's a lot of multithreading in the system already, and there's going to be more. It's not a magic silver bullet, though. With performance, there's a lot of complexity. There's a lot of things that can be causing low performance, and multithreading is not always the best answer. So for example, the .NET 9 switch, that's actually not going to change anything with multithreading, but it essentially makes the code that we already have, with whatever multithreading it has right now, run several times faster, just by switching the runtime, just by having better code gen. So there's a lot of different things that can be done to improve performance, and multithreading is just one of them. I think that should cover a lot of it, but yes. One more thing: when there's a world that's very heavy, it depends what's making it heavy, because some things you can multithread, but some things you cannot. If you have some user content that's doing lots of interactions with things, and you just blindly multithread it, it's going to end up corrupting a bunch of stuff, because with every algorithm there's always a part of it that's irreducible. So we want to introduce more systems that use multithreading where possible, but again, it's not a silver bullet. It's more like a gradual process that happens over time.
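A minimal sketch of the pattern described above, background work spun off to worker threads with the results integrated back on the main thread at a safe point. The class and method names here are made up for illustration; this is not FrooxEngine code, just the general shape of "spin off, then integrate".

<syntaxhighlight lang="csharp">
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

// Illustrative only; not FrooxEngine's actual classes.
class Engine
{
    // Work finished on background threads is queued here...
    private readonly ConcurrentQueue<Action> _mainThreadActions = new();

    public void ProcessAssetInBackground(string asset)
    {
        Task.Run(() =>
        {
            var decoded = $"{asset} (decoded)";            // heavy work off the main thread
            _mainThreadActions.Enqueue(() =>
                Console.WriteLine($"Integrated {decoded} into the world"));
        });
    }

    // ...and applied at a known point in the update loop, so the world state
    // is only ever mutated from one thread and cannot get corrupted.
    public void Update()
    {
        while (_mainThreadActions.TryDequeue(out var action))
            action();
    }
}

class Demo
{
    static void Main()
    {
        var engine = new Engine();
        engine.ProcessAssetInBackground("texture.png");
        engine.ProcessAssetInBackground("mesh.glb");

        for (int frame = 0; frame < 10; frame++)           // simulated update loop
        {
            engine.Update();
            Task.Delay(50).Wait();
        }
    }
}
</syntaxhighlight>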

Are there roadmaps with time estimates, both for development and for what you want Resonite to be?

So for roadmaps, we generally don't do roadmaps super far ahead. Right now our focus is on the performance update, and you can actually find a project board on our GitHub with a list of issues that pertain to the performance update, and you can see how those progress. We don't do time estimates because development varies a lot, and oftentimes things come in that we have to deal with that delay things, or maybe there's additional complexity, so we avoid promising certain dates when we're not confident we can actually keep them. We can give you very rough ones; for example, with the big performance upgrade, I roughly expect it to happen sometime in Q1, sometime early next year. We'll see how it goes, but that would be my rough estimate on that one. After that, once we finish a big task, we usually re-evaluate what would be the next best step for the platform at that point, and we decide: are we going to focus on UI, are we going to implement this thing, are we going to implement that thing. We try to keep our ear to the ground and ask what would give the community and the platform the most benefit right now, what's most needed right now, and we want to make that decision as late as possible.

What are some examples of features you've implemented that you're proud of?

There's a whole bunch; I do a lot of systems. The one I'm actually working on right now, the particle system, I'm pretty proud of. It's technically not out yet, but I'm very happy with how it's going, in part because it now gives us control to very easily make new particle effects and do stuff we were not able to do easily before. The one that came before that is the data feed system. That's a culmination of a lot of approaches I've been developing for how we do UI in Resonite. With that one, one of the big problems we've had with the UI is that Resonite builds a lot of things from the ground up, because of the layers I was talking about earlier in the stream, but that also makes things difficult, because we cannot just take an existing solution and use it. So for a lot of the UI, we actually have to build those systems ourselves and build frameworks to work with them, and the old UIs have the problem where the code for them is this big monolith and it's really hard to work with. If there's a misaligned button or something, we have to go to the code, change some numbers there, change some methods that are called, compile, wait for it to compile, run the application, look at it, be like, that's still wrong, go back to the code, make more changes, compile, wait for it to launch, look at it, it's still wrong, go back to the code. Sometimes people are like, oh, this thing is misaligned in this UI, and fixing that sometimes takes an hour of just messing around, and that's not a very good use of our engineering time. But the data feeds is a system that's very generalized, and it essentially allows us to split the work on UI between the engineering team and our content team. So when we reworked the settings UI, on the code side we only had to worry about the functionality of it, like what's the structure, what are the data interfaces, and then the rest of our team actually built the visuals in-game and put a lot of polish into each of the elements. That process made it much simpler to rework the settings UI, but the even bigger part of it is that the data feed system this is built on is very general, and it's been designed to be general. The settings UI was used as sort of a pilot project for it, but now, once we get to more UI work, we're going to use it to rework the inventory, rework the contacts, rework the world browser, the file browser, rework the inspectors, and it makes the work required to rework those UIs at least an order of magnitude less. Which means that, before the data feeds (these are rough estimates), say it would have taken us two months to rework the inventory UI; now it's going to take us two weeks. Those numbers are more of an illustrative point, but it's essentially on that kind of order. It makes it way simpler, it saves us so much time, which means we can rework a lot more UI in a shorter time span. There's lots of things I'm kind of proud of, I just did the two most recent ones, so I could ramble about this for a while, but we have a lot of questions, so I don't want to hold things up. When you were talking about the build process, that made me think of something that I really enjoyed working on. It's one of those things where it's really important, but it's just so invisible. What I did behind the scenes is I basically reworked the entire build process of FrooxEngine, almost.
Since FrooxEngine has been around for a while, and it's been through many updates to C# and C#'s project system, we were still using the legacy C# project format for MSBuild. That really only works in something like Visual Studio these days. It's kind of hard to work with, it's not quite as robust as the newer build system for .NET, and as a result there would oftentimes be weird issues if you wanted to add packages and stuff, and you could only use something like Visual Studio as your IDE of choice to boot. I saw that, and I decided to poke at it, and it actually ended up being a lot easier than I anticipated, because Microsoft provides a nice little tool to upgrade your projects. So what I did is I went through and upgraded all of the projects to the new C# project format, which means we can take advantage of the much nicer project files, which means it's easier to edit them directly and add build actions and stuff, and it also means the engine can now be built in IDEs other than Visual Studio... Visual Studio proper is what I meant to say there. Now you can build it in something like VS Code, you could probably build it in Rider if you pay for Rider, and you can even build the engine from the command line now, which is really, really good for automated builds. That's a big thing I did that nobody saw, but I'm really, really proud of it. It's one of those things where it doesn't show on the surface, but it makes our lives as developers way easier, because I had so many times where I would literally lose hours just trying to deal with some kind of problem, and having those problems resolved and having the system be nicer allows us to invest more of our time into actually building the things we want to build, instead of dealing with project build issues. One of the problems, for example, one of those weird things, is with ProtoFlux. ProtoFlux is technically a separate system, and we have a project that actually analyses all the nodes and generates C# code that binds them to Resonite. The problem is, with the old MSBuild format, for some reason, even if the project that generates that code runs first, the build process doesn't see any of the new files in that same build pipeline. So if we ever added a new node, we would compile it and it would fail, because it's like, oh, this code doesn't exist, even though it actually exists at the time; it just doesn't see it. With the changes Cyro made, the problem is gone. We don't have to deal with that whole thing. But the really big thing is that it prepares Resonite for a more automated build pipeline, which is something we've been trying to move towards, because it's going to be one of the things that saves us a lot more time as well. It's going to make it so we can just push code into the repository, automated tests are going to run, there's going to be automated builds, the binaries are automatically going to be uploaded, and it's just going to remove all of the manual work that happens all the time. It makes bringing on people like me easier too. It makes it easier to bring on more engineers as well, because now they don't have to deal with those weird issues.
I know Prime also sometimes lost a day just dealing with project issues, a day you could spend working on other stuff; instead you have to just make things work. Thank you, Cyro, for making this. Things like this, even though they're not visible to the community, help a lot.
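For context on what that project-format change looks like in general (a generic example, not FrooxEngine's actual project files): the legacy format lists every source file and reference explicitly and really wants Visual Studio, while an SDK-style project is just a few lines, globs source files automatically, pulls packages via PackageReference, and builds from the command line with dotnet build.

<syntaxhighlight lang="xml">
<!-- Generic SDK-style C# project file, shown for illustration only
     (not FrooxEngine's actual project). Source files are globbed automatically,
     packages come from NuGet via PackageReference, and the whole thing builds
     from the command line with `dotnet build`. -->
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net9.0</TargetFramework>
    <LangVersion>latest</LangVersion>
  </PropertyGroup>
  <ItemGroup>
    <!-- Example dependency; the legacy format needed packages.config plus hint paths. -->
    <PackageReference Include="Newtonsoft.Json" Version="13.0.3" />
  </ItemGroup>
</Project>
</syntaxhighlight>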

With sound system updates, can we get a way to capture a user's voice with permission and import audio streams dynamically into the world?

I don't understand enough about how you want to capture it. But since we'll be handling all the audio rendering, we'll be able to build a virtual microphone that actually captures spatialized audio from wherever it is in the world. So that's one of the things you'll be able to do: you'll be able to bring in the camera and have it stream to a different audio device. So I would say yes on that part, on the capture. I don't know... I think I know what they mean. Am I correct in assuming that you want a way to import multiple streams into the world from a single user? Is that what you're talking about? We'll probably have to wait for them. Yeah, wait a second. We might get back to this question. You'll essentially be able to render audio out from any point in the world, in addition to rendering for the user. And then it becomes a question of what we want to do with it. Do we want to record an audio clip? Do we want to output it into another audio device so we can stream it into something? So that will work. If you want to import audio back in, that's probably a separate thing. That's probably not going to come as part of it. We'll see. If you have any kind of clarification, just ask us more and we'll get back to this.

Will the headless be upgraded to .NET 9?

Yes. We plan to do this soon. It should be mostly just a flip of a switch; we don't expect big issues. One of the things we want to do is make announcements so people know this is coming, so you can prepare your tooling and make sure whatever scripts you're using to update your headlesses don't just explode. There's a GitHub issue on it, and I'll try to make the announcement in a bit, probably sometime next week.

Makes me wonder what's currently the culprit of most crashes. I've seen information that it's Unity that crashes; couldn't you just restart Unity?

We also had a discussion about "couldn't you just..." I mean, for the first part of the question: crashes can have lots of reasons, and it's really hard to say in general. You pretty much have to send us the crash log, we look at it, we look at the call stack and say, this is probably causing it. So it's hard to say in general. For the part about just restarting Unity: that's kind of what a crash is. It essentially breaks, and then it has to shut down and you have to start it again, so in a way you are restarting Unity, it's just that the restart is kind of forced. But this actually ties into something: if you've been here earlier, we've been talking about how FrooxEngine is going to essentially be moved into its own process, and then Unity is going to be handling the rendering. One of the things that I'm considering as part of the design is that Unity can actually be restarted, maybe. So if Unity happens to crash, we can keep running FrooxEngine, start a new Unity, and just reinitialize everything. I do want to make that part of it just in general, to make the system more robust. So it's possible, but TBD, we'll see how that goes.
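The "keep FrooxEngine alive and restart the renderer" idea is essentially a supervisor pattern. Here is a rough, generic sketch of that pattern; the process name and the exit-code convention are hypothetical and this is not the actual design, it just shows the shape of restarting a crashed child process while the parent keeps running.

<syntaxhighlight lang="csharp">
using System;
using System.Diagnostics;
using System.Threading;

// Rough supervisor sketch: the engine process keeps its own state and restarts
// the renderer child process if it crashes. Generic pattern, not Resonite's design.
class RendererSupervisor
{
    static void Main()
    {
        while (true)
        {
            // "Renderer.exe" is a placeholder name for a hypothetical renderer process.
            using var renderer = Process.Start(new ProcessStartInfo("Renderer.exe"));
            if (renderer == null) break;

            renderer.WaitForExit();                       // engine keeps running meanwhile
            if (renderer.ExitCode == 0) break;            // clean shutdown: stop supervising

            Console.WriteLine("Renderer crashed; restarting and re-sending scene state...");
            Thread.Sleep(1000);                           // brief backoff before restart
        }
    }
}
</syntaxhighlight>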

Are there any plans to later remove the patron requirement for the headlesses, when things are more stable and performant?

So at some point we'll probably make it more open. Our tentative goal, and this is not set in stone, so things might change, is to offer a service where we make it easy to auto-spin-up headlesses and move Patreon to that, so if you support us financially you get a certain amount of hours for the headlesses and we make it very easy to host, and if you want to self-host, we give you the headless. We do have to approach it from the business perspective, because Patreon is one of the things that's supporting the platform and allowing us to work on it. So we don't want to compromise that, because if we do something with it that ends up hurting our revenue stream, then we're not able to pay people on our team, and then we're not able to work on things, and things end up kind of bad. We do want it to be accessible to as many people as possible, but we're balancing that with the business side of things.

Cyro also did an FFT node a while ago. With the audio system, could that make waveform visuals or do better detection of bass for music effects?

That's actually separate from this, because that happens fully within Resonite already. The audio system rework is more about rendering the audio output and pushing it to your audio device.

A few people have mentioned that they are not happy with the new walking system and how it looks.

We can always improve things. We just released an update which integrates some of the community settings, which should make it look way better. For things that people still find issues with, we will need reports, because right now, after the update, we're not sure what's making people unhappy about it. We'll have more concrete stuff to work with as people make reports, so we know what to focus on. But yes, in general, we are always willing to improve things. We essentially want to make it as polished as it can be, but we also need more hard data to work with, so we know where to invest our time.

Who is the second person here on the camera?

This is Cyro. He's our engineering intern. Hello. Hi. How you doing guys? It's me.

Are the video players going to be updated with a newer VLC core? I have heard from several builders that the players use a very outdated core.

Yes. The system we use right now is a plugin called UMP (Universal Media Player), which is built on VLC and unfortunately hasn't been updated in years, which means it's using an older version of it. We've been looking into upgrading to the actual official VLC plugin. The problem is it's still not mature enough in some ways. Last I remember, there are issues where you cannot have more than one video at a time; you can only have one, and if you try to do another one, it just explodes. There's other things we can look into, like alternative rendering engines, but there's also a potential time and money investment. If the pros outweigh the cons, we might invest in one, but we need to do some testing there and see how well it works. It's unfortunately a difficult situation because the solutions are limited. It's something we want to improve, but it's also difficult to work with, unfortunately.

Thoughts on about 75% of all users being in private worlds around the clock.

This is not a Resonite problem. This is a problem of scale. All platforms have a pretty wide majority of people who just kind of want to hang out and not really be bothered. Unfortunately, we're not the biggest platform out there. We're still kind of small. And as we grow, that problem will undoubtedly get better. It's not really a technical problem, it's more of a social one, because people behave in a certain way, and it's really hard to change that. There's some things we want to do to entice people and make it easier to discover things, like we were talking about earlier, adding an events UI, so you can see these are the things that are coming up, these are public events that you can join. Right now, I believe there's the Creator Jam event that's going on; it's every weekend, and it's public to everyone. But it depends what people are coming in for, because people might come in and not actually want to join public events; they want to go into those private worlds. The question is, how do you help those people discover the friend groups and hang out in those worlds? It's a challenging problem, especially from the platform perspective, because we can't just force people into public worlds. People will host whatever worlds they like, but we always want to see what kind of tools we can give to entice people and make worlds and socialization easier to discover. But like Cyro said, it is a thing that gets better with scale, once we grow more. There's a number of events, though. Since we don't have the events UI in-game yet, if you go into the Resonite Discord, we have community news, and lots of different communities post regular events there, so people can find what's going on in the platform. It helps a bit in the meantime if people are looking for things to do.

Is Sauce being worked on in parallel with the current performance-related updates?

Yes, it's actually being worked on in parallel. Geenz is one of the main people working on that. We do have meetings now and then, where we synchronize on the status of it. Last time, which was two weeks ago or so, we talked about the multi-process architecture, how that's going to work, how it's going to integrate with FrooxEngine, and how those systems are going to communicate. Geenz's approach was to look at what the current Unity integration has and what would need to be implemented on Sauce's end. However, there's a lot of things that we're actually moving, like the particle system, audio system, input system, lots of things that are going to be moved over into FrooxEngine, so they don't need to be implemented on Sauce's side, and they can focus more on other things. They have a list of Sauce features, and specifically Bevy features, because Sauce is being built around the Bevy rendering engine, which maps to the current features we have. For example, we have lights, do they support shadows; we have reflection probes, do they support this and that. So they're working on making sure there's feature parity there. Once we have the performance upgrade, we can work more on the integration. They've also been working on the Resonite side: Geenz has been consolidating the shaders, because all the shaders we have right now need to be rewritten for Sauce, since the current ones are designed for Unity. Equivalents of those are essentially going to be implemented for the new rendering engine.

How do you make a walkie-talkie system?

That's actually one thing you should be able to do with the new audio system: you'll be able to have a virtual microphone, put it on a thing, and then have it output from another audio source. That might actually be something you'll be able to do once we rework that, because it shouldn't be too difficult to add components for it.

Rigid-body Newtonian physics system when, soon or later?

So, definitely sometime after the performance upgrade. We integrate a physics engine called Bepu Physics, and one of the things we want to do after we move FrooxEngine out of Unity and it's running on .NET 9 is synchronize Bepu to the latest version. Right now we've had to stay on a diverged version, because Bepu Physics used to work with .NET Framework, which is what Resonite is on right now because of Unity, but the newer versions require .NET 5, or maybe they've even bumped it higher, which means we cannot really use those, at least not without lots of backporting. So one of the tasks is going to be to sync it up, and then we'll be able to look at how much work it is, when we want to prioritize it, and how the simulation should be integrated with FrooxEngine. It's also going to help because Bepu Physics is designed to work with modern .NET and be really performant, which is why I personally consider the performance upgrade a prerequisite for implementing it, so we're actually running it with the runtime it's supposed to run with. But there's no specific prioritization right now. Once we're done with the performance update, we might focus more on UI, or focus on IK, or maybe other things; we'll re-evaluate at that point.

Could it be possible to dynamically connect to and disconnect from the VR runtime without restarting the game?

That's not really a thing that even needs the move away from Unity. It's possible to implement it with Unity, it just takes a fair amount of work. So, possible, yes, I would say. The question is whether we're going to invest time into implementing that, and I don't know the answer to that right now.

Would these audio rendering sources allow for spatial data for your own voice? For example, if I want to record a conversation between myself and someone else from third person, without it sounding like I'm right at the camera.

Yes, there wouldn't be an issue because we can just have any sort of listener in the world and just record that with binaural audio and everything.

What flavor of sauce, what does it taste like?

It's very salty. Mayonnaise. Not mayonnaise, it's actually made of his own kind of sauce which is why it's named sauce. Actually, I forget what he calls it. Scotch sauce. Scotch sauce, yes. He makes this really delicious sauce, it's a very salty one, but it has loads of flavors to it.

Cyro, I heard that some people don't trust you and that you don't care.

I'm in desktop a lot, and I'm often either working in FrooxEngine these days, or, I'm kind of audio sensitive and I can get overstimulated kind of easily, so sometimes I will just kind of stand there. Or maybe I won't respond so colorfully. But I like having people around, and so that's why I exist despite that. I'll probably open up a lot more if... how do I put this... Basically, if you want to interact with the Cyro creature, do things like ask before poking my nose or patting my head and stuff. And ask me if you want to send me a contact request. Just don't come up to me and be like, "you're cute", and then click my name and add me. Because then I have to explain, I'm probably not going to add you, man, we talked for maybe two seconds. I need at least 45 seconds. But if you've come across me and I've been in that sort of state where I'm not super talkative, or maybe I seem a little detached, hopefully that sheds a little light on that. I love this place very dearly, and I love all of you very dearly. Cyro is a good bean.

What's the current workflow for identifying performance bottlenecks?

So, generally, the workflow kind of depends, because there's lots of things that can cause performance issues. Usually it's a combination of different things, but it usually starts with observation: seeing what's running slow, when am I lagging, and so on. Once there's that initial observation, we try to narrow down to the root of the issue. For that, we can use a variety of tools. Some of them are in-game; for example, we have stats on how much time certain parts of the process are taking. Once we need more detailed information, we can, for example, run the headless client with the Visual Studio profiling tools, and they actually measure how long is spent in each method, how long is spent in each part of the code. That gives us some data. The other part of it is benchmarking. Once we have a suspicion that a thing is causing a lot of performance problems, we can write a test sample and then run it with different runtimes, run it with different settings, do A/B tests, and see how things change. For example, I've done this with a lot of Resonite's own extension methods: even with stuff like the base vector operations, I would try different ways to implement certain operations, run a benchmark, and see how fast it runs. One thing it depends on is what runtime is used. One thing I would find, for example, is that certain implementations actually run faster with Mono and slower with the modern .NET runtime. There's a lot of things in FrooxEngine where sometimes people decompile it and ask, why is this done this weird way? And in some cases it's because, even though you wouldn't do it that way with more modern code, it interacts better with the runtime used at the time. But for example, with these general operations, I would find that if I compare them with the Mono in Unity and with the modern runtime, they would run 10, sometimes even 100 times faster. There's some other things that also speed up, and some things that stay the same. But generally, it's just a combination of tools. We observe something not performing well, we have a suspicion that this might be causing it, and then we use tools to dig down and figure out the root cause of that problem. So hopefully that answers that. I think there are also some manual profiling tools out there, like Tracy; I know there's some Tracy bindings for C#, which are really cool. That's actually one of the cool things, because there's a bunch of libraries that we cannot even use right now because of the old runtime. Tracy, I think, requires .NET 8 or some newer version. It's listed for .NET 7, but I think it's just interop, so it could work, but it's better to just wait. We do want to integrate more tools. Usually, you have a performance profiling toolset, so you just dig down and figure out where it could be coming from. Sometimes it's easier to find, sometimes it's harder, sometimes you have to do a lot of work. For example, with the testing I've done before, comparing .NET 5 (or whatever version it was) and Mono, I saw this code running way better, so I think it's going to help improve a lot, but it's still usually testing bits and pieces, and it's hard to test the whole thing, because the whole thing doesn't run with that new runtime yet.
That's why for the current performance update we moved the headless first, because moving the headless was much easier, since it exists outside of Unity, and we could run sessions and compare how it performs against the Mono one. And the results we got from that are essentially beyond expectations; it's way faster. That makes us more confident that doing all this work to move FrooxEngine out of Unity is really going to be worth it.
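The vector-operation benchmarking described above can be illustrated with a small BenchmarkDotNet sketch. The specific operations compared here are just examples (not the actual FrooxEngine methods); the interesting part is running the same benchmark project on different runtimes, for example Mono versus modern .NET, and comparing the numbers.

<syntaxhighlight lang="csharp">
using System.Numerics;
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Running;

// Sketch of A/B benchmarking two implementations of a vector operation.
public class VectorDotBenchmark
{
    private Vector3 a = new(1.1f, 2.2f, 3.3f);
    private Vector3 b = new(4.4f, 5.5f, 6.6f);

    [Benchmark(Baseline = true)]
    public float ManualDot() => a.X * b.X + a.Y * b.Y + a.Z * b.Z;

    [Benchmark]
    public float IntrinsicDot() => Vector3.Dot(a, b);   // SIMD-friendly on modern runtimes
}

public class Program
{
    // Run the same project against different target runtimes and compare results.
    public static void Main() => BenchmarkRunner.Run<VectorDotBenchmark>();
}
</syntaxhighlight>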

From my perception, Resonite is somewhat being marketed as a furry social VR platform, which is not the case at all. Are there thoughts about this topic that could maybe bring in more people?

So, we don't really market Resonite as a furry social VR platform. Chroma, who's heading our marketing, actually specifically makes sure that for our own official marketing materials we show diverse avatars, because yes, there's a lot of furries on this platform, and it's also a self-perpetuating thing, where because there's a lot of furries, they bring in a bunch more. We do have lots of other communities as well; they may not be as big, but they are here too. So, we want Resonite to be for everyone. It's not designed specifically for furries. We want everyone to be welcome here. It's sort of a complicated thing, because we try to make the marketing as generalized as possible, but when you come to the platform, you're going to see lots of furries. I think the only way to bring in more people is to showcase lots of different people on the platform, lots of different kinds of communities, but if there's lots of furries, that becomes kind of difficult. It's self-perpetuating. But I think it's also a thing of scale. As we keep growing, there's going to be more different groups of people, and the communities that are different fandoms or just different demographics are going to get bigger, and that's going to help people from those demographics find their groups much easier, once there's more of them. Yeah, Resonite's all about self-expression and being who you want to be and building what you want to build, and furries kind of got that down pat, and so that's probably why you see a lot of them, but everybody can do that. It's not just those people; it's made for every person to come together and hang out and build and just be you, no matter who you are. Yeah, we try to make this platform inclusive and for everyone. It's our goal. We don't want anybody to feel unwelcome. I mean, asterisk, because we don't want hate groups, people like that. So that we would have an issue with, but generally we want this platform to be for everyone.

Is rendering performance being looked into before you move to Sauce as well?

So that's actually a thing Sauce will help with. We don't want to invest too much time into the current rendering pipeline with Unity, because the goal is to move away from it, which means any time we invest improving Unity is essentially going to be wasted and is going to delay the eventual big switch. Sauce is going to use a much more modern rendering method. Right now we're using deferred rendering, which can be quite heavy on memory bandwidth and so on. Sauce is going to use something called clustered forward rendering, which allows lots of dynamic lights while also being much lighter on the hardware. So that should improve rendering performance by itself, and once we make the move we can look for more areas to optimize things, introduce things like impostors, more LOD systems, and so on. So unless there's some very obvious low-hanging fruit with rendering that would take us less than a day, or maybe just a few days, to get a significant boost in performance, we're probably not going to invest much time into it, and instead we want to invest in the move away from Unity.

How straightforward is conversion of our current particles to PhotonDust?

So the conversion, I can't really answer exactly, because the conversion actually isn't written yet. The main focus right now is feature parity. I actually have a list, and I can post it in the devlog if you're curious, where I have all the things that the legacy system has, and I'll be working through that list, making sure that PhotonDust has the same or equivalent functionality. The goal is to make it pretty much equivalent, so it converts and will look either the same or very close, so hopefully there won't be things that are too different. However, sometimes those things only become apparent during the testing period, so once they come up, we'll look at them and decide: this is easy enough to fix, or maybe this one's a little bit more complicated, maybe we just bring it close enough and ask people to manually fix things. We'll have to see how it goes; sometimes it's hard to know these things before they actually happen. But it should have feature parity, well, it's going to have feature parity with the current system, so hopefully things will just work.

Is there a way to stop the dash particles from being shown when streaming?

I don't think there is; I think we would have to implement that. Does it show? It does show, yeah.

What things are currently planned for the whole performance update?

So we actually answered this one earlier, so I'm not going to go into details on this one, but essentially: moving to .NET 9 (we were originally going for .NET 8, but .NET 9 released literally just a week ago or so). In short, there's currently two main systems that need to be moved fully into FrooxEngine, because they're hybrid systems: the particle system, which is being worked on right now, and the sound system, which Cyro did some work on. Once those systems are fully in FrooxEngine, we're going to rework how FrooxEngine interfaces with Unity, and then we're going to move it out into its own process, using .NET 9, and that's going to be the big performance uplift. This video is going to be archived, so if you're curious about more details, I recommend watching it later, because we went into quite a lot of detail on this earlier in the stream.

In the future, could there be a way to override values not just per user, but in different contexts? For example, override the active/enabled state of a slot or component for a specific camera, basically the same concept as RTO, but more flexible.

So, probably not like this. The problem with RTO is that if you want to override certain things, for example in rendering: when rendering is happening, all the work on updating the world is already complete, which means the renderer actually has much more limited functionality in what it can change. Probably the best way to handle situations like that is to have multiple copies of whatever you want to change, or whatever system you want to have, and mark each one to show in a different context, but you need to manually set them up. Consider a scenario where you override an active/enabled state. That component might have a lot of complex functionality; maybe there's even ProtoFlux or some other components that are reading the active state and doing things based on it being enabled or disabled, and once you get into that realm, the effect of that single enabled state can be very complex, where you can literally have a bunch of ProtoFlux that does a bunch of modifications to the scene when that state changes. That's too complex for something like the renderer to resolve, because you would have to run another update on the world just to resolve those differences, and the complexity of that system essentially explodes. So probably not in that sense. If you give us more details on what you want to achieve, we can give a more specific answer, but that's pretty much as much as I can say on this in general.

Was the locomotion animation system one of the Unity systems that needs to be reimplemented in FrooxEngine, or was it something else?

That one was something else; it came as part of a business contract. It's not something I wanted to prioritize myself. It's kind of a complicated situation, but unfortunately it was necessary at the time, and I'm not super happy with how that whole thing went, because it came at the wrong time. We don't have a lot of systems for dealing with animation, which would have made these things much easier, and we had never worked with IK itself, which would also have made things easier, so there was a lot of foundational work that was not there, and the timeline was really short. So it was pretty much a month of constant crunch just working on it, and there wasn't enough time to get it through properly. It's a complicated situation, unfortunately. It's a thing that kind of happens sometimes with businesses: you end up in a situation where you don't really have a good path, so you just have to deal with it. We want to eliminate those kinds of situations, and we've had a number of conversations internally to see how we prevent this from happening again, how we make sure we don't end up in a situation where we have to do something like that. We have a much better understanding of the problem now, so if a situation like this were to occur again, we're going to be better equipped on the communication side, on how we deal with it, and on how we make sure it doesn't mess with our priorities and the things we need to focus on. So it was a messy situation, I'm not happy with how I handled some of the things with it, but it is what it is, and the best thing we can do right now is learn from it and try to improve things.

How are you compiling the questions from the stream chat?

We do have this thing here where we're going through the questions. This is an older one, I need to grab a bigger one, but it's sorting the questions for us. The Twitch nodes had actually broken a while back, where the displays of them stopped working, and that got fixed very recently; I pushed the update for it last week.

Mods were able to access internal arrays to edit things like color over lifetime. Will those be properly converted?

It's just going to work out of the box. The good news is there's also new modules, because PhotonDust, the new particle system, is designed to be way more modular. So there are modules where, instead of just the internal array, you can also specify the color over lifetime using a texture, or using a starting and ending color. You can also do a starting and ending color in the HSV color space, so there's a lot of new color effects it can do, which is going to give you more control over the particle system. And we can always add more, because we now have full control of the system, so those modules are very easy to write.
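To show what "color over lifetime in the HSV color space" means in practice, here is a small standalone sketch of the underlying math (illustrative only, not PhotonDust's actual module code): interpolate hue, saturation, and value between a start and an end color over the particle's normalized age, then convert back to RGB.

<syntaxhighlight lang="csharp">
using System;

// Sketch of color-over-lifetime evaluated in HSV space; not PhotonDust's actual code.
static class HsvColorOverLifetime
{
    // t = normalized particle age in [0, 1]
    public static (float R, float G, float B) Evaluate(
        (float H, float S, float V) start, (float H, float S, float V) end, float t)
    {
        float h = start.H + (end.H - start.H) * t;
        float s = start.S + (end.S - start.S) * t;
        float v = start.V + (end.V - start.V) * t;
        return HsvToRgb(h, s, v);
    }

    static (float, float, float) HsvToRgb(float h, float s, float v)
    {
        h = ((h % 360f) + 360f) % 360f;              // wrap hue into [0, 360)
        float c = v * s;
        float x = c * (1f - MathF.Abs(h / 60f % 2f - 1f));
        float m = v - c;
        (float r, float g, float b) = (h / 60f) switch
        {
            < 1f => (c, x, 0f),
            < 2f => (x, c, 0f),
            < 3f => (0f, c, x),
            < 4f => (0f, x, c),
            < 5f => (x, 0f, c),
            _    => (c, 0f, x),
        };
        return (r + m, g + m, b + m);
    }

    static void Main()
    {
        var red  = (H: 0f,   S: 1f, V: 1f);
        var blue = (H: 240f, S: 1f, V: 1f);
        for (float t = 0f; t <= 1f; t += 0.25f)      // sweeps through green-ish hues mid-life
            Console.WriteLine($"t={t:F2} -> {Evaluate(red, blue, t)}");
    }
}
</syntaxhighlight>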

On the topic of the platform being for everyone, why were nipples allowed?

So, the reason why we wanted to take a stand on top equality, that's what this issue is called, by the way, it's called top equality, is because ultimately, if a man can have a bare chest, why can't a woman? The only difference is that on average women have larger chests than men. We're also an EU-based company, and I'm from Europe; this is the stance in a lot of places in Europe too, where top equality is just sort of the norm, and we want to normalize that, because we do need this kind of equality. Why can't a woman be topless in a non-sexual context? There's no good reason for that double standard. There's also a thing with this: we believe in equality and we believe in a lot of progress, so that's the stance we take, but we also give you tools to deal with it yourself, so if it's something you really don't want to see, there's an avatar block function. You can block those people, and they will not appear for you. There's probably more we can do in that area as well, but ultimately we want to be very open and very progressive as a company when it comes to these things. I would also really recommend asking this question in the moderation office hours, because the moderation team deals with this in a lot of detail and they're going to have a lot more context for these things. I also don't necessarily believe the majority of people on the internet are against this; there's a good chunk of people who are very open about it, and I feel that chunk is growing, people are getting more open with things. I do recommend bringing this to the moderation office hours; they'll be able to give you a much better answer because they've been dealing with this topic for a while, so take what we say with a little bit of a grain of salt. I don't want to step on the moderation team's toes with that. Yeah, I was going to say something to wrap it up. What was I going to say? I also don't know what you mean by this rule being exploited by transgender males and females, but being transgender has nothing to do with this. Whether you want to be a boy or want to be a girl has no bearing on this rule. That's part of the point of the rule, too: it erases that kind of disparity, so it doesn't really matter. If you do feel there's some exploit, you can always file moderation reports, or you can bring it to the moderation office hours and discuss it there. Then we can see what is happening and evaluate whether it fits with our rules or not. So if you feel there's some issue, you can make us aware of it. We can't promise that we're going to agree with you or have the same view on it, but we can at the very least look at it and listen to what you have to say.

Hearsay: I have heard from someone that they tried to report someone to the moderation team, but because that person was connected to the team nothing came of it, and they ended up banned instead.

So, I understand there aren't super many details here, but I can talk in general. There are actually two things. We do have cases where there are reports against people who are on the moderation team, or even on the Resonite team. If it's a report against someone on the moderation team, that will usually go to the moderation leads, so the person being reported cannot deal with it themselves, and the leads will investigate. We actually have multiple moderation leads as well, so it's not like there's a single person who can just bury the thing; there are multiple people who can all see the same data and check on each other. If there's an issue with somebody on the actual Resonite team, that usually goes to Canadian Git, who handles those things, and he brings them to me. We've had to deal with difficult situations before, both on the team and on the moderation team. I can't really go into details because there are privacy issues with that, but I can tell you there have been cases where people on the moderation team had to permanently ban people who were their friends, even long-time friends, because they did something wrong. That caused the people on the moderation team a lot of distress, but they still made the decision to ban their friend, because they want to uphold the moderation rules above all else. I've looked at a few of those cases, because I do want to make sure things are going okay and there's no favoritism happening, and I've been involved in a few of those cases as well, as part of the discussions and so on. There have been a number of difficult discussions on those, and in every single one, if there was sufficient evidence of somebody's wrongdoing, even if we knew that person personally, even if they were connected to the team, they were still banned. There's one thing I've noticed in general: when people do get banned, they're almost never truthful about the reason. As part of the moderation process, if somebody ends up being banned, they will usually receive warnings first, depending on the severity, and if they do end up banned, the reasoning is explained to them. Oftentimes somebody from the team will actually sit down with them and say: we have this evidence, this happened, you're getting banned for these reasons. They are made aware of it. And in a lot of cases, those people will come out and give completely different reasons for why they were banned. That puts us in a difficult situation, because we value privacy, and sometimes giving details to the public could put innocent people who were involved in those incidents at risk, so we cannot really say what the person was actually banned for. But it is a thing that happens. So the only thing I can request is: be more skeptical about what people say about these things. If you see something you believe is wrong, you can always send us a report; we will look at it, evaluate it, and see what evidence we have. But we will not necessarily tell you the details of how it was resolved, to protect the privacy and potential safety of the people involved.

Does the question mark need to be at the end of the question?

I think it doesn't need to be. I think it can go in the middle, but just to be sure, I would put it... actually, no. I literally see a question that has a question mark in the middle of it, so no, it doesn't need to be at the end.

Any more flux nodes in the works? If yes, which excites you the most?

We are working on some new ones. Which ones am I working on again? Oh yes, there is a new ProtoFlux node I'm particularly excited about. For those of you who do ProtoFlux, there is currently a node where you can perform a raycast, which shoots an infinitely thin line, and wherever it hits you can get the position, the direction, stuff like that. What I'm going to implement is sweeping, which I think is also called shapecasting on some other platforms. It's essentially a way of doing thick raycasts using a shape that you extrude in the direction you want it to go. So if you wanted to shoot a sphere in a direction, you would essentially be shooting a capsule however long you want to shoot it, and anything within there it would hit. Or in this case, the first thing it hits it will return, basically exactly like a raycast, but thick, and you can do that with different shapes like a sphere or a cube, and I think you should also be able to do it with convex hulls; I'm not sure if we have that one. At the very least, you'll be able to do it with spheres, cubes, cylinders, capsules, and so on. I think that will be very useful, especially for those of you who make vehicles and don't want your raycast to shoot between two infinitely close triangles in the geometry so your car goes flying across the map. Yeah, thick raycasts. We do have a lot of the functionality already as part of the core; we use it internally in our own engine. For example, the laser actually uses sweeps to behave a bit better. And this is going to expose them, so you can also use them from ProtoFlux.
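To illustrate what a sweep ("thick raycast") does compared to a regular raycast, here is a small, self-contained C# sketch; it is purely conceptual and not the engine's code or the upcoming node's API. It sweeps a sphere against a set of sphere colliders by inflating each target's radius by the swept sphere's radius, which reduces the problem to an ordinary ray-versus-sphere test:

using System;
using System.Collections.Generic;
using System.Numerics;

record SphereCollider(Vector3 Center, float Radius);

static class Sweep
{
    // Returns the hit distance along the normalized direction, or null on a miss.
    public static float? SphereSweep(Vector3 origin, Vector3 dir, float sweepRadius,
                                     float maxDistance, IEnumerable<SphereCollider> colliders)
    {
        float? best = null;
        foreach (var c in colliders)
        {
            float r = c.Radius + sweepRadius;      // Minkowski inflation: "thick" ray
            Vector3 m = origin - c.Center;
            float b = Vector3.Dot(m, dir);
            float cc = Vector3.Dot(m, m) - r * r;
            float disc = b * b - cc;
            if (disc < 0) continue;                // ray misses the inflated sphere
            float t = -b - MathF.Sqrt(disc);       // first intersection along the ray
            if (t < 0 || t > maxDistance) continue;
            if (best is null || t < best) best = t;
        }
        return best;                               // nearest hit, just like a raycast
    }
}

With a sweepRadius of zero this degenerates to a plain raycast, which is exactly why a sweep is handy for vehicles: the swept volume cannot slip through the seam between two adjacent triangles the way an infinitely thin ray can.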

What app are you using to do those scans?

For most of my scans, I'm using software called Agisoft Metashape. It's photogrammetry software: you take lots of pictures of the subject from lots of different angles, and it's able to do the reconstruction. Based on the patterns in the photos it figures out where the photos were taken from, and then it reconstructs a mesh. I also sometimes use additional software; for example, with certain photos I'll use Photoshop to do an AI denoise on them, which helps increase the quality of the scans, and I'll also do some tuning of the lighting and so on. But Metashape is the main one. There's also one I started experimenting with a few days ago, and I literally used it on my room: it's called PostShot, let me see the full name, Jawset Postshot. That one is for Gaussian splatting, which is this new technique for 3D reconstruction, or rendering more generally, which can reconstruct scenes with much better fidelity. We're playing with it; since I have all my datasets, I've just been throwing them at it to see how it works with different things, so I might integrate it more into my workflow as I go. I posted a quick video and have a bunch more that I'll be posting soon-ish. But yeah, Metashape is the main one I use; it makes it easy to just get a mesh and bring it in.

Will anything be done regarding people who do not want to see bare top females in public sessions?

So in the future we do want to implement things like content tagging, and that will come with the ability to tick a checkbox so that things tagged a certain way won't show for you anymore, so you could make use of that. That's something we will do in the future. But other than that, for the time being, if you don't want to see that, don't go to those sessions. Well, you can still go to those sessions, because we do have the ability to block somebody's avatar. I can actually show you: if I click on Cyro's name... careful, it might ban me from the session... no, it should be just block avatar. There we go, see, now Cyro is gone. I don't have to look at him anymore. Well, he's still in the session, but I don't have to look at his avatar anymore. Yeah, that is something I forgot about; this is one of the reasons we added it. You have the power: if some avatar is legitimately upsetting you, you can block it. The other part is that if you host your own sessions, you can enforce your own rules; we do allow for that, with some caveats. So if you want to enforce a dress code, that's completely up to you, you have that freedom. You can always add additional rules to whatever sessions you want to host. Eventually the content tagging system should make these things more generalized, so you don't even have to see it in the first place, as long as the content is properly tagged. You can filter certain things out, you can block certain avatars. We do want to give you the tools, but we don't want to make global decisions just forbidding these things for everyone. There is a nuance I was going to get to there: if you decide not to allow it, let's say you decide you don't want to see nipples in your world, that also has to apply to the men in the session. It is universal, you cannot discriminate, so it's either nipples allowed for all, or no nipples at all. That actually reminds me of one thing that was particularly funny to me: at a Creator Jam they actually made a nipple gun they were shooting around the world, and people got upset, and they were like, oh no, it's okay, those are male nipples, not female nipples. It was a funny way to point out that double standard.