Latest revision as of 04:26, 28 May 2024
This article or section is a Stub. You can help the Resonite Wiki by expanding it.
Preface
Facial animations in Resonite are a way of showing expression on your avatar. Whether that be an avatar with stars in its eyes when you press a button, when you tilt your head you automatically give a confused look, or when you frown with a facial tracker your ears flop down, there are many cool and unique ways you can make facial expressions on an avatar.
Keyboard
Resonite allows you to use any key on your keyboard as a way of controlling your expressions using ProtoFlux.
To do so, plug a Key enum into a Key Pressed node, and plug that into a Fire On True along with the user you want to check (usually a Get Active User Self node) to detect when the key is pressed. Building the same setup with a Key Released node and a Fire On False will tell you when the key is released. Feeding the two impulse outputs into a Data Model Boolean Toggle gives you a boolean for whether the key is currently held.
Plugging this output into Switches or into a Zero One can turn this boolean into numbers to control facial expressions, shapekeys, materials, or bone positions.
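ProtoFlux is a visual node language, so there is no text code to copy, but the press/release toggle logic described above can be sketched in plain Python. All names here are illustrative stand-ins for the nodes, not a real Resonite API:

```python
# Conceptual analog of the ProtoFlux graph: Key Pressed -> Fire On True and
# Key Released -> Fire On False feed a Data Model Boolean Toggle, whose
# boolean is then turned into a number by a Zero One node.

class DataModelBooleanToggle:
    """Stands in for the Data Model Boolean Toggle: two impulses set/clear a bool."""
    def __init__(self):
        self.state = False

    def set_true(self):
        # Wired from Key Pressed -> Fire On True
        self.state = True

    def set_false(self):
        # Wired from Key Released -> Fire On False
        self.state = False

def zero_one(value: bool) -> float:
    """Stands in for the Zero One node: False -> 0.0, True -> 1.0."""
    return 1.0 if value else 0.0

toggle = DataModelBooleanToggle()
toggle.set_true()                          # the user pressed the key
smile_weight = zero_one(toggle.state)      # drive a shapekey weight with it
print(smile_weight)                        # 1.0
```

The resulting 0/1 value is what you would feed into a shapekey, material property, or bone position driver.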
Controller Buttons
The second most affordable option, and no more difficult, is controller input. ProtoFlux provides many nodes for reading controller inputs, including Standard Controller, Index Controller, and Touch Controller from the Devices/Controllers ProtoFlux category. These nodes expose data such as booleans for button presses and float2 values for joysticks (and, on the Index, for where your thumb is touching the touchpad). These values can be combined with boolean logic nodes and basic math nodes to drive shapekeys and facial expressions.
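As a rough sketch of how those controller outputs combine, here are two hypothetical functions standing in for boolean-logic and math nodes (illustrative only, not real Resonite calls):

```python
# Combining controller data the way ProtoFlux logic/math nodes would:
# a bool button gates a float trigger, and a joystick axis is remapped
# into a 0..1 shapekey weight.

def blink_weight(trigger: float, grip_pressed: bool) -> float:
    """AND-style gating: only blink while the grip is held,
    scaled by how far the trigger is pulled (0..1)."""
    return trigger if grip_pressed else 0.0

def mouth_from_joystick(joystick: tuple) -> float:
    """Remap the joystick's vertical axis (-1..1) to a 0..1 weight."""
    _, y = joystick
    return (y + 1.0) / 2.0

print(blink_weight(0.5, True))             # 0.5
print(mouth_from_joystick((0.0, 1.0)))     # 1.0
```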
You can also use the rotation or pointing direction of your body limbs to drive shapekeys and actions, using vector math nodes and the transform nodes from the Transform category to build gesture-based facial expressions. This is useful for users who have only one controller, whose controller is unsupported, or who have no hands to use a controller.
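A minimal sketch of the vector math involved, assuming unit-length direction vectors (illustrative Python, not a Resonite API): the dot product of a limb's pointing direction with a target direction measures how closely they align, which can then be clamped into a shapekey weight.

```python
# Gesture detection via vector math: dot product of two unit vectors is
# 1.0 when they point the same way, 0.0 when perpendicular, -1.0 when
# opposite; clamping to 0..1 yields a usable shapekey weight.

def gesture_weight(limb_dir, target_dir) -> float:
    """How closely limb_dir points along target_dir, clamped to 0..1."""
    dot = sum(a * b for a, b in zip(limb_dir, target_dir))
    return max(0.0, min(1.0, dot))

# Hand pointing straight forward, target also forward -> full weight.
print(gesture_weight((0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))   # 1.0
```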
Facial Tracker Based
If you are using a facial tracker, Resonite itself and various community modifications support almost every facial tracker on the market. Connecting a supported facial tracker is easy and only requires the software that came with it: Resonite automatically hooks into the tracker's native driver software and starts working right out of the box, even if you're using a modification to the game to support your tracker.
List of supported trackers and whether mods are required:
Known Facial Trackers

| Tracker | Is Supported | Limitations | Alternative Mod Github Link |
| --- | --- | --- | --- |
| Vive standalone tracker | Yes | None | N/A |
| Quest Pro | Yes | Steam Link only | Mod for Virtual Desktop |
| Vive Pro Eye | Yes | None | N/A |
Once a tracker is connected to Resonite, avatars that support facial tracking will automatically have facial movements, driven by the raw data your facial tracker provides. These movements are driven by the Avatar Expression Driver Component. You can also use this component to drive a Float Value Field Component rather than shapekeys on a mesh, and use that value as an input to ProtoFlux or other components.
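Conceptually, such a driver remaps a raw tracker reading into a 0..1 weight for a blendshape or value field. A hedged sketch of that remapping (illustrative only; the component's actual parameters and behavior may differ):

```python
# Remap a raw tracker value through assumed min/max bounds onto a 0..1
# weight, clamped at both ends -- the general shape of what an
# expression driver does with raw tracker data.

def drive_expression(raw: float, source_min: float, source_max: float) -> float:
    """Remap raw from [source_min, source_max] into a clamped 0..1 weight."""
    span = source_max - source_min
    t = (raw - source_min) / span if span else 0.0
    return max(0.0, min(1.0, t))

# Hypothetical jaw-open reading of 0.35 within assumed bounds 0.1..0.6
jaw_open_weight = drive_expression(0.35, 0.1, 0.6)
```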