Gestures


This article or section is a Stub. You can help the Resonite Wiki by expanding it.


Preface

Facial animations in Resonite are a way of showing expression on your avatar. Whether it's an avatar that gets stars in its eyes when you press a button, automatically gives a confused look when you tilt your head, or flops its ears down when you frown with a facial tracker, there are many cool and unique ways you can make facial expressions on an avatar.

Keyboard

Resonite allows you to use any key on your keyboard as a way of controlling your expressions using ProtoFlux.

To do so, plug a Key enum into a Key Pressed node and that into a Fire On True, along with the user you want to check (usually a Get Active User Self node), to fire an impulse when the key is pressed. Building the same setup with a Key Released node and a Fire On False will let you tell when the key is released. Feeding the two impulse outputs into a Data Model Boolean Toggle gives you a boolean for whether the key is currently pressed.

An example is also below:


This article or section is a Stub. You can help the Resonite Wiki by expanding it.
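Since ProtoFlux is a visual node language, the graph itself can't be reproduced in text, but the data flow can be sketched. Below is a minimal Python sketch of the same logic, assuming illustrative names (KeyHeldState, fire_on_true, fire_on_false) that are not Resonite or ProtoFlux APIs:

```python
# Minimal logic sketch of the graph described above.
# KeyHeldState and its methods are illustrative names only,
# not Resonite or ProtoFlux APIs.

class KeyHeldState:
    """Mirrors a Data Model Boolean Toggle driven by two impulses."""

    def __init__(self):
        self.is_pressed = False  # the boolean the toggle exposes

    def fire_on_true(self):
        # Impulse from Key Pressed -> Fire On True
        self.is_pressed = True

    def fire_on_false(self):
        # Impulse from Key Released -> Fire On False
        self.is_pressed = False

state = KeyHeldState()
state.fire_on_true()   # the user presses the key
assert state.is_pressed
state.fire_on_false()  # the user releases the key
assert not state.is_pressed
```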


Plugging this output into Switches or into a Zero One node can turn the boolean into numbers to control facial expressions, shapekeys, materials, or bone positions.
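As a rough illustration, here is what a Zero One node does conceptually, mapping false to 0 and true to 1 (zero_one is an illustrative name in this sketch, not a Resonite node or API):

```python
# Conceptual sketch of a Zero One node: false -> 0.0, true -> 1.0.
def zero_one(value: bool) -> float:
    return 1.0 if value else 0.0

key_held = True                        # boolean from the toggle above
blendshape_weight = zero_one(key_held)
print(blendshape_weight)               # 1.0 while held, 0.0 otherwise
```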

Controller Buttons

The second most affordable option, similar in difficulty, is controller input. Many ProtoFlux nodes can provide controller inputs; these include Standard Controller, Index Controller, and Touch Controller, to name a few from the Devices/Controllers ProtoFlux category. These nodes provide data such as booleans for button inputs and float2s for joysticks (and, on Index controllers, the position your thumb is touching on the trackpad). These values can be used with boolean logic nodes and basic math nodes to drive shapekeys and facial expressions.

Example circuits:

This article or section is a Stub. You can help the Resonite Wiki by expanding it.
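To give a sense of the math such a circuit performs, here is a hypothetical Python sketch that gates a joystick axis behind a button to produce a shapekey weight; the function name and the specific mapping are assumptions for illustration, not actual node behavior:

```python
# Hypothetical sketch: combine a button boolean and a joystick float2's
# vertical axis into a 0..1 expression weight.
def smile_weight(trigger_held: bool, joystick_y: float) -> float:
    # Clamp the joystick's vertical axis into 0..1, gated by the trigger.
    amount = max(0.0, min(1.0, joystick_y))
    return amount if trigger_held else 0.0

print(smile_weight(True, 0.75))   # 0.75 -> mostly smiling
print(smile_weight(False, 0.75))  # 0.0  -> trigger not held
```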


You can also use the rotation or pointing direction of your body limbs to drive shapekeys and actions, using vector math nodes and transform math nodes from the Transform category to make body-gesture-based facial expressions. This can be useful for users who have only one controller, lack support for their controller, or have no hands with which to use a controller.

Example circuits:

This article or section is a Stub. You can help the Resonite Wiki by expanding it.
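As a rough idea of the vector math involved, the following Python sketch maps how far a head's up vector is tilted away from world up onto a 0..1 shapekey weight; the function and its threshold are illustrative assumptions, not Resonite nodes:

```python
import math

# Illustrative sketch: map head tilt (angle between the head's up vector
# and world up) onto a 0..1 "confused" shapekey weight.
def tilt_weight(head_up: tuple[float, float, float],
                max_angle_deg: float = 45.0) -> float:
    world_up = (0.0, 1.0, 0.0)
    dot = sum(a * b for a, b in zip(head_up, world_up))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
    return max(0.0, min(1.0, angle / max_angle_deg))

print(tilt_weight((0.0, 1.0, 0.0)))  # upright head -> 0.0
tilted = (math.sin(math.radians(30.0)), math.cos(math.radians(30.0)), 0.0)
print(tilt_weight(tilted))           # 30 degree tilt -> ~0.67
```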


Facial Tracker Based

If you are using a facial tracker, Resonite itself and various community modifications support almost every facial tracker on the market. With Resonite, connecting a supported facial tracker is easy and only requires the software that came with your facial tracker. Resonite will automatically hook into your facial tracker's native driver software and start working right out of the box, even if you're using a modification to the game to support your tracker.

List of supported trackers and whether mods are required:

Known Facial Trackers

Tracker | Is Supported | Required Mod | GitHub Link | Limitations
Vive standalone tracker | Yes | N/A | N/A | None
Quest Pro | Yes | (stub) | (stub) | None
Vive Pro Eye | Yes | N/A | N/A | None

Once a tracker is connected to Resonite, avatars that support facial tracking will automatically have facial movements, driven by the raw data your facial tracker provides. These facial movements are driven by the Avatar Expression Driver Component. You can also use this component to drive a Float Value Field Component rather than shapekeys on a mesh, and use that value as an input to ProtoFlux or other components.
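As a conceptual sketch only, the following Python snippet shows the kind of remapping involved when a raw tracker value drives a float target; the function name and output range are assumptions for illustration, not the component's actual API:

```python
# Conceptual sketch: clamp a raw tracker expression value to 0..1 and
# remap it onto a target range, standing in for a driven float field.
def drive_expression(raw_value: float,
                     min_out: float = 0.0,
                     max_out: float = 1.0) -> float:
    clamped = max(0.0, min(1.0, raw_value))
    return min_out + clamped * (max_out - min_out)

jaw_open = drive_expression(0.4)  # feed this into ProtoFlux or components
print(jaw_open)                   # 0.4
```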