2. Using the Hand Blueprints
Within the Content/Blueprints folder, there are two SkeletalMesh Actor Blueprints:
These Blueprints are essentially identical, differing only in a boolean that determines the "handedness" of the glove and in a re-oriented model. The picture below shows the parameters required to set up an instance of a DataGlove using a GloveController as the target.
3. Required Parameters for the "Create Data Glove" Blueprint Function
- Is Right Hand (boolean)
- Is Using Bluetooth Device Name (boolean)
- Check this to scan for a specific Bluetooth device name
- Bluetooth Device Name (fname)
- Name of the Data Glove you want to connect to (example: Forte R H100)
- Controller Platform (Enum)
- Oculus Controller
- Used with Rift and Quest
- Vive Tracker
- No Tracker
- Used for applications that don’t need VR
- Poseable Mesh
- Knuckle Joint Names Array (fname array)
- Middle Joint Names Array (fname array)
- Tip Joint Names Array (fname array)
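As a rough illustration, the parameters above can be mirrored in a plain C++ struct. The field names come from the list above; the struct itself, its default values, and the use of std::string in place of Unreal's FName are assumptions made for this sketch:

```cpp
#include <string>
#include <vector>

// Hypothetical mirror of the "Create Data Glove" parameters. The real
// Blueprint node uses Unreal types (bool, FName, an enum, FName arrays).
enum class EControllerPlatform { OculusController, ViveTracker, NoTracker };

struct FCreateDataGloveParams {
    bool bIsRightHand = true;                 // handedness of the glove
    bool bIsUsingBluetoothDeviceName = false; // scan for a specific device name
    std::string BluetoothDeviceName;          // e.g. "Forte R H100"
    EControllerPlatform ControllerPlatform =  // illustrative default
        EControllerPlatform::NoTracker;
    std::vector<std::string> KnuckleJointNames; // per-finger knuckle bones
    std::vector<std::string> MiddleJointNames;  // per-finger middle bones
    std::vector<std::string> TipJointNames;     // per-finger tip bones
};
```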
4. Setting the Joint Name Arrays
The Joint Name Array parameters in the "Create Data Glove" function are determined by the names of the bones/joints in your model. The Unreal DataGlove SDK uses the SteamVR hand model for the RightHand and LeftHand Blueprints.
The SteamVR hand model uses its own joint-naming convention, so a different model would need a new set of names to function. At BeBop Sensors, some of our custom hand models use the same naming convention, so only the model needs to be swapped out and the Blueprint visual scripting nodes can remain the same.
Below is a picture showcasing the Macro used to set simple Name arrays:
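For illustration, here is a minimal sketch of building the three joint-name arrays from a shared per-finger naming scheme. The bone names generated here are placeholders, not the SteamVR model's actual names; substitute the names from your model's skeleton:

```cpp
#include <string>
#include <vector>

// Builds the three joint-name arrays (knuckle, middle, tip) for the five
// fingers from a shared naming pattern. The "<finger>_<joint>_<suffix>"
// pattern is a placeholder convention, not the SteamVR model's.
struct FJointNameArrays {
    std::vector<std::string> Knuckle, Middle, Tip;
};

FJointNameArrays BuildJointNames(const std::string& Suffix) {
    const char* Fingers[] = {"thumb", "index", "middle", "ring", "pinky"};
    FJointNameArrays Out;
    for (const char* Finger : Fingers) {
        // e.g. "index_knuckle_r" for a right-hand model
        Out.Knuckle.push_back(std::string(Finger) + "_knuckle_" + Suffix);
        Out.Middle.push_back(std::string(Finger) + "_middle_" + Suffix);
        Out.Tip.push_back(std::string(Finger) + "_tip_" + Suffix);
    }
    return Out;
}
```

Whatever scheme your model uses, the three arrays must stay in the same finger order so each knuckle/middle/tip triple refers to the same finger.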
5. Swapping out a hand 3D model in the BluePrint
Swapping out a hand model is simple. Under the Components tab of the Blueprint (in the top left corner of the Editor), there are two Mesh components:
Each component has a mesh field; simply replace each mesh with the new desired mesh. Make sure that the joint names of the new mesh match the ones being passed to the "Create Data Glove" function.
6. Detailed Look at the GloveController Component and exposed Blueprint functions
The last section briefly introduced the GloveController component in the Hand Blueprints. To recap, the GloveController component is a C++ class located in the C++ Classes/DataGlove folder. The GloveController acts as a bridge, handling incoming data from the Data Glove into Unreal and sending commands back to the glove.
All of the Data Glove API functions are exposed for use in the Hand Blueprints. The GloveController handles multi-platform communication with both the Windows DLL and the Android JAR file, which act as plugins for the Data Glove API on each platform.
Notes on Windows DLL:
- The Windows DLL is dynamically loaded.
- Function typedefs are created for the core functions of the Data Glove API.
- The function pointers are resolved and bound at runtime.
- The Build.cs file copies the DLL to the local Plugins folder.
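The typedef-and-resolve pattern described above can be sketched in plain C++. In the plugin this would go through the platform's dynamic-loader calls (e.g. Unreal's FPlatformProcess::GetDllHandle / GetDllExport); here a lookup table stands in for the DLL so the sketch is self-contained, and the function name "ToggleIMU" is illustrative, not the real API:

```cpp
#include <map>
#include <string>

// A typedef describing the signature of a core API function, as the
// GloveController declares for each function it imports from the DLL.
typedef void (*DataGloveFunc)(bool);

static bool GImuEnabled = false;
static void ToggleIMUImpl(bool bEnable) { GImuEnabled = bEnable; }

// Stand-in for GetDllExport: resolve an exported symbol name to a
// function pointer at runtime, or nullptr if the symbol is missing.
DataGloveFunc GetExport(const std::string& Name) {
    static const std::map<std::string, DataGloveFunc> Exports = {
        {"ToggleIMU", &ToggleIMUImpl},
    };
    auto It = Exports.find(Name);
    return It == Exports.end() ? nullptr : It->second;
}
```

Checking the resolved pointer against nullptr before calling it is what makes runtime binding safe when the DLL is absent or an export is renamed.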
Notes on the Android JAR:
- The GloveController uses JNI to find the core Data Glove API functions via DatagloveAPL.xml.
- DatagloveAPL.xml acts as a bridge between the C++ functions and the Java functions in the JAR.
- The Build.cs file copies the JAR to the Intermediate/Android/APK/libs folder.
The following image showcases the 6 modules the GloveController offers:
- Data Glove and Sensor Information
- Data Glove Calibration
- Data Glove Device Info
- Data Glove Haptic Library
- Data Glove Sensor Information
- Data Glove Setup
- Contains sub-modules with additional functions used during the setup process
7. Setting up the GloveController
- The setup module of the DataGlove API contains functions for setting up a Data Glove connection, setting default parameters, setting up VR controller offsets, and toggling various components.
- The IMU and sensor data can be toggled on/off when the Controller Platform is set to No Tracker in the "Create Data Glove" function. By default, since VR platforms provide both rotation and translation via their trackers/controllers, the IMU quaternion rotation data is not needed and only the tracker/controller's orientation data is used.
- The VR Controller positional and rotational offset functions are a way to programmatically set the offset, so the hierarchy of the BluePrints does not need to be changed for every new model that is integrated.
- Since 3D models are exported with various X/Y/Z forward settings, these offsets can be mandatory. These offset functions let those unfamiliar with C++ set the offsets, but the C++ can also be changed as needed to get the appropriate offset for your model and, if needed, the tracker offset (mainly an Oculus Controller issue when the glove is mounted at the angle we suggest to maximize its visibility to the Rift/Quest camera).
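As a minimal sketch of what a positional offset does, assuming the offset is expressed in the tracker's local frame and using only a yaw rotation for simplicity (real models generally need a full rotation matched to their export orientation):

```cpp
#include <cmath>

// Illustrative only: apply a local-frame positional offset to a tracker
// position. A real controller offset would use the full tracker rotation
// (quaternion), not just yaw.
struct Vec3 { double X, Y, Z; };

// Rotate the local offset around the Z (up) axis by Yaw radians so it
// follows the tracker's heading, then translate the tracker position.
Vec3 ApplyOffset(const Vec3& TrackerPos, double YawRad, const Vec3& LocalOffset) {
    const double C = std::cos(YawRad), S = std::sin(YawRad);
    return {
        TrackerPos.X + C * LocalOffset.X - S * LocalOffset.Y,
        TrackerPos.Y + S * LocalOffset.X + C * LocalOffset.Y,
        TrackerPos.Z + LocalOffset.Z,
    };
}
```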
8. Calibration Function Nodes
The Calibration module contains functions for homing the glove and calibrating the finger sensors. The homing and calibration commands are mapped to button press events in the Right and Left Hand Blueprints.
Note: With VR (Oculus/Vive), IMU homing is not needed since the tracker/controller's rotation data is used instead of the Data Glove IMU data. For non-VR apps, the IMU homing command is a way to orient the glove so that it correctly matches the user's hand orientation.
Below is the calibration function list:
9. Calibration Process
Flat Calibration: All fingers are out straight
Fist Calibration: All fingers are in, except thumb
Thumb Calibration: Thumb curled in, while the other fingers are out
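One plausible use of the flat and fist captures above is to normalize raw finger-sensor readings. The SDK performs its own calibration internally, so this sketch only illustrates why both poses are captured:

```cpp
#include <algorithm>

// Hypothetical normalization using the flat and fist captures: map a raw
// finger-sensor reading into [0, 1], where 0 = finger flat and
// 1 = finger fully curled. Not the SDK's actual calibration code.
double NormalizeBend(double Raw, double FlatRaw, double FistRaw) {
    if (FistRaw == FlatRaw) {
        return 0.0; // uncalibrated sensor: avoid division by zero
    }
    double T = (Raw - FlatRaw) / (FistRaw - FlatRaw);
    return std::clamp(T, 0.0, 1.0); // readings outside the captured range
}
```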
10. Haptics Overview
Up to 16 unique haptic sound files can be resident on the glove, with new files rapidly uploaded over Bluetooth or USB. The glove's custom haptic actuators are non-resonant, with a 4+ octave frequency response of 100 Hz to 2000 Hz. The six actuators are located on the five fingertips and the palm, with the palm haptic conveying possession of an object. The haptic signals can be pitch-shifted and volume-adjusted in real time. Signals can loop continuously or operate in a single-play mode. Our API includes easy commands to control the haptics.
The Data Glove has the ability to play back waveforms with looping granular control. A grain is simply a portion of a sound file that plays back in a loop. The loop size can be increased to maximum so that the entire sound file loops, or can be decreased so small that the loop covers a minute portion of the larger sound file.
Complex sounds can be created on any given actuator with creative use of the following parameters:
- Pitch – Also known as playback speed; the rate at which the loop is played.
- Grain Volume – The playback loudness. A new note with a volume of zero is a note-off message.
- Grain Location – The start point within the waveform where the loop will begin.
- Grain Size – The length of the loop.
- Grain Fade – The length of the fade in and out of each grain.
- Waveform Selection – The currently selected sound file.
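As a sketch of how Grain Location and Grain Size might map onto a waveform, assuming both are expressed as fractions of the file length (the firmware's actual units may differ):

```cpp
#include <algorithm>

// Illustrative mapping of grain parameters onto a sound file: the grain is
// a window [StartSample, EndSample) inside the waveform that plays in a loop.
struct GrainWindow { int StartSample, EndSample; };

GrainWindow ComputeGrain(int WaveformSamples, double LocationFrac, double SizeFrac) {
    int Start = static_cast<int>(LocationFrac * WaveformSamples);
    int Len = static_cast<int>(SizeFrac * WaveformSamples);
    // Clamp so the loop never runs past the end of the sound file.
    int End = std::min(Start + Len, WaveformSamples);
    return {Start, End};
}
```

Growing SizeFrac toward 1.0 loops the whole file; shrinking it toward 0 loops a minute slice, which is what gives granular playback its character.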
11. Haptic Function Nodes
Below is a list of the functions from the haptics module for Blueprint scripting:
Base Haptic Commands:
One Shot Haptic Commands:
Loop Haptic Commands:
Silence Haptic Commands:
12. Packaging an Unreal project
When packaging a project, select the Windows 64-bit option (since the Data Glove API Windows DLL is 64-bit).