Detect and visualize hands from a Sprite
An example on how to detect and visualize hands from a Sprite.
This example demonstrates how to load and display image sprites in a Unity scene with an ImageView, implement hand tracking with the HandTracker, and use the HandManager to render detected fingers on a 2D canvas.
Open the Unity project you created in the previous section.
Right-click on the Assets folder and select Create > Scene.
Type the scene's name. In this example, we'll use the name PictureDemo.
After the scene is created, open it, right-click in the Hierarchy, and select GameObject > UI > Canvas.
Navigate to the LightBuzz Prefabs folder at Assets\LightBuzz Hand Tracking\Runtime\Prefabs.
Then, right-click on the Hierarchy pane and select Create Empty.
Give a name to the new object. In this example, we'll use the name Demo.
Then, go to the Inspector pane and select Add Component. In the search bar, type "new" and select the New script option.
Type the script's name and select Create and Add. For this example, we'll use the name PictureDemo.
Double-click on the newly created MonoBehaviour script and import the necessary namespaces.
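The exact namespaces depend on the plugin version you installed; the following is a minimal sketch, assuming the plugin's types live under a LightBuzz.HandTracking namespace:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Assumed namespace for the LightBuzz Hand Tracking plugin;
// check the package contents for the exact name.
using LightBuzz.HandTracking;
```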
After adding the serialized fields, go to the Unity Editor to connect these fields with the Demo component.
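The serialized fields referenced above can be sketched as follows; the ImageView and HandManager types are the plugin's prefab components, and the field names are illustrative:

```csharp
public class PictureDemo : MonoBehaviour
{
    // The ImageView prefab that displays the picture.
    [SerializeField] private ImageView _image;

    // The image sprite to analyze.
    [SerializeField] private Sprite _sprite;

    // The HandManager prefab that draws the detected hands.
    [SerializeField] private HandManager _handManager;
}
```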
In the Inspector pane, select the round button next to each SerializeField.
Then, in the Scene tab, select the corresponding prefab. For example, for the Image field, select the ImageView prefab.
The Sprite field should connect to an image sprite. You can either use the demo image sprite lightbuzz-hand-tracking.png, included in the Assets/LightBuzz Hand Tracking/Runtime/Media folder, or create your own image sprite.
When all fields are connected, the result should resemble the following image.
Then, select the HandManager prefab under the Canvas and connect its Image field to the ImageView prefab.
After connecting all the fields, navigate to the Canvas to set the render options.
Change the Render Mode to Screen Space - Camera.
Then, set the Main Camera from the Scene tab as the Render Camera.
When all the render options are set, the result should look like the following image.
Create a Texture2D object with the Sprite's texture.
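One way to do this is to copy the sprite's pixels into a new Texture2D, which also handles sprites that occupy only a sub-rectangle of their source texture. This is a sketch using standard Unity APIs (it requires the source texture to be marked Read/Write Enabled in its import settings):

```csharp
// Copy the sprite's pixel rectangle into a standalone Texture2D.
Texture2D texture = new Texture2D(
    (int)_sprite.rect.width,
    (int)_sprite.rect.height,
    TextureFormat.RGBA32,
    false);

texture.SetPixels(_sprite.texture.GetPixels(
    (int)_sprite.rect.x,
    (int)_sprite.rect.y,
    (int)_sprite.rect.width,
    (int)_sprite.rect.height));
texture.Apply();
```

If the sprite uses its full source texture, _sprite.texture can be used directly instead.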
Here is the full example code, which has the same functionality as the Hand Tracking Unity plugin's LightBuzz_Hand_Tracking_Picture sample.
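The following is a reconstruction of that flow based on the steps in this guide, not the verbatim sample. The HandTracker, ImageView, and HandManager types, the Hand result type, and the Load method names are assumptions drawn from the surrounding text; consult the LightBuzz_Hand_Tracking_Picture sample for the exact API.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Assumed plugin namespace.
using LightBuzz.HandTracking;

public class PictureDemo : MonoBehaviour
{
    // Prefab and asset references connected in the Inspector.
    [SerializeField] private ImageView _image;
    [SerializeField] private Sprite _sprite;
    [SerializeField] private HandManager _handManager;

    private HandTracker _handTracker;

    private void Start()
    {
        _handTracker = new HandTracker();

        // Create a Texture2D object with the Sprite's texture.
        Texture2D texture = new Texture2D(
            (int)_sprite.rect.width,
            (int)_sprite.rect.height,
            TextureFormat.RGBA32,
            false);
        texture.SetPixels(_sprite.texture.GetPixels(
            (int)_sprite.rect.x,
            (int)_sprite.rect.y,
            (int)_sprite.rect.width,
            (int)_sprite.rect.height));
        texture.Apply();

        // Show the image.
        _image.Load(texture);

        // Detect hands in the texture.
        List<Hand> hands = _handTracker.Load(texture);

        // Draw the detected hands on the 2D canvas.
        _handManager.Load(hands);
    }

    private void OnDestroy()
    {
        // Release the tracker's resources.
        _handTracker?.Dispose();
    }
}
```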
By following these steps, you will be able to load a Sprite into your application, detect hands, and finally, render these detections on a 2D canvas.
For this example, drag and drop the ImageView and HandManager prefabs into the Canvas.
For this example, we'll need a Sprite to load an image, an ImageView to draw its texture, and a HandManager to visualize the detected hands.
Finally, return to the script and instantiate a HandTracker to detect hands.
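Assuming the tracker is constructed with a parameterless constructor (an assumption; check the plugin's API), this can look like:

```csharp
private HandTracker _handTracker;

private void Start()
{
    // Create the tracker once and reuse it for every frame or image.
    _handTracker = new HandTracker();
}
```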
Load the Texture2D object onto the ImageView to show the image.
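Sketched below with an assumed Load method on the ImageView component:

```csharp
// Display the picture on the canvas (method name assumed).
_image.Load(texture);
```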
Pass the texture created in Step 3 to the HandTracker for processing.
The HandTracker will analyze the texture and detect any hands present in the image.
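Sketched below, assuming the tracker exposes a Load method that returns the detected hands (the method name and Hand result type are assumptions):

```csharp
// Run hand detection on the texture.
List<Hand> hands = _handTracker.Load(texture);
```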
To display the detected hands on a 2D canvas, simply pass the detections to the HandManager. It will manage the rendering and updates required to accurately depict the hands on the canvas based on the detection data provided.
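Sketched below with an assumed Load method on the HandManager component:

```csharp
// Hand the detections to the HandManager for rendering (method name assumed).
_handManager.Load(hands);
```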
Dispose of the HandTracker object to ensure that all associated resources are released.
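In a MonoBehaviour, a natural place for this is OnDestroy, assuming HandTracker implements IDisposable:

```csharp
private void OnDestroy()
{
    // Release the tracker's resources when the scene object is destroyed.
    _handTracker?.Dispose();
}
```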