Detect and visualize hands from a video
An example showing how to get a video feed, detect hands, and visualize them on a 2D canvas.
This example demonstrates how to load and display a video in a Unity scene with a VideoSource and an ImageView, implement hand tracking with the HandTracker, and use the HandManager to render detected fingers on a 2D canvas.
Step 1: Create a Unity scene
Open the Unity Project you created in the Installation section.
Right-click on the Assets folder and select Create > Scene.

Type the scene's name. In this example, we'll use the name VideoDemo.
After the scene is created, right-click on the scene and select GameObject > UI > Canvas.

Navigate to the LightBuzz Prefabs folder at Assets/LightBuzz Hand Tracking/Runtime/Prefabs.

For this example, drag and drop the ImageView, VideoSource and HandManager prefabs into the Canvas.

Then, right-click on the Hierarchy pane and select Create Empty.

Give a name to the new component. In this example, we'll use the name Demo.
Then, go to the Inspector pane and select Add Component. In the search bar, type new and select the New script option.

Type the script's name and select Create and Add. For this example, we'll use the name VideoDemo.

Step 2: Initialize the visual components
Double-click on the newly created MonoBehaviour script and import the necessary namespaces.
using LightBuzz.HandTracking;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Video;
For this example, we'll need a VideoPlayer to get the video frames, an ImageView to draw the image texture, and a HandManager to visualize the detected hands.
[SerializeField] private ImageView _image;
[SerializeField] private HandManager _handManager;
[SerializeField] private VideoPlayer _videoPlayer;
Add a Texture2D object to hold the video frames.
private Texture2D _texture;
After adding the serialized fields, go to the Unity Editor to connect these fields with the Demo component.
In the Inspector pane, select the round button next to each SerializeField.

Then, in the Scene tab, select the corresponding prefab. For example, for the Image field, select the ImageView prefab.

When all fields are connected, the result should resemble the following image.

Then, select the HandManager prefab, under the Canvas, and connect the Image field to the ImageView prefab.

Then, select the VideoSource prefab, under the Canvas, and connect the Video Clip field to a video source. You can either use the demo video lightbuzz-hand-tracking.mp4, included in Assets/LightBuzz Hand Tracking/Runtime/Media, or use your own video clip.

After connecting all the fields, navigate to the Canvas to set the render options.
Change the Render Mode to Screen Space - Camera.

Then, set the Main Camera from the Scene tab as the Render Camera.

When all the render options are set, the result should look like the following image.

Finally, return to the script and instantiate a HandTracker to detect hands.
private readonly HandTracker _handTracker = new HandTracker();
Step 3: Register handler methods to VideoPlayer events
Enable the frameReady events. Any delegates registered with VideoPlayer.frameReady will be invoked when a frame is ready to be drawn.
_videoPlayer.sendFrameReadyEvents = true;
Register for the VideoPlayer.prepareCompleted event with the VideoPrepared event handler method.
_videoPlayer.prepareCompleted += VideoPrepared;
Register for the VideoPlayer.frameReady event with the VideoFrameReady event handler method.
_videoPlayer.frameReady += VideoFrameReady;
Initiate playback engine preparation. Once preparation completes, video frames can be received immediately.
_videoPlayer.Prepare();
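Taken together, these registrations belong in the script's Start method, matching the full listing at the end of this article:

```csharp
private void Start()
{
    // Enable frameReady events so VideoFrameReady is invoked per frame.
    _videoPlayer.sendFrameReadyEvents = true;

    // Register the handler methods.
    _videoPlayer.prepareCompleted += VideoPrepared;
    _videoPlayer.frameReady += VideoFrameReady;

    // Initiate playback engine preparation.
    _videoPlayer.Prepare();
}
```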
Step 4: Implement the VideoPrepared handler
Create the Texture2D according to the VideoPlayer resolution.
_texture = new Texture2D(
    (int)source.width,
    (int)source.height,
    TextureFormat.RGB24,
    false
);
Start playback.
_videoPlayer.Play();
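In full, the handler receives the prepared VideoPlayer as its source parameter, as in the complete listing at the end of this article:

```csharp
private void VideoPrepared(VideoPlayer source)
{
    // Create the texture at the video's resolution.
    _texture = new Texture2D((int)source.width, (int)source.height, TextureFormat.RGB24, false);

    // Start playback.
    _videoPlayer.Play();
}
```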
Step 5: Implement the VideoFrameReady handler
Read the current VideoPlayer frame into the texture via a RenderTexture.
RenderTexture renderTexture = (RenderTexture)_videoPlayer.texture;
RenderTexture.active = renderTexture;
_texture.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
_texture.Apply();
Load the texture onto the ImageView to show the video feed.
_image.Load(_texture);
Pass the texture to the HandTracker for processing.
List<Hand> hands = _handTracker.Load(_texture);
The HandTracker will analyze the texture and detect any hands present in the image.
Pass the detections to the HandManager. The HandManager will manage the rendering and updates required to depict the detected hands.
_handManager.Load(hands);
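Combined, these steps form the VideoFrameReady handler, which Unity invokes with the VideoPlayer and the index of the ready frame:

```csharp
private void VideoFrameReady(VideoPlayer source, long frameIdx)
{
    // Copy the current frame into the Texture2D.
    RenderTexture renderTexture = (RenderTexture)_videoPlayer.texture;
    RenderTexture.active = renderTexture;
    _texture.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
    _texture.Apply();

    // Show the frame, detect hands, and visualize them.
    _image.Load(_texture);
    List<Hand> hands = _handTracker.Load(_texture);
    _handManager.Load(hands);
}
```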
Step 6: Release the resources
Unsubscribe the VideoPrepared event handler method from the VideoPlayer.prepareCompleted event.
_videoPlayer.prepareCompleted -= VideoPrepared;
Unsubscribe the VideoFrameReady event handler method from the VideoPlayer.frameReady event.
_videoPlayer.frameReady -= VideoFrameReady;
Dispose of the HandTracker object to ensure that all associated resources are released.
_handTracker.Dispose();
Destroy the Texture2D object created in Step 4.
Destroy(_texture);
Full example code
Here is the full example code that has the same functionality as the Hand Tracking Unity plugin LightBuzz_Hand_Tracking_Video sample.
using LightBuzz.HandTracking;
using System;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Video;

public class VideoDemo : MonoBehaviour
{
    [SerializeField] private ImageView _image;
    [SerializeField] private HandManager _handManager;
    [SerializeField] private VideoPlayer _videoPlayer;

    private Texture2D _texture;

    private readonly HandTracker _handTracker = new HandTracker();

    private void Start()
    {
        // Enable the frameReady events.
        _videoPlayer.sendFrameReadyEvents = true;

        // Register the VideoPrepared handler method.
        _videoPlayer.prepareCompleted += VideoPrepared;

        // Register the VideoFrameReady handler method.
        _videoPlayer.frameReady += VideoFrameReady;

        // Initiate playback engine preparation.
        _videoPlayer.Prepare();
    }

    private void OnDestroy()
    {
        // Unsubscribe the VideoPrepared handler method.
        _videoPlayer.prepareCompleted -= VideoPrepared;

        // Unsubscribe the VideoFrameReady handler method.
        _videoPlayer.frameReady -= VideoFrameReady;

        // Dispose of the HandTracker.
        _handTracker.Dispose();

        // Destroy the texture.
        Destroy(_texture);
    }

    private void VideoPrepared(VideoPlayer source)
    {
        // Create the texture at the VideoPlayer resolution.
        _texture = new Texture2D((int)source.width, (int)source.height, TextureFormat.RGB24, false);

        // Start playback.
        _videoPlayer.Play();
    }

    private void VideoFrameReady(VideoPlayer source, long frameIdx)
    {
        // Copy the VideoPlayer frame data into the texture.
        RenderTexture renderTexture = (RenderTexture)_videoPlayer.texture;
        RenderTexture.active = renderTexture;
        _texture.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
        _texture.Apply();

        // Draw the camera texture.
        _image.Load(_texture);

        // Detect hands in the texture.
        List<Hand> hands = _handTracker.Load(_texture);

        // Visualize the hands on the Canvas.
        _handManager.Load(hands);
    }
}
By following these steps, you will be able to load the video feed into your application, detect hands, and finally, render these detections on a 2D canvas.