Detect and visualize hands from a video

An example on how to get video feed, detect and visualize hands in a 2D canvas.

This example demonstrates how to load and display a video in a Unity scene with a VideoSource and an ImageView, implement hand tracking with the HandTracker, and use the HandManager to render the detected fingers on a 2D canvas.

This is a code walkthrough of the LightBuzz_Hand_Tracking_Video sample included with the Hand Tracking Unity plugin. The plugin also includes a no-code demo with the same functionality.

Step 1: Create a Unity scene

Open the Unity Project you created in the Installation section.

Right-click on the Assets folder and select Create > Scene.

Type the scene's name. In this example, we'll use the name VideoDemo.

After the scene is created, right-click on the scene and select GameObject > UI > Canvas.

Navigate to the LightBuzz Prefabs folder at Assets\LightBuzz Hand Tracking\Runtime\Prefabs.

For this example, drag and drop the ImageView, VideoSource, and HandManager prefabs into the Canvas.

Then, right-click on the Hierarchy pane and select Create Empty.

Give the new GameObject a name. In this example, we'll use the name Demo.

Then, go to the Inspector pane and select Add Component. In the search bar, type new and select the New script option.

Type the script's name and select Create and Add. For this example, we'll use the name VideoDemo.
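
When Unity generates the script, it starts from the standard MonoBehaviour template, which should look roughly like this (the exact template can vary by Unity version):

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class VideoDemo : MonoBehaviour
{
    // Start is called before the first frame update
    void Start()
    {
    }

    // Update is called once per frame
    void Update()
    {
    }
}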

Step 2: Initialize the visual components

Double-click on the newly created MonoBehaviour script and import the necessary namespaces.

using LightBuzz.HandTracking;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Video;

For this example, we'll need a VideoPlayer to get the video frames, an ImageView to draw the image texture, and a HandManager to visualize the detected hands. Add a serialized field for each of them.

[SerializeField] private ImageView _image;
[SerializeField] private HandManager _handManager;
[SerializeField] private VideoPlayer _videoPlayer;

Add a Texture2D field to hold the video frames.

private Texture2D _texture;

After adding the serialized fields, go to the Unity Editor to connect these fields with the Demo component.

In the Inspector pane, select the round button next to each serialized field.

Then, in the Scene tab, select the corresponding prefab. For example, for the Image field, select the ImageView prefab.

When all fields are connected, the result should resemble the following image.

Then, select the HandManager prefab, under the Canvas, and connect the Image field to the ImageView prefab.

Make sure the Is 2D option is selected to see the hand tracking detections in the 2D space. If the option is not checked, detections are displayed in the 3D world space.

Then, select the VideoSource prefab, under the Canvas, and connect the Video Clip field to a video source. You can either use the demo video lightbuzz-hand-tracking.mp4, included in the Assets/LightBuzz Hand Tracking/Runtime/Media folder, or provide your own video clip.
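
If you prefer to assign the video from code instead of the Inspector, Unity's VideoPlayer can also play a clip or a URL set at runtime. Here is a minimal sketch, assuming a hypothetical file name placed under Assets/StreamingAssets (note that UnityEngine.Video.VideoSource below is Unity's enum, not the plugin's VideoSource prefab):

// Hypothetical file name; place your own video under Assets/StreamingAssets.
_videoPlayer.source = UnityEngine.Video.VideoSource.Url;
_videoPlayer.url = System.IO.Path.Combine(Application.streamingAssetsPath, "my-video.mp4");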

After connecting all the fields, navigate to the Canvas to set the render options.

Change the Render Mode to Screen Space - Camera.

Then, set the Main Camera, from the Scene tab, as the Render Camera.

When all the render options are set, the result should look like the following image.

Finally, return to the script and instantiate a HandTracker to detect hands.

private readonly HandTracker _handTracker = new HandTracker();

Step 3: Register handler methods for the VideoPlayer events

Enable the frameReady events. Any delegates registered with VideoPlayer.frameReady will be invoked when a frame is ready to be drawn.

_videoPlayer.sendFrameReadyEvents = true;

Register for the VideoPlayer.prepareCompleted event with the VideoPrepared event handler method.

_videoPlayer.prepareCompleted += VideoPrepared;

Register for the VideoPlayer.frameReady event with the VideoFrameReady event handler method.

_videoPlayer.frameReady += VideoFrameReady;

Initiate playback engine preparation. Once preparation completes, the prepareCompleted event fires and video frames can be received immediately.

_videoPlayer.Prepare();

In this example, the VideoPlayer event handler methods are registered in the Start() method.
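
Putting this step together, the Start() method looks as follows (it matches the full example at the end of this page):

private void Start()
{
    // Enable the frameReady events.
    _videoPlayer.sendFrameReadyEvents = true;

    // Register the VideoPrepared handler method.
    _videoPlayer.prepareCompleted += VideoPrepared;

    // Register the VideoFrameReady handler method.
    _videoPlayer.frameReady += VideoFrameReady;

    // Initiate playback engine preparation.
    _videoPlayer.Prepare();
}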

Step 4: Implement the VideoPrepared handler

Create the Texture2D according to the VideoPlayer resolution.

_texture = new Texture2D(
    (int)source.width,
    (int)source.height,
    TextureFormat.RGB24,
    false);

Start playback.

_videoPlayer.Play();

In this example, the prepare and playback functionality is implemented in the VideoPrepared event handler method.
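
Putting this step together, the VideoPrepared handler looks as follows:

private void VideoPrepared(VideoPlayer source)
{
    // Create a Texture2D matching the VideoPlayer resolution.
    _texture = new Texture2D((int)source.width, (int)source.height, TextureFormat.RGB24, false);

    // Start playback.
    _videoPlayer.Play();
}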

Step 5: Implement the VideoFrameReady handler

Get the RenderTexture holding the VideoPlayer frame data and copy its pixels into the Texture2D.

RenderTexture renderTexture = (RenderTexture)_videoPlayer.texture;
RenderTexture.active = renderTexture;

_texture.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
_texture.Apply();

Load the texture onto the ImageView to show the video feed.

_image.Load(_texture);

Pass the texture to the HandTracker for processing. The HandTracker will analyze the texture and detect any hands present in the image.

List<Hand> hands = _handTracker.Load(_texture);

Pass the detections to the HandManager. The HandManager will manage the rendering and updates required to depict the detected hands.

_handManager.Load(hands);

In this example, all the hand-tracking functionality is implemented in the VideoFrameReady event handler method.
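
Putting this step together, the VideoFrameReady handler looks as follows:

private void VideoFrameReady(VideoPlayer source, long frameIdx)
{
    // Get the RenderTexture holding the VideoPlayer frame data.
    RenderTexture renderTexture = (RenderTexture)_videoPlayer.texture;
    RenderTexture.active = renderTexture;

    _texture.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
    _texture.Apply();

    // Draw the video texture.
    _image.Load(_texture);

    // Detect hands in the texture.
    List<Hand> hands = _handTracker.Load(_texture);

    // Visualize the hands on a Canvas.
    _handManager.Load(hands);
}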

Step 6: Release the resources

Unsubscribe the VideoPrepared event handler method from the VideoPlayer.prepareCompleted event.

_videoPlayer.prepareCompleted -= VideoPrepared;

Unsubscribe the VideoFrameReady event handler method from the VideoPlayer.frameReady event.

_videoPlayer.frameReady -= VideoFrameReady;

Dispose of the HandTracker object to ensure that all associated resources are released.

_handTracker.Dispose();

Destroy the Texture2D object created in the VideoPrepared handler.

Destroy(_texture);

In this example, the resources are released in the OnDestroy() method. Alternatively, you could do that in the OnApplicationQuit() method.
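
Putting this step together, the OnDestroy() method looks as follows:

private void OnDestroy()
{
    // Remove the VideoPrepared handler method.
    _videoPlayer.prepareCompleted -= VideoPrepared;

    // Unsubscribe the VideoFrameReady handler method.
    _videoPlayer.frameReady -= VideoFrameReady;

    // Dispose of the HandTracker.
    _handTracker.Dispose();

    // Destroy the texture.
    Destroy(_texture);
}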

Full example code

Here is the full example code that has the same functionality as the Hand Tracking Unity plugin LightBuzz_Hand_Tracking_Video sample.

using LightBuzz.HandTracking;
using System;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Video;

public class VideoDemo : MonoBehaviour
{
    [SerializeField] private ImageView _image;
    [SerializeField] private HandManager _handManager;
    [SerializeField] private VideoPlayer _videoPlayer;

    private Texture2D _texture;

    private readonly HandTracker _handTracker = new HandTracker();

    private void Start()
    {
        // Enable the frameReady events.
        _videoPlayer.sendFrameReadyEvents = true;
    
        // Register the VideoPrepared handler method.
        _videoPlayer.prepareCompleted += VideoPrepared;
    
        // Register the VideoFrameReady handler method.
        _videoPlayer.frameReady += VideoFrameReady;
    
        // Initiate playback engine preparation.
        _videoPlayer.Prepare();
    }

    private void OnDestroy()
    {
        // Remove the VideoPrepared handler method.
        _videoPlayer.prepareCompleted -= VideoPrepared;
      
        // Unsubscribe the VideoFrameReady handler method.
        _videoPlayer.frameReady -= VideoFrameReady;
      
        // Dispose of the HandTracker.
        _handTracker.Dispose();
      
        // Destroy the texture.
        Destroy(_texture);
    }

    private void VideoPrepared(VideoPlayer source)
    {
        // Create a Texture2D matching the VideoPlayer resolution.
        _texture = new Texture2D((int)source.width, (int)source.height, TextureFormat.RGB24, false);
    
        // Start playback. 
        _videoPlayer.Play();
    }

    private void VideoFrameReady(VideoPlayer source, long frameIdx)
    {
        // Get the RenderTexture holding the VideoPlayer frame data.
        RenderTexture renderTexture = (RenderTexture)_videoPlayer.texture;
        RenderTexture.active = renderTexture;

        _texture.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
        _texture.Apply();

        // Draw the camera texture.
        _image.Load(_texture);

        // Detect hands in the texture.
        List<Hand> hands = _handTracker.Load(_texture);

        // Visualize the hands on a Canvas.
        _handManager.Load(hands);
    }
}

By following these steps, you will be able to load the video feed into your application, detect hands, and finally, render these detections on a 2D canvas.
