# Detect and visualize hands from a video

This example demonstrates how to load and display a video in a Unity scene with a [VideoSource](https://handtracking.lightbuzz.com/ui-reference/streamsource/videosource) and an [ImageView](https://handtracking.lightbuzz.com/ui-reference/imageview), implement hand tracking with the [HandTracker](https://handtracking.lightbuzz.com/api-reference/handtracker), and use the [HandManager](https://handtracking.lightbuzz.com/ui-reference/handmanager) to render detected fingers on a 2D canvas.

{% hint style="info" %}
This is a code walkthrough of the `LightBuzz_Hand_Tracking_Video` Hand Tracking Unity plugin sample. The plugin also includes a no-code demo with the same functionality.
{% endhint %}

#### Step 1: Create a Unity scene

Open the Unity Project you created in the [Installation](https://handtracking.lightbuzz.com/hand-tracking-unity-plugin/installation) section.

Right-click on the `Assets` folder and select `Create > Scene`.

<figure><img src="https://1895788644-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FyGqKFJQGBmrlsrsolzNU%2Fuploads%2Fjq6n6qCxjh0OR775jrfm%2F1_create_scene.png?alt=media&#x26;token=f8e0e814-512e-4bab-b17e-c341a8bd2281" alt=""><figcaption><p>Create new scene</p></figcaption></figure>

Type the scene's name. In this example, we'll use the name `VideoDemo`.

After the scene is created, right-click on the scene and select `GameObject > UI > Canvas`.

<figure><img src="https://1895788644-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FyGqKFJQGBmrlsrsolzNU%2Fuploads%2Fde9seJ1XF1in693SVhhX%2F2_add_canvas.png?alt=media&#x26;token=dc33f608-7150-4fe9-a2c4-07fa0e212c35" alt=""><figcaption><p>Add a Canvas</p></figcaption></figure>

Navigate to the LightBuzz `Prefabs` folder at `Assets\LightBuzz Hand Tracking\Runtime\Prefabs`.

<figure><img src="https://1895788644-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FyGqKFJQGBmrlsrsolzNU%2Fuploads%2FptBTcWOYXEUCIgv8n8Cs%2F3_prefabs.png?alt=media&#x26;token=a8c96402-2def-43f9-87cb-764c0336e3ed" alt=""><figcaption><p>LightBuzz Prefabs</p></figcaption></figure>

For this example, drag and drop the [ImageView](https://handtracking.lightbuzz.com/ui-reference/imageview), [VideoSource](https://handtracking.lightbuzz.com/ui-reference/streamsource/videosource) and [HandManager](https://handtracking.lightbuzz.com/ui-reference/handmanager) prefabs into the Canvas.

<figure><img src="https://1895788644-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FyGqKFJQGBmrlsrsolzNU%2Fuploads%2FwJUmhTNdJfPzuWAFvw8h%2F4_add_prefabs_video.png?alt=media&#x26;token=5b04d4ba-acdc-473f-803a-0861a9e1e248" alt=""><figcaption><p>Add prefabs</p></figcaption></figure>

Then, right-click on the `Hierarchy` pane and select `Create Empty`.

<figure><img src="https://1895788644-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FyGqKFJQGBmrlsrsolzNU%2Fuploads%2F128JJiRWg5tVPvbu5wAZ%2F5_create_empty.png?alt=media&#x26;token=1388ed19-e6ee-4b1d-b2cc-32d51aa6882e" alt=""><figcaption><p>Create empty component</p></figcaption></figure>

Give a name to the new component. In this example, we'll use the name `Demo`.

Then, go to the `Inspector` pane and select `Add Component`. In the search bar, type `new` and select the `New script` option.

<figure><img src="https://1895788644-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FyGqKFJQGBmrlsrsolzNU%2Fuploads%2FOmSwepgdDduiVmMQubl0%2F6_add_script.png?alt=media&#x26;token=627e78d1-8fd0-4408-933b-2cfe6f2fbb81" alt=""><figcaption><p>Add script</p></figcaption></figure>

Type the script's name and select `Create and Add`. For this example, we'll use the name `VideoDemo`.

<figure><img src="https://1895788644-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FyGqKFJQGBmrlsrsolzNU%2Fuploads%2FHT5xsq0ieDfjgeQeRoHD%2F7_name_script_video.png?alt=media&#x26;token=4d818f51-28bf-472d-824f-ed913f2e5619" alt=""><figcaption><p>Create script</p></figcaption></figure>

#### Step 2: Initialize the visual components

Double-click on the newly created `MonoBehaviour` script and import the necessary namespaces.

```csharp
using LightBuzz.HandTracking;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Video;
```

For this example, we'll need a `VideoPlayer` to get the video frames, an [ImageView](https://handtracking.lightbuzz.com/ui-reference/imageview) to draw the image texture, and a [HandManager](https://handtracking.lightbuzz.com/ui-reference/handmanager) to visualize the detected hands.

```csharp
[SerializeField] private ImageView _image;
[SerializeField] private HandManager _handManager;
[SerializeField] VideoPlayer _videoPlayer;
```

Add a `Texture2D` object to load the video frames.

```csharp
private Texture2D _texture;
```

After adding the serialized fields, go to the Unity Editor to connect these fields with the `Demo` component.

At the `Inspector` pane, select the round button next to each `SerializeField`.

<figure><img src="https://1895788644-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FyGqKFJQGBmrlsrsolzNU%2Fuploads%2FV1Y57uWdtAtPsq8BOytu%2F8_connect_serialize_fields_video.png?alt=media&#x26;token=e23b76d7-60e8-41e3-b821-eb4da6e5b4c9" alt=""><figcaption><p>Serialized fields</p></figcaption></figure>

Then, at the `Scene` tab, select the corresponding prefab. For example, for the `Image` field, select the `ImageView` prefab.

<figure><img src="https://1895788644-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FyGqKFJQGBmrlsrsolzNU%2Fuploads%2Fu5hbIW0JSuqVkmPJJ04O%2F9_imageview.png?alt=media&#x26;token=87365163-98eb-424a-af4b-859ca882f91c" alt=""><figcaption><p>Connect prefab</p></figcaption></figure>

When all fields are connected, the result should resemble the following image.

<figure><img src="https://1895788644-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FyGqKFJQGBmrlsrsolzNU%2Fuploads%2FHk62AuP5yeJzy5B0RXBa%2F10_connected_prefabs_video.png?alt=media&#x26;token=c49c5c68-d6cc-4055-8bce-147cdbd12e13" alt=""><figcaption><p>Connected serialize fields</p></figcaption></figure>

Then, select the `HandManager` prefab, under the `Canvas`, and connect the `Image` field to the `ImageView` prefab.

<figure><img src="https://1895788644-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FyGqKFJQGBmrlsrsolzNU%2Fuploads%2FlKAsHvTsDJeE9mM2KW73%2F10a_connected_handmanager.png?alt=media&#x26;token=02226392-8b7b-4fa3-a01b-0848744f75dd" alt=""><figcaption><p>Connect field to HandManager</p></figcaption></figure>

{% hint style="info" %}
Make sure the `Is 2D` option is selected to see the hand tracking detections in the 2D space. If the option is not checked, detections are displayed in the 3D world space.
{% endhint %}

Then, select the `VideoSource` prefab, under the `Canvas`, and connect the `Video Clip` field to a video source. You can either use the demo video `lightbuzz-hand-tracking.mp4` included in the `Assets/LightBuzz Hand Tracking/Runtime/Media` folder or use your own video file.

<figure><img src="https://1895788644-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FyGqKFJQGBmrlsrsolzNU%2Fuploads%2Fmx0XLE9nfo3nrnZeCvEB%2F10b_connected_videosource.png?alt=media&#x26;token=62141713-64d6-4af2-9735-df3243b97f21" alt=""><figcaption></figcaption></figure>

After connecting all the fields, navigate to the `Canvas` to set the render options.

Change the `Render Mode` to `Screen Space - Camera`.

<figure><img src="https://1895788644-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FyGqKFJQGBmrlsrsolzNU%2Fuploads%2F3Nl9z2Beuk8eRI1DZj7G%2F11_render_mode_camera.png?alt=media&#x26;token=e582de34-c7cb-4610-82a7-9d2a3b9f486d" alt=""><figcaption><p>Change Render Mode</p></figcaption></figure>

Then, set the `Main Camera`, from the `Scene` tab, as the `Render Camera`.

<figure><img src="https://1895788644-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FyGqKFJQGBmrlsrsolzNU%2Fuploads%2FIUjjpjWLwZqDxqRdJ5q9%2F12_main_camera.png?alt=media&#x26;token=d82b6dce-3e08-4777-8bb2-23ed475d6aca" alt=""><figcaption><p>Select Render Camera</p></figcaption></figure>

When all the render options are set, the result should look like the following image.

<figure><img src="https://1895788644-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FyGqKFJQGBmrlsrsolzNU%2Fuploads%2FqVosuFDLymh0LRB1qK48%2F13_render_camera_settings.png?alt=media&#x26;token=6a83eba6-4c2c-4787-a324-f1c519b26026" alt=""><figcaption><p>Render settings</p></figcaption></figure>

Finally, return to the script and instantiate a [HandTracker](https://handtracking.lightbuzz.com/api-reference/handtracker) to detect hands.

```csharp
private readonly HandTracker _handTracker = new HandTracker();
```

#### Step 3: Register handler methods to VideoPlayer events

Enable the `frameReady` events. Any delegates registered with `VideoPlayer.frameReady` will be invoked when a frame is ready to be drawn.

```csharp
_videoPlayer.sendFrameReadyEvents = true;
```

Register for the `VideoPlayer.prepareCompleted` event with the `VideoPrepared` event handler method.

```csharp
_videoPlayer.prepareCompleted += VideoPrepared;
```

Register for the `VideoPlayer.frameReady` event with the `VideoFrameReady` event handler method.

```csharp
_videoPlayer.frameReady += VideoFrameReady;
```

Initiate playback engine preparation. Once preparation completes, video frames can be received.

```csharp
_videoPlayer.Prepare();
```

{% hint style="info" %}
In this example, the `VideoPlayer` event handler methods are registered in the `Start()` method.
{% endhint %}
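Putting the registrations together, the `Start()` method looks like this (taken from the full example at the end of this walkthrough):

```csharp
private void Start()
{
    // Enable the frameReady events.
    _videoPlayer.sendFrameReadyEvents = true;

    // Register the event handler methods.
    _videoPlayer.prepareCompleted += VideoPrepared;
    _videoPlayer.frameReady += VideoFrameReady;

    // Initiate playback engine preparation.
    _videoPlayer.Prepare();
}
```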

#### Step 4: Implement the VideoPrepared handler

Create the `Texture2D` matching the `VideoPlayer` resolution.

```csharp
_texture = new Texture2D
    (
        (int)source.width,
        (int)source.height,
        TextureFormat.RGB24,
        false
    );
```

Start playback.

```csharp
_videoPlayer.Play();
```

{% hint style="info" %}
In this example, the prepare and playback functionality is implemented in the `VideoPrepared` event handler method.
{% endhint %}
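Combined, the handler looks like this; the `VideoPlayer source` parameter matches the signature that `VideoPlayer.prepareCompleted` expects:

```csharp
private void VideoPrepared(VideoPlayer source)
{
    // Create the texture with the prepared video's dimensions.
    _texture = new Texture2D((int)source.width, (int)source.height, TextureFormat.RGB24, false);

    // Start playback.
    _videoPlayer.Play();
}
```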

#### Step 5: Implement the VideoFrameReady handler

Read the current frame from the `VideoPlayer`'s `RenderTexture` into the `Texture2D`.

```csharp
RenderTexture renderTexture = (RenderTexture)_videoPlayer.texture;
RenderTexture.active = renderTexture;

_texture.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
_texture.Apply();
```
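Setting `RenderTexture.active` changes global state. If other code in your project also renders to textures, you may prefer a defensive variant of the snippet above that restores the previously active target after the read-back:

```csharp
// Defensive variant: save and restore the previously active RenderTexture
// so the read-back does not affect other rendering code.
RenderTexture previous = RenderTexture.active;

RenderTexture renderTexture = (RenderTexture)_videoPlayer.texture;
RenderTexture.active = renderTexture;

_texture.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
_texture.Apply();

RenderTexture.active = previous;
```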

Load the texture onto the [ImageView](https://handtracking.lightbuzz.com/ui-reference/imageview) to show the video feed.

```csharp
_image.Load(_texture);
```

Pass the texture to the [HandTracker](https://handtracking.lightbuzz.com/api-reference/handtracker) for processing.

```csharp
List<Hand> hands = _handTracker.Load(_texture);
```

The [HandTracker](https://handtracking.lightbuzz.com/api-reference/handtracker) will analyze the texture and detect any hands present in the image.
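For example, you can check how many hands were found before visualizing them. This sketch assumes `Load` returns an empty list when no hands are visible; verify that behavior against the plugin's API reference:

```csharp
List<Hand> hands = _handTracker.Load(_texture);

// Skip visualization on frames without detections (assumption: Load returns
// an empty list when no hands are present in the image).
if (hands == null || hands.Count == 0) return;

Debug.Log($"Detected {hands.Count} hand(s) in this frame.");
```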

Pass the detections to the [HandManager](https://handtracking.lightbuzz.com/ui-reference/handmanager), which manages the rendering and updates required to depict the detected hands.

```csharp
_handManager.Load(hands);
```

{% hint style="info" %}
In this example, all the hand-tracking functionality is implemented in the `VideoFrameReady` event handler method.
{% endhint %}

#### Step 6: Release the resources

Unsubscribe the `VideoPrepared` event handler method from the `VideoPlayer.prepareCompleted` event.

```csharp
_videoPlayer.prepareCompleted -= VideoPrepared;
```

Unsubscribe the `VideoFrameReady` event handler method from the `VideoPlayer.frameReady` event.

```csharp
_videoPlayer.frameReady -= VideoFrameReady;
```

Dispose of the [HandTracker](https://handtracking.lightbuzz.com/api-reference/handtracker) object to ensure that all associated resources are released.

```csharp
_handTracker.Dispose();
```

Destroy the `Texture2D` object created in the `VideoPrepared` handler.

```csharp
Destroy(_texture);
```

{% hint style="info" %}
In this example, the resources are released in the `OnDestroy()` method. Alternatively, you could do that in the `OnApplicationQuit()` method.
{% endhint %}
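The cleanup can be consolidated into `OnDestroy()`. This sketch adds defensive null checks in case the `VideoPlayer` or the texture is destroyed before this component; the checks are an assumption about your scene's teardown order, not a plugin requirement:

```csharp
private void OnDestroy()
{
    // Unsubscribe the event handlers (null check in case the
    // VideoPlayer was destroyed before this component).
    if (_videoPlayer != null)
    {
        _videoPlayer.prepareCompleted -= VideoPrepared;
        _videoPlayer.frameReady -= VideoFrameReady;
    }

    // Release the resources held by the tracker.
    _handTracker?.Dispose();

    // Destroy the texture created in VideoPrepared.
    if (_texture != null)
    {
        Destroy(_texture);
    }
}
```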

### Full example code

Here is the full example code that has the same functionality as the Hand Tracking Unity plugin `LightBuzz_Hand_Tracking_Video` sample.

```csharp
using LightBuzz.HandTracking;
using System;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Video;

public class VideoDemo : MonoBehaviour
{
    [SerializeField] private ImageView _image;
    [SerializeField] private HandManager _handManager;
    [SerializeField] VideoPlayer _videoPlayer;

    private Texture2D _texture;

    private readonly HandTracker _handTracker = new HandTracker();

    private void Start()
    {
        // Enable the frameReady events.
        _videoPlayer.sendFrameReadyEvents = true;
    
        // Register the VideoPrepared handler method.
        _videoPlayer.prepareCompleted += VideoPrepared;
    
        // Register the VideoFrameReady handler method.
        _videoPlayer.frameReady += VideoFrameReady;
    
        // Initiate playback engine preparation.
        _videoPlayer.Prepare();
    }

    private void OnDestroy()
    {
        // Remove the VideoPrepared handler method.
        _videoPlayer.prepareCompleted -= VideoPrepared;
      
        // Unsubscribe the VideoFrameReady handler method.
        _videoPlayer.frameReady -= VideoFrameReady;
      
        // Dispose of the HandTracker.
        _handTracker.Dispose();
      
        // Destroy the texture.
        Destroy(_texture);
    }

    private void VideoPrepared(VideoPlayer source)
    {
        // Assign the VideoPlayer data to texture.
        _texture = new Texture2D((int)source.width, (int)source.height, TextureFormat.RGB24, false);
    
        // Start playback. 
        _videoPlayer.Play();
    }

    private void VideoFrameReady(VideoPlayer source, long frameIdx)
    {
        // Read the VideoPlayer frame from its RenderTexture.
        RenderTexture renderTexture = (RenderTexture)_videoPlayer.texture;
        RenderTexture.active = renderTexture;

        _texture.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
        _texture.Apply();

        // Draw the camera texture.
        _image.Load(_texture);

        // Detect hands in the texture.
        List<Hand> hands = _handTracker.Load(_texture);

        // Visualize the hands on a Canvas.
        _handManager.Load(hands);
    }
}
```

By following these steps, you can load a video feed into your application, detect hands, and render the detections on a 2D canvas.
