Use webcam to detect and visualize hands (2D Canvas)

An example showing how to capture the camera feed and detect and visualize hands in real time on a 2D canvas.

This example demonstrates how to load and display the camera feed in a Unity scene with a WebcamSource and an ImageView, implement hand tracking with the HandTracker, and use the HandManager to render the detected fingers on a 2D canvas.

This is a code walkthrough of the LightBuzz_Hand_Tracking_2D sample from the Hand Tracking Unity plugin. The plugin also includes a no-code demo with the same functionality.

Step 1: Create a Unity scene

Open the Unity Project you created in the Installation section.

Right-click on the Assets folder and select Create > Scene.

Type the scene's name. In this example, we'll use the name WebcamDemo.

After the scene is created, open it and add a canvas by selecting GameObject > UI > Canvas.

Navigate to the LightBuzz Prefabs folder at Assets\LightBuzz Hand Tracking\Runtime\Prefabs.

For this example, drag and drop the ImageView, WebcamSource and HandManager prefabs into the Canvas.

Then, right-click on the Hierarchy pane and select Create Empty.

Give the new GameObject a name. In this example, we'll use the name Demo.

Then, go to the Inspector pane and select Add Component. In the search bar, type new and select the New script option.

Type the script's name and select Create and Add. For this example, we'll use the name WebcamDemo.

Step 2: Initialize the visual components

Double-click on the newly created MonoBehaviour script and import the necessary namespaces.

using LightBuzz.HandTracking;
using System.Collections.Generic;
using UnityEngine;

For this example, we'll need a WebcamSource to capture the frames, an ImageView to draw the camera texture, and a HandManager to visualize the detected hands.

[SerializeField] private ImageView _image;
[SerializeField] private WebcamSource _webcam;
[SerializeField] private HandManager _handManager;

After adding the serialized fields, return to the Unity Editor to connect them to the Demo component.

In the Inspector pane, select the round button next to each serialized field.

Then, in the Scene tab, select the corresponding prefab. For example, for the Image field, select the ImageView prefab.

When all fields are connected, the result should resemble the following image.

Then, select the HandManager prefab under the Canvas and connect its Image field to the ImageView prefab.

Make sure the Is 2D option is selected to see the hand tracking detections in the 2D space. If the option is not checked, detections are displayed in the 3D world space.

After connecting all the fields, navigate to the Canvas to set the render options.

Change the Render Mode to Screen Space - Camera.

Then, set the Main Camera, from the Scene tab, as the Render Camera.

When all the render options are set, the result should look like the following image.

Finally, return to the script and instantiate a HandTracker to detect hands.

HandTracker _handTracker = new HandTracker();

Step 3: Open the webcam

Open the webcam to get the live feed.

_webcam.Open();

In this example, the camera is opened in the Start() method. Alternatively, you could open the camera with the click of a button.
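As a minimal sketch of the button-based alternative, a separate component could open the webcam from a Unity UI Button instead of Start(). The WebcamToggle class name and the Button reference are assumptions for illustration, not part of the sample; both fields would be wired up in the Inspector.

```csharp
using LightBuzz.HandTracking;
using UnityEngine;
using UnityEngine.UI;

public class WebcamToggle : MonoBehaviour
{
    // Hypothetical references — connect these in the Inspector.
    [SerializeField] private WebcamSource _webcam;
    [SerializeField] private Button _openButton;

    private void Start()
    {
        // Open the webcam only when the user clicks the button.
        _openButton.onClick.AddListener(() => _webcam.Open());
    }
}
```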

Step 4: Get the live feed

Check that the camera is open and available for capturing video.

if (!_webcam.IsOpen) return;

Load the new frame from the _webcam object onto the ImageView to show the live feed to your users.

_image.Load(_webcam);

Step 5: Detect hands

Pass the Texture2D object from the camera frame to the HandTracker for processing.

List<Hand> hands = _handTracker.Load(_image.Texture);

The HandTracker will analyze the texture and detect any hands present in the image.
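The returned list is empty when no hands are found, so you can branch on it before doing any hand-specific work — a minimal sketch (the logging is illustrative, not part of the sample):

```csharp
// hands is the List<Hand> returned by _handTracker.Load().
if (hands.Count > 0)
{
    Debug.Log($"Detected {hands.Count} hand(s) in the current frame.");
}
```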

Step 6: Visualize the hands

To display the detected hands on a 2D canvas, pass the detections to the HandManager. It handles the rendering and updates required to accurately depict the hands on the canvas, based on the detection data provided.

_handManager.Load(hands);

Steps 4 through 6 are incorporated into the Update() method.

Step 7: Release the resources

Close the webcam to stop the live feed, preventing further video capture.

_webcam.Close();

Dispose of the HandTracker object to ensure that all associated resources are released.

_handTracker.Dispose();

In this example, the resources are released in the OnDestroy() method. Alternatively, you could do that with the click of a button or in the OnApplicationQuit() method.
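For example, releasing the resources in OnApplicationQuit() instead would use the same two calls as the sample, just in a different lifecycle hook:

```csharp
private void OnApplicationQuit()
{
    // Stop the live feed and release the tracker's resources.
    _webcam.Close();
    _handTracker.Dispose();
}
```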

Full example code

Here is the full example code, with the same functionality as the LightBuzz_Hand_Tracking_2D sample of the Hand Tracking Unity plugin.

using LightBuzz.HandTracking;
using System.Collections.Generic;
using UnityEngine;

public class WebcamDemo : MonoBehaviour
{
    [SerializeField] private ImageView _image;
    [SerializeField] private WebcamSource _webcam;
    [SerializeField] private HandManager _handManager;
    
    private readonly HandTracker _handTracker = new HandTracker();

    private void Start()
    {
        _webcam.Open();
    }

    private void OnDestroy()
    {
        _webcam.Close();
        _handTracker.Dispose();
    }

    private void Update()
    {
        if (!_webcam.IsOpen) return;

        // 1. Draw the camera texture.
        _image.Load(_webcam);

        // 2. Detect hands in the camera texture.
        List<Hand> hands = _handTracker.Load(_image.Texture);

        // 3. Visualize the hands on a Canvas.
        _handManager.Load(hands);
    }
}

By following these steps, you can load the camera feed into your application, detect hands in real time, and render the detections on a 2D canvas.
