# Use webcam to detect and visualize hands (2D Canvas)

This example demonstrates how to load and display camera feed in a Unity scene with a [WebcamSource](/ui-reference/streamsource/webcamsource.md) and an [ImageView](/ui-reference/imageview.md), implement hand tracking with the [HandTracker](/api-reference/handtracker.md), and use the [HandManager](/ui-reference/handmanager.md) to render detected fingers on a 2D canvas.

{% hint style="info" %}
This is a code walkthrough of the `LightBuzz_Hand_Tracking_2D` Hand Tracking Unity plugin sample. The plugin also includes a no-code demo with the same functionality.
{% endhint %}

#### Step 1: Create a Unity scene

Open the Unity Project you created in the [Installation](/hand-tracking-unity-plugin/installation.md) section.

Right-click on the `Assets` folder and select `Create > Scene`.

<figure><img src="/files/AWAZBZqNkgh8mdgwr3hv" alt=""><figcaption><p>Create new scene</p></figcaption></figure>

Type the scene's name. In this example, we'll use the name `WebcamDemo`.

After the scene is created, right-click on the scene and select `GameObject > UI > Canvas`.

<figure><img src="/files/SOVl2RObo6YAMFPnsd5J" alt=""><figcaption><p>Add a Canvas</p></figcaption></figure>

Navigate to the LightBuzz `Prefabs` folder at `Assets\LightBuzz Hand Tracking\Runtime\Prefabs`.

<figure><img src="/files/aLOxlnZxIBjwQFtfkzwZ" alt=""><figcaption><p>LightBuzz Prefabs</p></figcaption></figure>

For this example, drag and drop the [ImageView](/ui-reference/imageview.md), [WebcamSource](/ui-reference/streamsource/webcamsource.md) and [HandManager](/ui-reference/handmanager.md) prefabs into the Canvas.

<figure><img src="/files/ihYQ5AJQowyqRRRavG1k" alt=""><figcaption><p>Add prefabs</p></figcaption></figure>

Then, right-click on the `Hierarchy` pane and select `Create Empty`.

<figure><img src="/files/tWKAoCSDw2MNbLtdn4Dg" alt=""><figcaption><p>Create empty component</p></figcaption></figure>

Give the new GameObject a name. In this example, we'll use the name `Demo`.

Then, go to the `Inspector` pane and select `Add Component`. In the search bar, type `new` and select the `New script` option.

<figure><img src="/files/aEWEKS46jXoNR0dQLnvd" alt=""><figcaption><p>Add script</p></figcaption></figure>

Type the script's name and select `Create and Add`. For this example, we'll use the name `WebcamDemo`.

<figure><img src="/files/dpZB2jLjZsSWk0DTCq2s" alt=""><figcaption><p>Create script</p></figcaption></figure>

#### Step 2: Initialize the visual components

Double-click on the newly created `MonoBehaviour` script and import the necessary namespaces.

```csharp
using LightBuzz.HandTracking;
using System.Collections.Generic;
using UnityEngine;
```

For this example, we'll need a [WebcamSource](/ui-reference/streamsource/webcamsource.md) to get the frames, an [ImageView](/ui-reference/imageview.md) to draw the camera texture, and a [HandManager](/ui-reference/handmanager.md) to visualize the detected hands.

```csharp
[SerializeField] private ImageView _image;
[SerializeField] private WebcamSource _webcam;
[SerializeField] private HandManager _handManager;
```

After adding the serialized fields, go to the Unity Editor to connect these fields with the `Demo` component.

At the `Inspector` pane, select the round button next to each `SerializeField`.

<figure><img src="/files/TsmfbgO2lPiUzRTVuBhs" alt=""><figcaption><p>Serialized fields</p></figcaption></figure>

Then, at the `Scene` tab, select the corresponding prefab. For example, for the `Image` field, select the `ImageView` prefab.

<figure><img src="/files/0jqp9uI7zQIPnTTMXJIa" alt=""><figcaption><p>Connect prefab</p></figcaption></figure>

When all fields are connected, the result should resemble the following image.

<figure><img src="/files/Haww42vUdpbdSN6LMQvl" alt=""><figcaption><p>Connected serialized fields</p></figcaption></figure>

Then, select the `HandManager` prefab, under the `Canvas`, and connect the `Image` field to the `ImageView` prefab.

<figure><img src="/files/J7W7y81Uo00EVQAPLrLA" alt=""><figcaption><p>Connect field to HandManager</p></figcaption></figure>

{% hint style="info" %}
Make sure the `Is 2D` option is selected to see the hand tracking detections in the 2D space. If the option is not checked, detections are displayed in the 3D world space.
{% endhint %}

After connecting all the fields, navigate to the `Canvas` to set the render options.

Change the `Render Mode` to `Screen Space - Camera`.

<figure><img src="/files/8xapwQLfkO58oNi2gJhH" alt=""><figcaption><p>Change Render Mode</p></figcaption></figure>

Then, set the `Main Camera` from the `Scene` tab as the `Render Camera`.

<figure><img src="/files/FKcxZ5Eyq9JlM0GAFHZu" alt=""><figcaption><p>Select Render Camera</p></figcaption></figure>

When all the render options are set, the result should look like the following image.

<figure><img src="/files/bVxgPsw5dk4FXW0xcKXy" alt=""><figcaption><p>Render settings</p></figcaption></figure>

Finally, return to the script and declare a [HandTracker](/api-reference/handtracker.md) field to detect hands.

```csharp
private readonly HandTracker _handTracker = new HandTracker();
```

#### Step 3: Open the webcam

Open the webcam to get the live feed.

```csharp
_webcam.Open();
```

{% hint style="info" %}
In this example, the camera is opened in the `Start()` method. Alternatively, you could open the camera with the click of a button.
{% endhint %}
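As a sketch of the button-driven alternative the hint mentions, you could wire a standard Unity UI `Button` to the `Open()` call. The `WebcamToggle` class name and `_openButton` field are hypothetical; only `WebcamSource.Open()` comes from this walkthrough.

```csharp
using LightBuzz.HandTracking;
using UnityEngine;
using UnityEngine.UI;

public class WebcamToggle : MonoBehaviour
{
    [SerializeField] private WebcamSource _webcam;
    [SerializeField] private Button _openButton; // assumed UI Button in the scene

    private void Start()
    {
        // Open the camera only when the user clicks the button,
        // instead of automatically in Start().
        _openButton.onClick.AddListener(() => _webcam.Open());
    }
}
```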

#### Step 4: Get the live feed

Check that the camera is open and available for capturing video.

```csharp
if (!_webcam.IsOpen) return;
```

Load the new frame from the `_webcam` object onto the [ImageView](/ui-reference/imageview.md) to show the live feed to your users.

```csharp
_image.Load(_webcam);
```

#### Step 5: Detect hands

Pass the `Texture2D` object from the camera frame to the [HandTracker](/api-reference/handtracker.md) for processing.

```csharp
List<Hand> hands = _handTracker.Load(_image.Texture);
```

The [HandTracker](/api-reference/handtracker.md) will analyze the texture and detect any hands present in the image.
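Before visualizing the result, you can also inspect the returned list directly. This minimal sketch only relies on `hands` being a standard `List<Hand>`, as returned above:

```csharp
// Hypothetical follow-up: react to the detection result
// before (or instead of) rendering it.
if (hands.Count > 0)
{
    Debug.Log($"Detected {hands.Count} hand(s) in the current frame.");
}
```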

#### Step 6: Visualize the hands

To display the detected hands on a 2D canvas, simply pass the detections to the [HandManager](/ui-reference/handmanager.md). It manages the rendering and updates required to accurately depict the hands on the canvas based on the detection data provided.

```csharp
_handManager.Load(hands);
```

{% hint style="info" %}
Steps 4 through 6 are incorporated into the `Update()` method.
{% endhint %}

#### Step 7: Release the resources

Close the webcam to stop the live feed, preventing further video capture.

```csharp
_webcam.Close();
```

Dispose of the [HandTracker](/api-reference/handtracker.md) object to ensure that all associated resources are released.

```csharp
_handTracker.Dispose();
```

{% hint style="info" %}
In this example, the resources are released in the `OnDestroy()` method. Alternatively, you could do that with the click of a button or in the `OnApplicationQuit()` method.
{% endhint %}
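The `OnApplicationQuit()` variant mentioned in the hint would look like this; it reuses the same two calls shown above, just in a different Unity lifecycle method:

```csharp
// Alternative: release the resources when the application quits
// instead of when this component is destroyed.
private void OnApplicationQuit()
{
    _webcam.Close();
    _handTracker.Dispose();
}
```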

### Full example code

Here is the full example code that has the same functionality as the Hand Tracking Unity plugin `LightBuzz_Hand_Tracking_2D` sample.

<pre class="language-csharp"><code class="lang-csharp">using LightBuzz.HandTracking;
using System.Collections.Generic;
using UnityEngine;

public class WebcamDemo : MonoBehaviour
{
    [SerializeField] private ImageView _image;
    [SerializeField] private WebcamSource _webcam;
    [SerializeField] private HandManager _handManager;
    
    private readonly HandTracker _handTracker = new HandTracker();

<strong>    private void Start()
</strong>    {
        _webcam.Open();
    }

    private void OnDestroy()
    {
        _webcam.Close();
        _handTracker.Dispose();
    }

    private void Update()
    {
        if (!_webcam.IsOpen) return;

        // 1. Draw the camera texture.
        _image.Load(_webcam);

        // 2. Detect hands in the camera texture.
        List&#x3C;Hand> hands = _handTracker.Load(_image.Texture);

        // 3. Visualize the hands on a Canvas.
        _handManager.Load(hands);
    }
}
</code></pre>

By following these steps, you will be able to load the camera feed into your application, detect hands in real time, and finally, render these detections on a 2D canvas.
