TROS Gesture Detection

Application Scenarios

RDK Studio helps beginners get started quickly with an efficient gesture recognition workflow. Gesture recognition algorithms combine technologies such as hand keypoint detection and gesture analysis so that a computer can interpret human gestures as commands, enabling functions such as gesture control and sign language translation. Typical applications include smart homes, smart cockpits, and smart wearable devices.

Preparation

Both USB and MIPI cameras are supported. This section uses a USB camera as an example; connect the USB camera as shown below:

Camera Connection Image
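
Before starting the flow, you can check that the USB camera has been enumerated by the system. The snippet below is a minimal sketch that only looks for a /dev/video* device node; the exact index depends on your board and camera.

```python
# Minimal check: list camera device nodes on the board.
# Assumes the USB camera enumerates as /dev/video*;
# the exact index varies by board and camera.
import glob

devices = sorted(glob.glob("/dev/video*"))
if devices:
    print("Camera device nodes found:", ", ".join(devices))
else:
    print("No /dev/video* device found - check the USB camera connection.")
```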

Running Process

  1. Click Node-RED under the TROS Gesture Detection example.

    Example Page

  2. Enter the example application flow interface.

    Tip

    Click the link icon in the top right corner of RDK Studio to quickly open the example in a browser!

    Gesture Recognition Example Page

  3. Select the type of camera that is connected and click the corresponding Start(USB Cam) command. After about 10 seconds, the visualization window opens automatically and begins recognition, and the recognized results are announced by voice (a command-line sketch after this list shows how the same results can be inspected over ROS 2).

    Example Image

  4. Performance Result: Click the debug icon on the right to switch the right sidebar to the debug window, where you can view the performance information output.

    Example Image

  5. Output Statistics: Outputs the statistics of the collected gesture counts (the sketch after this list keeps a similar tally directly from the ROS 2 topic).

    Example Image

  6. Click the Stop command to turn off the camera.

    Example Image

    Note

    If you modify nodes, flows, etc., you need to click the Deploy button in the top right corner for the changes to take effect!

  7. Click the × icon in the top right corner and select "Close APP" to exit the Node-RED application.

    Example Image
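
The gesture results shown in steps 3 and 5 are published as ROS 2 messages, so they can also be inspected from a terminal on the board. The sketch below is a minimal example under the following assumptions: the results use the ai_msgs/msg/PerceptionTargets message type, they are published on a topic such as /hobot_hand_gesture_detection, and the gesture is carried as a target attribute of type "gesture". None of these names are confirmed by this page, so check the actual topic with `ros2 topic list` and the message layout with `ros2 interface show ai_msgs/msg/PerceptionTargets` before relying on them.

```python
#!/usr/bin/env python3
# Minimal sketch: subscribe to the gesture results and keep a running tally,
# similar to the statistics output in step 5. The topic name, message type and
# the "gesture" attribute key are assumptions; verify them on your board first.
from collections import Counter

import rclpy
from rclpy.node import Node
from ai_msgs.msg import PerceptionTargets  # message type assumed from TROS ai_msgs


class GestureTally(Node):
    def __init__(self):
        super().__init__("gesture_tally")
        self.counts = Counter()
        # Replace the topic name with the one reported by `ros2 topic list`.
        self.create_subscription(
            PerceptionTargets, "/hobot_hand_gesture_detection", self.on_targets, 10)

    def on_targets(self, msg):
        # Each perception target may carry gesture attributes; tally them by value.
        for target in msg.targets:
            for attr in target.attributes:
                if attr.type == "gesture":
                    self.counts[int(attr.value)] += 1
        self.get_logger().info(f"gesture counts so far: {dict(self.counts)}")


def main():
    rclpy.init()
    node = GestureTally()
    try:
        rclpy.spin(node)
    finally:
        node.destroy_node()
        rclpy.shutdown()


if __name__ == "__main__":
    main()
```

Run the script with python3 after sourcing your TROS/ROS 2 environment while the flow is active; it prints an updated count each time a result message arrives.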

More Features

Visualization Page

  1. Click the Visualization Interface command, which automatically opens the TogetherROS Web Display.

    Example Image

  2. Click Web Display to enter the visualization page for real-time gesture recognition.

    Example Image

  3. Click the × in the top right corner of the visualization page to exit it.

Learn More

Click the Learn More command to open the online documentation and find more information about the examples.

Example Description Image