This repository has been archived by the owner on Nov 1, 2024. It is now read-only.

Real-time hand pose visualization using mock hand pose estimation #91

Open
jfResearchEng opened this issue Aug 26, 2022 · 0 comments
🚀 Feature

Hand tracking enables the use of hands as an input method for AR/VR applications. As an input modality, hands deliver a new sense of presence, enhance social engagement, and allow more natural interactions through fully tracked hands and articulated fingers.

MediaPipe can be used for real-time hand tracking, and LabGraph can be used to record the data captured from a webcam. This setup could also be extended to other downstream applications.

This task is to create mock hand pose data and visualize it as a real-time hand rendering. It depends on the completion of a previous task: #90
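As a starting point, a minimal sketch of the mock data generator is below. It assumes the mock frames should match the shape of MediaPipe Hands output (21 landmarks per hand, each with normalized x, y, z coordinates); the function name and random values are placeholders, not part of any existing LabGraph API.

```python
import random

# MediaPipe Hands reports 21 landmarks per hand (wrist + 4 joints for
# each of 5 fingers), each with normalized x, y, z coordinates.
NUM_LANDMARKS = 21

def mock_hand_pose(seed=None):
    """Return one mock frame: a list of 21 (x, y, z) tuples in [0, 1].

    Hypothetical helper for illustration only -- a real implementation
    would feed these frames into a LabGraph node for visualization.
    """
    rng = random.Random(seed)
    return [
        (rng.random(), rng.random(), rng.random())
        for _ in range(NUM_LANDMARKS)
    ]

frame = mock_hand_pose(seed=0)
print(len(frame))  # 21
```

A real-time visualizer could then poll this generator at the webcam frame rate in place of actual MediaPipe output until #90 lands.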

Additional context

  1. MediaPipe can be found [here](https://google.github.io/mediapipe/solutions/hands)
  2. The code should be added under https://github.com/facebookresearch/labgraph/tree/main/devices/webcam/
  3. Create setup.py and README.md; an example can be found at: https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_viz
  4. Add GitHub Actions support; reference: https://github.com/facebookresearch/labgraph/actions/workflows/main.yml
  5. Add proper license header.