🚀 Feature
Hand tracking enables the use of hands as an input method for AR/VR applications. Using hands as an input modality delivers a new sense of presence, enhances social engagement, and allows more natural interactions through fully tracked hands and articulated fingers.
MediaPipe can be used for real-time hand tracking, and LabGraph can be used to record the data captured from a webcam. The same pipeline could also be extended to other downstream applications.
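A minimal sketch of the real-time tracking side, assuming the legacy `mediapipe.solutions` Python API and an OpenCV webcam capture (this is only an illustration of the MediaPipe piece, not the LabGraph integration itself):

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_drawing = mp.solutions.drawing_utils

cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=2, min_detection_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures BGR frames.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for hand_landmarks in results.multi_hand_landmarks:
                mp_drawing.draw_landmarks(
                    frame, hand_landmarks, mp_hands.HAND_CONNECTIONS
                )
        cv2.imshow("Hand tracking", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break
cap.release()
cv2.destroyAllWindows()
```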
This task is to create mock hand pose data and visualize it alongside the real-time hand-tracking output. It depends on the completion of a previous task: #90
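One possible sketch for the mock data generator, producing landmarks in the same 21-point, normalized (x, y, z) layout that MediaPipe reports per hand; the oscillating motion model and the ~30 Hz rate are placeholder assumptions, not requirements from this issue:

```python
import time
import numpy as np

NUM_LANDMARKS = 21  # MediaPipe reports 21 landmarks per hand


def mock_hand_pose(t: float) -> np.ndarray:
    """Return a (21, 3) array of image-normalized landmark coordinates.

    The layout mirrors MediaPipe's hand landmark output; the motion here
    is a simple oscillation around a fixed pose, not anatomically realistic.
    """
    rng = np.random.default_rng(0)
    base = rng.uniform(0.3, 0.7, size=(NUM_LANDMARKS, 3))
    wobble = 0.05 * np.sin(2 * np.pi * 0.5 * t + np.arange(NUM_LANDMARKS))[:, None]
    return np.clip(base + wobble, 0.0, 1.0)


if __name__ == "__main__":
    start = time.time()
    while True:
        pose = mock_hand_pose(time.time() - start)
        print(pose[0])  # index 0 is the wrist in MediaPipe's convention
        time.sleep(1 / 30)  # roughly webcam frame rate
```

Frames generated this way could then be fed to the same visualization path used for real MediaPipe output, so the mock and live data can be compared side by side.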
Additional context