
Hand Tracking Visualization #82

Open
jfResearchEng opened this issue May 3, 2022 · 0 comments

🚀 Feature

Hand tracking enables the use of hands as an input method for the Oculus Quest headsets. Using hands as an input modality delivers a new sense of presence, enhances social engagement, and enables more natural interactions with fully tracked hands and articulated fingers.

We can use LabGraph to record the data captured from the Quest headset and visualize it (e.g. in Unity). The Oculus Hand Tracking API documentation is linked below under Additional context.
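
As a rough illustration, a minimal LabGraph pipeline for this task might look like the sketch below. The message layout (24 joints with x/y/z positions), the 60 Hz update rate, and the random placeholder data are assumptions for illustration only; the actual Quest-side capture and the Unity visualization are not shown.

```python
import time
import asyncio
import labgraph as lg
import numpy as np

# Assumed layout: 24 hand joints, each with an (x, y, z) position.
NUM_JOINTS = 24


class HandPoseMessage(lg.Message):
    timestamp: float
    joint_positions: np.ndarray  # shape (NUM_JOINTS, 3)


class HandPoseSource(lg.Node):
    """Placeholder source; a real node would read poses from the Quest via the Hand Tracking API."""
    OUTPUT = lg.Topic(HandPoseMessage)

    @lg.publisher(OUTPUT)
    async def publish(self) -> lg.AsyncPublisher:
        while True:
            yield self.OUTPUT, HandPoseMessage(
                timestamp=time.time(),
                joint_positions=np.random.rand(NUM_JOINTS, 3),
            )
            await asyncio.sleep(1 / 60)  # assumed 60 Hz update rate


class HandPoseSink(lg.Node):
    """Placeholder sink; a real node would forward poses to the visualization (e.g. Unity)."""
    INPUT = lg.Topic(HandPoseMessage)

    @lg.subscriber(INPUT)
    def on_pose(self, message: HandPoseMessage) -> None:
        print(message.timestamp, message.joint_positions.shape)


class HandTrackingDemo(lg.Graph):
    SOURCE: HandPoseSource
    SINK: HandPoseSink

    def connections(self) -> lg.Connections:
        return ((self.SOURCE.OUTPUT, self.SINK.INPUT),)

    def process_modules(self):
        return (self.SOURCE, self.SINK)


if __name__ == "__main__":
    lg.run(HandTrackingDemo)
```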

One Quest 2 headset and Link Cable could be provided to a US-based user who has contributed to LabGraph (subject to review/approval).

This task is a follow-up to #81 and focuses on visualizing the obtained data.

Additional context

  1. An existing application can be found [here](https://developer.oculus.com/documentation/unity/unity-handtracking/).
  2. The code should be added under https://github.com/facebookresearch/labgraph/tree/main/devices/quest2/visualization
  3. Create setup.py and README.md; an example can be found at https://github.com/facebookresearch/labgraph/tree/main/extensions/labgraph_viz (see the setup.py sketch after this list).
  4. Add GitHub Actions support; reference: https://github.com/facebookresearch/labgraph/actions/workflows/main.yml
  5. Add the proper license header.
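
For item 3, a minimal setup.py in the style of the labgraph_viz extension might look like the sketch below. The package name, version, and dependency list are assumptions and should be aligned with the labgraph_viz example and the repository's existing license headers.

```python
#!/usr/bin/env python3
# Copyright (c) Facebook, Inc. and its affiliates. All rights reserved.
# (License header per item 5; mirror the headers used elsewhere in the repository.)

from setuptools import find_packages, setup

setup(
    name="labgraph_quest2_visualization",  # assumed package name
    version="1.0.0",                       # assumed version
    description="Hand tracking visualization for Quest 2 data recorded with LabGraph",
    packages=find_packages(),
    python_requires=">=3.6",
    install_requires=[
        "labgraph",  # assumed dependencies; mirror labgraph_viz's setup.py
        "numpy",
    ],
)
```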