Releases: benaclejames/VRCFaceTracking
🎊 VRCFaceTracking V5.0.0 - Unified 🎉
Download Installer
Unified Expressions, Unified Tracking, Unified Interface.
Despite a year of no official updates, we've actually been hard at work on what might be our most anticipated VRCFaceTracking update yet! This update introduces our new expression standard Unified Expressions, our new tracking interface that can drive the new expressions, and a completely overhauled user interface bringing many quality of life additions and changes to the core application!
We also have brand new documentation for VRCFaceTracking at docs.vrcft.io!
(Please note, the documentation is still actively being developed and some things may be wrong or missing! We encourage users to contribute.)
The documentation consolidates a lot of information that was previously only available on the VRCFaceTracking Discord server. The docs include a quick-start, guides for supported hardware and software, sections dedicated to avatar creators and the new Unified Expressions standard, and developer SDK documentation to help create tracking modules!
We have also added VRChat's "native" eye tracking endpoints, so now nearly any Avatar 3.0 avatar can be used with eye tracking from VRCFT!
🎭 Unified Expressions (More Shapes!)
Unified Expressions is our new expression standard that aims to help unify many face tracking shapes into one overarching standard. Heavily based on human facial anatomy, Unified Expressions provides a strong basis for supporting expressive avatars with the addition of formerly untracked facial features such as the eyebrows and nose, and also including much more nuanced lip, mouth, eye, and tongue shapes!
The best part is that Unified Expressions is also fully backwards-compatible with all commonly used face tracking standards such as Apple ARKit/PerfectSync, SRanipal, FACS, and Quest Pro FACS! It is also designed to be, in theory, forwards-compatible with potential new face tracking standards that may come in the future.
If you are comfortable with existing face tracking standards, you can rest assured that your avatar will work without any changes to its blendshapes! Keeping backwards compatibility with all avatars is an important consideration for the new standard, and all existing VRCFaceTracking avatars are automatically supported by it.
Our new expression standard is now available for use; all available documentation can be found here. Parameters used by VRCFaceTracking for Unified Expressions are also found here.
(Please note, the documentation and standard is still actively being written. Changes to the reference avatar and descriptions are still being worked on at this time.)
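Under the hood, the v2/ parameters are ordinary VRChat OSC avatar parameters. As a rough illustration of what a tracking application sends (a stdlib-only sketch; the address and VRChat's default OSC port 9000 follow VRChat's OSC documentation, and JawOpen is just one example shape):

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC spec."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_float_message(address: str, value: float) -> bytes:
    """Build a minimal OSC message carrying a single float32."""
    return (osc_pad(address.encode("ascii"))
            + osc_pad(b",f")                 # type tag: one float argument
            + struct.pack(">f", value))      # big-endian float32

# VRChat listens for OSC on localhost:9000 by default; avatar
# parameters are addressed as /avatar/parameters/<name>.
packet = osc_float_message("/avatar/parameters/v2/JawOpen", 0.5)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 9000))
```

In practice you would use an OSC library rather than hand-packing bytes, but the wire format really is this small: a padded address, a type tag, and the value.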
🤖 Unified Tracking (Overhauled Tracking!)
VRCFaceTracking's Unified Tracking interface was overhauled to work with the new Unified Expressions standard. Tracking modules now have access to many new tracking points to take advantage of more sophisticated tracking interfaces as part of the Tracking Module SDK. Modules can now provide robust tracking for eyebrows, lips, mouth, eyes, and tongue.
If you are a module developer, we are looking for feedback on the Tracking Module SDK! Please let us know how we can be more thorough or accommodating in onboarding developers who are creating VRCFaceTracking modules.
As of release, the following tracking modules are already available:
- SRanipal (Vive Pro Eye, Vive Face Tracker, etc.)
- Quest Pro (OpenXR)
- Pico 4 Pro/Enterprise Stream Assistant
- MeowFace (Android App)
The architecture of the tracking data has mostly remained the same, so module developers should be able to easily adapt to the new interface changes. We hope the changes to Unified Tracking and our updated documentation help developers looking to implement new tracking interfaces for VRCFaceTracking!
With Unified Expressions, support for other VRChat face tracking solutions that provide different parameters is now also a possibility. We hope that this flexibility will mean VRCFaceTracking can be used to further unify face tracking and grow as a standardized interface!
🔮 Unified Interface (Shiny New UI!)
VRCFaceTracking has a brand new UI built upon the Windows Mica UI. The user experience of VRCFaceTracking has been drastically improved, including: a new Home page; an Output logging page to aid debugging and setup; and the Module Registry page, where VRCFaceTracking Modules can be installed.
The Home page
- Provides quick access to tracking toggles, OSC information, and avatar and loading status.
The Output page
- Provides quick access to logging output. Any important information about the state of VRCFaceTracking will be logged here. You can also save a log or copy it to the clipboard now! Logs are also saved on crash (in case it happens 💥), along with additional information to help debug the issue.
The Module Registry page
- An entirely new page that houses the new downloadable VRCFaceTracking modules! Modules here are easily installable and removable. This will also show any modules that are locally installed (such as ones a developer is locally testing), so now there is a quick way to find out what exactly VRCFaceTracking is loading.
The Settings cog
- You can now finally adjust settings in the program itself, including the OSC port, theme, and more!
Localization!
- That's not all! With the new UI comes localization, which means the ability to use VRCFaceTracking with text in your preferred language! This release will ship with Simplified Chinese (简体中文) as a localization option, courtesy of tianrui#6510. As always, if you would like to contribute and add your preferred language as an option, we encourage you to make a pull request.
Easier installation!
- The new UI also allows us to use Windows' appinstaller and the new .msix packaging format. This really speeds up the install (no more dragging folders around if you don't want to). The appinstaller also allows VRCFaceTracking to automatically update, but you can choose to update manually by installing with the .msix package.
The UI is easily the most visible feature, but we think VRCFaceTracking has been improved in every aspect! These changes should also help us continue to release exciting features.
📜 Update Log
The following is a nearly comprehensive list of all features, additions, fixes, changes, code-cleanups, and improvements.
✨ Features / Additions
- User interface overhaul
- A brand new look and greatly improved performance (no more unreasonable CPU usage from the app itself!)
- Streamlined UI functionality
- Added pages that help inform users and provide useful features such as module loading, tracking toggles, and OSC information
- Modules can be managed via the Module Registry
- Logs can be saved to text file manually, or automatically after an application crash (as if that would happen)
- Added localized language support, with Simplified Chinese as the first included localization option
- Improved user interface elements, such as avatar ID handling, module name indicators, and output window improvements.
- Implemented Unified Expressions and parameters
- Unified Expressions parameters are now available for use
- Accessed via v2/...; all parameters are listed here
- Added backwards compatibility for existing VRCFaceTracking avatars
- VRCFaceTracking now 'emulates' SRanipal parameters, meaning that modules which interface with Unified Expressions will automatically support existing SRanipal-based avatars!
- Greatly improved inter-standard mapping from Quest Pro intermediary community builds of VRCFaceTracking
- All output parameters that were used to make testable Quest Pro compatible avatars are supported as well. We still encourage users to update their Quest Pro compatible avatars to use Unified Expressions, as we believe it should provide a more accurate tracking experience
- Added UnifiedExpressionMutator class for parameter handling
- Note: This is currently unexposed to the UI, and may drastically change over time
- Ability to calibrate and configure tracking parameters
- Added support for VRChat "native" eye parameters
- Supports basic blinking and eye gaze
♻ Changes
- Parameters
- Eye Gaze can now be used as a binary parameter
- Parameters are now more robustly loaded. Feel free to load parameters with any arbitrary prefix! E.g. test/prefix/example/v2/JawOpen
- Applies to all parameter types
- Improved parameter parsing when loading an avatar's OSC config
- Modules / Interface / Loading
- Modules can provide metadata for the Module Registry
- Currently the Module Registry pulls metadata from an internally set source, but we are looking to have a more direct way to allow module developers to add their own modules easily to the Registry
- Unified Tracking overhaul
- Refactored and cleaned up the tracking module and Unified Tracking Data class for better consistency, readability, and accessibility
- Removal of default integrated SRanipal interface
- Modules now have a single Status to represent the state of the entire module
...
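The arbitrary-prefix parameter loading described above can be thought of as a suffix match on the parameter name. A simplified sketch (not VRCFaceTracking's actual implementation; the parameter set here is a hypothetical subset for illustration):

```python
from typing import Optional

# Hypothetical subset of recognized parameter names.
KNOWN_PARAMETERS = {"v2/JawOpen", "v2/EyeLidLeft"}

def match_parameter(address: str) -> Optional[str]:
    """Return the known parameter an avatar config entry resolves to,
    ignoring any arbitrary user prefix in front of it."""
    for name in KNOWN_PARAMETERS:
        if address == name or address.endswith("/" + name):
            return name
    return None

# Any prefix is accepted, as long as the address ends in a known parameter:
match_parameter("test/prefix/example/v2/JawOpen")  # resolves to "v2/JawOpen"
```

The suffix check means avatar creators can namespace parameters however they like without VRCFaceTracking needing to know the prefix in advance.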
🖼️ UI Release
It's been a long time coming, but finally VRCFaceTracking has a coherent user interface! Gone are the days of white text on a black terminal! You'll also be able to access eye and lip image feeds with this UI which will hopefully allow you to better position trackers on nonstandard headsets!
Thanks to everyone who contributed and otherwise supported the development of this update, and thanks to you for continuing to show VRChat that facial tracking isn't "just a niche"!
What's Changed 🤔
- VRCFT OSC UI by @dfgHiatus in #81
- Adding error message when loading an external module fails by @InconsolableCellist in #84
- Added a check for the user folder parser to make sure an Avatar subfolder exists by @regzo2 in #85
- Moving try/catch so that it's actually useful. by @InconsolableCellist in #86
- Re-Enable and Fix Pimax Eye Tracking by @0x8080 in #88
- Reduce memory consumption when working with large camera images by @m3gagluk in #91
- Automatic OSC enable functionality
- Add lip and eye images
- Fixed double module teardown
- Updated external tracking modules to be less likely to break with minor changes to their base class by using abstract class overrides
- byemax a87fbbc
New Contributors 💖
- @Gorialis made their first contribution in #79
- @dfgHiatus made their first contribution in #81
- @0x8080 made their first contribution in #88
- @m3gagluk made their first contribution in #91
Full Changelog: v3.0.1...v4.0.0
🎉VRCFT Standalone
THIS VERSION IS OUTDATED, CLICK HERE
For new users, please check out the wiki and the README for some guidance on avatar setup!
Known Issues
- Switching avatar after logging in without pausing beforehand will break VRChat's "set parameter" functionality and will require a game restart. Press P to toggle pause and unpause when the swap has completed.
- Launch the exe before VRChat, otherwise it'll have no way of telling what avatar you're using.
And feel free to join the Discord! We're a community full of face-tracking obsessed creators, developers and users and we'd love to see what you create and provide help when needed!
🎉VRCFT Standalone Release!🥳
THIS VERSION IS OUTDATED, CLICK HERE
Thanks to the new VRChat open-beta, VRCFaceTracking will now offer a standalone exe to allow for full facial tracking inside VRChat! 🎉
The existing parameters should still work minus a couple of merged lip parameters, but most existing avatars will continue to work, as well as any new avatars that follow the current wiki guides.
✨ VRCFT Release 2.6.0
This new version of VRCFT introduces a new parameter type! Binary Parameters!
Binary Parameters use base 2 to save space on networked parameters while maintaining the smoothness of a prioritized float using animator smoothing. They can be somewhat of a pain to set up, but allow for seemingly infinite customizability of parameters, down to their data size. Use them to create tongue tracking that's accurate to 262143 values, or create a smile that's only accurate to 7; it's completely up to you!
More info on binary parameters can be found in the documentation. Many thanks to @regzo2 for the initial creation of these parameters and for guidance on making them infinitely scalable!
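To make the encoding concrete, here is a minimal sketch (an illustration of the base-2 idea, not VRCFaceTracking's actual code) of quantizing a [0, 1] float into boolean parameters. Note that 18 bits gives the 262143 steps mentioned above, and 3 bits gives 7:

```python
def to_binary_parameter(value: float, bits: int) -> list:
    """Quantize a [0, 1] float into `bits` boolean parameters
    (least-significant bit first)."""
    steps = (1 << bits) - 1                     # 18 bits -> 262143 steps
    q = round(max(0.0, min(1.0, value)) * steps)
    return [bool((q >> i) & 1) for i in range(bits)]

def from_binary_parameter(flags: list) -> float:
    """Reconstruct the float value from the boolean parameters."""
    steps = (1 << len(flags)) - 1
    q = sum(1 << i for i, on in enumerate(flags) if on)
    return q / steps

# A 3-bit "smile" is accurate to 7 steps:
to_binary_parameter(1.0, 3)   # all three bits set, i.e. step 7 of 7
```

On the avatar side, each boolean drives an animation layer weighted by its bit value, which is what lets the animator reassemble a smooth float from a handful of synced bools.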
What's Changed
- Launching VRChat as Administrator will now display a live camera feed from the Vive Pro Eye, instead of the traditional eye indicators.
- Most parameters are now encompassed under the EParams class, meaning these parameters can be a binary param, float param, or a bool param depending on what you set up on your avatar. You decide!
- Combine TongueSteps 1 and 2 by @holadivinus in #70
- Implement PositiveNegativeAveragedShape and use it to implement combined left/right parameters by @AeroScripts in #73
- Fixed SRanipal not honoring the quickmenu tracking toggles.
- Vastly improved wiki and documentation by @Adjerry91
Full Changelog: v2.5.1...v2.6.0
🛠 Fixes
Parameter setting was broken in the latest update. This release should fix any issues that update introduced.
Many Thanks!
🛠 UI Update
This release includes the new "Last Resort Module Re-Init" functionality for wireless users, or just users with a bad headset cable, which will re-initialize any modules if they stop responding.
More UI Improvements
This release finally adds back the lip image functionality to the UI. Now you can stare at... your teeth... I guess.
Threading Fix
Trying out a new fix for the dreaded "Error collecting from unknown thread" crash. Thanks to the pointers from @MoePus and the bug report thread from @ShawnMayer. As always, please let me know if you continue to experience the issue.
Thanks
P.S. I've added a launch flag to disable the threading in case this fix doesn't work. --vrcft-nothread
🛠 Fixes
This update fixes an issue introduced in the latest VRC update that stopped the tool from detecting when a new avatar is equipped, which it needs in order to know when to scan for parameters. This update also disables the lip image functionality of the UI due to instabilities.