OSC App to allow VRChat avatars to interact with eye and facial tracking hardware
Hello. I'm the author of [Avatar Optimizer](https://an12.net/aao). Is there any documentation or source-code location that lists all parameters VRCFaceTracking writes to or reads from the avatar?

I'm planning to implement more Animator Controller optimizations in a future release of Avatar Optimizer, under the assumption that certain parameters are not animated. However, this optimization would break compatibility with OSC tools, so starting with the next release, Avatar Optimizer will include a system for declaring parameter information about OSC tools, and will bundle that information for popular tools like VRCFaceTracking. Therefore, I want to know all parameters VRCFaceTracking writes to or reads from the avatar.

I found https://docs.vrcft.io/docs/tutorial-avatars/tutorial-avatars-extras/parameters, but it says it's incomplete, and it has no information about which parameters are read by VRCFaceTracking:

> There are many parameters listed here, and some that are undocumented to help users save performance (such as 100% optimal parameter replacements for some parameters).

I also found https://github.com/benaclejames/VRCFaceTracking/blob/master/VRCFaceTracking.Core/UnifiedTracking.cs#L37-L46, and it looks complete, but I could not verify that assumption myself, so I'm asking here. This file also has no information about which parameters are read by VRCFaceTracking.
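For context on what "parameters provided to the avatar" means at the wire level: OSC tools like VRCFaceTracking send VRChat a UDP packet per parameter update, addressed under the `/avatar/parameters/` prefix on port 9000 (per VRChat's OSC documentation). Below is a minimal sketch of how such a single-float OSC message is encoded; the parameter name `EyeLidLeft` is purely illustrative and not a claim about VRCFaceTracking's actual parameter set.

```python
import struct

def osc_pad(b: bytes) -> bytes:
    # OSC strings are NUL-terminated and padded to a 4-byte boundary
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_float_message(address: str, value: float) -> bytes:
    # A single-float OSC message: padded address, type-tag string ",f",
    # then the value as a big-endian 32-bit float
    return osc_pad(address.encode()) + osc_pad(b",f") + struct.pack(">f", value)

# VRChat listens on UDP port 9000 for addresses like
# /avatar/parameters/<ParameterName>; the name below is a placeholder.
packet = osc_float_message("/avatar/parameters/EyeLidLeft", 0.75)
# To actually send it:
#   import socket
#   socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, ("127.0.0.1", 9000))
```

The reverse direction (parameters the tool *reads* from the avatar) arrives the same way on port 9001, which is exactly why a list of both directions matters for the optimization described above.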
This issue, a documentation request, is still under discussion. It was opened by anatawa12 and has received 4 comments.