FoxyFace

FoxyFace allows you to use your real face to control your avatar's face in VRChat using any camera connected to your computer. You can also use the camera of an Android device, an iOS device, or another computer, but this requires downloading additional programs; here are instructions on how to do it.

FoxyFace uses the MediaPipe Face landmark detection neural network bundle and the neural network from Project Babble.

FoxyFace is a good starting point because it requires no monetary investment, as long as you have a computer and a camera on any of your devices.

Almost complete facial tracking

Example of face tracking. The face is taken from FreePik, and the Yeenie avatar is made by SMU.

FoxyFace currently tracks 83 of the 102 parameters supported by VRCFT (about 81%), taking blended shapes into account.

Supported parameters
BrowInnerUpLeft, BrowInnerUpRight, BrowLowererLeft, BrowLowererRight, BrowOuterUpLeft, BrowOuterUpRight, BrowPinchLeft, BrowPinchRight, CheekPuffLeft, CheekPuffRight, CheekSquintLeft, CheekSquintRight, CheekSuckLeft, CheekSuckRight, EyeOpennessLeft, EyeOpennessRight, EyeSquintLeft, EyeSquintRight, EyeWideLeft, EyeWideRight, EyeXLeft, EyeXRight, EyeYLeft, EyeYRight, HeadPitch, HeadRoll, HeadX, HeadY, HeadYaw, HeadZ, JawForward, JawLeft, JawOpen, JawRight, LipFunnelLowerLeft, LipFunnelLowerRight, LipFunnelUpperLeft, LipFunnelUpperRight, LipPuckerLowerLeft, LipPuckerLowerRight, LipPuckerUpperLeft, LipPuckerUpperRight, LipSuckLowerLeft, LipSuckLowerRight, LipSuckUpperLeft, LipSuckUpperRight, MouthClosed, MouthCornerPullLeft, MouthCornerPullRight, MouthCornerSlantLeft, MouthCornerSlantRight, MouthDimpleLeft, MouthDimpleRight, MouthFrownLeft, MouthFrownRight, MouthLowerDownLeft, MouthLowerDownRight, MouthLowerLeft, MouthLowerRight, MouthPressLeft, MouthPressRight, MouthRaiserLower, MouthRaiserUpper, MouthStretchLeft, MouthStretchRight, MouthUpperLeft, MouthUpperRight, MouthUpperUpLeft, MouthUpperUpRight, NoseSneerLeft, NoseSneerRight, TongueBendDown, TongueCurlUp, TongueDown, TongueFlat, TongueLeft, TongueOut, TongueRight, TongueRoll, TongueSquish, TongueTwistLeft, TongueTwistRight, TongueUp
Unsupported parameters
EyePupilDiameterMMLeft, EyePupilDiameterMMRight, JawBackward, JawClench, JawMandibleRaise, LipSuckCornerLeft, LipSuckCornerRight, MouthTightenerLeft, MouthTightenerRight, MouthUpperDeepenLeft, MouthUpperDeepenRight, NasalConstrictLeft, NasalConstrictRight, NasalDilationLeft, NasalDilationRight, NeckFlexLeft, NeckFlexRight, SoftPalateClose, ThroatSwallow

Step 0

  1. Make sure you've installed VRCFaceTracking.
  2. Make sure you have an avatar that supports face tracking or head movement. Without this (or the corresponding third-party module enabled), you won't be able to check whether tracking works. Here's a video tutorial: link
  3. The most important step: make sure you have enabled OSC in the avatar settings and enabled tracking of the individual parts of the face/head; by default, all of this is turned off.
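For context on what the OSC setting controls: VRCFT delivers tracking parameters to VRChat as OSC messages over UDP on port 9000 (VRChat's default input port), addressed under /avatar/parameters/. The sketch below shows what one such float-parameter packet looks like on the wire; the JawOpen address is illustrative, and this is not FoxyFace's own code.

```python
import socket
import struct


def osc_pad(data: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a multiple of 4 bytes
    return data + b"\x00" * (4 - len(data) % 4)


def osc_float_message(address: str, value: float) -> bytes:
    # Address pattern, then the ",f" type tag, then a big-endian float32
    return osc_pad(address.encode("ascii")) + osc_pad(b",f") + struct.pack(">f", value)


packet = osc_float_message("/avatar/parameters/JawOpen", 0.5)

# VRChat listens for OSC input on UDP port 9000 by default
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 9000))
sock.close()
```

If the OSC toggle in VRChat is off, these packets are simply ignored, which is why tracking appears dead until step 3 above is done.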

Installation

Perform the installation in this order:

  1. Install FoxyFace, instructions here.
  2. Install FoxyFaceVRCFTInterface, instructions here.

Camera setup

Instructions on how to set up the camera can be found here.

Instructions on how to use another device as a webcam can be found here.


Updating the Project Babble neural network

Instructions on how to update the neural network from Project Babble can be found here.


Want to control your avatar's head rotation?

Instructions on how to track head rotation can be found here.


Update FoxyFace Application

Instructions on how to update the FoxyFace app can be found here.


Build

Note

Simply cloning (git clone) without --recurse-submodules, or downloading a ZIP archive from GitHub, won't work because the repository uses submodules! If you have already cloned without them, run git submodule update --init --recursive.

Build FoxyFace

Python version 3.12 is required. Newer versions of Python are not supported, and older versions have not been tested.
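Since the version requirement is strict, it can help to verify the interpreter before creating the virtual environment. A small sketch (the helper name is ours, not part of FoxyFace):

```python
import sys


def is_supported_python(version=sys.version_info) -> bool:
    """FoxyFace requires Python 3.12 exactly: newer versions are
    unsupported and older ones are untested."""
    return tuple(version[:2]) == (3, 12)


if not is_supported_python():
    print(f"Unsupported Python {sys.version_info[0]}.{sys.version_info[1]}; use 3.12")
```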

The IDE does not configure the Python virtual environment automatically, but the basic plan consists of:

  1. Clone the repository:
git clone --recurse-submodules https://github.com/Jeka8833/FoxyFace.git
  2. Open the FoxyFace folder in an IDE (PyCharm).
  3. PyCharm may try to create .venv on its own, but it will most likely use the wrong version of Python; recreate .venv with Python 3.12.
  4. PyCharm will then prompt you to install the required libraries from the requirements.txt file; agree to this.

This is quite a complicated process for beginners; if you know how to automate it, feel free to share your thoughts.

Build FoxyFaceVRCFTInterface

Clone the project using your IDE's internal tools (JetBrains Rider, Visual Studio, etc.), and select the solution file FoxyFaceVRCFTInterface.sln. Then click FoxyFaceVRCFTInterface -> Build in the IDE, and it creates a compiled module for you in the release directory.

Instructions on where to put the module and in general on developing modules for VRCFT can be found here.

License

Note

This repository contains two separate projects, which have different licenses.

FoxyFace code is licensed under Apache License 2.0.

FoxyFace uses code from third-party developers under license:

  1. License for Baballonia: Apache License 2.0

FoxyFaceVRCFTInterface code is licensed under Unlicense.

FoxyFaceVRCFTInterface uses code from third-party developers under license:

  1. License for VRCFaceTracking: Apache License 2.0