Facial AR Remote is a tool that lets you capture blendshape animations directly from your iPhone X into Unity 3D via an app on your phone.
This repository is tested against the latest stable version of Unity and requires you to build your own iOS app to use as a remote. It is presented on an experimental basis; there is no formal support.
Get the latest release from the Releases tab.
The project is built using Unity 2018+, the TextMesh Pro package (via Package Manager), and the ARKit plugin. *Note* The ARKit plugin is only required for the iOS build of the remote; for convenience, you may want to build the remote from a separate project. For best results, use the Bitbucket tip of the ARKit plugin.
This repository uses Git LFS, so make sure you have LFS installed to get all of the files. Unfortunately, this also means the large files are not included in the "Download ZIP" option on GitHub, and the example head model, among other assets, will be missing.
- Set up a new project, either from the ARKit plugin project from Bitbucket or a new project with the ARKit plugin from the Asset Store.
- (Unity 2018.1) Add `TextMesh-Pro` to the project from `Window > Package Manager`. The package is added automatically in Unity 2018.2 and above.
- Add this repo to the project and set the build target to iOS.
- Set up the iOS build settings for the remote. In `Other Settings > Camera Usage Description`, be sure to add "AR Face Tracking" or something to that effect to the field. *Note* You may need to set the `Target Minimum iOS Version` to `11.3` or higher, and you may also need to enable `Requires ARKit Support`. *Note* The project defaults to ARKit 2.0. To use ARKit 1.5, set `ARKIT_1_5` in `Other Settings > Scripting Define Symbols`; this is only required if you have not updated your remote app to support ARKit 2.0 (see the sketch after this list). *Note* You may need to update your version of the ARKit plugin and update to Xcode 10 or greater for ARKit 2.0.
- Open `Client.scene` and, on the `Client` game object, set the correct `Stream Settings` on the `Client` component for your version of ARKit.
- When prompted, import TMP Essential Resources for TextMesh Pro.
- Enable "ARKit Uses Facetracking" in `UnityARKitPlugin > Resources > UnityARKitPlugIn > ARKitSettings`.
- Set `Client.scene` as your build scene and build the Xcode project.
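For reference, `ARKIT_1_5` is an ordinary scripting define symbol, so scripts can branch on it at compile time. Below is a minimal, hypothetical sketch of that pattern; the class name and log messages are illustrative and not part of this repo's API:

```csharp
using UnityEngine;

// Hypothetical example of branching on the ARKIT_1_5 scripting define.
public class ARKitVersionCheck : MonoBehaviour
{
    void Start()
    {
#if ARKIT_1_5
        Debug.Log("Built with ARKit 1.5 stream settings.");
#else
        Debug.Log("Built with the default ARKit 2.0 stream settings.");
#endif
    }
}
```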
- Add `TextMesh-Pro` to your main project or a new project from `Window > Package Manager`.
- Add this repo to the project. *Note* You should not need the ARKit plugin to capture animation.
- To test your connection to the remote, start by opening `../Examples/Scenes/SlothBlendShapes.scene`.
- Be sure your device and editor are on the same network. Launch the app on your device and press play in the editor.
- Set the `Port` number on the device to the same `Port` listed on the `Stream Reader` component of the `Stream Reader` game object.
- Set the `IP` of the device to one listed in the console debug log (see the sketch after this list).
- Press `Connect` on the device. If your face is in view, you should now see your expressions driving the character on screen. *Note* You need to be on the same network, and you may have to disable any active VPNs and/or disable firewall(s) on the ports you are using; this may be necessary on your computer and/or on the network. *Note* Our internal setup used a dedicated wireless router attached to the editor computer, or a Lightning-to-Ethernet adapter.
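If you're unsure which of the logged addresses to enter on the device, one way to list the editor machine's IPv4 addresses is a plain .NET DNS lookup. This is a hypothetical helper for troubleshooting, not part of this repo:

```csharp
using System.Net;
using System.Net.Sockets;
using UnityEngine;

// Hypothetical helper: logs every IPv4 address assigned to this machine.
// One of these is typically the address to enter on the device.
public class LogLocalAddresses : MonoBehaviour
{
    void Start()
    {
        foreach (var address in Dns.GetHostEntry(Dns.GetHostName()).AddressList)
        {
            if (address.AddressFamily == AddressFamily.InterNetwork)
                Debug.Log("Possible editor IP: " + address);
        }
    }
}
```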
- Character Rig Controller does not support Humanoid Avatar for bone animation.
- Animation Baking does not support Humanoid Avatar for avatar bone animation.
- Stream source can only connect to a single stream reader.
- Some network setups cause an issue with DNS lookup for getting IP addresses of the server computer (a possible workaround is sketched below).
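As a possible workaround for the DNS lookup issue above, a script can enumerate network interfaces directly rather than resolving the hostname. This is a hypothetical sketch, not part of this repo:

```csharp
using System.Net.NetworkInformation;
using System.Net.Sockets;
using UnityEngine;

public static class InterfaceAddresses
{
    // Log the IPv4 address of every interface that is currently up,
    // without going through a hostname/DNS lookup.
    public static void LogIPv4Addresses()
    {
        foreach (var nic in NetworkInterface.GetAllNetworkInterfaces())
        {
            if (nic.OperationalStatus != OperationalStatus.Up)
                continue;

            foreach (var info in nic.GetIPProperties().UnicastAddresses)
            {
                if (info.Address.AddressFamily == AddressFamily.InterNetwork)
                    Debug.Log(nic.Name + ": " + info.Address);
            }
        }
    }
}
```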
Note: History edits were made on 10/29/2018. If you cloned this repository before that date, please rebase before submitting a Pull Request.