AR tech startup ModiFace revealed its deep learning-based live-video hair tracking and hair-color simulation. The company trained its neural networks on 220,000 hair images to detect hair in each video frame and recolor it photorealistically.
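ModiFace has not published its method, but the general pipeline such systems imply — a network produces a soft per-pixel hair mask, which then drives a recoloring that preserves the original shading — can be sketched as follows. The function name and the luminance-based blend are illustrative assumptions, not ModiFace's actual algorithm.

```python
import numpy as np

def recolor_hair(image, hair_mask, target_rgb):
    """Blend a target color into hair pixels while preserving per-pixel
    brightness, so highlights and shadows survive the recoloring.
    Illustrative sketch only -- not ModiFace's published method."""
    image = image.astype(np.float32)             # H x W x 3, values 0..255
    target = np.asarray(target_rgb, np.float32)  # desired hair color
    # Per-pixel luminance keeps the original shading structure.
    luma = image.mean(axis=2, keepdims=True) / 255.0
    colored = target * luma                      # shade the target color
    mask = hair_mask[..., None]                  # soft mask in 0..1
    out = mask * colored + (1.0 - mask) * image
    return out.clip(0, 255).astype(np.uint8)
```

Because the mask is soft (values between 0 and 1), the blend feathers naturally at hair boundaries instead of producing a hard edge.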
You can try out the new tech in the “Hair Color” app on the App Store.
Google released a preview of a new SDK called ARCore. It is currently running on the Pixel and Galaxy S8.
ARCore works with Java, OpenGL, Unity, and Unreal Engine, and focuses on three capabilities:
- Motion tracking: By combining feature points observed through the phone’s camera with IMU sensor data, ARCore determines both the position and orientation (pose) of the phone as it moves, so virtual objects remain accurately placed.
- Environmental understanding: It’s common for AR objects to be placed on a floor or a table. ARCore can detect horizontal surfaces using the same feature points it uses for motion tracking.
- Light estimation: ARCore observes the ambient light in the environment and makes it possible for developers to light virtual objects in ways that match their surroundings, making their appearance even more realistic.
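The pose mentioned under motion tracking is just an orientation plus a position. The sketch below shows the underlying math: applying a pose (a unit quaternion and a translation) to a point in the phone's local space to get world coordinates. This is the concept behind ARCore's pose representation, not ARCore's API itself; the function name is our own.

```python
import numpy as np

def transform_point(q, t, p):
    """Apply a pose (unit quaternion q = [x, y, z, w], translation t)
    to a local point p, yielding world coordinates -- the same idea as
    a pose that bundles orientation and position."""
    x, y, z, w = q
    # Rotation matrix built from the unit quaternion.
    R = np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])
    return R @ np.asarray(p, float) + np.asarray(t, float)
```

For example, a quaternion encoding a 90° rotation about the vertical axis maps the point (1, 0, 0) to (0, 1, 0) before the translation is added.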
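Light estimation can be illustrated with a toy version of the same idea: reduce the camera frame to a single ambient-intensity scalar and use it to scale a virtual object's base color so it sits plausibly in the scene. This is a deliberately simplified sketch, not ARCore's actual estimator; both function names are our own.

```python
import numpy as np

def ambient_scale(camera_frame):
    """Estimate a single ambient-light scalar (0..1) from a grayscale
    camera frame -- a crude stand-in for a real light estimator."""
    return float(camera_frame.mean() / 255.0)

def lit_color(base_rgb, scale):
    """Darken or brighten a virtual object's base color to match the
    estimated ambient light of the surrounding scene."""
    return [min(255, int(round(c * scale))) for c in base_rgb]
```

A virtual object rendered in a dim room would thus get a darker tint than the same object in bright daylight, which is what makes its appearance feel grounded.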