Procedural Mesostructure Synthesis for Digital Materials
Tamy Boubekeur and Sébastien Deguy (Allegorithmic)
Digital materials in the form of spatially-varying bidirectional reflectance distribution functions (SVBRDFs) can, for a large set of applications, be efficiently represented as a collection of 2D maps storing the standard SVBRDF channels. While the reflectance properties are captured by specific microstructure parameters – such as the roughness, which drives the spread of the normal distribution function hidden in a classical microfacet reflectance model – geometric mesostructures are typically expressed using a displacement map. In this paper, we discuss how a directed acyclic graph (DAG) of processing nodes offers a powerful abstraction for designing such displacement maps, and present a procedural modeling workflow for mesostructure creation.
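The node-graph abstraction the abstract describes can be illustrated with a minimal sketch. This is not Allegorithmic's actual implementation: the node types (`bumps`, `blend`, `levels`), the plain list-of-lists "map" representation, and the caching scheme are all illustrative assumptions, standing in for a real generator/filter node library.

```python
import math

class Node:
    """A processing node in the DAG: computes a 2D scalar map from its input nodes."""
    def __init__(self, fn, *inputs):
        self.fn = fn
        self.inputs = inputs
        self._cache = None  # evaluate once, so shared subgraphs reuse their result

    def evaluate(self, size):
        if self._cache is None:
            self._cache = self.fn(size, [n.evaluate(size) for n in self.inputs])
        return self._cache

# Generator node: a sine-based bump pattern (stands in for a procedural noise node).
def bumps(size, _):
    return [[0.5 + 0.5 * math.sin(0.8 * x) * math.sin(0.8 * y)
             for x in range(size)] for y in range(size)]

# Filter node: average two input maps (stands in for a blend node).
def blend(size, maps):
    a, b = maps
    return [[0.5 * (a[y][x] + b[y][x]) for x in range(size)] for y in range(size)]

# Filter node: remap values through a power curve to sharpen peaks (a levels node).
def levels(size, maps):
    (a,) = maps
    return [[a[y][x] ** 2 for x in range(size)] for y in range(size)]

# Wire the DAG: two generators feed a blend, whose output feeds a levels adjustment.
displacement = Node(levels, Node(blend, Node(bumps), Node(bumps)))
height = displacement.evaluate(8)  # 8x8 displacement map, values in [0, 1]
```

The appeal of the DAG form is that each node is a pure, resolution-independent operation, so the same graph can be re-evaluated at any map size and edited non-destructively by rewiring or re-parameterizing individual nodes.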
InnovateUK HARPC: high-quality facial capture unencumbered by head mounted cameras
Jon Starck (Synthesia)
HARPC is an InnovateUK funded project developing new solutions for high-quality facial capture aimed at bringing more creative, accessible and cost-effective performance capture to a range of productions.
Deformable Objects for Virtual Environments
Martin Jowers and Eleanor Whitley (Marshmallow Laser Feast)
The emergence of the experiential industry increasingly involves real-time interactive experiences. User behaviours are shifting from passive to interactive entertainment as hardware capabilities offer input devices that make it easier for software to be integrated with the physical world. We can now digitally map rigid 3D environments and interact with them. But what about deformable objects that change shape: soft pillows, immersive theatre sets and props, or even food such as jelly? As cameras in consumer electronics become more intelligent, they offer a range of opportunities for developers to exploit their features for interactive immersive experiences. Tracking for basic objects, such as VR control devices with active markers (e.g. Oculus Touch, PlayStation Move, Vive controller), or hand, eye, skeletal and facial tracking, is already present in many high-end VR projects (e.g. The Void). However, in the VR/AR/XR industry, real-world deformable objects remain the last in line to be detected, tracked, identified and generally have their physical behaviours understood in real-time 3D.
Marshmallow Laser Feast and CAMERA have been developing DOVE (Deformable Objects for Virtual Environments), a method to track and visualise deformation in virtual environments. The innovation and focus of DOVE is to combine state-of-the-art machine learning algorithms for deformable object tracking from depth sensors with bespoke rendering software for real-time object display in VR, AR and even projection mapping. This is combined with existing hand tracking technology, allowing users to interact with real-world objects that are displayed in novel ways in virtual environments, depending on the experience.
The unique demonstrator, and initial commercial exploitation, for this innovative VR technology is ‘Sweet Dreams’ – a VR dining experience in which diners will be immersed in a processional virtual world while eating tracked food displayed to the user in new and exciting ways.
Modern Rigging Paradigms to Improve Performance and Modularity
Chus Nieto, Charlie Banks and Ryan Chan (DNEG)
Several years ago, DNEG set out to build Loom, a new rigging framework, in a bid to improve the performance of our Maya animation rigs. This talk is an update on its development in light of the discontinuation of its original evaluation back-end, Fabric Engine. In particular, we describe the design choices that enabled us to achieve a DCC-agnostic rigging framework, allowing us to focus on the development of pure rigging concepts. We also describe how this setback prompted us to extend the framework to properly handle the deformation side of rigs, targeting memory efficiency, GPU/CPU memory interaction and high-end performance optimizations.