SA '18 - SIGGRAPH Asia 2018 Real-Time Live!

An architecture for immersive interactions with an emotional character AI in VR

When entering a virtual world, users expect an experience that feels natural. Huge progress has been made with regard to motion, vision, and physical interactivity, whereas interactivity with non-playable characters lags behind. This live demo introduces a method that leads to more aware, expressive, and lively agents that can attend to their own needs and interact with the player. Notably, the live demo covers the use of an emotional component and the addition of a communication layer (speech) to allow more immersive and interactive AIs in VR.
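
The Python sketch below is a minimal illustration, not the authors' code, of the kind of agent loop such an architecture implies: the character tracks its own needs, derives an emotional state from them, and answers the player through a speech channel conditioned on that state. All class and method names are hypothetical.

from dataclasses import dataclass, field

@dataclass
class EmotionState:
    valence: float = 0.0   # negative..positive feeling
    arousal: float = 0.0   # calm..excited

@dataclass
class EmotionalAgent:
    # Internal needs the agent tries to satisfy on its own.
    needs: dict = field(default_factory=lambda: {"rest": 1.0, "social": 1.0})
    emotion: EmotionState = field(default_factory=EmotionState)

    def update(self, dt: float, player_nearby: bool) -> None:
        # Needs decay over time; satisfying them would restore them.
        for key in self.needs:
            self.needs[key] = max(0.0, self.needs[key] - 0.01 * dt)
        # Emotion is derived from how well the needs are met plus social context.
        deficit = 1.0 - sum(self.needs.values()) / len(self.needs)
        self.emotion.valence = 0.5 - deficit
        self.emotion.arousal = 0.3 + (0.4 if player_nearby else 0.0)

    def respond(self, utterance: str) -> str:
        # Speech layer: the reply style depends on the current emotional state.
        mood = "cheerfully" if self.emotion.valence > 0 else "wearily"
        return f"({mood}) I heard you say: {utterance}"

npc = EmotionalAgent()
npc.update(dt=1.0, player_nearby=True)
print(npc.respond("Hello there!"))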

Live replay movie creation of Gran Turismo

We perform live replay movie creation using recorded play data from Gran Turismo. Our real-time technologies enable movie editing, such as authoring camerawork and adding visual effects, while reproducing the race scene from the play data with high-quality graphics. We also demonstrate some recent developments for the future.
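
As a rough illustration of the idea only, not Polyphony Digital's pipeline, the Python sketch below keeps the recorded play data (which deterministically reproduces the race) separate from an authored camera track that is sampled independently at edit time; every name in it is hypothetical.

from bisect import bisect_right

def lerp(a, b, t):
    # Linear interpolation between two positions.
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def sample_camera(keyframes, time):
    """keyframes: list of (time, position) pairs authored by the editor, sorted by time."""
    times = [k[0] for k in keyframes]
    i = bisect_right(times, time)
    if i == 0:
        return keyframes[0][1]
    if i == len(keyframes):
        return keyframes[-1][1]
    (t0, p0), (t1, p1) = keyframes[i - 1], keyframes[i]
    return lerp(p0, p1, (time - t0) / (t1 - t0))

# Recorded play data: per-frame car states captured during the original race.
play_data = [{"time": t * 0.1, "car_pos": (t * 1.5, 0.0, 0.0)} for t in range(100)]
# Authored camerawork, edited after the fact without touching the play data.
camera_track = [(0.0, (0.0, 5.0, -10.0)), (5.0, (75.0, 3.0, -6.0)), (9.9, (150.0, 8.0, -12.0))]

for frame in play_data:
    cam = sample_camera(camera_track, frame["time"])
    # A real renderer would draw the reproduced race scene from `cam` here.
    last_camera = cam
print("final shot camera:", last_camera)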

More real, less time: Mimic's quest in real-time facial animation

Mimic Productions' CEO, Hermione Mitford, will present a live-streamed demonstration of detailed facial animation in real time, utilizing her photo-real 3D digital double. The presentation will include a speech from Mitford (and her avatar) addressing Mimic's technological approach, as well as the corresponding applications for the technology. A specific focus will be placed on realism and the details of the human face.

MR360 Live: immersive mixed reality with live 360° video

DreamFlux presents MR360 Live, a new way to create immersive and interactive mixed reality applications. It blends 3D virtual objects into live-streamed 360° videos in real time, providing the illusion of interacting with objects in the video.
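
A minimal sketch of the underlying idea, assumed rather than taken from DreamFlux's implementation: each frame of the live 360° video serves both as the background and as an environment map that lights the virtual objects before they are composited back over it. The NumPy code and all names below are illustrative only.

import numpy as np

def sample_equirect(frame, direction):
    """Look up an equirectangular frame (H x W x 3) along a unit direction."""
    x, y, z = direction
    u = (np.arctan2(x, z) / (2 * np.pi) + 0.5) * (frame.shape[1] - 1)
    v = (np.arccos(np.clip(y, -1.0, 1.0)) / np.pi) * (frame.shape[0] - 1)
    return frame[int(v), int(u)].astype(np.float32) / 255.0

def diffuse_light(frame, normal, samples=64):
    """Crude diffuse estimate: average the environment over the normal's hemisphere."""
    rng = np.random.default_rng(0)
    total = np.zeros(3, dtype=np.float32)
    count = 0
    while count < samples:
        d = rng.normal(size=3)
        d /= np.linalg.norm(d)
        w = float(np.dot(d, normal))
        if w > 0:                      # keep directions above the surface
            total += sample_equirect(frame, d) * w
            count += 1
    return total / samples

# A live frame would come from the 360 camera stream; here it is synthetic.
frame = (np.random.rand(512, 1024, 3) * 255).astype(np.uint8)
albedo = np.array([0.8, 0.2, 0.2], dtype=np.float32)
shaded = albedo * diffuse_light(frame, np.array([0.0, 1.0, 0.0]))
# `shaded` is the lit color of a virtual surface point, ready to be
# composited over the corresponding pixel of the 360 background.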

Pinscreen avatars in your pocket: mobile paGAN engine and personalized gaming

We will demonstrate how a lifelike 3D avatar can be instantly built from a single selfie input image, using our own team members as well as a volunteer from the audience. We will showcase additional 3D avatars built from internet photographs and highlight the underlying technology, such as our lightweight real-time facial tracking system. Then we will show how our automated rigging system enables facial performance capture as well as full-body integration. We will showcase different body customization features and other digital assets, and show various immersive applications, such as 3D selfie themes and multi-player games, all running on an iPhone.
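
Purely as a schematic of the pipeline described, not Pinscreen's paGAN code, the Python sketch below names the stages the demo walks through: a single selfie yields a textured face model, the model is auto-rigged with blendshapes, and per-frame tracking weights drive the rig; every function is a placeholder.

from dataclasses import dataclass
from typing import List

@dataclass
class FaceModel:
    vertices: List[tuple]   # reconstructed neutral face geometry
    texture: bytes          # inferred facial texture

@dataclass
class RiggedAvatar:
    model: FaceModel
    blendshape_names: List[str]

def build_from_selfie(image_bytes: bytes) -> FaceModel:
    # Stand-in for the single-image face reconstruction step.
    return FaceModel(vertices=[(0.0, 0.0, 0.0)], texture=image_bytes)

def auto_rig(model: FaceModel) -> RiggedAvatar:
    # Stand-in for automated rigging: attach a standard blendshape set.
    return RiggedAvatar(model, ["jawOpen", "smileLeft", "smileRight", "blink"])

def animate(avatar: RiggedAvatar, tracked_weights: dict) -> dict:
    # Stand-in for real-time facial tracking driving the rig each frame.
    return {name: tracked_weights.get(name, 0.0) for name in avatar.blendshape_names}

selfie = b"...jpeg bytes from the phone camera..."
avatar = auto_rig(build_from_selfie(selfie))
frame_pose = animate(avatar, {"jawOpen": 0.4, "smileLeft": 0.7})
print(frame_pose)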

The power of real-time collaborative filmmaking 2

PocketStudio is designed to allow filmmakers to easily create, play, and stream 3D animation sequences in real time, using collaborative editing, a unified workflow, and other real-time technologies such as augmented reality.