SA '18 - SIGGRAPH Asia 2018 Virtual & Augmented Reality


Art plunge: experiencing the inner worlds of famous artworks

Art Plunge is a virtual reality experience in which you can get the feeling of being transported into the inner worlds of famous paintings. We have created VR interpretations of works such as the Mona Lisa, The Starry Night, and The Birth of Venus (Figure 1).

We explore what a painting could be in virtual reality and how different boundaries blur in the process: boundaries between our interpretation and the original painting, between technology and artistry, and between now and then.

FiveStar VR: shareable travel experience through multisensory stimulation to the whole body

We have developed a multisensory virtual reality system, FiveStar VR (five senses theater for VR), that enables participants to relive or share another person's behavior through well-designed simultaneous stimulation of multiple modalities. FiveStar VR consists of somatosensory displays in addition to a conventional audiovisual VR setup. In FiveStar VR, the participant's body parts are moved in synchrony with those of an avatar in the VR space, inducing a strong sense of presence in someone else's past walking behavior. Taking advantage of the cyclic nature of walking, the arms, the lower limbs, and the body are moved synchronously to simulate the sensation of real walking. These motion profiles do not exactly follow measured data from real walking; instead, the gain of each modality is adjusted based on the subjective intensity of the motion impression, mainly because the participant's own motor commands do not produce the usual sensory suppression. Our exhibition booth presents a virtual trip to a tourist site: Toronto and Niagara Falls, Canada. Attendees relive and share a short experience of walking around the area.
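
As a minimal sketch of the gain adjustment described above, the loop below drives each somatosensory modality from a shared walking phase and scales it by a per-modality gain. The gain values, cadence, and sinusoidal base profile are illustrative assumptions; the abstract does not specify the actual motion profiles or hardware interface.

```python
import math

# Hypothetical per-modality gains, tuned (per the abstract) to the
# subjective intensity of the motion impression rather than to
# measured walking data.
GAINS = {"arms": 0.6, "lower_limbs": 1.0, "body": 0.4}

def walk_phase(t, cadence_hz=1.8):
    """Phase of the cyclic walking motion in [0, 2*pi)."""
    return (2.0 * math.pi * cadence_hz * t) % (2.0 * math.pi)

def modality_command(modality, t):
    """Gain-scaled drive signal for one somatosensory display.

    All modalities share the same walking phase, so the arms, lower
    limbs, and body move synchronously as in the abstract.
    """
    base = math.sin(walk_phase(t))  # placeholder base profile
    return GAINS[modality] * base

if __name__ == "__main__":
    for t in (0.0, 0.1, 0.2):
        print({m: round(modality_command(m, t), 3) for m in GAINS})
```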

Flow zone: a cross-modal music creation VR experience to induce flow

Many people want to live a happy and fulfilling life, yet finding this positive sense of well-being is often quite a challenge. Flow is a powerful experience that not only feels amazing in the moment but actually improves a person's general sense of well-being. It seems like a great answer to this problem, but unfortunately the complex concoction of parameters necessary to enter flow prevents the experience from occurring regularly. Flow Zone aims to lower the barrier to entry with a design tailored to maximize the potential for flow. VR was used as the medium to create an immersive environment that simultaneously removes distractions and focuses the player's attention on the task at hand. The enhanced immersiveness of cross-modality, combined with game design elements centered on creative expression through music, creates a streamlined pathway to the flow state.

Games in concert: collaborative music making in virtual reality

Over the last two years, the Games in Concert project has explored the possibilities and implications of collaborative artistic music creation in virtual reality (VR). To this end, a multiuser VR environment and three virtual instruments were designed to create, shape, and experience sound in various ways. These are:

Paint: The artist can literally paint music in the 3D space.

Trees: The artist can add and customize sounding tree-like objects.

Keys: An external input device was used to explore the possibility of incorporating and visualizing non-VR instruments in a VR space.

Additionally, we built a stage setup to test the impact of a VR concert on an audience. The musicians embed the artistic content directly within the virtual environment. Spectators are free to explore the musicians' work independently, each with their own visual and auditory perspective, and can closely observe what the artists are creating in real time; they stand with them, in a manner of speaking, on the (virtual) stage. To present the spectacle to a larger audience we introduced the "Game Jockey": acting as the intermediary between the artists inside and the public outside the virtual environment, the Game Jockey has their in-game view and hearing rendered on a large screen and multiple speakers, providing a visual experience with surround sound.

Haptopus: haptic VR experience using suction mechanism embedded in head-mounted display

Along with the spread of VR experiences using low-cost head-mounted displays (HMDs), many proposals have been made to improve the VR experience by providing tactile information to the fingertips. However, attaching devices to the fingertips has issues: they are difficult to attach and detach, and they hinder free movement of the fingers. To address these issues, many methods have been proposed that incorporate a haptic presentation mechanism into an HMD, but only for presenting passive tactile information to the face or whole body. To present the tactile sensation of the fingertips in a configuration that can be incorporated into an HMD, we developed a skin-suction mechanism called Haptopus that simulates the pressure applied to multiple fingers. Haptopus expresses the sense of fingers touching virtual objects by presenting corresponding suction pressures around the eyes.
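
The abstract does not detail the finger-to-suction mapping, but a minimal sketch might look like the following, where per-finger virtual contact depth is mapped linearly to a suction pressure on an assumed outlet channel; the channel layout, saturation depth, and pressure range are all hypothetical.

```python
# Hypothetical assignment of fingers to suction outlets in the HMD.
CHANNEL = {"thumb": 0, "index": 1, "middle": 2, "ring": 3, "little": 4}

MAX_PRESSURE_KPA = 20.0  # assumed safe maximum suction pressure

def suction_pressures(penetration_mm):
    """Map virtual penetration depth (mm) per finger to suction (kPa).

    Linear ramp that saturates at an assumed 5 mm penetration; the
    actual transfer function is not given in the abstract.
    """
    pressures = [0.0] * len(CHANNEL)
    for finger, depth in penetration_mm.items():
        level = min(max(depth / 5.0, 0.0), 1.0)
        pressures[CHANNEL[finger]] = level * MAX_PRESSURE_KPA
    return pressures

# Index finger half-saturated, thumb fully saturated:
print(suction_pressures({"index": 2.5, "thumb": 6.0}))
# -> [20.0, 10.0, 0.0, 0.0, 0.0]
```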

HapTwist: creating interactive haptic proxies in virtual reality using low-cost twistable artefacts

In recent years, virtual reality (VR) with head-mounted displays has gained increasing attention in the consumer market, offering highly realistic visual and audio content. However, it remains challenging for these devices' pre-fabricated controllers to simulate realistic haptic/kinesthetic information (i.e., the shape, size, and weight) of virtual objects.

Islands/Seom: the creating of extended existence through AR and world simulation

Islands/Seom is an interactive AR archive about creating the extended existence of the people that we admire and care for. The audience enters a room of floating cube sculptures called Islands. Each cube encapsulates the perspective, voice, and beliefs of one of the artists. Together, we preserve their ghost and transform it into a landscape. View the sculptures through our custom-made mobile app. The avatar of the artist will reanimate on top. Listen to their manifesto. Send them to a virtual world called Lacus, where they grow organically into unique ecosystems. Through a game of world-simulation, we catalyze collaboration with their persona, against the odds of distance, personalities, and culture.

Little hero wins the masks: virtual reality creation of taiwanese classic comics

Taiwanese comics artist Hung-Chia Yeh created the martial arts comic series "Little Hero" (諸葛四郎) in 1958, and it became enormously popular in the 1960s. However, Taiwanese comics culture later declined due to various historical factors, leaving the younger generation knowing little about Taiwanese comics. We created this derivative work with VR and AR technology based on one of the classic stories in Little Hero, "Little Hero Wins the Masks" (諸葛四郎大鬥雙假面). The features of this work are: (1) technology that connects VR with an interactive game table to address the lack of sociability in VR games; (2) handheld-controller feedback and the interactive table's game cards designed to improve interactivity in the game (see Figure 1).

Lotus: enhancing the immersive experience in virtual environment with mist-based olfactory display

With the advance of virtual reality (VR) headsets and haptic technologies, users can have a great experience when immersed in a virtual environment (VE). However, little research allows users to simultaneously perceive odors from the VE. Among the five senses, olfaction is the one that perceives chemical information from the environment, which is also important for recreating a VE. In the past, some research groups have demonstrated olfactory display techniques. However, creating olfactory feedback for immersive VR while the user moves around the tracking area requires either a movable display or a lightweight portable device, because the user's nose is the only human receptor for perceiving scent. We present Lotus, a steerable mist-based olfactory display with an airflow-guiding module for simulating environments with olfaction. It can provide two kinds of VEs simultaneously, enhancing the immersive experience without the user carrying the weighty liquid.
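
A minimal sketch of the steering step such a display needs: computing pan and tilt angles that point the mist outlet at the tracked position of the user's nose. The coordinate convention and function names are assumptions; the abstract does not describe the actual actuation.

```python
import math

def aim_nozzle(nozzle_pos, nose_pos):
    """Pan/tilt angles (radians) that point the mist outlet at the nose.

    Positions are assumed (x, y, z) tuples in meters with y up;
    the real display geometry is not given in the abstract.
    """
    dx = nose_pos[0] - nozzle_pos[0]
    dy = nose_pos[1] - nozzle_pos[1]
    dz = nose_pos[2] - nozzle_pos[2]
    pan = math.atan2(dx, dz)                   # rotation about the up axis
    tilt = math.atan2(dy, math.hypot(dx, dz))  # elevation toward the nose
    return pan, tilt

# Example: nozzle at the origin, tracked nose 1.5 m away and 1.6 m up.
print([round(a, 3) for a in aim_nozzle((0, 0, 0), (0.3, 1.6, 1.5))])
```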

Mochitsuki: a real-object-based, interactive haptic interface

We propose an approach that breaks down the haptic experience into multiple elements and assigns an optimal device to each element, either a physical device or an electronically controllable one. As an example of this design approach, we built a VR system that provides the experience of Mochitsuki, the Japanese tradition of rice-cake pounding, and designed it to be more durable than the initial prototype. With this configuration, direct interactions between the user and the system are all performed by physical devices to ensure the quality of the experience, while the parameters affecting the experience can be controlled by electronic devices and mechanisms.

MR360 interactive: playing with digital creatures in 360° videos

We present "MR360 Interactive", interactive mixed reality (MR) experiences using pre-recorded and live streaming 360° videos (360-video) shown in head mounted displays. We developed the MR360 toolkit, an interface to create interactive MR content using 360-video. The toolkit detects salient lights in the 360-video and casts realistic shadows. Image based lighting is perceptually optimized to provide fast results. Real-time differential rendering obtains a composition between the virtual objects and the real-world background. We present two applications of our toolkit: a VR experience using pre-recorded 360-video, and the MR Stage, an MR experience using live streaming 360-video. In both applications, participants are interacting with digital creatures with high presence in 360-video.

Muscle action VR: to support embodied learning foundations of biomechanics in musculoskeletal system

Traditional anatomy education has struggled to teach students muscle movements in the context of three-dimensional anatomical structure. We present Muscle Action VR, an embodied-learning virtual reality system that allows students to explore the effects that muscles have on the body. This application was created for studying musculoskeletal structures through playful and creative engagement while staying accurate to anatomical structures and terminology. Users learn the basics of the biomechanics of human anatomy either by moving their own body with VIVE trackers or by directly manipulating specific muscles using VIVE controllers. We believe this application contributes to teaching three-dimensional spatial awareness and foundational biomechanics in anatomy education.

Oceans we make: immersive VR storytelling

Oceans We Make (OWM) is a three-minute-long immersive and interactive virtual reality (VR) experience that encourages participants to question their use of plastic. The experience blends beautiful cinematic graphics, engaging game mechanics, and an emotional narrative as a novel form of VR storytelling to drive positive environmental impact.

Rapture of the deep

Rapture of the Deep is an interactive virtual reality experience with eye tracking. The experience is set in an underwater scenario and uses eye tracking as the main mechanism, allowing the environment to react to the player's gaze and attention. In this project we worked with a retrofitted version of the HTC Vive headset with a complete eye-tracking integration by Tobii Pro and the Tobii Pro SDK for the Unity3D engine. Rapture of the Deep seeks to test how eye-tracking technology can be employed as an attentive and invisible user interface, allowing people to use reflexive and emotional behavior as a game controller.
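
A minimal sketch of gaze as an "invisible" input, independent of the actual Tobii Pro SDK calls: a dwell-time trigger that fires when the player's gaze rests on a target long enough. The class, threshold, and rearming behavior are illustrative assumptions.

```python
import time

DWELL_SECONDS = 0.8  # assumed attention threshold

class GazeDwellTrigger:
    """Fire a reaction when the player's gaze rests on a target.

    A generic dwell-time mechanism; the game's actual reaction logic
    is not described in the abstract.
    """
    def __init__(self):
        self._target = None
        self._since = 0.0

    def update(self, gazed_target, now=None):
        now = time.monotonic() if now is None else now
        if gazed_target != self._target:
            self._target, self._since = gazed_target, now
            return None
        if gazed_target is not None and now - self._since >= DWELL_SECONDS:
            self._since = now          # rearm after firing
            return gazed_target        # e.g. a creature that reacts to attention
        return None

trigger = GazeDwellTrigger()
print(trigger.update("fish", now=0.0), trigger.update("fish", now=1.0))
# -> None fish
```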

Real-time visual representations for mobile mixed reality remote collaboration

In this study we present a mixed-reality-based mobile remote collaboration system that enables an expert to provide real-time assistance over a physical distance. Using Google ARCore position tracking, we integrate the keyframes captured by an external depth sensor attached to the mobile phone into a single 3D point-cloud data set that presents the local physical environment in the VR world. This captured local scene is then wirelessly streamed to the remote side for the expert to view in a mobile VR headset (HTC VIVE Focus). The remote expert can thus immerse themselves in the VR scene and provide guidance as if sharing the same work environment with the local worker. In addition, the remote guidance is streamed back to the local side as an AR cue overlaid on the local video see-through display. By simulating the face-to-face co-working experience, our system lets a remote expert guide a local worker through physical tasks in a large-scale workspace more naturally and efficiently from a distance.
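
A minimal sketch of the keyframe-integration step under assumed data layouts: unprojecting one depth image with calibrated intrinsics and transforming it by the ARCore pose to accumulate world-space points. The function and parameter names are illustrative, not the system's actual API.

```python
import numpy as np

def keyframe_to_world_points(depth_m, K, cam_to_world):
    """Unproject one depth keyframe into world-space points.

    depth_m      -- HxW depth image in meters from the external sensor
    K            -- 3x3 intrinsic matrix (assumed calibrated against
                    the phone camera tracked by ARCore)
    cam_to_world -- 4x4 sensor pose from ARCore position tracking
    """
    h, w = depth_m.shape
    xs, ys = np.meshgrid(np.arange(w), np.arange(h))
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    z = depth_m.ravel()
    x = (xs.ravel() - cx) * z / fx
    y = (ys.ravel() - cy) * z / fy
    pts_cam = np.stack([x, y, z, np.ones_like(z)])  # 4xN homogeneous
    pts_world = (cam_to_world @ pts_cam)[:3].T      # Nx3
    return pts_world[z > 0]                         # drop invalid depth

# Accumulating such keyframes yields the single point cloud that is
# streamed to the remote expert's VR headset.
K = np.array([[500.0, 0, 64], [0, 500.0, 48], [0, 0, 1]])
cloud = keyframe_to_world_points(np.full((96, 128), 2.0), K, np.eye(4))
print(cloud.shape)  # (12288, 3)
```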

Self-umbrelling turns over subjective direction of gravity

Self-umbrelling is a head-mounted display (HMD) interaction system that provides an experience approximating an out-of-body experience (OBE) involving the reversal of the subjective perception of the direction of gravity. Specifically, opening an umbrella while lying on one's back switches one's view from the first-person perspective (1PP) to a third-person perspective (3PP) originating from a position just above the supine body. In addition, the base of the 3PP moves upward every time the umbrella is opened. Thus, through the periodic action of opening the umbrella, the 1PP offers the experience of blowing something away, while the 3PP offers that of being blown away. This interaction is expected to activate a potentially hidden cognitive function related to OBEs, bringing up an important subject for designing HMD interaction between a player and an avatar.
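
A minimal sketch of the perspective logic as described: opening the umbrella switches the view to a 3PP camera above the supine body and raises that camera's base each time. The step height is an assumption, as is the behavior on closing the umbrella.

```python
STEP_M = 0.5  # assumed rise of the 3PP viewpoint per umbrella opening

class SelfUmbrelling:
    def __init__(self):
        self.perspective = "1PP"   # start from the first-person view
        self.view_height_m = 0.0   # 3PP camera height above the body

    def on_umbrella_opened(self):
        # Opening the umbrella switches to the third-person view and
        # moves its base upward, as described in the abstract.
        self.perspective = "3PP"
        self.view_height_m += STEP_M

    def on_umbrella_closed(self):
        # Assumed: closing returns to 1PP so the action can be periodic.
        self.perspective = "1PP"

s = SelfUmbrelling()
s.on_umbrella_opened(); s.on_umbrella_closed(); s.on_umbrella_opened()
print(s.perspective, s.view_height_m)  # 3PP 1.0
```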

Space fusion: context-aware interaction using 3D scene parsing

Context-aware interaction (interaction that varies with scene semantics or object categories) is an important element in making mixed reality experiences more immersive and realistic. However, few mixed reality applications provide this kind of interaction, since it is difficult to densely recognize all objects in a real-world scene in 3D.

In this work, we present a 3D scene parsing system that combines semantic segmentation with visual Simultaneous Localization and Mapping (SLAM). The system reconstructs and recognizes a real indoor scene as a dense point cloud with categorical labels in real time. We also present a context-aware mixed reality application that utilizes the parsing system. Users can import their own room into the mixed reality world and interact with a virtual robot in their room through a head-mounted display (HMD). The virtual robot behaves according to each real object's category; our 3D scene parsing thus realizes context-aware interaction.
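
A minimal sketch of one label-fusion step under assumed data layouts: SLAM map points are projected into the current frame and take on the segmentation network's per-pixel class labels. The real system presumably also aggregates labels over many frames; the names here are illustrative.

```python
import numpy as np

def label_points(points_world, world_to_cam, K, seg_labels):
    """Attach per-pixel semantic labels to SLAM map points.

    points_world -- Nx3 map points from visual SLAM
    world_to_cam -- 4x4 world-to-camera transform for the current frame
    K            -- 3x3 camera intrinsics
    seg_labels   -- HxW integer class map from semantic segmentation
    """
    h, w = seg_labels.shape
    n = len(points_world)
    pts = np.c_[points_world, np.ones(n)]   # Nx4 homogeneous
    cam = (world_to_cam @ pts.T)[:3]        # 3xN camera coordinates
    labels = np.full(n, -1)                 # -1 = unlabeled
    front = cam[2] > 1e-6                   # points in front of the camera
    u = (K[0, 0] * cam[0, front] / cam[2, front] + K[0, 2]).round().astype(int)
    v = (K[1, 1] * cam[1, front] / cam[2, front] + K[1, 2]).round().astype(int)
    inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    idx = np.flatnonzero(front)[inside]
    labels[idx] = seg_labels[v[inside], u[inside]]
    return labels

K = np.array([[500.0, 0, 64], [0, 500.0, 48], [0, 0, 1]])
pts = np.array([[0.0, 0.0, 2.0]])                       # one point 2 m ahead
seg = np.zeros((96, 128), dtype=int); seg[48, 64] = 7   # e.g. class "chair"
print(label_points(pts, np.eye(4), K, seg))             # [7]
```

The labeled points are what lets the virtual robot select a behavior per object category, which is the context-aware interaction described above.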

"The player is the star": futuristic vision for mixed reality world developing mixed reality game - PAC IN TOWN

"The player is the star" - Futuristic vision for Mixed Reality World is our (BANDAI NAMCO Studios Inc.) [Bandai Namco 2018] achievement and showing future vision for mixed reality entertainment.

We developed a mixed reality game called "PAC IN TOWN" for the Microsoft HoloLens mixed reality device [Microsoft HoloLens 2018].

Trajectile command

This is a quick overview of Trajectile Command: a free virtual reality arcade game that runs in a web browser.