Emerging Technologies Fact Sheet
Chair: Tomoe Moriyama, Museum of Contemporary Art, Tokyo/The University of Tokyo
Co-Chair: Adrian David Cheok, National University of Singapore/Keio University
Conference: Wednesday 10 December - Saturday 13 December
Exhibition: Thursday 11 December - Saturday 13 December
The Facts
- The SIGGRAPH Asia 2008 Emerging Technologies programme shares its theme with the Art Gallery. The title of the show, Synthesis, refers to a kind of chaos, a random transformation of structured knowledge and exploration in digital art and emerging technologies.
- SIGGRAPH Asia 2008 Emerging Technologies presents a rich resource of delicate, aesthetic technologies and vivid, innovative ideas that define the future of computer graphics and interactive techniques.
- Ten juried works, selected from more than 50 submissions, and eight curated works will be displayed at SIGGRAPH Asia 2008.
- Seventy-six percent of the works come from Asia.
- SIGGRAPH Asia 2008 Emerging Technologies also features a curated exhibition, Asian Hybrid Art and Technologies, and other related events, including a joint show with the Computer Animation Festival and collaborative panels with the Japan Media Arts Festival.
A Quote from the SIGGRAPH Asia 2008 Emerging Technologies Chair:
"We have combined the two programmes, Art Gallery and Emerging Technologies, under the title Synthesis this year to merge their experimental histories. There are over 30 brilliant works altogether, and in all the projects, attendees can enjoy the vast potential and rich diversity of hybrid art and technologies. We are so happy to show you this fruitful result of emerging creativity, here in Singapore, at the melting pot of culture and business."
A Quote from the SIGGRAPH Asia 2008 Emerging Technologies Co-Chair:
"For the first time in Asia, digital-technology enthusiasts are experiencing and interacting with the bleeding edge of interactive technologies in SIGGRAPH Emerging Technologies, where attendees can explore the frontier technologies that are redefining interactive and digital media, and human-computer collaboration. In SIGGRAPH Asia 2008 Emerging Technologies, scientists, engineers, and inventors demonstrate new and speculative interactivity in robotics, mixed reality, physical interfaces, tangible interaction, design, and entertainment. We are sure these demos will trigger more quantum-step innovations and inventions in the future."
SIGGRAPH Asia 2008 Emerging Technologies highlights include:
Heaven's Mirror: Mirror Illusion Realised Outside of the Mirror
With this system, users experience a mirror illusion through three modalities of feedback (haptic, visual, and auditory) and perceive a boundary-less transition between the real world and the world inside the mirror.
Enhanced Life
Sometimes, mirrors provide illusions that distort physical laws. In Heaven's Mirror, the illusions become "real" as users' visual, tactile, and auditory senses are immersed in the world inside the mirror. This approach opens new possibilities for using mirrors in virtual reality.
Goals
To allow users to perceive a seamless transition between the inside and outside of the mirror.
Innovations
Heaven's Mirror focuses on the physical relationship between the real world and the world inside the mirror. It takes a mirror illusion and extends it into the real world, so that users experience the illusion through three modalities of feedback: haptic, visual, and auditory.
Contributors
Seunghyun Woo
Takafumi Aoki
Hironori Mitake
Naoki Hashimoto
Makoto Sato
Tokyo Institute of Technology
Balance Ball Interface
A user-interface device for exercise and entertainment. As users move while sitting on the balance ball, the system captures their motion and behaviour.
Enhanced Life
This easy-to-use, inexpensive interface system liberates people from sedentary, unhealthy computer work. It is a surprising new concept that changes our assumptions about chairs and interfaces, and promotes a new reality.
Goal
To develop a game-interface device that acquires complex movements of a human body in a sitting posture.
Innovations
This interface technology converts information from an acceleration sensor and a pressure sensor into posture information. Movements of the upper body and the waist are calculated from these inputs and converted into whole-body movements.
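The conversion described above can be sketched in a few lines. The sensor layout, function names, and the simple tilt formulas below are illustrative assumptions, not the exhibit's actual algorithm:

```python
import math

def estimate_posture(accel, pressures):
    """Estimate lean posture from one 3-axis accelerometer reading (in g)
    and four pressure readings assumed to sit front/back/left/right under
    the ball. All names and formulas here are illustrative assumptions."""
    ax, ay, az = accel
    # Ball tilt relative to gravity, in degrees.
    pitch = math.degrees(math.atan2(ax, math.sqrt(ay**2 + az**2)))
    roll = math.degrees(math.atan2(ay, math.sqrt(ax**2 + az**2)))
    # Centre of pressure: positive x = leaning forward, positive y = left.
    front, back, left, right = pressures
    total = (front + back + left + right) or 1.0  # avoid divide-by-zero
    cop_x = (front - back) / total
    cop_y = (left - right) / total
    return {"pitch": pitch, "roll": roll, "cop": (cop_x, cop_y)}
```

A game engine could then map the returned tilt and centre-of-pressure values onto whole-body avatar movements, which is the conversion step the exhibit describes.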
Contributor
Masasuke Yasumoto
Graduate School of Film and New Media, Tokyo University of the Arts
TransCAIP: Live Transmission of Light Field from a Camera Array to an Integral Photography Display
TransCAIP provides a real-time 3D visual experience by using an array of 64 cameras and an integral photography display with 60 viewing directions. The live 3D scene in front of the camera array is reproduced by the full-color, full-parallax auto-stereoscopic display with interactive control of viewing parameters.
Enhanced Life
This project demonstrates the potential of live 3D TV systems in a prototype system. The core technology is a fast and flexible data-conversion method from the multi-camera images to the integral photography format. Because the conversion method is applicable to general combinations of camera arrays and integral photography (and multi-view 3D) displays, it could be an essential technology for future 3D TV systems.
Goals
The overall goal is to develop a practical live 3D TV system that reproduces full-color 3D video of a scene, with both horizontal and vertical parallax, in real time. The system gives users the impression of observing the 3D scene through a window, without requiring them to wear special glasses. The main technical goal is a fast and flexible data-conversion method between asymmetric input and output devices. The method runs in real time (more than five frames per second) on a single PC using GPGPU techniques, and it lets users interactively control the viewing parameters of the displayed 3D images to enhance the 3D visual experience.
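As a rough illustration of the asymmetric input-to-output mapping (64 cameras resampled into 60 viewing directions), here is a toy nearest-view resampler. The function name and the 1-D nearest-neighbour simplification are assumptions for illustration; the actual system performs per-pixel resampling of a 2-D camera array on the GPU:

```python
def resample_to_integral(camera_images, num_views):
    """Toy sketch: map a 1-D row of equally spaced camera views onto
    num_views output viewing directions by nearest-view selection.
    Assumes num_views >= 2; the real system interpolates per pixel."""
    n_cams = len(camera_images)
    out = []
    for v in range(num_views):
        # Place output direction v proportionally along the camera row,
        # then pick the nearest input camera (+0.5 rounds half up).
        cam_idx = int(v * (n_cams - 1) / (num_views - 1) + 0.5)
        out.append(camera_images[cam_idx])
    return out
```

The point of the sketch is the asymmetry: input and output view counts need not match, so the conversion step must resample, which is why a fast method is essential for real-time operation.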
Vision
Three-dimensional TV is a promising technology for providing a more natural and intuitive perception of 3D scenes than existing two-dimensional TV. In particular, live 3D TV systems, which transmit 3D visual information in real time, could have a significant impact on many applications in communication, broadcasting, and entertainment in the near future.
Contributors
Yuichi Taguchi
The University of Tokyo
Takafumi Koike
The University of Tokyo, Hitachi Ltd.
Keita Takahashi
Takeshi Naemura