Technology can enhance human abilities. Eyeglasses and hearing aids, for example, support an individual's abilities to see and hear. If haptic sensations can likewise be enhanced and augmented, we could convey the haptic sensations of a craftsman and enrich the touch experiences of daily life.
We present HapticAid, a wearable system that enhances haptic sensations. We envision three application areas for the HapticAid system: Enhance - amplify and enhance haptic sensations; Enchant - embed haptic sensations in otherwise passive objects; and Empathize - communicate haptic experiences, allowing users to connect and empathize with others' haptic sensations.
STEP is a shoe that lets you feel virtual reality through your feet. Tactile actuators and sensors embedded in the shoe transmit various haptic sensations in response to your foot movements. Modulating the strength and rhythm of the actuators allows the system to simulate, for instance, a range of terrain textures depending on foot pressure against the ground. STEP provides an immersive experience in gaming, music, and sports through haptic sensation. Enjoy the addition of “foot sensation” like never before.
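As an illustration of how actuator strength and rhythm might be modulated by foot pressure, here is a minimal sketch; the `actuator_signal` function and the terrain profiles are hypothetical illustrations, not STEP's actual mapping:

```python
import math

# Illustrative terrain profiles (assumed values, not STEP's actual mapping):
# each terrain has a baseline amplitude and a pulse frequency in Hz.
TERRAINS = {
    "gravel": {"base": 0.2, "freq": 30.0},  # fast, rough pulses
    "sand":   {"base": 0.1, "freq": 8.0},   # slow, soft pulses
    "snow":   {"base": 0.3, "freq": 15.0},
}

def actuator_signal(pressure, terrain, t):
    """Drive value for a tactile actuator at time t (seconds).

    Amplitude grows with foot pressure (normalised to 0..1), so a harder
    step against the ground produces a stronger vibration.
    """
    p = TERRAINS[terrain]
    amplitude = p["base"] + 0.7 * min(max(pressure, 0.0), 1.0)
    return amplitude * math.sin(2 * math.pi * p["freq"] * t)
```

In a real device this signal would be sampled at the actuator's drive rate and the pressure value read continuously from the in-sole sensors.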
We propose “Layered Telepresence”, a novel method of experiencing simultaneous multi-presence by blending, according to the user's eye gaze and perceptual awareness, real-time audio-visual information received from multiple telepresence robots. The system arranges the audio-visual streams received through the robots into a priority-driven layered stack: a weighted feature map is created for each layer based on the objects recognized in it using image-processing techniques; the most heavily weighted layer around the user's gaze is pushed into the foreground, while all other layers are pushed into the background, producing an artificial depth-of-field effect. The proposed method is not limited to robots: each layer could represent any audio-visual content, such as a video see-through HMD, a television screen, or even your PC screen, enabling true multitasking.
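A minimal sketch of the priority-driven stack under stated assumptions (the `Layer` structure, the feature-map representation, and the weighting are our own illustrations, not the system's actual implementation):

```python
from dataclasses import dataclass

@dataclass
class Layer:
    name: str
    feature_map: list  # 2D grid of per-region importance weights (assumed)

def gaze_weight(layer, gx, gy):
    # weight of the region the user is currently gazing at
    return layer.feature_map[gy][gx]

def arrange(layers, gx, gy):
    """Return layers foreground-first: the most heavily weighted layer at
    the gaze point comes to the front and the rest recede; a renderer
    would then blur the lower-ranked layers for the depth-of-field effect."""
    return sorted(layers, key=lambda l: gaze_weight(l, gx, gy), reverse=True)
```

Re-running `arrange` whenever the gaze point moves re-orders the stack, so shifting your eyes to a different robot's feed brings it forward.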
The emergence of head-mounted displays (HMDs) has enabled us to experience virtual environments in an immersive manner. At the same time, omnidirectional cameras, which capture real-life environments across all 360 degrees as still images or video, are also gaining attention. Using HMDs, we can view these captured omnidirectional images immersively, as though we were actually "being there". However, as a requirement for immersion, our view of these omnidirectional images in the HMD is usually presented in first-person view and limited by our natural field of view, i.e. we only see the fraction of the environment we are facing, while the rest of the 360-degree environment is hidden from view. This is even more problematic in telexistence situations, where the scene is dynamic, so setting a default facing direction for the HMD is impractical. We can often observe people wearing HMDs turning their heads frantically, trying to locate interesting occurrences in the omnidirectional environment they are viewing. We propose a "planet" view for visualizing a meta-view of the environment in which a user is immersed. Equirectangular images taken from an omnidirectional camera are transformed into "planets". The planets are then placed at the user's feet and become visible when the user looks down. Each planet captures the full omnidirectional scene, so users can easily obtain a whole view of the environment, including what is behind them. As these planets lie around the user's feet, we further suggest natural foot interaction with the planets to enhance the experience.
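The equirectangular-to-planet transform is essentially a stereographic projection with the nadir at the centre of the image. A minimal NumPy sketch (nearest-neighbour sampling; the function name and sizes are our own, not the paper's implementation):

```python
import numpy as np

def equirect_to_planet(equi, out_size=512):
    """Project an equirectangular image (H x W x 3) into a "little planet":
    the nadir (the ground directly below the camera) maps to the centre,
    the horizon to a circle of radius 1, and the sky to the outer region."""
    H, W = equi.shape[:2]
    ys, xs = np.mgrid[0:out_size, 0:out_size]
    u = (xs - out_size / 2) / (out_size / 2)   # output coords in [-1, 1]
    v = (ys - out_size / 2) / (out_size / 2)
    r = np.sqrt(u ** 2 + v ** 2)
    lon = np.arctan2(v, u)                     # angle around the planet
    lat = 2 * np.arctan(r) - np.pi / 2         # inverse stereographic
    # nearest-neighbour sample from the equirectangular source
    src_x = ((lon + np.pi) / (2 * np.pi) * (W - 1)).astype(int) % W
    src_y = ((np.pi / 2 - lat) / np.pi * (H - 1)).clip(0, H - 1).astype(int)
    return equi[src_y, src_x]
```

Rendering the result as a textured disc at the user's feet then gives the look-down meta-view described above.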
AnyOrbit is a 6DOF spatial navigation technique that utilises orbital trajectories. It can be applied to virtual and augmented reality (VR and AR), computer-aided design (CAD), data visualisation, and the consumption of other 3D content, including real-time 3D sports and video. The key invention is the use of spiral trajectories to navigate smoothly between orbital paths around different points of interest. We have demonstrated several regimes and use cases, including user-controlled, directed, and eye-tracked point-of-interest selection. The current direction of the research is to apply the technique to storytelling, data visualisation, and augmented-reality CAD systems.
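The spiral idea can be sketched in 2D as follows (a minimal illustration under our own assumptions, not the actual AnyOrbit implementation): the orbit centre and radius are blended while the orbital angle keeps advancing, so the camera spirals smoothly from one orbit onto the other instead of jump-cutting.

```python
import math

def spiral_transition(c0, r0, c1, r1, theta0, dtheta, steps):
    """Camera positions spiralling from an orbit (centre c0, radius r0)
    onto an orbit (centre c1, radius r1) around a new point of interest.
    Blending centre and radius while the angle keeps advancing avoids
    the discontinuity of teleporting between orbits."""
    path = []
    for i in range(steps + 1):
        t = i / steps                          # blend factor, 0 .. 1
        r = (1 - t) * r0 + t * r1
        cx = (1 - t) * c0[0] + t * c1[0]
        cy = (1 - t) * c0[1] + t * c1[1]
        th = theta0 + dtheta * i               # orbital motion never stops
        path.append((cx + r * math.cos(th), cy + r * math.sin(th)))
    return path
```

In the full 6DOF technique the same blending would apply to a 3D orbit, with the camera continuously oriented toward the interpolated point of interest.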
VR Sound World explores new sensory experiences made possible by new media technologies such as virtual reality. The project builds on previous music-visualisation work and adds a whole new dimension of full-body, high-fidelity haptic vibration provided by colleagues at Embodied Media, Keio Media Design. The virtual environment consists of a succession of interactive scenes built from procedural meshes that respond to music. A natural mapping between sound and colour enhances immersion and presence by presenting consistent stimuli in the visual and auditory sensory channels, creating a world where our senses are hyper-connected, inspired by synesthesia. While the sound creates the geometries and colours of the kaleidoscopic environment, the geometries in turn reach out and touch the user's body, producing vibrational stimuli that add another sensory channel through which the reality of the world becomes objective in the mind of the user. The result is a level of sensory integration not possible even in reality.
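One way such a sound-to-colour mapping could work (a hypothetical sketch; the installation's actual mapping is not described here) is to turn a frequency's position on a logarithmic scale into a hue:

```python
import colorsys
import math

def freq_to_rgb(freq_hz, low=20.0, high=20000.0):
    """Map an audio frequency to an RGB colour: low frequencies become
    red, high frequencies become violet, spaced on a log scale to match
    how pitch is perceived."""
    x = (math.log(freq_hz) - math.log(low)) / (math.log(high) - math.log(low))
    x = min(max(x, 0.0), 1.0)
    hue = 0.8 * x              # 0.0 = red .. 0.8 = violet
    return colorsys.hsv_to_rgb(hue, 1.0, 1.0)
```

Driving mesh colours from the dominant frequencies of the live audio in this way keeps the visual and auditory channels consistent, which is what supports the synesthesia-like effect.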