Projects
Embodied Media - Project Archive

An overview of the research prototypes, public exhibitions, industrial products, and modern organizational designs that enhance our everyday lives.

At Embodied Media, we work on developing technologies that enrich everyday life, designing products, and conducting research that proposes future forms of expression and social systems.

Haptic Design

Every day we come into contact with all kinds of objects and people, and each stimulus brings joy or sorrow, sometimes even moving us to tears...
Our bodies are a mass of sensors. The design of touch (HAPTIC), the sense we feel with the whole body and one of the five human senses, has long been valued in design practice: graphic designers care about the thickness and material of paper, fashion designers pay attention to how a garment feels even more than how it looks, and UX designers consider the effects of physical contact. Strangely, however, this kind of design has never been given a name or been systematized. Just as the design of sight has been established as visual design and the design of hearing as sound design, we name the field that designs touch, the sense that connects people and the world through the body, HAPTIC DESIGN, and take on the challenge of fusing research and design.

Haptics
2016
Haptic Aid

Technology can enhance human abilities. For example, eyeglasses and hearing aids support an individual's ability to see and hear. If haptic sensation could be enhanced and augmented in the same way, we could convey the touch sensitivity of a skilled craftsman and enrich the touch experiences of daily life.

We present HapticAid, a wearable system that enhances haptic sensations. We envision three application areas for the HapticAid system: Enhance - amplifying and strengthening haptic sensations; Enchant - embedding haptic sensations in otherwise passive objects; and Empathize - communicating haptic experiences so that users can connect and empathize with others' haptic sensations.

Technology can extend human abilities. Eyeglasses and hearing aids, for example, augment an individual's sight and hearing. If, like eyeglasses for vision or hearing aids for hearing, the sense of touch could be augmented and enhanced, we might easily acquire the tactile sensitivity of an expert, recover a sense of touch dulled by aging, or make the many sensations we encounter in daily life more enjoyable.

Like eyeglasses for vision and hearing aids for hearing, HapticAid is our research into a "haptic aid" that aims to restore senses that have weakened or been lost. The research on HapticAid proceeds along three axes: Enhance, which amplifies haptic sensations; Enchant, which casts a spell of touch over the objects we handle; and Empathize, which shares one's experiences with others.

Haptics
2016
Step

STEP is a shoe that lets you feel virtual reality through your feet. Tactile actuators and sensors embedded in the shoe transmit various haptic sensations in response to your foot movements. Manipulating the strength and rhythm of the actuators makes it possible to simulate, for instance, a range of terrain textures depending on how the foot presses against the ground. STEP gives you an immersive experience in games, music, and sports through haptic sensation. Enjoy the addition of "foot sensation" like never before.

STEP is a VR shoe that virtually creates haptic sensations for the feet. Multiple vibrators and sensors built into the shoe generate a variety of haptic experiences linked to the sensed movement of the user's feet. For example, by controlling the strength and rhythm of the vibration from the multiple vibrators, the texture of the ground felt when stepping down can be changed, realizing new haptic experiences that add a greater sense of presence and immersion to games, music, and sports. Enjoy a "foot sensation" like never before with STEP.
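
As a rough sketch of the kind of mapping involved, the example below drives a vibration amplitude and pulse rhythm from a normalized foot-pressure reading; the terrain profiles, value ranges, and function name are hypothetical illustrations, not the actual STEP firmware.

```python
import math

# Hypothetical terrain profiles: base amplitude (0-1) and pulse rate in Hz.
TERRAINS = {
    "gravel": {"amplitude": 0.8, "pulse_hz": 18.0},
    "grass":  {"amplitude": 0.3, "pulse_hz": 6.0},
    "sand":   {"amplitude": 0.5, "pulse_hz": 3.0},
}

def actuator_output(terrain: str, pressure: float, t: float) -> float:
    """Return a drive level (0-1) for one vibrator.

    pressure -- normalized foot pressure from the insole sensor (0-1)
    t        -- current time in seconds
    """
    profile = TERRAINS[terrain]
    # Stronger footfalls produce stronger vibration.
    amplitude = profile["amplitude"] * pressure
    # A simple pulsed waveform stands in for the terrain's rhythm.
    pulse = 0.5 * (1.0 + math.sin(2.0 * math.pi * profile["pulse_hz"] * t))
    return amplitude * pulse

# Example: drive level on gravel, mid-stance, at t = 0.1 s.
print(actuator_output("gravel", pressure=0.7, t=0.1))
```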

Haptics
2016
Layered Presence

We propose “Layered Telepresence”, a novel method of experiencing simultaneous multi-presence, in which real-time audio-visual information received from multiple telepresence robots is blended according to the user's eye gaze and perceptual awareness. The system arranges the audio-visual information received through the robots into a priority-driven layered stack: a weighted feature map is created for each layer based on the objects recognized in that layer using image-processing techniques, the most heavily weighted layer around the user's gaze is pushed to the foreground, and all other layers are pushed to the background, providing an artificial depth-of-field effect. The proposed method is not limited to robots; each layer can represent any audio-visual content, such as a video see-through HMD, a television screen, or even a PC screen, enabling true multitasking.
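
A minimal sketch of the layer-selection and compositing step is given below; the weighting window, blur strength, and helper names are assumptions for illustration rather than the published implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def composite_layers(frames, weight_maps, gaze_xy, blur_sigma=6.0):
    """Composite telepresence layers around the user's gaze.

    frames      -- list of HxWx3 float images, one per robot/layer
    weight_maps -- list of HxW feature/saliency maps, one per layer
    gaze_xy     -- (x, y) gaze position in image coordinates
    """
    x, y = gaze_xy
    # Score each layer by its feature weight in a window around the gaze point.
    scores = [wm[max(y - 20, 0):y + 20, max(x - 20, 0):x + 20].mean()
              for wm in weight_maps]
    fg = int(np.argmax(scores))  # highest-weighted layer becomes the foreground

    # Background layers are defocused to fake a depth-of-field effect,
    # then averaged; the foreground layer is blended on top at full focus.
    bg = [gaussian_filter(f, sigma=(blur_sigma, blur_sigma, 0))
          for i, f in enumerate(frames) if i != fg]
    background = np.mean(bg, axis=0) if bg else np.zeros_like(frames[fg])
    return 0.7 * frames[fg] + 0.3 * background
```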

Telexistence
2016
Marathon Runner

This system makes it possible to share a marathon runner's point of view in real time. Video and audio from an omnidirectional camera worn by the runner are transmitted live and can be viewed from the runner's perspective through a head-mounted display.
For real-time delivery of the omnidirectional video and audio over an ordinary LTE connection, the system uses SkyWay, a WebRTC platform provided by NTT Communications.
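
For illustration only, the sketch below shows how the sending side of such a stream might be set up with the generic Python WebRTC library aiortc; the actual system relies on SkyWay for signalling, and the camera device path and the omitted offer/answer exchange here are placeholders.

```python
import asyncio
from aiortc import RTCPeerConnection
from aiortc.contrib.media import MediaPlayer

async def publish_stream():
    pc = RTCPeerConnection()

    # Capture from the omnidirectional camera; the device path is a placeholder.
    player = MediaPlayer("/dev/video0", format="v4l2")
    if player.audio:
        pc.addTrack(player.audio)
    if player.video:
        pc.addTrack(player.video)

    # Create an SDP offer describing the outgoing audio/video streams.
    offer = await pc.createOffer()
    await pc.setLocalDescription(offer)

    # In the real system, the offer/answer exchange is handled by the
    # SkyWay signalling service; that step is omitted here.
    return pc.localDescription.sdp

# Usage: asyncio.run(publish_stream())
```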

Telexistence
2016
VR planet

The emergence of head-mounted displays (HMDs) has enabled us to experience virtual environments in an immersive manner. At the same time, omnidirectional cameras, which capture real-life environments in all directions as still images or video, are also attracting attention. Using HMDs, we can view these captured omnidirectional images immersively, as though we were actually "being there". However, as a requirement for immersion, our view of these omnidirectional images in the HMD is usually presented as a first-person view and limited by our natural field of view: we only see the fraction of the environment we are facing, while the rest of the 360-degree environment is hidden from view. This is even more problematic in telexistence situations, where the scene is dynamic and setting a default facing direction for the HMD is impractical. We often observe people wearing HMDs turning their heads frantically, trying to locate interesting occurrences in the omnidirectional environment they are viewing. We propose a "planet" view for visualizing a meta-view of the environment in which a user is immersed. Equirectangular images taken from an omnidirectional camera can be transformed into "planets". The planets are then placed at the user's feet and become visible when the user looks down. Because a planet captures the full omnidirectional scene, users can easily obtain a whole view of the environment, including what is behind them. As these planets are around the user's feet, we further suggest natural foot interaction with the planets to enhance the experience.
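
The "planet" image is essentially a stereographic (little-planet) projection of the equirectangular panorama. The sketch below shows one way such a transform could be computed; the output resolution, zoom factor, and function name are chosen here for illustration.

```python
import numpy as np

def little_planet(equirect: np.ndarray, out_size: int = 512, zoom: float = 1.0) -> np.ndarray:
    """Project an equirectangular panorama (HxWx3) to a 'little planet' view."""
    h, w = equirect.shape[:2]

    # Output pixel grid, centred at the origin and normalized to [-1, 1].
    ys, xs = np.mgrid[0:out_size, 0:out_size]
    u = (xs - out_size / 2) / (out_size / 2)
    v = (ys - out_size / 2) / (out_size / 2)

    r = np.sqrt(u**2 + v**2) * zoom   # distance from the image centre
    theta = np.arctan2(v, u)          # azimuth around the planet

    # Inverse stereographic projection: radius -> polar angle on the sphere.
    phi = 2.0 * np.arctan(r)          # 0 at the nadir (planet centre)

    # Map spherical coordinates back to equirectangular pixel coordinates.
    src_x = ((theta + np.pi) / (2 * np.pi) * (w - 1)).astype(int) % w
    src_y = np.clip((phi / np.pi) * (h - 1), 0, h - 1).astype(int)

    return equirect[src_y, src_x]
```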

XR
2016
AnyOrbit

AnyOrbit is a 6-DOF spatial navigation technique that utilises orbital trajectories. It can be applied to virtual and augmented reality (VR and AR), computer-aided design (CAD), data visualisation, and the consumption of other 3D content, including real-time 3D sports and video. The key invention is the use of spiral trajectories to navigate smoothly between orbital paths around different points of interest. We have demonstrated several regimes and use cases, including user-controlled point-of-interest selection, directed points of interest, and eye-tracked point-of-interest selection. The current direction of the research is to apply the technique to storytelling, data visualisation, and augmented-reality CAD systems.
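
A minimal sketch of the underlying idea follows: the camera keeps sweeping in angle while its orbit centre and radius are interpolated toward the new point of interest, producing a spiral rather than an abrupt cut. The parameterization and names are illustrative, not the published formulation.

```python
import numpy as np

def spiral_transition(old_center, new_center, old_radius, new_radius,
                      start_angle, angular_speed, duration, steps=100):
    """Camera positions spiralling from one orbit to another (in the XZ plane).

    Centers are 3-element numpy arrays; angles are in radians.
    """
    positions = []
    for i in range(steps):
        t = i / (steps - 1)                        # 0 -> 1 over the transition
        center = (1 - t) * old_center + t * new_center
        radius = (1 - t) * old_radius + t * new_radius
        angle = start_angle + angular_speed * t * duration
        offset = np.array([np.cos(angle), 0.0, np.sin(angle)]) * radius
        positions.append(center + offset)
    return np.array(positions)

# Example: spiral from an orbit around the origin to one around (5, 0, 0).
path = spiral_transition(np.array([0.0, 0.0, 0.0]), np.array([5.0, 0.0, 0.0]),
                         old_radius=3.0, new_radius=1.5,
                         start_angle=0.0, angular_speed=2.0, duration=2.0)
print(path.shape)  # (100, 3)
```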

XR
2016
VR Sound World

VR Sound World explores the new sensory experiences we can have using new media technologies such as virtual reality. The project builds on previous music-visualisation work and adds a whole new dimension of full-body, high-fidelity haptic vibration provided by colleagues at Embodied Media, Keio Media Design. The virtual environment consists of a succession of interactive environments made up of procedural meshes that respond to music. A natural mapping between sound and colour enhances immersion and presence by presenting consistent stimuli in the visual and auditory sensory channels, creating a world where our senses are hyper-connected, inspired by synesthesia. While the sound creates the geometries and colours of the kaleidoscopic environment, the geometries in turn reach out and touch the user's body, producing vibrational stimuli that add another sensory channel through which the reality of the world becomes objective in the mind of the user. The result is a level of sensory integration not possible even in reality.
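
As a toy illustration of the kind of sound-to-colour mapping described above, the sketch below maps the dominant frequency of a short audio frame onto a hue; the frequency range and mapping are assumptions, not the project's actual implementation.

```python
import colorsys
import numpy as np

def frame_to_rgb(samples: np.ndarray, sample_rate: int = 48000) -> tuple:
    """Map one audio frame to an RGB colour via its dominant frequency."""
    spectrum = np.abs(np.fft.rfft(samples * np.hanning(len(samples))))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    dominant = freqs[np.argmax(spectrum)]

    # Map roughly 20 Hz - 10 kHz onto the hue circle (log scale).
    hue = np.clip(np.log10(max(dominant, 20.0) / 20.0) / np.log10(500.0), 0.0, 1.0)
    return colorsys.hsv_to_rgb(hue, 0.9, 1.0)

# Example: a 440 Hz tone maps to a particular hue.
t = np.arange(1024) / 48000.0
print(frame_to_rgb(np.sin(2 * np.pi * 440.0 * t)))
```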

XR
2016