About the Workshop

Virtual reality (VR) and augmented reality (AR) are currently two of the "hottest" topics in the IT industry. Many consider them the next wave in computing, with an impact similar to the shift from desktop systems to mobiles and wearables. This hype is rooted in technological improvements (such as mass-produced high-resolution mobile displays) that have resulted in cheaper, high-performance devices for the consumer market. Examples include Facebook's Oculus Rift and Microsoft's HoloLens for VR and AR, respectively.

Despite this progress, we are still far from the ultimate goal of creating new virtual environments, or augmentations of existing ones, that feel and react like their real counterparts. Many challenges and open research questions remain, mostly in the areas of multimodality and interaction. For example, current setups predominantly focus on the visual and auditory senses, neglecting other modalities such as touch and smell that are an integral part of how we experience the real world around us. Likewise, it is still an open question how best to interact and communicate with a virtual world or with virtual objects in AR.

Multimodal interaction offers great potential not only to make AR and VR experiences more realistic, but also to provide more powerful and efficient means of interacting with virtual and augmented worlds. This workshop aims to explore these opportunities and therefore invites contributions on all kinds of work related to interaction or multimodality in the context of VR and AR computing.

Call for Papers

Important Dates

2016-07-15
Submission deadline

Topics of Interest

  • Multisensory experiences and improved immersion, including audio-visual installations, haptics/tactile, smell/olfactory sensations, taste/gustation (contributions focusing on a single, experience-enhancing sense are also welcome), perception of virtual objects, etc.

  • Multimedia & sensory input, including affective computing and human behavior sensing for VR/AR, multisensory analysis, integration, and synchronization, speech, gestures, tracking for AR/VR, virtual humans and avatars, etc.

  • Multimodal output, including smart and ambient environments, multimedia installations, etc.

  • Interaction design & new approaches for interaction in AR/VR, incl. tangible interfaces, multimodal communication & collaborative experiences, social aspects in AR/VR interaction, gesture-based interaction design, 3D interaction, advanced interaction devices, etc.

  • System design & infrastructure for multimodal AR/VR, including real-time and other performance issues, rendering of different modalities, distributed and collaborative architectures, etc.

  • Applications, including use cases, prototypes, or proofs of concept for new and innovative approaches in both serious and leisure domains.

Important Dates

  • November 16, 2016: Conference date

  • July 15, 2016: Submission deadline

  • November 16, 2016: Registration deadline
