HOSTED BY
The HCI Society of Korea
ORGANIZED BY
KAIST
ARRC
NAIST
SPONSORED BY
KAIST
sigmr
VRSJ sigMR
ieie
IEIE sigAH
The HCI Society of Korea
HCI Korea sig10+
  Program

The International Symposium on Ubiquitous Virtual Reality (ISUVR) is an international symposium on the emerging paradigm of virtual reality in the ubiquitous computing era, which accelerates the convergence of multidisciplinary technologies to enrich the quality of the user experience. In keeping with this trend, ISUVR has for many years been a successful event where participants from different backgrounds exchange research ideas about "Ubiquitous VR". It is therefore timely to initiate a forum to discuss and share ideas on "Augmented Experience in Ubiquitous VR", which combines various AR/MR/VR technologies to enhance human abilities.

 * Sessions :  Check the Program page above for the day, time, and session of your presentation.

 * Timing :

- (O)  15 minutes have been allocated to each oral presentation.
          This includes speaker setup, the talk, and the question period. Please leave at least 2 minutes for questions.
- (P)  3 minutes have been allocated to each poster presentation. The poster will also be hung on the wall.

 * Important :    

Please be in the lecture hall at least 15 minutes before the start of your session. Introduce yourself to your session chair, get any last-minute instructions from them, and make sure your laptop works. In particular, all speakers must confirm before the session begins that the laptop and projector work together.



  Special Session

  ●   O1. Kai Kunze  (Japan)

             : Open Eyewear and beyond - From Superhuman Sports to Amplifying Human Senses
  ●   O2. Jason Orlosky  (Japan)

             : Augmented Reality as an Entry Point to Parallel Consciousness
  ●   O3. Jinshi Cui  (China)

             : Children's behavior computation for social inhibition research
  ●   O4. Thammathip Piumsomboon  (Australia)

             : Sharing What You Feel and Interacting with What You See
  Session 1   Interface in AR/VR

  Session Chair : Daniel Saakes, KAIST

  ●   O5. Sangwoo Noh, Juyeon Han, Jinwoo Jo and Ahyoung Choi. (Korea)
             : Virtual Companion Based Mobile User Interface: An Intelligent and Simplified
  ●   O6. Seokwon Lee, Kihong Park, Junyeop Lee and Kibum Kim. (Korea)
             : User Study of VR Basic Controller and Data Glove as Hand Gesture Inputs in VR Games
  ●   O7. Hyeongcho Yoon, Soobin Lee, Jungyong Park, Younghwan Choi and Sanghyun Cho. (Korea)
             : Development of Racing Game Using Motion Seat
  ●   O8. Gary Ng and Daniel Saakes. (Korea)
             : First-Person Interface for Designing Video Games in Augmented Reality
  ●   O9. Varunyu Fuvattanasilp. (Japan)
             : 3D object arrangement in handheld augmented reality application using gravity information
  ●   P1. Hyunjin Lee and Woontack Woo. (Korea)
            : Mobile UI design for children to enhance positive emotions
  ●   P2. Hayun Kim and Woontack Woo. (Korea)
            : Spatial and Mobile Augmented Reality for Cultural Heritage Learning in Classroom
  Session 2   3D Interaction/Vision in AR/VR

  Session Chair : Ahyoung Choi, Gachon University

  ●   O10. Gary Ng, Joon Gi Shin and Daniel Saakes. (Korea)
              : Handheld Augmented Reality for Couples Designing their Living Room
  ●   O11. Mankyu Sung and Sunghwan Choi. (Korea)
              : Selective Anti-Aliasing for Virtual Reality Based on Saliency Map
  ●   O12. Heekyu Park, Soobeom Jeong, Taehoon Kim, Daegeun Youn and Kibum Kim. (Korea)
              : Visual Representation of Gesture Interaction Feedback in Virtual Reality Games
  ●   O13. Hikari Takehara, Tomokazu Sato, Norihiko Kawai, Kiyoshi Kiyokawa and Naokazu Yokoya. (Japan)
              : Free-viewpoint indirect augmented reality
  ●   O14. Oral Kaplan. (Japan)
              : In The Midst of Informatics and Sports
  ●   P3. Gabyong Park and Woontack Woo. (Korea)
            : Efficient Finger Tracking with Semantic Segmentation for Playing an Augmented Piano
  ●   P4. Noh-Young Park and Woontack Woo. (Korea)
            : Image-based bidirectional VR authoring framework for 360-degree video applications
  ●   P5. Woojin Cho and Woontack Woo. (Korea)
            : Efficient 3D Hand Data Generation with a Colored Glove for Model-based Hand Tracking
  ●   P6. Hyung-Il Kim, Juyoung Lee, Seoyoung Oh, Hui-Shyong Yeo and Woontack Woo. (Korea)
            : Electrical Muscle Stimulation Based Guidance for Augmented Reality
  ●   P7. Junmok Do, Kyeongho Jang, Jaemin Byun and Sanghyun Cho. (Korea)
            : Motion-Based Action Recognition Game Using Kinect
  Session 3   Visualization for Augmented Perception

  Session Chair : Christian Sandor, NAIST

  ●   O15. Jinwoo Jeon and Woontack Woo. (Korea)
              : Object Independent AR Space Generation
  ●   O16. Whie Jung and Woontack Woo. (Korea)
              : Duplication Based Distance-Free Freehand Virtual Object Manipulation
  ●   O17. Yuma Ouchi. (Japan)
              : 3D Ground Reaction Force Visualization for Sprint Training
  ●   O18. Takuro Matsui. (Japan)
              : Automatic Fences Removal for Diminished Reality using Curvelet Transform
  ●   O19. Qirui Zhang. (Japan)
              : Guided Image Upsampling via Domain Transform
  ●   O20. Ryo Akiyama. (Japan)
              : Robust Estimation of Reflectance for Appearance Control with a Projector-Camera System
  ●   P8. Boram Yoon and Woontack Woo. (Korea)
            : TrackYourCups: Personal Informatics Visualization using Geographic and Choropleth Map
  ●   P9. Jinwoo Park and Woontack Woo. (Korea)
            : Handy Light Source Mapping Tool for Consistent Illumination in Augmented Reality
  Session 4   Perception Augmentation in AR/VR

  Session Chair : Kiyoshi Kiyokawa, NAIST

  ●   O21. Youngho Lee, Choonsung Shin, Alexander Plopski, Yuta Itoh, Arindam Dey, Gun Lee, Seungwon Kim
                and Mark Billinghurst. (Korea)
              : Estimating Gaze Depth Using Multi-Layer Perceptron
  ●   O22. Kar-Long Chan, Kohei Ichikawa, Yasuhiro Watashiba and Hajimu Iida. (Japan)
              : Cloud-Based VR Gaming: Our Vision on Improving the Accessibility of VR Gaming
  ●   O23. Taishi Yamamoto, Hiroto Aida, Daisuke Yamashita, Yusuke Honda and Mitsunori Miki. (Japan)
              : E-Book Browsing Method by Augmented Reality Considering Paper Shape
  ●   O24. Damien Rompapas. (Japan)
              : EyeAR: Refocusable Augmented Reality Content through Eye Measurements
  ●   O25. Alexander Plopski. (Japan)
              : Eye-gaze aware micro-lens based near-eye head-mounted display
  ●   P10. Juyoung Lee, Ji-Hyun Lee and Woontack Woo. (Korea)
              : Attention Modeling based on Wearable Activity Tracking: An aspect Managing Information Throughput
                for Adaptive Notification
  ●   P11. Hyerim Park and Woontack Woo. (Korea)
              : User Requirement Analysis for Location-based Film Experience
  ●   P12. Jae-Eun Shin, Kyungmin Cho and Woontack Woo. (Korea)
              : Smart Hospital: An Interactive Toy Kit to Alleviate Children’s Hospital Fears