Hui-Shyong Yeo

I am a third-year PhD student at the University of St Andrews,

advised by Aaron Quigley. 

I was previously a researcher at the UVR Lab, KAIST.

hsy@st-andrews.ac.uk

About Yeo

I am particularly interested in exploring and developing novel interaction techniques that transcend the barrier between humans and computers, enabling more natural and intuitive interaction.

I am interested in topics such as Gestural and Mid-air Interaction, Mobile and Wearable Interaction, Tangible Interaction, Augmented/Virtual Reality, Text Entry and Pen Interaction.

For my PhD thesis, I currently focus on single-handed interaction techniques for mobile and wearable devices.

Beyond HCI, I am also interested in cloud computing and cloud storage. When I procrastinate, I post interesting material to my HCI Research Fan page and blog.

Selected Projects

Mirror Mirror: An On-Body Clothing Design System [CHI16 Note]

We contribute the Mirror Mirror system, which not only supports mixing and matching existing fashion items but also lets users design new items in front of the mirror and export the designs to fabrication devices. Mirror Mirror makes use of spatial augmented reality and a mirror. Virtual garments are visible both on the body, for precise manipulation, and in the reflection, to obtain a third-person perspective. While much previous work deals with re-texturing and registering virtual garments to live user data, we focus on collaborative design and show various ways of designing using real bodies as mannequins.

WatchMI: Pressure Touch, Twist and Pan Gesture Input on Unmodified Smartwatches [MobileHCI16 Honorable Mention Award]

We present WatchMI (Watch Movement Input), which enhances touch interaction on a smartwatch to support continuous pressure touch, twist and pan gestures, and their combinations. Our novel approach relies on software that analyzes, in real time, the data from the built-in inertial measurement unit (IMU) to determine, with high accuracy and at different levels of granularity, the actions performed by the user, without requiring additional hardware or any modification of an unmodified smartwatch.
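
As a rough illustration of the idea (not the actual WatchMI implementation), the Python sketch below maps small deflections reported by the IMU to the three gesture classes; the axes, thresholds and rest-pose handling are assumptions made up for the example.

    import math

    # Illustrative only: classify a deviation from the watch's rest pose
    # (in degrees) into a pressure, pan or twist action. Thresholds are
    # assumptions, not WatchMI's actual parameters.
    TILT_DEADZONE = 2.0    # degrees of tilt treated as sensor noise
    TWIST_DEADZONE = 3.0   # degrees of rotation about the screen normal

    def classify(pitch_deg, roll_deg, yaw_deg):
        tilt = math.hypot(pitch_deg, roll_deg)
        if abs(yaw_deg) > TWIST_DEADZONE and abs(yaw_deg) > tilt:
            return ("twist", yaw_deg)      # rotation about the screen normal
        if tilt > TILT_DEADZONE:
            angle = math.degrees(math.atan2(roll_deg, pitch_deg))
            return ("pan", angle)          # directional tilt while touching
        return ("pressure", tilt)          # small deflection = pressure level

    print(classify(0.5, 0.3, 0.1))   # ('pressure', ...)
    print(classify(6.0, 1.0, 0.5))   # ('pan', ...)
    print(classify(0.5, 0.5, 8.0))   # ('twist', 8.0)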

RadarCat: Radar Categorization for Input & Interaction [UIST16 Paper]

In RadarCat we present a small, versatile radar-based system for material and object classification, which enables new forms of everyday proximate interaction with digital devices. We further demonstrate four working examples built on RadarCat: a physical object dictionary, a painting and photo-editing application, body shortcuts, and automatic refill.
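
The classification step can be pictured with a standard pipeline like the sketch below (a generic random-forest classifier over pre-extracted feature vectors, trained on synthetic placeholder data); it is not the model or feature set actually used in RadarCat.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Placeholder data: pretend each radar frame has been reduced to a
    # 64-dimensional feature vector. Real features come from the sensor.
    rng = np.random.default_rng(0)
    materials = ["glass", "wood", "aluminium", "skin"]
    X_train = rng.normal(size=(200, 64))        # 50 frames per material
    y_train = np.repeat(materials, 50)

    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X_train, y_train)

    # At runtime: classify whatever object is currently on the sensor.
    live_frame = rng.normal(size=(1, 64))
    print(clf.predict(live_frame)[0])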

SWiM: Shape Writing in Motion [CHI17 Paper]

In SWiM, we propose and evaluate a novel design point: a tilt-based text entry technique that supports single-handed usage. Our technique is based on the gesture keyboard (shape writing); however, instead of drawing gestures with a finger or stylus, users articulate a gesture by tilting the device. This can be especially useful when the user's other hand is otherwise encumbered or unavailable.
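
A minimal sketch of the core mapping is shown below, assuming per-frame pitch and roll from the device IMU and an illustrative gain; the recorded trace would then be matched against word shapes by a gesture-keyboard decoder, which is not shown.

    GAIN = 4.0   # pixels of cursor movement per degree of tilt (assumption)

    def update_cursor(cursor, pitch_deg, roll_deg):
        """Move the on-keyboard cursor according to the device's tilt."""
        x, y = cursor
        return (x + GAIN * roll_deg, y + GAIN * pitch_deg)

    cursor = (160.0, 90.0)            # start at the centre of the keyboard
    trace = [cursor]
    for pitch, roll in [(2.0, -1.0), (3.5, 0.0), (1.0, 4.0)]:  # fake IMU samples
        cursor = update_cursor(cursor, pitch, roll)
        trace.append(cursor)          # this trace is later decoded into a word
    print(trace)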

Sidetap and Slingshot Gestures on Unmodified Smartwatches [UIST16 Best Poster]

We present a technique for detecting gestures on the edge of an unmodified smartwatch. We demonstrate two exemplary gestures: i) SideTap, tapping on any side, and ii) Slingshot, pressing on the edge and then releasing quickly. Our technique is lightweight, as it relies only on data from the internal inertial measurement unit (IMU). With these two gestures, we expand the input expressiveness of a smartwatch, allowing users to perform intuitive gestures with natural tactile feedback instead of limiting interaction to the small touch screen.
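
The distinction between the two gestures can be sketched with simple thresholds over a window of IMU samples, as below; the thresholds, window length and features are assumptions for illustration rather than the published detector.

    TAP_PEAK = 3.0          # rad/s: sudden spike in angular velocity
    PRESS_LEVEL = 0.5       # rad/s: sustained slow deflection while pressing
    PRESS_MIN_SAMPLES = 20  # samples of build-up required for a slingshot

    def classify_window(gyro_magnitudes):
        """Classify one window of gyroscope magnitudes as a gesture (or None)."""
        peak = max(gyro_magnitudes)
        build_up = sum(1 for g in gyro_magnitudes[:-5] if g > PRESS_LEVEL)
        if peak > TAP_PEAK and build_up >= PRESS_MIN_SAMPLES:
            return "slingshot"   # slow press on the edge, then a quick release
        if peak > TAP_PEAK:
            return "sidetap"     # sharp spike without a preceding press
        return None

    tap_window = [0.1] * 30 + [4.2, 0.3]
    sling_window = [0.8] * 30 + [5.0, 0.2]
    print(classify_window(tap_window), classify_window(sling_window))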

TiTAN: Typing in Thin Air Naturally [CHI17 LBW, to appear]

We present TiTAN, a virtual keyboard system that enables freehand mid-air text entry for distant displays while requiring only a low-cost depth sensor. Leveraging users' spatial familiarity with the QWERTY layout, our system allows them to input text in thin air by mimicking the typing actions they usually perform on a physical keyboard or touchscreen device. Both hands and all ten fingers are tracked individually, along with click detection, to enable a wide variety of interactions. We propose three mid-air text entry techniques: bimanual hunt-and-peck, ten-finger touch-typing and one-handed shape writing.
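
As a simplified illustration of the key-selection step (not the actual TiTAN pipeline), the sketch below assumes a depth-sensor hand tracker already supplies fingertip coordinates and treats a quick dip of the fingertip toward the virtual keyboard plane as a click; the layout geometry and thresholds are made up for the example.

    QWERTY_ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]
    KEY_SIZE = 0.04   # metres per key in the virtual layout (assumption)

    def nearest_key(x, y):
        """Map a fingertip (x, y) on the virtual keyboard plane to a key."""
        row = min(max(int(y / KEY_SIZE), 0), len(QWERTY_ROWS) - 1)
        col = min(max(int(x / KEY_SIZE), 0), len(QWERTY_ROWS[row]) - 1)
        return QWERTY_ROWS[row][col]

    def is_click(depth_history, dip=0.02):
        """Detect a quick dip of the fingertip along the depth axis."""
        return len(depth_history) >= 2 and depth_history[-2] - depth_history[-1] > dip

    depths = [0.30, 0.30, 0.27]          # fingertip depth samples (metres)
    if is_click(depths):
        print(nearest_key(0.13, 0.05))   # prints 'f' for this example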

An HMD-based Mixed Reality System for Avatar-Mediated Remote Collaboration with Bare-hand Interaction [ICAT-EGVE15]

We present a novel framework for a mixed reality remote collaboration system that enables a local user to interact and collaborate with another user in a remote space using natural hand motion. Unlike conventional systems, where the remote user appears only inside a screen, our system can summon the remote user into the local space, where they appear as a virtual avatar in the real-world view seen by the local user.

SpeCam: Sensing Surface Color and Material with the Front-Facing Camera of a Mobile Device [MobileHCI17 Honorable Mention Award]

SpeCam is a lightweight surface color and material sensing approach for mobile devices that uses only the front-facing camera and the display as a multi-spectral light source. We leverage the natural use of mobile devices (placing them face down) to detect the material underneath and thereby infer the location or placement of the device. SpeCam can then be used to support discreet micro-interactions that avoid the numerous distractions users face daily with today's mobile devices.
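
The sensing loop can be pictured as below; show_fullscreen_color and capture_front_camera are hypothetical stand-ins for the platform's display and camera APIs, and the color set and features are illustrative only.

    COLORS = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 255)]

    def sense_surface(show_fullscreen_color, capture_front_camera):
        """Flash the display in several colors and record the reflected light."""
        features = []
        for color in COLORS:
            show_fullscreen_color(color)     # the screen acts as the light source
            pixels = capture_front_camera()  # list of (r, g, b) pixels of the surface
            n = len(pixels)
            features += [sum(px[c] for px in pixels) / n for c in range(3)]
        return features                      # fed to a material/placement classifier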

Hand Tracking and Gesture Recognition System for Human-Computer Interaction using Low-cost Hardware [MTAP Journal]

We present a robust marker-less hand/finger tracking and gesture recognition system using low-cost hardware. We propose a simple but efficient method that allows robust and fast hand tracking despite complex backgrounds and motion blur. Our system translates the detected hands or gestures into different functional inputs and interfaces with other applications via several methods. We also developed sample applications that utilize the input from the hand tracking system.
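
In the same spirit (though not the exact method from the paper), a basic OpenCV pipeline for marker-less finger counting might look like the sketch below; the HSV skin thresholds and defect-depth cut-off are assumptions that depend on lighting and camera.

    import cv2
    import numpy as np

    def count_fingers(frame_bgr):
        """Rough finger count from skin segmentation and convexity defects."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, np.array([0, 30, 60]), np.array([20, 150, 255]))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return 0
        hand = max(contours, key=cv2.contourArea)        # largest blob = hand
        hull = cv2.convexHull(hand, returnPoints=False)
        defects = cv2.convexityDefects(hand, hull)
        if defects is None:
            return 0
        # Deep defects correspond to the gaps between extended fingers.
        gaps = sum(1 for d in defects[:, 0] if d[3] / 256.0 > 20)
        return min(gaps + 1, 5)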

Itchy Nose: Discreet Gesture Interaction using EOG Sensors in Smart Eyewear [ISWC17]

We propose a sensing technique for detecting finger movements on the nose, using EOG sensors embedded in the frame of a pair of eyeglasses. Eyeglass wearers can use their fingers to exert different types of movement on the nose, such as flicking, pushing or rubbing. These subtle gestures can be used to control a wearable computer without drawing attention to the user in public. We present two user studies in which we test the recognition accuracy for these movements.
Video: https://www.youtube.com/watch?v=IQ_LkPM_GHs