Event

Research Seminar – Aaron Quigley

Object recognition in HCI with Radar, Vision and Touch

Aaron Quigley, Chair of HCI – University of St Andrews

Wednesday 2nd October, 2pm – Bayes Centre G.03, 47 Potterrow, EH8 9BT

The exploration of novel sensing to facilitate new interaction modalities is an active research topic in Human-Computer Interaction. Across the breadth of HCI we can see the development of new forms of interaction underpinned by the appropriation or adaptation of sensing techniques based on the measurement of sound, light, electric fields, radio waves, biosignals and so on. In this talk I will delve into three forms of sensing for object detection and interaction: radar, blurred images and touch.

RadarCat (UIST 2016, Interactions 2018, IMWUT 2018, UbiComp 2019) is a small, versatile system for material and object classification which enables new forms of everyday proximate interaction with digital devices. RadarCat exploits the raw radar signals that are unique when different materials and objects are placed on the sensor. Using machine learning techniques, these objects can be accurately recognized. An object’s thickness, state (filled or empty mug) and different body parts can also be recognized. This gives rise to research and applications in context-aware computing, tangible interaction (with tokens and objects), industrial automation (e.g., recycling) and laboratory process control (e.g., traceability). AquaCat (MobileHCI 2017 workshop), meanwhile, is a low-cost radar-based system capable of discriminating between a range of liquids and powders. Further, in Solinteraction we explore two research questions with radar as a platform for sensing tangible interaction: counting, ordering and identifying objects, and tracking their orientation, movement and distance. We detail the design space and practical use cases for such interaction, which allows us to identify a series of design patterns with radar that go beyond static interaction to continuous and dynamic interaction.
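To make the classification pipeline concrete, here is a minimal, hypothetical sketch of the RadarCat-style approach: flatten each radar snapshot into a feature vector and train a random-forest classifier on labelled examples. The channel counts, class names and synthetic data below are invented placeholders, not the published system.

```python
# Illustrative sketch only: RadarCat-style object classification from raw
# radar snapshots via a random-forest classifier. Signal shapes, class
# names and data here are invented placeholders, not the UIST 2016 setup.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Pretend each radar snapshot is a flat vector of per-channel amplitude
# samples (e.g. 8 receive channels x 64 samples); real sensor data differs.
N_SAMPLES, N_FEATURES = 600, 8 * 64
CLASSES = ["empty mug", "full mug", "steel plate", "human hand"]

# Synthetic stand-in data: each class gets a distinct mean signal profile.
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(N_SAMPLES // 4, N_FEATURES))
               for i in range(len(CLASSES))])
y = np.repeat(np.arange(len(CLASSES)), N_SAMPLES // 4)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
print("prediction:", CLASSES[clf.predict(X_test[:1])[0]])
```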

Beyond radar, SpeCam (MobileHCI ’17) is a lightweight surface color and material sensing approach for mobile devices which uses only the front-facing camera and the display as a multi-spectral light source. We leverage the natural use of mobile devices (placing them face-down) to detect the material underneath and thereby infer the location or placement of the device. SpeCam can then be used to support “discreet computing” with micro-interactions, avoiding the numerous distractions that users face daily with today’s mobile devices. Our two-part study shows that SpeCam can i) recognize colors in the HSB space that are 10 degrees apart near the 3 dominant colors, and 4 degrees apart otherwise, and ii) recognize 30 types of surface materials with 99% accuracy. These findings are further supported by a spectroscopy study. Finally, we suggest a series of applications based on simple mobile micro-interactions suitable for using the phone when placed face-down, working from blurred images.
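A rough sketch of the SpeCam idea, under stated assumptions: flash the display with a few known colours, read the mean camera response under each, and match the concatenated signature against pre-measured reference materials by nearest neighbour. The function capture_mean_rgb() and all readings below are hypothetical stand-ins for real display/camera control.

```python
# Illustrative sketch only: SpeCam-style surface sensing. The display is
# flashed with known colours while the front camera records the (blurred)
# reflection; per-illumination mean responses form a feature vector that
# is matched against reference materials. All values are fabricated.
import numpy as np

ILLUMINANTS = ["red", "green", "blue", "white"]      # display colours

def capture_mean_rgb(illuminant: str) -> np.ndarray:
    """Stand-in for: set display to `illuminant`, grab a front-camera
    frame, and return its mean RGB. Here we just return fake readings."""
    fake = {"red": [0.8, 0.1, 0.1], "green": [0.1, 0.7, 0.2],
            "blue": [0.1, 0.2, 0.9], "white": [0.6, 0.6, 0.6]}
    return np.asarray(fake[illuminant])

def reflectance_signature() -> np.ndarray:
    # One 3-vector per illuminant, concatenated: 12 features in total.
    return np.concatenate([capture_mean_rgb(c) for c in ILLUMINANTS])

# Reference signatures per material would be measured ahead of time.
REFERENCE = {
    "wood desk": np.array([.7,.3,.2, .3,.6,.3, .2,.3,.7, .5,.5,.4]),
    "paper":     np.array([.8,.1,.1, .1,.7,.2, .1,.2,.9, .6,.6,.6]),
    "denim":     np.array([.3,.2,.4, .2,.4,.5, .2,.3,.8, .3,.4,.6]),
}

sig = reflectance_signature()
best = min(REFERENCE, key=lambda m: np.linalg.norm(REFERENCE[m] - sig))
print("surface guess:", best)
```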

Finally, with touch we show a sensing technique for detecting finger movements on the nose, using EOG sensors embedded in the frame of a pair of eyeglasses (ISWC 2017). Eyeglasses wearers can use their fingers to exert different types of movement on the nose, such as flicking, pushing or rubbing. These subtle “discreet computing” gestures can be used to control a wearable computer without calling attention to the user in public. We present two user studies in which we test recognition accuracy for these movements. I will conclude the talk with some speculation on how touch, radar and vision processing might be used to realise “discreet” and “blended reality” interactions in AR and beyond.
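As an illustration of how such gestures might be recognized (a sketch, not the ISWC 2017 pipeline): slice the EOG stream into windows, compute simple per-channel statistics, and classify with a nearest-neighbour model. The channel count, window length and synthetic signals are assumptions.

```python
# Illustrative sketch only: classifying nose-gesture EOG windows
# (flick / push / rub) from glasses-mounted electrodes. Window length,
# channel count and features are assumptions, not the published setup.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)
GESTURES = ["flick", "push", "rub"]
N_CHANNELS, WINDOW = 3, 128          # e.g. 3 EOG channels, ~1 s window

def features(window: np.ndarray) -> np.ndarray:
    # Simple per-channel statistics: mean, std and peak-to-peak range.
    return np.concatenate([window.mean(axis=1), window.std(axis=1),
                           np.ptp(window, axis=1)])

# Synthetic stand-in recordings: each gesture gets a distinct amplitude.
def fake_window(label: int) -> np.ndarray:
    return rng.normal(0, 1 + label, size=(N_CHANNELS, WINDOW))

X = np.array([features(fake_window(g)) for g in range(3) for _ in range(40)])
y = np.repeat(np.arange(3), 40)

clf = KNeighborsClassifier(n_neighbors=5).fit(X, y)
probe = features(fake_window(2))
print("recognized gesture:", GESTURES[clf.predict([probe])[0]])
```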

Aaron Quigley is the Chair of Human-Computer Interaction in the School of Computer Science at the University of St Andrews. He is director of the Scottish Informatics and Computer Science Alliance (SICSA), a collaboration of 14 Scottish universities. In addition, he is a general co-chair for the ACM CHI conference in Yokohama, Japan, in 2021 and currently serves as the ACM SIGCHI Vice President for Conferences. He has published over 180 internationally peer-reviewed publications. His research interests include discreet computing, global HCI, pervasive and ubiquitous computing, and information visualisation, on which he has delivered over 50 invited talks.

