Thursday Talk: DI Webinar – Nadia Berthouze & Youngjun Cho
Engaging people with material through low-cost technology
This seminar presents our work in progress on developing low-cost ubiquitous technology to support a textile circular economy based on wellbeing.
In the first part, we will discuss how we aim to harness interactive, low-cost ubiquitous tools to capture garment material properties and behaviour. We will present our initial work on building new AI-powered material sensing technology based on video cameras and thermal imaging. The aim is to help consumers make informed choices when purchasing, as well as to support them in caring for long-term material quality.
In the second part, we will discuss our aim to create technology that engages consumers in embodied experiences of textiles to increase awareness of how they feel. We are investigating how multimodal movement sensing technology could be leveraged to engage people in exploring how textiles feel to the touch, and how they behave and feel when worn. The long-term aim is to engage people in an embodied dialogue with the clothes they have or want to purchase, to ensure a more fulfilling and satisfying experience.
Bios
Nadia Bianchi-Berthouze is a Full Professor in Affective Computing and Interaction at the University College London Interaction Centre (UCLIC). Her research focuses on designing technology that can sense the affective state of its users and use that information to tailor the interaction process. She has pioneered the field of Affective Computing by investigating how body movement and touch behaviour can be used as a means to recognise and measure the quality of the user experience. Her work has been motivated by real-world applications such as physical rehabilitation (EPSRC Emo&Pain, H2020 EnTiMeMent), textile design (EPSRC Digital Sensoria, EPSRC Textile Circularity Centre, EPSRC Textile Dematerialization for CS), education (H2020 WeDraw) and wellbeing (EPSRC Intelligent Embodied Interaction, H2020 Human Manufacturing, EPSRC Embodied Intelligence). She has published more than 200 papers in Affective Computing, HCI and Pattern Recognition.
Youngjun Cho is an Associate Professor in the Department of Computer Science at UCL and a co-champion at the Global Disability Innovation Hub (GDIH), a WHO Collaborating Centre for Research on Assistive Technology. He is also the programme director of the MSc in Disability, Design and Innovation. He explores, builds and evaluates novel techniques and technologies for the next generation of artificial intelligence-powered physiological computing that can boost disability technology innovation. He has pioneered mobile imaging-based physiological sensing and automated detection of affective states (e.g. mental stress). Before returning to academia in 2018, he worked as a senior research scientist in industry (including the LG Electronics CTO division), leading a variety of industrial research projects for about a decade and successfully commercialising his novel sensing and machine learning technologies (e.g. a gesture-driven advanced touchscreen for vehicles). He has authored more than 70 articles (including patents) in areas related to affective and physiological computing, human-computer interaction, and multimodal sensing and feedback.
Running Order
16.00 – Welcome by John Vines and Susan Lechelt
16.10 – Talk by Nadia Berthouze & Youngjun Cho
16.40 – Q&A
17.00 – End
Limited seats are available at Inspace, so please book tickets in advance.
* Please note that this webinar will be recorded *
Online on Zoom or in person at Inspace