Ewa Luger

Chair of Human-Data Interaction

Professor Ewa Luger is Co-Director of the Institute of Design Informatics and Co-Director of the AHRC's Bridging Responsible AI Divides (BRAID) programme. She works closely with policy-makers and industry and is a member of the DCMS College of Experts.


Ewa’s research explores social, ethical and interactional issues in the context of complex data-driven systems, with a particular interest in design, the distribution of power, spheres of exclusion and user consent. Past projects have focused on responsible AI; voice systems and language models in use; the application of AI in journalism and public service media; the intelligibility of AI and data-driven systems to expert and non-expert users; the security and safety of systems in the cloud and at the edge; and the readiness of accounting institutions and knowledge workers to make use of AI.


Ewa has been an investigator on over £22 million of externally funded projects (EPSRC, ESRC, AHRC, Centre for Digital Built Britain and DataLab) since 2016. She is currently Co-PI of the AHRC Bridging Responsible AI Divides (BRAID) programme, Co-I on the EPSRC Fixing the Future project, and a collaborator on the UKRI Digital Twinning Network (DTNet+). Since 2015, she has also been an organiser of Conversations, an annual international workshop on chatbot research.


She was previously a Fellow of the Alan Turing Institute, a researcher at Microsoft, a Fellow of Corpus Christi College (University of Cambridge), and a consultant ethicist for Microsoft Research (2016–2020). She builds upon 15 years as a lifelong learning expert and practitioner (NIACE, 1999–2014), where she investigated digital inclusion within marginalised communities. Ewa holds a BA (Hons) in International Relations & Politics, an MA in International Relations and a PhD in Computer Science.


Bridging Responsible AI Divides (BRAID)

BRAID is a UK-wide programme dedicated to integrating Arts and Humanities research more fully into the Responsible AI ecosystem, as well as bridging the divides between academic, industry, policy and regulatory work on responsible AI.

Read More →

Designing Responsible NLP

Designing Responsible NLP is a PhD with integrated study programme run as part of the UKRI AI Centre for Doctoral Training (CDT) in Responsible and Trustworthy in-the-world Natural Language Processing, newly funded at the start of 2024.

Read More →

Intelligent Governance Design

Read More →

Constructing the Agenda: Public Service AI in News and Media Production

Exploring the implications of artificial intelligence (AI) for public service media (PSM), drawing out the big questions AI and data-intensive technologies raise alongside the practical and conceptual challenges they pose.

Read More →

Fixing the Future: The Right to Repair and Equal-IoT

‘Fixing the Future’ is an interdisciplinary project investigating how the lack of repairability in the consumer Internet of Things (IoT) will adversely impact equity, inclusion, and sustainability in the digital economy.

Read More →

Designing Ethical Human-Computer Systems

Funded by the EPSRC ‘Telling Tales of Engagement’ award, 2016.

Read More →

Exploring Ethical AI in Accounting

Accounting is one of the latest professions to consider the benefits of Artificial Intelligence (AI) to support decision-making.

Read More →

Network Plus in Human Data Interaction

Network Plus in Human Data Interaction: Legibility, Agency, Negotiability

Read More →


Interaction Design for Trusted Sharing of Personal Health Data to Live Well with HIV

Read More →


DCODE

DCODE is an EU-funded network of seven world-class higher education design institutions and stakeholders from industry, government and civil society, exploring new ways of designing with data and autonomous systems.

Read More →

Building Public Value via Intelligible AI (PubVIA)

The PubVIA project aims to build public value via intelligible AI (Artificial Intelligence) and to enable the BBC to consider the potential threats and risks AI and IoT (Internet of Things) pose for journalism.

Read More →