New AHRC grant awarded to Ewa Luger

‘Enabling a Responsible AI Ecosystem’

The Arts and Humanities Research Council (AHRC), part of UK Research and Innovation (UKRI), has appointed Professors Shannon Vallor and Ewa Luger to direct the £3.5 million programme ‘Enabling a Responsible AI Ecosystem’ in collaboration with the Ada Lovelace Institute.

The Programme Directors will work closely with the Ada Lovelace Institute, selected by AHRC as a Collaborating Partner for the Programme, to define and shape the research priorities and themes, and deliver other activities to support a responsible AI ecosystem.

The research programme will mobilise expertise from the arts, humanities and social sciences to create an environment within which AI and data-driven innovation is responsible, ethical, and accountable by default.

By harnessing the expertise of researchers and innovators from a range of disciplines, the programme will develop a responsible AI ecosystem that is responsive to the needs and challenges faced by policymakers, regulators, technologists, and the public.

The programme includes experts in philosophy, human-computer interaction, law, art, health, and informatics.


The programme will build connections between academia, industry, policy, regulation and the public to help build confidence in AI, enable innovation, stimulate economic growth and deliver wider public benefit.

Professor Shannon Vallor holds the Baillie Gifford Chair in the Ethics of Data and Artificial Intelligence in the School of Philosophy, Psychology and Language Sciences and is Director of the Centre for Technomoral Futures at the Edinburgh Futures Institute.

Professor Ewa Luger holds a personal chair in Human-Data Interaction within the School of Design, is Co-Director of the Institute of Design Informatics, and Director of Research Innovation for Edinburgh College of Art.

“For AI technologies to be successfully integrated into society in ways that promote shared human flourishing, their development has to be guided by more than technical acumen. A responsible AI ecosystem must meld scientific and technical expertise with the humanistic knowledge, creative vision and practical insights needed to guide AI innovation wisely. This programme will work across disciplines, sectors and publics to lay the foundations of that ecosystem.” Professor Shannon Vallor

“We have reached a critical point within responsible AI development. There now exists a foundation of good practice; however, it is rarely connected to the sites where innovation and change happen, such as industry and policy. We hope that this programme will make new connections, creating an ecosystem where responsibility is not the last, but the first thought in AI innovation.” Professor Ewa Luger


The three-year programme is supported by AHRC, part of UK Research and Innovation (UKRI), to develop research into the challenges and opportunities around AI and data-driven technologies. Enabling a Responsible AI Ecosystem is the first large-scale research programme on AI ethics and regulation in the UK. The project will be delivered with the Ada Lovelace Institute to broaden the existing foundation of responsible AI research and contribute to the UK's wider vision for responsible AI.