Fully funded PhD Studentship: Games of Trust and Ethics for the use of AI

Application deadline: 08 January 2021, for a February 2021 start
A fully funded 3.5-year PhD position is available to work with Professor Michael Luck (Director of the UKRI Centre for Doctoral Training in Safe and Trusted Artificial Intelligence, Department of Informatics, King’s College London) and Dr Detlef Nauck (Head of AI & Data Science Research, BT). The project will begin in February 2021 and is supported by an Industrial CASE award; the successful candidate will have the benefit of spending at least three months working with BT during the PhD.

The conditions and mechanisms that lead to different outcomes in complex, dynamic ecosystems are poorly understood and have not been systematically modelled. For example, given the unequal distribution of data, the impact of AI is likely to be disruptive and unequal, with larger entities (organisations) having advantages and increased control capabilities over smaller ones. Positive action to ensure fairness (generating advantage for individuals and excluded communities, for example) could change these dynamics and generate positive utility.

The aim of this project is to explore the problems of network dynamics and agent interactions by leveraging methods such as game theory, normative reasoning, and/or other modelling and simulation techniques. At its core, the project will provide new computational tools and techniques for understanding complex and unequal ecosystems of the kind described above, and in its application will seek to address issues arising from the development of AI itself. In this way, as well as providing novel technological developments, the methods and approaches developed may also offer insight into the policy drivers for AI, mechanisms for understanding the impact of AI as a digital technology on organisations, and/or approaches to governance around AI.
It is expected that the successful candidate’s project will be aligned with the Centre for Doctoral Training in Safe and Trusted AI, and that the candidate will participate in its training and other activities.

Candidate profile
Applicants will normally be expected to have an MSc in computer science or a related discipline, or an outstanding undergraduate qualification, but all applications will be considered on merit as appropriate to the individual case. Applications from individuals with non-standard backgrounds are encouraged, as are applications from women, disabled candidates, and Black, Asian and Minority Ethnic (BAME) candidates, who are currently under-represented in the sector. All applicants will need to demonstrate enthusiasm and aptitude.

The studentship will start in February 2021, is funded by an EPSRC/BT Industrial CASE award, and will be funded for 3.5 years in the first instance. It includes a tax-free stipend at the standard UKRI rate (£17,285 per annum including London Allowance for 2020/21), full-time (UK) PhD tuition fees, and a generous allowance for research consumables, additional training, conference attendance, etc.

Application Information
Applicants are strongly encouraged to contact the supervisor, Professor Michael Luck, at michael.luck@kcl.ac.uk in the first instance to discuss their interest.

Formal applications for this PhD in Computer Science Research should be made via the application portal, specifying Professor Michael Luck as the supervisor and “Games of Trust and Ethics for AI” as the Project Title/Reference. Applications should also include a short (3-4 page) research proposal based on the brief outline above. Anyone making a formal application should also advise Professor Luck that they are doing so.

Applications should be submitted by 08 January 2021. The anticipated start date is 01 February 2021. Full details of how to apply can be found at https://www.kcl.ac.uk/study/postgraduate/research-courses/computer-science-research-mphil-phd.