I have been a Senior Lecturer (Associate Professor) in the Centre for Robotics Research (CoRe) within the Department of Engineering, King's College London, United Kingdom, since 2022, where I am the Head of the Social AI and Robotics (SAIR) Lab. I obtained my PhD in Electrical and Electronics Engineering from Bogazici University, Turkey, in collaboration with the National Institute of Applied Sciences of Lyon, France. After my PhD, I spent several years as a postdoctoral researcher in the Personal Robotics Lab, Imperial College London; the Graphics and Interaction Research Group (Rainbow), University of Cambridge; and the Multimedia and Vision Research Group, Queen Mary University of London. In 2018, I joined King's College London as a Lecturer in Robotics, which has the best location in central London -- see my office view.
My research focuses on machine learning for developing socially aware systems capable of autonomous interaction with humans. This encompasses tackling challenges in multimodal perception, understanding and forecasting human behaviour, and advancing navigation, manipulation, and human-robot interaction. To date, my team's research has been supported by the EPSRC, The Royal Society, Innovate UK, EU Horizon, and industrial collaborations including Toyota Motor Europe, SoftBank Robotics Europe, and NVIDIA. Here is a short article about SAIR.
Highlights of our research are:
This project explores the application and practicalities of embedding Socially Assistive Robots (SARs) within paediatric palliative care, first by understanding the mental-health benefits that can be derived from synthetic companionship with SARs, and then by discovering their potential to enhance the quality of life of sick children.
SERMAS aims to develop innovative, formal and systematic methodologies and technologies to model, develop, analyse, test and user-study socially acceptable XR systems. [More info]
The vision of the CL-HRI project is to 1) develop scalable methods that can advance the current state of continual learning in human-centric computer vision tasks; and 2) integrate the developed techniques with real robots to enable them to constantly adapt and automatically exploit new information for generating appropriate interactive behaviours.
The LISI project aims to set the basis for the next generation of robots that will be able to learn how to act in concert with humans by watching human-human interactions.
An exciting multidisciplinary project on a mission to build a 5G-connected smart platform for public safety and health in large urban areas such as London. The team is distributed across three King's faculties, namely NMS, IoPPN, and SSPP, and five research centres including CTR, CoRe, CTI, CUSP, and SLaM. More info will follow - stay tuned!
The PAL project aimed to develop (mobile) health applications to assist diabetic children through a social robot and its mobile virtual avatar. My role involved developing vision-based methods for estimating users' mental states to enable system personalisation mechanisms.
This project was a unique collaboration between researchers and artists to enable people to access public spaces through cutting-edge robotic telepresence systems. My role involved developing (multimodal) methods to automatically model nonverbal cues during human-robot interaction.
MAPTRAITS aimed to develop natural, human-like virtual agents that can not only sense their users, but also adapt their own personalities to their users' personalities. My role involved devising a real-time personality prediction system.
This module introduces learners to the design of sensing and perception algorithms for robotic and autonomous systems. Learners will implement a range of algorithms used by a robot to sense and interpret its surroundings, with sensors such as 2D and 3D cameras, LIDAR, and radar, and test these algorithms in design applications. The module also covers advanced techniques for fusing measurements from proprioceptive and exteroceptive sensors for improved state estimation.
In the implementation of mechatronic and other control systems, a key component is the integration of computational brainpower with sensing (measurement of unknown signals or parameters) and actuation (affecting the surrounding environment). In this module, I teach the Sensors part, introducing advanced concepts in sensing for mechatronic and other relevant systems, including the estimation of parameters from measurements and analogue sensors.