Diversity is a fundamental and invaluable asset that enriches every aspect of society, including workplaces, educational institutions, and communities. For this reason, EELISA’s research and innovation arm, InnoCORE, has actively collected best practices and initiatives implemented by individual researchers and research structures to promote diversity within their teams.
The focus extended to encompass various aspects such as gender, disability, ethnic and cultural background, socio-economic status, sexual identity (including LGBTQ+), educational background, and caring responsibilities.
This year’s awardee is Hatice Köse, leader and founder of the Cognitive Social Robotics (CSR) Laboratory, a research group that specialises in Social and Assistive Robots designed for children with autism or hearing impairments.
Q.- Professor Hatice Köse, as the founder of the Cognitive and Social Robotics Group at ITU, could you please provide insight into the group’s background and its composition?
A.- Our group consists of researchers and graduate and undergraduate students from AI and data engineering, robotics, computer engineering, gaming, mechanical engineering, electrical and electronics engineering, biomedical engineering, and architecture backgrounds, as well as psychologists and artists who voluntarily join our laboratory and take part in the studies. Our main goal is to use assistive robots, artificial intelligence, sensors and gaming in the therapy and education of children with disabilities.
We encourage female students and researchers to take part in the studies, and the majority of our team members are female. Our graduate students play a very important role in our group: we design the systems together, and they build the platforms and take part in the interaction studies with children.
Q.- In your research group, you collaborate with children who have autism or hearing impairments, along with their families, therapists, and teachers, in order to create robotic and AI-based solutions. Could you please explain how this multi-actor and interdisciplinary interaction is effectively deployed?
A.- We are engineers; we can design and develop robotic systems, deep learning based applications and serious games, but we need our collaborators to understand the nature of the problems and their requirements. In every project, we listen to our collaborators, we try to learn the problem and the requirements, and together we try to come up with technology-based solutions. We visit the hospitals, watch the treatment sessions, and talk with the families. It takes several iterations to shape the studies. The most important thing is to build a system that the children can accept and use. If the children do not like the system, they will not use it or benefit from it. Therefore, the strict laboratory setups we use in our standard tests, deep learning methods that require high processing power, or a very precise eye-tracker that requires the user to stay still for 10 minutes will not be useful in the study at all.
It is important to know your end users, who are the children, their families, therapists and teachers.
An example: One boy refused to play with our robot and left the study, because it “was not a transformer robot and could not transform into Bumblebee” like his cheap plastic toy.
Q.-The main goal of this work is to support these children’s education and health. Could you provide us with an example of a practical application that aligns with this goal?
A.- In our Erasmus+ project “EMBOA: Affective loop in Socially Assistive Robotics as an intervention tool for children with autism”, we investigate the use of a social humanoid robot in the therapy of children with autism, and we enriched this study with affective computing technologies. Here we developed AI-based approaches to detect the emotion and stress of children during their interaction with the robot, and defined guidelines for researchers, therapists and families.
In our other project, Roborehab: An Assistive Audiology Rehabilitation Robot for Children, we developed a system on the Pepper humanoid robot that can administer hearing tests to children with hearing impairments in the audiology units of hospitals, and also detect emotion, attention and stress during these tests with artificial intelligence based methods. The children became stressed or experienced negative emotions when they were in the hospital for the tests, which prevented them from answering the questions correctly and made it difficult to determine their level of hearing loss accurately. Our motivation was to develop a social and affective robot and interactive games to monitor and decrease their stress and increase their motivation during the health tests.
In these studies, social robots and interaction games are used to deliver educational or health-related material to the children and to support their motivation and attention, while artificial intelligence based techniques (such as deep learning models) and sensors are used to understand the children and give feedback to the robot so it can support them better. Every child is unique, and technology helps us adapt education and healthcare to each child’s needs and preferences.
Q.- Based on your experience, how do you foresee the implementation of cognitive and social robotics contributing to the achievement of a more egalitarian society?
A.- These children have the same creativity, intelligence, hopes and dreams as the rest of society; they just cannot use the standard educational material or communicate as the rest of us do, which isolates them and builds a gap between them and society.
I believe cognitive and social robots can be used as a social support system that helps children transfer what they learn from their interaction with the robots to their interactions with other people, and so narrow this gap between them and society.
Q.- How do you believe receiving the EELISA Diversity Award recognition can contribute to furthering both the development of your group’s project and your professional career?
A.- Receiving the EELISA Diversity Award shows us that our project is important and has an impact, and it motivates us to go further on this topic. As a woman in engineering, I feel especially supported, and this encourages me to take part in similar projects in the future in collaboration with other European institutions and researchers. I believe this award will have a significant impact on my career.
Professor Hatice Köse
She is a full Professor at the Faculty of Computer and Informatics Engineering, Istanbul Technical University, Turkey. She has coordinated the GameLab and the Cognitive Social Robotics Lab since 2010.
She received her Ph.D. from the Computer Engineering Department of Bogazici University, Turkey. From 2006 to 2010, she worked as a Research Fellow at the University of Hertfordshire. Her current research focuses on gesture communication (including sign language) and imitation-based interaction games with social humanoid robots for the education and rehabilitation of children with hearing impairment and children with ASD. She leads several national projects and takes part in several Horizon 2020 projects, Erasmus+ projects and COST actions on socially assistive robots, sign language tutoring robots, and human-robot interaction.