ALICE the robot visits Bloomsbury
The INSIGHT Patient and Public Involvement and Engagement (PPIE) team was very pleased to take part in the Bloomsbury Festival 2021. The festival, which celebrates both science and art, took place over a week in October 2021.
The theme this year was ‘Shining Light’, so we thought we would shine a light on the subject of machine learning and the use of data in healthcare and research.
We had a place alongside other scientists and educators at the Senate House Discovery Hub, where, over the course of two days, we introduced hundreds of children and adults to our very own Artificial Learning Intelligent Computer Eye Robot, or ALICE for short.
We have been talking to young people about machine learning with ALICE for some time. We began with the Moorfields Young Persons’ Advisory Group (Eye-YPAG), where we looked at how best to explain this complicated topic. We then used a short film about ALICE to illustrate the principles of machine learning, including how a computer can be ‘taught’ to recognise images using large datasets of labelled and classified examples, so that, for instance, it can recognise a cat even without seeing all of the cat’s features. You can watch the film below.
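For the curious, here is a minimal sketch of what that ‘teaching’ can look like in code. It is not ALICE’s software: it simply uses scikit-learn’s small built-in set of labelled digit images as a stand-in for a large labelled dataset, trains a classifier on them, and then checks how well it labels images it has never seen.

```python
# A toy illustration of learning from labelled images (not ALICE's code).
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

digits = load_digits()  # 1,797 small 8x8 images, each labelled 0-9

# Hold some images back so we can test the computer on pictures
# it was never "taught" with.
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

model = KNeighborsClassifier(n_neighbors=3)
model.fit(X_train, y_train)  # the "teaching" step: labelled examples go in

accuracy = model.score(X_test, y_test)  # labelling unseen images
print(f"Correctly labelled {accuracy:.0%} of images it had never seen")
```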
We took this one step further at the Discovery Hub by bringing along a life-size ALICE, who could look at an object and identify it... or not!
We used a camera as ALICE’s eyes, linked up to a piece of software programmed to recognise objects. The software had varying levels of success, which led to our most informative conversations.
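For readers who want a feel for how a camera can be linked to recognition software, here is a sketch of that general kind of setup. It assumes OpenCV for the camera and a pretrained torchvision model for recognition; it is an illustration of the approach, not the software we actually ran at the festival.

```python
# An illustrative camera-to-classifier pipeline, not our demo's code.
import cv2
import torch
from torchvision import models, transforms

# Load a pretrained image-classification model and its label names.
weights = models.MobileNet_V2_Weights.DEFAULT
model = models.mobilenet_v2(weights=weights)
model.eval()
labels = weights.meta["categories"]

# Standard preprocessing for this model: resize, crop and normalise.
preprocess = transforms.Compose([
    transforms.ToPILImage(),
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

cap = cv2.VideoCapture(0)   # the camera standing in for ALICE's eye
ok, frame = cap.read()      # grab a single frame
cap.release()

if ok:
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # OpenCV frames are BGR
    batch = preprocess(rgb).unsqueeze(0)
    with torch.no_grad():
        probs = torch.softmax(model(batch)[0], dim=0)
    conf, idx = probs.max(dim=0)
    print(f"I think this is a {labels[idx.item()]} "
          f"({conf.item():.0%} sure)")
```

Because a model like this reports a confidence alongside each label, you can see exactly the sort of hesitation that prompted our visitors’ questions.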
Some of the questions raised by interactions with ALICE
If the object was instantly recognised, why?
If the object was identified by its materials, why?
If the object was identified as something else altogether, why?
Did the software recognise humans?
How do human brains identify things, and how does this differ from the way machines do it?
These questions led us to talk about how a computer needs to see lots of labelled images of the same object before it can confidently label a new image. For example, a plastic toy cow was sometimes identified as a cow figurine, sometimes as a plastic toy, and sometimes as a penguin, depending on the angle at which it was shown to the camera. This led to conversations about how a computer recognises shapes and materials, and how you can teach a computer to learn more.
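One common way to ‘teach a computer to learn more’, and to make it less fussy about angles like our toy cow, is data augmentation: showing it randomly rotated, flipped and re-lit copies of each labelled image during teaching. The snippet below is a generic illustration using torchvision’s transforms, not something our demo actually did.

```python
# A generic data-augmentation recipe (an illustration, not our demo's code):
# each labelled photo is shown to the model many times with random
# rotations, flips and lighting changes, so a cow photographed side-on
# still looks like something the model has "seen" before.
from torchvision import transforms

augment = transforms.Compose([
    transforms.RandomRotation(degrees=45),    # new viewing angles
    transforms.RandomHorizontalFlip(),        # mirror views
    transforms.ColorJitter(brightness=0.2),   # lighting changes
    transforms.ToTensor(),
])
```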
We also talked about how researchers and healthcare professionals are applying the principles of machine learning to big datasets, for example using OCT (Optical Coherence Tomography) scans to teach a computer to identify eye conditions earlier.
We had a brilliant time with all the people we met and spoke to. It was rewarding to see adults and children join the dots themselves, with our conversations leading people to start understanding the processes behind machine learning in healthcare. We hope that our audience will take the ideas behind ALICE back to their homes and schools, and use them to learn more about the subject and about how we can use this tool to make a difference and advance scientific discovery.