Observatory of Educative Innovation

Rose Luckin, a professor at the Institute of Education at University College London, spoke with the Observatory about technology in class, humanities, and ethics in education.

Read the full transcript:

Observatory (O): What message will you give to skeptics about the use of technology in the classroom?

Luckin (L): People are right to be concerned about the potential for harm: people spending too much time on screens and devices, people being influenced by information that’s not true, the kinds of things that are happening on social media. There are lots of reasons to be concerned. For me, there are many reasons to be concerned about artificial intelligence and the ways it might be misused. We have to be aware of those, but the only way to really tackle them is to embrace the technology, because the technology is here. It’s not going away. We can either bury our heads in the sand, as we would say, and risk things going wrong because we haven’t embraced it or paid attention to what’s going on, or we can embrace it and look for the good while staying aware of the potential for harm.

For me, if we get it right with artificial intelligence, we can have more time away from screens. We’re going to have more time for art and drama and creativity and all the things that are currently being lost from the system. Advances in technology don’t necessarily have to mean more time with that piece of technology.

Video: https://youtu.be/apeC_lk6oHA

O: How can schools promote and increase the participation of teachers and learners in the design and use of technologies?

L: I think you have to show them how it’s going to help them be the best teacher they want to be. Every teacher wants to be the best teacher they can possibly be. We need to show them that AI can help them to do that. I think one of the key ways to do that is to show them that AI can help to reduce some of the routine tasks that take far too much of their time and don’t really use their specialist skills. Also, we want them to be involved in how AI is designed for education. We value their expertise.

I think, speaking obviously with more knowledge of the UK than anywhere else (I do travel a lot and work with people across the world, so I’m not suggesting it’s the case everywhere, but it’s certainly true in many places, including the UK), that the introduction of educational technology for many educators felt like something that was being done to them, not with them. I think we must not make that mistake with artificial intelligence.

I also think that many of the companies developing artificial intelligence for use in education and training know very little about education and training. When I say it is the Wild West, I mean there is no regulation on educational technology other than the legal regulation involved in setting up a company. Now in the EU, we have the General Data Protection Regulation, but you don’t need to know anything about education to set up an educational technology company. You don’t need to know anything about education to set up an AI company and say you’re going to use your AI for education and training. We have to get the educators into this conversation. We absolutely have to.

O: Teachers wonder if they are going to be replaced with machines. How can they draw on AI to nurture and expand their human capabilities?

L: For me, nobody can replace a human teacher other than a human teacher. The repertoire of human expertise is enormous and far beyond anything we can automate, but we can automate some parts of the job incredibly well. That should allow human teachers to use the human skills and expertise that they don’t get to use enough at the moment, because they’re doing too much of the work that we could automate. I want to give them comfort that they are absolutely not replaceable. I think educators are among the most important professions for the future, if not the most important, because everybody is going to be learning for so much more of their lives. I think my grandchildren are going to be learning until they’re 80 at least. Some will need education for all of that time. Educators are going to be in even more demand. We need AI to help those educators deliver education for all of the people who are going to need it for much longer.

O: What is the future of education in an automated world?

L: Think about the students who struggle with the existing system because it doesn’t really address their needs. We could address those needs much more effectively. They might be academic needs or emotional needs, and I don’t mean having computers address the emotional needs; I mean human teachers having more time to address them.

We could do so much for students with special needs. Just look at the different interfaces that we could provide for technology from robots to avatars to virtual reality or augmented reality, voice-activated interfaces. There are huge possibilities for students who are physically disabled or who have challenges in terms of their academic ability. We can really give them a much fairer crack of the whip in terms of getting a good education because we can deliver something that is much more tailored to individual needs.

O: What is your take on humanities in the future of education?

L: I think the humanities are increasingly important. You only have to look at the way that big tech companies are employing people from the humanities more and more. Humanity and compassion are so important, as are emotional intelligence and social intelligence.

You know, we had a terrible fire in London last year, and our prime minister did not engage with the people who lost loved ones. There were many protests. I remember seeing a placard that said: “Theresa May where is your humanity?” I thought: isn’t that interesting? That is absolutely it. Humanity is so important. As technology has advanced more and more, we need that human touch, and we need to understand how to develop that really effective human touch to a much greater extent.

O: What are the tech trends with more potential for the future of education?

L: I taught in some fairly tough situations before I went to university. I taught in a school where students really struggled to learn; a lot of them came from very poor backgrounds and had suffered abuse, and it was a very challenging situation. I know that if I had had the kind of AI technologies that we can develop now, and are developing, it would have helped me understand so much more about where they were struggling and how I could help them. I could have been a much better teacher.

I think there are huge possibilities because of the way we can capture and analyze so much data about people as they interact with the world. That data can help us understand so much more about their learning and therefore help teachers understand more about where students are having problems. That really helps to enhance the job. I know as an educator that it could make my work much more effective and therefore make me feel that I am doing a much better job.

O: What can you tell us about The Institute for Ethical Artificial Intelligence in Education?

L: There are various initiatives trying to address those risks, but we felt there wasn’t enough being done to look specifically at education, because education is a special case that rarely gets enough attention. We have well-established ethics for medicine and health, but we don’t have the same thing for education, and we should; we need it increasingly now. That’s why we launched the Institute for Ethical Artificial Intelligence in Education in the UK in October. We were worried that nobody was looking specifically at education. We’re trying to do that now.

We’ll produce an interim report at the end of next year on how we believe regulation might look and the kinds of guidelines that could be put in place. Then we will produce a final report in 2020. We’ll be looking globally, trying to identify places where effective systems are being developed. We will be working with companies that develop AI to make sure we understand where they’re coming from, but also that they understand that regulation needs to be put in place, and hopefully we can come up with something that’s effective.