Transcript for Episode 2: The AI future is here and what this means for education
Welcome to the Edspresso series from the New South Wales Department of Education. These short podcasts are part of the Education for a Changing World initiative. Join us as we speak to a range of experts about how emerging technologies such as artificial intelligence are likely to change the world around us and what this might mean for education.
So, why is artificial intelligence such a big deal? How urgently do we need to prepare for it? What might it mean for school education? To learn more, we interviewed Toby Walsh, Professor of Artificial Intelligence at the University of New South Wales. Toby is a leading global expert on artificial intelligence and its ethical implications, and author of the new book titled ‘2062: The World That AI Made’.
In this episode, Toby describes how AI is changing our society, and what this could mean for how education prepares young people for this future.
My name is Toby Walsh, I'm a Professor of Artificial Intelligence at the University of New South Wales here in Sydney. Artificial intelligence is many different things, just like our intelligence is many different things. At the moment it is a bunch of different technologies. We have things like machine learning and Bayesian reasoning. It's not one thing - that's a mistake people often make, they think it is just one thing. It's actually a collection of different tools and techniques that solve a collection of different problems.
It's hard to think of an aspect of our lives that artificial intelligence won't touch. It certainly is going to change the way that we work, and it's going to change the way that we play. It has been called the Fourth Industrial Revolution because it is hopefully going to take away all the dull, dirty, repetitive, dangerous things in our lives, and free us up to do the things that are more important for us as humans.
So, it could be a second renaissance, a moment where we get to sit back and let the machines take the sweat, and we get to do the important things. Equally, it could be something that increases divisions within our society and puts a lot of people out of work. It will undoubtedly change the nature of work.
Many of the dire predictions about the number of jobs being replaced should, I think, be treated with a pinch of salt. Technology has always created more jobs than it has destroyed. That's not necessarily going to be the case in the future - the future doesn't necessarily repeat the past. But there will be many new jobs created. What the balance is, whether as many jobs get created as destroyed, we don't know.
One thing we do know with pretty great certainty, though, is that the new jobs will require different skills from the old jobs. And so that's a real challenge for education: to think about reimagining what it is that people need to learn. What skills do people need so that they can be prepared for this AI-enabled future?
We need to be preparing urgently for the impact of AI - in fact, we needed to be preparing for it yesterday. We're also seeing a shortage of skills in certain areas, so it's clear that we need to think about what it is that we should be teaching people so that they're prepared for the technologies of the future. And it's actually quite a hard thing to solve, because we need people to be skilled for jobs in 30 years' time, with technologies that we will invent in 20 years' time.
And so this idea that learning is going to have to be lifelong is, I think, central to this: we need to equip people with the skills so that they can then independently, on their own, with the support of their employer, with the support of government and so on, pick up new skills as new technologies get invented.
So, what are the most important thinking skills? I don't think there's one skill that you need above all others. The machines are not very creative, and so creative skills are going to be highly useful in the future. Machines aren't going to take over many of the creative aspects. Even if they could, we probably aren't going to care too much about what machines create, because it won't speak to the things that are important to us.
Equally, our adaptability is really important; machines are still very brittle. If you give a computer a task and you change the task even the smallest amount, it will fall over, whereas humans don't. We're amazingly adaptable and robust. You can parachute someone into a new circumstance and they can tap into what they've learned in the past and apply this to a completely new domain, whereas a computer will just stop and break.
Critical thinking skills are going to be very important in this AI-enabled world. The machines are going to do many of the dull, repetitive jobs. We need to play to our strengths: our creativity, our adaptability, our emotional intelligence, our social intelligence. These are all things that computers don't do very well today, and probably won't be able to do for quite a long time, if ever.
There are risks in outsourcing some of our thinking to machines. We've already seen this: we're probably the last generation that knows how to read a map, because we seem to be handing that over to apps. And that will actually take something away from our lives - just knowing where north and south are, and east and west. Those are actually quite important skills to have, because sometimes the technology doesn't work, and also to understand how the world ticks.
Technologies have done this to us before. Technologies always take things away from us, and then they give us things back. And so the interesting question to ponder is: what will we get back once we start handing over some of these other things to machines? It is very important that schools consider the ethical challenges posed by AI. Many of them actually aren't new challenges - they're about dealing with existing values that we already have.
There is one new value that AI creates, which is the idea of autonomy: that we'll be giving over responsibility to machines to act independently in the world. And that raises really important questions about accountability. We see this already in the discussion around autonomous cars - autonomous cars will, of course, at some point kill people by mistake, and who is going to be held accountable? What sorts of decisions should they make? We also see this in the discussion around autonomous weapons.
It's also wrong to think that all we need to do is have more programmers; in fact, AI is actually going to take away the need for many programmers. The whole idea of machine learning, that computers can learn things themselves, is that they learn the program themselves, and so ultimately we will probably need fewer and fewer programmers. But that doesn't mean we want the technology to be magic. If the technology is magic and people don't understand it, then they're not going to be active participants - they're going to be easily taken advantage of.
And so, people do need to understand the basic ideas of computation. I do think today that, probably, it's more important for people to understand computation than calculus. We used to live in the mechanical world and calculus was the mathematical way to understand and describe that. Well now, we live in a very digital world and you really do need to understand the fundamental ideas of computation, to understand what machines can do and what machines can't do.
I think one of the challenges in trying to navigate the future is that a lot of what people understand about AI is driven by what they see from Hollywood. And so there's a really important role to be played by AI researchers like myself, trying to inform the discussion with schools and with the community as a whole about what the technology can do today.
If you added up everything you read in the newspapers today, you'd believe that we have intelligent machines, or that they're about to arrive any day now. Actually, if you ask most of my colleagues, they'll say it's 50, 100, 200 years away before machines will be fully as capable as humans.
And then there are lots of other questions: machines have no sentience, no consciousness, and it's not clear if they ever will. These are fundamental things that people need to understand about the limits of the technology and how much further we have to go.
Having said that, even the not-very-smart computers we have today can already do useful things in our lives, and that has consequences: consequences for the future of work, and ethical consequences about handing decisions to machines that humans used to make. As researchers, I think we need to help inform that discussion, and schools need to prepare students for the world that's going to be - and it is a world that's going to be full of AI. Ultimately, it's going to be woven into the fabric of all our devices.
If I went back in time to the young Toby and tried to offer him some advice, I'm not sure he would listen - he'd be very sceptical about this time traveller from the future. But the advice I would try to give my sceptical self would be, actually, to spend some more time on the soft skills. I spent a lot of time working very, very hard on my technical skills, on programming and mathematics and so on.
Those have stood me in good stead through my career and through life, but if you want things to happen, you actually have to understand how to work with other people and how to work within society. So I would encourage myself to focus on those softer skills as much as the harder technical skills. As for what advice I'd give to students today: find something that is your passion, something that will drive you through your life, and then work is not work - work is actually play.
Thank you for listening to this episode of the Edspresso series. You can find out more about the Education for a Changing World initiative via the NSW Department of Education's website. There you can sign up to our mailing list or join our conversation on Twitter @education2040.