An interview with Simon Longstaff

Overview

How urgently do we need to prepare for the impact of artificial intelligence on our society? How might school education contribute to technology that serves human interests, and not the other way around?

To find out, Leslie Loble, former Deputy Secretary, External Affairs and Regulation, speaks with Simon Longstaff, Executive Director of the Ethics Centre.

They discuss the importance of ethics in an age of rapid technological advancement and how education can help to develop strong ethical reasoning skills in young people.

Credits: Recording and production by Andrew Piper, editing by Andy Maher, and voiceovers by Sally Kohlmayer. This episode was recorded in 2018. The views expressed in Edspressos are those of the interviewees and do not necessarily represent the views of the NSW Department of Education.


Podcast

Image: Simon Longstaff
Special episode: an interview with Simon Longstaff

Voice-over:

Welcome to the Edspresso series from the New South Wales Department of Education. These short podcasts are part of the Education for a Changing World initiative. Join us as we speak to a range of experts about how emerging technologies such as artificial intelligence are likely to change the world around us, and what this might mean for education.

Narrator:

Could the rise of smart technologies lead to tyranny? What will it mean to be human in an AI world? In this special episode, Leslie Loble, a Deputy Secretary in the New South Wales Department of Education, interviews Dr Simon Longstaff, the Executive Director of the Ethics Centre. Simon is a distinguished philosopher and applied ethicist, and was made an Officer of the Order of Australia in 2013. In this interview, Simon discusses why ethical literacy will be crucial to ensure that AI serves human interests and how schools can develop it in their students.

Leslie Loble:

Simon Longstaff, head of the Ethics Centre, thanks very much for joining us today. You have argued recently that technical mastery on its own, without ethics, can lead even to tyranny, and certainly to some severe consequences. So, given the speed of technological advancement, two questions for you. One, what impact do you think artificial intelligence in particular, as one of those technologies, will have? And the second is, how urgently do we need to prepare for a future of AI?

Simon Longstaff:

Well, I think it's not merely an assertion. I think it's evident in history that when you have a certain kind of approach to technology and its mastery without ethical restraint, it does, in practice, underwrite tyranny. So, I think we have to recognise that first of all, and to understand that we have a choice in this matter. To take the second of your two questions first, about speed: we have no time to ignore this and hope that others will sort it out. We're already sowing the seeds for the kind of society we'll have in the future. And we have a remarkable opportunity, even at this point, to bring ethics into formal and practical consideration and to begin to embed our ethical judgements into the things that we make at the point at which we design them, develop them and release them into the world.

Now, AI, of course, is a particularly potent form of technology which, in alliance with other technologies like robotics and even biology, you can start to see having a radical transformative effect, which I think will give rise, in practice, to civilisational change of a kind we haven't seen for centuries. And there are going to be various forks in the road that open before us as to what we do about that. We could just have a kind of technological determinism and think - well, whatever is going to happen is going to happen. Or we could recognise that our choices in this matter are of profound importance and that we need the technology to be the servant of our interests, rather than in some sense mastering us. For example, I don't want to become just a network element. I want to know that I maintain some of my autonomy. But the practical effects of AI and robotics and nanomanufacturing and all the rest will be a massive displacement of people, if not completely out of formal employment, certainly into radically different forms. It will see the middle classes, as well as people from a broad spectrum of the socio-economic environment, being very much affected by this in a way we haven't seen in a long time. And all of it, whether you think it's going to be wonderful and produce, as it could, some truly transformative effects for the good, or bad, depends on whether or not we bring our minds to bear on this right now. And that's what I think we have to do with AI and other associated technologies.

Leslie Loble:

That's a pretty profound and stimulating statement that we're facing significant impacts, potentially even on what we think of as our civilisation. Do you think it will change what we even think of as human?

Simon Longstaff:

It will, and it might draw us to a point where we recognise what is truly distinctive about our form of being. And this can be one of the benefits of it. When you start to think about AI, I'll give you a simple example. We already have the capacity, through expert systems, for these tools to be far more accurate in their diagnoses of some forms of cancer than any human pathologist could be. So they have much higher accuracy rates, these neural nets that are looking at images and comparing. My contention is, though, that although they can tell whether you or I have cancer, they can never tell us that we do in the same way that a person can, a human being. Why? Because we know what it is to be mortal. We understand that we will die. And so, when we communicate to another person that they face the prospect of death, we can do that with an authenticity that no machine can possibly convey. They may be able to say the words, they may be able to have the tone perfectly well, but they don't know what it means to be mortal. Now, you learn that about us, in a sense, by reflecting on the alternative, what the machine can do - it highlights what's distinctive within us. It highlights, for example, that we live in an environment where we're not bound by instinct or desire as animals are, yet we make choices in conditions of profound uncertainty, where there are circumstances in which, in principle, there is no right answer.

Someone you love comes home, they've just bought a new piece of clothing, or they've had their hair cut, and, "How do I look?" And you look at them and you say to yourself, "Oh, damn," you know? Not great. But you don't say it, despite the fact that you believe in the truth; you may also believe in compassion. And you may believe in those two things with equal weight, they pull you in opposite directions, and the theoretical answer to which matters more is null. Now, a machine faced with a situation where there is no right answer, what's it going to do? We know what we do. We invent something, like going and just giving them a hug rather than answering. Maybe that's the thing. So, that part of us and our form of being that we inhabit as humans, that will become clearer to us as well. So, there's a lot of upside in this. And then, once we understand that about ourselves, we might say, "What do we wish to retain in terms of our exercise of judgement and agency, and not hand over to a machine, even if it would be more efficient?" But we can't have those conversations until we understand these issues.

Leslie Loble:

So, if we think about that in terms of school education, and also in the technological context, does that suggest that, alongside things like computational thinking - so that students understand the construction of that technology and what it can and can't do - we should also be emphasising the things that are uniquely human: the creative side, the critical thinking, those sorts of things?

Simon Longstaff:

Well, I do agree with the general proposition, but within some boundaries. So, I think we all should understand the essential elements of the science behind these things that we're creating. We should also understand that at some point what we have created will exceed our knowledge of its precise operation. This raises profound issues to do with accountability. And they will become, in time, really superior at certain things. And allied to things like nanofabrication, most of the stuff that gets done which is routine, humdrum analysis or manufacturing will be done by these expert systems and their tools, hopefully in the service of human interests. So, we should understand the core of it, but we'll never be able to say that this therefore allows us always to intervene in their operation. Except that we should always have, as Alan Finkel, the Chief Scientist, has proposed, the ability to turn them off. There should always be an off button in whatever these happen to be. So, that might be the limit of our ability to intervene. But what that then says to us is, well, what is it that we need to be able to do to live rich and fulfilling lives, if they're actually not going to be defined by the employment that we might have had formerly, whether we were cleaning windows, driving cars, working in law firms, working as accountants, all of those things? A lot of them - well, most of them - will be done, at least at the most basic level, better than we do them, and at a lower cost, by these machines. So, how do we make ourselves ready for the kind of life that we could live? How do we imagine this life?

Well, we've seen it in the past. We've seen societies like that of ancient Athens, which had a pernicious institution called slavery, but whose citizens had rich and meaningful lives because of their citizenship. So, what do you need from an education system to develop a kind of civic intelligence in which you understand both the obligations and opportunities of citizenship so that you can practically engage? How do you, therefore, debate the issues that really matter to the society in which you live without destroying yourselves because of petty ignorance or jealousy, or because you're locked into an echo chamber of dissenting opinions? Well, you do critical thinking. How do you take the one thing that the machine can't produce, namely, something that bears the mark of a human being? And we see this already. Years ago, they started to put 'hand-cut chips' on menus. Why would you tell someone that it's been cut by a person? Because there's something of value in what's been made by hand. So, artisanal bread and brewers and all the rest. But now maybe the thing that adds value to something produced by a machine is the artistic, creative component that the human being brings to bear immediately with their touch, almost like the thumbprint in the brick that shows that someone actually made it, some other human being.

There'll be issues to do with that question about our mortality and what that means, and what it means to live a vulnerable life. So, in other words, we could go through... I think ethical literacy will become important. All of this looks like an education which is closer in some respects to the idea of a liberal education as it used to be conceived, where one has to be at least introduced to different forms of knowledge. You should understand the nature of them - mathematics, philosophy, the arts - the way we've organised things, but become slightly more adept in them, rather than seeing education, as it has tended to become at least in some parts of the world, as an intensely vocational exercise that prepares you for a certain role in life. If I can put it this way, I think there is a set of skills to be learned in being human. It's not just something you do when you wake up. You learn it, you develop it. And that can be refined through the process of education.

Leslie Loble:

I'm quite interested in that notion that you've referred to before of the Socratic examined life, and that, in fact, education needs to remind itself of the mission it's always had, which is to produce well-rounded contributors to society through citizenship and through work. But I'm also interested in your thoughts about how we develop that agency in students. Some of these things can be very overwhelming. The technology can seem to have a life of its own and almost to be a natural force, which, as you say, it is not - the design is absolutely human-driven, to a point. But young people certainly feel great passion about the issues they're surrounded by. How can we best give them that ability, throughout their lives, to understand complex and changing challenges, and to gain not only some mastery over them, even if it's not a particularly narrow expertise, but some mastery and agency over shaping those forces that will face us across the 21st century and beyond?

Simon Longstaff:

The first thing on that: we shouldn't forget that education has not always been clear that its purpose relates to developing agency. There was a period, certainly after the early stages of the industrial revolution, where education for most people was used as a form of socialisation to make you factory-ready, giving you the discipline, and things like that. So, education itself has to always reflect on its practice, particularly its defining purpose, not just in terms of outcomes, but in terms of what it is good for, what it is made for. But that said, I think that younger people, like anybody, perhaps in any time, want to feel a sense that they're living a meaningful life. They want to feel that it is possible to make a difference. And too often they're exposed to a worldview that tells them to be realistic. I'll confess, I'm not a realist, I'm a pragmatic idealist.

Leslie Loble:

Right.

Simon Longstaff:

I believe that one should be practically oriented, but I don't feel myself circumscribed by the bounds of what others tell me to be possible. I think I'm open to failing, but at least after having made an effort. So, I think there's something exciting about that. It can be terrifying but it's exciting. So, I think education can convey to these young people that a vista of possibilities is opening up in which they can act in the world, in which the decisions they make will literally shape the world. We see this all the time. Students who are inducted into their agency and an understanding of how they impact upon the world - sometimes at the small scale of their friends, and then a slightly larger vista, and a larger vista, and a horizon beyond that - I think they can say, "I now see the point of the education that I'm having. It's not to make me ready to perform a certain task or to secure a particular job. The point of it is to make me ready for my agency in the world and to accept the kind of obligations that come with stewardship as a human being who has this extraordinary capacity to make conscious choices," which, at least as far as we know, is not available to other creatures. Maybe it is, we just don't know. Maybe we will one day know. And therefore, in the face of all of these things that might be happening around them, not merely to accept them as given, but to interrogate them, to shape them and to make sure that they understand the purpose of a good life, so that the things we make are fit for serving those ends.

Leslie Loble:

And if I can just pick up on that - the last two points are sort of shaping it, and shaping it for good, the 'it' being technology in this case. That suggests an expertise as well. You've spoken very articulately about ethical reasoning and a sense of agency, but it also suggests that there needs to be groundedness in the expertise...

Simon Longstaff:

And skilful means. It's not enough to just have good intentions. What flows from that is that you must, therefore, develop the skilful means - the skills themselves, but also the dispositions to use them. And this is why teaching is such a significant profession, particularly in schools, when it comes to ethical questions, because it's not like a case where you teach a skill and whether you use it or not doesn't matter except as a matter of practical utility. When teachers are engaged in educating students, particularly where they bring this ethical component to bear, not only are they saying, "Well, here's how you do it", but they're urging people, "and actually you should use it." So, they're encouraging the development of dispositions that align to the exercise of those skills. And it's this package that we've got to find a way for education to use, and it will be enabled by tools. I mean, you think about virtual reality or augmented reality - you can put people into worlds where their empathetic response can be improved. So, we can use a lot of these things to really extend what we do, as long as we never let them become ends in themselves, but have them serve the purposes we ought to serve.

Leslie Loble:

And so, you've painted a picture quite powerfully that this isn't a binary world - that somehow ethical responses to technology, and some rules around it, are designed to stop creativity, innovation, technological invention. In fact, it's a package that comes together, and if we're going to develop those technologies it's incumbent upon us, as we're seeing almost every day with things like Christchurch and elsewhere, to actually think in the design about a set of principles that lead to wonderful inventions, but that serve good.

Simon Longstaff:

Yup. And embed that, you know, anticipate this. Too often, we let the horse bolt and then try to find it and bring it back in and rein things in. This is not sensible; no sensible person would do that. We're far better off asking ourselves, as far as we can: what's it for? What are the core principles that ought to be embedded in it? People have different ways of doing it; as long as they can give an account of this, that is sufficient. How do we know that that's been done? How do we track evolution and change in the things we've made relative to the circumstances in which they're used? But that we give a little bit of forethought. I mean, going back to where we started - this massive change that's coming, it's possible it can be, as it tends to have been in the past, unjust, in that the burdens disproportionately fall on some. Chaotic, you know, just everything thrown up in the air - this kind of Schumpeterian notion of creative destruction. Or we can say, no, we actually want it to be just. We want a proportionate allocation of the benefits and the burdens of the transition. And we want it to be orderly; we want to put in some plans so that we can manage it, so that there aren't too many shocks here or there, and our institutions can cope with it. And we want to produce people who can do all those things - as I say, the skills, the knowledge, the dispositions to be able to do this. And that's a hugely wonderful opportunity for the education system because, as you say, there's so much good that could be harnessed from this incredible application of knowledge, if only we get those things right.

Leslie Loble:

Right, I think I might end with one question which we ask of all the people whose thoughts we've had the opportunity to gather, and that's to take you back to your own education. If you look back on your education, particularly, let's say, school education - I was going to take you even back to primary school - what advice would you give yourself as a primary school student, in terms of your educational interest, progress, focus?

Simon Longstaff:

It would've been easier if you'd asked me about secondary education. This is a good question about primary school.

Leslie Loble:

You can answer both.

Simon Longstaff:

Well, I will start with secondary school, because there I learned one of the most important lessons of all, which is to attack the argument rather than the person. So, be bold and adventurous in your thinking, but always respect the people, even if you're disagreeing with their ideas or challenging them. That was a hugely important lesson. Primary school for me, I think, was more about being open. I think primary school for me was kind of an adventure of sorts. It was about learning how to get on with people and to navigate what seemed to me to be very great difficulties at the time, but were probably a little bit like a slightly warm crucible - not an absolutely scarifying experience - and that was good. So, that went beyond what happened in the classroom. That was more about learning the foundations.

Actually, the one thing I remember: we used to do these SRA exercises, I think it was, which was a set of graduated reading exercises - reading and comprehension. And I remember doing those and being fascinated by the words that were going on and, you know, little things like what did it mean to be 'stingy'? And then suddenly finding out that it was 'stingy'. So, it was that sense of adventure, I think. That's what I remember. It was that kind of slow progression of getting the foundations right, but having a sense that I was being inducted into something which I could use. And then, eventually, it sort of all just trailed into the rest of life.

Leslie Loble:

Right. And if you were giving advice to, say, a high school student - would you give different advice if you were talking to a group of secondary students today? What might you say?

Simon Longstaff:

I'd say the same thing whether in primary or secondary - there's no such thing as a dumb question.

I'd say, "Ask. Ask. Ask."

Leslie Loble:

Terrific. Thanks very much for your time. It's been a great exchange.

Simon Longstaff:

No, thank you very much.

Leslie Loble:

And I appreciate that you've given us your insights into this.

Simon Longstaff:

Pleasure.

Voice-over:

Thank you for listening to this episode of the Edspresso series. You can find out more about the Education for a Changing World Initiative via the New South Wales Department of Education's website.

Video

Ethical citizenship in the time of AI: Simon Longstaff and Leslie Loble

LESLIE LOBLE:

Simon Longstaff, head of The Ethics Centre, thanks very much for joining us today.

SIMON LONGSTAFF:

Pleasure.

LESLIE LOBLE:

You've said in the past that technological mastery divorced from ethics has quite serious consequences and, indeed, could even lead to tyranny. Can you give us a sense, turning particularly to artificial intelligence, of where you think that's going to lead? And is there an urgency in terms of how we respond?

SIMON LONGSTAFF:

AI, of course, is a particularly potent form of technology which, in alliance with other technologies like robotics and even biology, you can start to see having a radical transformative effect, which I think will give rise, in practice, to civilisational change of a kind that we haven't seen for centuries. And there are going to be various forks in the road that open before us as to what we do about that. We could just have a kind of technological determinism and think, well, whatever's going to happen is going to happen. We could recognise that our choices in this matter are of profound importance, and that we need the technology to be the servant of our interests rather than, in some sense, mastering us. For example, I don't want to become just a network element, I want to know that I maintain some of my autonomy. But the practical effects of AI and robotics and nanomanufacturing and all the rest will be a massive displacement of people if not completely out of formal employment, certainly into radically different forms. It will see the middle classes as well as people from a broad spectrum of the socioeconomic environment being very much affected by this in a way that we haven't seen for a long time. And all of it, whether you think it's going to be wonderful and produce - as it could - some truly transformative effects for the good, or bad, depends on whether or not we bring our minds to bear on this right now. And that's what I think we have to do with AI and other associated technologies.

LESLIE LOBLE:

Right, so that's a pretty profound and stimulating statement that we're facing significant impacts potentially even on what we think of as our civilisation.

SIMON LONGSTAFF:

Yeah.

LESLIE LOBLE:

Do you think it'll change what we even think of as human?

SIMON LONGSTAFF:

It will. And it might draw us to a point where we recognise what is truly distinctive about our form of being. I mean, this can be one of the benefits of it. And when you start to think about AI... I'll give you a really simple example. We already have the capacity through expert systems for these tools to be far more accurate in their diagnosis of some forms of cancer than any human pathologist could be. So they have much higher accuracy rates, these neural nets that are looking at images and comparing. My contention is, though, that although they can tell whether you or I have cancer, they can never tell us that we do in the same way that a person can, a human being. Why? Because we know what it is to be mortal. We understand that we will die. And so when we communicate to another person that they face the prospect of death, we can do that with an authenticity that no machine can possibly convey. They may be able to say the words, they may be able to have the tone perfectly well, but they don't know what it means to be mortal. Now, you learn that about us, in a sense, by reflecting on the alternative - what the machine can do. It highlights what's distinctive within us. It highlights, for example, that we live in an environment where we're not bound by instinct or desire, as animals are, yet we make choices in conditions of profound uncertainty, where there are circumstances in which, in principle, there is no right answer. If someone comes home who you love, they've just bought a new piece of clothing or they've had their hair cut - "How do I look?" And you look at them and you say to yourself, "Oh...damn," you know. Not great. But you don't say it, despite the fact that you believe in the truth; you may also believe in compassion. And you may believe in those two things with equal weight, they pull you in opposite directions, and the theoretical answer to which matters more is null. Now, a machine faced with a situation where there is no right answer, what's it going to do?

We know what we do. We invent something, like going and just giving them a hug rather than answering. You know, maybe that's the thing. So, that part of us and our form of being that we inhabit as humans, that will become clearer to us as well. So there's a lot of upside in this, and then once we understand that about ourselves, we might say, "What do we wish to retain?" in terms of our exercise of judgement and agency, and not hand over to a machine, even if it would be more efficient. But we can't have those conversations until we understand these issues.

LESLIE LOBLE:

Turning to school education, you've described powerfully the need for ethical reasoning, critical thinking and the like. Is there also a need for the technological understanding? How do these algorithms work? How can we master the technology as a force for good if we don't actually understand what sits behind it? And, you know, do you see the combination as being important?

SIMON LONGSTAFF:

I think we all should understand the essential elements of the science behind these things that we're creating. We should also understand that at some point what we have created will exceed our knowledge of its precise operation. Indeed, with artificial intelligence, as you know, we have in this case convolutional networks and all the rest; they will work, but for reasons we may not fully understand. So this raises profound issues to do with accountability. And they will become, in time, really superior at certain things. And allied to things like nanofabrication, most of the stuff that gets done which is routine, humdrum analysis, or manufacturing, will be done by these expert systems, and their tools, hopefully in the service of human interests. So we should understand the core of it, but we'll never be able to say that this, therefore, allows us always to intervene in their operation. Except that we should always have - as Alan Finkel, the Chief Scientist, has proposed - the ability to turn them off. There should always be an off button in whatever these happen to be. So that might be the limit of our ability to intervene, but what that then says to us is, well, what is it that we need to be able to do to live rich and fulfilling lives if they're not actually going to be defined by the employment that we might have had formerly? Whether we were cleaning windows, driving cars, working in law firms, working as accountants, all of those things, a lot of them will be...well, most of the things will be done, at least at the most basic level, better than we do them, and at a lower cost by these machines. So how do we make ourselves ready for the kind of life that we could live? How do we imagine this life? Well, we've seen it in the past. We've seen societies like that of ancient Athens, which had a pernicious institution called slavery, but whose citizens had rich and meaningful lives because of their citizenship. So, what do you need from an education system to develop a kind of civic intelligence in which you understand both the obligations and opportunities of citizenship so that you can practically engage? How do you, therefore, debate the issues that really matter to the society in which you live without destroying yourselves because of petty ignorance or jealousy, or because you're locked into an echo chamber of dissenting opinions? Well, you do critical thinking. How do you take the one thing that the machine can't produce, namely, something that bears the mark of a human being? And you see this already. I mean, you know, years ago, they started to put on menus, "Hand-cut chips". Why would you tell someone that it's been cut by a person? Because there's something of value in what's been made by hand. So, artisanal bread and brewers and all the rest. But now maybe the thing that adds value to something produced by machine is the artistic creative component that the human being brings to bear immediately, with their touch, almost like the thumbprint in the brick that shows that someone actually made it, some other human being. There'll be issues to do with that question about our mortality and what that means. And what does it mean to live a vulnerable life? So, in other words, I mean, we could go through... I think ethical literacy will become important. All of this looks like an education which is closer in some respects to the ideal of a liberal education, as it used to be conceived, where one has to be at least introduced to different forms of knowledge.
You should understand the nature of them - mathematics, philosophy, the arts - the way we've organised things, but, I think, become slightly more adept in them. Rather than seeing education, as it has tended to become, at least in some parts of the world, as an intensely vocational exercise that prepares you for a certain role in life. I think... if I could put it this way, I think there is a skill, or a set of skills, to be learned in being human. It's not just something that you do, and you wake up one day... You learn it, you develop it, and that can be refined through the process of education.

LESLIE LOBLE:

You've talked before about Socrates and that notion of the examined life. In terms of school education, what does that lead you to think we need to be doing to give students that sense of agency over their lives, and the ability to understand and shape where their lives are going?

SIMON LONGSTAFF:

I think that younger people... They want, like anybody, I think perhaps in any time, they want to feel a sense that they're living a meaningful life. They want to feel that it is possible to make a difference, and too often they're exposed to a worldview that tells them to be realistic. I confess, I'm not a realist, I'm a pragmatic idealist.

LESLIE LOBLE:

Right.

SIMON LONGSTAFF:

I believe that one should be practically oriented, but I don't feel myself circumscribed by the bounds of what others tell me to be possible. I think I'm open to failing, but at least after having made an effort. So I think there's something exciting about that. It can be terrifying but it's exciting. So, I think education can convey to these young people that a vista of possibilities is opening up in which they can act in the world, in which the decisions they make will literally shape the world. I mean, we can carve this off to a certain degree, but we need to recognise that, at least at this point in history, when anybody looks around them and they look at the architecture, the technologies, the institutions - everything around us, limited only by the laws of nature, is a product of human choice. I mean, the Pyramids could have been vast cubes on the deserts of Giza. Someone chose that they had to be pyramidal. They could have still been magnificent. And here in Sydney, where we are, the Opera House could have been a neoclassical building on Bennelong Point, but someone chose to have that extraordinary building. So, we see this all the time. Students who are inducted into their agency and an understanding of how they impact upon the world - if they see it sometimes at the small scale of their friends, and then a slightly larger vista and a larger vista and a horizon beyond that - I think they can say, "I now see the point of the education that I'm having." It's not to make me ready to perform a certain task or to secure a particular job. The point of it is to make me ready for my agency in the world and to accept the kind of obligations that come with stewardship as a human being who has this extraordinary capacity to make conscious choices, which, at least as far as we know, is not available to other creatures. Maybe it is, we just don't know. Maybe we will one day know. And therefore, in the face of all of these things that might be happening around them, not merely to accept them as given but to interrogate them, to shape them, and to make sure that they understand the purpose of a good life so that the things we make are fit for serving those ends.

LESLIE LOBLE:

Thanks very much for your time. It's been a great exchange.

SIMON LONGSTAFF:

Thank you very much.

LESLIE LOBLE:

And I appreciate that you've given us your insights into this.

SIMON LONGSTAFF:

Pleasure.

LESLIE LOBLE:

Thanks.

Category:

  • Teaching and learning

Business Unit:

  • Centre for Education Statistics and Evaluation