Google – adaptive learning, AI for everyone

In this workshop, Google for Education demonstrates how AI technology can save teachers time and create unique learning paths for students.

Join us for a group lab session exploring Google's adaptive learning technologies that are built for personalisation and collaboration. Explore Practice Sets, an AI-driven feedback system, and Screencast, our powerful Chromebook screencasting app with inbuilt text-to-video editing and automatic translation. We will give you space to get hands-on with our suite of tools that work together to transform teaching and learning, so that every student and educator can pursue their personal potential.

Watch 'Google – adaptive learning, AI for everyone' (44:54)

Kimberley Hall and Steve Smith

(Duration: 44 minutes 54 seconds)

[Music]

[Red and blue logo revealed reading ‘STEM 2022 on demand’.]

Kimberley Hall:

Hello and welcome to the Adaptive Learning session with Google for Education. We will introduce ourselves in just a moment, but we're really excited that you are choosing to watch this video whenever and wherever you are.

Steve Smith:

This could be happening in the future.

Kimberley Hall:

Okay, I think it's happening in the future, actually, Steve. Before we jump in, we would like to take a moment to acknowledge the traditional custodians of the land that we are joining you from today, and that you are no doubt watching from. I am in Melbourne, so for me it's the Wurundjeri people of the Kulin nation. I realise that in New South Wales there are different traditional owners in different parts of the land. So, please take a moment to acknowledge the custodians of the land you are on, and remember the amazing job they have done to maintain our fabulous country for such a long time. I am so grateful for that. And my friend Steve is joining us from the other state, the long lost state of Australia. Am I allowed to say that?

Steve Smith:

Nice. Yeah, thanks. The other, other state of Australia. I love it. Just because there's no constitution doesn't mean it's actually real. Kia ora whānau, ngā mihi ki a koutou katoa, ki a Papatūānuku, tēnā koe ki te whare, tēnā koe ki te tūpuna, tēnā koe, tēnā koutou katoa. Tihei mauri ora. So, welcome everyone, if you happen to be watching from New Zealand, from Aotearoa, the land of the long white cloud. Or welcome to you in Australia, or as we like to call it, the West Island, with our North and South Island. So, great that you could spend some time with us today, and hopefully you'll be able to get a lot of cool stuff out of this session.

Kimberley Hall:

Awesome, thanks Steve. And yes, so my name is Kimberley Hall, I am based in Melbourne as I mentioned, and part of the Google for Education team. Really looking at the impact that we can drive around teaching and learning with our tools nationally. And then, because, you know, I was like, "Hey, we should have a fun accent on the video." I was like, "Who's my friend that I know who has a cool accent?" So it's Steve-

Steve Smith:

That's me. And I know there's a few words I won't say, because you'll make fun of me while we're talking. So we'll just try and do our best today.

Kimberley Hall:

Excellent. Okay. So, today we are going to do a little bit of a walkthrough of different products and tools that are really harnessing some of the best of the technology that Google has across the breadth of Google tools, so not just exclusively in education. What we're actually starting to see is this really exciting shift to bringing some of that amazing technology into the core education tools, to provide some incredible insights and different ways of learning, and hopefully impact teaching and learning in a really positive way. Saving teachers time, creating different learning paths for students, all of that good stuff that we believe is on the horizon in education. But Steve, before we jump into those specifics-

Steve Smith:

Yeah, so I guess we kind of kick it off by talking about what AI is. And as you say, at Google we have a tonne of amazingly smart people working on AI, and how we can bring what they're doing to education to help, as you say. So, AI is the science of making things smart. And machine learning is a technique used to develop AI, if you like, and give it its smarts. These techniques allow us to process large amounts of data and make it useful, and today we can do that at a scale previously unheard of. And, as I said, we've brought those capabilities into educational tools to help teachers do what they do best, and that's teach kids. So, artificial intelligence is the main topic for our session, and you may have heard it referred to as AI. AI is pretty much teaching machines to appear intelligent. It's artificial intelligence. They can appear intelligent based on the tasks they do, or by being able to see patterns in data, for instance.

Kimberley Hall:

Awesome. And so a few examples. So, one of the things we might want to teach a machine to do is see, for example. We would call this computer vision, and an example of this would be social media filters. Another thing might be to teach a machine to read or listen. This is called natural language processing, and it's used in the Google Assistant and Google Search, which I use all the time. I don't know what I did before I could say, "Hey Google, tell me something that I need to know." Oh, now I've set off all my devices. Shh! No, Google. So a combination of- Oh sorry. It is answering on one of my devices. A combination of seeing, reading and listening allows us to teach machines things like suggesting the most relevant news articles to different users. So, our next definition is machine learning, or ML for short. This is the process of teaching AI what it knows. It is when systems learn to find patterns in examples in order to make predictions. We make an AI which knows nothing, then we give it a data set, and the data set is all the information we want the AI to learn about. If we wanted to recognise certain animals, our data set would have lots of animal pictures in it, for example. Then we add labels to our data set, so cat images are labelled as cats and dog images are labelled as dogs. So, we teach the machine, we test the machine, and that's the basics of AI and machine learning.
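For readers who want to see that "label, train, test" loop in code, here is a minimal sketch using the scikit-learn library on made-up numbers. It is purely illustrative of supervised machine learning in general; the feature names and values are hypothetical, and this is not how any of the Google tools discussed in this session are built.

```python
# A tiny supervised learning example: label the data, train the model, test it.
# The features (ear pointiness, snout length) and values are made up for illustration.
from sklearn.tree import DecisionTreeClassifier

examples = [
    [0.9, 0.2], [0.8, 0.3], [0.7, 0.1],   # cats: pointier ears, shorter snouts
    [0.3, 0.8], [0.2, 0.9], [0.4, 0.7],   # dogs: rounder ears, longer snouts
]
labels = ["cat", "cat", "cat", "dog", "dog", "dog"]  # the labels added to the data set

model = DecisionTreeClassifier()
model.fit(examples, labels)               # "teach the machine" on labelled examples

print(model.predict([[0.85, 0.25]]))      # "test the machine" -> ['cat']
print(model.predict([[0.25, 0.85]]))      # -> ['dog']
```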

Steve Smith:

Nice, that's a great explanation. And we have a bunch of AI principles that we work to. On the slide there you can see ai.google/principles, and you can dig into them and see those in detail. The seven main ones are up on the slide. They should be socially beneficial, they should do some good. They should avoid creating or reinforcing bias, so if there is an existing bias in the data, the AI should not reinforce it. Be built and tested for safety. And I think the most important bit is number four, that it's accountable to people. So, it's not machines training machines, à la The Terminator. There are always people that are looking after that data set. We need to incorporate privacy design principles, uphold high standards of scientific excellence, and be made available for uses that align with all these principles. So basically, if you think about education, we want to do all those things from one to six and then make that AI available for educators wherever it benefits them best.

Kimberley Hall:

Yeah, awesome. And we're really excited about some of the things that have already been around for a while that schools have been utilising, but also getting to share a few of our newer tools in this space. And as I briefly mentioned at the start, we know that this is being recorded and watched at a time that suits you, but we would love for this to be as interactive as possible given those constraints. So, we are going to demonstrate some different tools, talk about them a little bit, and then what we are encouraging you to do is to actually pause the video, the miracle of technology, and go and have a hands-on play yourself. Ideally you'll have a play and then come back to the video and un-pause us. However, some of the tools are pretty cool, so we do understand that you might go down a rabbit hole of just doing that for the rest of the session.

Steve Smith:

They may come back in an hour or so, perhaps, once-

Kimberley Hall:

That's right, whenever you come back that is great. Thank you for making the time to rejoin us. So, we are going to go through a bit of a journey of some of the tools that have already been around for a while, share our favourites in those regards and then as I said, go into some newer ones. So, we're going to kick off with a fun tool called Quick Draw. And if I click this button here, let me see if I can play the video for you.

Yonas:

Hi, I'm Yonas.

Henry:

Hi, I'm Henry.

Yonas:

Quick Draw is a game a few of us at Google made. You draw and the computer uses machine learning to guess what you're drawing.

Computer voice:

I see square, or suitcase, or canoe. Oh, I know, it's shoe.

Yonas:

It's an experiment that uses some of the same technologies that help Google Translate recognise your handwriting.

Henry:

To understand handwriting or drawings, you don't just look at what the person drew, you look at how they actually drew it. Which strokes did they make first, which direction did they draw them in? You train the computer on millions of characters from hundreds of languages and over time it learns whether you wrote look or whether you wrote book.

Yonas:

Training is a big part of how the computer can guess your drawings correctly. As people, it's easy for us to look at these three drawings and know they're all cats, but to a computer, they're very different. One is just a head, one has a full body and one is just facial features. To get the computer to understand, you have to show it a lot of cat doodles, and then it starts to see patterns. Like the fact that almost all doodles of cats have pointy ears, a small nose and whiskers. Of course it doesn't always work; that's because it's only seen a few thousand doodles, but the more you play with it, the more it will learn and the better it will get at guessing.

Computer voice:

Oh, I know, it's cat.

Yonas:

We put it on the web for anyone to play with. We hope it inspires other people to think about fun ways to use machine learning. You can play it at g.co/aiexperiments.

Kimberley Hall:

Now that link is actually old, we're going to give you a different link for that, but I know Steve and I have wasted many hours of our lives doing Quick Draw since it was first released. When I saw this video for the very first time, Steve shared it with me, and I loved the analogy of not just showing it the actual end result but the process of the drawing. And I think that's such a key thing, linking back to what we're doing with AI and ML. We have to train it around what we want to achieve, but I also think there's a really nice link there into what we actually want to do with education too. We don't want to just recognise kids for the end result of their work. We actually want to work out how we can better reward and recognise the process and the journey they've taken to get to that end result as well.

Steve Smith:

And I love the fact that when you play with it, it shows you your six pictures at the end and says we're actually storing this in memory so we can make it better. So, it has that nice overlap with, you know, this is what AI is but also this is what machine learning is. It's a really nice way to show machine learning. So this is the link, isn't it, where people can go and play.

Kimberley Hall:

Absolutely. So, if you would like to pause the video, feel free to do so right now. I mean, personally I don't remember the web address for basically anything; google.com is basically the only website I know, I think, I just Google everything. But if you would like to type in the web address yourself you can, so quickdraw.withgoogle.com, and have a play. You'll obviously either need to be really good at drawing with your mouse cursor, which is a proper challenge in life, or ideally you would have some sort of touch-enabled device to play Quick Draw.

Steve Smith:

Oh definitely. I mean, I'm lucky that my Chromebook has a touch screen and a stylus, so playing this is heaps easier than doing it with a mouse, but you can still do it. So give it a go, see how you go, and come back.

Kimberley Hall:

So we're going to pause ourselves just for three seconds to give you an obvious time to pause us and go and play with Quick Draw. Okay, I feel like that was sufficient wait time.

Steve Smith:

That was... We're back. Hey, welcome back.

Kimberley Hall:

Welcome back. I hope you enjoyed playing Quick Draw. Okay, let's talk about our next AI powered tool.

Steve Smith:

Sweet. So, this one is called Teachable Machine and the link is down the bottom there. Don't go there yet, wait until we tell you to pause and then you can go over and have a play. And as I said, thank you for coming back after playing with Quick Draw. So, Teachable Machine is a really great tool that allows you to actually teach a machine. It does what it says on the tin. When we have a look at this little video, you'll see that you give it some information: you give it information about one thing, you give it information about something else, and then it trains the model and you can test the model. So, let's play the video, we'll have a little look at it, and we'll come back and I'll show you what it looks like on the next page.

Voice over:

People are training computers and creating machine learning models to explore all kinds of new ideas. New ways to interact.

Computer voice:

Yes.

Voice over:

To understand the world, to play and experiment. But machine learning is pretty intimidating to get into. So, we've been wondering, what if it wasn't? With Teachable Machine, we set out to make it easier for anyone to create machine learning models, without needing to write any machine learning code. When it first launched in 2017, it allowed everyone to get a feeling for what machine learning is all about. But now Teachable Machine puts the power of machine learning in your hands. Allowing you to save your models, and use them in your own projects.

Voice over 2:

So let's say you want to build a model to recognise you versus your dog. You just open up the site, record samples of you and samples of your dog, click train, and you instantly have your own machine learning model, which you can use in your sites, apps and more. You can upload your model to host it online or download it to work entirely on device.

Voice over:

With Teachable Machine, you can create custom models for all sorts of things, using images, audio, or even poses. Personalised machine learning models for the things that matter to you. And folks have already been trying it out, using Teachable Machine in their own experiments, solving problems in their communities, or even just at home. Start creating, and see where your ideas take you at g.co/teachablemachine.

Kimberley Hall:

I did not click, I did not do my double click quickly enough then Steve.

Steve Smith:

No you didn't.

Kimberley Hall:

And then I did like a quadruple click I think.

Steve Smith:

So, on that video you saw how the machine works, basically. This is a screenshot of how you can train your model. You've got class one, so give it a name, say dog. Class two, give it a name, say cat. And then what you do is you upload a whole load of pictures of dogs into the dog class, and then upload a whole load of pictures into the cat class. Then you click train the model and it trains the model; it takes all those pictures and figures it out. Then you can preview it: it opens your webcam and you can hold up, well, I guess you could hold up a cat or a dog if you have one, or on your phone, for instance, grab a picture of a cat you've downloaded, a different one, and hold it up to the camera, and it will say yes, dog or cat. So, it's a really cool way to do it. Then you saw there were some people in that last video who had done things like connecting it to Arduino controllers for opening doors, or there was a guy who, when he hummed, it would recognise the hum and then switch on a light. So, you can use this really simple Teachable Machine to start talking about AI, then you can actually build this up much bigger, export the model and use things like Arduino controllers to make it do stuff as well. So what we'll do is we'll do our little pause again, and feel free to jump out to teachablemachine.withgoogle.com and give it a go. It's a great little thing to use and a lot of fun. And again, it's a way to show students how this weird thing called AI actually works. So I think we'll do a little pause there, shall we, Kimberley? And then-

Kimberley Hall:

Yeah

Steve Smith:

Come back for the next tool.

Kimberley Hall:

Three, two, one. Welcome back. I really hope you did enjoy having a play with Teachable Machine, if you did pause us to go and have a play just now. As Steve was saying, I think it's such a great entry point into making this sort of new way of thinking really accessible for, you know, youth through to adults. And it's so fabulous that you can build it just to experiment and play, or actually use it. So if you built something, see if you can export it and think about where you might want to actually place it in your life.
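If you do get as far as exporting a model, here is a rough sketch of what using it might look like in Python, assuming you chose Teachable Machine's TensorFlow/Keras export. The file names (keras_model.h5, labels.txt), the 224x224 input size, the [-1, 1] pixel scaling and the test image my_photo.jpg are based on the usual export convention and are otherwise assumptions, so check them against the code snippet your own export provides.

```python
# A rough sketch of loading an image model exported from Teachable Machine
# (Export model -> TensorFlow -> Keras). File names and preprocessing below
# follow the common export convention; verify them against your own download.
import numpy as np
from PIL import Image
from tensorflow.keras.models import load_model

model = load_model("keras_model.h5", compile=False)           # the exported model
class_names = [line.strip() for line in open("labels.txt")]   # e.g. "0 dog", "1 cat"

# Load a test photo and preprocess it the way the export expects.
image = Image.open("my_photo.jpg").convert("RGB").resize((224, 224))
data = np.asarray(image, dtype=np.float32)
data = (data / 127.5) - 1.0                  # scale pixel values to [-1, 1]
data = np.expand_dims(data, axis=0)          # shape (1, 224, 224, 3)

prediction = model.predict(data)             # one probability per class
best = int(np.argmax(prediction))
print(class_names[best], float(prediction[0][best]))
```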

All right, we're going to move on to our next awesome tool now. And this is some experiments that live within Arts and Culture. Now, if you've never been to Arts and Culture, if you Google 'Arts and Culture', lots of Googles in that sentence, you will find it. And if you have done that right now, I guarantee you'll stop listening to Chris, to Steve and me, sorry Steve, I keep calling you Chris, because Arts and Culture is probably, it's in my top five things we've ever built, I think. It is just an amazing curation and collection of wonders from around the world. So, we could do an entire, you know, four hours on that, but we won't.

What we do want to hone in on within Arts and Culture is something called Experiments. And within our Experiments section we actually have the ability to filter the collection for AI-specific experiments. And if you go in there, what you'll actually find is that Quick Draw is listed in the AI experiments within Arts and Culture. But there are a whole lot of others, and I guess they're kind of just ideas that people have had. Like, hey, I wonder if we could get a machine to give us feedback on X. So, one of the ones that's a bit weird, wonderful and quirky in there is, I think it's called the FreddieMeter. It's literally an AI-powered singing challenge that rates how closely your singing matches the voice of Freddie Mercury. I mean, I can see why that's an important thing to do, but it's also just these ideas of 'I wonder if', and then people actually starting to code them, trying to train a machine to see if it is possible to interact with a machine in that particular way. And there are some really, really fabulous things in there to have a play and explore with as well. So, Teachable Machine you'll actually find in there too. I don't actually know how many there are, but there are lots of experiments.

Steve Smith:

I think you make a really good point Kimberley. I do say to people whenever I introduce Arts and Culture, if you haven't seen it, make yourself comfortable on the couch, get a really big coffee or a big tea or a big water and just expect to lose a couple of hours because there is an amazing amount of stuff in there.

Kimberley Hall:

There is, absolutely. So, the tiny URL to go directly to the Experiments section of Arts and Culture is on the screen. Or alternatively, if you are like me, just Google 'Arts and Culture experiments' and go there. And then you will see experiments of all different types, and at the top you do have the ability to filter by collection, so the type of experiment, and there is a specific section on AI experiments. There are also AR experiments and a whole lot of other exciting things in there. So what we might do, Steve, is just take a brief pause

Steve Smith:

Yes.

Kimberley Hall:

To allow people who would like to pause the video here at a logical point to go and have a play with Arts and Culture experiments. And we'll come back in three, two, one.

Steve Smith:

Hey we're back.

Kimberley Hall:

Wow, welcome back. Wow, we missed you guys, you were gone for so long. If you did pause the video and go and have a little bit of a play with Arts and Culture experiments, we hope you found some things that will probably suck a lot of your spare time away from you. So, apologies for that. But also some really cool things to check out with students in your classrooms.

Okay, so we're going to pivot from some of these things that have been around for a little bit longer and talk about something slightly different, on roughly the same topic though, Steve.

Steve Smith:

Yeah, there you go. So, it's that idea of bringing AI to learning I guess, or to education. So yeah-

Kimberley Hall:

Yes.

Steve Smith:

So, I guess we've used this term adaptive learning, or personalised learning, or student agency, or learner agency, or whatever you kind of want to call the idea of tailoring learning to the student that's involved in it. The idea of learning with the student, not learning at the student or doing learning to them; it's for them, basically. So, we're looking at the way that we can support learners by providing personalised resources or activities that really address that individual student's needs. And the real challenge, of course, is for one teacher to be able to personalise learning to the needs of 20 or 30 learners, which is, you know, next to impossible without hours and hours and hours of work. And so we're trying to figure out how we can help educators with some really cool tools that might help them to do a little bit of personalisation as they go.

And we've had quite a few tools over the last few years that have helped to do that. One of the early ones was, or is, 'cause it's still a thing, Socratic, also known as Homework Helper. And it's a little tool people can use to find information if they get stuck. So, you may be familiar with it, you may not have seen it. If you haven't, do a quick search for it and find it. So, it's the original Homework Helper that was built, and then we built these abilities into Google Lens. So, inside Lens you can actually click it, scan a homework problem, say an algebra problem, and it's going to give you steps on how you can do it. So, the ability to give you feedback where you want it. Next came a really cool tool called Read Along, which we'll have a look at in a little bit of depth, which gives you an AI-powered reading tutor.

So again, it lets kids learn when they need it. And what we've just announced recently is an amazing tool called Practice Sets, which we'll have a little bit of a look at later on. So those are kind of the four things we are having a look at and we'll look at those in a bit of depth as we go. It's the ability to personalise resources and activities to address a specific learner's needs.

Kimberley Hall:

Yeah, absolutely. And also at scale, because as you said before, that is the big challenge for teachers: how do we do this at scale, not just for one student? 'Cause we definitely want to make sure that we are personalising the experience for every young person who is in our care. So before we jump into these though, we also do want to tell you about another really cool tool which has just launched on our Chromebooks. And just before we do that, we thought, hmm, we wonder if everybody who's watching this will know what a Chromebook is and how it works. So we're just going to do a quick 'what is a Chromebook?'

Steve Smith:

Oh!

Kimberley Hall:

So the Chrome operating system is basically completely cloud first. Everything is running completely in the cloud, and we could do another half an hour on what the cloud is. I'd probably have to bring someone else in, though, to talk about that. But the concept is that, with everything in the cloud and the way that a Chromebook works, for me personally the biggest thing that made a difference when I started working in ChromeOS, or on a Chromebook, was this concept that it didn't really matter.

Obviously I want to take care of my Chromebook and I like my Chromebook, but if I don't have my actual physical device that belongs to me with me, I can log into any other Chromebook anywhere in the world and access every single thing that I need to. It'll have all my personalised settings, all of the security things that we need to have. Anything like that will follow me wherever I go, because we are doing everything in the cloud first. And I think this is a really interesting mindset shift in the way that we've been working in education, because I like the idea that being in the cloud actually lends itself to collaboration and community and connection, as opposed to 'I've saved it on my local hard drive, and so if I'm not at school, none of you have access to whatever this resource is that we're working on.' I think it's such a great way of working in a school as well.

Steve Smith:

Or that terrible thing where you're on the way home and your drink bottle tips over in your backpack and, whoops, I've just fried my laptop. And if you had everything on there, I'm sorry, just say bye-bye to all your work.

Kimberley Hall:

Yeah, I think probably every teacher has had that, or maybe not the new young teachers, 'cause most things are cloud based now, but most of us who are older as teachers, I feel like that was in my first couple of years of teaching. I might have a bit of PTSD still from it. So just quickly, a final thing on Chromebooks, and then we're going to talk about the new tool that's specifically built for Chromebooks in this space. If you haven't seen a Chromebook for a couple of years, when they first came to our region they really were just what we call clamshells. So, the chunkier, not sure that's the official marketing term for them, chunkier, smaller sort of device that was more robust. So you know, I can drop it from my table, I can tip my bottle of water over it, and it's designed to withstand that.

But we now have a full spectrum of Chromebook devices available, from lots of great clamshell devices that are really versatile and functional for school devices, all the way through to really high-end devices. And we actually just announced recently a whole new fleet of Chromebooks specifically designed for gaming and the sort of technology that needs, which I'm not the expert in, I will say. So, it's one of those things where, when I said gaming, I was hoping, Steve, you might step in and start talking about some of the technology, but it's fine.

Steve Smith:

They have much bigger processors, better graphics cards, and the screen refresh rate is a lot faster, so they're built specifically for gaming. So yeah, it's a huge change from what people thought a Chromebook was.

Kimberley Hall:

Absolutely. And as Steve referenced, both of us are working on Chromebooks right now. We use a Chromebook day in, day out; it's the only device I use. And you know, we've got full touch, we've got front and rear-facing cameras, so you've got your world-facing camera, so I can flip my Chromebook over and video things. I've got my stylus, I can do all of those things. But my actual favourite thing about Chromebooks, I will tell you, is the speed at which they boot up. Because I have had a previous life where I had a device that I used to have to turn on, and then I would go and make myself a cup of tea, and then I would come back and it would still be running some of its updates, and I may or may not actually be able to start doing any work after several minutes had passed. But Chromebooks boot so incredibly quickly, and ChromeOS also runs with two copies of the operating system, which means all of those updates and things happen in the background.

So you don't ever have a loss of productivity time, which is probably my personal favourite thing about it as a device. Now, all of that is good, and once again I feel like I keep saying we could talk about this for another four hours, because it's actually quite legitimately true. But we want to talk about a tool that has just been recently released specifically for Chromebooks. It's utilising some of this amazing AI and ML technology, and that's Screencast. So, the concept of screencasting is not new. I think after surviving the last few years of lots of remote or hybrid learning, many, many educators globally became very good at screencasting as a way of disseminating information or getting insights into what kids are doing. But our Screencast tool is utilising a couple of features that we're really excited about, and so we're just going to show you a quick video of what that looks like.

Voice over:

The best lessons are those we take with us, and now ChromeOS is helping to make this easier. With Screencast, teachers can record instructional videos or live classes with built-in transcription and translation. Students can also use Screencast to record video reports and presentations, or to access lessons remotely. All recordings are automatically uploaded to Google Drive. Here's how it works. To record a lesson, open the Screencast app on your Chromebook and click new screencast. This records the speaker's face and voice along with the content on screen. The marker tool allows you to point to and highlight specific parts of your screen while you are presenting. Once the recording is stopped, Screencast transcribes spoken words into text. Both the video and transcript are automatically uploaded to Google Drive for easy sharing within your school domain. When you open the recording, you will see the video on the left and an auto-generated, editable transcript on the right. You can also trim the video simply by clicking skip section, and you can choose to go back and undo these changes later. By clicking the share button you'll receive a unique video link to share with your students. With this link, students can open the recording on their Chromebooks. They can use the transcript to locate and re-watch key portions of the video, and translate the text into their language of choice. Anyone at your school can use Screencast. All you need is a Chromebook and something to say.
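The automatic transcription described above happens inside Screencast itself, but the underlying idea, turning recorded speech into text with a speech-to-text model, can be sketched in a few lines. The example below is a generic illustration using the Google Cloud Speech-to-Text Python client on a hypothetical audio.wav file; it is not how the Screencast app is implemented, and the language code and audio settings are assumptions to adjust for your own audio.

```python
# Generic speech-to-text sketch (not the Screencast implementation):
# send a short WAV recording to Cloud Speech-to-Text and print the transcript.
# Assumes the google-cloud-speech package is installed and credentials are set.
from google.cloud import speech

client = speech.SpeechClient()

with open("audio.wav", "rb") as f:          # hypothetical recording
    content = f.read()

audio = speech.RecognitionAudio(content=content)
config = speech.RecognitionConfig(
    encoding=speech.RecognitionConfig.AudioEncoding.LINEAR16,
    sample_rate_hertz=16000,                # match your recording's sample rate
    language_code="en-AU",                  # assumed language
)

response = client.recognize(config=config, audio=audio)
for result in response.results:
    print(result.alternatives[0].transcript)
```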

Kimberley Hall:

Awesome. So, a couple of things that I particularly like about our Screencast tool. Obviously the automatic transcription is a huge time saver, and it comes back to what we were talking about at the start, this idea of personalising learning and making sure that we are offering opportunities for every single learner to engage. Having a transcript there can make a huge difference to learners who might have auditory processing challenges, being able to read along, and being able to click through or search for a key term. You know, everybody is a bit time poor nowadays, so rather than watching the whole five-minute video thinking, where was it that Steve said that word, that key term that I need to revise again, now I can search for it, click on it, and it will actually jump me to that point in the screencast. And all of that, I was going to say, happens automagically; I'm pretty sure it happens through AI and ML. But you know, for me as a teacher, it's not that I have to spend time building in an extra layer of this to make it as accessible and inclusive for my students as possible. It's just the default within the technology, which is really exciting.

Steve Smith:

Yeah, totally. And I think you make a good point there that it is really easy to use, and it's a great way for students to show their learning as well. They can grab an image, write all over it, record their voice and hand it in. It's a different way that they can show their learning as well, and often much more accessible for people who don't want to spend a lot of time writing. If you can just scribble on something, make a whole lot of notes and chat about it, it's heaps easier to show your learning that way.

Kimberley Hall:

A hundred per cent, and it uses a completely different cognitive process, that visible learning, having to talk through why I made the decision to include or exclude certain things. Okay, should we jump over and talk about our other new exciting AI-powered tool, Steve?

Steve Smith:

Yeah, definitely. The other one that we announced recently is Practice Sets. This is something that sits inside Classroom; it's the first iteration of what we're calling a smart worksheet. The way it works is that you can put some content onto it, it will allow the student a space to show their working, and, as a teacher, you put in the answer you want. So it will scan what the student puts in and mark it for you. So, self-marking, top tip there for teachers. And also, if the student's having trouble, they can click a little bubble and it will give them a help video or a help card to show them how to actually work through that problem. So, the students can go to the work and, if they get stuck, you know the old 'ask three before me', the first one can actually be the AI-powered assistant in this: click on something and it's going to find some content for them. It's going to mark it for you, and it's going to give you a whole lot of feedback on how the students are going. So, I think we have a little video on this one as well.

Kimberley Hall:

We do.

Voice over:

With all the typical question types like multiple choice and short answer, Practice Sets also have a built-in math keyboard, and use our adaptive learning technology to enable equivalent answers for robust auto-grading. Teachers assign Practice Sets in Google Classroom like any other assignment, and students open Practice Sets right from an assignment in Google Classroom. As they work, auto-grading lets them know right away if their solution is correct, which helps build confidence, keeps them engaged and prevents one misunderstanding from impacting the entire set. If they don't get a question on the first try, built-in hints help them along. When a student is unsure of where to start, they can open a resources section, which provides recommended videos and concept cards for each learning skill. Additional features include a designated space to show their work, allowing them to demonstrate understanding while giving their teacher visibility into their thinking.
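The 'equivalent answers' point, accepting any algebraically equivalent form of the expected answer rather than an exact string match, is worth a small illustration. The sketch below uses the sympy library and made-up expressions to show the general idea; it is not how Practice Sets' auto-grading is actually implemented.

```python
# Illustrative only: checking whether a student's answer is algebraically
# equivalent to the expected answer, so "2(x + 3)" and "2x + 6" both pass.
# This is a generic sketch using the sympy library, not Google's implementation.
from sympy import simplify
from sympy.parsing.sympy_parser import (
    parse_expr, standard_transformations, implicit_multiplication_application,
)

# Allow shorthand like "2x" and "2(x + 3)" by enabling implicit multiplication.
transformations = standard_transformations + (implicit_multiplication_application,)

def is_equivalent(student_answer: str, expected_answer: str) -> bool:
    """Return True if the two expressions simplify to the same thing."""
    student = parse_expr(student_answer, transformations=transformations)
    expected = parse_expr(expected_answer, transformations=transformations)
    return simplify(student - expected) == 0

print(is_equivalent("2(x + 3)", "2x + 6"))          # True
print(is_equivalent("(x - 1)(x + 1)", "x**2 - 1"))  # True
print(is_equivalent("2x + 5", "2x + 6"))            # False
```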

Kimberley Hall:

I think-

Steve Smith:

Sorry, go ahead Kimberley.

Kimberley Hall:

No, no I was just going to try and skip forward on my video. I'm really struggling with this skipping. Sorry Steve.

Steve Smith:

No problem. And the nice bit it highlighted in there, if you have a look at this screenshot here, is what the mark book looks like when you have a Practice Set given out. On the right-hand side you can see the working space that the students used, and you can see how many attempts they made. So, the student at the top made one other attempt at it, and you can also have a look at their working, so you can see all the stuff they've done to get to that point. On the left-hand side you can see the class with the class insights, as we call them. So, I can scan down that and see that question one, everyone was pretty good on. Question two, everyone was pretty good on. Question three, a bit of trouble there. Question four, a lot of trouble. So, maybe I'm going to go back over the section where questions three and four came from, to reinforce the learning, or give them some additional resources they can look at and then come back and reteach. So it does give you a lot of insight into what the students are doing. Again, all of this is done by AI engines in the background to help cut down the amount of time you have to spend as an educator, while also getting some really great information about your students.

Kimberley Hall:

Yeah, absolutely. And you know, the thing I like about this insights dashboard is that we could go into the mark book for a traditional piece of marked work and, you know, scan down and have to analyse the data ourselves. But this is actually taking out one of those steps for us and providing us not just data, but insights into that data, which is then enabling us to take action. So, really fabulous. All right, we have a couple of final things we're going to share with you and then we are going to say goodbye for today.

Steve Smith:

Oh my goodness.

Kimberley Hall:

I know.

Steve Smith:

So we'll have a quick look at originality reports. Now, many of you will be really familiar with our originality reports. And these are something that are very difficult to use. Oh, hang on, no they're not. You've just got to tick a little box and it starts running one. So, the beauty of this is, tick a little box in Classroom, and it runs it for you automagically, to use Kimberley's amazing word there, in the background. You can see on this one it gives you the highlighted sections that are not the student's own work or are not correctly cited or referenced. You can see we've got some information from websites, and we've got some other web matches as well, and also from Google Books.

So, it will scan all of that. It's a really nice tool that does a whole lot of work for you without you having to, you know- I remember when I was teaching, I'd highlight a whole chunk, put it into a Google search, find where it came from, go back again; a lot of time. So, this one is really simple for educators: tick a little box and off it goes. And if you have our Workspace Plus editions, it will also scan everything that's been handed in to your school. Just your school, because of privacy reasons, and let you know if somebody else has handed it in. So, if I was Kimberley's much older brother, obviously, and she handed in my work from a couple of years ago, then it will come up that Kimberley handed in Steve's work, and a little discussion would perhaps ensue between the educator and Kimberley about that. I mean, because my work-

Kimberley Hall:

Never happened Steve, I probably wrote your work for you even though I was younger than you, that's what would happen. That sounds more realistic.

Steve Smith:

Yeah, you're probably right there. I think that's probably more like it.

Kimberley Hall:

One final thing I'll say about originality reports is I love that it can empower the students to actually do the checking on their work as well. So, it doesn't have to be a gotcha moment at the end where the teacher's checking up on the student; it actually gives the kids the opportunity to say, oh, how am I going? Because it is a really daunting task, taking the world's information sometimes and distilling it down into whatever content a teacher has requested a child provide information on. So, it can actually help them, coaching them, I guess, along the journey of, well, how do we make sure that we have a balance between other people's ideas and our own, and correctly cite when we are quoting from other people?

Steve Smith:

And also one of the brilliant points about originality reports in Classroom is that if the student runs the report and checks it, when the teacher runs it, it's not going to go, 'oh, this has already been run', like some other originality checkers do. So the student can run it, fix it, hand it in. The teacher will run it again, and it's not going to show up all the other stuff that's been done. It'll just show you what's in front of you as an educator, because the student has had the chance to change their work and hand in their best work.
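For anyone curious about the basic idea behind this kind of matching, finding runs of text in a submission that also appear in another source, here is a very simplified sketch. It uses Python's standard difflib module on made-up sample text and is only an illustration of the concept, not the engine behind originality reports.

```python
# Illustrative only: the general idea of flagging passages in a student's
# work that closely match a source text. This uses Python's difflib, not the
# actual originality reports engine, and the sample texts are made up.
from difflib import SequenceMatcher

source = ("Machine learning is a technique used to develop AI and give it its "
          "smarts, allowing us to process large amounts of data and make it useful.")
submission = ("In my own words, machine learning is a technique used to develop AI "
              "and give it its smarts, which I find fascinating.")

matcher = SequenceMatcher(None, source.lower(), submission.lower())
for block in matcher.get_matching_blocks():
    if block.size > 30:  # only flag reasonably long overlaps; threshold is arbitrary
        print("Possible match:", submission[block.b:block.b + block.size])
```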

Kimberley Hall:

Yeah. Awesome.

Okay, our final tool that we wanted to showcase for today, and I think this might be our current new favourite, this week. We're a bit fickle, so it does change weekly, but that's Read Along. This is a fabulous app that has actually recently also launched on the web, so I'll just show you the address for that. I mean, if you Google Read Along you will find it straight away, or go to readalong.google.com. And basically this is a reading assistant for students.

So, I have a daughter who's in her very first year of school this year, and so she's been learning to read. And you know, creating the time to sit and have an adult read alongside her and coach her and give her feedback on what she's reading is challenging for a lot of people. And a lot of students around the world don't have the privilege of having someone who can coach them through the journey of learning to read. So what Read Along does is it uses an AI-powered assistant called Diya, and she reads with the kids. The children actually read to her, and she gives them feedback on what they're reading. They get little stars above the words if they pronounce them correctly, or it flags that it might not be quite the right pronunciation, because they've read the first two letters and guessed what the rest of the word is, which is what my six-year-old does. And it tries to reward and encourage the kids and is super positive, and the kids seem to really have fun reading with Diya.

I think we're up to something like over 20 million students who have read over 118 million different books with Diya now on Read Along. So, it's a pretty amazing thing to think about how this sort of AI technology is truly unlocking opportunities for people around the world who otherwise would not have had that sort of coaching. I think it's fundamental.

Steve Smith:

I think the really nice thing, sorry, I was interrupting you there.

A really nice thing about this is that the students who are struggling can go and read with Diya in a really nice, private space, reading back and forth with their little friend Diya. You don't have to show yourself as not understanding; you can go and practise by yourself and then come back to class. So, it's a really nice low-impact, low-stress way for kids to get help with their reading, which is super important, obviously. And as you said, Kimberley, for one teacher to go around 30 different kids and give individual feedback on their reading is a lot of work. I know our primary school educators are absolutely amazing at how they do that, and this is one of those tools that can help them be as great as they can be.

Kimberley Hall:

I find it difficult as one parent, with one child in my house, to do that some days. So, I am in awe of all of you primary school teachers who do manage to do that reading coaching in your classrooms.

Steve Smith:

Totally.

Kimberley Hall:

Okay. So, we have come to the end of our time together. If you are still here, thank you for choosing to stay with us through our little journey down the AI, ML, personalised, adaptive learning world that we are now unlocking in education. We do have a team across Australia and New Zealand, and not all of us look as scary in real life as in our profile pictures, I promise. This is sort of our core team. We're always really happy to hear from you if there's more information you'd like, things you want to follow up, or, even better than that, not that those things aren't great, if you want to share something cool that you've done or learned. We'd love to hear your stories and help to amplify them and share them with other educators and schools and systems as well. So, thank you. Enjoy the rest of the sessions that you tune into during the event, and we hope to see you again soon.

Steve Smith:

Yeah, thanks very much for your time today, and I hope we could spark a little bit of enthusiasm about jumping into something new, perhaps.

Kimberley Hall:

Yeah. And if you have made it off Arts and Culture and back to this webinar, well done.

Steve Smith:

Well done indeed.

Kimberley Hall:

Okay, bye.

Steve Smith:

Thanks. Bye-bye. Ka kite.

[Video concludes by displaying the NSW Government logo.]

[End of transcript]
