The need to socially distance following the Covid-19 pandemic has increased the digitisation of everything we know in our lives, from healthcare to the way we socialise.
Heads and teachers have been remarkable at embracing new technologies to keep learning going for the benefit of the children.
The use of technologies such as artificial intelligence (AI) has become prevalent in our schools and opened new worlds of possibilities for learners.
The benefits of those technologies range from the use of smart analytics to identify areas where children struggle, to making us more efficient with our lesson planning. Daily administrative tasks have been simplified, and new and exciting opportunities have emerged for interacting with students who have learning disabilities.
But as our appetite for new technologies has grown – partly out of necessity – there has been limited time to identify the risks they may pose. Any previous concerns about the dangers of AI-powered systems were swept away as their adoption became the norm in assessing and tailoring lessons for individual students.
Privacy controls only came under the spotlight as previously unheard-of issues heightened concerns.
Zoom bombing (cyber-harassment in which uninvited individuals interrupt meetings over Zoom) for example, became a worrying trend despite the claims that the video-conferencing platform had end-to-end encryption.
For schools, the General Data Protection Regulation (GDPR) of 2018 meant that they had greater accountability for the data they collected, especially where this information was handled by a third party. This, too, raised anxiety levels as worried school leaders were left to deal with the repercussions.
During the pandemic, we spent more and more time online in order to survive, both at home and at school. We shopped, worked, took care of our health needs and our social life remotely. The more we did so the more information about ourselves we revealed, without fully understanding the potential repercussions.
Algorithms gone awry
The A level fiasco of 2020 shone a light on hidden risks that many teachers had not previously considered.
Transparency and accountability measures were called into question and the public’s trust in AI was eroded.
Students protested against results churned out by an algorithm that proved to be biased against particular cohorts, with young people attending schools in disadvantaged areas among those worst affected.
What followed was difficulty in explaining who was accountable for the errant coding, how this had happened, and why.
Technologies like AI are neither good nor bad; what is important is how we use them. We need to heighten our awareness of both the benefits and the threats.
The moral imperative
Children born today and those currently in our schools have arrived in a digital world from which they cannot opt out.
Their exposure to AI will be greater than that of any generation before them. So, if our moral imperative is to make a difference to the life chances of young people and ensure they are prepared for life, work and the world in general, then we need to develop our own understanding and find the sweet spot between panic and complacency.
To do this we need to give young people a voice in conversations centred around data, AI and ethics so they can make more informed decisions about their futures.
Do your pupils currently understand what AI is? Do they understand how data about their academic progress, attendance, medical conditions and learning challenges is being collected and used? Do they know who has access to this information?
We usually seek consent from parents to obtain their children’s personal details, but it is the young people who will pay the price if we, as educators and parents, get it wrong.
To create good citizens in the future - citizens who understand the risks of the technological world in which they reside - we need to ensure we get consent from the children, as well as their parents, when collecting pupil data, and to talk to them about what that consent means.
They need to have a voice. Currently we do not give them any opportunity to discuss this, nor do we broach it with them in a way they might understand.
We must, therefore, develop a language that will help them to grasp what data is, why it is collected, what happens to it, and how they can exercise some control over that process, because technology increasingly forms part of the armoury for teaching and learning.
Using role play and games
These discussions can begin at a very young age, through conversations, games and role play.
Allowing young children to act out certain situations through role play can help them to make sense of real-life events collaboratively, particularly when it comes to complex notions such as lending and sharing.
One way to approach this is to ask pupils to bring in something important to them, such as a favourite possession. This will represent data. Now ask the children how it would feel to lend this possession to a friend.
Discuss how the child feels about this favourite item and the reasons they might give for allowing someone to play with or borrow their possession, and what reasons they might give for not allowing it. For example:
- He/she always asks
- He/she broke it and wouldn’t give it back
- He/she didn’t ask me, they just took it
In this way, children begin to understand the parameters of lending and borrowing someone else’s personal things. We can talk about our rights as the owner of the toy (data) and the responsibility of the person who borrows it. For example:
- It’s our right to be asked if someone wants to borrow our possessions
- It’s our friend’s responsibility to prove that we lent it freely
- It’s also up to them to look after it and to tell us how long they will keep hold of it
The role-play conversation can be built upon further by allowing the borrower to give the toy to another friend. What rights should the owner now expect and what responsibility does the borrower have in this situation?
If the borrower doesn’t look after the item properly, to whom would you turn for help and what questions would you ask?
In this way, we can start building up the ideas about the importance of protecting our personal belongings, and this includes our data.
Is artificial intelligence intelligent?
AI seems to be everywhere, but its definition is less clear. It is often thought of as futuristic and the science-fiction industry hasn’t helped dispel this perception.
Challenging our students to develop their critical thinking is essential, and James Nottingham’s Learning Pit provides a framework with which to think deeply about questions like the ones posed above. It encourages children to become more comfortable with metacognition and helps them to reflect and develop their questioning skills, which in turn moves their knowledge from surface to deeper learning.
Karine George is a former primary headteacher and co-founder of Leadership Lemonade.