The AI Playground for Adolescents: Innocence Meets the Internet

Your all-in-one guide to a teen’s AI use

Back in the 80s, our youthful, thrill-seeking experiments with rule-breaking led us to risky and exhilarating adventures. A daring dash across a swollen creek led to an elaborate cover-up of why we were soaking wet on a chilly April day. Our risky behavior was kept in check by real-world dangers and by how well we could bend the truth afterwards.


Though it might seem totally different, there are important parallels to today's digital playground; teens have always faced these dilemmas about safety and risk, and that is part of what my first book, Would I Have Sexted in the 80s?, tried to unpack. We've all been there: of course teens will push boundaries and test out new ways to get a thrill. It is just that the old version of pushing a boundary didn't involve tech.


Today, children are navigating far more complex risks online, with a completely different set of dangers powered by the undercurrents of artificial intelligence, some bad actors, and very efficient algorithms. The opportunities for adventure are incredible, but they bring a whole new set of mischief and dangers that we adults sometimes have a hard time imagining. So, let's dive into some of those risks; it's important to put ourselves in their shoes if we want to truly understand their digital landscape.


Understanding AI Misuse Among the Young

  • Cheating: Students have always found ways to cheat. This was the first area of concern when ChatGPT erupted last year, but it was never my first or biggest worry. Deceiving yourself about your own accomplishments can have dire consequences, but ultimately the individual remains accountable for those outcomes somewhere down the line. The ramifications of cheating pale in comparison to other pressing issues of AI use. These issues, notably bullying, can lead to severe psychological impacts such as Post-Traumatic Stress Disorder (PTSD), alongside a host of emotional challenges including isolation, anxiety, and depression. When students hear you talking about AI only in the context of cheating, they may feel that the complexity and severity of the online problems they face, those beyond mere academic dishonesty, are being ignored. Cheating guidelines can be updated without too much fuss, so let's focus on the issues that cause lasting harm.

  • Voice Message and Email Pranks: Pranks have always been common among teens, but the advent of AI has led to more sophisticated and believable antics. One recent instance involved students using AI to generate a realistic voice message falsely informing a parent about their child's suspension. There have been similar reports of pranks involving emails as well. These messages sound real, and as much as I understand that these teens are probably just goofing around on a Friday evening, there's a real danger to such realistic impersonation. These teens reminded me of the antics we got up to before the internet: outlandish dares, fake notes, and prank calls. But because they were misusing AI in an uninformed way, they were oblivious to the harm they might be causing, or to the fact that what they were doing might be illegal.

  • The Sticker Problem: Custom stickers on messaging apps can be fun to create. My first instinct is to see the fun in them; I remember my son loving making his Memoji speak in messages to me. However, I have come across a lot of harm in the last six months. AI and new software updates have made it easier than ever to make, modify, and send stickers from a picture with just a tap or two, and kids are using them to create offensive images or alter photos of classmates and teachers. Examples I have seen this school year include an image of a teacher transformed into something sexual, Hitler stickers, naked bottoms, and edited nudity shared without consent, all circulating widely on class WhatsApp chats. I understand students want to express themselves: we can all remember making secret doodles on yearbook photos. Today's alterations, however, carry a wider, more damaging reach because of their shareability and realism. In most cases I have seen, students misunderstood the harm of their actions and were simply trying to express themselves in a highly visual world, to fit in with the digital conversations happening around them.

  • Deepfake Dilemmas: The creation of deepfake videos or images, where a person's likeness is used to create new content in which they realistically appear (think of a video of a school principal that looks real but has been manipulated so that they do something they would never do in real life, like personally attack a student or another teacher), has moved from a novelty to a tool of deception and cyberbullying. It was hard to miss that even Taylor Swift has had to deal with her image being misused for sexually explicit deepfakes. Teens tell me they are fearful of their own images being misused, reflecting a disturbing trend of digital identity manipulation and a new era of sexism.

  • Meme Culture Missteps Using AI: While memes can be a form of self-expression and humor, they often venture into offensive territory. The ease with which memes can be created and shared, coupled with a lack of understanding of their impact, contributes to an online culture that can quickly turn toxic. With AI, students can manipulate photos far more easily, making the memes more personal and more hurtful.

  • Creating Fake Accounts: Fake accounts made to bully classmates are nothing new, but with AI it is easier than ever to generate images that make these accounts seem realistic. The worst case of bullying a student described to me this year centred on a fake account and a dating scam run by a classmate, where digital reality was manipulated to create a whole new level of deception.

The Legal Labyrinth: Navigating AI Misuse and a Lack of Legislation

As AI technology races ahead, legislation lags behind, struggling to address the complexities of digital misconduct. The European Union's steps toward criminalizing the sharing of non-consensual deepfake content mark a promising start, at least for celebrities. Spain, meanwhile, has been grappling with the legal implications of deepfake technology, particularly AI-generated nude images of minors, with discussions over whether creating them constitutes a crime and whether existing laws, such as those on child pornography, can be applied. Yet the intricacies of digital law, especially where minors are concerned, have demanded urgent attention since the onset of easily disseminated revenge pornography. Governments were struggling to keep up with the ethics of technological innovation, and to band together against global tech giants, long before easy access to AI, and action to protect against misuse and exploitation has been unfortunately slow.

A Call to Arms: What Parents Can Do

Parents can play a pivotal role in guiding their children through the digital landscape by:

1. Educating Your Children about Responsible AI Use: It is crucial to educate children about responsible AI use. This means getting on the AI tools with them, exploring how the platform works and learns, and engaging your child in discussion along the way. Here are some questions you might ask:

  • What does ‘machine learning’ really mean?

  • What are the ethical implications of AI?

  • What are the benefits and limitations of AI? Why might AI not get it right in a particular situation?

We adults have been exposed to the benefit of human involvement in decision-making our whole lives; every year, younger and younger kids have lived more of their lives with these new technologies than without. It may seem silly to even have to ask some of these questions, but these developments are a big part of their lives, and they should be a big part of yours too. Discuss with your child why, for example, ‘My AI’ on Snapchat might not have enough knowledge of their personal context to give appropriate advice.

Having these conversations exposes your teen to the reality that mom, dad, a school counselor, or another trusted adult might be a better resource to seek out. If I had a teen today, I would test My AI alongside them and add my own corrections and perspective to the mix, showing them how I, as a parent who loves them, can contribute to the conversation.

2. Involving Children in AI Development: Involving children in the development and deployment of AI systems that use their data can help them understand the technology better and give them a sense of agency and the opportunity to shape those systems. Nothing is worse for a teen than being treated like they cannot be part of the conversation about their own future. If we want them to be active, respectful, and involved citizens, we should respect them enough to let them join our conversations.

3. Teaching Children about Their Rights and AI: Parents can teach children about their rights in the context of AI and help them understand how AI systems should be designed to protect and benefit them. I would go further here, talking about bullying, memes, deepfakes, and the use of stickers. I most certainly would talk to children about what to do if this happens to them, what to do if they are bystanders, and how they can work to be upstanders.

By taking these steps, parents can play a vital role in preparing their children to navigate the opportunities and risks associated with AI in an informed and ethical manner.

Schools as Safe Havens for Digital Citizenship

Schools have so many areas they need to consider, from questions around plagiarism and cheating all the way through to child protection and bullying. We know students do best when we have given them clear expectations and boundaries and they know the consequences of missteps. Students tell me that the expectations they are given often lag behind the real digital landscape, leaving them to fend for themselves.

I believe that in addition to a robust digital citizenship program, schools need to constantly audit and update their policies and guidelines. School programs also need to involve training and awareness sessions for students, staff, and parents.

Here are some of the basic things you should be discussing and considering when talking about expectations and policies:

1. Clear Definitions and Expectations

Define Misuse: Clearly define what constitutes inappropriate use of digital tools and content. Be specific that any content or usage that harms others will be considered a misuse of this technology.

Expectations: Set clear expectations for digital conduct, emphasizing respect, integrity, and responsibility.

2. Reporting Mechanisms

Clear Reporting Channels: Establish a confidential and easy-to-use reporting mechanism for students to report concerns or incidents of inappropriate digital behavior. Many teens tell me they do not want to report, for fear of being a snitch. They also feel they are left without support if they have fallen victim to bullying or harassment online. Without adults understanding their online realities and developing secure ways for them to discuss concerns, students are left to fend for themselves while schools remain in the dark about their online world. There are two ways of establishing this security for students:

  • A confidential online survey about the things they see and do online that need education and discussion. No names should be used; the aim is to flag concerns and set up a class on how to deal with them. One student mentioned recently that, at 12, he encountered gore videos online for the first time and did not know what to do. He was left with those negative thoughts and images, without understanding how to let go of the extreme content we see online. This is one example of the kind of information that would help schools set special topics each year.

  • Bring in an external expert (psst…I do this!) who can work with the students and, as someone outside your community, gather information about the issues in your school.

3. Consequences and Interventions

Many students fear coming forward when they have fallen victim to online issues. They fear that things will get worse if they report. We first need to let students know that they have a support system and explain how we will deal with incidents as a school. The majority of students who report to me ask, “What will my school do if I report this?” They honestly do not know, and they are scared of causing a huge drama, blowing up a friend group, or wrecking their social status. If students are made aware of the consequences, the confidentiality measures in place, and the support available to victims and perpetrators of online harassment alike, they will feel more comfortable and open to addressing the real issues that face them online today.

It is important to think not only about disciplinary measures but also about restorative justice. The aim should be restoring and repairing the harm as well as being clear about any disciplinary measures we might put into place. 


Other things schools should consider while writing tech policies are the preventive measures they have in place, legal compliance in their country, cooperation with law enforcement (for extreme cases), and a continuous review process. This is a fast-moving environment, and changes will need to be made often. Keep policies simple and clear, and review them regularly to adapt to new technological developments and challenges, incorporating feedback from students, staff, and parents.


As I think back to the adventures of my youth, from crossing raging creeks to the thrill of forbidden midnight chats, it's clear that the spirit of exploration within us never fades; it merely finds new realms to conquer. Today, that realm is digital, brimming with wonders and hazards. Our children are just as curious and audacious as we once were. As the adults in their lives, our job is to find ways to show our youth that the wisdom we have gained from years of exploring is valuable to them, even in the digital world they claim we will never understand. Just step back and remember: we told our parents and educators they were clueless as well.


At the end of the day, students report to me that they want joy, discovery, and positivity to be the themes of their tech use. They want to get up to shenanigans in their new digital playground, creating jokes, memes, and memories with the new tools at their disposal. Let's guide them to understand the boundaries of the digital world, keeping them safe while letting them explore.


If you like my blog posts, please sign up for my newsletter and spread the word. You could also buy me a virtual coffee; our team would appreciate the support.

–Allison Ochs, social pedagogue/worker, author, mother of three, wife

If you are interested in a webinar or workshops click here

If you want to look at our free resources click here

If you want to buy the Oscar and Zoe and primary school books click here

If you want to buy our books and resources for teens click here

If you want to subscribe to our mailing list click here