I thought I would put some resources together to help young people and parents navigate this technology. More and more, we are hearing about chatbots and companion AI being used by young people. Young people often enjoy interacting with this technology because it is engaging, fun and available from a young age. It is widely accessible now, but how much do we know about the risks and where it comes from? Let's look at the potential risks, and then I will direct you to some great information on ways you can talk about this together.

The risks can include:

Exposure to dangerous concepts – Some chatbots may expose young people to images or concepts that are not age appropriate, even with filters in place. Interactions are unpredictable. Chatbots don't always understand the nuance of language and can misunderstand meanings, so they don't always give appropriate and safe responses when people ask for advice. That advice could be about a range of things, but in the worst case it could concern suicidal thoughts or self-harm. Chatbots are not always correct, and wrong advice in these situations can have huge consequences, as has already happened: https://www.bbc.co.uk/news/articles/ced2ywg7246o

Dependency and social withdrawal – We can develop strong connections with AI companions. Excessive use may create a dependency and may cause us to favour this type of interaction over social interactions with other people. Interaction with AI companions may reduce loneliness at first, but over time it may make it harder to interact with people, creating further loneliness or isolation.

Unhealthy attitudes to relationships – While our brains are developing, we are learning about social interactions, consent and mutual respect. AI companions lack boundaries and consequences for crossing them. This can distort our picture of what human relationships look like (i.e. messy and complex at times). If we only interact with AI around relationships, romantic or otherwise, we can develop expectations that may not hold up in the real world, e.g. expecting a human to respond as quickly as an AI does.

Heightened risk of sexual abuse and privacy risks – Exposure to highly sexualised conversations can undermine our understanding of what safe interaction or age-appropriate behaviour looks like, particularly with unknown adults. This can potentially put us at risk of being groomed and abused online or in person. Some apps also collect and store our data, so make sure that no personal details are given.

Financial exploitation – Some AI platforms have paid-for features or in-app purchases, and some may use persuasive tactics to encourage further spending. Emotional attachment might lead to excessive spending on exclusive features.


The main advice seems to be: find out as much as you can together, discuss it, and do what you can to avoid the risks and navigate this technology in a responsible way.

For more details about apps that provide AI companions and chatbots for young people, plus up-to-date advice, please see Internet Matters: https://www.internetmatters.org/resources/ai-chatbots-and-virtual-friends-how-parents-can-keep-children-safe/

For another breakdown of the risks and ways to discuss this technology together and find a way forward, please see the eSafety Commissioner's website in Australia: https://www.esafety.gov.au/newsroom/blogs/ai-chatbots-and-companions-risks-to-children-and-young-people

For younger children, you might want to take a look at this brilliant animation and book, which helps them navigate technology safely, not give out personal information, and tell an adult they trust if 'scary' or unwanted information or videos appear: https://www.nspcc.org.uk/keeping-children-safe/support-for-parents/techosaurus/
