Children are already interacting with AI technologies in many different ways: toys, video games, TikTok, YouTube, Instagram and adaptive learning software. Algorithms constantly recommend to children which videos to watch next, what news to read, what music to listen to and whom to befriend. Moreover, AI keeps children attracted, even addicted, to the mobile phone, the very tool it uses to grab their attention.
Artificial Intelligence and Children: Abusing or Empowering?
UNICEF is a leading organization in the effort to create child-friendly AI. Its draft policy guidance describes the importance of promoting children’s development through AI, and of avoiding harm to their mental health and rights from AI that prioritizes commercial gains. Over several years, UNICEF has formed strategies and practices and now offers practical recommendations for governments and industry.
Online harassment has been a burning issue and one of parents’ critical concerns regarding their children’s engagement with social media. This concern underscores the importance of having child-friendly AI.
One example of child-friendly AI is the CrimeDetector system, developed by the Finnish start-up SomeBuddy, which helps support children aged 7–18 in Finland and Sweden who have potentially experienced online harassment.
How does it work?
As explained on the UNICEF website:
“When children report incidents, such as cyberbullying, the system automatically analyzes the case using natural language processing and provides tailored legal and psychological guidance for the affected child, with the aid of a human-in-the-loop. The digital service has been conceived with the insights of social media experts and psychologists, child-rights experts and lawyers, and was also built through active co-creation with children. SomeBuddy’s objective is to provide support in all unpleasant and conflictual situations that children may face on social media platforms and help define when these situations constitute a crime.”
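To make that workflow concrete, here is a minimal sketch in Python of a report-triage pipeline of the kind described above: an automated text-analysis step categorizes a reported incident and drafts guidance, and a human reviewer must approve it before anything reaches the child. This is only an illustration of the human-in-the-loop idea, not SomeBuddy’s actual code; the keyword-based classifier, category names and guidance texts are all assumptions.

```python
from dataclasses import dataclass

# Hypothetical categories and keyword cues; a real system would use a trained
# natural language processing model, not a keyword list.
CATEGORY_KEYWORDS = {
    "harassment": ["threat", "bully", "insult"],
    "privacy_violation": ["photo shared", "leaked", "posted without"],
}

GUIDANCE_TEMPLATES = {
    "harassment": "Save the messages as evidence and talk to a trusted adult.",
    "privacy_violation": "Ask the platform to remove the content and keep a record.",
    "unclear": "A counsellor will review your report and get back to you.",
}


@dataclass
class Report:
    child_age: int
    text: str
    category: str = "unclear"
    draft_guidance: str = ""
    approved: bool = False


def analyze(report: Report) -> Report:
    """Automated step: classify the reported incident and draft tailored guidance."""
    lowered = report.text.lower()
    for category, cues in CATEGORY_KEYWORDS.items():
        if any(cue in lowered for cue in cues):
            report.category = category
            break
    report.draft_guidance = GUIDANCE_TEMPLATES[report.category]
    return report


def human_review(report: Report, reviewer_approves: bool) -> Report:
    """Human-in-the-loop step: a trained reviewer approves (or withholds) the draft
    before the child sees it."""
    report.approved = reviewer_approves
    return report


if __name__ == "__main__":
    r = Report(child_age=13, text="Someone keeps sending me insult messages at night.")
    r = analyze(r)
    r = human_review(r, reviewer_approves=True)
    if r.approved:
        print(f"[{r.category}] {r.draft_guidance}")
```

The key design point this sketch tries to capture is that the automated analysis only produces a draft; nothing is sent to the child until a human reviewer signs off, which is how the quoted description frames the “human-in-the-loop.”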
In addition to this kind of software that allows children to report online abuse, UNICEF works with a long-term perspective to create policies that enable governments and corporations to integrate children’s rights into the AI world.
UNICEF’s policy guidance draws upon the Convention on the Rights of the Child to present three foundations for AI that upholds the rights of children:
– AI policies and systems should aim to protect children
– They should provide equitably for children’s needs and rights
– They should empower children to contribute to the development and use of AI
Automated decision-making in machine learning can lead to discrimination. If this discrimination is not prevented, it can cause irreversible damage, such as distrust of the technology and of the companies that develop it. This is just one of the risks relating to machine learning.
In consultation with youth and children, critical recommendations were developed to bridge the gaps. One of these recommendations was for AI developers, policymakers and other stakeholders to involve parents and educators and to develop guidelines on how they can best handle AI and children. AI is too dangerous to leave to IT specialists alone; it concerns the privacy, mental health and wellbeing of our children and our future.
AI systems are not neutral. They can be biased: commercially biased, Western-centric, adult-biased, white-biased and male-biased. This is why governments and industry need to ensure that all AI systems are aligned with the rights, needs and realities of children. Much of the interaction children will have with AI-based systems will take place in the home or at school, so parents and teachers need to be aware of the challenges, risks and opportunities that AI can bring to children.
Another recommendation was to consider the different realities of children. Children are not a uniform group; we cannot detach them from their economic and social contexts. Children are poor and rich, empowered and marginalized, safe and endangered. Yes, global guidance is important and necessary, but it is also important to consider the specific realities of, for example, minorities or children with disabilities.
AI is not only about adults and offices. AI is penetrating homes and schools and follows our children into their warm beds. Since AI is mainly developed by adults, children have to be integrated into the development of just and diverse AI.