Safeguarding People in the Metaverse and AI Era

MetaEthics helps young people, educators and developers practise digital safeguarding, ethical AI use and online safety in immersive and AI-driven environments.

Metaverse and AI illustration

What is MetaEthics?

An educational web project focused on digital safeguarding, ethical AI use and online safety for everyone – from young people to educators and developers
  • Understand risks in immersive spaces
  • Practise responsible and ethical AI use
  • Promote wellbeing and safety by design

What is Digital Safeguarding?

In the era of the Metaverse and Artificial Intelligence (AI), digital safeguarding means protecting ourselves and others from harm, abuse and exploitation in virtual and AI-driven environments. This covers recognising risks such as identity theft, harassment, misinformation and biased automated systems, and knowing how to respond appropriately.


Signs to Watch For

Here are some of the common warning signs in virtual and AI-driven spaces. If you notice any of these, take action.

Suspicious or unexpected friend requests or invites

In immersive spaces, profiling is easier and avatar identities can be manipulated.

Learn more about suspicious friend requests and avatar manipulation

Inappropriate content or behaviour

Harassment, bullying, sexualised interactions or grooming may occur in virtual spaces where moderation is weaker.

Learn more (PDF)

Identity theft or avatar takeover

Virtual identities and assets may be compromised, impersonated or stolen.

Learn more about identity theft and avatar security threats

Deepfake or manipulated interactions

AI-driven avatars or media may be used to deceive, impersonate or manipulate users.

Learn more about deepfakes and AI-driven manipulation

Privacy intrusions or data collection

Immersive systems may gather more data than expected (biometrics, motion tracking, social behaviour) and reuse it without clear consent.

Learn more about privacy risks and data collection in the metaverse

Physical or psychological discomfort

Immersive VR/AR experiences can cause nausea, disorientation or confusion, or blur the line between the virtual and the real.

Learn more about physical and psychological effects of immersive experiences

Action Steps

If you suspect a safeguarding concern in a metaverse or AI environment, follow the steps below:

  • Pause the interaction. If you feel unsafe, uncomfortable or uncertain, exit the space or avatar interaction.
  • Note the details: username/avatar, time, platform and what happened.
  • Use the platform's tools to block, mute or report the user/avatar. Most immersive and social platforms provide these controls. Learn more (PDF)
  • If you're a minor, stop further engagement and seek support from a trusted adult.
  • Report the behaviour to the platform's moderation team or support channel, including screenshots or logs if available.
  • If the incident involves harassment, sexual content, grooming or hacking, consider reporting it to the relevant authorities (e.g. CEOP in the UK).
  • Use online safety organisations for advice: Internet Matters (UK), for example, offers metaverse safety guidance. Learn more about metaverse safety guidance for parents
  • If you are under 18 or feel emotionally affected by the incident, speak with a parent, teacher or trusted adult.
  • Consider professional help if you repeatedly experience distress or anxiety, or feel unsafe.

Contact and Support

If you encounter safeguarding issues in metaverse or AI-driven environments, you can reach out to these organisations and resources:

UK Safer Internet Centre

Advice and reporting for online safety:
saferinternet.org.uk

Internet Matters

Guides and support for parents in virtual spaces:
internetmatters.org

CEOP (Child Exploitation and Online Protection Command)

For child safety online:
ceop.police.uk

National Cyber Security Centre (UK)

Digital security advice:
ncsc.gov.uk

Your metaverse platform's help centre

Look for "Report", "Block" or "Safety" options in menus.

Notes for Developers and Guardians

Ensure your platform or tool enables moderation, avatar boundaries and reporting tools. Learn more (PDF)

Promote digital literacy: users (especially young people) must understand that virtual interactions carry real-world implications. Learn more about promoting digital literacy for young people

For educators: set clear guidelines for VR/metaverse use in schools or youth settings, and ensure supervision and a safe environment setup.