Algorithmic Mirror:

Help Your Child See How Algorithms See Them


The Synthetic Society Lab and the Oxford Child-centred AI Lab need your help!

Ever wonder what determines what your teenager sees on TikTok, YouTube, or Netflix?

The Synthetic Society Lab and the Oxford Child-centred AI Lab are researching how algorithms create hidden profiles of young people based on their online activity. We've developed Algorithmic Mirror—an application built on Latent Lab that reveals how platforms might categorise your child's digital footprint.

As one teen from a recent study said: "It's like there's an FBI agent watching us... I want to know what they're doing with our data."

Join our study, and your child will:

  • See unique visualisations showing how platforms might categorise their interests over time
  • Understand how their viewing data can be analysed by algorithms
  • Learn practical strategies for managing their digital footprint

Who: Families with teens aged 11-17
What: A research session exploring algorithmic categorisation
Why: To give teens insight into the invisible forces shaping young people's digital experiences

Help your child understand the digital world they navigate every day.

Click Here to Join this Research!


Prototype: Algorithmic Mirror
Preprint: https://arxiv.org/abs/2504.16615

Lead Researcher: Yui Kondo (Research Associate, Oxford Internet Institute)
Principal Investigator: Dr. Luc Rocher (Associate Professor, Oxford Internet Institute, University of Oxford, UK)
Collaborators:
Kevin Dunnell (PhD Candidate, MIT Media Lab, USA)
Qing Xiao (PhD Candidate, Carnegie Mellon University, USA)
Dr. Jun Zhao (Senior Researcher, Oxford Child-Centred AI Lab, Department of Computer Science, University of Oxford)











    Why does this matter?

    Every video watched, every like, and every interaction creates data that platforms use to build sophisticated profiles of your child. These algorithmic profiles determine:


    • Content Curation: What content appears in their feeds
    • Time Spent: How much time they spend online
    • Targeted Ads: Which advertisements target them
    • Behaviour Prediction: How their interests and behaviours are predicted and influenced

    The good news: When young people understand how algorithms interpret their data, they become more thoughtful digital citizens who can better protect their privacy and wellbeing online.







    Introducing Algorithmic Mirror

    The Synthetic Society Lab at the Oxford Internet Institute, in collaboration with the MIT Media Lab, Carnegie Mellon University, and the Oxford Child-centred AI Lab, has developed Algorithmic Mirror—an innovative visualisation tool that reveals how platforms like YouTube, TikTok, and Netflix interpret young people's viewing histories.

    Your child will:

    • See unique visualisations showing how algorithms categorise their interests over time
    • Understand how their viewing data can be analysed by algorithms
    • Learn practical strategies for managing their digital footprint
    • Develop critical thinking skills about algorithmic systems

    “With Algorithmic Mirror, I realize I'm watching quite a bit of content from genres that don't even remain in my memory. It's scary to realize that I'm watching things without awareness.”

    FROM A PARTICIPANT IN A PREVIOUS STUDY





    Temporal Evolution of Algorithmic Mirror from Sep 2018 to July 2024



    Temporal Evolution of Digital Footprint
    Participants in our previous study realised that the categories and generative summaries were based on long-term data, and observed,

    “recommendation algorithms remember you, like how you were interested.”

    Another participant critically questioned,

    “When YouTube tries to understand what I like, my question is how would it [recommender system] try to track my interest over time and project new interests, or would it just like take me as I currently am and give me exactly what I like? [...] I can imagine it can try to predict how my personality would evolve in the next five years or so [...] it's a little scary”






    Who can participate?

    We're seeking:
    • Young people aged 11-17 years
    • Who actively use at least one of these platforms: YouTube, Netflix, or TikTok
    • Who are willing to participate either in person in Oxford or online

    What's involved?

    Session 1: Data Download (30 minutes)

    • Virtual guidance session to download platform data
    • Conducted from home or school
    • Complete privacy—no data shared with researchers

    Session 2: Interactive Workshop (60-90 minutes)

    • Secure data upload to Algorithmic Mirror
    • Explore personalised algorithm visualisations
    • Group discussions with peers about findings
    • Parents welcome to observe

    Optional: Follow-up Interview (30 minutes)

    • Share experiences and insights
    • Help improve the tool for future users

    Your child's safety is paramount

    • Full ethics approval from the University of Oxford (Reference: 1990161)
    • Password-protected visualisations only accessible to your child
    • Experienced researchers with safeguarding training
    • All personal information removed before analysis
    • Automatic data deletion after 6 months



    “If you overeat, it shows on your face, and you can look in the mirror. But with YouTube viewing, it's like constantly eating junk food without a mirror to show you've become unhealthy. I probably wasn't aware of it, but now that I can see the mirror, I might decide to cut down on the junk.”


    Benefits for participants

    We hope this study will help your child learn more about algorithms on social media and become a more informed and empowered digital citizen, with potential benefits for other children in the future.

    You will receive £20 in total as our thanks for participating: £10 when you upload your data to Algorithmic Mirror, and another £10 upon completing the workshop.

    Take Part in this Research!

    Help your child understand the digital world they navigate every day.

    Contact:

    Email: algorithmicmirror@oii.ox.ac.uk
    Lead Researcher: Yui Kondo
    Principal Investigator: Dr. Luc Rocher
    Form: Link

    Click here to join this research!

    Frequently Asked Questions

    Q: Will my child's personal data be visible to researchers?
    No. All identifying information is removed before upload, and visualisations remain password-protected.

    Q: What if we change our minds?
    You can withdraw at any time before the end of September 2025, and all data will be deleted.

    Q: Is this study approved?
    Yes, by the University of Oxford Central University Research Ethics Committee.

    Q: What platforms are studied?
    YouTube, TikTok, and Netflix viewing histories.