Misinformation spreads faster than truth, and only 4% of people consistently identify false content. Young people, who spend hours on social media daily, are particularly vulnerable as algorithms reinforce echo chambers, distorting their view of reality and impairing critical decision-making.
Project Real (a UK project led by Dr Yvonne Skipper) aims to redesign its online platform to engage young audiences (aged 11-14) in identifying misinformation. The challenge is to create an interactive, intuitive website that attracts users, deepens their engagement, and empowers them to share tools and insights within their communities.
The team convenes every Wednesday to advance the project’s development. Led by Aaron Teng, a Visual Designer from NedBank, South Africa, the team includes Matthew Bromage, a User Experience Designer at Sky/Comcast; Mark Lendacky, a Design Technologist at Sky/Comcast, based in the United States; and me, a current RCA student from the United Kingdom. The team benefits from the mentorship of David Stevens, Director of Product Design at Globant in the United Kingdom, while Dr Yvonne Skipper, Senior Lecturer in Psychology at the University of Glasgow, serves as the primary point of contact for the development organisation.
As this is an international collaboration, all discussions are conducted through Zoom calls, with ideas and contributions organized and developed collaboratively on a shared Miro board. It was an honor to be one of 45 Master’s students selected from RCA to work directly on a Design for Good project brief from 17 development organisation partners.
Initial research process
The team began with a Discover and Define Session with Dr. Yvonne Skipper on 28th October, where key insights were gathered and recorded using sticky notes. This session provided valuable direction for understanding the challenge and refining our approach to tackling misinformation.
Following this, the team conducted desktop research on a Miro board, exploring a variety of resources, including videos, academic papers, news articles, and documentaries, to deepen our understanding of misinformation. Each resource was analysed, with summaries and links documented for reference.
To incorporate real-world perspectives, the team conducted interviews with children, teachers, parents, and even creators of both genuine and fake content. These discussions offered diverse views on misinformation and highlighted how it is experienced and perceived across different contexts.
Design Hypothesis
After conducting research and interviews, we identified key factors shaping our younger audience. Input from team members with children, along with an interview with a London teacher who works with troubled kids, provided valuable insights:
• Love for being ‘right’: Kids feel motivated and accomplished when they are in the ‘correct’ position.
• Detached responsibility: They often share fake news without understanding its consequences.
• Impact of physical experiences: Interactive, hands-on workshops leave lasting impressions, as recalled by adult team members.
• Digital familiarity: Kids are accustomed to online quizzes and tools through modern school learning methods.
• Ripple effect: Lessons learned in school are often shared with adults, extending the impact beyond the classroom.
From these insights, we formed our design hypothesis.
Further interviews and solidifying the hypothesis
A few team members conducted in-depth interviews with children in their households to explore the intersection of education and entertainment.
We discovered that while kids enjoy games as a medium for learning, they prefer the educational elements to be subtle rather than overt. They value the experience more when learning is integrated seamlessly into the gameplay, avoiding the feeling of being explicitly “taught.”
Game Concept Design
From our insights, we aim to create a gamified media literacy solution built around an online “debunker” tool. This tool identifies misinformation, explains why content is false, and empowers users to critically analyze information. Customizable modules ensure nuanced detection, addressing the multifaceted nature of fake news. The project is set to enter full production by mid-2025.