Adolescents’ Perception of the War in Ukraine on Social Media: A Researchers’ Night Experiment

Adolescents increasingly learn about geopolitics through social media feeds optimized for engagement rather than accuracy. During Researchers’ Night, we piloted a rapid, skills-based intervention aimed at improving Romanian teens’ judgments of Russia–Ukraine war headlines. Participants completed a brief pretest survey and rated a mix of verified and fabricated headlines. They then received a short lesson on misinformation and cognitive biases, practiced a micro fact-checking routine adapted from the “Four Moves & a Habit” approach (Caulfield, 2017), and played a live “echo-chamber” game that simulated algorithmic curation. Post-intervention, participants evaluated a new set of posts. We expect improvements in discrimination between true and false content, reduced share-intent for questionable items, and increased willingness to consult a second source.

Why this study, why now

Young people encounter complex international events through platforms that privilege speed, novelty, and emotion, conditions known to amplify misinformation (Vosoughi et al., 2018). Adolescents are developmentally primed for social learning and identity exploration, which can heighten sensitivity to in-group cues and emotionally arousing content. Short, transferable verification habits, especially those that slow users down at the moment of strongest emotion, have shown promise for improving judgment online (Caulfield, 2017; Wineburg & McGrew, 2019). Likewise, making algorithmic curation visible can disrupt the intuitive “this is what everyone sees” illusion that fuels echo chambers (Pariser, 2011; Sunstein, 2017). Building on this literature, we developed, together with psychologist Răzvan Iliș, a concise, public-facing learning sequence and evaluated its short-term impact.

The setup: who / what / how

Setting and participants. The activity took place during Researchers’ Night and was open to adolescents aged 13–19 years. Participation was voluntary and anonymous; no identifying data were collected. Consent procedures followed age-appropriate guidelines (parental consent and/or participant assent/consent, as applicable). Across the event, 44 adolescents completed the full sequence (pretest, intervention, posttest) and constitute the analytic sample. We did not impose quotas by age, gender, or school; therefore, the sample should be considered a convenience sample.

Materials

We assembled a balanced set of social-media–style posts on the Russia–Ukraine war. True items were sampled from the vetted CORECON corpus, while fabricated items were modeled on misinformation tropes documented by Inforadar.ro, the debunking project of Romania’s Ministry of Defence. Each stimulus was formatted to resemble a typical platform headline, and, to meet ethical standards for adolescent participants, headlines deliberately excluded graphic content. Before the intervention, participants completed a battery of brief measures capturing news literacy, news habits and trust, affect, and thinking style. Specifically, we used a culturally adapted version of the True/False News Literacy Knowledge scale (Maksl et al., 2024), items aligned with the Reuters Institute Digital News Report methodology for news habits and trust, the I-PANAS-SF for current affect (Thompson, 2007), the short Need for Cognition Scale (NCS-6; cf. Cacioppo & Petty, 1982), and the Willingness to Self-Censor Scale (Hayes, Glynn, & Shanahan, 2005). Pre- and post-intervention outcome ratings for each post captured perceived credibility, sharing intention, second-source intention, confidence, primary emotion, and willingness to comment.
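
For concreteness, one rating record in the resulting dataset can be pictured as below. This is an illustrative sketch only; the field names and scale ranges are our own shorthand, not the instruments themselves.

```python
from dataclasses import dataclass

@dataclass
class HeadlineRating:
    """One participant's judgment of one post (illustrative field names)."""
    participant_id: str   # anonymous code; no identifying data were collected
    phase: str            # "pre" or "post" intervention
    headline_id: str
    veracity: str         # ground truth, "true" or "fabricated" (hidden from raters)
    credibility: int      # perceived credibility, e.g., on an assumed 1-7 scale
    share_intent: int     # intention to share the post
    second_source: int    # intention to seek a second source
    confidence: int       # confidence in one's own judgment
    primary_emotion: str  # e.g., "fear", "anger", "neutral"
    comment_intent: int   # willingness to comment
```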

Procedure

Participants first completed a baseline questionnaire covering media habits, emotions during news consumption, cognitive style (e.g., intuitive vs. analytic tendencies), and willingness to self-censor. They then provided headline-level judgments—perceived credibility, intention to share, intention to seek a second source, confidence, primary emotion, and willingness to comment.

The intervention had two parts.

  1. Micro-lesson + skill: We introduced how misinformation exploits cognitive shortcuts (confirmation bias, motivated reasoning, affective arousal) and taught a compact verification routine adapted from Four Moves & a Habit—check what’s already known, go to the source, read laterally, restart when stuck, and habitually check your emotions—following Caulfield’s guidance (Web Literacy for Student Fact-Checkers, 2017).
  2. Echo-chamber game: To make algorithmic curation tangible, participants physically selected coloured “posts,” each colour representing an interest (wellness, technology, the Russia–Ukraine war, entertainment). They formed interest groups, each member gave three “Likes,” and we surfaced the most-liked posts within each group as a deliberately monochrome in-domain feed. We then built a cross-domain feed by taking the top post from each interest, illustrating how a diversified feed approximates reality more closely than a single-topic stream; a toy sketch of this ranking logic follows below.
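
To make the game’s mechanics concrete, here is a minimal Python sketch of the like-based ranking it dramatizes. The interests, post IDs, and like counts are invented for illustration and are not our actual materials.

```python
from collections import Counter

# Toy data: every "Like" is recorded as (interest, post_id).
likes = [
    ("war", "U1"), ("war", "U1"), ("war", "U2"),
    ("technology", "T1"), ("technology", "T2"),
    ("wellness", "W1"),
    ("entertainment", "E1"),
]

def in_domain_feed(likes, interest, k=3):
    """Top-k posts drawn from a single interest: the monochrome feed."""
    counts = Counter(pid for topic, pid in likes if topic == interest)
    return [pid for pid, _ in counts.most_common(k)]

def cross_domain_feed(likes):
    """The single most-liked post per interest: the diversified feed."""
    interests = sorted({topic for topic, _ in likes})
    return {topic: in_domain_feed(likes, topic, k=1)[0] for topic in interests}

print(in_domain_feed(likes, "war"))  # ['U1', 'U2']
print(cross_domain_feed(likes))
# {'entertainment': 'E1', 'technology': 'T1', 'war': 'U1', 'wellness': 'W1'}
```

The two feeds differ only in whether ranking is restricted to one interest, which is precisely the contrast the game makes visible.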

Finally, participants completed a posttest with a fresh but comparable set of posts rated on the same metrics.


Outcomes and analysis plan

Our primary outcome is discrimination between true and false headlines: operationally, the gap between mean credibility ratings given to true items and to fabricated ones, which the intervention should widen. Secondary outcomes include reduced share-intent for fabricated items, increased second-source intent, better confidence calibration, and dampened high-arousal emotional responses to dubious items (Lewandowsky et al., 2012; Pennycook et al., 2021).
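
As a sketch of the planned scoring (the file name and column names are assumptions, not our final analysis script), the discrimination index and its pre/post change could be computed as follows:

```python
import pandas as pd
from scipy.stats import ttest_rel

# Hypothetical long-format data: one row per participant x headline, with
# columns participant, phase ("pre"/"post"), veracity ("true"/"fake"), and
# credibility (a Likert-style rating).
ratings = pd.read_csv("ratings.csv")

# Discrimination index per participant and phase: mean credibility assigned
# to true items minus mean credibility assigned to fabricated items.
means = (ratings
         .groupby(["participant", "phase", "veracity"])["credibility"]
         .mean()
         .unstack("veracity"))
discrimination = (means["true"] - means["fake"]).unstack("phase").dropna()

# Paired test of the pre-to-post change in discrimination.
t, p = ttest_rel(discrimination["post"], discrimination["pre"])
change = (discrimination["post"] - discrimination["pre"]).mean()
print(f"mean change = {change:.2f}, t = {t:.2f}, p = {p:.3f}")
```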

Results – forthcoming

Data processing is underway. If our expectations are borne out, the true–false discrimination gap should widen from pretest to posttest, coupled with more cautious engagement behaviours such as lower share-intent for dubious items and more second-source checks. Even small effects would be practically meaningful given the intervention’s brevity and portability.

Acknowledgements

Our heartfelt thanks go to the students of Octavian Goga National College and Andrei Șaguna National College in Sibiu, and “Gustav Gundisch” Theoretical High School in Cisnădie. Their curiosity, questions, and good humor made the workshop lively and
insightful. Special thanks to the teachers who facilitated logistics and encouraged students to engage thoughtfully with the activities.

References

Caulfield, M. (2017). Web Literacy for Student Fact-Checkers (and other lessons). (Also known as “Four Moves & a Habit”.)

Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106–131.

Pariser, E. (2011). The Filter Bubble: What the Internet Is Hiding from You. Penguin.

Pennycook, G., Epstein, Z., Mosleh, M., Arechar, A. A., Eckles, D., & Rand, D. G. (2021). Shifting attention to accuracy can reduce misinformation online. Nature, 592, 590–595.

Sunstein, C. R. (2017). #Republic: Divided Democracy in the Age of Social Media. Princeton University Press.

Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146–1151.

Wineburg, S., & McGrew, S. (2019). Lateral reading and the nature of expertise: Reading less and learning more when evaluating digital information. Teachers College Record, 121(11), 1–40.