What are the solutions to the challenges journalism faces in the era of AI? How can guidelines, policies, and ethical codes for journalists be improved? Can simulation games help answer these questions? The Polish subteam of CORECON piloted a simulation game in which participants role-played the work of a media outlet. The game served as a participatory method that enriched the team’s insights on responsible journalism with non-expert perspectives.
The idea and context
The Polish CORECON subteam invited students majoring in media and communication from FORTHEM Alliance universities to participate in a simulation game as part of a five-day intensive educational program, “Responsible Journalism in the Era of AI”, organized at the University of Opole in October 2025. The game focused on updating ethical guidelines and recommendations for journalism training in the era of AI and in the context of war reporting. Using simulation games as a tool for policymaking is not a new concept; it has already proven its worth, for example, in urban planning and climate change decision-making. Applying it to media policies, however, is still a relatively new approach, and it differs from public policymaking: in most cases, media are not public or state institutions but private companies, which need to balance economic goals with democratic responsibilities. Listening to students’ voices might help create better journalism, tailored to citizens’ expectations.
So what did the students do during that event? You can read more about it here. They started the training with workshops covering the crucial aspects and toughest challenges of contemporary journalism, including the use of artificial intelligence and countering disinformation. Since theory and practice do not always go hand in hand, the students could confront theoretical insights with the perspectives of media practitioners. They visited a local radio station and met the editor-in-chief of the local newspaper. These professional journalists shared experiences from their daily practice and their thoughts on the concerns, fears, and hopes surrounding technological developments, especially the use of AI and its influence on their work. Both the workshops and the meetings with experts were a good starting point for delving deeper into their roles in the simulation.
Creating policies and ethical guidelines
What does responsible journalism mean? What is the mission of news media outlets? What ethical principles should guide contemporary journalism as it faces unprecedented challenges? During the game, the students responded to these questions by simulating three different media outlets: a public medium, a local newsroom, and an innovative start-up. The task required not only good planning skills but also creativity, reflexivity, and imagination. Each team received a “generous grant” to create a mission statement and ethical guidelines for the outlet of their dreams. They could let their imagination run wild, free of the financial restrictions or corporate dependencies that might otherwise limit them in this task.
According to the simulation scenario, just before they finalized their codes and guidelines, they were hit with news of a viral video of a military incident on the border. How should they react responsibly in such a situation? How to distinguish what is true from what is false? How to inform the audience without evoking unnecessary panic? The knowledge and experience the participants had developed during the previous activities proved useful when writing the headline and the lead of the news piece about this incident. The task also demonstrated to the students what was at stake and how important it is to clearly divide responsibilities within the editorial team. With this experience, they decided who in their team would make decisions (the editor-in-chief), who would bring in the news and push for publication (the reporter), who would be responsible for verification (the fact-checker), and which other roles were needed (e.g., an IT specialist in one of the groups).
The simulation showed how complex the problem of AI implementation is, with the three teams arriving at completely different approaches:
- The group simulating a digital start-up was the most innovative and enthusiastic towards new technologies. They used AI to verify information and detect deepfakes, demonstrating a high level of trust in AI’s ability to debunk fake news.
- The group simulating public media sat at the other end of the spectrum, being the most skeptical about using these technologies for anything beyond grammar polishing, and even such use, according to their policy, should be strictly controlled.
- The group simulating a local newsroom balanced the two extremes by taking a fairly technical approach: using AI to detect anomalies and analyze metadata, and requiring a mandatory disclaimer on AI use in any AI-supported text.
Despite differences, all three teams shared core principles: verify before publishing, protect sources, and be transparent with audiences. However, the various approaches to implementing AI technologies in journalistic practice show that there are many possible directions for responsible journalism.
A crash test – challenging students’ work
With the ethical codes freshly signed, it was time for a crash test. We handed each group a challenge card with a fictional (but not unrealistic) scenario simulating situations that journalists might face in their daily practice.
Let’s imagine a viral video of the president of a neighboring country delivering a speech in which he declares war. Is it real or fake? It spreads quickly across social media; the public starts panicking and demands instant updates and information about the situation, and other media outlets are already reporting on it. The team has 10 minutes to decide how to cover the news responsibly. Just after solving the first problem, they were hit with another; in total, each group faced five challenges.
Other scenarios created similar pressure, for example, a sudden cyberattack on the outlet, a disinformation campaign against it, journalists stopped at the border of a military conflict area, or a call from the ministry demanding that certain information not be published. In each case, the students needed to check whether their policies actually worked in such situations. If the mission, guidelines, or ethical codes did not withstand the pressure, could not anticipate a situation, or were not specific enough, they had to be modified immediately. This was meant to simulate the concerns of a real editorial team, where reality turns out to be more complicated or surprising than any ethical code, guidebook, or policy.
A simulation game cannot replicate reality, and professional journalists obviously rely not only on written codes, guidelines, and media outlet policies but also on their broad experience. However, playing simulation games offers valuable insights into what young people from different European countries expect from contemporary media. The students bring a fresh perspective, and as early adopters of various modern technologies, they might shed new light on practitioners’ and experts’ insights. The variety of possibilities and proposals shows that there is no monopoly on good solutions, and responsible journalism in the third decade of the 21st century might take many different forms.