Concordia PhD student creates board game about online conspiracies and how social media and its users amplify them
From QAnon to crisis actors, stolen elections to ivermectin, fake news and conspiracy theories thrive in our connected online media. The more bizarre or absurd the premise seems, the faster it spreads, with serious real-world consequences.
Scott DeJong wants to understand why online lies go viral and how they shape our relationships with social media — and with each other. The communications doctoral student has created a board game that pits conspiracy theorists and their online troll amplifiers against moderators and educators. DeJong says Lizards & Lies, available as a free download, is meant as a blueprint to help us grasp the fake content movement and the frustrations those who struggle with it can feel.
As it explains, the game explores the widely held assumption that the spread of conspiracy theories and misinformation online is a new front in a global war on truth. It’s a precept he says is oversimplified and overlooks the vital role human users play in spreading fake news, whether willingly or not.
“I wanted to see if these analogies work when we simulate them in a game,” he says. “A game works like a system of rules, where boundaries can be set and we can explore what happens in those spaces.”
Push the lies and push back
Lizards & Lies is an asymmetrical three-round game for two or four players, meaning each player has a separate role. The gameplay revolves around preparing for a theoretical election, with one side spreading conspiracies (especially about reptilians and birds being spy drones) and the other doing its best to dispel them. Players can take on the role of conspiracy theorist, Edgelord/troll (who amplifies theories), fact checker or digital literacy educator. Both sides are fighting to see if one can successfully spread these lies across its networks and if the other can prevent it.
“Characters are designed to reflect real-world challenges,” DeJong explains. “Conspiracy theorists try to build up small pockets of supporters and use them to proliferate. Edgelords are less focused on plotting and more on where users are vulnerable; they seek to make them react to their content.” At the same time, he notes, educators will find education to be time-consuming, while moderators are overwhelmed: the amount of content they have to process is endless.
“Their goals are meant to reflect the challenges that everyday people face outside of the game by representing them inside.”
Players collect points at the end of each round, which are tallied at the end of the game to determine which side won. Coordinated team play can increase the chances of victory, he adds.
DeJong says the game is meant to be fun, but also educational. As a research project that grew out of classroom work, Lizards & Lies illustrates the idea that social media functions as an ecosystem, with many different actors and many different spaces, each influencing the other. An idea that pops up on Reddit, for example, can quickly migrate to TikTok, Instagram, or Facebook.
“The game examines how existing analogies have failed to demonstrate how fake news and misinformation proliferate,” he says. “It is not just an algorithm on a platform that is responsible for spreading lies. It’s the ecosystem that users are part of, where what we say and do and what we scroll or choose to like or share influences what’s going to be popular in those spaces.”
Watch a video of Scott DeJong explaining his research: https://www.youtube.com/shorts/qjy3GzRRsA4
— By Patrick Lejtenyi
— Concordia University