Walkthrough for YLE’s “Troll Factory,” an educational game about disinformation campaigns

Before reading this walkthrough, you probably want to play Troll Factory a few times and see if you can get 15,000 shares and 2,000 followers within a week. If you do, you too can be the “Lord of Lies.” The game is free to play, but be warned: the content is repellent, as it has to be. “Troll Factory” uses the hate we see online to make its case about disinformation forcefully.

Why does an educational game need a walkthrough? An educational game is typically meant to get students talking about something immediate: how it has affected them, and how it has more impact and reach than we previously realized. But I’ve learned the hard way that sometimes you can’t get the discussion a class needs during class itself. In those cases, a proper discussion has to be modeled somewhere. This isn’t to tell everyone what to think, but to show a way of exploring a topic and questioning one’s own assumptions about how the world works: a way of coming to terms with things only partially observed or strongly felt.

So yes, an educational game most of all needs a walkthrough. YLE does a nice job of pointing out the prevalence of fake news, emotionally charged content, bot armies, and conspiracy cults online. Here’s a screenshot of where they explain the sort of content you use in-game to gain followers:

This does a good job of helping explain some of the people we run across online, people who will not read past the headline. Some people want to react; they want to feel confirmed in what they already feel. But how does this work in tandem with fake news? With conspiracy theories? The answer isn’t as obvious as you might think, especially if it’s your job to spread your posts and build an audience.

I think at this point in the Internet’s history, it’s safe to say all of us have at some point tried to build an audience. What’s remarkable about Troll Factory is that, from your very first post (you could probably substitute “tweet” here), you don’t have to work all that hard to do it. Your first post is invariably hate, and it brings the followers to you. In the anti-immigration disinformation campaign, you get the choice of generic “defend the borders” rhetoric, a snarky “liberals are also the enemy,” or a response to another tweet that curses out an entire religion. No matter what you pick, you’ll get a few followers.

To get followers for the in-game hate campaign, I’ve tended to use the tweets with hashtags, but it isn’t clear to me this makes a substantial difference. People who hate are actively looking for others who share their anger, it seems. I wonder if the makers of Troll Factory would agree with this: the type of content you share will, by itself, attract the audience. After your first post, you share a meme. Your first post gets a few followers, but the meme can bring you 100. Later, one of the in-game events that inevitably expands your reach is a “fake news” event: your boss has you distribute deliberate disinformation to your followers. This spreads to other nodes no matter what; from what I can tell, it reaches beyond your current following. It may be fair to say there’s an audience intentionally looking for fake news, even if that audience is others involved in the disinformation racket. (Those involved in the racket, like Trump, need fake news to find the easiest marks.)

There are three events which, if used properly, cause your shares and following to spike dramatically. After the game makes you post a hateful meme, you’ll be asked if you want to reach a targeted audience. You must reply “Thanks, I’ll give it a try” to get a multiplier effect on your followers for the rest of the campaign:

Screenshot from “Troll Factory.” The awful meme does not represent my views; migrants should not be demonized.

You are subsequently asked to buy a botnet or visibility for your posts. If you buy bots that share and like your content 100,000 times a day, or spend 50,000 euros on visibility, your shares and follows jump dramatically. If you try for more than either (say you pick a million times a day, or spend too many euros), you have no impact: companies and people recognize you as spam if you try too hard. Try too little and you also have no impact. It’s a weird principle, because in reality what you’re doing, whether done a little or a lot, is spamming and is quite recognizable as spam. The question of why an audience is desensitized to some hate but not to a deluge of it speaks volumes about the “content” you’re sharing.
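For readers who want the mechanic laid out plainly, here is a minimal sketch of how the purchase step seemed to behave in my playthroughs. This is not YLE’s actual code; the option labels and the size of the boost are my own placeholders. What comes from the game itself is the sweet spot (bots at 100,000 shares a day, or 50,000 euros of visibility) and the fact that overshooting or undershooting it does nothing.

```python
# A rough sketch of the "buy reach" step as it behaved in my playthroughs.
# Not YLE's code: the option labels and the 5x boost are placeholders.
# The sweet spot (100,000 shares/day of bots, or 50,000 euros of visibility)
# and the "no effect" outcomes above and below it are what the game shows.

def reach_multiplier(option: str) -> float:
    """Rough multiplier on shares/followers for a given purchase choice."""
    sweet_spot = {"bots_100k_per_day", "visibility_50k_euro"}
    recognized_as_spam = {"bots_1m_per_day", "visibility_overspend"}
    if option in sweet_spot:
        return 5.0   # shares and follows "jump dramatically" (multiplier is a guess)
    if option in recognized_as_spam:
        return 1.0   # trying too hard: flagged as spam, no impact
    return 1.0       # trying too little: also no impact

for choice in ("bots_100k_per_day", "bots_1m_per_day", "too_little"):
    print(choice, reach_multiplier(choice))
```

The point the game makes, of course, is that the “sweet spot” is itself spam; the threshold only determines whether anyone bothers to notice.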

Finally, your boss asks you to help spread fear. The choice that always garners the most shares and follows centers on a conspiracy theory about an airline crash. You will not get quite the same response with slanderous, dangerous Islamophobia. The strongest response comes from feeding your audience a conspiracy theory (no less racist or slanderous or dangerous, mind you) and giving them a very simple, uncluttered code.

I’m honestly not sure what to make of this, the last part of the game. When we were studying Plato’s Apology of Socrates a few weeks ago, we talked about how Socrates says he has been slandered since many of the jurors were children. Aristophanes’ Clouds, as mass media, became a sort of tradition. Mass media isn’t really mass media. It’s an amplification of belief, an amplification of a community’s perception of what reality is. No wonder certain groups always rail against the media: they think it’s the “only” obstacle between the world as it is and the world as they believe it ought to be. Hate sells, and it needs to be made far less profitable than it is. But there’s something else out there, a complete disconnect from reality, a warped vision of community or divinity or togetherness, that we have only glimpsed.
