The Information Laundromat

There’s been a lot in the news recently about suspected Russian interference in elections, so-called “Fake News” and misinformation in general. It’s a topic that fascinates me, not least because I like to think that I’m smart and able to make my own mind up about things.

Of course, that’s a very naive view. We’re all susceptible to manipulation; otherwise the PR, marketing, investor relations and lobbying industries wouldn’t exist. But the nature of how we can be, and are, influenced is changing radically, in particular through social media and the internet.


Thanks to technology that puts broadcasting capability in everyone’s hands, and to social media platforms, anyone can share their content and opinions with a worldwide audience. Influencers now have the same reach as traditional media, if not greater.

One of the reasons social media works so well is that, on the face of it, it is self-selecting: we choose who we share content with and whose content we see, whether that’s getting an update from Great Aunt Maude, who lives on the other side of the world, or watching a video of a celebrity we find entertaining.


At heart we’re social animals; we naturally seek validation from others and like sharing stories and experiences. We also trust our friends and acquaintances. Social media amplifies and facilitates these tendencies. That can be good (charitable causes, connecting families living on different continents) or, more relevantly for this blog, bad.


The latter comes in many forms. One is “astroturfing”, where influential journalists or politicians have their Twitter feeds flooded by huge numbers of fake accounts (some bots, some not) that seek to persuade them that a particular line of thinking is more widely accepted than it actually is, with a view to influencing their coverage or policy-making. Like this Tweet from a well-known UK journalist:



Or the creation of “clickbait” stories that are simply untrue, but which play to people’s fears and which they then, for well-intentioned reasons, share with their friends. More on that in this excellent Wired magazine article:



I’m reminded of money laundering, the process by which criminals take the proceeds of crime and then “clean” them by mixing them with legitimate sources of income through the banking system.


Social media is like an Information Laundromat, whereby things that aren’t true are pushed through channels that give them legitimacy. So a story you see in your Facebook feed gets your attention whether you know it or not. Obviously the things shared by your Friends have the most impact, but even those that aren’t will work on your subconscious.


Sometimes content that begins as “promoted content”, which the social media network weaves into your feed between the things you’ve asked to see, is re-shared by users, turning an advertisement into an endorsement.


The mainstream media also gets in on the act. Have a look at how many stories are now sourced from social media. It’s a natural place for them to look, especially given the underinvestment in proper journalism these days. But it also gives further legitimacy to social media as a reliable information source.


It probably shouldn’t surprise us that people are seeking to manipulate opinions via social media. It’s low-cost, easily scalable and, sadly, it works.

Because we’re predictable in how we behave and we’re not properly wired for the internet age, we’ve got a huge case of Human Risk in action.


We’re bombarded with ever-increasing volumes of information that require us to take more decisions than we’ve ever had to before. To navigate that, we rely on shortcuts that are instinctive to us. We trust the familiar and we become creatures of habit; just try switching the location of app icons on someone’s phone and see how it frustrates them! We can get information on everything and anything. Not all of it accurate.


Unfortunately, the tools we use to navigate today’s world aren’t necessarily going to serve us as well as they might have done in days of old.


Obviously this is likely to be more of an issue for digital migrants than digital natives.

Although, arguably, the cynicism that older digital migrants demonstrate in the face of online banking and new perceptions of what constitutes privacy might serve them better than those who are more trusting.


So what can we do?

The good news is that there’s lots we can do. It starts with being better informed ourselves and sharing that understanding with others. I recommend the work of Mike Hind, who has a fascinating podcast called The Disinformation Age, and the research done by a not-for-profit organisation called First Draft News.


Then there’s the fabulously titled Calling Bullshit, an excellent series of lectures that examines the proliferation of BS in the modern world and promotes some ideas for dealing with it.


We can also stop taking everything at face value and question things more. Mike has some simple tips on his website about how to stop bots from polluting your Twitter feed.


Given that it’s a technological problem, there are ways that tech can help solve it. There’s a wonderful initiative called Re:Scam, which uses AI to power an email bot that wastes scammers’ time. If you forward scam emails to them, they’ll put their bot onto it. The more time scammers spend engaging with the bots, the less time they have to focus on exploiting Human Risk in others.

