Don't like something on Facebook? You can shut down the page.
That tactic is apparently now open to anyone with a large Facebook following. This week a pro-science Facebook page with more than 70,000 followers was shut down because a mass complaint was organised by people who took offence at what was written there.
The complaints triggered an automatic response from Facebook that removed the page without human oversight.
And what was written there that justified such a reaction? Well, a middle school teacher in the United States had enough of the spread of pseudo-scientific misinformation on the internet - and decided to do something about it. He set up a page somewhat provocatively called We love GMOs and vaccines, and started sharing scientific information about these topics, along with some memes mocking those who propagate the myths about them. You can get a feel for the type of information they were publishing from their website. It's worth noting that while these topics aren't controversial in the scientific literature, these facts are rejected by several well-organised alternative medicine groups, many with a large Facebook following.
Some of those groups organised their supporters to report the page to Facebook administrators, and without any review or justification whatsoever the page was permanently deleted. Apparently no recourse or appeal is available.
The implications of this are serious.
Already it looks like a pseudoscientific page has been taken down in retribution.
While some people have suggested that retribution is the answer, we must remember that while social media does allow people to spread misinformation, it is a medium that also allows people to find credible and life-saving information.
What happens if groups like these try to shut down more serious and legitimate Facebook pages? What might happen if they tried to shut down the CDC or the NHS or the BetterHealth Channel here in Australia?
Of course this isn't confined to health or science pages. Anyone with enough organisational ability can apparently take down any page they don't like. What happens when football fans, racists or even terror groups cotton on to this?
Usually in online debates the first person to mention Hitler loses, a principle known as Godwin's Law. But I'm going there.
Destroying communications channels rarely works. In fact, as I've written before, it can cause disastrous unintended consequences. The purpose of burning books or intimidating your critics or shutting down Facebook pages is to prevent people from putting forward their points of view. But we should consider all points of view and encourage people to publish them. Shutting down a communications channel doesn't stop people from believing in misinformation, and there are much better ways of challenging that misinformation than driving it underground. Throughout history it is the people who are most afraid of the truth that most often do these things. People who speak the truth should have no fear of debate. And if it is science's job to counter misinformation, then scientists and science communicators need to know what this misinformation is and where to find it.
And if you're anything like me, you've been frustrated at how difficult it can be to have offensive and even illegal information legitimately removed from Facebook - even when the material contravenes its policies. And the filter-bubble effect is already making it hard to challenge misinformation. But these are not reasons to shut down people's Facebook pages just because you don't like them or don't agree with them.
Instead of retribution, all science communicators should campaign together to persuade Facebook to revise its policies to make sure that legitimate - and even misguided - pages can't be taken down just because a mob of opponents don't like the information on them. We should welcome all points of view so that they can be debated.
After all, the scientific community is strongest when it welcomes debate and meticulously challenges misinformation instead of participating in a modern-day book burning.
For more on this topic read Dr Steven Novella's excellent opinion piece.
Humans are lazy. We do what we can to protect our daily allotment of thinking resources - and that means trying not to invest too much energy into thinking if we can get away with it. The reason why we do this makes evolutionary sense. Why invest limited brainpower into an activity when you might need it later that day for an even more crucial task?
We even have a somewhat lazy term for a person who is doing this - a cognitive miser (Fiske & Taylor, 1991).
If our thinking resources are capped (and psychologists suggest they are) we should try to keep some thinking resources available just in case we need them later on. So, in a binary choice between delving into reams of scientific journal articles or listening to an opinion, many humans would choose listening to an opinion. That's because opinions are relatively easy to understand. Nobel Prize-winning social psychologist Daniel Kahneman summarised the science of these categories of thinking in his book Thinking, Fast and Slow, explaining that there are two types of thinking - type one, which is intuitive, quick and easy to engage, and type two, which is slower, takes more effort to operate and is more rational (Kahneman, 2011).
Without any hint of irony, Instaread Summaries has produced a 30-minute summary of it to cater for people too lazy to read the 504-page version, but given the topic I'm tipping your type two brain would still get a fair work-out.
Chaiken (1980) and Eagly & Chaiken (1984) showed that people develop clever little work-arounds so they can use their type one brains and keep type two power in reserve. Petty & Cacioppo (1986) took this concept further and developed what is known as the Elaboration Likelihood Model (ELM) of Persuasion. If you're wondering what the strange graphic at the top of this site is all about, that's what this is. ELM suggests that one of the little workarounds people develop (among many others) in preference to using their type two brain is to find someone they trust and listen to what they have to say about a difficult topic.
Of course, that's not to say that everyone will prefer opinion over hard data, and everyone will want hard data some of the time. Depending on whether you're heavily and personally invested in a topic, and whether you have enough mental ability to process a difficult topic, you might seek out an opinion leader rather than investigate all of the facts for yourself.
And that's why more people listen to people like Andrew Bolt than read the data from the Intergovernmental Panel on Climate Change. It also helps to explain Donald Trump's current popularity - at least among Republican voters. But before you criticise these people for being lazy, remember that you're often lazy too. Just think about the last time you put in your football tips - I bet you came up with your own clever little workarounds rather than use statistical analysis.
And that's the danger for Americans in the coming presidential election. While Trump might be unappealing to those who carefully weigh the facts and the merits of their choice, large numbers of voters don't do this and use peripheral information instead.
Chaiken, S., 1980. Heuristic versus systematic information processing and the use of source versus message cues in persuasion. Journal of Personality and Social Psychology, Volume 39, pp. 752-756.
Eagly, A. H. & Chaiken, S., 1984. Cognitive theories of persuasion. In: L. Berkowitz, ed. Advances in experimental social psychology. s.l.:Academic Press.
Fiske, S. & Taylor, S., 1991. Social Cognition. Second ed. New York: McGraw-Hill.
Kahneman, D., 2011. Thinking, Fast and Slow. First ed. London: Penguin.
Lau, R. R. & Redlawsk, D. P., 2001. Advantages and Disadvantages of Cognitive Heuristics in Political Decision Making. American Journal of Political Science, Volume 45, pp. 951-971.
Petty, R. E. & Cacioppo, J. T., 1986. The Elaboration Likelihood Model of Persuasion. Advances in Experimental Social Psychology, Volume 19, pp. 123-205.