How to Find Trustworthy Medical Information in the Age of Fake News
How do you know if you can trust the information you read online? When it comes to understanding fertility and pregnancy, myths and conflicting information abound. So, how do you know which information you can trust? (And how do myths develop in the first place?)
This week, I spoke with Cailin O’Connor, an expert in how myths spread. O’Connor is a philosopher of science and co-author of a new book, The Misinformation Age: How False Beliefs Spread. Read below for an edited transcript of our conversation.
Ava: How do false beliefs spread?
O’Connor: So the reason we wrote this book is that there’s a crisis of false belief right now: a lot of people have come to adopt false beliefs from the internet, and fake news and conspiracy theories have been cropping up and becoming quite powerful. My co-author and I noticed that a lot of people were explaining these with cognitive biases, or issues basically related to our cognition. Of course, these are really important for understanding why people believe things. But probably much more important is the fact that almost everything we know comes from other people. So, think about the scientific things you believe: that you can eat tomatoes, that the earth moves around the sun, or that we evolved from primates. You know these things because someone told them to you, not because you reasoned your way to that understanding, right?
Most of our knowledge is social in this way. And we thought that if we want to understand false beliefs, we really need to focus on the social aspect of knowledge: the way people share ideas and information with each other, and the way people’s care for their social connections and environment shapes how they share ideas and beliefs.
Ava: How can people researching online know that what they are reading is backed by research—or whether it’s even true?
O’Connor: Yeah, so we all have to trust other people quite a lot when it comes to our beliefs, because all of our knowledge is social in this way. So, I believe that the earth moves around the sun, and the reason I believe that is that I trust the scientists and the textbooks that tell me this is true, right?
We all have ways of deciding who to trust and who not to trust, which sources to trust, and which information to trust; people have their own sort of rules for how to do that. A lot of what we use is what you might call heuristics: quick and dirty rules for making a good guess about whether some piece of information is true or not.
Ava: What about handling conflicting information, particularly on the topic of pregnancy?
O’Connor: That’s incredibly tricky, and one thing you see people doing when that happens too much is throwing up their hands, especially when it comes to scientific facts, and thinking, wow, this is all just a mess; I can’t trust any of these people. Of course, in some cases when you’re seeing conflicting claims about science, on a topic like pregnancy, sometimes there just isn’t scientific consensus on the matter.
And in other cases, there isn’t a genuine conflict over the facts. Someone is just wrong, and there is a scientific consensus on the matter. So then you have to try to go out and find really trustworthy resources to verify what you’re hearing. Of course, you should trust something that comes from a scientific article, a really legitimate news source, or a website like WebMD much more than something from an activist website meant to promote an idea. I think the best you can do is look at your sources, pick ones that are dependable, and then go out yourself and try to verify the information if you can.
Ava: What about when people believe something that’s been disputed by science, for example, that miscarriage is caused by something the woman did or did not do?
O’Connor: I’m not an expert on this, and it’s hard to say in any one case like that. But one thing that we talk a lot about in the book is connected to the persistence of falsehoods: we all share social information and have to trust each other about this information. We have this social system where it’s possible for false beliefs to spread, persist, and kind of take on a life of their own.
We see this happening all the time, especially in the medical world, where it can be pretty hard to get really good information about what’s true and what’s not. There have been tons of cases in history where we believed something false for a really long time: for example, that mercury is a safe treatment. So if people have heard a misbelief from people whom they know, love, and trust, they’re going to persist with that belief.
Ava: Has social media amplified that?
O’Connor: In general, social media has caused changes to our social structure that have changed the way false beliefs can travel. One element is the way people choose their social partners. It used to be that everyone had friends who lived in their town, and they were just connected to their family, their neighbors, or the people around them. Now, of course, people can choose social partners whom they’ve never met in person, but who they’re connected to over the Internet. And they can choose them because they share some kind of belief.
People who think vaccines cause autism can connect with other people who share those beliefs, and they can support each other, share evidence that perpetuates the belief, and create a community around false beliefs in a way that was much harder before we had social media.
It still happened sometimes before, but it was just much more difficult. And the other really important thing about the way social media has changed the spread of beliefs is that now, people who want to influence what others believe (say, people in industry, sometimes pharmaceutical companies, sometimes other governments, such as the Russian government) can get much more direct contact with everyday people. And they can do it while posing as another person or as an authority, someone their targets might trust.
Right now, people often have these connections online where they think they’re talking to another like-minded person, but it might actually be a Russian bot or something like that. So that’s a big change from what we had before.