Fake news is a hot topic these days. And it’s not just a political issue. This garbage might even be hiding in your well-intentioned but misleading PowerPoint presentation.
FAKE NEWS IN YOUR SLIDESHOW
You’re putting the finishing touches on your amazing presentation. All you need to top it off is an affirming quote from some brilliant dead person - Einstein, Lincoln, Twain. Their endorsement will prove beyond a shadow of a doubt that your thinking is clean and correct.
And the internet doesn’t let you down. There are an unlimited number of quote sites offering up this kind of stuff.
For instance, since I’m writing about fake news I could use this Mark Twain quote…
“If you don’t read the newspaper, you are uninformed. If you do read the newspaper, you are misinformed.”
Unfortunately Twain never said it, or wrote it.
So says Garson O’Toole, an expert on quote attributions. I suggest you take a minute to check out his site so you don’t accidentally become the bearer of fake quotes.
FAKE NEWS IN YOUR FAVORITE RESEARCH STUDY
Back to your PowerPoint, where you’re in a pinch. You just ditched the quote you loved because it turned out to be fake.
No worries. You still have an awesome research study you can reference. It supports your point of view and it’s backed by a bunch of PhDs armed with big data and kick-ass graphs.
Or maybe not. Maybe it’s more fake news. Or at least news that isn’t as real as you might think.
So says billionaire John Arnold. He’s an ex-hedge-fund guy who’s calling bullshit on lots of researchers and their studies. He’s doing this by trying to recreate their work.
Point being that if only the original researcher can make the study work, it might not be valid. Sound research should be repeatable.
And what Arnold’s team has found is not good.
In the field of psychology they’ve only been able to replicate 40% of the studies they’ve attempted. In other words, 60% are NOT replicable.
That’s bad. Maybe even horrible. But it could be overstated - maybe Arnold’s team isn’t good at replication. Either way, it’s a finding that’s getting lots of attention. It’s forcing folks to take a hard look at potential causes of the problem. And one culprit they’re shining a lot of light on is the confusing difference between correlation and causation.
CORRELATION vs. CAUSATION
Correlation measures the extent to which two variables move in tandem.
Causation or causality measures the influence that one variable has on another.
Here’s a graph from a website that explores how people get confused by these two factors.
This image charts two variables.
US spending on science, space, and technology in red.
Suicides by hanging, strangulation and suffocation in black.
Notice that these two lines are almost identical. This is an example of high correlation. Which is nice to know, but not very useful by itself. The million-dollar question is whether movement in one variable CAUSES the other to move. So, for instance, if we increase technology spending will suicides also increase?
Using common sense you can deduce that the answer is no. Greater tech spending isn't likely to CAUSE more suicides. Which means these two data points are closely correlated, but not causal. A change in one does not CAUSE a change in the other.
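You can see this with a few lines of code. The numbers below are invented to mimic two steadily rising series like the ones on that chart; they are not the actual spending or suicide figures.

```python
# Illustrative only: made-up numbers mimicking two steadily rising series,
# not the real spending or suicide data.
spending = [10, 11, 12, 13, 14, 15]         # e.g. yearly spending, arbitrary units
unrelated = [100, 112, 119, 131, 140, 148]  # a totally unrelated, also-rising series

def pearson_r(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson_r(spending, unrelated)
print(round(r, 3))  # close to 1.0: strongly correlated, yet neither causes the other
```

Any two series that happen to trend in the same direction will score near 1.0, which is exactly why a high correlation number on its own proves nothing about cause.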
No big deal when you’re looking at variables that are so obviously not related. But a real bitch to isolate when you’re looking at variables that seem like they could or should be related.
Here’s a real-life example from a website called Sense about Science. They’re a group devoted to improving the statistical literacy of researchers and journalists.
"For example, eating breakfast has long been correlated with success in school for elementary school children. It would be easy to conclude that eating breakfast causes students to be better learners. Is this a causal relationship—does breakfast by itself create better students? Or is it only a correlation: perhaps not having breakfast correlates highly with other challenges in kids’ lives that make them poorer students, such as less educated parents, worse socio-economic status, less focus on school at home, and lower expectations.
…But the clear message here is that a causal relationship has been extremely hard to establish, and remains in question."
So here we see some correlation but CANNOT know for sure if there is causation. We can’t say for sure that eating breakfast CAUSES kids to learn better.
But if my company sells pre-made breakfasts to schools I’m going to be wired to see causation. To convince myself that a great breakfast CAUSES better academic performance. The data is murky enough, and I'm so convinced of the connection, that I'm going to see it.
And even if I don't, some journalist will likely see it for me.
Journalists need stories to hit deadlines. Journalists love sweet sounding research. And at most colleges the journalism and math departments aren't next door to each other. The writers don't tend to do statistics in their spare time. So they might not be the most capable fact checkers when it comes to these studies.
So they pump out a headline grabbing story suggesting a causal link between eating breakfast and improved grades. Exactly what I, the guy who sells breakfasts to schools, was looking for. I don’t even have to look at the data myself. I just grab the story, paste it into my marketing campaign, and I’m off to the races.
This is the fake news you have to watch out for. The crap that can pollute your otherwise trustworthy PowerPoint slides.
PROTECTING OUR REPUTATIONS
No one has the time to dive into the details behind every interesting study.
But if you’re using a data-based claim to backstop your most critical argument, you should take the time to go deep. You don't want to be spreading false information. Your reputation is on the line if someone takes the time, that you didn't take, to call bullshit on your claim.
For me, this website is my PowerPoint presentation. And no, I do not go deep on every single data point I share. But I do go extra deep on points that underpin my work.
For instance, the Herzberg motivation research that sits underneath my Maps. I first saw this work cited in a book by Clayton Christensen titled How Will You Measure Your Life? I was intrigued. It explained a lot of my past successes and failures, as well as mixed feelings I’d had about many career experiences. So I spent time reading pro and con reviews. And I took the time to read Herzberg’s book from the 1950s - the source material for his original motivation study.
Worth the time?
Absolutely! I’ve read that book at least five times and I still refer to it regularly. Understanding the details gives me greater confidence in the validity of my own personal Maps. And helps me to write with more confidence.
For stuff that really matters, we need to take the time to study the source material. Question its validity. Understand its applicability. And make sure that we’re not carelessly putting our good names behind fake news.
Until next time…
***Note: This site works best when you read the posts in order. So please head to the ARCHIVE to get started.