All the fear that’s fit to print: How the media distort our perception of danger

“We don’t like bad news, but we need it. We need to know about it in case it’s coming our way.” — The Bad News by Margaret Atwood

Our brains are hardwired in such a way that we are naturally more receptive to negative news than positive news. According to Yuval Noah Harari, the author of the book Sapiens, we are “full of fears and anxieties over our position, which makes us doubly cruel and dangerous.”

Negative news is dramatic (natural disasters, wars, and stock market crashes), it's sudden, and it's spectacular. Positive events, on the other hand (longer life expectancy, higher standards of living and cures for diseases), tend to be more gradual, unnoticed and dull.

The media focuses on the extraordinary, rather than the humdrum and the mundane. The charge against the media, of course, is that misrepresenting the risk of something bad happening means more newspapers sold, more viewers and more advertising revenue. In short, fear sells. It gets more retweets and more shares too. We're also to blame.

The issue is on the supply side too. Journalists are under greater pressure to write more and more stories. Lack of time and resources means they are less able to go to the source, to verify facts and to provide the proper context with which people can digest the real risk of something bad happening.

Once established, the media's anxiety-fuelled narrative around an extraordinary issue often grows stronger and stronger. As Dan Gardner, author of Risk: The Science & Politics of Fear, highlights:

“The media reflect society’s fear, but in doing so, the media generate more fear, and that gets reflected back again. This process goes on all the time but sometimes — particularly when other cultural concerns are involved — it gathers force and produces the strange eruptions sociologists call a moral panic.”

Science-related stories, especially those about our health, have become fodder for media misinformation and sensationalism. They have also been at the centre of a number of moral panics. Misinformation and sensationalism have their costs. They can divert resources (our empathy and money) to the wrong causes: silent killers like cancer and malnutrition are just two issues that suffer from the resulting lack of attention.

Bogus stats

It is often said that there are lies, damned lies and statistics.

News stories routinely say there is a possibility of something bad happening without providing a meaningful sense of how likely that bad thing is. According to Dan Gardner the media suffer from a “denominator blindness”:

“The media routinely tell people ‘X people were killed’, but they rarely say ‘out of Y population’. The ‘X’ is the numerator, ‘Y’ is the denominator. To get a basic sense of the risk, we have to divide the numerator by the denominator — so being blind to the denominator means we are blind to the real risk.”

Natural frequencies, as they are called, are much easier to understand (1 in 10,000 people, for example), but the media rarely use them.

Another way the media often mislead the public with statistics is by using relative risk. For example, a 2013 article in The Guardian reported that "Cancer risk 70% higher for females in Fukushima area, says WHO". This was actually drawn from statistics showing an increase in absolute risk from 0.77% to 1.29% in the aftermath of the Japanese nuclear disaster of 2011. As the Wall Street Journal (WSJ) reported, the absolute increase is "tiny": about 0.5 percentage points. In contrast to The Guardian, the WSJ headline read, "WHO: Tiny Cancer Risk After Japan Nuclear Accident".
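The arithmetic behind the two headlines can be checked in a few lines. A minimal sketch, using the 0.77% and 1.29% figures quoted above (everything else is illustration):

```python
# Lifetime risk figures from the WHO report cited above
baseline = 0.0077       # 0.77% baseline lifetime risk
post_accident = 0.0129  # 1.29% lifetime risk after the accident

# Relative risk: the increase as a share of the baseline (The Guardian's framing)
relative_increase = (post_accident - baseline) / baseline   # ~0.68, i.e. ~70%

# Absolute risk: the increase in percentage points (the WSJ's framing)
absolute_increase = post_accident - baseline                # ~0.0052, i.e. ~0.5 pp

print(f"Relative increase: {relative_increase:.0%}")
print(f"Absolute increase: {absolute_increase:.2%}")

# The same fact as a natural frequency, which readers grasp far more easily:
extra_per_10k = absolute_increase * 10_000
print(f"About {extra_per_10k:.0f} extra cases per 10,000 people")
```

Both headlines describe the same underlying numbers; only the framing differs, which is exactly why denominator blindness matters.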

Presenting risk accurately, and in a way that you and I can understand, is important. We base our daily decisions on our prior beliefs about risk, and if those beliefs have been heavily influenced by bogus stats presented in the media, then our decisions are also likely to be poor. If a person's or organisation's advice is followed by a very large number of people, then any misrepresentation of the risks (whether deliberate or not) is magnified by the changes in behaviour it encourages.

Risky images

Images have a powerful impact on our perceptions of risk. Without an image there is no emotional hit to the gut; add some emotionally charged images to an article and suddenly it seems much more real. This was vividly demonstrated by Rhonda Gibson of Texas Tech University and Dolf Zillmann at the University of Alabama. The researchers invented a fictitious disease to gauge how the presence of images in media reports affects perceptions of risk. The disease, "Blowing Rock Disease", was said to be spread by ticks in the American southeast, with children particularly vulnerable.

Participants in the study were presented with one of three versions of an article about the disease. Each version had the same text. The first had no image, the second had a picture of ticks while the third had an image of some ticks plus that of a child said to have been infected. Those who got the second article thought the risk was significantly higher than those who were presented with the first. The third article (the one with an image of those nasty ticks plus an infected child) sent the perception of risk higher still.

We're all part of the same world, but that doesn't mean we are all exposed to the same type and degree of risk. Whether through an emotionally charged image or video, or a statistic designed to alarm, the media distort our perception of what to pay attention to outside our homes, from the mundane to the dramatic and from the likely to the unlikely.

What I learned from “Alchemy: The Surprising Power of Ideas That Don’t Make Sense” by Rory Sutherland

I first heard Rory Sutherland speak in an interview he did for the State of the Markets podcast. Unlike almost every other interviewee, he stood out as being able to make completely nonsensical-sounding suggestions for why people behave the way they do, but in a way that actually kind of made sense.

So when his new book (Alchemy: The Surprising Power of Ideas That Don't Make Sense) came out, I jumped at the chance to read it. While not obviously a book related to commodity markets, macroeconomics or investing more broadly, it has some interesting insights that I will certainly be taking on board.*

Be an alchemist

As some of you may know, my background is in economics, but as a number of my posts will attest, I'm not wedded to the notion that economists have all the answers, nor that they are necessarily correct. Indeed, I'd rather float around the edge, where at least I have a chance of seeing other perspectives*:

“The alchemy of this book’s title is the science of knowing what economists are wrong about. The trick to being an alchemist lies not in understanding universal laws, but in spotting the many instances where those laws do not apply”

In the real world, investors, business owners and others with skin in the game shouldn't care whether one thing led to another in some rational, logically thought-out way. What they really care about, and what really drives their decisions, might be something else entirely:

“…we would all benefit if we learn to accept the fact that our unconscious motivations and feelings may have remarkably little to do with the reasons we attribute to them.”

Context matters…a lot

Indeed, context matters a lot more than economists give it credit for. We are, of course, social animals, and the context in which decisions have to be made matters far more than simple experiments suggest.

Rory recounts a story from Nassim Taleb's book Skin in the Game, in which someone explains how, depending on the context, he holds entirely different political preferences: "At the federal level I am Libertarian. At the state level, I am a Republican. At the town level, I am a Democrat. In my family I am a socialist. And with my dog I am a Marxist."

Conventional logic can only get you so far

If, as investors, we always think in the same rational way, then we're all likely to go into the same markets as everyone else. That's okay if you are early, but apart from those with a seriously long-term horizon, being early is becoming increasingly difficult for the average investor.

“The fatal issue is that logic always gets you to exactly the same place as your competitors.”

Escaping ‘group-think’ by using nonsensical reasoning opens up new opportunities that others miss. While institutions have to rationally explain their logic for different strategies, you don’t have to do that. This gives the individual (private) investor a distinct advantage:

“…it doesn’t always pay to be logical if everyone else is also being logical. Logic may be a good way to defend and explain a decision, but it is not always a good way to reach one.”

There may be better ways of meeting your objectives than going down the conventional routes. As Rory says in the book, abandoning the need to appear logical can open our eyes to cheaper, faster-acting and more effective solutions:

“The mythical ‘butterfly effect’ does exist, but we don’t spend enough time butterfly hunting.”

When making decisions under uncertainty, consider the variance, not just the average

Rory describes making decisions under uncertainty as like travelling to the airport to catch a flight. You can use Google Maps to calculate the optimal time to leave the house, taking account of the distance and the current traffic conditions. But this leaves no room for error, especially if Google suggests you take the motorway. What if there is an accident? Suddenly the ETA jumps 50% and you've missed your check-in. As Rory suggests, sometimes taking the slower, sub-optimal route is the better bet:

“Remember, making decisions under uncertainty is like travelling to Gatwick Airport: you have to consider two things – not only the expected average outcome, but also the worst-case scenario. It is no good judging things on their average expectation without considering the possible level of variance.”

You could argue that considering only the average result has been the core reason behind many a folly, with the Great Financial Crisis the most pertinent example. To borrow another Taleb quote (not in Rory's book): "Don't cross a river if it is four feet deep on average."

For while the majority of investors only consider the average outcome, taking account of the potential variance can increase the likelihood of capturing asymmetric returns, while also reducing the chances of a catastrophic event wiping out your portfolio.

Rory’s Rules of Alchemy:

1. The opposite of a good idea can also be a good idea.

2. Don't design for average.

3. It doesn't pay to be logical if everyone else is being logical.

4. The nature of our attention affects the nature of our experience.

5. A flower is simply a weed with an advertising budget.

6. The problem with logic is that it kills off magic.

7. A good guess which stands up to observations is still science. So is a lucky accident.

8. Test counterintuitive things only because no one else will.

9. Solving problems using rationality is like playing golf with only one club.

10. Dare to be trivial.

*As a side note (but probably the most important recommendation I’m going to make in this whole article) I find books on the edge of something I’m interested in often have the most valuable insights. This book doesn’t disappoint.

*One of my favourite quotes in the book, “A change in perspective is worth 80 IQ points.” – Alan Kay (computer graphics pioneer)

Related article: What I learned from “Principles for Navigating Big Debt Crises” by Ray Dalio

Geopolitics and the price of oil

An extract from my book, “Crude Forecasts: Predictions, Pundits & Profits In The Commodity Casino” on how even perfect geopolitical foresight doesn’t necessarily translate into knowing what it will mean for oil prices.

Ruptures in the price trend for many commodities are often the result of geopolitical developments. Political scientists Ian Bremmer and Preston Keat defined geopolitics as: “The study of how geography, politics, strategy, and history combine to generate the rise and fall of great powers and wars among states.” Given its importance to the running of the modern global economy, nowhere is this more vividly observed than in the battle for energy resources and, in particular, oil.

A cursory look at a simple oil price chart dating back to the 1970s reveals a series of bumps. Each of these bumps can be pinpointed to wars and conflicts, whether it was the Iranian revolution, the Iraqi invasion of Kuwait or the US-led invasion of Iraq. More recently, the Arab Spring-related uprisings in Libya or Egypt, civil war in Syria and violence in Iraq and the Ukraine have resulted in escalating geopolitical tensions across many important energy production and transit countries. There is a strong correlation between war casualties in energy producing countries and disruptions to oil output. As many historical episodes suggest, oil producing and distribution systems are hard to keep running when countries are immersed in civil wars or wars with neighbouring countries.

Other commodity markets also face their own geopolitical risk factors; the cocoa market, for example. Supply is concentrated in West Africa and in one country in particular, the Ivory Coast (supplier of approximately 40% of the world’s crop). Frequent political instability in the area has resulted in unrest and outright civil war, disrupting the production and export of cocoa from its ports.

If geopolitics plays such a big role in seismic moves in the oil market and many other commodities, then if geopolitical shifts can be forecast with any accuracy, this must give forecasters an edge, right? The Good Judgment Project, set up by Philip Tetlock, set out to answer the first part of this question. It explored the profiles of the best of hundreds of forecasters who made over 150,000 predictions on roughly 200 events during a two-year period. Forecasters were asked a multitude of questions, such as: Will the United Nations General Assembly recognise a Palestinian state by September 30th 2011? Will Bashar al-Assad remain president of Syria through to January 31st 2012? The researchers found that forecasters can be good at spotting changes, but only over long timescales.

The problem with geopolitical events is that they tend to be binary outcomes (although clearly not always). They either happen in the future or they don’t. This contrasts with what we might term “market” or “economic” risks which are more dynamic. There are three main problems with binary outcomes: first, they offer little information advantage for investors to play with; second, they are hard to predict and, third, they offer few easily identifiable markets that might benefit from a particular outcome.

Even if you have fantastic foresight about how a geopolitical event is likely to develop, the next problem is decoding what the impact is likely to be on a range of different commodity markets. All too often pundits focus on the immediate effect; for example, based on whichever candidate wins an election. However, they forget to connect the dots as to how the “narrative” could change once the geopolitical uncertainty of the political event falls away. Even if you could correctly forecast that the regime of a particular oil-producing nation would be toppled within a given year, you wouldn’t be able to know the exact path that oil prices would take as a result.

Just as it’s very difficult to determine cause and effect in the past, it’s even harder to do so in the near future, even when you have a good grasp of all the information that could affect prices. In the first two months after Iraq invaded Kuwait in 1990, the oil price doubled from around $20 per barrel to almost $40 per barrel. If you had asked any intelligent “analyst” or journalist, they would have predicted a rise in the price of oil in the event of war. Their reasoning would have been simple: war in the most important oil producing region in the world creates the risk of disruption to oil distribution and supply.

Through late 1990, oil prices rose relentlessly as the war drums beat louder and louder over Kuwait, until the cacophony became unbearable. As soon as it became likely that the US and its allies would invade Kuwait to push back the invading Iraqi forces, oil prices gradually fell back. Indeed, once the US-led coalition began its bombing campaign in early 1991, prices fell in a matter of days from $30 per barrel to below $20 per barrel. As Nassim Taleb’s fictional character, Tony, describes in the book “Fooled by Randomness”, “War could cause a rise in oil prices, but not scheduled war – since prices adjust to expectations. It has to be in the price.”