
    How we can make better predictions

    August 28, 2019

    A few individuals have a heightened ability to forecast what will happen to companies, the economy and politics. What traits do they share?

    By William Park

    Some people have a gift for predicting the future. Not a vague, ambiguous prediction, but reasoned, cautious and thoughtful foresight. These people can see the likelihood of a company’s commercial success or the outcome of elections better than anyone else. They are called “super-forecasters”, and what they can teach us about how to make smarter judgements could save companies billions, or even prevent countries from going to war.

    In 2007, Steve Ballmer, Microsoft’s then chief executive, told USA Today that “there’s no chance that the iPhone is going to get any significant market share”. He later went on to predict Apple might take 2-3% of the growing mobile market. Ultimately, Apple’s global market share peaked at about 23% in early 2012. Ballmer made bad predictions. (This article is adapted from an episode of CrowdScience from the BBC World Service. You can listen to the whole programme by following the link – Can I predict the future?).

    You might also like:

    • The best time to make a change to your life
    • Lies, propaganda and fake news: A challenge for our age
    • How to make wiser judgements about the future

    Ballmer did not possess the characteristics of a super-forecaster – humility, open-mindedness and inquisitiveness, among other things. What made things worse was his unwillingness to amend his forecast. The people who make the best predictions about the future are also happy to change their prediction when presented with new information. Ballmer did not, and Microsoft’s presence in the smartphone market suffered as a result. Some critics have described him as “the worst CEO of a large public company” in the US.

    If we know what makes someone good at predictions, can we teach people to be better forecasters? Perhaps, if it is possible, mistakes like Ballmer’s could be avoided.

    Good judgement

    The idea of cultivating super-forecasters has only fully been investigated in the last few years. From 2011 to 2015, the Intelligence Advanced Research Projects Activity (IARPA), which is part of the US Office of the Director of National Intelligence, ran a tournament to find teams of naturally talented forecasters. The payoff for the US intelligence community here was potentially huge: a large, organised group of forecasters who could make significantly smarter predictions than their own intelligence officers.

    In total, 25,000 forecasters took part, making predictions on everything from the future of the Eurozone to the likelihood of Vladimir Putin losing power in Russia. Forecasters who worked in teams outperformed even well-trained individuals – the theory being that each member balances out any biases in the other members. The winning team, the Good Judgment Project, subsequently became a forecasting business. (Read about the simple rule that can help you predict the far future.)

    “If you look at some of the media or political experts at the moment – watch the news – everyone is saying ‘This is unpredictable, I don’t know what will happen’,” says Kathy Peach, who leads Nesta’s Centre for Collective Intelligence Design. “If you open a newspaper you’ll see those incredibly complicated flow charts of all the different Brexit outcomes. So, if the experts are saying they can’t predict what will happen, if the statistical methods of prediction don’t work because there is no historical precedent for this, then how are you meant to make decisions either as an individual or as an organisation?”

    The Centre for Collective Intelligence Design is part of Nesta, the UK innovation foundation, which is collaborating with BBC Future to find individuals who can forecast global events and to investigate their predictions. (Read more about how the challenge has gone so far here.)

    “In a time of increased complexity and uncertainty it’s actually increasingly less likely that any one single individual will have access to all the information about what is happening,” says Peach. “But if you take a collective view by bringing together and combining the predictions of lots of different people you get to a more accurate result because they all hold different pieces of information that help to build a more complete picture overall. By combining those individual forecasts you’re also cancelling out, perhaps, some of the biases or inaccuracies that might exist in one individual forecast alone.”
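    The aggregation Peach describes can be sketched in a few lines. This is a minimal illustration, not the method any particular forecasting platform uses, and the probabilities below are invented for the example.

    ```python
    def combine_forecasts(probabilities):
        """Aggregate individual probability forecasts by simple averaging.

        Averaging tends to cancel out individual biases: forecasters who
        are too optimistic are offset by those who are too pessimistic,
        so the crowd estimate is often closer to the truth than most
        individual estimates.
        """
        return sum(probabilities) / len(probabilities)

    # Five people independently estimate the chance of the same event
    # (hypothetical numbers).
    individual_views = [0.20, 0.35, 0.50, 0.30, 0.40]
    crowd_estimate = combine_forecasts(individual_views)
    print(round(crowd_estimate, 2))  # 0.35
    ```

    Real aggregation schemes are more sophisticated – weighting forecasters by track record, for instance – but the bias-cancelling intuition is the same.
    
    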

    Pattern recognition

    The Good Judgment Project’s team of super-forecasters repeatedly made excellent predictions – so what set the individuals in the team apart from the rest? As an original member of the team, Michael Story is one of the world’s best super-forecasters. Now, he is managing director of Good Judgment Inc, the commercial spin-off of the project. “When we test people to see whether they are likely to be a good forecaster, the number one predictor of forecasting isn’t subject knowledge or anything like that, it is pattern recognition from pictures,” says Story.

    What traits make super-forecasters?

    According to the Good Judgment Project, super-forecasters recognise that few things are certain, they constantly test their beliefs like they are hypotheses and are not wedded to a single idea or agenda. Below are the traits they say are shared by the best-performing super-forecasters – how many do you have?

    Philosophical approach and outlook: cautious, humble and nondeterministic

    Abilities and thinking style: open-minded, inquiring, reflective and numerate

    Methods of forecasting: pragmatic, analytical, synthesising, probability-focused, thoughtful updaters, aware of biases

    Work ethic: improvement-minded, tenacious

    By asking people to spot patterns in a series of photographs you can make an accurate assessment of whether someone is a good forecaster. But to do this, they need to be able to see past their own biases.

    Story describes confirmation bias – where we selectively look for evidence to support our own ideas – as being like playing cards. How many times do you or other players say “Oh I knew you had that card” after someone reveals a winning hand? You feel certain that you knew they had those cards all along, but in reality you thought through several possibilities. When one of those is revealed you convince yourself that that was the possibility you felt most strongly about.

    “I always thought I was quite good at predicting things and so on, but it is also extremely easy to convince yourself of that fact,” says Story.

    Philip Tetlock, one of the founders of the Good Judgment Project, describes super-forecasters as being “somewhat distinctive psychologically”. They possess a unique combination of characteristics that enable them to overlook their own prejudices.

    “I think that if I had to identify one particular thing, it is that whereas most people think of their beliefs as something very precious and self-defining, even sacred sometimes, super-forecasters tend to see their beliefs as testable hypotheses that should be revised in response to evidence,” Tetlock says. “That means they tend to be better belief-updaters... as news comes in and requires either moving a probability up or down.”

    Super-forecasters have also been observed to make smaller changes to their predictions than other people, when given the opportunity to revise them. It might be the case that their original estimate was already pretty accurate. But researchers argue that super-forecasters can think of many more possible outcomes than an average forecaster. When they revise their answer they move to a slightly different response rather than jumping wildly to a completely different solution.
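    The belief-updating Tetlock describes – moving a probability up or down as news comes in – can be illustrated with Bayes’ rule. The numbers here are hypothetical, chosen only to show why a good updater makes a measured move rather than a wild jump.

    ```python
    def bayes_update(prior, likelihood_if_true, likelihood_if_false):
        """Return the posterior probability of an event after one piece
        of evidence, via Bayes' rule."""
        numerator = prior * likelihood_if_true
        denominator = numerator + (1 - prior) * likelihood_if_false
        return numerator / denominator

    prior = 0.30  # initial estimate that the event will happen
    # A news report arrives that is twice as likely to appear if the
    # event really is coming (0.6) as if it is not (0.3).
    posterior = bayes_update(prior, likelihood_if_true=0.6,
                             likelihood_if_false=0.3)
    print(round(posterior, 3))  # 0.462
    ```

    One moderately informative report moves the estimate from 30% to about 46% – a meaningful revision, but nowhere near certainty, which matches the observation that super-forecasters make smaller, more frequent adjustments.
    
    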

    Research suggests that groups of women are more collectively intelligent than groups of men

    Peach says she is also keen to explore whether women make better group predictions than men. Similar research suggests that groups of women are more collectively intelligent than groups of men but further work is needed to see whether collective intelligence equates to super-forecasting. The more intelligent you are, for example, the more likely you are to be guilty of confirmation bias, as you are better at finding reasons to support your own argument.

    Should you wish to improve your forecasting skills you will be pleased to hear that super-forecasters are partly discovered and partly created, according to Good Judgment Project co-founder Barbara Mellers. The more predictions you make, the better you get.

    BBC Future has teamed up with Nesta to make our own crowd-sourced predictions about the future. You can flex your own forecasting muscles by signing up to the You Predict the Future challenge. Just taking part could turn you into a better forecaster…

    How To Hack Your Year: Psychology

    The best time of year to make a life decision?

    Many of us make big decisions in January. But there are some compelling reasons to wait until warmer months – depending on the choice in front of you.

    By Amanda Ruggeri and Miriam Quick

    This story is part of a series we’re running on how to ‘hack’ your year. We’ll be looking at the best time of year to get engaged, buy a house, sit an exam, go to hospital and more. Keep checking back through December here for more stories.

    When we’re trying to make a big decision, many of us think (and over-think) about the choice itself. If we’re really analytical, we might also think about our decision-making process: should we write up a list of pros and cons, or make a weighted spreadsheet? Research endlessly, or cut ourselves off from accumulating too much data?

    But as well as thinking about how to make a choice, we may also want to think about when to make it.

    Whether it’s changing careers or buying a house, January always feels like a prime time for a reset – or, at least, to decide on a reset. And many of us are returning from holiday, where free time and conversations with loved ones can make us think about our life choices.

    But is January really the best time to make a big decision?

    Lies, propaganda and fake news: A challenge for our age
    With news sources splintering and falsehoods spreading widely online, can anything be done? Richard Gray takes an in-depth look at how we got here – and hears from the researchers and innovators seeking to save the truth.


    By Richard Gray
    1 March 2017
    Who was the first black president of America? It’s a fairly simple question with a straightforward answer. Or so you would think. But plug the query into a search engine and the facts get a little fuzzy.

    Grand Challenges

    A guide to the issues that define our age

    We may have things better than ever – but we’ve also never faced such world-changing challenges. That’s why Future Now asked 50 experts – scientists, technologists, business leaders and entrepreneurs – to name what they saw as the key challenges in their area.

    The range of different responses demonstrates the richness and complexity of the modern world. Inspired by these responses, over the next month we will be publishing a series of feature articles and videos that take an in-depth look at the biggest challenges we face today.

    When I checked Google, the first result – given special prominence in a box at the top of the page – informed me that the first black president was a man called John Hanson in 1781. Apparently, the US has had seven black presidents, including Thomas Jefferson and Dwight Eisenhower. Other search engines do little better. The top results on Yahoo and Bing pointed me to articles about Hanson as well.

    Welcome to the world of “alternative facts”. It is a bewildering maze of claim and counterclaim, where hoaxes spread with frightening speed on social media and spark angry backlashes from people who take what they read at face value. Controversial, fringe views about US presidents can be thrown centre stage by the power of search engines. It is an environment where the mainstream media is accused of peddling “fake news” by the most powerful man in the world. Voters are seemingly misled by the very politicians they elected and even scientific research - long considered a reliable basis for decisions - is dismissed as having little value.

    For a special series launching this week, BBC Future Now asked a panel of experts about the grand challenges we face in the 21st Century – and many named the breakdown of trusted sources of information as one of the most pressing problems today. In some ways, it’s a challenge that trumps all others. Without a common starting point – a set of facts that people with otherwise different viewpoints can agree on – it will be hard to address any of the problems that the world now faces.

    Having a large number of people in a society who are misinformed is absolutely devastating and extremely difficult to cope with – Stephan Lewandowsky, University of Bristol

    The example at the start of this article may seem a minor, frothy controversy, but there is something greater at stake here. Leading researchers, tech companies and fact-checkers we contacted say the threat posed by the spread of misinformation should not be underestimated.

    Take another example. In the run-up to the US presidential elections last year, a made-up story spread on social media claimed a paedophile ring involving high-profile members of the Democratic Party was operating out of the basement of a pizza restaurant in Washington DC. In early December a man walked into the restaurant - which does not have a basement - and fired an assault rifle. Remarkably, no one was hurt.

    Some warn that “fake news” threatens the democratic process itself. “On page one of any political science textbook it will say that democracy relies on people being informed about the issues so they can have a debate and make a decision,” says Stephan Lewandowsky, a cognitive scientist at the University of Bristol in the UK, who studies the persistence and spread of misinformation. “Having a large number of people in a society who are misinformed and have their own set of facts is absolutely devastating and extremely difficult to cope with.”

    A survey conducted by the Pew Research Center towards the end of last year found that 64% of American adults said made-up news stories were causing confusion about the basic facts of current issues and events.

    Alternative histories

    Working out who to trust and who not to believe has been a facet of human life since our ancestors began living in complex societies. Politics has always bred those who will mislead to get ahead.

    But the difference today is how we get our information. “The internet has made it possible for many voices to be heard that could not make it through the bottleneck that controlled what would be distributed before,” says Paul Resnick, professor of information at the University of Michigan. “Initially, when they saw the prospect of this, many people were excited about this opening up to multiple voices. Now we are seeing some of those voices are saying things we don’t like and there is great concern about how we control the dissemination of things that seem to be untrue.”

    There is great concern about how we control the dissemination of things that seem to be untrue – Paul Resnick, University of Michigan

    We need a new way to decide what is trustworthy. “I think it is going to be not figuring out what to believe but who to believe,” says Resnick. “It is going to come down to the reputations of the sources of the information. They don’t have to be the ones we had in the past.”

    We’re seeing that shift already. The UK’s Daily Mail newspaper has been a trusted source of news for many people for decades. But last month editors of Wikipedia voted to stop using the Daily Mail as a source for information on the basis that it was “generally unreliable”.

    Yet Wikipedia itself - which can be edited by anyone but uses teams of volunteer editors to weed out inaccuracies - is far from perfect. Inaccurate information is a regular feature on the website and requires careful checking for anyone wanting to use it.

    For example, the Wikipedia page for the comedian Ronnie Corbett once stated that during his long career he played a Teletubby in the children’s TV series. This is false but when he died the statement cropped up in some of his obituaries when writers resorted to Wikipedia for help.

    Other than causing offence or embarrassment – and ultimately eroding a news organisation’s standing – these sorts of errors do little long-term harm. There are some who care little for reputation, however. They are simply in it for the money. Last year, links to websites masquerading as reputable sources started appearing on social media sites like Facebook. Stories about the Pope endorsing Donald Trump’s candidacy and Hillary Clinton being indicted for crimes related to her email scandal were shared widely despite being completely made up.

    “The major new challenge in reporting news is the new shape of truth,” says Kevin Kelly, a technology author and co-founder of Wired magazine. “Truth is no longer dictated by authorities, but is networked by peers. For every fact there is a counterfact. All those counterfacts and facts look identical online, which is confusing to most people.”

    For every fact there is a counterfact and all those counterfacts and facts look identical online – Kevin Kelly, co-founder Wired magazine

    For those behind the made-up stories, the ability to share them widely on social media means a slice of the advertising revenue that comes from clicks as people follow the links to their webpages. It was found that many of the stories were coming from a small town in Macedonia, where young people were using them as a get-rich-quick scheme, paying Facebook to promote their posts and reaping the rewards of the huge number of visits to their websites.

    “The difference that social media has made is the scale and the ability to find others who share your world view,” says Will Moy, director of Full Fact, an independent fact-checking organisation based in the UK. “In the past it was harder for relatively fringe opinions to get their views reinforced. If we were chatting around the kitchen table or in the pub, often there would be a debate.”

    But such debates are happening less and less. Information spreads around the world in seconds, with the potential to reach billions of people. But it can also be dismissed with a flick of the finger. What we choose to engage with is self-reinforcing and we get shown more of the same. It results in an exaggerated “echo chamber” effect.

    People are quicker to assume they are being lied to but less quick to assume people they agree with are lying, which is a dangerous tendency – Will Moy, director of Full Fact

    “What is noticeable about the two recent referendums in the UK - Scottish independence and EU membership - is that people seem to be clubbing together with people they agreed with and all making one another angrier,” says Moy. “The debate becomes more partisan, more angry and people are quicker to assume they are being lied to but less quick to assume people they agree with are lying. That is a dangerous tendency.”

    The challenge here is how to burst these bubbles. One approach that has been tried is to challenge facts and claims when they appear on social media. Organisations like Full Fact, for example, look at persistent claims made by politicians or in the media, and try to correct them. (The BBC also has its own fact-checking unit, called Reality Check.)

    Research by Resnick suggests this approach may not be working on social media, however. He has been building software that can automatically track rumours on Twitter, dividing people into those that spread misinformation and those that correct it. “For the rumours we looked at, the number of followers of people who tweeted the rumour was much larger than the number of followers of those who corrected it,” he says. “The audiences were also largely disjointed. Even when a correction reached a lot of people and a rumour reached a lot of people, they were usually not the same people. The problem is, corrections do not spread very well.”

    The problem is that corrections do not spread very well – Paul Resnick, University of Michigan

    One example of this that Resnick and his team found was a mistake that appeared in a leaked draft of a World Health Organisation report that stated many people in Greece who had HIV had infected themselves in an attempt to get welfare benefits. The WHO put out a correction, but even so, the initial mistake reached far more people than the correction did. Another rumour suggested the rapper Jay Z had died and reached 900,000 people on Twitter. Around half that number were exposed to the correction. But only a tiny proportion were exposed to both the rumour and correction.

    This lack of overlap is a specific challenge when it comes to political issues. Moy fears the traditional watchdogs and safeguards put in place to ensure those in power are honest are being circumvented by social media.

    “On Facebook political bodies can put something out, pay for advertising, put it in front of millions of people, yet it is hard for those not being targeted to know they have done that,” says Moy. “They can target people based on how old they are, where they live, what skin colour they have, what gender they are. We shouldn’t think of social media as just peer-to-peer communication - it is also the most powerful advertising platform there has ever been.”

    We shouldn’t think of social media as just peer-to-peer communication, it is also the most powerful advertising platform there has ever been – Will Moy, director of Full Fact

    But it may count for little. “We have never had a time when it has been so easy to advertise to millions of people and not have the other millions of us notice,” he says.

    Twitter and Facebook both insist they have strict rules on what can be advertised, particularly political advertising. Regardless, the use of social media adverts in politics can have a major impact. During the run-up to the EU referendum, the Vote Leave campaign paid for nearly a billion targeted digital adverts, mostly on Facebook, according to one of its campaign managers. One of those was the claim that the UK pays £350m a week to the EU – a figure Sir Andrew Dilnot, the chair of the UK Statistics Authority, described as misleading. In fact the UK pays around £276m a week to the EU because of a rebate.

    “We need some transparency about who is using social media advertising when they are in election campaigns and referendum campaigns,” says Moy. “We need to be more equipped to deal with this - we need watchdogs that will go around and say, ‘Hang on, this doesn’t stack up’ and ask for the record to be corrected.”

    Social media sites themselves are already taking steps. Mark Zuckerberg, founder of Facebook, recently spelled out his concerns about the spread of hoaxes, misinformation and polarisation on social media in a 6,000-word letter he posted online. In it he said Facebook would work to reduce sensationalism in its news feed on its site by looking at whether people have read content before sharing it. It has also updated its advertising policies to reduce spam sites that profit off fake stories, and added tools to let users flag fake articles.

    Other tech giants also claim to be taking the problem seriously. Apple’s Tim Cook recently raised concerns about fake news, and Google says it is working on ways to improve its algorithms so they take accuracy into account when displaying search results. “Judging which pages on the web best answer a query is a challenging problem and we don’t always get it right,” says Peter Barron, vice president of communications for Europe, Middle East and Asia at Google.

    “When non-authoritative information ranks too high in our search results, we develop scalable, automated approaches to fix the problems, rather than manually removing these one by one. We recently made improvements to our algorithm that will help surface more high quality, credible content on the web. We’ll continue to change our algorithms over time in order to tackle these challenges.”

    Judging which pages on the web best answer a query is a challenging problem and we don’t always get it right – Peter Barron, Google

    For Rohit Chandra, vice president of engineering at Yahoo, more humans in the loop would help. “I see a need in the market to develop standards,” he says. “We can’t fact-check every story, but there must be enough eyes on the content that we know the quality bar stays high.”

    Google is also helping fact-checking organisations like Full Fact, which is developing new technologies that can identify and even correct false claims. Full Fact is creating an automated fact-checker that will monitor claims made on TV, in newspapers, in parliament or on the internet.

    Initially it will be targeting claims that have already been fact-checked by humans and send out corrections automatically in an attempt to shut down rumours before they get started. As artificial intelligence gets smarter, the system will also do some fact-checking of its own.

    “For a claim like ‘crime is rising’, it is relatively easy for a computer to check,” says Moy. “We know where to get the crime figures and we can write an algorithm that can make a judgement about whether crime is rising. We did a demonstration project last summer to prove we can automate the checking of claims like that. The challenge is going to be writing tools that can check specific types of claims, but over time it will become more powerful.”
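    A toy version of the kind of check Moy describes might look like this. The figures and the noise threshold below are invented for illustration; a real system like Full Fact’s would draw on official statistics and far more careful trend analysis.

    ```python
    def is_rising(series, tolerance=0.0):
        """Return True if the most recent value in a yearly series
        exceeds the previous one by more than `tolerance` (a crude
        guard against reading statistical noise as a trend)."""
        return series[-1] - series[-2] > tolerance

    # Hypothetical recorded offences, millions per year, oldest first.
    recorded_offences = [4.5, 4.3, 4.1, 4.4]
    print(is_rising(recorded_offences))  # True
    ```

    Even this trivial sketch shows why the claim type matters: “crime is rising” reduces to fetching a known data series and comparing numbers, whereas vaguer claims require the harder language-understanding work the article goes on to discuss.
    
    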

    What would Watson do?

    It is an approach being attempted by a number of different groups around the world. Researchers at the University of Mississippi and at Indiana University are each working on automated fact-checking systems. One of the world’s most advanced AIs has also had a crack at tackling this problem. IBM has spent several years working on ways that its Watson AI could help internet users distinguish fact from fiction. They built a fact-checker app that could sit in a browser and use Watson’s language skills to scan the page and give a percentage likelihood of whether it was true. But according to Ben Fletcher, senior software engineer at IBM Watson Research who built the system, it was unsuccessful in tests – but not because it couldn’t spot a lie.

    “We got a lot of feedback that people did not want to be told what was true or not,” he says. “At the heart of what they want, was actually the ability to see all sides and make the decision for themselves. A major issue most people face without knowing it is the bubble they live in. If they were shown views outside that bubble they would be much more open to talking about them.”

    We got a lot of feedback that people did not want to be told what was true or not – Ben Fletcher, IBM Watson Research

    This idea of helping break through the isolated information bubbles that many of us now live in comes up again and again. By presenting people with accurate facts it should be possible to at least get a debate going. But telling people what is true and what is not does not seem to work. For this reason, IBM shelved its plans for a fact-checker.

    “There is a large proportion of the population in the US living in what we would regard as an alternative reality,” says Lewandowsky. “They share things with each other that are completely false. Any attempt to break through these bubbles is fraught with difficulty as you are being dismissed as being part of a conspiracy simply for trying to correct what people believe. It is why you have Republicans and Democrats disagreeing over something as fundamental as how many people appear in a photograph.”

    One approach Lewandowsky suggests is to make search engines that offer up information that may subtly conflict with a user’s world view. Similarly, firms like Amazon could offer up films and books that provide an alternative viewpoint to the products a person normally buys.

    There is a large proportion of the population living in what we would regard as an alternative reality – Stephan Lewandowsky, University of Bristol

    “By suggesting things to people that are outside their comfort zone but not so far outside they would never look at it you can keep people from self-radicalising in these bubbles,” says Lewandowsky. “That sort of technological solution is one good way forward. I think we have to work on that.”

    Google is already doing this to some degree. It operates a little-known grant scheme that allows certain NGOs to place high-ranking adverts in response to certain searches. It is used by groups like the Samaritans so their pages rank highly in a search by someone looking for information about suicide, for example. But Google says anti-radicalisation charities could also seek to promote their message on searches about so-called Islamic State.

    But there are understandable fears about powerful internet companies filtering what people see - even within these organisations themselves. For those leading the push to fact-check information, better tagging of accurate information online would be a better approach by allowing people to make up their own minds about the information.

    Search algorithms are as flawed as the people who develop them – Alexios Mantzarlis, director of the International Fact-Checking Network

    “Search algorithms are as flawed as the people who develop them,” says Alexios Mantzarlis, director of the International Fact-Checking Network. “We should think about adding layers of credibility to sources. We need to tag and structure quality content in effective ways.”

    Mantzarlis believes part of the solution will be providing people with the resources to fact-check information for themselves. He is planning to develop a database of sources that professional fact-checkers use and intends to make it freely available.

    But what if people don’t agree with official sources of information at all? This is a problem that governments around the world are facing as the public views what they tell them with increasing scepticism.

    Nesta, a UK-based charity that supports innovation, has been looking at some of the challenges that face democracy in the digital era and how the internet can be harnessed to get people more engaged. Eddie Copeland, director of government innovation at Nesta, points to an example in Taiwan where members of the public can propose ideas and help formulate them into legislation. “The first stage in that is crowdsourcing facts,” he says. “So before you have a debate, you come up with the commonly accepted facts that people can debate from.”

    When people say they are worried about people being misled, what they are really worried about is other people being misled – Paul Resnick, University of Michigan

    But that means facing up to our own bad habits. “There is an unwillingness to bend one’s mind around facts that don’t agree with one’s own viewpoint,” says Victoria Rubin, director of the language and information technology research lab at Western University in Ontario, Canada. She and her team have been working to identify fake news on the internet since 2015. Will Moy agrees. He argues that by slipping into lazy cynicism about what we are being told, we allow those who lie to us to get away with it. Instead, he thinks we should be interrogating what they say and holding them to account.

    Ultimately, however, there’s an uncomfortable truth we all need to address. “When people say they are worried about people being misled, what they are really worried about is other people being misled,” says Resnick. “Very rarely do they worry that fundamental things they believe themselves may be wrong.” Technology may help to solve this grand challenge of our age, but it is time for a little more self-awareness too.

    You might also like:
    • Should you trust your gut feelings?
    • Why contemplating death changes how you think
    • How good are you at thinking about uncertainty?

    When is the best time of year to make a big decision? The answer depends on our mood.

    Many of us find that we feel a little lower in winter. For some people, it can be extreme. Seasonal affective disorder (SAD), marked by depressive episodes in the winter months, is especially common at northern latitudes. One review found that almost 10% of people living in northern regions, including North America, are affected by the disorder, while a recent study in Switzerland that followed participants for more than 20 years found that 7.5% of the population experienced seasonal depression.

    Symptoms can also last longer than you might expect: one study found that in the US, those affected by SAD struggle with symptoms for an average of 40% of the year.

    But even those who don’t meet the diagnostic criteria for SAD often feel that their mood is lower in the winter. Back in the 1980s, a telephone survey of Maryland residents found 92% of people noticed seasonal mood changes to some degree – mainly that their mood became lower in winter.

    Your mood doesn’t just affect how you feel. It can affect your decision-making abilities. But to make matters more complex, having a low mood doesn’t mean you’ll always be worse at making a choice.

    Reward risk

    A depressed mood tends to make us more risk-averse. Researchers think this may stem from a curtailed ability to experience pleasure, meaning a depressed person doesn’t have the same potent (and optimistic) emotional response to the possibility of a gain or a reward as a non-depressed person.

    When given a card-playing task designed to assess risk-taking, for example, depressed participants had a harder time remembering which options were more likely to yield rewards, making them worse at the game than non-depressed participants. Participants with depressive symptoms also were more conservative in their risk-taking than non-depressed participants – sticking with safe choices that had low chances of reward, instead of adopting higher-risk strategies with potentially larger payoffs.

    These are laboratory studies, but there is some good evidence that the same effects play out in the real world. People with SAD are more likely to be conservative in their financial decisions in the winter than people who don’t have SAD, for instance.

    And when it comes to making decisions, being more risk-averse isn’t always a bad thing.

    This is especially true because most healthy individuals have the opposite problem: ‘optimism bias’. Most of us believe we’re less likely to experience a negative event (like getting cancer or being in a car crash) than the statistics would warrant, and that our future will be rosier (whether in terms of getting more job offers or having a great holiday) than actually turns out to be the case. We also tend to think that we’re more in control than we really are – particularly if we’re involved in the event ourselves.

    As you might expect, depressed people, who have a more pessimistic view of the world, don’t fall into this trap. This ‘depressive realism’ means they are better at accurately assessing time intervals and at predicting how other people's decisions will affect them than their more optimistic peers. They also learn to avoid risky responses faster than non-depressed people.

    But that doesn’t mean they’re accurate with forecasting in general – depressed people are worse than healthy people at predicting football World Cup match results, for example.

    There is another twist, too. Optimists may see the future with rose-coloured glasses – but they’re also better at making that future come true. Greater optimism is associated with more career success, better relationships and better health. Long-running studies also have found that the effect seems to go beyond correlation (‘I’m optimistic because I’m in good health’) and perhaps be causation (‘My optimism helps me have good health’). One study, for example, looked at 97,000 women, all of whom had no cancer or cardiovascular disease when the study began. Eight years later, the optimists were less likely than pessimists to have developed coronary heart disease or have died of any cause.

    And if you’re struggling to make a life choice, it may also be worth waiting until longer days bring a lighter mood: depressive symptoms can interfere with the decision-making process to the point that it is harder to make any decision at all, with people with depression feeling more conflicted and indecisive than non-depressed people.

    So the relationship between mood and decision-making is not a simple one – which means that if you’re considering when to make a big decision, you may want to think about what kind it is. Does it involve potentially catastrophic losses – something that may require caution and a realistic outlook? Then winter may be better. Or is it a decision where there’s everything to play for, if you can accept a certain amount of uncertainty about the outcome? Then perhaps you should take advantage of your more elevated mood in summer.

    And if you feel stymied from making a choice at all, you might want to wait a bit until sunshine returns. Who knows – it may help clear up not only your mood, but your indecision.
    How to make wiser judgements about the future

    Forecasting the future is far from easy, but there is a way to hone your skills no matter who you are.
    By David Robson
    Will the UK leave the EU within 2019? Will the US House of Representatives impeach Trump this year? Will a man or a woman win the Democratic primary?

    How you answer these questions is not simply a matter of intelligence or education. A four-year forecasting tournament, the Good Judgment Project, tested thousands of people’s abilities to predict world events. The top performers were educated, for sure, but their performance also relied on many styles of thinking that are not measured in traditional academic tests. They also represented a mix of genders, ages and backgrounds.

    Take the example of Elaine Rich, a pharmacist from a Maryland suburb, who signed up in her 60s and quickly excelled in making the tournament’s probability-based predictions. As she told NPR, world affairs had never been her forte and she hadn’t studied maths in college. But with a bit of training, she ended up performing in the top 1% of forecasters.

    As the project leader, Philip Tetlock, wrote in his book Superforecasting: “A brilliant puzzle-solver may have the raw material for forecasting, but if he doesn’t also have an appetite for questioning basic, emotionally charged beliefs, he will often be at a disadvantage relative to a less intelligent person who has a greater capacity for self-critical thinking.”

    If you fancy testing your own forecasting skills, BBC Future has teamed up with Good Judgment and the UK innovation foundation Nesta to launch our own forecasting challenge that is open to anyone to join.

    Sign up for the You Predict The Future challenge
    You may find that you already have an amazing talent for predicting the future – but don’t worry if you initially underperform. With a little practice, anyone can improve their prediction abilities on all kinds of issues, and in time you may see other benefits too, such as a boost in open-mindedness. If you would like to learn more about the science of making wiser judgements before joining the challenge, read on...

    The question of what constitutes good judgement has troubled philosophers for millennia, along with the suspicion that raw brainpower alone is not enough for truly wise insights.

    Indeed, according to Rene Descartes, it may even push us to greater error. “The greatest minds are capable of the greatest vices as well as the greatest virtues,” he wrote in the 17th Century. “Those who go forward but very slowly can get further, if they always follow the right road, than those who are in too much of a hurry and stray off it.”

    As I write in my book The Intelligence Trap, cutting-edge psychological research now offers some very precise accounts of how greater brainpower can drive us down the wrong track.

    When thinking about politics, your brilliant brain can become a tool for propaganda rather than truth-seeking

    Consider the phenomenon of “motivated reasoning”. When we think about emotionally charged issues – particularly those associated with our identities – we don’t apply our thinking in an even-handed way to appraise the evidence at hand. Instead, we may simply use it to rationalise our existing viewpoint and demolish any arguments that contradict those views. And the more intelligent or knowledgeable you are, the easier it may be to build elaborate arguments that support your point of view. Your brilliant brain becomes a tool for propaganda rather than truth-seeking.

    We can see this most clearly with views on global warming. Among Democrats, more numerate and scientifically literate people are more likely to endorse the view that human emissions are causing climate change; but among Republicans, the exact opposite is true. The more numerate and scientifically literate participants are more likely to deny the effect of human emissions on the environment.

    Researchers have now observed similar patterns on many other issues – from gun control to fracking and stem cell research – where greater knowledge seems to increase political polarisation. If you’re trying to predict the outcome of something emotionally charged like Brexit, motivated reasoning is almost certainly skewing your thinking.


    Predicting the future is impossible – but signing up and participating will hone your skills and could make you a better forecaster than your peers.

    “When you’re part of the challenge, you’ll get feedback on how accurate your forecasts are. You’ll be able to see how well you do compared to other forecasters. And there’s a leader board, which shows who the best performing forecasters are,” says Kathy Peach, who leads Nesta’s Centre for Collective Intelligence Design. It’s also possible there will be unexpected upsides. “New research shows that forecasting increases open-mindedness, the ability to consider alternative scenarios and reduces political polarisation,” adds Peach.
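    The accuracy feedback Peach describes requires a scoring rule. The article doesn’t name the one the challenge uses, but a standard choice for probability forecasts – and the one Tetlock’s tournaments are known to have used – is the Brier score, sketched here for illustration:

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between probability forecasts (0..1)
    and binary outcomes (1 = happened, 0 = didn't).
    Lower is better; 0 is a perfect record."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# A confident forecaster who is usually right beats one who
# always hedges at 50%:
confident = brier_score([0.9, 0.1, 0.8], [1, 0, 1])  # ~0.02
hedger = brier_score([0.5, 0.5, 0.5], [1, 0, 1])     # 0.25
```

    The score punishes overconfidence on questions you get wrong, which is why well-calibrated humility – not bravado – wins these tournaments.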

    So, what are you waiting for? Sign up for the You Predict The Future challenge now, and take the first step to becoming a better, wiser, and more even-handed forecaster.

    Besides motivated reasoning, smart people may also suffer from “earned dogmatism” – in which your perception of your own expertise causes you to become more closed-minded. If you have a politics degree, for instance, you may tend to ignore new evidence that contradicts your preconceptions – because you feel that you know everything there is to know already.

    Not every smart person falls for these traps, of course. It depends on whether your intelligence is complemented by some other traits that ensure you use it wisely.

    Take curiosity. Both questionnaires and behavioural measures confirm that some people are naturally more curious than others – for them, learning new facts is a reward in its own right (and even creates a dopamine kick in the brain). Do you actively seek new knowledge? Would you get the itch to read a newspaper article on a favourite subject, even if it threatens to challenge your assumptions? People with this kind of curiosity appear to be less likely to allow their views to be skewed by their political affiliations, since their hunger for new knowledge trumps any dogmatic tendencies. But there are many intelligent people who do not have this hunger for knowledge for its own sake.

    You might also enjoy:

    Could you be a super-forecaster?
    The ordinary people who can predict world events
    The best way to predict the future
    Psychologists are also interested in the protective potential of “intellectual humility” – essentially, how easily you can admit that you are wrong. Once again, this can be measured by questionnaires. People with greater intellectual humility are less likely to hold polarised, dogmatic views. Instead, they appraise the evidence at hand and listen to alternative points of view.

    In the initial Good Judgment Project, the super-forecasters with the most accurate insights showed just this kind of thinking – they checked their confidence, and were quick to admit their mistakes and update their opinions when new evidence emerged. People who were most prone to rigid closed-minded thinking, dogmatism, and arrogance, in contrast, performed much worse – no matter what their IQ and academic credentials.
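    Updating an opinion as new evidence emerges has a textbook counterpart in Bayes’ rule. The article doesn’t claim super-forecasters apply the formula explicitly – many simply reason this way informally – so the following is purely an illustrative sketch:

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Revised probability of a hypothesis after one piece of
    evidence, via Bayes' rule. All arguments are probabilities."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# Start at 60% confidence. The new evidence is twice as likely
# if the hypothesis is true (80% vs 40%), so confidence should
# rise -- but only to 75%, not to certainty.
posterior = bayes_update(0.6, 0.8, 0.4)  # 0.75
```

    The point of the exercise is the habit it encodes: evidence shifts your probability by a measured amount, rather than flipping you between certainty and denial.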

    Fortunately, our minds are malleable: anyone can learn to avoid blinkered reasoning. One technique is the “consider the opposite” strategy, in which you deliberately occupy the alternative stance and argue against your initial intuitions, a practice that has been shown to reduce a range of self-serving biases. (You can read more about that here.)

    There is also evidence that learning about logical fallacies and common thinking errors can help you to think more rationally about the news you consume, so that you appraise information based on the quality of the arguments – rather than whether it simply aligns with your existing beliefs. And a very basic refresher in ways to calculate risk and uncertainty can help improve your overall decision making on a range of issues – such as health, finance, and politics.
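    As a minimal example of the kind of risk calculation such a refresher might cover (the article doesn’t specify one), the expected value of a choice weights each possible payoff by its probability – the baseline against which risk-averse and risk-seeking decisions can be compared:

```python
def expected_value(outcomes):
    """Probability-weighted average payoff of a risky choice.
    `outcomes` is a list of (probability, payoff) pairs."""
    return sum(p * payoff for p, payoff in outcomes)

# A sure 40 versus a coin-flip for 100: the gamble has the
# higher expected value, even though it feels riskier.
safe = expected_value([(1.0, 40)])               # 40.0
gamble = expected_value([(0.5, 100), (0.5, 0)])  # 50.0
```

    Whether taking the gamble is wise still depends on how much you can afford to lose – which is exactly the kind of judgement the seasonal-mood research above bears on.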

    Good thinking can be learnt – and our new forecasting tournament might be a perfect opportunity to sharpen your judgements. The practice of forecasting forces you to turn your intuitions and beliefs into testable hypotheses that can be proven or disproven. As a result, you will learn to overcome your motivated reasoning and earned dogmatism – if you suffer from them – and you’ll develop a better understanding of risk and uncertainty.

    There’s evidence suggesting participants in forecasting tournaments can improve their accuracy with training. And what’s more, compared to a control group, the participants in a forecasting competition went on to show greater intellectual humility and reduced polarisation on a range of issues – including those that had not been discussed in the tournament itself. They had learnt to question their assumptions and to look beyond the blinkers of their ideology.

    So why not sign up to our project and see how you fare? Whether you wish to be a super-forecaster or you simply want to develop a more rational understanding of the world around you, you might be surprised by what you find.
