TIP146: SUPERFORECASTING

THE ART AND SCIENCE OF PREDICTION

9 July 2017

During Preston and Stig’s interview with legendary investor Edward Thorp, he recommended that the hosts read the book Superforecasting by Philip Tetlock and Dan Gardner. Based on that recommendation, we have provided a chapter-by-chapter overview in this podcast on the art and science of prediction.

According to The Wall Street Journal, Superforecasting is “the most important book on decision making since Daniel Kahneman’s Thinking, Fast and Slow.” The Harvard Business Review paired it with Jordan Ellenberg’s book How Not to Be Wrong: The Power of Mathematical Thinking. Regardless of other people’s opinions, we found the book quite interesting and useful for value investors. It does a great job of teaching the reader the importance of understanding cognitive biases, and of trying to use a balanced state of mind when assessing how things might behave in the future.

SUBSCRIBE

Subscribe through iTunes
Subscribe through Castbox
Subscribe through Spotify
Subscribe through Youtube

IN THIS EPISODE, YOU’LL LEARN:

  • The right framework for making correct forecasts
  • How to prevent confirmation bias for your investments
  • Why financial experts are storytellers rather than forecasters
  • How and why the best investors constantly change their opinion

TRANSCRIPT

Disclaimer: The transcript that follows has been generated using artificial intelligence. We strive to be as accurate as possible, but minor errors and slightly off timestamps may be present due to platform differences.

Preston Pysh  0:03  

So a few months ago, we had a huge name in finance on our show. And the gentleman’s name is Ed Thorp. Ed Thorp’s personal net worth is around $900 million. During our discussion, I was talking to Ed Thorp about certain ideas about what might happen in the future, particularly about central banking. I asked Ed a really difficult question, and I was kind of expecting him to tell me, “Yeah, I think there’s a high probability that that might be the case with respect to central banks.”

What I got back from him was kind of an interesting response. Ed basically said to me, “I have no idea.” He responded so quickly without any hesitation that it just shocked me. 

During that interview, Ed said, “You really got to read this book called, ‘Superforecasting’ by Philip Tetlock and Dan Gardner.” Because of that interview, and because of that moment that I experienced with Ed Thorp, and how he responded to this question about forecasting, and then, he recommended this book for us to read, that’s why we decided to do this episode today on “Superforecasting.”

Stig Brodersen  1:12  

And everyone really forecasts, whether it’s about the weather, beating their morning traffic, or the financial markets. While forecasting might appear to be a game, it is in fact very real, and the stakes are high and substantial. 

As a society, it’s important that we hone the skill of forecasting, because countries are starting to embrace evidence-based policies, which basically means that we are trying to forecast. It’s also true on the personal level, because the ability to forecast is the difference between success and failure in life and business. 

So, in this episode, we’re investigating why some people forecast better than others. And we’ll teach you the techniques to think rationally about your own predictions.

Preston Pysh  1:52  

Alright guys, so if you’re ready, we’re ready. Let’s go ahead and do this.

Intro  2:00  

You’re listening to The Investor’s Podcast, where we study the financial markets and read the books that influence self-made billionaires the most. We keep you informed and prepared for the unexpected.

Preston Pysh  2:21  

Okay, so let’s get this episode going here, Stig. As we said in the intro, we’re talking about the book, “Superforecasting” by Philip Tetlock and Dan Gardner. And I really liked this book. I’m just gonna throw that out there. I thoroughly enjoyed some of the discussion here. 

Because when you’re talking about stock investing, or any type of investing, it all comes down to what you kind of expect the future to look like and what you’re kind of estimating those future cash flows to be and you’re discounting those back to which you think your return might be. 

And so, this was such a relevant book for us, and the writing was really good. It was easy to understand; it wasn’t a difficult read. But in general, I really liked it. I’m just trying to capture your thoughts here before we start plowing through this chapter by chapter.

Stig Brodersen  3:07  

I think it’s very important for people to realize that we forecast all the time. As stock investors, we automatically think about stock investing. But you’re also forecasting whenever you leave in the morning: can I beat the traffic? I mean, it happens all the time. And how do you come up with these conclusions? And how do you forecast best? That’s really the interesting thing about this book. 


Preston Pysh  3:26  

All right, so let’s just go ahead and jump into this. So chapter one is titled, “An Optimistic Skeptic”. This chapter was pretty generic to start off the book. And what it’s really getting at is, when we think about forecasting into the future, a lot of people might have the cop-out statement that you can’t predict the future, it’s impossible. And that’s not true either. 

So the example that Tetlock and Gardner use in the book is that if you look at the forecast of what the weather might be tomorrow, there’s a pretty high probability that it’s going to be really close to the truth, with respect to the temperature and whether it’s going to be sunny or rainy. 

But where it gets more difficult is when you start stepping to a week into the future, and it becomes more fuzzy. What they’re really trying to get at is this idea of an array of possibilities, and this is something that we’ve talked about on the show numerous times in the past. 

When you think about where you’re at right now in time, you know what’s happening around you. If you had to forecast what’s going to happen in the next minute, you have a pretty good idea. If you had to forecast an hour out, the possibilities of what could happen start opening up; your left and right limits of potential outcomes widen. And when you push that out further and further, call it a year, that’s when it starts getting very difficult, depending on what you’re talking about. 

And so, that’s what they’re really talking about in the first chapter is opening up this idea that forecasting can be done. It’s just the difficulty and the probability of that changes as you extend that timeline into the future.

Stig Brodersen  4:57  

And the author has a really interesting discussion about how we don’t usually check up on a forecaster’s track record. So whenever you hear people in the news talking about what they think will happen, nobody talks about whether their forecasts have been accurate in the past. And he relates that to a sports team: would you ever acquire a player if his stats weren’t good, or if he couldn’t prove that he’s actually capable of carrying out that task? 

Preston Pysh  5:21  

So something else that I want to highlight is the whole name of the book, “Superforecasting” is this idea that both of the authors conducted this experiment. They were working with the government on this idea, are there people out there that are better at conducting forecasts of the future than the average person? 

And what they found out is that it’s a true statement: there is a superforecasting group of people in the world who are good at this. They statistically proved that these people can outperform the typical person when making predictive analyses. 

And a perfect example would be the whole North Korea thing that’s happening in the world right now at the end of the first quarter of 2017. There’s a lot of talk about, is the United States or China or anybody going to go into North Korea and do something. So, that would be an example of an event that they would have superforecasters versus regular forecasters trying to predict whether that’s actually going to happen or not. 

Through their research, they proved that superforecasters exist. And so in this book, what they’re doing is outlining and trying to understand what separates those people from normal forecasters. Why are they able to make better predictions than the typical person? And so, he goes chapter by chapter talking about some of these dynamics of how they’re able to do it better than the typical person.

Stig Brodersen  6:43  

And the interesting thing is that the intelligence agencies are actually really interested in this, because what they prove in this book is that the best forecasters are a lot more efficient than the agencies, which shouldn’t make any kind of sense, because there are thousands upon thousands of highly skilled people trying to predict, or forecast if you will, what’s going to happen in the future. So, why is it that a small handful of people are doing so much better? What can we learn from them that can be implemented in our intelligence services? That’s also one of the reasons why he wrote this book.

Preston Pysh  7:13  

And a really neat highlight is that some of these people that are considered superforecasters are like, one gentleman was a farmer out in the middle of the US who had no government ties, and like, some of the backgrounds of these people are just quite phenomenal. And you’re wondering, how are they different? And how are they able to do this without the background of maybe a trained professional that has a niche in a specific area? And so the authors talk about why that exists. So we’re going to get into some of that. 

Let’s jump into Chapter Two, and the title of this chapter is, “Illusions of Knowledge.” And the premise of this chapter is pretty simple. What he’s saying is that a person who’s very knowledgeable in a specific area, they sometimes, not all the time, but sometimes have a bias towards what it is that they actually know versus don’t know. 

He uses an example of a doctor who’s providing a recommendation for medicine that a person should take. And the person who’s receiving this diagnosis and the prescription that’s associated with that, they just take it at face value because the person has the assumption, they’re just like, well, this person has to be right, they’re a doctor or they have to be 100% right, is the mindset of a lot of people. 

But what the authors talk about is that that’s actually pretty far from the truth. In fact, a lot of doctors misdiagnose different things, and they get it wrong. And there’s this culture in medicine, or at least there was in the past.

I think they’ve made a lot of changes more recently in the last 10 years, but some of this still persists in the culture where people don’t question: “Are you making the right decision? Where are we making a mistake? What is wrong about my analysis, when I’m thinking through this,” and he uses the medical community to kind of highlight this “illusion of knowledge.” [It’s] the way that they describe it in the book.

Stig Brodersen  9:07  

Basically, we’re talking about confirmation bias, a type of bias that we had talked about many times before, where you were always looking to find a reason why you’re right and all the people are wrong. I definitely know that for myself. Whenever I read something in the news about the stock market being incorrectly priced, or even undervalued sometimes, immediately, I don’t want to read it. 

And if I do read it, I’m having this mindset of, now I’m really going to try to disprove this guy. And that’s even before I’ve started reading the article. What he’s saying is that if you have an open mindset, which is something that’s really *inaudible* from a superforecaster, and you’re not saying, “I know what the truth is,” but rather, “I’m seeking the truth,” then you’ll be a lot more successful. So I try to learn from that.

Preston Pysh  9:52  

And for the person who’s hearing this and thinking, well, how can I prevent that from happening? I think it’s really simple. 

Let’s just use the stock market, for example. My personal opinion is that it’s highly priced and that there’s a lot of risk in owning it right now. If that’s truly my opinion, the articles I should be reading are articles that support the stock market going higher. And people who think that basically has more to run and their reasonings for why it has more to run. I should be reading all those kinds of articles to counter my opinion, and to basically remove that bias. 

Now, am I good at that? No, I’m not good at that. And I think part of the first step is admitting that. But I think that that is a really, really important highlight and something that you take away from this book that a lot of people definitely don’t do, because the majority of people, they have an opinion, and then they go and search for that on Google and what are they going to find, they’re going to find other people that have similar opinions. 

Then, they’re going to read that, it’s going to solidify that opinion. And they’re going to get even more hardened into that opinion instead of trying to troubleshoot it and find maybe the array of potential outcomes that support the direction of forecast go in a different direction than what they expect.

Stig Brodersen  11:12  

So, there’s a short really neat story about this in the book and like one of the superforecasters have done is to program an algorithm, so that in his newsfeed, he will get like a random selection of news articles, and he can see where it comes from, so he won’t be biased. 

Now, I have no clue how you’d program anything like that, but I just love that story. It really tells you that it’s your mindset and your approach to forecasting that makes the difference.

Preston Pysh  11:37  

It just shows you how unbiased these superforecasters are. They know they’re influenced by this. And so they’ve done everything that they can to remove these cognitive biases. They’re experts in cognitive biases at the end of the day, that’s what I really took away from the book. 

They understand these things and they’ve designed the way that they receive information to prevent those biases from impacting them, and that is such an important takeaway from the book in total.

Stig Brodersen  12:03  

Alright, let’s move on to the third chapter, and that’s called, “Keeping Score”. What the author talks about in this chapter is that he had found that the more confident experts are, the more wrong they are also. I kind of like that analysis. What he’s saying is that the most famous forecasters, basically the experts you see on TV, their skill is not forecasting, but they’re really good at telling a clear narrative, and they have this very confident attitude whenever they’re telling that narrative. It really reminds me of the interview we had with Guy Spier back in Episode 14, where he’s saying that he doesn’t want to give the narrative of a stock whenever he bought it, because he doesn’t want to be too attached to it. 

He wants to be able to sell it whenever he needs to, and he wouldn’t like to come up as a flip over or anything like that, if he decides to sell that the next day. So, he wants to be as detached as he possibly can.
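As a quick aside for readers following along with the book: the grading rule Tetlock’s forecasting tournaments actually used for “keeping score” is the Brier score. Here is a minimal sketch in Python; the two sample track records are invented purely for illustration.

```python
def brier_score(forecast, outcome):
    # Squared error between a probability forecast (0.0 to 1.0) and the
    # outcome (1 if the event happened, 0 if it did not).
    # 0.0 is a perfect call, 0.25 is a permanent 50/50 hedge, and 1.0 is
    # total confidence in the wrong answer.
    return (forecast - outcome) ** 2

def track_record(history):
    # Average Brier score over (forecast, outcome) pairs -- the "batting
    # average" the hosts wish TV pundits were graded on.
    return sum(brier_score(f, o) for f, o in history) / len(history)

confident_pundit = [(0.95, 0), (0.90, 1), (0.99, 0)]    # bold, often wrong
careful_forecaster = [(0.60, 1), (0.35, 0), (0.70, 1)]  # modest, calibrated
print(round(track_record(confident_pundit), 3))    # 0.631
print(round(track_record(careful_forecaster), 3))  # 0.124
```

The lower-is-better score captures exactly what Stig describes: it rewards being both bold and right, and a pundit who is confidently wrong scores worse than someone who simply hedged at 50/50.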

Preston Pysh  12:56  

All right, so jump into chapter four. This one’s titled “Superforecasters.” We briefly mentioned the study and all that earlier. But the thing that I think is worth highlighting here in chapter four is this idea that the superforecasters are people who question everything. And they basically design a roadmap for a particular event. 

So, think of it like this. We’re going back to the North Korea example. The question comes down to: is there going to be some type of event of war or something like that in North Korea? That would be the question, whether that’s true or false. And the way these superforecasters go through the thought process is, they start breaking the question down into subcomponents, and then they assign probabilities to those different subcomponents. 

For that example, is there going to be a war in North Korea? What they would do is say, well, let’s look at it from a political landscape. What probability do we assign to the political landscape? What do we think the probability is that North Korea could potentially do something to set it off? What would be the implications for China? 

They would dissect the entire array of potential reasons for how that could eventually happen, and then the corresponding probabilities for each one of those particular events playing out. The analysis would be done from a pro and a con standpoint of why it could or could not happen. 

So they’re very analytical and very organized in their thinking, which I think is very different from the way the typical person approaches complex and difficult problems like that, because the normal person would, like Stig was saying earlier, they will latch on to a narrative. 

They’ll maybe hear a friend or somebody else who says, “Well, China in the past has always done this, so that’s why there is not going to be a war in North Korea.” And that’s the end of their analysis. That’s the end of their thought process of how they broke it down. 

Whereas the superforecaster is going so much deeper and so much more involved in the way that they’re processing all the variables at play for a particular event. And that is so important when you’re trying to think through a complex problem in forecasting.
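To make that decomposition habit concrete, here is a toy Python sketch of that style of analysis. The scenario paths and every number in them are made up for illustration; the book describes the habit of breaking a question into subcomponents with probabilities, not this particular arithmetic.

```python
# Hypothetical, mutually exclusive paths that could lead to the event,
# each with: (chance the path occurs, chance it leads to conflict if it does).
paths = {
    "deliberate provocation escalates": (0.10, 0.40),
    "miscalculation at the border": (0.05, 0.50),
    "external intervention": (0.03, 0.60),
}

# Law of total probability over the disjoint scenarios:
# P(event) = sum over paths of P(path) * P(event | path).
p_event = sum(p_path * p_given for p_path, p_given in paths.values())
print(f"overall estimate: {p_event:.1%}")  # overall estimate: 8.3%
```

The value of the exercise is less the final number than the fact that each subcomponent can be argued, pro and con, on its own.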

Stig Brodersen  15:13  

So in chapter five, he talks about whether or not superforecasters are super smart, because what they’re doing seems perhaps very advanced. And what he found is that IQ does not fully predict superforecasting, even though it is actually important. He’s saying that regular forecasters typically perform really well if they’re in the top 30% of the population. 

And then, there’s another step up to the superforecasters, who are all in the top 20% of the population. But he said there’s something even more important than IQ, and that is how much they enjoy cognitive challenges. Do they like to do Sudoku and crosswords? Those were actually his examples in the book. 

Also, what is their openness to experiences? And he came up with this example. He said that one of the questions they asked superforecasters was they want them to predict the presidential election in Ghana. 

Now, very few people outside of Ghana would know much about its presidential election. So, it basically comes down to this: how do you think as a human being? Are you thinking, “That doesn’t concern me. I don’t want to spend my time on it”? Or are you thinking, “This is a great chance to learn more about Ghana”? If that’s what you’re thinking, then you have the mindset of a superforecaster. 

Because as a superforecaster, you need to acquire new knowledge, and you need to come up with a good, realistic thesis about what you think will happen. And no amount of hard work can compensate for not being open to that experience of gaining new knowledge.

Preston Pysh  16:43  

So the next chapter is chapter six, and the title is “Superquants?” What this chapter really gets into is the idea of confidence in some of the predictions and probabilities being determined by the superforecasters. And so, one of the examples in the book, and I’m going to read here from my notes, is about financial advisors. 

People usually trust confident advisors more readily than advisors who are less confident. On the face of it, accuracy and confidence may seem like different things, but in our minds they’re correlated, and many of us place so much faith in this correlation that we exaggerate it unintentionally. 

So, this is a bias that people have, that whenever they see a confident person who’s maybe saying this is what’s going to happen, and these are all the reasons why. They immediately put way too much emphasis on the probability that it’s actually going to occur. And so that is a potential defect in our thinking. This is a bias that people have. 

To combat this, when you’re around a person who is very confident and giving you very profound reasons why something might or might not happen, ask yourself: can I explain why they might be wrong, or provide other reasons why that thinking might be flawed? If you can’t, the mindset should be that you don’t understand the counterargument well enough, and you should probably back down on the probability you’re taking from that person’s confidence. So, let’s say somebody gives you a great argument, and based on what they told you, you say the probability of that happening is 80%. If you can’t figure out ways that argument might be wrong, your assessment of that confidence maybe needs to be shaved down and brought down, because you have no other means to troubleshoot it. 

Okay, so chapter seven is titled, “Supernews junkies?” And this one really revolves around a bias called consistency bias. And this is something that we read a lot about in the Robert Cialdini books where when a person puts an opinion out there, they have a lot of momentum to keep that opinion because they want to remain consistent in their thinking. 

[This is] because there’s this stigma in society that when you change your opinion, you’re a person of volatile thinking and that you aren’t confident in what you think. And it’s definitely viewed from a cultural standpoint as being a liability for a person’s behavior. 

As a result, most people get hardened into the positions they have, like, “I have the opinion that X, Y, and Z is going to happen.” Even if a ton of proof later unravels that position and shows them they’re wrong, a lot of the time people will become even more hardened in that opinion to remain consistent in their thinking. 

So, what the authors are getting at in chapter seven is that these superforecasters take a very different approach to this bias. In fact, they’re very quick to change their opinion when new information is presented. They’re very quick to say, “Oh, that’s really interesting. I think maybe the way I was thinking about this before is wrong, and now I might actually have the exact opposite opinion.” That’s the hallmark of a superforecaster. It’s that way of thinking. 

A really famous investor that immediately comes to mind for me when reading through this chapter is Stanley Druckenmiller. Because when Druckenmiller’s on TV, I’ve heard him say, I don’t even know *inaudible* how many times: “This is my opinion today. I think gold’s going to go up, but I might change my opinion tomorrow, and I might actually have the exact opposite opinion. I might put on the exact opposite play tomorrow if new facts are presented to me.” And so, that is definitely a person with a superforecasting mentality. 

Back to the chapter title, “Supernews junkies?”: these people are constantly reading the headlines. They’re constantly reading both sides of an argument, and they’re constantly updating their projection of the probability that something will happen. 

So, if they had an estimate that was 63% probable one day, they might read a news article and then determine that it has shifted to a 57% likelihood the following day, based on the information they got. 
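For readers who want to see what a small, granular update like 63% to 57% might look like mechanically, one standard way to express it is Bayes’ rule in odds form. The book doesn’t prescribe this exact formula, and the likelihood ratio of 0.78 below is reverse-engineered purely to reproduce the numbers in this example.

```python
def update(prior, likelihood_ratio):
    # Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio.
    # A likelihood ratio above 1 means the news is more expected if the event
    # will happen; below 1 means it points the other way.
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Start at 63%, read an article that is mild evidence against (LR ~ 0.78),
# and the estimate lands near 57%.
print(round(update(0.63, 0.78), 2))  # 0.57
```

The habit the chapter describes is exactly this kind of small, repeated adjustment rather than a dramatic flip from one camp to the other.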

One final point from this chapter that I think is important to highlight is that superforecasters are really good at picking out the critical variables that drive the ultimate outcome of something. They don’t get caught in the weeds of ideas or evidence that aren’t really going to affect the final outcome. And I think that’s a really, really important part. I don’t know how a person hones that skill, but I definitely agree with that analysis in the book.

Stig Brodersen  21:33  

A really good quote he has in this chapter is from the British economist John Maynard Keynes. Apparently he said, “When the facts change, I change my mind.” I think it’s also really a cultural thing, as you said before, Preston, because especially in the West, we don’t like what we would call flip-floppers. We don’t like people to change their opinion. We see them as inconsistent. We see them as perhaps not having a good analysis, because why else would they change their opinion? That’s something I’ve thought a lot about. 

For instance, after we did the first episode about gold with Jim Rickards, I remember thinking, “Hmm, perhaps I was wrong about how I looked at gold.” Not in the sense that I’d want it as an investment, but in terms of what gold means, that it’s actually a currency, and what that really implied after also reading his book. 

The first thing I thought about after reading the book was that, 30 episodes ago or whatever, I had said something completely opposite. And so I was considering, could I really say on the podcast that I’d changed my opinion? How would people think about it, and how would they perceive me? Would they perceive me as a flip-flopper, instead of just being authentic? 

I don’t know if I’m right, but I think I’ve gotten smarter about this subject, and this is what I mean now, and this is what I meant in the past, and this is why it’s different. I realized this part about when the facts change, you change your opinion. 

Preston Pysh  22:49  

Yeah, I totally agree. 

Stig Brodersen  22:51  

So chapter eight is called “Perpetual Beta.” It’s about how hard it is to look back at your previous forecasts. It actually turns out that forecasters have a really hard time remembering what they actually predicted in the past, especially if they were wrong. And there’s actually this experiment where they asked forecasters what they thought about a given event. 

The situation was the Berlin Wall. It turned out that they misremembered by a margin of 31%. So that would mean that 71% remembered thinking it would fall, when it was actually only 40% who predicted that. And this is a problem we all have as forecasters. We tend to remember the times we were actually spot-on in our predictions, and we kind of forget, or come up with excuses, whenever we’re wrong. And we see that in the stock market all the time. 

Whenever you’re talking to other people or fellow investors, they’ll be telling you about all the times they were correct in the analysis, whether or not it was because of the analysis. And there was probably also because of bad luck and whatnot that the bad investment had turned out.

Preston Pysh  23:56  

So in Chapter Nine, which is called “Superteams,” the best way to describe this is the idea of groupthink. I think most people who listen to the show are aware of this bias, because we’ve talked about it a few different times. It occurs when a group of people get together, one person throws out an idea, and everyone kind of feeds off of it. It’s almost like confirmation bias as well: people feed off of that idea, they go in a certain direction, and they fail to go back and shoot holes in the argument for why that approach, or the forecast direction they’re heading in, might be wrong. 

And the book does a great job describing an example of this, using President Kennedy’s administration during the Cuban Missile Crisis: the team’s forecast as a group of what they thought was going to happen versus how it actually turned out. 

So, I really liked the story they provided in the book. It was a really good example of how this goes wrong in many different ways, and of ways you can try to prevent it from happening, which is really about opening up to the group and saying, okay, who has the opposite opinion? Who sees this from this vantage point? What do you think the probability of this occurring is? And going around the room and kind of forcing those people who normally don’t talk to throw out ideas. 

Or here’s another example, going to a person who has a really strong opinion, and says, this is what it is, and then going back to that same person and saying, I want to hear you argue the other side of the opinion here. I know you think that this is what’s going to happen, but I want to hear you argue the opposite opinion of what you have, and forcing that person to think outside the box. 

And then whenever you create that dynamic in the group, what you’re going to have is everyone else in the room thinking through, oh, well, here’s maybe how you could argue the other side of that. You start getting everybody in the room thinking of ideas for how they could argue the other side of it. So much of this is driven by the leader who’s moderating the discussion. 

So in this case, it would be President Kennedy. He has such an important responsibility in driving this conversation, and most importantly, remaining neutral in the way he’s accepting the information because as soon as all those subordinates in the room start seeing that maybe he’s leaning in one way or the other, they now immediately start tailoring their discussions and their narratives in that direction, because he’s their boss, they want to look good in his eyes. 

And so, that’s a really important consideration for anybody in a leadership role to try to go about this from a superforecaster perspective, is to really try to keep things in a balanced argument mode throughout the entire discussion.

Stig Brodersen  25:39  

Yeah, and they really found that diversity is a strength when it comes to these superforecaster teams. It basically comes down to different people having different ways of collecting data and processing it. I don’t know if that also explains why superforecasters are doing better than intelligence agencies. 

Again, I don’t know anyone in an intelligence agency, so I won’t know, but I would kind of assume that a lot of them would have a more similar background in the teams than what we see in this book, whether it just basically come from a *inaudible* background you can think of, because the most important thing is to embark on a consensus fallacy. And that is that we are so similar, and we like each other, we don’t want to have disagreements, so why can’t we just go with this conclusion and then run with that. 

On the other hand, a group cannot be too diverse. We can’t have too many disagreements, because if people have too many disagreements, the author found, they start wanting to win the argument rather than find the truth or the right forecast. It becomes more like, my argument is better than yours, and I don’t like you. So they’re looking more into themselves and getting a sense of fulfillment from winning an argument, or winning the shouting match, than from actually coming up with a good answer. 

Preston Pysh  27:54  

Boy, is that one true. When you think it through, I mean, how many people are trying to save face? It becomes more of an ego thing than, let’s discover what the truth is, regardless of how that might make me look in the long run.

Let’s jump straight into chapter 10 because it revolves totally into this conversation, which the title for chapter 10 is, “The Leader’s Dilemma”. What it discusses is that the three key characteristics that most people attribute to strong leadership is confidence, decisiveness, and vision. 

And so, when you look at those first two words, confidence and decisiveness, it really kind of goes against a lot of the things that we were describing in the previous chapter, where the leader needs to be noncommittal. He needs to provide a framework for two opposing points of view to play out, as if he has no idea what choice he’s going to make, in order to create that environment of tension between the two sides of the argument so that the truth can be unveiled. 

The authors say that, although there’s this dilemma between being a great leader with confidence and decisiveness, leaders can still display the attributes of a superforecaster. But it’s very difficult for many leaders, especially ones that have been in charge for a long time and have an army of staff around them that’s accustomed to feeding that leader exactly what they want to hear. It’s a really interesting discussion, and I think it’s a really important highlight, especially for anybody in a managerial role.

So, let’s go ahead and jump to Chapter 11. This one is titled, “Are They Really So Super?” and Stig’s going to go ahead and cover this one.

Stig Brodersen  29:40  

So basically, in this chapter, he’s trying to come up with counterarguments against his thesis for this book, that superforecasters actually exist, and to consider why it might be wrong. He presents the argument from Nassim Taleb’s book, “The Black Swan”. We talked about that book in Episode 47. And the premise for “The Black Swan” is that just because you can’t prove something isn’t correct doesn’t mean it is correct. 

And the reason why it’s called a black swan is that he was talking about living in Europe in the 1500s, where you could never, ever imagine a black swan, because they didn’t exist in Europe. But just because you haven’t seen one, just because you haven’t experienced it, just because people haven’t told you about it, doesn’t mean it doesn’t exist. And the author brings up the same premise for his own book. 

He said, I have all this great evidence why there’s something called superforecasters, and I think I can identify why I’m right about my thesis. But what can I do to prove myself wrong, in the sense that there’s maybe nothing like superforecasters, that this is all wrong? And I think that really shows a very humble attitude that he has toward his own work.

Preston Pysh  30:51  

I think, at the end of this chapter, what the authors are really getting at is that they have a deep appreciation for Nassim Taleb and the research he’s done with the book, “The Black Swan,” from a statistical standpoint. 

But I think that they disagree with Taleb where Taleb really kind of writes off anyone who’s trying to make forecasts about the future. This is probably how Nassim Taleb would say it: “He’s an idiot.” And I think they have a much more positive outlook from their research: that forecasting can be done with a very high level of accuracy and confidence by the people who are very good at it. So, I just see that they have kind of a conflicting point of view, and that was highlighted in the 11th chapter of the book. 

Moving on to the final chapter, which is, “What’s Next?”. It’s just kind of a really quick recap of the book. The authors talk about how forecasting is really important for a business’s success and for a government’s success, in order to accurately predict what’s potentially on the horizon. They talk about how, if you use the framework that’s outlined in the book, you’re not going to get perfect results, but you’ll have a framework for keeping track of what your forecasts have been. And that was a really important part of the book: don’t just make a bunch of forecasts and then never look back at your track record, because without keeping one, you have no way of statistically determining whether you’re in this category of superforecaster, whether you’re an outlier or not. 

I think a really important consideration is the historical record of how well a person has done after making 100 or 1,000 different forecasts, and using evidence-based policies to develop the framework for how those forecasts are being conducted. So, the authors say that if you’re going about it in that manner, it could actually be very fruitful for the individual who has invested a lot of time and effort into honing this skill, and I completely agree with them. 
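[Editor’s note: for readers who want to act on this point, the forecasting tournaments the book describes score track records with Brier scores, the squared error between a stated probability and what actually happened. The sketch below is illustrative only; the forecast numbers are hypothetical.]

```python
# Minimal sketch of keeping a forecast track record with Brier scores.
# A forecast is a probability (0..1) that an event happens; the outcome
# is 1 if it happened, 0 if it didn't. Lower is better: 0.0 is a perfect
# forecast, and always guessing 50% earns a score of 0.25.

def brier_score(probability: float, outcome: int) -> float:
    """Squared error between the stated probability and the outcome."""
    return (probability - outcome) ** 2

def track_record(forecasts: list[tuple[float, int]]) -> float:
    """Average Brier score across (probability, outcome) pairs."""
    return sum(brier_score(p, o) for p, o in forecasts) / len(forecasts)

# Hypothetical history: (stated probability, actual outcome)
history = [(0.9, 1), (0.7, 1), (0.3, 0), (0.6, 0)]
print(round(track_record(history), 4))  # → 0.1375
```

Logging every forecast this way is what makes the chapter’s point actionable: after 100 or 1,000 forecasts, the average score shows statistically whether you are an outlier or just telling stories.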

So, that’s our analysis of the book. I really enjoyed it and found it to be really interesting. If you’re a person who isn’t aware of a lot of biases, this would be a great read to get caught up on where there might be some flaws in your thinking, especially when it comes to assessing where investments might be going in the future. 

Stig Brodersen  33:13  

And if you’re interested in reading our executive summary of “Superforecasting”, you can go into our show notes, or you can sign up for our email list where we send out PDF files with these executive summaries twice a month. 

But guys, that was all that we have for this week’s episode of The Investor’s Podcast. We’ll see each other again next week.

Outro  33:32  

Thanks for listening to TIP. To access the show notes, courses or forums, go to theinvestorspodcast.com. To get your questions played on the show, go to asktheinvestors.com, and win a free subscription to any of our courses on TIP Academy. 

This show is for entertainment purposes only. Before making investment decisions, consult a professional. This show is copyrighted by the TIP Network. Written permission must be granted before syndication or rebroadcasting.
