Ever thought about why you make the decisions you do? And how you know it’s the right one? From acting on intuition to weighing the risks and rewards, there’s a lot behind our big decisions. Listen in as Pat McCauley of Bridgewell Agribusiness, Rick Thomas of Pilot Wealth Management and Brandon Laws of Xenium HR discuss the science behind the decisions we make. We’ll cover tactics that traders use to combat uncertainty, how to avoid bias in decision making and tips for making the best choice you can every time.

MP3 | iTunes | Stitcher
Run Time: 38:32

JOIN US FOR A SPECIAL KEYNOTE PRESENTATION ON JUNE 12TH

Xenium HR and Pilot Wealth Management will be co-hosting the seminar Traps and Biases – How to Make More Effective Management Decisions with Pat McCauley on June 12, 2018. The seminar fee is $49; however, for podcast listeners, the first dozen registrants will receive a 50% discount. Use the promo code “XENIUMHR” upon registration to apply the discount.

Brandon Laws: Hey. Welcome to today’s show. I’m your host Brandon Laws and today is a very, very special episode. We’re going with more of a business-focused topic today. We invited Pat McCauley, CEO of Bridgewell Agribusiness, on the podcast.
The unique thing about this particular episode is that Rick Thomas, who has been on the podcast before, joins me as co-host. He is a business adviser and a partner at Pilot Wealth Management, and he also has a podcast called The Idea Revolution.
So Rick and I put our heads together. We both wanted to interview Pat. We had him speak at an event back in November 2017 and we loved his talk. He was fantastic on our CEO panel and so we both wanted to interview him and thought, “Hey, let’s do a simulcasted podcast.”
So, for the purposes of this particular episode, Rick and I share hosting responsibilities. So that’s why you will hear both of us jump in and play host. You’re going to love this episode. Pat is a wealth of knowledge and he put together a seminar on the science behind decision making.
So why do we make the decisions we do, and is there a science behind it? In this episode, Pat dives into why thinking like a trader – thinking in bets – is really good for decision making. He talks about how bias plays a role in decision making, and so much more. We really only touched the tip of the iceberg here, and that’s why we are putting on a full seminar in Portland, Oregon on June 12th, 2018, with Pat McCauley as the featured presenter. It will be a great breakfast and networking event as well, with other senior business leaders.
So again, June 12th. Link to the registration is in the show notes. Special promo code for this event is also in the show notes and it’s exclusive to you as a podcast listener.
So I’m going to step out of the way. You’re going to really enjoy this episode. Let me know what you think about this episode. Reach out to me on LinkedIn. Send me an email, whatever you want to do. I want to hear if you like these business-focused topics.
And even if these topics aren’t for you, please do forward them on to other senior leaders within your organization. Obviously, I’m trying to keep the show focused on HR topics and things about the workplace, but I also want some truly human development-based topics, and I think the science behind decision making falls under that category.
So it’s a little bit more business-focused. But I think you will really enjoy this. You’re going to get a lot from this. So enjoy the episode.


Rick Thomas: Very cool to be here with Brandon and Pat this morning. Brandon and I are trying a bit of an experiment here – a simul-podcast. So we will see how this works.
Brandon: It’s exciting.
Rick: With mine being The Idea Revolution and Brandon, you’ve got the HR for Small Business podcast.
Brandon: Yeah.
Rick: So very cool to be teaming up and doing this with you.
Brandon: Yeah, I’m excited to have Pat here too.
Rick: Even more so, yeah. So we’ve got Pat McCauley, CEO and owner of Bridgewell Agribusiness here in the area, in the Portland area. Welcome, Pat.
Pat McCauley: Thanks. I appreciate you having me on.
Rick: We had the opportunity to meet Pat a few months ago, actually late last year. He was a guest on a panel we had as part of the Leadership and Economic Summit that Pilot and Xenium have done for a number of years here in the Portland area. Pat participated on the CEO panel, and in getting to know the panelists, he began to talk about decision science. Right away, I started geeking out, because I’m really curious.

First of all, when I meet people that are a lot smarter than me, I pay attention. So definitely wanted to learn more –
Pat: I appreciate that.
Rick: Learn more about Pat and rather than listen to me ramble on, tell us a bit about Bridgewell Agribusiness. What is it you guys do? Then kind of what has been your larger career track? How did you get to that place? What kind of informed this whole topic around decision science?
Pat: Sure, thanks. Bridgewell Agribusiness is in the business of sales and trading of commodity-based food products. So we trade grains, oils, some specialty fruits, tropical products.
Our business is national and international. We focus primarily on customers that are looking to solve problems. So we tend to focus on products where customers have historically had, or currently have, issues getting supply of the commodity products they need, and our job is to help them figure out how to solve those problems.
So we talk about ourselves as providing supply chain solutions in niche markets. We tend to manage the overall supply chain: we find the source of the commodity, we manage the supply chain through to the customer, and we do some value-add in getting the customer exactly what they need.
So sometimes we’re – for example, we might be refining oils. We might be sourcing organic oils. We might be finding specialty grains and then having them floured or milled to a certain specification.
What really got me to the business was that I was recruited to be the CEO of a related company about four years ago, and that job kind of morphed into me actually buying Bridgewell Agribusiness from that entity and going out on our own. I have two partners, and we work in the business together.
Prior to that, what really shaped my background, particularly with decision science, was my experience at Susquehanna International Group, which is a large Wall Street proprietary trading firm – one of the largest proprietary trading firms in the world at this point. Just to give some perspective, they probably do north of 15 percent of the average daily volume in derivatives markets – options and ETF trading – and probably over two percent of the listed NYSE and Nasdaq stock volume on a daily basis as well. And all the capital is just from the partners who started the business in 1987.
Rick: In terms of dollars that are being traded, give us some sense, just to put scale to that.
Pat: It will be billions of dollars in long and short positions on a daily basis that are rolling over.
Rick: Yeah.
Pat: Now, not all the positions are long term positions. It’s typically short term. But I was fortunate. I started there out of business school in 1991. I was hired as a trader. I started in Chicago on the floor of the Chicago Board Options Exchange and there were about 125 people when I started.

In 1996, they asked me to be the COO of the company. At that point, we had about 350 people. I had two different stints as COO, and after the first one, which was about seven years, we went from 350 to about 1,250 people.
I was responsible for a lot of the growth and development and the hiring and the decision making around what we were going to do in terms of what businesses we were going to be in and how we were going to staff those businesses.
Rick: Right, right. On the decision making in particular, I’m assuming that was the road that led you to this whole notion of decision science?
Pat: I was hired originally as a trader. But very quickly, as the company was growing, they approached me and said, “Would you be interested in actually teaching other people how to trade?” We were hiring people like crazy, and we did not have anybody formally working on teaching the people we were hiring how to make trading decisions.
So, with one of the managing directors – one of the founders – I built the curriculum, which they still use today, to teach everybody how to make trading decisions.
It was in doing that that I realized I had a knack for teaching and a real interest in decision science, among other things, and I put together that curriculum. So I would say that was probably the seminal time: the things that I do today came out of that experience of being, effectively, the teacher of all the people at Susquehanna who were in trading and venture capital.
Rick: Interesting, interesting.
Pat: And I know we’re really going to talk about trading and decision making, so I’m anxious to dive into that.
Rick: Right.
Brandon: Yeah.
Pat: I don’t want to sort of spoil the party.
Rick: Yeah. Well, let’s get there quickly and I’m going to mention one more thing for the listeners, that if this topic is intriguing, stay tuned, because you can hear some really cool stuff. At the end we’re going to talk about an opportunity to participate in the seminar that we’re going to host between Xenium and Pilot jointly here in Portland in June, which Pat is going to lead for us. So stay tuned for that.
Brandon: Yeah.
Rick: So let’s get to the goods as they say. Brandon, why don’t you tee us off?
Brandon: I’m fascinated by decision making in general, and obviously there’s a science behind it. Our brains are very complex, and I think there’s so much we don’t know about how we make decisions. Your background in trading is fascinating, and a component of your work is thinking like a trader in terms of making the right decisions. Talk to me about that. Unpack that thought for us.
Pat: The real premise is that trading is decision making under uncertainty. You are forced to make a lot of decisions in a very compressed time period, you get very quick feedback on the decisions that you made, and, to the extent that you are risking a significant amount of capital in making those decisions, you get flushed out pretty quickly in terms of whether or not you are a good decision maker, or whether you have biases or other blind spots in your thinking.
So trading was always a really, really good forum for teaching people how to make decisions.
Brandon: Because things are moving so fast.
Pat: Because things move so quickly, and because you have to make decisions with incomplete information. That’s true in the real world too – and I don’t consider trading to be the real world. Imagine you’re the manager of a company: you’re running a large business and you’re trying to make decisions about what to do in the future. Compare that to somebody who’s standing in a trading crowd, trying to decide whether or not they want to risk millions of dollars by being long or short in any particular security that they’re trading at that time.
They’re going to know within a very short period of time, probably minutes, whether or not their decision was any good.
Brandon: Well it’s funny. I always think back to the book Blink by Malcolm Gladwell. The thesis behind the book is you make split-second decisions and sometimes they’re right and sometimes they’re wrong. I imagine in the trading world, things are moving so fast and there are high stakes. A quick bad decision can cost a lot of money.
Pat: Yeah. It can cost you a lot of money really quickly and it’s pretty significant. So one story we tell is the first week that I was at Susquehanna – and I was fresh out of business school, which of course meant that I thought I knew everything. I was down on the trading floor and we were standing around as a group and we had started to have a conversation about basketball. Now I played basketball in college and I’ve always considered myself to be a really good basketball player.

Suddenly the conversation, as conversations among traders will, gravitated to gambling and betting. Being so cocksure about who I was and what I knew, I was speaking up about wanting to be involved in the bet, and people were aggressively trying to bet against me. I realized something didn’t feel right. My instinct was, maybe I’m missing something about what the bet really was, maybe I’m not thinking about this the right way. I quickly came to realize, mostly through people verbally hitting me over the head, that there was something I was missing about what was going on in the bet and that I was really just a sucker.
It was a very good moment, very early – literally in the first week of my career – where I was like, “OK. Well, maybe I don’t know as much as I thought, and this is a little bit humiliating.” Everybody dispersed and walked away, and my manager at the time, who was a phenomenal trader and a great guy, looked at me and said, “Don’t ever say anything that you’re not willing to bet on.”
I had started to talk about what I was willing to do and what I thought I could do, and he repeated it: “Don’t say anything you’re not willing to bet on.” That’s really the essence of what trading is: if you have opinions about things, and you have decisions that you want to execute on or risk capital on, you’re really just taking a bet, and you’d better be really confident that the return you’re expecting is compensating you for the risk that you’re taking. Too often in business, people don’t really think about things in that way. So there’s a great book that just came out by a woman named Annie Duke, who was a professional poker player.
I’m fortunate to actually know her personally. She’s a great person, and she just came out with a book called Thinking in Bets, which is the first book I’ve read that really talks about the trader mentality and how you might apply it to other things in business.

But if you think about it, if trading is really decision making under uncertainty, and you’re making decisions about taking risk with incomplete information all the time, then trading is really no different from running a business or managing people or thinking about what you want to do with your company. So some examples. Hiring and firing: you’re making decisions about who to keep on your team and who not to have on your team, most of the time without full information.
Rick: Yeah.
Pat: Forecasting and budgeting. Capital allocation. New business development. These are all arenas where you are making bets about the future of your business without full information. One thing that’s really important: if you had full information, there would be no decision to make, and I think people forget that very often. If you know for sure that something is going to happen, then we don’t have to have a discussion about making a decision. It’s already done. We know what the outcome is going to be.
But that’s where it gets really fun, when you’re not really sure and you have to make these decisions. If you do it carefully and thoughtfully, and you keep score with what you do – and we will talk more about that later – you can be a really effective decision maker, in what I would call thinking more like a trader.
Rick: Well, that’s great. So let’s continue to pull that thread. You talk about probabilistic thinking in the materials that we’ve reviewed and what not. First of all, define that. What does that mean?
Pat: The first thing I would say is there are really two key elements to being a good decision maker. One is truth-seeking and one is probabilistic thinking. Truth-seeking is really about honesty and candor: honesty with yourself, honesty with the people that work for you, and the candid feedback that the market gives you. Now, in the trading world, the feedback is as candid as it gets – it is just black and white, you either win or lose. It’s not so easy, and it’s obviously much more subtle, when you’re managing people and managing a business, but it’s still honesty and candor.
Then probabilistic thinking, I think is really around inquiry and curiosity. So if you are thinking in probabilities, you’re thinking about not just what you think is going to happen, but how likely is it to happen. What are the other things that could happen and really how wrong could you be about what it is that you’re thinking?
Rick: Right.
Pat: So probabilistic thinking is really about framing all of your decisions with what do I expect to happen and then how likely is that outcome to come to be. What are all the possible other outcomes? Then from a risk-reward standpoint, I now have a framework for evaluating what I expect to happen and what the risk associated with the outcome, the variance of the outcome, is.
That’s applicable, I think, to really anything that you do. It’s particularly applicable to trading because trading is bound by mathematics to a large degree, which makes it simpler. But really, anything that you do in business, whether you frame it this way or not, you have some idea of what you expect to happen, some sense that it might not happen, and maybe some thought about other things that could happen. But how rigorously are you doing that? How scientifically are you doing that? That’s really the fundamental basis of probabilistic thinking.
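To make that framing concrete, here is a minimal sketch in Python of treating a decision as a set of possible outcomes rather than a single forecast. The scenario and all the numbers are invented for illustration; nothing here comes from the episode.

```python
# Probabilistic framing: list every outcome you can think of, attach an
# honest probability and payoff to each, then look at the expected value
# and the spread, not just the single outcome you hope for.
from math import sqrt

# Hypothetical product launch: (payoff in dollars, probability)
outcomes = [
    (500_000, 0.20),   # strong adoption
    (100_000, 0.50),   # modest adoption
    (-200_000, 0.30),  # launch fails
]

assert abs(sum(p for _, p in outcomes) - 1.0) < 1e-9  # probabilities sum to 1

ev = sum(payoff * p for payoff, p in outcomes)
variance = sum(p * (payoff - ev) ** 2 for payoff, p in outcomes)

print(f"expected value: ${ev:,.0f}")              # what happens on average
print(f"std deviation:  ${sqrt(variance):,.0f}")  # how wrong you could be
```

In this made-up example the expected value is positive, but the standard deviation is larger than the expected value itself, which is exactly the “how wrong could I be?” question Pat is pointing at.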
Rick: And I’m curious. For the people that have excelled at creating the mental model or framework that serves that probabilistic mindset, it seems like there’s a qualitative aspect separating the people that do it well from the people that don’t. I’m guessing here, but in dealing with the implicit uncertainty that comes with almost any decision, there’s that emotional component. Does the framework of probabilistic thinking help mitigate that emotional component? Or is that just a qualitative aspect – either a virtue people have, that they’re able to quiet it, or not, and they will never get it?
Pat: Well, I would say that anyone is capable of getting it. So the first thing I would say is anybody can learn how to think probabilistically, even if you’re not a math person. So what I hear a lot of times when I talk to people is “I don’t like math,” or “I was never very good at math,” or “I’m scared of math.”
What I realized mostly through trial and error with people that I’ve worked with and people that have worked for me is you can explain to them that, well, there’s actually a mathematical component to really anything that you do.
If you work in human resources or you work in accounting, obviously it’s math. If you work in operations, you work in technology, there’s a math component to all those jobs. I think people get very nervous when they hear the word “probabilistic” because it’s like, “Well, I’m not really sure. I’m not very good at math. I don’t know what that really means. I’m not …”

But think about it: almost every decision that you make, whether it’s personal or professional – and Annie Duke says this in her book – when you make a decision about something to do, you are in some respect betting against all the other possible outcomes that you could have chosen.
Rick: Right.
Pat: Or all the other possible versions of what the future might look like, or what you might look like in the future. You’re doing this subconsciously; I would encourage people to do it a little bit more rigorously, so that you can do it better. So when I think about probabilistic thinking – I have some idea of what I expect to happen, and I have some sense of how likely that is to happen – the way I think about that is in terms of confidence intervals around my assessments.
So I might think that something is going to happen, and I have to have some marker, some confidence interval, around how likely that is. A good example is in the stock market, where this happens all the time. Say I want to buy a stock; we will use Facebook, because they’re in the news a lot. Everybody knows Facebook.
Rick: They’re very topical.
Pat: You can go on your computer right now and see what the current price of Facebook is. But what you can also see around that is the market for Facebook. What’s the bid and what’s the ask? How much volume, in aggregate, is the world willing to buy on the bid or sell on the ask before the price changes?
In a very simple sense, that’s a confidence interval. People think they know what Facebook is worth today based on all the information they have at this particular moment, and there is their confidence interval around it, marked by the width and the size of the market. That’s really what we’re talking about with probabilistic thinking: thinking about your decisions in terms of what you expect to happen – the current price based on what you know – and then a confidence interval with width and size. How wide and how deep is the market that you’re willing to make around it?
Obviously the more confident you are, the tighter you can make your market and the more you might be willing to bet on that outcome. I think that’s a good analogy for how you might make business decisions that have nothing to do with trading.
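A rough sketch of that analogy in code: a two-sided market is a point estimate plus an interval whose width shrinks, and whose size grows, as your confidence rises. The class, the scaling rule, and the numbers are all invented for illustration, not taken from the episode.

```python
# A quoted market as a confidence interval: a tighter quote and bigger size
# mean higher confidence in your estimate. The scaling rule is arbitrary.
from dataclasses import dataclass

@dataclass
class Quote:
    estimate: float    # what you think the thing is worth
    confidence: float  # 0..1, your honest confidence in that estimate
    max_size: int      # the most you'd ever commit

    def market(self) -> tuple[float, float, int]:
        half_width = self.estimate * (1.0 - self.confidence) * 0.10
        bid, ask = self.estimate - half_width, self.estimate + half_width
        size = int(self.max_size * self.confidence)  # bet more when surer
        return round(bid, 2), round(ask, 2), size

print(Quote(estimate=180.0, confidence=0.90, max_size=10_000).market())
# -> (178.2, 181.8, 9000): high confidence, tight market, big size
print(Quote(estimate=180.0, confidence=0.50, max_size=10_000).market())
# -> (171.0, 189.0, 5000): low confidence, wide market, small size
```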
Brandon: What about when we’ve taken in so much information that we’re really confident in what we’re thinking, and then, all of a sudden, we develop biases – for example, confirmation bias? I hear this a lot, where a person becomes so confident in a way of thinking that they seek out more information to support it, maybe not intentionally. How do biases play a role in how we make decisions?
Pat: That’s a very good question, and I don’t want to give away too much of the seminar. One of the things that we will cover in a fair amount of detail – and, I think, in ways that people will be able to personalize – is confidence, and overconfidence in particular.
So without giving away too much: why are weather forecasters really good predictors and doctors not? You would hope – no offense to weather forecasters – that doctors, who have had much more extensive education and have presumably demonstrated a high level of aptitude in school, are going to be smarter than weather forecasters.
Brandon: It’s all about the information.
Pat: At least in terms of academics. Yet weather forecasters are really, really good at making predictions relative to doctors. Now, it turns out weather forecasters are actually not very good predictors of the weather in absolute terms; that’s a separate issue, and Nate Silver covers it. Nate Silver, who is –
Brandon: Yeah, he has got great work.
Pat: He does great work. The Signal and the Noise is a great book. Nate Silver has blown the doors off the myth that weather forecasters are good predictors in general. They’re not. But my point is that relative to doctors, they are, or historically have been. So why?

The answer is that if you’re a weather forecaster, your job depends on your ability to make predictions about what’s going to happen with the weather, hopefully out at least 48 to 72 hours. The point is that you get a lot of feedback. You make a lot of decisions, you have a lot of modeling tools, and you get a lot of feedback. When you don’t do a good job consistently, you get fired or replaced, because if you’re telling people it’s going to snow or it’s going to rain, take your umbrella, and it turns out hot and sunny, and they’re wearing galoshes when they should be wearing sandals, they’re not going to be too happy with you. They’re going to call the station and you’re going to get fired. Whereas if you’re a doctor and you do an examination on Rick and you say, “Rick, I think you may have some really exotic disease. It gives you about a two percent chance of survival beyond six months. But I’m not really sure. Let me do some more tests and get back to you,” there’s no recourse.
So they don’t keep score, and there’s no recourse if their decisions are bad. That’s not a bad thing; I don’t mean it as a value judgment. But the point is that they don’t have any mechanisms set up for them to get really good feedback on their assessments of what might be wrong with Rick when he comes in and says, “I don’t feel good.”
So what you want to do is have your confidence in your assessment of anything match the actual likelihood of that thing happening. Someone who has done really good work on this is a gentleman named Phil Tetlock. He has a book called Superforecasting and an earlier one called Expert Political Judgment. If you go online and Google his name along with “confidence test,” he actually has an online test where you can measure how well your confidence matches up with your actual predictive ability.
What his research has found is that people are not very good at predicting, that they don’t really measure it, and that they don’t really care – we will get to biases later, where they tell a story around the outcome that makes them feel better about themselves. But his point is that if you want to be really good at forecasting and predicting – which is really what probabilistic thinking is – you need to be able to match up your level of confidence with your predictive ability.
So it’s not about being 100 percent right all the time. But if you’re only 60 percent right, you had better have a 60 percent confidence interval around that, and you had better prepare for the other 40 percent.
Rick: Right. What’s the downside –?
Pat: But if the outcome is 60 percent likely to be true and you’re 100 percent sure, you’re setting yourself up for making really, really bad decisions.
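Keeping score on yourself the way Pat describes can be as simple as logging each prediction with the confidence you stated and then checking the hit rate per confidence level. A minimal sketch; the data and the bucketing scheme below are made up for illustration, not prescribed in the episode.

```python
# Calibration check: do the things you call "90% likely" happen ~90% of
# the time? Each entry is (confidence stated at decision time, outcome).
from collections import defaultdict

predictions = [
    (0.9, True), (0.9, True), (0.9, False), (0.9, True),
    (0.6, True), (0.6, False), (0.6, True), (0.6, False), (0.6, True),
    (0.3, False), (0.3, True), (0.3, False), (0.3, False),
]

by_confidence: dict[float, list[bool]] = defaultdict(list)
for stated, happened in predictions:
    by_confidence[stated].append(happened)

for stated in sorted(by_confidence):
    results = by_confidence[stated]
    actual = sum(results) / len(results)
    print(f"stated {stated:.0%} -> actual {actual:.0%} ({len(results)} calls)")

# Well calibrated means stated is close to actual. Saying 90% and hitting
# 60% is the overconfidence Pat warns about: prepare for the other 40%.
```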
Rick: What we’ve mainly been talking about is how to prepare for, or how to deal with, decision making in the moment. What about after the fact? What are some guidelines for assessing the quality of the decision making? How do you look at that?
Pat: I think you have to really do two things. One is you have to force yourself and force the people that you work with to make some assessment of how confident they are in what they’re assessing. Then you have to keep score. So oftentimes, we don’t do confidence intervals. We don’t keep score. But then we get outcomes regardless of whether or not we do the first two things.
Rick: Right.
Pat: And then we tell ourselves a narrative around those outcomes that speaks, I think on a lot of levels, to the biases that people are subject to. Let’s do an example: financial modeling. Every company does financial modeling. I do it all the time – literally on a daily basis, trying to predict what our cash flow and our cash position are going to be.
So I’m fully engaged in it, but people do financial modeling all the time: around what they expect salespeople to generate, around what they expect their company to do on profitability on a daily, weekly, monthly, or yearly basis, depending on their business. We’re really just in the business of making predictions, and we just talked about the fact that we’re not particularly good at making predictions, or wired to be.
So if we take financial modeling as the example, a model is really a representation of what you expect the world to look like, or hope the world is going to look like, right?
So what do you do when you create a model? You come up with a baseline based on a bunch of assumptions. Any model you create is really only as good as the assumptions you make around it. What we often don’t do is rigorously test our assumptions. We don’t build confidence intervals around them. We don’t think about how likely our assumptions are to be wrong until they’re exposed and we see what the outcome is. So if we’re going to be really good at making decisions and doing predictions in the context of modeling, we had better rigorously evaluate our assumptions before we settle on what the model looks like, so that we know where our model could break down and we have an appropriate level of confidence in it.
It doesn’t mean that we’re going to walk out the door and say, “I’m 100 percent sure that what I have on this paper is going to happen.” But I want to make sure that my confidence in what’s going to happen matches up with what’s on the paper.
Rick: Right.
Pat: And that’s really, really difficult to do, because we’re not in the habit of examining our assumptions or testing them. We don’t really have a good understanding overall of risk, or how to measure it, or how to evaluate it. We don’t really do a lot of post hoc analysis of what happens. We look at the outcome, but we don’t necessarily spend a lot of time thinking about how likely that outcome was to happen before it actually happened, right?
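One way to be proactive about this, in the spirit of what Pat describes, is to stress-test a model by putting an honest range on each assumption and simulating, so the output is a distribution instead of a single budget number. The toy model, the ranges, and all the figures below are invented for illustration.

```python
# Monte Carlo stress test of a toy revenue model: each point assumption
# becomes a range, and the "budget" becomes a distribution.
import random

random.seed(42)

def simulate_annual_revenue() -> float:
    leads_per_month = random.uniform(80, 140)   # marketing assumption
    close_rate = random.uniform(0.05, 0.15)     # sales assumption
    avg_deal = random.uniform(8_000, 15_000)    # pricing assumption
    return leads_per_month * 12 * close_rate * avg_deal

runs = sorted(simulate_annual_revenue() for _ in range(10_000))
p10, p50, p90 = (runs[int(len(runs) * q)] for q in (0.10, 0.50, 0.90))

print(f"median plan:  ${p50:,.0f}")
print(f"80% interval: ${p10:,.0f} .. ${p90:,.0f}")
# If the interval is wide, the honest budget statement is "X, plus or minus
# a lot," which changes how you interpret a miss after the fact.
```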
We’re much more reactive when we could be a lot more proactive in making those kinds of decisions. You brought up something we’re now going to get into: biases. When you brought up the word “emotion,” a lot of times what emotion really speaks to is the biases that we all have.
When I say biases, what I mean is that as human beings, our brains are not hardwired to make rational, sound, fundamentally objective decisions. We get fooled all the time by our own intelligence, and that’s really what biases are about.
So biases are not value judgments. They are things that even people who are really, really good at making decisions are subject to all the time, and even if you’re aware of them, you’re always subject to them.
So it’s not like, you know, instead of AA I’ve gone through BA, right? Biases Anonymous, and now I’m cured; I’ve gone through the 12-step program.
There is no cure for it. I think what you can hope for at best is to have a really functioning awareness of what these biases are and then you can be really, really successful at consistently making really good decisions.
I’m aware of it all the time. I spend a lot of time thinking about it, and I still make mistakes all the time. My judgment gets clouded all the time in very small and subtle ways – probably a lot less than other people’s, but clouded nonetheless.
Brandon: Bringing science into this discussion makes so much sense because you hypothesize. You test that hypothesis and then you recap and you look at, “Hey, did it go exactly the way we thought?” and then –
Rick: Objective analysis.
Brandon: Objective analysis. I don’t think we do that in decision making.
Pat: We don’t, and that’s one of the big issues. Again, this is always post hoc. But think about why we did not hit our budget last month, or why we did not make the revenue we thought we were going to make last year. Why didn’t the salesperson we hired work out the way we thought they would when we hired them?
A lot of times, the answer is not that we were wrong or that we got bad luck. It has a lot more to do with maybe we weren’t really focused on the right things upfront. Maybe we made assumptions that we shouldn’t have made. But probably as much as anything, we were just too confident in what we thought going in. That’s a big problem, right? Because if you have people working for you and you say, “Here’s your budget. I expect you to hit your budget,” well, sometimes people hit their budget, sometimes they blow through it, and sometimes they miss it.
There’s variance there, and you need to have a sense of how confident you are in what you originally thought the budget should be. That means you have got to really stress-test your assumptions around the model that you give somebody, because a budget is really just a model that says, “Here’s what I expect you to do and why.”
But without speaking to how much variance there is around that, or what risks there are that may or may not be in the person’s control, you’re setting them up either for failure or for a post hoc interpretation that is probably biased.
Rick: For those of you listening, you’re probably getting a sense of how much meat there is on this bone, and I could go for a lot longer. But I think we’re going to summarize at this point, so let’s definitely talk about the event we’re having on June 12th. Pat, we’re going to have you come in and do a half-morning seminar, taking a deeper dive into some of the topics you’ve talked about today as well as a few others, along with some other people in the room, to go through some live-fire examples. Is there anything else you would want to tease up about that seminar at this point?
Pat: Well, there’s a lot to tease up. I mean we talked about overconfidence. There are a number of other biases. Warren Buffett is quoted as saying, “We interpret all information so our prior conclusions remain intact.” I think that’s a really good way of summing up a lot of the biases.
So some of the things we will talk about in a fair amount of detail, with tools and skills that people can take away for being aware of these biases: overconfidence; confirmation bias, which is really about looking for information that confirms what we might already believe.
There are a number of self-serving biases that we will talk about – stories that we tell ourselves so that we can preserve our beliefs, our self-esteem, or what we thought was true. And there’s hindsight bias, where we view the outcome and then construct a story around what we believe happened and why it happened.
You know, the famous quote is, “I knew it all along.”
Brandon: Yeah, right.
Pat: That’s really what hindsight bias is about. There are some other effects too, like the “Lake Wobegon Effect.” Garrison Keillor, the writer, satirist, and radio personality, created a fictitious town in Minnesota called “Lake Wobegon” where everybody who lives there is above average.

Brandon: That’s not possible.
Pat: It’s this mythical sort of place that every town should strive to be like, where everybody is above average. But let me just throw out a couple of quick statistics, which show why it’s called the “Lake Wobegon Effect.” Twenty-five percent of all employees think they’re in the top one percent of performance, which –
Brandon: The math doesn’t work out.
Rick: Yeah.
Pat: Eighty percent, eighty percent – and these are all studies that have been done. So I’m just kind of throwing out the summaries.
Rick: Right.
Pat: Eighty percent of school children rate themselves above average in leadership. So 80 percent of the kids think they’re in the top 50 percent.
Brandon: Eighty percent, wow.
Pat: This one is my favorite one. Ninety-four percent of academics rate themselves above average, relative to their peers.
Brandon: That’s unreal.
Pat: Yeah. I believe that’s in terms of research and the work that they produce. Here’s another interesting one, again just as a teaser: 75 percent of employees score themselves higher on evaluations than their peers and their managers score them.
So when you do 180- or 360-degree peer reviews, for example. When you think about all of these things going on, here’s the problem when you’re a manager. I can say, “OK, I’m a manager. I’m aware of this stuff, as aware of it as anybody, and I know that I’m still subject to it. So I worry about people who don’t think about it as much as I do.”
But then consider this on top of it: you have a whole bunch of people who work for you, and they don’t know anything about this, or they don’t pay attention to it, or you don’t evaluate them on it or teach them about it. So they are going to be really, really subject to these biases.
Rick: Right.
Pat: And if I were going to name the summary takeaway that we would hope for, it’s that people would leave and say, “I have an appreciation for what these biases are, and I have some tools that can help me at least do a better job of managing myself and helping to manage the people who work for me.”
Rick: Well, we will definitely include more information along with the podcast post on how to sign up for the workshop.
Brandon: It’s pretty limited, 60 spots.
Rick: Yeah, about 60 spots. Yeah, yeah.
Brandon: So podcast listeners really get an exclusive look at this –
Rick: Definitely, definitely. So more information to come. Definitely look for that. Pat, this is awesome as always. I always enjoy diving into this and look forward to doing this more on the 12th. Thanks Brandon.
Brandon: You’re welcome.
Rick: This was really cool doing this.
Brandon: Thank you guys. I really appreciate you doing this and it’s great to get the word out on decision making.
Rick: Very cool. We will do it again.
Brandon: Thanks Pat, Rick.
Pat: Thank you.