Corporate Finance Explained | The Psychology of Financial Decision-Making

August 14, 2025 / 00:13:42 / E150

You’ve spent days building a perfect financial model, yet something about the final decision just feels “off.” The numbers don’t lie, but our own brains can quietly lead us astray. In this episode of FinPod, we dive deep into the world of cognitive biases and reveal how these hidden mental shortcuts can derail even the most rigorous financial analysis.

We explore real-world corporate case studies and provide a practical toolkit of proven safeguards you can build into your own process to protect your decisions from these human flaws:

  • Discover how biases like overconfidence and loss aversion led to multi-billion dollar mistakes at Hewlett-Packard and Kodak.
  • Learn to use pre-mortem analysis to confront potential flaws and risks before a project even begins.
  • Understand how groupthink and confirmation bias contributed to the tragic Boeing 737 MAX crisis, and how red team reviews can build institutionalized skepticism.
  • Implement a decision journal to track your assumptions over time, helping you spot your own recurring biases.
  • Break free from anchoring bias with scenario diversification, a technique used to explore a wider range of outcomes beyond a single, comfortable forecast.

This is a must-listen for any finance professional looking to move beyond the spreadsheet and ensure their strategies are robust and resilient.

Transcript

You know that feeling? You’ve spent like days building this perfect financial model. Everything’s crunched. The data supports all the assumptions. And yet when it comes down to the actual decision, something just feels, I don’t know, a bit off. Yeah, I know exactly what you mean. It happens a lot in corporate finance, doesn’t it? It really does. And it’s this weird paradox because the numbers themselves, they don’t decide anything. People do. Exactly. And people, well, we all have biases, whether we realize it or not. They’re just built in.

(…)

So that’s exactly what we’re diving into today, how our own human psychology, these cognitive biases, can quietly or sometimes maybe not so quietly mess with even the most careful financial analysis. And this isn’t just some academic idea. It really hits everything, right? From big capital allocation decisions down to budgeting, forecasting, the works. Absolutely. And the source material you’ve gathered for this, it’s fascinating. Yeah. Real companies, real situations, and the psychological insights behind them.

(…)

Yeah, some great stuff there. So our mission today really is to unpack the biases that most often sneak into finance departments. We want to show you how they actually play out in the wild with major companies. And give people tools to fight back. Precisely. And most importantly, give you practical ways to spot and counter them in your own work. Think of it as a shortcut to understanding those hidden human elements in financial strategy. OK, good. So let’s lay the groundwork. What are cognitive biases exactly? Well, fundamentally, they’re just mental shortcuts. Our brains use them to process tons of information really quickly. Useful in everyday life, I guess, like snap judgments. Totally useful most of the time. Yeah. But, and this is the crucial part for finance, those same shortcuts can turn into really costly errors. They quietly twist how we see the data, even when we’ve done super thorough analysis. And the tricky part is you often don’t even see them happening, right? They’re kind of invisible. That’s the kicker. They’re often invisible and worse. They can feed off each other, creating these compounded blind spots you didn’t even know were there.

(…)

OK, so what are some of the usual suspects we should be aware of? The common biases. Right. Our sources point to a few key ones that pop up again and again in finance. First, there’s overconfidence bias. The classic “I’ve definitely got this right” feeling. Exactly. Overestimating how accurate your own analysis or forecasts actually are. Then you’ve got anchoring bias. That’s sticking too much to the first number you see. You got it. Giving way too much weight to that initial piece of info. Like last year’s results become this immovable baseline. Even if the whole market’s changed, it anchors your thinking. I can see how that would happen. Then there’s loss aversion. This one’s powerful. It’s basically fearing losses much more than you value making equivalent gains. So playing it too safe. Often, yeah. It leads to overly cautious decisions that might actually be suboptimal in the long run. And finally, confirmation bias. Looking for proof you’re right and ignoring anything that says you’re wrong. Precisely. Actively searching for and really favoring information that confirms what you already believe, while sort of unconsciously tuning out contradictory evidence. Wow. OK. And these aren’t just minor psychological quirks. They can seriously warp everything. Evaluation models, scenario planning, capital budgeting, the big stuff. So we know what they are now in theory, but spotting them in the heat of the moment when maybe millions are on the line, that feels like the real challenge. Can you give us some examples of how these actually manifest in real corporate finance decisions?

(…)

Absolutely. Let’s connect the dots. In capital allocation, for instance, that overconfidence bias. It can lead executives to greenlight projects based on, frankly, overly rosy return assumptions. The infamous pet project. Exactly. Driven more by someone’s strong belief than by the cold, hard numbers. And often they just never hit their financial targets, just draining resources. OK. What about forecasting? Well, with forecasting, anchoring bias is huge. Last year’s actual numbers become the anchor for this year’s projections, even if market conditions are completely different. So instead of a fresh look. You just get tiny tweaks, marginal adjustments. You might totally miss a major shift that’s happening. Right. And budgeting. You mentioned loss aversion there. Yeah. During budgeting, loss aversion can make finance teams really, really hesitant to cut funding for underperforming divisions or products. Because it feels like admitting failure. Exactly. It’s painful to admit something isn’t working and cut your losses. So that reluctance means you keep pouring good money after bad instead of moving those resources somewhere more productive. It really paints a picture. Talking about it abstractly is one thing, but the sources you shared had some really stark, real-world examples. That’s where it hits home, I think. Oh, definitely. These case studies show the tangible consequences. Where should we start? Maybe Hewlett-Packard. Yeah. The Autonomy acquisition. Yeah, that was something. Right. Hewlett-Packard’s acquisition of Autonomy in 2011.

(…)

HP buys this UK software firm for $11 billion. OK, big number. But then, later, they have to write off $8.8 billion of that value. $8.8 billion. Just poof. Gone. Our sources really point to this as a textbook case of overconfidence, maybe by the leadership, and definitely confirmation bias. The deal went ahead, even though there were apparently pretty significant warning signs about Autonomy’s accounting, its market position. Makes you wonder how those flags got missed or ignored. Doesn’t it? Then there’s Kodak. A really painful one. The digital camera. They invented it, right? They did back in 1975, but they clung to their film business for decades. Why? Yeah. Loss aversion, fear of losing the massive profits from film and status quo bias. Just inertia, really. So they couldn’t bring themselves to cannibalize their own cash cow, even though the future was staring them in the face. Pretty much. They didn’t shift capital aggressively enough towards digital until it was far too late. They feared the loss more than they saw the opportunity.

(…)

OK, what about Boeing? That’s a more recent and tragic example. Yeah, the Boeing 737 MAX crisis. The roots go back to design decisions. Reports strongly suggest that internal decision-making was seriously clouded by groupthink, everyone agreeing, maybe too easily, and confirmation bias. Meaning, safety concerns were? Downplayed. Yeah. Or maybe even ignored. All apparently in the push to keep production schedules on track and maintain their market position against Airbus. The human cost there was obviously devastating, and the financial fallout immense. Terrible. And there was Microsoft and Nokia, too. Right. Microsoft’s acquisition of Nokia’s devices division in 2014. Microsoft paid $7.2 billion for Nokia’s phone business. I remember that. They were trying to get into the phone hardware game. Exactly. But within just two years, they wrote down basically the entire investment. The entire thing. Yep. Our sources suggest this shows signs of anchoring bias, maybe anchored to Nokia’s past glory or an initial strategic vision. And certainly a massive overestimation of the synergies they thought they could get. They just never materialized. Another huge writedown.

(…)

And one more quick one. Quibi. Remember Quibi? Oh yeah. The short-form video thing launched with a huge splash. Huge splash, right. They raised an incredible $1.75 billion, launched in 2020. And disappeared almost immediately, didn’t it? Shut down just six months later. It’s just a really stark example of, again, overconfidence bias from the founders and backers and maybe insufficient scenario planning. Did they really stress test the downside scenarios for that capital allocation? Billions raised and burned in six months. Wow. It’s quite a list. What really jumps out from all these stories, it wasn’t necessarily that the spreadsheets were wrong or the raw data was faulty. It was the human layer, wasn’t it? The interpretation, the decision process itself, these biases were working behind the scenes. That’s the common thread. The human element twisted the lens through which the numbers were viewed. So this all sounds pretty negative, these biases. But I wonder, is there ever a case where maybe a little bit of overconfidence could be good, like to fuel innovation, or maybe loss aversion stops truly reckless bets? That’s a really insightful question. It’s complex, isn’t it?

(…)

Biases are generally framed as errors, especially in a field like finance that values rationality. Right. But maybe extreme risk aversion could be seen as loss aversion preventing a disaster. Or maybe some overconfidence is needed to pursue a truly disruptive idea that looks crazy on paper initially. So it’s not about becoming robots, but managing the downsides. Exactly. The goal isn’t to eliminate intuition or conviction, but to put guardrails in place, which leads us to the really practical part. How can people actually do that? How can finance professionals or really any decision maker protect their choices from being hijacked by these biases? Okay, yeah. What are the tactics? Well, the sources we looked at highlighted four proven techniques that seem really effective. First up is pre-mortem analysis. Pre-mortem, like before it’s dead? Kind of. Before you even start a project, before you commit the resources, you get the team together and imagine it has completely failed, an utter disaster. Okay, sounds a bit depressing. But powerful. Then you work backwards. Why did it fail? What went wrong? Everyone brainstorms potential reasons. It forces you to confront risks and downsides that optimism might otherwise gloss over. It surfaces potential flaws really early. Huh. I like that. Forces you to be pessimistic for a bit. Yeah.

(…)

What’s next? Second, keep decision journals. Like a diary, but for decisions. Pretty much. At the moment you make a significant decision, write down your rationale. What were your assumptions? What data did you rely on? What were you expecting? And then look back later. Exactly. Reviewing these journals over time can reveal patterns in your thinking. You might notice, wow, I consistently overestimate market growth. Or I always seem to anchor on the first proposal. It helps you spot your own recurring biases. Okay. That requires some discipline, but I can see the value. Third? Third, red team, blue team reviews. This is more organizational. You assign a specific group, the red team, whose sole job is to actively challenge the assumptions. Poke holes in the analysis and stress test the models developed by the main project team, the blue team. So basically, institutionalized skepticism, a devil’s advocate team. Precisely. It builds critical challenge right into the process, making it harder for confirmation bias or groupthink to take hold. Makes sense. And the fourth one? Fourth is scenario diversification.
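A decision journal doesn’t need special software; the episode’s advice is just to capture the rationale, assumptions, and expectations at decision time, then revisit them. Here’s one minimal way that could be sketched in Python (the structure and every field and figure are illustrative, not from the episode):

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class DecisionEntry:
    """One decision-journal record, written at the moment of the decision."""
    decision: str                 # what was decided
    rationale: str                # why, in your own words
    assumptions: List[str]        # the beliefs the decision rests on
    expected_outcome: str         # what you expect to happen, and by when
    decided_on: date = field(default_factory=date.today)
    actual_outcome: str = ""      # filled in later, at review time

journal: List[DecisionEntry] = []

# Record a (hypothetical) decision as it is made
journal.append(DecisionEntry(
    decision="Approve $2M capex for a new production line",
    rationale="Forecast assumes strong demand for product line X",
    assumptions=["demand grows 15% a year", "input costs stay flat"],
    expected_outcome="Payback within 3 years",
))

# Later review: compare expectations with what actually happened,
# looking for recurring patterns (e.g. always overestimating growth)
for entry in journal:
    print(entry.decided_on, "|", entry.decision, "->", entry.expected_outcome)
```

The point is less the tooling than the discipline: because the assumptions are written down before the outcome is known, the later review can’t be quietly rewritten by hindsight.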

(…)

Instead of creating just one forecast or relying on a single most likely scenario. Which might be anchored. Right. Instead, you develop multiple forecasts using wildly different starting assumptions.

(…)

What if interest rates double? What if our main competitor folds? What if demand drops 50%? Explore the extremes, not just the comfortable middle. Yeah. Running these very different scenarios helps break the hold of any single anchor point and gives you a much better sense of the potential range of outcomes and the real risks involved. Those all sound really practical. Pre-mortem, decision journal, red teaming, diversifying scenarios. They are. And they work. And I guess the big takeaway from these techniques, it’s not just about trying harder to be unbiased in the moment, is it? It’s about building these checks and balances into the system. That’s absolutely crucial. It shifts bias mitigation from relying purely on individual willpower, which is unreliable under pressure, to making it a structured integral part of how decisions get made. It becomes process, not just personality. Okay.
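To make the scenario-diversification idea concrete, here’s a small Python sketch, with purely illustrative figures, that evaluates a simple NPV model under deliberately different assumption sets (including the extremes mentioned above) instead of a single “most likely” forecast:

```python
def npv(cash_flows, rate):
    """Net present value of yearly cash flows at a given discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

base_flows = [100, 110, 120, 130]  # projected yearly cash flows ($k), illustrative

# Wildly different starting assumptions, not small tweaks to one anchor
scenarios = {
    "base case":        {"rate": 0.08, "demand_mult": 1.0},
    "rates double":     {"rate": 0.16, "demand_mult": 1.0},
    "demand drops 50%": {"rate": 0.08, "demand_mult": 0.5},
    "competitor folds": {"rate": 0.08, "demand_mult": 1.3},
}

results = {
    name: npv([cf * p["demand_mult"] for cf in base_flows], p["rate"])
    for name, p in scenarios.items()
}

# Seeing the full spread, worst to best, breaks the pull of a single forecast
for name, value in sorted(results.items(), key=lambda kv: kv[1]):
    print(f"{name:18s} NPV = {value:8.1f}")
```

The output is a range of outcomes rather than one number, which is exactly what undercuts the anchor: the decision gets judged against the spread, not against the comfortable middle case.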

(…)

So, bringing this all together. It’s clear that bias in corporate finance isn’t some fuzzy academic idea. It’s real. It has measurable effects on where money goes, how companies plan. Huge impacts. Capital allocation, forecasting accuracy, strategic planning – it touches everything. And as we saw with the case studies, the consequences aren’t small change. They can be absolutely massive derailments. Yeah, billions of dollars in some cases. Right. So it seems pretty clear that the finance professionals who really stand out, who truly excel, are the ones who get this. They don’t just rely on their spreadsheets. They proactively build these kinds of safeguards against bias into their everyday work. They’re protecting their decisions and their organizations from these very human flaws, making sure things hold up under real-world pressure, not just in the model.

(…)

Couldn’t have said it better myself. It’s about ensuring robustness against our own psychology. So, a final thought for everyone listening. As you go about your own work this week, maybe you’re modeling that big acquisition. Maybe you’re just reviewing a departmental budget or even looking at a personal investment. 

(…)

Take a second. Ask yourself what human factors, what potential biases might be subtly shaping your choices right now? And maybe more importantly, what’s one safeguard? Just one of those techniques we discussed that you could start building into your own process to make your decisions just that little bit more robust, a little less susceptible to those hidden influences.