Hidden in ‘Plane’ Sight

In the mathematics community, Abraham Wald is well known for his work in statistical sequential analysis, decision theory and econometrics. But to the rest of the world, he is best known for his work during World War II, when he helped the US military strengthen its planes by studying the damage on the aircraft that made it back to base and reasoning about the aircraft that did not. If you’ve been reading our blog, you’re familiar with our affinity for WWI and WWII tech lore (here and here), and this particular story illustrates a phenomenon that we as designers have often unwittingly encountered, called Survivorship Bias.

Survivorship Bias is “…a cognitive shortcut that occurs when a visible successful subgroup is mistaken as an entire group, due to the failure subgroup not being visible”. When metal helmets were first introduced in World War I, there was a rise in the number of head injuries being reported, making people question the helmet’s actual usefulness. However, the higher number of reported injuries was not evidence that helmets were failing; it reflected the fact that more soldiers were surviving combat and returning, albeit injured, rather than not returning at all.

Image courtesy The Data School
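To see the helmet example in numbers, here is a minimal Python sketch. The fatality rates used below (70% without a helmet, 30% with one) are made-up figures chosen purely for illustration, not historical data; the point is only that when the dead never appear in injury reports, reported injuries can rise even as lives are saved.

```python
# Toy simulation of the WWI helmet statistic (illustrative numbers only).
# Only survivors show up in injury reports, so reports rise even as deaths fall.
import random

random.seed(0)
N = 100_000  # simulated soldiers struck in the head

def tally(p_death):
    """Return (deaths, reported_injuries) for N head strikes at a given fatality rate."""
    deaths = sum(random.random() < p_death for _ in range(N))
    return deaths, N - deaths  # survivors are the only ones who file injury reports

deaths_without, injuries_without = tally(p_death=0.70)  # no helmet (assumed rate)
deaths_with, injuries_with = tally(p_death=0.30)        # helmet (assumed rate)

print(f"Without helmets: {deaths_without:6d} deaths, {injuries_without:6d} reported head injuries")
print(f"With helmets:    {deaths_with:6d} deaths, {injuries_with:6d} reported head injuries")
# Judged by injury reports alone, helmets look worse; judged by the full picture
# (including the dead, who never report), they are clearly better.
```

Run it and the “with helmets” row shows far more reported injuries alongside far fewer deaths, which is exactly the pattern that confused observers at the time.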

Examples of Survivorship Bias are not limited to the past. With the amount of data available today, it’s even more critical to be aware of our cognitive shortcomings, particularly when conducting research, to prevent incorrect conclusions from being drawn. Unlike an airplane, which has clear, inarguable physical boundaries, research can be ambiguous, seemingly endless and reliant on data provided by stakeholders, all of which makes it tough to draw a reliable boundary around it.

Clients will come to designers convinced they know exactly where their problem areas lie. At BRND Studio, we were once faced with such a dilemma. The client was certain they knew the root of their problems and consequently defined the project ‘boundary’ for us: fix issues in the supply chain, specifically in product storage, where extensive product damage was leading to revenue loss. Over the course of interviewing multiple stakeholders and mid-level managers, visiting the stores and warehouses, and mapping the complete supply chain, we realised the story wasn’t adding up. As we veered away from their assumptions, conducted additional interviews to fill the gaps and probed further with the data we had collected, we reached the actual core of the problem: multiple points along the product lifecycle needed to be addressed, starting from product development. We learnt that we must be aware not only of our own biases but also those of our clients, and that it’s imperative to get inputs from stakeholders with different perspectives and functions.

Some stakeholders are more inclined to share positive feedback, focussing on successful outcomes rather than failures. This could be for many reasons, including fear of backlash from leadership. It points to another facet of Survivorship Bias: the human tendency to keep success in the limelight and failures in the dark.

We’ve all heard the Bill Gates, Mark Zuckerberg and Steve Jobs stories of college dropouts who made it big. Media coverage favours the Unicorns (the label itself suggests a mythical quality) and the successes of those who take big risks. We love success stories; they are inspiring, entertaining and fuel future generations to become leaders. What these stories don’t mention is that roughly 90% of startups fail, and that timing, connections and socio-economic background all play a role in the wins we celebrate. Gates, Zuckerberg and Jobs are not the rule; they’re the exception.

In the case of missing data, Survivorship Bias is particularly problematic; it creates an incomplete narrative that can lead to disastrous outcomes in real-world applications. Identifying that data is missing (when you don’t know there is data to be missing) is not easy either; many say that Survivorship Bias is only really visible in hindsight. But the first step in identifying missing data is acknowledging the possibility that your existing data set may not be complete. That acknowledgement will compel you to dig deeper and fill the gaps so the story you tell is a cohesive one.

Survivorship Bias comes naturally to human beings. In a social setting, it elevates stories through exaggeration. In professional settings, it can help clinch a big project or deliver a heavy blow to the business. But Abraham Wald did not have the luxury of using it as a punchline or for glory. Instead, he used his knowledge to uncover a hidden dataset that eventually went on to save millions of lives.

This article was written by Sr. Design Researcher Stuti Agrawal.