Facts vs. Experience: Are We Trusting Data More Than People?

By Meghna Sethuraman
On October 29th, 2018, Lion Air Flight 610, a Boeing 737 MAX, crashed shortly after takeoff; less than five months later, Ethiopian Airlines Flight 302, another 737 MAX, went down as well. Together, the two crashes killed 346 people and caused an estimated $60 billion in financial damage. They culminated in thousands of canceled orders and even led to the firing of CEO Dennis Muilenburg. Having breached the trust of millions in Boeing's engineering and safety protocols, the crashes continue to linger to this day, as seen in the Alaska Airlines door plug incident, after which the 737 MAX's reputation suffered further damage.
How is any of this relevant to statistics and data? The 737 MAX's design gave it a tendency to tilt its nose upward, raising the risk of a "stall". To fix this issue, Boeing introduced MCAS, the Maneuvering Characteristics Augmentation System: software that used sensor data to measure how steeply the plane was climbing and then automatically adjusted its nose downward. Boeing trusted this system, reasoning that stalls are rare events and that, on the rare occasion one did occur, automation would be unmistakably more reliable than human reaction. But MCAS was contingent on only one sensor. When that sensor malfunctioned and reported a falsely steep climb, the system repeatedly forced the plane's nose down, crashing it into the sea and causing those casualties.
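The fragility described above can be made concrete with a minimal sketch. This is emphatically not Boeing's actual flight software; the function names, the 15° trigger threshold, and the 5° disagreement limit are all hypothetical, chosen only to illustrate the difference between acting on a single sensor and cross-checking two.

```python
# Illustrative sketch only: why blind trust in one angle-of-attack (AoA)
# sensor is fragile, and how cross-checking two sensors can catch a fault.
# All thresholds and names here are hypothetical, not real MCAS logic.

TRIGGER_DEG = 15.0       # hypothetical "too steep" threshold
DISAGREEMENT_DEG = 5.0   # hypothetical allowed gap between two sensors

def single_sensor_command(aoa):
    """Trust one sensor blindly: a faulty high reading commands nose-down."""
    return "nose_down" if aoa > TRIGGER_DEG else "no_action"

def cross_checked_command(aoa_left, aoa_right):
    """Compare two sensors; if they disagree, defer to the crew instead."""
    if abs(aoa_left - aoa_right) > DISAGREEMENT_DEG:
        return "alert_crew"  # suspect data: do not act on it automatically
    average = (aoa_left + aoa_right) / 2
    return "nose_down" if average > TRIGGER_DEG else "no_action"

# A stuck sensor reads 22 degrees while the plane actually flies at 3:
print(single_sensor_command(22.0))       # acts on the bad reading
print(cross_checked_command(22.0, 3.0))  # flags the disagreement instead
```

The design point is the one the essay makes: the single-sensor version turns one bad measurement directly into a control action, while the cross-checked version treats disagreement as a signal to hand judgement back to a human.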
The Boeing crashes show us that numerical authority does not automatically equal truth. They illustrate what can truly hold us down: automation that overrides pilots who feel the need to correct the plane. They also highlight the importance of context, the fact that pilots intuitively understand how planes behave and feel the urge to take control, an instinct that clashes with the reassurance that data and automation will correct things better. Every day, individuals trust data for advice on health, fitness, travel, and weather without thinking twice about the information they receive. The real question is why humans behave this way, and when the right time is to trust ourselves.
We trust this empirical information because we assume the numbers are factual, measurable, and precise. We tend to think that data carries no emotion, opinion, or personal interest, and is therefore not open to interpretation. Even visuals like charts and graphs lend a sense of scientific authority, encouraging individuals to assume that data reflects "reality" rather than interpretation. Because these statistics are purely numerical, it becomes difficult to argue against them, leaving us vulnerable and critically disengaged. This unquestioned belief in their precision, however, can obscure the assumptions and limitations hidden beneath the surface.
One of Tim Harford's central ideas in his book "The Data Detective" is the practice of combining data with lived experience. He emphasizes the importance of the context behind data, the stories that give meaning to numbers, and the reality that data without human insight can mislead. Often, people generalize that human judgement is clouded by "fear", "stress", or "bias", and that humans make mistakes, especially under pressure. They argue that personal experiences differ, leading to conflicting perspectives, or that emotion will always override logical reasoning. While this is partly true, failing to consider the story behind an algorithmic system before analysing it renders the analysis incomplete and inaccurate.
As a result of this excessive trust, issues such as automation bias and the neglect of intuition arise. Automation bias is the tendency to trust computer systems fully over human judgement: to assume technology is more accurate than people and even to ignore evidence that contradicts what the data or automated output displays. Eventually, humans stop double-checking these automated decisions, and errors go unnoticed merely because "the system said so". Society's ability to think critically slowly dwindles, and even a small error within a system can prove fatal when experts will not intervene.
Excessive reliance on statistical evidence also leads to the dismissal of genuine professional experience, "gut" feelings, and even emotional signals. This intuition is vital precisely because it is fast and adaptive in uncertain environments. Experts such as pilots, doctors, and engineers develop subconscious expertise through repetition and often sense problems before the data fully detects and reflects them. When this intuition is neglected, experienced professionals hesitate to act against failing system outputs, their confidence declines, and their training is undervalued, as seen in the 737 MAX crashes.
That said, claiming that lived experience is always the better source of information would be equally misguided. There are instances where data should be trusted over personal experience, and it is vital to know when it can be. In fields such as business and finance, historical data is constantly used to analyse trends and predict future ones. While imperfect, data can be trusted here because it works in a way that individual intuition and perception could rarely mimic: human experience is comparatively limited and often anecdotal, leading to impulsive decisions. Whether a technological framework is reliable depends on factors such as the quality, consistency, timeliness, and even traceability of its data.
Returning to the question "Are we trusting data more than people?", the answer is yes. In modern society, data is embedded in our everyday lives, whether in health, finance, or technology, so it is difficult not to place faith in these statistics. The issue is not unreliable data but the manner in which we accept it, often without context or ethical reflection. Individual perception and statistical evidence should not be thought of as two separate frameworks, but as a single system that can guide humanity into the future.