
The Dangers of Bias in Times of Uncertainty

May 7, 2020

We’re moving into a period of unprecedented uncertainty, a time in which our government has communicated its commitment to taking a data-driven approach to managing our health, economy, and general well-being through the COVID-19 pandemic.

It may be an understatement to say that interpretations of COVID-19 data have conflicted depending on the source. Further, the quality of the data that is having such a profound impact on our lives can be called into question. Here are a few examples, with a sketch after the list showing how a couple of them can be quantified:

  • Existence – How much are cases being overstated as a result of presumed cases?
  • Completeness – How much are cases being understated as a result of insufficient testing availability and asymptomatic carriers?
  • Timing – How is the delay in reporting impacting our assessment of the curve?
  • Presentation – Is the data being presented to end users in a way that is understandable, or one that distorts reality?
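Some of these dimensions can be measured directly rather than debated. Below is a minimal sketch in Python, using an invented table of daily case reports (the column names and figures are hypothetical, not real COVID-19 data), that quantifies two of the dimensions above: the share of presumed cases (existence) and the lag between occurrence and reporting (timing).

```python
import pandas as pd

# Hypothetical daily case reports; column names and numbers are invented for illustration.
reports = pd.DataFrame({
    "report_date": pd.to_datetime(["2020-04-01", "2020-04-02", "2020-04-03"]),
    "event_date":  pd.to_datetime(["2020-03-25", "2020-03-27", "2020-03-30"]),
    "status":      ["confirmed", "presumed", "confirmed"],
    "new_cases":   [120, 95, 143],
})

# Existence: how much of the headline count rests on presumed rather than confirmed cases?
presumed_share = (
    reports.loc[reports["status"] == "presumed", "new_cases"].sum()
    / reports["new_cases"].sum()
)

# Timing: how far behind the actual events is the reporting?
reporting_lag_days = (reports["report_date"] - reports["event_date"]).dt.days

print(f"Share of reported cases that are presumed: {presumed_share:.1%}")
print(f"Mean reporting lag: {reporting_lag_days.mean():.1f} days")
```

Neither number gives you the “right” answer on its own, but tracking them over time tells you how much weight the underlying data can bear.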

Bias is a preference or inclination, especially one that inhibits impartial judgment. Questions surrounding data quality increase the opportunity to introduce bias, and a lack of precedent makes the impact of that bias more difficult to quantify.

Financial institutions face these types of data quality and interpretation dilemmas daily, and they are exacerbated by the current state of our economy. For example, are skip-a-pay programs the lifeline borrowers need to stay above water, or are they simply delaying the inevitable? In many cases, a skilled analyst could tell either story using the same data set.

So how do we avoid bias in data-driven decision making? We do so by striving for independence, leveraging experience, and validating and back-testing our conclusions.

Independence

Independence refers to the separation of the analytics department from parties that have a financial interest in the results of those analytics. For example, if you ask the Director of Payments to evaluate your credit card portfolio, you’ll most likely sacrifice some of the objectivity of that analysis, because their job depends on the success or failure of the very thing they are evaluating: the credit card portfolio.

Independence can be achieved most directly by working with a third party. It’s important to consider who is interpreting the output and driving the story. Licensing software to drive internal conclusions brings a different level of independence than having conclusions reached by a third party.

The next best thing is maintaining independence in fact (a state of mind that permits the provision of an opinion without being affected by influences that compromise professional judgment) through an internal culture of objectivity. Create quantifiable, measurable goals that reward long-term success rather than goals that can be manipulated through aggressive storytelling.

Experience

People learn by making mistakes. In the pursuit of avoiding bias, there are few substitutes for the experience of having fallen victim to these biases before. Experience also generally brings the ability to distinguish conclusions that stem from organic findings from conclusions that are more likely the result of low-quality information (data).

There is often a disconnect between those with experience in business outcomes and those with experience in the data. Collaboration and constructive criticism are both key here. Those with experience in business outcomes should trust the output of their analytics while applying professional skepticism, questioning results that don’t pass the smell test.

Those with experience in the data should avoid statements like “well, that’s what the data says” and use the team’s feedback to verify the results of the analysis.

Validation, Testing and Improvements

There is a reason folks refer to it as the Analytics Journey: analytics is not a destination. Even well-thought-out models often produce imperfect suggestions. The same independence, collaboration, and experience that went into developing your model should go into testing its results.

Maintaining humility through this process will allow you to either revisit or fine-tune the structure and assumptions used in your models to improve the reliability of your outcomes.
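As a concrete example of back-testing, here is a minimal sketch, again in Python and with invented numbers rather than any institution’s actual methodology. It compares a model’s predicted monthly charge-off rates against realized ones; a persistently one-sided error is exactly the signal that the model’s structure or assumptions, not just its inputs, need revisiting.

```python
import numpy as np

def backtest(predicted: np.ndarray, actual: np.ndarray) -> dict:
    """Compare model predictions against realized outcomes."""
    errors = actual - predicted
    return {
        "mean_error": errors.mean(),              # systematic bias: over- or under-prediction
        "mean_abs_error": np.abs(errors).mean(),  # typical miss, regardless of direction
    }

# Hypothetical monthly charge-off rates (%): model prediction vs. what actually happened.
predicted = np.array([1.2, 1.3, 1.1, 1.4])
actual    = np.array([1.5, 1.7, 1.6, 1.9])

print(backtest(predicted, actual))
# A consistently positive mean_error here would suggest the model understates risk.
```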

The saying “past performance is not necessarily indicative of future results” is most true when the circumstances influencing those results are changing. Drawing reliable, unbiased, data-driven conclusions takes independence and experience. Maximizing your likelihood of a successful long-term data strategy requires the humility to use the feedback gathered through testing to improve as you move through your data journey.


Dan Price, CPA, CFA

President

2020 Analytics

