
AI can perpetuate racial bias in insurance underwriting

While artificial intelligence (AI) technology has made mortgage underwriting and insurance claims processing faster and easier, it could also unintentionally discriminate against protected classes.

The Casualty Actuarial Society (CAS) this year acknowledged “the potential impact of systemic racism on insurance underwriting, rating and claims practices” in four research papers examining the discriminatory effects algorithmic bias can have when financial institutions use AI for credit scoring, insurance (auto, life, home), and mortgage lending.

“The insurance industry started a discussion after the death of George Floyd and we wanted to make sure actuaries who are responsible for insurance pricing were prepared to be an active participant in the discussion,” Mallika Bender, diversity, equity and inclusion staff actuary at CAS, told Yahoo Money. “In our Methods for Quantifying Discriminatory Effects, we survey different statistical approaches for reviewing models for fairness and ways to mitigate bias with the data, modeling process, and outputs.”
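To give a sense of what such a fairness review can look like in practice, here is a minimal, illustrative sketch (not taken from the CAS papers) of one widely used statistical check: demographic parity, which compares the rate of a favorable model outcome across groups. The function name and the informal 0.8 “four-fifths” threshold are common conventions, not anything specific to the CAS methodology.

```python
# Minimal illustrative sketch of a demographic-parity check:
# compare the rate of a favorable outcome (e.g., approval) across groups.

def demographic_parity_ratio(outcomes, groups, favorable=1):
    """Ratio of the lowest to the highest favorable-outcome rate
    across groups. Values near 1.0 suggest parity; the informal
    'four-fifths rule' flags ratios below 0.8 for review."""
    rates = {}
    for g in set(groups):
        member_outcomes = [o for o, gg in zip(outcomes, groups) if gg == g]
        rates[g] = sum(1 for o in member_outcomes if o == favorable) / len(member_outcomes)
    return min(rates.values()) / max(rates.values())

# Hypothetical example: a model approves 6 of 10 applicants in group A
# and 9 of 10 in group B.
outcomes = [1, 1, 1, 1, 1, 1, 0, 0, 0, 0] + [1] * 9 + [0]
groups = ["A"] * 10 + ["B"] * 10
print(round(demographic_parity_ratio(outcomes, groups), 2))  # 0.67 -- below 0.8
```

A ratio of 0.67 would flag the hypothetical model for closer review, which is the kind of screening step the CAS paper surveys alongside bias-mitigation techniques.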

The consequences of algorithmic bias could mean that Black and Hispanic individuals end up paying more for insurance and experience debt collection at higher rates, among other financial repercussions, because of the data AI uses to make decisions.


“You see this on commercials – ‘we gave it some AI and now you can trust the data,’ but without any explanation on what steps it actually took to get to an outcome,” Sonia Gipson Rankin, a professor with a focus on race, technology and the law at The University of New Mexico School of Law, told Yahoo Money. “AI perpetuates bias through codifying existing bias, unintended consequences, and nefarious actors.”


Zip code location data can perpetuate bias

“We already know that the disparity we see today is because of historical practices, like redlining in housing led to people living in certain communities, which led to the devaluation of certain homes and therefore [AI] codifies existing biases in place,” Professor Rankin said. “The same thing happens when you look at criminal justice data — it has been codifying over-policing in certain communities and that's the second way that AI perpetuates bias through unintended consequences.”

With Freddie Mac offering hybrid appraisals using AI for refinances, there are concerns about what this means for Black and Hispanic homeowners who were denied refis at higher levels during the pandemic housing boom.

Similarly, zip code or geographic location is an important factor determining car and homeowners insurance premiums. On one hand, a location can help determine weather-related losses from natural disasters and the likelihood of auto accidents that typically happen close to home.

“While the use of geographic location is supported by the correlation to loss, location may also be correlated with race due to ongoing societal segregation,” CAS research found. “Major highways tend to divide Black and white sections of large cities, including Buffalo, Hartford, Kansas City, Milwaukee, Pittsburgh, and St. Louis.”

On the other hand, because many communities have been and remain racially segregated, geographic location can also serve as a proxy for race.

Zip code as proxy for race

The impact on homeownership and homeowners insurance is evident in how historically Black neighborhoods were relegated to the most undesirable locations, such as flood zones.

There’s a “disproportionate impact of natural disasters on minority groups…for example, lower-income neighborhoods were more affected by Hurricane Harvey than wealthier ones because they were more concentrated in flood-prone parts of Houston,” CAS research found.

Racial bias in traffic stops leads to bias in auto insurance premiums

In 2017, some major insurers charged drivers in minority neighborhoods as much as 30% more than drivers in other areas with similar accident costs, according to an investigation by Consumer Reports and ProPublica.

Motor vehicle records and driving history are two important factors used to determine auto insurance premiums. More traffic violations generally mean a higher premium.
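To illustrate why record-based rating can transmit stop disparities into prices, here is a purely hypothetical toy sketch of a per-violation surcharge. The base premium and surcharge rate are made-up numbers, not any insurer's actual formula; the point is that citations from pretextual stops would raise the premium the same way driving-related violations do.

```python
# Hypothetical toy rating sketch (not an actual insurer formula):
# each violation on the motor vehicle record adds a flat surcharge,
# regardless of whether the underlying stop was driving-related.

BASE_PREMIUM = 1200.0           # hypothetical annual base rate, in dollars
SURCHARGE_PER_VIOLATION = 0.15  # hypothetical 15% surcharge per violation

def annual_premium(num_violations):
    """Base premium scaled up by a per-violation surcharge."""
    return BASE_PREMIUM * (1 + SURCHARGE_PER_VIOLATION * num_violations)

print(annual_premium(0))  # 1200.0
print(annual_premium(2))  # 1560.0 -- two citations add $360 a year
```

Under this toy schedule, a driver who picks up two citations from stops with no accident or reckless driving still pays 30% more, which is the mechanism the CAS research describes.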

Black drivers are more likely to be pulled over without probable cause during a traffic stop unrelated to a potential accident or reckless driving, according to CAS research, which "may be directly transferred into auto insurance prices through offenses not related to driving."

“These studies are important because a narrative that the Black community is predisposed to criminality has been used to justify more traffic stops for Black drivers compared to the population,” Mustapha Ndanusa, lawyer and managing partner at Davis, Ndanusa, Ikhlas, Saleem, LLP, told Yahoo Money. “The study shows most of these traffic stops were dismissed for lack of evidence, which means more likely it was pretextual and no probable cause for police pullovers.”

Loan denials

Racial discrimination in debt collection and credit reporting

Researchers also found that people in predominantly Black neighborhoods were sued by debt collectors twice as often as their white counterparts in large part due to an inability to pay for lawyers to defend against fraudulent debt claims.

Having a debt in collections on your credit report negatively impacts credit scores and can lead to a denial on a job or housing application.

Such “flaws and inaccuracies in credit reporting contribute to racial bias in credit scores and CBIS [credit-based insurance scores],” CAS research found.

“There needs to be a check and balance for any AI software implemented so it doesn't have built-in equal protection violations,” Professor Rankin said. “We know that there were purposeful bots in social media spaces that were part of a larger cyber warfare attack of disinformation and misinformation. When all of that gets mixed together, have you trained your AI to not go to certain sources for data analysis?”

Ronda is a personal finance senior reporter for Yahoo Money and attorney with experience in law, insurance, education, and government. Follow her on Twitter @writesronda

Read the latest personal finance trends and news from Yahoo Money. Follow Yahoo Finance on Twitter, Instagram, YouTube, Facebook, Flipboard, and LinkedIn.