Apple’s new credit card is popular and easy to use. Claims of gender bias have raised a red flag with New York State authorities.
By Chand Bellur
November 10, 2019 at 2:01 p.m. PST
Apple Card May Have Gender Bias
The Apple Card offers a remarkably streamlined application process. Simply open the Wallet app, answer a few questions, and you can be approved within minutes. Approved customers can use the digital card immediately. A gorgeous titanium credit card arrives in the mail a few days later.
Not everything goes smoothly for all applicants. David Heinemeier Hansson, the well-known creator of Ruby on Rails, complained that his wife was denied a credit line increase despite having a better credit score than his own. To him, the only plausible conclusion was that the algorithm is gender biased.
Hansson tweeted the following observation:
“My wife and I filed joint tax returns, live in a community-property state, and have been married for a long time, yet Apple’s black box algorithm thinks I deserve 20x the credit limit she does.”
The tweet went viral enough to attract the attention of the New York State Department of Financial Services. The state regulator is opening an investigation into the Apple Card's creditworthiness algorithm.
New York State Department of Financial Services to Investigate Apple Card Algorithm
We live in a high-tech reality where government agencies are put in the awkward position of regulating algorithms. The New York State Department of Financial Services announced on Saturday that it would launch an investigation into the Apple Card's creditworthiness algorithm.
Although Apple has yet to respond, Andrew Williams of Goldman Sachs, the card's issuing bank, contends that there's no bias involved in the process:
“Our credit decisions are based on a customer’s creditworthiness and not on factors like gender, race, age, sexual orientation or any other basis prohibited by law.”
In addition to Hansson, Apple co-founder Steve Wozniak also faced problems with the Apple Card. His credit limit is ten times that of his partner.
The investigation has yet to begin, but the DFS has its work cut out for it. Finding specific evidence of intentional gender bias in an algorithm is a difficult task: a model that never sees gender directly can still produce biased outcomes if it relies on inputs that correlate with gender, such as shopping patterns or employment history. In the end, this issue could simply be a coincidence. Only a thorough examination of the data and code will reveal the truth.
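To give a sense of what such an examination involves, here is a minimal, hypothetical sketch in Python of a first-pass disparity check an auditor might run. The table, column names, and figures are invented for illustration; a real audit would be far more extensive and would control for income, credit history, and other legitimate factors before drawing any conclusion.

```python
# Hypothetical first-pass disparity check on credit-limit decisions.
# All data below is invented for illustration only.
import pandas as pd
from scipy import stats

# Invented sample of approved applications (dollar credit limits).
applications = pd.DataFrame({
    "gender":       ["F",  "M",  "F",  "M",  "F",  "M",  "F",  "M"],
    "credit_score": [810,  790,  760,  755,  820,  800,  700,  710],
    "credit_limit": [5000, 20000, 4000, 15000, 6000, 25000, 3000, 12000],
})

# Summarize the limit distribution for each group.
print(applications.groupby("gender")["credit_limit"].describe())

# Mann-Whitney U test: could the two groups' limits plausibly come
# from the same distribution?
f_limits = applications.loc[applications.gender == "F", "credit_limit"]
m_limits = applications.loc[applications.gender == "M", "credit_limit"]
u_stat, p_value = stats.mannwhitneyu(f_limits, m_limits,
                                     alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_value:.3f}")
```

Even this toy example illustrates the core difficulty: a raw gap between groups is suggestive, not conclusive. The gap only becomes evidence of bias once legitimate explanatory factors have been examined and ruled out.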