Apple co-founder says the Apple Card algorithm gave his wife a lower credit limit
NEW DELHI: Apple Inc co-founder Steve Wozniak has joined an online debate accusing the algorithm behind the iPhone maker's credit card of gender discrimination, adding to the scrutiny of the newly launched Apple Card.
The criticism started on Thursday, after entrepreneur David Heinemeier Hansson railed against the Apple Card in a series of Twitter posts, saying it gave him 20 times the credit limit his wife received.
The long-awaited titanium credit card, part of Apple's broader effort to earn more revenue from services after years of heavy dependence on iPhone sales, was launched in August in partnership with Goldman Sachs Group Inc.
In an email, Goldman said Apple Card applicants were evaluated independently, based on income and creditworthiness, taking into account factors such as personal credit scores and personal debt.
It was possible for two family members to receive significantly different credit decisions, the bank said, but added: "We have not, and will not, make decisions based on factors such as gender."
Hansson, the creator of the Ruby on Rails web application framework, did not disclose any specific income-related information for himself or his wife, but tweeted that they filed joint tax returns and that his wife had a better credit score.
On Saturday, Wozniak weighed in with a similar experience, saying he had been given 10 times the credit limit on the card that his wife received.
"We have no separate bank or credit card accounts or any separate assets," Wozniak said on Twitter, in response to Hansson's original tweet.
However, he added, it is hard to reach a human for a correction: such is big tech in 2019.
The New York Department of Financial Services said it was opening an investigation into Goldman Sachs' credit card practices.
"New York law prohibits discrimination against protected classes of individuals," Linda Lacewell, the superintendent of the New York State Department of Financial Services, wrote in a blog post.
That means an algorithm, like any other method of determining creditworthiness, cannot result in disparate treatment based on individual characteristics such as age, creed, race, color, sex, sexual orientation, or national origin, she added.
"We know the question of discrimination in algorithmic decision-making also extends to other areas of financial services," she wrote.
Apple did not immediately respond to a request for comment from Reuters on Sunday.