Does Goldman Sachs’ online bank Marcus have the same gender problem as the Apple Card?

Apple and Goldman Sachs face accusations that the algorithms behind the companies’ iPhone-based joint credit card can discriminate against women. But the Apple Card may not be the only Goldman product ripe for allegations of gender bias.

Marcus, the online banking platform the Wall Street investment bank launched a few years ago to serve the needs of middle-income millennials, analyzes the personal information that goes into its lending algorithm in the same way the Apple Card does.

That is no surprise. Goldman developed the technology used to approve borrowers for the tech giant’s Apple Card, which launched in mid-August. But problems soon arose. Technology entrepreneur David Heinemeier Hansson tweeted that he had been offered a credit limit 20 times higher than the one his wife received, despite her higher credit score. Even more embarrassing, Apple co-founder Steve Wozniak then tweeted that his wife had encountered a similar problem.

Senator Elizabeth Warren, a presidential candidate, jumped into the fray, claiming that Goldman’s proposed remedy, which asks women who believe they have been discriminated against to contact the bank, has not worked. It should be up to Goldman to explain how its algorithm works, and if that isn’t feasible, “they have to take it out,” Warren said.

New York State is also investigating. Linda Lacewell, superintendent of the New York Department of Financial Services, said in a post on Medium that she would examine whether Goldman’s algorithm violates state laws on bias in the way it makes credit-limit decisions.

“It’s a problem,” said Robert Bartlett, a law professor at UC Berkeley who has researched the issue. “It is clear that there is a legal risk, although it is possible that these credit decisions – if they are ultimately rooted in income and credit scores – are entirely legal.”

Apple Card doesn’t fall far from the loan tree

The controversy comes at a time when a number of tech giants are entering the consumer credit industry. Last week, Google announced it would soon start offering checking accounts.

It also comes as a growing body of research suggests that the algorithms these new lenders use do not eliminate, and in some cases could add to, traditional prejudices against minorities and other groups.

Earlier this month, Bartlett and four Berkeley economics professors released a revised version of their research paper on bias and fintech lenders. The paper found that lenders relying on algorithms rather than traditional loan underwriting charged African-American and Latino borrowers 0.05 percentage points more in interest per year. Overall, this difference cost minority borrowers $765 million in additional interest per year, the researchers said.
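For scale, here is a minimal back-of-the-envelope sketch in Python of what a 0.05-percentage-point gap means in dollars. The only inputs are the two figures quoted above; the implied aggregate balance and the $300,000 example loan are illustrative assumptions, not numbers taken from the paper.

```python
# Rough sanity check of the figures quoted above. Assumption: the $765 million
# is simply the rate gap applied to the total outstanding balance held by the
# affected borrowers; the paper itself works from loan-level data, so this is
# only an illustration of the arithmetic.

rate_gap = 0.05 / 100            # 0.05 percentage points, as a decimal
extra_interest_per_year = 765e6  # $765 million per year, per the study

# Aggregate balance implied by those two numbers
implied_balance = extra_interest_per_year / rate_gap
print(f"Implied affected balance: ${implied_balance / 1e12:.2f} trillion")

# The same gap on a single, hypothetical $300,000 mortgage
single_loan = 300_000
print(f"Extra interest on a $300,000 loan: ${single_loan * rate_gap:.0f} per year")
```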

“The problem is not exclusive to Apple,” said Adair Morse, one of the paper’s co-authors. “Apple and Goldman aren’t the only ones who have built their algorithms to achieve this exact kind of gender disparate treatment.”



The study focused on mortgages and did not look at Apple Card or Marcus. But the researchers cite Marcus as a lending platform that might encounter the same bias issues documented in their study.

Goldman said potential concerns about bias at Marcus, which has nearly $5 billion in loans outstanding, are unfounded.

“Goldman Sachs does not and never will make decisions based on factors such as gender, race, age, sexual orientation or any other legally prohibited factor in determining creditworthiness,” a Goldman spokesperson said in an emailed statement.

Goldman’s explanation

Goldman maintains that the allegations of bias did not stem from its algorithm, but from a legitimate business decision to only allow individual accounts when applying for a loan.

Marcus, like the Apple Card, does not allow joint borrowers or any form of co-borrower or co-signer on a loan. Unlike the Apple Card, however, Marcus allows individuals to report their total household income on their loan application. It is simply not easy for applicants to find this option.

A reference to household income on Marcus’ website is buried as the sixth item in a drop-down menu of income sources. Household income eligibility is also disclosed in question 24 of the online bank’s FAQ page, which asks, “What types of documents are accepted to prove income?” The third paragraph of the answer says that if household income is included, supporting documentation must be provided. Individual income requires no such verification.

Hansson, who wrote the original tweet that sparked the Apple Card controversy, said his wife was a stay-at-home mom with no direct source of income.

A Goldman spokesperson told CBS MoneyWatch, “We look at an individual’s income and creditworthiness, which includes factors like personal credit scores, how much debt you have and how that debt has been managed. Based on these factors, it is possible for two family members to receive very different credit decisions.”

Indeed, in reporting this story, the journalist and his wife each individually applied for a $40,000 loan from Goldman’s online platform Marcus. The journalist’s wife was offered a $20,000, three-year loan at an annual interest rate of 7.99%. The journalist, a man, had his application rejected.

Single borrowers are at a disadvantage

The biggest problem, experts say, is that offering only individual accounts, based heavily on individual income, would likely result in men being issued higher credit limits with lower interest rates. A 2006 study by the National Community Reinvestment Coalition found that male-and-female joint borrowers “performed more favorably” than male and female borrowers taken alone. Additionally, the NCRC found that individual borrowers were more likely to end up with more expensive subprime loans than joint borrowers.
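To make that mechanism concrete, here is a minimal Python sketch of a credit-limit rule that never sees gender and keys only on individually reported income. The 20% multiplier and the household figures are hypothetical, chosen purely for illustration; this is not Goldman's actual underwriting logic.

```python
# Toy illustration of the mechanism described above: a gender-blind rule that
# sets a credit limit from individually reported income alone.

def credit_limit(individual_income: float) -> float:
    """Hypothetical rule: limit equals 20% of reported individual annual income."""
    return 0.20 * individual_income

# One household with $100,000 of shared income, but only one partner is the
# named earner; the other is a stay-at-home parent with no direct income.
earner_limit = credit_limit(100_000)
stay_at_home_limit = credit_limit(0)

print(f"Named earner:        ${earner_limit:,.0f}")
print(f"Stay-at-home spouse: ${stay_at_home_limit:,.0f}")
```

Under a rule like this, the gap closes only if the application captures household income or allows a joint account, which is exactly the design choice at issue.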

“In general, the fact that joint borrowers are treated differently from single borrowers creates a disparate impact on African-American women, as they are statistically more likely to be single mothers,” said NCRC chief executive Jesse Van Tol.

According to the Berkeley study, it is illegal for lenders to create a lending algorithm that results in a disparate impact on minority groups, even if the decision behind that impact was not intentionally discriminatory. The only exception is if the decision was made for a legitimate business reason, according to the study.

Biased data in, bias out

A source familiar with Goldman’s thinking said the reason for not allowing joint Apple Card accounts is that the account is tied to an individual’s iPhone, and the bank and Apple believed linking the card to two phones could create a cybersecurity risk. Last Wednesday, however, following the controversy, Goldman said it would soon introduce the option for family members to share an Apple Card line of credit.

In the 1970s, banks were criticized for forcing women to sign credit card applications with their husbands. Sarah Harkness, a University of Iowa sociologist who has studied gender and credit issues, said that either requiring women to borrow with a male partner or barring them from doing so can lead to gender bias.

“There is a strong historical component in the recognition of the family unit as a source of financial support,” Harkness said, adding that the biggest problem with lending algorithms is that they tend to reflect the historical biases of a society. “If the algorithm is based on biased credit history data, you will get disparate treatment regardless of which algorithm is used.”
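Harkness’s point can be shown with a small simulation: a scoring rule that never looks at group membership still produces unequal outcomes when the historical data it consumes encodes unequal access to credit. Everything below is invented for illustration: the two groups, the “years of individual credit history” feature, and the seven-year approval threshold.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical population split into two groups (stand-ins for two genders).
group = rng.integers(0, 2, n)  # 0 = group A, 1 = group B

# Historical bias baked into the data: group B was less often extended
# individual credit in the past, so its members have thinner credit files.
years_of_history = np.where(group == 0,
                            rng.normal(10, 2, n),
                            rng.normal(4, 2, n)).clip(min=0)

# A "group-blind" rule: approve anyone with at least 7 years of history.
approved = years_of_history >= 7

for g, name in [(0, "group A"), (1, "group B")]:
    print(f"{name}: approval rate {approved[group == g].mean():.1%}")
```

The rule itself never touches the group label, yet approval rates diverge sharply because the input feature already reflects decades of unequal treatment.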




David A. Albanese