The war on hidden algorithms that trap people in poverty

Hello, Habr! I'm sharing a translation of a post that explains how a group of lawyers is discovering and fighting the automated systems that deny poor people housing, jobs, and basic services. It covers the American experience, but this problem will soon be just as relevant in Russia, where credit-scoring algorithms are being actively rolled out. And where better to raise questions about the ethics of such systems than among the people who build them?



Introduction



Miriam was only 21 when she met Nick. She was a photographer who had recently graduated from college and was waiting tables; he was 16 years her senior and ran a finance business. He was charming and charismatic, took her on fancy dates, and paid for everything. She quickly fell under his influence.



It all started with one credit card, the only one Miriam had at the time. Nick would run up $5,000 in business purchases on it and quickly pay the balance off the next day. Miriam, who asked me not to reveal their real names for fear of affecting her divorce proceedings, found that Nick's trick boosted her credit score. Having grown up with a single dad in a low-income family, she trusted his financial know-how more than her own judgment. He readily reinforced that belief, telling her she didn't understand finance. She opened several more credit cards for him in her name.



The trouble began three years later. Nick asked Miriam to quit her job to help him with the business, which she did. He told her to go to graduate school and not to worry about adding to her existing student debt. Miriam obeyed again; he promised to take care of everything, and she believed him. Shortly afterward, Nick stopped paying the credit card bills, and Miriam's credit score began to plummet.

And yet Miriam stayed with Nick. They got married and had three children. Then one day FBI agents raided their house and arrested him. A federal judge found Nick guilty of nearly $250,000 in fraud. Miriam discovered tens of thousands of dollars of debt that Nick had taken out in her name. "The day he went to jail, I had $250 in cash, a mortgaged house and car, and three children," Miriam says. "Within a month, I went from being able to hire a nanny and living in a nice house to real poverty."



Miriam experienced what is known as "coerced debt," a form of abuse usually perpetrated by an intimate partner or family member. Economic abuse is a long-standing problem, and digital banking has made it easier to open accounts and take out loans in a victim's name, says Carla Sanchez-Adams, an attorney at Texas RioGrande Legal Aid. In the age of automated credit-scoring algorithms, the consequences can be far more devastating.



Credit scores have been used to assess consumer creditworthiness for decades, but now that those scores are calculated by algorithms, they matter far more: they not only draw on significantly more data, in both volume and variety, but increasingly determine whether you can buy a car, rent an apartment, or get a full-time job. Their pervasive influence means that if your credit is ruined, it is nearly impossible to recover. Worse, the algorithms are owned by private companies that do not disclose how they arrive at their decisions. Victims can slide down a spiral that sometimes ends in homelessness or a return to their abuser.



Credit-scoring algorithms are not the only ones that affect people's economic well-being and access to basic services. Algorithms now decide which children end up in foster care, which patients receive medical care, and which families get access to stable housing. Those of us with means can go through life unaware of any of this. But for low-income people, the rapid growth of automated decision-making systems has created a hidden web of interlocking traps.



Fortunately, more and more civil lawyers are organizing around this issue. Borrowing a playbook from the fight against risk-assessment algorithms in the criminal justice world, they are working to study these systems, build a community, and develop litigation strategies. "Basically every civil lawyer is starting to deal with this stuff, because all of our clients are in some way affected by these systems," says Michelle Gilman, a clinical law professor at the University of Baltimore. "We need to wake up and get training. If we want to be really good, holistic lawyers, we need to be aware of what's happening."



"Will I cross-examine the algorithm?"



Gilman has been practicing law in Baltimore for 20 years. Her work as a civil and anti-poverty attorney has always boiled down to the same thing: representing people who have lost access to basic needs such as housing, food, education, work, or health care. Sometimes that means facing off against a government agency; in other cases, a credit reporting agency or a landlord. Increasingly, the fight for a client's rights involves some algorithm or other.



"For our clients, it happens gradually," she says. "They're entangled in so many different algorithms that bar them from basic services. And the clients may not be aware of that, because many of these systems are invisible."



For people with low incomes, one temporary economic hardship can lead to a vicious cycle that sometimes ends in bankruptcy or homelessness.



Gilman doesn't remember exactly when she realized that some eligibility decisions were being made by algorithms. But when that transition was just beginning, it was rarely obvious. On one occasion, she represented an elderly, disabled client who had inexplicably been cut off from her state-funded home health care. "We couldn't find out why," she recalls. "She was getting sicker, and normally if you get sicker, you get more hours, not fewer."



It wasn't until Gilman and her client were standing in the courtroom in the middle of the hearing that a witness representing the state revealed that the government had just adopted a new algorithm. The witness, a nurse, could not explain anything about it.



"Of course not; they bought it off the shelf," Gilman says. "She's a nurse, not a computer scientist. She couldn't answer what factors go into the system. How are they weighted? What are the expected outcomes?" Gilman's student attorney, who was with her in the law clinic, asked something like, "Oh, am I going to cross-examine the algorithm?"



For Kevin De Liban, an attorney at Arkansas Legal Aid, the change was just as insidious. In 2014, his state also introduced a new system for allocating state-funded home health care, cutting off a number of people who had previously been eligible. At the time, he and his colleagues could not identify the root cause. All they knew was that something had changed. "We could tell that the assessment system had switched from a 20-question paper questionnaire to a 283-question electronic one," he says.



It wasn't until two years later, when an error in the algorithm once again became the subject of legal action, that De Liban got to the heart of the matter. He realized that nurses were telling patients, "Well, the computer did it. It's not me."



"That's what tipped us off," he says. "If I had known in 2014 what I came to know in 2016, I probably would have defended the client better."



A person goes through many systems every day



Since then, Gilman has gained a great deal of experience. Representing clients with a range of problems, she has watched two algorithmic webs emerge and collide. The first consists of credit-reporting algorithms, like the ones that trapped Miriam, which affect access to private goods and services such as cars, homes, and jobs. The second consists of algorithms adopted by government agencies, which affect access to public goods such as health care, unemployment benefits, and child support services.



On the credit-reporting side, the growth of algorithms has been driven by the proliferation of data, which is easier than ever to collect and share. Credit reports are not new, but their reach is far more extensive these days. Consumer reporting agencies, including credit bureaus, tenant-screening companies, and others, gather this information from a wide variety of sources: public records, social media, web browsing, banking activity, app usage, and more. Algorithms then assign people "worthiness" scores, which heavily influence the background checks performed by lenders, employers, landlords, and even schools.
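

To make the opacity concrete, here is a minimal sketch in Python of how a scoring model of this kind might collapse such signals into a single "worthiness" number. Every feature name, weight, and threshold below is invented for illustration; the real models are proprietary, far more complex, and undisclosed, which is precisely the problem the lawyers describe.

```python
# A deliberately toy "worthiness" scorer. All features, weights, and
# thresholds are hypothetical; real credit-scoring models are
# proprietary and their internals are not published.

# Weights a vendor might assign, hidden from the consumer being scored.
WEIGHTS = {
    "missed_payments": -45.0,   # per missed payment on record
    "eviction_filings": -80.0,  # per public eviction filing, even one later dismissed
    "credit_age_years": 12.0,   # per year of credit history
    "utilization_pct": -1.5,    # per percentage point of available credit in use
}
BASELINE = 650.0


def score(record: dict) -> int:
    """Collapse a consumer record into a single opaque number (300-850)."""
    raw = BASELINE + sum(w * record.get(k, 0) for k, w in WEIGHTS.items())
    return int(min(850.0, max(300.0, raw)))


if __name__ == "__main__":
    before = {"credit_age_years": 6, "utilization_pct": 30}
    # One temporary hardship: missed rent produces two missed payments and
    # a public eviction filing, which stays on the record even if the
    # eviction itself never goes through.
    after = dict(before, missed_payments=2, eviction_filings=1)
    print(score(before), "->", score(after))  # prints: 677 -> 507
```

The point of the sketch is that a single public record, weighted invisibly, can outweigh years of good history, and nothing in the output number reveals why it dropped or how to dispute it.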



Government agencies, for their part, turn to algorithms when they want to modernize their systems. The adoption of web applications and digital tools began in the early 2000s and has continued with a shift toward more data-driven automated systems and artificial intelligence. There are good reasons to pursue such changes. During the pandemic, many unemployment benefit systems struggled to cope with the sheer volume of new claims, leading to significant delays. Upgrading these legacy systems promises faster, more reliable results.



But the software procurement process is rarely transparent, and so it lacks accountability. Government agencies often buy automated decision-making tools directly from private vendors. As a result, when systems go wrong, the people affected and their lawyers are left in the dark. "They don't announce it anywhere," complains Julia Simon-Michel, an attorney at Philadelphia Legal Assistance. "It's rarely written in any manuals or documentation. We're at a disadvantage."



The lack of public scrutiny also makes the algorithms more error-prone. One of the most egregious failures happened in Michigan in 2013, when, after a major effort to automate the state's unemployment benefits system, the algorithm mislabeled over 34,000 people as fraudsters. "It caused a massive loss of benefits," says Simon-Michel. "There were bankruptcies and, unfortunately, suicides. It was a complete mess."



Gilman fears that coronavirus-related debts and evictions will be codified into credit scores, making it permanently harder for people to get jobs, apartments, and loans.



Low-income people bear the brunt of the shift toward algorithms. They are the most vulnerable to the temporary economic hardships that get codified into consumer reports, and the ones who need and seek public benefits. Over the years, Gilman has seen more and more cases where clients risk getting caught in a vicious circle. "A person goes through many systems every day," she says. "I mean, that's true of everybody. But the consequences are much worse for the poor and for minorities."



She cites a current case in her legal clinic as an example. A family member lost his job in the pandemic and was denied unemployment benefits because of an automated system failure. The family then fell behind on rent, which led their landlord to sue them for eviction. While the eviction itself is barred by the Centers for Disease Control and Prevention's moratorium, the lawsuit will still be on the public record. Those records could then feed into tenant-screening algorithms, making it harder for the family to find stable housing in the future. Their failure to pay rent and utilities could also hurt their credit scores, with yet more consequences. "If people try to get a cell phone plan, take out a loan, buy a car, or apply for a job, these cascading ripple effects occur," Gilman says.



Every human situation will turn into an algorithm situation



In September, Gilman, who is now at the Data & Society Research Institute, released a report documenting all the algorithms that poverty lawyers might encounter. Called Poverty Lawgorithms, it is designed as a guide for lawyers in the field. The report is divided into specific practice areas such as consumer law, family law, housing law, and public benefits, and it explains how to deal with the issues raised by algorithms and other data-driven technologies within existing legislation.



For example, if a client is denied an apartment because of a low credit score, the report recommends that the lawyer first check whether the data fed into the scoring system is accurate. Under the Fair Credit Reporting Act, reporting agencies are required to ensure the accuracy of their information, but that is not always the case. Disputing any erroneous claims can help restore the client's credit and, with it, access to housing. However, the report acknowledges that existing laws will only go so far. There are still regulatory gaps to fill, Gilman says.



Gilman hopes the report will serve as a wake-up call. Many of her colleagues still don't understand what is going on and aren't able to ask the right questions to uncover the algorithms. Those who are aware of the problem are scattered across the United States, each studying it and fighting the algorithms on their own. Gilman sees an opportunity to connect them and create a broader community of people who can help one another. "We all need to learn more, not just about the law, but about these systems themselves," she says. "In the end, it looks like every human situation is going to turn into an algorithm situation."



In the longer term, Gilman looks to the criminal justice world for inspiration. Criminal defense lawyers "were ahead of the curve," she says: they organized as a community and fought the risk-assessment algorithms used in sentencing, delaying their adoption. Gilman wants civil lawyers to do the same: build a movement to bring more public scrutiny and regulation to the hidden web of algorithms their clients face. "In some cases, the system should probably just be shut down, because there's no way to make it fair," she says.



As for Miriam, after Nick's conviction she left him for good. She and her children moved to a new state, and she contacted a nonprofit that supports survivors of coerced debt and domestic violence. With the organization's help, Miriam took several courses on managing her finances, resolved many of the coerced debts, and learned more about credit algorithms. When she went to buy a car with her father as a guarantor, her credit score just barely reached the required minimum. Since then, her regular car and student-loan payments have been steadily raising her score.



Miriam still has to stay on guard. Nick has her social security number, and they aren't divorced yet. She constantly worries that he will open new accounts and take out new loans in her name. For a while, she checked her credit reports daily for fraudulent activity. But now she is looking ahead. Her father, who is in his 60s, wants to retire and move, and the two of them are focused on preparing to buy a home. "I'm really excited. My goal is to get to 700 by the end of the year," she says of her credit score, "and then I'll definitely be ready to buy a house. I've never lived in my own home," she adds. "My dad and I are working together to save up for a home of our own."