You’ve likely heard about AI powering driverless cars, writing term papers and creating unsettling deep fakes. Can that same technology also prevent people from becoming homeless?

That’s what Los Angeles County is trying to find out. Officials there are using AI technology to predict who in the county is most likely to lose their housing — and then stepping in to help those people with their rent, utility bills, car payments and more. 

It’s still an experimental strategy. But the program has served more than 700 clients since 2021, and 86% have retained their housing. It comes at a time when more than 180,000 Californians have no place to call home, and people are ending up on the streets faster than government agencies and nonprofits can get them into housing. Officials all over the state are turning to methods aimed at preventing homelessness before it happens.

LA County’s algorithm analyzes data from residents’ emergency room visits, jail stays, use of food assistance and more, and has sparked interest from Silicon Valley to San Diego. Final data on the program — which is funded with roughly $26 million in federal COVID relief money and is expected to end in 2026 — aren’t yet out. If it’s successful, the approach could have major implications for helping cities and counties spend their limited resources more efficiently. 

“If we know who people are who unfortunately are going to have that experience, and they’re already county clients, it’s a real opportunity to do something early on in their lives to prevent that from happening,” said Dana Vanderford, associate director of homelessness prevention for LA County’s Department of Health Services. 

Dana Vanderford, Associate Director of Homelessness Prevention at Housing for Health at LA County Department of Health Services
Credit: Jules Hotz

How does artificial intelligence predict homelessness?

The idea started in 2019, when UCLA’s California Policy Lab began experimenting to see if it could use machine learning, combined with LA County data, to predict homelessness. Then, the county paired that with money to intervene before people ended up on the street — the program is predominantly funded with $26 million in COVID-era funds from the federal American Rescue Plan. 

The UCLA researchers start with a list of 90,000 people who recently used services from the county’s Health Services or Mental Health departments. Using 580 factors, the computer ranks those people from 1 to 90,000 based on their risk of becoming homeless. The people deemed to be highest-risk tend to show up in emergency rooms and jails at high rates, and have high usage of services such as CalFresh food benefits. But the model takes many more data points into consideration. 

For example, if people receive services in many different geographic areas, it could mean they’re couch surfing — bouncing from one precarious living situation to the next. 

“You sort of let the computer learn what it finds to be predictive over time,” said Janey Rountree, executive director of the California Policy Lab at UCLA. 

To train the algorithm, the researchers showed it a list of people who became homeless along with the services they used prior to losing their housing. Then, they had the algorithm practice “predicting” homelessness using old data, and checked to see if it was accurate. When they were satisfied, they started using it for real predictions.
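The train-then-backtest loop described above can be sketched in a few lines of Python. Everything here is illustrative: three made-up service-use flags stand in for the county’s 580 factors, the records are synthetic, and the simple rate-difference “weights” are a stand-in for the actual machine learning model, which is not public in this form.

```python
import random

random.seed(0)

# Toy historical records: three binary service-use flags stand in for
# the county's 580 factors. Purely illustrative, not county data.
def make_person(at_risk):
    return {
        "er_visits": random.random() < (0.7 if at_risk else 0.2),
        "jail_stay": random.random() < (0.5 if at_risk else 0.1),
        "calfresh":  random.random() < (0.8 if at_risk else 0.3),
        "became_homeless": at_risk and random.random() < 0.6,
    }

history = [make_person(at_risk=(i % 10 == 0)) for i in range(5000)]

features = ["er_visits", "jail_stay", "calfresh"]

def rate(group, f):
    """Share of a group for whom flag f is true."""
    return sum(p[f] for p in group) / max(len(group), 1)

# "Training": weight each factor by how much more often it appears
# among people who later became homeless.
pos = [p for p in history if p["became_homeless"]]
neg = [p for p in history if not p["became_homeless"]]
weights = {f: rate(pos, f) - rate(neg, f) for f in features}

# Backtest on a fresh batch of old data: rank everyone by score and
# check whether the top slice really became homeless at a higher rate.
holdout = [make_person(at_risk=(i % 10 == 0)) for i in range(5000)]

def score(p):
    return sum(weights[f] * p[f] for f in features)

ranked = sorted(holdout, key=score, reverse=True)
top = ranked[:500]
base_rate = rate(holdout, "became_homeless")
top_rate = rate(top, "became_homeless")
print(f"base rate {base_rate:.1%}, top-500 rate {top_rate:.1%}")
```

If the backtest shows the top of the ranked list becoming homeless at a much higher rate than the population overall, the model is doing useful work — the check the researchers ran before trusting it with real predictions.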

How well does it work? Among the 90,000 people the researchers started with, 7% became homeless in 18 months. Among the 10,000 people the algorithm deemed to be highest risk, 24% became homeless. 

If they were targeting fewer people (say 1,000 instead of 10,000), it would be even more accurate, Rountree said. But social workers aren’t able to get in touch with many of the people on the list, and others don’t agree to participate in the aid program, so they have to cast a broader net. 
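The tradeoff Rountree describes — tighter targeting raises accuracy, but outreach losses force a wider net — is a precision-at-k question. A toy sketch with made-up numbers (not the county’s actual scores or outcomes) shows how the hit rate climbs as the list shrinks:

```python
import math
import random

random.seed(1)

N = 90_000
# Toy ranked list: the chance of homelessness decays with rank, tuned
# so the overall rate lands near the article's 7%. Illustrative only.
ranked_outcomes = [
    random.random() < 0.35 * math.exp(-4.6 * i / N) for i in range(N)
]

def precision_at(k):
    """Share of the top-k ranked residents who later became homeless."""
    return sum(ranked_outcomes[:k]) / k

for k in (1_000, 10_000, N):
    print(f"top {k:>6,}: {precision_at(k):.1%}")
```

In a list shaped like this, the top 1,000 names hit far more often than the top 10,000, which in turn beat the 90,000-person baseline — the same pattern as the county’s 24%-versus-7% figures, and the reason targeting fewer people would be more accurate.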

Is a computer really better at guessing who will become homeless than human social workers trained in this work? Rountree says yes — 3.5 times better, to be exact. The problem with humans, she said, is that they’re biased toward the people they know. 

“It’s just human nature to want to help the people that you’re in contact with,” she said. “They all seem housing-unstable and at high risk. You want to help those individuals or those families in front of you. But not all of them are going to become homeless and be on the street or use shelter if they don’t get assistance.”

Caseworkers also often prioritize people with lower needs, Rountree said. Someone who recently lost their job but otherwise is stable gets preference over someone facing ongoing struggles with their mental health or addiction, because the stable person is easier to help. But the stable person may not be the one who needs the help the most. 

There’s also a belief that people with higher needs won’t spend the money they’re given wisely, Rountree said. But AI doesn’t have that bias, so it ensures the money goes to who needs it most. 

The results are apparent. People the algorithm targets are much more likely to have been incarcerated, sought substance use treatment, had mental health issues or been hospitalized than the people who seek aid through LA County’s other homelessness prevention programs, Rountree said. In that way, this program fills a hole in LA County’s net of services, she said. 

LA County’s other, more traditional programs geared to prevent homelessness rely on people reaching out to request help or on case workers referring clients. 

Interestingly, they aren’t duplicating efforts. There’s almost no overlap between the people targeted by the AI algorithm and those served by traditional prevention programs, Vanderford said.

“We know there’s a significant population of folks who if somebody doesn’t reach out to them to offer assistance, they might lose their housing right out from under them without reaching out for assistance themselves,” she said.

Then, a human steps in

Four times a year, the Policy Lab researchers send LA County a list of residents the AI program has deemed most likely to become homeless. The county then mails those people letters, telling them they’ve been selected to participate in the program. After that, a social worker cold-calls them to tell them the good news.

Frequently, the person at the other end of the line is convinced it’s a scam. After all, how often does someone legitimate call out of the blue offering free money? 

When that happens, case worker Genice Brown usually will ask if she can email them — a move she hopes lends a bit more credence to her pitch. Once she convinces them the program is real, nine out of 10 people agree to sign up, she said. 

Genice Brown, a medical case worker with the Housing Stabilization and Homelessness Prevention Unit, in Los Angeles
Credit: Jules Hotz

Individuals enrolled in the program receive a base sum of either $4,000 or $6,000 (the amount is randomly assigned so researchers can assess the impacts of different amounts of money). Families start at $6,000 or $8,000, with larger families receiving more.

Brown can use that money for whatever her clients need most. Usually rent comes first, but it also can cover other bills. In addition, she helps connect her clients to doctors, dentists and mental health services. If they’re looking for work, Brown gets them gift cards for interview outfits, helps them with their resumes and role-plays interview questions. She works with each client for three or four months.

‘I just really needed the help’

For 38-year-old Sandricka Henderson, help came just in time. Diagnosed with lupus at the start of the COVID-19 pandemic, Henderson could no longer work her physically demanding warehouse job. Disability benefits gave her barely more than $1,000 a month — just a quarter of what she made while she was working. With an 8-year-old son to support, Henderson found she was at least $400 behind on her bills every month.

Just before Christmas last year, Henderson received a call from a woman offering free money. Henderson was sure it was a scam, and braced for the woman to ask for her Social Security number. 

But the social worker (who turned out to be Genice Brown) didn’t, and Henderson eventually realized the program was real. The first thing Brown gave her was a $100 gift card to a local grocery store — a blessing, Henderson said, because she had nothing in her refrigerator. 

Shortly after, Henderson’s landlord sent her a letter warning she had 10 days to pay her rent or be evicted. About a week later, Brown sent the rent money and helped Henderson avoid catastrophe. She also helped Henderson catch up on her car payment. 

Now, Henderson no longer feels like she’s teetering on the edge of homelessness. She has some money in her savings account, and her rent is prepaid for several months. 

“I just really needed the help,” Henderson said. Because she’s used to working hard and taking care of herself, she added, she never would have reached out and asked for it.

“It really did change my whole circumstances,” she said. “My son had a Christmas that I didn’t think I was going to be able to give him.”  

The future of AI in homelessness services 

Throughout California, new people are becoming homeless faster than aid workers can find existing homeless residents housing. In Santa Clara County, for example, for every one homeless household that moved into housing last year, another 1.7 became newly homeless, according to Destination: Home, a Santa Clara County-based organization focused on ending homelessness.

The LA County team has met with government agencies from all over the country who are interested in its AI model, including Santa Clara and San Diego counties, Vanderford said.

San Diego County is working on a plan for homelessness prevention, Tim McClain, spokesman for the county’s Health and Human Services Agency, said in an email to CalMatters. He wouldn’t provide any additional updates. 

Santa Clara County met with the California Policy Lab earlier this year, and hopes to schedule another informational meeting soon, said Consuelo Hernandez, director of the county’s Office of Supportive Housing. The county has its own homelessness prevention program, which relies on humans triaging clients. If artificial intelligence can do that work more efficiently, it’s worth exploring, Hernandez said. 

But at the end of the day, what they really want is more money to help the people who already fill their queues. 

“Without having additional resources,” Hernandez said, “what is the true benefit of knowing there are more people out there who are in need?”

Marisa Kendall, CalMatters
