LA-based AI startup looks to help ease overloaded 911 systems

Harm.AI wants to route non-emergency callers to help faster as Los Angeles struggles with 911 delays.

Written By: Amyn Bhai
Edited By: Joseph Shavit
Harm.AI says it can steer non-emergency callers to help before they get stuck in LA’s overloaded 911 system. (CREDIT: Wikimedia / CC BY-SA 4.0)

The calls come in steady and they come in wrong.

A woman locked out of her own home by a neighbor throwing objects. A resident in mental health crisis transferred off the emergency line and told to wait. A person reporting harassment, routed somewhere, then nowhere. In Los Angeles, the distance between dialing 911 and reaching actual help has become, for a significant share of callers, a journey through a system that was not designed for what they are asking it to do.

The city answered just 57.43 percent of 911 calls within 15 seconds in 2024. Officials have cited operator shortages, attrition, and infrastructure that has not kept pace with demand. A recent LAPD report to the City Council made clear that Los Angeles still lacks a genuinely separate lane for non-emergency response.

Operators assigned to secondary calls are expected to jump back into the primary 911 queue when volume spikes. Calls diverted to the city's unarmed crisis response teams still require trained communications staff to screen them first. The workarounds have not resolved the underlying problem. They have simply redistributed it.

Into that gap, a small Los Angeles company is trying to build something.

Harm allows you to press a button, describe what's happening, and receive instant assistance. (CREDIT: Harm.AI)

A Different Kind of Triage

Harm.AI was founded by Aitan Segal, who serves as CEO, alongside Connor MacLeod, Vice President of Operations. Their platform is not a 911 replacement, and they are careful to say so. It is designed to intercept a specific and stubborn category of call: the one that does not belong in the emergency queue but still leaves someone without a clear path to help.

The scenarios Segal and MacLeod describe are familiar to anyone who has followed the city's dispatch coverage closely. Mental health crises. Suicidal ideation. Homelessness-related incidents. Harassment. Trafficking concerns. These situations feel urgent to the person experiencing them, carry real stakes, and routinely end up routed through a system built primarily to send patrol cars, fire engines, and ambulances.

"For us, that represents like a system failure," MacLeod said. "A lot of people who are calling 911 with something that feels like an emergency to them, they're not getting the support they need."

The platform works by having a person describe what is happening. The system assesses the situation and attempts to connect that person to the most relevant resource within roughly a minute. That might mean the 988 suicide and crisis lifeline, a local mental health provider, homelessness services, or another targeted support line. The routing is designed to be specific rather than generic, matching the nature of what someone is experiencing to a service that can actually address it, rather than transferring them into another queue where they wait again.

"The goal is within a minute, you'll be routed to the exact resource that's most appropriate to your issue," MacLeod said.
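Harm.AI has not published how its assessment works; as a purely illustrative sketch, the triage step MacLeod describes — classify what a caller reports, then connect them to a specific resource such as the 988 lifeline rather than a generic queue — might be outlined like this. The categories, keywords, and resource names below are invented stand-ins, not the company's actual taxonomy or logic:

```python
# Illustrative sketch only: a toy triage router that maps a free-text
# description to a specific non-emergency resource. Harm.AI's real
# platform uses AI-based assessment; the keyword table here is a
# hypothetical stand-in for that step.

RESOURCES = {
    "suicidal_ideation": "988 Suicide & Crisis Lifeline",
    "mental_health": "Local mental health provider line",
    "homelessness": "City homelessness services",
    "harassment": "Harassment support line",
}

KEYWORDS = {
    "suicidal_ideation": ["suicide", "end my life", "kill myself"],
    "mental_health": ["panic attack", "crisis", "breakdown"],
    "homelessness": ["homeless", "shelter", "encampment"],
    "harassment": ["harass", "stalking", "threatening me"],
}

def route(description: str) -> str:
    """Return the most relevant resource, or a default referral."""
    text = description.lower()
    for category, words in KEYWORDS.items():
        if any(word in text for word in words):
            return RESOURCES[category]
    return "General non-emergency referral line"

print(route("my neighbor keeps harassing me"))
```

The point of the sketch is the design shape, not the matching method: the value the company claims lies in ending at a concrete, situation-specific endpoint instead of another transfer.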

Harm.AI offers a compassionate AI Companion that listens, guides, and supports you through non-life-threatening situations. (CREDIT: Harm.AI)

The Problem With the Existing Pipeline

What makes the company's pitch coherent is the specific failure mode it is addressing. Los Angeles has spent years struggling with response time metrics while the callers who were never going to receive a dispatch response have largely been left to navigate a fragmented network of hotlines and support services on their own. When those callers enter the 911 system, they slow it down. When they are transferred out, they often disappear.

The city's non-emergency line, which has long been positioned as the appropriate alternative for lower-priority situations, has its own documented response problems. Residents have described waiting 30 minutes, an hour, or longer after being transferred during incidents that felt pressing enough to call 911 in the first place. The structural issue is that the non-emergency system draws from the same staffing pool. There is no truly separate infrastructure. Reducing 911's non-emergency call volume does not automatically produce capacity elsewhere if the same operators are simply handling a different phone.

Harm.AI is not trying to fix that staffing problem. Segal and MacLeod are explicit about the limits of what they are building. Dispatchers remain essential in genuine emergencies. The company is not competing with them and is not claiming to replicate what trained human judgment provides in life-or-death situations. The argument is narrower: if the platform can capture non-dispatchable calls before they enter the emergency queue, the people who are left in that queue may be the ones the system was actually designed to serve.

"We're not looking to take anyone's jobs away," MacLeod said.

Data as Infrastructure

Segal describes a second purpose running alongside the routing function, one that may prove as significant as the immediate service connection over time. Every interaction on the platform, logged and anonymized, feeds into what the company calls HARM Maps: a data layer that tracks where non-emergency harm is occurring, what categories of need are emerging, and how well existing resources are actually performing once people reach them.

The feedback loop is central to how the company presents its value to municipalities. Routing someone to a mental health line is one thing. Knowing whether that call was answered, whether the person received useful assistance, and whether the same location is generating repeated reports of the same category of incident is another. Cities have historically made decisions about service allocation based on 911 call volume, which captures only the calls that came in, not the outcomes, and only the subset of situations serious enough that someone reached for the emergency line.

"What we can do is we can get a concentric unified data stream or an understanding of what's occurring where in the city," Segal said. "The goal is to allow people's voices to be elevated so that no one is unheard."

The company says it plans to log outcomes and refine the platform based on what it learns. In practice that means tracking whether users actually reached a service, whether the connection was useful, and where the system still fell short. Segal described that accountability function as core to the project rather than incidental to it.
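The company has not described its logging schema; as a sketch of the kind of outcome tracking Segal points to — whether a routed call was answered, and whether the person found it useful — one could imagine anonymized records aggregated per resource. The field names and metrics below are assumptions made for illustration only:

```python
# Illustrative sketch: aggregating anonymized routing outcomes to show
# which resources actually answer and help. The record fields and the
# counters are assumptions, not Harm.AI's real schema.

from collections import defaultdict

def summarize(outcomes):
    """Per-resource counts of routed, answered, and reported-helpful calls."""
    stats = defaultdict(lambda: {"routed": 0, "answered": 0, "helpful": 0})
    for record in outcomes:
        entry = stats[record["resource"]]
        entry["routed"] += 1
        entry["answered"] += record["answered"]  # bools count as 0/1
        entry["helpful"] += record["helpful"]
    return dict(stats)

sample = [
    {"resource": "988 lifeline", "answered": True, "helpful": True},
    {"resource": "988 lifeline", "answered": True, "helpful": False},
    {"resource": "housing line", "answered": False, "helpful": False},
]
print(summarize(sample))
```

Aggregates like these, mapped by location, are the kind of data layer the HARM Maps concept implies: not just call volume in, but whether the handoff worked.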

MacLeod added that the platform can be customized to reflect different municipal priorities, depending on what a city identifies as its most pressing non-emergency response gaps.

Privacy and the Municipal Model

The company's revenue model is built around contracts with cities rather than around the data its users generate. MacLeod said user information is encrypted and not sold or retained, with the platform operating under government-grade encryption standards and security practices aligned with SOC requirements. The company says it collects only the minimum information required to route someone effectively, which may include general location or device settings but does not require personal identity details.

That structure is intended to address a concern that follows any technology platform operating in crisis-adjacent spaces: whether the people using it can trust that what they share will not be used against them. For someone describing a mental health crisis, a trafficking concern, or a situation that carries legal or social risk, the terms under which their information is handled matter considerably. Harm.AI's stated approach is to make the platform useful without making it extractive, a distinction that becomes relevant when the alternative is that vulnerable people simply do not seek help at all.

The Difficulty of Getting Cities to Listen

Building the platform has proven more straightforward than getting municipalities to engage with it.

MacLeod, who lives in Santa Monica, said the company has made approaches to local officials and encountered the familiar resistance that greets civic technology startups: difficulty getting in front of the people who make decisions, and a default institutional preference for managing existing systems over experimenting with new ones. Segal said the pattern has repeated elsewhere. Cities that are visibly struggling with their emergency response infrastructure have often been more comfortable defending what they have than seriously evaluating what they are missing.

That reluctance is not entirely irrational. Governments face real accountability risks when they adopt unproven technology in domains that touch public safety. A platform that misroutes someone in crisis, or that creates a perception that the city is offloading responsibility to a startup, carries political and practical costs. The burden of demonstrating reliability falls on the company, and the cycle of needing adoption to generate data and needing data to earn adoption is a familiar one.

What is harder to defend is the status quo. Los Angeles has documented its own failure to answer emergency calls within basic performance thresholds. It has documented the absence of a functional non-emergency response lane. Residents have described, on record, what it looks like when someone in genuine distress is transferred off 911 and then left waiting for hours.

Addressing Structural Failures

Segal and MacLeod are not arguing that their platform resolves those structural failures. The city still needs more trained dispatchers. It still needs better equipment. It still needs a non-emergency response architecture that people can actually rely on.

Their case is more specific than that. It is that a city this overwhelmed cannot keep routing vulnerable people into the wrong system, watching them fall through the gap between emergency and non-emergency response, and treating that outcome as an acceptable baseline. Harm.AI is a bet that technology can do some of that sorting earlier, more reliably, and with enough data discipline to help cities see where the gaps are and what it would take to close them.

Whether Los Angeles is ready to find out is a different question.

The original story "LA-based AI startup looks to help ease overloaded 911 systems" is published in The Brighter Side of News.





Amyn Bhai, Writer
Amyn Bhai is a Culver City–based media journalist covering sports, celebrity culture, entertainment, and life in Los Angeles. He writes for The Brighter Side of News and has contributed to The Sporting Tribune, Culver City Observer, and the Los Angeles Sentinel. With a strong curiosity for science, innovation, and discovery, Amyn focuses on making complex ideas accessible and engaging for a broad audience.