Cincinnati leaned hard into a youth mental health app, with sobering results

A community member speaks during an engagement discussion about AI and mental health at the Cincinnati Public Library.
(Photo courtesy The Enquirer)
With a youth mental health crisis sweeping the nation, few things scare parents more than the thought of their child attempting suicide. In Cincinnati, where many schools have suffered student suicides, the desperation is palpable.
Enter Cincinnati Children’s Hospital with a glittering promise. Clairity, an AI-powered app patented by the hospital’s researchers, could record and analyze just five minutes of your child’s speech to determine whether they’re at high risk for suicide. I was intrigued. Nearly a decade after this technology was launched in Cincinnati Public Schools, what were the results?
As a 2024 USC National Fellow, I found the support and mentorship I needed to pursue a yearlong investigation in search of answers. What I discovered was a consistent lack of evidence that the app works. Interviews with clinicians familiar with the app and records from 65 school districts revealed that none of the schools that used it could demonstrate it helped prevent youth suicide.
At Hamilton County Juvenile Court, which shelled out $20,000 to use Clairity in 2021, social workers ran into major technical issues while attempting to use the app with juvenile offenders. The court discontinued its Clairity contract in 2023. Uncovering these findings required months of negotiation with social workers and candid conversations about why I was pursuing the story, which allowed me to get interviews on the record. When Clarigent Health, the NIH-funded startup behind Clairity, revoked the court’s access to the app, I worked with our newsroom lawyer to argue that the app’s data counted as a public record, and obtained it.
Another key component of my investigation was audience engagement, which aimed to reach the students, parents, and mental health clinicians who would be most impacted by this technology. I knew that hearing from a young and diverse cross-section of the city would require some strategizing.
I worked with Teena Apeles, my engagement editor, to write a survey that I distributed both online and in person. Together, we tabled at the University of Cincinnati during students’ lunch hour and visited the campus’s student ethnic theme centers to ask students of color for their opinions. At the Neighborhood House, a social services organization in the West End, a majority-Black neighborhood, we surveyed a group of high school students and youth workers. At Findlay Market, Cincinnati’s highest-traffic farmers market, we surveyed passersby who were interested in the topic of mental health and AI.
I hosted a listening session at the downtown branch of Cincinnati’s public library, where mental health clinicians and high schoolers alike weighed in on the implications of using AI to help diagnose mental health issues. The session let me ask community members what questions they would put to relevant authorities if they were investigating the story, and it deepened my understanding of how clinicians felt about the technology. It solidified my belief that the perspective of clinicians, and more specifically the challenges they face in mental health treatment and diagnosis, was the backbone of the story.
In the end, I heard from 100 Cincinnatians whose ages ranged from teens to over 60 years old. Most were students, parents, mental health clinicians, and youth workers. Close to 80 people filled out our survey, while around 20 attended our library listening session to share their thoughts.
Ultimately, Clarigent Health shut down two months after my investigation was published.
Here are some of the key lessons I learned over the course of my reporting and engagement work:
1. Your peers are a great resource. Lean on them.
One of the best parts of being a National Fellow was being surrounded by a cohort of journalists who were also tackling ambitious projects. During our monthly meetings, we would each share our progress and trade tips. I always appreciated my group members’ wealth of knowledge about government, accountability reporting, and public records. Texting and quick phone calls between meetings helped whenever I hit a wall in my reporting.
2. If your investigation peters out, pivot.
Initially, I thought the focus of my investigation would be the Cincinnati school districts that permitted their counselors to use the app. After months of filing records requests and doing interviews, however, I realized that the information I could get from schools was scant: principals, student service officers, and other administrators who had advocated for the app didn’t even remember whether it had made a difference in student mental health outcomes. It was through one school district’s records, though, that I learned Clarigent Health had also partnered with Hamilton County Juvenile Court. Interviewing employees of the court gave me invaluable access to social workers who had used the app directly.
3. Do engagement early, and use it to build an audience for your story.
One advantage of circulating a survey and hosting a listening session in the community is that you can build an audience for your story before it even publishes. Keeping a spreadsheet of participants’ email addresses was a great way to make sure my reporting reached the people who were interested in it.
4. On surveys, keep it simple and think creatively about distribution.
If you’re using a survey to source your investigation, keep its language simple, especially if you’re trying to reach a younger audience. And if people aren’t responding to your survey online, think about how to reach large numbers of them in person. Libraries and college campuses are especially good places to find engaged crowds.