QUITTXT

The problem
Most smoking cessation apps give generic advice. QuitTxt is different — it's grounded in actual clinical research: peer-reviewed studies, intervention protocols, evidence-based strategies. The challenge was building an AI that could make that evidence accessible through conversation without making things up. In healthcare, a wrong answer isn't just unhelpful — it's dangerous.
What I built
An AI assistant that answers questions about quitting smoking using only what it can find in the clinical literature. Ask it something covered by the research, and it gives you a clear answer with source references. Ask it something outside its knowledge, and it tells you honestly that it doesn't have enough information — instead of guessing and potentially giving harmful advice.
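The "answer only from the literature, with source references" behavior comes down to how the prompt is assembled. Here is a minimal sketch of that idea; the function name, wording, and passage format are my own illustration, not QuitTxt's actual prompt:

```python
def build_grounded_prompt(question, passages):
    """Assemble a prompt that restricts the model to retrieved passages.

    passages: list of (source_id, text) pairs from the clinical literature.
    Illustrative only; the real system's prompt wording is not shown here.
    """
    context = "\n".join(f"[{sid}] {text}" for sid, text in passages)
    return (
        "Answer using ONLY the passages below, and cite sources by their [id]. "
        "If the passages do not cover the question, say you do not have "
        "enough information to answer.\n\n"
        f"Passages:\n{context}\n\n"
        f"Question: {question}"
    )
```

Because the decline instruction lives in every prompt, the model has an explicit, sanctioned way to say "I don't know" rather than being forced to produce something.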
The safety system
The most important feature isn't what the AI says — it's when it stays quiet. Every answer goes through a confidence check: is the supporting evidence actually relevant to this question? I tested hundreds of queries against known answers to find the right threshold. Below that line, the system declines to answer. It sounds simple, but getting a confidence threshold right for medical content takes careful empirical work, not guessing.
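The gate described above can be sketched as a similarity check against an empirically tuned cutoff. Everything here is an assumption for illustration: the 0.75 threshold, the cosine-similarity measure, and the decline message are placeholders, not QuitTxt's actual values:

```python
import math

# Placeholder threshold; in practice this is tuned by scoring hundreds of
# queries with known answers, not chosen by hand.
THRESHOLD = 0.75

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def answer_or_decline(query_vec, retrieved):
    """retrieved: list of (passage_text, passage_vec) pairs.

    Returns the best-supported passage only when its relevance to the
    query clears the threshold; otherwise declines to answer.
    """
    best_score, best_text = max(
        (cosine(query_vec, vec), text) for text, vec in retrieved
    )
    if best_score < THRESHOLD:
        return "I don't have enough information to answer that safely."
    return best_text
```

The key design point is that abstention is the default path: an answer is only released when the evidence affirmatively clears the bar.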
Working with clinical content
Medical documents aren't like blog posts — they're dense, structured, and full of context that gets lost if you break them up wrong. I had to find the right way to prepare the research papers so the AI could find relevant passages accurately. The approach I landed on preserves the structure of clinical documents while still making them searchable. Retrieval quality improved significantly once I got this right.
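One way to preserve clinical structure during chunking is to split on section headings and tag every chunk with the section it came from, so a retrieved passage never arrives stripped of its context. This is a sketch under stated assumptions: the heading pattern and size limit are illustrative, not the actual preprocessing pipeline:

```python
import re

# Assumed heading format for illustration; real papers vary widely.
HEADING = re.compile(r"^(ABSTRACT|METHODS|RESULTS|DISCUSSION|CONCLUSIONS?)\b", re.M)

def chunk_by_section(paper_text, max_chars=800):
    """Split a paper into chunks that each carry their section label."""
    matches = list(HEADING.finditer(paper_text))
    sections = []
    for i, m in enumerate(matches):
        end = matches[i + 1].start() if i + 1 < len(matches) else len(paper_text)
        sections.append((m.group(0), paper_text[m.start():end].strip()))
    chunks = []
    for title, body in sections:
        # Split long sections on paragraph boundaries, prefixing each
        # piece with its section title so context survives chunking.
        buf = ""
        for para in body.split("\n\n"):
            if buf and len(buf) + len(para) > max_chars:
                chunks.append(f"[{title}] {buf.strip()}")
                buf = ""
            buf += para + "\n\n"
        if buf.strip():
            chunks.append(f"[{title}] {buf.strip()}")
    return chunks
```

Splitting at section and paragraph boundaries, rather than at a fixed character count, is what keeps a Methods sentence from being retrieved as if it were a clinical recommendation.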
Results
Published in Health Education Research, a Q1 journal. Deployed on Google Cloud and used daily by the UTSA research team as part of ongoing clinical research into AI-assisted smoking cessation.