Case Study

Your users tell you everything.
You just have to listen.

Odysser's users chat with an AI to edit their videos. Agnost AI listens to every one of those conversations and tells the team what users want, what's broken, and who's about to leave.

1,247 Hidden feature requests
87 Silent churns, avoided

About

Odysser is an AI video editing agent for content creators. You upload a talking-head video or a marketing clip, and Odysser does the pro editor work for you: animations, captions, B-roll, storyboards, motion graphics. All the stuff that normally eats a whole afternoon in After Effects. There's no timeline to scrub. No keyframes to fight with. If you want something changed, you type it. "Make this slower." "Swap the caption." "Add B-roll here." Every edit happens through a conversation.

The Problem

Odysser was growing fast. Millions of chats a month flowing through the editor. Most of them were routine edits. But a meaningful chunk weren't: a user asking for a feature that didn't exist, hitting a bug nobody knew about, or quietly getting frustrated enough to leave. At that volume, finding those chats was impossible.

No clicks to track. PostHog and Amplitude are built for apps with buttons. Odysser's product is a chat.
Their observability stack was already storing every conversation. But tokens, traces, and latency don't tell you what a user is actually asking for.
The signals aren't labeled. A feature request looks like any other message until you've read ten similar ones. A user about to churn just retries the same edit a few times, then stops. Nobody finds patterns like that by eyeballing logs.

They had the data. They just couldn't use it.

The Solution

Odysser plugged in Agnost AI. For the first time, the team could see what users were saying across every session. Not a sample. All of it.

Intents, clustered: every message is tagged by what the user was trying to do, then grouped into themes the team can scroll through instead of a wall of raw transcripts.
Feature requests, ranked: the things users keep asking for, ordered by how many people are asking. The roadmap, basically written by the people using the product.
Frustrations, caught: when someone retries the same edit five times, pushes back on the output, or bails halfway through, Odysser sees it in the session where it happened.
Tool-by-tool performance: Odysser runs a handful of AI tools behind the scenes (storyboard, animation, captions, B-roll), and Agnost AI shows which ones are landing and which are quietly breaking.
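To make the frustration signals above concrete, here is a minimal sketch of how a session-level heuristic like this could look. The message format, signal names, thresholds, and pushback markers are all illustrative assumptions, not Agnost AI's actual implementation:

```python
from collections import Counter

# Hypothetical heuristic for the frustration signals described above:
# retry loops, pushback on the output, and abandoned sessions.
# Thresholds and marker phrases are illustrative assumptions.

RETRY_THRESHOLD = 3  # near-identical requests before flagging a retry loop


def normalize(text: str) -> str:
    """Crude normalization so 'Make this slower!' and 'make this slower' match."""
    return " ".join(text.lower().split()).strip(".!?")


def frustration_signals(user_messages: list[str], session_completed: bool) -> list[str]:
    """Return the frustration signals detected in one chat session."""
    signals = []

    # Retry loop: the same edit request repeated several times.
    counts = Counter(normalize(m) for m in user_messages)
    if counts and max(counts.values()) >= RETRY_THRESHOLD:
        signals.append("retry_loop")

    # Pushback: the user explicitly rejecting the output.
    pushback_markers = ("that's not", "still wrong", "didn't work", "undo")
    if any(marker in normalize(m) for m in user_messages for marker in pushback_markers):
        signals.append("pushback")

    # Abandonment: the session ended before the edit was finished.
    if not session_completed:
        signals.append("abandoned_session")

    return signals
```

For example, a session of `["Make this slower", "make this slower", "Make this slower!", "undo that"]` that ends mid-edit would flag all three signals. A real system would replace the exact-match retry check with semantic similarity, but the session-level shape of the detection is the same.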

The Result

The chats were always being stored. Now they're being read.

In the first few weeks, Agnost AI surfaced 1,247 feature requests buried inside conversations Odysser had already logged. It also flagged 87 users showing clear frustration signals: repeated retries, angry pushback, abandoned sessions. Users who would've churned without anyone noticing.

The roadmap isn't coming from guesses or the loudest voice on Twitter anymore. It's coming from what users are actually typing into the product. And when someone's about to silently bail, the team sees it in time to do something about it. The data was there the whole time. Now it's actually doing something.

"Agnost AI showed me users were asking for features we don't even have. I had no idea. All that feedback was just sitting inside conversations we already had."
Merouane Zouaid, CTO, Odysser

Ready to read what your users are actually saying?