Odysser's users chat with an AI to edit their videos. Agnost AI listens to every conversation and tells the team what users want, what's broken, and who's about to leave.
Odysser is an AI video editing agent for content creators. You upload a talking-head video or a marketing clip, and Odysser does the pro editor work for you: animations, captions, B-roll, storyboards, motion graphics. All the stuff that normally eats a whole afternoon in After Effects. There's no timeline to scrub. No keyframes to fight with. If you want something changed, you type it. "Make this slower." "Swap the caption." "Add B-roll here." Every edit happens through a conversation.
Odysser was growing fast, with millions of chats a month flowing through the editor. Most of them were routine edits. But a meaningful chunk weren't: a user asking for a feature that didn't exist, hitting a bug nobody knew about, or quietly getting frustrated enough to leave. At that volume, finding those chats by hand was impossible.
They had the data. They just couldn't use it.
Odysser plugged in Agnost AI. For the first time, the team could see what users were saying across every session. Not a sample. All of it.
The chats were always being stored. Now they're being read.
In the first few weeks, Agnost AI surfaced 1,247 feature requests buried inside conversations Odysser had already logged. It also flagged 87 users showing clear frustration signals: repeated retries, angry pushback, abandoned sessions. Users who would've churned without anyone noticing.
The roadmap isn't coming from guesses or the loudest voice on Twitter anymore. It's coming from what users are actually typing into the product. And when someone's about to silently bail, the team sees it in time to do something about it. The data was there the whole time. Now it's actually doing something.
"Agnost AI showed me users were asking for features we don't even have. I had no idea. All that feedback was just sitting inside conversations we already had."