Work
Tired of tab-switching to read docs mid-code, I built a RAG assistant that answers Appliqué component questions right inside the IDE — powered by Qdrant, OpenAI, and an MCP server.
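The core of the assistant is the retrieval step: embed the user's question, find the nearest doc chunks, and hand them to the model as context. A minimal sketch of that ranking logic, with toy hand-written vectors standing in for the real OpenAI embeddings stored in Qdrant (the chunk texts here are hypothetical):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Toy doc chunks with made-up embeddings; in the real assistant these
# vectors come from OpenAI's embedding endpoint and live in Qdrant.
CHUNKS = [
    ("Button accepts a `variant` prop: primary, ghost, danger.", [0.9, 0.1, 0.0]),
    ("Modal traps focus and closes on Escape by default.",       [0.1, 0.9, 0.1]),
    ("Tooltip positioning is controlled by a `placement` prop.", [0.0, 0.2, 0.9]),
]

def retrieve(query_vec, k=2):
    """Return the k chunk texts most similar to the query embedding."""
    ranked = sorted(CHUNKS, key=lambda c: cosine(query_vec, c[1]), reverse=True)
    return [text for text, _ in ranked[:k]]
```

The retrieved chunks are then prepended to the prompt, and the MCP server exposes the whole lookup as a tool the IDE can call.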
Point it at any YouTube video, and you can ask questions, get summaries, and jump to timestamps — all powered by RAG over the video transcript stored in Qdrant.
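Jumping to timestamps works because each indexed chunk keeps the start time of its transcript segment. A sketch of the chunking step, assuming transcript segments arrive as `(start_seconds, text)` pairs (the function name and window size are illustrative):

```python
def chunk_transcript(segments, window=30.0):
    """Group (start_seconds, text) transcript segments into roughly
    window-second chunks, keeping each chunk's start time so an answer
    can link back to it (e.g. ...watch?v=ID&t=85s)."""
    chunks, cur_start, cur_text = [], None, []
    for start, text in segments:
        if cur_start is None:
            cur_start = start
        cur_text.append(text)
        if start - cur_start >= window:
            chunks.append((cur_start, " ".join(cur_text)))
            cur_start, cur_text = None, []
    if cur_text:
        chunks.append((cur_start, " ".join(cur_text)))
    return chunks
```

Each `(start, text)` chunk is embedded and stored in Qdrant with the timestamp as payload, so retrieval returns both the answer context and where in the video it came from.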
Built with OpenAI GPT-4.1 and the Agents SDK. Full cart operations via natural language, voice input via the Web Speech API and voice output via OpenAI TTS, plus a human-in-the-loop handoff when the AI can't help.
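The handoff boils down to a routing decision: the agent handles turns it is equipped and confident for, and everything else escalates to a human. A hypothetical sketch of that check, not the actual Agents SDK API (intent names and the threshold are made up for illustration):

```python
# Intents the cart agent is equipped to handle on its own (illustrative).
CART_INTENTS = {"add_item", "remove_item", "show_cart", "checkout"}

def route(intent: str, confidence: float, threshold: float = 0.6) -> str:
    """Decide who takes this turn: the cart agent, or a human operator.

    Escalate when the classified intent is outside the agent's scope,
    or when the classifier's confidence is too low to act on.
    """
    if intent in CART_INTENTS and confidence >= threshold:
        return "agent"
    return "human"  # human-in-the-loop handoff when the AI can't help
```

In the real system the Agents SDK drives this as a handoff between agents; the point is that escalation is an explicit, testable branch rather than something left to the model.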