Today, we're building a simple, end-to-end application called Podcast Memory Chatbot — a tool that lets you chat with podcast episodes using vector search and OpenAI.
We'll use:
- Next.js 14 (App Router)
- Tailwind CSS (for styling)
- Vercel AI SDK (for streamlined AI integrations)
- OpenAI (for embeddings + chat + transcription)
- Vectroid (for vector search)
And we'll vibe code everything inside Cursor.
Let's go.
✨ What We're Building
The idea is simple:
You'll enter a YouTube link, and we transcribe it, embed it, and make it searchable through natural language.
You can then ask things like:
- "What did the guest say about open source AI models?"
Or:
- "Summarize Lex Fridman's opinion on consciousness."
The bot finds the most relevant moments from the transcript and responds with grounded answers. No hallucinations — just facts from the pod.
🧠 How It Works
Here's the flow we'll implement:
- Paste a podcast URL from YouTube
- Download the raw transcription from YT
- Split transcript into chunks (e.g. ~500 tokens)
- Create embeddings for each chunk with text-embedding-3-small via the Vercel AI SDK
- Store embeddings in Vectroid
- In the chat UI:
- Embed the user's question using the Vercel AI SDK
- Query Vectroid for relevant chunks
- Stream GPT responses with Vercel AI SDK
- Return a grounded response with real-time streaming
🛠️ Step 1: Set Up Cursor & Define Rules
Before we dive in, make sure you have Cursor set up:
- Download Cursor: Head to cursor.com and download the latest version for your operating system
- Create a new project: Open Cursor and create a new empty folder for your podcast chatbot project
- Initialize the project: We'll be building everything from scratch, so start with a completely empty directory
Now, here's the key to success with Cursor: it's all about how you communicate with it. You can't just drop vague ideas into Cursor and expect production-grade code.
We'll use the modern Project Rules approach with .cursor/rules directory and MDC files.
Main Project Rule (.cursor/rules/project.mdc)
Component Standards (.cursor/rules/components.mdc)
interface ComponentProps {
  // Define props here, e.g.:
  prop: string;
}

export function ComponentName({ prop }: ComponentProps) {
  // Component logic
  return (
    <div>{/* JSX */}</div>
  );
}
API Route Standards (.cursor/rules/api.mdc)
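To give a sense of what this rule enforces, here is a minimal sketch of the kind of route-handler template it could reference with an @filename reference. The templates/ path and the url field are just placeholders:

```ts
// templates/api-route.ts: route-handler shape the api.mdc rule can point at via @filename.
import { NextResponse } from "next/server";

export async function POST(req: Request) {
  try {
    const body = await req.json();

    // Validate input before doing any work.
    if (!body?.url || typeof body.url !== "string") {
      return NextResponse.json({ error: "Missing 'url'" }, { status: 400 });
    }

    // ...route logic here...

    return NextResponse.json({ ok: true });
  } catch (error) {
    console.error(error);
    return NextResponse.json({ error: "Internal server error" }, { status: 500 });
  }
}
```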
🧑‍🍳 Step 2: Let Cursor Cook
With our structured rules in place, Cursor now understands our project context automatically. Here's how we prompt incrementally:
Phase 1: Core Infrastructure
- "Create the transcribe API route using our standards" — Cursor applies the API rule automatically
- "Build the chunker utility in lib/" — Clean, typed functions
- "Create the Vectroid wrapper with proper error handling"
Phase 2: UI Components
- "Build the upload form component" — Cursor follows component standards
- "Create the chat interface with Vercel AI SDK streaming"
- "Add the transcript viewer with timestamp navigation"
Phase 3: Integration
- "Connect everything with proper state management"
- "Add error boundaries and loading states"
The magic? Cursor automatically applies the right rules based on what files you're working with. No more repeating yourself — the context is built-in.
Pro Tips from the Cursor Docs:
- Keep rules under 500 lines for optimal performance
- Use @filename references to include template files
- Generate rules from successful chat sessions using /Generate Cursor Rules
- Organize with nested rules: put frontend rules in frontend/.cursor/rules/
🧪 Step 3: Test It Out
Drop in a YouTube link. Wait for the transcription. Then chat.
Example:
"What does the guest say about alignment?"
And boom — you get a grounded response with exact timestamps.
You can:
- Click to jump to that moment in the transcript
- View retrieved chunks
- Swap episodes
- Use the same pattern for your own content (lectures, docs, meetings…)
📎 Resources
Happy vibe coding 🧘

Written by Hamit Tokay
Software Engineer