Tutorial · 6 min read

Let's Vibe Code a Podcast Chatbot (Responsibly)

Let's build a simple, end-to-end application called Podcast Memory Chatbot — a tool that lets you chat with podcast episodes using vector search and OpenAI. We'll vibe code everything inside Cursor using Next.js 14, Tailwind CSS, OpenAI, and Vectroid.

Hamit Tokay

Software Engineer

#tutorial #ai #vector-search #podcast #nextjs #cursor

Today, we're building a simple, end-to-end application called Podcast Memory Chatbot — a tool that lets you chat with podcast episodes using vector search and OpenAI.

We'll use:

  • Next.js 14 (App Router)
  • Tailwind CSS (for styling)
  • Vercel AI SDK (for streamlined AI integrations)
  • OpenAI (for embeddings + chat + transcription)
  • Vectroid (for vector search)

And we'll vibe code everything inside Cursor.

Let's go.

✨ What We're Building

The idea is simple:

You'll enter a YouTube link, and we'll transcribe it, embed it, and make it searchable through natural language.

You can then ask things like:

  • "What did the guest say about open source AI models?"

Or:

  • "Summarize Lex Fridman's opinion on consciousness."

The bot finds the most relevant moments from the transcript and responds with grounded answers. No hallucinations — just facts from the pod.

🧠 How It Works

Here's the flow we'll implement:

  1. Paste a podcast URL from YouTube
  2. Download the raw transcript from YouTube
  3. Split the transcript into chunks (e.g. ~500 tokens each)
  4. Create embeddings for each chunk with text-embedding-3-small via the Vercel AI SDK
  5. Store embeddings in Vectroid (steps 3–5 are sketched just after this list)
  6. In the chat UI:
     - Take the user's question
     - Embed it with the Vercel AI SDK
     - Query Vectroid for relevant chunks
     - Stream GPT responses with the Vercel AI SDK
     - Return a grounded response with real-time streaming
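
To make steps 3–5 concrete, here's a rough sketch of what that ingestion pipeline could look like. The `embedMany` call is the real Vercel AI SDK API; `chunkTranscript` and the Vectroid wrapper (`getVectroidIndex`, `upsert`) are placeholder names for code we'll build later, so check the Vectroid docs for the actual client methods.

```ts
// lib/ingest.ts — sketch of steps 3–5: chunk → embed → store.
import { embedMany } from 'ai';
import { openai } from '@ai-sdk/openai';
import { chunkTranscript, type TranscriptSegment } from './chunker'; // our chunker (sketched in Step 2)
import { getVectroidIndex } from './vectroid'; // hypothetical Vectroid wrapper

export async function ingestTranscript(episodeId: string, segments: TranscriptSegment[]) {
  // 3. Split the timed transcript segments into ~500-token chunks
  const chunks = chunkTranscript(segments);

  // 4. Embed all chunks in one batched call via the Vercel AI SDK
  const { embeddings } = await embedMany({
    model: openai.embedding('text-embedding-3-small'),
    values: chunks.map((chunk) => chunk.text),
  });

  // 5. Store vectors + metadata in Vectroid (method names are illustrative)
  const index = getVectroidIndex('podcasts');
  await index.upsert(
    chunks.map((chunk, i) => ({
      id: `${episodeId}-${i}`,
      vector: embeddings[i],
      metadata: { episodeId, text: chunk.text, start: chunk.start },
    }))
  );
}
```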

🛠️ Step 1: Set Up Cursor & Define Rules

Before we dive in, make sure you have Cursor set up:

  1. Download Cursor: Head to cursor.com and download the latest version for your operating system
  2. Create a new project: Open Cursor and create a new empty folder for your podcast chatbot project
  3. Initialize the project: We'll be building everything from scratch, so start with a completely empty directory

Now, here's the key to success with Cursor: it's all about how you communicate with it. You can't just drop vague ideas into Cursor and expect production-grade code.

We'll use the modern Project Rules approach, with a .cursor/rules directory and MDC rule files.

Main Project Rule (.cursor/rules/project.mdc)

```
---
description: Podcast Memory Chatbot - Main project configuration
alwaysApply: true
---

# Podcast Memory Chatbot

Build a web app that lets users upload podcast links, transcribe them, and chat
with the content using vector search.

## Tech Stack

- Next.js 14 (App Router) with TypeScript
- Tailwind CSS for styling
- Vercel AI SDK for streamlined AI integrations
- OpenAI for embeddings, chat, and Whisper transcription
- Vectroid as vector database

## Architecture Principles

- Keep components focused and reusable
- Use server actions for API routes
- Implement proper error boundaries
- Follow Next.js 14 app directory conventions
- Always include proper TypeScript types

## File Structure

app/
  upload/page.tsx
  chat/page.tsx
  api/transcribe/route.ts
  api/embed/route.ts
  api/query/route.ts
components/
  ChatWindow.tsx
  UploadForm.tsx
  TranscriptViewer.tsx
lib/
  vectroid.ts
  openai.ts
  chunker.ts
types/
  transcript.ts
```

Component Standards (.cursor/rules/components.mdc)

```
---
description: Component development standards
globs: ["components/**/*", "app/**/*.tsx"]
alwaysApply: false
---

**Component Standards:**

- Use TypeScript interfaces for all props
- Implement proper error handling with try/catch
- Use Tailwind CSS classes (no custom CSS)
- Export components as named exports
- Include JSDoc comments for complex logic
- Use Vercel AI SDK hooks for streaming responses

**Example Structure:**

interface ComponentProps {
  // Define props here
}

export function ComponentName({ prop }: ComponentProps) {
  // Component logic
  return (
    <div>
      {/* JSX */}
    </div>
  );
}
```

API Route Standards (.cursor/rules/api.mdc)

```
---
description: API route development standards
globs: ["app/api/**/*"]
alwaysApply: false
---

**API Route Standards:**

- Use Zod for request/response validation
- Implement proper error handling with try/catch
- Return consistent JSON responses
- Use appropriate HTTP status codes
- Include rate limiting for production
- Use Vercel AI SDK for OpenAI integrations

**Template:**

import { NextRequest, NextResponse } from 'next/server';
import { z } from 'zod';

const RequestSchema = z.object({
  // Define schema
});

export async function POST(request: NextRequest) {
  try {
    const body = await request.json();
    const validatedData = RequestSchema.parse(body);

    // API logic here
    return NextResponse.json({ success: true, data: result });
  } catch (error) {
    return NextResponse.json(
      { error: 'Internal server error' },
      { status: 500 }
    );
  }
}
```

🧑‍🍳 Step 2: Let Cursor Cook

With our structured rules in place, Cursor now understands our project context automatically. Here's how we prompt incrementally:

Phase 1: Core Infrastructure

  1. "Create the transcribe API route using our standards" — Cursor applies the API rule automatically
  2. "Build the chunker utility in lib/" — Clean, typed functions
  3. "Create the Vectroid wrapper with proper error handling"

Phase 2: UI Components

  1. "Build the upload form component" — Cursor follows component standards
  2. "Create the chat interface with Vercel AI SDK streaming"
  3. "Add the transcript viewer with timestamp navigation"

Phase 3: Integration

  1. "Connect everything with proper state management"
  2. "Add error boundaries and loading states"

The magic? Cursor automatically applies the right rules based on what files you're working with. No more repeating yourself — the context is built-in.
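
To see how the pieces connect once the integration phase is done, here's a rough sketch of the /api/query route: embed the question, pull nearby chunks from Vectroid, and stream a grounded answer. The embed and streamText calls are real Vercel AI SDK APIs; the Vectroid query, the model choice, and the prompt wording are illustrative, and the exact streaming response helper varies a bit between SDK versions.

```ts
// app/api/query/route.ts — sketch of the retrieval + streaming step.
import { embed, streamText } from 'ai';
import { openai } from '@ai-sdk/openai';
import { getVectroidIndex } from '@/lib/vectroid'; // hypothetical Vectroid wrapper

export async function POST(req: Request) {
  const { messages } = await req.json();
  const question = messages[messages.length - 1].content;

  // Embed the question with the same model used at ingestion time
  const { embedding } = await embed({
    model: openai.embedding('text-embedding-3-small'),
    value: question,
  });

  // Fetch the most relevant transcript chunks (query API is illustrative)
  const matches = await getVectroidIndex('podcasts').query({
    vector: embedding,
    topK: 5,
  });
  const context = matches
    .map((m: any) => `[${m.metadata.start}s] ${m.metadata.text}`)
    .join('\n---\n');

  // Stream an answer grounded only in the retrieved chunks
  const result = await streamText({
    model: openai('gpt-4o-mini'),
    system:
      'Answer only from the podcast transcript excerpts below, and cite the ' +
      `timestamps you used. If the answer is not in the excerpts, say so.\n\n${context}`,
    messages,
  });

  return result.toDataStreamResponse();
}
```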

Pro Tips from the Cursor Docs:

  • Keep rules under 500 lines for optimal performance
  • Use @filename references to include template files (see the example after this list)
  • Generate rules from successful chat sessions using /Generate Cursor Rules
  • Organize with nested rules — put frontend rules in frontend/.cursor/rules/
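
For example, a scoped rule that points Cursor at a shared template via an @filename reference might look like this (the filename and wording are just illustrative):

```
---
description: Reuse the shared component template
globs: ["components/**/*"]
alwaysApply: false
---

When creating new components, follow the structure in @component-template.tsx.
```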

🧪 Step 3: Test It Out

Drop in a YouTube link. Wait for the transcription. Then chat.

Example:

"What does the guest say about alignment?"

And boom — you get a grounded response with exact timestamps.

You can:

  • Click to jump to that moment in the transcript
  • View retrieved chunks
  • Swap episodes
  • Use the same pattern for your own content (lectures, docs, meetings…)

📎 Resources

  • 🔗 Vectroid Documentation
  • 🛠️ Cursor.com
  • 📕 OpenAI Docs
  • 🧠 Erik on Vibe Coding in Prod

Happy vibe coding 🧘
