Lucas.Cort

AI Avatar Product Showcase

Multi-Agent Live-Broadcast AI Presenter

Built and demonstrated

The Problem

As the company sold more franchised venues, each one needed product walkthroughs and brand-consistent presentations, a volume no human presenter could cover. Off-the-shelf avatar tools (HeyGen, Synthesia) generate only static MP4s, not live, interactive, context-aware broadcasts.

The Solution

A live multi-agent AI presentation system: an AI avatar reads a generated script while OBS auto-switches scenes and visuals in real time, pulling related assets from a RAG-indexed library. Three cooperating agents (script-writer, asset-picker, scene-director) can each be tuned independently.
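The three-agent flow can be sketched as a simple pipeline. This is an illustrative stub, not the production AutoGen wiring: the names (`Segment`, `script_writer`, the asset library) are hypothetical, the script writer is a canned stub standing in for a GPT-4 call, and the asset picker approximates the RAG lookup with plain word overlap instead of ChromaDB vector search.

```python
from dataclasses import dataclass

@dataclass
class Segment:
    """One show segment flowing through the three agents (hypothetical type)."""
    topic: str
    script: str = ""
    asset: str = ""
    scene: str = ""

def script_writer(seg: Segment) -> Segment:
    # Stub for the GPT-4-backed script-writer agent.
    seg.script = f"Welcome! Today we look at {seg.topic}."
    return seg

def asset_picker(seg: Segment, library: dict) -> Segment:
    # Stand-in for the RAG lookup: pick the asset whose description
    # shares the most words with the generated script.
    def overlap(desc: str) -> int:
        return len(set(desc.split()) & set(seg.script.lower().split()))
    seg.asset = max(library, key=lambda name: overlap(library[name]))
    return seg

def scene_director(seg: Segment) -> Segment:
    # Map the chosen asset to an OBS scene name for the live switch.
    seg.scene = f"scene_{seg.asset}"
    return seg

library = {
    "demo_video": "walkthrough video of the product demo",
    "price_chart": "chart comparing franchise pricing tiers",
}
seg = scene_director(asset_picker(script_writer(Segment("the product demo")), library))
print(seg.scene)  # -> scene_demo_video
```

In the real system each stage is a conversable agent, so the handoff happens via agent messages rather than direct function calls, which is what makes each stage independently tunable.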

Key Features

  • Multi-agent orchestration via Microsoft AutoGen (3 specialized agents)
  • Described-asset RAG library with ChromaDB vector search
  • Live OBS WebSocket control — scenes switch during the show
  • HeyGen avatar integration for narration
  • Built end-to-end in 1-2 weeks
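To show what the live OBS control amounts to on the wire, here is a sketch of the scene-switch message in the obs-websocket v5 protocol (opcode 6 = Request, request type `SetCurrentProgramScene`). The function name and the scene name are hypothetical; the actual system would send this through a WebSocket client rather than just building the JSON frame.

```python
import json
import uuid

def scene_switch_request(scene_name: str) -> str:
    """Build an obs-websocket v5 frame asking OBS to switch program scene."""
    return json.dumps({
        "op": 6,  # opcode 6 = Request in obs-websocket v5
        "d": {
            "requestType": "SetCurrentProgramScene",
            "requestId": str(uuid.uuid4()),
            "requestData": {"sceneName": scene_name},
        },
    })

frame = scene_switch_request("ProductCloseup")
print(frame)
```

The scene-director agent emits one of these per cue, so the cut happens during the broadcast instead of in post.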

Tech Stack

Python · Microsoft AutoGen · ChromaDB · OpenAI GPT-4 · HeyGen · OBS WebSocket API

Outcome

Demonstrated a live autonomous broadcast presenter that plays like a human-run product showcase. The project marks the evolution from manual OBS production (Unis Showcase 2020) to AI-driven automated broadcasting.