
VRA portfolio

Tracy Lownsberry / Annuity Giants

Voice agent + AI UGC pipeline

voice-agent · 11labs · hey-gen · claude · meta-ads · looper

scrape (Meta Ad Library) → script (scheduled Claude) → render (11 Labs + Hey Gen) → ship (Meta Ads + Conversion API)

Problem we walked into

A single advisor brand needed to ship more ad creative than a human team can plausibly produce — and stay in the voice + face of one specific person (Tracy). Manual scripting + manual recording = a ceiling. The ceiling was the problem.

What we built

End-to-end AI UGC pipeline:

  1. Weekly Meta Ad Library scrape — competitor ads pulled, components extracted into a library.
  2. Claude generates fresh AI UGC scripts — fed by the component library + a brand-voice prompt.
  3. 11 Labs + Hey Gen render — Tracy's digital twin (voice + face) speaks the script.
  4. LOOPER bridge — a scheduled Claude job writes a state file; the local AI UGC pipeline reads it; renders run on rotation.
  5. Deploy as ad creative — straight into Meta with conversion tracking.

Tech under the hood

Anthropic SDK (scheduled jobs + tool use), 11 Labs voice cloning, Hey Gen digital twin, Meta Ad Library API, custom Python orchestration, a state-file pattern (we call it LOOPER) that bridges scheduled cloud agents to local rendering boxes.
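The component-extraction half of the weekly scrape can be sketched roughly as follows. The heuristics (first line = hook, imperative last line = CTA) and all function names here are illustrative assumptions, not the production extractor:

```python
import re
from collections import defaultdict

def extract_components(ad_text: str) -> dict:
    """Split one scraped ad into rough components: hook, body, CTA."""
    lines = [l.strip() for l in ad_text.strip().splitlines() if l.strip()]
    if not lines:
        return {"hook": "", "body": "", "cta": ""}
    hook = lines[0]
    body_lines = lines[1:]
    cta = ""
    # crude CTA detection: an imperative-sounding final line
    if body_lines and re.match(r"(?i)^(get|call|book|learn|click|sign)", body_lines[-1]):
        cta = body_lines.pop()
    return {"hook": hook, "body": " ".join(body_lines), "cta": cta}

def build_library(ads: list[str]) -> dict:
    """Group components by type so Claude can be prompted with pools of hooks, bodies, and CTAs."""
    library = defaultdict(list)
    for ad in ads:
        for kind, text in extract_components(ad).items():
            if text:
                library[kind].append(text)
    return dict(library)
```

The resulting pools are what get fed to the Claude scripting step, alongside the brand-voice prompt.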

Why this maps to Signal

11 Labs + Hey Gen + Claude is Signal's voice + digital-twin stack. The LOOPER pattern (scheduled Claude → state file → local pipeline) is structurally what Anthropic Managed Agents will look like in production — same shape, different host.