Problem we walked into
A single advisor brand needed to ship more ad creative than a human team could plausibly produce — and stay in the voice + face of one specific person (Tracy). Manual scripting + manual recording = a ceiling. The ceiling was the problem.
What we built
End-to-end AI UGC pipeline:
- Weekly Meta Ad Library scrape — competitor ads pulled, components extracted into a library (scrape sketch after this list).
- Claude generates fresh AI UGC scripts — fed by the component library + brand voice prompt (generation sketch after this list).
- ElevenLabs + HeyGen render — Tracy's digital twin (voice + face) speaks the script.
- LOOPER bridge — scheduled Claude writes a state file; the local AI UGC pipeline reads it; renders run on rotation (state-file sketch after this list).
- Deploy as ad creative — straight into Meta with conversion tracking.
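A minimal sketch of the weekly scrape step, assuming a Graph API token with Ad Library access. The endpoint and field names come from Meta's public `ads_archive` API; the `extract_components` splitter is a hypothetical stand-in for the real component extraction.

```python
import requests

AD_ARCHIVE_URL = "https://graph.facebook.com/v19.0/ads_archive"

def scrape_competitor_ads(access_token: str, search_terms: str) -> list[dict]:
    """Pull active competitor ads from the Meta Ad Library."""
    params = {
        "access_token": access_token,
        "search_terms": search_terms,
        "ad_reached_countries": '["US"]',
        "ad_active_status": "ACTIVE",
        "fields": "ad_creative_bodies,ad_creative_link_titles,page_name",
        "limit": 100,
    }
    resp = requests.get(AD_ARCHIVE_URL, params=params, timeout=30)
    resp.raise_for_status()
    return resp.json().get("data", [])

def extract_components(ads: list[dict]) -> list[dict]:
    """Break each ad into reusable script components (hook, body, source)."""
    # Hypothetical splitter: the production extraction is richer than this.
    components = []
    for ad in ads:
        for body in ad.get("ad_creative_bodies", []):
            hook, _, rest = body.partition(". ")
            components.append({"hook": hook, "body": rest, "source": ad.get("page_name")})
    return components
```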
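The script-generation step, sketched with the Anthropic Python SDK. The prompt wording, prompt file path, and the 20-component cap are assumptions, not the production prompt.

```python
import json
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

BRAND_VOICE = open("prompts/tracy_voice.md").read()  # hypothetical path

def generate_script(components: list[dict]) -> str:
    """Ask Claude for a fresh UGC script grounded in the component library."""
    msg = client.messages.create(
        model="claude-sonnet-4-20250514",
        max_tokens=1024,
        system=BRAND_VOICE,  # brand voice prompt keeps it in Tracy's register
        messages=[{
            "role": "user",
            "content": (
                "Write a 45-second UGC ad script in Tracy's voice. "
                "Recombine ideas from these competitor components without copying them:\n"
                + json.dumps(components[:20], indent=2)
            ),
        }],
    )
    return msg.content[0].text
```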
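LOOPER is our own pattern, so any code here is a sketch by definition: the scheduled cloud agent appends jobs to a JSON state file, and the local render box polls it and claims the oldest pending job. File name and fields are illustrative.

```python
import json
import time
from pathlib import Path

STATE_FILE = Path("looper_state.json")  # synced between cloud and render box

def push_job(script: str) -> None:
    """Cloud side: scheduled Claude appends a render job to the state file."""
    state = json.loads(STATE_FILE.read_text()) if STATE_FILE.exists() else {"jobs": []}
    state["jobs"].append({"script": script, "status": "pending", "ts": time.time()})
    STATE_FILE.write_text(json.dumps(state, indent=2))

def claim_next_job() -> dict | None:
    """Local side: the render pipeline picks up the oldest pending job."""
    if not STATE_FILE.exists():
        return None
    state = json.loads(STATE_FILE.read_text())
    for job in state["jobs"]:  # jobs are appended in order, so oldest first
        if job["status"] == "pending":
            job["status"] = "rendering"
            STATE_FILE.write_text(json.dumps(state, indent=2))
            return job
    return None
```

A file beats a message queue here because the two halves never share a network: the cloud agent and the render box only need a synced folder.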
Tech under the hood
Anthropic SDK (scheduled jobs + tool use), ElevenLabs voice cloning, HeyGen digital twin, Meta Ad Library API, custom Python orchestration, and a state-file pattern (we call it LOOPER) that bridges scheduled cloud agents to local rendering boxes.
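The render half of the stack, sketched against the public REST endpoints. The ElevenLabs text-to-speech call is the documented API; the HeyGen payload is a simplification of their video-generate endpoint, and the voice/avatar IDs are placeholders.

```python
import requests

ELEVEN_KEY, HEYGEN_KEY = "...", "..."  # loaded from env in production
TRACY_VOICE_ID = "voice_xxx"           # placeholder ElevenLabs voice-clone ID
TRACY_AVATAR_ID = "avatar_xxx"         # placeholder HeyGen digital-twin ID

def render_voice(script: str) -> bytes:
    """ElevenLabs: speak the script in Tracy's cloned voice."""
    resp = requests.post(
        f"https://api.elevenlabs.io/v1/text-to-speech/{TRACY_VOICE_ID}",
        headers={"xi-api-key": ELEVEN_KEY},
        json={"text": script, "model_id": "eleven_multilingual_v2"},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.content  # mp3 audio bytes

def render_video(audio_url: str) -> str:
    """HeyGen: drive Tracy's digital twin with the rendered audio."""
    resp = requests.post(
        "https://api.heygen.com/v2/video/generate",
        headers={"X-Api-Key": HEYGEN_KEY},
        json={"video_inputs": [{
            "character": {"type": "avatar", "avatar_id": TRACY_AVATAR_ID},
            "voice": {"type": "audio", "audio_url": audio_url},
        }]},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["data"]["video_id"]  # poll this ID for the finished render
```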
Why this maps to Signal
ElevenLabs + HeyGen + Claude is Signal's voice + digital-twin stack. The LOOPER pattern (scheduled Claude → state file → local pipeline) is structurally what Anthropic Managed Agents will look like in production — same shape, different host.