Claude Just Took Over the Video Editing Booth — And It's Actually Good

Nate Herk just dropped a full walkthrough of a Claude Code pipeline that trims, animates, and renders video end-to-end. As someone who spent two decades doing this manually, I had feelings.

Madison
4 min read · Apr 24, 2026 · Summarizing Nate Herk

I started editing video when I was around eleven years old. That’s not a humble brag — it’s context for why watching Nate Herk’s latest demo hit differently than most AI content I come across.

Nate released a ~28-minute walkthrough this week showing how Claude Code can now handle the entire video editing pipeline: trim filler words and dead air from a raw recording, add motion graphics, sync animations to spoken timestamps, and render a final cut, all from a single natural-language prompt. No timeline scrubbing. No clicking through Adobe Premiere. Just you, a raw file, and Claude.

AI video editing has crossed a threshold: it’s no longer about replacing a single step in the workflow. Claude can now orchestrate the whole thing — and the output is actually worth publishing.

What the Stack Actually Looks Like

Nate’s setup uses three tools in concert:

  • Claude Code as the orchestrator (available in the Claude desktop app or VS Code)
  • Video Use — a new open-source tool that handles transcription, trimming, and filler-word removal
  • Hyperframes — a motion graphics layer that takes the trimmed output and adds animated overlays, cards, and dynamic elements

The workflow Nate demonstrates goes like this: drop a raw video into your Claude project, reference it with the @ command, and tell Claude what you want. It transcribes the clip using Whisper (or a free local option), removes retakes and silence, generates timestamps tied to every spoken word, and hands the trimmed output to Hyperframes for animation.
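To make the trim step concrete, here's a minimal sketch of what that kind of cut amounts to, assuming openai-whisper and ffmpeg are installed. Video Use's actual internals may differ; the file names, filler list, and pause threshold here are my own illustrative choices.

    import subprocess
    import whisper  # pip install openai-whisper; also needs ffmpeg on PATH

    # Illustrative filler list; Video Use's real heuristics are surely richer.
    FILLERS = {"um", "uh", "hmm"}

    model = whisper.load_model("base")
    result = model.transcribe("raw_clip.mp4", word_timestamps=True)

    # Collect the time range of every word worth keeping.
    keep = []
    for segment in result["segments"]:
        for word in segment["words"]:
            token = word["word"].strip().lower().strip(".,!?")
            if token not in FILLERS:
                keep.append((word["start"], word["end"]))
    assert keep, "no speech detected"

    # Merge ranges separated by short pauses; anything longer is dead air.
    merged = [list(keep[0])]
    for start, end in keep[1:]:
        if start - merged[-1][1] < 0.5:  # assumed pause threshold, in seconds
            merged[-1][1] = end
        else:
            merged.append([start, end])

    # Keep only the merged ranges, resetting timestamps so the cut plays cleanly.
    expr = "+".join(f"between(t,{s:.2f},{e:.2f})" for s, e in merged)
    subprocess.run([
        "ffmpeg", "-y", "-i", "raw_clip.mp4",
        "-vf", f"select='{expr}',setpts=N/FRAME_RATE/TB",
        "-af", f"aselect='{expr}',asetpts=N/SR/TB",
        "trimmed_clip.mp4",
    ], check=True)

Those word-level timestamps aren't just for trimming: they're the same data that later drives the animation sync, which is why the transcription step carries so much weight in the pipeline.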

The result in his demo? A 50-second raw clip turned into a 27-second polished piece with motion graphics that actually sync to the words on screen.

The Part That Got My Attention

Nate makes an important distinction between two animation engines: Remotion (built into Video Use) and Hyperframes (HTML-based, newer). Both work. He prefers Hyperframes because the animations feel more sophisticated — liquid glass cards, better timing, more visual polish.

I watched both outputs side by side in his video. He’s right. The Hyperframes version looks like something a motion designer spent real time on. The Remotion version looks like what AI video editing looked like six months ago: functional, but you can feel the seams.

What strikes me is that neither of these tools existed in their current form a year ago, and Claude Code wasn’t something most people outside the developer world were using as a creative production tool. That gap closed fast.

What This Means If You’ve Been Editing Manually

I’ve spent a lot of time in Premiere, in DaVinci, in whatever timeline editor was the tool of the moment. The part that always took the longest wasn’t the creative decisions — it was the mechanical work. Scrubbing through a raw recording looking for the cleaner take. Marking in and out points. Nudging clips by a few frames so the pacing felt right.

That’s the part that’s now automated.

Nate is transparent that it’s not perfect out of the box. He uses the analogy of teaching a kid to ride a bike — you have to hold the handlebars at first, correct the wobbles, and build up to the point where it just goes. That framing is honest and useful. This isn’t a one-click magic button; it’s a pipeline you set up once and then tune over time as Claude learns your style.
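Nate doesn't show his exact configuration, but in Claude Code the natural home for that accumulated tuning is the project's CLAUDE.md file. Here's a sketch of what style notes might look like after a few rounds of correcting the wobbles (every rule below is invented for illustration):

    # CLAUDE.md (editing preferences)
    - Cut silences longer than 0.7 s; leave shorter, natural pauses alone.
    - Never cut mid-sentence; prefer keeping a breath to a jump cut.
    - When a retake exists, keep the last take unless I say otherwise.
    - Title cards: lower-third style, shown on first mention of a product name.
    - Target pacing: final cut around 50-60% of the raw runtime.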

But here’s what’s changed: the tuning phase is now the hard part, not the editing itself. That’s a meaningful inversion.

The Fully Automated Version

Nate also gestures at the end-state: pair this with a HeyGen avatar, and you can remove the human recording step entirely. Drop in a script, generate the avatar clip, run it through the Video Use + Hyperframes pipeline, and get a finished video with zero time in front of a camera or a timeline.
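I haven't built this myself, but structurally it's a three-stage handoff. A hypothetical sketch, where all three inner functions are stand-ins I made up for the HeyGen, Video Use, and Hyperframes steps:

    from pathlib import Path

    def generate_avatar_clip(script: str, out: Path) -> Path:
        """Stand-in for the HeyGen step: script in, talking-head clip out."""
        raise NotImplementedError("wire this to your avatar provider")

    def trim_clip(raw: Path, out: Path) -> Path:
        """Stand-in for the Video Use step: remove fillers and dead air."""
        raise NotImplementedError("hand this to the Video Use pipeline")

    def animate_clip(trimmed: Path, out: Path) -> Path:
        """Stand-in for the Hyperframes step: add synced motion graphics."""
        raise NotImplementedError("hand this to Hyperframes")

    def script_to_published_video(script_path: Path, out_dir: Path) -> Path:
        script = script_path.read_text()
        raw = generate_avatar_clip(script, out_dir / "raw.mp4")
        trimmed = trim_clip(raw, out_dir / "trimmed.mp4")
        return animate_clip(trimmed, out_dir / "final.mp4")

The code itself isn't the point; the point is that no stage in the chain requires a human touching a camera or a timeline.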

He’s made separate videos on that flow and chooses not to use it for his YouTube channel because he wants to keep things real. I respect that call. But for course content, product demos, explainers, onboarding videos — the use cases where the message matters more than the personality — that full automation story is already here.

Getting Started

If you want to replicate Nate’s setup:

  1. Install the Claude desktop app and make sure you’re on a paid plan with Claude Code access
  2. Clone the Hyperframes and Video Use repos into a project folder (or grab Nate’s student kit from his free School community)
  3. Open a Claude Code session, paste both repo URLs, and prompt Claude to set up the full pipeline
  4. Drop in a raw video file, reference it with @, and ask Claude to trim and animate (see the example prompt below)
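The wording below is mine, not Nate's, and the file names are placeholders, but the first prompt can be as plain as:

    @raw_demo.mp4 Transcribe this clip, cut the retakes, filler words, and
    dead air, then run the trimmed cut through Hyperframes with animated
    title cards synced to the key phrases. Render the result as final.mp4.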

The barrier here is real: you need a paid Claude plan and some comfort with setting up a code project. But Nate walks through it in detail and makes a point of showing the desktop app path for people who find VS Code intimidating.

The Bottom Line

This is the video editing workflow I’ve been watching for. Not because I want to hand off creativity to an AI, but because the mechanical work of editing — the stuff that used to eat an afternoon — is no longer the bottleneck. Claude Code is now good enough at orchestrating tools like Video Use and Hyperframes that the output clears a publishable bar on the first pass.

That changes what’s possible for anyone who creates video content and has been losing time to the timeline.

Tags: AI · Claude Code · video editing · AI automation · Hyperframes · Video Use · motion graphics · AI tools · content creation · Nate Herk