Modern Creator Network
Nate Herk | AI Automation · YouTube · 35:27

Higgsfield Just Turned Claude Into a Creative Agency

Nate Herk wires the Higgsfield CLI into Claude and Claude Code, then runs a one-prompt creative agency that researches a brand, generates the catalog, fills a Google Sheet of ad variants, and ships hyper-motion video on a routine.

Posted
1 week ago
Duration
35:27
Format
Tutorial · educational
Channel
Nate Herk | AI Automation
§ 01 · The Hook

The bait, then the rug-pull.

Nate Herk opens with the promise stated as a fait accompli: Higgsfield has every state-of-the-art image and video model, Claude knows how to talk to it, and together they ideate and generate 'a hundred times faster than the average human could.' Then he immediately cuts to a wall of finished ads — headphone halos, sleep-supplement bottles, hyper-motion launch videos — generated 'in literally five minutes with one prompt.' Promise + proof, in the first thirty seconds.

§ · Stated Promise

What the video promised.

Stated at 00:12: 'When we combine these tools, we're able to actually scale up our content because we can ideate and we can generate a hundred times faster than the average human could. So in today's video, I'm gonna show you guys how we're able to turn Claude into a creative agency.' Delivered at 31:00.
§ · Chapters

Where the time goes.

00:00–00:56

01 · Cold open + promise wall

Promise stated (Claude + Higgsfield = scale-up creative agency) and proven by walling up five-minute outputs: Murmur headphone ads, Sleep Support bottle ads, hyper-motion videos. Sets the bait: 'all of this from one prompt.'

00:56–02:47

02 · Connect Higgsfield to Claude.ai (custom connector / MCP)

Walks through claude.ai → Settings → Connectors → Add custom connector. Paste the Higgsfield MCP URL, OAuth in, set permissions. Demonstrates the connector is live and ready to prompt against.

02:47–04:50

03 · One-prompt brand generation in Claude.ai

Single prompt: 'Build me a headphone brand from scratch — do research, branding, catalog, and for each product generate a product photo, IG ad, and UGC video via Higgsfield MCP.' Claude returns a brand called Murmur with positioning, target buyer, voice, visual identity, and three SKUs (over-ear Halo, wireless earbuds Drift, open-back wired).

04:50–06:20

04 · Review the auto-generated catalog + iterate inline

Walks the Halo / Drift / open-back outputs: product photos, Instagram ads, UGC videos. Shows how to fix mistakes by replying in-thread ('two headers — remove one'), how 'Animate' takes the same prompt graph into a video, and what the video model gets right (realism) vs wrong (duplicated text).

06:20–07:20

05 · Marketing Studio: hyper-motion launch video

Asks Claude to use Higgsfield's Marketing Studio to make a launch video for the Halo. First pass is too quiet/intimate; second pass with 'hyper-motion variant, 16x9, more engaging' lands the cinematic ad seen in the intro. Brief detour about a 'sensitive content' block and how to debug the prompt with Claude.

07:20–09:15

06 · Second product run: Sleep Support from a reference image

Drops in an existing product photo (blue Sleep Support bottle), asks for Instagram-ready ads. First pass loses the on-bottle text — lesson surfaced: 'be more specific about telling it not to change the reference image.' Iterates to ads with headlines like 'Asleep in 12 minutes' and 'Stop counting sheep — start sleeping through the night.'

09:15–10:13

07 · Pivot to Claude Code (desktop) for power features

Why move off claude.ai: more control, reusable skills, true automations. Important nuance — not the CLI terminal, the Claude Code desktop app: still has chat + project, with skills/files/routines layered on top. Sets the scene for the Claude Code build-out.

10:13–12:00

08 · Install the Higgsfield CLI + agent skills

Open a blank folder ('Higgsfield Studio'). Grab three commands from Higgsfield > MCP & CLI: install CLI, run hf auth (browser OAuth), install Higgsfield agent skills. Paste all three into Claude Code in one prompt, let it run. Side-explainer: the CLI beats MCP on token cost and agent efficiency.
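The three commands themselves aren't reproduced in this summary — in the video they're copied verbatim from Higgsfield > MCP & CLI. Only `hf auth` is named on screen, so the other two lines below are placeholders for whatever that page actually lists, not real commands:

```shell
# 1. Install the Higgsfield CLI — placeholder; copy the real command
#    from Higgsfield > MCP & CLI.
<install command from the MCP & CLI page>

# 2. Authenticate (named in the video) — opens a browser OAuth flow.
hf auth

# 3. Install the Higgsfield agent skills — placeholder; the third
#    command on the same page.
<agent-skills install command from the MCP & CLI page>
```

Pasting all three into one Claude Code prompt (as the video does) lets the agent run them in order and handle the OAuth hand-off itself.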

12:00–13:45

09 · Bring in outside expertise as a research markdown

Pre-built advertising_masterclass.md (617 lines) via a deep-research prompt: best organic ad strategies for 2026 across TikTok / Meta / X, what captures attention, what converts, platform differences. Lives in the project so agents reference it when ideating. Joe-relevant: 'utilize other people's expertise' — swipe-file thinking.

13:45–17:55

10 · Master Sheet: log every Higgsfield generation via GWS CLI

Asks Claude to read all 45 assets from the Higgsfield account and write them into a Google Sheet (tabs: generations, by product, by style, planning). Builds a creative-ops database — product, style, image/video, model, prompt, status, result URL, job ID. GWS CLI is the unlock: lets the agent move across Sheets/Docs/Gmail/Calendar/Drive without MCP overhead.
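The tab and column scheme above can be sketched as plain Python before it ever touches Sheets — useful for seeing why the status column and job ID matter. Column names follow the video; the two sample rows, model names' placement, and URLs are invented for illustration:

```python
from collections import Counter

# Two invented rows using the video's column scheme:
# product, style, image/video, model, prompt, status, result URL, job ID.
generations = [
    {"product": "Halo", "style": "hyper-motion", "kind": "video",
     "model": "marketing-studio", "prompt": "…", "status": "complete",
     "result_url": "https://example.com/halo-01", "job_id": "job-001"},
    {"product": "Sleep Support", "style": "cinematic", "kind": "image",
     "model": "nano-banana-2", "prompt": "…", "status": "",
     "result_url": "", "job_id": "job-002"},
]

def pending(rows):
    """Blank status = not yet generated; this is what a routine picks up."""
    return [r for r in rows if not r["status"]]

def by_product(rows):
    """Counts for a 'by product' tab."""
    return Counter(r["product"] for r in rows)

print([r["job_id"] for r in pending(generations)])  # ['job-002']
print(dict(by_product(generations)))
```

The same two queries are what the later routines lean on: blank status tells the agent what to generate next, and the per-product rollup feeds the analysis tabs.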

17:55–20:35

11 · Test matrix: 30+ variants ideated from masterclass + data

@-tags advertising_masterclass.md and the existing generations, asks Claude to mix-and-match variables (header, style, content type) into testable variants. Sheet gets a new 'creative slate' tab with priority-ranked ideas across products. Frames the philosophy: 'we’re not the bottleneck on creativity or production anymore.'
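The mix-and-match step is essentially a cross product over creative variables. A minimal sketch — the variable values below are examples in the spirit of the video (the header angles echo the later batch), not its exact lists:

```python
from itertools import product

# Variables to cross; values are illustrative.
headers = ["curiosity", "contrarian", "pattern interrupt", "stat flash"]
styles = ["cinematic", "hyper-motion", "UGC"]
content_types = ["image", "video"]

# One slate row per combination, blank status so a later run picks it up.
slate = [
    {"header": h, "style": s, "content_type": c, "status": ""}
    for h, s, c in product(headers, styles, content_types)
]

print(len(slate))  # 4 * 3 * 2 = 24 testable variants
```

This is why the variant count explodes so quickly: every new variable value multiplies the slate, and the agent, not you, does the generating.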

20:35–23:30

12 · Generate rows 3–7 + add status tracking

'Create prompts for rows 3-7, add a status column, generate them in Higgsfield, then mark them complete on the sheet.' Sponsor break for Glido (Nate is joining the Glido team, switched from Whisper). When the batch returns, the Sleep Support bottle drifts off-brand because the reference wasn’t locked.

23:30–25:30

13 · Lock the brand asset + regenerate

Drags the canonical Sleep Support product image directly into Claude Code: 'every advertisement must show the product exactly like this — same color, same text, don’t change anything.' Regenerates. New batch comes back consistent and on-brand, mixing nano-banana-2 and gpt-image-2 across angles (curiosity, contrarian, pattern interrupt, stat flash).

25:30–28:36

14 · Skills: reverse-engineer a recipe from a winning prompt

Definition: a skill is a recipe for an agent (the pancake analogy). Workflow: find a generation you love, copy its exact prompt back into Claude, say 'turn this prompt into a skill that lives in .claude/skills so anytime I ask for a hyper-motion video you use this.' Highlights the meta-loop: bake learned constraints (e.g. words that get flagged for sensitive content) into the skill so the agent improves over time.
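Claude Code skills live as SKILL.md files with YAML frontmatter under .claude/skills. The file below is an illustrative sketch of what the video's hyper-motion recipe might look like — the name, recipe steps, and wording are distilled from the video, not the actual file:

```markdown
---
name: hypermotion-video
description: >
  Use this skill whenever the user asks for a hyper-motion product video.
  Generate via the Higgsfield CLI using the locked winning prompt.
---

# HyperMotion Video (sketch)

1. Ask one clarifying question first: model in the ad, UGC, or product only?
2. Lock the @-tagged reference image — same colors, same label text,
   change nothing.
3. Reuse the exact winning prompt; 16x9, hyper-motion variant.
4. Avoid words that previously triggered "sensitive content" blocks
   (keep the list here and append to it after each flagged run).
```

Step 4 is the meta-loop the chapter describes: every learned constraint gets baked back into the recipe so the next run starts smarter.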

28:36–31:00

15 · Invoke the new skill on the locked reference

@-tags the saved Sleep Support reference plus tries /hypermotion. First run uses the wrong skill (default higgsfield_generate). Restarts the Claude Code session; new run reads the .claude/skills/HyperMotion-Video file, asks the right clarifying question ('model in the ad, UGC, or product only?'), and produces the cinematic hyper-motion clip he wanted.

31:00–33:20

16 · Output review + honest critique

Output is strong — cinematic feel, real-looking product — but the label text is mangled in image-to-video. Nate names the limitation directly: 'this is the worst AI video generation will ever be,' suggests a workaround (simpler label for hero shots), and pushes back on knee-jerk 'AI slop' framing.

33:20–35:27

17 · Routines: schedule the agency to run while you sleep

Claude Code Routines inject prompts on a cadence. The proposed pipeline: Sunday night routine — analyze the Sheet + platform data, add 50 new generation ideas. Monday morning routine — pick 30 blank-status rows, generate, mark complete. Scale to twice-weekly planning + generation. Optional final step: pipe winners into Potato or Meta Ads Manager for full auto-posting. CTA: like + watch the routines deep-dive.

§ · Storyboard

Visual structure at a glance.

00:00 · hook · promise wall
00:29 · proof · halo headphone proof
00:48 · proof · sleep video proof
01:29 · value · MCP connector setup
02:19 · value · build a brand prompt
03:50 · value · Murmur catalog
06:20 · value · hyper-motion ad
08:00 · value · Higgsfield tool call
09:00 · value · rendered sleep ad
10:10 · value · Claude Code + routines
§ · Frameworks

Named ideas worth stealing.

10:45 · concept

MCP vs CLI for Agents

MCP exposes every tool by default — token-heavy. The CLI is the lean alternative: same capabilities, faster, cheaper, better for agents that loop. Default to CLI when wiring tools into Claude Code; reserve MCP for ad-hoc claude.ai exploration.

Steal for: any agent project where token cost compounds across iterations — Joe's Trigger.dev tasks, Paperclip orchestration
23:50 · concept

Skill = Recipe (pancake analogy)

A skill is to an AI agent what a recipe is to a cook: lock the inputs, ordering, and constraints so output stays consistent run-over-run. Bake in negative constraints too (words that triggered moderation flags, brand assets that must lock).

Steal for: Joe's .claude/skills layer in MCN — codify Killing Excuses shot recipes, Mod Producer runsheet templates, ad creative voice rules
25:05 · model

Reverse-engineer a skill from a winning prompt

  1. Generate 5+ variants of a creative
  2. Pick the 1–3 outputs you actually love
  3. Copy the exact prompt that produced them
  4. Paste back into a fresh chat and say: 'turn this into a skill in .claude/skills so anytime I ask for X, use this'
  5. Iterate the skill every run — tell it what you liked/didn't and have it self-update

Don’t write skills from scratch — mine them from outputs that already worked. Outputs first, recipes second.

Steal for: every winning ad / video / email in MCN should be reverse-engineered into a skill the next agent run can replay
12:18 · concept

Bring outside expertise into the project as Markdown

Don’t expect the base model to be a master copywriter. Run a deep-research prompt against Twitter threads / YouTube / Perplexity / books, save the result as `advertising_masterclass.md` in the project, then @-tag it whenever you ideate. The model now has subject-matter expertise on tap.

Steal for: Joe's Maria Wendt 255-email swipe file, HTSS notes, Hoffmann's Reward Funnel — land them as project-local .md files for every MCN content agent
15:20 · model

Master Sheet + status column = creative ops database

  1. Tabs: generations, by-product, by-style, planning, creative slate
  2. Columns: product, style, image/video, model, prompt, status, result URL, job ID
  3. Status drives which rows the next routine picks up
  4. Sheet is queryable by the agent for analysis later

One source of truth makes the rest of the automation possible. The status column is the glue — it’s what stops agents from duplicating work across routine runs.

Steal for: Joe's Content Forge meme/quote pipeline, Mod Producer runsheets — a Sheet with a status column beats any custom DB for v1
33:45 · model

Routines: ideate Sunday → generate Monday

  1. Sunday 9pm — 'analyze sheet + platform data, add 50 new generation ideas with empty status'
  2. Monday 8am — 'pick 30 blank-status rows, generate prompts + assets, mark complete'
  3. Add Thursday/Friday as a second cycle to double throughput
  4. Optional: pipe completed assets to Potato or Meta Ads Manager for posting

Two cron-style prompts and you have an autonomous ad-generation loop. The cadence (Sun plan / Mon execute) prevents the agent from generating against stale ideas.

Steal for: JACE / REESE routines on the MCN side — weekly content planning + execution cycles for Killing Excuses and Notes to Myself
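The Sunday-plan / Monday-execute loop hinges entirely on the status column. Here is a sketch of the Monday pass in Python — in the video this is a natural-language Routine prompt, not code, and the rows below are invented:

```python
def run_generation_pass(rows, batch_size=30):
    """Pick up to batch_size blank-status rows, generate, mark complete."""
    picked = [r for r in rows if not r["status"]][:batch_size]
    for row in picked:
        # ...call the Higgsfield CLI / generation step here...
        row["status"] = "complete"
    return len(picked)

# 50 ideas added Sunday night, all blank status.
sheet = [{"idea": f"variant {i}", "status": ""} for i in range(50)]

print(run_generation_pass(sheet))                # 30 generated Monday
print(sum(1 for r in sheet if not r["status"]))  # 20 left for the next cycle
```

Because completed rows are marked in place, a second routine run can never double-generate — which is exactly the "status column is the glue" point above.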
§ · Quotables

Lines you could clip.

00:12
We can ideate and we can generate a hundred times faster than the average human could.
tight thesis statement, lands the promise of agent + media-gen stack in one line · TikTok hook
23:31
We're pulling the lever on the slot machine, which is AI. If we don't have guidelines, if we don't have recipes — skills — then they're not gonna be super consistent.
names the core problem and the fix in one breath · IG reel cold open
23:51
A skill is essentially a recipe for an AI agent.
best one-line definition of an agent skill in the wild · newsletter pull-quote
32:05
This is the worst that AI video generation models will ever be. Every day, every month, they're going to get better.
reframe line — disarms 'AI slop' critiques without arguing · TikTok hook
20:32
I could set an agent off to generate all this stuff, and then I could go to bed, and I could wake up with a hundred different ad copies and creatives ready to go.
concrete, visual outcome — 'go to bed, wake up with 100 ads' is the dream-state for the audience · IG reel cold open
19:46
We're not the bottleneck on creativity, and we're also not the bottleneck on production.
names the unlock in seven words · newsletter pull-quote
§ · Pacing

How they spent the runtime.

Hook length · 56s
Info density · high
Filler · 8%
Sponsors
  • 20:33–21:05 · Glido (voice dictation — Nate disclosed he joined the team)
§ · Resources Mentioned

Things they pointed at.

05:57 · tool · Higgsfield Marketing Studio (hyper-motion / UGC / unboxing templates)
10:15 · tool · Higgsfield CLI + Agent Skills (for Claude Code)
15:25 · tool · Google Workspace CLI (GWS CLI) — Sheets/Docs/Gmail/Calendar/Drive for agents
12:18 · tool · advertising_masterclass.md — project-local deep-research swipe file
20:33 · product · Glido (voice dictation, sponsor)
23:50 · tool · .claude/skills folder — project-scoped Claude Code skill recipes
33:20 · tool · Claude Code Routines (scheduled prompts)
34:55 · tool · Potato — social scheduling / posting
34:57 · tool · Meta Ads Manager — final delivery destination
§ · CTA Breakdown

How they asked for the click.

34:59 · next-video
If you guys wanna dive a little bit deeper into routines, I'll tag a full video right here where I dive into how you set them up… if you guys enjoyed, please give it a like.

Soft CTA — anchors on the next-video tag for routines and a like ask. No newsletter push, no product (other than the embedded Glido sponsor at 20:33). Trade-off: zero conversion pressure, but also no lead capture from a 35-min watch.

§ · The Script

Word for word.

Tag legend: HOOK (opening / re-engagement) · CTA (the pitch) · metaphor · analogy · story
00:00 · HOOK · So Higgsfield has access to all of the best AI image and video generation models. And Claude lets us talk to Higgsfield and build custom skills and agents and schedule all of these automations to run while we sleep. And when we combine these tools, we're able to actually scale up our content because we can ideate and we can generate a hundred times faster than the average human could. So in today's video, I'm gonna show you guys how we're able to turn Claude into a creative agency. So real quick, let me show you guys a few examples that I was able to generate in literally five minutes with one prompt.
00:56 · I mean, those are incredible. I think that this second one's probably my favorite. Like, the zoom in on the product, the detail, all of the animations in the background, the music. I mean, this is incredible. Even this detail here. Think about how long this would have taken you if you either wanted to edit this by hand or, you know, shoot this with a studio and with a paid actress. It would have taken so much more time and resources.
01:19 · And like I said, I was able to generate all of those outputs just by talking to Claude with a prompt that looked something like this. So before I show you guys these conversations and exactly how I did it, let me show you how we connect Higgsfield to Claude. So if you don't have a Higgsfield account, go to higgsfield.ai and you can sign up. You will have to get on some sort of subscription,
01:36 · but once you're there, you basically wanna come to this page that says MCP and CLI. And we're gonna first of all connect this to Claude in the web. This is just your typical Claude chat that you've probably been using for months now. You're gonna go to the settings and you're going to click on connectors and you're gonna have to add a custom connector. So down here you can see that I have Higgsfield as a custom connector. So go ahead and click add custom connector, and you're basically just gonna call this, you know, Higgsfield or if you wanna call it something else, I don't really know why you would. And then you're going to copy this command right here. And back in Claude, you're just gonna paste that in, hit add.
02:07 · Mine says it's not gonna work because I've already done that, but yours will connect. And then basically it'll prompt you to sign in. So you'll hit configure. It'll take you to the Higgsfield OAuth. You'll sign in with the account that you just created, and then you will now have Higgsfield ready to go. And then if you wanna click on configure, you can basically change the permissions. So if you only want it to be able to do certain things in your Higgsfield account, you can limit it to do that, or you could come in here and you could just say, hey, always allow all of these. I don't really care. So however you wanna get that set up. And now whenever you're in a Claude chat, if you go over here and you look at your connectors, you can see that Higgsfield is right there and you can just prompt it to create you images with that or videos with that or whatever you wanna do. Now it will be able to actually talk to Higgsfield as you can see. So just to start off and show you guys a few quick examples,
02:49 · what I said here is build me a headphone brand from scratch. I want you to do research, build the branding, build the product catalog, and for each of them, I want you to generate assets. So a product photo, an Instagram ad, and a UGC video, and I told it to use the Higgsfield MCP for all of these generations. So yes, what we could have done is done all this research on our own or taken all this research from Claude, put it in a Google Doc, and then went over to Higgsfield on our own and found all these different AI image and video models and just generated all of this stuff in the interface. We could have done that. But in today's video, I'm showing you guys how you can essentially just treat Claude Code or Claude as the interface to do all this stuff in a more consistent and repeatable and automated fashion that's really gonna help you scale much more than if you were to do this all manually. So as you can see, it does the research, it looked at the market, and now it's helped me build a brand called Murmur.
03:36 · And we have stuff like positioning, target buyer, voice, visual identity, and all this stuff. And it created three different products. So an over the ear, which is the most expensive. We've got wireless earbuds, and we have open back wired headphones once again. And then it just goes ahead and generates all the stuff. So all these images and videos that you're about to see were just minimal input. It was literally just, hey, build me a brand and build these things. And now we have this first photo of our Halo product, which looks very nice. We've got this Instagram ad, which came out a little bit duplicated with the text for some reason. But if we wanted to fix that, all we'd have to do is say, okay, edit. And then we could give it a prompt because it knows the exact reference image and it knows what we're talking about. And I could just say, hey, you know, we have two different headers in here, remove one of them. So you could iterate really quick on that there. And you could also click animate. So it's basically the same type of prompt and we could say, hey, turn this into a video where, you know, the headphones are spinning in a circle and floating in the air, whatever we wanna do. So that's the ad, and then we have this video which is basically just a person
04:36 · listening to music, wearing a hood, and smiling at the camera. So you can see though that video looks really good. Like, the person looks very real. So we've got our second one. We've got the picture right here. We've got the ad right here. That one looks much better actually. And then similarly, we have this video with music of a person putting one of them in their ear and then smiling at the camera. And then we have the exact same thing for our final product. We have an image. We have an ad, and then we have a video of someone using the product
05:05 · as you can see. Then I said take the Halo, which was the first one, and I said use Higgsfield's marketing studio to create a launch video. So Higgsfield in here has a really cool thing called marketing studio where you drop in a product or a link to a product and you can even put in your own custom avatar, and it basically turns it into a different type of format. So hypermotion,
05:25 · sort of like this. It could be an unboxing. You know, there's other styles. There's UGC. There's so many different things you can do in here for marketing. Like these hypermotions are super super cool. I mean, look at that. So I told it to use this marketing studio format inside of Claude right here. So it looks at the marketing studio, it found out what to do, and it comes back with this.
05:47 · As you can see, it's cool, but it's not like the hyper motion style. It's very quiet. We have that weird scene where the person kind of intimately touches the headset. Don't know what that's about. But it's fine. Right? So I said, okay, that's a little bit, you know, not what we want. I want you to use the hyper motion variant and I want you to make it more engaging and I also want a 16 by nine version. So I wanted it to capture more attention and then that's when it comes back and it gives us this one which I absolutely love. I think this one is really good.
06:17 · Okay. But then for the 16 by nine version, it said that it was sensitive content and it refunded my credits. So I said, okay, try again. It did the exact same thing. And then eventually I was able to say, why did that get denied? Show me the prompt. Figure out why that happened. It read me the prompt, and then it said, okay, I think it's because of these words, you know, all this kind of stuff, so I'm gonna get rid of that. And it does it again and then I was able to get this version which you guys also saw in the intro of this video.
06:45 · So it's not perfect, but what we're able to do is we're able to build skills and stuff around this so that this doesn't happen again where we're getting sensitive content blocks. Okay. So another quick example, I drop in an image. So let's say we already have a product that we wanna start with. I want some advertisements and stuff for that. So I ask it to, you know, make me a couple of Instagram ready ads for this product, and if it has any questions to ask, so it asked me one thing and I gave it a quick answer, and then it starts generating some stuff. So here's one picture. Now what you'll notice is it got rid of some of the words. So on the original photo, we had like some extra little
07:19 · captions and subtext right here, and it got rid of that. So we have to be more specific about telling it to not change the reference image at all. That's very important. But anyways, that's a nice picture. This one's also cool. It's a little bit more of like an effects type of picture. And then we also have this one which is kind of like an Instagram story, which isn't exactly what I was looking for, but it does look good as far as like the interface and the realism. So it said, hey, we have a calm one. We have a cinematic one, and we have like a real relatable one. Now we're able to keep iterating with what we want, but I said that's not good enough. I need these to be actual ready to go advertisements across different socials, and then it comes back. It generates some more for us. So this one has text. It says stop counting sheep. Start sleeping through the night. It has sort of that cinematic feel. That one I thought was pretty cool. We also have this one, which I loved. Asleep in ten minutes, fall asleep faster, stay asleep longer, wake up refreshed. I absolutely love this one as well. And you can see with very minimal prompting. All I said was, hey, create me Instagram ready ads. It understands, like, headlines and spacing,
08:15 · and I just thought that these looked really, really good. And then we had one more down here as well. So the main problem is that it didn't have the bottle appearing exactly as we wanted it to in the reference image, which is a pretty easy fix. Here's another one. You guys didn't see this yet because it's a lot slower paced and it's not the exact, like, HyperMotion style, but it's still a nice little animation. And then we also have this one which is
08:36 · just, you know, the person kind of looking at it. So these are decent videos, but they're not ready to go ads. So we had to be a little bit more specific again. So I said give me one that's fast paced and energetic. It has camera cuts, it has slow motion, it has close ups and then we got the one which you guys saw earlier which was right down here. It took me a few more tries and then we got this one that you guys saw in the demo with the capsules. I thought this one was really, really good. So what's cool about this is you're able to take a super vague high level idea. You can say words like engaging,
09:05 · HOOK · cool, fast paced, things that are emotional. And Claude does the hard work of figuring out the prompting and then sends that over to Higgsfield Marketing Studio which has a nice, like, pre trained model in the back end. And then you get stuff like this in minutes. And then what you're able to do is you can say, hey, this one was a winning ad. You know, this format, this style, this colorway, I want a bunch of different versions of this to test out. And it will just go through and it will plan out and strategize on the different versions, the different headers, and then we'll just go make all of them. And now you have so many more pieces to test, and ultimately once you find your winning combo, you just chuck more budget at that one and really try to scale your product. Okay. So that's how you do it in Claude. You use the Higgsfield MCP. It's super easy. It's great to get some POC done in there. Now there's a lot of things that we wanna improve. The image stuff. We want more control. We wanna build some reusable skills. We wanna build some automations. And that's where we're gonna head from Claude into Claude Code. Now don't get scared. We're gonna be using the desktop app just for today's tutorial, and all this looks like is Claude. You have a chat, and you have a project, and you're able to build way more power in this interface. So I'm gonna show you guys how. So the first thing you're gonna wanna do is you're gonna wanna open up a blank folder.
10:13 · You're gonna go to your finder wherever you wanna have it. So wherever you wanna have this, you could just have it right in your desktop. You're gonna create a new folder and call that like Higgsfield Marketing Studio or whatever you wanna call it. So in my case, this is called Higgsfield Studio. So the very first thing that we need to do is get everything set up. So let me show you guys the way that I would prompt this to get set up. I would go into Higgsfield. I'm gonna click on MCP and CLI.
10:35 · And when we're specifically doing this for Claude or if we wanna do this with OpenClaw or Hermes or whatever, we wanna do this with the CLI instead of the MCP. Now the reason being, ultimately, functionally, they can do pretty much the same things, but the MCP has all those tools. So from a token perspective, it's actually more expensive to use an MCP,
10:54 · and the CLI is just better for agents. It's gonna be faster. It's gonna be more efficient. Like I said, we're gonna use the CLI. So all you have to do is copy these three commands exactly. I'm gonna copy this. I'm gonna copy this, and I'm gonna copy this. And I'm gonna go into Claude Code and say, this project is basically being set up to use Higgsfield, and it's gonna be set up for kind of a creative studio, a marketing studio. I need you to install
11:19 · the Higgsfield CLI. I need you to then run the auth for me to sign in, and then I need you to install the Higgsfield agent skills. So here below are the three commands. There you go. I paste in all three commands, and then I just go ahead and run that, and it's gonna get you set up. It's going to install the CLI, and then it's going to do the auth flow. So it'll open up a tab, and then you will just sign in wherever you created your Higgsfield account, and then it will add the agent skills for you to use. Now here for me specifically, when it tries to install this, it's gonna say, hey, this already exists, but yours will actually just go ahead and get set up. So here you go. It says the CLI is working, now running the auth login, which opens up a browser. And as you can see right here, it's basically just asking me if I'm okay to connect Claude to Higgsfield, and you're gonna go ahead and hit connect and then sign in. And now you have been authorized. And back in Claude, it should say, okay, cool. You're now connected in. I can see your account, and now I'm installing the agent skills. So exactly like you see right here. Now while this is getting set up, there is something I wanna talk to you guys about, which is the fact that this stuff isn't magic. This stuff just lets you automate things and
12:16 · ideate. So my point being, if you're not a master copywriter or advertiser, it might be really tough for you to build amazing
12:25 · tier one advertisement copy and creatives. And that's why what you can do with Claude is you can utilize other people's expertise and you can bring that in to make Claude Code the subject matter expert here. So it's not perfect. You know, a master copywriter is gonna build a better newsletter automation than I would. So what you can do is in your project, you can do some research. So here's a chat that I did earlier right before this, and I basically said, hey, I need you to do a deep research on the best strategies for advertising in 2026 when it comes to organic advertisements on platforms like TikTok, Meta, or X, and what captures people's attention, what converts, and how it differs per platform. So I basically wanted it to create a full markdown file called advertising masterclass
13:05that would live in this project, and then I could have my agents look at that when they need help ideating or when they need help analyzing what's going on with our data, and it's gonna help them give us better copy and give us better prompts to feed into Hicks Field. And this is something that I do all the time. Whenever I'm building different agents or automations, I always leverage Twitter threads, YouTube videos, perplexity research. I utilize information that's out there and proven, and I bring that into my systems. So basically, this did all that research. It asked me questions, and then it gave me a full markdown file, which now lives in this project. As you can see, advertisingmasterclass.md.
13:40It's 617 lines. I could go ahead and open it up right here, and we could read this, which is a master playbook for organic content. It's last updated May 2026. We have a cheat sheet. We have different platforms, and we have a bunch of information about how attention is captured. And now all of our advertisements are going to hopefully be better because we've done a research doc like this. But anyways, now we have the CLI installed. We have been authenticated in, and we have our skills installed. So we're ready to get started
14:08trying to build some automations here. Okay. So I cleared out that chat, and here's the first thing that I would recommend you do. So in my Higgs field, if you guys remember, if I go to my assets, we already have a ton of assets generated. We have 45 different assets created. We have a few different products. We have a sleep pill thing. We have a few different headphones that were generated, a few different UGC ads. So here's what I wanna do.
14:30 Go ahead and take a look at all the assets that we've generated in Higgsfield. I need help basically creating a log tracker of every generation that we do together, just for, you know, visibility into our prompts and our data and statistics and things like that.
14:48 And I want you to formulate this into a Google Sheet, a few tabs if you need, but use the GWS CLI in order to create this Google Sheet, and organize it based on, you know, the product and the prompts, however you think makes sense. Ultimately, this project is being set up to become a master creative agency, so we need to have a database where everything lives, so that we can track stuff over time as we scale up how many pieces of creative media we have. So the reason why I want to do this is because I ultimately want to get to a place where we have something like this as an example. So I have all my generations here. We can see the product, the style, whether it's an image or video, the model, and what we're actually generating. We can also look at the results, and we can look at the actual prompts for all of them, which is really important. From there, we can analyze: which of these do we like the best, which ones actually converted the best, which ones, you know, had the best spend on our Meta account, or whatever it is?
15:45 And we can have all these other statistics, like by product, by style. We can have some planning. And then based on all this data, especially if you bring in actual real data from your Google Ads or your Meta ads, or even just your TikTok or Instagram account, Claude Code can look at it, use the subject-matter expertise from the research you had it do, and then plan things.
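The tracker schema he describes can be sketched as a plain data structure before it ever touches a Sheet. A minimal sketch; the field names are assumptions inferred from the columns shown on screen, not an official schema:

```python
import csv
import io

# One row per Higgsfield generation, mirroring the on-screen sheet.
# Field names are assumptions inferred from the video.
FIELDS = ["product", "style", "media_type", "model",
          "prompt", "result_url", "job_id", "status"]

def log_generation(rows, **gen):
    """Append one generation record, defaulting status to 'pending'."""
    row = {f: gen.get(f, "") for f in FIELDS}
    row["status"] = gen.get("status", "pending")
    rows.append(row)
    return row

def to_csv(rows):
    """Serialize the log the way it would land in a Sheet tab."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

rows = []
log_generation(rows, product="Murmur Halo", style="IG story", media_type="image",
               model="Nano Banana 2", prompt="Halo headphones, soft halo light...")
print(to_csv(rows).splitlines()[0])
# product,style,media_type,model,prompt,result_url,job_id,status
```

In the video the agent builds this via the GWS CLI directly; the sketch just makes the implied schema explicit.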
16:04 So now it's planning out different versions of ads based on certain things. So we have different value props, different headlines, different avatars, different styles. And it gives us this test matrix where we now have 100 different things to test, and they all switch up little variables, and we can ultimately test way more things now because we're not the bottleneck on creativity, and we're also not the bottleneck on production. Because I could set an agent off to generate all this stuff, then go to bed, and wake up with 100 different ad copies and creatives ready to go. And then you could also have an agent that generates 100 more every single week. So you just have this unlimited bank of things to test, and it's all based on data, not just random stuff. So it was looking at the master sheet which already exists, which is this one right here. So I'm just basically gonna go ahead and say, hey, I actually just wanna create a completely new one. That one was
16:54 good, but I'm just doing a demo, so I want you to do it again so I can show my audience how good you are at things like that. And if you guys haven't used the GWS CLI before, it's amazing. It's another CLI, just like this Higgsfield CLI is a CLI, and our agents are now able to super quickly look at Google Sheets, Google Docs, Gmail, Calendar, Drive. It can look everywhere, and it's much, much more efficient than using, like, a bunch of MCP servers or a bunch of API calls. So if you haven't tested out the GWS CLI before, it's a huge unlock. I'll drop a full video right up here where I talk about it more. Okay. So now it has pulled all 45 of those generations, as you can see. Let me just open up this sheet so we can take a look at it. We've got all the generations right here. I'm gonna go ahead and make these smaller. Okay. So we've got these 45 generations. We can also see by product, by style, and planning. So really, what I wanted to do there is just show you guys that Claude Code can now look inside of Higgsfield, see everything that we've done, and not only see it, but pull information like the job ID, the status, the prompt, the sizing, all this kind of stuff. That's really important. I mean, think about how long it would have taken you to manually go look at all your stuff and get it into some sort of internal database. And then, of course, this is where it's able to build on top of it and give us tons and tons of new ideas to actually go and generate more creatives from. So let's actually go ahead and do that. I'm gonna open back up Claude Code, and real quick, I'm gonna do an at sign, which is gonna let us tag certain things, and I'm going to tag the advertisingmasterclass.md file, which is, you know, the full breakdown. And I'm gonna say,
18:26 alright. So I want you to look at all of the different generations that we've done. I also want you to read that advertising masterclass doc, and I want you to help me figure out a bunch of different variations that we could create. You know, we've done some things with different headers, with different styles, with different types of content, and I want to mix and match a bunch of different variables so we can ultimately test a bunch of these, put a bunch of budget behind them, and see which of them spends the best. So
18:52 use your creativity here, use your best practices, and help me get a bunch of different ideas for, you know, more creatives. And once we shoot it off with that plan, I realize I probably should have said to put that into the Google Sheet via the GWS CLI. Hopefully, it understands that. If it doesn't seem like it's understanding that, I'll go ahead and stop it and say, by the way, put that in the Google Sheet, but that is what we're looking for here as a deliverable. Okay. So it looks like this is all finished up. I did have to end up telling it to put it in the Google Sheet. Wasn't that smart yet. But let me go over to the sheet, and we'll see we have this new tab called Creative Slate.
19:25 And here, this one looks better than the previous one. It didn't give us 100. We could obviously say, hey, give us 100, but this one ended up giving us somewhere in the thirties. It also kind of showed us priorities. So these obviously need to go ASAP, and then we've got other ones as well for different products. You can see the Murmur Halo, and we've got the sleep supplements. So anyways, now we have this database that we can look at. We could also have some sort of status so that when we're having it automatically generate these, it can mark them off as processed or done or whatever it is. And then it will create the prompts and talk to Higgsfield to create them for us. So it's gonna be very cool. Sorry about the lighting. I just saw, like, apparently a tornado came through, and it got very dark, so I turned on my little light. Okay. So now that we have all of these examples set up, let's say that we wanna generate these top five priority ones. So rows three through seven we're gonna generate. So what I want you to do is now create the prompts for rows three through seven, so those first five that you created. But what we also need to do, which is really important, is we need tracking. I noticed on the sheet, you didn't have any status columns. So add a status column, and what I want you to do is create the prompts for those five, go off to Higgsfield and generate them, and then mark them off on the sheet.
20:34 Once they come back and they're done, mark them as complete or in review or whatever you wanna do there. Just mark them off so we can keep track of what's going on. So I just shot that off. If you guys are curious about how I'm talking with my voice, then check out the tool in the description. It's called Glido. I officially have joined the Glido team, and I'm super excited about it. I've switched over from Whisper to Glido.
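The loop he just prompted for, pick rows with a blank status, generate them, then mark them complete, can be sketched in a few lines. All names here are hypothetical, and `fake_generate` is a stand-in for the real Higgsfield call:

```python
def pick_blank(rows, limit):
    """Select up to `limit` rows whose status is empty or missing."""
    return [r for r in rows if not r.get("status")][:limit]

def fake_generate(row):
    # Placeholder for the real Higgsfield CLI/MCP call;
    # returns the kind of metadata the video shows (job ID, result URL).
    return {"job_id": "job-" + row["idea"].split()[0],
            "result_url": "https://example.com/asset"}

def run_batch(rows, limit=5):
    """Generate each blank-status row and mark it off, like the sheet workflow."""
    done = []
    for row in pick_blank(rows, limit):
        result = fake_generate(row)
        row.update(result, status="complete")
        done.append(row)
    return done

queue = [{"idea": "curiosity hook"},
         {"idea": "contrarian take", "status": "complete"},  # already done, skipped
         {"idea": "pattern interrupt"}]
finished = run_batch(queue, limit=5)
print(len(finished))  # 2
```

The blank-status filter is what later makes the weekly routines safe to re-run: already-processed rows are never regenerated.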
20:53 I just truly believe in the vision that we're building over here. So if you guys wanna support, and if you need a voice tool, check out Glido. It's faster, it's private, and it is gonna be way more authentic, so join the movement. But anyways, this is gonna shoot off to Higgsfield, and I'll check in with you guys when we get those advertisements back to take a look at. Okay. So those five are finished. I'm gonna go into the actual Google Sheet so we can take a look. We can see over here that it's added a status column. It's marked these as complete. We have the result URL, we have the job ID. So let's take a look at just a few of these real quick. Okay. So this one is interesting. It's sort of like a meme. You know, it's "eighteen months, no sleep." So there's that. You can also notice that the image doesn't really look like our reference image at all. If you guys remember, if I come over here, it was this one. No, not this one. Sorry. It was
21:33this one. So that's honestly my fault. I didn't prompt it to do so. So we're gonna have to redo another round, but let's real quick just see what else we've got here because what you'll notice is these probably aren't gonna be super consistent, and that's because we're just kind of blindly prompting. We're pulling the lever on the slot machine, which is AI.
21:51 And if we don't have guidelines, if we don't have recipes around it, skills, then they're not gonna be super consistent. So this one, I don't know what that is. This one is a video.
22:07 Very generic. Okay. So what happened is, in the prompt, I'm assuming it basically said, hey, it is a blue bottle, it says Sleep Support on it, and that's it. So it's creating these random-looking sleep-support bottles. So what I'm gonna do is go back into Claude Code. I'm going to drag in the actual image. This is our actual
22:26product image. This is an asset that we should always be using. So when you're creating these advertisements for the sleep supplement product, it has to appear as shown in this reference image every single time. It must appear exactly like this. Same color, same text. Don't change anything.
22:43And I need you to go ahead and regenerate those five examples. Also remember, the goal here is conversion. The goal is to get someone to want to buy our product. So go ahead and do those five again. Now, obviously, we're looking for some sort of consistency,
22:59 but what's important is we have a different kind of angle. You know, this one was curiosity, this one was contrarian, this one was a pattern interrupt, a question, and a stat flash. So that's the value of being able to generate so many different types of angles. But also, what you'll notice here is it used different models. It wanted to try these two with Nano Banana 2, and it wanted to try these two with GPT Image 2. So we just have a lot more to play with here, and it's obviously way more automated. So I'll check in with you guys when we get those back. So I thought, while this is running (and apologies if you hear thunder or fire trucks, I wasn't kidding about the tornado),
23:33I thought that I would take a quick second to talk about skills. Because what happened here is one of them came back with being
23:41restricted, just like we saw earlier. And so what we can do is start to build a bit of a knowledge bank around what prompts get restricted and why and what don't so that it doesn't happen in the future. So what is a skill? A skill is essentially a recipe for an AI agent. So if someone said, hey, can you make me some chocolate chip pancakes? You would pull up a recipe of chocolate chip pancakes and you would make it. And the next time, you would pull up that same recipe and you would make the pancakes and they'd be the exact same. But if you didn't have a recipe and you were kind of guessing the measurements and guessing the order and the temperature,
24:10your pancakes would come out different every single time. So when we give our agent a skill, it basically means, okay, whenever I want an Instagram ad, you do it exactly like this and now everything feels on brand, everything feels consistent. So if we bake into a skill, hey, by the way, in the past, you used these five phrases and these five words and it got, you know, basically flagged, so don't ever use those five words or phrases ever again.
24:32 And the coolest part about skills is that every single time you run one, it can get better, because you can say, okay, you just created me these five advertisements with this skill. I don't like x, y, and z, but I love a, b, and c, so update the skill to make sure that next time it's better. So while this is actually finishing up and those generations are happening, let me just show you something. I'm opening up a new chat in this project,
24:56 and I'm gonna go over to our Claude here. Let's say that we wanna find one that we really, really like. So, actually, one of my favorite generations so far was this one. What I can do is take this prompt and copy it. And essentially, what I'm gonna do is reverse engineer a skill from this prompt. So I'm gonna go back into Claude.
25:16 I'm gonna paste in that prompt, and I'm just gonna start yapping. Hey, Claude. So this prompt that you're looking at right above is my favorite output we've gotten from Higgsfield Marketing Studio. This was a hyper-motion, fast-paced, launch-style video for our product, and I loved it. It had fast cuts, it had nice zooms, it had nice details.
25:38 And I want to turn this into a skill that lives locally inside of this project in .claude/skills, so that anytime I ask for a hyper-motion style video, you will utilize this, and they're always consistent and they always have this style. So turn this into a skill for me.
25:58And that's basically it. I mean, obviously, the skill's not gonna be perfect on the first shot, but when it comes to actually creating the skill, that's all it takes. And usually the way that I like to build them is I like to play around with a bunch of outputs, you know, maybe generate five different things, and then I pick the one that I like the most or the two or three I like the most and say,
26:15 how can I reverse engineer a skill from these outputs? And that's exactly what you just saw me do. Okay. So these should be done now. I'm gonna open up the sheet and we're gonna take a look. The most important thing I'm really looking for here is that the picture of our product looks exactly as it should. So, perfect. This one looks exactly like the reference image. This says "eighteen months, no sleep, then seven nights of this. Try it tonight. Sixty-night money-back guarantee." And this looks like an Instagram story style ad. According to the research that it did, this is the kind of thing that should be converting.
26:48 Let's take a look at this next one here. We have the sleep bottle, which looks exactly as it should, which is perfect. It says "melatonin does not equal sleep. Try the formula 28,000 parents swear by. Free shipping." So once again, this looks like it's supposed to be sort of an Instagram story style of advertisement. Let's take a look at the video we've got here.
27:12 Okay. So a super short five-second video. I don't love this one. I don't know if it's super engaging, but, hey, this was based on the research that our agent did, and obviously we would test this one. If it's not spending well, we would kill it. This one is more of a square style, so it looks like it's only one out of five different carousel slides, but: "why am I exhausted at 2PM even when I sleep eight hours?" Advanced sleep formula, have to see why.
27:35 Okay. So that was just really proof of the bottles actually coming through now as they should. This one has a huge stat up top. So we can see that all of these were based on the on-screen text, the CTA, the platform fit, the notes, and all of this other stuff that the agent did research on and helped us ideate on. Now what I wanna do is try to make another one based on this skill. So once this skill has finished up, we will try to invoke it, and we will try to create another
28:00 hyper-motion style video from this skill right here. Alright. So it created us one called HyperMotion video. Now keep in mind, when we installed our Higgsfield CLI, we also installed the agent skills. So even if you don't have this one and you say, hey, can you create me an image, can you create me a video, it should be triggering these other default Higgsfield skills
28:20 automatically. But anyways, what we're gonna do is open up a new session to try to use this skill. Okay. Now, before I actually use the skill, I wanna check: did it save our reference image or not? So I'm going to go back into this, click on Files, and see if we have any brand assets. It doesn't look like we do. We have a bunch of data. Actually, wait. No, we do, right here: data, assets. We have our sleep bottle image. So you can't see it in this preview, but if I actually open up the folder which this project lives in, which is right here at Higgsfield Studio, if I go into data and then into assets,
28:52 you can see that it did upload this as a reference image, which is great, because what that means is in a new chat, I can tag it. So it was called sleep bottle reference PNG, so I can use that. And then I can hopefully do a slash command for Higgsfield. Oh, no, it wasn't called Higgsfield, it was called HyperMotion. So if I do /hypermotion... okay, I'm not seeing it right now. Let's try to just invoke it by natural language. So: I want you to use the HyperMotion skill
29:17 and turn this sleep bottle image, which is the reference image I've tagged, into a hyper-motion style video in Higgsfield using their Marketing Studio. So that's all I'm gonna say for a prompt. I'm not gonna give it anything else. What I'm watching for here is to make sure that it actually invokes the skill, because every time Claude Code invokes a skill, it will tell you that it does. So hopefully what we see is that it's searching for one and that it actually calls it. If it doesn't, I'm gonna stop this generation. And right now it's running the wrong skill. It's running Higgsfield generate. So this isn't the one we actually wanted. We wanted it to run the HyperMotion skill. So what's happening is probably just because
29:54 we just created that, so it hasn't registered yet. So I'm gonna go ahead and stop this session real quick. And what we wanna do is check if that skill actually exists. So I'm gonna go to our project and back to the main section. The skills will live in a .claude folder. So .claude, then skills, then HyperMotion video. The skill does indeed exist, so I'm not exactly sure why it didn't get called. Let's open it up real quick and take a look at it. This is what a skill file actually looks like. It's just markdown. It's called HyperMotion video: generate a hyper-motion style premium product launch video via Higgsfield Marketing Studio, high energy,
30:25 when to invoke, what to do, what to ask before generating: basically the template. So here are the hard rules for the skill. So I might just have to, like, close out of the Claude app, open it back up, and then we should be able to see the skill. Alright. So I closed out of the app, and now that I'm back in, I can see the HyperMotion video skill, and I'm just gonna say: this is the skill that I want you to actually use.
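A skill file like the one he scrolls through is plain markdown, conventionally with a short YAML frontmatter (name and description) that tells the agent when to load it. This is a hedged sketch of what a hyper-motion skill might contain, mirroring the sections he reads out, not the actual file:

```markdown
---
name: hypermotion-video
description: Generate a hyper-motion style premium product launch video via Higgsfield Marketing Studio. Use when the user asks for a hyper-motion or fast-paced launch video.
---

# HyperMotion Video

## When to invoke
User asks for a hyper-motion, high-energy, or launch-style product video.

## Ask before generating
- Which product, and which reference image?
- Model on screen, UGC style, or product only?

## Hard rules
- Fast cuts, tight zooms, detail shots; keep it short and high energy.
- The product must match the tagged reference image exactly: same colors, same text.
- Never use phrases from the banned list (anything that previously triggered moderation).
```

Note the last rule: this is where the "knowledge bank of restricted prompts" from earlier gets baked in, so failure modes compound into the skill alongside the wins.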
30:46 I also realized that my dictation accidentally corrected HyperMotion to Remotion, so that's maybe why Claude got a little bit confused. But it's gonna look at the image, it's gonna read the skill, and then we should have a pretty solid output. And because it read the skill, it said, okay, one quick question before I lock in the prompt: do you want a model in the ad, do you want it to be UGC, or do you just want it to be product only? I'm gonna say product
31:09only, and hopefully, it should be good to go now. Alright. So moment of truth. This is done. I'm gonna go ahead and open it up. Let's see what we got.
31:31 Wow. I mean, that's crazy good. Obviously, this part right here is where I'm like, okay, that's pretty bad. But all of this other stuff, I mean, it followed the skill, and we'd be able to say, hey, I like this, I don't like this, make it a little bit better. But I think that this is really good. I really liked the feel. It looks very real.
31:51 And, yeah, I am pretty happy with this output. Now, it is a little bit unfortunate, because in the reference image, all of the words do come through perfectly. So it's not an issue with the reference image; it's an issue with the model that generated this video. Sometimes models are going to mess up text a little bit, especially when you're doing image-to-video rather than image-to-image.
32:13 Now, one thing to keep in mind is this is the worst that AI video generation models will ever be. Every day, every month, they're going to get better and better. So right now is the worst it'll ever be. But I do wonder if there's some stuff you could do in the prompting of the video to actually make it a little bit more accurate. And I think if it were me, and I had that output and was getting that sort of quality but the words weren't coming through right, then I would probably just say, you know what, that's fine: for our videos that are this high quality, we're just going to use a different label cover, which is just the logo and the name rather than all the little metadata that we might not need right now. So anyways, there are some ways to work around it, and some ways to have a more positive attitude about it rather than just hating on it and calling everything AI slop. And because this was using Marketing Studio video,
32:57 it said that it didn't actually use one specific model. It kind of used a mix, because it used, you know, Marketing Studio, so you don't choose a model directly unless you had run it with some sort of flag. So anyways, this is kind of the main workflow, right? We're using it to ideate, we're filling in a sheet, we're marking off statuses. But how do we actually start to automate this? Well, think about this. In Claude, we're able to set up these things called routines.
33:21 And what a routine is, basically, is a prompt that gets injected into Claude Code on a set cadence. So what we just did here, this could be a routine. All we have to do is prompt it to do so. So in a routine, what we could do is build a new one that basically says: okay, every Sunday, I want you to look at this Google Sheet, and I want you to also pull in data from, you know, Instagram or wherever we're posting this stuff, and I want you to analyze what's working and what's not, and then I want you to ideate and add on top of the sheet
33:49 50 new generations. So 50 new every single week. And then, let's say that that's a Sunday night. We could maybe have a Monday morning one, a new routine: okay, Monday morning, you're gonna go to the sheet and pick 30 videos with a blank status, and that's how we ensure that we're not duplicating efforts. So let's say it picks, you know, these 30. It's not 30, but let's say it picks all of these because there's no actual status. It would grab all of these, create the prompts for all of them, generate all of them, and then we'd wake up on Monday morning with all of these completed, with URLs and with job IDs. And then we could scale it up. Maybe we're doing planning every Sunday and Thursday, generating every Monday and Friday, and maybe we're scaling up from 50 to 100 or 200. And that's how you push the system to the point where it's scaling way faster than you could as a human, or even multiple humans. Then, if you wanna take it one step further, once you get it to a place where you trust the outputs, you could connect this pipeline to something like Potato, or even plug it into the Meta Ads Manager, and start scheduling and posting these things automatically too, because you've built up a batch of skills that you actually trust enough to let run autonomously.
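One way to phrase the two routines he describes; the exact wording is up to you, since routines are set up in natural language, and the times and counts here are just the examples from the video:

```text
Routine 1 (Sunday night, plan):
  Read the master tracker sheet and our latest Instagram/TikTok performance data.
  Analyze what's working, then add 50 new generation ideas to the sheet with a
  blank status, varying hook angle, headline, avatar, and style.

Routine 2 (Monday morning, execute):
  Pick 30 rows with a blank status. Write the prompts, generate them via the
  Higgsfield CLI, then fill in the result URL and job ID and set status = complete.
```

The blank-status check is what keeps the two routines from stepping on each other: planning only ever appends, and execution only ever touches unprocessed rows.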
34:56 So that's kind of the idea. And everyone's gonna get in here and do it a little bit differently, but it's really, really simple to set up your routines right in here, or to come in here and say, hey, I want you to set up a routine for me, 8AM Monday, do this and this, because it's all very possible with natural language. So if you guys wanna dive a little bit deeper into routines, I'll tag a full video right here where I dive into how you set them up and some of the little gotchas that are in there. But that is gonna do it for this video. So if you guys enjoyed, please give it a like. It helps me out a ton. And as always, I appreciate you guys making it to the end of the video, and I will see you in the next one. Thanks, everyone.
§ · For Joe

Steal the Claude Code creative agency stack.

Modern Creator playbook

Wire Higgsfield (or any media-gen) into Claude Code via CLI, drop your swipe files in as project markdown, log every generation to a Sheet with a status column, then let two routines plan and execute on a weekly cadence.

  • Default to the CLI, not the MCP, whenever an agent will loop — token cost compounds across 100-row batches.
  • Land your swipe files (Maria Wendt 255-email, HTSS, Hoffmann's Reward Funnel) inside the project as `.md` so every agent run has the subject-matter expertise pre-loaded.
  • Mine skills from outputs you already love — copy the winning prompt back into chat, ask Claude to convert it into `.claude/skills/<name>.md`. Don't author skills from scratch.
  • Bake negative constraints into skills too (banned words that triggered moderation, brand assets that must lock to a reference image). Skills should encode failure modes, not just successes.
  • A Google Sheet with status / result URL / job ID columns is a v1 creative-ops database. Don't build a custom DB until the Sheet is the bottleneck.
  • Two routines beat a permanent agent: Sunday 'plan 50 new ideas,' Monday 'execute 30 blank-status rows.' Scale by adding more weekly slots, not by making routines smarter.
  • Reframe AI-generated weirdness with the 'this is the worst it'll ever be' line — it's a softer counter to AI-slop critiques than arguing on quality.
§ · For You

What this means if you're trying to ship more creative without a team.

If you want to try the workflow yourself

You don't need to be technical to copy this — you need one Claude subscription, a Higgsfield account, and the patience to run the same prompt five times until you have one output good enough to turn into a skill.

  • Start in claude.ai (not Claude Code) — add Higgsfield as a custom connector under Settings > Connectors and prove the workflow with one prompt before you touch any CLI.
  • Use one starter prompt: 'Build me a [product type] brand from scratch — do research, branding, catalog, and generate a product photo, IG ad, and UGC video for each item via Higgsfield.' Edit the noun, keep the structure.
  • When an image is wrong, reply in the same thread with the fix ('two headers — remove one,' 'use this reference image, don't change the bottle') instead of starting over. Iteration is the workflow.
  • Always drop in a reference image when the product already exists — it cuts the 'looks nothing like my product' problem in half.
  • Keep a running Google Sheet of every prompt + result + which one converted. The sheet is the moat — it's how you compound across weeks.
  • Treat the first hyper-motion video as a draft, not a deliverable. Save the prompts that work; throw away the ones that don't.
§ · Frame Gallery

Visual moments.