
Your SEO Workflow Is a Mess. Here’s How to Fix It With One Folder and Claude.

Michael Ehrlich

Published March 7, 2026

Let’s talk about the dumbest part of SEO in 2026.

It’s not keyword research. It’s not link building. It’s not even trying to figure out what Google’s algorithm wants this week.

It’s the data wrangling.

You’ve got Google Search Console open in one tab. GA4 in another. Google Ads in a third. Maybe some AI visibility tool in a fourth. And you, beautiful genius that you are, sitting in the middle trying to be the human glue between all of it. Copy-pasting into spreadsheets. Running VLOOKUPs. Squinting at column headers wondering why nothing lines up.

This is not strategy. This is digital janitorial work. And most of us are spending more time on the janitorial work than on the actual thinking that moves the needle.

I’m going to show you a setup that eliminates almost all of it. It takes about an hour to build. After that, you can ask questions like “which keywords am I wasting money on in Ads that I already rank for organically?” and get an answer in 90 seconds instead of 90 minutes.

No dashboards. No Looker Studio. No hiring a data person. Just a folder, some Python scripts, and Claude Code.

What the Hell Is Claude Code?

Claude Code is a command-line AI tool from Anthropic. It runs in your terminal, it can read your local files, and, here's the important part, it can write and execute code on the fly.

That means you can point it at a folder full of data and just… talk to it. In English. Like a person.

“Which pages get tons of impressions but nobody clicks on?”

“Group my keywords by topic and tell me where I’m weakest.”

“Compare my paid search terms against my organic rankings and find the overlap.”

It reads the files, writes the analysis code, runs it, and gives you the answer. No spreadsheet required. No pivot table. No afternoon of your life you’re never getting back.
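To make that concrete, here's roughly the kind of throwaway script Claude Code writes behind the scenes for that first question. This is a minimal sketch, and the file shape, field names, and thresholds are all illustrative, not a real export:

```python
def low_ctr_pages(rows, min_impressions=1000, max_ctr=0.01):
    """Pages people see constantly in search results but almost never click."""
    flagged = [r for r in rows
               if r["impressions"] >= min_impressions and r["ctr"] <= max_ctr]
    # Worst offenders first: the most impressions going to waste
    return sorted(flagged, key=lambda r: r["impressions"], reverse=True)

# Illustrative rows in the shape of a GSC "page" dimension export
sample = [
    {"page": "/pricing",   "impressions": 48000, "clicks": 310, "ctr": 0.0065},
    {"page": "/blog/faq",  "impressions": 12000, "clicks": 900, "ctr": 0.0750},
    {"page": "/old-guide", "impressions": 9500,  "clicks": 40,  "ctr": 0.0042},
]
for r in low_ctr_pages(sample):
    print(f'{r["page"]}: {r["impressions"]:,} impressions at {r["ctr"]:.1%} CTR')
```

You never actually write this yourself. That's the point. But it's useful to know the machinery is this boring.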

Why Should You Care?

Because right now, the bottleneck in your SEO work is not intelligence. It’s plumbing.

Think about it. The strategic questions are usually pretty obvious. “Where are we ranking on page two that we could push to page one?” “What are we paying for that we don’t need to?” “Where are the content gaps?”

You already know what to ask. The problem is that getting the answer requires pulling data from three different platforms, normalizing it, cross-referencing it, and doing a bunch of mechanical work that has nothing to do with being good at SEO.

So either you spend half your day on spreadsheets, or you pay someone else to spend half their day on spreadsheets, or you just… don’t do the analysis as often as you should. Most people pick option three. And then they wonder why their SEO strategy feels reactive instead of proactive.

This setup kills the plumbing problem. You get to skip straight to the thinking.

The Setup (It’s Just a Folder)

I’m going to spare you the developer cosplay. Here’s what the whole thing actually is.

A folder on your computer.

Inside that folder, three things.

  1. A config file. Client name, domain, API property IDs. Takes two minutes to fill out.
  2. Python scripts that fetch data. One per source. GSC, GA4, Google Ads. They authenticate, pull data, and save JSON files.
  3. The JSON files themselves. That’s your data. Queries, traffic, search terms, spend. All sitting in one place.
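For a sense of scale, here's what a config file for one client might look like. Every name and ID below is a placeholder, and the exact fields are whatever your fetch scripts need, not a standard:

```python
import json

# A hypothetical config.json for one client — all values are placeholders.
# "sc-domain:" is the real prefix GSC uses for domain-level properties.
CONFIG = json.loads("""
{
  "client": "Acme Plumbing",
  "domain": "https://www.example.com",
  "gsc_property": "sc-domain:example.com",
  "ga4_property_id": "123456789",
  "ads_customer_id": "111-222-3333"
}
""")

print(f'Loaded config for {CONFIG["client"]}')
```

That's the whole "configuration layer." Two minutes, as promised.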

No databases. No servers. No deployment. No Kubernetes. (If someone tries to sell you Kubernetes for this, run.)

You open Claude Code in that folder, and it can see everything. Now you’re having a conversation with all your SEO data at once.

The Part Nobody Tells You About Google APIs

Here’s where it gets fun.

Connecting to Google’s APIs sounds scary. It sounds like something a developer needs to do. It sounds like documentation and OAuth flows and credential management and all that stuff that makes normal people’s eyes glaze over.

And yeah, there is a little setup. You make a service account in Google Cloud, enable a couple APIs, download a key file, and add the service account email to your client’s GSC and GA4 properties. It’s the same process as adding a team member. Takes maybe 15 minutes.
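If you're curious what one of those fetch scripts looks like, here's a rough sketch of a GSC pull using the service account key. It assumes you've installed google-api-python-client and google-auth, and the file names (service-account.json, gsc_queries.json) are placeholders you'd pick yourself:

```python
import json
from datetime import date, timedelta

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]

def build_request(days=90, row_limit=1000):
    """Request body for the Search Console searchanalytics.query endpoint."""
    end = date.today()
    start = end - timedelta(days=days)
    return {
        "startDate": start.isoformat(),
        "endDate": end.isoformat(),
        "dimensions": ["query"],
        "rowLimit": row_limit,
    }

def fetch_gsc(property_url, key_file="service-account.json"):
    # Google client imports live here so the request builder above
    # works even before you pip install google-api-python-client
    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    creds = service_account.Credentials.from_service_account_file(
        key_file, scopes=SCOPES)
    service = build("searchconsole", "v1", credentials=creds)
    resp = service.searchanalytics().query(
        siteUrl=property_url, body=build_request()).execute()
    with open("gsc_queries.json", "w") as f:
        json.dump(resp.get("rows", []), f, indent=2)
```

You don't have to write this either. Claude Code will. But now you know what it's doing when it does.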

Google Ads is slightly more annoying because it uses OAuth instead of a service account. You need a developer token and a one-time browser auth. But here’s the thing. If you don’t want to deal with it, just download your search terms report as a CSV and drop it in the folder. Claude Code doesn’t care whether the data came from an API or an export. It reads files. That’s what it does.
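The CSV route really is that simple. Here's a sketch of normalizing a search terms export into the same shape as an API pull. The column headers below are illustrative, since real Ads exports vary by account and locale:

```python
import csv
import io

def load_search_terms(csv_text):
    """Normalize an Ads search-terms CSV into plain dicts."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [{
        "term": row["Search term"].lower().strip(),
        "clicks": int(row["Clicks"]),
        "cost": float(row["Cost"]),
    } for row in reader]

# Illustrative export — check your own file's headers before trusting this
sample_csv = """Search term,Clicks,Cost
emergency plumber,120,342.50
Pipe Repair Near Me,45,98.10
"""
terms = load_search_terms(sample_csv)
```

Lowercasing and trimming the terms matters more than it looks: it's what lets you match paid terms against organic queries later without a pile of near-miss duplicates.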

And here’s the part that actually blew my mind: you don’t need to read any API documentation. None. Zero. You tell Claude Code “I want the top 1,000 queries from Search Console for the last 90 days” and it writes the script for you. It already knows the APIs. It figures out the authentication, the endpoints, the parameters, all of it.

For anyone who has ever spent an afternoon trying to decode Google’s GA4 Data API docs, that sentence alone should make you want to try this.

The One Question That’s Worth More Than Everything Else

Once you have GSC data and Ads data in the same folder, there is one question that pays for the entire setup.

“Show me every keyword where I’m paying for clicks but already rank well organically. And show me every keyword where paid is my only presence.”

That’s it. That one question.

The first half finds wasted spend. You’re buying clicks for keywords where Google is already giving you traffic for free. Maybe you can dial back the budget. Maybe you can reallocate to terms where you actually need the paid support.

The second half finds content gaps. These are keywords where your ads are doing all the heavy lifting because you have zero organic presence. That’s a flashing neon sign that says “create content here.”
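Under the hood, the cross-reference is a dictionary lookup. Here's a minimal sketch, where the data shapes, the "position 5 or better" threshold, and the sample rows are all illustrative:

```python
def paid_organic_overlap(gsc_rows, ads_rows, good_position=5.0):
    """Split paid keywords into 'already ranking well' vs 'paid-only'."""
    organic = {r["query"]: r["position"] for r in gsc_rows}
    wasted_spend, content_gaps = [], []
    for ad in ads_rows:
        pos = organic.get(ad["term"])
        if pos is not None and pos <= good_position:
            wasted_spend.append(ad)    # already ranking — maybe dial back
        elif pos is None:
            content_gaps.append(ad)    # paid is the only presence — write content
    return wasted_spend, content_gaps

gsc = [{"query": "emergency plumber", "position": 2.3},
       {"query": "water heater cost", "position": 14.8}]
ads = [{"term": "emergency plumber", "cost": 342.5},
       {"term": "drain cleaning",    "cost": 120.0},
       {"term": "water heater cost", "cost": 55.0}]
wasted, gaps = paid_organic_overlap(gsc, ads)
```

Note the middle case: a term like "water heater cost" that ranks, but badly, lands in neither bucket. That's where the human judgment comes in.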

This is the kind of analysis that agencies charge thousands of dollars for. And the reason it’s valuable isn’t because it’s hard to do. It’s valuable because almost nobody does it regularly. Because the manual process is such a pain in the ass that it gets pushed to quarterly reviews or skipped entirely.

When the analysis takes 90 seconds instead of an afternoon, you do it every month. You do it for every client. And suddenly your strategy is based on current data instead of something from three months ago.

More Questions You Should Be Asking

The paid-organic overlap is the headliner. But once you have the data loaded, keep going.

  • “Which pages get massive impressions in GSC but terrible CTR?” These are title tag and meta description problems. The page is ranking. People see it. They just don’t click. Fix the copy.
  • “Group my queries by topic cluster. Where do I have the most impressions but the worst positions?” This is your content investment roadmap. Stop guessing where to write next.
  • “Which pages in GA4 have high bounce rates but strong GSC positions?” Pages that rank but disappoint. The SEO is working. The content isn’t. Fix the page, keep the rankings.
  • “What are my top 20 organic queries by impression that I’m not running ads against?” Paid amplification candidates, handed to you on a silver platter.
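To show what the topic-cluster question looks like as code, here's a sketch. The toy keyword-matching rule below is a stand-in; in practice you'd let Claude Code group queries semantically:

```python
from collections import defaultdict

def cluster_stats(rows, topic_of):
    """Aggregate GSC query rows into topic clusters:
    total impressions plus impression-weighted average position."""
    agg = defaultdict(lambda: {"impressions": 0, "weighted_pos": 0.0})
    for r in rows:
        t = topic_of(r["query"])
        agg[t]["impressions"] += r["impressions"]
        agg[t]["weighted_pos"] += r["position"] * r["impressions"]
    return {t: {"impressions": v["impressions"],
                "avg_position": round(v["weighted_pos"] / v["impressions"], 1)}
            for t, v in agg.items()}

# Toy topic rule — purely illustrative
topic_of = lambda q: "water heaters" if "water heater" in q else "general"
rows = [{"query": "water heater cost",     "impressions": 8000,  "position": 12.0},
        {"query": "tankless water heater", "impressions": 4000,  "position": 18.0},
        {"query": "plumber near me",       "impressions": 20000, "position": 3.0}]
stats = cluster_stats(rows, topic_of)
```

A cluster with lots of impressions and a lousy weighted position is exactly the "most demand, weakest presence" signal the bullet above describes.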

The beautiful thing is the follow-up. You ask one question, get an answer, notice something interesting, and ask another question based on what you just learned. No rebuilding the analysis. No re-exporting the data. It’s a conversation, not a project.

The AI Visibility Thing (Yes, You Need to Start Thinking About This)

Here’s the part of SEO that most people are still ignoring. AI search.

Google AI Overviews. ChatGPT. Perplexity. Copilot. These platforms are answering questions using your content (or your competitor’s content) and most people have zero visibility into whether they’re being cited or not.

If you have an AI visibility tracking tool, export the data and add it to your project folder. Now Claude Code can cross-reference AI citations against your organic and paid data. You might find two blog posts cannibalizing each other for the same AI citations. You might find that a page with mediocre organic rankings is getting cited heavily in AI responses. These are strategic signals you cannot get from traditional SEO tools.

If you don’t have a tracking tool, start with Bing Webmaster Tools. It’s free. It shows you first-party data on how often your content appears in Copilot and Bing AI responses. It’s the only platform right now giving you real citation data instead of estimates.

For Google AI Overviews, cheap SERP APIs like DataForSEO will give you citation data for about a penny per query. Or go full scrappy and write a script that sends consistent prompts to the major LLM APIs and checks if your brand gets mentioned. Perplexity’s API is especially useful here because it includes source citations in every response.
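The scrappy version is less work than it sounds. Here's a sketch of a citation check against Perplexity's chat endpoint, which is OpenAI-compatible. The model name and the "citations" field are what their API returns as of this writing, but check the current docs before relying on either:

```python
import json
import urllib.request

def brand_cited(citations, domain):
    """True if any cited source URL belongs to your domain."""
    return any(domain in url for url in citations)

def ask_perplexity(prompt, api_key, model="sonar"):
    # The response JSON includes a "citations" list of source URLs.
    # Endpoint, model name, and fields may change — verify against
    # Perplexity's current API documentation.
    req = urllib.request.Request(
        "https://api.perplexity.ai/chat/completions",
        data=json.dumps({"model": model,
                         "messages": [{"role": "user", "content": prompt}]}).encode(),
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Usage sketch:
# resp = ask_perplexity("best emergency plumber in Austin", API_KEY)
# print(brand_cited(resp.get("citations", []), "example.com"))
```

Run the same set of prompts monthly, log the results to JSON in the same folder, and Claude Code can trend your AI visibility alongside everything else.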

The data is rough. More wind sock than GPS. But having a wind sock is infinitely better than flying blind, which is what most companies are doing right now.

Where This Falls on Its Face

I’m not going to sit here and tell you this replaces your brain. It doesn’t.

Claude Code finds patterns in data. It does not understand your client’s business. It doesn’t know that the CEO hates the word “synergy” or that the sales team just pivoted their messaging last week. It doesn’t know context. You do.

It also hallucinates sometimes. Not often, but it happens. I’ve seen it confidently report a number that didn’t match the source file. So treat the output the way you’d treat work from a smart but brand-new intern. Probably right, worth verifying before it goes anywhere important.

And it doesn’t replace your SEO platforms. You still need Semrush or Ahrefs or whatever for historical trends, competitive analysis, and automated monitoring. What this gives you is the ad hoc cross-referencing layer that none of those platforms handle well on their own.

Different tools for different jobs. This one’s for the “I have a question that spans three data sources and I want an answer in two minutes” job.

How to Actually Start

Don’t try to set up all four data sources at once. That’s how you end up with a half-finished project in a folder you never open again.

  1. Start with GSC. Easiest API to connect. One service account, read-only access, free. Pull 90 days of query and page data. Ask Claude Code to group by topic, find page-two opportunities, surface high-impression-low-CTR pages. That alone is worth a couple hours of manual work saved.
  2. Add GA4. Same service account. Now you can cross-reference. Which pages rank well but have terrible engagement? Which channels are actually driving conversions?
  3. Add Google Ads. The OAuth is a little more work, but the paid-organic gap analysis will make you feel like you have superpowers.
  4. Layer in AI visibility. Start free with Bing Webmaster Tools. Add a SERP API when you’re ready.
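Once you have more than one fetch script, the monthly pull is one command. Here's a sketch of a trivial runner, where the script names are hypothetical and you'd add one per layer as you go:

```python
import subprocess
import sys

# Hypothetical script names — add one per data source as you set it up
FETCHERS = ["fetch_gsc.py", "fetch_ga4.py", "fetch_ads.py"]

def run_monthly_pull(scripts):
    """Run each fetch script in order; report which ones succeeded."""
    results = {}
    for script in scripts:
        proc = subprocess.run([sys.executable, script])
        results[script] = (proc.returncode == 0)
    return results
```

Nothing clever, and that's deliberate. A boring runner you actually execute monthly beats an elegant pipeline you abandoned in step two.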

Each layer compounds. You don’t need all four to get value. GSC plus GA4 alone will surface things that would take you hours to find with spreadsheets.

The total setup for a new client is about 15 minutes. Monthly data pulls take about 5 minutes. The analysis is as fast as you can type questions.

Compare that to your current workflow and ask yourself why you’re still doing it the old way.