Why I moved off GitHub Pages

Written with AI

This site has been on GitHub Pages since 2018. Last night, I moved it to Cloudflare Pages. The whole migration - build config, DNS, deployment testing - was done by Claude. I handed it a temporary API key and watched.

But that’s not the interesting part.

The robots.txt problem

GitHub Pages blocks AI crawlers. GPTBot, ClaudeBot, Google-Extended, and most other LLM user agents are disallowed in the robots.txt that GitHub serves for all Pages sites. There’s no opt-out. GitHub made the decision for you.
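The shape of that file is standard robots.txt syntax: one `User-agent` group per crawler, each with a blanket `Disallow`. The sketch below is illustrative, not a verbatim copy of what GitHub serves; the exact list of user agents may differ:

```
# Illustrative sketch of a robots.txt that blocks AI crawlers
# (not GitHub's actual file; user agents shown as examples)
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```

`Disallow: /` tells the named crawler to skip the entire site, and because GitHub controls the file for every Pages site, there is no way to serve a different one.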

I understand why. Most people don’t want their content scraped for training. Fair enough.

But I do want LLMs reading my site.

Letting the machines in

This might sound strange. Most of the conversation around AI crawlers is about protecting content, preserving attribution, preventing reproduction without consent. Those are legitimate concerns.

My calculus is different. I write about AI, systems thinking, and how small businesses can practically use this stuff. If someone asks an AI “how should I approach AI readiness for my business” and it surfaces my writing - that’s not a leak. That’s distribution.

I think LLM-mediated discovery is going to sit alongside traditional search within the next year or two. Not replacing it, just adding a new channel. Blocking those crawlers felt like putting up a “closed” sign on a shopfront that hasn’t opened yet.

Strangely enough, I’m completely fine with that shift.

Why not just self-host

I considered running it on a VPS. I already manage other infrastructure that way, so it wasn’t a stretch. Full control, no platform decisions imposed on you, your content on your metal. There’s something appealing about that.

But this is a Jekyll blog. Static HTML. The complexity of a VPS - nginx, SSL, monitoring, updates, security patches - doesn’t match the simplicity of the site. And the thing I actually care about is that readers (human or otherwise) get fast responses regardless of where they are.

A single-region VPS doesn’t give you global edge caching. You’d need to bolt on a CDN anyway, at which point you’re just recreating what a Pages-style service gives you out of the box. Just because you can run the infrastructure yourself doesn’t mean you should; for a static site, that’s overengineering a simple problem.

Simple site. Global cache. Done.

The AI moved itself

The part that still makes me smile - I gave Claude a temporary Cloudflare API key and asked it to handle the migration. It configured the Pages project, set up the build pipeline, updated the workflows, removed the old GitHub Pages config, and tested the deployment. The commit that prepared the migration is authored by Claude.

An AI moved my blog so that other AIs could read it. I’m choosing not to overthink that.

What actually changed

From a reader’s perspective, nothing. Same domain, same content, same URLs. Under the hood, the site builds on Cloudflare’s infrastructure, drafts are visible on preview branches (handy now that the repo is private), and future-dated posts publish automatically via scheduled rebuilds.
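The scheduled-rebuild piece is simple in principle: something fires on a timer and pokes a Cloudflare Pages deploy hook, which triggers a fresh build that picks up any posts whose dates have now passed. A sketch of what that could look like as a GitHub Actions workflow, where the secret name and schedule are assumptions rather than my actual config:

```yaml
# Illustrative sketch: rebuild the site periodically so future-dated
# posts go live without a manual push.
# CF_PAGES_DEPLOY_HOOK is a hypothetical secret holding the deploy hook URL.
name: Scheduled rebuild
on:
  schedule:
    - cron: "0 */6 * * *"   # every six hours (UTC)
  workflow_dispatch: {}     # allow manual runs too
jobs:
  trigger:
    runs-on: ubuntu-latest
    steps:
      - name: Hit the Cloudflare Pages deploy hook
        run: curl -s -X POST "$DEPLOY_HOOK"
        env:
          DEPLOY_HOOK: ${{ secrets.CF_PAGES_DEPLOY_HOOK }}
```

Deploy hooks are just URLs that accept a POST, so anything with a cron and `curl` would do; a GitHub Actions schedule keeps it in the same repo as the site.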

The robots.txt is mine again. The machines are welcome.