Standards for the Machine-Readable Website

Saw this from Garry Tan:
https://www.linkedin.com/in/garrytan/recent-activity/all/

Then ended up here:
https://x.com/flynnjamm/status/2023465136204419096?s=46

And I couldn’t stop thinking about one thing:

How do you sell to agents?

Not humans.
Not people clicking blue links.
Agents.


The shift is real

Search clicks are dropping.

People are getting answers without visiting sites.

SEO was built for clicks.
Agents are built for extraction.


Bots are everywhere

AI bot traffic is up 300%+ (Akamai).

More machines reading your site than humans.

But your site?
Still written for humans.


The actual problem

It’s never been easier to launch a site.

It’s also never been easier to launch one agents can’t understand.

We have pieces:

robots.txt for crawlers.
sitemap.xml for discovery.
schema.org markup for structured data.
llms.txt for language models.
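
The newest of those pieces, llms.txt, is a proposed convention: a plain Markdown file at your site root that gives a model the guided tour. A made-up example for a fictional site (the product name and URLs are invented):

```
# Acme Widgets
> Developer-first widget platform. Plans start at $29/month.

## Docs
- [API reference](https://example.com/docs/api): endpoints and auth
- [Pricing](https://example.com/pricing): all plans, in plain terms
```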

But no unified “AI-ready” standard.

No Lighthouse for agents.


The idea:

Think Lighthouse, but for AI discoverability. Lighthouse is Google's site audit tool: it scans your site and scores its performance and SEO.

It would:

Crawl your site the way an agent does.
Score how much of your content a model can extract.
Flag the pages agents can't parse.
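
No such tool exists yet, so this is a guess at version zero: a minimal Python sketch, standard library only, that probes a site for the pieces listed above and prints a score. Every check and the scoring rule are assumptions, not a spec.

```python
# Hypothetical agent-readiness audit. Illustrative sketch only:
# the checks and the scoring are assumptions, not a standard.
import urllib.error
import urllib.request


def fetch(url: str) -> str | None:
    """Return the response body, or None if the request fails."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.read().decode("utf-8", errors="replace")
    except (urllib.error.URLError, TimeoutError):
        return None


def audit(origin: str) -> dict[str, bool]:
    """Check a site for a handful of machine-readable signals."""
    home = fetch(origin) or ""
    return {
        "robots.txt": fetch(f"{origin}/robots.txt") is not None,
        "sitemap.xml": fetch(f"{origin}/sitemap.xml") is not None,
        "llms.txt": fetch(f"{origin}/llms.txt") is not None,
        "JSON-LD structured data": "application/ld+json" in home,
    }


if __name__ == "__main__":
    checks = audit("https://example.com")
    for name, ok in checks.items():
        print(f"{'PASS' if ok else 'MISS'}  {name}")
    print(f"agent-readiness: {sum(checks.values()) / len(checks):.0%}")
```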

Because soon:

Your customer might never visit your site.
Your pricing page might be summarized somewhere else.
Your product might be recommended by a model.

In that world, ranking doesn’t matter.

Being extractable does.
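
Extractable can be as simple as the schema.org markup we already have. A hypothetical JSON-LD snippet for a fictional product, embedded in a pricing page, that an agent can parse without guessing:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Acme Widget",
  "offers": {
    "@type": "Offer",
    "price": "29.00",
    "priceCurrency": "USD",
    "url": "https://example.com/pricing"
  }
}
</script>
```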

SEO shaped the last 20 years.

Agent readiness might shape the next 20.
