Trends
How we make sites AI-agent-ready (audit & setup)
AI agents already browse the web. Is your site ready?
ChatGPT with tool use, Perplexity, Claude, Gemini-in-Chrome — all of them already read sites and take actions "on the user's behalf." Next week a prospect will tell an AI: "find a marketing agency in Chișinău and put a call on my calendar." If the agency's site isn't agent-ready, the request dies at step two or three.
In March we ran smmart.md through [isitagentready.com](https://isitagentready.com) — a free audit covering 10+ standards for AI agents. We were at ~40%. Here's what we shipped to clear 90%+.
What an agent-ready audit checks
- `llms.txt` — the plain-text site map agents read first
- `robots.txt` with an explicit allowlist for GPTBot, ClaudeBot, PerplexityBot, Google-Extended
- JSON-LD schemas (Organization, BlogPosting, BreadcrumbList) — agents parse them as structured data
- `/.well-known/agent-skills/index.json` (RFC v0.2.0) — a catalog of actions agents can call
- Markdown content negotiation — `Accept: text/markdown` returns markdown instead of HTML
- WebMCP — `navigator.modelContext.provideContext()` in the browser
- `/.well-known/security.txt` — disclosure path for researchers
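To make the first item concrete, here is a minimal `llms.txt` sketch. The page paths and descriptions below are illustrative, not smmart.md's actual file:

```txt
# smmart.md: SMM agency in Chișinău
> Social media marketing for Moldovan brands, in three locales.

## Pages
- [Services](https://smmart.md/en/services): what we do and pricing
- [Contact](https://smmart.md/en/contact): send a brief

## Articles
- [SMM Playbook Moldova 2026](https://smmart.md/en/ideas/playbook-smm-moldova-2026): trends and budgets
```

Agents fetch this file before (or instead of) crawling, so it should stay short and current.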
What we shipped on smmart.md
Agent-readable content. `/llms.txt` lists 21 marketing pages and 7 articles across three locales; `sitemap.xml` covers 39 URLs. Every page carries schema.org JSON-LD (Organization, Service, BreadcrumbList, BlogPosting).
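As a sketch, an Organization block of the kind each page carries looks like this; the logo path and `sameAs` links are placeholders, not smmart.md's real values:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "smmart.md",
  "url": "https://smmart.md",
  "logo": "https://smmart.md/logo.png",
  "sameAs": ["https://www.instagram.com/smmart.md"]
}
</script>
```

Both search engines and AI crawlers read this block as structured data, which is why it feeds AI Overviews directly.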
Markdown for Agents. Send `Accept: text/markdown` to any page and you get the markdown source instead of HTML: `curl -H "Accept: text/markdown" https://smmart.md/en/ideas/playbook-smm-moldova-2026` returns the markdown body directly.
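On the server side, the whole trick is one check on the `Accept` header. A framework-agnostic sketch — the `wantsMarkdown` and `negotiate` helpers are ours for illustration, not smmart.md's actual middleware:

```javascript
// Did the client negotiate markdown via the Accept header?
// Real middleware would also honour q-values; this sketch checks presence only.
function wantsMarkdown(acceptHeader) {
  return (acceptHeader || "")
    .split(",")
    .map((part) => part.trim().split(";")[0])
    .includes("text/markdown");
}

// Pick the response for a page that exists in both formats.
function negotiate(acceptHeader, { html, markdown }) {
  return wantsMarkdown(acceptHeader)
    ? { contentType: "text/markdown", body: markdown }
    : { contentType: "text/html", body: html };
}
```

Drop a function like this into Cloudflare, Vercel or Netlify edge middleware and every page becomes agent-readable without a second CMS.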
Agent Skills. `/.well-known/agent-skills/index.json` declares two skills: `submit-lead` (POST /api/lead with a JSON schema) and `browse-articles`. Agents know which actions they can take on the site.
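For shape, a skills index might look like the sketch below. This is not a copy of smmart.md's file, and the field names are our reading of the agent-skills RFC draft — treat them as an assumption, as are the lead fields (`name`, `email`, `message`):

```json
{
  "version": "0.2.0",
  "skills": [
    {
      "name": "submit-lead",
      "description": "File a brief without using the visual form",
      "method": "POST",
      "endpoint": "/api/lead",
      "inputSchema": {
        "type": "object",
        "required": ["name", "email", "message"],
        "properties": {
          "name": { "type": "string" },
          "email": { "type": "string", "format": "email" },
          "message": { "type": "string" }
        }
      }
    },
    { "name": "browse-articles", "method": "GET", "endpoint": "/api/articles" }
  ]
}
```

The point is discoverability: an agent reads one JSON file and knows which endpoints it may call and with what payload.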
WebMCP. In Chrome with an active AI helper, `navigator.modelContext` sees `submit-lead` as an inline tool. The agent can file a brief directly, without filling in the visual form.
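A registration sketch for the browser side. The tool-descriptor fields track the evolving WebMCP draft, so the exact names (`execute`, `inputSchema`) are an assumption; the guard makes the snippet a no-op where the API is absent:

```javascript
// Describe submit-lead as an inline WebMCP tool.
// Field names follow our reading of the draft spec (assumption).
const submitLeadTool = {
  name: "submit-lead",
  description: "File a brief with the agency without using the visual form",
  inputSchema: {
    type: "object",
    required: ["email", "message"],
    properties: {
      email: { type: "string" },
      message: { type: "string" },
    },
  },
  // Called by the in-browser agent with validated arguments.
  async execute(args) {
    const res = await fetch("/api/lead", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(args),
    });
    return { ok: res.ok };
  },
};

// Only register when the browser actually exposes the API.
if (typeof navigator !== "undefined" && navigator.modelContext) {
  navigator.modelContext.provideContext({ tools: [submitLeadTool] });
}
```

Once registered, the agent sees `submit-lead` alongside its other tools and can file the brief without touching the DOM.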
Why it matters for a local brand
Today: ~1-3% of B2B leads arrive via AI search (ChatGPT, Perplexity, AI Mode). Twelve months out, analysts forecast 10-15%.
The brands that show up first in AI results have:
- Clean schema markup (Google scrapes it for AI Overviews)
- An up-to-date `llms.txt`
- A response to `Accept: text/markdown`
- Descriptions and meta tags optimised to 150-160 chars
This is not 2010-era SEO. This is AI SEO for 2026.
What we do for clients
A full audit via isitagentready.com plus per-platform implementation (WordPress, Webflow, Next.js, Shopify):
- `llms.txt` + `robots.txt` with an AI-bot allowlist
- Schema.org Organization + LocalBusiness + per-page schemas
- Markdown content negotiation (Cloudflare, Vercel, Netlify edge middleware)
- Agent Skills index if the site exposes public actions (leads, booking, search)
- WebMCP provider where it earns its keep (Chrome-heavy brands)
- Security headers + `.well-known/security.txt`
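The robots allowlist from the first bullet is the cheapest line item. A sketch of what we mean, with the catch-all rules kept illustrative:

```txt
# Explicitly welcome the AI crawlers that matter
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /

# Everyone else: default rules
User-agent: *
Allow: /

Sitemap: https://smmart.md/sitemap.xml
```

An explicit allowlist beats silence: some AI crawlers back off when a site's intent is ambiguous.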
smmart.md is our own proof — pushed from ~40% to 90%+ on isitagentready in three days.
[Send a brief](/contact) if you want the same audit + setup on your site.
Key takeaways
- AI agents (ChatGPT, Perplexity, Claude, Gemini-in-Chrome) already browse the web on users' behalf.
- Agent-ready checklist: `llms.txt`, robots allowlist, JSON-LD schemas, `agent-skills`, markdown negotiation, WebMCP, `security.txt`.
- Today ~1-3% of B2B leads come via AI search; analysts forecast 10-15% within 12 months.
- Markdown content negotiation (`Accept: text/markdown`) is the cheapest agent-ready win.
- smmart.md moved from ~40% to 90%+ on isitagentready in three days — same process for clients.