How to Score 100/100 on AgentReady: Complete Guide
AgentReady scores your website's AI agent readiness on a scale of 0 to 100 across six categories. Most websites score between 5 and 25 on their first scan. This guide walks you through every category, explains what's checked, and provides the exact code to fix each issue.
Time estimate: A typical website can go from ~15/100 to 100/100 in 1-2 hours. Static sites and SPAs are the fastest to fix. WordPress sites may need a plugin or theme edit.
1. WebMCP Compliance
Weight: 30%

This is the largest scoring category. AgentReady checks whether your forms have WebMCP attributes (declarative) and whether you register tools via the JavaScript API (imperative).
What's checked
- Forms have `tool-name` and `tool-description` attributes
- Form inputs have `tool-param-description` attributes
- The page calls `navigator.modelContext.registerTool()`
- Tool definitions include proper JSON Schema parameters
- Tools have descriptive names (not generic like "form1")
Fix: Declarative approach
Before:

```html
<form action="/search">
  <input name="q" type="text">
  <button type="submit">Search</button>
</form>
```

After:

```html
<form action="/search"
      tool-name="site-search"
      tool-description="Search products by keyword">
  <input name="q" type="text"
         tool-param-description="Search query text">
  <button type="submit">Search</button>
</form>
```

Fix: Imperative approach
```javascript
if (navigator.modelContext) {
  navigator.modelContext.registerTool({
    name: "site-search",
    description: "Search products by keyword, category, or price range",
    parameters: {
      type: "object",
      properties: {
        query: { type: "string", description: "Search keywords" },
        category: { type: "string", description: "Product category filter" },
        maxPrice: { type: "number", description: "Maximum price in USD" }
      },
      required: ["query"]
    },
    handler: async ({ query, category, maxPrice }) => {
      // Build the query string with URLSearchParams so user-supplied
      // values are encoded and don't break the URL
      const params = new URLSearchParams({ q: query });
      if (category) params.set("cat", category);
      if (maxPrice) params.set("max", maxPrice);
      const response = await fetch(`/api/search?${params}`);
      return response.json();
    }
  });
}
```

2. AI Discovery Protocols
Weight: 20%

AgentReady checks for the five standard AI discovery files. Each detected protocol adds points.
What's checked
- `/.well-known/agent.json` — A2A Agent Card
- `/.well-known/mcp.json` — MCP Discovery
- `/agents.json` — Agent policy manifest
- `/openapi.json` or linked OpenAPI spec
- `/llms.txt` — LLM-readable site summary
You don't need all five for a good score, but having all five gets you maximum points. See our detailed guide on AI discovery protocols for implementation instructions.
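For the A2A Agent Card, a minimal `/.well-known/agent.json` might look like the sketch below. Field names follow the A2A Agent Card draft as best understood here and may change between spec revisions, so verify against the current A2A specification before shipping:

```json
{
  "name": "Your Site Name",
  "description": "What your service does in one sentence",
  "url": "https://yoursite.com",
  "version": "1.0.0",
  "capabilities": { "streaming": false },
  "defaultInputModes": ["text"],
  "defaultOutputModes": ["text"],
  "skills": [
    {
      "id": "site-search",
      "name": "Site search",
      "description": "Search products by keyword"
    }
  ]
}
```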
Quick win: llms.txt
```
# Your Site Name

## About
Brief description of your website and its purpose.

## Key Pages
- / — Homepage
- /products — Product catalog
- /contact — Contact form
- /api — Developer API documentation

## API
- OpenAPI spec: /openapi.json
- Base URL: https://yoursite.com/api/v1

## Contact
- Email: hello@yoursite.com
```

Place this at your site root as llms.txt. Takes under 5 minutes.
3. Semantic HTML Structure
Weight: 15%

Even without WebMCP, good semantic HTML helps AI agents understand your page. AgentReady checks your HTML quality from an AI-readability perspective.
What's checked
- Form inputs have associated `<label>` elements
- Inputs have `name`, `type`, and `placeholder` attributes
- Required fields are marked with the `required` attribute
- Forms use semantic elements (`<fieldset>`, `<legend>`)
- Page has a logical heading hierarchy (h1 > h2 > h3)
- ARIA labels and roles are used where appropriate
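The heading-hierarchy check is easy to run yourself. The sketch below (regex-based and illustrative only; a real audit would walk the DOM) reports any place where a heading level jumps by more than one, such as an h1 followed directly by an h3:

```javascript
// Find heading-level jumps (e.g. h1 -> h3) in an HTML string.
// Regex-based sketch; a production check would parse the document.
function headingJumps(html) {
  const levels = (html.match(/<h([1-6])\b/gi) || [])
    .map(tag => parseInt(tag.slice(2), 10));
  const jumps = [];
  for (let i = 1; i < levels.length; i++) {
    // A jump of more than one level breaks the logical outline
    if (levels[i] > levels[i - 1] + 1) {
      jumps.push(`h${levels[i - 1]} followed by h${levels[i]}`);
    }
  }
  return jumps;
}
```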
Before:

```html
<div class="form">
  <div>Email</div>
  <input class="input-1">
  <div>Message</div>
  <textarea></textarea>
  <div class="btn" onclick="send()">Send</div>
</div>
```

After:

```html
<form action="/contact" method="POST"
      tool-name="contact"
      tool-description="Send a message to our team">
  <label for="email">Email</label>
  <input id="email" name="email" type="email" required
         tool-param-description="Your email address">
  <label for="msg">Message</label>
  <textarea id="msg" name="message" required
            tool-param-description="Your message"></textarea>
  <button type="submit">Send</button>
</form>
```

4. Crawlability & Robots
Weight: 15%

AI agents need permission to access your site. AgentReady checks whether your robots.txt and meta tags welcome AI crawlers.
What's checked
- `robots.txt` exists and is accessible
- AI-specific bots (GPTBot, ClaudeBot, etc.) are not blocked
- No `<meta name="robots" content="noindex, nofollow">` blocking agents
- Sitemap is referenced in robots.txt
- Pages load without JavaScript-only rendering issues
Optimal robots.txt
```
User-agent: *
Allow: /

# AI Crawlers — explicitly welcome
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: GoogleOther
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Applebot-Extended
Allow: /

Sitemap: https://yoursite.com/sitemap.xml
```

See our complete guide to robots.txt for AI bots for details on each crawler.
5. Structured Data
Weight: 10%

JSON-LD structured data helps AI agents understand the type and context of your content.
What's checked
- Page contains JSON-LD structured data (`<script type="application/ld+json">`)
- Schema.org types are used correctly
- Key properties are populated (name, description, url)
- No JSON parsing errors in structured data blocks
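The parse-error check is simple to automate. This sketch (regex-based and illustrative only; a production scanner would use a real HTML parser) pulls every JSON-LD block out of an HTML string and reports any that fail to parse:

```javascript
// Extract JSON-LD <script> blocks from an HTML string and collect
// JSON.parse error messages for any that are malformed.
function findInvalidJsonLd(html) {
  const pattern = /<script[^>]*type=["']application\/ld\+json["'][^>]*>([\s\S]*?)<\/script>/gi;
  const errors = [];
  let match;
  while ((match = pattern.exec(html)) !== null) {
    try {
      JSON.parse(match[1]);
    } catch (e) {
      errors.push(e.message);
    }
  }
  return errors; // empty array means every block parsed cleanly
}
```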
```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebApplication",
  "name": "Your App Name",
  "description": "What your app does in one sentence",
  "url": "https://yoursite.com",
  "applicationCategory": "BusinessApplication",
  "operatingSystem": "Web",
  "offers": {
    "@type": "Offer",
    "price": "0",
    "priceCurrency": "USD"
  }
}
</script>
```

6. Meta & OpenGraph
Weight: 10%

Proper meta tags give AI agents context about your page before they even parse the DOM.
What's checked
- `<title>` is present and descriptive
- `<meta name="description">` exists and is meaningful
- OpenGraph tags (`og:title`, `og:description`, `og:type`, `og:url`)
- Twitter card meta tags
- Canonical URL is set
- Language is declared (`lang` attribute on `<html>`)
```html
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <title>Your Page Title — Your Brand</title>
  <meta name="description" content="Clear, concise description of what this page does. 150-160 characters.">
  <meta property="og:title" content="Your Page Title">
  <meta property="og:description" content="Same or similar description.">
  <meta property="og:type" content="website">
  <meta property="og:url" content="https://yoursite.com/page">
  <meta name="twitter:card" content="summary_large_image">
  <link rel="canonical" href="https://yoursite.com/page">
</head>
```

Putting It All Together: The 100/100 Checklist
- Add WebMCP attributes to every `<form>` and `<input>` on your pages
- Register tools via `navigator.modelContext.registerTool()` for complex interactions
- Create llms.txt and agents.json at your site root
- Add A2A Agent Card at `/.well-known/agent.json`
- Add MCP Discovery at `/.well-known/mcp.json` (if applicable)
- Generate or link an OpenAPI spec (if you have an API)
- Fix semantic HTML — labels, names, types, required attributes
- Configure robots.txt — allow AI crawlers, reference sitemap
- Add JSON-LD structured data with appropriate Schema.org types
- Complete meta tags — title, description, OG tags, canonical URL
Pro tip: After making changes, use AgentReady's rescan feature to verify your score immediately. The Agent Simulator tab shows exactly how an AI agent sees your page before and after your fixes.
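Before rescanning, you can also spot-check your templates offline. This sketch (regex-based and illustrative only; a real audit would walk the rendered DOM) flags `<form>` tags that are missing the two core WebMCP attributes:

```javascript
// Flag <form> tags in an HTML string that lack tool-name or
// tool-description. Regex-based sketch, not a full HTML parser.
function auditForms(html) {
  const issues = [];
  const forms = html.match(/<form\b[^>]*>/gi) || [];
  forms.forEach((tag, i) => {
    if (!/\btool-name=/.test(tag)) issues.push(`form #${i + 1}: missing tool-name`);
    if (!/\btool-description=/.test(tag)) issues.push(`form #${i + 1}: missing tool-description`);
  });
  return issues; // empty array means every form carries both attributes
}
```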
Common Mistakes That Cost Points
- Generic tool names — "form1" or "submit-form" lose points. Use descriptive names like "product-search" or "newsletter-signup".
- Missing parameter descriptions — Every input needs a `tool-param-description`. Agents need to know what each field expects.
- Blocking AI bots in robots.txt — Some default robots.txt configs block everything. Check for blanket `Disallow: /` rules.
- JavaScript-only forms — Forms rendered entirely by JavaScript may not be detected by some scanners. Always include WebMCP tool registration in your JS.
- Empty or placeholder meta descriptions — "Welcome to our website" or "Page description" are flagged as low quality.
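The robots.txt mistake can be caught programmatically. Below is a simplified sketch of group matching (real matchers follow RFC 9309, including longest-match user-agent selection; this version only distinguishes an exact bot name from the `*` wildcard):

```javascript
// Check whether a robots.txt body blocks a given bot with a blanket
// "Disallow: /" rule. Consecutive User-agent lines share one rule group.
function isBotBlocked(robotsTxt, bot) {
  const disallows = {};   // lowercased agent name -> list of Disallow paths
  let currentAgents = []; // agents the current rule group applies to
  let collecting = true;  // are we still reading User-agent lines?
  for (const raw of robotsTxt.split(/\r?\n/)) {
    const line = raw.replace(/#.*/, "").trim();
    if (!line) continue;
    const [key, ...rest] = line.split(":");
    const field = key.trim().toLowerCase();
    const value = rest.join(":").trim();
    if (field === "user-agent") {
      if (!collecting) currentAgents = []; // a new group starts
      collecting = true;
      currentAgents.push(value.toLowerCase());
      disallows[value.toLowerCase()] ??= [];
    } else {
      collecting = false;
      if (field === "disallow") {
        for (const agent of currentAgents) disallows[agent].push(value);
      }
    }
  }
  // A specific group for the bot overrides the wildcard group entirely
  const key = bot.toLowerCase() in disallows ? bot.toLowerCase() : "*";
  return (disallows[key] || []).includes("/");
}
```

For example, a file with `User-agent: *` / `Disallow: /` plus an explicit `User-agent: GPTBot` / `Allow: /` group blocks ClaudeBot (which falls through to the wildcard) but not GPTBot.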
Ready to hit 100?
Scan your site now and see exactly which fixes you need. AgentReady generates the code for you.
Scan Your Site Free