A new tool is available for website owners to assess their site's readiness for AI agents. The "Agent-Readiness" scan checks multiple emerging standards across five categories: discoverability, content accessibility, bot access control, protocol discovery, and commerce.

The scan evaluates key aspects such as robots.txt configuration, Markdown content negotiation, MCP (Model Context Protocol) server cards, and OAuth protection. It also checks for x402, UCP (Universal Commerce Protocol), and ACP (Agentic Commerce Protocol) implementations.
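Checks like these boil down to a handful of HTTP probes against well-known locations and headers. The sketch below shows roughly what that might look like; the endpoint paths and category labels are illustrative assumptions, not part of any published specification for this tool.

```python
# Hypothetical sketch of the probes an agent-readiness scan might run.
# Paths and headers are illustrative assumptions, not a documented spec.
from urllib.parse import urljoin

PROBES = [
    ("robots.txt", "/robots.txt", {}),                           # bot access control
    ("sitemap", "/sitemap.xml", {}),                             # discoverability
    ("markdown negotiation", "/", {"Accept": "text/markdown"}),  # content accessibility
    ("mcp discovery", "/.well-known/mcp.json", {}),              # protocol discovery (assumed path)
]

def build_requests(base_url: str) -> list[tuple[str, str, dict]]:
    """Expand the probe templates into absolute URLs for one site."""
    return [(name, urljoin(base_url, path), headers)
            for name, path, headers in PROBES]

if __name__ == "__main__":
    for name, url, headers in build_requests("https://example.com"):
        print(f"{name}: GET {url} {headers or ''}")
```

A real scanner would issue these requests and score each response (status code, content type, presence of expected fields) per category.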

According to the tool's instructions, website owners can improve their site's score by publishing a valid robots.txt with AI bot rules and sitemap directives, as well as exposing useful discovery headers or metadata on their homepage.
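A minimal robots.txt along those lines might look like the following; the bot names shown (GPTBot, ClaudeBot) are commonly used AI crawler user agents, and the allow-all policy and sitemap URL are placeholders to adapt to your own site:

```
# Example policy -- adjust rules to your own preferences
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

# Default rule for all other crawlers
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
```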

To use the scan, users can copy and paste the provided instructions into a coding agent such as Cursor, Claude Code, Windsurf, or Copilot to receive recommendations for improvement.