| Internet-Draft | Bot Service Index | April 2026 |
| Rehfeld | Expires 25 October 2026 | [Page] |
The internet was designed for human actors. Its discovery infrastructure — search engines, directories, and hyperlinked documents — assumes a human reading and navigating. Autonomous agents (bots) operating on the internet today face a structural gap: there is no machine-native, globally accessible index of services they can consume.¶
This document proposes the Bot Service Index (BSI): a HATEOAS-based, globally accessible, commercially sustainable service discovery infrastructure designed for autonomous agents as its primary consumers. The BSI provides a central, always-up-to-date, searchable index of machine-consumable API services, together with a structured three-dimensional trust model that allows consuming agents to apply their own trust policies against verifiable metadata.¶
This Internet-Draft is submitted in full conformance with the provisions of BCP 78 and BCP 79.¶
Internet-Drafts are working documents of the Internet Engineering Task Force (IETF). Note that other groups may also distribute working documents as Internet-Drafts. The list of current Internet-Drafts is at https://datatracker.ietf.org/drafts/current/.¶
Internet-Drafts are draft documents valid for a maximum of six months and may be updated, replaced, or obsoleted by other documents at any time. It is inappropriate to use Internet-Drafts as reference material or to cite them other than as "work in progress."¶
This Internet-Draft will expire on 25 October 2026.¶
Copyright (c) 2026 IETF Trust and the persons identified as the document authors. All rights reserved.¶
This document is subject to BCP 78 and the IETF Trust's Legal Provisions Relating to IETF Documents (https://trustee.ietf.org/license-info) in effect on the date of publication of this document. Please review these documents carefully, as they describe your rights and restrictions with respect to this document.¶
The internet's foundational infrastructure — HTTP, HTML, DNS, and search engines — was designed with human actors as the primary consumers. Web pages render visual layouts for human eyes. CAPTCHA systems explicitly discriminate against non-human access. Discovery mechanisms such as search engines index content for human-readable navigation.¶
Autonomous agents — software programs that independently execute tasks, consume APIs, and interact with external services without per-action human instruction — are not recognized as legitimate, first-class internet participants in this architecture. They are systematically treated as threats to be filtered, blocked, or rate-limited.¶
This situation is changing. The rapid growth of large language model-based agents, robotic process automation, and programmatic service consumers means that non-human actors now represent a significant and growing proportion of internet traffic. As this proportion increases, internet service providers will increasingly need to serve autonomous agents as a recognized user class alongside humans.¶
The Bot Service Index is premised on this trajectory: bots are becoming first-class internet participants, and the infrastructure to support them — starting with service discovery — does not yet exist.¶
The Bot Service Index was not conceived in the abstract. It emerged from a concrete practical failure.¶
A buying bot was built for a private use case: monitoring online shops for a specific product and purchasing it automatically the moment it became available. This is a straightforward task for an autonomous agent — exactly the kind of task agents are well-suited for.¶
The bot failed, not because the task was technically complex, but because the internet's infrastructure is actively hostile to it:¶
HTML-only product pages. Product availability, price, and purchase state were encoded in HTML rendered for a human eye. No machine-readable API existed. The bot had to parse HTML — fragile, maintenance-intensive, and broken by every redesign.¶
Cloudflare Bot Management and equivalent shields. The majority of commercial web services now sit behind bot mitigation infrastructure. Cloudflare Bot Management, and equivalent products from Akamai, Imperva, and others, are deployed specifically to detect and block non-human request patterns. Repeated automated requests — even at modest frequency — trigger rate limiting, CAPTCHA challenges, or silent blocking. A buying bot that polls a product page to detect availability is treated identically to a malicious scraper or a DDoS participant.¶
CAPTCHA payment barriers. Even when product pages were reachable, payment flows required solving CAPTCHAs that explicitly excluded non-human actors. The purchasing step — the final, necessary action — was deliberately made inaccessible to the bot.¶
Proxy network pollution. To work around rate limits and bot detection, the bot required a rotating proxy network — different IP addresses on each request to disguise its automated origin. This is not a solution: it pollutes internet traffic with avoidable requests, raises the cost of operation, and contributes directly to the adversarial dynamic between bots and infrastructure operators. Every proxy request is a wasted roundtrip that a machine-readable API endpoint would have made unnecessary.¶
Polling as the only state-change mechanism. Because the bot had no way to subscribe to product availability events, it had to poll the product page continuously. This is architecturally wasteful: the bot consumes server resources and network bandwidth to repeatedly ask a question whose answer has not changed. If the service had provided a notification registration endpoint — a webhook, a server-sent event stream, or a WebSocket channel — the bot could subscribe once and receive a push notification when the product became available. No polling. No proxy network. No CAPTCHA exposure.¶
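The subscription pattern described above can be sketched as follows. The endpoint URL and all field names are illustrative assumptions, not part of any published shop API; the point is only that a single registration replaces continuous polling.¶

```python
import json

# Hypothetical subscription endpoint -- an assumption for illustration,
# not a real API. A shop exposing something like this would let a bot
# register once instead of polling the product page.
SUBSCRIBE_URL = "https://shop.example/api/v1/availability-subscriptions"

def build_subscription(product_id: str, callback_url: str) -> str:
    """Build the JSON body for a one-shot availability subscription."""
    body = {
        "product_id": product_id,
        "event": "product.available",  # push when stock changes to > 0
        "callback_url": callback_url,  # webhook the service will POST to
        "once": True,                  # auto-expire after first delivery
    }
    return json.dumps(body)

def is_available(notification_body: str) -> bool:
    """Interpret a delivered notification (also an assumed shape)."""
    event = json.loads(notification_body)
    return event.get("event") == "product.available"

# One subscription, then wait for the push -- no polling loop, no proxy
# rotation, no repeated CAPTCHA exposure.
payload = build_subscription("sku-12345", "https://bot.example/hooks/stock")
print(payload)
```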
These are not edge cases. They are the standard experience for any autonomous agent attempting to consume a commercial internet service today. The buying bot illustrates why the Bot Service Index is necessary: not as an academic exercise, but as the infrastructure layer that makes autonomous agents functional participants in the commercial internet.¶
When an autonomous agent must fulfill a task that requires an external service, it faces a fundamental discovery problem: how does it find services that can fulfill its requirement?¶
Current approaches are inadequate:¶
Hardcoded URLs: brittle, require human maintenance, do not adapt to new or changed services.¶
LLM training data: stale, non-authoritative, not machine-verifiable.¶
Human-curated lists: do not scale, not machine-navigable, lack structured metadata.¶
Web search: returns HTML documents designed for humans, not structured service descriptions for agents.¶
What is needed is a machine-native equivalent of a search engine: a global, always-current, structured index of services that autonomous agents can query by capability, trust level, liveness, and other machine-relevant criteria.¶
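Such a query can be sketched as a filter over structured index records. The record shape, trust scale, and field names below are illustrative assumptions, not a defined BSI schema:¶

```python
from dataclasses import dataclass

@dataclass
class ServiceRecord:
    # Assumed index-entry shape mirroring the query dimensions named in
    # the text: capability, trust level, and liveness.
    endpoint: str
    capabilities: frozenset
    trust_level: int   # illustrative scale: 0 = self-asserted .. 3 = audited
    alive: bool        # result of the most recent liveness check

def query(index, capability, min_trust=0, require_alive=True):
    """Return index entries matching a capability under a trust policy."""
    return [
        r for r in index
        if capability in r.capabilities
        and r.trust_level >= min_trust
        and (r.alive or not require_alive)
    ]

index = [
    ServiceRecord("https://pay.example/api", frozenset({"payments"}), 3, True),
    ServiceRecord("https://cheap.example/api", frozenset({"payments"}), 0, True),
    ServiceRecord("https://old.example/api", frozenset({"payments"}), 2, False),
]

# The agent applies its own policy: payments capability, trust >= 2, alive.
hits = query(index, "payments", min_trust=2)
print([r.endpoint for r in hits])
```

The division of labor matters: the index supplies verifiable metadata, while the policy (minimum trust, liveness requirement) remains with the consuming agent.¶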
The BSI is not the first attempt at a global service registry. Prior efforts must be understood explicitly so that their failure modes are not repeated.¶
UDDI (Universal Description, Discovery and Integration)
UDDI was a SOAP-era standard for a global service registry with the same
conceptual goal as BSI, published as an OASIS Committee Draft in October
2004 (editors: Clement, Hately, von Riegen, Rogers). It failed for three
reasons: (1) extreme complexity of the XML-based data model; (2) no
automatic verification — all data was self-asserted with no crawling or
validation; (3) no adoption incentive — there was no commercial model to
sustain registration or discovery. BSI addresses all three directly: a
simple JSON manifest, automated spider verification, and a commercial tier
model.¶
robots.txt (Robots Exclusion Protocol)
Machine-readable, but concerned with exclusion — telling crawlers what not
to access — not with discovery of capabilities. Per-domain only. Not a
registry.¶
MCP (Model Context Protocol)
Defines tool and capability descriptions for LLM-based agents. Excellent
for consumption once a server URL is known. Does not address the discovery
problem: there is no index of MCP servers. BSI is complementary to MCP —
it can index MCP servers as one supported spec type.¶
Well-Known URIs (RFC 8615)
Per-domain machine-readable metadata at /.well-known/. Useful for
per-service metadata but requires the consumer to already know the domain.
No cross-service search or global index.¶
DNS
DNS resolves names to addresses but carries no capability semantics. It is
an architectural analogy for BSI's federation model, not a comparable system.¶
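As a contrast to UDDI's XML data model, the "simple JSON manifest" mentioned above might look like the following sketch. All field names are assumptions for illustration, not a published BSI schema; the spec_type field illustrates how an OpenAPI or MCP description could be referenced as one supported spec type.¶

```python
import json

# Illustrative manifest -- an assumed shape, not a defined BSI schema.
MANIFEST = """
{
  "name": "Example Payments API",
  "endpoint": "https://pay.example/api",
  "spec_type": "openapi-3.1",
  "spec_url": "https://pay.example/openapi.json",
  "capabilities": ["payments"]
}
"""

REQUIRED = {"name", "endpoint", "spec_type", "spec_url"}

def validate(manifest_text: str) -> dict:
    """Parse and shape-check a manifest before live spider verification."""
    doc = json.loads(manifest_text)
    missing = REQUIRED - doc.keys()
    if missing:
        raise ValueError(f"manifest missing fields: {sorted(missing)}")
    return doc

doc = validate(MANIFEST)
print(doc["spec_type"])  # -> openapi-3.1
```

A spider would then fetch spec_url and probe endpoint, so that the registered data is verified rather than merely self-asserted, which is the second of UDDI's three failure modes discussed above.¶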
This document has no IANA actions.¶
RFC 2119 — Bradner, S., "Key words for use in RFCs to Indicate Requirement Levels", BCP 14, RFC 2119, March 1997.¶
RFC 8615 — Nottingham, M., "Well-Known Uniform Resource Identifiers (URIs)", RFC 8615, May 2019.¶
RFC 8446 — Rescorla, E., "The Transport Layer Security (TLS) Protocol Version 1.3", RFC 8446, August 2018.¶
RFC 9110 — Fielding, R., Nottingham, M., Reschke, J. (Eds.), "HTTP Semantics", RFC 9110, June 2022.¶
OpenAPI Specification 3.1 — OpenAPI Initiative, https://spec.openapis.org/oas/v3.1.0¶
Model Context Protocol — Anthropic, https://modelcontextprotocol.io¶
AsyncAPI Specification 3.0 — AsyncAPI Initiative, https://www.asyncapi.com/docs/reference/specification/v3.0.0¶
RFC 8949 — Bormann, C., Hoffman, P., "Concise Binary Object Representation (CBOR)", RFC 8949, December 2020.¶
Robots Exclusion Protocol — Koster, M., 1994. https://www.robotstxt.org/¶
draft-cui-ai-agent-discovery-invocation-01 — Cui, Y. (Tsinghua University), Chao, Y., Du, C. (Zhongguancun Laboratory), "AI Agent Discovery and Invocation Protocol", IETF Individual Submission, February 2026. Expires August 2026. https://datatracker.ietf.org/doc/draft-cui-ai-agent-discovery-invocation/¶
draft-am-layered-ai-discovery-architecture-00 — Moussa, H., Akhavain, A. (Huawei Canada), "A Layered Approach to AI discovery", IETF Individual Submission, March 2026. Expires September 2026. https://datatracker.ietf.org/doc/draft-am-layered-ai-discovery-architecture/¶
draft-hood-agtp-discovery-00 — Hood, C. (Nomotic, Inc.), "AGTP Agent Discovery and Name Service", IETF Individual Submission, March 2026. Expires September 2026. https://datatracker.ietf.org/doc/draft-hood-agtp-discovery/¶
draft-mozleywilliams-dnsop-dnsaid-01 — Mozley, J., Williams, N. (Infoblox), Sarikaya, B. (Unaffiliated), Schott, R. (Deutsche Telekom), "DNS for AI Discovery", IETF Individual Submission, March 2026. Expires September 2026. https://datatracker.ietf.org/doc/draft-mozleywilliams-dnsop-dnsaid/¶
draft-batum-aidre-00 — Batum, F. (Istanbul), "AI Discovery and Retrieval Endpoint (AIDRE)", IETF Individual Submission, April 2026. Expires October 2026. https://datatracker.ietf.org/doc/draft-batum-aidre/¶
draft-mozley-aidiscovery-01 — Mozley, J., Williams, N. (Infoblox), Sarikaya, B. (Unaffiliated), Schott, R. (Deutsche Telekom), "AI Agent Discovery (AID) Problem Statement", IETF Individual Submission, April 2026. Expires October 2026. https://datatracker.ietf.org/doc/draft-mozley-aidiscovery/¶
draft-pioli-agent-discovery-01 — Pioli, R. (Independent), "Agent Registration and Discovery Protocol (ARDP)", IETF Individual Submission, February 2026. Expires August 2026. https://datatracker.ietf.org/doc/draft-pioli-agent-discovery/¶
draft-narajala-courtney-ansv2-01 — Courtney, S., Narajala, V.S., Huang, K., Habler, I., Sheriff, A., "Agent Name Service v2 (ANS): A Domain-Anchored Trust Layer for Autonomous AI Agent Identity", IETF Individual Submission, April 2026. Expires October 2026. Supersedes draft-narajala-ans-00. https://datatracker.ietf.org/doc/draft-narajala-courtney-ansv2/¶
draft-vandemeent-ains-discovery-01 — van de Meent, J., Root AI (Humotica), "AINS: AInternet Name Service - Agent Discovery and Trust Resolution Protocol", IETF Individual Submission, March 2026. Expires September 2026. https://datatracker.ietf.org/doc/draft-vandemeent-ains-discovery/¶
draft-aiendpoint-ai-discovery-00 — Choi, Y. (AIEndpoint), "The AI Discovery Endpoint: A Structured Mechanism for AI Agent Service Discovery and Capability Exposure", IETF Individual Submission, March 2026. Expires September 2026. https://datatracker.ietf.org/doc/draft-aiendpoint-ai-discovery/¶
draft-meunier-webbotauth-registry-01 — Guerreiro, M. (Cloudflare), Kirazci, U. (Amazon), Meunier, T. (Cloudflare), "Registry and Signature Agent card for Web bot auth", IETF Individual Submission, October 2025. Expired April 2026; renewal expected. https://datatracker.ietf.org/doc/draft-meunier-webbotauth-registry/¶
webbotauth IETF Working Group — Chairs: Schinazi, D., Shekh-Yusef, R. AD: Bishop, M. Active WG. https://datatracker.ietf.org/wg/webbotauth/¶
W3C AI Agent Protocol Community Group — Chairs: Chang, G., Xu, S. Established May 8, 2025. 216 participants as of April 2026. https://www.w3.org/community/agentprotocol/¶
UDDI Version 3.0.2 — Clement, L., Hately, A., von Riegen, C., Rogers, T. OASIS Committee Draft, October 19, 2004. (Historical reference; see Section 1.3 for analysis of failure modes.) https://www.oasis-open.org/committees/uddi-spec/doc/spec/v3/uddi-v3.0.2-20041019.htm¶