
Internal APIs Are All You Need: Shadow APIs, Shared Discovery, and the Case Against Browser-First Agent Architectures

Published 1 Apr 2026 in cs.ET and cs.AI | (2604.00694v1)

Abstract: Autonomous agents increasingly interact with the web, yet most websites remain designed for human browsers -- a fundamental mismatch that the emerging "Agentic Web" must resolve. Agents must repeatedly browse pages, inspect DOMs, and reverse-engineer callable routes -- a process that is slow, brittle, and redundantly repeated across agents. We observe that every modern website already exposes internal APIs (sometimes called *shadow APIs*) behind its user interface -- first-party endpoints that power the site's own functionality. We present Unbrowse, a shared route graph that transforms browser-based route discovery into a collectively maintained index of these callable first-party interfaces. The system passively learns routes from real browsing traffic and serves cached routes via direct API calls. In a single-host live-web benchmark of equivalent information-retrieval tasks across 94 domains, fully warmed cached execution averaged 950 ms versus 3,404 ms for Playwright browser automation (3.6× mean speedup, 5.4× median), with well-cached routes completing in under 100 ms. A three-path execution model -- local cache, shared graph, or browser fallback -- ensures the system is voluntary and self-correcting. A three-tier micropayment model via the x402 protocol charges per-query search fees for graph lookups (Tier 3), a one-time install fee for discovery documentation (Tier 1), and optional per-execution fees for site owners who opt in (Tier 2). All tiers are grounded in a necessary condition for rational adoption: an agent uses the shared graph only when the total fee is lower than the expected cost of browser rediscovery.

Summary

  • The paper introduces Unbrowse, a system that uses a shared route graph to leverage internal APIs and reduce web automation latency by over 3x.
  • It employs passive network observation and heuristic filtering to mine and normalize callable endpoints from live web traffic.
  • The approach integrates a novel micropayment model that aligns economic incentives, ensuring scalable and sustainable agentic interactions.

Summary of "Internal APIs Are All You Need: Shadow APIs, Shared Discovery, and the Case Against Browser-First Agent Architectures" (2604.00694)

Motivation and Problem Statement

The paper critiques current browser-based agent architectures for web automation, identifying the mismatch between human-centric UI design and agent task requirements. Autonomous agents are burdened by costly and redundant browser-automation workflows involving repeated DOM inspection, UI navigation, and reverse-engineering of callable routes. Official APIs, while performant, are rarely available, excluding a majority of the web from efficient agent interaction. The authors argue for a paradigm shift: direct leveraging of internal (shadow) APIs already exposed by websites for their own interfaces, transforming private reverse-engineering into a shared, commons-based infrastructure for web agents.

Unbrowse System and Shared Route Graph

The proposed Unbrowse system centralizes route discovery via a shared graph that aggregates callable endpoints mined from live browsing traffic rather than from manual developer input. The architecture introduces a three-path execution model: local cache, shared graph lookup, and browser fallback. Participation is voluntary: agents revert to browser-based self-discovery whenever the shared approach fails to deliver surplus utility. Unbrowse integrates with standard agent hosts (via CLI, MCP, and AgentSkills.io), providing a drop-in replacement for browser automation with transparent intent resolution and capability packaging.
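The three-path cascade can be sketched as a simple fallback chain. This is an illustrative sketch only; the class and function names (`LocalCache`, `graph_search`, `browser_fallback`) are ours, not from the paper.

```python
# Sketch of the three-path execution model:
# Path 1: local cache (fastest), Path 2: shared graph (paid lookup),
# Path 3: browser fallback (slow rediscovery). Names are illustrative.

class LocalCache:
    def __init__(self):
        self.routes = {}                 # task signature -> callable route

    def lookup(self, task):
        return self.routes.get(task)

    def store(self, task, route):
        self.routes[task] = route


def execute(task, cache, graph_search, browser_fallback, fee_budget):
    """Resolve a task via cache, shared graph, or browser, in that order."""
    route = cache.lookup(task)           # Path 1: warm local cache
    if route is not None:
        return route(task)
    hit = graph_search(task)             # Path 2: shared graph (per-query fee)
    if hit is not None and hit["fee"] <= fee_budget:
        cache.store(task, hit["route"])  # warm the cache for next time
        return hit["route"](task)
    return browser_fallback(task)        # Path 3: rediscover via browser
```

Because Path 2 only fires when its fee fits the agent's budget, the cascade degrades gracefully to plain browsing when the shared graph offers no surplus.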

Route discovery relies on passive network traffic observation, heuristic filtering of API calls, and normalization into structured skills. The marketplace indexes skills using semantic vector search, ranking candidates with a composite score over embedding similarity, reliability, freshness, and verification status.
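A ranking score of this shape could be realized as a weighted blend of the four signals. The paper names the factors but not their weights or functional form, so the weights and exponential freshness decay below are assumptions.

```python
# Hedged sketch of a composite ranking score over the four signals named
# in the text. Weights and the half-life decay are assumed, not from the paper.
import time

def composite_score(similarity, success_rate, last_verified_ts, verified,
                    now=None,
                    w_sim=0.5, w_rel=0.25, w_fresh=0.15, w_ver=0.10,
                    half_life_days=30.0):
    """Weighted blend of ranking signals, each normalized to [0, 1]."""
    now = time.time() if now is None else now
    age_days = max(0.0, (now - last_verified_ts) / 86400.0)
    freshness = 0.5 ** (age_days / half_life_days)   # decays with staleness
    return (w_sim * similarity
            + w_rel * success_rate
            + w_fresh * freshness
            + w_ver * (1.0 if verified else 0.0))
```

With these weights, a just-verified, perfectly matching, fully reliable route scores 1.0, and the score decays smoothly as verification ages.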

Economic and Incentive Model

The shared route graph is governed by a concrete economic condition: agents participate only when the cumulative fee for graph lookup and execution is lower than the expected rediscovery cost (including latency, compute, token burn, and retry probability). This market discipline is hard-coded and prevents unproductive rent-seeking.
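The adoption condition can be made concrete with a small expected-cost calculation. The cost components (latency, compute, token burn, retry probability) follow the text; the independent-retry model and the example numbers below are our assumptions.

```python
# Illustrative check of the rational-adoption condition: use the shared
# graph only when total fees fall below the expected browser-rediscovery
# cost. The geometric-retry model is an assumption for the sketch.

def expected_rediscovery_cost(latency_cost, compute_cost, token_cost,
                              retry_prob):
    """Expected cost of browser rediscovery, including retries.

    With independent failures at probability retry_prob, the expected
    number of attempts is 1 / (1 - retry_prob).
    """
    per_attempt = latency_cost + compute_cost + token_cost
    return per_attempt / (1.0 - retry_prob)

def should_use_graph(total_fee, latency_cost, compute_cost, token_cost,
                     retry_prob):
    """True iff the graph fee undercuts expected rediscovery cost."""
    return total_fee < expected_rediscovery_cost(
        latency_cost, compute_cost, token_cost, retry_prob)
```

For example, with a per-attempt rediscovery cost of 0.035 units and a 20% retry probability, the expected cost is 0.04375 units, so any graph fee below that clears the condition.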

The micropayment architecture, built atop the x402 protocol, consists of three tiers:

  1. Tier 1 (Skill Installation): One-time fee for documentation and code package.
  2. Tier 2 (Site-Owner Execution): Opt-in per-execution fee for sites electing to monetize agent traffic.
  3. Tier 3 (Search/Routing): Per-query fee for graph lookup, enabling sustainable index maintenance.

Fee splitting and contributor attribution are handled via delta-based scoring, which credits each historical contributor in proportion to their marginal improvement. The system avoids storing credentials in public routes, keeps execution local for security, and periodically re-verifies endpoint freshness to counter schema drift.
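Delta-based fee splitting could look like the following. The paper says shares track marginal improvements by historical contributors; the simple proportional formula here is an assumption.

```python
# Sketch of delta-based fee splitting: each contributor's payout is
# proportional to their marginal-improvement delta. The proportional
# rule is assumed; the paper only states that deltas drive attribution.

def split_fee(fee, deltas):
    """Split a fee across contributors in proportion to their deltas.

    deltas: dict contributor -> nonnegative marginal-improvement score.
    Returns dict contributor -> payout; payouts sum to the fee.
    """
    total = sum(deltas.values())
    if total == 0:
        return {c: 0.0 for c in deltas}  # nothing to attribute
    return {c: fee * d / total for c, d in deltas.items()}
```

A contributor who discovered a route and one who later fixed its schema would each receive a share weighted by how much each change improved the route.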

Empirical Results

In a live-web benchmark across 94 domains and information-retrieval tasks, Unbrowse achieved a mean latency of 950 ms versus 3,404 ms for Playwright (browser automation), yielding a mean speedup of 3.6x and a median speedup of 5.4x. All tasks were output-equivalent; cached execution on select domains consistently completed in under 100 ms. Cold-start (route not cached) averages 12.4 s, but subsequent executions revert to the cached speed, with breakeven realized in 3–5 uses per route. The system scales by network effects: more agents generate more demand, which drives broader route coverage and improved amortization.
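The breakeven claim checks out with back-of-envelope arithmetic on the reported numbers (12.4 s cold start, 0.95 s cached, 3.404 s browser); the simple amortization model below is ours.

```python
# Back-of-envelope amortization: find the smallest number of uses n at
# which total cached cost (one cold start plus n-1 warm runs) undercuts
# n browser runs. Latencies are from the paper's benchmark.

def breakeven_uses(cold_start_s, cached_s, browser_s):
    """Smallest n with cold_start + (n - 1) * cached < n * browser."""
    n = 1
    while cold_start_s + (n - 1) * cached_s >= n * browser_s:
        n += 1
    return n
```

Plugging in the benchmark means gives breakeven at the fifth use, consistent with the paper's 3-5 use range.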

Practical and Theoretical Implications

The Unbrowse architecture demonstrates that internal APIs are a nearly universal substrate of the web, and that collective discovery and sharing can decisively outcompete browser-centric automation for agent workflows—subject to the discipline set by real rediscovery cost. This approach expands agentic reach beyond sites supporting official APIs, establishing a robust knowledge commons with credible incentives for route contribution and maintenance.

The economic framework, including the adoption condition and dynamic pricing, provides a principled basis for agent economies, discouraging free-riding and ensuring sustainability. Mechanisms for security, trust, and provenance are under development, with noted limitations due to anti-bot measures, session expiry, and evolving legal standards.

Future Directions

Areas of further research include formal equilibrium analysis of the commons, incentive compatibility proofs for contributor attribution, large-scale evaluation of network effect dynamics, integration with emerging agent communication protocols (MCP, ANP), comparative studies versus proprietary API aggregation platforms, and enhanced adversarial robustness.

Conclusion

"Internal APIs Are All You Need" (2604.00694) presents an authoritative, empirical case for transitioning agent architectures from browser-first to shared-discovery-first modes. By organizing route knowledge into a commons, and underpinning agent interaction with a concrete, disciplined economic model, the system achieves significant computational and economic advantages in web automation. The implications for scalable agentic workflows and the emerging Agentic Web are substantial, provided security, incentive, and legal frameworks continue to mature in parallel.
