Feature

17,000+ LLMs Now Available on Backboard

What's New

Backboard now gives developers access to 17,000+ LLMs through a single API, the largest model library available on any AI platform. The unified API enables flexible, cost-efficient multi-model routing across all major providers.

Scale of Access
  • 17,000+ models from OpenAI, Anthropic, Google, Meta, Mistral, Cohere, AI21, DeepSeek, and dozens more.

  • New models added automatically as providers release them — no SDK updates needed.

  • Unified model metadata — context window sizes, pricing, capabilities, and latency benchmarks for every model.
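To make the "unified metadata" idea concrete, here is a minimal sketch of what a per-model record could look like. The field names, the `ModelInfo` class, and all numbers are illustrative assumptions, not the actual Backboard schema.

```python
from dataclasses import dataclass

# Hypothetical shape of a unified per-model metadata record.
# All field names and values below are illustrative, not the real API schema.
@dataclass
class ModelInfo:
    id: str                       # model identifier, e.g. "provider/model-name"
    context_window: int           # maximum tokens (input + output)
    input_price_per_mtok: float   # USD per million input tokens (made-up number)
    output_price_per_mtok: float  # USD per million output tokens (made-up number)
    capabilities: set             # e.g. {"vision", "function_calling", "json_mode"}
    p50_latency_ms: int           # median latency benchmark (made-up number)

example = ModelInfo(
    id="provider-a/flagship",
    context_window=128_000,
    input_price_per_mtok=2.50,
    output_price_per_mtok=10.00,
    capabilities={"vision", "function_calling", "json_mode"},
    p50_latency_ms=350,
)
```

Having context window, pricing, capabilities, and latency in one record is what makes the routing decisions described below possible without per-provider special cases.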

Intelligent Routing
  • Cost optimization: Route to the cheapest model that meets your quality threshold.

  • Latency optimization: Prefer faster models for real-time applications.

  • Capability matching: Automatically select models with required features (vision, function calling, JSON mode).

  • Fallback chains: Define ordered model preferences with automatic failover.
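The routing modes above can be sketched in a few lines. This is a simplified local model, assuming a hypothetical in-memory model list with made-up IDs, prices, and latencies; Backboard's actual router operates on its own metadata and error handling.

```python
# Illustrative model entries; IDs, prices, and latencies are invented.
MODELS = [
    {"id": "provider-a/small", "price": 0.15, "latency_ms": 120,
     "capabilities": {"json_mode"}},
    {"id": "provider-b/vision", "price": 1.00, "latency_ms": 300,
     "capabilities": {"vision", "json_mode", "function_calling"}},
    {"id": "provider-c/vision-turbo", "price": 2.50, "latency_ms": 90,
     "capabilities": {"vision", "json_mode", "function_calling"}},
]

def route(models, required, prefer="cost"):
    """Capability matching plus cost or latency optimization:
    keep only models offering every required capability, then pick
    the cheapest (prefer="cost") or fastest (prefer="latency")."""
    candidates = [m for m in models if required <= m["capabilities"]]
    if not candidates:
        raise LookupError("no model offers the required capabilities")
    key = "price" if prefer == "cost" else "latency_ms"
    return min(candidates, key=lambda m: m[key])

def complete_with_fallbacks(chain, call):
    """Fallback chain: try each model ID in order, failing over
    automatically until one call succeeds."""
    errors = []
    for model_id in chain:
        try:
            return call(model_id)
        except Exception as exc:  # real code would catch specific error types
            errors.append((model_id, exc))
    raise RuntimeError(f"all models in the fallback chain failed: {errors}")
```

The two pieces compose naturally: use `route` to build an ordered preference list, then hand that list to the fallback runner.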

Context Window Management
  • Automatic detection of each model's context window size.

  • Adaptive truncation and summarization when switching between models mid-conversation.

  • No more silent context overflow — the system warns and adapts before tokens are lost.
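The adaptation step can be illustrated with a small sketch: estimate the conversation's token count and drop the oldest non-system messages until it fits the target model's window. The 4-characters-per-token heuristic and the function names are assumptions for the example; a real implementation would use the model's actual tokenizer and could summarize dropped turns instead of discarding them.

```python
def estimate_tokens(text):
    # Crude heuristic: roughly 4 characters per token.
    # Real systems use the target model's tokenizer.
    return max(1, len(text) // 4)

def fit_history(messages, context_window, reserve_for_output=1024):
    """Trim a conversation to fit a (possibly smaller) model's context
    window when switching models mid-conversation. Keeps the system
    prompt and drops the oldest turns first."""
    budget = context_window - reserve_for_output
    system, rest = list(messages[:1]), list(messages[1:])
    def total(msgs):
        return sum(estimate_tokens(m["content"]) for m in msgs)
    while rest and total(system + rest) > budget:
        rest.pop(0)  # drop the oldest non-system message
    return system + rest
```

Reserving headroom for the model's output (`reserve_for_output`) is what prevents the silent overflow case: the input is shrunk before, not after, the token limit is hit.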

How to Use It

Browse the Model Library in your dashboard, or search and filter programmatically with the GET /v1/models endpoint. To run any model, pass its ID as the model parameter in a chat completions request.
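The flow above can be sketched as request-building helpers. The base URL, auth scheme, and the `capability` query filter are assumptions for illustration; only the GET /v1/models path and the `model` parameter come from this announcement, so consult the Backboard API reference for the authoritative schema.

```python
# Illustrative request shapes only; BASE_URL and the query filter
# are hypothetical, not confirmed Backboard endpoints or parameters.
BASE_URL = "https://api.backboard.example/v1"

def list_models_request(capability=None):
    """Describe a GET /v1/models call, optionally filtered by capability."""
    params = {"capability": capability} if capability else {}
    return {"method": "GET", "url": f"{BASE_URL}/models", "params": params}

def chat_completion_payload(model_id, user_message):
    """Any model ID from the library goes straight into `model`."""
    return {
        "model": model_id,
        "messages": [{"role": "user", "content": user_message}],
    }

request = list_models_request(capability="vision")
payload = chat_completion_payload("provider-b/vision", "Describe this image.")
```

Because every model is addressed the same way, switching providers is a one-string change to `model` rather than a new SDK integration.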
