
Changelog
Jan 28, 2026
17,000+ LLMs Now Available on Backboard
We have expanded the Backboard model ecosystem to include over 17,000 large language models, giving independent developers access to one of the largest and most flexible AI model collections available through a single API.
A significant portion of these models comes from our partners at Featherless, who specialize in curating and maintaining high-quality open source models for real-world workloads.
What this unlocks for builders
Real choice, not forced defaults
With access to 17,000+ models, you are no longer constrained to a short list of general-purpose LLMs. You can choose models based on cost, speed, size, or specialization, and swap between them without reworking your application.
Purpose-built open source models
Many of the newly available models are optimized for narrow tasks such as coding, reasoning, classification, summarization, and domain-specific inference. For independent developers, this often delivers better results with lower latency and cost.
Tool-capable models at scale
Approximately 60 percent of the models support custom tools, including retrieval-augmented generation, search, and external function calls. This enables builders to create agents and workflows that retrieve data, take actions, and reason across systems.
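To make that concrete, here is a minimal sketch of what a tool-calling request might look like. The endpoint URL, payload fields, model id, and the OpenAI-style tool schema are all assumptions for illustration only, not Backboard's documented API; check the model library and API reference for the real interface.
```python
# Minimal sketch of a tool-calling request. The endpoint, payload shape,
# and tool schema below are assumptions for illustration only.
import requests

BACKBOARD_URL = "https://api.backboard.example/v1/chat/completions"  # placeholder URL
API_KEY = "YOUR_API_KEY"

payload = {
    "model": "featherless/some-tool-capable-model",  # hypothetical model id
    "messages": [{"role": "user", "content": "What's the weather in Austin?"}],
    # OpenAI-style function definition, assumed here for illustration.
    "tools": [{
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Look up current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }],
}

response = requests.post(
    BACKBOARD_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=30,
)
print(response.json())  # inspect whether the model chose to call get_weather
```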
Clear model discoverability
All models that support tool calling are documented in the Backboard model library. You can quickly see which models work with RAG, search, and custom tools before you build.
Experiment freely, optimize continuously
With this breadth of models, experimentation becomes a first-class workflow. You can benchmark models against real use cases, iterate quickly, and evolve your stack as open source models improve.
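One way to make that experimentation concrete is a small harness that runs the same prompts against several candidate models and records latency. The sketch below is illustrative only: the endpoint and model ids are placeholders, and the payload shape is an assumed chat-completions style, not a documented Backboard schema. Because only the model string changes between runs, it also shows how little code is involved in switching models.
```python
# Sketch of a benchmarking loop across candidate models. The endpoint,
# model ids, and payload fields are placeholder assumptions.
import time
import requests

BACKBOARD_URL = "https://api.backboard.example/v1/chat/completions"  # placeholder
API_KEY = "YOUR_API_KEY"

candidates = [
    "featherless/small-coder",        # hypothetical model ids
    "featherless/fast-summarizer",
    "open-source/general-8b",
]
prompts = ["Summarize: ...", "Classify sentiment: ..."]

for model in candidates:
    start = time.perf_counter()
    for prompt in prompts:
        requests.post(
            BACKBOARD_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            json={"model": model, "messages": [{"role": "user", "content": prompt}]},
            timeout=60,
        )
    elapsed = time.perf_counter() - start
    print(f"{model}: {elapsed:.2f}s for {len(prompts)} prompts")
```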
One API, consistent memory and state
Every model runs through the same Backboard API, with consistent handling of memory, state, routing, and tools. You get flexibility without added complexity.
Example: a lightweight research agent
Consider a simple research agent built by an independent developer:
A small, fast open source model from Featherless handles query understanding and routing
A tool-capable model performs retrieval over documentation or notes using RAG
A separate summarization model produces concise, structured outputs
Because all three models are available behind one Backboard API, the developer can mix and match models without managing multiple SDKs or infrastructure. Memory and state persist across steps, and models can be swapped as better open source options appear, without changing the agent’s architecture.
This kind of setup is often cheaper, faster, and easier to tune than relying on a single large model for every task.
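A compressed sketch of that pipeline is below. The shared helper, endpoint, model ids, and the retrieval tool schema are hypothetical stand-ins; the point is that each stage is just a different model string behind the same call.
```python
# Sketch of the three-stage research agent. Endpoint, model ids, and payload
# fields are illustrative assumptions, not Backboard's documented interface.
import requests

BACKBOARD_URL = "https://api.backboard.example/v1/chat/completions"  # placeholder
API_KEY = "YOUR_API_KEY"

def complete(model, messages, **extra):
    """Single call shared by every stage; only the model id changes."""
    resp = requests.post(
        BACKBOARD_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"model": model, "messages": messages, **extra},
        timeout=60,
    )
    return resp.json()

question = "How does our rate limiter handle burst traffic?"

# 1. Small, fast model interprets the query and decides where to route it.
route = complete("featherless/small-router", [            # hypothetical id
    {"role": "user", "content": f"Classify this question and pick a doc set: {question}"},
])

# 2. Tool-capable model retrieves relevant passages (RAG) before answering.
retrieved = complete("featherless/rag-capable", [          # hypothetical id
    {"role": "user", "content": question},
], tools=[{"type": "retrieval"}])                          # assumed tool schema

# 3. A dedicated summarization model produces the final structured answer.
#    (Real code would extract the message text from `retrieved` first.)
answer = complete("featherless/concise-summarizer", [      # hypothetical id
    {"role": "user", "content": "Summarize the findings:\n" + str(retrieved)},
])
print(answer)
```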
Partner spotlight: Featherless
Featherless provides a deep catalog of specialized open source models, many of which are designed to work seamlessly with tools and external systems. Their focus on practical, task-specific models makes it easier for independent developers to build efficient, modular AI applications.
Get started
Explore the model library to find tool-capable models
Test specialized open source models for cost and performance gains
Build and route across multiple models using a single Backboard integration
This update expands what independent developers can build on Backboard while keeping the experience simple and unified.