Changelog

New features, improvements, and fixes shipped to Backboard.

Build AI Faster with Backboard’s New Unified API

API

Backboard API Update: One Call to Build Stateful AI Systems

Simpler Integration, Faster Time to Production

Building AI systems shouldn’t require stitching together multiple layers of infrastructure before you can even send your first request.

That’s the problem we set out to fix with our latest Backboard API update.

A Simpler Way to Build with Backboard

We’ve rethought how developers interact with Backboard from the ground up.

The result is a more intuitive API that reduces setup, removes unnecessary steps, and gets you to a working system faster.

One Call to Get Started

Previously, working with AI systems often meant setting up multiple components before anything useful happened.

With this update, the experience is simplified.

A single request is now enough to:

  • start a conversation

  • handle context automatically

  • apply configuration for that interaction

From there, you can continue seamlessly by referencing the same thread.

No complex setup required.
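The single-call flow can be sketched in Python using only the standard library. The base URL, endpoint path, and payload field names (`content`, `thread_id`) are assumptions for illustration; the real paths and schema are in the Backboard API reference.

```python
import json
import urllib.request

# Hypothetical base URL -- check the Backboard docs for the real one.
BASE = "https://api.backboard.io/v1"

def build_payload(content, thread_id=None, **config):
    """One request body: the message plus any per-call configuration."""
    payload = {"content": content, **config}
    if thread_id is not None:
        payload["thread_id"] = thread_id  # continue the same thread
    return payload

def send_message(api_key, content, **kwargs):
    """POST a single request; thread context is managed server-side."""
    req = urllib.request.Request(
        f"{BASE}/messages",
        data=json.dumps(build_payload(content, **kwargs)).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)
```

Subsequent calls simply pass the returned thread ID back in, which is the only continuity the client has to track.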

A More Streamlined API Surface

We’ve introduced a set of improvements designed to reduce friction and make the API easier to work with:

Unified Interaction Flow

A single endpoint now handles the full lifecycle of a request—from input to response—while managing context, memory, and system behavior behind the scenes.

Simplified Tool Handling

Working with tools is now more straightforward, with fewer moving parts and less state to track manually.

Better Control Over Runs

Developers can now manage in-progress requests more easily, including the ability to stop execution when needed.

Memory Management Improvements

You can reset or adjust memory state without needing to rebuild your system or change configurations globally.

Per-Interaction Configuration

Every request can be configured independently.

This includes:

  • system instructions

  • tool usage

  • model selection

  • memory behavior

Each interaction behaves exactly as specified—without unintended side effects.

For persistent configurations, dedicated endpoints remain available.
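As a sketch, request-scoped overrides can be assembled like this. The field names below (`system_instructions`, `tools`, `model`, `memory`) are illustrative, not the documented schema:

```python
def per_request_overrides(system=None, tools=None, model=None, memory=None):
    """Collect request-scoped settings; keys left unset are omitted so
    the request falls back to the persistent configuration."""
    overrides = {
        "system_instructions": system,
        "tools": tools,
        "model": model,
        "memory": memory,
    }
    # Send only explicit overrides -- everything else stays global.
    return {k: v for k, v in overrides.items() if v is not None}
```

For example, `per_request_overrides(model="claude-sonnet")` would override only the model for that one interaction, leaving instructions, tools, and memory behavior untouched.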

Updated SDKs

Both Python and JavaScript/TypeScript SDKs have been updated to reflect this simplified interaction model.

The goal is to make the most common workflows easier to implement, while still supporting advanced use cases.

Documentation Improvements

We’ve also rewritten the documentation to make it more usable:

  • clearer examples

  • improved explanations

  • better organization

  • updated guides for streaming and tool usage

If you’ve looked at our docs before, it’s worth taking another pass.

Backward Compatibility

All existing endpoints remain supported.

There are no forced migrations or deadlines.

You can adopt the new API at your own pace.

Why This Matters

The biggest bottleneck in building AI systems today isn’t the model—it’s everything around it.

By simplifying how you:

  • manage context

  • handle memory

  • orchestrate workflows

…you can spend less time on infrastructure and more time building your product.

Getting Started

If you’ve been waiting for a simpler way to build with Backboard, this update is designed for you.

Explore how Backboard simplifies AI infrastructure 👉 https://backboard.io/use-case

Stateful Agents, Real Orchestration: 6 Backboard Releases

Core

Six updates shipped for building stateful, orchestration-ready AI agents — plus free state management for life via MLH.

The 6 Updates
  • Adaptive context management — truncate, summarize, and reshape payloads automatically before they hit the model.

  • Memory tiers (Light vs Pro) — Light at ~1/10th the cost for speed and affordability; Pro for mission-critical accuracy.

  • New navigation + organizations + docs overhaul — structured workspaces, cleaner nav, more detailed docs.

  • Custom memory orchestration per assistant — define custom_fact_extraction_prompt and custom_update_memory_prompt in natural language.

  • Manual memory search via API — inspect, query, and audit what your agent stored for debugging, admin dashboards, and compliance.

  • Portable parallel stateful tool calling — parallel tool calls with persistent state across rounds, portable across providers. Loop until status == COMPLETED.
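The `status == COMPLETED` loop above can be sketched as follows. Only the loop condition comes from the release note; the run and tool-call shapes and the `submit` callback are assumptions standing in for the real SDK calls:

```python
def drive_run(run, tools, submit):
    """Execute each round of requested tool calls and feed the outputs
    back until the run reports COMPLETED. Calls within a round are
    independent, so they could be dispatched in parallel."""
    while run["status"] != "COMPLETED":
        outputs = [
            {"id": call["id"], "output": tools[call["name"]](**call["arguments"])}
            for call in run.get("tool_calls", [])
        ]
        run = submit(run["id"], outputs)  # server returns the updated run
    return run
```

Because the state persists server-side between rounds, the same loop works unchanged when you swap the underlying model provider.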

Free State Management for Life
  • Free state management on Backboard for life (limited to state features, no expiration).

  • $5 in dev credits (~one free month).

  • Combined with BYOK: OpenAI, Anthropic, OpenRouter, and Cohere are now stateful for free.

17,000+ LLMs Now Available on Backboard

Feature

What's New

Backboard now gives developers access to 17,000+ LLMs through a single API — the largest model library available on any AI platform, enabling flexible, cost-efficient multi-model routing across all major providers.

Scale of Access
  • 17,000+ models from OpenAI, Anthropic, Google, Meta, Mistral, Cohere, AI21, DeepSeek, and dozens more.

  • New models added automatically as providers release them — no SDK updates needed.

  • Unified model metadata — context window sizes, pricing, capabilities, and latency benchmarks for every model.

Intelligent Routing
  • Cost optimization: Route to the cheapest model that meets your quality threshold.

  • Latency optimization: Prefer faster models for real-time applications.

  • Capability matching: Automatically select models with required features (vision, function calling, JSON mode).

  • Fallback chains: Define ordered model preferences with automatic failover.
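A toy version of the routing decision, assuming each model entry carries the unified metadata described above (the field names are illustrative):

```python
def route(preferences, max_cost=None, needs=()):
    """Return the first model in preference order that fits the cost
    ceiling and has every required capability; later entries act as
    the fallback chain."""
    for m in preferences:
        if max_cost is not None and m["cost_per_1k_tokens"] > max_cost:
            continue
        if not set(needs) <= set(m["capabilities"]):
            continue
        return m["id"]
    raise LookupError("no model in the chain satisfies the constraints")
```

Cost, latency, and capability routing all reduce to the same shape: an ordered preference list filtered by per-request constraints.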

Context Window Management
  • Automatic detection of each model's context window size.

  • Adaptive truncation and summarization when switching between models mid-conversation.

  • No more silent context overflow — the system warns and adapts before tokens are lost.
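One way to picture adapt-before-overflow: keep the newest messages that fit the target window and fold the rest into a summary. This is a sketch of the idea, not Backboard's implementation; `count_tokens` and `summarize` stand in for a real tokenizer and summarizer, and the summary's own token cost is ignored for simplicity:

```python
def fit_context(messages, limit, count_tokens, summarize):
    """Return messages unchanged if they fit the window; otherwise keep
    the newest ones that do and prepend a summary of everything dropped."""
    if sum(count_tokens(m) for m in messages) <= limit:
        return messages
    kept, used = [], 0
    for m in reversed(messages):  # walk newest to oldest
        cost = count_tokens(m)
        if used + cost > limit:
            break
        kept.append(m)
        used += cost
    kept.reverse()
    dropped = messages[: len(messages) - len(kept)]
    return [summarize(dropped)] + kept
```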

How to Use It

Browse the Model Library in your dashboard or use the GET /v1/models endpoint to search and filter. Any model can be used by passing its ID to the model parameter in chat completions.
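The `GET /v1/models` endpoint is named above; everything else in this sketch (base URL, auth header, metadata field names) is an assumption to illustrate the browse-then-filter flow:

```python
import json
import urllib.request

BASE = "https://api.backboard.io/v1"  # assumed base URL

def fetch_models(api_key):
    """Call GET /v1/models and return the parsed model list."""
    req = urllib.request.Request(
        f"{BASE}/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)

def filter_models(models, provider=None, min_context=0):
    """Client-side filter over the unified metadata (provider name,
    context window size) described above."""
    return [
        m for m in models
        if (provider is None or m.get("provider") == provider)
        and m.get("context_window", 0) >= min_context
    ]
```

A chosen entry's `id` is then passed as the `model` parameter in a chat completion request.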