Changelog

Nov 10, 2025

Embedding Models Now Available in Model Library

We’re excited to announce that embedding models are now fully integrated into the Backboard Model Library. This update gives developers direct access to embedding models from multiple providers, right alongside LLMs.

What’s New

You can now:

  • Browse 12 embedding models from OpenAI, Google, and Cohere in the Model Library

  • Filter by provider, dimensions, and model type

  • Use embedding models directly when creating or configuring assistants

New API Endpoints

Developers can programmatically access embedding models using the following endpoints:

GET /api/models/embedding/all           # List all embedding models
GET /api/models/embedding/{model_name}  # Get details for a specific embedding model
GET /api/models/embedding/providers     # List available embedding providers
GET /api/models?model_type=embedding    # Filter models by type
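As a sketch, the snippet below builds these endpoint URLs with the Python standard library. The base URL is an assumption (substitute your deployment's), and any HTTP client can then fetch the URLs with whatever auth header your deployment requires:

```python
# Sketch only: builds URLs for the new embedding endpoints.
# BASE_URL is an assumption; it is not confirmed by this changelog.
from urllib.parse import urlencode

BASE_URL = "https://api.backboard.io"  # assumed; substitute your deployment

def embedding_url(path: str) -> str:
    """URL for a /api/models/embedding/... endpoint ('all', 'providers', or a model name)."""
    return f"{BASE_URL}/api/models/embedding/{path}"

list_all  = embedding_url("all")                      # list all embedding models
one_model = embedding_url("text-embedding-3-large")   # details for one model
providers = embedding_url("providers")                # list available providers
by_type   = f"{BASE_URL}/api/models?{urlencode({'model_type': 'embedding'})}"
```

Fetch these with any HTTP client (curl, requests, fetch); authentication details depend on your deployment and are not covered here.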

Available Models

OpenAI (3 models)

  • text-embedding-3-large (3072 dims)

  • text-embedding-3-small (1536 dims)

  • text-embedding-ada-002 (1536 dims)

Google (3 models)

  • gemini-embedding-001-768

  • gemini-embedding-001-1536

  • gemini-embedding-001-3072

Cohere (6 models)

  • embed-v4.0 (four variants: 256, 512, 1024, and 1536 dims)

  • embed-english-v3.0

  • embed-multilingual-v3.0

How to Use

When creating an assistant, you can now specify embedding parameters directly:

{
  "name": "My Assistant",
  "embedding_provider": "openai",
  "embedding_model_name": "text-embedding-3-large"
}

The selected model must exist in the Model Library.
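For illustration, here is a small helper that assembles this payload and sanity-checks the model name before sending it. The validation set mirrors the model names in this changelog and is not the authoritative library listing:

```python
# Illustrative payload builder. KNOWN_EMBEDDING_MODELS is copied from this
# changelog's "Available Models" section, not fetched from the live library.
KNOWN_EMBEDDING_MODELS = {
    "openai": {"text-embedding-3-large", "text-embedding-3-small",
               "text-embedding-ada-002"},
    "google": {"gemini-embedding-001-768", "gemini-embedding-001-1536",
               "gemini-embedding-001-3072"},
    "cohere": {"embed-v4.0", "embed-english-v3.0", "embed-multilingual-v3.0"},
}

def assistant_payload(name: str, provider: str, model: str) -> dict:
    """Build an assistant-creation payload, rejecting unknown models early."""
    if model not in KNOWN_EMBEDDING_MODELS.get(provider, set()):
        raise ValueError(f"{model!r} is not a known {provider} embedding model")
    return {
        "name": name,
        "embedding_provider": provider,
        "embedding_model_name": model,
    }

payload = assistant_payload("My Assistant", "openai", "text-embedding-3-large")
```

Checking locally before the request simply gives a clearer error than a server-side rejection; the Model Library remains the source of truth.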

Compatibility

All updates are fully backward compatible: existing integrations and assistants continue to work without modification.

Embedding models open up new possibilities for retrieval, classification, search, and RAG workflows inside Backboard.
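As a toy illustration of the retrieval case, documents can be ranked by cosine similarity of their embedding vectors. The 3-dimensional vectors below are made up for readability; real vectors would come from one of the models above (e.g. 1536 or 3072 dims):

```python
# Toy retrieval sketch: rank documents by cosine similarity to a query
# embedding. Vectors here are illustrative, not real model output.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

query = [0.1, 0.9, 0.2]
docs = {
    "doc_a": [0.1, 0.8, 0.3],   # points in nearly the same direction as query
    "doc_b": [0.9, 0.1, 0.0],   # points in a very different direction
}
ranked = sorted(docs, key=lambda d: cosine(query, docs[d]), reverse=True)
# doc_a ranks above doc_b for this query
```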

For support or questions, contact the Backboard team or visit backboard.io/docs.
