Add fallback providers for reliability #34

@sonic182

Description

Currently, llm_composer supports advanced features like fallback models and provider routing—but only when using the OpenRouter backend. OpenAI, Ollama, and Bedrock do not have automatic fallback capabilities.

OpenRouter is failing today, so it would be nice to implement fallback providers at the llm_composer library level.

The desired behavior includes:

  • Detecting provider errors (e.g., network issues, rate limits, downtime).
  • Automatically switching to an alternate, configured provider.
  • Optionally allowing provider order or a fallback list to be configured independently of OpenRouter settings.
  • Retaining clear tracing/logging so users know when a fallback occurred.

This improvement would strengthen robustness, ensuring llm_composer can gracefully continue operations when individual providers fail.
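The fallback loop described above could be sketched roughly as follows. This is only an illustration of the idea, not the library's actual API: the module name `FallbackRunner`, the `{module, opts}` provider tuples, and the injected `run_fun` callback are all hypothetical, and real error classification (rate limits vs. network errors vs. downtime) would need more nuance than a bare `{:error, reason}` match.

```elixir
defmodule FallbackRunner do
  @moduledoc """
  Hypothetical sketch: try each configured provider in order and
  return the first successful result, logging each fallback.
  """

  require Logger

  # `providers` is a list of {module, opts} pairs in priority order.
  # `run_fun` is the actual provider call, injected here so the sketch
  # stays self-contained and independent of llm_composer's real API.
  def run(providers, prompt, run_fun) do
    Enum.reduce_while(providers, {:error, :no_providers}, fn {mod, opts}, _acc ->
      case run_fun.(mod, opts, prompt) do
        {:ok, result} ->
          # First success wins; stop iterating.
          {:halt, {:ok, result}}

        {:error, reason} ->
          # Trace the failure so users can see that a fallback occurred,
          # then continue with the next configured provider.
          Logger.warning(
            "provider #{inspect(mod)} failed: #{inspect(reason)}; trying next provider"
          )

          {:cont, {:error, reason}}
      end
    end)
  end
end
```

Under this shape, OpenAI, Ollama, and Bedrock could all share the same fallback path, with the per-backend settings carried in each provider's `opts`.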
