feat: add LiteLLM as embedding provider #809
RheagalFire wants to merge 1 commit into basicmachines-co:main from
Conversation
Signed-off-by: RheagalFire <arishalam121@gmail.com>
Force-pushed from c029eb3 to 849f9f5
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: c029eb3b86
```python
elif provider_name == "litellm":
    from basic_memory.repository.litellm_provider import LiteLLMEmbeddingProvider

    model_name = app_config.semantic_embedding_model or "openai/text-embedding-3-small"
```
Map the built-in default model for LiteLLM

When users switch only `semantic_embedding_provider` to `litellm`, `BasicMemoryConfig` still supplies the non-empty default model `bge-small-en-v1.5`, so this `or` never falls through to the LiteLLM provider default. The factory then instantiates `LiteLLMEmbeddingProvider(model_name="bge-small-en-v1.5")` instead of a LiteLLM-routable model such as `openai/text-embedding-3-small`, making the new provider fail for the documented minimal configuration. Mirror the OpenAI branch's remapping of the FastEmbed default, or otherwise treat it as unset.
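A minimal sketch of the suggested fix, assuming the FastEmbed default is exactly the string `bge-small-en-v1.5` quoted above (the helper name `resolve_litellm_model` is hypothetical, not from the PR):

```python
from __future__ import annotations

# Assumed constants; only the two model strings are quoted in the review comment.
FASTEMBED_DEFAULT = "bge-small-en-v1.5"
LITELLM_DEFAULT = "openai/text-embedding-3-small"


def resolve_litellm_model(configured_model: str | None) -> str:
    """Pick a LiteLLM-routable model, treating the built-in FastEmbed
    default as "unset" so switching only the provider still works."""
    if not configured_model or configured_model == FASTEMBED_DEFAULT:
        return LITELLM_DEFAULT
    return configured_model
```

With this remapping, a user who sets only `semantic_embedding_provider = "litellm"` gets the LiteLLM default instead of a model name litellm cannot route.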
cc @phernandez
Summary
- New `LiteLLMEmbeddingProvider` implementing the `EmbeddingProvider` protocol, following the exact same pattern as `OpenAIEmbeddingProvider`
- Extended the `create_embedding_provider()` factory with a `provider_name == "litellm"` branch

Changes
- `src/basic_memory/repository/litellm_provider.py` - new `LiteLLMEmbeddingProvider` with:
  - `litellm.aembedding()` for async embedding
  - `drop_params=True` for cross-provider kwargs compatibility
- `src/basic_memory/repository/embedding_provider_factory.py` - added `elif provider_name == "litellm"` branch
- `pyproject.toml` - added `litellm>=1.60.0,<2.0.0` to dependencies
- `tests/repository/test_litellm_provider.py` - 13 unit tests (all passing)

Tests
Unit tests (13/13 passing):
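A rough sketch of the provider described above (the `embed` method name and response handling are assumptions based on litellm's OpenAI-style embedding response; the real implementation lives in `src/basic_memory/repository/litellm_provider.py`):

```python
from __future__ import annotations

from typing import Sequence


class LiteLLMEmbeddingProvider:
    """Sketch: routes embedding calls through litellm.aembedding() so any
    LiteLLM-supported backend (OpenAI, Cohere, Bedrock, ...) can be used."""

    def __init__(self, model_name: str = "openai/text-embedding-3-small"):
        self.model_name = model_name

    async def embed(self, texts: Sequence[str]) -> list[list[float]]:
        # Imported lazily so litellm is only required when embeddings are used.
        import litellm

        response = await litellm.aembedding(
            model=self.model_name,
            input=list(texts),
            # drop_params=True silently drops kwargs the chosen backend does
            # not support, for cross-provider compatibility.
            drop_params=True,
        )
        # Assumes the OpenAI-style response shape litellm returns:
        # response.data is a list of {"embedding": [...], ...} entries.
        return [item["embedding"] for item in response.data]
```

The unit tests in the PR presumably exercise this path with `litellm.aembedding` mocked out, so no network access or API keys are needed.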
Example usage
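The original example did not survive the page scrape; a hedged sketch of direct usage via litellm's documented async embedding API (the `embed_note` helper is illustrative, not part of the PR):

```python
from __future__ import annotations


async def embed_note(text: str, model: str = "openai/text-embedding-3-small") -> list[float]:
    """Embed one string through LiteLLM. Requires the chosen backend's API
    key in the environment (e.g. OPENAI_API_KEY for the default model)."""
    import litellm

    response = await litellm.aembedding(
        model=model,
        input=[text],
        drop_params=True,  # drop kwargs the chosen backend does not support
    )
    return response.data[0]["embedding"]
```

For example, `asyncio.run(embed_note("Basic Memory stores notes as plain Markdown."))` returns the embedding vector; swapping `model` for any other LiteLLM-routable name (e.g. a `cohere/...` model) needs no code changes.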
See https://docs.litellm.ai/docs/embedding/supported_embedding for all supported embedding models.
Impact
- `litellm` added as a dependency in `pyproject.toml`
- `drop_params=True` silently drops provider-unsupported kwargs
- Follows the same pattern as `OpenAIEmbeddingProvider`
- Only active when `provider_name == "litellm"` is set in config
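Assuming a JSON config file (file name and location are not shown in this PR; only the `semantic_embedding_provider` and `semantic_embedding_model` keys appear in the review comment above), the minimal configuration discussed here would presumably look like:

```json
{
  "semantic_embedding_provider": "litellm",
  "semantic_embedding_model": "openai/text-embedding-3-small"
}
```

Per the Codex review, omitting `semantic_embedding_model` currently falls back to the FastEmbed default rather than a LiteLLM-routable model, so setting it explicitly is the safe option until the default remapping lands.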