
litellm

Posted: 2024-10-05 15:22:48
Tags: https, litellm, docs, LLM, model, name


https://github.com/BerriAI/litellm/tree/main

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]

LiteLLM manages:

  • Translates inputs to each provider's completion, embedding, and image_generation endpoints
  • Consistent output: text responses are always available at ['choices'][0]['message']['content']
  • Retry/fallback logic across multiple deployments (e.g. Azure/OpenAI) - Router
  • Budgets and rate limits per project, API key, and model (LiteLLM Proxy Server / LLM Gateway)
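The second bullet is the core contract: whatever provider serves the request, LiteLLM returns a response in the OpenAI chat-completion shape, so the text always lives at the same path. A minimal sketch of relying on that shape (`extract_text` and `sample_response` are illustrative helpers, not part of litellm; in real use the response would come from `litellm.completion(model=..., messages=...)`):

```python
# LiteLLM normalizes every provider's response into the OpenAI
# chat-completion format, so the assistant text is always at
# response["choices"][0]["message"]["content"].

def extract_text(response: dict) -> str:
    """Pull the assistant text out of an OpenAI-format response.
    (Hypothetical helper for illustration.)"""
    return response["choices"][0]["message"]["content"]

# A response in the shape LiteLLM returns regardless of the backing
# provider (OpenAI, Azure, Anthropic, Bedrock, ...).
sample_response = {
    "choices": [
        {"message": {"role": "assistant", "content": "Hello!"}}
    ]
}

print(extract_text(sample_response))  # → Hello!
```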

Jump to LiteLLM Proxy (LLM Gateway) Docs
Jump to Supported LLM Providers
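The Router/fallback and budget bullets above are configured on the proxy through its `model_list`: two deployments sharing one `model_name` alias are load-balanced and failed over between. A hedged sketch of a `config.yaml` following the shape shown in the litellm docs (the exact keys and the `os.environ/...` env-var convention should be checked against the current documentation):

```yaml
# Sketch of a LiteLLM proxy config.yaml with two deployments behind
# one alias, giving Router-style fallback between OpenAI and Azure.
model_list:
  - model_name: gpt-4o            # public alias clients call
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
  - model_name: gpt-4o            # same alias -> fallback deployment
    litellm_params:
      model: azure/gpt-4o
      api_base: os.environ/AZURE_API_BASE
      api_key: os.environ/AZURE_API_KEY
```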

From: https://www.cnblogs.com/lightsong/p/18447871
