# LLM Clients

## LLMClient in Eclipse

The `LLMClient` class in the Eclipse framework serves as a configuration-driven interface for interacting with Large Language Models (LLMs) from various providers, such as OpenAI, AWS Bedrock, and more. It allows users to set parameters specific to the model and provider they intend to use, facilitating seamless integration and interaction with different types of LLMs.
## Supported LLMs

- DeepSeek
- OpenAI
- Google Gemini
## Configuration Parameters

- `llm_type` (`str`): Identifies the service provider or platform from which the language model is sourced. Common values include `openai` for OpenAI's GPT models.
- `model` (`str`, optional): Specifies the language model to use, identified by a name or version, such as `gpt-4o`, `gpt-3.5-turbo`, or another supported model version. Default is `None`.
- `api_key` (`str`, optional): The API key for the model, which can be provided either as a parameter or retrieved from the environment. Default is `None`.
- `base_url` (`str`, optional): The base URL for the model API. Default is `None`.
- `api_version` (`str`, optional): The API version required when using the Azure OpenAI `llm_type`. Default is `None`.
- `async_mode` (`bool`, optional): Indicates whether to use asynchronous mode for OpenAI or Azure OpenAI clients. Default is `None`.
- `embed_model` (`str`, optional): Embedding model name; supported models include OpenAI, Azure OpenAI, Mistral, and Llama 3.1. Default is `None`.
## Example Usage
### LLM Types and Corresponding Values

| LLM | `llm_type` value | Example configuration |
| --- | --- | --- |
| DeepSeek | `deepseek` | `llm_config = { 'llm_type': 'deepseek' }` |
| OpenAI | `openai` | `llm_config = { 'llm_type': 'openai' }` |
| Google Gemini | `gemini` | `llm_config = { 'llm_type': 'gemini' }` |