
LLM Clients

LLMClient in Eclipse

The LLMClient class in the Eclipse framework is a configuration-driven interface for interacting with Large Language Models (LLMs) from different providers, such as OpenAI, DeepSeek, and Google Gemini. It lets users set the parameters specific to the model and provider they intend to use, enabling seamless integration and interaction with different LLMs.

Supported LLMs

  • DeepSeek

  • OpenAI

  • Google Gemini

Configuration Parameters

| Attribute | Parameter | Type | Description |
| --- | --- | --- | --- |
| LLM Type | llm_type | str | Identifies the service provider or platform from which the language model is sourced. Common values include openai for OpenAI's GPT models. |
| Model (optional) | model | str | The language model to use, identified by name or version, such as gpt-4o or gpt-3.5-turbo. Default is None. |
| API Key (optional) | api_key | str | The API key for the model, provided either as a parameter or read from the environment. Default is None. |
| Base URL (optional) | base_url | str | The base URL for the model API. Default is None. |
| API Version (optional) | api_version | str | The API version required for the Azure OpenAI llm_type. Default is None. |
| Async Mode (optional) | async_mode | bool | Whether to use the asynchronous OpenAI or Azure OpenAI client. Default is None. |
| Embedding Model (optional) | embed_model | str | Embedding model name; supported models include OpenAI, Azure OpenAI, Mistral, and Llama 3.1. Default is None. |
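
The api_version parameter applies only to the Azure OpenAI llm_type, and async_mode to the OpenAI and Azure OpenAI clients. As a minimal sketch of how these options combine, the configuration below assumes 'azure-openai' as the Azure llm_type identifier and uses a placeholder resource URL; check both against your Eclipse version.

from eclipse.llm import LLMClient

# Sketch of an Azure OpenAI configuration.
# 'azure-openai' as the llm_type value and the placeholder base_url are
# assumptions for illustration, not confirmed Eclipse values.
llm_config = {
    'llm_type': 'azure-openai',
    'model': 'gpt-4o',
    'base_url': 'https://<your-resource>.openai.azure.com/',
    'api_version': '2024-02-01',   # required for Azure OpenAI
    'async_mode': True             # use the asynchronous client
}
llm_client = LLMClient(llm_config=llm_config)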

Example Usage

DeepSeek

from eclipse.llm import LLMClient

llm_config = {
    'model': "deepseek-chat",
    'llm_type': 'deepseek',
    'base_url': 'https://api.deepseek.com/'
}
llm_client = LLMClient(llm_config=llm_config)
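
If the key is not picked up from the environment automatically, it can be passed explicitly through api_key. A minimal sketch, assuming the key is stored in a DEEPSEEK_API_KEY environment variable (the variable name is an assumption, not one mandated by Eclipse):

import os

from eclipse.llm import LLMClient

llm_config = {
    'model': 'deepseek-chat',
    'llm_type': 'deepseek',
    'base_url': 'https://api.deepseek.com/',
    # DEEPSEEK_API_KEY is an assumed variable name used for illustration.
    'api_key': os.environ['DEEPSEEK_API_KEY']
}
llm_client = LLMClient(llm_config=llm_config)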

OpenAI

from eclipse.llm import LLMClient

llm_config = {
    "model": 'gpt-4o',
    "llm_type": 'openai'
}
llm_client = LLMClient(llm_config=llm_config)
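
For asynchronous applications, the same configuration can request the async OpenAI client via the async_mode flag from the parameter table above; this is a sketch, not an additional required setting:

from eclipse.llm import LLMClient

llm_config = {
    'model': 'gpt-4o',
    'llm_type': 'openai',
    'async_mode': True   # use the asynchronous OpenAI client
}
llm_client = LLMClient(llm_config=llm_config)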

Google Gemini

from eclipse.llm import LLMClient

llm_config = {
    'model': 'gemini-1.5-flash',
    'llm_type': 'gemini',
}
llm_client = LLMClient(llm_config=llm_config)

LLM Types and Corresponding Values

| LLM | llm_type | Example |
| --- | --- | --- |
| DeepSeek | deepseek | llm_config = { 'llm_type': 'deepseek' } |
| OpenAI | openai | llm_config = { 'llm_type': 'openai' } |
| Gemini | gemini | llm_config = { 'llm_type': 'gemini' } |
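
When the provider is chosen at runtime, these llm_type values can be selected through a simple lookup. The helper below is a hypothetical convenience, not part of Eclipse; the default model names are taken from the examples above.

from eclipse.llm import LLMClient

# Hypothetical helper: map a provider name to an llm_config and build a client.
DEFAULT_CONFIGS = {
    'deepseek': {'llm_type': 'deepseek', 'model': 'deepseek-chat',
                 'base_url': 'https://api.deepseek.com/'},
    'openai': {'llm_type': 'openai', 'model': 'gpt-4o'},
    'gemini': {'llm_type': 'gemini', 'model': 'gemini-1.5-flash'},
}

def make_client(provider: str) -> LLMClient:
    """Build an LLMClient for one of the documented providers."""
    return LLMClient(llm_config=dict(DEFAULT_CONFIGS[provider]))

llm_client = make_client('openai')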
