llm-client

SDK for using LLMs

Infrastructure

LLM-Client-SDK is a software development kit (SDK) for communicating with generative AI large language models (LLMs), including OpenAI, AI21, HuggingfaceHub, Aleph Alpha, Anthropic, and locally hosted models. It provides an async-native, production-ready interface that integrates different LLMs behind a common abstraction while retaining full flexibility over API parameters and endpoints. The SDK exposes two simple interfaces, BaseLLMClient and BaseLLMAPIClient; through BaseLLMAPIClient, users can configure an API key, session, base URL, default model, headers, and other connection details.

The package defines optional dependencies, so users can install all current clients or only the specific ones they need. Usage examples cover OpenAIClient, LLMAPIClientFactory, and LocalClient. The SDK is MIT-licensed, and contributions are welcome. Planned improvements include support for more LLMs, support for more LLM functions, an easy way to run multiple LLMs in parallel, and conversion of common model parameters.
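The sketch below illustrates how the factory-based flow described above might look in practice. It is a minimal sketch, not the SDK's definitive API: it assumes the package is installed with an OpenAI extra (e.g. `pip install llm-client[openai]`), that LLMAPIClientFactory exposes a `get_llm_api_client` method keyed by an `LLMAPIClientType` value, and that the returned client provides an async `text_completion` coroutine; the model name and environment variable are hypothetical. Verify all of these against the installed version of the SDK.

```python
# Minimal sketch, assuming `pip install llm-client[openai]` and the
# factory/client method names below; check them against the installed SDK.
import asyncio
import os

from llm_client import LLMAPIClientFactory, LLMAPIClientType


async def main() -> None:
    # The factory is used as an async context manager; individual clients
    # are configured with an API key, default model, base URL, headers, etc.
    async with LLMAPIClientFactory() as factory:
        llm_client = factory.get_llm_api_client(
            LLMAPIClientType.OPEN_AI,
            api_key=os.environ["OPENAI_API_KEY"],          # hypothetical env var
            default_model="gpt-3.5-turbo-instruct",        # hypothetical default model
        )
        # The interface is async-native, so several providers could be
        # awaited concurrently with asyncio.gather if needed.
        completions = await llm_client.text_completion(
            prompt="Say hello in one short sentence.",
            max_tokens=16,
        )
        print(completions)


if __name__ == "__main__":
    asyncio.run(main())
```

Under the same assumptions, a locally hosted model would be wrapped through LocalClient instead of the factory, while the factory route is mainly useful when the provider needs to be chosen or swapped at runtime.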