LLM Proxy Server for SAP AI Core

by I355895
Created 6/3/2025
Project Description

The LLM Proxy Server for SAP AI Core is an open-source project designed to serve as a lightweight, performant, and extensible intermediary between SAP AI Core and a variety of large language model (LLM) backends. Inspired by solutions like LightLLM, this proxy server standardizes interactions with different model providers (OpenAI, Anthropic, HuggingFace, etc.), handles rate limiting and caching, and simplifies integration for enterprise use cases. Its purpose is to provide a scalable, multi-tenant LLM gateway that plugs seamlessly into the SAP AI Core inference pipeline.
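The core idea of standardizing interactions across providers can be sketched as a small routing layer that maps a provider prefix in the model name to a backend endpoint. This is a minimal illustration only; the `ChatRequest` type, the `provider/model` naming convention, and the backend URLs are assumptions for this sketch, not the project's actual schema.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class ChatRequest:
    # Hypothetical normalized request: "provider/model" plus
    # OpenAI-style messages [{"role": ..., "content": ...}].
    model: str
    messages: List[Dict[str, str]]

def route(request: ChatRequest) -> str:
    """Pick a backend base URL from the provider prefix of the model name."""
    provider, _, _model_name = request.model.partition("/")
    backends = {
        "openai": "https://api.openai.com/v1",
        "anthropic": "https://api.anthropic.com/v1",
    }
    if provider not in backends:
        raise ValueError(f"unknown provider: {provider}")
    return backends[provider]

req = ChatRequest(model="openai/gpt-4o",
                  messages=[{"role": "user", "content": "hello"}])
print(route(req))  # https://api.openai.com/v1
```

A real gateway would additionally translate the request body into each provider's wire format and apply per-tenant rate limits before forwarding.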

Contribution Type

- Model Adapter Layer: Implement new adapters to support additional LLM providers or fine-tuned models hosted on SAP AI Core.
- Request Routing & Optimization: Improve routing logic, caching strategies, and load balancing between LLM providers.
- Security & Multi-Tenancy: Enhance authentication, logging, and quota management to support secure, tenant-aware deployments.
- SAP Integration: Develop SDK components or API contracts to enable easy consumption of the proxy within SAP BTP-based applications.
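An adapter contribution would typically implement a shared interface so the proxy can treat all providers uniformly. The interface name, method signature, and registry below are hypothetical, shown only to illustrate the kind of contract such a layer might define.

```python
from abc import ABC, abstractmethod
from typing import Dict, List

class ModelAdapter(ABC):
    """Hypothetical uniform interface a new provider adapter would implement."""

    @abstractmethod
    def complete(self, messages: List[Dict[str, str]], **params) -> str:
        """Return the model's reply for an OpenAI-style message list."""

class EchoAdapter(ModelAdapter):
    """Trivial stand-in adapter used here only to demonstrate the contract."""

    def complete(self, messages: List[Dict[str, str]], **params) -> str:
        # Echo the last user message back in upper case.
        return messages[-1]["content"].upper()

# A registry lets the routing layer look up adapters by provider key.
registry: Dict[str, ModelAdapter] = {"echo": EchoAdapter()}
print(registry["echo"].complete([{"role": "user", "content": "ping"}]))  # PING
```

Registering a real provider would mean wrapping its SDK or HTTP API behind the same `complete` signature, keeping routing, caching, and quota logic provider-agnostic.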
