OpenCSGs/llm-inference

llm-inference is a platform for publishing and managing LLM inference, providing a wide range of out-of-the-box features for model deployment, such as a UI, a RESTful API, auto-scaling, compute resource management, monitoring, and more.
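Since the platform exposes deployed models over a RESTful API, the sketch below shows what a client call to such an endpoint might look like. The base URL, route, model identifier, and payload fields are assumptions for illustration only, not the project's documented API.

```python
# Minimal sketch of querying a deployed model over a RESTful inference API.
# The host, route, model name, and payload schema below are hypothetical,
# not the endpoint schema documented by llm-inference.
import requests

BASE_URL = "http://localhost:8000"      # assumed local deployment address
MODEL_ID = "opencsg/llama-2-7b"         # hypothetical model identifier


def generate(prompt: str, max_tokens: int = 128) -> str:
    """Send a prompt to the assumed generation route and return the text."""
    resp = requests.post(
        f"{BASE_URL}/api/v1/generate",  # hypothetical route
        json={"model": MODEL_ID, "prompt": prompt, "max_tokens": max_tokens},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json().get("text", "")


if __name__ == "__main__":
    print(generate("What is LLM inference serving?"))
```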

Language: Python · Stars: 90 · Forks: 16 · Watchers: 90 · Open issues: 13 · License: Apache License 2.0
Details

Repository information
Owner: OpenCSGs
Last pushed: 2024-05-17
Last updated: 2025-12-14
