EricLBuehler/candle-vllm
Efficient platform for inference and serving local LLMs, including an OpenAI-compatible API server.
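
Because the server exposes an OpenAI-compatible API, any standard OpenAI client can talk to it once it is running. Below is a minimal sketch using the Python `openai` package; the base URL, port, API key, and model name are illustrative assumptions, not values taken from this repository, so check the project's documentation for the actual defaults.

```python
# Minimal sketch: query a locally running OpenAI-compatible server.
# The address, port, and model name are assumptions for illustration only.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:2000/v1",  # assumed local server address
    api_key="EMPTY",                      # local servers typically ignore the key
)

response = client.chat.completions.create(
    model="llama",  # hypothetical identifier; depends on the model you serve
    messages=[{"role": "user", "content": "Hello, how are you?"}],
)
print(response.choices[0].message.content)
```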