vllm-project/vllm

A high-throughput and memory-efficient inference and serving engine for LLMs

Language: Python
Stars: 65,329
Forks: 11,941
Watchers: 65,329
Open issues: 3,126
License: Apache License 2.0
Details
Repository info
Owner: vllm-project
Last pushed: 2025-12-14
Last updated: 2025-12-14
