Ecosyste.ms: Open Collective
An open API service for software projects hosted on Open Collective.
vLLM
vLLM is a high-throughput and memory-efficient inference and serving engine for large language models (LLMs).
Collective: https://opencollective.com/vllm
Host: opensource
Code: https://github.com/vllm-project/vllm
Security Advisories

vllm: GSA_kwCzR0hTQS13YzM2LTk2OTQtZjlyZs4AA_mw
vLLM Denial of Service via the best_of parameter
Severity: Moderate
Ecosystems: pypi
Packages: vllm
Source: github
Published: 3 months ago
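The first advisory names the best_of sampling parameter. For context on what that parameter is, below is a minimal sketch of where best_of is passed in a vLLM request, assuming a vLLM release that still exposes best_of on SamplingParams (later releases deprecate it); it illustrates the request surface, not the vulnerability itself.

```python
# Minimal sketch: where the best_of parameter appears in a vLLM request.
# Assumes a vLLM version that still accepts best_of on SamplingParams;
# this is illustrative only, not a reproduction of the advisory.
from vllm import LLM, SamplingParams

llm = LLM(model="facebook/opt-125m")  # any small model works for illustration

# best_of asks the engine to generate several candidate sequences and return
# the top n of them; an unvalidated, very large value is the kind of
# client-controlled input the advisory's title points at.
params = SamplingParams(n=1, best_of=4, temperature=0.8, max_tokens=32)

outputs = llm.generate(["Hello, my name is"], params)
print(outputs[0].outputs[0].text)
```

On deployments that expose the OpenAI-compatible server, best_of can also arrive as a client-supplied request field, which is the kind of path the advisory's title suggests.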
vllm: GSA_kwCzR0hTQS13MnI3LTk1NzktMjdoZs4AA_m0
vLLM denial of service vulnerability
Severity: High
Ecosystems: pypi
Packages: vllm
Source: github
Published: 3 months ago