Ecosyste.ms: OpenCollective

An open API service for software projects hosted on Open Collective.

vLLM

vLLM is a high-throughput and memory-efficient inference and serving engine for large language models (LLMs).
Collective - Host: opensource - https://opencollective.com/vllm - Code: https://github.com/vllm-project/vllm

Severity: Moderate
vllm: GSA_kwCzR0hTQS13YzM2LTk2OTQtZjlyZs4AA_mw
vLLM Denial of Service via the best_of parameter
Ecosystems: pypi
Packages: vllm
Source: github
Published: 3 months ago
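The advisory above concerns the `best_of` sampling parameter, where unbounded values can exhaust server resources. A common mitigation pattern is to validate and cap such parameters at the request boundary before they reach the engine. The sketch below is illustrative only; the `MAX_BEST_OF` limit and `validate_best_of` helper are hypothetical, not part of vLLM's API:

```python
# Hypothetical request-side validation sketch: cap the best_of
# sampling parameter to bound per-request work. Not vLLM code.
MAX_BEST_OF = 8  # illustrative server-side limit

def validate_best_of(best_of: int) -> int:
    """Reject best_of values that could trigger excessive generation work."""
    if not isinstance(best_of, int) or best_of < 1:
        raise ValueError("best_of must be a positive integer")
    if best_of > MAX_BEST_OF:
        raise ValueError(f"best_of may not exceed {MAX_BEST_OF}")
    return best_of
```

Clamping or rejecting oversized values server-side keeps a single malicious request from monopolizing inference capacity.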

Severity: High
vllm: GSA_kwCzR0hTQS13MnI3LTk1NzktMjdoZs4AA_m0
vLLM denial of service vulnerability
Ecosystems: pypi
Packages: vllm
Source: github
Published: 3 months ago