An open API service for software projects hosted on Open Collective.

vLLM

vLLM is a high-throughput and memory-efficient inference and serving engine for large language models (LLMs).
Collective - Host: opensource - https://opencollective.com/vllm - Code: https://github.com/vllm-project/vllm

Advisories (metadata not returned for these entries):
vllm: GSA_kwCzR0hTQS1jNjVwLXg2NzctZmdqNs4ABIby
vllm: GSA_kwCzR0hTQS12YzZtLWhtNDktZzlxZ84ABHQt
vllm: GSA_kwCzR0hTQS1oajR3LWhtMmctcDZ3Nc4ABHQT
vllm: GSA_kwCzR0hTQS1yaDRqLTVyaHctaHI1NM4ABD24
vllm: GSA_kwCzR0hTQS1ybTc2LTRtcmYtdjlyOM4ABEMS
vllm: GSA_kwCzR0hTQS01dnFyLXdwcmMtY3BwN84ABFsD
vllm: GSA_kwCzR0hTQS1wZ3I3LW1ocDUtZmdqcM4ABFtr
vllm: GSA_kwCzR0hTQS1tZ3JtLWZnanYtbWh2OM4ABFpf
vllm: GSA_kwCzR0hTQS05ZjhmLTJ2bWYtODg1as4ABHQS
vllm: GSA_kwCzR0hTQS14M204LWY3ZzUtcWhtN84ABFpg

High
vllm: GSA_kwCzR0hTQS13MnI3LTk1NzktMjdoZs4AA_m0
vLLM denial of service vulnerability
Ecosystems: pypi
Packages: vllm
Source: github
Published: 10 months ago
Moderate
vllm: GSA_kwCzR0hTQS13YzM2LTk2OTQtZjlyZs4AA_mw
vLLM Denial of Service via the best_of parameter
Ecosystems: pypi
Packages: vllm
Source: github
Published: 10 months ago
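
The `best_of` advisory above describes a denial of service triggered through a client-controlled sampling parameter: in OpenAI-style completion requests, `best_of` asks the server to generate that many candidate completions and return the best one, so an unbounded value multiplies the work per request. The sketch below is purely illustrative (it is not vLLM's actual code, and `MAX_BEST_OF` and `sanitize_best_of` are hypothetical names); it shows the general mitigation pattern of validating and clamping the parameter server-side:

```python
import json

# Hypothetical upper bound -- not a value taken from the advisory or from vLLM.
MAX_BEST_OF = 8

def sanitize_best_of(request: dict) -> dict:
    """Clamp the client-supplied `best_of` sampling parameter to a safe bound.

    Illustrative sketch of the mitigation pattern, not vLLM's implementation.
    """
    best_of = request.get("best_of", 1)
    if not isinstance(best_of, int) or best_of < 1:
        raise ValueError("best_of must be a positive integer")
    # Cap the number of candidate completions generated per request.
    request["best_of"] = min(best_of, MAX_BEST_OF)
    return request

# An OpenAI-style completion payload abusing best_of:
payload = json.loads('{"model": "m", "prompt": "hi", "best_of": 100000}')
print(sanitize_best_of(payload)["best_of"])  # clamped to 8
```

Without such a cap, each request costs the server `best_of` full generations, which is why the advisory classifies the parameter as a denial-of-service vector.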