An open API service for software projects hosted on Open Collective.
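
The statistics shown below can also be retrieved programmatically from the service. The sketch below uses Python's requests library against a hypothetical JSON endpoint; the URL and field names are illustrative assumptions, not the service's documented routes:

    import requests

    # Hypothetical endpoint for a single project's data; the real route may differ.
    url = "https://api.example.org/projects/vllm"

    response = requests.get(url, timeout=10)
    response.raise_for_status()
    project = response.json()

    # Field names are assumptions for illustration only.
    print(project.get("name"))
    print(project.get("donations"))
    print(project.get("new_issues"))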

vLLM

vLLM is a high-throughput and memory-efficient inference and serving engine for large language models (LLMs).
Collective - Host: opensource - https://opencollective.com/vllm - Code: https://github.com/vllm-project/vllm
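
For orientation on the project itself, here is a minimal sketch of vLLM's offline inference API in Python, assuming vLLM is installed and using an illustrative small model name:

    from vllm import LLM, SamplingParams

    # Load a model for offline batch inference; the model name is an illustrative choice.
    llm = LLM(model="facebook/opt-125m")

    # Sampling settings applied to every prompt in the batch.
    sampling_params = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=64)

    # Generate completions for a batch of prompts in a single call.
    outputs = llm.generate(["Hello, my name is", "The capital of France is"], sampling_params)

    for output in outputs:
        print(output.prompt, "->", output.outputs[0].text)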

Financials

Donations: 4 ($608.93)
Expenses: 7 (-$7,569.59)
Donors: 3
Spenders: 3

Project Activity

New Projects: 0
New Releases: 5

New Issues: 240
New Pull Requests: 403

Closed Issues: 84
Merged Pull Requests: 141
Closed Pull Requests: 31

Issue Authors: 207
Pull Request Authors: 185
Active Maintainers: 30

Time to close issues: 50.4 days
Time to merge pull requests: 4.4 days
Time to close pull requests: 11.5 days

Commit Stats

Commits: 0
Commit Authors: 0
Commit Committers: 0
Additions: 0
Deletions: 0
