An open API service for software projects hosted on Open Collective.
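
Project data like the listing below can be fetched programmatically from the API. A minimal sketch in Python, assuming a hypothetical base URL and query parameter (the real endpoints are whatever the service documents):

    import requests

    # Hypothetical base URL; substitute the service's documented API endpoint.
    BASE_URL = "https://api.example.com/v1"

    # Look up a collective-backed project by name (assumed query parameter).
    response = requests.get(f"{BASE_URL}/projects", params={"name": "vllm"}, timeout=30)
    response.raise_for_status()

    for project in response.json():
        print(project.get("name"), project.get("url"))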

vLLM

vLLM is a high-throughput and memory-efficient inference and serving engine for large language models (LLMs).
Collective: https://opencollective.com/vllm (Host: opensource)
Code: https://github.com/vllm-project/vllm
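
As a quick illustration of what "inference and serving engine" means in practice, here is a minimal offline-inference sketch using vLLM's Python API (the model name is only an example; any supported Hugging Face model works):

    from vllm import LLM, SamplingParams

    # Load an example model; vLLM handles batching and memory-efficient KV caching.
    llm = LLM(model="facebook/opt-125m")
    sampling_params = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=64)

    prompts = ["The capital of France is"]
    outputs = llm.generate(prompts, sampling_params)

    for output in outputs:
        print(output.outputs[0].text)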

Financials

Donations: 2 ($131.00)
Expenses: 1 (-$100.00)
Donors: 2
Spenders: 1

Project activity

New Projects: 0
New Releases: 3

New Issues: 93
New Pull Requests: 165

Closed Issues: 56
Merged Pull Requests: 50
Closed Pull Requests: 20

Issue Authors: 81
Pull Request Authors: 105
Active Maintainers: 26

Time to close issues: 73.9 days
Time to merge pull requests: 3.5 days
Time to close pull requests: 11.8 days

Commit Stats

Commits: 0
Commit Authors: 0
Commit Committers: 0
Additions: 0
Deletions: 0
