Ecosyste.ms: OpenCollective

An open API service for software projects hosted on Open Collective.
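The listing below is the kind of data this API exposes: every package, across ecosystems, whose source repository points at the project. A minimal sketch of querying it with only the standard library, assuming the `packages.ecosyste.ms` lookup endpoint and its `repository_url` parameter (both are assumptions here; check the live API documentation for the exact path and fields):

```python
import json
import urllib.parse
import urllib.request

def lookup_url(repo_url: str) -> str:
    # Build a package-lookup query for the ecosyste.ms packages API.
    # NOTE: endpoint path and parameter name are assumptions, not verified
    # against the current API docs.
    base = "https://packages.ecosyste.ms/api/v1/packages/lookup"
    return base + "?" + urllib.parse.urlencode({"repository_url": repo_url})

def fetch_packages(repo_url: str) -> list:
    # Fetch and decode the JSON list of packages for one repository.
    with urllib.request.urlopen(lookup_url(repo_url), timeout=30) as resp:
        return json.load(resp)

# Example (requires network access):
#   for pkg in fetch_packages("https://github.com/vllm-project/vllm"):
#       print(pkg.get("ecosystem"), pkg.get("name"))
```

Keeping the URL construction in its own helper makes the query easy to inspect or test without touching the network.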

vLLM

vLLM is a high-throughput and memory-efficient inference and serving engine for large language models (LLMs).
Collective - Host: opensource - https://opencollective.com/vllm - Code: https://github.com/vllm-project/vllm

pypi: vllm-acc 0.4.1
A high-throughput and memory-efficient inference and serving engine for LLMs
github.com/vllm-project/vllm - 8 versions - Latest release: 6 months ago - 263 downloads last month
pypi: hive-vllm 0.0.1
a
github.com/vllm-project/vllm - 1 version - Latest release: 8 months ago - 23 downloads last month
pypi: moe-kernels
MoE kernels
github.com/vllm-project/vllm - 12 versions - 649 downloads last month
pypi: tilearn-infer 0.3.3
A high-throughput and memory-efficient inference and serving engine for LLMs
github.com/vllm-project/vllm - 3 versions - Latest release: 6 months ago - 121 downloads last month
pypi: marlin-kernels
Marlin quantization kernels
github.com/vllm-project/vllm - 3 versions - 74 downloads last month
pypi: vllm-consul 0.2.1
A high-throughput and memory-efficient inference and serving engine for LLMs
github.com/vllm-project/vllm - 5 versions - Latest release: 12 months ago - 162 downloads last month
pypi: nextai-vllm 0.0.7
A high-throughput and memory-efficient inference and serving engine for LLMs
github.com/vllm-project/vllm - 6 versions - Latest release: 6 months ago - 148 downloads last month
pypi: tilearn-test01 0.1
A high-throughput and memory-efficient inference and serving engine for LLMs
github.com/vllm-project/vllm - 1 version - Latest release: 6 months ago - 50 downloads last month
pypi: byzerllm 0.1.89
ByzerLLM: Byzer LLM
github.com/vllm-project/vllm - 132 versions - Latest release: 5 months ago - 1 dependent package - 2 dependent repositories - 11.2 thousand downloads last month
pypi: llm_atc 0.1.7
Tools for fine tuning and serving LLMs
github.com/vllm-project/vllm - 6 versions - Latest release: 11 months ago - 399 downloads last month
pypi: superlaser 0.0.6
An MLOps library for LLM deployment with the vLLM engine on RunPod's infrastructure.
github.com/vllm-project/vllm - 6 versions - Latest release: 8 months ago - 294 downloads last month
pypi: vllm-xft 0.3.3.1
A high-throughput and memory-efficient inference and serving engine for LLMs
github.com/vllm-project/vllm - 8 versions - Latest release: 5 months ago - 198 downloads last month
pypi: llm_math
A tool designed to evaluate the performance of large language models on mathematical tasks.
github.com/vllm-project/vllm - 5 versions - 843 downloads last month
Top 3.4% on pypi.org
pypi: vllm 0.4.2
A high-throughput and memory-efficient inference and serving engine for LLMs
github.com/vllm-project/vllm - 41 versions - Latest release: 6 months ago - 46 dependent packages - 5 dependent repositories - 595 thousand downloads last month
pypi: llm-swarm 0.1.1
A high-throughput and memory-efficient inference and serving engine for LLMs
github.com/vllm-project/vllm - 2 versions - Latest release: 7 months ago - 187 downloads last month
pypi: llm-engines
A unified inference engine for large language models (LLMs) including open-source models (VLLM, S...
github.com/vllm-project/vllm - 14 versions - 1.32 thousand downloads last month
pypi: vllm-online 0.4.2
A high-throughput and memory-efficient inference and serving engine for LLMs
github.com/vllm-project/vllm - 2 versions - Latest release: 6 months ago - 58 downloads last month
Top 9.6% on proxy.golang.org
go: github.com/vllm-project/vllm v0.4.2
A high-throughput and memory-efficient inference and serving engine for LLMs
github.com/vllm-project/vllm - 33 versions - Latest release: 6 months ago