Ecosyste.ms: Open Collective

An open API service for software projects hosted on Open Collective.

vLLM

vLLM is a high-throughput and memory-efficient inference and serving engine for large language models (LLMs).
Collective - Host: opensource - https://opencollective.com/vllm - Code: https://github.com/vllm-project/vllm

github.com/vllm-project/vllm

A high-throughput and memory-efficient inference and serving engine for LLMs.

Stars: 31,635 - Last synced: 20 Dec 2024