waybarrios/vllm-mlx
OpenAI- and Anthropic-compatible server for Apple Silicon. Run LLMs and vision-language models (Llama, Qwen-VL, LLaVA) with continuous batching, MCP tool calling, and multimodal support. Native MLX backend, 400+ tok/s. Works with Claude Code.
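Because the server speaks the standard OpenAI chat-completions wire format, any stock HTTP client can talk to it. A minimal sketch of the request body follows; the model id and the `localhost:8000` address are assumptions for illustration, not values confirmed by this listing — check the vllm-mlx docs for the endpoint and available models on your machine.

```python
import json

# Request body for an OpenAI-compatible /v1/chat/completions endpoint.
# The model id below is an assumed example, not one shipped by vllm-mlx.
payload = {
    "model": "mlx-community/Llama-3.2-3B-Instruct-4bit",  # assumed model id
    "messages": [
        {"role": "user", "content": "Summarize MLX in one sentence."}
    ],
    "max_tokens": 128,
}

body = json.dumps(payload).encode()

# To send it (server address assumed; uncomment with a server running):
# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:8000/v1/chat/completions",
#     data=body,
#     headers={"Content-Type": "application/json"},
# )
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

The same payload works unchanged with the official `openai` Python client pointed at the local base URL, which is how tools like Claude Code typically integrate.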
Status: Active
On the radar: signal detected
Stars: 665
Forks: 157
Contributors: 13
Language: Python
Downloads (7d): 1.4k (pypi/vllm-mlx)
Score updated Mar 26, 2026