AI Copilot for Vim/NeoVim
Updated Feb 28, 2025 - Python
A high-performance API server exposing OpenAI-compatible endpoints for MLX models. Built in Python on the FastAPI framework, it offers an efficient, scalable, and user-friendly way to run MLX-based vision and language models locally behind an OpenAI-compatible interface.
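Because the server mirrors OpenAI's chat completions API, any OpenAI-style client can target it by pointing at the local base URL. A minimal stdlib-only sketch of building such a request; the host, port, and model name here are placeholders, not values taken from the project:

```python
import json
import urllib.request

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible /v1/chat/completions request (not yet sent)."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 128,
    }
    return urllib.request.Request(
        url=f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Example: target a locally running MLX server (placeholder URL and model name).
req = build_chat_request(
    "http://localhost:8000",
    "mlx-community/Mistral-7B-Instruct-4bit",
    "Hello from Vim!",
)
# urllib.request.urlopen(req) would send it once the server is running.
```

The same request shape works with the official `openai` Python client by setting its `base_url` to the local server, which is what makes drop-in OpenAI compatibility useful for editor plugins.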
Adds MLX support to Pydantic AI through LM Studio or mlx-lm, allowing MLX-compatible Hugging Face models to run on Apple silicon.
Various LLM resources and experiments
LLM model inference on Apple Silicon Mac using the Apple MLX Framework.
MLX inference service compatible with the OpenAI API, built on MLX-LM and MLX-VLM.