Official
Beginner-friendly
LLM lane
MLX LM
The most practical entry point for running and fine-tuning language models in the MLX ecosystem.
Who it's for
People who want local LLMs on a Mac without building a full stack from scratch.
Why it matters
If you want one fast, useful first win with MLX, this is usually the place to start.
What to do next
Pair it with compatible weights from mlx-community or the Hugging Face MLX browser.
Quick note
Best first lane for most newcomers.
llm
official
inference
finetuning