Best projects to try first

The MLX projects and resources most worth your attention when you're starting out

This page is intentionally opinionated. The goal is not completeness. The goal is to save newcomers from spending their first two hours in the wrong layer.

Official · Intermediate · Official foundation

MLX

The core MLX framework: arrays, neural networks, autograd, and machine learning optimized for Apple silicon.

Why it matters: This is the root layer. If you want to understand the ecosystem instead of just using it blindly, start here.
Who it's for: Builders who want the actual foundation, not just wrappers around it.
Try next: Read the docs lightly, then jump to MLX Examples or MLX LM so the abstractions become real.
Quick note: Foundational, not the fastest first win.
Official · Beginner-friendly · LLM lane

MLX LM

The most practical entry point for running and fine-tuning language models in the MLX ecosystem.

Why it matters: If you want one fast, useful first win with MLX, this is usually the move.
Who it's for: People who want local LLMs on a Mac without building a full stack from scratch.
Try next: Pair it with compatible weights from mlx-community or the Hugging Face MLX browser.
Quick note: Best first lane for most newcomers.
Official · Beginner-friendly · Docs + community

MLX Examples

A repo of example projects and demos showing how MLX gets used in practice.

Why it matters: Examples are where the theory stops being vague.
Who it's for: People who learn faster from concrete demos than from abstract docs.
Try next: Pick one example that feels close to your goal and get it running end to end.
Quick note: Great bridge between docs and real use.
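Getting one example running end to end can be as simple as cloning the repo and running one of its self-contained demos. The MNIST path below reflects the repo layout at the time of writing and may change.

```shell
# Hedged sketch: clone the official examples repo and run a small demo.
git clone https://github.com/ml-explore/mlx-examples.git
cd mlx-examples/mnist

# Each example lists its own dependencies; install them first.
pip install -r requirements.txt
python main.py
```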
Official · Beginner-friendly · Docs + community

Official MLX docs

The documentation hub for installation, APIs, and core MLX concepts.

Why it matters: When you want the cleanest source of truth, this is it.
Who it's for: Anyone who wants the official explanation before community shortcuts.
Try next: Read just enough to orient yourself, then switch to examples or MLX LM for hands-on learning.
Quick note: Best used in small doses, not as your only learning path.
Community · Beginner-friendly · Model discovery

mlx-community on Hugging Face

A major source of MLX-compatible model weights so you can try models without conversion drama.

Why it matters: This is where practical experimentation becomes plug-and-play.
Who it's for: People who want to run models fast instead of wrestling with formats first.
Try next: Pick a model that matches your use case, then run it through MLX LM.
Quick note: Huge practical unlock for newcomers.
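If you'd rather browse programmatically than click around the Hub, a sketch using `huggingface_hub` (installed alongside most MLX tooling; network required) lists repos under the mlx-community org:

```python
# Hedged sketch: discover MLX-ready weights under the mlx-community org.
from huggingface_hub import list_models

# Prints a handful of repo ids you can pass straight to mlx_lm.load().
for m in list_models(author="mlx-community", limit=5):
    print(m.id)
```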
Community · Intermediate · Multimodal + media

mlx-vlm

A community project for running vision-language models with MLX.

Why it matters: It proves MLX is not just about text chat models.
Who it's for: People curious about image + text workflows, not just pure LLM use cases.
Try next: Treat it as a community frontier project after you've done one simpler MLX lane first.
Quick note: Exciting, but less foundational than the official repos.
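For a feel of the workflow, here is a rough sketch based on the project's README; the CLI flags, the model repo, and the image path are all examples and may change between releases.

```shell
# Hedged sketch of the mlx-vlm CLI (check the README for current flags).
pip install mlx-vlm

python -m mlx_vlm.generate \
  --model mlx-community/Qwen2-VL-2B-Instruct-4bit \
  --max-tokens 100 \
  --prompt "Describe this image." \
  --image example.jpg
```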
Community · Beginner-friendly · Multimodal + media

MFLUX

MLX-native implementations of modern image-generation models on Apple silicon.

Why it matters: This is one of the clearest ways to show newcomers that MLX can drive practical image workflows too.
Who it's for: People who want a visual, tangible MLX project instead of another abstract repo.
Try next: Try it after you understand the core stack so the image lane feels connected instead of random.
Quick note: Strong demo value and very shareable.
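The demo loop is short; this sketch follows the MFLUX README, though flag names may differ by version and the prompt and seed are just examples.

```shell
# Hedged sketch of the MFLUX CLI (verify flags against the README).
pip install mflux

mflux-generate \
  --model schnell \
  --prompt "a watercolor of Apple Park" \
  --steps 2 \
  --seed 42
```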