Stable Diffusion 3.5 Medium
40 consumer GPUs can run Stable Diffusion 3.5 Medium at Q4 natively. Precise VRAM thresholds and benchmarks below.
llama.cpp 0.2.x · CUDA 12 · ROCm 6 · Updated monthly · methodology →
Execution Context
This model requires an entry-level GPU (8 GB VRAM)
How to run this model
Check if your GPU can run Stable Diffusion 3.5 Medium →
VRAM Calculator — instant compatibility check
RTX 5090
32 GB · Runs Q4 natively · Check availability
*Prices and availability may change. Some links are affiliate links.
System Requirements
VRAM by Quantization
| Quantization | VRAM needed | Disk space | Quality |
|---|---|---|---|
| FP16 (max quality) | 10 GB | 4 GB | Maximum |
| Q8 (high quality) | 7 GB | 2 GB | Near-lossless |
| Q4 (recommended) | 5 GB | 1 GB | Best balance |
| Q2 (minimum) | 4 GB | 0.5 GB | Quality loss |
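The on-disk sizes above follow directly from the parameter count: bytes ≈ parameters × bits per weight ÷ 8. VRAM use is higher because the text encoders, VAE, and activations also need memory on top of the weights. A minimal sketch of the disk-size rule of thumb (the function name is illustrative, not part of any tool on this page):

```python
def weight_file_gb(params_billion: float, bits_per_weight: float) -> float:
    """Approximate weight-file size in GB: parameters x bits per weight, / 8 bits per byte."""
    return params_billion * bits_per_weight / 8

# Stable Diffusion 3.5 Medium has ~2B parameters in its diffusion transformer.
for name, bits in [("FP16", 16), ("Q8", 8), ("Q4", 4), ("Q2", 2)]:
    print(f"{name}: ~{weight_file_gb(2, bits):g} GB on disk")
```

These estimates line up with the Disk space column above; the gap between them and the VRAM column is the runtime overhead.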
Model Details
| | |
|---|---|
| Developer | Stability AI |
| Parameters | 2B |
| License | Stability AI Community |
| Use cases | Image generation |
| Released | 2024-10 |
Hugging Face
stabilityai/stable-diffusion-3.5-medium
Can your GPU run Stable Diffusion 3.5 Medium?
Stable Diffusion 3.5 Medium requires 5 GB VRAM at Q4. 40 consumer GPUs meet this threshold. Below 5 GB (Q4) or 4 GB (Q2) you'll hit significant offload latency.
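The calculator's core check can be sketched as a simple threshold lookup against the VRAM figures from the System Requirements table (pure Python; the function name and return values are illustrative assumptions, not the site's actual implementation):

```python
# VRAM thresholds (GB) taken from the System Requirements table above
THRESHOLDS = [("FP16", 10), ("Q8", 7), ("Q4", 5), ("Q2", 4)]

def best_quantization(vram_gb: float) -> str:
    """Return the highest-quality quantization that fits entirely in VRAM."""
    for name, needed in THRESHOLDS:
        if vram_gb >= needed:
            return name
    return "offload"  # below the Q2 floor: expect significant offload latency

print(best_quantization(24))  # RTX 4090-class card
print(best_quantization(8))   # entry-level 8 GB card
```

Any GPU in the matrix below clears at least the Q4 row of this lookup.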
Hardware Performance Matrix
40 Q4 native · 0 offload · 0 unsupported
| GPU | VRAM | Compatibility | Est. Speed | Action |
|---|---|---|---|---|
| RTX 5090 | 32GB | Optimal | 300 tok/s | Calculate → |
| RTX 4090 | 24GB | Optimal | 300 tok/s | Calculate → |
| M4 Ultra | 128GB | Optimal | 300 tok/s | Calculate → |
| RTX 5080 | 16GB | Optimal | 300 tok/s | Calculate → |
| M3 Ultra | 192GB | Optimal | 284 tok/s | Calculate → |
| RTX 4080 Super | 16GB | Optimal | 271 tok/s | Calculate → |
| RTX 5070 Ti | 16GB | Optimal | 294 tok/s | Calculate → |
| RTX 3090 | 24GB | Optimal | 299 tok/s | Calculate → |
| M4 Max 48GB | 48GB | Optimal | 201 tok/s | Calculate → |
| RX 7900 XTX | 24GB | Optimal | 300 tok/s | Calculate → |
| M4 Max 36GB | 36GB | Optimal | 201 tok/s | Calculate → |
| RTX 4070 Ti Super | 16GB | Optimal | 247 tok/s | Calculate → |
| RTX 3080 Ti | 12GB | Optimal | 291 tok/s | Calculate → |
| RX 7900 XT | 20GB | Optimal | 284 tok/s | Calculate → |
| RTX 5070 | 12GB | Optimal | 247 tok/s | Calculate → |
| RTX 3080 | 10GB | Optimal | 280 tok/s | Calculate → |
| M4 Pro | 24GB | Optimal | 100 tok/s | Calculate → |
| RX 7800 XT | 16GB | Optimal | 230 tok/s | Calculate → |
| RX 6800 XT | 16GB | Optimal | 189 tok/s | Calculate → |
| RTX 4070 | 12GB | Optimal | 186 tok/s | Calculate → |
| RTX 4060 Ti 16GB | 16GB | Optimal | 106 tok/s | Calculate → |
| RX 7700 XT | 12GB | Optimal | 159 tok/s | Calculate → |
| RTX 3070 Ti | 8GB | Optimal | 224 tok/s | Calculate → |
| RTX 4060 Ti | 8GB | Optimal | 106 tok/s | Calculate → |
| RTX 3070 | 8GB | Optimal | 165 tok/s | Calculate → |
| RX 6700 XT | 12GB | Optimal | 142 tok/s | Calculate → |
| M3 Pro | 18GB | Optimal | 56 tok/s | Calculate → |
| RTX 3060 Ti | 8GB | Optimal | 165 tok/s | Calculate → |
| RTX 2080 Ti | 11GB | Optimal | 165 tok/s | Calculate → |
| RTX 3060 | 12GB | Optimal | 133 tok/s | Calculate → |
| M2 Pro | 16GB | Optimal | 74 tok/s | Calculate → |
| RTX 4060 | 8GB | Optimal | 100 tok/s | Calculate → |
| Arc A770 16GB | 16GB | Optimal | 83 tok/s | Calculate → |
| M1 Pro | 16GB | Optimal | 74 tok/s | Calculate → |
| RX 7600 | 8GB | Optimal | 107 tok/s | Calculate → |
| RX 6600 XT | 8GB | Optimal | 100 tok/s | Calculate → |
| Arc A750 8GB | 8GB | Optimal | 75 tok/s | Calculate → |
| RX 6600 | 8GB | Optimal | 91 tok/s | Calculate → |
| RTX 3050 8GB | 8GB | Optimal | 83 tok/s | Calculate → |
| GTX 1660 Super | 6GB | Optimal | 123 tok/s | Calculate → |
Recommended GPUs for Stable Diffusion 3.5 Medium
Best picks by compatibility, VRAM headroom, and value — prices and availability may change.
RTX 5090
32 GB VRAM
Check availability →
RTX 4090
24 GB VRAM
Check availability →
M4 Ultra
128 GB VRAM
Check availability →
Some links are Amazon affiliate links. We may earn a commission at no extra cost to you. Amazon cookies may last up to 24 hours after your click.
Stable Diffusion 3.5 Medium — Compatibility guide
A lightweight model like Stable Diffusion 3.5 Medium runs well on consumer hardware with as little as 5 GB of VRAM at Q4. Ideal for daily use with tools such as ComfyUI or Stable Diffusion WebUI. Use the VRAM calculator to check your setup.
Compare GPUs for Stable Diffusion 3.5 Medium
Which GPU is worth it? Real specs and benchmarks side by side.
Compatible Hardware
GPUs that run Stable Diffusion 3.5 Medium at Q4 — sorted by AI performance score.
More Practical Alternatives
Similar models in the image category with comparable VRAM footprints.
Stable Diffusion 3 Medium
2B params • 3GB VRAM
Stability AI • Stability AI Community
Stable Diffusion XL
6.6B params • 6GB VRAM
Stability AI • CreativeML Open RAIL++-M
Stable Diffusion 3.5 Large
8B params • 10GB VRAM
Stability AI • Stability AI Community
Flux.1 Dev
12B params • 12GB VRAM
Black Forest Labs • FLUX.1-dev Non-Commercial
Not sure which GPU you need for Stable Diffusion 3.5 Medium?
The VRAM Calculator tells you exactly which quantization your hardware can handle.