Stable Diffusion 3.5 Large
29 consumer GPUs can run Stable Diffusion 3.5 Large at Q4 natively. Precise VRAM thresholds and benchmarks below.
llama.cpp 0.2.x · CUDA 12 · ROCm 6 · Updated monthly · methodology →
Execution Context
This model requires a mid-range GPU (16 GB VRAM)
How to run this model
Check if your GPU can run Stable Diffusion 3.5 Large →
VRAM Calculator — instant compatibility check
RTX 5090
32 GB · Runs Q4 natively · Check availability
*Prices and availability may change. Some links are affiliate links.
System Requirements
VRAM by Quantization
| Quantization | VRAM needed | Disk space | Quality |
|---|---|---|---|
| FP16 (max quality) | 24 GB | 16 GB | Maximum |
| Q8 (high quality) | 14 GB | 8 GB | Near-lossless |
| Q4 (recommended) | 10 GB | 4 GB | Best balance |
| Q2 (minimum) | 8 GB | 2 GB | Quality loss |
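The disk figures in this table follow directly from the parameter count: quantized weights take roughly `params × bits ÷ 8` bytes. Runtime VRAM adds several GB on top for text encoders, the VAE, and activations, so the formula gives a lower bound, not the VRAM column. A minimal sketch (the helper name and the formula's use here are ours, not the site's exact methodology):

```python
def disk_gb(params: float, bits_per_weight: int) -> float:
    """Approximate weight storage in GB: params * bits / 8 bytes."""
    return params * bits_per_weight / 8 / 1e9

PARAMS = 8e9  # Stable Diffusion 3.5 Large: 8B parameters

for name, bits in [("FP16", 16), ("Q8", 8), ("Q4", 4), ("Q2", 2)]:
    # Matches the Disk space column: 16, 8, 4, 2 GB
    print(f"{name}: {disk_gb(PARAMS, bits):.0f} GB on disk")
```

Note how halving the bits per weight halves the disk footprint, while the VRAM column shrinks more slowly because the fixed overhead does not quantize away.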
Model Details
| | |
|---|---|
| Developer | Stability AI |
| Parameters | 8B |
| License | Stability AI Community |
| Use cases | Image generation |
| Released | 2024-10 |
Hugging Face
stabilityai/stable-diffusion-3.5-large

Can your GPU run Stable Diffusion 3.5 Large?
Stable Diffusion 3.5 Large requires 10 GB of VRAM at Q4. 29 consumer GPUs meet this threshold. At 8 GB or below you'll hit significant offload latency.
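The three tiers in the matrix below (native, offload, unsupported) fall out of two cut-offs: 10 GB for Q4-native and 8 GB as the offload floor. A hypothetical helper (the thresholds are from this page; the function is illustrative):

```python
def compatibility(vram_gb: float, q4_native: float = 10.0, offload_min: float = 8.0) -> str:
    """Classify a GPU for Q4 inference of Stable Diffusion 3.5 Large."""
    if vram_gb >= q4_native:
        return "native"       # full Q4 model fits in VRAM
    if vram_gb >= offload_min:
        return "offload"      # runs, but layers spill to system RAM
    return "unsupported"

print(compatibility(32))  # RTX 5090 -> native
print(compatibility(8))   # RTX 3070 -> offload
print(compatibility(6))   # GTX 1660 Super -> unsupported
```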
Hardware Performance Matrix
29 Q4 native · 10 offload · 1 unsupported
| GPU | VRAM | Compatibility | Est. Speed | Action |
|---|---|---|---|---|
| RTX 5090 | 32GB | Optimal | 84 tok/s | Calculate → |
| RTX 4090 | 24GB | Optimal | 47 tok/s | Calculate → |
| M4 Ultra | 128GB | Optimal | 51 tok/s | Calculate → |
| RTX 5080 | 16GB | Optimal | 45 tok/s | Calculate → |
| M3 Ultra | 192GB | Optimal | 37 tok/s | Calculate → |
| RTX 4080 Super | 16GB | Optimal | 34 tok/s | Calculate → |
| RTX 5070 Ti | 16GB | Optimal | 42 tok/s | Calculate → |
| RTX 3090 | 24GB | Optimal | 44 tok/s | Calculate → |
| M4 Max 48GB | 48GB | Optimal | 25 tok/s | Calculate → |
| RX 7900 XTX | 24GB | Optimal | 45 tok/s | Calculate → |
| M4 Max 36GB | 36GB | Optimal | 25 tok/s | Calculate → |
| RTX 4070 Ti Super | 16GB | Optimal | 31 tok/s | Calculate → |
| RTX 3080 Ti | 12GB | Optimal | 33 tok/s | Calculate → |
| RX 7900 XT | 20GB | Optimal | 37 tok/s | Calculate → |
| RTX 5070 | 12GB | Optimal | 31 tok/s | Calculate → |
| RTX 3080 | 10GB | Optimal | 35 tok/s | Calculate → |
| M4 Pro | 24GB | Optimal | 13 tok/s | Calculate → |
| RX 7800 XT | 16GB | Optimal | 29 tok/s | Calculate → |
| RX 6800 XT | 16GB | Optimal | 20 tok/s | Calculate → |
| RTX 4070 | 12GB | Optimal | 20 tok/s | Calculate → |
| RTX 4060 Ti 16GB | 16GB | Optimal | 13 tok/s | Calculate → |
| RX 7700 XT | 12GB | Optimal | 18 tok/s | Calculate → |
| RX 6700 XT | 12GB | Optimal | 13 tok/s | Calculate → |
| M3 Pro | 18GB | Optimal | 7 tok/s | Calculate → |
| RTX 2080 Ti | 11GB | Optimal | 16 tok/s | Calculate → |
| RTX 3060 | 12GB | Optimal | 17 tok/s | Calculate → |
| M2 Pro | 16GB | Optimal | 9 tok/s | Calculate → |
| Arc A770 16GB | 16GB | Optimal | 8 tok/s | Calculate → |
| M1 Pro | 16GB | Optimal | 9 tok/s | Calculate → |
| RTX 3070 Ti | 8GB | Offload | 23 tok/s | Calculate → |
| RTX 4060 Ti | 8GB | Offload | 19 tok/s | Calculate → |
| RTX 3070 | 8GB | Offload | 19 tok/s | Calculate → |
| RTX 3060 Ti | 8GB | Offload | 18 tok/s | Calculate → |
| RTX 4060 | 8GB | Offload | 14 tok/s | Calculate → |
| RX 7600 | 8GB | Offload | 12 tok/s | Calculate → |
| RX 6600 XT | 8GB | Offload | 12 tok/s | Calculate → |
| Arc A750 8GB | 8GB | Offload | 9 tok/s | Calculate → |
| RX 6600 | 8GB | Offload | 10 tok/s | Calculate → |
| RTX 3050 8GB | 8GB | Offload | 9 tok/s | Calculate → |
| GTX 1660 Super | 6GB | Unsupported | — | Calculate → |
Recommended GPUs for Stable Diffusion 3.5 Large
Best picks by compatibility, VRAM headroom, and value — prices and availability may change.
RTX 5090
32 GB VRAM
Check availability →
RTX 4090
24 GB VRAM
Check availability →
M4 Ultra
128 GB VRAM
Check availability →
Some links are Amazon affiliate links. We may earn a commission at no extra cost to you. Amazon cookies may last up to 24 hours after your click.
Stable Diffusion 3.5 Large — Compatibility guide
An 8B model like Stable Diffusion 3.5 Large runs well on consumer hardware with 10 GB of VRAM or more. Ideal for daily use with local image-generation tools such as ComfyUI. Use the VRAM calculator to check your setup.
Compare GPUs for Stable Diffusion 3.5 Large
Which GPU is worth it? Real specs and benchmarks side by side.
Compatible Hardware
GPUs that run Stable Diffusion 3.5 Large at Q4 — sorted by AI performance score.
More Practical Alternatives
Similar models in the image category with comparable VRAM footprints.
Stable Diffusion XL
6.6B params • 6GB VRAM
Stability AI • CreativeML Open RAIL++-M
Flux.1 Dev
12B params • 12GB VRAM
Black Forest Labs • FLUX.1-dev Non-Commercial
Flux.1 Schnell
12B params • 12GB VRAM
Black Forest Labs • Apache-2.0
Stable Diffusion 3 Medium
2B params • 3GB VRAM
Stability AI • Stability AI Community
Not sure which GPU you need for Stable Diffusion 3.5 Large?
The VRAM Calculator tells you exactly which quantization your hardware can handle.