CodeLlama 34B
10 consumer GPUs can run CodeLlama 34B at Q4 natively. Precise VRAM thresholds and benchmarks below.
llama.cpp 0.2.x · CUDA 12 · ROCm 6 · Updated monthly · methodology →
Execution Context
This model requires a high-end GPU (24 GB VRAM)
How to run this model
Check if your GPU can run CodeLlama 34B →
VRAM Calculator — instant compatibility check
RTX 5090
32 GB · Runs Q4 natively · Check availability
*Prices and availability may change. Some links are affiliate links.
System Requirements
VRAM by Quantization
| Quantization | VRAM needed | Disk space | Quality |
|---|---|---|---|
| FP16 (max quality) | 68 GB | 68 GB | Maximum |
| Q8 (high quality) | 34 GB | 34 GB | Near-lossless |
| Q4 (recommended) | 19 GB | 19 GB | Best balance |
| Q2 (minimum) | 10 GB | 10 GB | Quality loss |
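As a rough sanity check on these figures, weight size is parameter count times bits per weight. The sketch below uses assumed effective bit widths for each quantization level (actual GGUF quant mixes vary) and ignores the KV cache, which adds a few GB on top and grows with context length.

```python
# Rough VRAM estimate: parameters x effective bits per weight.
# The bit widths below are assumptions chosen to approximate the table;
# the KV cache and runtime buffers add on top of these figures.
PARAMS_B = 34  # CodeLlama 34B

def weight_size_gb(params_billions: float, bits_per_weight: float) -> float:
    """Size of the quantized weights alone, in GB."""
    return params_billions * bits_per_weight / 8

for name, bits in [("FP16", 16), ("Q8", 8), ("Q4", 4.5), ("Q2", 2.4)]:
    print(f"{name}: ~{weight_size_gb(PARAMS_B, bits):.0f} GB")
```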
Model Details
| Property | Value |
|---|---|
| Developer | Meta |
| Parameters | 34B |
| Context window | 16,384 tokens |
| License | Llama 2 Community License |
| Use cases | Coding, chat |
| Released | August 2023 |
Install with Ollama
ollama run codellama:34b
Hugging Face
codellama/CodeLlama-34b-Instruct-hf
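Once the model is pulled with the Ollama command above, a minimal way to query it programmatically is Ollama's local REST API. This sketch assumes the default endpoint (http://localhost:11434) and the `requests` package.

```python
# Minimal sketch: query a locally running Ollama server for a completion.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "codellama:34b",
        "prompt": "Write a Python function that reverses a string.",
        "stream": False,
    },
    timeout=300,
)
resp.raise_for_status()
print(resp.json()["response"])
```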
Can your GPU run CodeLlama 34B?
CodeLlama 34B requires 19 GB of VRAM at Q4. 10 consumer GPUs meet this threshold; cards below it must offload layers to system RAM, which adds significant latency.
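If your card is below the 19 GB threshold, partial offload is still possible. A minimal sketch using llama-cpp-python, assuming a local Q4_K_M GGUF file (the filename and layer split are illustrative):

```python
# Sketch: partial GPU offload with llama-cpp-python when you have less
# than 19 GB of VRAM. CodeLlama 34B has 48 transformer layers, so
# n_gpu_layers < 48 splits the model between VRAM and system RAM
# at a latency cost.
from llama_cpp import Llama

llm = Llama(
    model_path="codellama-34b-instruct.Q4_K_M.gguf",  # local GGUF file
    n_gpu_layers=30,  # raise or lower until it fits your VRAM
    n_ctx=4096,       # context length to allocate
)
out = llm("Write a binary search function in Python.", max_tokens=256)
print(out["choices"][0]["text"])
```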
Hardware Performance Matrix
10 GPUs run Q4 natively · 19 run with offloading · 11 unsupported
Recommended GPUs for CodeLlama 34B
Best picks by compatibility, VRAM headroom, and value — prices and availability may change.
RTX 5090
32 GB VRAM
Check availability →
RTX 4090
24 GB VRAM
Check availability →
M4 Ultra
128 GB unified memory
Check availability →
Some links are Amazon affiliate links. We may earn a commission at no extra cost to you. Amazon cookies may last up to 24 hours after your click.
CodeLlama 34B — Compatibility guide
CodeLlama 34B requires a high-end GPU like the RTX 4090 or a Mac with M2 Ultra or better. The Q4 version needs 19 GB VRAM. Check the VRAM calculator for your options.
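To check your own card against that threshold locally, a quick probe (assuming PyTorch with CUDA support is installed):

```python
# Quick local check: compare total VRAM on GPU 0 against the 19 GB
# Q4 threshold for CodeLlama 34B.
import torch

if torch.cuda.is_available():
    total_gb = torch.cuda.get_device_properties(0).total_memory / 1024**3
    print(f"{torch.cuda.get_device_name(0)}: {total_gb:.1f} GB VRAM")
    print("Runs Q4 natively" if total_gb >= 19 else "Q4 will need offloading")
else:
    print("No CUDA GPU detected")
```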
Compare GPUs for CodeLlama 34B
Which GPU is worth it? Real specs and benchmarks side by side.
Compatible Hardware
GPUs that run CodeLlama 34B at Q4 — sorted by AI performance score.
More Practical Alternatives
Similar models in the coding category with comparable VRAM footprints.
Not sure which GPU you need for CodeLlama 34B?
The VRAM Calculator tells you exactly which quantization your hardware can handle.
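The core decision is simple: pick the highest-quality quantization whose requirement (from the table above) fits in your VRAM. A minimal sketch using the CodeLlama 34B figures, with no headroom reserved:

```python
# Map available VRAM to the best CodeLlama 34B quantization level.
THRESHOLDS_GB = [("FP16", 68), ("Q8", 34), ("Q4", 19), ("Q2", 10)]

def best_quantization(vram_gb: float) -> str:
    for name, needed_gb in THRESHOLDS_GB:
        if vram_gb >= needed_gb:
            return name
    return "none (offloaded or CPU-only inference)"

print(best_quantization(24))  # e.g. RTX 4090 -> "Q4"
```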