
AI Hardware Catalog

Curated components for building and upgrading your AI-capable PC. Every product is selected for local AI performance.

Javier Morales — Local AI Hardware Specialist, 8 years of experience
GitHub: github.com/javier-morales-ia
Model-First Catalog Path
Compatibility-safe exits

Choose a compatible starting path before browsing products

Each route pairs a scenario with a model and GPU that fit at Q4 VRAM, so you can jump to compatibility or continue with guided hardware decisions.
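The "fit at Q4" pairing above can be approximated with a simple rule of thumb. This is a sketch, not the catalog's actual sizing formula: it assumes ~0.5 bytes per parameter at 4-bit quantization plus roughly 20% overhead for the KV cache and runtime buffers, which happens to land on the 19.2 GB figure shown for the 32B scenario.

```python
# Rough Q4 VRAM estimate from parameter count. The 1.2x overhead
# factor is an assumption (KV cache + runtime buffers), not a
# published formula; real requirements vary by quant variant and
# context length.

def q4_vram_gb(params_billion: float, overhead: float = 1.2) -> float:
    """Estimate VRAM (GB) needed to run a model at Q4 quantization."""
    weights_gb = params_billion * 0.5  # 4-bit ~= 0.5 bytes per parameter
    return round(weights_gb * overhead, 1)

def fits(params_billion: float, gpu_vram_gb: float) -> bool:
    """Does the estimated Q4 footprint fit in the GPU's VRAM?"""
    return q4_vram_gb(params_billion) <= gpu_vram_gb

print(q4_vram_gb(32))  # ~19.2 GB, in line with the research-lab scenario
print(fits(32, 24))    # True: a 32B model at Q4 fits a 24 GB card
print(fits(32, 16))    # False: it would not fit a 16 GB card
```

Treat the output as a planning estimate only; long context windows or higher-precision quants can push real usage well past it.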

3 — Visible Scenarios
11.4 GB — Avg Required VRAM
19.2 GB — Highest Requirement

Home AI research lab

Researchers, ML students, advanced enthusiasts

Model: DeepSeek R1 Distill 32B · GPU: RTX 4090 (selected)
Q4 requirement vs GPU VRAM: 19.2 GB (model at Q4) vs 24 GB (GPU VRAM)

Local coding assistant

Developers who want private AI autocomplete

Model: DeepSeek Coder V2 · GPU: RTX 4060 Ti 16GB
Q4 requirement vs GPU VRAM: 9 GB (model at Q4) vs 16 GB (GPU VRAM)

NVIDIA RTX 50 Series

Flagship

Best high-end GPUs for local AI — RTX 50 Series Blackwell cards for LLMs, image generation, and multi-model AI workflows.

Browse GPU-NVIDIA-RTX-50XX 0 picks

NVIDIA RTX 40 Series

RTX 40 Series GPUs for local AI — Ada Lovelace cards with strong performance for LLMs, Stable Diffusion, and AI workloads.

Browse GPU-NVIDIA-RTX-40XX 0 picks

NVIDIA RTX 30 Series

Best budget GPUs for local AI — RTX 30 Series Ampere cards for affordable LLM inference and AI model execution at home.

Browse GPU-NVIDIA-RTX-30XX 0 picks

AMD Radeon RX 7000

RDNA cards for users exploring local AI outside the NVIDIA stack, especially when VRAM per dollar matters.

Browse GPU-AMD-RX-7000 0 picks

AMD Radeon RX 6000

Older AMD options for experimental or budget AI builds where availability matters more than cutting-edge support.

Browse GPU-AMD-RX-6000 0 picks

Intel Arc GPUs

Intel Arc cards for low-cost experimentation, media-heavy workflows, and builders testing non-CUDA paths.

Browse GPU-INTEL-ARC 0 picks

Workstation GPUs

Pro

Professional GPUs for labs and heavy-duty inference where stability, VRAM, and pro drivers matter more than value.

Browse GPU-WORKSTATION-PRO 0 picks

DDR5 64 GB RAM Kits

Best RAM for local AI — DDR5 64 GB memory kits for AI model offloading, VRAM extension, and multitasking on AI workstations.

Browse RAM-AI-DDR5-64GB 0 picks

DDR4 32 GB RAM Kits

Affordable RAM for AI PCs — DDR4 32 GB memory kits for budget local AI builds and system upgrades on existing platforms.

Browse RAM-AI-DDR4-32GB 0 picks

NVMe Model Storage

Best NVMe SSD for AI model storage — fast SSDs for local AI model libraries, embeddings, and inference caches.

Browse NVME-MODEL-STORAGE 0 picks

High-Wattage PSUs

Power supplies sized for modern AI GPUs, transient spikes, and future upgrade headroom.

Browse PSU-HIGH-WATTAGE 0 picks

AIO CPU Cooling

Liquid coolers for workstation CPUs and long-running AI sessions where thermal stability matters.

Browse COOLING-CPU-AIO 0 picks

CPU Workstations

Prebuilt towers centered on strong CPUs for mixed local AI, developer tooling, and CPU-assisted inference.

Browse CPU-AI-WORKSTATION 0 picks

Mini PCs for AI

Compact systems for lighter local AI use cases, edge setups, and space-constrained desks.

Browse MINI-PC-AI 0 picks

Complete AI Workstations

Turnkey

Full prebuilt systems for buyers who want a turnkey local AI machine instead of assembling components.

Browse WORKSTATION-COMPLETE 0 picks

Thunderbolt eGPU Gear

External GPU enclosures and accessories for laptop users exploring local AI without a full desktop rebuild.

Browse THUNDERBOLT-EGPU 0 picks

Quick Buying Guide

8+ GB — Minimum VRAM: for 7B models and basic image generation

32 GB — Recommended RAM: handles most AI workloads

1 TB — SSD Storage: room for many models
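The guide's three thresholds can be turned into a quick self-check for a planned build. A minimal sketch, assuming the 8 GB / 32 GB / 1 TB minimums above; the `Build` type and its field names are illustrative, not part of any catalog API:

```python
# Check a planned parts list against the quick-guide minimums
# (8 GB VRAM, 32 GB RAM, 1 TB SSD). Field names are hypothetical.

from dataclasses import dataclass

@dataclass
class Build:
    vram_gb: int
    ram_gb: int
    ssd_tb: float

# Minimums taken directly from the Quick Buying Guide above.
GUIDE_MINIMUMS = {"vram_gb": 8, "ram_gb": 32, "ssd_tb": 1.0}

def shortfalls(build: Build) -> list[str]:
    """Return the guide criteria this build fails to meet."""
    return [spec for spec, minimum in GUIDE_MINIMUMS.items()
            if getattr(build, spec) < minimum]

print(shortfalls(Build(vram_gb=12, ram_gb=32, ssd_tb=2.0)))  # []
print(shortfalls(Build(vram_gb=6, ram_gb=16, ssd_tb=1.0)))   # ['vram_gb', 'ram_gb']
```

An empty list means the build clears every minimum in the guide; anything returned names the component to upgrade first.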

Some links on this page are affiliate links. We may earn a small commission at no extra cost to you. This helps support the project.