// AI Hardware
A catalog of GPUs and accelerators for machine learning: specifications, prices, and comparison.
| Name | Type | VRAM | FP16 (TFLOPS) | FP32 (TFLOPS) | TDP | Price (RUB) | Year |
|---|---|---|---|---|---|---|---|
| Apple M4 Ultra | SoC | 192 GB (unified) | 54 | 27 | 120W | — | 2025 |
| NVIDIA RTX 5090 | GPU | 32 GB GDDR7 | 660 | 105 | 575W | — | 2025 |
| NVIDIA B200 | GPU | 192 GB HBM3e | 4500 | 180 | 1000W | — | 2025 |
| NVIDIA RTX 4080 SUPER | GPU | 16 GB GDDR6X | 261 | 52 | 320W | — | 2024 |
| NVIDIA H200 SXM | GPU | 141 GB HBM3e | 1979 | 67 | 700W | — | 2024 |
| Intel Gaudi 3 | AI Accelerator | 128 GB HBM2e | 1835 | — | 600W | — | 2024 |
| NVIDIA H100 SXM | GPU | 80 GB HBM3 | 1979 | 67 | 700W | — | 2023 |
| Google TPU v5e | TPU | 16 GB HBM2e | 197 | — | 200W | — | 2023 |
| Google TPU v5p | TPU | 95 GB HBM2e | 459 | — | 400W | — | 2023 |
| AMD Instinct MI300X | GPU | 192 GB HBM3 | 2600 | 163 | 750W | — | 2023 |
| NVIDIA L40S | GPU | 48 GB GDDR6 | 733 | 91.6 | 350W | — | 2023 |
| NVIDIA RTX 4090 | GPU | 24 GB GDDR6X | 330 | 82.6 | 450W | — | 2022 |
| NVIDIA A100 80GB | GPU | 80 GB HBM2e | 624 | 19.5 | 400W | — | 2020 |
| NVIDIA A100 40GB | GPU | 40 GB HBM2e | 624 | 19.5 | 400W | — | 2020 |
| NVIDIA RTX 3090 | GPU | 24 GB GDDR6X | 142 | 35.6 | 350W | — | 2020 |
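One way to use the catalog for comparison is to rank devices by FP16 throughput per watt. A minimal sketch, using values copied from a few rows of the table above; the dictionary field names are hypothetical, not from any published schema:

```python
# Illustrative: rank a few accelerators from the catalog by FP16 TFLOPS per watt.
# Numbers are taken directly from the table rows; field names are made up here.
cards = [
    {"name": "NVIDIA B200",         "fp16_tflops": 4500, "tdp_w": 1000},
    {"name": "AMD Instinct MI300X", "fp16_tflops": 2600, "tdp_w": 750},
    {"name": "NVIDIA H100 SXM",     "fp16_tflops": 1979, "tdp_w": 700},
    {"name": "NVIDIA RTX 4090",     "fp16_tflops": 330,  "tdp_w": 450},
]

# Sort by efficiency (TFLOPS/W), highest first, and print a small ranking.
for card in sorted(cards, key=lambda c: c["fp16_tflops"] / c["tdp_w"], reverse=True):
    eff = card["fp16_tflops"] / card["tdp_w"]
    print(f"{card['name']:<22} {eff:.2f} TFLOPS/W")
```

Note that TDP is a power ceiling, not measured draw, so this is a rough proxy for real-world efficiency.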