Hopper H200 SXM vs Hopper H100 SXM5

Complete side-by-side comparison of specs, performance, memory, power efficiency, and pricing.

Detailed Specifications

| Spec | Hopper H200 SXM | Hopper H100 SXM5 |
|---|---|---|
| Architecture | Hopper | Hopper |
| Memory | 141 GB HBM3e | 80 GB HBM3 |
| Memory Bandwidth | 4,800 GB/s | 3,350 GB/s |
| FP16 TFLOPS | 1,979 | 1,979 |
| FP8 TFLOPS | 3,958 | 3,958 |
| BF16 TFLOPS | 1,979 | 1,979 |
| INT8 TOPS | 3,958 | 3,958 |
| TDP | 700 W | 700 W |
| Interconnect | NVLink 4.0 (900 GB/s) | NVLink 4.0 (900 GB/s) |
| Perf Score | 74 | 73 |
| Ecosystem | CUDA | CUDA |
| Est. Price | $30,000 | $25,000 |
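Since compute throughput and TDP are identical on the two parts, the practical difference comes down to memory capacity and bandwidth. A minimal back-of-envelope sketch of what that means for inference, assuming bandwidth-bound decoding (every generated token streams all model weights from HBM once) and ignoring KV cache, batching, and kernel efficiency; the 70B FP8 workload is a hypothetical example, not a benchmark:

```python
# Specs from the table above: memory (GB) and bandwidth (GB/s).
GPUS = {
    "H200 SXM": {"mem_gb": 141, "bw_gbs": 4800},
    "H100 SXM5": {"mem_gb": 80, "bw_gbs": 3350},
}

def fits(params_b: float, bytes_per_param: float, mem_gb: float) -> bool:
    """Do the weights alone fit in HBM (no headroom for KV cache)?"""
    return params_b * bytes_per_param <= mem_gb

def max_tokens_per_s(params_b: float, bytes_per_param: float, bw_gbs: float) -> float:
    """Roofline upper bound for decode: bandwidth / bytes read per token."""
    return bw_gbs / (params_b * bytes_per_param)

for name, g in GPUS.items():
    # Hypothetical workload: a 70B-parameter model in FP8 (1 byte/param).
    print(name,
          "weights fit:", fits(70, 1.0, g["mem_gb"]),
          f'decode <= {max_tokens_per_s(70, 1.0, g["bw_gbs"]):.0f} tok/s/GPU')
```

By this crude estimate the H200's extra bandwidth buys roughly a 43% higher decode ceiling, and its 141 GB lets the same 70B model also run in FP16 (140 GB of weights), which the 80 GB H100 cannot hold on a single card.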

Hopper H200 SXM — Best For

LLM Inference, Large Models

Hopper H100 SXM5 — Best For

LLM Training, HPC