
NVIDIA A10

125 TFLOPS FP16 · 24 GB GDDR6 · 150 W

Score: 10
Inference Serving · Fine-Tuning · Virtual Desktops

Specifications

Architecture: Ampere (GA102)
Memory: 24 GB GDDR6
Memory Bandwidth: 600 GB/s
FP16 TFLOPS: 125
FP8 TFLOPS: Not supported (FP8 requires Hopper/Ada or newer)
BF16 TFLOPS: 125
INT8 TOPS: 250
TDP: 150 W
Interconnect: PCIe Gen 4 ×16 (~32 GB/s per direction)
Ecosystem: CUDA
Generation: Previous
Est. Price: $4,000

Recommended Configuration

4–8× A10 via PCIe in GPU servers

Training Intelligence

CUDA
PyTorch
TensorFlow
JAX
DeepSpeed
Training Time Estimates

LLaMA 7B (7B params): ~4 days on 8 GPUs
GPT-3 175B (175B params): ~50 days on 1024 GPUs
SDXL (3.5B params): ~3 days on 8 GPUs
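Estimates like those above can be sanity-checked with the common "~6·N·D" rule of thumb: training a model of N parameters on D tokens costs roughly 6·N·D FLOPs, divided by the cluster's sustained throughput. A minimal sketch, where the token count and the model-FLOPs-utilization (MFU) figure are assumptions for illustration, not values from this page (the page's own figures likely assume different workloads):

```python
def training_days(params, tokens, num_gpus, tflops_per_gpu=125.0, mfu=0.4):
    """Rough training-time estimate via the ~6*N*D FLOPs rule of thumb.

    params          -- model parameter count
    tokens          -- training tokens (assumed, not from the spec page)
    tflops_per_gpu  -- peak FP16 TFLOPS (A10 lists 125)
    mfu             -- assumed model FLOPs utilization
    """
    total_flops = 6.0 * params * tokens
    sustained_flops_per_s = num_gpus * tflops_per_gpu * 1e12 * mfu
    return total_flops / sustained_flops_per_s / 86_400  # seconds -> days

# Example: a 7B model over an assumed 20B tokens on 8x A10
print(round(training_days(7e9, 20e9, 8), 1))
```

The useful property of the formula is that it scales linearly in both parameters and tokens, so halving either halves the estimate.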

Cloud cost: $1.10/hr
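Combining the hourly rate with the training-time table gives a rough rental budget. A small sketch, assuming the $1.10/hr figure is billed per GPU-hour (the page does not say):

```python
def cloud_cost(num_gpus, days, rate_per_gpu_hour=1.10):
    """Total rental cost in dollars, assuming per-GPU-hour billing."""
    return num_gpus * days * 24 * rate_per_gpu_hour

# LLaMA 7B row above: 8 GPUs for ~4 days
print(round(cloud_cost(8, 4), 2))
```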
