Open Source LLM GPU Resource Calculator

Calculate the GPU resources needed for inference and training of open-source large language models (LLMs), and use the results to choose an appropriate hardware configuration.
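As a rough point of reference (the standard back-of-the-envelope estimate, not necessarily the exact formula the calculator uses): model weights alone take roughly parameters × bytes per parameter, so a 7B model stored in FP16 (2 bytes per parameter) needs about 14 GB before the KV cache, activations, and framework overhead are counted.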

LLM GPU Calculator
Calculate GPU memory requirements for LLM inference and training

The calculator takes the following inputs (a memory-estimation sketch based on them follows the list):

- Model preset: select a popular model or customize the parameters below.
- Model size: number of parameters in billions (e.g., 7 for a 7B model).
- Precision: numeric format used for the model weights.
- Batch size: number of inputs processed simultaneously.
- Sequence length: maximum context length (e.g., 2048, 4096, 8192).
- Number of layers: number of transformer layers in the model.
- Hidden dimension: dimension of the model embeddings.
- Attention heads: number of attention heads.
- Trainable parameters: percentage of parameters to train (for LoRA/QLoRA).
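Below is a minimal sketch of how such an estimate can be computed from these inputs. It is a common rule-of-thumb formula, not the calculator's exact implementation: the function name, the optimizer-state constant, and the decision to exclude activation memory are assumptions. The number of attention heads is not used here, since for standard multi-head attention the per-head dimension times the head count equals the hidden dimension.

```python
def estimate_llm_gpu_memory(
    params_b: float,        # model size in billions of parameters
    bytes_per_param: float, # 4 for FP32, 2 for FP16/BF16, 1 for INT8, 0.5 for INT4
    batch_size: int,
    seq_len: int,
    num_layers: int,
    hidden_dim: int,
    trainable_pct: float = 100.0,  # percentage of weights trained (LoRA/QLoRA < 100)
) -> dict:
    """Rule-of-thumb GPU memory estimate (in GB) for inference and training."""
    GB = 1024 ** 3
    num_params = params_b * 1e9

    # Model weights: one copy of every parameter at the chosen precision.
    weights = num_params * bytes_per_param / GB

    # KV cache: 2 tensors (K and V) per layer, each batch x seq_len x hidden_dim,
    # assumed to be stored at the same precision as the weights.
    kv_cache = 2 * num_layers * batch_size * seq_len * hidden_dim * bytes_per_param / GB

    inference = weights + kv_cache

    # Training adds gradients (same size as the trainable weights) and Adam-style
    # optimizer states (assumed ~8 bytes per trainable parameter in FP32).
    trainable = num_params * trainable_pct / 100.0
    gradients = trainable * bytes_per_param / GB
    optimizer = trainable * 8 / GB
    training = weights + gradients + optimizer + kv_cache  # activations excluded

    return {"inference_gb": round(inference, 1), "training_gb": round(training, 1)}


# Example: a 7B model in FP16, batch size 1, 4096-token context,
# 32 layers, hidden dimension 4096 (Llama-2-7B-like shape).
print(estimate_llm_gpu_memory(
    params_b=7, bytes_per_param=2, batch_size=1,
    seq_len=4096, num_layers=32, hidden_dim=4096,
))
```

With LoRA/QLoRA the trainable percentage is small (often well under 1%), so the gradient and optimizer terms shrink accordingly, which is why parameter-efficient fine-tuning fits on much smaller GPUs than full fine-tuning.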