LLaVA Team

LLaVA 1.6 34B

A vision-language model that analyzes images and answers questions about them. The highest-quality LLaVA 1.6 variant.

34B parameters | llava | apache-2.0 | 4K context | 22-38 GB VRAM

Check Your Hardware

See which quantizations of LLaVA 1.6 34B your hardware can run.
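The check comes down to comparing your available VRAM against the requirements in the table below. A minimal sketch in Python (the quantization names and VRAM figures are taken from this page; the function name is illustrative):

```python
# VRAM needed (GB) per quantization of LLaVA 1.6 34B, from the table on this page.
QUANT_VRAM_GB = {"Q4_K_M": 22, "Q8_0": 38}

def runnable_quants(vram_gb: float) -> list[str]:
    """Return the quantizations that fit in the given VRAM, highest quality first."""
    by_quality = sorted(QUANT_VRAM_GB.items(), key=lambda kv: -kv[1])
    return [name for name, needed in by_quality if needed <= vram_gb]

print(runnable_quants(24))  # ['Q4_K_M'] - a 24 GB card fits Q4_K_M but not Q8_0
print(runnable_quants(48))  # ['Q8_0', 'Q4_K_M'] - a 48 GB card fits both
```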

Quantization Options

Quantization | Bits | File Size | VRAM Needed | RAM Needed | Quality
Q4_K_M       | 4.5  | 20 GB     | 22 GB       | 26 GB      | 85%
Q8_0         | 8    | 36 GB     | 38 GB       | 42 GB      | 98%
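The file sizes above follow roughly from parameter count times bits per weight. A back-of-the-envelope sketch (the real GGUF files run a little larger, plausibly due to metadata and tensors kept at higher precision, though that is an assumption):

```python
def estimated_size_gb(params: float, bits_per_weight: float) -> float:
    """Rough file size: parameters x bits per weight, converted to gigabytes."""
    return params * bits_per_weight / 8 / 1e9

# 34B parameters at Q4_K_M's ~4.5 bits per weight:
print(round(estimated_size_gb(34e9, 4.5), 1))  # 19.1, close to the 20 GB listed
# 34B parameters at Q8_0's 8 bits per weight:
print(round(estimated_size_gb(34e9, 8.0), 1))  # 34.0, vs. the 36 GB listed
```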

Frequently Asked Questions

How much VRAM do I need to run LLaVA 1.6 34B?

LLaVA 1.6 34B requires 22GB VRAM minimum with Q4_K_M quantization. The near-lossless Q8_0 quantization needs 38GB VRAM.

What is the best quantization for LLaVA 1.6 34B?

Q4_K_M offers the best balance of quality and VRAM usage. Q8_0 is near-lossless if you have enough VRAM.