LLM API Providers Leaderboard - Comparison of over 100 LLM endpoints
Comparison and ranking of API provider performance for over 100 AI LLM model endpoints across key performance metrics, including price, output speed, latency, and context window. For more details, including our methodology, see our FAQs.
API providers compared: OpenAI, Playground AI, Mistral, Ideogram, Microsoft Azure, Amazon Bedrock, Hyperbolic, Groq, Together.ai, Anthropic, Black Forest Labs, Perplexity, Google, Fireworks, Cerebras, Recraft AI, Cohere, Upstage, Simplismart, Speechmatics, Deepinfra, Replicate, Genmo, Nebius, Adobe, Runpod, Rev AI, AssemblyAI, fal.ai, DeepSeek, Reka AI, Deepgram, Gladia, Baseten, Stability.ai, Midjourney, Databricks, ElevenLabs, IBM, SambaNova, xAI, Cartesia, LMNT, 01.AI, and AI21 Labs.
Model | Context window | Quality Index | Blended price (USD per 1M tokens) | Output speed (tokens/s) | Latency (s) | ||
---|---|---|---|---|---|---|---|
o1-preview | 128k | 85 | $26.25 | 36.9 | 27.96 | ||
o1-mini | 128k | 82 | $5.25 | 77.1 | 13.11 | ||
GPT-4o (Aug '24) | 128k | 77 | $4.38 | 82.0 | 0.40 | ||
GPT-4o (Aug '24) | 128k | 77 | $4.38 | 100.8 | 0.79 | ||
GPT-4o (May '24) | 128k | 77 | $7.50 | 81.0 | 0.39 | ||
GPT-4o (May '24) | 128k | 77 | $7.50 | 114.9 | 0.76 | ||
GPT-4o mini | 128k | 71 | $0.26 | 85.9 | 0.39 | ||
GPT-4o mini | 128k | 71 | $0.26 | 163.2 | 0.67 | ||
Llama 3.1 405B | 128k | 72 | $9.50 | 18.6 | 0.89 | ||
Llama 3.1 405B | 128k | 74 | $4.00 | 16.6 | 0.89 | ||
Llama 3.1 405B | 128k | 72 | $7.99 | 13.0 | 1.77 | ||
Llama 3.1 405B Base | 128k | 72 | $1.50 | 33.7 | 0.83 | ||
Llama 3.1 405B Vertex | 128k | 72 | $7.75 | 29.5 | 0.42 | ||
Llama 3.1 405B | 128k | 72 | $8.00 | 17.0 | 0.54 | ||
Llama 3.1 405B | 128k | 72 | $3.00 | 78.1 | 0.55 | ||
Llama 3.1 405B | 33k | 72 | $1.79 | 18.6 | 0.53 | ||
Llama 3.1 405B | 8k | 74 | $6.25 | 166.2 | 1.51 | ||
Llama 3.1 405B Turbo | 128k | 72 | $3.50 | 79.7 | 0.70 | ||
Llama 3.2 90B (Vision) | 128k | 67 | $2.00 | 40.4 | 0.49 | ||
Llama 3.2 90B (Vision) Vertex | 128k | 67 | $0.00 | 35.6 | 0.21 | ||
Llama 3.2 90B (Vision) | 128k | 66 | $0.90 | 48.6 | 0.43 | ||
Llama 3.2 90B (Vision) | 8k | 67 | $0.36 | 32.3 | 0.32 | ||
Llama 3.2 90B (Vision) | 8k | 67 | $0.90 | 250.3 | 0.35 | ||
Llama 3.2 90B (Vision) Turbo | 128k | 66 | $1.20 | 54.8 | 0.34 | ||
Llama 3.1 70B | 8k | 66 | $0.60 | 2,115.0 | 0.36 | ||
Llama 3.1 70B | 128k | 66 | $0.40 | 30.9 | 0.68 | ||
Llama 3.1 70B | 128k | 65 | $0.99 | 31.3 | 0.68 | ||
Llama 3.1 70B Base | 128k | 65 | $0.20 | 43.8 | 0.70 | ||
Llama 3.1 70B Fast | 128k | 65 | $0.38 | 71.3 | 0.66 | ||
Llama 3.1 70B Vertex | 128k | 65 | $0.00 | 72.0 | 0.28 | ||
Llama 3.1 70B | 128k | 65 | $2.90 | 28.2 | 0.56 | ||
Llama 3.1 70B | 128k | 65 | $0.90 | 138.4 | 0.32 | ||
Llama 3.1 70B (Turbo, FP8) | 128k | 65 | $0.32 | 35.6 | 0.28 | ||
Llama 3.1 70B | 128k | 65 | $0.36 | 32.7 | 0.31 | ||
Llama 3.1 70B | 128k | 66 | $0.64 | 250.2 | 0.35 | ||
Llama 3.1 70B (Spec decoding) | 8k | 66 | $0.69 | 1,668.8 | 0.36 | ||
Llama 3.1 70B | 64k | 65 | $0.75 | 455.4 | 0.56 | ||
Llama 3.1 70B | 128k | 65 | $1.00 | 49.6 | 0.35 | ||
Llama 3.1 70B Turbo | 128k | 65 | $0.88 | 116.7 | 0.47 | ||
Llama 3.1 70B | 128k | 64 | $0.90 | 114.0 | 0.46 | ||
Llama 3.2 11B (Vision) | 128k | 53 | $0.35 | 135.0 | 0.35 | ||
Llama 3.2 11B (Vision) | 128k | 54 | $0.20 | 118.0 | 0.29 | ||
Llama 3.2 11B (Vision) | 128k | 53 | $0.06 | 75.7 | 0.21 | ||
Llama 3.2 11B (Vision) | 8k | 54 | $0.18 | 751.0 | 0.26 | ||
Llama 3.2 11B (Vision) Turbo | 128k | 54 | $0.18 | 147.3 | 0.31 | ||
Llama 3.1 8B | 8k | 53 | $0.10 | 2,206.0 | 0.37 | ||
Llama 3.1 8B | 128k | 54 | $0.10 | 112.8 | 0.49 | ||
Llama 3.1 8B | 128k | 53 | $0.22 | 90.3 | 0.37 | ||
Llama 3.1 8B Fast | 128k | 53 | $0.04 | 183.2 | 0.57 | ||
Llama 3.1 8B Base | 128k | 53 | $0.03 | 65.9 | 0.60 | ||
Llama 3.1 8B Vertex | 128k | 53 | $0.00 | 120.0 | 0.19 | ||
Llama 3.1 8B | 128k | 53 | $0.38 | 58.0 | 0.40 | ||
Llama 3.1 8B | 128k | 53 | $0.20 | 268.9 | 0.26 | ||
Llama 3.1 8B | 128k | 53 | $0.06 | 78.0 | 0.21 | ||
Llama 3.1 8B | 128k | 53 | $0.06 | 750.4 | 0.26 | ||
Llama 3.1 8B | 16k | 52 | $0.13 | 1,116.9 | 0.42 | ||
Llama 3.1 8B | 128k | 53 | $0.20 | 157.2 | 0.36 | ||
Llama 3.1 8B Turbo | 128k | 52 | $0.18 | 154.8 | 0.36 | ||
Llama 3.1 8B | 128k | 52 | $0.15 | 342.1 | 0.41 | ||
Llama 3.2 3B | 128k | 36 | $0.10 | 202.2 | 0.50 | ||
Llama 3.2 3B | 128k | 47 | $0.15 | 143.9 | 0.40 | ||
Llama 3.2 3B | 128k | 47 | $0.10 | 262.1 | 0.28 | ||
Llama 3.2 3B | 128k | 47 | $0.04 | 94.3 | 0.26 | ||
Llama 3.2 3B | 8k | 47 | $0.06 | 1,687.3 | 0.31 | ||
Llama 3.2 3B | 4k | 47 | $0.10 | 1,559.1 | 0.35 | ||
Llama 3.2 3B Turbo | 128k | 47 | $0.06 | 128.7 | 0.34 | ||
Llama 3.2 1B | 128k | 29 | $0.10 | 314.0 | 0.34 | ||
Llama 3.2 1B | 128k | 26 | $0.10 | 553.9 | 0.32 | ||
Llama 3.2 1B | 128k | 27 | $0.01 | 164.1 | 0.26 | ||
Llama 3.2 1B | 8k | 27 | $0.04 | 3,315.0 | 0.48 | ||
Llama 3.2 1B | 4k | 27 | $0.05 | 2,480.0 | 0.32 | ||
Gemini 1.5 Pro (Sep) (Vertex) | 2m | 80 | $2.19 | 58.3 | 0.45 | ||
Gemini 1.5 Pro (Sep) (AI Studio) | 2m | 80 | $2.19 | 59.8 | 0.81 | ||
Gemini 1.5 Flash (Sep) (Vertex) | 1m | 68 | $0.13 | 188.0 | 0.24 | ||
Gemini 1.5 Flash (Sep) (AI Studio) | 1m | 68 | $0.13 | 190.3 | 0.40 | ||
Gemma 2 27B | 8k | 61 | $0.80 | 49.3 | 0.52 | ||
Gemma 2 9B Fast | 8k | 47 | $0.04 | 179.5 | 0.55 | ||
Gemma 2 9B Base | 8k | 47 | $0.03 | 165.7 | 0.53 | ||
Gemma 2 9B | 8k | 48 | $0.06 | 50.4 | 0.35 | ||
Gemma 2 9B | 8k | 45 | $0.20 | 664.2 | 0.19 | ||
Gemma 2 9B | 8k | 48 | $0.30 | 97.1 | 0.40 | ||
Gemini 1.5 Pro (May) (Vertex) | 2m | 72 | $5.25 | 64.3 | 0.43 | ||
Gemini 1.5 Pro (May) (AI Studio) | 2m | 72 | $5.25 | 61.6 | 0.85 | ||
Gemini 1.5 Flash (May) (Vertex) | 1m | | $0.13 | 310.6 | 0.26 | ||
Gemini 1.5 Flash (May) (AI Studio) | 1m | | $0.13 | 312.0 | 0.34 | ||
Gemini 1.5 Flash-8B (AI Studio) | 1m | | $0.07 | 283.8 | 0.35 | ||
Claude 3.5 Sonnet (Oct) | 200k | 80 | $6.00 | 47.6 | 1.00 | ||
Claude 3.5 Sonnet (Oct) Vertex | 200k | 80 | $6.00 | 60.2 | 0.75 | ||
Claude 3.5 Sonnet (Oct) | 200k | 80 | $6.00 | 59.5 | 0.80 | ||
Claude 3.5 Sonnet (June) | 200k | 77 | $6.00 | 49.1 | 0.95 | ||
Claude 3.5 Sonnet (June) Vertex | 200k | 77 | $6.00 | 60.5 | 0.74 | ||
Claude 3.5 Sonnet (June) | 200k | 77 | $6.00 | 59.5 | 1.02 | ||
Claude 3 Opus | 200k | 70 | $30.00 | 27.3 | 1.62 | ||
Claude 3 Opus Vertex | 200k | 70 | $30.00 | 28.1 | 3.23 | ||
Claude 3 Opus | 200k | 70 | $30.00 | 26.9 | 2.60 | ||
Claude 3.5 Haiku | 200k | 69 | $2.00 | 60.1 | 0.81 | ||
Claude 3.5 Haiku Vertex | 200k | 69 | $2.00 | 68.6 | 0.99 | ||
Claude 3.5 Haiku | 200k | 69 | $2.00 | 67.0 | 1.00 | ||
Claude 3 Haiku | 200k | 54 | $0.50 | 118.6 | 0.47 | ||
Claude 3 Haiku | 200k | 54 | $0.50 | 135.4 | 0.50 | ||
Mistral Large (Nov '24) | 128k | 74 | $3.00 | 34.6 | 0.49 | ||
Mistral Large 2 (Jul '24) | 128k | 73 | $3.00 | 32.6 | 0.47 | ||
Mistral Large 2 (Jul '24) | 128k | 73 | $3.00 | 33.9 | 0.48 | ||
Mistral Large 2 (Jul '24) | 128k | 73 | $3.00 | 56.3 | 0.41 | ||
Pixtral Large | 128k | 73 | $3.00 | 34.0 | 0.58 | ||
Mixtral 8x22B | 65k | 62 | $3.00 | 65.6 | 0.45 | ||
Mixtral 8x22B Base | 65k | 62 | $0.60 | 86.4 | 0.71 | ||
Mixtral 8x22B Fast | 65k | 62 | $1.05 | 96.2 | 0.71 | ||
Mixtral 8x22B | 65k | 61 | $1.20 | 83.1 | 0.29 | ||
Mixtral 8x22B | 65k | 60 | $1.20 | 61.0 | 0.81 | ||
Mistral Small (Sep '24) | 128k | 60 | $0.30 | 55.7 | 0.47 | ||
Pixtral 12B | 128k | 56 | $0.15 | 67.6 | 0.40 | ||
Pixtral 12B | 128k | 57 | $0.10 | 75.2 | 0.51 | ||
Ministral 8B | 128k | 53 | $0.10 | 136.0 | 0.42 | ||
Mistral NeMo | 128k | 52 | $0.15 | 119.7 | 0.40 | ||
Mistral NeMo Fast | 128k | 53 | $0.12 | 155.4 | 0.56 | ||
Mistral NeMo Base | 128k | 53 | $0.06 | 52.1 | 0.64 | ||
Mistral NeMo | 128k | 53 | $0.13 | 53.0 | 0.25 | ||
Ministral 3B | 128k | 51 | $0.04 | 210.5 | 0.41 | ||
Mixtral 8x7B | 33k | 44 | $0.70 | 87.4 | 0.44 | ||
Mixtral 8x7B | 33k | 42 | $0.51 | 78.2 | 0.35 | ||
Mixtral 8x7B Fast | 33k | 43 | $0.23 | 156.2 | 0.55 | ||
Mixtral 8x7B Base | 33k | 43 | $0.12 | 130.2 | 0.54 | ||
Mixtral 8x7B | 33k | 42 | $0.50 | 101.0 | 0.24 | ||
Mixtral 8x7B | 33k | 43 | $0.24 | 43.1 | 0.30 | ||
Mixtral 8x7B | 33k | 45 | $0.24 | 551.7 | 0.24 | ||
Mixtral 8x7B | 33k | 40 | $0.60 | 86.6 | 0.36 | ||
Codestral-Mamba | 256k | 36 | $0.25 | 94.8 | 0.58 | ||
Command-R+ | 128k | 56 | $6.00 | 44.0 | 0.52 | ||
Command-R+ | 128k | 56 | $4.38 | 70.8 | 0.29 | ||
Command-R | 128k | 51 | $0.75 | 108.0 | 0.35 | ||
Command-R | 128k | 51 | $0.26 | 116.1 | 0.22 | ||
Command-R+ (Apr '24) | 128k | 46 | $6.00 | 43.3 | 0.52 | ||
Command-R+ (Apr '24) | 128k | 48 | $6.00 | 66.5 | 0.30 | ||
Command-R+ (Apr '24) | 128k | 44 | $6.00 | 45.7 | 0.70 | ||
Command-R (Mar '24) | 128k | 36 | $0.75 | 108.1 | 0.35 | ||
Command-R (Mar '24) | 128k | 36 | $0.75 | 163.7 | 0.21 | ||
Command-R (Mar '24) | 128k | 36 | $0.75 | 103.4 | 0.52 | ||
Aya Expanse 32B | 8k | | $0.75 | 121.4 | 0.22 | ||
Aya Expanse 8B | 8k | | $0.75 | 137.7 | 0.26 | ||
Sonar 3.1 Large | 131k | | $1.00 | 57.9 | 0.36 | ||
Sonar 3.1 Small | 131k | | $0.20 | 143.3 | 0.35 | ||
Grok Beta | 8k | 70 | $7.50 | 57.2 | 0.48 | ||
Phi-3 Medium 14B | 128k | | $0.30 | 45.0 | 0.44 | ||
Solar Pro | 4k | 61 | $0.25 | 51.6 | 1.21 | ||
Solar Mini | 4k | 48 | $0.15 | 84.9 | 1.13 | ||
DBRX | 33k | 48 | $1.20 | 82.9 | 0.35 | ||
Llama 3.1 Nemotron 70B Base | 128k | 70 | $0.20 | 47.6 | 0.63 | ||
Llama 3.1 Nemotron 70B Fast | 128k | 70 | $0.38 | 69.7 | 0.62 | ||
Llama 3.1 Nemotron 70B | 128k | 70 | $0.36 | 23.9 | 0.33 | ||
Reka Flash | 128k | 58 | $0.35 | 32.8 | 1.23 | ||
Reka Core | 128k | 57 | $2.00 | 14.9 | 1.13 | ||
Reka Flash (Feb '24) | 128k | 46 | $0.35 | 31.2 | 0.89 | ||
Reka Edge | 64k | 30 | $0.10 | 35.2 | 0.92 | ||
Jamba 1.5 Large | 256k | 64 | $3.50 | 51.0 | 0.70 | ||
Jamba 1.5 Mini | 256k | 46 | $0.25 | 82.5 | 0.49 | ||
DeepSeek-Coder-V2 | 128k | 67 | $0.17 | 16.4 | 1.06 | ||
DeepSeek-V2 | 128k | 66 | $0.17 | 16.4 | 1.07 | ||
DeepSeek-V2.5 | 64k | 66 | $0.17 | 16.5 | 1.08 | ||
DeepSeek-V2.5 | 128k | 66 | $2.00 | 7.5 | 0.85 | ||
Qwen2.5 72B | 131k | 75 | $0.40 | 47.0 | 0.60 | ||
Qwen2.5 72B | 131k | 75 | $0.20 | 46.0 | 0.64 | ||
Qwen2.5 72B Fast | 131k | 75 | $0.38 | 67.2 | 0.56 | ||
Qwen2.5 72B | 131k | 75 | $0.90 | 52.6 | 0.46 | ||
Qwen2.5 72B | 33k | 75 | $0.36 | 21.5 | 0.41 | ||
Qwen2.5 72B | 131k | 75 | $1.20 | 74.0 | 0.54 | ||
Qwen2.5 Coder 32B | 131k | 70 | $0.20 | 35.2 | 0.52 | ||
Qwen2.5 Coder 32B | 33k | 70 | $0.90 | 98.1 | 0.32 | ||
Qwen2.5 Coder 32B | 33k | 70 | $0.18 | 54.7 | 0.28 | ||
Qwen2.5 Coder 32B | 131k | 70 | $0.80 | 59.4 | 0.51 | ||
Qwen2 72B | 33k | 69 | $0.36 | 21.5 | 0.38 | ||
Qwen2 72B | 33k | 69 | $0.90 | 62.3 | 0.42 | ||
Yi-Large | 32k | 58 | $3.00 | 67.1 | 0.44 | ||
GPT-4 Turbo | 128k | 74 | $15.00 | 35.0 | 0.59 | ||
GPT-4 Turbo | 128k | 74 | $15.00 | 45.3 | 1.60 | ||
GPT-3.5 Turbo | 16k | 53 | $0.75 | 98.5 | 0.39 | ||
GPT-3.5 Turbo | 16k | 52 | $0.75 | 121.0 | 0.68 | ||
GPT-3.5 Turbo Instruct | 4k | | $1.63 | 108.4 | 0.66 | ||
GPT-4 | 8k | | $37.50 | 23.8 | 0.65 | ||
Llama 3 70B | 8k | 62 | $1.18 | 46.5 | 0.35 | ||
Llama 3 70B | 8k | 62 | $0.40 | 31.8 | 1.19 | ||
Llama 3 70B | 8k | 62 | $2.86 | 43.7 | 0.44 | ||
Llama 3 70B | 8k | 61 | $2.90 | 18.3 | 0.78 | ||
Llama 3 70B | 8k | 62 | $0.90 | 113.7 | 0.29 | ||
Llama 3 70B | 8k | 62 | $0.36 | 22.6 | 0.36 | ||
Llama 3 70B | 8k | 62 | $0.64 | 349.9 | 0.21 | ||
Llama 3 70B (Reference, FP16) | 8k | 62 | $0.90 | 109.0 | 0.55 | ||
Llama 3 70B (Turbo, FP8) | 8k | 62 | $0.88 | 45.1 | 0.41 | ||
Llama 3 8B | 8k | 45 | $0.10 | 57.7 | 0.34 | ||
Llama 3 8B | 8k | 46 | $0.38 | 101.8 | 0.32 | ||
Llama 3 8B | 8k | 45 | $0.38 | 73.8 | 0.47 | ||
Llama 3 8B | 8k | 46 | $0.20 | 153.7 | 0.30 | ||
Llama 3 8B | 8k | 46 | $0.06 | 122.5 | 0.20 | ||
Llama 3 8B | 8k | 46 | $0.06 | 1,198.9 | 0.26 | ||
Llama 3 8B | 8k | 46 | $0.20 | 217.9 | 0.39 | ||
Llama 2 Chat 13B | 4k | 25 | $0.30 | 53.1 | 0.48 | ||
Llama 2 Chat 7B | 4k | | $0.10 | 123.8 | 0.33 | ||
Gemini 1.0 Pro (AI Studio) | 33k | | $0.75 | 102.5 | 1.26 | ||
Claude 3 Sonnet | 200k | 57 | $6.00 | 60.2 | 0.76 | ||
Claude 3 Sonnet | 200k | 57 | $6.00 | 62.8 | 1.14 | ||
Mistral Large (Feb '24) | 33k | 57 | $6.00 | 33.5 | 0.46 | ||
Mistral Large (Feb '24) | 33k | 56 | $6.00 | 36.2 | 0.41 | ||
Mistral Large (Feb '24) | 33k | 55 | $6.00 | 39.4 | 0.56 | ||
Mistral Small (Feb '24) | 33k | 50 | $1.50 | 59.4 | 0.43 | ||
Mistral Small (Feb '24) | 33k | 50 | $1.50 | 53.0 | 0.44 | ||
Mistral 7B | 33k | 22 | $0.25 | 131.3 | 0.42 | ||
Mistral 7B | 33k | 24 | $0.16 | 93.5 | 0.33 | ||
Mistral 7B | 33k | 24 | $0.06 | 93.0 | 0.21 | ||
Mistral 7B | 8k | 24 | $0.20 | 111.8 | 0.31 | ||
Codestral | 33k | | $0.30 | 80.9 | 0.43 | ||
Mistral Medium | 33k | | $4.09 | 44.5 | 0.45 | ||
OpenChat 3.5 | 8k | 43 | $0.06 | 74.8 | 0.32 | ||
Jamba Instruct | 256k | 28 | $0.55 | 75.5 | 0.53 |
Key definitions
Artificial Analysis Quality Index: Average result across our evaluations covering different dimensions of model intelligence. Currently includes MMLU, GPQA, Math & HumanEval. OpenAI o1 model figures are preliminary and are based on figures stated by OpenAI. See methodology for more details.
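As an illustration only (the exact weighting and normalisation are covered in our methodology, and the eval scores below are invented), an index of this kind is an average of eval results scaled to a 0-100 range:

```python
# Hypothetical eval scores on a 0-1 scale; the leaderboard's exact
# normalisation and any weighting may differ from this simple mean.
scores = {"MMLU": 0.82, "GPQA": 0.51, "Math": 0.76, "HumanEval": 0.90}
quality_index = 100 * sum(scores.values()) / len(scores)  # unweighted average
print(round(quality_index))  # -> 75, on the 0-100 scale used in the table
```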
Context window: Maximum number of combined input & output tokens. Output tokens commonly have a significantly lower limit (varies by model).
Output Speed: Tokens per second received while the model is generating tokens (i.e. after the first chunk has been received from the API, for models which support streaming).
Latency: Time to first token received, in seconds, after the API request is sent. For models which do not support streaming, this represents the time to receive the full completion.
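As a minimal sketch of how these two metrics can be measured (not our production harness; it assumes an OpenAI-compatible streaming endpoint, uses a placeholder model name, and approximates each streamed chunk as one token):

```python
import time

from openai import OpenAI  # official openai Python package (v1+)

client = OpenAI()  # reads the API key from the OPENAI_API_KEY environment variable

start = time.monotonic()
first_token_at = None
chunks = 0

stream = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; substitute the endpoint under test
    messages=[{"role": "user", "content": "Write 300 words about LLM benchmarks."}],
    stream=True,
)
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        if first_token_at is None:
            first_token_at = time.monotonic()  # latency = time to first token
        chunks += 1  # approximation: one streamed chunk ~ one token
end = time.monotonic()

print(f"Latency (TTFT): {first_token_at - start:.2f}s")
# Output speed covers the generation phase only, i.e. after the first chunk.
print(f"Output speed: {(chunks - 1) / (end - first_token_at):.1f} tokens/s")
```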
Price: Blended price per token, represented as USD per million tokens. The blend combines input and output token prices at a 3:1 input-to-output ratio.
Output price: Price per token generated by the model (received from the API), represented as USD per million tokens.
Input price: Price per token included in the request/message sent to the API, represented as USD per million tokens.
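As a worked example of the 3:1 blend (using assumed rates of $0.15 per 1M input tokens and $0.60 per 1M output tokens, which reproduce the $0.26 GPT-4o mini figure in the table above):

```python
def blended_price(input_usd_per_m: float, output_usd_per_m: float) -> float:
    """Blend input and output prices at the 3:1 input-to-output ratio defined above."""
    return (3 * input_usd_per_m + output_usd_per_m) / 4

# (3 * 0.15 + 0.60) / 4 = 0.2625, displayed as $0.26
print(f"${blended_price(0.15, 0.60):.2f}")
```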
Time period: Metrics are 'live' and are based on the past 14 days of measurements. Measurements are taken 8 times per day for single requests and twice per day for parallel requests.