o1: Intelligence, Performance & Price Analysis
Analysis of OpenAI's o1 and comparison to other AI models across key metrics, including quality, price, performance (tokens per second and time to first token), context window, and more. For more details, including our methodology, see our FAQs.
Comparison Summary
Intelligence: o1 is of higher quality than average, with an MMLU score of 0.841 and an Intelligence Index of 62 across evaluations.
Price: o1 is more expensive than average, with a blended price of $26.25 per 1M tokens (3:1 input:output ratio); input tokens are $15.00 and output tokens $60.00 per 1M tokens (a worked calculation follows this summary).
Speed: o1 is faster than average, with an output speed of 169.5 tokens per second.
Latency: o1 has higher latency than average, taking 16.25s to receive the first token (TTFT).
Context Window: o1 has a smaller context window than average, at 200k tokens.
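As a rough illustration of how these headline numbers combine, the sketch below derives the $26.25 blended price from the 3:1 input:output weighting and estimates an end-to-end response time from the TTFT and output-speed figures. The 1,000-token response size and the simple TTFT-plus-streaming timing model are assumptions for illustration, not figures from this report.

```python
# Sketch: derive o1's blended price and a rough end-to-end response time
# from the headline figures above. The 3:1 blend ratio, per-token prices,
# TTFT, and output speed come from the summary; the 1,000-token response
# size and the TTFT + streaming timing model are illustrative assumptions.

INPUT_PRICE_PER_1M = 15.00      # USD per 1M input tokens
OUTPUT_PRICE_PER_1M = 60.00     # USD per 1M output tokens
TTFT_SECONDS = 16.25            # time to first token
OUTPUT_TOKENS_PER_SECOND = 169.5

def blended_price(input_price: float, output_price: float,
                  input_weight: int = 3, output_weight: int = 1) -> float:
    """Weighted-average price per 1M tokens (default 3:1 input:output)."""
    total_weight = input_weight + output_weight
    return (input_price * input_weight + output_price * output_weight) / total_weight

def estimated_response_time(output_tokens: int) -> float:
    """Rough end-to-end time: wait for the first token, then stream the rest."""
    return TTFT_SECONDS + output_tokens / OUTPUT_TOKENS_PER_SECOND

print(blended_price(INPUT_PRICE_PER_1M, OUTPUT_PRICE_PER_1M))  # 26.25
print(round(estimated_response_time(1000), 1))                 # ~22.1 s for a 1,000-token reply
```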
Highlights
Intelligence chart: Artificial Analysis Intelligence Index (higher is better)
Speed chart: Output Tokens per Second (higher is better)
Price chart: USD per 1M Tokens (lower is better)
o1 Model Details
Comparisons to o1
o3
GPT-4.1
o4-mini (high)
o3-pro
GLM-4.5
Llama 4 Maverick
Gemini 2.5 Flash (Reasoning)
Gemini 2.5 Pro
Claude 4 Sonnet Thinking
Claude 4 Opus Thinking
Magistral Small
DeepSeek R1 0528 (May '25)
DeepSeek V3 0324 (Mar '25)
Grok 4
Nova Premier
Solar Pro 2 (Reasoning)
MiniMax M1 80k
Nemotron Super 49B v1.5 (Reasoning)
Kimi K2
EXAONE 4.0 32B (Reasoning)
Qwen3 235B 2507 (Reasoning)
GPT-4o (Nov '24)
Further details