
Meta
meta-llama/llama-4-maverick

Llama 4 Maverick 17B Instruct (128E) is a high-capacity multimodal language model from Meta, built on a mixture-of-experts (MoE) architecture with 128 experts and 17 billion active parameters per forward pass.
Provider: Meta
Type: Chat
Context Window: 1,048,576 tokens
Pricing: 6 credits per generation
Knowledge Cutoff: 2024-08-31

Vision: can process and understand images
File Support: can read PDF, DOCX, XLSX & more
Reasoning: chain-of-thought reasoning exposed
1049K Context: large context window for long documents
Vision (OR): OpenRouter reports vision support