Mercury 2

Inception · inception/mercury-2

Mercury 2 is an extremely fast reasoning LLM, and the first reasoning diffusion LLM (dLLM). Instead of generating tokens sequentially, Mercury 2 produces and refines multiple tokens in parallel, achieving...
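The parallel-refinement idea can be sketched as a toy masked-denoising loop. This is purely illustrative: Inception has not published Mercury 2's decoder, and the vocabulary, confidence scores, and commit schedule below are invented for the sketch. The shape of the loop (start fully masked, predict every position in parallel, commit the most confident guesses, repeat) is the general diffusion-LLM pattern, not Mercury 2's actual implementation:

```python
import random

MASK = "<mask>"
VOCAB = ["the", "cat", "sat", "on", "mat"]  # toy vocabulary (assumption)

def toy_predict(seq):
    # Stand-in for the dLLM denoiser: returns a (token, confidence)
    # guess for EVERY position in parallel. A real model would run a
    # transformer over the partially masked sequence; here we guess
    # randomly just to show the control flow.
    return [(random.choice(VOCAB), random.random()) for _ in seq]

def diffusion_decode(length, steps=4):
    seq = [MASK] * length  # start fully masked ("pure noise")
    preds = toy_predict(seq)
    for _ in range(steps):
        preds = toy_predict(seq)
        masked = [i for i, t in enumerate(seq) if t == MASK]
        if not masked:
            break
        # Commit the most confident half of the remaining masked
        # positions each step, so several tokens land per iteration
        # instead of one token per forward pass.
        masked.sort(key=lambda i: preds[i][1], reverse=True)
        for i in masked[: max(1, len(masked) // 2)]:
            seq[i] = preds[i][0]
    # Fill any stragglers so decoding always terminates.
    for i, t in enumerate(seq):
        if t == MASK:
            seq[i] = preds[i][0]
    return seq
```

The speed claim follows from this structure: each refinement step is one parallel forward pass that can finalize many tokens, whereas autoregressive decoding pays one forward pass per token.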



Technical Specifications

Provider: Inception
Type: Chat
Context Window: 128,000 tokens
Pricing: 2 credits / generation

Capabilities

File Support: can read PDF, DOCX, XLSX & more
128K Context: large context window for long documents