Mistral: Mixtral 8x22B Instruct
Mistral
Text
Paid
Mistral's official instruct fine-tuned version of [Mixtral 8x22B](/models/mistralai/mixtral-8x22b). As a sparse mixture-of-experts model, it activates 39B of its 141B total parameters per token, offering strong cost efficiency for its size. Its strengths include strong math, coding,...
Parameters
141B (39B active)
Context Window
65,536 tokens
Input Price
$2 per 1M tokens
Output Price
$6 per 1M tokens
Capabilities
Model capabilities and supported modalities
Performance
Reasoning
Good reasoning with solid logical foundations
Math
Strong mathematical capabilities, handles complex calculations well
Coding
Capable of generating functional code with good practices
Knowledge
-
Modalities
Input Modalities
text
Output Modalities
text
LLM Price Calculator
Calculate the cost of using this model
Input Cost: $0.003000
Output Cost: $0.018000
Total Cost: $0.021000
Estimated usage: 4,500 tokens
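The calculator's figures can be reproduced directly from the listed prices. A minimal sketch, assuming the 4,500-token estimate splits into 1,500 input and 3,000 output tokens (the split implied by the $0.003 and $0.018 line items):

```python
INPUT_PRICE = 2.0 / 1_000_000   # $2 per 1M input tokens
OUTPUT_PRICE = 6.0 / 1_000_000  # $6 per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost in dollars for one request at the listed per-token prices."""
    return input_tokens * INPUT_PRICE + output_tokens * OUTPUT_PRICE

# 1,500 * $2/1M = $0.003 input; 3,000 * $6/1M = $0.018 output
cost = request_cost(1_500, 3_000)  # total $0.021
```

Note the 3:1 price ratio: output tokens dominate the bill whenever a response is longer than a third of the prompt.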
Monthly Cost Estimator
Based on different usage levels
Light Usage
$0.0800
~10 requests
Moderate Usage
$0.8000
~100 requests
Heavy Usage
$8.0000
~1000 requests
Enterprise
$80.0000
~10,000 requests
Note: Estimates assume a fixed cost of $0.008 per request at the prices above; adjust for your actual per-request token counts.
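The monthly tiers above are a straight multiplication. A minimal sketch, where the $0.008 per-request cost is inferred from the table itself ($0.0800 across ~10 requests) rather than stated on the page:

```python
PER_REQUEST_COST = 0.008  # $ per request, inferred from the tier table

def monthly_cost(requests_per_month: int) -> float:
    """Estimated monthly spend at a fixed per-request cost."""
    return requests_per_month * PER_REQUEST_COST

tiers = [("Light", 10), ("Moderate", 100),
         ("Heavy", 1_000), ("Enterprise", 10_000)]
for name, n in tiers:
    print(f"{name}: ${monthly_cost(n):.4f}")
```

At the listed prices, $0.008 corresponds to any input/output mix satisfying input + 3 × output = 4,000 tokens, e.g. 1,000 input plus 1,000 output tokens per request.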
Last Updated: 2026/04/11
