Top AI Hubs

Baidu: ERNIE 4.5 21B A3B

Other
Text
Paid

A text-only Mixture-of-Experts (MoE) model with 21B total parameters, of which 3B are activated per token. It inherits the ERNIE 4.5 family's heterogeneous MoE architecture and modality-isolated routing, and supports a 131K-token context length. Inference stays efficient through multi-expert parallel collaboration and quantization, while post-training with SFT, DPO, and UPO, combined with specialized routing and balancing losses, tunes the model for a broad range of tasks.

Parameters

21B

Context Window

120,000

tokens

Input Price

$0.07

per 1M tokens

Output Price

$0.28

per 1M tokens

Capabilities

Model capabilities and supported modalities

Performance

Reasoning

Good reasoning with solid logical foundations

Math

-

Coding

-

Knowledge

-

Modalities

Input Modalities

text

Output Modalities

text

LLM Price Calculator

Calculate the cost of using this model

Input Cost: $0.000105
Output Cost: $0.000840
Total Cost: $0.000945
Estimated usage: 4,500 tokens
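The calculator's figures follow directly from the listed per-1M-token rates. A minimal sketch, assuming the 4,500-token estimate splits into 1,500 input and 3,000 output tokens (the split implied by the two cost lines above):

```python
INPUT_PRICE = 0.07   # USD per 1M input tokens
OUTPUT_PRICE = 0.28  # USD per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost in USD for a single request at the listed per-1M-token rates."""
    return (input_tokens / 1_000_000) * INPUT_PRICE \
         + (output_tokens / 1_000_000) * OUTPUT_PRICE

# 1,500 input + 3,000 output tokens reproduces the calculator figures:
print(f"{request_cost(1_500, 3_000):.6f}")  # → 0.000945
```

Multiplying the per-request total by the request counts below gives the monthly tier estimates.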

Monthly Cost Estimator

Based on different usage levels

Light Usage
$0.0035
~10 requests
Moderate Usage
$0.0350
~100 requests
Heavy Usage
$0.3500
~1000 requests
Enterprise
$3.5000
~10,000 requests
Note: Estimates based on current token count settings per request.