Mistral: Ministral 3B

mistralai/ministral-3b

Created Oct 17, 2024 · 131,072 context
$0.04/M input tokens · $0.04/M output tokens

Ministral 3B is a 3B parameter model optimized for on-device and edge computing. It excels in knowledge, commonsense reasoning, and function-calling, outperforming larger models like Mistral 7B on most benchmarks. Supporting up to 128k context length, it’s ideal for orchestrating agentic workflows and specialist tasks with efficient inference.
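Since the model is highlighted for function-calling, here is a minimal sketch of invoking it through OpenRouter's OpenAI-compatible chat completions endpoint with a tool definition. The `get_weather` tool and the `OPENROUTER_API_KEY` environment variable are illustrative assumptions, not part of this page.

```python
# Sketch: call mistralai/ministral-3b via OpenRouter with a tool definition.
# Assumes an API key in the OPENROUTER_API_KEY environment variable and a
# hypothetical get_weather tool; adapt both to your own setup.
import json
import os

import requests

API_URL = "https://openrouter.ai/api/v1/chat/completions"

payload = {
    "model": "mistralai/ministral-3b",
    "messages": [
        {"role": "user", "content": "What's the weather in Paris right now?"}
    ],
    # Function-calling: describe a tool the model may choose to invoke.
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Look up current weather for a city.",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
}

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    json=payload,
    timeout=60,
)
response.raise_for_status()
message = response.json()["choices"][0]["message"]

# If the model decided to call the tool, its arguments arrive as a JSON string.
for call in message.get("tool_calls") or []:
    print(call["function"]["name"], json.loads(call["function"]["arguments"]))
```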

Uptime stats for Ministral 3B on the only provider

When an upstream provider returns an error, we can recover by routing your request to another healthy provider, if your request filters allow it.

Learn more about our load balancing and customization options.
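As a sketch of how that routing behavior can be influenced per request, the snippet below passes a provider preferences object alongside the usual payload. The `provider` object and its `allow_fallbacks` field are taken as an assumption from OpenRouter's provider-routing options; consult the load-balancing docs linked above for the current field names.

```python
# Sketch: request-level provider routing preferences (assumed field names).
# With allow_fallbacks enabled, a failed upstream call may be retried on
# another healthy provider when more than one is available.
import os

import requests

payload = {
    "model": "mistralai/ministral-3b",
    "messages": [{"role": "user", "content": "Say hello in one sentence."}],
    "provider": {"allow_fallbacks": True},
}

response = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    json=payload,
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```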

