# Mammouth.ai Model Dashboard
*Comparative analysis based on price (Mammouth) and performance (Artificial Analysis).*
Last updated: 2026-02-22 16:53:27
### Legend:
- **Price Score**: 10 = cheapest, 0 = most expensive.
- **Efficiency**: performance-to-price ratio. A high score indicates excellent value for money.
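The legend does not give the exact formulas, but the table values are consistent with a simple reconstruction: a blended per-1M-token price with a 3:1 input:output weighting, a price score that scales linearly from 10 (free) down to 0 at the most expensive model in the table ($15/$75), and an efficiency equal to the performance score weighted by the normalized price score. The blend weights and the 30.0 ceiling are assumptions inferred from the rows, not documented by Mammouth.ai; a minimal sketch:

```python
# Hypothetical reconstruction of the dashboard's scoring formulas.
# ASSUMPTIONS (inferred from the table, not documented): 3:1 in/out
# price blend, and a linear price score anchored at the table's most
# expensive model ($15/$75 -> blended 30.0 -> score 0.0).

MAX_BLENDED = 30.0  # blended price of the most expensive listed model

def blended_price(price_in: float, price_out: float) -> float:
    """Blended $ per 1M tokens, weighting input 3:1 over output (assumed)."""
    return (3 * price_in + price_out) / 4

def price_score(price_in: float, price_out: float) -> float:
    """10 = cheapest, 0 = most expensive, linear in the blended price."""
    return 10 * (1 - blended_price(price_in, price_out) / MAX_BLENDED)

def efficiency(score: float, price_in: float, price_out: float) -> float:
    """Performance score scaled by the (unrounded) price score."""
    return round(score * price_score(price_in, price_out) / 10, 1)
```

Spot-checking against the tables: a $3.00/$15.00 model blends to $6.00 and scores 8.0, $5.00/$25.00 blends to $10.00 and scores 6.7, and grok-code-fast-1 (score 23.7 at $0.20/$1.50) yields an efficiency of 23.3, matching the rows above to within rounding. Rows with an N/A score appear to fall back to a default performance value before weighting.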
## Coding
| Model | Price (In/Out per 1M) | Score | Speed (tok/s) | Price Score | **Efficiency** |
| :--- | :--- | :--- | :--- | :--- | :--- |
| grok-code-fast-1 | $0.20/$1.50 | **23.7** | 314.1 | 9.8 | **23.3** |
| qwen3-coder | $0.22/$0.95 | N/A | N/A | 9.9 | **19.7** |
| codestral-2508 | $0.30/$0.90 | N/A | N/A | 9.9 | **19.7** |
| qwen3-coder-flash | $0.50/$2.00 | N/A | N/A | 9.7 | **19.4** |
| qwen3-coder-plus | $1.80/$9.00 | N/A | N/A | 8.8 | **17.6** |
## Agents
| Model | Price (In/Out per 1M) | Score | Speed (tok/s) | Price Score | **Efficiency** |
| :--- | :--- | :--- | :--- | :--- | :--- |
| sonar-deep-research | $2.00/$8.00 | N/A | N/A | 8.8 | **17.7** |
| sonar-pro | $3.00/$15.00 | **15.2** | 129.2 | 8.0 | **12.2** |
## General
| Model | Price (In/Out per 1M) | Score | Speed (tok/s) | Price Score | **Efficiency** |
| :--- | :--- | :--- | :--- | :--- | :--- |
| kimi-k2.5 | $0.60/$3.00 | **46.7** | 44.8 | 9.6 | **44.9** |
| gemini-3-pro-preview | $2.00/$12.00 | **48.4** | 138.8 | 8.5 | **41.2** |
| gpt-5-mini | $0.25/$2.00 | **41.0** | 75.3 | 9.8 | **40.1** |
| kimi-k2-thinking | $0.55/$2.50 | **40.7** | 86.9 | 9.7 | **39.3** |
| gemini-3-flash-preview | $0.50/$3.00 | **35.1** | 177.9 | 9.6 | **33.8** |
| grok-4-0709 | $3.00/$15.00 | **41.4** | 39.4 | 8.0 | **33.1** |
| deepseek-v3.2 | $0.27/$0.42 | **32.1** | 49.1 | 9.9 | **31.8** |
| claude-opus-4-6 | $5.00/$25.00 | **46.4** | 66.8 | 6.7 | **30.9** |
| o4-mini | $1.10/$4.40 | **33.0** | 133.8 | 9.4 | **30.9** |
| claude-opus-4-5 | $5.00/$25.00 | **43.0** | 65.1 | 6.7 | **28.7** |
| gemini-2.5-pro | $2.50/$15.00 | **34.5** | 158.9 | 8.1 | **28.0** |
| deepseek-v3.1-terminus | $0.27/$1.00 | **28.4** | N/A | 9.9 | **28.0** |
| deepseek-v3.1 | $0.27/$1.00 | **28.0** | N/A | 9.9 | **27.6** |
| gpt-5-nano | $0.05/$0.40 | **26.7** | 130.9 | 10.0 | **26.6** |
| claude-4-sonnet-20250522 | $3.00/$15.00 | **33.0** | 72.8 | 8.0 | **26.4** |
| deepseek-r1-0528 | $0.50/$2.18 | **27.0** | N/A | 9.7 | **26.2** |
| kimi-k2-instruct | $0.50/$2.50 | **26.2** | 40.8 | 9.7 | **25.3** |
| claude-3-7-sonnet-20250219 | $3.00/$15.00 | **30.8** | N/A | 8.0 | **24.7** |
| grok-4-1-fast | $0.20/$0.50 | **23.5** | 119.7 | 9.9 | **23.3** |
| gpt-4.1 | $2.00/$8.00 | **25.6** | 103.9 | 8.8 | **22.6** |
| mistral-large-3 | $0.50/$1.50 | **22.7** | 55.9 | 9.8 | **22.1** |
| gpt-4.1-mini | $0.40/$1.60 | **22.4** | 77.4 | 9.8 | **21.9** |
| mistral-medium-3.1 | $0.40/$2.00 | **21.1** | 86.8 | 9.7 | **20.5** |
| grok-3 | $3.00/$15.00 | **25.0** | 67.7 | 8.0 | **20.0** |
| text-embedding-3-small | $0.02/$0.00 | N/A | N/A | 10.0 | **20.0** |
| text-embedding-3-large | $0.13/$0.00 | N/A | N/A | 10.0 | **19.9** |
| gemini-2.5-flash | $0.30/$2.50 | **20.5** | 235.9 | 9.7 | **19.9** |
| mistral-small-3.2-24b-instruct | $0.10/$0.30 | N/A | N/A | 10.0 | **19.9** |
| deepseek-v3.2-exp | $0.27/$0.41 | N/A | N/A | 9.9 | **19.8** |
| grok-3-mini | $0.30/$0.50 | N/A | N/A | 9.9 | **19.8** |
| grok-4-fast-non-reasoning | $0.40/$1.00 | N/A | N/A | 9.8 | **19.6** |
| gemini-2.5-flash-image | $0.30/$2.50 | N/A | N/A | 9.7 | **19.4** |
| claude-haiku-4-5 | $1.00/$5.00 | N/A | N/A | 9.3 | **18.7** |
| mistral-medium-3 | $0.40/$2.00 | **18.7** | 90.2 | 9.7 | **18.2** |
| llama-4-maverick | $0.15/$0.60 | **18.3** | 126.7 | 9.9 | **18.1** |
| gpt-5.1-chat | $1.25/$10.00 | N/A | N/A | 8.9 | **17.7** |
| gpt-5-chat | $1.25/$10.00 | N/A | N/A | 8.9 | **17.7** |
| claude-3-5-haiku-20241022 | $0.80/$4.00 | **18.7** | 46.4 | 9.5 | **17.7** |
| gemini-3-pro-image-preview | $2.00/$12.00 | N/A | N/A | 8.5 | **17.0** |
| gpt-5.2-chat | $1.75/$14.00 | N/A | N/A | 8.4 | **16.8** |
| deepseek-v3-0324 | $0.25/$1.00 | **16.4** | N/A | 9.9 | **16.2** |
| claude-sonnet-4-5 | $3.00/$15.00 | N/A | N/A | 8.0 | **16.0** |
| gpt-4o | $2.50/$10.00 | **17.3** | 168.6 | 8.5 | **14.8** |
| llama-4-scout | $0.08/$0.50 | **13.5** | 158.6 | 9.9 | **13.4** |
| gpt-4.1-nano | $0.10/$0.40 | **12.9** | 141.9 | 9.9 | **12.8** |
| claude-3-5-sonnet-20241022 | $3.00/$15.00 | **15.9** | N/A | 8.0 | **12.7** |
| mistral-large-2411 | $2.00/$6.00 | **9.9** | N/A | 9.0 | **8.9** |
| claude-opus-4-1-20250805 | $15.00/$75.00 | N/A | N/A | 0.0 | N/A |