Mistral
Mistral Nemo
A 12B-parameter model with a 128k-token context length developed by Mistral in collaboration with NVIDIA. The model is multilingual, supporting English, French, German, Spanish, Italian, Portuguese, Chinese, Japanese, Korean, Arabic, and Hindi. It supports function calling and is released under the Apache 2.0 license.
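Since Mistral Nemo supports function calling, a request to it typically includes a tool definition in the OpenAI-style JSON schema that function-calling chat APIs commonly accept. The sketch below only constructs such a payload; the function name, parameters, and model identifier are illustrative assumptions, not part of any documented API.

```python
import json

# Hypothetical tool definition in the JSON-schema format commonly used
# for function calling. The function name and parameters are illustrative.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Return the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    }
]

payload = {
    "model": "mistral-nemo",  # illustrative model identifier
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": tools,
    "tool_choice": "auto",  # let the model decide whether to call a tool
}

print(json.dumps(payload, indent=2))
```

When the model decides to use a tool, it responds with the function name and JSON arguments rather than plain text, and the caller executes the function and returns the result in a follow-up message.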
Mistral 7B Instruct
An instruction-tuned version of the 7.3-billion-parameter Mistral 7B model, which uses grouped-query attention and sliding-window attention to balance inference speed and context length. It is released under the Apache 2.0 license.
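Mistral 7B Instruct expects prompts wrapped in its [INST] ... [/INST] chat template. Most tokenizer libraries apply this template automatically; the minimal sketch below formats a single-turn prompt by hand, under the assumption that the serving stack does not add the template itself.

```python
def format_instruct_prompt(user_message: str) -> str:
    """Wrap a user message in the [INST] ... [/INST] template used by
    Mistral 7B Instruct. The leading <s> is the BOS token; many
    tokenizers add it automatically, in which case it should be omitted here.
    """
    return f"<s>[INST] {user_message} [/INST]"

prompt = format_instruct_prompt("Summarize the Apache 2.0 license in one sentence.")
print(prompt)
```

For multi-turn conversations, prior assistant replies are appended after each closing [/INST] tag, so getting the template right matters for generation quality.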