
Mistral: Mistral Nemo

Mistral Nemo is a 12B-parameter model with a 128k-token context length, built by Mistral in collaboration with NVIDIA. The model is multilingual, supporting English, French, German, Spanish, Italian, Portuguese, Chinese, Japanese, Korean, Arabic, and Hindi. It supports function calling and is released under the Apache 2.0 license.
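Since the model supports function calling, it can be driven through an OpenAI-compatible chat completions endpoint that accepts a `tools` list. The sketch below uses the official `openai` Python client; the `base_url` and the `mistralai/mistral-nemo` model identifier are assumptions about the hosting provider and may differ, and `get_weather` is a hypothetical tool used only for illustration.

```python
# Minimal function-calling sketch, assuming an OpenAI-compatible endpoint.
# The base_url and model identifier are assumptions and may differ by provider.
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",  # assumed hosting endpoint
    api_key="YOUR_API_KEY",
)

# Describe a tool the model is allowed to call (hypothetical example tool).
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="mistralai/mistral-nemo",  # assumed model identifier
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools,
)

# If the model chose to call the tool, the call arrives as structured JSON.
message = response.choices[0].message
if message.tool_calls:
    print(message.tool_calls[0].function.name)
    print(message.tool_calls[0].function.arguments)
else:
    print(message.content)
```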
