A foundation model is an AI model trained on broad data at scale such that it can be adapted to a wide range of downstream tasks.[12]
Granite's first foundation models were Granite.13b.instruct and Granite.13b.chat. The "13b" in their names refers to their 13 billion parameters, fewer than most large language models of the time. Later models range from 3 to 34 billion parameters.[4][13]
On May 6, 2024, IBM released the source code of four variations of the Granite Code Models under the Apache License 2.0, a permissive open-source license that allows completely free use, modification and sharing of the software, and published them on Hugging Face for public use.[14][15] According to IBM's own report, Granite 8b outperforms Llama 3 on several coding-related tasks within a similar parameter range.[16][17]
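Because the code models are published on Hugging Face, they can be loaded with the standard `transformers` library. The following is a minimal sketch, assuming a model identifier of the form "ibm-granite/granite-3b-code-base" under IBM's Hugging Face organization; the exact repository names should be checked on Hugging Face, as they may differ.

```python
# Minimal sketch: load a publicly released Granite Code model from
# Hugging Face and generate a code completion. The model ID below is
# an assumption for illustration; check the ibm-granite organization
# on Hugging Face for the current repository names.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ibm-granite/granite-3b-code-base"  # assumed repository name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Prompt the model with the start of a function and decode its completion.
prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```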
See also
Mistral AI, a company that also provides open-source models