Building AI for business: IBM’s Granite foundation models
IBM Big Data Hub
SEPTEMBER 7, 2023
The Granite family of models is no different, and so we trained them on a variety of datasets — totaling 7 TB before pre-processing, 2.4 TB after pre-processing — to produce 1 trillion tokens, the basic units of text (words, subwords, or characters) that carry semantic meaning for a model. Beyond the pre-processing itself, we apply a wide range of other quality measures.
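As a rough illustration of what "tokens" means here, the sketch below counts tokens with a naive whitespace split. This is not IBM's actual tokenizer; production LLM pipelines use subword methods such as byte-pair encoding, and the corpus strings are invented examples.

```python
def count_tokens(text: str) -> int:
    """Naive whitespace tokenization; real tokenizers split text into subwords."""
    return len(text.split())

# Hypothetical two-document corpus, stand-in for the curated training data.
corpus = [
    "IBM trained Granite on curated enterprise data.",
    "Pre-processing filters and deduplicates the raw text.",
]

total = sum(count_tokens(doc) for doc in corpus)
print(total)  # 14
```

At scale, the same idea applies: terabytes of cleaned text are tokenized into the roughly 1 trillion tokens the article cites, and the token count (not the raw byte count) is what determines how much the model actually trains on.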