May. 15th, 2024

First off, it has to be pointed out that this is a specialized AI model designed for programmers, not a generalized model like ChatGPT et al.

IBM trained it specifically on open source code that they explicitly had permission to use, basically bending over backwards to avoid any possible legal issues. And they now have working models that they've released to the public! Granite was trained on 116 different programming languages, and the models range from 3 to 34 billion parameters. I wonder if you can ask it to list all the languages it's trained in; I'll bet there are some pretty esoteric ones in there! I'd love it if it had MUMPS! (I once found a book on MUMPS programming at the Phoenix Public Library; I imagine it's been weeded by now.)
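
If you want to poke at one of the released models yourself, here's a rough sketch using the Hugging Face transformers library. I'm assuming the models are published under an "ibm-granite" organization on Hugging Face and guessing at the exact repository name, so check their page for the real identifiers.

# Rough sketch: load an open Granite code model and ask it about its training
# languages. The repo name below is an assumption; the real one may differ.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ibm-granite/granite-3b-code-instruct"  # assumed model name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "List the programming languages you were trained on."
inputs = tokenizer(prompt, return_tensors="pt")
# No guarantee the model answers this accurately; it depends entirely on
# what ended up in its training data.
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))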

Anyway, it's an interesting article. It describes how the model was trained, etc., but one of the more interesting bits is that in the rather short time since ChatGPT et al. appeared and everyone started building their own LLMs, the cost of training an LLM has dropped from millions of dollars to thousands! That's a pretty impressive drop.

https://www.zdnet.com/article/ibm-open-sources-its-granite-ai-models-and-they-mean-business/
