StableLM Homepage, Documentation and Downloads – Language Model Developed by Stability AI – Development details

The StableLM project repository tracks Stability AI's ongoing development of the StableLM series of language models. So far, Stability AI has released the initial StableLM-Alpha model set, with 3-billion- and 7-billion-parameter models; larger models with 15, 30, and 65 billion parameters are in development.

Stability AI has published a demo of the 7-billion-parameter fine-tuned model, StableLM-Tuned-Alpha-7B, on Hugging Face Spaces, and users can visit the site to try it out.
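Besides the hosted demo, the tuned checkpoints can be prompted directly. The sketch below shows one way a chat prompt might be assembled for StableLM-Tuned-Alpha; the `<|SYSTEM|>`/`<|USER|>`/`<|ASSISTANT|>` role tokens follow the examples in the StableLM repository, but the exact system text and the helper function here are illustrative assumptions, not an official API.

```python
# Hedged sketch: building a chat-style prompt for StableLM-Tuned-Alpha.
# The role tokens follow the StableLM repository's examples; the system
# message below is a placeholder, not the project's official prompt.

def build_prompt(
    user_message: str,
    system_message: str = "StableLM is a helpful and harmless AI language model.",
) -> str:
    """Wrap a user message in the role tokens the tuned models expect."""
    return (
        f"<|SYSTEM|>{system_message}"
        f"<|USER|>{user_message}"
        f"<|ASSISTANT|>"
    )

prompt = build_prompt("What can you help me with?")
# The resulting string is what gets tokenized and fed to the model;
# the model's reply then follows the <|ASSISTANT|> token.
```

A string like this would then be passed to a tokenizer and model (for example via the Hugging Face `transformers` library), with generation stopped at the model's end-of-turn token.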

The StableLM-Alpha models are trained on an expanded version of the Pile dataset containing 1.5 trillion tokens, roughly three times the size of the original Pile.

| Size | StableLM-Base-Alpha | StableLM-Tuned-Alpha | Training Tokens | Parameters    | Web Demo     |
|------|---------------------|----------------------|-----------------|---------------|--------------|
| 7B   | checkpoint          | checkpoint           | 800B            | 7,869,358,080 | Hugging Face |
| 15B  | in progress         | (pending)            |                 |               |              |
| 30B  | in progress         | (pending)            |                 |               |              |
| 65B  | in progress         | (pending)            |                 |               |              |
