LLM-jp-13B is now available

LLM-jp has released “LLM-jp-13B” under an open license. LLM-jp-13B is a 13-billion-parameter large language model pre-trained mainly on Japanese and English text. The instruction-tuning datasets and the model architecture used for pre-training and fine-tuning are also available. Additional datasets and related resources will be released in the future. For details, please see the following page.
