xAI Grok is now open source

Artificial intelligence company xAI today announced that its Grok large language model (LLM) is now open source, making the model available for any person or business to modify or customize.

The open-source release of Grok follows xAI founder Elon Musk's announcement earlier this week that the company would open-source the model.

xAI said it has released Grok-1, a Mixture-of-Experts LLM with 314 billion parameters trained from scratch. That is far larger than its predecessor, Grok-0, which had 33 billion parameters.
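In a Mixture-of-Experts model, a router sends each input through only a few "expert" sub-networks, so only a fraction of the total parameters is active for any one token. The following NumPy sketch illustrates the general top-k routing idea; all names, sizes, and the gating scheme here are illustrative assumptions, not Grok-1's actual architecture.

```python
import numpy as np

# Toy Mixture-of-Experts forward pass (illustrative sketch, not xAI's
# implementation). A router scores each expert for the input, the top-k
# experts run, and their outputs are combined with softmax gate weights.
rng = np.random.default_rng(0)
d_model, n_experts, top_k = 8, 4, 2  # made-up sizes for the example

experts = [rng.standard_normal((d_model, d_model)) for _ in range(n_experts)]
router = rng.standard_normal((d_model, n_experts))

def moe_forward(x):
    scores = x @ router                              # one score per expert
    top = np.argsort(scores)[-top_k:]                # indices of the top-k experts
    gates = np.exp(scores[top] - scores[top].max())
    gates /= gates.sum()                             # softmax over chosen experts
    # Only the chosen experts compute; the rest stay idle.
    return sum(g * (x @ experts[i]) for g, i in zip(gates, top))

out = moe_forward(rng.standard_normal(d_model))
print(out.shape)  # (8,)
```

With `top_k = 2` of 4 experts, only half the expert parameters touch any given input, which is why an MoE model's per-token compute can be far below what its raw parameter count suggests.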

Grok-1 tops open-source competitors such as Meta's Llama 2 (70 billion parameters), Mixtral 8x7B (roughly 47 billion parameters), Abacus AI's Smaug (72 billion parameters), and others.

The company explained that the release is the raw base-model checkpoint from Grok-1's pre-training phase, which wrapped up in October 2023. xAI has not revealed the training data, and the model is not fine-tuned for any specific application, such as dialogue.

xAI said the model was trained using a custom training stack built on top of JAX and Rust.

In November last year, xAI released Grok, its first large language model with generative-text capability. Since then, the company has kept it in early access on social media site X under a paid subscription package.

Grok’s design is inspired by “The Hitchhiker’s Guide to the Galaxy,” the science-fiction franchise created by Douglas Adams.

On GitHub:

xAI is hosting the open-source Grok code on GitHub. The release notes list system requirements, including a machine with enough GPU memory to run the 314-billion-parameter model, and include example code.

The Grok-1 source code is released under the Apache 2.0 license, which permits use, modification, and distribution (including commercial use) of the licensed work. The license applies only to the open-source files in the repository and the Grok-1 model weights.
