The release of GLM-5.1 has been announced via a post on X (formerly Twitter).

The LocalLLaMA developer community hopes this version will be released under an open-source license, which would allow broader access, customization, and deployment in on-premises environments.

For now, no further technical details are available about the model's specifications, performance, or the hardware requirements for inference or training. It remains to be seen whether GLM-5.1 will deliver significant improvements over previous versions and whether it will in fact be released as an open-source project.