Open-source coding models from Z.ai, built on a new base model and specialized for coding and tool calling.
To run the smallest GLM-4.7, you need at least 16 GB of RAM.
GLM-4.7 models support tool use and reasoning, and are available in both GGUF and MLX formats.

GLM-4.7-Flash offers a new option for lightweight deployment that balances performance and efficiency.
Head over to model search within the app (Cmd/Ctrl + Shift + M) and search for GLM 4.7.
Or, from the terminal, run:
lms get zai-org/glm-4.7-flash
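Once the model is downloaded and loaded, it can be reached through LM Studio's local OpenAI-compatible server. As a minimal sketch of exercising the tool-use support mentioned above, the snippet below builds a chat-completions request payload with a hypothetical `get_weather` tool; the endpoint URL and the model identifier are assumptions based on LM Studio's default port and the download name.

```python
import json

# Tool definition in the OpenAI-compatible function-calling format.
# "get_weather" is a hypothetical tool used purely for illustration.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

payload = {
    # Assumed model identifier, mirroring the download name above.
    "model": "zai-org/glm-4.7-flash",
    "messages": [{"role": "user", "content": "What's the weather in Tokyo?"}],
    "tools": tools,
}

# With the local server running (default http://localhost:1234), this
# payload would be POSTed to /v1/chat/completions, e.g. via
# requests.post("http://localhost:1234/v1/chat/completions", json=payload).
print(json.dumps(payload, indent=2))
```

If the model decides to call the tool, the response's `tool_calls` field carries the function name and JSON arguments for your code to execute and feed back as a `tool` role message.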

GLM-4.7 models are provided under the MIT license.