Release Notes
1.15.0
Introduced advanced AI Gateway capabilities (TTS and Function Calling) and significantly improved Application Spaces, community experience, resource management, and observability, with broad performance and stability enhancements.
Inference services now support gradual rollout of new model versions on running instances, with both Canary and Blue-Green deployment strategies available for safe and seamless upgrades. (Enterprise Edition only)
1.14.0
Migrated the core deployment scheduler to Temporal Workflow, greatly improving reliability and scalability, while expanding support for MCP, Jupyter environments, and fine-tuning workflows.
Introduced the XNet Smart Trunk Accelerator, significantly enhancing storage efficiency and the development experience. (Enterprise Edition only)
1.12.0
Strengthened core consistency and scalability with atomic repository creation, automatic runner discovery and cluster auto-scaling, and a new streamable protocol for MCP Space execution.
1.11.0
Fully refactored the Runner service for secure, flexible operation both inside and outside Kubernetes, while improving error handling, i18n notifications, and workflow stability.
1.10.0
Introduced the DataFlow one-click data processing tool, enhanced model inference and synchronization, and delivered full server-side internationalization support.
1.9.0
Added notification services, optimized inference and multi-source synchronization, and added support for Traditional Chinese and ultra-large file uploads.
A new LLM and Prompt configuration management module has been added to the admin console. (Enterprise Edition only)
1.8.0
Added one-click deployment, performance analysis, and multi-language support, making large model management more intelligent and inference more efficient.
Multiple new administrative features have been introduced in the admin console, including an asset dashboard, user tagging, and user consumption reports, enabling more fine-grained asset and user management. (Enterprise Edition only)
1.7.0
Overhauled the MCP architecture, optimized inference services, and enriched model metadata.
1.6.0
Delivered core feature upgrades and performance optimizations, strengthening support for large model inference, fine-tuning, and evaluation, and further improving the inference service and user interaction experience.
1.5.1
Enhanced model inference, space management, the UI, and error handling, improving overall user experience and platform stability.
1.5.0
Expanded inference framework compatibility, added Docker-based Application Space creation, enhanced user information management, and optimized front-end performance, laying a more flexible foundation for enterprise large model management.
1.4.0
Enhanced the tag system, system broadcasting, file-list performance, and dataset preview; improved the admin operation experience; and added support for mainstream large models such as DeepSeek R1, giving enterprise users a more controllable and observable model asset management platform.
1.3.0
Strengthened the tag system, inference engine compatibility, and API capabilities, and significantly improved the user interaction experience along with the stability and test coverage of front-end and back-end code.