News Overview
- Strategic Partnership: Puget Systems has partnered with Comino to offer advanced liquid-cooled multi-GPU servers optimized for artificial intelligence (AI), machine learning, and high-performance computing workloads.
- Comino Grando Server: The collaboration introduces the Comino Grando Server, featuring dual CPUs and support for up to eight GPUs, providing extreme performance, efficiency, and reliability at a more accessible price point.
Original article link: Puget Systems partners with Comino to bring more affordable liquid cooled dual-CPU, 8-GPU systems to the masses
In-Depth Analysis
System Specifications
- High-Density GPU Configuration: The Comino Grando Server supports up to eight GPUs, making it suitable for intensive computing tasks such as deep learning and scientific simulations.
- High-Speed Memory: Equipped with eight 32GB Micron DDR5-5600 modules (256GB total), the server delivers the rapid data handling that AI research and rendering workloads demand.
- Redundant Power Supply: The server features a redundant power supply system with up to four 2000W hot-swap CRPS modules, supporting multiple redundancy modes to maintain continuous operation under demanding scenarios.
Cooling and Efficiency
- Liquid-Cooling System: An integrated liquid-cooling system enables the server to dissipate up to 5.5kW of heat, allowing sustained high performance without thermal throttling (see the power-budget sketch after this list).
- Environmental Compatibility: Designed for versatility, the server operates efficiently in both air-cooled and water-cooled rack environments, handling ambient temperatures up to 40°C. This ensures compatibility with legacy infrastructure and modern energy-efficient data centers.
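To put the power and cooling figures in perspective, here is a minimal back-of-the-envelope sketch using the numbers quoted above. The 3+1 redundancy split is an assumption for illustration only; the article says multiple redundancy modes are supported without specifying one.

```python
# Rough power-and-cooling budget check using figures quoted in this overview.
# Assumption (not from the article): a 3+1 redundant CRPS configuration,
# i.e. one of the four modules is held in reserve.

PSU_MODULES = 4            # hot-swap CRPS power modules
PSU_RATING_W = 2000        # rating per module, in watts
COOLING_CAPACITY_W = 5500  # heat the liquid-cooling loop can dissipate, in watts

# Power deliverable with one module reserved for redundancy (assumed 3+1 mode).
usable_power_w = (PSU_MODULES - 1) * PSU_RATING_W  # 6000 W

print(f"Deliverable power (assumed 3+1 mode): {usable_power_w} W")
print(f"Rated cooling capacity:               {COOLING_CAPACITY_W} W")
print(f"Cooling covers {COOLING_CAPACITY_W / usable_power_w:.0%} of deliverable power")
```

Under that assumption, the liquid loop is rated to remove roughly nine-tenths of the power the remaining supplies can deliver, which is consistent with the claim of sustained full-load operation without thermal throttling.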
Commentary
The collaboration between Puget Systems and Comino signifies a strategic move to make high-performance, liquid-cooled computing solutions more accessible to a broader range of industries. By offering a server capable of supporting dual CPUs and up to eight GPUs, this partnership addresses the growing computational demands in AI research, machine learning, and complex simulations. The inclusion of a robust liquid-cooling system not only enhances performance but also aligns with the industry’s shift towards energy-efficient data center operations. This development is poised to impact sectors reliant on intensive computational power, providing a cost-effective solution without compromising on performance or reliability.