Rapid advances in artificial intelligence (AI) are being enabled by the graphics processing units (GPUs) Nvidia originally designed for the video game market. Organizations seeking to capitalize on the capabilities of Nvidia GPUs must prepare their data centers for extreme power density and the heat that comes with it.

Nvidia revolutionized computer gaming with its GPU. These specialized circuits produce cleaner, faster and smoother motion in video games by performing many mathematical calculations simultaneously.

Then, in 2007, Nvidia moved beyond the gaming market when it pioneered the concept of “GPU-accelerated computing.” GPUs are combined with traditional central processing units (CPUs) in massively parallel processing environments that make compute-intensive programs run faster. That development provided the processing “oomph” required to enable essential AI functions such as deep learning.

Deep learning is a computing model designed to loosely mimic the way the human brain works, with neurons and synapses. Nvidia’s GPUs are used to create so-called “artificial neural networks” that use a large number of highly interconnected nodes working in unison to analyze large datasets. This gives a machine the ability to discover patterns or trends and learn from those discoveries. This is the essence of artificial intelligence.
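
In one common formulation, each node computes an output y = σ(w₁x₁ + w₂x₂ + … + wₙxₙ + b), where the xᵢ are inputs from other nodes, the wᵢ are weights learned during training, b is a bias term and σ is a nonlinear activation function. Training a deep network means adjusting millions of these weights, and the underlying matrix arithmetic is exactly the kind of highly parallel work a GPU is built for.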

Key architectural differences between a CPU and Nvidia’s GPU make this possible. A CPU has a few cores with lots of cache memory and can handle a few software threads at a time; it is optimized for sequential processing, executing instructions in the order they are received. A GPU, by contrast, has hundreds or thousands of smaller cores that can handle many thousands of threads and perform a large number of calculations simultaneously.
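
To make that difference concrete, here is a minimal, illustrative CUDA sketch (names such as saxpy_cpu and saxpy_gpu are ours, not from Nvidia’s materials) of the same multiply-and-add operation written two ways: as a sequential CPU loop, and as a GPU kernel in which each of thousands of lightweight threads updates a single array element in parallel.

    #include <cstdio>
    #include <cuda_runtime.h>

    // Sequential CPU version: one thread walks the array in order.
    void saxpy_cpu(int n, float a, const float *x, float *y) {
        for (int i = 0; i < n; ++i)
            y[i] = a * x[i] + y[i];
    }

    // Parallel GPU version: each thread handles exactly one element.
    __global__ void saxpy_gpu(int n, float a, const float *x, float *y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            y[i] = a * x[i] + y[i];
    }

    int main() {
        const int n = 1 << 20;                       // about one million elements
        float *x, *y;
        cudaMallocManaged(&x, n * sizeof(float));    // unified memory, visible to CPU and GPU
        cudaMallocManaged(&y, n * sizeof(float));
        for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

        // Launch enough 256-thread blocks to cover every element.
        int threads = 256;
        int blocks = (n + threads - 1) / threads;
        saxpy_gpu<<<blocks, threads>>>(n, 3.0f, x, y);
        cudaDeviceSynchronize();

        printf("y[0] = %f\n", y[0]);                 // expect 3 * 1 + 2 = 5
        cudaFree(x);
        cudaFree(y);
        return 0;
    }

The CPU function is shown only for comparison; the single kernel launch spreads the one million updates across thousands of concurrent GPU threads, which is the same pattern deep learning frameworks exploit at far larger scale.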

GPU-accelerated computing can run some software 100 times faster than a CPU alone. That makes it ideal for the deep learning algorithms that power a wide range of AI applications.

GPUs also bring significant challenges to the data center environment. While CPUs have steadily become more energy-efficient, GPUs consume a lot of power. The adoption of GPU-accelerated computing leads to much higher power density in the data center, on the order of 30 kW to 40 kW per rack by some estimates, whereas many hyperscale data centers average only about 10 kW per rack.
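
As a rough illustration with hypothetical numbers: a rack holding ten GPU servers that draw about 3.5 kW each represents 10 × 3.5 kW = 35 kW of electrical load. Essentially all of that electricity ends up as heat, so the cooling system must remove roughly 35 kW × 3,412 BTU/hr per kW, or about 120,000 BTU/hr, from that one rack.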

Power densities of that magnitude mean significantly greater heat loads, which few environments are prepared to handle. Hot-aisle containment is essential, along with in-row cooling systems that focus their capacity on nearby equipment. In-row cooling captures and neutralizes hot exhaust air before it can escape into the data center.

Chilled-water cooling systems are often recommended for GPU-accelerated computing because water has about four times the specific heat capacity of air. Deploying that cooling in-row adds further efficiency by shortening the airflow path and reducing the volume of space to be cooled.
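
For context, water’s specific heat is roughly 4.18 kJ/(kg·K) versus about 1.0 kJ/(kg·K) for air, which is where the “four times” figure comes from. Because the heat a coolant carries is Q = m × c × ΔT (mass moved × specific heat × temperature rise), a given heat load can be removed with a much smaller mass of water than of air.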

Enconnex in-row cooling units give you the flexibility to choose the coolant that best fits your facility. Available in condenser water, chilled water and DX air- and water-cooled configurations, Enconnex in-row cooling units deliver more than 100 kW of cooling capacity yet fit comfortably in any data center environment.

Nvidia GPUs are being used to accelerate hundreds of AI-driven applications in fields such as quantum chemistry, fluid dynamics, video editing and medical imaging. Organizations looking to take advantage of AI must ensure that their data center infrastructures can handle the heat generated by these powerful chips.

Rahi Systems will be exhibiting at the Nvidia GPU Technology Conference, March 26-29 in San Jose, Calif. Stop by Booth #1225 to learn more about our in-row cooling solution and pick up your free giveaway!

Rahi is a Global IT Solutions Provider. We are uniquely capable of combining data center, IT and audio/video solutions to create an integrated environment that drives efficiencies, enhances customer service and creates competitive advantages. We offer a full suite of products in physical infrastructure, storage, compute, networking, power and cooling, and audio/video. In addition, Rahi offers professional and managed services to aid customers in logistics, delivery, set-up, and ongoing support of their technology solutions.
