Are GPUs Really Taking Over CPUs?

Not long ago, selecting a computer involved only two critical components: processor and memory. The computing landscape has since changed, and among many other configuration options there are now important decisions to make beyond CPU processing power: GPUs, ASICs, and FPGAs.

It all started with the CPU.

The first computers were massive, taking up nearly 2,000 square feet, and were built for very specific numerical computing tasks. Beyond their sheer size, another disadvantage was that a program written for one computer would not work on any other.

Central Processing Unit (CPU)

In 1971, everything changed when “the Intel® 4004 became the first general-purpose programmable processor on the market—a ‘building block’ that engineers could purchase and then customize with software to perform different functions in a wide variety of electronic devices.” Programs written for one computer could work on another!

Today, Intel continues to be a leader in the field of microprocessors – we just call them “processors” now – with products such as its Xeon Scalable platform, which has come a long, long way in less than 50 years.

Graphics Processing Unit (GPU)

In 1999, the GPU was introduced to handle graphical display applications, primarily for the computer gaming industry.

The use cases for the GPU have since expanded: today, developers use GPUs for applications well beyond graphics, though the name remains unchanged. A GPU card is no longer only a “gaming card.”

GPUs are presently in use across all industries with applications including high performance computing, rendering and animation, mapping, and self-driving cars. CIARA’s TITAN 2208-G4 can be equipped with up to 8 GPU cards in a 2U form factor for applications in design and simulation, or for hyperscale data center workloads.

The purpose-built architecture of GPUs allows certain calculations to be offloaded from the CPU. This class of calculation is called “single instruction, multiple data” (SIMD): GPUs are great at simple operations applied to very large inputs, whereas CPUs excel at complex operations on small input streams.
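
As a rough illustration of the SIMD pattern, here is a minimal sketch in plain Python with NumPy (purely illustrative, not vendor code): one simple operation is applied to a million values at once, the kind of work a GPU spreads across its hundreds of cores, followed by the same work written as a per-element loop, closer to how a single general-purpose core steps through the data.

```python
import time
import numpy as np

# A large input: one million brightness values, purely illustrative data.
pixels = np.random.rand(1_000_000)

# SIMD-style: one simple instruction applied to every element at once.
# This is the pattern GPUs (and CPU vector units) are built to exploit.
start = time.perf_counter()
adjusted = pixels * 1.2 + 0.05
vectorized_seconds = time.perf_counter() - start

# Scalar-style: the same arithmetic as an explicit per-element loop,
# closer to how a single general-purpose core walks through the data.
start = time.perf_counter()
adjusted_loop = [p * 1.2 + 0.05 for p in pixels]
loop_seconds = time.perf_counter() - start

print(f"vectorized: {vectorized_seconds:.4f}s  loop: {loop_seconds:.4f}s")
```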

Figure 1: A summary of the main architectural differences between a CPU and GPU

Looking at the two architectures, a CPU has multiple cores, while a GPU has hundreds of cores.

The use of GPUs has become so widespread that supply is unable to keep up with demand.

Cryptocurrency miners, for instance, spent $776 million on graphics cards last year according to Jon Peddie Research. Because the calculations required for cryptomining are ideally suited to GPU technology, cryptocurrency mining “rigs” can have upwards of 10 GPU cards installed; however, ASICs can be substituted for GPUs.
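
For context, proof-of-work mining boils down to running the same hash function over and over with different inputs. A toy version is sketched below in plain Python (the function name, difficulty prefix, and block data are made up for illustration). Each attempt is independent of the others, which is exactly why the workload parallelizes so well onto thousands of GPU cores, or onto an ASIC built to do nothing else.

```python
import hashlib

def mine(block_header: str, difficulty_prefix: str = "0000") -> int:
    """Brute-force a nonce until the SHA-256 digest starts with the target prefix.
    Real mining rigs run billions of these independent attempts per second in
    parallel, which is why GPUs and ASICs dominate the workload."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_header}{nonce}".encode()).hexdigest()
        if digest.startswith(difficulty_prefix):
            return nonce
        nonce += 1

print(mine("example block data"))
```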

Application Specific Integrated Circuit (ASIC)

An ASIC is a chip designed and manufactured for a custom application and nothing else. The application can be anything.

The two main advantages of ASICs are (1) reduced size (only the electronics necessary to perform the specific functions are included), and (2) reduced power consumption (fewer components use less power). In fact, the use of ASICs in almost all of our modern electronics is one of the principal reasons for their miniaturization.

Reducing power consumption (and power cost) is important for cryptomining, so some rigs use purpose-built ASICs instead of GPUs in order to execute the algorithms in the most cost- and energy-efficient manner possible.

But there is one major disadvantage: cryptomining ASICs are designed for a single hash algorithm, meaning they can mine only a single cryptocurrency. Changing currencies requires the purchase of a new ASIC, and with over 1,500 cryptocurrencies presently trading, selecting equipment with a dedicated ASIC might be short-sighted.

Field Programmable Gate Array (FPGA)

About 5 years ago, prior to the availability of ASICs, cryptominers started swapping out GPUs for FPGAs in order to increase performance.

An FPGA is a reusable device that allows you to build almost any kind of digital circuit using its configurable logic blocks. Originally used for prototyping – ASICs are prototyped on an FPGA prior to final manufacturing – FPGAs have found their way into different markets because they can be reprogrammed after manufacturing when functionality requirements change.
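
To make “configurable logic blocks” a little more concrete, here is a minimal Python model of the idea (the class and truth tables are hypothetical, not any vendor's toolchain): an FPGA is essentially a grid of small lookup tables plus programmable routing, and reprogramming the device amounts to loading new truth tables into the same hardware.

```python
class LUT2:
    """A 2-input lookup table, the kind of basic block an FPGA wires together.
    'Reprogramming' the FPGA amounts to loading new truth tables and routing."""
    def __init__(self, truth_table):
        # truth_table maps each (a, b) input pair to a 0/1 output.
        self.truth_table = dict(truth_table)

    def __call__(self, a: int, b: int) -> int:
        return self.truth_table[(a, b)]

# Configure the block as an AND gate...
block = LUT2({(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1})
print(block(1, 1))   # 1

# ...then "reprogram" the same block as an XOR gate.
block = LUT2({(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0})
print(block(1, 1))   # 0
```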

Just as the GPU accepts certain offloaded tasks from the CPU, an FPGA can enable the same kind of hardware acceleration.

Where is it all going?

Each of these different chips has its own function. ASICs will continue to be manufactured for custom applications and mobile devices, and the use of FPGAs will expand to other industries.

But what about CPUs and GPUs?

In March, Victor Peng, the CEO of Xilinx, warned us that GPU architecture is not suited for automotive AI applications because “GPUs are not good at latency,” and latency is a “critical requirement in most safety-related real-time applications such as […] automated driving or robotics.”

In order to address the “low-latency demands of workloads such as real-time AI inference and continuous model training,” Intel recommends its “Intel® Xeon® Scalable processors in conjunction with Intel FPGAs” for the best results.

In addition, Intel reminds us: “AI spans a wide range of use cases, deployment models, performance needs, power requirements, and device types. For example, the solution for hyper-scale AI in a cloud data center is vastly different from autonomous driving or real-time quality inspections on an assembly line.”

Certain workloads may shift from GPU-based back to a CPU & FPGA platform in order to address latency concerns, but given the computational specialization of CPUs and of GPUs, and the fact that different workloads require either greater GPU or CPU performance, it’s unlikely we will see a converged CPU+GPU architecture any time soon.

Conclusion

To determine the appropriate hardware for optimal performance, you first need to understand your use cases. Talk to a CIARA specialist for help in selecting a computing platform best suited to your needs.


Odyze Wright
