How GPUs Will Replace CPUs In The Future

VHS defeated Betamax in the 1980s. Blu-ray defeated HD-DVD in the 2000s. Now, GPUs are poised to defeat CPUs.

GPU chips are a relatively new concept, born in the late 1990s. They were originally developed exclusively for graphics, to enhance the 3D imagery of the booming gaming industry. CPU chips, by contrast, have been developed alongside the modern computer industry since the 1960s.

LET’S DIVE DEEP

What’s different inside these two devices that gives them their functions and limitations? I’m glad you asked!

Physically, the two units are quite dissimilar. A GPU chip contains up to thousands of cores, so many streams of data can be processed at the same time. A CPU chip contains far fewer cores backed by a large cache memory, and it pushes individual tasks through those few cores very quickly. These differing structures make each chip better suited to different tasks.

You can think of it this way: a GPU is a burst of energy that can run the fastest 100 meters in the world, while a CPU is a steady runner that can finish a 26-mile marathon without a hitch.

This is important: GPUs might eventually match the memory handling and long-haul steadiness of CPUs, but that will require new multi-threaded programming languages and tools built around a GPU’s parallel processing. This is a key reason we believe GPUs will be able to replace CPUs, but not the other way around.
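To make the contrast concrete, here is a minimal illustrative sketch in CUDA (our own toy example, not drawn from any of the sources below): the GPU version expresses the work as thousands of independent threads, each handling one element, while the CPU version is a plain sequential loop running on a single core.

#include <cuda_runtime.h>

// GPU version: the grid launches one thread per element, so thousands of
// additions can be in flight at the same time across the GPU's many cores.
__global__ void add_vectors_gpu(const float *a, const float *b, float *out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = a[i] + b[i];
}

// CPU version: one fast core walks through the elements one after another,
// leaning on clock speed and cache rather than sheer thread count.
void add_vectors_cpu(const float *a, const float *b, float *out, int n)
{
    for (int i = 0; i < n; ++i)
        out[i] = a[i] + b[i];
}

// A possible launch for one million elements, 256 threads per block:
//   add_vectors_gpu<<<(1000000 + 255) / 256, 256>>>(a, b, out, 1000000);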

Here are some more reasons:

 

SO MUCH DATA, SO LITTLE TIME

A year ago, we were already creating 2.5 exabytes of data each day. Look at this compelling graphic from Northeastern University to see how many downloaded songs or Libraries of Congress are equivalent to 2.5 exabytes!

The thousands of cores inside a GPU are designed to process many pieces of information at once, which makes the GPU potentially far more efficient than a CPU at churning through big data.
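As a hypothetical illustration of that idea (again a CUDA sketch of our own, with made-up names), the kernel below counts how many readings in a huge array exceed a threshold; a fixed grid of threads strides across the data, so millions of records are examined in parallel rather than one at a time.

#include <cuda_runtime.h>

// Hypothetical big-data task: count how many values in a very large array
// exceed a threshold. A grid-stride loop lets a fixed number of threads
// sweep an arbitrarily large data set, many elements at a time.
__global__ void count_above(const float *data, size_t n, float threshold,
                            unsigned long long *count)
{
    size_t stride = (size_t)gridDim.x * blockDim.x;
    for (size_t i = (size_t)blockIdx.x * blockDim.x + threadIdx.x; i < n; i += stride)
        if (data[i] > threshold)
            atomicAdd(count, 1ULL);   // each thread folds its hits into one shared total
}

// A possible launch over a billion readings:
//   count_above<<<4096, 256>>>(device_data, 1000000000ULL, 42.0f, device_count);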

To be fair, a highly cited study by Intel researchers found that GPUs may have been overrated in their ability to outperform CPUs. The study, “Debunking the 100X GPU vs. CPU Myth,” found that GPUs are, on average, only about 2.5x faster than CPUs at processing data. This is far less than the commonly held belief that GPUs are up to a hundred or even a thousand times faster.

But 2.5 times faster is still faster, right? So even if GPUs have been overrated, they still look like a better bet than CPUs. For a mind-blowing and memorable example of the difference in speed, watch this demonstration by Mythbusters’ Adam Savage and Jamie Hyneman, sponsored by visual computing leader Nvidia.

What will we be doing with all this data? That brings us to...

 

COMMODIFYING ARTIFICIAL INTELLIGENCE

 

A.I. was all over the news last year, largely because its practices are not yet standardized, so every company working on an A.I. project is constantly inventing new components and uses. Companies have been experimenting with A.I. programming in everything from fitness trackers to self-driving cars to…robots, of course!

A.I. comes into the GPU/CPU debate because we’re learning how to build machines that increasingly engage with the human world, rather than vice versa. For example, do you remember when you had to sit down at a certain desk to use your enormous boxy computer and monitor? That was the age when we went to the computers; now the computers are coming to us.

Nvidia, a leading developer of A.I., posted an article on the difference between a CPU and a GPU. Nvidia opened by describing the CPU as the brain of the computer and the GPU as its soul. This poetic phrase indicates how crucial the GPU will be to the future development of A.I.: when computers come into our world instead of the other way around, we need them to process the complexity and immediacy of real life, a job the steady, step-by-step CPU is not built for.

IS IT ALL OVER FOR THE CPU?

This thoughtful article from the Data Center Journal makes a compelling case for the current benefits of the CPU, explaining, “One of the main challenges of using the GPU is enabling programmers to tap into its capabilities.” Technology changes quickly, but GPUs are still quite new compared to CPUs, which have been in development since the Kennedy administration, and most programming is still optimized for CPUs.
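To show what that challenge looks like in practice, here is a minimal CUDA sketch (our own toy example): even a trivial “multiply every value by two” job forces the programmer to manage a separate memory space, data transfers, and a launch configuration, where the CPU equivalent is a one-line loop.

#include <cuda_runtime.h>
#include <vector>

// Trivial kernel: scale every element by a constant factor.
__global__ void scale(float *x, int n, float factor)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        x[i] *= factor;
}

int main()
{
    const int n = 1 << 20;                 // about a million values
    std::vector<float> host(n, 1.0f);

    // On a CPU the whole job is one line:  for (float &v : host) v *= 2.0f;
    // On a GPU the programmer also has to manage a second memory space:
    float *device = nullptr;
    cudaMalloc(&device, n * sizeof(float));                                     // 1. allocate GPU memory
    cudaMemcpy(device, host.data(), n * sizeof(float), cudaMemcpyHostToDevice); // 2. copy the data over
    scale<<<(n + 255) / 256, 256>>>(device, n, 2.0f);                           // 3. launch the kernel
    cudaMemcpy(host.data(), device, n * sizeof(float), cudaMemcpyDeviceToHost); // 4. copy the results back
    cudaFree(device);                                                           // 5. clean up
    return 0;
}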

Overall, GPUs have a significant place in the future of computing. However, their development, and the development of the computing infrastructure that will support them, still has a way to go before CPUs become a thing of the past.

 

SOURCES

Look at this compelling graphic: http://www.northeastern.edu/levelblog/wp-content/uploads/2016/05/data.png

“Debunking the 100X GPU vs. CPU Myth”: https://www.cis.upenn.edu/~devietti/classes/cis601-spring2017/slides/debunking-100x-gpu-myth.pdf

Watch this demonstration: https://www.youtube.com/watch?v=ZrJeYFxpUyQ

Nvidia article: https://blogs.nvidia.com/blog/2009/12/16/whats-the-difference-between-a-cpu-and-a-gpu/

This thoughtful article: http://www.datacenterjournal.com/gpu-replace-cpu/

 
