A Copy of my final for one of my classes






Abstract


This paper presents a brief history of graphics processing units (GPUs) and how they transformed from their early days in the late ’80s into what we consider a GPU today. It covers how they have changed the tech world, developed alongside the broader industry into a much wider field, and, through constant evolution, given rise to new technologies.






Intro


Graphics are a fundamental building block and a vital component of our everyday tech life. Initially, simple visuals were just lines drawn on our screens. (Pong, anyone? Ha-ha.)


But behind all the computer graphics and visuals flying across your screen, there has been a constant, ever-changing development of graphics processing. That development would lead the way to 3D graphics, the modern GPU, and now even the fuel behind AI (artificial intelligence) and deep learning computing.

A brief history of graphics processing units (GPUs) will be presented in this paper.





The Origins of Graphics Cards








Before the big push for 3D rendering and PC gaming in the early 1990s, the colloquial term GPU as we know it now did not exist. In fact, the term was coined by NVIDIA, and some even refer to it as the “gaming processing unit,” stemming from gaming’s explosion in the late 1990s.





It is difficult to decide where to begin with the history of GPUs (graphics processing units), but the 1980s is certainly a logical place to start. Arcade machines often take the first credit, as they had onboard graphics chips and the budgets behind them to push the technology as far as it could go. The same goes for the onboard graphics chips and microprocessors in the Commodore Amiga and the IBM 8514. These were available in the 1980s and were intended to display custom 2D graphics, but they paved the way for future developments in visuals and computing.








The topic of graphics was on everyone's mind, and other companies, including IBM, tried to improve the clarity of both images and video. IBM began integrating color and monochrome display adapters into its hardware in 1981, allowing video to be displayed. Shortly after, Intel released the iSBX 275 Video Graphics Controller Multimodule Board, which could display eight colors at a resolution of 256x256, or monochrome at 512x512. (Shadow, 2018)





ATI Technologies (later acquired by AMD) stepped in with color emulation cards such as the EGA 800, a 16-color VGA emulator, and continued to expand its product lines into the next decade as the race for graphics began.





Around this time, many new companies and products began to appear. Among them were Trident, SiS, Tamerack, Realtek, Oak Technology, LSI's G-2 Inc., Hualon, Cornerstone Imaging, and Winbond, all formed in 1986-87. Meanwhile, companies such as AMD, Western Digital/Paradise Systems, Intergraph, Cirrus Logic, Texas Instruments, Gemini, and Genoa would produce their first graphics products during this timeframe as well. (Singer G, 2021)





Over the next decade, real-time 2D/3D graphics became immensely popular, and 3D graphics became the mass-market goal. As a result, companies like 3DLabs, Dynamic Pictures, 3dfx, ARK Logic, and Rendition began competing to build the world's most powerful video cards. With newcomers like NVIDIA and Matrox taking on established competitors like ATI, the market grew rapidly.








GPUs, as we know them today, are hardware with the ability to offload and process hardware-accelerated 3D rendering separately from the CPU, not just 2D rendering in software. The requirements for 3D hardware-accelerated cards were quite different.


There were two defining moments. The first was 3dfx, a company whose first 3D graphics cards, the Voodoo 1 and Voodoo 2, quickly became market leaders and led innovation for years. In spite of its 2D limitations, the Voodoo line proved a tremendous success and launched 3dfx into the world of PC gaming. The second was the introduction of the NVIDIA Riva 128, which doubled the Voodoo 1's initial specs with a 100 MHz core/memory clock and four megabytes of SGRAM. For NVIDIA, it was the first product to gain significant market share. (Ridley J, 2020)


Yet towards the end of the ’90s, market volatility forced several graphics/display card companies to leave the business or be acquired by peers. Two names would start to lead the way to our current concept of GPUs: ATI and NVIDIA.








The GPU is Born







So, hold the phone: so far, a lot of tech history and jargon has been thrown around, and we still have not fully clarified what a GPU really is, per se. Where are we drawing the line?



So, one definition is: A graphics processing unit (GPU) is a dedicated parallel processor optimized for accelerating graphical computations. (McClanahan C, 2010)


But to fully understand this, we need some basic context. The CPU is the central processing unit of the system or PC. CPUs have a certain number of cores, each of which processes one computation at a time; this is called serial processing.


A GPU is a distinct, special-purpose microprocessor, separate from the CPU, that has many more cores than a CPU, and those cores process data computations in parallel, meaning many computations all at once. GPUs do this very well. (For those screaming about integrated graphics: we will cover that later.)


In a nutshell: a GPU is a separate microprocessor that handles computations in parallel, usually for visually intensive workloads.
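To make that serial-versus-parallel distinction concrete, here is a minimal sketch in CUDA C++ (my own illustration, assuming an NVIDIA GPU and the CUDA toolkit are available; it is not tied to any specific product discussed in this paper). It adds two arrays both ways: the CPU loop touches one element at a time, while the GPU kernel spreads the same work across thousands of threads.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Serial: a single CPU core walks the array one element at a time.
void add_serial(const float* a, const float* b, float* out, int n) {
    for (int i = 0; i < n; ++i) out[i] = a[i] + b[i];
}

// Parallel: each GPU thread handles exactly one element.
__global__ void add_parallel(const float* a, const float* b, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                     // about one million elements
    float *a, *b, *out;
    cudaMallocManaged(&a, n * sizeof(float));  // unified memory keeps the demo short
    cudaMallocManaged(&b, n * sizeof(float));
    cudaMallocManaged(&out, n * sizeof(float));
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    add_parallel<<<blocks, threads>>>(a, b, out, n);  // thousands of threads at once
    cudaDeviceSynchronize();

    printf("out[0] = %.1f\n", out[0]);         // expect 3.0
    cudaFree(a); cudaFree(b); cudaFree(out);
    return 0;
}
```

Compiled with nvcc, both paths compute identical results; the only difference is how many elements are being worked on at any given instant.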


Now that that is out of the way: even though other manufacturers stuck around into the mid-2000s, like 3Dlabs with their Wildcat video cards and other oddball accelerated video cards, the first real GPU (by the current definition) did not arrive until 1999 with the GeForce 256.


Terms had been interchangeable up until this point: AGP card (Accelerated Graphics Port card), VPU (visual processing unit), graphics card, video card, video accelerator card, video adapter card; you name it. With the rise of gaming and the popularity of 3D video cards, paired with motherboard manufacturers adding a dedicated slot known as the AGP (a slot on the motherboard that provided a direct connection between the card, the CPU, and memory), and coinciding with NVIDIA's branding of the GPU (graphics processing unit), things started to solidify, with the nomenclature at least, ha-ha.


Now, to make things even more confusing, there is a crucial de facto separation between what we call the GPU of today and the GPU before NVIDIA's GeForce 256: the introduction of T&L.


The GeForce 256 was the first real GPU, as it featured transform, clipping, and lighting (T&L or TCL) in hardware. This offloads most of the 3D work previously handled by the CPU (central processing unit) to the GPU, allowing for more complex 3D environments with higher polygon counts and improved lighting, while freeing the processor to handle other tasks. (Computer Hope, 2018)
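To give a rough feel for the math being offloaded, below is an illustrative CUDA C++ kernel (my own sketch, not how the GeForce 256's fixed-function T&L hardware was actually programmed; in 1999 this work was requested through DirectX 7 or OpenGL rather than written by hand). It applies a 4x4 transform matrix to every vertex and computes a simple clamped diffuse lighting term, one thread per vertex, and would be launched the same way as the kernel in the earlier example.

```cuda
#include <cuda_runtime.h>

struct Vec3 { float x, y, z; };

// Illustrative per-vertex "transform and lighting": multiply each vertex
// position by a row-major 4x4 matrix and compute a clamped diffuse term
// from the vertex normal and a directional light. Fixed-function T&L
// hardware performed this kind of arithmetic for every vertex, every frame.
__global__ void transform_and_light(const Vec3* pos, const Vec3* nrm,
                                    const float* m,      // 16 floats, row-major
                                    Vec3 light_dir,      // assumed normalized
                                    Vec3* out_pos, float* out_diffuse, int count) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= count) return;

    Vec3 p = pos[i];
    // Transform: treat the position as (x, y, z, 1) and apply the matrix.
    out_pos[i].x = m[0]*p.x + m[1]*p.y + m[2]*p.z  + m[3];
    out_pos[i].y = m[4]*p.x + m[5]*p.y + m[6]*p.z  + m[7];
    out_pos[i].z = m[8]*p.x + m[9]*p.y + m[10]*p.z + m[11];

    // Lighting: simple Lambertian diffuse, clamped at zero.
    Vec3 nv = nrm[i];
    float d = nv.x*light_dir.x + nv.y*light_dir.y + nv.z*light_dir.z;
    out_diffuse[i] = d > 0.0f ? d : 0.0f;
}
```

Doing this for hundreds of thousands of vertices per frame is exactly the kind of repetitive arithmetic that made a dedicated, parallel chip worthwhile.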


The GeForce 256's hardware T&L is really where a pin can be dropped on the timeline. When it first came out, many were skeptical, since not a lot of games or software could take advantage of the hardware, and either DirectX 7's hardware T&L engine or the OpenGL implementation (both application programming interfaces, or APIs) had to be used. But it set the stage for the next 21+ years, and now these APIs and hardware-based T&L are pretty much the standard. (Shimpi, A. L, 1999)









Figure 1. (Shimpi, A. L, 1999)






This was sort of the defining factor of what we now call a modern GPU. Previously, computer graphics were rendered in software on the CPU, or through special proprietary graphics-accelerated video cards that only handled the tail end of rendering. But as stated before, having a separate chip/microprocessor dedicated to processing video computations in parallel is really the main milestone here.


These early GPU-like cards implemented only the rendering stage in hardware, requiring the CPU to generate triangles for the GPU to operate on, offloading only some of the workload.





Further improvements in GPU technology have enabled more stages of the rendering pipeline to be implemented in hardware on the GPU, freeing up CPU cycles. (McClanahan C, 2010)





Gaming became the major factor that created such huge demand for GPUs and for 3D graphics at home, along with other workloads like CAD (computer-aided design) rendering. With the help of T&L, this became possible and easier, and as the 2000s moved forward, most companies in the market adopted hardware-based T&L.






ATI/AMD vs NVIDIA






Even further consolidation occurred in the graphics industry at the turn of the century, with ATI and NVIDIA moving forward and continuing to rise as the top dogs in the game, their GeForce 2 GTS and ATI Radeon DDR cards crushing the competition.


On the other hand, companies like 3dfx, despite filing for bankruptcy in 2002, still contributed a great deal to the gaming industry and to graphics technology.


The T-buffer, introduced in the Voodoo 5 as an alternative to transform and lighting, worked by aggregating a few rendered frames into one image. The result was a slightly blurred image that, when viewed in frame order, smooths out the animation's motion. This influenced future anti-aliasing technology. But the swan song was about to be sung for 3dfx.


At the start of 2001, PC graphics were dominated by a duopoly of ATI and NVIDIA, with Intel supplying the vast majority of integrated graphics chipsets (which, again, we will talk about later). (Singer G, 2021)






ATI was acquired by Advanced Micro Devices on July 24, 2006, in a joint announcement, giving us the current common names in this GPU war: AMD and NVIDIA. In the years since, both companies have tried to top each other's product lines, back and forth. Around 2004, motherboard companies also introduced PCI Express, replacing the outdated AGP ports on most motherboards. Additionally, ATI/AMD's CrossFire technology and NVIDIA's SLI technology entered the picture, resulting in further innovation and expansion, since they brought support for more than one GPU on the same motherboard. Also around this time, in 2006, NVIDIA launched its 8800 GTX, which had a unified shader architecture that could handle many kinds of effects, shaders clocked faster than the processing core, and stream processors that allowed tasks to be parallelized for better efficiency. More power-hungry and more powerful GPUs continued down the pike in the years to come.


These computing technologies led the way not just for gaming performance but also for more general computing tasks that were previously handled only by CPUs. This would later be built upon for other purposes like AI, scientific research, simulations, and even cryptocurrency mining and hashing.


AMD and NVIDIA are still the two leading companies, but each has found its own niche. NVIDIA is known for supplying high-end GPUs and is helping to power AI with its data-center computing. AMD, being a GPU, APU, and CPU company, delivers mainstream products everywhere, and its APUs took the win for the current and last-gen gaming consoles, which run AMD-powered graphics chipsets.





APUs / Integrated Graphics Units






So, this is where I feel it should be addressed, but it's kind of another grey area. APUs, integrated graphics, or iGPUs, however you want to describe them, are all built-in graphics processing chips on the CPU. These are how a lot of laptops, lower-power-draw PCs, and computers with non-intensive workloads process visuals and graphics, as well as the whole mobile phone and tablet market. The biggest issue with having a graphics processing unit on the same chip/die as the CPU is that it is difficult: it causes more heat, so more cooling is needed, and it's just not the same as having a discrete GPU as we've come to know it for tasks like gaming, video editing, or other heavy GPU workloads.


Integrated graphics also have to share your system's memory instead of having their own VRAM (video random access memory), like the GDDR or HBM on modern GPUs.


This leads to limits on resolution and texture quality. Integrated graphics holds a huge share of the market but does not offer the same performance as a discrete GPU.


It's also, again, a grey area: normally, APUs are considered to have a GPU built inside the CPU, acting as a separate processor similar to a discrete video card, just with shared memory and on the same chip. Yet traditional integrated graphics are built into the CPU itself, meaning the computations are not improved by the same ability to utilize parallel processing.


That being said, over the last few years integrated graphics and APUs have come a long way as software and manufacturing have gotten much more sophisticated, with products like AMD's Vega 8 APUs, as well as countless others in the mobile and handheld markets, pushing onboard graphics forward. Interestingly enough, the drive to make better iGPUs (integrated graphics processing units) has even pushed Intel to break out into the discrete GPU world with its new Intel Iris Xe DG1 (which is essentially an iGPU on its own PCB), thus making it a GPU, not an iGPU. Weird, right? And it may finally add another true competitor to the AMD vs NVIDIA GPU war that continues to rage on.


Fun fact: Intel did make discrete video cards in the past. It licensed a graphics chip (the µPD7220) from NEC Information Systems, and in June 1983, Intel brought out the 82720, a clone of the µPD7220; it rolled out its iSBX 275 Multibus-based add-in graphics board (AIB) with the chip later that year. Intel continued to offer the product up to 1986. (Peddie J, 2021)


APUs also power the gaming consoles, a market which, as I mentioned before, AMD took control of: the Xbox One, Xbox Series X, and PS4/PS5 all use AMD's Radeon technology for their integrated graphics.






Modern GPUs & AI Cloud








In recent years, cloud computing, virtualization, and the infrastructure and technology behind them have become increasingly popular, and the benefits of GPUs, with their ability to process information in parallel, have forever changed the field.






The GPU evolved as a complement to its close cousin, the CPU (central processing unit). While CPUs have continued to improve performance through architectural innovations, faster clock speeds, and core additions, GPUs are specifically designed to accelerate computer graphics workloads. (Intel, 2021)



Today the GPU is used in a much broader sense than just video game graphics. Cloud computing, deep learning, machine learning, and AI computations are all areas where it is heavily utilized, and some GPUs are now even built for specialized applications. The ability to use thousands of GPUs together, with hundreds of thousands of GPU cores all able to be optimized for certain tasks, has really changed the way computing works.
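Much of that deep learning and AI work boils down to dense linear algebra, which maps naturally onto thousands of GPU cores. As a rough illustration only (real workloads rely on tuned libraries such as cuBLAS and cuDNN rather than hand-written kernels like this one), a naive CUDA C++ matrix multiply looks like the sketch below, with one thread producing one output element.

```cuda
// Naive GPU matrix multiply, C = A * B, for square n x n matrices stored
// row-major. One thread computes one element of C. This is a teaching
// sketch of why matrix math maps so well onto thousands of GPU cores,
// not how optimized deep learning libraries implement it.
__global__ void matmul(const float* A, const float* B, float* C, int n) {
    int row = blockIdx.y * blockDim.y + threadIdx.y;
    int col = blockIdx.x * blockDim.x + threadIdx.x;
    if (row < n && col < n) {
        float sum = 0.0f;
        for (int k = 0; k < n; ++k)
            sum += A[row * n + k] * B[k * n + col];
        C[row * n + col] = sum;
    }
}
```

Deep learning frameworks dispatch enormous amounts of work of exactly this shape across one or many GPUs, which is why the data center has become such a natural home for them.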


NVIDIA GPUs were deployed in 97.4 percent of AI accelerator instances – hardware used to boost processing speeds – at the top four cloud providers: AWS, Google, Alibaba, and Azure. It commands “nearly 100 percent” of the market for training AI algorithms, says Karl Freund, an analyst at Cambrian AI Research. Nearly 70 percent of the top 500 supercomputers use their GPUs. Virtually all AI milestones have happened on NVIDIA hardware. (Kobie N, 2021)



Their use is no longer limited to graphics-intensive video games; it has come a long way and is only increasing.


It should also be noted that thermal solutions for today's GPUs, in terms of cooling and heat dissipation, have advanced tremendously. Each year, new options and ideas are developed, whether that's beefy triple-fan air coolers or full-on water blocks added to NVIDIA's newest RTX 30-series cards and AMD's RDNA 2 Radeon GPUs. Thermal concerns affect us a lot more now because GPUs produce a lot more heat and consume a lot more power than cards of decades past.




Conclusion




The GPU has come a long way, and the lines have sometimes blurred, whether you are talking about a discrete graphics card, a smartphone with a built-in GPU, or even an entire data center packed with rows and rows of GPUs running AI tasks with no visual output at all. When it boils down to it, GPUs began as secondary chips to help the CPU process visual information and have now changed the way computing is handled and even the way programs are written. GPUs have become a keystone of our technology. Whether it's a physical graphics card or one built into a device, the only constant has been pushing technological innovation. Keep in mind that the only constant is change and that change is inevitable. Always remember: our technology is built on the shoulders of giants.






References



Neuralmagic. (2021, March 2). A brief history of gpus - neural magic blog. Neural Magic - No Hardware AI. Retrieved September 20, 2021, from https://neuralmagic.com/blog/history-gpus/#:~:text=Graphics%20processing%20units%20(GPUs)%20were,longer%20and%20more%20complex%20history.
























RichReport. (2011, December 16). GPU computing: Past, present & future. YouTube. Retrieved September 20, 2021, from https://www.youtube.com/watch?v=3OX_tqGGgfQ.








Hemsoth, N., Feldman, M., & Morgan, T. P. (2018, March 26). The future of programming GPU supercomputers. The Next Platform. Retrieved September 20, 2021, from https://www.nextplatform.com/2018/03/26/the-future-of-programming-gpu-supercomputers/.





McClanahan, C. (2010). History and evolution of GPU architecture. Retrieved from http://www.mathcs.emory.edu/~cheung/Courses/355/Syllabus/94-CUDA/Docs/gpu-hist-paper.pdf.





Chu, W. (2016, November 3). 22 game changing video Cards, 1981-2015 - Hardboiled. Smart Buyer. Retrieved September 20, 2021, from https://www.neweggbusiness.com/smartbuyer/components/22-game-changing-video-cards-1981-2015/.





Ridley, J. (2020, December 30). The most iconic, game-changing graphics cards ever made. pcgamer. Retrieved September 20, 2021, from https://www.pcgamer.com/most-important-graphics-cards-history/.








Graphic card evolution timeline. (n.d.). Timetoast Timelines. Retrieved September 20, 2021, from https://www.timetoast.com/timelines/la-evolucion-de-los-teclados-bd119b48-9cd2-4b56-a4e8-4641b3396dbe.











XOTIC PC. (n.d.). History of GPUs. Retrieved September 20, 2021, from https://xoticpc.com/blogs/news/history-of-gpus.








Tedium. (2021, March 26). GPU nomenclature history: No shortage of GPUs here. Retrieved September 20, 2021, from https://tedium.co/2021/03/26/gpu-technology-history/.











Kobie, N. (2021, June 16). NVIDIA and the battle for the future of AI chips. WIRED UK. Retrieved September 20, 2021, from https://www.wired.co.uk/article/nvidia-ai-chips.








The Gamer. (2020, October 14). The evolution of the graphics card. YouTube. Retrieved September 20, 2021, from https://www.youtube.com/watch?v=TB9RETj6ZCw.











Singer, G. (2021). The history of the modern graphics processor. TechSpot. Retrieved September 20, 2021, from https://www.techspot.com/article/650-history-of-the-gpu/#part-one.





PCMag. (n.d.). Definition of T&L. Retrieved September 21, 2021, from https://www.pcmag.com/encyclopedia/term/tl.








Shimpi, A. L. (1999, October 11). NVIDIA GeForce 256 Part 1: To buy or not to buy. AnandTech. Retrieved September 21, 2021, from https://www.anandtech.com/show/391/5.








Bardwell, T. (2020, November 11). APU vs CPU vs GPU - What's the Difference? [SIMPLE GUIDE]. GamingScan. Retrieved September 21, 2021, from https://www.gamingscan.com/apu-vs-cpu-vs-gpu/.








Computer Hope. (2018, November 13). What is T&L (transform, clipping and lighting)? Retrieved September 21, 2021, from https://www.computerhope.com/jargon/t/tandl.htm.
















IBM Cloud. (2019, March 20). GPUs: Explained [Video]. YouTube. Retrieved September 25, 2021, from https://www.youtube.com/watch?v=LfdK-v0SbGI.





Peddie, J. (2021). Famous graphics chips: Intel's GPU history. IEEE Computer Society. Retrieved September 25, 2021, from https://www.computer.org/publications/tech-news/chasing-pixels/intels-gpu-history.



















Shadow. (2018). The history of gaming: The evolution of GPUs. Retrieved September 15, 2021, from https://shadow.tech/en-gb/blog/gaming/history-of-gaming-gpus.





Intel. (n.d.). What is a GPU? graphics processing units defined. Intel. Retrieved October 20, 2021, from https://www.intel.com/content/www/us/en/products/docs/processors/what-is-a-gpu.html.

