Modern computer graphics featuring high-resolution smoke, detailed shadows, and realistic explosions have become the norm in recent decades. The gaming and film industries invest billions of dollars in developing products that utilize state-of-the-art graphics. As a result, graphics cards are now among the most expensive and valuable components of gaming consoles and personal computers. Yet computer graphics technology required decades of development to reach this point.
The history of computer graphics started with cathode ray tubes (CRTs). Although such displays are now very hard to find, having been replaced by LCD and OLED technology, their influence on the development of computer graphics is significant. Color CRTs use electron guns that fire at different phosphors to display images on a leaded glass screen (Sawicki & Moody, 2020). However, the technology was potentially harmful to humans, as the electron beams emitted high-energy radiation that could cause various adverse side effects.
CRTs were examples of analog displays, which could not store much data compared to digital technology. One early example of analog computer graphics can be seen in the game Spacewar!, which was developed at the Massachusetts Institute of Technology. The game was first written in 1962 for the PDP-1, which utilized 2,700 transistors, compared with the more than 1 billion transistors found in modern processors (Sawicki & Moody, 2020). The development of digital graphics technology was therefore the next step.
Digital displays utilized intermittent pulses of data and could transmit more data per second. As computers developed, they enabled more sophisticated computer graphics technology. For example, it became possible to create 3D images such as those seen in the Dire Straits music video “Money for Nothing” (Sawicki & Moody, 2020). Hence, the entertainment industry was one of the main contributors to the development of computer graphics. The film industry is no exception, as Pixar has long utilized CGI in its movies. For example, one landmark 3D animated film was Pixar’s “Toy Story” (1995), which utilized state-of-the-art computer graphics for its time.
In the gaming industry, the graphics revolution had a massive impact. During the 1980s, most games were two-dimensional and allowed little variation in genre, as most were arcade titles. However, games such as Doom and Quake introduced 3D technology to gaming and made 2D games seem obsolete. As a result, more gaming companies invested in the development of 3D technology and 3D games, as it provided an important competitive advantage over their rivals (Sawicki & Moody, 2020). Because innovation was rewarded financially, tech companies focused heavily on computer graphics, which drove rapid and substantial progress.
Even now, state-of-the-art computer graphics remain inaccessible to many viewers and gamers. Therefore, technologies that bring computer graphics to platforms such as web browsers began to emerge. One such development was a library that allows programmers and developers to integrate interactive 3D graphics into web browsers: WebGL, created fairly recently, in 2011 (Angel & Shreiner, 2014). It made 3D computer graphics more accessible and made it easier to integrate complex graphics into web pages. Given the variety of user devices, such as tablets, smartphones, and laptops, the WebGL library remains a relevant tool for enhancing the user experience with computer graphics.
References
Angel, E., & Shreiner, D. (2014). Interactive computer graphics with WebGL. Addison-Wesley Professional.
Sawicki, M., & Moody, J. (2020). A brief history of computer graphics. In Filming the fantastic with virtual technology (pp. 21–58). Routledge.