Computer Components in the Future


Technological innovation is always at the forefront of computer component technology, with advances in production and development resulting in faster, better and lighter products (Bursky, 26). These changes can be seen in the improved memory capacities of today's hard disk systems, the increased processing power of processors, new innovations in disk drive technology that enable greater media storage capacities, as well as new technologies enabling better component cooling (Bursky, 26). It must be noted, though, that not all technological innovations are inherently adoptable by the general population. For example, during the late 1990s and early 2000s one of the latest innovations in external storage was the ZIP drive. At the time it was thought of as a revolutionary concept in external storage, yet it was never truly adopted by the general population due to subsequent advances in drive technology that enabled people to burn information onto CDs.

Nicholas Carr, in his article "IT Doesn't Matter", which examines the use of technologies and their implications for society, explains that technologies become cheaper and achieve widespread use only once they reach their build out completion. The term "build out completion" refers to a point in technological development wherein a type of technology has already reached commercial viability and can be effectively replicated and mass produced. Carr explains that so long as certain forms of technology have not reached build out completion, they will most likely never be adopted due to their prohibitive costs and the uncertainty attached to the technology itself. This lesson can be seen in the case of ZIP drive technology, where the uncertainty behind its use led to it never being adequately adopted by the general population. Based on this, it can be assumed that not all technological innovations will actually be adopted by the general population, and this includes several of the new innovations currently being released on the market today. For example, the advent of 3D computer screens is heralded by many as a possible new standard for computer viewing, yet industry data shows that not only is its usage unwieldy for the average user, it serves no purpose for normal computer tasks such as using word processing software or browsing the internet.

When trying to determine what the future holds for computer component technology, the manufacturing process itself should also be taken into consideration. Lately, various consumer report groups have stated that certain PC components have increasingly been found to be designed to eventually break down due to inferior materials being used. Such is indeed the case for certain components whose operational lives are limited to only a few years and which are not meant to last more than 4 years of continuous usage at most. The reason behind this lies with changes in the method of production: components are no longer being built to last but rather are being built around the current pace of innovation and consumer demand for cheap components. While current trends in various computer innovations may seem like the future of computer component technology, it should not be assumed that they will actually attain general utilization, since various factors such as consumer adaptability and build out completion need to be taken into account.

Future Computer Components and Their Durability

When comparing computer components made recently with those constructed 12 years ago, it can be seen that older parts are bulkier, slower and of course less advanced than recent creations, yet for some reason older computer parts seem to have a longer operational timeline than some of the newer parts created. Operational timelines refer to the length of time a particular component is expected to work under normal operational conditions. On average this can range from 3 to 5 years, depending on the rate of usage, before the parts begin to cease functioning. Yet various studies examining the durability of computer parts constructed in 1998 show that components used back then apparently still worked as late as 2012. While it can be argued that those particular components do not endure the same type of "punishment" components today undergo, the fact remains that the operational timelines of components constructed within the past year are apparently getting shorter and shorter, with some components lasting only 2 to 3 years before problems begin to occur. This rather strange phenomenon brings up an intriguing question: if advances in technology are supposed to make components better, why are they breaking down sooner than parts constructed in previous generations?

An examination of older computer parts shows that a majority of the components are far heavier than recently constructed parts. While it may be true that as technologies improve components get lighter, further examination reveals that older model components seem to have a far sturdier basis for construction than recently created parts. This can range from using heavier plastics and stronger silicon to using more metal in the construction of the parts themselves, which translates into greater durability over the long term compared to parts that use cheaper metals and lighter materials. Computer components today are made of far lighter materials, which does translate into better heat and electrical conductivity, but this also makes them far more prone to breaking down than parts constructed out of heavier materials. The fact is that computer components created 10 years ago were built to last significantly longer than parts today. Indeed, various studies examining advances in computer component technology reveal that several parts manufacturers actually build their components to eventually break down. The reasoning behind this lies with the current rate of technological innovation: building parts to last does not make as much sense as it used to, since components are replaced with newer models and types on an almost yearly basis. More durable components translate into greater production costs, which results in higher component prices. With competition in the component industry being determined by who can produce the latest type of product at the lowest price, it is not competitively feasible to sell components at a higher cost on the basis that they are more durable. In effect, durability no longer holds as much sway as it used to, due to the current fast pace of technological innovation, which almost ensures that companies and private individuals replace their various computer components before they break down.
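To make the notion of a shrinking operational timeline concrete, the following minimal sketch models component survival under a simple exponential failure assumption. The mean lifetimes used here are invented for illustration and are not figures from the studies cited above.

```python
import math

def survival_probability(years, mean_lifetime):
    """P(component still works after `years`) under an exponential failure model."""
    return math.exp(-years / mean_lifetime)

# Hypothetical mean lifetimes in years -- illustrative assumptions only.
OLDER_PART = 10.0  # sturdier late-1990s construction
NEWER_PART = 3.0   # lighter, cheaper modern construction

for years in (2, 4, 8):
    p_old = survival_probability(years, OLDER_PART)
    p_new = survival_probability(years, NEWER_PART)
    print(f"After {years} years: older part {p_old:.0%} vs newer part {p_new:.0%}")
```

Even this toy model shows how quickly expected survival diverges: a part with a 3-year mean lifetime has well under half the chance of reaching year 4 that a 10-year part does.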
Another factor to take into consideration is that a majority of consumer buying behavior is geared more towards acquiring the latest parts rather than the most durable ones. Not only that, various studies examining consumer buying behavior show that more consumers buy components on the basis of their low prices than on whether a particular component is durable. This in effect encourages companies to adopt a production strategy that focuses on producing cheap components while sacrificing their durability. Based on this, it can be expected that computer components in the future will be manufactured in such a way that manufacturers intend for them to fail after a certain degree of usage. The inherent problem with this possible future situation is that while it benefits companies, through a continuous stream of income, it does not benefit those segments of the population who cannot afford the continuous component replacements that limited component life spans necessitate.
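A small, hedged calculation can illustrate why cheap-but-short-lived parts disadvantage budget-constrained buyers even when their sticker prices look attractive. The prices and lifespans below are assumptions chosen purely for the example.

```python
def cost_per_year(price, lifespan_years):
    """Average yearly cost when a part must be replaced at the end of its life."""
    return price / lifespan_years

# Invented figures purely for illustration.
print(f"Cheap part:   ${cost_per_year(40.0, 2.5):.2f}/year")
print(f"Durable part: ${cost_per_year(70.0, 8.0):.2f}/year")
```

Under these assumed numbers the cheaper part ends up costing nearly twice as much per year of service, which is precisely the burden described above for consumers who cannot afford continuous replacements.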

Switching from Air-Cooled to Water-Cooled to Mineral-Oil-Cooled Systems

Nearly 85% of all computer systems in the world utilize traditional air cooling technologies in order to control the high temperatures created by the processor, the north and south bridges on the motherboard, the video card and the various other components within a computer that generate heat through constant usage (EDN, 16). This process usually involves a metal plate attached directly to a particular component, with a set of metal attachments connected to fans from which the heat is dissipated via cool air being circulated directly to the component through the fan system (EDN, 16). Auxiliary fans are also usually mounted on the PC casing, drawing cool air from outside the casing towards the inside while warm air is removed via an alternative fan system. This particular setup has been utilized for the past 17 years in a majority of computer systems and remains one of the dominant methods of cooling PC components (Goldsborough, 30).

Unfortunately, due to the increased temperatures produced by various parts, this particular system has begun to reach its limitations in terms of effective usability. While fan systems are effective in bringing cold air into various systems, they are actually ineffective in keeping temperatures low within casings over a prolonged period of time. In fact, over a certain period of time fan systems fail to keep temperatures within nominal levels for proper operation, and this often leads to parts breaking down earlier than they should due to the high temperatures within the casing that the fans are unable to effectively control. This often leads to burned out components, sudden computer shutdowns and a variety of other results normally associated with such ineffective cooling systems. One method that companies have utilized to resolve this issue has been to constantly keep the areas where a number of systems are located under a particular temperature through various air conditioning units. While this method of resolving growing PC temperatures is effective in the short term, the fact remains that in the long term such a method of temperature control is costly, and as such an alternative means of resolving this particular problem is needed.

In the past 5 years one of the growing alternatives for resolving the heating problem of PCs has been to utilize liquid cooling systems as a replacement for traditional fan cooling systems (Goldsborough, 30). Liquid cooling systems utilize a series of tubes containing a mixture of distilled water and various coolants which circulate towards a cooling plate directly attached to the component producing heat (Upadhya and Rebarber, 22). Heat exchange occurs when cool water from the system hits the heat sink and absorbs the heat, transferring it away from the system via the series of tubes towards a radiator that expels the heat and cools down the water (Upadhya and Rebarber, 22). Unlike conventional fan systems, liquid cooling can reduce heat up to 60 percent more effectively and produces far less sound, resulting in better component longevity and better long-term performance from a PC (Upadhya and Rebarber, 22).
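The heat-exchange step described above can be roughly quantified with the standard relation Q = m·c·ΔT, where m is the coolant mass flow and c the specific heat of water. The sketch below, with assumed wattage and temperature figures rather than numbers from the cited sources, estimates the water flow a cooling loop would need to carry a given heat load.

```python
SPECIFIC_HEAT_WATER = 4186.0  # J/(kg*K), approximate

def required_flow_kg_per_s(heat_load_w, coolant_rise_k):
    """Mass flow of water needed to absorb heat_load_w with a coolant_rise_k warm-up."""
    return heat_load_w / (SPECIFIC_HEAT_WATER * coolant_rise_k)

# Example: a 150 W heat load, letting the coolant warm by 5 K across the block.
flow = required_flow_kg_per_s(150.0, 5.0)
# Water is roughly 1 kg per litre, so kg/s maps directly onto L/s.
print(f"Required flow: {flow * 1000:.1f} g/s (~{flow * 60:.2f} L/min)")
```

The modest flow this yields, well under half a litre per minute, is one reason small pumps suffice; it also explains why the pump and radiator add the extra electricity draw noted below.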
It must be noted, though, that liquid cooling systems utilize more electricity than traditional fan cooling systems due to the use of both a pump and a radiator to dissipate heat, and such systems are also more maintenance intensive due to the necessity of checking coolant levels in order to ensure that there is enough liquid coolant to keep the system working properly. Unfortunately, the utilization of liquid cooling systems across most of the current consumer market is still isolated to a few select groups, such as gamers, graphics artists and a variety of other users whose systems run at high core temperatures. One of the reasons could be the inherent price of liquid cooling systems, which is considerably higher than that of normal fan cooling systems. As noted earlier, Carr argues that technologies become cheaper and achieve widespread use only once they reach their build out completion, and that until then their prohibitive costs and the uncertainty attached to them prevent adoption. Based on this, it can be assumed that another reason why liquid cooling technologies haven't achieved widespread use is that they haven't reached their build out completion, creating uncertainty in the technology itself. An examination of current methods of liquid cooling shows that while the technology itself has taken great strides, effective systems are still rather unwieldy for the average computer user. This has created a certain level of consumer uncertainty in the product itself, despite the fact that it is a far better alternative to fan cooling systems. Based on this, it can be assumed that general adoption of liquid cooling systems will come in the future only after the technology itself has reached sufficient build out completion that it can be utilized easily by the average computer user and at a sufficiently lower cost.

It must be noted, though, that the general utilization of liquid cooling systems is only one possibility for computer component cooling technologies in the future; recently the use of mineral oil cooling systems has been gaining a substantial following, making it a possible contender as a primary cooling technology in the future. Mineral oil cooling utilizes a different heat-exchange principle than either fan cooling or liquid cooling: it involves pouring a substantial amount of mineral oil into a watertight casing in order to immerse all the computer components, after which the oil is pumped through a series of tubes into an external radiator in order to dissipate the heat accumulated in the oil. The technical aspect behind this particular system is actually rather simple: heat from the various computer components is transferred directly to the mineral oil and is then subsequently removed by the external radiator.
This ensures the parts rarely accumulate significant heat. It must also be noted that due to the special qualities of mineral oil it does not cause electrical shorts in the equipment, unlike when components are submerged in liquids such as water. As in the case of liquid cooling systems, mineral oil technologies are far from their build out completion stage, and as such it really cannot be said whether mineral oil systems or liquid cooling systems will become the dominant form of component cooling in the future. It will all depend on which system is the most feasible to commercially utilize in the immediate future and whether it can be adapted for general usage.
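Part of why immersed components rarely accumulate significant heat is simple thermal mass: a case full of mineral oil can absorb vastly more energy per degree of warming than the same case full of air. The comparison below uses approximate textbook property values and an assumed case volume.

```python
# Approximate bulk properties (textbook values; illustrative only).
AIR = {"density_kg_m3": 1.2, "specific_heat_j_kg_k": 1005.0}
MINERAL_OIL = {"density_kg_m3": 850.0, "specific_heat_j_kg_k": 1900.0}

def energy_to_warm(medium, volume_m3, delta_t_k):
    """Joules needed to raise a volume of the medium by delta_t_k."""
    mass = medium["density_kg_m3"] * volume_m3
    return mass * medium["specific_heat_j_kg_k"] * delta_t_k

CASE_VOLUME = 0.05  # m^3, roughly a mid-tower case (assumed)
print(f"Air-filled case, +10 K: {energy_to_warm(AIR, CASE_VOLUME, 10):,.0f} J")
print(f"Oil-filled case, +10 K: {energy_to_warm(MINERAL_OIL, CASE_VOLUME, 10):,.0f} J")
```

Under these rough figures the oil bath absorbs over a thousand times more energy per degree than the air it replaces, which buys the radiator time to carry the heat away.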

CPUs Made out of Diamond

In relation to the heat dissipation technologies mentioned earlier, one of the latest breakthroughs in semiconductor research is the utilization of diamond-based semiconductors as a replacement for current silicon-based chipsets. What must be understood is that as consumers demand more performance and processing power from CPUs, companies in turn have developed smaller transistors within processors in order to provide consumers with the processing power they need (Merritt, 12). Unfortunately, as CPUs get increasingly smaller, more sophisticated and more powerful, the end result is greater difficulty in terms of thermal dissipation. On average a single processor consumes nearly a hundred watts of electricity in order to maintain proper operational standards, and as the number of processes increases so too does the level of power utilized (Merritt, 12). Unfortunately, dissipating heat from an area smaller than one square centimeter presents a significant problem for various chip manufacturers, since the amount of heat produced tends to reach significant levels after a certain period of time (Davis, 37-38).

With the release of Core i7 processors by Intel, as well as the latest Intel Sandy Bridge processor, the end result is chips which require increasingly sophisticated methods of cooling that conventional fan systems are hard pressed to provide. While alternatives do exist, such as liquid cooling systems or mineral oil systems, these are still far from their build out completion stage and as such are not generally used by most consumers (Oskin, 70-72). This presents a significant problem for processor manufacturers such as Intel, since recent consumer trends show that on average consumers demand higher processing power nearly every 2 years due to the demands of ever more sophisticated software systems. Unfortunately, silicon-based processors show signs of thermal stress after they reach temperatures of 100 degrees Celsius or more (Davis, 37-38). If consumer demands are to be met, companies would need to increase the capacity of their current processors, which would in turn increase the amount of heat produced, which would very likely cause the processors to literally melt as a result of increased thermal stress. It must be noted that as technological innovations accumulate, companies increasingly find that the traditional materials and components they utilize have reached the limits of their usability and can no longer provide the necessary structural infrastructure needed for their products. It is in this particular situation that current processor manufacturers find themselves (Oskin, 70-72).

One possible alternative to silicon that processor manufacturers say shows potential is the use of diamond-based semiconductors. Diamond is the hardest substance known, and due to its inherent qualities it possesses useful properties that can be utilized in the production of robust processors. On average, diamond conducts heat better than silicon, has an inherently high breakdown voltage, and also possesses distinctly high carrier mobility. Not only that, as mentioned earlier, silicon-based processors have a tendency to show severe thermal stress when temperatures reach 100 degrees Celsius; diamond, on the other hand, can endure several times that temperature with little ill effect.
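The dissipation problem described earlier in this section is easiest to appreciate as a heat-flux figure. The die area below is an assumed round number rather than a value from the cited sources, but it shows why roughly a hundred watts leaving an area under one square centimeter is so demanding.

```python
def heat_flux_w_per_cm2(power_w, die_area_cm2):
    """Heat flux leaving the die surface."""
    return power_w / die_area_cm2

# A ~100 W processor on a die of less than one square centimeter (assumed area).
print(f"Heat flux: {heat_flux_w_per_cm2(100.0, 0.8):.0f} W/cm^2")
# For comparison, a kitchen stove burner is on the order of 10 W/cm^2.
```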
It must be noted, though, that various critics are skeptical of diamond-based CPUs, since diamond is a form of carbon and as such acts as an insulator rather than a semiconductor. One way around this that researchers have discovered is to dope the diamond with boron, turning it into a p-type semiconductor. Not only that, by reversing the charge carried by the boron doping, researchers were in effect able to create an n-type semiconductor as well. Both p-type and n-type semiconductors are needed to create a transistor, and as such the ability to create both from boron-doped diamond indicates the definite possibility of effectively creating diamond-based processors in the future. Unfortunately, one of the inherent problems in utilizing diamonds in processors is their inherent cost. Due to the relative rarity of diamonds, and the fact that they are coveted by the jewelry industry, their utilization as a replacement for current semiconductor technology seems infeasible, since they would not be cost effective for either consumers or manufacturers. Researchers found one way around this by creating a process that produces artificial diamond "sheets" that are purer than, and on par in overall hardness with, natural diamonds. The difference is that with this particular process the diamond-like material can be molded into various shapes depending on the use required, and it is actually cheaper to create than the various semiconductor sheets used in today's processor industry. Based on this innovation, various experts agree that it might be possible for processors to achieve speeds equivalent to 81 GHz without subsequently melting from the sheer amount of heat produced by such activity. It must be noted, though, that this particular type of technology is still in the testing phase and as such will not come into effect for another 10 years at the very least.
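Much of diamond's thermal headroom comes from its far wider band gap (about 5.5 eV versus roughly 1.1 eV for silicon), which keeps thermally generated carriers scarce even at high temperatures. The sketch below uses the standard proportionality n_i ∝ T^1.5·exp(−Eg/2kT) as a physics illustration; it is not drawn from the cited sources.

```python
import math

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K

def relative_carriers(band_gap_ev, temp_k):
    """Unnormalized intrinsic carrier density, n_i ~ T^1.5 * exp(-Eg / 2kT)."""
    return temp_k ** 1.5 * math.exp(-band_gap_ev / (2 * K_BOLTZMANN_EV * temp_k))

SILICON_EG = 1.12  # eV, approximate band gap
DIAMOND_EG = 5.47  # eV, approximate band gap

for temp_c in (25, 100, 300):
    temp_k = temp_c + 273.15
    ratio = relative_carriers(SILICON_EG, temp_k) / relative_carriers(DIAMOND_EG, temp_k)
    print(f"{temp_c:>3} C: silicon has ~{ratio:.1e} times more thermal carriers than diamond")
```

Even at several hundred degrees Celsius, diamond's thermally generated leakage remains many orders of magnitude below silicon's, which is the quantitative core of the "several times that temperature with little ill effect" claim above.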

Blu-ray Disks and the Future of Disk Drives

Disk drives have been a ubiquitous part of most computer setups for the past 18 years or so, making them one of the most well known and used components in any computer. Of late, advances in drive technology have steadily updated the format, from initially allowing consumers to burn information onto a CD-ROM to eventually allowing them to watch and burn DVDs. The latest incarnation of these advances is the Blu-ray disk format, enabling consumers to store up to 30 to 40 gigabytes of data on a single disk (Digital Content, 39). The creation of Blu-ray disk technology actually follows a distinct trend in drive technology wherein every few years drives and disks are created that have higher storage capacities, faster speeds, better video resolution quality and an assortment of added functions that surpass the previous generation. Initially, CD-ROM disks had a capacity of 700 MB; this changed when DVD-based technologies and drives came into the picture, resulting in storage capacities reaching 4 to 6 gigabytes depending on the type of DVD bought. Today, Blu-ray disk technology enables storage capacities of up to 30 gigabytes, which far surpasses the storage capabilities of DVD-based drives and disks. Based on this trend, it can be assumed that Blu-ray disk technology will not be the final version of disk drive technology; rather, it is merely another evolutionary step (Perenson, 94).

In fact, disk production companies such as Philips, Sony, Maxell, TDK, Optware and Panasonic have already announced a potentially new form of media consumption in the form of the HVD disk, which is slated for release in the next few years. HVD, or Holographic Versatile Disk, utilizes a new form of holographic embedding technology enabling data to be stored holographically onto the surface of a disk, thus enabling greater storage capacity on a smaller surface area (Digital Content, 39). In fact, estimates show that a single HVD disk has the capacity to hold up to 6 terabytes of data, greatly exceeding the 30 gigabytes of data most Blu-ray disks can hold.

It must be noted, though, that one of the more unfortunate aspects of disk drive technology is the fact that when new drives and types of disk come out it becomes necessary to transfer data from the older version of the technology to the newer one, which is an arduous process at best (Medford, 8). In recent years the degree of technological innovation has advanced to such an extent that newer versions of disk drives and disks come out nearly every 2 or 3 years, resulting in a painful cycle for consumers as they migrate their data from one storage medium to the next (Digital Content, 39). Based on this, it can be seen that disk drive technology has an inherent weakness connected to the migration of consumer data from one version of the technology to the next. In fact, it can even be said that no matter how far this particular technology progresses in terms of storage capacity and sharper video playback, it will still have the same data migration problems that consumers now face. At times it must even be questioned whether disk drive technology is necessary at all. For example, at present, solid-state devices such as USB flash drives are one of the dominant forms of external storage due to the ease of data transfer that they provide (Digital Content, 39).
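To put these capacity jumps in perspective, the snippet below computes how many discs of each generation an example media library would occupy. The library size is arbitrary, and the DVD figure uses the common 4.7 GB single-layer capacity.

```python
# Nominal capacities in gigabytes; the HVD figure is the projection cited above.
FORMATS = {"CD-ROM": 0.7, "DVD": 4.7, "Blu-ray": 30.0, "HVD (projected)": 6000.0}

LIBRARY_GB = 500.0  # an arbitrary example media collection

for name, capacity_gb in FORMATS.items():
    discs = -(-LIBRARY_GB // capacity_gb)  # ceiling division
    print(f"{name:>16}: {int(discs)} disc(s)")
```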
While such solid-state devices are nowhere near the capacities of the future HVD format, the fact remains that they provide a far easier method of data transfer than disks (Medford, 8). Another factor that must be taken into consideration is whether regular consumers really need disk formats that can store 6 terabytes of data. While it may be true that in the current information age the amount of media consumed by the average person reaches several hundred gigabytes, the fact remains that it rarely goes above 500 gigabytes. In fact, consumers who use 1 terabyte of storage and up are in the relative minority compared to most computer users today. Another factor that must be taken into consideration is that with the current popularity of cloud computing, especially with the release of Apple's iCloud network, data storage has in effect become more of a problem for companies than for regular consumers. Based on this, it can be stated that the development of increased storage for regular consumers should follow a slower rate in order to lessen the frustration of continuous changes to new media storage formats. Unfortunately, based on current trends in technology releases, companies seem more inclined to release new media storage formats without taking into account the actual consumer necessity behind the release. What is currently occurring is a case where consumers seem irrationally inclined to follow new media storage formats such as the Blu-ray disk without taking into account the fact that their current method of storage is perfectly fine. Companies take advantage of this by continuously releasing new storage formats, since they know consumers will follow the new formats and port their data over to the new storage devices. This particular situation benefits companies more than it does actual consumers, and as such, based on current trends and consumer behavior, the future of disk drive technology seems destined for the continuous release of ever higher-capacity storage formats which consumers are unlikely to need but will buy nonetheless.

Changes in Display Technology: Is 3D the Future of Digital Display Technology?

Classic CRT display technology has been around for 25 years and was once one of the most used types of monitor in the computer component industry. Yet with the development of cheap LCD technologies within the past 7 years or so, CRT screens have begun to be phased out in favor of cheaper and more cost-effective LCD screens. It must be noted that, unlike other forms of component technology, monitors tend to see slower technological innovation. While it may be true that in the market today there are several brands and types of LCD screens, ranging from small 22 inch screens to massive 41 inch LCD monstrosities, the fact remains that most of them adhere to the same basic design principles, with certain additions added in by manufacturers to differentiate them from the rest (Kubota, Taguchi and Yazawa, 942-949). In fact, within the past 5 years the technology has only improved slightly, with the creation of high-definition screens and LED display systems, but the basic design and components utilized are still roughly the same.

It must be noted that, unlike the developments seen in processor technology, disk drives and PC cooling, enhancements in display technology benefit only the visual aspect of a user's experience. They do little to improve PC performance or longevity, and in fact a large percentage of current consumers tend to stick to the classic LCD models and types developed 5 years ago rather than the newer high-definition LED screens used today. The reason behind this is rather simple: most people are unwilling to pay higher prices for a technology that can be obtained at a lower cost with little discernible change in PC performance (Kubota, Taguchi and Yazawa, 942-949). In fact, a majority of PCs are used mainly for work, and as such, unless a person is in the media industry, high-definition screens aren't really a necessity.

While it may be true that of late there have been significant developments in display technology, as can be seen with the creation of 3D screens for computer users, it must still be questioned whether the adoption of such technology for general use will happen within the immediate future. 3D vision technology has been heavily advertised by most companies as the latest wave of innovation in display technology. In fact, companies such as ATI and NVIDIA have attained significant profits in selling video cards that are 3D capable, yet when examining the percentage of actual use, most laptops and PCs today still utilize the classic LCD technology that was available 5 years ago. Comparatively, processors, disk drives and memory sticks today have changed drastically from the way they were then. The reason behind this is rather simple: LCD technology reached its build out completion years ago, so the technology has proven to be stable, most consumers prefer to use it, and it is cheaper than some of the latest screens available today. In fact, various projections show that the consumer market that will avail itself of the latest developments in screen technology will be isolated to gamers, media junkies and media corporations. For the vast majority of other computer users, LCD technology will still be utilized for quite some time due to its stability and lower price. What must be understood is that when examining the current trend in 3D vision technology, it seems to be more of a creative gimmick than something serving an actual use.
While it may be true that 3D makes games seem more realistic, it is not necessarily an integral and necessary part of a user's computer experience. In fact, the 3D capabilities of a screen can be taken away and programs on a PC will run with no difference whatsoever in performance or display. A majority of programs today don't require 3D vision, and its use is isolated to only a certain segment of the computer user population. It must also be noted that in order to actually use a 3D-capable monitor, a user needs to wear a special set of 3D glasses in order to see the "3D" effect. One problem with this method of usage is that various studies show that not all users are actually comfortable with utilizing 3D vision technology. Cases of eye strain, blurred vision and distinct feelings of discomfort have been noted in certain computer users, which calls into question the ability of the technology to appeal to a large segment of the population. Not only that, 3D vision screens are on average several times the price of normal LCD screens and require special 3D-capable video cards to actually work. This increases their overall cost to consumers and actually discourages them from buying the screens in the first place. Display technologies should provide a discernible utility to consumers; while 3D vision may seem nice, it doesn't really serve any positive purpose beyond making games look better. Another potential technology that could be applicable to future displays is the use of holograms as a replacement for solid screen devices; however, extensive research on the applicability of the technology for general use in the near future has shown that even marginal commercial use is still 10 years away at best.
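One concrete reason 3D displays demand the special hardware noted above is raw bandwidth: active-shutter 3D must deliver a separate frame to each eye, doubling the effective frame rate. A back-of-the-envelope estimate, assuming an uncompressed 1080p signal at 24 bits per pixel:

```python
def bandwidth_gbit_s(width, height, bits_per_pixel, frames_per_s):
    """Uncompressed video bandwidth in gigabits per second."""
    return width * height * bits_per_pixel * frames_per_s / 1e9

print(f"2D @ 60 Hz:  {bandwidth_gbit_s(1920, 1080, 24, 60):.2f} Gbit/s")
print(f"3D @ 120 Hz: {bandwidth_gbit_s(1920, 1080, 24, 120):.2f} Gbit/s")  # 60 Hz per eye
```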

Heat Sinks, RAM Memory and Its Future Capacity

When various people think of PC components they usually bring up topics such as screen resolution, hard disk space, disk drives and the capacity of their video cards, yet they always seem to forget about RAM memory. The reason behind this is actually quite simple: the general population most often interacts directly with output devices such as monitors and input devices such as keyboards, the mouse and the disk drive. It is due to this that they take notice of the factors that directly affect their interaction with various PC systems, such as the amount of memory on their hard drive for storing their files, the quality of the screen resolution on their monitor, the type of disk drive they have, as well as the capacity of their video card for playing crisp video files. RAM memory is often relegated to being a secondary aspect of the average home computer setup, yet it is an integral component of any PC.

Lately, RAM memory has been generating a greater degree of interest as the number of tech-savvy enthusiasts grows, resulting in more interest in individual computer parts. It was actually due to this growing interest that various PC enthusiasts discovered the inherent limitations of RAM memory arising from its increasing capacity. As in the case of processors, as the amount of RAM memory per individual stick grew, the end result was a greater degree of heat produced, which in turn affected the performance of the RAM in the long term (Deen, 48). Prolonged operation actually resulted in slower computer responses as the memory struggled with increasing temperatures affecting its ability to work. While this is not the case for all computers, it has been noted in enough cases that the RAM production industry has in effect released a stop-gap measure to resolve it (Deen, 48). Depending on the manufacturer, certain memory sticks now come with heat sinks included to help dissipate heat away from the sticks and into the air within the casing. It must be questioned, though, whether this particular addition to RAM memory will become an industry standard in the coming years.

Heat sink technology works by drawing heat away from a particular device through either a copper or aluminum conductor and dissipating it into the air within the casing. While various experts and industry personnel may say that it is effective in dissipating heat, logic dictates that it is a rather inadequate method of heat dissipation on its own. For one thing, the technology works through temperature differences between particular "zones" (Deen, 48). The high temperatures of the memory stick are drawn towards the lower temperatures of the surrounding air through the copper or aluminum heat sink, in effect lessening the temperature burden on the memory stick. What must be understood, though, is that this works only if there is a distinct temperature difference between the two zones. As operational times increase, so too does the ambient temperature within a particular PC case; while there are casings with sufficient temperature control mechanisms in the form of interior fans that help regulate the temperature, the fact remains that not all casings have this, since it entails a significant additional cost to computer users. As a result, ambient temperatures within particular case models can increase to such a degree that the efficacy of the heat sinks is in effect reduced, resulting in a gradual deterioration of performance.
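The dependence on a temperature difference between the two "zones" can be captured with a simple steady-state thermal-resistance model, T_device = T_ambient + P × R_θ. The power and resistance values below are assumptions for illustration, not manufacturer figures, but they show how rising case temperatures erode a heat sink's effectiveness.

```python
def device_temp_c(ambient_c, power_w, r_theta_c_per_w):
    """Steady-state device temperature with a passive heat sink."""
    return ambient_c + power_w * r_theta_c_per_w

POWER_W = 4.0  # assumed per-module dissipation
R_THETA = 8.0  # assumed heat-sink thermal resistance, C/W

for ambient_c in (30, 45, 60):  # rising in-case air temperatures
    print(f"Case air {ambient_c} C -> module {device_temp_c(ambient_c, POWER_W, R_THETA):.0f} C")
```

Because the heat sink only adds a fixed offset above ambient, every degree the case air gains is a degree the module gains, which is exactly the deterioration described above.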
It must also be noted that, as mentioned earlier, liquid cooling systems and mineral oil cooling systems have been gaining significant amounts of consumer interest and could eventually be used as the primary cooling methods of PCs in years to come (Deen, 48). It has already been shown that both cooling mechanisms are far more effective in cooling memory sticks than heat sinks, and as such it cannot really be said that heat sinks incorporated into memory sticks will become an industry standard. While various manufacturers advocate that they should be (they get to charge higher prices for memory sticks with heat sinks), the fact remains that, given future methods of cooling such as mineral oil and liquid cooling, it cannot be said that the current trend of incorporating heat sinks into memory sticks will continue into the future, especially once liquid cooling or mineral oil technology reaches its build out completion and becomes commercially viable.

The Push towards Miniaturization and Holograms

One of the most recent trends in the development of PC components has been a distinct push towards miniaturization, with various components decreasing in size and weight as consumers demand more portability in the devices they use. This has given rise to products such as Intel's Atom processor, more efficient miniature laptop batteries and a host of other innovations all aimed at making PC components smaller and thus more easily carried by the average consumer (Murphy, 113). In fact, it can even be said that this push towards miniaturization is a trend that will continue far into the future, with holographic technology taking precedence in future portable devices such as netbooks and laptops. The reason holographic technology comes up is that one of its latest developments has been a credit-card-sized keyboard that uses infrared technology to identify the positioning of a user's fingers, in effect simulating interaction between a user and a holographic image (Masaie et al., 32-34). While this particular type of technology is still a decade or two away from actual commercialization, it must be noted that it could in effect become the future of all input devices (Gomes, 40).

Holograms can be described as 3D images created through various projection sources to create the illusion of volume. For example, one of the current applications of the technology has been the creation of "vocaloid" concerts in Japan, wherein a projected holographic image is created on stage to simulate an actual person singing in a concert. Unfortunately, while input technologies involving infrared light can be utilized now, holographic display technology is still in its infancy. The fact is that creating an effective 3D hologram requires a significant amount of energy as well as a self-contained projection apparatus that can project the necessary image onto a black screen or onto a particular space template. Given the current push towards miniaturization, it cannot feasibly be stated that holographic technology can be utilized as a portable medium within the next few years. Various studies examining the rate of development of holographic technology specifically state that it will require at least another 20 years before the technology becomes applicable for commercial purposes. One of the reasons behind this is that the development of holographic display technology is mostly being conducted by research labs in various universities and not by any significant commercial company that produces display technology. Companies such as AOC, VGA, LG and Asus are focusing their efforts on traditional display technologies such as LCD screens rather than on conceptual technologies that have yet to mature (Gonsalves, 6). What must be understood is that companies are more or less profit oriented, and as such they will not expend resources on developing a product that is still in the theoretical stage (Gonsalves, 6). While it may be true that holograms may be the wave of the future for computer displays, the fact remains that until the technology actually proves itself commercially feasible, it is unlikely that companies will allocate any significant amount of resources towards its development.
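As a rough illustration of the projected-keyboard idea, the sketch below maps a detected fingertip coordinate, as an infrared (or magnetic) sensor might report it, onto a key grid. The layout, coordinate convention and function names are hypothetical, not taken from the cited device.

```python
# Hypothetical 4-row projected layout; a real device would use calibrated optics.
KEY_ROWS = [
    "1234567890",
    "qwertyuiop",
    "asdfghjkl;",
    "zxcvbnm,./",
]

def key_at(x_norm, y_norm):
    """Map a normalized fingertip position (0..1 across, 0..1 down) to a key."""
    row = min(int(y_norm * len(KEY_ROWS)), len(KEY_ROWS) - 1)
    col = min(int(x_norm * len(KEY_ROWS[row])), len(KEY_ROWS[row]) - 1)
    return KEY_ROWS[row][col]

print(key_at(0.05, 0.30))  # -> 'q', the leftmost key of the second row
```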

Works Cited

Bursky, Dave. “Stunning Advances To Captivate ISSCC Attendees.” Electronic Design 53.2 (2005): 26.

Davis, Sam. “High Efficiency Challenges Power-Management Design.” Electronic Design 56.5 (2008): 37-40.

Deen, Mueez. "Memory Fuels Performance Engine." Electronic Engineering Times 1509 (2008): 48.

Digital Content. "A Word on Storage." Digital Content Producer 33.3 (2008): 39.

EDN. "Point Cooling Advances for Hot ICs." EDN 54.5 (2009): 16.

Goldsborough, Reid. "PC a Little Sluggish? It Might Be Time for a New One – Or Not." Community College Week (2008): 30.

Gomes, Lee. "Keys to the Keyboard." Forbes 184.4 (2009): 40.

Gonsalves, Antone. "Nvidia Shaves Costs of Graphics Processing." Electronic Engineering Times 1509 (2008): 6.

Kubota, S., A. Taguchi, and K. Yazawa. "Thermal Challenges Deriving from the Advances of Display Technologies." Microelectronics Journal 39.7 (2008): 942-949.

Masaie, Issei, et al. "Design and Development of a Card-Sized Virtual Keyboard Using Permanent Magnets and Hall Sensors." Electronics & Communications in Japan 92.3 (2009): 32-37.

Medford, Cassimir. “Music Labels, SanDisk in CD Rewind.” Red Herring (2008): 8.

Merritt, Rick. "Server Makers Get Googled." Electronic Engineering Times 1553 (2008): 22.

Murphy, David. “Upgrade to Gigabit Networking for Faster Transfers.” PC World 27.12 (2009): 113-114.

Oskin, Mark. "The Revolution Inside the Box." Communications of the ACM 51.7 (2008): 70-78.

Perenson, Melissa J. “Blu-ray on the PC: A Slow Start.” PC World 27.4 (2009): 94.

Upadhya, Girish, and Fred Rebarber. "Liquid Cooling Helps High-End Gamer PCs 'Chill Out'." Canadian Electronics 23.3 (2008): 22.
