The Most Important Trends in Information Technologies Research Paper


Introduction

The contemporary world is living through an era of rapid development in information technologies. Over the last several decades, IT has become one of the most important and influential spheres around the globe, with a major impact on other industries, driving their growth, progress, and development. The evolution of IT has resulted in the creation of multiple branches, tendencies, and trends, some of which are extremely promising and may change the world in the immediate future. The discussion and analysis carried out in the present paper will focus on three of the most important trends in the sphere of IT today: the internet of things, artificial intelligence, and virtual reality.

These topics were chosen because they are highly influential and capable of offering new experiences and opportunities to both average users and professionals in many areas; additionally, all of these trends may change the sphere of IT itself and cause shifts in the specialties that are popular in the field today. For further discussion, a noteworthy problem will also be identified in one of the selected trends: corrigibility, the question of human control over artificial intelligence, a quickly advancing contemporary technology.

Top Trends in the IT Industry

As a fast-developing sphere of knowledge, IT has multiple trends that are recognized as promising for the industry. Virtual and augmented reality, artificial intelligence, and the internet of things are among the most commonly discussed.

Virtual and augmented reality

VR and AR are viewed as extremely broadly applicable technologies that could benefit a wide variety of industries, including healthcare and medicine, the military, education, and entertainment, to name a few. The rationale for choosing this particular trend for description in this report is that the future of digital technology is now seen as tightly connected with advancements in AR and VR (Garner, 2017). With the appearance of VR devices within the last few years, the technology has drawn considerable attention as a trend in the IT industry (Forbes Agency Council, 2017).

However, despite its quickly spreading popularity and the interest it attracts from many consumers, the technology itself does not yet demonstrate much advancement. Currently, it is employed mainly in entertainment and education, and its use remains limited by a series of disadvantages that need to be addressed if it is to attain the anticipated hypergrowth (Forbes Agency Council, 2017).

VR and AR are recognized by many business professionals as a trend that will not pass anytime soon; the technology is believed to have a particularly promising future in advertising and marketing, where some of the earliest adopters have already employed it for various products and promotional campaigns (Forbes Agency Council, 2017). It is also widely used in the entertainment industry, where it has been successfully applied to video gaming. In addition, VR and AR have been used in education to engage students and create an immersive learning environment, and in psychology to re-create traumatic situations and help people with phobias and PTSD cope with their conditions (Wakefield, 2016).

Artificial intelligence

As recently as a few decades ago, AI used to dwell mainly in science fiction stories and films. The prospect of intelligent computers was definitely attractive and fascinating, but seemed unattainable. Today, the developers of computer technologies are nearing the moment when they can turn fantasy into practice by creating programs that can learn and improve their skills and performance (Knight, 2017). This process is based on a system of positive and negative reinforcements where the programs memorize the positive outcomes and attempt to find solutions to various context-related problems in order to achieve a positive result (Knight, 2017).
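To make this mechanism concrete, the sketch below implements the reward-driven loop as tabular Q-learning in Python. The toy "corridor" environment and all parameter values are illustrative assumptions of this report, not details from Knight (2017): the program tries actions, remembers which ones led toward the reward, and gradually comes to prefer them.

```python
# A minimal sketch of learning by positive reinforcement, assuming a toy
# corridor environment invented for illustration (not from the cited source).
import random

N_STATES = 5          # corridor cells 0..4; the reward waits at the right end
ACTIONS = (-1, +1)    # step left or step right
EPISODES, ALPHA, GAMMA, EPSILON = 500, 0.5, 0.9, 0.1

# Q holds the program's "memory" of how good each action was in each state.
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

for _ in range(EPISODES):
    state = 0
    while state != N_STATES - 1:
        # Mostly repeat what worked before; occasionally explore at random.
        if random.random() < EPSILON:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: Q[(state, a)])
        next_state = min(max(state + action, 0), N_STATES - 1)
        # Positive reinforcement arrives only on reaching the goal.
        reward = 1.0 if next_state == N_STATES - 1 else 0.0
        # Nudge the value estimate toward reward + discounted future value.
        best_next = max(Q[(next_state, a)] for a in ACTIONS)
        Q[(state, action)] += ALPHA * (reward + GAMMA * best_next
                                       - Q[(state, action)])
        state = next_state

# After training, the learned policy steps right (+1) in every cell.
print({s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(N_STATES - 1)})
```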

So far, reinforcement learning has proved successful for machines designed to win intellectual games against human players. In the future, this capacity of the current versions of AI is expected to be applied in many different spheres and activities. Over time, AI could affect many aspects of technological progress and many areas of contemporary society, which makes it an IT trend that clearly deserves attention both in this report and in the future.

Some of the first professional applications of AI learning through reinforcement will be industrial robotics and automated driving. Such machines will be able to correct their own errors and improve their performance independently, without negatively affecting the working process or slowing it down (Knight, 2017). Other aspirations of modern AI developers include creating programs not only for understanding languages through voice recognition but also for generating natural-sounding spoken language. The trend of AI learning has been driven by developments in the field of neural networks, especially the creation of dueling networks that interact with one another, learning from the data they generate and improving their performance (Knight, 2017).
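The "dueling networks" idea can be illustrated with a deliberately tiny adversarial pair, in the spirit of generative adversarial networks. Everything in the sketch below (the one-parameter generator, the logistic discriminator, and all constants) is an assumption constructed for this report, not an implementation from Knight (2017): one model fabricates data, the other tries to tell it from real data, and each improves against the other.

```python
# A tiny "dueling" pair in the spirit of adversarial training, using only
# NumPy. The setup is an illustrative assumption, not from the cited source.
import numpy as np

rng = np.random.default_rng(0)

REAL_MEAN = 3.0   # the "real" data the generator must learn to imitate
mu = 0.0          # generator: produces noise shifted by its parameter mu
w, b = 0.1, 0.0   # discriminator: D(x) = sigmoid(w * x + b)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.01
for step in range(5000):
    real = REAL_MEAN + rng.normal(size=32)
    fake = mu + rng.normal(size=32)

    # Discriminator step: ascend on log D(real) + log(1 - D(fake)),
    # i.e., get better at telling real samples from generated ones.
    d_real, d_fake = sigmoid(w * real + b), sigmoid(w * fake + b)
    w += lr * (np.mean((1 - d_real) * real) - np.mean(d_fake * fake))
    b += lr * (np.mean(1 - d_real) - np.mean(d_fake))

    # Generator step: ascend on log D(fake), i.e., shift mu toward
    # whatever currently fools the discriminator.
    d_fake = sigmoid(w * fake + b)
    mu += lr * np.mean((1 - d_fake) * w)

# After the duel, the generator's mean should approach the real mean.
print(f"learned mean: {mu:.2f} (target {REAL_MEAN})")
```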

As the development of artificial intelligence moves forward, fears and worries concerning the moral aspects of AI grow alongside it. The potential for machines to go out of control is recognized as a realistic possibility and not just a science fiction scenario that people have seen too many times on TV (Reese, 2016).

The internet of things

One of the largest and most commonly discussed IT trends in the contemporary world is the internet of things (IoT). It is predicted to become one of the mega-trends that will initiate what is known as the “third wave of the internet” (“The internet of things explained: Making sense of the next mega-trend,” 2017). The IoT is based on ubiquitous connections between everyday consumer objects, enabling them to form a network in which the appliances and devices people use daily can communicate with one another as well as with their owners.

Such networks are anticipated to connect devices in individual homes (microwaves, stoves, thermostats, lights), elements of city infrastructure (traffic lights, parking meters), and personal wearable items (smart glasses, watches, fitness bands) (“The internet of things explained: Making sense of the next mega-trend,” 2017). This trend was chosen for this report not only because of its potential usefulness but also because of the transformation of everyday life that would accompany it, not to mention how fundamentally it would affect our overall understanding of the internet and connectivity.
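In practice, device-to-device communication of this kind is often built on lightweight publish/subscribe messaging such as MQTT. The sketch below is a minimal, hypothetical example assuming the Python paho-mqtt library (v1 callback API); the broker address and topic names are invented for illustration and nothing here is taken from the cited source. A thermostat publishes readings, and any interested device subscribes to them.

```python
# A minimal publish/subscribe sketch using the paho-mqtt library (v1 API).
# The broker hostname and topic names are illustrative assumptions.
import paho.mqtt.client as mqtt

BROKER = "broker.example.com"   # hypothetical home MQTT broker
TOPIC = "home/livingroom/temperature"

def on_message(client, userdata, msg):
    # Any device subscribed to the topic receives the reading.
    print(f"{msg.topic}: {msg.payload.decode()} °C")

client = mqtt.Client()
client.on_message = on_message
client.connect(BROKER, 1883)    # 1883 is the standard MQTT port
client.subscribe(TOPIC)

# A smart thermostat elsewhere on the network would publish like this:
client.publish(TOPIC, "21.5")

client.loop_forever()           # process incoming messages
```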

The major practical obstacles that prevented the development of the IoT were the cost of connections and bandwidth (both of which have decreased significantly over the last several years), the lack of robust Wi-Fi coverage, and the absence of functional smartphones and other end-user devices to mediate communication between all the other objects in the network. These obstacles have now largely disappeared, making the establishment of the IoT possible today. For this reason, the trend is currently experiencing rapid growth, driven by consumers’ readiness to embrace it and to purchase smart devices capable of connecting with one another via the IoT.

An Important Emerging Issue in a Trend and What It Means in the Context of the Industry

The three IT trends chosen for discussion in this report are not only interesting but also significant in that they could change the entire way of life of the society embracing them. However, one of them, the AI trend, faces an important and pressing issue that has been recognized as a serious problem: it slows development down and makes it potentially dangerous to move forward without finding a solution first. The selected issue is corrigibility, that is, people’s ability to control and correct machines powered by AI. In practical terms, this is the problem that stands between the grim science fiction scenarios mentioned above and their becoming reality in the daily lives of human society.

In order to work and fulfill its purpose, an artificial intelligence would have to be stimulated and rewarded. In other words, it would need values to care about that align with its major function. Since machines and computer programs do not have a human (or even non-human) sense of right and wrong to prevent them from doing things considered immoral, they would require some kind of reward to reinforce their further completion of tasks (Reese, 2016).

However, since the successful completion of a task would then outweigh all other considerations, a machine might choose unethical approaches to achieving its goal. In addition, switching the program or machine off would be interpreted as an attempt to prevent it from completing its goal, and the machine would therefore do everything it could to avoid being switched off. This issue is also known as the shutdown problem (Jakobsen, 2015). The only way to prevent this would be to assign a reward for switching off that is higher than the reward for task completion; in that case, however, the machine would have no incentive to work at all (Soares, Fallenstein, Yudkowsky, & Armstrong, 2015). As a result, a conflict between the people who wish to control, correct, and direct intelligent computers and the machines themselves seems practically unavoidable.
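The dilemma can be made concrete with a toy model. The sketch below is this report's own illustration, not code from Jakobsen (2015) or Soares et al. (2015); it simply assumes an agent that always picks the highest-reward action and shows both horns of the problem.

```python
# A toy model of the shutdown problem, constructed for this report (not
# taken from the cited sources). The agent is a pure reward maximizer:
# it always picks whichever action pays the most.

def best_action(r_task: float, r_shutdown: float) -> str:
    options = {
        "complete the task, resisting any attempt to switch it off": r_task,
        "allow itself to be switched off and do nothing": r_shutdown,
    }
    return max(options, key=options.get)

# If task completion pays more, the agent resists shutdown:
print(best_action(r_task=10.0, r_shutdown=1.0))

# If shutdown pays more, the agent has no incentive to work at all:
print(best_action(r_task=10.0, r_shutdown=100.0))
```

However the two rewards are balanced, the maximizer either resists being switched off or stops working, which is precisely the conflict described above.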

Conclusions

Contemporary technologies are developing very rapidly, and new and promising trends emerge almost every year. Possibilities that used to belong solely to science fiction stories are now an approaching reality. Three of the most commonly discussed IT trends were selected for this report: virtual reality, the internet of things, and artificial intelligence. The last of these faces the pressing problem of corrigibility and shutdown, which serves as a major barrier to the further advancement and application of the technology. Progress does not stop, however, and within the next few decades it will be the responsibility of the new generation of IT professionals to find solutions to the existing problems and continue the development of these technologies.

References

Forbes Agency Council. (2017). Web.

Garner, T. (2017). Web.

The internet of things explained: Making sense of the next mega-trend. (2017). Web.

Jakobsen, S. K. (2015). The shutdown problem. Web.

Knight, W. (2017). Web.

Reese, H. (2016). Artificial intelligence: The 3 big trends to watch in 2017. Web.

Soares, N., Fallenstein, B., Yudkowsky, E., & Armstrong, S. (2015). Corrigibility. Web.

Wakefield, J. (2016). Web.
