Information Technology: Peer-to-Peer Computing


Abstract

Peer-to-peer (P2P) computing refers to systems that rely on dispersed computing resources operating in a decentralized manner; in essence, it is the direct sharing of computing resources and data among machines. Because organizations have pervasively deployed computers across their business units, P2P computing has gained significant attention in research, product development, and organizational use.


Many organizations prefer P2P models because of benefits such as improved scalability, reduced dependency on a centralized system, lower costs through direct communication between systems, and better resource aggregation and functionality.

This paper presents the concepts and functions of P2P systems, highlighting the fundamental ideas and critical aspects of the model. It also gives a general overview of the models used in P2P computing, helping readers understand the potential advantages of deploying P2P in organizations and giving those unfamiliar with P2P concepts a broad introduction to the research.

Introduction

Milojicic et al. defined peer-to-peer computing as “a class of systems and applications that employ distributed resources to perform a function in a decentralized manner” (1). The fundamental aspect of peer-to-peer (P2P) computing is the sharing of computer resources through exchange platforms.

Users can employ these systems to distribute critical roles such as content sharing, data exchange, and communication, among others (Li 45). The P2P model relies on decentralized algorithms and data, although some core functions, systems, and platforms can still be controlled from a central location. Typical P2P systems have been based on the Internet or on ad-hoc networks (Milojicic et al. 2).

Capabilities of P2P Computing

Many users have preferred P2P computing for the following reasons. First, P2P computing creates valuable externalities by combining different resources through low-cost systems to deliver high performance; that is, the aggregate performs better than any single component. Second, P2P computing results in a lower cost of ownership, since the model promotes cost sharing by using infrastructure that is already available.

In addition, maintenance costs are significantly reduced or distributed. Third, P2P computing can enhance anonymity in an organization and give users more control over their data and applications. Fourth, P2P computing supports system dynamism: nodes and other resources continually join and leave a dynamic environment, and P2P models work well in such environments because they can accommodate varying numbers of users and adapt to changing usage.


However, users must ensure the continuity and security of their functions or computation jobs when the nodes running them are reclaimed by their owners or leave the network. Finally, P2P computing promotes ad hoc communication and collaboration among users. Ad hoc environments offer flexibility because users can come and go according to their locations, roles, and interests; this capability reinforces the dynamism of P2P models, since the system can absorb such changes.

P2P computing systems generally do not rely on dedicated, centrally managed infrastructure. Instead, they form logical overlay networks, such as CAN and PAST, and typically reside at the edge of the Internet or in ad-hoc networks (Milojicic et al. 2).

However, P2P has raised critical network security issues for users, and IT departments have also raised concerns about accountability. While earlier P2P models attracted such concerns, current hybrid models offer enhanced performance relative to pure unstructured networks. P2P technologies continue to evolve, and users can weigh their genuine benefits against the hype.

Architecture

P2P networking is based on the concept of “similar peer nodes that can run as a server and a client at the same time for other nodes within a network” (Vu, Lupu and Ooi 171). This model is fundamentally different from “client-server networking computing” (Vu et al. 171).

In the client-server model, a central server is the endpoint for all inbound and outbound communication. The File Transfer Protocol (FTP), for example, follows this model: the client must initiate the transfer, and the server handles the request.
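The contrast is easier to see in code. The following minimal Python sketch (all names hypothetical, error handling omitted) shows a single peer process that plays both roles: it serves its shared files to other peers while also being able to request files from them, whereas an FTP node is permanently either a client or a server.

```python
import socket
import threading

class Peer:
    """A node that acts as a server and a client at the same time."""

    def __init__(self, host, port, shared_files):
        self.addr = (host, port)
        self.shared_files = shared_files              # file name -> bytes

    def serve_forever(self):
        """Server role: answer file requests arriving from other peers."""
        listener = socket.create_server(self.addr)
        while True:
            conn, _ = listener.accept()
            threading.Thread(target=self._handle, args=(conn,), daemon=True).start()

    def _handle(self, conn):
        with conn:
            name = conn.recv(1024).decode().strip()
            conn.sendall(self.shared_files.get(name, b""))

    def request(self, peer_addr, name):
        """Client role: fetch a file directly from another peer."""
        with socket.create_connection(peer_addr) as conn:
            conn.sendall(name.encode())
            conn.shutdown(socket.SHUT_WR)             # signal the end of the request
            chunks = []
            while data := conn.recv(4096):
                chunks.append(data)
        return b"".join(chunks)
```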

Unstructured networks

P2P computing can use an unstructured network built from ad hoc connections. Unstructured P2P networks do not impose any particular structure on the overlay by design (Stallings 89); instead, nodes establish random connections among themselves. Kazaa and Gossip are examples of unstructured P2P networks.

Because unstructured P2P networks require no specific structure, they are simple to build and allow localized optimization in different regions of the overlay. In addition, all peers perform similar functions, which makes the system highly robust under churn, that is, when many peers join and leave the network at high rates.


The major drawback of unstructured P2P networks stems from this very lack of structure: it is difficult for peers to locate particular content quickly. Instead, they flood the network with search queries to find peers holding the required information. Flooding generates heavy traffic and consumes bandwidth, CPU time, and memory, because every peer must process the search, and even then there is no guarantee that all queries will be resolved.

Furthermore, an unstructured P2P network provides no correlation between peers and the content they manage, so flooding searches may fail to locate a peer with the required data. Popular content tends to be available at almost every peer, whereas rare data may be held by only a few users on the network and is therefore unlikely to be found by flooding (Shen, Yu, Buford and Akon 118).
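To make the cost of flooding concrete, the hedged sketch below builds a small random overlay and forwards a query to every neighbour until a time-to-live (TTL) counter expires. The overlay construction, TTL value, and message counting are illustrative assumptions rather than features of any particular protocol.

```python
import random

def build_random_overlay(num_peers, degree=3, seed=0):
    """Unstructured overlay: every peer links to a few random neighbours."""
    rng = random.Random(seed)
    overlay = {p: set() for p in range(num_peers)}
    for p in overlay:
        for q in rng.sample([x for x in overlay if x != p], degree):
            overlay[p].add(q)
            overlay[q].add(p)
    return overlay

def flood_search(overlay, content_at, start, wanted, ttl=3):
    """Forward the query to all neighbours until the TTL expires.
    Returns the peers found to hold the item and the number of messages sent."""
    visited, frontier, messages = {start}, [start], 0
    found = {start} if wanted in content_at.get(start, set()) else set()
    for _ in range(ttl):
        next_frontier = []
        for peer in frontier:
            for neighbour in overlay[peer]:
                messages += 1                      # every forwarded query is a message
                if neighbour not in visited:
                    visited.add(neighbour)
                    if wanted in content_at.get(neighbour, set()):
                        found.add(neighbour)
                    next_frontier.append(neighbour)
        frontier = next_frontier
    return found, messages

# A rare file held by a single peer may or may not be reached before the TTL expires.
overlay = build_random_overlay(50)
print(flood_search(overlay, {37: {"rare-file"}}, start=0, wanted="rare-file"))
```

Running the search with different seeds illustrates the trade-off described above: widely replicated items are found within a hop or two, while a rare item held by one peer can remain unfound even after dozens of messages.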

Structured networks

In a structured system, the P2P network follows a specific topology, which allows nodes to locate even extremely rare files efficiently. The distributed hash table (DHT) has been the most common structured P2P model: it uses a variant of consistent hashing to assign ownership of each file to a particular peer (Trunfio 837).

Peers can therefore query the network by relying on a “hash table (key, value) located within the DHT and in any node that can take part in the search and provide the required value linked to a specific key” (Trunfio 837). To route traffic effectively, each node must maintain a list of neighbouring nodes that satisfy specific criteria. Keeping this routing state accurate is costly when membership changes quickly, which is why structured P2P networks are not efficient under rapid churn.
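As a rough illustration of this key-to-node mapping, the sketch below hashes both node names and content keys onto the same identifier ring and assigns each key to the first node whose identifier follows it. It is a toy version of consistent hashing, not a full DHT protocol such as Chord or CAN, and every name in it is hypothetical.

```python
import bisect
import hashlib

def ring_hash(value):
    """Map a string onto a 32-bit identifier ring."""
    return int(hashlib.sha1(value.encode()).hexdigest(), 16) % 2**32

class ToyDHT:
    """Each key is owned by the first node whose hashed identifier
    is greater than or equal to the hashed key (wrapping around)."""

    def __init__(self, node_names):
        self.ring = sorted((ring_hash(name), name) for name in node_names)

    def node_for(self, key):
        position = bisect.bisect_left(self.ring, (ring_hash(key),))
        return self.ring[position % len(self.ring)][1]

dht = ToyDHT(["peer-a", "peer-b", "peer-c", "peer-d"])
print(dht.node_for("song.mp3"))   # the same key always resolves to the same peer
```

Because only the owner of a key needs to be contacted, lookups avoid flooding; the price is the per-node routing state that must be kept accurate, which is exactly what rapid churn makes expensive.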

An assessment of P2P platforms subjected to real workloads identified several challenges with DHT-based solutions, including high resource and resource-discovery costs as well as load imbalance (Bandara and Jayasumana 634).

Consequently, Bandara and Jayasumana concluded that “real-world queries are relatively easier to resolve using unstructured, super peer and single-attribute dominated query based structured P2P solutions” (634). This implies that many current designs operate well only under specific conditions and that their performance typically deteriorates under realistic workloads.

Hybrid models

In P2P networks, hybrid models combine “peer-to-peer and client-server computing approaches” (Vu et al. 171). The most widespread hybrid model uses a “central server that assists peers to locate other peers within the network” (Shen et al. 118). Many hybrid designs exist, but they all pair the centralized functionality of a server with the equal, self-organizing nodes of pure unstructured peer-to-peer networks.

Hybrid models generally perform better than either pure unstructured or pure structured networks: the centralized component improves search, while the decentralized aggregation of nodes found in unstructured networks is retained.
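A minimal sketch of that hybrid pattern, with all class and method names invented for illustration: the central index only answers the question of which peers hold an item, while the content itself is exchanged directly between peers.

```python
class IndexServer:
    """Centralized component: maps content names to the peers holding them."""

    def __init__(self):
        self.index = {}                               # name -> set of peer ids

    def register(self, peer_id, names):
        for name in names:
            self.index.setdefault(name, set()).add(peer_id)

    def locate(self, name):
        return self.index.get(name, set())

class HybridPeer:
    """Decentralized component: peers exchange content directly once located."""

    def __init__(self, peer_id, files, index_server):
        self.peer_id, self.files, self.index = peer_id, files, index_server
        index_server.register(peer_id, files)

    def download(self, name, peers):
        for source in self.index.locate(name):        # ask the central index
            if source != self.peer_id:
                return peers[source].files[name]      # then transfer peer to peer
        return None

index = IndexServer()
peers = {"a": HybridPeer("a", {"doc.pdf": b"contents"}, index),
         "b": HybridPeer("b", {}, index)}
print(peers["b"].download("doc.pdf", peers))
```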


Peer-to-Peer Computing Models

Farago-Walker identified three distinct P2P computing models: the multiple peer relationship, the distributed peer relationship, and the collaborative peer relationship (1). In a multiple peer relationship, computers are “networked to each other via servers” (Farago-Walker 1), and users can share files with, or gather them from, other users on the network. One major challenge of the multiple peer relationship is security: the system is prone to breaches and can raise intellectual property issues.

In a distributed peer relationship, many computers are networked to pool their processing and computing capabilities; the combined capacity supports large-scale Internet activity and complex problems that demand massive processing speed and reliability, as sketched below. Finally, in a collaborative peer relationship, a small number of users agree to work together through a common user interface such as online tools, chat, messaging platforms, or e-learning tools.
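The distributed peer relationship can be pictured as splitting one large job into chunks that otherwise idle machines process in parallel. In the hedged sketch below, a local process pool merely stands in for networked peers, and the task (counting primes in a range) is an arbitrary example.

```python
from concurrent.futures import ProcessPoolExecutor

def count_primes(bounds):
    """One peer's share of the work: count primes in [low, high)."""
    low, high = bounds
    def is_prime(n):
        return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))
    return sum(1 for n in range(low, high) if is_prime(n))

def distribute(limit, num_peers=4):
    """Split the range across 'peers' and aggregate their partial results."""
    step = limit // num_peers
    chunks = [(i * step, (i + 1) * step) for i in range(num_peers)]
    with ProcessPoolExecutor(max_workers=num_peers) as pool:
        return sum(pool.map(count_primes, chunks))

if __name__ == "__main__":
    print(distribute(100_000))     # each "peer" handled a quarter of the range
```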

Decentralization

Decentralization is a major feature of peer-to-peer computing, and it influences the design of applications and systems. A system developer must account for security, data structures, algorithms, scalability, availability, and user interaction (Milojicic et al. 20). For instance, participants must accept that other users are remotely located and may disengage at any time, so designers have to rely on strategies suited to decentralized systems.

Distributed P2P systems must account for the decentralization and collaboration needed for diverse users, authorization, central control and data management (Silberschatz, Galvin and Gagne 741).

The ad hoc nature of P2P applications also influences system development. Users’ perception of the system is critical in ad hoc environments: users can disappear at any time, and this possibility shapes how the system is designed.

Security issues have remained major obstacles in P2P computing, and anonymity can also be an important factor in certain locations and applications. Nevertheless, “scalability, enhanced performance, resilience and interoperability” (Milojicic et al. 20) have demonstrated the relevance of P2P computing in distributed systems.

P2P computing promotes content sharing, which makes data available to various communities of users. Data security and system performance are therefore paramount for many users, who at the same time also want ease of use and anonymity in P2P computing.

P2P Markets

Different market segments have shown interest in P2P computing. Individual consumers use P2P computing for personal purposes (Farago-Walker 3); Napster, for instance, popularized P2P music sharing.

There are also organizations, such as financial and biotech firms and other business-to-business users, that require P2P computing; Data Synapse, for example, builds on P2P computing. Finally, there is a public class of users who focus on information sharing and the digitization of content.

Significance of Peer-to-Peer Computing

P2P computing is based on the idea of a widely connected world of computers in which users do not have to rely on a central system to process their requests. It shows how administrators and users can pool available computing power, storage, and connectivity to distribute work across the system.

P2P computing has transformed data sharing and communication. Today, users can share large files almost instantly, receive immediate feedback, and communicate at lower cost.

This computing model has also changed collaboration among communities of users. For instance, it has removed the need for physical contact between people and facilitated collaboration among groups in widely dispersed locations. Users can collect and distribute their contributions through the system for further evaluation and additional suggestions; distance is no longer a barrier between communities of users.

However, users only realize the effectiveness of P2P computing when they collaborate, which requires learning its technical aspects. Such collaboration among communities of users is critical to the development of P2P models, and the technology reaches its full potential only when large numbers of nodes are connected and have resources to share.

Current practice in P2P networks has not encouraged users to contribute. Many users rely on the resources of other nodes while failing to contribute their own, a behaviour known as the freeloader problem. Freeloading has significantly weakened some systems and rendered some sites irrelevant.
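One common way to discourage freeloading, loosely inspired by BitTorrent's tit-for-tat policy, is for each peer to track how much every neighbour has uploaded versus downloaded and to deprioritize neighbours that only consume. The bookkeeping and threshold below are illustrative assumptions, not part of any specific protocol.

```python
class ContributionTracker:
    """Tracks bytes exchanged with each neighbour and flags likely freeloaders."""

    def __init__(self, min_ratio=0.25):
        self.sent_to = {}           # bytes we uploaded to each neighbour
        self.received_from = {}     # bytes each neighbour uploaded to us
        self.min_ratio = min_ratio  # below this give/take ratio a peer is deprioritized

    def record(self, peer, sent=0, received=0):
        self.sent_to[peer] = self.sent_to.get(peer, 0) + sent
        self.received_from[peer] = self.received_from.get(peer, 0) + received

    def is_freeloader(self, peer):
        taken = self.sent_to.get(peer, 0)
        given = self.received_from.get(peer, 0)
        return taken > 0 and given / taken < self.min_ratio
```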

Some P2P networks, such as Freenet, have focused on user privacy. They ensure that content is protected and that users’ identities and locations remain private within the network (Vu et al. 179).

P2P systems have also featured prominently in network neutrality disputes. Many Internet service providers (ISPs) object to P2P traffic because of the high bandwidth these computing models consume: compared with Web browsing, e-mail, or exchanging small files, distributing large files to whole communities of users places a heavy load on the network.

Current research

Previously, researchers identified commercial applications of P2P computing for individuals, enterprises, and the public. Current work uses computer simulation to improve the understanding and assessment of the complex characteristics users exhibit when interacting with one another. Such networking research must produce valid and reliable results that can drive further study.

Conclusion

P2P computing comprises classes of computer systems that rely on distributed resources and leverage available network capacity to share content among communities of users. Many users prefer P2P computing because of its scalability, interoperability, fault resilience, distributed costs, and availability, among other benefits. However, system security and vulnerability remain major challenges for P2P networks, and a lack of collaboration among communities of users has led to the collapse of some sites.

Overall, peer-to-peer computing has become critical for users who want to connect directly to one another without relying on a central server. It has transformed information sharing across the Internet and enhanced collaboration among communities of users. Because this computing model gives users greater control, the outcomes of that control can also be uncertain.

Works Cited

Bandara, Dilum and Anura Jayasumana. “Evaluation of P2P Resource Discovery Architectures Using Real-Life Multi-Attribute Resource and Query Characteristics.” 2012 IEEE Consumer Communications and Networking Conference (CCNC). Las Vegas, NV: IEEE Communications Society, 2012. 634-639. Print.

Farago-Walker, Susan. Peer-to-Peer Computing: Overview, Significance and Impact, E-learning, and Future Trends. n.d. Web. 28 Oct. 2014.

Li, Jin. “On peer-to-peer (P2P) content delivery.” Peer-to-Peer Networking and Applications 1.1 (2008): 45–63. Print.

Milojicic, Dejan, Vana Kalogeraki, Rajan Lukose, Kiran Nagaraja, Jim Pruyne, Bruno Richard, Sami Rollins and Zhichen Xu. Peer-to-Peer Computing. Santa Barbara, CA: Hewlett-Packard Company, 2002. Print.

Shen, Xuemin, Heather Yu, John Buford and Mursalin Akon. Handbook of Peer-to-Peer Networking. New York: Springer, 2009. Print.

Silberschatz, Avi, Peter Baer Galvin and Greg Gagne. Operating System Concepts: With Java. 9th ed. New York: John Wiley & Sons, Inc., 2012. Print.

Stallings, William. Operating Systems: Internals and Design Principles. 5th ed. Upper Saddle River, NJ: Prentice Hall, 2005. Print.

Trunfio, Paolo. “Peer-to-Peer resource discovery in Grids: Models and systems.” Future Generation Computer Systems 23.7 (2007): 837. Print.

Vu, Quang, Mihai Lupu and Beng Ooi. Peer-to-Peer Computing: Principles and Applications. New York: Springer, 2010. Print.
