Zuckerberg in Facebook-Cambridge Analytica Scandal Research Paper

Introduction

By default, society views leaders as problem solvers, innovators, and individuals who take responsibility for their actions and those of their employees, regardless of whether the consequences are good or bad. Ethical leadership is a construct of leadership that is guided by virtuous behaviour and moral development (Othman & Rahman, 2014). Therefore, ethical leaders tend to uphold a more universal standard of moral conduct.

Ethical leadership usually reveals itself in times of corporate scandal, when a company’s stakeholders and the public question the morals of its managers. In today’s transparent, social-media-driven world, high-profile executives are occasionally summoned to explain the ethics of how they do business. A recent example is the Facebook-Cambridge Analytica data scandal, in which the founder and CEO of Facebook, Mark Zuckerberg, was summoned to explain the data breach. This paper aims to examine how Mark Zuckerberg demonstrated his ethical leadership skills during the Facebook-Cambridge Analytica data scandal.

Description of the Situation

Will technology firms ever take ethics seriously? The Facebook-Cambridge Analytica data scandal is the largest public relations crisis that Facebook has faced (Fuller, 2018). It began in 2014, when Alexander Kogan, a University of Cambridge researcher, developed an application known as “thisisyourdigitallife.” An estimated 270,000 Facebook users downloaded the application and, in doing so, gave away their profile details, including those of their Facebook friends. As a result, Kogan gained access to the details of about 50 million people (Graham-Harrison & Cadwalladr, 2018).

Kogan shared this information with Cambridge Analytica, a data mining and political strategy company, without the users’ knowledge. Cambridge Analytica’s idea was to glean people’s Facebook likes to build a better understanding of their personality (hobbies, comments, gender, sexual orientation, political views, and religion) and thereby target political advertising more effectively. This technique is known as psychographic profiling (Fuller, 2018). The psychographic data obtained from these analyses was later integrated into Donald Trump’s presidential campaign.
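
In broad strokes, psychographic profiling of this kind treats each user’s page likes as features and fits a model that predicts a trait used for ad targeting. The following is a minimal, hypothetical sketch in Python; the data, trait labels, and model choice are assumptions for illustration only and do not represent Cambridge Analytica’s actual system.

    # Hypothetical illustration of like-based psychographic profiling; not
    # Cambridge Analytica's actual code. Rows are users, columns are pages
    # (1 = liked), and the label is a self-reported trait such as political
    # leaning.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    likes = np.array([
        [1, 0, 1, 0, 1],
        [0, 1, 0, 1, 0],
        [1, 1, 1, 0, 0],
        [0, 0, 1, 1, 1],
    ])
    trait = np.array([1, 0, 1, 0])  # toy labels for two trait classes

    model = LogisticRegression().fit(likes, trait)

    # Score a new user's likes to decide which ad variant to show them.
    new_user = np.array([[1, 0, 1, 1, 0]])
    print(model.predict_proba(new_user))

In practice, such models were reportedly trained on survey responses collected through the app and then extrapolated to the much larger pool of harvested friend profiles.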

In 2015, the Guardian first reported on the harvesting scheme, and in 2018 Christopher Wylie, a former Cambridge Analytica employee turned whistleblower, submitted detailed evidence to the New York Times and the Guardian (Graham-Harrison & Cadwalladr, 2018). After the scheme was publicized, Facebook approached Alexander Kogan and Cambridge Analytica and requested that they delete the improperly acquired information. However, the parties never deleted the data, and Facebook never followed up to ascertain whether they had done so. This reflects how easy Facebook made it for app developers like Kogan to gain access to private user profile data and even share it. Moreover, by 2018, Facebook had still not informed users of the breach, despite knowing that Cambridge Analytica had held the data for more than two years. As a result, Facebook violated several principles of technology ethics, most notably by breaking the terms and conditions it holds with its users.

Background of the Ethical Problem

Facebook users often prefer signing into other websites and applications with the Facebook login tool rather than through those sites’ own registration forms, because the Facebook option is less cumbersome. What remains unknown to many users is that, by using the Facebook login option, they hand over some of their personal information, such as religion, political preferences, comments, email addresses, and locations, to these other websites.
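
To make the mechanism concrete, a third-party app typically builds a Facebook Login (OAuth) URL that names the permissions it wants. The sketch below is illustrative only; the app ID, redirect URI, and the exact set of permission scopes are placeholder assumptions.

    # Illustrative sketch of a Facebook Login (OAuth) authorization URL; the
    # client_id, redirect_uri, and scope values are placeholders, not a real app.
    from urllib.parse import urlencode

    params = {
        "client_id": "APP_ID_PLACEHOLDER",
        "redirect_uri": "https://example-app.com/auth/callback",
        "response_type": "code",
        # Each scope is a category of personal data the user agrees to share.
        "scope": "email,user_likes,user_friends",
    }
    login_url = "https://www.facebook.com/dialog/oauth?" + urlencode(params)
    print(login_url)

Once the user approves the resulting dialog, the app receives a token that unlocks every category of data listed in the scope parameter.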

In 2014, the University of Cambridge researcher Alexander Kogan created the application “thisisyourdigitallife,” which relied on Facebook’s Application Programming Interface (API); at the time, the API let developers such as Kogan access user profiles (Graham-Harrison & Cadwalladr, 2018). The app mined the Facebook profile data of its registered users and their friends, thereby gathering information on more than 50 million Facebook users. Facebook had not yet revised its privacy rules, so software developers were still allowed such broad access to user information. Mark Zuckerberg’s apologetic statement evidenced this gap in Facebook’s policies. In a statement quoted by the Guardian, he said:

“We have a responsibility to protect your data, and if we can’t, then we don’t deserve to serve you” (Solon & Helmore, 2018).
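
The data-access gap described above can be sketched roughly as follows: once one user authorized the app, the pre-2015 Graph API also exposed data about that user’s friends. The endpoints and fields below are a simplified illustration of that legacy behaviour, which Facebook has since removed; the access token is a placeholder.

    # Simplified illustration of legacy (pre-2015) Graph API access; the token
    # is a placeholder and this call pattern no longer works today.
    import requests

    ACCESS_TOKEN = "token-granted-by-one-consenting-user"

    # Profile details of the user who actually installed the app.
    me = requests.get(
        "https://graph.facebook.com/v1.0/me",
        params={"access_token": ACCESS_TOKEN, "fields": "id,name,likes"},
    ).json()

    # Under the old API, the same token could also list the user's friends,
    # which is how one consenting user exposed data about many non-consenting ones.
    friends = requests.get(
        "https://graph.facebook.com/v1.0/me/friends",
        params={"access_token": ACCESS_TOKEN},
    ).json()

    print(len(friends.get("data", [])), "friends exposed by a single login")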

Without due diligence, Kogan, the developer of the “thisisyourdigitallife” app, shared the data with Cambridge Analytica, violating Facebook’s terms and conditions. Thereafter, Cambridge Analytica, as a political strategy company, used the acquired data to aid Trump’s presidential campaign. Christopher Wylie, a former Cambridge Analytica employee, told the Guardian:

“We exploited Facebook to harvest millions of people’s profiles. And built models to exploit what we knew about them and target their inner demons. That was the basis the entire company was built on” (Graham-Harrison & Cadwalladr, 2018).

Moreover, the New York Times verified that copies of the user data held by Cambridge Analytica could still be found online. Although the harvesting began in 2014, Facebook did not take responsibility until the scandal became a global outrage in 2018.

As a consequence of this negligence, angered Facebook users deleted their accounts en masse, a movement supported by influential figures such as Brian Acton, the co-founder of WhatsApp. A hashtag trend, #DeleteFacebook, emerged and exacerbated the deletions, and Facebook’s stock price also declined. To address the problem, Mark Zuckerberg went on CNN and, in his interview, mentioned that Facebook had already revised some of its rules. He said:

“We also made mistakes, there’s more to do, and we need to step up and do it” (Solon & Helmore, 2018).

Solutions to the Problem

Mark Zuckerberg proposed several measures to mitigate and, ideally, eliminate the problem. After five days of silence following the public revelation of the scandal, he began by taking full responsibility and apologizing to Facebook users, stakeholders, and the community at large (Solon & Helmore, 2018). Secondly, he took steps to ensure that breaches similar to Cambridge Analytica’s would not recur.

As of 2014, Facebook had already put policies in place that barred developers from gaining broad access to user information (Solon & Helmore, 2018). However, after the 2018 scandal, the company went a step further to strengthen those policies. For instance, if a user goes three months without using an application linked to Facebook, the company will cut off the developer’s access to any data about that person, as illustrated in the sketch below. Thirdly, developers who had access to large amounts of user data before the 2014 changes will be asked to submit to an audit or be removed from the platform.
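
The three-month inactivity rule can be expressed as a simple check. The sketch below is a hypothetical illustration of the policy’s logic, not Facebook’s actual internal code; the function name and data model are assumptions.

    # Hypothetical illustration of the three-month inactivity rule; not
    # Facebook's actual implementation.
    from datetime import datetime, timedelta
    from typing import Optional

    INACTIVITY_CUTOFF = timedelta(days=90)  # roughly three months

    def should_revoke_app_access(last_app_use: datetime,
                                 now: Optional[datetime] = None) -> bool:
        """Return True if the developer's access to this user's data should end."""
        now = now or datetime.utcnow()
        return now - last_app_use > INACTIVITY_CUTOFF

    # Example: a user who last opened the app four months ago loses data access.
    print(should_revoke_app_access(datetime.utcnow() - timedelta(days=120)))  # True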

Fourth, Cambridge Analytica was requested to delete all the data it had collected from the “thisisyourdigitallife” app, subject to a thorough audit by cybercrime experts. Fifth, Facebook announced changes to its privacy settings that enable users to delete any data collected across Facebook’s network. Although these settings already existed, the company wanted to remind users of the control they still had over their data. Last but not least, Facebook stated that it would open a public archive containing all advertisements associated with political labels.

The archive will show the public the amount of money spent on each advertisement, demographic data about the audience reached, and the number of impressions the advertisement attracted. With these solutions created and implemented, the question that arises is whether the revisions will successfully prevent a similar case from occurring in the future. On one hand, they are a start: restricting developer access to user data could help Facebook begin to rebuild trust. On the other hand, technology is dynamic and ever-evolving; the solutions may address the current situation without having a long-term effect.
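
As a rough illustration of what one entry in the political-ad archive described above might record, the sketch below defines a minimal data structure. The field names and value types are assumptions for illustration, not Facebook’s actual schema.

    # Hypothetical record for a political-ad archive entry; field names and
    # types are illustrative assumptions, not Facebook's real data model.
    from dataclasses import dataclass
    from typing import Dict, Tuple

    @dataclass
    class PoliticalAdRecord:
        advertiser: str
        spend_usd_range: Tuple[int, int]          # e.g., (1_000, 5_000)
        impressions_range: Tuple[int, int]        # e.g., (10_000, 50_000)
        audience_demographics: Dict[str, float]   # share of impressions by group

    record = PoliticalAdRecord(
        advertiser="Example Campaign Committee",
        spend_usd_range=(1_000, 5_000),
        impressions_range=(10_000, 50_000),
        audience_demographics={"18-24": 0.20, "25-34": 0.35, "35+": 0.45},
    )
    print(record.advertiser, record.spend_usd_range)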

Evaluating the Response to the Solution

Although Facebook has faced many data privacy issues, the way Mark Zuckerberg handled the Facebook-Cambridge Analytica scandal shows him acting as an ethical leader. Among the behaviours associated with ethical leadership, the first he displayed was taking responsibility: he acknowledged that Facebook was at fault, apologized, and took full responsibility for the company’s unethical actions. Secondly, in his public apology, he portrayed humility, an attitude of wisdom, and a willingness to learn.

The third attribute is honesty and straightforwardness. In addition to his apology, Zuckerberg gave a direct and clear plan for how Facebook would fix the problem at hand. Most of Zuckerberg’s solutions were enacted only recently; therefore, their overall impact cannot yet be effectively evaluated. Nevertheless, Facebook has so far become more stringent about data privacy, and no related concerns have arisen. Even so, because technology is always developing, the likelihood of another privacy breach cannot be ruled out.

References

Fuller, M. (2018). Big data and Facebook scandal: Issues and responses. Theology, 122(1), 14-21.

Graham-Harrison, E., & Cadwalladr, C. (2018). Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach. The Guardian. Web.

Othman, Z., & Rahman, R. (2014). Attributes of ethical leadership in leading good governance. International Journal of Business and Society, 15(2), 359-372.

Solon, O., & Helmore, E. (2018). The Guardian. Web.
