Initial thoughts on Facebook & data privacy
While most of us may not actively consider how much personal data and information we share on a daily basis, app developers and tech companies such as Facebook can track your activities and sell your user data to third-party companies that use it for advertising and other purposes. Data breaches are becoming more common, and the problem extends beyond social media sites like Facebook. The CEOs of major tech companies, including Mark Zuckerberg, have testified before Congress this year and been asked tough questions about their data privacy practices, including the recent breach that exposed the personal information of millions of users.
While the CEOs and EVPs of these companies are on the Hill testifying, we asked ourselves, “Where are the managers of these companies, and what responsibility do they share in the decision-making and practices of the business?”
We strongly feel that Facebook managers are complicit in these poor business practices. The first HBSP article describes a spectrum from silent complicity to “aiding and abetting,” and we believe Facebook falls in the latter category. Recently published internal communications between Facebook managers show clear actions taken to sell user data to third parties, with Mark Zuckerberg becoming involved only when competitors were concerned. This means that Facebook managers were making deliberate decisions about the security controls that may have allowed groups like Cambridge Analytica to exploit user data. We initially considered Facebook merely culpable for its actions, but the recent exposure of its inner workings traces a clear path beyond culpability to complicity in selling and sharing user data.
As tech firms are exposed for their actions and the extent to which they affect society locally and globally, we think it’s essential for managers to consider an ethical framework for decision-making as they lead their teams (e.g., product development). Are these managers at Facebook comfortable with these practices, and are they respecting other people’s right to privacy?
In the end, it is often the firm that takes the risk for the economic reward, but third parties that suffer the unintended or undesirable consequences. This framework gets at the heart of what risks managers perceive and how risk averse they are toward their actions. To avoid relying on intuition alone about such consequences, this first question should prompt managers to create an exhaustive list of stakeholders and, for each constituent, identify the likely short-term and long-term impact of their action. Immediate impact may be predictable, but secondary and tertiary effects may be overlooked, especially where technology is involved. Often, the speed at which technology affects a community or society at large is realized only in hindsight (i.e., unintended consequences), and the damage can be difficult to revert.
When considering communities, we believe managers would do well to be mindful of the psychological aspect. Defining a “psychological community” allows us to identify the trust that consumers form with the manager’s firm. Internalizing how a manager’s actions can erode this trust can help avoid adverse effects on the relationship between the customer and the brand. Furthermore, a manager’s decisions can fail a community by not fulfilling the firm’s role as a trusted service provider.
We anticipate managers at Facebook will consider balancing user behavior against competing opinions. Regarding privacy, “why do I care? I have nothing to hide from anyone” is a common refrain, which might help explain why Facebook continues to grow despite the headlines. This is an underreaction in our opinion, and it underscores the importance for managers of consciously deciding how to weigh feedback. An important factor when considering your community, and the stakeholders within it, is those who no longer use the service or have deleted their accounts but whose data is still accessible and valuable. We stress that managers must understand the distinction between users who are willing participants and implied users.
In considering the four alternative business strategies, we felt there are other companies in the tech industry that will take a business-as-bystander approach, where managers see change as beyond their sphere of control and outside the scope of their responsibility. This stance is problematic because it will lead to those companies being affected by data privacy regulation without their direct input.
Given pending action by the U.S. government and the overarching need for change, we would want Facebook to take the role of business as activist, and we feel this strong stance would help Facebook inform public policy rather than wait for the national conversation to dictate what is best. Facebook could start by quickly adopting newer standards to meet the growing demands of a more digitally literate user base. Techniques such as anonymization, reducing the amount of centralized processing, and educating users on how their data is being used are among the practices emphasized at other tech giants, namely Apple.
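To make one of these techniques concrete: a minimal sketch, in Python, of how a platform might pseudonymize user identifiers before records are shared with a third party. The field names and the salted-hash approach here are our own illustrative assumptions, not Facebook's actual practice.

```python
import hashlib
import secrets

# Secret salt held by the platform; it must never be shared alongside
# the data, or the pseudonyms could be reversed by brute force.
SALT = secrets.token_bytes(16)

def pseudonymize(user_id: str, salt: bytes = SALT) -> str:
    """Return a salted SHA-256 digest that stands in for the raw ID."""
    return hashlib.sha256(salt + user_id.encode("utf-8")).hexdigest()

# Hypothetical record: the raw identifier is replaced before sharing,
# while non-identifying fields pass through unchanged.
record = {"user_id": "fb_1234567", "ad_clicks": 12}
shared = {**record, "user_id": pseudonymize(record["user_id"])}
```

Because the same input always maps to the same digest, a partner can still join records across datasets without ever seeing the raw identifier; rotating the salt per partner would prevent cross-partner linkage.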
Facebook makes an incredible amount of money from its advertising and by allowing third parties to use Facebook authentication, so managers would potentially need to sacrifice a portion of those economic benefits to ensure better business practices. This is, of course, easier said than done, given there are billions that could be left on the table. In the previously mentioned emails released in the United Kingdom, Facebook managers and Mark Zuckerberg discussed some of their privacy practices internally, including treating user data as a “commodity” and directly managing which companies could access personal information at any given point in time. This also raises a multitude of legal questions beyond ethics, where users may have to take a closer look at what they consent to when signing up for Facebook. The agreements are usually dozens of pages, and most users do not read the language before consenting. We felt that, legally, this is concerning: users do not fully understand what they signed up to share and may not recognize that their data is sold to third parties.
We both have strong opinions concerning user privacy, informed by heavy technology use across multiple platforms. We feel this would directly affect us as managers at Facebook, and we would actively try to remedy these situations, seeking alternative ways to achieve profitability at scale. Being uncomfortable with the status quo at Facebook, we would take a more proactive approach and look to be at the forefront of change as opposed to being dragged along later in the process. We would explore ways for our peers in the data privacy and advertising departments to combine forces and create safer practices around sharing data and storing personal information that is vulnerable to attack. The overall implication may be revenue loss in the short term, but long-term benefits such as a stronger reputation and more control over the firm’s practices may outweigh the immediate costs.
While Milton Friedman argued that companies should have only the goal of maximizing profits, free of deception and fraud, we both believe that if we were managers at Facebook, we would need to go beyond this framework and explore how to change the company’s business practices and indeed be more ethical.
Co-authored with Brooke Watson.
Sources used:
HBSP Article: Responsibilities to Society
HBSP Article: A Framework for Ethical Reasoning