// Today’s post is an essay I wrote for a professional practice class. The assignment was to examine a single tenet of ASET’s Code of Ethics using a real-world example of an ethical issue. The assignment called for two pages – this ended up at 4½ pages. //
Engineering Technologists are responsible for a vast number of the everyday workflows and processes that keep our society functioning smoothly. The information technology, electricity, telecommunications, construction, and health and medical sectors in Alberta rely on technologists to deliver professional, reliable and adequate services by “exercising independent judgement to solve problems in complex technological areas with limited direct supervision” (ASET, n.d.). Furthermore, they are called to do so within an ethical framework codified by the Association of Science and Engineering Technology Professionals of Alberta (ASET). This report will examine the realities and implications of one of the ten tenets of the ASET Code of Ethics through a concrete real-world scenario illustrating the subtleties of making ethical decisions as a technology professional – specifically, the code’s injunction that “Members of ASET shall hold paramount the safety, health and welfare of the public, the protection of the environment and the promotion of health and safety within the workplace” (ASET, n.d.). This report will discuss the ethical questions and challenges surrounding Facebook, Inc.’s decision to subject its users to multiple social experiments measuring the effects of social climate on an individual’s behaviour, especially in relation to the above tenet (Bond, 2012; Booth, 2014; Kramer, 2014).
In examining Facebook’s social experiments, as with any ethical issue, it is important to have a framework in which the discussion may be structured. For simplicity and brevity, we will refer to Richard O. Mason’s seminal article “Applying Ethics to Information Technology Issues” (Mason, 1995) to illuminate our examination. Mason puts forth the fundamental principle that decision-making agents (individuals and groups thereof) affect the ability of other agents to pursue their own goals, and, critically, that technology amplifies this ability. When an event takes place that raises the possibility of this happening, it is an ethical issue. Mason prescribes ‘ethical lenses’ through which to see clearly both the facts of the situation and the ideal future point at which the ethical issue is resolved. He broadly defines four historical categories of ethical theory, each with its own priorities and methods of weighing values. Summarized, the four are: a focus on the agent’s duty to act rightly, respecting the autonomy of others; the pursuit of happiness, in which the greatest good should be done for the greatest number; the pursuit of specific virtues, in which “courage, prudence, temperance, justice … faith, hope, charity … industry, honesty and trustworthiness” are to be integrated into the agent’s actions; and finally, the pursuit of justice, in which fairness in opportunities, duties, goods, privileges and punishments is held to be the right of all agents. Mason concludes with four critical questions that must be asked and answered, using one or more of the four ethical theories, before a decision is made to resolve the ethical issue at hand: Who should decide? Who should benefit? How should the decision be made and carried out? How can the issue be prevented from arising in the future? See the full text for a deeper discussion of all the above points. With this foundation, we can now examine the situation Facebook found itself in recently.
In June 2014, sociologists and psychologists from Cornell University and the University of California published an academic study, co-authored with Facebook, in which 689,000 users were shown modified versions of their news feed – “the flow of comments, videos, pictures and web links posted by other people” (Booth, 2014) – from which posts containing either positive or negative keywords had been selectively filtered out. This was done to observe the effect on the overall positivity or negativity of each user’s posts and activity on Facebook. Users were not explicitly informed of their inclusion before, during or after the experiment, leading to our first ethical issue: did Facebook hold paramount the welfare of the public in pursuing this study? It is generally held that a key component of any modern behavioural study on human participants is a debriefing, during which the experimenters disclose the details and extent of the study to the subjects, giving closure to the subjects and allowing for questions between the parties (Gilston, n.d.). Facebook and its fellow researchers did not inform the subjects (users) at any point that they had been subject to potential emotional manipulation, leading to an initial conclusion of unethical behaviour. However, complicating matters is the ethical lens holding that pursuit of the greatest good for the greatest number may be more valuable than avoiding harm to the few. In our study’s case, the authors concluded that
“Emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks.”
– (Kramer, 2014)
The study’s findings reveal a new piece of knowledge about the behavioural patterns of people in large electronic social networks, which is a valuable contribution to the fields of sociology and psychology. Indeed, this evidence represents a great amount of good for a great number of people, at the potential expense of the few (even if the ‘few’ numbered 689,000).
Richard Mason’s framework puts forward two questions relevant to the study: Who should benefit? How can the ethical issue be prevented from arising in the future? The immediate beneficiaries of this study are obvious: Facebook has gathered data that will help the company tailor its media structure to take advantage of emotional contagion, while the academic researchers have garnered a published paper with perhaps one of the largest sample sizes in academic history. The more important question is whether these findings benefit a larger part of society – a nebulous question suited to debate and uncertainty, with no obvious answers. Certainly there are claims that Facebook’s actions were harmful, with a British Member of Parliament going so far as to ask for a parliamentary investigation, saying, “If people are being thought-controlled in this kind of way there needs to be protection and they at least need to know about it” (Booth, 2014). It is clear that an ethical issue arose, that decisions were made regarding the study’s execution, and that it was widely perceived as presenting a conflict of interest. Mason’s second question then demands to be asked: how can this issue be prevented in the future?
Radiolab, a radio programme produced by WNYC in New York City, broadcast an interview with Facebook Trust Engineer Arturo Bejar, who is part of the engineering team responsible for tinkering with users’ experience to modify their behaviour (Bejar, 2015). Examples include the specific wording used when asking other users to remove embarrassing photos, prompts for users to post a status, and so forth. Most relevant to our question is the scale that Bejar revealed to the Radiolab hosts. According to Bejar, who was an active Facebook engineer as of the interview date, Facebook is constantly churning out user data from behavioural experiments in which parts of a user’s experience inside the platform are tweaked and their subsequent behaviour measured, collected, and interpreted. Bejar stated that these experiments had been ongoing for quite some time (with the cooperation and inclusion of many academic sociologists, psychologists and behaviourists helping to interpret the data) before Kramer’s study was published and attracted public notice. Notably, a vast experiment was undertaken during the 2010 United States congressional elections in which 61 million users’ political behaviour (voting) was shown to be modifiable with a simple “I Voted” button displayed alongside which of those users’ friends had clicked the same button (Bond, 2012). Moreover, the studies are so numerous that as of early 2015 (the interview date), each of Facebook’s nearly 1.4 billion users was included, uninformed and without explicit consent, in “at least ten different experiments at a time” (Bejar, 2015). With this information, the question of how to prevent the ethical issues of missing informed consent and study debriefing seems to have been answered by simply disposing of it.
Mason asks, “In making [an] ethical decision, what sort of social transcript do we want to write?” It is clear that Facebook’s leadership have answered this question for themselves. It is also clear that it is a difficult question to answer: vast amounts of valuable data, helping to inform the study of social organization in society and how to modify behaviour for potential good, are generated daily. However, traditional methods of executing behavioural studies are not readily or easily applied to this digital form of experiment, and so ethical agents are forced to make leaps of faith and judgment, scrying the future while balancing present benefits and risks.
To conclude, tackling ethical issues in the post-modern technology age is a complex undertaking. It requires sober thought, many viewpoints, and a certain amount of irreducible uncertainty. We technology professionals must equip ourselves with ethical theories, frameworks, practices and self-evaluations in order to be good stewards of the trust society has placed in us, and we must not underestimate the presence and impact of the ethical underpinnings within which we move daily.
References
ASET. (n.d.). Code of Ethics. Retrieved November 8, 2015, from http://www.aset.ab.ca/About/About-ASET/Protecting-The-Public/Code-of-Ethics.aspx
Bejar, A. (2015, February 9). The Trust Engineers (J. K. Abumrad, Interviewer) [Radio broadcast]. New York, NY: WNYC, Radiolab.
Bond, R. M. (2012). A 61-million-person experiment in social influence and political mobilization. Nature, 489, 295-298. doi:10.1038/nature11421
Booth, R. (2014, June 30). Facebook reveals news feed experiment to control emotions. Retrieved November 8, 2015, from The Guardian: http://www.theguardian.com/technology/2014/jun/29/facebook-users-emotions-news-feeds
Gilston, A. (n.d.). Debriefing in Psychology Research: Definition & Process. Retrieved November 9, 2015, from Study.com: http://study.com/academy/lesson/debriefing-in-psychology-research-definition-process-quiz.html
Kramer, A. D. I. (2014). Experimental evidence of massive-scale emotional contagion through social networks. PNAS, 111(24), 8788-8790. doi:10.1073/pnas.1320040111
Mason, R. (1995, December). Applying ethics to information technology issues. Communications of the ACM, 38(12), 55-57.