Essay Sample on Facebook's Emotional Contagion Study

Published: 2019-05-27

In January 2012, Facebook researchers teamed up with Cornell University to study, over a period of one week, the effect of positive and negative News Feed content on users' emotions. The research, known as the emotional contagion study, drew on a sample of 689,003 Facebook users. In 2014, the study revealed that the content in a user's News Feed influences their emotions and thus affects what they post (Kramer et al., par. 1-17). However, the research team manipulated and controlled the News Feed in users' profiles without their knowledge or consent. This raises ethical issues that have been debated by different researchers. A few scholars consider the study ethical, while most consider it unethical on the grounds of missing consent from the Facebook users; I side with the former. Therefore, in arguing that the study was ethical despite claims to the contrary, this paper will first examine the role of the Facebook algorithm in the social lives of Facebook users.


The Role of the Facebook Algorithm in the Social Lives of Facebook Users

How the Facebook Algorithm Works

The Facebook algorithm controls which information and updates appear on a user's News Feed page. The feature responds to the user's emotions and filters out information that runs contrary to the user's beliefs (Kramer et al., par. 1-17). It does this by tracking the kinds of posts the Facebook user publishes on his or her own page, the posts they like on their friends' pages, and the comments they leave on other users' posts. That is, the visibility Facebook gives a post is determined by the user's interests, the post's creator, the post's performance, the type of post (such as a status update), and how recent the post is. Therefore, the first role that the Facebook algorithm plays in the social lives of users is controlling the information shown to them; users thus end up socializing either negatively or positively based on the emotions reflected in their posts, likes, and comments.
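The ranking logic described above can be illustrated with a toy sketch. All field names, weights, and the scoring formula here are hypothetical illustrations of the general idea (affinity, performance, post type, recency), not Facebook's actual algorithm:

```python
# Toy illustration: a post's visibility score combines the viewer's affinity
# with the creator, the post's performance (likes/comments), its type, and
# how recent it is. Every weight below is an invented placeholder.
from dataclasses import dataclass
import time

@dataclass
class Post:
    creator: str
    likes: int
    comments: int
    post_type: str          # e.g. "status", "photo", "link"
    created_at: float       # Unix timestamp

TYPE_WEIGHT = {"status": 1.0, "photo": 1.5, "link": 0.8}

def score(post: Post, affinity: dict, now: float) -> float:
    """Higher score = shown earlier in the News Feed."""
    a = affinity.get(post.creator, 0.1)            # viewer's interest in creator
    performance = post.likes + 2 * post.comments   # engagement signal
    recency = 1.0 / (1.0 + (now - post.created_at) / 3600.0)  # decays hourly
    return a * performance * TYPE_WEIGHT.get(post.post_type, 1.0) * recency

now = time.time()
feed = [
    Post("alice", likes=10, comments=3, post_type="status", created_at=now - 3600),
    Post("bob", likes=50, comments=0, post_type="link", created_at=now - 86400),
]
ranked = sorted(feed, key=lambda p: score(p, {"alice": 0.9, "bob": 0.2}, now),
                reverse=True)
print([p.creator for p in ranked])  # prints ['alice', 'bob']
```

Even with bob's higher raw engagement, alice's post wins here because the viewer's affinity for her is high and her post is recent, which mirrors the essay's point that the feed reflects the user's own interests back at them.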

Secondly, according to the Facebook researchers, the News Feed algorithm diminishes the ideologically diverse, cross-cutting information people see on their Facebook pages. A user sees few items they hold opinions against and more of the news they support. In fact, according to Christian Sandvig, the algorithm filters out roughly 1 in 20 cross-cutting hard-news stories for conservatives and 1 in 13 for liberals (Kramer et al., par. 1-17). The effect of this on the social lives of users is that information that might be crucial to their general knowledge is withheld from them. Because the algorithm filters information, users miss diverse information that would otherwise have benefited their social lives.

Moreover, the Facebook algorithm determines who will view the posts on a user's page. The system does this by tracking the friends on the user's page. The moment a user posts a status, all of their friends, both followers and those the user follows, are able to see it. Furthermore, if the user has made their account private, the algorithm asks whether the post should be shown to all friends or only to specific ones. On one hand, this is advantageous in that it preserves the privacy the user wants. On the other hand, it is a drawback, since important information the user intended to share publicly may fail to reach everyone. In cases of business information, for instance advertising a new product, this role of the algorithm proves especially disadvantageous.

Based on the 2014 results of the research carried out by the Facebook researchers and Cornell University, the News Feed algorithms were updated to improve the experience of inactive members who have few posts on their pages (Forlani, par. 1-20). Previously, the strict rule was that the more posts a user published, the more of other subscribers' posts appeared in their News Feed. Therefore, despite the study being considered unethical, the improvements made to the algorithm benefit the social lives of inactive Facebook subscribers, since less of their information is filtered.

The Unethical Debate

Following the release of the results of the one-week study the Facebook research team carried out in January 2012, the debate over whether the study was unethical still continues. According to the proponents of the unethical view, the aim, the method, and the secrecy of the study are the main breaches of ethical guidelines. The experiment aimed to determine whether the posts on the News Feed page affect the emotions of the user. The method, which involved a sample of 689,003 Facebook users, manipulated people's emotions by having the Facebook algorithm show predominantly happy content to half the sample and sad or negative content to the other half (Tufekci, par. 1-14). As a result, the first group posted positive, happy comments, while the second group posted negative status updates and commented negatively on the content in their News Feed. Supporters of the unethical view argue, first, that the Facebook Corporation violated its corporate social responsibility not to interfere with the affairs of the society it serves.
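The experimental design described above, splitting the sample in half and filtering each group's feed by sentiment, can be sketched as follows. The word lists, group assignment, and filtering rule are simplified inventions for illustration, not the study's actual procedure:

```python
# Toy sketch of the method described above: randomly split users into two
# groups, suppress positive posts for one group and negative posts for the
# other, then (in the real study) measure the tone of each group's own posts.
import random

POSITIVE = {"happy", "great", "love"}   # hypothetical sentiment lexicons
NEGATIVE = {"sad", "awful", "hate"}

def sentiment(post: str) -> str:
    """Crude word-list sentiment label for a post."""
    words = set(post.lower().split())
    if words & POSITIVE:
        return "positive"
    if words & NEGATIVE:
        return "negative"
    return "neutral"

def filter_feed(feed: list, suppress: str) -> list:
    """Drop posts of the suppressed sentiment from a user's feed."""
    return [p for p in feed if sentiment(p) != suppress]

rng = random.Random(42)
users = list(range(10))                  # stand-in for the 689,003 subjects
rng.shuffle(users)
positivity_reduced = set(users[: len(users) // 2])  # half see fewer happy posts

feed = ["what a great day", "feeling sad today", "lunch was fine"]
for user in users:
    suppress = "positive" if user in positivity_reduced else "negative"
    shown = filter_feed(feed, suppress)
    # The study then compared the sentiment of each group's subsequent posts.
```

The key point the sketch makes concrete is that neither group was shown anything false; each simply saw a sentiment-skewed subset of content that already existed, which is the manipulation the ethics debate centers on.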

Secondly, they opine that Facebook breached the professional ethics it owed to the public by inflicting feelings contrary to what users expected to experience. According to the proponents, this is morally wrong and socially unacceptable, and thus the study is unethical. Concerning secrecy, Facebook's management did not inform the 689,003 sampled users of the intention to use them in the experiment, so no consent was obtained (Tufekci, par. 1-14). This means that the management allowed the researchers to presume consent with no regard for the effect on the consumer. Professionally, the proponents claim, this is unethical. Socially, it degrades the importance of the consumer and the autonomy one has over his or her own feelings.

Following these arguments that the study is unethical, I find the proponents' opinions far-fetched, wanting, and an impediment to the technological development of our social networks. To begin with, the proponents' demand on the consent issue is practically impossible to satisfy. Realistically speaking, hardly anyone consents to serve as a guinea pig for an experiment for free. For the 689,003 people used in the emotional contagion experiment, Facebook's management would have had to seek consent, and if the subjects asked for payment, the company would have been forced to pay. I am not trying to justify the ethics of the project on the basis of money being used as consideration for consent. Of course, Facebook, as the second most popular social corporation, earns on average $23.3 million a day and hence has the money (Forlani, par. 1-20). The problem lies in the complexity of paying the subjects for consent: the process would require contracts, and of course legal fees and the like, and it would consume a great deal of time, yet the main agenda of the research was simply to improve the quality of service given to customers. Is it not ethical to improve the quality of services and products for the good of society?

The aim of the research and the method used are justified by the end results, not undermined by the notion that it is ethically wrong to manipulate the feelings of Facebook users. Supported by the adage that the end justifies the means, and given the improvements to the News Feed algorithms and the enhancement of Facebook's marketing platforms that followed this study, it is incorrect to say that the study is unethical (Forlani, par. 1-20). For instance, the fact that the News Feed algorithms now allow less active users to access a wealth of information in their News Feed despite their inactivity is a positive effect. Further, the study's finding that users posted dissatisfied and sad feelings when they saw their friends' private chats on the public News Feed page led to improvements in that area.

In conclusion, given the positive roles that the News Feed algorithms play, it is incorrect to refer to the one-week 2012 study, whose results were released in 2014, as unethical. It is right to give credit where it is due and to stop raising excuses such as the claim that it is unethical to manipulate people's feelings. The positive results of a project ultimately outweigh its criticisms, and since this is a social network, the question of consent is moot when the consumers themselves serve as the subjects of efforts to improve services for them.

Works cited

Forlani, Christina. 'Three Changes to Facebook's Algorithm'. We Are Social, 2015. Available at <http://wearesocial.net/blog/2015/05/facebooks-algorithm/>. Accessed on 2 Oct. 2015.

Kramer, A. D. I., J. E. Guillory, and J. T. Hancock. 'Experimental Evidence of Massive-Scale Emotional Contagion through Social Networks'. Proceedings of the National Academy of Sciences 111.24 (2014): 8788-8790. Available at <http://www.pnas.org/content/111/24/8788.full#xref-ref-2-1>. Accessed on 2 Oct. 2015.

Tufekci, Zeynep. 'How Facebook's Algorithm Suppresses Content Diversity (Modestly) and How the Newsfeed Rules the Clicks'. The Message, Medium, 2015. Available at <https://medium.com/message/how-facebook-s-algorithm-suppresses-content-diversity-modestly-how-the-newsfeed-rules-the-clicks-b5f8a4bb7bab>. Accessed on 2 Oct. 2015.
