[Academic Essay] Web 2.0 is disconnecting its users
Updated: Feb 15, 2021
This paper analyses the literature on communities and Web 2.0 to examine how social networking sites have distanced internet users from the larger community. It argues that while social networking sites appear to connect people, they are in fact pushing us further away from “reality” in this disinformation age.
With the ease of account creation and access to social networking sites, there has been a massive increase in the number of social media users globally over the past five years, from 2.07 billion in 2015 to 3.80 billion in 2020 (Kemp, 2020). When social networking sites were first created, their main purpose was to “connect” people – words like “connect” and “community” can commonly be found in a quick scan of the “about” pages of different sites. Facebook, which has the highest number of active users, states in its site description that “Facebook builds technologies that give people the power to connect with friends and family, find communities and grow businesses” (Facebook Inc, 2020).
As defined by Boyd and Ellison (2008), social networking sites are online tools that let users set up a public profile and establish connections with other users within a network system. While social media makes it easy for users to form connections, content is mostly user-generated, and users can choose to share only what they want to share. The danger of social media is that since users are allowed to post or say anything they want, it is hard to ensure the authenticity of the information or content that other users see, which may present a misrepresented reality.
In short, this paper will discuss how Web 2.0 is disconnecting people from the community more than it is making connections. In this discussion, media effects theories such as cultivation theory, selective exposure and the spiral of silence will be applied and used to analyse how social networks have pushed us further away from the true reality of society in this disinformation age.
The evolution of the web has been praised for bringing more convenience to people’s lives. However, it should be noted that the usage of social networking sites brings not only utilitarian value, but also hedonic and social values such as self-identity and belongingness. Porter (2015) observed values such as self-identity and belongingness, where participants in an online community are conscious of their membership status and seek satisfaction from the cognitive connections they form within the community.
According to a study by Stoycheff (2016), it was observed on social networking sites that as the gap between a user’s opinion and the presumed majority opinion widens, the likelihood of the user sharing their views online decreases. This observation mirrors the spiral of silence theory, first proposed by researcher Noelle-Neumann, which focuses on the human fear of social isolation. According to Noelle-Neumann (1974), people react or voice opinions based on the environmental pressure they perceive, causing less popular opinions to be silenced.
Although this theory was established before the internet era, it is still applicable to the current online environment. Most social networking sites include functions like “likes” and “comments”, which also serve as social cues on whether an opinion is widely accepted or in line with the “majority”. Apart from being social cues, these functions are also part of the social reward system of approval, where users get “socially valued or rewarding outcomes—approval, acceptance, reciprocity” (Fareri & Delgado, 2014), but only if they express what is accepted by others. Thus, the comments and discussions seen online may not reflect real sentiment in society.
This can be drawn from the example of the 2016 United States presidential election, which Donald Trump won. An observation made by Ohlheiser (2016) during the election period showed that there were considerably more positive comments and sentiments surrounding Hillary Clinton than Donald Trump online. As of 8 November 2016, a quick comparison of online communities showed that a Facebook group supporting Hillary Clinton had 1.5 million members, whereas the Reddit community that was Donald Trump’s largest online support base had only 264,241 subscribers. Nevertheless, Donald Trump won the election with 304 electoral votes to Hillary Clinton’s 227.
During the campaign period, the media referred to online sentiment and widely reported that Hillary Clinton had a higher chance of winning the election. Such an impression led Donald Trump supporters to think that they were the minority, making them less inclined to express their support on social media. Despite the fear of being judged, this group of people still voted for Donald Trump. On the other hand, Hillary Clinton supporters believed they were the majority and assumed that she would win, which caused some supporters not to bother casting their votes (Roulet, 2016).
Selective exposure and algorithms at work
Another reason users are further disconnected from the truth is the development of recommendation algorithms, whereby social network systems repeatedly show users what they want to see – a mechanism closely related to the theory of selective exposure. As internet companies increasingly enhance their websites’ usability and user experience, internet users can now filter what they do and do not want to see. Valentino et al. (2009) suggested that, because it is easier to filter and select information on the internet than in traditional media, motivated selectivity is now more commonly seen among internet users. This is similar to the observation made in selective exposure theory, which describes the “motivated selection of messages matching one’s beliefs” (Stroud, 2014).
Selectively exposing oneself to a specific, similar set of information not only helps the individual reduce cognitive dissonance when facing conflicting information (Festinger, 1957), but also lessens the cognitive effort needed to process the information (Ziemke, 1980). This prompts more internet users to selectively expose themselves to certain information online, as they are now given more choices (Best et al., 2005). Selective retention and selective perception then come into play, which postulate that messages similar to pre-existing attitudes and beliefs are remembered best, and that people tend to interpret messages in accordance with those pre-existing attitudes and beliefs (Baran, 2014).
Algorithms work hand in hand with selective exposure on the internet, which helps accelerate the spread of falsehoods. Messing and Westwood (2012) suggested that platforms like Google and Facebook tend to show users selected content that has a higher chance of being viewed or liked, based on a predictive algorithm. Such systematic occurrences tend to throw users into a “filter bubble” – a “unique universe of information that you live in online” depending on “who you are” and “what you do” – which can skew the balance of information that users are exposed to (Pariser, 2011).
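The reinforcing loop described above can be illustrated with a deliberately simplified sketch. This is a hypothetical toy model, not any platform’s actual ranking system: it scores each feed item by how often the user has previously clicked that item’s topic, so content matching existing interests rises to the top and the feed narrows with every click.

```python
# Toy illustration of engagement-based feed ranking (hypothetical, for
# exposition only): items on topics the user clicked before rank higher,
# producing the narrowing "filter bubble" effect discussed in the text.
from collections import Counter

def rank_feed(items, click_history):
    """Rank items by how many past clicks the user made on each item's topic."""
    topic_clicks = Counter(click_history)  # clicks per topic so far
    return sorted(items, key=lambda item: topic_clicks[item["topic"]],
                  reverse=True)

items = [
    {"id": 1, "topic": "flat-earth"},
    {"id": 2, "topic": "science"},
    {"id": 3, "topic": "flat-earth"},
    {"id": 4, "topic": "politics"},
]
history = ["flat-earth", "flat-earth", "science"]  # user's past clicks

feed = rank_feed(items, history)
print([item["topic"] for item in feed])
# Topics the user engaged with most appear first; each further click on
# those items strengthens their ranking on the next refresh.
```

Because the ranking feeds on the user’s own click history, every interaction makes the next feed more homogeneous – a minimal version of the self-reinforcing selection that Pariser (2011) calls the filter bubble.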
One example where algorithms have accelerated the spread of falsehoods on the internet is the Flat Earth theory, which holds that the earth is flat rather than round. A search on the internet shows that an online community named the Flat Earth Society was set up to gather “flat-earthers”, people who strongly believe that the earth is flat. A survey by Olshansky (2018) of believers in the Flat Earth theory showed that the majority came to accept the theory after watching YouTube videos related to Flat Earth. Olshansky (2018) also concluded that “YouTube played a significant role for introducing and eventually converting many Flat-Earthers”, with those surveyed crediting YouTube as “a reliable and unrestricted source of visual evidence-based material produced by individual truth-seekers”.
An observation of the Flat Earth Society’s website further shows how strongly the community engages in selective exposure. On the forum page of the website, several boards have been created to engage believers in discussion surrounding the theory, along with rules for the community to adhere to. The “Flat Earth community” discussion board states that it is for discussion of “subjects concerning the Flat Earth community”, where users can share “interesting articles, interviews, YouTube videos, etc. regarding the Flat Earth Movement as a whole, but also discussion of recent events which affect the community” (Svarrior, 2018).
Online Fake News
Social networking sites have previously been nicknamed a “Micro-Propaganda Machine – an influence network that can tailor people’s opinions, emotional reactions, and create ‘viral’ sharing episodes around what should be serious or contemplative issues” (Albright, 2016).
Between January 2015 and July 2018, the social networking sites Facebook and Twitter recorded a total of 570 fake news sites and 10,240 fake news stories circulating on their platforms (Allcott et al., 2018), amounting on average to more than 240 pieces of fake news entering circulation per month.
Because fake news tends to be more sensational and interesting than factual news, it tends to spread faster. As analysed by Shu et al. (2018), social media not only attracts a huge number of users but also aids the spread of intentionally false news content due to its affordability, ease of access and ease of disseminating content. With the proliferation of fake news, it is difficult for users to decipher what is real and what is fake, especially when news is often assumed to be a product of the gatekeeping process – the job of “selecting and shaping the small amount of information that becomes news” (Shoemaker, 2008).
However, as fake news becomes more common with the help of social networking sites, public confidence in journalists has dropped, as more people now agree that news “stories are often inaccurate” (Goodnature, 2017).
The circulation of fake news can have different impacts on society: 1) causing a society or country to be divided, and 2) fooling people into believing falsehoods that can become matters of life and death, especially amid the current COVID-19 pandemic. As the Ministry of Law in Singapore noted in a report, studies have shown that “many online falsehoods were aimed at interfering with elections and referenda” or were created for “financial consideration” (Ministry of Law, 2018).
An example is The Online Citizen, a Singapore-based online community that proclaims itself an independent media platform. It currently has more than 181,000 followers on its Facebook page and more than 6,000 subscribers on Telegram. Founded in 2006, The Online Citizen is often seen publishing articles that criticise policies and politicians in Singapore. It is also well known for publishing controversial articles, which have led to correction orders under the Protection from Online Falsehoods and Manipulation Act (POFMA) and involvement in defamation lawsuits.
Johnson (2018) used the “Pizzagate” incident as an example of how the internet helps foster belief. In this incident, a United States gunman was arrested for firing shots in a pizza outlet after coming to believe that the shop was involved in a child sex trafficking ring linked to Hillary Clinton (Johnson, 2018). In an interview after his arrest, he told news outlets that he only began to believe the rumour after going on the internet, and mentioned that “he did not like the term fake news, believing it was meant to diminish stories outside the mainstream media, which he does not completely trust” (Goldman, 2016).
In conclusion, although the introduction of Web 2.0 appears to have forged “closer” proximity among online users, it is instead disconnecting the community on a larger scale. As discussed above, Donald Trump supporters thought that they were the minority and did not dare to show their support publicly online. This shows how herd mentality and the spiral of silence can disconnect internet users from reality, where those with the loudest voices may be wrongly assumed to be the “majority”. What is unknown to this “majority” is that some participants may only show “support” to gain acceptance from the group.
Apart from the spiral of silence, the growth of technology has also advanced algorithms, deepening users’ engagement in selective exposure. The Flat Earth Society example shows how internet users create and join online communities close to their beliefs. It also shows how algorithms deliver content similar to what users have previously viewed, which in turn shapes their perspectives on different issues.
Finally, the cases of The Online Citizen and “Pizzagate” show how Web 2.0 provides online community platforms for people to spread fake news or falsehoods that can harm the cohesion of a society. Such online tools enable misinformation to spread faster and can create communities living in a disinformation fantasy.
Although the development of Web 2.0 also brings benefits to society, the most important response is to strengthen the education of internet users and train their digital literacy skills. Such skills will not only help prevent internet users from believing everything they see on the internet, but will also help create a safer online environment.
This paper was submitted by Yeo Jie Yu from Curtin Singapore. There’s no attempt to edit the submitted paper for this publication.
Albright, J. (2016). The #election2016 micro-propaganda machine. Medium. https://d1gi.medium.com/the-election2016-micro-propaganda-machine-383449cc1fba
Allcott, H., Gentzkow, M., & Yu, C. (2018). Trends in the diffusion of misinformation on social media. Facebook Inc. https://about.fb.com/wp-content/uploads/2018/10/fake-news-trends.pdf
Baran, S.J. (2014). Introduction to mass communication (8th ed). McGraw-Hill.
Best, S.J., Chmielewski, B., & Kruegar, B.S. (2005). Selective Exposure to Online Foreign News during the Conflict with Iraq. Press/ Politics, 10(4), 52-70. https://doi-org.dbgw.lis.curtin.edu.au/10.1177%2F1081180X05281692
Boyd, D.M., & Ellison, N.B. (2008). Social network sites: Definition, history and scholarship. Journal of computer-mediated communication, 13(1), 210-230. https://doi-org.dbgw.lis.curtin.edu.au/10.1111/j.1083-6101.2007.00393.x
Facebook Inc. (n.d.) About Facebook. Facebook Inc. https://about.fb.com/
Fareri, D.S., & Delgado, M.R. (2014). Social rewards and social networks in the human brain. The Neuroscientist, 20(4), 387-402. https://journals-sagepub-com.dbgw.lis.curtin.edu.au/doi/full/10.1177/1073858414521869
Festinger, L. (1957). A theory of cognitive dissonance. Stanford University Press
Goldman, A. (2016, December 7). The Comet Ping Pong Gunman Answers Our Reporter’s Questions. The New York Times. https://www.nytimes.com/2016/12/07/us/edgar-welch-comet-pizza-fake-news.html
Goodnature, T. (2017). Antidemocratic effects of the internet & social media: A survey. Stanford Law School. https://pacscenter.stanford.edu/publication/antidemocratic-effects-of-the-internet-social-media-a-survey/
Johnson, J. (2018). The Self-Radicalization of White Men: “Fake News” and the Affective Networking of Paranoia. Communication, Culture and Critique, 11(1), 100–115. https://doi-org.dbgw.lis.curtin.edu.au/10.1093/ccc/tcx014
Kemp, S. (2020). Special Report: Digital 2020: 3.8 billion people use social media. We Are Social. https://wearesocial.com/blog/2020/01/digital-2020-3-8-billion-people-use-social-media#:~:text=Worldwide%2C%20there%20are%203.80%20billion,percent)%20over%20the%20past%20year.
Messing, S., & Westwood, S. (2012). Selective exposure in the age of social media: Endorsements trump partisan source affiliation when selecting news online. Communication Research, 41(8), 1042-1063. https://journals-sagepub-com.dbgw.lis.curtin.edu.au/doi/pdf/10.1177/0093650212466406
Ministry of Law. (2018). Deliberate Online Falsehoods: Challenges and Implications. National Archives of Singapore. https://www.nas.gov.sg/archivesonline/government_records/Flipviewer/grid_publish/6/6797717d-f25b-11e7-bafc-001a4a5ba61b-06012018Misc.10of2018/web/html5/index.html?launchlogo=tablet/GovernmentRecords_brandingLogo_.png
Noelle-Neumann, E. (1974). The spiral of silence: A theory of public opinion. Journal of Communication, 24(2), 43-51. https://onlinelibrary-wiley-com.dbgw.lis.curtin.edu.au/doi/epdf/10.1111/j.1460-2466.1974.tb00367.x
Ohlheiser, A. (2016, November 8). Inside the huge, ‘secret’ Facebook group for Hillary Clinton’s biggest fans. The Washington Post. https://www.washingtonpost.com/news/the-intersect/wp/2016/11/07/inside-the-huge-secret-facebook-group-for-hillary-clintons-biggest-fans/
Olshansky, A. (2018). Conspiracy Theorizing and Religious Motivated Reasoning: Why the Earth ‘Must’ Be Flat [Master’s thesis, Texas Tech University]. Texas Tech University libraries. https://ttu-ir.tdl.org/bitstream/handle/2346/82666/OLSHANSKY-THESIS-2018.pdf?sequence=1&isAllowed=y
Pariser, E. (2011, March). Beware online “filter bubbles” [Video]. TED. https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles/details?language=en
Porter C.E. (2015). Virtual communities and social networks. In L. Cantoni, & J.A. Danowski (Ed.), Communication and technology (pp. 161-180). De Gruyter, Inc. http://ebookcentral.proquest.com/lib/curtin/detail.action?docID=1759936
Roulet, T. (2016, November 3). Pious progressives have created a spiral of silence which could yet conceal a Donald Trump victory. The Telegraph. https://www.telegraph.co.uk/news/2016/11/03/pious-progressives-have-created-a-spiral-of-silence-which-could/?utm_source=dlvr.it
Shoemaker, P.J., Vos, T.P., & Reese, S.D. (2008). Journalists as gatekeepers. In K. Wahl-Jorgensen & T. Hanitzsch (Eds.), The handbook of journalism studies (pp. 73-87). Routledge.
Shu, K., Bernard, H.R., & Liu, H. (2018). Studying fake news via network analysis: Detection and mitigation. Computer Science – Social and Information Networks, 1-22. https://catalogue.curtin.edu.au/permalink/f/iiil99/TN_cdi_arxiv_primary_1804_10233
Stoycheff, E. (2016). Under surveillance: examining Facebook’s spiral of silence effects in the wake of NSA internet monitoring. Journalism & Mass Communication Quarterly, 93(2), 296–311. https://journals-sagepub-com.dbgw.lis.curtin.edu.au/doi/full/10.1177/1077699016630255
Stroud, N.J. (2017). Selective exposure theories. The oxford handbook of political communication, 1-19. DOI: 10.1093/oxfordhb/9780199793471.013.009_update_001
Svarrior, P. (2018). Welcome to the Flat Earth community board. Flat Earth Society. https://forum.tfes.org/index.php?topic=10087.0
Valentino, N.A., Banks, A.J., Hutchings, V.L., & Davis, A.K. (2009). Selective exposure in the internet age: The interaction between anxiety and information utility. Political Psychology, 30 (4), 591-613. http://www.jstor.org/stable/25655419
Ziemke, D.A. (1980). Selective exposure in a presidential campaign contingent on certainty and salience. Annals of the International Communication Association, 4(1), 497-511. DOI: 10.1080/23808985.1980.11923821