
The terms and conditions of your Social Contract

Technological advancement has been kicked into high gear. Finding someone, centenarians excluded, without a phone and a social media app installed on it is as rare as hens' teeth. Everyone registering with a social media service is asked to agree to a terms of service (ToS) agreement that is intentionally long to discourage reading. This can lead to some interesting, yet unsurprising, outcomes: a 2016 study found that over 500 participants unwittingly agreed to give their first-born child to the company overlords. Whilst this Rumpelstiltskin-esque demand would never hold up in court, it does show how eager we are to throw away our rights when signing online contracts, without bothering to see what we are giving away.

During the dot-com bubble, Google had to change its revenue source to keep its head above water. The method it chose set the example for all future tech giants and created a very lucrative business model. Before 2002, Google used the data it gathered from search requests to improve the search engine itself. With every search there was a surplus of behavioural data that remained unused. This formerly "surplus" data now makes up 86% of Google's revenue, and similar practices can be found at other tech companies. Users' data, accumulated over the years, has proven to be a gold mine for tech companies, but at what point did they inform their users?

If you had read the quarterly terms of service update that you were prompted with that one time, and quickly dismissed, you would have been aware. Well, "aware" is an overstatement, since the language in the ToS requires at least a master's degree in law to decipher what is happening with your personal information.

Tech companies' services are built around accumulating your online behavioural data as efficiently and accurately as possible, under the guise of interconnectedness or promised self-actualisation. Evidence of this comes from these giants' acquisitions of other tech companies. Zuckerberg himself stated that, in the purchases of Oculus Rift and WhatsApp, direct revenue came only second to behavioural data. The same goes for Google's acquisition of YouTube. These acquisitions were not closely scrutinised simply because the product was 'free' for users and promised no quick profits in the traditional sense.

The common phrase that "if you are not paying for a product, then you are the product" is also vastly outdated as a justification. The harvesting of behavioural data now takes precedence over whatever interconnectedness these services promised. An obvious example is promoted content, or the less obvious 'popular' content that is pushed to the top of our feeds.

Of course, we shouldn't entirely dismiss the positive impacts social media has had on communities. But for all the happy family status updates and friends found, these terms of service have become the new social contract, one we scribbled our names on without knowing the details of what we signed up for. Ignorance of certain practices can be excused, but when companies intentionally deceive their users by design, it crosses into the immoral.

The concept of a social contract has been debated at length in the past, by Locke, Hobbes, Rousseau et al., but, without going too deep into the conceptual theory, the Merriam-Webster dictionary defines it as:

“an actual or hypothetical agreement among the members of an organized society or between a community and its ruler that defines and limits the rights and duties of each.”

Before I argue the similarities between the social contract and the ToS agreement, it is important to understand that both, as they stand now, are contracts signed without explicit consent. There is no reasonable way to ensure the understanding and consent of every citizen of a nation to being governed; instead, it is implicit consent that legitimises a government. One contract begins upon birth; the other by ticking a little box, without which the user is barred from participation in the online community, much as non-conformity bars one from society.

The important distinction between the two is that the latter can be changed, since we are not born into our digital lives, not yet at least. It can be argued that uploading your personal information into the public domain, in any way, is a form of consent that anyone may view and process it. However, this can be refuted by the principles on which the World Wide Web was founded, just as we can point to constitutions for our society's values. The web, not to be confused with the internet, is in its simplest terms a digital knowledge base accessed over the internet. While it has outgrown its former shell, it is not too late to turn back to its humble beginnings. Even its inventor, Tim Berners-Lee, agrees, and is actively trying to give ownership of data back to its users through his new project, Solid.

Our community participation has moved from the real to the virtual. The virtual provides quasi-democratic spaces, where approval is gained through likes, for citizens to express themselves and find like-minded individuals. The internet is now the biggest platform for meaningful debate. At the same time, 'filter bubbles' and 'echo chambers' are becoming more common and more effective than the old problem of watching or reading partisan news outlets ever was. It is precisely in these spheres that tribal views and opinions are being warped.

It would not be an enormous issue if these forums were merely targeted by marketers to stimulate consumerism through familiar techniques such as targeted advertising. But the Cambridge Analytica scandal revealed that, through behavioural data analysis, political opinions could be swayed, influencing election outcomes.

This subservience of people to machines is described by Jeffrey Ocay as "the concept of compliant efficiency, which results in the individual's submission to the apparatus without any form of mental and physical opposition". In other words: the virtual has conquered the real without notification or opposition.

The relation between technology and society has been discussed by many famed philosophers and scientists, but with the transition into the digital we need to revisit this debate and move on from seeing technology as something purely abstract. The virtual has taken a foothold in our lives and is here to stay. Unfortunately, owing to the neoliberal forces of deregulation and free markets, the way our digital lives are constructed is guided by capitalistic forces that use online platforms to further their self-interest in shareholder revenue. I want to point out that this dilemma does not exist because of neoliberalism as such; neoliberalism is an economic model first and political only second. Its main feature is the free market, which transfers control of economic factors to the private sector. Through society's demands, companies are kept in check, but what happens when the product becomes so integrated into the fabric of life that there are no governing demands left?

Luckily, within the European Union we are moving somewhat in the right direction, albeit without the necessary urgency. The first big step was the introduction of the GDPR, which escaped no one's notice: the day after it took effect, your inbox was filled with permission requests. The second big step is in progress with the new Copyright Directive, which passed the European Parliament on the day of writing (26/3/2019). The directive has a clause on text and data mining (TDM), excluding the practice for non-research-oriented organisations. However ambitious, the Copyright Directive is no guarantee of your digital safety. With consent, companies are still allowed to process your data and use it for micro-targeting.

Micro-targeting, in this story, is the real kicker with the greatest potential to disrupt society. Because of the vast amount of data available to those who run advertising or political campaigns, messages can be tailored to specific profiles. If it were just a matter of liking a cat's Facebook page and being shown cat food ads, it would not be worrisome, rather a convenience. However, when used to psychologically profile targets, to find out what a group thinks and fears, it shifts the balance of power from voters to data suppliers.

With this micro-targeting tool it becomes very tempting to promise different things to different groups, making it difficult for citizens not to fall into tribalism. This fragments society into groups, a phenomenon we are already noticing.

The business models of tech companies have been allowed to run rampant on the acquisition of our personal information. And just as easily as we try to prove we are not robots by ticking a box, we sign part of ourselves away to be monetised and turned against us. This is by no means a rallying cry for the liquidation of tech giants, but we must not forget to be critical and to discuss openly whether our laissez-faire ideals should be applied to the digital world. Transparency is key here: if you only knew why you received a specific e-mail, ad, or recommendation, you could finally see how the sausage is made, and the truth isn't that tasty.

On a departing note, Christopher Maboloc puts the importance of our online lives in the most pleasing way:

In today’s Internet age, true democratic participation can be found in the struggle for recognition of many social movements which uses social media to promote local culture, including songs and stories, indigenous artworks, and ethnic dances, in order to celebrate the beauty of human life. In today’s world, these indigenous art forms are helpful if we need to inoculate ourselves from the dangers of a hegemonic consumer culture.

So, let's not allow a hegemonic consumer culture to dictate on what terms and conditions we make use of the virtual; let data and contracts be governed by the society that produces them.



Gunkel, D. J. (2014). Social Contract 2.0: Terms of Service Agreements and Political Theory. Journal of Media Critiques, 1(2), 145-168.

Maboloc, C. R. (2017). Social transformation and online technology: Situating Herbert Marcuse in the internet age. Techné: Research in Philosophy and Technology, 21(1), 55-70.

Obar, J. A., & Oeldorf-Hirsch, A. (2018). The biggest lie on the Internet: Ignoring the privacy policies and terms of service policies of social networking services. Information, Communication & Society, 1-20.

Ocay, J. (2010). Technology, Technological Domination and the Great Refusal: Marcuse's Critique of the Advanced Industrial Society. Kritike: An Online Journal of Philosophy, 4(1), 54-78.

Zuboff, S. (2018). The age of surveillance capitalism: The fight for the future at the new frontier of power. London: Profile Books.