The fathers of the web as we know it admit that their progeny is failing society. But who will fix it? And as tech giants’ media empires continue to expand, is it time they started prioritising ethics over eyeballs? Bissie Anderson considers the issues.

A month is all it took for things to start to unravel. As we watch, in disbelief, the fallout of the Cambridge Analytica scandal, it is slowly starting to crystallise that the ‘read-write’ web that came into being a decade ago – promising to democratise discourse and level the playing field – has morphed into an intricate, opaque system of networks that is fertile ground for abuse by the rich and the powerful. What started off as an innocuous, altruistic attempt to bring people together has degenerated into one of the many theatres of corporate gain, psychological warfare, and propaganda.

From the techno-optimism and democratic promise of the Arab Spring, when social media was instrumental in toppling dictatorships across the Arab world, we have arrived at a ‘post-truth’ media reality so infused with disinformation and propaganda that we are losing the ability to tell fact from fiction. How did we get here?

To assess just how much damage has been done, we need look no further than the state of the traditional media. Media convergence brought about by the rise of Web 2.0 has caused tectonic changes in traditional media’s practices, structures, and business models. The precarious condition of ‘the fourth estate’ as a civic institution and a pillar of democracy, which continues to grapple with the ever-changing digital media ecosystem, is perhaps most evident in falling trust in mainstream media. A recent survey of 33,000 people across 25 countries found that only 57 percent of respondents said they trusted traditional media, while 64 percent trusted search engines and 41 percent social media. A destabilised media ecosystem, in which we are no longer sure who we can trust, is ripe for exploitation and poses a threat to democracy.


You might have seen this slogan on T-shirts and memes, but when it comes from the mouth of the founder of one of the world’s biggest social media networks, you know it’s no laughing matter. Twitter founder Evan Williams uttered these words in an interview with the New York Times last May and warned that things were about to get worse. And they did.

No longer just a frivolous T-shirt slogan? Image: delta407 [CC BY-SA 2.0]

Fast forward ten months. On 12 March, the world wide web’s 29th birthday, its founder Sir Tim Berners-Lee wrote an open letter in which he expressed grave concern for the future of the web. ‘The web is under threat’, he stated, ‘from misinformation and questionable political advertising to a loss of control over our personal data. Are we sure the rest of the world wants to connect to the web we have today?’

In his letter, Sir Tim also warned that the concentration of power in ‘a few dominant platforms … built to maximise profit more than to maximise social good, and who control which ideas are seen and shared’, has made it possible to ‘weaponise the web’. He called for these companies to take responsibility for being such powerful actors in the public sphere. Just a week later, The Guardian and Channel 4’s year-long investigation into Cambridge Analytica’s use of Facebook data to influence elections shocked the world.


Social and search giants’ monopoly and growing sway over our digital lives has long been a cause for concern. But only now are we beginning to understand their true power. And with power comes responsibility. Web 2.0 started off with a seemingly altruistic aim – to connect the world, to let people share. And, for a while at least, those aims did seem to be achieved. But the tech giants’ failure to acknowledge their own importance as public players created the deceptive image that they were merely platforms. That image suited them, as it helped them avoid regulation, but it also meant they were slow to take even partial responsibility for the fallout of the ensuing social and political crises.

Mark Zuckerberg on stage in San Jose for Facebook’s F8 2017 Developers Conference, ‘Building Global Community’. Image: Maurizio Pesce [CC BY 2.0]

When the Facebook Timeline was introduced in 2011, it encouraged users to add their life milestones to the social network to build the complete story of their lives. “Timeline is the story of your life,” explained Mark Zuckerberg. “All your stories, all your apps, express who you are.” This seductive rhetoric of ‘personal discovery’, ‘storytelling’, and ‘meaningful engagement’ chimes neatly with our psychological need for friendship and self-expression.

Facebook’s Open Graph API was launched to encourage the development of apps that would help create a full picture of who users were. But as people scrambled to tell their life stories, they were leaving behind digital traces that were invaluable – to Facebook and advertisers, at least. It shouldn’t come as a surprise, then, that when a business model is built exclusively on data monetisation, there is a high chance the platform will be abused, and that insidious practices will seep through its opaque algorithms – which Facebook has historically guarded so closely.

There has been a distinct lack of transparency about how the personal data we generate on social media is being collected, stored, and used. Brittany Kaiser, Cambridge Analytica’s former Business Development Director who went on the record about the company’s digital targeting practices, argues that the world needs to move away from the centralisation of data towards a reality where individuals own their own data and are able to monetise it themselves, as “their own human value”. “Corporations like Google, Facebook, Amazon, all of these large companies, are making tens or hundreds of billions of dollars off of monetising people’s data,” said Kaiser in an exclusive interview for The Guardian.

But the debate goes well beyond that. Professor Christian Fuchs has extensively studied the ‘commodification of users and their data’ in what he calls the ‘informational capitalism’ on the internet. He claims that Web 2.0 users are, in effect, knowledge workers whose ‘digital labour’ is being exploited. And Jonathan Crary warns that our 24/7, always-on, culture has ‘commodified and financialised’ our most important physiological and psychological necessities, with sleep now deemed expendable in today’s world of round-the-clock digital media consumption and production.

What is more, social media intermediaries foster an illusory sense of agency, political engagement, and activism, in which we slog away producing content and, in doing so, reveal ever more about ourselves – becoming potential targets of propaganda and psy-ops. Media convergence, once considered an opportunity to engender a bottom-up, democratic, participatory culture, has instead – to use Adam Curtis’s words – resulted in “entrapping us into a narrow and empty world”.

The creation of ‘necessary illusions’ of agency and participation in the immediate comfort of our digital consumerist culture isn’t far from what Chomsky warned three decades ago in his book Necessary Illusions – that our attention is being diverted from what matters in order to reduce our capacity to think. This, he argued, is achieved through ‘oversimplification’ and ‘building up irrational attitudes of submission to authority, and group cohesion behind leadership elements’.

We may see evidence of this in the spread of disinformation and the polarisation of political discourse, facilitated – albeit unwittingly – by the structures of social media platforms. As audiences are quantified in the race for personalised delivery of content that “resonates”, the public has never been so vulnerable to spin and propaganda. Furthermore, the tribal connections that we inevitably form as we try to make sense of the ever-discordant, polyphonic reality of information overload – seeking shelter in the ‘safety’ of our own highly personalised, subjective, social media bubble – often hinder us from meaningful democratic engagement and social action IRL (in real life).

If the Cambridge Analytica allegations are true, then isn’t this enough evidence that social media is just an extension of the channels that those in power have used for centuries to exercise thought control and subvert democracy for personal gain, using us as mere pawns in their ‘game of thrones’?


If the data scandal can teach us one thing, it is that we’ve reached a real technological crunch – a period of crisis not only for the internet, but for democracy. And there is no turning back. Political propaganda and trolling practices, enabled by social media giants’ advertising targeting practices and lack of transparency regarding their algorithms, raise serious questions about the social responsibility and ethics of these corporations. It really is a no-brainer: when a handful of platforms dominate our digital lives, it is incumbent on those platforms to acknowledge their role as social actors and accept that they have responsibility for safeguarding democracy – a responsibility that they have long chosen to shirk.

Days after the Cambridge Analytica revelations, Mark Zuckerberg apologised for “the breach of trust” and admitted Facebook had not done “enough to deal with it”. He vowed to change how the platform shares data with third-party apps. But that verbal reassurance is not enough for Facebook users who are leaving the platform in their droves. Company shares are taking a hit, too: Facebook has lost billions in market value since the data scandal. Regulation, once considered distant, unlikely, and unenforceable in a global market that transcends national borders, now looms closer.

The EU’s competition commissioner Margrethe Vestager recently hinted at the possibility of breaking up Google into smaller companies due to its dominance, after charging the company with abusing its monopoly. And in a recently published report, the European Commission urged online platforms to sign up to a code of practice and join in the fight against disinformation, stating that its spread threatens the ‘democratic political processes, including integrity of elections, and democratic values that shape public policies’.

However, the recommendations in this code of practice – to be announced on 25 April – are non-binding, and there is concern they may not go far enough. The Sunday Herald News Editor Angela Haggerty warns against “a tokenistic move towards regulation” which, she argues, “would be a gravely missed chance to redesign our new data world and reorganise power.”

The GDPR promises to reshape data collection practices, protecting EU citizens’ privacy and personal data. Image: The Digital Artist [CC0]

A more promising measure is the EU General Data Protection Regulation (GDPR), which comes into force on 25 May and is expected to overhaul web practices related to data collection. It vows to protect all EU citizens’ data privacy and ‘to reshape the way organizations across the region approach data privacy’. Companies that fail to comply will face hefty fines, the EU warns.

But with Scotland scheduled to leave the bloc as part of the UK in March 2019 – with a likely transition period extending to December 2020 – and with the UK set to leave the digital single market, there are questions about whether British citizens will still be covered by GDPR protections after that date. And it is worth noting that, since data protection is not devolved, Holyrood cannot legislate in this area.

The UK Department for Digital, Culture, Media and Sport (DCMS) Committee has been tasked with finding a solution. Its Digital Charter, published in January but still a work in progress, is the closest the UK has got to online regulation. The Charter promises to give individuals more control over their personal data and to make the web free, open, and accessible. ‘The days of the unregulated Wild West are over’, wrote DCMS Secretary of State Matt Hancock in the Evening Standard in the wake of the data scandal revelations: ‘It’s the job of the government to ensure that companies behave reasonably. It’s not for companies to decide on the delicate balance between privacy and innovation, but for society as a whole’. However, it is still unclear what the exact measures and guarantees will be.


Responding to rising distrust in technology companies and the tech giants’ tendency towards ever-greater digital conglomeration, start-ups have been busy building an entirely new web architecture in which apps (or DApps, as they are called in this new decentralised ecosystem) are built on Blockstack. The ‘new internet’, as Blockstack co-founder Ryan Shea calls it, is in its infancy, but it promises to eradicate the problems surrounding privacy and security, bypass the monopoly of the digital giants, and let users own their own data.

DApps are still few and far between but they have real potential to replace their centralised counterparts.

One of the Blockstack platforms, Civil (due to launch this spring), for example, promises to replace the ‘broken’ ad-driven journalism revenue model with one that is more sustainable, and which will enable journalists ‘to focus on journalism, not satisfying clicks over quality mandates from third parties like advertisers and publishers’.

The decentralised journalism marketplace will incentivise good-quality content and ‘will make it prohibitively expensive for trolls, spam and misinformation to persist on the platform’, while taking advertisers and digital intermediaries out of the equation to provide accessible, open journalism. At launch, Civil will have 15 newsrooms with more than 100 full-time journalists, and it hopes to become the go-to place for news. But whether, how, and when developers and users will migrate to the new web remains to be seen.


Whether through regulation or brand-new solutions built on blockchain, Sir Tim Berners-Lee remains hopeful that the internet can be fixed – as long as we put corporate, political, and cultural differences aside and come up with non-partisan, non-commercial, supranational solutions that enable us to think beyond physical borders and ideological boundaries.

The solution to our current crisis cannot be unilateral. As Sir Tim explains, in order to reimagine the web so that it works for everyone, we need “the brightest minds from business, technology, government, civil society, the arts, and academia to tackle the threats to the web’s future.” This would require the fundamental overhaul of the structures of the internet, regulatory systems that hold internet giants to account, and a binding social contract between the online giants and their users, based on the principles of social responsibility, ethics, and transparency.

As the #DeleteFacebook backlash continues, it is also time for personal reflection on just how immersed we are in the digital world. Are we ‘in too deep’ in social media waters? Can we really disentangle ourselves from years of digital living? And ultimately, how can we – both as individuals and as a society – reclaim the power over our virtual lives?

Bissie Z. Anderson is a PhD researcher in digital media and journalism at the University of Stirling with 15 years’ experience as a journalist and content producer. She is funded by the Scottish Graduate School of Social Science. She is on Twitter at @biscottto.

Feature image: Facebook protesters outside the United States Capitol Building in Washington. Image: Lorie Shaull [CC BY-SA 2.0]