Stages in the Revolution of Digital Affairs
Gramsci argued that changes in the basic building blocks of material production, institutional management and ideology can have seismic impacts on the legitimacy of a hegemonic order. In the Prison Notebooks, he devotes substantial space to the industrial innovations taking root in the United States in the early twentieth century, including “Fordism,” the theories of scientific management pioneered by Frederick Taylor and the distinctively American attitude toward work and government. Under the label of “Americanism,” Gramsci states that the “virgin” nature of US society spares it the class conflicts observable in European societies, conflicts that obstruct the adoption of new forms of economic production and management. “For this reason,” Gramsci argues, “the introduction of Fordism encounters so much ‘intellectual’ and ‘moral’ resistance, and takes place in particularly brutal and insidious forms, and by means of the most extreme coercion.”[1] While these innovations are seen as “natural” in the United States and enjoy a certain legitimacy within the bulk of society, in Europe they threaten social classes and vested interests who are prepared to resist them resolutely, thus requiring substantial coercion. In applying a greater measure of coercion, European states and their prevailing hegemonic classes experience a crisis of legitimacy resulting in an upheaval of the social order.
Similar events transpired earlier, in the late eighteenth century, when two great upheavals erupted in Europe with profound consequences for the entire world. The first was the industrial revolution, emerging out of the Midlands of England and slowly spreading throughout the rest of the country, into the European mainland and overseas. The second was the French Revolution, which marked the end of the aristocracy’s position as the ruling class of Europe and catapulted notions of liberal rights to all corners of the globe.[2] These revolutions, however, were not singular events. They were ongoing processes with multiple fits and starts that impacted different parts of the world in different ways. In fact, recent historical studies have suggested that it is a mistake to think of the great revolutions of the late eighteenth century as watershed events where one can identify distinct political and socio-economic orders before and after. In the case of the industrial revolution, one can identify at least four smaller “revolutions” within the larger “meta-revolution.” The first corresponds with the initial construction of factories and steam-powered machinery in the late eighteenth century that set in motion the social disruptions associated with the enclosure of rural commons and the migration of peasants into the cities.[3] The second revolution centered on the application of new technologies, technical methods and industrial processes between 1867 and 1914 that boosted production and created the foundation for a “consumer-oriented” society.[4] The third revolution, unfolding after World War II, encompasses the rise of nuclear power, basic computational capabilities and the beginnings of harvesting the potential of outer space.
Finally, the fourth revolution entails the era of focus for this chapter—the era of digitalization where the production and manipulation of information becomes the driver of economic growth and the structuring of society in place of the production of material goods.[5] A similar but more digitally focused narrative of changing stages in the rise of an information-oriented order is seen in the idea of various iterations of the world wide web, often referred to as Web 1.0, Web 2.0, and Web 3.0.
Web 1.0
The term Web 1.0 refers to the earliest stage of public interface with the Internet, beginning in the early 1990s and lasting through the bursting of the so-called “dot-com” bubble in 2000. From mostly bulky desktop computers, individuals could access primarily text-based information stored on other computing platforms, accessible largely through fiber-optic cable connections. Though somewhat primitive by today’s standards, Web 1.0 gave the world its first taste of the “condition of post-modernity” that saw the traditional barriers of time and space compressed in some cases and erased in others.[6] Messages could be sent instantly via e-mail; websites could be created to publicize or advertise businesses or political causes; and basic transactions, ranging from the purchase of retail items to the transfer of bank funds, could be completed. Despite the revolutionary rhetoric surrounding the new technology, however, the early days of digitalization still had their limitations in terms of who benefitted most from the new technologies. As Peter Schultz and Paul Cobley write:
While the Internet may have been social from its early stages on, Web 1.0 was clearly not. The one-to-many communication model of websites dominated and commercial interests shaped its overall culture. E-commerce and copyright issues, hypertext and hyperlinks, information search and consumption, traditional advertising models (especially banner ads) and spam, content management and a database rather than narrative logic characterize what we now refer to as Web 1.0.[7]
The most notable development of the Web 1.0 period was the proliferation of “dot-com” businesses that attempted to use the connectivity of the Internet to create highly profitable enterprises by operating wherever in the world expenses would be cheapest. While some early ventures proved wildly successful (most notably Amazon), most of the businesses of this “dot-com bubble” had failed by the early 2000s, contributing to a recession.
Web 2.0
In the case of Web 2.0, the focus shifts to convergence, portability and accessibility. The bulky desktop computer gives way to the smartphone that fits in the palm of one’s hand. These devices not only shrink the size of digital technology and make it easier to interface with digital content, but also combine multiple other functions, such as telephony and cameras, into a single device. With these capabilities, the “read-only” nature of static Internet websites transforms into the dynamism of social media platforms like Facebook, Twitter, Instagram and TikTok, where content is constantly being added and manipulated by anyone with access to the platforms. As these capabilities developed with greater speed and sophistication, their impact on the larger socio-economic dynamics of the world became more pronounced. One effect was to bury society in banality: “Yes, modern society is yearning for new ways to connect, but who knew we had a need for constant updates on the bowel movements of babies, the quirky behaviour of pets or the whereabouts and emotional states of distant acquaintances?”[8] On the other hand, others saw a profound emancipatory potential in many Web 2.0 technologies. Technology gurus and social commentators alike predicted the rise of “networked societies” empowered with digital technology that could pose an existential threat to governments and organizations that relied on coercion or authoritarianism to rule. In late 2010, it seemed these prognostications were coming true with the outbreak of the Arab Spring, an uprising in which Web 2.0 technology played a central role. In the succeeding years, however, the more problematic aspects of these capabilities emerged, epitomized by the revelation by the whistleblower Edward Snowden that the US National Security Agency (NSA), along with the intelligence agencies of other powerful states, could collect, store and manipulate the data on virtually any Web 2.0 platform it chose.
Web 3.0
If Web 1.0 and 2.0 were about creating spaces for producing and sharing information, Web 3.0 is about “integrating and combining” this cascade of data, which in many cases is comprehensible only between digital devices.[9] Examples of Web 3.0 technologies include applications of artificial intelligence, quantum computing, the blockchain ledger and the integration of all electronic devices and appliances through cyberspace to create an “internet of things.” With these technologies, a host of new and purportedly “freer” forms of social interaction become possible through a number of different applications, including the so-called “metaverse,” where individuals can wander cyberspace virtually as digital avatars; cryptocurrencies and “non-fungible tokens” that represent quantities of value generated through the expansion of internet efficiency; and “decentralized autonomous organizations,” where individuals and groups can gather and make decisions based on the exchange of tokens on the blockchain.[10] The great promise of these technologies and applications is to restore the original promise of the Internet as a platform free from government or other forms of hierarchical oversight, where free individuals can congregate and organize for whatever purposes they wish. The imperative to reclaim cyberspace as a place for free creativity became especially heightened by the combination of the financial crash of 2008 and the Snowden revelations of 2013. Both events demonstrated how many of the digital technologies of the previous eras had been hijacked by states and large corporations at the expense of the individual creator or entrepreneur. Web 3.0, with its ability to process data anonymously, promised a space free from oversight or scrutiny where alternative forms of politics, economics and society could take root.
Thus, the idea of cryptocurrency became popular as an alternative medium of exchange and payment, one not controlled by the state or subject to currency management and manipulation by central banks. Recent events, however, including the massive loss of value of many digital tokens and coins as well as the collapse of the cryptocurrency trading platform FTX, show that despite the promise of decentralized forms of money and power returning agency to the individual in a highly administered society, many of the problems associated with the older analogue capitalism—including asset bubbles and wealth concentrations—remain in the brave new world of Web 3.0.[11]
The aim of using this framework is not to give an objectively accurate account of the actual stages in the development of digital technologies, but to provide a general overview of the changes that have occurred since the mass adoption of the Internet (and the World Wide Web as its primary means of interacting with others on the Internet) and the impact this digital infrastructure and culture has had on the larger socio-economic and political arrangements of world order. Also worth noting is that this framework focuses on the impacts these technologies have had in the places where they have been put into use. This is to acknowledge that different parts of the world may be at different stages in the development of their digital infrastructure, or that the impacts on local societies may be less profound where fewer people have access to the technologies in the first place. For now, the focus shifts to the ways these stages of digital development have impacted ideas, institutions and forms of material production in ways that have undermined the traditional configuration of power and opened up many of the spaces that the Caesars of the present day are able to exploit.
[1] Gramsci, Selections from the Prison Notebooks, p. 281 and Quaderni del Carcere, Notebook 22, §2.
[2] See Eric Hobsbawm, The Age of Revolution 1789-1848 (London: Vintage, 1996), p. 2.
[3] Karl Polanyi, The Great Transformation: The Political and Economic Origins of Our Time (Boston: Beacon, 2001), pp. 35-44.
[4] Vaclav Smil, Creating the Twentieth Century: Technical Innovations of 1867-1914 and Their Lasting Impact (Oxford: Oxford University Press, 2005).
[5] Thomas Philbeck and Nicholas Davis, “The Fourth Industrial Revolution: Shaping a New Era,” Journal of International Affairs, 72.1 (2018), pp. 17–22 <https://www.jstor.org/stable/26588339> [Accessed 15 October 2022].
[6] See David Harvey, The Condition of Post-Modernity (Cambridge, MA: Blackwell, 1990).
[7] Peter J. Schultz and Paul Cobley, Handbooks of Communication Science: Communication and Technology (Berlin: de Gruyter Mouton, 2015), p. 182.
[9] Schultz and Cobley, p. 184.
[10] Dan Ashmore, “A Brief History of Web 3.0,” Forbes, 26 August 2022, <https://www.forbes.com/advisor/investing/cryptocurrency/what-is-web-3-0/> [Accessed 12 September 2022].
[11] Sirin Kale, “‘They couldn’t even scream any more. They were just sobbing’: the amateur investors ruined by the crypto crash,” The Guardian, 12 July 2022, <https://www.theguardian.com/technology/2022/jul/12/they-couldnt-even-scream-any-more-they-were-just-sobbing-the-amateur-investors-ruined-by-the-crypto-crash> [Accessed 12 September 2022].