PostCapitalism and the social movements of technology workers

In PostCapitalism, Paul Mason has produced a work of stunning breadth and scope. His central project is to situate today’s world in political and economic history, and in doing so to answer a question that is troubling the world’s leading thinkers: why, at a time of technological change so rapid that one year’s science fiction is next year’s reality, is productivity stagnant and the market economy in crisis? In answering it, he clearly hopes to reorient the political left, alerting us to the new class struggles, new possibilities and new utopian visions being created by the digital age.

He asks us to imagine a future in which information is more important to production than machines, raw materials or even labour. Some economists call this “Trekonomics”. In the science fiction TV series Star Trek: The Next Generation, there is no money. Advanced machines can produce food from thin air, there is no scarcity of energy or raw materials, and so supply and demand no longer function.

Further, he argues that this world is already being brought into being, and that capitalism cannot cope. Information can be copied infinitely and freely – only copyright law stands in the way of every human being owning a copy of every work of art ever created for the cost of a broadband connection and a few hard drives. As the information content of physical goods rises, capitalism’s pricing mechanisms break down. It is not possible to value abundant information within the market.
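Mason’s pricing argument can be made concrete with a toy calculation. A minimal sketch, using invented figures rather than real production costs: as copies of an information good multiply, the average cost per copy collapses towards its near-zero marginal cost.

```python
# Average unit cost of an information good: a large one-off creation cost
# spread over n copies, plus a tiny marginal cost per copy.
# All figures are hypothetical, for illustration only.

def average_unit_cost(fixed_cost, marginal_cost, copies):
    """Cost per copy once the fixed cost is spread across all copies."""
    return (fixed_cost + marginal_cost * copies) / copies

# An invented software product: 100 million to develop, 0.01 to duplicate.
for n in (1_000, 1_000_000, 1_000_000_000):
    print(n, round(average_unit_cost(100e6, 0.01, n), 2))
```

At a billion copies the average cost is pennies, and a competitive market would push the price towards the 0.01 marginal cost – the breakdown of pricing the book describes.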

In order to cover such a large amount of ground, Mason is forced to skim over some key debates that have been taking place in movements of technology workers since the 1980s. The left has made a historic mistake, which Mason is trying to correct, in largely ignoring these movements because they did not take the same form as historic class struggles – trades unions, strikes, disputes over pay and working hours, and so on.

As PostCapitalism describes, we are at the beginning of a new era in production. Just as factories and assembly lines once displaced agriculture, the new information society will displace capitalism. The breakdown of the pricing mechanism for information clinches this argument. But just as feudalism’s struggle between lords and serfs was replaced by capitalism’s struggle between bosses and workers, the new era will be sculpted by contests between the powerful and the great mass of people. These new class struggles have already begun, and our enemy is both powerful and cunning.

PostCapitalism mentions the old buzzword “Web 2.0” only briefly, describing it as a rollout of easier development tools. This is an important mistake. Web 2.0 was a counter-revolution that threatens to create a new dystopia. To understand why, we will need to examine how the internet functions.

Computers execute lists of instructions, called code. We do not need to understand the technical detail, only that a computer will carry out the instructions given to it by humans in sequence. This code has been described by Professor Lawrence Lessig as the “law” of the internet. Code decides whether a given email lands in your inbox or your spam folder. It decides what results you see when you search. It decides which of your friends’ Facebook posts you see in your news feed, and which you do not.
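A minimal sketch of this idea, with an invented blocklist standing in for a real filter: the rules below, not the recipient, decide what reaches the inbox.

```python
# A toy spam filter: a few lines of code act as the "law" that decides
# which messages a user ever sees. The blocklist is entirely made up.

SPAM_WORDS = {"lottery", "winner", "prize"}

def route_email(subject):
    """Return 'spam' or 'inbox' according to the hard-coded rules."""
    words = set(subject.lower().split())
    return "spam" if words & SPAM_WORDS else "inbox"

print(route_email("You are a lottery winner"))       # -> spam
print(route_email("Minutes of the branch meeting"))  # -> inbox
```

Whoever writes these rules, and whoever can read them, holds the power; with closed code, users cannot even see the law they live under.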

In the early days of computing, code was available for inspection and modification by any skilled user. This meant that the law that guides networks was in the hands of its users. From the late 1970s, that changed, with companies deliberately hiding the code from users. A technology worker called Richard Stallman (sadly mentioned only once in the book) recognised that this was the struggle of the future – whoever controlled the code would control the world, and without openness there could be no accountability for these new laws. In our own time, he has been proven entirely right, as we have learned that some software companies hide backdoors in their products, allowing American intelligence agencies secret and unaccountable privileged access to our computers. It has even been suggested that Facebook could fix an election, if it so chose, by changing which articles users see, boosting turnout in one group and suppressing it in another.

Stallman’s GNU project set out to create a completely open set of software, so that no user would suffer the tyranny of living under laws that they could not even see, let alone control. A Herculean international effort created the operating system commonly called Linux, which today runs on 75% of smartphones and over 80% of public internet servers.

This project had much in common with the utopian socialist and cooperative movements of the 19th century, such as Robert Owen’s New Lanark (today, a world heritage site). They attempted to build a new world of cooperative work under capitalism, but failed in the face of market pressures and political opposition. Stallman’s project succeeded where they had failed by harnessing the time of thousands of developers across the world through the internet. The fact that information goods can be shared without cost allowed investment by individuals and companies into a common pool of capital without any fear of freeloaders.

Web 2.0 was the counter-revolution against this movement. Early internet technologies, largely financed by universities and the military, were built so that their code ran on the user’s computer. Microsoft Word is a good example – you install it on a machine, and you save your documents to a local disk. This model proved not to be profitable enough, as anyone can ‘steal’ a free copy of Word from The Pirate Bay.

The first efforts to commercialise the internet ended in disaster with the 2001 dot com crash. A new model was needed, and that model was Web 2.0. Henceforth, code would run on ‘the cloud’, allowing the likes of Google and Facebook to collect users’ data centrally, and profit from this through targeted advertising. There was no technical advantage to this (in fact it is less efficient than older distributed technologies), but it allowed capitalist investment in the sector to turn large profits for the first time, funding great leaps forward in usability and marketing. The new, centralised services quickly displaced the old, decentralised services.

The artist and thinker Dmytri Kleiner has demonstrated the utopian nature of attempts to fund the new mode of production through the market. His Telekommunisten artist collective created the idea of a “social fiction” – a riff on science fiction. Science fiction imagines what would be possible if technology were more than it currently is; social fiction imagines what would be possible if society became more than it currently is. He built a clone of Twitter called Thimbl, with two major differences from Twitter: first, its code runs on the user’s computer or smartphone, so that they are in personal control of their data; second, it was based entirely on technology from the 1970s. The barrier to a distributed, democratic Twitter or Facebook isn’t technological – it is social; it lies in the class structure of our society and venture capital’s stranglehold on funding.

Worse still, the new centralisation created the conditions for massive state surveillance. With our emails, photos and conversations stored in giant warehouses in America, it became ever-easier for the security state to plug in and watch all of us. Military money has poured into Silicon Valley, integrating its largest firms into the American imperial project.

This security state / tech company nexus is the feudal lord or capitalist of the new era, the social force that will seek to control and exploit us. The utopian movements created by technology workers, together with the new internet-driven political movements, are the counter-force propelling society towards freedom and equality.

Mason has rendered an invaluable service by putting those movements into their proper context in the history of workers’ struggles and political economy. It now falls to the rest of us to grow and connect our movements, to fight for investment in common production and open, accountable code. The outcome of the coming decade’s struggle could determine whether the next century is one of unparalleled freedom, or unparalleled ruling-class power.

Comments (23)

  1. Kenny says:

    Microsoft Word an internet technology? Web 2.0 means cloud computing? Was Web 1.0 not in the cloud then? I’m sorry but it’s clear the author has very little knowledge of the subject. A really poor article.

    What’s more I think the independence movement should be steering well clear of aligning itself with Mason’s work. Maybe in 40 years it will all come to pass. In the near future discussing this nonsense just makes us look like brainless utopians. We need to discuss the currency issue, not producing food from thin air.

    1. Bob says:

      That’s a wee bit harsh Kenny – the audience for this is of all technical abilities and none.

      I think it would be great if our public services ran on open source – in my ‘brainless utopia’ for the NHS – hospital consultants and their secretaries would use tablets/PC’s running Linus Torvalds distro’s (not ipads), hospital systems would use open source systems rather than privately owned American offering.

      I think Microsoft levies a needless tax on our state – I’m not sure I can explain why.

      The natural inertia of large organizations – did anyone get sacked for buying Microsoft?
      The local councils / boards won’t jump unless Edinburgh says so – and Edinburgh can’t because they really don’t need another media firestorm – imagine the BBC’s HellonEarth Broadbrush working herself into a lather over SNP ‘incompetence’?

      In my ‘brainless utopia’ – million pound contracts with these IT companies would be torn up and IT managed internally. The bankers would be unhappy – but after all – are they not due a fiery afterlife?

      But we do agree on the currency issue – I’m a bitcoin man masel 🙂

      1. Kenny says:

        Bob, I agree that governments and other large organisations should be moving away from Microsoft, mainly from a security point of view.

        But how would this equate to the end of capitalism?

        Incidentally, do you use Linux yourself?

    2. Alistair Davidson says:

      During the era of Web 1.0, software that handled data stored it on users’ computers. When data passed through the net, it was usually through federated, distributed services – email, Usenet.

      The article is aimed at a general, not a technical, audience, and Word is given as a familiar example of software that most people still run locally. There will be many people out there who have never downloaded email or Usenet posts to their local machine.

      1. Kenny says:

        Alistair I’m not sure you even understand the difference between software, the web and the internet.

        Software and email are not part of the world wide web, and predate it. The web is html pages and other documents. It is information rather than software.

        The difference between web 1.0 and web 2.0 is that 1.0 uses static content whereas 2.0 uses dynamic, user-generated content, e.g. Youtube, Facebook and this comments section. Incidentally, the distinction is often reckoned to be tenuous and is not universally recognised.

        You seem to be getting web 2.0 mixed up with cloud computing, which is a fairly nebulous concept but is generally used to refer to accessing computing services remotely through third parties. For example, Adobe Photoshop can be accessed on subscription through the cloud rather than locally. This seems to be what you were talking about with your Word example. It has little to do with web 2.0. Cloud services may or may not be accessed over the web.

        Backing up though, what’s all this about producing food from thin air? Do you actually believe that is possible?

        1. Josef Davies-Coates says:

          You are mostly correct about what web 2.0 generally referred to (i.e. interactive user-generated content sites like youtube, but also blogs, wikis etc). I also raised an eyebrow at the version of history relayed in the (otherwise good) article. But then you go and make yourself look silly too by claiming that “The web is html pages and other documents. It is information rather than software.” HTLM is software. And on the server side php, perl, python and other scripting languages have powered gazzillions of web apps for donkeys years (like pretty much everything on the web you ever use). These days most of the web is full of client-side javascript too (again, think all big and most small web apps/platforms).

          Anyways, for those interested here is a little potted history (written in Jan 2014) from a friend who knows way more about this stuff than I do…

          To get an understanding of where things are going, it’s probably
          useful to understand where things have come from. So, in super rough
          terms, we have:

          * The Plain Web Era when men were men and wrote their HTML by hand.
          They would upload changes to their pages via FTP and let others
          consume them whole.

          * The CGI Era happened when some men realised that they could write
          scripts to take form input and dynamically generate web pages —
          resulting in wonders like e-commerce sites. Back in the day Perl was
          the language of choice for such endeavours but in the years since PHP
          pretty much won out on this front.

          * The Rich Internet Application Era. Tired of the pitiful interfaces
          that were possible on web pages, people looked to the likes of Flash
          and Java applets to create desktop-app-like interfaces. Flash found a
          foothold on many consumer sites thanks to being able to create nice
          animations and Java applets took off like crazy in the enterprise
          world. In the meantime, JavaScript, the only way to create any kind of
          client-side dynamism on web pages, was left neglected and, when used,
          was mainly just for swapping images as people hovered over them.

          * The Dynamic Site Era was a Cambrian explosion in various server-side
          tools that let developers stitch together complex, dynamic pages on
          the backend with greater ease. The web gradually started to win out
          against the likes of Flash and Java applets thanks to the ability to
          link across pages and the LAMP stack became popular amongst various

          * The Enterprise Era was happening in parallel to all of this. Highly
          paid Java developers would create super complex apps using systems
          like J2EE and XML. People also started to get frustrated with having
          to constantly scrape web sites and tried to come up with ways of
          exchanging structured information across sites. This led to the
          promise of the Web Services revolution heralded by the likes of SOAP
          and WSDL.

          * The App Framework Era finally emerged when developers got frustrated
          with the wide range of server-side tools that were available to them.
          Few of these tools worked well with each other. Too many of the tools
          would get abandoned once a developer lost interest. It was just too
          much work to stitch together all the bits necessary to create web
          sites which were fast becoming more than just a collection of pages.
          So when Rails emerged in 2004, it was a breath of fresh air —
          developers could finally focus on the core logic of their app without
          having to waste too much time on repeating the same work that
          thousands of others had already done.

          * The AJAX Era coincided with the App Framework Era. Thanks to wonders
          like Gmail and Google Maps, people suddenly realised that the Web
          didn’t have to be so boring on the client-side. JavaScript, which had
          been neglected as DHTML in the late 90s, started to be recognised for
          its potential and frameworks like Prototype and jQuery started to
          emerge which made it significantly easier to manipulate the
          client-side DOM using JavaScript.

          * The API Era saw companies start to take advantage of the huge
          amounts of data that users were generating on their sites. By exposing
          an API for others to use, these players suddenly found themselves in
          the enviable position of being platforms in their own right. Much of
          the API work was inspired by earlier technologies like RSS, Semantic
          Web, Web Services, etc. — but ended up using far simpler methods like
          JSON over HTTP.

          * The Cloud Era took shape as more and more companies had to start
          scaling out. This saw yet another Cambrian explosion in technologies
          as people started to rethink how their web applications were
          architected. The simplicity of the LAMP stack (basically web server +
          app framework + relational database) was no longer enough and people
          started to incorporate elements like on-demand VMs, load balancers,
          cache servers, message queues, worker processes, NoSQL datastores,
          etc. Some companies like Amazon and Google even started to resell
          infrastructure to others to save them the time of building these
          systems themselves.

          * The DevOps Era emerged as developers looked to more and more tools
          to deal with the complexity of creating, maintaining and deploying
          modern apps. Similar to the server-side templates introduced a decade
          ago, tools like Sass and CoffeeScript let developers write CSS and
          JavaScript more conveniently. Tools like git and Jenkins enabled teams
          to collaborate more effectively. And the art of deploying apps started
          to evolve from plain SSH to using server orchestration systems like
          Capistrano, Chef, Puppet, etc.

          * The Client Side App Era finally started taking off a few years ago.
          Thanks to the renewed browser wars that Chrome had kicked off and the
          various HTML5 improvements that finally emerged, the client-side
          capabilities of web browsers started to become fairly impressive.
          People found that they could not only create rich interfaces (2D, 3D,
          audio, video, etc.) but also do a lot of the complex processing all on
          the client — simplifying their backends to the serving of APIs.

          The client side has been on a similar journey to server-side apps.
          Initially, people wrote entire JavaScript apps by
          hand — leveraging a few libraries like jQuery where appropriate. But
          as apps became larger this started to cause maintenance nightmares. So
          people looked to more and more frameworks.

          Richer client-side apps are the immediate future, with a JSON-based API for the backend. The backend could then be written in anything (Python, Ruby, Go, whatever) and the client side can be whatever “compiles” down to JavaScript and HTML5. The best of breed today on the client side would be a mix of: — for user interfaces; — for connecting bits of your app together; and then whatever “HTML5” new JavaScript APIs you need, e.g. WebSockets, Storage, etc. bluebird hit 1.0 literally 10 days ago and React hit 0.8 about a month ago, i.e. it is almost 1.0 material. The next wave following React will be something like Pedestal. Pedestal gets very close to the sensors and pattern-matching based future we talked about back in the day, but all that stuff is finally starting to happen and gain adoption.
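          The “JSON over HTTP” simplicity mentioned in the API Era above can be shown in miniature. A rough sketch (the field names are invented, and Python’s standard json module stands in for a real web API):

```python
# JSON payloads -- the format that displaced SOAP/WSDL for most APIs --
# are just nested data structures serialised to text and back.
# The "response" below is invented for illustration.
import json

response_body = json.dumps({"user": "alice", "posts": [1, 2, 3]})

# One call parses it: no XML schema, no SOAP envelope, no WSDL contract.
data = json.loads(response_body)
print(data["posts"])  # -> [1, 2, 3]
```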

          1. Thanks Josef, really helpful. It would be good to engage with Alistair’s political / cultural argument as well. It would be interesting to hear what people think is the state of Scotland’s hacktivist community (if it exists at all)?

          2. Kenny says:


            “HTLM is software.”

            Are you referring to HTML?

            “And on the server side php, perl, python and other scripting languages have powered gazzillions of web apps for donkeys years”

            By creating HTML pages.

            HTML (HyperText Markup Language) is markup. It’s data. At a deep conceptual level it’s designed to be data (not entirely successfully perhaps). That’s why you need scripting languages to make web pages do anything. Admittedly, when you add in JavaScript and server scripting, what you have is essentially software, but HTML is not software.
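            The distinction is easy to show: a script (software) manufactures HTML (data) as an ordinary string. A minimal sketch, with Python standing in for PHP or Perl and invented page content:

```python
# Server-side scripting in miniature: code builds markup as plain text.
# The post titles are made up for illustration.
posts = ["First post", "Second post"]
html = "<ul>" + "".join(f"<li>{p}</li>" for p in posts) + "</ul>"
print(html)  # -> <ul><li>First post</li><li>Second post</li></ul>
```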

            These distinctions may not ultimately be very meaningful but in the context of the article they were. Neither cloud-based nor open-source software relies on the web. And relating Web 2.0 to proprietary software is just nonsense. Web 2.0 is based on open standards used by sites like Wikipedia, which the post-capitalism advocates love so much.

        2. Alistair Davidson says:

          To state the points in different terms:

          * modern AJAX web applications centralise and sometimes monopolise communications technologies that were once distributed, federated and open (eg Facebook Messenger vs Email, Twitter vs Finger)

          * As the information content of a good rises versus the labour content, the marginal cost of that good declines towards zero. This is an argument from the book, and the proof is too large for a comment.

          Free food is not on the immediate horizon. Near zero marginal cost manufacturing may be, over the course of the next century, and the process has begun with the marginal cost of information falling to near zero.

          1. Kenny says:


            You’re right that the web seems to be moving towards a walled-garden rather than open approach. But how does this fit with the post-capitalism narrative? Surely it indicates the opposite.

            “As the information content of a good rises versus the labour content, the marginal cost of that good declines towards zero”

            Why would the information content rise versus the labour content? If you put in more work you can get more and better information. Good quality information has value, which is why people pay for subscriptions to quality publishers. Good quality information requires labour.

            The marginal cost of information is low because it can be cheaply reproduced. But this does not necessarily entail low quantities of labour. It costs very little to reproduce a copy of Windows but took a vast amount of labour to create it. In addition, although the marginal cost of reproducing Windows is tiny, the total cost for hundreds of millions of copies could be huge. This partly explains why IT is dominated by a small number of extremely large corporations. The marginal cost is low but it requires a lot of highly skilled (and expensive) labour and a lot of expensive equipment. Most IT companies need a lot of capital before they ever become profitable.

            If all you’re saying is that the web and IT generally are moving towards a highly capitalised, walled-garden approach, and that we need to resist this trend, then I whole-heartedly agree with you. I just don’t see that this fits in with the post-capitalism theme you open with, nor what the open standards of AJAX have to do with it.

    3. leavergirl says:

      “We need to discuss the currency issue, not producing food from thin air.”

      Hit the nail on the head.

      1. DR says:

        Except that *currency* is currently produced from thin air (and crashed or maintained by ‘confidence’, a euphemism for amateur information synthesis). So unless we get to grips with thin air (on which the vast majority of global food production now rests entire) we cannot have a functional currency or sustainable food production.

        The wider point is that the Guttenberg ellipsis is *over* (whether we’re talking art, media, code, or husbandry manuals) therefore ‘broadcast’ models – of political behaviour, economic confidence, global commodity prices, or whatever – do not accurately describe the present, never mind the future.

        Yes, we can not-talk about it, retreating into the fantasy-50s of modern British politics. (Which suffers from the fact that while most – of whatever discursive level – cannot articulate what postcapitalism will be like, we are all *living in* the transition.) Or we can at least try to catch up to the 21st century, and so have at least some hope of working the economy as is. It actually doesn’t matter all that much whether currency du jour is money or information (except when we insist on fetishizing money to the exclusion of whatever can actually enable us to do what needs doing).

        1. Kenny says:

          “Except that *currency* is currently produced from thin air…”

          But you can’t eat currency. To produce food you need capital, i.e. machinery, and land.

          “The wider point is that the Guttenberg ellipsis is *over* (whether we’re talking art, media, code, or husbandry manuals) therefore ‘broadcast’ models – of political behaviour, economic confidence, global commodity prices, or whatever – do not accurately describe the present, never mind the future.”

          What does this even mean? What is a broadcast model of economic confidence?

  2. Alistair MacKinnon says:

    One good zap from the sun….

  3. leavergirl says:

    I am betting that if the people of Scotland were selecting the hot topics that need discussing in depth right now, what “postcapitalism” might be like would not be high on the ladder of priorities.

    1. Frank says:

      So the masses are dictating consciousness are they?

      In my view there should always be a space for intellectual discussions beyond the here and now of day to day politics. I for one have thoroughly enjoyed the discussions on post capitalism.

      1. leavergirl says:

        I don’t really call anyone “masses.” It conjures the image of a mass of dumb flesh out there, and we the thinkers here. I think we all need to pitch in with setting the agenda for discussions, and broad access is important. So is being in touch with the grassroots.

  4. Fran says:

    It is for me and I’m sure I’m not alone. One of the big issues sparked off by the referendum has been questioning our system and what kind of society we want to be living in.

    1. leavergirl says:

      Yeah, but a lot of this is vague intellectualizing. What kind of a society? Sure, let’s talk about it, but make the discussion accessible to all, not only the pinheads. 🙂

  5. HiltonTong says:

    Anybody ken how the 2 guys fae Nairn who I caught on RT last year are getting on with their project to totally replace the web with a new system that uses similar encryption/block chain etc to Bitcoin?

  6. Mr. Verloc says:

    I’ve just noticed the irony here; Bella has one article on the front page touting a future economic system where food is magicked up in the talking microwaves from Star Trek. Then they have another arguing that genetic modification of crops is inherently evil and a de facto plot by monopolistic agribusiness.

    Tentative suggestion — the contemporary Left has a *really* weird, fractured and confused set of attitudes to technology.

    I like AD’s stuff, but I haven’t read Mason’s book, so perhaps this question is treated of well within it. But “PostCapitalism” does strike me as hearkening back to Anthony Giddens, “Living on Thin Air”, and all the slightly decadent ideas which heralded the dot-com bubble, and played a part in the illusion of growth shattered in 2008.

    The era which was to the Open Source movement a new dawn of post-capitalist production (brilliantly likened I must say to New Lanark), simultaneously saw an unprecedented, hyper-globalization of manufacturing, with new lows in exploitation of people and planet through outsourcing to BRIC.

    The current stramash over GMO is perhaps less about the DNA of plants and more about the DNA of the Left. It has been modified with a certain reactionary gene from the Green movement. One which wants to increase labour inputs to production, while decentralizing production and increasing scarcity as an end in itself. A case in point is the “broken window fallacy” of renewable energy, which we’re told will “create jobs” unlike advanced nuclear energy which “centralizes power”. Ideological opposition to GMO is perhaps similarly motivated.

    This all is quite sketchy, but someone needs to say it. We can’t eat information, and if we want the “Star Trek” future we need to think about how to actually get there.

