Here is the translation into English of Giorgio Griziotti’s contribution to the seminar “Algorithms and capital”, held at Goldsmiths College, January 20th, 2014.
* * * * *
The question is no longer deciding whether or not to develop biological engineering practices but how to use these techniques. The struggle is now over the alternative paradigms of bios and the multitude is called to battle over the idea and the reality, the model and language of the body that it wants to give to the General Intellect.
– Toni Negri
The Monster’s Desire: from the Circus to the Political Laboratory
The use of technology is the fundamental axis of the metamorphosis caused by the fusion of life and labor in cognitive capitalism. This metamorphosis affects the inborn/acquired pair, where the acquired, in the nature vs. nurture debate, is the environment’s “nutriment”, from mother earth to language.
For more than a century, the speculative debate on this binomial has fueled discussion and analysis in philosophy, psychology, medical research and human sciences and it is the foundation for a disciplinary organization of society – even the Nazi regime used it as the keystone of its destructive philosophy.
The society of control organized by cognitive capitalism is instead directly based on the manipulation of these two elements. According to neoliberal dogma, they are important as fundamental components of human capital or, better yet, of a human who, becoming capital, must earn time to live. In order to impose this economic rationality, the financial oligarchy that holds global power is directly involved in processes of measuring the bios and of behavioral and genetic alteration. The inborn and the acquired are hit by a technological tsunami as neurosciences, genetic engineering, nanotechnologies, artificial intelligence and robotics all come into play.
In Europe, a mercenary political class, subject to a financial élite that forces it to privatize welfare, no longer has any margin for exercising its antiquated social-democratic mediations; a new strategy of control over life and society, based on technological subjection and the generation of debt, has replaced them.
As far as nurture is concerned, digital and biohypermediatic technologies, combined with discoveries in the neurosciences, intervene in how we feel, perceive and understand the world. They are used in ever more subtle and articulated ways in strategies of framing, business and governance.
Within the great tendencies of capitalism, and sometimes in apparent opposition to the dominant oligarchy, libertarian currents take on a new bent in Silicon Valley and actively participate in this strategy, facilitating the voluntary adoption of the instruments of control in exchange for the illusion of individual liberty.
The case of cryptocurrencies – whose creation is based on software, algorithms and network technologies that, at first glance, seem autonomous from global financial institutions and national and private banks – highlights certain ambiguities and the mixing of genres. Without going into a detailed analysis, the Bitcoin project (BTC) is based on an anonymous peer-to-peer production of money and is made relatively safe through cryptography based on specific public algorithms; its code is under an open source license and it uses the principle of network computing. These aspects put it into the same category as great cooperative projects and collective socio-technological innovations that come from the hacker community, just like Linux.
Due to its open source characteristics, BTC gives way to forks, derivations that allow the implementation of other digital currencies; there are around 40 for now. Probably, the goals at the origin of BTC were to prosper as a tool of exchange outside the control of oligarchical institutions and to free transactions from commissions, exactions and market limits.
Unfortunately, this isn’t exactly what is happening. In this phase, this cryptocurrency is instead used above all as an instrument of accumulation and financial speculation. The convertibility with classic currencies (starting with the Yuan and US Dollar) and a production that is algorithmically limited in quantity and time in some ways reproduce the role of gold as a reserve currency. The metaphor also extends to the terminology used and to a certain gold-rush mythology overlapping with that of videogames. Like the extraction of gold, the production of cryptocurrencies (not by chance called “mining”) requires a great quantity of electrical energy and computational power, consumed and supplied by powerful PCs, derived from those dedicated to gaming, running at full capacity.
The key criterion of BTC lies in the principle of an extraction of currency proportional to computational power, without any of the principles that would be inscribed in the social code of an algorithm for a currency of the common. This is precisely why the experiment can’t break free from an innate capitalist immutability based on the guiding role of profit in the distribution of labor and social organization. BTC merely shifts the register. Leveraging technology, it is gaining support within the hacker and P2P movements. For the moment, though, it seems rather to be drawing them into the sancta sanctorum of finance, massively training hackers for trading and proposing a speculative race through an algorithmic production of “autonomous” money. Today the technical abilities of hackers constitute an advantage, but computational power, and therefore hardware investment, becomes more and more preponderant. This is already the case for BTC, which today can only be mined with special, dedicated computers costing (tens of) thousands of dollars. And this without taking into account the digital corporations, new or old, that have been “inspired” by this process to launch their own currencies in the future. It is possible that these experiments are “disruptive”, but for the moment they seem to be part of the vast sector of peer-to-peer cooperation activities subsumed by capital according to the unchanging principle: “For things to remain the same, everything must change“.
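The proof-of-work “mining” at the heart of this race can be sketched minimally. The following is an illustrative simplification in Python (not Bitcoin’s actual block format or difficulty encoding): a miner repeatedly hashes the block data with an incrementing nonce until the digest falls below a target, so the expected work grows exponentially with the difficulty.

```python
import hashlib

def mine(block_data: bytes, difficulty_bits: int) -> int:
    """Brute-force a nonce so that SHA-256(block_data || nonce) has
    `difficulty_bits` leading zero bits (a simplified proof of work)."""
    target = 1 << (256 - difficulty_bits)  # digests below this value "win"
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce  # proof found: anyone can verify it with one hash
        nonce += 1

# 16 leading zero bits takes ~65,000 hashes on average; Bitcoin's real
# difficulty requires vastly more, hence the dedicated hardware.
nonce = mine(b"example block", 16)
```

Verification costs a single hash while finding the nonce costs, on average, 2^difficulty_bits attempts: this asymmetry is exactly the “computational power” that the text identifies as the currency’s extraction criterion.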
Despite this, the experience of BTC has the merit of having opened the way, and the debate, over the possibility of creating a truly autonomous digital money with the aim of creating a currency of the common. According to the economist Carlo Vercellone, such a currency would have three essential elements hardwired into its algorithms and its implementation:
· The impossibility of accumulating it, thus impeding it from becoming an object of speculation. Consequently, it must lose some of its value over time: it would therefore be a melting currency, a “demurrage-charged money”.
· Mitigating workers’ dependency on the economic restrictions that force them to sell their labor power and therefore wage relations themselves; thus reducing precarity.
· Allowing, on these premises, for more free time and resources for developing alternative forms of cooperation based on the common pooling of knowledge, production and, in any case, on exchange networks that exclude the logic of profit. Participation in networks where a currency of the common circulates implies adhering to these principles, whether participants are individuals, businesses or institutional subjects, as in the case of certain alternative currency models experimented with on a local level.
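The first principle, demurrage, has a simple arithmetic core: balances decay at a fixed rate while held, so hoarding is penalized and circulation rewarded. A minimal sketch (the 2% monthly rate is an illustrative assumption, not a figure from Vercellone):

```python
def apply_demurrage(balance: float, months: int, monthly_rate: float = 0.02) -> float:
    """Demurrage-charged ("melting") money: each month an idle balance
    loses a fixed fraction of its value, discouraging accumulation."""
    for _ in range(months):
        balance *= 1.0 - monthly_rate
    return balance

# 100 units left idle for a year shrink to roughly 78.5 units.
remaining = apply_demurrage(100.0, 12)
```

Because the charge compounds only on money that sits still, spending or pooling it into common projects is always preferable to holding it as a store of value.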
The current transformation of electronic money from an autonomous tool into an instrument of financial speculation explicitly demonstrates the importance and political centrality of algorithms in exploiting multitudinary cooperation.
Complementarily, other algorithms that are not open but proprietary, secret and protected by copyright already play a highly influential role in producing knowledge, according to a logic that uses predetermined criteria to establish what is shown and to whom. PageRank, Google’s famous algorithm that determines the rating and therefore importance of a website, allows for the valorization of a site’s visibility on the net. This capacity to create and dominate the classification market of the network’s atoms has made Google the most influential corporation of the digital era.
Google has now set its eyes on the financial goals of being the world’s leading advertising agency with a consolidated 55 billion dollars of annual revenue, more than double that of the conglomerate that best represents pre-internet advertising, born from the fusion of Publicis and Omnicom.
This oligarchical power and extremely rapid accumulation of data now allow Google to pursue a strategic objective that is even more ambitious in entrepreneurial terms: they aim to lead the race to format human beings themselves, making us more and more a function of cognitive capitalism and of the transformation of our lives into merchandise.
PageRank’s algorithm manages to produce surplus value from our network activities and reveals the potential of mathematical models able to approximate human behavior to the point of shaping it.
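The published core of PageRank is a simple iterative computation: a page’s rank is the sum of the ranks of the pages linking to it, each divided by that page’s number of outgoing links, damped by a constant. A minimal sketch on a toy three-page web (Google’s production ranking layers many proprietary signals on top of this):

```python
def pagerank(links: dict[str, list[str]], damping: float = 0.85,
             iterations: int = 50) -> dict[str, float]:
    """Power-iteration PageRank on a link graph (assumes no dangling nodes)."""
    n = len(links)
    rank = {page: 1.0 / n for page in links}
    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / n for page in links}
        for page, outlinks in links.items():
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share  # each page passes rank to its targets
        rank = new_rank
    return rank

# "c" is linked by both "a" and "b", so it ends up with the highest rank.
ranks = pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"]})
```

The ranks always sum to 1, so the algorithm redistributes a fixed quantity of “importance”: a site’s visibility rises only at the expense of other sites, which is what makes the classification itself a market.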
EdgeRank follows the road opened by its predecessor. Facebook’s (FB) algorithm intervenes more directly in relations, creating a rank on the basis of which it autonomously decides what appears in the News Feed of every member of the social network. EdgeRank establishes a relative value for all the posts from our “friends”, using a certain number of parameters and criteria aimed at quantifying our relations. “Affinities” are measured by counting “likes” and the frequency and type of contact with the “friends” who publish the posts. Posts themselves are also weighted according to type: for FB, written posts, which can only be perceived through attentive and conscious cerebral activity, are evidently the most filtered medium, while photos and videos, which act directly on our senses and emotions without cognitive mediation, have a higher value.
The algorithm takes time into account in a linear way: whatever is most recent has a higher value. While Google’s algorithm analyses and exploits attention, knowledge and behavior online, FB tries to aseptically measure peer-to-peer relations.
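Facebook has never published EdgeRank’s exact formula, but public accounts describe a per-post score of the form affinity × content-type weight × time decay, which is enough to reproduce the behavior described above. A hypothetical sketch (all weights and the half-life below are invented for illustration):

```python
def edge_score(affinity: float, content_weight: float, age_hours: float,
               half_life_hours: float = 24.0) -> float:
    """Hypothetical EdgeRank-style score: stronger relations, "richer"
    media and fresher posts all push a post up the News Feed."""
    time_decay = 0.5 ** (age_hours / half_life_hours)
    return affinity * content_weight * time_decay

# Invented type weights: sensory media outrank filtered written posts.
TYPE_WEIGHTS = {"text": 1.0, "photo": 2.0, "video": 2.5}

# A fresh video from a close friend outranks a day-old text post:
video = edge_score(affinity=0.8, content_weight=TYPE_WEIGHTS["video"], age_hours=1)
text = edge_score(affinity=0.8, content_weight=TYPE_WEIGHTS["text"], age_hours=30)
```

Whatever the real coefficients are, the structure alone shows the point made in the text: the quantified relation (affinity), the sensory register of the medium (weight) and recency (decay) are multiplied together, so the feed systematically privileges emotionally immediate, recent content from measured “strong” ties.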
In both cases, although possibly in a more evident way in the second example, the different facets of our subjectivity are schematized and emptied in order to allow measure and classification. This process shares a similarity with what industrial capitalism does in the factory, effecting the division between the workers’ living labor and the dead labor of machines. Cognitive capitalism extends a similar separation to the space and time of life, thanks to algorithms that implement this division by sterilizing the richness of relations, compulsively orienting desire, artificially saturating our senses and emotional states. This is the end game of the bulimic monster of Big Data, nourished by data extracted by grinding human relations, sensorial vibrations, feelings, gestures and behavior through algorithmic machines.
Our highly desired smartphones, tablets and plenty of new connected objects are biohypermediatic sensors that, when used to put life to work, have the ability to capture both the biological parameters of our bodies and our shifts in behavior and mood. In return, they flood us with flows able to trick our empathetic mirror neurons.
These two ranks represent the first fruits of a race destined to cross and modify the thousand planes of human nature in an attempt to render it homogeneous with capitalism. Something similar is being done to the whole of our biosphere, which consequently falls into a lethal vortex that can only be stopped by a collective intelligence able to deviate this trajectory in time.
More than a hypothetical Moloch-Big Brother future à la 1984, cognitive capitalism thus seems to tend toward a Biorank, a meta-algorithm destined to classify humans and box them into integral compartments of exploitation, depriving them of their singularity. Parallel to what happens in the large companies whose product is the man-day or man-hour, where every “consultant” is merely a package of abilities sold at the best price, the project of capitalist governance is to extend this principle over existence itself. To accomplish this, it isn’t enough to act on the acquired alone, since that cannot guarantee against unforeseen reversals or changes in tendency; it is necessary to irreversibly transform the inborn.
Biotechnologies of the living and, in particular, genetic engineering based on the use of recombinant DNA molecules have the capacity to manipulate inborn characteristics and therefore to modify the genetic pool inherited from our ancestors. It seems logical and coherent that neoliberal philosophy would favor the importance of the gene pool, just as its illustrious predecessor did with aristocratic bloodlines. Finding the genes that would identify the biological causes of complex illnesses and dependencies, from schizophrenia to drug addiction, supports a vision dominated by the predetermination and predictability most functional to the logic of control, and makes the prospect of adapting homo economicus to the system feasible. Above all, we can see this in the attempt to optimize the value of human capital, increase the patrimony of offspring, and cure and prevent the risk of hereditary pathologies. From a strictly financial perspective, the technical tools of genetic modification, added to the existing mechanisms of reproductive control, become the ideal complement to Biorank. They would allow for the perfection of biological social control over all of society, extracting value from any activity whatsoever, be it productive, reproductive or even unproductive.
The science fiction film In Time shows a possible dystopia of this neoliberal integration of inborn and acquired: a society where the currency is time and people are genetically modified never to age, but can only live for the duration of the time capital they own. The few rich can therefore live forever in the rare unpolluted and fancy areas, while the others live in a devastated land, struggling daily for survival.
Even if we are far from having developed such manipulative abilities, this is precisely the idea behind one proposal to create a digital currency based on time, which actually isn’t a new idea at all: it dates back to Proudhon, who proposed it as a currency of the common and was criticized for it by Marx. Obviously, the ethical debate in this field is open, just as it has been since the beginning of civilization, but much doubt arises if we imagine this technology in the hands of Wall Street, or controlled by politicians and the spin-doctors of storytelling. Without foretelling the advent of a declared capitalist eugenics, one could suppose that discriminated access to gene therapy already constitutes a de facto one.
In the hands of capitalism, technology, algorithms and bioengineering become weapons for a political domination that attempts to reduce the power and wealth of life to bare life: malleable and permeable, reduced to zombie-like reflexes, and subject to the absolute violence of power.
We are therefore at a crossroads, one where what is at stake isn’t so much the development of ever more integrated science and technology, but the way of using them to organize the struggle and the exodus from a deadly model in which technology takes the place of ontology.
Paris, January 2014
Special thanks to Carlo Vercellone for his contribution regarding currency, his proofreading and his invaluable comments.
U. Fadini, T. Negri and C. Wolfe (eds.) (2001), Desiderio del mostro. Dal circo al laboratorio alla politica, Manifestolibri, Rome.
Nature versus nurture is the phrase that opened the modern debate over the relative importance of the inborn with respect to individual experience. Coined by Francis Galton in the Victorian age, it was influenced by his relative Charles Darwin’s On the Origin of Species.
For example, SHA-256 is a cryptographic hash function conceived by the American NSA and used in Bitcoin. Other digital currencies, like Litecoin and Feathercoin, use the scrypt algorithm.
Bitcoin’s algorithm is conceived for a capped and decreasing production of 21 million units, of which 75% will have been emitted by 2017. Litecoin, instead, foresees the production of 84 million units.
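The 21-million cap follows from a geometric series: the block reward starts at 50 BTC and halves every 210,000 blocks. A sketch of the arithmetic (ignoring Bitcoin’s rounding of rewards to whole satoshis, which makes the real total very slightly lower):

```python
def total_supply(initial_reward: float = 50.0,
                 blocks_per_halving: int = 210_000,
                 halvings: int = 33) -> float:
    """Sum the block rewards across halving eras; the series
    210000 * 50 * (1 + 1/2 + 1/4 + ...) converges toward 21,000,000."""
    return sum(blocks_per_halving * initial_reward / 2**era
               for era in range(halvings))

supply = total_supply()  # just under 21 million units
```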
 Tancredi’s famous phrase in Giuseppe Tomasi di Lampedusa’s The Leopard (1958).
See my article “Biopolitics, territories and signs of crisis in multinational network companies”, given at the seminar Uninomade Impresa e Soggettività in Turin on 24 March 2012; http://www.opendemocracy.net/giorgio-griziotti/biopolitics-territories-and-signs-of-crisis-in-multinational-network-companies
“BitCoin: a Rube-Goldberg machine for buying electricity”:
In the end, the artificial creation of the limited number of possible BitCoins via this “proof of work” (doing millions of SHA-256 hashes over and over) is madness. All you really need is to have “proof of limitation” without the politics—was the market restrained from creating too much money too fast? BitCoin’s use of a procedural solution is the wrong track when all you need do is define a constraint via a formula and apply it as needed over time, instead of everyone continuously spinning a hash function and wasting electricity. Keep the transactions public, cryptographically sign them, and audit them with a money model and you’ll be able to keep much of what is good about BitCoin. And of course, use a “commodity” the people can intuitively understand, something like… time. (from http://trustcurrency.blogspot.fr/2011/03/bitcoin-rube-goldberg-machine-for.html)