04 April 2014

The Antinomies of Computation

AntiSurveillance Feminist Poet Hair & Makeup Party
In this post I explore what I want to call the antinomies of computation.[1] This is part of a larger project to map out these contradictions but here I will only talk about one of the antinomies that I think is interesting, namely visibility/opacity. In subsequent posts I hope to explore multiple strata to map out different moments in these antinomies. This is an attempt to contribute to a critique of an increasingly softwarized society and economy that requires analysis and contestation (see Berry 2011, 2014).

Computation makes the collection of data relatively easy. This increases visibility through what Rey Chow (2012) calls “Capture”. Software enables more effective systems of surveillance and hence new capture systems. As Foucault argues, “full lighting and the eyes of a supervisor capture better than darkness, which ultimately protected. Visibility is a trap” (Foucault 1991: 200). The question is also linked to who is made visible in these kinds of systems, especially where, as feminist theorists have shown, visibility itself can be a gendered concept and practice, as demonstrated in the historical invisibility of women in the public sphere, for example. Here we might also reflect on the way in which the practice of making-visible also entails the making-invisible – computation involves making choices about what is to be captured. For example, Zach Blas's work is helpful in showing the various forms of race, gender and class-based exclusion in computational and biometric systems (Magdaleno 2014).

The question then becomes how to “darken” this visibility in order to prevent the totalising full top-view made possible in computational society. Using the metaphor of “black boxes” – the technical notion of objects whose internal states are opaque or impossible to read but whose surfaces remain readable – how can we think about spaces that paradoxically enable democracy and the political, whilst limiting the reading of the internal processes of political experimentation and formation? Thus, how to create the conditions of possibility of “opaque places” working on the edges or at the limits of legibility? These we might call opaque temporary autonomous zones, which seek to enable democratic deliberation and debate. They should be fully political spaces, open and inclusive, but nonetheless opaque to the kinds of visibility that computation makes possible. As Rossiter and Zehle (2014) argue, we need to move towards a "politics of anonymity", part of which is an acknowledgement of the way in which the mediation of algorithms could operate as a plane of opacity for various actors.
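To ground the borrowed technical sense of the “black box” metaphor, the following is a minimal illustrative sketch in Python (the class and method names are my own invention, not drawn from any system discussed here) of an object whose internal state is hidden while its surface remains readable:

```python
class BlackBox:
    """An object with a readable surface but an opaque interior."""

    def __init__(self):
        # Internal state: conventionally private, not part of the readable surface.
        self._state = {"deliberations": []}

    def submit(self, contribution):
        # Inputs cross the surface and are absorbed into the hidden interior.
        self._state["deliberations"].append(contribution)

    def surface(self):
        # Only an aggregate, legible output is exposed; the interior stays opaque.
        return f"{len(self._state['deliberations'])} contributions received"


box = BlackBox()
box.submit("proposal A")
box.submit("proposal B")
print(box.surface())  # "2 contributions received": the contents themselves stay unread
```

The point of the sketch is simply that the object remains usable and legible at its surface while what happens inside stays unreadable, which is the sense in which “opaque places” might remain open without being transparent.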

It is important to note that this is not about creating paranoid spaces or secret societies, but conditional and temporary moments – glitches in the regime of computational visibility. The idea is not to recreate notions of individual privacy as such, but rather to create collective spaces of critical reflection for practices of forming a political response. That is, to draw on theory and "un-theory" as a way of proceeding theoretically, as "an open source theory [and practice] in constant reformulation from multiple re-visions and remixings" (Goldberg 2014), what CTI (2008) calls "poor theory". Indeed, crypto practices can create shadows in plain sight, thus tipping the balance away from systems of surveillance and control. Of course, paradoxically, these opaque spaces may themselves draw the attention of state authorities and the intelligence community, who monitor the use of encryption and cryptography – demonstrating again the paradox of opacity and visibility.

CV Dazzle Project by Adam Harvey
By crypto practices, or crypto-activism, I mean the notion of “hiding in plain sight”, a kind of steganography of political practice. This is not merely a technical practice but a political and social one too. Here I am thinking of the counter-surveillance art of Adam Harvey, such as "CV Dazzle", which seeks to design make-up that prevents facial recognition software from identifying faces, or "Stealth Wear", which creates the "potential for fashion to challenge authoritarian surveillance" (Harvey 2014). Some examples in political practice can also be seen at the AntiSurveillance Feminist Poet Hair and Makeup Party. Additionally, Julian Oliver's work has been exemplary in exploring the ideas of visibility and opacity. Here I am thinking in particular of Oliver's works that paradoxically embed code executables in images of the software objects themselves, such as "Number was the substance of all things" (2012), but also "PRISM: The Beacon Frame" (2013), which makes visible the phone radio networks, and hence the possibility of surveillance in realtime of networks and data channels (Oliver 2014).
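As a purely illustrative sketch of the general logic of hiding data in plain sight within an image file (this is not Harvey's or Oliver's technique, and the filenames and marker below are hypothetical), one of the simplest approaches appends a payload after the image data, which most viewers simply ignore:

```python
# Minimal sketch of hiding a message "in plain sight" inside an image file.
# Most image viewers ignore bytes that follow the end of the image data,
# so the file still opens and displays as an ordinary picture.

MARKER = b"--hidden-message--"  # arbitrary delimiter, chosen purely for illustration

def embed(image_path, out_path, message):
    with open(image_path, "rb") as f:
        image = f.read()
    with open(out_path, "wb") as f:
        f.write(image + MARKER + message.encode("utf-8"))

def extract(stego_path):
    with open(stego_path, "rb") as f:
        data = f.read()
    return data.split(MARKER, 1)[1].decode("utf-8")

# Hypothetical usage: the carrier image "portrait.png" must already exist.
embed("portrait.png", "portrait_public.png", "meeting moved to dusk")
print(extract("portrait_public.png"))
```

The political point, rather than the technical one, is that the carrier circulates openly while the message does not, although, as noted above, detectable use of such techniques can itself attract attention.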

These artworks point towards the notion of "opaque presence" explored by Broeckmann (2010), who argues that in "the society of late capitalism – whether we understand it as a society of consumption, of control, or as a cybernetic society – visibility and transparency are no longer signs of democratic openness, but rather of administrative availability" (Broeckmann 2010). It is also suggestively explored by the poet Édouard Glissant, who believes that we should "agree not merely to the right to difference but, carrying this further, agree also to the right to opacity that is not enclosure within an irreducible singularity. Opacities can coexist and converge, weaving fabrics" (Glissant 1997: 190).

So this is not just a technical (e.g. cryptographic) practice. Indeed, crypto practices have to be rethought to operate on the terrain of the political and the technical simultaneously. Political activity, for example, is needed to legitimate these cryptographically enabled “dark places”: work with the system (to avoid paranoia and attack), with the public (to educate and inform about them), and with activists and others.

That is, we could think about these crypto-practices as (re)creating the possibility of being a crowd, both in terms of creating a sense of solidarity around the ends of a political/technical endeavour and in terms of the means which act as a condition of possibility for it. Thus we could say in a real sense that computer code can act to create “crowd source”, as it were, both in the technical sense of the computer source code, and in the practices of coming together to empower actors within a crowd, to connect to notions of the public and the common. But these crypto-practices could also help individuals to "look to comprehend how things fit together, how structural conditions and cultural conceptions are mutually generative, reinforcing, and sustaining, or delimiting, contradictory, and constraining. [They] would strive to say difficult things overlooked or purposely ignored by conventional thinking, to speak critically about challenging matters, to identify critical and counter-interests" (Goldberg 2014).

In contrast, and to think for a moment about the other side of the antinomy, liberal societies have a notion of a common good of access to information to inform democratic citizens, whilst also seeking to valorise that information. That is, the principle of visibility is connected not only to the notion of seeing one's representatives and the mechanisms of politics themselves but also to the knowledge that makes the condition of acting as a citizen possible.

Meanwhile, with the exploding quantity of information in society and the moves towards a digital economy, information is increasingly seen as a source of profit for capitalism if captured in an appropriate way. Indeed, data and information are said to be the new ‘oil’ of the digital age (e.g. Alan Greenspan 1971; Berry 2008: 41, 56). This highlights both the political and the economic desire for data. At the same time, the digital enables exploding quantities of data that are increasingly hard to contain within organisational boundaries.

One response to these computational changes in politics and the economy has been the kind of digital activism connected with whistleblowing and megaleaks, that is, the release of massive amounts of data into the public sphere and the use of social media and the internet to distribute it. These practices tend to take information out of the "black boxes" of corporations, governments and security services and to place information about their mechanisms, practices and machinations in the public domain. They seek, then, to counter the opaqueness of the organisational form, making use of the copyable nature of digital materials.

However, as megaleaks place raw data into the public sphere – usually as files and spreadsheets of data – there is a growing problem of being able to read and comprehend it, hence the growing need for journalists to become data journalists. Ironically, then, “opening the databanks” (Berry 2014: 178, Lyotard 1984: 67) creates a new form of opaqueness. Computational strategies are needed to read these new materials (e.g. algorithmic distant readings). Attached to the problem of information overload is the fact that this mechanism can also be harnessed by states seeking to attack megaleaks by counter-leaking and thereby delegitimating them. Additionally, in some senses the practices of Wikileaks are connected to creating an informational overload within organisations, both in terms of their inability to cope with the release of their data, and in the requirement to close communicational channels within the organisation. So information overload can become a political tactic both of control and of resistance.
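To illustrate what such a computational strategy might involve at its simplest, the following sketch performs a crude algorithmic distant reading, aggregating term frequencies across a directory of released documents (the directory name and file layout are hypothetical, and real data journalism obviously involves far more than word counting):

```python
import collections
import pathlib
import re

def distant_read(directory, top_n=20):
    """Crude distant reading: aggregate term frequencies across a document dump."""
    counts = collections.Counter()
    for path in pathlib.Path(directory).glob("*.txt"):
        text = path.read_text(errors="ignore").lower()
        counts.update(re.findall(r"[a-z]{4,}", text))  # words of four or more letters
    return counts.most_common(top_n)

# Hypothetical corpus of released files; prints the most frequent terms,
# a first pass at making an otherwise unreadable mass of data legible.
if __name__ == "__main__":
    for term, count in distant_read("leaked_documents/"):
        print(term, count)
```

Even this minimal pass begins to make an otherwise unreadable mass of files legible, which is precisely the new layer of computational mediation, and hence opacity, described above.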

But what is at stake here is not just the relationship between visibility and incarceration, nor the deterritorialisation and becoming-mobile made possible by computation. Rather it is the collapse of the “time lag between the world and its capture” (Chow 2012), when capture becomes real-time through softwarized monitoring technologies, and the mediation of “police” functions and control that this implies.

The question then becomes: what social force is able both to realise the critique of computational society and to block the real-time nature of computational monitoring? What practices become relevant when monitoring and capture become not only prevalent but actively engaged in? Tentatively, I would like to suggest embedding critical cryptographic practices in what Lovink and Rossiter (2013) call OrgNets (organised networks).

AntiSurveillance Feminist Party
But I would also like to suggest what we might call crypto-activism: the creation of systems of inscription that enable the writing of opaque codes and the creation of "opaque places". This is not just about making possible spaces of collectivity (“crowd source”) but also about hacking and jamming the realtime mediation of politics, dissent and everyday life (Deleuze 1992). As Glissant argues, "We clamour for the right to opacity for everyone" (Glissant 1997: 194). This, I think, calls for both a cartography of the hybridity of digital media (its post-digital materiality) and, importantly, the possible translation of crypto, as a concept and as a technical practice, into digital activism tactics.


Notes

[1] This post is drawn from a talk given at Digital Activism #Now: Information Politics, Digital Culture and Global Protest Movements, at King's College London (KCL), 04/04/14. See http://www.kcl.ac.uk/aboutkings/worldwide/initiatives/global/nas/news-and-events/events/eventrecords/Digital-Activism-Now-Information-Politics,-Digital-Culture-and-Global-Protest-Movements.aspx


Bibliography

Berry, D. M. (2008) Copy, Rip, Burn: The Politics of Copyleft and Open Source, London: Pluto Press.

Berry, D. M. (2011) The Philosophy of Software, London: Palgrave.

Berry, D. M. (2014) Critical Theory and the Digital, New York: Bloomsbury.

Broeckmann, A. (2010) Opaque Presence / Manual of Latent Invisibilities, Berlin: Diaphanes Verlag.

Chow, R. (2012) Entanglements, or Transmedial Thinking about Capture, London: Duke University Press.

CTI (2008) Poor Theory Notes: Toward a Manifesto, Critical Theory Institute, accessed 14/4/2014, https://www.humanities.uci.edu/critical/poortheory.pdf

Deleuze, G. (1992) Postscript on the Societies of Control, October, vol. 59, pp. 3-7. Available at https://files.nyu.edu/dnm232/public/deleuze_postcript.pdf

Foucault, M. (1991) Discipline and Punish, London: Penguin Social Sciences.

Glissant, E. (1997) The Poetics of Relation, Michigan: The University of Michigan Press.

Goldberg, D. T. (2014) Afterlife of the Humanities, accessed 14/04/2014, http://humafterlife.uchri.org

Harvey, A. (2014) Stealth Wear, accessed 04/04/2014, http://ahprojects.com/projects/stealth-wear/

Lovink, G. and Rossiter, N. (2013) Organised Networks: Weak Ties to Strong Links, Occupy Times, accessed 04/04/2014, http://theoccupiedtimes.org/?p=12358

Lyotard, J. F. (1984) The Postmodern Condition: A Report on Knowledge, Manchester: Manchester University Press.

Magdaleno, J. (2014) Is Facial Recognition Technology Racist?, The Creators Project, accessed 05/04/2014, http://thecreatorsproject.vice.com/blog/is-facial-recognition-technology-racist

Oliver, J. (2014) Julian Oliver, accessed 05/04/2014, http://julianoliver.com/output/

Rossiter, N. and Zehle, S. (2014) Toward a Politics of Anonymity: Algorithmic Actors in the Constitution of Collective Agency and the implications for Global Justice Movements, in Parker, M., Cheney, G., Fournier, V. and Land, C. (eds.) The Routledge Companion to Alternative Organization, London: Routledge.

01 April 2014

On Capture

In thinking about the conditions of possibility that make possible the mediated landscape of the post-digital (Berry 2014) it is useful to explore concepts around capture and captivation, particularly as articulated by Rey Chow (2012). Chow argues that being "captivated" is
the sense of being lured and held by an unusual person, event, or spectacle. To be captivated is to be captured by means other than the purely physical, with an effect that is, nonetheless, lived and felt as embodied captivity. The French word captation, referring to a process of deception and inveiglement [or persuade (someone) to do something by means of deception or flattery] by artful means, is suggestive insofar as it pinpoints the elusive yet vital connection between art and the state of being captivated. But the English word "captivation" seems more felicitous, not least because it is semantically suspended between an aggressive move and an affective state, and carries within it the force of the trap in both active and reactive senses, without their being organised necessarily in a hierarchical fashion and collapsed into a single discursive plane (Chow 2012: 48). 
To think about capture then is to think about the mediatized image in relation to reflexivity. For Chow, Walter Benjamin inaugurated a major change in the conventional logic of capture: from a notion of reality being caught or contained in the copy-image, such as in a repository, the copy-image becomes mobile, and this mobility adds to its versatility. The copy-image then supersedes or replaces the original as the main focus; as such, this logic of the mechanical reproduction of images undermines hierarchy and introduces a notion of the image as infinitely replicable and extendable. Thus the "machinic act or event of capture" creates the possibility for further dividing and partitioning, that is, for the generation of copies and images, and sets in motion the conditions of possibility of a reality that is structured around the copy.

Chow contrasts capture with the modern notion of "visibility", such that, as Foucault argues, "full lighting and the eyes of a supervisor capture better than darkness, which ultimately protected. Visibility is a trap" (Foucault 1991: 200). Thus in what might be thought of as the post-digital – a term that Chow doesn't use but which I think is helpful in thinking about this contrast – what is at stake is no longer the link between visibility and surveillance, nor the link between becoming-mobile and the technology of images, but rather the collapse of the "time lag" between the world and its capture.

This is when time loses its potential to "become fugitive" or "fossilised" and hence to be anachronistic. The key point is that the very possibility of memory is disrupted when images become instantaneous and therefore synonymous with an actual happening. This is the condition of the post-digital, whereby digital technologies make possible not only the instant capture and replication of an event, but also the very definition of the experience through its mediation, both at the moment of capture – such as with the waving smartphones at a music concert or event – and in the subsequent recollection and reflection on that experience.

Thus the moment of capture or "arrest" is an event of enclosure, locating and making possible the sharing and distribution of a moment through infinite reproduction and dissemination. So capture represents a techno-social moment, but it is also discursive in that it is a type of discourse derived from the imposition of power on bodies and the attachment of bodies to power. This Chow calls a heteronomy or heteropoiesis: a system or artefact designed by humans, with some purpose, not able to self-reproduce, yet able to exert agency in the form of prescription, often back onto its designers. Essentially it produces an externality in relation to the application of certain "laws" or regulations.

Nonetheless, capture and captivation also constitute a critical response through the possibility of a disconnecting logic and the dynamics of mimesis. This possibility, reflected in the notion of entanglements, refers to the "derangements in the organisation of knowledge caused by unprecedented adjacency and comparability or parity". This is, of course, definitional in relation to the notion of computation, which itself works through a logic of formatting, configuration, structuring and the application of computational ontologies (Berry 2011, 2014).

Here capture offers the possibility of a form of practice in relation to alienation by making the inquirer adopt a position of criticism, the art of making strange. Chow here is making links to Brecht and Shklovsky, and in particular to their respective predilections for estrangement in artistic practice, such as Brecht's notion of Verfremdung, and thus to showing how things work whilst they are being shown (Chow 2012: 26-28). In this moment of alienation the possibility is thus raised of things being otherwise. This is the art of making strange as a means to disrupt everyday conventionalism and refresh the perception of the world – art as device. The connections between techniques of capture and critical practice as advocated by Chow, and reading or writing the digital, are suggestive in relation to computation more generally, not only in artistic practice but also in terms of critical theory. Indeed, capture could be a useful hinge around which to subject the softwarization practices, infrastructures and experiences of computation to critical thought, both in terms of their technical and social operations and in terms of the extent to which they generate a coercive imperative for humans to live and stay alive under the conditions of a biocomputational regime.



Bibliography

Berry, D. M. (2011) The Philosophy of Software, London: Palgrave.

Berry, D. M. (2014) Critical Theory and the Digital, New York: Bloomsbury.

Chow, R. (2012) Entanglements, or Transmedial Thinking about Capture, London: Duke University Press.

Foucault, M. (1991) Discipline and Punish, London: Penguin Social Sciences.



12 February 2014

Digital/Post-digital

I want to take up the question of the definition of the "post-digital" again because I think that what the post-digital points towards as a concept is the multiple moments in which the digital was operative in various ways (see Berry 2014a, 2014b, 2014c). Indeed, historicising the “digital” can be a useful, if not crucial, step in understanding the transformation(s) of digital technologies. That is, we are at a moment whereby we are able to survey the various constellations of factors that made up a particular historical configuration around the digital, in which the “digital” formed an “imagined” medium to which existing analogue mediums were often compared, and in which the digital tended to be seen as suffering from a lack, e.g. not a medium for “real” news, for film, etc. The digital was another medium to place at the end (of the list) after all the other mediums were counted – and not a very good one. It was where the digital was understood, if it were understood at all, as a complement to other media forms, somewhat lacking, geeky, glitchy, poor quality and generally suited for toys, like games or the web, or for “boring” activities like accountancy or infrastructure. The reality is that in many ways the digital was merely a staging post, whilst computing capacity, memory, storage and display resolutions could fall in price and rise in power enough to enable a truly “post-digital” environment that could produce new mediated experiences. That is, it appears that the digital was “complementary” whereas the post-digital is zero-sum. Here is my attempt to sum up some of the moments that I think might serve as a provocation to debate the post-digital.


DIGITAL | POST-DIGITAL
Non-zero sum | Zero-sum
Objects | Streams
Files | Clouds
Programs | Apps
SQL databases | NoSQL storage
HTML | node.js/APIs
Disciplinary | Control
Administration | Logistics
Connect | Always-on
Copy/Paste | Intermediate
Digital | Computal
Hybrid | Unified
Interface | Surface
BitTorrent | Scraping
Participation | Sharing/Making
Metadata | Metacontent
Web 2.0 | Stacks
Medium | Platform
Games | World
Software agents | Compactants
Experience | Engagement
Syndication | Push notification
GPS | Beacons (IoTs)
Art | Aesthetics
Privacy | Personal Cloud
Plaintext | Cryptography
Responsive | Anticipatory
Tracing | Tracking
Surfing | Reading

figure 1: Digital to Post-Digital Shifts 

This table offers constellations or moments within a “digital” as opposed to a “post-digital” ecology, as it were, and, of course, a provocation to thought. But they can also be thought of as ideal types that can provide some conceptual stability for thinking, in an environment of accelerating technical change and dramatic and unpredictable social tensions in response to this. The question then becomes: to what extent can the post-digital counter-act the tendencies towards the domination of specific modes of thought in relation to instrumentality, particularly as manifested in computational devices and systems? For example, the contrast between the moments represented by Web 2.0 / Stacks provides an opportunity for thinking about how new platforms have been built on the older Web 2.0 systems, in some cases replacing them, and in others opening up new possibilities, which Tiziana Terranova (2014) has pointed to in her intriguing notion of “Red Stacks”, for example (and in contrast to Bruce Sterling's notion of “The Stacks”, e.g. Google, Facebook, etc.). Here I have been thinking of the notion of the digital as representing a form of “weak computation/computationality”, versus the post-digital as “strong computation/computationality”, and asking what the consequences would be for a society that increasingly finds that the weak computational forms (CDs, DVDs, laptops, desktops, Blogs, RSS, Android Open Source Platform [AOSP], open platforms and systems, etc.) are replaced by stronger, encrypted and/or locked-in versions (FairPlay DRM, Advanced Access Content System [AACS], iPads, Twitter, Push-notification, Google Mobile Services [GMS], Trackers, Sensors, ANTICRISIS GIRL, etc.).

These are not just meant to be thought of in a technical register; rather, the notion of “weak computation” points towards a “weak computational sociality” and “strong computation” points towards a “strong computational sociality”, highlighting the deeper penetration of computational forms into everyday life within social media and push-notification, for example. Even as the post-digital opens up new possibilities for contestation, e.g. megaleaks, data journalism, hacks, cryptography, dark nets, torrents, piratization, sub rosa sharing networks such as the Alexandria Project, etc., and new opportunities for creating, sharing and reading knowledges, the “strong computation” of the post-digital always already suggests the shadow of computation reflected in heightened tracking, surveillance and monitoring of a control society. The post-digital points towards a reconfiguration of publishing away from the (barely) digital techniques of the older book publishing industry, and towards the post-digital singularity of Amazonized publishing with its accelerated instrumentalised forms of softwarized logistics, whilst also simultaneously supporting new forms of post-digital craft production of books and journals, and providing globalised distribution. How then can we think about these contradictions in the unfolding of the post-digital and its tendencies towards what I am calling here “strong computation”, and in what way, even counter-intuitively, does the digital (weak computation) offer alternatives, even as marginal critical practice, and the post-digital (strong computation) create new critical practices (e.g. critical engineering), against the increasing interconnection, intermediation and seamless functioning and operation of the post-digital as pure instrumentality, horizon, and/or imaginary?



Bibliography

Berry, D. M. (2014a) The Post-Digital, Stunlaw, accessed 14/1/2014, http://stunlaw.blogspot.co.uk/2014/01/the-post-digital.html

Berry, D. M. (2014b) Critical Theory and the Digital, New York: Bloomsbury.

Berry, D. M. (2014c) On Compute, Stunlaw, accessed 14/1/2014,  http://stunlaw.blogspot.co.uk/2014/01/on-compute.html

Terranova, T. (2014) Red stack attack! Algorithms, capital and the automation of the common, EuroNomade, accessed 20/2/2014,  http://www.euronomade.info/?p=1708


23 January 2014

Marcuse and Objects

Herbert Marcuse
For Marcuse, the a priori concept of the object precedes and makes possible its appropriation by rational theory and practice. That is, the links between science, technology and society are shared in the form of experience created through the technological a priori that creates a quantifiable reality of science and hence an instrumentalizable reality for society (Feenberg 2013) – objects as such. That is, "when technics becomes the universal form of material production, it circumscribes an entire culture; it projects a historical totality – a 'world'" (Marcuse 1999: 154). In other words, "technology has become the great vehicle for reification – reification in its most mature and effective form" (Marcuse 1999: 168). As such, "the world tends to become the stuff of total administration, which absorbs even the administrators" (Marcuse 1999: 169). Thus, Marcuse argues that,
The science of nature develops under the technological a priori which projects nature as potential instrumentality, stuff of control and organisation. And the apprehension of nature as (hypothetical) instrumentality precedes the development of all particular technical organisation (Marcuse 1999: 153). 
Even experience itself becomes "corrupted" because of the way in which experience is mediated through technologies and scientific methods resulting in abstract labour and the fetishism of commodities (Feenberg 2013: 609). The measure of society is then, in this account, eliminated, depriving society and individuals of a means to critique or provide justifications against the prevailing a priori of technological rationality. This,
technological reality, the object world, (including the subjects) is experienced as a world of instrumentalities. The technological context predefines the form in which the objects appear... The object world is thus the world of a specific historical project, and is never accessible outside the historical project which organises matter, and the organisation of matter is at one and the same time a theoretical and a practical enterprise (Marcuse 1999: 219). 
As such, there are two moments that Marcuse identifies in relation to this, namely quantification and instrumentalization. He writes, firstly regarding quantification that,
The quantification of nature, which led to its explication in terms of mathematical structures, separated reality from all inherent ends and, consequently, separated the true from the good, science from ethics... And no matter how constitutive may be the role of the subject as point of observation, measurement, and calculation, this subject cannot play its scientific role as ethical or aesthetic or political agent (Marcuse 1999: 146-7)
Secondly he explains that it is claimed that,
Theoretically, the transformation of man and nature has no other objective limits than those offered by the brute factuality of matter, its still unmastered resistance to knowledge and control. To the degree which this conception becomes applicable and effective in reality, the latter is approached as a (hypothetical) system of instrumentalities; the metaphysical "being-as-such" gives way to "being-instrument." Moreover, proved in its effectiveness, this conception works as an a priori – it predetermines experience, it projects the direction of the transformation of nature, it organizes the whole (Marcuse 1999: 152). 
This creates a way of being, and experience of and set of practices towards everyday life that embody and realise this a priori in a number of moments across a life experience. Indeed, it develops an attitude or a towards-which that is infused with the instrumentality towards the world that conceives of it as being a world of entities which can be known, controlled, manipulated and if required transformed. Consequently,
the "correct" attitude towards instrumentality is the technical approach, the correct logos is techno-logy, which projects and responds to a technological reality. In this reality, matter as well as science is "neutral"; objectivity has neither a telos in itself nor is it structured towards a telos. But it is precisely its neutral character which relates objectivity to a specific historical Subject – namely, to the consciousness that prevails in the society by which and for which this neutrality is established. It operates in the very abstractions which constitute the new rationality – as an internal rather than external factor... the reduction of secondary to primary qualities, quantification and abstraction from "particular sorts of entities" (Marcuse 1999: 156). 
The question then becomes the extent to which this totalising system overwhelms the capacity for agency, and as such for a critical consciousness. Indeed, related to this is the important question of the relationship between science and technology itself, in as much as the question to be addressed is: is science prior to technology and therefore a condition of possibility for it? Or has science become technologised to the extent that science is now itself subjected to a technological a priori? The latter is a position held by Heidegger, for example. In other words, is science "complicit with the system of domination that prevails under capitalism" (Feenberg 2013: 609)? Indeed, Marcuse agreed that,
Critical analysis must dissociate itself from that which it strives to comprehend; the philosophic terms must be other than the ordinary ones in order to elucidate the full meaning of the latter. For the established universe of discourse bears throughout the marks of the specific modes of domination, organisation, and manipulation to which the members of a society are subjects (Marcuse 1999: 193). 
The danger of "one-dimensionality" that the lack of critical thought implies creates a form of modern reason that has domination built into its structure. Indeed, Horkheimer and Adorno argue,
The thing-like quality of the means, which makes the means universally available, its “objective validity” for everyone, itself implies a criticism of the domination from which thought has arisen as its means. On the way from mythology to logistics, thought has lost the element of reflection on itself, and machinery mutilates people today, even if it also feeds them. In the form of machines, however, alienated reason is moving toward a society which reconciles thought, in its solidification as an apparatus both material and intellectual, with a liberated living element, and relates it to society itself as its true subject. The particularist origin and the universal perspective of thought have always been inseparable. Today, with the transformation of the world into industry, the perspective of the universal, the social realization of thought, is so fully open to view that thought is repudiated by the rulers themselves as mere ideology (Horkheimer and Adorno 1999: 37; quoted in Feenberg 2013: 609).
How, then, to recover the capacity for reflection and thought, and thus to move to a new mode of experience, a "two dimensional experience responsive to the potentialities of people and things" (Feenberg 2013: 610)? This would require a new orientation towards potentiality, or what I call elsewhere possibility (Berry 2014), that would enable this new spirit of criticality, critical reason as such. In other words, the reconfiguring of quantification practices and instrumental processes away from domination (Adorno, Horkheimer, Marcuse) and control (Habermas), and instead towards reflexivity, critique and democratic practices.

For Feenberg this requires "counter-acting the tendencies towards domination in the technological a priori" through the "materialization of values" (Feenberg 2013: 613). This, he argues, can be found at specific intervention points within the materialisation of this a priori, such as in design processes. Feenberg argues that "design is the mediation through which the potential for domination contained in scientific-technical rationality enters the social world as a civilisational project" (Feenberg 2013: 613). Instead, Feenberg argues that the "socialist a priori" should inform the processes of technical implementation and technical practice. However, it seems to me that this misses the instrumentality implicit in design and design practices more generally, which often tend to maximise instrumental values in their application of concepts of efficiency and organisation. This, in some senses, requires a call for a radical politicisation of design, or a new form of critical design which is different from, and more revolutionary than, the form outlined by Dunne & Raby (2013). Here we might start making connections to new forms of rationality that offer possibilities to augment or perhaps replace instrumental rationalities, for example in the potentialities of critical computational rationalities, iteracies, and other computational competences whose performance and practice are not necessarily tied to instrumental notions of efficiency and order, nor to capitalist forms of reification (Berry 2014).




Bibliography

Berry, D. M. (2014) Critical Theory and the Digital, New York: Bloomsbury.

Dunne, A. and  Raby, F. (2013) Critical Design FAQ, accessed 23/1/2013, http://www.dunneandraby.co.uk/content/bydandr/13/0

Feenberg, A. (2013) Marcuse's Phenomenology: Reading Chapter Six of One-Dimensional Man, Constellations, Volume 20, Number 4, pp. 604-614.

Horkheimer, M. and Adorno, T. W. (1999) The Dialectic of Enlightenment, London: Verso.

Marcuse, H. (1999) One-dimensional Man, London: Routledge.



17 January 2014

Non-Media


French philosopher François Laruelle
If we take seriously the claims of François Laruelle, the French "non-philosopher",[1] that it is possible to undertake a "non-philosophy", a project that seeks out a "non-philosophical kernel" within a philosophical system, then what would be the implications of what we might call a "non-media"? For example, in seeking a "non-Euclidean" Marxism, Laruelle argues that we could uncover, in some sense, the non-philosophical "ingredient", as Galloway (2012: 6) calls it. That is, to find the non-Marxist "kernel" which serves as the starting point, both as "symptom and model". Indeed, Laruelle himself undertook such a project in relation to Marxism in Introduction au non-marxisme (2000), where he sought to "'philosophically impoverish' Marxism, with the goal of 'universalising' it through a 'scientific mode of universalisation'" (Galloway 2012: 194). That is, Laruelle, in Galloway's interpretation, seeks to develop "an ontological platform that, while leaving room for certain kinds of causality and relation, radically denies exchange in any form whatsoever" (Galloway 2012: 194). Indeed, Galloway argues,
deviating too from 'process philosophers' like Deleuze, who must necessarily endorse exchange at some level, Laruelle advocates a mode of expression that is irreversible. He does this through a number of interconnected concepts, the most important of which being 'determination-in-the-last-instance' (DLI). Having kidnapped the term from its Althusserian Marxist home, Laruelle uses DLI to show how there can exist causality that is not reciprocal, how a 'relation' can exist that is not at the same time a 'relation of exchange', indeed how a universe might look if it was not already forged implicitly from the mould of market capitalism (Galloway 2012: 195).
That is, the exploration and therefore refusal of exchange at the level of ontology, rather than the level of politics – at what we might call, following Laclau and Mouffe (2001), the political. Laruelle argues that this "philosophical" decision is the target of his critique,
What is probably wounding for philosophers is the fact that, from the point of view I have adopted, I am obliged to posit that there is no principle of choice between a classical type of ontology and the deconstruction of that ontology. There is no reason to choose one rather than the other. This is a problem that I have discussed at great length in my work (Les philosophies de la différence), whether there can be a principle of choice between philosophies. Ultimately, it is the problem of the philosophical decision (Laruelle, quoted in Mackay 2005).
Thus the diagnosis is not the lack of a philosophy at the centre of works, but rather an excess, which results in the subversion of a system of abstraction that in some sense problematically uses exchange as an axiomatic. This is the notion that exchange renders possible the full convertibility of entities as a form of philosophical violence towards the multiplicity of the "real", and founds a form of thinking that becomes hegemonic as a condition of possibility for thought – even radically anti-capitalist thought within philosophy as defined. Instead Laruelle suggests we get the essence of something from the "real", as it were. Indeed, when Derrida asked "Where do [you] get this [essence] from?" Laruelle answered "I get it from the thing itself" (Laruelle, quoted in Mackay 2005). Laruelle argues,
We start from the One, rather than arriving at it. We start from the One, which is to say that if we go anywhere, it will be toward the World, toward Being. And I frequently use a formulation which is obviously shocking to philosophers and particularly those of a Platonist or Plotinian bent: it’s not the One that is beyond Being, it is Being that is beyond the One. It is Being that is the other of the One (Laruelle, quoted in Mackay 2005).
Within this formulation, Galloway argues that there is evidence that Laruelle is a "vulgar determinist and unapologetically so". That is, that for Laruelle,
The infrastructure of the material base is a given-without-givenness because and only because of its ability to condition and determine – unidirectionally, irreversibly, and in the 'last instance' – whatever it might condition and determine, in this case the superstructure. Thus the infrastructure stands as 'given' while still never partaking in 'givenness', neither as a thing having appeared as a result of previous givenness, nor a present givenness engendering the offspring of subsequent givens (Galloway 2012: 199).
There are clear totalitarian implications in this formulation of determinism running from a material base, and in the resultant liquidation of the possibility of autonomy as a critical concept. Indeed, the overtones of a kind of scientific Marxism, read through a rather simplistic Newtonian theorisation of science, seem limited and regressive, not least politically, in that it surrenders self-determination and individuation to the causal first cause, called the "One", to which "clones" are subservient in this determinism. As Srnicek describes,
At the highest level, one ultimately reaches what is called the One – the highest principle from which everything derives. Now there are a number of reasons why this highest level must be one – meaning singular, unified and simple. The first basic reason is that if it weren't simple, then it could be decomposed into its constituent parts. The highest principle of reality must not admit of multiplicity, but must instead be the singular principle that itself explains multiplicity. Now as a simple principle, it must be impossible to predicate anything of it (Srnicek 2011: 2)
Thus,
Non-philosophy is not just a theory but a practice. It re-writes or re-describes particular philosophies, but in a non-transcendental form—non-aesthetics, non-Spinozism, non-Deleuzianism, and so on. It takes philosophical concepts and subtracts any transcendence from them in order to see them, not as representations, but as parts of the Real or as alongside the Real (Mullarkey 2006: 134).
An approach to media that incorporates non-philosophy, a "non-media", would then be a rigorous non-philosophical knowledge of the "kernel" of media, the deterministic causality grounded in an ontology of media that stresses its unidirectional causality and ultimate status as the ground of possibility. That is, a set of realist claims from a rigorously non-philosophical tradition, seeking to get at the core of media, from the "thing in itself". Indeed, Laruelle himself has talked about the links between philosophy and media, as Thacker outlines,
Near the end of his essay “The Truth According to Hermes,” François Laruelle points out the fundamental link between philosophy and media. All philosophy, says Laruelle, subscribes to the “communicational decision,” that everything that exists can be communicated. In this self-inscribed world, all secrets exist only to be communicated, all that is not-said is simply that which is not-yet-said. One senses that, for Laruelle, the communicational decision is even more insidious than the philosophical decision. It’s one thing to claim that everything that exists, exists for a reason. It’s quite another to claim that everything-that-exists-for-a-reason is immediately and transparently communicable, in its reason for existing. If the philosophical decision is a variant on the principle of sufficient reason, then the communicational decision adds on top of it the communicability of meaning (Thacker 2010: 24).
Hermes was the swift-footed messenger,
trusted ambassador of all the gods,
and conductor of shades to Hades. 
Indeed, Laruelle disdains the communicational: "meaning, always more meaning! Information, always more information! Such is the mantra of hermeto-logical Difference, which mixes together truth and communication, the real and information" (Laruelle, quoted in Thacker 2010: 24). A radically realist non-media would then dismiss the interpretative moment of understanding in favour of fidelity to the One, the possibility of the source of the communicational in terms of the "material" base from which all causality springs.[2] This would seem to be a step that dismisses not only any possibility of a philosophical or theoretical understanding of media in terms of its materiality, as such, but also the possibility of any agency created as a result of the material or technical a priori of media. In this sense, it is not difficult to share Derrida's repudiation of the possibility of a non-philosophy, and also to question its claims to work at the level of ontology, outside of philosophy and outside of interpretation (see Mackay 2005). Instead, does such a claim rather represent a totalising moment in thought, an example or claim of a non-mediated experience, devoid of thought itself and therefore of the possibility of critical reason and politics? The "terror" of the real, in such a formulation, represents not a radical break with contemporary thought, here cast as "philosophical", but rather the real as the horizon of thought and its limit.[3]



Notes

[1] Ray Brassier (2003) has described François Laruelle as "the most important unknown philosopher working in Europe today".
[2] It appears that the notion of "material" in this account, increasingly looks less like a historical materialist account and rather as a synchronic metaphysics cast as a "realism" outside of human history as such. Philosophy, history, culture and so forth being merely the epiphenomenon "determined" by the "material" or perhaps better, real, base of the "thing in itself".
[3] It is worth noting the contradiction of a position claiming such an overarching determinism stemming from the One: it will inevitably undermine its own claims to veracity, since such determinism would naturally have "caused" Laruelle to have written his books in the first place, providing no possibility of agency to assess the claims made, as the individual agency (such as there is) of readers and commentators would also be locked into this deterministic structure. Such that, even were one to detect such claims, one's consciousness, having been formed from this source, would itself be tainted by that determinism.

Bibliography

Brassier, R. (2003) Axiomatic Heresy: The Non-Philosophy of Francois Laruelle, Radical Philosophy 121, Sep/Oct 2003.

Galloway, A. R. (2012) Laruelle, Anti-Capitalist, in Mullarkey, J. and Smith, A. P. (eds.) Laruelle and Non-Philosophy, Edinburgh: Edinburgh University Press.

Laclau, E. and Mouffe, C. (2001) Hegemony and Socialist Strategy: Towards a Radical Democratic Politics, London: Verso Books.

Mackay, R. (2005) Controversy over the Possibility of a Science of Philosophy, La Decision Philosophique No. 5, April 1988, pp. 62-76, accessed 17/01/2014, http://pervegalit.files.wordpress.com/2008/06/laruelle-derrida.pdf

Mullarkey, J. (2006) Post-continental Philosophy: An Outline, London: Continuum.

Srnicek, N. (2011) François Laruelle, the One and the Non-Philosophical Tradition, Pli: The Warwick Journal of Philosophy, 22, 2011, pp. 187-198, accessed 17/1/2014, https://www.academia.edu/947355/Francois_Laruelle_the_One_and_the_Non-Philosophical_Tradition

Thacker, E. (2010) Mystique of Mysticism, in Galloway, A. R., French Theory Today: An Introduction to Possible Futures, Published by TPSNY/Erudio Editions.

14 January 2014

Questions from a Worker Who Codes


In relation to the post-digital, it is interesting to ask the question as to the extent to which the computational is both the horizon of, and the gatekeeper to, culture today (Berry 2014a). If code operates as the totalising mediator of culture, if not the condition for such culture, then access to both culture and code should become social, political and aesthetic questions. This is partially bound up with questions of literacy and the scope of such knowledges, usually framed within the context of computational competence within a particular programming language. This question returns again and again in relation to the perceived educative level of a population in order to partake of the commonality shared within a newly post-digital culture – should one code? In other words, to what extent must a citizen be able to read and interact with the inscriptions that are common to a society?  Indeed, in the register of art, for example, Brecht considered the question itself to be superfluous, in as much as providing an opportunity of access and therefore praxis opens the possibility of such experiences and understanding. He writes,
one need not be afraid to produce daring, unusual things for the proletariat so long as they deal with its real situation. There will always be people of culture, connoisseurs of art, who will interject: “Ordinary people do not understand that.” But the people will push these persons impatiently aside and come to a direct understanding with artists (Brecht 2007: 84).
In relation to the practices of code itself, it is, of course, not a panacea for all the ills of society. It is, on the other hand, a competence that increasingly marks itself out as a practice which creates opportunities to interact with and guide one's life in relation to being able to operate and define how the computational functions in relation to individuation processes (see Stiegler 2013, Cowen 2013, Economist 2014). Not only that: as the epistemic function of code grows in relation to the transformation of previous media forms into a digital substrate, and the associated softwarization of the process, culture is itself transformed and the possibilities for using and accessing that culture change too. Indeed, Bifo argues that, without such competences, "the word is drawn into this process of automation, so we find it frozen and abstract in the disempathetic life of a society that has become incapable of solidarity and autonomy" (Berardi 2012: 17). For Berardi, cognitive labour would then have become disempowered and subjected to what he calls "precarization" (Berardi 2012: 141). In response he calls for an "insurrection", in as much as "events" can generate the "activation of solidarity, complicity, and independent collaboration between cognitarians", that is, "between programmers, hardware technicians, journalists, and artists who all take part in an informational process" (Berardi 2012: 142-3).

The aim of this literacy, if we can call it that, in relation to the computational, and which is similar to what I have called iteracy elsewhere (Berry 2014b), is also connected to notions of reflexivity, critique, and emancipation in relation to the mechanisation of not only labour, but also culture and intellectual activities more generally. Understanding the machine, as it were, creates the opportunity to change it, and to give citizens the capacity to imagine that things might be other than they are.

This is important to avoid a situation whereby the proletarianisation of labour is followed by the capacity of machines to proletarianise intellectual thought itself. That is, that machines define the boundaries of how, as a human being, one must conduct oneself, as revealed by a worker at a factory in France in the 1960s, who commented that "to eat, in principle, one must be hungry. However, when we eat, it’s not because we’re hungry, it’s because the electronic brain thought that we should eat because of a gap in production" (Stark 2012: 125). Delegation into the machine of the processes of material and intellectual production abstracts the world into a symbolic representation within the processes of machine code. It is a language of disconnection, a language that disables the worker, but simultaneously disables the programmer, or cognitive worker, who no longer sees another human being, but rather an abstract harmony of interacting objects within a computational space – that is, through the application of compute (Berry 2014c). This is, of course, a moment of reification, and as such code and software act as an ideological screen for the activities of capitalism, and for the harsh realities of neoliberal restructuring and efficiencies, the endless work,[1] made possible by such softwarization. Indeed, under capital,
time sheds its qualitative, variable, flowing nature; it freezes into an exactly delimited, quantifiable continuum filled with quantifiable 'things' (the reified, mechanically objectified 'performance' of the worker, wholly separated from his total human personality): in short, it becomes space. In this environment where time is transformed into abstract, exactly measurable, physical space, an environment at once the cause and effect of the scientifically and mechanically fragmented and specialised production of the object of labour, the subjects of labour must likewise be rationally fragmented. On the one hand, the objectification of their labour-power into something opposed to their total personality (a process already accomplished with the sale of that labour-power as a commodity) is now made into the permanent ineluctable reality of their daily life. Here, too, the personality can do no more than look on helplessly while its own existence is reduced to an isolated particle and fed into an alien system. On the other hand, the mechanical disintegration of the process of production into its components also destroys those bonds that had bound individuals to a community in the days when production was still 'organic'. In this respect, too, mechanisation makes of them isolated abstract atoms whose work no longer brings them together directly and organically; it becomes mediated to an increasing extent exclusively by the abstract laws of the mechanism which imprisons them (Lukács 1971: 90).
But of course here it is not seconds and minutes measured by "the pendulum of the clock [that] has become as accurate a measure of the relative activity of two workers as it is of the speed of two locomotives", but rather the microsecond and millisecond time of code, combined with new forms of sensors and distributed computational devices that measure time. Indeed, "time is everything, [humans are] nothing; they are at the most the incarnation of time. Quality no longer matters. Quantity alone decides everything: hour for hour, day for day" (Marx 1976: 125). For it is in the spaces of such quantification that lie both the obfuscation of the realities of production and the possibility of changing production to a more democratic and humane system that makes, as Stiegler claims, "a life worth living" (Stiegler 2009).[2]



Notes

[1] It is interesting to think about the computational imaginary in relation to the notion of "work" that this entails or that is coded/delegated into the machine algorithms of our post-digital age. Campagna (2013) has an interesting formulation of this in a book that Newman (2012: 93) has called "nothing less than a new updated Ego and Its Own for our contemporary neoliberal age". Indeed, Campagna writes, "westerners had to find a way of adapting this mystical exercise to the structures of contemporary capitalism. What would a mantra look like, in the heart of a global metropolis of the 21st Century? What other act might be able to host its obsessive spirit, whilst functioning like a round, magic shield, covering the frightened believers from their fear of freedom? There was only one possible, almost perfect candidate. The activity of repetition par excellence: Work. The endless chain of gestures and movements that had built the pyramids and dug the mass graves of the past. The seal of a new alliance with all that is divine, which would be able to bind once again the whole of humanity to a new and eternal submission. The act of submission to submission itself. Work. The new, true faith of the future" (Campagna 2013: 10). Here, though, I argue that it is not immaterial apparitions and spectres which are haunting humanity and from which the Egoist can break free, but the digital materiality of computers' abstractions, formed of algorithms and code, which are a condition of possibility for individuation and subjectivity itself within cognitive capitalism.
[2] As Stark writes,  "for a worker to claim the right to create—to theoretically “unalienated” labor—was a gesture as threatening to the factory bosses as it was to the official organs of the left, with their vision of the worker acceding to a state of being-in-oneself through work. Regarding this form of sociological indeterminacy, Rancière argues that “perhaps the truly dangerous classes are . . . the migrants who move at the border between classes, individuals and groups who develop capabilities within themselves which are useless for the improvement of their material lives and which in fact are liable to make them despise material concerns.” Further, for Rancière, “Working- class emancipation was not the affirmation of values specific to the world of labor. It was a rupture in the order of things that founded these ‘values,’ a rupture in the traditional division [partage] assigning the privilege of thought to some and the tasks of production to others.” Binetruy affirms this rupture, recalling that while initially wary of “these Parisians who came stuffed with film and cameras,” he quickly realized that “they did not come to teach us any lessons, but rather to transmit technical training that would liberate our spirits through our eyes. Once you have put your eyes behind a camera, you are no longer the same man, your perspective has changed.”" (Stark 2012: 150).

Bibliography

Berardi, F. (2012) The Uprising: On Poetry and Finance, London: Semiotext(e).

Berry, D. M. (2014a) The Post-Digital, Stunlaw, accessed 14/1/2014, http://stunlaw.blogspot.co.uk/2014/01/the-post-digital.html

Berry, D. M. (2014b) Critical Theory and the Digital, New York: Bloomsbury.

Berry, D. M. (2014c) On Compute, Stunlaw, accessed 14/1/2014,  http://stunlaw.blogspot.co.uk/2014/01/on-compute.html

Brecht, B. (2007) Popularity and Realism, in Aesthetics and Politics, London: Verso Press.

Campagna, F. (2013) The Last Night: Anti-Work, Atheism, Adventure, London: Zero Books.

Cowen, T. (2013) Average Is Over: Powering America Beyond the Age of the Great Stagnation, London: Dutton Books.

Economist (2014) Coming to an office near you, The Economist, accessed 16/01/2014, http://www.economist.com/news/leaders/21594298-effect-todays-technology-tomorrows-jobs-will-be-immenseand-no-country-ready

Lukács, G. (1971) History and Class Consciousness: Studies in Marxist Dialectics, MIT Press.

Marx, K. (1976) The Poverty of Philosophy, in Karl Marx and Frederick Engels, Collected Works, Volume 6, 1845–1848, London: Lawrence & Wishart.

Newman, S. (2013) Afterword, in Campagna, F. (2013) The Last Night: Anti-Work, Atheism, Adventure, London: Zero Books, pp. 92-5.

Stark, T. (2012) “Cinema in the Hands of the People”: Chris Marker, the Medvedkin Group, and the Potential of Militant Film, OCTOBER, 139, Winter 2012, pp. 117–150.

Stiegler, B. (2009) What Makes Life Worth Living: On Pharmacology, Cambridge: Polity Press.

05 January 2014

On Compute


Today, the condition of possibility for the milieu of contemporary life is compute. That is, compute as the abstract unit of computation, both as dunamis (potentiality) and energeia (actuality), that is, as the condition of possibility for the question of the in-itself and the for-itself. Compute as a concept exists in two senses: as the potential contained in a computational system, or infrastructure, and as the actuation of that potential in actual work. Whilst always already a theoretical limit, compute is also the material that may be brought to bear on a particular computational problem – and many problems are now indeed computational problems. The theoretical question posed by compute is thus directly relevant to the study of software, algorithms and code, and therefore to the contemporary condition in computal society, because it represents the moment of potential in the transformation of inert materials into working systems. It is literally the computational unit of "energy" that is supplied to power the algorithms of the world's systems. Compute, then, is a notion of abstract computation, but it is also the condition of possibility for, and the potential actuation of, that reserve power of computation in a particular task. Compute becomes a key noetic means of thinking through the distribution of computation in the technological imaginary of computal society.

In a highly distributed computational environment, such as we live in today, compute is itself distributed around society, carried in pockets, accessible through networks and wireless connections, and pooled in huge computational clouds. Compute then is not only abstract but lived and enacted in everyday life; it is part of the texture of life, not just as a layer upon life but as a structural possibility for, and mediation of, such living. But crucially, compute is also an invisible factor in society, partly due to the obfuscation of the technical conditions of the production of compute, but also due to the necessity for an interface, a surface, with which to interact with compute. Compute as a milieu is therefore never seen as such, even as it surrounds us and is constantly interacting with and framing our experiences. Indeed, Stiegler (2009) writes that,
Studying the senses, Aristotle underlines in effect that one does not see that, in the case of touching, it is the body that forms the milieu, whereas, for example, in the case of sight, the milieu is what he calls the diaphane. And he specifies that this milieu, because it is that which is most close, is that which is structurally forgotten, just as water is for a fish. The milieu is forgotten, because it effaces itself before that to which it gives place. There is always already a milieu, but this fact escapes us in the same way that "aquatic animals," as Aristotle says, "do not notice that one wet body touches another wet body" (423ab): water is what the fish always sees; it is what it never sees. Or, as Plato too says in the Timaeus, if the world was made of gold, gold would be the sole being that would never be seen – it would not be a being, but the inapparent being of that being, appearing only in the occurrence of being, by default (Stiegler 2009: 13-14)
In this sense, compute is the structural condition of possibility that makes the milieu possible by giving it place, in as much as it creates those frameworks within which technicity takes place. The question of compute, both as a theoretical concept and as a technical definition, is therefore crucial for thinking through the challenge of computation more broadly. But in a rapidly moving world of growing computational power, comparative analysis of computational change is difficult without a metric by which to compare different moments historically. This is made much more difficult by the fact that compute is not simply the speed and bandwidth of a processor as such, but includes a number of other related technical considerations, such as the speed of the underlying motherboard, RAM, graphics processor(s), storage system and so forth.

Compute then is a relative concept and needs to be thought about in relation to previous iterations, and this is where benchmarking has become an important part of the assessment of compute – for example SPECint, a computer benchmark specification for a processor's integer processing power maintained by the Standard Performance Evaluation Corporation (SPEC 2014). Another, GeekBench (2013), scores compute against a baseline score of 2500, which is the score of an Intel Core i5-2520M @ 2.50 GHz. In contrast, SYSmark 2007, another benchmark, attempts to bring "real world" applications into the processing measurement by including a number of ideal systems that run canned processing tasks (SYSmark 2007). As can be seen, comparing compute becomes a spectrum of benchmarks that test a variety of working definitions of forms of processing capacity. It is also unsurprising that, as a result, many manufacturers create custom modes within their hardware to "game" these benchmarks, unfortunately obfuscating these definitions and comparators. For example,
Samsung created a white list for Exynos 5-based Galaxy S4 phones which allow some of the most popular benchmarking apps to shift into a high-performance mode not available to most applications. These apps run the GPU at 532MHz, while other apps cannot exceed 480MHz. This cheat was confirmed by AnandTech, who is the most respected name in both PC and mobile benchmarking. Samsung claims “the maximum GPU frequency is lowered to 480MHz for certain gaming apps that may cause an overload, when they are used for a prolonged period of time in full-screen mode,” but it doesn’t make sense that S Browser, Gallery, Camera and the Video Player apps can all run with the GPU wide open, but that all games are forced to run at a much lower speed (Schwartz 2013).
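To make the relative character of such benchmarking concrete, the short sketch below (in Python) shows how a GeekBench-style comparison against a baseline score of 2500 might be reckoned. The scores used are invented purely for illustration and do not describe the methodology of any actual benchmark.

# A minimal sketch of relative benchmark scoring, assuming a GeekBench-style
# baseline of 2500 (the score of an Intel Core i5-2520M @ 2.50 GHz).
# All scores below are hypothetical, for illustration only.

BASELINE_SCORE = 2500

hypothetical_scores = {
    "reference laptop (baseline)": 2500,
    "older netbook": 900,
    "recent desktop workstation": 7200,
}

for system, score in hypothetical_scores.items():
    relative = score / BASELINE_SCORE
    print(f"{system}: score {score}, {relative:.2f}x the baseline")

The point of the sketch is simply that such scores only acquire meaning relationally, against a stipulated reference machine – which is precisely what makes them open to the kind of gaming described in the Samsung example above.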
On a material register the unit of compute can be thought of as roughly the maximum potential processing capacity of a computer processing chip running for a notional hour. In today's softwarized landscape, of course, processing power itself becomes a service and hence is more often framed in terms of virtual machines (VMs) rather than actual physical machines – a number of compute instances can be realised on a single physical processor using sophisticated software to manage the illusion. Amazon itself defines compute through an abstraction of actual processing as follows,
Transitioning to a utility computing model fundamentally changes how developers have been trained to think about CPU resources. Instead of purchasing or leasing a particular processor to use for several months or years, you are renting capacity by the hour. Because Amazon EC2 is built on commodity hardware, over time there may be several different types of physical hardware underlying EC2 instances. Our goal is to provide a consistent amount of CPU capacity no matter what the actual underlying hardware (Amazon 2013).
Indeed, Amazon tends to discuss compute in relation to its EC2 Compute Unit (ECU) to enable this discretisation.[1] Google also uses an abstract quantity and measures "minute-level increments" of computational time (Google 2013). The key is to begin thinking about how an instance provides a predictable amount of dedicated compute capacity and, as such, is a temporal measure of computational power, albeit one seemingly defined rather loosely in the technical documentation.
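To give a sense of what such a temporal measure of compute might look like, the following sketch (again in Python) treats compute consumed as abstract capacity multiplied by the time for which it is held – an ECU-hour style reckoning. The instance sizes and durations are hypothetical, and the calculation is only a schematic approximation of how providers actually meter and bill usage.

# A rough sketch of compute as a temporal measure: abstract capacity
# (for instance Amazon's ECU) multiplied by the number of instances and
# the time for which they run. All figures are hypothetical.

def compute_consumed(units_per_instance: float, instances: int, hours: float) -> float:
    """Total abstract compute consumed: capacity x instances x time."""
    return units_per_instance * instances * hours

# e.g. ten hypothetical 2-unit instances running for an eight-hour working day
total = compute_consumed(units_per_instance=2.0, instances=10, hours=8.0)
print(f"{total} compute-unit-hours consumed")  # prints: 160.0 compute-unit-hours consumed

What matters here is less the arithmetic than the framing: compute is rendered as a metered, rentable quantity of time, which is exactly the discretisation that the EC2 Compute Unit and Google's minute-level increments perform.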

The question of compute is then a question of the origin of computation more generally, but also of how the infrastructure of computation can be understood both qualitatively and quantitatively. Indeed, it is clear that the quantitative changes that greater compute capacity introduces make possible the qualitative experience of computation that we increasingly take for granted in our use of a heavily software-textured world. To talk about software, processes, algorithms and code is therefore deficient without a corresponding understanding of the capacity of compute in relation to them, and this remains a key question for thinking about the conditions of possibility that computation creates for our lives today.


Notes

[1] Amazon used to define the ECU directly, stating: "We use several benchmarks and tests to manage the consistency and predictability of the performance of an EC2 Compute Unit. One EC2 Compute Unit provides the equivalent CPU capacity of a 1.0-1.2 GHz 2007 Opteron or 2007 Xeon processor. This is also the equivalent to an early-2006 1.7 GHz Xeon processor referenced in our original documentation" (Berninger 2010). They appear to have stopped using this description in their documentation (see Amazon 2013). 

Bibliography

Amazon (2013) Amazon EC2 FAQs, accessed 05/01/2014, http://aws.amazon.com/ec2/faqs/#What_is_an_EC2_Compute_Unit_and_why_did_you_introduce_it

Berninger, D. (2010) What the heck is an ECU?,  accessed 05/01/2014, http://cloudpricecalculator.com/blog/hello-world/

GeekBench (2013) GeekBench Processor Benchmarks, accessed 05/01/2014, http://browser.primatelabs.com/processor-benchmarks

Google (2013) Compute Engine — Google Cloud Platform, accessed 05/01/2014, https://cloud.google.com/products/compute-engine/

Schwartz, R. (2013) The Dirty Little Secret About Mobile Benchmarks,  accessed 05/01/2014, http://mostly-tech.com/tag/geekbench/

SPEC (2014) The Standard Performance Evaluation Corporation (SPEC), accessed 05/01/2014, http://www.spec.org

Stiegler, B. (2009) Acting Out, Stanford University Press.

SYSmark (2007) SYSmark 2007 Preview, accessed 05/01/2014, http://bapco.com/products/sysmark-2007#details-product-info
