28 September 2012

Coping Tests as a Method for Software Studies

In this post I want to begin to outline a method for software reading that in some senses can form the basis of a method in software studies more generally. The idea is to use the pragmata of code, combined with its implicit temporality and goal-orientedness, to develop an idea of what I call coping tests. This notion draws on Heidegger's idea of "coping" as a specific means of experiencing that takes account of the at-handiness (Zuhandenheit) of equipment (that is, entities/things/objects which are being used in action) – in other words, coping tests help us to observe the breakdowns of coded objects. This is useful because it helps us to think about the way software/code is in some senses a project: not just static text on a screen, but a temporal structure that has a past, a processing present, and a futural orientation towards the completion (or not) of a computational task. I want to develop this in contrast to attempts by others to approach code either through a heavily textual approach (critical code studies tends in this direction), or else through a purely functionality-driven approach (which can have idealist implications in some forms, whereby a heavily mathematised approach tends towards a platonic notion of form).

In my previous book, The Philosophy of Software (Berry 2011), I use obfuscated code as a helpful example – not as a case of unreadable reading, or even for the spectacular; rather I use it as a stepping-off point to talk about the materiality of code through the notion of software testing. Obfuscated code is code deliberately written to be unreadable to humans but perfectly readable to machines. This can take a number of different forms, from simply mangling the text (from a human point of view), to distraction techniques, such as confusing or deliberately mislabeling variables, functions, calls, etc. It can even take the form of aesthetic effects, like drawing obvious patterns, streams, and lines in the code, or forming images through the arrangement of the text.

Testing is a hugely important part of the software lifecycle: it links the textual source code to the running software and creates the feedback cycle between the two. This I linked to Callon and Latour's (via Boltanski and Thevenot) use of the notion of 'tests' (or trials of strength) – implying that it is crucially the running of these obfuscated code programs that shows that they are legitimate code (they call these legitimate tests), rather than nonsense. The fact that they are unreadable by humans and yet testable is very interesting – more so as they become aesthetic objects in themselves, as programmers start to create ASCII art both as a way of making the code, unreadable as text, readable as an image, and as a way of adding another semiotic layer to the meaning of the code's function.

The nature of coping that these tests imply (as trials of strength), combined with the mutability of code, is then constrained through limits placed in terms of the testing and structure of the project-orientation. This is also how restrictions are delegated into the code, which serve as what Boltanski and Thevenot call legitimate tests and can then be retested through 'trials of strength'. The borders of the code are also enforced through these trials of strength, which define the code qua code – in other words, as the required/tested coded object. It is important to note that these can also be reflexively "played with" in terms of clever programming that works at the borderline of acceptability for programming practices (hacking is an obvious example of this).

In other words, testing as coping tests can be understood in two different modes: (i) ontic coping tests, which legitimate and approve the functionality and content of the code – in other words, that the code is doing what it should, instrumentally, ethically, etc. Here we need to work and think at a number of different levels, of course – unit testing, application testing, user interface testing, and system testing more generally – in addition to taking account of the context and materialities that serve as conditions of possibility for testing (so this could take the form of a number of approaches, including ethnographies, discursive approaches, etc.); and (ii) ontological coping tests, which legitimate the code qua code – that it is code at all – for example, authenticating that the code is the code we think it is. We can think of code signing as an example of this, although it has a deeper significance as the quiddity of code. This then takes a more philosophical approach towards how we can understand, recognise or agree on the status of code as code and identify underlying ontological structural features, etc.
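The two modes can be sketched in miniature. What follows is only an illustration under stated assumptions: the `discount` function is a hypothetical coded object, and a stored SHA-256 digest of its source text stands in, very roughly, for the full cryptographic machinery of code signing mentioned above.

```python
import hashlib

def discount(price, rate):
    """A hypothetical coded object under test."""
    return price * (1 - rate)

# (i) An ontic coping test: legitimating the functionality -- is the
# code doing what it should?
assert abs(discount(100.0, 0.10) - 90.0) < 1e-9

# (ii) An ontological coping test: legitimating the code qua code -- is
# this the code we think it is? We hash the source text and authenticate
# against a known fingerprint (a crude stand-in for code signing).
source = "def discount(price, rate):\n    return price * (1 - rate)\n"
fingerprint = hashlib.sha256(source.encode("utf-8")).hexdigest()

def authenticate(expected):
    """Breaks down (returns False) if the code is not what it claims to be."""
    return fingerprint == expected

assert authenticate(fingerprint)       # the code is the code we think it is
assert not authenticate("deadbeef")    # a mismatched signature fails the test
```

The first test can pass while the second fails (tampered code can still "work"), which is precisely why the two modes are distinct.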

For critical theory, I think tests are a useful abstraction as an alternative (or addition) to the close reading of source code. This can be useful from a humanities perspective for teaching some notions of 'code' through the idea of 'iteracy' for reading code, and will be discussed throughout my new book, Critical Theory and the Digital, in relation to critical readings of software/code opened up through the categories given by critical theory. But this is also extremely important for contemporary critical researchers and students, who require a much firmer grasp of computational principles in order to understand the economy, culture and society that have become softwarized, but also more generally for the humanities today, where some knowledge of computation is becoming required to undertake research.

One of the most interesting aspects of this approach, I think, is that it helps sidestep the problems associated with literally reading source code, and the problematic of computational thinking in situ as a programming practice. Coping tests can be developed within a framework of "depth", in as much as different kinds of tests can be performed by different research communities – in some senses this is analogous to a test suite in programming. For example, one might have UI/UX coping tests, functionality coping tests, API tests, forensic tests (linking to Matthew Kirschenbaum's notion of forensic media), and even archaeological coping tests (drawing from media archaeology, and particularly theorists such as Jussi Parikka). Here I am thinking both in terms of coping tests written in the present to "test" the "past", as it were, and of the history of software testing itself, which could be reconceptualised through this notion of coping tests – both as test scripts (discursive) and in terms of software programming practice more generally, social ontologies of testing, testing machines, and so forth.[1] We might also think about the possibilities for thinking in terms of social epistemologies of software (drawing on Steve Fuller's work, for example).
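The test-suite analogy can itself be sketched in code. Every name below is a hypothetical illustration of the "depth" framework, not an actual toolkit: each layer of tests probes the same coded artifact from a different research perspective, and the report shows where the artifact copes and where it breaks down.

```python
def functionality_test(artifact):
    """Functionality layer: does the artifact compute what it claims to?"""
    return artifact["compute"](2, 3) == 5

def interface_test(artifact):
    """UI/UX layer: does the artifact present itself legibly?"""
    return artifact.get("label", "") != ""

def forensic_test(artifact):
    """Forensic layer (after Kirschenbaum): does the artifact carry
    traces of its own production, e.g. authorship metadata?"""
    return "author" in artifact.get("metadata", {})

SUITE = [functionality_test, interface_test, forensic_test]

def run_coping_tests(artifact):
    """Run the whole suite and report coping vs. breakdown per layer."""
    return {test.__name__: test(artifact) for test in SUITE}

artifact = {
    "compute": lambda a, b: a + b,
    "label": "adder",
    "metadata": {},   # no authorship trace: the forensic test breaks down
}
print(run_coping_tests(artifact))
# {'functionality_test': True, 'interface_test': True, 'forensic_test': False}
```

The point of the sketch is simply that different communities could contribute different layers to such a suite, just as unit and interface tests coexist in ordinary programming practice.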

As culture and society are increasingly softwarized, it seems to me that it is very important that critical theory is able to develop concepts in relation to software and code, as the digital. In a later post I hope to lay out a framework for studying software/code through coping tests and a framework/method with case studies (which I am developing with Anders Fagerjord, from IMK, Oslo University).




Notes

[1] Perhaps this is the beginning of a method for what we might call software archaeology. 


15 September 2012

New Aesthetic Argumentum Ad Hominem


Papercraft Self Portrait - 2009 (Testroete)
One of the most frustrating contemporary ways to attack any new idea, practice or moment is to label it as "buzz-worthy" or an "internet meme". The weakness of this attack should be obvious, but strangely it has become a powerful way to dismiss things without applying any critical thought to the content of the object of discussion. In other words, it is argumentation petitio principii, where the form of the argument is "the internet meme, the new aesthetic, should be ignored because it is an internet meme". Or even, in some forms, an argumentum ad hominem, where the attack is aimed at James Bridle (as the originator of the term) rather than the new aesthetic itself. Equally, the attacks may also be combined.

I think the whole 'internet meme', 'buzz', 'promotional strategy' angle on the new aesthetic is indicative of a wider set of worries in relation to a new scepticism, as it were (related also, possibly, to the skepticism movement). We see it on Twitter, where the medium of communication seems to encourage a kind of mass scepticism, where one (the one as Das Man) assumes a priori that the other side is blindly following, a 'fanboy', irrational, suspect, or somehow beholden to a dark power to close, restrict or tighten individual freedoms – of course, the 'I' is smart enough to reject the illusion and unmask the hidden forces. This is also, I think, a worry of being caught out, being laughed at, or distracted by (yet) another internet fad. I also worry that the new aesthetic 'internet meme' criticism is particularly ad hominem, usually aimed, as it is, towards its birth within the creative industries. I think we really need to move on from this level of scepticism and be more dialectical in our attitude towards the possibilities in, and suggested by, the new aesthetic. This is where critical theory can be a valuable contributor to the debate.

For example, part of the new aesthetic is a form of cultural practice which is related to a postmodern and fundamentally paranoid vision of being watched, observed, coded, processed or formatted. I find particularly fascinating the aesthetic dimension to this, in as much as the representational practices are often (but not always) retro, and in some senses tangential to the physical, cultural, or even computational processes actually associated with such technologies. This is both, I suppose, a distraction, in as much as it misses the target, if we assume that the real can ever be represented accurately (which I don't), but also, and more promisingly, an aesthetic that remains firmly human-mediated, contra the claims of those who want to "see like machines". That is, the new aesthetic is an aestheticization of computational technology and computational techniques more generally. It is also fascinating in terms of the refusal of the new aesthetic to abide by the careful boundary monitoring of art and the 'creative industry' more generally, really bringing to the fore the questions raised by Liu, for example, in The Laws of Cool. One might say that it follows the computational propensity towards the dissolving of traditional boundaries and disciplinary borders.

I also find the new aesthetic important for it has an inbuilt potentiality towards critical reflexivity, both towards itself (does the new aesthetic exist?) and towards artistic practice (is this art?), curation (should this be in galleries?), and technology (what is technology?). There is also, I believe, an interesting utopian kernel to the new aesthetic, in terms of its visions and creations – what we might call the paradigmatic forms – which mark the crossing over of certain important boundaries, such as culture/nature, technology/human, economic/aesthetic and so on. Here I am thinking of the notion of augmented humanity, or humanity 2.0, for example. This criticality is manifested in the new aesthetic's continual seeking to 'open up' the black boxes of technology, to look at developments in science, technology and technique and to try to place them within histories and traditions – in the reemergence of social contradictions, for example. But even an autonomous new aesthetic, as it were, points towards the anonymous and universal political and cultural domination represented by computational techniques which are now deeply embedded in systems that we experience in all aspects of our lives. There is much to explore here.


Moroso pixelated sofa and nanimaquina rug, featured on Design Milk
The new aesthetic, of course, is as much symptomatic of a computational world as itself subject to the forces that drive that world. This means that it has every potential to be sold, standardised, and served up to the willing mass of consumers like any other neatly packaged product. Perhaps even more so, with its ease of distribution and reconfiguration within computational systems, such as Twitter and Tumblr. But it doesn't have to be that way, and so far I have hope that even in its impoverished, consumerized form, it still serves notice of computational thinking and processes, which then stand out against other logics. This is certainly one of the interesting dimensions to the new aesthetic, both in terms of the materiality of computationality, and in terms of the need to understand the logics of postmodern capitalism, even ones as abstract as obscure computational systems of control.

For me, the very possibility of a self-defined new 'aesthetic' enables this potentiality – of course, there are no simple concepts as such, but the new aesthetic, for me, acts as a "bridge" (following Deleuze and Guattari for a moment). Claiming that it is a new 'aesthetic' makes available the conceptual resources associated with and materialised in practices, which may need to be "dusted off" and used as if they were, in a sense, autonomous (that is, even, uncritical). This decoupling of the concept (no matter that in actuality one might claim that no such decoupling could really have happened) potentially changes the nature of the performances that are facilitated or granted by the space opened within the constellation of concepts around the 'new aesthetic' (again, whatever that is) – in a sense this might also render components within the new aesthetic inseparable, as the optic of the new aesthetic, like any medium, may change the nature of what can be seen. Again, this is not necessarily a bad thing though.

Glitch Textiles by Phillip David Stearns
Another way of putting it, perhaps, would be that a social ontology is made possible, which, within the terms of the constellation of practices and concepts grounding it, is both distanced from and placed in opposition to existing and historical practices. Where this is interesting is that, so far, the new aesthetic, as a set of curatorial or collectionist practices, has been deeply recursive in its manifestation – both computational in structure (certainly something I am interested in about it) and also strikingly visual (so far) – and here the possibility of an immanent critique central to the new aesthetic can be identified, I think. Of course, it is too early to say how far we can push this, especially with something as nascent as the new aesthetic, which is still very much a contested constellation of concepts and ideas playing out in various media forms, etc., but nonetheless, I suggest that one might still detect the outlines of a kind of mediated non-identity implicit within the new aesthetic, and this makes it interesting. So I am not claiming, in any sense, that the new aesthetic was "founded on critical thinking", rather that, in a similar way, computational processes are not "critical thinking" but contain a certain non-reflexive reflexivity when seen through their recursive strategies – but again this is a potentiality that needs to be uncovered, and not in any sense determined. This is, perhaps, the site of a politics of the new aesthetic.

Certainly there is much work to be done with the new aesthetic, and I, for one, do not think that everything is fixed in aspic – either by Bridle or any of the other commentators. Indeed, there is a need for thinking about the new aesthetic from a number of different perspectives, and that for me is the point at which the new aesthetic is interesting for thinking with; pushing it away seems to me to be an "over-hasty" move when it clearly points to either a fresh constellation of concepts and ideas, or certainly a means for us to think about the old constellations in a new way. This means that we should not aim to be either for or against the new aesthetic, as such, but rather more interested in the philosophical and political work the new aesthetic makes possible.

