20 September 2011

New Book: Understanding Digital Humanities

The application of new computational techniques and visualisation technologies in the Arts & Humanities is resulting in fresh approaches and methodologies for the study of new and traditional corpora. This 'computational turn' takes methods and techniques from computer science to create innovative means of close and distant reading. This edited book aims to discuss the implications and applications of what has been called the Digital Humanities and the questions raised when using algorithmic techniques. Within this field there are important debates about narrative versus database, pattern-matching versus hermeneutics, and the statistical paradigm versus the data-mining paradigm. Additionally, new forms of collaboration within the Arts and Humanities are raised through modular Arts and Humanities research teams and new organisational structures (e.g. Big Humanities), together with techniques for collaborating in an interdisciplinary way with other disciplines (e.g. hard interdisciplinarity versus soft interdisciplinarity). This book draws on key researchers in the field to give a comprehensive introduction to some of the key debates and questions.

16 September 2011

Iteracy: Reading, Writing and Running Code

Mark Marino posed a very interesting question on Twitter yesterday, asking:

Who has a good alternative to "literacy" when it comes to programming or reading code? (Marino 2011).
It is something that I have been thinking about too with the concept of digital Bildung and computationality (see Berry 2011). However, I would like to suggest that iteracy might serve as the name for the specific skills used for understanding code and algorithmic culture – as indeed literacy (understanding texts) and numeracy (understanding numbers) do in a similar context. That is, iteracy is specifically the practice of being able to read and write code, rather than the more extensive notion of digital Bildung (see Berry 2011: 20-26).
In short, Bildung is still a key idea in the digital university, not as a subject trained in a vocational fashion to perform instrumental labour, nor as a subject skilled in a national literary culture, but rather as a subject that can unify the information that society is now producing at increasing rates, and which understands new methods and practices of critical reading (code, data visualisation, patterns, narrative) and is subject to new methods of pedagogy to facilitate it (Berry 2011: 168).
So digital Bildung would include the practices of iteracy and would build on them to facilitate a broader humanistic or critical education. Here, then, iteracy is defined broadly as communicative competence in reading, writing and executing computer code.

Iteration itself is a term used in computing to refer to the repetition of a command, code fragment, process, function, etc. Understanding iteration is crucial for developing programming skills, as it is a means of re-using existing processes (looping structures). But iteration itself, combined with constant improvements, is also a key way of developing software/code (very much associated with agile programming, for instance). An example of iteration in C++ code is:

int loop = 1;
while (loop <= 10) {
  cout << "Iteration #" << loop << endl;
  ++loop;  // increment the counter; without this the loop never terminates
}

Here though I want to broaden the meaning of iteracy beyond mere looping structures in programming code. What skills, then, might be associated with this notion of iteracy?[1]

  • Computational Thinking: being able to devise and understand the way in which computational systems work to be able to read and write the code associated with them. For example, abstraction, pipelining, hashing, sorting, etc. (see Wing 2011).
  • Algorithms: understanding the specifically algorithmic nature of computational work, e.g. recursion, iteration, discretisation, etc. 
  • Reading and Writing Code: practices in reading/writing code require new skills to enable the reader/programmer to make sense of and develop code in terms of modularity, data, encapsulation, naming, commentary, loops, recursion, etc.[2] 
  • Learning programming languages: understanding one or more concrete programming languages to enable the student to develop a comparative dimension to hone skills of iteracy, e.g. procedural, functional, object-oriented, etc. 
  • Aesthetics of Code: developing skills related to appreciating the aesthetic dimension of code; here I am thinking of 'beautiful code' and 'elegance' as key concepts (see Oram and Wilson 2007). 
  • Data and Models: understanding the significance and importance of data, information and knowledge and their relationships to models in computational thinking. 
  • *Critical Code Studies: critical approaches to the study of computer source code. Marino argues: 'that we no longer speak of the code as a text in metaphorical terms, but that we begin to analyze and explicate code as a text, as a sign system with its own rhetoric, as verbal communication that possesses significance in excess of its functional utility... In effect, [Marino proposes] that we can read and explicate code the way we might explicate a work of literature in a new field of inquiry' (Marino 2006).[3] 
  • *Software Studies: critical approaches to the study of software (as compiled source code), particularly large-scale systems such as operating systems, applications, and games. Alternatively this also includes the use of software to study other things, like culture (see Manovich 2008), which Manovich calls Cultural Analytics (Williford 2011). It might also entail the study of the use of software historically (see Ensmenger 2010). One important aspect of this is to focus on computer/technical systems within society and culture - for example the Internet, the email system, mobile data, the HTTP protocol, etc. 

I therefore see iteracy as developing the ability to reason critically and communicate using discourse to discuss, critique and study the medium of computer code[4]. Although I have kept critical code studies and software studies within the domain of iteracy I am tempted to place these approaches within the broader definition of digital Bildung, more specifically as methods and approaches related to critical inquiry of computationality (Berry 2011) or the post-digital society more generally (hence the 'scare stars'). For example, Douglass (2007) poses the question:

So how do Software Studies and Critical Code Studies relate... Both are larger critical perspectives (on software and source code, respectively) that aim at a deeper understanding of digital, computational art and culture. How do they relate to each other? That is a thornier question, and perhaps unproductive at this early stage in the game when each term is a flag to rally round rather than a nation with well-defined borders. Each could arguably be defined as a subfield of the other, although I suspect what we have here is a classic Venn diagram arrangement with a high degree of potential overlap. The question will be easier to resolve when we move from proposed themes to formal definitions of methodologies. If software studies is centered around the phenomena of computation, and critical code studies is centered on the ephemera of uncompiled source, what are the distinctions (and hence advantages) that each perspective offers the other? (Douglass 2007).

This is an interesting question which I don't propose to answer here, but both critical code studies and software studies draw on the kinds of skills I identify above as iteracy (and we could even accept that they recursively draw upon themselves too if I leave them in the definition). Nonetheless, I do think that iteracy has some heuristic advantages over terms like 'code literacy', 'digital literacy', 'information literacy', and so forth, especially in the connotations iteracy has with iteration, a key part of how code functions and is read and written.

One last thought: although I make the link between iteracy and looping/repetition, I think it is probably more accurate to think of iteration not as a circle but as a spiral. That is, that learning builds on previous learning and skills in a virtuous upward spiral that develops competence and capabilities.[5]

Update: See this attempt to turn the discourse into code


[1] These are offered as a first draft of the kinds of skills iteracy might require. They remain very much a work in progress. 
[2] Of course computers read and write code too. We could therefore say that non-human entities have delegated iteracy.  
[3] Here I am bracketing the question over the boundaries between software studies and critical code studies but Douglass attempted a definition as '[f]or simplicity in these examples I’m imagining “the domain of software” as “computation, its penumbra as pre-computation, post-computation, imagined computation, representations of computation” and so-forth. “Code,” in the sense Mark uses it in his writings on Critical Code Studies, are something like “human-readable and writeable representations relating to software.”' (Douglass 2007). Whilst being fully aware of the difficulties of these definitions and acknowledging that they are still under contestation, this has some heuristic value in appreciating the general positions of the two camps. 
[4] There is an interesting question about whether we can read code without any recourse to notions of computation. Personally I do not see any reason why code cannot be read as a self-standing or even historical text. Reading within the horizon of the program itself might be very productive, particularly for large scale systems that are extremely self-referential and intertextual. 
[5] Naturally this reminds me of Hegel's notion of History as a spiral. It also is evocative of notions of dialectics as a means of learning and education.


Berry, D. M. (2011a) The Philosophy of Software: Code and Mediation in the Digital Age, London: Palgrave Macmillan.

Douglass, J. (2007) Joining the Software Studies Initiative at UCSD, accessed 16 Sept 2011, http://writerresponsetheory.org/wordpress/2007/12/04/joining-the-software-studies-initiative-at-ucsd/

Ensmenger, N. L. (2010) Computer Boys Take Over, Cambridge: MIT Press.

Marino, M. C. (2006) Critical Code Studies, Electronic Book Review, accessed 16 Sept 2011, http://www.electronicbookreview.com/thread/electropoetics/codology

Marino, M. C. (2011) Who has a good alternative to "literacy", markcmarino, Twitter, 15 Sept 2011, accessed 16 Sept 2011, https://twitter.com/markcmarino/status/114448471813144578

Manovich, L. (2008) Software Takes Command, accessed 16 Sept 2011, http://lab.softwarestudies.com/2008/11/softbook.html

Oram, A. and Wilson, G. (2007) Beautiful Code. London: O’Reilly.

Williford, J. (2011) Graphing Culture, Humanities Magazine, March/April 2011, accessed 16 Sept 2011, http://www.neh.gov/news/humanities/2011-03/Graphing.html

Wing, J. (2011) Research Notebook: Computational Thinking—What and Why?, accessed 16 Sept 2011, http://link.cs.cmu.edu/article.php?a=600

12 September 2011

Messianic Media: Notes on the Real-Time Stream

Real-time streams draw on the processual nature of code/software to create a rapidly updating data-flow form that provides an ecology of real-time updates. An example of the real-time stream is the Twitter platform.

A stream is a dynamic flow of information (e.g. multi-modal media content). Streams are instantiated and enabled by code/software and a networked environment (see Berry 2011a), and they are increasingly part of the digital media ecology, including:
  • notification streams (what you should know, @mentions)
  • activity streams (what are people doing?)
  • news and media streams (news and reporting, financial data, etc.)
  • ‘pure’ or branded streams (recognised entities, human and non-human)
  • aggregated or ‘mixed’ streams (streams of streams)
Importantly, the real-time stream is not just an empirical object; it also serves as a technological imaginary, and as such points the direction of travel for new computational devices and experiences. Of course, 'real-time' is itself a mediated construct, created in software and managed through careful processing and presentational cues for the user. After all, merely passing through computation creates some latency, or data lag, different for each system, which marks the stream as already in the past before the user receives it as a feedback loop. But this latency in real-time response, which may amount to microseconds or milliseconds, may also be disguised from the user through various forms of design transition, computational technique or anticipatory processing that make the experience feel as if it is truly real-time. Indeed, the real-time experience may also be jarring, and with the speed of networks and computation increasing, some platforms and software systems build in techniques to soften the real-time flow using interface devices such as sliding "cards". The latency produced by real-time streaming systems is due in part to the feedback loops built into the software.
A feedback loop involves four distinct stages. First comes the data: A behavior must be measured, captured, and stored. This is the evidence stage. Second, the information must be relayed to the individual, not in the raw-data form in which it was captured but in a context that makes it emotionally resonant. This is the relevance stage. But even compelling information is useless if we don’t know what to make of it, so we need a third stage: consequence. The information must illuminate one or more paths ahead. And finally, the fourth stage: action. There must be a clear moment when the individual can recalibrate a behavior, make a choice, and act. Then that action is measured, and the feedback loop can run once more, every action stimulating new behaviors that inch us closer to our goals (Goetz 2011).
This imaginary of everyday life, a feedback loop within and through streams of data, is predicated on the use of technical devices that allow us to manage and rely on these streaming feeds. Combined with an increasingly social dimension to the web, through social media, online messaging and new forms of social interaction, this allows behaviour to be modified in reaction to the streams of data received. However, the technologies to facilitate the use of these streams are currently under construction and open to intervention before they become concretised into specific forms. We can ask questions about how participative we want this stream-based ecology to be, how filtered and shaped we want it, who should be the curators, and whom we can trust to do this. 

Cognitively, it is argued that streams are also suited to a type of reading called ‘distant reading’ as opposed to the ‘close reading’ of the humanities (Berry 2011a; Moretti 2007). This ‘close reading’ created a certain type of subject: narrativised, linear, what McLuhan called ‘typographic man’. At present there is a paradoxical relationship between the close reading of current taught reading practices and the distant reading required for algorithmic approaches to information. To illustrate, books are a great example of a media form that uses typographic devices for aiding cognition in ‘close’ reading: chapters, paragraphs, serif fonts, the avoidance of textual 'rivers', the use of white space. Most notably these were instantiated into professional typographic practices that are themselves now under stress from computational algorithmic approaches to typesetting and production. Close reading devices required a deep sense of awareness in relation to the reader as a particular subject: autonomous, linear, narrativised and capable of feats of memory and cognitive processing. These devices were associated with a constellation of practices that clustered around the concept of the author (see Berry 2011b). 

The future envisaged by the corporations, like Google, that want to tell you what you should be doing next (Jenkins 2010), presents knowledge as a real-time stream, creating/curating what they call ‘augmented humanity’. As Hayles (1999) states:
Modern humans are capable of more sophisticated cognition than cavemen not because moderns are smarter... but because they have constructed smarter environments in which to work (Hayles 1999: 289).
In a real-time stream ecology, the notion of the human is one that is radically different to the 'deep attention' of previous ages. Indeed, the user will be constantly bombarded with data from a thousand (million) different places, all in real-time, and requiring the complementary technology to manage and comprehend this data flow to avoid information overload. This, Hayles (2007) argues, will require 'hyper attention'. Additionally this has an affective dimension as the user is expected to desire the real-time stream, both to be in it, to follow it, and to participate in it, and where the user opts out, the technical devices are being developed to manage this too through curation, filtering and notification systems. Of course, this desiring subject is therefore then expected to pay for these streaming experiences, or even, perhaps, for better filtering, curation, and notification streams as the raw data flow will be incomprehensible without them.

Search, discovery and experimentation require computational devices to manage the relationship with the flow of data, allowing the user to step into and out of a number of different streams in an intuitive and natural way. This is because the web becomes,
A stream. A real time, flowing, dynamic stream of information — that we as users and participants can dip in and out of and whether we participate in them or simply observe we are [...] a part of this flow. Stowe Boyd talks about this as the web as flow: “the first glimmers of a web that isn’t about pages and browsers” (Borthwick 2009).
Of course, the user becomes a source of data too, essentially a real-time stream themselves, feeding their own narrative data stream into the cloud, which is itself analysed, aggregated, and fed back to the user and other users as patterns of data. This real-time computational feedback mechanism will create many new possibilities for computational products and services able to leverage the masses of data in interesting and useful ways. This might allow the systems being designed to auto-curate user-chosen streams, to suggest alternatives and to structure user choices in particular ways (using stream transformers, aggregation and augmentation). In some senses then this algorithmic process is the real-time construction of a person's possible 'futures' or their 'futurity', the idea, even, that eventually the curation systems will know 'you' better than you know yourself. This means that the user is 'made' as a part of the system, that is, importantly the user does not ontologically precede the real-time streams, rather the system is a socio-technical network which:
is not connecting identities which are already there, but a network that configures ontologies. The agents, their dimensions and what they are and do, all depend on the morphology of the relations in which they are involved (Callon 1998).
Nevertheless, it seems clear that distant reading of streams will become increasingly important. These are skills that at present are neither normal practice for individuals, nor do we see strong system interfaces for managing this mediation yet. This distant reading will be, by definition, somewhat cognitively intense, strengthening the notion of a ‘now’ and intensifying temporal perception. This is a cognitive style reminiscent of a Husserlian ‘comet’ subjectivity, with a strong sense of self in the present, but which tails away into history. It would also require a self that is strongly coupled to technology which facilitates the possibility of managing a stream-like subjectivity in the first place. Indeed, today memory, history, cognition and self-presentation are all increasingly being mediated through computational devices, and it is inevitable that to manage these additional real-time data flows new forms of software-enabled systems will be called for.

Above we gestured already towards the softwarization of 'close reading' and the changing structure of a ‘preferred reader’ or subject position towards one that is increasingly algorithmic (of course, this could be a human or non-human reader). Indeed it is suggestive that as a result of these moves to real-time streams we will see the move from a linear model of narrative, exemplified by books, to a ‘dashboard of a calculation interface’ and ‘navigational platforms’, exemplified by new forms of software platforms. Indeed, these platforms, and here we are thinking of a screenic interface such as the iPad, allow the ‘reader’ to use the hand-and-eye in haptic interfaces to develop interactive exploratory approaches towards knowledge/information and ‘discovery’. This could, of course, still enable humanistic notions of ‘close reading’, but the preferred reading style would increasingly be ‘distant reading’, partially or completely mediated through computational code-based devices: non-linear, fragmentary, partial and pattern-matching software taking in real-time streams and presenting to the user a mode of cognition that is based on hyper attention coupled with real-time navigational tools.

Lastly, more tentatively we would like to suggest an interesting paradox connected with the real-time stream, in that it encourages a comportment towards futurity. This, following Derrida, we would call ‘Messianic’ (a structure of experience rather than a religion) (Derrida 1994: 211), connecting the real-time stream to an expectation or an opening towards an entirely ungraspable and unknown other, a 'waiting without horizon of expectation' (Derrida 1994: 211). As Derrida writes:
Awaiting without horizon of the wait, awaiting what one does not expect yet or any longer, hospitality without reserve, welcoming salutation accorded in advance to the absolute surprise of the arrivant from whom or from which one will not ask anything in return and who or which will not be asked to commit to the domestic contracts of any welcoming power (family, state, nation, territory, native soil or blood, language, culture in general, even humanity), just opening which renounces any right to property, any right in general, messianic opening to what is coming, that is, to the event that cannot be awaited as such, or recognized in advance therefore, to the event as the foreigner itself, to her or to him for whom one must leave an empty place, always, in memory of the hope—and this is the very place of spectrality (Derrida 1994: 81).
The Messianic refers to a structure of existence that involves waiting. Waiting even in activity, and a ceaseless openness towards a future that can never be circumscribed by the horizons of significance that we inevitably bring to bear upon the possible future.
Derrida, like Benjamin, situates the messianic in a moment of hesitation. For Benjamin, that moment is one of "danger"; the past flashes up before disappearing forever. For Derrida, it is a moment of haunting; the spectral other makes its visitation in the disjunction between presence and absence, life and death, matter and spirit, that conditions representation. Although the messianic "trembles on the edge" of this event, we cannot anticipate its arrival. Because the arrival is never contingent upon any specific occurrence, the messianic hesitation "does not paralyze any decision, any affirmation, any responsibility. On the contrary, it grants them their elementary condition" (Specters 213). The moment of hesitation - the spectral moment - enables us to act as though the impossible might be possible, however limited the opportunities for radical change may appear to be in our everyday experiences. The global communications networks, although often invasive and dangerously reductive, also serve as privileged sites of messianic possibility precisely because of their accelerated virtualization (Tripp 2005).
This futurity raises important questions about the autonomy of the human agent, coupled as it is with the auto-curation of the stream processing, not just providing information to but actively constructing, directing and even creating the socio-cognitive conditions for the subjectivity of the real-time stream, an algorithmic humanity: a subject with a comportment towards awaiting which forgets and makes present. Or as Derrida suggests:
It obliges us more than ever to think the virtualization of space and time, the possibility of virtual events whose movement and speed prohibit us more than ever (more and otherwise than ever, for this is not absolutely and thoroughly new) from opposing presence to its representation, "real time" to "deferred time," effectivity to its simulacrum, the living to the non-living, in short, the living to the living-dead of its ghosts (Derrida 1994: 212). 


Berry, D. M. (2011a) The Philosophy of Software: Code and Mediation in the Digital Age, London: Palgrave Macmillan.

Berry, D. M. (2011b) The Computational Turn: Thinking about the Digital Humanities, Culture Machine, Special Issue on The Digital Humanities and Beyond, vol. 12, accessed 12/09/2011, http://www.culturemachine.net/index.php/cm/article/view/440/470

Borthwick, J. (2009) Distribution … now, accessed 12/09/2011, http://www.borthwick.com/weblog/2009/05/13/699/

Callon, M. (1998) Introduction: Embeddedness of Economic Markets in Economics, The Laws of the Markets, Oxford: Blackwell Publishers, pp.1-57.

Derrida, J. (1994) Specters of Marx: The State of the Debt, the Work of Mourning, & the New International,  Trans. Peggy Kamuf, London: Routledge.

Goetz, T. (2011) Harnessing the Power of Feedback Loops, Wired, accessed 12/09/2011, http://www.wired.com/magazine/2011/06/ff_feedbackloop/

Hayles, N. K. (1999) How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics, Chicago: University of Chicago Press.

Hayles, N. K. (2007) Hyper and Deep Attention: The Generational Divide in Cognitive Modes, Profession, 13, pp. 187-199.

Moretti, F. (2007) Graphs, Maps, Trees: Abstract Models for a Literary History, London, Verso.

Tripp, S. (2005) From Utopianism to Weak Messianism: Electronic Culture’s Spectral Moment, Electronic Book Review, accessed 12/09/2011, http://www.electronicbookreview.com/thread/technocapitalism/dot-edu
