// This text presents the unedited .txt files for Vol 4 No 1 (2015): DATAFIED RESEARCH (Peer-reviewed newspaper), edited by Christian Ulrik Andersen & Geoff Cox. For references and quotes, please see the full .PDF of the journal issue. //

What is DATAFIED RESEARCH?
Christian Ulrik Andersen & Geoff Cox

This newspaper is the outcome of a Ph.D. seminar organised by the Centre for Participatory IT at Aarhus University, the School of Creative Media at Hong Kong City University and transmediale festival for art and digital culture in Berlin. It is the fourth of its kind in an ongoing collaboration between Aarhus University and transmediale, and seeks to address the thematic framework of the festival as a research topic. All participants have responded to an open call for participation, posted draft papers online for peer review, and met for face-to-face critique in Hong Kong in October 2014. Papers published here were developed through this process as part of a highly collaborative event, after which articles were modified on-the-fly.

This year’s festival theme CAPTURE ALL “sets out to investigate and propose actions that push against the limits of today’s pervasive quantification of life, work and play,” as stated in the transmediale call. To what extent does data capture all – even research? By addressing DATAFIED RESEARCH, the workshop, this newspaper, and the publication of the online journal APRJA (A Peer Reviewed Journal About Datafied Research) address popular notions of datafication – including “the datafied self”, “the datafied city” and “datafied management” – and furthermore call for a reflection on the darker forces involved in capturing and using data. We produce, share, collect, archive, use and misuse, knowingly or not, massive amounts of data, but what does “capture” do to us? What are the inter-subjective relations between data-commodity and human subjects? By asking these questions, the articles seek insights into the logics of data flows between materials, things, data, code, software, interfaces and other stuff that permeates a culture of “capture all”. Rather than merely mimicking the sciences’ use of (big) data, the arts and humanities must explore what kind of sensorium datafication generates for things and humans. What are the implications of being data?

In Evil Media, Andy Goffey and Matthew Fuller write: “A set of words in a report, article, or illicit data dump becomes significant in a different way when placed in a mechanism that allows or even solicits unfettered access, than when that set of words is lodged in a closed directory or laid out as a book; allowing such open access has direct and pragmatic effects on the reception of ideas, to mention just one scale at which they might be operative.” By appealing for an unfettered and open organisation of, and access to, data, they implicitly highlight how datafication is not only a question of archiving and accessing data content and building information architectures of metadata. The computer is not just a medium that stores and displays but is also capable of reading and writing automatically. This affects human thinking, creativity, notions of life and death, and other relations between data and human experience. In common with the festival call, the articles here, each in their own way, address this and seek analyses and responses that “outsmart and outplay” the logic of capturing everything applied by the corporate as well as scientific communities.
It seems to us that the emerging field of Digital Humanities raises as many questions as it answers in this respect. Although datafication implies the presence of non-human readers and writers of data, a playful response to the appeal to “capture all” points to how readers and writers by no means have become mere automatons. Seeing things at different scales, from the grain of data, the material of data, the screens of data, or in other ways afforded by datafied research, leads the authors into addressing the persistence of data, the gaze of data, data as a thing, the language of data, the politics of data structures, and many other aspects of the complex question of what datafication does to us, and how we might begin to do things to it. VOLUME 4, ISSUE 1, 2015 Edited by Christian Ulrik Andersen & Geoff Cox Published by Digital Aesthetics Research Center, Aarhus University in collaboration with transmediale and School of Creative Media, Hong Kong City University. Design by The Laboratory of Manuel Bürger CC license ‘Attribution-NonCommercial-ShareAlike’ ISBN: 87-91810-26-4 ISSN (PRINT): 2245-7593 ISSN (PDF): 2245-7607 –––––––––––––––––––– WEATHER –––––––––––––––––––– A History of Capture in Planning, Programming and Design By Christian Ulrik Andersen Smart cities, biometric measurements, and the business models of Apple, Amazon, Facebook and Google are often presented as a paradigmatic shift in computing and digital culture: from data to big data, from measurement to anticipation, from interaction to participation, and so forth. There is indeed a great need for an “interface criticism” in this process – a critical understanding of how computational processes are over-layered by, and influence aesthetics and culture. For instance, behavioural models of reading patterns based on data analysis in Amazon Whispernet (a system tracking user behaviour on Amazon’s Kindle), affect both the circulation of text (what people read) and the notion of reading (away from reading in private). History In developing an interface criticism, an understanding of the history of tracking data is helpful. This history points to a tradition of correlating technical and social infrastructures (code and life) that extends across urban and software architectural planning. My main aim is to depict how the capturing of specific life practices – that was once seen as a critique of, and counterstrategy to statistical planning models – has been perverted in contemporary computational business. Cities The city is the domain per se of “big data” and this is where my history begins. Tracking data in cities is not a new thing. In her influential book The Death and Life of Great American Cities, Jane Jacobs points to how developments in science have influenced urban planning. Notably, the development of statistics offered new ways of mapping and controlling complexity. In the fifties, cities were mapped according to statistical information on neighbourhoods’ child mortality, employment, crime rate, etc. Based on this information, urban planners reorganized the city into uniform office, shopping, and residential areas with efficient infrastructures for motorized traffic. This was also know as “Urban Renewal” and caused the demolishment of many cities – much to the frustration of the their inhabitants who felt that the general principles of statistical management controlled their specific life practices. 
In opposition to this, Jacobs and like-minded people (such as Henri Lefebvre) argued for people’s right to reshape their own city: the city is an organic entity, and any general principle for planning must begin by engaging with the specific life practices of citizens (Jacobs was also a renowned activist).

Pattern languages
Such strategies gradually became organized. The architect Christopher Alexander developed the idea of a “pattern language”, a collection of people’s patterns of behaviour that would form a basis for designing infrastructures. As an example, Alexander uses the pattern ‘accessible green’, based on the observation that people need open green places to go to; but when they are more than three minutes away, the distance overwhelms the need. Consequently, green spaces must be built ‘within three minutes’ walk […] of every house and workplace.’ This pattern (along with patterns such as ‘dancing in the streets’ or ‘holy ground’) helps fulfil larger patterns such as ‘identifiable neighborhood’. In total, Alexander’s book comprises 253 patterns.

Programming
In computing, researchers were raising similar critiques of managerial tyrannies; notably, objections to the ways in which computers were introduced into the workplace. Together with other Scandinavian computer researchers, Kristen Nygaard worked closely with workers’ unions in finding ways that would not alienate the worker. There was, in other words, a great need for building systems that reflected the preferred practices and workflows of the users, and Nygaard’s mapping of these practices included the participation of the workers. Already in the sixties, Nygaard had been one of the founders of object-oriented programming (SIMULA) – dividing data into “objects”, or “classes”, that a prescribed “method” could do something with. What Nygaard gained from his co-research with workers was an unexpected insight: programming was not only a way of modelling a labour process; people in general found a value in describing a program and defining objects, classes and methods. Writing programs may lead to deep insight into a social problem and its solutions. As he would say: ‘To program is to understand’.

Nygaard is an important part of the tradition of Participatory Design, where computer scientists would co-research labour processes in order to make technical and social infrastructures correspond. A well-known example of PD is the UTOPIA project from the early eighties that addressed the work situation of typographers in the print industry (an area that was heavily affected by the introduction of computers).

Cities and programming
Alexander has not been very influential as an architect, but one can easily see how his ideas were directly applicable in participatory design and object-oriented programming. The programmer Ward Cunningham was particularly influenced by Alexander, and initiated the Portland Pattern Repository. In the nineties, the project integrated the WikiWikiWeb, the world’s first wiki. Using the schemata of Alexander in general ways, “Ward’s wiki” contained patterns that described problems and solutions in graphical user interface design and programming. It became popular because it allowed programmers to share and co-edit their experiences and develop a sophisticated pattern language for the correlation of human use and technical infrastructure.
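What ‘to program is to understand’ can mean in this pattern tradition is easy to sketch. The following minimal example (in Python; the class names, the walking-speed figure and the sample dwellings are my own illustrative assumptions, not drawn from Alexander or Nygaard) expresses the ‘accessible green’ pattern as an object with a class and a method, the kind of correlation of life practice and technical description that both the pattern language and SIMULA-style modelling aim at:

from dataclasses import dataclass

# A minimal, hypothetical sketch (not taken from Alexander or Nygaard): a
# behavioural pattern expressed as an object with a class and a method.

WALKING_SPEED_M_PER_MIN = 80  # assumed average walking speed, metres per minute

@dataclass
class Dwelling:
    name: str
    distance_to_nearest_green_m: float  # metres to the nearest open green space

class AccessibleGreenPattern:
    """Alexander's 'accessible green': green space within a three-minute walk
    of every house and workplace."""
    max_walk_minutes = 3

    def is_satisfied_for(self, dwelling: Dwelling) -> bool:
        walk_minutes = dwelling.distance_to_nearest_green_m / WALKING_SPEED_M_PER_MIN
        return walk_minutes <= self.max_walk_minutes

# Checking the pattern against concrete dwellings forces the modeller to make
# the life practice explicit: what counts as 'near', and for whom.
pattern = AccessibleGreenPattern()
for d in [Dwelling("terraced house", 150), Dwelling("office block", 600)]:
    print(d.name, "->", "ok" if pattern.is_satisfied_for(d) else "green space too far")

Trivial as it is, writing the check forces decisions about what counts as ‘near’ and who is included, which is where the claimed insight into the social problem lies.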
Perversion of participation
Managerial control of the specific based on statistics still exists – and may sometimes be mistaken for “big data”. However, the smartness of big data suggests that there is a new kind of general control exercised through mapping the patterns of the specific. Contemporary computing’s perverted version of valuing life-practices has appeared in three waves. Firstly, we have seen the wave of usability and graphical user interface design. Through field experiments, interface designers have tracked user behavior and preferences. This has led to predefined, standardized user-friendly software interaction. Secondly, we have seen a wave of locking software into standard objects such as tablets and smartphones. Whereas computers were once considered open structures where users could write their own programs and define their own configurations, hardware providers now strictly control the access to software. Cory Doctorow has labeled this “the coming war on general purpose computing.” Thirdly, we are currently witnessing a wave of large-scale experiments with users that involve massive collection of user patterns – including biometrics, geometrics, text mining, and much more. By correlating vast amounts of user patterns and generating general functions that can anticipate user behavior, new service providers are finding a market of life practices that is promoted as enhanced user experiences. Recent accounts of Facebook’s large-scale experimentation with users’ cognition exemplify this. State defense-led programs obviously show related activities based on similar methods. One cannot deny that to many users the three waves all reflect meaningful correlations of physical and social infrastructures, but neither can one deny that the mapping of specific life-practices has led to a generalized form of control. So, the question still remains: how can one evade the gaze of the pervert, and insist on the right to one’s own life-practices?

List of references
Alexander, Christopher, et al. A Pattern Language: Towns – Buildings – Construction. New York: Oxford University Press, 1977. Print.
Doctorow, Cory. “The Coming War on General Purpose Computing.” 28th Chaos Communications Congress. Berlin, 2012. Keynote/Web.
Jacobs, Jane. The Death and Life of Great American Cities. New York: Random House and Vintage Books, 1961. Print.

--------------------

A creative encounter with a biomimetic avatar
By Deborah Lawler-Dormer

Human subjects are tracked and quantified in order to build evolutionary and emergent computational neuro-scientific models. In this bioengineering pursuit, neuropsychological responses are monitored and predicted both to enable greater understanding for science and, concurrently, to enable technologies that feed big economies related to security and entertainment. Can an art intervention or reuse be developed that productively analyses and counters this? Can its reuse also enrich an intimate sensory embodied experience? Xyza, a biomimetic autonomous avatar, is currently being constructed by Mark Sagar within the Laboratory for Animate Technologies at the University of Auckland. A creative practice inquiry into Xyza has been initiated in order to activate a dialogue between embodied cognition, materiality and the neuro-computational. As a biomimetic model, the avatar raises questions regarding the relationship between scientific enquiry, the commodification of biological data and human/avatar identities. Xyza is a neurobehavioral computational model with emergent behaviours. It is an autonomous system that is both self-motivated and self-governing.
Linked to the autonomous avatar animation is a real-time neural simulation. In a live neural network, representations of muscular anatomy through to the neuronal activity and neuromodulator levels can be viewed. This autonomous avatar character will be re-skinned and repositioned into a contemporary art installation. Placing Xyza in to a creative mixed reality art practice can be approached conceptually as providing an experimental creative laboratory exploring principles concerning enactive perception, embodied cognition, cross-modal sensing and multi-modal interactivity. It is proposed that this speculative inquiry will offer a re-contextualisation of both our understanding of our body, its sensing properties and interconnectedness with dynamic digital networked, environmental, political, social and virtual conditions. Xyza is a curious project as it activates an exchange between viewer and virtual, ‘probing’, surface and varying degrees of micro layers, reciprocal emotion and gesture. Thus, brings into play an oscillation between character, neurobiological representation and artificial intelligence. Within the complex techno-human environment of this engagement, Xyza will enable a sensor ‘mapped’ space that defines the territory in which the conditions of perception, and the resultant physical, social, emotional and cultural reactions can occur. The body of the viewer becomes the site where meaning is enacted and becomes a traceable event in and of itself. Within this framework of the hybrid physical/digital space, each element of the network both human and nonhuman, the network itself and its interdependent relationships are of interest. The analysis of the complexity and density of these networked environments oscillates between each unit and the whole network and its efficacy. The materiality of the works inclusive of the different ways that digital systems behave, and the uniqueness of the user, will enable and limit the work’s evolution. Each viewing of a networked hybrid environment is an individual performing of the work enabling emergent and generative properties where the structure of the work is what emerges through the interaction. Over time, patterns of relation become manifest. Within mixed reality art installation the virtual and the actual combine and new relationships between both are formed. In this activation, the programme can network and trigger a variety of potential outcomes including sound and light components of the installation. This will twist and problematize notions of shared experiential space between the human and non human. Engagement with the character will also offer latitude to explore embodied cognition in relation to sensing, action and process within the ‘live’ technology environment. Interacting with the avatar brings to the forefront the difference in relation when dealing with a non-human entity including differences in time and cognition. As Hayles confirms: Obviously, the meshing of these two different kinds of complex temporalities does not happen all at one time (or all at one place) but rather evolves as a complex syncopation between conscious and unconscious perceptions for humans, and the integration of surface displays and algorithmic procedures for machines. (13) In many ways, the various practice-based expressions in this research inquiry can be seen as provocative acts where the different methods, techniques, concepts and disciplines come together in an emergent practice of co-composing. 
Xyza tests the new media artist and curator in terms of contextualization, exhibition and audience engagement. This project interacts with a wide disciplinary territory, it has a complex technical nature, and it provokes specific arguments concerning embodied cognition and neurobehaviours, engages with scientific, entertainment and industrial contexts and has a collective and specialised research culture supporting its ongoing evolution. This expanded field generates an unstable terrain for curation, artistic intervention and analysis. As an autonomous biomimetic avatar, a remix of human biological data and the computational has occurred. As agents the human and non-human have combined together into one art expression. Recycled. Regenerated. Consumed. Footnote: Hayles, N. How We Think: Digital Media and Contemporary Technogenesis. Chicago: The University Of Chicago Press. Ebsco Publishing: ebook collection. (2012) Accessed 27/7/2014 -------------------- Genealogies of datafied man By Eric Snodgrass A little genealogy: economic man -> burning man -> emergency man -> and now, their prodigal offspring, the daughters and sons of datafied man. In a series of lectures tracing the historical roots of neoliberal thinking, Michel Foucault highlights the arrival of a new subject on the horizon of 18th century liberal ideology: *homo œconomicus*. In short, this economic man is a newly identifiable subject, one that arises out of the “grid of intelligibility” which new formulations around economic rationality begin to put into action. This grid will act as the “interface of government and the individual,” with economic man as a subjectivity that both pinpoints and emerges from the interstices of something as seemingly simple as an effectively calculated supply and demand curve. An emergent distinction of the neoliberal from its liberal originals will be a thorough rejection of the ability of government to ever understand causes, let alone attempt to manage them. While such a critique is originally formulated by liberalism in regards to government’s attempts to steer economic matters, neoliberalism will spread its grid of economic tribunal outwards towards all forms of governance. Cause is framed by neoliberalism as ultimately unknowable by government and thus economic man will be guided by an “invisible hand,” with the grid of intelligibility ostensibly being limited to operations on effects but never causes. In economic man’s mode of willing blindness to the possibility of government addressing causes (e.g. social, ethical, cultural), the notion of any particular baseline norm is effectively denied by neoliberalism’s preemptive incorporation of exception to any restrictive baseline. “Thus economized, ‘normal’ is whatever appears as a statistical constant on the collective level,” disciplinary power becoming instead “an adaptive reuptake mechanism for emergent normative variation” (Massumi). In such a calculative model of economic intelligibility, rationality can now be simply defined as that which reacts to reality in a systemically non-random, economically readable fashion. “Homo œconomicus is someone who accepts reality” (Foucault), that is to say, who agrees to the framework of the grid, the “reality” upon which this play transpires. And so, in the end, economic man is “someone who is eminently governable,” but with governmentality now acting more in a mode of “environmentality,” in the sense that “the technology to be employed is not discipline-normalization, but action on the environment. 
Modifying the terms of the game, not the players’ mentality” (Foucault). Burning man is economic man high on the accelerative fossil fuels of productivity. In a short riff on the Burning Man festival (the annual desert retreat long-beloved of many a Silicon Valley disruptor), Donna Haraway speaks of how we would be more well-served by calling what has been popularly construed as the age of the anthropocene as that of the “capitaloscene.” This is specifically so as to place the focus more squarely on how it is that the processual power of a capture all capitalism has now brought all things, humans included, within its wake. Such is the metabolism that results in today’s taught pairing of economy and ecology, the *oikos* (Greek root for “eco-”, originally invoking household, dwelling place or habitat) of each forming a particularly energetic symbiotic coupling in this “third age of carbon,” in which technologies of extraction and proliferation are continually refined and regulations further loosened so as to distill every last calorie of carbon out of the planet. A final spectral figure to consider in this consubstantial trinity is that of emergency man. In writing specifically on Foucault’s notion of environmentality in relation to the Bush Jr. government’s framing of the events of Hurricane Katrina, Brian Massumi suggests that there is a way in which threats of war and weather are increasingly being treated as similar in nature. Each is readily framed as an immanent and indiscriminate threat. This in turn prompts a perceived justification on the part of power for an equally immanent, militarised ontology whose rules of exception (e.g. military black sites, unconditional data collection schemes) are justified as being the best suited to preemptively manage such an unknowable, immanent threat. And why not. Capturing all in the most indiscriminate fashion with no predefined gold standard (as the “brute force” approach in Artificial Intelligence practices in recent years has amply demonstrated) would indeed seem to be the most efficient technique for extrapolation and interpellation of value, whether one is speaking of the NSA, Google, Facebook, Exxon, etc. The “nature” of this indiscriminate threat is entirely open-ended, always in beta, always on. It might variously take on the cast of a literal environmental catastrophe to come, a financial flash crash, a sudden terror from the sky, and so on. It is this very open-ended, vaguely defined quality of threat that enables what Massumi terms as an “infra-colonisation” of the “proto-territory” across a full spectrum, one in which all (politics, economics, the very conditions for life) is placed on “a continuum of war and weather.” Thus the emergence of emergency man, a subject free to do as they like - so long as they accept the immanent inevitability of this capture all terms of service. A fretful freedom, full of the dire forecasts of an economic war machine, with its undulating litanies of “unknown unknowns” (Rumsfeld) and rhythmic, self-driving “preferential relays” (Massumi). Datafied subjects, quantifying and tracking all, drilling down into and mining every last bit of data, might in many ways be understood as subsets, offshoots or descendants of this ontogenetic trinity of economic man///burning man///emergency man. But what might the potential forms of emergence be for this datafied subject in an environment of capture all capitalism? And in what meaningful ways might the rules of the game be in play? 
Sold as a world of possibilities, the rise of “big data” as yet another ready excuse to capture more and colonise the next emergent territory as market. Digital fracking as our latest logarithmic pursuit. And all the while the seemingly un-capturable terms of service agreement of the invisible hand pushes on, preemptively upgrading its capacity with strategic modifications to its grid of intelligibility. Ensuring a co-evolving relationship with its tools of capture for extracting the next proto-territory while making enforceable the continued emergencies of its own effects. Whether in the hippie sands of Black Rock Desert or on the militarised streets of Ferguson, the economic war machine of this “environmental technology” (Foucault) readily adapts and makes itself at home in the soil of the *oikos* it so ably prepared, unearthing further rites of capture under the guise of straw man theories and that ritual form of sacrifice known as survival of the fittest.

Bibliography
@eco___bot. “Eco Bot tweets daily climate reports by compiling data from weather related news articles & social media posts”, 2014. (Web) https://twitter.com/eco___bot
Foucault, Michel. The Birth of Biopolitics: Lectures at the Collège de France 1978–1979, trans. Graham Burchell. New York: Palgrave Macmillan, 2008.
Haraway, Donna. “Anthropocene, Capitalocene, Chthulucene: Staying with the Trouble”, 2014. (Web) http://vimeo.com/97663518
Massumi, Brian. “National Enterprise Emergency: Steps Toward an Ecology of Powers”. *Theory Culture Society*, 26:153, 2009.
Rumsfeld, Donald. “Defense.gov News Transcript: DoD News Briefing – Secretary Rumsfeld and Gen. Myers”. United States Department of Defense (defense.gov), February 12, 2002. (Web) http://www.defense.gov/transcripts/transcript.aspx?transcriptid=2636

--------------------

Data Disobedients? forthcoming propositions on recipes for water and soil
By Fran Gallardo

Geographies of the Artificial
For most intents and purposes, it could be said that we inhabit a computational ecology where numerical modelling, computer simulations and information visualisations, among others, saturate almost every atom and bit. This is a ‘Geography of the Artificial’, as in Herbert Simon’s (1969) work, one which has data as a foundation, constraint and potentiality all at once. From interpersonal communications, to species classifications, to food transportation, the automatisation of procedures for data gathering and manipulation is regarded as the means for the production of global knowledge, and has prompted the emergence of new disciplines, such as “social physics”. In other words, we live in a culture in which data is used to comprehend, and often to predict, the behaviour and dynamics of its many complex systems – with implications for infrastructural knowledge and the logics associated with it. Within the context of a general ecology, this article would like to address two current paradoxes regarding the question of quantification. Firstly, the agency of data folds and unfolds not just upon the construction of knowledge, but also in the actual phenomena it is trying to document, whether it is the analysis of air quality or seabed sediment depositions. Secondly, while the question of data has resurfaced since the quantitative revolution of the 70s, in part thanks to global (and not just climate) changes, the old discussion of “models vs data” has remained relatively unattended. However, as scholars such as Paul Edwards and Sabine Höhler have articulated: “without models there is no data”.
Or put differently: as observing systems evolve, so does global data, alongside the models and algorithms that inhabit them. These are the basic tenets of the computational culture upon which this article seeks to build.

Computational Cultures
Computational ecology, in this scenario, does not only imply the implementation of automatised technology in order to remote control—or drone—the dynamics of a determinate system, but rather thinking beyond the human condition (Deleuze, 1991) in the context of an increasingly quantified ecology. What counts as (ac)countable? Why and by whom? In this context, the notion of ecology is informed by Adam Robbert’s notion of speculative ecology or follows Timothy Morton’s call for an “ecology without nature” – suggesting that relationships of any kind—between an organism’s neural synapses and its gut micro-biome, or between a computer screen and the internet of COx sensors located miles away—are fundamentally ecological. Although models and algorithms are interrelated and form a complex ecosystem, this article focuses on processual algorithms and their culture—as algorithms constitute the exchange currency for the economies of a model. Going beyond classic definitions of algorithms as procedures that execute a sequential number of steps organising data towards a result, Luciana Parisi reformulates them in terms of actualities. On the one hand, an algorithm performs a processing involving a new assembled unity that is added upon the composite of parts. On the other hand, it is a process intrinsically related to variation, and it is dependent on the procedure itself and the sets of data on which it works. As a cultural force in itself, algorithmic culture manifests itself not only below the threshold of human perception in high-frequency trading, but also across the fabric of the transduced domesticity of Ikea’s flat-pack design, or in what counts as human in an era of face-expression recognition software or IBM’s cognitive cooking. From these examples, an algorithm is seen as a forward-looking event, or a self-recursive feedback loop with little force and agency in itself. Is it then possible for these procedures to have an imagination and agential forces in their own right? Could they access discourses of self-reflection and (mis)understanding of the posthuman condition? In short, could an algorithm disobey itself? And if so, how would it do so?

Disobedience?
Disobedience has gained considerable traction in the imaginary – from public manifestations of unrest to its increasing emergence in cultural forums. We can see this in works such as Andres Jaque’s IKEA Disobedients, which was the first performance piece to be acquired by MoMA’s permanent collection, or more recently the V&A exhibition Disobedient Objects. First of all, disobedience requires gaining some critical space, with its own aesthetical considerations and material imperatives, and concomitantly with the logics behind computation and its relation to disobedience. As Matthew Fuller and Graham Harwood argue, although logics break down a phenomenon by modelling it in order to produce a remote control, computation also affords comparative conditions at the material scale by making it modifiable. Disobedience could then act as a mode against privileged reason, complicating the autonomy of logics in computation. By doing so, and drawing on its own tradition, disobedience might relate to the sensibilities and creative forces driving both a process itself and the processing of data.

Examples
There are interesting resonances for this condition in Closky’s ‘The First Thousand Numbers in Alphabetical Order’ or in Drew & Haahr’s paper ‘Lessness: Randomness, Consciousness and Meaning’. Both works are complex explorations of ordering processes that are much less functional, less effective and disorienting. Viscerally non-obedient, these are spaces with which Beckett’s texts seem to resonate quite fittingly, working more on the nerves than on the intellect of the reader.
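The disorienting quality of such re-orderings is easy to reproduce. A minimal sketch (in Python; the spelling conventions, without ‘and’ and with hyphenated tens, are an assumption of mine rather than Closky’s procedure) sorts the first thousand numbers by their spelled-out names instead of their magnitude:

ONES = ["", "one", "two", "three", "four", "five", "six", "seven", "eight",
        "nine", "ten", "eleven", "twelve", "thirteen", "fourteen", "fifteen",
        "sixteen", "seventeen", "eighteen", "nineteen"]
TENS = ["", "", "twenty", "thirty", "forty", "fifty", "sixty", "seventy",
        "eighty", "ninety"]

def spell(n):
    # Spell out an integer from 1 to 1000 in plain, 'and'-less English.
    if n == 1000:
        return "one thousand"
    parts = []
    if n >= 100:
        parts.append(ONES[n // 100] + " hundred")
        n %= 100
    if n >= 20:
        word = TENS[n // 10]
        if n % 10:
            word += "-" + ONES[n % 10]
        parts.append(word)
    elif n > 0:
        parts.append(ONES[n])
    return " ".join(parts)

# Re-ordering by name instead of magnitude: a perfectly obedient procedure
# whose output is useless for counting and disorienting to read.
alphabetical = sorted(range(1, 1001), key=spell)
print([spell(n) for n in alphabetical[:4]])  # starts with "eight", "eight hundred", ...

The procedure is well defined and terminates, yet the result undoes the very function of counting: an obedient algorithm producing a viscerally non-obedient order.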
Cooking
Another material tradition with troubling questions concerning agency and computation is food culture, more often than not instrumentalised towards a particular end or agenda, as in projects such as Michael Rakowitz’s Enemy Kitchen. However, little attention has so far been paid to recipes as computational forces with a specific aesthetic and biological consistency. Instead, foundational literature in computer science more often than not uncritically equates the formal languages of algorithms and recipes—just compare the introductory flow chart of Donald Knuth’s book The Art of Computer Programming with algorithms for step-scheduling cooking much praised by bee-like Silicon Valley innovators. Within this context a recipe inhabits the “difference (that) inhabits repetition”, as Gilles Deleuze has formulated it. It is a perspective on procedures that is informed more by the performative ‘liveness’ granted to data, and less by the subjective contemplation of mass-commodified rituals. Much like in Alison Knowles’ score events such as The Identical Lunch (1967) or Proposition: Make a Salad (1962-2012), computational procedures in this context—social, genre, technological—are relays. They are performative pieces that augment the computational character of life and repetitive labour, or the gendered relationships in charge of the maintenance of the self and sociality. This article started as a recipe to speculate on the disciplining logics within an algorithmic culture that governs environmental data, furniture assembly instructions, digital image processing or a cooking recipe. It did so in order to open up an inquiry into computational processes and alternative forms of agency. However, as we look closer, certain logics are not a matter of generality, but processes of difference inhabiting repetition. So, if I submit myself to a recipe, might I cook a delicious meal for most of you? Bon appétit

Bibliography
Simon, Herbert Alexander. The Sciences of the Artificial. Cambridge, London: MIT Press, 1969.
Höhler, Sabine. “Spaceship Earth: Envisioning Human Habitats in the Environmental Age.” Bulletin of the German Historical Institute No. 42 (2008): 65-85.
Deleuze, Gilles. Empiricism and Subjectivity: An Essay on Hume’s Theory of Human Nature. New York: Columbia University Press, 1991.
Robbert, Adam. “Speculative Ecology: Matter, Media, and Mind.” Paper delivered at the CIIS Founders Symposium, San Francisco, CA, 2012.
Parisi, Luciana, and M. Beatrice Fazi. “Do Algorithms Have Fun? On Completion, Indeterminacy and Autonomy in Computation.” In Olga Goriunova (ed.) Fun and Software: Exploring Pleasure, Paradox, and Pain in Computing. Bloomsbury Press, 2014.
Morton, Timothy. Ecology without Nature: Rethinking Environmental Aesthetics. Harvard University Press, 2007.
Putnam, Hilary. Mathematics, Matter and Method. Cambridge University Press, 1979.
IBM. IBM Watson Cognitive Cooking Fact Sheet, 2013.
Foucault, Michel. Security, Territory, Population: Lectures at the Collège de France. 2009.
Closky, Claude. ‘The First Thousand Numbers in Alphabetical Order’. In Against Expression: An Anthology of Conceptual Writing, eds. Craig Dworkin and Kenneth Goldsmith. Chicago: Northwestern University Press, 2011. 148–60.
Drew, Elizabeth, and Mads Haahr. ‘Lessness: Randomness, Consciousness and Meaning’. 4th International CAiiA-STAR Research Conference ‘Consciousness Reframed’, Perth, Australia, August 2002.
Beckett, Samuel. ‘Lessness’. In Gontarski, S. E., ed. The Complete Short Prose, 1929-1989. New York: Grove Press. 197-201.
Latour, Bruno. ‘Waiting for Gaia: Composing the Common World through Arts and Politics’. Lecture at the French Institute, London, 2011.
Knowles, Alison. By Alison Knowles. A Great Bear Pamphlet. New York, NY: Something Else Press, 1965.
Knowles, Alison. ‘The Identical Lunch for Video’, typed narrative and proposal to the Greater New York Councils (GNYC), Alison Knowles Studio Archive, New York City, NY, 1973.
Rakowitz, Michael. Enemy Kitchen, 2004.

--------------------

Gaming Systems: capitalizing on the gray area between gamespace and gamic space as a critique of social codification
By Minka Stoyanova

Hacking Tinder
Marketers CamMi Pham and Blake Jamieson hacked Tinder. Their system to collect “over 2015 matches in under 17 hours” utilizes social engineering -- such as modifying a user’s profile picture to appear sponsored by Tinder -- to “become wanted on Tinder.” While this hack might seem counter-intuitive or not in keeping with the objectives of Tinder (match-making), it arises from Pham and Jamieson’s subjective reinterpretation of Tinder’s ultimate aims. For them, it became a marketing problem: how does one increase an individual’s visibility within Tinder? Huizinga, in Homo Ludens, defines the area of play as being different from “ordinary life” (28). Within our current technological landscape there exists a cognitive slippage (gray area) between gamespace (the space of the game-play -- different from ordinary life) and gamic (game-ic) space (ordinary life which has been gamified or presented in a game-like fashion). This gray area makes possible the critical application of subjective (sometimes absurdist) interpretations to Tinder (or other systems like it). In order to investigate the critical potentiality of this gray area, however, it is necessary to identify how it is formed.

If it looks like a duck, and quacks like a duck, it might be a chicken
Our adoption of increasingly powerful mobile devices is leading us towards a multi-directional equivalence within our lived experience; we are able to work, play or socialize from any location through a single interface. Meanwhile, broadly accepted design practices are standardizing our technological interactions; we use the same gestures to browse our stock portfolio as we do to play Candy Crush. This trend is at once liberating and disorienting. Traditional signifiers of our presence in gamespace are being co-opted by ordinary life. For example, Google Glass promises a future in which we access information through heads-up displays, a technology that was traditionally “uncomfortable in its two-dimensionality” -- visually reinforcing our presence in gamespace (Galloway 35). As we apply norms from gamespace to ordinary life (and vice versa), we create the conditions for phenomenological slippage between gamespace and gamic space.
While the adoption of Google Glass represents a possible material manifestation of this slippage, a related and perhaps more insidious blurring has already arisen within the applied logic of our computational systems. Living with Rules Huizinga defines play as executed “according to rules freely accepted but absolutely binding” (Huizinga 28). These rules do not simply define the activity within gamespace; they also, through their acceptance and execution, manifest the space itself. These rules can be called ‘algorithmic’ in that they are a mathematical/logical rule-set that defines a system. However, whereas gamespace is invented -- the rules precede the space -- science has traditionally used algorithms to describe or model an already existing space – the space preceded the rules. However, as models which once merely represented an observable reality become increasingly autonomous this relationship can reverse. As discussed by Kevin Slavin, the virtual world created in the space of algorithmic interaction is beginning to precede ordinary life and to enact influence upon it; the algorithms are becoming the builders of a space they once merely represented. Thus, ordinary life has begun to resemble gamespace in its genesis – beyond the generally accepted metaphoric/analogous similarities. Do you want to date my avatar? This algorithmic influence does not only construct the environment of life/play, it also imprints its logic on the process by which we construct the self. Data we provide through online activity is fed into algorithmic systems which attempt to ascertain our interests. However, these algorithms do not identify our interests as we intend to report them, but identify in our actions possible interests as they align with corporate/marketing agendas. The resulting targeted advertisements are placed in the same context as our social engagement where we (as social creatures) are most suggestible. As we continuously compare ourselves to the perfectly curated abstractions with which we engage – both in the form of our friends’ curated virtual selves and the simulations provided by algorithms -- we create an idealized aspirational self. By accepting this algorithmic logic we affirm its validity, just as gamers accept (buy into) a prescribed rule-set. Furthermore, this acceptance (and the idealized self that it generates) drives the final logic by which gamespace and gamic space become indistinguishable. 7, 50, 19… Habits of highly successful people “In every job that must be done, there is an element of fun. Find the fun, and snap! the job’s a game!” -- Mary Poppins The promise of algorithmic rationalization has always been to maximize efficiency. Quantification allows us to apply that optimization to our own behaviors, promising to help us achieve in meat-space the idealized self we present (and are presented with) in net-space. Game-like incentives prompt us track everything and modify our behaviors accordingly. We begin to view ourselves as the always-optimizable second-self; we become beholden to the algorithmic efficiency s/he is programmed to desire. Every second not devoted to productivity becomes wasted and every aspect of our lives becomes defined by its productive value. Revealing the Logic Through cheating, creative producers – acting within gamic space – are able to critique these codifications by revealing their algorithmic fallacies. Huizinga makes a distinction between the spoil-sport and the cheat. 
While the spoil-sport rejects the framework of play, the cheat accepts, but reinterprets the rules. The cheat might deconstruct, but does not destroy gamespace (11). Average players are content to navigate gamespace as intended by the algorithm (rule-set). However, some players (cheaters) prefer to interrogate the algorithm, these players co-opt the interface to discover meta-truths within the imposed logic; this is the critical space inhabited by Pham and Jamieson. Tinder is intended to be a tool to optimize the dating process. However, Pham and Jamieson -- by approaching Tinder as a game -- were able to reveal that, while the external (ordinary life) objective might be to find love, the internal objective of the game is to collect matches. The absurdity of the number of matches they were each able to acquire reveals the disparity between these two objectives. Furthermore, the process by which they cheated reveals something further about ourselves, our trust in the system itself. Users, seeing the modified profile images, dutifully “swiped right,” revealing themselves as average players. It is both the purview and the responsibility of the philosopher/artist to be more than an average player. However, as elucidated by Pham and Jamieson, this act is not restricted to artists, but has become a staple component of participatory culture. “Uber-users,” who thrive within the slippage between gamespace and gamic space, are forming a new class of creative critics -- challenging us to re-examine digital space and our relationship to it. References Bourriaud, Nicolas. Relational Aesthetics. Les Presses du reel, 2002. Print . Galloway, Alexander R. Gaming: Essays on Algorithmic Culture. Minneapolis, MN: University of Minnesota Press, 2006. Print. Huizinga, Johan. homo ludens: a study of the play element in culture. Boston: The Beacon Press, 1950. Print. Jamieson, Blake. “Beating the Tinder game. 800+ Matches. I’ll probably get banned for this…” Medium.com. 9 March 2014. Medium. Web. October 2014. Mary Poppins .”Just a Spoonful of Sugar.” Dir. Robert Stevenson. Writers. Bill Walsh and Don DaGradi Music. Richard M. Sherman and Robert B. Sherman. Perf. Julie Andrews. Buena Vista Pictures, 1964. Web. Pham, CamMi. “Cruel Intentions: How I Hacked Tinder and Got 2015 Matches in Under 17 Hours: The formula to become Wanted on Tinder.” Medium.com. 18 April 2014. Medium. Web. October 2014. Slavin, Kevin. “How Algorithms Shape our World.” Online video. TED.com. TED, (filmed) July 2011. Web. October 2014. -------------------- Welcome to the City of Discipline By Renée Ridgway “Capital burns off the nuance in a culture. Foreign investment, global markets, corporate acquisitions, the flow of information through transnational media, the attenuating influence of money that’s electronic … untouched money … the convergence of consumer desire” (DeLillo, 1997: 785). ‘Das Kapital’ from Underworld (1997) Digital capitalism, or often commonly termed 'cybercapitalism' refers to the internet or 'cyberspace' and seeks to engage in business models within this territory in order to make financial profit. The relationship between donations, gifts and sponsorship by the private sector (Google, Facebook, Twitter, Yahoo) results in reciprocation in the form of data, debt and power constructs. As we upload, tweet, post, blog and search we give away our data for free services. Google, for example, is dependent on us willingly furnishing data that is then filtered, as value is simultaneously extracted from the data. 
With ‘the network effect’, more people contribute online because others also choose to do so, causing the value and power of the network to increase exponentially as it grows. This enables Google to have a completely free database, provided by users of the internet; by designing specific algorithms that are able to index and crawl the internet, Google provides ‘relative’ results (Leach: 2014). Nowadays it has become clear that users pay with their data, which is increasingly the means to finance the corporation’s growth as it sells this data to third-party advertisers. Cyber capitalism is structured by a highly intricate series of communication networks that connects users through their usage of social platforms, but outside of these platforms ‘hyperlinks’ direct us. How do we navigate and explore this information superhighway? We do this predominantly through search requests. Algorithms ostensibly know what we want before we even finish typing, as with Google’s ‘autocomplete’. Search, thus, is not merely an abstract logic but a lived practice that helps manage and sort the nature of information we seek as well as the direction of our queries. Google’s ‘PageRank’ (Page, Brin: 1999), based on hyperlinks, has emerged not only as an algorithm for sorting and indexing information on the world wide web, but also as a dominant paradigm that establishes the new social, cultural and political logics of search-based information societies – a phenomenon that Siva Vaidhyanathan characterizes as the ‘googlization of everything’ (2011). However, the implications of this hegemony in regard to questions of identity, free speech, expression, mobilization, etc. should not be underestimated. Are most users aware of the hidden control of search algorithms and how they affect the results obtained, whether that is for the production of knowledge, information retrieval or just surfing? Since December 4, 2009 Google has used ‘personalisation’, capturing and logging users’ histories and adapting previous search queries into the real-time search results. This search engine bias retains user data as algorithms gather, extract, filter and monitor our online behavior, offering suggestions for subsequent search requests. In exchange for our data we receive ‘tailored’ advertising, making things fit, turning ourselves into commodities for advertisers and receiving free internet usage. This personalisation is the present currency in the online marketing of our data. As we search every day, many users allow this personalisation to occur, without installing plug-ins that would inhibit it or deleting cookies. Instead we sign in and donate our data and in return receive purportedly personalised search results. The 21st century is about technology, and about how it controls our attention through the ‘filter bubble’ – where certain information on the internet is kept invisible and hidden, which deters us from learning about things we do not know (Pariser: 2012). This leads to the ‘distortion effect’ – one of the challenges posed by personalised filters. ‘Like a lens, the filter bubble invisibly transforms the world we experience by controlling what we see and don’t see. It interferes with the interplay between our mental processes and our external environment. In some ways it can act as a magnifying glass, helpfully expanding our view of a niche area of knowledge.’ (Pariser: 2012) But at the same time, these filters limit what we are exposed to and therefore affect the way we think and learn.
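The mechanics Pariser describes can be caricatured in a few lines of code (Python; the result set, topic categories, profiles and weighting are invented for illustration and have nothing to do with Google’s actual ranking): the same query, re-scored against two different recorded histories, produces two different worlds.

# A toy model of personalised re-ranking. Generic relevance scores are boosted
# by affinity with a profile built from a user's recorded history; the numbers,
# topics and profiles below are invented for illustration only.

results = [  # (title, topic, generic relevance score)
    ("Climate report questions emission targets", "environment", 0.74),
    ("Ten gadgets to quantify your sleep",        "lifestyle",   0.71),
    ("Union strike halts logistics hub",          "labour",      0.69),
    ("New phone launch liveblog",                 "consumer",    0.68),
]

profiles = {
    "user_a": {"environment": 0.9, "labour": 0.8},   # inferred from past queries
    "user_b": {"lifestyle": 0.9, "consumer": 0.8},
}

def personalised_ranking(profile):
    # Boost each result by the user's affinity for its topic, then sort:
    # whatever falls to the bottom effectively disappears from view.
    boost = lambda item: item[2] + 0.3 * profile.get(item[1], 0.0)
    return sorted(results, key=boost, reverse=True)

for user, profile in profiles.items():
    print(user, [title for title, _, _ in personalised_ranking(profile)])

Even in this toy form, neither user ever sees an unranked list; the ordering itself is the bubble.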
Personalisation has legitimised an online public sphere that is manipulated by algorithms. Welcome to the City of Discipline where we govern ourselves (Foucault:1975) through our 'behaviours' being captured and cultivated in the 'personalised' machines, sharing everything we do along with giving up our privacy for free services and the attention economy. This state of discipline is reflected in the logistical capture of our data, preferences, intimacies, and search queries. We enable this form of voluntary self-surveillance with our data, or in the words of venture capitalists, 'powerful information' by participating in online activities. The selling of our individual desires, wants and needs to large multinational corporations on the internet was already voiced by ‘Humdog’ in her prescient text from 1994, “pandora's vox: on community in cyberspace", in which she argued that the result of computer networks had led to, not a reduction in hierarchy, but actually a commodification of personality and a complex transfer of power and information to companies (Hermosillo: 1994). By remitting all of this information to corporations (Google) we get benefits out of it because supposedly we receive incredible recommendations. It's a transaction and we get relevance in the exchange. But is this really true? By adhering to the protocols of Google we control our freedoms as we let ourselves be subjected to the machinic and its demands. Our interests provide search engines with power and it is here that our subjectivity is exploited in these deterritorialized spaces. Yet as the data fragments of our daily lives are re-aggregated, algorithms are trying to predict our appearance not through our individual desires, wants and needs but through collective profiling. ‘A query is now evaluated in the context of a user’s search history and other data compiled into a personal profile and associated with statistical groups.’ (Feuz, Fuller, Stadler: 2011) Instead of the ‘sharing economy’ perhaps we need to claim ownership instead. Who owns ‘our’ data? Should we not be able to delete our data or enact the right to be forgotten? Or will we be coerced to negotiate our rights to retention, or forced to make a living selling our data instead of giving it all away? References Delillo, Dom (1997) Underworld. New York: Scribner Feuz, Fuller, Stadler (2011) Personal Web Searching in the age of Semantic Capitalism : Diagnosing the Mechanics of Personalisation. Volume 16, Number 2-7, February 2011. First Monday, peer-reviewed journal on the internet Foucault, Michel (1975) Discipline & Punish: The Birth of the Prison. London: Vintage Books Hermosillo, Carmen a.k.a. Humdog (1994) pandora’s vox: on community in cyberspace. https://gist.github.com/kolber/2131643 Leach, Fiona (2014) The Quantified Self: Can Life Be Measured? Analysis, BBC programme Page, Lawrence & Brin, Sergey (1999) The Anatomy of a Large-Scale Hypertextual Web Search Engine. http://infolab.stanford.edu/~backrub/google.html Pariser, Eli (2012) The Filter Bubble. New York: Penguin Books Vaidhyanathan, Siva (2011) Googlization of everything (And why we should worry). Oakland: University of California Press –––––––––––––––––––– THINGS –––––––––––––––––––– ERASE.all By Audrey Samson That moment when you create an event on Facebook to advertise your upcoming exhibition and Facebook suggests that you should invite Natalie Boisvert, a close friend that passed away a year ago. Digital death problematises network materiality. 
It is concerned with the life of data after a person dies. Who owns it, what actually happens to it, and who or what might have agency over it? Through examining digital death we can look at how the nuts, bolts and protocols of the network relate to our experience of those moments. Specifically, I will consider how Facebook and Google deal with digital death to illustrate two aspects of network materiality: conditions of datafication, and the persistence of data. Ultimately I propose ritualised erasure as an artistic strategy to make data tangible and to explore how these layers of stockpiled data constantly re-configure our identities, in an attempt to surpass post-mortem datafication and surveillance.

user$ rm -i -r -v all/
user$ rm: remove all arguments recursively? y

conditions of datafication
The ‘Big Data’ wave of enthusiasm breeds simultaneous concern in the realm of digital death. All those traces. Wendy Chun tells us that software promises eternity through constant reading or regeneration. Software is constantly executing: read-write. Though the idea of its permanence is paradoxical because of rapid deprecation, the illusion is sustained. Perhaps this is partially why online mourning is so widespread: digital data’s promise of preservation appeals to the desire to sublimate death. In the case of Facebook two options are possible when a person dies: to memorialise the profile page or to have it deleted. The person wishing to act upon the dead person’s profile must produce a death certificate. A memorialised page can no longer be modified and shouldn’t appear in suggestions such as ‘People You May Know’ or birthday reminders. Depending upon the privacy settings set upon memorialisation, posts may be made by friends on the Timeline. Interestingly, anyone can send private messages to the deceased person, yet a memorialised account cannot be logged into. Where are these private messages going? The other option is to request to have the profile deleted. Though it is not specifically offered, a third party may request an account deletion if the condition of the profile owner is ‘irreversible’ (i.e. mentally or physically unable to maintain their Facebook account). Facebook reviews and decides upon these requests on an individual basis. That said, it is important to note that the deletion is largely symbolic, because it is impossible to erase all data, for a range of reasons. Firstly, Facebook does not completely erase a person’s traces. They state that most personally identifiable information associated with the account, like email addresses, is removed from the database, while some personally identifiable information may remain, such as the account holder’s name if a message was sent to someone else. The material characteristics of the network also determine the persistence of the data. Facebook states that: “copies of some material (ex: photos, notes) may remain in our servers for technical reasons”. These technical reasons are based on the nature of the network and the platform. Traces remain in the servers. In other words, as soon as a digital object (for example an image) has been linked to or shared, those instances are eternal, according to Chun, through their constant propagation. Both cases offer different conditions of datafication and affect the mourning experience differently. Nonetheless in both cases the data ‘lives on’.
persistence of data
Google catalogues and archives many aspects of our existence: Gmail, Drive, Calendar, Search History, Google+, Wallet, Talk, Location History, etc. The Search History, like other Google services, can theoretically be deleted after a determined period of inactivity if the account owner signed up for the Inactive Account Manager service, Google’s answer to digital death. This service offers the option to notify contacts and share data, to specify the length of time that determines whether the account is inactive (i.e. 12 months), and the option to delete the account. Noticeably, the data can be shared with contacts, but not handed over. If the delete option is chosen, there are nonetheless some bits that cannot be deleted, such as server logs. When a webpage is visited, the request sent from the user’s browser to the server is automatically recorded. The request contains such information as the user’s Internet Protocol address (IP), the date and time of the query, the words that were entered in the search query box, and a unique ID. Therefore the server logs can show a relatively comprehensive image of a user’s search history. Google specifically states that it “may store searches in a separate logs system to prevent spam and abuse and to improve (our) services”.
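What such logs make possible can be sketched with an entirely hypothetical example (Python; the field names, values and grouping are invented and do not reflect Google’s actual log schema): grouping raw request records by their persistent identifier reassembles a readable search history, with or without a living account holder.

from collections import defaultdict

# Invented records mimicking the fields named above: IP address, timestamp,
# query string and a unique (cookie-style) identifier. Not Google's schema.
server_log = [
    {"ip": "203.0.113.7",  "time": "2014-10-20 09:12:01", "q": "symptoms insomnia",   "uid": "740674ce"},
    {"ip": "198.51.100.4", "time": "2014-10-20 09:13:10", "q": "cheap flights",       "uid": "1afc9b02"},
    {"ip": "203.0.113.7",  "time": "2014-10-20 22:47:55", "q": "sleep tracking apps", "uid": "740674ce"},
]

# Group the raw requests by the persistent identifier: the account is
# irrelevant, the trace of a person's interests survives in the logs.
histories = defaultdict(list)
for entry in server_log:
    histories[entry["uid"]].append((entry["time"], entry["q"]))

for uid, queries in histories.items():
    print(uid, queries)

Deleting the account, in this picture, touches none of these records.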
Such a digital data funeral would begin to address an overlooked and important part of digital archiving ubiquity: the erasure of digital data. Might we need to develop politics of erasure? --- References: Hadziselimovic, Nils et al. "Forgetting Is Regulated via Musashi-Mediated Translational Control of the Arp2/3 Complex" Cell 156.6 (2014): 1153–1166. Chun, Wendy Hui Kyong. Programmed Visions. Cambridge: The MIT Press, 2011. Print. Facebook. “What's the difference between deactivating and deleting my account?” Web. 20 October 2014. Google. “Delete search history” Web. 20 October 2014. -------------------- Contemporary datafications of creative acts By Damien Charrieras Datafication refers to the transformation of aspects of life into digital data and to the creation of new forms of value. The datafication of art can be understood as the way computerized processes take over tasks that were traditionally devoted to humans. Along the lines of a mind/matter dualism, Manovich distinguishes between low level automation (the computer takes over trivial tasks) and high level automation where the computer has “to understand the meaning embedded in the objects being generated” (Manovich, 2001, p. 32). Low level automation in art refers to the mechanical duplication of an artefact, or automatic processing of an output. High level automation would refer to the conceptual level of artistic ideation. This dichotomization is undermined at two levels: first, we see the atomisation of high level creative tasks through creative software and through new organizations of work (Amazon Mechanical Turk as an online marketplace advertising freelance micro jobs as HITs - Human Intelligence Tasks). Secondly, materialist approaches to creative practices show that separating high level tasks from low level tasks is misleading. We do not cognize through a disembodied mind but through the material actions we perform (Noe, 2006). Creative acts as the everyday In the realm of art practices, the production of the artist has traditionally been turned into an object to ensure the creation and circulation of value out of the artistic activity in a capitalistic context. Jed, the artist from The Map and the Territory, notes that the return to the object in art is mainly due to commercial reasons: “An object, it’s easier to store and to resell than an installation or a performance” (Houellebecq, 2010). The artistic avant-gardes have repeatedly put into question the centrality of 1/ the art object 2/ resulting from a virtuosic performance. Some art magazines focusing on the artistic process of production itself have emerged in recent years (The Happy Hypocrite, DotDotDot). Some contemporary artists coined the term “athletic aesthetic” to account for the continuous performance of the artist on social networks (Troemel, 2013). The DJ Richie Hawtin gives access to what music he listens to in real time on his Twitter account. More deliberately, life-log artists use wearable computing technologies to capture large portions of their lives, giving way to the notion of immediate auto-archiving as an art work (Morel, n.d.). The creative act cannot be reduced to a virtuosic performance, a crystallisation happening apart from everyday life. It is an ongoing emergent process embedded in prosaic life whose conceptual potential is recognized as such by artists. 
The datafication of creative acts operates at different levels: (1) It can refer to the constant archiving and processing of audience expectations in order to produce a cultural product matching the datafied expectations of the viewers. House of Cards, a political drama starring Kevin Spacey and directed by David Fincher, was produced according to the data generated by the users of Netflix (Leonard, 2013). This datafication of art operates at the metalevel of cultural production, where the creative output is conceived as the assemblage of datafied skills, people, audience expectation and the organisation of the system of production. The computer replaces the producer and as such purportedly resorts to a high level mode of automation. (2) Another form of datafication operates at a more material level of the performance of the artists. For instance, the data generated by the performance of the West Coast rapper Tupac has enabled the creation and performance of a 3D avatar of the artist performing on stage after his death (Stanyek & Piekut, 2010). (3) A third level of datafication refers to the recording of creative practices and routines, and pertains to the commodification of the creative experience in commercial creative software. These plural mundane incarnations of the creative act are captured by a growing and diversified ecology of recording technologies. Commenting on Lazzarato, Chukhrov notes: “Labor coincides increasingly with the creative maneuvers of a virtuosic performer, with active memory and an engagement with knowledge. (...) the aim of consumption today is not merely the production of goods, but the multiplication of new conditions and variations for production itself” (Chukhrov, 2010). Instead of a passive act of consumption, the economy of attention (Beller, 2006) calls upon an immersion of the consumer in the creative experience. By being consumerized, this creative experience is going mainstream and becomes pervasive, amplifying the social effectivity of a view of art as the everyday. With more and more professional creative tools going mainstream and adopting the cloud computing model of access (e.g. Adobe’s Creative Suite), the recording of behaviors of consumption of the creative experience is already possible. As Massumi puts it, the digital can potentialize only through a detour to the analogue, “through the experiential relays the reception of its outcomes sets in motion” (Massumi, 2002, pp. 141–142). There is an ongoing datafication of the perceptual regime of the artist/creative experience. Dystopia: all the lived reality of the artist is accessible through diverse protocols, recorded, processed and integrated into interrelated scripts of the creative experience circulating within an ecology of distant creative tools. Massumi envisions a future where warnings against the conflation of the virtual with the digital might become anachronistic (Massumi, 2002, p. 142). The convertibility of the analog and the digital will operate so seamlessly (adaptive neural nets, biomuscular robots, etc.) that every fear of losing something in the process will amount to an old-fashioned fetishism of the flesh. But as long as “the relationship between the analog and the digital are construed in mutually exclusive terms” (Massumi, 2002, p. 
143), the set of concepts used to give an account of the analog cannot be conflated with the set of concepts used to give an account of the digital: “[the analog] perceptually fringes, synesthetically dopplers, umbilically backgrounds, and insensibly recedes to a virtual center immanent at every point along the path”; “[T]he analog is always a fold ahead” (Massumi, 2002, p.143). But the digital datafication of analog emergent creative practices through creative software recording the artistic actions might zero out this fold ahead. Creative acts see their ontology reconfigured when all the dimensions of their actualization feed back into a permanent system of capture. In this context, creative acts as permanent inventions (James, 1996) are accumulating more and more reality relative to their affordances to various systems of capture. REFERENCES: Beller, J. (2006). The Cinematic Mode of Production: Attention Economy and the Society of the Spectacle. Dartmouth. Chukhrov, K. (2010). Towards the Space of the General: On Labor beyond Materiality and Immateriality | e-flux. Retrieved from http://www.e-flux.com/journal/towards-the-space-of-the-general-on-labor-beyond-materiality-and-immateriality/#_ftn2 Houellebecq, M. (2010). La carte et le territoire. Paris: Flammarion. James, W. (1996). Essays in Radical Empiricism. University of Nebraska Press. Leonard, A. (2013, February 1). How Netflix is turning viewers into puppets. Retrieved October 20, 2014, from http://www.salon.com/2013/02/01/how_netflix_is_turning_viewers_into_puppets/ Manovich, L. (2001). The Language of New Media. Cambridge, Mass.: MIT Press. Massumi, B. (2002). Parables for the Virtual: Movement, Affect, Sensation. Durham, NC: Duke University Press. Morel, J. (n.d.). Auto-archivage immediat. Retrieved from http://incident.net/users/julie/wordpress/?p=5441 Noe, A. (2006). Action in Perception. The MIT Press. Stanyek, J., & Piekut, B. (2010). Deadness: Technologies of the Intermundane. TDR: The Drama Review, 54(1), 14–38. Troemel, B. (2013, May 10). Athletic Aesthetics. Retrieved September 11, 2014, from http://thenewinquiry.com/essays/athletic-aesthetics/ Websites https://twitter.com/rhawtin_live -------------------- DATA (SPEAKING) FOR ITSELF By Geoff Cox (with thanks to Nicolas Malevé and Michael Murtaugh) In discussions of big data, in all its vastness and complexity, there is a tendency to think of data as raw and unmediated; and that somehow data should simply be allowed to speak for itself rather than be lost in the ornamentation of visualization. In saying this I am making reference to Edward Tufte’s guidelines for information graphics, and the removal of unnecessary graphical information to “let the data speak for itself”. Of course in reality what happens is nothing like this, as unstructured data is selected, preprocessed and cleaned, mined, and so on, in far from transparent processes - not least to make it human readable. In addition, although data may begin relatively raw and uninterpreted, in practice there is always some additional information about its composition, not least derived from the means by which it was gathered in the first place. So if data were able to speak for itself, what would it say? In the concluding passages of Capital’s opening chapter, Marx remarks that if commodities could speak, they would claim that value belongs to them. It’s an interesting reference but one that might be, and is, criticized for its assumption that commodities cannot speak. 
Marx appears to dismiss the possibility that commodities might possess their own agency and voice – a conception of capitalist production (and of civil society) that is simply too narrow to enable intersubjective relations with the commodities themselves. Yet the key point for Marx is not really whether commodities can speak or not, but that human agency is generally denied under capitalist conditions - indeed commodities require their owners to give them a voice. Does something similar take place with data? When data is harvested, and brought to market to be sold, what does it reveal about itself and its owners? To explore how the value of data is ventriloquized, I refer to the ‘data activism’ of the media art collective Constant and their attempts to understand the specific qualities of data through a series of unfinished experiments. This is related to their more general project Active Archives, running since 2006, that engages the politics of open data and introduces ethical values associated with free software development, the decentralization of resources and the ownership of infrastructures. Their working approach is to offer a series of speculations on the specific qualities of data by running computer programs. Nor is this reducible to something like a typical algorithm (eg. PageRank) that makes sense of the big data in distorted ways to 'reify' knowledge and take it to market. Rather, these ‘probes’ begin to uncover aspects of what is not directly apparent in the material, revealing aspects of what is not-yet known. To Constant, algorithms operate as ‘conversational’ agents, which perform ‘forensic’ operations to explain phenomena in their own informational terms - as data. In their logbook, they explain: “We can’t access the elements of the archive individually. Too many of them. We need intermediaries. People to tour us through. Tools, filters, sensors. That will listen, see, aggregate and separate, connect and disconnect, assemble and disassemble. / With the intermediaries, we will have to learn and speak the same language, accept the gaps, sense the priorities. The tools. They won’t see as we see through our eyes, they won’t listen as we listen, they will perceive through different dimensions, they will count time with another anxiety. / As our intermediaries, our tools will be our interlocutors.” 工具是我們的中介體,我們的工具將會是我們的對話者。 These tools treat images as data. For instance, their ‘data gallery’ is an attempt to give form to this ‘conversation’ beyond the limits of visual representation and the human sensory apparatus. An image is no longer simply what is shown on the page but what exists between knowledge produced by the different outputs of the algorithms. In this way it begins to exist in the imagination, evoking the ‘Forensic Imagination’ that Matthew Kirschenbaum also refers to in Mechanisms. Furthermore we might understand these probes as something close to the way that Eyal Weizman and Thomas Keenan define 'forensis' as more than simply the scientific method of gathering and examining data. To them, forensics gives an insight into how inanimate objects become ventriloquized, their testimonies voiced by human witnesses on behalf of the objects. “Forensics is, of course, not simply about science but also about the presentation of scientific findings, about science as an art of persuasion. Derived from the Latin forensis, the word's root refers to the ‘forum’, and thus to the practices and skill of making an argument before a professional, political or legal gathering. 
/ In classical rhetoric, one such skill involved having objects address the forum. Because they do not speak for themselves, there is a need for a translation, mediation, or interpretation between the ‘language of things’ and that of people. This involves the trope of prosopopeia - the figure in which a speaker artificially endows inanimate objects with a voice.” And so algorithms can be understood to not merely ‘read’ information in images or sound files, to not only ‘detect’ features in data, but also to generate new forms, new shapes or sounds. An example from Constant’s work is ‘Spectrum Sort’, where the algorithm’s and the human’s voices combine. It is worth noting that Wolfgang Ernst also uses the example of ‘Fourier analysis’ to make the claim that the machine performs a better cultural analysis than the human is capable of. Furthermore, Karen Barad explains that “knowing is a matter of inter-acting”, to point to how nonhuman entities are actively engaged in the making of epistemic claims. If agency is emergent through the ‘inter-action’ of such elements, data can only be understood as part of a larger assemblage that includes the computer, network, program, programmer, factory worker, and wider scientific, military, economic, medical, political systems within which it is materialized. It is in this way that the power of datafication emerges, in revealing the details of the processes by which data is brought to market and its value ventriloquized. This presents new challenges for those attempting to give data a voice, and new urgency to understanding the ways in which datafied techniques are used upon us. REFERENCES Barad, Karen. Meeting the Universe Halfway, Durham & London: Duke University Press, 2007. Print Constant, Active Archives. Web. http://activearchives.org Ernst, Wolfgang. ‘Toward a Media Archaeology of Sonic Articulations,’ in Digital Memory and the Archive, ed. Jussi Parikka, Electronic Mediations no. 39, Minneapolis: University of Minnesota Press, 2013. Print. Keenan, Thomas, & Weizman, Eyal. Mengele's Skull: The Advent of a Forensic Aesthetics, Berlin: Sternberg Press, 2012. Print Kirschenbaum, Matthew. Mechanisms: New Media and the Forensic Imagination, Cambridge, MA & London: MIT Press, 2008. Print Marx, Karl, Capital: A Critique of Political Economy, Volume I, Book One: The Process of Production of Capital, 1867. Web. https://www.marxists.org/archive/marx/works/1867-c1/ Tufte, Edward, quoted in Richard Wright, ‘Data Visualization’, in Matthew Fuller, ed. Software Studies, Cambridge, Mass.: MIT Press, 2008: 78-87. Print -------------------- Datafied and standardised (mobile) photography of the computational era By Lukasz Mirocha Contemporary culture is dominated by computational media which are created, transformed and distributed using consumer workflows that are based on certain software and hardware ecosystems composed of services, operating systems and devices developed by a few dominant vendors (Apple, Google, Samsung etc.). These specific media ecologies have a profound impact both on the datafied creative workflows that are offered to users and on contemporary aesthetic patterns in digital photography. As a result of the datafication of culture, photographs have transformed into digital entities composed of data sets which are governed by certain algorithms (Manovich 2013: 211-212). Smartphones have become the dominant cameras of our times. 
Together with other elements of proprietary computational ecosystems (mobiles, desktops, clouds) they provide a specific backbone for one of the most important and easily accessible cultural activities in today's visually oriented culture. Each day, more than half a million photos are uploaded to Flickr from mobile devices. The top five cameras in the Flickr community are in fact a few models of smartphones produced by Apple, Samsung and Sony. A couple of generations of iPhones are responsible for far more photo uploads than all DSLRs combined (Flickr Camera Finder - flickr.com/cameras). The transition from photography being a rather separate cultural practice (in terms of tools and practices) into being a software/hardware "option" can be considered as a manifestation of the cultural conditions of contemporaneity. According to David Berry we live in computationality, an era when our cultural and social practices are both rooted in and bound by digital technology (Berry 27). The majority of contemporary everyday photographers are at the same time smartphone users who are offered a standardised workflow for photo taking, editing and curation. It is pre-programmed by the device vendor and enclosed in a standardised black box — a mobile app. Standardisation as such is a vital constituent of computationality. Following Matthew Fuller, I would even argue that various types of mass-produced standard objects (physical — shipping containers or iPhones, and digital — codecs or file formats) have become a vital constituent of today's software-driven and hardware-driven economy and culture (Fuller 105). The argument becomes even more valid if we take into consideration the popularity of standardised, preinstalled photo software, as even the most popular third party apps or hardware accessories (e.g. extra smartphone lenses) were downloaded or bought by a couple of million users in comparison to hundreds of millions of devices used worldwide. Taking into account the properties of digital media, the softwarization of the photographic experience and even such a brief Flickr analysis, we can clearly assert that consumer digital photography considered as a cultural practice has shifted from being a separate domain with its own devices, techniques and community to being just one of many activities possible within a software/hardware ecosystem. All smartphone users became photographers and the camera itself became an easily accessible application. A single software or hardware update within such an ecosystem can introduce a whole new aesthetic paradigm into global mobile photography. HDR (high-dynamic range) or panorama photography or certain image effects (e.g. artificial lens flares, sepia tone) were popularised only because main vendors decided to "add", or better, to “unlock” this option in their devices. As a consequence, the user was granted access to another predesigned workflow. A similar situation can take place at a hardware level. Because of the rapid popularisation of front-facing cameras in mobiles, the super-hyped selfie phenomenon has occurred. Furthermore, wide-angle lenses, which are primary cameraphone equipment, introduce a certain type of distortion to mobile photography — stretched edges of the frame together with a slightly miniaturised centre. A cameraphone follows the logic of any digital device — its software/firmware can be updated on the fly and this may significantly alter its image processing capabilities. 
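To illustrate how little of such a workflow is open to the user, the sketch below shows, in Python, what a vendor-style "sepia" preset might amount to: a fixed sequence of parameter values applied on the user's behalf. It is not any vendor's actual code; the tint values, file names and the use of the Pillow imaging library are assumptions made for the example.

# A minimal, hypothetical "preset": a fixed pipeline the user can trigger
# with one tap but cannot inspect or alter. Requires the Pillow library.
from PIL import Image, ImageEnhance

SEPIA = (1.0, 0.82, 0.58)  # fixed tint chosen by the vendor, not the user

def apply_sepia_preset(path_in, path_out):
    img = Image.open(path_in).convert("L")               # discard colour information
    r, g, b = (img.point(lambda v, k=k: int(v * k)) for k in SEPIA)
    toned = Image.merge("RGB", (r, g, b))                # re-tint with the preset palette
    toned = ImageEnhance.Contrast(toned).enhance(1.1)    # preset contrast boost
    toned.save(path_out)

apply_sepia_preset("photo.jpg", "photo_sepia.jpg")       # file names hypothetical

The user triggers the preset with one tap but neither sees nor alters the constants inside it; a firmware or app update can silently change them for millions of devices at once.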
Taking into account the mass-scale of mobile photography (the Flickr example) and sharing options (Web 2.0), each software or hardware update within any of the main ecosystems can introduce a new aesthetic pattern in today's digital imaging. Therefore, a camera phone should be considered not as a simple tool, but rather as a post-instrumental computational apparatus, a domain which combines both layers contemporary media are composed of – a cultural and a computer one (Manovich 2002: 63). At that point a camera is hardly a standalone entity. It is fully dependent on the device's computational capabilities (hardware) and the photographic experience is designed by its manufacturer along with the interface, storage, and curation capabilities. Even the notion of programmability is now present on various levels: the vendor, by programming the device (equipping it with certain software and hardware specifications) along with its workflows, also programs the user him/herself. The user of a digital camera is also “programmed” (Flusser 2005: 28). As a result, mass-scale digital photography entered not only into the era of predetermined image filters, presets and fixed lenses but also into the domain of Big Data, meta-data and user tracking. We should not ignore the fact that usage practices and preferences of each user may be monitored and analysed by the application's developer and other third parties. Datafication in today's photography occurs not only in the photographic workflow itself, but in the field of curating and distributing. On one hand, a digital (mobile) camera should be considered as a manifestation and trigger of aesthetics and creative workflows characteristic of computationality. They only emerge as a result of software and hardware updates performed on millions of devices. However, on the other hand, it is also an instrument of the new politics of power imposed by standardisation on the creative process, on the apparatus and on the user/photographer himself. This is not the first time in the history of photography when we can observe such a situation. The first Kodak camera (1888) was marketed as a magic black box: "You press the button, we do the rest." Furthermore, each time it had to be sent to the producer in order for the roll to be developed (Sontag 31). Nonetheless, I argue that since photography has become digital, computational technologies make it possible to program and track photos, the photographer and the whole creative process far more significantly than any other factor in its history. In the light of what was argued here, we should ask ourselves about the conditions of existence of visual-based cultural practices. Could we unleash the creative potential of computational devices without standardisation and proprietary software and hardware ecosystems? Do we — everyday users — enchanted by user-friendly interfaces, one-tap integrated solutions, presets and omnipotent clouds, feel the need to look for out-of-the-black-box solutions? References: Berry, D. M. (2011) The Philosophy of Software: Code and Mediation in the Digital Age, London: Palgrave Macmillan Manovich, L. (2002) The Language of New Media, London: MIT Press Manovich, L. (2013) Software Takes Command, London: Bloomsbury Fuller, M. (2007) Media Ecologies: Materialist Energies in Art and Technocultures, Cambridge, MA: MIT Press Flusser, V. (2005) Towards a Philosophy of Photography, London: Reaktion Books Sontag, S. (2010) On Photography, New York: Picador. 
http://stunlaw.blogspot.dk/2012/04/abduction-aesthetic-computationality.html -------------------- Photographic Negative of Anonymity: 
Performativity, Betrayal, Materialism, Datafied Research By Wing Ki Lee 
"The negative is comparable to the composer’s score and the print to its performance. Each performance differs in subtle ways."
Ansel Adams (1902-1984) Photography as a Performative Art How does a medium of indexicality and mechanical reproduction become a medium of performativity? Media studies theorist Arild Fetveit (2013) discusses how contemporary art gallerists and printers (as a profession, not a machine, perhaps a mechanism) posthumously reproduce Seydou Keïta’s (1921 – 2001) photographic negatives from the 1950s and ‘60s to illustrate this phenomenon. The way that Seydou Keïta printed and performed his negatives in the 1950s and ‘60s differed from the commercial galleries’ takes, and not in a subtle way – Keïta’s original prints are moderate contrast 5”x7”s, whereas the prints produced by commercial galleries in the 1990s to the 2000s are high contrast and mural size (up to 48-by-60 inches), designed for commercial appeal. Fetveit proposes a performative model of photography where performance is exercised via different human agents in different contexts and times. Photographic printing is a preference determined by ownership of the photograph in a commodity-culture sense, by the “authenticity by means of their closeness to the artist and the time and place they were taken” (Fetveit, 2013: 92). Other than the dichotomy of consumption and (re)production produced by human performativity and agency, there is also non-human mechanical performativity, such as faults and glitches that are less desired by, and not controlled by, human agency. Applying the logic of performativity to Walter Benjamin’s (1936) notion of aura further problematises the stability and canonisation of mechanical reproduction, and such thought renders a wider spectrum within which to question standardised datafying processes in the reproduction of photographic work. A Resolution to an Age-old Debate Is there a difference in performance/reproduction in digital and analogue workflows? A digital image is composed of and rendered through numerical data and computational processes. What happens in the darkroom to render a silver-gelatin photographic print is not without digital information. Florian Cramer (2013) in his discussion on post-digital research stated that ‘digital information… is an idealised abstraction of physical matter which, by its material nature and the laws of physics, has chaotic properties and often ambiguous states.’ Score, notation, index and even the study of grain distribution through the focus-finder are manifestations of the abstraction of physical matter. Despite all sorts of differences between analogue and digital image processes, and despite the tendency to believe that digital imaging is a datafying process because of where quantification takes place, Mayer-Schönberger and Cukier argue that “digitalization turbocharges datafication. But it is not a substitute. The act of digitization – turning analog information into computer-readable format – by itself does not datafy.” (Mayer-Schönberger, Cukier, 83). In this light, to process a photographic negative into a digital surrogate has very little to do with datafying photography. Based on Fetveit’s proposal (2007:60) to separate the functions of the medium into ‘medium of storage’ and ‘medium of display’, I would further argue that in photography the datafying process performs differently in the states of ‘medium of storage’ and of ‘display’. If photographic negatives were not identified as a medium of storage, but, through decontextualisation, as a medium of display, what opportunities would this bring? 
Conditioning Photographic Negatives To situate and condition photographic negatives in the context of artistic practice, negatives are the source of “reproducibility” and “projectionability” for photographic prints (van Dijck, 108-9). Negatives are inarguably a source of data. As with musical scores, negatives are rarely exhibited against white walls as a work of photographic art. Ansel Adams’ saying goes: negative as score, print as performance. Performances are valued over the source itself, whether it is a concert (to a musical score) or a print (to a negative). Data is of archaeological value. When there is data, there will be a database. The archive, be it physical or virtual, is where the photographic negative resides, quietly, in “forgotten and dusty places.” This more or less describes the condition of photographic negatives in an artistic context. Performativity, Betrayal to Negatives of Anonymity ‘But to perform something is to interpret it, to betray it, to distort it. Every performance is an interpretation and every interpretation is betrayal, a misuse.’ Boris Groys (2008:84), “Religion in the Age of Digital Reproduction” My recent artistic practice addresses the aforementioned debate and discussion. The inception of the project is to build an archive of anonymous photographers’ unwanted and unattributed photographic negatives of Hong Kong, usually found in local flea markets, vintage shops or via eBay. These anonymous and abandoned negatives usually do not come with attribution – who is the photographer? No history or textual narrative describes their origin, let alone technical notes produced by the photographer. Conversely, the absence of prescription liberates the image reproduction process. Anonymous negatives are pure scores living in their visual and material forms. To set a standardised and normative parameter to perform this negative archive is a way to datafy and also to petrify them. The guilt of betrayal occurs every moment in the course of performing. Performance is betrayal: a betrayal of the human agency which produced the negative (authenticity and closeness to the artist); a betrayal of the viewer who expects an optimal result that the negative should have and embed. After betrayals by humans, this also brings up the very “forensic materiality” (Kirschenbaum, 2012) of the negatives. Would humans perpetually exercise power over material and silence the autonomy and agency of the medium? Photographic data is memories of multiple aptitudes. There is cultural memory of human agency and social institution, as well as memory and conditioning of the material and technicality (Ernst, 2013). Photographic negatives are not solely artifacts that datafy social and cultural events, to be understood through semiotics. In themselves they embed and live their materiality. The negative lives, the material weathers, and the chemical weathering transforms the image and conditions the object (think of a silvering-out photograph). These mechanisms grant the photographic object its medium-specificity, and datafication of a medium takes place. References Cramer, Florian. "What is Post-digital?" APRJA Volume 3 Issue 1 (2014) Dijck, José van. Mediated Memories in the Digital Age. Stanford, Calif.: Stanford University Press, 2007 Fetveit, Arild. “Convergence by Means of Globalized Remediation” Northern Lights Volume 5 (2007): 57-74 Fetveit, Arild. “The Ubiquity of Photography” in Throughout: Art and Culture Emerging with Ubiquitous Computing, Cambridge, Mass.: MIT Press, 2013 Groys, Boris. 
“Religion in the Age of Digital Reproduction” e-flux journal Volume 4 (2008):1-10
Kirschenbaum, Matthew. Mechanisms: New Media and the Forensic Imagination. Cambridge, Mass.: MIT Press, 2012 Mayer-Schönberger, Viktor and Cukier, Kenneth. Big Data: A Revolution That Will Transform How We Live, Work, and Think. Eamon Dolan/Houghton Mifflin Harcourt, 2014 Ernst, Wolfgang. Digital Memory and the Archive. Minneapolis: University of Minnesota Press, 2013 -------------------- Zombies as the living dead By Winnie Soon (PIT, Aarhus University) “We are with you everyday, we live in the Internet with peculiar addresses and enticing titbits, but you call us “spam”. We wander around the network, mindlessly, and you wanted to trash us, but we are still everywhere. We are just the children of your economic and social system, but you ignore and avoid us. We are not dead, we write, we create. -Zombies” (Soon 2014) Spam appears everywhere on the Internet. In 2014, statistics show that spam proportions reach almost 70% of global email traffic (Shcherbakova and Vergells 2014). Spam not only consists of commercial advertisements and enticing titbits; it also comes with peculiar email addresses. These email addresses become the spam’s identity, which appears in the inbox interface that one can reply to. However, many of them do not actually exist in the network. On the one hand, they are actively living in the network and are always monitored by algorithms; on the other hand, they consume numerous network resources and are regarded as “waste” (Parikka and Sampson 2009:4; Gabrys 2013:67) to be trashed. This text explores the notion of the living dead, similar to a zombie figure in popular culture, to discuss the computational and network process of spam automation. It investigates the role of code and the material aspect of code that interacts with the network environment in the process of zombification. A reflexive approach: understanding spam production I have taken a reflexive artistic approach (Soon 2014) to examine the technical and material aspects of spam. This includes setting up a spam production line, writing computer scripts to capture spammers’ email addresses instead, producing and distributing customized spam poems automatically in real time. Composing and sending massive emails requires computer code that deals with file reading and data processing. As such, code contributes significantly to the process of spam data quantification and automation. I argue, however, that the role of code cannot be taken for granted from a purely technical perspective in datafication. It requires a thorough understanding of its cultural implications and its relationship with network environments. Mutable code In spam production, the mutating value of a parameter, such as the sender and receiver’s email address, is arguably a property that facilitates spam automation. It allows data to be processed differently within the same parameter, and will not impact the entire production line. This mutable code enables data to be massively processed, and offers a certain degree of variability. However, this mutating value is not merely a technical data configuration; as Neff and Stark put it, “the information architecture is politics in code” (2004:186). Code, in this emailing context, also includes “technological and social systems” that shape what the becoming value might be (ibid). Security and filtering rules are continuously enhanced in email systems. New sender addresses need to be continuously produced to escape from being identified and trashed. 
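The scripts themselves are not reproduced here, but a minimal sketch, in Python, of how such mutable code might operate is given below: the production line stays fixed while the sender and receiver parameters are substituted on every pass. All addresses and poem lines are placeholders rather than data from the project.

# A sketch of the mutable parameter at work: one fixed production line,
# with the sender and receiver values substituted on every iteration.
# Addresses and poem text are placeholders, not material from the artwork.
import itertools
import random
from email.message import EmailMessage

harvested_receivers = ["reader1@example.org", "reader2@example.org"]  # placeholder targets
poem_lines = ["we are not dead,", "we write,", "we create."]

def mutate_sender():
    # a new, peculiar sender identity for each message, produced to slip past filtering rules
    return "zombie{}@example.net".format(random.randint(1000, 9999))

def production_line():
    for receiver in itertools.cycle(harvested_receivers):
        msg = EmailMessage()
        msg["From"] = mutate_sender()   # the mutable parameter: the value changes, the slot stays
        msg["To"] = receiver
        msg["Subject"] = "hello from the network"
        msg.set_content("\n".join(random.sample(poem_lines, k=len(poem_lines))))
        yield msg

# taking three messages from the line; an SMTP client would then submit each one
for msg in itertools.islice(production_line(), 3):
    print(msg["From"], "->", msg["To"])

An SMTP client would then submit each message for the mail server's regulatory check, where an unrecognised sender or invalid receiver causes that instance, and only that instance, to be trashed.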
Harvesting live data with active email addresses is said to be one of the most challenging aspects of quantified emailing. Computer agents, such as web crawlers and web bots, use different methods, such as web data mining (Raz n.d.), spoofing and dictionary attacks, to harvest valid and close-to-live addresses. The value of the receiver parameter stands for an actual target, and it is constantly mutating at the code level. The email server follows protocol specifications (The Internet Society 2001) that process email addresses one by one through command-line communication in the form of code. The specification “prescribes how the data should be formatted, the type of data allowed" (Hall 2000:13). This is what Alexander Galloway might call “network control” (2004:xix). On the one hand, these are technical standards; on the other hand, they "govern the set of possible behaviour patterns" as “regulations” (2004:6-7). The verification of mail servers includes the domain validity, the receiver address, the sending limit and so forth. At the operation level of code, executing such spamming programs means submitting data for the email server’s regulatory check. In view of the receiver parameter, email servers constantly receive different lists of emails through coding interfaces. These addresses are mutating at the level of code based on what has been found by computer agents. Hence, this mutable quality constitutes the entire production chain of spam; as I argue, it is not simply a data configuration that substitutes a parameter value with data. It contains other cultural implications that facilitate the automated production in a quantified condition. The undead writing of automation Spam emails are like zombies: they do not have a physical body, but they possess a temporal identity and a body of text. They may not survive for long, but even if one is trashed, there are still many around the network. Boluk and Lenz draw upon Lauro, Embry and Weinstock to discuss the zombie as “a figure of undead labor and consumption" that “is simultaneously a figure of pure automation, of programmed memory that infinitely loops" (2011:7). They are regarded as undead because the automated process minimizes human interventions and optimizes labour practices. All the digital labour, such as computer agents and computer job schedulers (also known as cron jobs), has efficiently become automated. This automated spam production is also understood as a repeatable writing process. According to Chun, “no matter who wrote it or what machine it was destined for; something that inscribes the absence of both the programmer and the machine in its so-called writing” (2011:42). As spam text is generated through computation, we could also say that code writes spam emails. From a confining process of computation to a wider framework of capitalism, zombies are undead; they are repetitively produced through writing: writing to mailboxes and writing for data capturing and processing. Computationally, however, Chun reminds us that code is a process of "undead writing, a writing that—even when it repeats itself—is never simply a deadly or living repetition of the same" (2011:177). This ‘undeadness’ suggests attention to the material level of code and the corresponding automated processes. The notion of the living dead, as I argue, encompasses code automation – an undead and repetitive writing process where parameter values are constantly mutating. 
It contributes significantly to spam zombification, and possibly other kinds of software culture that demonstrate datafication. Reference: Boluk, Stephanie and Lenz, Wylie. Generation Zombie: Essays on the Living Dead in Modern Culture. McFarland, 2011. Print. Chun, Wendy. Programmed Visions: Software and Memory. The MIT Press. 2011. Print Gabrys, Jennifer. Digital Rubbish: a natural history of electronics. The University of Michigan Press, 2013. Print. Galloway, Alexander R. Protocol: How Control Exists after Decentralization. The MIT Press, 2006. Print. Hall, Eric. Internet Core Protocols: The Definitive Guide: Help for Network Administrators. O’Reilly Media, 2000. Print. Neff, Gina and Stark, David. “Permanently Beta: Responsive Organization in the Internet Era.” In Society Online: The Internet in Context. Eds, P.N. Howard and S. Jones. Sage Publications. 2004. Print. Parikka, Jussi and Sampson, Tony D. The Spam Book: On Virus, Porn, and Other anomalies from the Dark Side of Digital Culture. Hampton Press (NJ), 2009. Print. Raz, Uri. “How do spammers harvest email addresses?”. Web. Shcherbakova, Tatyana and Vergells, Maria. “Spam report: February 2014”. Securelist. Web. < http://securelist.com/analysis/monthly-spam-reports/58559/spam-report-february-2014/> Soon, Winnie. Hello Zombies. 2014. Web. The Internet Society. “Simple Mail Transfer Protocol”. Apr 2001. Web. –––––––––––––––––––– OTHER THINGS –––––––––––––––––––– em:toolkit - cartography as embodied datification By Alessandro Carboni In this text, I propose to you a set of instructions to capture, extract and embody spatial data. The toolkit is a cartographic process that applies a new methodological approach to urban mapping based on a reflexive practice invoking the body and a series of steps comprising observation, analysis, extraction and the embodiment of data. Central to the understanding of this tool is to consider urban space as an articulated dense environment of ‘events’ (Thrift, 2007). People, objects, streets, and their relations, constitute a complex urban texture in which bodies, as agents, interact, transform and move (Batty, 2009). According to Erin Manning, bodies, navigating in space, experience space by encountering events (2014). Drawing on this, I argue that each of these events is connected to the others and in this way space ‘functions’ as a linear timeline. I claim that, in order to preserve its stability, the system requires a certain ‘horizontality’. Any interruption would generate a ‘vertical’ axis. This ‘verticality’ is an unexpected ‘event’ that destabilises the system, the urban space. At this critical moment of instability, as Michael Batty proposes (2009), the system adapts itself to the given situation, re-establishing its ‘horizontality’ once again. This process of adaptation I describe as ‘diagonality’. Put simply, with horizontality the system is stable; it preserves a certain linearity. If, suddenly, any unexpected event occurs, there is verticality. The system then moves into ‘turbulence’, creating a period of instability, and on to a critical point, described as diagonality, in which it adapts itself to the given situation, re-establishing horizontality. This sort of tension between order and chaos, described as a cyclical pattern, horizontal-vertical-diagonal, is the most common structure of any adaptive complex system (Portugali, 2013). How do we read and map this emergent occurrence of unexpected events in urban space? 
Cartography is a mapping process, and a map is a display of the alternation between practices of accumulation, disassembly and reassembly of spatial data (Corner, 1999). Drawing on Nigel Thrift (2007), who considers space as an issue of perception and the body as the medium for perceiving it, I would argue that space and body are blended in a continuum of experience. I consider the body to be as much immaterial as physical (Thrift, 2007), and to hold the endless possibility of accumulating data through experiences and the senses (Manning, 2009). I consider the body to be the initial and ultimate cartography tool of this mapping process. Building upon this, I propose to reconsider cartography as a datafied mapping process bound into time and relational connections of space and body. By applying the process suggested by Corner (1999), ‘accumulation-disassembly-reassembly’, as performative practice, I aim to capture, embody and represent data exclusively with the body. The aim of Em:toolkit is to develop a particular state of awareness in the participants who use it, prompting them to be better able to observe and interpret the ‘events’ that occur in real time in urban space. The tool enables users to reveal those ‘events’ that exist in urban space, but that we are not fully aware of because they are at the periphery of our attention. I suggest that they are just waiting to be discovered and that therefore the process of mapping is to reveal them and activate what would otherwise be latent. In the following section, I propose a set of instructions which users embody. The toolkit proposes three steps: observation, analysis, extraction. The first and the third refer to a beginning and a final stage; the second ‘supports’ the action. With these three steps you are able: first, to create the field, the setting of rules and the establishment of a system; second, to relate to the extraction, isolation of parts and data; and third, the plotting, the drawing-out, and the setting-up of relationships of the parts. In the observation, you define the location, the physical space in which you would operate. You create a boundary, with variable dimensions and scale, in which you focus your interests, mapping your process and actions. With the analysis, you circumscribe the ‘event’ and formulate a body-action, as a datafication of the experience. The last step, the extraction, operates as an execution and repetition of the body-action. The three steps are cyclical and they are constantly activated by you whenever an ‘event’ occurs. You, as a performer, experience the emergent properties of unexpected ‘events’, and have to adapt your body-actions to given circumstances. This reflexive modality is cyclical. Thus, at the outcome of each cycle, you embody the extracted data, as a body-action that functions as a map. Now you are ready to apply the toolkit. The mapping process starts in urban space. The first step is the observation mode. While moving in the urban space, you are waiting to capture an event. When an event suddenly occurs, it is an interruption of the established horizontality. The event is unexpected and it produces verticality, and turbulence. It can be any situation that affects the performer. In this moment, you activate the analysis mode. You are not just a passive spectator. On the contrary, as you are affected, immediately you have to open several possibilities of action in response to the event. These take place in the form of small ‘holograms’ in your mind. 
Can you see them? In order to avoid any ‘instinctive’ action, you should pause, wait and analyze the event, and find an answer to the following three questions: 1) what constitutes this event? 2) how can I make a relation to it? 3) where/when should I make an action? With the first question, you make a list of possibilities. To the second, you imagine a body-action. Before making the body-action, you answer the question as to where/when the action might take place. Once this is decided, you make the action. This re-writes the event, as it was not visible or in the centre of your attention. This is a process of circumscription, defined along a diagonal axis, in order to re-establish horizontality. Once the action is executed, you proceed to the extraction mode. You extract the body-action from the location where it was performed, and memorise it. As the final step of the mapping process, the extracted body-action becomes a Unit of movement which can be repeated. This Unit is a map. Once this Unit is extracted, you come back to the observation mode, waiting to capture another event. When it happens, you reformulate the three questions and start a new cycle. REFERENCES: Batty, Michael. "Complexity And Emergence In City Systems." Malaysian Journal of Environmental Management 10.1 (2009). Print. Caquard, Sébastien. Mapping Environmental Issues in the City: Arts and Cartography Cross Perspectives. Berlin: Springer, 2011. Print. Corner, James. "The Agency of Mapping: Speculation, Critique and Invention." Mappings. Reaktion, 1999. Print. Crampton, Jeremy. "An Introduction to Critical Cartography." ACME: An International E-Journal for Critical Geographies 4.1 (2006): 11-33. Print. Dodge, Martin, Kitchin, Rob and Perkins, Chris. The Map Reader: Theories of Mapping Practice and Cartographic Representation. John Wiley & Sons, Ltd, 2011. Print. Manning, Erin. Relationscapes. London: MIT Press, 2009. Print. Merleau-Ponty, Maurice. The Structure of Behavior. Beacon Press, 1967. Print. Phelps, R & Hase, S. “Complexity and action research: exploring the theoretical and methodological connection”, Educational Action Research 10.3 (2002): 507-524. Print. Portugali, Juan. Complexity, cognition and the city. Berlin: Springer, 2011. Print. Thrift, Nigel. Non-Representational Theory: Space, politics, affect. London: Routledge, 2007. Print. -------------------- Tomorrow’s News - The Early Edition By James Charlton There are big numbers where the Internet lives. Exabytes of information stored on servers, stacked in data fortresses around the world. Down corridors of container vessels, technicians ride on scooters as if in some macro version of computer architecture, repairing and maintaining the physical network of numbers – numbers connected to numbers in networks of servers, ports and cables. This is the physical Internet; the bits of the bytes, where numbers exist embodied in physical objects. This is where data has dimension, weight, temperature and scale. Where it consumes energy, demands attention and becomes a thing-in-itself that can’t help but look backwards over its shoulder at the uneasy relationship between data and objects. Although data has been presented as embodied in the physical architecture of things, this is clearly not the same as a thing being data, and Big Data – the indiscriminate dirty data of Mayer-Schönberger and Cukier – is not simply things as big numbers. Rather, Big Data seems to demand a rethinking of the relationship between the data-event and the data-object. 
The need for this differentiation is made clear by Tristan Garcia when he distinguishes between ‘that which is something, and that which something is’ (2014b, 52). A newspaper such as this is something, but the thing that the newspaper is – its data metrics - is not the same as the newspaper. Both exist in their not being of the other, a process through which they must, according to Garcia, maintain the relational potential of their own failure - their compactness. Data thus comes into being through the event of its own compactness. In the sense that data must always be its own thing, it is also continuous in its relation to its subject. How is it, then, that data separates itself from the subject? The emphasis Big Data places on correlation over causation - on ‘what rather than why’ (Mayer-Schönberger), appears to be such a separation of information from context. As Mayer-Schönberger asserts, it is Big Data’s willingness to embrace ‘real-world messiness rather than privilege exactitude’ (19), to deliberately ignore context and focus instead on predicting the future, that isolates it from its subject. Intent on the future, Big Data’s predictive gaze is grounded on a construct of time that is reliant on the discreteness of past, present and future. In not caring why something happened, Big Data distinguishes itself from the causal past and locates itself fully in the self-realising events of the predictive future. It becomes a thing-in-itself that is reliant on a discrete quantified construction of time that allows for the notion of prediction. Mayer-Schönberger’s insistence that predictions based on correlations lie at the core of Big Data is thus also a Presentist construction of time, predicated on locating the subject exclusively in the present. Perversely, only when we locate ourselves solely in the present can the potential of prediction be realised. Only when the future is not present does it remain the future. Only when a thing exists out of context and is a thing only in itself, can Big Data’s predictive claim be made. Big Data seems to want to play it both ways then – drawing from a subject that is in the present while positioning itself entirely in the future. Its predictive promise seems to rely on a form of temporal amnesia that, in order to avoid a stack overload of object and event, demands we move forward so rapidly that the present fades into the past even before it has arrived. But rather than time disambiguating the relation between subject and data, Big Data seems to challenge us to consider models of time other than those that position the subject on the knife-edge of the present in a continuity between past and future. Is it plausible, then, to conceive of a temporal schema that enables a subject to be doubly locatable in time without compromising the compactness of itself? Drawing on Growing Block Universe Theory, Garcia offers only a partial solution to this when he proposes an alternate model that resolves the co-conditional construct of things as both things-in-something and things-in-time, a model in which past and present are intense variations of presence rather than isolated instances of equal intensity (3). The past, rather than being discrete and separated from the present, is part of the continuity of event-time in which the discrete thing is no less a thing but fades in the intensity of the present. Like yesterday’s newspaper that yellows in the sun, Garcia’s present constantly moves away from the now – away from a position of maximum presence. 
Yet as its printed copy degrades in legibility, yesterday’s newspaper never ceases to be fully determined. At every moment the past conserves its option to remain individuated as it accumulates absence upon absence in a qualitative time of intensity. While this accumulation of absence rather than presence enables Garcia to define the present as doubly locatable in past time, and removes the tension between objects of the past and events of the present, he also condemns tomorrow’s newspaper to remain on the printing press of the future where it resists circulation in the present. His future of maximal absence forever seems to maintain its distance from the present in order to maintain itself as the future it is – as a thing in itself. If the fading intensity of the past is not applicable to the future then the future remains defined by the relational model of time. While Big Data doesn’t change things, it cannot be so easily exempted from the world by escaping into the predictive future. In order to function in the predictive terms described by Mayer-Schönberger, Garcia’s model of intensive time must then also allow for the future to be understood as doubly locatable in time; a way perhaps for the future to fade into the present without losing its own indexical position - a way for us to turn the page without losing our place. For, surely, although tomorrow’s news may never arrive - the Early Edition will still be delivered. WORKS CITED Garcia, Tristan. "Another Order of Time: Towards a Variable Intensity of Now." Trans. Kris Pender. Parrhesia: A Journal of Critical Philosophy 19 (2014a): 1-13. Parrhesia. Web. 02 Oct. 2014. Garcia, Tristan, Mark Allan Ohm, and Jon Cogburn. Form and Object: A Treatise on Things. Edinburgh: Edinburgh UP, 2014b. Print. Mayer-Schönberger, Viktor, and Kenneth Cukier. Big Data: A Revolution That Will Transform How We Live, Work, and Think. Boston: Houghton Mifflin Harcourt, 2013. Print. –––––––––––––––––––– PEOPLE –––––––––––––––––––– DOES CONSCIOUSNESS EXIST WHERE SURVEILLANCE CAN'T GO? BIG DATA TO BIG BRAINSTORM By Ellen Pearlman ICREACH is the NSA's covert system of secret surveillance records, processing two to five billion new records every day, including 30 different kinds of metadata on emails, phone calls, faxes, internet chats, text messages, and cellphone location information. When banks of big data, including wearables, expand to include bio-information about individuals, how will we navigate this new scenario? At the 2014 HOPE X conference in New York City, Edward Snowden appeared via video link for a live dialogue with Daniel Ellsberg, leaker of "The Pentagon Papers." Snowden stated the government is creating deep, robust data sets to analyze everybody, everywhere, all the time using network and cell phone intercepts. We the people, he stated, have the means and capabilities to encode our rights for the future. In 2013 the Obama administration launched the 10-year Brain Initiative to map every neuron in the human brain, and in 2013 the European Union announced The Human Brain Project. Half of the allocated U.S. funds go to the Department of Defense; the rest to the National Science Foundation and the National Institutes of Health. The implications of these disbursements are not yet clear, but in light of Snowden's revelations they are troubling. In the future, gamers will use brain-enabled devices, playing in a virtual world and uploading brain data to the cloud. 
Researchers at the University of California, Berkeley have reconstructed rough images from dreams and other visual responses, creating a map of the semantic brain. This data contains the core of who we are, and what we think. Once neural networks are cracked, and cognitive processes formerly inaccessible are open for monitoring, what might the future hold? MEMORY MANIPULATED BY LIGHT - OPTOGENETICS Scientists at Albert Einstein College of Medicine at Yeshiva University in New York studied the molecular basis of memory using fluorescently tagged neurons in mice. They stimulated neurons in the mouse’s hippocampus, a brain area where memories form. They watched fluorescent memory molecules develop inside neuron nuclei. MIT professors used optogenetics, manipulating individual cells with light. They placed a mouse in a box, shocked it, then altered the genes of the shocked brain cells. The mouse was moved to a new location. It behaved normally. Researchers shone a special blue light activating the genetically manipulated memory cells. The mouse's fear response returned though there was no threat, proving certain types of memory can be manipulated. RECONSTRUCTING THOUGHTS AND DREAMS The University of California, Berkeley developed software reconstructing visual imagery using fMRI brain data. Subjects watched two different groups of Hollywood trailers. Their fMRI data was recorded. A software program categorized the brain information using an algorithm examining 935 different object and action variables of shapes and motion. Another algorithm analyzed eighteen seconds of thousands of random YouTube videos sorted by color palette. Special software selected brain patterns connecting shape and motion from the movie trailers, combining them into a simulated representation of the types of objects viewed, like a circle, horizon or face. Researchers have discovered the brain naturally catalogues images into visual semantic categories. Different brain locations exist for categories like moving vehicles, car, boat or truck, and are similar for most people. Can our thoughts and feelings remain hidden? TELEPATHIC BRAIN IMPULSES A quadriplegic used a BrainGate device with a robotic arm to grasp a cup, powered by the neural signals of just the intention to move a limb. A tiny neural sensor implanted in the human brain with 100 electrodes records activity in the motor cortex. These signals are “decoded” in real-time, powering an external robot hand. The quadriplegic performed complex tasks with the robotic arm by imagining the movements of their non-active limbs, as neurons were still able to fire. Will drones and bombs be activated through soldiers’ thoughts? MELDING THOUGHTS Researchers at the University of Washington produced the first non-invasive human-to-human brain interface. One subject wore an EEG cap. He imagined controlling a video game with his right hand. That brain signal traveled over the internet. A second subject had a TMS (Transcranial Magnetic Stimulation) coil on top of the area of his left motor cortex that controls the right side. He received the internet EEG impulse into his brain (non-invasively). Subject one imagined moving his right hand. Subject two's right hand received that impulse and jerked. Will people’s movements be controlled by the thoughts of others? EMOTIV CAP AND SPY SURVEILLANCE The Emotiv cap, a portable EEG device, reads changes in electrical activity in the brain. 
Those changes can be mapped to emotions, facial movements, eye, eyelid and eyebrow positions, smiles, laughter, clenched teeth, and smirks, and can be mapped onto other devices or virtual avatars. A paper at the USENIX Security '12 conference, “On the Feasibility of Side-Channel Attacks With Brain-Computer Interfaces,” concluded that someone could use brain data to steal a bank PIN. The paper examined the "P300" brain "fingerprinting" signal, activated when someone recognizes something. Researchers had a 40-60 percent accuracy rate identifying details of where a subject banked and what their PIN was by flashing photos of bank logos and various numbers while monitoring their P300 responses. Could hackers steal brain information? IMPLICATIONS Computers model rudimentary representations of human dreams, perceptions and memories, correlating stimulated areas of the brain and reconstructing images through algorithms. Only devices that use fMRI, EEG or MEG machines can currently deliver results. Blinking, moving one’s head, coughing or daydreaming can skew a reading. In order to decode a subject's imagery the subject needs to remain stationary inside an fMRI scanner. In the future, this will change. How will the military and law enforcement use and control brain information? [HOPE X] - Manning by Internet Society (The New Livestream) http://new.livestream.com/internetsociety/hopex1 How Cloud Computing Changes the Game for Retail Industry CIOs (The CIO Report) http://blogs.wsj.com/cio/2013/10/08/how-cloud-computing-changes-the-game-for-retail-industry-cios/ ICREACH: How the NSA Built Its Own Secret Google (The Intercept) https://firstlook.org/theintercept/article/2014/08/25/icreach-nsa-cia-secret-google-crisscross-proton/ http://newscenter.berkeley.edu/2011/09/22/brain-movies/ http://publicintelligence.net/ssci-mkultra-1977/ Reconstructing Visual Experiences From Brain Activity Evoked by Natural Movies (Gallant Lab at UC Berkeley) http://gallantlab.org/publications/nishimoto-et-al-2011.html http://www.smartbraintech.com/store/pc/viewCategories.asp http://www.whitehouse.gov/share/brain-initiative -------------------- Neuro Memento Mori: portrait of the artist contemplating death By Jane Prophet The human brain has commonly been described as the final frontier of the scientific biological exploration of the human body, largely unknown and underexplored. The Human Brain Project claims it is “one of the greatest challenges facing 21st century science”. The relative lack of knowledge and research into the way the human brain works has been attributed to a paucity of data about the brain, the result, historically, of the limitations of instruments to measure living brains. Neuroscientist Fred Mendelsohn notes that in 1960 “there was no way to image the structure of the living brain; the skull represented a virtually impenetrable barrier to further understanding”. Following the development of new scientific instruments such as Magnetic Resonance Imaging (MRI) and functional MRI (fMRI) that are capable of more safely imaging living brains, there has been a huge increase in data generated by neuroscientific research, and the US Congress named the 1990s the Decade of the Brain. Contemporary studies of the human brain generate significant quantities of large datasets, typically many gigabytes and terabytes of image data.
The particular challenges associated with working with image data are often presented as ‘informatics problems’. Neuroscientific studies that analyse the human brain, with their associated neuro-images, have proved to be great clickbait, capturing the public’s imagination. Some of the appeal of contemporary neuroscience can be attributed to the excitement and power associated with a venture successfully marketed by nation states and corporations as ‘pioneering’. However, theorists have drawn attention to the particular ‘seductive allure’ of neuroscientific explanations of behavior and of neuroscientific images created using MRI, and their role in what has been described as ‘neuro-popularization’ and ‘neuro-hype’ respectively. Scholars interested in the rhetoric of neuroscience, like David Gruber, have studied numerous neuroscience reports in the popular press and argued that brain images generated by MRI and EEG are key to the appeal of neuroscientific research and the widespread dissemination of its data. Neuroscientific data, especially data represented using images like fMRI, is contentious, and there are debates about how the data is gathered and the interpretations such data can yield. Cordelia Fine for instance has made compelling arguments about the ways that neuroscientific literature has been used to support the claim that certain psychological differences between the sexes are ‘hard-wired’. Fine’s work is particularly relevant to our research as it has been claimed by Gur et al. that “men and women differ in volumes of brain structures involved in emotional processing such as the temporo-limbic and frontal brain”. In keeping with a feminist and new materialist approach to technoscience I have previously adopted in my work with scientists and scientific data, I consider our experiments as intra-actions: “[i]ntra-action works as a counterpoint to the model of interaction, signifying the materialization of agencies conventionally called ‘subjects’ and ‘objects’, ‘bodies’ and ‘environment’ through relationships. Intra-action assumes that distinct bounded agencies do not precede this relating but that they emerge through it,” echoing theories defending a performative view of the gendered social subject. Karen Barad adds that “method, measurement, description, interpretation, epistemology and ontology are not separable considerations”. To enable the extraction of microscopic data for modeling and simulation, many neuroscientists use “BigBrain”, a free, publicly available tool that provides considerable neuroanatomical insight into the human brain. “BigBrain” provides a map that serves as a model of all human brains. It is an ultrahigh-resolution 3D model of a human brain at nearly cellular resolution of 20 micrometers, based on the reconstruction of 7404 histological sections of the brain of an unidentified person. This tool is a good example of what Matthew Fuller would call a ‘standard object’ of the human brain. The concept of standard object refers to the ways in which an understanding of the qualities and affordances of an object in a specific context can be used to evaluate the capacities of this object in another context (Fuller, 2005: 172). “All standard objects contain with them drives, propensities, and affordances that are ‘repressed’ by their standard uses, by the grammar of operations within which they are fit” (Fuller, 2005: 168). The anatomy of individual brains varies widely, influenced by genes, environmental exposures, experiences and disease.
“BigBrain”, as a standard object, offers a normative model approximating the physiology of a standard human brain. Through a performative artistic and scientific experiment, we propose to decontextualize “BigBrain” from its normal use to unleash the effectivities of some of its unseen cognitive affordances. Little is known about the neural substrates associated with the awareness of mortality. Brain imaging results may reveal the neural mechanisms underlying death-related psychological processes. For hundreds of years memento mori and vanitas artworks have had the alleged function of prompting the viewer to contemplate their own mortality. My collaborative research with two neuroscientists, Zoran Josipovic (NYU) and Andreas Roepstorff (Aarhus University), and our respective research teams, gathers fMRI data via two related experiments that we have co-designed. This data is used to produce outputs ranging from sculptural 3D-printed objects to academic papers. In the first of our experiments I view representations of memento mori and vanitas artworks, interspersed with control images of similar form from a similar time period. This is a 2 x 2 experiment (image type crossed with text phrase): before each image is displayed, one of two text phrases is shown. The phrases are “You will die” and “Live the now”. The image and text combinations are random. In the second experiment I undertake a series of death meditations, interspersed with control meditations on compassion. This project emerges through intra-actions between experimental neuroscience and the humanities and explores what parts of the brain are active when a subject (me) views memento mori images and when the subject contemplates death via a death meditation. Data will be analysed and compared to determine whether there is any correlation between the activity registered in different parts of the brain during these two activities. We hope to find out whether viewing memento mori and/or vanitas artworks prompts activity in the same areas of the brain that are active during the meditative contemplation of death. The experiments are being designed, conducted and evaluated in a critical context where we challenge, through a practice that puts forward a performative and relational ontology, the usual ways in which such MRI images are gathered as data, interpreted, and used to build, scientifically and rhetorically, a certain image of the scanned subject. References Barad, Karen. Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning. Duke University Press, 2007. BigBrain, http://articles.latimes.com/2013/jun/20/science/la-sci-big-brain-20130621 Decade of the Brain, http://www.loc.gov/loc/brain/ Fine, Cordelia. Delusions of Gender: How Our Minds, Society, and Neurosexism Create Difference. WW Norton & Company, 2010. Fuller, Matthew. Media Ecologies: Materialist Energies in Art and Technoculture. MIT Press, 2005. Gruber, David, et al. "Rhetoric and the Neurosciences: Engagement and Exploration." Poroi 7.1 (2011): 11. Gur, Ruben C., et al. "Sex Differences in Temporo-limbic and Frontal Brain Volumes of Healthy Adults." Cerebral Cortex 12.9 (2002): 998-1003. Healy, Melissa. "Scientists Create Detailed 3-D Model of Human Brain." Los Angeles Times (2013). Mendelsohn, Fred. Understanding the Brain and Mind: Science's Final Frontier? (web) http://mdhs.unimelb.edu.au/news/understanding-brain-and-mind-science%E2%80%99s-final-frontier Prophet, Jane and Pritchard, Helen.
"Performative Apparatus and Diffractive Practices: An Account of Artificial Life Art." Artificial Life (in manuscript, 2015). The Human Brain Project, https://www.humanbrainproject.eu/discover/the-project/overview Weisberg, Deena Skolnick, et al. "The Seductive Allure of Neuroscience Explanations." Journal of Cognitive Neuroscience 20.3 (2008): 470-477. -------------------- Emails from an American Psycho By Lea Muldtofte “Dress like a secret agent, Fitted dress shirts and jackets for the modern man by Saboteur, www.saboteurman.com” (Cabel, Huff: 2012) Published in 1991, American Psycho by Bret Easton Ellis presents a first-person portrait of Patrick Bateman – a Wall Street banker and an industrious serial killer. Bateman, through his own voice, is revealed to be a narcissistic, status-obsessed perfectionist who not only thoroughly describes his own acts of torture and execution, but also details his extreme regime of self-maintenance and his obsession with music. The artists Jason Huff and Mimi Cabel rewrote Ellis’ text. They called it American Psycho 2010, and this version was made by sending the text of American Psycho, page by page, between two Gmail accounts. The resulting Google-generated advertisements were kept as footnotes while the original text was deleted. American Psycho 2010 thus consists of 800 ads as footnotes corresponding to the voice of Patrick Bateman. I will here argue that this rewriting, moving from offline to online (and back to offline) literature through communicative media as a filter, not only manifests a here-and-now, alternative, consumerist portrait of Bateman co-authored by Google’s algorithmic interpretation of the text, but also elucidates a reading and writing otherness – an otherness uttering within its own discourse, which we emulate in our daily email correspondence. Thus Google is reading and producing us as consumerist subjects through these literal discursive utterances. In order to clarify this argument, I will use Bernard Stiegler’s notion of grammatization. Stiegler draws upon Derrida’s reading of Plato and his description of the act of writing as a mnemotechnic (Stiegler, 2009). Grammatization then implies an exteriorization of consciousness and consequently an exteriorization of memory. Alphabetization or grammatization hence means making the interior into concrete, discrete units – making something into grammar, patterns and code. And since thoughts, when grammatized, are units “out there” instead of abstractions “in here”, they can be infinitely duplicated and distributed independently of us. Though Plato was deeply concerned with this development, it is how we make and have been making history – collective and individual memory – as well as how we construct members of a society. We exteriorize our actions and ourselves in descriptive grammatizations (most basically: birth, death and social security numbers) so that others can know us and re-know us, even after we are gone. According to Stiegler, grammatization is therefore also a constitutive foundation for a feeling of belonging – a constitutive function from which a possible individuation of subjects can be derived, since it enhances the individuation of a we, a society, which then co-constitutes the understanding of the subject as an I, psychically and collectively. Stiegler points out that this is not a new socio-political argument: I am not human except insofar as I belong to a social group – an understanding he takes from Aristotle (Stiegler 2014).
A possible co-individuation and trans-individuation is thus forwarded by a descriptive grammatization of social relationships. Discrete units of the printed psychopath In American Psycho Patrick Bateman is the subject of enunciation as he, through a first-person narrative, appropriates a present-time discourse – reporting from his brutal killings. Consequently, the psychopath Patrick Bateman, as a character, is obviously written – grammatized; he exists only within his own grammatization. However, within his grammatization, he uses the discursive language of a spoken conversation. He “speaks” in the present time, at some points directly addressing the reader, and has a curiously bad memory – as if he were not grammatized: “I've forgotten who I had lunch with earlier, and even more important, where.” (Ellis, 1991) In this grammatization it is arguable that the written (fictional) Bateman is engaging me as the reader in a we – the two of us together in our differences and perhaps in some ways disturbing similarities – as well as making me reflect and relate to my surroundings through Bateman’s extremely narcissistic description of himself and his milieu in the book, thereby potentially facilitating a reflective individuation for me as the reader. Psychopathic consumer Something remarkably different is happening in American Psycho 2010, since a number of different enunciators are at play, and grammatization here has a completely different role. Not only is the online subject (Bateman as well as the subject of any Gmail account) grammatized, the milieu in which the subject is inscribed is grammatized as well – written by code. In this light, grammatization functions as datafication. And the crucial difference between a self-description offline and online is that a self-description online is also an instant self-indexation – it is traceable. Patrick Bateman, as the subject in American Psycho 2010, not only writes himself through first-person narrative (and is then produced by a reading subject through the act of reflecting, understanding, and interpreting); he is also caught in a parallel reading and writing process with algorithms. In the particular case of Gmail, Google’s algorithms use keyword identification within Patrick Bateman’s utterances to write Bateman as a consumer entity, mapped to Google’s corporate sponsors. An alternative portrait of Bateman as a mere consumer is manifested in the resultant ads. In Stiegler’s vocabulary, instead of the I as the grammatized subject individuating within a grammatized we in a Gmail conversation, the Is and the we are considered a they by the algorithms of Google, a collection of consumers, not individuals – which thus means a loss of individuation. This becomes remarkably literal and explicit in American Psycho 2010, where Patrick Bateman’s utterances, self-description, history and memory are literally deleted – even Bateman, an extreme psychopath, is read and written by a corporate algorithm and reduced to a mere consumer like everybody else.
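The keyword mechanism at work here can be illustrated with a minimal sketch – the keyword-to-ad table and the matching rule below are hypothetical stand-ins, not Google's actual, proprietary and far more complex ad-selection system. A page of text goes in, only the keyword-triggered ads come out as footnotes, and the narrating voice is discarded:

import re

# Hypothetical advertiser inventory (keyword -> ad copy); purely illustrative.
AD_INVENTORY = {
    "suit": "Fitted dress shirts and jackets for the modern man.",
    "restaurant": "Book a table tonight - exclusive Manhattan dining offers.",
    "music": "Stream every album ever recorded. First month free.",
}

def ads_for_page(page_text):
    """Return the ads triggered by keywords in a page; the page itself is discarded."""
    words = set(re.findall(r"[a-z]+", page_text.lower()))
    return [ad for keyword, ad in AD_INVENTORY.items() if keyword in words]

page = "I had lunch at a new restaurant downtown, wearing my best suit."
for number, ad in enumerate(ads_for_page(page), 1):
    print("[%d] %s" % (number, ad))  # only the footnotes remain; the narrating 'I' is gone

Crude as it is, the sketch makes the asymmetry visible: the grammatized self-description is read only for its saleable keywords.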
In conclusion, in a broader cultural perspective, the piece by Cabel and Huff can be seen as a critique of how a reading/writing other (in this case, Google’s algorithms) is also constituting the human subject, simultaneous to the human subject’s reading/writing of him/herself via electronic communication. References: Ellis, Bret Easton: American Psycho, London: Picador (1991). Stiegler, Bernard: “How I Became a Philosopher” in Acting Out, Stanford, Calif.: Stanford University Press (2009). Stiegler, Bernard: The Symbolic Misery, Cambridge: Polity Press (2014). -------------------- DATAFYING THE GAZE, OR THE BUBBLE GLAZ By Mitra Azar "I am an eye. I am a mechanical eye. I, a machine, I am showing you a world, the likes of which only I can see" Dziga Vertov, WE: Variant of a Manifesto, 1919. For a while now I have had the impression that Google and the net are paradoxically becoming the conditions of existence of the real world, and not vice versa. I exist if I am googable, that is, if the algorithms which operate Google indexing are able to trace me, thus turning me into a thing other than the Cartesian res cogitans and res extensa, and converting me, one might say, into a res googable. The drifting of the Lebenswelt into the Googlewelt is operated through the Filter Bubble, a series of algorithms that direct my queries based on my previous interactions with the search engine. The Filter Bubble encodes the subject's intentions, and replaces its experiential activities with an automated array of algorithms that projects – within the unity of an exogenous programmed identity – the traces we leave online. Recently, through wearable technology like Google Glass, the Filter Bubble has conquered the third dimension. From a techno-aesthetic point of view, the perceptive core of the Glass is quite simple: a virtual image recorded by the Glass camera is projected by a micro-projector onto a prism and thereby injected into one of our eyes from a very close distance, between our gaze and the reality we perceive. The nature of this image is a political battlefield: POV (embodied image) proliferation is politically relevant, especially in relation to the anonymity and frozen inscrutability of CCTV footage or drone images (disembodied images) as metaphors of a post-centralized panoptic gaze. POV images didn’t have a proper political connotation a few years ago, or were a shortcut to a specific type of pornographic film, and only in recent times have they become the way people activate their digital netizenship in contexts of uprisings and through the epistemology of an open distributed network of nodes. In this context, Google is laying its hands on a perceptive region under the self-organizing control of the crowd, with the intention of colonizing the most intimate point of view ever, that of the shadow. Now, let’s explore the Glass through one limit short-circuit we might experience while wearing them, when the image of what is right now in front of our eye(s) is recorded live by the Glass camera and projected on the semi-transparent screen in between our eye(s) and what is right now in front of our eye(s). What is our relation with this meta-subjective image (and gaze), where the actor is the ever-changing zero-degree point of view behind the image and, at the same time, its real-time spectator? Google Glass users will see simultaneously off the frame and inside the frame, in front of the camera and behind the camera.
From a temporal perspective, the narcissistic mirroring of the Bubble works according to the odd principle of “the influence of the future on the past”, quoting a phrase from the film Morel's Invention by E. Greco, from A. B. Casares’ eponymous novel, in which the scientist Morel invents a machine with the power to holographically reproduce reality, only to compulsively superimpose it on reality itself. The idea of a peripheral future (the collapse of the future into the past, and the consequent After the Future society, as Bifo would call it) marries perfectly with the micro-gestural/non-gestural peripheral interaction of the user with the device. While the Glass puts to work our mostly unaware bodily activities (the eye-blinking, no-hands shooting technique, head shaking, etc.), in the only physical interaction with the device (sliding a finger backward over the touchpad on the right arm of the Glass to access current events, sliding forward to access the past), the intuitive gestural movement between past, present and future is, thus again, reversed. From a screen perspective, the concept of display leaks between the prism and the imaginary layer where we possibly perform offline body memes for the Glass camera to activate our online sphere, as in the case of the hand-heart shape patented by Google – the last frontier of the semantic web, before brain-to-brain interfaces replace actions with mirror neurons. In the prism, we can see our visually datafied life(log) in real time, and we can interact with it by an almost invisible body language which activates the device towards reality and enhances the offline-online-offline circuit (introflected datafication, full circuit). This circuit, potentially pregnant in terms of political disobedience, is now possibly colonized by the Glass. In the YouTube ads, Google’s suggestions about the ukulele book and the store where to buy it climax and become real when the protagonist starts playing the real instrument in front of a romantic sunset, hanging out (literally, over Google Hangout) with his girlfriend. This cathartic moment confers on Google the function of making real and of reversing the hierarchical (also temporal) relation between online and offline, making the ukulele a res googable and anesthetizing the offline-online-offline circuit. Meanwhile, in the invisible layer created by our hands performing for the Glass, we can visualize the circuit as an online-offline halfway turn (extroflected datafication, half circuit), where the offline performance is subordinated to its online consequence. From a phenomenological point of view, the Glass might target the uniqueness of the experiential relationship between environment and organism. Experiencing (ex-pèrior) etymologically means to experiment, while the intensive prefix ex conjures up a universe opposite to that of mirroring, similar instead to the ecstatic (ex-stasis) universe of coming out of oneself, of the challenge of otherness. The process of singularization is indeed everted by the Bubble Glaz, which singularizes on behalf of the subject, generating a user-oriented universe, “showing you a world, the likes of which only I can see”, as in the KinoGlaz of Vertov. Bubble Glaz is the attempted assimilation of the landscape to the map, making both defamiliarization and alertness impossible, which are typical of experiencing the singularizing space.
From an ontological point of view, we might talk about a form of ontological onanism opposed to an ontological eroticism where, on the contrary, the relational fabric of an online connected collective intelligence sensuously unfolds itself, gets stronger and multiplies among the differences. It seems, though, that the Bubble Glaz ideology pragmatically confirms the epistemological futility of estrangement, and of the potential and subsequent state of bewilderment. Yet, paradoxically, a place where it is impossible to get lost is also a place from which it is not possible to escape. -------------------- Logistical Media and Black Box Politics By Ned Rossiter Logistical media determine our situation. While the missing flight MH370 is yet to be found, for the rest of us there is nowhere left to hide. The horror of cybernetic extension into the vicissitudes of daily life is now well and truly a reality. CCTV cameras, motion capture technologies, RFID chips, smart phones and locational media, GPS devices, biometric monitoring of people and ecological systems – these are just some of the more familiar technologies that generate data and modulate movement and consumption within the logistical city, or what Friedrich Kittler terms “the city as medium.” The logistical city marks a departure from both the global city of finance capital and the industrial city of factories. The logistical city is elastic; its borders are flexible and determined by the ever-changing coordinates of supply chain capitalism. Populated by warehouses, ports, inter-modal terminals, container yards and data centres, the logistical city is spatially defined by zones, corridors and concessions. It is a city that subtracts the time of dreams to maintain the demands of 24/7 capitalism (Crary). For many, the model has become the world. Our tastes are calibrated and relayed back to us based on the aggregation of personal history coupled with the distribution of desire across sampled populations. Decision is all too frequently an unwitting acceptance of command. The biopolitical production of labour and life has just about reached its zenith in terms of extracting value, efficiency and submission from the economy of algorithmic action. Nowhere is this more clear than in the “sentient city,” where the topography of spatial scales and borders gives way to the topology of ubiquitous computing and predictive analytics in which the digital is integrated with the motion of experience. In the sentient city data becomes a living entity, measuring the pulse of urban settings and determining the mobilization of response to an increasingly vast range of urban conditions: traffic movements, air quality, chemical composition of soils, social flash points. The horror of urban life is just beginning. The dystopia of the present leaves little room for responses other than despair and depression. Resistance to the distribution of power and the penetration of financial capitalism is, as Max Haiven argues, not only futile but all too often reinforces that which it claims to oppose. Resistance is not interventionist so much as affirmative: “finance as we now have it, as a system that “reads” the world by calculating the ‘risk’ of ‘resistance’ to ‘liquidity’ and allocating resources accordingly, already incorporates ‘resistance’ into its ‘systemic imagination’.” In this slaughterous world, the nihilistic option is to find joy in the pleasure of immediacy, consumption and aesthetic gestures of critical self-affirmation.
No matter the foibles of human life, predictive analytics and algorithmic modelling deploy the currency of data to measure labour against variables such as productivity, risk, compliance and contingency. What, then, for labour and life outside the extractive machine of algorithmic capitalism? Can sociality reside in the space and time of relative invisibility afforded by the vulnerable status of post-populations? Can living labour assert itself beyond the calculations of enterprise planning software and the subjugation of life to debt by instruments of finance capital? These are disturbing, complicated questions that require collective analysis if we are to design a life without determination. The politics of infrastructure intersects with the experience and condition of logistical labour and life within urban settings. Logistical labour emerges at the interface between infrastructure, software protocols and design. Labour time is real-time. Logistical labour is more than a unit to be measured according to KPIs. It is the lifeblood of economy and design, exploitation and consumption. Logistical labour underpins the traffic of infrastructure and circuits of capital. But where is the infrastructure that makes these planetary-scale economies, biopolitical regimes and social lives possible? The politics of infrastructure invites a critique of the quantified self, where self-tracking bodies are regulated as data-managed socialities as they move within the logistical city. In the society of compliance, normative measures and standards are set by the corporate-state seeking to expropriate value from labour through regimes of fear, insecurity and self-obsession. There is an element to the profiling of what I would term “post-populations” that is external to the logistical media of coordination, capture and control. I am thinking here of the peasants dispossessed of land in Kolkata who commit wilful acts of sabotage on infrastructure in the new IT towns, and of the proletariat and unemployed around the world who are not governed or managed in the name of political economy, but unleashed as a necessary surplus to capital, which requires relative stability for infrastructures of investment to withstand the assault that arises from social chaos. Yet post-populations, who to some extent can be understood as ensembles of non-governable subjects, can all too often be vital sources of technical invention that is then absorbed into systems of production. (Think of shanzhai culture, and the wild modification of mobile phone features in China.) This is why they are set free, since the parameters of capital accumulation can only be replenished when elements of contingency are programmed into the operational requirements of the logistical city. The problem with the post-digital settings of today is that we are unable to think within the box. We can speak of a politics of parameters, but ultimately this is still knowledge specific to the engineers who design the architectures within which we conjure our imagination. We can no longer harness our imagination, only click on predetermined options. What, therefore, might it mean to design a program of research and cultural practice that exploits the geography of data infrastructure as we know it? When loyalty cards proliferate in our virtual wallets, when coupon systems and location-based services are coupled with payment apps that track our patterns of consumption, we begin to get a sense of how shopping experiences are designed around economies of capture.
To refuse is perhaps to miss out on that sweet feel of the discount, but at least we get a fleeting sense of having preserved our anonymity. Indeed, anonymity becomes a key algorithmic gesture, conceptual figure, and technical mechanism through which we might begin to design a black box politics within the horizon of logistical media. For to be anonymous renders the black box inoperable. References Crary, Jonathan. 24/7: Late Capitalism and the Ends of Sleep. London and New York: Verso, 2013. Haiven, Max. “Finance Depends on Resistance, Finance Is Resistance, and Anyway, Resistance Is Futile.” Mediations 26.1-2 (Fall 2012-Spring 2013): 85-106. Web. Kittler, Friedrich A. “The City is a Medium.” Trans. Mathew Griffin. New Literary History 27.4 (1996): 717-29. Rossiter, Ned and Soenke Zehle. “Experience Machines.” Sociologia del Lavoro 133 (2014): 111-32. Sassen, Saskia. “Unsettling Topographic Representation.” Sentient City: Ubiquitous Computing, Architecture, and the Future of Urban Space. Ed. Mark Shepard. Massachusetts: MIT Press, 2011. 182-89. Shepard, Mark, ed. Sentient City: Ubiquitous Computing, Architecture, and the Future of Urban Space. Massachusetts: MIT Press, 2011. Wood, Friedrich A. “At the Movies.” London Review of Books 36.8 (April 2014). Web. -------------------- Interface Industry: Cultural Conveyor Belts from (Post-)Ford to Jobs By Søren Bro Pold The computer has become a central cultural platform, but in reverse this also means that our culture is increasingly computed. Culture is at the center of IT development, with the potential of changing our understanding and use of technology, and in general it makes cultural content available in ways unimaginable just a few decades ago. However, it also means that specific formats are introduced for cultural content, and this has an effect on the way we get access to culture, the way art and culture can reflect and challenge the computer, and consequently the way computers are designed and packaged. In contemporary interface culture, cultural production is tailored to the tablets’ distribution platforms, and to a large extent becomes shrink-wrapped and fitted to particular formats and predefined settings, e.g. as apps or e-books. Furthermore, while cultural production becomes a new kind of consumption, the consumption of culture also changes. Reading books, listening to music, or watching movies has traditionally been considered a private engagement, but is now integrated as a valuable part of the production chain: the successful prediction of what people will produce or consume deeply depends on processes of monitoring, quantifying and calculating consumption in controlled environments that can predict general behaviors; hence datafication. Culture industry In 1944 Theodor W. Adorno and Max Horkheimer wrote the essay "The Culture Industry: Enlightenment as Mass Deception", famous for coining the term culture industry and describing how the increasing management of culture by the culture industry is deceiving the masses through a 'system' of film, radio and magazines. The culture industry is in their understanding built around a relentless, totalitarian necessity of "never releasing its grip on the consumer" who will "experience themselves through their needs only as eternal consumers, as the culture industry's object."
(113) Adorno and Horkheimer describe the technological development of mass media, which propagates a quantifiable sameness through pacifying broadcasts and thereby eradicates individual, reflective perceptions and interpretations, resulting in a hegemony of capitalism, consumerism and an industrialized lifestyle integrating work and leisure. Resistance is either impossible or will be appropriated, and enlightenment is turned into mass deception. Having fled totalitarian Nazi Germany, they faced in their Los Angeles exile a totalitarian capitalist culture industry manufacturing business and consumers as its ideology. Adorno and Horkheimer's understanding of the culture industry took no account of the computer, simultaneously being developed in research labs across the world, and when the computer became a mainstream cultural platform some decades later, the industrial era was supposedly over. In general, the modern personal computer has been seen as a technology to end the standardization of conveyor-belt capitalism – hence the term Post-Fordism – for example through ideas of increasingly flexible customization and modularization of production processes, leaky borders between producers and consumers as envisaged in the concept of prodUsers, and flexible storage and distribution leading to a broader supply and demand (as in the concept of the "long tail"). Related to reception, we have seen concepts such as interactivity and co-production that point to a more active consumption carried out by the new pro-sumer. From Daniel Bell to Alvin Toffler and Chris Anderson, and through concepts such as the post-industrial society, the information society and the knowledge economy, the networked computer has been tied to movements away from the standardized industrial paradigm. However, with the way that cultural production and distribution are now becoming tailored to the new digital platforms of tablets, smartphones, e-readers and smart TVs, a new cultural interface industry is being constructed, and it has displaced the old culture industry that Adorno and Horkheimer criticized. IT companies are increasingly taking over from the old culture industry – Apple iTunes has taken over from EMI – and instead of the old global media companies we have even stronger global monopolies such as Apple, Google, Amazon and Facebook. Digital culture as not-just content The new interface industry is flexible and efficient, and users get endless cultural content right in their pocket, but the amazing efficiency comes at the price of monitoring, control and strict licensing. Selling and owning cultural products is replaced with licensing and renting – and users' rights are limited to specific patterns of consumption. If the conveyor belt produced standard goods in big numbers and was relatively inflexible towards individual consumer demands, the interface industry thrives on individual choice, consumption and co-production, which are fed back through the interface via detailed monitoring of every consumer behavior. Instead of the standardized production of conveyor belts we get individualization through cybernetic monitoring loops – think of the way Amazon knows your reading taste and can guide you through its vast selection. Building the infrastructure around interfaces allows for a more flexible, fine-grained and intimate culture industry that can change its public appearance according to demands and trends while constantly experimenting with new business models behind the screen.
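To make the cybernetic monitoring loop concrete, here is a minimal sketch – deliberately generic, and not a claim about Amazon's actual, proprietary recommender – in which logged purchases are counted as co-occurrences and fed straight back as recommendations, so that what was consumed determines what is shown next:

from collections import defaultdict
from itertools import combinations

# Each entry is one (hypothetical) customer's purchase history.
purchase_logs = [
    ["dialectic of enlightenment", "the culture industry reader"],
    ["dialectic of enlightenment", "one-dimensional man"],
    ["one-dimensional man", "the culture industry reader"],
]

# Count how often pairs of items are bought by the same customer.
co_occurrence = defaultdict(int)
for history in purchase_logs:
    for a, b in combinations(sorted(set(history)), 2):
        co_occurrence[(a, b)] += 1

def recommend(item, top_n=3):
    """Recommend the items most often bought alongside `item`."""
    scores = defaultdict(int)
    for (a, b), count in co_occurrence.items():
        if a == item:
            scores[b] += count
        elif b == item:
            scores[a] += count
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

# The loop closes: the log of past consumption becomes the menu of future consumption.
print(recommend("dialectic of enlightenment"))

The point of the sketch is not the arithmetic but the feedback: every act of consumption is simultaneously an act of data production that narrows the next offer.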
Earlier IT revolutions were tied to ideas of emancipation and changing the status quo – Apple even once marketed its products with the slogan "Think Different" – but again, seventy years after Adorno and Horkheimer, enlightenment is turned towards deception. While it might be naïve to believe in the slogans of big corporations, the IT industry currently needs visionary thinking to make us buy the next upgrade. Digital culture is at the center of this, and an IT industry compromised by Snowden needs more than U2 to regain our confidence and future imagination. While the cultural interface industry might be the perfect solution to the distribution of traditional cultural content in digital formats, it also limits critical rethinking of the role of, and access to, the computer. Interfaces that keep culture and computer apart not only produce potentially boring digital art, but are also harmful to how digital culture can develop new alternatives, as has earlier been the case with electronic music and literature, software and net-art. IT companies cannot claim the future by marketing pop acts past their prime. Horkheimer, Max, Theodor W. Adorno, and Gunzelin Schmid Noerr. Dialectic of Enlightenment: Philosophical Fragments. Stanford, Calif.: Stanford University Press, 2002. Print. --------------------