II.1 Streaming Archives

In: The Distributed Image
Simon Rothöhler
Daniel Hendrickson
Textworks Translations
Open Access

Wherever there are streams, memory and storage can be formed. Information flows are stabilized as storage content primarily through protocols that have been formatted bureaucratically since the nineteenth century.217 These protocols are distributed by an institution: the archive. Nonetheless, not every operation of storage and memory has archival intentions. Only certain techniques and practices of collecting, retaining, and holding ready are, strictly speaking, associated with an effective intention—at least as a background concern—to preserve materials and pass them on, an intention that finds its institutional expression in archival facilities. The persistence of archived material ensured by storage media—the implicit long-term guarantee that transfer into the archival space will, generally speaking, relieve a selected object of both the temporally induced effects of corrosion and unauthorized access—is thus joined with an archival agenda that is socio-culturally more or less explicit. While not every act of storage can be addressed archivally, the reverse is true: every archive can be traced back to its media-technical storage design—without, however, being completely determined by it.

Archives usually have identifiable sponsors and distinct supporting and administrative media. In their interplay—governed by the formalities of inventory and distribution—they formulate concretely developed archival politics. The processes realized in this formulation sometimes determine, in great detail, which materials are to be included in the institution and which will remain excluded, how (deeply) the organization of the archive—for instance, in the form of the classifications also archived there218—is inscribed in the emergence of the stored material, and under which conditions it can then circulate, be publicly visible, be made available for consultation: »In the archive the stored materials, the organizational principles, and the media that they register, are so intertwined that they cannot be disconnected from one another.«219

When processes and agents of storage are transformed, it is plausible to assume that the media-technological alterations responsible for this transformation have a decisive influence on which objects, which data configurations are fundamentally worth considering as archival materials, and through which modes of protective storage, of taxonomical incorporation and of regulated release these materials can be integrated as archival documents—which can also always be designated as documents of an archival arrangement transmitted by inscription, and regarded accordingly. In other words: which media contents are checked into the archive and at what rates they can be—selectively and temporarily—checked out, how the procedures of registration relate to those of acquisition and cassation, is essentially dependent on distributive versatility, on the distribution processes and potential of the storage media involved. Archives take discrete sections of reality out of present circulation, setting up instead new channels and transport media, by which these stored materials are meant to remain accessible in the future. Archival techniques of preservation, of distribution in storing and releasing objects work together in the model of an agency of remembrance based on fundamental accessibility and marketability.

Media infrastructures and the discursive conjunctures of archives are likewise manifestly interconnected. If architectures are transformed, and in the process the cultural technologies of the archive are transformed as well, this also alters the circulation of archival holdings along with any concurrent archival semantics—such as a mnemonic configuration. As outlined in the previous chapter, a terminological consequence of what can be regarded on several levels as the excessive storage productivity of digital data distribution—including its »increasing traceability« (Latour)—is that once again the concept of the archive is forced to expand its borders. On the one hand, we observe an evident and everyday proliferation of collections of images, texts, and data (including a number of »humble archives,«220 in terms of their genealogy), which, in keeping with the rate of increasing circuit complexity predicted by Moore’s law, can not only be assembled ever more comprehensively and cost-effectively, but can also be ever more effortlessly networked, further distributed, and copied—a dynamic of extension that, with the transcoding pragmatics of digital representation, can also be transferred ever deeper into analog holdings. Data is being stored more frequently and more incidentally (often also secretly and unwittingly), while at the same time we can observe comprehensive processes of conversion in storage transfer. ›Native‹ digital and digitized files flow into the same depots (such as the databases of social media providers), are channeled, processed, and (temporarily) stored by the same protocols, programs, and platforms.
On the other hand, parallel to the spread of digital data, media and cultural studies have worked toward universalizing the concept of the archive as a dominant cultural metaphor and generalizable epistemic figure of interpretation, which in the meantime has resulted in complaints about the »inflation of the archival« and concomitant losses in terminological precision—not least with regard to related agencies of memory like museums221 or libraries.222

The increased scope and dynamic of the term ›archive,‹ however, might lead to permanently risking the concept’s devaluation only if the diagnosis of the erosive forces causing such devaluation remains underdetermined. The focus here is on the question of the implications of the much-touted ubiquity of digital data storage, insofar as they are relevant to archival theory—or more precisely: the question of its operability as the media technology of the archive. What is the basis for the assumed authority of the concept of the archive, and what are its limits? What are the processes that move this concept into »streaming« or set it »in motion« in the first place? As Eivind Røssaak puts it: »The archive itself is […] no longer easy to archive, that is, it is no longer easy to control the concept, to store it, to stop it, to arrest it, and safely guard its meaning or origin from slippery semiosis. We can no longer shelter it from the flow of meanings and uses it is constantly getting involved in. Reflections on the archive today cannot seek the security of a shelter, of an archive; it has to walk out, get moving.«223

The following discussion—as an introduction to material examinations of selected image archive holdings with their own specific institutional agendas: the NYC Department of Records (II.2) and the United States Holocaust Memorial Museum (II.3)—will outline what it might mean to understand digital archives (with retro-digitized archival records and remote access) in media-logistical terms in the sense argued here, namely as streaming, (temporarily) stored, computer calculable data depots.

This will not be a matter of tracing the discursive ramifications of, or constructing a conceptually updated refuge for the »archival turn«.224 Rather, my interest here is: Which distribution calculations work in and about archival configurations that are concretely »in motion« insofar as they are operated in the media technologies of streaming, of ›real-time‹ instantaneous distributability? How do these archives, which are charged with special transport orders, calculate their contents? And what do the stream-like aggregate states of networked documents deposited in this way in turn mean for the distributed image, which is the focus here? What ›histories,‹ what media historiographies has it stored, how do new archival codes get inscribed and old ones rewritten? What genealogy of distribution technology does the data stream transport into the present of digital access to material?

The last section of this introductory chapter, which makes a general argument in terms of archival theory largely without specific reference to the image, is meant to open up a media-historiographic perspective in preparation for such questions. In subsequent steps—under the headings Long-Distance Photographs (II.2) and High-Frequency Videos (II.3)—this perspective will shine a media-archaeological light on distributive constellations, which are also archived to a certain degree, particles in the general data stream. As noted at the outset of this study: The image did not first become technologically distributable when it acquired the capacity to be calculated and transported by computer network protocols. The histories that become specifically distributable in the digital image can also be further interrogated for the media histories of technological image distribution that they distribute and thus carry forward into the present.

II.1.1 Transmission and Temporary Storage

In terms of distribution, as should already be clear, the operation of streaming archives is not in any way unregulated and ephemeral, but predictable and trackable in great detail, down to their microtemporal processes. Every datagram, every transport operation has a signature, leaves a trace, which can be verified and used for calculations. In the expanding discussions about digital archives in media theory there are various approaches—which in the final analysis are positioned in entirely similar ways—to conceptually grasping the computational archive dynamics between stream and storage, between data flow, data feedback, and data sedimentation. In Adrian Mackenzie’s reflections on the substructure of temporal dynamics in current »information networks« the archive operates as an almost oppositional movement—as a »drive,« which works against the (assumed) permanent presentness of real-time computation, thus bringing together different models of storage management:

The ever growing totality of inscriptions that weave the text of the Internet networks, with its mass storage and data warehousing systems, are the product of the archive drive. While real-time produces the temporal ›here and now‹ of virtual culture, the archive drive produces a locational ›there‹ composed of texts, images, indexes and records. […] Examples are: (i) the movement of existing printed and recorded texts into the space of the electronic archives: everyone from Bill Gates to the Louvre is involved in that translation; (ii) the business of ›data-mining‹ that is increasingly the backbone of corporate profitability in an increasingly real-time market through processing and modelling the archives or ›data-warehouses‹ of previous transactions.225

The increasingly full data depots of distribution therefore react, at least indirectly, to the finite signal transit times of transport packets, which are given a concrete ›lifetime‹ with every distribution operation on their way through the computer networks—a lifetime that is called exactly that: the »Time to live« (TTL) field in the header tells a data packet how long it has to reach its destination. In this sense, the protocols responsible for the transfer contain bad news for the information concerned, namely deadlines (the counter variable is called »hops« and refers to computer relay stations, to the passage between network intersections). For Mackenzie the archive drive processes this specific mortality—»that no information can live in the networks forever«226—by regarding the circulation data themselves as storage contents. The archive drive compensates for the ›death‹ of data that occurs in the timeliness of the transfer with deposited transfer and transaction data storage, which can also be commodified according to the requirements of real-time analytics tools—that is, ›in real time.‹
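The TTL mechanism described above can be miniaturized in a few lines. The following toy forwarding loop (a sketch under invented names, not a real network stack) decrements the header field at each hop and discards the packet when the counter reaches zero:

```python
# Toy sketch of the IPv4 "Time to live" header field: each relay station
# ("hop") decrements the counter; at zero, the packet is discarded rather
# than forwarded. All names here are hypothetical.

def forward(packet, route):
    """Carry the packet across the route, honoring its TTL 'deadline'."""
    for hop in route:
        packet["ttl"] -= 1
        if packet["ttl"] <= 0:
            return f"dropped at {hop} (TTL expired)"
    return "delivered"

packet = {"payload": b"archival fragment", "ttl": 3}
result = forward(packet, ["router-a", "router-b", "router-c", "router-d"])
# result == "dropped at router-c (TTL expired)"
```

The packet with three remaining hops never reaches the fourth relay station; this is the ›deadline‹ the transfer protocols impose on the information they carry.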

Independently of this, the data holdings of the archive are augmented by a collection of archive usage metadata, which in turn, processed accordingly, has a profound reflexive effect on the collection. David Berry speaks here of the »second-order archive« and »computational rationalities,« which the archives ›cybernetize,‹ so to speak, in that feedback logics are able to control what can be collected and how it can be recorded, searched for, and found: »[…] this reflexive database (metadata) of the archive’s use and motion can be used to fine-tune, curate, and prune the archive algorithmically.«227

According to the model of this description, the process of archive consultation—at every step along the way: from the initial request and the concrete transport modalities to the successful delivery and receipt of the archival material by the user—is permanently integrated into the calculation of the archive drive. Mackenzie is not alone in observing that its operational mode derives from the processor-memory gap of the von Neumann architecture.228 Robert Gehl also sees real-time drive and archive drive, the two fundamental network dynamics, as inferable from the relationship between the central processing unit (CPU) and various working and temporary memory modes (where so-called arithmetic registers operate within the processor core, at the top of the memory hierarchy). According to Gehl, they are based on »the development of the modern computer itself, which is a synthesis of the immediate (in the form of the CPU or processor) and the archival (in the form of memory and storage of data).«229 This dichotomy is indeed understood conceptually as absolute, but it ultimately describes a division of labor, because the instruction cycle (fetch-execute) is configured cooperatively. Under computer network conditions this logic extends further to the distributed memory—which also draws on the data productivity and connectivity of the end user device230—provided that new connections between (combined) processor performance and (distributed) data resources are possible:

Memory capacity has grown tremendously, leading to today’s terabyte drives that store vast amounts of information. This information must be routed to the processor. To do so, computer architects have developed busses, short-term caches of memory, and dedicated pathways for instructions and data in order to link them. Thus, we have a basic architecture: processor, memory, and the path between the two. Computer engineers seek to optimize the relationship between memory and the processor to create an ideal synthesis of the immediate and the archival. In Web 2.0, the path between the user/processor and the archive is the broadband Internet connection.231
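The path between processor and memory that Gehl describes, with its »short-term caches« mediating between the immediate and the archival, can be illustrated by a toy cache simulation. This is a minimal FIFO sketch with invented names, not a model of any actual hardware:

```python
# Toy memory hierarchy: a small, fast cache sits on the path between a
# large, slow store (the "archival" side) and the processor (the
# "immediate" side). Hit/miss counts show the division of labor.

CACHE_SIZE = 2
cache = {}
store = {addr: f"data@{addr}" for addr in range(8)}  # slow bulk storage
hits = misses = 0

def fetch(addr):
    """Serve a processor request, filling the cache on a miss (FIFO eviction)."""
    global hits, misses
    if addr in cache:
        hits += 1                         # served from temporary storage
    else:
        misses += 1
        if len(cache) >= CACHE_SIZE:
            cache.pop(next(iter(cache)))  # evict the oldest entry
        cache[addr] = store[addr]         # fill from the slow store
    return cache[addr]

for addr in [0, 1, 0, 2, 1]:
    fetch(addr)
# hits == 2, misses == 3
```

Even this crude model shows why engineers, as the quotation puts it, »optimize the relationship between memory and the processor«: repeated accesses are served from the temporary tier rather than the archival one.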

If one follows this argument, all the information flows of networked computers—even those that transfer archival data in the narrower, more institutional sense—are based on the operational logic of temporary memory. In terms of archive theory, this opens up a theoretical problem of memory at the level of the temporal horizon that is invoked, more precisely, between memory and storage,232 because the fact that data is shifted out of time for the short-term during data processing seems hardly compatible with the storage ideals, guided by persistence and duration, of conventional concepts of the archive.

In his work on archival and memory theory, Wolfgang Ernst also assumes that the transient storage processes that take place on the CPU of the digital computer represent the inescapable starting point in media theory for any understanding of the wide variety of computer network technological intensifications and ramifications of the »post-archival constellation.«233 In Ernst’s Das Gesetz des Gedächtnisses [The Law of Memory], which was published a good ten years ago, the data transfer operation of streaming figures prominently in the final chapter as a comparatively new web-based media technology, still struggling with broadband limitations. According to Ernst, however, it is nonetheless paradigmatic of a development to which his entire book is dedicated, namely the substitution and reinterpretation of classic archival storage processes by processes wired for permanently temporary storage and transfer.

The technological operativity of memory that thus becomes dominant begins to transform all archival processes, replacing long-term, document-stable storage with the performative principle of instantaneous regeneration on demand, replacing one-time safeguarding with the necessity of responding to digital obsolescence with routines of continual data migration and archives of backward compatibility that are capable of emulation.234 Where there were once archive buildings, now there are increasingly relational databases in operation; linear numbered shelves are replaced with complex tabular keys and URL addresses, topographical localizability with comprehensive dynamics of temporalization. Where a temporal immobilization of the time of a stored document was once associated with a »transfer along the axis of time« (Winkler), there now dominates a logic of regeneration and updating, of update and refresh cycles, that is, the switchable real time of an incessantly present transfer of aggregate data:

In essence, it is the case for streaming media (for instance, RealAudio) that signals are not first completely loaded into storage in order to then be heard, seen, or computed, but rather a constant flow of compressed data packets between sender and receiver is maintained so that temporary archives are nested in the act of transfer itself […]. The fact that spatially defined archives are being converted to temporary archives results in the streaming archive. In place of the resident emphatic storage comes dynamic temporary storage, the transfer channel itself becomes an ›archive of time,‹ a dynamic archive.235

As a media technology of continuous temporary storage—which is also located, for instance, in the browser cache (and from there can be redirected to the hard drive memory and ›stabilized‹ as a file by means of download managers like ClipGrab)—the data distribution model of streaming, according to this argument, only continues what is already established in the data processing routines of the processor: transient storage or »technomathematical copying«236 of temporary results, dominance of transfer functions, dynamic-temporalized storage performance.
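Ernst's »temporary archives nested in the act of transfer« can be reduced to a sketch: a bounded client-side buffer (comparable to a browser cache window) through which every chunk passes, and in which only the tail of the stream persists once delivery is complete. All names below are illustrative:

```python
from collections import deque

# Minimal sketch of streaming's "dynamic temporary storage": chunks are
# stored only in transit, in a bounded buffer; after the transfer, just a
# small window of the stream remains resident.

def stream(chunks, buffer_size=3):
    buffer = deque(maxlen=buffer_size)  # the channel's temporary archive
    delivered = []
    for chunk in chunks:
        buffer.append(chunk)            # held momentarily, in passing
        delivered.append(chunk)         # handed on to playback immediately
    return delivered, list(buffer)

delivered, cache = stream([f"pkt-{i}" for i in range(6)])
# delivered holds all six chunks; cache retains only the last three,
# ['pkt-3', 'pkt-4', 'pkt-5']
```

The receiver thus never holds the whole object at once; what it holds is a moving window, which is precisely the sense in which the transfer channel itself becomes an ›archive of time.‹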

II.1.2 Depots

The fact that such distributed streaming data are nonetheless situated in a relatively fixed state on a server is consequently less significant than the circumstance that, by definition, they are continuously waiting there for querying clients, waiting to be transported and for data feedback to be triggered. From this perspective, even on the server side—beyond the half-life problematic of digital storage related to the carrier media237—the data in question are less archived than they are deposited, that is, stored in a way that is initially conceived more in terms of constant deliverability than permanent preservation.

In short, archives are typically permanent repositories, whereas depots are distribution centers. Their functionality essentially derives from their logistical position within a transport chain. Rather than closed horizons, this is a matter of determining relationships. This is why the terminology of storage logistics ascribes to the depot a bridging function, whose primary purpose is to maintain a constant level, not of material, but of material flows. Stability requirements do exist, but they accordingly relate not to establishing and preserving inventory long-term, but to maintaining a productive circulation, a consistent movement that regularly runs through the depot, although it is not stopped there, but merely regulated via feedback signals. The focus is therefore on rates of flow, stream volumes, issues of transmission. Insofar as archival processes are associated with stopping time, the question arises as to what extent they can still be achieved, if at all, on the basis of the process and access times outlined above, which necessarily go hand in hand with the mobilization, the flow management of digital data from the CPU to migration and update cycles.

Anyone wishing to use the term »archive« under these conditions will at any rate have to adapt it to the technical conditions of the media and will need to consider the transfer costs. If »streaming archives« is to be more than simply an oxymoron, there must be some reflection on what archival concepts, what initiatives of archival action can be implemented on the basis of digital data depots. At what levels are the modifications effective, where do they come up against the limits of an operativity of media technology that, at least at the level of »data storage becoming time-sensitive,«238 initially seems to be arranged in a way that is ›anti-archival‹?

Martin Warnke, with good reason, shares Wolfgang Ernst’s reservations about applying the term »archive«—which tends to be understood as dedifferentiatedly mnemosemantic, but is ultimately often purely metaphorical239—to the already difficult-to-define (in part because it is topologically distributed240) ›totality‹ of the Internet. Inasmuch as it tends, due to its basic network infrastructure, toward permanent self-overwriting, toward processual currency and presentness—that is, in many essential respects it consists of a coupling of transient packet shipments and dynamic, on-demand generations of connection, which, for their part, as in search engine results, are not only not archivable by the user, but are not even self-identically reproducible at any given later point in time241—a dissolution of archival logics into flexible transfer processes and constantly renewing acts of communication seems somewhat evident. Where there was once the archive, there now operate archival calculations, which have long since transcended formerly established institutional boundaries.242

Warnke considers it crucial to not pin down the archive wholly in terms of space and structure, but instead to locate it fundamentally in terms of interconnected activities and horizons of updating:

The internet is obviously a mixture of pure transport and temporary storage, which for data packets never lasts more than a few seconds. The question of the lifespan of Internet documents thus depends on the end device: the transmission packets disappear on their own. What is no longer on the server can no longer be reached, and the notorious ›Error 404, document not found‹ appears instead. The Internet itself is thus obviously unsuitable as an archive. […] Archives, especially digital ones, only last when they are constantly being used, when a preserving entity continually re-codifies, re-interprets, and re-evaluates them, actively acquiring their documents, publishing or concealing them, thereby enabling and structuring knowledge, provoking or seeking to suppress actions. Only in this way do digital archives survive the decades.243

The insight that it would be terminologically misdirected to describe the Internet in toto as an archive—because all that can be identified, even on the level of transfer protocols, are structures of limited duration and relative stability244—does not imply that web-based archival processes are conceptually unimaginable. It is uncontested, however, that a conversion takes place, which Wendy Chun calls »dynamic preservation«: »[…] to ›store‹ something digitally, one often destroys what actually exists […] when ›saving‹ a file, one writes over an existing one. […] To access repeatedly is to preserve through construction (and sometimes destruction).«245

Under digital conditions, practices of saving often involve copying, overwriting, regenerating, and above all distribution.246 Locking away the originals does not have conservational value here, in the sense of the probability of future availability of storage memory; rather, this purpose is served by strategies of maximizing distribution. The more distributed the storage, the safer it is—this formula, which is initially counterintuitive from a traditional archival perspective, is made plausible at least by the persistent effects of certain data collections, which have been discussed as »already distributed archives«247 (Sebastian Lütgert) and are organized through collaborative filesharing protocols such as BitTorrent trackers:

The astonishingly resilient archiving practices around […] the Pirate Bay, and the even more virulent promise of actual or imaginary archives far beneath or beyond them—if, for one moment, we could step outside the age of copyright we all inhabit, and fully embrace the means of digital reproduction most of us have at our disposal—not just directly follow the trajectory traced by Benjamin and Langlois, but extend it to a point in the not-so-distant future where we will think of archiving primarily as the outward movement of distributing things: to create ad-hoc networks with mobile cores and dense peripheries, to trade our master copies for a myriad of offsite backups, and to practically abandon the technically obsolete dichotomy of providers and consumers. The model of this type of archive, its philosophical concept, would be the virus, or the parasite. And again, this model also allows us to make a tentative assessment of the risks and dangers of outward archiving: failure to infect (attention deficit), slowdown of mutation (institutionalization), spread of antibiotics (rights management), death of the host (collapse of capitalism).248
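The formula »the more distributed the storage, the safer« admits a back-of-the-envelope check: if each independent host fails with probability p, a file replicated on k hosts is lost only when all k fail at once, that is, with probability p to the power of k. A toy calculation (the per-host failure rate is of course invented):

```python
# Back-of-envelope for "the more distributed, the safer": assuming an
# independent per-host failure probability p, a file mirrored on k hosts
# is lost only if every replica fails, i.e. with probability p**k.
p = 0.3  # hypothetical failure probability of a single host
loss = {k: p ** k for k in (1, 2, 4, 8)}
# loss[1] == 0.3, loss[4] == 0.0081; eight mirrors push loss below 1e-4
```

The exponential decay is what makes »outward archiving« plausible as a preservation strategy, independence of failures being the (strong) assumption.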

Even beyond these »viral« archival dynamics249—which are based on the duplication of digital copies by means of distributed data references, and thus also on distributed depositing of data in ›mirror‹ archives—it is the case that while the Internet may well be unarchivable and not an archive in itself, it does distribute archival agendas, practices, and holdings. These connections and operations, or more precisely: these institutionally bound material collections, their distributed contents, interfaces, and infrastructures, will be examined in the following chapters, which are intended to develop more concrete readings of digital image archives.

II.1.3 Transmitting the Institutional

The heuristic restriction to institutional archival agendas and intentions that has been chosen here is justified in part by the multifaceted empiricism of a conjuncture that can be observed beyond the filesharing practices that show an affinity to archives. More and more traditional archival facilities are transcoding their reservoirs, establishing platforms, entering into social media forms of communication—generally speaking, they are organizing themselves through a hybrid combination of online and offline processes, integrating digital operations on a wide variety of levels into their institutional architecture and practice. The implications of the »connective turn« (Andrew Hoskins)250 concern archival materials as much as archival structures, which are likewise included in and thus reformulated by digitization processes. The result is novel archival objects and archival operations that intervene in different ways. Archives now routinely deal with digital holdings, while databases have become important archival agents (equipped with a significantly expanded agency in comparison to analog lists, catalogs, registers), without interrupting continuities at the level of archival practice.251

In this context—against the backdrop of a correlative determination of storage memory and functional memory—Aleida Assmann has spoken of a complementarity:

In the institutions of the archive and the Internet, respectively, two complementary memory operations can be differentiated, which relate to one another as do ›saving‹ and ›retrieving‹: the archive fulfills the desire for reliable material conservation and long-term safeguarding of information, the Internet fulfills the desire for speeding up the flow of data and immediate access to information. Both institutions are essentially based on the new technologies of digitization, but they differ in turn in the use that they make of these technologies. In the archive they serve to safeguard and preserve data, in the Internet they facilitate an acceleration of the flow of information and the increase in acts of communication.252

Referring to the Internet as an »institution« seems somewhat questionable—however, the pragmatics this designation describes are plausible, namely a division of labor that had already led to an »unrestricting«253 of the archive by information technologies, to other modes and ranges of access. But the fact that, from the perspective of media technology, this division of labor might be better described with the aid of a re-entry figure—to the extent that digital processes of retrieval are always operationalized by a relay between search requests and (temporary) storage, and furthermore that under the conditions of computer network protocols the request itself can also (potentially for the long term) become the stored content (initially, for instance, in the form of IP addresses automatically registered in a database)—already suggests that the concern must be with relating the new archival practices (regarding storage, safeguarding, classification, and also consultation) to the performativity of the storage media under discussion here. Particularly if it is true that, wherever storage units change, institutional ideas of »archivability«254 must also undergo a transformation. This is the case, for instance, for data material that cannot be stored anywhere in analog form because it was generated (or ›born‹255) digitally from the outset and now becomes an object of archival ›desire,‹ which is already conceived extensively and strives »to transform ever larger areas of the world into an archive by means of external storage, independent of the subject.«256

If more and more communicative practices and information objects are constituted digitally, thereby becoming ever larger »content streams,« from the perspective of the archive this creates problems of selection related to storage technology and normative questions on the one hand, while on the other hand it is also apparent that legitimate archival mandates arise in the digital realm—whether this concerns the Twitter account of a US president or the more ordinary digital routines of official document management, which continue to produce »paper knowledge,«257 no longer in the form of paper, but rather as records management in data files: the »immeasurably increased data processing power at the end of the era of paper documents.«258

In any case, the increasing digitization of »areas of the world« that are fundamentally of archival value is carried out against the backdrop of distributed memory networks. For Sven Spieker, the concomitant spatial diffusion of the archive entails its dissolution into environmental knowledge expedited by information technology:

As locations, archives have always depended on the rigorous distinction between (their own) inside and its outside. […] The archive today functions less as an inside defining itself against what surrounds it than as an environment without outside where what is archive and what is not is increasingly difficult to tell. We do not »enter« the archive, we are in it. In this environment, information is not deposited; it drifts like a cloud. To refer to the archive as a cloud is to suggest that in our globalized world information is a naturally occurring, ubiquitous commodity not tied to any location or a specific cultural technique.259

The crucial point here remains that digital archives not only store different things, they also store them differently. As Spieker implies in referring to the operations of cloud computing, this has consequences for storage usage as well. Because digital archives store their goods dynamically, this results in changes to the modes of storage and the means of access alike. When storage spaces are generally becoming transfer storage, the modes of archival opening and closing are also transformed, and the balance between access and preservation has to be readjusted. Following from the work of Wolfgang Ernst, Jussi Parikka summarizes this adjustment as follows: »[…] the archive is becoming less a stable storage place and increasingly a function of ›logistical interlinking‹ […]. Archives are suddenly not only about storing and preserving but about transmitting […].«260

With respect to media logistics, it should be noted at this point that contemporary depots often appear to the casual viewer to be disorganized—especially when they are not run according to hard drive systematics. Logistical theory speaks of chaotic or free storage261—but this does not denote a fundamental lack of order in technological storage management; rather, it is an intentionally distributed efficiency model of precise addressability. Goods here are not assigned to any fixed storage spaces predetermined by a classificatory system, but simply to those spaces that happen to be free, available, the right size, and directly accessible at the time of storage. The storage management technology of the depot is put »in motion,« increasing mobility through an orientation toward operations of flexible temporary storage.
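The logistical principle sketched above can be made concrete in a few lines of code. The following is a minimal, purely illustrative sketch (all names are invented, not drawn from any real warehouse management system): goods are not filed under fixed, classificatory positions but placed in whatever slot happens to be free, while an address table keeps every item precisely retrievable.

```python
# Sketch of "chaotic" (free/random location) storage: position is arbitrary,
# but the address table guarantees precise addressability at all times.
import random

class ChaoticDepot:
    def __init__(self, num_slots):
        self.free_slots = set(range(num_slots))  # any free slot will do
        self.address_of = {}                     # item id -> current slot

    def store(self, item_id):
        # No classificatory placement: pick any currently free slot.
        slot = random.choice(tuple(self.free_slots))
        self.free_slots.remove(slot)
        self.address_of[item_id] = slot
        return slot

    def retrieve(self, item_id):
        # The position was arbitrary; only the address was stable.
        slot = self.address_of.pop(item_id)
        self.free_slots.add(slot)
        return slot

depot = ChaoticDepot(num_slots=100)
depot.store("crate-17")
# The item's position is contingent, but its address is always resolvable:
assert depot.address_of["crate-17"] in range(100)
```

The efficiency gain lies precisely in decoupling the order of placement from the order of retrieval: storage spaces are filled as they become available, and the system's knowledge consists solely in the address table.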

The tendency to replace the fixed location system with the random location system262—a major trend even before GPS since the implementation of radio-frequency identification (RFID) using microchips,263 a trend that by now has closely linked the fluid positioning of transport goods and transport work(ers)264—is comparatively more similar to the archive than, for instance, the library.265 Nevertheless, even from archival standpoints the location of storage is not an arbitrary position. In converting to digital processes, which in many respects rely on relational databases, the coordinate system of media technology is altered. It is not the position here that is stable, only the address. Even in data depots, archivally received and discharged information is not deposited randomly, but repeatedly »structured«266—in relational databases, in the strict sense of the term, it is structured in such a way that end users do not have to be aware of and understand the data space, the inner systematics of storage allocation, in order to be able to carry out meaningful data operations, as David Gugerli has asserted in reference to the pioneering work of Edgar F. Codd: »Codd promised that a relational database structure would serve a substantially larger group of users who might be naive about information technology but who knew how to query.«267
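Codd's promise quoted above can be illustrated with a minimal relational query (the schema and records here are invented for the example): the user names only tables and attributes, never physical storage locations, and thus need not know the inner systematics of storage allocation in order to query meaningfully.

```python
# Declarative access in a relational database: the query states *what* is
# wanted, not *where* it lies on disk. Schema and data are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER, title TEXT, year INTEGER)")
conn.executemany(
    "INSERT INTO records VALUES (?, ?, ?)",
    [(1, "Court file", 1924),
     (2, "Inventory card", 1931),
     (3, "Police photograph", 1924)],
)

# The naive-but-querying user addresses attributes, not storage positions:
rows = conn.execute(
    "SELECT title FROM records WHERE year = ? ORDER BY title", (1924,)
).fetchall()
print(rows)  # [('Court file',), ('Police photograph',)]
```

Only the address (table, attribute, key) is stable; where the database engine physically deposits the rows remains invisible to, and irrelevant for, the person querying.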

II.1.4 Remote Access, Granular Exploration

When database management technologies enter into the archive, modifying its storage operations, but above all establishing novel modes of direct access, this changes how users interact with the collection as a whole as well as the operability of discrete archival objects. On the basis of database management systems, the entire holdings are prima facie less closed off to the archive user, more immediately understandable and searchable. They are articulated instantaneously and algorithmically. Facilitated by queries, a form of pragmatic transparency is produced, one based, however, on black-box technologies that also tend to make the arrangement of the archive—its infrastructures, processes, and the media that support it—invisible.268 If conventional archive consultation was sometimes a more laborious and costly process, more hierarchically structured, defined by entry barriers, viewing prohibitions, and the communication of institutional authority—Arlette Farge has impressively described the institutional rituals and »entrance permits«269—now it is governed by low-threshold remote access and navigation tools optimized for operation.

The historian Lara Putnam approaches this conversion from a meta-historiographical perspective: »Web-based full-text search decouples data from place. […] Digital search offers disintermediated discovery. Algorithms fetch for us, doing away with the need for intermediaries like brick-and-mortar stores […].«270 The fact that the encounter with archival objects is always also an encounter with archival policies that one must »obey«271 nolens volens—even if from the labyrinthine opacity of the archival space there comes the digital promise of an unmediated view (which is de facto algorithmically and microtemporally determined)—remains a relevant point for archival reservations, even under the conditions of »visibility depots«272 regulated by computer networks. At any rate, the storage objects here do not have to be made accessible linearly or by directly tracing storage practices of hierarchical classification; rather, they are sorted instantaneously on demand, in ›real time‹—whether as inventories that are dynamically constructed and reconstructed, or, in the case of digitized copies stored in the form of databases or digital objects without analog antecedents, in direct connection with playing out and visually (re)generating the requested data sets through software.

This access is no less ›mediatized‹ at any level, only mediated differently. Tendencies toward opening up and democratizing, toward an increasing popularity of the archive,273 stand in contrast to new kinds of opacity and concentrations of power. The fact that the haystacks of the secret services as well as the big data collections of private enterprise comprehensively register digital acts of communication, storing and processing them over long periods of time, has entered into large segments of the public discussion—even if more significant consequences for the politics of information and data protection laws have thus far failed to materialize, even post-Snowden. The once dominant concern about the shelf life of carrier media and data rot seems to have been largely replaced by a skepticism about storage that does not forget anything.274

A discursive effect associated with these widespread reservations addresses the dimension of access to digital archives. Access is again being discussed more strongly in relation to the asymmetries of access to storage. Thus dealings with the newly emerging archival databases can also only be understood against the backdrop of those »digitally fostered values«275 that are shaped above all by the use of commercial search engines and platforms.276

From an archival theory perspective, alongside the general extensions of storage, it is also important that digital collecting technologies have »invasive« tools at their disposal, which not only process the full inventory through information technology, but also facilitate fine-grained explorations of documents: »as access not only to, but into the documents themselves.«277 Lisa Gitelman has narrated this evolution as a media history of the document, which arrives in the present day of digital archival practices with the portable document format (PDF):

Unloved or not, the portable document format has succeeded by dint of the ways in which it imagines and inhabits the genre of the document mobilized within the digital environment. The format prospers both because of its transmissiveness and because of the ways that it supports structured hierarchies of authors and readers (›workflow‹) that depend on documents. […] Using a file manager application to look on your own hard drive for a PDF is something like rooting through a filing cabinet, if you could ever root through files paying attention only to file names and locations, and not to things like thickness or signs of wear. And if you can let go of the idea that the document you call to the screen is actually entirely the same (rather than just looking the same) each time you call it up. Searching computationally for PDFs is different, though, both because searching can rely on data and metadata that go beyond file names and because of the ways that today’s searchable databases, at least, render location as relation.278

The PDF—the dominant format for digitally distributed written documents—thus reinstalls a hierarchy between readers and authors, readings and revision, while physical-material document information is replaced by the theatricality of software. Software ›performs‹ the supplied files following a »chain of command«279 originally modeled on the workflow specifications of private-industry »file management« (Vismann), staging the written information as a new ›piece‹ each time it is opened.

For Gitelman, what is decisive in this context is that originally analog »paper knowledge« becomes digitally accessible in the mode of the self-invoking citation: »Whatever else they are, digital and (even more so) digitized documents appear as pictures of themselves.«280 Printedness is then expressed without paper, as a compressed image of a page of text, optimized for transmission, which may not be changed without authorization, but can be examined in great detail as a content stream. If such a document is digitized using optical character recognition (OCR) software, the information it contains exists as discrete values, which database management technologies can compute literally down to the last comma. A direct consequence for historians—the »prototypical«281 archive user group—consists in the possibility of searching entire collections micro-historically for any arbitrary alphanumeric character string, as Putnam notes: »Text-searchable sources make it possible to trace individual people (or songs, or pamphlets, or phrases), allowing us to observe at the micro level the processes that generate, in the aggregate, macro-level flows and connections.«282 From this perspective, digital archive documents constitute data pools that can be evaluated in great detail for granular historiography,283 which can easily be scaled up globally based on networked holdings.
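The micro-historical tracing Putnam describes reduces, technically, to a very simple operation: once OCR has turned pages into discrete character values, any string can be located across an entire corpus in a single pass. A minimal sketch, with an invented three-document corpus:

```python
# Full-text tracing across a (fictitious) digitized corpus: the same string
# is followed through documents that no classification ever linked.
corpus = {
    "letter_1897.txt": "Maria Sanchez wrote from Havana in March.",
    "ledger_1902.txt": "Payment received from M. Sanchez, Kingston.",
    "pamphlet_1905.txt": "Workers of the port, unite!",
}

def trace(query, corpus):
    """Return every document (with character position) containing the query."""
    hits = []
    for doc, text in corpus.items():
        pos = text.lower().find(query.lower())
        if pos != -1:
            hits.append((doc, pos))
    return hits

print(trace("Sanchez", corpus))
# [('letter_1897.txt', 6), ('ledger_1902.txt', 25)]
```

What the sketch makes visible is the decoupling Putnam emphasizes: the search cuts across documents regardless of their provenance or place in any finding aid, and it finds nothing that has not first been datafied.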

The manifest limits of this form of translocal archival evaluation arise on the one hand from unequally distributed resources for digitization, from the disparity between the Global North and Global South, from the consequences of the digital divide for the accessibility of subaltern archives.284 Whatever resists in analog form is not distributed; it initially moves into a comparatively peripheral orbit and ultimately loses its connection to the preferentially utilized archival flow of information. The non-digitized accumulates disadvantages in the economy of visibility and ends up on a path of increasing inaccessibility that can only be reversed at certain points: »[…] the Archives are hemorrhaging visitors as people believe they can access everything online. And the reliance in the capacity of digital search can mean paradoxically that less is found, for example, in the loss of the interpretative complexity embedded in the material and in the ›contextual marsh‹ of paper records.«285

On the other hand, however, genuinely digital forms of source readings emerge that have reached numerous disciplines under the label of Digital Humanities and shifted their theoretical self-understandings into a new range of questions.286 Documents are transformed into data sets, which can also be processed in ways that go beyond database queries initiated for general research purposes. There is a trend toward replacing »immersive reading«287 that is bound to a certain place with an epistemology that, from any given location, subjects large amounts of data to scanning that is tentative, easily variable, and capable of serial comparison. »Applying computer power to historical documents«288 begins with commonplace database search queries and ends in various models of »non-human readings of storage,«289 in an adjustable distant reading of digital or digitized documents. The ›readers‹ of streaming archives accordingly also include algorithms that are formatted to be eager to learn (namely, increasingly ›self-learning‹ algorithms290).
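The shift from »immersive reading« to serial, comparative scanning can likewise be sketched in miniature (the documents and terms below are invented): a distant reading does not read any single text but computes a series across the whole set and compares it with other series.

```python
# Toy distant reading: frequency of a term per year across a corpus —
# a series to be compared, not a text to be read.
from collections import Counter

documents = {
    1890: "strike wages strike factory",
    1900: "wages factory factory output",
    1910: "output output strike output",
}

def term_series(term, documents):
    """Occurrences of one term per year, ready for serial comparison."""
    return {year: Counter(text.split())[term] for year, text in documents.items()}

print(term_series("strike", documents))  # {1890: 2, 1900: 0, 1910: 1}
print(term_series("output", documents))  # {1890: 0, 1900: 1, 1910: 3}
```

The epistemic object here is the deviation between series, not the meaning of any page; scaled up, this is the kind of »non-human reading of storage« the passage above refers to.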

Michel de Certeau foresaw the impending advent of computer data processing in the historiographic use of archives already in the 1980s and warned of the modes of reading that would prevail there. The fact that archival collecting is not a passive storage process, that ›history‹ is generally not so much read and reconstructed from storage as it is invested in storage and »made« by »redistributing« archival objects—an epistemological process that must necessarily be realized in (ideally reflexive) dialogue with the organization of the archive—is translated with the »technological institution« of the computer into the granular evaluative calculations of computable data pools. On the formalized basis of technological storage, ›history‹ becomes adjustable and programmable from the archival data room, becomes the temporary storage product of »relational models,« which derive historical ›meaning‹ from serial calculating operations, from »model changes,« as Certeau writes:

[History as it has been practiced in the past] customarily began with limited evidence (manuscripts, rarities, etc.), and it took as its task the sponging of all diversity off of them, unifying everything into coherent comprehension. […] The often monstrous quantitative development of the search for documents had the result of introducing into the interminable process of research the very law that made it obsolete as soon as it was completed. A threshold has been passed, and beyond it this situation is inverted. […] With the computer, the quantity of information that can be studied in relation to these norms has become endless. Research has totally changed. Based on formal entities that are deliberately put forward, it is drawn toward the deviations that are revealed through logical combinations of series. It plays on the limits of models.291

Instead of beginning with vestiges that must be extrapolated into synthesis, the computer represents a historiographical procedure that presupposes »formalizations« that, aside from quantifiable results, only produce surpluses at the margins of the model. What falls by the wayside, according to Certeau, are forms of historical understanding that rely on ›incomputable‹ readings of a heuristically collected, decidedly selective body of sources.

Even a few decades of media history later, however, this could be countered by the fact that the amount of information is still not endless, nor has its growth plateaued, even if the prophets of big data would like to imagine the timeline both back into the past and forward into the future as navigable at will. The limited machine readability of the historical world, as we will see in chapter II.3, can be demonstrated directly with regard to the particularly resistant digital image repertoire. But even the aforementioned OCR software demonstrates that the expanded computability of document areas and types, datafied to varying degrees, proceeds fundamentally step by step.292 Potentially processable archival data holdings are undoubtedly being expanded, but this dynamic, firstly, encounters resistance and, secondly, has a history of its own that can be told, for instance, as the media history of archival access.

II.1.5 Pastness, Nowness

The idea of an »endless« amount of information in the sense of an automated, complete transcript of the past in data form ultimately belongs to the often dystopian narrative of an assault orchestrated by real-time technologies on ›historicity‹ itself. The supposed permanent ›presentness‹ initially appears here as a discursive figure that emerges, like an inevitable temporal consequence, so to speak, from the general opening dynamics293 of streaming archives. These dynamics range from intra-archival networks and the standard of remote access to algorithmically opened or ›readable‹ documents, leaving in their wake the question of what to make of an archive that is no longer capable of implementing its normative essence—the distinction between »what is worth preserving and what is not.«294

The inclusion of minoritarian, marginalized, »humble« archives ›from below,‹ which could originally be told as a history of progress, thus tilts with the continuously expanded storability of an almost completely datafied reality into a dysfunctional ›total archive‹ that inscribes ever more finely grained presents, but threatens to lose sight of the fact that archives produce historicity by evaluating, selecting, creating temporal distances and distances between archival holdings and non-archival holdings. In the explicit non-preservation of documents that are negatively selected, that is, become objects of cassation, the logic of archival evaluation is expressed as a closing of storage and organized obsolescence, which initially comes into effect on two levels: as a conservational »stored forgetting« (which remains in the »latent memory«) and as a targeted exclusion from storage, which may not lead to »imposed forgetting« (censorship), but does result in a higher probability of forgetting.295 The cycles of obsolescence that devalue the hardware of storage technologies are another form of forgetting that would need to be taken into account in the overall balance of what should be preserved, as John Durham Peters has noted: »The outmoding of storage media has become a fact of life. Massiveness of documentation, fragility of preservation: this is our condition.«296

Where there is no selection, where forgetting and removal from visibility is not possible in any form, the archive ultimately develops »into the double of the general.«297 The effect of an unlimited expansion of memory is thus paradoxical, as the idea of gapless, non-selective storage would de facto block access to older layers of storage. The »archive of the present« understood in this way would not actually be an archive at all, since the »media space in which we reproduce the present«298 does not subside, does not become a past present and thus does not become historical, but is constantly regenerated. The presentness-effects of real-time technologies subvert processes of historicization, delay or prohibit the formation of »historical records« (Peters). In the mode of temporary storage, past presents are constantly overwritten with new presents, that is, tendentially erased.

Assmann describes this form of synchronous co-presence, following Lévi-Strauss, against the background of a growing erosion of the »consciousness of the past being past,« for which the institution of the archive can offer only dwindling support in resistance to the real-time transfer of digital communication: »Diachrony is tending toward […] dissolution in synchrony.«299 It is the ubiquitous accessibility of networked forms of storage that are able to communicate among themselves and the forms of instantaneous transfer associated with them that thus disturb the formation of historicity. The media-technological ›real-time‹ of streaming archives intensifies distribution, but in doing so it hampers processes that deliberately suspend circulation and connectivity in order to modulate deceleration, distance, forgetting. A prerequisite for the reformatting procedures used to inscribe storage objects into the archive, to referentialize and classify them for the purpose of documenting their provenance, is a temporal stoppage through which selected objects can be infused with the endurance of archival »time reservoirs.«300

One need not agree with the above diagnosis, which has been proposed in various forms and sometimes places a one-sided, exaggerated emphasis on a media-technologically determined loss of historicity, in order to understand that under the process-temporal conditions of computer networks and streaming data we can in fact assert a different media productivity of ›nowness,‹ which also rebalances the formation of ›pastness‹ in certain respects. The question of the archive offers a heuristically suitable perspective here insofar as the practices connected with it resonate institutionally; that is, in contrast to the rhetoric of unbounded archival semantics, they are associated precisely not with hypertrophic, undifferentiated forms of storage, but rather with a selective perspective on what parts of the present might be relevant to the future and thus are not collected arbitrarily, but specifically stored, registered, kept available at a »new time point« (Winkler). With regard to the frequently assumed omnipresence, or permanent nowness, of digital data storage, it can be argued at this point with Sven Spieker that institutional archival practices do become capable of producing distinctions once again:

According to traditional archival doctrine, archives certainly store a great deal, but not necessarily everything. Today, however, the circulation of data and information is increasingly taking place in a global archival space that has lost its outside. In this sense, discussion of the archive is misleading when it refers to the global data memory banks of our day. The problem for us in the present consists less in deciding what an archive is, than in understanding what an archive-free space would have to look like, if it could in fact still exist. And here a paradox arises […]: the outside of the archive today are those traditional archives that are still concerned with separating what is worth preserving and what is not, with deciding what is of archival value under the given circumstances.301

In this model, institutional archives form an exterior of the ›global archive‹ that is turned inward insofar as they are special places of knowledge from which can be observed those excesses of storage that the traditional archive initially attempts to protect itself against—by means of its associated norms, methods, and epistemologies.302

But because this interior, which generates storage at its own expense, continuously comes into contact, at least at its edges, with an environment that has fallen prey to entropy, an obviously permeable contact zone arises. Against the backdrop of a culture that takes ever more data ever more instantaneously into its stored memory, the institutional archive seems to have shifted, so to speak, in the direction of functional memory. It sorts through information and tends to erase what it excludes from the surrounding data storage.

Under digital conditions, materials that are accepted into an archive, found to be of value and reformatted are generally not so much passively stored as they are kept for reproduction by media technology and pre-activated for transmission processes. This shift, which has already penetrated far into the traditional interior of archives, can be seen in the widespread desire of institutional facilities to lower access thresholds, to communicate access, and often even to produce digital copies and provide them through a variety of channels. The archives also ›stream‹ because a new assessment of distribution is gaining ground. In the model of the transmitting archive, accessibility wins out over the standards of persistence in the competition for institutional relevance: »Archives [want] to become brokers of information at the same level as libraries and similar institutions.«303 Mere preservation comes under suspicion of producing archive-corroding invisibilities. Many traditional archives are dedicating significant resources to switching their holdings, once protected from use and access, to open distribution. Archival objects and archival arrangements are digitally compiled and copied above all in order to remain connected with the dominant flows of information and therefore to remain accessible. What has been stored is meant to be neither completely nor permanently forgotten. Institutional archives therefore pursue their own specifically calculated storage agendas, while at the same time—with their networked databases organized for streaming—contributing to the ›globalization‹ of data storage, rather than being its antipodal Other in every respect.

It remains relevant, however, to identify the differences between the various modes of storing and distributing. Spieker’s model of externalization can be understood as a call for a potentially helpful shift in perspective. Precisely because digital storage calculations have long since come into contact with conventional archival agendas, intentions of handing down content for posterity, resilient referentialization, and the most long-lasting storage possible, the amalgamations that have emerged from this contact open up a very promising field of analysis.

Against this background, a paradigmatic product of this connection in the sense outlined here seems to be the material that the following case studies will examine: archival images that are retro-digitized by institutional agents and distributable as streaming data. An explicit dimension of historicity thus comes into play here, if only because these archival holdings have analog prehistories, which transcoding in some sense splits in two. The images in question exist from the moment of their logistical bifurcation in two states of media-technological storage: as physically tangible objects in concretely localizable archival spaces and as data objects in digital databases that are ready for transmission. A shared archival authority watches over both, guarantees the connection through »chains of transmission,«304 but equips these related objects with different distributional resources, ranges, practices. They will be examined as examples of distributed historical images—as visual documents of a public authority (the police) and of a historical context of events (the Shoah)—and in this will be approached from a fundamentally media-historiographical perspective. The »medial a priori«305 here is aimed on the one hand at the entry of digital calculations into archival systems, at the concomitant logistical restructurings. Alongside the historiographically preparable contents of such a (re)distributed history, what is of interest on the other hand is the history of distribution. For digitized objects maintain genealogical connections with early forms of technological image distribution. Their history is likewise stored in the data depots. In addition to storage processes,306 the archives in question here also preserve documents of distribution history that can be processed in terms of media archaeology.


Cf. Sven Spieker, »Einleitung. Die Ver-Ortung des Archivs,« in: Spieker (ed.), Bürokratische Leidenschaften, Berlin: Kadmos, 2004, 7–28.


See the foundational text: Geoffrey C. Bowker, Susan Leigh Star, Sorting Things Out: Classification and Its Consequences, Cambridge, MA: MIT Press, 1999.


Spieker, »Die Ver-Ortung des Archivs,« 18.


Alf Lüdtke uses this expression to refer to various forms of private, non-state collection practices, which since the 1970s have successively liberated the term »archive« from its »official, arcane sphere«: »It is not a matter of ›registered written materials‹ of this or that ›course of business.‹ Rather, they want to archive previously ignored text types or non-textual materials, and thus above all to facilitate their publication. […] These activities draw on a universe of real, existing archives held by individuals or in households—often they are ›humble‹ archives of correspondence or notes, tucked away in attics or in cellars, in suitcases, boxes, or crates« (Alf Lüdtke, »Archive—und Sinnlichkeit? Nachgedanken zu Arlette Farges ›Der Geschmack des Archivs,‹« in: Arlette Farge, Der Geschmack des Archivs, Göttingen: Wallstein, 2011, 99–116, here: 107).


Cf. Oliver Grau (ed.), Museum and Archive on the Move: Changing Cultural Institutions in the Digital Era, Berlin: de Gruyter, 2017.


Knut Ebeling, Stephan Günzel, introduction to Archivologie. Theorien des Archivs in Philosophie, Medien und Künsten, Berlin: Kadmos, 2009, 7–26, here: 7. In terms of the history of theory, from the perspective of media and cultural studies the inflating of the archive, which continues to this day, begins with the power-critical ›de-institutionalization‹ of the term in the context of Foucault’s research on the archaeology of knowledge: »I shall call an archive, not the totality of texts that have been preserved by a civilization or the set of traces that could be salvaged from its downfall, but the series of rules which determine in a culture the appearance and disappearance of statements, their retention and their destruction, their paradoxical existence as events and things« (Michel Foucault, »On the Archaeology of the Sciences: Response to the Epistemology Circle« [1968], in: James D. Faubion (ed.), Aesthetics, Method, and Epistemology: Essential Works of Foucault, 1954–1984, New York: New Press, 1998, 297–334, here: 309).


Eivind Røssaak, »The Archive in Motion: An Introduction,« in: The Archive in Motion: New Conceptions of the Archive in Contemporary Thought and New Media Practices, Oslo: Novus Press, 2010, 11–26, here: 15.


Cf. Valeska Bührer, Stephanie Sarah Lauke, »Archivarische Praktiken in Kunst und Wissenschaft. Eine Einführung,« in: Valeska Bührer, Stephanie Sarah Lauke, and Peter Bexte (eds.), An den Grenzen der Archive, Berlin: Kadmos, 2016, 9–21.


Mackenzie, »The Mortality of the Virtual,« 61.


Ibid., 59.


David M. Berry, »The Post-Archival Constellation: The Archive under the Technical Conditions of Computational Media,« in: Ina Blom, Trond Lundemo, Eivind Røssaak (eds.), Memory in Motion: Archives, Technology, and the Social, Amsterdam: Amsterdam University Press, 2017, 103–125, here: 106. Ina Blom writes in the introduction to that volume: »[…] once the archive is based on networked data circulation, its emphatic form dissolves into the coding and protocol layer, into electronic circuits or data flow. Archival data have, of course, always been in circulation: the whole point of an archive is to allow documents to be mobilized for the shifting needs and inquiries of the present. But with the networked digital archive, this circulation becomes a feedback circuit whose material structure is that of vectorial dynamics and electromagnetic fields« (Ina Blom, »Introduction: Rethinking Social Memory: Archives, Technology, and the Social,« in: ibid., 11–40, here: 12).


Mackenzie, »The Mortality of the Virtual,« 62. On the computer historiographic arrangement of von Neumann architecture, cf. Thomas Haigh, »Von-Neumann-Architektur, Speicherprogrammierung und modernes Code-Paradigma,« Zeitschrift für Medienwissenschaft 12 (2015), 127–139.


Robert Gehl, »The Archive and the Processor: The Internal Logic of Web 2.0,« New Media & Society 13 (2011), 1228–1244, here: 1229.


Gehl speaks here of »affective processing« and ultimately formulates a critique of the centralizing and economic absorption of the »archives of affect« that are stored in this process (cf. ibid., 1240).


Ibid., 1238.


Cf. Wendy Hui Kyong Chun, »The Enduring Ephemeral, or the Future Is a Memory,« Critical Inquiry 35 (Autumn 2008), 148–171.


Berry, »The Post-Archival Constellation.« Michael Moss sees the current transformation processes more as a return to the archival model of the 18th century, the Wunderkammer: »… the potential of the archive and what then becomes archivable on the most powerful distribution channel the world has ever seen is not the harbinger of a post-archival universe, but rather returns curation to its roots in the wunderkammer of Enlightenment Europe where everything can appear to be simultaneously disconnected and connected« (Michael Moss, »Memory Institutions, the Archive and Digital Disruption?,« in: Andrew Hoskins (ed.), Digital Memory Studies: Media Pasts in Transition, London: Routledge, 2017, 253–279, here: 264).


Cf. Jeffrey Rothenberg, Using Emulation to Preserve Digital Documents, The Hague: RAND-Europe/Koninklijke Bibliotheek, 2000, and Matthew Fuller, Andrew Goffey, Adrian Mackenzie, Richard Mills, Stuart Sharples, »Big Diff, Granularity, Incoherence, and Production in the Github Software Repository,« in: Ina Blom, Trond Lundemo, Eivind Røssaak (eds.), Memory in Motion: Archives, Technology, and the Social, Amsterdam: Amsterdam University Press, 2017, 87–102.


Wolfgang Ernst, Das Gesetz des Gedächtnisses. Medien und Archive am Ende des 20. Jahrhunderts, Berlin: Kadmos, 2007, 313. See also Ernst, »Jenseits der AV-Archive—Optionen der Streaming Media,« in: Verein für Medieninformation und Mediendokumentation (ed.), Fokus Medienarchiv. Reden/Realitäten/Visionen. 1999 bis 2009, Münster: LIT Verlag, 2010, 81–100. For a critique of Ernst’s conceptualization with regard to dynamic processes of temporalization in ›paper-based‹ archives, cf. Moss, »Memory Institutions,« 257.


Ernst, »Zwischen(-)Speichern,« 87.


This is especially the case for digital archival objects with no analog equivalent. Cf. Annet Dekker (ed.), Archive 2020—Sustainable Archiving of Born-Digital Cultural Content, Virtueel Platform, 2010. In the context of the PEVIAR Project (Persistent Visual Archive), Rudolf Gschwind, the director of the Imaging & Media Lab in Basel, has attempted to work out the degree to which analog material can be utilized for long-term archiving of digital or digitized information: »A permanent medium that is very familiar is photographic film, microfilm, which has a high data density; this could be a commercial consideration. Films today exposed to rapid aging tests show that they can last a few hundred years. This is as good as permanent. For us it is interesting that it is possible to mix information in hybrid forms on a visual medium. On microfilm, digital information can be stored in the form of dot patterns, in a two-dimensional barcode, the highest density type of storage, as well as simple text and image. And even the description of how to read what the digital data means« (Rudolf Gschwind, Lukas Rosenthaler, Ute Holl, »Migration der Daten, Analyse der Bilder, persistente Archive,« Zeitschrift für Medienwissenschaft 2 (2010), 103–111, here: 104).


Ernst, »Zwischen(-)Speichern,« 89.


Cf. Andreas Bernard, »Das totale Archiv,« Merkur. Deutsche Zeitschrift für europäisches Denken 801 (2/2016), 5–16.


The fact that, from the viewpoint of network topology, the Internet is distributed (that is, not simply decentralized), has been examined from a critical »protocological« perspective: Alexander R. Galloway, Eugene Thacker, »Protokoll, Kontrolle, Netzwerke,« in: Ramón Reichert (ed.), Big Data. Analysen zum digitalen Wandel von Wissen, Macht und Ökonomie, Bielefeld: Transcript, 2014, 289–311.


According to Jacob Ørmen, the main problem with understanding search engine results themselves as archivable online documents is that they result from a cooperation between the search query and the search engine service: »They simply don’t exist prior to the particular act of searching« (Ørmen, »Historicizing Google Search,« 191). The logic of continuous indexing (search engines, like libraries, work from continuously updated catalogues; both can give specific answers to specific queries because they command unrivaled information about the total inventory) also involves algorithmic parameters, such as the personalization and localization of the search query, that produce non-simultaneities.


In a certain sense, the temporalization of streaming archives already applied within the walls of classic archives. This is how we can understand, for instance, histories of knowledge of the archive that argue from a praxeological, social-theoretical perspective, moving beyond concepts of passive memory and against a positivism of written texts, catalogues, and inventories; cf. Markus Friedrich, Die Geburt des Archivs. Eine Wissensgeschichte, Berlin: De Gruyter, 2013.


Martin Warnke, »Digitale Archive,« in: Hedwig Pompe, Leander Scholz (eds.), Archivprozesse. Die Kommunikation der Aufbewahrung, Cologne: Dumont, 2002, 269–281, here: 272, 280.


Daniel Rosenberg rightly points out how eminently important archival agendas were that were implemented precisely on this level: »In electronic space, objects once traditionally thought of as documents mingle, disintegrate, and recombine according to protean systems and rules. For understanding recent history, these systems and rules are themselves objects of great archival importance, though their traces are not often intentionally conserved. Figuring out how to archive this archive is no small matter. It will be the foundation for the history of the epistemology of our contemporary era« (Daniel Rosenberg, »An Archive of Words,« in: Lorraine Daston (ed.), Science in the Archive: Pasts, Presents, Futures, Chicago: University of Chicago Press, 2017, 271–310, here: 272).


Chun, Updating to Remain the Same, 52, 90.


On the specific adaptation and stability effects of non-hierarchical forms of network organization, cf. Hartmut Böhme, »Netzwerke. Zur Theorie und Geschichte einer Konstruktion. Einführung,« in: Böhme, Jürgen Barkhoff, Jeanne Riou (eds.), Netzwerke. Eine Kulturgeschichte der Moderne, Cologne: Böhlau, 2004, 17–36, here: 23.


Quoted in Guido Kirsten, Florian Krautkrämer, Sebastian Lütgert, »Piraterie als filmpolitische Praxis. Sebastian Lütgert im Gespräch mit Florian Krautkrämer und Guido Kirsten,« montage AV. Zeitschrift für Theorie und Geschichte audiovisueller Kommunikation. Streams und Torrents 26/1 (2017), 81–90, here: 87.


Jan Gerber, Sebastian Lütgert, »10 Theses on the Archive,« Pad.ma, 2010. A distributed filesharing collection like the cinephile platform that operates as an invite-only tracker under the alias »Schwarzer Rabe« [»Black Raven«] could unquestionably be discussed as a digital film archive (not least because of the shared curatorial care and systematics practiced there). In their manifesto, however, they prefer the library as a memory agency model: »[Schwarzer Rabe] strives to be more than just a regular BitTorrent tracker for movies. We are an exclusive private filesharing community focused on creating a comprehensive library of Arthouse, Cult, Classic, Experimental and rare movies from all over the world.« The lending function of the library that is expressly emphasized here, however, is ›archivally‹ safeguarded by a reseed function, which entails quasi-institutional consolidating effects, as the range, variety, and lifespan of the audiovisual data holdings that can be consulted through Schwarzer Rabe have attested to for more than ten years: »Unfortunately one of the big disadvantages of the BitTorrent p2p system is that most torrent swarms die off relatively quickly, mostly because people do not have any incentive to keep torrents seeded. Other trackers would just delete those dead torrents. We on the other hand have set out to change that. In general, we do not delete any movie torrents and we do not consider old torrents to be ›dead‹. They are just unseeded at the moment. If a torrent has been unseeded for two days with no activity, a big red button on the top of the torrent details page allows you to request a reseed for the torrent. […] The combination of reseed requests and the various bonuses have created an extremely effective mechanism that allows even long-dead torrents to be resurrected swiftly. You can put in a reseed request and usually find the torrent seeded the next day« ([Schwarzer Rabe Homepage], last edited Feb. 20, 2007). Cf. 
Guido Kirsten, Fabian Schmidt, »Von Schwarzen Raben und anderen Netzwerken. Filmdistribution in der Schattenwelt des Internets—ein Bericht,« montage AV. Zeitschrift für Theorie und Geschichte audiovisueller Kommunikation. Streams und Torrents 26/1 (2017), 59–80; Ekkehard Knörer, »Movable Images on Portable Devices,« in: Gertrud Koch, Volker Pantenburg, Simon Rothöhler (eds.), Screen Dynamics: Mapping the Borders of Cinema, Vienna: Synema, 2012, 169–178, and more generally Tilmann Baumgärtel (ed.), An International Reader in Media Piracy: Pirate Essays, Chicago: Chicago University Press, 2016, as well as Theo Hug, Ronald Maier, Felix Stalder, Wolfgang Sützl (eds.), Medien — Wissen — Bildung. Kulturen und Ethiken des Teilens, Innsbruck: Innsbruck University Press, 2012.
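The reseed policy quoted above can be paraphrased as a small state machine: a torrent is never deleted, only »unseeded«; after two days of inactivity a reseed request becomes available, and any peer may answer it. The following Python sketch is purely illustrative (the class name, threshold constant, and method names are invented, not taken from any actual tracker software):

```python
from dataclasses import dataclass

# Illustrative sketch (all names invented) of the reseed policy quoted above:
# torrents are never deleted, only unseeded; after two days of inactivity a
# reseed request can be filed, and any peer may answer it.

UNSEEDED_AFTER = 2 * 24 * 3600  # two days without activity, per the policy

@dataclass
class Torrent:
    title: str
    last_seeded: float           # Unix timestamp of the last seeding activity
    reseed_requested: bool = False

    def is_unseeded(self, now: float) -> bool:
        # Old torrents are not considered »dead«, merely unseeded at the moment.
        return now - self.last_seeded > UNSEEDED_AFTER

    def request_reseed(self, now: float) -> bool:
        # The »big red button«: available only once the threshold has passed.
        if self.is_unseeded(now) and not self.reseed_requested:
            self.reseed_requested = True
            return True
        return False

    def reseed(self, now: float) -> None:
        # A peer answers the request and the torrent is resurrected.
        self.last_seeded = now
        self.reseed_requested = False
```

The quasi-institutional consolidation the note describes lies precisely in the fact that the `reseed` transition is always possible: no state in this machine is terminal.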


Cf. Jussi Parikka, Digital Contagions: A Media Archaeology of Computer Viruses, Bern: Peter Lang Publishing, 2007.


Hoskins offers a perspective on this »turn« as the transformational narrative of memory studies, while at the same time pointing out the limits of the »post-scarcity culture« that has arisen: »The triumph of the networked archive to deliver an apparently anytime, everywhere view, paradoxically illuminates the infinity of media after the connective turn, and thus the limits of our capacity to hold or to store (a classical problem of memory), as well as to know« (Andrew Hoskins, »Introduction to Digital Memory and Media,« in: Hoskins (ed.), Digital Memory Studies: Media Pasts in Transition, London: Routledge, 2017, 1–24, here: 3).


»Without continuity of practices, the archive would not just slumber from time to time; it would sink into a coma. Stable practices of collecting, selecting, canonizing, scrubbing, and ordering data insure that the contents of archives are commensurable and retrievable« (Lorraine Daston, »Epilogue: The Time of the Archive,« in: Daston (ed.), Science in the Archive: Pasts, Presents, Futures, Chicago: University of Chicago Press, 2017, 329–332, here: 331).


Aleida Assmann, »Archive im Wandel der Mediengeschichte,« in: Knut Ebeling, Stephan Günzel (eds.), Archivologie. Theorien des Archivs in Philosophie, Medien und Künsten, Berlin: Kadmos, 2009, 165–175, here: 174. According to Assmann, storage and functional memory only diverge in the moment of developing efficient external storage media (for her, writing is the first such medium), which function as a »memory replacement,« and thus can also preserve currently ›functionless‹ knowledge that can then be transferred from the storage memory into the functional memory at a later time. In this sense, archives are paradigmatic institutions of storage memory, but they maintain a relatively direct data flow (in historical terms, facilitated primarily by the nation-state since the 19th century) to the agents of functional memory (ibid., 169ff.).


Ibid., 174.


»Everything that the work of culture has produced until now, especially the documents of culture as such (literary and artistic texts) is to be placed within the linked file and directory structures of the electronic archive. In addition, the archive drive conceives of ever new projects on the basis of their archivability. These are not projects for use, consumption or circulation elsewhere and then preserved in the electronic archive, rather they are generated by the referencing and storage structures of the network themselves« (Mackenzie, »The Mortality of the Virtual,« 61).


Which requires a new »forensics«: cf. Matthew G. Kirschenbaum, Richard Ovenden, Gabriela Redwine, Digital Forensics and Born-Digital Content in Cultural Heritage Collections, Washington, D.C.: Council on Library and Information Resources, 2010.


Jürgen Fohrmann, »›Archivprozesse‹ oder Über den Umgang mit der Erforschung von ›Archiv‹. Einleitung,« in: Hedwig Pompe, Leander Scholz (eds.), Archivprozesse. Die Kommunikation der Aufbewahrung, Cologne: Dumont, 2002, 19–23, here: 19.


Lisa Gitelman, Paper Knowledge: Toward a Media History of Documents, Durham, NC: Duke University Press, 2014.


Cornelia Vismann, Akten. Medientechnik und Recht, Frankfurt am Main: S. Fischer, 2011, 305.


Sven Spieker, »Manifesto for a Slow Archive,« ARTMargins Online, January 31, 2016, https://artmargins.com/manifesto-for-a-slow-archive/.


Jussi Parikka, What is Media Archaeology?, Cambridge, UK: Polity Press, 2012, 123.


Wolfgang Ernst connects this model with a concept of the internet as an »open encyclopedia« (Ernst, »Zwischen(-)Speichern,« 103). David Berry points to the »flatness« produced by the warehousing practices of leading online mail-order companies: »Amazon uses a principle of simplicity and an idea of ›flatness‹ to create a computational archive of physical objects. All objects are treated as records to be entered into a database, and they are processed through a grammatization framework which flattens the object not only into the data store but also within the warehouse space: the singularity of the object is, in other words, abstracted away by the technology. Objects are retrieved using computer-controlled robots from Kiva Systems, which glide swiftly and quietly around the warehouse. To do this, Amazon uses a so-called ›chaotic storage‹ algorithm that optimizes storage through mediating databases. […] Amazon knows the exact dimensions of every product in its warehouses and the exact dimensions of vacant shelf space. The robots glide the objects to be stored to the most efficient places. […] From the outside, the Amazon system looks horribly disorganized and illogical. In fact, the warehouse represents the objectification of the chaotic storage algorithm« (Berry, »The Post-Archival Constellation,« 110).
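The »chaotic storage« principle Berry describes can be rendered schematically: placement within the warehouse is arbitrary as long as capacity suffices, and order exists only in the mediating database. The Python below is a toy model under those assumptions (all names invented; this is not Amazon's or Kiva Systems' actual software, and shelf capacity is simplified to a single dimension):

```python
import random

# Toy model of »chaotic storage«: objects are not grouped by kind but placed
# in any free slot that fits; only the index (the database) knows where
# anything is. Slot capacity is reduced to one abstract "size" dimension.

class ChaoticWarehouse:
    def __init__(self, slots: dict[str, int]):
        self.slots = dict(slots)          # slot id -> remaining free capacity
        self.index: dict[str, str] = {}   # item id -> slot id (the database)

    def store(self, item_id: str, size: int) -> str:
        # Any slot with room will do; the placement is deliberately arbitrary.
        candidates = [s for s, cap in self.slots.items() if cap >= size]
        slot = random.choice(candidates)
        self.slots[slot] -= size
        self.index[item_id] = slot
        return slot

    def retrieve(self, item_id: str) -> str:
        # Without the index the warehouse looks disorganized; with it,
        # retrieval is a single lookup.
        return self.index.pop(item_id)
```

The point of Berry's »objectification of the chaotic storage algorithm« is visible here: the physical arrangement is unreadable, because all order has migrated into `index`.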


Cf. Harald Ehrmann, Logistik. Kompendium der praktischen Betriebswirtschaft, Herne: Kiehl-Verlag, 2008, 349f.


Cf. the foundational work: Christoph Rosol, RFID — Vom Ursprung einer (all)gegenwärtigen Kulturtechnologie, Berlin: Kadmos, 2008.


Cf. Rossiter, Software, Infrastructure, Labor.


Cf. Shannon Mattern, »Middlewhere: Landscapes of Library Logistics,« Urban Omnibus, June 24, 2015, https://urbanomnibus.net/2015/06/middlewhere-landscapes-of-library-logistics/.


On the significance of data structure for storage media, cf. Manovich, Software Takes Command, 201f.


Gugerli, Suchmaschinen, 71. This principle becomes generalized in web-based database operation, as Martin Warnke writes: »Databases facilitate access by many, they compensate for a disadvantage of the web as it was designed by Sir Tim Berners-Lee. With the appropriate technologies on the part of the large database operators, one can participate without having to know a lot about technology. A content management system ensures that users can make entries by means of web forms, that these find their way into databases and are then displayed on web pages, where they can be seen by others« (Warnke, »Datenbanken als Zitadellen,« 133).


In this vein David Berry speaks of a »computational opacity« (Berry, »The Post-Archival Constellation,« 105).


Arlette Farge, Der Geschmack des Archivs, Göttingen: Wallstein, 2011, 40ff.


Lara Putnam, »The Transnational and Text-Searchable: Digitized Sources and the Shadows They Cast,« American Historical Review 121/2 (2016), 377–402, here: 377.


Farge, Der Geschmack des Archivs, 40.


In his history of knowledge of the archive, Markus Friedrich notes that his term »visibility depot« should not be understood simply as the passive storage of textual documents, but as the sum of archive-related practices that take place in various social spaces and must be negotiated within the historical traditions of governance and administration of an »administrative organ« (Max Weber) (Friedrich, Die Geburt des Archivs, 17ff.).


Cf. Urs Stäheli, »Die Wiederholbarkeit des Populären: Archivierung und das Populäre,« in: Hedwig Pompe, Leander Scholz (eds.), Archivprozesse. Die Kommunikation der Aufbewahrung, Cologne: Dumont, 2002, 73–82.


Cf. Jeffrey Rosen, »The Web Means the End of Forgetting,« New York Times, July 21, 2010.


Hoskins, »Introduction to Digital Memory and Media,« 3.


Debra Ramsay has produced an exemplary study of this connection in a work of media ethnography. Her concrete example is the development of a new interface for the website of the British National Archives (TNA). Representatives of the institution had to cooperate with web designers on the conception and implementation of this website. A fundamental area of conflict became apparent in the result: »User expectations generated by familiarity with commercial websites such as Google exert a definite and tangible pressure on the process of interface design in archives, and are increasingly inflecting perceptions of what is accessible from the past and how it can be accessed. But the design process demonstrates that commercial design principles are not simply or blindly implemented within heritage organisations like TNA, because the archive itself pushes back against them by asserting and upholding archival responsibilities and identity through a series of representational strategies« (Debra Ramsay, »Tensions in the Interface. The Archive and the Digital,« in: Andrew Hoskins (ed.), Digital Memory Studies: Media Pasts in Transition, London: Routledge, 2017, 280–302, here: 299).


Ernst, »Zwischen(-)Speichern,« 104.


Gitelman, Paper Knowledge, 133.


Ibid., 127.


Ibid., 114.


On the ›impregnation‹ of the concept of the archive in historical scholarship, Lorraine Daston has noted: »So complete and exclusive has the identification of archive with the discipline of history become that any other kind of archival research is assumed to be ipso facto historical in nature, and any archive to be of the sort prototypically investigated by historians: a fixed place with a curated, often official collection consisting mostly of old unpublished papers. […] Not only does archival research dominate the imagination of the historians; the historians’ archives dominate our collective imagination of all archival research« (Lorraine Daston, »Introduction: Third Nature,« in: Daston (ed.), Science in the Archive: Pasts, Presents, Futures, Chicago: University of Chicago Press, 2017, 1–14, here: 2f.).


Putnam, »The Transnational and Text-Searchable,« 386.


Gehl summarizes the granular calculation as follows: »The larger the archive, and the more granular the data about the desires, habits, and needs of users, the more valuable the archive. And if the archive is reliably linked to users who can sort data and process digital artifacts, the archive can be grown and made more precise« (Gehl, »The Archive and the Processor,« 1239).


Cf. Maja Kominko (ed.), From Dust to Digital: Ten Years of the Endangered Archives Programme, Cambridge, UK: Open Book Publishers, 2015.


Hoskins, »Introduction to Digital Memory and Media,« 4.


Cf. David M. Berry (ed.), Understanding Digital Humanities, Basingstoke, UK: Palgrave Macmillan, 2012. For a perspective on the Digital Humanities from historical scholarship, see: Jörg Wettlaufer, »Neue Erkenntnisse durch digitalisierte Geschichtswissenschaft(en)? Zur hermeneutischen Reichweite aktueller digitaler Methoden in informationszentrierten Fächern,« Zeitschrift für digitale Geisteswissenschaften 1 (2016), DOI: 10.17175/2016_011. More on this in chapter II.3 (Computing Video Data).


Putnam, »The Transnational and Text-Searchable,« 388.


Ibid., 400.


Ernst, »Jenseits der AV-Archive,« 87.


»Algorithms are introduced to write new algorithms, or to determine their variables. If in turn this reflexive process is built into an algorithm, it becomes ›self-learning‹« (Stalder, Kultur der Digitalität, 178).
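The reflexive loop Stalder describes, an algorithm whose variables are set by another algorithm, with that adjustment step then built back into a loop, can be illustrated with a deliberately schematic toy (invented function names; a one-parameter sketch, not a real learning system):

```python
# Toy illustration of Stalder's reflexive loop. `predict` is the first-order
# algorithm; `adjust` is a second algorithm that rewrites its variable;
# `self_learning` folds the adjustment back into a loop.

def predict(x: float, weight: float) -> float:
    # First-order algorithm: its behaviour depends on the variable `weight`.
    return weight * x

def adjust(weight: float, x: float, target: float, lr: float = 0.1) -> float:
    # Second-order algorithm: it determines the variable of the first,
    # nudging it to reduce the observed error.
    error = predict(x, weight) - target
    return weight - lr * error * x

def self_learning(pairs, weight: float = 0.0) -> float:
    # The reflexive fold: adjustment is now part of the process itself,
    # so the system revises its own variable as data streams through.
    for x, target in pairs:
        weight = adjust(weight, x, target)
    return weight
```

What makes the loop »self-learning« in Stalder's sense is not the arithmetic but the architecture: the procedure that sets the variable is no longer external to the procedure that uses it.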


Michel de Certeau, The Writing of History, New York: Columbia University Press, 1988, 78–79.


Lara Putnam has also indirectly pointed to this: »We took the enduring remains of state and church recordkeeping—censuses, parish records, tax rolls—and coded and calculated. What is new now is not computation per se but digitization and OCR, which make words above all available, whether for web-based discovery or for automated analysis. This mass data-fication of words is just one subsection of ›the digital‹ impacting academe, but it is a huge one. Not only is it the shift that has remade the information landscape for search, but it is also the driver for those tech-engaged historians experimenting with topic modeling, sentiment analysis, and other text-mining computational approaches« (Putnam, »The Transnational and Text-Searchable,« 400).
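The technical core of what Putnam calls the »mass data-fication of words« is the inverted index: once OCR has made a corpus machine-readable, every word becomes a direct point of entry into it. A minimal sketch, using only the Python standard library (function names are illustrative, not any particular search engine's API):

```python
import re
from collections import defaultdict

# Minimal inverted index: maps each word to the set of documents containing
# it, so that any word becomes a point of entry into the corpus.

def build_index(docs: dict[str, str]) -> dict[str, set[str]]:
    index: dict[str, set[str]] = defaultdict(set)
    for doc_id, text in docs.items():
        # Lowercasing collapses »Census« and »census« into one entry.
        for word in re.findall(r"\w+", text.lower()):
            index[word].add(doc_id)
    return index

def search(index: dict[str, set[str]], word: str) -> set[str]:
    # Web-based discovery in miniature: one lookup, no reading required.
    return index.get(word.lower(), set())
```

The same structure underlies both keyword search and the text-mining approaches Putnam mentions, which operate on word counts derived from exactly this kind of document-term mapping.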


Ina Blom rightly points to opposing dynamics arising from novel options for digital closure: »Digitization seems, at least in theory, to promote a radical democratization of memory: everything may, potentially, belong to everyone. A proliferation of digital paywalls and passwords is the reality; vestiges of a bounded, territorial concept of space, just like the duplicitous concept of storage« (Blom, »Rethinking Social Memory,« 13).


Stäheli, »Die Wiederholbarkeit des Populären,« 74.


Cf. Assmann, »Archive im Wandel der Mediengeschichte,« 168.


John Durham Peters, »Proliferation and Obsolescence of the Historical Record in the Digital Era,« in: Babette B. Tischleder, Sarah Wasserman (eds.), Cultures of Obsolescence: History, Materiality, and the Digital Age, Basingstoke, UK: Palgrave Macmillan, 2015, 79–97, here: 80.


Stäheli, »Die Wiederholbarkeit des Populären,« 75.


Mercedes Bunz, Die stille Revolution. Wie Algorithmen Wissen, Arbeit, Öffentlichkeit und Politik verändern, ohne dabei viel Lärm zu machen, Frankfurt am Main: Suhrkamp, 2012, 119.


Assmann, »Archive im Wandel der Mediengeschichte,« 175. Cf. also: Ernst, »Zwischen(-)Speichern,« 94.


»The archive provides the time to see and recognize. It resists the tempo of history with its decelerated temporality, and in doing so it makes history perceivable« (Stäheli, »Die Wiederholbarkeit des Populären,« 75).


Spieker, »Die Ver-Ortung des Archivs,« 8.


David Berry also does not assume that archival cultural techniques are per se infeasible in the media space of the internet: »[…] the Internet is an archive that represents an open-ended ›aggregate of unpredictable texts, sounds, images, data, and programs‹ but that is nonetheless navigable and open to traditional archival practices« (Berry, »The Post-Archival Constellation,« 108).


Anna Sobczak, Traditional vs. Virtual Archives—The Evolving Digital Identity of Archives in Germany [Online Publication: academia.edu], Hamburg, 2016, 6.


Daston, »Introduction: Third Nature,« 6. According to Daston, these chains of transmission ultimately also guarantee the sustainability of the archive: »Longevity is no accident: a chain of individuals and institutions links Babylonian cuneiform tablets with NASA’s Five Millennium canons and the fossils piled up in a seventeenth-century cabinet of curiosities with a twentieth-century digital database. At each stage of transmission, key information about the original context in which the archive was compiled might be lost; standards for precision, reliability, and relevance also have their history. Without scholars and scientists, copyists, printers, proofreaders, curators, librarians, archivists, programmers, and the institutions that at every step support them, from monastic scriptoria to the modern university library, the chain would break« (Daston, »Epilogue: The Time of the Archive,« 330).


Cf. Lorenz Engell, Joseph Vogl, »Editorial,« in: Engell, Vogl (eds.), Mediale Historiographien. Archiv für Mediengeschichte 1, Weimar: Universitätsverlag, 2001, 1–5.


Spieker, »Die Ver-Ortung des Archivs,« 19.

