DHQ: Digital Humanities Quarterly
Editorial

From the Presupposition of Doom to the Manifestation of Code: Using Emulated Citation in the Study of Games and Cultural Software

Eric Kaltman <eric_dot_kaltman_at_csuci_dot_edu>, Department of Computer Science, California State University Channel Islands
Joseph Osborn <joseph_dot_osborn_at_pomona_dot_edu>, Department of Computer Science, Pomona College
Noah Wardrip-Fruin <nwardrip_at_ucsc_dot_edu>, Department of Computational Media, University of California, Santa Cruz

Abstract

For the field of game history to mature, and for game studies more broadly to function in a scholarly manner in the coming decades, one necessity will be improvement of game citation practices. Current practices have some obvious problems, such as a lack of standardization even within the same journal or book series. But a more pressing problem is disguised by the field’s youth: Common citation practices depend on the play experiences and cultural knowledge of a generation of game studies scholars and readers who are largely old enough to have lived through the eras they are discussing. More sustainable and precise alternatives cannot fall back on the tools available for fixed media — such as the direct quotations and page numbers used for books or the screenshots (of images that appear to all viewers) and timecode used for video. Instead, this essay imagines an alternative approach, working in the digital humanities traditions of speculative collections and tool-based argumentation. In the speculative future we present, there are scholarly collections of software, as well as tools available for citing software states and integrating these citations into scholarly arguments. A working prototype of such a tool is presented, together with examples of scholarly use and the results of an evaluation of the concept with game scholars.

1 The Changing Archive

As many institutions contemplate migrating their software collections to digital repositories and expanding their born-digital holdings, scholars find new opportunities and use cases that leverage these records’ born-digital nature. At the same time, as recent surveys of the digital humanist landscape make abundantly clear, the future of historical scholarship will be tied to reconciling past, print-based, qualitative practices with newer, networked, quantitative ones [Borgman 2007] [Owens and Padilla 2020] [Siemens and Moorman 2006] [Svensson and Goldberg 2015] [Warwick et al. 2012]. The meaning of the historical archive is changing, and the ways to enable and maximize its exploration must also adapt.
We provide in this article a way forward for the particular case of scholarship in video game history — by designing new means for historical citation, reference, and source retrieval. As we discuss below, these new means support a new approach to game scholarship (including, but not limited to, the history of games). Moreover, this new way of understanding game citation generalizes to other computational artifacts that will likely be of increasing concern to scholars, such as systems and application software, user interfaces, and more.
Specifically, we outline the current practice and requirements of citation in historical game studies works and then suggest a route beyond that practice through the description of a tool for the reference and resurrection of game software data. Here, "resurrection" is shorthand for the re-execution of legacy software data inside a new computational context.[1]
Those investigating humanistic citation and authority in digital humanities, like Burdick et al. in Digital_Humanities, call out the coming dissolution of humanistic and preservationist foundations.
As concepts of authorship, document, argument, provenance, and reference become increasingly unstable, concepts that are fluid, iterative, and distributive, but less “authoritative,” are taking their place [Burdick et al. 2016, 109].
Game citation directly confronts — and calls for answers to — many of these issues.[2] In this work, we highlight such problems of “authorship, document, argument, provenance, and reference.”
As a motivating example, consider the following text from Pinchbeck's detailed account of the 1993 computer game Doom:[3]

The color palette of browns and grays shouts a level of realism that Wolfenstein 3D didn’t get anywhere near. A barrel positioned temptingly in the center of the screen just aches to be shot at and delivers a meaty crunching explosion that scatters debris across the screen. . . . There are a number of linked secrets, establishing that opening up new areas involves not just finding and triggering buttons and trip wires but triggering things in sequence. In this case, we have a different-colored wall panel, dropping down into a passage that takes us to the lake of waste with the superarmor, then a trip wire in the final room that lowers the Imp platform, announcing dynamic vertical-level adjustment and opening up a little area with a shotgun and shells. . . . In the space of a few short and small secrets, the game trains the observant player or completionist to watch for wall discoloration, lines of light/shadow and new sectors as trip wires, raising and dropping platforms, and linked sequences. [Pinchbeck 2013, 69]

This passage presupposes comparative knowledge of other early first-person shooters and assumes that the reader either accepts Pinchbeck's descriptions at face value or will do the legwork required to find a copy of the game, load it into emulation software (or find a computer capable of running it!), and reproduce Pinchbeck's referenced objects, sites, and narrative arc. This also puts a greater obstacle in the reader's path: Is the reader sufficiently dextrous and committed to complete the demo chapter and find the secrets? Whom does this type of citation include and exclude? Moreover, for game studies more broadly, it is time-consuming, tedious, and space-intensive to give detailed accounts of even a single linear playthrough of a game.
Consider, as an alternative, this interactive, augmented version of a longer excerpt from the same passage, in which the highlighted text may be clicked to launch the game in a specific saved state.
The color palette of browns and grays shouts a level of realism that Wolfenstein 3D didn't get anywhere near. [Editorial note: This first Wolfenstein 3D example unfortunately does not function on some browsers.] A barrel positioned temptingly in the center of the screen just aches to be shot at and delivers a meaty crunching explosion that scatters debris across the screen. . . . The lights vary in this room - we can see the light from the large window to the right. When we pause by this window, the open areas seem huge, and we can see a lake of animated green goo. A glowing piece of armor sitting in the lake tells us straight away that we can leave the corridors and rooms and actually get outside. . . . We skirt around to the left toward the pillars (which are throbbing and glowing with light) and head up the staircase to a plinth with animated scrolling textures, to collect some armor. On the way, a shaven-headed goon with a shotgun bellows at us. We fire, and he flies backward in a gush of blood, dropping the shotgun. We collect it and scoop up some blue health vials and some archaic metal helmets for an armor bonus, and we’re ready to go.
Example 1.
This window contains an active emulation context that runs the cited states (clickable links) in the surrounding paragraphs. When you click on the emulation, your mouse will be captured; use the Esc key to release it. Controls for Doom and Wolfenstein 3D are as follows: Ctrl key to shoot, arrow keys to move, number keys to change weapons, mouse to look (when captured), Shift key to run, and Esc key to bring up the game menu. Sometimes a state may load inconsistently; clicking the citation again will reload it. A Doom state may occasionally hit an internal "LUMP" error, which is due to internal consistency checking in the Doom engine. We recommend using Google Chrome to run this page, if possible, as it produced the fastest emulation speed in our tests.
Other innovations introduced in the first level include multiple vertical levels included in the same area. In the third room we enter, Imps stand on a raised platform in the far corner, while Zombie soldiers advance along a walkway that zigzags over green radioactive waste. There are a number of linked secrets, establishing that opening up new areas involves not just finding and triggering buttons and trip wires but triggering things in sequence. In this case, we have a different-colored wall panel, dropping down into a passage that takes us to the lake of waste with the superarmor, then a trip wire in the final room that lowers the Imp platform, announcing dynamic vertical-level adjustment and opening up a little area with a shotgun and shells. Finally, moving back out of this area and toward the second room opens a timed lift in the corner of the secret shotgun area, which we can run back to before it raises again (and it only does this once; some secrets are nonrepeatable). The lift leads to a short corridor with a couple of small armor bonuses before delivering the real reward, a one-way wall with a view over the walkway room. In the space of a few short and small secrets, the game trains the observant player or completionist to watch for wall discoloration, lines of light/shadow and new sectors as trip wires, raising and dropping platforms, and linked sequences.
It is worth noting that the “shaven-headed goon” referenced in the first quoted paragraph above only appears at this point at certain difficulty levels, so a player selecting easier difficulties will not encounter it. The previous citation, regarding the “plinth”, shows the same position in the level on an easier difficulty setting. Emulated citations in this paragraph therefore reflect not just the level position and player state but also macro-level changes to game settings. In addition, the citation referencing the “timed lift in the corner of the secret shotgun area” loads with the game menu activated. This is due to the difficulty of catching the lift while simultaneously triggering a save state. If the menu is dismissed with the Esc key, the lift will be shown rising up the wall. There appears, therefore, to be an art inherent to capturing evocative states that align well with each descriptive sentence.
In preparing this augmentation, the authors used a new browser-based tool — the Game and Interactive Software Scholarship Tool (GISST) — to play through the game, record our performance and bookmark specific instants, store these bookmarks in a database, and hyperlink the textual content to specific moments in that performance. A reader has only to load such a bookmark to witness the moment of the game we need for our argument; they may then continue to play or simply click the next bookmark. The searchable database of performances and game citations itself becomes fuel for future arguments and counterarguments, as scholars can start playing from any of these bookmarks and take different actions (or, indeed, can apply the same series of actions to a different version of the game): “What if we had gone left instead of right?” “How does this scene appear differently in the first alpha version versus the final released version?”
Games and other interactive systems have dual lives: on the one hand, they are processes (often computer programs) that can generate sequences of emergent phenomena bounded only by combinatorics, with each additional choice or instant of time branching the possible worlds by increasing exponents; on the other hand, every experience of a game is not a branching tree but a linearized sequence of events and audiovisual outputs. Considering the game program as a text is not necessarily more or less correct than considering a specific play of a game as a text, and GISST illuminates the difference between game program and game performance, a distinction we explore later in this work.
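To make the program/performance distinction concrete, the following sketch shows one way a citable state and a recorded performance could be modeled. It is illustrative only, written under our own assumptions: the type names (GameRecord, StateCitation, PerformanceCitation) and their fields are hypothetical and do not reproduce GISST's actual schema or API.

```typescript
// Hypothetical sketch, not GISST's real data model: it illustrates the
// distinction between a game program, a citable state of that program,
// and a linearized performance (a recorded sequence of timed inputs).

interface GameRecord {
  title: string;       // e.g. "DOOM"
  version: string;     // e.g. "v1.9, registered"
  platform: string;    // e.g. "MS-DOS"
  dataHash: string;    // checksum identifying the exact cited game data
}

interface StateCitation {
  game: GameRecord;
  stateId: string;     // key into a store of emulator save states
  emulator: string;    // emulator (and version) the state targets
  note: string;        // human-readable gloss, e.g. "plinth with armor, easy difficulty"
}

interface PerformanceCitation {
  game: GameRecord;
  startState: StateCitation | null;            // null means "from power-on"
  inputs: { frame: number; input: string }[];  // the linearized event sequence
}

// A reader-facing citation could resolve to a URL that boots the emulator
// into the bookmarked state; "what if" play branches from the same state
// with different subsequent inputs.
function citationUrl(base: string, c: StateCitation): string {
  const game = encodeURIComponent(c.game.dataHash);
  const state = encodeURIComponent(c.stateId);
  return `${base}/cite?game=${game}&state=${state}`;
}
```

On a model like this, prose of the kind quoted above could link each descriptive sentence to a state citation, while a full playthrough would be preserved as a performance citation that later readers can replay, or branch from at any frame.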
As Stephen Ramsay and other digital humanists argue, computational tools can themselves be arguments for and about new humanistic expression [Ramsay 2011]. This sentiment is echoed in convincing terms in the aforementioned polemic on the state of digital humanities [Burdick et al. 2016]. The authors assert that “the next generation of digital experimenters could contribute to humanities theory by forging tools that quite literally embody humanities-centered views regarding the world” [Burdick et al. 2016, 104]. In this sense, the tool presented below is the embodiment of a potential future for the study of game history.
In creating such a tool, we aim to foreground issues of reference as regards the notoriously unstable nature of computer games and software.[4] These objects, in their requirements for future retrieval and use in historical arguments, call for the ideation of what Bethany Nowviskie refers to as “speculative collections” [Nowviskie 2016]. These are collections that do not yet exist but will be required for future digital scholarship. The requirements of both game citation and the tool illustrated below invite the creation of a speculative future collection for managed and retrievable historical software data. Speculative collections call for the telegraphing of future scholarly use, and for the creation of criteria for the descriptive, curative, and managerial requirements that would likely result.
The future of scholarship in game history therefore rests not only on the stabilization of records for future retrieval, but also on the ways that retrieval is enacted in practice and made available for further exploration and exploitation by critical computational methods and tools.
We begin this article by identifying key problems in game and software citation. We explore these problems by applying theories of reduction and intertextuality. This leads us to new approaches for reference and from there to the design of GISST, the tool supporting the new kind of games scholarship illustrated above. Finally, we present a qualitative evaluation of GISST by other scholars.

2 Citation

Citation is a foundational act in modern scholarship. Regardless of the discipline, any scholarly argument is based on, or a reaction to, previous work and/or citable primary sources. Different fields use citation practice in different ways, but the major functions remain consistent [Sula and Miller 2014, 454–5].
Citation operates on two fundamental levels. Within a text, it legitimates the knowledge claims made by an author and provides support for their arguments and findings. It also, through the connections a citation makes to related works, ties an author to the social organization of the discipline(s) to which they are contributing. In fact, any discipline is essentially constituted by these networks of citations, the accumulated links becoming dense enough to support further claims and disclaimers, rebuttals and denials.
Studies of citation occupy the thoughts of numerous fields, from the quantitative analysis of bibliometrics, with its h-indexes and impact factors, to the applied socio-linguistic study of discourse analysis. The dual roles of citation practice, both in the form and content of knowledge links and as a base for social practices within disciplines, are certainly ripe enough for a pluck into the basket of game historical studies. Before diving into the morass of game citation, and even more specifically, game data citation, the rest of this section will set up some background definitions and support structures from citation-adjacent fields. These will then be leveraged into a fuller discussion of game citation and the citation tool as a speculation and intervention into future practice.

2.1 Citation in Use

As discussed, citation functions both within a text, as a marker to other sources, and without, as a tie between an author and a discipline. The citations found inside texts follow the prescriptions of the common practices within a field of study. Common guides for the humanities include those of the Modern Language Association (MLA), the American Psychological Association (APA), and The Chicago Manual of Style. Most engineering disciplines, including Computer Science, follow similarly organized research guides. In CS’s case, a majority of the works are organized around the conference guidelines provided by the Association for Computing Machinery (ACM) or the Institute of Electrical and Electronics Engineers (IEEE). These guides are the products of the study of bibliography, with its most prominent scholarly effect being the enumerative bibliographies — the “Works Cited” lists — found after the conclusions of monographs or conference publications. The constitution of bibliography entries is the result of practices in descriptive bibliography, a subset of analytical bibliography.
While bibliographic practice in the age of the Internet is in some corners lamented as both a lost art and potentially unnecessary [Parks 2014], we believe that coherent, consistent and standardized bibliographic description is still essential.[5] The practice of organized analysis of the works in a field can help to reveal new research directions and provide a solid base for future claims. Game historical work needs more time and effort devoted to these foundational, field-constitutive activities.
Returning to in-text bibliographic citation practices, we find that an explicit function of citation is the legitimation of claims.[6] Authors need to legitimate their claims, both by crediting original sources and supporting dialogue with other scholars with whom they may agree or disagree. Citation also works as a means to pay an intellectual debt, as any addition to knowledge, being based on previous efforts, should acknowledge others’ contributions [Eco 2015].
In history specifically, the formation of modern historical discourse is founded on the “scientific” practice of accurate historical sourcing. Commonly attributed to Leopold von Ranke and his continental predecessors, the development of the footnote and endnote showed that the author had “done an acceptable amount of work, enough to lie within the tolerances of the field” [Grafton 1997, 22]. Citation persuades others to pay attention to a scholar’s work and thoughts. It cannot “explain the precise course” taken by a historian, but can “give the reader . . . enough hints to make it possible to work this out — in part. No other apparatus can give more information — or more assurance — than this” [Grafton 1997, 23]. Anthony Grafton, in reference to the preceding quotes, notes that the function of footnotes — the historian’s preferred mode of citation — is to give legitimacy to a claim and promote the authority of the claimant. Footnotes also provide a means for enforcing the social space of historians through the inclusion or exclusion of particular works. In many cases, a notable omission provides for a deeper criticism than an antithetical reference, because at least the latter claim is being recognized and confronted instead of ignored.

2.2 Citation as Discourse

Given that this article is devoted to the use and abuse of citation practices in game historical texts, we need to develop a suitable framing and terminology to discuss game citations. As alluded to above, the act of citation involves the coherent and consistent description of a source within a text. This is usually in the form of a bibliographic footnote or endnote, or as a list of entries (enumeration) following a text. Bibliography dictates that there is sufficient description to allow a future researcher to recover the described source. This creation of a knowledge link to another’s work, in light of bibliographic practice, is then a matter of practicality. One needs adequate description for future retrieval. What bibliographic notions do not cover is the use of a citation, a knowledge linkage, within the text itself. The correct form of a description for a source is not the same as how an author activates that source inside their text as a part of their argument. We then have to split the act of citation in two. The first part is “simply” the description and positioning of a citation inside a text, and the other is an analysis of how that citation is used as a way to further the objectives of an author, and by extension that author’s discipline. Fortunately, the discussion of citation’s effect on both textual composition and the social formation of disciplines is addressed in the fields of bibliometrics and discourse studies. In the following paragraphs we will borrow a few key disciplinary concepts, then position them in relation to practices in game historical scholarship.
The intersection of one text within another is an instance of intertextuality. Although the term “intertextuality” is used in a number of fields, our definition of it here is drawn from discourse studies. One of the founders of the field, Norman Fairclough, describes intertextuality as “the property texts have of being full of snatches of other texts, which may be explicitly demarcated or merged in, and which may assimilate, contradict, ironically echo, and so forth” [Fairclough 1992, 84]. Intertextuality, in Fairclough’s consideration, implicitly calls to account the production, distribution, and consumption of texts. For production, a key consideration is the “historicity” of texts, how they add to previous knowledge and expand a specific discursive chain of thought.[7] Distribution reflects on the network of texts and how they transform and flow into different types and fields.[8] Fairclough uses the example of political speeches transformed into news reports. With consumption, “the intertextual perspective is helpful in stressing that it is not just ‘the text,’ not indeed just the texts that intertextually constitute it, that shape interpretation, but also those other texts which interpreters variably bring to the interpretation process” [Fairclough 1992, 85]. All three implications of Fairclough’s “intertextual perspective” have a significant bearing on the interpretation of citation and reference in academic works. That texts are engaged with a disciplinary train of thought, flow and re-form based on context, and intimately involve presuppositions about the other texts they do (and could) use are all points of reflection both for the citation practice of games and for the better use of the tool proposed below, which offers a new type of intertextual link for game history.
Fairclough’s intertextuality is drawn from a more elaborate framework based in French discourse analysis that traces back to Michel Foucault’s Archaeology of Knowledge. Rather than retrace this history ourselves, we can lean on Fairclough’s two notions of intertextuality, manifest and constitutive. Manifest intertextuality is “where specific other texts are overtly drawn upon within a text,” forming a “heterogeneous constitution of texts.” To clarify, the quote ending the previous sentence is an example of manifestation, as is “Fairclough’s intertextuality” at the beginning of the paragraph. Both are specific, overt calls out to other texts, with the direct quote being a more emphatic kind of manifestation. Constitutive intertextuality (for Fairclough, “interdiscursivity”) is effectively the means of intertextuality for a specific text: how the configuration of its references, allusions, and implications for other texts, both explicitly mentioned and implicitly demanded, aligns to form a specific type of discourse for a specific audience.[9] Bibliographic actions — like enumerative bibliography, footnotes, and in-line citations — are then all types of manifestation, while the act of citation itself, as a social and knowledge-linking activity, is a constitutive act.
Before extending the application of constitutive and manifest intertextuality to game historical discourse, two more key insights from Fairclough are useful. The first is the idea of a “presupposition” of a text. Sometimes presuppositions are just “propositions that are given for, and taken for granted by, text producers.” When engaged in the act of writing and assembling an argument, authors bring into their writing numerous pointers to other works — through manifestation — or ideas from other works that are assumed to be part of the general knowledge of an assumed reader. Or, crucially, part of the knowledge that one is assumed to take from an “other text” that is insinuated in the current one. “In many cases of presupposition the ‘other text’ is not an individual specified or identifiable other text, but a more nebulous ‘text’ corresponding to general opinion (what people tend to say, accumulated textual experience)” [Fairclough 1992, 121]. A presupposition, through manifest citation, of a particular “other” author or “other” work, brings with it a host of assumptions based on an expected experience on the part of the reader. As a result, presuppositions are difficult to correct if they go awry because both the reader and the author are under a similar delusion about the content and shape of a reference.
This gap between presuppositions, as intended by an author and interpreted by a reader, leads us to a second point about ambiguity and the constitutive “surface” of a text. In drawing together the heterogeneous network of texts that constitute a new one, there are times when certain parts may not fit as well as others.

Texts vary a great deal in their degrees of heterogeneity . . . [they] also differ in the extent to which their heterogeneous elements are integrated, and so in the extent to which their heterogeneity is evident on the surface of the text. . . . Again, texts may or may not be “reaccentuated;” they may or may not be drawn into the prevailing key or tone (e.g., ironic or sentimental) of the surrounding text. Or again, the texts of others may or may not be merged into unattributed background assumptions of the text being presupposed. So a heterogeneous text may have an uneven and “bumpy” textual surface, or a relatively smooth one.  [Fairclough 1992, 104]

That we should pay attention to qualities of a textual surface — and the ways in which its imbricated texts do and do not comport with each other — motivates our framing for the discussion of the citation work below. If we are combining references in text to games, and other systems, based on non-discursive experiences, their constitutive intertextuality needs to be examined, along with its effects on the resulting historical discourse.[10] We may want the alignment to have a specific texture, but we also need to be aware that that texture, that surface, is something deserving of reflective consideration and thought. How do the citation of games and the juxtaposition of program and text affect the reader’s experience and comprehension of the argument? What does this do to the issues of presupposition and scholarly assumption? Below we discuss the current practice of game citation in light of both manifest intertextuality and presupposition.
One last implication of the heterogeneous constitution of texts is that it results in what Fairclough refers to as an “ambivalence” of argumentative meaning. “If the surface of a text may be multiply determined by the various other texts which go into its composition, then elements of that textual surface may not be clearly placed in relation to the text’s intertextual network, and their meaning may be ambivalent; different meanings may coexist, and it may not be possible to determine ‘the’ meaning” [Fairclough 1992, 105]. The ambiguity inherent to any textual argument results from the fact that we — as the authors — are removing other texts from their original context and constituting them into our own. As such, the onus is on us, the researchers, to both explain and refine the “other” texts in a way that supports our argument and that is interpretable to the reader. Poking below the surface of the text and retrieving a shared context requires significant effort, and in the case of games, might not currently be possible due to a lack of access [Kaltman et al. 2015] [McDonough 2010].

3 Bibliography and Citation in Game Studies

In the previous section we outlined the function of citation as both an act of descriptive bibliography and as an intertextual mechanism in the creation of textual discourse. This section brings both of those concepts to bear on the current practices of game citation in game studies and game historical texts. While both fields are large, with a significant amount of publication activity, the extent of game citation is currently rather limited. Additionally, in game studies works there is a good deal of presupposition about the accumulated played experiences of both the reader and author. These assumptions are a major reason for the current lack of specific bibliographic guidelines. As Nathan Altice writes, in one of the only other analyses of game citation practice,

Our familiarity with and access to videogames is taken for granted, since many of us are old enough to recall first-hand experience with the entire history of videogames — a claim that cannot be made by scholars of other media. There is an implicit assumption that we all know what a Super Mario Bros. cartridge looks like, so why bother with thorough descriptions?  [Altice 2015, 334]

This “implicit assumption” of Super Mario Bros. is the result of the presupposition at work in game studies texts. It is not uncommon for game references to be scant or potentially non-existent. This is a problem because the assumption of contemporary, tacit experience with historical games cannot hold up past the current generation of scholars. Game citation practices, like those of appraisal and description, need to be addressed with a mind toward future historical scholarship and needs. The rest of this section will describe how citation functions in game studies works, and briefly point to further recommendations for their improvement. All of this lays groundwork for the introduction of the citation tool in the next section — as a tool-assisted intervention into both the issues of citation standardization and presupposition of game play experiences.
As already noted, computer game bibliography and citation practice needs consistent and thorough standards. Again, from Altice,

To claim that videogame bibliography demands a closer allegiance to the practices [of enumerative, and analytical bibliography] assumes that a unified practice called “videogame bibliography” even exists. At their best, videogame citations adhere to the barest enumerative models. Even in those texts that most seriously grapple with electronic artifacts as objects that exhibit physical properties worthy of description, such as Kirschenbaum’s Mechanisms or Montfort and Bogost’s Racing the Beam, videogames are still afforded scant bibliographic information.  [Altice 2015, 333]

Altice’s claim of the lack of a “videogame bibliography” practice is not difficult to substantiate. As he states, many works that are intimately tied to the exploration of the material constraints and expressive potentials of technical artifacts do not share consistent bibliographic practices. Both works mentioned above, Matt Kirschenbaum’s Mechanisms — a treatise on the oft-overlooked ambiguities in the expression of digital data — and Nick Montfort and Ian Bogost’s Racing the Beam — a platform study into the inner workings of the Atari 2600 — come from the same publisher, are intimately involved with the technical distinctions of computer software, and do not share a consistent practice in their bibliographies [Kirschenbaum 2008] [Bogost and Montfort 2009].
Racing the Beam is a part of a larger series of works on specific platforms. Each book investigates the constraints that a particular platform imposes on the expressive potential of its software. Each book also takes a different position on the placement, organization, and depth of its enumerative bibliography of games. Some volumes, like Jimmy Maher’s on the Commodore Amiga, eschew any explicit listing of the games referenced in the text [Maher 2012]. By contrast, works like Altice’s own I AM ERROR, on the Nintendo Entertainment System, adopt meticulous, platform- and media-format-specific reference schemes.
Now, given that there is not a standard set of bibliographic and citation practices for software, and that most major sources of such practices — like the MLA, the University of Chicago, and surprisingly even the ACM — lack any guidelines for software bibliography, it is perhaps not unexpected that academic book publishers and authors do not maintain consistent practices.[11] The work of the Game Metadata and Citation Project (GAMECIP), which looked at hundreds of game studies citations across a variety of online and print sources, also validates Altice’s (and our) assumption about the lack of consistent practice. In fact, even within the same journal, Game Studies, which does have an explicit bibliographic policy, there was still lax enforcement of descriptive citation practice.[12]
Altice also links bibliography and citation to descriptive concerns. He notes,

As a Famicom scholar, I may possess the terminology to describe that platform’s media but meanwhile lack the platform-specific knowledge to properly cite a PlayStation 2 game. . . . Granting each [reference] its due description poses a sizable research challenge. One solution is to build up a body of platform-specific descriptions that others may use as a model . . . but such shared knowledge will take time and work.  [Altice 2015, 337]

Computer game bibliography is distinct from that for other media forms mainly in the complex of technical requirements needed to retrieve the object. GAMECIP’s platform and format vocabularies, outlined in prior work [Kaltman et al. 2015], speak to Altice’s call in a limited way by attempting to codify and standardize some basic descriptive information for computer games. The larger issue, however, is that “rich bibliographic records require a baseline technical understanding of the objects they describe” [Altice 2015, 336].
For game scholars concerned with the technical underpinnings and object materiality of software, each new platform, format, and data configuration incurs a significant descriptive cost. For items in Altice’s enumerative bibliography, each specific game is listed according, in part, to the configuration of components inside a Nintendo Entertainment System Game Pak, and in the case of emulated sources, the header information of a particular game data file. Clearly, for his arguments to hold, this level of depth is necessary, and it would benefit future scholarship if others could take advantage of his classification schemes or even extend them into their own specific sub-domain of software.
Altice and others in the platform studies community are more concerned with the material and technical conditions of games than other historical scholars. What works for platform studies might be overkill for other subdisciplines. However, at the very least, the technical information provided in a bibliography should be correct, and should involve a level of detail specific enough to allow an unacquainted reader a fair chance to recover the source.[13] The lack of consistency in the description of computer game sources can damage the legitimacy of game historical arguments. As mentioned in the last section, one key way that a citation can fail is through a mistaken presupposition about the source it is referencing. This misalignment between the author’s expectation — or recollection — of a game play experience and that of a future reader is only exacerbated by incomplete and inconsistent citation practice.

3.1 Presupposition of DOOM!

One significant difficulty in game citation is that games are not as recoverable as other media forms. Many institutions do not have software collections, and those that do struggle against the technical and material constraints of hardware maintenance and access. When these limited access scenarios collide with a lack of rigor in citation practice, the result is that many outputs of game scholarship rely on only the barest descriptions for games. They are used more as pointers to the concept of a particular game, as presupposition, than to an emphatic, playable instance of one.
To take a particularly salient example, let us return to Dan Pinchbeck’s book DOOM: SCARYDARKFAST. This book — a good example of design-focused phenomenological game study — relies, almost exclusively, on presupposition of game citations. The work contains manifest citations, mostly through in-line references, to 130 other computer games. Most are used in passing to articulate how a particular structural, thematic, or kinesthetic element from each game relates to those of Doom. The in-line references are of the form (Title, Year), leaving the reader to fill in the blanks based on their assumed knowledge of each title. Furthermore, given the breadth of games mentioned, it is likely that nearly all readers have not played, or at least not recently played, many of them. The references hang on a presupposition of past experiences with the titles, and hopefully they still resonate in ways commensurate with his arguments. The references are, as we mentioned above, presupposed shorthand for the shared played experience of both author and reader.
To illustrate how this form of manifest, presupposed citation functions, take this set of paragraphs describing the progress of the first-person perspective from Pinchbeck’s book:

We need to consider the context into which DOOM arrived. The very first FPS game was Maze War, created by Steve Colley, Howard Palmer, and Greg Thompson (and other contributors) at the NASA Ames Research Center. Colley estimates that the first version was built during 1973, as an extension of the earlier game Maze, which offered a first-person exploration of a basic wireframe environment. At some point during ’73 or ’74, networked capability was added, enabling multiplayer FPS play. The genre was born out of networked deathmatching. After Thompson moved to MIT, he continued to develop Maze War, adding a server offering personalized games, increasing the number of players to eight, and adding simple bots to the mix. Twenty years before DOOM, all of the prototypical features of the FPS were in place: a 3D real-time environment, simple ludic activity (look, move, shoot, take damage), and a basic set of goals and win/lose conditions — all this and multiplayer networked combat.

Around the same time, Jim Bowery developed Spasim (1974), which he has claimed to be the very first 3D networked multiplayer game. Spasim pitted up to thirty-two players (eight players in four planetary systems) against one another over a network, with each taking control of a spaceship, viewed to other players as a wireframe. A second version expanded the gameplay from simple combat to include resource management and more strategic elements. Whether or not Bowery’s argument that Spasim precurses Maze War and represents the first FPS holds water, its importance as a game is undiminished — even if for no other reason than because Spasim is a clear spiritual ancestor of Elite (Braben and Bell 1984) and its many derivatives. It perhaps even prototypes a game concept that would later spin out into combat-oriented real-time strategy (RTS) or even massively multiplayer online (MMO) gaming.

What certainly differentiates Spasim from Maze War is the perspective. Like other early first-person games, such as BattleZone (Atari 1980) and id’s Hovertank 3D (1991), the game is essentially vehicular, with no representation of the avatar onscreen other than a crosshair. It is interesting that, aside from occasional titles such as Descent (Parallax 1995) and Forsaken (Probe Entertainment 1998), the genre very swiftly settled down into the avatar-based perspective, abandoning vehicular combat more or less completely. It’s also interesting that contemporary shooters often opt for a shift to third-person when including vehicles, such as with Halo: Combat Evolved (Bungie 2002) or Rage (id Software 2011). Half-Life 2’s (Valve Software 2004) first-person car sequences are actually quite unusual. [Pinchbeck 2013, 6]

These three paragraphs reference twelve games spanning a period from 1974 to 2011. Doom does not receive a full in-line citation since it is the topic of the book, and is addressed with in-line references in a previous section. Ignoring the general argument and focusing only on the citations and their relationship to the assertions being made on their behalf, we already encounter some significant issues.
Firstly, the citations are not particularly specific. Descent, for example, was released in six different versions for three different platforms in three different localities in 1995 alone.[14] The description “(Descent, 1995)” then does not provide enough information for a reasonable assumption about the particular version Pinchbeck played (or presupposed). Second, the citations presuppose a significant amount of knowledge on the part of the reader. When Pinchbeck remarks, “Spasim is a clear spiritual ancestor of Elite . . . and its many derivatives,” we (the reader) are required to understand — through presupposition — that Spasim, a first-person, cockpit-oriented space exploration game, is echoed in Elite, a similarly perspectived first-person space simulation game. Clearly, that assertion requires a familiarity with both games, and by extension knowledge of other Elite derivatives. Finally, even though a game might be recoverable through the sparse citation provided, much of the discussion still presupposes memories of experiences of play, both in Pinchbeck and in his readers. By referencing the vehicular segment of Half-Life 2, Pinchbeck is implicitly requiring a future researcher, should they want to experience that sequence, to spend many hours of game time reaching and evaluating it. There is nothing inherently wrong with this, but we must highlight the extent of the assumptions being made of the reader. Either you already have contemporary experience of Half-Life 2, and incidentally remember this game play sequence, or you are relying on Pinchbeck’s memory of his contemporary play. Both positions presuppose a temporally situated accumulation of played experiences that aligns with the year of this work’s publication. Future researchers, removed from a contemporary, played understanding of the game, must assume that Pinchbeck is not committing any of the intertextual no-nos — like misinterpretation or incorrect presupposition — listed in the previous section. Otherwise, they will need to recover Half-Life 2 for themselves, and assume that their version contains the vehicular sequence in question and that it is reachable through play.
While it may seem that we are being a bit drastic in this example, we cannot take for granted that our own presuppositions about Half-Life 2 or any other game discussed in the quote above (or, for that matter, in this text) will align with the presuppositions of future scholars.
Pinchbeck’s references are intended to evoke a general idea of a specific title, relying primarily on the presupposition of reader knowledge. The referenced games in these cases stand in metonym for their specific constitutive function in the text: Halo, Rage, and Half-Life 2 for their comparative vehicular segments; Spasim, Elite, Descent, and Forsaken for their 360-degree cockpit viewpoint; and Hovertank 3D, BattleZone, Maze, and Maze War for their advances in first-person representation. Concrete, retrievable instances of these games are secondary to the structural or thematic conceptualizations of them as presupposed into Pinchbeck’s argument.
In contrast, recalling Altice’s more extensive, object-based citations, we see that many of his claims are rooted in the minutiae of a single platform and its technical constraints. For Altice, his argument is dependent on the specifics, on the material differences between games rather than the higher level concepts they can evoke. He commonly uses emulated versions of games to illustrate points about Nintendo Entertainment System rendering techniques. Because rendering functions differ between the many versions of, say, Super Mario Bros., Altice’s citation of a specific version of the game’s data is important: his analysis would not be possible — or legitimate — without it.
The lesson is not that anything Pinchbeck is saying is particularly incorrect, but that the onus for clarification is heavily weighted toward the reader, and in particular, a presupposition about the reader’s accumulated knowledge of games. Pinchbeck’s work is intended for a game-savvy audience, and is certainly not attempting to be a rigorous, formal history of Doom.[15] But this type of underspecified, game-as-shorthand reference structure is endemic to a significant swath of game studies. It will also make these types of work less relevant the farther they are displaced from the contemporary titles to which they refer. Again, in the spirit of Altice’s observation: “Most contemporary game scholars are old enough to remember most of the entire historical arc of computer games, so further clarification for them, and audiences like them are not currently required.” Those in the future, unversed in the early history of computer games, will need to do a significant amount of work to recover all of the implicit game history embedded within Pinchbeck’s references.
It is also worth noting that Pinchbeck’s citations are closer to the norm in current game citation practice. The GAMECIP study of citation practice analyzed citations in over 300 publications relating to computer games. Across those publications, 102 different styles of citation were found, and of those only 31 included any information about game platforms. A majority simply focused on title, developer, and year of publication. The main problem with this lax citation practice is that without at least a foundational set of descriptive elements tied to some expression of technical constraints and requirements, locating and replaying games referenced in scholarly works might be very difficult in the far future.[16] Altice’s end of the spectrum, with its acknowledgment of computer game materiality grounding out into the literal byte order of a file header, is more historically secure in theory but requires a level of technical understanding that might turn off scholars with less techno-materialist concerns.
In the end, a probable solution is to provide a minimally viable set of descriptive bibliographic fields based, again, on assumptions of reasonable compatibility and retrieval. These minimum specifications and recommendations for citation practice can be found in forthcoming work from the GAMECIP project. The main thrust, however, is that for the legitimation of any argument made about or through a game, there is a requisite depth of citation that aligns with the claim. As seen above, Pinchbeck’s arguments deal with apparent surface characteristics of games — characteristics that would hopefully be apparent to anyone playing one of the games cited. In the case of platform studies arguments, the claims depend on citation at a different depth, one close to the actual material existence of the program.
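As a rough illustration of what such a "minimally viable" descriptive record might contain, consider the following sketch. The field names are ours, chosen for this example; they are not the forthcoming GAMECIP specification, and a real record would follow that project's controlled vocabularies.

```typescript
// Hypothetical minimal citation record; field names are illustrative,
// not the GAMECIP specification.
interface MinimalGameCitation {
  title: string;             // "Descent"
  developer: string;         // "Parallax Software"
  year: number;              // 1995
  platform: string;          // controlled platform term, e.g. "MS-DOS"
  mediaFormat?: string;      // e.g. "CD-ROM", "3.5-inch floppy", "disk image"
  version?: string;          // version/region string found on the copy consulted
  sourceConsulted?: string;  // e.g. "emulated disk image from an institutional collection"
}

// Even this much would begin to disambiguate "(Descent, 1995)" among its
// many contemporaneous releases. Values below are illustrative.
const example: MinimalGameCitation = {
  title: "Descent",
  developer: "Parallax Software",
  year: 1995,
  platform: "MS-DOS",
  mediaFormat: "disk image",
  sourceConsulted: "emulated copy",
};
```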
Hopefully, this section has illuminated some of the problems with current manifest intertextuality in games, most specifically that, due to the current limitations of textual description, the field of game studies depends on a presupposition of played knowledge that is not tied to any specific material instances of games. The next section looks at this problem from the perspective of constitutive intertextual relations and provides a basis for our partial solution in the form of a citation tool for executable reference. This constitutive work is the result of confronting the limitations of citation as they have been described so far: primarily, the cases in which even the citation of specific, material data is not enough, or not of a kind with the expression of new historical claims.

4 Reduction and Intertextual Expression

Intertextuality is a fickle phenomenon. As noted by Fairclough above, when making one text manifest within another, work needs to be done to mold the “other text” in a form commensurate with the discourse surrounding it. Otherwise the textual surface is disturbed, and the flow of thought for the reader becomes more difficult to reconcile and interpret. (Of course in some instances, this might be desired as a way to remark on the disjunction between different textual forms and different ways of reading.) Fairclough looks at newspapers, medical interviews, and other forms of discourse dissimilar from the academic text within which he is operating. He focuses on the ways in which each discourse’s intertextuality contributes to its existence as a distinct genre, a distinct type of expression. This notion of constitutive intertextuality, the ways in which different discourses make use of and interact with other texts, is a fundamental aspect of discourse analysis. The constitutive act of bringing together “other” texts through manifest actions like citation, as noted by Ken Hyland in his study of academic citation practice,

links text users to a network of prior texts depending on their group membership, and provides a system of coding options for making meanings. Because they help to instantiate or construe the meaning potential of a disciplinary culture, the conventions developed in this way foreclose certain options and make some predictions about meanings possible.  [Hyland 2000, 156]

The organization of disciplines and regions of thought and inquiry, both in the humanities and the sciences, is then dictated through the intertextual relations of publications. These publications organize into networks that then enforce and negotiate the boundaries of disciplines, and the specific intertextual discourse required for group allegiance. We argue that the intertextual surfaces of these disciplinary genres make use of conventions that can preclude certain types of meaning and the expression of certain types of thought.
By relying on standard conventions of manifest intertextuality, and therefore prescribing limitations on the expression of academic claims, we are preventing certain discussions from taking place. For the purposes of this article, we are most concerned with the explanation and historical positioning of computer games as systems and technical objects. We remarked in the previous section that references stand in metonym for more complex thematic components and system interactions. In a sense, the discussion was really about the limitations of current textual discourse about games — discourse that relies on the narrativization of game play or the accumulated knowledge of the reader as player. That game academics use text as the major form of expression is understandable. Michael Lynch, when discussing scientists’ use of text over visuals, notes:

The fact that writing is the dominant medium of academic discourse is not incidental; while pictorial subject matter is alien to written discourse, and requires a reduction to make it amenable to analysis, written subject matter can be iterated without any “gap” with the textual surface that analyzes it.  [Lynch 1988, 201]

Games are even more removed from the textual surface than the visualizations Lynch is investigating. Their insertion into textual discourse filters through many different levels of “reduction.” Lynch’s argument focuses on the reduction of the worked scientific reality of the life sciences to the written page. “Scientific research teams are described as agencies of mediation between an uncertain and chaotic research domain and the schematic and simplified products of research that appear in publications” [Lynch 1988, 204]. Researchers select and distill the appropriate data and reduce it to visualizations and textual description to align with the constraints of printed publications. This reduction of the chaos of a research project to the streamlined and validated prose of research publications is of a kind with the work of some game historical scholars in their attempts to better mind the “gap” between the played expression of games and their appearance and function in text-based academic arguments. Most of the time, game studies uses presupposition as a form of reduction, a way to fit the complex system interactions inherent to the experience of game play into readable discourse. This approach, however, largely limits the field of discussion for these games to their existence as “objects played by a researcher in the past,” preempting other means of using games in argument.
The notion of reduction is important to the overall discussion of intertextual surfaces and their effects on comprehension and expression. Reduction functions on a spectrum aligning with the goals of a particular discourse. The reduction from computer game to in-line textual citation is the most extreme form. Other reductions make use of, in progression: still images, sequences (or juxtapositions) of images, video, interactive visualization, and, in limited cases, emulation. In monographs, and in examples like Pinchbeck’s and Altice’s above, only still and sequenced images are available. Pinchbeck narrates key areas of Doom with comparative juxtapositions of different game versions, and single images of key aesthetic and level design features. Altice makes extensive use of emulator tools for the visual display of internal memory states, or again, like Pinchbeck, comparative juxtapositions of key sequences or different passes of a rendering function.
Outlining the full extent of image usage in game studies monographs is well outside our scope, but the important consideration is the jump in textual mediation that occurs in the transition from collections of images to video, interactive visualization, or emulation. The textual surface described for the majority of this article is one of physical print and the constraints of its intertextuality. The newer forms of reduction are not mentioned by Lynch because they still remain unleveraged in the sciences — there are very few online publications in any field that leverage digital documents’ new textuality. Digital humanists cry out for more active, digital intertextual presentation, but their codified expression is only standardized in a handful of online publications.[17] Linking back, the expression of academic discussions of games is then dependent on the forms of manifest intertextuality that are available and commensurate with the dominant constitutive discourses that currently exist. When people want to engage with games in ways that are not commensurate with textual description, they make use of less encumbering reductions. In our case, when trying to either explain embodied system interactions or complex dynamic processes, it is helpful to move beyond textual discourse as the only tool in the tool chest.

5 Types and Examples of Reduction

Taking a step back, there are two key considerations at work in our discussion. The first is the intertextual surface of discourse and how it makes use of manifest actions, like citation, quotation, and images, in the constitution of a text. The flow of an argument is aided by the integration of “other” texts such that the discursive flow is smooth.[18] Whenever one is reading through a discussion and needs to refer to a figure, table, or other interstitial manifestation, there is then, borrowing from Lynch, a “gap” in the textual flow and thus in the discursive progression. An aim for any apparatus that allows for new types of manifestation must be to consider how that manifestation affects the constitution of a text, and the ways that manifestations can augment or potentially further distort a discursive surface. This surface is also, with advances in online technology and distribution methods, no longer just a physical sheet of flattened, dead wood, but the transmediatic — interactive — surface of the computer screen and networked document. The medium of expression, in the case of computer games and systems, is now of the same stuff as the medium being described and discussed. There is potential for a better and more forceful alignment of textual surface with digital system expression.
The second key consideration is how reductions assist intertextual integration to enable new forms of argument. We are not the first to venture down this intermedial path, and by illuminating some further examples we can highlight the new types of expression that we hope to enable with the to-be-described tool. This section is mainly devoted to an elaboration of the second point about methods of reduction in light of the first’s concern for intertextual alignment and comprehension. The following discussion includes a collection of related and motivating work.

5.1 Video

Recall that the methods of reduction not discussed by Lynch are embedded video, interactive visualization, and emulation. “Embedded” is key since this allows us to present them as manifest intertextual objects (and later use some of the discourse analytical apparatus to discuss their effects). Each reduction compresses the embodied act of play into a form amenable to the constraints of the particular intertextual surface being created. Video reduction is fast becoming one of the major means for the dissemination of knowledge about how games are played, and about the surface characteristics of their presentation to the player. As a phenomenon, this is beneficial to the progress of game historical study since at the very least there will likely be some video remnants of game play available to future preservation efforts.[19] The availability of video has also prompted some academics to begin experimenting with embeddable video as a means for discussion. For example, Doug Wilson, in an extended discussion of the game Spelunky for the Polygon website, makes extensive use of embedded YouTube videos to support a discussion and walkthrough of one of the game’s most difficult achievements, a no-death solo eggplant run.[20] By interweaving textual description with multiple embedded videos keyed to specifically salient moments, Wilson telegraphs a new type of intertextual surface where narrativized gameplay description is aligned with video supports. The technical mastery at work is made more apparent and visceral through the accompanying videos.

5.2 Visualization

Interactive visualizations as embedded arguments, the next step up the reduction ladder, are not a significant practice in academia. Certainly, visualizations — in the form of static images embedded in text — are very common in the physical and social sciences, and in humanistic analysis of textual corpora. Woolgar and Lynch preside over two volumes dedicated to representation in scientific practice that are mostly focused on the constitutive power of manifest visualizations in scientific texts. The fact that the volumes are separated by 30 years indicates the continuing influence of visualization on scientific work. Analysis of the effect of visualizations on humanistic practice is probably most visible in the attention to algorithmic criticism in the works of Stephen Ramsay — for corpora analysis — and David Staley — in historical visualization — among others [Ramsay 2011] [Staley 2015]. However, the use of non-static, interactive visualization of dynamic systems is absent from the intertextual presentation of findings in most scholarly fields. The groundwork is actually being laid more by those interested in the pedagogical application of visualizations.
Bret Victor and his collaborators are at the forefront of so-called “explorable explanations,” juxtapositions of online text and embedded interactive visualizations designed to reveal the functionality of systems [Victor 2011b].[21] Some examples include a long explanation of the basic dynamics of simple animated pathing, and Vi Hart and Nicky Case’s interactive model of Thomas Schelling’s group segregation theories [Hart et al. 2016]. Each example works to mitigate the “gap” between the textual presentation inherent to the browser window and the machinations — and interactive features — of the supporting visualizations. Victor’s work espouses a pedagogical philosophy of system comprehension through manipulation and tacit experience. Readers are invited to read the expository prose, and then play with the interactive models of the phenomena on the page. The hope is that through tacit manipulation and play a deeper understanding of the underlying system will develop.
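Since the mechanics behind such an explorable explanation are easy to lose in prose, the following minimal sketch (our own illustration in plain JavaScript, not Hart and Case's code, with an assumed one-third satisfaction threshold) shows the kind of update rule their segregation model animates: agents on a grid relocate whenever too few of their neighbors share their group.
    // Minimal sketch of a Schelling-style update rule (our illustration only).
    const SIZE = 20, EMPTY = 0, THRESHOLD = 1 / 3;
    // 0 marks an empty cell; 1 and 2 mark agents of the two groups.
    const grid = Array.from({ length: SIZE }, () =>
      Array.from({ length: SIZE }, () => Math.floor(Math.random() * 3)));

    function isUnhappy(x, y) {
      const group = grid[y][x];
      if (group === EMPTY) return false;
      let same = 0, occupied = 0;
      for (let dy = -1; dy <= 1; dy++) for (let dx = -1; dx <= 1; dx++) {
        if (dx === 0 && dy === 0) continue;
        const row = grid[y + dy];
        const neighbor = row && row[x + dx];
        if (neighbor !== undefined && neighbor !== EMPTY) {
          occupied++;
          if (neighbor === group) same++;
        }
      }
      return occupied > 0 && same / occupied < THRESHOLD; // "unhappy" below the threshold
    }

    // One simulation step: every unhappy agent moves to a randomly chosen empty cell.
    function step() {
      const empties = [];
      grid.forEach((row, y) => row.forEach((c, x) => { if (c === EMPTY) empties.push([x, y]); }));
      grid.forEach((row, y) => row.forEach((c, x) => {
        if (!isUnhappy(x, y) || empties.length === 0) return;
        const [ex, ey] = empties.splice(Math.floor(Math.random() * empties.length), 1)[0];
        grid[ey][ex] = c;
        grid[y][x] = EMPTY;
        empties.push([x, y]); // the vacated cell is now available
      }));
    }
Run repeatedly, even this mild preference produces starkly segregated clusters, which is the counterintuitive dynamic the interactive version lets readers explore directly.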
Victor refers to his online visualizations as “reactive documents” that allow the reader to test out the various models presented and gain insight through those interactions. The goal is to develop an “active reader,” someone who uses “the author’s argument as a springboard for critical thought and deep understanding” [Victor 2011a]. Active reading owes much to the foundational pedagogical insights of Seymour Papert. Papert devoted his career to the development of computational tools to aid in mathematical and algorithmic thinking [Papert 1980]. He also believed that tacit experience with reactive systems would support better modeling capabilities in confronting new problems and challenges. He traced this potential to a youthful fascination with gears that implicitly enforced a tacit understanding of complex differential systems. His gears functioned as a model that enabled a better understanding of math “than anything I was taught in elementary school” [Papert 1980, vi]. The ability to present information in different and interactive ways led to a new model for the exploration of further knowledge. Papert linked this notion with the educational theories of Jean Piaget that espoused the “assimilation” of concepts into a learner’s world view. Papert’s gears functioned as an “affective model,” a mapping (assimilation) between their relational dynamics and other mathematical concepts. Victor’s work is then an attempt to embed these “affective models” as interactive visualizations into online documents.
The design and application of affective models that encourage comprehension of technical concepts is important for our larger argument about the potential for new forms of intertextuality to support new argumentation. Victor presents a prototypical means of doing more with online texts and of engaging the reader with the systemic processes under discussion. Victor’s reactive documents are a new discursive surface, one populated with interactive features aimed at creating a new type of active reader. They are also the result of a reduction from larger, complex system dynamics to concentrated, pedagogical visualizations designed to support textual arguments. The reduction, however, is much richer than an image or video, since it can support the creation of a tacit, embodied argument. Instead of referencing an image or video of a system’s processes, a smaller part of the system can be introduced into the discourse describing it — or, better yet, the interactive surface can be used as an argument in and of itself for a particular point of view or affective process.

5.3 Emulation

Before discussing emulation as a form of reduction — in line with the progression outlined above — we need to clarify some basic technical distinctions and provide some related examples. This is necessary because the use of emulated systems as a form of argumentation about software history — or really any topic for that matter — has not before, to our knowledge, been explored or theorized.
Emulation, as a computational process, is the use of one system in reproducing the functionality and output of another.[22] Emulators, the programs responsible for emulation, are used in many corners of the software industry to allow for testing of applications on a variety of devices. For instance, most mobile phone applications are not developed on mobile phones. They are programmed in emulation on laptops and desktops more conducive to long bouts of typing and heartache. As noted by Nathan Altice, emulation has a long history tracing back to the historical origins of software development [Altice 2015, 293]. In the 1960s IBM developed the first sets of commercial emulators to allow software written for one mainframe model to be compatible with another.
More recently, in the 1990s, enthusiast game communities began to create emulators of popular game systems, like the Nintendo Entertainment System. Targeting then-current operating systems like Windows 95, these emulators allowed for the (re)play of older titles that might no longer be available for purchase, were released in foreign territories, or might otherwise be difficult to acquire. The “games” in this usage were data dumps extracted from physical cartridges and other forms of magnetic and optical media. For cartridge systems, these files are known as “ROMs” since they are copies of a cartridge’s read-only memory. A more general term for data extracted from a media format is a “data image,” more commonly shortened to “image.”[23] Emulators operate on data formatted for a specific system, and over the last 20 years, emulation development and the extraction of legacy data from physical formats have flourished. Emulators now exist for thousands of different computational platforms, and are a ripe source for the exploration of software history. That is, assuming one ignores some significant legal issues.[24]
Important for our discussion of manifest intertextual presentation are recent developments in web browser technology (and speed improvements in general) that now make it feasible to run emulators inside online documents. The potential for embedded emulation has not yet been exploited or thoroughly explored. However, because most web browsers now run highly optimized JavaScript compilers, in-browser emulation is a growing phenomenon. The most prominent example is the JavaScript Multiple Arcade Machine Emulator (MAME) project. MAME began as a system for emulating arcade machines. Complementarily, the Multiple Emulator Super System (MESS), using a fork of MAME’s code base, provided support for most personal computers. MAME was open sourced in 2016, and MAME and MESS are now merged. The combined infrastructure supports thousands of different arcade machines and personal computers released over the last 40 years.[25] Although MESS was a large C++ project, the Internet Archive, along with a collection of motivated developers, ported key components of it — before its integration with MAME — to JavaScript to create the Internet Arcade, a playable archive of the Internet Archive’s imaged software collection. After the open-source combination, JavaScript became one of a number of compilation targets for the entire MAME-MESS code base.
Similarly, many other emulator projects began supporting compilation to JavaScript. These included DOSBox, for legacy x86 MS-DOS machines; FCEUX, for the Nintendo Entertainment System; and Snes9x, for the Super Nintendo Entertainment System.[26]
Now that emulation is available in browsers, it is possible to place both a running program and the text describing or commenting on it onto the same intertextual surface. In the progression of reductions from images to interactive visualization, there was always a clear notion of how each step still represents a deficient copy of some object or system outside the text. For images and video, as mentioned by Lynch above, researchers put in a significant amount of effort to both make their samples more photogenic and thus more interpretable when presented on a textual surface. In dynamic visualizations, there is an implicit understanding that we are being presented with part of a system that has been distilled for comprehension and reader engagement. The very act of visualization is to provide a specific perspective (of many) on the data or system under discussion. With emulation, there does not appear to be a similar process at work. While the emulation is a program designed to conform to the constraints of a digital document, it is not a distillation of a system but a full version of the system itself. This challenges the basic premise of intertextuality presented above, that the texts made manifest in and reduced to a specific discursive surface are under the basic control of the author.[27] While some manipulation or presupposition — in cases where the author is misremembering or functioning with a divergent set of assumptions in regards to the reader — is always a potential issue, that the manifest references might have a mind or operation of their own was never imagined. In bringing emulation into the text, we therefore encounter a new type of intertextual interaction, and with it a different model of reduction — a model that requires significantly more effort in the legitimation of claims and the interpretive exercise. Presupposition of played experience cannot exert the same power because the system — the same system — is available to both author and reader.
The imbrication of emulation into argumentative texts has only been lightly attempted in the past. Nick Montfort wrote an article for the electronic book review that embedded a Java plugin-based emulation of the Infocom game Deadline [Montfort 2000].[28] However, his argument does not particularly integrate the emulation so much as position it in the article to simply show such a move is possible. Others have used online emulation for deeper systemic analysis. One notable example is Ben Fry’s early online visualization of the internal memory state of the Nintendo Entertainment System. The “deconstructulator” is a Java-based in-browser emulator based on a modified version of the NESCafe emulator [Fry 2003].[29] The visualization presents three different windows: a rendering of the full sprite memory of Super Mario Bros., a playable emulation of that game, and the active memory contents of the NES’s PPU (Picture Processing Unit), a component that manages sprite rendering and positioning on screen. As the player plays Super Mario Bros., the contents of the sprite map highlight the currently active sprites in use, and the PPU map shows the current state of each 8x8 tile in the PPU. By moving Mario around, the player can see how the different animations and changes to the game’s background, enemies, and platforms modify the NES’s system memory. Fry designed the piece for his “Visually Deconstructing Code” series, a set of small projects aimed at unearthing some of the hidden processes at work in NES code.
Other examples of emulation as a revelatory mechanism exist within the communities devoted to forms of what James Newman refers to as “superplay” [Newman 2012a]. Activities like speed-running (both human and tool-assisted), glitch-hunting, sequence-breaking, and other forms of “performative mastery” of games benefit from research conducted with emulation. Many community emulators support tools for memory analysis and even scripting languages for the live manipulation of a running game. This allows player-performers interested in, for example, shaving that last second off of a run or getting past a boss without attacking, to dig beneath the representational surface of the game and mine its system for potential solutions. Some online streamers, like Clyde Mandelin, write custom emulator modifications that allow for live streaming of both their gameplay and aggregated statistics or interactive visualizations of the underlying system.[30] As we will discuss below, it is becoming clear that a community of practice is developing around the expressive potential of emulators. It’s also a sign that the products of community historical efforts are becoming more aligned with digital humanist insights about technical collaboration between academia and amateurs.[31]
However, it is also necessary to note that, in placing a running program into an online text, its reduction to that surface may suggest a raft of potentially misleading assertions about the historical play experience. As outlined extensively in the work of preservation-minded historical researchers, emulation, in its erasure of the original executable context, denies the experience of the original hardware [Fernández-Vara 2014] [Lowood 2011] [Newman 2012a] [Bogost and Montfort 2009] [Kaltman et al. 2015]. The modern web browser, as a displayed surface, is very different from an Atari-era CRT, and most modern machines do not have a way to interface with original peripherals. Additionally, many emulators try to make the played experience smoother by modifying speed, at some cost to accuracy.[32]
They also remove old constraints, such as the need to swap disks to load parts of a program piecemeal, along with internal memory limitations. However, not all aspects of emulation are a historical loss, since the position of the emulation as running inside a host process allows for the introspection and revelation mentioned in the above examples. The ability that many emulators provide to save and load memory states is also, as we will see, a boon to players and researchers hoping to encounter difficult, confusing or novel locations inside games.
A key note about the reductions above is their ability to bring something from “out there in the world” into the text. Usually those studying academic discourse, or the social construction of academic arguments, focus on how that external evidence is transformed into a manifest object in the text. For scientific work, we have mentioned both discourse analysis and science and technology studies as fields that theorize on the reduction and distortion of tacit laboratory knowledge into written discourse.[33] In the humanities, citation and reduction are less epistemologically fraught, since the aim of humanistic discourse is not generally to re-organize some empirical object or finding into a textual form suitable for publication.
Within history, the use of manifest intertextuality is critically important to the sustenance of the field, but the act of manifestation does not usually imply a reduction of a finding “out there,” the “out there” of historical sources being mainly other texts. Rarely is the historical object, if there is one, reduced to a form commensurate with the textual surface. In fact, much historical work on objects specifically addresses this issue, a good example being John Law’s work on aircraft design that explicitly constructs different historical strands of documentation to reveal the fractal nature of the object in question. In his case, he looks at the construction of the British TSR2 strike and reconnaissance aircraft, and how it exists as an object of engineering and marketing, and as an embodiment of the projection of hegemonic force. The aircraft is viewed along different evidentiary axes to support a conclusion about how objects exist in myriad ways depending on how they are documented and narrativized. This again ties back to ideas from discourse analysis, mainly in how the constitutive intertextuality at work in the history of science and technology defines the objects of analysis; a summary from Steve Woolgar:

Surely, it is often said, it is absurd to say that we cannot distinguish between a thing and what is said about that thing. But the constitutive view does not prohibit such distinctions. It offers us a way of seeing these distinctions as actively created achievements rather than as pre-given features of our world. In particular, the distinction between talk and objects-of-talk is seen from the constitutive perspective as the upshot, rather than the condition, of discursive work.  [Woolgar 1986, 314]

The fundamental takeaway is that previously, describing any technical system as a historical object necessitated various forms of reduction and other intertextual strategies to remediate and insert it into a text. With embedded emulation, and to a lesser extent dynamic visualization, the presence of the system itself within the text must now be reconciled. When John Law described his aircraft, he could not bring the aircraft into the text and let the reader hop into the cockpit. With computational systems increasingly the site of construction and reception for scholarly work, and with technical historical objects that are also computational systems, we can literally transcribe discussed objects into discourse and invite the reader to take the flight stick.

6 Back to Citation and Archives

In bringing a non-text-based object into textual discourse — like the reductions of image, video, visualization, and emulation above — there is a key link to archives and citation that has not been made explicit. In the case of online documents that incorporate various reductions, those texts are not singular objects but networked organizations contingent on access to the various sources of reduction. If one prints an image alongside text, the image is now part of the textual form, and is, from an archival standpoint, part of the same object. With online work, every page of information is an aggregate object. The basic markup for the page comes from one source, the styling of that page from another, and all the various images and other embedded entities from still others within that same domain or from some other (hopefully trusted) source. When things are embedded — as images, videos, visualizations, and emulations are — they necessitate and depend on the existence and maintenance of stable links to recover their data.
The maintenance of these links is a significant issue for the stability of knowledge online. Whenever a link leaves its local namespace (assuming that internal network links are maintained, which is not always a given), it relies on the existence, capabilities, and restrictions of a remote hosting repository. For videos, most content links resolve thanks to the embedded link architectures of mass-scale video sites like YouTube or Vimeo. This means the embedder need neither maintain their own video server nor provide the bandwidth necessary for playback. It also removes responsibility for intellectual property management and ties access to embedded content to the whims of the content provider. In the Spelunky example above, Wilson toyed with the constraints of YouTube’s embedded video player to reveal specific salient content inside the game. That action was only made possible by the affordances of YouTube, the repository hosting the content. In the case of historically stable online academic discourse, it should be apparent that any new ability to share or link to digital data incurs a commensurate necessity for a functional and stable repository. The current solutions for video leave a lot to be desired given that they are bound to the corporate imperatives of actors not emphatically concerned with preservation or link stability.[34]
The tool below is an attempt to organize a prototypical archive for embedded emulated content, and to reconcile some of the manifest intertextuality present in game historical work with a more stable set of practices regarding citation and linking. In the case of embedded emulation, the data has issues similar to those of video: the IP rights for the distribution of streaming copyrighted gameplay data need to be correctly managed, and the embedded content must be presented in such a way as to make it useful for inclusion in texts. Considering the future scholarly use of emulated content also helps dictate what a speculative collection of such works would look like at a larger, institutional scale. Additionally, the consistent citation of this content, as an initial condition of the system’s functionality, should be a concern for any future work in the creation of links to new forms of digital expression and reduction.

7 A Tool for Descriptive and Manifest Citation of Games

This section outlines the design and functionality of the citation system in the Game and Interactive Software Scholarship Toolkit (GISST). GISST is a suite of tools aimed at helping with common game studies and game historical tasks, and includes a system for the management of manifest citation of both game emulation and game bibliographic references. The citation component of GISST — described below as the “citation tool” — is designed to partially address several of the issues presented above:
  1. The need for more consistent bibliographic citation information for computer games.
  2. An example use case for the placement and manipulation of various reductions of computer games into online text (in this case, images, videos and live emulation).
  3. The need for a managed archive of the reductions used in (2).
The citation tool functions on three classes of objects — games, performances, and game system states — and resolves issues 1-3 for each. Game objects are collections of data about a game. This includes both basic descriptive metadata (required for correct bibliographic entries) and links to executable data needed to run the game in browser emulation. Performance objects are records of games as played or viewed by a player or group. These records also combine viewable performance data with descriptive metadata. By “viewable performance data” we mean two possible things. One is a collection of image frames — GIFs or video — representing some situated act of play. The other is replay data — input stream recordings for emulators or replay files for a specific game engine.[35] Game system states are snapshots of a game’s run-time memory, either saved by the emulator as a separate file or extracted directly from a system’s memory.
The tool manages game, performance, and game state records in a database, and allows for the embedding of any of these in a hypertext (assuming executable or viewable performance data is available). The rest of this section briefly accounts for the inclusion of performance in our citation apparatus and then lays out the functionality and potential future work for the tool.
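To make these three classes of objects concrete, the sketch below renders them as plain JavaScript object literals. The field names are our own shorthand for the kinds of information just described, not GISST's actual database schema.
    // Illustrative record shapes only; field names are our own shorthand, not GISST's schema.
    const gameRecord = {
      id: "game-0001",                    // stable identifier used by citations
      title: "DOOM",
      version: "1.9 Shareware",
      platform: "MS-DOS",
      dataImage: "doom-1.9-shareware/",   // executable data needed for in-browser emulation
      emulator: "DOSBox (JavaScript port)"
    };

    const performanceRecord = {
      id: "perf-0001",
      game: "game-0001",                  // the game object this play session used
      performer: "anonymous player",
      video: "e1m1-walkthrough.mp4",      // viewable frames (video or GIF), or...
      inputLog: null,                     // ...an input-stream or engine replay file
      startState: "state-0001",           // emulator state when recording began
      endState: "state-0002"
    };

    const stateRecord = {
      id: "state-0001",
      game: "game-0001",
      description: "first_barrel: explosive barrel in E1M1",
      screenshot: "state-0001.png",
      memorySnapshot: "state-0001.sav"    // serialized run-time memory saved by the emulator
    };
Records along these lines are what the CLI described below writes into the citation database and what the web application later reads back.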

7.1 Game vs. Performance

The discussion above has mostly dealt with the citation of game objects as a means of presupposing their content and the contours of their gameplay. However, game performances as events are also commonly referenced in scholarly works. Performances result from two activities: games-as-performance, in the case of games tied to explicit geo-temporal contexts (alternate reality games (ARGs), installations, etc.), and gameplay performance. Gameplay performance is the play of a game that is not explicitly tied to a geo-temporal context, but that gains relevance from being situated in one. An example is a particular match at a fighting game tournament, where the event itself highlights game play as performance. The game in this case is not the operative site of performative relevance: its play at the tournament is. If the same match occurred in practice in a dorm room, it would not have the same significance.
Game performances as significant historical sources are well discussed in the literature. Clara Fernández-Vara, in her Introduction to Game Analysis, notes that, “we may want to analyze a game that is an event, a be-there-or-be-square type of thing, a performance” [Fernández-Vara 2014, 44]. She describes the need for secondary sources — “paratexts” like video or firsthand description — in helping to reconstruct and corroborate information about a performance. This sentiment is echoed in Henry Lowood’s work on the reconstruction of events that take place in massively multiplayer online games (MMOGs) [Lowood 2011]. In this case, the study of virtual world game play is more akin to anthropological work. The game itself, while it could be recovered and run through emulation in the future, is devoid of the community that created meaning through the performative space and affordances the world provided. Lowood remarks on the fallacy of an ideal “perfect capture” of every event and input supplied to the virtual world.
Even if we save every bit of a virtual world — its software and the data associated with it and stored on its servers, along with a replay of every moment as seen by players — it may still be the case that we have completely lost its history. The essential problem with this approach is that it leaves out the identification and preservation of historical documentation, and these sources are rarely to be found in the data inside game and virtual worlds or on the servers that support them [Lowood 2011, 118].
Even with access to game replay files, or reproductions through emulation, evidence of a game performance must also be paired with secondary information to substantiate and analyze it. Our inclusion of performance citations in the tool is to enable a link between the game object’s data and description, and further contextualizing performances. Additionally, the ability to embed emulation in line with historical performance video and description adds further potential for somatic contextualization of game play. By bringing the emulated system to the reader, we allow them to gain a sense of what Steve Swink refers to as “game feel,” an embodied understanding of the game system as felt through the act of play [Swink 2009].[36] Pairing this embodied understanding of a game play system with performances adds another level of intuitive understanding to historical game play acts.

7.2 Citation Tool

The citation tool has two primary components:
  1. A command line interface (CLI) responsible for the ingestion of game and performance data, the generation of citations, and the management of the citation database.
  2. A web application (the “app”) that enables the live emulation of ingested game data, the live recording of game play performances, and the live recording of computational game states.
For the rest of this section we will use CLI to refer to the first component, and “app” to refer to the second. Their functionality is significantly interrelated; for example, the CLI launches the app, and the app’s backend server calls the CLI for certain processing tasks. The next two sections provide a very brief technical overview of both components; we refer the reader to our prior work for a more detailed account [Kaltman et al. 2017]. Figure 1 illustrates the relationships between the various components.
Figure 1. 
GISST components and pipeline.
In the GISST pipeline, input resources (1) are fed to the CLI (2), which extracts their information (3) into an extraction table (for URLs) or the citation database (for performance and game data). The Web Application reads from the citation database (4) and the Indexer uses CiteState.js to create further citable resources (5). CiteState.js (the software powering the augmented Doom walkthrough from our introduction) can then use those resources’ permanent URLs (6) and its own citation function (7) to embed an executable program into a target HTML element (8).
Extraction begins with either local filenames or Universal Resource Identifiers (URIs). For games, the extraction files are either game data files or directories containing game data. Providing a URI to the game extraction command assumes that the linked resource hosts information about a particular game; currently, extraction supports game reference URIs for either MobyGames or Wikipedia. Performance extraction only accepts URIs from YouTube presently, but the design is modular and could easily support, for example, archives of game input sequences.[37]
Extraction obtains rough, noisy data from the given source or sources, with the assumption that a researcher will clean up the imported metadata. The extraction step is necessary to construct a stable citation because the information provided by a potential resource may exceed the constraints of the database's descriptive metadata scheme or require further disambiguation. As an example, the Wikipedia page for Super Mario Bros., originally released for the Nintendo Entertainment System, combines all information about the title, in all of its different versions and releases, onto a single page. Our Wikipedia extractor presents all of this information to the user and allows them to choose the particular version of Super Mario Bros. they wish to later cite.
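As a hypothetical illustration of this disambiguation step (the function below is our own sketch, not GISST's extractor code), a command line prompt might surface every release found on a reference page and ask the researcher to choose the one to cite:
    // Hypothetical sketch of the disambiguation flow described above; GISST's actual
    // extractor differs, but the idea is the same: surface every release found on a
    // reference page and let the researcher choose the one to cite.
    const readline = require("readline");

    function chooseRelease(pageTitle, releases) {
      console.log(`Releases found for "${pageTitle}":`);
      releases.forEach((r, i) => console.log(`  [${i}] ${r.platform}, ${r.region}, ${r.year}`));

      const rl = readline.createInterface({ input: process.stdin, output: process.stdout });
      return new Promise(resolve => rl.question("Cite which release? ", answer => {
        rl.close();
        resolve(releases[Number(answer)]);   // still "noisy" metadata, cleaned up afterward
      }));
    }

    // Example: candidate releases scraped from a single Wikipedia page.
    chooseRelease("Super Mario Bros.", [
      { platform: "Famicom", region: "Japan", year: 1985 },
      { platform: "NES", region: "North America", year: 1985 },
      { platform: "Famicom Disk System", region: "Japan", year: 1986 }
    ]).then(release => console.log("Selected for citation:", release));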
GISST currently allows for extraction from a variety of sources and file types, as shown in Table 1 below:[38]
Game. Supported file types: .NES ROM format, .SMC ROM format, any directory containing a DOS compiled executable, .z64 ROM format (partial). Supported URI sources: MobyGames, Wikipedia.
Performance. Supported file types: FM2 replay format, generic video files. Supported URI source: YouTube.
Table 1. 
GISST supported resources.
Any source data that is extracted (currently game files and videos) is linked to a dependent citation entry. This allows for the recovery of source data in the web application interface through either emulation or video playback.

7.3 Web Application

The GISST app is a standard browser-based web application, with a JavaScript/HTML/CSS front-end designed for use with the Chrome web-browser, and a backend interface that is linked to the same database and ingestion commands as the CLI. This is marked as steps 4 and 5 in Figure 1 above. The app allows for the play, recording and citation of game states through CiteState.js (a suite of automated JavaScript ports of emulators).
The app’s interface supports four basic views of citation data:
  1. A basic listing page for the game and performance citations in the database.
  2. A full text search page that includes all citation records and game save state descriptions.
  3. A citation listing page that provides active links to previous save states and, for performances, the ability to create quick GIF animations based on a performance video.
  4. An indexer page that allows for examination of an emulated game, and the creation of game save states and video recordings.
Entries 1-3 are mainly presented as data tables supplemented with item 3’s “active links.” The indexer, however, is the primary user interface for game scholars interested in creating new performance data. The indexer integrates live game play with game performance recording, retrieval, and reference tools. The game moments captured in our introductory Doom example were all acquired and stored via this indexer interface.
Figure 2. 
Indexer User Interface. The emulation window is on the left, with a recording of that window displayed on the right.
Proceeding to the app’s user interface (displayed in Figure 2), we will briefly describe each button and component. The buttons under the main emulation window provide for most of the data recording features. The user can load the emulation, save a state, load the most recently saved state, control video recording, and mute audio. When a state is saved, it is logged in the “Available States” tab along with a screen shot and generated descriptive information. Clicking on any available state will cause the emulator to immediately load that state into the main window. After a state is selected, the “State” tab in the left side bar lets the user change its descriptive metadata. All changes logged in the side bar are propagated to the server and will show up in search queries. Video recording functions in a similar way. Any time a start-stop sequence is completed, the beginning and ending state of the emulation will be saved along with the video.
Each recorded performance appears in the “Available Performances” tab. Clicking on a performance updates the sidebar’s “Performance” tab, allowing for the review of a recorded video and editing of its generated metadata.
Any performance or state saved in the analysis tool will appear in the main citation listing page, and on individual pages for each respective game and performance. The state links on each individual page operate as “active links,” in that they will load up the indexer page with the correct state preloaded into the emulator. This makes each active link a link into a running emulation at a specific point. We take up the pedagogical and analytical implications of this in our expert evaluation review (in the next major section of this article).
The indexer, in creating and storing the saved states and performance recordings, provides the source material for future links created by the CiteState.js module. The CiteState.js interface requires only a description of a target page element and an id from the citation database. CiteState.js then automatically handles the loading of a game, performance, or state, and places it into an HTML element that can be aligned by the user through CSS styling or other means of element positioning (steps 6-8 in Figure 1 above). This completes the chain from source ingestion through shared linking of a game emulation in a web browser.
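A usage sketch follows. The module path, function name, and option names are our own assumptions rather than the published CiteState.js interface, but the shape of the call follows the description above: a target page element plus an id from the citation database.
    // Hypothetical sketch only; "cite-state", cite(), and these option names are
    // assumed for illustration and are not the actual CiteState.js API.
    import { cite } from "cite-state";

    // Embed an emulated saved state (see the Appendix) into the page. The library
    // would resolve the id against the citation database, fetch the game data and
    // saved state, and boot the emulator inside the target element.
    cite({
      target: document.querySelector("#doom-figure"),   // any positioned HTML element
      citation: "state-0001",                           // id drawn from the citation database
      autostart: false                                  // wait for the reader to press play
    });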

7.4 Future Work

The citation tool opens up numerous opportunities for the dissemination and standardization of game historical sources. For bibliographic record purposes, all the information in a specific citation store could be exported into forms compatible with common citation database formats, like BibTeX, or linked with citation systems, like Zotero.[39] These citations could also automatically include information about a compatible emulator and the file-specific data required by more technical scholars, like the platform and software studies researchers mentioned above.
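As a rough sketch of what such an export might look like (the field mapping is illustrative, not a proposed standard), a game record of the shape sketched at the start of this section could be serialized into a BibTeX @misc entry:
    // Hedged sketch only: serializing a game record (like the gameRecord shape
    // sketched earlier in this section) into a BibTeX @misc entry.
    function toBibTeX(game) {
      return [
        `@misc{${game.id},`,
        `  title        = {${game.title}},`,
        `  howpublished = {${game.platform}, version ${game.version}},`,
        `  note         = {Emulated citation; data image ${game.dataImage}},`,
        `}`
      ].join("\n");
    }

    // toBibTeX(gameRecord) would yield an entry ready to paste into a .bib file
    // or to hand off to a citation manager such as Zotero.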
In the realm of reductions, since the emulation is a full computing system running in a web page, its memory and operations are totally available to introspection via other concurrent JavaScript processes. We are already working on including memory manipulation functionality in the CiteState.js interface, which would allow dynamic visualization of a complete running program to occur alongside it on the browser surface.
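The sketch below suggests the kind of introspection we have in mind; the emulator object and its memory accessor are assumptions rather than the current CiteState.js API, but the principle is simply that ordinary page JavaScript can poll the running emulation's memory and surface a value alongside the prose.
    // Hypothetical sketch: the emulator handle and getMemory() accessor are assumed
    // for illustration. Poll one byte of emulated RAM each animation frame and
    // display it next to the surrounding text.
    function watchHealth(emulator, address, element) {
      function poll() {
        const ram = emulator.getMemory();        // assumed: returns a Uint8Array of RAM
        element.textContent = `Player health: ${ram[address]}`;
        requestAnimationFrame(poll);
      }
      requestAnimationFrame(poll);
    }

    // e.g., watchHealth(doomEmulator, 0x0042, document.querySelector("#health-readout"));
    // where 0x0042 stands in for whatever address the game actually uses.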
Lastly, since each of the citation types (games, performances, and game states) also requires a linkage between the citation and some form of born-digital data, new forms of storage and retrieval will be necessary. There is some work on storing and loading emulated systems or sharing the results of emulation produced on cloud-based servers, but there are still no general solutions for the storage and retrieval of executable software, nor support for citation as envisaged in the functionality of the tool above.
The next section — an expert evaluation of GISST’s citation component by practicing game studies scholars and library professionals — also presents some significant ideas for future work.

8 Evaluation

As described, the citation component of GISST is an argument for more rigorous citation of computer games, and for the augmentation of their expression in game studies discourse. We believe the tool can ease the citation burden for game scholars and allow them to create new types of arguments and expressions about games. To corroborate this belief, we conducted a speculative expert evaluation of the tool, inviting comment from a group of professionals engaged with game study and preservation. The goal of the study was to ascertain if the intentions of the tool were clear, if our thoughts above aligned with those of practicing scholars, and to invite constructive commentary. This section outlines the evaluation and its responses, and how those responses aligned with our goals and ignited ideas for future work and collaboration.
The evaluation consisted of a set of 11 questions to be answered based on a 5-minute video introduction to the CLI and web app components of the tool. We sent the evaluation to a select group of practitioners consisting of game designers, game studies scholars, and librarians. These groups align with those we hope will benefit most from the citation tool. All those chosen were already aware of GISST, and the video served as a reminder of functionality that had at some point been demonstrated to them in person. Responses were collected from eight people through an online form. Respondents included: Henry Lowood, curator of the History of Science and Technology and Film and Media Collections at Stanford University; Chaim Gingold, a game designer and historical researcher; Nathan Altice, a professor and game historian at the University of California, Santa Cruz; James Newman, a professor and game historian at Bath Spa University; Glynn Edwards, head of technical services in Special Collections at Stanford University Libraries; Shane Denson, a professor of Art History at Stanford; Douglas Wilson, a game designer and lecturer at RMIT University; and a professor who wished to remain anonymous.[40] The remainder of this section will describe the responses to the tool, followed by recommended improvements.

8.1 Discussion

Overall, the responses were overwhelmingly positive, with one game studies scholar stating that the availability of the tools “could be huge” for the field. The respondents hailed from an overlapping set of backgrounds, but the responses aligned along two basic paths. The first was how the tool could affect game studies and game historical practices in citation, and what the tool could contribute, through state citation and retrieval, to students and game studies scholars. The second turned toward the preservation potential that the tool presents in its management of game citations and game states.
The potential influences noted for game studies practice included (1) the formalization of game citation practices, (2) the removal of obstacles to game access, (3) the automation of game history tasks currently taken on in an ad-hoc manner, and (4) the presentation of deeper, and more comprehensive, historical analysis. Multiple respondents noted the tool’s implicit call for a more “formal and robust” citation practice for games, with Henry Lowood stating that the tool provided a first take on a “citation framework where there was none.” The tool functioned as a way to call attention to the potential of better citation practices. James Newman explained, “a contribution of the tool will surely be to heighten discussion of citation and [its] limits and variations in current practices.” This therefore aligns with our arguments for more consistent citation practice in the game studies section above.
Altice, Newman and Shane Denson highlighted the tools’ ability to provide an easier route to specific game locations and gameplay sequences. Altice and Denson specifically work on comparative analysis of game versions and emulators, so the potential for the tool to make parts of games more reachable was appreciated [Altice 2015] [Denson 2017]. This concern for access to game history also extended to the other scholars, who all remarked on the ability of the tool to make classroom lectures more engaging, and according to Altice provide for “in-class play that isn’t contingent upon equipment or playing skill.” He felt this allowed for a “wider breadth of examples” since lengthy equipment set up or hours spent trying to get to a particular spot in a game could be removed from the equation. Altice also felt that this approach could make exploring games like “flipping to a relevant page in a book, which could make citation more prolific and illustrative.” He also believed that playing a game’s citation would have a more powerful rhetorical effect than other forms of reduction, nicely pairing with our claims about rhetoric and game feel above. Chaim Gingold also felt that the tools provided a significant new way to share content with students. He imagined providing state citations to students in the future as one might today assign videos on YouTube or Twitch.
The tools as an automated solution for citation also struck a chord with the researchers who already use an assortment of ad-hoc solutions for emulation, gameplay recording, and analysis. Newman already organizes a rather complex chain of tools for gameplay capture and analysis, and the tools provided a way to alleviate some of the burden in getting multiple programs and systems to work together. Gingold also noted that the tools could allow students to engage in the types of historical analysis that are only available to interdisciplinary scholars with programming and humanities backgrounds. Students could incorporate “their interaction into their scholarship (like us!),” and the tools would give them a starting point for more detailed analysis of design and game play interactions.
The last major response area (in this first thread of responses, on game studies and game historical practices) was the ability of the tools to provide a new level of analysis for game studies and game history. Newman noted that having access to a GISST-like system would remove the need to engage in extended descriptions of game play or game scenes. If a citation is also a playable instance, he could rely on the reader to play what he was talking about, and then focus his time on deeper analysis instead of front-loading arduous amounts of descriptive text to set up his points.
Newman suggested that being able to refer to a persistent recording or savestate would give him increased confidence in writing more detailed analyses of sequences of gameplay and would, hopefully, alleviate some of the need for description in favour of close commentary and annotation.
Denson concurred, stating that “arguments can [now] be illustrated directly (through video, for example) and even mounted through hands-on engagement (gameplay) rather than merely discursive description.”
The second major thread in the responses highlighted the tools’ implicit effects on game preservation and the organization of game history. Some viewed the tools as an argument for more robust digital repositories able to handle and retrieve executable content. One noted that the tools displayed the power of centralizing documentation about games and how the coordination of tools could foster new expressions through the linkages of different technologies. In this case, the concordance of emulation, documented citation, and gameplay videos invited a discussion of the need for coherent underlying infrastructure to support preservation of those outputs. Lowood agreed, saying that the tools put “issues around documentation, archiving and gameplay preservation front and center.” Glynn Edwards also focused on the need to create consistent metadata schemes for the emulated save states and companion documentation. There was a general agreement among the preservation professionals that the tools’ existence, in and of itself, functioned as an argument for better preservation practice, and they were excited to begin working towards solutions.
Another small preservation point concerned the tools’ ability to allow for quick validation of game files and game data integrity. By ingesting an executable into the system, one can easily check if it is compatible with a specific emulator, and if it is actually the file it claims to be. Lowood also noted that the tool could push repositories to negotiate better IP rights access to executable software, or, at the very least, further reveal the need for that work to be figured out (see footnote 25).

8.2 Improvements and Future Work

Given the positive response to the work, most of the critical discussion of the tools pointed to means of immediate improvement and new features. In the main, respondents wanted continued development of the UI and of user accessibility features. Right now, most of the ingestion apparatus occurs on the command line, and Altice correctly felt that “freeing the tool from dependence on the CLI” would be necessary since “this would be a non-starter for many (most?) scholars without a technical background.” Many also pointed out that while the technology had obvious potential, as noted in the last section, it needed a set of coherent examples and illustrative case studies. The tools represented more of a “starting point” for new discussions in game studies and game preservation but were not really yet a solution (though with work they could be). This appears to show that the tools are ripe ground for future work, as just making them more accessible and easier to use excited many of the respondents. One even requested a basic tutorial for the current alpha prototypes, feeling that hands-on access was required to fully appreciate the prototypes’ potential use cases.
Another general request was for the inclusion of more emulators than the four currently available, both to allow for comparison of the emulators themselves and to help with further preservation questions in the description of the configured environments needed to support game data. Many noted that the tools, in both their analytic and preservation potential, pointed to uses outside of games, and would be a boon for software studies and general software preservation. This was edifying, since another thread of our overall thesis is that most advances for game software history are also significantly applicable to broader classes of software.
Future work dovetailed with the requests for more system support and usability. Many wanted to see what a larger, shared repository of game state citation could afford, either in a classroom setting or for executable collections in libraries and archives. There was also a call for more explanation of the citation description formats, and perhaps even integration with current scholarly citation tool sets like Zotero. Multiple respondents also mentioned the potential for annotation tools to add voice overs to videos, and record and remark on game play input traces. Gingold specifically was excited about the ability to introspect on the running emulations, and visualize their system dynamics and memory states, similar to our own thoughts on future work above.
In closing, the responses to the tool essentially agree with the arguments presented above. Respondents believed that embedded emulations represent a new form of expression for game history, and that the tools themselves function as an argument for further work on technical system visualization, documentation management, game citation, and preservation. We believe that this suitably validates the tools, the methodologies they support, and the theories behind them in ways that not only legitimate them as contributions to multiple fields, but also position them as a starting point for more significant future work, and perhaps even future research areas.

9 Conclusion

At the beginning of this article, we argued that game and game performance citation are underdeveloped, immature aspects of scholarly practice. The argument proceeded through a more in-depth discussion of the purpose and functionality of citation in scholarly discourse in general, and of the ways in which games are a new and special case. As a result, the system we described to manage and create playable manifest citations required not just novel engineering effort but a theoretical consideration of the types of objects within and around games that could be leveraged in arguments. The initial prototypes of the GISST system actually preceded and evolved in dialogue with our new conceptualization of game citation.[41] In attempting to find ways to cite games, we were incidentally forced to create the technical means for those citations, and to figure out how those citations could be made available and useful for argumentation. This then contributed new means of historical expression: we realized that the results of the citation system creation represented a new general class of activity in the design of systems to support game scholarship. The citation system is a central component of GISST because we believe that citation work is a necessary precondition for a whole range of possible tools for game history, game studies, and software studies works.
The citation work elaborated on a means for the manifest citation of new objects into scholarly discourse, but it also supported those objects’ creation and storage. As a result, this opened up those objects (performance videos, executable game play, and indexed game states) to further analysis. In addition to functioning as a form of reduction in supporting textual discourse, the objects are also now organized and manipulable by any potential future extensions of GISST’s toolset. As mentioned in Future Work above, this could mean input analysis and replay of game play — a means to further look at instances of “superplay” like speed running or glitch-hunting — and introspection on the game’s system state and run-time memory. In tracking the needs for a citable base of games, performances, and states, we have opened up a whole new set of resources and opportunities for exploration and expression through scholarship. This provides yet another example of how the stabilization of historical resources can lend itself to new uses and articulations.
GISST resolves many key issues in game citation, but more importantly it comprises a compelling argument in the ongoing development of norms and standards in game scholarship. It encodes in its database schema and user interface a view of what it means to talk about a game or game performance in a recoverable, useful way that is not prone to presupposition. It is an instance of the rhetorical approach promoted by Ramsay and by Burdick et al.: we “contribute to humanities theory by forging tools that quite literally embody humanities-centered views regarding the world” [Burdick et al. 2016, 104].

Acknowledgements

The authors wish to thank Brandon Butler, Director of Information Policy at the University of Virginia Library, for help in the clarification of GISST’s potential legal context in endnote 23. This work was supported, in part, by Institute for Museum and Library Services grant LG-06-13-0205-13.

Appendix

The figures below illustrate the interactive saved states.
Figure 3. 
wolf3d_start. The initial player view upon loading the first level of Wolfenstein 3D: the player's weapon, a door, and a dead body.
Figure 4. 
first_barrel. One of the first interactive explosive barrels in Doom, waiting to be shot by the player in Episode 1 Map 1 (E1M1).
Figure 5. 
window_goo_lake. A view out of the first window in Doom, showing the tantalizing prospect of a hidden armor upgrade in a lake of toxic goo.
Figure 6. 
armor_plinth. A prominent armor upgrade resting on a plinth.
Figure 7. 
nightmare_armor_plinth. The armor plinth is guarded by shotgun-wielding baddies on the Nightmare difficulty level.
Figure 8. 
health_bonus_armor_room. Small health items located next to the initial armor plinth.
Figure 9. 
zig_zag_room. A zig zag platform room in E1M1 featuring a fireball-shooting Imp demon.
Figure 10. 
wall_secret. The first hidden wall secret in Doom, located in the zig zag room and opened with a door opening command (generally space bar).
Figure 11. 
lowered_imp_pillar. The secret platform in the zig zag room that is lowered after exploring the next room. This grants access to the secret shotgun room.
Figure 12. 
shotgun_room. The hidden shotgun room adjacent to the zig zag room.
Figure 13. 
elevator_secret. The hidden elevator in the corner of the shotgun room lowering.
Figure 14. 
secret_corridor_elevator_top. The hidden path at the top of the hidden elevator in the shotgun room.
Figure 15. 
secret_window. A hidden window at the end of the hidden path in Figure 14.

Notes

[1] Resurrection could also apply to the physical reconstruction (or acquisition) of legacy hardware. However, this article is only able to comment on the data resurrection through emulation — for reasons that will become clear below.
[2] That “less authoritative records are taking their place” is actually more a symptom of not creating systems and “authoritative” sources that can better deal with new types of records, as we attempt to show through the citation system below and have explored in our prior work on the appraisal of new forms of game development documentation and new descriptive apparatus for game software objects. See Kaltman et al. (2014), Kaltman et al. (2016).
[3]  We make extensive use of the computer game Doom, more specifically the Doom Version 1.9 Shareware, because it is freely distributable, a significant part of game history, and very well documented compared to other historical computer games.
[5] Additionally, the practices of analytical bibliography — probably most impressively displayed in Frank Manchel’s magnum opus, Film Study: An Analytical Bibliography — can help to reconcile the production and history of academic disciplines [Manchel 1990]. Manchel’s four-volume, 2,500-page work enumerates and describes the entire breadth of English-language film and film studies produced between 1965 and 1990. Interestingly, in line with the current project of scholarly support through tools, Manchel’s work is emphatically indebted to the SCRIPT/VS word processing system and the IBM 6670 Laser Printer for help in maintaining and organizing the necessary subject and author indexes required for his work [Manchel 1990, 28].
[6] In the most recent MLA Handbook, the editors write that citation practice involves, “demonstrating the thoroughness of the writer’s research, giving credit to the original sources, and ensuring that readers can find the [sources] . . . to draw their own conclusions about the writer’s argument” [MLA Handbook 2016, 4]. Additionally, authors must provide a “comprehensible, verifiable means of referring to one another’s work . . . to give credit to the precursors whose ideas they borrow, build on or contradict and allow future researchers interested in the history of the conversation to trace it back to the beginning” [MLA Handbook 2016, 5].
[7] “Discursive chain of thought” is basically the organization of knowledge in a specific discipline. All practitioners are contributing to the historical accumulation that furthers the course of their discipline and the history of its claims.
[8] There are parallels here with Latour and the translation of concepts in networks, as well as the use of inscription in the creation of research works. We will briefly bring in some of the history of science and technology perspective on text formation, with its focus on practice, below.
[9] We are careful to note that the assumption of an audience here is not only referring to the actions taken by an author, with an audience in mind, to clarify and align their text with others’ expectations, but also to the implicit knowledge that a potential reader will bring to a text given that it is in a specific discursive form. Fairclough goes into more depth on discursive types in Fairclough (1992), specifically in chapters 2 and 3.
[10] The organization of citations in a text (and, in our case, the organization of text and running executable programs) also calls out to various traditions relating to visual design and the juxtaposition of text and image, specifically in art criticism; see Berger (1973).
[11] The ACM is currently investigating ways to embed software references into publications through an “artifact review” badging process. See: https://www.acm.org/publications/policies/artifact-review-badging
[13] This is in line with the recommendation for “reasonable compatibility” in Kaltman et al. (2015). We argue that a game resource in a collection catalog should provide granular enough information to give a researcher a reasonable guess at the technical apparatus required for the resource’s recovery.
[14]  According to MobyGames, https://web.archive.org/web/20160421081803/http://www.mobygames.com/game/descent/release-info, Descent has 15 different releases, 6 of which occurred in 1995 in the United States, Japan, and Germany for DOS, Mac, and PC-98.
[15] Relevantly, Pinchbeck was involved with the KEEP project, an early attempt to provide an emulation framework for the recovery of older games [Pinchbeck et al. 2019]. A specific purpose of KEEP was to ensure that people could play old games, both to explicitly understand their affordances through play and to reference their historical context. As such, we wish to reiterate that Pinchbeck’s work is used as an example of a specific type of game historical discourse; this is not a criticism of the work itself.
[16] For more information on recommendations for citation guidelines as a result of the GAMECIP work, please refer to our citation recommendations: Kaltman, Eric, Stacey Mason, and Noah Wardrip-Fruin. “The Game I Mean: Game Reference, Citation and Authoritative Access” [Kaltman et al. 2020]. Additionally, recent citation recommendations and a discussion of how citation of games is always a political and discipline-specific act can be found in “How to Reference a Digital Game” [Gualeni et al. 2019].
[17] Prominent examples are Kairos (http://kairos.technorhetoric.net/), Scalar (http://scalar.usc.edu), and Vectors (http://vectors.usc.edu).
[18] Unless, as indicated above, the interruption of the flow serves a discursive function, with the disjunction of meanings being relevant to some point or elaboration. Sometimes contrasts in discursive presentation reveal the limitations of either one.
[19] Newman (2012a) actually calls for game preservation policies to prioritize videos of gameplay on the assumption that executable access is a less likely future scenario.
[20] As described in Wilson (2013), the “solo eggplant run” is a secret completion achievement for the computer game Spelunky, a rogue-like dungeon exploration game modeled on Indiana Jones and “explore the tomb”-type motifs. The eggplant run involves carrying a useless item from the beginning of the game through completion without losing it or losing one’s life. It was so difficult that years passed before its first completion, by Bananasaurus Rex in 2013.
[21] However, being at the forefront of something is not the same as “inventing” it, and Victor’s work is heavily inspired by Alan Kay’s work on “active essays” and by Ted Nelson’s educational musings in “No More Teacher’s Dirty Looks” from Nelson (1974). For more information on the genealogy of the term, see Yamamiya et al. (2009).
[22] More technically, it is “a technique for implementing a virtual machine on a host computer whose instruction set is different from the host computer’s” [Rosenthal 2015, 2]. Although the line between virtualization and emulation does get a bit murky by this definition — the computer I’m typing this on, a MacBook Pro running Apple Mac OS X 10.11, shares the same basic instruction set with a Sony PlayStation 4 — the encapsulation of one system within another is the basic function of emulation.
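To make the encapsulation idea concrete, the following minimal sketch (in Python, and not drawn from any real emulator) interprets a small invented instruction set on a host machine whose own instruction set is entirely different; the opcodes, single register, and memory layout are assumptions made purely for illustration.

# A toy "guest" instruction set, invented for illustration only.
def emulate(program):
    """Interpret (opcode, operand) pairs for a machine the host does not natively speak."""
    accumulator = 0           # the guest machine's single register
    memory = [0] * 16         # the guest machine's tiny address space
    for opcode, operand in program:
        if opcode == "LOAD":      # place a literal value in the accumulator
            accumulator = operand
        elif opcode == "ADD":     # add a literal value to the accumulator
            accumulator += operand
        elif opcode == "STORE":   # write the accumulator into guest memory
            memory[operand] = accumulator
        else:
            raise ValueError(f"unknown opcode: {opcode}")
    return accumulator, memory

# The host CPU never executes these opcodes directly; it executes the Python
# interpreter, which in turn simulates the guest machine's behavior.
print(emulate([("LOAD", 2), ("ADD", 3), ("STORE", 0)]))

Real emulators elaborate this same loop with accurate timing, memory maps, and peripheral behavior, which is where many of the preservation difficulties discussed in this article arise.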
[23] The use of “image” in referring to an extracted data set is drawn from operations in mathematics. When you use a function to operate on a set of numbers, you are mapping the input values to potential outputs. In the case of y = x + 1, y is a function of x (f(x)) with x standing for a range of input values, and y for the output values of the function. The input values in this case “map” through the function to a specific set of output values. This resultant map, which is just the set of inputs each incremented by 1, is an “image” of those inputs in a new domain. Similarly, the data extracted from a physical medium is not the same data (nothing changes from the physical medium’s point of view) but a mapping of the data stored on that medium to an equivalent set of data now migrated from the media to another machine. Thus, “imaging” is very directly the process of mapping the data stored on a specific physical medium to an identical configuration on a separate machine with its own storage. This distinction between types of data, and the means by which they are migrated, mapped, and translated between machines, will become relevant below for issues of reduction.
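The analogy can be made concrete with a short sketch: the first half computes the image of a set under f(x) = x + 1, and the second half performs the byte-level equivalent, with the source bytes standing in, purely hypothetically, for data read off a physical medium.

inputs = {1, 2, 3}
image = {x + 1 for x in inputs}     # the image of the inputs under f(x) = x + 1
print(image)                        # {2, 3, 4}

# Disk imaging is the same kind of mapping at the level of bytes: every byte
# on the source medium maps to an identical byte in the destination copy.
source_medium = bytes([0x44, 0x4F, 0x4F, 0x4D])       # stand-in for data on a disk
disk_image = bytes(byte for byte in source_medium)    # a byte-for-byte "image"
assert disk_image == source_medium                    # identical configuration, new storage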
[24] Emulation, specifically in its copying and use of potentially copyrighted data, is in unclear legal territory. The current GISST system would therefore need to be modified to adopt a BYOD (bring your own data) approach where users provide their own legal game data for citable manipulation. Another alternative would be for GISST to function as a research service, like a more conventional research database, within which executable citations would function for institutions paying for the service. Counterintuitively, providing GISST as a service that draws from data stored on third-party sites elsewhere on the Internet, like ROM sharing sites, may help limit legal risk for institutions. A GISST service tailored to enable only legitimate fair uses would be less exposed to liability than sites that support a broad array of more legally dubious uses. GISST could rely on third-party sites for content without necessarily taking on the risk associated with making the content available for open-ended reuse. For more extensive legal information on this topic, see the aforementioned McDonough (2010) and Rosenthal (2015).
[25] The MAME project is known for its attention to detail and support for esoteric systems. For example, one addition in 2016 was SegaSonic Popcorn Shop, a Sonic the Hedgehog popcorn vending machine with an embedded display that was marketed only in 1993 in certain Japanese cities.
[26] Many other emulators now have JavaScript versions; the three listed above are the ones we use in the citation tool and so are explicitly mentioned. For a full listing of emulators (including JavaScript versions), see https://en.wikipedia.org/w/index.php?title=List_of_video_game_emulators, which maintains a running list of projects and their compatibility.
[27] The inclusion of full emulated systems in a document might be closer to Ted Nelson’s notion of “transclusion” than to traditional, print-based models of intertextuality. In Nelson’s proposed (and, with Autodesk’s involvement, partially implemented) Xanadu system, a precursor of the World Wide Web, portions of documents could be manifestly represented inside other documents by “transcluding” a portion of a full version of the document. Those reading a Xanadu document could reach a full version of any transcluded document, or change the portion viewed through the transclusion “window,” or pull it up alongside the transcluding document, as easily as we now traverse links on the Web — making the transcluded document somewhat less under the control of the transcluding. For more, see Nelson (1993).
[28]  The Java-based plugins are now flagged as security threats, and even forcing the browser to ignore those warnings did not result in the emulations executing correctly.
[29]  On a preservationist note, while the “deconstructulator” (https://www.benfry.com/deconstructulator) is still available, it requires a Java plugin to function. Due to security concerns over the last decade, Java support has been dropped or disabled in many browsers. There is no longer any mention of the NESCafe emulator on its creator’s website, and its most recent update (July 22, 2008 according to https://www.zophar.net/java/nes/nescafe.html) is over 10 years old as of this writing in 2019.
[30] See https://www.youtube.com/user/ClydeMandelin, specifically “Poemato CX's Twitch Stream Magic” https://www.youtube.com/watch?v=rX87i71IC7g
[31]  See Rinehart (2014) for a more detailed discussion of the benefits and perils of incidental community archival practices.
[32] Some emulators running in constrained environments, like JavaScript emulators in web browsers, need to cut corners to get processing up to an acceptable speed. Other emulators running in native execution contexts, like Microsoft Windows applications, sometimes intentionally slow down processing in order to match the timings of older machines.
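A minimal sketch of the second case, assuming a 60 Hz target and a stand-in frame function rather than any particular emulator’s API, looks something like the following: the host finishes each emulated frame early and then sleeps away the remainder so that the guest runs at its original speed.

import time

TARGET_FRAME_SECONDS = 1 / 60       # assumed frame timing of the emulated machine

def run_throttled(emulate_one_frame, frames):
    for _ in range(frames):
        started = time.perf_counter()
        emulate_one_frame()                             # the host finishes this "too fast"
        elapsed = time.perf_counter() - started
        if elapsed < TARGET_FRAME_SECONDS:              # sleep away the remainder so the
            time.sleep(TARGET_FRAME_SECONDS - elapsed)  # guest runs at its original speed

# Example with a stand-in frame function:
run_throttled(lambda: sum(range(10_000)), frames=3)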
[33] See Woolgar (1986) for a discussion of the different philosophical roots behind discourse analysis (Continental Philosophy) and STS (Anglo-American Analytic Philosophy).
[34] This issue must be addressed as we explore the future of academic publishing. An intriguing example is Alexandra Juhasz's Learning from YouTube, an online book which includes and links to many YouTube videos, published by The MIT Press in partnership with the Alliance for Networking Visual Culture [Juhasz 2011]. Given YouTube’s (and the wider corporate web’s) lack of concern for preservation and stability, one of the first pages readers are likely to encounter (“HOW TO USE THIS VIDEO-BOOK”) suggests that they can “help the project by reporting broken links.” It’s difficult to imagine such a request in a book built on research in a traditional archive. (What form would it even take?)
[35] The term “game engine” is commonly used for software platforms that support particular types of game play experiences. Some are relatively specific, such as the “id Tech” engines primarily used for first-person shooter games. Some are more general, such as the Unity engine, which is used for many types of graphical games — but which would be ill-suited to text adventure or interactive fiction games, for example. However, more broadly, a game’s “engine” can describe the game’s processes as separated from its data. This is why Nathan Altice can describe community-produced data that targets the software and hardware processes of 1980s Nintendo Entertainment System games as “spawning fresh games from obsolete engines” [Altice 2015, 316] (emulators are discussed in section 5.3). If this sounds like a slippery concept, that’s because it is. As Eike Falk Anderson, Steffen Engel, Peter Comninos, and Leigh McLoughlin put it, “[T]here is disagreement about exactly what a game engine is, with sometimes fundamental differences between definitions” [Anderson et al. 2008, 229]. Nevertheless, it remains a useful concept in practice. For any game that displays to a screen, players may attempt to capture game play performances using screen recordings, resulting in a series of image frames. In addition, some games support the creation of “replay files” — which cause the performance to be re-enacted in the game/engine software, reducing file size and making it possible to, for example, change the perspective from which the performance is viewed. Some emulators support interaction through “input streams” — which cause a scripted set of controller events to be executed at particular times. If the input stream is a recording of the controller events of a particular performance, and the game begins from the same computational state, the result will be a reproduction of the performance in the game/engine.
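A rough sketch of the input-stream idea, using a hypothetical emulator interface rather than GISST’s or any real emulator’s API, might look like the following: a recorded table of per-frame button events is replayed against an emulator stepped one frame at a time, which, given the same starting state, reproduces the original performance.

# A recorded performance as a table of (frame number, buttons held that frame).
input_stream = [
    (0, {"right"}),
    (1, {"right"}),
    (2, {"right", "jump"}),
]

class LoggingEmulator:
    """Stand-in for an emulator; it just records what it was asked to do."""
    def set_inputs(self, buttons):
        self.buttons = buttons
    def step_frame(self):
        print("frame advanced with", sorted(self.buttons))

def replay(emulator, stream, total_frames):
    events = dict(stream)                    # frame number -> buttons
    for frame in range(total_frames):
        buttons = events.get(frame, set())   # no entry means no buttons held
        emulator.set_inputs(buttons)         # hypothetical emulator interface
        emulator.step_frame()

replay(LoggingEmulator(), input_stream, total_frames=3)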
[36] Swink’s “game feel” is focused exclusively on continuous input games, like platformers or action titles. Doug Wilson has argued that “game feel” should extend to other types of interactions with computational feedback systems, from menu systems to mouse interaction in strategy games [Wilson 2017]. We take the latter, more liberal view of game feel in the context of providing an emulated system in an argument for the significance of a performative act, or as a means of elaborating a deeper understanding of embodied play experiences.
[37] http://tasvideos.org/Movies.html contains a listing of different platforms along with downloads for their various “movie” formats, which are usually files of tabulated input sequences.
[38] The table does not include game states because the CLI does not ingest arbitrary state data. This is mainly because emulated state data is specific to both a game and the emulator supporting it, and is effectively useless without those dependencies. Additionally, in the case of some of GISST’s supported emulators, like DOSBox, there are no independent save state formats, just data derived from the live emulation during run-time. That said, GISST-saved states should in principle be portable across browsers running the GISST app.
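As an illustration of why such states resist ingestion as free-standing resources, a record describing one would, at minimum, have to carry its dependencies along with it; the field names and values below are hypothetical and do not reflect GISST’s actual schema.

from dataclasses import dataclass

@dataclass
class SaveStateRecord:
    state_file: str         # raw state data dumped from the running emulator
    game_hash: str          # identifies the exact game data the state came from
    emulator: str           # e.g., "DOSBox"
    emulator_version: str   # state formats often change between versions

# Illustrative values only; without the game and emulator named here,
# the state file on its own is effectively meaningless.
record = SaveStateRecord(
    state_file="e1m1_zig_zag_room.state",
    game_hash="sha256:placeholder-hash-of-game-data",
    emulator="DOSBox",
    emulator_version="0.74",
)
print(record)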
[40] Reviewers commented on the gender imbalance (6:2 male/female) among the evaluation set. Initial requests for commentary were sent to a slightly more diverse set (9:5) of evaluators; however, publishing deadlines prevented further pursuit of gender parity. Continued evaluative work on the GISST tool set will address this disparity more suitably.
[41] Note that this is a conceptualization of GISST as a system, not “conceptualization” in the ontological sense.

Works Cited

Altice 2015 Altice, N., 2015. I am error: the Nintendo family computer/entertainment system platform, Platform studies. The MIT Press, Cambridge, Massachusetts.
Anderson et al. 2008 Anderson, E., Engel, S., Comninos, P., and McLoughlin, L. “The Case for Research in Game Engine Architecture”. In Proceedings of the 2008 Conference on Future Play: Research, Play, Share, 228–231. Future Play ’08. New York, NY, USA: Association for Computing Machinery, 2008. https://doi.org/10.1145/1496984.1497031.
Berger 1973 Berger, J., 1973. Ways of seeing. BBC and Penguin Books, London.
Bogost and Montfort 2009 Bogost, I. and Montfort, N., 2009. Racing the Beam: The Atari Video Computer System. MIT Press, Cambridge, Mass.
Borgman 2007 Borgman, C.L., 2007. Scholarship in the digital age: information, infrastructure, and the Internet. MIT Press, Cambridge, Mass.
Burdick et al. 2016 Burdick, A., Drucker, J., Lunenfeld, P., Presner, T., Schnapp, J., 2016. Digital_Humanities. The MIT Press.
Denson 2017 Denson, S., 2017. “Visualizing Digital Seriality or: All Your Mods Are Belong to Us!” 22.1.
Eco 2015 Eco, U., 2015. How to Write a Thesis. MIT Press, Cambridge, MA; London.
Fairclough 1992 Fairclough, N., 1992. Discourse and social change. Polity Press, Cambridge, Mass.
Fernández-Vara 2014 Fernández-Vara, C., 2014. Introduction to Game Analysis, 1st edition. Routledge, New York.
Fry 2003 Fry, B. “deconstructulator | ben fry” [WWW Document], 2003. URL https://benfry.com/deconstructulator/ (accessed 3.23.19).
Grafton 1997 Grafton, A., 1997. The footnote: a curious history, Revised edition. Harvard University Press, Cambridge, Mass.
Gualeni et al. 2019 Gualeni, S., Fassone, R., Linderoth, J., 2019. “How to Reference a Digital Game”, in: Proceedings of the 2019 DiGRA International Conference. Presented at DiGRA, Kyoto, Japan.
Hart et al. 2016 Hart, V., Case, N., 2016. “Parable of the Polygons” [WWW Document]. Parable of the Polygons. URL http://ncase.me/polygons (accessed 4.5.17).
Hyland 2000 Hyland, K., 2000. Disciplinary discourses: social interactions in academic writing, Applied linguistics and language study. Longman, Harlow; New York.
Juhasz 2011 Juhasz, A., 2011. “Learning From YouTube: YOUTUBE IS ...” [WWW Document]. URL http://vectors.usc.edu/projects/learningfromyoutube/ (accessed 4.7.19).
Kaltman et al. 2014 Kaltman, E., Wardrip-Fruin, N., Lowood, H., Caldwell, C., 2014. “A Unified Approach to Preserving Cultural Software Objects and their Development Histories”.
Kaltman et al. 2015 Kaltman, E., Wardrip–Fruin, N., Lowood, H., Caldwell, C., 2015. “Methods and Recommendations for Archival Records of Game Development: The Case of Academic Games”. Proceedings of the 10th International Conference on the Foundations of Digital Games.
Kaltman et al. 2016 Kaltman, E., Wardrip-Fruin, N., Mastroni, M., Lowood, H., De Groat, G., Edwards, G., Barrett, M., Caldwell, C., 2016. “Implementing Controlled Vocabularies for Computer Game Platforms and Media Formats in SKOS”. Journal of Library Metadata 16, 1–22. https://doi.org/10.1080/19386389.2016.1167494
Kaltman et al. 2017 Kaltman, E., Osborn, J., Wardrip-Fruin, N., Mateas, M., 2017. “Getting the GISST: A Toolkit for the Creation, Analysis and Reference of Game Studies Resources”, in: Proceedings of the 12th International Conference on the Foundations of Digital Games. Presented at the Foundations of Digital Games, Hyannis, MA.
Kaltman et al. 2020 Kaltman, E., Mason, S., Wardrip-Fruin, N., n.d. “The Game I Mean: Game Reference, Citation, and Authoritative Access”. In submission.
Kirschenbaum 2008 Kirschenbaum, M.G., 2008. Mechanisms: New Media and the Forensic Imagination. The MIT Press, Cambridge, Mass.; London.
Lowood 2011 Lowood, H., 2011. “Perfect Capture: Three Takes on Replay, Machinima and the History of Virtual Worlds”. Journal of Visual Culture 10, 113–124. https://doi.org/10.1177/1470412910391578
Lynch 1988 Lynch, M., 1988. “The externalized retina: Selection and mathematization in the visual documentation of objects in the life sciences”. Human studies 11, 201–234.
MLA Handbook 2016 Modern Language Association of America (Ed.), 2016. MLA handbook, Eighth edition. The Modern Language Association of America, New York.
Maher 2012 Maher, J., 2012. The future was here: the Commodore Amiga, Platform studies. MIT Press, Cambridge, Mass.
Manchel 1990 Manchel, F., 1990. Film study: an analytical bibliography. Fairleigh Dickinson University Press, Rutherford; Associated University Presses, London.
Mandelin n.d. Mandelin, C., n.d. “Poemato CX’s Twitch Stream Magic.” URL https://www.youtube.com/watch?v=rX87i71IC7g
McDonough 2010 McDonough, J.P., 2010. “Preserving virtual worlds: Final Report”. Graduate School of Library and Information Science, University of Illinois at Urbana-Champaign.
Montfort 2000 Montfort, N., 2000. “Cybertext Killed the Hypertext Star | Electronic Book Review” [WWW Document]. URL http://www.electronicbookreview.com/thread/electropoetics/cyberdebates (accessed 2.26.17).
Nelson 1974 Nelson, T.H., 1974. Computer lib: Dream machines. Self published.
Nelson 1993 Nelson, T.H., 1993. Literary Machines, 93 edition. Mindful Press.
Newman 2012a Newman, J., 2012. Best before: Videogames, supersession and obsolescence. Routledge.
Newman 2012b Newman, J., 2012. “Ports and patches: Digital games as unstable objects”. Convergence: The International Journal of Research into New Media Technologies 18, 135–142.
Nowviskie 2016 Nowviskie, B.P., 2016. “speculative collections”. URL http://nowviskie.org/2016/speculative-collections/ (accessed 11.18.16).
Owens and Padilla 2020 Owens, T., Padilla, T., 2020. “Digital sources and digital archives: Historical evidence in the digital age”. The International Journal of Digital Humanities.
Papert 1980 Papert, S., 1980. Mindstorms: Children, computers, and powerful ideas. Basic Books, Inc.
Parks 2014 Parks, T., 2014. “References, Please” [WWW Document]. The New York Review of Books. URL http://www.nybooks.com/daily/2014/09/13/references-please/ (accessed 4.5.17).
Pinchbeck 2013 Pinchbeck, D., 2013. Doom: scarydarkfast. University of Michigan Press, Ann Arbor.
Pinchbeck et al. 2019 Pinchbeck, D., Anderson, D., Delve, J., Alemu, G., Ciuffreda, A., Lange, A., 2009. “Emulation as a strategy for the preservation of games: the KEEP project”, in: DiGRA 2009-Breaking New Ground: Innovation in Games, Play, Practice and Theory.
Ramsay 2011 Ramsay, S., 2011. Reading machines: toward an algorithmic criticism, Topics in the digital humanities. University of Illinois Press, Urbana.
Rinehart 2014 Rinehart, R., Ippolito, J. (Eds.), 2014. Re-collection: art, new media, and social memory, Leonardo. The MIT Press, Cambridge, Massachusetts.
Rosenthal 2015 Rosenthal, D.S., 2015. Emulation & Virtualization as Preservation Strategies.
Siemens and Moorman 2006 Siemens, R.G., Moorman, D. (Eds.), 2006. Mind technologies: humanities computing and the Canadian academic community. University of Calgary Press, Calgary.
Staley 2015 Staley, D.J., 2015. Computers, visualization, and history: how new technology will transform our understanding of the past, Second edition, History, the humanities, and the new technology. Routledge, Abingdon.
Sula and Miller 2014 Sula, C.A., Miller, M., 2014. “Citations, contexts, and humanistic discourse: Toward automatic extraction and classification”. Literary and Linguistic Computing 29, 452–464. https://doi.org/10.1093/llc/fqu019
Svensson and Goldberg 2015 Svensson, P., Goldberg, D.T. (Eds.), 2015. Between humanities and the digital. The MIT Press, Cambridge, Massachusetts.
Swink 2009 Swink, S., 2009. Game Feel: A Game Designer’s Guide to Virtual Sensation. Morgan Kaufmann, Burlington, MA.
Victor 2011a Victor, B., 2011. “Explorable Explanations” [WWW Document]. URL http://worrydream.com/#!/ExplorableExplanations (accessed 4.5.17).
Victor 2011b Victor, B., 2011. “The Ladder of Abstraction” [WWW Document]. URL http://worrydream.com/#!2/LadderOfAbstraction (accessed 12.3.14).
Warwick et al. 2012 Warwick, C., Terras, M.M., Nyhan, J. (Eds.), 2012. Digital humanities in practice. Facet Publishing in association with UCL Centre for Digital Humanities, London.
Wikipedia 2019 “List of video game emulators”, 2019. Wikipedia.
Wilson 2013 Wilson, D., 2013. “A breakdown of 2013’s most fascinating video game moment” [WWW Document]. Polygon. URL http://www.polygon.com/2013/12/23/5227726/anatomy-of-a-spelunky-miracle-or-how-the-internet-finally-beat (accessed 2.20.17).
Wilson 2017 Wilson, D., 2017. “A Tale of Two Jousts: Multimedia, Game Feel, and Imagination”. URL https://www.youtube.com/watch?v=hpdcek4hLA8
Woolgar 1986 Woolgar, S., 1986. “On the alleged distinction between discourse and praxis”. Social Studies of Science 309–317.
Yamamiya et al. 2009 Yamamiya, T., Warth, A., Kaehler, T., 2009. “Active Essays on the Web”, in: Proceedings of the Seventh International Conference on Creating, Connecting and Collaborating through Computing (C5 ’09). IEEE, pp. 3–10.