Samantha Blickhan is the IMLS Postdoctoral Fellow at the Adler Planetarium, and Humanities Lead for Zooniverse.
Review of Academic Crowdsourcing in the Humanities: Crowds, Communities and Co-production by Mark Hedges and Stuart Dunn.
In an academic context, crowdsourcing
refers to the act of presenting data to
members of the public, and inviting them to perform a task, or series of tasks,
upon those data. It is a broad definition, but crowdsourcing is a broad
methodological concept that can be applied in a number of ways, from academia to
business practice. As their title suggests, Mark Hedges and Stuart Dunn narrow
the scope considerably with Academic Crowdsourcing in the Humanities: Crowds, Communities and Co-production.
Such breadth of methodological usage is not new territory for authors writing about digital projects in the humanities (and it should be said that the focus within this text is overwhelmingly on digital methods, though the authors acknowledge crowdsourcing’s pre-digital roots), which can be difficult to fit within a single framework, particularly due to frequent advances in the technologies that support digital projects. The potential audience for this methodological text can include everyone from participants in digital humanities projects to practitioners — including (but not limited to) researchers, archivists, academics, librarians, institutional staff, and students.
The authors address this breadth of scope in content as well as audience by using existing literature from other participatory research fields (most notably the adjacent field of citizen science) as a starting point on which to build their framework. They note that, under the umbrella of citizen science, projects can either focus on data processing (delegative tasks) or on bringing external participants into the process of research (democratizing tasks). This distinction grounds their discussion of academic knowledge and the knowledge production typically associated with academic crowdsourcing, as well as of frequent outcomes for humanities crowdsourcing, such as content transformation and information synthesis. However, this discussion also allows the authors to identify where the framework of citizen science does not meet the needs of the humanities community, thereby requiring a separate system within which this work can take place.
Creating a framework for any digital research method presents a unique challenge. The
speed with which tools are released, adapted, and discontinued — in conjunction with
the relatively slow development and practice of the academic research process,
particularly in regard to traditional methods of publication — means that any sort of
so-called standard practice
can be very difficult to sustain, and research making
use of digital technologies often becomes commonplace within academic fields before
any type of structure can be proposed, peer-reviewed, and refined.
A lack of standardization in the early stages of implementing new academic research practice does not necessarily guarantee a negative outcome for the work conducted within these nascent technological environments. Such instances of trial and error can allow new practices to emerge, free from traditional boundaries, but it often means that meta-textual studies must simultaneously function as both historiography and theory for these fields: critiquing, reviewing and suggesting best practices for their continued use. Hedges and Dunn have set out to provide just such a structure with
their typology of arts and humanities crowdsourcing methods.
Though researchers in the humanities have acknowledged crowdsourcing as a method
since the early 2010s, the majority of publications on the subject have been
overviews of how crowdsourcing has been used within specific humanistic disciplines
or sub-disciplines.
The first three chapters of the book are devoted to an overview of crowdsourcing
as a method, including an examination of the historical and technological
developments that set the stage for a rise in modern academic crowdsourcing
(including the World Wide Web, Web 2.0, and business-oriented crowd models); the
scientific origins of crowdsourcing as a method, and how they have influenced
subsequent humanities-based work; and — perhaps most importantly — the presentation
of a typology of crowdsourcing, modeled after Short and McCarty's methodological commons.
Once the authors have presented the history and methods, and offered a shared
vocabulary for the actions being taken within digital crowdsourcing projects, they
turn their attention to the social elements of crowdsourcing, both for individuals
and communities alike. Topics include roles within projects (Chapter 5), motivations
for and benefits of participation (Chapter 6), ethical issues like exploitation of
volunteer labor and questions of data ownership (Chapter 7), and the role
crowdsourcing can play in the creation of knowledge and memory in regard to cultural
heritage (Chapter 8). The chapter on ethics is particularly welcome. When digital
methodologies involve members of the public in academic work, as is the case with
crowdsourcing, it becomes essential to create a system within which projects can be
evaluated, as such systems are needed not only for the benefit of the academic
research output, but to ensure the fair and ethical treatment of the communities
participating. There has been a fair amount of work published on the ethics of paid
crowdsourcing, both in academia and on platforms like Amazon's Mechanical Turk.
The discussion of labor and exploitation within the chapter on ethical issues
presents commonly-asked questions about the ethical grey area that exists around
volunteer participation in academic research: "While participants do not in
general regard themselves as exploited, their willingness to volunteer and their
professed enjoyment in participating does not in itself imply that humanities
crowdsourcing is in ethical terms positive, or even neutral."
For all cases involving ethics and crowdsourcing, the authors note that transparency
and regular communication are requisite elements for projects involving the public,
as well as acknowledgment of participants and open access to project outcomes in the
form of data. The suggestion of access to outcomes in the form of data is critical,
but the authors miss the opportunity to discuss a delicate but important issue: that
of open access to resulting publications which make use of project data. The authors
come close to this topic in an earlier discussion of motivations and benefits,
stating "Most humanities crowdsourcing projects, however, do not reward their
contributors in material or professional ways," having listed publication as a
material outcome for researchers in the previous sentence.
In the final chapter,
In their conclusion, the authors note that