Mass digitization and the garbage dump: The conflicting needs of quantitative and qualitative methods

Gooding, Paul (2012) Mass digitization and the garbage dump: The conflicting needs of quantitative and qualitative methods. Literary and Linguistic Computing, 28 (3). pp. 425-431. ISSN 1477-4615


Abstract

There has been widespread excitement in recent years about the emergence of large-scale digital initiatives (LSDIs) such as Google Book Search. Although many have welcomed the prospect of a digital recreation of the Library of Alexandria, these projects have also attracted great controversy. This article examines one such controversy: the suggestion that mass digitization is creating a virtual rubbish dump of our cultural heritage. It discusses some of the quantitative methods being used to analyse the big data that have been created, and two major concerns that have arisen as a result. First, there is the concern that quantitative analysis has inadvertently fed a culture that favours information ahead of traditional research methods. Second, little information exists about how LSDIs are used for any research other than quantitative methods. These problems have helped to fuel the idea that digitization is destroying the print medium, when in many respects it still closely remediates the bibliographic codes of the Gutenberg era. The article concludes that more work must be done to understand what impact mass digitization has had on all researchers in the humanities, rather than just the early adopters, and briefly outlines the work that the author is undertaking in this area.

Item Type: Article
Faculty \ School: Faculty of Arts and Humanities > School of Literature, Drama and Creative Writing
Depositing User: Pure Connector
Date Deposited: 15 Dec 2014 15:22
Last Modified: 10 Nov 2022 11:32
URI: https://ueaeprints.uea.ac.uk/id/eprint/51380
DOI: 10.1093/llc/fqs054