paragraphs, I will first explore the transformative effects of these technologies, then
give some examples, and finish by discussing the implications for
recordkeeping concepts and for appraisal and selection.
A need to rethink archival methods
Some leading archival scholars, like Frank Upward and Barbara Reed, argue that the
archives and records profession is facing a widespread crisis. One of the obvious signs
of being in crisis is that professionals cannot 'reliably say what a record as a thing is
as our conceptual understanding of it blurs into data, documents, information,
the archive, and the plurality of archives. The settings in which we manage these
converged "things" continues to multiply and increase in complexity. Our new
information spaces with their vibrant diversity are paradoxically producing a
collapse of collective memory' (Upward et al., 2013, p. 40). There are some parallels
to be drawn with the alarmist view that David Bearman expressed as early as the late
1980s, when he proclaimed that 'the best methods of the profession were
inadequate to the task at hand' (Bearman, 1989, preface). Since Bearman vented his
concern, the information landscape has been in constant transformation. In his time,
the late 1980s, the administrative use of the Internet was still in its infancy. Tim Berners-
Lee had just started to work on what would become the World Wide Web. Social
media had not been born yet and the first SMS message would not be sent until 1992. Big Data and the
Internet of Things were still a science fiction fantasy. Most of these new media are
commonly used nowadays. The computational turn not only affected information
and communication behaviour in the personal realm but also profoundly transformed
information and communication patterns in administration and business. The
computational turn enabled the rise of new economic models which are based on
sharing commodities and services, with Airbnb and Uber as the best-known
examples. Despite the major changes in the use of ICT, the debate on appraisal and
selection has largely remained within the existing document-oriented paradigm.
Recently, the Australian Recordkeeping Roundtable paid attention to the
implications of the computational turn on recordkeeping functions, including
appraisal and selection. Kate Cumming and Anne Picot presented a valuable
overview of the challenges appraisal and selection are confronted with. Some of
them were diagnosed as technical (new media and applications, networks, changing
forms of records, data volumes and storage) and others as organisational (multiple
professional responsibilities, decentralised business processes, commercialisation
and proprietary systems) (Cumming & Picot, 2014, pp. 133-145). They conclude that
appraisal in archival institutions is still too much defined as 'a process to preserve a
documentary cultural heritage rather than identifying appraisal as laying the basis
for practical and accountable recordkeeping'. Although the authors delineate some
valuable directions that need to be explored to rethink and reformulate appraisal,
and call for developing a strategy to prioritise and to engage with business
operations, they pay relatively little attention to the fundamental effects that the
digitisation and informatisation of society have on the attributed function(s) of
appraisal. This raises the following question: what is needed for 'accountable
recordkeeping'?
Ubiquitous information technology
In 2011 the authoritative Dutch Scientific Council for Government Policy warned
against a precarious lack of awareness among policy-making officials about the far-
reaching implications of the networked information structures for the memory
functions of iGovernment. The Council emphasised that '[b]oth the importance of
'forgetting' - people should not be judged eternally on the information that
government has stored about them - and of saving and archiving require a radical
cultural transition and a firmly grounded strategy' (WRR, 2011, pp. 16 and 207).
The Council asserted that the government has changed from eGovernment - in
which ICT is mainly directed towards providing services - into iGovernment -
where ICT changes the relationship between government and citizens because
information-flows and data-networks are used for purposes of control and care.
The ubiquitous use of memory chips in innumerable applications and functions
leads to unprecedented volumes of recorded and processed data. Beyond the three
V's (the availability of high-volume, high-velocity and high-variety data), it is
especially the ability to search, aggregate, and cross-reference large data sets that
generates these unprecedented opportunities (Boyd & Crawford, 2012, p. 663). As a
result of these innovations, Chris Anderson, editor-in-chief of WIRED magazine,
announced the death of theory in 2008 in his much-discussed, contested but
nonetheless influential article in WIRED by stating: 'faced with massive data,
this approach to science - hypothesize, model, test - is becoming obsolete.
There is now a better way. Petabytes allow us to say: "Correlation is enough." We can
stop looking for models. We can analyze the data without hypotheses about what it
might show. We can throw the numbers into the biggest computing clusters the
world has ever seen and let statistical algorithms find patterns where science cannot'
(Anderson, 2008). Computer scientist Jim Gray introduced the fourth paradigm of
science in 2007. After empiricism (observation and experiment), theory (using
models, generalisations, hypotheses) and computation (simulating complex
phenomena), science is increasingly based on data intensive computing, which
unifies theory, experiment and simulation (Hey et al., 2009). This conflation of
correlation with causality and this naïve belief in the power and possibilities of data
to solve present-day problems are typical of these big data adherents.
Data in itself might be seen as innocent, but the processing is definitely not
(Rouvroy & Berns, 2013). It is the processing activity that makes data meaningful
and transforms data into information. Transforming data into meaningful
information cannot exist without a selective perspective. The terms data and
information are often improperly used as synonyms. Liebenau and Backhouse make
a clear distinction between data and information by defining data as 'symbolic
surrogates which are generally agreed upon to represent people, objects, events and
concepts' while information is 'the result of modelling, formatting, organising, or
converting data in a way that increases the level of knowledge for its recipient', or as
they summarise: 'information is data arranged in a meaningful way for some
perceived purpose' (Canhoto & Backhouse, 2008, p. 48). The techniques used for
modelling and organising data are increasingly computational algorithms.
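This distinction can be made concrete with a minimal sketch (the event list and the edits_per_user function below are hypothetical, invented purely for illustration): the raw tuples are mere symbolic surrogates for events, and only a processing step, with its built-in selective perspective, arranges them into information for a perceived purpose.

```python
from collections import Counter

# Hypothetical raw data: symbolic surrogates for file-access events.
# Each tuple represents (user, document, action); on their own the
# symbols carry no meaning for any particular purpose.
events = [
    ("alice", "report.pdf", "read"),
    ("bob", "report.pdf", "edit"),
    ("alice", "budget.xls", "edit"),
    ("alice", "report.pdf", "read"),
]

def edits_per_user(events):
    """Arrange the data for one perceived purpose: who changed records?

    The selective perspective is built into the algorithm itself:
    it keeps only 'edit' actions and discards everything else.
    """
    return Counter(user for user, _doc, action in events if action == "edit")

information = edits_per_user(events)
print(sorted(information.items()))  # -> [('alice', 1), ('bob', 1)]
```

A different perceived purpose (say, counting which documents are read most) would require a different selection over the very same data, which is exactly why the transformation from data to information can never be perspective-free.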
Algorithms are basically sets of rules or instructions that perform a certain
task by processing input into output. In the words of the Norwegian
media scholar Eivind Røssaak, computational algorithms have become the new
lingua franca of codes in the informational infrastructure and they increasingly
archives in liquid times
charles jeurgens threats of the data-flood. an accountability perspective in
the era of ubiquitous computing.