William Kilbride

Last updated on 15 February 2019

Contributing Authors at the Launch of Preserveren, Amsterdam, 28/11/18

Author's Note: In 2018 I was invited to write an essay for Dutch colleagues on the experience of the DPC and how the issues that have affected the development of digital preservation capability among our members might provide some comparison, correspondence and convergence with the experience of digital preservation in the Netherlands. Published in Amsterdam on World Digital Preservation Day, the volume presents a comprehensive and thoughtful guide to emerging requirements in this dynamic, diverse but collegiate community. The volume is freely available online, and the editors have graciously permitted this re-publication of my own contribution: in part to draw attention to the rest of the work for an English-speaking audience; in part because it is a potted history of the DPC and therefore sits comfortably on the DPC blog and with a DPC audience. Also, somewhat inevitably, this essay summarizes and re-states themes that regular readers of this blog will recognize. But it's a longer read than usual for this blog, so make yourself comfortable.

The original volume is an open access publication available online:


It is a privilege to be invited to contribute to this volume on digital preservation in the Netherlands, not least because of the impressive impact that Dutch contributors and projects have had over the years. This short paper will review the experience of close neighbours at the Digital Preservation Coalition (DPC), identifying themes and gaps from the history of the DPC, and proposing directions for the next decade. In doing so, this paper contemplates common and distinctive challenges; and it renews the longstanding invitation to collaborate, which is a defining characteristic of the digital preservation community around the world.

A Secure Legacy

In January 2018, the DPC published a new strategic plan.[i] There is much to commend in this plan, which establishes six interconnected programmes for the Coalition and provides a mandate from the community on whose behalf the DPC operates. It is not the purpose of this article to report or review that plan. Yet an easily overlooked detail makes a significant statement about the shared challenges and trajectories of digital preservation. The new plan sets out the DPC's mandate to 2022, the twentieth anniversary of our foundation. Therefore, hiding on the very front page is a paradoxical compliment and reproach that might well summarize the experience of the global digital preservation community: its enduring commitment to collaboration and the surprising obduracy of the problem it seeks to resolve.

This puzzle might be phrased in more personal terms: the DPC was first proposed at a conference in 2000[ii] which also heard from the KB about the NEDLIB project[iii]. I recall meeting a fellow archaeologist called Marcel Ras who had also found his way into the emerging field of digital preservation. The younger version of myself, not to mention the DPC's founders, would likely be astonished to know that almost twenty years later the digital preservation challenge still persisted, and perhaps would not perceive twenty years of the DPC to be something to celebrate. I certainly had a clear expectation that the digital preservation challenge would be solved within a few years and that we would then go back to our day jobs. The question for my younger self and Marcel's younger self then, perhaps as now, was when we would return to the more familiar environment of archaeology.

Minding the gaps?

Digital preservation was very different in 2000 and, from that perspective, there’s a lot to celebrate. The DPC’s earliest mission was to propose actions that would resolve the problems that were evident. The findings of a needs assessment, published under the title ‘Mind the Gap’, provided a roadmap of recommendations for the community under eight broad headings which included ‘growing awareness’, developing ‘repositories for all’, and the need for the emergence of a ‘new discipline’.[iv] Although there is a small but diverse body of literature about digital preservation from the first decade of the century, this report is a good summary of our expectations then and a useful benchmark for the progress we have made since.

It is surprising how fresh some of these recommendations seem more than a decade later. For example, recommendation 5 speaks of the need for better tools 'to build a business case for the long-term preservation of digital materials', while recommendation 11 calls for 'cross-disciplinary forums to allow both experienced individuals and organisations to exchange digital preservation best practice'.[v] Such needs may be expressed differently today, but they remain true.

In other areas there is clear evidence of progress such as recommendation 10 that calls for digital preservation to be embedded in the training of librarians and archivists, and recommendation 17 which calls for more tools to perform digital preservation activities.[vi] There has been significant expansion of professional development and a transformation in the availability of tools and products. Thus, there are reasons to be positive that we have indeed ‘minded the gaps’.

These early years of digital preservation were also marked by a parallel and perhaps less informed narrative that spoke to the consequences of data loss and the dysfunction that could follow. Coming hard on the heels of the Millennium Bug, gloomy predictions of a ‘digital dark age’ added urgency to our work.[vii] These compelling and eye-catching accounts were useful for generating headlines but were in many cases over-stated and, because they were overstated, may well have been ultimately self-defeating. Typically, the ‘digital dark age’ narrative did not anticipate the efforts of the digital preservation community and so it will never be evident whether such predictions were always fanciful, or whether they suffered from the ‘observer effect’, in which the simple act of anticipating a phenomenon changes its outcome.

Perhaps more surprising however are the challenges not identified in the report. These might be termed the ‘gaps between the gaps’ that were not recognized and therefore not addressed. Of all the things not stated in the report, the actions of humans seem the most important and impactful. Digital preservation may indeed be a systemic problem and need systemic solutions: but it is also anthropogenic. In the very broadest sense, digital preservation is a problem for people, created by people and addressed through people. If that had been recognized sooner, then perhaps it would be less surprising to find that continued efforts are required. Arguably the digital preservation challenges we face now have been shaped by an emphasis on systems and solutions without fully engaging with the ever-changing dynamics of the human beings that are its cause, its solution and its ultimate purpose.

Obsolescence as a choice

The challenges of digital preservation are typically framed in technical terms, and though the wider context in which we operate is better understood, technology remains the dominant discourse within the literature. Issues like file format obsolescence, media degradation, emulation and the loss of representation information are significant concerns. As early as 2003, Nancy McGovern and Anne Kenney had observed that policy and resources were also critical concerns for digital preservation, and that these, alongside technology, formed the three legs of the 'three-legged stool' of digital preservation.

In the last decade, though, there has been more attention to the role of agency in what might be termed the causes of obsolescence. If data loss is almost always the outcome of socio-economic structures embedded in specific historical contexts, then the causes of data loss are also to some extent amenable to control. If the early decades of digital preservation were an overt response to an emerging problem, then there has been a tendency for the digital preservation literature to exhibit 'solutionism',[viii] in which problem-solving overtakes the subtle appreciation of how problems arise. The contexts of data loss are better understood now than they were in 2006, and with them the processes through which it would be possible to make obsolescence obsolete.

From an historical perspective it is perhaps significant that digital preservation came into focus shortly after the rise of the personal computer in the late 1980s and early 1990s, and during the rapid expansion of the World Wide Web as a platform for office and home computing (there are many accessible accounts, though see Kirschenbaum 2016[ix] for a case study of one popular technology). The 1980s and 90s were decades without which digital technologies might have taken radically different directions. By extension, the economic forces and business environments that shaped the 1980s and 1990s created the norms of the digital universe. Had these forces been less consumerist, less disposable, more resilient, and more sustainable, then the endemic challenges of technical obsolescence, resource discovery and short-termism may not have arisen in the way that they subsequently have. By some strange process that is yet to be properly delineated, digital preservation came into existence as a response to the creative destruction implied within neo-liberal economics.

The specific contexts of data loss have come into focus over the last decade, and so the scope of the digital preservation challenge has grown. The space once occupied by risks of obsolescence or media rot is now crowded also with concerns about ill-managed rights, out-of-control political interests, failing markets and simple human frailties. As confidence with technical challenges has risen, so there is a greater need to highlight the human behaviours behind data loss. The open nomination process for the 'BitList' of Digitally Endangered Species has been a significant milestone for the DPC, considering the nature of the challenge and how to address it (see Kilbride 2018[x] for an overview and introduction). Corporate abandonment, malicious deletion and ill-managed encryption have emerged as long-term threats to our digital memory in ways that were not fully recognized in 2006. Barely a week goes by without some new evidence of duplicitous erosion, deletion or obfuscation by some rich or powerful agent who seeks to sanitise or erase uncomfortable narratives, suppress unwelcome truths or conceal historical facts. Less sensational but perhaps more harmful in the long run is the impact of corporate failure as a risk to the digital estate, especially in the context of cloud computing. The subtle message in all of this is the need to situate digital preservation quite differently within our organizations and communities: as a counterbalance to challenges more sinister than file-naming or bit rot. Obsolescence is not some pre-ordained, invisible force. It is no longer inevitable: and being avoidable, it can only coincide with some form of negligence.

Solutions and plans for their development were very much at the forefront of digital preservation when the DPC was established, and they are laid out very succinctly in the 'Mind the Gap' report. As these solutions have become available, so it has become obvious that the contexts of data loss are always to some extent a choice, and that no solution in the world could ever suffice on its own. The digital preservation community has exposed obsolescence as a convenient excuse and a lucrative business model. It's not yet clear what can be done with this revelation, but in the coming decades our role will be to ensure that responsibilities for preservation are more meaningful and more widely understood.

Digital Preservation as Community

The 'Mind the Gap' report was explicit about the need to encourage a new discipline of digital preservation and for training in new skills to become available.[xi] It stopped short of calling for a new profession. Understanding how digital preservation has progressed over the years encourages some thinking about the changing shape of the professional cohort that now delivers it.

By any measure that cohort is larger and more diverse than ever: a fact represented in some simple statistics from the recent history of the DPC. The Digital Preservation Handbook was launched in 2002 as the work of two authors: the 2016 edition credits thirty-three.[xii] The Digital Preservation Awards in 2010 had one winner: in 2018 there were six. In 2009 the DPC had two staff; in 2018 it has seven. In 2009 the DPC had 33 members: in 2018 there are 83. This growth is welcome, but it generates a certain amount of disruption. The community is more diverse, the use cases for preservation tools are more demanding, the requirements more expansive and expectations more exacting. It may seem like a good time to move into the digital preservation business, but only for those able to deal with these disruptive, dynamic and eccentric requirements.

This growth has not been a straight line and there are some significant challenges associated with it. Again, it is worth remembering that practical and concerted digital preservation really began in the late 1990s, coinciding with an unprecedented economic boom. It was such a long boom that economists and bankers congratulated themselves that this was the new norm. They were wrong: it turned out that significant and unsustainable assumptions were hidden in a financial sector that wasn't able or willing to prepare for the crash that would come. The question arises whether casually unsustainable assumptions embedded themselves in our plans for digital preservation in the early 2000s, too.

OAIS set a high standard for digital preservation in 2002.[xiii] It has provided a shared language and some shared processes. It needs to be read through the prism of reasonable aspiration in the 2000s: there is no explicit encounter with values or vision and too little expectation of changing context. 'Mind the Gap' anticipated many of the technical and policy challenges but did not foresee the machinations that eviscerated the public sector. It's not a surprise that since 2010 it has become a lot more fashionable to talk about minimal effort ingest[xiv], parsimonious preservation[xv] and 'Preserving Digital Objects With Restricted Resources'[xvi]. This is not simply about lack of resource: it's about competition for resources.

Seen in this context it is worth recognising that the digital preservation community has not been immune from the blunt trauma of economic conditions either. Consider, for example, the widespread redundancies and recruitment freezes that followed the banking crisis in 2008. Many agencies in the UK with an interest in digital preservation simply closed down in the turmoil that followed, including several significant DPC members and founders; and those that survived had little choice but to shed much of their previous complement of archivists, conservators and librarians. The normal employment cycle stalled and new generations of students with new skills were effectively locked out.

This had a profound effect on individuals and it postponed the emergence of a new ‘digital preservation’ profession. Existing professional channels were reinforced instead, and salaries were fixed at levels congruent with library and archive posts, which further inhibited the recruitment or retention of in-demand developer skills. 

The last decade has confirmed the earlier view that digital preservation is tricky. We've become used to the idea of working without all the resources needed to do all the things which seemed necessary in 2002. In many practical senses this generation is materially worse off now than when 'Mind the Gap' was published. David Rosenthal has observed that 'Money turns out to be the major problem facing the future of our digital heritage'.[xvii] In 2006 it was all but impossible to anticipate the scale of the challenges that would arise.

Preservation, Access, Impact

The ‘Mind the Gap’ report makes almost no mention of users, except in the context of the exploitation of intellectual property. This seems the most significant gap of all.

For digital preservation, considerations about users are mostly expressed via dissemination efforts, most immediately in the design and delivery of 'Dissemination Information Packages' and the 'Designated Community' (see Lavoie 2014[xviii] for a fuller explanation). However, access and dissemination alone offer little justification for preservation: access is not an end in itself any more than preservation is. For example, the 'Blue Ribbon Task Force' examined the value of digital preservation insofar as it secures 'depreciable durable assets' which are 'long lasting and produce a flow of value through time'.[xix] It is hard to think of a single case in which access alone brings benefits, because value is constituted after access. Like preservation or interoperability, access is a necessary but not a sufficient condition for impact. There is no doubt that digital preservation needs to be configured around access: but access will only make sense if it is configured to enable the benefits that accrue from timely, dependable and supported use of data.

This may sound like word play, but it has significant implications for practical digital preservation. It is easy to describe an access function; it is much harder to translate 'impact' into concrete actions within preservation plans. It gets even harder when we need to demonstrate the links between the implementation of a digital preservation plan and the outcomes in terms of impact and value that depend upon it.

The foundations of this user-focussed and value-based approach are set firmly within the digital preservation standards. It is surprising that there has been so little practical talk about how to test, predict or report changing user requirements and how to embed these within the ongoing delivery of digital preservation services. This oversight within 'Mind the Gap' seems to be repeated in the subsequent literature too. The energy with which organizational and technical challenges have been researched and resolved does not seem to be matched by an understanding of the user communities that repositories are established to support. Stated more formally, it is hard to track the impact of changing user needs on meaningful re-evaluations of the representation information required to ensure the independent utility of a digital object, nor is it clear that repositories maintain an ongoing relationship with their designated communities.

Digital preservation facilities that ignore users end up with two difficulties: their repositories simply won’t work and the impact they seek to deliver is lost. If true, it is a significant risk to the relevance and vitality of the community.  Is it true that the digital preservation community doesn’t care about users enough to embed them into the day-to-day operation of its facilities? The apparent silence can be explained in perhaps two ways.

In some cases, digital preservation is embedded within institutions that already have robust and well-documented user bases, with feedback mechanisms to report changes and trends in user needs. That's certainly true of the larger memory institutions that already have a public profile and therefore need to manage user expectations. Such services may not be reported in the digital preservation literature partly because they are so well established that they are not of research interest, and partly because when they are described it is at different conferences and with different peers. In other cases, it may be that efforts are so concentrated on submission that digital preservation research just hasn't made it to users yet.

Each of these answers is plausible but neither really accounts for the silence about ongoing tests and assessments. There is a pattern to research, but the need to capture and track the requirements of designated communities should not be an afterthought in digital preservation architectures.

There is a further and perhaps more demanding challenge in the relationship between repositories and their users, embedded deeply within the digital preservation literature and most clearly articulated in OAIS: digital preservation technologies have yet to face up to the significant challenges of privilege and decolonization which have arisen in the last decade. Again, considering the significant impacts that discourses of inclusion have had since 2006, it's perhaps not surprising that these subtle challenges were not obvious then.

One of the strengths of digital preservation has been its willingness to adopt tools and approaches from many different disciplines. For example, OAIS is the product of the Consultative Committee for Space Data Systems. To some extent that origin haunts the language and assumptions of the model, and because OAIS is the lingua franca of digital preservation generally, the values and norms of space science lurk below the surface of just about every digital preservation conversation.

This contribution has been immensely welcome and potent, but it should also be set alongside important trends in archival and museological theory which tend to the view that meaning-making can be hard and contradictory. The textual turn of cultural hermeneutics and poststructuralism in particular has been controversial, and it has been argued that the whole genre of post-truth informatics has some origin in the legacy of postmodernism (D'Ancona 2017).[xx] It has certainly been a mixed blessing for archives, libraries and museums.

On one hand, the recognition that knowledge production is a fundamental tool in the reproduction of power has transformed memory institutions from the gatekeepers of authoritative resilience to the enablers of progressive narrative(s). Derrida equated archives with a sort of house arrest[xxi]: both as the source and containment of power, arranged to the practical convenience of the authorities, and only shared on asymmetrical terms with the public. It’s no small accomplishment to note that for three decades now any number of disenfranchised communities have taken back control of cultural storehouses to establish new and often conflicting histories that subvert established norms and empower those previously excluded.  Archives, libraries and museums have largely welcomed these new if at times unruly patrons on the assumption that if the epistemology of the institution is not fundamentally about justice then, by default, its purpose is to sustain injustice.[xxii]

On the other hand, if signifier and signified are in permanent renegotiation, and if context is the last and only arbiter of meaning, then anyone can interpret everything to mean anything. That seems significantly more challenging in the context of memory institutions, where the absence of authorial voice intensifies the impossibility of authoritative meaning-making. In a crisis of relativism and self-congratulatory truth-making, where power is self-creating and context is fluid, what's the use of archives at all? The legacy of postmodernism could be summarized as follows: everyone empowered by their own narrative; and everyone empowered to deny everyone else's.

If the challenge to meaning-making began with postmodernism then it has been turbo-charged by technology. Francophone theorists of the 1960s and 1970s legitimated the challenge to legitimacy while anglophone engineers of the 1970s and 1980s delivered the machinery of change. The result: anyone can assemble their own history from the many ubiquitous sources that they choose not to ignore; they can publish it; and in so doing can find an audience to share their pain; and then all can live secure in self-made echo-chambers of reflexive half-truths repeated so often that they might as well be whole truths. That's becoming the all-too-common experience of social media and it's no wonder it's become fashionable to call it anti-social media.

The difficult history of meaning-making in the late 20th and early 21st century seems strangely at odds with the processes and norms adopted in digital preservation, especially with respect to representation information. Representation information is the component of an information object that holds the semantic and structural detail which provides the transparency required to ensure that a 'digital object' can become an 'information object' in the hands of a designated community. Arguably, representation information is the unique characteristic that distinguishes digital preservation from every other kind of content management. Representation information is itself interpreted using representation information which, at face value, implies a sort of recursive absurdity. OAIS avoids this in two ways: by linking representation information into networks; and by appealing to an underlying knowledge base which we can take for granted. For example, a person whose knowledge base includes an understanding of English will be able to read, and understand, an English text. So the extent of representation information is mapped against the implicit knowledge shared between two agents within an information exchange. Information objects are therefore generated through a mix of data object, representation information and implied knowledge.
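To make the idea concrete, the sketch below models in Python how a chain of representation information might 'bottom out' in the knowledge base of a designated community. It is purely illustrative and not drawn from OAIS itself or from any DPC tool; all class names, fields and example labels are hypothetical assumptions introduced for this example.

```python
# Illustrative sketch only: a toy model of representation information networks
# and the designated community's knowledge base. Names are hypothetical.

from dataclasses import dataclass, field
from typing import List


@dataclass
class RepresentationInfo:
    """A node in a representation information network."""
    label: str                                                   # e.g. "survey data dictionary"
    interpreted_by: List["RepresentationInfo"] = field(default_factory=list)


@dataclass
class DesignatedCommunity:
    """The knowledge the archive assumes its consumers already possess."""
    knowledge_base: set


def is_understandable(node: RepresentationInfo, community: DesignatedCommunity) -> bool:
    """A data object becomes an information object only when every chain of
    representation information terminates in the community's existing
    knowledge base, rather than recursing forever."""
    if node.label in community.knowledge_base:
        return True
    if not node.interpreted_by:
        return False  # dead end: nothing left to interpret this node
    return all(is_understandable(parent, community) for parent in node.interpreted_by)


# Example: a dataset described by a data dictionary and a CSV specification,
# both of which are written in English; English is the implied knowledge.
english = RepresentationInfo("written English")
data_dictionary = RepresentationInfo("survey data dictionary", interpreted_by=[english])
csv_structure = RepresentationInfo("CSV structure specification", interpreted_by=[english])

community = DesignatedCommunity(knowledge_base={"written English"})

print(is_understandable(data_dictionary, community))  # True
print(is_understandable(csv_structure, community))    # True
```

The point of the toy model is simply that whoever defines the knowledge base, and whoever assigns the links between nodes, controls whether a given user can ever reach meaning at all.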

This has important consequences for anyone attempting to challenge privileged narratives. If the link to an authoritative definition within a representation network is one of the keys to unlocking meaning, then whoever gets to assign that link or manage the end-point is a very important individual, and that dependency is open to abuse. That's even more problematic when one considers the implied knowledge that sits alongside representation information. How might the extent of implied knowledge be established for a preservation process that evolves over decades? There are two important themes here: the designated community and the mechanisms that measure changes within it. OAIS assumes a special class of consumers that it defines as the designated community, a class that should be able to understand the preserved information. The designated community has a role in defining (and thus setting boundaries to) the extent of representation information. So long as the OAIS charts and tracks that community through time, representation is manageable.

This is self-evidently useful because the alternative is recursive absurdity. But reading that through the lens of three decades of cultural theory, it implies that, if you're not part of the designated community, you're not expected to use or understand the collection, and the archive has no explicit responsibility to help you and no requirement to listen to you. This might be true in the context of academic research, where a relatively small but expert group of professionals would be expected to use complex datasets and would be motivated enough to cope with opaque documentation and annotations. It is altogether more concerning when identities, actions or meanings are in dispute: where honest misunderstanding may arise or faux conflicts be engineered and prolonged.

At present, best practice in the digital preservation community means that the digital archivist is empowered, in fact is required, to exercise a kind of intellectual exclusion that is out of step with just about every other kind of memory institution. In summary, archives, libraries and museums have spent 30 years coming to terms with inclusion and polysemy, challenges that were barely considered by the digital preservation community in 2006.

Digital Preservation: Community not Process

This article started by wondering why the role and mission of the digital preservation community have remained relevant for so long and, in particular, whether anything can be gleaned from the experience of the Digital Preservation Coalition that might inform the Dutch experience of digital preservation. Surely the problem has been fixed? Surely we can get back to our real jobs?

The answer to both questions, for better or worse, seems to be an emphatic no, at least in the context of the UK and Ireland where the DPC was founded, and almost certainly in the context of the Netherlands, too. Three themes emerge, which will certainly influence the direction of the DPC and which, because they are universal challenges, will invite ever closer collaboration with colleagues around the world.

Firstly, perhaps most obviously, the nature of the technical preservation challenge continues to change and therefore the tools and approaches that have emerged need constant renewal and revision. This need for renewal and revision creates a tension for information managers who seek to ensure the integrity of their own processes. Understanding how to update and automate procedures without compromising them is a key challenge in the years ahead. 

The community has been conspicuously successful in the provision of tools and processes for data management, even if there is more to do. It has been less successful at embedding the outcomes of research into practical workflows, and it has spent too little time considering how data loss arises in practical contexts on a daily basis. These processes, which are not purely technical, will endure for as long as agencies, corporations and individuals are able to shirk or defer requirements for preservation that enable reasonable expectations of transparency and authenticity. We may never make obsolescence obsolete, but it is certainly possible to identify those for whom data loss is a convenient excuse or a lucrative business model.

We have also been less effective in involving users in our discussions, especially on selection, preservation planning and representation information. A sustained and informed dialogue between repositories and their audiences is required. If it is possible to capture and embed the potential of the data, then such changes could have a transformative effect. Digital preservation is not done for the good of the data, but for the audiences and communities that have hitherto been all but absent from our literature. The coming challenge is to configure our processes around the needs of people and the opportunities they seek to exploit.

In 2006 the DPC conceived of digital preservation as a series of gaps to be avoided through timely interventions and adaptation of process. What’s become apparent since then is that digital preservation is not a process, it’s a community. It’s time to stop minding the gaps: it’s time to start seeing the opportunities.

Acknowledgements

I am grateful to Margriet van Gorsel, Erika Hokke, Bart de Nil and Marcel Ras for the invitation to write this article, to Sara Day Thomson and Marcel Ras who commented on drafts of this text prior to publication, and to De Stichting Archiefpublicaties for permitting this re-publication.

References

[i] Digital Preservation Coalition A Secure Digital Legacy: The Digital Preservation Coalition 2018-2022 (2018) online at https://www.dpconline.org/docs/miscellaneous/about/1755-dpc-strategic-plan-2018-22/file last accessed 7/9/2018

[ii] Day, Michael ‘Preservation 2000’ in: Ariadne 26 (2001) online at: http://www.ariadne.ac.uk/issue26/metadata/ last accessed 7/9/2018

[iii] van der Werf-Davelaar, Titia, ‘Long-term Preservation of Electronic Publications, The NEDLIB project’ in DLib Magazine 5.9 (1999) online at http://www.dlib.org/dlib/september99/vanderwerf/09vanderwerf.html  last accessed 7/9/2018

[iv] Waller, Martin and Sharpe, Rob Mind the Gap: Assessing Digital Preservation Needs in the UK, (Digital Preservation Coalition, 2006) online at: https://www.dpconline.org/docs/miscellaneous/advocacy/340-mind-the-gap-assessing-digital-preservation-needs-in-the-uk/file  last accessed 7/9/2018 

[v] Waller, Sharpe, Mind the Gap p.36-37

[vi] Waller, Sharpe, Mind the Gap p.37-38

[vii] e.g. Bergeron, Bryan Dark Ages II: When the digital data die, (Prentice Hall, 2001)

[viii] Morozov, Evgeny To Save Everything Click Here: Solutionism and the Urge to Fix Problems that Don’t Exist (Penguin, 2014)

[ix]  Kirschenbaum, Matthew Track Changes; a literary history of word processing (Belknap, 2016) 

[x]  Kilbride, William Sic Transit Gloriae Digitalis: BitList Beta (DPC Blog, 2018) online at:  https://www.dpconline.org/blog/sic-transit-gloria-digitalis-the-bitlist-in-beta last accessed 3/10/18

[xi] Waller, Sharpe Mind the Gap p. 37

[xii] Digital Preservation Coalition Digital Preservation Handbook (2nd Edition, 2016) online at: https://dpconline.org/handbook last accessed 3/10/2018

[xiii] Lavoie, Brian The Open Archival Information System (OAIS): Introductory Guide 2nd Edition, Digital Preservation Coalition Technology Watch Reports 14-02 (2014) online at: http://dx.doi.org/10.7207/twr14-02  last accessed 3/10/18

[xiv] Bolette, AJ, Blekinge AA and Christiansen, KF ‘Minimal Effort Ingest’, in Proceedings of the 12th International Conference on Digital Preservation (School of Information and Library Science, University of North Carolina at Chapel Hill, 2016) online at: https://phaidra.univie.ac.at/o:429591 last accessed 3/10/18

[xv] Gollins, Tim Parsimonious Preservation: Preventing Pointless Processes!  A Simple Step that takes Digital Preservation a Long Way, (Online Information 2009, 2009) online at https://www.nationalarchives.gov.uk/documents/information-management/parsimonious-preservation.pdf last accessed 3/10/18

[xvi] POWRR Preserving Digital Objects With Restricted Resources (Digital POWRR, 2012), online at: http://digitalpowrr.niu.edu/ last accessed 3/10/18

[xvii] Rosenthal, David Storage Will Be A Lot Less Free Than It Used To Be (DSHR Blog, 2012) online at: http://blog.dshr.org/2012/10/storage-will-be-lot-less-free-than-it.html last accessed 3/10/18

[xviii] Lavoie OAIS (2014)

[xix] Blue Ribbon Task Force Sustainable Economics for a Digital Planet: Ensuring Long-Term Access to Digital Information, Final Report of the Blue Ribbon Task Force on Sustainable Digital Preservation and Access (BRTF, 2009) p. 25

[xx] D’Ancona, Matthew Post Truth: The New War on Truth and How to Fight Back (Ebury, London, 2017)

[xxi] Derrida, Jacques and Prenowitz, Eric Archive Fever: a Freudian Impression (Diacritics 25, 1995, pp 9-63) p. 10.

[xxii] O'Neill, Mark ‘Essentialism, adaptation and justice: Towards a new epistemology of museums’ in Museum Management and Curatorship 21, (2006) pp 95-116

