Sarah Middleton

Last updated on 28 September 2016

In this issue:

  • What's On - Forthcoming events from November 2011 onwards
  • What's New - New reports and initiatives since the last issue
  • What's What - Too Big to Fail - William Kilbride, DPC
  • Who's Who - Sixty second interview with Janet Delve, KEEP Project, University of Portsmouth
  • One World - Italian legislation on the preservation of electronic records - Mariella Guercio, University of Urbino
  • Your View? - comments and views from readers

What's New is a joint publication of the DPC and the DCC


What's On

The DCC have a number of events coming up that may be of interest to you. For further details on any of these, please see our DCC events listings at http://www.dcc.ac.uk/events/. You can also browse through our DCC events calendar to see a more extensive list of both DCC and external events.

Digital Preservation Training Programme (DPTP)
14-16 November 2011
http://www.dptp.org
The DPTP is a modular training programme, built around themed sessions that have been developed to assist you in designing and implementing an approach to preservation that will work for your institution. Through a wide range of modules, the DPTP examines the need for policies, planning, strategies, standards and procedures in digital preservation, and teaches some of the most up-to-date methods, tools and concepts in the area. It covers these topics via a mixture of lectures, discussions, practical tasks and exercises, and a class project.

Writing and using a Preservation Policy
17 November 2011
http://www.bl.uk/blpac/policy.html
Do you need to develop or revise a preservation policy as part of your long-term collections management strategy? Would you like guidance on what to consider and include? This training day is aimed at those who are new to the preservation of library and archive collections and would like to learn more. The day starts with an overview of preservation policies, what to include and how to format a policy. Subsequent sessions include case studies on implementing policies and a writing exercise.

How do we make the case for research data centres?
17 November 2011
http://www.rin.ac.uk/news/events/data-centres-their-use-value-and-impact
The recent report Data Centres: their use, value and impact was co-sponsored by RIN and by JISC through the Managing Research Data Programme. To mark the launch of this important report, RIN are hosting a discussion event at the Wellcome Collection.

Intellectual Property Rights and Digital Preservation
21 November 2011
http://www.dpconline.org/events/details/36-intellectual-property-rights-and-digital-preservation?xref=36
This briefing day, co-sponsored by the DPC and JISC Digital Media, is intended to examine and discuss key concepts of intellectual property rights as they impact on digital preservation. It will provide a forum to review and debate the latest developments in the law as it applies to preservation and it will initiate a discussion on how simple legal processes can be deployed. Based on commentary and case studies from leaders in the field, participants will be presented with emerging tools and technologies and will be encouraged to propose and debate the future for these developments.

The Future of File Format Identification: PRONOM and DROID User Consultation
28 November 2011
http://www.dpconline.org/events/details/37-droid?xref=37
The National Archives is proposing to launch a new phase of development of its DROID tool, and is seeking to engage with user groups and stakeholders from the digital preservation community, government and the wider archives sector to help inform and discuss potential developments and user needs. As part of this process, The National Archives, in conjunction with the Digital Preservation Coalition, invites interested parties to attend a one-day workshop, hosted at Kew, to discuss their experiences of using DROID and PRONOM in their respective disciplines, how the tools fit their use cases, and both the positive and negative experiences of the tools and of their interaction with The National Archives.

Beta Launch of New Geospatial Technology Tools
28 November 2011
http://code.google.com/p/jiscgeo/wiki/ProgrammeMeetingAgenda
JISC is pleased to announce the launch of several new beta products and tools centred on geospatial technology and aimed at universities, colleges and schools. Come along and experiment with these soon-to-be-released products at a special one-day event on Monday 28 November at Ravensbourne College in London. We encourage students, researchers and teaching staff from all disciplines to attend and sample these fantastic new tools so that we can further their usefulness to everyone in academia.

3rd IEEE International Conference on Cloud Computing Technology and Science
29 November-1 December 2011
http://2011.cloudcom.org/
The “Cloud” is a natural evolution of distributed computing and of the widespread adoption of virtualization and SOA. In Cloud Computing, IT-related capabilities and resources are provided as services, via the Internet and on demand, accessible without requiring detailed knowledge of the underlying technology. The IEEE International Conference and Workshops on Cloud Computing Technology and Science, steered by the Cloud Computing Association, aim to bring together researchers who work on cloud computing and related technologies.

Annual General Meeting of the Digital Preservation Coalition
1 December 2011, Belfast
http://www.dpconline.org/
The Annual General Meeting of the Digital Preservation Coalition will take place at the Public Record Office of Northern Ireland, Belfast, on the afternoon of 1 December. Details to follow.

7th International Digital Curation Conference
5-7 December 2011
http://www.dcc.ac.uk/events/idcc11
Digital curation manages, maintains, preserves, and adds value to digital information throughout its lifecycle, reducing threats to long-term value, mitigating the risk of digital obsolescence and enhancing usefulness for research and scholarship. IDCC brings together those who create information, those who curate and manage it, those who use it, and those who research and teach about curation processes. This year the theme for the conference is "Public? Private? Personal? Navigating the open data landscape".

7th IEEE International Conference on e-Science
5-8 December 2011
http://www.escience2011.org/
Scientific research is increasingly carried out by communities of researchers that span disciplines, laboratories, organizations, and national boundaries. The e–Science 2011 conference is designed to bring together leading international and interdisciplinary research communities, developers, and users of e–Science applications and enabling IT technologies. The conference serves as a forum to present the results of the latest research and product/tool developments and to highlight related activities from around the world.

Austin PASIG 2012
http://sites.tdl.org/austinpasig/
The PASIG is open to institutions and commercial organizations interested in learning about and sharing practical experiences in the following:

  • Comparison of high-level OAIS architectures, services-oriented architecture work, and use cases
  • Sharing of best practices and software code
  • Cooperation on standard, open, ‘in-a-box’ solutions around repository technologies
  • Review of storage architectures and trends and their relation to preservation and archiving architectures and eResearch data set management
  • Discussion of the uses of commercial third party and community-developed solutions

The organization is focused on sharing open computing solutions and best practices. But while sharing information about state-of-the-art developments in standards and open source is important, this is not a standards-setting organization. It is a place to share practical experiences, successes, pain points, and potential topics for further collaboration.

 


What's New

For more information on any of the items below, please visit the DCC website at http://www.dcc.ac.uk.

KAPTUR - managing visual arts research data in UK Higher Education and Research
http://vads.ac.uk/kaptur/outputs/20111010_KAPTUR_press_release.pdf
The Visual Arts Data Service (VADS), a research centre of the University for the Creative Arts, has been awarded funding from the JISC Managing Research Data programme 2011 for the KAPTUR project. Building upon the work undertaken by the Digital Curation Centre (DCC), and previous JISC Managing Research Data projects such as Project CAiRO and JISC Incremental, KAPTUR will bring together the Glasgow School of Art; Goldsmiths, University of London; University for the Creative Arts; and the University of the Arts London, to discover, create and pilot a sectoral model of best practice in the management of research data in the visual arts.

Call for stories of data loss and/or preservation
http://j.mp/datastories
Have you, or has your institution, lost important data through hardware failure, hardware or format obsolescence, migration to a new format, or other means? Were you once in danger of losing important data, but you managed to recover or preserve it? The National Digital Stewardship Alliance would love to hear about it - you can tell your story in 3-4 sentences using the online form.

Repository Fringe 2011
http://rfringe11.blogs.edina.ac.uk/2011/10/14/454/
Presentations, videos, blog posts and news items from this year's Repository Fringe event are now available on the RF11 website.

Research Data MANTRA online course
http://datalib.edina.ac.uk/mantra
The Research Data Management Training (MANTRA) project has produced an open, online training course to help disseminate good practice in research data management at the University of Edinburgh and beyond. The Data Library team at EDINA produced the materials over the past year as part of the JISC Managing Research Data programme.

DaMSSI project delivers career profiles
http://www.rin.ac.uk/our-work/researcher-development-and-skills/data-management-and-information-literacy and http://www.dcc.ac.uk/training/data-management-courses-and-training/career-profiles
The JISC/RIN-funded Data Management Skills Support Initiative (DaMSSI), in collaboration with DCC, has produced a series of five career profiles that aim to demonstrate how data management skills contribute to and underpin high-quality performance in a number of professions.

Patients Participate! Bridging the gap between information access and understanding
http://bit.ly/oRor9b
Publishing a lay summary alongside every research article could be the answer to assisting in the wider understanding of health-related information, say the findings of a new citizen science project called Patients Participate! Commissioned by JISC and carried out by the Association of Medical Research Charities, the British Library and UKOLN, Patients Participate! asked patients, the public, medical research charities and the research community, ‘How can we work together in making sense of scientific literature, to truly open up research findings for everyone who is interested?’ The answer came from patients who explained that they want easy-to-understand, evidence-based information relating to biomedical and health research.

Enhanced publication: from experiment to practice
http://www.surffoundation.nl/enhancedpublications
Enhanced publication is a new type of scientific/scholarly publication whereby researchers make publications available on the Internet in combination with related research data. Researchers at a number of universities and research institutions gained experience in 2011 with enhancing publications during six projects financed by SURFfoundation. The emphasis in previous projects was mainly on developing the technical facilities for creating enhanced publications. This year, it was the turn of the researchers themselves to enhance their publications and to present them in context.

New DCC guide to citing datasets
http://www.dcc.ac.uk/resources/how-guides/cite-datasets
The DCC has released a new guide on citing datasets, which explains how researchers can create links between their publications and the underlying data so that each can be found from the other. It also provides advice for repository managers and data archivists who wish to make their data holdings easier to cite.

KE Report: Addressing legal barriers in sharing of research data
http://www.knowledge-exchange.info/Default.aspx?ID=461
It is difficult for researchers and those supporting them to understand how open access to research data can be legally obtained and re-used, because European and national laws vary and researchers work across national boundaries. A possible approach to providing clarity would be for researchers to assign a licence to their data; this practice could be incorporated in a code of conduct for researchers. This is one of the recommendations of the report ‘The legal status of research data in the Knowledge Exchange partner countries’, which was commissioned by Knowledge Exchange (KE) and written by the Centre for Intellectual Property Law (CIER). The aim of the report was to provide clarity by analysing the intellectual property regimes in the four KE countries and European database law. The report also provides three recommendations for achieving better access: making contractual arrangements with authors, harmonising European copyright law and setting up policies on commercial interests.

Workshops at 7th International Digital Curation Conference (IDCC)
http://www.dcc.ac.uk/events/idcc11/workshops
The DCC is delighted to announce that a draft programme of workshops has just been published and registration is now open. Please note that although the workshops take place before and after the conference, you don't need to register for the conference to attend them. This year's International Digital Curation Conference will run from 5-7 December in Bristol and will be presented by the Digital Curation Centre in partnership with the Coalition for Networked Information (CNI).

DigCurV Newsletter
http://www.digcur-education.org/eng/News/Ahead-of-the-CurV-Newsletter
The Digital Curator Vocational Education Europe (DigCurV) project is delighted to announce the first issue of the DigCurV newsletter, 'Ahead of the CurV'. The newsletter is one of the ways in which the project is reaching out to people working in the library, museum, archive and cultural activities sector with an interest or involvement in digital curation, digital preservation and training. The purpose of the newsletter is to share news and information about the project's activities and the sector, as well as events and developments in digital curation from the members of our network and related projects. In the first issue you will find an update on the project, features on Vilnius University Library and the Biblioteca Nazionale Marciana, and details about how you might join the DigCurV network. You can also follow DigCurV on Facebook, LinkedIn, Slideshare and Twitter.

 


What's What - Editorial: Too Big to Fail

William Kilbride, DPC

Glasgow is pretty much the top left hand corner of the Roman Empire. More precisely, the Empire stretched from the shores of the Red Sea to a few metres north of the bus wash facility at the Gavinburn Vehicle Depot in Old Kilpatrick, where Antoninus Pius finished his wall. The Antonine Wall is much admired locally for the opportunities it brings for trite marketing and parochial myth-making, especially in my own bucolic suburban idyll north of the city. In reality the Romans weren't in Scotland for terribly long, nor were they here during the fashionable height of imperial power. Fourth century emperors like Theodosius or Constantine were much smarter and much wealthier than second century wimps like Antonine. But I'm sure it didn't seem like that at the time: I'm sure my Pictish ancestors – presented with a socking great military barricade across their countryside – thought the Imperium Romanum was too big to fail.

If global empires and dominations can fail then currencies and markets and corporations can too. The real wonder of our age is how quickly democratic governments can let so much else fail while nationalising enormous commercial debts. But for now let's simply note the linkage between market volatility and digital technology. What traces will survive of the extraordinary follies of our times? What is the relationship between digital preservation and the digital economy?

Before we get started, let’s be wary of hype. The digital economy is really just the same old economy. It is no cleverer and is no more benign than it used to be. Demand, volume and accessibility of information do not create business opportunities on their own: the ability to exploit (or feign) privileged access to information is not a new business plan. And it seems naïve to wonder whether the speed and volume of exchanges make capitalism any more or less moral. The story of the ‘digital’ economy is one where we can easily become obsessed with non-issues, or diverted by different hysteria.

But there are at least four substantive economic questions about digital preservation which really do need to be teased out: the economic sustainability of services; the economic opportunities in preservation solutions; the role of digital preservation in business; and the nature of obsolescence.

The first theme is well rehearsed in experience and theory, from the collapse of the AHDS to the research agenda crystallised by the Blue Ribbon Task Force on Sustainable Economics for a Digital Planet. These show that the economics of digital preservation are distinctive and peculiar. We are now familiar with the ideas (if not the vocabulary) of the ‘depreciable durable assets’ which require a degree of dynamic and path-dependent action to retain their value. We can see clearly the misalignment of incentives between creators, preservers and consumers, which leads commentators to describe digital preservation as a ‘derived demand’. And there is a familiar truth to the awkward statement that digital assets are ‘non-rivalrous in consumption’. The economics of preservation are further skewed by an inevitable confusion of capital and revenue. Although I can't provide any material evidence to prove it, it seems reasonable to assert that the costs of setting up a digital preservation facility are greater than the costs of operating one. Research and development require large sums of capital in the short term in order to produce services which will require smaller amounts of longer-term funding and which may themselves facilitate new kinds of income generation. Extending this assertion as a comment on organisational maturity, the lowest costs accrue to the most highly developed. So the faster we can all get over this development hump, the lower our costs will be (and that's why you should join the DPC). But capital and revenue are not plentiful. For too many organisations, digital preservation is still an unfunded mandate. The case for prioritised local investment in practical preservation, which many DPC members need to repeat, can be supported by economic analysis, but will not be resolved by it alone.

The second theme is not so well developed and it deserves more attention. If digital preservation is a real problem, then someone is going to make a real fortune - or a real reputation - fixing it. There are too few trusted vendors active in this space, and because preservation is quite a complicated supply chain, it is likely to involve several different vendors working together. But the news that Tessella and the National Archives were presented with the Queen's Award for Innovation and Enterprise is good news for all of us. It shows that innovation and research in preservation can lead to products and services which open new markets. It will certainly encourage Tessella and their investment in the Safety Deposit Box. But perhaps it will encourage others to design alternatives, and perhaps that is more important. It's really pleasing that PASIG - so long Sun's interface with preservation research - has been reinvented recently and is planning a series of meetings next year. I have recently discovered that the Storage Networking Industry Association is actively developing digital preservation training. Moreover, in the last few months three different global IT corporations have asked me to comment on their own proposed architectures for preservation. And that's not counting the numerous open source services and tools coalescing around the Open Planets Foundation. This is all hopeful news. I'm not sure how big the digital preservation market is, or how big it will become, but whatever it is, it's hard to imagine that it won't grow markedly in the next decade.

If it does grow, it will partly be because the need for long-term access has been recognised in industry: my third concern about the relationship between digital preservation and the economy. We have made considerable headway in the public sector, especially among research agencies and memory institutions. But the private sector is less well represented in our various meetings, networks and symposia. Industries that are heavily regulated, industries with long product lifecycles, and industries where maintenance of property rights is essential to revenue are obvious candidates. That ought to encompass fields as diverse as pharmaceuticals, pensions, estate management, design, music, film, shipbuilding, law, aerospace, architecture, nuclear power, mineral prospecting and many more. Together they represent a massive proportion of gross domestic product.

There are lots of possible reasons why we don't see more of them, but it's entirely possible that they hear our message the wrong way round. My limited experience of corporate IT - I was a junior in a credit agency during one particularly inglorious period of my otherwise illustrious career - is of increasingly antiquated systems being compounded and hacked but never properly replaced. Occasionally a new CIO would arrive and a new, super-duper comprehensive solution would be purchased. But the migration was never properly finished and the old system was maintained far longer than was decent, alongside the new one. So instead of simplifying things, these new services had the effect of compounding them.

I have argued before that digital preservation is about confident deletion. If I had realised that back in the day, I would have pointed out that our office didn't need a new system; it needed a better way of ridding itself of the old ones. Having a properly established and well-worked archival facility is about consolidating knowledge into manageable, necessary and durable units. Perhaps, therefore, our message to industry should be that digital preservation is about turning off legacy systems (confidently).

We could also do a better job of describing digital preservation in terms of capacity and opportunity instead of risk and threat. It was really encouraging to read in the recent UK Government consultation on the establishment of a 'Public Data Corporation' that they view public data as a 'powerhouse' that 'has the potential to create significant social and economic value for global economies'. They are clearly excited about the data we look after and they are planning to get some kind of value from it. So why does the digital preservation literature continue to make such gloomy reading?

Of course we should not miss the last and perhaps most obscure connection between digital preservation and the real economy. There's something deep in the economics of capitalism that shapes obsolescence. Most organisations get involved in digital preservation because they suddenly have a crisis of information access. Sorting out a digital mess is expensive and unreliable, though it's likely to remain a very active area. Most organisations quickly spot the issue and try to bring order to the chaos through some sort of management. This means taming elements of the digital universe and keeping them alive in some sort of planned environment. The solution is typically framed as a repository and the tasks undertaken are typically described as ingest, migration and emulation. All this aims to prevent obsolescence and maintain some kind of authenticity and access. Some organisations are even in the happy position of managing this process prior to ingest, and they will tell you that the ideal point to intervene to ensure longevity is at the very point of creation. But if you listen really hard, what you hear is that they want to intervene at some point even before creation.

I used to think that this was the most useful time to intervene in the digital lifecycle, but I'm beginning to think differently. It's because the question of obsolescence is itself a dynamic born out of capitalism. Obsolescence is not a given: it is the consequential loss associated with improvements and innovations in computing and information technologies, which are themselves driven by some kind of corporate objective and mostly by profit. That means a more radical option is available: wouldn't it be better if obsolescence didn't exist at all? What would it be like if files and systems were self-preserving, or operating systems were smart enough to take care of obsolescence for us? Imagine if, instead of being faced with a dialogue box that says 'I can't open your file', the computer said 'I've not seen a file like this since 1994 - would you like to choose migration or emulation to access it?' It's not as easy as it sounds and it would still leave a bundle of issues to address - but I can't help but think that this would be all the digital preservation that most people would ever need.
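
As a thought experiment, the imagined dialogue might be sketched as below. Everything in the sketch is hypothetical: the registry, identifiers and helper names are invented purely for illustration, and in practice the identification step is the sort of thing a format registry like PRONOM and a tool like DROID already provide.

    from datetime import date

    # Hypothetical format registry: identifier -> (label, year the format was
    # last in common use, preservation actions available for it).
    FORMAT_REGISTRY = {
        "fmt/example-001": ("WordPerfect 5.1 document", 1994, ("migration", "emulation")),
    }

    def open_with_preservation_fallback(path, format_id):
        """Open a file if its format is still current; otherwise offer a preservation action."""
        entry = FORMAT_REGISTRY.get(format_id)
        if entry is None:
            print(f"Sorry, I can't open {path}: unrecognised format {format_id}.")
            return
        label, last_seen, actions = entry
        if date.today().year - last_seen < 5:
            print(f"Opening {path} ({label}) as normal.")
            return
        print(f"I've not seen a {label} since {last_seen}.")
        choice = input(f"Would you like {' or '.join(actions)} to access it? ").strip().lower()
        if choice == "migration":
            print(f"Migrating {path} to a current format...")          # a converter would be called here
        elif choice == "emulation":
            print(f"Launching an emulated environment for {path}...")  # an emulator would be called here
        else:
            print("No action taken.")

    open_with_preservation_fallback("report.wp5", "fmt/example-001")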

No empire or economy or idea is too big to fail - it's only a question of when and how. It was true of Antonine and it’s certainly true of modern states or currencies or technologies. The failure of modern systems will provide fascinating lessons to future generations but their ability to trace that decline depends in no small measure on our ability to fix the real economies and release a flow of value for digital preservation which is all too material and all too elusive. It's hard to imagine a world without data and services built on them: they are so invasive and ubiquitous. But they are also fragile.

Digital empires are not so big that they cannot fail.

 


Who's Who: Sixty Second Interview with Janet Delve, KEEP Project, University of Portsmouth

Where do you work and what's your job title?
I am a principal lecturer in the School of Creative Technologies in the Faculty of Creative and Cultural Industries at the University of Portsmouth. I co-manage the FP7 KEEP project (ICT-231954) with David Anderson, am a co-investigator on the JISC POCOS project, and am co-leader of the Future Proof Computing Group. In addition, I am Communications Officer for one of our faculty research centres, the Centre for Cultural and Industrial Technologies Research (CiTech).

Tell us a bit about your organisation
The Future Proof Computing Group has six staff, including David Anderson, Leo Konstantelos, Antonio Ciuffreda, Vaughan Powell, Milena Dobreva and Jing Shi, many of whom will be known to the DPC. We are particularly interested in hybrid solutions that blend and integrate migration and emulation as preservation strategies. A more unusual focus of the group is that we are keen on providing an interface with a diverse range of agencies, such as the computing history community, which has a great deal of untapped experience and resources relevant to digital preservation. This is a particular feature of the group because the University of Portsmouth is one of only a very few institutions worldwide that actively research and teach the history of computing.

What projects are you working on at the moment?
We are working on the POCOS project, which ties in very well with my other main focus just now: the KEEP project – Keeping Emulation Environments Portable. This is a three-year project funded by the European Commission to provide practical emulation services for the long term. The emphasis is very much on providing standalone services. At the heart of the project is the KEEP Virtual Machine, which has several layers, the bottom of which comprises two instructions that will always run future software. The emulation services for this virtual machine comprise an emulation framework from which you can call various emulators and software depending on the nature of the digital object. A technical environment metadata database, TOTEM, provides details of the software, operating systems, hardware, etc. needed to run the emulator for a particular object. TOTEM has been created with a fine level of granularity, together with the ability to link into other RDF ontologies such as Manfred Thaller's XCL at the University of Cologne and those from the SPAR team at the BnF, both of which have links to the OPF. Alternatively, I am looking at how we might use data warehousing as another possible future development of TOTEM. In addition, we are making a media transfer knowledge base and workflow which will link up to the PLANETS interoperability framework. We have also carried out a legal study which underlines the importance of memory institutions running standalone services, as these things cannot readily be transferred across the Internet. There is a real mix of partners in the project – the French, Dutch and German national libraries; the German Computer Games Museum; the European Games Developers Foundation; and commercial partners Tessella plc and Joguin sas – which makes for an exciting and diverse atmosphere in KEEP.
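
To give a flavour of the kind of information involved, the sketch below shows a minimal, hypothetical technical-environment record of the sort TOTEM captures. It is not the actual TOTEM schema; the class and field names are invented for illustration only.

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class TechnicalEnvironment:
        """One rendering environment for a digital object: the software, OS and
        hardware (or emulated hardware) needed to run it."""
        object_format: str
        application: str
        operating_system: str
        hardware_platform: str
        emulator: Optional[str] = None
        notes: List[str] = field(default_factory=list)

    # Example record for one of the KEEP use cases mentioned above: a Commodore 64 game.
    c64_game = TechnicalEnvironment(
        object_format="Commodore 64 disk image (.d64)",
        application="original game binary",
        operating_system="Commodore KERNAL/BASIC ROM",
        hardware_platform="Commodore 64 (or an emulated C64)",
        emulator="a C64 emulator invoked by the emulation framework",
        notes=["joystick input required"],
    )

    print(c64_game)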

How did you end up in digital preservation?
About six years ago, I met Vincent Joguin of Joguin sas, the French technology company that is developing the KEEP Virtual Machine. At the time I was doing Erasmus teaching in Grenoble together with David Anderson, who had looked up ACONIT, a computer history museum in Grenoble. We met Vincent there over pizzas and a rare PDP-9 computer, and then went on to plan projects together, such as a virtual computer history museum. This did not take off, but later Vincent approached us about working on KEEP. After some thought, it was evident that the metadata aspects of the project fitted in well with my data modelling skills. Having worked with the Association for History and Computing for almost two decades, I found the transition from broader humanities computing to digital preservation per se straightforward.

What are the challenges of digital preservation for an organisation such as yours?
At the faculty level, the digital preservation challenges are myriad, and feed very well into the work we are doing in POCOS: international brainstorming to find ways of preserving complex digital objects such as visualisations and simulations, software art and video games. Staff in the faculty actively research and teach all of these areas, as well as several others, so they are highly relevant. For example, we have a strong computer animation group in our School of Creative Technologies, with lecturers working on the Lord of the Rings (LOTR), Harry Potter, etc. The issues involved in preserving computer animations such as LOTR's Gollum are tortuous, being influenced by commercial pressures, the imperative to have the latest software, the need for competitive secrecy and the highly technical nature of the work, which involves particle physics! Likewise, the issue of digital artists working closely with memory institutions to maintain their artworks is salient, as is the necessity to develop emulation services to help preserve video games, which is a main application of KEEP.

What sort of partnerships would you like to develop?
We are keen to work with partners who would like to expand the use cases of TOTEM. We have designed the tool so that it can be used as a standalone or web-based database (and eventually a data warehouse), or mapped onto OWL ontologies, so we can accommodate partners with differing needs. The use cases we have catered for so far are those needing the following computing environments: games consoles, the x86 PC and the Commodore 64. We can develop TOTEM with use cases for digital art, for example, or other areas we have not thought of yet. Another area that I am keen to develop is data warehousing. I have been researching and teaching in this field for 14 years now, and am writing a JISC report (together with colleagues at NANETH and in Historical Geography at Portsmouth) on using data warehousing for digital preservation. Any partners with thoughts in this area would be very welcome.

If we could invent one tool or service that would help you, what would it be?
An instant multi-language idiom translator! As well as being a computer scientist specialising in databases and data warehouses, and a computing historian, I am also a linguist, for my sins! I think it is very important to communicate clearly and plainly with European colleagues whose first language is not English, but this can lead to a lack of subtlety. This is where in English you would turn to a good old idiom or paraphrase, but how can we expect foreign colleagues to cope with such oddities as ‘a kettle of fish’ or ‘a hostage to fortune’? So I would like a special Babel fish for idioms to give me just the right phrase in every language I needed.

And if you could give people one piece of advice about digital preservation ....?
Don’t throw away that old computer in your loft! One thing that we have pointed out in KEEP is that emulation does not obviate the need for old computer hardware, as some have asserted. We still need some extant computer equipment for comparison, to check how authentic our emulation services are, and we still need 8” floppy drives to read old 8” floppies, etc. Apparently some drives are fetching a fortune on eBay.

If you could save for perpetuity just one digital file, what would it be?
A digital copy of a recording of any of the BBC interviews with Alan Turing. His pioneering work in computing is so fascinating and far-reaching, but his actual voice is lost to us: as far as we know, all the recordings from the 1950s were taped over. I would love to hear him speak of his vision for the digital future from his unique perspective 60-odd years ago.

Finally, where can we contact you or find out about your work?
To find out more about the Future Proof Computing Group, visit http://www.port.ac.uk/research/citech/principles/#fpc. To read more about our KEEP project, please visit http://www.keep-project.eu/ezpub2/index.php or visit us on LinkedIn - we welcome any comments you have! The best way to contact me is by email.

 


One World: Italian Legislation on the Preservation of Electronic Records

Mariella Guercio, University of Urbino

Current Italian legislation on electronic records management is more than ten years old and was specifically dedicated to the public administration (Decree 445/2000). An important revision and integration was approved in 2005, with the ambitious aim of defining a complete series of principles and rules in the field of e-government, collected and harmonized as a Code for Digital Administration. For years, attention was dedicated mainly (perhaps only) to the creation and keeping of electronic records as part of the current activity of public agencies. As part of this goal, and specifically in pursuing the dream of a complete digital environment, the legislator specified in 2004 how to digitise analogue papers and create digital surrogates or copies with evidential value. The same rule (a deliberation of the National Centre for Information Technology in the Public Administration – CNIPA) established, for the public and private sectors, how to maintain the bitstream and its evidential value in the medium term. The approved mechanisms were based on digital signatures and time stamps; not enough attention was dedicated to metadata, to the use of standardised formats or to the interoperability of the sets of electronically signed and stored bitstreams, and even less consideration was paid to their accessibility over time. The focus was on storage security and on the capacity to ensure control of the records’ integrity. For years, the recommendations of the National Archives and of international bodies concerning the need to preserve the original aggregations and the juridical and administrative contexts were seriously underestimated.

Luckily, public administrations and the private sector applied the new rules with caution: in digitization projects, for example, the original paper records have not been eliminated, and the information systems able to provide contextual elements for the electronic records have always been preserved by their creators, along with documentation of the procedures in place for managing the records and the related responsibilities (as indicated by the national legislation on ERMS). Awareness of the complexities involved has moved regional agencies like Regione Emilia-Romagna and Regione Toscana to invest in this sector by creating digital archival centres able to offer qualified services to municipalities, hospitals and other public bodies.

The experience gained in practical digital preservation processes has influenced new regulations approved in 2010 and 2011 with the direct involvement of the Regions, the National Archives and the Italian ISO Committee on Records Management (see http://www.digitpa.gov.it/amministrazione_digitale and http://www.digitpa.gov.it/sites/default/files/normativa/Regole%20tecniche%20conservazione%2005%2008%202011.pdf). The new legislation is based on the recognition that the digital environment requires integrated processes and a chain of responsibilities that ensures continuity between the creation of records and their preservation. The legislation (already approved in 2010 as the new Code for Digital Administration, and partially still under discussion as a detailed series of rules) is based on an integrated approach and on explicitly accepted archival principles:

    • the preservation function has to be based on a general and standardised model: transfer to the digital repository (in both the public and the private sector) requires the creation of information packages for submission, for archiving and for dissemination (according to the OAIS model), including all of the metadata and representation information necessary to document the records’ authenticity and to guarantee their accessibility as part of an archival aggregation (a purely illustrative sketch of such a package is given below),
    • each repository has to provide a preservation system that is well documented with reference to the responsibilities involved and to its technical and organizational continuity,
    • the chain of responsibilities involved has to be made explicit, on the basis of a formal agreement, in the case of transfer to a third party,
    • internal responsibility for digital preservation has to be assigned to specialists formally charged with that task,
    • a manual for digital preservation (whose main contents are listed by the regulation) has to be approved by the creator for internal auditing, and by the external repository in the case of outsourcing, and
    • a special procedure is defined for certifying the quality of digital preservation and for accrediting public and private digital repositories.

This new legislation will bring significant changes for digital preservation in Italy and is expected to be approved by the end of this year.
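
To make the packaging requirement in the first point concrete, here is a minimal, purely illustrative sketch of an OAIS-style submission information package. The layout and field names are invented for this example and are not prescribed by the regulation.

    import json

    # Invented field names; not a schema prescribed by the Italian rules.
    submission_package = {
        "content_information": {
            "data_objects": ["protocol_register_2011.xml", "attachment_001.pdf"],
            "representation_information": {
                "formats": ["XML 1.0", "PDF/A-1b"],
                "schemas": ["register-schema-v2.xsd"],
            },
        },
        "preservation_description_information": {
            "provenance": "Municipal records office; transferred to the repository 2011-11-02",
            "fixity": {
                "algorithm": "SHA-256",
                "protocol_register_2011.xml": "<digest>",
                "attachment_001.pdf": "<digest>",
            },
            "context": "Part of the 2011 protocol register archival aggregation",
            "reference": "Persistent identifier assigned by the repository",
        },
        "packaging_information": "e.g. a METS wrapper binding the objects and metadata",
    }

    print(json.dumps(submission_package, indent=2))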

 

