Recently, at the iPRES 2014 conference in Melbourne, I gave a presentation on the SCAPE Preservation Policies. Not only did I explain the SCAPE Preservation Policy Model, but I also summarized my findings after analysing 40 real-life preservation policies. You can read the detailed information in my article (to be published soon).
As part of the SCAPE project, we ran a large-scale experiment and evaluation of audio migration, using the xcorrSound tool waveform-compare for content comparison in the quality assurance step.
I presented the results at the demonstration day at the State and University Library; see the SCAPE Demo Day at Statsbiblioteket blog post by Jette G. Junge.
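The idea behind this kind of content comparison is to cross-correlate the waveforms of the original and the migrated audio: if the peak of the normalised cross-correlation is close to 1, the two files carry the same sound content even if they differ at the byte level. This is not the actual waveform-compare implementation, just a minimal NumPy sketch of the underlying technique, with a synthetic signal standing in for real audio:

```python
import numpy as np

def waveform_similarity(a, b):
    """Peak normalised cross-correlation between two mono waveforms,
    plus the sample offset at which the peak occurs. A value near 1.0
    suggests the recordings carry the same content; a low value flags
    the pair for manual inspection."""
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    corr = np.correlate(a, b, mode="full")
    peak = int(np.argmax(corr))
    offset = peak - (len(b) - 1)  # lag of b relative to a, in samples
    return float(corr[peak]), offset

# Synthetic stand-in for an original and its migrated copy:
# identical content, shifted by 5 samples of leading silence.
rng = np.random.default_rng(0)
original = rng.standard_normal(1000)
migrated = np.concatenate([np.zeros(5), original])[:1000]
score, offset = waveform_similarity(original, migrated)
```

A small leading offset like this is common after migration (encoder padding), which is why the comparison reports the lag alongside the similarity score instead of comparing samples position by position.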
OK, I know what you're thinking: do we really need another PRONOM-based file format identification tool?
There is a trend in digital preservation circles to question the need for migration.
Some time ago Will Palmer, Peter May and Peter Cliff of the British Library published a really interesting paper that investigated three different JPEG 2000 codecs, and their effects on image quality in response to lossy compression. Most remarkably, their analysis revealed differences not only in the way these codecs encode (compress) an image, but also in the decoding phase. In other words: reading the same lossy JP2 produced different results depending on which implementation was used to decode it.
A limitation of the paper's methodology is that it obscures the individual effects of the encoding and decoding components, since both are lumped together in the analysis. As a result, it is not clear how much of the observed degradation in image quality is caused by the compression, and how much by the decoding. This made me wonder how similar the decoded results of different codecs really are.
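One straightforward way to answer that question is to decode the same JP2 with each codec and compare the resulting pixel arrays directly, for instance with the peak signal-to-noise ratio (PSNR): identical decodes give infinite PSNR, while small decoder differences show up as a finite (usually high) value. The sketch below assumes the decoded images are already available as NumPy arrays; the two small synthetic arrays are hypothetical stand-ins for real decoder output:

```python
import numpy as np

def psnr(img_a, img_b, max_value=255.0):
    """Peak signal-to-noise ratio between two decoded images.
    Returns infinity for bit-identical decodes."""
    diff = img_a.astype(np.float64) - img_b.astype(np.float64)
    mse = np.mean(diff ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(max_value ** 2 / mse)

# Stand-ins for the same JP2 decoded by two codecs: identical
# except for off-by-one rounding in a handful of pixels.
rng = np.random.default_rng(1)
decode_a = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
decode_b = decode_a.copy()
decode_b[0, :8] = np.clip(decode_b[0, :8].astype(int) + 1,
                          0, 255).astype(np.uint8)
```

PSNR only quantifies how large the pixel differences are, not where they sit or how visible they are, so in practice one would pair it with a perceptual metric before drawing conclusions about decoder quality.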