Papers and Presentations
This presentation, given by Amy Rudersdorf at the 2016 American Library Association’s Preservation Administrators Interest Group meeting, provides a high-level discussion of the use of standards for digital preservation and repository management and assessment. Particular focus is given to ISO 16363: Audit and Certification of Trustworthy Digital Repositories and its usefulness beyond auditing as an assessment tool for identifying both gaps and strengths in digital repository practice.
In this presentation, Bertram Lyons demonstrates a methodology for employing the ISO 16363 standard for Audit and Certification of Trustworthy Repositories as a tool that can be used to help an organization plan for continued improvement of digital preservation services.
This report presents the findings of a study conducted by Bertram Lyons and Kara Van Malssen of AVPreserve, on behalf of the Library of Congress, to evaluate the existing state of technical, structural, and preservation metadata for audiovisual resources in the bibliographic environment in light of existing standards for audiovisual metadata, and to make recommendations about how BIBFRAME can support the expression of such information. This study follows on our May 2014 report titled, “BIBFRAME AV Modeling Study: Defining a Flexible Model for Description of Audiovisual Resources,” also commissioned by the Library of Congress, which explored and provided high-level recommendations on a flexible data model for audiovisual resources.
Led by Kara Van Malssen, AVPreserve completed this report, commissioned by the BIBFRAME team within the Network Development and Standards Office at the Library of Congress, to evaluate the content description needs of the moving image and recorded sound communities and to specify how those requirements can be met within a semantic bibliographic data model designed generically to support all content types found in libraries.
AVPreserve has developed or contributed to the development of several tools for the inventory, assessment, and preservation prioritization of physical audiovisual materials, ranging in approach from collection-level or format-level analysis down to item-level cataloging and selection. Primary among these are three tools: MediaScore and MediaRivers from Indiana University, our Catalyst inventory tool, and the AVCC inventory and planning tool. The approach one takes in such preservation efforts and the tools one might use depend on the scope of collections, the budgets and staffing available, and the end goals of the project. This spreadsheet presents a comparative analysis of the tools available on our site to help you determine which one might be right for your collection. Visit our tools page for access to the applications themselves.
The ARSC Guide to Audio Preservation is a practical introduction to caring for and preserving audio collections. It is aimed at individuals and institutions that have recorded sound collections but lack the expertise in one or more areas to preserve them. Among the many expert authors of the Guide, AVPreserve President Chris Lacinak contributed Chapter 7, “What to do after digitization”, and Senior Consultant Kara Van Malssen contributed Chapter 9, “Disaster prevention, preparedness, and response”.
The ARSC Guide to Audio Preservation was commissioned for and sponsored by the National Recording Preservation Board of the Library of Congress, and was copublished by the Association for Recorded Sound Collections (ARSC), the Council on Library and Information Resources (CLIR), and The Library of Congress. More information can be found on the CLIR website.
Whether outsourcing or digitizing in-house, collection managers need to be able to define the parameters and specifications for preservation reformatting in order to properly care for their assets and to control and understand the outcomes of the digitization process. In association with the ARSC Guide to Audio Preservation, AVPreserve is releasing this Guide to RFPs for the Digitization of Audio, along with recommendations for technical and preservation metadata to collect during the process and a sample spreadsheet to obtain estimated pricing from digitization vendors. Every digitization project and every organization’s requirements are different; this guide is a starting point for creating an RFP specific to those needs.
- Guide to Developing a Request for Proposal for the Digitization of Audio (PDF)
- Appendix 1: Suggested Metadata (XLSX)
- Appendix 2: Pricing (XLSX)
In 2014, AVPreserve and the Northeast Document Conservation Center (NEDCC), with funding from The Andrew W. Mellon Foundation, undertook an in-depth, multi-faceted assessment to quantify the existing audio items held in institutional collections throughout the United States. This was performed in response to The Library of Congress National Recording Preservation Plan and its call for the appraisal of collections, as well as to establish a foundation for articulating the current preservation need of sound recordings in collections nationwide. Our goal was to acquire enough trustworthy data to be able to answer questions such as “How many sound recordings exist in broadcast organizations across the US?” or “How many sound recordings exist in archives throughout the US?” Moreover, we wanted to answer more complex questions such as “How many of such items are preservation-worthy?” or “How many have already been digitized?” Prioritization for digitization is as critical as both funding and timeliness. The foundation for action on all three of these fronts is trustworthy quantitative data. This paper aims to provide such data along with supporting information about the methodologies used in its generation.
Media preservation has reached a crisis point for content carried on physical audio and video formats as the world has transitioned to the digital age. Archival media collections could soon be considered highly endangered. It might help to invoke the power of narrative to aid in understanding the critical issues facing media archives and to spark us to imagine solutions to seemingly intractable problems. Outside of the fairytale, in our own world, the signs are ominous. This is a PDF version of an article that originally appeared in the International Association of Sound & Audiovisual Archives Journal No. 44, January 2015. Published here with permission from Mike Casey, the Director of Technical Operations for the Media Digitization and Preservation Initiative at Indiana University.
In this tutorial, we explore how to understand and apply features of the OpenRefine (formerly Google Refine) tool in an archival context. OpenRefine can enable organizations to clean up, merge, and manipulate their metadata so that the information can be better integrated into workflows and across systems. OpenRefine is “a free, open source power tool for working with messy data” that libraries, museums, archives, and other organizations can employ to analyze, normalize, and clean up datasets through its simple yet powerful features.
In this tutorial, spreadsheets are positioned as a mechanism to help practitioners manage metadata more accurately, efficiently, and effectively. Limited resources create a reality where spreadsheets are an interim alternative to preferred metadata management tools like databases and database applications, and become the default for capturing, editing, storing, and reporting information. There are many opportunities to leverage sophisticated features in spreadsheet applications that allow you to work faster, smarter, and with greater accuracy towards a more robust metadata management system. The tutorial features instruction using Microsoft Excel. Self-directed exercises and practice worksheets are linked below the video.
On Thursday, April 16, 2015, Kathryn Gronsbell spoke at the New York Foundation for the Arts (NYFA) in Brooklyn. The event, “PIXELS, LINES, AND BITS: An A/V Preservation Primer for Artists”, was a discussion of personal archiving and preservation approaches for artists interested in stabilizing their work so that it can be available in the short- and long-term. The presentation and Q+A session were an introduction to concepts like preserving and managing media, how you can leverage your time and money to make more sustainable decisions, and what the benefits might be now and in the future. Thank you to NYFA and Independent Media Arts Preservation for helping organize this event. See NYFA’s Highlight Reel from the discussion.
Kara Van Malssen‘s presentation from the Take Control of Your Records! conference at the National Audiovisual Institute in Warsaw, Poland offers 10 steps an organization can take to help ensure successful implementation of a media/digital asset management system.
This presentation from Seth Anderson covers recent efforts at AVPreserve to reframe the often information-heavy results of ISO 16363 audits into straightforward data points based on scoring criteria with actionable recommendations for achieving compliance. The presentation includes examples of different applications of the standard as a means of assessing developing digital preservation infrastructure and planning for completely new policies and systems. Additionally, extensive work with the standard has revealed inconsistencies and repetitive elements that cause confusion and difficulty in interpreting and applying the requirements of a trustworthy digital repository. Seth posits an altered hierarchy to address these issues in future versions of the standard, an approach that looks to such documents not as a static, inflexible set of guidelines, but pragmatically as a framework to apply and continually refine as results and technologies change, much like digital preservation itself!
A guest opinion piece by Chris Lacinak featured in Post Magazine on the importance of using preservation-oriented workflows in a production environment. Establishing reliable preservation and archival practice makes sound business sense, promoting efficient and cost-effective workflows, providing findability, and supplying the wherewithal to support premium repurposing projects.
AVPreserve President Chris Lacinak was invited to give a keynote presentation at the 2014 Fédération Internationale des Archives de Télévision / The International Federation of Television Archives (FIAT/IFTA) World Conference on the topic of our Cost of Inaction Calculator. The COI Calculator is a planning tool that provides estimated budgets and schedules over the long term so that one can begin to develop a preservation plan beyond the immediate near term. By looking at the costs of physical storage and management, digitization, and digital storage, an institution can think about distributing costs over time while also considering the critical need for sustainability of preservation activities (and their associated costs) beyond short-term fundraising or grants.
With the increasing ingest of born-digital and digitized collections, we are at the point (perhaps well past the point) of admitting that almost all archives are digital archives, and as a profession we must identify and gain training on the tools that will help us describe, store, and manage file-based collections in the same ways we do with physical collections. The Command Line Interface (CLI) is a critical tool here, both for managing files directly from ingest to storage and for accessing certain applications that can only be run from a CLI. These introductions to the CLI (for both Mac and Windows) provide a basic understanding of managing directories and files with the command line, skills which can be expanded from managing individual files to ingesting and caring for large sets of files in batches in order to save time and address the realities of file-based acquisition.
An initial offering of five of our white papers translated into Spanish. We hope to translate more resources soon and share them with our colleagues.
- Nueve factores a considerar al evaluar el almacenamiento en la nube
- Introducción a la preservación de medios ópticos
- Introducción a los códecs de archivos sonoros y audiovisuales
- Metodología Más producto, menos proceso para el procesamiento de colecciones audiovisuales
- La recuperación de la colección multimedia después de la supertormenta Sandy
In 2011 The Netherlands Institute for Sound and Vision, in collaboration with the audiovisual heritage network AVA_net published a collection of essays on the topic of digital preservation entitled Making Invisible Assets: The Preservation of Digital AV Collections. The book is available for only the cost of shipping from Sound and Vision. AVPS Senior Consultant Kara Van Malssen was one of the international professionals commissioned to write an essay for the collection. Her article, “Planning Beyond Digitization”, is available here in PDF.
At the 2014 annual meeting of the Society of American Archivists in Washington, DC, AVPreserve hosted our 1st Annual AV Archives Night party at the legendary Black Cat music venue. Soliciting submissions of audio and moving image content from archives in DC, Maryland, and Virginia, we put together a program celebrating the audiovisual heritage of the region as well as the important preservation work that archives are doing. Featuring content as varied as veterans’ oral histories, university radio station ID tags from John Lennon, government experiments with LSD, and more, the night was a great success. We look forward to continuing the tradition at future SAA conferences to highlight archival collections in regions across the country.
Archiving and preservation consist of technology, people, and policies. For technology in particular, digital AV archives are largely indebted and beholden to a few sizable industries: cinema, broadcast, and information technology. Commercial interests catering to the aforementioned industries have produced a seemingly attractive toolset that has the potential to provide archives with the ability to apply their policies in service of preservation-oriented workflows. Yet, even in the hands of larger well-resourced organisations, employing these tools can be challenging and resource intensive. How can smaller, resource-constrained AV archives efficiently apply cost-effective tools and technologies to their workflows? This article by Kara Van Malssen was originally published in AV Insider, Issue 2, September 2012 https://www.prestocentre.org/library/resources/av-insider-2-preservation-times-precarity.
There are a variety of algorithms that can be used for generating checksums, with two in particular – MD5 and SHA-256 – being the most common. The comparative benefits and drawbacks of both are well-understood: while MD5 is weaker against random and deliberate collisions, it is faster to generate than SHA-256. However, there are no published empirical estimates for the difference in time-to-generate between MD5 and SHA-256 in archival and repository environments, leading to difficulty in making an informed decision as to which algorithm to implement for preservation monitoring. This white paper documents a comparative checksum test of the same files under the same conditions, leading to some surprising findings about the actual processing speeds of the two algorithms.
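The tradeoff described above is easy to explore locally. The sketch below is a minimal timing harness using Python’s standard `hashlib` module; it is not the white paper’s actual test rig, and real results will depend heavily on hardware, file sizes, and I/O conditions.

```python
import hashlib
import time

def checksum(path, algorithm="md5", chunk_size=1024 * 1024):
    """Stream a file through the named hash algorithm in fixed-size chunks."""
    digest = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def compare_algorithms(path):
    """Time MD5 against SHA-256 for the same file under the same conditions."""
    results = {}
    for algorithm in ("md5", "sha256"):
        start = time.perf_counter()
        value = checksum(path, algorithm)
        results[algorithm] = {"seconds": time.perf_counter() - start,
                              "checksum": value}
    return results
```

Running `compare_algorithms` over a representative sample of preservation masters gives a rough local answer to the question the paper investigates systematically.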
The basis of Bertram Lyons’ panel presentation at Digital Preservation 2014. To date, the difficulty and high bar of conducting an internal assessment as a Trusted Digital Repository have hindered organizations’ ability to track or rank their progress toward digital preservation standards. AVPreserve has been working on means of adapting TDR risk assessment by improving reporting options and analyses. Two assessment tools currently in use for digital preservation risk assessment are the NDSA’s Levels of Digital Preservation Matrix (Version 1) and ISO 16363:2012 Audit and Certification of Trustworthy Digital Repositories. The two tools offer overlapping yet distinct methods of analysis; both are very useful, but they produce differing reporting classifications and outcomes that are not easy to reconcile. In order to encourage the use of the two tools under one roof, and, especially, to increase the outputs of a standard ISO 16363 assessment, AVPreserve staff have mapped the Levels of Digital Preservation categories to the ISO 16363 requirements. A full paper on the topic will be available here soon. Linked below is our work documenting the mapping of NDSA Levels of Digital Preservation categories to ISO 16363 criteria, along with the DigPres14 slide deck. We offer these as an opportunity for community discourse and involvement. Please evaluate our mappings and let us know what you think to help us work towards a shared mapping that others can employ in a standardized way.
- ISO Requirement Mapped to NDSA Categories (XLSX)
- Mapping Standards for Richer Assessment DigPres14 Slidedeck (PDF)
Physical audiovisual media collections are at risk for extreme levels of loss if action is not taken to preserve them in the next 10-15 years. Most archives are well aware of this critical issue, but are unable to move forward with preservation projects because it is difficult to quantify the intellectual impact and cost impact of action or inaction in order to advocate and secure budgets. Our new Cost of Inaction Calculator provides graphics and metrics that compare resource expenditures, digitization and storage costs, and the rate of loss of physical media to help provide an approach to planning and advocating for preservation. This paper presents a sample case study showing how the COI model and Calculator can be used to support preservation efforts. This is a PDF version of an article that originally appeared in the International Association of Sound & Audiovisual Archives Journal No. 43, July 2014.
The latest technical brief from Digital & Metadata Preservation Specialist Alex Duryee explores the use of inodes in the functionality of Fixity, our free digital preservation file monitoring tool. Fixity offers the unique capability of tracking file attendance as well as file integrity. In this free download, learn how we used filesystem structure to achieve that and how tracking files through their inode makes for a more powerful, more flexible monitoring approach.
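To make the idea concrete, here is a hedged sketch (not Fixity’s actual implementation) of how an inode number from `os.stat` can act as a stable identifier on POSIX filesystems, letting a monitor tell a renamed file apart from a deletion plus a new arrival:

```python
import os

def snapshot(directory):
    """Map inode number -> (path, size) for every file under a directory."""
    inodes = {}
    for root, _, files in os.walk(directory):
        for name in files:
            path = os.path.join(root, name)
            st = os.stat(path)
            inodes[st.st_ino] = (path, st.st_size)
    return inodes

def diff_snapshots(before, after):
    """Classify changes between two snapshots using inodes as stable IDs."""
    report = {"renamed": [], "new": [], "missing": []}
    for ino, (path, _) in after.items():
        if ino in before:
            old_path = before[ino][0]
            if old_path != path:
                report["renamed"].append((old_path, path))
        else:
            report["new"].append(path)
    for ino, (path, _) in before.items():
        if ino not in after:
            report["missing"].append(path)
    return report
```

Because the inode survives a rename while the path does not, a report built this way can flag a moved file as renamed rather than as one missing file plus one new file.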
In today’s world of digital information, previously disparate archival practices are converging around the need to manage collections at the item level. Media collections require a curatorial approach that demands archivists know certain information about every single object in their care for purposes of provenance, quality control, and appraisal. This is a daunting task for archives, as it asks that they retool or redesign migration and accession workflows. It is exactly in gaps such as these that practical technologies become ever useful. This article offers case studies regarding two freely available, open-source digital asset metadata tools, BWF MetaEdit and MDQC. The case studies offer on-the-ground examples of how four institutions recognized a need for metadata creation and validation, and how they employed these new tools in their production and accessioning workflows. By Alex Duryee and Bertram Lyons. This article originally appeared in the Practical Technology for Archives Journal, Issue 2, June 2014, http://practicaltechnologyforarchives.org/. Link to PDF.
As the archival horizon moves forward, optical media will become increasingly significant and prevalent in collections. This paper sets out to provide a broad overview of optical media in the context of archival migration. Author Alex Duryee begins by introducing the logical structure of compact discs, providing the context and language necessary to discuss the medium. The article then explores the most common data formats for optical media: Compact Disc Digital Audio, ISO 9660, the Joliet and HFS extensions, and the Universal Disk Format (with an eye towards DVD-Video). Each format is viewed in the context of preservation needs and what archivists need to be aware of when handling said formats. Following is a discussion of preservation workflows and concerns for successfully migrating data away from optical media, as well as directions for future research. This is a PDF version of an article that originally appeared in the online Code4Lib Journal, Issue 24, 2014-04-16, ISSN 1940-5758.
Embedded metadata is a key component of managing digital files, providing information on correct presentation, source of the file, rights, and other information which supports findability, access, authentication, preservation, and more. This paper discusses the concept and uses of embedded metadata in general, and then looks more specifically at its use in WAVE audio files, focusing on the efforts of the Federal Agencies Digitization Guidelines Initiative (FADGI) to develop recommendations on embedding metadata in audio files created by government agencies. This project resulted in the development of BWF MetaEdit, a tool which allows users to view, edit, and create embedded metadata in WAVE files.
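As a point of orientation, embedded metadata in a WAVE file lives in named chunks inside the RIFF container. The following sketch (illustrative only, unrelated to BWF MetaEdit’s implementation) lists the top-level chunks of a WAVE file; a Broadcast Wave `bext` chunk, when present, sits alongside the standard `fmt ` and `data` chunks:

```python
import struct

def list_riff_chunks(path):
    """Return (chunk_id, size) pairs for the top-level chunks of a RIFF/WAVE file."""
    chunks = []
    with open(path, "rb") as f:
        riff, _, form_type = struct.unpack("<4sI4s", f.read(12))
        if riff != b"RIFF" or form_type != b"WAVE":
            raise ValueError("not a RIFF/WAVE file")
        while True:
            header = f.read(8)
            if len(header) < 8:
                break
            chunk_id, size = struct.unpack("<4sI", header)
            chunks.append((chunk_id.decode("ascii"), size))
            f.seek(size + (size % 2), 1)  # chunk payloads are word-aligned
    return chunks
```

A tool like BWF MetaEdit operates on exactly this structure, reading and writing the metadata-bearing chunks while leaving the audio `data` chunk untouched.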
Part of our Feet on The Ground: A Practical Approach to The Cloud series, these profiles break down the offerings of third party cloud storage providers from a preservation point of view. Assessment points include Data Management, Reporting/Metadata, Redundancy, Accessibility, Security, End of Service, and adherence to the NDSA’s Levels of Preservation. Current offerings include Chronopolis, EVault, Amazon Glacier, Dternity (formerly Permivault), DuraCloud, Preservica, Rackspace Cloud Files, and Amazon S3. Profiles will continue to be added and will be updated as the market and services change.
- Key to Reading Profiles (PDF)
- Profile 1: Chronopolis (PDF)
- Profile 2: EVault (PDF)
- Profile 3: Glacier (PDF)
- Profile 4: Dternity (PDF)
- Profile 5: DuraCloud (PDF)
- Profile 6: Preservica (PDF)
- Profile 7: Rackspace Cloud Files (PDF)
- Profile 8: Amazon S3 (PDF)
- NDSA Comparison Grid (PDF)
What happens to a collection when its sole caretaker suddenly goes away? This case study examines such a situation and how AVPreserve’s Catalyst inventory solution was used to document an audio collection in support of preservation planning. Download the first in a series of case studies about practical, outcomes-based approaches to audiovisual collection appraisal and processing.
When evaluating cloud storage providers, it is dangerous to assume that such services are only storage and therefore uncomplicated, or that requirements for storage are obvious and therefore inherently met by the service provider. Experience with any technology selection will prove the opposite. No two services are the same, and the variance between services often represents the difference between successful implementation and a failed initiative. Never purchase a service without proper vetting; uninformed decisions risk loss of time, money, and even assets. These nine assessment criteria will help you get started in asking the right questions and making a practical, informed decision on using cloud storage for archival or preservation needs. PDF Link
This four-part series of video tutorials, created by Kathryn Gronsbell, focuses on Exiftool, a command-line application that can read, write, and edit embedded metadata in files. The tutorial series provides detailed support to users looking for an approachable and practical introduction to Exiftool. Featured exercises have wide-ranging applications, but trend towards improving digital preservation workflows through step-by-step exploration of Exiftool’s basic features and functions.
AVPreserve was involved in a number of panels and events at the 2013 Association of Moving Image Archivists Conference in Richmond, Virginia, including organizing the first ever AMIA HackDay and presentations related to the imminent decay of magnetic media, the importance of metadata development for digital preservation, and the intricacies of vendor selection for digital asset management systems.
- Chris Lacinak "The End of Analog Media: The Cost of Inaction and What You Can Do About It" (PDF)
- Mike Casey, Indiana University, "Why Media Preservation Can't Wait: The Gathering Storm"
- AMIA & DLF HackDay 2013 Projects
- Seth Anderson, "Navigating the Digital Archive: First, Know Thyself"
- Seth Anderson, "Mastering Your Data: Tools for Metadata Management in AV Archives"
- Kara Van Malssen "From Zero to DAM" (PDF)
The Mid-Atlantic Region Archives Conference epitomizes the importance and reach of regional professional organizations, opening educational and networking opportunities to broader audiences that cannot regularly attend national conventions. AVPreserve was a proud first-time participant on two panels this year, on the topics of preserving complex digital artworks and the refinement of archival data using tools such as OpenRefine. We look forward to presenting at future meetings.
AVPreserve President Chris Lacinak is a featured speaker in this Chesapeake Systems Podcast discussing video metadata workflows and two powerful new search applications, Nexidia and NerVve. Nexidia is a phonetic search application that can scan a library of literally thousands of hours of footage in seconds. NerVve is an application that works similarly with video imagery search. Both applications present easy-to-use interfaces, and the results typically blow away first-time users. Joined by representatives from each company, the panelists address what role these applications play in advanced video workflows and whether they threaten or complement metadata-driven media asset management systems.
Clear articulation and understanding of goals and specifications is essential to ensuring the success of any project. Whether performing digitization work in-house or using a vendor, a statement of work or request for proposal serves as the foundation of the project. This resource is intended to guide organizations in thinking critically about and discussing – internally and with vendors – the salient aspects of a request for proposal and the details within. Although this guide uses video as a focus point it is relevant and applicable for all media types.
When “Superstorm” Sandy swept through the New York City region it left unforeseen levels of flooding and damage in its wake in areas such as Red Hook, The Rockaways, and the Chelsea Gallery District. Though prepared for anticipated levels of flooding, Eyebeam Art+Technology Center ended up with three feet of water on the ground floor of its space. Amongst the damage was the majority of Eyebeam’s media archive: 15 years of videotape and computer disks containing artworks, documentation of events, and even server backups—essentially, Eyebeam’s entire legacy. This case study shares Eyebeam’s experience responding to the disaster in the hope that it will be of benefit as organizations consider preparing for future events. It is a reminder to archives, caretakers, curators, stewards, and others responsible for the preservation of content that our work on disaster preparedness is not, and never will be, done.
An online introduction to the concepts and application of Dolby Noise Reduction. Misapplication of noise reduction can have a highly deleterious effect on the quality and integrity of audio recordings, thus an understanding of the system and use of the correct Dolby settings during playback and reformatting is extremely important to preservation. Includes audio examples illustrating the differences.
Protecting the Personal Narrative: An Assessment of Archival Practice’s Place in Personal Digital Archiving
The archival community struggles to find its place in the private process of personal digital archiving. A common recommendation is to begin preservation far upstream, introducing archival practices early into the act of personal collection. But what might the archive’s best intentions introduce into that act? Entering too early into the process may place undue influence on the decisions of the collector: what gets kept, and why? Active preservation of digital personal archives is necessary for ensuring the longevity of materials, but the archival community must be aware that this may alter the personal narratives that personal archives represent. Seth Anderson’s presentation from the Personal Digital Archiving 2013 conference.
Creating item level records for archival media collections is seen as a high cost investment, but it may help save costs and efforts in the long run, especially in the event of a major loss due to disaster.
What’s Your Product? Assessing the suitability of a More Product, Less Process methodology for processing audiovisual collections
The widely referenced and adopted More Product, Less Process methodology (MPLP) represents a much needed evolution in the manner of processing archival collections in order to overcome backlogs and resource shortfalls that institutions face. In the case of audiovisual-based collections, however, the ability to plan budgets, timelines, equipment needs, and other preservation plans that unequivocally impact access is directly tied to documenting some degree of item-level knowledge about one’s collection. This paper proposes an extension of the MPLP model needed to properly address the particular needs of audiovisual and other complex media in a way that meets archival standards and assists the archivist in generating their true product: the provision of the three basic services of Findability, Access, and Sustainability regardless of the format, the content, or the tools used.
The human desire to classify and name is highly personal, and naming is a greatly prized act. Naming the files we create is no different, though the sheer number of files and the tools used to manage them place a great need on consistent structure and application of file naming guidelines. What to do is very simple: be consistent. More to the point is what not to do in order to avoid pitfalls.
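Consistency can even be checked mechanically. The naming convention below (collection ID, four-digit item number, role, extension) is purely hypothetical; the point is that any documented convention can be validated in batch:

```python
import re

# A hypothetical convention: collectionID_itemNumber_role.extension,
# e.g. "wcfm_0042_pres.wav". Adjust the pattern to your own guidelines.
FILENAME_PATTERN = re.compile(r"^[a-z0-9]+_\d{4}_(pres|mezz|access)\.[a-z0-9]+$")

def check_filenames(names):
    """Split a batch of filenames into conforming and non-conforming lists."""
    ok, bad = [], []
    for name in names:
        (ok if FILENAME_PATTERN.match(name) else bad).append(name)
    return ok, bad
```

Run against a directory listing, the non-conforming list becomes a work queue for cleanup before files enter a repository.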
This presentation addresses the typical questions that arise from embedded metadata implementers regarding the role, technicalities and value of the TimeReference field in the bext chunk of BWF files. This mostly visual presentation is a practical primer for everyone from engineers to archivists and librarians.
The Federal Agencies Digitization Guidelines Initiative is a governmental interagency activity that draws participants from the Library of Congress, the National Archives and Records Administration, the Smithsonian Institution, the National Libraries of Medicine and Agriculture, Voice of America, and several other interested agencies. The initiative is divided into two parts: the Still Image Working Group and the Audio-Visual Working Group. Chris Lacinak has drafted the initial report on the Audio-Visual Working Group’s efforts to evaluate audio digitization systems and develop performance metrics in order to set guidelines and evaluative measurements for conducting and monitoring digitization activities.
The ease of using cassette-based media — pop it in and press play — and the development of compact, no-frills consumer electronics helped make audiovisual materials more accessible to a wider population, but it has also had the side effect of distancing users from the processes involved in recording and playback that were more apparent with open reel media and higher-end decks. This is less of an issue with commercially recorded tape where standards are more regulated, but when dealing with field recordings, oral histories, and other original material, the configurations and settings of the recording device and playback device can have a major impact on audio or visual quality if unaccounted for.
In the first of a series exploring all of those knobs, switches, and buttons you see on decks, Audrey Young and our own Peter Oleksik have written a brief primer on azimuth and why it matters for archivists, researchers, and others who listen to or work with magnetic audio recordings.
Metadata is an integral component of digital preservation and an essential part of a digital object. Files without appropriate metadata lack the basic means required for computing systems and humans to understand, interpret, or manage them. Effectively, there is no preservation or meaningful access without metadata.
This presentation by Chris Lacinak covers the why, what, and how of embedded metadata, focusing on WAVE audio files. It also reviews initial findings from an ARSC Technical Committee study, spearheaded by Chris, analyzing the interchange and persistence of embedded metadata across audio software applications regularly used to create audio files in production and archival settings. Finally, Chris walks through BWF MetaEdit, a groundbreaking free and open-source tool commissioned by the Federal Agencies Digitization Guidelines Initiative and developed by AudioVisual Preservation Solutions in 2010.
AVPS moderated or presented on a number of panels at the 2010 International Association of Sound & Audiovisual Archives / Association of Moving Image Archivists conference in Philadelphia, PA. The topics covered a broad range, including embedded metadata, new tools and strategies for digital media preservation, and new approaches to funding and advocacy for collection management.
by Dave Rice and Stefan Elnabli – October 28, 2010
Because both digital and analog carriers are susceptible to degradation and obsolescence, data must be periodically moved from one carrier to another within a preservation process. When analog data is migrated from its original carrier to a new digital carrier, the analog data is ultimately transformed through the process of sampling, which poses challenges to authenticating the accuracy of the migration. However perceptually exact a digital copy may be to its analog source, the analog data and the digital data are never exactly the same. In the realm of file-based digital-to-digital migration, by contrast, exactness can be achieved and evaluated. Within an entirely file-based environment, checksums and data comparison tools can verify that two copies are exact matches, or reveal their deviation, in a way that is not feasible between analog and digital environments.
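The fixity check described here is simple to implement. As a hedged sketch (not any specific AVPreserve tool), two file-based copies can be verified bit-for-bit by streaming each through a cryptographic hash:

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file through SHA-256 in 1 MiB chunks so large
    audiovisual files are never loaded into memory at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            h.update(block)
    return h.hexdigest()

def verify_migration(source, copy):
    """True only if source and copy are bit-for-bit identical."""
    return sha256_of(source) == sha256_of(copy)
```

A single flipped bit anywhere in the copy changes the digest, which is precisely the kind of guarantee that has no analog-domain equivalent.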
10 Recommendations for Codec Selection and Management
The increasing number of digital objects under our guardianship as archivists will require a greater convergence between IT and archival knowledge sets in order to develop effective preservation strategies. One area of great concern for the integrity and persistence of digital audio and video files is the selection of file formats and codecs, though this is also an area marked by a great lack of certainty and clarity.
This paper by Chris Lacinak lays out a clear explanation of what codecs are, how they are used, and what their selection and application means to archives. Also provided are 10 recommendations that will help you in the selection and management of codecs in an archival setting.
It is widely understood that the special challenges of conserving film, video, computer-based, and interactive art demand collaborative efforts—shared responsibility among a wide array of disciplines. Over the past decade, best practices and shared principles about the care of this art have been developed: emulation, migration, variability. But how do these practices actually work in the real world? Co-sponsored by the Hirshhorn Museum and the Lunder Conservation Center, Smithsonian Institution, this colloquium brings together conservators, artists, curators, exhibition designers, and audiovisual specialists in a series of case studies about collaboration, designed to provoke debate about how we have cared for these works thus far.
At the colloquium Chris Lacinak addressed the topic Managing Born-Digital Time-Based Media, covering issues from file format and codec selection to preservation workflows and file management/storage. The link below is to the video of Chris’ talk taken from Part 1 of the livestream of the colloquium.
AVPS is leading parallel projects within the Federal Agencies Digitization Guidelines Initiative and the Audio Engineering Society on the development of new standards and tools for performance testing of digital audio systems. As part of this work, AVPS is proposing a comparative analysis tool that departs from existing error detection tools and is especially well suited to identifying a class of error labeled here as interstitial errors. This paper by Chris Lacinak examines this type of error and discusses the theory behind the comparative analysis methodology and the approach to developing new tools for test and measurement.
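As an illustrative sketch only (not the proposed tool, and with hypothetical names throughout), the core idea of comparative analysis can be shown by aligning a capture against its reference and testing whether the streams realign after skipping a few reference samples at the point of divergence, the signature of a dropped-sample interstitial error:

```python
def find_interstitial_drop(reference, capture, max_gap=64):
    """Locate the first point where `capture` diverges from `reference`
    and test whether skipping a few reference samples realigns the two
    streams, indicating a dropped-sample (interstitial) error.

    Returns (position, samples_dropped), or None if the streams match.
    Simplified sketch: assumes at most one drop and exact sample values.
    """
    n = min(len(reference), len(capture))
    for i in range(n):
        if reference[i] != capture[i]:
            for gap in range(1, max_gap + 1):
                tail = min(n - i, len(reference) - i - gap)
                if tail <= 0:
                    break
                if reference[i + gap:i + gap + tail] == capture[i:i + tail]:
                    return i, gap
            return i, 0  # divergence with no clean realignment
    return None
```

A real system would additionally have to tolerate noise, level differences, and clock drift between the reference and the capture, which is where the comparative analysis methodology goes well beyond this sketch.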
Becoming an effective advocate for your collections means becoming a proactive participant in the management and planning of their preservation and long-term maintenance. The amount of work to do and the costs can feel overwhelming, but things will never change until you take charge, make a plan, and actively seek the resources you need. Here are 5 tips on how you can start to manage your collections rather than letting your collections manage you.
This paper examines preservation philosophies and strategies applied to large-scale video collections that are both born-digital and tape-based. Technically and philosophically different approaches may be applied to migrating born-digital, tape-based content, with decisions ranging from deck selection and choice of output to the specifications of the resulting file. At the core of this is the distinction between migrating digital video as an audiovisual signal versus migrating it as data.
Born-digital, file-based video recording is pervasive. Tape is not even an option on many new cameras being sold today. This shift has made the accessioning and management of file-based content, and the associated challenges, a new reality for archives. This presentation offers insights into the challenges that born-digital, file-based video brings to your archive and strategies for managing it.
Project Outsourcing: Navigating Through the Client/Vendor Relationship to Achieve Your Project Goals
A guide and checklist to help clients successfully work with vendors
A presentation focusing on obsolescence monitoring and normalization as strategies for managing born digital audio
David Rice and Mike Castleman represented Democracy Now! at the 2008 AMIA Digital Asset Symposium presenting on the integration of open source technology and Free Software in efforts to record, disseminate, and archive moving image media.
The presentation included references to:
Tools for Recording: dvgrab, cron, vidi
Tools for Transcoding and Wrapping: ffmpeg, mplayer, MP4Box, ffmpegX, x264 for QuickTime
Tools for Online Media Accessibility: The Internet Archive, blip.tv, Miro
Tools for Migrating AudioVisual Data from Tape-Based Digital Media: DATXtract and Live Capture Plus
Tools for Backup and LTO Management: Bacula
Metadata Extraction Tools: MediaInfo, getid3, qt_tools
Metadata Standard: PBCore
A Survey of Current Audiovisual Assessment and Prioritization Projects (Chris Lacinak coordinated this six-presentation session for JTS 2007. The speech below introduces the session, offering perspective and context on the topic and the presentations.)
Managing the Intangible: Quality Assessment of the Digital Surrogate
Written by Chris Lacinak on behalf of the Audio Engineering Society and the Association of Moving Image Archivists.
Preservation: The Shift from Format to Strategy
Nestled at the base of a green rolling hill, thirty minutes north of Accra, Ghana in the small village of Medie, is the African Heritage Library, the home of Odomankoma Kyrema, the “Divine Drummer” Kofi Ghanaba. Formerly known as Guy Warren, Ghanaba is one of the more elusive musicians of the 20th century… Click below to read the full article.