Data Analyst / XML Developer

Washington, D.C.
Created: March 21, 2014

Description

Title: Data Analyst & XML Developer (expert consultant)

Interested candidates should direct inquiries to Lauren Sorensen, Digital Conversion Specialist, at lsor@loc.gov, with a brief statement of interest, salary requirements, and CV; subject line: Developer. Applications are due April 28th.

Project description:
The Library of Congress & WGBH recently entered into an agreement to archive and provide descriptive access to the first 40,000 hours of American Archive of Public Broadcasting content. Through a grant from the Corporation for Public Broadcasting, the project will work with moving image and recorded sound content digitized by an outside vendor, alongside associated metadata and inventory records.

Job description:
The position will work closely with the American Archive of Public Broadcasting Digital Conversion Specialist and project coordinator to create scripts and protocols, and to test data models, for moving inventory and catalog records from the existing project database into the Library of Congress’ National Audio-Visual Conservation Center database schema and workflow system, using schemas such as PREMIS and PBCore. The incumbent will propose a utility for performing this work and will build and test tools to analyze data, extract patterns, and transform data among various formats as project demands require.
Additional duties include providing feedback on further project developments, including advice on the use of linked data and RDF. The developer will also work closely with project staff to advise on measures for actively sharing data between the institutions and to provide feedback on further website and data modeling work occurring over the course of the year.
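A minimal sketch of the kind of record transformation described above, assuming a PBCore 2.0 source record; the file name, field selection, and eventual mapping target are illustrative assumptions, not project specifications.

import xml.etree.ElementTree as ET

# PBCore 2.0 namespace as published at pbcore.org.
PBCORE_NS = {"pbcore": "http://www.pbcore.org/PBCore/PBCoreNamespace.html"}

def extract_descriptive_fields(path):
    """Pull a few descriptive fields from a PBCore record for downstream mapping."""
    root = ET.parse(path).getroot()  # expected root: pbcoreDescriptionDocument
    return {
        "identifier": root.findtext("pbcore:pbcoreIdentifier", namespaces=PBCORE_NS),
        "title": root.findtext("pbcore:pbcoreTitle", namespaces=PBCORE_NS),
        "description": root.findtext("pbcore:pbcoreDescription", namespaces=PBCORE_NS),
    }

if __name__ == "__main__":
    # "pbcore_record.xml" is a placeholder file name for illustration.
    print(extract_descriptive_fields("pbcore_record.xml"))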

Location: Culpeper, VA, or Washington, DC (if the candidate is located in Washington, occasional travel to the Culpeper campus will be required)

The position is anticipated to start in mid-2014. The schedule will be intermittent, as work demands, and the term cannot exceed one year from its start.

Minimum requirements:
• Understanding of and experience with common technologies for database management and transformation of catalog records, such as XSLT, XPath, MySQL, MINT (Metadata Interoperability Services), and OpenRefine.
• Data programming experience, for example, scripting or performing batch metadata operations/transformations using standard tools and methods.
• Familiarity with libraries, archives, and metadata schema standards and practices within that community.

Preferred qualifications:
• Familiarity with MAVIS (http://www.feenyx.com.au/)
• Familiarity with the PBCore 2.0 schema and with other schemas for descriptive, preservation, and structural metadata describing moving image, sound, and digital materials, such as PREMIS, MODS, and METS.
• Experience with data transformation tools and programming languages such as Python, Groovy, XML, XPath, and XSLT.
• Knowledge of data repository and preservation concepts and their application.
• Demonstrated interest in the applications of current and emerging technologies and their integration into the delivery of information services.
• Knowledge of semantic web tools and standards, including Linked Data, XHTML, RDF, RSS, Atom, SKOS, OpenURL/COinS, and OAI-PMH.
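As an illustration of the linked data and RDF advisory work mentioned in the job description, the sketch below expresses one catalog record as RDF triples using the rdflib Python library; the base URI, class, and property choices are assumptions for demonstration, not a proposed project vocabulary.

from rdflib import Graph, Literal, Namespace
from rdflib.namespace import DCTERMS, RDF

# Hypothetical base URI and class for item records; actual identifiers and
# vocabularies would be decided by the project.
AAPB = Namespace("http://example.org/aapb/")

g = Graph()
g.bind("dcterms", DCTERMS)
g.bind("aapb", AAPB)

item = AAPB["item-0001"]                  # placeholder identifier
g.add((item, RDF.type, AAPB.Program))     # hypothetical class
g.add((item, DCTERMS.title, Literal("Example Program Title")))
g.add((item, DCTERMS.date, Literal("1987-05-12")))

print(g.serialize(format="turtle"))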

Metadata

Published: Friday, March 21, 2014 19:39 UTC


Last updated: Tuesday, February 28, 2017 23:43 UTC