Monetising legacy content

Wednesday, March 27, 2013

File-based platforms and good process management may offer the key

Broadcasters and media content holders have for many years wanted to gain greater value from their legacy archives, but in many cases have never been quite ready to attempt what can seem a daunting task. Peter Gallen, product manager for Tedial, says that with the advent of more powerful content management tools and the move to file-based working, now is the time for any organisation with potentially valuable content to think seriously about such an initiative.

According to Gallen, the problem breaks down into two parts: first, discovering what is of significant value in the archive; and second, processing it and making it saleable. Material that is current may already have associated metadata which, if not already registered, can easily be catalogued in a MAM or content management system. The target here is to prepare each item with a low-resolution proxy version and some time-based content descriptors: who, what, where and when. Providing this level of cataloguing will make those items easily discoverable when they are placed in a suitable media portal.
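
By way of illustration, a catalogue entry of this kind might be sketched in Python as below. The field names and structure are assumptions made for the example, not the schema of any particular MAM.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Descriptor:
    """A time-based content descriptor: who/what/where/when for a span of the item."""
    start_tc: str          # timecode in, e.g. "00:12:30:00"
    end_tc: str            # timecode out
    who: str = ""
    what: str = ""
    where: str = ""
    when: str = ""

@dataclass
class CatalogueItem:
    """Minimal record a MAM might hold for a legacy item once it is registered."""
    item_id: str
    title: str
    carrier: str                                   # e.g. "Betacam SP", "1-inch reel"
    proxy_uri: str = ""                            # low-resolution browse copy, once generated
    descriptors: List[Descriptor] = field(default_factory=list)

# Example: one registered item with a single annotated scene (all values illustrative)
item = CatalogueItem(
    item_id="TAPE-00042",
    title="City centre flood coverage",
    carrier="Betacam SP",
    proxy_uri="proxy/TAPE-00042.mp4",
    descriptors=[Descriptor("00:12:30:00", "00:13:45:00",
                            who="News crew", what="Flooded high street",
                            where="Uxbridge", when="1998-10-24")],
)
```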

One of the stumbling blocks of recovering legacy content has been the sheer size of the problem, especially as much of the archive material with significant value is likely to be in the physical archive on pre-file-based carriers such as video cassettes or even reel-to-reel tape.

Many who have contemplated, and even started to plan, such a project have been confronted by the prospect of building a sizeable ingest capability using expensive components such as video servers and requiring a lot of manpower. The advent of cost-effective ingest systems, using video capture cards and standard PC workstations, has made the prospect of setting up an encoding unit much more reasonable. But this is not, or at least should not be, the green light for starting a massive digitising project: to gain the maximum return per tape ingested, some groundwork is required.

One starting point is to hunt down material in the library that is well marked or has data held in a library manager. These assets can be passed through a basic QC process to establish whether or not they are playable (given the availability of a suitable VTR); if so, they are given an entry in the MAM and added to a "to be digitised" section of the physical library. Tapes that appear to have valuable content but are visibly deteriorated should be reserved for recovery by specialised systems, perhaps using a third-party offering.
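
That triage step can be expressed very simply. The sketch below assumes hypothetical playability, condition and value flags coming out of the basic QC pass and the library manager; the queue names are illustrative only.

```python
def triage(record: dict) -> str:
    """Decide where a tape goes after basic QC.

    `record` is assumed to carry 'playable' and 'deteriorated' flags from the QC
    pass plus a 'value' hint from the library manager; the category names are
    purely illustrative.
    """
    if record.get("deteriorated") and record.get("value") == "high":
        return "specialist-recovery"   # reserve for third-party restoration
    if record.get("playable"):
        return "to-be-digitised"       # register in the MAM and queue for ingest
    return "hold"                      # no suitable VTR or unreadable for now

# Example triage of a few library records
for rec in [
    {"id": "T-001", "playable": True, "deteriorated": False, "value": "medium"},
    {"id": "T-002", "playable": False, "deteriorated": True, "value": "high"},
]:
    print(rec["id"], "->", triage(rec))
```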

The digitising production line

A digitising project to capture legacy content should not be attempted without a clear idea of how the workflow and overall efficiency are going to be managed. Although it could be handled with a human workflow, it will be much more efficient if it is controlled by a robust workflow manager. The definition of "robust" in this context is a system that ensures each task is completed to the specified requirements before moving on to the next stage.
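
That gating behaviour can be sketched as below. The stage names and task checks are illustrative assumptions, not those of any particular product.

```python
# Illustrative stage list for a tape-digitising workflow
STAGES = ["prepare_metadata", "load_tape", "capture", "qc_file", "register_in_mam"]

def run_workflow(item_id: str, tasks: dict) -> str:
    """Run each stage in order; stop and report if any stage fails its checks.

    `tasks` maps a stage name to a callable that returns True only when the
    task has been completed to the specified requirements.
    """
    for stage in STAGES:
        if not tasks[stage](item_id):
            return f"{item_id}: suspended at '{stage}' pending correction"
    return f"{item_id}: completed"
```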

In a traditional ingest facility much time and effort are wasted looking for accurate information, and because of a lack of connectivity between processes - for instance from content reception to ingest - data is often lost. Operators then have to re-enter it, leading to further delays and potential inaccuracies. To streamline the process, the workflow application provides operators with all the data necessary to complete their tasks. That information will have been gathered and stored in the MAM and might have come from a number of sources: data entered when the tape arrived; a tape library catalogue; or an external source such as an automation or scheduling application.

Ingest is a notoriously difficult activity, and problems often occur that force operators to stop and look for information or call for technical assistance. With a workflow underlying the processes, tasks may be suspended when difficulties arise and the operator can switch to another ingest job, while a supervisor or technician picks up the suspended workflow and corrects it before releasing it for completion by others.

The benefits of this way of working are that the ingest line is rarely idle, as failing tasks are side-lined, and relatively inexperienced operators can be used with little risk to the quality of the final results. In fact, this approach invariably leads to more accurate and consistent results, something the workflow manager's reporting system will confirm, since metadata for every aspect of each workflow is stored. This can then be used for fine-tuning workflows or looking for bottlenecks in the system as a whole.
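
As a sketch of how that stored metadata might be turned into a bottleneck report, assuming each task record simply carries a stage name and start and finish times:

```python
from collections import defaultdict
from datetime import datetime

def average_stage_durations(records):
    """Average time (in seconds) spent per workflow stage, from stored task records.

    Each record is assumed to look like:
    {"stage": "capture", "started": "2013-03-01T09:00:00", "finished": "2013-03-01T09:42:00"}
    """
    totals, counts = defaultdict(float), defaultdict(int)
    for r in records:
        start = datetime.fromisoformat(r["started"])
        end = datetime.fromisoformat(r["finished"])
        totals[r["stage"]] += (end - start).total_seconds()
        counts[r["stage"]] += 1
    return {stage: totals[stage] / counts[stage] for stage in totals}
```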

Capturing asset value

Although MAM is important in supporting the workflows described above, its ultimate purpose is to capture, structure and make available metadata that describes the material and defines its potential value. It may be that a 30-minute tape contains one minute of genuine value, and the MAM and its associated tools should be able to identify the scene or scenes to be presented for purchase.

One approach is to load such tapes into a robot tape library and digitise them to a proxy format. Operators can then browse through the material on a workstation and identify clips that are candidates for ingest into the archive. Alternatively, the whole tape might be digitised in both archive and browse formats.
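
Generating the browse proxy itself is essentially a transcode job. The sketch below assumes the ffmpeg command-line tool is available; the resolution and bitrates are arbitrary example values rather than recommended house settings.

```python
import subprocess
from pathlib import Path

def make_proxy(archive_file: str, proxy_dir: str = "proxy") -> str:
    """Create a low-resolution H.264 browse copy of an archive-format file."""
    out = Path(proxy_dir) / (Path(archive_file).stem + ".mp4")
    out.parent.mkdir(parents=True, exist_ok=True)
    subprocess.run(
        ["ffmpeg", "-y", "-i", archive_file,
         "-vf", "scale=-2:360",            # scale down to 360 lines, keep aspect ratio
         "-c:v", "libx264", "-b:v", "800k",
         "-c:a", "aac", "-b:a", "96k",
         str(out)],
        check=True,
    )
    return str(out)
```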

This preserves all of the material, on the basis that market requirements are likely to change over time, although the storage requirement is greatly increased. The more accurate the metadata that can be extracted, the more likely it is that the material will be discovered, and potentially bought, by visitors to the content portal.

The Shop Window

Having transformed the selected archive material into files, the next step is to make it available for viewing, and ultimately purchase, by people both inside and outside the organisation. To secure the content it will be necessary to limit access to it, and there is a variety of ways to do that. One is to place the material to be made available in a portal that provides a window into a separate repository or virtual storage area.

Such a repository will grant access to the high-resolution content on the basis of login privileges. It may also allow wider access to proxy versions, which carry embedded logos to deter theft or even a digital watermark that enables any subsequent usage to be tracked. For broadcasters and other content owners the first step would probably be to make their "new" content available to other similar organisations. This business-to-business approach is likely to be the simplest route, avoiding the scale required to provide a service directly to consumers.

To date, one solution that a number of broadcasters in Europe have embarked on is to host the content they want to share in a private cloud, making proxy content and metadata available through a portal. Such a portal provides the ability to search for content and should include tools to select clips from long-form material before dropping the results into a basket for subsequent processing. That, in turn, triggers the partial restore of the required clips from the original files and their transfer to the customer or partner.
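
The partial restore behind that basket can be as simple as a sub-clip pull from the original file. The sketch below again assumes ffmpeg and uses purely illustrative paths and timecodes.

```python
import subprocess

def partial_restore(source_file: str, start_tc: str, end_tc: str, out_file: str) -> str:
    """Cut a clip out of a long-form archive file without re-encoding.

    start_tc/end_tc are HH:MM:SS positions taken from the portal basket; stream
    copy keeps the operation fast, at the cost of cutting on keyframes only.
    """
    subprocess.run(
        ["ffmpeg", "-y", "-i", source_file,
         "-ss", start_tc, "-to", end_tc,
         "-c", "copy", out_file],
        check=True,
    )
    return out_file  # ready to be handed to the file-transfer application

# Example: one basket entry becomes one deliverable clip (paths are hypothetical)
# partial_restore("archive/TAPE-00042.mxf", "00:12:30", "00:13:45", "delivery/TAPE-00042_clip1.mxf")
```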

Today many content holders are unsure about committing their material to a cloud, even a private one, because of security and transfer-cost concerns, so the preferred approach will often be to use one of the available file transfer applications for delivery.

Conclusion

The thorny issue of monetising legacy content may be tackled thanks to the potential cost savings of file-based working coupled with the acceptance that the broadcast production line, like any other manufacturing activity, requires the deployment of formal workflow systems. The efficiencies gained in such systems will be even greater when applied to the systematic assessment and recovery of tapes sitting in almost every broadcaster’s vaults. 
In addition, such systems should be capable of reporting metrics of the digitising process that enable organisations to accurately measure the costs of recovering content as well as helping to quantify the ultimate value of the material that will be made saleable. 
