Quality control in file-based workflows

Sunday, July 08, 2012

Today’s broadcast industry works within a very different business environment from that of ten years ago. Gone is the monopoly of passive linear viewing dictated by programme schedules, replaced by a multitude of viewing devices that enable consumers to access content on demand, in different resolutions, at a time and place to suit themselves.

In this brave new world, progressive media organisations see a host of new opportunities in which existing content can generate significant new revenue streams across a range of channels to market. Before launching into relatively uncharted waters, these players will look at the challenge from an enterprise perspective rather than jumping straight into the technical detail. They will identify the critical success factors in these new business models, and they will recognise the importance of quality control – making sure that, across all their operations, every form of media file they offer consumers is high quality, resilient and fit for purpose.

Often, the market for this content will be international, which means there is an implicit need for standards conversion. Furthermore, it is not just a matter of applying standards conversion, but of implementing it well – which greatly increases the value of the content on international markets. Whether the content is destined for the North American, Asian or European market, to be watched at home on an HDTV or on the move via a smartphone, the media files need to be transformed into a form appropriate to that application and market.

With the evolution of file-based workflows this transformation, or transcoding process, has become more accessible. However, there remains the important issue of quality control – being certain that the quality of each media form is consistent and at the level that the content owner requires. Quality Control (QC) is an important part of bringing content into broadcasters’ workflows and archives. Broadcasters industry-wide spend a large proportion of their revenue on acquiring content, but this content cannot be monetised by a broadcaster until it has successfully made it into the business’ workflow.

The central purpose of any media facility (or media factory, as they have become today with industrialised workflow processes) is to capture and store content, re-purpose it according to market demand, and then distribute it across a range of platforms and channels to market. Underlying all these operations are two mission-critical requirements: efficiency and quality.

The panacea that modern media factories strive for is to harness automation in ways that increase efficiency and profitability throughout the operation, reducing the pressure on staff to implement QC procedures while still allowing an appropriate human touch to ensure that the required quality is achieved consistently across all forms of media output.

Automating QC lowers the cost of bringing content into the business. The challenge is to strike the right balance between QC automation and QC reliability: whether ingesting content from tape or transcoding a file, a good QC methodology provides the opportunity to identify problems with files – and with the processes that made them – in a timely manner, thereby saving money.

Modern media facilities want their transcode farm to run reliably and hands-free, but they also need to make sure that upstream changes affecting the input to the farm do not impact its stability. The commercial efficiency of a media facility is a peculiar mix of the type of content that flows through it, the technical quality of the output, the capital cost of equipment, the throughput of the software, the cost of rejects, reworks and mistakes and the cost of the operators. QC can impact many of these costs. 

Implemented strategically, QC can reduce reject rates, improve technical quality, generate metadata for more intelligent workflows and improve the effectiveness of operational staff. All of these have a direct impact on the bottom line.

Real world economics – it’s not just about quality
Performing QC costs money - so just how much money should a business seek to spend on it? Quality control is similar to an insurance policy in that you do it to prevent bigger problems occurring in the future. 

In today’s fluid environment, consumers are still getting used to new delivery platforms and a bad experience can put them off using a new device. Often, that bad experience can be attributed to problems in the media file rather than in the hardware device. For this reason, QC assumes a new and critical importance – making sure that the media content offers reliable playback in all its forms.
 
As a general rule, the cost of implementing QC should be less than the cost of the problems it will be fixing (otherwise, what is the point?). In a world with only a limited number of file variants this is straightforward, but today we see a growing proliferation of new distribution platforms requiring a wide range of file types, all of which need analysis and verification before they leave the media facility.

While every distribution platform offers revenue-generating possibilities, some applications will offer far more value than others, and these media types deserve special focus in the media facility. This market dynamic leads to the idea of "adaptable QC", where the more valuable content is subjected to more rigorous QC procedures.

The same market driver also leads to the concept of fault and degradation prevention, since it can cost more to fix a media file fault than to get it right in the first place. Where that is the case, then averaged over the business, a more expensive transcoder that produces better quality output may well be cheaper to own than a less expensive transcoder that provokes more quality issues.
 

Total cost of ownership is a complicated equation. Averaged over the content processed:

  Cost (transcoder + throughput + QC + fix-up)
  must be less than
  Cost (transcoder + throughput) + Cost (errors downstream at unsuspected places)
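
As a back-of-the-envelope illustration of this trade-off, the sketch below compares the two sides of the inequality. All of the figures are hypothetical, invented purely to show the arithmetic:

    # Back-of-the-envelope TCO comparison using purely hypothetical figures.
    files_per_year = 100_000

    # Option A: cheaper transcode path, no automated QC.
    cost_transcode_a = 0.40   # per-file transcode + throughput cost
    error_rate_a     = 0.02   # fraction of files failing downstream
    cost_downstream  = 45.00  # average cost of an error found downstream
                              # (re-delivery, missed slots, manual rework)

    # Option B: better transcoder plus automated QC and in-house fix-up.
    cost_transcode_b = 0.55
    cost_qc          = 0.10
    fixup_rate_b     = 0.015  # fraction of files needing an in-house fix
    cost_fix_up      = 6.00   # average cost of fixing a fault caught by QC
    error_rate_b     = 0.002  # faults that still slip through downstream

    tco_a = files_per_year * (cost_transcode_a + error_rate_a * cost_downstream)
    tco_b = files_per_year * (cost_transcode_b + cost_qc
                              + fixup_rate_b * cost_fix_up
                              + error_rate_b * cost_downstream)

    print(f"Option A (no QC):   {tco_a:,.0f} per year")
    print(f"Option B (with QC): {tco_b:,.0f} per year")

With these assumed rates, the option carrying QC and fix-up costs still comes out cheaper, because the cost of errors found downstream dominates the equation.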

When considered at an enterprise scale, the question of whether to invest in higher quality transcoders or in the resources required to identify and correct faults in media files becomes more difficult. It is true to say that in many cases the more expensive option may well offer better value for money when considered over the equipment’s total operational lifetime.

Focus on workflow processes, not problems
Quality control is not a point solution – it is a methodology and a set of processes that safeguard the quality of a media factory's output. A typical file-based workflow may have many touch points between media arriving in a facility and media leaving. Each touch point – be that a file mover, transcoder, editor or playout server – uses up some of the ‘quality margin’. Take the phenomenon of lip-sync uncertainty: no single step in a workflow will insert 100 ms of lip-sync error, but ten touch points of 10 ms each could create such an error without any single source of lip-sync problems being traceable in the workflow.

At AmberFin, we have developed the Unified Quality Control (UQC) system, which tracks quality over the lifecycle of material: XML reports can be generated that allow quality margins to be tracked across multiple processes, multiple tools and multiple media files. One file showing a small lip-sync shift may not be significant; 990 files from a batch of 1,000 showing an identical small shift indicates that the process is faulty and should be fixed. This ability of UQC to deliver analysis of processes as well as of individual files allows media companies to extract knowledge from their QC data, so that processes can be refined and efficiencies improved, leading to improvements in the overall business.
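The principle behind that process-level view can be sketched as follows. The per-file measurements and thresholds are hypothetical, and this is not the actual UQC report format:

    from statistics import mean

    def flag_systemic_shift(shifts_ms, per_file_tolerance=40, batch_tolerance=5):
        """Judge a batch of per-file lip-sync measurements (ms) as a process."""
        per_file_failures = [s for s in shifts_ms if abs(s) > per_file_tolerance]
        batch_mean = mean(shifts_ms)
        process_fault = abs(batch_mean) > batch_tolerance
        return per_file_failures, batch_mean, process_fault

    # Toy batch: 990 of 1,000 files share an identical 12 ms shift.
    shifts = [12] * 990 + [0] * 10
    failures, avg, fault = flag_systemic_shift(shifts)
    print(f"Per-file failures: {len(failures)}, "
          f"batch mean: {avg:.1f} ms, process fault suspected: {fault}")

No individual file breaches the per-file tolerance, yet the batch statistics point clearly at a faulty process rather than faulty content.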

Many in the media sector today aspire to a ‘lights out’ facility in which workflow processes operate with minimal human intervention. When QC is discussed in this environment, the goal of fully automated QC with "no people" often comes up. The reality is that we are still working in an entertainment industry, and only a human operator is truly capable of judging that content is fit for purpose. Massive efficiency gains can be made with the correct combination of software and hardware analysis to assist the operator – an "assisted QC" solution that dramatically reduces the cost of QC whilst simultaneously improving its reliability.

Not every file needs to be 100% QC'd. In a big farm it may be appropriate to run a lightweight syntax QC on every file created and to keep a single QC analysis node sampling the generated files – much as car manufacturing performs a lightweight check on every car and a detailed investigation of a small random sample to validate the manufacturing process. In media manufacturing, lightweight QC checks each file for gross errors, whereas the full QC analysis ensures that the transcode, ingest and edit processes are still working as required.
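A simple way to express that sampling policy in a workflow controller might look like the sketch below; the check functions and the 2% sample rate are hypothetical placeholders:

    import random

    SAMPLE_RATE = 0.02  # hypothetical fraction of files sent for full analysis

    def lightweight_qc(path):
        """Cheap syntax/wrapper check run on every file (placeholder)."""
        print(f"lightweight check: {path}")
        return True

    def full_qc(path):
        """Deep baseband analysis run on a random sample (placeholder)."""
        print(f"full analysis:     {path}")
        return True

    def qc_dispatch(path):
        ok = lightweight_qc(path)               # every file: gross-error check
        if ok and random.random() < SAMPLE_RATE:
            ok = full_qc(path)                  # small sample: validate the process
        return ok

    for f in ("promo_001.mxf", "promo_002.mxf", "promo_003.mxf"):
        qc_dispatch(f)
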

Conclusions
Quality Control is a business process that costs money. In today’s multi-format, file-based workflows, QC is an essential prerequisite of efficient business operations. The $64 million question is how much money is the right amount to invest, and where it should be invested.

Working out the risks and consequences of QC failure helps judge the amount of QC analysis required in a system. At the end of the day, the challenge comes down to a QC system’s total cost of ownership: Cost (transcoder + throughput + QC + fix-up), averaged over the content, must be less than Cost (transcoder + throughput) + Cost (errors getting downstream).

Make the right QC choice and the future looks bright – implement flawed strategic thinking and your entire media facility is built on very fragile foundations.



 
