The Hybrid DAM

Monday, August 13, 2012

In the digital asset management (DAM) business, you wind up hearing this dilemma all the time: My company has a huge library of source and broadcast-quality assets in our local DAM — which took a huge investment of both time and money to buy, configure, and catalogue — but no one is using it! Erik Freid, VP of Product Management at MediaSilo, offers a solution.

Oftentimes, the problem is that for the marketing departments, production departments, contributors, freelancers, executives, and clients who need to access content, the system that took so much time, effort, and money to build is, in fact, too complex, too stationary, and not accessible enough to actually use. The result of this unfortunate situation — wherein assets are your capital, but no one can actually get to them — is a vicious cycle of duplicated effort. This might manifest as sending yet another team out to get the same establishing shot you know is already in the library somewhere, or paying for celebrity stock footage you already own. By placing barriers around content, companies industry-wide are decreasing the efficiency of access to their own content, their own property, their own lifeblood.

For many enterprise broadcast entities, the big question becomes: How do you take these repositories and make them easily accessible to the people who need them so that your valuable resources can be efficiently discovered, used, and reused?

In the current climate of cloud everything, the immediate answer might be: put it on the cloud. But, unfortunately, the solution of cloud storage comes with its own plethora of downsides. Sure, it provides access from anywhere, but what enterprise company handling privacy-sensitive, pre-release materials wants to open a security DMZ (“demilitarized zone,” a.k.a. an unprotected web server port)? And, beyond the potential security issues, storing an entire library’s worth of source-sized files in the cloud is simply not yet a practical solution from a storage-cost or file-transport perspective.

So, if a local DAM is too stationary, but converting entirely to cloud storage is too costly and leaves content too vulnerable... then it seems the answer must lie somewhere in between. This is one of the many reasons that MediaSilo, along with many other cloud-based SaaS (software as a service) providers, is turning to a hybrid approach that takes advantage of XML data exchange or an API (Application Programming Interface, a.k.a. a set of calls that allows software systems to communicate with one another) for custom integration. This allows in-house or 3rd party integrators to create client-specific hybrid solutions that find a safe and solid middle ground.

Results of this hybrid approach vary, but the goal is consistent: allow source files to be stored locally, while small proxy versions of them are uploaded to the cloud for review and discovery. These proxies can then be used for collaboration, approvals, and sharing — while still maintaining a connection to the secure local source through the proactive use of metadata.
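To make that binding concrete, here is a minimal sketch in Python of the kind of record that ties a cloud-hosted proxy back to its locally stored source. The field names are hypothetical (no particular schema is prescribed here); any real DAM/cloud pairing will define its own.

    from dataclasses import dataclass, field
    from datetime import datetime

    @dataclass
    class HybridAssetRecord:
        """Ties a lightweight cloud proxy to its full-resolution local source.

        Illustrative only: field names are assumptions, not a published schema.
        """
        asset_id: str          # shared identifier across both systems
        proxy_url: str         # small, watermarked proxy hosted in the cloud
        source_location: str   # path/URI inside the local DAM or SAN; never exposed publicly
        tags: list = field(default_factory=list)                 # searchable descriptive metadata
        ingested_at: datetime = field(default_factory=datetime.utcnow)

    # Reviewers collaborate, approve, and share against proxy_url; a fulfilment
    # request resolves asset_id back to source_location behind the firewall.
    record = HybridAssetRecord(
        asset_id="A-102938",
        proxy_url="https://cloud.example.com/proxies/A-102938_watermarked.mp4",
        source_location="lto://library01/reel_44/A-102938.mxf",
        tags=["establishing shot", "city skyline", "broadcast"],
    )

The cloud side only ever exposes the proxy URL and the descriptive metadata; the source location stays behind the firewall and is resolved only when a properly credentialed request comes in.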

Within MediaSilo, we have a number of clients who have implemented similar strategies, but the following two examples really show the potential of a little creative flexibility. Client A is a straightforward local DAM that has been linked with our cloud-based system. Client B has added a cloud element to an existing SAN (Storage Area Network, a.k.a. shared storage system that allows multiple users to access files simultaneously). 

Both of these setups were implemented by 3rd party integrators who used our powerful API to meet their clients' need for secure end-user access to large local repositories via the Internet.

DAM in the Cloud:

Client A provides IT asset management as a service (ITAM) for a large industrial manufacturer that produces a variety of content: from training videos to advertising content that includes video, audio, and images. This client maintains a local DAM that houses all their content in an LTO (Linear Tape-Open) library. At ingest into the local system, a watermarked proxy and a metadata payload are simultaneously uploaded to the cloud system as well.
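As a rough sketch of that ingest step (with ffmpeg standing in for whatever transcoder the facility actually uses, and api standing in for the integrator's cloud client; both are assumptions rather than details of this deployment):

    import json
    import subprocess
    from pathlib import Path

    def ingest(source: Path, asset_id: str, api) -> None:
        """Hypothetical ingest hook: while the source file is committed to the
        LTO-backed local DAM, create a watermarked proxy and push it, together
        with a metadata payload, to the cloud system."""
        proxy = source.with_suffix(".proxy.mp4")

        # Low-bitrate, watermarked proxy. The drawtext filter assumes an ffmpeg
        # build with libfreetype; any watermarking/transcode tool would do.
        subprocess.run([
            "ffmpeg", "-i", str(source),
            "-vf", "scale=640:-2,drawtext=text=PROXY:x=10:y=10:fontcolor=white",
            "-b:v", "800k", str(proxy),
        ], check=True)

        payload = {
            "asset_id": asset_id,
            "source_location": f"lto://library01/{source.name}",   # stays local
            "title": source.stem,
        }
        api.upload_proxy(asset_id, proxy)                 # only the small proxy goes to the cloud
        api.set_metadata(asset_id, json.dumps(payload))   # metadata binds proxy to source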

Once assets are ingested and bound together, the client’s worldwide sales staff can access all the content and browse for what they need through a custom, branded portal. When someone finds something they want the source file for, the user interface allows them to request either the entire asset or a partial version in the codec/format they desire. This sends a request, via XML, to a server that the local system is set to poll at regular intervals. Using file acceleration software, a new deliverable is created and a link is emailed to the address on file for the requesting user. Once the email link is clicked, delivery of the requested content is initiated — and the link automatically expires after the download is successfully completed.
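In outline, the local side of that request-and-poll exchange could look like the sketch below; the XML layout, queue URL, and polling interval are invented for illustration, not taken from the actual integration.

    import time
    import urllib.request
    import xml.etree.ElementTree as ET

    QUEUE_URL = "https://requests.example.com/pending"   # hypothetical request queue
    POLL_INTERVAL = 300                                   # seconds; "regular intervals"

    def poll_once(fulfil) -> None:
        """One polling pass by the local system: fetch queued XML requests and
        hand each to a fulfilment callback (transcode, accelerated transfer,
        expiring email link, as described above)."""
        with urllib.request.urlopen(QUEUE_URL) as resp:
            root = ET.parse(resp).getroot()

        # Illustrative payload the portal might have queued:
        # <requests>
        #   <request asset_id="A-102938" user="sales@example.com" scope="partial"
        #            in_point="00:01:10" out_point="00:01:42" codec="ProRes 422"/>
        # </requests>
        for req in root.findall("request"):
            fulfil(
                asset_id=req.get("asset_id"),
                user_email=req.get("user"),
                scope=req.get("scope"),        # "full" or "partial"
                codec=req.get("codec"),
                in_point=req.get("in_point"),
                out_point=req.get("out_point"),
            )

    def run(fulfil) -> None:
        while True:
            poll_once(fulfil)
            time.sleep(POLL_INTERVAL)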

With this workflow, Client A has been able to control the costs of offering a cloud service, while still providing a global distribution system and secure access to all media materials worldwide.

SAN in the Cloud:

Client B demonstrates a far more complex integration. This client is a major Internet media news property that wanted to mirror their SAN in the cloud in proxy form, so that at any given time the assets on their SAN are searchable from anywhere in the world. This would effectively create a PAM (Production Asset Management) system designed to track work in progress, something very much needed in a news environment. On top of that, they wanted access to multiple versions that would be hosted on their own system, rather than the cloud service provider’s — in order to ensure control of access.

To serve this need, the integrator for Client B used MediaSilo’s API to develop a server appliance and write an application platform that constantly monitors their SAN directories. As changes are made to the SAN, this application makes corresponding calls into the cloud platform to mirror those changes, which might include: a new folder or subfolder being created, new assets being added, or content being moved to a new location or deleted. While not in real time, this sync updates on a fixed schedule multiple times per day. Utilizing a concept called “external asset create,” a proxy version along with other deliverables can now exist on the cloud storage platform of the client’s choice (or any other HTTP-accessible storage location with its own RTMP streaming server to control access and cost) — while the source content remains on their SAN. Through this setup, multiple versions of each asset are available via download links, which are inserted into the metadata of the record so that they can be accessed by those with the appropriate credentials and downloaded with a single click.
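Conceptually, that watcher behaves like the sketch below, in which the method names on api (including the call standing in for “external asset create”), the mount point, and the sync interval are illustrative assumptions rather than the integrator's actual code.

    import time
    from pathlib import Path

    SAN_ROOT = Path("/mnt/san/projects")   # hypothetical mount point
    SYNC_INTERVAL = 4 * 60 * 60            # "multiple times per day", not real time

    def snapshot(root: Path) -> dict:
        """Map every file under the SAN root to its last-modified time."""
        return {p: p.stat().st_mtime for p in root.rglob("*") if p.is_file()}

    def sync(api, previous: dict) -> dict:
        """Compare the current SAN state with the previous pass and mirror the
        differences into the cloud platform via the integrator's API client."""
        current = snapshot(SAN_ROOT)

        for path in current.keys() - previous.keys():
            # New asset: register it remotely ("external asset create"), pointing
            # at a proxy hosted on client-controlled HTTP/RTMP storage.
            api.external_asset_create(
                source_path=str(path),
                proxy_url=f"https://media.example.com/proxies/{path.stem}.mp4",
            )

        for path in previous.keys() - current.keys():
            api.delete_asset(source_path=str(path))      # removed, or moved elsewhere

        for path in current.keys() & previous.keys():
            if current[path] != previous[path]:
                api.update_asset(source_path=str(path))  # content changed in place

        return current

    def run(api) -> None:
        state: dict = {}
        while True:
            state = sync(api, state)
            time.sleep(SYNC_INTERVAL)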

In both these cases, Clients A and B have leveraged existing local systems and have been able to maintain existing workflows by automating the process behind the scenes. Instead of taking the time and expense of trying to host and build their own solutions, their integrators were able to leverage the benefits of a platform as a service (PaaS) that offers the flexibility and user-friendly interface needed to securely and easily get their content out to end users anywhere in the world. This means that their content is no longer locked away, but instead democratized across the entire organization — ultimately saving individual time and organizational money. More than that, this process transforms previously unused and inaccessible content into valuable, dynamic assets.  
