Think you’re cut out for an efficient edit workflow?

Jeff Grassinger

Sales Alliance Manager

Why one media organization loves their video post-production workflow (And you should, too!)

Do you think your editing workflow is efficient? This question of workflow efficiency around edit continues to come up in many of our conversations with media professionals. Edit workflows are a challenging landscape, as in most environments there is a lot to consider: codecs, network interfaces, switches, shared storage, and more. And that is before the never-ending debate over which edit solution is “best”: Avid, Adobe, or Apple.

Efficient edit made easy

While this edit debate continues and we wait for a dominant edit platform to emerge, many media organizations are taking advantage of a key opportunity to drive efficient, flexible video editing workflows. Collaboration within your edit workflow is a great example of how a media organization can become much more efficient.

Recently we had the opportunity to talk with Martin de Bernardo, Manager of Technical Services at Sheridan College (the second-largest art college in North America), about the challenges they faced in their media environment and their success in becoming more efficient. Specifically, Sheridan was looking to give its students unlimited access to media so they could collaborate on projects with classmates and instructors.

See why they love their workflow

When you watch the video you’ll understand how the team at Sheridan addressed their access and collaboration challenges in an environment running Apple Final Cut Pro, Adobe Premiere Pro, and Avid Media Composer. You’ll see why Sheridan is breaking new ground in how an educational institution uses shared storage and collaboration tools from MXFserver.

Let us know if we can help you build efficient, collaborative workflows in your environment, and if this video was helpful, please share it with your network.

To read more about Sheridan College’s Isilon solution, please click here.

Strata & Hadoop World 2014 Recap

Ryan Peterson

Chief Solutions Strategist

Preventing terrorist attacks, feeding the hungry, capturing bad guys, and enabling cubic splined segmented continuous polynomial non-linear regression models. I promise to try to explain the last one later in this blog.

This week was Strata + Hadoop World, a fast-growing convention and exposition aimed squarely at statisticians, engineers, and data scientists. The topics were diverse, ranging from machine learning to the Michael J. Fox Foundation’s use of Hadoop to help discover Parkinson’s disease earlier in the patient cycle.

What is clear from the messaging this year is that Hadoop has made it into the mainstream of technology people are using in their organizations. Customers from all walks of life spoke about their projects and planned projects, and about how they changed their business, the economy, the world, or even saved lives.

One of our customers described a scenario in which their software, running with Cloudera Hadoop and Isilon, was used to save a life: “A child contacted a helpline service online, indicating that he had self-harmed and was intending to commit suicide. This was passed on to CEOP who acquired the communications data to reconcile the IP address to an individual. They did so in a very short space of time and passed it on to the local police force. When they got into the address the child had already hanged himself, but was still breathing. If there had been any delay, or if the child had been unlucky enough to be using one of those service providers that do not keep subscriber data relating to IP addresses, that child would now be dead.” – Page 11 of http://www.publications.parliament.uk/pa/jt201213/jtselect/jtdraftcomuni/79/79.pdf

We also saw Mark Grabb of GE explain GE’s use of EMC technology to create the Industrial Internet and what that means for the innovation engine (pardon the pun) at GE.

What we are most excited about this year is a fundamental transition that flips the assumption that data must be moved into a new repository before it can be included in analysis. Don’t get me wrong: data lakes simplify the management and correlation of data by getting as much of it into one place as possible. With that in mind, there are some fundamental issues we are starting to address. Take this math from a real customer: 130 PB of object storage used to house video and images, plus 8 PB of file data used for home directories, weblogs, clickstream, and more. Add in a desire to run analysis on ALL of that data, and you’ll need 3-4x that capacity in a central Hadoop system. Do you want to build a 400-500 PB raw-capacity hyper-converged Hadoop cluster? What if we could flip the process and offer the right storage solution for the data being stored, at the location where that data needs to be stored, and for the primary workload that originally captures and uses it? That changes the conversation to creating a highly capable platform, full of all of the ecosystem applications, and pointing it at the data. I had the opportunity to discuss this flipping of the process with customers during a session at Strata, and it was met with great enthusiasm.
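To make the arithmetic concrete, here is a minimal back-of-the-envelope sketch of that capacity math (the figures come from the example above; the 3-4x factor is an assumption reflecting standard HDFS triple replication plus scratch overhead):

```python
# Back-of-the-envelope math for centralizing all data in a Hadoop cluster.
object_store_pb = 130   # video and images on object storage
file_data_pb = 8        # home directories, weblogs, clickstream, and more
primary_pb = object_store_pb + file_data_pb

for factor in (3, 4):   # HDFS 3x replication, plus roughly 1x temp/scratch space
    print(f"{factor}x -> ~{primary_pb * factor} PB raw capacity")
# 3x -> ~414 PB, 4x -> ~552 PB: roughly the 400-500 PB cluster described above
```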

Mike Olson announced the partnership with EMC and the enablement of Isilon as the first platform to be certified with CDH. See his blog at http://vision.cloudera.com/turn-your-data-lake-into-an-enterprise-data-hub/. It reflects on the idea of layering an Enterprise Data Hub above all of the data in data lakes to enable a central system for correlating data from many sources. Mike Olson and I discussed our newfound partnership with David Vellante on theCube.

We could not be happier about these announcements and look forward to a long and mutually prosperous relationship. Let me say here that the Cloudera team includes some of the most humble and talented people in the world, and they are a joy to work with. Tom Reilly, Cloudera’s CEO, and Mike Olson both took multiple opportunities on stage to talk about the new partnership, from Mike’s keynote to their Cloudera Partner Summit discussions.

During the event, I had the great privilege of holding a joint “Jam Session” with Ailey Crow from Pivotal’s Data Science team. The goal of the session was to riff on projects we have worked on, ranging from healthcare and life sciences to government, telco, and banking. With a packed house, she and I had an incredible time answering questions, discussing Big Data use cases, and more. Ailey is one of the smartest people I have met, and I am truly honored to have shared the stage. One example from the discussion: banks using social sentiment analysis to look at stock trends, giving traders one more data point before investing in particular securities. In another example, Ailey described correlating air quality data with patients who experience asthma and who haven’t refilled their prescriptions; the result enables notifications to those patients to refill their prescriptions when air quality drops below certain thresholds.

An offshoot conversation with Ailey and B. Scott Cassell (Director, Solutions Architecture for EMC) turned to an idea B. Scott has for modeling storage performance. As he explained what he was doing, Ailey pointed out that what he wanted to create was a “cubic splined segmented continuous polynomial non-linear regression model”. Roughly, that means building a specific model of performance from specific plot points; to keep the model as accurate as possible, you break it into multiple chunks (segmented), connect those segments using cubic splines (I had no idea what those were, but they did), and ultimately graph a continuous polynomial.
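Here is a rough idea of what one looks like in code, as a minimal sketch with SciPy (the plot points are hypothetical, not B. Scott’s actual data):

```python
# A minimal sketch of a segmented cubic-spline fit through performance data.
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical measurements: offered load (thousands of IOPS) vs. latency (ms)
load = np.array([10, 20, 40, 60, 80, 100])
latency_ms = np.array([0.4, 0.5, 0.8, 1.6, 4.2, 11.0])

# CubicSpline fits one cubic polynomial per segment between data points and
# joins them so the curve and its first two derivatives stay continuous:
# the "segmented continuous polynomial" part of the mouthful above.
spline = CubicSpline(load, latency_ms)

print(spline(70))     # interpolated latency at a load we never measured
print(spline(70, 1))  # first derivative: how quickly latency is climbing
```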

Yep, that hurts my brain. And I bring it all up for good reason: this year at Hadoop World we began to see new products that do all of that for you and put together the neat graph or chart, or even turn it into an application (perhaps an easy-to-use performance predictor is in our future). Hadoop is becoming an underlying toolset that will be the base for the next generation of technology. Like the RDBMS before it, Hadoop will soon be more of a term and less of “the application”.

The EMC Federation was there in full force. The EMC booth displayed Isilon, ECS, and DCA. Experts on each platform manned the booth, and hundreds of attendees came through to learn more about our HDFS storage solutions. Sitting just across from VMware and down the hall from Pivotal, I was reminded how strong a force the Federation already is in the Big Data space. With the newly announced DCA+Isilon+Pivotal bundle v2, the Federation is able to provide the “Data Lake in a box” that so many have been asking for. See the press release at http://wallstreetpr.com/emc-corporation-nyseemc-and-pivotal-unveils-data-lake-hadoop-bundle-2-0-34175

Aidan O’Brien (Head of Global Big Data Solutions for the EMC Federation) and Sam Grocott (SVP, Emerging Technologies Marketing & Product Management) discuss the newly formed Emerging Technologies Division and the plans for EVP solutions around Big Data.

People often ask me what excites me about working for a storage company. I like to answer with a couple of key points. EMC is no longer a storage company in my mind; we are a data company, and we’re tackling challenges that had previously gone unsolved. EMC stores data, but with protocol access to that data such as HDFS (Hadoop), EMC is able to unlock the potential of that data and allow new, harder questions to be asked. So whether you’re trying to prevent terror, increase food production, return kids to their parents, or answer a complex technology performance question in an easier way, EMC has the tools and rich partnerships to help you do it.

Wal-Mart probably knows more about you than your doctor…

James Han

Sr. Business Development Manager-Healthcare

When you walk into a Wal-Mart, they likely know more about you than your local hospital does. They know when and what you’ve purchased, your income, your family members, your political affiliation, and probably even the habitual route you take while walking through the store. Like many other companies, Wal-Mart mines tons of big data to improve their marketing campaigns, sell more, and generally improve their bottom line.

Healthcare is behind in employing big data analytics tools. For example, when you go to the hospital or clinic, it is often treated as a single visit; you may even have to update all of your demographic information each time. Your data is generally only important during your visit and is often archived immediately afterward, essentially making it inaccessible for subsequent visits. What if healthcare could employ big data analytics at the level of commercial enterprises like Wal-Mart?

Let’s look at some statistics related to healthcare spending. A 2012 report (of 2009 data) from the National Institute for Health Care Management (NIHCM) reveals that spending on healthcare services is extremely uneven: a small proportion of the population is responsible for a very high portion of spending. The study finds that the top 5% of spenders account for almost half of all spending ($623 billion), and the top 1% of spenders account for over 20% of spending ($275 billion) [1] (see figure).

[Figure: Concentration of U.S. healthcare spending (NIHCM, 2012)]
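As a quick sanity check of those proportions (the roughly $1.26 trillion total below is an assumption implied by “$623 billion is almost half”):

```python
# Sanity-checking the NIHCM spending-concentration figures.
total_b = 1260               # assumed total spending, in billions of dollars
top5_b, top1_b = 623, 275    # spending by the top 5% and top 1% of spenders
print(f"top 5%: {top5_b / total_b:.0%} of all spending")  # ~49%, "almost half"
print(f"top 1%: {top1_b / total_b:.0%} of all spending")  # ~22%, "over 20%"
```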

It wouldn’t take much improvement in efficiency in dealing with that 1% of the population to produce a substantial payoff. If trends could be identified, or procedures developed, that lower costs for those few high utilizers, keep them healthier, and reduce their consumption, the impact could be dramatic.

Unfortunately, many healthcare providers are still trying to figure out what data they need to perform the equivalent of Wal-Mart’s analytics. Or they have the data but can’t figure out how to get it all in one place.

EMC Isilon can help. Isilon is in the business of big data: making big data analytics more cost-effective and, perhaps most importantly for healthcare, easier to implement. Isilon provides the foundation for a scale-out data lake, a key capability that brings simplicity, agility, and efficiency to storing and managing unstructured data. Starting with a scale-out data lake, healthcare organizations can:

  • Invest in the infrastructure they need today to get started today,
  • Realize the value of their data by storing, processing, and analyzing it in the most cost-effective manner, and
  • Grow capabilities as needs grow in the future.

In short, EMC Isilon can help healthcare organizations get on the road to leveraging their data to improve patient comfort, lower costs, and streamline healthcare procedures.


Source: [1] “The Concentration of Health Care Spending,” NIHCM Foundation Data Brief, July 2012. http://www.nihcm.org/component/content/article/326-publications-health-care-spending/679-the-concentration-of-health-care-spending

Converged Infrastructure for Big Data Storage and Analytics

Michael Noble

Sr. Product Marketing Manager

It’s no secret that unstructured data is growing rapidly and poses significant challenges for organizations across virtually every industry segment as they store, manage, secure, and protect their data. According to IDC, the total amount of data storage worldwide will reach 133 exabytes by 2017, of which 80 percent will be required for unstructured data.

[Figure: Unstructured data growth]

To meet this challenge, VCE has just introduced a compelling new converged infrastructure (CI) platform – VCE™ Technology Extension for EMC® Isilon® – for organizations looking to consolidate and modernize their big data environments. With this technology extension, existing VCE Vblock® Systems can leverage Isilon’s massive scalability and built-in multi-protocol data access to easily expand capacity for large-scale data storage needs while supporting a wide range of applications and both traditional and next-gen workloads, including Hadoop data analytics.

[Figure: Isilon scale-out data lake]

With VCE technology extension for EMC Isilon as a foundational element of a scale-out data lake infrastructure, organizations can eliminate costly storage silos, streamline management, increase data protection and gain more value from their big data assets. A great example of this is in the area of big data analytics and Hadoop. As the first and only scale-out NAS platform that natively integrates with the Hadoop Distributed File System (HDFS), Isilon is a game-changing big data storage and analytics platform.

Before Isilon came into the picture, Hadoop deployments had largely been implemented on dedicated infrastructure, not integrated with any other applications, and based on direct-attached storage (DAS) that is typically mirrored three times or more. In addition to requiring a separate capital investment and added management resources, this approach poses a number of other inefficiencies. By leveraging Isilon’s native HDFS support and in-place analytics capabilities, organizations can extend their Vblock System-based environments and avoid the capital expenditures, risks, and costs associated with a separate, dedicated Hadoop infrastructure.

Isilon’s in-place analytics approach also eliminates the time and resources required to replicate big data into a separate infrastructure. For example, it can take over 24 hours to copy 100 TB of data over a 10GbE link. With VCE technology extension for EMC Isilon, data analytics projects can instead be initiated immediately, delivering results in a matter of minutes. And when data changes, analytics jobs can quickly be re-run with no need to re-ingest the updated data.
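That 24-hour figure holds up as a back-of-the-envelope calculation (assuming a dedicated, fully utilized 10 Gb/s link; protocol overhead and contention in practice push the real number higher):

```python
# Time to move 100 TB across a 10 Gigabit Ethernet link at line rate.
data_bits = 100e12 * 8        # 100 TB expressed in bits
link_bps = 10e9               # 10GbE line rate in bits per second
hours = data_bits / link_bps / 3600
print(f"~{hours:.1f} hours")  # ~22.2 h at line rate; over 24 h with overhead
```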

VCE technology extension for EMC Isilon also leverages Isilon’s ability to support multiple Apache Hadoop distributions from different vendors simultaneously, including Pivotal HD, Cloudera Enterprise, and Hortonworks Data Platform. The same data sets on EMC Isilon can also be extended to other analytics tools such as SAS and Splunk. This means that organizations gain the flexibility to use whichever tools they need for their analytics projects.

Along with these powerful big data storage and analytics capabilities, VCE technology extension for EMC Isilon brings the convenience and assurance of proven VCE engineering expertise. VCE’s technology extensions are tightly integrated and fully tested and validated. This allows organizations to quickly increase processing power or add storage capacity, without typical technology risks.

All in all, VCE technology extension for EMC Isilon does indeed look like a compelling approach to taming the big data storage challenge and unlocking the value of big data assets. Please let us know what you think.


Converged Infrastructure for Scale-out Data Lakes

Carl Washington

Sr. Business Development Manager

For many organizations today, the rapid growth of unstructured data has put a spotlight on the challenges of a decentralized IT infrastructure burdened with data storage “silos” across the enterprise. These limitations include:

  • Complex management of multiple siloed data sets
  • Inefficient storage utilization
  • Storage “hot spots” and related performance issues
  • Data protection and security risks
  • Inability to support emerging workloads

As many enterprises have experienced, converged infrastructure (CI) systems are a great way to eliminate silos and to consolidate and simplify IT environments while addressing increasing demands on IT. Additionally, with the rapid growth of unstructured data, many organizations are attracted to a CI strategy as a way to implement a scale-out data lake. As used here, a “data lake” is a large reservoir of unstructured and semi-structured data consolidated from different traditional and next-generation application workload sources. These next-generation applications, including mobile computing, social media, cloud computing, and big data, are fueling the enormous growth of this unstructured data.

Foundation for Scale-Out Data Lake

The storage infrastructure to support these next-gen applications and scale-out data lakes must be highly scalable in terms of both capacity and performance. It must also have the operational flexibility to support a wide range of applications and workloads. Another great advantage of a data lake is the potential to leverage powerful analytics like Hadoop to gain new insight and uncover new opportunities by harnessing big data assets. To address these needs, VCE has just announced an exciting new product, VCE™ technology extension for EMC® Isilon, that allows organizations to quickly implement an efficient converged infrastructure for scale-out data lakes.

Designed to be deployed as a pooled resource with Vblock® Systems, VCE technology extension for EMC Isilon enables a scale-out data lake approach to the modern datacenter. This provides enterprises and service providers an infrastructure to consolidate unstructured data, support traditional workloads as well as next-gen applications, and enable in-place big data analytics using Isilon’s native HDFS support. These capabilities help organizations to reduce costs, simplify management, gain new efficiencies and accelerate time to insight while avoiding the need for separate infrastructure investments. Organizations that already rely on VCE to run their mission-critical applications and manage data can easily augment their existing environments with this new offering, VCE technology extension for EMC Isilon.

[Figure: EMC Isilon scale-out data lake]

The EMC Isilon scale-out data lake can collect unstructured data from multiple workloads such as HPC, mobile home directories, video surveillance, large-scale archives, file shares, and more. A reservoir of unstructured data from different sources becomes immediately available for analytics.

What’s more, the source data can be written to Isilon using one protocol and then accessed using a different protocol. Isilon’s support for multiple protocols, such as NFS, SMB, HDFS, and HTTP, is a compelling feature that provides enormous flexibility and agility for the next generation of applications. Given these points, Isilon’s ability to consolidate data from multiple sources and run in-place analytics strengthens the advantages provided by VCE Vblock Systems while extending their applicable use cases and accelerating adoption for next-generation applications.
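To illustrate what write-on-one-protocol, read-on-another looks like, here is a minimal hypothetical sketch (the mount point, cluster name, and paths are assumptions, not a real configuration):

```python
# A sketch of multiprotocol access: one copy of the data, two protocols.
import subprocess

# 1. An application writes surveillance metadata over NFS; here the Isilon
#    export is assumed to be mounted at /mnt/isilon on this host.
with open("/mnt/isilon/surveillance/events.csv", "w") as f:
    f.write("camera_id,timestamp,event\n42,2014-10-20T14:03:00,motion\n")

# 2. A Hadoop job later reads the very same file through HDFS, with no copy
#    or ingest step ('hdfs dfs' is the standard Hadoop client CLI).
subprocess.run(
    ["hdfs", "dfs", "-cat",
     "hdfs://isilon-cluster:8020/surveillance/events.csv"],
    check=True,
)
```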


In sum, VCE Technology Extension for Isilon allows enterprises to implement a scale-out data lake infrastructure that provides a number of advantages, including:

  • Lower cost and increased efficiency
  • Simplified management
  • Increased operational flexibility
  • Faster time to service and market
  • Robust security and data protection

Another Important Advantage: The VCE Experience

By extending the full VCE Experience – engineered, built, validated, shipped, and supported by VCE – to EMC Isilon, VCE is delivering virtualized cloud infrastructure systems optimized for traditional and next-generation application workloads with scale-out data lakes.

In addition to the technology and infrastructure capabilities, the VCE Experience helps IT executives drive agility into their operations and enable IT as-a-service, which makes IT more responsive to new business applications while shortening the time to provision new infrastructure.

VCE Technology Extension for Isilon offers a great value for organizations looking to implement an IT infrastructure for their own scale-out data lakes. Please let us know what you think.

An interview with Innovator, Hugh Williams

Ryan Peterson

Chief Solutions Strategist

I recently had a chance to sit down with Hugh Williams, SVP of R&D for Pivotal. Hugh was previously at eBay and Microsoft and brings an impressive Big Data background that includes industry patents and numerous publications. Here is a transcript of our discussion:

Ryan: Hi Hugh, thanks for taking the time to sit down with me.  Getting straight to the questions I have for you:  How would you define Data Lake?

Hugh: Great question. The basic premise of a Data Lake is that you have one place where all of your data is stored, and it allows everyone in an organization to extract value from the data using the tools they want to use. Without a data lake, data is usually siloed in legacy systems where only a couple of applications or a subgroup of users have access to it.

Ryan: What would you consider to be the most important attributes of a data lake?

Hugh: Having all of the data in one place.  Of course, you need the right tools to be able to accomplish that – ingestion and connection to existing sources is still more challenging than it should be.

Ryan: How do customers build data lakes?

Hugh: Most companies start out with a set of folks who build a small Hadoop capability; they demonstrate that capability, the noise gets louder, and the company says that rather than having all of these solutions scattered throughout the organization, let’s collect all of that into one place.

Ryan: I call those Data Puddles!  What have you seen inhibit adoption?

Hugh: I think a few things come to mind. Ingestion and egestion are problematic: how am I going to get all of that data from various places into the central place? The second thing is that the Hadoop ecosystem is relatively immature. Although it is an impressive toolbox, there is still a barrier in setting up the infrastructure, standing it up, training, and getting all the right pieces. The last thing I’ll say is that using Hadoop to extract business value is not easy. You have to employ data science folks. Pivotal is making SQL much more mature on Hadoop to help solve this issue.

Ryan: What interests you about the Isilon partnership with Pivotal?

Hugh: Hadoop will rule the world, but its maturity is a problem today. Isilon is mature, and companies bet their businesses on it. If you want one thing to be reliable, it has to be the storage – and so the partnership between Pivotal and Isilon really matters.

Ryan: Customers often lump HAWQ with Stinger, Impala, and even Hive.  How do you differentiate HAWQ from other SQL solutions?

Hugh: Hive is a relatively elementary implementation of SQL access to Hadoop with basic SQL features. It was revolutionary when it happened, but it doesn’t have what a data scientist needs. Impala is a nice step forward from Hive. The really interesting thing about HAWQ is that we took 10+ years of SQL experience from the data warehouse space and ported it to work with Hadoop. What you get with HAWQ is the Greenplum database heritage adapted to Hadoop. Pivotal has the most advanced solution for SQL access to Hadoop.

Ryan: Can you provide an example of something you can do with HAWQ that cannot be done with the others?

Hugh: There are benchmarks such as TPC-DS that help validate whether various typical SQL queries can be evaluated and optimized on different systems. In rough terms, when we used TPC-DS to test SQL compliance, 100% of queries succeeded with HAWQ, only 30% with Impala, and around 20% with Hive. We published an independently peer-reviewed study showing these results at this year’s SIGMOD, the leading database conference.

Ryan: You recently announced GemXD, a new product in the GemFire family.  What is an example of a problem that GemXD solves?

Hugh: You can think of it as Cassandra or HBase done really, really well – with a SQL interface, full ACID capabilities, the ability to upgrade with no downtime, the ability to read and write to Hadoop’s HDFS storage layer when there’s too much data for memory, and much more.

Ryan: What’s your favorite “Big Data changed the world” story?

Hugh: Here’s a fun story. When I was at Microsoft, I decided to research what drugs caused stomach cramps by looking at what queries customers ran in sessions on Microsoft’s search engine. I reverse engineered a list of drugs that caused stomach cramps, and checked the FDA literature – and, sure enough, it was right.

Ryan: How does Cloud Foundry fit into the Big Data / Hadoop storyline?

Hugh: Today they’re somewhat separate stories, but they won’t be for long.  It’s of critical importance to the future of PaaS and the future of Big Data that they converge.  In the future, most applications will be data-centric, and great companies will be built on those applications. Developers are demanding the convergence. PaaS and Big Data exist within Pivotal to build the future platform for software.

Video Surveillance: A Tale of Two Markets

Christopher Chute

Vice President, Global Imaging Practice, IDC
Christopher Chute is a Vice President with IDC's Global Imaging Practice. His research portfolio focuses on transformative technology trends impacting the imaging market. Mr. Chute's specific coverage includes digital imaging and video technology adoption across professional, broadcast/creative, consumer and physical security/surveillance. He conducts forecasting, product and service analysis and user segmentation for these vertical markets through a variety of supply and demand-side studies and consulting engagements. Mr. Chute's research has often centered around charting the disruption caused by technology transformations on status-quo industries, such as the migration from film to digital in the photography market, commoditization/democratization of broadcast/cinema video capture and photofinishing, and the impact of cloud services and tablet usage on the imaging industry. Writers from a variety of publications rely on Mr. Chute for a deep understanding of these markets, including Time Magazine, The Wall Street Journal, Fortune Magazine, USA Today, Investor’s Business Daily, San Jose Mercury News, Bloomberg, and The Financial Times. His television and radio appearances include ABC World News Tonight, Fox & Friends, CNBC, National Public Radio, and Into Tomorrow With Dave Graveline. Mr. Chute also speaks regularly at a variety of international trade shows and technical industry group meetings, including ASIS, CES, Creative Storage, Computex and Photokina. Mr. Chute holds both undergraduate and MBA degrees from Boston College.

As a longtime video surveillance industry observer, I remember when the industry was fully reliant on analog technology to secure people, places, and things. In many ways it is still reliant on similar solutions, whether analog cameras capturing video that’s digitized and stored on reused tape, or security personnel who act as both preventative and post-event resources. It became clear to me that the market was bifurcating into two types of deployments: large, fully IP, IT-led installations, and analog-prone, fragmented, traditional eyeballs-to-screens installations.

IP surveillance is often thought of in terms of image quality, megapixels, and other visual measures, yet the IT-led side of the market has centered itself more on workloads, specifically a combination of video content and analytics woven into a broader physical security initiative. These deployments aim to be far more preventative than forensic. IT-led, multi-dimensional installations like these tend to be physically large and extend across several facilities over long distances: think cities, transportation systems, government facilities, and large public complexes. They require both fixed and mobile surveillance cameras and other sensors connected to local edge storage that communicates with a core system. The resulting workloads are collated into a large data set that undergoes extensive analytics at a centralized command facility, a workflow modeled on an enterprise datacenter rather than a security room.

In many ways, workloads and analytics now define physical security more than cops and cameras.

Thus, IT storage leader EMC’s announcements at ASIS 2014 are timely in addressing the market’s growing pains. For instance, to complement its enterprise-class core EMC Isilon storage platform, EMC is now offering the EMC VNX-VSS100 (built on proven VNX technology), a purpose-built storage system that can act in an edge capacity alongside EMC Isilon or serve as a cost-effective, scalable hub for smaller network-based security installations. The company is offering lab-level validation for Video Management Software (VMS) partners like Genetec, Verint, and Milestone, which will allow system integrators to deploy solutions more quickly. EMC is also creating greater partner enablement through sponsorship of training resources and partner investment in high-growth countries.

IDC’s Digital Universe project forecasts that video surveillance workloads will grow an average of 22% by 2020. And while system integrators have successfully partnered with a wide range of surveillance hardware, VMS, storage, and analytics vendors, what has been lacking is a strong, experienced third-party IT leader like EMC that can create a foundation for surveillance-specific vendors and integrators to work and partner with, while keeping pace with surveillance trends.

From “Edge-to-Core”: Redefining Video Surveillance

Suresh Sathyamurthy

Sr. Director, Product Marketing & Communications

You’re probably familiar with the terms “edge” and “core” as they apply to networking infrastructure, but video surveillance? It turns out that edge and core perfectly describe today’s larger-scale IP-based surveillance architectures. Out on the edge are the cameras themselves, along with local, lighter-weight processing and storage. At the core sit the majority of the capacity (petabytes), the archiving capabilities, and, of increasing importance, the analytics.


Organizations adopting edge-to-core architectures are asking for surveillance technology that is scalable, flexible, open, and future-proof.

In my last blog (see “Bringing the Scale-out Data Lake to Life,” July 2014) I talked about the significance of the new capabilities we launched in July. These capabilities bring the scale-out data lake model to surveillance and directly address the requirements of edge-to-core video surveillance.


That’s why the Isilon scale-out data lake storage model is so relevant to edge-to-core video surveillance. For example, when surveillance video moves from edge cameras into the core data lake, it can be actively used by multiple applications simultaneously (e.g., Hadoop can be used to deliver valuable intelligence). At the same time, surveillance data can be securely accessed using Syncplicity on mobile devices, or used within cloud-based applications.

Today, over one million surveillance cameras capture their image data on Isilon, and Isilon protects and manages this image data on over 160 PB of capacity. With the Isilon scale-out data lake at the core, we’re able to satisfy the current and future needs of video surveillance in industries such as transportation and federal and local government, enabling them to become more vigilant in their efforts to protect both people and property.

Check out this video, where I provide an overview of the video surveillance market and take a deep dive into EMC Isilon’s core solutions that address customer needs.

Please let me know about your experiences with these solutions: feel free to post blog comments, tweet me (@sureshcs), or reach out on other social media such as LinkedIn.

The Media Industry is changing, so stay tuned!

Jeff Grassinger

Sales Alliance Manager

The IBC trade show (12-16 Sep) is just around the corner!  At this year’s show we will see consumer technology trends driving rapid changes in the media industry.  One of the key trends is multiplatform content delivery in our non-stop “TV Everywhere” world. This one trend alone will have dramatic effects throughout the media industry, as it will necessitate significant transformation in the workflows in use today.

According to eMarketer, the average time spent with digital media per day will surpass TV viewing time for the first time this year. In addition, Ooyala’s Global Video Index reports that mobile and tablet viewing increased 133% year over year from Q1 2013 to Q1 2014. So digital media consumption is not only growing fast; video on mobile devices in particular is a significant area of dynamic growth.

As with any change that comes to the industry, media professionals are asking, “What workflow changes do I need to make to engage this new audience, and how does that change the way we do business?”

In the short term, it’s about leveraging the workflows you use today.  In the long term, success depends on adapting your workflows to new technology. Read on to find out how.

What are the goals and keys to success?

Workflows in TV Everywhere applications allow users to access their favorite content on their favorite devices, and to discover new interests by searching for content or accepting recommendations. Your TV Everywhere goal is to create “stickiness,” the engagement that encourages consumers to stay longer, driving your bottom-line advertising and/or subscription revenue.

However, unlike traditional broadcast workflows, TV Everywhere requires an array of codecs, streaming formats, digital platforms, and new and vastly different workflow architectures.

  • One of the most important goals when planning and building for the TV Everywhere future is workflow agility. Codecs, streaming formats, and devices will change and evolve at a much more rapid pace than broadcast technology. To anticipate TV Everywhere technology, the foundation of your workflow must be agile enough to meet today’s requirements while also supporting new workflows with ease. Media organizations with legacy technology built on proprietary infrastructure, proprietary protocols, and inflexible technology will find themselves at a distinct disadvantage to their competitors.
  • Another important goal is to create simplified, consolidated file-based workflows built on a data center infrastructure. The technologies you choose need to work together simply, with a focus on dramatically reducing operational costs. File-based workflows will significantly reduce the need to manually create and deliver media, maximizing top-line growth and bottom-line results. Data center technologies such as networking, virtualization, software-defined architectures, data analytics, and cloud solutions will increasingly be part of tomorrow’s TV Everywhere delivery.

There is no arguing that the media industry is in transformation. While the living-room TV continues to be the screen of choice for many viewers, it is clear that smaller screens will continue to see dramatic increases in video traffic. To seize the opportunity to engage and entertain this new viewership, media organizations need to focus on technologies that enable business agility and data center file-based workflows.

Have you started your transition to agile, simplified and consolidated data center file-based media workflows? How has this industry transformation influenced your visit to this year’s IBC trade show?

Stop by the EMC Isilon booth #7.H10 to see our new storage software, platforms, and media solutions. I look forward to talking to you, and showing you how EMC Isilon can help you incorporate new technologies, design adaptable workflows, and improve your bottom line.

Free Hadoop… pun intended!

John Nishikawa

Director, Business Development & Alliances

free (adjective, verb, adverb) – to make free; release from imprisonment; to unlock your data

free (adjective, verb, adverb) – no charge; Isilon HDFS license key

Isilon is the #1 enterprise shared storage provider for Hadoop: we have more customers and more capacity used for Hadoop than any other enterprise shared storage provider. Are you looking to get more business insight from your data to drive innovation, gain competitive advantage, improve customer satisfaction, accelerate time-to-market, or even, in some cases, save actual lives? If so, the power of your data is sitting right there in your Isilon cluster. All you need to do is free Hadoop and bring it to your data. Join the Free Hadoop revolution! Here’s how.


In five easy and free steps, you can join the Free Hadoop revolution with Isilon at http://www.emc.com/campaign/isilon-hadoop/index.htm:

  • Step #1: Request a free HDFS license key to Free Hadoop on Isilon
  • Step #2: Download a free community trial edition of Pivotal PHD or Cloudera CDH to kick the tires on the power of Hadoop
  • Step #3: Download the free Hadoop Starter Kit (HSK) for step-by-step instructions on deploying Hadoop to your existing Isilon and VMware infrastructure in about an hour
  • Step #4: Conduct a free TCO analysis of a Hadoop DAS architecture versus a Hadoop Isilon architecture and see why so many customers are choosing Isilon for enterprise-ready Hadoop
  • Step #5: Enjoy the power of data and recruit others to join the Free Hadoop revolution!

So what are you waiting for? Join the Free Hadoop revolution now, and we’ll also send you this t-shirt to demonstrate unity as we spread the Free Hadoop mantra across the globe.

[Image: Free Hadoop t-shirt]

So, let the chanting begin in your data center, “Free Hadoop!  Free Hadoop!  Free Hadoop!  Free Hadoop!”