Interop New York 2012 — The Big News of “Big Data”

As you would expect, while Interop New York 2012 was heavily rooted in traditional IT networking, there was a lot of buzz surrounding the advent of cloud computing. Traditional network gear manufacturers were all displaying their cloud-ready boxes next to IaaS providers like ProfitBricks.

The Growing Groundswell of “Big Data”

While everyone was talking about the cloud, the growing groundswell around “Big Data” was the sleeping giant. IT professionals are struggling to manage, back up, analyze, and secure data sets that keep growing in both number and size. It’s clear that this incredible data growth is forcing us to think about infrastructure and the cloud in new ways.

According to the McKinsey Global Institute, “big data” refers to datasets whose size is beyond the ability of typical database software tools to capture, store, manage, and analyze. While that threshold will likely shift over time, big data today ranges from a few dozen terabytes to multiple petabytes (thousands of terabytes) in size.

So, How Big is “Big Data”?

Every day, more and more data is generated. For example:

[Infographic: Google and Big Data Trivia]

In addition, the McKinsey Global Institute report “Big Data: The Next Frontier for Innovation, Competition, and Productivity” (June 11, 2011) reveals:

• 30 billion pieces of content are shared on Facebook every month

• The US Library of Congress had collected 235 terabytes of data as of April 2011

• In 2011, it cost about $600 to buy a disk drive capable of storing all of the world’s music

• 5 billion mobile phones were in use in 2010 (with even more in 2012, all capable of generating and consuming even more data)

Today, 50 terabytes of data are created every single second, and it is estimated that by 2020 there will be 35 zettabytes of digitally stored data. (Source: “Steps to Big Data Success,” Margaret Dawson, Interop NY 2012)
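To put those figures in perspective, here is a quick back-of-envelope check in Python. It is only a rough sketch: it assumes the quoted 50-terabytes-per-second rate holds constant all year, and it uses decimal (not binary) unit prefixes.

TB = 10**12                         # bytes in a terabyte (decimal prefix)
ZB = 10**21                         # bytes in a zettabyte

rate = 50 * TB                      # bytes created per second, as quoted above
seconds_per_year = 365 * 24 * 3600

per_year = rate * seconds_per_year
print(per_year / ZB)                # ~1.58 zettabytes created per year

At that constant rate, roughly 1.6 zettabytes would accumulate each year, so hitting 35 zettabytes by 2020 implies the creation rate itself must keep accelerating.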

Where Will This Big Pile of Data Live, and How Do We Deal with It?

The IDC Digital Universe Study 2011 states that data centers cannot be built fast enough.

[Chart: Data Center Capacity and Demand. Source: “Steps to Big Data Success,” Margaret Dawson, Interop NY 2012]

Only now, with the advent of cloud computing and IaaS, are we able to get a handle on what all this data means, as powerful new processing architectures glean insights from the petabytes of data captured on our customers, markets, competitors, and employees. Technologists, rather than analysts, are becoming the go-to resource for data manipulation, using these new architectures to store and analyze the information while remaining compliant with government regulations.
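The canonical example of those new processing architectures in 2012 was the MapReduce model popularized by Hadoop. As a loose, single-process sketch of the idea (plain Python, not Hadoop’s actual API), counting events across a pile of log records looks like this:

from collections import defaultdict

def map_phase(records):
    # Map step: emit a (key, 1) pair for every word in every record.
    for record in records:
        for word in record.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    # Shuffle + reduce step: group the pairs by key and sum each group.
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

logs = ["error timeout", "error disk full", "timeout retry"]
print(reduce_phase(map_phase(logs)))
# {'error': 2, 'timeout': 2, 'disk': 1, 'full': 1, 'retry': 1}

Real frameworks run the map and reduce steps in parallel across many machines; the point is that the same two-step pattern scales from three log lines to petabytes.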

According to Jeremy Edberg, Netflix’s Reliability Architect, “data is going to be your biggest issue to deal with in the coming years. It’s what needs to be replicated to stay running. It’s what leaks when there’s a security breach. It’s what costs money to move around. And it’s the cause of latency and usability issues on poorly performing corporate networks or first-generation cloud providers. Compared to data, computing is easy. And as computing becomes a commodity in an era of clouds, we need to understand what it means to live in a Big Data world.”

One Possible Solution

Cloud infrastructure-as-a-service providers like ProfitBricks are ideal partners for Big Data users because they let you customize the three essential computing resources: CPU performance (up to 48 real dedicated cores per server), RAM (up to 196GB), and storage (virtual drives of up to 16 terabytes), all on an on-demand, runtime-scalable basis to meet your precise data-processing requirements.

Unlike the networks of first-generation IaaS providers, ProfitBricks’ meshed 80Gbit/sec InfiniBand network is ideal for moving massive amounts of data between storage and CPUs. And when large analytic jobs are run, customers can take advantage of minute-based billing. With ProfitBricks, you can adapt each virtual server to the exact needs of your service and thus achieve optimum speed. Because your service requirements will constantly change, you can adjust the performance of your virtual servers at any time. In addition, double-redundant high-speed storage systems keep your information safe and secure.
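To make “runtime scalable” concrete, here is a purely hypothetical sketch of resizing a running server before and after a big analytics job. The endpoint, payload fields, and function name below are invented for illustration; they are not ProfitBricks’ actual API.

import json
import urllib.request

# Hypothetical example only: the URL and payload shape are placeholders,
# NOT a real provider API. They just show an "adjust resources on demand" call.
API_URL = "https://api.example-iaas.com/v1/servers/srv-42"

def scale_server(cores: int, ram_gb: int) -> None:
    """Resize a running virtual server to the given core/RAM allocation."""
    payload = json.dumps({"cores": cores, "ram_gb": ram_gb}).encode()
    req = urllib.request.Request(
        API_URL, data=payload, method="PATCH",
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        print(resp.status)

# Scale up just before a large analytics job, then back down afterward.
scale_server(cores=48, ram_gb=196)
# ... run the job ...
scale_server(cores=4, ram_gb=16)

Combined with minute-based billing, this pattern means the 48-core configuration only costs money while the job is actually running.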

Were you at Interop New York 2012? What were your key takeaways?

 Scott Brazina, VP, Business Development, ProfitBricks