HPC Roots Feed Big Data Branches

10th Feb '14, 05:28 AM in Resources

BDMS Guest Contributor

According to the most recent IDC figures, 67% of HPC shops say that they perform what can be categorized as big data analysis. These workloads, which the analyst firm dubs “high performance data analysis” (HPDA), are expected to grow substantially, increasing from $743.8 million in 2012 to almost $1.4 billion in 2017. Additionally, IDC says that storage revenue for high performance data analysis on HPC systems will approach $1 billion by 2017.

IDC defines HPDA as data-intensive simulation and analytics, involving tasks with “sufficient data volumes and algorithmic complexity to require HPC resources.” This can include established simulation or newer analytical methods, a variety of data types (structured, unstructured, or both), and potentially the use of graph analytics or Hadoop frameworks, for example.

These are striking figures in their own right, but let’s consider the reverse of these numbers for a moment. While HPC might be adopting tools and techniques drawn from the big data-laden enterprise (the dividing lines are nebulous whenever HPC and big data are treated as distinct categories), this series focuses on the lessons about scalability, reliability, efficiency and extensibility that HPC can teach the big data masses.
