Industry expert John Webster took a look at storage requirements for Big Data environments in a post I read on SearchStorage.com. He raises a challenge that traditional storage systems have a very hard time addressing: meeting the throughput required to move large files, and thousands of them, in an acceptable time frame while also delivering the transactional performance (response time) needed to drive a rational analytics process.
Call me biased, but this sounds like an excellent application for Hybrid Storage. Our MASS-based metadata processing engine can easily handle the movement of, and pointers to, thousands or even millions of large files while delivering analytic IOPS up to seven times faster than traditional storage systems and keeping response times in the one-millisecond neighborhood. When these two requirements converge on the same array, you either end up with an extremely expensive traditional array or you benefit from a system purpose-built to balance high capacity and high performance.
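To make the tension between the two requirements concrete, here is a minimal back-of-envelope sketch in Python. All figures in it (the 100 TB dataset, 2 GB/s sustained throughput, 20,000 baseline IOPS, and applying the seven-times multiplier) are illustrative assumptions for the sake of the arithmetic, not measured Tegile or Zebi specifications.

```python
# Back-of-envelope model of the two competing demands on one array.
# All numbers below are illustrative assumptions, not vendor figures.

def bulk_move_hours(total_tb, throughput_gbps):
    """Hours to stream a large dataset at a sustained throughput (GB/s)."""
    total_gb = total_tb * 1024
    return total_gb / throughput_gbps / 3600

def analytics_runtime_s(random_reads, iops):
    """Seconds to service a latency-sensitive random-read workload."""
    return random_reads / iops

# Moving a hypothetical 100 TB of large files at 2 GB/s sustained:
hours = bulk_move_hours(100, 2.0)

# The same array servicing 10 million random reads for analytics:
baseline = analytics_runtime_s(10_000_000, 20_000)   # assumed traditional array
hybrid = analytics_runtime_s(10_000_000, 7 * 20_000)  # assumed 7x IOPS gain

print(f"bulk move: {hours:.1f} h")
print(f"random reads: {baseline:.0f} s traditional vs {hybrid:.0f} s hybrid")
```

The point of the sketch is simply that the bulk-move time is governed by sequential throughput while the analytics time is governed by IOPS, so optimizing one on a traditional array does nothing for the other.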
What Big Data storage challenges are you facing this year? Start your Big Data project off right and try it on Tegile’s Zebi array.