When it comes to data analytics, users are no longer satisfied with traditional business analytics that provide a historical view. These days, users want the predictive modeling capability offered by real-time analytics.
For a Datacenter Manager, real-time analytics may seem to have a 24/7 appetite for storage. With real-time analytics, queries and requests are processed immediately instead of in batches, and far more queries can run throughout the day and night.
The promise of real-time analytics looms large as people try new ways to create business value from their data. However, when it comes to identifying actionable business insights from real-time analytics, old storage has neither the speed nor the software to perform as needed. Nor can old storage efficiently hold all the data required for so many real-time queries.
Great expectations for IoT, MGD and 3rd Platforms
When it comes to real-time analytics, businesses have great expectations. The Internet of Things (IoT), Machine-Generated Data (MGD) and 3rd Platform interdependencies among cloud computing, analytics (big data) and social media offer a quantum leap forward in accessing valuable business data in real time.
However, gleaning business insight from real-time analytics requires the proper architecture. And traditional, disk-based storage was not designed for real-time analytics, because the I/O profile of these new applications differs from the I/O of batch processing.
IoT, MGD and real-time analytics have changed the game for how organizations structure their datacenters to support what IDC calls “The 3rd Platform,” essentially a new solution framework for managing data. These new solutions have flipped the requirements for how data must be managed. And to the disappointment of many a Datacenter Manager, traditional storage platforms are not architected to support them.
The bottom line is that real-time analytics require an infrastructure that can process queries and requests immediately, rather than in batches.
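The batch-versus-real-time distinction can be sketched in a few lines of Python. This is a hypothetical illustration, not any particular product's API: a batch pipeline waits for the whole data set before producing an answer, while a real-time pipeline updates its answer as each event arrives.

```python
# Illustrative sketch of batch vs. real-time query handling (not a real product API).

def batch_process(events):
    """Batch: collect everything first, then compute one result at the end."""
    return sum(events)  # the insight is only available after the whole batch lands

def realtime_process(event_stream):
    """Real time: update the running result as each event arrives."""
    total = 0
    for event in event_stream:
        total += event   # each event is processed immediately on arrival
        yield total      # an up-to-date answer is available after every event

events = [5, 3, 7]
print(batch_process(events))           # one answer, after the fact: 15
print(list(realtime_process(events)))  # an answer at every step: [5, 8, 15]
```

The storage implication follows from the access pattern: the batch path generates large sequential reads on a schedule, while the real-time path generates a continuous stream of small, latency-sensitive I/O, which is where disk-based platforms fall short.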
Old systems frustrate users with great expectations
When analytics workloads take longer than expected, user frustration rises. Users demand more than traditional business analytics, which are run as a post-process activity to provide a historical view.
Instead of running analytics after the fact, users now want the predictive modeling capability offered by real-time analytics. And in the age of instant data access, users expect analytics to run just as fast as their other web-enabled applications.
When analytics workloads take longer than expected, either staff time is wasted or SLAs are extended, both of which equate to lost money. In fact, organizations are now storing all of the data created across network operations centers (NOCs) and other sources, and a new design is required to make sense of all that data for monetization and other purposes.
As users demand real-time analytics, Datacenter Managers face the predicament that their old storage does not have the speed or software to process (or efficiently store) all of the data available via IoT, MGD and 3rd Platforms.
Choice of storage is a game changer
People have a taste for the latest business tools, especially if they promise increased revenue. If users demand real-time analytics, multi-tiered flash allows an organization to process data in real time on an all-flash platform, providing both a performance layer and a capacity layer, all on flash.
Look at Splunk, for example, which lets customers make sense of the massive amounts of machine data they generate and use it to make predictive business decisions, whether to monetize that data or to streamline datacenter efficiency.
Splunk turns machine data into rapid visibility, insight and intelligence. It favors low-latency, performance-oriented flash storage so that users can process queries in real time.
From batch processing to real-time analytics, what’s next?
Organizations will continue to hunger for new ways to process and create business value from all the data in their environment. Thus the technology required to satisfy users will need to be flexible, scalable and cost-efficient.
If you listen to Gartner or IDC, both recommend that organizations transform their datacenters to support applications such as mobile, social, cloud and analytics. With NAND flash finally approaching cost parity with disk, there is every reason to investigate an all-flash platform for analytics workloads that require data to be processed in real time.
Today, users seek the business insight gained by running real-time analytics. Tomorrow, who knows what users will seek? If companies do not prepare their infrastructure to support current user needs, they will certainly miss the mark on future user needs.
At Tegile, we recommend that companies prepare their infrastructure today for real-time analytics. In doing so, these same companies will be well positioned to serve user needs tomorrow.