Storage performance testing: workload profile added to Load DynamiX’s portfolio

Load DynamiX has announced Enterprise 5.0, the latest version of its multi-faceted testing solution designed to help businesses optimise their storage infrastructure performance, something most find very hard to do.

I described the system in May 2015 [here], and these new elements make it more complete: Workload Analyzer for advanced analysis and modelling, the Workload Sensor appliance that captures storage data in real-time from production systems, and Storage Log Importer to collect data from legacy systems.

Setting up the customer’s profile had initially been a time-consuming manual exercise, but it is now mostly automated: Storage Log Importer takes a text or CSV file and parses the relevant information to pre-populate the Workload Analyzer.
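Bloor has no sight of the importer’s internals, but the principle is easy to illustrate. The minimal Python sketch below is a hypothetical reconstruction, not Load DynamiX’s code: the CSV column names and the WorkloadProfile fields are assumptions.

```python
import csv
from dataclasses import dataclass

@dataclass
class WorkloadProfile:
    """Hypothetical profile fields; the real schema is not public."""
    read_write_mix: float      # fraction of I/Os that are reads
    avg_block_size_kb: float
    peak_iops: float

def import_storage_log(path: str) -> WorkloadProfile:
    """Parse a CSV export of array performance counters and derive the
    headline numbers needed to pre-populate a workload profile."""
    reads = writes = peak_iops = 0.0
    block_kb_total, samples = 0.0, 0
    with open(path, newline="") as f:
        # Assumed columns: read_iops, write_iops, avg_block_kb.
        for row in csv.DictReader(f):
            r, w = float(row["read_iops"]), float(row["write_iops"])
            reads, writes = reads + r, writes + w
            block_kb_total += float(row["avg_block_kb"])
            samples += 1
            peak_iops = max(peak_iops, r + w)
    if samples == 0:
        raise ValueError("empty log file")
    return WorkloadProfile(
        read_write_mix=reads / (reads + writes),
        avg_block_size_kb=block_kb_total / samples,
        peak_iops=peak_iops,
    )
```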

“This turns what was one week for us, a month for them [the users], into hours – defining policies, the type of arrays and tools,” Len Rosenthal, Load DynamiX’s VP of marketing, told me. This feature can be used to import production array data offline; it also provides out-of-the-box policies for key legacy vendors and enables easy policy creation for newer vendors.

In May I wrote: “Once automation of its workload profile set-up is complete, it will surely be a very attractive add-on that storage equipment distributors and resellers will want to offer too.” This is now here, bundled into Load DynamiX Enterprise 5.0 (general availability from the end of December).

Users do live performance monitoring but typically have no common view linking application behaviour to storage behaviour. Rosenthal cited a 2015 US survey asserting that 64% (nearly two-thirds) of IT organisations still did not know their application storage I/O profiles or performance requirements* (and I doubt that the UK is any better).

He said that the application, network, storage and vendor teams did not have this visibility because they worked in silos, so performance planning was mostly guesswork; this resulted in wasteful under- or over-provisioning, trial and error to reach optimal configurations, and labour-intensive troubleshooting.

Workload Sensor is an appliance that performs real-time statistical analysis of live storage workload behaviour and locality. It supports iSCSI and Fibre Channel (FC) networks (block storage now, with file storage, including NFS and SMB, scheduled for Spring 2016 and object protocols for the second half of 2016). It stores only the statistics, not the actual network traffic.
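That “statistics, not traffic” distinction matters for both privacy and storage footprint. Purely as an illustration of the idea, and not Load DynamiX’s implementation, the sketch below condenses a stream of observed I/O commands into counters and a block-size histogram, discarding the payloads themselves:

```python
from collections import Counter

class IOStatsCollector:
    """Summarise observed I/O commands; payloads are never retained."""
    def __init__(self):
        self.reads = 0
        self.writes = 0
        self.block_size_hist = Counter()   # block size (bytes) -> count
        self.latency_us_total = 0.0

    def observe(self, op: str, block_size: int, latency_us: float) -> None:
        # Only command metadata is inspected; the data payload is dropped.
        if op == "read":
            self.reads += 1
        elif op == "write":
            self.writes += 1
        self.block_size_hist[block_size] += 1
        self.latency_us_total += latency_us

    def summary(self) -> dict:
        total = self.reads + self.writes
        return {
            "read_fraction": self.reads / total if total else 0.0,
            "avg_latency_us": self.latency_us_total / total if total else 0.0,
            "block_sizes": dict(self.block_size_hist),
        }
```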

Workload Analyzer then performs “lab testing” using this data; it brings together application workload, storage and I/O patterns to identify their relationship to system performance. Vendor-agnostic, it covers SAN, NAS, converged and cloud data. Collecting both historical and current (real-time) data feeds prediction for new developments.
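Load DynamiX’s own load generators are proprietary; the open-source fio tool serves here purely to illustrate the idea of replaying a captured profile as a synthetic test. The hypothetical sketch below (field names carried over from the assumed profile above) renders a profile as an fio job file:

```python
def profile_to_fio_job(profile, target: str = "/dev/sdb", runtime_s: int = 60) -> str:
    """Render a WorkloadProfile as an fio job file, so the captured
    read/write mix and block size can be replayed against a test array."""
    return "\n".join([
        "[global]",
        "ioengine=libaio",
        "direct=1",
        f"runtime={runtime_s}",
        "time_based=1",
        "",
        "[replayed-workload]",
        "rw=randrw",   # assumes a random-access mix, for illustration
        f"rwmixread={round(profile.read_write_mix * 100)}",
        f"bs={round(profile.avg_block_size_kb)}k",
        "iodepth=16",
        f"filename={target}",
    ])
```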

Enterprise 5.0 (even without deploying its appliances) is no small financial consideration, so its results need to be good to make the investment worthwhile. Rosenthal told me that GE (as an example) had put its accuracy “in the high 90s percent”. Its impressive high-end customer base also indicates a level of user trust.

However, Load DynamiX is also beginning to target medium-sized businesses. It has a software-only version, and will also partner with distributors able to offer the testing as a service to their customers.

One leading UK distributor has already used Workload Analyzer to predict likely Oracle performance, comparing six competing vendors’ systems all running the same production job. A dramatic (five-fold) difference between the best and worst systems pointed the customer, who was planning an upgrade, vendor-independently to the system best suited to its own needs.

Constant monitoring of an ever-changing picture (e.g. more users added, software and database upgrades, server upgrades, extra features and configuration changes) can bring insight into how such changes affect the I/O characteristics (by altering read/write, random/sequential and data/metadata mixes, block/file size distribution and compression), and so impact application latency, IOPS and throughput.
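One of those characteristics, the random/sequential mix, shows how such insight can be derived. As a hypothetical sketch (the trace format is an assumption, not anything Load DynamiX publishes), the function below classifies an I/O as sequential if it starts at the logical block address where the previous request on the same device ended:

```python
def sequential_fraction(trace):
    """Estimate the sequential fraction of a trace of (device, lba, blocks)
    tuples: an I/O counts as sequential if it begins at the LBA immediately
    following the previous request on the same device."""
    last_end = {}            # device -> LBA just past the previous request
    sequential = total = 0
    for device, lba, blocks in trace:
        if last_end.get(device) == lba:
            sequential += 1
        last_end[device] = lba + blocks
        total += 1
    return sequential / total if total else 0.0
```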

Even then, a sudden performance downturn could scupper an SLA, or worse; being able to quickly pinpoint the exact cause and take appropriate remedial action could avert a disaster. At this time Load DynamiX’s vendor-neutral software and supporting appliances seem to provide the only high-end storage testing capability able to do this.

For now, its users are predominantly US-based. By bringing its capabilities to the mid-market, and to Europe with the world in its sights, it has the potential to be a global player in its niche.

*Source: Gatepoint Research April 2015