Tackling the storage provisioning nightmare and business clamour to save costs

Content Copyright © 2015 Bloor. All Rights Reserved.

Infrastructure Practice Leader, Peter Williams, recently cast his eye over Load DynamiX and its off-the-shelf appliance for running important ‘what-if’ analyses of how storage systems are likely to perform as they are updated, especially at peaks when fully loaded with masses of throughput.

Despite ever-growing storage and backup needs, IT departments are under constant pressure to reduce infrastructure costs. There is also pressure to deploy new and (maybe) faster technology quickly, and to roll out extra infrastructure to meet business needs. Storage comes under particularly strong scrutiny, as it typically accounts for over 40 per cent of total infrastructure costs, a share that is still rising.

The problem is: how do you deduce the likely result, other than in theory or by trusting the vendor, before you add new infrastructure and perhaps retire older equipment? Storage applications, even with thin provisioning to guide storage allocation in a virtual environment, can be hopeless at gauging the likely future performance of the system, especially at peaks when it is fully loaded with masses of throughput.
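One reason peak behaviour is so hard to extrapolate from light-load measurements is that latency grows non-linearly with utilisation. As an illustration only (a textbook M/M/1 queueing sketch, not Load DynamiX's model), response time roughly follows W = S / (1 − ρ), so it explodes as a device approaches saturation:

```python
# Illustrative only: a simple M/M/1 queueing model (an assumption for this
# sketch, not Load DynamiX's method) showing why latency measured at low
# utilisation says little about latency at peak load.

def mm1_latency_ms(service_ms: float, utilisation: float) -> float:
    """Mean response time of an M/M/1 queue: W = S / (1 - rho)."""
    assert 0 <= utilisation < 1, "model only valid below saturation"
    return service_ms / (1 - utilisation)

# A device with a 1 ms service time stays snappy at 50% load but
# degrades sharply as utilisation climbs towards 100%.
for rho in (0.5, 0.8, 0.9, 0.95, 0.99):
    print(f"utilisation {rho:.0%}: {mm1_latency_ms(1.0, rho):.1f} ms")
```

The same 1 ms device that responds in 2 ms at 50% utilisation takes around 100 ms at 99%, which is why testing under realistic peak workloads, rather than extrapolating from quiet periods, matters.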

Nor does this allow you to try out ‘what-ifs’, for example substituting flash storage for spinning disk, before committing to an expensive live trial. Every user’s storage infrastructure and needs differ.

This is what Load DynamiX addresses. The company’s approach centres on its 2U-high plug-in appliance, available off the shelf in four models with up to eight ports supporting a wide variety of protocols*. Load DynamiX also creates a workload profile for each customer (a step that is shortly to be automated), which builds a comprehensive view of the storage infrastructure.

Through the GUI, the storage administrator can then model workloads, generate throughput loads and manage tests on them, with performance analytics that include latency. Utilities generate read- and write-throughput reports and allow “what-if” questions to show the effect of workload changes. This can be done rapidly, saving setup time, and is as accurate as is practically possible.
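Load DynamiX's internal data model isn't public here, so purely as an illustration, with hypothetical names throughout, a workload profile for a ‘what-if’ comparison might boil down to something like this: describe each workload's I/O rate, I/O size and read/write mix, then compare the throughput the baseline and the proposed change would generate:

```python
from dataclasses import dataclass

@dataclass
class WorkloadProfile:
    """Hypothetical workload profile -- the field names are illustrative,
    not Load DynamiX's actual data model."""
    name: str
    iops: int            # target I/O operations per second
    io_size_kib: float   # average I/O size in KiB
    read_pct: float      # fraction of operations that are reads

    def throughput_mib_s(self) -> float:
        """Total throughput this workload generates, in MiB/s."""
        return self.iops * self.io_size_kib / 1024

# A "what-if": compare today's profile with a heavier hypothetical variant,
# e.g. after an extra application tier is added to the same array.
baseline = WorkloadProfile("oltp-baseline", iops=50_000, io_size_kib=8, read_pct=0.7)
what_if = WorkloadProfile("oltp-plus-analytics", iops=65_000, io_size_kib=16, read_pct=0.6)

for w in (baseline, what_if):
    print(f"{w.name}: {w.throughput_mib_s():.0f} MiB/s at {w.iops:,} IOPS")
```

The value of a load-generation appliance is that profiles like these are then replayed against the real target array at full rate, rather than merely calculated on paper as above.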

Its largest appliance, the 10G Series Advanced Solution, which connects to a 10GbE network, can handle up to 140 Gbps of workload throughput and 7 million IOPS, allowing some very big tests. Load DynamiX also provides a short (one-week) administrator course so that, once the system is installed, users can run tests themselves whenever they choose, for file, block and object storage loads.
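To put those headline figures in context, a quick back-of-envelope calculation (mine, not the vendor's) shows what they imply about average I/O size:

```python
# Back-of-envelope check on the quoted figures for the 10G Series Advanced
# Solution: 140 Gbps of throughput sustained at 7 million IOPS implies an
# average payload of roughly 2.5 kB per I/O operation.

GBPS = 140                # rated workload throughput, gigabits per second
IOPS = 7_000_000          # rated I/O operations per second

bytes_per_second = GBPS * 1e9 / 8          # gigabits/s -> bytes/s
avg_io_bytes = bytes_per_second / IOPS     # average bytes per operation

print(f"average I/O size: {avg_io_bytes:.0f} bytes")
```

In other words, the two limits are consistent with small-block workloads; larger block sizes would hit the 140 Gbps throughput ceiling well before the IOPS ceiling, and vice versa.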

Len Rosenthal, Load DynamiX’s VP of marketing, told me that this differentiated it from freeware tools, which were weak at scaling to production levels and took a lot of time to set up and administer. There are certainly not many providers with the ability to scale to this level. (Load DynamiX has a slightly different focus from Virtual Instruments, whose software-only approach also covers the server-based infrastructure.)

How well do the appliance and its software work? One apparently big endorsement is that it is now used by all the biggest IT storage vendors for their own internal testing, including EMC, NetApp, Dell, IBM, Cisco, Hitachi, HP and Oracle. Privately-held start-up Load DynamiX, which entered the market in 2009, claims to be the leading provider of performance validation solutions for networked storage infrastructure. Once automation of its workload profile set-up is complete, it will surely be a very attractive add-on for storage equipment distributors and resellers to offer too.

The company is now expanding outside the US; so, for example, I also met with Gavin Tweedie, newly appointed director of EMEA operations, who will be UK-based at its soon-to-be announced EMEA office.

*Protocols and network connectivity support includes: NFS (v2/3/4/4.1), SMB 2/3, CIFS, SCSI, FC, iSCSI, HTTP, HTTPS, CDMI, S3, Swift/Cinder.