Storage product development is getting worse

Storage development is becoming less conservative than it used to be. This has its pros and cons, but if it means poorer quality in the final product and an increasing risk of data loss, then it’s not the way to go.

Bad behavior

I stumbled on this article which talks about all the problems, bugs and mistakes made by Maxta with one of its (now former) customers. I won’t talk about Maxta and this particular case (also because there are different versions of this story), but it’s an example of how some vendors, especially small startups, set the bar too high and then struggle to deliver.

Data loss is the worst-case scenario, but it’s quite common now to hear about storage startups in trouble when the game gets tough. Sometimes they fail miserably to scale after promising “unlimited scalability”, or performance is far lower than expected, or some of the features don’t actually work as documented.

This time round it happened to Maxta, but I’m sure that many others could make the same mistake.

DevOps-izing data storage is dangerous

In the last couple of years, I’ve been hearing a lot about the drastic change in the development process of storage systems. Most vendors are adopting new agile development processes, and some of them have been openly talking about a DevOps-like approach.

I’ve always been keen on this type of development approach: it’s modern, fast and produces results quickly. But… I can appreciate it in my smartphone apps, not in my storage system. I can imagine a continuous refinement of the UI or management features, but not of the core performance or data protection aspects of the product.

Whatever happened to the golden rule “if it works, leave it alone”? I’m not saying to apply it literally, but couldn’t more time be spent on testing and QA instead of releasing a new version every other week? Do we really need a storage software update every fortnight? I don’t think so.

Fierce competition

It’s all about competition in the end. In the past, a single good feature was enough to make a product, create a new market and succeed (take Data Domain, for example). It took time for others to follow, and development cycles were not as fast as they are today. Now everything is much more complicated, things have accelerated, and ongoing product evolution is needed to keep pace with your competitors. Look at hyperconvergence or All-Flash, for example: in many cases it is really difficult to find a differentiator now, and end users want all the features that are taken for granted (and the list is very long!). What is now considered table stakes is already hard to achieve, and on top of that you have to promise more to be taken seriously.

Closing the circle

I know times have changed and everything runs at a faster pace… but when it comes to data and data storage, data protection, availability and durability are still at the top of the list, aren’t they?

Standing out in a crowd is much harder now than in the past. Even established vendors are much quicker to react to market changes. Lately, when a new potential market segment is discovered, they’ve shown their ability to buy out a startup or bring out their own product pretty quickly and successfully (take VMware VSAN, for example). First movers, like Nutanix, have an advantage (a vision) and can aim at successful exits, but for a large part of the me-too startups it’s tough: the lack of innovation and differentiation puts them in an awkward position, constantly trying to catch up with the leaders.

Software-defined or not, product quality is still fundamental, especially when dealing with storage. I’d like to see more storage vendors talk about how thoroughly they test their products and how long they keep them in beta before going into production… instead of how many new releases they are able to ship per month!

And please, find some time to write better documentation too!
