Here's what Nutanix told StorageReview about the plan:
While we work to finalise the performance test plan, we ask that you not conduct any testing that measures the performance of the Nutanix product or the performance of any application (including test software) running on the Nutanix platform.
Specifically, we ask that you not use any custom-developed or commercial test tools to measure performance, including but not limited to, Sysbench, VMmark, IOmeter and Open LDAP.
Until we have a mutually agreed upon plan, we ask that you not undertake any performance testing of the Nutanix product, or publish results of prior performance testing.
So Nutanix wants the marketing benefits of a product review in the independent StorageReview magazine, but won't let it run independent tests?
Beeler said no, and commented: "The test plan currently proposed by Nutanix is fine for learning the system and characterising lightweight behaviours, but does not show what customers can expect as their demands grow after initial deployment."
He added: "Nutanix now holds a position that its testing plan should be the hyperconverged standard, which is somewhat surprising given its testing plan leverages synthetic testing tools primarily, and doesn't stress cluster performance" – i.e. real-world application performance.
StorageReview has published a VMware Virtual SAN Review: SQL Server Performance test, as well as a VMware Virtual SAN Review: Sysbench OLTP Performance result.
If that's not enough to demonstrate VMware VSAN performance transparency, there is also a VMware Virtual SAN Review: VMmark Performance review.
Despite six months of work, StorageReview was unable to produce an independent performance result using benchmarks that were acceptable to VMware and that also satisfied Nutanix.
If Nutanix's restrictions were followed through, no comparisons could be made with other hyperconverged systems – notably VMware's VSAN – which gets us back to the blog spat and Lundell's attempt to occupy the moral high ground.
Sorry Lukas, but Nutanix cannot claim that performance testing moral high ground. The company has seemingly demonstrated that it is not interested in transparency about test results. The implication is that Nutanix systems don't perform as well in the real world as Nutanix hoped.
A Nutanix spokesperson gave us this statement:
We’re committed to working with independent third-party evaluation labs like Storage Review to compare our solution against any hyperconverged product using comparable hardware and a comprehensive and representative testing methodology.
The current generation of methodologies does not adequately represent how hyperconverged solutions perform in real-world customer environments. We feel strongly that utilising outdated test tools and methodologies would not provide customers interested in hyperconverged solutions with relevant and indicative data.
As indicated by Lukas, we’re building an open, comprehensive test suite for this category that we feel will help customers better understand the performance of hyperconverged solutions. We’ll demonstrate it at the Nutanix booth at VMworld and will release it in September so anyone in the industry can use it.
In the meantime, we’ll continue talking to Storage Review and any other third parties about working together on a review that will benefit both the industry and customers evaluating hyperconverged solutions.