Coho Data has added concurrent Hadoop Distributed File System (HDFS) access to the existing block and file access on its DataStream array system, along with multi-tenancy quality-of-service features.
Its scale-out DataStream MicroArrays are servers with either all-NVMe flash or hybrid flash-and-disk storage, and they can run storage-related applications such as video transcoding directly on the array. But Coho stresses that they are not hyper-converged systems capable of running application virtual machines; instead they can be used to provide vSphere storage.
DataStream v2.7 added the block storage capability to the existing file storage support. In August last year, Coho added container support so customers could run Splunk and other analysis software directly on the array.
DataStream v2.8 adds native HDFS support. It also provides quality-of-service (QoS) features enabling multi-tenancy and tenant-specific service levels. The company says network, storage and performance characteristics can be securely isolated while meeting desired service levels, and that it can offer big data and containers as a private cloud service.
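Coho has not published client-side details, but native HDFS support of this kind typically means existing Hadoop clients can be pointed at the array without code changes, by setting the default filesystem in Hadoop's core-site.xml. A hypothetical sketch (the endpoint hostname and port below are assumptions, not Coho-published values):

```xml
<!-- core-site.xml: direct Hadoop clients at an HDFS-compatible endpoint.
     "datastream.example.com:8020" is a hypothetical address for the array. -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://datastream.example.com:8020</value>
  </property>
</configuration>
```

With that in place, existing jobs and standard tooling such as `hdfs dfs -ls /` would resolve against the array rather than a dedicated Hadoop cluster, which is consistent with the customer's claim below of moving workloads over without code changes.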
Coho customer Michael Kuhlmann, President and CEO, Colony Networks, provided a nice canned quote: “Today, we are ingesting a large amount of device monitoring probe and Wi-Fi session information that needs to be crunched. With the Coho DataStream 2.8 release, we can now run Elasticsearch and other micro-services in a container directly on storage, allowing us to build new value-added software features and enhance our customers’ experience using our software.”
He said: “We were able to move our existing workloads to Coho without changing any of the code while seeing about a 40 per cent increase in performance.”
Coho’s co-founder and CEO Ramana Jonnala talks about the software-defined data centre (SDDC) and sees DataStream systems operating in private clouds. His canned quote says: “We feel our new Coho DataStream 2.8 release provides our customers with many of the core components of a software-defined data centre in a single platform that addresses their future, critical needs around consolidation, the rapid growth of data and operational simplicity.”