Big daddy HDS shows off fruits of Big Data slurpee Pentaho

Hyper-converged HSP analytics appliance

A refreshed HSP 400 series, HDS’ scale-out analytics appliance, has native integration with the Pentaho Enterprise Platform.

HDS slurped up Pentaho in February 2015 to acquire its big data integration and predictive analytics technology. The software blends multi-source data from, for example, Hadoop distributions (Cloudera, Hortonworks and others), NoSQL stores (MongoDB, HBase, Cassandra, etc) and data warehouses such as Netezza and Vertica, and then runs analysis routines on it.
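For a flavour of what that blending means in practice, here's a minimal Python sketch: pull records from a NoSQL store and a warehouse export, then join them for analysis. It illustrates the technique, not Pentaho's actual API (Pentaho transformations are normally built in its own design tools), and the connection string, collection, file path and column names are all made up for the example:

    import pandas as pd
    from pymongo import MongoClient

    # Source 1: order documents from a NoSQL store (connection details assumed)
    mongo = MongoClient("mongodb://analytics-node:27017")
    orders = pd.DataFrame(list(mongo["sales"]["orders"].find({}, {"_id": 0})))

    # Source 2: a customer dimension exported from a warehouse (path assumed)
    customers = pd.read_csv("warehouse_export/customers.csv")

    # Blend the sources on a shared key, then run a simple analysis routine
    blended = orders.merge(customers, on="customer_id", how="inner")
    revenue_by_region = blended.groupby("region")["order_total"].sum()
    print(revenue_by_region.sort_values(ascending=False))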

We’re told the HSP 400 is a hyper-converged analytics appliance built from 2U nodes that combine compute, storage and networking, and that it “delivers seamless infrastructure to support big data blending, embedded business analytics and simplified data management.”

HDS HSP 400

There is a “centralised, easy-to-use user interface to automate the deployment and management of virtualised environments for leading open source big data frameworks, including Apache Hadoop, Apache Spark, and commercial open source stacks like the Hortonworks Data Platform (HDP).”
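As a rough sketch of the sort of workload those frameworks host, here's a minimal PySpark job of the kind an HSP cluster would run. It's illustrative only: the HDFS path and column names are assumptions, and nothing in it is HDS- or Pentaho-specific:

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # On a managed cluster the master/resource settings come from the platform
    spark = SparkSession.builder.appName("hsp-demo").getOrCreate()

    # Load event data from HDFS (path and schema are assumed for the example)
    events = spark.read.json("hdfs:///data/clickstream/")

    # Profile the data: count events per source system
    counts = events.groupBy("source_system").agg(F.count("*").alias("events"))
    counts.orderBy(F.desc("events")).show()

    spark.stop()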

The Pentaho integration provides “complete control of the analytic data pipeline and enterprise-grade features such as big data lineage, lifecycle management and enhanced information security.”

James Dixon, Pentaho’s CTO, said customers can unify disparate datasets and workloads, such as legacy applications and data warehouses, with the HSP 400. It’s a “simplified, all-in-the-box solution that combines compute, analytics and data management functions in a plug-and-play, future-ready architecture.”

The HSP 400 is “a great first-step in simplifying the entire analytic process,” and has a pay-as-you-go business model.

HDS tells us that the HSP 400 will be used for more workloads than analytics in the future, with Pentaho being used in some or all of them.

Pentaho Chord Visualisation

A nearline SAS disk (12 × 4TB) configuration of the HSP 400 is available now, and an all-flash version is expected by the middle of the year. We expect a dozen SAS-interface SSDs will occupy the current SAS disk drive slots if HDS takes the simple engineering route.

On the other hand, HDS has just launched its HFS A series all-flash array which, like the 400 series nodes, is a 2U box. Hmm, slide those into an HSP 400 rack and you would have a hellishly fast and powerful system. Oh my Precious, I wants one! ®

 
