ODPi, the group formerly known as the Open Data Platform initiative, has published its first runtime specification.
Backed by Hortonworks but kicked into the corner by heavyweights MapR and Cloudera, ODPi was set up last year to try to ensure applications would work across multiple Apache Hadoop distributions.
Instead of collapsing, however, the group quietly worked on a runtime specification accompanied by a test suite.
The ODPi technical working group says its objectives are:
- For consumers: ability to run any “ODPi-compatible” software on any “ODPi-compliant” platform and have it work.
- For ISVs: compatibility guidelines that allow them to “test once, run everywhere.”
- For Hadoop platform providers: compliance guidelines that enable ODPi-compatible software to run successfully on their solutions, while still allowing providers to ship urgent patches to their customers expeditiously.
The runtime spec, on GitHub, doesn't preclude applications from calling private interfaces, but the authors note that such customisation breaks the “test once, run everywhere” model.
The current spec uses the Apache Hadoop 2.7 branch as its base. The “minimum native build” specifications set out Kerberos, Java and operating system requirements, along with the GZip and Snappy compression codecs.
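By way of illustration, enabling those two codecs on a Hadoop cluster typically comes down to listing their classes in `core-site.xml`. The property and class names below come from stock Apache Hadoop, not from the ODPi specification itself, so treat this as a sketch rather than the spec's own wording:

```xml
<!-- core-site.xml fragment: register the GZip and Snappy codecs.
     Class names are stock Apache Hadoop; this is illustrative,
     not copied from the ODPi specification. -->
<property>
  <name>io.compression.codecs</name>
  <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
```

On a platform with the native libraries installed, running `hadoop checknative` should then report zlib and snappy as available.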
Apache Bigtop is used for packaging, testing and configuration, and there are “guidelines on how to incorporate additional, non-breaking features, which are allowed provided source code is made available through relevant Apache community processes.”
Sandbox images for testing are also on GitHub. ®