Snowflake targets Java and Scala devs, will soon slither after Pythonistas too

Cloud data outfit realises SQL-centric approach won’t attract the developers it needs to grow


Snowflake Summit Cloudy data-wrangling outfit Snowflake has opened itself up to Java and Scala developers.

At the company's annual event, Summit, the firm talked up Snowpark, which will allow developers to use those languages to work with its platform. Until now, Snowflake has focused on SQL-centric developers. Java user-defined functions (UDFs) will also be permitted on the platform, allowing custom code and business logic to run inside Snowflake.
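In Snowflake's scheme, a Java UDF handler is an ordinary class with a static method that the platform invokes per row, registered via a `CREATE FUNCTION ... LANGUAGE JAVA` statement. As a rough sketch of the idea (the class, method, and function names here are hypothetical, not from Snowflake's docs):

```java
// Hypothetical handler for a Snowflake Java UDF. Snowflake would call the
// static method below for each row once the function is registered with
// something like:
//   CREATE FUNCTION net_revenue(gross DOUBLE, fee_rate DOUBLE)
//   RETURNS DOUBLE LANGUAGE JAVA HANDLER = 'RevenueUdf.netRevenue' ...
public class RevenueUdf {
    // Business logic executed in-database: gross revenue minus a flat fee.
    public static double netRevenue(double gross, double feeRate) {
        return gross * (1.0 - feeRate);
    }
}
```

The point of the feature is that logic like this runs where the data lives, rather than being shipped out to a separate application tier.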

Peter O'Connor, Snowflake's veep for sales in Asia-Pacific, told The Register the move was recognition that the company needs to be more accommodating to developers if it is to continue its growth.

Python support is also on the company's roadmap, but without a delivery date. Snowpark is currently in private preview, Snowflake's jargon for a closed beta. A public preview is "coming soon."

Snowflake has also vowed to improve its unstructured data-handling prowess, and create a new SQL API that lets applications call its services using REST.
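The idea of the SQL API is that any application able to speak HTTP can submit statements without a driver. As a hedged sketch, assuming an endpoint of the form `https://<account>.snowflakecomputing.com/api/v2/statements` and a JSON body carrying a `statement` field (both assumptions about the API's shape, not confirmed by the article), a client might build a request like:

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class SqlApiSketch {
    // Builds (but does not send) a REST request submitting a SQL statement.
    // Account host, endpoint path, and payload shape are assumptions.
    public static HttpRequest buildStatementRequest(
            String account, String token, String sql) {
        String body = "{\"statement\": \"" + sql.replace("\"", "\\\"") + "\"}";
        return HttpRequest.newBuilder()
                .uri(URI.create("https://" + account
                        + ".snowflakecomputing.com/api/v2/statements"))
                .header("Authorization", "Bearer " + token)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
    }
}
```

Sending the request with `java.net.http.HttpClient` and parsing the JSON response would complete the round trip; the sketch stops at request construction since it can be checked without a Snowflake account.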

Another feature in private preview picks personally identifiable information out of data stored in Snowflake, then applies policies to ensure it is not used inappropriately. Or used at all. New anonymised views of data offer the same outcomes.

Storage has also been tweaked: improved compression is said to reduce storage consumption, and therefore costs, which scale independently of compute usage.

Snowflake introduced the new functions to The Register at an event featuring customers, one of which – data-science-as-a-service outfit Quantium – explained that Snowflake has replaced Hadoop for many of its analytics chores.

Quantium's executive for engineering, product and technology, Florian Pasquier, said capital expenditure costs for Hadoop were worryingly high, and run times for analytics jobs unsustainably long. Pasquier said exceptional engineers could address such problems, but would require around two weeks to do so. Cue cost/benefit calculations that saw Hadoop heaved overboard. The organisation still uses Apache Spark for some analytics jobs.

O'Connor said Snowflake has just started to win deals in which it displaced Cloudera and Teradata in Asia-Pacific. He expressed hope the new features detailed above will accelerate that process. ®
