Microsoft's Azure servers want to sip your mug of serverless Java

Azure Functions upgraded to woo developers speaking Oracle's language

JavaOne Microsoft has announced Java support for Azure Functions, the serverless cloud platform which competes with AWS Lambda. The announcement was made at the JavaOne event under way in San Francisco this week.

The service is now in public preview. Java is the latest language to be added following Microsoft’s redesign of the Azure Functions runtime to improve support for different programming languages.

Azure Functions can be triggered in a variety of ways, including a direct call over HTTP, scheduled execution, Azure Service Bus messages, changes to files in Azure Storage, or in response to events managed by Event Grid, an Azure event routing service announced in August.
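To give a flavour of the HTTP trigger, a minimal Java function built against the azure-functions-java-library annotations looks something like the sketch below. Treat it as illustrative: the class and function names are our own, and details of the preview API may differ slightly from the library's later stable shape.

```java
import java.util.Optional;

import com.microsoft.azure.functions.ExecutionContext;
import com.microsoft.azure.functions.HttpMethod;
import com.microsoft.azure.functions.HttpRequestMessage;
import com.microsoft.azure.functions.HttpResponseMessage;
import com.microsoft.azure.functions.HttpStatus;
import com.microsoft.azure.functions.annotation.AuthorizationLevel;
import com.microsoft.azure.functions.annotation.FunctionName;
import com.microsoft.azure.functions.annotation.HttpTrigger;

public class HelloFunction {
    // Invoked on each HTTP GET or POST to /api/hello; the Functions
    // runtime supplies the request and execution context.
    @FunctionName("hello")
    public HttpResponseMessage run(
            @HttpTrigger(name = "req",
                         methods = {HttpMethod.GET, HttpMethod.POST},
                         authLevel = AuthorizationLevel.ANONYMOUS)
            HttpRequestMessage<Optional<String>> request,
            final ExecutionContext context) {
        String name = request.getBody().orElse("world");
        return request.createResponseBuilder(HttpStatus.OK)
                      .body("Hello, " + name)
                      .build();
    }
}
```

Bindings for the other trigger types – timers, Service Bus queues, storage blobs and so on – follow the same annotation-driven pattern.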

Azure Functions is billed per execution and scales automatically. Other supported languages are C#, F#, Node.js, Python, PHP and Bash.

Microsoft also has tools for running an Azure Functions host locally, for development and test. The Azure Functions Core Tools use Node.js and .NET Core, and run on Windows, Linux and Mac.
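The local loop with the Core Tools runs roughly as follows – a sketch only, since the package name and subcommands have shifted between versions, and Java projects are more usually scaffolded from the Maven archetype than with `func init`:

```shell
# Install the Core Tools (requires Node.js); this provides the `func` CLI
npm install -g azure-functions-core-tools

func init MyFunctionApp    # scaffold a new function app project
cd MyFunctionApp
func host start            # run the Functions runtime on your own machine
```

Once the host is running, HTTP-triggered functions are reachable on localhost for iterating without touching the cloud.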

Java developers also get a plugin for Apache Maven that manages build and deployment.
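Wiring that plugin in means a build section along these lines in the project's pom.xml – the app and resource-group names here are placeholders, not anything Microsoft prescribes:

```xml
<!-- pom.xml build section: package and deploy via the Azure plugin -->
<plugin>
  <groupId>com.microsoft.azure</groupId>
  <artifactId>azure-functions-maven-plugin</artifactId>
  <configuration>
    <!-- placeholder values; point these at your own app and resource group -->
    <appName>my-java-functions</appName>
    <resourceGroup>my-resource-group</resourceGroup>
  </configuration>
</plugin>
```

With that in place, `mvn azure-functions:package` stages the app locally and `mvn azure-functions:deploy` pushes it to Azure.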

Why Java in Azure Functions? Java remains the top language for enterprise development, even though it trails JavaScript in overall popularity. If Microsoft hopes to attract developers to Azure beyond the usual Redmond platform community, Java support is essential.

Note that Amazon's AWS has offered Java support in its similar Lambda service since June 2015. Lambda also has a local development option, called AWS SAM – that's Serverless Application Model – Local, so Microsoft’s offering is not all that distinctive.

That said, there is good reason to build on Microsoft’s cloud if you are integrating with other Microsoft services such as Office 365 or Dynamics 365, or an Azure database such as Cosmos DB.

Microsoft already supports Java on Azure in various other services, including App Service, or running in containers. The advantage of serverless is that it is a pure cloud model, in which developers only need to supply the code, which talks to the underlying platform via APIs. Obviously, it still ultimately all runs on actual servers. Serverless means you don't have to worry about deploying, maintaining, scheduling and tearing down host systems or virtual machines – you just supply the code and Azure (or AWS or whoever) takes care of the hosting automatically, fingers crossed.

More information is over here. ®

Biting the hand that feeds IT © 1998–2022