DataStax 'pauses' AIOps database project to figure out exactly what AIOps is

Some enterprise folk less keen on cloudy machine learning


Cassandra database slinger DataStax has "paused" its AIOps project, Vector, while it figures out what exactly AIOps is.

Vector, a sort of automated database help function, was launched in June last year to give operators and developers access to the DataStax "knowledge base and provide near real-time expert advice, knowledge, and skills," as the firm put it.

Speaking to The Register this week, Ed Anuff, DataStax's chief product officer, said that after an extensive beta programme of Vector, it was time to pause the project and reflect.

"The purpose of running the beta was really to understand what it was that people wanted to see in the final product," he said.

"There were a lot of different expectations as to what the product should do: some people wanted a AIops to offer recommendations, other people wanted it to be a complete self-driving car of the databases," he said.

The cloud/diverse datasets issue

The second thing DataStax found was that users could not agree on whether the product should be powered from the cloud or not.

While some people were very comfortable with the idea of installing an agent that would connect up to the cloud, others, particularly enterprise customers, wanted to be able to run it fully on-premises.


The problem is that AIOps was supposed to learn from the experiences of a range of customers. "Part of what makes it feasible from a machine learning standpoint, is that we're able to go and observe and use many different patterns as mechanisms of training," Anuff said.

DataStax, which supports the Apache Cassandra database and provides commercial products related to it, would restart the beta programme later this year "probably in the summer timeframe," he said.

Vector was headed up by Aaron Morton, who goes by the title "office of the field CTO" at DataStax, and was co-founder and CEO of The Last Pickle, a consultancy DataStax bought in March 2020.

Anuff said The Last Pickle team had been "blended" into the DataStax group and continues to work on a range of projects.

DataStax recently made efforts to court developers with its support for GraphQL and cloud container enthusiasts with its distribution designed specifically for Kubernetes. ®

