AI-powered dynamic pricing turns its gaze to the fuel pumps

Shopping as a constant poker game

Analysis "AI" could soon be making petrol more expensive at times of peak demand like the start of a bank holiday weekend or the school run.

Danish data analytics company a2i touts fuel pricing as an ideal application for its learning algorithms. The company claims that PriceCast Fuel, its dynamic pricing product, can improve fuel retailers' margins by around 5 per cent.

"With the use of Artificial Intelligence PriceCast Fuel detects behavioral patterns in Big Data (all available data relevant to the sale) and relates to customer and competitor reactions with a frequency and level of accuracy that users of traditional pricing systems only can dream about," the company explains in a brochure [PDF]. "Dynamically mapping customer and competitor behavior in order to identify the optimal route (price setting) throughout the day, makes it possible to relate to any given change in the local situation for a given station and re-route accordingly when necessary and within seconds."

PriceCast can do traditional rule-based pricing where, for example, the supplier wants to set the lowest price possible. But a2i can also incorporate "Restricted AI" or "Advanced AI" pricing.
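a2i doesn't publish how either mode works, but the contrast it describes can be sketched as a toy model. The function names, numbers and the linear demand sensitivity below are all assumptions for illustration, not PriceCast's actual logic:

```python
# Illustrative only: a2i does not disclose PriceCast's internals.
# A traditional rule might peg price to the cheapest local competitor,
# while a dynamic policy nudges price up when predicted demand is high.

def rule_based_price(competitor_prices, margin_floor, cost):
    """Match the cheapest competitor, but never sell below cost + floor."""
    return max(min(competitor_prices), cost + margin_floor)

def dynamic_price(base_price, predicted_demand, sensitivity=0.05):
    """Scale price with predicted demand (1.0 = average demand)."""
    return base_price * (1 + sensitivity * (predicted_demand - 1.0))

base = rule_based_price([1.45, 1.47, 1.50], margin_floor=0.02, cost=1.40)
print(round(base, 2))                      # 1.45: matches cheapest rival
print(round(dynamic_price(base, 1.8), 3))  # higher at peak demand
print(round(dynamic_price(base, 0.6), 3))  # lower off-peak
```

The point of the sketch: the rule-based price only moves when competitors or costs move, while the dynamic price moves whenever predicted demand does – say, at the start of a bank holiday weekend.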

The company says it isn't ripping off anyone.

"This is not a matter of stealing more money from your customer. It's about making margin on people who don't care, and giving away margin to people who do care," CEO Ulrik Blichfeldt told The Wall Street Journal earlier this month [paywalled].

a2i claims several fuel suppliers in Europe have signed on, but only one is prepared to go public. Why the shyness? Well, that isn't too hard to work out.

Uber's occasionally notorious "surge pricing" has established the principle that prices fluctuate in real time, but there is a logic to it. It's designed to increase the supply of taxis on the roads from Uber's casual labour pool. Uber can't compel its labour to work, but since higher fares mean the drivers keep more money, they're more inclined to go on the road. But dynamic pricing doesn't produce more or cheaper petrol – it's simply a case of retailers increasing margin where they can, as Blichfeldt has admitted.
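Uber's actual formula isn't public, but the supply-side logic described above can be sketched as a toy model (the cap, the ratios and the linear driver elasticity are all assumptions):

```python
# Toy sketch of the surge logic described above, not Uber's algorithm:
# when requests outstrip available drivers, a multiplier raises fares,
# and the higher pay draws more drivers onto the road.

def surge_multiplier(requests, drivers, cap=3.0):
    """Fare multiplier grows with the demand/supply ratio, up to a cap."""
    if drivers == 0:
        return cap
    return min(max(requests / drivers, 1.0), cap)

def drivers_online(base_drivers, multiplier, elasticity=0.4):
    """More pay, more drivers willing to work (crudely linear)."""
    return int(base_drivers * (1 + elasticity * (multiplier - 1)))

m = surge_multiplier(requests=300, drivers=150)  # demand is 2x supply
print(m, drivers_online(100, m))                 # supply rises in response
```

Note the second function: surge pricing at least closes its own loop by calling out more supply. A petrol station raising prices on a bank holiday morning has no equivalent – the extra margin produces no extra petrol.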

Real-time dynamic pricing has long been a Silicon Valley fantasy. Here's Affirm co-founder and CEO Max Levchin four years ago:

On a Saturday morning, I load my two toddlers into their respective child seats, and my car's in-wheel strain gauges detect the weight difference and report that the kids are with me in a moving vehicle to my insurance via a secure message through my iPhone. The insurance company duly increases today's premium by a few dollars.

What do you imagine personalised fuel pricing – based on your car broadcasting that your fuel gauge is low – would look like? Levchin then jokes that we'll see dynamically-priced priests and therapists. But it would be a bold and foolish therapist who hiked their prices after a major natural disaster or terrorist incident.

AI-powered dynamic pricing sticks in the craw because it makes it blatantly obvious that the consumer is being gamed.

The "smart" consumer will shop around, but dynamic pricing turns shopping into a 24/7 poker game – a full-time hobby, or neurosis. To the VC and AI nerds of Silicon Valley this is how it should be, squeezing every last ounce of "inefficiency" out of a marketplace.

"This is the nightmare world of Big Data, where the moment-by-moment behavior of human beings – analog resources – is tracked by sensors and engineered by central authorities to create optimal statistical outcomes," commented Nick Carr.

But that's a relationship many of us would rather not enter. Not for nothing have retailers, for decades, promised "lowest prices around – or your money back". It's nice and simple. And it offers a kind of contract: we can hold them to account if they fail.

This is where AI offers another option to cynical retailers. How do we know if we've been offered the lowest price? And what happens when the pricing bots gang up on us?

"A cabal of AI-enhanced price-bots might plausibly hatch a method of colluding that even their handlers could not understand, let alone be held fully responsible for," notes The Economist.
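Bots don't even need intelligence to produce perverse prices together. In 2011, two resellers' simple repricing rules famously drove an out-of-print biology textbook, The Making of a Fly, to $23m on Amazon: one bot priced just under its rival, the other priced well above, and the combined multiplier compounded. A rough sketch of that feedback loop (the multipliers approximate the reported figures; the starting prices are invented):

```python
# Two repricing bots interacting, in the style of the 2011 Amazon
# "Making of a Fly" incident. Neither bot is malicious or smart;
# the pathology emerges from their interaction.

a, b = 20.00, 25.00  # starting prices (illustrative)
for day in range(10):
    a = b * 0.998    # bot A: undercut rival B very slightly
    b = a * 1.270    # bot B: price well above rival A

# Each round multiplies B's price by 0.998 * 1.270 ≈ 1.267, so after
# ten rounds both prices are many times where they started.
print(round(a, 2), round(b, 2))
```

If two hand-written rules can do this by accident, the Economist's worry about learned, opaque collusion strategies is not far-fetched.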

Dynamic pricing is an ugly idea with a couple of fairly ugly parents. One is "behavioural science", or Nudging, which swaps an honest relationship with the customer for a tricksy and dishonest one. Nudging supposes we don't know we're being gamed, and is beloved of today's policy makers.

Another is Silicon Valley's insistence that because something can be done, we'll just love them for doing it. Six months ago I suggested there were two major obstacles to ML adoption, even if the techniques worked as well in new use cases as the AI evangelists hoped. One was the liability issue – we need someone, somewhere to carry the can when things go wrong – while the other was consumer resistance to "helpful" suggestions, which are often creepy and unnecessary.

AI-powered dynamic pricing shows us a Silicon Valley culture determined to ignore both. Full steam ahead! ®
