Data warehousing: upgrade, extend or replace?

Compressed thinking


The B-eye Network recently conducted a survey on behalf of Dataupia, which asked companies, among other things, whether they would consider a "rip and replace" approach to solving whatever data warehousing problems they might have. Seventy-five per cent of respondents said no.

Now, apart from the fact that I find the term "rip and replace" pejorative, this raises some interesting points. The first is that I find 75 per cent surprisingly low. It suggests that fully a quarter of all organisations are so fed up with the problems and costs associated with their current solutions (if "solution" is the right word for something that is obviously failing to deliver) that they are prepared to go through all the upheaval of ripping out and replacing their current systems.

However, it is not on this aspect that I want to focus but rather on the 75 per cent who are loyal, or potentially loyal, to their existing data warehouse supplier. Now, we must suppose that there are some such customers that are happy with their current lot: they haven't got increasing data volumes; they are happy with the amount of data currently archived to tape that can't be queried; they have no demand to embed business intelligence capabilities into operational applications; their queries are returned within the blink of an eye; they have no concerns over the ease and cost of administering the existing system; there are no additional users who want to be able to use the warehouse; and there is no requirement to support unpredictable or complex queries that might overturn the performance applecart.

Yes, right.

Well, leaving them aside, what can any normal company, which has exactly these sorts of issues, do if replacing the existing system with something bigger (or smaller, in some instances) and better is not an option?

The first thing is that it depends on exactly what your issues are. If your issue is primarily one of performance, then the obvious option is to add one or more data marts, presumably from a data warehouse appliance vendor. That is fine, though it may mean that your whole environment becomes more complex.

However, if your issues are more complex (for example, you have concurrency and data volume issues as well as performance problems), then simply adding some new data warehouse appliances is unlikely to resolve them and, given that you don't want to replace your existing system, you are going to have to improve it in some way.

There are two ways of doing this: you can upgrade your existing system, either through a new version of your supplier's software or by upgrading the hardware it runs on, or you can extend your existing environment via a third party.

The problem with upgrading the existing environment is that what you get is more of the same. Certainly, if you upgrade to Oracle 11g, for example, then you can use its compression features to reduce your storage requirement, and you will get some performance benefits too. In other words, you will get incremental benefits. You may even get significant benefits, but what you probably won't get is the sort of order-of-magnitude benefit that you want.

And it is these companies, which want order-of-magnitude benefits but don't want to rip and replace, that Dataupia is targeting. Hence its sponsorship of this survey. So, can it deliver on that promise? Well, it is early days for the company, but one of its first customers, which is using Dataupia in conjunction with an Oracle database, is now loading 30 to 90 days' worth of call data records (it is a telecommunications company) into its warehouse, whereas previously it stored them for only between three and seven days. And it is getting the same sort of performance on the expanded system as it did before. That sounds like an order-of-magnitude improvement to me.

Philip Howard is Director of Research - Technology at Bloor Research.

Copyright © 2007, IT-Analysis.com
