Object databases - alive and kicking?

New open-mindedness in data storage


Comment At a recent meeting with InterSystems (vendor of Caché and Ensemble) the company said it was seeing increased interest in object oriented databases.

Now, I must qualify this by saying, first, that Caché is not merely an object oriented database and, second, that this interest was primarily in the United States and, to a lesser extent, continental Europe - but not in the UK.

Third, of course, the information is purely anecdotal: interest and downloads do not amount to sales. However, it is worth noting that Versant (perhaps the leading pure play object database vendor) has just recorded its highest quarterly net income since its IPO a decade ago. So perhaps there is something to this story. If so, why should that be?

Mike Fuller of InterSystems expressed the opinion that the graduates who left university a decade ago are now in positions of authority, but without having lived through the object oriented hype that preceded them: they are thus more prepared to look at an object database as a natural storage method when developing in, say, Java. There may be some truth to that, but I think the explanation is actually broader.
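To make the "natural storage" point concrete, the sketch below shows the style of programming being described: the Java class the developer already has is, in effect, the thing that gets stored, with no separate mapping layer. It is purely illustrative - the ObjectContainer interface and its store/query methods are hypothetical stand-ins (backed here by a toy in-memory implementation so the example runs), not Caché's actual API.

    // Illustrative only: a hypothetical object-database API, not Caché's real interface.
    // The point is that the Java class itself is the unit of storage.
    import java.util.ArrayList;
    import java.util.List;

    class Customer {
        String name;
        int orders;
        Customer(String name, int orders) { this.name = name; this.orders = orders; }
    }

    // Hypothetical object-database handle: store objects, get them back by class.
    interface ObjectContainer {
        void store(Object o);
        <T> List<T> query(Class<T> type);
    }

    // Toy in-memory implementation so the sketch actually runs.
    class InMemoryContainer implements ObjectContainer {
        private final List<Object> objects = new ArrayList<>();

        public void store(Object o) { objects.add(o); }

        public <T> List<T> query(Class<T> type) {
            List<T> result = new ArrayList<>();
            for (Object o : objects) {
                if (type.isInstance(o)) result.add(type.cast(o));
            }
            return result;
        }
    }

    public class NaturalStorageExample {
        public static void main(String[] args) {
            ObjectContainer db = new InMemoryContainer();
            // No tables, no object-relational mapping: the object goes in as-is.
            db.store(new Customer("Acme Ltd", 12));
            for (Customer c : db.query(Customer.class)) {
                System.out.println(c.name + " has " + c.orders + " orders");
            }
        }
    }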

I think we have now reached a position where it is no longer a given that we have to use a relational database for everything. There has always been a peripheral awareness that non-relational databases have their place: witness the continuing use of Adabas (for which revenues continue to grow at a healthy rate) and the various multi-valued databases such as UniData, UniVerse and Revelation, not to mention Caché itself. Incidentally, the latest release of Caché will run UniData and UniVerse applications natively, which may be useful to know if you are thinking of moving away from those platforms; other multi-valued databases will require some porting, though InterSystems reckons that 70 to 80 per cent of the work needed can be automated.

However, a peripheral awareness of older or different systems is not what is driving the change. I think the cause is twofold: first, we are thinking again about databases in general because of open source vendors such as Ingres and MySQL, and even because of data warehouse appliances; second, we are thinking again because of Sleepycat (now part of Oracle) and IBM.

In the case of Sleepycat Software, its Berkeley DB is essentially a file-based storage engine with database management facilities built around it. Many companies have woken up to the fact that you don't need a relational database to store static, structured data such as call data records or the sort of details that underpin Amazon or eBay. In fact, you don't even need a database management system at all: you can use a flat file system together with indexing and SQL access via CopperEye Greenwich, plus structured search from the same company. But the point is that, whichever approach you take, you do not need a relational approach to this kind of data.
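As a rough illustration of how little machinery "flat file plus index" really needs, the sketch below keeps records in an append-only file and finds them again through a small in-memory index. It is a conceptual example only - not CopperEye's or Berkeley DB's actual API - but it shows why static data such as call data records does not demand a relational engine.

    // Conceptual sketch: records in a flat file plus an in-memory index.
    // Not CopperEye's or Berkeley DB's API - just the underlying idea.
    import java.io.IOException;
    import java.io.RandomAccessFile;
    import java.util.HashMap;
    import java.util.Map;

    public class FlatFileStore {
        private final RandomAccessFile file;
        private final Map<String, Long> index = new HashMap<>(); // key -> byte offset

        public FlatFileStore(String path) throws IOException {
            this.file = new RandomAccessFile(path, "rw");
        }

        // Append one record and remember where it starts.
        public void put(String key, String record) throws IOException {
            long offset = file.length();
            file.seek(offset);
            file.writeUTF(key);
            file.writeUTF(record);
            index.put(key, offset);
        }

        // Fetch a record straight from its offset - no SQL engine involved.
        public String get(String key) throws IOException {
            Long offset = index.get(key);
            if (offset == null) return null;
            file.seek(offset);
            file.readUTF();            // skip the stored key
            return file.readUTF();     // the record itself
        }

        public static void main(String[] args) throws IOException {
            FlatFileStore cdrs = new FlatFileStore("cdrs.dat");
            cdrs.put("call-0001", "2006-03-01T10:15:00,+441234567890,+441098765432,312s");
            System.out.println(cdrs.get("call-0001"));
        }
    }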

Now add IBM's Viper release of DB2 to the mix - a hybrid relational/XML database - and we have IBM saying, in effect, that relational is not always good enough. Various analysts have come to the same conclusion at various times, but there is no point in flogging a dead horse and it is a subject we had quietly dropped. With IBM finally delivering the same message, however, it is clear that the horse is not dead after all - it is very much alive and kicking.
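The "hybrid" point, broadly, is that XML documents can sit alongside relational columns and be queried in place rather than being shredded into tables first. The snippet below uses the JDK's standard XPath support simply to make that idea concrete; it says nothing about Viper's actual SQL/XQuery interface.

    // General idea only: query an XML document in place with XPath, rather than
    // decomposing ("shredding") it into relational tables first. Standard JDK APIs;
    // this is not DB2 Viper's actual interface.
    import java.io.StringReader;
    import javax.xml.xpath.XPath;
    import javax.xml.xpath.XPathFactory;
    import org.xml.sax.InputSource;

    public class XmlInPlaceQuery {
        public static void main(String[] args) throws Exception {
            String order =
                "<order id=\"42\">" +
                "  <customer>Acme Ltd</customer>" +
                "  <total currency=\"GBP\">129.50</total>" +
                "</order>";

            XPath xpath = XPathFactory.newInstance().newXPath();
            // The document keeps its structure; we ask questions of it directly.
            String customer = xpath.evaluate("/order/customer", new InputSource(new StringReader(order)));
            String total = xpath.evaluate("/order/total", new InputSource(new StringReader(order)));

            System.out.println(customer + " owes " + total);
        }
    }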

Whether the object database market is reviving I don't know, but what I do know is that the old ways of thinking about storing data - always in a relational database - are no longer valid (if they ever were). Open source databases and appliances raise plenty of questions about that assumption, as do file systems, Caché, Viper and the rest. What companies need are storage mechanisms that are fit for purpose, and those may vary widely depending on the data and the application: if we are seeing a new open-mindedness, then that can only be a good thing.

Copyright © 2006, IT-Analysis.com

