Data governance definitions

Mixed messages


I am getting mixed messages about data governance. IBM recently published the results of a survey into the use of data governance that it conducted in conjunction with the NCC. Out of 141 respondents, from companies of all sizes, only seven per cent reported that data governance was neither implemented nor on the planning horizon. Of the remaining 93 per cent, more than half had some sort of (at least initial) process in place. Now, admittedly this was a self-selecting set of respondents, but this does not fit with my experience.

As a counterexample, I asked SAS (and bear in mind that SAS owns DataFlux, the data quality vendor) last summer about demand for data governance across Europe. "None" was the answer that I was unequivocally given. Now, it could be that I was speaking to the wrong person, but bearing in mind that this was a special analyst conference put on by SAS EMEA, that is unlikely: they don't put people in front of analysts who don't know what they are talking about.

Aside: SAS EMEA is now defunct (almost immediately following the retirement of Art Monk, to which there is [ahem!] obviously no link) and, judging from the recent surge of SAS contacts in my LinkedIn network, there are probably some good technical and marketing people looking for new opportunities.

To return to the main point, either something has radically changed in the last six months or we are not talking about the same things. I suspect it is mainly the latter, which means we have a definitional issue. It may also be that the UK is ahead of the rest of Europe in this area though we are, I think, some way behind the States.

In so far as definitions are concerned, there are two distinctions to draw: first, compliance versus governance; and second, data governance versus data quality.

As far as I am concerned, data governance is more than either compliance or data quality. In essence it is about ensuring that data is available, secure, audited, traceable and fit for purpose. The need for data quality procedures is implicit in that definition, while compliance is a natural consequence of it. Note, too, that I do not believe that compliance is simply about the fulfilment of whatever legislation or regulatory framework is in place. It is in fact about meeting best practices, and while these are sometimes imposed externally, they may also be established by the business. Indeed, it is reasonable to assert that the only reason why we have Sarbanes-Oxley, Basel II, MiFID and the rest is because business has failed to keep its house in order: external legislation, in this regard, is simply the imposition of best practices that businesses have failed to implement for themselves.

Data governance then, or indeed any form of governance, is about establishing the policies and procedures that the company should adhere to, while compliance is the process of ensuring that those best practices are followed. A company "doing" data governance therefore has a body established to define those policies and procedures, with enough power to ensure that they get enacted, and mechanisms in place to ensure that these are fulfilled: data quality software, dashboards and so forth are merely tools (if important ones) that enable this. My suspicion is that far fewer than 93 per cent of organisations are actively engaged in data governance in this sense.

Copyright © 2007, IT-Analysis.com
