SIEM, UBA, UEBA... If you're suffering netsec acronym overload, then here's our handy guide

Is there a difference and does it matter?


Comment In a little more than 20 years, what quaintly used to be called "network security" has gone from simple firewalling and VLANs to talk of analytics driven by self-learning machine intelligence and AI. How should we make sense of such a dramatic jump?

The engine of change has been cybercrime and its remarkable ability to render the advances of successive generations of security technology obsolete almost as quickly as they have established themselves. Firewalls were the first to be challenged, overwhelmed by the sheer size and complexity of what they were being asked to monitor. Intrusion detection systems (IDS), intrusion prevention systems (IPS), and various layers of endpoint and device security, once seen as a clever way to make threat-hunting more focused, soon followed.

Ironically, what caused the problem was not that firewalls, IDS, IPS, and endpoint software couldn’t detect cyber attacks, but that the alerts they produced started to overwhelm the defenders with data. With volumes rising, and newer generations of kit adding the monitoring of accounts, applications, privileges, and users to the mix, vendors rushed to solve the problem with expert systems that could present this mass of event data in a digestible manner, allowing humans to separate the good from the bad.

SIEM’s ascent

Throughout the 2000s, security information management (SIM), security event management (SEM), and their later combination in security information and event management (SIEM) were the logical answer to data overload. Instead of managing log data using proprietary systems, SIEM made it possible to use a single overview into which many sources could be gathered and processed for correlations. Although SIEM has become a helpmate few would be without – security operations centers (SOCs) and centralized security monitoring would struggle without SIEM to perform log monitoring – it’s always been about totting up the pros and cons:

  • The data from diverse security events can be correlated in a way that would be somewhere between time consuming and impossible using logs from individual systems. If something suspicious is detected, security systems can be instructed to block it.
  • Anomalies – defined as violations of policy – stand out clearly, both in real time and historically.
  • Most SIEMs come with reporting interfaces to help with compliance and auditing, for example for PCI DSS and HIPAA.

SIEMs also have limitations, starting with the fact that they are only as good as the log events being fed to them and the way these are correlated using rules to generate alerts that point to meaningful events. In principle, good rules should spot unusual events. However, doing so without swamping teams with false positives turns out to be trickier than the marketing brochures let on.

Obviously, building and maintaining these rules (or adjusting the templates that come with the SIEM) is complex, as is the process of normalizing log input to take account of the different data formats used by monitoring systems. Given that these formats, and the inputs captured in logs, change and expand over time, this can become onerous.
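To make the rule-building idea concrete, here is a minimal sketch of the two steps described above: normalizing raw log records into a common schema, then running a correlation rule over them. Everything here is illustrative – the field names, thresholds, and the "failed logins followed by a success" rule are assumptions, not any particular vendor's implementation.

```python
from collections import defaultdict, deque

WINDOW_SECS = 300   # correlate events within a 5-minute window (illustrative)
FAIL_THRESHOLD = 5  # alert after 5 failures then a success (illustrative)

def normalize(raw):
    """Map one raw, source-specific log record into a common schema."""
    return {
        "ts": raw["timestamp"],
        "src_ip": raw.get("client", "unknown"),
        "action": "fail" if "FAILED" in raw["msg"] else "success",
    }

def correlate(events):
    """Rule: flag a burst of failed logins followed by a success, per source IP."""
    alerts = []
    fails = defaultdict(deque)  # src_ip -> timestamps of recent failures
    for ev in sorted(events, key=lambda e: e["ts"]):
        recent = fails[ev["src_ip"]]
        # drop failures that have aged out of the correlation window
        while recent and ev["ts"] - recent[0] > WINDOW_SECS:
            recent.popleft()
        if ev["action"] == "fail":
            recent.append(ev["ts"])
        elif len(recent) >= FAIL_THRESHOLD:
            alerts.append((ev["src_ip"], ev["ts"]))
            recent.clear()
    return alerts
```

Even in this toy form, the maintenance burden is visible: every new log source needs its own `normalize` mapping, and every threshold is a trade-off between missed attacks and false positives.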

UBA and the return of the user

The growth of data being fed into SIEMs, the complexity of its management, and doubts about their ability to detect attacks in real time without overwhelming security teams with alerts eventually led to a hunt for a new silver bullet. Interest in what Gartner calls user behavior analytics (UBA) followed.

This highlighted the concern that crunching data from network detectors was a losing game when the most important dimension had always been the user. UBA’s innovation was to dispense with the guesswork about what log events meant, or how they could be related to one another, and instead establish a baseline state for users and their credentials. If a user event breached this known state – and the user in question could be internal as well as external – this would generate an alert.
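The baseline-and-deviation idea can be sketched very simply: learn each user's normal range for some behavioral metric, then flag observations far outside it. This is a minimal illustration, not a real UBA engine – the metric (bytes transferred per day) and the three-sigma cutoff are assumptions for the example.

```python
import statistics

def build_baseline(history):
    """Per-user mean and standard deviation of a behavioral metric
    (here: bytes transferred per day -- the metric is illustrative)."""
    return {
        user: (statistics.mean(samples), statistics.pstdev(samples))
        for user, samples in history.items()
    }

def is_anomalous(baseline, user, value, sigma=3.0):
    """Flag an observation more than `sigma` deviations from the user's norm."""
    if user not in baseline:
        return True  # unknown user or credential: no baseline to compare against
    mean, stdev = baseline[user]
    if stdev == 0:
        return value != mean
    return abs(value - mean) > sigma * stdev
```

The appeal over hand-written correlation rules is clear: the baseline is learned from the data rather than specified up front, so it adapts as each user's normal behavior shifts.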

Newer generations of firewall began adding user management to their controls in the mid-2000s, but for organizations investing in SOCs and SIEM it always made more sense to unify this under a system above any individual monitoring device. In some cases, UBA was implemented as an extension to SIEM, or as a capability within the SIEM, as vendors sought to develop their products to suit the market.

