And so to the Reg Library to unfurl some big IT blueprints for big organisations. Without further ado we kick off with Secure Computing, which knows how to get our attention - it cites a Reg article in its paper.
Here's our contribution:
In Australia a disgruntled individual used vulnerabilities in a wireless network to hack into a water/wastewater utility over and over again. The first 20 successful intrusions were viewed as mechanical or electrical problems with the utility network and/or associated field devices. Even after it was recognized as a cyber intrusion, the utility was unable to stop the attacks. Ultimately the hacker was able to stop pumps from operating, block alarms, and halt communications between the central computers and various pumping stations. On the 45th attack, he was able to cause a pumping station to overflow, causing raw sewage to pollute a residential neighborhood and tidal canal. Lack of good forensic technologies prevented the company from blocking his attacks for over 2 months. (Source: The Register)
This paper is for people concerned with security in utilities, banks, emergency services, process manufacturing and other things we have forgotten. The thesis from Secure Computing is that today's critical networks are rarely isolated from corporate networks. There are good operational and cost-saving reasons for this. Today, companies need to gather data for regulatory audits. They also need to provide remote access for employees, contractors, industry integrators, and vendors. In addition, since many of these systems are interconnected, employees from third parties often require access to determine available capacity. But the downside is that critical systems now face "the same security concerns that have plagued IT administrators for years. Viruses, Trojans, worms, and malware are just the tip of the iceberg".
The paper lives up to its title - it's a good read, and the vendor pitch is very soft-shoe shuffle.
This is a monster of a book sponsored by HP - the table of contents alone runs to 10 pages. Co-author Don Jones, of Realtime.com, says in his foreword that the title is by no means a "paid advertisement or white paper". And he's right, there.
There is an exhaustive checklist to plough through, but underpinning everything is the notion that the guide will "help" IT departments to perform more tasks with fewer resources. The importance and value of automating standard IT operations can be significant in data centers of any size. The goal is to significantly lower IT operational expenses while at the same time improving the end-user experience. "Whether you're a CIO or IT manager looking for ways to improve efficiency or a member of the down-in-the-trenches IT staff, you'll find valuable concepts and methods."
Not having run a data centre lately, we can't say if the guide lives up to the lofty goals of its authors. But so far, it has earned good feedback from Reg readers.
In every cloud there is a silver lining. Especially where IT security companies are concerned. Cloud Computing, and its closely related siblings Web 2.0 and Software-as-a-Service, offer new back doors for hackers, and new things for security vendors to frighten us with.
So it may seem counter-intuitive that Trend Micro is inviting customers to offload some of their malware protection on to its cloud-based hosted security service. The aim is to reduce endpoint resource consumption, network bandwidth consumption, malware, and overall risk, according to this short paper from IDC, which cautiously welcomes Trend's hybrid approach.
"Unwanted content (malware, spam, bad web sites, unproductive information, etc.) never hits the customer's gateway. Therefore, the customer is not responsible for content it never received. This reduces risk, cuts bandwidth requirements, lowers archiving requirement, and limits eDiscovery activities." ®