FireMon conducted a survey at Infosec Europe this year and apparently discovered that cybersecurity professionals manage to be both overworked and underutilized. That's a feeling I've been familiar with myself from time to time over the years, but I was a little surprised that the article cited above was entitled 'Tired of fighting fires instead of doing real security work?'. As someone who has enjoyed (well, sometimes) several decades in firefighting and security, I'm not altogether convinced that firefighting and 'real' security are separate issues.
Even before I moved away from general IT support and started to prioritize systems and security, I remember all too well how a project could be interrupted by firefighting. Sometimes it was a critical issue like a virus outbreak or a major systems malfunction; sometimes it was something as 'urgent' as a VSIP (Very Self-Important Person) having to walk to a printer just down the corridor because the one in his or her office was displaying an error message. Or even a scheduled turn on the helpdesk, which was often where the first whiff of virtual smoke was reported.
That said, it may be that I was more than usually bothered by these issues because my increasing specialization meant that I spent more time on desk-bound projects whose importance was not always universally recognized, and which didn't lend themselves well to in-house collaboration.
That's a scenario that's more common in relatively small organizations with correspondingly small IT departments than in the sort of hierarchical, bureaucratic environment to which I moved subsequently. There, formal project management processes, in line with predetermined policies and global strategies, and the outsourcing of major functions (all subject to political oversight) were the order of the day. Even there, though, major security-related firefighting was a frequent occurrence. You might be surprised at how much damage a worm or even a hoax email can do in an organization where staff numbers run well into seven figures.
But as technology has evolved, so has support. When I was working for that small organization where I learned my trade as a security geek, it was so locked into the firefighting model that it considered forcing support engineers to hot-desk, on the assumption that engineers would be too busy installing cables and network cards to sit around in the office doing things like documentation or coding. Nowadays, I gather that its support team rarely leaves the office to fight fires, since most problems can be fixed remotely, and many issues will be flagged by sensors in the NOC (Network Operations Centre).
Nevertheless, I suspect that a fair proportion of the fires the team does go out to fight are security-related – and security-related in a far wider sense than in my day. When I was there, a very high proportion of incidents were virus-related, if you discount routine but high-volume trouble-tickets such as password resets.
While viruses have generally shrunk in impact, malware in a broader sense has increased dramatically in volume since the 1990s. Furthermore, it remains the weapon of choice for many criminals – as witnessed by the current boom in ransomware – even though malware and anti-malware technologies have diversified and increased in sophistication. And while an effective multi-layering strategy will reduce the impact of this firestorm of malware, far too many sites are still afflicted by malware – including ransomware – to dismiss dealing with it as not being 'meaningful security work'. After all, it's also become much more common for staff to use their own devices for work, both in the office and remotely, and personal devices can be much more difficult to police and protect.
While the FireMon article attributes part of the blame for this 'frustrating and dangerous state of affairs' to network complexity, the same technology that creates that complexity also enables easier flagging and mitigation of system and networking issues. But the article also contends that the necessity of 'chasing' regulatory compliance is taking time that could be spent on more 'meaningful' work – a hidden opportunity cost, in other words – by adding technology that survey respondents consider has no benefit apart from meeting compliance regulations. It even claims that respondents are 'compromising security posture in order to meet business demands'. And if security professionals are really cheating on audits, it's no wonder that SC Magazine sees it as an ethical dilemma.
I'd have to agree that regulatory compliance and bread-and-butter security are not the same thing. But by that, I don't just mean that some standards are more effective in promoting security than others.
Since FireMon drew its conclusions from responses to a survey that I haven't seen, I don't know which standards, regulations and audits are being referred to, but my own experiences may nevertheless be of some relevance.
A decade or so ago, I qualified as a lead auditor for BS7799, the precursor to ISO/IEC 27001. I found it quite a difficult exercise, because even though this series of standards is about implementing an Information Security Management System (ISMS), it's not about providing direct advice on an organization's security, but about establishing a framework within which the organization can meet its own objectives. (I nearly said 'own goals', but perhaps soccer fans would find that an unfortunate turn of phrase.)
While the standard includes high-level information about security controls, it's more to do with implementing a process for choosing and defining controls than it is with nitty-gritty implementation. The essential goals of an audit are not to set security controls directly, but to help the organization:
- To fulfill contractual obligations
- To conform to regulations
- To gain an accreditation
It's for the organization to decide how to interpret the standard in order to set and meet the objectives most applicable to its own situation, and that's not just a security issue; it's a business issue. Auditing is about establishing conformance, not about acting as a security consultant. That doesn't mean that an auditor needn't have in-depth security knowledge: without it, you can't establish conformance. But the skills required to conduct a successful audit – even an internal audit – are not identical to those required to implement the best possible security.
I don't know what standards and regulations these security professionals were faced with auditing – probably a wide range – but it may be that one of the problems here is misunderstanding on several levels:
- Misunderstanding of the relationship between business and security. An insecure business is in bad shape, but an organization that's totally secure isn't necessarily in good shape to conduct its business. Security is a critical business process, but it isn't the only business process.
- Misunderstanding of the difference between the security professional's role and the auditor's. Good security professionals and good security auditors don't necessarily have identical skills: if an organization has people who don't feel they can fulfill both roles comfortably, something is wrong – not necessarily with them as skilled individuals, but with the way they're being used. But that doesn't mean that compliance doesn't matter.
There are no perfect standards. But in general, they're there for a reason. An organization that aims to conform because it has to, but doesn't understand the reasons for conformance, is not getting it right. And if that lack of comprehension is reflected in the work of the security team, something is going seriously wrong.