IT Security Analyst & CISO Forum 2015


The IT Security Analyst & CISO Forum 2015 was a chance for CISOs, analysts, vendors and security professionals to get together and discuss what they are really worried about – under the Chatham House Rule (that is, no attribution – which means that you’ll largely get my interpretations of what was said here).

Perhaps the key message was the importance of “layer 8 security” – the people issue. This was confirmed by a professional social hacker who claimed (very plausibly) to be able to get past just about any security, simply by asking to be let in, in the right way. In other words, she understands and exploits the psychology of company employees. Often, she is targeting and befriending disaffected employees (and how many large corporations generate such people as a matter of course – if company behaviour is routinely unethical, then disaffected or dysfunctional employees are almost inevitable). More often, she and her friends are exploiting what we think of as the better qualities of human nature – the desire of help desk personnel to be helpful (the clue is in the name), for example; or our desire to be kind to the disadvantaged and pleasantly incompetent. A very successful technique is to behave like Michael Crawford of “Ooh Betty” fame – sooner or later someone will tell you which server to look at, what the login process is, even the password you’ve forgotten, if you act like a helpless child…

And therein, I think, lies a huge risk – and possibly not the one you are thinking about. If you get too paranoid, you lose sales, because you’ll make the user experience for the 95% who aren’t out to cheat you so horrid that they’ll go somewhere else. And you’ll probably reduce staff morale to the point where you create opportunities for the 5% who are out to cheat you – and who’ll put up with any user experience, however bad, in the “game” of getting past our defences. It seems to me that the defence against social engineering is largely good company ethics, good staff relations and good security awareness training – plus a properly thought-out (and well-trained) customer interface that is polite and helpful, but not credulous. Once you’ve achieved that, technology may be able to assist you.

One area where technology can help is with the privileged user problem. The best way for a serious money-making attack to work is via a privileged user with the authority to override controls and intimidate minions. There’s an issue here, because suggesting that a top manager or the head of IT (or their, probably feral, children) could be a route for attack is severely career-limiting in most organisations. Technology allows you to institutionalise appropriate policies and controls without getting personal about it – and once the privileged users have signed up to them, they can be enforced at stage two: “No sir, I’m not at all worried that you, of all people, are a risk – but you did sign up to these procedures, and perhaps they should apply to you too, as otherwise it sends the wrong message to the people we’re really worried about; it’s not me doing this, it’s just the security tool you bought…”
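By way of illustration (and not anything a vendor showed at the forum), here is a minimal sketch of the kind of control a tool can institutionalise: a two-person rule applied uniformly, with no exemption for seniority, and every decision logged. The roles, action names and workflow are hypothetical, and Python is used purely for brevity.

    import logging
    from dataclasses import dataclass
    from typing import Optional

    logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")
    log = logging.getLogger("privileged-actions")

    @dataclass
    class Request:
        actor: str                      # who is asking, e.g. "head_of_it" - role is irrelevant to the rule
        action: str                     # e.g. "export_customer_db" (hypothetical action name)
        approver: Optional[str] = None  # an independent second pair of eyes

    def authorise(req: Request) -> bool:
        """Apply the same two-person rule to every actor, however senior, and log the decision."""
        if req.approver is None or req.approver == req.actor:
            log.info("DENIED %s requested by %s (no independent approval)", req.action, req.actor)
            return False
        log.info("ALLOWED %s requested by %s, approved by %s", req.action, req.actor, req.approver)
        return True

    # The CEO's request is treated exactly like anyone else's.
    authorise(Request(actor="ceo", action="export_customer_db"))                   # denied
    authorise(Request(actor="ceo", action="export_customer_db", approver="ciso"))  # allowed

The point of putting the rule in a tool is exactly the one above: the denial comes from the policy everyone signed up to, not from an individual who can be leaned on.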

Another area that got lots of discussion was BYOD (“bring your own device”) and “shadow IT” generally. It was refreshing to find a CISO who thought that BYOD had been a fact of life for years and that shadow IT was a good thing – it gets the business automated, without being a burden (that is, a cost) on IT. What’s not to like? Well, there are risks, of course, but they can be addressed by providing easy access to IT mentors when appropriate and by security awareness training for the “citizen developer”. Once you explain the possible personal and company consequences of, say, misusing personal data in shadow IT, the citizen developer won’t want to do it any more than the compliance officer would want them to. It’s simply a matter of balancing Freedom and Trust (both strongly emerging trends today, although I can remember managing end-user computing in the way I’m suggesting back in the days of Easytrieve on the mainframe, some 25 years ago) – in the interests of better business and employee welfare.

So far, as I intimated, it’s been largely people issues, but there are real technology issues too. We all use software certificates to support Trust. It’s how we know that when Patch Tuesday kills our PC, at least the updates really did come from Microsoft. But do we? Are we sure that Microsoft always manages its certificates well? There is evidence that it hasn’t always done so in the past. How well do we manage our own certificates? Do we know where they all are, whether they are up to date, whether they are in the right place, whether they are even used? Now, those are real governance issues and ones that software tools can help with.
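As an illustration of the sort of housekeeping such tools automate, here is a minimal sketch, using only Python’s standard library, that checks a list of TLS endpoints and flags certificates approaching expiry. The hostnames and the 30-day threshold are illustrative assumptions; a real certificate-management tool would also handle discovery, internal CAs, key storage and usage reporting.

    import socket
    import ssl
    from datetime import datetime, timezone

    ENDPOINTS = ["www.example.com"]   # hypothetical inventory of hosts to check
    WARN_DAYS = 30                    # illustrative warning threshold

    def cert_expiry(host, port=443):
        """Fetch the server's certificate and return its expiry time as a UTC datetime."""
        ctx = ssl.create_default_context()
        with socket.create_connection((host, port), timeout=5) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                cert = tls.getpeercert()  # validated against the default trust store
        return datetime.fromtimestamp(ssl.cert_time_to_seconds(cert["notAfter"]), tz=timezone.utc)

    for host in ENDPOINTS:
        try:
            expiry = cert_expiry(host)
            days_left = (expiry - datetime.now(timezone.utc)).days
            status = "WARN" if days_left < WARN_DAYS else "OK"
            print(f"{status}: {host} certificate expires {expiry:%Y-%m-%d} ({days_left} days left)")
        except (ssl.SSLError, OSError) as exc:
            print(f"ERROR: {host}: {exc}")

Even something this simple answers the “are they up to date?” question for the certificates you know about; the harder governance problem is finding the ones you don’t.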

Tools have a huge place in security: to institutionalise and enforce good practice, and to manage the complexity that, unmanaged, is the enemy of good security. But get the people and processes more or less right, or at least well understood, first, before you invest in tools (although you should probably expect to iterate around this in a continuous improvement process).