Everybody's security-focused these days. So, what's wrong with Security? Well, possibly the word "security" itself and the siloed thinking it encourages. And also, I'm afraid, sometimes the people in the security profession, who often appear to think that security is an end in itself.
I was once paid, on the basis of my having technical knowledge of both the mainframe and PC platforms, to stop the company security officer encrypting all our PC hard disks, because the IT group thought that this would soon stop anybody getting any work done. That says something about the "maturity" of that company, doesn't it? One department trying to stop another department doing what it conceives as its job....
Well, I lost that battle in part, and the consequences were that some business departments (those that thought themselves so important as to need encryption-protected hard disks) got very little extra security (I knew someone who could overwrite the encrypted system areas of a hard disk with the unencrypted data, in hex and from memory). The encrypted disks were a constant maintenance overhead, and their users frequently lost access to business systems for significant periods (there were people who could break the security easily, though they weren't, we hoped, mostly employed as bankers).
That was many years ago and hard disk encryption wasn't such a bad idea—but the technology and processing power available, and the processes and maturity of the company, weren't adequate for implementing the idea effectively (strong encryption of everything on the hard disk would have been too much of an overhead). The balance between the needs of the business and implementation cost (including maintenance) was what made it a bad idea—then. These days? Well, you still need to balance the needs of business against the cost of security, but the answers may be different.
I actually don't believe in "security" as such. I believe in "business governance", which (of course) includes providing the appropriate levels of Confidentiality, Integrity and Availability needed for the business to operate. But metrics for Confidentiality and Availability are "stretch metrics"; improving the Confidentiality of a resource can impact its Availability to the business and its use for generating business benefit.
So, what does this mean for "security"? Well, first, stop thinking in silos. I was once at a testing conference, talking with people about penetration testing—and no, it wasn't a "security" conference. Some people told me that penetration testing was something Security worried about and that it was all about access to and compromise of the physical hardware and network wiring; and that social engineering attacks were something the application developers should worry about. Others said that penetration testing included social engineering at the application level and any and all compromise of the business systems by the unwashed outside, and that it was all part of Security's remit. And what about unauthorised penetration of business systems by insiders? I have no idea which approach is "right", but I do know that all these issues are important and there is apparently a danger that some or all of them will fall through the cracks—that everybody thinks "someone else is looking after that".
I once met someone who implemented a security technology for a telecomms company and was prevailed upon (against his better judgement) to test his own work. So he put on a white coat, picked up a test set, went into the company and asked people for the passwords to his new technology. And, since he was obviously tech support, people gave him the passwords, and he could then access information that he wasn't supposed to see. So he wrote a report saying, in effect: "Your new technology is fine, but you have serious process/people problems around it; and here's something I shouldn't be able to know, as evidence." His employers were very angry and accused him of cheating. Well, criminals cheat, I'm afraid....
You have to think of security in holistic terms and in terms of business outcomes. If personal information mustn't be compromised because the Data Protection Act (DPA) says that it mustn't, then it doesn't matter much whether someone sniffs it off a PC electronically (where it is in clear because someone is reading it), reads it off the PC screen with a telephoto lens, blackmails an employee into giving it to them—or finds it, in clear, on a laptop left on the train. And, just perhaps, if the cost of paying the fine for non-compliance is less than the cost of implementing DPA compliance, that makes business sense (although only if you've factored in costs associated with reputation risk if you're found out, and the possible impact of the obvious lack of corporate ethics this entails on employee behaviour generally).
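The fine-versus-compliance trade-off above is just expected-value arithmetic. As a minimal sketch, with every figure and probability invented purely for illustration (none come from the article or from any real enforcement data):

```python
# Illustrative only: all numbers below are hypothetical assumptions.
# The idea: skipping compliance isn't "free"; its expected cost is the
# fine weighted by the chance of enforcement, plus reputational damage
# weighted by the chance the lapse becomes public.

def expected_noncompliance_cost(fine, p_caught, reputation_cost, p_public):
    """Expected cost of skipping compliance work."""
    return p_caught * fine + p_public * reputation_cost

compliance_cost = 250_000  # hypothetical cost of implementing DPA compliance

cost_if_skipped = expected_noncompliance_cost(
    fine=500_000,             # hypothetical regulatory fine
    p_caught=0.2,             # guessed probability of enforcement
    reputation_cost=2_000_000,  # guessed cost of reputational damage
    p_public=0.1,             # guessed probability the lapse becomes public
)

# 0.2 * 500_000 + 0.1 * 2_000_000 = 300_000, so with these made-up
# numbers, implementing compliance is the cheaper bet.
print(cost_if_skipped > compliance_cost)
```

The point of the sketch is the article's caveat: leave out the reputation term and the comparison can flip, which is exactly the trap of costing only the fine.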
In fact, perhaps you shouldn't think of security at all, but think instead about the regulatory, social and business-confidentiality requirements necessary for a new automated service to deliver business benefit—as part of designing and implementing that service. You should design in good "governance" (Little Oxford Dictionary: action, manner, power etc. of governing; sway, control), which includes what we usually call "security", from the start.
Of course, this doesn't mean that it's a good idea to implement this aspect of design from scratch, for every business system. A lot of these design requirements for governance are common to all systems, and implementing good security policies with something like ISO 27000 might make a lot of sense—not because this will implement "security" but because it will facilitate the resilient delivery of desirable business outcomes.
Then, once you know what your business requirements are in this area, you may find that you can buy technology off the shelf which helps you to implement them—and perhaps you need security professionals to help you choose it. However, the acquisition of security technology should always be driven by holistic business needs.
Seeing it from this point of view brings other benefits. If a security technology is monitoring users for unacceptable business behaviours (such as stealing money or information in transit), it can also monitor end-user experience at the same time and identify problems with the use of technology. It can even provide feedback that will help service developers build more resilient, less fraud-prone, or easier-to-use automated systems. Security stops being just a cost of avoiding bad things and starts to provide proactive benefits. It makes risks visible for proactive management and thus encourages safe (risk-managed) business innovation.
So, is this all just "pie in the sky"? Well, I have been talking with Richard Walters, CTO of Overtis; and its Vigilance Pro solution does appear to support a holistic approach, from physical security (e.g. integration with door locking and card entry systems, so you can be notified if a system that is supposed to be operated on premises is active when its user isn't in the building) all the way through to monitoring user behaviours at the application level. It only takes one available technology implementation to make the sort of things I've been talking about feasible. Nevertheless, the availability of technology doesn't, by itself, drive the improvement of process and culture. Most people are still thinking in terms of a security "silo" and addressing particular, very specific risks (often just those which have hit the papers recently). They are not thinking in terms of the governance and resilience of automated business systems, and of ensuring that business automation operates in accord with business strategy/policy and delivers business benefit—and nothing else.
People really don't automate business systems in order to expedite the theft of their information or money, so adequate governance of automated systems is generally an important, if often overlooked, non-functional requirement. So, "business-focussed security designed in from the first" sounds like a really good thing—but what does this mean in practice? It is primarily to do with good governance, with addressing people and process issues, not with buying technology; and thinking purely in technology silos just doesn't help.