Nothing gets my goat as badly as the “risk-based security” talk that has been suffocating discussions lately. It is so pervasive and so obnoxiously delivered that you end up wondering whether the authors of the term even realise how poorly they understand information security, risk management, and the organisation they support. The Dunning-Kruger effect in action if there ever was one.

### Decisions, decisions

To explain what I mean, let’s look at it from the top down:

1. Everything starts with objectives: the things you want to achieve, the stuff you want to do.
This is really just a short brain dump of the basics to get started thinking about information warfare in a non-US way. Yes, that means the Russian, Chinese, South African, Australian, etc. approaches. It may come as a surprise to many, but information warfare has always been researched more, and better, by those who do not commandeer the world’s biggest military. First of all we need to start with proper definitions of data, information, and knowledge. The typical definition is that data magically transforms into information and that an assemblage of information turns into knowledge.
“Issues are risks that have already occurred” is the standard view of the difference between what is popularly called a “risk” and an “issue”. But that’s a superficial distinction that ignores a major attribute of “risk”: objectives. I had a recent discussion with a risk professional that went something like this: Me: “I need to add ‘X’ to the risk register. We’re a month out and ‘X’ is nonexistent.” RM: “Wait, it’s not done? We don’t have it?
“… and of course we have set up transitive trust between the different sources of identity,” he said with glee. That’s when I knew for certain I was not speaking with someone who had a long and successful track record of setting up identity management in an enterprise environment. Information security practitioners will often roll out the old and tired “transitive trust is bad” adage. Unfortunately the line is parroted too often without an understanding of why transitive trust is bad.

### Trust and related terms
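To make the adage concrete, here is a minimal sketch (all identity-source names are hypothetical) of how chaining trust relationships silently expands who an organisation effectively trusts:

```python
# Direct trust relationships between identity sources.
# Names are illustrative only, not from any real environment.
direct_trust = {
    "corp-ad": {"partner-ldap"},          # corporate AD trusts the partner LDAP
    "partner-ldap": {"contractor-idp"},   # partner LDAP trusts a contractor IdP
    "contractor-idp": set(),
}

def effective_trust(source, graph):
    """Return every identity source that 'source' ends up trusting
    once trust is treated as transitive (graph reachability)."""
    seen, stack = set(), list(graph.get(source, ()))
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(graph.get(node, ()))
    return seen

# corp-ad only vetted partner-ldap, yet transitively it now also
# trusts contractor-idp -- a party it never assessed.
print(sorted(effective_trust("corp-ad", direct_trust)))
```

The point of the sketch: the trusted set is the transitive closure of the trust graph, so every party your partners add quietly becomes a party you trust too.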
Often I see the terms “framework”, “strategy” and “process map” used for a variety of documents that typically are none of the three, or a mix of all three. Here’s a quick and easy way to see how they fit together and how to tell them apart: Framework is the skeleton. Strategy is the nervous system. Processes are the musculature. On their own they are all useless. No Framework, or a poor one: your Strategy cannot stand, and your Processes cannot effect any change. No effective and well-defined Strategy: your Framework may break under the pressure your Processes exert.
Sony Pictures’ information security team, small as it is, is in the crosshairs of all and sundry after the recent breach of significant proportions. As is typical for information security, once a victim is found the ritual and merciless victim bashing can begin. What most of these pieces forget is that the issues highlighted at Sony Pictures are present, if not prevalent, in the majority of large organisations.

### Kick them when they’re down

This scenario plays out time and again: a large organisation is in the news for industry-average information security practices.
“Companies have to get security right every time – an attacker only has to get it right once.” This is probably one of the biggest lies that information security tells on a frequent basis, partially to get more money for ineffective security technologies and partially to maintain the illusion that perfect, long-term security is possible.

#### Multiple levels of defence have to fail

In truth, companies have to get it wrong at the prevention, detection, and response levels a number of times before a breach does any considerable damage.
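The arithmetic behind that claim can be sketched. Assuming, purely for illustration, that the layers fail independently and using made-up failure probabilities, a damaging breach requires prevention, detection, and response to all fail:

```python
# Hedged illustration with invented numbers, not measured data.
# If the layers are modelled as independent, a damaging breach needs
# ALL of them to fail, so the probabilities multiply.
p_prevention_fails = 0.30
p_detection_fails = 0.20
p_response_fails = 0.25

p_damaging_breach = p_prevention_fails * p_detection_fails * p_response_fails
print(f"{p_damaging_breach:.3f}")  # prints 0.015 -- far below any single layer
```

Real layers are not independent, of course, but the multiplicative shape of the argument is why the attacker’s “only once” framing is misleading.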
Risk-related books are a dime a dozen nowadays. Many rehash the stuff that was new and hot a couple of decades ago, fewer keep up with the industry’s maturation, and fewer still apply academic research to the industry. Here’s a short list of books that I’ve read in the past and re-read now, either for reference, for a new appreciation of the depths I missed before, or to see if they’re still current.
The Dunning-Kruger effect is a cognitive bias, an illusion of competence, that presents itself in two ways: one, the severely incompetent do not recognise their own incompetence, nor do they recognise competence in others, and assume they’re far better than they really are; two, the highly competent assume that others are at a similar level of competency and/or that the test, knowledge, etc. is easy to come by and that they’re nothing special. Frequently, the Dunning-Kruger effect for the competent comes with a partner, Impostor Syndrome.
In Part 1 we looked at the deterrence quality of security controls. It’s one of the three attributes of security controls that are often ignored, sometimes consciously but more often out of ignorance. Now we will look at another attribute that is too often neglected: awareness. Typically, when discussing security awareness the immediate mental image is of mandatory courses, presentations, and drab, unimaginative posters around the workplace. What this post talks about is information security situational awareness: what is happening, where, why, and who is involved.