And there wasn’t much of it in Operation 8.
A wise old man was sitting outside his village. A traveller asked him, “What kind of people live in this village? I want to move here from my own village.” The wise one asked, “What kind of people live in your village?” The man said, “They are mean, cruel and rude.” The wise man replied, “The same kind of people live in this village too.” After some time another traveller came by and asked the same question, and the wise man asked him, “What kind of people live where you come from?” And the traveller replied, “The people are very kind, courteous, polite and good.” The wise man said, “You will find the same kind of people here too.”
It is also said in the Talmud, the central text of Rabbinic Judaism:
“We see things not as they are but as we are”
The ancients, without the benefit of the modern science of cognitive psychology, understood the human mind and its propensity to see the world as a reflection of itself and to build the narratives it wants to believe. Unfortunately there were no wise men, Jewish scholars or tohunga Maori on the Operation 8 team. Or any sort of scholar for that matter.
Throughout this series I have maintained that Operation 8 was a police cock-up. The reasons for that are partly the ignorance, racism and paranoia endemic still in the NZ Police, and partly just plain old incompetence. The incompetence in the intelligence function of the police prior to the appointment of intelligence professional Mark Evans was the result of a lack of a professional intelligence framework and training, and consequently a lack of intellectual ability. In other words Operation 8 was a dumb operation. The available audit trail clearly shows that to be the case.
“The steps in converting information to intelligence are largely intellectual. To aid the mind, various checks, procedures and processing tools exist, and these in turn help ensure the systematic exploitation and detailed scrutiny of information and provide an audit trail of the intellectual journey”.
– Lance Collins & Warren Reid, “Plunging Point, Intelligence Failures, Cover-ups and Consequences”, Fourth Estate, Australia, 2005.
Intelligence analysis is an intellectual activity. It requires the application of an educated mind to see things as they are, not as we think they are. It is not an activity suited to the mind of the policeman trained only in the investigative techniques of detection of crime after the event. That is the work of detectives.
“… intelligence analysis today continues to be a human practice dependent on the intellectual capacity of individual analysts, notwithstanding the increasing role of technology in the intelligence domain”.
“Intelligence analysis is arguably a critical part of national security as well as law enforcement function, but is dependent upon the intellectual capacity of individual analysts”.
– Corkill, Jeff, “Not Art, Not Science, but Artistry: Why professional artistry should matter to the intelligence community”, in The Journal of the Australian Institute of Professional Intelligence Officers, Volume 19, Number 1, 2011.
The work of the intelligence analyst:
- Is the work of prediction based on an assessment of the probability of future events;
- Is to gather and analyse information bearing on possible future events;
- Is to draw tentative conclusions and build possible narratives or scenarios based on that analysis, and to test and evaluate all of those narratives and scenarios in order to propose the most likely scenario or scenarios. If there is insufficient information or evidence to confirm or to eliminate a scenario, more evidence should be sought;
- Should always be tested and evaluated by senior analysts not involved in the analytical groundwork leading to the conclusions upon which the narratives and scenarios are based.
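The process described above can be sketched in code. The following is a toy Python illustration in the spirit of an analysis-of-competing-hypotheses (ACH) matrix; the scenarios, evidence items and consistency ratings are all invented for the example, not drawn from Operation 8 itself.

```python
# Toy analysis-of-competing-hypotheses (ACH) scorer (illustrative only).
# Every piece of evidence is rated against EVERY scenario:
# +1 consistent, 0 neutral, -1 inconsistent.

scenarios = [
    "terrorist plot",
    "activist training camp",
    "bush survival wananga",
]

# (evidence, consistency rating per scenario) -- invented values
evidence = [
    ("firearms present",    [+1, +1, +1]),
    ("camouflage clothing", [+1, +1, +1]),
    ("no target, no date",  [-1,  0,  0]),
]

def score(scenarios, evidence):
    """Sum each scenario's consistency ratings across all evidence."""
    totals = {s: 0 for s in scenarios}
    for _, ratings in evidence:
        for s, r in zip(scenarios, ratings):
            totals[s] += r
    return totals

totals = score(scenarios, evidence)
ranked = sorted(totals, key=totals.get, reverse=True)
margin = totals[ranked[0]] - totals[ranked[1]]
if margin <= 0:
    print("Evidence does not discriminate between scenarios: collect more.")
else:
    print(f"Most consistent scenario so far: {ranked[0]}")
```

The point of the sketch is the final test: when the available evidence is equally consistent with rival scenarios, the professional response is to collect more evidence, not to act on the preferred narrative.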
That did not happen in Operation 8. The lead analysts were detectives rather than intelligence professionals. As they progressed from December 2005 to the culmination of their work on 15th October 2007 they built a single narrative and a single scenario. They did not consider other possible narratives. They further reinforced that single scenario by only seeking out information to confirm their mindset. Once that mindset developed they would have been oblivious to other information and other narratives and scenarios. They built the narrative they wanted to find.
Even so they obviously knew they did not have sufficient information to support their single scenario analysis. The Operation 8 “termination” phase, including the armed paramilitary operation at Ruatoki and elsewhere and the nationwide computer seizure operation, was designed not just to arrest suspects and to seize weapons and equipment. It was designed to find the information they did not already have, to complete their terrorist narrative and to prove the validity of their terrorist scenario. It failed miserably. The Solicitor-General gave them a soft landing by blaming the legislation itself when he declined to prosecute the Urewera 17 under the Terrorism Suppression Act.
They had built a single narrative and scenario for which they knew they did not have sufficient intelligence. Then they acted upon it. That was dumb intelligence analysis and management, and dumb policing at the highest level.
How did they get there?
In a previous post we explored the deliberate exclusion of Maori police officers who would have taken a broader and more knowledgeable view; who should have been involved in the planning and direction, the collection and processing of information, and the analysis and production of intelligence; and who should have been involved in challenging, testing and evaluating the analysis and conclusions, and in planning any resulting action. In all probability the armed paramilitary operation on 15th October 2007 would not have happened if they had been involved.
But they were not involved and so we need to look at what intellectual shortcomings led the “analysts” and their managers to the 15th October 2007 operation.
“Another important aspect of analysis as a cognitive process is where subconscious biases or mindsets by an individual or group prevent a full reflection of all available probabilities and conclusions, which can lead to faulty analysis and assessments. Cognitive biases are mental errors which are a normal part of human reasoning”.
– Patrick F. Walsh, 2011, “Intelligence and Intelligence Analysis“, Routledge, New York.
Apart from the fact that they were not trained as intelligence analysts, their work was not challenged, tested and evaluated by experienced professional analysts to avoid the pitfalls of cognitive bias.
In a previous incarnation as an analyst I would challenge, test and evaluate the product of the analysts working for me. I would similarly have my own assessments and conclusions challenged, tested and evaluated. It was all part of the process of exposing cognitive bias and eliminating error as much as possible.
Based on their amateurish, unchallenged, untested and unevaluated analysis the NZ Police rushed onwards towards a debacle of their own making. Commissioner Broad himself displayed an unbelievable lack of professionalism, and sheer incompetence, in accepting the shoddy untested work of his intelligence analysts and operational advisors and presenting it to the Officials’ Committee for Domestic and External Security Coordination (ODESC) and to the Prime Minister and Cabinet.
There was thus a total lack of intellectual rigour in their work from the desks of the analysts to the police commissioner himself.
The competent analyst is a person who is not only educated to reason and think logically but who is also able to put aside personal or cultural bias in the interpretation of information. The mind of the analyst is able to comprehend subtleties and nuances, the multiple shades of grey between the absolute certainties of black and white. The mind of the analyst is comfortable with ambiguity and uncertainty while it seeks out information to reduce if not to eliminate that ambiguity and uncertainty.
The competent analyst relies solely on evidence rather than conjecture and must be trained not to jump to conclusions, but to consider all possible interpretations of the information available before offering an interpretation based on that information. If more than one valid interpretation is possible the analyst must present all possible interpretations. If one is chosen above the others the analyst must present evidence supporting that conclusion.
The analyst who is not educated to avoid them will unconsciously employ the shortcuts that the human mind habitually uses to make sense of the world. Our brains sideline or suppress the ambiguity and uncertainty of the real world and create coherent interpretations where they don’t exist.
As the award-winning cognitive psychologist Daniel Kahneman puts it, a few thousand years after the authors of the Talmud came to much the same conclusion, “We see the world as much more coherent than it is”.
“Many decisions are based on beliefs concerning the likelihood of uncertain events such as the outcome of an election, the guilt of a defendant, or the future value of the dollar. … Occasionally, beliefs concerning uncertain events are expressed in numerical form as odds or subjective probabilities. What determines such beliefs? How do people assess the probability of an uncertain event or the value of an uncertain quantity? … people rely on a limited number of heuristic principles which reduce the complex tasks of assessing probabilities and predicting values to simpler judgmental operations. In general, these heuristics are quite useful, but sometimes they lead to severe and systematic errors”.
– Amos Tversky and Daniel Kahneman, 1974, “Judgment under Uncertainty: Heuristics and Biases”, Science, Vol. 185, reprinted in Kahneman, 2011, “Thinking, Fast and Slow”, Allen Lane.
The mind is inclined to jump to conclusions. What you see is all there is: the mind does not allow for what you don’t know or cannot see. It creates reality and certainty only from what it sees and hears, and it uses unreliable information to reach its conclusions. One can construct very good stories, narratives or scenarios out of very little evidence. All unconsciously. And professional analysts have to be educated and trained to avoid those pitfalls.
“The human mind is an illusion generator … Patterns are everything to us. We hunger for them. We revel in them. They are the basis for art, literature, music, and much more in our lives. But a perceptual system that is so geared to wrestling patterns out of complex arrays of stimuli is bound to produce some false positives”.
“Broadly speaking, there are two ways you can make a perceptual mistake. You can fail to see something that is there, or you can see something that is not there”.
– Hank Davis, 2009, “Caveman Logic – the persistence of primitive thinking in a modern world”, Prometheus Books, New York.
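The systematic errors Tversky and Kahneman describe can be made concrete with a simple base-rate calculation. The numbers below are invented purely for illustration: even a fairly reliable “indicator” of a rare event produces mostly false positives.

```python
# Bayes' rule with invented, purely illustrative numbers:
# P(plot | indicator) = P(indicator | plot) * P(plot) / P(indicator)

p_plot = 0.001           # prior: genuine plots are rare
p_hit_given_plot = 0.90  # the indicator usually fires on a real plot
p_hit_given_none = 0.05  # ...but also fires on harmless activity

# Total probability of the indicator firing at all
p_hit = p_hit_given_plot * p_plot + p_hit_given_none * (1 - p_plot)
p_plot_given_hit = p_hit_given_plot * p_plot / p_hit

print(f"P(plot | indicator) = {p_plot_given_hit:.3f}")  # 0.018
```

With these numbers a 90 per cent reliable indicator still leaves the probability of a real plot below two per cent, because the prior is so low. A mind running on “what you see is all there is” ignores that prior entirely.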
Stereotypes are part of that process and provide mental shortcuts in our attempts to make sense of complicated situations. We ascribe stereotypes to groups of people and those stereotypes shape our perceptions and expectations of what people in those groups might think and how they might act. The analyst needs to acknowledge that and to guard against applying stereotypical thinking to his or her work. The police are especially vulnerable to stereotypical thinking for they live their working lives immersed in the world of criminality. They tend therefore to see and interpret their working world through the prism of criminality. There is no problem with that in the work of the criminal detective for the detective is after all tasked with solving crime that has already been committed. The intelligence analyst however has to maintain a much wider and more nuanced view of the world.
Non-Maori have formed and held stereotypes of Maori since the time of first contact. The American David Ausubel observed them and wrote about them in 1960 (“The Fern and the Tiki”, Angus and Robertson). Those same stereotypes persist into these modern times and shape opinions and interpretations about Maori. Non-Maori police are no exception and the stereotype of the Maori as criminal is alive and well. The police have also formed a collective stereotype of the activist, and in their intelligence gathering activities have equated activism with criminality. Whereas a few activists may be involved in some criminal activity most are not, yet the stereotype of the activist as criminal prevails. Based on this stereotype the police seem unable to differentiate political intelligence from criminal intelligence. Maori activists are doubly disadvantaged by this stereotypical thinking.
There is a specific stereotype of Taame Iti as a dangerous and sometimes violent radical Maori and Ngai Tuhoe activist, a former member of the Communist Party and of the “radical” protest movement Nga Tamatoa. The police officers who know Taame do not subscribe to this stereotype but higher up the chain they obviously do. In 2005 when Taame staged a massive theatrical presentation to the Waitangi Tribunal including shooting a flag on the marae the local police were not perturbed. Higher up the chain they invoked the stereotype and in their abject ignorance (and cognitive bias) had him charged and convicted. Later of course his conviction was overturned on appeal. Like the local police I know him as a likeable rogue, a family man with a strong sense of social justice, a strong commitment to the health and wellbeing of his people, total dedication to the Ngai Tuhoe cause, with an exceptional talent for theatrical and often humorous protest. I have watched as he dealt lovingly, gently and patiently with a difficult child, a far cry from the image of “terrorist”. The local fuzz could have arrested him over the phone and he would have arrived at the cop shop under his own steam after breakfast, with his mobile phone, rifle and can of petrol if that’s what they wanted. And the best way to find out what Taame is up to is to ask him. He can be disarmingly open, frank and honest.
Stereotypical thinking leads to confirmation bias, the tendency to search for, interpret and remember information that confirms one’s preconceptions. That leads to expectation bias, the tendency for analysts to believe and produce intelligence that agrees with their expectations for the outcome of their analysis, and to disbelieve, discard or downgrade information or data that conflict with those expectations. Or to the observer-expectancy effect when an analyst expects a given outcome and therefore unconsciously manipulates or misinterprets information in order to find it.
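The grip of confirmation bias on an analyst’s belief can be simulated directly. In this hypothetical Python sketch the evidence stream is pure noise, yet the analyst who silently drops disconfirming items becomes steadily more certain; all names and numbers are invented for the illustration.

```python
import random

random.seed(1)  # reproducible noise

def update(belief, supports, keep_disconfirming):
    """Double the odds on confirming evidence, halve them on
    disconfirming evidence -- unless the analyst drops it."""
    odds = belief / (1 - belief)
    if supports:
        odds *= 2
    elif keep_disconfirming:
        odds /= 2
    # a biased analyst silently discards the disconfirming item
    return odds / (1 + odds)

fair = biased = 0.5                                  # start undecided
stream = [random.random() < 0.5 for _ in range(40)]  # pure noise

for supports in stream:
    fair = update(fair, supports, keep_disconfirming=True)
    biased = update(biased, supports, keep_disconfirming=False)

print(f"even-handed analyst: {fair:.2f}")
print(f"biased analyst:      {biased:.2f}")
```

The even-handed analyst’s belief drifts up and down with the noise; the biased analyst’s belief only ever ratchets towards certainty, because every move is one-way.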
Another dangerous mental shortcut in intelligence analysis and management is consensual validity. It was one of the critical intellectual failures that led President George W. Bush to invade Iraq in 2003 on the basis of insufficient hard intelligence, some false intelligence, and the wrong conclusions believed by all of those around him.
“Consensual validity is a very powerful force. In many cases if those around you believe it, it must be true. It is also a prime example of what are called heuristics, or shortcuts that save each member of the group (or species) from having to re-evaluate the same evidence”.
– Hank Davis (2009)
Given that the critical analysis in Operation 8 leading to the belief that a terrorist plot was being hatched does not appear to have been challenged, tested and evaluated with any intellectual rigour, consensual validity seems to be the primitive thought process that led the analysts and their superiors, all the way up the chain to the police commissioner and thence to the Prime Minister and Cabinet, to their conclusions and decisions.
Groupthink is another name for it, or the bandwagon effect.
Heuristics or mental shortcuts are essential in our everyday lives and they work very well in most conditions as we navigate our lives largely on auto-pilot. We don’t dwell on every problem and every decision we need to make, and there are thousands of them every day of our lives. Heuristics are so ingrained in our unconscious minds that we are rarely aware that we are using them. But if we rely on them, knowingly or unknowingly, in situations where we should bring the intellect or the conscious mind to bear, then we will most likely draw incorrect conclusions. Reliance on heuristics in situations where they should be overridden indicates a lack of intellectual ability, or worse, intellectual laziness.
Thus constricted thinking too often leads to plausible yet incorrect conclusions as it did in Operation 8.
“Scholars and the study participants tend to be in agreement that good analysts possess certain qualities regardless of the domain in which they operate. These qualities include demonstrated intellectual capacity, curiosity, a degree of scepticism, and attention to detail. Additional qualities noted by the study participants include, creativity, tenacity, foresight and contextual understanding”.
– Corkill (2011).
I would add that the analyst and managers of analysts should have an advanced understanding of heuristics and a knowledge of how to avoid reliance on those mental shortcuts.
There are many other identified heuristics and cognitive biases. Perhaps to end this discussion on bias I should mention the Dunning-Kruger effect, in which incompetent people fail to realise they are incompetent because they lack the skill to distinguish between competence and incompetence. Much as I would like to, I cannot attribute this insight to Tamati Kruger of Ngai Tuhoe, but it would seem to be an appropriate observation for Ngai Tuhoe to make about the cognitive psychology underlying Operation 8.
I have defined another psychological effect. I am calling it the “Hi Ho Silver Effect” in which hyped up testosterone fuelled heavily armed cowboys in facemasks and fancy dress disregard the law, human rights and common dignity and decency, as they get their erotic kicks by terrorising unarmed women and children in the misguided belief that the objects of their collective pornographic fantasy really are dangerous outlaws. It involves a total suspension of reality and a high degree of theatricality of the tragicomedy variety. If ever confronted by the Hi Ho Silver Effect the counter to it is to imagine that the cowboys are wearing lipstick, bras, panties and panty hose under their macho getup, and imagine that you shove flowers down the barrels of their guns while silently chanting “Show us your knickers girls, come on, show us your frilly knickers”. For an encore you could ask them to dance the Can Can.
I know, I can’t resist it sometimes. But they really are cowboys and based on their vile and stupid behaviour at Ruatoki, Taneatua and Manurewa I wouldn’t have had any of them in the real combat infantry I commanded in my day.
And that unlawful and unforgiveable behaviour was the direct outcome of dumb analysis and dumb policing.
At a conference in July 2011 a representative of the Victorian Police in Australia spoke of the need for intellectual ability as a challenge facing the implementation of the Victorian Police intelligence framework. She noted that the Victorian Police were looking to recruit and train university graduates as intelligence analysts rather than using less well-educated policemen.
In response Mr Mark Evans, who heads the NZ Police intelligence system implementation, explained that he was working, in consultation with other intelligence agencies, to establish intelligence education in conjunction with Massey University to upgrade the intelligence capability of the NZ Police (and other agencies).
Subsequently, on 21st December 2011, the NZ Police released a media statement announcing the signing of a Memorandum of Understanding with Massey University providing for collaboration in research, teaching and professional development. Massey would offer a new qualification at its Centre for Defence and Security Studies, a Master of International Security, containing tailored papers in security strategy, crime intelligence, international law, and leadership and management. Several police officers have joined the degree course. There would be no better case study than Operation 8 to demonstrate how not to manage and conduct an intelligence operation. Except that the NZ Police culture does not allow the admission of failure unless forced to.
Mr Evans said the MOU’s timing was appropriate as police launched the new Prevention First operating strategy aimed at making New Zealand an even safer place to live, visit and do business (Source: NZ Police website).
The NZ Police now require their analysts to have specific intelligence qualifications and their recruitment advertisements for analysts to work in the Special Investigation Group now reflect that requirement. It was not previously a requirement and was obviously not a requirement for the analysts involved in Operation 8. This extract from an advertisement in 2012 demonstrates the minimum qualification now required of an analyst:
“The successful applicant will hold the National Diploma in Intelligence Analysis (NDIA) or equivalent or be able to obtain the same within 12 months of employment. If a trainee is the successful applicant they will be put through training to achieve the NDIA”.
The diploma was developed by Mrs Janine Foster while she was at Customs and it is now the industry standard NZQA accredited qualification. Mrs Foster joined the NZ Police in 2002 when they were rapidly building their intelligence structure in the wake of 9/11. She is now on secondment to Massey University where she teaches the security and crime component of the degree course.
This development in the education of intelligence officers is a clear indication that, at the executive level at least, the NZ Police have recognised the inadequacy of their previous capability (and staff) and the need to increase intellectual capacity in intelligence management and analysis.
That intellectual capacity, essential to professional and competent intelligence analysis, was completely missing from Operation 8. From top to bottom. It was a major factor in the debacle that ensued. It was a dumb operation.
Taku rākau ka hē ki te marahea
My weapon erred in the worst way.
Links: The Operation 8 Series