Defensible decisions through independent and impartial critical thinking and analysis.
Evidence-based decision making is a widely used phrase, but how often is rigorous analysis truly applied to inform decisions in the public sector? Too often, studies that claim to be impartial end up confirming the biases and expectations of their sponsors. Successful analysis projects go beyond this—they explore topics thoroughly, document assumptions, provide transparency, and clearly separate recommendations from options.
The Need for Rigorous Analysis in the Public Sector
Ministers and their supporting Departments and Agencies are fearful of public scrutiny of their decisions, and they should be. Taxpayers’ favour is a precious commodity, and taxpayers generally despise corruption, waste, and pork barrelling. Ministers and their supporting officials are therefore wary of bad publicity and of scrutiny in forums such as Senate Estimates and Royal Commissions. These fears can, however, overshadow the importance of delivering initiatives in a timely way. Good, credible analysis can alleviate them.
The Source of Good Ideas and the Importance of Scrutiny
No-one has exclusive ownership of good ideas for initiatives; they can come from political parties, policy makers, allies, academics, industry and public officials. Good ideas in turn need a level of scrutiny proportionate to the level of investment and the project’s impact. Too often we see projects or initiatives facing difficulties or delays because the supporting analysis was poor or insufficient. Because of the scale of its resource allocation, Defence attracts justified criticism when this happens, but other Departments face similar challenges.
The Challenge of Scrutiny in Decision-Making
The question of scrutiny around decision-making is something that challenges all Western democracies. Non-democratic societies can (potentially) make decisions more quickly, only to learn later that those decisions were poor. To manage the risk that scrutiny slows decision-making, leaders need to understand the sequence (and cadence) of decisions and the information and analysis required to make them defensible.
While thorough analysis can create a perception of delay, this can be mitigated by stripping away unnecessary detail from the elements of the problem. Good, defensible analysis provides a foundation for stakeholder concurrence. Too often, unnecessary bureaucratic detail is instead added in the name of reducing project risk.
Changing Strategic Context: The Defence Example
For example, during an extended period of peace, Defence decision makers had the luxury of time to minimise project risk. The National Defence Strategy from 2022 indicated that a 10-year warning time for a potential major conflict in the Indo-Pacific was no longer appropriate, meaning Australia’s national interests were likely to be at risk. It follows that the Defence organisation now needs to make good decisions more quickly.
With well-planned analysis campaigns, the stripping away of bureaucracy, and the removal of over-sensitivity to even small amounts of risk, it can be done. Donald Horne, in his famous book The Lucky Country (1964), identified an Australian cultural aversion to scrutiny and analysis: a “passion for improvisation which leads to slapdash attitudes that may become increasingly dangerous in the technological age”. Notably, he foresaw the potential for the luck to run out with the rise and industrialisation of Asia.
The Importance of Internal Contestability
A further challenge to achieving defensible, evidence-based decisions in the public sector is the shortage of analytic specialists. These experts are often vastly outnumbered by policy and project managers, making it difficult to ensure that rigorous, impartial analysis is consistently applied to major decisions. This imbalance can undermine the effectiveness of internal contestability—the process by which officials critically challenge each other’s ideas to improve outcomes. When analysis becomes focused on following procedures rather than genuinely examining outcomes and impacts, its value is diminished. For analysis to truly support robust decision-making, it must involve a deep, critical examination of the subject matter and a willingness to question assumptions and recommendations. Addressing this skills gap is essential for building the analytic rigour and transparency needed for credible, defensible decisions in the public sector.
Principles of Effective Analysis: What are the Critical Elements for Success?
- Applying critical thinking to frame the problem or decision.
- Socialising the variables and getting high‑level endorsement of the analysis Terms of Reference.
- Understanding the decision variables and key assumptions and being willing to challenge assumptions.
- Understanding the supporting data needed.
- Clearly outlining and categorising the problem scope.
- Engaging an analysis team early to understand and map decision timeframes.
- Taking a campaign approach, using results as a body of knowledge for future refinement as context and assumptions change (a minimal sketch of such a record follows this list).
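As a purely illustrative sketch of that last point (the record types, fields and example entries below are hypothetical and not drawn from any particular tool or project), a simple register of assumptions and findings can make it obvious when changed circumstances warrant revisiting earlier analysis:

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical, minimal record types for an analysis "body of knowledge".
@dataclass
class Assumption:
    description: str          # the assumption as stated in the Terms of Reference
    owner: str                # who endorsed it
    recorded: date            # when it was recorded
    still_valid: bool = True  # flipped to False when the context changes

@dataclass
class StudyRecord:
    question: str             # the framed problem or decision
    findings: str             # headline result, with caveats
    assumptions: list[Assumption] = field(default_factory=list)

    def revisit(self) -> list[Assumption]:
        """Return assumptions no longer considered valid, prompting re-analysis."""
        return [a for a in self.assumptions if not a.still_valid]

# Example: a later review marks one assumption invalid, signalling refinement is needed.
study = StudyRecord(
    question="Which sustainment option best meets the availability target?",
    findings="Option B preferred under the stated assumptions.",
    assumptions=[
        Assumption("Extended warning time applies", "Sponsor", date(2021, 6, 1)),
        Assumption("Current workforce levels are maintained", "Sponsor", date(2021, 6, 1)),
    ],
)
study.assumptions[0].still_valid = False
print([a.description for a in study.revisit()])  # -> ['Extended warning time applies']
```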
Common Pitfalls: What does Analysis Failure Look Like?
- As a leader, having fixed expectations of the answer (this is decision-based evidence making).
- Failing to negotiate scope and detail needed for the decision.
- Biases and lack of impartiality—not recognising real or perceived conflicts of interest.
- An analysis team that lacks operational or technical knowledge or expertise.
- Selective and prescriptive use of data and tools.
- Not engaging all relevant stakeholders, or failing to understand sensitivities and be prepared to explain them.
- Not engaging regularly with the study team.
- Not understanding releasability and classification of information.
What Does Good Analysis Look Like?
Operations research, management science and engineering literature each provide a rich and diverse base of academic and practical information that can arm practitioners.
Determining the ‘best’ approach to analyse a particular problem very much depends on the breadth and significance of the decision, the time available and the availability of trusted and valid data. In many instances a hybrid approach that combines analysis and engineering principles is needed to address the mix of qualitative and quantitative data. A generic process, based on the study guidance in the Operations Research and Systems Analysis Handbook (2011), is shown in Figure 1, which steps through a logical study process.
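As one hedged illustration of such a hybrid approach (the options, criteria, weights and scores below are invented for the example and do not come from the Handbook), a simple weighted-scoring comparison combines quantitative measures from modelling with scored qualitative judgements:

```python
# Minimal weighted-scoring sketch: combine quantitative measures and scored
# qualitative judgements into a single comparable figure of merit.
# All options, criteria, scores and weights are hypothetical.

criteria_weights = {          # must sum to 1.0
    "effectiveness": 0.4,     # quantitative, e.g. from modelling and simulation
    "schedule_risk": 0.2,     # qualitative, scored 0-1 by subject-matter experts
    "cost": 0.3,              # quantitative, normalised so cheaper scores higher
    "industry_impact": 0.1,   # qualitative, scored 0-1
}

options = {
    "Option A": {"effectiveness": 0.72, "schedule_risk": 0.6, "cost": 0.8, "industry_impact": 0.5},
    "Option B": {"effectiveness": 0.85, "schedule_risk": 0.4, "cost": 0.6, "industry_impact": 0.7},
    "Option C": {"effectiveness": 0.60, "schedule_risk": 0.8, "cost": 0.9, "industry_impact": 0.4},
}

def weighted_score(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Sum of each criterion score multiplied by its weight."""
    return sum(scores[c] * w for c, w in weights.items())

ranked = sorted(options, key=lambda o: weighted_score(options[o], criteria_weights), reverse=True)
for name in ranked:
    print(f"{name}: {weighted_score(options[name], criteria_weights):.2f}")
```

The value of a sketch like this lies less in the final numbers than in the transparency it forces: every weight and score is visible, attributable and open to challenge by stakeholders.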
A question worth asking early in any analysis is how closely it resembles previous work and whether lessons can be drawn from that work.
To undertake the type of study process outlined in Figure 1, a range of skillsets and analysis methods, including modelling and simulation, is generally required. The analysis team supports the sponsor and stakeholders through each of the steps, building confidence in the rigour of the analysis and in the sensitivity of the results to variables and available data. This can be a confronting process, as advocates for particular solutions may be disappointed by the findings. Good analysis will result in the findings being accepted and endorsed at face value. If and when additional factors emerge that change the results, these should be documented.
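Building on the hypothetical scoring sketch above, one simple form of sensitivity check is to sweep a key weight across a plausible range and record where the preferred option changes, so the dependence of the finding on that assumption is documented:

```python
# Sensitivity sketch (hypothetical data): vary the weight given to effectiveness
# and record where the top-ranked option changes. Reuses criteria_weights,
# options and weighted_score() from the previous sketch.

def renormalised_weights(base: dict[str, float], criterion: str, new_weight: float) -> dict[str, float]:
    """Set one criterion's weight, scaling the others so the total stays 1.0."""
    remaining = 1.0 - new_weight
    other_total = sum(w for c, w in base.items() if c != criterion)
    return {c: (new_weight if c == criterion else w / other_total * remaining)
            for c, w in base.items()}

previous_best = None
for step in range(0, 11):
    w_eff = step / 10  # sweep the effectiveness weight from 0.0 to 1.0
    weights = renormalised_weights(criteria_weights, "effectiveness", w_eff)
    best = max(options, key=lambda o: weighted_score(options[o], weights))
    if best != previous_best:
        print(f"effectiveness weight {w_eff:.1f}: preferred option is {best}")
        previous_best = best
```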
Types of Analysis Across Decision Levels
The types of analysis that can be undertaken are numerous and span all phases of portfolio and project decision cycles. Example approaches are listed under four categories in which different types of problems are often encountered:
Strategic / Portfolio-Level
- Concept development, strategic simulation and validation.
- Utility and net assessment.
- Dependency and supply chain analysis.
- Model-based mission engineering.
- Concept of Operations (CONOPS) development and validation.
- Technology effectiveness.
- Cost estimation.
Operational / Program-Level
- System effectiveness and gap analysis.
- Requirements and Fundamental Input to Capability (FIC) analysis.
- Analysis of systems and capability alternatives (including preview Test and Evaluation).
- Logistics and sustainment analysis.
- Analysis of system interdependencies.
- Cost and risk estimation (a simple cost-risk sketch follows these lists).
Delivery—Acquisition and Sustainment
- Market and industry analysis.
- Supply chain (flow-security) analysis.
- Facilities, processes and workforce analysis.
- Schedule analysis.
- Cost and risk assurance.
- Systems and technology integration and options analysis.
Fielded Systems & Tactical Level – Operations
- Operational plan analysis and validation.
- Mission systems gap analysis.
- Operation and exercise analysis (including many forms of Test and Evaluation).
- Network assessments.
- Demand and Supply – Force Generation and Sustainment analysis.
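Cost and risk estimation appears at several of the levels above. As a hedged illustration only (the cost elements, distributions and dollar figures are invented for the example and reflect no real program), a simple Monte Carlo sketch shows how three-point estimates can be combined into median and contingency figures:

```python
import random
import statistics

# Hypothetical three-point estimates (low, most likely, high) in $m per cost element.
cost_elements = {
    "acquisition": (80, 100, 140),
    "integration": (15, 25, 45),
    "sustainment_setup": (10, 15, 30),
}

def simulate_total_cost(elements, runs=10_000, seed=1):
    """Monte Carlo total cost using a triangular distribution for each element."""
    rng = random.Random(seed)
    return [sum(rng.triangular(low, high, mode) for low, mode, high in elements.values())
            for _ in range(runs)]

totals = simulate_total_cost(cost_elements)
percentiles = statistics.quantiles(totals, n=100)
p50 = percentiles[49]   # median estimate
p80 = percentiles[79]   # 80th percentile, a common basis for contingency
print(f"P50 total cost: ${p50:.0f}m, P80 total cost: ${p80:.0f}m")
```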
Good analysis empowers decision-makers with credible, tested data. As outlined in this article, the nature of the analysis needed to support important decisions varies with a range of factors that must be assessed. Our team has deep expertise and experience in delivering objective and impartial analysis, and the capability to further support our existing Defence client base as well as broader government agencies. We’d welcome a conversation to understand your analysis needs.