
ASPI ICPC's Cyber Maturity in the Asia-Pacific Region 2014 report: a review

The Australian Strategic Policy Institute's International Cyber Policy Centre (ASPI ICPC) released its inaugural "Cyber Maturity in the Asia-Pacific Region 2014" report. Like all such endeavours it has its warts, but it should be congratulated for tackling a significant challenge. The report mixes quantitative and qualitative approaches and tries to devise simple metrics for a complex issue.

It's a great start that can only get better, and in that light here are a few comments, mostly on methodology.

Research Questions

1. Governance

Metric 1A talks about effectiveness; metric 1B doesn't. Including and attempting to measure the effectiveness of activities is a better indicator of maturity than simply recording that activities take place.

b) Is there existing legislation/regulation relating to cyber issues or internet service providers (ISPs)? Is it being used? What level of content control does the state conduct or support?

State views on ISP regulation are suggestive of the state’s perspective on the regulation of content, governance and the involvement of the private sector in cyberspace.

The view from business and civil society is that the government should stay out of regulating ISPs as providers of connectivity.

An understanding of the state's views on content control is important to all other stakeholders when engaging with it.

Absolutely, and there needs to be a measure for this. Is censorship good or bad? Does it add or subtract points? The report can't have it both ways, stating that content control is important without specifying why, how, and how it affects the maturity rating.

c) How does the country engage in international discussions on cyberspace, including in bilateral, multilateral and other forums?

This is a subjective indicator: countries with a different approach from the one assumed by Western reviewers will automatically rate lower than those with views and approaches similar to ours, no matter how much the rest of the world may disagree. Instead, the metric should measure engagement itself, regardless of its form. Some countries will always hold an opposing view and take a different approach; that doesn't make their engagement any less valid.

d) Is there a publicly accessible cybersecurity assistance service, such as a CERT?

And is this service widely known to its constituency? Australia's own JSOC, (Gov)CERT.Au and the like were all at one time assistance services that no one in the industry even knew about. I remember at least three events at which Australian Government representatives mentioned various industry-facing cyber security services that the majority of the people in the room (all representatives of critical national infrastructure organisations) had never heard of before.

2. Military application

e) What is the military’s role in cyberspace, cyber policy and cybersecurity?

A specialised organisational cyber structure within the military indicates some awareness of cyber issues in the armed forces, and possibly the military’s perspectives on the use of cyber operations capabilities.

This alone is insufficient to measure maturity. The report should also ask:

  • how many different areas in the military are looking after 'cyber' issues (too many cooks spoil the broth: infighting, duplication, etc.)
  • how well the military is prepared to protect its own installations
  • whether the military is encroaching on the civilian sphere
  • ...

3. Digital economy and business

a) Is there dialogue between government and industry on cyber issues? What is the level/quality of interaction?

High-quality public–private dialogue on cyber issues demonstrates a mature understanding within government and a good awareness of cyber risks in the private sector. This presents an opportunity either to engage in capacity building or to learn and implement similar strategies.

More importantly, what are the tangible outcomes? There is always some level of quality interaction, yet the outcomes just aren't there.

b) Is the digital economy a significant part of economic activity?

How has the country engaged in the digital economy? The state's level of engagement with the digital economy indicates its ability to harness the digital sector for economic growth.

The state's level of engagement? Or the state's encouragement, support and preparation of a free market that creates a digital sector and related opportunities? The state's level of engagement is very high in China, yet its metric is fairly low.

Components of the Methodology

There is absolutely nothing wrong with the approach to the methodology, but it needs a big "warning" sign, because it applies a very WEIRD world-view to a part of the world that is extremely diverse and where Australia is the odd one out. With that out of the way, here are my specific comments.

The final step was to rate each country against the nine factors, again on a scale of 1 to 10, with 10 being the highest level of maturity that could be awarded. These assessments were based on an extensive qualitative and quantitative open-source research package.

And yet the notes list only 12 references. Maybe all of the references should be included, even if they are just articles, journal papers and the like, because they most definitely influenced a number of the ratings.
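For the quantitatively minded, here is a minimal sketch of the kind of roll-up that final step implies: nine factor scores per country, each on a 1-to-10 scale, combined into a single maturity figure. The equal weighting, the percentage presentation and the function name are my assumptions for illustration only; the report's exact aggregation formula is not reproduced here.

```python
# Minimal, illustrative sketch of rolling nine 1-10 factor scores into one
# country-level maturity figure. Equal weighting is an assumption, not the
# report's published method.

def maturity_score(factor_scores: list[int]) -> float:
    """Average nine 1-10 factor scores and express the result as a percentage."""
    assert len(factor_scores) == 9, "the report rates each country on nine factors"
    assert all(1 <= s <= 10 for s in factor_scores), "each factor is scored 1 to 10"
    return sum(factor_scores) / (len(factor_scores) * 10) * 100

# Example: a hypothetical country scoring mid-range on most factors.
print(maturity_score([7, 6, 8, 5, 6, 7, 4, 6, 5]))  # -> 60.0
```

Even a toy roll-up like this makes the point about references: if each factor score was shaped by open-source material, the reader needs to see that material to judge the resulting number.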

Appendix A: Scoring Breakdown

This, which should be the foundation of the otherwise good review of maturity in the region, is sadly the weakest part of the report. The scale runs from 0 to 10 (effectively giving 11 potential values), yet the criteria for each individual value are not set. Worse, the scoring looks like this:

There are three values that can be assigned to "No organisational structure", but we don't know what differentiates a '0' from a '1' or a '2'. Worse, there are no clear rules that would help countries and other interested parties see what they need to do to improve their maturity.
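To illustrate what explicit per-value criteria might look like, here is a hypothetical sketch for the 0-2 band discussed above. The band label comes from the report; the differentiating descriptions and the `SCORING_RUBRIC` and `explain` names are my own invention, not anything published in Appendix A.

```python
# Hypothetical example of the per-value criteria the report omits.
# The 0-10 scale follows Appendix A; the descriptions below are my own
# illustration of how a '0' could be distinguished from a '1' or a '2'.
SCORING_RUBRIC = {
    0: "No organisational structure, and no stated intention to create one",
    1: "No organisational structure, but the need has been publicly acknowledged",
    2: "No organisational structure, but one has been announced or is being stood up",
    # ... values 3-10 would each need equally explicit criteria
}

def explain(score: int) -> str:
    """Return the criterion a country must meet to be awarded a given score."""
    return SCORING_RUBRIC.get(score, "criterion not yet defined")
```

With something like this in the appendix, a country rated '1' would know exactly what separates it from a '2', which is the actionable guidance the current breakdown lacks.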

Appendix C: Engagement Opportunities Indicators

Further complicating things, and in my view diminishing rather than adding value to the report, is the distinction between maturity and engagement opportunities. The two really should be combined. High maturity implies high engagement, because high maturity is measured by outcomes, not activities. Plenty of activity but no visible improvement is a sign of low maturity. Some would say that if we measured by improvements, then countries now rated as mature in the report would get a low to moderate maturity rating.

On the positive side, the Engagement Opportunities table shows in greater detail what should be done or is expected to be in place. It is still not sufficient to be actionable, but it at least gives a good foundation for what is expected.

My 2 bits

Overall, the report took courage to produce, and the rule-of-thumb country results make for a very good, informative read. There is a wealth of information hidden in there, though for the quant wonks amongst us it is let down by the poor maturity matrix. That the authors decided to reveal their methodology shows that this fairly new centre (six months?) in a fairly new think tank is looking for feedback and looking to improve the service it provides, and there's no doubt that next year's report is going to be even better.