This post is part of a larger series on Governance as a Service, and outlines what it means to set clear direction and expectations around the design and delivery of digital services, and why it’s critical to do so.
Given the topic and my background, I will begin with a disclaimer: all views in this post are my own, and do not represent those of my current or past employers. I'm simply a digital standards nerd.
Direction and expectation are the “what” of governance, and digital government teams usually publish two official documents to express these: their digital strategy (sometimes also called a roadmap or action plan), and their digital standards. The former is an attempt to articulate direction, the latter expectation.
The strategy is more than just delivery
I’m saying this with much respect to Mike Bracken, because I really believe that delivery is key to winning confidence and credibility. However, no government would say that their COVID strategy is: “Just keep making more COVID digital solutions” without a larger outcome or purpose expressed.
Some digital teams shy away from publishing a standalone strategy, opting instead to point out that their efforts are in service of other published strategic government priorities. This is okay, as long as the team's vision for digital government, and its understanding of its part in making that vision a reality, is consistent with how others in government perceive its role. In the absence of a standalone strategy, though, there is no articulated vision, nothing to guide which projects to take on (filtered by whether they advance that vision), and no way to ascertain whether any larger transformation is happening.
(FWIW, see also: How much do you really want to change?)
Standards = “the good path”
Standards are meant to be the change that digital services teams want to see, the consistent, minimum bar for policies, programs and services that governments design and deliver.
They’re meant for:
- Digital service teams, so that they can design and deliver services that consistently meet user needs and achieve policy outcomes
- Public servants, not necessarily in digital service teams, so that they have a clear understanding of what to expect when they work with digital service teams, hire for digital skills or procure services
- Advisors to decision-makers and assessors, so that they can objectively evaluate services using common criteria
- Senior decision-makers and politicians, so that they can hold others to account, and be held to account themselves, for their technology decisions
- Technology vendors, so they understand what it takes to do business with government
- Prospective public servants, so they know what it means to design and deliver services for government
Paved with clear expectations, not just good intentions
Standards articulate expectations of all of the above parties, and therefore need to be accompanied by clear explanations of what it means to meet them along with requisite practical guidance.
I can’t emphasize the word “practical” enough: guidance and expectations are meant to be developed from the lessons learned in actually delivering digital services in government. We are talking about a new minimum bar, after all, not aspirational and abstract ideals. Often, it’s this guidance that prevents or counters the change-wash that happens with digital service missions. (A quick test for this is to do a random sampling of your decision documents: how many times do you see the words “agile” and “user centred” describing services that are anything but?)
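To make that quick test concrete, here is a minimal sketch of how you might scan a sampled decision document for overused terms. The buzzword list is a hypothetical starting point; swap in whatever vocabulary your own standards use.

```python
import re
from collections import Counter

# Hypothetical buzzwords to flag; adjust to your own standards' vocabulary.
BUZZWORDS = ["agile", "user centred", "user-centred"]

def buzzword_counts(text: str) -> Counter:
    """Count case-insensitive occurrences of each buzzword in a document."""
    lowered = text.lower()
    counts = Counter()
    for word in BUZZWORDS:
        # \b word boundaries avoid matching inside longer words (e.g. "fragile")
        counts[word] = len(re.findall(r"\b" + re.escape(word) + r"\b", lowered))
    return counts

sample = "Our agile, user centred approach is agile in name only."
print(buzzword_counts(sample))
# → Counter({'agile': 2, 'user centred': 1, 'user-centred': 0})
```

The counts alone don't tell you whether the service actually is agile or user centred, of course; the point of the sampling exercise is to compare the words against what the documents describe.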
Clear expectations need to be set across a continuum of policy instruments
Different services may require varying degrees of proactive or reactive support based on a few risk factors: the probability of harm if a service doesn’t work, the number of people affected, and the cost of the service. These approaches range from moral suasion, through clear guidance and self-assessment tools, to institutional incentives (fast-tracked gating processes or funding approval for user research, anyone?) or disincentives (withholding service funding), all the way to formal legislative enforcement.
Making the good path easy
Blessed are the teams that confront the accretion of policy instruments that accumulated before these standards were put into place, and then make the time to understand their original intended outcomes to determine alignment, scope and relevance. Not all old policy instruments are irrelevant, and very few were developed with malicious intent.
Charting a path for others
Digital is a global movement, and grappling with a global pandemic means we can’t afford not to learn from others or chart a path for them. How might we have commonly understood principles to draw from so that we can objectively assess, hire and procure services?
My hypothesis is that guidance and expectations have already begun to be developed around common topics, such as:
- Agile, empowered multidisciplinary teams
- Understanding users and their needs
- Being inclusive and accessible at the outset
- Managing data responsibly
- Measuring performance for the purpose of continuous improvement
- Embedding privacy and security
- Using scalable, interoperable and reusable technology platforms
The Open Call team in Canada has begun to catalogue policy instruments and standards that have been developed as part of governments’ COVID response. What if there were a common assessment framework and a pattern library of institutional levers, too?
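As a thought experiment only, here is one hypothetical shape such a common assessment framework could take: shared criteria (mirroring the topics listed above) paired with per-service results. All names here are illustrative assumptions, not an existing tool.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a common assessment framework's data model.
# Criteria names mirror the common topics listed earlier in this post.

@dataclass
class Criterion:
    name: str
    guidance_url: str = ""  # link to practical guidance for meeting the bar

@dataclass
class Assessment:
    service: str
    results: dict = field(default_factory=dict)  # criterion name -> met (bool)

    def meets_standard(self) -> bool:
        """A service meets the standard only if every criterion is met."""
        return bool(self.results) and all(self.results.values())

CRITERIA = [
    Criterion("Agile, empowered multidisciplinary teams"),
    Criterion("Understanding users and their needs"),
    Criterion("Being inclusive and accessible at the outset"),
]

a = Assessment("Example service", {c.name: True for c in CRITERIA})
print(a.meets_standard())  # → True
```

A shared data model like this is what would let assessments be compared across jurisdictions; the pattern library of institutional levers would then map each unmet criterion to the incentives or disincentives described above.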