Results and Delivery: Is it Delivering? - Institute on Governance

3 minute read

By: Gregory Richards, Vice-President of Business Development and Research and Elizabeth Seymour, Senior Associate

We have the opportunity to speak with many public servants on the front lines of the Results and Delivery (R&D) agenda. While we don't claim that the outcomes of these conversations represent a scientific study, the feedback we've received highlights specific opportunities and challenges, and suggests three things managers can do to use analytics effectively to improve program results.

A core theme that emerges from our conversations is that, due to the Directive and Policy on Results, evidence-based management is now top of mind in most departments. In addition, the added transparency related to the mandate letters makes it relatively easy to get a sense of overall objectives for the department.

[Diagram: a simplified results chain]

On the other hand, the R&D agenda does call for different ways of working. Participants tell us that narrative-style reporting tied to the mandate tracker, and the ability to tell meaningful visual stories, are now important. In addition, institutionalizing evidence-based management across entire departments is a challenge, since many are still working on getting the right data. Shifting targets and changing service standards present further difficulties. And of course, the operational work must continue, which often takes priority over developing new ways of working.

One solution is to develop an analytics roadmap unique to the program. This roadmap starts with understanding the results chain. In the diagram above, a simplified results chain indicates that data from the nodes track whether certain targets are met and therefore are useful for reporting. Data from the arcs between the nodes, however, provide insights into how the nodes relate to each other. Since every result is driven by activities that use resources, data from the arcs help us understand whether resources are being used effectively, and whether the activities being done deliver the expected results. This information, which requires the use of analytics to understand the relationships between and among the data points, naturally leads to conversations about how to do things better to achieve better outcomes.
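The node-versus-arc distinction can be made concrete in code. The sketch below is purely illustrative: the stage names, targets, and figures are hypothetical, not drawn from any real program. Node data answers the reporting question (is the target met?), while arc data answers the management question (how well does one stage convert into the next?).

```python
from dataclasses import dataclass

@dataclass
class Node:
    """A stage in the results chain (e.g., resources, activities, outcomes)."""
    name: str
    target: float  # the reporting target for this stage
    actual: float  # the measured value

    def target_met(self) -> bool:
        return self.actual >= self.target

@dataclass
class Arc:
    """A link between two stages; its ratio indicates how efficiently
    the source stage converts into the destination stage."""
    source: Node
    dest: Node

    def conversion_rate(self) -> float:
        return self.dest.actual / self.source.actual

# Hypothetical numbers: 100 staff-days of resources, 80 completed
# activities, 60 clients served.
resources = Node("resources", target=100, actual=100)
activities = Node("activities", target=90, actual=80)
outcomes = Node("outcomes", target=70, actual=60)

chain = [Arc(resources, activities), Arc(activities, outcomes)]

# Node data: the reporting view.
for node in (resources, activities, outcomes):
    print(f"{node.name}: target met = {node.target_met()}")

# Arc data: the management view -- where does value leak between stages?
for arc in chain:
    print(f"{arc.source.name} -> {arc.dest.name}: "
          f"conversion = {arc.conversion_rate():.2f}")
```

In this sketch the reporting view shows two missed targets, but only the arc view shows *where* the shortfall arises (activities convert to outcomes at 0.75, below the resource-to-activity rate of 0.80) — which is the kind of conversation the article says arc data should start.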

Our conclusion is that some progress is being made but there is still work to be done in using data to improve program outcomes. The greatest challenge is finding the time and space to analyze, discuss, and imagine new ways of working while continuing to deliver ongoing operational requirements.

There are three things managers can do: understand the results chain for their own program, validate the types of data needed, and design an analytics roadmap that is unique to their specific results chain.

First, some programs have logic models which are, in fact, results chains describing how resources are used to deliver expected outcomes. Some of these models are high level and need to be translated to a specific program area. A simplified results chain, as depicted in the diagram above, is often a useful starting point, so that all members of the program team have a clear idea of what outcomes they need to deliver and how they plan to achieve them.

Second, the results chain will help you identify the data needed both for reporting and for managing your program. Getting access to certain types of data, at the right level of granularity, in the right time frame, can be a daunting task. But it is made much easier if we can clearly identify what is needed.

Third, just having access to data does not mean you can start analyzing things. There are well over 200 different analytic models and algorithms that can be applied. From visualization through to advanced machine learning algorithms, the selection of the right model depends on the type of data available, the objective of the program, and the capability of the people involved to fully interpret the results. Based on these considerations, program managers should create an analytics roadmap that reflects the unique nature of their programs. More on these issues will be discussed in the next article.
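One way to start an analytics roadmap is to record, for each question the program asks, which model family fits the available data. The mapping below is a hypothetical sketch, not a prescribed methodology; the data types and model families named here are assumptions for illustration only.

```python
def suggest_model_family(data_type: str, objective: str) -> str:
    """Suggest a starting analytic approach from the data and the question.

    data_type: "numeric", "categorical", or "text"
    objective: "report" (describe whether targets were met) or
               "manage" (explain relationships between stages)
    """
    if objective == "report":
        # Reporting on node data usually starts with descriptive work.
        return "descriptive statistics and visualization"
    if data_type == "numeric":
        return "regression relating resources to results"
    if data_type == "categorical":
        return "classification or cross-tabulation"
    return "text mining of narrative-style evidence"

# A roadmap entry pairs a program question with a suggested starting point.
print(suggest_model_family("numeric", "manage"))
```

The point is not the particular mapping — programs will differ — but that writing the choices down forces the team to confront the three considerations the article names: the data available, the objective, and the capacity to interpret the results.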
