Results and Delivery: Is it Delivering?


By: Gregory Richards, Vice-President of Business Development and Research, and Elizabeth Seymour, Senior Associate

We have the opportunity to speak with many public servants on the front lines of the Results and Delivery (R&D) agenda. While we don't claim that the outcomes of these conversations represent a scientific study, the feedback we've received highlights specific opportunities and challenges, and suggests three things managers can do to use analytics effectively to improve program results.

A core theme that emerges from our conversations is that, due to the Directive and Policy on Results, evidence-based management is now top of mind in most departments. In addition, the added transparency related to the mandate letters makes it relatively easy to get a sense of overall objectives for the department.

[Diagram: a simplified results chain]

On the other hand, the R&D agenda does call for different ways of working. Participants tell us that more narrative-style reporting related to the mandate tracker, and the ability to tell meaningful visual stories, are now important. In addition, institutionalizing evidence-based management across entire departments is a challenge, since many are still working on getting the right data. Shifting targets and changing service standards can also present difficulties. And of course, the operational work must continue, which often takes priority over developing new ways of working.

One solution is to develop an analytics roadmap unique to the program. This roadmap starts with understanding the results chain. In the diagram above, a simplified results chain shows that data from the nodes track whether certain targets are met and are therefore useful for reporting. Data from the arcs between the nodes, however, provide insights into how the nodes relate to each other. Since every result is driven by activities that use resources, data from the arcs help us understand whether resources are being used effectively and whether the activities undertaken deliver the expected results. This information, which requires the use of analytics to understand the relationships between and among the data points, naturally leads to conversations about how to do things better to achieve better outcomes.
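The distinction between node data and arc data can be made concrete with a small sketch. Everything below is an invented illustration: the stage names, targets, and figures are our own assumptions, not data from any program.

```python
# A minimal sketch of a results chain as a directed graph.
# Stage names, targets, and figures are illustrative only.

# Nodes track whether targets are met (useful for reporting).
nodes = {
    "resources":  {"actual": 950_000, "target": 1_000_000},  # budget spent vs allocated ($)
    "activities": {"actual": 42, "target": 40},              # sessions delivered
    "outputs":    {"actual": 1_150, "target": 1_200},        # participants reached
    "outcomes":   {"actual": 0.68, "target": 0.75},          # share applying new skills
}

# Arcs between nodes capture how one stage converts into the next,
# which is where insight about effectiveness comes from.
arcs = [
    ("resources", "activities"),
    ("activities", "outputs"),
    ("outputs", "outcomes"),
]

def target_met(node):
    """Node-level question: did this stage hit its reporting target?"""
    m = nodes[node]
    return m["actual"] >= m["target"]

def conversion_rate(src, dst):
    """Arc-level question: how much downstream result per unit upstream?"""
    return nodes[dst]["actual"] / nodes[src]["actual"]

for name in nodes:
    print(f"{name}: target met = {target_met(name)}")
for src, dst in arcs:
    print(f"{src} -> {dst}: conversion = {conversion_rate(src, dst):.6f}")
```

Tracking conversion rates over time, rather than only the node targets, is what shifts the conversation from "did we report the number?" to "is this activity actually producing the result?"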

Our conclusion is that some progress is being made but there is still work to be done in using data to improve program outcomes. The greatest challenge is finding the time and space to analyze, discuss, and imagine new ways of working while continuing to deliver ongoing operational requirements.

There are three things managers can do: understand the results chain for their own program, validate the types of data needed, and design an analytics roadmap that is unique to their specific results chain.

First, some programs have logic models which are, in fact, results chains describing how resources are used to deliver expected outcomes. Some of these models are high level and need to be translated to a specific program area. A simplified results chain, as depicted in the diagram above, is often a useful starting point so that all members of the program team have a clear idea of what outcomes they need to deliver and how they plan to make that happen.

Second, the results chain will help you identify the data needed both for reporting and for managing your program. Getting access to certain types of data, at the right level of granularity, in the right time frame, can be a daunting task. But it is made much easier if we can clearly identify what is needed.

Third, just having access to data does not mean you can start analyzing things. There are well over 200 different analytic models and algorithms that can be applied. From visualization through to advanced machine learning algorithms, the selection of the right model depends on the type of data available, the objective of the program, and the capability of the people involved to fully interpret the results. Based on these considerations, program managers should create an analytics roadmap that reflects the unique nature of their programs. More on these issues will be discussed in the next article.
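As a toy illustration of how data type and program objective narrow the choice of model, one might sketch a lookup like the following. The categories and pairings here are our own assumptions offered for illustration, not a standard taxonomy of the 200-plus models mentioned above.

```python
# Illustrative only: a toy lookup pairing data characteristics with
# candidate analytic approaches. The categories and pairings are
# assumptions for the sake of example, not an authoritative mapping.

candidate_models = {
    ("numeric", "describe"):    "summary statistics and visualization",
    ("numeric", "predict"):     "regression models",
    ("categorical", "predict"): "classification (e.g. decision trees)",
    ("text", "explore"):        "topic modelling",
    ("mixed", "segment"):       "clustering",
}

def suggest_model(data_type, objective):
    """Return a candidate approach, or flag that expert judgment is needed."""
    return candidate_models.get(
        (data_type, objective),
        "no obvious default for this combination; consult an analyst",
    )

print(suggest_model("numeric", "predict"))   # regression models
print(suggest_model("text", "predict"))      # flags the gap
```

The third consideration from the paragraph above, the team's capability to interpret results, is harder to encode; in practice it caps how far along this list a program should reach.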

About the author

Greg Richards

Vice President - Research and Business Development

Gregory Richards, MBA, Ph.D., FCMC

Greg joins the IOG from the University of Ottawa, where he was Director of the MBA program and of the IBM Centre for Business Analytics and Performance. He is a former public servant, having worked at Transport Canada and Consulting and Audit Canada. He has over 20 years of consulting experience, is a Certified Management Consultant, a Fellow of the CMC-Ontario Institute of Management Consultants, and an Academic Fellow of the International Council of Management Consulting Institutes. He is also an Adjunct Professor at the University of Ottawa and a Lean Six Sigma practitioner certified by Villanova University.

Greg’s research and advisory services focus on improving organizational effectiveness through the use of data analytics and digitization strategies. He has taught in MBA programs for over 15 years and has designed and delivered executive education courses for several government and private sector organizations.

Published papers can be found here: https://scholar.google.ca/citations?user=jPlJJJkAAAAJ&hl=en

613-562-0090 ext. 236