

Tuesday, January 31, 2012

Online RBM Training -2: The UN's Programme Performance Assessment in Results-Based Management

--Greg Armstrong --

The UN’s Programme Performance Assessment in Results-Based Management is a jargon-laden, time-consuming and partially out-of-date production that, whatever its original merits, is too frustrating to be useful today. There are more productive ways to spend your time.

Level of Difficulty: Moderate to Complex
Primarily Useful As: Difficult to say - a history of UN RBM jargon?
Most useful: Section 4-5 on data collection and reporting
Limitations: Out-of-date RBM language, jargon-laden, boring format


Online RBM training is unlikely to provide the hands-on practice and the group-process learning that are the foundation of effective Results-Based Management training and implementation, but some courses are definitely better than others.

After my December 15, 2011 post describing the University of Wisconsin’s effective interactive online results-based management course, I was asked by one reader for my opinion of other online RBM courses. The University of Wisconsin’s Logic Model course is, in my opinion, excellent, making the most of the interactive opportunities available in the absence of live training.

On the other end of the spectrum, however, is an online course available from the UN Office of Internal Oversight Services: Programme Performance Assessment in Results-Based Management.

Google search results - Online Results-Based Management Training

A plain-word search for "online results based management course" (without the quotes) on Google's .com search returns, depending on the day, more than 80 million hits. On page one of that search, at least as of January 31, 2012, the first and third results shown are for this specific UN RBM course.

Those search results may vary slightly given country-specific versions of the search engines, but Microsoft's Bing and Yahoo also return several million results on this topic, again with this UN RBM course ranking very high.

The highest-ranked result, with a 2011 date, is UNESCO’s Open Training Portal, and clicking the UNESCO link takes us to what appears to be the home of the course, the UN’s Office of Internal Oversight Services, which is number 3 on the search list. The course also appears as a link in some other UN documents.

So - how does this highly-ranked course stand up to scrutiny?

A disappointing use of the web

For the general development community this online RBM course is not worth the time it will take to complete. With over 150 screens, it could easily take 3-4 hours to get through, and most of us could get the same information, in more detail, from written manuals in a shorter period.

For UN staff, it is possibly of greater use than for the general public, other implementing agencies or national partners, but even here there are limitations. It may seem unfair to criticise the course's lack of utility for the general development community when it was originally designed primarily as an in-house United Nations training tool. The problem is that for any user, UN or not, there is little that is compelling in the way it is put together.

Not an interactive course

This RBM course makes no use of the engaging possibilities of interaction that the University of Wisconsin used to advantage in an online Logic Model course designed at least 3 years before this UN course was published.

At the very lowest level of potential interactivity, the menu on the UN course is extremely limited. Readers can skip to one of four “sessions”, but to reach subthemes within them (after quitting and rejoining the course, for example) laborious clicking is required to move slide by slide, forwards or back, to potentially interesting topics.

There is some interesting material here, but nothing you can’t find more easily in printed handbooks or electronic files. The content of this course is basically material taken from other available online RBM manuals and documents.

The exercises, where they occur, simply ask the viewer to download Microsoft Word documents and complete them.

The few tests of skill, such as matching technical terms like “Impact”, “Objective” or “Indicator” with their definitions, either require the viewer (and I hesitate to call anyone using this a learner) to print out lists and match them up with a pen, or present a series of “either-or” choices too simple to be useful.

"Interactive" pen and paper

This course, then, is basically just a 152-slide PowerPoint presentation. There is nothing wrong with putting PowerPoint presentations online if they are creatively put together. A good PowerPoint can be engaging, applying the limited but still useful animation possibilities that PowerPoint offers so that ideas are sequenced and highlighted.

Interesting substantive points about data collection and reporting

The sections on data collection (section 5, slides 101-126) and on reporting results using the Story Pyramid (roughly slides 138-146) actually have some useful material, but both could have been much better had this material been presented in a more engaging and interactive manner.

The slides on qualitative and quantitative data, and causality have some interesting ideas, for example.

But even here, what we see in this screenshot is exactly what we get: the whole page is dropped in front of the viewer in one load, with no drop-downs, no sequencing, no animation, no emphasis. It is essentially what we would get, with much less trouble, by reading one of the many PDF manuals or Word documents on results-based management available from UN sources.

Discussing qualitative and quantitative data collection - static format

A substantial amount of the text on these screens is made up of photographs of text: JPG files inserted into the slides. Pasting text like this is a bad idea even for PowerPoint presentations where viewers have a big screen and a facilitator to wake them up with a joke or two, but in a static, online view it makes reading difficult.

Jargon and out of date RBM terms

This course slips into academic jargon from time to time, and many of the 152 slides carry long blocks of text. References to the “post hoc ergo propter hoc fallacy” and the “dichotomy between quantitative and qualitative research” may show the erudition of the writers, but do nothing for intercultural adult learning.

Another potential problem is the outdated UN RBM jargon. Take the definition of "Outputs" for example:

Outdated results definitions

For a number of years the UN has been moving, very slowly, towards definitions of Outputs that see them as a result (a real change in capacity) rather than just the completed activities that so many field-level projects appear to settle for in their reporting. The current (as of 2011) UN definition of "Output", for example, from the UNDG RBM Handbook, looks like this:

“Outputs are changes in skills or abilities, or the availability of new products and services that must be achieved with the resources provided and within the time-period specified. Outputs are the level of result where the clear comparative advantages of individual agencies emerge and where accountability is clearest. As stated in section 1.7, UNDAF results should be formulated in change language.” [p.13]

Even the 2007 UNDG technical brief on Outputs, written within a year of the first apparent appearance of the UN Office of Internal Oversight Services online RBM course, had a more complete view of how Outputs should be treated in reporting:

"You may be tempted to list things like workshops and seminars as outputs. After all, they are deliverable and some workshops can be strategic if they gather decision takers in one room to build consensus.  But, in most cases, workshops and seminars are activities rather than outputs. And remember that outputs are not completed activities – they are the tangible changes in products and services, new skills or abilities that result from the completion of several activities. "   [p.2 - italics added]

The same statement is included in the 2011 updated Technical Briefs on Outcomes, Outputs, Indicators, and Risks & Assumptions [p. 5], so the idea of accountability for genuine results is not disappearing.

But while some in the UN are conscientiously trying to move staff to reporting on real results, this programme performance assessment course defines Outputs as essentially completed activities, and if people take this definition as current, it will undermine UN attempts to move towards genuine results reporting.

The Black Hole in UN Results Reporting

The problem, from the bilateral donors’ point of view, with much of the UN results framework is that ambiguity over whether an Output (the primary level of reporting for individual UN projects) is a completed activity or a genuine change in capacity has given unambitious UN project managers an excuse for easy reporting on completed activities.

Meanwhile, the bilateral donors who actually provide the money for most of these projects have to report on real results. This UN results-based management course uses the older definition of an Output, and because it carries the continued imprimatur of the Office of Internal Oversight Services, it reinforces the laziest side of UN reporting.

This leaves a great black hole in the results chain, with reports on completed activities at the project level at one end, a long conceptual leap to country results at the other, and no assessment of the quality or utility of the intervening completed activities. If UN agencies, like their bilateral counterparts, genuinely want to test assumptions about which activities and theories of change actually work, and which do not, then there is no excuse for not assessing the immediate results (changes in understanding, attitudes, behaviour and professional practice) that the activities are intended to produce.

Repeating the most limited interpretations of what is expected in terms of results will do nobody any service if this approach is reinforced by an online course that ranks, at least in January 2012, so high in internet search results.

Similarly, the course puts substantial attention on “Accomplishments” instead of what the UN agencies have all, for several years, referred to as “Outcomes”.

What do you do with an out of date course?

This RBM course is clearly out of date. The last substantive slide (#151), for example, refers to an exercise using 2002-2003 data and to potential forthcoming work on a 2004-2005 report.

Using 2003 data

The case could be made, I suppose, that the course was originally put online in 2005, and that as times change we should not expect too much. But there are multiple yearly search listings for the course, with somebody apparently updating them so they appear current in search-engine results. If you do a date search for "Programme Performance Assessment" on Google, for example, you will find listings for this course (as if it had been updated) for 2005, 2006, 2008 and 2011.

Search results for the online course 2005-2011

I compared the March 2005 version of the course with the 2011 version and could find, before I dropped off into a semi-comatose state, no difference between them.

It is worth noting that the UN Office of Internal Oversight Services' own search reference lists the course as updated only as far as 2008, although in comparing the versions it is unclear to me what, if anything, changed between 2005 and 2008.

The more recent October 2011 listing is a reference from the UNESCO Opentraining site, which describes the course this way:

“The United Nations has adopted a results-based approach and in this, programme performance assessment is an essential component. It is an invaluable tool for programme managers to achieve the twin goals of accountability and effectiveness. The present training is designed to allow learners to work at their own pace in order to acquire the skills they will need to implement programme performance assessment.”

While the programme performance assessment approach in general may be "an invaluable tool", this course, unfortunately, is not.

Missing links – an indicator of website utility

On the whole, while the course may have had some potential in 2005, it is too frustrating to be useful today.

Even where potentially interesting documents are referenced, the links are sometimes dead. Section two, for example, references a manual on “How to Design and Carry Out Data Collection Strategies for Results-Based Budgeting”. If you have to do results-based budgeting, as some of us do, this could be quite useful, but the link referenced there does not work.

And it is not just that the PDF file may have been moved; that can happen to any of us. The link does not work because it is not written in valid HTML: a critical “dot” has been left out of the link's address. It looks the same in the 2005 version of the course as it does in the version online today. But even if you correct the link, it is still out of date, and there is, in fact, no downloadable document that I could find, even after considerable time spent searching.

It is a bit mind-boggling that nobody has noticed this, or any of the other missing links, in the six or seven years this course has been online, presumably intended for use by people who would want to download the documents. That the same broken links appear in the still-available 2005, 2008 and 2011 versions suggests that in all likelihood nobody has clicked on them in years, or pointed them out to the webmaster. All of us who put work on the web occasionally find we are left with expired links, and most of us try to review our sites every few months to test and update them. Any competent web analytics programme can tell a webmaster when a link is broken, but evidently nobody responsible for this online RBM course much cares.
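Catching a malformed link like this takes very little effort to automate. The sketch below is purely illustrative (the page and URLs in it are hypothetical, not the actual OIOS content): it uses only the Python standard library to pull every link out of an HTML page and flag those that are not even syntactically valid, the kind of check that would have caught the missing “dot” years ago.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse


class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def is_wellformed(url):
    """A link passes only if it has an http(s) scheme and a dotted host name."""
    parts = urlparse(url)
    return parts.scheme in ("http", "https") and "." in parts.netloc


def broken_links(html):
    """Return the links on a page that fail the syntactic check."""
    extractor = LinkExtractor()
    extractor.feed(html)
    return [url for url in extractor.links if not is_wellformed(url)]


# Hypothetical page: the second link is missing the critical "dot".
page = (
    '<a href="http://www.un.org/manual.pdf">good link</a>'
    '<a href="http://wwwunorg/manual.pdf">broken link</a>'
)
print(broken_links(page))  # → ['http://wwwunorg/manual.pdf']
```

A real link checker would of course go further and issue an HTTP request for each syntactically valid URL, reporting anything that does not respond, but even the minimal syntactic pass above is enough to catch a link with the dot left out.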

As indicators of website utility go, this one seems to suggest that very few people are actually viewing the course in any detail and, again, that those managing the site also have little interest in whether it works properly or not.

So, if the course is out of date, there are basically three choices:

a) make it clear, with a little simple editing, that this is a 2005 document and may be out of date (fair warning to the user);
b) simply take it offline; or
c) replace it, one hopes, with something genuinely interactive, or at least up to date.

The Office of Internal Oversight Services website

It should not, perhaps, be surprising to find such outdated material on the UN Office of Internal Oversight Services' website. The UN Internal Evaluation Division section of the site says that
“IED is committed to providing timely, valid and reliable information from an independent perspective that can be used to strengthen the Organization.”
But while some OIOS reports are up to date, others are not, in their own terms, at all "timely". At least as of January 2012, the latest available programme performance report covers the 2006-2007 year.

The real indicator for me of how the OIOS approaches updating the site, however, is the section called “What’s new?” When I reviewed it in January 2012, I found that only one of 11 references was to any date after 2009. That one reasonably recent link was to a 2011 fraud alert.

Aside from that, here is what the OIOS website considers new:

  • An undated page on risk management, with at least one non-functioning link;
  • A March 2009 version of the Internal Audit Division’s Audit Manual; and
  • An advance notice (!) for an upcoming workshop to mark the 15th anniversary of the Office of Internal Oversight Services, in October 2009.

Indicators for completed activities - not results

Even where there is more recent material, such as in the Investigations Division monthly performance indicators, the documents are dispiriting if we are looking for any clear guidance on results, not just on completed activities.

There are 26 monthly reports there, ranging from October 2009 to November 2011. Not one of them has indicators about results:

  • No indicators of the quality of investigations;
  • No indicators of sanctions applied to violations;
  • No indicators of the resolution of investigations;
  • No indicators of the long-term changes to which the investigations might have contributed.

All of the indicators refer to completed activities (number of cases received, number processed, age of cases) - the kind of things that are interesting indicators of process, but not of results.

So, with this limited type of reporting, and with such a casual approach to keeping materials up to date, it should really be no surprise that this organization's online RBM course should be outdated.

A possible alternative RBM course for UN staff

Now, it may be that this course has been superseded, for UN staff, by the online course Measurement for Effective RBM, but if that is the case, it is not at all clear from any of the UN websites.

In any case, for the general public -- those of us working with NGOs, governments or other donors -- that RBM course is not accessible. Users must have an "official UN system organization e-mail", must be nominated (it is not clear by whom), and there is the small matter of the fee: US$1,850 per person.

That course may well be more interactive and more entertaining than the Programme Performance Assessment online course - but at that price, it should be!

The bottom line:

If the creators  - or managers - of this online RBM course can’t spare the time to update it, why should you spare the time to read it?

If you want an introduction to the processes of thinking about results that remains valid today, I still recommend the University of Wisconsin’s online RBM course, Enhancing Program Performance with Logic Models. Although it was created three years before the UN course (ten years ago, as I write this), it makes much more creative use of what the internet can offer for interactive training, the links work, and it additionally provides a complete PDF version of the course as a supplement.

And if you want to match this understanding of RBM processes with an understanding of the RBM terms the UN agencies are using, then instead of wading through the Programme Performance Assessment in Results-Based Management course, your time would be much better spent reviewing the many documents available online at the UNDG RBM site. You will find conflicting information in the documents there, because the UN itself has evidently not yet sorted itself out on results-based management, but at least you will be getting the latest thinking on the subject from people who are trying to make RBM work in the UN context.

[Edited to update links February 13, 2012]


Greg Armstrong is a Results-Based Management specialist who focuses on the use of clear language in RBM training, and in the creation of usable planning, monitoring and reporting frameworks.  For links to more Results-Based Management Handbooks and Guides, go to the RBM Training website
