Title: Learning in Development
Author: Olivier Serrat
Publisher: Ingram
Genre: Economics
ISBN: 9789290922087
To help ADB improve its development effectiveness, Board members invited OED to
• develop a comprehensive annual development effectiveness report—building on the Annual Evaluation Review and the Annual Report on Loan and Technical Assistance Portfolio Performance—that presents a serious discussion of results and holds ADB’s Management accountable for what it promised to do;
• work in ways that enhance the link between development effectiveness and resource allocation;
• generally emphasize simplicity in project/program designs;
• keep the focus of ADB on poverty reduction, both income and non-income;
• further strengthen the design and monitoring framework of projects, in particular by identifying killer assumptions and risks; and
• promote more interaction and sharing among ADB departments and offices.
Disseminating Findings and Recommendations. Although there have been improvements, ADB is not yet a learning organization in terms of actively using the lessons documented in OED reports to improve future operations. OED is developing a better system to categorize and disseminate its findings and recommendations using information technology. However, technology by itself will not solve the problem. OED is investing resources in knowledge management to distill lessons and do a better job of disseminating them within and outside ADB. New knowledge products and services are being designed, tailored to specific audiences, in forms that present results in accessible and digestible ways. Objective indicators are being developed to assess whether ADB is becoming a learning organization by using OED findings and recommendations.
Influential Evaluations
Evaluations that focus on key issues and provide usable findings and recommendations in a timely manner are a cost-effective means to improve the performance and impact of policies, strategies, programs, and projects. By challenging accepted thinking, such evaluations also contribute to improving overall development effectiveness.
Box 15: Building a Results-Based Management Framework a
Results-based management involves identifying the impact of an intervention, formulating its outcome, specifying outputs and inputs, identifying performance indicators, setting targets, monitoring and reporting results, evaluating results, and using the information to improve performance. A good quality design and monitoring framework is an integral quality-at-entry results-based management tool that (i) clearly identifies key project objectives with measurable performance indicators, (ii) establishes quantified and time-bound milestones and targets for the indicators at each level of the project, and (iii) specifies the sources of data for tracking implementation progress. Lacking one or more of these elements at entry weakens a project’s design quality.
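The three quality-at-entry elements listed above can be sketched as a simple data structure with a completeness check. This is an illustrative model only; the class and field names below are assumptions for the sketch, not ADB's actual framework schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Indicator:
    """A performance indicator for one results level."""
    name: str
    target: str = ""        # quantified, time-bound milestone or target
    data_source: str = ""   # source of data for tracking implementation progress

@dataclass
class Level:
    """One results level of the framework: impact, outcome, or an output."""
    objective: str
    indicators: List[Indicator] = field(default_factory=list)

@dataclass
class DesignMonitoringFramework:
    impact: Level
    outcome: Level
    outputs: List[Level]

    def quality_at_entry_gaps(self) -> List[str]:
        """List the missing elements named in Box 15:
        (i) objectives with measurable indicators,
        (ii) quantified, time-bound targets at each level, and
        (iii) specified data sources for tracking progress."""
        gaps: List[str] = []
        for level in [self.impact, self.outcome, *self.outputs]:
            if not level.indicators:
                gaps.append(f"no indicators for: {level.objective}")
            for ind in level.indicators:
                if not ind.target:
                    gaps.append(f"no target for indicator: {ind.name}")
                if not ind.data_source:
                    gaps.append(f"no data source for indicator: {ind.name}")
        return gaps
```

A framework that passes returns an empty gap list; each missing target, data source, or indicator set surfaces as a named gap, mirroring the box's point that lacking any element at entry weakens design quality.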
In 2003, an evaluation study on project performance management found that the quality of ADB’s design and monitoring frameworks was poor—particularly in terms of clearly documenting the impacts and outcomes that ADB is trying to achieve. In response to the findings from this evaluation, ADB’s Management developed an action plan to rectify the situation. Multiple actions were initiated to create quality assurance, mentoring, and training capability within the originating departments, which were given clear responsibility and accountability for quality and quality assurance. The vice-presidents of ADB’s operations departments instructed that frameworks be improved for loan and TA operations, and directors general and directors were also required to sign off on frameworks. Recognizing that staff skills needed to be enhanced, the action plan directed that focal points be appointed in all regional departments to promote awareness, consistency, and knowledge sharing. Greater executing agency involvement in the preparation of design frameworks was also expected to deepen executing agency ownership, sharpen design quality, and build understanding that the frameworks would be used as a monitoring tool.
The Central Operations Services Office and OED both played important roles. The former engaged a framework specialist, formulated the project performance monitoring system, and administered the specialist’s initial inputs to draft guidelines and conduct training programs. In 2004, more than 300 staff members attended briefing sessions that OED delivered on framework quality. A video version of this briefing was released for use by resident missions and interested parties. Throughout 2004, OED also responded daily to requests for help in strengthening frameworks. Nevertheless, internal quality assurance alone is unlikely to be sufficient to ensure quality. Independent checking is also needed to confirm that quality assurance systems are working effectively and that quality improvements are actually being achieved. To determine whether the efforts undertaken in 2004 bore fruit, OED subsequently conducted several independent assessments of the quality of frameworks. These assessments confirmed that, prior to implementation of the action plan, the majority of design frameworks were substandard. After implementation of the action plan in 2004, however, there was a sharp reversal: a statistically significant improvement whereby approximately two thirds of project frameworks were judged to be of acceptable quality.
The 2006 Annual Report on Loan and Technical Assistance Portfolio Performance contained a special chapter on design and monitoring frameworks to once again examine their quality and to track changes. The trends in the overall quality of the frameworks prepared at the project design stage and approved each year since 2000 are illustrated below.
Design and Monitoring Frameworks Rated Satisfactory or Better Overall
The significant improvements in design and monitoring framework quality can be plausibly attributed to action plan improvements instigated by evaluation studies. Nevertheless, despite these achievements, too many advisory and regional technical assistance frameworks remain substandard. Past evaluation studies have consistently documented the disappointing performance of ADB’s knowledge products and services. One of the contributing factors appears to be poor planning—particularly at the impact and outcome levels. It should not be surprising, therefore, that a lack of clarity in formulating higher-level project objectives is associated with poor results. OED will continue to monitor the quality of frameworks. The Central Operations Services Office has developed, published, and distributed guidelines for preparing frameworks and has continued to provide training in the understanding and use of this core results-management tool.
In a brief on managing for development results prepared for the DEC in November 2005, the Strategy and Policy Department noted that the Central Operations Services Office had set interim performance targets for framework quality. The goal was to have at least 80% of the frameworks prepared for loan projects and programs, and at least 50% of the frameworks prepared for advisory and regional TA activities, rated satisfactory or better during 2005 and in subsequent years. The 2006 Annual Report on Loan and Technical Assistance Portfolio Performance shows that those targets were achieved in 2005. However, the ultimate target in the short to medium term must be to have all frameworks prepared during the project design phase, for all projects, rated satisfactory or better. ADB is also reaching out from headquarters. Since September 2005, 283 staff from executing agencies in 17 DMCs and 45 staff members from resident missions have attended workshops on project design and management. Ninety-five facilitators from 12 DMCs have participated in related training. Officials from 19 DMCs participated in the Third International Roundtable on Managing for Development Results held in Ha Noi, Viet Nam, in February 2007.
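The interim targets above reduce to a simple arithmetic check. The function below is an illustrative sketch of that check, not an ADB tool; the parameter names are assumptions.

```python
def targets_met(loan_satisfactory: int, loan_total: int,
                ta_satisfactory: int, ta_total: int,
                loan_target: float = 0.80, ta_target: float = 0.50) -> bool:
    """Interim targets from the 2005 brief: at least 80% of loan project/program
    frameworks and at least 50% of advisory/regional TA frameworks rated
    satisfactory or better in a given year."""
    return (loan_satisfactory / loan_total >= loan_target
            and ta_satisfactory / ta_total >= ta_target)
```

For example, a year with 85 of 100 loan frameworks and 55 of 100 TA frameworks rated satisfactory would meet both targets, while 70 of 100 loan frameworks would miss the 80% bar regardless of the TA figure.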
a Available: www.adb.org/documents/ses/reg/sst-oth-2003-29/ses-ppms.pdf