Comparison of G20 Compliance Scores, 2008–2012
Caroline Bracht, G20 Research Group
August 28, 2013
Since 2008, the G20 Research Group at the University of Toronto and the International Organisations Research Institute of National Research University Higher School of Economics (IORI HSE) in Moscow have produced reports on the G20's progress in implementing a set of priority commitments issued at each summit. These reports monitor each country's efforts on a carefully chosen selection of the many commitments announced at each summit. The reports are offered to the general public and to policy makers, academics, civil society, the media and interested citizens around the world in an effort to make the work of the G20 more transparent, accessible and effective, and to provide scientific data to enable the meaningful analysis of the impact of this informal international institution. The assessments are based on publicly available information, documentation and media reports. To ensure accuracy, comprehensiveness and integrity, the reports are sent out for stakeholder feedback, and scores are recalibrated if new material becomes available.
There are a number of reasons why accountability reporting is important. First, it helps to ensure that promises made are promises kept. If the commitments made at each summit are not realized, it raises the question of whether summits are even worth holding. Second, accountability reporting indicates whether the commitments made at each summit are actually making a difference. For instance, has the AgResults Initiative presented at the Los Cabos Summit accelerated the flow of private capital to agriculture in low-income countries? Accountability reports identify the sort of outcomes that have been produced as a result of the commitments and further understanding of whether the commitments are having the intended impact. If not, course corrections can be made where necessary and the commitment can be altered accordingly at the following summit. Third, accountability reporting is a critical form of transparency; in addition to self-monitoring, it is necessary to have an independent accountability reporting mechanism. While the G20's effort to produce its own accountability report is useful, self-monitoring will always remain limited. The politics of not singling out a particular country or not issuing the group unacceptably low scores can hinder objective reporting.
Since 2008 the G20 Research Group with a team at IORI HSE has monitored 53 commitments from the first six G20 summits. The newest compliance report on the Los Cabos Summit includes an additional 17 commitments. Monitoring is based on the methodology developed by the G8 Research Group for assessing compliance by G8 members. The assessments are based on a three-point scale. A score of +1 is awarded to a country for full compliance; 0 is awarded for partial compliance or a work in progress; and –1 is awarded for no compliance or negative compliance.
For the first six summits from 2008 to 2011, the average compliance score for all G20 members was +0.41, which translates to 70.5% on the more frequently used 0–100% scale. The average compliance score of +0.59 (79.5%) from the Washington Summit is slightly skewed as it only includes one commitment, in the area of trade. Five commitments from the London Summit were monitored, and the average compliance score decreased to +0.22 (61%). This is the only time the compliance average decreased. Eight commitments were monitored from the Pittsburgh Summit in September 2009, and the compliance average increased slightly to +0.25 (62.5%). Ten commitments were monitored from the Toronto Summit, and the compliance average increased again to +0.32 (66%). The compliance average from the Seoul Summit increased substantially to +0.50 (75%), based on 13 commitments. Sixteen commitments were monitored from the Cannes Summit, and the compliance average was the highest yet at +0.54 (77%). From 2009 to 2011, the G20 increasingly complied with a priority selection of commitments from each summit.
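The percentage figures cited throughout are a linear rescaling of the −1 to +1 compliance average onto the 0–100% scale. A minimal Python sketch of that conversion, checked against the averages quoted above (the function name is illustrative and not part of the published methodology):

```python
def score_to_percent(score: float) -> float:
    """Rescale a compliance average from the -1..+1 scale onto 0-100%."""
    return (score + 1) / 2 * 100

# Spot checks against the averages quoted above:
print(round(score_to_percent(0.41), 1))  # 70.5 -- overall average, 2008-2011
print(round(score_to_percent(0.59), 1))  # 79.5 -- Washington Summit
print(round(score_to_percent(0.54), 1))  # 77.0 -- Cannes Summit
```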
The latest Los Cabos compliance report measured 17 commitments from the day after the Los Cabos declaration was released on June 20, 2012, until July 30, 2013. The scores contained in the report have not been finalized as analysts are still reviewing stakeholder feedback. The preliminary scores indicate that the average compliance score for the 2012 Los Cabos commitments is equal to the Cannes Summit compliance average of +0.54 (77%). Australia had the highest implementation score of +0.94 (97%), followed by Canada, the United Kingdom and the United States, each with the same compliance average of +0.75 (88%). In third place were the European Union, France and Mexico, each with a score of +0.69 (84%). The lowest scoring members were Argentina at +0.32 (66%), Turkey at +0.25 (62%) and Italy in last place with a score of +0.13 (57%). The largest difference between the Cannes and Los Cabos compliance reports was Italy's score: in 2011 Italy had the third highest compliance average of +0.80, but it dropped to last place in 2012.
The difference between G8 members of the G20 and non-G8 members has been decreasing. In the latest Los Cabos report the G8 members achieved an average compliance score of +0.59 (78%), and non-G8 members achieved a score of +0.52 (76%). This difference of 0.07 points is much lower than the difference of 0.25 points in the Cannes report, indicating that the compliance gap between G8 and non-G8 members is closing more quickly. The BRICS members of Brazil, Russia, India, China and South Africa achieved an average of +0.50 (75%) for the Los Cabos Summit, lower than the G8 members' average and almost equal to that of the non-G8 members. This also indicates the increasing strength of BRICS member compliance. Overall, there has been an upward compliance trend among G20 members, which can be partly attributed to the rising scores of non-G8 members. However, there was no increase in the overall compliance score from Cannes to Los Cabos, making it even more important to look at the individual country trends to see where the shifts have been.
In addition to the reports produced jointly for each summit by the G20 Research Group at the University of Toronto and IORI HSE in Moscow, which have monitored a total of 70 commitments from 2008 to 2012, the University of Toronto has added to this database through special studies. These include a specialized study on the Seoul Summit development commitments, individual reports by University of Toronto students and special studies conducted by the G20 Research Group alone. While all of these reports follow the same methodology, only the joint G20 Research Group–IORI HSE reports benefit from stakeholder consultations. They are thus more directly comparable to the joint 2012–13 Los Cabos results presented above, which include stakeholder feedback on the interim compliance report but not yet on the entire report.
The results of this larger database (N=107) differ from those of the smaller database in two ways. First, the larger database reflects a more even distribution of assessments across the seven summits from 2008 to 2012, by filling in data for the initially thinly covered early summits. Second, the larger database is much more heavily weighted toward development and employment, reflecting the contribution of the Seoul Development Consensus (SDC). The SDC database assessed all 25 development commitments and employment commitments from the Seoul Summit in November 2010. Even with these biases, there is a high degree of consistency between the larger and smaller datasets.
A comparison of the current Los Cabos results with the larger dataset suggests the following. With the addition of three compliance reports, Washington remained the summit with the highest performance. Next, the larger dataset indicates that the Los Cabos Summit remained the second highest performing summit, with a compliance average of +0.55 (78%). One significant difference between the smaller and larger datasets is the Seoul Summit average. In the larger dataset, Seoul compliance is heavily weighted toward development commitments, which at the time was a new issue area for the G20. This could account for the decrease in the overall compliance average for that summit.
Summit | Score (joint G20 Research Group–IORI HSE reports) | Number of reports | Score (full dataset including special studies) | Number of reports | Difference
Washington | 0.59 (80%) | 1 | 0.66 (83%) | 4 | +0.07
London | 0.22 (61%) | 5 | 0.17 (59%) | 6 | –0.05
Pittsburgh | 0.25 (63%) | 8 | 0.34 (67%) | 15 | +0.09
Toronto | 0.32 (66%) | 10 | 0.40 (70%) | 14 | +0.08
Seoul | 0.50 (75%) | 13 | 0.37 (69%) | 35 | –0.13
Cannes | 0.54 (77%) | 16 | 0.54 (77%) | 16 | —
Los Cabos | 0.55 (78%) | 17 | 0.55 (78%) | 17 | —
Average | 0.45 (73%) | | 0.42 (71%) | | –0.03
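The bottom row appears to be an average weighted by the number of reports monitored at each summit, rather than a simple mean of the seven summit scores. A minimal Python sketch that reproduces it under that assumption (the variable and function names are illustrative):

```python
# Per-summit compliance averages and report counts taken from the table above.
joint_reports = [(0.59, 1), (0.22, 5), (0.25, 8), (0.32, 10),
                 (0.50, 13), (0.54, 16), (0.55, 17)]   # 70 reports in total
full_dataset = [(0.66, 4), (0.17, 6), (0.34, 15), (0.40, 14),
                (0.37, 35), (0.54, 16), (0.55, 17)]    # 107 reports in total

def weighted_average(rows):
    """Average the summit scores, weighting each score by its number of reports."""
    total_reports = sum(n for _, n in rows)
    return sum(score * n for score, n in rows) / total_reports

print(round(weighted_average(joint_reports), 2))  # 0.45
print(round(weighted_average(full_dataset), 2))   # 0.42
```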