Jun 17, 2020 Letters
Dear Editor,
A careful examination of the “Report of the CARICOM observer team for the recount of the Guyana March 2, 2020 elections” has raised a number of important concerns about the structure, process, and the outcome of the analysis described in the 132-page document.
First, it is unclear whether the Team’s categorical observation that their report was based on “…an audit” and therefore “…was not in fact a recount” is consistent with what is described elsewhere in the document. For example: “Overall, while we acknowledge that they were some defects in the recount of the March 02, 2020 votes cast for the General and Regional Elections in Guyana, the Team did not witness anything that would render the recount … from reflecting the will of the voters. The actual count of the vote was indeed transparent” (p. 2). There is a need to redefine what “transparent” means within the context of a report that was limited in its observation. Also, the relative frequency with which the word “recount” (approximately 92%) appears compared with “audit” (about 8%) adds to the conceptual confusion about the approach taken in this report.
Additionally, the observation that “GECOM’s elaborate and unnecessary checklist (see Appendix IV). A checklist which was unnecessarily/excessively burdensome and which was suggestive of an audit rather than a recount” highlights the conceptual confusion about the approach the Team adopted in their work. This issue by itself is not especially problematic; however, given that a recount is clearly a type of audit, one wonders about the potential impact that an unclear framework might have had on the process and the outcome of this report.
Second, the constraints that the Team described in the document (see page 5, para. 1) included the number of Team members (n=3), which resulted in only 423 workstations, or 18.09%, being observed in the process. This methodological approach, or “The Recount Strategy,” is fundamentally flawed but was nonetheless the framework that formed the basis for the conclusions reached by the CARICOM Team. The implications of this unscientific methodology are far-reaching. The prior assumptions about the adequacy of the size of the CARICOM Team are brought into question and should be considered when assessing the value that is placed on this limited and therefore incomplete report. A report is only as good as the research design, or in this instance “The Recount Strategy,” that it uses. Parenthetically, when was it known “… that it was virtually impossible…” for the Team to adequately deploy across the workstations? More importantly, as is described later, are the decisions that were made in the face of the unfortunate reality of the limitations that confronted the Team not consequential? The Team’s solution to this problem is at the heart of the challenges that are inherent in the findings posited in this report.
Third, the discussion about the selection of the specific workstations (n=423, or 18.09%) does not point to an approach that can be trusted, at least in terms of the results of the analysis. Was the choice of workstations based on established principles of random selection? The document does not even attempt to deal with this well-known problem of making inferences from a sample (i.e., n=423) to the general population from which the sample was drawn. Additionally, even if one grants the Team a passing grade on the assumption “… that it was virtually impossible …” to observe 100% of the available workstations, it is virtually irresponsible to omit from the report a description of the steps taken to ensure that their work is representative of the Guyanese electorate. So the reader is left asking what, if any, of the basic principles of random sampling were considered or implemented in this design. If the criteria for any type of random sampling are not met, then the platform on which the results are built is fragile and potentially fatal to any attempt at a “free and fair election” process. Thus, in the face of the violation of well-established guidelines for the analysis of data, the results cannot be trusted. The failure to describe the criteria used to select workstations within and across each region destroys any confidence that one could reasonably have in the process and the outcome of the work of the CARICOM Team.
Fourth, the percentages of ballot boxes observed were as follows: Region 4 (37%), Region 3 (18%), Region 6 (14%), Region 7 (26.82%) and Region 9 (24.65%). These percentages show no consistent pattern that accounts for the relative number of ballot boxes by Region. Had the selection of stations been based on a statistical approach that accounted for the relative size of each Region, one might have been more confident that the CARICOM sample could be trusted. The core issue across all of these observations about the methodology and the results of the ‘recount’ is the flawed design of the CARICOM Team’s work. Moreover, the irony of this report is that the solution (i.e., the CARICOM Team) has created more problems than clear answers to the issues it sought to resolve. Thus, in the absence of a systematic approach to a process that was intended to engender confidence in an election, the electorate is left with a report that is conceptually flawed and statistically imprecise.
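For readers unfamiliar with the sampling principle invoked above, the size-proportional stratified selection the letter calls for can be sketched in a few lines. This is a minimal illustration only: the region names follow the letter, but the ballot-box counts and the function name `proportional_sample` are hypothetical placeholders, since the report’s underlying totals are not reproduced here.

```python
import random

# Hypothetical ballot-box counts per Region (illustrative only; the
# letter reports observed percentages, not the underlying totals).
boxes_by_region = {
    "Region 3": 400,
    "Region 4": 879,
    "Region 6": 300,
    "Region 7": 60,
    "Region 9": 80,
}

def proportional_sample(strata, total_sample):
    """Draw a simple random sample within each Region, sized in
    proportion to that Region's share of all ballot boxes."""
    grand_total = sum(strata.values())
    sample = {}
    for region, n_boxes in strata.items():
        # Each Region contributes in proportion to its size.
        k = round(total_sample * n_boxes / grand_total)
        # Random selection without replacement within the Region.
        sample[region] = random.sample(range(n_boxes), k)
    return sample

picked = proportional_sample(boxes_by_region, total_sample=423)
for region, boxes in picked.items():
    print(region, len(boxes))
```

Under such a scheme, every Region is observed at roughly the same rate, rather than the uneven 14%–37% coverage described above, and the resulting sample supports inference to the full population of ballot boxes.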
Finally, this report should be viewed more as an attempt to create a new methodology that blatantly disregards standard, best-in-class approaches to a ‘process audit’ and/or an electoral recount. As a result, we have a report that excludes data from over 80% of the possible workstations, thereby rendering conclusions from an ‘audit’ based on an unscientific selection of 18.09% of the workstations useless at best. Overall, the multiple contradictions observed throughout the document are striking. In addition to those noted above, the inconsistent description on page 24, paragraph 1, is emblematic of similar contradictory statements in the document. That is, the Team suggested that the GECOM staff “… for the most part” were “well trained in the basic procedural matters” while admitting that “… it was also evident that there were varying degrees of efficiency and effectiveness of the staff.” These evaluation concepts capture the ideas of ‘doing things right’ (efficiency) and ‘doing the right things’ (effectiveness). If, in fact, ‘varying degrees of efficiency and effectiveness’ were found among the staff observed, then it should follow that there were ‘varying degrees’ of doing things right and of doing the right things. Clearly, there is a need to consider the impact of these and the other identified “basket” of concerns on the outcome of the election.
Yours truly,
Dr. Richard Van West-Charles