As announcements of 2004 presidential candidacies become more frequent, it may be time to revisit the usability challenges of the voting process. An estimated 4 to 6 million votes were "lost" in the controversial 2000 Presidential election (What Is; What Could Be - Fast Facts, CalTech/MIT Voting Technology Report, July 2001). Of those, an estimated 1.5 to 3 million votes were lost to registration mix-ups, 1.5 to 2 million to faulty polling equipment and confusing ballot design, and up to one million to polling station policy problems. An unknown number of absentee ballots were lost or mishandled.
All of these losses can be linked directly to usability failures.
While much ink has been spilled arguing for usability testing of the specific voting apparatus, there is a need to reassess the voting system as a whole: to evaluate and improve the various task processes, the human-voting machine interaction, and both human-machine and human-human error recovery strategies.
The most public problems in the 2000 voting process occurred with the interpretation of the punch card ballots in Florida. Many have recommended that the outdated balloting systems be replaced with more up-to-date Direct Recording Electronic devices (DREs). DREs are electronic voting machines that look like tablet PCs. To the technologically savvy they appear more usable because they support more direct manipulation, such as touch screens. The flexibility of on-screen presentation allows for significant improvements in voting accessibility: typographic elements such as font size can be modified on a by-need/by-voter basis. DREs may also reduce the cognitive load of the voting process by segmenting election decisions into a series of discrete, sequential decisions (or screens) rather than the current practice of presenting all of the election propositions on a single, somewhat overwhelming ballot. Roth (1998) observed that DRE voting was more systematic, following the traditional Western reading order from left to right and top to bottom, compared to a random pattern on the mechanical lever ballot.
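The screen-by-screen pattern described above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's actual DRE software; contest and candidate names are invented.

```python
# A minimal sketch of the DRE pattern described above: the ballot is
# presented one contest per "screen" instead of as a single dense page.
# All contest and candidate names are hypothetical.
ballot = [
    ("President", ["Candidate A", "Candidate B"]),
    ("Senator", ["Candidate C", "Candidate D"]),
    ("Proposition 12", ["Yes", "No"]),
]

def run_ballot(ballot, answers):
    """Walk the voter through one contest at a time; `answers`
    stands in for the voter's touch-screen input."""
    selections = {}
    for (contest, choices), pick in zip(ballot, answers):
        if pick not in choices:  # a DRE can reject invalid input outright
            raise ValueError(f"{pick!r} is not a choice for {contest}")
        selections[contest] = pick
    return selections

print(run_ballot(ballot, ["Candidate A", "Candidate D", "Yes"]))
```

Each decision is validated as it is made, which is one reason the sequential presentation can reduce both cognitive load and invalid entries.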
The adoption of new voting technology appears to be driven by cost and perceived advancement, so DREs are rapidly replacing mechanical and optical voting apparatus: four times as many voters faced electronic voting machines in 2000 as in 1980. The American public is only now really embracing electronic commerce. Are we ready for electronic voting?
The promise of DREs is great. However, the usability pressures on newer technology are also great: the detailed design of the actual voting screens will have a significant effect on how accurately these new devices capture the voters' intention. Proactive usability analysis and testing becomes even more important as counties and precincts move to adopt the more advanced technologies. To date, the few published studies that address ease and accuracy of voting are not encouraging.
In the wake of the 2000 election, Information Design Journal reprinted Roth's (1998) now-seminal voting interaction studies. She reports two observational studies with participant voters of different ages, using a touch screen versus a lever-mechanism voting machine (Study 1) and various combinations of punch cards/paper ballots and touch screens (Study 2). The studies did not collect voter intention; instead, voting success was equated with submitting a complete ballot (that is, voting on all possible candidates/issues). She identifies several factors that influenced voting success.
Her list reads a lot like a basic usability factors checklist, doesn't it?
More recently, researchers at the University of Maryland evaluated the DRE that is to be adopted by a number of counties in Maryland (Bederson, 2003). In that report, Bederson and colleagues present findings of both an expert review and a (limited) field study evaluating the usability of the voting apparatus. Their converging evaluations uncovered several usability problems that significantly undermine the likelihood of voters casting a complete and intended ballot. They identify issues such as general system failure, voter-card interaction failure, layout, affordances, text size, task flow, error recovery, visual design hierarchy and monitor glare (a particular problem for aging voters, one of the nation's most active voting constituencies).
Voters' confidence in the interaction is also an issue: exit polls reflect that voters are not universally confident that the machine records the vote they intended or attempted to cast. In exit interviews in Bederson's study, ten percent of participants stated that they did not feel confident that the ballot they had cast reflected their intended vote. In the state of Maryland, that would translate to 171,706 actual voters who were not confident about the accuracy of their cast ballot (Maryland State Board of Elections). Fully 8% of participants reported that they only somewhat trusted, or did not trust, the voting system. Based on their expert review of the technology and the field studies, Bederson's research team suggested that polling places should:
"...provide the user with a printed record of the votes electronically recorded. Before leaving the polling place, the voter would be required to certify the contents of the paper record and place it into a ballot box..."
The University of Georgia's Carl Vinson Institute of Government quarterly Peach State Poll asked Georgia voters how confident they feel about the electronic voting systems now in place in Georgia. Press reporting of the survey focuses on the increase in voters' confidence in the machines in general: in 2001, 56% of voters were Very Confident or Somewhat Confident that their vote was accurately counted; in December 2002, 93% of voters reported that confidence level. While that change may seem encouraging, the reverse of that statistic (that 7% of voters were Not Very Confident or Not at all Confident that their vote was accurately counted) should also give readers pause: roughly 88,400 active voters across the state of Georgia did not feel confident about the accuracy of their vote.
Looking closer, voter confidence varies widely along racial lines. While 79% of Caucasian voters were Very Confident that their vote had been recorded properly by the new electronic voting system, only 40% of African-American voters felt that way. This disparity in voter confidence may reflect a variety of factors (e.g., confidence with the interface interaction, the impact of the digital divide, etc.). Critically for the human factors community, this disparity along racial lines highlights the importance of recruiting test participants who truly reflect and represent the end-user population in order to get accurate and meaningful test results.
Studies assessing voting efficacy, defined as casting a complete ballot that reflects the intended choices, are critical to establishing the usability of the various voting tools. Is it truly efficacious to embrace the emerging technology? Work by the CalTech/MIT Voting Project suggests that migrating to electronic balloting may be somewhat premature.
Their work reports longitudinal trends in residual voting errors. A residual voting error occurs when a ballot fails to register a vote for any reason. This may include failure to indicate a selection, indicating multiple selections or (in the case of paper ballots) stray marks that mean the ballot cannot be processed. They report residual voting error statistics for a range of voting technologies, including paper ballots, lever machines, punch cards, bubble (optically scanned) ballots and DREs. In their analysis, the residual voting rate of punch card methods and electronic devices is 50% higher than the rate for manually counted paper ballots, lever machines and optically scanned ballots.
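Because the definition is purely arithmetic, the residual-error rate can be sketched directly. The county totals below are hypothetical, not data from the report:

```python
def residual_error_rate(ballots_cast, countable_votes):
    """Residual voting error: the fraction of cast ballots that fail to
    register a countable vote (undervotes, overvotes, spoiled ballots)."""
    return (ballots_cast - countable_votes) / ballots_cast

# Hypothetical county: 100,000 ballots cast, 98,100 register a countable vote
print(f"{residual_error_rate(100_000, 98_100):.1%}")  # 1.9%
```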
Of particular interest is their additional analysis of the incidence of residual voting errors for counties that have recently switched voting technologies. Here the researchers are provided a unique naturalistic observation opportunity to compare residual error rates of the newly adopted technology with those of the replaced technology. This allowed them to determine if updating increased the likelihood of successful voting.
Comparing the new technology to the residual voting error baseline of lever voting machines (the most prevalent equipment), they derive the change scores shown below. In this table, positive numbers indicate that the newer technology results in a higher residual error incidence than lever machines.
Projected Change in Residual Voting Error by Machine Type

| Machine type (v. lever machines) | Change score |
| --- | --- |
| Paper ballot | -0.55 |
| Punch Card: DataVote | +1.24 |
Note that the prediction algorithm takes into account the fact that DREs protect voters from over-voting: unlike paper/pencil ballots, DREs do not register over-votes. Based on the findings of the CalTech/MIT Voting Project report, only a switch to traditional paper-and-pencil ballots reduced the residual voting error. In contrast, counties that switch from levers to DREs can expect a significant increase in residual voting error: approximately 1% of all ballots cast will be unusable.
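The change scores in the table are just differences from the lever baseline. A sketch of that computation, using illustrative absolute rates (assumptions, chosen only so that the differences reproduce the table's deltas):

```python
# Change score = residual-error rate of the adopted technology minus the
# lever-machine baseline, in percentage points of ballots cast.
# The absolute rates below are illustrative assumptions; only the
# differences mirror the table above.
lever_baseline = 1.50  # assumed residual error, % of ballots cast
adopted = {
    "Paper ballot": 0.95,
    "Punch Card: DataVote": 2.74,
}

for technology, rate in adopted.items():
    print(f"{technology}: {rate - lever_baseline:+.2f}")
# Paper ballot: -0.55
# Punch Card: DataVote: +1.24
```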
Their projection that DRE adoption increases residual voting error is validated by actual county return data: Precincts adopting DRE technology reported an increase in their residual voting error. Precincts that kept their lever technology or adopted bubble (optical) ballots reduced their residual voting error.
The authors suggest several possible causes for this increase in uncountable ballots. They include the technology learning curve, the voter's learning curve, inadequate administrative attention to care and maintenance of voting apparatus and the intimidation factor for less technologically sophisticated voters.
All in all, these findings suggest that even if America actually IS ready for an electronic polling place, the technology to support that migration clearly is not. Basic interface and task flow usability problems that directly undermine voting efficacy surface in both the hardware and the interface design of the publicly evaluated DREs, and many of them reflect screen design challenges that are unique to DREs.
As the task of voting is made simpler through technology, it is important that the county officials responsible for selecting and implementing voting tools become familiar with usability issues and evaluation techniques. Cognitive walkthroughs, heuristic reviews and, most critically, usability testing with representative users should be employed to evaluate and iterate on the designs before the next election.
On May 22, 2003, Mr. Holt submitted the Voter Confidence and Increased Accessibility Act of 2003 (H.R. 2239) to the 108th session of the House of Representatives. Among other things, this bill stipulates in Section B that:
and then later...
All software and hardware used in any electronic voting system shall be certified by laboratories accredited by the Commission as meeting the requirements of clauses (i) and (ii).
A post hoc feedback/error-checking method is not quite the solution we, as usability professionals, would recommend. However, this legislation does indicate that the stakeholders recognize there is a problem... which is typically the first step toward getting on the right track.
Bederson, B., Herrnson, P. and Niemi, R. (2003). Electronic Voting System Usability Issues. ACM Computer-Human Interaction Conference (CHI 2003), Ft. Lauderdale, Florida, April 5-10.
Roth, S. (1998). Disenfranchised by Design. Information Design Journal, 9(1). Reprinted electronically.
CalTech/MIT Voting Technology Project Report: Residual Votes Attributable to Technology (Version 2), March 30, 2001.
Maryland State Board of Elections, 2002 Gubernatorial General Election - Voter Turnout - Statewide.
Georgians Express Confidence in New Electronic Voting System, Carl Vinson Institute of Government, University of Georgia.
Credit for Voting Report, Georgia Secretary of State Elections Office.