(TNS) — Well before the Iowa Democratic caucuses began on Monday, local observers and political gurus expected the process of choosing among candidates for the party's presidential nomination to be a mess.
Never the cleanest or simplest process even in the most serene of times, the caucuses this year were saddled with new procedures, new rules and new mathematical calculations, all requiring specialized training of caucus participants and leaders.
The organizers expected that any residual complexities could be managed with the help of new technology — specifically, a smartphone app designed to compile and transmit results.
The rest is history — history as farce, that is. Results started dribbling in on Tuesday, but by then they were tainted, perhaps indelibly, by questions about their reliability and the system's security.
The entire episode underscores how dependent we have become on technology — or more precisely, on the idea that technology is the solution to almost any problem.
"People often imagine that they can use technology to smooth out the rough edges of complicated real-world processes," says Meredith Broussard, a computer scientist now teaching at New York University's journalism school. Broussard calls this notion "technochauvinism."
"We need to think about what is the right tool for the task," she told me. "Sometimes it's a computer, but sometimes it's paper."
Doubts about our blind faith in technology were raised most incisively by the late social critic Neil Postman, who was famous for asking about any novel application: "What is the problem for which this technology is the solution?"
Postman's formulation struck directly at people's tendency to overlook the consequences of rolling out new devices, systems and media. He would also ask whose problem was supposedly being solved, and what new problems would be created by the solution — all pertinent questions in the Iowa case.
As my colleague Jeff Bercovici reports, the Iowa smartphone app was developed by a firm called Shadow, which was started and partially funded by alumni of Hillary Clinton's 2016 presidential campaign. The state Democratic parties of Iowa and Nevada each paid around $60,000 to Shadow. But that may not have been enough to support development of a fully functioning app.
Iowa caucus officials, concerned that the app might be a target for hackers, kept much of it under wraps until the caucuses. They were vague about how much testing the app received, whether for vulnerabilities or for its ability to operate at full scale. Indications from Monday are that the problems the app experienced were the result not of sabotage but of incapacity and inadequate training.
Examples abound of technological solutions that inspired confidence at the conceptual stage but failed in practice.
Boeing, for example, used a software tweak to finesse an engineering shortcut it employed to minimize the cost of designing and building its new 737 Max airliner. To channel Postman, the problem it was trying to solve was how to preserve its profitability while bringing a new aircraft to market; it chose to make minimal hardware changes and make up for them with software. That solved a problem for the company's designers and accountants, but, according to the latest indications from investigators, perplexed and confounded pilots, causing two crashes that cost the lives of 346 passengers and crew members.
The initial failure of the government's healthcare.gov website more closely resembles the Iowa meltdown. The website failed during the initial rollout of the Affordable Care Act's individual insurance exchanges — on its first day of operation in October 2013, it was able to enroll only six customers.
The problem in search of a solution here was how to create a website that met the needs of a diverse population of insurance customers who had not been served by the existing insurance market. The solution was well-intentioned, but incompetently executed.
As a government analysis later established, the website's development was hobbled almost from the start by limited resources and inexperienced, confused management. Government supervisors had vastly underestimated the volume of first-day customers, and the site's private developers hadn't done the testing required to ensure that it could handle a scaled-up load.
The Department of Health and Human Services quickly reorganized the effort, but even though the website was soon functioning effectively, the snafu marred the public's perception of the ACA for several years.
Many have taken the Iowa problems as another manifestation of the idea that government can't manage complex systems as effectively as the private sector. But that's too simple.
In the first place, the Iowa caucuses, which are run by the state parties, are closer to private than government entities. There's nothing inherent in managing a technological rollout that can't be addressed by better funding and more consistent management — two features that were lacking in the rollout of healthcare.gov, and that were overcome by more money and new management.
The private sector certainly has had its share of botched technology projects, even with billions of dollars, thousands of lives and priceless corporate reputations hanging in the balance. Boeing's future has been put in question by the 737 Max crashes. But there are many other cases in which a technological advance has led to unforeseen problems, as Postman posited — often to many more problems than existed to begin with.
Stock trading technologies have enabled markets worldwide to accommodate trading volume that would have been unimaginable even at the turn of the century. But this has also been an era of "flash crashes," such as the crash of May 6, 2010, which erased roughly a trillion dollars in market value before the markets recovered, all in the span of 36 minutes — action that would have been impossible without advanced technologies.
Securities markets have become so instantaneously responsive to computerized orders that errors resulting from someone pressing a wrong button have earned their own moniker — "fat finger" trades.
These cases should make us wary of claims offered for technologies that are promoted as almost ready for the real world but are in reality still in their infancy. Take driverless cars, for example.
Autonomous driving systems are billed as the solution to human errors behind the wheel. But so far, they haven't come close to amassing enough real-world mileage to make any scientifically valid claim about their safety compared with human drivers.
Arguments that they will be inevitably safer therefore hinge on trust in technology — which experience tells us may be misplaced. And that's not even getting into the new problems this technology may create, which can't be predicted today.
Concerns about the security of electronic voting systems have been raising the hackles of election experts — and technologists — for years. The Iowa caucus experience should place a spotlight on the further risk of operational breakdowns at critical moments. For both reasons, more states are planning to mandate paper ballots or pondering such a rule for this year's election, if only as a backup to paperless systems. But they're not yet a majority, even though the rule should be universal.
As it happens, the Iowa caucuses did have paper ballots as backups. On Tuesday, those ballots were being used to verify Monday's voting totals. "They did the right thing," Broussard says. "They just didn't think they'd need them." They know better now.
©2020 the Los Angeles Times. Distributed by Tribune Content Agency, LLC.