This year’s best movie about a spirited band of resisters fighting an empire of evil isn’t the latest entry in the “Star Wars” franchise, but “Darkest Hour,” an extraordinarily deft and moving depiction of the outset of Winston Churchill’s prime ministership during World War II.
Cabinet meetings and political intrigue aren’t the most natural cinematic material, although the underlying event in “Darkest Hour” is one of the most dramatic in modern history: One man standing defiant before the onslaught of an enemy army, rallying his nation with his willpower and words.
Discounting for Hollywood embellishments, the movie is worthy of this story, which is high praise indeed. In particular, Gary Oldman’s portrayal of Churchill is so compelling that the Academy Award for best actor should be signed, sealed and delivered to him right now.
Upon taking power, Churchill faced disaster on every front in the war, yet bucked internal political pressure to explore a deal with Adolf Hitler. In his marvelous history of this crucial interlude, “Five Days in London: May 1940,” the great historian John Lukacs writes, “Then and there he saved Britain and Europe, and Western civilization.”
By his account years later, Churchill felt a sense of relief at being put in charge: “At last I had the authority to give directions over the whole scene.” But his bodyguard reported that when he congratulated Churchill on his ascension and noted the enormous task ahead, the new prime minister replied, tears in his eyes: “God alone knows how great it is. I hope it is not too late.”
In 1937, Churchill’s reputation had been at a low ebb, but he recovered on the strength of his acuteness about Hitler. When Neville Chamberlain returned from Munich, Churchill gave a speech in the House of Commons declaring “we have sustained a total and unmitigated defeat.” Britain’s position slid downward from there.
The same day that Churchill became prime minister, Hitler’s army invaded Western Europe in earnest, sweeping all before it and eventually trapping the British at Dunkirk.
Given the circumstances, the desire of Viscount Halifax, Churchill’s inherited foreign secretary, to explore peace terms wasn’t unreasonable, just profoundly wrong. Lukacs writes that Halifax knew “how to adjust his mind to circumstances rather than attempt to adjust the circumstances to his ideas.” Churchill thought differently. A contest ensued between the two of them in the War Cabinet, where the new prime minister’s position wasn’t unassailable.
Churchill opposed any deal. He was convinced, Lukacs notes, “that such a settlement, under any conditions, could not be counter-balanced by a maintenance, let alone a guarantee, of British liberty and independence.” Churchill bent a little toward Halifax when he initially felt it politically necessary, but ground him down and ultimately outmaneuvered him.
In a key episode, Churchill went to the larger Cabinet and won overwhelming approval for his stalwartness. Here, he made his famous statement, “We shall go on and we shall fight it out, here or elsewhere, and if at last the long story is to end, it were better it should end, not through surrender, but only when we are rolling senseless on the ground.”
After the war, Churchill wrote of the reaction of his colleagues: “Quite a number seemed to jump up from the table and came running to my chair, shouting and patting me on the back. There is no doubt had I at this juncture faltered at all in leading the nation, I should have been hurled out of office.”
He didn’t falter. Churchill tapped into and built up the resolve of the British people. “There was a white glow,” he wrote later, “overpowering, sublime, which ran through our island from end to end.” Hitler wouldn’t neutralize the British, who escaped Dunkirk and kept up the fight.
The so-called Great Man theory of history might be overly simplistic, but history indisputably has its great men. “Darkest Hour” does justice to one of them.
‘Tis the season to confess error.
For years I have argued — to my students, to my readers and to lecture audiences — that it’s not rational to vote if your intention is to influence the outcome.
Following the lead of the late economist Gordon Tullock, I’ve challenged them to come up with an example of an election where a single vote made the difference. No one has offered an answer.
Small wonder. The odds against one vote affecting the outcome, even in a local election, are enormous. Thus, as I’ve always explained, if I happen to skip an election or two — or all of them until the day I die — the world of politics will not be altered by a millionth of a millimeter.
But maybe it’s time to rethink that, because we now have a tie. In the race to represent the 94th District in the Virginia House of Delegates, Democratic candidate Shelly Simonds and Republican incumbent David Yancey are tied, at 11,608 votes each. In the Election Day count, Yancey was ahead by 10 votes. A recount put Simonds ahead by one. A panel of judges then awarded a disputed ballot to Yancey. At stake is the Republican Party’s 17-year dominance of the state Legislature.
An election so close is rare but not unheard of. A 2002 paper by economists Casey B. Mulligan and Charles G. Hunter analyzed 16,577 elections for the U.S. House of Representatives between 1898 and 1992 and found only one contest, in nearly a century, that was decided by a single vote. (According to the authors’ calculations, the likelihood that a single vote will be pivotal in a congressional election is about 1 in 89,000.) When they studied state legislative elections, they found nine more.
Close elections are a problem. Landslides create hubris, but when the outcome rests on a handful of votes, one side is bound to think it’s been robbed. There’s no way to fix this. If Smedley beats Smithers by 100 votes, and a recount shows Smithers up by 50, Smedley’s people will be furious. They’ll demand to know why the new count is more to be trusted than the old. Should we do two out of three? Five out of nine? The loser will never be satisfied. Lots of Republicans still think Norm Coleman beat Al Franken in Minnesota’s 2008 Senate race, and lots of Democrats still think Al Gore rather than George W. Bush should have been awarded Florida’s electoral votes back in 2000.
There’s another reason to be skeptical of recounts. It turns out that we don’t count votes terribly well. A 2012 study found that although some methods of tabulating ballots are better than others, we can generally expect an error rate of 1 to 2 percent. Although we can’t predict which way the errors will fall, it’s unlikely that they will sum precisely to zero – in other words, there will always be mistakes. So each time we count, we can expect a different result. The defeated side’s partisans will never believe the count was fair, and although they’ll be right, their grumbling about the matter will affect their view of how well electoral democracy works. That can’t be a good thing.
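To see why two counts of the very same ballots almost never agree, consider a minimal simulation. The numbers here are illustrative assumptions, not data from the study: roughly the ballot total from the Virginia race, a dead-even split, and a 1 percent chance that any given ballot is misread.

```python
import random

random.seed(42)
N_BALLOTS = 23216   # roughly the total cast in the Virginia 94th (illustrative)
ERROR_RATE = 0.01   # assumed 1 percent chance any single ballot is misread

def tally(true_votes_a):
    """Count the ballots once; each ballot is misattributed with probability ERROR_RATE."""
    votes_a = 0
    for i in range(N_BALLOTS):
        truly_a = i < true_votes_a
        misread = random.random() < ERROR_RATE
        # A ballot lands in A's pile if it is truly A's and read correctly,
        # or truly B's and misread.
        votes_a += truly_a != misread
    return votes_a

# Count the same perfectly tied election three times.
counts = [tally(N_BALLOTS // 2) for _ in range(3)]
print(counts)
```

Each run tallies the identical set of ballots, yet the totals wobble by dozens of votes, because the random errors never cancel exactly. When the true margin is smaller than that wobble, a recount is less a correction than a second roll of the dice.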
One way to avoid this difficulty would be to hold not a recount but a new election whenever the victor prevails by, say, less than one half of 1 percent. This admittedly expensive solution might yield a clearer outcome, should lots of those who stayed home the first time now see the virtue of turning out (or vice versa, one supposes).
An easier way to avoid the grumbling is perhaps harder: The candidate who is defeated on election night could simply accept the result, even in a very close race. The graceful concession by Republican incumbent Kelly Ayotte, who on election night in 2016 was declared to have lost her U.S. Senate seat in New Hampshire to Democrat Maggie Hassan by a bit over a tenth of a percentage point, provided a fine example that others might follow. Ayotte would have been within her rights to demand a recount, and one was expected. Instead, she spared her state that ordeal, allowing Hassan to begin her term without any taint. By conceding, Ayotte strengthened rather than weakened democracy.
Meanwhile, back in the Virginia 94th, as election officials were set to conduct a drawing to break the tie, Simonds asked a court last week to declare her the winner instead. The drawing is now scheduled for Thursday — unless a court decision preempts that.
Actually, we’ve been here before. In 1994, a state legislative race in Wyoming ended in a tie. Republican Randall Luthi was declared the victor over independent Larry Call after a pingpong ball with his name on it was drawn out of a hat. The winner later called the outcome “democracy at its best.” He was wrong, of course. But anything’s better than counting the ballots again.
MADISON — There are many who believe society has “summoned the demon” through technologies that threaten to leave us addicted to screens and games, more susceptible to cyber-snoops and replaceable by machines.
Those dangers and more are possible, of course, if the same society that creates technologies fails to remember the purpose for doing so — helping people to lead better lives.
In 2018, the pace of tech innovation will accelerate in areas where once-fanciful ideas are becoming integral to commerce, health, entertainment, security, learning, energy, manufacturing and more. Here are a few trends to watch:
BIoT: That acronym describes the intersection of blockchain, a shared digital ledger that allows institutions to share information and assets with others more selectively and securely, and the Internet of Things, which connects devices through sensors. Experts believe the maturing use of blockchain will make IoT more useful while reducing the risk of hacking through a “chain” of digital records spread over thousands of computers. Possible applications include shipping goods, managing financial records and making energy systems more efficient.
CRISPR: Another acronym, it’s shorthand for a scientific technique, adapted from a defense that bacteria use against viruses, that can be used to modify DNA in plants, animals and humans to combat genetic disease. It’s the next step in the genetic editing that led to Food and Drug Administration approval in 2017 of therapies for certain leukemias and blindness. Such therapies don’t work for everyone and they may not last forever, but the head of the National Institutes of Health believes a revolution is at hand. Future applications may include hemophilia, sickle cell disease and muscular dystrophy.
Quantum computing: While the field is still theoretical in some ways, there have been breakthroughs that suggest useful quantum computers are within reach. The technique harnesses the behavior of energy at subatomic levels to vastly speed up computing for a variety of uses. As Chemical & Engineering News reported, quantum computing could be used to model chemical systems that cannot be solved by conventional computers. Possible applications include advances in motors, magnets, power grids, fuels and solar-cell materials.
Computing job market: Even if quantum computing is still a bit “Star Trekish,” the demand for computer workers will continue to grow. Leading UW-Madison faculty noted in a recent presentation that projected job openings for people who hold a bachelor’s degree or higher in computer science will far outstrip the number of degree holders through 2024. The demand is even greater than what the federal Bureau of Labor Statistics predicts for life scientists, physical scientists and engineers.
Virtual reality: As Smithsonian magazine recently reported, people have been fascinated by the notion of 3-D visual experiences since the mid-19th century. Technology and the availability of relatively inexpensive head-mounted devices have made virtual reality much more real since 2012, with applications ranging from architecture to games, from surgery to space simulation, and from education to psychotherapy. Heads-up displays and augmented reality will likely affect how people drive, work, shop and play (think Pokemon Go) in the future.
Better broadband technologies: Remember when fiber-optic cable was the only real way to deliver Internet connectivity? In time, connectivity in remote areas may be improved through wireless technology mounted on power lines, “Li-Fi” wireless tech based on LED lights, high-altitude balloons and simply making better use of unused television spectrum, called “white space.” As the Federal Communications Commission has noted: “(White space) is ripe for innovation and experimental use, holding rich potential for expanding broadband capacity and improving access for many users.” Microsoft has a 12-state white spaces pilot program in the works that would include Wisconsin.
Artificial intelligence: Voice and facial recognition are among many examples of how AI is already prevalent in society, not to mention some of the functions on the latest smartphones. Still, it’s the technology that often scares people the most — especially those who worry that machine-learning computers can become smarter than humans (probably true) and take over the human race (science fiction). As explained in a recent edition of Popular Mechanics, the promise of AI lies in giving humans better tools with which they can solve problems. Applications now or soon: smart spam filters, faster Netflix and longer battery life.
Technology has its drawbacks and its dangers, especially if used to evil ends, but very few people would turn back the clock on advancements in regular use today. It’s a reason to be excited about the future, not fretful.
I realize La Crosse has a pigeon problem that must be addressed, but I also believe there is a more humane and safe way to address it.
The city’s method may be a “quick fix,” but it doesn’t necessarily solve the problem. La Crosse plans to poison the birds, net them and then, if I understand correctly, shoot them. Is this a humane way to attack the problem, and how can we guarantee that this poison will not find its way to other creatures, including our pets?
At Birdbgone.com, for one, they show how to eliminate nesting and perching areas for pigeons. That seems to me a more effective, longer-range plan. Maybe La Crosse should rethink its method.
Kathryn Kremenski, La Crosse