There have been incidents in America’s wars about which we all feel remorse: the fire-bombing of Dresden at the close of World War II, which killed about 25,000 German civilians; the My Lai massacre, in which approximately 400 women, children, and elderly civilians were murdered in South Vietnam; or, closer to home, the Bad Axe Massacre, 180 years ago last week, at which more than 400 members of the Sauk and Fox tribes were killed trying to cross the Mississippi River near Victory, Wis.
No reasonable person can defend the intentional killing of civilians. But how do we feel about the unintended but foreseeable civilian deaths that occur routinely in military actions?
Collateral damage — the unintentional killing of civilians — has always been an inextricable part of warfare, and that has always been the chief reason why war is considered a last resort. But the situation is changing with the advent of robotic warfare.
Administration insiders say the use of drone strikes to assassinate terrorists is driven in large part by President Barack Obama’s commitment to the “just war theory,” an ethical tradition dating from the Middle Ages that considers the use of force to be justified only in self-defense and only when every measure is taken to ensure that innocent people are not killed. Obama personally reviews potential targets and authorizes every name on the “kill list.”
The number of civilian casualties resulting from drone strikes seems to be about 20 percent. That is a relatively low figure compared to collateral damage percentages in traditional military conflicts, which, according to a recent New York Times report, have ranged from 33 to 80 percent during the past 20 years.
Since Obama came into office nearly four years ago, the percentage of civilians killed in military strikes has decreased dramatically due to the effectiveness of drones in identifying and selecting intended targets.
But that doesn’t mean fewer civilians are being killed. Precisely because the drones are so effective at reducing collateral damage, they are being used more frequently.
There are 57 drone combat air patrols in operation overseas, mostly in Pakistan, but also in Yemen and Somalia. Drone strikes now take place in Pakistan about once every four days.
According to the London-based Bureau of Investigative Journalism, the U.S. has carried out about 350 drone strikes since 2004, killing an estimated 3,500 to 4,000 people, including about 800 civilians.
These numbers are educated guesses, because neither the White House nor the Pentagon will provide statistics, and terrorist groups grossly exaggerate the number of civilian casualties.
In June, a Pentagon spokesperson disputed estimates reported in Newsweek but refused to provide an official number, saying only, “I can assure you that the number of civilian casualties is very, very low.”
Paradoxically, the very effectiveness of drone strikes is what makes them morally problematic. Because the only people directly involved in the attacks are the victims, there is no way for the rest of us to witness the cost of the suffering that is being inflicted on our behalf. Because we do not share the risk, we can easily ignore the harm.
Do drone strikes make the world safer, or do they make the world more dangerous while making Americans safer? Would even more civilians die at the hands of terrorists if we were to discontinue drone attacks?
I don’t know the answers to those questions, but what distresses me most is that even in an election year, I don’t hear anyone asking them.
The use of force without consequences to the user is a dangerous thing. In this new age of robotic warfare, we need to pause and reflect not only on what the technology allows us to do to others, but also on what it is doing to us. Is it leading us into complacent acceptance of what ought to be a last resort?
The Ethical Life is a series of reflections on the ways ethical thinking influences our actions, emotions and relationships. Richard Kyte is the director of the D.B. Reinhart Institute for Ethics in Leadership at Viterbo University.