Computer Professionals for Social Responsibility
War and Information Technology
by Chris Hables Gray
Technology has always been an important part of war. Today, with information technology situated as the defining technology of our age, it should be no surprise that IT (information technology) is a central part of war making. In fact, the US military has officially determined that information is the main "force multiplier" in battle, more important than the old standards of numerical superiority or force of fire. This is because, in the US military's view, we now have information technologies that not only make a crucial difference in battle, in terms of precision weapons, intelligence, and command and control, but that might actually allow the "fog" (uncertainty) of war to be dispelled, so that total, almost painless victories might be won against many opponents.
Computer professionals, when evaluating these information technologies, must start with two basic rules that come both from within computer science (especially information theory, systems analysis and information management) and from the academic disciplines represented by groups such as the Society for the Social Studies of Science (4S), the Society for the History of Technology (SHOT) and Science, Technology, and Society studies in general (STS). They are:
1) Technologies cannot be evaluated out of their context of use. Very few technologies either "work" or "don't work". They all perform at certain levels of efficiency, but whether or not they "work" depends on what they do in terms of their institutional and societal contexts. Military technologies, for example, have to be evaluated not just in terms of failure rates but also in terms of how the technology performs, and what it influences, in the context of the relevant doctrines, tactics and strategies.
2) Technologies, especially information technologies, are parts of systems. They cannot be evaluated in isolation. The systems, which often include human operators and users, either work well or not in their context. For repairing and evaluating systems it is important to know how the various components or nodes perform, but a system that fails because of so-called human error has failed nonetheless, and that includes its computational elements.
So, any weapon system has to be analyzed in terms of all its components (including the humans and their training), the doctrine that it serves, its tactical and strategic uses, and the political context of the war it is used in. To say B-52s are bad and so-called smart missiles are better does not make sense. The computer and other technical parts might work with a high level of reliability (or not) but that doesn't make any one weapon system better than another. The purely technical efficiencies cannot be separated out.
As a computer science professor I teach my students that computing systems in particular must be evaluated as systems, not in isolation. A perfect case is the destruction of Iran Air Flight 655 by the USS Vincennes (deploying the most sophisticated computerized weapon system used in combat: the Aegis), which killed 290 civilians. There were technical problems (the jamming of an automatic 5-inch gun, the poor reliability of the automated Phalanx machine guns, the fact that Standard missiles need miles to accelerate, the poor interface design of the combat information center), training problems (all training was for fleet combat with the Soviets), command problems (other US ships correctly identified the flight but the Vincennes was in charge because it had the best computer system), human error (misreading the computer information because of scenario fulfillment), doctrine (the source of the command and training problems), the tactics (sending the Vincennes into Iranian territorial waters to engage small Iranian patrol boats), the strategy (the illegal, stupid mission the Vincennes was on -- trying to provoke an Iranian counter-attack), and the politics (supporting Saddam's Iraq in an aggressive war against Iran). So what if the Aegis is a marvel of computer engineering? Tell that to the families of the dead Iranian civilians. (For full details and documentation see: Chris Hables Gray, "AI at War: The Aegis System in Combat" in Directions and Implications of Advanced Computing 1990, Vol. III, D. Schuler, ed., Ablex, 1996, pp. 62-79.)
By the way, there is substantial evidence that the testing of the circuit cards for the Phalanx was faked. This is one reason their reliability was so low, why the Captain of the Vincennes could not trust them for close-in defense, and thus part of why he made the incorrect decision to shoot down the innocent airliner.
An undergraduate in one of my computer classes who tried to evaluate a system that uses computers by looking only at certain technical parameters of the computers, and not at all the relevant factors, would not pass. A military that does the same might win some battles, but it will lose all but the easiest wars, and even then, it will often lose the peace that follows.
In Gulf War I a strong case can be made that the massive B-52 bombings shattered the main Iraqi Army, that the so-called "smart weapons" were not nearly as accurate as claimed (and we found out much later that the Patriots hit nothing), and that the destruction of the Iraqi infrastructure (in part by so-called precision bombing with all kinds of bombs and missiles) led to the deaths of 300,000 Iraqi women and children (according to the UN, independent aid NGOs, and a Harvard School of Public Health study). War is complicated, weapon system performance is complicated, and accuracy (or even reliability) is hardly everything.
When force of fire was the main doctrine for winning wars in the US military, bigger explosions were almost always considered more important than smaller, accurate ones. In some situations this was (and still is) true; in many others it was not. The US military in particular has had a tendency to ignore the strategic and political realities of war. This is why it lost Vietnam. US weapons were more accurate and more powerful than the Vietnamese weapons, but that was not enough. It didn't help the Soviets win in Afghanistan either. This is why there is a whole debate about Asymmetric War in the military today, by the way.
Some of the main issues computer professionals should look at in Gulf War II:
* To what extent such doctrines as "Shock and Awe" and the general U.S. strategy are based on misconceptions about what IT can and cannot do in an unpredictable and uncontrollable arena such as war.
* To what extent the existence of so-called "smart" weapons leads to the doctrinal, tactical, or strategic misuse of such weapons. Americans have a particular love of technology. That some people can unequivocally say such weapons "really work" when they have hit three of the wrong countries is an important issue to explore.
* The actual performance of such weapons as the Patriot.
* The possibility and possible impact of future weapons or military information systems such as effective identification systems.
* The absolute limits of computer technology to model complex systems. CPSR played a major role in articulating this crucial part of information theory during the first round of Star Wars debates and in the Spring 2001 newsletter (ed. by Carl Page and Chris Hables Gray) there is a full bibliography of the key articles in this area.
There are now a number of monographs and collections of articles that look critically at IT in the context of contemporary war. They include:
David Bellin and Gary Chapman, eds., Computers in Battle, Harcourt Brace Jovanovich, 1987.
Paul Edwards, The Closed World, MIT Press, 1996.
Chris Hables Gray, Postmodern War, Guilford, 1997.
Gerfried Stocker and Christine Schöpf, eds., Infowar, Springer Wien/New York, 1998.
James Der Derian, Virtuous War, Westview, 2001.
Shock and Awe: http://www.commondreams.org/views03/0127-08.htm
Context of the War on Terror: http://world-information.org/wio/readme/992003309/1004366266
War on the Web (UK Guardian): http://www.guardian.co.uk/online/story/0,3605,898661,00.html
War Peace and Complex Systems: http://www.borderlandsejournal.adelaide.edu.au/vol1no1_2002/Gray_complexity.html
Interesting article from the UK on the use of tomahawk missiles: http://www.guardian.co.uk/Print/0,3858,4630027,00.html
Excellent article from a business magazine, including discussion of interfaces: http://europe.businessweek.com/technology/content/jan2003/tc2003017_2464.htm
Here is a link to the book that describes the whole "shock and awe" concept: http://www.dodccrp.org/shockIndex.html.
Here are two sites that discuss the mediatization of the current war in different ways.
1) Brown University's Watson Center (yes, the IBM Watson) has a whole project on infowar and infopeace and has some recent articles on embedded journalists: http://www.infopeace.org/.
2) And here's a site on media literacy and this war: http://www.tandl.vt.edu/Foundations/mediaproject/
Respectfully submitted, Chris Hables Gray
by James Nugent
I would like to address some points made by Chris Gray, who is here specifically addressing the catastrophe associated with the shooting down of Iran Air 655 by the USS Vincennes. Specifically:
"There were technical problems (the jamming of an automatic 5-inch gun, the poor reliability of the automated Phalanx machine guns, the fact that Standard missiles need miles to accelerate, the poor interface design of the combat information center) "
Mr. Gray lists these problems as contributing to the accidental shooting down of Iran Air 655. As Mr. Gray has appropriately noted about weapons and their employment, it is important to understand how doctrine and tactics bear on their use. In this case the five-inch gun and Phalanx systems were not a consideration in the engagement, because the Standard missile system would have been the primary, i.e. first-use, system. The order of priority for weapon/defensive engagement for the Vincennes would have been:
1. Standard missile system.
2. Five inch guns.
3. Electronic deception.
5. Phalanx (4 & 5 would be near simultaneous employments).
Systems 2 through 5 could have been completely out of service, or fully employable, and this would not have altered the use of the Standard missile system as the first system employed against an air threat. This is in accordance with the U.S. (and, at the time, Soviet) Navy's doctrine of layered defense, a concept later adapted by computer information assurance/security experts to describe how a computer network should be defended. In this case the doctrinal principle for a layered defense would be as follows:
"Weapons [Used within a layered defense]. AAW weapons begin with Phoenix, Sparrow, Sidewinder, and AMRAAM missiles carried by the combat air patrol (CAP) [Note: this was not available to Vincennes]. Next are long-range "Standard" missiles such as extended- and medium-range SM-1s and SM-2s. These missiles are capable of intercepting targets at a range of nearly 100 miles. Shorter-range variants of these weapons are good out to 25 miles. Inside 10 miles, Sea Sparrow missiles are used to engage targets, and at extremely close, "do-or-die" ranges, CIWS guns are utilized. Additionally, MK45 5"/54, and the OTO Melara 76mm gun mount can engage air targets with limited effectiveness."
Layered defense means essentially the same thing here as it does in the computer world, i.e. you stop an attacker as far from what is being attacked as you possibly can. In this case that means you use the systems with the furthest reach to take out an air threat, and on the Vincennes that was clearly the Standard missile system.
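The selection rule described here can be sketched in a few lines of code, which also shows why out-of-service inner layers did not matter to the engagement. This is only an illustrative sketch: the layer names are taken from the text, but the numeric ranges are loose assumptions drawn from the quoted passage (Standard missiles good to roughly 100 miles, Phalanx to about 1 mile), not actual Navy engagement parameters.

```python
# Illustrative sketch of layered defense: layers are tried in doctrinal
# priority order (longest reach first), so the outermost available layer
# that can cover the threat is always the one employed.
# Ranges in miles are assumptions for illustration only.
LAYERS = [
    ("Standard missile", 100.0),
    ("Five-inch gun", 13.0),
    ("Phalanx CIWS", 1.0),  # last-ditch, shortest reach
]

def select_layer(threat_range_miles, available):
    """Return the first available layer, in priority order, whose reach
    covers the threat; inner layers being out of service never changes
    the choice of an outer layer."""
    for name, reach in LAYERS:
        if name in available and reach >= threat_range_miles:
            return name
    return None  # threat is beyond the reach of every available layer
```

For example, a threat at 40 miles is met by the Standard missile whether or not the five-inch gun or Phalanx is in service, which is the point about the Vincennes engagement: the condition of systems 2 through 5 was irrelevant to the choice of the first weapon used.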
I would also point out that in Mr. Gray's paper on this subject he specifically states that the loss of the single five inch gun was not a contributing factor to the destruction of Iran Air 655.
The relevance of the missiles' need to accelerate is not entirely clear. No ship's captain, again based on the applicable doctrine and tactics, would ever allow an air threat to get so close that the acceleration dynamics of the defensive missile system would be a detrimental factor in its employment.
Mr. Gray's comments with regard to the human-machine interface are totally on target and this, vice any overriding technical failures or issues, along with the social dynamic surrounding events leading to the firing of missiles at the aircraft, was responsible for the tragedy.
" training problems (all training was for fleet combat with the Soviets) "
This is a legitimate point as well. The crew of the Vincennes was in a situation vis-à-vis the civilian airliner that it had never trained for. While the system itself was very capable of adequately tracking civilian airliners, given its use for just that purpose in two different venues, this specific scenario had not been tested either via simulation or as a crew scenario. Had such training been conducted, it is very likely that problems with the human-machine interface would have been discovered and corrected; either way the crew would likely have handled this specific situation differently and avoided the disaster which occurred.
" command problems (other US ships correctly identified the flight but the Vincennes was in charge because it had the best computer system), human error (misreading the computer information because of scenario fulfillment) "
These points are in keeping with ones I've made above concerning the environmental/sociological dynamic and the lack of training that led to the sorts of errors that cascaded to tragedy on board Vincennes.
" doctrine (the source of the command and training problems), the tactics (sending the Vincennes into Iranian territorial waters to engage small Iranian patrol boats), the strategy (the illegal stupid mission the Vincennes was on -- trying to provoke an Iranian counter-attack) "
There are many types of doctrine within the U.S. Navy and it's not clear which
one exactly is being referred to here. Tactics were without fault as the crew
of the Vincennes did exactly what it was expected to do given the situation
it believed it was facing.
It's questionable that a cruiser was the ideal system to send after shallow-draft Iranian gunboats in what were depth-restricted waters. That said, this incident started with a helicopter from the Vincennes coming under fire from the Iranian boats, so it was the call of the captain of the Vincennes how to defend his helicopter, and he elected to engage the Iranians.
Mr. Gray raises an interesting question vis-à-vis the legality of the U.S. Navy's action during this time. Our main reason for being in the Gulf was a request from the government of Kuwait to escort its tankers (which were subsequently re-flagged as U.S. vessels) through the Gulf. Escorts were necessary because both the Iraqis and the Iranians were engaging ships with anti-ship missiles, and the Iranians were also aggressively laying floating mines in the international waterways of the Gulf. It bears remembering that the Iranian minelayer Iran Ajr was captured and its mines confiscated as it was in the act of laying such mines near Bahrain. Actions taken in the Gulf were in response to the illegal mining of the sea lanes in this region, which puts the engagement of U.S. ships with Iranian vessels in a very different context.
" and the politics (supporting Saddam's Iraq in an aggressive war against Iran)."
The extent of U.S. support for Iraq at this time is not as clear as this would lead one to believe, but this issue is also clearly beyond the scope of this response.
"So what if the Aegis is a marvel of computer engineering? Tell that to the families of the dead Iranian civilians."
Mr. Gray's implication here is that Aegis is what caused this tragedy, and he seems very much in the minority on this. Technically speaking, Aegis did exactly what it was supposed to do, but there was a string of human errors, failures and assumptions that ultimately caused the captain of the Vincennes to authorize the firing of two missiles at Iran Air 655; Aegis was never in automatic mode and it did not make the decision to fire.
"By the way, there is substantial evidence that the testing of the circuit cards for the Phalanx was faked, one reason their reliability was so low and so the Captain of the Vincennes could not trust them for close-in defense and so made the incorrect decision to take out the innocent airliner."
I would categorize this as another red herring concerning the specifics of this event. The order of battle for weapon employment is provided above, and Phalanx is the last system a ship's captain thinks to use. It is a last-ditch effort: if the attacker is that close (Phalanx's range is 1 mile), engaging it with Phalanx may well destroy the threat, but if the threat is destroyed within the range of Phalanx there is a good chance that debris resulting from the destruction could still present a danger to the ship. Bottom line, you engage air threats as far away as possible.
A reasoned critique of a weapons system should come from a full appreciation for how it is used. Mr. Gray does not provide this perspective in the least, and it is very clear that he never endeavored to obtain one, likely, I would suspect, due to his distaste for anything to do with the military. He does make points here that are germane to what did cause this tragedy and that demanded action by the U.S. Navy. He otherwise clutters this with poor historical perspective and what would seem to be a desire to make a point from his own political/philosophical perspective, which poorly serves himself and his readers.
Submitted by James Nugent
Created before October 2004