Battlefield Injuries and Medicine

Battlefield medicine evolved considerably between World War I and World War II. In the former, approximately 4 out of every 100 wounded men could expect to survive; in the latter, the rate improved to 50 out of 100.

Survival depended on a variety of factors: the type, severity, and location of the wound; the proximity of medical care (soldiers in the jungles of the Pacific typically fared worse than those in metropolitan Europe); and the availability of new “miracle drugs” such as penicillin. In general, however, a soldier who received treatment within an hour of being seriously wounded had a 90 percent chance of recovery; after eight hours, his chances slipped to 25 percent. Those most at risk were troops involved in amphibious landings: the beachhead casualty rate was often as high as 25 percent, and the rapid evacuation of men wounded during a beach landing was made doubly difficult by enemy fire and successive waves of incoming forces.

Figure 13-4 A medic treats a wounded American soldier in France, 1944.

Photo courtesy of the National Archives (208-YE-22)

During World War I, most soldiers were wounded by enemy bullets and chemical agents such as mustard gas. During World War II, artillery and bombs inflicted the most injuries — and the most severe. Head wounds and traumatic amputations were common, and medics strove to get the wounded into surgery within six hours, when the survival rate was highest (in the European theater, nearly 21 out of 100 who were wounded received treatment during this golden period).

The practice of triage increased the survival rate by sorting the wounded according to the severity of their injuries, with the most seriously wounded who could still be saved treated first. Men who were mortally wounded were given drugs to ease their pain and attended to only after those more likely to survive had been seen.

A number of new drugs and medical techniques developed in the years between the world wars dramatically improved the survival rate among the sick and injured. For example, combat medics (and even men in the field) carried packets of sulfanilamide and sulfathiazole to coat wounds as a first line of defense against infection. Antibiotics such as streptomycin and penicillin also helped save the lives of countless soldiers.

Combat wasn't the only thing to fell soldiers during World War II; environment-related illnesses were also a common hazard. In the Pacific, for example, mosquito-borne malaria was a serious threat. To reduce the chances of infection, areas surrounding permanent bases were treated with insecticides such as DDT, and servicemen were routinely dosed with a quinine substitute known as Atabrine. (Quinine itself was in short supply because many tropical countries that produced the raw materials for its manufacture had fallen to the Japanese.)

American servicemen were also inoculated against a wide variety of diseases before being shipped overseas. The most common vaccinations were for smallpox, typhoid, and tetanus, though soldiers assigned to tropical or extremely rural areas were also vaccinated against cholera, typhus, yellow fever, and, in some cases, bubonic plague.

Lastly, American military personnel also received numerous lectures — and viewed several graphic documentaries — about the risks of contracting a venereal disease such as gonorrhea or syphilis. “Fraternization” was strongly discouraged, but a great many men still became infected. Treatment usually consisted of a regimen of strong antibiotics.
