Abstract
Debilitating events could leave either more frail or more robust survivors, depending on the extent of scarring and mortality selection. The majority of empirical analyses find more frail survivors. I find heterogeneous effects. Among severely stressed former Union Army prisoners of war (POWs), the effect that dominates 35 years after the end of the Civil War depends on age at imprisonment. Among survivors to 1900, those younger than 30 at imprisonment faced higher old-age mortality and morbidity and worse socioeconomic outcomes than non-POW and other POW controls, whereas those older than 30 at imprisonment faced a lower older-age death risk than the controls.
Introduction
The first total war, the American Civil War, provides a unique opportunity to examine the long-run effects of acute malnutrition and the stresses associated with imprisonment on later life socioeconomic outcomes and older-age mortality and morbidity. At Andersonville, the most notorious of the POW camps, roughly one-third of POWs died within seven months (Rhodes 1904:404) and 40 % of the men who passed through Andersonville died there.1 Diarists’ accounts describe men weighing 90–95 pounds, down from almost 180 pounds, when they left Andersonville (Basile 1981; Ransom 1963). Walt Whitman wrote of the returning prisoners, “Can these be men— these little, livid brown, ash-streaked monkey-looking dwarfs? Are they not really mummied, dwindling corpses?” (Quoted in Linderman 1987:260).
This article examines the effect of imprisonment in Confederate POW camps on the mortality, morbidity, occupational attainment, and property ownership of Union Army veterans 35 years after the end of the war. Conditions in Confederate POW camps deteriorated sharply when prisoner exchanges stopped, leading to severe overcrowding and reduced rations. Union Army non-POWs and POWs imprisoned when conditions were better provide control groups for POWs imprisoned when conditions were at their worst.
Starvation and the diseases and stresses of imprisonment could leave either permanently scarred survivors or, through positive selection effects, very resilient survivors. Models of mortality selection posit that when mortality at younger ages is high, more frail individuals die, leaving a more robust population that survives longer. These models have been used to explain the deceleration of the age pattern of mortality at older ages (Horiuchi and Wilmoth 1998) and the black-white mortality crossover (Manton and Stallard 1981; cf. Preston et al. 1996). Empirical analyses, however, predominantly show positive associations between debilitating events and the morbidity and mortality of adults more than 20 years after the event (e.g., Almond and Mazumder 2005; Barker 1992, 1994; cf. Caselli and Capocaccia 1989; Finch and Crimmins 2004; Horiuchi 1983; Kannisto et al. 1997). Studies of POWs are no exception, with the caveat that the biological processes through which insults at young adult ages affect older-age mortality differ from the processes through which insults at ages when organs are still developing affect older-age mortality. Dent et al. (1989) found that Australian World War II (WWII) POWs held by the Japanese faced excess mortality rates compared with non-POW combatants more than 38 years after the war, even though the two groups had similar degrees of medical morbidity (Goulston et al. 1985; Tennant et al. 1986). Page and Brass (2001) and Page and Ostfeld (1994) reported that among U.S. POWs, acute malnutrition was correlated with higher risks of death from ischemic heart disease, particularly after age 75. In contrast, studies of Holocaust survivors have found no difference in the mortality of camp survivors compared with other European-born Jews both 20–41 years and 40–50 years following WWII (Collins et al. 2004; Williams et al. 1993), but the samples in these studies are small.
I find both positive and negative older-age mortality effects of the Civil War POW experience. Among men younger than age 65 in 1900, POWs imprisoned when conditions were at their worst were more likely than non-POWs to die of any cause and to die of heart disease and stroke. They also exhibited more adverse cardiovascular signs and symptoms and more valvular heart disease, were more likely to be laborers, and were less likely to own property. But among men older than 64 in 1900, those imprisoned when conditions were at their worst were less likely to die than their counterparts who were not POWs or who were POWs during better times. These POWs were older than age 30 at captivity and faced a higher mortality rate in captivity than younger men, suggesting that the older men who survived POW camps may have been more robust.
This study has several advantages over previous work on the long-run effects of starvation and the stresses of imprisonment. Twenty-five percent of the Union Army was older than 30 at enlistment,2 enabling me to examine how catastrophic stress affects different age groups. After age 30, the ability to respond to stress, most physiological functions, and the number of functioning cells per organ begin to decline (Moody 2010:17), and mortality rates begin to climb exponentially. Because records on the Union Army cohorts are in the public domain, I am able to examine a broader array of outcomes than many other studies. Finally, because the data are based on administrative records, including detailed medical examinations at older ages, I do not have to depend on self-reports.
Confederate POW Camps
An estimated 211,411 Union soldiers were captured during the Civil War. Of those, 16,668 were never imprisoned because they were paroled on the field. Of the remaining 194,743 men, 30,218 died while in captivity (Rhodes 1904:507). Thus, 7 % of all U.S. soldiers were ever imprisoned, compared with 0.8 % for WWII and 0.1 % for Korea.3 Until mid-1863, many POWs were exchanged immediately. Prisoner exchanges then stopped as the two sides argued over the terms, particularly the treatment of black soldiers (who could be re-enslaved) and of white officers (who could be executed as leaders of a slave insurrection). Men were exchanged again in December of 1864 and early in 1865. The mean number of days spent in prison until death or release was 20 for men captured prior to mid-1863 and 92 for men captured after mid-1863.
Men who were captured after mid-1863 faced ever-worsening conditions as the crowds of prisoners increased. In a random sample of soldiers, 4 % of the men captured before July of 1863 died in captivity, and 27 % of those captured July 1863 or later died in captivity. In contrast, the total wartime mortality rate was 14 % (Costa and Kahn 2007b). Although men could buy food and goods within POW camps and trade with the guards, men who had been imprisoned for long periods had nothing left to trade. POWs suffered from poor and meager rations; contaminated water; grounds covered with human excrement and other filth; a dearth of shoes, clothing, and blankets (having often been stripped of these by needy Confederate soldiers); a lack of shelter in the open stockades that constituted camps such as Andersonville and Millen; and the risk of being robbed and murdered by fellow prisoners and trigger-happy guards. Andersonville (also known as Camp Sumter, and located in Georgia) had a maximum capacity of 10,000 men but at one point held 32,899 (Speer 1997:332). As crowding increased, rations were stretched thin. According to Warren Lee Goss, “The first morning after our arrival [at Andersonville on May 1, 1864] about twenty pounds of bacon and a bushel of Indian meal was given to me to distribute among ninety men. We had no wood to cook with. . . .” (Goss 1866:75). By June 1864, corn bread and cornmeal (a large fraction of which was ground cob) became the principal staples of the diet; and by August, most prisoners were reported to be suffering from scurvy (Marvel 1994:74, 179). After pellagra became recognized as a public health problem in the United States, Niles (among others) argued that veterans’ descriptions of “how the men had a supposed eczema; . . . how their skins were rough and hard, and how their hands were sore and cracked; how their bowels were chronically loose . . . [and how] the melancholy deepening into the different forms of dementia” were suggestive of pellagra but added that “[w]hether or not this was really pellagra will probably not be positively known” (Niles 1912:26–27). The chief recorded causes of death were scurvy, diarrhea, and dysentery. Scorbutic ulcers became gangrenous.4 Andersonville, which was emptied of men in September of 1864 when Sherman’s army threatened, was still used as a POW camp afterward but never again reached the same levels of crowding.
Camp crowding was the single most important predictor of death for POWs (Costa and Kahn 2007b, 2008: Fig. 2, p. 147). The individual characteristic that mattered most was age. In a random sample of soldiers, 14 % of those who became POWs before age 30 died, whereas 21 % of those who became POWs after age 30 died. In this same sample, 31 % of those who became POWs at Andersonville before age 30 died, compared with 54 % of those who became POWs there after age 30.
POW Status and Older-Age Outcomes
Whether the average person becomes more frail or more robust depends on both the extent of scarring and the extent of mortality selection. Because I observe the effect of the POW experience on older-age outcomes, I can determine only which effect, on average, dominates.
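To fix ideas, the opposing pull of scarring and selection can be illustrated with a small simulation. The sketch below is purely illustrative: the frailty distribution and the scarring multiplier are hypothetical assumptions, and only the camp mortality rates (roughly 31 % and 54 % for the younger and older Andersonville groups; see Table 3) echo the data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Hypothetical frailty: a multiplicative individual risk factor with mean 1.
frailty = rng.gamma(shape=2.0, scale=0.5, size=n)

# Hypothetical scarring: captivity multiplies every survivor's later risk.
SCAR = 1.35

# Approximate Andersonville mortality rates by age group (Table 3).
for label, camp_mortality in [("under 30 at capture", 0.31),
                              ("over 30 at capture", 0.54)]:
    p_death = np.clip(camp_mortality * frailty, 0.0, 1.0)
    survivors = frailty[rng.random(n) > p_death]
    # Later-life risk index; non-POWs have mean frailty 1 and no scarring.
    risk_index = SCAR * survivors.mean()
    print(f"{label}: mean survivor frailty {survivors.mean():.2f}, "
          f"later risk relative to non-POWs {risk_index:.2f}")
```

Under these assumptions, the lightly culled younger group ends up riskier than non-POWs (scarring dominates), whereas the heavily culled older group ends up less risky (selection dominates), which is the qualitative pattern documented below.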
The scarring effects of POW status on older-age outcomes could operate through several channels. By limiting men’s ability to work and to climb the occupational ladder, poor health upon discharge from the Army would increase their chances of being laborers rather than farmers or artisans and decrease their wealth and their likelihood of owning property and of being married. Their lower socioeconomic status (SES) might then place them at greater risk of developing cardiovascular disease through greater exposure to disease, worse nutrition, or stress. POW status in itself could also affect later-life heart disease.
The evidence linking starvation, disease, and psychological stress to heart disease is largely epidemiological. Although there is no clear biological mechanism linking nutritional deprivation to subsequent chronic heart disease, cardiac atrophy has been found in healthy adult volunteers subjected to a semi-starvation diet, in starved rats, in individuals on energy-restricted diets, and in patients suffering from cardiac cachexia. However, during the rehabilitation of healthy adult volunteers on a semi-starvation diet, heart size increased rapidly after the initial decline (Keys et al. 1950:203–206). Dietary deficiencies also have cardiac sequelae. Thiamine deficiency leads to peripheral vasodilation, potentially resulting in cardiac failure, and selenium deficiency has been associated with cardiomyopathy (see Webb et al. 1986 for a review). Folates and vitamins B6 and B12 are required for the metabolism of homocysteine to methionine, and elevated homocysteine levels are a risk factor for coronary heart disease and ischemic stroke (see the review by Fairfield and Fletcher 2002). The reporting of such signs of vitamin deficiency as edema and night blindness while in WWII Japanese prison camps (where the diet consisted mainly of polished white rice) is correlated with cardiovascular morbidity and mortality (Page and Ostfeld 1994).
Starvation weakened POWs’ immune systems, placing them at greater risk of developing infectious disease, and camp crowding increased their exposure to infection. The infectious agents most commonly linked to heart disease include rheumatic fever (valvular heart disease); the coxsackie B virus (pericarditis and myocarditis); and Chlamydia pneumoniae, Helicobacter pylori, and dental infections (atherosclerosis via inflammation). Those who have had a prior attack of rheumatic fever (common in the Civil War armies) are highly susceptible to recurrences after future streptococcal infections. A study of former British Far East WWII POWs (Gill 1983) found a higher proportion of deaths mentioning rheumatic heart disease compared with the general population, but the number of cases was only six.
Physiological changes associated with the dysregulation of the stress system are implicated in the development of a variety of illnesses, including hypertension and atherosclerosis. Former POWs are more likely to suffer from depression and post-traumatic stress disorder (PTSD) than veterans who were not POWs (Page 1992), and POWs with PTSD had statistically significantly increased risks of cardiovascular diseases, including hypertension and chronic ischemic heart disease, when compared with both non-POWs and POWs without PTSD (Kang et al. 2006).
Starvation is associated with a fall in heart rate, blood pressure, and blood volume; during re-feeding, there is a sudden reversal in these compensatory factors. Cardiac output needs to rise to handle increased salt, water, and energy loads (see Webb et al. 1986 for a review). Congestive cardiac failure is a danger during re-feeding of starved individuals. Re-feeding syndrome has been noted in Japanese-held WWII POWs (Schnitker et al. 1951), malnourished children (Edozien and Rahim-Khan 1968), and patients suffering from anorexia nervosa (Casiero and Frishman 2006). The long-run effects of re-feeding on chronic heart conditions are unknown.
Another potential explanation for scarring is the disruption of the growth process. Union Army soldiers were still growing until age 21 or even 24, and the youngest age at captivity in my sample is 14. During the WWI German famine, adolescent males were left permanently scarred, but older males were not (Horiuchi 1983). In addition, the young may be more psychologically vulnerable: older WWII ex-POWs were less likely to have depressive symptoms at older ages than their younger counterparts (Page et al. 1991).
Data
The data come from two different sources: a sample of Union Army soldiers (the Fogel sample), some of whom were taken prisoner; and a random selection of men who were at Andersonville. The Fogel sample consists of all men within 303 companies.5 Complete military records are available for these men, providing information on wartime service and on demographic and socioeconomic characteristics at enlistment.6 These data contain 3,175 cases of captivity with known dates of capture and of release or death for 3,040 men.7 I know which prison a man entered and on what date, and whether he survived or died. These records are not limited to men who were at Andersonville.
The National Park Service’s Andersonville database contains 35,323 men and was drawn from such disparate sources as the lists of the dead and published state muster rolls.8 Although the database does not cover the entire population of Andersonville (and probably never can, given the lack of complete records), it comes close: an estimated 45,000 men passed through Andersonville (U.S. War Department 1880–1901, Series II, Vol. VIII:789). The data provide capture and death dates (albeit not always complete) as well as the soldier’s name, rank, regiment, and company. I collected a random sample of roughly 1,000 men who lived to 1900 (as ascertained from their pension records) from this database.
The men in both samples are linked to their military service records; their pension records (including the detailed reports of the examining surgeons); and the 1860, 1880, and 1900 censuses. Linkage procedures for the 1880 census differ across samples: the Andersonville sample was linked using information on residence and on family members from the pension records, whereas the Fogel sample was linked purely on name, state (or country), and year of birth. The Andersonville sample is also linked to the 1870 census, as is a small subsample of the Fogel sample (i.e., those soldiers enlisting in the District of Columbia, Delaware, Iowa, Kansas, Maine, Minnesota, Missouri, Ohio, and West Virginia).9 The military service records provide detailed information on men’s wartime illnesses, wounds, and captivity. The pension and surgeons’ records provide information on date of death after the war, cause of death, and chronic conditions after the war. These records also contain information on wartime illnesses (for POWs, either before or after captivity). The censuses provide socioeconomic and demographic information, including wealth (in 1860 and 1870), occupation, and marital status.
I restrict both samples to those men who survived to 1900 and who were on the pension rolls by 1900.10 I also restrict the samples to men who have information on date of death and, among former POWs, men who have complete information on capture dates. This leaves 11,025 non-POWs, 889 POWs from the Fogel sample, and 893 POWs from the Andersonville sample. I lose 65 men in the Fogel sample (4 of them POWs) in the regressions because of missing information on age in 1900. Information on cause of death is available for 47 % of the Fogel sample and 50 % of the Andersonville sample. Cause of death is more commonly available when there was a surviving spouse. Causes of death are often vague: for example, “heart disease.” I therefore examine the combined category of heart disease as well as the more specific categories of stroke, valvular heart disease, and pericarditis; and the combined category of angina, atherosclerosis, arteriosclerosis, coronary occlusion, coronary thrombosis, myocarditis, and endocarditis. Fleming (1997) and Finlayson (1985) argued that mentions of angina, atherosclerosis, coronary occlusion, and coronary thrombosis may represent ischemic heart disease, and I will refer to this combined category as “ischemic heart disease,” although it is not necessarily equivalent to ischemic heart disease today.
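Concretely, such a grouping can be expressed as a keyword mapping from raw cause-of-death strings to combined categories. The sketch below is a hypothetical illustration: the category names follow the text, but the keyword lists and the function are mine, not the study’s actual coding dictionary.

```python
# Illustrative keyword map from raw cause-of-death strings to the
# combined categories named in the text (keyword lists are examples).
CATEGORY_KEYWORDS = {
    "heart_disease": ["heart disease", "valvular", "pericarditis", "angina",
                      "myocarditis", "endocarditis"],
    "valvular_or_pericarditis": ["valvular", "pericarditis"],
    "stroke": ["stroke", "apoplexy"],
    # The article's combined "ischemic heart disease" category, which is
    # not necessarily equivalent to the modern diagnosis.
    "ischemic_combined": ["angina", "atherosclerosis", "arteriosclerosis",
                          "coronary occlusion", "coronary thrombosis",
                          "myocarditis", "endocarditis"],
}

def classify_cause(raw: str) -> list[str]:
    """Return every combined category whose keywords match the raw string."""
    raw = raw.lower()
    return [cat for cat, words in CATEGORY_KEYWORDS.items()
            if any(w in raw for w in words)]

print(classify_cause("chronic valvular heart disease"))
# ['heart_disease', 'valvular_or_pericarditis']
```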
The detailed records of the examining surgeons provide information on various chronic conditions, symptoms, and signs. Patient accounts, physical examinations, and diagnostic information from the surgeons’ exams were used to determine the existence of medical conditions based on diagnostic criteria. These criteria, determined by actively practicing physicians knowledgeable in medical history, were derived from modern medical knowledge while recognizing the limitations of nineteenth- and early twentieth-century medicine. The primary limitation is that examining surgeons were constrained by what they could detect through sight, touch, and smell. For example, the records described heart murmurs and their location. I diagnose valvular heart disease as a murmur in the aortic or mitral valve. The examining surgeons diagnosed arteriosclerosis by feeling whether the arteries had hardened. Arteriosclerosis therefore refers to peripheral arteriosclerosis and could indicate atherosclerosis, an associated disease (such as diabetes), or local inflammation. The examining surgeons also noted whether the pulse was irregular or bounding as well as instances of arrhythmia, tachycardia, or bradycardia. The examining surgeons were unable to detect any of the conditions that required modern diagnostic equipment, such as hypertension.11 However, the team of current physicians who reviewed the data reported that if in the field, with no diagnostic equipment, they could not do any better. (For a detailed discussion of potential biases in the surgeons’ exams, see Costa (2000, 2002).) Surgeons’ records are available for over 95 % of the men in my sample. These records are less likely to be found both for the severely wounded and for men who applied for a pension on the basis of age rather than a chronic condition.
I examined men’s time spent in captivity in six-month intervals and classified men as being (1) POWs for a lengthy time during the period of no prisoner exchanges—that is, captured July 1863 to July 1864 (non-exchange POWs); or (2) POWs earlier or later—that is, captured prior to July 1863 or captured in July 1864 or later (exchange-period POWs). These measures reflect both time spent in captivity and conditions at the time of capture. Although men captured in July 1864 might not be exchanged until December, they faced only about a month of Andersonville at its worst (and some came in with money or goods) and were less likely to develop scurvy, a sign of severe malnutrition.
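As a sketch, the two-way classification by capture date reduces to a date-range test; the data frame and column names below are hypothetical:

```python
import pandas as pd

# Hypothetical input: one row per POW with his capture date.
pows = pd.DataFrame({
    "recruit_id": [1, 2, 3],
    "capture_date": pd.to_datetime(["1862-09-17", "1863-09-19", "1864-08-05"]),
})

# Non-exchange POWs: captured between July 1863 and July 1864, when the
# exchange system had broken down; captures in July 1864 or later count
# as exchange-period, following the text.
start, end = pd.Timestamp("1863-07-01"), pd.Timestamp("1864-07-01")
pows["non_exchange_pow"] = pows["capture_date"].between(start, end,
                                                        inclusive="left")
pows["exchange_pow"] = ~pows["non_exchange_pow"]
print(pows)
```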
Table 1 shows that the prevalence of scurvy, as mentioned in wartime records or the later surgeons’ reports, was higher among men who were captured July 1863 to July 1864 than among men who were captured earlier or later or who were never captured. In the Fogel sample, wartime records mention scurvy for 11 % of all non-POWs; 14 % of POWs captured before July 1863; 16 % of POWs captured after June 1864; and 23 % of POWs captured between July 1863 and July 1864. The reports of the examining surgeons mention scurvy for 1 % of non-POWs; 2 %–5 % of POWs captured before July 1863 or after June 1864; and 8 % of POWs captured between July 1863 and July 1864. The Andersonville sample also yields differential scurvy rates by time of capture, but with higher prevalence rates. (Within the Fogel sample, the roughly 100 men who were at Andersonville at its worst had somewhat higher prevalence rates of wartime scurvy mentions than the men in the Andersonville sample but slightly lower rates of scurvy mentions in the surgeons’ certificates; results not shown.)
This analysis combines the Fogel and the Andersonville samples to obtain a larger working sample. (Preliminary analyses revealed patterns in the Fogel data similar to those in the Andersonville data but low statistical power.) Compared with the POWs in the Fogel sample, the men in the Andersonville sample were more likely to have been prisoners in the non-exchange period, to have enlisted in 1863 or later rather than in 1861 or 1862, and to have been laborers at enlistment; they were also younger. Compared with exchange-period POWs and non-POWs, non-exchange-period POWs were more likely to have been laborers in 1860 and less likely to be 1864 enlistees (because late enlistees were at less risk of becoming non-exchange-period POWs). Table 2 shows the characteristics of non-POWs, exchange-period POWs, and non-exchange-period POWs in the combined Fogel and Andersonville samples by age group in 1900.
Although among survivors to 1900, men who were POWs in any time period were more likely than non-POWs to be on the pension rolls before 1890, former POW status by itself did not entitle men to a pension. In 1900, former non-exchange-period POWs received an average pension of $11.80 per month, whereas non-POWs received a pension of $11.20 per month. Because controlling for pension amount did not affect the coefficients on POW status, pension amount is not included in the control variables.
Compared with non-POWs, POWs were more likely to be volunteers, to have enlisted earlier (in 1862), and to hail from companies that experienced higher death rates, as might be expected of men who were captured in the field. (However, having been recently wounded did not predict POW camp survival, perhaps because those who made it to a POW camp were the less severely wounded.) The results of a probit in which the dependent variable is POW status and the independent variables are individual economic and demographic characteristics and the number of men in the company who were ever wounded or who ever died show that these two company characteristics were the main predictors of POW status (Costa and Kahn 2007b). POWs are not necessarily a random sample of soldiers on unobservables, either. If a subset of men “fought to the death,” they would never have been in a POW camp; thus, I would be less likely to sample the most ideological men. However, diaries and postwar accounts suggest that the safest strategy for avoiding capture was not obvious. Men were captured while sleeping in houses; guarding wagon trains; charging ahead of their company; being too slow during a retreat; or, most commonly, when their commanding officer decided to surrender on the battlefield or during a siege (Costa and Kahn 2008:141). Men had little idea what type of company they were enlisting in (Costa and Kahn 2008:59–62) and could not foresee whether the company would see action or would guard forts.
The veterans used in the analysis were all alive and on the pension rolls in 1900. Because the data are drawn from administrative records and because the only source of death information is the pension rolls, the sample cannot be used to investigate morbidity and mortality immediately after the war. Those on the pension rolls immediately after the war were the most disabled, and men imprisoned at Andersonville were more likely to be on the rolls than other ex-POWs. Studies of mortality are possible only after the Union Army pension program became a universal old-age and disability program.12 Because the young, the healthy, and non-POWs were less likely to be on the rolls in 1900, I will underestimate the negative effects of ex-POW status on later outcomes among younger men.
For the analyses, the sample is split between men younger than age 65 and men older than 64 in 1900. This cutoff corresponds to imprisonment below and above age 30. Table 3 uses the Fogel sample to show that among Andersonville POWs, mortality rates increased sharply after age 30; mortality rates below age 20 were slightly elevated. Table 3 also shows the number of men within each age group who were at Andersonville, who survived Andersonville, who survived to 1866, and who were alive in 1900. Those older than age 30 at captivity were 22 % of all men imprisoned at Andersonville, 16 % of Andersonville survivors, and 15 % of all war survivors. Mortality rates between camp survival and the end of the war were 13 % for those younger than 20 at captivity, 9 % for those aged 20–24, 15 % for those aged 25–29, and 15 % for those 30 and older. Because being a POW during a period of no prisoner exchanges did not predict linkage to the 1880 census, it is likely that there was no excess mortality for Andersonville survivors between 1866 and 1880; most of the excess mortality probably occurred prior to 1866. Table 3 also compares age distributions in 1900 between Andersonville survivors in the Fogel sample and the former POWs in the Andersonville sample. In the Fogel sample, those older than age 30 at captivity were 11 % of all men alive and on the pension rolls in 1900; in the Andersonville sample, they were 14 %.
Empirical Strategy
Using a competing risks hazard model for both age groups of men, I examine the effects of POW status on mortality from all heart disease; from valvular heart disease and pericarditis; from stroke; and from the combined category of ischemic heart disease, myocarditis, and endocarditis. Deaths from other causes are assumed to be independent competing risks.
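Because the other causes are assumed independent, the competing risks model can be estimated as cause-specific Cox regressions in which deaths from other causes are treated as censored. A minimal sketch with the lifelines package follows; the file and column names are hypothetical stand-ins for the analysis data:

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("veterans_1900.csv")  # hypothetical analysis file

COVARIATES = ["non_exchange_pow", "exchange_pow", "age_1900",
              "laborer_at_enlistment", "personal_property_1860"]

def cause_specific_cox(df: pd.DataFrame, cause: str) -> CoxPHFitter:
    """Cause-specific hazard: deaths from other causes are censored,
    which is valid under the independence assumption in the text."""
    d = df.copy()
    d["event"] = (d["died"] == 1) & (d["cause_group"] == cause)
    cph = CoxPHFitter()
    cph.fit(d[["years_observed", "event"] + COVARIATES],
            duration_col="years_observed", event_col="event")
    return cph

for cause in ["heart_disease", "stroke", "ischemic_combined"]:
    fit = cause_specific_cox(df, cause)
    print(cause, fit.hazard_ratios_[["non_exchange_pow", "exchange_pow"]].round(3))
```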
POW Status and Mortality
Results
POWs captured during the non-exchange period and younger than age 65 in 1900 were 1.1 times as likely to die as non-POWs, controlling only for age in 1900, characteristics at enlistment, and personal property wealth in 1860 (see Table 4). POWs captured during the exchange period faced the same odds of death as non-POWs. I reject the hypothesis that the hazard ratios on exchange- and non-exchange-period POWs are equal at the 10 % level. Controlling for laborer status in 1880 and property ownership and marital status in 1900 does not change the results. When I substituted finer occupational categories (professional or proprietor, artisan, and laborer) at enlistment and in 1880 (results not shown) for laborer in the second regression in Table 4, the hazard ratios on exchange-period and non-exchange-period POW fell slightly to 1.010 and 1.087, respectively, with the hazard ratio on non-exchange POW still statistically significant at the 5 % level. Controlling for wartime illnesses and wounds and mentions of scurvy in the surgeons’ exams (the third regression in Table 4) did not change the results, either.14 The hazard ratio on an interaction between POW status and wartime rheumatic fever was not statistically significant (results not shown). Controlling for arteriosclerosis, valvular heart disease, heart murmurs, and irregular heart rate in 1900 did not affect the hazard ratio on POW status (results not shown), suggesting that the effects of POW status on mortality do not operate through conditions already observed in 1900.
The last six regressions in Table 4 show that the excess mortality of POWs captured during the non-exchange period and younger than 65 in 1900 comes from heart disease. POWs captured during the non-exchange period were 1.2 times as likely to die of heart disease as non-POWs, even controlling for laborer status in 1880, property ownership and marital status in 1900, and wartime illnesses and wounds (columns 5, 6, and 7, respectively). POWs captured during the non-exchange period were nine times more likely than non-POWs to die of valvular heart disease (the caveat is that the number of deaths known to be from valvular heart disease was only 54); 1.3 times more likely to die of stroke; and 1.8 times more likely to die of the combined category of ischemic heart disease, myocarditis, and endocarditis (columns 8, 9, and 10, respectively). These cause-of-death effects do not arise from the sample being restricted to men with a known cause of death: the fourth regression yields results similar to those of the first three regressions, which were run on the entire sample. Because the sample size falls, I cannot reject the hypothesis that the hazard ratios on exchange-period and non-exchange-period POWs are equal, with the exception of ischemic heart disease.
Among men older than 64, POWs captured during the non-exchange period were 0.9 times as likely to die as non-POWs (see Table 5). However, as control variables are added, the effect becomes statistically insignificant. I also cannot reject the hypothesis that the hazard ratios on exchange- and non-exchange-period POWs are equal. The excess of non-POW deaths comes partially from the combined category of ischemic heart disease, myocarditis, and endocarditis, and also from pneumonia and influenza (not shown); however, none of the effects was statistically significant. Non-exchange POWs were more likely than non-POWs to die of other heart conditions, but the effects are statistically significant only for valvular heart disease (where the number of cases is only 16). There is evidence of unobserved heterogeneity (frailty) among men older than 64, but not among men younger than 65: using a Gompertz specification and assuming that the unobserved heterogeneity, θ, has a gamma distribution yielded likelihood-ratio test statistics for the significance of θ of 0.10 for men younger than 65 and 19.27 for men older than 64. However, explicitly controlling for this heterogeneity did not affect the hazard ratios on POW status.
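For reference, the Gompertz-with-gamma-frailty specification has a closed form. With baseline hazard $ae^{bt}$ and a frailty multiplier $z$ distributed gamma with mean 1 and variance $\theta$, the population hazard is (a standard result, not taken from the article’s estimates):

```latex
h(t \mid z) = z\,a e^{bt}, \qquad
H_0(t) = \frac{a}{b}\left(e^{bt}-1\right), \qquad
\bar{S}(t) = \left[1 + \theta H_0(t)\right]^{-1/\theta}, \qquad
\bar{h}(t) = \frac{a e^{bt}}{1 + \theta H_0(t)} .
```

At θ = 0 the expression collapses to the homogeneous Gompertz hazard, so the test of the significance of θ is a test for unobserved heterogeneity; a large θ also makes the population hazard decelerate as the frail die off, the selection mechanism described in the introduction.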
The Cox hazard model is estimated under the assumption that the proportional hazards assumption holds—that is, that the covariates are multiplicatively related to the hazard. Tests of the proportional hazards assumption in the regressions in Tables 4 and 5 showed that it was not met for the first three regressions. (It was met for all other regressions, except for all heart conditions with controls for later SES; tests based on the Schoenfeld residuals for men younger than 65 confirmed this for each of the last seven specifications.) However, the alternate strategies of stratifying and of restricting the sample to a 15-year observation period yielded very similar results and satisfied the proportional hazards assumption. When the samples were restricted to a 15-year observation period, the hazard ratio on non-exchange-period POW became 1.182 for men younger than 65 and 0.817 for men older than 64. (Testing the proportional hazards assumption on the basis of the Schoenfeld residuals after regressions in which the controls were enlistment characteristics and 1860 personal property wealth yielded a χ² of 13.93 for men younger than 65 and one of 13.75 for men older than 64.) When the entire sample period was used, the proportional hazards assumption failed to hold for POWs who were exchanged, men who enlisted in a large city, and men found in the 1860 census. I therefore re-ran the specifications, omitting 1860 personal property wealth and stratifying on city size and on whether the POW was an exchange-period POW. The hazard ratio on non-exchange-period POW was then 1.088 for men younger than age 65, and testing the Schoenfeld residuals yielded a χ² of 11.17.
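The Schoenfeld-residual diagnostic and the stratification remedy can be sketched with lifelines (column names are again hypothetical; `proportional_hazard_test` implements the scaled Schoenfeld residual test):

```python
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import proportional_hazard_test

df = pd.read_csv("veterans_1900.csv")  # hypothetical, as in the earlier sketch

cols = ["years_observed", "event", "non_exchange_pow", "exchange_pow",
        "age_1900", "large_city_enlistment"]
cph = CoxPHFitter()
cph.fit(df[cols], duration_col="years_observed", event_col="event")

# Per-covariate chi-squared tests based on the scaled Schoenfeld residuals.
proportional_hazard_test(cph, df[cols], time_transform="rank").print_summary()

# Remedy used in the text: stratify on the offending covariates so that
# each stratum gets its own baseline hazard while the POW effect is pooled.
cph_strat = CoxPHFitter()
cph_strat.fit(df, duration_col="years_observed", event_col="event",
              strata=["exchange_pow", "large_city_enlistment"],
              formula="non_exchange_pow + age_1900")
cph_strat.print_summary()
```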
Explanations
Potential explanations for the high survival rates of ex-POWs older than 64 in 1900 include mortality selection in the POW camp and mortality selection when men were in their 50s and early 60s. There is suggestive evidence that high POW camp mortality rates among older men killed the frail, perhaps those with latent chronic conditions while in the Army, thus leaving a more robust population. Among non-POWs who survived to 1900, 11 men had brothers (as determined from the 1850 and 1860 censuses) who had been POWs. The average year of death for the seven men with a brother who had survived the worst POW period was 1920, compared with 1910 for the four men whose brothers had not. Although the difference is statistically significant, the small number of men makes the evidence only suggestive.
There is no evidence that high mortality rates when men were in their 50s and early 60s left a more robust older population. When I restricted the sample of men younger than 65 in 1900 to those alive in 1910 and examined their mortality experience after 1910, I found that POWs during the non-exchange period were still 1.1 times more likely to die than non-POWs (results not shown).
Comparing non-exchange POWs with non-POWs of all ages in 1900 reveals a statistically insignificant nonlinear relationship between age at captivity and older-age mortality that provides suggestive evidence for the role of selection. I restricted the sample to non-exchange POWs and non-POWs and created a set of five dummy variables indicating whether the veteran had been captured prior to age 20; at ages 20–24, 25–29, or 30–34; or at age 35 or older. Controlling for age in 1900, size of enlistment city, laborer status at enlistment, native-born, and year of enlistment dummy variables, I found that the hazard ratios on the captivity dummy variables (relative to never captured) were 0.981, 1.109, 1.083, 0.894, and 0.807, respectively. Only those POWs captured at ages 20–24 faced a statistically significantly higher mortality risk relative to never-captured men. Those younger than 20 when captured did not face a statistically significantly different mortality rate from non-POWs or from POWs captured at later ages. The magnitudes of the hazard ratios for those younger than 20 at captivity and for those older than 30, together with the death rates by age group at Andersonville shown in Table 3, suggest selection.
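The age-at-captivity bins reduce to a set of indicator variables with never-captured men as the omitted group; a minimal sketch (data frame and column names hypothetical):

```python
import pandas as pd

df = pd.read_csv("veterans_1900.csv")  # hypothetical, as in earlier sketches

# Keep non-exchange POWs and never-captured men only.
sub = df[df["non_exchange_pow"] | ~df["ever_pow"]].copy()

# Five capture-age bins: <20, 20-24, 25-29, 30-34, 35+.
bins = [0, 20, 25, 30, 35, 200]
labels = ["under20", "20_24", "25_29", "30_34", "35plus"]
sub["capture_age_bin"] = pd.cut(sub["age_at_capture"], bins=bins,
                                labels=labels, right=False)

# Non-POWs have no capture age (NaN), so all five dummies are zero for
# them, making "never captured" the omitted reference category.
dummies = pd.get_dummies(sub["capture_age_bin"], prefix="captured")
sub = pd.concat([sub, dummies], axis=1)
```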
An explanation for the poor survival rates of former POWs younger than age 65 in 1900 is scarring. Scarring implies that ex-POWs who faced a lower mortality risk in captivity should do better. Costa and Kahn (2007b) found that those with kin at Andersonville, as proxied by the number of men in the company with the same last name, were more likely to survive. Among the men who survived to 1900 and who had been imprisoned when conditions were at their worst, those who had had kin in the camp (as proxied by at least one man with the same last name in the company) faced an odds of death of 0.809 compared with men with no kin, controlling for age in 1900 (full Cox hazard results not shown). Having had kin in the camp was not a statistically significant predictor of older-age mortality among men who had been in POW camps when conditions were better.
Additional support for scarring among POWs younger than age 65 and selection among POWs older than 64 comes from examining the amount of time spent as an exchange-period and as a non-exchange POW. For men younger than 65, the hazard ratio on the logarithm of time spent as a non-exchange POW was 1.009, which is statistically significant at the 10 % level. For men older than 64, the hazard ratio was 0.975, which is statistically significant at the 3 % level. The hazard ratios on time spent as an exchange-period POW were statistically insignificant: 0.997 and 0.991 for men younger than 65 and men older than 64, respectively. The nonlinear relationship with time spent in captivity is consistent with the nonlinear relationship observed for camp survivorship: men who had survived one or two months were less likely to die than those still in their first month of captivity, but the advantage of having survived a third month was small, and there was no advantage to having survived a fourth month (Costa and Kahn 2007b).
Another explanation, unrelated to scarring, for the worse survival outcomes of non-exchange-period POWs younger than 65 is that the characteristics that enabled men to survive captivity hurt their survival chances at older ages. One potential candidate for such a characteristic is height. The tall were less likely to survive POW camps because all men received the same size ration, and the tall need more food (Costa and Kahn 2007b). In very large samples, the tall are less likely to die of cardiovascular disease (Waaler 1984). However, the camp survival advantage of being short was small (Costa and Kahn 2007b), there was no statistically significant difference in the heights of non-POW and POW veterans, and controlling for height in the regressions did not affect the coefficients on POW status (results not shown).
POW Status and Cardiovascular Morbidity
Among men who were younger than 65, having been a POW during the non-exchange period increased the probability of having murmurs and of having valvular disease in 1900 by .04 each, relative to non-POWs (see Table 6). Men who were POWs during the exchange period were not statistically different from non-POWs. There was no difference in the probability of arteriosclerosis or irregular heartbeat by ex-POW status. However, controlling for later-life SES and wartime health conditions, both non-exchange-period and exchange-period POWs were statistically significantly more likely to have an irregular heartbeat (results not shown); the estimated effects on exchange-period and non-exchange-period POW were 0.047 and 0.039, respectively. POW status did not predict any cardiovascular conditions among men older than 64.
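Probability effects of this kind could come from, for example, a probit with average marginal effects. The statsmodels sketch below is an assumption about the estimator, offered only for concreteness; the column names are hypothetical:

```python
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("veterans_1900.csv")  # hypothetical, as in earlier sketches

# Outcome: an irregular heartbeat recorded in the 1900 surgeons' exams.
y = df["irregular_heartbeat"].astype(float)
X = sm.add_constant(df[["non_exchange_pow", "exchange_pow", "age_1900",
                        "laborer_1880", "owned_property_1900"]].astype(float))

probit = sm.Probit(y, X).fit(disp=0)
# Average marginal effects: changes in probability per regressor,
# on the same scale as the ~0.04 effects reported in Table 6.
print(probit.get_margeff().summary())
```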
POW Status and Socioeconomic Outcomes
Among men younger than 65 in 1900, POWs, regardless of when they were captured, were more likely in 1870 to be laborers and less likely to have any personal property wealth, controlling for laborer status at enlistment and personal property wealth in 1860 (see Table 7). By 1880, POWs captured during the exchange period resembled non-POWs: their laborer status in 1880 was similar, as was their property ownership in 1900. However, POWs captured during the non-exchange period were more likely than non-POWs to be laborers in 1880 and less likely to own property in 1900. POW status had no effect on marital status in 1900 (or in 1880; not shown). POW status did not have a statistically significant effect on the occupational, homeownership, and marital status of men older than 64 (see Table 8).
Among these older men, former POWs, regardless of when they were captured, were less likely to have personal property wealth in 1870. POWs captured when the exchange system had stopped were more likely than non-POWs to be laborers in both 1870 and 1880, but the effect is not statistically significant. The effects of POW status on property ownership in 1900 are positive but statistically insignificant.
Conclusion
Using data on Union Army soldiers, I found that imprisonment as a POW had very heterogeneous effects on older-age outcomes, depending on age at captivity. Men older than 30 at captivity who were imprisoned when conditions were at their worst were more likely to die in captivity, but 35 years after the end of the war, the survivors had a lower mortality rate than non-POWs or than POWs imprisoned when conditions were better, and they had the same morbidity rates and socioeconomic outcomes. In contrast, 35 years after the end of the war, men younger than 30 at captivity had worse socioeconomic outcomes, were more likely to die of cardiovascular and cerebrovascular disease, and faced greater cardiovascular morbidity.
The mortality findings for men younger than 30 at captivity are consistent with studies of Japanese-held WWII POWs, who faced different social, physical, disease, and medical environments and who also experienced severe malnutrition and even worse psychological stress (Dent et al. 1989; Page and Brass 2001; Page and Ostfeld 1994), suggesting certain common sequelae of starvation and the stress of imprisonment. The mortality findings for men older than 30 at captivity may differ from those of Japanese-held WWII POWs because when conditions in Civil War camps were at their worst, mortality rates from disease and malnutrition were higher than in Japanese POW camps, potentially leading to greater mortality selection in Civil War camps. Mortality may need to be very high for selection effects to operate: the mortality rate for men older than age 30 at Andersonville was 54 %, compared with 30 % for men in their 20s.
Acknowledgments
I thank Matthew Kahn, Louis Nguyen, Irwin Rosenberg, Nevin Scrimshaw, Avron Spiro, the participants of the UCLA Economic History Proseminar, and four anonymous referees for comments. I gratefully acknowledge the support of NIH grants R01 AG19637 and P01 AG10120 and data provided by a subgrant from P30 AG017265.
Notes
1. See U.S. War Department (1880–1901, Series II, Vol. VIII:615, 781). In a longitudinal random sample, roughly 38 % of the 554 men held at Andersonville died there. In the National Park Service’s cross-sectional database, 40 % of the listed men died at Andersonville (see Costa and Kahn 2007).
2. Estimated from a random sample of Union Army soldiers.
3. Estimated from the figures in U.S. Department of Veterans Affairs (2004) and from http://www.phil.muni.cz/~vndrzl/amstudies/civilwar_stats.htm.
4. Testimony from the trial of Captain Wirtz, reprinted in Ransom (1963).
5. The data are available at http://www.cpe.uchicago.edu and were collected by a team of researchers led by Robert Fogel. The sample of 35,570 represents roughly 1.3 % of all whites mustered into the Union Army and 8 % of all regiments that composed the Union Army. Ninety-one percent of the sample consists of volunteers, with the remainder evenly divided between draftees and substitutes. The data are based on a 100 % sample of all enlisted men in 331 randomly chosen companies. The sample is limited to 303 companies because complete data are not yet available on all 331 companies.
6. Linkage to the 1860 census reveals that the sample is representative of the northern population of military age in terms of 1860 real estate and personal property wealth and in terms of literacy rates.
7. Because of the system of prisoner exchange (and the hope that it would be revived), the South had an incentive to record information on men who were captured.
8. A searchable version of the database is available online as part of the Soldiers and Sailors system (http://www.itd.nps.gov/cwss).
9. See Costa and Kahn (2007a) for details on the 1880 sample; see Lee (2005) for details on the 1870 sample.
10. Because information on cause of death comes from the pension records, men—from the researcher’s point of view—are not at risk of dying until they are on the pension rolls. By 1900, an estimated 85 % of all Union Army veterans were on the pension rolls. Those not eligible for pensions included deserters (roughly 14 % of war survivors had ever deserted) and men who had served less than 90 days. Men who entered the rolls after 1900 were probably healthier. The percentage of veterans alive in 1900 who entered the rolls later was 8 % for non-POWs, 8 % for Fogel sample POWs, and 5 % for the Andersonville sample. My comparisons of POWs with non-POWs in 1900 may therefore understate health and subsequent mortality differences.
11. Shell shock, combat fatigue, and post-traumatic stress (all names for the same phenomenon in different wars) were not recognized as disorders either during or after the Civil War (see Hyams et al. 1996 for a history of PTSD). The records therefore provide little information on psychiatric disorders.
12. WWII POWs held by the Japanese experienced higher mortality rates for at least eight years after the war, whereas those held by the Germans did not. Excess mortality was due to tuberculosis and to trauma, including suicide, with the highest mortality rates among the youngest men (Cohn and Cooper 1954; Keehn 1980; Nefzger 1970).
13. Additional socioeconomic variables were investigated, including whether the veteran was a private (privates faced higher mortality rates in POW camps than nonprivates), but none of these was statistically significant, and none affected the other coefficients.
14. Even with no controls for type of POW, a mention of scurvy was not a statistically significant predictor of death.