The First Yellow Fever Pandemic: Slavery and Its Consequences

Today’s guest post is by Billy G. Smith, Distinguished Professor in the Department of History, Philosophy, and Religious Studies at Montana State University. He earned his PhD at the University of California, Los Angeles. His research interests include disease; race, class, and slavery; early America; and mapping early America.

Bird flu, SARS, Marburg, Ebola, HIV, West Nile fever. One of these diseases, or another that spreads from animals and mosquitoes to humans, may soon kill most people on the planet. More likely, the great majority of us will survive such a worldwide pandemic, and even now we have a heightened awareness that another one may be on the horizon. This post focuses on these issues in the past, outlining a virtually unknown voyage of death and disease that transformed the communities and nations bordering the Atlantic Ocean (what historians now refer to as the Atlantic World). It traces the journey of a sailing ship that inadvertently instigated an epidemiological tragedy, thereby transforming North America, Europe, Africa, and the Caribbean islands. This ship helped to create the first yellow fever pandemic.

The Hankey. From “Ship of Death: The Voyage that Changed the Atlantic World.”

In 1792, the Hankey and two other ships carried nearly three hundred idealistic antislavery British radicals to Bolama, an island off the coast of West Africa, where they hoped to establish a colony designed to undermine the Atlantic slave trade by hiring rather than enslaving Africans.  Poor planning and tropical diseases, especially a particularly virulent strain of yellow fever likely contracted from the island’s numerous monkeys (through a mosquito vector), decimated the colonists and turned the enterprise into a tragic farce.

From “Ship of Death: The Voyage that Changed the Atlantic World.”

In early 1793, after most colonists had died and the survivors had met resistance from the indigenous Bijagos, whose lands they had invaded, the Hankey attempted to return to Britain. Disease-ridden, short of healthy sailors, and fearing interception by hostile French ships, the survivors instead caught the trade winds to Grenada. They and the mosquitoes in the ship’s water barrels spread yellow fever in that port and, very soon, throughout the West Indies. This was only a few months before the British arrived to quell the slave rebellion in St. Domingue (now Haiti). The British and subsequently the French military had their troops decimated by the disease—one reason why the slave revolution succeeded. The crushing defeat in the Caribbean helped convince Napoleon to sell the vast Louisiana territory to the United States. He turned eastward to expand his empire, altering the future of Europe and the Americas.

A few months after the Hankey arrived in the West Indies, commercial and refugee ships carried passengers and mosquitoes infected with yellow fever to Philadelphia, the nation’s capital during the 1790s. The resulting epidemic killed five thousand people and forced tens of thousands of residents, including George Washington, Thomas Jefferson, and other prominent federal leaders, to flee for their lives. The city, state, and federal governments all collapsed, leaving it to individual citizens to save the nation’s capital. Meanwhile, doctors fiercely debated whether “Bulama fever” (as many called it) was a “new” disease or a more virulent strain of the yellow fever common in the West Indies. Physicians such as the noted Benjamin Rush argued just as bitterly over the causes of and treatments for the disease. They mostly bled and purged their patients, at times causing more harm than good given the rudimentary state of medicine.

Among those who stepped forward to aid people and save the city were members of the newly emerging community of free African Americans. Led by Absalom Jones, Richard Allen, and Anne Saville, black Philadelphians volunteered to nurse the sick and bury the dead—both dangerous undertakings at the time. Exposed to mosquitoes infected with yellow fever, many African Americans and physicians made the ultimate sacrifice; both groups died in disproportionately high numbers. When a newspaper editor subsequently maligned black people for their efforts, Jones and Allen wrote a vigorous response—among the first publications by African Americans in the new nation.

For one of the first times in American history, blacks responded in print: the Reverends Allen and Jones published a pamphlet answering the charges. Courtesy of the Internet Archive.

During the ensuing decade, yellow fever went global, afflicting every port city in the new nation on an annual basis. Epidemics also occurred in metropolitan areas throughout the Atlantic World, including North and South America, the Caribbean, southern Europe, and Africa. Among other consequences, this disaster encouraged Americans to fear cities as hubs of death. The future of the United States, as Thomas Jefferson argued, would lie in rural areas populated by yeoman farmers rather than in teeming metropolises. The epidemics also helped solidify the decision of the new nation’s leaders to move its capital to Washington, D.C., away from the high mortality associated with Philadelphia.

After the Hankey finally limped home to Britain, its crew was taken into service in the Royal Navy; few of them survived long. More importantly, the image of Africa as the “white man’s graveyard” became even more established in Britain and France, partially shielding Africa from European invasion until the advent of tropical medicine. “Bulama fever” plagued the Atlantic World for the next half century, appearing in epidemic form from Spain to Africa to North and South America. The origins and treatment of the disease drew intense debate as medical treatment became highly politicized, and the incorrect idea that Africans enjoyed immunity to yellow fever became an important part of the scientific justification of racism in the early nineteenth century.

Join Billy Smith and epidemiologist Michael Levy on October 24 for Sickness and the City, a conversation that uses both science and history to understand the intersection of urban development and the spread of contagions.

References
Billy G. Smith. Ship of Death: The Voyage that Changed the Atlantic World. New Haven, CT: Yale University Press, 2013.

The British National Health Service and the Fight for Universal Health Insurance in the United States

Today’s guest post is by Andrew Seaton, the 2018 Paul Klemperer Fellow in the History of Medicine. Andrew is a History PhD candidate at New York University. His dissertation explains the survival of the British National Health Service since 1948, and its significance at home and abroad. Andrew will be presenting his Fellowship research on Wednesday, April 18, at 4 p.m. in the Hartwell Room. Please email history@nyam.org if you would like to attend. Space is limited.

Americans have often looked to other countries in their debates about extending health insurance. Health reformers in the Progressive Era held up Germany’s sickness insurance as a model to work toward, only to have it turned against them during the First World War.[1] In the postwar period, the British National Health Service (NHS) became a focal point of discussion. President Truman’s attempts to include “national health insurance” within existing Social Security legislation coincided with the establishment of the NHS in 1948. When Truman’s opponents – foremost among them the American Medical Association (AMA) – depicted the NHS as emblematic of the problems with “socialized medicine” (see image below), progressives rushed to its defense.


Typical representation of the British National Health Service by the American Medical Association. “The Rebellion of British Doctors,” Editor and Publisher, March 6, 1948.

The left-wing health economist Michael M. Davis – whose papers are housed in the New York Academy of Medicine’s historical collections – stood as a central advocate for the British model. Davis was one of the most important American health campaigners of the mid-twentieth century. He founded organizations such as the Committee for the Nation’s Health (CNH) in 1946 to promote national health insurance, and worked closely with Truman to achieve legislative reform.[2] Cognizant of the Progressive Era attacks on the German model, the CNH realized that AMA “misinformation” about the British scheme would seriously harm its chances of achieving comprehensive health coverage for all. Responding to this threat, the CNH rebutted AMA communications on the NHS in its own pamphlets (see image below), provided statistics and details about the British health service to newspaper editors, and reprinted favorable media coverage from the U.K.


Committee for the Nation’s Health, “The Truth About Britain’s Medical Program” (March 1949).[3]

Trans-Atlantic trips undergirded American battles over the NHS. Dozens of opponents and supporters of extending health insurance in the U.S. undertook field studies in Britain to aid in the battle back home. Davis – by this point nearly eighty years old – undertook such a trip in 1959 with his wife, Alice. They not only met with their extensive contacts in the medical profession and British civil service, but also spoke to ordinary people in public parks across the country to find out how they felt about the NHS. The Britons that Michael and Alice Davis met – from hotel maids to university professors – were “practically unanimous” in saying they “wanted the Health Service,” pointing to the end of anxieties about doctors’ bills as the main cause of satisfaction.[4] The following year, Davis presented these findings as a talk to various American community and labor organizations in an attempt to stimulate interest in national health insurance.

Despite these efforts, Davis and other progressives lost their battle with the AMA. Congress defeated Truman-era health bills, the CNH ended its activities in 1956, and trade unions turned toward securing the best deals for their members through private health insurance rather than advancing a federal health program. The reputation of the NHS played an important part in these events; the AMA’s negative vision of the NHS triumphed over the one presented by figures like Davis. This underlines the importance of transnational perspectives when thinking about the history of health care in America – and indeed in Britain – alongside the significance of convincing a wider public when attempting to enact structural change. If Davis’s dream of universal medical coverage in the U.S. is ever to be realized, it will rest in part on shaping popular opinion about America’s place in the wider world of health systems.

References:
[1] Beatrix Hoffman, The Wages of Sickness: The Politics of Health Insurance in Progressive America (Chapel Hill: The University of North Carolina Press, 2001), 54-74.
[2] For a biography of Davis, see Alice Taylor Davis, Michael M. Davis: A Tribute (Chicago: Center for Health Administration Studies, 1972).
[3] New York Academy of Medicine, Library of Social and Economic Aspects of Medicine of Michael M. Davis, Box 64, CNH Releases on British N.H.S., “The Truth About Britain’s Medical Program” (March 1949).
[4] New York Academy of Medicine, Library of Social and Economic Aspects of Medicine of Michael M. Davis, Box 62, Bibliography: England: 2, Michael M. Davis, “My Observations Last Summer of the British National Health Service” (1960).

Saving the Race from Extinction: African Americans and National Negro Health Week

Today’s guest post is written by Paul Braff, a PhD candidate in American History at Temple University whose research focuses on African American history and public health during the twentieth century. On Tuesday, March 6, Paul will give The Iago Galdston Lecture: “Who Needs a Doctor?: The Challenge of National Negro Health Week to the Medical Establishment.” Click HERE to register for this event.

In 1896, Frederick Hoffman, a statistician for the Prudential Insurance Company of America, released his assessment of African American health. His Race Traits and Tendencies of the American Negro recommended against insuring the race and emphatically confirmed what Charles Darwin and other scientists and doctors had asserted for years: African Americans were going extinct.[1] Within the context of the burgeoning professionalization of the medical field, such a conclusion had the potential to exclude African Americans from medical care, especially when combined with the era’s preconceptions about racial difference.


A common joke in the early twentieth century.[2]

For Booker T. Washington, this negative view of the future of his race, and the idea that blacks could not understand basic health or improve their situation, had the potential to undermine all attempts at racial uplift. As he put it, “Without health and until we reduce the high death-rate [of African Americans] it will be impossible for us to have permanent success in business, in property getting, in acquiring education, to show other evidences of progress.”[3] For Washington, health was the building block upon which everything else rested: political rights, economic self-sufficiency, even citizenship.

To fight this white perception of African American health, in 1915 Washington launched a public health campaign, “National Negro Health Week” (NNHW). The Week focused on both public and private displays of health, emphasizing hygiene as well as painting and whitewashing, the latter overt demonstrations that African Americans could achieve “proper,” or white, standards of cleanliness and that being clean was connected to better health. The Week thus incorporated Washington’s racial uplift philosophy, extolling health and cleanliness values that aligned with those of whites in the hope of decreasing racial differences. This non-clinical definition of health, in which practicing proper hygiene and painting, not physician-overseen checkups and vaccinations, made one healthy, allowed African Americans to understand their own health and empowered them to become leaders in their communities. The straightforward and inexpensive activities the Week suggested were easy to duplicate and to rally the community behind, and the connections made in organizing a Week could then be used for more extensive African American social and political activities. Although Washington died later that year, the campaign lived on for another 35 years and became part of his legacy.


“National Negro Health Week: 17th Annual Observance, Sunday, April 5, to Sunday, April 12, 1931,” USPHS, Washington, D.C., 1931, cover, Folder 2, Box 5, “National Negro Health Week Collection,” Tuskegee University Archives, Tuskegee, AL.

NNHW’s popularity attracted the interest of the U.S. Public Health Service (USPHS), and when the Great Depression made the Week difficult to finance, the USPHS took it over in 1932. With the vast resources of the USPHS behind it, the Week grew into a massive campaign that drew millions of participants in thousands of communities each year.


Susan L. Smith, Sick and Tired of Being Sick and Tired: Black Women’s Health Activism in America, 1890-1950 (Philadelphia, PA: University of Pennsylvania Press, 1995), 70.

However, such participation came at a price, as the USPHS worked to redefine the Week’s definition of health. Under the USPHS, physicians were the ultimate arbiters of health, and the focus changed from cleanups and whitewashing to vaccinations and regular checkups from doctors and dentists. With the white medical establishment now enthroned at the center of the Week and the Civil Rights Movement starting to take shape, African Americans called for an end to a Week based upon race.

National Negro Health Week illuminates the important role non-experts can play in defining personal health, and how those definitions can become internalized. Exploring the role of non-experts allows historians to examine the ways in which social constructions of health can be challenged, and the study of NNHW better positions scholars and public health officials to understand how race and health intersect today.

References:
[1] Charles Darwin, The Descent of Man, and Selection in Relation to Sex (London, UK: John Murray, 1871). Reprint. New York, NY: Penguin Books, 2004, 163; Frederick L. Hoffman, Race Traits and Tendencies of the American Negro (New York, NY: The Macmillan Company, 1896), 35; George Frederickson, The Black Image in the White Mind: The Debate on Afro-American Character and Destiny, 1817-1914 (New York, NY: Harper and Row, 1971), 236-237, 252-258.
[2] “An Important Work,” April 12, 1926, in “The Tuskegee Health Collection, 1926,” 853, Tuskegee University Archives, Tuskegee, AL (TA). See also “Negro Health Week Conference,” November 1, 1926, 1, Box 1 Folder 2, “National Negro Health Week Collection,” TA and Edwin R. Embree, “Negro Illness and the Nation’s Health,” Crisis, March 1929, 84, 97.
[3] Booker T. Washington, Gallery Proof, January 15, 1915, 827, “National Negro Health Week,” Reel 713, Booker T. Washington Collection, TA.

Asthma and the Civil Rights Movement

Today’s guest post is written by Ijeoma Kola, a PhD candidate in Sociomedical Sciences at Columbia University Mailman School of Public Health and a former National Science Foundation graduate fellow. Her dissertation examines the history of asthma in urban African Americans in the 20th century, with special attention to medical history, environmental racism, and community activism. On Tuesday, November 14 at 6pm, Ijeoma will give the talk “Unable to Breathe: Race, Asthma, and the Environment in Civil Rights Era New Orleans and New York.” Click HERE to register for this event.

In July 1965, several months after the assassination of Malcolm X and the freedom marches from Selma to Montgomery, the New York Times ran a story about “an emotional epidemic” of asthma sweeping across New York City.[1] Although the writer focused on psychosomatic explanations, linking asthma symptoms to the tensions of the Civil Rights Movement, the article prompted me to explore the significance of asthma’s emergence as a racial problem during the 1960s.


Osmundsen, John A. “Asthma Linked to Rights Drive.” New York Times. 1965.

Before the 1960s, little was written about asthma in African Americans. For much of the early twentieth century, doctors debated whether black people could have asthma at all, as they understood the disease to afflict middle- and upper-class whites, who were believed to have more civilized lifestyles and delicate constitutions than poor blacks.[2]

However, in the 1960s, several “outbreaks” of asthma made national news headlines. In the fall of 1960, nearly 150 patients from adjoining neighborhoods were treated for asthma at Charity Hospital in New Orleans. One patient, a 73-year-old man named Dennis Knight, died.[3] After several years of seasonal spikes in asthma admissions at the same hospital, researchers at Tulane University found that asthma-related visits to the emergency room correlated with fire department calls about spontaneous fires at the base of garbage heaps around the city, some five to twenty years old. Smoke containing silica particles would drift downwind to the neighborhoods where the majority of people who visited Charity Hospital lived, triggering asthma attacks.[4]


Lewis, Robert, Murray M. Gilkeson, and Roy O. McCaldin. “Air Pollution and New Orleans Asthma.” Public Health Reports 77, no. 11 (November 1962): 953.


Lewis, Robert, Murray M. Gilkeson, and Roy O. McCaldin. “Air Pollution and New Orleans Asthma.” Public Health Reports 77, no. 11 (November 1962): 948, with modifications.

At the time, however, the New Orleans asthma epidemic of November 1960 was quickly forgotten, as events over the next few days turned attention away from asthma to something more urgent. A week after Knight’s death, on November 14, 1960, four black six-year-old girls – Leona Tate, Tessie Provost, Gaile Etienne, and Ruby Bridges – began the school integration process at two elementary schools in New Orleans. Violent protests broke out across the city, and only 13 of the usual 1,000 students at the two schools attended on integration day.[5]

In New Orleans in 1960, and in several other American cities with large, concentrated black communities over the next decade, asthma appeared to present itself alongside moments of racial tension. Although the New York Times connected these two phenomena with a psychosomatic explanation of emotional distress, I view the relationship differently. Neighborhoods where African Americans lived – often confined there by segregation and redlining – were more exposed to both indoor and outdoor particles that triggered asthma symptoms. While struggling to breathe, black people simultaneously fought for the right to live as equals. Rather than think of Civil Rights as a cause of asthma, I see asthma outbreaks in black urban America, and subsequent efforts to reduce the asthma disparity, as both a symptom and a symbol of the Civil Rights Movement.

References:
[1] John A. Osmundsen, “Asthma Linked to Rights Drive: Authorities Note Sharp Rise in Ailment Among Negroes and Puerto Ricans in City; Cause Still Uncertain; Tensions of Fight for Gains Play at Least Some Role, Many Experts Contend,” New York Times, 1965.
[2] Horace F. Ivins, “Pollen Catarrh-Hay Fever,” in Proceedings of the Fourth Quinquennial Session of the International Homoeopathic Congress, Held at Atlantic City, N.J., U.S.A., June 16 to 22, 1891 (Philadelphia: Sherman & Co., 1891), 732–43.
[3] “Medics Puzzled: Asthma Epidemic Hits New Orleans; 149 Seized, 1 Dead,” Philadelphia Tribune (1912-2001); Philadelphia, Penn., November 12, 1960, sec. 2.
[4] Robert Lewis, Murray M. Gilkeson, and Roy O. McCaldin, “Air Pollution and New Orleans Asthma,” Public Health Reports 77, no. 11 (November 1962): 947–54.
[5] John G. Warner, “Mob of 5000 Is Hosed By New Orleans Police: Police Hose New Orleans Segregation Rioters,” The Washington Post, Times Herald (1959-1973); Washington, D.C., November 17, 1960.

Wound Ballistics: The Science of Injury and the Mystery of Exploding Bullets

Today’s guest post is written by John Kinder, Associate Professor of History and American Studies at Oklahoma State University. He is the author of Paying with Their Bodies: American War and the Problem of the Disabled Veteran (University of Chicago Press, 2015). On Tuesday, October 17, Kinder will give his talk, “A History of American War in Five Bodies.” To read more about this lecture and to register, go HERE.

On March 11, 1944, an American soldier in the 182d Infantry was digging a foxhole on the island of Bougainville when a Japanese bullet ricocheted and hit him in the ankle. The wound didn’t look that serious. There was almost no blood. Still, it was better to be safe than sorry. Medics bandaged the wound, loaded the soldier onto a litter, and started down the hill to the aid station. He was dead before they reached the bottom.

I recently discovered this story in a volume on wound ballistics published by the US Army Medical Department in the early 1960s. Wound ballistics is the study of the physiological trauma produced by modern projectile weapons. It achieved quasi-scientific status in the late nineteenth century, as military physicians and other self-proclaimed wound experts carried out experiments to measure and ultimately predict what happened when chemically propelled metal collided with living human tissue.

Early on, much of their research involved shooting ammunition into pine boards or the carcasses of animals to estimate the casualty-causing potential of various armaments. Over time, however, wound ballisticians developed increasingly sophisticated techniques for mapping the body’s vulnerability to different weapons and fine-tuning the production of physiological trauma.


Microsecond X-ray of the femur of a dog after it has been shot by an 8/32-inch steel ball travelling at 4,000 feet per second. The bone has been shattered despite the fact that it was not actually hit by the steel ball. In order to understand the mechanisms of human injury, World War II-era scientists carried out ballistics experiments on a variety of “model” targets including living dogs, cats, pigs, and horses, as well as blocks of gelatin and tanks of water. 

In the process, they also managed to solve one of the most head-scratching mysteries in nineteenth-century military medicine. The mystery emerged at mid-century, when growing numbers of observers began to notice a peculiar phenomenon: soldiers were dying from what initially appeared to be relatively minor “through-and-through” wounds. High-velocity bullets seemed to enter and exit the body with only minimal damage. Upon autopsy, however, surgeons discovered extensive internal trauma—pulped tissue, ruptured veins, shattered bones—far outside the track of the bullet. How was this possible? As early as the 1840s, critics charged that the wounds must be the product of “exploding bullets,” which were subsequently banned by international treaty in 1868. In later years, physicians speculated that the internal explosions were caused by compressed air or heat, but nothing could be proven.


Microsecond X-ray of a thigh of a cat that has been shot by a 4/32-inch steel ball at an impact velocity of 3,000 feet per second. The dark area is the temporary cavity formed as the ball passes through the muscle tissues. X-rays like this one helped wound ballisticians explain the “explosive effect” that mystified nineteenth-century military physicians. 

By the 1940s, scientists were able to use X-rays and high-speed cameras to solve the mystery once and for all. They discovered that, around 200-400 microseconds after a high-speed bullet strikes a human body, a temporary cavity begins to form around the bullet path. This cavity, which expands and contracts in a fraction of a second, can be more than 20 times the volume of the permanent wound track, resulting in explosive damage to nearby tissue and bone. And, thanks to the elasticity of human skin, the bullet’s entrance and exit wounds might be nearly closed over by the time the patient reaches medical attention. It was a remarkable discovery—not least because it affirmed wound ballisticians’ belief that, when it came to understanding injury, the human eye was no match for a scientist and a machine.
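A rough back-of-the-envelope estimate of my own (not from the Army volume) suggests the scale of energy involved. Take the 8/32-inch steel ball from the dog X-ray caption above, traveling at 4,000 feet per second (about 1,219 m/s); an 8/32-inch ball has a radius of about 3.175 mm, and I assume a steel density of roughly 7,850 kg/m³:

$$
m = \rho \cdot \tfrac{4}{3}\pi r^{3} \approx 7850\ \tfrac{\text{kg}}{\text{m}^{3}} \times \tfrac{4}{3}\pi\,(3.175\times 10^{-3}\ \text{m})^{3} \approx 1.05\ \text{g}
$$

$$
E_{k} = \tfrac{1}{2}\,m v^{2} \approx \tfrac{1}{2}\,(1.05\times 10^{-3}\ \text{kg})\,(1219\ \text{m/s})^{2} \approx 780\ \text{J}
$$

In other words, a projectile with roughly the mass of a paperclip carries on the order of the muzzle energy of a powerful pistol round; when much of that energy is dumped into elastic tissue in a fraction of a millisecond, it inflates a temporary cavity far wider than the ball itself.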

To this day, practitioners of wound ballistics like to justify their work in humanitarian terms. The goal of their research, they often say, is to help military surgeons and body armor manufacturers cut down on unnecessary deaths. All of this is true—to a certain extent. From the very start, however, the field of wound ballistics has played a more ominous role in military history. If wound ballistics is the science of injury, it is also the science of injuring others. Understanding the body’s vulnerabilities has allowed warring nations to develop deadlier antipersonnel weapons: armaments designed to pulverize, poison, burn, shred, emulsify, and eviscerate the bodies of one’s enemies.

No doubt, some readers might be wondering about the soldier at Bougainville, the one who died after a light wound to the ankle. Was he too a victim of the “exploding bullet” phenomenon? As it turns out, his death can be chalked up to a more quotidian threat: human error. Today, we can only speculate about the medics’ actions: perhaps they were in a hurry, or perhaps they were exhausted after a brutal day of fighting, or perhaps—and this is my guess—they were so used to seeing war’s butchery that this soldier’s injury appeared inconsequential by comparison. Whatever the reason, they failed to apply a tourniquet to the wounded man’s leg.

Shortly after the litter party started down the hill, the soldier’s ankle began to hemorrhage. As blood drained from his body, he said that he felt cold. Within minutes, he was dead.

References:
1. International Committee of the Red Cross. Wound Ballistics: an Introduction for Health, Legal, Forensic, Military and Law Enforcement Professionals (film). 2008.
2. Kinder, John. Paying with Their Bodies: American War and the Problem of the Disabled Veteran. Chicago: University of Chicago Press, 2015.
3. Saint Petersburg Declaration of 1868 (full title: “Declaration Renouncing the Use, in Time of War, of Explosive Projectiles Under 400 Grammes Weight”). November 29-December 11, 1868.
4. United States Army Medical Department. Wound Ballistics. Washington DC: Office of the Surgeon General, Department of the Army, 1962.

Images:
Dog X-ray: Newton Harvey, J. Howard McMillan, Elmer G. Butler, and William O. Puckett, “Mechanism of Wounding,” in United States Army Medical Department, Wound Ballistics (Washington DC: Office of the Surgeon General, Department of the Army, 1962), 204.
Cat X-ray: Ibid., 176.

More Than Medicine: Social Justice and Feminist Movements for Health

Today’s guest post is written by Jennifer Nelson, Professor at the University of Redlands, specializing in women’s history, the history of feminism in the United States, and medical histories associated with social justice movements. She is the author of More Than Medicine (NYU Press, 2015).

On Thursday, October 5, Nelson will give her talk, “More Than Medicine: Social Justice and Feminist Movements for Health.” To read more about this lecture and to register, go HERE.

I begin my story of social justice and feminist movements for health with the Mound Bayou demonstration clinic, located in the Mississippi Delta. The clinic was founded by medical doctors who had been part of the Medical Committee for Human Rights (MCHR). Most had come to Mississippi to volunteer during the 1964 Freedom Summer, although several others were locals active in the Delta Ministry, a Mississippi-based Civil Rights organization.

Dr. H. Jack Geiger and the other founders of the Delta Health Center clinic worked with the Mound Bayou community to prioritize health needs. They quickly discovered that community members needed more than traditional medicine: they demanded food, jobs, and housing, linking these to the promotion of health. The clinic included a cooperative farm that grew vegetables for the community, since most of what was grown in the Delta was destined for commercial markets. Click HERE to listen to Dr. Geiger talk about his experiences in Mississippi at the National Library of Medicine exhibit on the Delta Health Center.

Many historians of social movements have emphasized that the women’s liberation movement emerged from the Civil Rights movement. I also pay attention to this connection, but focus on the use of medicine to achieve social justice goals in both the Civil Rights and women’s liberation movements. In both of these contexts, activists expanded the meaning of medicine in the process.

In the 1980s, feminists in the women’s health movement were also grappling with conversations about race and racism. Since the early 1970s, women of color had been demanding that the movement pay more attention to the issues they faced. In the 1970s much of the focus was on sterilization abuse; in the 1980s attention shifted to HIV transmission, safer sex, and AIDS. Dazon Dixon Diallo, one of the only women of color working at the Atlanta Feminist Women’s Health Center in the 1980s, focused on developing a program called the Women with AIDS Partnership Project.

Here is a clip from a talk given by Diallo about her organization SisterLove, which she formed when she left the Atlanta FWHC to more directly address HIV/AIDS:


Dazon Dixon Diallo. (Click image to watch video).

My book also focuses on Loretta Ross, one of the most important founders of the Reproductive Justice Movement, which sought to broaden the feminist discourse around “choice” to address the systemic problems associated with poverty and discrimination that prevented many women of color from simply choosing to have or not have children. Ross’s work connects back to the Civil Rights efforts in Mound Bayou among MCHR activists and local organizers with the Delta Health Center. Ross, by forging a reproductive justice framework, maintained that health promotion for poor women could not rest on medicine alone.

More Than Medicine: Social Justice and Feminist Movements for Health is the third event in the three-part event series Who Controls Women’s Health?: A Century of Struggle. The series examines key battles over women’s ability to control their bodies, health choices, and fertility. It is developed in collaboration with the Museum of the City of New York and supported by a grant from Humanities New York.

The Language of Textiles and Medicine

Today’s guest post is written by Kriota Willberg, the New York Academy of Medicine’s Artist-in-Residence, who is researching the history of sutures and ligatures. Through graphic narratives, teaching, and needlework, Kriota explores the intersection between body sciences and creative practice. Starting this week, Kriota will be teaching a four-week workshop entitled “Embroidering Medicine,” which explores relationships between medicine, needlework, and gender. There is still time to register for this workshop, which begins September 14.

As an artist working with textiles and comics (two media often considered domestic or for children), I am interested in the interplay of culturally common materials, tools, and language with those of professional specialty. From the research I have done on the history of sutures and ligatures, it appears that the staples of domestic needlework – thread/sinew, cloth/hide, scissors, pins, and needles – have been appropriated from domestic use since the time of their invention to assist in the repair of the body. Similarly, the language of domestic and professional needlework has been re-purposed to describe closing wounds.

Many of the texts I am reading describe the characteristics and purposes of various surgical needles, the types of textiles used for bandaging (linen, wool, cotton), and the types of thread used for various sutures (linen, silk, cotton, catgut). I have also found descriptions of wool and flax production by Pliny the Elder in the first century AD, an account of French silk production in 1766 from John Locke, and a couple of 20th-century books detailing the history of catgut.


Ligatures and Sutures by Bauer and Black (c. 1924), chapter on “Preparation of Bauer & Black Catgut.”

Although I don’t know when a physician’s sewing kit diverged from that of a seamstress or a leather worker, John Stewart Milne writes in his book Surgical Instruments in Greek and Roman Times:

“Three-cornered surgical needles were in use from very early times. They are fully described in the Vedas of the Hindoos… A few three-cornered needles of Roman origin have been found, although they are rare.”[1]

In addition to describing the specific uses of surgical needles, Milne also discusses the uses of domestic needles in stitching bandages by Roman physicians.[2]


A collection of needles and probes. Source: Surgical Instruments in Greek and Roman Times (1907) by John Stewart Milne.

Galen reinforces this play between textiles, medicine, and the body by describing damage to the body through the metaphor of fabric:

“It is not the job of one art to replace one thread that has come loose, and of another to replace three or four, or for that matter five hundred… In quite general terms, the manner by which each existent object came about in the first place is also the manner in which it is to be restored when damaged.

The woof is woven into the warp to make a shirt. Now, is it possible for that shirt to sustain damage, or for that damage to be repaired, in some way which does not involve those two elements? If there is damage of any kind at all, it cannot but be damage to the warp, or to the woof, or to both together; and, similarly, there is only one method of repair, an inter-weaving of woof and warp which mimics the original process of creation.”[3]

The tandem development of textile production and medicine becomes part of the domestic-to-medical interface of textiles and their tools, manifested in the language used to describe materials, tools, and stitches.

In his Major Surgery (1363), in a chapter about “sewing” wounds, Guy de Chauliac describes wrapping thread around a needle in the same manner that women use to keep threaded needles on their sleeves. He also describes using hooks to bind wounds, a closure technique attributed to wool cutters or (wool) walkers.[4] Later, Ambrose Paré, paraphrasing Guy’s description of another type of suture, says, “The second Suture is made just after the same manner as the Skinners sow their…furs.”[5] Paré also invokes the image of keeping a needle on one’s sleeve when describing the surgical repair of harelip (known today as cleft lip).


Bottom illustration showing an example of thread winding described by Paré and Guy de Chauliac. Source: The Workes of that famous Chirurgion Ambrose Parey (1634).

The language of needlework and textiles is used to educate and inform the student surgeon about the body, health, and suturing techniques. Woof and warp, wrapping needles, closing a wound as a wool walker would fasten wool, and suturing the body with the same stitch used by a skinner all seem to be descriptions one is expected to understand and mimic. What is a wool walker? Thanks to Wikipedia I can tell you that “walking” is a step in cloth making, also called fulling, in which one pounds woolen cloth with one’s feet to thicken and clean it.[6] I still haven’t figured out how they fasten the wool with hooks.

References:
[1] Milne, John Stewart. Surgical Instruments in Greek and Roman Times. Oxford: Clarendon Press, 1907, p.75.
[2] Milne. p.75-76.
[3] Galen. Selected Works. Translated with an introduction and notes by P. N. Singer. Oxford: Oxford University Press, 1997.
[4] Guy, de Chauliac. The cyrurgie of Guy de Chauliac. Ed. Margaret S. Ogden. London, New York: Early English Text Society by the Oxford Univ. Press, 1971, p.192.
[5] Paré, Ambrose. The Workes of that famous Chirurgion Ambrose Parey Translated out of Latine and compared with the French. Trans. Th: Johnson. London: Th:Cotes and R. Young, 1634, p.327.
[6] Wikipedia. “Fulling.” 10 July 2017.

Who Practices “Visualizing Anatomy”?

Today’s guest post is written by Kriota Willberg, the New York Academy of Medicine’s Artist-in-Residence. Through graphic narratives, teaching, and needlework, Kriota explores the intersection between body sciences and creative practice. This May, Kriota taught a four-week workshop entitled “Visualizing and Drawing Anatomy,” which utilized live models as well as anatomical illustrations from the New York Academy of Medicine’s library. You can read more about Kriota’s work HERE.


The class gets oriented before drawing practice.

The Visualizing and Drawing Anatomy workshop was held at the Academy on Tuesday evenings in June. Once again I was impressed by the participants’ willingness to practice looking underneath our models’ skin to draw the deep anatomical structures that give our bodies form.


Participants draw using their preferred medium, in this case, paper or an iPad.

Who benefits from this kind of drawing practice? Practically everyone. Trained artists sharpen their skills, and those new to art and drawing learn fundamental principles of anatomy that lay the foundation for drawing the human figure.

Debbie Rabina, who is new to art, took the workshop last year. Since then she has kept up a regular drawing practice and occasionally incorporates anatomy into her work.


Debbie Rabina’s drawing since taking “Visualizing Anatomy” in 2016.

Ellen Zaraoff is a photographer who has just started drawing. Until taking the classes this year, she had focused on drawing portraits in charcoal. She took the workshop to get an introduction to anatomy, structure, and proportion.

Sarah Wukoson has a BA in art, and works in medical research. She took the workshop this year because she’s interested in the intersection of art and medicine as well as “the interplay of different modes of understanding the body.”


Sarah Wukoson’s 2017 in-class sketches and exercises.

Jim Dooley is a “life-long art lover who decided a couple years ago to take a stab at producing, not just consuming.” His focus is drawing and painting. He took this class to improve his draftsmanship.


Jim Dooley’s 2017 homework.

Susan Shaw is an artist.  She says, “I took the class (last year) because I found I was thinking 2 dimensionally when I was drawing and the figures seemed to have no life… I now think about how the body functions when I draw and it makes gesture and weighting much easier.”


Susan Shaw’s figure drawing since taking “Visualizing Anatomy” in 2016.

The variety of participants – artists, illustrators, cartoonists, and enthusiastic beginners, all interested in anatomy and the Library’s historical collections – makes this workshop one of my favorites to teach.

From September 14 to October 5, Kriota is offering an “Embroidering Medicine” workshop, which will take place at the Academy. This four-week workshop explores The New York Academy of Medicine Library’s historical collections, examining relationships between medicine, needlework, and gender. Learn more and register HERE.


The History of Garlic: From Medicine to Marinara

Today’s guest post is written by Sarah Lohman, author of Eight Flavors: The Untold Story of American Cuisine (Simon & Schuster, 2016). On Monday, June 5, Lohman will give her talk, “The History of Garlic: From Medicine to Marinara.” To read more about this lecture and to register, go HERE.

Ms. Amelia Simmons gave America its first cookbook in 1796; within her pamphlet filled with sweet and savory recipes, she makes this note about garlic: “Garlickes, tho’ used by the French, are better adapted to the uses of medicine than cookery.” In her curt dismissal, she reflected a belief that was thousands of years old: garlic was best for medicine, not for eating. To add it to your dinner was considered the equivalent of serving a cough-syrup soup.

There are records of ancient Greek doctors prescribing garlic as a strengthening food, and bulbs have been recovered from Egyptian pyramids. Garlic was being cultivated in China at least 4,000 years ago, and upper-class Romans would never serve garlic for dinner; to them, it tasted like medicine.

In medieval Europe, garlic was considered food only for the humble and low. While those who could afford it imported spices like black pepper from the Far East, the lower classes used herbs they could grow. Garlic’s intense flavor helped peasants jazz up otherwise bland diets. It was made into dishes like aioli, originally a mixture of chopped garlic, bread crumbs, nuts, and sometimes stewed meat. It was intended to be sopped up with bread, although it was occasionally served as a sauce to accompany meats in wealthier households.


Garlic (scientific name Allium sativum) from Medical Botany (1790) by William Woodville.

The English, contrary to the stereotype about bland British cooking, seemed particularly enchanted by garlic. In the first known cooking document in English, a vellum scroll called The Forme of Cury, a simple side dish is boiled bulbs of garlic. Food and medicine were closely intertwined in medieval Europe, and garlic was served as a way to temper your humors. Humors were thought to be qualities of the body that affected your health and personality. Garlic, which was thought to be “hot and dry,” shouldn’t be consumed by someone who was quick to anger, but might succeed in pepping up a person who was too emotionally restrained. According to food historian Cathy Kaufman, a medieval feast might have a staggering number of different dishes, all laid on the table at one time, so that different personality types could construct a meal that fit their humors.

Up through the 19th century, people also believed you got sick by inhaling bad air, called “miasmas.” Miasmas hang out by swamps, but also by sewage, or feet – I always imagined them as the puddles of mist that lie in the nooks between hills on dark country roads. Garlic can help you with miasmas, too. Ever see an image of plague doctors from medieval Europe wearing masks with a long, bird-like beak? The beak was filled with odorous herbs, garlic likely among them, designed to combat miasmas.

In 18th-century France, a group of thieves may have been inspired by these plague masks. During an outbreak of bubonic plague in Marseilles in 1726 (or 1655; accounts vary), a group of thieves was accused of robbing dead bodies and the houses of the deceased and ailing, without seeming to contract the disease themselves. Their lucky charm against the miasmas? They steeped garlic in vinegar, soaked a cloth or a sponge in the liquid, then tied it like a surgical mask over their mouths and noses. In their minds, the strong smells would repel miasmas. This story is probably a legend, but I think there is some grain of truth to it: in modern studies, garlic has been shown to mask some of the human smells that attract biting bugs. Since we now know bubonic plague was carried by fleas, it’s possible the thieves were repelling the insects. The plague is also a bacterial infection, and both vinegar and garlic are effective antimicrobials.

Garlic remained in the realm of medicine for most of the 19th century. Louis Pasteur first discovered that garlic was a powerful antimicrobial in 1858. In 1861, John Gunn assembled a medical book for use in the home, The New Domestic Physician, “with directions for using medicinal plants and the simplest and best new remedies.” Gunn recommends a roasted garlic remedy for earache:

“An excellent remedy for earache is as follows: Take three or four roasted garlics, and while hot mash, and add a tablespoonful of sweet oil and as much honey and laudanum; press out the juice, and drop of this into the ear, warm, occasionally.”


Garlick from Botanologia: The English Herbal (1710) by William Salmon.

He also recommends garlic for clearing mucus from the lungs and reducing cough, given by the spoonful with honey and laudanum. Gardening for the South: Or, How to Grow Vegetables and Fruits, an 1868 botanical guide, says the medicinal values of garlic include making you sweat, which, like bloodletting, was believed to leach out disease; it will also make you urinate, and it is an effective “worm destroyer” for any intestinal hitchhikers you might have. By the late 19th century, scientists also used garlic to treat TB and injected it into the rectum to treat hemorrhoids.

Today, garlic is one of the most heavily used home remedies, and it is increasingly being studied in the medical field. Some of its historic uses have been proved to be bunk, while others, like its efficacy as a topical antiseptic, hold up. But since the late 19th century, garlic has found an even more worthwhile home, thanks to French chefs and Italian immigrants, who spread their garlic-heavy cuisines around the world and made even garlic-reticent Americans lovers of this pungent plant.

Join us on Monday, June 5 to learn more about this topic.  Click HERE to register.

The Enduring Impact of the X-Ray

Today we have part two of a guest post written by Dr. Daniel S. Goldberg, 2016 recipient of the Audrey and William H. Helfand Fellowship in the History of Medicine and Public Health. Part one can be read here.

X-ray exhibitions were hugely popular all over the country, and the greater New York area was no exception. At a February 1896 demonstration run by Professor Arthur Wright, director of the Sloan Laboratory at Yale University, a newspaper reported that despite the auditorium being literally jam-packed, students were still crawling through windows 30 minutes into the lecture — and all this even though none of the audience, save those in the first few rows, could hear Wright’s discussion. The deans of multiple Yale schools (Divinity, Law, and Science), the head of the Yale Corporation, and the chief medical examiner were all in attendance.


Perhaps the first published X-ray in the United States of a clinical condition. In “Rare Anomalies of the Phalanges Shown by the Röntgen Process,” Boston Medical and Surgical Journal 134(8), February 20, 1896: 198–99.

The pressing question is: why? Why did X-rays exert such tremendous power across a wide spectrum of social domains? (X-rays were a constant topic of conversation in sermons and religious journals, in women’s journals, and in influential satirical periodicals like Punch, and were the subject of a seemingly endless number of political and non-political cartoons, to name but a few venues.) Although historians of the X-ray have offered a number of plausible answers, I believe there is a key element left unexplored in the historiography: the intellectual frameworks, or ideas, relating to changing notions of truth, doubt, and objectivity in U.S. society at the time.

Two of these frameworks are most useful in unpacking the stunning impact of the X-ray: the rise of mechanical objectivity, and what can be called “somaticism” within medicine and science. Historians of science Lorraine Daston and Peter Galison explain that a new model of ‘objectivity’ begins to take hold during the middle decades of the 19th century. Under this new model, the truth-value of scientific knowledge is a function of the investigator’s ability to remove or eliminate human, subjective influence from the knowledge-making process. The fact that this is more or less impossible, and that X-rays can be manipulated in all sorts of ways, was well known to contemporaries and remained a source of anxiety for some time. The important point is the ultimate goal: to let the mechanical processes of nature speak for themselves and reveal their truths. Ideas of objectivity, as Daston and Galison point out, have for over four hundred years been connected to scientific images, which makes media like photography and X-rays especially significant.

By the end of the 19th century, ideas of mechanical objectivity begin to fundamentally reshape ideas of what is known and what is certain.  This is especially crucial in a century that features so much intense change, including but not limited to governments, family and labor structures, migration patterns, and, of course, industrialization and urbanization.  Late Victorians were beset with anxieties connected to their changing world, and they were especially concerned with artifice and deception — that the world was not what it seemed.  As such, intellectual frameworks that shaped the criteria for truth were hugely influential, and traveled well beyond narrow networks of scientists and medical men.

Somaticism integrates in important ways with constructs of mechanical objectivity. Historians of medicine have documented the influence of somaticism (or “empiricism,” as it is also sometimes termed) within medicine over the long 19th century. The core of the framework is that truths about disease and the body are to be found in pathological anatomical objects. The existence of these objects can then be clinically correlated with the illness complaints the patient has, or more likely had, given that until the X-ray pathological objects were most likely to be located precisely during a postmortem. The truths of the sick body are to be found in the natural objects of disease, which makes seeing those objects essential. Laennec himself explained that the point of the stethoscope was not to listen; listening was merely a means to an end. The point, as Jacalyn Duffin explains, was “to see with a better eye.”

Collectively, these frameworks go a significant length in explaining the enormous and enduring social impact of the X-ray.


Article from the New York Record. May 1896.

For example, Morton’s clippings contain a May 1896 article from the New York Record entitled “X Rays for a Consideration: Light in a Human Kidney.”  The article details what may be the first private X-ray laboratory opened in New York City, founded by Mrs. M.F. Martin, and located at 110 East 26th Street.  The lab was intended solely for the use of physicians and hospitals.  One of its first patients was a doctor named George McLane, who traveled from North Dakota to have his kidney X-rayed for evidence of a possible stone.  A surgeon removed McLane’s kidney, and Morton placed it on a plate and subsequently irradiated it with X-rays.  The procedure “revealed the absence of any stone in the organ, demonstrating the entire reliability of doctors to prove the absence of stone in the kidney.”

The X-ray shines its light into the hitherto dark spaces inside the human body, revealing the truth of a disputed question: whether McLane suffered from a kidney stone or not.  The truth resides in the natural object itself, and the mechanism of the X-ray supposedly insulates the production of medical knowledge from the whims and artifices of the investigator (as compared to illustrations and drawings, for example).

Or, as Dr. McLane himself stated at the Post Graduate Hospital (the primary hospital at which Morton cared for inpatients):

“Dr. McLane spoke modestly at the Post Graduate Hospital about the risk he had taken in the name of science . . . ‘Hitherto a great many mistakes have been made owing to the inability of doctors to prove the absence of stone in the kidney . . .’  Now, by a very simple process, the truth can easily be determined.”

It is difficult to imagine how powerful it must have been, in 1896, to witness an X-ray operator remotely anatomize the living body.  Seeing inside the body had been a dream of physicians for centuries prior, and there is every reason to believe that its achievement has not eroded much of its social power.  Americans still perform significantly more medical imaging procedures than virtually any of our comparator societies, and what is most interesting is the evidence that this utilization is driven both by supply and demand.  That is, it is not merely that we have expensive X-ray and medical imaging machines — so we use them.  Across a wide variety of illness paradigms, illness sufferers and patients request medical imaging; they want it to be performed on their bodies.  The history of the X-ray helps us understand the enduring power of these tools, of what it means to delve into the penetralium.