The Language of Textiles and Medicine

Today’s guest post is written by Kriota Willberg, New York Academy of Medicine’s Artist-in-Residence researching the history of sutures and ligatures.  Through graphic narratives, teaching, and needlework, Kriota explores the intersection between body sciences and creative practice. Starting this week, Kriota will be teaching a four-week workshop entitled “Embroidering Medicine,” which explores relationships between medicine, needlework, and gender. There is still time to register for this workshop, which begins September 14.

As an artist working with textiles and comics (two media often considered domestic or for children), I am interested in the interplay of culturally common materials, tools, and language with those of professional specialty. From my research on the history of sutures and ligatures, it appears that the staples of domestic needlework (thread/sinew, cloth/hide, scissors, pins, and needles) have been appropriated, since the time of their invention, to assist in the repair of the body. Similarly, the language of domestic and professional needlework has been repurposed to describe closing wounds.

Many of the texts I am reading describe the characteristics and purposes of various surgical needles, the types of textiles used for bandaging (linen, wool, cotton), and the types of thread used for various sutures (linen, silk, cotton, catgut). I have also found descriptions of wool and flax production by Pliny the Elder in the first century AD, an account of French silk production in 1766 from John Locke, and a couple of 20th-century books detailing the history of catgut.

Ligatures and Sutures by Bauer and Black (c. 1924), chapter on “Preparation of Bauer & Black Catgut.”

Although I don’t know when a physician’s sewing kit diverged from that of a seamstress or leather worker, John Stewart Milne writes in his book Surgical Instruments in Greek and Roman Times:

“Three-cornered surgical needles were in use from very early times. They are fully described in the Vedas of the Hindoos… A few three-cornered needles of Roman origin have been found, although they are rare.”[1]

In addition to describing the specific uses of surgical needles, Milne also discusses the uses of domestic needles in stitching bandages by Roman physicians.[2]

A collection of needles and probes. Source: Surgical Instruments in Greek and Roman Times (1907) by John Stewart Milne.

Galen reinforces this play between textiles, medicine, and the body by describing damage to the body through the metaphor of fabric:

“It is not the job of one art to replace one thread that has come loose, and of another to replace three or four, or for that matter five hundred… In quite general terms, the manner by which each existent object came about in the first place is also the manner in which it is to be restored when damaged.

The woof is woven into the warp to make a shirt. Now, is it possible for that shirt to sustain damage, or for that damage to be repaired, in some way which does not involve those two elements? If there is damage of any kind at all, it cannot but be damage to the warp, or to the woof, or to both together; and, similarly, there is only one method of repair, an inter-weaving of woof and warp which mimics the original process of creation.”[3]

The tandem development of textile production and medicine is part of the domestic-to-medical interface of textiles and their tools, manifested in the language used to describe materials, tools, and stitches.

In his Major Surgery (1363), in a chapter about “sewing” wounds, Guy de Chauliac describes wrapping thread around a needle using the same method women use to keep threaded needles on their sleeves. He also describes using hooks to bind wounds, a closure technique attributed to wool cutters or (wool) walkers.[4] Later, Ambroise Paré, paraphrasing Guy’s description of another type of suture, says, “The second Suture is made just after the same manner as the Skinners sow their…furs.”[5] Paré also invokes the image of keeping a needle on one’s sleeve when describing the surgical repair of harelip (known today as cleft lip).

Bottom illustration showing an example of thread winding described by Paré and Guy de Chauliac. Source: The Workes of that famous Chirurgion Ambrose Parey (1634).

The language of needlework and textiles is used to educate and inform the student surgeon about the body, health, and suturing techniques. Woof and warp, wrapping needles, closing a wound as a wool walker would fasten wool, and suturing the body with the same stitch used by a Skinner all seem to be descriptions one is expected to understand and mimic. What is a wool walker? Thanks to Wikipedia I can tell you that “walking” is a step in cloth making, also called fulling, in which one pounds woolen cloth with one’s feet to thicken and clean it.[6] I still haven’t figured out how they fastened the wool with hooks.

References:
[1] Milne, John Stewart. Surgical Instruments in Greek and Roman Times. Oxford: Clarendon Press, 1907, p. 75.
[2] Milne, pp. 75–76.
[3] Galen. Selected Works. Trans. P. N. Singer. Oxford: Oxford University Press, 1997.
[4] Guy de Chauliac. The Cyrurgie of Guy de Chauliac. Ed. Margaret S. Ogden. London and New York: Early English Text Society by the Oxford University Press, 1971, p. 192.
[5] Paré, Ambroise. The Workes of that famous Chirurgion Ambrose Parey Translated out of Latine and compared with the French. Trans. Th. Johnson. London: Th. Cotes and R. Young, 1634, p. 327.
[6] Wikipedia. “Fulling.” Accessed 10 July 2017.

Who Practices “Visualizing Anatomy”?

Today’s guest post is written by Kriota Willberg, New York Academy of Medicine’s Artist-in-Residence.  Through graphic narratives, teaching, and needlework, Kriota explores the intersection between body sciences and creative practice. This May, Kriota taught a four-week workshop entitled “Visualizing and Drawing Anatomy,” which utilized live models as well as anatomical illustrations from the New York Academy of Medicine’s library. You can read more about Kriota’s work HERE.

The class gets oriented before drawing practice.

The Visualizing and Drawing Anatomy workshop was held at the Academy on Tuesday evenings in June. Once again I was impressed by the participants’ willingness to practice looking underneath our models’ skin to draw the deep anatomical structures that give our bodies form.

Participants draw using their preferred medium, in this case, paper or an iPad.

Who benefits from this kind of drawing practice? Practically everyone. Trained artists sharpen their skills, and those new to art and drawing learn fundamental principles of anatomy that lay the foundation for drawing the human figure.

Debbie Rabina, who is new to art, took the workshop last year. Since then she has kept up a regular drawing practice and occasionally incorporates anatomy into her work.

Debbie Rabina’s drawing since taking “Visualizing Anatomy” in 2016.

Ellen Zaraoff is a photographer who has just started drawing; until taking the class this year, she had focused on charcoal portraits. She took the workshop to get an introduction to anatomy, structure, and proportion.

Sarah Wukoson has a BA in art, and works in medical research. She took the workshop this year because she’s interested in the intersection of art and medicine as well as “the interplay of different modes of understanding the body.”

Sarah Wukoson’s 2017 in-class sketches and exercises.

Jim Dooley is a “life-long art lover who decided a couple years ago to take a stab at producing, not just consuming.” His focus is drawing and painting. He took this class to improve his draftsmanship.

Jim Dooley’s 2017 homework.

Susan Shaw is an artist.  She says, “I took the class (last year) because I found I was thinking 2 dimensionally when I was drawing and the figures seemed to have no life… I now think about how the body functions when I draw and it makes gesture and weighting much easier.”

Susan Shaw’s figure drawing since taking “Visualizing Anatomy” in 2016.

The variety of participants (artists, illustrators, cartoonists, and enthusiastic beginners, all interested in anatomy and the Library’s historical collections) makes this workshop one of my favorites to teach.

This September 14-October 5, Kriota is offering an “Embroidering Medicine Workshop,” which will take place at the Academy.  This four-week workshop explores The New York Academy of Medicine Library’s historical collections, examining relationships between medicine, needlework, and gender. Learn more and register HERE.

The History of Garlic: From Medicine to Marinara

Today’s guest post is written by Sarah Lohman, author of Eight Flavors: The Untold Story of American Cuisine (Simon & Schuster, 2016). On Monday, June 5, Lohman will give her talk, “The History of Garlic: From Medicine to Marinara.” To read more about this lecture and to register, go HERE.

Ms. Amelia Simmons gave America its first cookbook in 1796; within her pamphlet filled with sweet and savory recipes, she makes this note about garlic: “Garlickes, tho’ used by the French, are better adapted to the uses of medicine than cookery.” In her curt dismissal, she reflected a belief that was thousands of years old: garlic was best for medicine, not for eating. To add it to your dinner was considered the equivalent of serving a cough syrup soup.

There are records of ancient Greek doctors who prescribed garlic as a strengthening food, and bulbs were recovered from Egyptian pyramids. Garlic was being cultivated in China at least 4,000 years ago, and upper class Romans would never serve garlic for dinner; to them, it tasted like medicine.

In medieval Europe, garlic was considered food only for the humble and low. While those who could afford it imported spices like black pepper from the Far East, the lower classes used herbs they could grow. Garlic’s intense flavor helped peasants jazz up otherwise bland diets. It was made into dishes like aioli, originally a mixture of chopped garlic, bread crumbs, nuts, and sometimes stewed meat. It was intended to be sopped up with bread, although it was occasionally served as a sauce to accompany meats in wealthier households.

Garlic (scientific name Allium sativum) from Medical Botany (1790) by William Woodville.

The English, contrary to the stereotype about bland British cooking, seemed particularly enchanted by garlic. The first known cooking document in English, a vellum scroll called The Form of Cury, includes a simple side dish of boiled garlic bulbs. Food and medicine were closely intertwined in medieval Europe, and garlic was served as a way to temper your humors. Humors were thought to be qualities of the body that affected your health and personality. Garlic, which was thought to be “hot and dry,” shouldn’t be consumed by someone who was quick to anger, but might succeed in pepping up a person who was too emotionally restrained. According to food historian Cathy Kaufman, a medieval feast might have a staggering number of different dishes, all laid on the table at one time, so that different personality types could construct a meal that fit their humors.

Up through the 19th century, people also believed you got sick by inhaling bad air, called “miasmas.” Miasmas hang out by swamps, but also by sewage, or feet; I always imagined them as the puddles of mist that lie in the nooks between hills on dark country roads. Garlic can help you with miasmas, too. Ever see an image of plague doctors from medieval Europe wearing masks with a long, bird-like beak? The beak was filled with odorous herbs, garlic likely among them, designed to combat miasmas.

In 18th-century France, a group of thieves may have been inspired by these plague masks. During an outbreak of the bubonic plague in Marseilles in 1726 (or 1655; accounts differ), a group of thieves were accused of robbing dead bodies and the houses of the deceased and ailing without seeming to contract the disease themselves. Their lucky charm against the miasmas? They steeped garlic in vinegar, soaked a cloth or a sponge in the liquid, then tied it like a surgical mask over their mouths and noses. In their minds, the strong smells would repel miasmas. This story is probably a legend, but I think there is a grain of truth to it: in modern studies, garlic has been shown to mask some of the human smells that attract biting bugs. Since we now know bubonic plague was carried by fleas, it’s possible the thieves were repelling the insects. The plague is also a bacterial infection, and both vinegar and garlic are effective antimicrobials.

Garlic remained in the realm of medicine for most of the 19th century. Louis Pasteur first discovered that garlic was a powerful antimicrobial in 1858. In 1861, John Gunn assembled a medical book for use in the home, The New Domestic Physician, “with directions for using medicinal plants and the simplest and best new remedies.” Gunn recommends a poultice of roast garlic for ear infections:

“An excellent remedy for earache is as follows: Take three or four roasted garlics, and while hot mash, and add a tablespoonful of sweet oil and as much honey and laudanum; press out the juice, and drop of this into the ear, warm, occasionally.”

Garlick from Botanologia: The English Herbal (1710) by William Salmon.

He also recommends garlic for clearing mucus from the lungs and reducing cough, given by the spoonful with honey and laudanum. Gardening for the South: Or, How to Grow Vegetables and Fruits, an 1868 botanical guide, says the medicinal values of garlic include making you sweat, which, like bloodletting, was believed to leach out disease; it also makes you urinate, and is an effective “worm destroyer” for any intestinal hitchhikers you might have. By the late 19th century, scientists also used garlic to treat TB and injected it into the rectum to treat hemorrhoids.

Today, garlic is one of the most heavily used home remedies, and it is increasingly being studied in the medical field. Some of its historic uses have been proved bunk, while others, like its efficacy as a topical antiseptic, hold up. But since the late 19th century, garlic has found an even more worthwhile home, thanks to French chefs and Italian immigrants, who spread their garlic-heavy cuisines around the world and made even garlic-reticent Americans lovers of this pungent plant.

Join us on Monday, June 5 to learn more about this topic.  Click HERE to register.

The Enduring Impact of the X-Ray

Today we have part two of a guest post written by Dr. Daniel S. Goldberg, 2016 recipient of the Audrey and William H. Helfand Fellowship in the History of Medicine and Public Health. Part one can be read here.

X-ray exhibitions were hugely popular all over the country, and the greater New York area was no exception. At a February 1896 demonstration run by Professor Arthur Wright, director of the Sloan Laboratory at Yale University, a newspaper reported that the auditorium was literally jam-packed, with students still crawling through windows 30 minutes into the lecture, even though none of the audience, save those in the first few rows, could hear Wright’s discussion. The deans of multiple Yale schools (Divinity, Law, and Science), the head of the Yale Corporation, and the chief medical examiner were all in attendance.

Perhaps the first published X-ray in the United States of a clinical condition. In “Rare Anomalies of the Phalanges Shown by the Röntgen Process,” Boston Medical and Surgical Journal 134(8), February 20, 1896: 198–99.

The pressing question is: why? Why did X-rays exert such tremendous power across a wide spectrum of social domains? (X-rays were a constant topic of conversation in sermons and religious journals, in women’s journals, and in influential satirical periodicals like Punch, and were the subject of a seemingly endless number of political and non-political cartoons, to name but a few venues.) Although historians of the X-ray have offered a number of plausible answers, I believe there is a key element left unexplored in the historiography: the intellectual frameworks relating to changing ideas of truth, doubt, and objectivity in U.S. society at the time.

Two of these frameworks are most useful in unpacking the stunning impact of the X-ray: the rise of mechanical objectivity, and what can be called “somaticism” within medicine and science. Historians of science Lorraine Daston and Peter Galison explain that a new model of ‘objectivity’ began to take hold during the middle decades of the 19th century. Under this new model, the truth-value of scientific knowledge is a function of the investigator’s ability to remove or eliminate human, subjective influence from the knowledge-making process. The fact that this is more or less impossible, and that X-rays can be manipulated in all sorts of ways, was well known to contemporaries and remained a source of anxiety for some time. The important point is the ultimate goal: to let the mechanical processes of nature speak for themselves and reveal their truths. Ideas of objectivity, as Daston and Galison point out, have for over four hundred years been connected to scientific images, which makes media like photography and X-rays especially significant.

By the end of the 19th century, ideas of mechanical objectivity began to fundamentally reshape ideas of what is known and what is certain. This is especially crucial in a century that featured so much intense change, in governments, family and labor structures, and migration patterns, and, of course, in industrialization and urbanization. Late Victorians were beset with anxieties connected to their changing world, and they were especially concerned with artifice and deception — with the possibility that the world was not what it seemed. As such, intellectual frameworks that shaped the criteria for truth were hugely influential, and traveled well beyond narrow networks of scientists and medical men.

Somaticism integrates in important ways with constructs of mechanical objectivity. Historians of medicine have documented the influence of somaticism (or “empiricism,” as it is also sometimes termed) within medicine over the long 19th century. The core of the framework is that truths about disease and the body are to be found in pathological anatomical objects. The existence of these objects can then be clinically correlated with the illness complaints the patient has, or, more likely, had, given that pathological objects could be located precisely only during a postmortem — until the X-ray. The truths of the sick body are to be found in the natural objects of disease, which makes seeing those objects essential. Laennec himself explained that the point of the stethoscope was not to listen; listening was merely a means to an end. The point, as Jacalyn Duffin explains, was “to see with a better eye.”

Collectively, these frameworks go a significant length in explaining the enormous and enduring social impact of the X-ray.

Article from the New York Record. May 1896.

For example, Morton’s clippings contain a May 1896 article from the New York Record entitled “X Rays for a Consideration: Light in a Human Kidney.”  The article details what may be the first private X-ray laboratory opened in New York City, founded by Mrs. M.F. Martin, and located at 110 East 26th Street.  The lab was intended solely for the use of physicians and hospitals.  One of its first patients was a doctor named George McLane, who traveled from North Dakota to have his kidney X-rayed for evidence of a possible stone.  A surgeon removed McLane’s kidney, and Morton placed it on a plate and subsequently irradiated it with X-rays.  The procedure “revealed the absence of any stone in the organ, demonstrating the entire reliability of doctors to prove the absence of stone in the kidney.”

The X-ray shines its light into the hitherto dark spaces inside the human body, revealing the truth of a disputed question: whether McLane suffered from a kidney stone or not.  The truth resides in the natural object itself, and the mechanism of the X-ray supposedly insulates the production of medical knowledge from the whims and artifices of the investigator (as compared to illustrations and drawings, for example).

Or, as Dr. McLane himself stated at the Post Graduate Hospital (the primary hospital at which Morton cared for inpatients):

“Dr. McLane spoke modestly at the Post Graduate Hospital about the risk he had taken in the name of science . . . ‘Hitherto a great many mistakes have been made owing to the inability of doctors to prove the absence of stone in the kidney . . .’  Now, by a very simple process, the truth can easily be determined.”

It is difficult to imagine how powerful it must have been, in 1896, to witness an X-ray operator remotely anatomize the living body. Seeing inside the body had been a dream of physicians for centuries prior, and there is every reason to believe that its achievement has not eroded much of its social power. Americans still perform significantly more medical imaging procedures than virtually any of our comparator societies, and what is most interesting is the evidence that this utilization is driven both by supply and demand. That is, it is not merely that we have expensive X-ray and medical imaging machines and so we use them. Across a wide variety of illness paradigms, illness sufferers and patients request medical imaging; they want it to be performed on their bodies. The history of the X-ray helps us understand the enduring power of these tools, of what it means to delve into the penetralium.

The Early Days of the X-Ray

Today we have part one of a guest post written by Dr. Daniel S. Goldberg, 2016 recipient of the Audrey and William H. Helfand Fellowship in the History of Medicine and Public Health. Dr. Goldberg is trained as an attorney, a historian, and a bioethicist.  He is currently on the faculty at the Center for Bioethics and Humanities at the University of Colorado Anschutz Medical Campus.

After news of Wilhelm Röntgen’s discovery of X-rays was cabled across the Atlantic late in 1895, evidence suggests X-ray experimentation was taken up all over the U.S. almost immediately. While scientists and physicians scrambled to build their own X-ray machines, newspapers in major cities throughout the country eagerly reported on their progress, with stories small and large appearing in nearly every significant daily from New York and Philadelphia to Chicago and St. Louis to San Francisco and Los Angeles. Historians of the X-ray estimate that within only a year of Röntgen’s discovery, literally thousands of articles had been published on the X-ray in both lay and expert periodicals. Even in the fertile print culture of 1896, this was a remarkable volume of coverage.

Therein lies the methodological difficulty for the historian of the X-ray.  So often, the craft of history is a tedious search for small scraps of information that may not even exist.  Yet, as to X-rays, the problem is one of feast, not famine.  With so much print material appearing in so many different sources in so many different places all at the same time, sifting through the morass to articulate coherent and important narratives is difficult.

What makes this task far easier is a remarkable collection held at the New York Academy of Medicine Library.  The William J. Morton Collection is a small holding, consisting of only two boxes.  The second box is the true treasure, containing a single folder, approximately six inches thick.  Inside is an unbound series of pages consisting solely of newspaper clippings related primarily to early X-ray use in the U.S.  These are Morton’s clippings, and as far as is known, the order and arrangement of the pages is original to Morton himself.  The collection is astounding, for it represents something of an index or a cipher for the ferment of X-ray use in NYC in the first half of 1896.

Newspaper clippings from the William J. Morton Collection, New York Academy of Medicine Library.

There is no question that New York City played an important role in early X-ray use, if for no other reason than the enormous shadow cast by the inventor, Thomas Edison.  There were, however, many other important figures involved in early X-ray use in NYC, including Nikola Tesla[1], Michael Pupin[2], and Morton.  Morton, the son of William T.G. Morton of anesthesia fame, was a prominent physician, a fellow of the New York Academy of Medicine, and a respected neurologist and electro-therapeutic practitioner.

A telegram dated January 2, 1896 from Dyer & Driscoll, attorneys for none other than Thomas Edison, indicated that Morton visited Edison’s workshop for the purpose of conducting experiments (almost certainly with X-rays) several days earlier.

Because Morton was unquestionably at the forefront of early X-ray experimentation in NYC, his curation is a reasonable index of important events and moments in the early use of X-rays in the city. There are limitations to this approach, of course. Morton was obviously interested in his own role in early X-ray experimentation, so there is something of a selection bias at work (although it should be noted that there is no shortage of clippings pertaining to Pupin’s important work).

The collection is full of interesting and significant stories in the early history of X-ray use.  For example, in March 1896, strongman Eugene Sandow, considered the father of modern bodybuilding, turned to Morton in an effort to locate the source of a frustrating pain he was experiencing in his foot.  Apparently Sandow had stepped on some broken glass, but even his personal physician could not specify the location of the glass in his foot.  The potential for the X-ray must have seemed obvious, and Sandow reached out specifically to Morton to see if he could be of help.  Morton was eager to oblige.  He turned the X-rays on Sandow’s foot and located the shard of glass disturbing Sandow’s equanimity.  A surgeon subsequently operated and removed the glass, and the story made national news.

The X-Ray of Eugene Sandow’s foot in process.

Interestingly, Sandow was apparently impressed enough with the powerful rays to send an unsolicited telegram to Edison, offering his services as a human subject for any X-ray experiments Edison wished to undertake.

Letter to Thomas Edison from Eugene Sandow.

It is difficult to imagine how powerful it must have been, in 1896, to witness an X-ray operator remotely anatomize the living body. Seeing inside the body had been a dream of physicians for centuries prior, and there is every reason to believe that its achievement has not eroded much of its social power. Americans still perform significantly more medical imaging procedures than virtually any of our comparator societies, and what is most interesting is the evidence that this utilization is driven both by supply and demand. That is, it is not merely that we have expensive X-ray and medical imaging machines and so we use them. Across a wide variety of illness paradigms, illness sufferers and patients request medical imaging; they want it to be performed on their bodies. The history of the X-ray helps us understand the enduring power of these tools.

Footnotes:
[1] Tesla was heavily involved in early X-ray experiments in his laboratory at 46 East Houston Street, much to Edison’s likely chagrin, given the frostiness of their relationship by that time. The New York newspapers constantly asked Edison about Tesla’s progress.
[2] Pupin, a Columbia University physicist, would in short order (in 1896, in fact) go on to discover a way of substantially reducing the exposure time needed to produce an X-ray image, from hours to minutes. The basics of Pupin’s method are still used today.

Scent Track

Today’s guest post is written by Ann-Sophie Barwich, Ph.D., scholar in the Presidential Scholars in Society and Neuroscience program at the Center for Science and Society, Columbia University. Her work is on current and past developments in olfactory research (1600 to today). On Wednesday, April 26, Barwich will give her talk, “Scent Track: What can the History of Olfaction tell us about Theorizing in the Life Sciences?” To read more about this lecture and to register, go HERE.

Scientific interest in the senses has always been preoccupied with vision and its underlying mechanisms. In comparison, smell is one of our least understood senses. This may sound surprising given the importance of smell in flavor perception, since cuisine represents one of the most central elements of human culture. While the cultural history of scent has attracted considerable attention in the humanities and social sciences, its scientific history has yet to be told.

Many of the central research questions about the characteristics of olfaction remain unresolved even today. How do we classify smells? How many smells are there, and is there such a thing as olfactory primaries? Modern research on smell was revolutionized by the discovery of the olfactory receptors by Linda Buck and Richard Axel in 1991. Their discovery provided the key causal entity for modeling the molecular basis of smell and earned them the 2004 Nobel Prize in Physiology or Medicine. Since then, olfaction has begun to emerge as a modern model system in neuroscience.

Nonetheless, records of scientific theorizing about the material basis of odor reach much further back. These hidden experimental records of research on smell offer us an intriguing, yet untold, history of creativity in scientific reasoning. For large parts of the history of science, scientific approaches to smell were faced with its apparent lack of testability. An inherent difficulty for odor description and classification is that the sense of smell is incredibly hard to study in a controlled setting. How do you visualize and materialize odor to turn it into an object of objective measurement and comparison? Over the past centuries, several answers to these questions were developed from various disciplinary perspectives. These ideas present a hidden heuristic source for widening our theoretical understanding of smell even today.

Linnaeus’ classification of odors in medicinal plants in his Clavis Medicinae (1766).

My talk reconstructs a conceptual history of materiality that has informed scientific approaches to smell, and I analyze this material history of olfaction in three stages. First, smells are investigated as “objects in nature,” drawing on 18th-century expertise in botany and horticulture that arranged odors according to their diverse plant origins. Botanical classifications, such as Linnaeus’ Odores Medicamentorum (1752) and Clavis Medicinae (1766), conceptualized odors as objects in nature. Here, the affective nature of smell was investigated with regard to the medicinal powers of plants. Meanwhile, perfumers have always experimented with odorous plant substances, but their knowledge was a well-kept secret. Some records, such as George William Septimus Piesse’s The Art of Perfumery (1857), illustrate that these practices addressed the various possibilities for the material manipulation of odorous substances (e.g., through mechanical force, solvent extraction, and distillation). They further conceptualized the psychological effects of odor by analogy with other sensory qualities such as taste, color, and sound. Can we blend odors like colors? Can we understand the harmony between odor notes in parallel with musical chords?

Figure2

Analogy of odors with sounds to define harmonic chords in perfumery. Source: Piesse 1857, The Art of Perfumery.

Second, smells are framed as “objects of production” in light of the industrialization of perfumery after the rise of synthetic chemistry at the end of the 19th century. In earlier chemistry, smells were modeled as immaterial spirits representing vital forces, as in Herman Boerhaave’s Spiritus Rector theory. This theory was soon displaced by a more mechanistic causal understanding of odorous particles, especially after Antoine-François de Fourcroy’s extraction of urea as the ‘smelling principle’ of horse urine. This discovery of the chemical basis of odors, and its subsequent exploration with the rise of synthetic chemistry, represented a fundamental conceptual liberation of smells from their plant origins. New scents, some even unknown in nature, were now produced in the laboratory.

Figure3

Vanillin was first synthesized by Ferdinand Tiemann and Wilhelm Haarmann in 1874. Its synthesis, illustrated above, was later refined by Karl Reimer. Source: Wikipedia (Yikrazuul).

Third, the introduction of molecular visualization and computational techniques in the 20th century abstracted smells further from their natural origins, and this advancement laid the foundation for smells to turn into what Hans-Jörg Rheinberger calls “epistemic objects.” This transformation signifies the integration of smell into the growing scientific domain of biochemical science. Confronted with the sheer diversity of chemical structures responsible for odor qualities, the classification of smells now required the integration of two seemingly separate data sets: a stimulus classification of chemical similarity on the one hand and an ordering of perceptual classes on the other. In this context, the food scientist John Amoore proposed a classification of five to seven primary odors in the 1960s and 1970s.

While this classificatory strategy soon proved too simplistic, it provides one of the earliest expressions of a central question in modern olfactory research: How does the chemical basis of odors relate to their perceptual quality? Can we predict smells from the molecular structure of their stimuli? Notably, this question remains open, and of central scientific interest, today.

Join us on Wednesday, April 26 to learn more about this topic. To RSVP to this free lecture, click HERE.

 

Robert L. Dickinson: Doctor and Artist

Today’s guest post is written by Rose Holz, Ph.D., historian of medicine and sexuality at the University of Nebraska – Lincoln, where she serves as the Associate Director of the Women’s & Gender Studies Program and Director of Humanities in Medicine.  She is the author of The Birth Control Clinic in a Marketplace World (Rochester, 2012). Her current project investigates the intersection of medicine and art by way of Dr. Robert L. Dickinson (1861-1950) — gynecologist, sexologist, and artist extraordinaire — and his prolific ten-year collaboration with fellow artist Abram Belskie (1907-1988). Not only did it yield the hugely influential Birth Series sculptures in 1939, but also hundreds of medical teaching models of women’s and men’s sexual anatomies. On Thursday, April 13, Rose will give her talk, “Art in the Service of Medical Education: The Robert L. Dickinson-Belskie Birth Series and the Use of Sculpture to Teach the Process of Human Development from Fertilization Through Delivery.” To read more about this lecture and to register, go HERE.

My interest in Dr. Robert L. Dickinson began many years ago when I was in graduate school, working on my Ph.D. in history and writing my dissertation on the history of birth control clinics in America. And, as has been the case with so many other scholars who have written about matters related to women, medicine, and sexuality in the twentieth century U.S., Dickinson made his brief cameo entrance into my story, though not without leaving behind a lasting impression.

For me it was the images — because, like me, Dickinson was compelled to color and draw. Early on, while poring over Planned Parenthood records, I remember chuckling over a letter he had written to a contraceptive manufacturer complaining about the poor quality of one of their products, to which he attached a drawing to illustrate his case.

Then there were the birth control manuals Dickinson wrote in the 1930s. Not only did he illustrate all the contraceptive methods then available, but he also offered bird’s-eye-view, architectural-style drawings to visualize how best to lay out gynecological clinics. More intriguing still was what he included at the center of this architectural drawing: a tiny woman lying on the gynecological table with her legs spread wide open as the doctor conducted the physical exam.

IMG_4813

Pages from “Control of Contraception (2nd edition)” by Robert L. Dickinson.

As somebody who also loves small things—especially miniature worlds populated by miniature people—I could not help but find myself smitten by this unusual man. However, at the time I had a different story to tell, a Ph.D. to defend, and a new job as a professor to pursue. And as the years passed, Dickinson slowly receded into the background.

IMG_1603

Drawings of the location of the embryo and size of the fetus. Source.

But Dickinson is not one to be denied, and that he has remained in obscurity for so long somehow explains to me why he has resurfaced—with a glorious vengeance—into my imagination. Indeed, he has made it clear to me that his story will be told; his skills as a doctor and artist properly recognized. And he has made it further clear that this story will begin with what he created in the twilight of his life: The 1939 Birth Series sculptures.

IMG_2449

Dickinson and Belskie’s “Sculptured Teaching Models Collection.” From the unprocessed Abram Belskie Papers, Belskie Museum, Closter, NJ.

Join us on Thursday, April 13 to learn more about Dr. Robert L. Dickinson and his Birth Series sculptures. To RSVP to this free lecture, click HERE.

Lady Mary Wortley Montagu and Immunization Advocacy

Today’s guest post is written by Lisa Rosner, Ph.D., Distinguished Professor of History at Stockton University. Recent publications include The Anatomy Murders (University of Pennsylvania Press, 2009) and Vaccination and Its Critics (ABC-Clio, 2017). She is the project director and game developer for The Pox Hunter, funded by an NEH Digital Projects for the Public grant.  On Thursday, April 6, Lisa will give her talk, “Lady Mary’s Legacy: Vaccine Advocacy from The Turkish Embassy Letters to Video Games.” To read more about this lecture and to register, go HERE.

In a letter dated April 1, 1717 — 300 years ago — Lady Mary Wortley Montagu (1689–1762), the wife of the British ambassador to Turkey, provided the first report from an elite European patient’s perspective of the Middle Eastern practice of inoculation, or ingrafting, to prevent smallpox. She wrote to her dear friend, Sarah Chiswell:

“I am going to tell you a thing that will make you wish yourself here. The small-pox, so fatal, and so general amongst us, is here entirely harmless, by the invention of engrafting, which is the term they give it. There is a set of old women, who make it their business to perform the operation, every autumn, in the month of September, when the great heat is abated. People send to one another to know if any of their family has a mind to have the small-pox; they make parties for this purpose, and when they are met (commonly fifteen or sixteen together) the old woman comes with a nut-shell full of the matter of the best sort of small-pox, and asks what vein you please to have opened. She immediately rips open that you offer to her, with a large needle (which gives you no more pain than a common scratch) and puts into the vein as much matter as can lie upon the head of her needle, and after that, binds up the little wound with a hollow bit of shell, and in this manner opens four or five veins…

The children or young patients play together all the rest of the day, and are in perfect health to the eighth. Then the fever begins to seize them, and they keep their beds two days, very seldom three. They have very rarely above twenty or thirty in their faces, which never mark, and in eight days time they are as well as before their illness. Where they are wounded, there remains running sores during the distemper, which I don’t doubt is a great relief to it. Every year, thousands undergo this operation, and the French Ambassador says pleasantly, that they take the small-pox here by way of diversion, as they take the waters in other countries. There is no example of any one that has died in it, and you may believe I am well satisfied of the safety of this experiment, since I intend to try it on my dear little son.”

Lady_Mary_Wortley_Montagu_with_her_son_Edward_by_Jean_Baptiste_Vanmour

Mary Wortley Montagu with her son Edward, by Jean-Baptiste van Mour. Source: Wikimedia Commons.

This is probably the most famous passage in all of Lady Mary’s voluminous correspondence. It deserves even more attention than it usually gets, because it is the first example, in the western history of medicine, of a mother’s perspective on the practice of immunization. We tend to hear a great deal from scientists like Jenner about their discoveries, but much less from the mothers who adopted their techniques for their children.

But Lady Mary was not just a mother, she was also an acute observer with an inventive and inquisitive mind, and a particular interest in what we would now call public health practices. She had lost a beloved brother to smallpox; she had also contracted the disease, and though she survived, she carried the scars for the rest of her life. As she traveled from London to Constantinople, she was particularly interested in innovations and cultural attitudes toward hygiene and domestic health, especially as they affected women’s lives.

Her enthusiasm for light, clean, airy environments comes through in her very first letter, written from the Netherlands. She wrote:

“All the streets are paved with broad stones and before many of the meanest artificers doors are placed seats of various coloured marbles, so neatly kept, that, I assure you, I walked almost all over the town yesterday, incognito, in my slippers without receiving one spot of dirt; and you may see the Dutch maids washing the pavement of the street, with more application than ours do our bed-chambers.”

For that reason, she noted:

“Nothing can be more agreeable than travelling in Holland. The whole country appears a large garden; the roads are well paved, shaded on each side with rows of trees.”

She was much less pleased with Vienna, for though there were certainly many magnificent sights, the city itself was dark and crowded. She complained:

“As the town is too little for the number of the people that desire to live in it, the builders seem to have projected to repair that misfortune, by clapping one town on the top of another, most of the houses being of five, and some of them six stories … The streets being so narrow, the rooms are extremely dark; and, what is an inconveniency much more intolerable … there is no house has so few as five or six families in it.”

As her travels continued throughout the fall and winter, another custom, neglected in England, caught her attention: the stove, valuable for warmth and for lengthening the growing season. At one of the formal dinners she attended, she was offered oranges and bananas and wondered how they could possibly be grown in Austria. She wrote:

“Upon inquiry I learnt that they have brought their stoves to such perfection, they lengthen their summer as long as they please, giving to every plant the degree of heat it would receive from the sun in its native soil. The effect is very near the same; I am surprised we do not practise [sic] in England so useful an invention. This reflection leads me to consider our obstinacy in shaking with cold, five months in the year rather than make use of stoves, which are certainly one of the greatest conveniencies [sic] of life.”

Mary_Wortley_Montague

Mary Wortley Montagu in Turkish dress. Source: Wikimedia Commons.

When she arrived in Constantinople and spent time with ladies of the court, both Turkish and European, Lady Mary continued to pursue her interest in gardens, in baths, in the light airy spaces found in both European and Turkish households. She was not the first European to report on the practice of “ingrafting”: her family physician in Constantinople, Dr. Emmanuel Timoni, had previously sent a report to the Royal Society of London. But seeing a disease, so dangerous in Europe, treated as an excuse for a children’s party turned her into an advocate. As she wrote:

“I am patriot enough to take the pains to bring this useful invention into fashion in England, and I should not fail to write to some of our doctors very particularly about it, if I knew any one of them that I thought had virtue enough to destroy such a considerable branch of their revenue, for the good of mankind. But that distemper is too beneficial to them, not to expose to all their resentment, the hardy wight that should undertake to put an end to it. Perhaps if I live to return, I may, however, have courage to war with them. Upon this occasion, admire the heroism in the heart of your friend.”

After she returned to London, she kept her promise “to war” with the physicians in support of inoculation. When smallpox broke out in her social circle in 1722, she decided to inoculate her daughter, and the operation was performed with great success. Physicians who visited her found “Miss Wortley playing about the Room, cheerful and well,” with a few slight marks of smallpox. Those soon healed, and the child recovered completely. The visiting physicians were impressed, and they began to incorporate inoculation into their own practices.

As the epidemic raged, Lady Mary convinced her most prominent friend, Caroline, Princess of Wales, to inoculate the two royal princesses, Amelia and Caroline. Having received the royal seal of approval, smallpox inoculation became fashionable practice among British elites throughout the 18th century.

Memorial_to_Lady_Mary_Wortley_Montague_in_Lichfield_Cathedral

Memorial to the Rt. Hon. Lady Mary Wortley Montague erected in Lichfield Cathedral by Henrietta Inge. Source: Wikimedia Commons.

In 1789, Mrs. Henrietta Inge, Lady Mary’s niece, erected a memorial to her accomplishments in Lichfield Cathedral. The text reads:

“[She] happily introduc’d from Turkey, into this country the Salutary Art Of inoculating the Small-Pox. Convinc’d of its Efficacy She first tried it with Success on her own Children, And then recommended the practice of it To her fellow-Citizens. Thus by her Example and Advice, We have soften’d the Virulence, And escap’d the danger of this malignant Disease.”

We can recognize in Lady Mary – and in Mrs. Inge — advocates of a kind met with very frequently in the history of vaccination: mothers whose personal experience led them to champion the discoveries that preserved their family’s health and well-being.

Bibliography:

  1. Grundy, Isobel. Lady Mary Wortley Montagu. Oxford: Oxford University Press, 1999.
  2. Montagu, Lady Mary Wortley. Letters of Lady Mary Wortley Montagu. Written during her travels in Europe, Asia, and Africa. Paris: Firman Didot, 1822. Available in many editions online.
  3. Rosner, Lisa. Vaccination and Its Critics. A Documentary and Reference Guide. Santa Barbara, CA: Greenwood, 2017.


Infectious Madness, the Well Curve and the Microbial Roots of Mental Disturbance

Today’s guest post is written by Harriet Washington, a science writer, editor and ethicist. She is the author of several books, including Medical Apartheid: The Dark History of Experimentation from Colonial Times to the Present. On Wednesday, March 15 at 6pm, Washington will discuss: “Infectious Madness, the Well Curve and the Microbial Roots of Mental Disturbance.” In this talk, based on her book Infectious Madness: The Surprising Science of How We “Catch” Mental Illness, Washington traces the history, culture, and some disturbing contemporary manifestations of this “infection connection.” To read more about this lecture and to register, go HERE.

“Mind, independent of experience, is inconceivable.” —Franz Boas

Psychological trauma, stress, genetic anomalies and other experiences that limit the healthy functioning of the mind and brain are widely recognized as key factors in the development of schizophrenia, major depression, and bipolar disorder.  However, despite a plethora of examples and evidence of microbial disorders from rabies to paresis, infection has been slow to join the pantheon.  This aversion persists largely because the perceived causes of mental disorders have evolved not only with our scientific knowledge of medicine but also with our tenacious cultural beliefs and biases.  Instead, we have long clung to what Robert Sapolsky calls a “primordial muck” of attribution that includes broken taboos, sin—one’s own or one’s forebears’—and even bad mothering.

Brueghel_dancingMania

Representation of the dancing mania by Flemish painter Pieter Brueghel the Younger. Source.

Flemish painter Pieter Brueghel the Younger (1564–1636) painted the above representation of the dancing mania known as choreomania or St. Anthony’s Fire, which has seized a pilgrimage of epileptics en route to the church at Molenbeek. Such compulsive dancing was originally ascribed to satanic influence such as bewitchment, and later to a collective hysterical disorder, but is now ascribed to ergotism—the infection of rye and other grains by the fungus Claviceps purpurea.  When people ate the tainted bread, their symptoms included compulsive dancing. Some have ascribed the mass hysteria of the Salem witch trials to ergotism.  Streptococcal infections have also produced cases of chorea, known as Sydenham’s chorea.

Not all traditional “causes” of mental illness are confined to the past.  As late as the 1980s, the alternating rage, coldness, and oppressive affection of domineering “schizophrenogenic mothers” was taught in psychology classes as the root of schizophrenia, just as Tourette’s syndrome was initially laid to poor parenting.

For Infectious Madness: The Surprising Science of How We “Catch” Mental Illness, I interviewed scientists working on the effects of infections on mental health, such as Susan Swedo, chief of the pediatrics and developmental neuroscience branch at the National Institute of Mental Health, who studies the role of Group A strep (GAS) infections in children in rapid-onset cases of obsessive compulsive disorder, anorexia, and Tourette syndrome. Other visionary researchers, such as E. Fuller Torrey, executive director of Maryland’s Stanley Medical Research Institute, and Robert Yolken, director of developmental neurovirology at Johns Hopkins University, have for decades investigated the role of microbes in mental illness and have traced the path of pathogens such as influenza, herpes simplex, and Toxoplasma gondii, among others, in schizophrenia and bipolar disorder.

There are myriad ways in which infections cause or encourage mental disease. To suit its own need to reproduce within the stomach of a cat, the unicellular parasite Toxoplasma gondii changes the behavior of rodents, making them easier prey for the cats it uses to gain entry. This seems strange, but changing the behavior of a host to suit its own needs is a common stratagem of parasites. The Cordyceps fungus, for example, manipulates an ant in the Amazon into climbing a tree where the fungal spores can be more widely disseminated. The spore-bearing branches extend from the corpse of the ant pictured below.

Ant1

The Cordyceps fungus manipulates an ant in the Amazon into climbing a tree where the fungal spores can be more widely disseminated. The spore-bearing branches extend from the corpse of the ant. Photograph © Gregory Dimijian, MD.

Infection, redux

“Everything has been thought of before, but the problem is to think of it again.” —Goethe

There is a long, all but forgotten history of infectious theories of mental illness. In his 1812 psychiatry text Medical Inquiries and Observations upon the Diseases of the Mind, for example, Benjamin Rush, MD, included a first detailed taxonomy of mental disorders, each with its own physical cause. He cited disruptions of blood circulation and sensory overload as the basis of mental illness, and he treated his patients with devices meant to improve circulation to the brain, including such Rube Goldberg designs as a centrifugal spinning board, or meant to decrease sensory perception, such as a restraining chair with a head enclosure.

Restraining Chair

Pictured here is the “tranquilizing chair” in which patients were confined. The chair was supposed to control the flow of blood toward the brain and, by lessening muscular action or reducing motor activity, reduce the force and frequency of the pulse. Photograph © 2008 Hoag Levins.

Paresis, an infectious mental disorder

In 1857, Drs. Johannes Friedrich Esmark and W. Jessen suggested a biological cause for paresis: syphilis. Many researchers started to view paresis as the tertiary stage of syphilis, which often attacked the brain indiscriminately, and they began referring to it as neurosyphilis. This theory held out hope that if syphilis was ever cured, paresis could be too.

Nineteenth-century asylum keepers, however, persisted in viewing paresis as wholly mental in character. The long-standing insistence on divorcing physical illnesses from mental ones had to do with religious philosophy and culture but also with the politics of the asylum, which remained a battleground between physicians and religious and philosophical healers.

Matters were complicated by the fact that most physicians, despite the evidence that paresis was the mental manifestation of a physical disease, continued to treat paretics with the same ineffectual therapeutics given other mentally ill patients. Traditional treatments such as “douches, cold packs, mercury, blistering of the scalp, venesection, leeching, sexual abstinence, and holes drilled into the skull [trephination]” continued—without positive results. Even when toxic mercury-based treatments for syphilis were replaced by Paul Ehrlich’s safer, more effective arsenic-based Salvarsan (also called arsphenamine and compound 606), it was not used against paresis.

But in June 1917, Professor Julius Wagner-Jauregg of the University of Vienna Hospital for Nervous and Mental Diseases undertook a radical approach. He had noticed that some paretic patients improved markedly after contracting an infectious illness that gave them fevers. He decided to fight fire with fire by turning one disease against another: he sought to suppress the symptoms of paresis by infecting its sufferers with malaria.

Before Wagner-Jauregg won the Nobel Prize and Freud forged the future of psychiatry, a paradigm shift had already taken place that transformed science’s approach to the nature of disease. It is the very framework that supports the role of infection in mental illness: germ theory. Developed by Louis Pasteur and Robert Koch, germ theory posits that specific microbes such as bacteria, viruses, and prions (infectious proteins) cause illness.

For more on this fascinating topic, join Harriet Washington on Wednesday, March 15 at 6pm.  More information can be found here.

The Marrow of Tragedy: Disease and Diversity in Civil War Medicine

Today’s guest post is written by Dr. Margaret Humphreys, Josiah Charles Trent Professor in the History of Medicine at Duke University. She is the author of Yellow Fever and the South (Rutgers, 1992) and Malaria: Poverty, Race and Public Health in the United States (Johns Hopkins, 2001), Intensely Human: The Health of the Black Soldier in American Civil War (2008) and Marrow of Tragedy: The Health Crisis of the American Civil War (2013). On Tuesday, February 21 at 6pm, Humphreys will give The John K. Lattimer Lecture: “The Marrow of Tragedy: Disease and Diversity in Civil War Medicine.” To read more about this lecture and to register, go HERE.

In a memorable scene from the movie Gone with the Wind, the Southern belle Scarlett O’Hara picks her way through the battle-wounded men lying on the ground near the train station in Atlanta, frantically seeking Dr. Meade to help her with her sister-in-law Melanie’s imminent delivery.  Meade brushes her off and turns to a screaming soldier, telling him that his leg will have to come off, and without anesthesia.  The man’s screams echo as Scarlett heads back to Melanie’s bedside.  This cinematic portrayal of Civil War medicine reflects a widespread belief that there was no anesthesia at that time.  Indeed, it was said that the war occurred “at the end of the medical middle ages.”  (This quotation is widely attributed to Union Surgeon General William Hammond, but without citation.)

atlanta

Scene from Gone with the Wind (1939).

In my book, Marrow of Tragedy: The Health Crisis of the American Civil War, I begin from a different perspective, recognizing that there was such a thing as “good medicine” and “bad medicine” during the War.  Medical care could be effective, and it could make a difference in disease and injury outcomes.  For example, chloroform and ether anesthesia meant most surgery occurred with the patient unconscious (although Confederate surgeons did run out of these supplies in desperate circumstances, such as the siege of Atlanta near the end of the war).

As alarming as the notion of amputation entirely without anesthesia may be, the mortality rates from disease at this point in the war are just as revealing. Put simply, for every white Union soldier who died of disease during the War, a little over two black Union soldiers died, and almost three Confederates succumbed.[i]

hospital-scene

Image source: Getty Images.

How can we account for these differences?  A major factor was the quality and quantity of food, a core ingredient of the modern concept of “social determinants of health.”  White Union troops also received better hospital care, drawing in part on the strong social networks of the folks back home and their political influence.  The Union hospital system was much better funded, with full access to important medicines, such as quinine, opiates, and anesthetics, and to the technology of cleanliness, which included clothing, soap, and disinfectants.  Nursing care was key as well: northern hospitals were staffed by volunteer nurses, while those in the South often relied on civilians or slaves who lacked formal training as well as resources.

To learn more about Civil War medicine, join us on Tuesday, February 21 at 6pm. Register HERE.

sanitary-commission

Image source: Harper’s Weekly, April 9, 1864.

 

Note:

[i] Actual numbers, per 1000, were 63, 143, and 167, respectively.