
The Paleomicrobiology of Malaria Detection

Malaria is arguably one of the most influential infectious diseases in human history. It's been with us as long as we have been human, but as Teddi Setzer shows us in her recent review of detection methods, our ability to find it in the past leaves a lot to be desired.

The standard method of looking for malaria involves searching for signs of anemia on the skeleton, on the hypothesis that the anemia caused by malaria leaves these marks. This is not as clear-cut as it might seem. There have been very few skeletal studies of modern people who have been diagnosed with malaria. There is no medical need; there are much more reliable methods of diagnosing malaria in a living person (or recent cadaver). So it is unclear how often these lesions form in malaria patients. Other causes of anemia, and even scurvy, can produce the same or very similar lesions. The number of malarial infections and/or relapses also affects bone changes. Plasmodium falciparum produces a short, virulent disease that may kill before bone changes develop. At the other extreme, a single P. malariae infection can relapse for life, although the anemia is not as severe. Osteology must be correlated with other information to support the diagnosis.

Cribra Orbitalia from Jess Beck’s blog Bone Broke

Cribra orbitalia and porotic hyperostosis are the two main indicators sought. Both are caused by bone marrow expansion in an attempt to compensate for the loss of red blood cells. Cribra orbitalia is pitting and extra bone growth in the orbits of the eyes, as seen in the photo. Porotic hyperostosis causes pitting and thinning of the compact bone 'shell' that covers the cranial bones. A correlation of nutritionally informed osteology with later epidemiology and mosquito incidence in England, reviewed in a previous post, shows that a convincing case can be made for malaria in ancient remains.

Detection of human genetic traits selected for by malaria, such as the Duffy blood group, sickle cell trait, thalassemias, and glucose-6-phosphate dehydrogenase deficiency (G6PD), can, with supporting information, suggest that the population was once under selection by malaria. Balanced polymorphisms like sickle cell trait can remain in a population for centuries after the selection pressure is gone (through either ecological change or migration away from the malarious region). While there are some skeletal indicators of some hemoglobinopathies, human ancient DNA analysis would be a more secure method of diagnosis. Care has to be taken to distinguish skeletal changes caused by malaria's hemolytic anemia from those caused by the hemoglobinopathy anemias.

Ancient DNA detection of the malaria Plasmodium parasite has been disappointing. To date, only the tropical Plasmodium falciparum, which causes the most severe disease, has been detected by PCR. It is believed that attempts to detect the historically more common Plasmodium vivax have been stymied by the low parasite load in the blood. The difficulty in finding vivax aDNA is a reminder that pathogens really do need to be in high concentrations within the sample to overcome degradation and be detected by PCR or sequencing technology. As far as I know, there have not been attempts to detect the other three human malarial parasites (Plasmodium ovale, Plasmodium malariae, and Plasmodium knowlesi) by aDNA analysis.

Hemozoin crystals in the liver (Source: KMU Pathology Lab)

Modern medicine is devising an ever-expanding array of tests for malaria diagnostics and prognostics. However, most of these tests require fresh (soft) tissue or blood. Immunological methods have not yet been applied to malaria in archaeological material. The most promising detection method for malaria among the newer diagnostics is the detection of hemozoin, the iron-containing waste product of the Plasmodium parasite. When the parasite feeds on hemoglobin in the red blood cell, toxic iron waste products are processed into the biocrystal hemozoin and excreted into the tissues. In patients with recurring or multiple malaria infections, hemozoin will stain their bone marrow black and can be found in the liver, spleen, brain, and lungs. It can be detected microscopically (as seen above) or by mass spectrometry. Although some other blood parasites also excrete hemozoin, their product can be distinguished from the malarial one.

Despite the advances in diagnosing living malaria patients with new methods and technologies, archaeological detection has not enjoyed the same success. Building a case for malaria in the past must rely on an array of data, combining knowledge of ecology, vectors, and the nutritional status of the population with osteological markers of anemia. Hopefully, the detection of hemozoin will eventually be the key to opening up biological studies of malaria in the past. If hemozoin can identify malaria victims, then perhaps focusing ancient DNA work on hemozoin-positive remains will be more successful in breaking through the firewall to malaria's evolution and historical epidemiology.

 Source:

Setzer, T. J. (2014). Malaria detection in the field of paleopathology: A meta-analysis of the state of the art. Acta Tropica, 140, 97–104. doi:10.1016/j.actatropica.2014.08.010 (open access)

See also Jess Beck, “Porotic Hyperostosis and Cribra Orbitalia,” Bone Broke, March 2014.

An Unnatural History of Emerging Infections

Ron Barrett and George Armelagos. An Unnatural History of Emerging Infections. Oxford University Press, 2013 (e-book)

This is not a traditional review. In keeping with this blog’s function as my shared file cabinet, this post will be something like a précis/notes with a few of my comments in italics.

Medical anthropologists Ron Barrett and George Armelagos argue that common factors in disease ecology have governed all three main epidemiological transitions in human health. They argue that there is nothing fundamentally new about the driving factors of the current ecology of emerging and re-emerging infectious diseases. In all three transitions, human factors have created the ecology for acute infectious disease to thrive.

Concept: “syndemics: interactions between multiple diseases that exacerbate the negative effects of one or more diseases” (p. 10). Examples: co-infections with HIV, and combinations of infection and chronic respiratory disease (asthma, etc.).

Metaphor: “seed and soil,” where the microbe is the seed and the ecology is the soil. Historically used by physicians who accepted germ theory but practiced environmental medicine (sanitarians), especially in the gap between the advent of germ theory and the availability of antibiotics. I really like this metaphor; it still works today.

Prehistoric baseline

  • Important as our evolutionary context: the first 100,000 years of human history (about 90% of total human history). At its peak, only 8 million people globally; small, nomadic groups rarely in contact.
  • Shelter was temporary, and people carried little with them that could harbor vectors (or fomites?). Hunter-gatherers maintained near-zero population growth. Nutrition was more diverse but could not support large groups. Little hierarchy within the group, so few inequalities (at least not consistently detectable in the osteological record).
  • Nutrition is closely tied to immunological competence. Protein deficiency reduces immune competence to the level of AIDS patients. Nomads can move to find better nutrition, avoiding ‘famine foods’. Diets were higher in lean meats and fiber, but low in carbohydrates.
  • Groups were too small to support acute epidemics (they ran out of hosts too soon) but were at increased risk for parasites. Heirloom parasites like pinworms and lice; souvenir parasites picked up while foraging, like ticks and tapeworms. Mostly chronic infections that could remain with the nomads until they could be transmitted to new groups. New zoonoses contracted from hunting that could be passed human to human would ‘flash out’ in a small group. Groups were too small for diseases like measles, smallpox, or influenza.

First epidemiological transition – Agricultural revolution

  • The first transition comes when people settle down and form villages. Settlement and agriculture allow populations to grow large enough to support acute epidemic disease, and animal domestication brings humans into prolonged contact with animals, sparking some important zoonotic diseases.
  • They note that agriculture and settlement began in multiple parts of the world independently, but not at the same time. It took about 9000 years for 99.99% of the population to shift to farming and domestic animals as their primary nutrition source. Once the shift to agriculture comes, there is no going back. They debate which comes first, settlement or agriculture, but note that in the end, for health, it doesn’t matter. (The length of time here has important implications for the incomplete nature of the second transition.)
  • A decrease in overall health is seen in all societies that shifted to agriculture. Correlations between more/better grave goods and better health show that social inequity was bad for health as early as the Neolithic. Very high childhood mortality brings the overall lifespan down considerably. Settlement increased densities of humans and newly domestic animals, making conditions ripe for the first acute epidemics and zoonotic transfers. Most zoonotic transfers in this period come from domestic animals.
  • Nutrition suffers with settlement. Reliance on a monoculture makes populations vulnerable to bad years and to deficiencies of essential nutrients not found in the monoculture. There is a general reduction in stature, an increase in signs of anemia, and an increase in osteological signs of infection. Examples: Nubia and Dickson Mounds, IL, USA. Correlating age with skeletal pathologies shows that these health declines are not due to the ‘osteological paradox’ (more pathologies in stronger people because they survive what would have killed others).

Second epidemiological transition – Industrial revolution

  • Transition marked by decreasing deaths due to infectious disease and an increase in chronic diseases. Increasing life expectancy due in large part to decreasing childhood mortality. Total human population soars.
  • Germ theory vs. sanitation reform: Germ theory is associated with quarantine, tied to the power of the church and state. (??) Sanitary reform had greater success in controlling diseases like cholera and food-borne diseases. Sanitary reformers focused on building infrastructure, improving living conditions, and personal hygiene. “Germ theorists had begun a revolution in medical thinking, but in the realm of medical practice, they could do little more than agree with existing recommendations of the miasmists.” “with the exception of a few vaccines and surgical asepsis, Germ Theory offered little…until well into the 20th century”. Not surprising that germ theory didn’t make much difference until antibiotics came along.
  • McKeown Thesis: “identifies nutrition as the primary determinant in the decline of infection-related mortality”. Improved nutrition best explains increasing population growth across different countries in a short time period, via improved agricultural methods and the transport of food. Urban growth with industrialization increased crowding and decreased sanitation, leaving nutrition as the explanation for decreasing infectious disease. There was a correlation between increasing height and decreasing infant mortality; increasing maternal height (indicating good nutrition) correlated with better indicators of infant health, so that improving nutrition improved health from generation to generation.
  • McKeown’s critics: error rates in bills of mortality obscure deaths, particularly from respiratory infections in the elderly. They also believe that he underestimates the significance of smallpox vaccination in decreasing death rates. The greatest criticism is that McKeown places too much emphasis on nutrition over other non-medicinal factors.
  • “Comparing the Agricultural Revolution with the Industrial Revolution, we find the same human determinants of infectious disease: a) subsistence, via its effects on nutritional status and immunity; b) settlement, via its effects on population density, living conditions, and sanitation; and c) social organization, via distributions of these resources and their differences within and between groups…. As such, the First and Second Transition could be seen as two sides of the same epidemiological coin with human actions as the basic currency.” (p. 61)
  • The second transition is incomplete in many countries. Only seven nations began the transition before 1850, and 17 more by 1900, with most transitioning after World War II. “The ‘low mortality club’ consisted of richer nations whose life expectancies converged at around 75 years old at the turn of the millennium. The ‘high mortality club’ consisted of poorer nations whose life expectancies converged at the same time around 50 years of age.” (p. 66) The poorer nations have relied more heavily on vaccines and drugs as a buffer against living conditions to achieve the transition. Drug-resistant pathogens remove this buffer for poorer nations. High childhood mortalities continued in the poorer countries for the same reasons as in the first transition.
  • Chronic diseases make people susceptible to different infections. Example: diabetes + TB; infectious diseases causing cancer: HPV, H. pylori, EBV (lymphoma).
  • Developed world vulnerable to “reimportation epidemics” from poorer nations with agents like smallpox (prior to eradication). Increased speed of air travel allows people to travel between high and low disease areas during the incubation period through entry ports without detection.

Third epidemiological transition (current)

  • Convergence of chronic and infectious diseases in a global human disease ecology marks the Third epidemiological transition.
  • Human health determinants remain subsistence, settlement and social organization.
  • 335 novel pathogens were discovered from 1940 to 2004, mostly after 1980; 60% of these are zoonoses, and 70% of those come from wild animals. With our long exposure to zoonoses from domestic animals, it makes sense that most new pathogens today come from wild animals; encroachment and habitat destruction also contribute.
  • Challenges for new zoonotic pathogens: establishing animal-to-human transmission, then human-to-human transmission, and finally spread from human population to population. Chatter is a pathogen trying to establish animal-to-human transmission without yet achieving human-to-human transmission. Chatter is often viral but can involve other microbes as well. Viral chatter is a transitional moment in evolution: purely biological for the pathogen but primarily cultural for humans (our practices and behavior help the pathogen make the transition).
  • Attenuation hypothesis: evolutionary interests favor microbes not killing their hosts too soon. Works for the first transition when population groups were widely scattered.
  • Virulence hypothesis: Ewald concluded that evolution favors virulence for pathogens with multiple hosts (e.g., plague). We can’t take either hypothesis too far, as both have contradicting examples.
  • We need to shift from just looking for drugs to combat pathogens and spend more time on factors of human ecology.
  • An interesting chapter on antibiotics and evolution.

Concluding focus: To dispel three myths

  1. Emerging infections are a new phenomenon. They are not. This is why the emerging infections page on this blog begins with emergences in Antiquity / Prehistoric. 
  2. Emerging and re-emerging infections are natural or spontaneous phenomena. They are not; we have a part to play in microbial co-evolution. The epidemiological transitions framework is intended to balance the microbiology in understanding these infections.
  3. Determinants of disease are different today than in the past. They are not.

“The purpose of this Unnatural History is to reveal the macroscopic determinants of human infection just as the germ theorists once revealed their microscopic determinants…. our approach has been one of both seed and soil, acknowledging the importance of pathogens while stressing their evolution in response to human activities: the ways we feed ourselves, the ways we populate and live together, and the ways we relate to each other for better or worse.” (p. 111)


I’m not an anthropologist, so I’m not really going to look at this like an anthropologist. Demographic shifts are what they are: facts. The underlying factors/variables are the same under all three transitions: subsistence (nutrition), settlement (living conditions/infrastructure), and social inequalities. As these conditions vary, so do the demographics. This is very useful; a reminder of the importance of human disease ecology. The Unnatural History of the title is a reference to human manipulation of the environment creating the conditions for emerging infections. Epidemics are not ‘acts of god’, or simply a natural process that we are helpless to stop. We play our part. Often drugs are the easy way out of the problem, far easier and cheaper than building infrastructure or improving living conditions.

The paradigm of epidemiological transitions is an anthropological tool. I don’t really have a practical use for labeling ‘transitions’. As both the second and third transitions are incomplete, they are not of much use to me as concepts. The short span of these transitions makes me wonder whether we are really looking at just one transition, ongoing since ca. 1800 and still incomplete. It is more important to me to look at the underlying variables and their outcomes at specific times and places. From my point of view, taking generalizations about epidemiological transitions as more than a guide for research or a teaching paradigm can be problematic.

This is a short book and yet I probably highlighted more than any other e-book that I’ve read. The focus here is more theory than details. Some of their plague information is a little out of date but it doesn’t really detract from their main points. It’s a valuable resource for thinking about microbe-human co-evolution. 

Molecular Confirmation of Yersinia pestis in 6th century Bavaria

Erasing any lingering doubts about the agent of the Plague of Justinian, a group of German biological anthropologists has shown conclusively that Yersinia pestis caused an epidemic among the people buried in a 6th-century Bavarian cemetery at Aschheim. Harbeck et al (2013) provide a convincing refutation of previous theories about the etiologic agent of the Plague of Justinian. Returning to the same cemetery where plague was previously reported, two independent labs using the most modern standards to prevent contamination confirmed Yersinia pestis in multiple burials within the cemetery, making this the best-characterized Early Medieval plague cemetery.

The cemetery, called Aschheim, is in Bavaria outside of Munich. It contains the remains of 438 people, with an unusually high number of multiple graves but no disordered mass graves. The 19 multiple burials contained two to five individuals arranged in lines. The cemetery was dated archaeologically to 500-700 AD, with carbon dates on the remains ranging from 530 to 680, all consistent with the 541 pandemic and its aftermath. Harbeck et al (2013) tested 19 individuals from 12 multiple graves. Of these, eight samples were positive, but only one produced enough aDNA for some SNP genotyping. Added to the previous paper, this makes 11 positive individuals from this cemetery. Given the tenuous survival of aDNA, 11 positive individuals out of 21 tested across the two combined papers is a very good success rate. This is a cemetery where the F1 antigen test would be interesting, since it could be applied to the entire cemetery without great cost or labor. More sensitive than aDNA, the antigen test could tell us the percentage of plague deaths in the cemetery.

Individual A120 was screened with several SNPs that mapped it to an early region of the phylogenetic tree, in the 0.ANT section. This makes the Plague of Justinian isolate ancestral to the Black Death isolates (yellow boxes below) from East Smithfield. The isolate falls in a section whose only point of diversity is 0.ANT1 at node 4. Date predictions for the nodes of diversity in the tree fit with the Plague of Justinian falling in this region. Modern isolates that form this region of the phylogenetic tree all come from central Asia (around Tibet), suggesting that, like the Black Death, the Plague of Justinian also originated in Asia. Overall, everything fits well with expectations for the first pandemic.

(Harbeck et al, 2013. Fig. 1)

References:

Harbeck M, Seifert L, Hänsch S, Wagner DM, Birdsell D, et al. (2013) Yersinia pestis DNA from Skeletal Remains from the 6th Century AD Reveals Insights into Justinianic Plague. PLoS Pathog 9(5): e1003349. doi:10.1371/journal.ppat.1003349

Wiechmann, I., & Grupe, G. (2005). Detection of Yersinia pestis DNA in two early medieval skeletal finds from Aschheim (Upper Bavaria, 6th century A.D.). American Journal of Physical Anthropology, 126(1), 48-55. PMID: 15386257