  WHAT WE NOW KNOW about blood seems all the more astounding when we think about where we have come from. For some two thousand years, philosophers and physicians imagined blood as one of the fundamental characteristics of our body and soul. We linked it to the spring, the air, and the liver.

  Thanks to the theories of figures such as Hippocrates, born around 460 BCE, and Galen, working around 200 CE, we came to believe that sickness arose as a result of disequilibrium among four key parts (or humours) of the body: blood, yellow bile, black bile, and phlegm. Hippocrates inspired the Hippocratic oath and is often referred to as the father of modern medicine. Claudius Galen proved the presence of blood in the arteries, and argued that arteries and veins are distinct and that the liver has a key role in blood production. “The liver is the source of the veins and the principal instrument of sanguification,” Galen wrote in On the Usefulness of the Parts of the Body.

  Galen argued that the preponderance of one particular humour went so far as to determine a person’s basic personality type. One might be sanguine, choleric, melancholic, or phlegmatic — words and concepts that continue to resonate with us today. Blood, for example, was said to quicken the spirit, and the adjective “sanguine” derives from the Old French word sanguin and from the Latin sanguineus (meaning “of blood”). It refers to a person who is courageous, loving, and optimistic, especially in difficult situations. The New Oxford American Dictionary offers the following definition of “sanguine” in the context of medieval science and medicine: “of or having the constitution associated with the predominance of blood among the bodily humours, supposedly marked by a ruddy complexion and an optimistic disposition.”

  But too much of any humour would create a dangerous disequilibrium in both temperament and health — the elusive “mind-body” balance we still long for today. Traditional Islamic medicine and the Ayurvedic medicine of ancient India suggest food and diet as one means to correct imbalances of the humours.

  Another was the technique of bloodletting, or phlebotomy. In retrospect, it is sobering to imagine how many thousands of patients have died from bloodletting or its complications. I, for one, hate having my blood played with or withdrawn and feel grateful in the extreme for Louis Pasteur and Robert Koch, both nineteenth-century scientists who demonstrated that inflammation results from infection, thus obviating any need for bloodletting.

  Bloodletting is still practised in a few ways. We donate blood, have it withdrawn for laboratory tests, and use it to treat problems such as polycythaemia (an abnormally high concentration of hemoglobin in the blood) and hemochromatosis (a hereditary disorder in which excess iron is absorbed through the gut and deposited in tissues).

  For thousands of years, physicians have used leeches as a bloodletting device. They, like some other animals, such as mosquitoes, lampreys, and vampire bats, have figured out that sucking other animals’ blood is an effective shortcut to a rich, nutritious meal. I wonder who, in medical cultures in ancient Egypt, Greece, and India, came up with the bright idea of ushering leeches onto human skin for the purposes of bloodletting. Someone must have stepped back and muttered, “But there must be a use for this worm that annoys me so.”

  Leeches still have a role in modern medicine, particularly in reconstructive or plastic surgery. They dilate the blood vessels and prevent the blood from clotting, and are especially useful after surgery in promoting the flow of venous blood. They have proved useful in the reattachment of body parts such as fingers, hands, toes, ears, noses, and nipples. Because veins have thin walls, they can be hard to stitch together in surgery. Until the body re-establishes its own venous connections, leeches secrete an enzyme that keeps blood flowing through the thin and sometimes damaged veins of reattached body parts. They are energetic little devils. A leech can suck more than three times its body weight in blood.

  It may be troubling to imagine a leech — which is basically a bloodsucking worm — attaching itself to your body. But other forms of traditional phlebotomy jump out as being far more invasive, and potentially lethal. I would take a leech over a human bloodletter, any day! Clearly, others feel the same way. Eric M. Meslin, associate dean for bioethics at the Indiana University School of Medicine, told me that while he was visiting the Spice Bazaar in Istanbul in April 2013, he came across a vendor who conducted a brisk business selling leeches. Identified on his storefront as “Prof Dr. Suluk,” the man sold leeches to treat ailments such as migraines, cellulite, low back pain, eczema, and hemorrhoids.

  In 400 BCE, the Greek historian Herodotus recommended cupping (the use of a partial vacuum to draw blood) as a means to promote appetite, digestion, and menstrual flow, and to resolve problems such as headaches and fainting. If blood is removed from behind the ears, he said, it brings about a natural repose. Other spots from which blood has traditionally been let include the knees and elbows. Bloodletting was certainly not limited to one cultural, religious, or geographic group. In addition to the Greeks and the Romans, Islamic cultures carried out bloodletting (for example, the Arab queen Zenobia killed King Jothima Al Abrash in this manner). Hindus practised it too.

  Bloodletting also entered into ancient Jewish traditions. As Fred Rosner wrote in 1986 in an article for the Bulletin of the New York Academy of Medicine, in the third to the fifth centuries CE, the Sages of the Babylonian Talmud held that a learned man should not live in a town that had no bloodletter. Bloodletting was recommended for headaches and plethora (an excess of blood).

  The medieval scholar and rabbi Maimonides wrote about the benefits and hazards of bloodletting, but not all Jewish writers believed in the practice. The Old Testament contains, in Leviticus, a prohibition against cutting into the skin. Maimonides said that before bloodletting, a patient should recite a supplication to God for healing, and that after the treatment concluded, the patient should say, “Blessed art Thou, Healer of the Living.”

  Over the years, many famous people have died of bloodletting. Charles II, king of England, Scotland, and Ireland from 1660 to 1685, should have been inspired by his own family history to pay close attention to the safeguarding of his own blood. After all, his own father, Charles I, was beheaded in 1649 on the charge of treason. Some of the king’s followers dipped their handkerchiefs in his blood. Oliver Cromwell, the revolutionary leader, permitted the king’s head to be sewn back onto his body so that his family could mourn properly after the execution. Nonetheless, some thirty-six years later, his son Charles II found himself ill at Whitehall Palace in London. Known as “the merry king” for his philandering, Charles II took to his bed one night with a sore foot. The next day, a barber shaved his head and the bloodletting began. In addition to enduring purging, mustard plasters, red-hot irons, and enemas of rock salt and syrup, Charles II had twenty-four ounces of blood withdrawn from his arms. He suffered a seizure and died.

  Napoleon survived a bloodletting and is known to have described medicine as “the science of murderers.” Mozart is thought to have died of shock from severe bloodletting, and George Washington lost his life a day after more than 2.3 litres of blood were taken from him, purportedly to help him cope with a cold and hoarseness.

  It would be easy to mock bloodletting as pseudo-medicine that hurt or killed thousands of people over thousands of years, and whose widespread use has come to a halt only in the past century or so. But that would be too easy a target. It is not hard to imagine the peals of laughter and squeals of disbelief that people might share in one hundred years when they analyze today’s medical practices. They will surely shake their heads and say, “What were they thinking?” We will always be in a state of evolution with regard to our perceptions of how our bodies work and how they can be cured of illness and disease.

  To me, the interesting thing about bloodletting is how thoroughly it was interwoven with our belief systems, and how long it endured. For two millennia, we coasted along on the unassailable idea that a healthy body and a healthy mind should not be burdened by too much blood. In our minds, letting our blood run removed its impurities and improved our physical state, along with our moods and emotions. And ever since, we have been obsessed with the idea of balancing our minds and bodies and improving the composition of our blood. For thousands of years, we used every manner of knife, quill, tooth, lancet, and scalpel to spill our own blood in the name of medicine. We could have filled rivers and lakes with all the blood we have voluntarily spilled. We did so because of our belief systems. We let our blood run because we had imaginations.

  We are always looking for ways to distinguish ourselves from other animals. Let me add one more point of comparison. Can you think of any other animal that cuts itself, or others, to satisfy the cravings of its soul? Other animals will attack if they need food, or run to avoid being eaten, but generally they have the good sense to leave their own blood alone.

  THE SEVENTEENTH-CENTURY BRITISH ANATOMIST William Harvey — physician to King Charles I — refuted thousands of years of medical thinking when he proved that blood circulates in the body and is pumped by the heart. He dissected live animals to establish his theory. It seems barbaric today, but Harvey had no other means at his disposal to advance his work. As Thomas Wright notes in his book Circulation: William Harvey’s Revolutionary Idea, in the early seventeenth century, “Men could no more see blood coursing around their arteries and veins, going to and from the heart, than they could perceive that the earth was spinning round.”

  In 1636, Harvey confronted and shocked his doubters at the University of Altdorf, near Nuremberg. Dressed in a white gown, his head covered with a white bonnet, the diminutive physician instructed porters to affix a live dog to a dissection table, immobilizing it and tying its jaws shut to prevent barking. He plunged a knife into the animal’s thorax, exposed its heart, and indicated the rising and falling of the organ. When the dog’s heart was in contraction, Harvey severed an artery. The blood spewed forth, showering the closest spectators, several feet away. Thus we finally learned the basics of blood circulation. Understanding this concept opened up the long and painful path toward blood transfusions.

  In the mid-1600s, doctors in France and England competed madly for the honour of carrying out the first blood transfusions. Dogs were transfused with the blood of other dogs, and eventually humans were transfused with the blood of calves and lambs. While some of these procedures did not lead to fatalities (possibly because little or no animal blood actually managed to enter the human bloodstream) and were deemed successes, others did result in death.

  One of the earliest documented transfusion attempts involved a French physician named Jean-Baptiste Denis, who grabbed a man named Antoine Mauroy off the streets of Paris in 1667 and attempted to calm his agitated mind by forcefully transfusing the blood of a calf into his veins. The physician believed that the mildness and freshness of the gentle animal would, by entering the patient’s bloodstream, affect his personality. We now know that a human being is likely to have a severe or fatal reaction to the blood of any animal, because the human body rejects the foreign substance. Mauroy died after a few attempts. His wife brought a complaint to the authorities. Denis and his colleagues countered that the patient’s wife had poisoned her husband. A French court eventually exonerated the doctor and charged the victim’s wife with murder. She disappeared from the records, and it is likely that she was executed.

  The French court finally decreed that no further transfusions were to be carried out without the consent of the French Faculty of Medicine. Shortly thereafter, both France and England banned human transfusions outright.

  James Blundell, the nineteenth-century English obstetrician, carried out the first successful human-to-human blood transfusions. In 1818, he used a syringe to extract four ounces of blood from a husband and transfuse it into his wife, to treat postpartum hemorrhage. She survived, and Blundell went on to carry out another ten transfusions between 1825 and 1830, five of which proved beneficial.

  Many patients died in early human-to-human blood transfusions. Looking back, we now know that many such deaths occurred because of mismatching blood types between donor and recipient.

  The path toward safe blood transfusions became much more promising in 1901, when the Austrian Karl Landsteiner discovered three blood groups: A, B, and C (later called O). The very next year, Landsteiner’s colleagues identified a fourth blood group: AB. The blood groups A, B, and AB are incompatible with each other. Type O blood, however, can be given to recipients of any of the other groups. Within a few years of Landsteiner’s discovery, Reuben Ottenberg at Mount Sinai Hospital in New York performed the first transfusion by matching blood types. He went on to perform more than a hundred other transfusions without the problems that had resulted previously from mixing incompatible types.

  Initially, blood transfusions required the volunteer donor to be placed next to the recipient so that their veins could be connected. This was known as “blood on the hoof.” You can imagine how impractical it must have been to always require donors and recipients to be together, but people were amazingly creative in addressing the challenge. For example, in 1921, under the supervision of a man named Percy Lane Oliver, the Red Cross in London, England, set up a list of people who promised to be available, at any hour of the day or night, when donors were needed. These volunteers underwent medical exams ahead of time, including tests for blood type and syphilis. Their telephone numbers were recorded. At its height of activity in the 1930s, people on this list responded to nine thousand calls a year. Although the process of lining up donor with recipient must have been cumbersome, the emotional connection between the two people surely heightened both parties’ appreciation of the value of the gift.

  But blood on the hoof diminished as a medical necessity as the science of blood storage moved forward. Sodium citrate was identified as a means to prevent stored blood from clotting; refrigeration was discovered as a safe means to prolong the shelf life of blood; and the looming tragedy of war drove us to discover the possibilities of blood banks so that massive amounts of blood could be moved to the front lines to save the lives of injured soldiers. The first blood depot was used in World War I, in 1917, when the U.S. Army doctor Oswald Robertson used a citrate-glucose solution to store type O blood, to be used for British soldiers returning injured after fighting the Germans in the Battle of Cambrai in France. Use of the citrate-glucose solution made it possible to store the blood safely for a few weeks.

  The Canadian surgeon Norman Bethune entered the picture during the Spanish Civil War. On the side of the leftist Republicans, Bethune established a mobile blood service in Spain in 1936. Following the example of a Spanish hematologist working in Barcelona, Bethune travelled to Madrid and set up the service, which brought blood in bottles to Republican soldiers who had been injured at the front. Using a kerosene-powered refrigerator and sterilizing equipment, Bethune’s Canadian Blood Transfusion Service was soon making use of 4,000 donors, 100 staff members, and five trucks to deliver blood for 100 transfusions a day. Essentially, Bethune created an early model of the Mobile Army Surgical Hospital (or MASH, for short) units that were later employed by the American army during the Korean War in the early 1950s.

  In 1940, the African-American physician Charles Drew — who had studied medicine at McGill University — responded to a blood shortage in Britain during the Second World War. The United States had not yet entered the war, but working for the Presbyterian Hospital in New York, Drew devised a safe system to process, test, store, and ship plasma overseas to Britain. (I will discuss Drew in more detail in the next chapter.) After he directed the Blood for Britain program, the United States entered the war and had to see to its own blood supply needs. The American Red Cross organized a civilian blood donor service and collected some thirteen million units of blood over the course of the war.

  The science of transfusion continued to advance, thanks in part to new breakthroughs such as the discovery of the Rhesus blood system (more on this later), the use of better anticoagulants to enhance blood storage, and the use of plastic bags for blood collection. Today, some 92 million blood donations are collected each year worldwide, many of them to be used for transfusions. As a noble and selfless gift, blood is in an entirely different class than money. Dollars from your bank account can be directed to the charity or person of your choice. And you expect, usually, to be thanked and recognized for a cash gift. If you give enough money, you may even get a bridge or building named after you. But for the most part, you have no idea who will receive your blood. You give it on principle. You and the recipient will never meet.

  It is not surprising that the turning points in the science of transfusion clustered around the major wars of the twentieth century. Blood donations became the ultimate symbol of the gift of life, from civilians to soldiers. Individuals helped save individuals, but also sought to protect an entire generation. But as we will see in the next chapter, blood donation has also come to reflect our deepest discriminations and prejudices, when philanthropy collides with politics and our very worst selves overcome our best selves.

  WHILE BLOOD IS UNIVERSAL in its nature and functions, it is also a marker of the gender difference between men and women. I’m sure that I was in no way unique when, at the age of thirteen or so, I walked into the bathroom in our Toronto home and gasped to see blood in the toilet. I knew that my sister, younger by one year, had just been in the same bathroom, and I ran to tell my mother that Karen was bleeding. I was astonished, and intrigued, by the firmness and quickness with which my mother flushed the toilet, told me not to worry, said everything was under control, and dismissed me. Earlier, she had told me about women’s menstrual cycles, and later, I believe, she repeated the message. But in the moment, her job was to get me out of the way and show me that what I thought was a major event deserved no attention whatsoever.