Journal of Medical Humanities, Vol. 23, Nos. 3/4, Winter 2002 (© 2002)
Blood, Race, and National Identity: Scientific and Popular Discourses

Allyson D. Polsky1
This essay examines the symbolic significance of blood in the twentieth century and its role in determining the composition of a national community along racial lines. By drawing parallels between Nazi notions of blood and racial purity and historically contemporaneous U.S. policies regarding blood and blood products, Polsky reveals a disturbing proximity in discourse and policy. While the Nazis attempted to locate Jewish racial essence and inferiority in blood and instituted eugenic measures and laws forbidding racial admixture, similar policies existed in the U.S. based on the so-called "one drop rule" that systematically discriminated against African Americans.

KEY WORDS: race; blood; national identity.
Gene-bettered children will in no sense threaten human civilization. . . . If they grow up healthily gene-bettered, more such children will follow, and those whose lives are enriched by their existence will rejoice that science has again improved human life. —James Watson (1999)
1 Address correspondence to Allyson D. Polsky, Center for the Humanities, Wesleyan University, Middletown, CT 06459; e-mail: [email protected].

© 2002 Human Sciences Press, Inc.

DISCOURSES IN CIRCULATION

The REPOhistory collective was formed by artists, educators, and social activists in 1989 with the mission of designing site-specific installations that interrogate how "official" histories are produced and contested in public culture and of creating its own alternative histories. REPOhistory's 2000 project "Circulation" used New York City's vast network of public transportation, sewer and water works, housing, waterfronts, blood banks, hospitals, and clinics as a backdrop for a history lesson in civic anatomy. "Circulation" examined how blood functions—or flows—in relation to culture and economics, as well as with respect to discourses about racial purity and national identity (Sholette, 1999, p. 4). Works by over thirty artists were disseminated through the mail, over the Internet, and in public venues including schools, street performances, and health clinics in order to educate New Yorkers about the symbolic significance of blood; REPOhistory lessons appeared on postcards, small objects, magnets, and stickers. For example, Greg Sholette's work entitled "Do You Know Where Your DNA Is?" called attention to how blood and genetics have been and are currently used as tools of discrimination. To get this message across, pedestrians on New York's busy streets stumbled upon MetroCards whose cheery cityscape backgrounds were replaced by the double helix. Similarly, Tom Klem's "Identification Cards" were e-greetings in which Klem depicted artists and family members as mug shots, identified only by their blood types. Klem's cards referred to the ways Nazi scientists used blood and other physical markers to determine the identities of those with "impure" origins and to justify their extermination. (See Figure 1.) In addition to its broad pedagogical mission, "Circulation" also had an educational component specifically geared toward middle and high school students through its affiliation with New York's Institute for Collaborative Education (ICE),
Fig. 1. (A) Source: http://www.repohistory.org/ circulation/cards/sholette.jpg; (B) Source: http://www. repohistory.org/circulation/cards/tnklem.jpg.
whose students produced an on-line magazine called The Bleeding Edge. Here students collaborated on the zine by designing an illustrated dictionary of the metaphors associated with blood, by creating bilingual family albums that include members who "may or may not be blood related or human," and by writing and illustrating stories for children on topics ranging from AIDS to the circulation of rumors. Students at all grade levels contributed related independent works to The Bleeding Edge, including fiction, nonfiction essays, and serial stories about blood. The "Circulation" project approached its object as both a physical entity and a metaphor for identity by taking into account the complex and broad spectrum of meanings that enable the paradoxical status of blood. It investigated the contradictions that enable blood to be considered a potentially life-saving substance and a source of deadly pathogens, a symbol of human unity and a justification for segregation. But among its most ambitious goals was the deconstruction of the notion that our personal behaviors and our collective cultures are deterministically "in our blood." This is an urgent task given the pervasive tendency of American popular culture toward the blind acceptance of biologism as well as the routine conflation of blood and genes. As Dorothy Nelkin (1999) explains:

"Blood" frequently appears as the word for "genes," implying that it holds meaning for the heritability of essential traits. In the American eugenics movement during the early part of [the twentieth] century, blood represented "stock," or "lineage," or "bloodlines." The language of eugenics faded from public discourse after World War II.
But today, encouraged by the highly promoted advances in genetics, the importance of heritability is again a prevalent theme in popular culture and the metaphor is still the “blood.” For example, an American prime time TV film called “Tainted Blood,” tells the fictional story of a pair of teenage twins who have been adopted as infants into “good” families in different parts of the country. Nevertheless they both ended up by murdering their parents as their biological mother had done. For they had inherited violent (tainted) predispositions. Criminal behavior was “in their blood.” (p. 277)
As most Americans know, thanks to the barrage of media attention it received, the first phase of the Human Genome Project is now complete. Harvard University biologist Walter Gilbert promises that it will not be much longer until we finally know “what it is to be human,” but what Gilbert and other scientists fail to acknowledge is that what counts as human has always been a reflection of power, signification, and valuation (Begley, 2000). Like the knowledge produced by the symbolics of blood in the twentieth century, the knowledge produced by the HGP in the twenty-first century will be influenced by past and present struggles over what qualities define the essence of human identity, and what criteria determine racial and national identities. Among the host of ethical concerns that have been raised by today’s rapidly advancing genetic research are the psychosocial impact of genetic screening and testing and the potential discrimination and exclusion that may face those who are deemed genetically “flawed.” Embedded within these technological advances is the ever-present threat of genetic determinism, which can hardly be considered
a humanitarian advance. Scientists like Watson and Gilbert prefer to ignore the ethical implications of their work on the basis that the "genie is out of the bottle" and progress must march on. The purpose of my discussion is not to suggest that genetic research is intrinsically problematic or that it should be halted. However, an examination of twentieth century discourses linking race, blood, and national identity can and should be carefully read as a cautionary tale. Simply knowing the past is not enough to prevent its repetition in the future; however, it can provide us with a sense of what human rights abuses are likely and what steps are necessary to prevent them.

TWENTIETH CENTURY NAZI RACIAL DISCOURSE

I will begin my investigation of late twentieth century blood mythology by recalling what Michel Foucault (1978) described as "the biggest blood bath in recent memory" (p. 150). During the Nazi regime, blood was arguably the single most important vehicle for the promulgation of anti-Semitism. Nazism rested on the belief in German racial superiority, which was attributed to Germans' shared descent from Nordic bloodlines. Hitler, who detested city life and cosmopolitanism, argued that city-dwelling Germans, who were no longer bound to the land by the early to mid part of the twentieth century, had lost sight of their heritage. The German nation, Hitler reasoned, could only salvage itself through the purification of its blood. Anthropologist Uli Linke (1999) explained that as the Nazis came to power, "Ideologically, society was conceived as a 'new community,' a unitary body based on the common substance of blood" (p. 207). The idealization of this common ancestral bloodline led to an obsession with cultures of the past. Mythological elements, symbols, customs, law, and rituals were all revived with great fervor (Linke, 1999, p. 198).
By the mid-1930s, the Germans were intent on developing clear-cut guidelines to unambiguously distinguish between the "pure" Germans and the "impure" Jews, largely on the basis of blood. They turned to anthropologists and geneticists of the time to delineate the traits that they believed would reveal racial affiliation. The notion emerged that differences, including racial differences, were written not only visibly on the skin but also beneath it, as blood emerged as a marker of pathological alterity. Otto Reche, professor of racial science at the University of Leipzig, was enlisted to conduct blood research aimed at identifying physiological racial distinctions. With another scientist, Paul Steffan, Reche founded the German Society for Blood Group Research in 1926. Based on his study of the rural population of Northwestern Germany, Reche noted that the "long-headed" European races had blood type A in greater numbers than any other type. According to Reche, another less well-defined race with assumed Asian origins had a higher prevalence of blood type B, and the pure-blooded people of pre-Columbian America were exclusively of type O. From these suppositions, Reche and Steffan concluded that
a strong correlation once existed between race and blood type. This correlation, they argued, had been corrupted through miscegenation, but blood purity could be restored through "properly" controlled human breeding. Reche did not invent his position out of thin air. Rather, he based his research and its conclusions on the distortion of previous studies conducted by the Polish serologist Ludwig Hirszfeld. Hirszfeld and his wife, also a doctor, served in the Serbian Army Medical Corps during World War I. In early 1916, they were evacuated to Salonica, where they were joined by a large contingent of Allied troops. The Hirszfelds remained detained there with the others for two years, forming an extraordinarily ethnically diverse group. Noting an ideal opportunity for research to determine the heritability of blood groups, the Hirszfelds collected an impressive number of blood samples—over 8,000—from nearly twenty ethnic groups. Although they did not study type O at all, they found what they believed to be a correlation between ethnic groups and the prevalence of particular blood types. They also posited two distinct locations of human origin, in India and North Central Europe. Despite confidence in their hypothesis, the Hirszfelds were very careful to note that their findings reflected information only about geography, about where groups of people might have originated, and had nothing whatsoever to do with anthropological characteristics. Seven years after the Hirszfelds made their findings public, Reche and his colleagues attempted to prove precisely what the earlier scientists had tried to debunk: the notion that blood types could be effectively correlated with certain racial characteristics. But there were problems with Reche's attempt to appropriate the Hirszfelds' research—for one thing, even in the Nazi capital of Berlin, so-called Aryans exhibited more type B blood than Jews.
Furthermore, Reche had to admit that no single blood type was more prevalent among Jews than among any other population. However, as Douglas Starr (1998) noted, the Nazis nevertheless came to the erroneous conclusion that "because blood type B appeared with slightly greater frequency among Eastern Europeans and Jews (although still not in the majority), Nazi doctors identified it as a 'Slavic' or 'Jewish' marker" (p. 75). Starr explained that to [the Nazis], B became the blood type of the dark, Asiatic races and of the Minderwertig in Germany—the undesirable elements. Researchers correlated B blood with a host of negative traits, such as dark hair and a broad Slavic face. They linked it to "bearers of Polish names," rather than "bearers of German names"; to urban as opposed to rural dwellers; to violent instead of nonviolent prison inmates; and to uncoordinated people versus graceful athletes. Whereas the A type, slightly more common among Germans, became linked with positive traits such as intelligence and industry, B was the mark of a retrograde population—imbeciles, alcoholics, those who were more prone to infection, and those suffering nervous disease, most of whom, incidentally, had their origins in the East. One researcher correlated B blood to the length of time spent on the toilet. According to his study, A types took just a moment to defecate, B types forty minutes or more. A French researcher sympathetic to Nazis wrote that B blood makes a person "better to retail trade than bearing arms." (p. 75)
Despite the Hirszfelds' strong public denunciation of the Nazis' appropriation of their research, Reche, Steffan, and their colleagues pushed ahead with their
agenda. Positing that humanity had originally evolved from two distinct races, allegedly a superior Germanic one and an inferior Asiatic one, the Nazi scientists argued that miscegenation led to the emergence of so-called mongrel races such as Jews and Slavs. Detailed maps were drawn and widely distributed to depict these beliefs and to drum up support for the notion that racial repurification was necessary for the healthful future of the Aryan race. By the fall of 1935, these notions gained enough support that Hitler signed into law certain measures that were designed to racially "cleanse" Germany. The Nuremberg Laws excluded Jews from German citizenship, a pivotal step in attempts at their dehumanization. The Law for the Protection of German Blood and German Honor criminalized sexual relations and marriage between Jews and non-Jews. This law defined Jews as those with at least three Jewish grandparents; those with one or two were considered "half-breeds" (Proctor, 1988, p. 132). The law was very specific in its restrictions on the "half-breeds." For example, someone who was only one quarter Jewish could be considered German and could engage in sexual relations with or marry another German, as long as the other German was not also one quarter Jewish. The Law for the Protection of the Genetic Health of the German People required all couples to undergo genetic testing prior to marriage so that those with "defective" traits could be forbidden from marriage or be subject to forced sterilization. All couples intending to marry had to be approved for certificates of racial eligibility. With the passage of the Nuremberg Laws, the Nazis made an unprecedented attempt to distinguish Jews and Germans on the basis of their blood; however, the stigma of being Jewish was also considered acquirable through contact with blood or through sexual contact.
For example, in 1935, not far from a Jewish hospital in Southern Germany, a member of the German storm troopers was the victim of a car accident. He was immediately taken to a nearby hospital, where he received a transfusion of "Jewish blood." Following his recovery, he had to appear before a disciplinary court to determine whether or not he had been racially corrupted by his life-saving transfusion. The court noted that, strictly speaking, the transfusion had indeed made the man racially impure. However, they used the Civil Service Law to create an exception that would allow him to remain a storm trooper on the basis that the Jew whose blood he received had fought at the front during the First World War (Proctor, 1988, p. 149). In the same year, however, Dr. Hans Serelman, an Aryan, was not so lucky. The courts determined that Dr. Serelman, who donated his own blood to save a dying Jewish patient, had committed an act of defilement against the German race, and Dr. Serelman was sentenced to a concentration camp (Starr, 1998, p. 72). In fact, Nazi racial discourse was so popular that when the Nuremberg Laws were passed even leading German medical journals quickly applauded them. For example, the journal Deutsches Ärzteblatt hailed The Law for the Protection of German Blood and German Honor, claiming it would protect the German "body"
from its further invasion by "foreign racial elements" and help to "cleanse the body of our Volk" (cited in Proctor, 1988, p. 133). Contrary to the popular notion that the Nazis corrupted the intellectual and liberal values of science in the service of their racial policies, historian Robert Proctor contends that Nazi political initiatives arose from within scientific communities and pervasive medical theories of racial hygiene. Proctor makes a good point insofar as Nazi racial discourse, including medical discourse, synthesized false cultural beliefs about Jews that were undoubtedly used to justify Nazi racial policies. While anti-Semitism corrupted science and medicine, I would also argue that this does not exclude the truth that science and medicine also corrupted humanist values and helped to justify anti-Semitism. In every historical era and in every culture, bioscientific and popular beliefs form a continuum and strongly reinforce one another in both discourse and practice. It is also important to keep in mind not only that the policies of racial hygiene adopted by the Nazis were supported by respected German scientists and institutions, but also that they were not completely anomalous. Indeed, Proctor and others have pointed out that racial hygienists looked to preexisting American immigration, sterilization, and miscegenation laws as they began to formulate their own policies. In 1907, Indiana, prompted by the eugenics movement, became the first state in the U.S. to enact laws for the sterilization of those deemed mentally ill or criminally insane. Similar laws quickly proliferated in other states, and in 1927 the Supreme Court weighed in on the matter. Ruling in the case of Buck v. Bell, the Court upheld a Virginia eugenics law that required the sterilization of the "insane, idiotic, imbecile, feebleminded, or epileptic" (Congressional Quarterly Researcher, 2000, p. 414).
In his opinion in this case, Justice Oliver Wendell Holmes declared, "It is better for all the world if, instead of waiting to execute degenerate offspring for crime, or to let them starve for their imbecility, society can prevent those who are manifestly unfit from continuing their kind" (p. 414). Holmes' opinion reflects the belief that most traits, including behavioral ones, were utterly heritable, a belief that was then extraordinarily common in the U.S. Indeed, by the time of the ruling, nearly thirty states and one Canadian province had enacted laws similar to Indiana's, and by 1930, more than 30,000 people had been sterilized as a result of these laws (Proctor, 1988, p. 97). Compulsory sterilization laws were often loosely phrased. Those deemed "insane" or "feebleminded" were disproportionately recent immigrants whose limited literacy resulted in lower scores on IQ assessments (Hubbard & Wald, 1999, p. 21). Other populations affected by the sterilization laws included so-called sexual perverts, drug and alcohol users, epileptics, and other "social degenerates" (p. 21). Although eugenicists found genetic "defectives" among all ethnic groups, certain groups were diagnosed far more often than others. The bias of eugenicists was a major factor in the 1924 United States Immigration Restriction
Act, which was designed to limit the influx of Jews, Poles, and Southern Europeans while enabling the immigration of people of British and Northern European descent. As Hubbard and Wald (1999) point out, the act restricted the number of people allowed to immigrate to the U.S. from any other country to two percent of the number of U.S. residents who had been born in that country, as recorded in the 1890 census. It was therefore based on the population before the major immigration of people from southern and eastern Europe, a deliberate choice that effectively prevented the entry of eastern European Jews attempting to flee the Nazis (p. 21). Although the Holocaust ultimately led to denunciation of the notions of racial hygiene and the downfall of the eugenics movement, forced sterilizations continued well into the 1960s. Virginia, the last state to overturn its eugenics law, did not do so until 1979, when it finally shut down the Lynchburg Colony for the Epileptic and Feebleminded. By this time, in accordance with U.S. eugenics laws, more than 60,000 Americans had been sterilized. Another way in which the so-called purity of the American population was strictly controlled was through injunctions barring sexual relations between the races. Ever since the beginning of the slave trade in what became the United States in the sixteenth century, the historical record shows evidence of sexual relationships across racial lines. However, because slaves were legally classified as property rather than as citizens, they had no rights whatsoever and therefore no recourse against sexual abuse. Although censuses show thousands of multiracial people classified as "mulattos," their paternity was rarely acknowledged. In fact, a child's race was legally classified as the race of its mother; its status as a free person or a slave was wholly dependent on the mother's status.
This condition, known as hypodescent or the "one drop rule," classified all persons with any black ancestry whatsoever as black. Marriage between anyone classified as white and anyone classified as nonwhite was explicitly prohibited in many states and upheld as constitutional by the Supreme Court in 1883 in Pace v. Alabama. This racial attitude persisted well into the twentieth century, until 1967, when the Court finally overturned the prohibition of interracial marriages in Loving v. Virginia. At the time of this ruling, sixteen states still had laws on their books against interracial marriage.

WORLD WAR II: BAD BLOOD AND THE LEGACY OF AMERICAN RACIAL DISCOURSE

American outrage at Nazi atrocities appears hypocritical given our own longstanding racial discourse, and no other population in the U.S. felt this ideological contradiction more strongly than African Americans. Although most American history texts fail to mention it, our military and blood policies during World War II reveal the proximity of American racial discourse to Hitler's own. To illustrate this point, I will examine the development of the blood procurement and
distribution policies of the American Red Cross, the leading organization of its kind, before, during, and after the war. The exchange and donation of blood have historically been viewed as an important symbol of community solidarity; the act of blood donation is considered an altruistic and patriotic contribution to the public good. The ideals of equality and fraternity appear to have motivated the creation of the first American Red Cross blood donation program in 1937. Doctors in Augusta, Georgia, had observed the deaths of several black indigent women, usually obstetrical patients, who were unable to access blood for emergency transfusions. This spurred the development of the blood donation program, which was biracial and received enthusiastic support (Love, 1996, p. 144). By 1940, similar blood donor programs began to spring up across the country. As the war approached, the United States Armed Forces also became interested in blood procurement. A special meeting at the New York Academy of Medicine, attended by top scientists and military officials, led to the development of the Blood for Britain Project, an emergency operation to get plasma to British soldiers fighting on French battlefields. The esteemed physician and blood researcher Charles Drew, who had been the first black American to receive a doctor of science degree in medicine, was recruited to direct the effort.2 Although surgical residencies for blacks did not open up until the late 1940s and early 1950s, Drew managed to secure a position at Columbia's Presbyterian Hospital (Love, 1996, p. 120). While at Columbia, Drew had helped to set up an experimental blood bank and wrote his doctoral dissertation on banked blood. These experiences prepared Drew to carry out pioneering research that would enable the separation of plasma from whole blood and its overseas shipment. In 1941, after leading the Blood for Britain Project, Drew established the first American Red Cross blood bank in New York City. The very same year, the U.S. Armed Forces announced that blacks would be excluded as potential donors from American Red Cross blood banks across the country (Love, 1996, p. 49). The national blood program was significantly expanded with the entrance of the United States into the war following the bombing of Pearl Harbor in December 1941, yet blacks who attempted to enlist as soldiers or donate their blood faced discrimination and segregation.

2 Drew left his post at Blood for Britain for Howard University, where he was named head of the department of surgery and chief surgeon at Freedmen's Hospital. Drew carried out his dream of leading the surgical residency program to build a team of excellent black surgeons who would carry on his legacy. Drew died in a car accident in 1950. He had been traveling all night with three other black doctors to a medical conference held, ironically, in Tuskegee, Alabama, when he fell asleep at the wheel and crashed the vehicle. The rumor, which began to circulate almost immediately and persists to the present day, is that Drew bled to death because he was denied treatment at the local hospital. In fact, the hospital, like many other hospitals at the time, did operate on a segregated basis. However, by all accounts, Drew received the best possible standard of care. The Drew legend, as Spencie Love (1999) points out, must be understood in the context of long-standing racial discrimination and justified mistrust of the white medical establishment. In the past and at the time of the crash, the hospital segregation system resulted not only in substandard care for blacks but, in some cases, in outright refusal of care on the basis that not enough black beds were available. Although this is not how Drew died, it is the way Bessie Smith and countless others did.

Following negative press coverage and strong protests against their blood policy, in January 1942 the Red Cross announced that the ban would be lifted so that blacks could donate blood, but their blood was to be labeled and segregated from "white blood." The Red Cross claimed that its decision to segregate blood was based on the wishes of potential recipients. Repeatedly, Drew and others protested against the blood segregation policy. Despite Drew's own unparalleled contributions to blood banking and plasma research, the Red Cross blood policy would have first made it impossible for him to donate his own blood at all, and later possible only to donate it to other black recipients. The Red Cross policy reflected ambivalence and uncertainty in the minds of white Americans who believed in the dominant racial mythology of the 1940s. This mythology led them to believe that there was a fundamental difference between the blood of different races, that it was possible to transmit the traits and characteristics of one race to a member of another race by means of a blood transfusion, and that it was possible for blood transfusions to implant potentialities in an individual of one race that would show up in succeeding generations. These were paranoid fears that Charles Drew tried tirelessly to debunk. To get a sense of public sentiment at this time, consider this excerpt from an anonymous letter written in 1945 by a white person to their Congressman. The letter indicates the threat to "pure" white racial identity that blood transfusion was perceived to represent:

The families of several young men and women, now serving with our armed forces overseas, have heard rumors which, at first, seemed too fantastic even to warrant serious thought . . .
that blood plasma taken from Whites, Asiatics, and Negroes is being injected, without regard to its origin, into the bloodstreams of our wounded soldier. . . . [T]his unproved, potentially dangerous stuff to future generations is now being put into the bloodstreams of our wounded heroes. . . . [T]he color of a child is directly influenced by the blood of the mother and of the father. In fact, in the final analysis, the embryonic baby is the direct result of the mixing of the blood of the parents. No one knows the starting point of the pigmentation responsible for this coloration. A glance through the medical books will confirm this . . . . [A]re the babies, or grandchildren or great grandchildren of our wounded white soldiers to be white, brown, red, yellow, or black? How many white men, having a choice, would rather die there on the battlefield without plasma than run the chance of coming back to be the father, grandfather or great grandfather of a brown, red, black or yellow child? (cited in Love, 1996, p. 194)
In the years following World War II, the Red Cross opened branches all around the United States, yet because of irrational fears like these, its policy of racial discrimination in blood collection continued. The Red Cross used patriotic themes to recruit donors, and officials likened the organization's mission to that of the early American pioneers. However, as Douglas Starr (1998) explains, "In concession to the Southern States, the Red Cross authorized its chapters to label the blood it collected by race and let the hospitals they sent it to decide how to distribute it" (pp. 169–170). The Red Cross policy was considered reprehensible by other nations. Early in the Korean War, the Red Cross attempted to set up a blood donation center at
the United Nations. However, UN employees were so offended by the mandatory racial designation on Red Cross donor cards, a clear violation of the UN charter, that they threatened to boycott the collection. The boycott was abandoned only when the Red Cross allowed the UN to make its own donor cards without reference to race. Although the Red Cross later revoked its racial segregation policy, even as late as the late 1950s Louisiana and Arkansas enacted blood segregation laws.

The notion that black blood was bad blood was not isolated to national blood procurement and distribution policies. Between 1932 and 1972, researchers at the Tuskegee Institute and the Macon County health department told some six hundred black men from Alabama that they would be given free treatment for their "bad blood." Instead of receiving this treatment, however, almost four hundred men with late-stage syphilis and two hundred disease-free controls were given only placebos and the promise of decent burials. Although the periodic findings of this long-running study were routinely published in medical and public health journals, it was not until forty years had passed and the media began their own intensive investigation that the study was discontinued. It was not for another quarter of a century that President Clinton issued a presidential apology. As Spencie Love (1996) observes, "The black men in the Tuskegee syphilis experiment were, in a metaphoric sense, allowed to 'bleed to death' as a result of medical neglect and hypocrisy" (p. 69). While some view Tuskegee as a tragic but isolated instance of unethical behavior that directly led to the development of future governmental guidelines for the protection of human research subjects, others situate the experiment in the context of long-standing governmental and medical human rights abuses perpetrated against African Americans.
To view Tuskegee only in the first light is not merely an innocently naïve position; it is to ignore the underlying social and economic context that both facilitated the project and was used to justify it as legitimate medical research.

A more recent case where blacks have been excluded from the protections typically accorded citizens on the basis of "bad blood" is the racialization of sickle cell anemia. In his recent study, In the Blood: Sickle Cell Anemia and the Politics of Race (1999), anthropologist Melbourne Tapper chronicles how sickle cell anemia has been erroneously characterized as a "racial disease" since its identification in the early part of the twentieth century. The disease, which affects people who inherit a mutated gene from both parents, disproportionately affects people of African descent. Approximately one in five hundred African Americans has sickle cell anemia, having inherited the allele from both parents, and approximately one in ten African Americans is a carrier of the sickle cell trait. Scientists conducting early research on sickle cell anemia used the prevalence of the illness in black populations in three insidious ways that made sickle cell appear to be a symptom of race. First, they used it as evidence of an inherently diseased and inferior black body. Second, sickle cell anemia was used to caution against miscegenation, which scientists falsely believed would facilitate the genetic spread
of the disease to the white population. Third, just as the Nazis sought to determine traits that revealed racial affiliation, the sickling trait was viewed as evidence of the nonwhite status of its bearer. The possession of the trait was therefore used to call the racial "purity" of white sufferers into question. In order to explain how Mediterraneans who appeared white could be diagnosed with sickling, scientists theorized that they had come into contact with African blood through "historical or geographical circumstances" (Tapper, 1999, p. 25).

By the 1970s, the overdetermined association between sickle cell anemia and race had not been resolved. Progressive civil rights activists began to call attention to disparities in health status and rates of mortality between whites and blacks, disparities that were strongly linked to socioeconomic inequality and to unequal access to health care. The Nixon administration promised to address the health care of blacks; however, it chose not to respond to the specific issues raised by activists. Rather, Nixon pledged his commitment to combating a single disease, sickle cell anemia, which he described as a tragic burden to blacks. Following hearings in 1971, Congress passed the Sickle Cell Anemia Control Act. Massive screening programs to detect the presence of the sickle cell trait were launched as the proposed "cure" for sickle cell. This approach, which focused on detecting the presence of the trait rather than treating the disease, was highly problematic. The trait and the disease were conflated. Reports about the incidence of the disease were exaggerated and, without adequate counseling, those with the trait worried that they actually had the disease. The idea that sickle cell anemia was an epidemic and a public health threat (even though genetic disorders are never contagious) led several states to consider implementing mandatory screening of black citizens.
Some even considered a provision making screening a prerequisite for children to enroll in public school (Tapper, 1999, p. 121). As screening programs were implemented across the country, carriers of the sickle cell trait who did not have the disease were denied jobs and entrance into the military and were discriminated against by insurers (Tapper, 1999, p. 122). Tapper explains:

Mass screening for sickling, as it was practiced in the 1970s, took place in supermarkets, neighborhood clinics, churches (after services), and fairs and in other public spaces. The results were conveyed to the screened individuals in an ad hoc manner—most often through telephone calls or postcards. The information generated by the screenings was, in other words, circulating freely—and unconfidentially—in the public realm. . . . [S]ince the 1940s, a person's genetic makeup has been articulated as the watermark of his or her individuality, as the most private of properties, as the epitome of that which is assured government protection by the right to privacy. As such, it can be said to be sine qua non of full citizenship. The fact that the screening results did not remain strictly confidential indicates that a private sphere—and therefore full citizenship—was not immediately available to blacks. (p. 122)
Sickle cell “control” programs, focused almost exclusively on African Americans, were aimed at limiting the spread of the disease by limiting reproductive choices, thus further stigmatizing blacks. They led to deep suspicions
among blacks about governmental conspiracy and even accusations of genocide. By the mid-1970s, many sickle cell screening programs had been discontinued, and the Act is now widely considered a major policy failure. However, as Hubbard and Wald (1999) note, the old screening test used in the 1970s was not capable of detecting the actual sickle cell allele, only the presence of sickle cell hemoglobin in the blood (p. 34). Since then, a new test has been developed that can detect the sickle cell allele itself, which opens up the possibility of fetal screening for the trait or the anemia. The test has renewed interest in reestablishing sickle cell screening programs on the grounds that it would give parents who are both carriers the informed option to decide whether or not to continue a pregnancy. The old question of whether two parents with the sickle cell trait are "unfit" and should be discouraged from reproducing, since their offspring stand a one in four chance of being born with sickle cell anemia, has returned, along with new questions, such as whether the new test will become mandatory. Will parents who wish to continue their pregnancy after a positive fetal test result be counseled to abort the fetus? Will they be accused of endangering their offspring if they don't? Will the fetus, once born, face genetic discrimination for having a "preventable genetic flaw"?
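The Mendelian arithmetic behind these figures can be made explicit. The following is a simplified sketch, assuming a single recessive allele, random mating, and the carrier frequency of roughly one in ten cited earlier:

```latex
% Each carrier parent (genotype AS) passes the sickle allele S with
% probability 1/2, so a child of two carriers is affected (SS) with probability
P(\mathrm{SS} \mid \text{both parents } \mathrm{AS})
  = \tfrac{1}{2} \times \tfrac{1}{2} = \tfrac{1}{4}.
% Combining this with the chance that both parents are carriers:
P(\text{affected child})
  \approx \left(\tfrac{1}{10}\right)^{2} \times \tfrac{1}{4} = \tfrac{1}{400}.
```

This estimate is broadly consistent with the cited prevalence of roughly one in five hundred African Americans, and it underscores the distinction, so often blurred in the screening programs, between the one-in-ten carriers of the trait and the much smaller number who actually have the disease.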
CONCLUSION: DNA AND DIFFERENCE

As DNA research progresses, old questions are resurfacing about human origins and the "biology of race." Historically, biological notions of racial difference have been used to establish superiority and inferiority. Differences have been used to categorize and stratify humans; they are often represented as deviations from a norm, a racial ideal explicitly coded as white, male, Christian, and European. Wherever the search for differences has taken place, so too has the search for their cause, a search that has been used not only to justify discrimination but also to pursue a "cure" for difference. Looking at today's popular media, there is little to suggest that developing genetic research is immune from problems similar to those that plagued past research. Consider the 29 January 2001 issue of U.S. News & World Report, which featured a cover story by Nancy Shute entitled "Tracing Your Genetic Roots: DNA Mapping in Unraveling the Mystery of Human Origins."3 Shute's article begins by describing a Jewish man's quest to find his genetic ancestors and relatives. Using the services of a small Houston firm, Family Tree DNA, the man submitted his

3 On page 37, the Hirszfelds are incorrectly identified as immunologists, when in fact Hanna was a pediatrician. In her explanation of the Hirszfelds' discovery of the prevalence of blood groups among certain ethnic populations, Shute offers a superficial explanation of their findings. She claims, "Blood types were used to prove that the Romany, or Gypsies, were correct when they claimed that they originally came from the Indian subcontinent, not Europe" (p. 37). Of course, this is not how the Hirszfelds' research was used by the Nazis.
DNA to be matched with the DNA of others in the firm's database. This search resulted in a "match" with another man with similar genetic markers, and Family Tree DNA determined that both men shared a lineage traceable to the 3,000-year-old priestly cohanim group, thought to be the descendants of the biblical Aaron. Another woman featured in the article, an Alaskan Aleut, is participating in a University of Kansas study, contributing a small sample of her DNA to the university's effort to establish proof of the migration of her ancestors from Asia thousands of years ago. Shute cites the Jewish man and the Aleut woman as exemplary of a universal desire to understand human origins. Both are portrayed as beneficiaries of new advances in genetic research, which Shute claims will make the mapping of human history possible. But a comment made by the Aleut woman is particularly telling with respect to what I consider naïve optimism about the emerging physical anthropology. She tells Shute, "I think [contributing my DNA] is important. People always acted like because we were so far away we were a substandard species. It proves we were out here for a long, long time" (cited in Shute, 2001, p. 40). According to some evolutionists, however, being an ancient species and being a substandard one are not mutually exclusive. While this woman believes that "being out here for a long, long time" will invoke respectful attitudes, there is no guarantee that others will see it this way. It is just as likely that the data could be interpreted by those with another agenda as evidence of the less evolved status of Aleuts relative to other populations. This possibility became uncannily clear to me as I noticed that of the six people profiled in the article, only one is not explicitly identified as a member of a minority group.
I then turned to a map within the article that depicts how research on mitochondrial DNA suggests that humans share a common ancestral origin in Africa about 150,000 years ago. This map, which extends across about a page and a half of the six-page article, shows hypothetical migration routes out of Africa diverging in the Middle East and continuing through Europe and Asia; it eerily resembles a map created by the Nazis to explain the origin of the human races (cf. Linke, 1999). Although the article cautions against the conflation of DNA and identity, it does just that by privileging this map. Having uncritically represented a singular origin of the races and their subsequent admixture, it supports the notion of originary racial purity and invites speculation about the corrupting effects of miscegenation (indeed, a sidebar directly asks, "Did Early Man Mix it Up?"). This map, like the Nazi one, is conjecture, but it is represented as the legitimate result of cutting-edge scientific research. As Nelkin and Lindee point out about the persuasive power of maps:

The apparent precision of a map may make invisible the priorities and interests that shaped it. As forms of knowledge, all maps reflect social perspectives on the world at the time of their making; they are products of cultural choices. Maps select and link features of the world, in effect transforming those features by making them part of a coherent, single landscape. (Nelkin & Lindee, 1995, p. 8)
Fig. 2. Source: http://www.genetree.com.
The solution to this dilemma is not to deny human differences, but rather to think through the stakes involved in the way we conceptualize and represent differences. As this discussion has shown, we ought to be wary of any promise, genetic or otherwise, that claims to provide the keys to human origin or nature. Given the history of American racial discourse in the twentieth century, twenty-first-century genetic discrimination, like other forms of discrimination cloaked in scientific terms, is most likely to affect those who are already discriminated against on the basis of their minority status. This status is determined by others' negative perception of their difference, which is both essentialized and pathologized. As scientific "progress" marches on, we must be vigilant not to falsely situate biology as the supreme locus of identity, lest we fall prey to a repetition of the physical anthropology of the past, which would amount to an intellectual and ethical devolution. In our quest to improve humans, we must always remember that there is no such thing as a humane eugenics. Do you know where your DNA is? (See Figure 2)

REFERENCES

Begley, S. (2000, April 10). Decoding the human body. Newsweek, 135(10), 50–57.
The Bleeding Edge. (2000, January 5). Retrieved from http://www.thebleedingedge.org.
Congressional Quarterly Researcher. (2000). Eugenics' dark past–and bright future. The CQ Researcher: Human genome research—does it open the door to genetic discrimination, 10(18), 414.
Foucault, M. (1978). The history of sexuality. Volume 1: An introduction (R. Hurley, Trans.). New York: Vintage.
Hirschfeld, L., & Hirschfeld, H. (1919). Serological differences between the blood of different races: The results of researches on the Macedonian front. Lancet, 2, 675–679.
Hubbard, R., & Wald, E. (1999). Exploding the gene myth: How genetic information is manipulated by scientists, physicians, employers, insurance companies, educators, and law enforcers. Boston: Beacon Press.
Linke, U. (1999). Blood and nation: The European aesthetics of race. Philadelphia: University of Pennsylvania Press.
Love, S. (1996). One blood: The death and resurrection of Charles R. Drew. Chapel Hill, NC: University of North Carolina Press.
Nelkin, D. (1999). Cultural perspectives on blood. In E. A. Feldman & R. Bayer (Eds.), Blood feuds: AIDS, blood, and the politics of medical disaster (pp. 273–292). New York: Oxford University Press.
Nelkin, D., & Lindee, M. S. (1995). The DNA mystique: The gene as a cultural icon. New York: W. H. Freeman and Company.
Proctor, R. N. (1988). Racial hygiene: Medicine under the Nazis. Cambridge, MA: Harvard University Press.
REPOhistory. (2000, September 6). Circulation. Retrieved from http://www.repohistory.org/circulation.html.
Reverby, S. M. (Ed.). (2000). Tuskegee's truths: Rethinking the Tuskegee syphilis study. Chapel Hill, NC: University of North Carolina Press.
Sholette, G. (1999). Authenticity2. REPOhistory: The anatomy of an activist urban art project. New Art Examiner. Retrieved September 6, 2000, from http://www.newartexaminer.org/1199 authencity.html.
Shute, N. (2001, January 29). Where we come from: Recent advances in genetics are starting to illuminate the wanderings of early humans. U.S. News & World Report, 34–41.
Starr, D. (1998). Blood: An epic history of medicine and commerce. New York: Alfred A. Knopf.
Tapper, M. (1999). In the blood: Sickle cell anemia and the politics of race. Philadelphia: University of Pennsylvania Press.
Watson, J. D. (1999, January 11). All for the good: Why genetic engineering must soldier on. Time, 91.