
Discuss the ways in which the 20th century may or may not have been a century of progress


Was the world better off in 2000 than it had been in 1900?


The World Science Built

◆ Atoms, Elements, and Germs: Science Reveals the Universe

A new scientific revolution easily kept pace with the technological advances of the age. Ever since the ancient Greeks, Europeans had assumed that all matter was made up of four “elements”: earth, air, fire, and water. Not until the 1780s, when Antoine Lavoisier (1743–1794), the father of modern chemistry, began to study the nature of fire, was that notion finally abandoned. A lawyer by training and a partner in the Ferme Générale, the private syndicate that collected France’s taxes, Lavoisier was an ardent scientist as well. Testing the properties of oxygen, only just isolated, Lavoisier showed that fire was not an element but the result of the combination of more basic substances. His research led to the conclusion that in nature no matter was ever lost (the law of the conservation of matter). Lavoisier ended his distinguished career by drawing up a list of 32 known elements, preparing the way for advances by his successors. In 1808, John Dalton (1766–1844), an English chemist and schoolteacher who experimented with gases, concluded that each of Lavoisier’s elements was composed of identical atoms and that each element could be distinguished from another by its atomic weight. His calculation of the weights of various atoms led to a theory about how elements form compounds. When, in 1869, the Russian chemist Dmitri Mendeleev (1834–1907) put all known elements in order of atomic weight, he found they grouped themselves into several families sharing common properties. Since some families were missing some elements, he assumed (correctly) that these would subsequently be discovered, or even created, by humans. Mendeleev’s periodic table of the elements, the foundation of physical chemistry, remains one of the greatest achievements of nineteenth-century science.

Shortly after mid-century the German botanist Ferdinand J. Cohn (1828–1898) studied and classified the microscopic organisms called bacteria, which he suggested were the causes of many diseases. The Scottish surgeon Joseph Lister (1827–1912) created new antiseptic practices that helped fight infection by killing these bacteria. The Hungarian Ignaz Semmelweis (1818–1865) realized that childbed fever was caused by germs carried from patient to patient by doctors who saw no need to wash their hands or instruments. Semmelweis died of an infected wound long before the medical profession accepted his findings, but today’s antiseptic practices are based directly on them. The germ theory of disease was finally proven by Louis Pasteur (1822–1895) in France and Robert Koch (1843–1910) in Germany. A paper on “Germ Theory and Its Application to Medicine and Surgery,” read by Pasteur before the French Academy of Sciences on April 29, 1878, and dealing with his experiments on the anthrax bacillus and the bacteria of septicemia, is usually taken to mark the public debut of germ theory. A crude but occasionally effective inoculation against smallpox, originally developed in China, had been known in Europe for centuries. But only during Pasteur’s fight against anthrax (a disease affecting sheep) in the 1870s did he begin to understand why vaccinations worked. Applying his insights to humans, Pasteur developed an effective rabies vaccine using a weakened form of the disease. Persons who received the vaccination escaped rabies, which otherwise killed virtually all of its victims. Pasteur always emphasized the practical applications of his theoretical experiments. We see this every time we pick up a container of pasteurized milk, although Pasteur first developed the process of gentle heating to kill bacteria to help France’s wine and beer industries.
So eager was Pasteur to further knowledge of science that he used his prestige to institute evening university classes for working men. In Germany, Robert Koch, who had been a field surgeon during the Franco-Prussian War (1870–1871), pioneered research into other “germs,” eventually discovering the organisms that caused eleven different diseases, including cholera (1884) and tuberculosis (1882). He was awarded the Nobel Prize in Physiology or Medicine in 1905 for his work on tuberculosis; the tuberculin he developed remains the basis of the skin test for exposure to the disease still in use today. He also developed many of the techniques still used to grow bacteria in a laboratory.

These medical discoveries began to have an immediate impact on the lives of people in industrial societies as governments suddenly found themselves in the business of keeping things clean. Great Britain passed laws in 1875 requiring local authorities to maintain sewers, forbidding the building of any new houses without a toilet, and outlawing the sale of foods colored or stained to look fresher than they were. Jacob Riis (1849–1916) emigrated from Denmark to the United States as a young man. Working as a police reporter for newspapers in New York, and then with his photograph-filled book, How the Other Half Lives (1890), he publicized the terrible living conditions in the slums that housed an ever-growing population of workers. Riis’s work inspired stronger public health and housing laws. The city even went so far as to buy up land in upstate New York to keep development from contaminating the source of the city’s water supply.

Some historians of science believe the greatest advances of the century were made in physics. Many of these grew out of the process of industrialization. Working to improve techniques for boring metal cannon in 1798–1799, the American-born Benjamin Thompson (1753–1814) demonstrated that the activity generated a seemingly limitless amount of heat. Since no material substance could be produced in unlimited quantities, his experiments showed that heat was a kind of energy, not a material thing. Using these findings, Hermann von Helmholtz (1821–1894) of Germany was able to formulate the law of the conservation of energy (1847). A counterpart of Lavoisier’s law of the conservation of matter, Helmholtz’s law held that, although energy could be converted from one form into another, there could be no addition to, nor subtraction from, the total amount of energy in the universe. Because this law applies not only to heat but also to electricity, magnetism, and light, it was one of the most important scientific generalizations of the nineteenth century. But advances in physics were not limited to theory; many of them had an immediate impact on everyday life. In Great Britain, Michael Faraday (1791–1867) helped develop the dynamo, a machine for generating the electric current that could then be carried over long distances. Faraday’s ingenuity made possible public lighting systems, telephone networks, and the development of the electric motor.
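In modern notation (a restatement supplied here for illustration, not Lavoisier’s or Helmholtz’s own wording), the two conservation laws described above say that in a closed system the totals of matter and of energy never change:

\[
\frac{dm_{\text{total}}}{dt} = 0,
\qquad
\frac{dE_{\text{total}}}{dt} = 0,
\quad\text{where } E_{\text{total}} = E_{\text{heat}} + E_{\text{electrical}} + E_{\text{magnetic}} + E_{\text{light}} + \dots
\]

Energy can shift from one term to another, as in Thompson’s cannon-boring experiment, where mechanical work became heat, but the sum stays fixed.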

By the century’s end, however, physicists were challenging the accepted nature of the universe itself. Not only were atoms not the smallest units of matter in the universe, many of them were also structurally unstable. Physicists themselves were startled in 1895 when Wilhelm Röntgen (1845–1923) of Germany reported a strange ray he detected while sending electric current through a glass tube from which most of the air had been removed. Röntgen named what he saw the “X-ray” because he was uncertain of the ray’s exact nature, although he believed it was a form of electromagnetic radiation like light but of a shorter wavelength. Future experiments proved his belief correct. For his discovery, Röntgen was later awarded the very first Nobel Prize in Physics (1901). In France, Henri Becquerel (1852–1908) discovered that uranium compounds also gave off a form of radiation; the papers he published in 1896 gave modern physics a new direction. Maria Skłodowska-Curie (1867–1934) coined the term “radioactivity” in 1898 for the phenomenon first observed by Becquerel. Building on Becquerel’s work, she and her husband Pierre (1859–1906) demonstrated that radioactivity was an atomic property of uranium and isolated two more radioactive elements, radium and polonium. The Curies and Becquerel shared the Nobel Prize for Physics in 1903. In Britain, Joseph John Thomson (1856–1940) built on Röntgen’s work to give humanity a glimpse inside the atom with his discovery of the electron in 1897, for which he was later knighted. Ernest Rutherford (1871–1937) later suggested that each atom had a central, positively charged nucleus, which was separate from its negatively charged electrons. Radioactivity was caused by particles and energy escaping from unstable atoms.

X-rays, radioactivity, and the electron theory challenged one of the most dearly held beliefs of science, the idea that matter was indivisible and continuous. The work of Röntgen, Becquerel, the Curies, Thomson, and Rutherford cleared the way for a new understanding of the universe. The universe was neither solid nor stable, but composed of energy only precariously bound into atoms. The single, simple “theory of everything” the Scientific Revolution thought it had found in Sir Isaac Newton’s law of gravity receded further and further into the distance.

The greatest challenge to that theory came from Albert Einstein (1879–1955). The son of a German Jewish electrical engineer, Einstein gave little evidence of genius during his school days. Unable to find a university post, this graduate of the Swiss Polytechnic Institute supported his family as a patent office clerk in Switzerland. Yet in 1905, at the age of 26, he published three articles in the same volume of the Annals of Physics that altered the history of the century (the first of which, on the photoelectric effect, won him the 1921 Nobel Prize in Physics). The third essay became the theory of special relativity. Science had already shown that light moved in a straight line and at a constant speed no matter the vantage point. But from this fact, Einstein drew seemingly outrageous conclusions. He demonstrated that, when observed, a moving clock ran more slowly than a stationary one and a moving object shrank in the direction of its motion. He used the example of two strokes of lightning hitting a railway embankment at two equidistant points, one in front of and one behind a train moving at a constant speed.

MARIA SKŁODOWSKA-CURIE (1867–1934)

Maria Skłodowska was born on November 7, 1867, in Warsaw, while Poland was part of the Russian Empire. Her father taught mathematics and physics, and her mother ran a school for girls. But the death of her mother from tuberculosis when little Mania (as she was called) was only eleven marked a severe change in the family’s fortunes. Both Mania and her older sister Bronia (Bronisława) had to go to work after graduating from the Russian lycée (secondary school). There was no university education for girls in Poland, but both Skłodowska girls wanted to be scientists, so they worked out a plan to take turns paying for each other’s way to Paris. Mania tutored Polish working women as part of a nationalist “free university” and worked as a governess to pay Bronia’s way to medical school in Paris. Then, in 1891, with Bronia’s help, Mania followed her sister to Paris and took university degrees at the Sorbonne in physics (1893) and mathematics (1894). In 1894 she also met Pierre Curie (1859–1906), who had risen from laboratory assistant at the Sorbonne to supervisor at the School of Physics and Industrial Chemistry in Paris. In 1895, shortly after Pierre successfully defended his doctoral dissertation on magnetism (the source of Curie’s law: the magnetic coefficients of attraction of paramagnetic bodies vary in inverse proportion to the absolute temperature), Pierre and Marie (as she was called in France) were wed on July 25. In 1896, while working with uranium ore, Henri Becquerel (1852–1908) discovered the phenomenon that Maria would later name radioactivity. Searching for a topic for her own doctoral dissertation, Maria looked to extend Becquerel’s discoveries to other substances. Maria and Pierre first worked with pitchblende (a major source of uranium) and together discovered two new elements in 1898: polonium (named by Maria for her homeland) and radium. Becquerel and the Curies shared the 1903 Nobel Prize for Physics for their work on radioactivity. The Prize won Maria her doctorate, but Pierre got the job, a professorship at the Sorbonne in 1904. The Curies had two daughters—Irène (in 1897) and Ève (in 1904)—before Pierre was killed on April 19, 1906, when he was run over by a horse-drawn cart. Maria had been working as a lecturer in physics at a normal school for girls (a teacher’s college) since 1900, but on May 13, 1906, she was appointed to fill Pierre’s professorship, the first woman to be so honored. In 1911, she won a second Nobel Prize, this time in Chemistry, for her continuing work on radium. The Radium Institute at the University of Paris was opened under her direction in 1914. Throughout the First World War, assisted by her daughter Irène, Maria Skłodowska-Curie worked on the medical uses of X-radiography. She set up X-ray units for military hospitals and herself braved the dangers of the trenches. Working for the League of Nations after the war, Maria publicized the practical and theoretical uses of radioactive materials. Irène and her husband Frédéric Joliot continued to advance Maria’s work, and their discovery of artificial radioactivity also won a Nobel Prize. On July 4, 1934, just a few months after her daughter’s discovery, Maria died of aplastic anemia, probably caused by her years of exposure to radiation.
An observer outside the train would see the two strokes strike the embankment at the same time, but for someone in the moving train, the front stroke would seem to hit the embankment first. Although such distinctions were important only for objects moving close to the speed of light, Einstein had shown time to be relative. Space, he concluded, was also relative, since the length of material bodies could not be objectively measured, being dependent on the speed at which they were moving in relation to the observer. Time and space were not separate entities but rather joined in a continuum, and both were relative to the position and speed of the measurer. Almost as an afterthought, Einstein added that energy and matter were not different things but different states of the same thing. Indeed, matter could be converted into energy as expressed in the famous formula E = mc². Matter was stored (latent) energy. Since the energy contained in any object was enormous relative to its mass, a small object could potentially release a tremendous amount of energy. In this formulation the Atomic Age was born. Einstein now had no difficulty finding a professorial position. At the University of Prague in 1911, he began to assess the workings of gravity within such a world and soon proposed his general theory of relativity (1915). His theory can be approached by thinking of objects placed on a rubber sheet: the weight of the objects will cause the sheet to sag. This creates a “dimple,” or a curve within space/time. Other objects passing by this depression would then roll into it: hence, gravity. Einstein’s prediction that light waves were also subject to the force of gravity was proven correct in 1919 by measurements made during an eclipse of the sun. Suddenly Einstein was an international celebrity.
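To give a sense of the scale Einstein’s formula implies (a back-of-the-envelope illustration added here, not a calculation from the text), converting a single gram of matter entirely into energy would yield

\[
E = mc^2 = (10^{-3}\,\text{kg})\,(3\times 10^{8}\,\text{m/s})^2 = 9\times 10^{13}\,\text{J},
\]

roughly the energy of a twenty-kiloton explosion. The slowing of moving clocks he described is likewise governed by a simple factor: an interval \( \Delta t \) ticked off by a clock moving at speed \( v \) is measured by a stationary observer as

\[
\Delta t' = \frac{\Delta t}{\sqrt{1 - v^2/c^2}},
\]

which is why the effect is invisible at everyday speeds and becomes dramatic only as \( v \) approaches the speed of light \( c \).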

Living with the Modern World

◆ The Triumph of Technology

Nineteenth-century society became ever more susceptible to materialism as industrialization continued to alter everyday life. At the beginning of the century sailing ships took weeks to cross the Atlantic, but steam-engine-powered ships measured the trip in days before the century’s end. Transportation overland also accelerated dramatically as networks of railroads spread across Europe, North America, India, and parts of South America. By 1905 the 5,542 miles of the Trans-Siberian Railroad, running from Moscow to Vladivostok, united Russia’s Asian lands with Europe. With each decade the products of far-flung areas were made ever more available to manufacturers and consumers. Communications across nations and empires were speeded up by the development of the commercial telegraph (1844) and the laying of the Atlantic Cable (1866) that allowed it to cross an ocean, by the telephone (1876), and by the wireless (1895), the precursor of radio and television. By the end of the century a single inventor, Thomas Edison (1847–1931), a self-taught former telegraph operator, held over a thousand patents and was responsible for such diverse changes as the lighting of cities, the phonograph (1877), and the motion picture (1896). London’s steam-powered underground railway opened in 1863. On October 27, 1904, New York’s first subway line, the IRT (Interborough Rapid Transit, better known today as the 1, 2, 3, 4, 5, 6, and 7 lines), opened after four years of tunneling; eventually there would be more than 700 miles of track in the world’s most extensive rapid transit system. The first internal combustion engines were powering cars, boats, and cycles by century’s end, and on a cold and windy December 17, 1903, Orville Wright (1871–1948) and Wilbur Wright (1867–1912), two bicycle mechanics from Dayton, Ohio, ushered in the age of the airplane with their flights over the North Carolina coast. It was hardly surprising that most of Western society was confident technology would secure its control over the world’s riches and ensure its material well-being.

◆ Realism and the Middle Classes

Realism and naturalism were the literary counterparts of the materialism that dominated the late nineteenth century. Realists scoffed at literature that showed a world peopled by demons, individuals laboring under curses, and medieval lords and maidens. Instead, realist writers found the drama in the lives of ordinary people in everyday surroundings. They were as critical of modern society as were the romantics, but preferred to expose the harsh reality of modern life in the hope of improving it. Honoré de Balzac (1799–1850), whose ninety-one-volume Human Comedy appeared between 1827 and 1847, was the founder of French realism. A prodigious writer who put in fourteen- to sixteen-hour days fueled by endless pots of coffee, he spent his off hours in an equally extravagant social whirl that kept him in constant debt. The son of a peasant and a woman of middle-class background, he knew France from the bottom up. His novels threw a merciless light on the greed and jealousy within middle-class society. The heyday of realism was also the first great era of mass culture. As urban growth rates exceeded the growth rate of the population at large, the middle and working classes became prominent forces in the cultural life of their nations. Middle-class novels, often appearing in weekly installments in the popular press before being packaged as books, added a sentimental gloss to the harsh realist vision.
In Britain, Charles Dickens (1812–1870), who was forced into factory work when his father was imprisoned for debt, combined realism with Victorian sentimentality. His Oliver Twist (1838) told the story of a young boy born into a workhouse who, after many misadventures among London’s criminal classes, ends up in the arms of his long-lost upper-middle-class family. Samuel Clemens (1835–1910), the American riverboat pilot and journalist who wrote under the pen name Mark Twain, was the most successful writer in the United States. He even set up his own publishing company to keep up with the demand for his works. Twain’s novels included warts-and-all depictions of the pre-Civil War South (Huckleberry Finn, 1884, and Tom Sawyer, 1876) that remain controversial classics to this day. The new middle classes were avid readers, but they preferred their literature discreet. Gustave Flaubert (1821–1880) turned to writing as a profession when a nervous condition (thought to be epilepsy) sidelined his legal career. His Madame Bovary (1857) was the story of a provincial middle-class wife who betrayed her husband out of boredom. The novel’s sympathetic description of her adultery so offended public sensibilities that Flaubert was tried—although not convicted—for offending public morals.

At the same time, department stores began selling a seemingly endless variety of affordable machine-made goods. Distinctions in dress between classes began to disappear when almost everyone had access to similar ready-made clothing. Advertising and mail-order marketing further eroded such differences by making the latest urban fashions available and desirable to people all over the country. Mass media brought the same ideas in the same newspapers and magazines to everyone. The latest installment of a new novel by Dickens crossed the Atlantic within weeks. Different forms of mass entertainment appeared one after the other. The United States had its first professional baseball league in 1871. By 1900 there were two leagues, and by 1903 a World Series between them. Basketball, invented in 1891, had a professional association taking root by 1898. The first professional football teams were organized in Britain in 1893, and the first automobile race was held in France in 1894. Store-front theaters began to spring up to show off the brand-new invention of moving pictures.

The new middle classes even had their own high art. Impressionism, which developed primarily in France as early as the 1860s, rejected the high finish of classical realism. The founders of the movement were fascinated by the role of light in determining the appearance of the physical world. They hoped to capture the fleeting and ever-changing impression that objects made on the eye. The development of the camera in the 1840s had seemed to threaten the future of painting as a recorder of reality. But by capturing the changing effects of light on objects, impressionists cast doubt on the whole notion of an “objective” visual reality. Their subject matter was revolutionary as well. Leaving aside the large-scale history paintings and subjects drawn from Greek mythology, impressionists painted picture after picture of Sunday boaters, the middle class out for dinner at a café, a picnic, or a night at the theater. Ballet dancers, café singers, acrobats—all appeared on impressionist canvases.
Originally shunned by the art establishment, the impressionists were eventually adopted by an increasingly confident middle class delighted to see itself depicted in the colorful canvases of such painters as Pierre Auguste Renoir (1841–1919).

◆ Alternate Visions

But not everyone bought the picture of a sunny, modern world. Even in painting, the artists known as post-impressionists or expressionists rejected the middle-class picnics and flower gardens that were common subjects of the impressionists. Paul Gauguin (1848–1903) abandoned his family and a career in the stock exchange to explore the mysteries of primitive peoples by living and painting in the French colony of Tahiti. He asserted that “primitive” art still retained that sense of wonder at the world that Western civilization had lost, and his paintings attempted to portray the mystical elements of Polynesian life. In France, Paul Cézanne (1839–1906) rebelled against the legal career his well-to-do parents had laid out for him and pursued the artistic life of bohemian Paris. Eventually rejecting the representational painting of the impressionists, Cézanne developed a style based on geometric forms that led to cubism in the twentieth century.

Another forceful rejection of the age’s worship of everything modern came from the pen of the German philosopher Friedrich Nietzsche (1844–1900). In works such as Thus Spake Zarathustra (1883–1884), Beyond Good and Evil (1886), Genealogy of Morals (1887), and Twilight of the Idols (1888), Nietzsche passionately criticized Western culture, Christianity, and human conformity. He believed Christianity was based on a deep resentment of this world, the resentment of the powerless lower classes in the ancient Roman Empire among whom it had first spread. It fostered a “slave morality” (only a slave would “turn the other cheek” when struck, according to Nietzsche) that must and would be overcome as the world came to accept that “God is dead.” In the future, a finer type of man, the Übermensch (superman), modeled on the masters of the ancient world, would emerge free of foolish illusion and capable of moving humanity to a higher level of existence. Such men would take joy in whatever the universe threw at them and in their own irrational instincts, including violence, the desire for power, and the thirst for beauty. Characterized by their courage, intellectual energy, and beauty of character, these new men were destined to become the “lords of creation.” Although he believed no moral viewpoint could be imposed on all individuals, Nietzsche believed in the future greatness of mankind and success for nations such as Great Britain, Russia, and the United States. In Nietzsche’s view, all that sprang from power was healthy, while all that sprang from weakness was evil.

While Nietzsche’s criticism of modern society as a mediocre hypocrisy was made easier to ignore by his eventual descent into insanity (probably the result of untreated syphilis), Freud’s attack on modern society’s claim to rationality proved harder to dismiss. The nineteenth century had not only inherited a physics of certainty from the Enlightenment, it had also inherited a psychology of rationality. The philosophes had seen the human mind as a machine reacting to physical stimuli in a calculating and mechanical fashion. Adam Smith had written of the enlightened (educated) self-interest that underpinned the social division of labor (and was best left unregulated by governments). Jeremy Bentham had written of a “calculus of pleasure” (or utility). The romantic reaction of the nineteenth century had stressed the emotional side of human nature, but had never been able to displace that basic faith in rationality. Auguste Comte and the positivists placed their faith in a human intellectual evolution as guaranteed as that of the Social Darwinists.

The work of Sigmund Freud (1856–1939) upset that presumption of certainty as completely as Einstein’s destroyed the notion of a fixed universe. Freud, a lecturer in neuropathology at the University of Vienna, was deeply interested in aspects of the mind that seemed to operate outside the control of conscious thought. While using hypnosis to treat “hysteria” (a catchall term for symptoms without apparent physical causes), he found that hypnotic trances often brought out forgotten memories of youthful experiences in his subjects. These memories seemed connected to the hysterical symptoms. Freud speculated that there was an unconscious part of the mind that had a greater potential effect on waking behavior than did rational mentality. His first book, Studies on Hysteria (1895), suggested that doctors might be able to focus on the source of a patient’s ailment by a method of “free association” (the “talking cure”). Further research led Freud to conclude that dreams depicted, in symbolic form, the desires and conflicts of the unconscious elements of the mind. In The Interpretation of Dreams (1899), Freud argued that there were no accidents in mental processes and that the struggle between the conscious and unconscious mind had to be interpreted by an expert analyst. Continuing his work, Freud gradually developed a picture of a human mind divided into three parts: the ego (the mediating center of reason), the superego (the internalized restraints of society), and the id (the “primitive” sexual and aggressive drives). In Civilization and Its Discontents (1930), Freud argued that the conflict between humanity’s unconscious animal drives and the constraints of society not only caused a vast number of neuroses in troubled people but also threatened civilization itself. He believed the id’s lust for violence explained why war was so endemic in human history.

Almost from the start, the psychoanalytic movement was divided into factions. Freud’s own students remained divided over which “animal” drives played the greatest role in shaping the human personality, the extent to which childhood repression was the key to adult behavior, and the extent to which humans exhibited a “collective” (ethnic, national, or racial) as well as an individual personality. Many of the conditions studied by Freud and his students have since been found to have chemical causes.
The modern psychiatrist uses magnetic resonance imaging, psychotropic drugs, electroconvulsive therapy, and laser surgery as well as Freud’s “talking cure.” Today, when psychotherapy is used, it is as likely to be the short-term behavioral approach as the classic Freudian variety. But the impact of Freud’s theories on the wider cultural consciousness remains; people still talk about “Freudian slips,” “repression,” and “sublimation.” As for Freud himself, his pessimistic view of human nature did not necessarily translate into an ability to recognize true evil when it appeared. On March 13, 1938, Hitler’s forces moved into Austria, but Freud, a Jew, believed his fame protected him from harm and refused to leave. Even after the Nazis ransacked his house, he held to his belief that Nazism was a fleeting excess. Only after his daughter Anna was arrested and briefly held by the Gestapo did Freud agree to leave his home for exile in Britain. He was not allowed to take his sisters out of the country, and they eventually died in German concentration camps. He died in England on September 23, 1939, after deliberately taking a lethal dose of morphine to end the pain of inoperable cancer.

Perhaps the highest celebration of Europe’s self-confidence was the Paris Exposition Universelle of 1889. Millions of tourists descended on Paris to visit the two-hundred-acre site of the largest world’s fair ever held. It celebrated the triumphs of modern technology, “the living connection between men and things.” Tourists gazed in awe at the Eiffel Tower, the tallest structure in the world, and felt sure that progress would continue indefinitely and that Europeans would lead it. The fair ushered in La Belle Époque, a time of peace and shared confidence in the future. It was easy for bourgeois society to believe that technology would harness nature to create ever greater wealth for modern nations. Social Darwinists reassured western Europeans that imperial possessions were rightfully theirs because they were the “fittest,” the highest product of human evolution. Any problems that remained were only material, and these would be ended as the technological revolution continued.

Under the surface calm of La Belle Époque, however, the problems of the nineteenth century remained. European intellectual life was a battlefield of opposing opinions whose conflicts had not been resolved. Belief in science, progress, and positivism continued strong, but not everyone shared in middle-class prosperity. The capitalist world economy was thriving, the bourgeoisie reveled in the sanctity of private property, and Britain was still the center of world finance. But Marxist socialists in every nation, convinced of the illegitimacy of capitalism, sought a future of collective ownership, the elimination of states, and an economy of shared wealth. Liberals preached the virtues of constitutional government, individual freedoms, and parliamentary representation, while in Eastern Europe autocracy remained predominant. Half the human race was denied basic rights because it was female. Social Darwinists claimed to document immutable differences between the human “races” and predicted a pitiless biological struggle that would end in the enslavement or extermination of some peoples. Democracy and imperialism, civil rights and racism, and capitalist economies and Marxist political parties co-existed uneasily. Despite its wealth and power, Europe was deeply uncertain of its future direction.
The good life of La Belle Époque would soon end as the nations of Europe found themselves involved in the very kind of war they thought their civilization had rendered impossible.

The Twentieth-Century Legacy

The modern world was born in revolutions: scientific, national, and industrial. The values those revolutions embodied were spread by the steam engine, the railroad, and the machine gun until there was almost no place in the world where they were not known, if not necessarily adopted. But the meaning of modern and the societies so labeled continue to change. What seemed modern in the nineteenth century may be antique today. The only constant seems to be the increasing pace of change. In the twentieth century, humanity added more to its accumulated knowledge than in any previous period of history. We split the atom, landed on the moon, and began deciphering the genetic code. We created vaccines against smallpox, polio, mumps, measles, chicken pox, diphtheria, and tetanus. Our surgeons routinely transplant corneas, livers, kidneys, bone marrow, and hearts. Lasers facilitate the most delicate brain surgery. Cochlear implants, pacemakers, and artificial valves and joints are turning us into bionic men and women. Our knowledge of genetics verges on the ability to create “designer” human babies. This chapter looks at the complex legacy—in science, human rights, population, and economic integration—that the twentieth century has left to the twenty-first.

Science and the Universe of Uncertainty

◆ The Double Helix

Some of our most heralded scientific triumphs created as many questions as they answered. In 1962, two molecular biologists, Francis Crick (1916–2004) of Great Britain and James Watson (b. 1928) of the United States, won the Nobel Prize (shared with Maurice Wilkins) for their discovery of the double-helix structure of DNA, the molecule that carries the genetic code inside every life form. In the decades since their first discoveries in 1953, their work has been used to bio-engineer high-yield, disease-resistant rice in the fight against world hunger, to create sophisticated antiretroviral agents in the war against AIDS, to determine guilt or innocence in criminal cases, and to clone animals. A major effort of the Human Genome Project is to identify the genetic variants that predispose people to diseases such as schizophrenia or cancer, but at a conference held in Washington, DC, in July 2001, a debate broke out among the scientists over what would be done with their data. While many genetic variants are uniformly spread throughout the human race, some are linked to different human populations. Fears have been raised that a new form of Social Darwinism might emerge if records were kept of the ethnicity of people donating their DNA for mapping. Other debates have opened up over the possibility of cloning humans and the use of embryonic stem cells in the treatment of genetically transmitted diseases.
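The structural insight behind the double helix is that the four bases pair predictably, adenine with thymine and cytosine with guanine, so that each strand completely determines its partner. The short Python sketch below (an illustration only; the sequence and the function name are invented for this example, not taken from the text) shows how that complementarity lets one strand be reconstructed from the other:

# Watson-Crick base pairing: A pairs with T, C pairs with G.
PAIR = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complementary_strand(strand: str) -> str:
    """Return the partner strand implied by the base-pairing rules.
    The partner runs antiparallel, so it is read in reverse."""
    return "".join(PAIR[base] for base in reversed(strand))

# Hypothetical example sequence, used only to show the idea.
print(complementary_strand("ATGCGTA"))  # prints TACGCAT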

◆ Physics

The accomplishments of nineteenth-century medicine and technology gave rise to an attitude of confidence and certainty in Western culture. While the theories of Lyell and Darwin depicted an ever-changing universe, that change was seen to occur in a discoverable fashion and was popularly believed to be progressive in nature. Evolution was seen as a process of improvement. The work of Röntgen, Becquerel, and the Curies demonstrated that the world was not so certain a place. Atoms, presumed to be the basic building blocks of all matter, were not necessarily stable. Radioactivity was, after all, evidence of atomic decay. It seemed to violate the accepted immutability of elements and the assumed distinction between matter and energy. But this aspect of late nineteenth-century physics did not travel beyond the confines of the scientific community to capture the imagination of the general population in the way that Darwin’s theories had done. To the world at large, the discoveries in physics were seen chiefly as new medical therapies—the X-ray, radium treatments—available to the local physician. They were taken as proof of the increasing certainty of medicine rather than as evidence of an uncertain universe.

Einstein’s theory of relativity revealed gravity as the result of a space that curved around massive objects and time as a measure relative to the motion of the observer rather than the constant pulse the world had assumed it to be. In 1900, Max Planck (1858–1947) had stunned the German Physical Society by presenting a quantum theory that challenged our most fundamental assumptions about the natural world. As he studied the radiation given off by heated bodies, Planck found that the observed shortage of high-frequency radiation could be explained only if the exchange of energy between matter and radiation occurred not in a steady stream, but in discrete packets (quanta). In a sense, the entire universe was blinking on and off. Moreover, such emissions of energy occurred in unpredictable patterns. While Planck’s theories caused a great stir in the scientific community, they were not paid much attention in the mass media until Einstein’s theories had captured the popular imagination. But even Einstein was disturbed by the direction quantum physics took. He found it impossible to believe that the unpredictable whims of subatomic particles could create our intricate universe, and spent most of the rest of his life attempting to find a “unified field theory” that would resolve the seeming contradictions of the laws of gravity and electromagnetism. “The Lord God is subtle,” he said, “but malicious he is not.” “I shall never believe that God plays dice with the world.”1

Still more unsettling was the “uncertainty principle” proposed by the German physicist Werner Heisenberg (1901–1976). In 1927, Heisenberg noted that it was impossible in principle to determine simultaneously both the position and the velocity of very small, fast-moving subatomic particles; the very act of observing them would alter their motion. The science that had once offered certainty was now reduced to offering probabilities. Towards the end of the twentieth century, scientists proposed a world of eleven dimensions in which the most fundamental units were not particles but vibrating strings. Though this “string theory” cannot yet be demonstrated true or false by experiment, mathematically it is the only way physicists have been able to reduce the four known fundamental forces—gravity, electromagnetism, the strong nuclear force, and the weak nuclear force—into one.
Since the human mind can only visualize an object in three physical dimensions (for each plane must cross the others at right angles), no human being can even paint a picture of this universe.
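In symbols (a standard modern statement supplied for illustration, not drawn from the text), Planck’s quanta and Heisenberg’s uncertainty principle read

\[
E = h\nu
\qquad\text{and}\qquad
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2},
\]

where \( h \) is Planck’s constant (about \( 6.6\times 10^{-34} \) joule-seconds), \( \nu \) is the frequency of the radiation, \( \hbar = h/2\pi \), and \( \Delta x \) and \( \Delta p \) are the uncertainties in a particle’s position and momentum. Because \( h \) is so small, the graininess and the uncertainty matter only at atomic scales, which is why everyday objects still seem to obey Newton’s predictable laws.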

Human Rights

While the horrors of the Nazi death camps, revealed in 1945, may have helped discredit the notions of racial superiority and purity so popular in the 1930s, they did not sweep away the barriers to ethnic, religious, and gender equality still found throughout the world. The struggle to break down those barriers has been a major element of the postwar world. It has not been entirely successful.

◆ Civil Rights in the United States

In 1945 the official policy of the richest and most powerful nation on the earth was still that of the 1896 Plessy v. Ferguson Supreme Court decision. “Separate but equal” was the law of the land, even though separate was never equal. In the south, all public facilities (including school systems) were segregated. Poll taxes and “literacy” tests prevented African-Americans from registering to vote, and it was against the law for people of different races to marry. While such legal barriers did not exist in the north, more informal forms of discrimination assured white supremacy. The American armed forces that fought in World War II were officially segregated, although casualties suffered during the first two years of the war had led to piecemeal integration to keep each unit up to strength.

After the war, a new militancy on the part of civil rights groups (fueled in part by an infusion of African-American veterans into their ranks) combined with demographic changes to give a new impetus to the campaign for desegregation. The development of synthetic fibers after World War II helped reduce southern cotton acreage from 43 million acres in 1929 to fewer than 15 million in 1959. The southern farm population, white and black, fell from 16.2 to 5.9 million in the same period. Three million African-Americans moved to northern cities between 1940 and 1960 in search of jobs. This was a second “Great Migration” (the first having been during World War I). Able to vote, they helped make African-American demands for civil equality a national political issue.

The first public barrier to fall was in the “national pastime,” baseball, when, in 1947, Branch Rickey (1881–1965) of the Brooklyn Dodgers broke the major league race barrier by hiring Jackie Robinson (1919–1972) out of the Negro Leagues. While this opened the floodgates of professional sports to Americans of color, it had the unfortunate side effect of destroying the black-owned enterprises of the Negro Leagues. The other major civil rights victory of the late 1940s came as a result of African-American agitation. A. Philip Randolph, whose threat of a march on Washington in 1941 had brought about the desegregation of the defense plants, had switched his sights to the military itself. His League for Nonviolent Civil Disobedience Against Military Segregation saw its first victory on July 26, 1948, when President Truman issued Executive Order 9981, barring segregation in the armed forces. The military proved to be a major path to African-American advancement over the ensuing decades. Probably the most prominent example of this progress was the appointment of Colin Powell (b. 1937) as the chairman of the Joint Chiefs of Staff (1989–1993). He would later go on to serve as secretary of state under President George W. Bush.
Throughout the 1950s the federal government moved gradually to desegregate the military, the federal civil service, and interstate commerce, but the most prominent landmarks of the decade remain the 1954 Supreme Court decision ordering an end to public school segregation and the Montgomery bus boycott of 1955–1956. In Brown v. Board of Education of Topeka, Kansas (1954), the Supreme Court declared that segregated schools violated the Fourteenth Amendment. Reversing the “separate but equal” doctrine of Plessy v. Ferguson, the Court declared segregation unconstitutional because “separate educational facilities are inherently unequal.” The National Association for the Advancement of Colored People (NAACP), which had brought the action on behalf of the parents of Linda Brown (b. 1946), rejoiced, although the battle for an integrated educational system had only just begun. Americans who turned on their television sets in 1957 saw paratroopers ordered by President Eisenhower escorting nine black students into Central High School in Little Rock after Arkansas Governor Orval Faubus (1910–1994) had mustered the National Guard to keep them out. Rosa Parks (1913–2005), a member of the Montgomery, Alabama, chapter of the NAACP, was arrested on December 1, 1955, for refusing to give up her seat on a city bus to a white man. Her arrest sparked a black boycott of the city bus lines organized by the newly formed Montgomery Improvement Association, led by a local pastor, Martin Luther King, Jr. (1929–1968). For 381 days the African-Americans who made up 70 percent of the bus riders starved the system of its revenues.

MARTIN LUTHER KING, JR. (1929–1968)

The son and grandson of Baptist ministers, the Reverend Dr. Martin Luther King, Jr. was born on January 15, 1929, in Atlanta, Georgia. Receiving his B.A. from Morehouse College in 1948, he went on to Crozer Theological Seminary and Boston University, where he was awarded a Ph.D. in 1955. In Boston, King met Coretta Scott (1927–2006), a student at the New England Conservatory of Music. They married in 1953 and had four children. While working on his doctoral dissertation, King became pastor of the Dexter Avenue Baptist Church in Montgomery, Alabama. Montgomery’s bus system was segregated (although 70 percent of bus riders were black), and on December 1, 1955, Rosa Parks (1913–2005) was arrested for refusing to give up her seat to a white passenger. Montgomery’s black community decided to boycott the transportation system, and King was chosen to lead a campaign that lasted 381 days before the U.S. Supreme Court declared Montgomery’s bus segregation unconstitutional on December 20, 1956. King survived death threats and the dynamiting of his home to become a national figure. In 1957 King set up the Southern Christian Leadership Conference (SCLC) to coordinate integration efforts throughout the south. He began a worldwide speaking tour. In India in 1959, he renewed his belief in Mahatma Gandhi’s philosophy of non-violence. Appointed co-pastor with his father of the Ebenezer Baptist Church in Atlanta, Georgia, King demonstrated those principles as part of a student sit-in at a segregated lunch counter in October 1960. Arrested and sent to Reidsville State Prison Farm, King won release only after the intervention of Democratic presidential nominee John F. Kennedy. It would be only the first of many arrests. King spent the next three years fighting racial discrimination in the United States. On August 28, 1963, a quarter of a million people gathered near the Lincoln Memorial to hear many civil-rights speakers. But the most famous speech of the March on Washington remains King’s. The young pastor evoked a “dream” of a United States in which individuals would be judged “by the content of their character” rather than by “the color of their skin.” King was awarded the Nobel Peace Prize in 1964, but his true legacy is the Civil Rights Act of 1964 and the Voting Rights Act of 1965, which ended the Jim Crow system. King’s nonviolent methods and his goal of a racially integrated society were not acceptable to all civil rights activists; the Selma March and the Watts riots in 1965 revealed the growing split within the movement. King himself broadened his concerns beyond integration to oppose American involvement in Vietnam. In March 1968, he was organizing a Poor People’s March on Washington when he stopped off in Memphis, Tennessee, to support striking sanitation workers. A sniper’s bullet ended King’s life as he was standing on the balcony outside his motel room on April 4. James Earl Ray (1928–1998) pled guilty to the murder on March 10, 1969, and died in prison. A champion of human rights in the broadest sense, the Reverend Dr. Martin Luther King, Jr. was the first (and remains the only) private citizen of the United States to be honored with a national holiday.

Adopting the non-violent tactics that had worked so well for Gandhi against the British in India, King kept the boycott going and rose to national prominence. On December 20, 1956, the Supreme Court declared Montgomery’s bus segregation unconstitutional. The non-violent arm of the civil rights movement reached its peak in the massive March on Washington for Jobs and Freedom in 1963, supported by a coalition of black organizations, the AFL-CIO, the Protestant National Council of Churches, and the American Jewish Congress. More than 250,000 demonstrators heard King deliver his famous “I Have a Dream” speech on August 28. Ten years of civil rights legislation and Supreme Court action followed. The Civil Rights Act of 1964 barred discrimination in public accommodations and employment, authorized the attorney general to withhold federal funds from any state school district that did not desegregate, and created the Equal Employment Opportunity Commission to hear cases of alleged discrimination. The Voting Rights Act of 1965 put the entire registration and voting process under federal control, with federal examiners sent into counties where fewer than fifty percent of voting-age residents were on the voting lists. As a result, the percentage of black voters registered nationwide, which had already risen from 20 to 39 percent between 1960 and 1964, grew to 62 percent in 1971. The Fair Housing Act of 1968 barred discrimination in the selling and renting of real estate. In a series of decisions the Supreme Court extended school desegregation to extracurricular activities and ordered the use of mandatory busing to integrate schools.

In a period of social upheaval ignited, as well, by protests against American involvement in Vietnam, more militant voices in the civil rights community rejected the non-violent approach to the problems of race in America pioneered by leaders like King. Others even questioned the desirability of assimilation. The Black Panthers repeated Mao Zedong’s slogan that “political power grows out of the barrel of a gun.” The Black Muslims rejected integration in favor of Black Nationalism, stressing black pride and self-help. They also incorporated an ugly current of anti-white and anti-Semitic bigotry that attempted to give theological credence to the assertion that “the white man is a devil.” Malcolm X (1925–1965), one of the most prominent Black Muslim leaders, called Dr. King’s 1963 March “the farce on Washington.” The real gains of the civil rights movement began to run up against increased public resentment of the federal re-engineering of society. Malcolm X’s murder in 1965, the nationwide rioting and arson in 1967, the murders of New York Senator Robert Kennedy (1925–1968) and Martin Luther King, Jr. in 1968, and the burning of school buses in Boston in 1974 were only symptoms of the larger problem. Removing the legal barriers to integration had not brought an end to discrimination in American society. The Supreme Court began to set limits in the 1970s, refusing to allow busing from one tax jurisdiction to another to prevent the resegregation of inner-city schools caused by “white flight” to the suburbs. In 2003 the Supreme Court limited the ways in which universities could use race-based “affirmative action” measures to create diverse student bodies. While the African-American civil rights movement tended to dominate the news media, other minorities were also agitating for equal opportunity.
Although few Hispanics had participated in political life before 1960, this changed with the creation of the Mexican American Political Association (MAPA) at the beginning of the decade. By 1969, however, a more radical political group, La Raza Unida (The United Race), had been formed to fight racism in American life. It was in this period that César Chávez (1927–1993), born to a family of migrant farm workers, founded the National Farm Workers Association (1962). This militant union began a successful five-year strike and boycott of California grapes in 1965. Joining with the AFL-CIO, Chávez’s organization became the United Farm Workers of America (UFW) in 1971.

MALCOLM X (1925–1965)

Malcolm Little was born on May 19, 1925, in Omaha, Nebraska. During his youth, his home in Lansing, Michigan, was burned by Ku Klux Klan members, his father murdered, and his distraught mother placed in a mental institution. The orphaned teenager went to live with his half-sister in Boston. The deeply troubled teen was soon imprisoned for burglary, but there found the anchor he needed to repair his life. He joined the Nation of Islam, a Black Muslim sect founded by Wallace Fard Muhammad (1877?–1934?) at a Detroit mosque in 1930. Fard, an immigrant whose origins remain disputed, claimed to be a Messenger sent by Allah to liberate the “Lost-Found Nation of Islam in the West” from its white “slave masters.” Released from prison in 1952, Malcolm Little met Elijah Muhammad (1897–1975), who had succeeded Fard as head of the Black Muslims after Fard’s mysterious disappearance in 1934. Malcolm took the new surname “X” as a way of repudiating the heritage of slavery. A charismatic orator, Malcolm X was largely responsible for the movement’s growing membership. During the 1950s he founded numerous mosques across the United States and himself became minister of Mosque Number Seven in Harlem. The newspaper he founded, Muhammad Speaks (1961), became the official publication of the Nation of Islam. As a Black Nationalist, Malcolm X scornfully rejected the exclusively non-violent methods and integrationist aims of the mainstream civil rights movement. He called for black pride, economic separatism, and the legitimate use of violence in self-defense against a racist society. To this end, he advocated the arming of all black men. To white America he was the man who referred to President Kennedy’s assassination as a “case of chickens coming home to roost.” The resulting political furor led to Malcolm’s suspension from the Black Muslim movement. Malcolm left the Nation of Islam in March 1964, and in April made a pilgrimage to Mecca. In that holy city he underwent a conversion to orthodox Islam and repudiated his earlier belief in the inherent evil of all whites. The obligation to make the hajj (pilgrimage to Mecca) is one of the Five Pillars of Islam; for Malcolm it entailed a spiritual rebirth and a new name: Malcolm X became el-Hajj Malik el-Shabazz. When he returned to Harlem, there were numerous clashes between his supporters and those of Elijah Muhammad. During a rally of his followers, Malcolm was shot to death on February 21, 1965. Three members of the Nation of Islam were convicted of his murder. But the influence of Malik el-Shabazz continued to grow with the posthumous publication of The Autobiography of Malcolm X (1965), based on interviews conducted shortly before his death by Alex Haley (1921–1992), the author of Roots. Malcolm X’s present appeal relies more on his call for black pride and economic self-sufficiency than on the “world brotherhood” he espoused after his return from Mecca. How he himself would have combined the two can never truly be known.

Its aim was to improve the living and working conditions of America’s migrant workers. In contrast, the Cuban American community, concentrated in south Florida, was mostly conservative and loyally supported any candidate who was consistently opposed to the Castro regime.

◆ Women’s Rights in the United States

The campaign for equal rights for American women was, by contrast, not the movement of an oppressed minority, but a movement of the oppressed majority of the American population (51.2 percent of the population was female according to the 2000 census). In 1900 the average American woman married at age 22 and had three to four children. Campaigners for birth control faced criminal prosecution. Margaret Sanger (1879–1966), the founder of Planned Parenthood (1942), was sentenced to 30 days in the workhouse in 1917 for opening America’s first birth control clinic. Ratified on August 18, 1920, the Nineteenth Amendment stated that “The right of citizens of the United States to vote shall not be denied or abridged by the United States or by any State on account of sex.” But giving women the right to vote made little immediate change in their lives. During the First and Second World Wars women had moved into the workforce in record numbers to fill the jobs left vacant by departing soldiers, but these gains were temporary. After each war ended, women were expected to surrender their jobs to returning veterans. The 1950s saw a return to a largely male-dominated work force, with most women being consigned to unpaid labor as housewives in the newly built and federally subsidized suburbs.

But by the late 1960s, new job openings in an increasingly prosperous America attracted many women back into paid employment. More women also began to go to college. Only 35 percent of college undergraduates were women in 1960, but by the 1980s just over half of all graduates were women. Women were also increasingly moving into traditionally male graduate programs in law and medicine. In 1965, women made up only 6.9 percent of those earning medical degrees; by 2005, they made up 47 percent. In 1984 one in four married women earned more than their husbands; in 2006, one-third of all married women did. All this would have important effects on the structure of American family life. By the year 2000 the average American woman had only 2.2 children. The divorce rate rose from only eight percent of all marriages in 1900 to just over 50 percent a century later.

The women’s rights movement in the United States received a major impetus from the work of Betty Friedan (1921–2006). A journalist who had left the workplace for the traditional life of a suburban housewife, Friedan wrote an exposé of the pressure on women to conform to the domestic stereotype. Her book, The Feminine Mystique (1963), sparked a new militancy among women activists. She organized the National Organization for Women (NOW) in 1966 to channel that militancy into concrete change. In Griswold v. Connecticut (1965) the Supreme Court overturned state laws against the use of contraceptives by married couples, and in Roe v. Wade (1973) it struck down state laws preventing abortions during the first three months of pregnancy. Later decisions extended the right to abortion to the second three months. The Equal Credit Opportunity Act of 1974 prohibited lenders from discriminating on the basis of gender.
But the Equal Rights Amendment, which won the support of a number of state legislatures in 1972 and 1973, ultimately failed to be ratified. The desire to ensure equality was, once more, running up against the reluctance to further re-engineer American society.

MARGARET SANGER (1879–1966) The founder of the American birth-control movement was born Margaret Higgins on September 14, 1879, in Corning, New York, the sixth of eleven children in an Irish working-class family. Her mother’s ill health (and eventual death) from the strain of eleven childbirths and seven miscarriages inspired Margaret to seek medical training. She attended Claverack College and later took nurse’s training at the White Plains Hospital. After marrying William Sanger in 1900, she worked as a midwife in the immigrant communities on New York’s Lower East Side. High infant mortality rates and infectious disease were common features of life in the crowded tenements, but Sanger was particularly appalled by the frequent deaths from botched illegal abortions. Galvanized by what she saw, Sanger began a lifelong campaign to bring legal contraception to the American woman. Standing in her way was the Comstock Act (1873), which made contraceptive literature and devices illegal pornographic materials. In 1914, Sanger began publishing a monthly magazine, the Woman Rebel (later renamed The Birth Control Review), advocating birth control and women’s rights. Quickly indicted for violating postal obscenity laws, Sanger fled to Europe but returned when the case was dismissed in 1916. She opened America’s first birth control clinic in Brooklyn and was sentenced to 30 days in the public workhouse for operating a “public nuisance.” Her first book, What Every Mother Should Know, was published that same year (1917). In 1921 she founded the American Birth Control League and served as its president until 1928. Sanger’s League merged with other birth control organizations in 1942, becoming the Planned Parenthood Federation of America with Sanger as honorary chairwoman. Sanger spent considerable time in Europe to escape the mounting political persecution in the States. Her open liaison with J. Noah H. Slee and her advocacy of “free love” did not help her position at home. After a divorce from William Sanger, Margaret married Slee in 1922, but both partners had extra-marital affairs during this “open marriage.” The three children that resulted from her two marriages remained in the background as Sanger began a worldwide campaign for contraception and population control. In 1927 she organized the first World Population Conference in Geneva, Switzerland. My Fight for Birth Control (1931) and Margaret Sanger: An Autobiography (1938) were written to popularize her crusade and keep the issue of birth control before the public. During World War II the practice of birth control became far more common, and Sanger’s crusade took on an even greater international dimension. Margaret Sanger became the first president of the International Planned Parenthood Federation in 1953, and took her campaign as far afield as India and Japan. Back in the United States, her efforts resulted in a relaxation of the Comstock Act in 1936 allowing physicians to import and prescribe contraceptives, but the U.S. Supreme Court did not strike down a Connecticut law prohibiting the use of contraception (even by married couples) until 1965, when Sanger was 86. The Supreme Court decision extending the right to privacy to cover abortion (Roe v. Wade) did not come until 1973, seven years after Sanger’s death on September 6, 1966, in Tucson, Arizona.

Redefining “Equality”
When the American Declaration of Independence (1776) declared all men to be “equal,” all that was intended was a narrowly defined equal protection under the law: an equal right to trial by jury and the objective application of criminal and civil law. The right to participate in the making of that law was not even extended to all white men. But it is one of the cardinal features of modern society that the definition of “equal” is always expanding. In 1969, the patrons of the Stonewall Inn, a gay bar in New York’s Greenwich Village, decided they were tired of being harassed by the police and took to the streets. The Stonewall Riots set off an ongoing gay and lesbian rights movement. One of the accomplishments of this movement has been the addition of domestic partner provisions to the employee benefit packages of many local governments and private institutions, although there is an ongoing debate over same-sex marriages. In 1990, the Americans with Disabilities Act (ADA) required government offices and private businesses to make all “reasonable accommodations” to ensure every American equal access to education, employment, public accommodations, and transportation. In 2008 the United States experienced a historic presidential campaign. The two main contenders for the Democratic Party nomination were New York Senator Hillary Rodham Clinton (b. 1947) and Barack Hussein Obama (b. 1961). For the first time in American history the presidential nominee of one of the country’s two main political parties would be either a woman or an African-American. Obama won the nomination and became the first African-American president of the United States on January 20, 2009. Hillary Clinton became his Secretary of State.

◆ Universal Human Rights
In 1948 the United Nations issued a Universal Declaration of Human Rights. It declared the equal right of all human beings to “life, liberty and the security of person,” protection under the law, presumption of innocence, consensual marriage, education, and freedom of movement, opinion, and assembly. For many, however, it is a declaration honored in name only. Before World War I, women could vote only in New Zealand, Australia, Finland, Norway, Denmark, and Iceland. By the outbreak of World War II, women could also vote in the Netherlands, the Soviet Union, Canada, Austria, Czechoslovakia, Germany, Hungary, Poland, Sweden, Luxembourg, the United States, Great Britain, Ecuador, South Africa, Brazil, Thailand, Uruguay, Turkey, Cuba, and El Salvador. The Dominican Republic gave women the right to vote during the war, while France, Guatemala, Italy, Japan, Mexico, China, Argentina, South Korea, and Israel extended the franchise to women at war’s end or shortly thereafter. Chile, India, and Indonesia granted women the vote in 1949, Pakistan in 1956, Switzerland not until 1971, and Syria not until 1973. Women still cannot vote in some of the more conservative regimes in the Middle East, such as Saudi Arabia and Kuwait. The right to vote is only the beginning of the road to equality. Even though women gained the right to vote in Brazil in 1932, it was not until 1978 that the largest Roman Catholic country in the world legalized divorce, not until 1988 that a new constitution mandated equality for women, and not until 2001 that the Brazilian congress passed the legislation that would ensure women real equality within their own families. For the first time husbands would have to share with their wives their legal right to make decisions for their children. Those same husbands would also no longer be able to obtain automatic annulments if they learned their wives were not virgins at the time of their marriage.

In Japan, women were not protected by an equal employment opportunity law until 1985. Not surprisingly, fewer Japanese women hold managerial positions or seats in the national legislature than in any other industrialized country. In 2003, 41 percent of all workers in Japan were women, but women held only 8.9 percent of all managerial positions and 7.3 percent of the seats in the national legislature. In the United States, 46.6 percent of all workers, 46 percent of the managers, and 14.3 percent of the members of both houses of Congress were women in 2003. The inability of developing nations to catch up with the ever-expanding economies of the industrialized West has often been an aggravating factor in ethnic discrimination. In the 1950s Nasser promised to enrich Egyptian peasants by driving foreign merchants out of the country. Indian minorities were persecuted in and then expelled from Uganda and Tanzania in the 1970s. Chinese minorities in Indonesia, Vietnam, and Malaysia have suffered similar fates.

◆ Fundamentalism and “Ethnic Cleansing”
The growth of religious fundamentalism in much of the underdeveloped world has often been a response to population pressure, growing poverty, and failed economies. Islamic fundamentalism has become the fastest growing social and political movement in the Muslim world. Rejecting the capitalism of their former colonial rulers, many in the Islamic world looked to the Soviet Union for models of social development after World War II, while others became clients of the United States. Plagued by corrupt regimes and inefficient central planning, these governments threatened the traditional values of their societies without improving the lives of their peoples. In contrast, the fundamentalist movement was able to win many supporters among the poor through grassroots organizations, charitable work, and constant preaching against Western cultural influences, which they believed encouraged promiscuity and drug use. Under the leadership of Muammar al-Qaddafi (b. 1942), Libya became an Islamic state in 1969, seeking inspiration not in socialism but in the Quran. The Shah of Iran was overthrown a decade later and replaced by a fundamentalist Islamic regime. Sudan, an area of strong religious fervor since the nineteenth century, became the next site of Islamic revolution. Basing the nation’s laws on Shari’a (Muslim holy law), the new government set off a civil war when it attempted to impose Islam on the largely Christian and animist southern part of the country; the fighting continued into the twenty-first century. By the 1990s this had resulted in a revival of the old slave trade, with recalcitrant non-Muslims transported as slaves into the Arab world. Instances of two ancient tribal practices—forced clitoridectomy (female circumcision) and “honor killings” (the murder of female relatives suspected of sexual relations outside of marriage, even if they were raped)—have also become increasingly common in some fundamentalist Muslim communities and traditionalist tribal groups in Africa. Some regions within Pakistan, long troubled by corrupt dictators, a rapidly growing impoverished population, and an illiteracy rate of at least 75 percent, have also begun to adopt Shari’a as their basic law. In fact, in February 2009, Pakistan announced that it was accepting Taliban control over one region. The military leaders of Algeria felt obliged to cancel the country’s first free elections in 1992 when it appeared Islamic leaders would win.
Egypt also barred an Islamic opposition group from participating in its 1995 elections. The multi-ethnic states of Africa are particularly vulnerable to religious tensions. Twelve of Nigeria’s 36 states have adopted Shari’a as their basic law codes, creating tensions not just in Nigeria itself but also in neighboring Niger. Christians living in Maradi, a city in Niger just 30 miles north of its border with Nigeria, were attacked by Muslim fundamentalists at the beginning of 2001. On the other hand, Christians controlling the government in the Ivory Coast have turned a blind eye to the destruction of mosques and Muslim-owned properties. In India, the growth of Hindu fundamentalism has been fueled by the failure of the secularist Congress Party to solve India’s economic problems. The Bharatiya Janata party, which has called for a purified Hindu India, has benefited from the ongoing distress. Despite the massive shift of population and the terrible massacres that followed the partition of India and Pakistan in 1947, India still had a large Muslim minority (now over 120 million). Hindu militants began to accuse this minority of being responsible for many of the nation’s problems, including widespread poverty. Mosques, which during the Mughal ascendancy were sometimes built upon the ruins of demolished Hindu temples, have come under increasing Hindu assault. The 1992 destruction of the oldest Muslim mosque in India (the Babri Masjid) remains a symbol of a divided country. In Afghanistan, torn by civil war after the failed Soviet invasion, Taliban rebels instituted a fundamentalist Islamic government. As a result, Afghan women were removed from their jobs, denied an education and access to medical care, and threatened with violence if they ventured outside their homes without the supervision of a male relative. Afghan women were only able to return to public life after the ouster of the Taliban by American forces in late 2001, and only in those areas enjoying relative freedom from inter-tribal fighting.

The most dramatic example of the clash between fundamentalist and secularist culture to date was the attack on the World Trade Center and the Pentagon by Al Qaeda terrorists on September 11, 2001. The result has been American invasions of Afghanistan (the main base for Al Qaeda) and Iraq. In 2003 a civil war broke out in Darfur, a region in western Sudan. Tribes within this area, made part of Sudan by Anglo-Egyptian forces, had rebelled for the first time during the nineteenth century under the leadership of the Mahdi. Conflicts between Darfuri groups disputing who was Arab and who African were exacerbated by uneven economic development as first the colonial and then the independent government concentrated resources in central and northeastern Sudan. A further wave of violence was set off by the lack of Sudanese government aid during a famine caused by drought in the 1980s. By 2003 a full-scale war was in progress, with the Sudanese government aiding the Janjaweed (a group of paid mercenaries) against the Fur, Zaghawa, and Masalit. Perhaps as many as two and a half million people have been displaced and some half million or more killed. The Sudanese government restricts international access to the refugee camps and has been accused of killing witnesses and countenancing mass rape and genocide; it continues to refuse to let United Nations peacekeeping forces into the region.

Nor has post-Cold-War Europe been immune to ethnic and religious hostilities. The transition to capitalism at the end of the Cold War brought several years of economic hardship to the nations of Central Asia and Central and Eastern Europe. Some of their leaders turned to ethnic nationalism to rally their beleaguered populations. Claims of old betrayals, lost empires, and demands for minority national self-determination combined to set off civil wars in Armenia, Azerbaijan, the former Yugoslavia, and the Kurdish lands of Turkey and Iraq. The Roma found themselves facing increasing discrimination in the Czech Republic. Even France and Britain began to retreat from the hospitality they had offered to their former colonial populations. And one of the most persistent ethnic-religious divides remains that between the Scots-Protestant and Irish-Catholic factions in Northern Ireland, despite the cessation of active hostilities in 1998. The most violent ethnic-religious divide within Europe was that within the former Yugoslavia. Before his death in 1980, Tito had turned Yugoslavia into a federal state with six republics (Serbia, Croatia, Bosnia-Herzegovina, Slovenia, Macedonia, and Montenegro) and two autonomous provinces (Kosovo and Vojvodina) within Serbia. After his death, the country was ruled by a presidency that rotated among the six republics, but Serbia’s refusal to give up its seat in 1987 and its rescinding of the privileges of autonomy to Vojvodina and Kosovo in 1989 precipitated the breakup of Yugoslavia. A war broke out between the mostly Roman Catholic Croats and mostly Eastern Orthodox Serbs within Croatia when that state declared its independence from Yugoslavia in 1991. That war grew wider when Bosnia-Herzegovina declared its independence the next year. Serbs, Croats, and Muslims living in Bosnia slaughtered each other with the support of the Serbian and Croatian armies (and paramilitary groups) until NATO air strikes led to peace talks. The resulting Dayton Accords (1995) created a Bosnia with three presidents, one for each ethno-religious group, but some 30,000 NATO troops remain in Bosnia to keep the peace. Then the fighting moved to the province of Kosovo in Serbia. Ninety percent of Kosovo’s population was ethnic Albanian.
As the Kosovar Albanians fought to gain their independence, the Serbs fought to keep control of the province. As in the earlier battles in Croatia and Bosnia, each side resorted to “ethnic cleansing” to eliminate its rivals. Once again, it took NATO to bring both sides to the bargaining table. NATO imposed peace accords in 1999, and its troops moved into Kosovo to keep the combatants at arm’s length. In February 2008 Kosovo declared its independence and was quickly recognized by the United States, Germany, France, and Britain. Nearly 17,000 NATO troops (including 1,600 Americans) still keep the peace in the new state.

In a post-Cold-War world, NATO and the United Nations have been trying to redefine their respective roles. NATO has been acting in part as Europe’s policeman. But as the 2003 stalemate over Iraq between France and the United States in the UN Security Council and Turkey’s refusal to allow US troops access in the campaign against Iraq made clear, even NATO allies are uncertain of the extent to which NATO troops should be used outside of Europe. Within Europe itself NATO is expanding. In 1999, the Czech Republic, Hungary, and Poland became the first three ex-communist states to join NATO. In November 2002, NATO extended membership invitations to seven more: Bulgaria, Estonia, Latvia, Lithuania, Romania, Slovakia, and Slovenia. While Russia has been given a special auxiliary role, it remains concerned about NATO enlargement, especially into those states (Estonia, Latvia, and Lithuania) that had been part of the Soviet Union (and the Russian empire that preceded it) until 1991, except for a brief period of independence created by the Versailles settlement between the World Wars.

Plessy v. Ferguson

Justice Henry Brown, U.S. Supreme Court

United States, 1896

This case turns upon the constitutionality of an act of the General Assembly of the State of Louisiana, passed in 1890, providing for separate railway carriages for the white and colored races…

The constitutionality of this act is attacked upon the ground that it conflicts both with the Thirteenth Amendment of the Constitution, abolishing slavery, and the Fourteenth Amendment, which prohibits certain restrictive legislation on the part of the States.

1. That it does not conflict with the Thirteenth Amendment, which abolished slavery and involuntary servitude, except as a punishment for a crime, is too clear for argument. Slavery implies involuntary servitude — a state of bondage; the ownership of mankind as a chattel, or at least the control of the labor and services of one man for the benefit of another, and the absence of a legal right to the disposal of his own person, property and services… It was intimated, however, in [previous Court cases that] this amendment was regarded by the statesmen of that day as insufficient to protect the colored race from certain laws which had been enacted in the Southern States, imposing upon the colored race onerous disabilities and burdens and curtailing their rights in the pursuit of life, liberty and property to such an extent that their freedom was of little value; and that the Fourteenth Amendment was devised to meet this exigency.

So, too, in the Civil Rights Cases it was said that the act of a mere individual, the owner of an inn, a public conveyance or place of amusement, refusing accommodations to colored people can not be justly regarded as imposing any badge of slavery or servitude upon the [person]…

A statute which implies merely a legal distinction between the white and colored races — a distinction which is founded in the color of the two races and which must always exist so long as white men are distinguished from the other race by color — has no tendency to destroy the legal equality of the two races, or reestablish a state of involuntary servitude…

2. [Regarding the Fourteenth Amendment] The object of the amendment was undoubtedly to enforce the absolute equality of the two races before the law, but, in the nature of things, it could not have been intended to abolish distinctions based upon color, or to enforce social… equality, or a commingling of the two races upon terms unsatisfactory to either. Laws permitting, and even requiring, their separation in places where they are liable to be brought into contact do not necessarily imply the inferiority of either race to the other, and have been generally, if not universally, recognized as within the competency of the state legislatures in the exercise of their police power. The most common instance of this is connected with the establishment of separate schools for white and colored children, which has been held to be a valid exercise of the legislative power even by courts of States where the political rights of the colored race have been longest and most earnestly enforced….

The distinction between laws interfering with the political equality of the negro and those requiring the separation of the two races in schools, theatres and railway carriages has been frequently drawn by this court. Thus, in Strauder v. West Virginia, it was held that a law of West Virginia limiting to white male persons, 21 years of age and citizens of the State, the right to sit upon juries was a discrimination which implied a legal inferiority in civil society, which lessened the security of the right of the colored race, and was a step toward reducing them to a condition of servility. Indeed, the right of a colored man that, in the selection of jurors to pass upon his life, liberty and property, there shall be no exclusion of his race and no discrimination against them because of color has been asserted in a number of cases….

[W]e cannot say that a law which authorizes or even requires the separation of the two races in public conveyances is unreasonable, or more obnoxious to the Fourteenth Amendment than the acts of Congress requiring separate schools for colored children in the District of Columbia, the constitutionality of which does not seem to have been questioned, or the corresponding acts of state legislatures.

We consider the underlying fallacy of the plaintiff’s argument to consist in the assumption that the enforced separation of the two races stamps the colored race with a badge of inferiority. If this be so, it is not by reason of anything found in the act, but solely because the colored race chooses to put that construction upon it. The argument necessarily assumes that if, as has been more than once the case and is not unlikely to be so again, the colored race should become the dominant power in the state legislature, and should enact a law in precisely similar terms, it would thereby relegate the white race to an inferior position. We imagine that the white race, at least, would not acquiesce in this assumption. The argument also assumes that social prejudices may be overcome by legislation, and that equal rights cannot be secured to the negro except by an enforced commingling of the two races. We cannot accept this proposition. If the two races are to meet upon terms of social equality, it must be the result of natural affinities, a mutual appreciation of each other’s merits, and a voluntary consent of individuals.

Brown v. Board of Education

United States, 1954

The opinion of the Supreme Court as delivered by Chief Justice Warren.

These cases come to us from the States of Kansas, South Carolina, Virginia, and Delaware. They are premised on different facts and different local conditions, but a common legal question justified their consideration together in this consolidated opinion.

In each of the cases, minors of the Negro race, through their legal representatives, seek the aid of the courts in obtaining admission to the public schools of their community on a nonsegregated basis. In each instance, they had been denied admission to schools attended by white children under laws requiring or permitting segregation according to race. This segregation was alleged to deprive the plaintiffs of the equal protection of the laws under the Fourteenth Amendment. In each of the cases other than the Delaware case, a three-judge federal district court denied relief to the plaintiffs on the so-called “separate but equal” doctrine announced by this Court in Plessy v. Ferguson, 163 U.S. 537. Under that doctrine, equality of treatment is accorded when the races are provided substantially equal facilities, even though these facilities be separate. In the Delaware case, the Supreme Court of Delaware adhered to that doctrine, but ordered that the plaintiffs be admitted to the white schools because of their superiority to the Negro schools.

The plaintiffs contend that segregated public schools are not “equal” and cannot be made “equal,” and that hence they are deprived of the equal protection of the laws. . . .

Reargument was largely devoted to the circumstances surrounding the adoption of the Fourteenth Amendment in 1868. It covered exhaustively consideration of the Amendment in Congress, ratification by the states, then existing practices in racial segregation, and the views of proponents and opponents of the Amendment. This discussion and our own investigation convince us that, although these sources cast some light, it is not enough to resolve the problem with which we are faced. At best, they are inconclusive. The most avid proponents of the post-War Amendments undoubtedly intended them to remove all legal distinctions among “all persons born or naturalized in the United States.” Their opponents, just as certainly, were antagonistic to both the letter and the spirit of the Amendments and wished them to have the most limited effect. What others in Congress and the state legislatures had in mind cannot be determined with any degree of certainty.

An additional reason for the inconclusive nature of the Amendment’s history, with respect to segregated schools, is the status of public education at that time. In the South, the movement toward free common schools, supported by general taxation, had not yet taken hold. Education of white children was largely in the hands of private groups. Education of Negroes was almost nonexistent, and practically all of the race were illiterate. In fact, any education of Negroes was forbidden by law in some states. Today, in contrast, many Negroes have achieved outstanding success in the arts and sciences as well as in the business and professional world. It is true that public school education at the time of the Amendment had advanced further in the North, but the effect of the Amendment on Northern States was generally ignored in the congressional debates. Even in the North, the conditions of public education did not approximate those existing today. The curriculum was usually rudimentary; ungraded schools were common in rural areas; the school term was but three months a year in many states; and compulsory school attendance was virtually unknown. As a consequence, it is not surprising that there should be so little in the history of the Fourteenth Amendment relating to its intended effect on public education. . . .

In approaching this problem, we cannot turn the clock back to 1868 when the Amendment was adopted, or even to 1896 when Plessy v. Ferguson was written. We must consider public education in the light of its full development and its present place in American life throughout the Nation. Only in this way can it be determined if segregation in public schools deprives these plaintiffs of the equal protection of the laws.

Today, education is perhaps the most important function of state and local governments. Compulsory school attendance laws and the great expenditures for education both demonstrate our recognition of the importance of education to our democratic society. It is required in the performance of our most basic public responsibilities, even service in the armed forces. It is the very foundation of good citizenship. Today it is a principal instrument in awakening the child to cultural values, in preparing him for later professional training, and in helping him to adjust normally to his environment. In these days, it is doubtful that any child may reasonably be expected to succeed in life if he is denied the opportunity of an education. Such an opportunity, where the state has undertaken to provide it, is a right which must be made available to all on equal terms.

We come then to the question presented: Does segregation of children in public schools solely on the basis of race, even though the physical facilities and other “tangible” factors may be equal, deprive the children of the minority group of equal educational opportunities? We believe that it does. . . .

The effect of this separation on their educational opportunities was well stated by a finding in the Kansas case by a court which nevertheless felt compelled to rule against the Negro plaintiffs:

“Segregation of white and colored children in public schools has a detrimental effect upon the colored children. The impact is greater when it has the sanction of the law; for the policy of separating the races is usually interpreted as denoting the inferiority of the negro group. A sense of inferiority affects the motivation of a child to learn. Segregation with the sanction of law, therefore, has a tendency to [retard] the educational and mental development of negro children and to deprive them of some of the benefits they would receive in a racial[ly] integrated school system.”

Whatever may have been the extent of psychological knowledge at the time of Plessy v. Ferguson, this finding is amply supported by modern authority. Any language in Plessy v. Ferguson contrary to this finding is rejected.

We conclude that in the field of public education the doctrine of “separate but equal” has no place. Separate educational facilities are inherently unequal. Therefore, we hold that the plaintiffs and others similarly situated for whom the actions have been brought are, by reason of the segregation complained of, deprived of the equal protection of the laws guaranteed by the Fourteenth Amendment. This disposition makes unnecessary any discussion whether such segregation also violates the Due Process Clause of the Fourteenth Amendment.

I Have a Dream

Dr. Martin Luther King Jr.

Washington, D.C., 1963

The “I Have a Dream” speech was delivered by Dr. Martin Luther King Jr. (1929–1968) in Washington, D.C. on August 28, 1963.

I am happy to join with you today in what will go down in history as the greatest demonstration for freedom in the history of the Nation.

Five score years ago, a great American, in whose symbolic shadow we stand today, signed the Emancipation Proclamation. This momentous decree came as a great beacon of light and hope to millions of Negro slaves who had been seared in the flames of withering injustice. It came as the joyous daybreak to end the long night of captivity.

But one hundred years later, the Negro still is not free. One hundred years later, the life of the Negro is still sadly crippled by the manacle of segregation and the chain of discrimination. One hundred years later, the Negro lives on a lonely island of poverty in the midst of a vast ocean of material prosperity. One hundred years later, the Negro is still languishing in the corner of American society and finds himself an exile in his own land. So we have come here today to dramatize a shameful condition.

In a sense we have come to the capital to cash a check. When the architects of our republic wrote the magnificent words of the Constitution and the Declaration of Independence, they were signing a promissory note to which every American was to fall heir. This note was a promise that all men — black men as well as white men — would be guaranteed the unalienable rights of life, liberty, and the pursuit of happiness.

But it is obvious today that America has defaulted on this promissory note insofar as her citizens of color are concerned. Instead of honoring this sacred obligation, America has given the Negro people a bad check — a check that has come back marked “insufficient funds.” But we refuse to believe that the bank of justice is bankrupt. We refuse to believe that there are insufficient funds in the great vaults of opportunity in this Nation.

So we have come to cash this check. A check that will give us the riches of freedom and the security of justice.

We have also come to this hallowed spot to remind America that the fierce urgency is now. This is no time to engage in the luxury of cooling off or to take the tranquilizing drug of gradualism. Now is the time to make . . .
