History is full of conflicts which persist over generations.
- Inter-Christian religious wars drove more than a century of conflict that spanned the globe.
- Inter-Empire wars over colonization rights drove another century of warfare.
- Carthago Delenda Est!
(We could argue about the precise dates, or about whether any given skirmish was part of a given historical trend; I'm not sure that is relevant to the question, although someone may be brilliant enough to spot the false assumption and provide a splendid answer.)
Is there a general theory of these kinds of conflict?
Is there a way to classify a given skirmish within a series of global conflicts? (or is that only available ex post facto?) Is there a way to tell whether a given conflict is escalating or diminishing?
Is there a common thread in the resolution of these multi-generational conflicts?
Some academics have studied this issue. The most popular theory on it today is the Strauss-Howe Generational Theory, or the Four Turnings, which I believe applies only to US history. They created this theory to help anticipate, and perhaps mitigate, some of the problems that can be expected to recur.
Strauss and Howe lay the groundwork for the theory in their 1991 book Generations, which retells the history of America as a series of generational biographies going back to 1584. In their 1997 book The Fourth Turning, the authors expand the theory to focus on a fourfold cycle of generational types and recurring mood eras in American history.
These are the theorists who believe that the Millennial Generation, defined as those who began graduating from high school in 2000, is the historical analogue to WWII's Greatest Generation. They are the product of the "Fourth Turning."
They argue that today's teens and young adults are recasting the image of youth from downbeat and alienated to upbeat and engaged. They write that Millennials are held to higher standards than adults apply to themselves; they're a lot less violent, vulgar, and sexually charged than the teen culture older people are producing for them. Over the next decade, they will transform what it means to be young.
Here are the Turnings:
The First Turning is a High (Silent Generation). This is a post-Crisis era when institutions are strong and individualism is weak. Society is confident about where it wants to go collectively, though those outside the majoritarian center often feel stifled by the conformity.
The Second Turning is an Awakening (Baby Boomers). This is an era when institutions are attacked in the name of personal and spiritual autonomy. Just when society is reaching its high tide of public progress, people suddenly tire of social discipline and want to recapture a sense of personal authenticity. Young activists look back at the previous High as an era of cultural and spiritual poverty.
The Third Turning is an Unraveling (Gen X). The mood of this era is in many ways the opposite of a High: institutions are weak and distrusted, while individualism is strong and flourishing. Highs come after Crises, when society wants to coalesce and build. Unravelings come after Awakenings, when society wants to atomize and enjoy.
The Fourth Turning is a Crisis. This is an era in which institutional life is destroyed and rebuilt in response to a perceived threat to the nation's survival. Civic authority revives, cultural expression redirects towards community purpose, and people begin to locate themselves as members of a larger group.
(Apologies for so much cut and paste) There are many other theorists that have looked at the reasons for the cycles of peace and war in societies, as well, but I suggest starting here.
Conflicts Among the Tribes & Settlers
There were many Native American tribes living on the Great Plains, competing for scarce resources. Of course, the various tribes came into conflict with each other.
The Lakota (or Sioux) is actually a broad group of people that includes the seven bands of the Western (or Teton) Lakota, the Dakota (Yankton and Yanktonai), and the Nakota (Santee). This group of tribes lived in the Plains for only a part of their known history. The Lakota originally lived in the northern woodlands, where they struggled with the Anishinaabe (whom the Lakota called Chippewa). The Anishinaabe were armed with guns they had acquired from trading with trappers.
The Lakota slowly migrated south and westward and pushed aside the Omaha tribe in this early migration. At first, they didn’t have horses, but horses were spreading throughout the Plains from Spanish settlements in the Southwest. By 1742 the Tetons had gotten horses and they became more and more like horse-riding nomads. In the Central Plains the Lakota came into conflict with the Pawnee, a village tribe that held the rich hunting lands of the Republican River Valley until the Lakota entered the region. The Pawnee war parties usually made their trips on foot, unlike other tribes. Because the Lakota were mounted on horses, they had an advantage.
The Omaha war parties varied from eight to a hundred warriors. All members of the party were volunteers. The leader was usually a well-known warrior who had demonstrated his skill in battle. The warriors are said to have worn a white covering of soft, dressed skin for their heads. No shirt was worn, but a robe was belted around the waist and tied over the breast. No feathers or ornaments could be worn at this time. In actual battle, the warriors wore only moccasins and breechcloth.
Sometimes the wives of a few of the men accompanied a large war party to help care for their clothing and to do the cooking. A sacred War Pack, kept in the Tent of War, was important in any war activities. The contents of the pack were believed to protect the tribe from harm. A returning war party with the scalp of an enemy held a special scalp or victory dance. Men who won special honors on the warpath were permitted to wear an eagle feather in their scalp locks. Certain warriors might also wear a deer-tail headdress. Only important men wore the large feathered headdress seen in movies and only on social occasions. Only the men wore feathers in their hair, but the women might wear them on their clothing.
HOW TO RESOLVE CONFLICTS
It seems that people often have trouble getting along together. Families argue, neighbors come to blows, countries lob weapons at each other. Is this the way it has to be?
Anthropologists, sociologists, psychologists and others say it is. Having observed a long history of Man’s quarrelsome behavior, they claim that Man has animal instincts, or that he is anti-social and violent by his very nature.
In truth, Man is rather peaceful. But he can be driven, individually and collectively, to hatred and violence.
In researching the causes of violence, L. Ron Hubbard unearthed a fundamental and natural law of human relations which explains why conflicts between people are so often difficult to remedy. And he provided an immensely valuable tool that enables one to resolve any conflict, be it between neighbors, co-workers or even countries.
In this course, you will discover how to help others resolve their differences and restore peaceable relations. Peace and harmony between men can be more than just a dream. Widespread application of this law will make it a reality.
Ease of Study
This course is laid out in a step-by-step manner, with a sequence of reading assignments and questions to check your understanding.
Before you begin, you will create your own personal logon on the Scientology website. Once logged in, the online program will guide you through each step of the course to full completion, with all course materials provided from within the Scientology website.
Length of Course
6 to 7 hours. You may, however, do the course at your own pace. In other words, it is not timed. The course is our service to you, free of charge.
Booklet: How to Resolve Conflicts
or The Scientology Handbook
Your course materials are also integrated within the online course. In other words, once logged on, you may read the materials from within the online course program as you do each step. We do, however, recommend that you download for free or purchase the booklet, to review and refer to when you are not logged into the course program.
If at any time you need assistance with your course assignments, do not hesitate to contact your online course supervisor, whom you can reach using the “Need help?” button in your online course program. The online course supervisor will help to ensure you understand and achieve the maximum benefit from the course materials. The end result is that you are fully able to apply the data contained therein.
Upon completion of the How to Resolve Conflicts Course, you will receive a certificate.
On April 6, 1994, the Hutu president of Rwanda, Juvénal Habyarimana, was assassinated when his plane was shot down near Kigali International Airport. The Hutu president of Burundi, Cyprien Ntaryamira, was also killed in the attack. This sparked the chillingly well-organized extermination of Tutsis by Hutu militias, even though blame for the plane attack has never been established. Sexual violence against Tutsi women was also widespread, and the United Nations only conceded that "acts of genocide" had occurred two months after the killing began.
After the genocide and the Tutsis' regaining control, about 1.3 million Hutus fled to Burundi, Tanzania (from where more than 10,000 were later expelled by the government), Uganda, and the eastern part of the Democratic Republic of the Congo, where the great focus of Tutsi-Hutu conflict is today. Tutsi rebels in the DRC accuse the government of providing cover for the Hutu militias.
17 Crazy Historical Facts That Are Worth Repeating Over and Over
As the saying goes, if you don't learn about history, then you're doomed to… Yeah, yeah, yeah. Look: Sometimes history repeats itself. Whether good (a cultural obsession with cats), bad (geopolitical strife), or just plain annoying (insufferable traffic), that's just the way the timeline works. This is all to say, history isn't a straight line. It's a circle. See for yourself!
Ancient Rome was plagued with horrendous traffic that lasted all day and all night. It was so bad that, according to 1st-century Roman scribe Decimus Iunius Iuvenalis, it caused people to literally die, thanks to recurring insomnia due to noise pollution. And you thought the 405 was bad…
To be fair, yes, that Roman writer was one of the world's first satirists, so it's possible that his whole "people died" claim was a bit of an exaggeration. But the point remains: Traffic always has been—and always will be—a terrible, unavoidable part of urban living.
The internet is positively teeming with cats. Pictures and videos of the furry felines are constantly going viral online, where they are fawned over on Reddit, Facebook, Instagram, and more. This obsession is nothing new, though. Humans have adored cats for millennia, all the way back to, most notably, ancient Egypt.
Cats were present in cultural and religious iconography, and even regarded as an important member of the household. It's said that when a cat died, all the members of the household would shave their eyebrows in mourning. Now that's love.
For nearly all of human history, people have been using cannabis for its healing properties. As long ago as 2737 B.C.E., there are records of cannabis tea being prescribed in China to treat diseases like gout, rheumatism, and malaria. Since then, it has been used medically across the globe, from the Americas to Asia, Europe, and Africa. However, since it was criminalized in the United States in 1916, many have lost sight of these healing properties. Only recent progress in legislation is beginning to change negative public perception.
Yes, the United States recently went through the longest shutdown in its history, leaving hundreds of thousands of federal employees furloughed. But we're not the only country—nor the only era—to experience serious shutdowns. In Roman times, there was a similar strategy for resolving conflict: secessio plebis.
Basically, the plebeians would abandon the city en masse, leaving those in power (the patricians) to fend for themselves, essentially forcing both parties to come to the table for discussions. And it was generally a lengthy process. In fact, historians estimate that the first secessio plebis lasted two whole years: from 495 to 493 B.C.E. At the end, the patricians created the Tribune of the Plebs, giving the hoi polloi government representation for the first time in the history of the Republic.
Secessio plebis happened four more times over the course of Roman history. The final instance occurred in 287 B.C.E., and culminated in the passage of lex hortensia, a law that granted equal political rights to plebeians and patricians—theoretically, at least.
Fact: People today are living to be older than ever before. We can attribute the elongation of human lifespans to great leaps in science and healthier living practices. To better understand just how far we've come, consider the fact that, in 1900, the world average life expectancy was 31. Today, because of the strides humanity has made, the world average life expectancy has more than doubled: to 71.5.
In his 1889 book, Foods for the Fat: A Treatise on Corpulency and a Dietary for its Cure, author Nathaniel Edward Davies comes out swinging against obesity, explaining that "The power of enjoyment is limited in the corpulent person, as exertion is attended with breathlessness, which forbids active exercise…the temperament is proverbially easy-going, indolent, and lethargic, especially after meals, although very frequently interrupted by attacks of peevishness and irritability."
To avoid becoming such a peevish corpulent person, Davies sets out a pretty clear set of dietary restrictions that "an ordinary-sized person should take." This consists of:
4.5 ounces of nitrogenous food
3 ounces of fats
14.5 ounces of carbohydrates
1 ounce of salts
For breakfast, he recommends one large cup of tea or coffee, with two to three ounces of bread or dry toast, "very thinly buttered," and three to four ounces of "any light meat or fish." For lunch, "an ordinary dish of any soup," seven or eight ounces of roast or boiled meat, fish, or any meat dish, a "small plate of any non-farinaceous pudding," and five or six ounces of fruit. For dinner: six to eight ounces of light wine, dry toast, boiled eggs, fish, or any meat dish, and a glass of whisky and water "with a few gluten biscuits."
Give or take a few carbs and alcoholic beverages, is that so different from The South Beach Diet?
People have become prisoners to their screens, spending more and more time glued to games on their smartphones, tablets, and televisions. With this rise in screen-based entertainment, there's been an equal rise in an unhealthy sedentary lifestyle. But this is nothing new: people have the same complaints about video games now as people used to have about chess, which an 1859 edition of Scientific American called an "amusement of very inferior character," claiming it had no benefit on the body.
Young people these days feel an incredible pressure to succeed at a young age, promoting unhealthy habits and damaging self-esteem. It's worthwhile to remember that success can come at any age; just look at Laura Ingalls Wilder. She was a teacher, a farmhand, and a mother before she ever even considered writing. She was 65 when Little House in the Big Woods was published and 76 when her last book was published. Now, she is remembered for having written some of America's most beloved literature.
Sure, they helped build a nation, but behind closed doors, they were as catty as children. On a 1776 trip to Staten Island, Ben Franklin and John Adams stayed overnight at a New Brunswick inn, where they shared the last room. They hardly got any sleep, though, because they couldn't stop arguing over whether to keep the window opened or closed. Even the greatest minds fall into feeble traps from time to time.
And long before President Trump came around hurling names like "Cryin' Chuck" and "Crooked Hillary," Vice President Thomas Jefferson and President John Adams had perhaps one of the ugliest presidential battles ever. The war of words culminated in Adams being called a "hideous hermaphroditical character, which has neither the force and firmness of a man, nor the gentleness and sensibility of a woman," while Jefferson was labeled "a mean-spirited, low-lived fellow" (which was followed by a series of racial slurs we won't dare repeat).
From the cultural preoccupation with craft beer to Instagram's obsession with rosé, it might seem like Americans are consuming more alcohol than ever before. While alcohol consumption per capita has been slowly growing since 1994, we look like teetotalers compared to the Americans of yore. Back in 1830, the average American consumed 7 gallons of alcohol per year. That's nearly two 750ml bottles per week! (Today, according to The Washington Post, roughly 80 percent of Americans have 6.25 drinks per week or fewer, and 30 percent don't drink alcohol at all.)
In 1798, John Adams wrote that there had never been "more new error propagated by the press" than in the decade since freedom of the press had been established in the United States. If only he knew just how much such published errors would drive apart our country centuries later. Though the term "fake news" is a recent invention, its practice is as old as print is itself. We should always remember the power of words.
The dawn of Hollywood was scandalous, especially to religious folk who claimed the movies were a conduit of immorality and sin, warping the minds of young people. A 1926 issue of the Pentecostal Evangelical complained that the beauty, clothing, and low moral standards of the film stars were being unconsciously appropriated by the youth. Glad that got resolved!
You think Philadelphia is a rough place to watch a live sports game? It's got nothing on Mesoamerica.
Before millions of people were tuning in from all over the globe to watch the World Cup, the game that brought the world the rubber ball was invented in Mesoamerica. The game was played all across the region, and held both religious and ritual significance, drawing masses of spectators. The aim was to pass the ball, using only the hips, until it went through a stone ring. However, the post-game antics make brawls between today's soccer fans look like child's play, as they often involved ritual human sacrifice. Yikes!
Ancient workers were sometimes paid in beer. At least, that's according to one 5,000-year-old pay stub that was acquired by the British Museum. This ancient receipt shows a record of the amount of beer that one employer paid workers as a wage. This was a common practice during that time, apparently, as there is evidence of a similar system in ancient Egypt. Next time your friends help you move, assure them that six-pack of IPA is totally fair compensation—it's just history, after all!
From the ruins of the city, humans have learned that Pompeii was very modern, complete with government, commerce, and, like all modern cities, graffiti . It has been found in all the places you would expect to find graffiti: in brothels, on the walls of inns—in other words, the Pompeiian equivalents to bathroom stalls. As far as what was being scrawled, well, let's just say that the subject matter hasn't changed much. (We'll let you look up the content yourself…)
The way society defines beauty changes over time. These days, we consider things like big lips, thick eyebrows, and an ample bottom typically beautiful, but there's a good chance we'll look back at these trends in a few decades and wonder what we were thinking. After all, that's how most people feel about beauty trends of the past. Examples of bizarre trends include pale white powdered skin, blackened teeth, unibrows made of goat's hair, and even the removal of eyelashes.
We can't wait to see what the future thinks of the era of social media!
More Young People Lived at Home in the 1940s Than They Do Today
Millennials are often thought of as slow-starters, sometimes living with their parents well into their 30s. But the fact is, today's twenty- and thirty-somethings are setting out on their own far earlier on average than young adults of eras past.
According to Pew, the share of young adults living with parents "peaked around 1940, when about 35 percent of the nation's 18- to 34-year-olds lived with mom and/or dad (compared with 32 percent in 2014)." Feel free to show this to Mom and Dad next time they bug you about getting an apartment.
5. Historical Issues
Issue-centered analysis and decision-making activities place students squarely at the center of historical dilemmas and problems faced at critical moments in the past and the near-present. Entering into such moments, confronting the issues or problems of the time, analyzing the alternatives available to those on the scene, evaluating the consequences that might have followed those options for action that were not chosen, and comparing with the consequences of those that were adopted, are activities that foster students’ deep, personal involvement in these events.
If well chosen, these activities also promote capacities vital to a democratic citizenry: the capacity to identify and define public policy issues and ethical dilemmas; analyze the range of interests and values held by the many persons caught up in the situation and affected by its outcome; locate and organize the data required to assess the consequences of alternative approaches to resolving the dilemma; assess the ethical implications as well as the comparative costs and benefits of each approach; and evaluate a particular course of action in light of all of the above and, in the case of historical issues-analysis, in light also of its long-term consequences revealed in the historical record.
Because important historical issues are frequently value-laden, they also open opportunities to consider the moral convictions contributing to social actions taken. For example, what moral and political dilemmas did Lincoln face when, in his Emancipation Proclamation, he decided to free only those slaves behind the Confederate lines? The point to be made is that teachers should not use critical events to hammer home a particular "moral lesson" or ethical teaching. Not only will many students reject that approach; it also fails to take into account the processes through which students acquire the complex skills of principled thinking and moral reasoning.
When students are invited to judge morally the conduct of historical actors, they should be encouraged to clarify the values that inform the judgment. In some instances, this will be an easy task. Students judging the Holocaust or slavery as evils will probably be able to articulate the foundation for their judgment. In other cases, a student’s effort to reach a moral judgment may produce a healthy student exercise in clarifying values, and may, in some instances, lead him or her to recognize the historically conditioned nature of a particular moral value he or she may be invoking.
Particularly challenging are the many social issues throughout United States history on which multiple interests and different values have come to bear. Issues of civil rights or equal education opportunity, of the right to choice vs. the right to life, and of criminal justice have all brought such conflicts to the fore. When these conflicts have not been resolved within the social and political institutions of the nation, they have regularly found their way into the judicial system, often going to the Supreme Court for resolution.
As the history course approaches the present era, such inquiries assume special relevance, confronting students with issues that resonate in today's headlines and invite their participation in lively debates, simulations, and Socratic seminars: settings in which they can confront alternative policy recommendations, judge their ethical implications, challenge one another's assessments, and acquire further skills in the public presentation and defense of positions. In these analyses, teachers have the special responsibility of helping students differentiate between (1) relevant historical antecedents and (2) those that are clearly inappropriate and irrelevant. Students need to learn how to use their knowledge of history (or the past) to bring sound historical analysis to the service of informed decision making.
HISTORICAL THINKING STANDARD 5
The student engages in historical issues-analysis and decision-making:
Part I: The Origins of the Conflict Model.
The conflict model of science and faith can be traced to the late 19th century and the work of two American authors, whose historical claims were discredited both then, and repeatedly since, by serious historians. One of them was a scientist and popular history writer named John William Draper, and the other a historian named Andrew Dickson White. It is no exaggeration to say that these two men together invented the model, which so many today still accept as unquestionable. In fact, it is often simply called the Draper and White Conflict Thesis by historians. To understand its origins, we have to go back several centuries and recognize three trends, two intellectual and one sociocultural, that set the stage for the success of Draper and White.
The first intellectual development, which goes back to the 17th century, was a suspicion of any Christian doctrines other than moral teachings. Terms such as "dogma," "divine mystery," and "articles of faith" began to be used pejoratively to imply foolishness and fear of progress—and even religious deception. This is best captured in a letter that Thomas Jefferson wrote in 1816 to his friend, the Dutch minister Adrian van der Kemp, about the dogma of the Trinity: "Ridicule," he wrote, "is the only weapon which can be used against unintelligible propositions. Ideas must be distinct before reason can act upon them and no man ever had a distinct idea of the trinity. It is the mere Abracadabra of the [tricksters] calling themselves the priests of Jesus."
By the late 19 th century, dogmas had begun to be seen by many as anti-rational, the products of blind, dangerous faith. Many thought that science should be made to replace dogmas through a crusade to rescue religion from irrational ideas. Lost to view was the recognition that Christian dogmas can be rational, even though they relate to realities that are by their nature not fully comprehensible by the human mind, concerning as they do the self-revelation of God rather than facts about the physical universe.
The second intellectual trend took place in the 19th century and was much more positive. The various areas of study which we now refer to with the umbrella term "science," such as physics, chemistry, biology, etc., were becoming professionalized, taking on a whole new level of respectability and exciting popular enthusiasm through the new knowledge and industrial and medical benefits they were producing. For science, it was one of the best of times. This was the age of Lyell's geology giving the first glimpse of the ancient age of the earth, of Pasteur's germ theory, and, above all, of Darwin's Origin of Species. As a result, science as we define it today began to stand out as a specific and separate pursuit. This change in perception even involved a change in vocabulary.
Before the 19th century, the word "science" (from Latin scientia meaning "knowledge") referred to any knowledge demonstrated logically, including theological knowledge. The words "philosophy" and "science" were treated as synonyms, as in the title of a book published in 1821: Elements of the Philosophy of Plants Containing the Scientific Principles of Botany. But by the late 19th century the terms "science" and "scientific method" began to be associated exclusively with the study of the physical universe through observation and experiment. This change in perception added new words to the English vocabulary, terms such as "scientist" and "physicist," which were coined in 1833 by the Anglican theologian and natural philosopher William Whewell (1794-1866). Sadly, the restriction of the science "word family" to one kind of human knowledge left open the possibility that other areas of knowledge such as philosophy, art, morality, poetry, and theology could be considered as unfruitful, subjective flights of fancy by comparison.
The third trend, Anglo-American in its roots, was sociocultural: the rise of anti-Catholic prejudice, even mania, in the United States as a response to the large influx of Irish and other Catholic immigrants that began in the mid-1840s. From the perspective of the Catholic Church in America, the mid- to late-19th century was one of the worst of times, and the decade of the 1870s marked a high point of anti-Catholic prejudice. The American bishops were seeking tax-exempt status for tuition at Catholic schools, and the battle was fierce. In 1871, in Harper's Weekly, the famous political cartoonist Thomas Nast published what many regard as one of his most powerful images, "The American River Ganges."
The image shows a Protestant public school teacher, with a Bible tucked in his waistcoat, shielding a group of young children from menacing crocodiles, who are creeping up the shore in order to devour them. When the crocodiles are viewed closely, one realizes that their jaws are ornate, jewel-encrusted miters, and that the predators are actually Irish Catholic bishops. On the cliff, the New York politician William Tweed, a.k.a. “Boss Tweed,” and his cohorts are handing children down to be devoured. Behind him there is a gallows and Lady Liberty is being led away to be hanged. Across the water is what looks like St. Peter’s Basilica, but the name inscribed on it is Tammany Hall, the Democratic Party political machine run by Boss Tweed. Over the colonnade of the basilica can be seen the words “The Political Roman Catholic School.” The U.S. Public School in the foreground is crumbling.
The majority of Catholic immigrants were poor and illiterate, which gave their religion an air of ignorance and superstition to non-Catholics. A largely successful attempt to forbid public aid to Catholic schools drew upon these prejudices and upon fears that Catholics secretly wanted to bring the entire nation under the political control of the pope by corrupting education. A bias against the possibility of Catholics being open to the progress of knowledge ruled the day.
Science was identified with progress, and Catholicism with backwardness. Science brought knowledge, whereas Catholicism with its dogmas and mysteries was seen as fostering ignorance. This was the soil in which false claims about the history of the Church and science could take root and flourish, and such claims were not long in coming.
In 1874, John William Draper (1811-1882), a successful American chemist and early innovator of photography, published his book entitled History of the Conflict Between Religion and Science. He begins by making a generalized judgment: "The history of Science is not a mere record of isolated discoveries; it is a narrative of the conflict of two contending powers, the expansive force of the human intellect on one side, and the compression arising from [traditional] faith." Shortly after this declaration, he qualifies it by proclaiming the innocence of Protestant and Eastern Orthodox Christians, whom he claims have never opposed the advancement of knowledge and have always had "a reverential attitude to truth, from whatever quarter it might come." He later refers to Protestantism as the "twin-sister" of science. The true religious enemy of science is the Roman Catholic Church, which he indicts for rejecting science and using violent means to maintain power over its adherents, with the long-term goal of gaining total political supremacy over all peoples:
In speaking of Christianity, reference is generally made [in this book] to the Roman Church . . . None of the Protestant Churches has ever occupied a position so imperious—none has ever had such widespread political influence . . . But in the Vatican—we have only to recall the Inquisition—the hands that are now raised in appeals to the Most Merciful are crimsoned. They have been steeped in blood!
Throughout the rest of the book, Draper alleges conflict after conflict between Catholicism and science while offering little or no evidence. He makes up details and presents them as facts. He rearranges sequences of events in order to support his position. He selects quotes that seem to support his case and fails to give the context, even leaving out parts of quotes that call into question his interpretation of them.
For instance, Draper condemns St. Augustine (354-430) for teaching that the sky is stretched out like a flat skin over a flat earth. In fact, St. Augustine quotes Psalm 104:2 in order to demonstrate his principle that the Bible must be read figuratively, not literally, in its depictions of natural phenomena, and he affirms the very position Draper accuses him of rejecting: “rational arguments,” St. Augustine concludes, “inform us that the sky has the shape of a hollow globe all round us.” Draper concludes the book with a prophecy of doom for religion and victory for science:
As to the issue of the coming conflict, can anyone doubt? Whatever is resting on fiction and fraud will be overthrown. Institutions that organize impostures and spread delusions must show what right they have to exist. Faith must render an account of herself to Reason. Mysteries must give place to facts. Religion must relinquish that imperious, that domineering position which she has so long maintained against Science.
Despite his fury and contempt for Catholicism, or, more likely, because of it, Draper’s book was an instant success. It outsold every other book in the series in which it was included. Since then it has been reprinted 50 times and translated into 10 languages. It remains readily available.
Numerous critics emerged to respond to Draper’s work, including Orestes Brownson, a celebrated intellectual and a Catholic convert. A common theme of their criticisms was that The Conflict seemed to be written with the primary aim of achieving bestseller status rather than historical accuracy. In the May 23, 1875 issue of a San Francisco newspaper called The Daily Alta California, a reviewer put it this way: “He may be a rhapsodist, but he is no historian. He is neither unprejudiced nor painstaking. If he investigate(d) authorities, he does not dare to cite them to sustain his ballooning [allegations]. His book is an immense pretension.” The anonymous author of this review knew that the facts of history were often the opposite of what Draper claimed and showed that Draper was not invincibly ignorant, just malicious.
The reviewer corrected Draper on three claims:
- he noted that the murder of the philosopher Hypatia by a mob in Alexandria, Egypt, in 415 AD was animated not by Christian fear and envy of her skill in mathematics and science but by politics.
- he noted that Giordano Bruno was executed by the Roman Inquisition not for his belief in a plurality of worlds and a heaven filled with “space and stars,” as Draper claimed, but for theological heresies. And,
- he pointed out that Galileo’s condemnation had more to do with his recklessness and lack of discretion than an entrenched ecclesiastical or theological antagonism toward cosmologies that “threatened” the assertions of the Bible.
Contemporary historians of science also dismiss Draper’s book as an exercise in propaganda rather than scholarship. Galileo Goes to Jail and Other Myths of Science and Religion, a collection of essays by noted experts, includes discussions of several of the historical myths invented by Draper.
Andrew Dickson White (1832-1918) was an American historian, who in 1865 cofounded Cornell University, the first purely secular institution of higher learning in the United States. This resulted in criticism for separating learning from religion—criticism that came mostly from competitors at Protestant institutions of higher education. In response, White decided to write a book showing that both religion and science would be better off once “dogmatic theology,” a subject not included in the curriculum at Cornell, was overcome. “I will give them a lesson which they will remember,” he wrote to his friend Ezra Cornell in 1869.
White delivered this “lesson” to his opponents over the next 27 years, during which he published 27 articles, which he finally brought together in 1896 in a two-volume work called History of the Warfare of Science with Theology in Christendom. He begins the book by praising Draper for “his work of great ability” and then goes on to repeat many of Draper’s errors, including one that is widely believed to this day: the infamous “flat-earth dogma.” White claims that until Christopher Columbus’s time the majority of Christian thinkers had insisted on biblical grounds that the earth was flat, and that a flat earth was practically a dogma of the Church. In reality, only two Christian authors of record, the early Christian writer Lactantius and the relatively obscure 6th-century Greek traveler and monk Cosmas Indicopleustes, had ever argued that the earth is flat.
By contrast, St. Augustine, St. Jerome, St. Ambrose, St. Albert the Great, and many other ancient and medieval Christian theologians testified to the rotundity of the earth, as did such major popular writers as Dante and Chaucer. In fact, St. Thomas Aquinas, in the very first article of the first question of the first book of his enormous Summa Theologiae, says, “Sciences are differentiated according to the various means through which knowledge is obtained. For the astronomer and the physicist both may prove the same conclusion, for instance that the earth is round, [but in different ways].”
Despite this mountain of evidence, White portrays the entire Christian tradition as committed to flat-eartherism, and presents Lactantius and Cosmas as typical. To add a touch of drama, he adopts Washington Irving’s fictional account of Christopher Columbus struggling unsuccessfully to convince Catholic priests and professors that the earth is spherical at the University of Salamanca in 1487:
The warfare of Columbus the world knows well . . . how sundry wise men of Spain confronted him with the usual quotations from the Psalms, from St. Paul, and from St. Augustine; how, even after he was triumphant, and after his voyage had greatly strengthened the theory of the earth’s sphericity . . . the Church by its highest authority solemnly stumbled and persisted in going astray.
Had White done his homework, he would have discovered that all parties at Salamanca agreed with Columbus that the Earth is spherical. What they debated was the size of the Earth, not its shape. Columbus thought it was small enough that he could reach Asia with sufficient supplies, while his opponents knew that it was much larger (and their estimates of the Earth’s circumference were quite accurate). What neither side could have known was that between Europe and Asia lay the Americas (luckily for Columbus).
The “one-two punch” of Draper’s and White’s books has had a remarkable, long-standing effect on popular opinion. Appealing to the prejudices of their day, especially anti-Catholicism, and riding the wave of enthusiasm for scientific progress, they created the very conflict they claimed to resolve. The errors and misrepresentations they foisted upon their readers are now routinely repeated as historical facts by non-historians and have been given new life in the work of popularizers such as Neil deGrasse Tyson, who in his 2014 TV series Cosmos adopted Draper’s account of the execution of Giordano Bruno. The flat-earth “dogma” idea is now so widespread that many learn it in elementary school. In 2012, even U.S. President Barack Obama repeated it in a jibe against political opponents: “If some of these folks were around when Columbus set sail, they probably would have been founding members of the Flat Earth Society. They would not have believed that the world was round.”
If Draper and White created the completely false story that the Catholic Church has been hostile to science, then what is the true story? How have the Church and her theologians understood the relation of science and faith?
The Einstein-Bohr legacy: can we ever figure out what quantum theory means?
Quantum theory has weird implications. Trying to explain them just makes things weirder.
- The weirdness of quantum theory flies in the face of what we experience in our everyday lives.
- Quantum weirdness quickly created a split in the physics community, each side championed by a giant: Albert Einstein and Niels Bohr.
- As two recent books espousing opposing views show, the debate still rages on nearly a century afterward. Each "resolution" comes with a high price tag.
Albert Einstein and Niels Bohr, two giants of 20th-century science, espoused very different worldviews.
To Einstein, the world was ultimately rational. Things had to make sense. They should be quantifiable and expressible through a logical chain of cause-and-effect interactions, from what we experience in our everyday lives all the way to the depths of reality. To Bohr, we had no right to expect any such order or rationality. Nature, at its deepest level, need not follow any of our expectations of well-behaved determinism. Things could be weird and non-deterministic, so long as they become more like what we expect as we travel from the world of atoms to our world of trees, frogs, and cars. Bohr divided the world into two realms: the familiar classical world and the unfamiliar quantum world. They should be complementary to one another but with very different properties.
The two scientists spent decades arguing about the impact of quantum physics on the nature of reality. Each had a group of physicists as followers, all of them giants in their own right. Einstein's group of quantum weirdness deniers included quantum physics pioneers Max Planck, Louis de Broglie, and Erwin Schrödinger, while Bohr's group had Werner Heisenberg (of uncertainty principle fame), Max Born, Wolfgang Pauli, and Paul Dirac.
Almost a century afterward, the debate rages on.
The story of the Troubles is inextricably entwined with the history of Ireland as a whole and, as such, can be seen as stemming from the first British incursion on the island, the Anglo-Norman invasion of the late 12th century, which left a wave of settlers whose descendants became known as the “Old English.” Thereafter, for nearly eight centuries, England and then Great Britain as a whole would dominate affairs in Ireland. Colonizing British landlords widely displaced Irish landholders. The most successful of these “plantations” began taking hold in the early 17th century in Ulster, the northernmost of Ireland’s four traditional provinces and previously a centre of rebellion, where the planters included English and Scottish tenants as well as British landlords. Because of the plantation of Ulster, as Irish history unfolded (with the struggle for the emancipation of the island’s Catholic majority under the supremacy of the Protestant ascendancy, along with the Irish nationalist pursuit of Home Rule and then independence after the island’s formal union with Great Britain in 1801), Ulster developed as a region where the Protestant settlers outnumbered the indigenous Irish. Unlike earlier English settlers, most of the 17th-century English and Scottish settlers and their descendants did not assimilate with the Irish. Instead, they held tightly to their British identity and remained steadfastly loyal to the British crown.
Camp David Accords and the Arab-Israeli Peace Process
The Camp David Accords, signed by President Jimmy Carter, Egyptian President Anwar Sadat, and Israeli Prime Minister Menachem Begin in September 1978, established a framework for a historic peace treaty concluded between Israel and Egypt in March 1979. President Carter and the U.S. Government played leading roles in creating the opportunity for this agreement to occur. From the start of his administration, Carter and his Secretary of State, Cyrus Vance, pursued intensive negotiations with Arab and Israeli leaders, hoping to reconvene the Geneva Conference, which had been established in December 1973 to seek an end to the Arab-Israeli dispute.
As Carter and Vance met with individual leaders from Arab countries and Israel during the spring of 1977, negotiations for a return to Geneva appeared to gain some momentum. On May 17, 1977, an Israeli election upset stunned the Carter administration as the moderate Israeli Labor Party lost for the first time in Israel’s history. Menachem Begin, the leader of the conservative Likud Party and the new Israeli Prime Minister, appeared intractable on the issue of exchanging land for peace. His party’s commitment to “greater Israel” left Carter with an even more challenging situation during the summer of 1977.
In addition to the new reality of a Likud government in Israel, long-standing rivalries among Arab leaders also played a role in blocking substantive progress in negotiations for a Geneva conference. By early November, Egyptian President Sadat found himself frustrated by the lack of movement and made a dramatic move, announcing on November 9 that he would be willing to go to Jerusalem. This move stunned the world. Sadat would attempt to break the deadlock and to engage the Israelis directly for a Middle East settlement, eschewing any talk of returning to the Geneva Conference. Sadat’s visit led to direct talks between Egypt and Israel that December, but these talks did not generate substantive progress. By January 1978, the United States returned to a more prominent negotiation role.
During the spring and early summer of 1978, the United States attempted to find common ground with regard to Israeli withdrawal from the Sinai, West Bank, and Gaza. Egypt insisted on an Israeli withdrawal to June 4, 1967 borders in exchange for security arrangements and minor border modifications. Israel rejected Egypt’s insistence on withdrawal, especially from the West Bank and Gaza. It argued instead for some form of Palestinian autonomy during a five-year interim period followed by the possibility of sovereignty after the interim period expired. The impasse over the West Bank and Gaza led Carter to intercede directly in an attempt to resolve the deadlock.
By July 30, as Sadat expressed disappointment over the progress of negotiations and a desire to cut off direct contacts with the Israelis, Carter decided to call for a summit meeting. This meeting would bring Sadat, Begin, and Carter together at Camp David, the presidential retreat in Maryland. On August 8, the White House spokesman formally announced the meeting, which both Begin and Sadat agreed to attend in September.
The Camp David Summit, held from September 5–17, 1978, was a pivotal moment both in the history of the Arab-Israeli dispute and U.S. diplomacy. Rarely had a U.S. President devoted as much sustained attention to a single foreign policy issue as Carter did over the summit’s two-week duration. Carter’s ambitious goals for the talks included breaking the negotiating deadlock and hammering out a detailed Egyptian-Israeli peace agreement. To this end, U.S. Middle East experts produced a draft treaty text, which served as the basis for the negotiations and would be revised numerous times during the Summit. The talks proved extremely challenging, especially when the trilateral format became impossible to sustain. Instead, Carter and Vance met with the Egyptian and Israeli delegations individually over the course of the next twelve days.
The talks ranged over a number of issues, including the future of Israeli settlements and airbases in the Sinai Peninsula, but it was Gaza and the West Bank that continued to pose the most difficulty. Specifically, the delegations were divided over the applicability of United Nations Security Council Resolution 242 to a long-term agreement in the territories, as well as the status of Israel’s settlements during projected negotiations on Palestinian autonomy that would follow a peace treaty. In the end, while the Summit did not produce a formal peace agreement, it successfully produced the basis for an Egyptian-Israeli peace, in the form of two “Framework” documents, which laid out the principles of a bilateral peace agreement as well as a formula for Palestinian self-government in Gaza and the West Bank.