The Origins of the United States Two-Party System

George Washington believed that political parties would be damaging to American society and needed to be avoided. Yet the politics of the 1790s (like that of the United States today) was dominated by the arguments of two distinct political groups: the Federalists and the Anti-Federalists.

“If we mean to support the liberty and independence which has cost us so much blood and treasure to establish, we must drive far away the daemon of party spirit and local reproach” – George Washington

The political parties of the 1790s emerged because of disagreements over three main issues: the nature of government, the economy and foreign policy. By understanding these disagreements we can begin to understand the conditions that allowed for the origin of the two-party system in the United States.

Federalists & Democratic-Republicans

Disagreements about how the United States should be governed emerged immediately after the revolution. However, these disagreements escalated considerably in the 1790s and are best understood by examining the arguments between Alexander Hamilton (leader of the Federalists) and Thomas Jefferson (leader of the Anti-Federalists, also known as the Democratic-Republicans).

Jefferson and Hamilton’s first major disagreement emerged over the nature of Government. Alexander Hamilton believed that for the United States to be successful, it would have to be modelled on the British imperial system that had proved so successful.

It would need a strong central Government, treasury and financial sector, a national army and a strong political executive representing the interests of all the states.

Jefferson’s preferences

Jefferson, a Southern plantation owner from Virginia, saw himself as a Virginian first and an American second. He believed that a central treasury and national army would endow the central government with too much power, and that an economy driven by finance would lead to reckless gambling.

He also thought a strong President would be no better than “a Polish King”, a reference to the Polish tradition of aristocrats electing their monarch from among their number. Furthermore, Jefferson was deeply mistrustful of the British and saw Hamilton’s preference for a British-style system as dangerous to the hard-won freedoms of the American Revolution.

Jefferson’s preference was for political power to reside with individual states and their legislatures, not in a central government.

Arguments on the economy

The building which housed the First Bank of the United States in Philadelphia, completed in 1795.

As well as the nature of government (a more abstract idea), Hamilton and Jefferson (and their allies) argued about more pressing economic matters. Hamilton was Secretary of the Treasury under George Washington and had a very difficult job.

Under the previous Articles of Confederation, the Government could request money from states but had no formal tax-raising powers. This made it very difficult for the newly formed United States to pay its international loans or raise an army.

Under Hamilton’s financial plans, the central Government would have tax-raising powers, form a national bank and print paper money to be used across all the states.

However, Jefferson and his Anti-Federalist allies believed this was just another way for the Federalists to centralise power, reduce states’ rights and work in the interests of the financial sector (primarily based in the North) at the expense of the agricultural sector (primarily in the South).

Disagreement on foreign policy

As well as the nature of Government and the economy, divisions between the Federalists and the Anti-Federalists deepened because of profound disagreements about foreign policy.

Jefferson, who had spent much time in France and saw the French Revolution as an extension of the American Revolution, was dismayed by the ambivalence shown by Hamilton and George Washington to France.

He believed, as did his Anti-Federalist allies, that this was further evidence of Hamilton’s desire to drive the United States back into the arms of Britain.

Hamilton, however, saw the French Revolution as unstable and was convinced that only improved relations with Britain would lead to economic prosperity in the United States.

The defeat of the Federalists

John Adams, 2nd President and a long-time friend and rival of Jefferson and his Democratic-Republicans.

The Federalists’ hold on power ended in 1800, when Thomas Jefferson’s Anti-Federalist party, the Democratic-Republicans, beat his old friend John Adams and the Federalists to the Presidency. But this very difficult decade, marked by mistrust, the rise of factional newspapers and profound arguments about the future of the United States, provided the origins of the two-party system in the United States today.



What are political parties?

Political parties are groups of people that are organized based on their political beliefs and goals. In some cases, political parties are large powerful organizations that run much of the government.

In the United States there are two main political parties: Democrats and Republicans. These two parties run much of the government. Because these two parties are so powerful, the United States government is often called a "two-party system."

Elections in a Two-Party System

The elections in a two-party system are often held in two phases. The first phase is the primary election. In the primary election each party elects a candidate to represent their party. The next phase is called the general election. In the general election, the public votes between the winners of the primary election.

These elections are sort of like playoffs in sports. The primary elections are like the semifinals and the general elections are like the finals.

The Democratic Party was founded in 1828. It is generally associated with larger government programs and higher taxes. Members of the Democratic Party are often referred to as "liberals" or "progressives." The symbol of the Democratic Party is the donkey.

The Republican Party was founded in 1854 by anti-slavery activists. It is generally associated with smaller government and lower taxes. Members of the Republican Party are often referred to as "conservatives." The symbol of the Republican Party is the elephant.

There are other political parties in the United States, but they have not been able to make a significant impact in the government. Some of these parties include the Libertarian Party, the Green Party, and the Constitution Party. Political parties that have had power in the past include the Whigs, the Federalists, and the Democratic-Republicans.

Advantages and Disadvantages

There are good and bad things about a two-party system. On the plus side, having only two parties helps the government run more smoothly. Two-party systems can lead to a more stable government and less radical politics. On the negative side, two-party systems give voters only two choices. Voters may start to think that their vote doesn't count for much, causing them not to participate. It also makes it difficult for people with new ideas to have an influence in the government.

Sometimes political parties are described as being "left-wing" or "right-wing." The Democrats are considered "left" and the Republicans "right." The terms "left" and "right" originally came from the National Assembly during the French Revolution, when the supporters of the king sat on the right and the supporters of the revolution on the left.


The History of Welfare

Welfare in the United States commonly refers to the federal government welfare programs that have been put in place to assist the unemployed or underemployed. Help is extended to the poor through a variety of government welfare programs that include Medicaid, the Women, Infants, and Children (WIC) Program, and Aid to Families with Dependent Children (AFDC).

Historical Poverty Rate in the US

Welfare is a fluid topic that cannot be discussed without first understanding the history of poverty in the United States. Many welfare programs are tied directly to the poverty line, which is defined federally on an annual basis.

The poverty line depends on the number of members in a household. For example, in 2017 the poverty line for a single adult was $12,488, but for a family of four it was $25,094. In 2000, those numbers were $8,791 and $17,604, respectively.

[Chart: the poverty line for a family of four, defined annually, 1959-2017, as a reference point.]

Early History

The history of welfare in the U.S. started long before the government welfare programs we know were created. In the early days of the United States, the colonies imported the British Poor Laws. These laws made a distinction between those who were unable to work due to their age or physical health and those who were able-bodied but unemployed. The former group was assisted with cash or alternative forms of help from the government. The latter group was given public service employment in workhouses.

Welfare history continued throughout the 1800s, when there were attempts to reform how the government dealt with the poor. Some changes tried to help the poor move to work rather than continuing to need assistance. Social casework, consisting of caseworkers visiting the poor and training them in morals and a work ethic, was advocated by reformers in the 1880s and 1890s.

Prior to the Great Depression, the United States Congress supported various programs to assist the poor. One of these, the Civil War Pension Program, was passed in 1862 and provided aid to Civil War veterans and their families.

When the Great Depression hit, many families suffered. It is estimated that one-fourth of the labor force was unemployed during the worst part of the depression. With many families suffering financial difficulties, the government stepped in to solve the problem, and that is where the history of welfare as we know it really began.

Under President Franklin D. Roosevelt, the Social Security Act was enacted in 1935. The act, which was amended in 1939, established a number of programs designed to provide aid to various segments of the population. Unemployment compensation and AFDC (originally Aid to Dependent Children) are two of the programs that still exist today.

A number of government agencies were created to oversee the welfare programs. Some of the agencies that deal with welfare in the United States are the Department of Health and Human Services (HHS), the Department of Housing and Urban Development (HUD), the Department of Labor, the Department of Agriculture, and the Department of Education.

1990: Head Start State Collaboration

The Head Start State Collaboration Offices were first funded in 1990 as a pilot project much like the Head Start program that started as an experiment in 1965. At first, 12 states were funded. The purpose was to create significant, statewide partnerships between Head Start and the states in order to meet the increasingly complex, intertwined, and difficult challenges of improving services for economically disadvantaged children and their families. Funding for ten more states followed two years later. By 1997, all 50 states, Washington, DC, and Puerto Rico had Collaboration Offices. In 2008, the American Indian/Alaska Native and the Migrant and Seasonal Head Start programs established Collaboration Offices.

Welfare history continued to be made in 1996, when President Bill Clinton signed the Personal Responsibility and Work Opportunity Reconciliation Act. Under the act, the federal government gives annual lump sums to the states to use to assist the poor. In turn, the states must adhere to certain criteria to ensure that those receiving aid are being encouraged to move from welfare to work. Though some have criticized the program, many acknowledge it has been successful.

Those who seek welfare information can find such information on the Internet or by looking under United States Government in their local phone book. Programs are available to those who qualify to provide welfare help in the areas of health, housing, tax relief, and cash assistance.


Democratic-Republican division

At the beginning of the 19th century the Democratic-Republicans were largely victorious and dominant. The Federalists, in turn, slowly faded, eventually dissolving. Because the Democratic-Republicans were so popular, the party had no fewer than four candidates pitted against each other in the presidential election of 1824. John Quincy Adams won the presidency, despite Andrew Jackson winning the popular vote. This sparked a strong political division within the party, which eventually caused it to split in two: the Democrats and the Whig Party. The Democrats were led by Andrew Jackson. He was against the existence of the Bank of the United States and largely supported states’ rights and minimal government regulation. The Whig Party stood in distinct opposition to Jackson and the Democrats, and supported the national bank.

The donkey in the Democratic Party’s logo is said to derive from Andrew Jackson’s opponents calling him a “jackass”. “Jackass” is both another word for a male donkey and a nickname for an unintelligent or foolish person. Instead of disputing the nickname, Jackson embraced it, and the donkey has since become a symbol of the Democratic Party in general.


Federalists and Democratic-Republicans

With the two-party system of government in its founding stages in the United States, events were taking place an ocean away that would further the evolution of the Federalist and the Democratic-Republican parties. The people of France were taking their cues from the American Revolution and rebelling against the authoritarian leadership of King Louis XVI. As war ensued between France and Great Britain in 1793, conflict arose in America as the Federalists and Democratic-Republicans disagreed on where to place their loyalties.

According to the Franco-American Alliance of 1778, the United States was bound to aid France whenever called upon. But at the time of the Alliance, no one could foresee that France would become embroiled in conflict against Britain and that the United States might be called upon to repel British forces from French lands. The emerging American political parties took opposite sides on the issue. The Democratic-Republicans wanted to demonstrate loyalty to the French, who had helped them claim their own liberty, although Jefferson only wanted to lend moral support. He did not believe that the French would call upon the United States to uphold their end of the treaty. Conversely, the Federalists, under Hamilton’s leadership, implored President Washington to declare the 1778 treaty suspended. Hamilton’s primary goal was to maintain a peaceful relationship with Britain to ensure continued trade to support the American economy.

George Washington’s response was an action of inaction. He issued the Neutrality Proclamation in 1793, which declared the United States neutral between Britain and France and strongly urged people to avoid any alliance with either camp. The Democratic-Republicans were outraged, not only by the declaration itself, but by Washington’s failure to consult Congress before issuing the proclamation. The Federalists, for the most part, were pleased.

Citizen Edmond Genêt, a French representative to the United States, set out to take advantage of the conflict. Upon meeting with Democratic-Republicans, he came to believe that the Neutrality Proclamation was more a governmental display of excess authority than a reflection of the public’s desire. He began to recruit unauthorized American armies to overtake Spanish Florida and Louisiana, along with parts of British Canada, in support of the Franco-American Alliance. Genêt even threatened to overthrow Washington himself. However, Washington prevailed by demanding and receiving Genêt’s withdrawal from the United States and replacement with a more rational French representative.

The Democratic-Republicans perpetually found themselves at odds with the Federalists as the British continued to battle with France. Britain ignored Washington’s Proclamation of Neutrality, assumed America was allied with France, and seized ships in the West Indies and captured many American sailors. Although both the Federalists and the Democratic-Republicans were outraged, they had very different opinions about how America should respond. Under Hamilton’s leadership, the Federalists were most concerned with the economy and wanted to avoid war at all costs. In contrast, the Democratic-Republicans following Jefferson’s leadership felt America was obligated to again fight Britain for its liberty.

Washington stepped in to contain the situation. He sent Federalist Chief Justice John Jay to London in 1794 to negotiate a treaty with Britain to maintain trade relations and avoid war. Yet again the Democratic-Republicans were unhappy with Washington’s actions, fearing that Jay, who was notoriously pro-British, would betray his own country.

Meanwhile Hamilton, fearful of war and ensuing economic disaster, sabotaged Jay’s negotiations by sharing U.S. negotiation tactics with the British. Not surprisingly, Jay’s negotiations were ineffective, garnering only minor victories for the United States. Jay’s Treaty gave the British 18 months to withdraw from the western forts, although they were given the right to continue the fur trade with the Indians. The treaty also called for America to repay debts incurred to England during the Revolutionary War. Although there was public outcry over the treaty, the Senate ratified it in 1795.

The Democratic-Republicans raged, while the effects of Jay’s Treaty rippled across the United States and beyond. Spain, fearing that the treaty indicated burgeoning loyalties between the U.S. and England, moved to gain a foothold by establishing its own alliance with America. In Pinckney’s Treaty of 1795, the Spanish granted almost all the United States’ requests, including ownership of the previously disputed territory north of Florida. This treaty also gave American western farmers and traders the right of deposit at New Orleans.


Lesson Summary

Our political party system has evolved over more than 200 years. The one constant is that it has always involved the formation of only two major political parties. In the beginning, Americans were at odds over the power of the federal government, which resulted in a split between the Federalists and Democratic-Republicans. This was followed by growing animosity between commercial interests and bankers on one side and farmers and Western settlers on the other, which led to the formation of the Whigs and Democrats, respectively.

But this would not be the last time that our parties would undergo changes. Further major historical events would continue to bring change to our two-party system, and you can learn more about that in the next lesson on the history of political parties in the United States.


New England

The first American schools in the thirteen original colonies opened in the 17th century. Boston Latin School was founded in 1635 and is both the first public school and oldest existing school in the United States. [1] The first free taxpayer-supported public school in North America, the Mather School, was opened in Dorchester, Massachusetts, in 1639. [2] [3] Cremin (1970) stresses that colonists tried at first to educate by the traditional English methods of family, church, community, and apprenticeship, with schools later becoming the key agent in "socialization." At first, the rudiments of literacy and arithmetic were taught inside the family, assuming the parents had those skills. Literacy rates were much higher in New England because much of the population had been deeply involved in the Protestant Reformation and learned to read in order to read the Scriptures. Literacy was much lower in the South, where the Anglican Church was the established church. Single working-class people formed a large part of the population in the early years, arriving as indentured servants. The planter class did not support public education but arranged for private tutors for their children, and sent some to England at appropriate ages for further education.

By the mid-19th century, the role of the schools in New England had expanded to such an extent that they took over many of the educational tasks traditionally handled by parents. [4] [5]

All the New England colonies required towns to set up schools, and many did so. In 1642 the Massachusetts Bay Colony made "proper" education compulsory; other New England colonies followed this example. Similar statutes were adopted in other colonies in the 1640s and 1650s. [6] In the 18th century, "common schools" were established; students of all ages were under the control of one teacher in one room. Although they were publicly supplied at the local (town) level, they were not free. Students' families were charged tuition or "rate bills."

The larger towns in New England opened grammar schools, the forerunner of the modern high school. [7] The most famous was the Boston Latin School, which is still in operation as a public high school. Hopkins School in New Haven, Connecticut, was another. By the 1780s, most had been replaced by private academies. By the early 19th century New England operated a network of private high schools, now called "prep schools," typified by Phillips Andover Academy (1778), Phillips Exeter Academy (1781), and Deerfield Academy (1797). They became the major feeders for Ivy League colleges in the mid-19th century. [8] These prep schools became coeducational in the 1970s, and remain highly prestigious in the 21st century. [9] [10]

The South

Residents of the Upper South, centered on the Chesapeake Bay, created some basic schools early in the colonial period. In late 17th century Maryland, the Catholic Jesuits operated some schools for Catholic students. [11] Generally the planter class hired tutors for the education of their children or sent them to private schools. During the colonial years, some sent their sons to England or Scotland for schooling.

In March 1620, George Thorpe sailed from Bristol for Virginia. He became a deputy in charge of 10,000 acres (4,000 ha) of land to be set aside for a university and Indian school. The plans for the school for Native Americans ended when George Thorpe was killed in the Indian Massacre of 1622. In Virginia, rudimentary schooling for the poor and paupers was provided by the local parish. [12] Most elite parents either home schooled their children using peripatetic tutors or sent them to small local private schools. [13]

In the Deep South (Georgia and South Carolina), schooling was carried out primarily by private venture teachers and a hodgepodge of publicly funded projects. In the colony of Georgia, at least ten grammar schools were in operation by 1770, many taught by ministers. The Bethesda Orphan House educated children. Dozens of private tutors and teachers advertised their service in newspapers. A study of women's signatures indicates a high degree of literacy in areas with schools. [14] In South Carolina, scores of school projects were advertised in the South Carolina Gazette beginning in 1732. Although it is difficult to know how many ads yielded successful schools, many of the ventures advertised repeatedly over years, suggesting continuity. [15] [16]

After the American Revolution, Georgia and South Carolina tried to start small public universities. Wealthy families sent their sons North to college. In Georgia public county academies for white students became more common, and after 1811 South Carolina opened a few free "common schools" to teach reading, writing and arithmetic.

Republican governments during the Reconstruction era established the first public school systems to be supported by general taxes. Both whites and blacks would be admitted, but legislators agreed on racially segregated schools. (The few integrated schools were located in New Orleans).

Particularly after white Democrats regained control of the state legislatures in former Confederate states, they consistently underfunded public schools for blacks, a practice that continued until 1954, when the United States Supreme Court declared state laws establishing separate public schools for black and white students to be unconstitutional.

Generally, public schooling in rural areas did not extend beyond the elementary grades for either whites or blacks; this was known as "eighth grade school". [17] After 1900, some cities began to establish high schools, primarily for middle-class whites. In the 1930s roughly one-fourth of the US population still lived and worked on farms, and few rural Southerners of either race went beyond the 8th grade until after 1945. [18] [19] [20] [21]

Women and girls

The earliest continually operating school for girls in the United States is the Catholic Ursuline Academy in New Orleans. It was founded in 1727 by the Sisters of the Order of Saint Ursula. The Academy graduated the first female pharmacist. The first convent established in the United States supported the Academy. This was the first free school and first retreat center for young women. It was the first school to teach free women of color, Native Americans, and female African-American slaves. In the region, Ursuline provided the first center of social welfare in the Mississippi Valley and it was the first boarding school for girls in Louisiana, and the first school of music in New Orleans. [22]

Tax-supported schooling for girls began as early as 1767 in New England. It was optional and some towns proved reluctant to support this innovation. Northampton, Massachusetts, for example, was a late adopter because it had many rich families who dominated the political and social structures. They did not want to pay taxes to aid poor families. Northampton assessed taxes on all households, rather than only on those with children, and used the funds to support a grammar school to prepare boys for college. Not until after 1800 did Northampton educate girls with public money. In contrast, the town of Sutton, Massachusetts, was diverse in terms of social leadership and religion at an early point in its history. Sutton paid for its schools by means of taxes on households with children only, thereby creating an active constituency in favor of universal education for both boys and girls. [23]

Historians note that reading and writing were different skills in the colonial era. Schools taught both, but in places without schools, writing was taught mainly to boys and a few privileged girls. Men handled worldly affairs and needed to both read and write. It was believed that girls needed only to read (especially religious materials). This educational disparity between reading and writing explains why the colonial women often could read, but could not write and could not sign their names—they used an "X". [24]

The education of elite women in Philadelphia after 1740 followed the British model developed by the gentry classes during the early 18th century. Rather than emphasizing ornamental aspects of women's roles, this new model encouraged women to engage in more substantive education, reaching into the classical arts and sciences to improve their reasoning skills. Education had the capacity to help colonial women secure their elite status by giving them traits that their 'inferiors' could not easily mimic. Fatherly (2004) examines British and American writings that influenced Philadelphia during the 1740s–1770s and the ways in which Philadelphia women gained education and demonstrated their status. [25]

Non-English schools

By 1664, when the territory was taken over by the English, most towns in the New Netherland colony had already set up elementary schools. The schools were closely related to the Dutch Reformed Church, and emphasized reading for religious instruction and prayer. The English closed the Dutch-language public schools; in some cases these were converted into private academies. The new English government showed little interest in public schools. [26]

German settlements from New York through Pennsylvania, Maryland and down to the Carolinas sponsored elementary schools closely tied to their churches, with each denomination or sect sponsoring its own schools. In the early colonial years, German immigrants were Protestant and the drive for education was related to teaching students to read Scripture. [27] [28]

Following waves of German Catholic immigration after the 1848 revolutions, and after the end of the Civil War, both Catholics and Missouri Synod Lutherans began to set up their own German-language parochial schools, especially in cities of heavy German immigration such as Cincinnati, St. Louis, Chicago and Milwaukee, as well as in rural areas heavily settled by Germans. [29] The Amish, a small religious sect speaking German, are opposed to schooling past the elementary level. They see it as unnecessary, as dangerous to the preservation of their faith, and as beyond the purview of government. [30] [31]

Spain had small settlements in Florida, the Southwest, and also controlled Louisiana. There is little evidence that they schooled any girls. Parish schools were administered by Jesuits or Franciscans and were limited to male students. [32]

Textbooks

In the 17th century, colonists imported schoolbooks from England. By 1690, Boston publishers were reprinting the English Protestant Tutor under the title of The New England Primer. The Primer was built on rote memorization. By simplifying Calvinist theology, the Primer enabled the Puritan child to define the limits of the self by relating his life to the authority of God and his parents. [33] [34] The Primer included additional material that made it widely popular in colonial schools until it was supplanted by Webster's work. The "blue backed speller" of Noah Webster was by far the most common textbook from the 1790s until 1836, when the McGuffey Readers appeared. Both series emphasized civic duty and morality, and sold tens of millions of copies nationwide. [35]

Webster's Speller was the pedagogical blueprint for American textbooks; it was so arranged that it could be easily taught to students, and it progressed by age. Webster believed students learned most readily when complex problems were broken into their component parts. Each pupil could master one part before moving to the next. Ellis argues that Webster anticipated some of the insights associated in the 20th century with Jean Piaget's theory of cognitive development. Webster said that children pass through distinctive learning phases in which they master increasingly complex or abstract tasks. He stressed that teachers should not try to teach a three-year-old how to read; wait until they are ready at age five. He planned the Speller accordingly, starting with the alphabet, then covering the different sounds of vowels and consonants, then syllables; simple words came next, followed by more complex words, then sentences. Webster's Speller was entirely secular. It ended with two pages of important dates in American history, beginning with Columbus' "discovery" in 1492 and ending with the Battle of Yorktown in 1781, by which the United States achieved independence. There was no mention of God, the Bible, or sacred events. As Ellis explains, "Webster began to construct a secular catechism to the nation-state. Here was the first appearance of 'civics' in American schoolbooks. In this sense, Webster's speller was the secular successor to The New England Primer with its explicitly biblical injunctions." [36]

Bynack (1984) examines Webster in relation to his commitment to the idea of a unified American national culture that would prevent the decline of republican virtues and national solidarity. Webster acquired his perspective on language from such German theorists as Johann David Michaelis and Johann Gottfried Herder. He believed with them that a nation's linguistic forms and the thoughts correlated with them shaped individuals' behavior. He intended the etymological clarification and reform of American English to improve citizens' manners and thereby preserve republican purity and social stability. Webster animated his Speller and Grammar by following these principles. [37]

Colonial colleges

Higher education was largely oriented toward training men as ministers before 1800. Doctors and lawyers were trained in local apprentice systems.

Religious denominations established most early colleges in order to train ministers. New England had a long emphasis on literacy in order that individuals could read the Bible. Harvard College was founded by the colonial legislature in 1636, and named after an early benefactor. Most of the funding came from the colony, but the college began to build an endowment from its early years. [38] Harvard at first focused on training young men for the ministry, but many alumni went into law, medicine, government or business. The college was a leader in bringing Newtonian science to the colonies. [39]

The College of William & Mary was founded by the Virginia government in 1693, with 20,000 acres (8,100 ha) of land for an endowment, and a penny tax on every pound of tobacco, together with an annual appropriation. It was closely associated with the established Anglican Church. James Blair, the leading Anglican minister in the colony, was president for 50 years. The college won the broad support of the Virginia planter class, most of whom were Anglicans. It hired the first law professor and trained many of the lawyers, politicians, and leading planters. [40] Students headed for the ministry were given free tuition.

Yale College was founded by Puritans in 1701, and in 1716 was relocated to New Haven, Connecticut. The conservative Puritan ministers of Connecticut had grown dissatisfied with the more liberal theology of Harvard, and wanted their own school to train orthodox ministers. However, President Thomas Clap (1740–1766) strengthened the curriculum in the natural sciences and made Yale a stronghold of revivalist New Light theology. [41]

New Side Presbyterians in 1747 set up the College of New Jersey in the town of Princeton; much later it was renamed Princeton University. Baptists established Rhode Island College in 1764, and in 1804 it was renamed Brown University in honor of a benefactor. Brown was especially liberal in welcoming young men from other denominations.

In New York City, the Anglicans set up Kings College in 1746, with its president Samuel Johnson the only teacher. It closed during the American Revolution and reopened in 1784 as an independent institution under the name of Columbia College; it is now Columbia University.

The Academy of Philadelphia was created in 1749 by Benjamin Franklin and other civic-minded leaders in Philadelphia. Unlike colleges in other cities, it was not oriented toward the training of ministers. It founded the first medical school in America in 1765, thereby becoming America's first university. The Pennsylvania state legislature conferred a new corporate charter upon the College of Philadelphia and renamed it the University of Pennsylvania in 1791. [42]

The Dutch Reformed Church in 1766 set up Queens College in New Jersey, which later became known as Rutgers University and gained state support. Dartmouth College, chartered in 1769 as a school for Native Americans, relocated to its present site in Hanover, New Hampshire, in 1770. [43] [44]

All of the schools were small, with a limited undergraduate curriculum oriented on the classical liberal arts. Students were drilled in Greek, Latin, geometry, ancient history, logic, ethics and rhetoric, with few discussions, little homework and no lab sessions. The college president typically tried to enforce strict discipline. The upperclassmen enjoyed hazing the freshmen. Many students were younger than 17, and most of the colleges also operated a preparatory school. There were no organized sports, or Greek-letter fraternities, but many of the schools had active literary societies. Tuition was very low and scholarships were few. [45]

The colonies had no schools of law. A few young American students studied at the prestigious Inns of Court in London. The majority of aspiring lawyers served apprenticeships with established American lawyers, or "read the law" to qualify for bar exams. [46] Law became very well established in the colonies, compared to medicine, which was in rudimentary condition. In the 18th century, 117 Americans had graduated in medicine in Edinburgh, Scotland, but most physicians learned as apprentices in the colonies. [47]

The trustees of the Academy of Philadelphia, later the University of Pennsylvania, established the first medical school in the colonies in 1765, becoming the first university in the colonies. [42] In New York, the medical department of King's College was established in 1767, and in 1770 it awarded the first American M.D. degree. [48]

After the Revolution, northern states especially emphasized education and rapidly established public schools. By the year 1870, all states had tax-subsidized elementary schools. [50] The US population had one of the highest literacy rates in the world at the time. [51] Private academies also flourished in the towns across the country, but rural areas (where most people lived) had few schools before the 1880s.

In 1821, Boston started the first public high school in the United States. By the close of the 19th century, public secondary schools began to outnumber private ones. [52] [53]

Over the years, Americans have been influenced by a number of European reformers among them Pestalozzi, Herbart, and Montessori. [52]

Republican motherhood

By the early 19th century, with the rise of the new United States, a new mood was alive in urban areas. Especially influential were the writings of Lydia Maria Child, Catharine Maria Sedgwick, and Lydia Sigourney, who developed the role of republican motherhood as a principle that united state and family by equating a successful republic with virtuous families. Women, as intimate and concerned observers of young children, were best suited to the role of guiding and teaching children. By the 1840s, New England writers such as Child, Sedgwick, and Sigourney became respected models and advocates for improving and expanding education for females. Greater educational access meant formerly male-only subjects, such as mathematics and philosophy, were to be integral to curricula at public and private schools for girls. By the late 19th century, these institutions were extending and reinforcing the tradition of women as educators and supervisors of American moral and ethical values. [54]

The ideal of Republican motherhood pervaded the entire nation, greatly enhancing the status of women and supporting girls' need for education. The relative emphasis on decorative arts and refinement of female instruction which had characterized the colonial era was replaced after 1776 by a program to support women in education for their major role in nation building, in order that they become good republican mothers of good republican youth. Fostered by community spirit and financial donations, private female academies were established in towns across the South as well as the North. [55]

Rich planters were particularly insistent on having their daughters schooled, since education often served as a substitute for dowry in marriage arrangements. The academies usually provided a rigorous and broad curriculum that stressed writing, penmanship, arithmetic, and languages, especially French. By 1840, the female academies succeeded in producing a cultivated, well-read female elite ready for their roles as wives and mothers in southern aristocratic society. [55]

Attendance

The 1840 census indicated that about 55% of the 3.68 million school-age children between the ages of five and fifteen attended primary schools or academies. Many families could not afford to pay for their children to go to school or to spare them from farm work. [56] Beginning in the late 1830s, more private academies were established for girls for education past primary school, especially in northern states. Some offered classical education similar to that offered to boys.

Data from the indentured servant contracts of German immigrant children in Pennsylvania from 1771–1817 show that the number of children receiving education increased from 33.3% in 1771–1773 to 69% in 1787–1804. Additionally, the same data showed that the ratio of school education versus home education rose from .25 in 1771–1773 to 1.68 in 1787–1804. [57] While some African Americans managed to achieve literacy, southern states largely prohibited schooling to blacks.

Teachers, early 1800s

Teaching young students was not an attractive career for educated people. [58] Adults became teachers without any particular skill. Hiring was handled by the local school board, which was mainly interested in the efficient use of limited taxes and favored young single women from local taxpaying families. This started to change with the introduction of two-year normal schools starting in 1823. Normal schools increasingly provided career paths for unmarried middle-class women. By 1900 most teachers of elementary schools in the northern states had been trained at normal schools. [53]

One-room schoolhouses

Given the high proportion of population in rural areas, with limited numbers of students, most communities relied on one-room school houses. Teachers would deal with the range of students of various ages and abilities by using the Monitorial System, an education method that became popular on a global scale during the early 19th century. This method was also known as "mutual instruction" or the "Bell-Lancaster method" after the British educators Dr Andrew Bell and Joseph Lancaster, who each independently developed it about 1798. As older children in families would teach younger ones, the abler pupils in these schools became 'helpers' to the teacher, and taught other students what they had learned. [59]

Mann reforms

Upon becoming the secretary of education of Massachusetts in 1837, Horace Mann (1796–1859) worked to create a statewide system of professional teachers, based on the Prussian model of "common schools." Prussia was attempting to develop a system of education by which all students were entitled to the same content in their public classes. Mann initially focused on elementary education and on training teachers. The common-school movement quickly gained strength across the North. Connecticut adopted a similar system in 1849, and Massachusetts passed a compulsory attendance law in 1852. [60] [61] Mann's crusading style attracted wide middle-class support. Historian Ellwood P. Cubberley asserts:

No one did more than he to establish in the minds of the American people the conception that education should be universal, non-sectarian, free, and that its aims should be social efficiency, civic virtue, and character, rather than mere learning or the advancement of sectarian ends. [62]

An important technique which Mann had learned in Prussia and introduced in Massachusetts in 1848 was to place students in grades by age. They were assigned by age to different grades and progressed through them, regardless of differences of aptitude. In addition, he used the lecture method common in European universities, which required students to receive instruction rather than take an active role in instructing one another. Previously, schools had often had groups of students who ranged in age from 6 to 14 years. With the introduction of age grading, multi-aged classrooms all but disappeared. [63] Some students progressed with their grade and completed all courses the secondary school had to offer. These were "graduated," and were awarded a certificate of completion. This was increasingly done at a ceremony imitating college graduation rituals.

Arguing that universal public education was the best way to turn the nation's unruly children into disciplined, judicious republican citizens, Mann won widespread approval for building public schools from modernizers, especially among fellow Whigs. Most states adopted one version or another of the system he established in Massachusetts, especially the program for "normal schools" to train professional teachers. [64] This quickly developed into a widespread form of school which later became known as the factory model school.

Free schooling was available through some of the elementary grades. Graduates of these schools could read and write, though not always with great precision. Mary Chesnut, a Southern diarist, mocked the North's system of free education in her journal entry of June 3, 1862, deriding misspelled words in the captured letters of Union soldiers. [65]

Compulsory laws

By 1900, 34 states had compulsory schooling laws; four were in the South. Thirty states with compulsory schooling laws required attendance until age 14 (or higher). [66] As a result, by 1910, 72 percent of American children attended school. Half the nation's children attended one-room schools. By 1930, every state required students to complete elementary school. [67]

Religion and schools

As the majority of the nation was Protestant in the 19th century, most states passed constitutional amendments, called Blaine Amendments, forbidding tax money from being used to fund parochial schools. This was largely directed against Catholics, as the heavy immigration from Catholic Ireland after the 1840s aroused nativist sentiment. There were longstanding tensions between Catholic and Protestant believers, long associated with nation states that had established religions. Many Protestants believed that Catholic children should be educated in public schools in order to become American. By 1890 the Irish, who as the first major Catholic immigrant group controlled the Church hierarchy in the U.S., had built an extensive network of parishes and parish schools ("parochial schools") across the urban Northeast and Midwest. The Irish and other Catholic ethnic groups intended parochial schools not only to protect their religion, but to enhance their culture and language. [68] [69]

Catholics and German Lutherans, as well as Dutch Protestants, organized and funded their own elementary schools. Catholic communities also raised money to build colleges and seminaries to train teachers and religious leaders to head their churches. [70] [71] In the 19th century, most Catholics were Irish or German immigrants and their children; in the 1890s new waves of Catholic immigrants began arriving from Italy and Poland. The parochial schools met some opposition, as in the Bennett Law in Wisconsin in 1890, but they thrived and grew. Catholic nuns served as teachers in most schools and were paid low salaries in keeping with their vows of poverty. [72] In 1925 the U.S. Supreme Court ruled in Pierce v. Society of Sisters that students could attend private schools to comply with state compulsory education laws, thus giving parochial schools an official blessing. [73]

Schools for Black students

In the early days of the Reconstruction era, the Freedmen's Bureau opened 1000 schools across the South for black children. This was essentially building on schools that had been established in numerous large contraband camps. Freedmen were eager for schooling for both adults and children, and the enrollments were high and enthusiastic. Overall, the Bureau spent $5 million to set up schools for blacks. By the end of 1865, more than 90,000 freedmen were enrolled as students in these schools. The school curriculum resembled that of schools in the North. [74]

Many Bureau teachers were well-educated Yankee women motivated by religion and abolitionism. Half the teachers were southern whites; one-third were blacks, and one-sixth were northern whites. [75] Most were women, but among African Americans, male teachers slightly outnumbered female teachers. In the South, people were attracted to teaching because of the good salaries, at a time when society was disrupted and the economy was poor. Northern teachers were typically funded by northern organizations and were motivated by humanitarian goals to help the freedmen. As a group, only the black cohort showed a commitment to racial equality; they were also the ones most likely to continue as teachers. [76]

When the Republicans came to power in the Southern states after 1867, they created the first system of taxpayer-funded public schools. Southern Blacks wanted public schools for their children but they did not demand racially integrated schools. Almost all the new public schools were segregated, apart from a few in New Orleans. After the Republicans lost power in the mid-1870s, conservative whites retained the public school systems but sharply cut their funding. [77]

Almost all private academies and colleges in the South were strictly segregated by race. [78] The American Missionary Association supported the development and establishment of several historically black colleges, such as Fisk University and Shaw University. In this period, a handful of northern colleges accepted black students. Northern denominations and their missionary associations especially established private schools across the South to provide secondary education. They provided a small amount of collegiate work. Tuition was minimal, so churches supported the colleges financially, and also subsidized the pay of some teachers. In 1900, churches—mostly based in the North—operated 247 schools for blacks across the South, with a budget of about $1 million. They employed 1600 teachers and taught 46,000 students. [79] [80] Prominent schools included Howard University, a federal institution based in Washington; Fisk University in Nashville; Atlanta University; Hampton Institute in Virginia; and many others. Most new colleges in the 19th century were founded in northern states.

In 1890, Congress expanded the land-grant program to include federal support for state-sponsored colleges across the South. It required states to identify colleges for black students as well as white ones in order to get land grant support.

Hampton Normal and Agricultural Institute was of national importance because it set the standards for what was called industrial education. [81] Of even greater influence was Tuskegee Normal School for Colored Teachers, led from 1881 by Hampton alumnus Booker T. Washington. In 1900 few black students were enrolled in college-level work; their schools had very weak faculties and facilities. The alumni of Keithley became high school teachers. [82]

While the colleges and academies were generally coeducational, until the late 20th century, historians had taken little notice of the role of women as students and teachers. [83]

Native American Missionary Schools

As religious revivalism swept through the United States in the early 1800s, a growing group of evangelical Christians took on the role of missionaries. These missionaries were, in many cases, concerned with converting non-Christians to Christianity. Native Americans were a nearby and easy target for these missionaries. According to the scholars Theda Perdue and Michael D. Green, these Christian missionaries believed that the Native Americans were uncivilized, and were in need of help from the missionaries to make them more civilized and more like Anglo-Americans. [84]

Missionaries found great difficulty converting adults, but, according to Perdue and Green's research, they found it much easier to convert Native American children. To do so, missionaries often separated Native American children from their families to live at boarding schools where the missionaries believed they could civilize and convert them. [84] Missionary schools in the American Southeast were first developed in 1817. [85] Perdue and Green's research has shown that these children not only learned the basic subjects of education that most American children experienced, but also were taught to live and act like Anglo-Americans. Boys learned to farm and girls were taught domestic labor; according to Perdue and Green, they were also taught that Anglo-American civilization was superior to the traditional Native American cultures that these children came from. [84] David Brown, a Cherokee man who converted to Christianity and promoted the conversion to Christianity of Native Americans, went on a fundraising speaking tour to raise money for missionary societies and their boarding schools. Brown, in his speech, described the progress that he believed had been made in civilizing Native American children in missionary schools. "The Indians," he claimed, "are making rapid advances toward the standard of morality, virtue and religions." [86]

The responsibility for missionary work fell on the missionaries themselves for the most part. While the U.S. government provided some funding for missionary work, such as Native American Missionary Schools, the missionaries themselves were primarily responsible for running these schools. [84] The scholar Kyle Massey Stephens argues that the federal government acted in a supporting role in assimilation programs like these mission schools. President James Monroe, though, wanted the United States to increase funding and assistance with private mission schools in their efforts to educate Native American children. According to Stephens's work, the first missionary schools from 1817 were funded completely by private donors. In 1819, this changed when Congress appropriated an annual sum of $10,000 to be given to missionary societies in addition to their private fundraising. The United States Secretary of War at the time, John C. Calhoun, advocated for these funds to be used towards educating Native American children in Anglo-American culture with courses on farming and mechanics for boys, and domestic labor for girls. [85] The Bureau of Indian Affairs, which was founded in 1824 to handle issues related to Native Americans, had sanctioned thirty-two missionary schools in Native American communities in its first year of existence. In these schools, 916 Native American children were enrolled. [87]

Influence of colleges in 19th century

Summarizing the research of Burke and Hall, Katz concludes that in the 19th century: [88]

  1. The nation's many small colleges helped young men make the transition from rural farms to complex urban occupations.
  2. These colleges especially promoted upward mobility by preparing ministers, and thereby provided towns across the country with a core of community leaders.
  3. The more elite colleges became increasingly exclusive and contributed relatively little to upward social mobility. By concentrating on the offspring of wealthy families, ministers and a few others, the elite Eastern colleges, especially Harvard, played an important role in the formation of a Northeastern elite with great power.

Progressive Era

The progressive era in education was part of a larger Progressive Movement, extending from the 1890s to the 1930s. The era was notable for a dramatic expansion in the number of schools and students served, especially in the fast-growing metropolitan cities. After 1910, smaller cities also began building high schools. By 1940, 50% of young adults had earned a high school diploma. [53]

Radical historians in the 1960s, steeped in the anti-bureaucratic ethos of the New Left, deplored the emergence of bureaucratic school systems. They argued that their purpose was to suppress the upward aspirations of the working class. [89] But other historians have emphasized the necessity of building non-politicized standardized systems. The reforms in St. Louis, according to historian Selwyn Troen, were "born of necessity as educators first confronted the problems of managing rapidly expanding and increasingly complex institutions." Troen found that the bureaucratic solution removed schools from the bitterness and spite of ward politics. Troen argues:

In the space of only a generation, public education had left behind a highly regimented and politicized system dedicated to training children in the basic skills of literacy and the special discipline required of urban citizens, and had replaced it with a largely apolitical, more highly organized and efficient structure specifically designed to teach students the many specialized skills demanded in a modern, industrial society. In terms of programs this entailed the introduction of vocational instruction, a doubling of the period of schooling, and a broader concern for the welfare of urban youth. [90]

The social elite in many cities led the reform movement in the 1890s. Their goal was to permanently end the political parties' control of the local schools, which had been exploited for patronage jobs and construction contracts through the ward politics that absorbed and taught the millions of new immigrants. In New York City, the elite led the progressive reforms. Reformers installed a bureaucratic system run by experts and demanded expertise from prospective teachers. The reforms opened the way for hiring more Irish Catholic and Jewish teachers, who proved adept at handling the civil service tests and gaining the necessary academic credentials. Before the reforms, schools had often been used to provide patronage jobs for party foot soldiers. The new emphasis concentrated on broadening opportunities for students. New programs were established for the physically handicapped; evening recreation centers were set up; vocational schools were opened; medical inspections became routine; programs began to teach English as a second language; and school libraries were opened. [91] New teaching strategies were also developed, such as shifting the focus of secondary education towards speaking and writing, as outlined by the Hosic Report in 1917. [92]

Dewey and progressive education

The leading educational theorist of the era was John Dewey (1859–1952), a philosophy professor at the University of Chicago (1894–1904) and at Teachers College (1904–1930) of Columbia University in New York City. [93] Dewey was a leading proponent of "Progressive Education" and wrote many books and articles to promote the central role of democracy in education. [94] He believed that schools were not only a place for students to gain content knowledge but also a place for them to learn how to live. The purpose of education was thus to realize the student's full potential and the ability to use those skills for the greater good.

Dewey noted that, "to prepare him for the future life means to give him command of himself; it means so to train him that he will have the full and ready use of all his capacities." Dewey insisted that education and schooling are instrumental in creating social change and reform. He noted that "education is a regulation of the process of coming to share in the social consciousness; and that the adjustment of individual activity on the basis of this social consciousness is the only sure method of social reconstruction." [95] Although Dewey's ideas were very widely discussed, they were implemented chiefly in small experimental schools attached to colleges of education. In the public schools, Dewey and the other progressive theorists encountered a highly bureaucratic system of school administration that was typically not receptive to new methods. [96]

Dewey viewed public schools with disdain, regarding them as undemocratic and closed-minded. Laboratory schools, such as the University of Chicago Laboratory Schools, were much more open to original thought and experimentation. Dewey was not only involved with laboratory schools but was also deeply engaged with the emerging philosophy of pragmatism, which he incorporated within his laboratory schools. Dewey viewed pragmatism as critical for the growth of democracy, which he saw not just as a form of government but as something that occurred within the workings of the laboratory schools as well as in everyday life. Dewey used the laboratory schools as an experimental platform for his theories on pragmatism, democracy, and how humans learn. [97]

Black education

Booker T. Washington was the dominant black political and educational leader in the United States from the 1890s until his death in 1915. Washington not only led his own college, Tuskegee Institute in Alabama, but his advice, political support, and financial connections proved important to many other black colleges and high schools, which were primarily located in the South. This was the center of the black population until after the Great Migration of the first half of the 20th century. Washington was a respected advisor to major philanthropies, such as the Rockefeller, Rosenwald and Jeanes foundations, which provided funding for leading black schools and colleges. The Rosenwald Foundation provided matching funds for the construction of schools for rural black students in the South. Washington explained, "We need not only the industrial school, but the college and professional school as well, for a people so largely segregated, as we are.... Our teachers, ministers, lawyers and doctors will prosper just in proportion as they have about them an intelligent and skillful producing class." [98] Washington strongly supported the progressive reforms advocated by Dewey, emphasizing scientific, industrial and agricultural education that produced a base for lifelong learning and enabled careers for many black teachers, professionals, and upwardly mobile workers. He tried to work within the segregated Jim Crow system and did not support political protests against it. [99] At the same time, Washington used his network to provide important funding to support numerous legal challenges by the NAACP against the systems of disenfranchisement which southern legislatures had passed at the turn of the century, effectively excluding blacks from politics for decades into the 1960s.

Atlanta

In most American cities, Progressives in the Efficiency Movement looked for ways to eliminate waste and corruption. They emphasized the use of experts in schools. For example, in the 1897 reform of the Atlanta schools, the school board was reduced in size, eliminating the power of ward bosses. The members of the school board were elected at-large, reducing the influence of various interest groups. The power of the superintendent was increased. Centralized purchasing allowed for economies of scale, although it also added opportunities for censorship and suppression of dissent. Standards for hiring and tenure of teachers were made uniform. Architects designed school buildings in which the classrooms, offices, workshops and other facilities related together. Curricular innovations were introduced. The reforms were designed to produce a school system for white students according to the best practices of the day. Middle-class professionals instituted these reforms; they were equally antagonistic to the traditional business elites and to working-class elements. [100]

Gary plan

The "Gary plan" was implemented in the new industrial "steel" city of Gary, Indiana, by William Wirt, the superintendent who served from 1907–30. Although the U.S. Steel Corporation dominated the Gary economy and paid abundant taxes, it did not shape Wirt's educational reforms. The Gary Plan emphasized highly efficient use of buildings and other facilities. This model was adopted by more than 200 cities around the country, including New York City. Wirt divided students into two platoons—one platoon used the academic classrooms, while the second platoon was divided among the shops, nature studies, auditorium, gymnasium, and outdoor facilities. Then the platoons rotated position.

Wirt set up an elaborate night school program, especially to Americanize new immigrants. The introduction of vocational education programs, such as wood shop, machine shop, typing, and secretarial skills, proved especially popular with parents who wanted their children to become foremen and office workers. By the Great Depression, most cities found the Gary plan too expensive and abandoned it. [101]

Great Depression and New Deal: 1929-39

Public schools across the country were badly hurt by the Great Depression, as tax revenues fell and local and state governments shifted funding to relief projects. Budgets were slashed, and teachers went unpaid. During the New Deal, 1933–39, President Franklin Roosevelt and his advisers were hostile to the elitism shown by the educational establishment. They refused all pleas for direct federal help to public or private schools or universities. They rejected proposals for federal funding for research at universities. But they did help poor students, and the major New Deal relief programs built many school buildings as requested by local governments. The New Deal approach to education was a radical departure from educational best practices. It was specifically designed for the poor and staffed largely by women on relief. It was not based on professionalism, nor was it designed by experts. Instead it was premised on the anti-elitist notion that a good teacher does not need paper credentials, that learning does not need a formal classroom, and that the highest priority should go to the bottom tier of society. Leaders in the public schools were shocked: they were shut out as consultants and as recipients of New Deal funding. They desperately needed cash to cover the local and state revenues that had disappeared during the depression; they were well organized and made repeated concerted efforts in 1934, 1937, and 1939, all to no avail. The conservative Republican establishment that school leaders had collaborated with for so long was out of power, and Roosevelt himself was the leader in anti-elitism. The federal government had a highly professional Office of Education; Roosevelt cut its budget and staff and refused to consult with its leader, John Ward Studebaker. [102] The Civilian Conservation Corps (CCC) programs were deliberately designed not to teach skills that would put enrollees in competition with unemployed union members. The CCC did have its own classes; they were voluntary, took place after work, and focused on teaching basic literacy to young men who had quit school before high school. [103]

The relief programs did offer indirect help. The Civil Works Administration (CWA) and the Federal Emergency Relief Administration (FERA) focused on hiring unemployed people on relief and putting them to work on public buildings, including public schools. Together they built or upgraded 40,000 schools, plus thousands of playgrounds and athletic fields; gave jobs to 50,000 teachers to keep rural schools open and to teach adult education classes in the cities; and gave temporary jobs to unemployed teachers in cities like Boston. [104] [105] Although the New Deal refused to give money to impoverished school districts, it did give money to impoverished high school and college students. The CWA used "work study" programs to fund students, both male and female. [106]

The National Youth Administration (NYA), a semi-autonomous branch of the Works Progress Administration (WPA) under Aubrey Williams, developed apprenticeship programs and residential camps specializing in teaching vocational skills. It was one of the first agencies to set up a "Division of Negro Affairs" and make an explicit effort to enroll black students. Williams believed that the traditional high school curricula had failed to meet the needs of the poorest youth. In opposition, the well-established National Education Association (NEA) saw the NYA as a dangerous challenge to local control of education. The NYA expanded work-study money to reach up to 500,000 students per month in high schools, colleges, and graduate schools. The average pay was $15 a month. [107] [108] However, in line with the anti-elitist policy, the NYA set up its own high schools, entirely separate from the public school system or academic schools of education. [109] [110] Despite appeals from Ickes and Eleanor Roosevelt, Howard University, the federally operated school for blacks, saw its budget cut below Hoover administration levels. [111]

Secondary schools

In 1880, American high schools were primarily considered to be preparatory academies for students who were going to attend college. But by 1910 they had been transformed into core elements of the common school system and had broader goals of preparing many students for work after high school. The explosive growth brought the number of students from 200,000 in 1890 to 1,000,000 in 1910 and to almost 2,000,000 by 1920; 7% of youths aged 14 to 17 were enrolled in 1890, rising to 32% in 1920. The graduates found jobs especially in the rapidly growing white-collar sector. Cities large and small across the country raced to build new high schools. Few were built in rural areas, so ambitious parents moved close to town to enable their teenagers to attend high school. After 1910, vocational education was added, as a mechanism to train the technicians and skilled workers needed by the booming industrial sector. [112] [113]

In the 1880s the high schools started developing as community centers. They added sports and by the 1920s were building gymnasiums that attracted large local crowds to basketball and other games, especially in small town schools that served nearby rural areas. [114]

College preparation

In the 1865–1914 era, the number and character of schools changed to meet the demands of new and larger cities and of new immigrants. They had to adjust to the new spirit of reform permeating the country. High schools increased in number and adjusted their curricula to prepare students for the growing state and private universities; education at all levels began to offer more utilitarian studies in place of an emphasis on the classics. John Dewey and other Progressives advocated changes from their base in teachers' colleges. [115]

Before 1920 most secondary education, whether private or public, emphasized college entry for a select few. Proficiency in Greek and Latin was emphasized. Abraham Flexner, under commission from the philanthropic General Education Board (GEB), wrote A Modern School (1916), calling for a de-emphasis on the classics. The classics teachers fought back in a losing effort. [116]

Prior to World War I, German was the preferred second spoken language for study. The Prussian and German educational systems had served as a model for many communities in the United States, and German's intellectual standing was highly respected. Because Germany was an enemy of the US during the war, an anti-German attitude arose in the United States. French, the international language of diplomacy, was promoted as the preferred second language instead. French survived as the second language of choice until the 1960s, when Spanish became popular. [117] This reflected a strong increase in the Spanish-speaking population in the United States, which has continued since the late 20th century.

The growth of human capital

By 1900, educators argued that the post-literacy schooling of the masses at the secondary and higher levels would improve citizenship, develop higher-order traits, and produce the managerial and professional leadership needed for rapid economic modernization. The commitment to expanded education past age 14 set the U.S. apart from Europe for much of the 20th century. [53]

From 1910 to 1940, high schools grew in number and size, reaching out to a broader clientele. In 1910, for example, 9% of Americans had a high school diploma; by 1935, the rate was 40%. [118] By 1940, the number had increased to 50%. [119] This phenomenon was uniquely American; no other nation attempted such widespread coverage. The fastest growth came in states with greater wealth, more homogeneity of wealth, and less manufacturing activity than others. The high schools provided necessary skill sets for youth planning to teach school, and essential skills for those planning careers in white-collar work and some high-paying blue-collar jobs. Claudia Goldin argues this rapid growth was facilitated by public funding, openness, gender neutrality, local (and also state) control, separation of church and state, and an academic curriculum. The wealthiest European nations, such as Germany and Britain, had far more exclusive education systems; few youth attended school past age 14. Apart from technical training schools, European secondary schooling was dominated by children of the wealthy and the social elites. [120]

American post-elementary schooling was designed to be consistent with national needs. It stressed general and widely applicable skills not tied to particular occupations or geographic areas, in order that students would have flexible employment options. As the economy was dynamic, the emphasis was on portable skills that could be used in a variety of occupations, industries, and regions. [121]

Public schools were funded and supervised by independent districts that depended on taxpayer support. In dramatic contrast to the centralized systems in Europe, where national agencies made the major decisions, the American districts designed their own rules and curricula. [122]

Teachers and administrators

Early public school superintendents emphasized discipline and rote learning, and school principals made sure the mandate was imposed on teachers. Disruptive students were expelled. [123]

Support for the high school movement occurred at the grass-roots level of local cities and school systems. After 1916, the federal government began to provide vocational education funding as part of support for raising readiness to work in industrial and artisan jobs. In these years, states and religious bodies generally funded teacher training colleges, often called "normal schools". Gradually these developed full four-year curricula and became state colleges after 1945.

Teachers organized themselves during the 1920s and 1930s. In 1917, the National Education Association (NEA) was reorganized to better mobilize and represent teachers and educational staff. Membership grew steadily under the chairmanship of James Crabtree, from 8,466 members in 1917 to 220,149 in 1931. The rival American Federation of Teachers (AFT) was based in large cities and formed alliances with local labor unions. The NEA identified as an upper-middle-class professional organization, while the AFT identified with the working class and the union movement. [124] [125]

Higher education

At the beginning of the 20th century, fewer than 1,000 colleges, enrolling 160,000 students, existed in the United States. Explosive growth in the number of colleges occurred at the end of the 19th and early 20th centuries, supported in part by Congress's land grant programs. Philanthropists endowed many of these institutions. Wealthy donors established Johns Hopkins University, Stanford University, Carnegie Mellon University, Vanderbilt University, and Duke University; John D. Rockefeller funded the University of Chicago without imposing his name on it. [126]

Land Grant universities

Each state used federal funding from the Morrill Land-Grant Colleges Acts of 1862 and 1890 to set up "land grant colleges" that specialized in agriculture and engineering. The 1890 act required states that practiced segregation also to provide all-black land grant colleges, which were dedicated primarily to teacher training. These colleges contributed to rural development, including the establishment of a traveling school program by Tuskegee Institute in 1906. Rural conferences sponsored by Tuskegee also attempted to improve the life of rural blacks. Since the late 20th century, many of the schools established under the 1890 act have helped train students from less-developed countries to return home with the skills and knowledge to improve agricultural production. [127]

Iowa State University was the first existing school whose state legislature officially accepted the provisions of the Morrill Act on September 11, 1862. [128] Other universities soon followed, such as Purdue University, Michigan State University, Kansas State University, Cornell University (in New York), Texas A&M University, Pennsylvania State University, The Ohio State University, and the University of California. Few alumni became farmers, but they did play an increasingly important role in the larger food industry, especially after the federal extension system was set up in 1916 that put trained agronomists in every agricultural county.

Engineering graduates played a major role in rapid technological development. [129] The land-grant college system produced the agricultural scientists and industrial engineers who constituted the critical human resources of the managerial revolution in government and business, 1862–1917, laying the foundation of the world's pre-eminent educational infrastructure that supported the world's foremost technology-based economy. [130]

Pennsylvania State University was representative. The Farmers' High School of Pennsylvania (later the Agricultural College of Pennsylvania and then Pennsylvania State University), chartered in 1855, was intended to uphold declining agrarian values and show farmers ways to prosper through more productive farming. Students were to build character and meet part of their expenses by performing agricultural labor. By 1875 the compulsory labor requirement was dropped, but male students were required to have an hour a day of military training in order to meet the requirements of the Morrill Land Grant College Act. In the early years, the agricultural curriculum was not well developed, and politicians in the state capital of Harrisburg often considered the land-grant college a costly and useless experiment. The college was a center of middle-class values that served to help young people on their journey to white-collar occupations. [131]

GI Bill

Rejecting liberal calls for large-scale aid to education, Congress in 1944, during World War II, passed the conservative program of aid limited to veterans who had served in wartime. Daniel Brumberg and Farideh Farhi state, "The expansive and generous postwar education benefits of the GI Bill were due not to Roosevelt's progressive vision but to the conservative American Legion." [132] [133] The GI Bill made college education possible for millions by paying tuition and living expenses. The government provided between $800 and $1,400 each year to these veterans as a subsidy to attend college, which covered 50–80% of total costs, counting foregone earnings as well as tuition; this allowed them enough funds for life outside of school. The GI Bill helped create a widespread belief in the necessity of college education. It opened up higher education to ambitious young men who would otherwise have been forced to enter the job market immediately after being discharged from the military. Comparisons of college attendance rates during this period found veterans to be 10% more likely to go to college than non-veterans.

In the early decades after the bill was passed, most campuses became largely male thanks to the GI Bill, since only 2% of wartime veterans were women. But by 2000, female veterans had grown in numbers and began passing men in rates of college and graduate school attendance. [134]

Great Society

When liberals regained control of Congress in 1964, they passed numerous Great Society programs supported by President Lyndon B. Johnson to expand federal support for education. The Higher Education Act of 1965 set up federal scholarships and low-interest loans for college students, and subsidized better academic libraries, ten to twenty new graduate centers, several new technical institutes, classrooms for several hundred thousand students, and twenty-five to thirty new community colleges a year. A separate education bill enacted that same year provided similar assistance to dental and medical schools. On an even larger scale, the Elementary and Secondary Education Act of 1965 began pumping federal money into local school districts. [135]

Segregation and integration

For much of its history, education in the United States was segregated by race, and for some groups not available at all. Early integrated schools such as the Noyes Academy, founded in 1835 in Canaan, New Hampshire, were generally met with fierce local opposition. For the most part, African Americans received little to no formal education before the Civil War. Some free blacks in the North managed to become literate.

In the South, where slavery was legal, many states had laws prohibiting teaching enslaved African Americans to read or write. A few taught themselves; others learned from white playmates or more generous masters; but most were not able to learn to read and write. Schools for free people of color were privately run and supported, as were most of the limited schools for white children. Poor white children did not attend school. The wealthier planters hired tutors for their children and sent them to private academies and colleges at the appropriate age.

During Reconstruction a coalition of freedmen and white Republicans in Southern state legislatures passed laws establishing public education. The Freedmen's Bureau was created as an agency of the military governments that managed Reconstruction. It set up schools in many areas and tried to help educate and protect freedmen during the transition after the war. With the notable exception of the desegregated public schools in New Orleans, the schools were segregated by race. By 1900 more than 30,000 black teachers had been trained and put to work in the South, and the literacy rate had climbed to more than 50%, a major achievement in little more than a generation. [136]

Many colleges were set up for blacks: some were state schools, like Booker T. Washington's Tuskegee Institute in Alabama; others were private ones subsidized by Northern missionary societies.

Although the African-American community quickly began litigation to challenge such provisions, in the 19th century Supreme Court challenges generally were not decided in their favor. The Supreme Court case of Plessy v. Ferguson (1896) upheld the segregation of races in schools as long as each race enjoyed parity in quality of education (the "separate but equal" principle). However, few black students received equal education. They suffered for decades from inadequate funding, outmoded or dilapidated facilities, and deficient textbooks (often ones previously used in white schools).

Starting in 1914 and going into the 1930s, Julius Rosenwald, a philanthropist from Chicago, established the Rosenwald Fund to provide seed money, matching local contributions and stimulating the construction of new schools for African American children, mostly in the rural South. He worked in association with Booker T. Washington and architects at Tuskegee Institute to create model plans for schools and teacher housing. With the requirement that money had to be raised by both blacks and whites, and that schools be approved by local school boards (controlled by whites), Rosenwald stimulated the construction of more than 5,000 schools across the South. In addition to Northern philanthropists and state taxes, African Americans went to extraordinary efforts to raise money for such schools. [137]

The Civil Rights Movement during the 1950s and 1960s helped publicize the inequities of segregation. In 1954, the Supreme Court in Brown v. Board of Education unanimously declared that separate facilities were inherently unequal and unconstitutional. By the 1970s segregated districts had practically vanished in the South.

Integration of schools has been a protracted process, however, with results affected by vast population migrations in many areas, by suburban sprawl, by the disappearance of industrial jobs, and by the movement of jobs out of former industrial cities of the North and Midwest into new areas of the South. Although required by court order, integrating the first black students in the South met with intense opposition. In 1957 the integration of Central High School in Little Rock, Arkansas, had to be enforced by federal troops; President Dwight D. Eisenhower took control of the National Guard after the governor tried to use it to prevent integration. Throughout the 1960s and 1970s, integration continued with varying degrees of difficulty. Some states and cities tried to overcome de facto segregation, a result of housing patterns, by using forced busing. This method of integrating student populations provoked resistance in many places, including northern cities, where parents wanted children educated in neighborhood schools.

Although full equality and parity in education have yet to be achieved (many school districts are technically still under the integration mandates of local courts), technical equality in education had been achieved by 1970. [138]

The federal government's integration efforts began to wane in the mid-1970s, and the Reagan and Bush Sr. administrations later launched several attacks against desegregation orders. As a result, school integration peaked in the 1980s and has been gradually declining ever since.

Education after 1945

In mid-20th-century America, there was intense interest in using institutions to support the innate creativity of children. It helped reshape children's play, the design of suburban homes, schools, parks, and museums. [139] Producers of children's television programming worked to spark creativity. Educational toys proliferated that were designed to teach skills or develop abilities. For schools there was a new emphasis on arts as well as science in the curriculum. School buildings were no longer monumental testimonies to urban wealth; they were redesigned with the students in mind. [140]

The emphasis on creativity was reversed in the 1980s, as public policy came to emphasize test scores. School principals were forced to downplay art, drama, music, history, and anything else that was not scored on standardized tests, lest their school be labelled "failing" by the quantifiers behind the "No Child Left Behind" Act. [141] [142]

Inequality

The Coleman Report, by sociology professor James Coleman, proved especially controversial in 1966. Based on massive statistical data, the 1966 report titled "Equality of Educational Opportunity" fueled a debate about "school effects" that has continued since. [143] The report was widely seen as evidence that school funding has little effect on students' final achievement. A more precise reading of the Coleman Report is that student background and socioeconomic status are much more important in determining educational outcomes than are measured differences in school resources (i.e., per-pupil spending). Coleman found that, on average, black schools were funded on a nearly equal basis by the 1960s, and that black students benefited from racially mixed classrooms. [144] [145]

The comparative quality of education among rich and poor districts is still often a subject of dispute. While middle-class African-American children have made good progress, poor minorities have struggled. With school systems based on property taxes, there are wide disparities in funding between wealthy suburbs or districts and often-poor inner-city areas or small towns. "De facto segregation" has been difficult to overcome as residential neighborhoods have remained more segregated than workplaces or public facilities. Racial segregation has not been the only factor in inequities. Residents in New Hampshire challenged property-tax funding because of steep contrasts between education funds in wealthy and poorer areas. They filed lawsuits seeking a system to provide more equal funding of school systems across the state.

Some scholars believe that the transformation of the Pell Grant program to a loan program in the early 1980s caused the gap between the growth rates of white, Asian-American, and African-American college graduates to widen since the 1970s. [146] Others believe the issue is increasingly related more to class and family capacity than to ethnicity. Some school systems have used economic status rather than ethnicity to identify populations in need of supplemental help.

Special education

In 1975 Congress passed Public Law 94-142, the Education for All Handicapped Children Act. One of the most comprehensive laws in the history of education in the United States, this Act brought together several pieces of state and federal legislation, making free, appropriate education available to all eligible students with a disability. [147] The law was amended in 1986 to extend its coverage to younger children. In 1990 the Individuals with Disabilities Education Act (IDEA) extended its definitions and changed the label "handicap" to "disabilities". Further procedural changes were made to IDEA in 1997. [148]

Reform efforts in the 1980s

In 1983, the National Commission on Excellence in Education released a report titled A Nation at Risk. Soon afterward, conservatives were calling for increased academic rigor, including more school days per year, longer school days, and higher testing standards. The English professor E. D. Hirsch made an influential attack on progressive education, advocating an emphasis on "cultural literacy": the facts, phrases, and texts that Hirsch asserted are essential for decoding basic texts and maintaining communication. Hirsch's ideas remain influential in conservative circles into the 21st century. They have been controversial because, as Edwards argues:

Opponents from the political left generally accuse Hirsch of elitism. Worse yet in their minds, Hirsch's assertion might lead to a rejection of toleration, pluralism, and relativism. On the political right, Hirsch has been assailed as totalitarian, for his idea lends itself to turning over curriculum selection to federal authorities and thereby eliminating the time-honored American tradition of locally controlled schools. [149]

By 1990, the United States spent 2 percent of its budget on education, compared with 30 percent on support for the elderly. [150]

Current Trends

As of the 2017-18 academic year, there were approximately 4,014,800 K-12 teachers in the United States (3,300,000 traditional public school teachers; 205,600 teachers in public charter schools; and 509,200 private school teachers). [151]

Policy since 2000

"No Child Left Behind" was a major national law passed by a bipartisan coalition in Congress in 2002, marked a new direction. In exchange for more federal aid, the states were required to measure progress and punish schools that were not meeting the goals as measured by standardized state exams in math and language skills. [152] [153] [154] By 2012, half the states were given waivers because the original goal that 100% students by 2014 be deemed "proficient" had proven unrealistic. [155]

By 2012, 45 states had dropped the requirement to teach cursive writing from the curriculum. Few schools start the school day by singing the national anthem, as was once done. Few schools have mandatory recess for children. Educators are trying to reinstate recess. Few schools have mandatory arts class. Continuing reports of a student's progress can be found online, supplementing the former method of periodic report cards. [156]

By 2015, criticisms from across the political spectrum had accumulated to the point that a bipartisan Congress stripped away all the national features of No Child Left Behind, turning the remnants over to the states. [157]

Beginning in the 1980s, government, educators, and major employers issued a series of reports identifying key skills and implementation strategies to steer students and workers towards meeting the demands of the changing and increasingly digital workplace and society. 21st-century skills are a series of higher-order skills, abilities, and learning dispositions that educators, business leaders, academics, and governmental agencies have identified as required for success in 21st-century society and workplaces. Many of these skills are associated with deeper learning, including analytic reasoning, complex problem solving, and teamwork, in contrast to traditional knowledge-based academic skills. [158] [159] [160] Many schools and school districts are adjusting learning environments, curricula, and learning spaces to include and support more active learning (such as experiential learning) to foster deeper learning and the development of 21st-century skills.

For much of the 20th century, the dominant historiography, as exemplified by Ellwood Patterson Cubberley (1868–1941) at Stanford, emphasized the rise of American education as a powerful force for literacy, democracy, and equal opportunity, and a firm basis for higher education and advanced research institutions. Cubberley argued that the foundations of the modern education system were influenced by processes of democratization in Europe and the United States. It was a story of enlightenment and modernization triumphing over ignorance, cost-cutting, and the narrow traditionalism whereby parents tried to block their children's intellectual access to the wider world. Teachers dedicated to the public interest, reformers with a wide vision, and public support from the civic-minded community were the heroes. The textbooks helped inspire students to become public school teachers and thereby fulfill their own civic mission. [161] [162]

New evidence from historical education trends challenges Cubberley’s assertion that the spread of democracy led to the expansion of public primary education. While the U.S. was one of the world leaders in the provision of primary education during the late-19th century, so was Prussia, an absolutist regime. Democratization appears to have no effect on levels of access to primary education around the world, based on an analysis of historical student enrollment rates for 109 countries from 1820 to 2010. [163]

The crisis came in the 1960s, when a new generation of New Left scholars and students rejected the traditional celebratory accounts, and identified the educational system as the villain for many of America's weaknesses, failures, and crimes. Michael Katz (1939–2014) states they:

tried to explain the origins of the Vietnam War; the persistence of racism and segregation; the distribution of power among genders and classes; intractable poverty and the decay of cities; and the failure of social institutions and policies designed to deal with mental illness, crime, delinquency, and education. [164]

The old guard fought back in bitter historiographical contests. [165] The younger scholars largely promoted the proposition that schools were not the solution to America's ills; they were in part the cause of America's problems. The fierce battles of the 1960s died out by the 1990s, but enrollment in education history courses declined sharply and never recovered.

Most histories of education deal with institutions or focus on the ideas of major reformers, but a new social history has recently emerged, focused on who the students were in terms of social background and social mobility. [166] Attention has often focused on minority [167] and ethnic [168] students. The social history of teachers has also been studied in depth. [169]

Historians have recently looked at the relationship between schooling and urban growth by studying educational institutions as agents in class formation, relating urban schooling to changes in the shape of cities, linking urbanization with social reform movements, and examining the material conditions affecting child life and the relationship between schools and other agencies that socialize the young. [170] [171]

The most economics-minded historians have sought to relate education to changes in the quality of labor, productivity and economic growth, and rates of return on investment in education. [172] A major recent exemplar is Claudia Goldin and Lawrence F. Katz, The Race between Education and Technology (2009), on the social and economic history of 20th-century American schooling.


The Electors: Ratifying the Voter’s Choice

Presidential electors in contemporary elections are expected, and in many cases pledged, to vote for the candidates of the party that nominated them. While there is evidence that the founders assumed the electors would be independent actors, weighing the merits of competing presidential candidates, they have been regarded as agents of the public will since the first decade under the Constitution.

Notwithstanding this expectation, individual electors have sometimes not honored their commitment, voting for a different candidate or candidates than the ones to whom they were pledged. They are known as “faithless” or “unfaithful” electors. In fact, the balance of opinion among constitutional scholars is that, once electors have been chosen, they remain constitutionally free agents, able to vote for any candidate who meets the requirements for President and Vice President. Faithless electors have, however, been few in number (in the 20th century, there was one each in 1948, 1956, 1960, 1968, 1972, 1976, 1988, and 2000), and they have never influenced the outcome of a presidential election.



A Brief History of American Major Parties
and the "Two-Party" System in the United States

Most historical literature refers to the "Party" of the Washington Administration as the Federalists, with those in opposition to the policies of that Administration as Antifederalists; however, the use of these designations is, in fact, more than a little inaccurate. The term "Antifederalist" (originally applied to those who had opposed the ratification of the Constitution drafted by the Framers meeting in Convention in Philadelphia in 1787) ceased to have any real meaning as a designation of a political faction once the Constitution formally took effect on 4 March 1789, as anyone serving in the new Federal Government had to take an oath to the new Constitution before entering upon their duties: referring to members of Congress as "Antifederalist", thus, makes little- if any- sense. In addition, there were no real national Political Parties prior to the Presidential Election of 1796 (although loose coalitions between, where not pre-arranged alliances among, State-based "factions"- along the lines of those cosmopolitan vs. localist divisions in Revolutionary Era politics suggested by the historian Jackson Turner Main- would prove to be the basis of the two Parties which would emerge in 1796 and did also have some effect on the political make-up of the first four Congresses).

It is best, therefore, to treat those who served in the first four Congresses [1789-1797] as being either Administration (that is, generally allied with those around Secretary of the Treasury Alexander Hamilton and Vice President John Adams) or Opposition (those generally associated with Secretary of State Thomas Jefferson and Congressman James Madison)- with the caveat that, while there is an apparent lineal connection between these groupings and the later Federalists and Republicans, respectively, the Presidency of George Washington was an era of "factions" rather than one of "Parties" and there were shifting sands in the political landscape of this early era in American political history. For his part, President Washington should be held to be a member of neither faction/future Party: although his political leanings would almost certainly be classified as generally more "Federalist" than "Republican", one has to think he would have been quite surprised to see himself listed in modern American History books as a dyed-in-the-wool Federalist simply because his Vice President would be one as President.

By the start of the 5th Congress (which coincided with the Inauguration of John Adams as President on 4 March 1797), two national Major Political Parties had emerged from among the strong supporters of the policies of outgoing President Washington and those who had pretty much been opposed to these policies, respectively. Those who had supported the policies of the Washington Administration became known as Federalists, because they supported a strong national government as a counterweight to the States; those who had been in Opposition became known as Republicans, because they felt that defending the sovereignty of the States against encroachments by the Federal Government was a truer essence of the federal republic known as the United States of America. The Federalists, feeling that their contrary vision of what a federal republic should be was the more "republican" in spirit, derisively referred to the Republicans as "democrats" (a term which, at the time, had connotations of the mob rule associated with the then-still very recent Reign of Terror following the French Revolution of 1789). It is true that some Republicans of this era came to see identification with Democracy as a badge of honor, and one often sees the term Democratic-Republicans applied to this Party in historical literature (this usage also creating a lineal relationship between these early Republicans and the Democrats of today); however, many political observers, instead, refer to the Republicans of this era as the "old", or "Jeffersonian", Republicans as a better, and more accurate, method of distinguishing them from the Republicans of today.

John Quincy Adams was elected as a Republican (in fact, all the candidates for President in 1824 were ostensibly Republicans) but, during the course of the 19th Congress [1825-1827] and on into the 20th Congress [1827-1829], the Republicans in both houses of Congress began to separate themselves into "pro-Adams/anti-Jackson" and "pro-Jackson/anti-Adams" factions- this last feeling strongly that, because of the controversial result of the 1824 Presidential Election, President Adams was not a "legitimate" holder of his office and, thus, coming to favor Senator Andrew Jackson of Tennessee, who had been defeated by Adams for the Presidency in 1824, as the next President of the United States in the upcoming 1828 Presidential Election. It is the practice of TheGreenPapers.com to refer to the first Republican faction, simply, as Adams Republicans, while referring to the second as Jackson Republicans, though political observers have used the term Jackson Democrats for this second Republican faction of the era instead.

By the start of the 21st Congress (coinciding with the Inauguration of President Andrew Jackson on 4 March 1829), the two opposing factions within the "old" Republican Party which had become evident in the course of the two preceding Congresses had coalesced into two new Major Parties: the Democratic Republicans (the one-time Jackson Republicans) and the National Republicans (the one-time Adams Republicans). The Democratic Republicans took their name from their identification with the democracy they urged on behalf of the "common man" as well as a strong historical tie they now felt with the old "Jeffersonian" Republicans who- as noted above- had been referred to as "democrats" as a term of derision (the "Jackson" faction thus painting those who supported outgoing President John Quincy Adams as being the contemporary equivalent of the Federalists of Adams' father, President John Adams). The National Republicans, meanwhile, adapted their name from the nationalizing policies pushed by the outgoing Administration of their champion, President Adams. Note, however, that neither faction-become-Party was yet willing to completely give up its identification with the "old" Republicans of the era before the 1824 Presidential Election, which had created each faction cum Party in the first place.

By the start of the 23rd Congress (which coincided with the Second Inauguration of President Andrew Jackson on 4 March 1833), the one-time Democratic Republicans were becoming more generally known as Democrats, the name itself derived from the aforementioned one-time term of derision hurled by the Federalists at the "old" (or "Jeffersonian") Republicans- with whom those who strongly supported the policies of President Jackson closely identified historically- back in 1796 and 1800. This Major Party has, of course, stayed with the name Democrats ever since. Meanwhile, by the start of the 24th Congress (4 March 1835), the one-time National Republicans were more generally known as Whigs, a name evocative of the political faction in opposition to the English Crown during the era of the Stuarts (17th Century); in addition, the Patriots of the American Revolution were often referred to- by friend and foe alike- as "Whigs" (in contradistinction to the loyalist "Tories"). These 19th Century American Whigs saw themselves as a bulwark against the "excesses" of the Administration of "King Andrew" Jackson and his heir apparent, Vice President Martin Van Buren; hence the use of this name by this Major Party.

The Slavery issue, however, marked the death knell of the Whigs as a Major Party: the Compromise of 1850 (which first adapted the concept of "squatter sovereignty" to the problem of the extension of Slavery to the territories) was lost in the battle over the Kansas-Nebraska Act of 1854 (which first extended this principle north of the northernmost limit of Slavery under the Missouri Compromise of 1820). In the wake of the resultant political fallout, Free Soilers and so-called "Conscience" Whigs joined forces with so-called "Free" Democrats and even denizens of the nativist American (known colloquially as the "Know-Nothing") Party to sow the seeds of a new Major Party: one soon enough to become more generally known as the Republicans, the name of this Major Party to this day. Meanwhile, other Whigs (primarily in the South) joined the Democrats, while a core of so-called "old" Whigs (principally in the Border South) vainly attempted to hold what was, by now, an "anti-Free Soil yet pro-Union" faction together as the winds of Secession and Civil War began to intensify and the end of the 1850s drew nigh (this last remnant of the Whigs would become the core of a short-lived Constitutional Union Party by the 1860 Presidential Election). The 34th Congress [1855-1857], thus, can be seen as a more or less transitional period in which the final decay and decline of the Whigs was becoming offset by the shifting sands of the contemporary antebellum political landscape swiftly producing a new "Democrats versus Republicans" Major Party alignment: one that, at least insofar as the Parties' names are concerned, continues to this very day.


17 Advantages and Disadvantages of the Two Party System

In government structures, a two-party system means that only two political parties receive a majority of the votes that are cast for representatives. That means only one party or the other can win a majority in the government.

There are additional parties that are present and campaign within a two-party system, even on a national level. The United States is a two-party system, for example, but the Libertarian Party and the Green Party have nationwide influence. These “third parties” do not receive enough votes to become a majority party.

A two-party system can also be used to describe a system where two major parties dominate an election and work together to form a majority ruling coalition, even if neither party won an outright majority on their own.

Here are the advantages and disadvantages of the two-party system to think about and discuss.

List of Advantages of the Two-Party System

1. The two-party system simplifies the election process.

The average voter casts a ballot based on a handful of core issues that are important to them. In the United States, a conservative voter might cast a ballot for the Republican Party because they support the party's stances on abortion and taxation. A liberal voter might cast a ballot for the Democratic Party because they support its stances on freedom of choice and a right to healthcare access. Voters are more likely to participate when they have confidence that their actions can bring about social change.

2. It creates a system that eliminates confusion.

In a two-party system, the result of each election is that the winner takes everything. Voters know that the top candidate will represent their district in the state or national government. They know that their preferred party, if it achieves a majority, will push to have legislation passed that they support. There is a lot less confusion in this type of structure because you either get what you want, or you do not.

3. The two-party system allows for common ideas to gain traction.

In a two-party system, there will always be partisan ideas that are promoted by the majority and opposed by the minority. In 2017, the U.S. experienced this with the tax reform package that was passed by a Republican-controlled Congress and Executive Office. There is also plenty of room for common ground to be found because the system encourages cooperation above anything else. That allows the two-party system to avoid extremism naturally.

4. It allows more people to participate in the civic process.

Multiple parties create multiple platforms that must be evaluated. Without a two-party system in place, anyone can create their own political party and platform to run on their own issues. With just two major parties, each must declare a platform that addresses all societal issues instead of a handful of “important” ones. That simplifies the evaluation process of voters, encouraging more of them to participate in the process of elections. There will always be outliers who do not identify with either political party, but for the most part, people will choose one or the other and stick to it.

5. The two-party system can speed up the process of governing.

Although the U.S. government is famous for its gridlock, it can move at unprecedented speed when emergency situations arise. All branches of government are linked through the two-party system, eliminating the need to form ruling coalitions. The system also allows people to vote for specific candidates outside their preferred party for certain offices: an individual could vote for a Democrat for President and a Republican as their senator. This gives the people more control over the overall structure of their government.

6. It allows anyone to run for office while naturally promoting the most experienced candidates.

The two-party system uses a series of primary elections to weed out the candidates that people do not want representing them. The system of primaries makes it possible for anyone to run. An example is the 2016 primary for the State of Washington Governor's race: in total, 11 people ran a formal campaign to become Governor, including an individual called "Goodspaceguy." The top two vote-getters then advanced to the general election. At the same time, there is little chance of a "hung" government with no majority, because one party or the other will be in power.

7. The two-party system encourages majority representation.

Only two third-party candidates have had notable success in the national popular vote in the United States since 1900. In 1992, Ross Perot received over 19.7 million votes, almost 19% of the overall total. Earlier, in 1912, Theodore Roosevelt ran as a third-party candidate in an attempt to serve another term as President; Roosevelt received 4.1 million votes, 27.4% of the total votes cast. The system, though inclusive, is also restrictive enough to ensure that the majority receives the exact representation they want in each district.

8. It limits the number of people with extreme views that can be elected.

A multi-party system makes it possible for anyone with an extreme view to be elected as a representative in the government. The two-party system restricts this, making it more likely that a centrist will be the representative of each party. In this way, the majority is protected from the minority. In a multi-party format, there is always the possibility that an extreme political party could become part of a ruling coalition. Extremism could introduce chaotic reforms that would be potentially damaging for multiple generations. With just two parties, more stability is achieved.

List of the Disadvantages of the Two-Party System

1. The two-party system creates societal polarization.

People are most comfortable when they are surrounded by others with similar beliefs. That means spending more time with like-minded family and friends. Households even move to neighborhoods with similar political preferences, so there is a “guarantee” that the preferred political representation can be achieved. Political polarization therefore tends to deepen over time in such a society, and there is little genuine debate over contested issues. The system closes more minds than it opens.

2. It creates thoughtless voting patterns.

In a two-party system, it is common for voters to vote a straight ticket based on their party preference. Some states even offered a “master lever,” which allowed a voter to select every candidate of their preferred party with a single voting action. Although that makes participation easier, it also encourages thoughtless voting. Instead of evaluating candidates on their background, experience, and qualifications, voters judge them on party label alone.

3. The two-party system limits voter choice.

There were multiple candidates in the 2016 Republican Presidential primaries. Donald Trump eventually emerged as the victor and received his party's nomination. Yet in many of those primaries, roughly 65% of Republican voters cast ballots for candidates other than Trump. A majority of people who affiliated with the Republican Party were then expected to support Trump in the general election if they wanted to support their party. Although anyone can run for office, the major parties limit voter choice through this nomination process: voters are told to support a specific person, whether or not they agree with that individual's positions.

4. It creates a winner-take-all system of plurality voting.

In the United States, 48 of the states give all of their electoral votes to the candidate who wins the most votes statewide. Although the system allows electors to cast ballots for a different candidate (sometimes with a personal fine levied if they do), the end result maintains the two-party system. Unless a third-party candidate can win a plurality of a state's votes, they'll receive zero electoral votes. That can make it difficult to vote out incumbents, especially with straight-ticket voters in play.

5. The two-party system excludes individuality.

When someone mentions that they voted for a third party in the United States, the average voter perceives that as a “wasted vote,” or worse, as a vote for the “other guy.” In the 2000 Presidential election, Ralph Nader received 2.74% of the popular vote. Many Democratic voters believe that if those third-party voters, most of whom self-identified as liberal, had voted for Al Gore instead of the Green Party, then Gore, not Bush, would have won the election. In this political system, free thinking and individuality are actually discouraged.

6. It creates debate restrictions that can limit new ideas.

Gary Johnson was excluded from the 2016 U.S. Presidential debates because he fell short of the polling thresholds set by the Commission on Presidential Debates. At the time, Johnson was consistently polling at around 7%, but the Commission's rules required a third-party candidate to be polling at 15% to be included. Yet a candidate who receives just 5% of the national vote qualifies their party for public campaign funding in the next election. A two-party system creates debate restrictions that can limit the new ideas available to a society.

7. The two-party system creates fixed political views.

The two parties in a political system create platforms that limit the range of ideas available on any given issue. These views are fixed, often set at each party's convention every four years. That means each citizen must vote for one party or the other, even if neither fits their personal preferences. These fixed views also make it difficult for the parties to respond to shifts in public opinion.

8. It eliminates the ability for the majority to rule in some instances.

In a two-party system, voter turnout is critical. If turnout is low, the votes the winning party receives reflect only how a minority of the population wants to be represented. And since the average voter decides based on a handful of personally important issues, they will vote for the party that meets their core need, even if they disagree with the rest of that party's positions.

9. The two-party system creates inconsistent governing.

When one party loses power in a two-party system, its policies are often reversed because the other party holds an opposing view of how things should be managed. The U.S. saw this in the Trump Administration's efforts to undo the Affordable Care Act, change the DACA program, and roll back other regulations and executive orders. The Obama Administration did the same thing. It is a pattern that repeats itself, producing levels of policy churn that make it difficult to achieve lasting societal change.

On balance, the two-party system does make it easier to vote. It reduces the need to form coalitions and can encourage cooperation. At the same time, it can also produce gridlock and governmental inaction. No system of government is perfect, and there will always be challenges to face with a two-party system. If those challenges are carefully managed, it can be a beneficial structure for everyone.
