Generational Change

Generational change plays an important role in American society, driving progress on social and political issues in a continuously evolving United States. The best example is the passage of the Civil Rights Act of 1964 and the end of legal segregation in America. As new generations of young people entered college after World War II, many questioned the entrenched ideology of white racial superiority and the political solution provided by state-sponsored segregation, and, horrified at what they found, united to protest for change. An eclectic mixture of ethnicities joined to overturn centuries of morally corrupt white dominance, forcing the legislative, executive, and judicial branches of government to follow suit. The following year, the Voting Rights Act halted state-sponsored efforts to deny African Americans a place at the polls, and the United States finally began to fulfill its promise as a democratic nation.

The effects of generational change on individual political and social life abound in 2014, as more states move to legalize same-sex marriage and the sale and use of cannabis. As young Americans enter the political fray at the state and national level, their attitudes reflect a greater tolerance for issues conservatives find intolerable. Both movements enjoy overwhelming support at the local level, and national politics are catching up as generational change makes its way into the legislative and executive branches of government. For the most part, generational change is positive and continues to move the country in a progressive direction, but the single branch of government resistant to generational change is the judicial branch.

Because Supreme Court Justices serve lifetime appointments, generational change has little or no effect on the Court, whose members are appointed at an average age of 53. The United States is governed by a set of laws overseen by a nine-member panel whose attitudes on social and political issues remain mired in the past. Reinforcing this tendency toward conservative values, the Justices do their work in private, with little accountability to the people who must live under their rulings. The concern that Justices would lose touch with the will of the people was very real for the nation’s founders, which is why Justices were originally required to ride circuit in the regions assigned to them.

In The Great Decision, Cliff Sloan and David McKean described the legislators’ reasoning for insisting that Supreme Court Justices actively participate in circuit riding, as mandated in the Judiciary Act of 1802. “Republicans claimed that it was more desirable to have Supreme Court justices riding circuit court instead of simply huddling together in Washington two times a year where they were isolated from the American people; the justices should be more in touch with local law and custom.”(Sloan, 102) Circuit riding was formally abolished by the Judicial Code of 1911, a decade before William Howard Taft, the only person to serve as both President and Chief Justice, joined the Court in 1921; as Chief Justice, Taft championed the Judiciary Act of 1925, which further lightened the Justices’ workload by giving them broad discretion over their docket. The founders’ mandate to ride circuit would surely have seemed overly taxing to a man so morbidly obese that he required a specially made bathtub that could fit two full-grown men.

Beyond the elimination of circuit riding, rule changes governing the Supreme Court have further reduced the workload required of the Justices and placed additional barriers between its members and the public. In her book, Out of Order, Sandra Day O’Connor provides an insider’s perspective on the evolution of the Supreme Court’s standard operating procedures. “Today, with no obligation to ride circuit, the Justices enjoy their impressive and comfortable quarters at One First Street. The Court’s role in picking the cases it hears has also changed dramatically…the Court’s docket was inundated with…cases that the Justices were…obliged to decide on the merits, regardless of their importance or the urgency for review. Today, the court uses its discretion to select a small subset of cases from approximately eight thousand appeals…oral advocates are strictly limited to thirty minutes of argument time…and advocates are lucky if they get more than two unbroken sentences out of their mouths before the Justices interject with difficult questions.”(O’Connor, 9-10) While O’Connor takes pride in the Court’s evolution, this streamlining of justice allows the Court to choose the cases it will hear and to ignore those that raise difficult political or social issues.

No longer bound by a congressional mandate governing the cases it must review, the Court dodges attempts to address the constitutionality of same-sex marriage, an issue the conservative Justices, whose social views are colored by Abrahamic religion, stonewall by deferring to states’ rights. As the U.S. population experiences generational change, its opinion of same-sex relationships and the people in them has grown more accepting, but the conservative Justices remain mired in an earlier world where homosexuality was confined to the closet. The LGBT movement rightly compares itself to the Civil Rights movement because in both cases the Court created two levels of citizenship, state and federal. In the end, a legislative amendment defining the rights of LGBT individuals will be the sole solution to the issue. For Americans between the ages of 18 and 40, the normality of homosexuality in everyday life contradicts the strident morality of disgust voiced by conservatives, a gap that produces the social disconnect labeled ‘out of touch.’

In the past, government resistance to social progress resulted in violent riots that shut down many of the nation’s major cities. Police tactics of fire hoses, dogs, clubs, and officers on horseback to quell opposition produced a backlash of greater defiance that the government could not ignore. Passage of the Civil Rights Act was a social victory for the American people, while the violence that accompanied it shook the judicial system to its core. In response, the contemporary justice system seems more comfortable protecting the amorality of corporations than the rights of Americans on matters including freedom of speech, campaign finance, and religious exemptions to federally mandated employee insurance coverage. More concerning, the militarization of police forces nationwide means future demonstrations will be met with more sophisticated weapons. Replacing the fire hoses and clubs with rubber and beanbag projectiles, electrically charged probes, pepper spray, and low-frequency sound cannons ensures that future protestors will pay a higher price for dissent.

The architects of American democratic government provided the U.S. Constitution as a blueprint that incorporated tools for compromise and change in a country still in its infancy. Peter Irons contended in A People’s History of the Supreme Court that slavery was the central issue for delegates at the 1787 convention, with the term “wealth” becoming “a euphemism in convention debates for slaves.”(Irons, 29) It boggles the contemporary imagination that an ideology of white superiority, a social construct, took precedence over jurisprudence. As gatekeepers of the laws governing the United States, the Justices, through their indifference to unjust edicts, actively impeded the social and economic development of black Americans for more than a century and a half by allowing slavery and separate-but-equal laws to stand. The tumultuous decades of postwar America leveled the political and social playing field, but only after young Americans joined the fight. With such far-reaching power to shape American policy, is it practical to continue providing Justices lifetime appointments, and what opportunities exist to guide the institution toward generational change?

The most promising solutions propose reinstating circuit-riding duties, allowing Congress to mandate a limited number of cases for Supreme Court review, and replacing lifetime appointments with term limits. First, a return to circuit riding would silence critics who accuse the Justices of being out of touch with the average American’s problems. By participating in a circuit court’s most pressing cases, its judicial overseer would gain clearer insight into the challenges confronting the region’s population. Next, allowing Congress to mandate a limited number of cases for Supreme Court review would require the Justices to engage with the laws they currently avoid. Instead of simply choosing the cases it will hear, as O’Connor points out, the Court would become more responsive to current conflicts as identified by the people’s representatives in Congress. Finally, replacing lifetime terms with ten-year limits would create a more democratic system that allows more judges to participate at the highest level of the judiciary. With Justices joining the Court at an average age of 53, improvements in health care and medical practice allow lifetime appointments to stretch into three decades of shaping the United States justice system. Term limits, by contrast, would discourage a handful of Justices from dominating the shaping of American law for over three decades and would grant a louder voice to a larger, more eclectic pool of judicial candidates.

The gentrification of the Supreme Court has produced a branch of government that interprets the Constitution through the eyes of a corporate few, reinterpreting long-standing precedents in novel ways that elevate corporate personhood over the nation’s masses. It is an ossified view that only an old, white, wealthy, entitled man could entertain, but injecting new blood into the Supreme Court would introduce generational change to this staid institution. Now that the Supreme Court has breached the wall between religion and government with its Hobby Lobby ruling, opportunities abound for the religious right to press for even more rights that oppress the majority but fit perfectly with the worldview of a handful of Justices. The time for change is at hand, before the Supreme Court turns the Constitution into what Thomas Jefferson warned, “a mere thing of wax in the hands of the judiciary, which they may twist and shape into any form they please.”(Burns, 40)

History Repeating Itself

It was the dawn of a new century, and across the United States citizens were angry. The U.S. military was fighting an insurgency in a country many Americans would have had difficulty locating on a map, sapping both the country’s youth and its finances. Domestically, the national economy suffered a sharp downturn, causing unemployment and a deep distrust of Wall Street that extended to its influence on the nation’s politicians. The Supreme Court, too, came under attack, accused of promoting the welfare of corporations over that of Americans. In the media, tales of corporate corruption and monopolistic behavior appeared almost daily, leading to protests and public calls for action. This sounds depressingly familiar; yet it describes the problems facing the United States a century earlier. We, as a nation, have been here before. At the turn of the twentieth century, the country was fortunate to have a strong, intelligent leader, Republican Theodore Roosevelt, who recognized the corrupting influence of corporate finance on the political system and spent a lifetime attempting to level the playing field for all Americans. Who will play this protective role for contemporary Americans?

In The Bully Pulpit, historian Doris Kearns Goodwin described the conditions facing the new President:

“At the start of Roosevelt’s presidency in 1901, big business had been in the driver’s seat. While the country prospered as never before, squalid conditions were rampant in immigrant slums, workers in factories and mines labored without safety regulations, and farmers fought with railroads over freight rates. Voices had been raised to protest the concentration of corporate wealth and the gap between rich and poor, yet the doctrine of laissez-faire precluded collective action to ameliorate social conditions.”(Goodwin, 24)

Were Roosevelt alive today, his fellow Republicans would probably brand him a Republican in Name Only, or RINO (pronounced “rhino”), because his political ideals ran counter to those of the party. He campaigned for women’s voting rights, eight-hour workdays, workers’ compensation for job injuries, federal regulation of industry, an end to political machines and cronyism, and the breakup of corporate monopolies. Americans were fortunate to have a leader like Roosevelt at the turn of the century, when major corporations ran roughshod over the public’s political, economic, and social lives. He recognized the corporate domination of the Republican Party as a threat both to the party’s continued existence and to the well-being of Americans. Winning the presidency in 1904 as a Republican, despite a platform that was antithetical to Republican machine politics, is a measure of both Roosevelt’s leadership abilities and his popularity with the American people.

The numerous biographies of Theodore Roosevelt (never Teddy, a nickname he disliked) provide a consistent list of the qualities that made him such a dynamic leader: formidable intelligence, boundless energy, a near-photographic memory, prodigious habits as a writer and reader, a willingness to seek out and listen to criticism, empathy for the lowest in society, and a disposition to personally investigate important political issues. During his long political career, Roosevelt applied these qualities to propel the Republican Party and the American people into the twentieth century while reining in the dangerous excesses of corporations. An incident early in his career as a New York State legislator demonstrated these qualities when Roosevelt took on the city’s cigar manufacturers.

In 1882, a bill to ban the production of cigars in tenements began to circulate through the New York legislature. Backed by the Cigar-Makers’ Union, it accused cigar manufacturers of running tenement factories that required entire families to eat, sleep, and work in filthy, cramped, and overcrowded apartments, creating a danger to public health. Setting aside his initial impulse to vote against the bill because it impinged on the rights of tenement factory owners, Roosevelt took the initiative to inspect several factories unannounced, and the scenes he encountered decisively changed his opinion of the bill. With his support, the bill passed into law, but it was later challenged by cigar manufacturers and overturned by the New York Court of Appeals.

“It was this case,” Roosevelt later said, “which first waked me to…the fact that the courts were not necessarily the best judges of what should be done to better social and industrial conditions. They knew nothing whatever of tenement house conditions,” he charged, “they knew nothing whatever of the needs, or of the life and labor, of three-fourths of their fellow-citizens in great cities.”(Goodwin, 101)

In the collision, or what some critics call collusion, between government and industry, the corporate mandate won. Taking the lesson to heart, Roosevelt fought a lifelong uphill battle against the Republican Party machine to introduce progressive ideals that uplifted the American people and returned the government to the people instead of the corporations. It was a battle that temporarily divided the party and allowed the Democratic Party to recapture the White House in 1912, and one that Roosevelt would not live to see completed. His legacy lived on, however, when his cousin, Franklin D. Roosevelt, reached the Oval Office and implemented a progressive agenda that had been decades in the making. Today, when Americans go to work, they do so with assurance from the government that the workplace will be safe and free of harassment based on age, gender, race, religion, or politics. Employees injured on the job, or as the result of unsafe working conditions, are entitled to fair compensation. Additionally, employers are limited in the number of hours they can require individual employees to labor. These were all protections that Theodore Roosevelt fought to realize.

Roosevelt’s administration did succeed in rendering Americans safer from the excesses of industry on a number of fronts, beginning with a pure food and drug act and a meat inspection act that required the industry to implement sanitary conditions in all stages of food processing and to stop selling rancid or adulterated meats. In arguably his greatest legacy to the American people, Roosevelt placed some 230 million acres of land under federal protection to preserve the nation’s natural wonders from the creeping exploitation of industry in its search for resources. At a time when government policy skewed toward big business, his stance revealed a streak of political bravery rarely found in the contemporary Republican Party. After all, Ronald Reagan’s insistence on party unity, including an imperative never to criticize a fellow Republican, precludes any opportunity for critical discourse or compromise on important issues facing the nation. Unwilling to reach a middle ground with the Democratic Party, the contemporary Republican Party stands unified, while doing nothing.

When Roosevelt died at home in his sleep, a contemporary remarked that death had to sneak up on the former president, because had he been awake, a fight would have broken out. In addition to his political record, his noted accomplishments include winning the Nobel Peace Prize for brokering peace between Russia and Japan, receiving the Congressional Medal of Honor, awarded posthumously, for his actions in Cuba, and the authorship of dozens of books. In contrast, who has the Republican Party offered by way of leadership in the twenty-first century?

George W. Bush, an alcoholic and failed business executive who achieved sobriety and found religion on his way to becoming Governor of Texas, reached the presidency after losing the popular vote and after intervention by the United States Supreme Court. Unlike Roosevelt, Bush was a mediocre student at both Yale and Harvard Business School. Bush, too, served his country as a pilot in the National Guard; however, he did not seek out the challenge of combat in Vietnam as Roosevelt had in Cuba. The Bush presidency was racked with controversy, from White House officials signing off on a program of rendition and enhanced interrogation that, a century earlier, Roosevelt would have recognized as kidnapping and torture, to the unjustified invasion of Iraq based on specious intelligence that it posed a national threat because it harbored weapons of mass destruction. When Joseph Wilson challenged the administration’s rationale in a New York Times op-ed, top White House officials leaked the information that his wife, Valerie Plame, was a Central Intelligence Agency operative, a crime punishable under United States law, yet no one was ever prosecuted for the leak itself. Moreover, Bush signed into law the controversial Patriot Act of 2001, which expanded warrantless searches and surveillance of Americans, and his administration claimed wartime powers broad enough to detain U.S. citizens without due process of the law.

Perhaps more disturbing was the role Dick Cheney played, first as Chief Executive of Halliburton and later as second in command to the president, in providing the company with no-bid military contracts during combat operations. In 2004, Cheney still held stock options in the company worth eighteen million dollars, a fact that later led Kentucky Senator Rand Paul (R) to openly accuse the former Vice President of war profiteering. In contrast, Theodore Roosevelt recognized the dangers of corporate influence on politics and frowned upon politicians using their offices to give corporations an unfair advantage over small-business owners and individual Americans.

Roosevelt and Bush each confronted the threat of catastrophic economic collapse caused by Wall Street’s greed. In 1907, two men attempted to corner the market on copper, triggering a panic that brought down one of New York’s largest trust companies. To keep the system from ruin, J.P. Morgan used his considerable fortune, in cooperation with other leading bankers, to prop up the market. Roosevelt’s administration pledged twenty-five million dollars in federal funds, but in the laissez-faire spirit of the era, the government expected financial institutions to absorb the losses. By contrast, Bush responded with upwards of four trillion dollars to keep financial institutions afloat during the 2007 subprime mortgage crisis and the collapse of the housing bubble, and while federal regulatory agencies fined the worst offenders, no one was held criminally accountable for actions that cost untold numbers of people around the world their life savings. In a comparison of the two men as leaders of the Republican Party, Bush pales next to Roosevelt.

Following in the Bush administration’s tumultuous jet stream, the Republican Party chose a genuine former combat pilot who had endured years of torture after his plane was shot down during the Vietnam War: Arizona Senator John McCain. Despite his reputation as a “maverick” politician, McCain’s 2008 presidential campaign against Democratic Illinois Senator Barack Obama stumbled on a number of occasions, notably when a reporter asked McCain how many houses he owned and the Senator appeared unsure, guessing that he and his wife probably owned about six. Furthermore, McCain chose Alaska Governor Sarah Palin as a running mate without properly vetting her as a viable candidate. In response to the Democrats’ selection of a black candidate, the Republicans attempted to secure the female vote by choosing Palin, but the strategy failed when her limitations and inexperience as a political leader became apparent. She did prove a boon for late-night comedians, who lampooned Palin mercilessly over her perceived lack of intelligence; her brand of “maverick” was found wanting by the American people, as signaled by Obama’s victory in the 2008 presidential election.

Behaving as if it had learned nothing about the nation’s priorities after losing the 2008 presidential election, the Republican Party nominated Mitt Romney to lead it in the 2012 election. The former Governor of Massachusetts had earned his fortune in private equity and campaigned on a promise of bringing fiscal responsibility to government. After he contradicted himself on key issues on a number of occasions, opponents labeled Romney a flip-flopper whose ideals changed in chameleon-like response to public opinion. Critics also questioned the manner in which Romney earned his fortune at Bain Capital, describing him as a corporate raider who profited by preying on weak companies. Just four years after the worst economic downturn since the Great Depression, the Republican Party offered Americans a prime example of the despised Wall Street archetype, and it lost another election. It is difficult to imagine Theodore Roosevelt telling an audience, as Mitt Romney did, “There are 47 percent of the people who will vote for the president no matter what. All right, there are 47 percent who are with him, who are dependent upon government, who believe that they are victims, who believe the government has a responsibility to care for them, who believe that they are entitled to health care, to food, to housing, to you-name-it. That’s an entitlement. The government should give it to them. And they will vote for this president no matter what. And I mean the president starts off with 48, 49…he starts off with a huge number. These are people who pay no income tax. Forty-seven percent of Americans pay no income tax.”

Romney voiced the beliefs of a Republican establishment that increasingly views U.S. citizens as either takers or producers, and corporations as the only American entities truly deserving of government support. In one hundred years of political maneuvering, the Republican Party has remained true to its ideology of equating wealth with success and treating the taxation of that wealth as anti-American. The Luddite-like attitude of the Republican establishment appears in its selective reading of scientific evidence to deny climate change and in its shallow understanding of women’s health issues, such as the right to seek a safe abortion and to access affordable contraceptives. While proudly voicing its pro-life ideals to protect unborn children, the Republican Party offers little support for a child after it leaves the womb, and later castigates that child for being a drag on society. It is a stagnant ideology that resists change in a continuously evolving nation, with leaders who see the rest of society in terms of units. And just as they did a century earlier, Americans recognize the tune, and they tire of the song.

When the 2016 presidential campaign begins in earnest, the Republican establishment will trot out the usual suspects: Kentucky Senator Rand Paul, New Jersey Governor Chris Christie, Louisiana Governor Bobby Jindal, Florida Senator Marco Rubio, Wisconsin Congressman Paul Ryan, and Texas Senator Ted Cruz. Do any of these individuals possess the political courage of a Theodore Roosevelt to address the pressing social and economic inequalities facing Americans? Do they understand the full spectrum of American society wrestling with barriers to economic advancement, and can they voice realistic and humane solutions to those problems? Perhaps the Republican establishment is too dependent on Wall Street to change, but were Roosevelt alive today, he would relish the fight to level the political playing field between individuals and corporations, and the nation would benefit in the process.

Mythmaking and Poverty

Media historians describe American film and television as forms of mythmaking driven by the contemporary issues facing the nation. As cultural tales, they provide cautionary stories that address Americans’ prevailing concerns. For example, in the years after World War II the science-fiction genre of film and television burst into prominence as a direct result of the atomic bomb. The new weapon’s immense destructive power, able to wipe out entire cities, brought science, and its ability to construct such advanced devices, to the film industry’s attention. In Japan, the film industry birthed Godzilla, a creature awakened in the wake of atomic testing. In the 1953 film The War of the Worlds, an atomic bomb dropped on the invading Martians proves ineffective, demonstrating their overwhelming might.

The same dynamic holds today, but the fears driving the mythmaking process stem from economic insecurity and income inequality. Two popular television shows built on the theme of persistent income inequality are The Wire and Breaking Bad. While both series appear to focus on the American drug war, the engine driving each show is income inequality.

The Wire began its five-season run on cable television’s Home Box Office network in June 2002 and quickly earned acclaim from critics for its superbly written and acted characters. Taking an unflinching look at the drug trade’s money trail and influence in the city of Baltimore, its stories touch on politics, labor, education, and print media. The series’ characters provided a believable level of ethnic diversity, while its position on HBO shielded the show’s writers from the censorship imposed on broadcast television. By placing its violence in context, the show allowed viewers to sympathize with both victim and killer. All these elements combined to produce a gritty inner-city police drama that makes a damning statement about income inequality, about the poor’s forced participation in an underground drug economy that is the only work available to them, and about the violence that accompanies both.

Show creator David Simon populated The Wire’s dystopian world with characters whose personal stories resonated with the public. It brought viewers too genteel to have ever visited an inner-city ghetto into abandoned and condemned tenements where children served as surrogate parents to even younger children, slinging drugs on street corners to buy food and clothes. In contrast to the very poor, the political elite search for the means to acquire more power and money. Whether it is the labor union president trying to save his workers’ jobs or the smartest, fiercest, and gayest stickup man, all the characters are driven by income inequality. It is a world, in short, where a young boy can go from playing in the streets one moment to shooting that very same stickup man in the back of the head the next.

The show’s writers pulled no punches when portraying its police characters as flawed human beings who invent a serial killer in order to secure more work hours and resources for fellow officers, and who fabricate snitches to make a few extra bucks. Police officials eat unhealthily, drink heavily, and curse creatively in response to pressure from above demanding lower crime statistics. Although The Wire falls into the genre of the crime drama, only one police character fires a weapon during the entire series, on three separate occasions, and each time in error.

Gunplay is an episodic constant, but overwhelmingly and depressingly, a child is usually the one holding the weapon. It is through these children that the show argued for social change, as viewers watched a tug-of-war among police, educators, parents, and drug bosses for the hearts and minds of the show’s mostly black adolescents. Bombarded on all sides by conflicting information, the characters develop survival strategies to navigate the dangers of an inner-city ghetto, passing that knowledge on to the next generation in a perpetual loop of ingrained poverty.

The simple solution offered by the show’s writers is the legalization or relaxed regulation of the drug trade, but this is a bandage over the much deeper wound of income inequality: a problem that crosses the barriers of age, race, and gender with impunity and remains a constant worry for large portions of the nation’s population. The Wire invited viewers to take a front-row seat as it imagined the slow decline of an American city in the grip of two economies, one of them illegal.

In contrast to the ethnic diversity found on The Wire, the majority of characters on Breaking Bad are middle-class whites, a fact perhaps foreshadowed by the protagonist’s name, Walter White, played admirably by Bryan Cranston. Hispanic performers fill supporting roles such as assassins, kingpins, bodyguards, drug dealers, and Drug Enforcement Administration underlings. The show’s creator, Vince Gilligan, set the crime drama in Albuquerque, New Mexico, which may offer a plausible explanation for the lack of diversity.

Broadcast on American Movie Classics beginning in 2008, Breaking Bad also had a five-season run, ending in 2013. It won a multitude of awards for writing and acting and became a national phenomenon that spurred real-life copycat behavior akin to that inspired by the movie Scarface. Airing on the heels of the economy’s collapse, the show placed Walter White in the worst nightmare of a majority of Americans. He is a late-middle-aged high school chemistry teacher who works part-time at a car wash in order to provide for his pregnant wife and his teenage son, who has cerebral palsy. To complete his Job-like misery, White is diagnosed with terminal lung cancer.

Gilligan presented his viewers with a story that required little suspension of disbelief because the situation plays out daily in American life. White looks beyond his death to how it will affect his family’s future. His health insurance plan is woefully inadequate, and the cancer treatments needed to keep him alive will drain his family’s savings, leaving them deeply in debt long after his death. In a nation that measures personal success in financial terms, White is an utter failure, and by extension, so is his family.

The need to provide for his family beyond death drives White into the same underground drug economy depicted in The Wire, where he first works as a methamphetamine producer before adding distribution to his résumé. Viewers followed a bungling White naively stumbling into the local drug scene with a product far superior to his competitors’, believing the fast cash promised his family long-term financial security. However, as White’s success brings in literal barrelfuls of cash, the power that accompanies financial prosperity seduces him into overreaching for more, placing him at odds with friends and family.

Aaron Paul brilliantly portrays Jesse Pinkman, White’s frequently reluctant, morally conflicted, and physically abused partner and former pupil. Treated with disdain by White, who unapologetically manipulates Pinkman and those closest to him, Jesse empathizes with the victims left in the wake of his partner’s destruction. Searching for stability, Pinkman makes his first major purchase: his aunt’s home, bought from his unwitting parents, who have written off their son as a loser but are tricked into selling at a price grievously below market value. He provides the yin to White’s yang, literally tossing bricks of cash onto the doorsteps of Albuquerque residents like a paperboy delivering newspapers, while his partner opts to bury his stash in the ground.

By the series’ end, White is feared and reviled by the wife and son he cherishes, banished from the family as a bad memory to be outlived and forgotten. Gone is the once-adored chemistry teacher who endured bullying by his boss at the car wash, replaced by a calculating murderer willing to place children in harm’s way to reach his own goals. The delusion remains to the last: just before his death by gunfire, surrounded by the destruction he has caused, he admits to enjoying the work and to being good at it.

Income inequality and the fears that accompany it are constants that American scriptwriters depend upon to spark the viewer’s imagination and create a successful cautionary myth. The Wire and Breaking Bad presented the drug trade as both the problem and the solution to climbing the economic ladder. The former questioned the war on the underground drug economy by comparing it to Prohibition, supporting drug legalization, and preferring treatment to incarceration. The latter unabashedly condemns the drug trade and consigns all it touches to shame, degradation, and death.

However, remove the plot device of an unjust economy, and the myth falls apart for lack of context. Desperate fears of hunger and insecurity become irrelevant in a world focused on helping the weakest instead of glorifying the wealthy as models of success. What does it say about our culture that the most common American fear is also its favorite film and television plot device? Will future generations look back on both shows in confusion because the trope no longer makes sense? Just as science destroyed the creation myth, perhaps the day will arrive when local and federal efforts to eradicate poverty prove successful, and another plot device will fall by the wayside.

Lifetime Appointments Place Supreme Court Justices Above the Political Fray: The Lie That Provides Supreme Court Justices With Lifetime Appointments

From its inception, the United States government has based its authority on the rule of law as decided by the will of the people. After years of enduring the whims of a distant king, the colonists settled on a government of popular representation, with each branch answerable to the people, with the exception of the Judiciary. Instead, the president nominates a qualified individual to one of the highest positions in government; the Senate then votes on the nomination, and if confirmed, the nominee becomes a Supreme Court Justice. After swearing to protect the Constitution, the new Justice begins a term unfettered by the need to seek the approval of the other branches of government or the American people, and unburdened by election campaigns or term limits.

As the highest court in the United States, its members are the gatekeepers tasked with protecting the constitutional rights of every U.S. citizen from the unfair encroachment of the Executive and Legislative branches. The gravitas with which it carries out this task places the Supreme Court beyond reproach. However, has the court proven itself worthy of this power by observing a history of fair judgments untainted by political ideology? Have the justices earned the privilege of serving a lifetime commitment protecting the constitutional rights of American citizens? Is providing this small group of individuals with such far-reaching power good for a democracy?

The popular argument for providing Supreme Court Justices with lifetime appointments is that it places them beyond the ideological reach of the party politics forced upon elected officials. However, any objective history of the United States Supreme Court’s impact on American civil liberties provides ample evidence that its members have been politically committed to particular parties. It begins with the model case for understanding how the Supreme Court gained final authority over the constitutionality of American laws: Marbury v. Madison, decided in 1803.

The details of the case revolved around a political battle between the outgoing Federalist administration of John Adams and his Democratic-Republican successor, Thomas Jefferson. Jefferson accused the Adams administration of packing the federal judiciary with last-minute appointees and refused to deliver several of the commissions, one of which belonged to William Marbury. Marbury v. Madison signaled the first of many political battles waged by presidents to select federal judges who fit the prevailing ideological climate. For example, Sandra Day O’Connor described the qualities President George Washington sought in his Supreme Court selections: nominees were “reliable supporters of the Federalist cause, had service in the Revolution, were active in the political life of the nominee’s state, and were favorably regarded by the President or other well-known Federalists.”

Adams, too, selected Federalist-affiliated Justices for the court, leading to a showdown between the Jefferson administration and a Federalist-packed Supreme Court. From its very inception, the highest court in the nation was shaped by political forces, and this became the norm, not an isolated event. Speaking about the contemporary selection process, O’Connor said, “Every President making appointments has tried to appoint people who were politically acceptable to the President himself.” If the President’s search for appointees turns on ideological affinity, how, then, can the Court’s members claim exemption from the political vagaries facing the nation, or point to lifetime appointments as the guarantee of their neutrality?

Examples of the damage caused to Americans by the Supreme Court’s adherence to the prevailing political climate abound in its early history. In 1857, the Supreme Court had an opportunity to correct the injustice of slavery forced onto a large portion of the country’s black population. It was an issue of such importance that within four years the nation would fight a civil war to overturn the ‘peculiar institution’ at the cost of an estimated 600,000 soldiers’ lives. With an opportunity to be on the right side of history, the Supreme Court instead ruled in Dred Scott v. Sandford that blacks under the Constitution were a “subordinate and inferior class of beings who had been subjugated by a superior race” with “no rights which the white man was bound to respect.”

In the aftermath of the Civil War and Reconstruction, the Supreme Court received a second opportunity to secure national rights for all Americans in the Slaughterhouse Cases of 1873. Instead, it recognized two levels of citizenship, giving rights defined by the states precedence over national rights. Some southern states then enacted laws requiring separate public accommodations for whites and the so-called ‘inferior races.’ When the constitutionality of these laws was challenged in the 1896 Plessy v. Ferguson case, the Supreme Court denied that “the enforced separation of the two races stamps the colored race with a badge of inferiority.” Legislation, it held, was “powerless to eradicate racial instincts or to abolish distinctions based upon physical differences…If one race be inferior to the other socially, the Constitution of the United States cannot put them upon the same plane.”

Supreme Court Justices were not immune to the climate of racism that plagued the nation long after the end of the Civil War. In 1914, President Woodrow Wilson appointed James McReynolds, a man of many prejudices, to the Supreme Court. When Louis Brandeis and Benjamin Cardozo, both Jews, joined the Court, McReynolds refused to sit next to, speak with, or dine with either man, because he was, in Justice O’Connor’s words, a “notorious racist.” If the Supreme Court lacked the ability to curtail racism within its own ranks, what level of impartiality could black Americans hope for in the early twentieth century?

After supporting a national two-tiered social system for 151 years, the Supreme Court finally took a turn toward supporting civil liberties for all Americans. Led by Chief Justice Earl Warren, the court decided against the segregation of black and white schoolchildren in the 1954 Brown v. Board of Education case. Appointed by President Dwight Eisenhower, who later considered Warren “one of the two worst mistakes” of his presidency, Warren believed separating black children “from others of similar age and qualifications solely because of their race generates a feeling of inferiority as to their status in the community that may affect their hearts and minds in a way unlikely ever to be undone.”

The Supreme Court completed its pivot toward civil liberties for all Americans in 1967, when President Lyndon Johnson appointed the first black American to the Court, Thurgood Marshall. The following year, civil rights legislation passed that further broke down the racial barriers separating white Americans from the rest of the country. When President Ronald Reagan selected Sandra Day O’Connor, the first woman to serve on the Court, she brought with her the promise of a truly diverse Supreme Court. However, with such a long, dismal record of adhering to antiquated and conservative political ideologies, should Americans continue to trust Supreme Court Justices with the power to shape decades of government policy through lifetime appointments premised on the false tenet of political neutrality?

Thomas Jefferson once predicted that the Constitution would become a “mere thing of wax in the hands of the judiciary, which they may twist and shape into any form they please.” With advances in medicine, Supreme Court Justices enjoy longer terms in office and the opportunity to mold government policy for three decades or more. In recent rulings, the Supreme Court has continued its support of free-speech principles in campaign finance, but its focus has moved from protecting individual rights to allowing more money to flow into the political process. The media uproar in the wake of McCutcheon v. FEC included a stern accusation from economist Robert Reich that the decision was “the most brazen invitation to oligarchy in Supreme Court history.” He noted that the court overturned “40 years of national policy and 38 years of judicial precedents” to reach its conclusion. Has Jefferson’s prediction come true?

Theodore Roosevelt – A Better Choice for Republican Icon than Ronald Reagan

The Republican Party’s embrace of Ronald Reagan as its political ideal reveals a deep misunderstanding of the party’s rich political history. Contemporary Republicans continue to preach Reagan’s mantra of smaller government, tax cuts for job creators, a laissez-faire approach to industry, and an adherence to traditional family values. Known as “The Great Communicator” for his oratory skill and folksy approach, Reagan continues to garner praise in Republican circles; his insistence on additional budgetary expenditures for the military and the influence of the Strategic Defense Initiative (SDI) are credited with ending the Cold War. He was, at 69, the oldest president yet elected to office, and he survived an assassin’s bullet. These are impressive accomplishments, but in order to measure Reagan’s success, we must examine his personal and political character in both word and deed. Did Reagan live up to the ideals attributed to him, and were they really in line with the best interests of the American people? If he is found lacking, is there a better Republican presidential choice to replace him as the party’s ideal?

As the president who ended slavery, Abraham Lincoln remains the Republican whose legislation benefited the single greatest number of American citizens. Theodore Roosevelt, however, stands a distant second as the Republican president whose policies reflected a genuine interest in uplifting the nation’s people. Unlike Reagan, who was elected president, Roosevelt first became president after William McKinley’s assassination in 1901; in stark contrast to Reagan, Roosevelt remains the youngest president, at age 42, ever to hold the office. Perhaps the most glaring contrast between the two presidents is that, unlike Reagan, Roosevelt was disavowed by his fellow Republicans because of his progressive views. Despite their differences, Roosevelt and Reagan both viewed the federal government as inefficient and in need of reform. Both presidents pressed Congress for funds to improve the nation’s defenses, and they shared a belief that industry performed an important role in relation to government and society. However, the different methods each president used to attain his goals offer an insight into both men’s political and personal character. A comparison between Roosevelt and Reagan reveals a much better candidate for the mantle of the Republican Party’s ideal: Theodore Roosevelt.

Two great conflicts serve as bookends to Theodore Roosevelt’s life: the Civil War and World War I. Born into the social chaos preceding the former’s outbreak, Roosevelt experienced the internecine character of the Civil War, as family members served on both sides of the conflict. The knowledge that his father, Theodore Roosevelt, Sr., had paid three hundred dollars for an exemption and a replacement in the Union Army, however, troubled the young man. For Roosevelt, this small imperfection served as a lesson and in no way diminished his love for a father he described as charitable, gentle, and firm. Indeed, Roosevelt, Sr. practiced a style of “muscular Christian” charity that used his wealth and influence to assist the poor. Meanwhile, Roosevelt’s mother, Martha, had two brothers fighting for the Confederacy, and the stories of their bravery in combat served as a reminder to the boy that his father had avoided fighting in the war. It was a blot on the family record that young Theodore would try to erase by leading his own unit on the battlefield in Cuba.

When Roosevelt landed in Cuba at the onset of the Spanish-American War, he was second in command of the First Volunteer Cavalry, an eclectic mix of volunteer soldiers who came to be known by the moniker “Rough Riders.” At one end of the social spectrum stood cowboy acquaintances Roosevelt had befriended while working as a western cattleman; these men had fought in the Indian Wars and grown accustomed to thriving in hostile environs. At the other end stood upper-crust Yale and Harvard chums, eager to earn honor for their families but willing to cook or clean latrines when called upon. Roosevelt believed everyone, regardless of social standing, was duty-bound to serve his or her country during war. “He despised politicians who talked of war and sent others off to fight.” Offered the command position during the unit’s formation, Roosevelt declined, citing his inexperience as a military commander. Instead, he suggested Leonard Wood lead the Rough Riders, allowing Roosevelt to learn from a battle-hardened veteran.

Wood and Roosevelt proved to be masterful logisticians and leaders: their volunteers received the newest weapons, smokeless cartridges that Roosevelt paid for out of pocket, and climate-appropriate uniforms. Conversely, many other units fought in heavy uniforms using bullets that, when fired, produced a wisp of smoke, revealing the soldier’s position to the enemy. In the hectic scramble to secure boats for troop movements, Roosevelt’s personal connections provided the needed transportation for his troops. Finally, as the battle for San Juan Hill raged, he remained upright on his horse in front of the Rough Riders, spurring them on. A witness to the charge stated, “No one who saw Roosevelt take that ride expected him to finish it alive.” After turning loose his horse, Roosevelt led his men on a charge up the hill, at times outdistancing them by one hundred yards. Cuba became the first domino to fall for the Spanish monarchy, which later also relinquished the Philippines to the United States. With his actions on the battlefield and Spain’s defeat, Roosevelt believed the family name restored. Among his soldiers, Roosevelt was considered a genuine leader. His commanders, however, held a very different view of him, and for good reason: he criticized their leadership.

Roosevelt’s two major complaints involved food provisions and travel. During the deployment, his troops received rancid rations that had been sold to the military by war profiteers. Unwilling to let his men go hungry, Roosevelt again dipped into his personal wealth to purchase supplies. His greatest ire, perhaps, was reserved for the government’s bureaucratic failure to secure passage home for his troops after the end of hostilities. With the wet season approaching and the mosquitoes it would bring, Roosevelt worried about the increased risk of malaria to the soldiers. In an effort to spur the War Department to action, Roosevelt gave a group of journalists, for comment, a letter written on behalf of the expedition’s top officers; the letter, which later appeared in a number of newspapers, stated that the army must “get…out of Cuba before the force was obliterated by fever.” His comments embarrassed the Secretary of War, Russell Alger, and ultimately kept Roosevelt from receiving the Congressional Medal of Honor during his lifetime; Congress granted the medal posthumously in 2001.

Theodore Roosevelt’s quest to restore the family’s honor reveals a great deal about his character both as a leader and as a man. Driven by the memory of his father’s decision not to fight in the Civil War, Roosevelt charted a deliberate course to redeem the slight. Prior to serving in Cuba, he held several important leadership positions, among them Civil Service Commissioner in Washington, D.C., New York City Police Commissioner, and Assistant Secretary of the Navy. Yet when initially offered command of the Rough Riders, Roosevelt displayed an understanding of his limitations as a military leader, and of the harm such hubris might do to his soldiers. Furthermore, by suggesting Wood as the more appropriate choice, he exhibited a penchant for recognizing talented individuals.

Roosevelt’s genuine concern for the well-being of his soldiers, and the actions he undertook on their behalf, contrasted sharply with the military protocols of the day that defined interactions between officers and subordinates. His willingness to “buy for his men beans and canned tomatoes,…ignoring the regulation that canned vegetables were for officers only,” displays Roosevelt’s altruism toward the common man, a quality he learned from observing his father’s philanthropic work. When the government failed in the simplest of its duties, feeding the hungry, it became the duty of those with the means to provide sustenance. Had Roosevelt decided to remain aloof from his men and provide only for himself, he would have faced no criticism from fellow officers: if soldiers went hungry, it was the War Department’s fault, and they would simply have to wait for resupply or scrounge for their own food. That attitude was beyond Roosevelt’s understanding, because he recognized the financial limitations of the average soldier. A soldier, he believed, was simply the common laborer in uniform, and Roosevelt always believed in speaking for the common person.

It was a combination of concern for his troops’ health and Roosevelt’s natural impetuosity that led him to release the letter publicly criticizing the War Department. By virtue of his lowly status, a soldier’s voice was a mere whisper, so Roosevelt spoke for him. He had not safely delivered his men through battle only to watch them perish from a mosquito bite. He spoke out, as he had in the past and would continue to do in the future, for the little person, and it cost him the award he craved most, the Congressional Medal of Honor.

Roosevelt’s path to Cuba began and ended with his father’s decision to purchase a replacement during the Civil War; instead of succumbing to resentment of his father’s perceived cowardice, he accepted this shortcoming and channeled it in a positive direction. Along the way, he employed the empathy learned from his father to care for his soldiers. Roosevelt referred to his father as “the best man I ever knew” and often spoke reverently of him. A comparably close relationship between Ronald Reagan and his father, Jack, is less evident, but no less important.

Much has been written about the life of Ronald “Dutch” Reagan, but most narratives downplay or only briefly discuss the relationship between father and son. Jack was an itinerant shoe salesman who moved the small family numerous times after Dutch’s birth before finally settling in Dixon, Illinois. Reagan’s mother, Nelle, worked as a seamstress, which allowed her time to serve as a source of support for members of the local Disciples of Christ church. Two working parents were a reality for much of society in the 1920s: an often-unstable labor market frequently left one spouse without work, making the family dependent on the other. Such conditions defined working-class life in small towns across America, and as the Great Depression tightened its fist, the nation’s poorest felt the effects first. Long periods of unemployment plagued families as the economic depression deepened. These were the realities of the Reagan home, but with the added burden of Jack’s alcoholism.

Jack Reagan’s condition imprinted itself deeply on an eleven-year-old Ronald; in reminiscing about his father’s drinking, Reagan said he found Jack drunk and asleep in the snow on the steps of the home. “I leaned over to see what was wrong and smelled whiskey. He had found his way home from a speakeasy and had just passed out right there. For a moment or two, I looked down at him and thought about continuing on into the house and going to bed, as if he weren’t there. But I couldn’t do it. When I tried to wake him he just snored – loud enough, I suspected, for the whole neighborhood to hear him.” In this encounter, Reagan acknowledged the shame visited on the family by his father’s alcoholism; indeed, it is the overriding motif of the incident as he remembered it.

The society of Reagan’s youth defined alcoholism as an individual’s moral failure to curb a worldly desire. The event occurred in 1922, when the United States was two years into Prohibition. The main social drivers behind the temperance movement were women and religious institutions, which together forced the passage of the Eighteenth Amendment banning alcohol. As a devout Christian and the family’s moral compass, Nelle fell neatly within both demographics. In keeping with her Christian ethics, she “urged them [Dutch and Moon] to be compassionate and understanding of their father’s struggle.” Nelle’s influence is on display in Dutch’s encounter with his drunken father, as he vacillated between leaving Jack out in the cold and bringing him into the home’s warmth. The deciding factor for Dutch was concern that the neighbors might find his father in such a state.

Reagan’s retelling of the event imparts a sense of antipathy toward his father. The Jack in this encounter is very different from the Jack whom Reagan credited with teaching him “the value of hard work and ambition, and maybe a little something about telling a story.” His father was not only an alcoholic but also, by virtue of visiting a “speakeasy,” a lawbreaker. This behavior contradicted the values taught by the devoutly Christian Nelle, and when presented with the conundrum on the doorstep, a conflicted Ronald grudgingly acted to assist his father. It was the humiliation inflicted by an alcoholic father, combined with the example of the Jack who taught hard work and ambition, that motivated Reagan to aspire to the nation’s highest office.

In response to his unstable living conditions, Reagan became introverted and sought emotional support from Nelle. She encouraged him to participate in local theater productions and praised his performances. The roots of Reagan’s acting abilities stretch back to his early youth, when he “learned to pretend to be an insider despite being an outsider.” Although the family was poor, Dutch projected a middle-class aura to those around him, but in later recollections he said, “Our family didn’t exactly come from the wrong side of the tracks, but we were certainly always within sound of the train whistles.” His long-honed acting talents and storytelling abilities served as the foundation that led Reagan into politics and the governorship of California. Ronald Reagan’s acting career placed him squarely in the public eye, while also allowing him to garner social and political connections, as well as experience in mediation as president of the Screen Actors Guild. After several successful movie appearances, Reagan’s income allowed him to buy Jack and Nelle a home, a first for the couple.

Reagan’s Hollywood career was in decline when he successfully entered politics, becoming California’s governor in 1967. His stint as president of the Screen Actors Guild during the government’s forced restructuring of the film industry politicized a Reagan already seething at the amount he paid in taxes. By the time he entered office, the boy from Dixon no longer needed to pretend to wealth. Having come from such a humble background, in which both parents toiled in a fickle labor market, Reagan now possessed the necessary resources to uplift a substantial portion of California’s indigent population.

Instead, as California governor, Ronald Reagan instituted welfare reform as one of his mandates. Reagan’s proponents point out that there were over 2 million people on California’s welfare rolls, which they insisted was far too many, and an indication of fraud and abuse. Reagan’s state welfare reforms began in 1970, after passage of the Civil Rights Act, and just as programs designed to assist African Americans climb out of an institutionally produced and entrenched poverty began to take hold. To accomplish welfare reduction, the governor redefined the understanding of what it meant to be poor in California by “tightening eligibility rules so that only the truly needy would receive public assistance, and wasteful spending would be eliminated.” His stated intention to get “bums back to work” successfully removed “three hundred thousand within three years” from the welfare rolls, “saving California taxpayers hundreds of millions of dollars,” which Reagan promptly handed back to state landholders in the form of tax cuts.

After becoming United States president, Reagan expanded his welfare reform federally, again focusing the narrative on corruption and abuse within the system. For example, he spoke of the “welfare queen” as if the system were rife with malfeasance, when in reality such cases were anomalies. Reagan’s vision of the poor echoed back to his role model, Jack, whose battle with alcohol kept him from steady employment. He believed welfare recipients had no incentive to work because of the social safety nets provided by previous presidential administrations; instead of working, public assistance recipients were characterized as wasting the day away drinking or doing drugs, “because that was the life they preferred.” This view ignored the financial barriers to education and training faced by both the rural and urban poor.

The jobs traditionally within reach for the undereducated were in manufacturing, which was already in decline when Reagan took office. In an effort to save labor costs, a large segment of American manufacturers relocated operations overseas, leaving U.S. laborers the task of adjusting to a service-industry labor market. As his social policies took effect, the disadvantaged turned to an underground economy of prostitution, drug sales, gambling, and weapons trafficking to fill the vacuum left in federal welfare’s wake. The result was an explosion of criminal convictions that taxed an overcrowded U.S. prison system, which today houses the largest inmate population on the planet. Instead of funding education programs, drug-addiction counseling, and cultural centers, Reagan’s administration offered federal funding to states to militarize existing police forces.

When presented with the opportunity to uplift a substantial portion of American society, Ronald Reagan responded in the same manner as he had to a drunken Jack: grudgingly, and with doubts about the recipients’ professed inability to work. Nelle’s religiosity moved Dutch to give minimal assistance to an inebriated Jack, a man whose moral failings Reagan believed kept him from steady employment, and this attitude prevailed in his decision regarding welfare reduction. He believed religious institutions rather than government better served the poor, claiming that if they combined resources and cared for the needs of just ten indigent families, “we could eliminate all government welfare in this country.” Reagan conflated financial success with moral integrity, while contending, as in childhood, that poverty was reserved for the lazy or vice stricken.

Both Ronald Reagan and Theodore Roosevelt considered their fathers’ failings surmountable obstacles, but each man approached the resolution differently. Roosevelt organized and led a military unit in the Spanish-American War to counterbalance his father’s perceived cowardice. Reagan, influenced by Jack’s imagined moral failings, redefined poverty’s rubric, paring down welfare rolls and defunding education programs. Roosevelt’s actions aided the U.S. in the war against Spain, which expanded America’s economic and military reach. Reagan, too, broadened the nation’s military strength by increasing defense spending, but at the expense of domestic social programs. In a contemporary American society that values the proactive solution over the reactive, Roosevelt presents the better choice.

A large majority of contemporary Republicans self-identify as Reagan Republicans, citing his party loyalty, ideology, and policies as the exemplary political identity. Reagan’s often-cited Eleventh Commandment, Thou Shalt Not Speak Ill of Any Fellow Republican, remains a party standard. His conservative ideology, rooted in small-town family values, still resonates with the party faithful. His conviction that government impinged on individual liberties and hindered economic growth persists in contemporary Republican policy development. However, Reagan’s political identity serves as a straw man for the realities of his political career: he championed this ideology, but in practice found it very flexible.

Ronald Reagan’s political roots were planted in Franklin D. Roosevelt’s Democratic Party; when he voted for FDR in the 1932 election, Reagan agreed that under certain conditions, government’s duty to the people required its taking action. FDR’s domestic policies benefited the Reagan household and millions like it across the nation. However, as Reagan’s success as an actor and president of SAG placed him in an increasingly higher tax bracket, his opinion on government’s broad influence gradually shifted toward the Republican Party. Reagan quit the Democratic Party in 1962, saying, “I was a Democrat when the Democratic Party stood for state rights, local autonomy, economy in government, and individual freedom. Today it is the party that has changed, openly declaring for centralized federal power and government-sponsored redistribution of the individual’s earnings.” Reagan’s proponents admit that he wanted to quit the party in 1960, but then-presidential candidate Richard Nixon “asked him not to do so: it would be better, Nixon said, if Reagan endorsed him and campaigned for him as a Democrat.” For two additional years, Reagan retained his Democratic Party credentials while operating as a Republican surrogate; nor was this the last time he ignored party loyalty.

Another tenet both parties adhere to is discouraging intraparty challenges to an incumbent president. In 1976, Reagan challenged Gerald Ford for the Republican nomination, claiming that “Ford’s foreign policy failures” required that he act. The Republican nominee remained undecided until the convention, fragmenting a party that finally coalesced around Ford. By challenging Ford, Reagan undermined the incumbent’s credibility and cost his party the highest federal office in the nation; as a result, Democrat Jimmy Carter unseated Ford as president. Based on Reagan’s record as a Republican surrogate, and his later challenge to an incumbent Republican president, his party loyalty was meager at best.

Conservatives continue to adhere to Ronald Reagan’s family values, which he credited to Dixon, Illinois, and its small-town character. Dixon provided the template through which he envisioned the nation’s future; within this template, the nuclear family consisted of a happily married, Christian man and wife. Abortion and contraceptive use were discouraged, and society expected parents to impart wisdom, love, and discipline to the resulting children. While frowned upon, divorce was recognized by conservatives as a necessary evil that could be overlooked. Within the family unit, the father acted as breadwinner, with the mother supplementing the family income when necessary; otherwise, she managed domestic affairs within the household.

However, Reagan’s childhood lacked many of these elements, an absence that manifested later in the Reagan family dynamics. His defenders point to the societal norms of the era to explain his relationship with his first wife, Jane Wyman. Married in 1940, Reagan told Wyman, “We’ll lead an ideal life if you’ll just avoid doing one thing: Don’t think.” Perhaps youth and a still-developing attitude toward women influenced Reagan’s words, but Wyman obviously disagreed, divorcing Reagan and retaining custody of their two children in 1948. His second marriage, to Nancy Davis, lasted until his death, but the couple’s scattering of children across two marriages resulted in a lack of “the warmth, stability, and values the Reagans publicly embodied,” because “the bond between Ronnie and Nancy became so intense it eclipsed all others, including their various offspring.”

Reagan’s connection to Dixon, Illinois, is also tenuous at best; he rarely visited his hometown, and most of those occasions were used to promote a film or an election campaign. Between 1940 and 1970, he returned home on a mere eight occasions, and he refused to lend his name or influence to various project requests by Dixon officials because doing so might appear to be cronyism. Instead, he spent much of his life in the California enclaves of Bel Air and Santa Barbara. Indeed, Reagan’s appreciation for Dixon and its family values appears illusory.

In regard to smaller-government policy and promises not to raise taxes, Reagan’s record suggests a pragmatic nature during both his gubernatorial and presidential tenures. He said, “…If I found when I was governor that I could not get 100 percent of what I asked for, I took 80 percent.” He believed the best means to limit government’s size and reach was to reduce departmental budgets, forcing departments to operate at greater efficiency with reduced costs. To accomplish this, Reagan appointed like-minded individuals as department heads, such as David Stockman and James Watt, who trimmed away large portions of their budgets.

As head of the Office of Management and Budget, “… [David] Stockman would turn out to be a disastrous appointment – he became the Robert McNamara of the Reagan administration – from which supply-side economics never fully recovered.” The OMB served as a presidential instrument to control budgetary spending, but when Stockman recommended deep cuts to social programs, the public backlash was immediate. Described by pundits as a “technocrat with the soul of a calculator,” he believed that the public “was not entitled to any services,” a view that hurt Reagan’s standing in the polls and forced the president to announce that some basic social programs were “deemed untouchable.” Stockman further damaged Reagan’s economic policies by candidly stating in a 1981 interview published in Atlantic Monthly, “It’s kind of hard to sell ‘trickle down,’ so the supply-side formula was the only way to get a tax policy that was really ‘trickle down.’ Supply-side is ‘trickle down’ theory.” His implication that the administration’s economic plan was a “crackpot theory” undercut Reagan’s insistence that cutting taxes for the nation’s top earners encouraged greater investment opportunities while creating more jobs for average Americans. Stockman remained with the administration until 1984, but neither his influence nor Reagan’s economic plan ever fully recovered.

James Watt’s appointment as Interior Secretary serves as another example of Reagan’s proxy attempt to reduce environmental regulations that he believed restricted economic growth. “Reagan and many conservatives loved the outdoors…But they feared the environmental movement as anticapitalist, big-government oriented, and too hostile to what they and their corporate supporters defined as progress.” While Watt is remembered for his attempt to ban the Beach Boys from performing on the Mall because their music attracted an “undesirable element,” environmentalists attacked the conflict of interest between Watt’s oversight of federal natural resources and his position as a lawyer “funded to a large degree by mining, timber, and energy companies.” Watt shared Reagan’s belief that allowing industrial access to federal resources benefited both the private sector, by providing raw materials, and the federal coffer, by supplying a revenue stream. Watt ignored complaints made to the Environmental Protection Agency about industrial abuses, and “outraged environmentalists by selling off public lands, resisting efforts to declare species endangered, and encouraging more mining, drilling, and developing, while denouncing liberals as socialists.” Characterized as a plainspoken, “ingenuous evangelical Christian,” Watt often proved distracting for the White House staff with his public statements; he once described a list of appointees by saying, “I have a black, I have a woman, two Jews, and a cripple.” Reagan, Stockman, and Watt shared the same political ideals; however, the American public found this ideology unpalatable, a sentiment not lost on legislators, who voted down Reagan’s policies.

If the nature of a presidency is an indicator of political character, the seven scandals that plagued the Reagan administration hang heavily over his political legacy. Two transgressions that politically and personally damaged the president’s image were the Iran-Contra scandal and the Department of Housing and Urban Development grant-rigging investigation. His involvement in the Iran-Contra affair provides an example of the flexible nature of Ronald Reagan’s political character, while the HUD grant-rigging scandal serves as a reminder of the negative impact that influence-peddling former administration insiders had on the lives of the poor.

In its most condensed form, the Iran-Contra affair involved the administration’s sale of U.S. military equipment to Iran in order to free American hostages, and the use of the resulting funds to support the Contra rebels fighting against a communist-leaning Nicaraguan government. Reagan approved the arms-for-hostages swap even though long-standing federal policies forbade any arms sales to the hostile Iranian government. Further, the administration violated congressional legislation expressly forbidding any type of U.S. assistance to the Contra rebels. As news of the scandal spread, revealing the extent of the president’s involvement, a majority of Americans began questioning his character; his image suffered further damage when the public discovered the Iranians had reneged on releasing the hostages. Reagan’s supporters insist that his empathy toward the hostages moved him to approve the operation, while also citing his win-at-any-cost attitude about fighting the Cold War. Reagan believed that direct intervention in Nicaragua to offset communism in Latin America necessitated ignoring Congress and breaking the law. Under questioning, Reagan often claimed he was unaware of the operation’s details and blamed a poor memory for not recalling them. “Ultimately, Reagan avoided impeachment by claiming he was out of touch and possibly incompetent rather than responsible and thus guilty.” Reagan’s culpability in the scandal remains intensely polarizing among both parties, but competence aside, his actions demonstrate contempt for the rule of law by a president with a slippery political character.

On the domestic front, the Department of Housing and Urban Development endured a grant-rigging scandal in the 1980s that most harmed the very people its mandate purported to assist: America’s homeless and poor. The root cause of the department’s misconduct resides in Ronald Reagan’s small-government ideology, which encouraged department heads to cut costs by reducing personnel. As Secretary of Housing and Urban Development, Samuel Pierce epitomized the loyal, budget-cutting Reaganite when he reduced HUD’s budget by more than fifty percent. Taking advantage of the disarray in the wake of personnel cuts, the administration used HUD as an engine to drive cronyism and wealth accumulation for supporters and former members of the administration. Positions in Pierce’s department were handed out as political rewards, and its appointees were often ill suited for their jobs. One such political appointee, Deborah Gore Dean, operated as an assistant to Pierce and would “sit down at a table with a clipboard listing the projects she wanted funded…[and] the employees would pore through…binders to see whether the states that were hosting had even submitted applications for funding.” Ignoring long-standing federal regulations designed to protect the government from malfeasance, Dean “circumvent[ed] the agency’s policies to benefit favored developers.” Instead of creating responsible government, Reagan-influenced policies fostered an environment of corruption that allowed his supporters to profit off funds meant for the nation’s poorest individuals.

Evaluating the merits of Ronald Reagan’s political identity serves as a Rorschach test for Republican Party membership. His decision to identify as a Democrat but act as a Republican offers staunch party members proof of Reagan’s loyalty, while Democrats view his actions as disloyal. Republicans describe his failed challenge to Ford’s presidency as justified, even though it cost the party the presidency. The family values he preached remained just beyond Reagan’s own reach, as he remains the only president to have been divorced; his connection to Dixon, Illinois, while often invoked, ended when he left town for Hollywood as a young man. Finally, Reagan’s small-government policies failed to produce responsible public assistance, instead breeding an atmosphere of corruption and scandal for which the American taxpayer suffered. As the Republican Party’s political ideal, Ronald Reagan falls short of the very identity to which the party adheres.

The political identity of Theodore Roosevelt provides the better choice for the Republican Party’s standard-bearer. He entered politics as a Republican and remained in the party until 1912, when he joined the Progressive Party. Roosevelt described party loyalty thus: “A man cannot act both without and within the party; he can do either, but he cannot possibly do both.” Understanding that reform must come from within, he studied the local party machine, found rampant cronyism and graft between government and business, and decided to be the “person who leads to clean up government through civil service reform and legislation.” This narrative resonated with the voters in his New York district, who elected Roosevelt to the state legislature. As with Reagan, Roosevelt believed the government required much-needed reform, but he blamed a system that unfairly benefited politicians and wealthy private interests rather than protecting the nation’s working class. The government operated on a spoils system that allowed politicians to dole out jobs based on political or financial support.

To understand why civil service reform resonated with the public requires a knowledge of just how deeply ingrained graft and cronyism were in politics. Whenever a shift in power between the two parties occurred, the massive number of government positions held by the losing party became fodder for the incoming party to dole out as it wished. Perhaps the most infamous example of political cronyism remains the Democrat-controlled Tammany machine of New York City; by controlling access to city government positions, Tammany politicians controlled the flow and direction of labor and city contracts, and decided who would benefit from the financial largesse. As a politician’s power grew, so too did the number of patronage positions available to him, virtually guaranteeing future electoral success. The public chafed at a spoils system that benefited only a select few and remained beyond the reach of average Americans.

For Roosevelt, civil service reform was a campaign promise made in earnest. As a state legislator, he authored a bill that reorganized 10 percent of the state’s jobs under civil service rules. When Democrats shelved the bill, progressive Democratic Governor Grover Cleveland asked Roosevelt to reintroduce it, promising to support the bill’s passage by a majority-Democrat legislature. His designs were “to take out of politics the vast band of hired mercenaries whose very existence depends on their success, and who can almost always in the end overcome the efforts of them whose only care is to secure a pure and honest government.” The public supported his reforms because they conformed to basic American tenets, opening new avenues of federal employment to all Americans, regardless of political affiliation, who tested well enough to earn the job. Roosevelt’s actions gained the attention of President Benjamin Harrison, who appointed him to the United States Civil Service Commission, a position he retained into the Grover Cleveland administration.

Roosevelt was seeking new challenges when he accepted a position offered by Mayor William Strong on the New York City police commission. The police department had suffered recent allegations of corruption, and Roosevelt’s reputation as a virtuous reformer established him as the ideal candidate to restore public faith in the institution. At the end of the 19th century, New York City served as the nation’s financial capital and an important port of entry for both people and goods. As the city’s population expanded, so too did the opportunity for vices such as drinking, gambling, and prostitution. Approximately 8,000 saloons operated within the city, some open seven days a week even though laws prohibited alcohol sales on Sunday. On a steady basis, police officers accepted bribes from brokers to ignore the rampant vice occurring on their beats or within the precinct. Corruption manifested at all levels, with one precinct captain charging brothels and saloons an initial $500 fee, plus an additional $50 monthly, for protection from harassment. Perhaps more disturbing, patrolmen often drank on the job or slept instead of walking a beat, and many New Yorkers complained about the rough handling and language used by officers.

Upon assuming his position, Roosevelt instituted a number of measures designed to improve the department and weed out the most corrupt members of the force. He announced that charges brought against officers would be thoroughly investigated and, if substantiated, the officers would be fired and prosecuted; this prompted a rash of retirements by officers worried about losing their pensions because of malfeasance. The commission placed officer hiring under civil service standards that required new recruits to pass an examination testing knowledge as well as physical and mental health; this testing ensured that new hires merited the position. The commission issued promotions only after investigating the officer’s record, as the former protocol involved simply buying advancements. Finally, he met with Police Chief Thomas Byrnes and informed him that officers were no longer to turn a blind eye toward any law; Roosevelt warned Byrnes that he would occasionally walk the city, day or night, and that any officer misconduct he discovered was subject to discipline and firing. Roosevelt reported to the public, “We are bound to make all honest, brave and efficient members of the force who are bold in their dealing with the criminals and courteous in their dealings with the ordinary citizens, understand that we are their friends.”

True to his word, Roosevelt went on nighttime jaunts with Jacob Riis as a guide and discovered policemen drinking, sleeping, and meeting with prostitutes; these forays into the city served another purpose by introducing Roosevelt to the brutal effects of poverty. He blamed alcohol for the evils he witnessed, and decided the police department would strictly enforce the Excise Law banning Sunday liquor and beer sales. Roosevelt viewed law enforcement in terms of black and white, fair or unfair, just or unjust, leaving himself very little room to maneuver on unpopular positions. Enforcing the Excise Law required that saloons close on Sunday, and it was his job to assure compliance. This was a very unpopular stance with the city’s working poor, who labored six days a week, leaving Sunday as their only day to imbibe. He further limited the number of saloons by instituting a hefty increase in saloon registration fees and requiring that establishments meet city safety-code standards. Unable to endure a dry Sunday, New Yorkers ventured out in search of a saloon willing to defy the law, and many saloon owners, believing Roosevelt’s rhetoric was mere lip service and a ploy to appear the reformer, opened their establishments only to be shut down by the police and heavily fined by a judge. In an effort to circumvent the law, saloon owners devised clever methods to conceal alcohol sales, many of which were resurrected and made infamous during the Prohibition Era. Throughout New York City, the realization spread that this police commission, unlike the others before it, could not be bought.

While Roosevelt’s popularity increased nationwide, he was a political pariah in New York at both the city and state level. The unpopularity of Roosevelt’s virtuous crusade to clean up the police force and uphold the law caused many Republican-leaning New Yorkers to vote Tammany Democrats back into office. Politicians in both parties feared being implicated in the police commission’s investigations, and responded by drafting legislation to dissolve it. The combative relationship among the commission members further impeded its progress. State Republican Party boss Thomas Platt refused to support Roosevelt’s cause, and expressed outrage when, during the state convention, legislators clamored for, and eventually added, enforcement of the Excise Law to the party’s platform. Seeking to harness his energy, party leaders sent Roosevelt on a speaking tour across the U.S. in support of William McKinley’s 1896 presidential bid against William Jennings Bryan. After reaching the White House, McKinley appointed Roosevelt Assistant Secretary of the Navy, an assignment that also provided him with an honorable escape from the police commission.

Undertaking civil service reform required a tremendous amount of political courage because it undercut political power by removing patronage positions from legislators. Had Roosevelt not correctly read public sentiment, the issue might have been a career killer. The nation’s rural population, however, responded positively to his rhetoric, and while the Republicans lost New York to the Democrats, the party prevailed in sending William McKinley to the White House. The party that even then enjoyed a close relationship with industry and finance now signaled its intention to support issues important to the working class.

While Roosevelt’s campaigns for civil service reform are highlights of his political identity, his tenure as Assistant Secretary of the Navy displays his ability to restructure the U.S. Navy into an internationally recognized and feared sea force. This was no easy task, as United States foreign policy strictly adhered to isolationism, and any attempt to expand naval power smacked of warmongering. His argument that naval sea power trumped a standing army ran counter to the American war experience. Roosevelt, however, recognized the threat posed by a newly militarized and aggressive Japan and reasoned that naval bases in the Philippines, Hawaii, and Cuba would provide a needed buffer for the nation’s coastal defenses. Saddled with a navy scoffed at by the international community, he established a training regimen to shore up and professionalize it. His request for “six new battleships, six large cruisers, and seventy-five torpedo boats” fell on deaf ears in Congress, which responded by approving one new battleship and a small number of torpedo boats.

In keeping with his commitment to merit-based promotion, Roosevelt extended “equal rank to the newly skilled engineers and to line officers, as electricity was put into renovated ships.” Perhaps Roosevelt’s most fortuitous promotion was that of George Dewey to admiral. Dewey proved his worth by destroying the Spanish fleet at the Battle of Manila Bay while suffering only a single American casualty. Admiral Dewey’s success reassured an American public worried about the country’s declaration of war against Spain in 1898; it also surprised British naval officers whose attitude before the U.S. Navy’s departure for action, Dewey said, “…was to the effect: ‘A fine set of fellows, but unhappily we shall never see them again.'”

As president, Roosevelt continued to develop a strong navy by authorizing the construction of 6 new dreadnought ships, 16 battleships, 6 cruisers, 12 submarines, and 16 destroyers. By the time he left office, the U.S. Navy ranked second in the world, surpassed only by Britain’s. To display the U.S. Navy’s newly developed might, Roosevelt deployed it in 1907 on an international tour, ostensibly to test the Great White Fleet, so known because of the ships’ white-painted hulls, and its crews’ performance during extended missions far from American shores. However, the fleet also impressed upon Japan the extended reach of the United States and its ability to protect the nation’s assets in the distant Philippines.

Although Roosevelt exerted a great deal of energy on work, he also enjoyed a stable and happy family life. As with Ronald Reagan, Roosevelt married twice; however, his first wife, Alice, died shortly after giving birth. He named the baby girl Alice and gave her to his sister Bamie to foster; when he remarried, to Edith Carow, Alice rejoined the family at Edith’s insistence. Stories abound of the many antics on which Roosevelt led the “bunnies,” his affectionate term for the children, during play; he told them ghost stories, joined in pillow fights, and led them on long outdoor hikes. Close family friend Gifford Pinchot reported watching the children slide down a rope extending from the home’s second-floor window, with Roosevelt “whooping and hollering to highlight the drama.” His concern for children extended beyond his immediate family, as he championed regulations that restricted industry’s abusive child labor practices. Roosevelt described the importance of family in a letter to his son, Ted, Jr.: “Home, wife, children – they are what really count in life. I have enjoyed many things; …but all of them put together are not for one moment to be weighed in the balance when compared with the joy I have known with your mother and all of you.”

Much like his family life, Roosevelt’s administration enjoyed a relatively peaceful reputation, enduring only a single scandal. The Brownsville Affair centered on accusations that in 1906 “Buffalo Soldiers” of the all-black 25th Infantry Regiment shot and killed a white bartender in Brownsville, Texas. During the investigation by George H. Burton, the U.S. Army Inspector General, soldiers were questioned about the incident but refused to implicate their fellow service members; based on Burton’s recommendation, Roosevelt ordered 167 men “dishonorably discharged from the ranks of the storied Buffalo Soldiers.” Ignoring the national outcry, he stood firm on a decision that denied the soldiers any opportunity to receive a pension or future consideration for federal employment. Roosevelt’s actions remain a particularly perfidious stain on his record, as later investigations proved the soldiers’ innocence; the Nixon administration eventually reversed Roosevelt’s decision.

In relation to the contemporary Republican doctrine of efficient government, family values, and party loyalty, Roosevelt’s political identity conforms more closely than Reagan’s. He never vacillated between parties, as Reagan had, and he proactively labored to change government through civil service reform. Reagan’s reform method of financially constricting federal departments and social programs unpopular within the Republican Party further crippled government and punished the nation’s working poor. Roosevelt’s revisions expanded federal employment to all Americans, while also ensuring applicants merited the positions to which they applied. Finally, his home life exemplified the family values so often, and so guilefully, attributed to Ronald Reagan.

A comparison between the two men demands a discussion of their relationships to labor, industry, and finance. Today, U.S. citizens live under a governmental system that operates closely with the elements that serve as the engine of American capitalism. In its purest form, the capitalist system requires both winners and losers to operate efficiently; for every gain in the market, somewhere in the system a loss occurs. This close relationship between industry and government exists throughout the nation’s history; however, the Founding Fathers’ vision did not include placing industry’s interests before those of the population. As the nation’s economic system evolved from mercantilism to raw capitalism, the relative importance of industry to government eclipsed that of the population. The president’s role as first citizen is to define and enforce the boundary between government and industry. Ronald Reagan and Theodore Roosevelt applied very different definitions to the boundary between these competing elements, and in defining this boundary, each indicated whether government functioned, as envisioned by Abraham Lincoln, “of the people, by the people, for the people.”

Much of Roosevelt’s young adulthood spanned the Gilded Age, and his habit of dressing like a “dandy” is only one example of the period’s influence on him. A member of the nation’s financial elite, he benefited from the flow of resources that great wealth provided, including a Harvard education, comfortable and secure homes, a steady diet, and access to healthcare. The last was particularly important to a severely asthmatic Roosevelt, who, had he been born poor, probably would not have reached adulthood. He rubbed elbows with some of America’s most influential industry leaders, and understood the insider role capital played as a lubricant in the political process.

Nurtured within a pro-business environment, Roosevelt initially held views on labor that matched those of industry; he supported laissez-faire capitalism, which promoted the primacy of business over employee working conditions and pay, including the owner’s right to deny employment based on age, race, sex, religion, and political affiliation. His evolution from pro-business to pro-labor mirrored that of the Gilded Age population. Insulated by their social positions, middle-class and wealthy Americans learned about poor labor conditions only when trouble erupted. Conflict between industry and labor most often occurred when unions moved to organize workers. Industrialists employed a number of methods to combat unionization, including the intimidation or firing of union workers, and the replacement of laborers with immigrants and African Americans, who worked for much lower wages. These measures succeeded in the short term; however, as industrialism created great wealth for the nation, the deterioration in social and working conditions offset the gains for the poor, who increasingly found solidarity and support in unions.

While the nation remained largely rural, the effects of the booming industrial revolution caused urban populations to swell faster than cities’ institutional structures could keep up. This meant inner-city areas lacked garbage collection, running water, indoor toilets, bathing facilities, and fresh air. Because most factory workers lived within walking distance of the job, they journeyed from one extreme to another; most factories were dimly lit, intemperate, and dusty workspaces. Industrialists expected their employees to work a ten-hour day and a six-day workweek. If a laborer suffered a debilitating or fatal workplace injury, no compensation or justice was forthcoming; instead, a replacement stepped in and production continued until the next accident occurred. Labor conditions in rural mining towns were often not much better. Company towns sprang up near remote mining sites and forced workers to “buy their food and rent their miserable shacks from the companies.” Farmers, too, felt the heavy hand of industrialists in the form of inflated shipping rates charged to move produce. In many instances, farmers unable to pay railroad fees simply allowed crops to rot in the fields. Urbanites often suffered from the collusion between city government and railroad executives, as the scarcity of competition allowed rail operators to neglect repairs to equipment, leaving customers to pay a heavy price as unsafe railcars placed lives in danger of injury or death. Trains often operated sporadically, and the public considered the time schedules untrustworthy.

The shift in Roosevelt’s attitude toward industry began when, as a New York state legislator, he investigated the close and improper relationship between Jay Gould, State Attorney General Hamilton Ward, and State Supreme Court Justice T.R. Westbrook during Gould’s 1881 acquisition of the Manhattan Elevated Railroad. A year prior, Ward had sued Manhattan Elevated as insolvent, a decision that Judge Westbrook approved. Ward then placed two Gould employees as receivers, and after Manhattan Elevated stock plunged by 95 percent, Westbrook declared the company newly solvent and gave it over to Gould. In a letter to Gould, Westbrook stated, “I am willing to go the very verge of judicial discretion to protect your vast interests.” When his uncle and mentor, James A. Roosevelt, learned of Theodore’s intent to impeach Westbrook and bring criminal charges against Ward, he pressed his nephew “to leave politics and identify…with the right kind of people.” Stunned by his uncle’s words, Roosevelt later said, “It was the first glimpse I had of that combination between business and politics which I was in after years so often to oppose.”
It was bitter medicine for Roosevelt when, after he presented exhaustive evidence to support his accusations, the Judiciary Committee reported finding no wrongdoing, and the state legislature voted 77-35 to accept its findings.

He angrily responded to Gould’s rough handling of the law by backing a bill that lowered fares on the Manhattan Elevated from ten cents to five; Roosevelt reasoned that Gould profited unfairly at the public’s expense, while hiding much of the profits to avoid paying state taxes. The Five Cent Bill passed the legislature with massive public support behind it; however, Governor Cleveland vetoed the bill. Roosevelt later apologized for supporting the measure, but continued to express displeasure with industrialists of Gould’s ilk, stating, “They are common thieves…they belong to that most dangerous of all classes, the wealthy criminal class.”

Roosevelt’s attitude toward industry continued to shift during his tenure as a police commissioner. As noted previously, he conducted frequent strolls around New York City to monitor police behavior, but when accompanied by social reformer Jacob Riis, the planned route usually included examples of industry’s detrimental impact at the community level. Prior to his police commission job, Roosevelt operated in that rarefied air reserved for those of wealth and power, which limited his contact with abject poverty to isolated incidents. Now he confronted the conditions repeatedly, finding entire families of more than five members occupying single-room tenements. These apartments contained no running water, and the stench of rotting refuse pervaded the buildings because tenants tossed garbage from their windows to the street or courtyard below, where it went uncollected by the city. In response, he “closed down hundreds of tenements for incredible violations of the city’s health regulations.” Because of his limited powers, health code violations were the only means available for him to assist the poor; if he could not help them directly, he would punish the property owners who benefited from these atrocious conditions. With the power of the governorship, Roosevelt’s hardened attitude toward industry became even more sharply defined, as he decided to hold industry accountable for paying its fair share.

During his political career, Roosevelt observed the exponential growth of industrial power and its consolidation in the hands of a small cadre of individuals who controlled vast trusts. Using vertical integration to control pricing, production, and availability empowered corporate leaders to hinder competition, in violation of the Sherman Antitrust Act. These monopolies created massive fortunes for a small number of industrialists, providing them with unprecedented power to influence politics. Roosevelt was bothered that “neither the parties nor the public had any realization…or any adequate understanding of the dangers of the ‘invisible empire’ which throve by what was done in secret.”
To weaken the relationship between business and politics in New York, Roosevelt threw his political weight behind the Ford Bill, an 1899 state reform measure designed to tax franchises operating on state-issued grants. He stated, “A corporation which derives its power from the State, should pay the State a just percentage of its earnings as a return for the privileges it enjoys.”

Fighting back, industrialists threatened to withhold campaign donations and general support from Roosevelt in future political campaigns if he continued with the legislation. State Republican Party leader Platt warned Roosevelt that he would do everything in his power to delay or kill the Ford Bill, so Roosevelt publicly stated his intention to compromise. If the Platt-led legislature produced a measure that met the governor’s standards, he would sign it; however, if he found the counter-legislation inadequate, Roosevelt intended to sign the original Ford Bill instead. The press and public sided with him, and Platt relented, allowing the bill’s passage. Roosevelt’s support of the Ford Bill speaks to his hardening attitude toward industry’s unfair advantage over government and the people. Faced with political ostracism, he pressed forward, recovering $11.5 million in taxes, which the state spent on services for residents. The episode also serves as a benchmark in Roosevelt’s understanding of the boundary between industry and government, a boundary that broadened even further during his presidency.

Roosevelt considered himself an “accidental” president because, as vice president, he attained the office after the assassination of William McKinley in 1901 by Leon Czolgosz. Unable to control Governor Roosevelt, Boss Platt sent him to what most believed was a political wasteland by cutting a deal with National Republican Party leader Mark Hanna to place Roosevelt on the ticket as McKinley’s vice president. Platt’s machinations pushed an unwilling Roosevelt into a position that most politicians considered powerless. His addition boosted McKinley’s status, as Roosevelt was already a popular figure nationally because of his well-publicized reform battles against industry and corrupt politicians. Hanna understood the implications of a reformer as smart and energetic as Roosevelt occupying the nation’s highest office; when first approached with the plan by Platt, Hanna protested, “Don’t any of you realize that there is only one life between this madman and the White House?” Now, as the nation’s first citizen and public steward, Roosevelt possessed the power to correct the disparities between industrial political power and the government, but the nature of his ascension meant he operated without a mandate. However, he considered a mandate unnecessary if federal laws were being broken, as was the case with the Northern Securities Company.

Owned by wealthy financier J.P. Morgan, the Northern Securities Company satisfied the criteria set forth by the Sherman Antitrust Act as a monopoly because, in 1901, its railroads controlled travel to the West. Unlike previous presidents, Roosevelt gave industrial leaders no advance warning when he decided to act. The litigation enraged Morgan, who complained that if one of his holdings was violating federal law, the two men could have collaborated on a solution. The hubris contained in Morgan’s response struck a chord with the president, the press, and the public by implying that corporate political power allowed corporations to negotiate outside the law. Roosevelt roundly rejected this attitude and worked diligently to dispel it by forcing the breakup of Northern Securities.

The conflict between Morgan and Roosevelt clearly defines Roosevelt’s understanding of the proper relationship between government and industry. By pursuing the corporation under the Sherman Act, he signaled to industry leaders that government functioned not on a subservient or level plane with corporations, but as the dominant party in the relationship. Railroad tycoon E.H. Harriman warned Roosevelt that defeat and political ruin awaited him “because whenever it was necessary he could buy a sufficient number of senators and congressmen or state legislators to protect his interests, and when necessary, he could buy the judiciary.” As president, Roosevelt intended to protect the public’s interests, and if corporations obeyed regulations, their interests, too, would be protected. The Roosevelt administration continued to push legislation that protected the public, passing the 1906 Food and Drug Act, which forbade selling tainted food or fraudulent drugs, and the Meat Inspection Act, which required federal regulators to monitor sanitary conditions in meat processing plants. The Hepburn Act of 1906 further regulated railroad fares and shipping rates by extending the power of the obscure Interstate Commerce Commission to set rates and ensure a competitive system.

Roosevelt’s grandest act as the people’s steward occurred in the arena of conservation. He had a deep and abiding love for the American wilderness, and believed its natural resources deserved protection from industrial overuse. An avid outdoorsman, Roosevelt owned a cattle ranch in the Dakota Territory and built his log ranch house by hand. He observed the decline and extinction of displaced animal species as the logging, mining, and railroad industries carved up large sections of the wilderness in the rush for resources to supply American manufacturing. The unregulated harvesting of resources left a wake of environmental destruction that polluted rivers and lakes with the toxic chemicals used by mining to extract minerals, and denuded vast tracts of forest to supply rail ties for the railroads’ westward expansion. If future generations were to have the opportunity to revel in America’s wild lands, judicious government oversight was required. To accomplish this, Roosevelt battled politically with industry-funded Senate leaders whose understanding of land use was very different from his own.

Federal land management in the early 1900s maintained a friendly relationship with railroad corporations by giving land allotments to incentivize westward expansion. Following the discovery of valuable ore resources, the government sold mining interests land allotments at greatly reduced rates. In many instances, both railroad and mining corporations resold these allotments for private development, reaping a healthy profit. When these industries did not own the land, as was often the case with lumber corporations, no authority existed in those areas to challenge their claims to the resources. The overwhelming attitude among industrialists regarding federal resources was one of entitlement, because their efforts were responsible for bringing those resources within reach; in essence, the wilderness was a vast storehouse, theirs for the taking.

With these two competing views at war, in 1902 Roosevelt began aggressively placing large tracts of land under federal protection, declaring many of them natural wonders deserving of preservation for future generations of Americans to enjoy. To safeguard culturally historic sites such as Chaco Canyon and Devils Tower, he signed the Antiquities Act, a law that allowed the president, at his sole discretion, to designate an area as culturally important to American heritage. By the time Roosevelt stood for re-election in 1904, he was responsible for creating “three national parks, twenty-nine national forests, and two federal bird reservations.” After winning re-election, Roosevelt believed he possessed the necessary mandate from the people to push ahead with his conservation agenda. In 1905, Roosevelt established the United States Forest Service and placed Yale-trained forester Gifford Pinchot in charge of the fledgling agency, with domain over 60 million acres of land. Roosevelt argued for the agency’s necessity, saying, “…the interests of the people as a whole are, I repeat, safe in the hands of the Forest Service. By keeping the public forests in the public hands our forest policy substitutes the good of the whole people for the profits of the privileged few. With that result none will quarrel except the men who are losing the chance of personal profit at the public expense.” Roosevelt further endeared himself to the public by allowing homesteaders 160 acres on federal lands, provided the parcel was occupied and improved upon. The government tasked forest rangers with enforcing the law, which earned them the wrath of logging interests that paid individuals to act as surrogates for the industry. When discovered, the owner of a false claim faced aggressive prosecution, with penalties that included lengthy prison sentences.

Roosevelt’s protective, and some would say paternal, attitude toward the people included their safety and wellbeing in the workplace, and their right to demonstrate peacefully over unfair working conditions. In deciding whether to interfere in a labor dispute, the tipping point for Roosevelt appears to have been the moment the threat of violence seemed imminent. Unfortunately, deplorable working conditions and a three-tiered wage system that benefitted whites first, followed by immigrants, with blacks receiving the lowest pay, kept tensions high between labor and management. Such was the case in the 1902 anthracite coal strike, in which one hundred forty thousand miners, the majority of them immigrants, stopped production in a bid to gain a 10 percent wage increase and recognition of their union. In addition, the miners demanded a fairer system for weighing the coal, a determining factor in the wages for which they labored twelve-hour workdays in dark, hazardous, and soot-filled conditions. In public opinion, the miners’ demands appeared just; however, industry leaders refused even to consider the proposals. With winter approaching, Roosevelt threatened the coal operators with new federal regulations, forcing them to negotiate with labor in proceedings overseen by a federal commission. The miners earned the wage increase and a reduced nine-hour workday; for the first time, government acted as an honest broker between labor and industry. In the final accounting, the miners failed to win recognition for the union, a shortfall that did not go unnoticed by Roosevelt.

In the pro-business environment of his era, Roosevelt’s decisions regarding antitrust litigation, conservation, and labor relations redefined government’s relationship with industry and reveal a president not only in step with public sentiment, but at times ahead of it. The public took up his mandate in the coming decade by forming the Progressive Party. Roosevelt’s firmest opinion on labor appeared in his platform as the Progressive Party’s candidate in the 1912 presidential election; he demanded compensation for workplace injuries, regulations limiting child labor, a safe work environment, an eight-hour workday, and the protection of working women. Although he lost the election, Roosevelt’s fight for labor influenced his cousin, Franklin, who later enshrined the legitimacy of labor by signing the 1935 Wagner Act, which federally recognized unions’ right to exist alongside industry. The loss also serves as an indicator of government’s limitations in the area of social welfare over industry in the early 1900s. This changed in the coming years as Fordism, with its introduction of the assembly line, became the new standard for efficient industrial production. More importantly, Ford paid workers in excess of industry standards because he believed employees deserved the means to purchase the products of their labor; Fordism promoted an ideology of consumerism in which the individual served both as employee and customer. Labor unions pounced on the political possibilities presented by the argument that depressed wages kept the individual from enjoying a consumer economy. This narrative of abundance just beyond reach, and of the system’s undemocratic nature, allowed FDR to promote the Wagner Act as a fairness issue that allowed labor unions to compete with industry.

In 1929, when he acted as spokesperson for Eureka College’s student union, Ronald Reagan benefited from labor’s expanding influence, which Theodore Roosevelt had advocated and Franklin D. Roosevelt later validated. Faced with a large school budget deficit, Eureka president Bert Wilson announced plans to cut programs while also laying off a portion of the academic staff. Reagan blamed poor leadership for the budget’s mismanagement, argued that students unjustly bore the brunt of Wilson’s ineptness, and called for Eureka College’s president to resign. The student union won the day as Wilson vacated the position, but the irony of Reagan’s involvement came at the school president’s expense. Months earlier, a desperate Reagan had pleaded with Wilson for a needy-student scholarship so that he could afford attendance; Wilson granted the scholarship, and the favor ultimately cost him his job. That Reagan’s student union possessed the power to force a college president from office speaks to unions’ growing importance, but it also marks Reagan’s entry into the labor movement.

After moving to California in the 1930s to work in film, Reagan joined the Screen Actors Guild (SAG), a union tasked with the workplace protection of a diverse cross-section of professional actors. As the federal government’s forced restructuring of the movie industry pushed aging “B” actors such as Reagan into increasingly marginal roles, he focused more energy on his SAG duties. From 1947 to 1952, Reagan served as president of the union, and he credited studio head Jack Warner with teaching him how to negotiate. He also complained bitterly about the federal regulations he believed impeded the film industry. During this period, his sympathies began shifting toward management’s understanding of film industry operations.

Reagan’s compliant attitude toward management while acting as SAG president indicates his initial shift from a liberal Democrat to a conservative Republican; his vacillation also serves to demonstrate Reagan’s understanding of the boundary between industry and labor unions. In a twenty-year span, he morphed from a student union spokesperson agitating against unfair school program cuts, to a president of SAG with sympathy toward management’s position. In his last year as SAG president, Reagan signed a waiver with Music Corporation of America (MCA) that provided the company with unlimited access to television resources, giving MCA an unfair advantage within the film industry. This conflict of interest attracted the attention of authorities in 1962, as a federal grand jury questioned his role in the affair, but chose not to indict him.

When Ronald Reagan became governor of California in 1967, the state was reeling from the social unrest that accompanied the Civil Rights Movement. The nation’s college campuses felt these effects intensely, as much of the movement’s momentum came from student protests, and the University of California, Berkeley became one such hot spot. A student-led demonstration over the school’s land-use policies erupted into violence after Reagan ordered 300 California Highway Patrol and Berkeley police officers to disperse the crowd. In the clash, a police officer armed with a shotgun fired into the crowd, killing one person; city residents and students retaliated by flooding into the area. Reagan countered by sending in 2,700 National Guard soldiers to police the city and enforce a curfew. The man who had once demonstrated against Eureka College’s improprieties defended his actions by stating, “If it takes a bloodbath, let’s get it over with. No more appeasement.” Reagan’s shifting attitude toward protest and labor unions continued its movement toward a pro-business conservatism.

Evidence of Reagan’s complete transformation into an anti-union Republican appears in his handling of the 1981 PATCO strike by air traffic controllers. Unwilling to compromise with the union, Reagan fired the more than 11,000 controllers who refused to return to work and relied on military personnel until new controllers were trained. Historian Eric Foner stated that Reagan’s actions “inaugurated an era of hostility between the federal government and organized labor.” His positions on labor encouraged private employers to fire striking union workers and replace them with non-union employees, which Foner noted “was a rare occurrence before 1980.” Speaking from a pro-business point of view, Federal Reserve Chairman Paul Volcker called Reagan’s handling of the strike “a turning point in America’s economic, psychic, and patriotic revival.” Reagan’s reversal on labor issues was only one element of his pro-business attitude.

The 1950s marked the last decade in which manufacturing served as a key driver of the American economy, and Reagan witnessed the shift while hosting the General Electric Theater television program. The show mixed tours of General Electric factories with skits performed by popular Hollywood figures of the day. As General Electric’s spokesperson, Reagan often complained of the unfair burden federal regulations placed on the corporation; yet the federal oversight he deplored existed to protect worker safety and labor’s right to seek fair wages. Labor’s successes, however, increased costs for large manufacturers and spurred industries to move plants overseas to economically distressed countries with looser regulations and cheap labor. Industries that remained in the U.S. updated factories with equipment that mechanized repetitive tasks, reducing the manufacturing workforce.

While federal regulation and labor’s influence contributed to manufacturing’s wholesale departure from America, both elements also combined to produce the world’s largest and most affluent middle class. The rising standard of living prompted a change in how Americans defined freedom, from “economic independence and democratic participation” to the “ability to gratify market desires.” The shift to a consumer society and the decline in manufacturing both weakened labor’s position; meanwhile, America’s financial sector grew by developing mechanisms to ease the flow of credit to hungry consumers, who in turn purchased new homes and all the items deemed necessary to fill them. By embracing a consumer culture, Foner stated, “Americans became comfortable in living in never-ending debt, once seen as a loss of economic freedom.”

Reagan’s five years as a G.E. spokesperson immersed him in a corporate culture that defined economic freedom as the worker’s right to quit an unsatisfactory employer and management’s prerogative to control wages and to maintain hiring practices that barred employment based on age, race, sex, religion, and political affiliation. During the show’s run, he visited 135 plants nationwide and came to see a growing social welfare state as enabling the poor to remain unemployed; this view of the indigent as freeloaders, however, ignored the contemporary reality of high unemployment caused by manufacturing’s shift to overseas labor markets. Reagan believed government interference drove American corporations into more business-friendly foreign markets and kept industries from competing successfully on a global level. Although he lacked the power to implement change, the show provided a platform to influence the hearts and minds of American viewers on the benefits of corporatism to the nation.

Reagan’s pro-business message attracted the attention of wealthy California businessmen who recognized the political and commercial opportunities implied in his ideology; they backed his run for governor of California after he promised to lower corporate tax rates and relax business regulations in the state, a program Reagan continued to espouse as president. He believed federal regulations limited the scope and range of industrial growth by forcing adherence to unrealistic or outdated rules, and that they warranted re-examination by his administration. Taxes, he believed, stifled spending for all Americans, from individuals to corporations. Influenced by Jack Kemp (R-N.Y.), Reagan embraced a “supply-side” ideology that proposed reducing taxes for corporations and high-income individuals, believing the savings would encourage the growth of personal and corporate wealth. Flush with profits, they would reinvest their capital in new business ventures or improvements to existing operations, and in the process create new job opportunities for working Americans. Supply-side economics, or “Reaganomics,” promoted a laissez-faire approach to government oversight across a broad spectrum of industries, and Reagan assured big business that his administration intended to relax regulations. His ability to convince the American public that supply-side economics and reduced regulation would improve the nation’s economy speaks volumes about Reagan’s oratorical skills. It also demonstrates his understanding, as president, of the boundary between government and industry. Reagan believed that by assisting corporations he also helped uplift poor Americans who needed employment.

We now know that Reagan’s economic and regulatory policies failed to meet their intended goals; wealth accumulated in the hands of a small portion of the American population while wages and employment stagnated for the working poor. The corporation’s ascendancy in government influence at the unions’ expense weakened labor’s ability to demand fair wages. As a result, the contemporary divide between rich and poor Americans is at its greatest level since the Gilded Age. Meanwhile, Reagan’s tax-cutting strategies proved ineffective in producing the promised “trickle down” of wealth from the rich to the poor, because the wealthy hoarded the largesse without rebuilding America’s manufacturing sector.

Five Reasons the Average American Cannot Become President of the United States

Beginning in primary school, American students learn about the inclusive nature of democracy, which promises that regardless of sex, creed, or religion, any United States citizen can become President. But how true is this teaching? Let us concede for the moment that an individual meets all the legal requirements to seek the nation’s top office. What other factors allow citizens to attain the presidency? Using the 18 presidents elected during the 20th and 21st centuries as a guide, we find that each shared the same distinctions in the areas of gender, education, college affiliation, political party, and government service. So, could the average American realistically become President of the United States? Here are five reasons that you will never sit in the Oval Office’s big chair:

1. Political Party

Though touted as inclusive, American democracy effectively offers the voter only two party choices – Democratic or Republican. While many political parties vie for presidential power, only these two have successfully run candidates who reached the White House during this period, with 11 victories for the Republican Party and 7 for the Democratic Party. This means that until a different party usurps power from these two groups, a candidate must be either a Republican or a Democrat, or there is zero chance of winning.

2. Gender

During the 2008 presidential race, much was made of Sarah Palin’s addition to the Republican ticket as the vice-presidential candidate. She joined Geraldine Ferraro, the 1984 Democratic vice-presidential candidate, as one of only two women ever to reach that level for the two major parties. Sadly, the data on our list of candidates reveals that the presidency remains a male domain. Hillary Clinton presents a strong, viable 2016 candidate for the Democratic Party, and she may well break the cycle for women. Until a woman gains the presidency, however, the data reveals that being female means exclusion.

3. Education

Primary school attendance is required in the United States unless parents opt to provide their own form of schooling, and all of the last 18 presidents graduated from high school. College, however, is an expensive investment, and tuition continues to climb at an alarming rate. On our list of presidents, only Harry Truman did not earn a college degree. According to the data, a successful presidential candidate possesses at minimum a high school diploma, and without a college degree the chances of success drop to roughly 5% (one president of the eighteen).

4. College Affiliation

The number of higher-education institutions in the United States is vast, but only a fraction provide their alumni with both a degree and a pedigree. U.S. News and World Report ranks the top National Universities and Liberal Arts Colleges in America each year, and five universities consistently jostle for the top spots: Princeton, Harvard, Yale, Stanford, and Columbia. On our list of presidents, fourteen graduated from a school ranked in U.S. News’s top 20, with Harvard and Yale dominating the other three of the top five. This means that 78% of the presidents elected in the 20th and 21st centuries graduated from a small sliver of the gargantuan number of colleges accessible to Americans. There is still hope for a candidate who attended a school outside the top twenty, as Warren Harding (Ohio Central College), Lyndon Johnson (Southwest Texas State Teachers College), and Ronald Reagan (Eureka College) attended colleges considered small by contemporary standards. However, a candidate taking this route has succeeded only about 17% of the time (three of the eighteen presidents).

5. A Powerful Position at the State or Federal Level

Along with gender and education, holding a powerful political office at the state or federal level is a commonality shared by all eighteen presidents elected in the 20th and 21st centuries. The most common position held prior to attaining the presidency is split evenly between two offices, each supplying six presidents: at the state level, the only office from which presidents have ascended is the governorship, while at the federal level the vice presidency is the clear favorite. U.S. Senators occupy the next most common spot, with three presidents. The final three positions are federal one-offs: Secretary of War, Secretary of Commerce, and Supreme Military Commander. In short, the data reveals that a successful presidential candidate must be a veteran of some form of government service.

Based on the shared characteristics of our previous eighteen presidents, the average American stands a 5% to 22% chance of reaching the nation’s highest office, provided the candidate campaigns from a power position at the state or federal level. American democracy’s most egregious omission is the absence of female representation at the presidential level. By the standards set by the 20th and 21st century’s presidents, half of the American population is denied fair representation. Perhaps even more alarming is the hold on presidential power by a small sliver of higher-education institutions, and the social ideologies, some shared, that shape future leaders. Furthermore, the cost of an education at a top-twenty school is roughly $46,000 annually, economically unfeasible for the majority of Americans. So, the next time you hear someone say that any U.S. citizen can become President of the United States, speak truth to the lie and set them straight.
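For readers who want to check the arithmetic behind these percentages, a minimal Python sketch follows; the counts come straight from the discussion above (18 presidents, 14 from top-twenty schools, 3 from small colleges, 1 with no degree), while the labels and the rounding are my own.

TOTAL_PRESIDENTS = 18  # presidents elected in the 20th and 21st centuries

# Educational paths taken, using the counts cited above.
paths = {
    "top-twenty school": 14,                    # e.g., Harvard, Yale
    "small college outside the top twenty": 3,  # Harding, Johnson, Reagan
    "no college degree": 1,                     # Truman
}

for path, count in paths.items():
    print(f"{path}: {count}/{TOTAL_PRESIDENTS} = {count / TOTAL_PRESIDENTS:.1%}")

# The 22% upper bound quoted above corresponds to the four presidents
# who did not graduate from a top-twenty school (3 small-college + Truman).
print(f"not from a top-twenty school: 4/{TOTAL_PRESIDENTS} = {4 / TOTAL_PRESIDENTS:.1%}")

Run as written, the shares come out to roughly 78%, 17%, 6%, and 22%, which is where the figures in this piece originate.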

Education and Political Power

The simplest definition of American democracy is a government of the people, by the people, and for the people, and it worked for the nation’s earliest political leaders. As men who worked the land, they were familiar with the wants and needs of their constituents and took government service as a serious duty. The legislative calendar reflected the times, as politicians scheduled session breaks to coincide with the planting or harvesting of crops. Times, however, have changed, and a popular complaint today is that our leaders are out of touch with the populace and place corporate welfare above the well-being of the people. The overwhelming cause of this disconnect is education, or more precisely, which college a politician attends.

For most Americans, the cost of college plays a significant role when choosing a school. Higher education is an investment that serves to expand the student’s mind, while also teaching the social skills necessary to make friendships that may last a lifetime. With each new addition to the social network, new opportunities become available for the student, and if seeking a graduate degree, the social network matures to include a smaller network of exceptional friends. However, for some students cost is not a factor, and the choice becomes influenced by a university’s prestige and reputation for providing the student with both a degree and a pedigree. The friendships made while attending one of the nation’s top schools offer exponential benefits to the student via access to social elites, while the school’s reputation bestows an often-undeserved gravitas on its alumni.

Education matters, but reaching the truly preeminent positions of presidential, judicial, and senatorial power requires attendance at one of the nation’s top twenty universities. To explore the relationship between education and political power, this analysis divided American institutions of higher education into three tiers using data from the U.S. News and World Report rankings of National Universities and National Liberal Arts Colleges. The top twenty schools on both lists formed Tier 1; schools ranked 21 to 50 formed Tier 2; and Tier 3 included any institution ranked below fifty.
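As a quick illustration of this three-tier rule, here is a minimal Python sketch; the rank boundaries come from the paragraph above, while the function name and the decision to treat unranked schools as Tier 3 are my own assumptions.

def tier_for_rank(us_news_rank):
    """Map a U.S. News ranking to the three-tier scheme used in this analysis."""
    if us_news_rank <= 20:
        return 1  # Tier 1: top twenty on either list
    if us_news_rank <= 50:
        return 2  # Tier 2: ranked 21 through 50
    return 3      # Tier 3: ranked below fifty (or unranked)

print(tier_for_rank(35))  # a school ranked 35th falls in Tier 2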

To attend a Tier 1 school in 2014 costs an individual about $43k annually, with average enrollment of 15k students at national universities, dropping to an average of 2k at liberal arts colleges. A steep price to pay for a smaller class size, an undergraduate degree from an elite school costs roughly $172k over four years, and the price continues to rise. Students encounter elite private clubs and rub elbows with the next generation of socioeconomic giants, and possibly future powerful contacts.

Further, these contacts make the exorbitant costs at this level a worthwhile investment. Narrow the focus to the nation’s top five universities – Princeton, Harvard, Yale, Stanford, and Columbia – and the fiscal benefits of attending one of these institutions become apparent: their alumni include 116 billionaires, according to The Atlantic. Harvard alone counts almost 3,000 graduates worth more than $30 million each. Academically, these schools are staffed with scholars whose studies drive the narrative for the nation’s future. However, the elite atmosphere that insulates these institutions also limits their students’ and faculties’ exposure to America’s cultural soup and the economic realities confronting the nation’s denizens. As a result, scholarship from Tier 1 schools is often focused through a very narrow socioeconomic prism.

The difference in cost between a Tier 1 school and a Tier 2 school is small, with the student on the hook for an average of $39k annually at a national university and $42k at a liberal arts college. Average annual enrollment climbs to 24k at Tier 2 national universities but holds at the same 2k average found at Tier 1 liberal arts schools. The increased enrollment signals that Tier 2 national universities are more accessible to a larger portion of the American populace, providing greater diversity in the student body and broader access to opportunities for cultivating economic contacts. The mirror-like averages for Tier 1 and Tier 2 liberal arts schools imply that the colleges in this group also provide a measure of exclusiveness to degree-holding students, boosting the perceived value of alumni membership. Ideas, too, flow copiously from the scholars at these universities, bringing with them the possibility of the fame necessary to make the leap to a Tier 1 faculty post.

Tier 3 schools comprise the remaining higher-education institutions, with an average annual cost to the student of $22.8k. This is the tier available to the majority of American families, admitting students on a sliding scale of costs relative to their economic conditions. The quality of academic output from scholars at this level varies too much to generalize, because its members are at every stage of career growth or decline. The substance of a degree earned from a Tier 3 school depends on the student’s willingness to study diligently in the search for knowledge. While Tier 3 contains the vast majority of American colleges, it is the least represented group within the halls of presidential, judicial, and senatorial power.

In the 20th and 21st centuries, all but four of the eighteen American presidents attended Tier 1 schools. The remaining four hailed from Tier 3 schools or had no college education, and two of them made sweeping changes to the national dialogue about poverty and identity. Lyndon Johnson signed the Civil Rights Act of 1964 into law and advanced the ideology of a Great Society and the War on Poverty, which produced government-supported food assistance and education programs. Without Johnson’s support for black Americans, there is no President Obama. Ronald Reagan, the preeminent presidential hero for modern Republicans, redefined poverty in a way that reduced the number of welfare recipients while arguing for lower taxes on the wealthy that he promised would “trickle down” to the rest of society. Neither, however, attended a Tier 1 school.


The percentage of Supreme Court Justices hailing from a Tier 1 school is 66%, only a moderate drop from the presidential numbers. Thirty-seven of the fifty-six justices appointed in the 20th and 21st centuries came from Tier 1 schools, and that figure pales beside the current bench, where Tier 1 representation stands at 100%. The impact of the judicial branch on citizens is mighty: collectively, the justices are the custodians of constitutional interpretation, and the institution’s power was on full display when it expanded the definition of free speech in politics, unleashing a flood of money for campaign purposes.

Because of the vast number of Senators voted into office during the 20th and 21st centuries, information for this analysis was limited to a fifty-year period beginning with the current Senate members of the 113th Congress and continuing at decade-length intervals to include the 108th, 103rd, 98th, and 93rd classes. In the Senate, there is a greater level of diversity in school tier representation, with almost 47% of members identifying as Tier 1 alumni. However, the tilt remains firmly in the direction of Tier 1, with Tier 3 representation hovering at 45% throughout the period. While these numbers point to a greater degree of influence-sharing between the tier groups, much of a Senator’s power resides in committee memberships.
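Because each Congress spans two years, a decade-length interval works out to five Congress numbers; the sampled classes listed above can be generated mechanically, as this small Python sketch of my own shows.

# Each Congress lasts two years, so a decade step equals five Congress numbers.
sampled_classes = [113 - 5 * i for i in range(5)]
print(sampled_classes)  # [113, 108, 103, 98, 93]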

Using information from the 113th Congress, power-sharing at the committee level appears relatively even between Tier 1 and Tier 3 schools, with the exception of three committees: the Committee on Rules and Administration, the Committee on the Budget, and the Committee on the Judiciary. The Committee on Rules and Administration sets credentials and qualifications for Senate members, but it also oversees federal elections for President and Congress. Tier 1 representation on this committee stands at 57%, or eleven members, while Tier 3 representation is 37%, or seven members. With that power over elections, the current trend means a small cadre of schools can influence election outcomes through the entrenched ideologies that emanate from these institutions.

The Senate Budget Committee, in a sense, holds the purse strings of the federal government, setting the general economic plan for the nation’s spending while also overseeing the Congressional Budget Office, which monitors the federal debt. This committee contained the highest Tier 1 membership of any committee, with 14 senators, or 64%. Senators from Tier 3 schools numbered seven, or 32%, a dismally small share to represent the viewpoint of the average American, particularly in matters of spending. The committee is populated by members who earn more in a year than many Americans earn in a lifetime, yet they still claim to understand the harsh economic conditions that grip the nation’s poorest people.

The final committee with an overwhelming number of Tier 1-schooled senators is the Committee on the Judiciary. Tasked with oversight of amendments to the U.S. Constitution, it also controls the selection process for new federal and Supreme Court nominees. With such far-reaching influence over the laws of the land, the decisions made by committee members directly impact the rights of all Americans. Tier 1 representation on this committee is 61%, or eleven senators, while Tier 3 members number only six, or 33%, a paltry accounting from a body that purports to defend justice. It is a vision of justice imagined through the narrow filter of the most expensive, socially exclusive higher-education institutions, and only slightly ameliorated by the diversity of six Tier 3 senators.
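A small sketch of the committee arithmetic cited in the last three paragraphs follows; the Tier 1 and Tier 3 member counts are taken from the text, while the overall committee sizes (19, 22, and 18 seats) are my own back-calculation from the stated percentages and should be treated as assumptions.

# Committee membership shares in the 113th Congress, from the counts above.
# Tuples are (tier1_members, tier3_members, assumed_committee_size).
committees = {
    "Rules and Administration": (11, 7, 19),
    "Budget": (14, 7, 22),
    "Judiciary": (11, 6, 18),
}

for name, (tier1, tier3, size) in committees.items():
    print(f"{name}: Tier 1 {tier1}/{size} = {tier1 / size:.1%}, "
          f"Tier 3 {tier3}/{size} = {tier3 / size:.1%}")

The printed shares land within a point of the percentages quoted above; the small differences are rounding.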

Each Senate committee possesses its own form of power: the Rules and Administration Committee influences the election process, the Budget Committee holds sway over the spending of federal monies, and the Judiciary Committee shapes the lawmakers and the lawmaking process. But is it wise to entrust these vast powers, and the ideology driving them, to a group whose world view is informed by only a narrow band of schools, the nation’s top twenty institutions? And what does the data mean for the average American?

First, unless the average American can afford to invest $368k in education, joining the Supreme Court is virtually impossible, and the presidency becomes a remote possibility. If the percentages were expressed as a cancer survival prognosis, the average American’s chance of recovery would be 22% (President) or 23% (Supreme Court), both low enough to begin planning a funeral, unless, of course, cost were no issue for the patient. The only president elected in the 20th or 21st century without a college degree was Harry Truman, an anomaly the current statistical trend suggests will not recur anytime soon.

Second, data from the 113th Senate class points to greater opportunities for power-sharing; however, achieving a seat on an influential committee requires the same heavy education investment as the presidency or a seat on the Supreme Court. The large presence of Tier 1-schooled senators on the Judiciary Committee may influence its preference for choosing like-minded Supreme Court Justices. Anyone familiar with the high court’s nomination process recognizes the penchant for selecting candidates who conform, while differences in social or economic ideology provide ammunition to attack and dismiss those whose views fall outside mainstream beliefs.

Finally, the accusation leveled by the Occupy Movement, that American political leadership favors the 1% of society, appears valid. The overwhelming representation of Tier 1 schools among the nation’s political elites reinforces an ideology little informed by the economic and social diversity the average American encounters. These conditions create a self-sustaining ideological loop that equates elitism with intelligence, producing political support by Tier 1-educated leadership for policies that appear foreign to the citizens they govern.

A nation that once enjoyed fair representation by its peers, who provided a government of the people, by the people, and for the people, has devolved into a system that is instead of the elites, by the elites, and for the elites. With the rising cost of college tuition, it is a system built to accommodate those already positioned for success, unless you are an average American.

 

In what American social class do you or your family reside: the wealthy, middle class, working class, or poor?

In what American social class do you or your family reside: the wealthy, middle class, working class, or poor? I posed this question to a classroom of San Diego State University students taking an introductory course on American history from the Civil War to the present. This was my third semester working as a teaching assistant, and experience had taught me that students grasp historical concepts best when those concepts are connected to real-life experience. At this point in the course, we were discussing the social upheavals that buffeted American society from 1900 to 1930, particularly in the labor movement, and that paved the way for President Franklin Roosevelt’s New Deal policies.

To make the polling as free of embarrassment as possible, I instructed students to close their eyes and raise their right hands above their heads if they were members of the wealthy class. If they identified as members of the middle class, students were told to raise just their left hands, and if they considered themselves part of the working class, to raise both hands. Finally, I instructed those who identified as poor to keep their hands down.

As I counted the students, noting the number in each category, the majority identified themselves as middle class. A smattering identified as wealthy or working class, while one brave soul claimed a place among the poor. The results, however, did not match the statistics provided by the SDSU administration, which showed that the majority of its students came from San Diego’s working class. To explore this discrepancy, I placed a student at a whiteboard listing the four social class groupings, while the class suggested the identifying characteristics of each social class.

Students began with a debate about the fiscal boundaries of each social class before moving on to define the economic factors important to all Americans. While determining the components of an individual’s social class appears simple on the surface, students struggled with the concept because Americans rarely think about class differences, with most simply assuming they belong to the middle class. To flesh out the problem, I asked the class to think about the major economic factors they will likely face as they grow older. What services, I asked, will you need in day-to-day life, and which resources do you consider indulgent?

That simple question brought an avalanche of ideas, and the class settled on eight factors that delineated each social class from the others: housing, transportation, healthcare, salary, savings and stock ownership, credit, profession, and the accessibility and affordability of a college education for both adults and their children. Using these boundaries, the students produced a table (see below) that they believed reasonably defined each American social class.


Poor: annual income of $11,000 (single), $15,000 (couple), $18,000 (one child), or $22,000 (two children); rents or owns a home or apartment; relies on a car or public transportation; high school education or some college; lives month to month on the paycheck; dependent on credit cards or has none; no savings or stock ownership; limited access to healthcare; works more than one job; cannot pay for a child’s college education.

Working Class: annual income of $23,000 (single), $28,000 (couple), $32,000 (one child), or $38,000 (two children); rents or owns a home or apartment; relies on a car or public transportation; high school education or some college; lives month to month on the paycheck; dependent on credit cards; scant savings and some stock ownership; limited access to healthcare; works more than one job; limited ability to fund a child’s college education.

Middle Class: annual income of $46,323 (single) or $67,348 (couple); owns a home or multiple homes; multiple vehicles; BA/MA/PhD; adequate paycheck; less dependent on credit cards; monthly savings and stock ownership; access to healthcare; one job, or both partners working; can pay for part or all of a child’s college education.

Wealthy: annual income of $1,000,000 and up; owns multiple homes; multiple vehicles, ships, and aircraft; BA/MA/PhD; ample economic growth; uses credit cards as a convenience; stock and corporate ownership; no limits on healthcare; a single job that employs a workforce; no barriers to paying for a child’s college education.
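To show how the salary boundaries in the table translate into a rule of thumb, here is a minimal Python sketch that classifies a single adult by annual income alone; the thresholds come from the students’ table, while the function name and the single-adult simplification are my own assumptions.

def social_class_single_adult(annual_income):
    """Rough class assignment for a single adult, using the salary
    boundaries from the students' table (2014 dollars)."""
    if annual_income >= 1_000_000:
        return "wealthy"
    if annual_income >= 46_323:
        return "middle class"
    if annual_income >= 23_000:
        return "working class"
    return "poor"

print(social_class_single_adult(30_000))  # a single adult earning $30,000: working class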

Assigning class designation based on salary was the most sensible place to start because the data is readily available on government websites. I challenged students to find the data in less than five minutes and pointed them to census.gov, where they discovered the necessary figures. The paycheck remains a vital economic indicator for most Americans and a deciding factor in where they live, their means of transportation, and whether they attend college. It determines an individual’s access to healthcare and credit, and the amount of savings, if any, for future retirement. More importantly, pay determines how much time and how many resources parents can spend on their children.

The poor category is the prime domain of minimum-wage laborers; the lowest-paid workers often need more than one job just to break even month to month, and depend on credit cards to afford vital services such as food, clothing, rent, and bills. With nothing left over at month’s end, savings are negligible and educational opportunities limited. Unless employers provide healthcare plans, the poor’s access to medical aid is restricted to emergency hospital visits or low-cost community health centers that treat symptoms rather than provide long-term care. Affordable housing eats up the largest portion of pay, and families often find themselves forced to live in gang-controlled, crime-ridden neighborhoods. Faced with such a steep economic climb, the students decided, families endure an overwhelming number of obstacles in trying to escape generational poverty.

Moving into the working class required nearly a doubling of pay based on the economic figures, but it also opened up greater educational opportunities. Students posited that, although access to education improved, the other economic factors remained relatively stagnant, leaving the individual susceptible to employment downturns answered by increasing dependence on credit cards or taking multiple jobs. Accruing savings was possible, but affordable college education for both the individual and their children was only possible with student loans. Housing opportunities also improved for members of this class, lessening exposure to crime and environmental pollution. The working class, students complained, was the most difficult to define and the least discussed by the nation’s politicians.

Advancement into the middle class required nearly another doubling of salary relative to the working class, but students also decided that this group drew fiscal benefits from investing excess wages in the stock market. Access to higher education, and the professional connections derived from it, provided greater access to high-wage job markets, a benefit passed on to children through private schools and tutors. Credit card use among this well-educated group was seen as driven more by convenience than necessity, and members could pay the cost of college for their children, both tuition and housing, without relying on student loans. Without burdensome student loans weighing them down, this group’s children were empowered with the advantages necessary to succeed on a generational basis.

The wealthy class, students decided, was the easiest to define because no obstacles, other than greed, constrained its members’ opportunities. Advanced education, fiscal abundance, and political adroitness gave this group the power to manipulate legislative rules and influence beneficial tax and economic policies. Although President Barack Obama identified $200k as the low end of the wealthy class, students decided that true financial freedom begins only for individuals earning a minimum of one million dollars annually. Wealth accumulation in this class flows not only from the individual’s earnings but also from the labor provided by the social classes below it.

Armed with this new understanding of contemporary socioeconomic class, I conducted another secret poll asking where my students believed they fit in the nation’s fabric. This time no one identified as wealthy, the number identifying as middle class dropped dramatically, and students claiming working-class or poor status rose accordingly. Class identification brought home to my students the reality of the obstacles contemporary individuals face in economic advancement, something readily apparent to Americans living between 1900 and 1930. Now, ask yourself: where do you fit in today’s American social classes?

The “10 most annoying things about dining out” list

When people ask me what I do for a living, I usually reply that I’m a professional student. I’ve spent a good portion of my life in college collecting various bachelor’s and master’s degrees while waiting tables to make ends meet. I don’t consider myself a professional food server, although I’ve been in the restaurant industry a good portion of my life; the job just fits well into a busy life of learning new, interesting, and mind-expanding information. My morning routine consists of checking new email, reading the NY Times online, browsing both Reddit and Digg, and finishing with the new blog postings gathered by Feedly. I must confess that I’m a sucker for stories built around lists, and there is no shortage of these postings on Reddit and Digg. Today while perusing Digg I stumbled across this list of the 10 most annoying things about dining out by Dave Faries. Because I have an insider’s knowledge of the restaurant industry, Faries’ complaints reveal personal information about him that might go unnoticed by his readers. I would normally just leave a quick comment on the post, but this particular list hit a nerve that requires more than a note to vent my frustration. Not all of Faries’ listed complaints are invalid; however, some are so inconsequential that they don’t deserve more than a brief acknowledgment.

10. Wait staff asking “How is everything” at all the wrong moments
Faries complains about servers who ask this question just after the customer has taken a large bite of food and can’t really reply without choking. Earlier I alluded to the writer inadvertently revealing personal information in his post, and this is one of those moments. If Faries frequently encounters this situation, then his servers probably consider him a difficult customer. The “how is everything” technique is used by servers to manage customers who constantly complain or try to engage the server in boring or inane conversation; these customers are best approached with their mouths full.

9. Superlatives
This is when restaurants use “best” or “greatest” to describe menu items. I agree with Faries on this one; however, would a customer invest in a slice of “better than average” apple pie? Superlatives are a fact of life in a capitalist system, used by every industry from automakers to cell phone providers; they exist to make the customer feel good about the purchase. You can bet the server is in the wait station laughing with fellow employees about the fourth “diet” Coke refill needed to wash down that last bite of “America’s best” truffle cake.

8. Seating people in clusters
In this instance Faries takes exception to seating groups of customers together when the restaurant isn’t busy. He admits the practice allows food servers to attend to guests more easily, but says it “robs the guest of some privacy. In a quiet space, voices carry. The practice, therefore, can make for an awkward dining experience.” Unless Faries is planning a bombing or some future embarrassing sexual adventure (I’m thinking Marv Albert here), he’s giving his conversational skills far too much credit for being interesting to nearby diners. And if customers are seated haphazardly throughout the restaurant, the level of service drops off. This is called a catch-22.

7. Opening hours not posted on the web site

Again, I have to agree with Faries on this complaint. However, if I’m interested enough to search out a restaurant online I’ll probably just call if the times aren’t posted.

6. Restaurants not keeping stated hours
I heartily agree with Faries on this point, but keep in mind that individually owned restaurants will close early if business is slow in an effort to save on operating costs. They don’t have the deep pockets of national restaurant chains that can remain open regardless of business conditions.

5. By the glass wines at cocktail prices
Apparently, Faries is upset at the high price of wines by the glass. For those familiar with Adam Smith’s concept of the invisible hand, the free market sets the price for goods and customers decide whether to participate by purchasing said goods. If the wine is too expensive, a re-evaluation of drink preference is in order; I’ll be right back with your “diet” coke.

4. Staff not bussing silverware between courses
This is another point on which Faries and I agree; however, the problem often lies with the customer “guarding” the dirty silverware or placing it out of the server’s reach. International guests are much easier to serve because they place used silverware together on the empty dish, signaling the waiter that they are finished and ready to have it removed. Frequent diners are aware of this common courtesy, but some customers lose their ability to dine civilly in public, forcing the server to become a surrogate mother.

3. Charging high corkage fees at BYOB-only restaurants
Most restaurants I know of charge a corkage fee when the customer arrives with outside liquor, and it is usually a small, nominal charge. If Faries is bothered by this practice, he should avoid the offending restaurants; if liquor is a necessity to enjoy a meal, I would also recommend having a local AA chapter on speed dial. Speaking as a server, when the customer objects to a corkage fee, we believe the diner is just being cheap; the demographic that usually complains the loudest is wealthy diners. I can’t explain why this is true, but I’m speaking from personal experience. Complaints about food and beverage costs signal to the server that the individual is probably cheap and won’t leave a decent tip regardless of the service provided; the server will usually focus attention on the diner who understands that the server wasn’t invited to the meeting where prices were set and shouldn’t be held responsible for them. If the food and drinks are out of your comfortable price range, you’re dining in the wrong place.

2. No reservation policies at popular restaurants
Restaurants that practice this policy do themselves real harm. Turning away any customer is bad business, and if it happens frequently, diners will stop seeking out the restaurant. In my experience the practice is a rarity; in fact, I’ve never worked at, or dined in, a restaurant that didn’t take reservations.

1. Restaurants encouraging valet parking
Faries takes exception to restaurants roping off a large portion of the parking lot for valet parking, forcing patrons to walk farther to reach the establishment. It is a common practice, and it does push self-parking customers into the less desirable corners of the lot; to that extent the restaurant is discouraging customer access. But as long as I can find a parking space, I don’t mind a short walk; it isn’t something to get upset about.