Wednesday, April 19, 2017

The Party of Know Nothings

     In the 1840s and 1850s, several secret societies arose in reaction to the heavy immigration into the U.S. of poor Irish, southern Germans, and Central Europeans, driven by the potato famine in Ireland and the suppressed revolutions of 1848 across Europe.  These societies became openly political in the form of the American Republican Party in New York, then the regional Native American Party, and finally the national American Party by 1855.  When asked about their organizations, members were instructed to answer “I know nothing.”  From then on, these people became known as the Know Nothings.
     The Know Nothings were extremely anti-immigrant and anti-Catholic.  Many believed that Pope Pius IX, who was widely accused of being anti-democratic, had hatched a plot for millions of Catholics to immigrate into and then take over the U.S. by force of arms if not by ballots.  Conspiracy theories ran rampant.  The Know Nothing message was particularly popular among the American lower middle and skilled working classes, especially Protestants of English, Welsh, Scottish, Scotch-Irish, Dutch, and northern German ancestry.  They saw the new immigrants as slovenly, stupid, criminal, and unwitting tools of unpatriotic Catholic bishops and corrupt city machine politicians.  The Know Nothings argued that immigrants posed serious threats to traditional (their) American ways and therefore should be denied the right to vote, the right to hold public office, and American citizenship until they had resided in the country for at least 21 years.
     The Know Nothings varied on points of emphasis in different parts of the country.  In addition to anti-immigration and anti-Catholic sentiments, in some areas they were also anti-elitist and anti-intellectual.  In other areas, they advocated social and political reforms that anticipated future popular causes, especially temperance and Progressive political reforms.
     The Native American Party (with “Native” meaning white Americans of older generations, not Indians) swept elections in Massachusetts in 1854.  The Know Nothings also exercised major political strength in Pennsylvania, Ohio, Indiana, Illinois, Maryland, and California (where the anti-immigration hatred was directed at the Chinese).  They successfully elected mayors in Boston, Philadelphia, Chicago, and San Francisco.  The movement hit a high point of popularity in 1855, and then began to decline after a serious Know Nothing riot killed 22 people and wounded many others in Louisville.  The American Party ran candidates for President and Vice President in 1856, but then virtually disappeared by 1860.
     It was another issue that eclipsed the anti-immigration movement:  slavery.  Northern Know Nothings migrated to the Republican Party, even though Abraham Lincoln of Illinois disapproved of them.  (The Irish and German immigrants, by the way, became fiercely loyal to the Union and supplied numerous troops for Lincoln's army.)  In the South, the Know Nothings bitterly opposed Lincoln and supported secession when racial prejudice proved stronger than ethnic prejudice.
     Periodically, nativist groups have gained popularity in American politics because they reflected deep fears and biases concerning generation after generation of newcomers to the U.S.  Once each immigrant group took hold in America, it tended to object to the newer immigrant groups, which it saw as threats on several levels.  Having worked so hard to win social respect and middle-class lifestyles, Americans have jealously guarded their advantages against foreign-born intruders.  There have always been Know Nothings.  They may even exist today.


© 2017 Stephen M. Millett (All rights reserved)

Thursday, April 13, 2017

Personal Health as Public Health

     If an individual becomes sick, does he or she pose a threat to others?  When the sickness is due to transmittable viruses and bacteria, you bet!  Personal health can become public health.
     As discussed in my previous blog posting, the concept of public health in the U.S. dates back to at least the 1640s.  Local and state governments routinely regulate individual behavior that impacts the health of other individuals.  They also provide public health services, such as sanitation, public water, garbage collection, and contagious disease controls.  Public health has always included disease prevention as well as epidemic management. 
     In Jacobson v. Massachusetts, the U.S. Supreme Court ruled in 1905 that states have the power to require individual vaccinations to prevent epidemics (in this instance, smallpox).  The ruling also went far to justify state powers to impose individual isolation and mass quarantines.
     At the national level, the Federal government regulates public health and provides services under the war powers, the taxing powers, and the commerce clause of the Constitution.  The Public Health Service Act of 1944, subsequently amended and enlarged by numerous acts of Congress, consolidated the authority of the Public Health Service, and in 1946 the government established the Communicable Disease Center, now the Centers for Disease Control and Prevention (CDC), to prevent the spread of illnesses into the U.S. from abroad and among the states.  Congress further provided health care services to individuals through Social Security and Medicare.
     In 1985, during the Presidency of Ronald Reagan, Congress passed the Consolidated Omnibus Budget Reconciliation Act (COBRA).  Among many provisions, the act prohibited hospitals (but not necessarily physicians) from “patient dumping,” the practice of turning away patients who could not pay.  Hospitals are required to provide full services in cases of individual health emergencies caused by sicknesses, disabilities, injuries, and assaults regardless of the individual’s ability to pay for such services.  Hospitals often recover their losses by charging other patients and their insurance companies higher prices, so that those who can pay end up indirectly paying for those who cannot.
     If the government requires people to do certain things for their own individual healthiness in the interest of public health and if the government requires hospitals to serve people in medical emergencies, then cannot the government also require patients to pay for such services?  In 2010 Congress passed the Affordable Care Act (ACA), also popularly known as “Obamacare,” mandating that people had to have healthcare insurance to pay hospitals (and physicians) for medical services.  It was a way to balance the obligation to provide medical assistance with the obligation to pay for it.  The constitutionality of Obamacare was twice upheld by the U.S. Supreme Court. 
     The requirement that everybody must have healthcare insurance, including checkups and disease prevention as well as cures and recoveries, is in the public interest.  Requiring individuals to take responsibility for pursuing and paying for their own well-being and healthcare in addition to preventing and managing epidemics is a legitimate government power in the pursuit of public health.  In matters of health, the government is protecting me from you, and maybe you from me.



© 2017 Stephen M. Millett (All rights reserved)

Thursday, April 6, 2017

Public Health Protects Individuals

     In an attempt to save the colony from its own self-indulgences, the stern governor of the Dutch settlement of New Amsterdam in 1648 ended the common practices of allowing hogs and other animals to free-range across both public and private property, throwing household garbage into the streets, and allowing private outhouses to overflow.  The alleged tyrant was the colorful Petrus Stuyvesant and the nascent colony survived and prospered to become New York City.
     Dutch libertarians in 1648 might have protested that nothing was more private than a privy.  They might have decried regulatory intrusions into personal matters.  The governor, however, would not tolerate libertarian excrement.  He took the stand that no individual enjoyed the freedom to do things that infringed upon the freedoms of other individuals or compromised the well-being of the entire community.  Stuyvesant did not understand public sanitation and the biology of human feces carrying viruses and bacteria dangerous to other people, but he did understand that an overflowing privy “not only creates a great stench and therefore great inconvenience to the passers-by, but also makes the streets foul and unfit for use.”
     For nearly 370 years the city fathers of New York have regulated human sanitation, water quality, and garbage collection; they have even mandated that you have to clean up poop left on the sidewalks by your dog.  The regulations of New York have been widely adopted by cities and states across the country through public health regulations and services.  Communities have gone beyond just sanitation and garbage to enforce ordinances and laws concerning restaurants, restrooms, land zoning, and building codes to protect individual health, lives, and property along with maintaining public order. 
     As I explained in my book, the Federal government is the institution of the national community.  Few people today would dispute the authority of cities and states to regulate public health, but does the Federal government also have the power to regulate it?  As early as 1798, Congress, based on its war powers, created a network of public hospitals for seamen that evolved into the office of the Surgeon General of the U.S., the U.S. Public Health Service, and the U.S. Department of Health and Human Services.  In 1906 Congress, based on its interstate commerce powers, passed the Pure Food and Drug Act and the Meat Inspection Act to protect the health and safety of consumers.  As transportation, communication, and business networks expanded from coast to coast, the Federal government exerted more national regulatory power over increasingly national health problems that could not be adequately addressed by local and state governments alone.  After all, the water quality of many major lakes and rivers and the air quality that we breathe transcend municipal and state boundaries.
     Does the evolution of the United States as a fully blended national community justify increasing Federal powers to regulate even global climate change and individual healthcare insurance?  Let’s explore this question more fully in future blog posts.
       

© 2017 Stephen M. Millett (All rights reserved)

Thursday, March 30, 2017

Is Patriotism Only Military?

     Think 4th of July – parades, cookouts, flags, and fireworks.  It is the national celebration of American patriotism, which is the love for our country and what it stands for.  We emphasize the defense of our personal liberties and independence from foreign enemies.  We also express our gratitude to generations of men and women who have served our country in uniform.  But is patriotism only military?
     Oddly, many Americans praise military service as personal sacrifice in defense of our country and then rail against the same national government that operates the military as though Washington, D.C., were the seat of a foreign and oppressive regime.
     If patriotism is the love for our country, does it also extend to the love of our fellow Americans in a context other than just national defense?
     I argued in Chapter 4 of American Ways that Americans historically have loved private communities, but have always distrusted public communities.  So often Americans have viewed private communities as “us” and public communities as “them.”  This attitude is apparently a legacy of the American Revolution and the Anti-Federalists who opposed the ratification of the U.S. Constitution.  It may also be a legacy of slavery and the Civil War, depending upon whose side one supported.
     Private communities consist of churches, neighborhoods, teams, clubs, fraternal orders, and societies.  We express our individual freedoms in selecting our participation in them and agreeing to cooperate with others in the same social organizations.  On the other hand, public communities are schools, governments, and the military.  What a paradox!  We express our loyalty to our country through military service, yet we distrust all forms of public communities, including the military.
     The national government in Washington, D.C., is the long-lived institution of our national community.  As founded by the Constitution of 1787 and as periodically amended, the Federal government serves all Americans regardless of state residency, social and economic position, race and color, gender, and personal preferences and lifestyle.  When we abide by Federal laws and court rulings, vote, express our views to our elected representatives, provide emergency relief, and respect the rights of other individual Americans, are we not also expressing our patriotism?
 

© 2017 Stephen M. Millett.  All rights reserved.

Tuesday, March 14, 2017

House Divided: The Absolutists

     In a series of posts, I have been exploring the major divisions in American society, the “house divided,” and whether they could cause another civil war.  Of the many divisions, the most serious may be more emotional than material.  It occurs among those Americans who hold absolute beliefs with unyielding values and lifestyles.  I call them “the Absolutists.”  And they can emerge from both the right and left wings of the ideological spectrum.
     The American people have become highly polarized in their increasingly partisan political views.  In the 1960s and 1970s, many Americans came to distrust government because of the war in Vietnam and Watergate.  Some wished to restore trust through social programs and liberal reforms, while others pursued a new trust by limiting government and encouraging private initiative.  The latter gravitated to the Republican Party and aligned themselves with social as well as fiscal and constitutional conservatives.  Some of them became Absolutists:  what they believed was “true” and what they said and did could be “trusted.”  And some even became mean.
     Republicans deeply resented the upset elections of Bill Clinton to the White House in 1992 and 1996, and they tried to overturn those elections with impeachment.  Then the Democrats bitterly objected to the alleged stolen election of 2000 in Florida that returned the White House to a Bush.  There followed eight years of unrest with the terrorist attacks of 9/11 in 2001, the subsequent wars in Afghanistan and Iraq, and the economic meltdown in 2008. 
     Partisan politics got really ugly with the election of Barack Obama in 2008 and again in 2012.  Millions of Americans who lost their well-paying jobs in the Great Recession blamed Washington more than Wall Street.  Conspiracy theories abounded with people venting their anger on Obama’s economic recovery policies, Obamacare, and the apparent government favoritism for certain social and economic minorities over others.  Had the Absolutists of the left taken over the country?  With the surprising election of Donald Trump to the White House in 2016, both Republicans and Democrats came to deeply distrust the Federal government.   
     Another type of Absolutist emerged among some religious groups.  The most divisive issue became abortion.  The issue polarized opinions from those who asserted that all abortions were immoral, and therefore should also be illegal, to those who argued that most if not all abortions were justified at the personal discretion of the mother.  Another controversial social and legal issue concerned the marriage of gay partners.  People became exceptionally bitter and reluctant to compromise on political let alone religious grounds.  Added to this bitterness were those who held their faith so deeply that they objected to the ideas and behavior of other people who did not hold their same beliefs.  If a point of view becomes absolute, then should not everyone abide by the same principles?  Since the 1970s, some have pursued their political agenda aligned explicitly with their religious views.  No longer was the separation of church and state necessarily desirable.
     What has been lost among the Absolutists of both the right and left wings has been the social and ideological willingness to be hard-headed rather than hard-hearted:  to make accommodations and seek practical compromises so that people can get past their own states of mind and move on to productive jobs, careers, and lives without trying to tell other Americans how they should best live their own lives.  As I explained in American Ways, such accommodations in the past have bolstered individual freedoms, political stability, and great economic growth.  Otherwise the Absolutists may drag all of us into another civil war of interpersonal fights, localized violence and riots, political upheavals, and economic reversals.



© 2017 Stephen M. Millett.  All rights reserved.

Wednesday, March 8, 2017

House Divided: The IWE Gap

     A major division in America today is the growing gap in IWE: income, wealth, and education.  In some respects the gap is Marxist, except that the working and middle classes have made modest gains while the rich have gained greatly.  It’s the gap between the Haves and the Have Lots, the top 1% and maybe 10%.
     Several studies have shown that long-standing income and wealth gaps in the U.S. started to narrow during the Great Depression and continued narrowing into the 1970s.  Among the several factors that changed this, the Reagan Administration encouraged private wealth-building, and a few built great wealth.  The 1990s saw great economic growth in general, but the rich pulled way ahead of others.  This was due to the explosion in digital technologies and the success of new products, services, and startup companies.  It was also caused by the growing disparity between the wages of employees and the income, bonuses, and investment returns of business executives.
     Then came the economic bust of 2008 and the Great Recession.  Nearly 8 million Americans lost their jobs, and employment did not return to its pre-recession level until 2014.  The Great Recession was not just another periodic business downturn, but rather a structural shift in the American economy.  Many workers and lower-level managers never got their old jobs back or found new jobs that paid as well.  They were left behind in the recovery enjoyed by the upper middle class and the rich.  By 2011, the top 10% of the population held 72% of all private wealth while the bottom 50% owned just 2%.  Said another way, the top 1% may own as much as 38% of private wealth, or more than all the assets of the bottom 90%.
     More than ever in the past, the growing income and wealth gap in the U.S. is caused by and in turn affects the education achievement gap.  Those parents and communities that have benefited from superior schools strongly back education, because an education is the surest way for the better-off to pass class and financial advantages on to their children.  But less advantaged parents and communities, especially in inner city and remote rural districts, can undervalue and undermine the schools in many ways.    
     It has been estimated that only 30% of Americans now achieve a level of education higher than their parents.  In the past, Americans expected to do better than their parents in all sorts of ways.  Now more and more young people doubt that they will earn as much as their parents did.  They question the future validity of the “American Dream.” 
     Education translates directly to better incomes for students in the future.  For example, in 1995 an American who did not finish high school typically earned a bit less than $25,000 a year; by 2014, that median annual income in constant dollars barely rose.  With a high school diploma, one could earn over $32,000 in 1995, and actually earned less by 2014.  Meanwhile, a person with a bachelor’s or higher degree made over $51,000 in 1995 and made slightly more by 2014. 
     Many Americans of all ages who lacked competitive skills and suffered serious financial reverses in spite of the general recovery have become bitter and resentful.  They particularly resent the government and its subsidies to those people who feel “entitled” to benefits that they have not otherwise earned.  Occasionally they may resort to violence, but they will not likely engage in civil war as class warfare – not unless they believe that they have to rise in revolt to protect their self-respect and interests.
     Addressing the gaps in income, wealth, and education to avert social tensions and violence, let alone another civil war, is the biggest domestic challenge in the U.S.  Yes, we want general economic growth, but who will benefit the most?  With their great wealth, will the top 1% act like 18th century French aristocrats and expect extraordinary privileges?  Will they dominate the political process?  Could they be the “Deep State”?  And how will income and wealth inequality impact all American consumers, who account for some 70% of our annual GDP?


© 2017 Stephen M. Millett.  All rights reserved.

Friday, March 3, 2017

A House Divided: Racial

     Among all the divisions of Americans that could lead to violence, even to the next Civil War, perhaps the most evident is race – the division between African Americans of slave descent and whites of European extraction.  The division is not as much biological as it is psychological.  The underlying foundation of American racial relations remains fear.

     Black Americans from the slave tradition fear white brutality, discrimination, and demeaning gestures.  For centuries, they were expected to obey white masters and defer to their white superiors and they paid dearly in pain and lives if they crossed whites.  As I asserted in my book American Ways, the Civil War ended slavery but did not end the racism that had previously rationalized the “peculiar institution” among whites.

     And whites fear blacks as well.  The great fear of white masters in early American history was generated by plots and incidents of slave insurrection.  As early as 1739, the Stono Rebellion of slaves in South Carolina resulted in the deaths of as many as 47 whites and 44 slaves.  In 1811 a slave revolt broke out in the territory that became Louisiana.  As many as 500 rebellious slaves marched on New Orleans, burning down five plantations along the way.  And the Nat Turner revolt in Virginia in 1831 killed about 60 whites.  None of these slave revolts were successful and all were brutally suppressed, but they caused enormous fear that lasted beyond 1865 and continues to exist somewhat even today.

     The greatest and most successful of all slave revolts was the revolution that created a new independent black country, Haiti, free from its French masters in 1804.  The former slave forces of the new republic systematically massacred 3,000-5,000 white French men, women, and children.  The French who escaped to the U.S. brought with them horrific stories and near-hysterical fears that the same might happen elsewhere.

     Since the end of slavery in 1865, whites have continued to fear black violence.  The form became private fights, occasional murders, and periodic race riots across the country.  The fear of a race war, a fear reminiscent of slave rebellions, reached a peak in the late 1960s during a period of turmoil caused by the civil rights movement, militant black groups, assassinations, and opposition to the Vietnam War.

     Might racial divisions cause another American civil war in the future?  Possibly, but not probably.  Occasional racial shootings and riots will likely continue, but they are not likely to escalate into full-blown civil war for several reasons.  The percentage of African Americans among the total American population has never been large:  since 1860, the national percentage has ranged from about 10 to 13, although some areas are predominantly black.  With time and increasing numbers of interracial marriages and children, the races are slowly converging.  Fear generally has been declining, except for particular incidents where blacks fear police actions and whites fear black crimes.  During the last six decades, whites have increasingly accepted blacks in social and economic settings.  Equal rights before the law, legal nondiscrimination in public, and voting rights for blacks are more generally practiced.  Each generation since World War II has become less and less responsive to old racial suspicions.  In addition, Hispanics have emerged as the principal minority group in the U.S., and they are causing a new set of fears, especially concerning some 11 million illegal immigrants who have been characterized by some white Americans as drug dealers, rapists, and murderers.  The divisions concerning Hispanics and Muslims will be covered later in a separate blog posting on immigration.


© 2017 Stephen M. Millett.  All rights reserved.