Friday, April 29, 2016

God Is Not Dead, We Just Look for God in the Wrong Places

There have been many pronouncements that God is dead.  The most famous perhaps is that of Friedrich Nietzsche, although it is widely misunderstood.  If you look beyond the quoted phrase, Nietzsche was saying that we have killed God.  That we have taken away everything that was magical in God’s creation and are left with nothing to moor us.  

“But how have we done this? How were we able to drink up the sea? Who gave us the sponge to wipe away the entire horizon? What did we do when we unchained the earth from its sun? Whither is it moving now? Whither are we moving now? Away from all suns? Are we not perpetually falling? Backward, sideward, forward, in all directions? Is there any up or down left? Are we not straying as through an infinite nothing?”

This is not the statement of a Godless man, but one who realizes that our modern knowledge makes it impossible to believe in the God of the Old Testament and that we must find something else to believe in, to moor us.  

Darwin’s theory of evolution as well as the many discoveries of modern science regarding the history of the world are simply not compatible with the Bible.  In short, one cannot believe that the Earth is 4.5 billion years old and that man is several million years old … even modern man is about 50,000 years old … and believe in the God of Genesis.

But others have argued a more fundamental point, as do I.  The history of life on Earth has proven that the concept of a God to whom one prays and who is said to answer prayers and control life on earth is an illusion, purely a creature of belief.  So even if one takes the Bible with a grain of salt and says that God guided the creation of the Earth and all that is upon it over this expanse of time, the God that we’ve been taught to believe in just doesn’t exist.

What kind of God would have allowed slavery?  What kind of God would have allowed the Holocaust and all the other gross and minor inhumanities of man?  What kind of God would for some reason make a child suffer and die?  The questions go on and on.

In the old days, and even today, many people, not willing to see the facts as evidence that such a God doesn’t exist, answer these questions with the classic, “The Lord works in mysterious ways.”  Because if they did not believe in God, what would they believe in?  As Nietzsche said, God is their mooring. 

The answer is not so much a “new” conception of God, but one that has existed almost as long as the world’s major religions … that God, the Divine, is to be found in each of us.  It’s just not a concept that has received much exposure. 

The mystical traditions of all three Abrahamic faiths ... Christianity (Gnosticism), Judaism (Kabbalah), and Islam (Sufism) ... as well as Buddhism and Hinduism contain the teaching that what we think of as being ourselves, our ego, is not our true self.  That instead our true self is variously defined as our heart, our true Buddha nature, our Divine essence.  Our suffering results from our true self having been buried under years of learned experience at the hands of our family, peers, and culture, and from our thus identifying with and being under the control of our ego.  Unfortunately, this truth is not stated in the Old Testament or the Koran, nor are the flocks of these religions taught it.  How sad.

Although Christ did not speak to this issue, some in the early church, such as Paul, and later Augustine, and then the Reformation, put forth the concept of original sin … that we are all born sinners because Adam did not heed God’s word in the Garden of Eden and was cast out.  And that only God, or Christ, can bring salvation.  This concept became central to the teaching of the Catholic Church and many Protestant denominations.

But as I noted in my post, “Our Culture Is the Serpent in the Garden of Eden,” I believe that this take on the story is wrong.  What then is the real lesson of the Garden of Eden?  

As told in Genesis, in the paradise that God created, man and woman were naked, but they were not embarrassed by their nakedness and they were one with all things.  The only thing forbidden to them was to eat the fruit of the Tree of Knowledge of Good and Evil.   They lived in a world where there was no “knowledge” of right or wrong, good or bad, no cravings, fear, or strife.  Interestingly, the paradise of Genesis is virtually identical with the Buddhist Nirvana.

But they ate from the Tree of Knowledge of Good and Evil. The point is not so much that God’s commandment was broken and that they thus sinned and were cast out, but that because it was broken in this specific way, they lost their innocence and the world would never be the same.  

The story does relate dramatically, metaphorically, that man would be separated from the Tree of Life, from the knowledge of his true self, his God-essence, having gained knowledge of good and evil.  But not that man for all eternity will be burdened with original sin and be born a sinner.  That is the spin that Christianity put on the story.  And as a result, millions of people in each generation have believed, because they were so taught, that they were born sinners.  Not a healthy self-concept.

The teaching of the opposing universal truth … that our ego is not our true self but that the God/Buddha essence is … is found in the teachings of the Buddha, in Sufi literature such as The Art of Being and Becoming, in the Kabbalah, in the teachings of Gnosticism, and in the Bhagavad Gita.  Contrary to the fear of Nietzsche and many others that man will be left rudderless without belief in the old God, contrary to the proof they see in our modern culture of the death of the old God and the resulting waywardness of people, God has always been alive and well inside each and every one of us.  

But it is for us to rediscover it, to uncover it, and allow it to embrace us and transform us.  For example, according to Kabbalah, “every soul is pure in essence and the only salvation is to become enlightened (i.e. to remember the truth of who and what we really are). … Salvation is the process of clearing out whatever obstructs our manifestation of the concealed divine image. … Kabbalah leads to the conclusion that ultimately we must rely on ourselves - for we alone have the power to save ourselves.”  It is to our heart we must look for guidance, not our ego-mind.

If one were to ask why most of organized Christianity adopted the doctrine of original sin, and why in Judaism and Islam the teaching that the God-essence is in each of us is mostly confined to their mystical branches, the answer might be found in that statement of Kabbalah just quoted … “we must rely on ourselves, for we alone have the power to save ourselves.”  Organized religion could well have felt that that teaching would reduce its power and influence.   Or it could be that organized religion didn’t have faith that we, ordinary people, can save ourselves and thus felt we needed something external to believe in.

Having found Buddhism in my middle age and walked the path for more than 20 years now, I can attest that freeing ourselves from our ego-mind is not an easy matter.  It involves changing the habit-energies of a lifetime; changing everything we have come to believe about who we are.  But it is possible, with discipline and good teaching, to find the Buddha nature, the God essence, inside each of us.  First comes belief in the teaching, then meditation and practice, and ultimately self-realization.

God is alive and well.  The God-spirit is in each of us, no matter how high or low, no matter how pure or consumed with evil thoughts and acts.  We have all been led astray by the serpent of learned insecurity and the culture of “want.”  We have been programmed by our life experiences to act and think as we do.  But that is our ego, not our true self.  There is no such thing as a bad person; just persons who do bad things.

If we all sought to find the Divine in each of us, the world would be a very different place.

Monday, April 11, 2016

The Cause of Urban Ghetto Violence Cannot Be Placed on a Failure of the Black Community

There are many, especially Republicans, who criticize Blacks for the violence in the urban ghetto community, violence which falls mostly on that community itself.  The point is either made or implied that it has something to do with Black culture, that it is a failing of Black mothers to raise their children properly, or that there are too few two-parent households.

While there can be no arguing against the facts of ghetto violence, the causal connections often made have only superficial merit.  If one looks at urban slums/ghettos around the world, one finds gangs, drugs, and violence.  It makes no difference if one is in Asia, Africa, Europe, Los Angeles or New York City.  

Regardless of the race, color, or ethnicity of the urban ghetto dweller … the incidence of violence in the urban ghetto is a universal fact.  It is instead the crushing, dehumanizing impact of urban ghetto poverty that creates a seedbed for violence.

In most global urban ghettos, the poor are also predominantly immigrants or migrants.  One could even argue that Black Americans are still to a large extent immigrants (forced) because they have not been successfully assimilated into important aspects of the larger culture.  This aggravates the crushing impact of the urban ghetto because people also feel, with good reason, that they are not welcomed, that they have no place in the larger society.

That the combination of poverty and urbanization should produce such an outcome should not be surprising.  And the impact of globalization has actually made it worse.  

Maya Angelou, in her book Wouldn’t Take Nothing for My Journey Now, says that the children of the ghetto are the way they are because they do not experience caring, self-respect, and courtesy in the home.  That has much validity, but that experience itself is in turn the product of poverty and the soul-crushing life of the urban ghetto.

I’m not going to go into the sociological reasons why the combination of poverty and the urban ghetto produces violence.  Untold books and articles have been written on the subject.  The reasons are well known and the facts inescapable.  Yet we as a society, and all societies around the world, choose to point the finger at the people themselves and/or their cultures rather than the situations the poor find themselves in, because that is what is convenient for us.  

If it weren’t the fault of the poor, if the problem weren’t self-inflicted, then the larger society would have both a social and a moral obligation to correct the situation, to remove or at least ameliorate the causal factors.  But we do not want to drastically change the way our societies are structured, the way resources are distributed by government, the deeply embedded racism against the ethnic poor, and the pervasive discrimination directed towards all poor.  And so life for the poor continues more or less as it always has, even while they receive meager government assistance in the U.S. and other countries.

This is just one more example of the impact of the lack of humanity in our society  (see my post, “Healing Our Nation, Healing Ourselves”).  And our nation, as well as the rest of the world, will not move forward unless the essence of humanity is rediscovered by us humans, individually and collectively.

Saturday, February 20, 2016

When Is a Socialist Not a Socialist?

When Barack Obama was running for President, the Republican Right branded him a “Socialist.”  They have also branded Obamacare as “socialized medicine.”  These claims were so ridiculous that neither Obama nor anyone else ever took the time to set the American people straight on the meaning of these words and the lie they represented as applied.  Thus for many, the terms stuck.

Now, because of Bernie Sanders’ run for the nomination, and his self-identification as a Socialist or what he sometimes refers to as a Democratic Socialist, it is critically important for the American people (and Sanders!) to understand what these words mean before even starting to think about which candidate they prefer.

First, the meaning of Socialism:  “A system of society in which the major means of production are owned and controlled by the government rather than by individual people and companies.”  This definition is from Webster’s and is basically identical with other sources.  

Why government ownership?  The theory is that government is the desired owner because it represents all of the people rather than just a few and so decisions about production and distribution will be made in a way which better meets the needs of the broader society.  Capitalism, on the other hand, where the means of production are owned and controlled by private companies or individuals, makes its decisions on what is produced and how it is distributed based solely on what is in the best interests of the company and its owners/shareholders.

Neither Barack Obama nor Bernie Sanders has ever called for industries, for the means of production, to be owned or controlled by the government.  Therefore, neither of them is a Socialist, nor does either advocate Socialism.  

Yes, I know that Sanders identifies himself as a Socialist at times, but he’s not.  I have the feeling he just likes the sound of the word, that it confirms he’s for the people and against big money, and it sets him apart.

The term “Democratic Socialism” is still Socialism as defined above, but the system of government is democratic, that is, representative.  So again, neither Obama nor Sanders is a Democratic Socialist or advocates Democratic Socialism.

Well, what is Sanders, then?  Sanders, like the European countries he often refers to, is a Social Democrat.  I know the semantics may seem confusing, but the differences are important.  

“Social Democracy” refers to a political democracy in which a capitalist system of ownership and production is regulated by the state to make it better reflect the public good, and in which the state helps those who need help with various forms of aid, such as public assistance, Medicare, Social Security, etc.  Webster’s also defines it as a state that combines both capitalist and socialist practices. 

So guess what?  The United States is a social democracy, certainly since the Depression.  Only the most radical right-wing Republicans want a purely capitalist state where there is no government regulation (and also no government aid to industry) and no government help for those in need.

The differences between today’s mainstream Republicans (radical has become mainstream for them), Hillary Clinton, and Bernie Sanders are really differences of degree, albeit great ones, along a continuum from little government social involvement … that is, action to promote the public rather than private good … to significant government action to promote the public good.

Hillary wants more government action to help those in need, but does not want to disturb the capitalist model.  Sanders is willing to disturb the capitalist model where necessary to provide for the public good, for example, universal health insurance.  Likewise, Hillary is less willing to closely regulate the financial industry while Sanders wants rather strict regulation of that industry.  

The example of health insurance is perhaps the easiest way of clarifying the distinctions.  In the strictly capitalist model, health insurance is provided by private for-profit insurance companies and is bought by individuals or companies on behalf of employees.  The government is not involved at all.  There would be no such thing as Medicare or Medicaid.  

Even radical Republicans don’t dare go that far.  They would prefer to remove the government from any programmatic involvement and rely on private insurers, but still provide funding through some type of voucher or income tax credit program … which would provide more profits for private insurers.

Bernie Sanders wants universal health care with the government being the single payer, easiest to understand as expanded Medicare for everyone.  This is the system that is in place in most European countries and Canada.  This could fairly be called socialized medical insurance, but the medical delivery system otherwise remains as is.  People can in most cases opt out of this system and choose private care if they so wish.

What Hillary wants is Obamacare.  This is a system that still uses private insurers, and so it cannot be called socialized medical insurance because the insurance is not provided by the government.  But the government both regulates and provides subsidies so that those who cannot afford the insurance can still obtain it.  It’s better than what we had before, but it’s a clunky system with lots of shortcomings, as I know from my own personal experience.

Bottom line.  The whole “Socialist” or “Democratic Socialist” harangue is a red herring.  
It would be helpful if Sanders started getting his terminology correct and made the point expressed in this post: that almost everyone, regardless of political party, is on the same continuum, just at different points of the spectrum.  We are a social democracy, even if not a very progressive one.

This does not lessen the differences between the parties or candidates.  But it does remove scare terminology from the debate and instead places the question clearly where it should be … how much help should the government provide its citizens, directly or indirectly?  Is health care a basic right that everyone should have?

Saturday, February 13, 2016

Understanding Why America Is No Longer, and Perhaps Can Never Be, As Great As It Was

People either go on about how great America is, or they lament that America is not as great as it used to be.  In the first case, people typically ignore reality.  In the second case, they often ignore fundamental factors. 

When people say that America is great, they are either referring to the strength of our military (which is a fact), the size of our economy (which is a fact), or the things America stands for (which is also a fact, at least in theory).  

However, while we unquestionably have a strong military, it does not serve its purpose of protecting American interests: our enemies are not cowed by our might, nor do we have the political willingness or financial ability to send our military everywhere it is needed to protect our interests.  Thus we are not really as strong as our size and might would make it appear.  American strength is somewhat of a facade.

Our economy is also the largest in the world, even though the Chinese have been rapidly catching up with us.   We also have the most stable and strongest domestic economy in the world.  But our corporations, and as a result our financial well-being, have become so interconnected with the stability of the rest of the world economy that our economy is not as strong/stable as it was.  

Further, because of stagnant wages and the loss of middle-class jobs, financial inequality in America has soared and become damaging, and our middle class, which was the bedrock of our consumer economy, has been eviscerated.  The American people are hurting even as American corporations are prospering.  Then there’s the fact that the rest of the world, in particular China, holds a large share of the debt that we have incurred by spending more than we take in, especially as a result of the disastrous Bush II tax cuts and the Iraq war.

As to American exceptionalism being a function of our ideals, as I’ve noted in prior posts, this exceptionalism is mostly a myth (see “American Exceptionalism - A Myth Exposed”).  America has never lived up to the ideals expressed in the Declaration of Independence or the 14th Amendment.

On the other hand, when people speak of America not being as great as it was, they often speak of America not being respected because our military needs to be stronger.  But the lack of respect has little to do with the strength of our military.  It is more because America has not for many years had the moral authority that it once had, even if it was based on an illusion.  Also, as noted above, our guerrilla enemies are not scared of our military prowess.

When they speak of our economy being weaker, they do focus on the issues I raised above, but the underlying context is not addressed.  The economy is not as great as it was because the world has changed and America has changed.  

The world has changed because third-world countries are no longer just producers of raw material (with the glaring exception of most of Africa).  And so they produce products that would pose competitive problems for U.S. production even without the free market trade agreements that have proliferated at the behest of both economists and corporations … unless, that is, the U.S. were to enact high protective trade barriers to keep many foreign products out.  But that would create a different problem: the combination of not having inexpensive foreign-produced items to purchase and a reasonable growth in U.S. workers’ wages would lead to high inflation rates that would damage our economy.  (Also, the inevitable trade war that would ensue would harm our exports.)

But America has changed in significant ways as well.  During the first stage of explosive growth in our economy, much of the country was still unsettled frontier, leaving room for a huge expansion of economic activity accompanied by huge increases in population through immigration from all parts of the world.  During the second stage, from the turn of the 20th century into the initial post-WWII period, America was unequaled because the rest of the world’s developed economies were minuscule by comparison and China and most of the non-European world were undeveloped, not even developing.

None of that is true today.  And so, because of all of these factors, the way often cited for the American economy to regain its strength is through American creativity or innovation.  And many think we’ve done just that.  

But while we have seen lots of American technological innovation in the last few decades, it has only fueled American corporate profits, not worker wealth, since the products are largely produced overseas, and so the economy has not really been strengthened.  Only if those jobs were brought back would it make a real difference.  

As for creativity, since the creation of the computer chip, there really hasn’t been much creativity, just innovation.  Even nanotechnology is innovation, not creativity.  But regardless, unless creativity resulted in good, middle-class jobs for U.S. workers, it would not help strengthen our economy.

But this discussion raises the question, “Does America have to be great?”  Economically, given the size of our population and the standard of living that we were used to 40 years ago and would like to reacquire, the answer is unfortunately yes.

Thus, bottom line, figuring out how to bring American jobs back or create new ones without creating other major economic disruptions such as high inflation is a task that corporations and workers/unions need to sit down at the table to discuss, probably at the behest of government.  One point seems clear.  To significantly increase the number of American middle-class jobs, wages will have to be lower than they once were, but that would still result in a benefit for both workers and the economy.

The only other way that America will either be, or be viewed as, the great nation it once was economically is if much of the rest of the world implodes and the U.S. finds a way of disconnecting itself from that calamity.  I think recent history shows that it would be prudent to prepare for that eventuality.

Militarily, America certainly needs to be strong.  But what that means in the context of current or projected international conflicts has been a subject of some debate.  Many argue that we need a leaner and more flexible military rather than an updated version of the current dinosaur.

As for being great from a moral authority perspective, while there is no need to be great, it certainly would be very beneficial from many perspectives for the U.S. to regain its moral authority.  President Obama certainly tried to move in that direction in the beginning of his first term.    But to regain that authority, much would have to change both within this country as well as how it engages the rest of the world. 

If the advice I have given in many of my posts on this site were followed, it would go a long way to regaining our moral authority.  But that, unfortunately, is highly unlikely because to bring about that change means changing who holds power in Washington … ending the control that corporate America and the wealthy have over our policies.  Although Bernie Sanders talks of such a revolution, achieving it is another matter … and he is the only candidate talking about it.

As has been the case in many of my posts, the final analysis is that we survive in an outdated, broken system and cannot be what we need to be in the future without major changes in our political, social, and economic structure.

Saturday, January 30, 2016

The Warehousing of the Elderly - Cruel But Usual Punishment

My mother is 106 and in a nursing home.  Around the time she turned 90, she went to live in a life-care community, first in independent living, then assisted living, then the dementia unit, and now the nursing home.

As I visit her frequently during her lunch, I have had ample opportunity to observe the other patients in the facility.  It is a sad sight.  In general, the people are visibly very unhappy, regardless of whether they have extreme dementia, are in almost coma-like states, or are still to some extent coherent.  

When she was younger, in her 70s perhaps, she used to say that if she ever got “like that” … meaning not able to care for herself … to give her “the black pill.”  That was her way of saying that she wanted to die in that event.

While I have often thought of the problems with how we care for our elderly (see my post, “Aging - A Buddhist’s Take on the Stages of Life”), it came to me the other day that what many elderly experience in their last years is cruel but unfortunately usual punishment.  

“How can I say, ‘punishment’?” the reader may ask.  I say punishment because the elderly have not chosen to end their lives in this way; the choices have been made for them.  The options are largely dictated by our society, including the medical profession, even if loved ones make specific choices within that structure.

The problem comes down to this.  Except for a very narrow range of directives that people can make in a Living Will, one has no control over the trajectory of one’s life once one is no longer of sound mind.  And even if one is of sound mind, the legal options are very limited.

What the law should provide is the opportunity for people, when they are still of sound mind, to state their wishes regarding how they want to live, be treated, or be cared for  – including assistance in dying – when they reach certain events or stages in their life if they are no longer able to direct their own care.  I do not speak only of the elderly here because illness and accidents and death can come at any time.

So for example  (the actual document would be far more specific):

1.  If you have a health event (heart attack, stroke, accident) or as a consequence of aging become physically unable to care for yourself, and such condition is irreversible, do you:
- wish all efforts to be made to prolong your life, whether in a nursing home facility or elsewhere, 
- wish to be assisted in dying, if that is legally permissible, or 
- wish to be cared for at home and allowed to die?

2.  If you have a health event or as a consequence of aging become unable to think coherently and engage in conversation, are disoriented, and don’t know who you are, and such condition is irreversible, do you: (as above)?

3.  If you have an irreversible physical condition that produces constant, or near-constant, pain, which pain can only be relieved by putting you in a heavily sedated state, do you: (as above)?

4.  If you become terminally ill, do you: (as above)?

5.  If you indicate that you wish to be cared for at home and allowed to die, do you wish to refuse any and all treatment for any illness or condition which, if untreated, would eventually lead to your death or not?

6.  If you indicate that you wish to be cared for at home and allowed to die, would a hospice facility be an acceptable alternative or not?  

7.  If, as a result of such refusal noted above, you are in pain or experience other discomfort such as intractable nausea or shortness of breath, do you request that you be given all available palliative care, including narcotic medication, to mask any pain and ease any discomfort, or not?

There are few things more personal or private than one’s physical and mental health.  Only by providing individuals while they are still in a sound state of mind with the ability to make such directives will society provide them with the control of their medical care and their life/death to which each person is entitled.

Sunday, December 27, 2015

Back to the Future, But Not Too Far!

We are a country that is obsessed with the future, with facilitating the next phase of our “progress.”  In the process, we have lost our collective, our societal mooring to what has made the United States a great social and political experiment.  

As I’ve noted in previous posts, our society is dysfunctional in many respects.  But there are two central problems.  One is that virtually all political power is now in the hands of major corporations and the rich; they call the shots in Washington, not the people.  The other is that these same actors, as well as many average citizens, seem to have no concern for the welfare of their fellow citizens, and in the case of corporations, their workers.

One can place a band-aid here, and another there.  But that will not change any of the basic problems that we are facing and which are pulling the United States down from its great potential.

I have therefore argued for a revolutionary change in attitude and perspective on the part of our political parties and citizens.  This revolutionary change is not to something “new,” some utopia, but rather back to ideals and standards that served this country well and made it strong during the 20th century.  

In the first 125 years of our country’s history, things were pretty much a frontier-style free-for-all.  Each person for himself.  People who needed help generally weren’t helped, and those who were on the make pretty much got away with anything they tried.

But at the turn of the 20th century, the country took a progressive turn in its politics under Republican President Theodore Roosevelt.  The government and people saw that things had gotten out of hand and that there was massive inequality in power and wealth in the country.  Because such inequality did not square with our founding ideals, there was a realization that government needed to become a more active player to ensure that the average citizen wasn’t exploited and that power was more evenly distributed.

Thus, during the first 20 years of the new century, the progressive income tax was introduced, the robber barons were regulated, massive holding companies like Standard Oil were broken up, and workers were given the right to unionize.  And women were finally given the right to vote.  

As I state in my book, We Still Hold These Truths, a social contract developed that gave practical shape to Lincoln’s famous “government of the people, by the people, for the people.”  There was an increasing emphasis on a balance between rights and obligations, between business interests and the public good, with each person contributing to support the government’s efforts to level the playing field, each according to his ability.

Following the 1929 stock market crash and the resulting Depression, government saw the need to increase its role both in providing a hand to those in need (for example, the enactment of Social Security) and in regulating the excesses of big business (for example, the Glass-Steagall Act).  In the mid-1960s, Medicare was enacted together with a host of measures to further improve the balance and fairness of our society. 

Congress also passed major civil rights legislation in the 1960s, although it must be said that while these laws resulted in certain improvements in their lives, the basic standing of most black Americans in our society and the conditions in which they lived and were educated were left virtually unchanged.  And they were still frequently subject to various forms of both institutional and private discrimination.  (See my posts, “The Mirage of Civil Rights,” and “Our Failed Economic/Social/Political System.”)

But I don’t want to overstate my case.  Needless to say, throughout these progressive periods, there were plenty of people, both in Congress and in the populace, primarily Republicans, who were against both measures to regulate business and efforts to increase government spending or other efforts to help those in need.  Even during the Depression and its immediate aftermath, there were people, and not just the rich, who literally hated FDR!  In 1932, the height of the Depression, Roosevelt only got 58% of the popular vote when he ran against Hoover, although he swept the electoral vote.

In this regard, it should be noted that regardless of the huge changes shown in the electoral vote map, indicating landslide years, the popular vote has never been a landslide.  For example, in 1972 when Nixon got 96% of the electoral vote, he received only 61% of the popular vote.  Likewise, when FDR got 98% of the electoral vote in 1936, he got only 62% of the popular vote.  The country has historically been quite divided.  

Then along came Ronald Reagan, the same man who had campaigned vigorously against the enactment of Medicare, who as President famously said that “government is not the solution to our problem; government is the problem.”  Reagan didn’t invent a new movement.  He just gave voice and a popular face to deep feelings that have always been held by a large percentage of the voting population, legitimizing those perspectives.

The fervency and bitterness of these feelings grew and deepened over the following years, culminating in the Tea Party movement and the current crop of Republican radicals (they should not be referred to as “conservatives”) in Congress.  What they, led by the billionaire Koch brothers and others who back them, want is nothing less than a return of this country to its 19th century ethos, when it was each man for himself, without any interference from or help by the government, of course with the exception of Social Security and Medicare from which most of them directly benefit.  Unfortunately, they don’t see the irony in this.

What I am calling for is a return to the 20th century ethos (Reagan excepted) of balance and social responsibility plus a changed attitude towards black Americans.  

This is not a soak the rich movement or class struggle.  It is a movement that seeks a return to the ethos where we are all part of a society, that recognizes that many people are born into situations that place huge obstacles in their attempts at pursuing the American dream of happiness and equality, and that those who have made it, who have benefited from the system, have a responsibility as citizens to help the government in its efforts to assure that all have true equal opportunity.  

In this regard it should be noted that for most of the income tax’s existence, the highest tax bracket ranged from 60% to 94%, dropping to 50% during the Reagan years.  So the current top rate of 39.6%, and even the various suggested increases, are historically low.  It should also be noted that regardless of the tax rate, the rich have always remained rich.

Nor is this an anti-business movement.  The health of our economy and of the businesses that make it prosper are of critical importance to the well-being of all Americans.  Business interests must always have a significant place at the table.  But we have learned all too often that it is nevertheless not true that what is good for corporate America is good for all Americans.  Thus there must be a balance between the needs of business and the greater public good.  Maximizing profit cannot be the sole goal of a responsible corporation in a democracy.  

For example, the New York Times just reported that corporate lobbyists working with their friends in Congress (on both sides of the aisle) inserted a provision in the omnibus spending bill that just passed that continues a tax loophole that benefits casino and hotel owners as well as major Wall Street investors to the tune of $1 billion.  That is to say that our tax revenues will continue to be reduced by that amount from what they otherwise would be.  That is unconscionable.

Nor is this a big government movement.  I for one feel strongly that government should be as small as it can be while executing the functions that are its responsibility.  There should be no sacred cows.  Every aspect of government must be justified by the purpose it serves and its effectiveness.

What I seek is simply government of the people, by the people, and for the people … all the people.  Not government of the people  (they do still elect), but by corporations, and for corporations.  Which sadly, is what our government has to a large extent become.

The citizens of this country deserve better.

Saturday, December 5, 2015

When Is Justice Not Justice?

For the last few decades, certainly since President Reagan’s nomination of Robert Bork, the nomination and confirmation process for Supreme Court justices and lower Federal court judges has been political theater.  It was not always so.  

Chief Justice Roberts stated in a recent talk reported in The New York Times that earlier in the 20th century “The Court was not regarded as such a partisan football.  A lot of nominations of this time were Republican presidents appointing Democrats and vice versa.  The Court wasn’t regarded as a place where partisan matters would be worked out.”  

And so it should be.  Not that partisan matters shouldn’t come before or be considered by the Supreme Court.  But they shouldn’t be decided on a partisan basis.  The only proper question is whether or not they violate the Constitution, and that should be a strictly legal question, not a partisan one.

In front of many courthouses there is a statue of Justice, a blindfolded woman holding the scales of justice.  Since the 15th century, she has been blindfolded because of the belief that to be “justice,” a decision must be made without regard to who is being judged or who the victim or other party is.  Justice must be impartial.

But what does that mean?  Is it just a matter of it making no difference, for example, if someone is rich and powerful as opposed to poor and weak?  Or whether someone is black or white?   

Not quite.  It must go deeper: for justice to truly be impartial, those meting it out must be blind not only to who the parties are but to their own biases and attitudes … to everything but the facts.  That is the essence.  Political and personal attitudes must be left at the door.  Only then can a judicial decision truly be made without regard to who the parties are; only then is justice impartial.

For example, in a case which pits corporate rights against the individual, being pro-business should not impact the decision.  In a case involving abortion-rights opponents against women’s choice proponents, siding with one side or the other emotionally or intellectually should not impact the decision.  The same is true for any case that pits parties on opposite sides of an ideological divide against each other.

The reader may respond that what I’m suggesting, nay demanding, is not possible.  It is indeed difficult for someone to put aside their biases and attitudes when they have the opportunity to further them.  But when one is a judge, I believe one must.  Otherwise, the justice handed down is not impartial.  And if it’s not impartial, it’s not just flawed, it isn’t justice.

It is in the legislative branch of government where biases and attitudes have a legitimate role.  We live in a representative democracy and representatives should as a general rule speak on behalf of their constituents, which means representing their biases and attitudes.  Yes, representatives are supposed to act in the best interest of the country (or state or city), but what that “best interest” is interpreted to be is inseparable from biases and attitudes.  It’s the nature of the beast.  And majority rules.

But the judicial branch is another matter.  Its role is to objectively interpret the meaning of laws, apply laws to individual cases, and decide if laws violate the Constitution.   Objectivity requires impartiality … both as to who the parties are and what they stand for, as well as in the law’s interpretation.  If the law is unclear as written, judges can look to the history of legislation or the history of the Constitution to decide how to interpret it, but not to their own political and social (as opposed to legal) biases and attitudes.  They are always interpreting, not rewriting, the law.

The Supreme Court is often criticized by Conservatives for what they term “judicial activism,” which they claim is rewriting the Constitution.  Funny, though, that the phrase is used by the Right exclusively when criticizing a liberal decision by the Court.  When the Court does the same but leans to a conservative interpretation, the term is not applied.  

But the term “judicial activism” as criticism is bogus.  Legitimate activism is inherent when interpreting the Constitution.  It was written in the 18th century.  The founders had their philosophies and attitudes, revealed in the basic principles stated in the document, especially the Bill of Rights, but they knew they could not foresee how society and enterprise would be transformed over the centuries.  They wrote a document for the ages, and that necessarily requires that broad concepts be applied to situations never envisioned.

As an aside, I want to make clear my view that there are both liberal and conservative elements in the Constitution’s language.  The creation of this country and its founding documents was a working out of the tension between these two views of government.  The result was a grand compromise.  The Constitution may be a profoundly liberal document overall, but it has its conservative aspects.  There is no denying that.

Often when dealing with the Constitution, looking at the language and contemporary documents, while helpful, still leaves open the question of how it should be applied to modern circumstances.  To answer this question, the Court has taken cognizance of society’s current attitudes … sometimes explicitly, sometimes not … in determining its proper application.  This is quite different from justices interjecting their own attitudes and biases.  Referring to contemporary societal attitudes is more like asking what the founders would say in the current context.  This process does not disturb impartiality.

Let’s take two famous cases of activism as examples.  In Brown v Board of Education, the Court overturned its earlier “separate but equal” doctrine and declared that separate education was inherently unequal.  What brought about this changed interpretation?  

When Plessy v Ferguson was decided in 1896 and supported state-sponsored segregation (in this case of railroad cars), society was not ready for integration.  The Court, applying contemporary standards, stated that the 14th Amendment “could not have been intended to enforce social, as distinguished from political, equality or a commingling of the two races unsatisfactory to either.”  

And so they interpreted “equal protection of the law” quite narrowly and upheld the stated intent of Louisiana’s segregation statute as providing equal but separate accommodations. The Supreme Court, always wary of being too far in front of public opinion, conscious that they are not a legislative body, prefers to step lightly.

But the world and our society were in a different place in 1954 when Brown was decided.  Blacks were generations removed from slavery.  They were a part of society in a way that they weren’t in 1896.  While the South was still not ready for integration, the rest of the country had moved forward.  

And so the Court struck down segregation as being inherently unequal.  It wasn’t just a question of how much money was spent or the quality of education.  The very concept of the government separating the races in providing education flew in the face of the 14th Amendment’s guarantee of equal protection of the laws.

The point I’m making is that because society had changed, the interpretation of the meaning of the Constitution required a change.  The Court may in fact have been more liberal in 1954 than in 1896, but looking at the case objectively, they came to the correct decision.

Many people, especially in the South, were outraged at the decision and felt that their states’ rights had been trampled.  This was perhaps the first decision where the Court was viewed by many as stepping into the partisan arena, the issue of race clearly being a highly charged social issue.

But it is the task of the Supreme Court to decide whether a Federal or state law violates the Constitution.  The fact that it happens to involve a highly charged area of social, as opposed to political, life does not remove it from the jurisdiction of the Court.  Society had changed in the intervening six decades since Plessy and so the Court in Brown properly came to its decision.

The other case I would cite is Citizens United v FEC, the case that declared that corporations are “people” to whom the 1st Amendment of the Constitution applies.  Thus the right of free speech of corporations, and of other organizations such as labor unions, meant that they couldn’t be prohibited from spending money to influence elections through “independent” advertising in the 90-day period preceding an election.  

There was neither legal precedent nor contemporary documentation to support the decision that the right of free speech applies not just to individuals but to organizations.  Nor was this a question of society having changed in a way which required a change in interpretation.  Corporations had not become weak entities that needed free speech to protect themselves.

And mind you, the law that was struck down did not say that corporations couldn’t spend any money on issue ads; it just said that in the 90-day period prior to an election they couldn’t run ads that mentioned a candidate.  There was a rational fear that a deluge of corporate money into advertising during the period could easily tilt an election.  Corporations and organizations, after all, do not have the right to vote, and so they should not have the right to unduly influence elections.  

But the Court now had a distinctly Conservative majority on issues pertaining to business.  They said that free speech was so important to our democracy that corporations should have that Constitutional right, regardless of the lack of precedent, and so they struck down the law.  

This was a clear instance of the justices substituting their political judgment for that of Congress and also rewriting the Constitution.  This was the opposite of impartial justice. This was judicial activism that deserved to be criticized.

The American Bar Association Model Code of Judicial Conduct states in Canon 2.4 (b) that, “A judge shall not permit family, social, political, financial, or other interests or relationships to influence the judge’s judicial conduct or judgment.”  That’s close to what I’m saying in this post, although I think that “interests” is narrower, more circumscribed, than “attitudes and biases.”  

The actual Federal Code of Judicial Conduct, however, is unfortunately less helpful on this point; it also doesn’t apply to the Supreme Court.  It states that, “A judge … should not be swayed by partisan interests, public clamor, or fear of criticism.”  It further states that a judge should disqualify himself when he “has a personal bias or prejudice concerning a party.”  Advisory opinions regarding the Code all seem to deal with external evidence of perceived partiality … connections to groups or individuals … rather than actual partiality of a judge.

But if a judge’s membership in an association that takes public positions on controversial topics would raise questions regarding his impartiality, then it follows that his private biases and attitudes on such matters should not be brought to bear on a case, because doing so would disturb his impartiality.  I would urge that the point be stated unambiguously in all Codes of Judicial Conduct: judges must not apply their personal or political biases and attitudes to the cases before them.  Only then will impartial justice truly prevail.

Finally, I come back to the initial point I made in this post regarding the selection of judges.  If we want our judges to judge impartially, then how they are selected is of utmost importance.  Judges should be appointed for their neutrality, for their objectivity, not for their record of either being liberal or conservative.   

There are existing models for this.  At the state level, many judges are now appointed by non-partisan commissions using a merit selection plan.  Observers have long argued that this should be true for all state judges.

I would argue this should be true for all judges, regardless whether state or Federal.  A list of several candidates should be selected by non-partisan commissions, with the actual appointment then being made by the President/Governor.

The idea behind lifetime appointments for Federal judges was to remove them from the pressure of politics, which in one sense it certainly has.  But it hasn’t removed politics from the judicial process.  If judges take their political leanings with them onto the bench, they will apply political as well as personal biases and attitudes in rendering decisions, making those decisions examples of partiality, not impartiality.  Both the way judges are selected and the Codes of Judicial Conduct must be changed.