Are We Humans Better Liars than Thinkers or Sages?

I am all but certain that somewhere, lying around in the minds of certain scientists today, is a hypothesis that accurately describes the origins of language.  That is, I’m nearly sure the origins have already been largely figured out by now.

I am also all but certain that, unless we invent time travel, or the gods both exist and decide to reveal their knowledge of its origins, or a genius quite improbably comes up with a mathematical proof of its origins, or — most likely these days — a FOX News personality stumbles across its origins while searching for ancient dirt on Barack Obama’s alleged War on Adam and Eve, we will never know with much more certainty than an astute guess whether any hypothesis of language’s origins is truly correct.

Yet, despite the improbability of actually discovering the origins of language,  various things about the fundamental nature of language and its uses suggest to insightful and very learned guess-a-tators such as myself that language might — or might not — have evolved from mating calls, that it might — or might not — have been preceded by singing, that it might — or might not — have evolved faster in women than in men, that it might — or might not — have had multiple causes for its development from mating calls (such as its use in promoting group cohesion and cooperation), and that it surely, certainly, and absolutely was used almost from “the very moment it was invented” to tell lies.

There are a variety of reasons to tentatively think that particular use for language developed early on.   Of all those various reasons, the only ones that interest me here are these two:  Humans lie with ease and great frequency, and they begin playing around with telling lies at tender ages. If lying didn’t develop early on, then why is it so behaviorally advanced in us?  Why are we so good at it?

It seems obvious to me that our brains are more advanced at lying than they are at many other things — such as doing math or science — for nearly every one of us lies with ease when he or she wants to, but so many of us struggle with critical, mathematical, or scientific thinking.

It also seems obvious to me that our brains are even less developed for wisdom than they are for critical, mathematical, or scientific thinking.  There are whole, vast areas of life in which, at most, only about one in ten or one in twenty of us frequently behave in ways that consistently show great wisdom.  That is, I’ve observed that even the village idiot now and then acts wisely, but I’ve also observed that the large majority of us have blind spots — whole areas of our lives — in which we are inconsistently wise, or even frequently fools.

Human relationships are usually a person’s most easily noticed blind spot.  Indeed, relationships are an area of life in which even those folks who most consistently behave towards others with great wisdom often stumble or fall, and if someone has learned to dance among us like a sage, you can be sure it took her an age of clumsy mistakes to learn her grace.

It seems likely that many people believe on some level that popularity is a sure sign of wisdom in dealing with others, and — if that were indeed the case — there would be a lot more people in this world who are wise about relationships than there really are, for there are certainly a lot of popular people.  Indeed, I myself can believe there is some small link between wisdom in relationships and popularity, but I cannot believe that link is more than a small one, if only because I’ve known too many fools who were popular, and too many comparably wise people who were not.

So I think the human brain is least of all evolved for wisdom, somewhat more evolved for critical, mathematical, or scientific thinking, and most of all evolved for lying.  And, likewise, it seems to me that language is best suited to lying, less suited to the sort of precision and exactness that one so often needs to communicate critical, mathematical, or scientific ideas, and least of all suited to communicating wisdom.  In fact, I’m pretty certain wisdom is not merely difficult, but extraordinarily difficult, to communicate, if it can be communicated at all.

For instance, this morning I came across a meme posted to a website that stated, “It’s better to be alone than to be in a bad relationship”.  The first thing I thought was, “That’s true for a number of reasons”, and the second thing I thought was, “Among those reasons, it is better to be alone than to be in a bad relationship because, ironically, we are more likely to suffer from intense loneliness when we are in a bad or abusive relationship than when we are by ourselves and alone.”  But the third thing I thought was, “If one does not already know the truth of these things, then one is unlikely to learn it from either the meme or from any other words spoken about it.  How often have I seen people plunge themselves into bad or abusive relationships, or refuse to leave one, primarily out of fear of being lonely?  At least a third to a half of the people I’ve known well in life have had at least one story of getting into a bad or abusive relationship and then delaying or even failing to leave it largely out of fear of being lonely.  Yet nearly everyone who actually left such a relationship has looked back and said to me, ‘I only wish I had left sooner, or had not gotten into that relationship at all.’  Not a single person has yet told me that being alone turned out to be lonelier than being in the relationship was.”

Now, I have heard people say that wisdom is “subjective” because there are no objective means for determining what is “right or wrong”.  But I think that might be a half-truth, and perhaps only a quarter-truth.  In many cases, all we need for wisdom to become objective is to pick a goal.  Once we have picked a goal, it so often becomes possible to know with a fair amount of assurance which actions will bring us to our goal, which actions will not, and even which actions will be more efficient or effective than others in doing so.

For instance, if our goal is to avoid for ourselves the worst of loneliness, then it is obvious that choosing to get into a bad or abusive relationship is not the wisest decision we can make, while remaining alone or getting into a healthy relationship is a wiser choice.  Of course, this assumes that it is true for us, even if for no one else, that we will feel lonelier in a bad or abusive relationship than we’d otherwise feel.  But that question can be answered objectively.

The choice of goal is ultimately subjective (but that should not distract us from the fact that we can many times objectively determine the wisest means to that goal).  And yet, it is only ultimately subjective, for goals themselves can be arranged in hierarchies so that a higher goal might determine whether or not one expresses or attempts to actualize a lower goal.
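To make that idea of a goal hierarchy a bit more concrete, here is a minimal sketch in Python — entirely my own illustration, with hypothetical goal names and made-up effectiveness scores — of how a higher goal might veto a lower one, and of how, once a goal is fixed, candidate actions can be compared objectively by how well they serve it.

```python
# A minimal sketch (an illustration, not a formal theory of wisdom):
# goals form a hierarchy, a higher goal can veto a lower one, and once a
# goal is fixed, actions can be compared by how well they serve it.

class Goal:
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent  # the higher goal, if any

    def is_worth_pursuing(self, endorses):
        # A lower goal is pursued only if every goal above it endorses it.
        goal = self.parent
        while goal is not None:
            if not endorses(goal, self):
                return False
            goal = goal.parent
        return True


def wisest_action(actions, effectiveness):
    """Given a fixed goal, pick the action judged most effective toward it."""
    return max(actions, key=effectiveness)


# Hypothetical example: the higher goal "avoid the worst of loneliness"
# governs whether the lower goal "be in a relationship" gets pursued at all.
avoid_loneliness = Goal("avoid the worst of loneliness")
be_in_relationship = Goal("be in a relationship", parent=avoid_loneliness)

def endorses(higher, lower):
    # Stand-in judgment: a bad relationship would defeat the higher goal.
    return lower.name != "be in a bad relationship"

if be_in_relationship.is_worth_pursuing(endorses):
    # Made-up scores standing in for an objective assessment of effectiveness.
    scores = {"stay alone": 1, "enter a healthy relationship": 2,
              "enter a bad relationship": -1}
    print(wisest_action(scores, scores.get))  # -> "enter a healthy relationship"
```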

In this blog post, I have been using the word “wisdom” as nearly synonymous with the phrase “most effective”.  Which, if I am being logically consistent, means that I harbor the somewhat dismal notion that our species of super-sized chimpanzees relatively excels at lying; is mediocre at critical, mathematical, or scientific thinking; and sucks the big potato at assessing the comparative effectiveness of various relevant behaviors, and then acting in accordance with those assessments, in order to bring about the most desired outcome.  If all of that is substantially true, then it naturally raises the question:  Why is it that we’re better liars than “thinkers” or sages?

Is it Human Nature to Murder People for Their Opinions?

Yesterday, May 12th, another blogger was murdered in Bangladesh:

The Bangladeshi humanist blogger and author Ananta Bijoy Das has been hacked to death in Sylhet by four masked men wielding machetes and cleavers. His murder is the fourth such attack in Bangladesh in the last three years and the third in as many months. Das had been working with the International Humanist and Ethical Union (IHEU) to gain asylum in Europe; just last week a visa application was denied by Swedish authorities.

In 2013, Ahmed Rajib Haider was hacked to death, while Asif Mohiuddin was stabbed several times and several bloggers were arrested. More recently, in February this year, Avijit Roy was also hacked to death while his wife Bonya Ahmed was severely injured. Then in March, Washiqur Rahman was also hacked to death. And today Ananta Bijoy Das has been killed in the same way.

The attacks previous to yesterday’s were motivated by a desire to suppress the bloggers’ opinions on the grounds that those opinions “defame Islam”. Although no one has yet come forward to explain the motive behind the most recent attack, it’s a pretty good guess that it was the same as the motive for the previous attacks.

There is a tendency to see these and other similar violent assaults as more or less peculiar to Muslims, or at least, as peculiar to religious fanatics of one faith or another. But the tendency to blame religion strikes me as a misleading one.

Religion might all too often add fuel to the fire, but the fire is already burning even before religion inflames it higher.

We humans have a very long history of irrationally suppressing opinions we find offensive, both through overt violence and through other means. Apart from murder, we also employ such means as shouting down the speaker, hounding them, ostracizing them, insulting them, or threatening them with various other repercussions if they persist in expressing their views. Such behavior is ubiquitous, and when a behavior is ubiquitous — when it is found in all places and at all times throughout history — it must be suspected of being a human trait, rather than merely a cultural, social, or individual one.

The fact – if it is indeed a fact – that the suppression of offensive opinions is grounded in human nature does not mean that the suppression is morally or ethically justified.

Human biology is not the sum of human destiny. We seem to be either unique or almost unique among animals in that we have brains capable of making decisions that run contrary to our instincts. Consequently, it cannot be truthfully said that, because suppressing opinions that offend us is human nature, doing so is either necessary or even inevitable. There is no escaping by that route the obligation to decide what is morally or ethically just.

The question thus comes down to this: What kinds of opinion, if any, can be morally or ethically suppressed?

Over a hundred years ago, John Stuart Mill provided what I regard as a sound answer to that question. The example he used to make his point involved the English corn merchants. They were the bankers of his day. The merchants were often reviled, especially by poor people. Poor people perceived that the merchants frequently manipulated the market to drive prices up, making corn unaffordable to many, and had much to say about the fact. In turn, the merchants took offense at the things said about them, and sought to have such speech criminalized. Mill came to the defense of free speech by arguing that no one had a right to suppress opinions on the mere basis that such opinions were offensive to them, for to be offended was not to suffer actual harm. Only if someone’s speech was an incitement to do actual harm to someone could it be morally suppressed.

I follow Mill in believing that offense is not a basis for suppressing someone’s opinions. However, the obvious counter to that position is to argue that offense is actually harmful to the offended party. And that is what the American philosopher Joel Feinberg did in the 1980s.

Feinberg argued that a person’s opinions can cause embarrassment, shame, fear, revulsion, shock, and so forth, in other people, and that those feelings can amount to actual harm done. He therefore urged that Mill’s “harm principle” be replaced with his “offense principle”.

Feinberg’s illiberal views seem to have been picked up on mostly by the radical Left. So far as I’ve heard, on many college campuses today, the notion that opinions which cause someone offense are actually injurious to them has largely prevailed over Mill’s harm principle. And this appears to have led to all sorts of notably stupid situations. For instance, Jonathan Chait writes in New York Magazine:

Last March at University of California–Santa Barbara, in, ironically, a “free-speech zone,” a 16-year-old anti-abortion protester named Thrin Short and her 21-year-old sister Joan displayed a sign arrayed with graphic images of aborted fetuses. They caught the attention of Mireille Miller-Young, a professor of feminist studies. Miller-Young, angered by the sign, demanded that they take it down. When they refused, Miller-Young snatched the sign, took it back to her office to destroy it, and shoved one of the Short sisters on the way.

Speaking to police after the altercation, Miller-Young told them that the images of the fetuses had “triggered” her and violated her “personal right to go to work and not be in harm.” A Facebook group called “UCSB Microaggressions” declared themselves “in solidarity” with Miller-Young and urged the campus “to provide as much support as possible.”

By the prevailing standards of the American criminal-justice system, Miller-Young had engaged in vandalism, battery, and robbery. By the logic of the p.c. [political correctness] movement, she was the victim of a trigger and had acted in the righteous cause of social justice. Her colleagues across the country wrote letters to the sentencing judge pleading for leniency. Jennifer Morgan, an NYU professor, blamed the anti-abortion protesters for instigating the confrontation through their exercise of free speech. “Miller-Young’s actions should be mitigated both by her history as an educator as well as by her conviction that the [anti-abortion] images were an assault on her students,” Morgan wrote. Again, the mere expression of opposing ideas, in the form of a poster, is presented as a threatening act.

The notion that mere images of aborted fetuses can rise to the level of “an assault” that might be justifiably defended against even by means of vandalism, battery, and robbery is, of course, a dangerous idea. But the notion is also a logical deduction from Feinberg’s offense principle.

Once you grant that anything which offends a person does actual harm to that person, that person is logically justified in taking action to prevent themselves from coming to harm. And the greater the potential harm, the more extreme the legitimate range of actions they can take. If your opinion on some matter, however trivial it might be to you, can cause me severe and lasting damage, then what prevents me from being morally justified when I resort even to violence in order to prevent that damage? What matter vandalism, battery, and robbery when done in “necessary” self-defense? Or if I feel sufficiently harmed, why should I not recruit three of my friends with whom to hack at you with machetes?

Although my example here has been an assault on free speech from the American Left, such assaults are by no means confined to any one ideology, movement, or politics.

The dangerous idea that we have a right to suppress opinions or ideas that offend us is a notion that is very likely to always be with us in one form or another, for it seems to be rooted in human nature itself, rather than merely in a particular religion, ideology, or society.

And that can be a scary thought, for the implication here is that all the world’s social or ideological progress might be little more than a veneer, and that a future age of illiberal barbarism is perhaps just as much a possibility as a future age of enlightened civilization. We will always have within us the genes for that barbaric age.

What Are the Politics of Human Instincts?

I recall that in the 1960s and ’70s it was popular in many circles to insist that human nature was uniquely malleable.  It was frequently said that, while other animals had many instincts, human instincts were few and far between.

Instead of instincts, human behavior was said to be governed solely by learning.  We lacked any instinct to have sex and had to learn to have it.  Again, we lacked any instinct for defending a territory and had to learn both the concept of a territory and to defend ours.  And so forth…

Learning and instinct were seen as oil and water:  They didn’t mix.  An animal’s behavior was either instinctual or it was learned.  If it was instinctual, then it was unvarying and reflexive.  If it was learned, then it was almost infinitely variable and far from reflexive.   The most widely used definitions of instinct at the time precluded just about any other interpretation.  Konrad Lorenz was around, but his pioneering work on instinctual behavior was not nearly so well understood and accepted as the work on learning of, say, B. F. Skinner.

My impression is that people believed humans had so few instincts because they wanted to think of our species as improvable.  The 60s and 70s were in many ways an optimistic time when folks thought humanity could fundamentally change for the better.  And, of course, if that was true, then it made sense to think that human behavior was limited only by what humans could learn.

There might also have been a bit of Christian theology underlying the expectations of scientists.  In Christianity, man occupies a unique place in nature.  He is the only animal who has a soul, and perhaps the only animal with free will.  I suspect the scientists of the 60s and 70s were unconsciously influenced by those beliefs.  Hence, they expected to find a human quite unlike the other animals — a human whose behavior was uniquely malleable, if not through free will, then through learning.

I only know a small handful of people today — mostly sociologists — who still deny that humans have any significant instincts.  Instincts are not always called “instincts” today.  Sometimes they are called “predispositions”, “behavioral tendencies”, “predilections”, or other terms.  But regardless of what name is used, you run across people everywhere talking about instinctual behavior.  Or, at least I do.

Some of the behaviors that one or another person has conceived of as instinctual to our species include tribalism, territorialism, war, rape,  reciprocity,  language, certain morals, and a belief in spirits and other supernatural entities.  Those and many other things have been thought of as  either instinctual or having a strong instinctual component.

There is much more to the history of human instincts than I have the space for here, but I think you can now get an approximate idea of the change in thinking about instincts that’s taken place over the past few decades.

In my own view, instincts and learning are not oil and water.  Instead, they mix.  Moreover, an instinct is not an unvarying reflex, but rather more like a predisposition towards a certain behavior.  If humans have an instinct for sex, that does not mean that humans will necessarily have sex every chance they get.  It does not mean that humans are like automatons who cannot vary their behavior in order to adapt to circumstances.  Instead, an instinct for sex means, among other things, that humans have a pronounced tendency towards having sex.

Politically, the notion of instinctual behavior in humans is potentially dangerous to liberty.  My guess is that it is only a matter of time before some inbred fool comes along to claim that his or her inherent instincts are superior to everyone else’s inherent instincts, giving his or her family a right to rule the rest of us for the next ten generations.  Minimum.  And of course, if that wannabe aristocrat has enough money, he or she will have many supporters.  In other words, the recognition that human behavior is not determined by learning — and learning alone — can seem to be an implicit recognition that some of us might be born better suited to govern than others of us.

On the other hand, it seems to me that liberty for everyone is justified on many grounds.  Thus, one does not need to prove that all people are born equal — or born with equal potential, as it were — to justify everyone possessing the same political liberties.

But what do you think?

America’s Future?

The economic crisis in advanced economies is accelerating the timeline in which big emerging nations like China rule the global economy. Instead of the market focusing on American shopping habits, they’ll be focused on consumers in Shanghai and Mumbai. Unless the US can recover the 8.5 million jobs it lost in the recession, and unless incomes begin rising, the US will be knocked off its pedestal within a generation.

In the US, the biggest problem is Washington. It is becoming clear that they work for maybe a hundred billionaires and five industry groups and that’s about it.

China still has a long way to go before it catches up with the US, and China is a command and control economy. China says that its style of economics is not for export, and other emerging nations, like Brazil, have not tried to emulate it. They don’t have to. Nor does India, or Thailand or Indonesia, for that matter. Their populations are getting richer, ours are getting poorer, with average incomes declining in 2009 and 2010, according to the US Census Bureau. Their corporations are investing at home and creating jobs; ours are either hamstrung from doing so, demanding more tax breaks from a revenue strapped government, or investing where the growth really is.

And where is it? Far and away from the US, new cities are being built, new industries, new entertainment centers rivaling Hollywood; new brands and a new middle class. In some of these countries, like Brazil, disparity between rich and poor is shrinking, not widening. It’s not Nirvana. It’s better. It’s worse. But it’s growing, and it’s hiring, and it is peaceful.

From “The Post-Western World”, posted in Forbes, by Kenneth Rapoza.

In making his case that the American reign is nearing its end, Rapoza quotes in his Forbes post from Noam Chomsky:

“It is a common theme” that the United States, which “only a few years ago was hailed to stride the world as a colossus with unparalleled power and unmatched appeal is in decline, ominously facing the prospect of its final decay,” Giacomo Chiozza writes in the current Political Science Quarterly.

The theme is indeed widely believed. And with some reason, though a number of qualifications are in order. To start with, the decline has proceeded since the high point of U.S. power after World War II, and the remarkable triumphalism of the post-Gulf War `90s was mostly self-delusion.

Another common theme, at least among those who are not willfully blind, is that American decline is in no small measure self-inflicted. The comic opera in Washington this summer, which disgusts the country and bewilders the world, may have no analogue in the annals of parliamentary democracy.

The spectacle is even coming to frighten the sponsors of the charade. Corporate power is now concerned that the extremists they helped put in office may in fact bring down the edifice on which their own wealth and privilege relies, the powerful nanny state that caters to their interests.

From “America in Decline”, posted in Nation of Change, by Noam Chomsky.

It’s a strange day when thinkers such as Rapoza and Chomsky, who sit at opposite ends of the ideological spectrum, agree about America’s prospects over the next 10 or 20 years.

Both articles are worth reading in their entirety.

Why Did Humans Invent the Gods?

I think I’m headed in the direction of becoming a very disagreeable old man.  I think that might happen to me because I have a number of pet peeves.  Peeves that are meaningful only to me — but which I increasingly lack the wisdom to keep to myself.  And one of those pet peeves became inflamed tonight.

I have for years held the opinion — rabidly held the opinion — that E. B. Tylor was mistaken. Tylor, who was born in 1832, was the anthropologist who coined the notion that the gods were invented to explain things.

I don’t think Tylor had any real evidence for his notion that the gods were invented to explain things.  I agree with those folks who say he was speculating.  Yet, his notion can seem plausible.  And I suppose that’s why it has caught on.  So far as I can see, Tylor’s notion is the single most popular explanation for the invention of deities.

Basically, his notion goes like this:  Primitive humans did not have the science to know what caused thunder, so they invented a god that caused thunder.  In that way, their natural curiosity was satisfied.  Again, primitive humans did not know what caused love, so they invented a god that caused love.  And so forth.

Tylor’s views spawned the notion that the gods would sooner or later go away because science would sooner or later replace them as an explanation for things.  Of course, that hasn’t happened.

A number of scientists have come up with much more interesting theories about the origins of deity than Tylor came up with.  But those theories haven’t had the time to catch on as widely as Tylor’s. Nevertheless, the gist of the current thinking is that our brains are somewhat predisposed to belief in supernatural things — from ghosts to gods.  I have posted about those new notions here and here, but for a more comprehensive look at the new notions, see the recommended readings at the end of this post.

__________________________________

Recommended Readings:

Andrew Newberg and Eugene D’Aquili, Why God Won’t Go Away: Brain Science and the Biology of Belief.

Scott Atran, In Gods We Trust: The Evolutionary Landscape of Religion.

Some Reactions to the Debt Ceiling Deal

We currently have a deeply depressed economy. We will almost certainly continue to have a depressed economy all through next year. And we will probably have a depressed economy through 2013 as well, if not beyond.

The worst thing you can do in these circumstances is slash government spending, since that will depress the economy even further. Pay no attention to those who invoke the confidence fairy, claiming that tough action on the budget will reassure businesses and consumers, leading them to spend more. It doesn’t work that way, a fact confirmed by many studies of the historical record.

From “The President Surrenders”, posted in The New York Times, by Paul Krugman.

◄►

How can the leader of the Democratic Party wage an all-out war on the ostensible core beliefs of the Party’s voters in this manner and expect not just to survive, but thrive politically?  Democratic Party functionaries are not shy about saying exactly what they’re thinking in this regard:

Mark Mellman, a Democratic pollster, said polling data showed that at this point in his term, Mr. Obama, compared with past Democratic presidents, was doing as well or better with Democratic voters. “Whatever qualms or questions they may have about this policy or that policy, at the end of the day the one thing they’re absolutely certain of — they’re going to hate these Republican candidates,” Mr. Mellman said. “So I’m not honestly all that worried about a solid or enthusiastic base.”

In other words: it makes no difference to us how much we stomp on liberals’ beliefs or how much they squawk, because we’ll just wave around enough pictures of Michele Bachmann and scare them into unconditional submission. That’s the Democratic Party’s core calculation: from “hope” in 2008 to a rank fear-mongering campaign in 2012.

From “Democratic Politics in a Nutshell”, posted on Salon.com, by Glenn Greenwald.

◄►

Overheard: Thanks for protecting the job creators, you know for creating jobs for chauffeurs, valets, domestic help, and most importantly seamstresses who specialize in crafting $100 bills into luxurious overcoats. ~ Jim Cutler

From “US Debt Highlights ~ The Day After…”, posted on The Bis Key Chronicles, by Gandalfe.

◄►

The protracted negotiations over the debt ceiling, as well as the final package agreed to by President Obama and the congressional leadership, show what happens when a small minority is allowed to gain control over national debate. While polls consistently show that the vast majority of the public sees jobs as the main problem facing the economy, there has been a well-funded crusade to ignore public opinion and make cuts to social insurance programs and other spending the top priority for Congress and the President.

From “Statement on the Debt Ceiling Deal”, posted on The Center for Economic and Policy Research website, by Dean Baker.

◄►

“Shame on you! you who make unjust laws and publish burdensome decrees, depriving the poor of justice, robbing the weakest of my people of their rights, despoiling the widow and plundering the orphan. What will you do when called to account, when ruin from afar confronts you? To whom will you flee for help?”

– Isaiah 10:1-3

From “A Warning to Congress”, posted on Dover Beach, by Θεόφιλος.

 

Can a Person Who Is Alienated from Themselves Find Happiness?

Most people are other people.  Their thoughts are someone else’s opinions, their lives a mimicry, their passions a quotation.

–Oscar Wilde

In Arctic Dreams, Barry Lopez somewhere talks about an Inuit word for a wise person.  The word, if I recall, means “someone who through their behavior creates an atmosphere in which wisdom is made tangible.”  When I read Lopez a few years ago, I thought of Paul Mundschenk.  As I recall, I never once heard him claim to possess, say, compassion, good faith in others, or kindness.  Yet, he embodied those virtues, as well as others: He made them visible.

Mundschenk was a professor of Comparative Religious Studies, and, as you might imagine, I found him inspiring.  But not inspiring in the sense that I wanted to be like him.  Rather, inspiring in the sense that he showed me that certain virtues could be honest and authentic.  I was a bit too cynical as a young man to see much value in compassion, good faith, kindness, and so forth.  I thought intelligence mattered an order of magnitude more than those things.  Yet, because of Mundschenk, and a small handful of other adults, I could only deny the value of those virtues, not their authenticity.

I can see in hindsight how I naively assumed at the time that we all grow up to be true to ourselves.  Isn’t it normal for a young man or woman to make that assumption, though?  Aren’t most youth slightly shocked each time they discover that yet another adult who matters to them is, in some way, a fake?

Perhaps it’s only when we ourselves become adults that we eventually accept that most of us are less than true to ourselves, for by that time we have so often discovered what we consider good reasons not to be true to ourselves.

If that’s the case, then I think there might be a sense in which Mundschenk never grew up.  That is, he just gave you the impression of a man who had never accepted the common wisdom that he must put on a front to get on in the world.  He had an air of innocence about him, as if it had somehow simply escaped his notice that he ought to conform to the expectations of others, and that any of us who refuses to do so is asking for all sorts of trouble.

Now, to be as precise as a dentist when untangling the inexplicably tangled braces of a couple of kids the morning after prom night, Mundschenk did not seem a defiant man.  He was anything but confrontational.  Rather, his notably open and honest individualism seemed deeply rooted in a remarkable indifference to putting on any fronts or airs.  He simply couldn’t be bothered to conform.

Often, when I remember Mundschenk, I remember the way he shrugged.  I remember some folks for their smiles, others for their voices, but Mundschenk for his shrug.  It seemed to hint of Nature’s indifference, but without the coldness.  Which, I guess, makes me wonder: Is there anything unusual about someone who is both notably indifferent to himself and notably true to himself?

I was put in mind of Paul Mundschenk this morning because of a post I wrote for this blog three years ago.  The post was intended to be humorous, but I titled it “An Advantage of Being Cold and Heartless?”.  Consequently, the post gets two or three hits each day from people looking for advice on how to make themselves cold and heartless.

I can imagine all sorts of reasons someone might want to make themselves cold and heartless.  Perhaps someone they are on intimate terms with — a parent, a sibling, a spouse, a partner — is wounding them.  Or perhaps they are among the social outcasts of their school.  But whatever their reasons, they google search strings like, “How do I make myself cold and heartless?”

Nowadays, I think it is a mistake to try to make yourself tough, cold, heartless, or otherwise insensitive.  But I certainly didn’t think it was a mistake 30 years ago, when I was a young man.

Yet, I see now how my values and priorities in those days were not largely derived from myself, but from others.  The weight I placed on intelligence, for instance, came from a fear that others might take advantage of me if I was in any way less intelligent than them.  I valued cleverness more than compassion and kindness because I thought cleverness less vulnerable than compassion and kindness.  And I carried such things to absurd extremes: I can even recall thinking — or rather, vaguely feeling — that rocks were in some sense more valuable than flowers because rocks were less vulnerable than flowers.  The truth never once occurred to me: What we fear owns us.

It seems likely that when someone seeks to make themselves insensitive, they are seeking to protect themselves, rather than seeking to be true to themselves.   If that’s the case, then anyone who tries to make themselves less sensitive than they naturally are runs the risk of alienating themselves from themselves.

Can a person who is significantly alienated from themselves be genuinely happy?  I have no doubt they can experience moments of pleasure or joy, but can they be deeply happy?  It’s an interesting question, isn’t it?  Perhaps a little bit like asking whether someone who wants a melon will feel just as happy with a pepper instead.