Late Night Thoughts: Love, Realism, Talents, Happiness, and More.

Terri, who occasionally comments on this blog, pointed out the other day in a discussion about compassion that some feelings or emotions are as strikingly beautiful as anything physical.  Of course, that is true not only of compassion, but also of love.  And to me, one of the most beautiful things about love is how it so often creates in us both a desire to improve the lives of our beloved, and a sensitivity to ways that might genuinely improve their lives.

When I composed the following poem, I had in mind more the desire to improve than the sensitivity to know what would improve.  Still, I think the poem works in its own way.

Love is an ancient thing
That travels back before gravity was born
And forward beyond the last gods.
I have wanted to sip your breast
In between the lights of night and day
And tell you how I’ve taken sides
Against a mammoth
To bring you his tusks
So that you, my woman, my love,
Will be happy now
For all the worlds
You have given to me.

Should love — any kind of love — really be thought of as a single emotion?  Is romantic love just one emotion?  Erotic love?  Mature or deeply attached love?

Perhaps erotic love is but a single emotion, lust, but how can you make the same case for the others?  Romantic, mature, and other kinds of love do seem to have many characteristics, rather than just one.  For instance, in addition to making us desire to improve someone’s life, don’t both romantic and mature love also make us feel greater tolerance for the differences that might exist between us and our beloved?

It’s a tricky question, I think, because perhaps they only make us overlook the differences, rather than actually make us willing to tolerate the differences.  Or are those the same thing?

Most people, I believe, stubbornly accept reality just as conscientiously as they accept their religion.  That is, only when it is convenient to do so, but then conscientiously.  Realism is not our main strength as a species.

Have you noticed that humans so seldom are what they want to be?  Yet so much of our happiness, I think, comes from accepting ourselves as we are.

All that striving to be what we are not seems to produce more unhappiness than anything else, because — while we can change ourselves around the edges — we have much greater difficulty changing our core nature.

But then, what is our core nature?

I don’t think I have the complete answer to that question, but surely part of the answer is that our core nature includes our talents.  By “talents” I do not mean our skills, but rather our raw predispositions to such things as athletics, mathematics, music, drawing, writing, dance, mechanics, etc.

A good way to tell if you have a talent for something is to ask yourself two questions.  First, “Do I like doing this?”  We usually like doing what we have a talent for doing.   Second, “Does it come comparatively easy to me?”  I think the key word here is “comparatively”.   If you don’t have a talent for, say, mathematics, but do have a talent for music, you will usually find that music comes a whole lot easier to you than math.   Answer those questions honestly, without wishful thinking, and you will most likely gain a pretty good idea of where your talents lie.  At least that’s been my experience.

In my view, pursuing one’s talents in life by working to turn them into actual skills is — all else being equal — not only conducive to happiness, but perhaps more important, conducive to a sense of meaning.

Now, all of this might seem like common sense, and so obvious it’s hardly worth mentioning, but I have met far too many people who were more or less clueless about their talents for me to think “it’s just common sense to know your talents”.

Why have so many people been ignorant of their own talents, though?

I think the single most important reason is that, in this matter, most of us listen way too much to the advice of others.  They usually mean well, but they don’t know you nearly as well as you yourself could — if you took a dispassionate look at yourself — know you.  Most often, other people of good will want what’s best for you, but their idea of what’s best for you is very heavily colored by what they know about what’s best for them.

The worst evil that you can do, psychologically, is to laugh at yourself. That means spitting in your own face.  — Ayn Rand

The main reason I think of Rand in something less than an entirely negative light is that several of my female friends have told me over the years that Rand helped them psychologically liberate themselves from the oppressive expectations and indoctrinations of the religious cults they grew up in.

While I think there are better — much better — authors than Rand for helping with that, I’m glad she did indeed help my friends realize just how greatly they had been lied to about their worth and potential as women.

Having said that, my overall impression of her is that she is squarely in the buffoon class of philosophers and social critics.  Indeed, I even think it was pretentious of her to have called herself a “philosopher” at all.  She did very little to push the envelope of rational thought, as the great philosophers have done.  But that’s a minor peeve of mine.  A greater reason for calling her a buffoon is that she could not laugh at herself.  Have you ever known a buffoon who genuinely could?

I am of the view that humor, in general, evolved as an adaptive mechanism.  To put it somewhat superficially here, it seems to me that humor greatly facilitates logical reasoning and attention to empirical evidence.  More specifically, it can play a key role in helping us to overcome our innate cognitive biases, egotistical attachments to our beliefs, and general intellectual inertia, in order to change our minds when we are wrong about something.  And changing our minds when we are wrong about something can have obvious benefits to our survival, although it is quite often extraordinarily difficult for us to do — and nearly impossible for those who lack any appreciable sense of humor at all.

In that regard, self-deprecating humor is no different from humor in general.  So far as I can recall, I’ve not yet in my sixty years met a man or woman who “took themselves too seriously” and who also deeply understood themselves.

There used to be a saying among firefighters that, for all I know, might still be current: “Never fight fire from ego.”  Both the men I worked with during the few years I fought fires and I profoundly distrusted anyone who “fought fire from ego”.  We knew they could too easily get themselves killed — or far worse, get someone else killed.

Today, forty or so years later, I still haven’t found anyone — whose ego has such a firm grip on them that they can’t laugh at themselves — that I would trust at my side in even a moderately demanding situation, let alone where my life might be on the line.  Yes, I know, I’m only thinking of myself here, but so be it.

Of course, you might want to make up your own mind about all that, rather than simply swallow what I say.  I have, after all, been certified as crazy by a group of scientists.  Personally, I don’t think the space alien scientists who have contacted me through my microwave know what they’re talking about, but it might still be reasonable of you to take my words — or anyone’s words, for that matter — with a bit of reflective thought, rather than reflexively.

Chat Room Love

She was beyond caring why she loved him.
Beyond any caring at all these days
Except when she was caring for him.
She had met him in chat that one morning
When the Winds were light with Spring’s fertile hopes,
And even I was somewhat less than purely skeptical
About the Great Slut-Goddess Love.

He seemed a man of passions to her.
Huge, thirsty, lusty, hungry passions:
For his typing was fast and prolific,
His “LOLs” were always the first
To pop up after anyone’s jokes;
His “BRBs” were inevitably shouted out in all caps;
And his exclamation points were as numerous as the shoals of fish
That once had been so abundant on the Grand Banks.

She did not know — how could she have known —
But she was hankering after a false man:
For he was not really the rich, sophisticated
Albanian fashion designer he represented himself to be,
But was instead a 74-year-old quality control engineer
For a failing American manufacturing company
That produced only a decrepit line of men’s genital waxes.

The Social Brain

“The trouble with practical jokes is that very often they get elected.”  ― Will Rogers

Politicians are not the only practical jokes that get elected.  A lot of bad ideas also “get elected”.  Get elected in the sense that they become as popular as cheap hamburgers, and more popular than much better ideas.

Social Darwinism is surely one of the worst ideas that humans have ever invented.   Humans are quite talented at inventing bad ideas, but talent alone lacks the necessary brilliance to have invented Social Darwinism.  No, Social Darwinism took genius.

There were actually several geniuses involved in the invention of Social Darwinism, a whole intellectual clusterfuck of them.  But perhaps William Graham Sumner was the most brilliant clusterfucker of that whole group.

In 1883, Sumner published a highly influential pamphlet entitled “What Social Classes Owe to Each Other”, in which he insisted that the social classes owe each other nothing, synthesizing Darwin’s findings with free enterprise Capitalism for his justification.  According to Sumner, providing assistance to those unequipped or under-equipped to compete for resources will lead to a country in which the weak and inferior are encouraged to breed more like them, eventually dragging the country down.  Sumner also believed that the American businessman was the best equipped to win the struggle for existence, and concluded that taxes and regulations serve as dangers to his survival.  [Source]

To be able to take an idea as brilliant as Darwin’s Theory of Evolution and turn it into an idea as hard-packed with stupidity as Social Darwinism is absolute genius.  Sumner might have been one of the people George Orwell had in mind when he said, “There are some ideas so absurd that only an intellectual could believe them”.

Anti-intellectualism is just as American as apple pie or selling diabetic horse urine as beer.  That does not mean, however, that Americans skeptically refuse to embrace the ideas of intellectuals.  No, in practice, it has meant only that Americans are so unfamiliar with intellectuals and their ideas that they can’t tell the good from the bad.  They are like those poor, sad folks who are so anti-sex they never develop whatever raw talent they might have for sex into being even moderately decent lovers, let alone dynamos between the bed sheets.  There is no other way to explain the continuing popularity in America of Sumner’s ideas.

Social Darwinism is many things but so often at the core of it is the notion that human evolution has been predominantly driven by intraspecies competition.  As it turns out, however, to say that intraspecies competition predominantly drove human evolution is just as absurd as saying that a dozen minutes of start-to-finish jackhammering is mainly all there is to sex.  There is so much more!

For a long time, scientists have known that the human brain is exceptionally large relative to body size.

Early attempts to explain the fact tended to focus on environmental factors and  activities.  Thus, humans were thought to have evolved large brains to facilitate banging rocks together in order to make tools, hunt animals, avoid predators,  think abstractly, and outsmart competitors for vital resources like food, territory, mates, and rocks.  This was known as the “ecological brain theory”.

Then, in 1992, the British anthropologist Robin Dunbar published an article showing that, in primates, the ratio of the size of the neocortex to that of the rest of the brain consistently increases with increasing social group size.

This strongly suggested that primate brains — very much including human brains — grew big in order to allow them to cope with living in social groups.  As a consequence of that and other research, the new “social brain theory” started replacing the old “ecological brain theory” in the hearts and minds of scientists.
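For anyone who wants to see the arithmetic behind that claim, here is a minimal sketch of Dunbar’s regression between neocortex ratio and group size.  The coefficients and the human neocortex ratio below are the figures commonly quoted from his 1992 paper, recalled from memory rather than checked against it, so treat the exact numbers as approximate; the point is only that a larger neocortex ratio predicts a larger social group.

```python
import math

# A back-of-the-envelope sketch of Dunbar's neocortex-ratio regression.
# The coefficients (0.093, 3.389) and the human neocortex ratio (~4.1) are
# commonly quoted approximations, not values checked against the paper.

def predicted_group_size(neocortex_ratio: float) -> float:
    """Predicted mean social group size from the neocortex-to-rest-of-brain ratio."""
    log10_group_size = 0.093 + 3.389 * math.log10(neocortex_ratio)
    return 10 ** log10_group_size

if __name__ == "__main__":
    human_neocortex_ratio = 4.1  # approximate
    print(f"Predicted human group size: {predicted_group_size(human_neocortex_ratio):.0f}")
    # Prints roughly 150 -- the figure popularly known as "Dunbar's number".
```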

We don’t have the biggest teeth, the sharpest claws, the fleetest feet, or the strongest muscles in nature.  But, as it happens, we are in most ways the single most cooperative species of all mammals, and in unity there is strength.  One human is usually no match for a lion even if he’s the most competitive human within a hundred miles.  But through cooperation we are able to achieve far more together than we ever could through competition.

I once saw a film in which a band of two dozen or so men and women chased a huge male lion into a thicket and killed it in just a few seconds with nothing more than pointed sticks.  That is the bare-minimum kind of cooperation that no doubt helped us to become the extraordinarily successful species we are today.

Even the fact we are able to (to some extent) reason abstractly might have much to do with our evolving as a social species.

Hugo Mercier and Dan Sperber have come up with the fascinating theory that reasoning evolved — not to nobly discern truths — but to persuade our fellow apes to cooperate with us, and to help us figure out when someone is telling us the truth.

Thus Mercier and Sperber begin with an argument against the notion that reasoning evolved to deliver rational beliefs and rational decisions:

The evidence reviewed here shows not only that reasoning falls quite short of reliably delivering rational beliefs and rational decisions. It may even be, in a variety of cases, detrimental to rationality. Reasoning can lead to poor outcomes, not because humans are bad at it, but because they systematically strive for arguments that justify their beliefs or their actions. This explains the confirmation bias, motivated reasoning, and reason-based choice, among other things.

In other words, those of us who wish in at least some cases to arrive at rational beliefs and rational decisions are somewhat in the position of a person who must drive screws with a hammer — the tool we have available to us (reason) did not evolve for the purpose to which we wish to employ it, and only by taking the greatest care can we arrive safely at our goal.  But I digress.

Mercier and Sperber go on to ask, “Why does reasoning exist at all, given that it is a relatively high-cost mental activity with a relatively high failure rate?”

They answer that reasoning evolved to assess the reliability and quality of what someone is telling you (“Is Joe telling me the truth, the whole truth, and nothing but the truth about his beer cellar?”), and also to enable you to persuade someone to do (or not do) something (“How do I talk Joe into giving me all his beer?”).  That is, reasoning evolved in a group context.  The implication is that we reason best and most reliably when we argue or debate with each other.

I have long thought that one of the reasons the sciences have demonstrated themselves to be all but the most reliable means of inquiry that we have ever invented — second only to getting baked on Colorado’s finest weed in order to ponder the “Big Questions” of life — is because the sciences rest on the principle of intersubjective verifiability.  Basically, you check my work, I’ll check yours, and together we might be able to get closer to the truth than either of us could get working alone.

When Thomas Hobbes was writing out his political philosophy in the 1600s, he embraced the sensible notion that any political system should be based on human nature, as opposed, say, to being based on what we might think some god or king wants us to have.  Hobbes, who often cooked up brilliant ideas, now proceeded to burn his meal, for he envisioned that human nature is essentially solitary.  He thought if you go back far enough in human history you will come to a time when people did not live in social groups, but alone.  There was no cooperation between people and it was instead “a war of all against all”.

Hobbes was not only wrong about that, he was very wrong about that.  What evidence we have suggests our species has always lived in groups, our ancestors always lived in groups, and their ancestors always lived in groups.  In fact you must go back at least 20 million years in evolutionary history before you find a likely ancestor of ours that might have been a loner.  Our brains have been evolving as specialized organs for dealing with each other for at least 20 million years, which is almost long enough to listen to every last complaint my two ex-wives have about me.  And hell, we’re only talking about their legitimate complaints!

Of course, the fact we are social animals does not mean we are hive animals.  We are very much individuals, so far as I can see.  But that means, among much else, that there is and always will be a tension or conflict between our social and our individual natures.

Before we started living in the first city-states about 6,500 years ago, we lived in relatively small hunting/gathering bands of 200 or so people at the most.  So far as we know today, the bands were mostly egalitarian.  Just about any way you can measure it, there wasn’t much social, political, or economic difference between people.  And the individual and society were probably in a fairly well balanced relationship with each other.  Then some killjoy invented the complex, hierarchical society of the city-states.  And the people of the time, instead of doing the rational thing and hanging him on the spot, let him get away with it.

From that infamous day forward, there have been very few times in history when the balance between the individual and society has favored the individual.  Most societies have been oppressive.  That needs to end.  Yet it needs to end in a way that restores a sane balance, not in a way that destroys societies through extreme individualism.

Women’s Sexuality: “Base, Animalistic, and Ravenous”

What is the future of our sexuality?

How, in twenty maybe forty years, will we be expressing ourselves sexually?

Do we have any clues today about what kind of sexuality tomorrow might bring?

And why did my second wife doze off on our wedding night just as I was getting to the climax of my inspiring lecture to her on Socrates’ concept of love?  After all, she positively begged me for some “oral sex”!  Doesn’t make a lick of sense she fell asleep in the midst of it.

I’ve been wondering about those and other questions this morning but not, as you might suspect, because I’ve been binge viewing Balinese donkey on donkey porn again.  What inspires me instead is the emerging consensus in the science of human sexuality.  That consensus strikes me as a game-changer.

It’s sometimes said that the early human sexuality studies of Kinsey, Masters and Johnson, paved the road to the Sexual Revolution of the 1960’s and 70’s.  It seems to me today’s new, still emerging consensus could be like that — or it could be even more seismic than what we’ve seen before.

What’s at the core of this is women’s sexuality, along with a growing body of research that strongly suggests women’s sexuality isn’t what most of us nearly the world over have been taught it is.

To be sure, nothing is going to happen overnight.  For one thing, any really profound cultural changes that result from this new understanding of women’s sexuality are almost certain to take generations to be fully realized.  Deep cultural change is seldom quick.  Yet, sometimes great storms are preceded by light rains blown ahead of the main storm, and something like that could happen here too.

For another thing, it’s always possible that the emerging consensus will fall apart.  The research seems to me solid so far, but as yet, not massive.

Some Old Ideas About Women’s Sexuality

To understand how the new science could transform our cultures, let’s first look at what’s at stake.  It seems that across many — but certainly not all — cultures there is a more or less shared set of beliefs about the differences between men’s and women’s sexuality.  Among these beliefs:

  • Women are naturally much less promiscuous than men.
  • Women naturally seek and need emotional intimacy and safety before they can become significantly horny.
  • Women naturally prefer to be pursued by men, rather than to do the pursuing.
  • Women are naturally pickier than men when choosing a sex partner.
  • Women are naturally less horny than men.
  • Women are naturally less likely than men to cheat on their partners.
  • Women are naturally more suited to monogamy than men.
  • Women are naturally more traumatized by divorce than men.
  • Even more traumatic for women than divorce is a night spent with Sunstone.

What seems to be happening is that, idea by idea, the old notions of how men and women differ in natural sexuality from each other are being challenged by the new science.  Sometimes the challenges merely qualify the old idea, usually by showing that, although the difference exists, it is largely due to culture and learning rather than to innate human nature.  At other times, the challenges threaten to overturn the old ideas completely.

Some New Ideas About Women’s Sexuality

Bergner, and the leading sex researchers he interviews, argue that women’s sexuality is not the rational, civilized and balancing force it’s so often made out to be — that it is base, animalistic and ravenous, everything we’ve told ourselves about male sexuality.  –Tracy Clark-Flory

I believe that when thinking about the emerging new consensus, the emphasis should be put on “emerging”.  There are so many questions yet to be answered that I do not believe it can as yet be definitively stated.  But at this stage, the following four points seem to me, at least, to best characterize the most important findings:

  • Women want sex far more than almost all of us are taught to believe.
  • Their sex drive is as strong as, or possibly even stronger than, men’s sex drive.
  • Their desire for sex does not always depend on their feeling emotionally intimate with — nor even safe with — their partner.
  • Women might be less evolved for monogamous relationships than men.

But do women know this about themselves?  There’s evidence that many women might not.  One such bit of evidence:

Dr. Meredith Chivers attempts to peek into the cage by sitting women in La-Z-Boy recliners, presenting them with a variety of pornographic videos, images, and audio recordings, and fitting their bodies with vaginal plethysmographs to measure the blood flow of desire. When Chivers showed a group of women a procession of videos of naked women, naked men, heterosexual sex, gay sex, lesbian sex, and bonobo sex, her subjects “were turned on right away by all of it, including the copulating apes.” But when it came time to self-report their arousal, the survey and the plethysmograph “hardly matched at all,” Bergner reports. Straight women claimed to respond to straight sex more than they really did; lesbian women claimed to respond to straight sex far less than they really did; nobody admitted a response to the bonobo sex. Physically, female desire seemed “omnivorous,” but mentally, it revealed “an objective and subjective divide.”

Women, it seems, might not be in tune with their physical desires when it comes to sex.  But if this is so, it should come as little or no surprise.

The Repression of Women’s Sexuality

While significant efforts to repress women’s (and often enough men’s) expression of their own sexuality are not found in every culture (e.g. the Mosuo), they seem to be found in all major cultures, and they range from shaming all the way up to female genital mutilation,  honor killing, and stoning.  Indeed, rape — which is a nearly ubiquitous behavior — can be seen as largely a form of repressing women’s sexuality, especially given how often it is justified in terms of “she asked for it”, meaning that she in some way or another expressed her sexuality in a manner the criminal(s) thought invited attack.

But those are merely the enforcement mechanisms for more subtle ways of repressing women’s sexuality.  Sexual ideologies seem to be the primary means of repression.  By “sexual ideologies” I mean in this context anything from full blown systems of thought about what is proper or improper, right or wrong, natural or unnatural about women’s sexuality to unorganized and unsystematic ideas and beliefs about their behavior.   For instance, advising young women not to wear short skirts doesn’t count by itself as a true ideology, but for the sake of convenience I’m lumping such advice into the same bucket as true ideologies here.

Sexual ideologies are perhaps even more effective than the gross enforcement mechanisms at repressing women.  If you can convince someone that it’s natural, right, and moral to suppress her sexual feelings, then you do not need to rely on the off chance you can catch and punish her for them if she fails to do so.  Ideally, you can even get her to suppress her feelings to the extent she no longer knows she even has them, because if you can do that, then she herself is apt to become something of a volunteer oppressor of other women, especially, say, in raising her daughters.

Nature, Mr. Allnut, is what we are put in this world to rise above.  — Rose Sayer, The African Queen (1951).

Disturbing Studies

Here are a few quick examples of the things being found out about women’s sexuality these days:

In surveys men routinely report having two to four times the number of sex partners that women report, which lends support to the notion that men are naturally more promiscuous than women.  But one study, published in 2003 in The Journal of Sex Research, found that when men were tricked into believing they were hooked up to a lie detector, the men reported the same number of sex partners as the women reported.  This is significant because it calls into question a fair body of research that is often cited in support of the notion women are less promiscuous on the whole than men.

A 2009 study published in Psychological Science found that pickiness seems to depend on whether a person is approached by a potential partner, or is themselves doing the approaching.  The experiment, conducted in a real-life speed-dating environment, showed that when men rotated through women who stayed seated in the same spot, the women were more selective about whom they chose to date.  When the women did the rotating, it was the guys who were pickier.  This implies that women’s choosiness might largely depend on circumstances, and not on innate nature.

In 2011, a study published in Current Directions in Psychological Science found that women liked casual, uncommitted sex just as much as men provided only that two conditions were first met: (1) the stigma of having casual sex needed to be removed, and (2) the women had to anticipate that the man would be a “great lover”.   Contrary to conventional wisdom, the women did not seem to need to feel emotionally intimate with the man in order to enjoy casual sex with him.

In 2015, evidence was published in the journal Biology Letters that both men and women fall into two more or less distinct groups: Those who prefer monogamy and those who prefer promiscuity.  Curiously, the sexes were about the same in terms of the proportions of men and women  who favored one or the other.  A slight majority of the men favored promiscuity, while a slight minority of the women did.  This would seem to undermine the notion that men as a group are markedly more promiscuous than women.

The journal Psychological Science published a 2006 study that found women in general are more flexible than men in their sexual orientations, and that the higher a woman’s sex drive, the more likely she was to be attracted to both sexes (the same was not true of men).

In 2006, the journal Human Nature reported that both men and women in new relationships experience about equal sexual desire for each other, but sometime between one and four years into the relationship, women’s sexual desire for their partners began to plummet (the same was not true of the men: their sexual desire held constant).  Two decades into committed relationships, only 20% of women remained sexually desirous of their partners.  Long term monogamy appears to sap a woman’s sex drive.   Ladies! Tired of the Same Old Same Old? Willing to dress up in a hen costume and squawk like a chicken?  Sunstone loves his rooster suit, and is currently available most evenings.  Simply call 1-800-BuckBuck! Motto: “He’s even more desperate than you are!”®

Disturbed Men

The new science has huge implications if it is indeed sound.  For instance, as hinted above, the sexual repression of women often enough depends on women buying into certain myths about their own sexuality, such as the myth that a woman’s sexuality, when compared to a man’s, is weaker, less urgent, less demanding.  If the myth is true, then an implication is women should sexually defer to their partners, place their own sexual needs on the back burner while tending to the needs of their man.

Yet, if the new science is sound, then men’s and women’s sex drives are more or less equal, and there is no longer any ideological reason for women not to demand their rightful share of the fun.  That seems to disturb some men.

I can think of any number of reasons why some men are disturbed or put off by sexually assertive women, but none of them are relevant enough to go into here.  Yet, it should be kept in mind that some men  — but not all — are disturbed by the notion that women, being by nature sexually equal to men, ought to have equal rights in bed.

There are other implications of the new science that men might find even more disturbing.  Perhaps the biggest has at its core how women’s unleashed sexuality could affect men’s reproductive success.  The new science might suggest to many men the fearful prospect that their liberated partners are now more likely to cuckold them.  That’s not a prospect most men are entirely blissful about.

Grand Sweeping Summary and Plea for Money

Acceptance of reality is not, actually,  one of our major strengths as a species.  Even if the new science proves over time to be sound, it’s unlikely to be accepted without a fight.

If you are like me, you believe more research is needed into women’s sexuality.  Much more research.  Moreover, you are keen on funding some of that research yourself!  Yes, this is your opportunity to send me on a mission of scientific discovery to my town’s finest strip joint, where I will be surveying and assessing how women express their sexuality through dance, while flirting with suffering a heart attack from the intrinsic excitement of doing science.  Simply email me to arrange a transfer of funds!

A Life that Passed Like a Wind

Thirty-four years ago last November, my former roommate, Dan Cohen, died at the age of 25. He was an extraordinary individual, and if you have a moment, I’d like to tell you a little bit about him.

Dan had the misfortune of being born a Thalidomide baby. He was significantly less than five feet tall, slightly hard of hearing, nearly blind but for his exceptionally thick glasses, and he had purple tinted teeth — which were always on display since his lips did not easily close over them. But the worst of it was that he had an exceptionally weak heart.

At the time I knew him, Dan could walk only a few hundred yards without stopping to rest because his heart would within that short distance pound like he’d run a marathon.

At an early age — maybe nine or ten — Dan’s doctors told his parents that, because of the weakness of his heart, he would most likely not live beyond 25 years old, which proved to be an accurate prediction. His parents made the decision to tell Dan what the doctors had told them, so Dan knew early on that he wasn’t going to live a long life.

I met Dan in college. He and I lived on the same dorm floor for a while. We became roommates because no one else on the floor wanted him as a roommate. Frankly, Dan was one of the messiest people I’ve ever known. But when he asked to become my roommate, I figured I could handle it on the one condition that he didn’t let any of his mess stray to my side of the room.

It wasn’t long before I learned that Dan’s one ambition in life was to learn everything he could possibly learn as fast as he could learn it. Because of his circumstances, the university allowed him to study anything he wanted to study without pressuring him to graduate. His official major was biology, but he took courses in every major field of science along with many courses in the humanities. He was an engaging thinker, and introduced me to many ideas that were new to me.

The only thing Dan seemed to like more than learning something new was a good joke. Most of our conversations were laced with his wit, and even to this day, I can hear in my mind his laughter.

He also had a well-informed empathy for the underdog, the oppressed, that I myself at the time did not fully share with him. For instance, he was deeply concerned with injustices suffered by the Palestinians.

We only roomed together for one year before I left the dorms. Then one freezing winter night, Dan got a phone call from the hospital. My brother was seriously ill and had been taken to the emergency room. Could Dan give them my new number?

As it happened, Dan only had my address, but not my phone number. Without apparent thought for himself, he set out past midnight, in the middle of a blizzard, to walk to my new home because he didn’t have cab fare and couldn’t find anyone who would lend him the money. It took him, he said, almost two hours to reach me. He had to stop every block or so and rest his heart in the freezing wind.

What impresses me most about the man was not the selfless, heroic effort he made to inform me of my brother’s hospitalization, but rather his extraordinary love for life, his courage, and his sensitivity to others.

Dan knew he didn’t have much time in this world, but I never once heard him complain about it. You can say life was unfair to him, but that’s not a judgement he himself ever gave an indication of harboring.

Instead, I only recall his passionate enthusiasm when he would toss out to me some new idea he’d had, or some bit of knowledge he’d discovered that day. I think he made the most of the tragic hand he was dealt in life, and over the years, he has become something of a personal inspiration to me.

Thank you for listening. I believe Dan deserves to be remembered.

When Logic Breaks Bad: Three Shocking Errors that Turn an Appeal to Authority into a Depraved Fallacy!


Before we properly begin today, dear reader, I feel it is my duty to inform you that some of the material we will be covering is scandalous in its nature.  I must therefore recommend that, as a simple but necessary precaution, you have handy your smelling salts.

With that said, let us now properly begin: I think it is reasonable of me to hazard that you, dear reader,  are a dedicated student of both logic and epistemology, just as am I.  Therefore, I can deduce that you, too, might be aware of the well established fact that poets are suspect.  Highly suspect.

Obviously, that is because poets far more often than not harbor theories of knowledge, truth, justification, and belief that are so poorly defined a sensible man or woman might be fully justified in believing the typical poet has never more than two or three times in his or her life studied an academic journal’s worth of articles in the blissful fields of logic and epistemology!

Now, it is quite understandable if you think I’m exaggerating, if you think such a thing is improbable.  But please trust me: I know what I’m talking about.  For, as it happens, I’m a worldly man, an experienced man, a man who has seen shocking things in his life, even things as appalling as the heartbreaking story of a poor, wretched, old homeless woman I once befriended only to discover in an especially poignant moment during one of our casual conversations that she was entirely ignorant of the simple distinction between a priori and a posteriori knowledge!

Instantly, I was so choked up that tears welled in my eyes and my voice failed me.  I could not even lecture her, let alone speak to her, and all I could think to do was mutely give her all the money I had on me — a hundred or so dollars — though I felt that was not possibly enough to console her.  Perhaps you can imagine how touched I was when the poor dear, bless her, pretended to be thrilled merely in order to comfort me.

You may be forgiven, dear reader, if you are now in something approaching a state of shock.  Yet, I fear what I have to say next will —  will, if you fail to stoutly brace yourself at once — topple you into madness or, worse, into committing a tu quoque, arguably the most easily avoided of all known fallacies of logic.

You see, poets now and then get it right!

A case in point: Byron by name; Lord by title; English by birth; poet doubtlessly by horrific accident.  His exact words were, “If I am a fool, it is, at least, a doubting one; and I envy no one the certainty of his self-approved wisdom.”

Now, a clear implication of “Byron’s Theorem”, as I like to call it,  is that we cannot absolutely rely on the authority of anyone, not even that of ourselves, for it is always possible for a human — or even dare I say, an epistemologist — to make a mistake.  Clearly, that is implied by the Theorem.

I will not go into the precise and exacting reasons why that is implied. I would only be repeating myself, for I engagingly lay out those reasons in a chapter in the second volume of my insightful sex manual for newlyweds, Towards an Epistemology of Carnal Knowledge: A Popular Guide to the Hot Topics Every Couple Lusts to Discuss on Their Wedding Night. Colorado Springs: Charging Bore Books, 2009. Print.  I confess though, the chapter might not fully satisfy your thirst for an in-depth discussion of Byron’s Theorem because the whole series itself is light reading targeted to a wide audience.  Unfortunately, the only country the volumes have sold well in is England.  Odd, that.  I suppose it must mean English epistemologists are tops.

As it happens, Byron’s Theorem matches a principle of deductive logic.  Since even an authority could be wrong, any and all deductive arguments that appeal to an authority in support of their conclusion are necessarily fallacious.  There are no exceptions.

Yet, the same is not true in inductive logic.  Inductive logic is far less strict in this matter than deductive logic and it allows for some appeals to authority.

But why does deductive logic forbid all appeals to authority while inductive logic permits some appeals?  At the risk of being slightly superficial, this is as short of an explanation as I can personally make of it:

♦ In a deductive argument, the conclusion necessarily follows from the premises if the premises are true.

♦ In an inductive argument, the conclusion probably follows from the premises if the premises are true.

♦ Even though an authority on some subject might usually be right, it is always possible that they could be wrong.

♦ Hence, appealing to an authority as a premise for a deductive argument is invalid because doing so would mean that the conclusion would not necessarily follow from that premise (since the authority could be wrong).

♦ But, in an inductive argument, appealing to an authority as a premise can be legitimate, because doing so makes the conclusion more probable (since the authority is probably right, even though he or she could still be wrong).

In brief, the reason appeals to authority are allowed in some inductive arguments, but not in any deductive arguments hinges on the difference between “likely to be true” and “must be true”.
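To put a rough number on that difference, here is a toy sketch of my own (not anything drawn from a logic text) that treats an inductive appeal to authority as a simple Bayesian update.  The reliability figures are invented purely for illustration; the only point is that an expert’s say-so can raise the probability of a claim without ever making it certain.

```python
# A toy Bayesian picture of an inductive appeal to authority.
# All numbers are hypothetical; they only illustrate "probably true" vs. "must be true".

def posterior_given_endorsement(prior: float, p_endorse_if_true: float, p_endorse_if_false: float) -> float:
    """Probability the claim is true, given that the expert endorses it (Bayes' theorem)."""
    numerator = p_endorse_if_true * prior
    denominator = numerator + p_endorse_if_false * (1.0 - prior)
    return numerator / denominator

if __name__ == "__main__":
    prior = 0.5                # how plausible the claim was before the expert spoke
    p_endorse_if_true = 0.9    # a good expert usually affirms claims that are true
    p_endorse_if_false = 0.1   # and rarely affirms claims that are false

    posterior = posterior_given_endorsement(prior, p_endorse_if_true, p_endorse_if_false)
    print(f"Probability after the expert weighs in: {posterior:.2f}")
    # Prints 0.90 -- stronger support than before, but still short of the certainty deduction demands.
```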

Now, I think I might safely say that just about any person who knows me, if asked to pick but one word with which to best describe my emotional side, would pick the word, “passionate”.  I am, after all, a man of passions, strong, towering passions — especially, say, when savoring an argument exquisitely formulated in doxastic logic that perhaps suggests to me a floral hint of orange blossom when I am perusing its axioms.

And like most people of a passionate nature, I sometimes wear my emotional side on my sleeve.  Thus, I should warn you, beloved reader, that we are about to embark upon a discussion of fallacies — a topic almost certain to provoke me, perhaps even provoke me to raw, untamed outbursts in which I might express my opinions with unusual force and vigor — even for the internet.  So I apologize in advance if my stormy language at such moments causes you the vapours.

Having given fair warning, I will now turn to three common ways in which an appeal to authority becomes a fallacy of logic, beginning with:

Citing Someone Who is Not an Actual Authority

Suppose you have a Valentine’s Day date with your cute research colleague in paleobotany.  You’ve already turned off the lab lights, romantically lit a couple Bunsen burners, slipped out of your lab coats, and ordered the pizza.  Now you and your colleague are gazing into each other’s eyes over the soft blue glow of the burners, exchanging witty small talk about the lab director’s fossilized pollen samples, when suddenly, out of nowhere, your beloved colleague cites Albert Einstein as an authority on post-glacial plant recolonization.

Alas!  The mood is broken.  But is there a way to recover it?  Yes!  The trick is to gently correct your colleague’s faux pas by pointing out to him or her that they have indulged in the fallacy of appealing to an irrelevant authority: Einstein, while an authority on physics, was not an authority on post-glacial plant recolonization.  That is, an appeal to an authority is only good if the authority’s expertise is in the relevant field.

Be sure to avoid harsh words, shocked expressions, and spontaneous squawks of disbelief when gently pointing out your colleague’s indulgence.  If you manage that, the rest of the evening can be saved.

Asserting that Authority is Proof

Suppose you and your friends have spent a few hours in the coffee shop pounding down the green teas while good naturedly bragging about the impressive lengths of your curricula vitae.  After legitimately citing a string of human resource personnel who’ve all said your c.v. was the longest they’ve ever seen, the dangerous levels of caffeine you’ve consumed finally get the better of you, and you blurt out, “That absolutely proves mine is the longest!”

Proves?  On the contrary, no number of authorities, no matter how many you cite, can actually prove your conclusion.  Instead, they merely support your conclusion.  The reason is that it is at least possible that all your authorities are wrong.  Thus, you can only say they make your conclusion probable; you cannot say that they make your conclusion necessary.

Tut! Tut!  Tut!

I must apologize now in case my nearly spontaneous outburst of passionate “tutting” has disturbed your composure.

Giving More Weight to a Minority of Experts than to an Opposing Majority of Experts

Suppose you are spending a pleasant sunny afternoon with your best friend in the park, lying on the green grass idly chitchatting about ancient Sumerian technologies.  Casually, you toss out the fascinating fact that 97% of the experts in the field agree that it was the Sumerians who first invented the sail.

Your friend nods agreeably and you are about to happily go on when a passing member of the merchant marine overhears your conversation and, quite unexpectedly, interrupts you to arrogantly list a mere half dozen or so authorities who disagree with the 97% consensus view you mentioned.

How can you correct the old salt without unduly embarrassing him?

Perhaps the best way to begin is to politely point out to him that his claim is extraordinary since it would seem highly unlikely for 97% of the authorities in a field to be wrong, while a mere 3% were right.   You should then remind him of the principle that “extraordinary claims require extraordinary evidence”, and then gently ask him to provide you with the top fourteen or so reasons he believes his half dozen or so experts have the edge on most of the rest of the researchers in the field.

You see, it is possible that his minority of experts are right and that your majority are wrong.  Yet, it is unlikely that’s the case.  And since an inductive argument rests on the likelihood of the evidence supporting the conclusion, the sailor is more or less obliged to add weight to his claim by going into detail about why his minority of experts are right after all.

To summarize, there are at least three common ways of turning an appeal to authority into a fallacy of logic.  Those ways are (1) citing someone who is not an actual authority during a romantic evening in the lab, (2) mistaking authority for actual proof of one’s conclusion while pounding down the green teas at the coffee shop, and (3) giving more weight to a minority of experts than to an opposing majority of experts without any justification for doing so while lying on a green lawn in the park.

Now to be sure, the mere fact that an argument contains a fallacy does not mean that the conclusion must be false.  It is quite possible for a fallacious argument to have a true conclusion.  However, one should get into the habit of considering fallacious arguments suspect, much as one is already in the habit of considering poets suspect.

This is because fallacious arguments tend to arrive at false conclusions, just as poets tend to arrive with scandalous frequency at radically speculative epistemologies.  I confess I have my days when I suspect poets seldom properly study The Philosophical Review at all!  How on earth they so often arrive at sharp insights and deep observations is simply beyond the grasp of any sensible man or woman.

How to Make Positive Thinking Work for You

When I gave positive thinking a try some decades ago and it didn’t work for me, I concluded it was for the other guy.  That is, I didn’t write it off for everyone, because too many people were telling me that it worked for them, but I did write it off for me.  I didn’t know then that I myself routinely indulged in a kind or species of positive thinking.

I had a mental habit — and I still do — of first daydreaming about something I wanted, such as honesty in politics, improving my painting skills, or — most often — seeing Terri’s breasts in the moonlight once again.  I’d let my mind wander, imagining her magnificently pleasing honesty in politics, etc., and all that those things meant or implied.  That was the positive thinking part of it.

Sooner or later, however, my mind would turn to assessing the problems and challenges involved in making those things happen.  How could I overcome those problems and challenges?  Sometimes I’d realize at that point that there were few if any practical ways of overcoming them (e.g. in the case of pure honesty in politics).  Yet, often enough, I’d come up with a workable plan to obtain my wishes.

That was and is my version of positive thinking.  It seems to be something that I long ago just lucked into, because I have no memory of it having been taught to me.   It turns out, though, that I’m not alone in doing it.

Gabriele Oettingen is a scientist who studies how people think about the future, and who writes about positive thinking, among other things.  Based on over twenty years of research, Oettingen has concluded:

While optimism can help us alleviate immediate suffering and persevere in challenging times, merely dreaming about the future actually makes people more frustrated and unhappy over the long term and less likely to achieve their goals. In fact, the pleasure we gain from positive fantasies allows us to fulfill our wishes virtually, sapping our energy to perform the hard work of meeting challenges and achieving goals in real life.

In a New York Times article that is well worth reading in its entirety, she writes:

My colleagues and I have since performed many follow-up studies, observing a range of people, including children and adults; residents of different countries (the United States and Germany); and people with various kinds of wishes — college students wanting a date, hip-replacement patients hoping to get back on their feet, graduate students looking for a job, schoolchildren wishing to get good grades. In each of these studies, the results have been clear: Fantasizing about happy outcomes — about smoothly attaining your wishes — didn’t help. Indeed, it hindered people from realizing their dreams.

But Oettingen does not recommend giving up on positive thinking entirely.

In a turn of events certain to astound and confuse my two ex-wives, I have actually gotten something right in my life.  I am sooo going to email this blog post to them! Of course, I am far above gloating about it, but it happens that Oettingen and her colleagues have discovered that combining positive thinking about one’s wishes with realistically thinking about the problems and challenges to obtaining one’s wishes is an effective way to realize those wishes.  At least, those wishes that are basically realizable in the first place.  The psychologists call it “mental contrasting”:

What does work better is a hybrid approach that combines positive thinking with “realism.” Here’s how it works. Think of a wish. For a few minutes, imagine the wish coming true, letting your mind wander and drift where it will. Then shift gears. Spend a few more minutes imagining the obstacles that stand in the way of realizing your wish.

This simple process, which my colleagues and I call “mental contrasting,” has produced powerful results in laboratory experiments. When participants have performed mental contrasting with reasonable, potentially attainable wishes, they have come away more energized and achieved better results compared with participants who either positively fantasized or dwelt on the obstacles.

When participants have performed mental contrasting with wishes that are not reasonable or attainable, they have disengaged more from these wishes. Mental contrasting spurs us on when it makes sense to pursue a wish, and lets us abandon wishes more readily when it doesn’t, so that we can go after other, more reasonable ambitions.

I think mental contrasting can help with far more than meeting near universally felt personal goals such as weight loss, job promotion, skill improvement, or smooching with Terri.  I think it can also help with such things as developing a realistic politics.  In fact, I’d argue that several of the American Founders were more or less masterful at reconciling their idealism with both eternal political realities and the circumstances of their time.  It’s my guess they did so by intuitively employing some form of mental contrasting.

Now, as long as we’re on the subject of getting what you want, I’d like to add here a second technique that I have personally found helpful.  I don’t know of any science, however, that either supports or discourages this second technique.  But, for whatever it might be worth, I’ve found it to be efficacious in obtaining one’s goals.  This is not a technique that I came up with on my own, though.

Thirty-five or so years ago, I was struggling at a job in corporate sales.  I wasn’t even coming close to making my monthly quotas, and perhaps the only two reasons I wasn’t fired for lack of performance were that most of my fellow salespeople were in the same boat (it was one tough industry to be a salesperson in!), and that the management of the company was ridiculously old-fashioned enough to care about its employees.  One way they showed that care was, instead of firing us all, to hire a sales coach.  An excellent coach, as it turned out.

Within about a year, I’d turned myself around.  But I didn’t fully realize by how much I had improved until the Chief Financial Officer took me aside at an employee meeting to inform me that in the first quarter of the year I had added more revenue to the company’s coffers than all the other salespeople combined.  The quarter after that, I beat my own new record by such a margin that I, who have always been the most hard-working, dedicated, and conscientious of employees, was able to negotiate an immediate month-long paid vacation.  “You sure don’t want me burnt out for the rest of the year, do you?  I needs me fishing time!”

I put my turnaround down to that coach, and to the fact I was one of the few salespeople who took his lessons to heart.  Maybe that was due to the fact he’d told me something revealing about the effectiveness of his methods: “Most people are either going to dismiss out of hand what I’m trying to show them, or they’re going to give it a single try, get their noses bloodied, and give up on it all.  The fact is, there’s a learning curve to these things.  You can’t expect to get it right the first time, nor even the second or third times.  It’s just like learning tennis: It takes a lot of practice to become good at it.”  I was determined not to give up on his lessons until I’d given them a fair shot.

I won’t go much into the first thing he taught me.  It revolutionized how I sold to people, and it’s probably the more important of his lessons, but it’s largely irrelevant in this context.  The second thing, though, is pertinent.

Simply put, my coach changed my thinking by defining a goal as “a lens through which one sees opportunities”.  I can no longer recall what I thought a goal was before then, but I do recall goals had always intimidated me.  Yet, after I began to practice his lesson in earnest, I no longer felt intimidated by them.

By “a lens through which one sees opportunities” he meant, in part, that you should become at least mildly obsessed with your goal.  You should start looking for ways to reach it everywhere and in everything.  Suppose, for instance, you sold furniture, and you were at a party during which someone mentioned to you that their girlfriend had just given them the ultimatum, “Get your books off the floor or I’m leaving!”.  If you were properly obsessing, you’d at once see that as an opportunity to sell them some of your shelving.

Besides making me alert to such straight-forward opportunities as that one, I found obsessing on my goal brought out my creativity.  I began seeing more and more obscure opportunities.  In the end, it was as if I couldn’t drive to work in the mornings without seeing at least a half dozen things that would pop ideas into my head about how to sell my service to some business or another.

Now to be sure, there was a downside to turning my goal into an obsession.  That was driven home to me in a WTF? moment one day.  I was waiting in my car at a stoplight, and I had just thought of a way my service could boost one of my client’s sales from repeat customers.  I wanted nothing more than to get to the office and call him for an appointment.  An old woman with a walker was slowly crossing the street when the light turned green.  Without thinking of her, but only of my goal, I started honking the horn.  Abruptly, I realized what a jerk I’d become!

So I think that, when turning a goal into an obsession,  you should bear in mind the dangers of becoming ruthless in your pursuit of it.   But, apart from that, the practice has served me well over the years.

Of course, when adopting or creating a goal for myself, I perform mental contrasting to understand it and the problems and challenges to realizing it.   I regret that I have no science for you that suggests seeing your goal as a lens through which to spot opportunities actually works for anyone other than me, but it might still be something you should give a try.  Just don’t start honking at people when they’re trying to cross the street!