Chat Room Love

She was beyond caring why she loved him.
Beyond any caring at all these days
Except when she was caring for him.
She had met him in chat that one morning
When the Winds were light with Spring’s fertile hopes,
And even I was somewhat less than purely skeptical
About the Great Slut-Goddess Love.

He seemed a man of passions to her.
Huge thirsty, lusty, hungry passions:
For his typing was fast and prolific,
His “LOLs” were always the first
To pop up after anyone’s jokes;
His “BRBs” were inevitably shouted out in all caps;
And his exclamation points were as numerous as the shoals of fish
That once had been so abundant on the Grand Banks.

She did not know — how could she have known —
But she was hankering after a false man:
For he was not really the rich, sophisticated
Albanian fashion designer he represented himself to be,
But was instead, a 74-year-old quality control engineer
For a failing American manufacturing company
That produced only a decrepit line of men’s genital waxes.

The Social Brain

“The trouble with practical jokes is that very often they get elected.”  ― Will Rogers

Politicians are not the only practical jokes that get elected.  A lot of bad ideas also “get elected”.  Get elected in the sense that they become as popular as cheap hamburgers, and more popular than much better ideas.

Social Darwinism is surely one of the worst ideas that humans have ever invented.   Humans are quite talented at inventing bad ideas, but talent alone lacks the necessary brilliance to have invented Social Darwinism.  No, Social Darwinism took genius.

There were actually several geniuses involved in the invention of Social Darwinism, a whole intellectual clusterfuck of them.  But perhaps William Graham Sumner was the most brilliant clusterfucker of that whole group.

In 1883, Sumner published a highly influential pamphlet entitled “What Social Classes Owe to Each Other”, in which he insisted that the social classes owe each other nothing, synthesizing Darwin’s findings with free enterprise Capitalism for his justification.  According to Sumner, providing assistance to those unequipped or under-equipped to compete for resources will lead to a country in which the weak and inferior are encouraged to breed more like them, eventually dragging the country down. Sumner also believed that the best equipped to win the struggle for existence was the American businessman, and concluded that taxes and regulations serve as dangers to his survival.  [Source]

To be able to take an idea as brilliant as Darwin’s Theory of Evolution and turn it into an idea as hard-packed with stupidity as Social Darwinism is absolute genius.  Sumner might have been one of the people George Orwell had in mind when he said, “There are some ideas so absurd that only an intellectual could believe them”.

Anti-intellectualism is just as American as apple pie or selling diabetic horse urine as beer.  That does not mean, however, that Americans skeptically refuse to embrace the ideas of intellectuals.  No, in practice, it has meant only that Americans are so unfamiliar with intellectuals and their ideas that they can’t tell the good from the bad.  They are like those poor, sad folks who are so anti-sex they never develop whatever raw talent they might have for sex into being even moderately decent lovers, let alone dynamos between the bed sheets.  There is no other way to explain the continuing popularity in America of Sumner’s ideas.

Social Darwinism is many things but so often at the core of it is the notion that human evolution has been predominantly driven by intraspecies competition.  As it turns out, however, to say that intraspecies competition predominantly drove human evolution is just as absurd as saying that a dozen minutes of start-to-finish jackhammering is mainly all there is to sex.  There is so much more!

For a long time, scientists have known that the human brain is exceptionally large relative to body size.

Early attempts to explain the fact tended to focus on environmental factors and  activities.  Thus, humans were thought to have evolved large brains to facilitate banging rocks together in order to make tools, hunt animals, avoid predators,  think abstractly, and outsmart competitors for vital resources like food, territory, mates, and rocks.  This was known as the “ecological brain theory”.

Then, in 1992, the British anthropologist Robin Dunbar published an article showing that, in primates, the ratio of the size of the neo-cortex to that of the rest of the brain consistently increases with increasing social group size.

This strongly suggested that primate brains — very much including human brains — grew big in order to allow them to cope with living in social groups.  As a consequence of that and other research, the new “social brain theory” started replacing the old “ecological brain theory” in the hearts and minds of scientists.

We don’t have the biggest teeth, the sharpest claws, the fleetest feet, the strongest muscles in nature.  But, as it happens, we are in most ways the single most cooperative species of all mammals, and in unity there is strength.  One human is usually no match for a lion even if he’s the most competitive human within a hundred miles. But through cooperation we are able to achieve more together than we can achieve through competition.

I once saw a film in which a band of two dozen or so men and women chased a huge male lion into a thicket and killed it in just a few seconds with nothing more than pointed sticks.   That is the bare minimum kind of cooperation that no doubt helped us to become the extraordinarily successful species we are today.

Even the fact we are able to (to some extent) reason abstractly might have much to do with our evolving as a social species.

Hugo Mercier and Dan Sperber have come up with the fascinating theory that reasoning evolved — not to nobly discern truths — but to persuade our fellow apes to cooperate with us, and to help us figure out when someone is telling us the truth.

Thus Mercier and Sperber begin with an argument against the notion that reasoning evolved to deliver rational beliefs and rational decisions:

The evidence reviewed here shows not only that reasoning falls quite short of reliably delivering rational beliefs and rational decisions. It may even be, in a variety of cases, detrimental to rationality. Reasoning can lead to poor outcomes, not because humans are bad at it, but because they systematically strive for arguments that justify their beliefs or their actions. This explains the confirmation bias, motivated reasoning, and reason-based choice, among other things.

In other words, those of us who wish in at least some cases to arrive at rational beliefs and rational decisions are somewhat in the position of a person who must drive screws with a hammer — the tool we have available to us (reason) did not evolve for the purpose to which we wish to employ it, and only by taking the greatest care can we arrive safely at our goal.  But I digress.

Mercier and Sperber go on to ask, “Why does reasoning exist at all, given that it is a relatively high-cost mental activity with a relatively high failure rate?”

They answer that reasoning evolved to assess the reliability and quality of what someone is telling you (“Is Joe telling me the truth, the whole truth, and nothing but the truth about his beer cellar?”), and also to enable you to persuade someone to do (or not do) something (“How do I talk Joe into giving me all his beer?”).   That is, reasoning evolved in a group context.  The implication is that we reason best and most reliably when we argue or debate with each other.

I have long thought that one of the reasons the sciences have demonstrated themselves to be all but the most reliable means of inquiry that we have ever invented — second only to getting baked on Colorado’s finest weed in order to ponder the “Big Questions” of life — is because the sciences rest on the principle of intersubjective verifiability.  Basically, you check my work, I’ll check yours, and together we might be able to get closer to the truth than either of us could get working alone.

When Thomas Hobbes was writing out his political philosophy in the 1600s, he embraced the sensible notion that any political system should be based on human nature, as opposed, say, to being based on what we might think some god or king wants us to have.   Hobbes, who often cooked up brilliant ideas, now proceeded to burn his meal, for he envisioned that human nature is essentially solitary.  He thought if you go back far enough in human history you will come to a time when people did not live in social groups, but alone.  There was no cooperation between people and it was instead “a war of all against all”.

Hobbes was not only wrong about that, he was very wrong about that.  What evidence we have suggests our species always lived in groups, our ancestors always lived in groups, and their ancestors always lived in groups.  In fact you must go back at least 20 million years in evolutionary history before you find a likely ancestor of ours that might have been a loner.  Our brains have been evolving as specialized organs for dealing with each other for at least 20 million years, which is almost long enough to listen to every last complaint my two ex-wives have about me.  And hell, we’re only talking about their legitimate complaints!

Of course, the fact we are social animals does not mean we are hive animals.  We are very much individuals, so far as I can see.  But that means, among much else, that there is and always will be a tension or conflict between our social and our individual natures.

Before we started living in the first city-states about 6,500 years ago, we lived in relatively small hunting/gathering bands of 200 or so people at the most.  So far as we know today, the bands were mostly egalitarian.  Just about any way you can measure it, there wasn’t much social, political, or economic difference between people.  And the individual and society were probably in a fairly well balanced relationship with each other. Then some killjoy invented the complex, hierarchical society of the city-states.   And the people of the time, instead of doing the rational thing and hanging him on the spot, let him get away with it.

From that infamous day forward, there have been very few times in history when the balance between the individual and society has favored the individual.  Most societies have been oppressive.  That needs to end.   Yet end in a way that restores a sane balance, not in a way that destroys societies through extreme individualism.

Women’s Sexuality: “Base, Animalistic, and Ravenous”

What is the future of our sexuality?

How, in twenty maybe forty years, will we be expressing ourselves sexually?

Do we have any clues today about what kind of sexuality tomorrow might bring?

And why did my second wife doze off on our wedding night just as I was getting to the climax of my inspiring lecture to her on Socrates’ concept of love?  After all, she positively begged me for some “oral sex”!  Doesn’t make a lick of sense she fell asleep in the midst of it.

I’ve been wondering about those and other questions this morning but not, as you might suspect, because I’ve been binge viewing Balinese donkey on donkey porn again.  What inspires me instead is the emerging consensus in the science of human sexuality.  That consensus strikes me as a game-changer.

It’s sometimes said that the early human sexuality studies of Kinsey, Masters and Johnson, paved the road to the Sexual Revolution of the 1960s and ’70s.  It seems to me today’s new, still emerging consensus could be like that — or it could be even more seismic than what we’ve seen before.

What’s at the core of this is women’s sexuality, along with a growing body of research that strongly suggests women’s sexuality isn’t what most of us nearly the world over have been taught it is.

To be sure, nothing is going to happen overnight.  For one thing, any really profound cultural changes that result from this new understanding of women’s sexuality are almost certain to take generations to be fully realized.  Deep cultural change is seldom quick.  Yet, sometimes great storms are preceded by light rains blown ahead of the main storm, and something like that could happen here too.

For another thing, it’s always possible that the emerging consensus will fall apart.  The research seems to me solid so far, but as yet, not massive.

Some Old Ideas About Women’s Sexuality

To understand how the new science could transform our cultures, let’s first look at what’s at stake.  It seems that across many — but certainly not all — cultures there is a more or less shared set of beliefs about the differences between men’s and women’s sexuality.  Among these beliefs:

  • Women are naturally much less promiscuous than men.
  • Women naturally seek and need emotional intimacy and safety before they can become significantly horny.
  • Women naturally prefer to be pursued by men, rather than to do the pursuing.
  • Women are naturally pickier than men when choosing a sex partner.
  • Women are naturally less horny than men.
  • Women are naturally less likely than men to cheat on their partners.
  • Women are naturally more suited to monogamy than men.
  • Women are naturally more traumatized by divorce than men.
  • Even more traumatic for women than divorce is a night spent with Sunstone.

What seems to be happening is that, idea by idea, the old notions of how men and women differ in natural sexuality from each other are being challenged by the new science.  Sometimes the challenges merely qualify the old idea, usually by showing that, although the difference exists, it is largely due to culture and learning rather than to innate human nature.  At other times, the challenges threaten to overturn the old ideas completely.

Some New Ideas About Women’s Sexuality

Bergner, and the leading sex researchers he interviews, argue that women’s sexuality is not the rational, civilized and balancing force it’s so often made out to be — that it is base, animalistic and ravenous, everything we’ve told ourselves about male sexuality.  –Tracy Clark-Flory

I believe that when thinking about the emerging new consensus, the emphasis should be put on “emerging”.  There are so many questions yet to be answered that I do not believe it can as yet be definitively stated.  But at this stage, the following four points seem to me, at least, to best characterize the most important findings:

  • Women want sex far more than almost all of us are taught to believe.
  • Their sex drive is as strong as, or possibly even stronger than, men’s sex drive.
  • Their desire for sex does not always depend on their feeling emotionally intimate with — nor even safe with — their partner.
  • Women might be less evolved for monogamous relationships than men.

But do women know this about themselves?  There’s evidence that many women might not.  One such bit of evidence:

Dr. Meredith Chivers attempts to peek into the cage by sitting women in La-Z-Boy recliners, presenting them with a variety of pornographic videos, images, and audio recordings, and fitting their bodies with vaginal plethysmographs to measure the blood flow of desire. When Chivers showed a group of women a procession of videos of naked women, naked men, heterosexual sex, gay sex, lesbian sex, and bonobo sex, her subjects “were turned on right away by all of it, including the copulating apes.” But when it came time to self-report their arousal, the survey and the plethysmograph “hardly matched at all,” Bergner reports. Straight women claimed to respond to straight sex more than they really did; lesbian women claimed to respond to straight sex far less than they really did; nobody admitted a response to the bonobo sex. Physically, female desire seemed “omnivorous,” but mentally, it revealed “an objective and subjective divide.”

Women, it seems, might not be in tune with their physical desires when it comes to sex.  But if this is so, it should come as little or no surprise.

The Repression of Women’s Sexuality

While significant efforts to repress women’s (and often enough men’s) expression of their own sexuality are not found in every culture (e.g. the Mosuo), they seem to be found in all major cultures, and they range from shaming all the way up to female genital mutilation,  honor killing, and stoning.  Indeed, rape — which is a nearly ubiquitous behavior — can be seen as largely a form of repressing women’s sexuality, especially given how often it is justified in terms of “she asked for it”, meaning that she in some way or another expressed her sexuality in a manner the criminal(s) thought invited attack.

But those are merely the enforcement mechanisms for more subtle ways of repressing women’s sexuality.  Sexual ideologies seem to be the primary means of repression.  By “sexual ideologies” I mean in this context anything from full blown systems of thought about what is proper or improper, right or wrong, natural or unnatural about women’s sexuality to unorganized and unsystematic ideas and beliefs about their behavior.   For instance, advising young women not to wear short skirts doesn’t count by itself as a true ideology, but for the sake of convenience I’m lumping such advice into the same bucket as true ideologies here.

Sexual ideologies are perhaps even more effective than the gross enforcement mechanisms at repressing women.  If you can convince someone that it’s natural, right, and moral to suppress her sexual feelings, then you do not need to rely on the off chance you can catch and punish her for them if she fails to do so.  Ideally, you can even get her to suppress her feelings to the extent she no longer knows she even has them, because if you can do that, then she herself is apt to become something of a volunteer oppressor of other women, especially, say, in raising her daughters.

Nature, Mr. Allnut, is what we are put in this world to rise above.  — Rose Sayer, The African Queen (1951).

Disturbing Studies

Here are a few quick examples of the things being found out about women’s sexuality these days:

In surveys men routinely report having two to four times the number of sex partners that women report, which lends support to the notion that men are naturally more promiscuous than women.  But one study, published in 2003 in The Journal of Sex Research, found that when men were tricked into believing they were hooked up to a lie detector, the men reported the same number of sex partners as the women reported.  This is significant because it calls into question a fair body of research that is often cited in support of the notion women are less promiscuous on the whole than men.

A 2009 study published in Psychological Science found that pickiness seems to depend on whether a person is approached by a potential partner, or is themselves doing the approaching.  The experiment, conducted in a real-life speed-dating environment, showed that when men rotated through women who stayed seated in the same spot, the women were more selective about whom they chose to date. When the women did the rotating, it was the guys who were pickier.  This implies that women’s choosiness might largely depend on circumstances, and not on innate nature.

In 2011, a study published in Current Directions in Psychological Science found that women liked casual, uncommitted sex just as much as men provided only that two conditions were first met: (1) the stigma of having casual sex needed to be removed, and (2) the women had to anticipate that the man would be a “great lover”.   Contrary to conventional wisdom, the women did not seem to need to feel emotionally intimate with the man in order to enjoy casual sex with him.

In 2015, evidence was published in the journal Biology Letters that both men and women fall into two more or less distinct groups: Those who prefer monogamy and those who prefer promiscuity.  Curiously, the two sexes split in roughly the same proportions: a slight majority of the men favored promiscuity, while a slight minority of the women did.  This would seem to undermine the notion that men as a group are markedly more promiscuous than women.

The journal Psychological Science published a 2006 study that found women in general are more flexible than men in their sexual orientations, and that the higher a woman’s sex drive, the more likely she was to be attracted to both sexes (the same was not true of men).

In 2006, the journal Human Nature reported that both men and women in new relationships experience about equal sexual desire for each other, but sometime between one and four years into the relationship, women’s sexual desire for their partners began to plummet (The same was not true of the men: Their sexual desire held constant.)  Two decades into committed relationships, only 20% of women remained sexually desirous of their partners. Long term monogamy appears to sap a woman’s sex drive.   Ladies! Tired of the Same Old Same Old? Willing to dress up in a hen costume and squawk like a chicken?  Sunstone loves his rooster suit, and is currently available most evenings.  Simply call 1-800-BuckBuck! Motto: “He’s even more desperate than you are!”®

Disturbed Men

The new science has huge implications if it is indeed sound.  For instance, as hinted above, the sexual repression of women often enough depends on women buying into certain myths about their own sexuality, such as the myth that a woman’s sexuality, when compared to a man’s, is weaker, less urgent, less demanding.  If the myth is true, then an implication is women should sexually defer to their partners, place their own sexual needs on the back burner while tending to the needs of their man.

Yet, if the new science is sound, then men’s and women’s sex drives are more or less equal, and there remains no ideological reason for women not to demand their rightful share of the fun.   That seems to disturb some men.

I can think of any number of reasons why some men are disturbed or put off by sexually assertive women, but none of them are relevant enough to go into here.  Yet, it should be kept in mind that some men  — but not all — are disturbed by the notion that women, being by nature sexually equal to men, ought to have equal rights in bed.

There are other implications of the new science men might find even more disturbing.  Perhaps the biggest implication might have at its core how women’s unleashed sexuality could affect men’s reproductive success.   The new sexuality might fearfully suggest to many men that their liberated partners are now more likely to cuckold them.  That’s not a prospect most men are entirely blissful about.

Grand Sweeping Summary and Plea for Money

Acceptance of reality is not, actually,  one of our major strengths as a species.  Even if the new science proves over time to be sound, it’s unlikely to be accepted without a fight.

If you are like me, you believe more research is needed into women’s sexuality.  Much more research.  Moreover, you are keen on funding some of that research yourself!  Yes, this is your opportunity to send me on a mission of scientific discovery to my town’s finest strip joint, where I will be surveying and assessing how women express their sexuality through dance, while flirting with suffering a heart attack from the intrinsic excitement of doing science.  Simply email me to arrange a transfer of funds!

A Life that Passed Like a Wind

Thirty-four years ago last November, my former roommate, Dan Cohen, died at the age of 25. He was an extraordinary individual, and if you have a moment, I’d like to tell you a little bit about him.

Dan had the misfortune of being born a Thalidomide baby. He was significantly less than five feet tall, slightly hard of hearing, nearly blind but for his exceptionally thick glasses, and he had purple tinted teeth — which were always on display since his lips did not easily close over them. But the worst of it was that he had an exceptionally weak heart.

At the time I knew him, Dan could walk only a few hundred yards without stopping to rest because his heart would within that short distance pound like he’d run a marathon.

At an early age — maybe nine or ten — Dan’s doctors told his parents that, because of the weakness of his heart, he would most likely not live beyond 25 years old, which proved to be an accurate prediction. His parents made the decision to tell Dan what the doctors had told them, so Dan knew early on that he wasn’t going to live a long life.

I met Dan in college. He and I lived on the same dorm floor for a while. We became roommates because no one else on the floor wanted him as a roommate. Frankly, Dan was one of the messiest people I’ve ever known. But when he asked to become my roommate, I figured I could handle it on the one condition that he didn’t let any of his mess stray to my side of the room.

It wasn’t long before I learned that Dan’s one ambition in life was to learn everything he could possibly learn as fast as he could learn it. Because of his circumstances, the university allowed him to study anything he wanted to study without pressuring him to graduate. His official major was biology, but he took courses in every major field of science along with many courses in the humanities. He was an engaging thinker, and introduced me to many ideas that were new to me.

The only thing Dan seemed to like more than learning something new was a good joke. Most of our conversations were laced with his wit, and even to this day, I can hear in my mind his laughter.

He also had a well-informed empathy for the underdog, the oppressed, that I myself at the time did not fully share with him. For instance, he was deeply concerned with injustices suffered by the Palestinians.

We only roomed together for one year before I left the dorms. Then one freezing winter night, Dan got a phone call from the hospital. My brother was seriously ill and had been taken to the emergency room. Could Dan give them my new number?

As it happened, Dan only had my address, but not my phone number. Without apparent thought for himself, he set out past midnight, in the middle of a blizzard, to walk to my new home because he didn’t have cab fare and couldn’t find anyone who would lend him the money. It took him, he said, almost two hours to reach me. He had to stop every block or so and rest his heart in the freezing wind.

What impresses me most about the man was not the selfless, heroic effort he made to inform me of my brother’s hospitalization, but rather his extraordinary love for life, his courage, and his sensitivity to others.

Dan knew he didn’t have much time in this world, but I never once heard him complain about it. You can say life was unfair to him, but that’s not a judgement he himself ever gave an indication of harboring.

Instead, I only recall his passionate enthusiasm when he would toss out to me some new idea he’d had, or some bit of knowledge he’d discovered that day. I think he made the most of the tragic hand he was dealt in life, and over the years, he has become something of a personal inspiration to me.

Thank you for listening. I believe Dan deserves to be remembered.

When Logic Breaks Bad: Three Shocking Errors that Turn an Appeal to Authority into a Depraved Fallacy!


Before we properly begin today, dear reader, I feel it is my duty to inform you that some of the material we will be covering is scandalous in its nature.  I must therefore recommend that, as a simple but necessary precaution, you have handy your smelling salts.

With that said, let us now properly begin: I think it is reasonable of me to hazard that you, dear reader,  are a dedicated student of both logic and epistemology, just as am I.  Therefore, I can deduce that you, too, might be aware of the well established fact that poets are suspect.  Highly suspect.

Obviously, that is because poets far more often than not harbor theories of knowledge, truth, justification, and belief that are so poorly defined a sensible man or woman might be fully justified to believe the typical poet has never more than two or three times in his or her life studied an academic journal’s worth of articles in the blissful fields of logic and epistemology!

Now, it is quite understandable if you think I’m exaggerating, if you think such a thing is improbable.  But please trust me: I know what I’m talking about.  For, as it happens, I’m a worldly man, an experienced man, a man who has seen shocking things in his life, even things as appalling as the heartbreaking story of a poor, wretched, old homeless woman I once befriended only to discover in an especially poignant moment during one of our casual conversations that she was entirely ignorant of the simple distinction between a priori and a posteriori knowledge!

Instantly, I was so choked up that tears welled in my eyes and my voice failed me.  I could not even lecture her, let alone speak to her, and all I could think to do was mutely give her all the money I had on me — a hundred or so dollars — though I felt that was not possibly enough to console her.  Perhaps you can imagine how touched I was when the poor dear, bless her, pretended to be thrilled merely in order to comfort me.

You may be forgiven, dear reader, if you are now in something approaching a state of shock.  Yet, I fear what I have to say next will —  will, if you fail to stoutly brace yourself at once — topple you into madness or, worse, into committing a tu quoque, arguably the most easily avoided of all known fallacies of logic.

You see, poets now and then get it right!

A case in point: Byron by name; Lord by title; English by birth; poet doubtlessly by horrific accident.  His exact words were,  “If I am a fool, it is, at least, a doubting one; and I envy no one the certainty of his self-approved wisdom.”

Now, a clear implication of “Byron’s Theorem”, as I like to call it,  is that we cannot absolutely rely on the authority of anyone, not even that of ourselves, for it is always possible for a human — or even dare I say, an epistemologist — to make a mistake.  Clearly, that is implied by the Theorem.

I will not go into the precise and exacting reasons why that is implied. I would only be repeating myself, for I engagingly lay out those reasons in a chapter in the second volume of my insightful sex manual for newlyweds, Towards an Epistemology of Carnal Knowledge: A Popular Guide to the Hot Topics Every Couple Lusts to Discuss on Their Wedding Night. Colorado Springs: Charging Bore Books, 2009. Print.  I confess though, the chapter might not fully satisfy your thirst for an in-depth discussion of Byron’s Theorem because the whole series itself is light reading targeted to a wide audience.  Unfortunately, the only country the volumes have sold well in is England.  Odd, that.  I suppose it must mean English epistemologists are tops.

As it happens, Byron’s Theorem matches a principle of deductive logic.  Since even an authority could be wrong, any and all deductive arguments that appeal to an authority in support of their conclusion are necessarily fallacious.  There are no exceptions.

Yet, the same is not true in inductive logic.  Inductive logic is far less strict in this matter than deductive logic and it allows for some appeals to authority.

But why does deductive logic forbid all appeals to authority while inductive logic permits some appeals?  At the risk of being slightly superficial, this is as short an explanation as I can personally make of it:

♦ In a deductive argument, the conclusion necessarily follows from the premises if the premises are true.

♦ In an inductive argument, the conclusion probably follows from the premises if the premises are true.

♦ Even though an authority on some subject might usually be right, it is always possible that they could be wrong.

♦ Hence, appealing to an authority as a premise for a deductive argument is invalid because doing so would mean that the conclusion would not necessarily follow from that premise (since the authority could be wrong).

♦ But, in an inductive argument, appealing to an authority as a premise can be legitimate because doing so makes the conclusion more probable (since the authority is probably right, even though the authority could be wrong).

In brief, the reason appeals to authority are allowed in some inductive arguments, but not in any deductive arguments, hinges on the difference between “likely to be true” and “must be true”.
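For readers who enjoy a dash of notation, the contrast in the bullet points above can be put in symbols.  The notation here is mine, and slightly informal: the premises are P₁ through Pₙ and the conclusion is C.

```latex
% Deductive: true premises make the conclusion unavoidable.
P_1, P_2, \dots, P_n \models C

% Inductive: true premises merely raise the probability of the
% conclusion; the support is a matter of degree, never certainty.
\Pr(C \mid P_1, P_2, \dots, P_n) > \Pr(C),
\qquad
\Pr(C \mid P_1, P_2, \dots, P_n) < 1
```

An authority’s testimony can push the probability of C upward, which is why induction tolerates it; it can never make C unavoidable, which is why deduction does not.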

Now, I think I might safely say that just about any person who knows me, if asked to pick but one word with which to best describe my emotional side, would pick the word, “passionate”.  I am, after all, a man of passions, strong, towering passions — especially, say, when savoring an argument exquisitely formulated in doxastic logic that perhaps suggests to me a floral hint of orange blossom when I am perusing its axioms.

And like most people of a passionate nature, I sometimes wear my emotional side on my sleeve.  Thus, I should warn you, beloved reader, that we are about to embark upon a discussion of fallacies — a topic almost certain to provoke me, perhaps even provoke me to raw, untamed outbursts in which I might express my opinions with unusual force and vigor — even for the internet.  So I apologize in advance if my stormy language at such moments causes you the vapours.

Having given fair warning, I will now turn to three common ways in which an appeal to authority becomes a fallacy of logic, beginning with:

Citing Someone Who is Not an Actual Authority

Suppose you have a Valentine’s Day date with your cute research colleague in paleobotany.  You’ve already turned off the lab lights, romantically lit a couple Bunsen burners, slipped out of your lab coats, and ordered the pizza.  Now you and your colleague are gazing into each other’s eyes over the soft blue glow of the burners, exchanging witty small talk about the lab director’s fossilized pollen samples, when suddenly, out of nowhere, your beloved colleague cites Albert Einstein as an authority on post-glacial plant recolonization.

Alas!  The mood is broken.  But is there a way to recover it?  Yes!  The trick is to gently correct your colleague’s faux pas when you inevitably point out that they have indulged themselves in the fallacy of appealing to an irrelevant authority: Einstein, while an authority on physics, was not an authority on post-glacial plant recolonization.  That is, an appeal to an authority is only good if the authority’s expertise is in the relevant field.

Be sure to avoid harsh words, shocked expressions, and spontaneous squawks of disbelief when gently pointing out your colleague’s indulgence.  If you manage that, the rest of the evening can be saved.

Asserting that Authority is Proof

Suppose you and your friends have spent a few hours in the coffee shop pounding down the green teas while good-naturedly bragging about the impressive lengths of your curricula vitae.  After legitimately citing a string of human resources personnel who’ve all said your c.v. was the longest they’ve ever seen, the dangerous levels of caffeine you’ve consumed finally get the better of you, and you blurt out, “That absolutely proves mine is the longest!”

Proves?  On the contrary, no number of authorities, no matter how many you cite, can actually prove your conclusion.  They merely support it.  The reason is that it is at least possible that all your authorities are wrong.  Thus, you can only say they make your conclusion probable; you cannot say they make it necessary.

Tut! Tut!  Tut!

I must apologize now in case my nearly spontaneous outburst of passionate “tutting” has disturbed your composure.

Giving More Weight to a Minority of Experts than to an Opposing Majority of Experts

Suppose you are spending a pleasant sunny afternoon with your best friend in the park, lying on the green grass idly chitchatting about ancient Sumerian technologies.  Casually, you toss out the fascinating fact that 97% of the experts in the field agree that it was the Sumerians who first invented the sail.

Your friend nods agreeably and you are about to happily go on when a passing member of the merchant marine overhears your conversation and, quite unexpectedly, interrupts you to arrogantly list a mere half dozen or so authorities who disagree with the 97% consensus view you mentioned.

How can you correct the old salt without unduly embarrassing him?

Perhaps the best way to begin is to politely point out to him that his claim is extraordinary since it would seem highly unlikely for 97% of the authorities in a field to be wrong, while a mere 3% were right.   You should then remind him of the principle that “extraordinary claims require extraordinary evidence”, and then gently ask him to provide you with the top fourteen or so reasons he believes his half dozen or so experts have the edge on most of the rest of the researchers in the field.

You see, it is possible that his minority of experts are right and that your majority are wrong.  Yet, it is unlikely that’s the case.  And since an inductive argument rests on the likelihood of the evidence supporting the conclusion, the sailor is more or less obliged to add weight to his claim by going into detail about why his minority of experts are right after all.
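To see why the 97% consensus carries so much inductive weight, here is a back-of-the-envelope Bayesian sketch in Python.  The numbers are entirely made up for illustration — I simply assume each of 100 experts independently reaches the correct verdict 80% of the time, which is a toy model and not a claim about real scholarship on ancient Sumer.

```python
from math import comb

# Toy assumptions (mine, purely for illustration): 100 independent
# experts, each reaching the correct verdict with probability 0.8.
P_CORRECT = 0.8
N_EXPERTS = 100
N_AGREE = 97  # the 97% consensus from the park conversation

def likelihood(n_agree: int, p_endorse: float) -> float:
    """Binomial probability of exactly n_agree endorsements out of N_EXPERTS."""
    return (comb(N_EXPERTS, n_agree)
            * p_endorse ** n_agree
            * (1 - p_endorse) ** (N_EXPERTS - n_agree))

# If the claim is true, each expert endorses it with probability 0.8;
# if it is false, with probability 0.2.  Start from an even prior and
# update on the observed 97-to-3 split.
prior_true = 0.5
l_true = likelihood(N_AGREE, P_CORRECT)
l_false = likelihood(N_AGREE, 1 - P_CORRECT)
posterior_true = (prior_true * l_true) / (prior_true * l_true
                                          + (1 - prior_true) * l_false)

print(posterior_true)  # overwhelmingly close to 1
```

On these made-up numbers the posterior lands astronomically close to 1, which is just the “extraordinary claims require extraordinary evidence” principle in arithmetic form: the old salt’s half dozen dissenters need very strong independent evidence of their own to budge that posterior.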

To summarize, there are at least three common ways of turning an appeal to authority into a fallacy of logic.  Those ways are (1) citing someone who is not an actual authority during a romantic evening in the lab, (2) mistaking authority for actual proof of one’s conclusion while pounding down the green teas at the coffee shop, and (3) giving more weight to a minority of experts than to an opposing majority of experts without any justification for doing so while lying on a green lawn in the park.

Now to be sure, the mere fact that an argument contains a fallacy does not mean that the conclusion must be false.  It is quite possible for a fallacious argument to have a true conclusion.  However, one should get into the habit of considering fallacious arguments suspect, much as one is already in the habit of considering poets suspect.

This is because fallacious arguments tend to arrive at false conclusions, just as poets tend to arrive with scandalous frequency at radically speculative epistemologies.  I confess I have my days when I suspect poets seldom properly study The Philosophical Review at all!  How on earth they so often arrive at sharp insights and deep observations is simply beyond the grasp of any sensible man or woman.

A Most Titillating Fallacy of Logic!


Like most people, I am keenly aware that the reason you do not often see “sex” and “logical fallacies of relevance” in the same sentence together is that logical fallacies of relevance are intrinsically so exciting they do not need sex to sell them.

Merely mention one of the numerous fallacies of relevance — say, the Ad Hominem Fallacy, the Red Herring Fallacy, or the Naturalistic Fallacy — and you create an atmosphere of tingling anticipation.  To toss “sex” into the mix would only be overkill.

So it may astonish my readers that I am about to bring up both the Naturalistic Fallacy and the subject of sex — together.

Make no mistake about it, though:  I am not mixing the bliss of logic with the occasionally interesting topic of sex merely to super-excite you, my beloved readers.  Nor am I mixing sex with logic merely because I am a man of passions — strong, huge, even alarming passions — especially when reviewing a decent first order propositional calculus!   No, there is nothing gratuitous about this.

Instead, I reassure you that it is actually necessary here to mention sex, just as it was — I eventually discovered — necessary to mention sex now and then during the course of my two marriages.

And why is that?  Because someone — someone! — has made a mistake on the internet!  That is, they have committed the Naturalistic Fallacy in the all but certain presence of impressionable children. Children who might now grow up to promiscuously introduce fallacies into the very core of their reasoning.  Children who might one day run large multinational corporations, huge NGOs, entire governments, or even — more importantly — departments of philosophy.   DOESN’T ANYONE THINK OF THE FUTURE OF OUR SPECIES BEFORE THEY COMMIT FALLACIES OF RELEVANCE ANYMORE?

The person in question — let us call him the “Perpetrator” — committed the fallacy in the course of arguing that we should derive our morals from “evolutionary biology”.  Allow me to quote:

My position is that evolutionary biology lays on us certain [moral] absolutes. These are adaptations brought on by natural selection to make us functioning social beings. It is in this sense that I claim that morality is not subjective. [bracketed material mine]

As it happens, there is more than one way to lay out his argument. In the spirit of good sportsmanship, I shall now lay out the Perpetrator’s argument in the strongest possible manner I can come up with, despite the risk of giving us all the vapours:

  • We evolved various behaviors (“adaptations”) that make us functioning social beings.
  • Because the evolved behaviors (“adaptations”) make us functioning social beings, they are moral absolutes.
  • We ought to behave according to moral absolutes.
  • Therefore, we ought to behave according to the various behaviors (“adaptations”) that make us functioning social beings.

As you see, the second premise is the offending one.  It constitutes a mini-argument within the larger argument, for it has the form of a premise (“our evolved behaviors make us functioning social beings”) and a conclusion (“our evolved behaviors are therefore moral absolutes”).

But that is precisely the form of the Naturalistic Fallacy, which can be described as, “An argument whose premises merely describe the way that the world is, but whose conclusion describes the way that the world ought to be….”  The Naturalistic Fallacy is a fallacy because you cannot reason from an “is” to an “ought”.
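Schematically — and again the notation is mine, an informal sketch rather than anything standard — the offending step looks like this:

```latex
% Descriptive premise: behavior b is an evolved adaptation that
% makes us functioning social beings.
\text{Is}(b)

% Normative conclusion: b is a moral absolute we ought to follow.
\therefore \quad \text{Ought}(b)

% No rule of inference bridges the two: since no "ought" appears
% anywhere in the premises, no "ought" can validly appear in the
% conclusion.
```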

If you could reason from an “is” to an “ought”, you could reason all sorts of ridiculous things. “There is theft, therefore there ought to be theft.” “There are wars, therefore there ought to be wars.”  Even, “There are murderous fallacies of logic, therefore there ought to be murderous fallacies of logic.”

Yet, for the moment, let us accept the Perpetrator’s reasoning, despite its power to shock us.  What, then, might happen if we were to buy into his notion that “evolutionary biology lays on us certain [moral] absolutes”?

Would not any behavior with a genetic basis that increased someone’s reproductive success then become moral? I cannot see why it would not.

For instance, it appears that war has a genetic basis in territorial instincts and other such things.  But if that is so, then wars would be moral if they increased someone’s reproductive success.  Again, there is a hypothesis that rape has a genetic basis.  But if that is so, then rape would be moral if it increased someone’s reproductive success.

Such implications must disturb even the calmest of men and women.  To permit the notion that evolutionary biology lays on us moral absolutes seems to invite a deluge of undesirable consequences.  Fortunately we need not permit it, for sound logic does not compel us to permit it. For that, and for other reasons, men and women of conscience may justifiably and emphatically wag their fingers while saying to the Perpetrator in the most passionate terms, “Buffoonery! Mr. Perpetrator, your notion is buffoonery!”

Life’s “What Was That All About?” Moments

I’m about three-quarters and a dime convinced that a certain blogger I’ve been reading on and off for years writes so well that she could, if she wanted to, transform the journey of a common black ant tediously meandering across a boring concrete sidewalk into a New York Times best seller.

Her eye for detail, sharp wit, and fresh, nearly poetic prose enrich commonplace life events with emotion and (often enough) laughter.  She not only makes me feel, though: She makes me think, too.  And thinking about something she wrote earlier tonight is what I’ve been doing for the past hour or so.

Should you like to read her post, it’s here.  The soul of it is a “What was that all about?” moment that she had on her way to the gym.  She wrote it up in a way that left me feeling like it had happened to me.  So I commented on her post.

We had a brief exchange during which she proposed that her whole life was one WTF? moment after another.  That got me thinking, “Yeah, there’s probably at least some truth to that for nearly every one of us”.

Psychologists, among others, will tell you that we humans tend to naturally turn strings of events into stories, or “narratives”, as they call them.   Where most other animals might see just a string of events, we see a narrative.  For our species, such a string of events is often enough perceived as (1) causally linked, (2) progressive or unfolding, (3) thematic, and (4) tending towards a climactic moment followed by (5) a resolution.   That list may well leave some things out, too.

Seeing stories in events is not really something we learn, it’s something we’re born with.  An instinctive way of perceiving or ordering reality.  To feel the force of that instinct simply recall how you felt the last time someone told you an interesting or engaging story that…left you hanging.

FIRST PERSON: “It was the bottom of the ninth, the score was tied with two outs when Fisher stepped up to bat.  The first pitch was super-fast, too fast for him to swing in time. Strike! But on the second pitch he connected.”

(long pause)

SECOND PERSON: “That’s it?  But what happened next?”

On a subtle note, when you read the name “Fisher” did you for perhaps a brief instant wonder, “Why Fisher?  Who is he?”, or something along those lines?  If so, that’s your mind trying to change a simple fact (i.e. the name “Fisher”) into more story, more narrative.

I’m not going to spend time here speculating on why we see stories in causally related events, because I’d like to focus on something else instead:  I think it’s highly arguable that life mostly is not what we so often think it is.

Mostly, life does not fit quite so neatly into the frame of a story.  But do we easily remember how often that’s the case?  I don’t think so.  When life fails to fit into a story, I think we tend to dismiss it, downplay it, forget it, unless there is some distinctive reason not to forget it (e.g. an event was funny, poignant, moving, disturbing, scary, etc.).  What’s mostly left are memories of when life did make passable sense as a story, and thus we have an impression that life is more often a story than it actually is.

Put differently, I think it might be arguable that life is more often composed of “What was that all about?” moments than it is composed of more tidy and satisfying conclusions.

For instance, shyness was quite a problem for me from an early age through to my late 30s.  But the shyness ran beneath the surface, beneath the mask I wore of a fairly outgoing person.  I myself was keenly aware of it, though.

Then, sometime in my late 30s or early 40s it all but entirely disappeared.  I’m 60 now, and I can probably count on my ten fingers the number of times since age 45 that I’ve felt shy.  Why it went away, I have no idea.  I can speculate endlessly on that question, but I cannot find a convincing answer to it.

My shyness thus makes a mostly unsatisfying story.  Sure, there’s a sort of resolution (i.e. it did go away), but I am left hanging on the why.  Consequently, when I look back on it now, I have feelings of “What was that all about?”  And those feelings are magnified for me by the fact that I spent so much time and effort in my younger years trying one thing after another to eradicate my shyness.  Not one of those things worked for me.  Then, for no apparent reason, it was gone.

When you read about my shyness, do you feel an urge to explain why it went away?  If you’re like me, you do.  My mind wants to just jump in there with the most plausible explanation it can conjure, regardless of the fact there’s no practical way I know of actually testing any explanation to determine if it is really true.

“What was that all about?” moments might just be far more common than we think.  It’s even arguable that they are more characteristic of life than moments when things do make a heap of sense.  But whatever the case, it’s a fact our minds see strings of causally connected events as stories.  In light of that, a “What was that all about?” moment can be thought of as the “conclusion” to an aborted story.

Please feel free to share your favorite “What was that all about?” moment! And, by the way, some of my earlier views on the topic of our narrative minds can be found here.