What Could Possibly Go Wrong?

Alfred, Lord Tennyson famously observed the unfortunate fact that nature is “red in tooth and claw.” But soon, according to British philosopher David Pearce, there will be an app for that. In an interview with the futurist blog io9, he has put together a plan worthy of a comic book super-villain: reboot every ecosystem on Earth to end all suffering by stopping animals (humans included) from eating each other.

The hubris of this endeavor is, of course, remarkable, as is the optimism that technologies both emerging and imaginary will, without question, grant everyone their three wishes. Engineers who actually make things work must confront real-world constraints, and so we tend to take off our rose-colored glasses. Academics, however, get to keep theirs on, and Pearce’s are firmly in place, casting everything in a soft pink hue. At the very end of the interview, he even quotes Karl Popper’s famous observation that “Those who promise us paradise on earth never produced anything but a hell.” Nevertheless, rather than applying it to himself, he dismisses this lesson of history with a short rhapsody about how totally awesome his utopian dream would be if somehow it actually worked this time.

And why shouldn’t it? Based on the interview, Pearce seems to believe he has pinpointed the problems with earlier utopian schemes: They were not far-reaching enough because they ignored predation in nature, and they lacked the power to sufficiently micromanage the lives of all living things. But now, he believes that “within the next few decades, every cubic metre of the planet will be computationally accessible to surveillance, micro-management and control.” There will be technologies to perform mass updates of the genetics of entire ecosystems. (Don’t you just love when Windows forces you to reboot to install updates? Well, soon, every bird and every blade of grass will have the same feature.) Mass extinctions are certainly on the table for Pearce. For him, however, it’s not a big deal. He notes in his manifesto that nobody cares about the intentional extinction of smallpox, so why not contracept snakes and crocodiles into oblivion? Veganism will be imposed on all creatures. But don’t you steak-lovers worry; as a matter of practicality, this will probably need to wait until “delicious, cruelty-free cultured-meat products become commercially available.” Mmmmm. With such technology on the horizon, it sure is a good thing that Uncle Ben’s assertion that great responsibility accompanies great power combines with an “in for a penny, in for a pound” approach to manipulating the world around us to produce the wise, utopian techno-fascists who will guide the application of these forthcoming inventions.

The root of Pearce’s myopic arrogance is Benthamite Utilitarianism, an amoral system which reduces all ethics to a matter of pain and pleasure. Is an action wrong? Only if it brings about more pain than pleasure. Is it right? Only if it brings about more pleasure than pain. The higher concepts of the good, the true, the beautiful, love, justice, and so forth that are written on the hearts of ordinary people and affirmed by the humanities are eschewed as mere illusions which imprecisely describe certain kinds of pleasures. Likewise, vices such as cruelty are simply reduced to “involuntary pain.” Consider, for example, his discussion of the problem of cats. There could be some protest to simply getting rid of them. After all:

Most contemporary humans have a strong aesthetic preference in favour of continued feline survival. Their existence in current guise is perhaps the biggest ethical/ideological challenge to the radical abolitionist. For our culture glorifies lions, with their iconic status as the King of the Beasts; we admire the grace and agility of a cheetah; the tiger is a symbol of strength, beauty and controlled aggression; the panther is dark, swift and elegant; and so forth. Innumerable companies and sports teams have enlisted one or other of the big cats for their logos as symbols of manliness and vigour.

These “aesthetic preferences” are so entrenched that they create the “disturbing” impression among humans that “phasing out” or “reprogramming” is “evocative of genocide, not universal compassion.”

Thankfully, Pearce has a way to help get our heads on straight: We need only “compare our attitude to the fate of a pig or a zebra with the fate of an organism with whom those non-human animals are functionally equivalent, both intellectually and in their capacity to suffer, namely a human toddler.” After all, if pleasure and pain are the only relevant metrics by which to judge actions, and the capacity to feel these things is the only metric to judge the value of a creature, then this all becomes clear. He concludes:

Well, if our theory of value aspires to a God’s-eye perspective, stripped of unwarranted anthropocentric bias in the manner of the physical sciences, then the well-being of a pig or a zebra inherently matters no less than the fate of a human baby – or any other organism endowed with an equivalent degree of sentience. If we are morally consistent, then as we acquire God-like powers over Nature’s creatures, we should take analogous steps to secure their well-being too.

And that’s why cats have to be eliminated: a lion eating a pig is no different from a lion eating a little kid. And if a lion eating a little kid is an evil which must be stopped, then so is a lion eating a pig. Given the premises, the logic is inescapable. Or is it?

As philosophers have often quipped, one man’s modus ponens is another man’s modus tollens. One could draw Pearce’s conclusions from the premises. One could just as logically conclude this: Because a lion eating a pig is no different from a lion eating a toddler, and a lion eating a pig is really no big deal, then a lion eating a toddler is also no big deal. The darkest side of the animal rights movement has always been this: if animals are just as important as humans, then humans are no more important than animals. If Pearce and his ilk are willing to rewrite the biology of animals to fit their ideals and sterilize or otherwise extinguish (humanely, of course) the undesirable species, why not do the same to humans who are of no more concern than animals? If the elimination of cats is really more like universal compassion than it is like genocide, then why can’t genocide be a manifestation of universal compassion? If the end of suffering is an adequate justification for forcibly rewriting the genetics of animals to be more to the techno-fascists’ liking, then it is also an adequate justification for forcibly rewriting the genetics of humans to make them more to the techno-fascists’ liking.

Many people are afraid of technology because of its ever-growing power. But as any regular user of powerful tools knows, respect is a more appropriate response than fear. A chainsaw is powerful, but it is safe in the hands of a responsible person who knows how to use it. It is terrifying only in the hands of children, fools, and maniacs. It is unfortunate that utilitarians are so in vogue, for they are the children and fools of ethical philosophy—and the more consistent they try to be, the more they leave their humanity behind and become maniacs.

Rather than black, white, and a linear progression of grays along the pain/pleasure axis, good moral theorists can see in many colors, and men like Pearce should be thankful for it. After all, what verdict would a consistent utilitarianism pass on him if, instead of starry-eyed speculation about future benefit, it used the hard and bloody facts of the history of 20th century utopians as grist for its mill? The judgment might provide aggregate pleasure for society, but it would be quite unpleasant for men like Pearce.


Irony & Religion in the Media

It’s time for another entry in the “those silly Christians think the media is unfair to them” file. I thought the following Q&A from an interview with columnist Mark Oppenheimer was noteworthy in that it actually manages to demonstrate two of the ways in which the media is unfair.

Religious believers often feel that they’re treated unfairly by the media. Do they have a point? What aspects of religion do journalists regularly get wrong?

Most reporters have a superficial knowledge of whatever beat they’re on; that’s true of me every time I wander from the religion beat, where I actually have pretty deep knowledge. So reporters get religion wrong, but they get a lot of things wrong: labor relations, war, etc. I don’t think there is a special animus against religion. One could argue there is special gentle treatment for religion. Religious believers say things all the time for which there is no real evidence — that’s what “faith” is, by definition — and reporters don’t call them on it, unless the religion is new and thus seems weird, like Scientology. But if a religion is old and traditional, like Judaism and Christianity, its adherents get to go on about the Rapture, or the Resurrection, or whatever, and reporters never insert paragraphs like, “Asked for evidence that the Rapture would someday come, the minister could only point to the Book of Revelation.”

Superficial knowledge of most subjects may be a typical problem amongst reporters, but the more insidious version of this problem is thinking that one’s own knowledge is deep even when it is superficial. Oppenheimer, for example, considers his own knowledge of religion to be “pretty deep,” but one begins to find this assessment deeply suspect as he goes on to define faith as belief without evidence. That may be how atheists like to define faith; it may be how Hollywood usually defines faith; but it is not how faith is defined by orthodox Christianity—one of the two specific religions he mentions. In truth, Christians talk about faith in several different senses. This is why we often use disambiguations like saving faith when referring to trust in God’s promises of salvation through the death & resurrection of Christ or the faith when referring to the body of doctrine given by God and delivered to the Church through the apostles. It is also why we use the term blind faith when referring to Oppenheimer’s kind of faith—to mark it out as something which has nothing to do with Christianity. This would not be news to anyone with a pretty deep knowledge of religion and should not be news to anyone who has taught others how to report on religion.

The second problem is revealed when Oppenheimer goes on to confuse patronization with “gentle treatment.” Reporters don’t ask about evidence for specific doctrines because by and large they see Christians like they see Trekkies—as members of some kind of weird fandom. To reporters, being a Christian just means being caught up in a story—until, of course, Christians begin taking it “too far” by treating it as though it were actually true and allowing it to unduly influence their lives. That’s when Christians go from harmless fans to dangerous fanatics. Because of this mindset, reporters will often approach the subject with all the credulity of parents inquiring about their child’s imaginary friend. They will chuckle or reflect wistfully on this strange behavior when they find it charming, and they will chide and censure when they do not, but they will never bother asking about evidence because they are certain it would all be techno-babble anyway.

I, for one, would love to see reporters stop being “gentle” so that they can start inserting paragraphs like, “Asked for evidence that the Resurrection happened, the minister provided historical documentation that would be more than adequate for any other event in antiquity.”


The Latest from the Federalist

Just realized I never linked to the last two pieces I wrote for The Federalist:

Four Myths About the ‘Helpless’ Single Woman, in which I critique a recent challenge to the conservative vision of singleness. (Note for the reading-challenged: this piece does not say that single women are double-plus-ungood.)

It’s Past Time To Reconsider The Place Of College, which contends that the Baby Boomer fantasy that every American should go to college immediately after high school is harmful to both prospective students and to colleges’ ability to provide a quality education.


The New Honesty about Abortion in Film

I have not seen “Obvious Child,” a recent romantic comedy about abortion, but I did take note when film critics kept going on and on about how honest it is. It’s “anchored” by an edgy “kind of truth-telling” according to the L.A. Times; the N.Y. Times says that it’s “trying above all to be honest;” and Buzzfeed describes it as even “painfully honest” in the headline. This would be surprising, since honesty about abortion isn’t terribly well received. After all, when some pro-life activists start bringing exceptionally gruesome placards onto college campuses, the honesty of such images is seldom lionized. So how could an honest narrative about abortion gather such glowing reviews from liberal Hollywood? The answer, based on reading the reviews at least, is that true honesty about abortion is too alien to be recognized by that particular culture.

Apart from frank comic discussion of bodily functions (which is now described as “honest” rather than “potty humor,” I guess), the main claim to honesty seems to be that the heroine of the film actually goes through with the abortion—a fairly rare event in popular films and television. When filmmakers want to create a likeable character, the tendency is to make her somehow avoid the procedure. The N.Y. Times inadvertently reveals why while trying to make the opposite point: “There have been a handful of comedies — “Juno” and “Knocked Up,” most notably — about women who choose not to end their unexpected pregnancies, so why assume that the other side of the coin is off limits? Like it or not, abortion is a fact in many women’s lives and therefore as available for humorous treatment as any other aspect of human experience.” You know… just like child molestation. It’s a fact of life & human experience and therefore just as available for humorous treatment. The honest truth is that it’s very hard to make a character actually funny and genuinely likeable when she inflicts terrible violence on her own child.

Daring to include the procedure in the narrative is, of course, not the same thing as being honest about it—something lost on the critics. The worst offender on this point is the aforementioned Buzzfeed review. It highlights the film’s “responsibility… toward the procedure” by noting how “the camera follows Donna right into the examining room and holds on her anesthetic-addled face.” It doesn’t mention the camera lingering on any of the more gritty and real parts of the procedure that immediately follow the anesthetic. The review also talks about the movie’s “acknowledgement of the way abortion tends to be discussed in whispers and euphemisms and part of some furious skating around the issue itself” and praises how “You won’t find euphemisms here.” The same review euphemizes abortion as “an unfun possible outcome of sex” and Donna, the mother, as “not yet ready to be a mom.” Of course abortion, though presumably unfun, isn’t a mere possible outcome of sex any more than wife-beating is a mere “possible outcome of burning a pot roast.” Likewise, the character of Donna was already a mom—one who makes a grisly decision about how to treat her child.

Having not seen it, I cannot judge the film itself, but as far as the critics are concerned, I’m underwhelmed at their ability to discern anything at all as “honest” on the subject of abortion.


Yes, Virginia, There is Such a Thing as a Slut

I came across this piece in The Atlantic a few days back, boldly proclaiming that “there’s no such thing as a slut.” It was occasioned by a recent book written by a couple of academics who studied the daily lives of 53 women as they progressed from freshman to new college graduates.

Two items in this book led to the author’s startling conclusion. First, it noted the differences between how rich and poor students view sluttiness and found that there was no precise definition agreed upon by both groups. In logical terms, leaping from that to the conclusion in the headline is like concluding that because Mormons call themselves Christians while Christians call Mormons heretics, there’s no such thing as a Christian. On the contrary, the fact that Christians exist is one of the things that both parties agree on.

The second item that provoked her conclusion is that the book noted how the slut label was often applied as a general-purpose insult connected to class rather than sexual behavior. Of course, neither does this have anything to say about whether there is such a thing as a slut. After all, we do not hear 13-year-old boys calling their disliked peers ‘f*gs’ entirely apart from any known sexual behavior and conclude that there’s no such thing as homosexuals. The existence of a thing does not hinge on the precision with which humans are able to describe or identify it. All in all, thinking otherwise is about as silly as ending an article about women calling each other sluts by describing the word as a mere “misogynistic catch-all, a verbal utility knife that young people use to control women.”

Although it may not logically support the conclusion found in The Atlantic, the study does seem to provide clear examples of how the moral law written on our hearts bubbles to the surface even when society represses it. The first thing we can recognize is that nearly everybody in the study knew to be true what the article denied—that there is such a thing as a slut. That’s why they routinely placed people into that category while excluding others. According to the study, “All but five or six of the women practiced ‘slut-shaming,’ or denigrating the other women for their loose sexual mores.” It seems that class differences leave quite a bit of common ground after all.

The second recognition is that everybody knew that having “loose sexual mores” is a negative indictment—that it is shameful. That’s why they either sought to define the category in a way that excluded themselves or sought to hide their own sexual history so that no one else would categorize them that way. That’s why they were disproportionately eager to label those they disliked as sluts. As the article pointed out, “They conflated their accusations of ‘sluttiness’ with other, unrelated personality traits, like meanness or unattractiveness. It seems there was no better way to smear a dorm-mate than to suggest she was sexually impure.”

The third recognition is that everyone involved knew that being a slut meant frequent fornication—and that the further this fornication was from a marriage-like relationship, the sluttier it was. This too was consistent across the class divisions set up by the study.

The rich women tended to view casual sex as problematic only when it was done outside of steady relationships, and even then, only when it included vaginal intercourse. Meanwhile, frequent “hooking up,” which to them included kissing and oral sex, did not a slut make. “I think when people have sex with a lot of guys that aren’t their boyfriends, that’s really a slut,” as one put it.

The poorer women, by contrast, were unaware that “hooking up,” in the parlance of the rich women, excluded vaginal intercourse. They also tended to think all sex and hook-ups should occur primarily within a relationship.

The “contrast” has to do with technicalities and lingo, but not at all with the bigger picture that sex ought only to be practiced within a relationship that involves some measure of exclusivity and permanence. It’s not exactly a high moral standard, but it’s undoubtedly there and undoubtedly apes marriage to some extent.

How frequent must fornication be to be slutty? How far from a marriage-like relationship can it be? What counts as “sex?” Is Susie really a slut, or is it just a rumor? Here there were disagreements, and it is on these that the article focused. Nevertheless, this focus is itself merely a smokescreen to hide the far wider areas of agreement. Ironically, the author was doing exactly what the subjects of the study did—attempting to find a way to hide favored people (women as an identity group in the author’s case) from a negative moral indictment because this indictment is accompanied by negative social consequences. It’s ultimately just one more way of running away from guilt.

This is precisely the state of humanity that Paul described in his letter to the Romans: “[The gentiles] show that the work of the law is written on their hearts, while their conscience also bears witness, and their conflicting thoughts accuse or even excuse them.” We all have knowledge of a moral law that manifests itself in the universal way that we accuse others and excuse ourselves from breaking it. Nevertheless, the knowledge does not do us any good. We have all sinned and fallen short just as the participants in the study were apparently all guilty of fornication regardless of how they applied the “s” word.

Some may try to solve this problem by fudging the line until they are on the right side of it. Others may try to solve the problem by pretending that the line is too blurry to tell for sure which side anyone is on. But blurring our view of it does not change reality. Even people embroiled in hookup culture know that there’s something seriously wrong with it—even if that knowledge never reaches their self-consciousness. Though some would like people to forget that there is such a thing as a slut, sluts along with the rest of us need forgiveness rather than forgetfulness.


Cultural Doggie Bag: Cloud Atlas and Natural Law

Cloud Atlas is one of those movies I have a love/hate relationship with. I love it because of the complex and well-crafted storyline made up of interwoven time periods and because this craftsmanship itself is there for the sake of underscoring the film’s message. On the other hand, I disagree with most of that message because of my own views on natural law, and on top of that, the movie even inadvertently disagrees with itself before it’s finished.

Towards the beginning of the film, a question is asked: If God made the world, how much of it are we allowed to change? In other words, it raises the question of ordinance—is the world supposed to be a certain way by design, and if so, to what extent? The film explores the issue through a series of narratives that take place in different times and involve different characters, but are nevertheless connected by reincarnation, historical continuity, and a familiarity of circumstance in which a person or people is oppressed by “the way things are.” The narrative in the 1800s deals with slavery, the one in 1930s Scotland with homosexuality, 1970s America with misogyny and corporatism, present-day England with the marginalization of the elderly (though the purpose of this thread seems mainly to be comic relief), 22nd century Korea with the subjugation of androids, and a post-apocalyptic 23rd century with religious superstition. In each narrative, the harm and injustice the protagonists experience are brought about and rationalized by people’s belief that such things are simply moral ordinance—the nature of things that cannot be changed.

Each narrative also involves the protagonists breaking free of this oppression by changing the perceived ordinance in favor of a new one, “From womb to tomb, we are all connected,” a phrase that’s repeated more often in Cloud Atlas than “With great power comes great responsibility” in a Spider-Man film. This interconnectedness of humanity is further emphasized by the interconnectedness of the different time periods as the story progresses. For this reason humans (and androids) band together to help one another and free themselves and their neighbors from whatever oppression they happen to be facing (usually personified by Hugo Weaving). They overthrow each ordinance in favor of a singular ordinance which the movie portrays as more fundamental. In effect, the film’s answer to the initial question seems to be that the only real ordinance is love.

But can love really be the only ordinance? If so, then what is love? It is at this point that the film cheats on its own message, for all of its examples of love are incarnate within ordinance. The emotional impact of the film is not generated by evoking relationship as such but by evoking specifically loving relationships that fit a natural order. The good guys behave in ways consistent with being husbands, fathers, friends, wives, daughters, and so forth while the bad guys interfere with those relationships. The film would have you believe that its moral gravitas hinges entirely on “From womb to tomb, we are all connected,” but its narratives would not be compelling at all unless some connections were more loving than others. But if some connections are more loving than others, then that implies the existence of other unstated ordinances.

Nowhere is this more clear than in 1930s Scotland, which ends up being the most disjointed narrative of the group precisely because it alone eschews reliance on typical ordinance. Robert Frobisher is a homosexual composer who leaves his lover to pursue his artistic vision. This piece of the story is tied together by Robert’s narration of letters to his lover as he describes the trials and tribulations of his attempts to compose and publish the Cloud Atlas sextet. Their love is portrayed in typical romantic fashion as something that is too pure and noble for this cruel world—just like Robert’s artistic vision, for which he runs himself ragged and then kills himself when finished. But what is it that makes Robert’s relationship loving besides the glowing self-description in his letters? He sleeps with his boss’s wife out of convenience to his ambition while he assures his original lover that it doesn’t have anywhere close to the deep spiritual meaning their own liaisons had. He attempts to sleep with his boss as well, but his boss is uninterested and demands credit for the Cloud Atlas in exchange for keeping his homosexuality a secret. Robert flees to a run-down hotel room to finish his composition on his own. The narrative reaches its climax as Robert’s lover, upon learning that he is in trouble, desperately searches for him. Robert knows this and even watches his lover’s desperation from the shadows with a smile on his face, but intentionally avoids him, only to go back to his hotel room and kill himself, thus ensuring for himself his fame and the credit for his music. In the end, what does the audience have to make them think that Robert actually loved his pen pal? He isn’t faithful to him, or even terribly considerate of him in general. He abandons him twice purely for ambition, and never expresses any concern for him outside of the text of the letters themselves.
In other words, it commits the literary sin of telling instead of showing because it has nothing real to show. Without ordinance, love is an abstraction that never touches the real world.

Having abandoned natural law and the God who authorizes it, Western society has been seeking a new basis for morality for some time. But however hard it tries, it inevitably falls back, absentmindedly, on the very natural law it eschewed. The person who says “you can do anything you want as long as you don’t hurt anybody” has said nothing at all unless he and his audience know what “hurt” means. In the same way, “from womb to tomb, we are all connected” is a catchy phrase, but is meaningless unless “connected” is defined. Unfortunately, the more seriously the audience takes Cloud Atlas, the less it has to say.


The Cultivation of Shame

“You need to hold your temper.”

It was a phrase I heard from my dad on numerous occasions when I was growing up. At the time, of course, I did not get the blacksmithing reference—the ability of metal to retain its shape and edge when under duress. But despite that, I did get the message: anger is natural, but you can’t just lose your cool over every irritation that you encounter; you need to learn self-control.

So was my dad trampling all over my feelings? Was he scarring me for life? Was he emotionally manipulating me just to get me to behave? Was he tearing down my self-worth or devaluing me by foisting temperance on an intemperate little boy? Of course not. He was merely doing me the fatherly service of civilizing me—of training virtues where none had yet formed.

Anger is indeed a natural emotion, but teaching self-control is not emotional manipulation—it is emotional cultivation, for no young child knows well how to discern what kinds of things are worth being angry over. Neither was it detrimental to my self-worth, for no young child has an adequate sense of proportion. I can remember one episode when I freaked out over a popped balloon to the point where I wanted revenge on the one who accidentally popped it. Granted, that was one incredible balloon; but the perspective granted by maturity shows how childish and destructive my reaction was—not because of a loss of childlike wonder at simple things, but because of a gain of respect for family and neighbors. Indeed, what kind of sense of self-worth could I have had if I had never acquired that perspective? This kind of maturity does not simply happen—it is a gift from our parents that we receive through the way they raise us. Thankfully, very few people would ever advise a parent that they should never tell their child to hold their temper.

Unfortunately, the same cannot be said of other phrases I heard from my parents. I believe, for example, that I heard “shame on you” from my mom from time to time. Now that I’m rapidly approaching the life stage that triggers unsolicited parenting advice, I’ve begun to hear that this is something one should never say to a child. Why not? Because it tears down their sense of self-worth. Because it is emotional manipulation to get them to behave. Because it scars children for life by trampling all over their feelings and by imprisoning them in social conformity. However, I don’t believe this is true for addressing feelings of shame any more than it is true for anger. On the contrary, removing the phrase from a parent’s lexicon can prevent maturity and the formation of virtues.

Like anger, shame is a natural emotion that everyone experiences, and we all experience both emotions for the same reason: Sometimes the moral law written on our hearts bumps up against reality. Though a child’s sense of right and wrong is unrefined, it exists from the earliest ages. Anger is triggered by perceived injustice, while perceived justice triggers feelings of magnanimity. Likewise, feelings of shame are triggered by perceived transgression while doing what we perceive as the right thing triggers feelings of pride. By itself, feeling shame is not right or wrong—it simply is. Shame is certainly a less pleasant emotion than anger is—righteous anger is something people are generally very comfortable with—but that does not mean the feeling must be avoided. It does, however, mean that shameful actions must be avoided. Like pain, shame is actually a good thing because it exists as a hedge against self-destruction.

But just as young children lack discernment when it comes to anger, so also, few young children have a good grasp on what kinds of things are shameful. This is something that parents must train and nourish. As a child, I did plenty of shameful things. Most of these I figured out for myself or caught on with some simple parental instruction. Other times, I needed to be straightforwardly told when shame was appropriate, and so I heard “shame on you.” Accordingly, far from resenting my mom for “shaming” me, I’m grateful to her. Where would I be if my sense of shame was left uncultivated? Would there even be any real significance to the feelings of self-worth possessed by an emotional barbarian who has no grasp on how to delineate sources of pride from sources of shame?

The consequences of negating shame can be easily seen in the area in which shame has been most intentionally negated: sexuality. Men and women are increasingly becoming sexually barbaric. Monogamous marriage having decades ago given way to successive polygamy (or, as it is less accurately known, serial monogamy), successive polygamy is quickly giving way to simple hook-ups—spontaneous sexual encounters with no spoken expectations of continuity. In other words, for many young men and women, as for a typical squirrel, smelling good and looking good during mating season is pretty much all there is to it. Though many erroneously consider this liberating, it has a remarkable tendency to sound very unpleasant even as it is being extolled. This atrophy of chastity, though bad in and of itself, is accompanied by other types of harm: disease, depression, deliberate barrenness, children deprived of a stable home, and the murder of the inconveniently conceived. These changes in cultural attitudes toward children are particularly barbaric, for children represent the continuity of civilization.

Unsurprisingly, most of civilized humanity has therefore historically recognized such behavior as shameful. Nevertheless, many sex-positive feminists and others have spent a great deal of effort trying to erase the feelings of shame that still tenaciously cling to contemporary sexual license. And yet, shame is a part of human nature. It cannot be entirely expunged no matter how much effort is devoted to the task. Sure enough, studies show that such sexually barbaric behavior still tends to produce shame. Given that the sexual revolution was already old by the time today’s youth were even born, the go-to explanation that this persistent shame is a result of culturally entrenched sexual taboos is increasingly implausible.

Even where people seem to shamelessly embrace shameful behavior, it is usually the case that shame has been diverted rather than removed. Younger generations—young women in particular because of the way they are targeted by feminists—have been trained in some very peculiar kinds of shame.

One of the most common is shame at being ashamed. As the thrill of hooking-up wears thin and the emotional wounds deepen, many women end up forcing themselves to continue participating in that culture. After all, they have often been told that being a strong, independent, and sexually liberated woman depends on such participation. Anything else is prudish or puritanical—some of the worst kinds of insults that can be leveled today. And so it becomes a kind of responsibility. One of the reasons these hook-ups are so often drunken is not that drunkenness leads to irresponsibility, but that alcohol is needed as an anodyne against naturally occurring feelings of shame—not part of the fun, but a tool to be exploited. And so shame becomes inverted. Like a deadline that forces an industrious worker to drink coffee as she pulls an all-nighter, hook-ups become a responsibility that people are ashamed of not living up to, even if they need liquid encouragement.

Another common diversion of shame is shame over the provocation of shame. “Slut-shaming,” for example, has drawn a great deal of fire. Some of this is in response to instances of bullying and manipulation, and such instances are indeed wrong simply because they are bullying and manipulation. At the same time, however, many are accused of shaming simply because they have expressed the value of chastity or reminded someone of the existence of sexual morality. Such reminders may make the unchaste feel ashamed just as reminders about courage may make the cowardly feel ashamed or reminders about temperance may make the intemperate feel ashamed. Nevertheless, these reminders are not “shaming” in any negative sense. They do not bully; they civilize. They do not manipulate; they cultivate. They do not denigrate anyone’s humanity, but help transform immature humans into mature humans. And so, when these shame-provoking reminders are themselves shamed away, civilization and cultivation do not happen.

Expectations of chastity having been denigrated because they were perceived to be burdensome, the corresponding shame which reinforced them was therefore attacked and displaced. The result was simply trading a sensible burden for an incoherent one. Young millennial Kristina describes the experience in a recent Rolling Stone feature. Because she became disillusioned with relationship prospects as a freshman in college after her experiences with “frat bros,” she trained herself to act like one of the same frat bros who disillusioned her. Now she describes herself as a “sexual vulture,” and before circling carcasses, she “pre-games with a water bottle full of vodka tonic before moving on to the rugby house, where the sporty all-American type of guy that Kristina favors should be in abundance.” It is a story in which manufactured moxie blends with an undercurrent of despair to form a very convoluted confession. She rebels against the “douches” who are just “looking for someone to bang” by conforming to their expectations instead of the allegedly burdensome expectation of chastity. Though she now finds even the idea of dating and boyfriends distasteful and has given up on a sweetheart she’ll be with forever, she nevertheless wistfully hopes that servicing 29 guys and counting will turn out to be a great way to get the big wedding she’s always dreamed of. After all, she says of herself and her peers that “We’ll be so experienced in all the people that we don’t want, when we find the person who we do want, it’s just going to happen.” Behold sexual liberation: drugging yourself up to service unwanted guys, hoping against hope that your empowered no-strings sexual encounters will eventually lead to the strings you actually wanted in the first place.

Though my examples have been from the area of sexuality, a cultivated sense of shame is of broad human value, and its breakdown bears other consequences. The practice of shame-shaming has recently grown to include laughable attempts to ban words like “bossy” and even “sorry”—because apparently all users of that latter word owe the world an apology, and they had better offer it or else. And so two extremely unpleasant personality traits are being extolled as virtues. Likewise, the diversion of shame creates bizarre expectations. There was a time when typical salt-of-the-earth blue-collar Americans—men in particular—were ashamed of government handouts because of the expectation that a man should be able to take care of his family. When various New Deal programs were first introduced, many who were in need refused for this very reason. This kind of shame, consistent as it is with male nature, had a great deal of utility, for it accepted charity only as an absolute last resort and promoted diligence, hard work, and perseverance. The various attempts to expunge this shame over the years have resulted in the expectation of charity as a legitimate way of life and a loss of the expectation of self-reliance from which most actual self-reliance ultimately sprang. Nowadays, shame is more often encouraged over attempts to curtail harmful social welfare programs. A cultivated sense of shame helps us make sense of expectations so that we can sort the wheat from the chaff. Without it, people are easily manipulated, whether by a teenager’s expectations of no-strings sex, an office-worker’s expectation of obedience from peers, or by a welfare queen’s expectation of a handout.

Shame is a part of human nature, and it will be a necessary one for as long as we are prone to shameful behavior. As a result, whether or not we feel shame will never be optional for us no matter how much effort we devote to the task. What is optional is whether we cultivate that sense of shame so that it can accurately discern the shameful from the benign or let it grow wild and unkempt, making emotional barbarians of future generations. The ill-advised social experiments of the last century have borne their results, and ignorance is no longer a valid excuse for continuing them. It is time we get over the outdated notion of being ashamed of shame.

Posted in Chastity, Culture, Ethics, Feminism | 4 Comments

That’s Just Your Reading Comprehension

Any Christian who has ever argued theology with people has probably, at some point, heard the phrase, “That’s just your interpretation” in response to a citation of what God actually teaches in His Word. It’s the “whatever” of the literary world—an end to discussion without an acknowledgment that any salient points have been made.

Many postmodernists have long conceived of “interpretation” as though it were a barrier between a person and the true meaning of the text. They take examples of static in the transmission of a text’s meaning from one person to another, cannot see a way to resolve it, and finally conclude that the real text must therefore be forever closed off from us. Instead, all we have are our own interpretations of the text rather than the text itself. Debates over what the text really means are therefore irrelevant—all these arguments really consist of are two different but equally valid interpretations attempting to occupy the same social space. They have nothing to do with what is written, only with personal thoughts.

Of course, wise people know that this is an incoherent kind of dodge. If it works intellectually at all, then it can be applied to anything, including the views of the postmodernist. Any disagreement could be immediately resolved in one’s favor by “interpreting” everything said by an interlocutor as agreement with one’s own position.

“I completely disagree with everything you just said.”
“Well, I’m glad we’re in agreement.”
“But I’m disagreeing with you!”
“That’s just your interpretation. My interpretation is that you think I’m brilliant… and good-looking.”

Ridiculous, of course, but no more so than its more typical applications. The fact that even postmodernists keep on talking about texts as if their words make sense to other people indicates that they don’t really believe interpretation is an airtight barrier between us and the text, no matter what they pretend when it comes to subjects like theology and philosophy.

In reality, interpretation is simply our reception of the text—a part of who we are as readers. Far from being a barrier, it is not a third thing between us and the text at all, but our very act of reading the text. When talking about Scripture, a better term than interpretation would be the skill we call “reading comprehension.” Accordingly, the popular rejoinder of “that’s just your interpretation” should be more accurately expressed as “that’s just your reading comprehension.”

Of course, if the matter comes down to reading comprehension, there is no immediate sense of “my view is inherently just as valid as yours.” On the contrary, it’s quite clear that some people comprehend writing better than others. This is why children are taught to read—so that they can comprehend texts well rather than poorly. This is why most forms of standardized testing have a component for reading comprehension (under various monikers) and why some students do better than others. When reading comprehension is challenged, it is not a dismissal of the subject. It is instead a question of who is comprehending the text better, the resolution of which steers us back to the text in question (often more of the text than was originally cited)—the very last thing that’s-just-your-interpretation guy wants to actually read.

One of the clearest signs that a person’s reading comprehension skills are poor is that they have determined what a text is allowed to say prior to reading it. Take, for example, the Bible’s teachings about homosexuality. Most pro-homosexuality activists are honest in that they openly do not care what the Bible says on the subject. For non-Christians, the Bible is relevant only when it intrudes on their lives in some fashion. The same is true of theological liberals who, though less honest because they falsely claim to be Christian, are nevertheless open about seeing the Bible as a product of its time whose value is always subsidiary to modernistic “enlightened” opinions. Sometimes, however, there are people who want to belong to orthodox Christianity—who want to believe that their religion is actually true—but do not want to believe some of the inconvenient details. They cannot dismiss the text as non-Christians do, nor can they completely subject it to the Spirit of the Age as theological liberal heretics do, but they believe they can categorize some Biblical teachings as coming from an “interpretation” rather than from the text itself.

According to these individuals, the Biblical passages about homosexuality are true and applicable to Christians, but—according to their own interpretation—are not referring to homosexuality as we understand it today. Moses was just talking about acts of dominance that are more like rape. Paul was just talking about Greek pederasty and homosexual promiscuity. None of them were talking about the (supposedly) monogamous and faithful homosexuality we encounter today. If this is my interpretation, then the opposing orthodox view must be the other person’s interpretation—not a teaching of Scripture. Add a dash of misunderstood Sola Scriptura, and the rationalization is complete: our modern kind of homosexuality must be permitted because everything not specifically condemned by Scripture is permitted.

But what happens when we remove the “interpretation” dodge and treat the contention as a matter of reading comprehension instead? Well, then we have to ask, “How do we know that these Bible passages aren’t about homosexuality as we encounter it today?” Here the answers diverge, but I usually only see variations on two general themes. The first is that we know this because our loving and monogamous form of homosexuality is only a modern discovery, so ancient prophets couldn’t possibly have been talking about it. The second is that our kind of homosexuality is good and loving, and a good and loving God would never condemn something good and loving, so He must have been condemning something else—something sort-of-homosexual that isn’t good and loving.

There are plenty of historical and logical reasons why both of these “how do we know” answers are dubious, but here I’d like to focus on what they mean for reading comprehension specifically. In neither case does one actually need to look at the text itself to see what Scripture actually says. In both cases, the outcome is determined apart from the text. If ancient prophets couldn’t possibly have been talking about modern homosexuality, then they couldn’t possibly have been talking about it no matter what the prophets actually say. If God would never condemn something one sees as good and loving, then His words will always mean something other than that no matter what they are. At this point, one does not even need to read the Bible to find out that homosexuality is just peachy as far as Scripture is concerned. In the end, the fact that one’s reading comprehension doesn’t actually involve reading is always indicative that it is very poor indeed.

The example I used is about homosexuality, but it applies to all sorts of subjects anywhere from wifely submission to whether the Sacraments actually do anything (i.e. whether “Baptism now saves you” is just my interpretation of “Baptism now saves you.”) People try to avoid all sorts of teachings for all sorts of reasons by shifting focus from the text to interpretations. The more industrious seek out various creeds, confessions, and writings of the Church Fathers to show that their interpretation is part of the historic church. But as valuable as such documents are to help us refine our Biblical reading comprehension skills, they can never be a stand-in for reading the text itself. When two Christians disagree over what God is teaching them, the problem is not interpretation—the problem is at least one of the two Christians.

Posted in Apologetics, Theological Liberalism, Theology | 1 Comment

Domesticated Camels in Genesis not yet Confirmed by Science

At least, that’s how a more objective headline regarding the recent findings of two Israeli archeologists might read (though I suppose one could quibble about the “yet”). Instead we get headlines like “Is Camel Discovery the Straw That Broke the Bible’s Back?” and “Camels Had No Business in Genesis.” I suppose Christians should be used to the spin applied by liberal reporters slavering for license to encourage the disregard of parts of God’s Word that they don’t like. But what did these archeologists actually discover? Have they really disproved the Bible?

The American Friends of Tel Aviv University summarize the findings this way:

Camels are mentioned as pack animals in the biblical stories of Abraham, Joseph, and Jacob. But archaeologists have shown that camels were not domesticated in the Land of Israel until centuries after the Age of the Patriarchs (2000-1500 BCE). In addition to challenging the Bible’s historicity, this anachronism is direct proof that the text was compiled well after the events it describes.

That is certainly a bold claim. What evidence do they present? Well, the archeologists examined an unspecified number of ancient copper smelting sites in a specific area of Palestine: the Aravah Valley. Here they found an unspecified number of camel bones buried in various layers of soil. Most of these bones showed signs of wear-and-tear consistent with having frequently carried heavy loads. The earliest of these bones (11th to 9th century B.C. according to carbon-dating) is well after the time of Abraham, Isaac, and Jacob who, according to Genesis, used domesticated camels. Older camel bones were also discovered, but these did not show the same signs of wear-and-tear, leading the researchers to conclude that they were wild. From this, they infer that domesticated camels did not exist in Palestine when the Bible talks about them—supposedly proving that these accounts were, at best, later additions to the text by an ignorant people who thought camels had always been there. At worst, of course, these later ignoramuses simply made up all the stories for the sake of political expediency.

Well, that is one explanation of the findings. But is it the only one, or even the best? In a way, “findings” isn’t really an accurate description, because the conclusion hinges entirely on what was not found. It is unfortunate that highly educated individuals need to be reminded that “absence of evidence is not evidence of absence,” but this seems to be a clear case of such forgetfulness. One of the reasons this is a good proverb is that there are often many reasonable explanations for why you did not find what you were looking for. After all, there are almost always more places that you did not search than places that you did. Indeed, it is not difficult to find alternate explanations for the lack of bones in this case either. Off the top of my head, it is possible that:

  • The operations at these sites did use domesticated camels earlier, but these particular bones haven’t been uncovered yet.
  • Camels were used elsewhere in Palestine, but not in these particular copper operations.
  • Camels were only used sparingly in Palestine at the time of the Patriarchs, but more heavily later on, leading to a disproportionate likelihood of finding later bones.
  • Abraham’s camels were brought from elsewhere (he was, after all, not native to the area.)
  • The older camels that were found were indeed domesticated, but used more lightly (After all, the sites either were or were not regularly active that early. If they were not, one would not expect to find bones related to the operation. If they were, it seems odd to me that wild camel bones would be found there.)

And these don’t even consider any possible difficulties with radiometric dating or other potential technical errors in these findings.

Devotees of Scientism, of course, would not have cause to consider these alternatives because science has not yet supplied evidence for any of them. They are only allowed to acknowledge a belief in whatever has been offered up through official use of the scientific method. However, those with a longer view of history will recognize how often academic findings are overturned—particularly when it comes to claims of Biblical inaccuracy.

Take the Gospel of John, for example. Due to a lack of evidence, a presumption of legendary material, and speculation about the amount of time needed for legends to develop, the academic consensus for many decades had been that the Gospel of John was written no earlier than 170 A.D. This, of course, all changed in 1934 when somebody unexpectedly found a fragment of it dating to around 100 A.D. tucked away in a library in Manchester. As it turns out, it wasn’t that no early copies existed—it’s simply that they had not yet been found and identified. Interestingly enough, no revision was made to the academic presumption of legend—they simply adjusted downward their speculation about how long it takes legends to develop.

Episodes like this serve to illustrate that an ounce of evidence is worth a pound of speculation and presumption. In the case of the domesticated camels of the Patriarchs, however, lack of evidence and speculation are all the archeologists are really left with. The only thing that was found was nothing, and the search was hardly exhaustive.

To be sure, it is not unreasonable for these archaeologists to conclude that domesticated camels were a later introduction to the area than what the Bible indicates. For those who already think the Bible is bunk, there is not yet any good reason to believe camels were present any earlier. The alternative explanations I offered are simply maybes which themselves have no evidence. At the same time, however, claiming that their findings amount to “direct proof” is a gross overstatement at best and entirely fallacious at worst. A newly discovered lack of evidence does not rise to the level of proof one way or the other. Consequently, it is not at all unreasonable for those who already believe that the Bible is actually true to continue believing so. After all, we do have good reason to think there is another explanation: the testimony of a Book accredited by the Son of God Himself. Whatever the headlines suggest from day to day, wisdom and history suggest that the Biblical view will eventually come out on top.

Posted in Uncategorized | Leave a comment

An Extraordinary Failure

I have, for the past few months, been teaching a comparative religion class at my church, and just this past Sunday, we began our final topic—the twin-headed religion of Atheism and Secular Humanism. Whether this is truly a religion or not has been debated ad nauseam, but what is undeniable is that many modern atheists behave religiously in many respects. They proselytize and encourage others to do the same. Many have even begun organizing quasi-churches, immediately followed by the time-honored tradition of schism.

One subject I intend to bring up in our next session is Carl Sagan’s famous contention that extraordinary claims require extraordinary evidence. In other words, anyone who wants to suggest something that doesn’t fit into the naturalistic box needs to come up with evidence far beyond what would be required for any other type of claim. This is a favorite saying of some of the lazier atheist apologists, as it allows one to avoid defending his own beliefs and instead sit back and claim that no amount of evidence offered in favor of God’s existence is sufficiently extraordinary. This frees them up to instead focus on the task of provoking volitional doubt in the believer.

This “Sagan Standard” can trip up Christians because it is one of those statements which seems like common sense at first glance. After all, few people would need much time to come up with an extraordinary claim that they would not believe unless extraordinary evidence were offered. Nevertheless, the closer one looks, the less sense it makes. It has at least three critical failings when applied to the existence of God.

First, the statement as a whole simply isn’t true in a broad sense. Not all extraordinary claims require extraordinary evidence—only some do. The reason for this is that many extraordinary claims are compositions of entirely ordinary claims. Take, for example, the Resurrection of Christ. A man coming back from the dead is certainly extraordinary. Nevertheless, a claim to a resurrection is really made up of two simpler claims: first, that a man was dead at a certain point in time, and second, that the same man was bodily alive again at a certain (later) point in time. These are about as mundane as claims come; they only become extraordinary when paired together. And yet, establishing a resurrection requires nothing more. It is therefore entirely possible for extraordinary claims to rest on ordinary evidence.

The second critical failure is that claiming the existence of God is not in any way extraordinary for most people. Despite the pretense of modern intellectuals, atheism most certainly is not the default for humanity. The vast majority of people have believed, do believe, and will continue to believe in some kind of divinity. While extraordinary in the mind of the atheist, such belief is quite ordinary for everyone else. As it turns out, there is a strong subjective element to the concept of “extraordinary” that Sagan and most atheists pass over. What metric shall we use to measure it? Atheists seldom volunteer one. This ambiguity is of great utility to the atheist who needs to rhetorically pass judgment that God’s existence is extraordinary and that the evidence thereof is not. As long as nobody asks and the metric is imposed by unspoken assumption, an atheist’s job is much easier.

Finally, there have been points in time at which God has demonstrated his existence in extraordinary ways. As an historical event, the Resurrection of Jesus Christ (as we already noted) is by most accounts quite extraordinary. Likewise, Jesus’ own explanation for this event famously involves God. Extraordinary or not, however, it is something that actually happened—a well-attested fact of history. Neither does the Resurrection stand alone. Though it is the best attested of God’s miracles, it is hardly the only one. Even if extraordinary claims required extraordinary evidence and the existence of God were an extraordinary claim, extraordinary evidence is still readily available unless one dismisses it a priori.

Like much of modern atheism, Sagan’s oft-quoted contention derives its force from a presumption of atheism that has characterized the intellectual culture of the West for the past century or two. Christians who are approached by an atheist have no good reason to play along with that presumption.

Posted in Apologetics, Atheism | Leave a comment