04/18/14

“The tolerance of intolerance is cowardice,” but the intolerance of the intolerance of intolerance is expected

Ayaan Hirsi Ali was in the news last week when Brandeis reversed its decision to give her an honorary doctorate for her work. Many people have discussed the ridiculousness of Brandeis’ response, which reflects either deception or woeful ignorance. I’m not interested in those issues as much as I am in the justifications of those who argue it was the right thing to do. A blogger on altmuslim claimed that Ali promotes the same intolerance that she claims to be fighting against. He also noted that although Ali’s arguments are treated as scholarship, “her words and arguments are not academic or scholarly.” These points deserve further examination.

Intolerance is an accusation that hurts the feelings of many a liberal, for they also use it liberally. It is most often backed up with the unspoken presumption that one should never want to be labeled as intolerant. Yet it is a poor definition of tolerance that says it is a quality to be valued for its own sake. In other words, if one is to make an argument for tolerance, it must be justified not on the basis of tolerance itself, but on some other fundamental value, such as that of life, freedom, etc. Few of us would suggest being tolerant of those who commit egregious acts of violence (unless, of course, these are committed against animals). There are plenty of things we can and should be intolerant of (corporate business practices, disregard for environmental destruction, etc.), so long as our intolerance is not accompanied by physical violence or the impending threat of violence against individuals.

It is, as this blogger implies, the hallmark of a scholarly or academic argument to carefully separate “bad” acts from “good” religion. In fact, scholars of religion could often be the unintended subjects of Ali’s comment that “Tolerance of intolerance is cowardice.” They join much of the world in pleading with folks not to print cartoons or make films that might cause offense. There are some who systematically dissociate acts of violence from their religious context, even when overt. And this is seemingly well-intentioned. The blogger contends that “her approach is not driven by an academic or scholarly need to help the oppressed,” but it is precisely because Ali refuses to walk only the careful line of disinterested scholarship that she has a passion for change.

If a freethinker criticizes religion, if he or she suggests that the world would be better off without “x” religious tradition, he or she is not insulting God. To the freethinker there is no divinity, and there cannot be one in the public sphere. In the public sphere, there is only humanity. To be sure, the religious may believe that the divine rules public life as well, but this cannot be a community motivation if we desire a free society.

Nor is the freethinker insulting a tradition. There are no traditions we can assess apart from the assemblages of people and buildings that embody them. In the public sphere, there are only people, and these people must live with each other. I’m sure there are thousands, perhaps millions, that are offended by the words of Ali. There are also thousands that are offended by the words written in supposedly holy texts. Are there as many of the latter group? Perhaps not, but does it really matter? It is the hallmark of a free society to be able to offend. Offense and intolerance, insofar as they describe feelings and words, are a signal that an open society is at work.

I’m not talking about allowing people to scream “Fire” in a crowded theater. I’m arguing that suggesting the world would be better off without a particular tradition, no matter how improbable that may seem, is a proposition that should not (and will not) be shut down by claims of intolerance. In other words, it is intolerant, and that is good. It is not intolerant for its own sake, but because of the connections between religion and violence that are evidenced by Ali’s own life. The common refrain that such-and-such particular practice is not actually encouraged in a particular text is no argument against the historical and cultural connection between religion and suffering, particularly considered in an impoverished political and economic context. The point is not that there is a tidy equation, that violence and oppression would magically disappear if religion lost its hold, which is the point that defenders seize upon. I would even argue that such a direct attack is an inefficient approach to the problem, but affixing the label of intolerance does not automatically invalidate the correlation she suggests.

In 19th-century America, there were mean slave owners and nice slave owners, and there were perhaps even willing and unwilling slaves. Many of these men and women, I’m sure, were “good” people. Few of us now would argue that the institution of slavery should have been kept around because there were quite a few folks for whom the system worked quite well, who never hurt anyone and generally got along just fine, or even benefited from its perpetuation. In retrospect that seems silly to consider, but it certainly wasn’t for many at the time. It is the hope of Ali and others, I believe, that we will someday look back at religious traditions the same way, wondering how we justified their abuses for so long.

Of course, the case of religion is different in many ways. It would be as deplorable to prohibit the individual practice of religion as it is to mandate it. But individual practice is the maximum that tolerance should protect, not the minimum, and until it is certain that all individuals are aware of their options for understanding the world outside of religious tradition, we remain far above that maximum threshold of individual practice as a basis for tolerance.

It may be best in the end that Ayaan Hirsi Ali did not receive an honorary doctorate from Brandeis, because it is not academic to be so bold, at least in the field of religion. But it would be a welcome addition if more were.

04/07/14

“Getting Things Done”

In the last few months, I’ve read more “life organizing” literature than I ever have before. I read and reread Getting Things Done, a book my wife read years ago and had on-hand. At the time, I probably poked fun at her, but I’ve been surprised to see how typical (and ineffective) my task management is. I’ve always been resistant to having a book or a method tell me how I should organize things. What I typically tell myself is that I really know best how to do everything, from planning my daily activities to knowing what my long-term goals are and what progress I’m making toward them. What I’m consistently finding, however, is that the things I think are important to me are not the things I spend the majority of my time on.

When I was in my late twenties, I hosted a college-age small group at our house. As a Christian group, we would usually be reading through some text such as The Purpose Driven Life or Wild at Heart. Of course, we also read frequently from the Bible, trying to discern what life lessons we could learn from the reorganization of the temple under Hezekiah for our contemporary existence.

We held the group for a couple years, and the most recurring theme in our discussions was the question of what we were all going to do—what we should do—with our lives. I was in my late-twenties, working at a good job that I was nonetheless unsatisfied with. We owned our house, we had just had a child, we had a dog, etc. We had followed the American dream formula, and it had seemed to work out well. Yet I, like many others, found myself constantly asking, “Is this it?”

The other group members in their early twenties were at the beginning of that same spectrum. The future was open; they could do anything they wanted. But what should they do? Depending on which paradigm one followed, there were ready-made answers. If the middle-class response was “go to college,” the evangelical response was “go on a mission.” Most of us there were trying to reassure ourselves that it was okay that we didn’t want to abandon everything and move to Africa for six months.

The great conceit of the small group was that if we came together and talked about and to God, we would get that clear vision of our lives’ goals and purposes. Or at least we would get the next step. Yet we kept returning to the same questions. On reflection, the group wasn’t large enough or fervent enough for any of us to convince ourselves that we could get a revelation from God about our lives. Instead, we fumbled around with the questions but supported each other along the way with the more practical aspects of life. When I needed to build a fence around the yard, for example, several of them (who knew more than I ever will about construction) came over and helped out. When someone moved, we all showed up to help. But we never got any bigger answers. We just lived life and moved on.

When faced with the innumerable choices and directions our lives can go, we are overwhelmed. Religious traditions fill a definite need in that respect, providing a simulation of knowing what you do not know. That is not to say faith cannot provide psychological/existential relief for people; it can. It does so, however, only to the extent that you ignore the very tenuous connection it has to the way we actually live our lives. If a divine being is ultimately in control, then I am relieved of the burden of ultimate concern about the environment or the consequences of my consumption.

For my part, I exited one system, thinking myself much more authentic for having gotten rid of it. However, at the same time I was being inculcated into the system of higher education, which provides a rival structure for goals and purpose. For six years, I had the goal of earning a degree. It was only near the finish that I began to experience the openness that accompanies life with no tradition, no trajectory, to tell you what to do and where to go. I can’t yet speak to what comes next.

It is, in these cases, easy to allow yourself to go on auto-pilot, so to speak, and let the roles you are in dictate your day-to-day existence. That seems to be what many of us do. While on the outside it looks like an organized life, it is only a coordinated backdrop that overlays an uncertainty that never really goes away. Why? Because there really is no certainty other than that which we construct.

The key, then, seems to be to construct purpose for life or for the day’s affairs that has as little collateral damage as possible, either for your own life or the lives of others. There will be collateral damage, and it must actively be minimized. Anxiety will remain, and it is managed with the systems you set up arbitrarily for yourself. There is more Nietzsche than Sartre here. We establish roles for ourselves, all the while knowing that it is just a play. And yet we must play.

There were several years where, when I realized that I was merely playing a role, I resisted playing it because it was not “real.” But not all roles are the same, not all require the same depth of self-deception about oneself and the world. I have always relied on the top level to dictate the actions for everything underneath, but this doesn’t create a life. If followed unthinkingly, it extinguishes life. We often know this, but we prefer the familiarity of traditions, with all their contradictions, to uncertainty. Uncertainty, however, is a level playing field. We will make mistakes, but they are conscientious ones, and not the unthinking destruction of traditional institutions. In the end, we must actually get things done.

02/10/14

Thoughts on a live debate over the existence of God…

I attended a debate on Friday put on by the Secular Student Alliance at Boise State entitled “Does God Exist?” To my surprise, the room was packed, with about three hundred people in attendance. The debaters were Dan Barker, a former evangelical pastor and co-president of the Freedom From Religion Foundation, and Bill Pubols, a director of Athletes in Action, a “community striving to see Christ-followers on every team, every sport, every nation.” I’ve never attended a debate like this before, but I’ve heard about Dan Barker for some time and wanted to see the type of arguments each side trotted out.

I will say up front that Pubols (who valiantly came in as a last-minute replacement for Matt Slick) was inexperienced and outmatched by the veteran Barker. However, the arguments he brought forth were similar to those of more experienced debaters, albeit not deployed as skillfully or confidently. For his part, Barker was not as charitable as I would have liked in his characterization of Christians, though I agreed with nearly all of his points.

While the constructing and dismantling of arguments was interesting, I noticed a distinct change in tactics on Pubols’ part over the course of the debate. He began with the Kalam cosmological argument, made arguments from universal moral principles, and contended for the validity of the New Testament based on its historical accuracy. Barker in turn dismissed the cosmological argument for making a category error (assuming that the universe itself obeys the same laws of things within the universe), denied that morality had to be universal to be valuable, and suggested a number of irreconcilable contradictions in the Biblical text.

As the debate continued though, Barker retained the same approach while Pubols shifted from making arguments to using anecdotal evidence and making emotional appeals. I recognized both the rhetoric and the tone of his altered argument from time spent listening to innumerable sermons on Sunday mornings.

I sensed that Pubols was more comfortable with anecdotes and emotional appeals than philosophical arguments, and rightly so. Christianity situates the individual within a narrative that spans both time and eternity. Seen from within, this narrative creates purpose and meaning, but as Jean-François Lyotard notes in The Postmodern Condition, this grand narrative is incompatible with scientific knowledge. Lyotard concludes that “it is…impossible to judge the existence or validity of narrative knowledge on the basis of scientific knowledge or vice versa: the relevant criteria are different” (26). The two epistemologies speak different languages, and this became apparent during the debate.

(One might argue then, as many have, that religion and science just occupy mutually exclusive registers of reality. But Lyotard’s point is that narratival justification is no longer possible in the postmodern world, and the best we can do is little narratives that make no claim at universality. In a sense we know too much for the grand narratives to continue to function. And if it were true that religious or scientific beliefs were held in a vacuum, their potential conflict would be inconsequential. In our world, though, they vie for position in politics and culture. This is one reason I can’t buy the argument that freethinkers should just leave believers alone if their belief gives them comfort. It’s not that simple.)

Both men made appeals to scientific knowledge, and I’m curious to know whether a scientific argument is appealing to other folks when arguing over religion. Pubols told of the unimaginable improbability of the universe being constructed so as to support life, which for him points to a knowing creator, but Barker was well-versed enough in scientific jargon to offer counterexamples of order arising from chaos in the universe. Those arguments did little to convince me on either side. It may be because my deconversion was initiated from a more practical and social standpoint. I was more convinced by the arguments from morality and the problem of evil.

The case of morality is particularly interesting because the believer is sincerely convinced that life is not meaningful without ultimate purpose (think Rick Warren and The Purpose Driven Life here), and the freethinker is just as sincerely convinced that life can (and must) be meaningful without ultimate purpose because there is none. This suggests that how individuals pass from one paradigm to another is critically important to understand.

The problem of evil is much more straightforward, and it remains difficult to understand how one can employ notions of the goodness of God, or divine love, in the face of the human condition. As Barker noted, a whimsical or bad God would be easier for him to believe in; insisting that God is good in the face of both good and bad acts in the world requires a redefinition of linguistic terms that is only possible when one starts with the answer. To use a crude but applicable example, if a friend or partner beats you and then tells you he loves you, others would recognize it as manipulation or abuse. On the global scale, and when talking about the divine, many religious folk are comfortable with calling it love.

In the end, although the arguments Pubols first employed were attempts to justify his belief on the basis of philosophy or science, they weren’t the foundation for his belief, nor are they (I think) for most Christians. They certainly weren’t for me as a believer. Christianity was true because I was part of a narrative, one that plotted me in the course of human history and guaranteed my righteousness for eternity. Thus, when his attempts at reasonable justification were thwarted, Pubols resorted to the familiar tactic of narrative, the means by which he and others have been sincerely convinced. He referred to, among other things, the “knowledge” of the heart, the “Truth” of Jesus’ statements such as “I am the Way, the Truth, and the Life,” and the felt “need” we all have for ultimate meaning.

According to the anonymous entrance poll, the majority of audience members were Christian, and there was about a four percent shift toward the nonexistence of God by the exit poll. I came away entertained but wondering if the debate format was worth the effort if the aim is to sway the opposition. Changing the question from the existence of God to the validity of faith would likely have improved the discussion, but lessened the draw to the debate. Overall, it seemed akin to the recent debate between Bill Nye and Ken Ham (which I didn’t see). One commenter summed it up by saying that the only thing that would change Nye’s mind is evidence, and the only thing that would change Ham’s mind is…nothing. But people do change, somehow. If I could only figure out how…

01/19/14

One of these things is not like the others…

I’ve had multiple conversations in the last year about whether atheism is a religion. I don’t self-identify as atheist for both political and ideological reasons, but most of the critiques I see of atheism—which are usually critiques of atheists, and usually about how mean they are—only shallowly engage the ideas they critique and beg the very questions atheists are asking.

A way to get behind the question is to ask what function atheism-as-religion has for the parties who make that claim. It’s easier to deal first with those who self-identify as atheist. The closest thing I know to religious atheism is the Sunday Assembly, whose recent split seems to have been over just how much to explicitly cater to atheists as opposed to a more general humanism. (As a side note, it doesn’t seem the best tactic to argue that a “split” is evidence that atheism is a religion.) They meet together, sing songs, tell stories, and enjoy each other’s company. If you want to call atheism a religion in a colloquial sense based on groups like these, so be it.

I’ve found, however, that those who claim atheism is a religion are usually members of a “rival” religious tradition. The argument seems to go something like this:

  1. Christianity is defined by a belief in God (Jesus).
  2. Atheism is defined by a belief that there is no God.
  3. These are both beliefs.
  4. Therefore, an atheist critique of Christianity is invalid because the two are both belief systems.

There are many problems with this argument. Beginning with the end, if it were the case that all belief systems are structurally the same and they therefore have no ground to critique each other, this would undercut any criticism of another institution. This might be helpful if we judged systems solely on the basis of structure or organization without any evaluation of content, but we don’t, and that leads to the next point.

There is a gap between points three and four implying that all beliefs are qualitatively the same. This is a disingenuous argument because it treats belief as a thing out in the world, separate from the believers who create belief through acting in the world. Sure, a belief is a belief, just as a law is a law, but we wouldn’t likely argue that all laws are qualitatively the same. They pertain to different aspects of existence, and we judge some of them effective and others not as effective.

What the argument is saying is that the act of believing is equivalent in both cases. Again, this is technically true, but it is disingenuous because it negates the content of the belief. It partakes in the sociological idea of rational choice, which suggests that we pick our way of being in the world as if picking a value meal at McDonald’s. In truth, we are already enveloped in a world that disposes us to prefer some ways of being over others. Sincere adherents to a tradition prefer their traditions. They think their tradition is better for them than others for a variety of reasons. It may be because of potential theological consequences; it may be because of social preference. One is deceiving oneself, however, if one both claims to be a member of a tradition and claims that one’s tradition is no better than any others. (Of course, another option is to begin to realize you don’t prefer a tradition as much as you thought you did, that realization becoming a catalyst for change. Such was my experience.)

If we look at the specific beliefs (assuming that the defining belief here is the presence or absence of God), no better case can be made. A monotheist affirms that there exists a supernatural being of higher order that interacts with humanity in some way. Those who are not monotheists do not necessarily believe that there is no God (although they may); they simply lack a belief that monotheists have. These are not the same thing. The first implies the existence of a divine being and suggests that one’s decision is whether to affirm its existence. The second denotes the presence of belief in one case, and an absence in the next.

The reason the argument is not usually made this way is, in part, because the presumption of divine beings has been prevalent in Western society for all of written history. (We may even be genetically predisposed to affirm a higher power, anthropomorphizing what we cannot explain.) The existence of God has been normalized to such an extent that it is the starting point for all discussions about religion. Thus the absence of belief is characterized as a belief in itself, which from a normative stance is also seen as an attack on existing belief. This is not to say that atheists do not attack “believers.” It is to say that “believing differently” is a poor way of conceptualizing an absence of belief.

So what is a better way of conceptualizing those who, from the perspective of religious traditions, do not believe? A better way might be to look at what they affirm. Far be it from me to speak for atheism; rather, I want to suggest that not all ways of viewing the world are belief systems. Or, more precisely, not all are faith systems. In discussions such as these, there is slippage between the two ideas. It is possible to justify belief, but it is not possible to justify faith. Faith is belief in the absence of—or because of the absence of—justification. Its primary criterion is not being subject to falsification. Other epistemologies are defined by their being subject to refinement, criticism, and inquiry. The substance of faith cannot be changed, and this is why it cannot be considered a form of knowledge equivalent to any other that is subject to such falsification. One might even try to argue that faith is better than other forms of belief, but it cannot be the same.

We come full circle here. The faithful can argue, “Well, I can’t prove that God exists, but you can’t prove that he doesn’t!” That is indeed the case. Nor can I prove that unicorns don’t exist. Luckily, I don’t need to, because very few if any think they do. The point is that when other epistemologies come to the fringes of their systemic ability, they may speculate, but they do not assume or create other forms of knowledge to compensate. This certainly does not mean a lack of desire to know the unknown. It entails a humility about our systems and abilities of perception that is in keeping with the history of humanity.

I’d be interested to hear if I am missing possibilities. Is it possible to identify with a particular tradition and yet not think it is qualitatively better to be in that tradition than in others? Is it possible to view faith as an epistemology like any other?

01/06/14

The Last Line of Defense?

Caravaggio – Sacrifice of Isaac (Wikipedia)

Up until the last couple years, I have prided myself on not allowing my worldviews to sway discussion in the classroom. To oversimplify a bit—and speaking primarily of the humanistic disciplines—I thought the university was divided between “activist” professors, those who can’t help but betray their investment in the issues they discuss, energizing some students and alienating others, and “neutral” professors, those who keep their views hidden so as not to abuse their power and give a balanced presentation on all issues discussed. I was ambivalent about the first model, but I aspired to be the second model, perhaps because of humility, perhaps because of timidity.

I was reminded of how conflicted I now am about the latter model when reading an article on religion and violence by Hector Avalos. Avalos is a professor of Religious Studies at Iowa State University. He’s also a former Pentecostal preacher and an outspoken critic of religion. He is something of an enigma: a professor of religion who is openly critical not only of his former tradition, but of religion in general. He’s written a book calling for the end of his own discipline of Biblical Studies because it attempts to perpetuate as a living text a book that he argues is fundamentally incompatible with the modern world. There are certainly others within Religious Studies who are critical of some religion, but not many like Avalos. I’d like to hear about his deconversion some day.

But back to the article, which is entitled “Religion and Scarcity: A New Theory for the Role of Religion in Violence,” a chapter in the Oxford Handbook of Religion and Violence. The article is a riff on his 2005 book Fighting Words: The Origins of Religious Violence. The point of the article—and the earlier book I presume, though I haven’t read it yet—is that violence is caused by a scarcity of resources, real or imagined, and religion is particularly dangerous because the source of justification for the scarce resources it centers on is intangible, and thus unverifiable in any way. For example, because of the belief that a supernatural being, God, condemns homosexuality, many Christians believe that traditional values (i.e., their values) are under attack with the increase of same-sex relationships, and it is consequently their duty to correct the situation, with violence if need be. Sacred space is another example Avalos gives. All the monotheistic traditions want a piece of the action in Jerusalem because each is under the impression that its God has imbued the land with sacred significance for them. One does not need to know much history to know how much violence this belief has caused.

After citing examples of scarcity, Avalos gives a critique of the ethics of religious violence with the following syllogism:

  1. What exists is worth more than what does not exist.
  2. Life exists.
  3. Therefore, life is worth more than what does not exist.

Although I wouldn’t say I disagree, there are certainly more convincing ways to delegitimize religious violence, including the historical examples above. His point, however, is that this violence is taking place on the basis of empirically unverifiable claims. Not land or oil or wealth—although these all can be implicated as well—but faith.

What I appreciated was the candidness with which he made his conclusion. Given the immorality of religious violence, there are two conclusions, Avalos contends. One would be to modify religion so that it does not manufacture scarcities, and the other would be to remove religion completely. The latter would not remove all violence, but would remove one source of purely immoral violence. He doesn’t make a strong case for the first option, partly because I don’t think there’s a strong one to be made. Postmodern Christianity is certainly fighting for this approach, and from an individual perspective, I understand it. It’s one I tried to pursue for some time and have some lingering sympathies for. However, I think that this approach only hides the symbolic violence that religion can still contribute to in other spheres. I’d call this the Pontius Pilate approach. I’ll wash my hands of the whole thing, and if people happen to get hurt, it’s not my fault.

The second approach, the one Avalos spends more space discussing, is to rid the world of religion. How so? With education. By exposing religious thinking to the same process of rational thought and empirical evidence that governs other spheres of inquiry. He ends with the following: “Even if it can never be achieved, the most ethical mission of academic religious studies may be to help humanity move beyond religious thinking.”

I cannot vouch for the ethos of other religion scholars, but this is definitely not how I learned to teach religion. The religious studies scholar’s role in the twenty-first century seems to be to defend religion. This tendency seems to have accelerated after 9/11, when many Americans had little difficulty believing that Islam existed only for violent ends. (Indeed, many still do.) Religion scholars have perpetually mounted a concerted defense of religion, usually by denouncing acts of violence as not religious in their very nature or by making some sort of separation between good and bad religion. I wrote about that in the case of the Boston Marathon bombing last year. While the intention of many was likely good, attempting to halt the proliferation of violence upon violence, it seems to have furthered the role of the religion scholar as the defender of religion. One doesn’t need to defend religion on its own merit in order to denounce violence against those who are religious, and this defensive posture has produced some unthinking scholarship.

It is abundantly clear that religious traditions have had and do have intimate associations with violence, in physical, symbolic, and systemic forms. I suppose the charitable question would be the following: If, as any sincere religious believer would have to think, eliminating religion is not the best option, and assuming one wants to minimize violence, how can one remove the violence and keep the religion? (Hint: the answer is not to dissociate violent acts done in the name of religion from religion. That just offloads the problem.) I think this second approach may actually be the more difficult one.