Why is it that refusing to judge is often seen as a form of virtue? I’ve been leading discussion groups this semester in a class that explores the nature of genius. This week we explored the idea of “evil genius,” a significant trope in our culture, often appearing in other guises such as the mad scientist or the supervillain. We asked the students whether it was possible for genius to be evil. While most lined up on one side or the other of the debate, a few vocal students protested, and one defended his protest with a classic “Who am I to judge?” line: “What some might call evil, others might call good.”
Now as an abstract statement this might be defensible. There are certainly cases, particularly involving violence, where one person’s (or one country’s) evil is another’s greater good. However, what this student (and many others like him) was doing was using the fact of a multiplicity of perspectives to conclude that we cannot and should not make distinctions between perspectives. “That’s just their culture” is disingenuous, but it is seen as a modern, savvy, and politically correct response. After all, it’s better than “They’re just inferior” or “They’re just savages,” right?
Don’t get me wrong. I’m glad that cultural imperialism is not—at least overtly—the norm in higher education. The problem is that many students conclude that the best response to lessons about pluralism and diversity is to adopt a position of cultural relativism, and many teachers either don’t know how to correct the trend or think the same way their students do.
Of course, none of us are really cultural relativists. We are cultural relativists insofar as we are personally unaffected by, and distanced from, the cultures we are reluctant to judge. We are cultural relativists insofar as we reside in a culture that allows us the privilege to treat other cultures as thought experiments. Yet insofar as we are privileged, we should instead use that privilege to question thoroughly both other cultures and our own in order to make judgments for positive change.
So a historical shift has taken place from explicit cultural imperialism to an implicit cultural imperialism under the guise of appreciating and valuing cultural diversity. Religion plays a significant role in maintaining this separation. Echoing my own past religious experience, students who profess a strong Christianity usually fail to see a connection between their ideological ethics and their practical ethics, their way of operating in the world. Granted, this is true to a certain extent of all students, due in part to the infiltration of Christian values into American life, but it is more easily visible in the religious. These students are quick to defend Christianity from perceived attacks and extremist misrepresentation, but they fail to see the practical ethical implications of the Christianity of Jesus they profess.
I don’t think this is all their fault. The training to connect an ideological Christian ethic with reality is remarkably sparse within the religious community. Christianity is personal salvation, after all, and it is rarely in the institutional interest to advance anti-institutional claims such as equal treatment for the LGBTQ community or universal health care. If anything, religious students are implicitly told not to make their religion a big deal in public for fear of becoming one of those extremists on the quad who scream at scantily clad women that they’re going to hell. Higher education maintains a tacit agreement with religion to allow students to keep their faith unquestioned, and even to use it to make their decisions, as long as they don’t make it overly obvious.
This strongly contributes to the “Who am I to judge?” scenario above. It stems from an inability to engage complex issues because of a faulty and undeveloped means of reasoning. It may be that refusing to make any sort of judgment is better than trumpeting an overtly culturally biased one, but I’m not sure that it makes a lasting difference if the underlying mechanism of unjustified belief remains in place. If that counted as a success in the 20th century, it is no longer enough.
There has been much talk lately of the dim future of the humanities, and if pluralism and cultural diversity are the best things they have to offer, the analysis may be correct. Cultural diversity should absolutely be taught, but not in a way that allows students to keep their ideologies sacred. We should at least not pretend that this makes for a productive and successful citizenry. What it makes is a body of people who profess love, care, and community support while they maintain bias and bigotry against others. Education is not about what not to say or what to think, but, as David Foster Wallace claimed, “how to think.”