
The Introversion Cop-out

Social life, and the social culture that surrounds it, is by necessity an idealization of extroverted personalities.  Being outgoing, adventurous, flirtatious–i.e., sociable–is the go-to characteristic that storytellers revert to when they want to make a character likable.  In contrast, if they want to convey the point that a character is not fully well-adjusted, the usual trope is to make her/him socially aloof (or downright inept), awkward, withdrawn, or not good at the basics of human interaction (somehow Sherlock Holmes can deduce all the intricacies of human behavior to get an accurate read on people’s personalities, right down to their favorite toilet paper brands, but can’t figure out that he himself is a total asshole, huh?).  Given this pervasively negative portrayal of introversion by media and entertainment sources, it’s no surprise that many introverts will eagerly seek out any medium that affirms some level of humanity in the introverted individual.

Self-help books on Amazon that deal with introversion not as a maladaptive flaw, but as a perfectly valid state of personality, garner a lot of support, both in their reviews and in their sales numbers.  Online communities (which tend to skew heavily towards the introverted side of the personality scale anyway) will often share supportive words and studies showing that being an introvert doesn’t simply end at “not being social,” but encompasses a wide array of positive traits, too, such as thoughtfulness, self-sufficiency, and creative aptitude.  One could even argue that, given the ease with which social media has taken over so much of modern personal communication, this digital age we’re enjoying caters much better to our introverted tendencies, since users of these platforms get to tailor interactions to their personal comfort levels.

Personally, I definitely lean more towards being an introvert than an extrovert, so I’m inclined to welcome any positive press bestowed on my fellow shut-ins (relax; we’re allowed to ironically use these demeaning terms among ourselves).  But going right along with the introvert’s supposed knack for thoughtful introspection, I would be doing my tribe a disservice if I didn’t point out that for many people the introvert label has become something of a cop-out to avoid uncomfortable situations, or to avoid taking steps towards any semblance of self-improvement on the social front.

Everybody has bouts of introversion, even the most socially lively among us.  Usually these show up while we’re in the midst of new social surroundings and experiences.  What seems to separate the self-identified extroverts from the self-identified introverts is the way they respond to said experiences.  Extroverts will use the initial discomfort to energize themselves and try to turn the unfamiliar setting into something familiar (thereby increasing their comfort level with it), while introverts tend to see these social settings as a drain on their energy and will approach them like a tedious chore (thereby not concerning themselves with increasing their comfort level in the situation, but focusing on the comfort they’ll get to enjoy once they’re finally able to be alone again).  I’m admittedly generalizing here for the sake of brevity, so calm down with the caveats and nuances I know you’re preparing to angrily type my way (we introverts do have a penchant for pedantry, after all).

With all this pop psychology aside, I want to get to a matter that I have observed pretty prominently for a while now.  For a lot of us who identify as introverts, we often use the label as an excuse to cover for our shyness.  As I said, everyone is introverted some of the time, but I’ve noticed that for many of us who define ourselves as introverts–not just as one of our personality traits, but as the defining trait of our identity–what we seem to be doing is using the now more socially acceptable fact of being an introvert to hide the still less acceptable fact of just being too shy.

What reason would any of us have to delude ourselves this way?  Well, for starters, to say that you are an introvert is to say that avoiding social settings is a part of your nature, while admitting that you are just too shy for social settings might make you sound like you are fearful, and therefore make you feel like a coward.  It goes without saying that being shy doesn’t make anyone a lesser person, but it’s also unavoidable that most of us would rather not advertise our fears and insecurities to the rest of the world.  With the rise of respectability given to genuine introversion, many of us see it as an opportunity to mask our social fears and anxieties behind it.  Meanwhile, we continue to feel withdrawn and isolated, and continue to fall deeper into the despair of loneliness, making it much worse for ourselves because we’ve now fooled all those around us into believing that being alone is our preferred state of being.  And because we have convinced others (and, on a surface level, ourselves) that we are innate introverts, whose default nature is to be away from others as much as possible, we eventually find it impossible to seek out what we truly do crave at our core:  companionship and camaraderie.

It took me some time to accept that deep down I wasn’t just an introvert comfortable in solitude, as much as I was also a shy kid who was afraid to engage in social settings, despite actually having a basic desire to do so.  This shy kid eventually became a shy adult who embraced his more introverted qualities, because that was easier than having to confront my honest fears on the matter, and leave myself vulnerable to the very sort of judgment that caused my shyness (and nurtured my introversion) to begin with.

Much like stage fright, I can’t promise that shyness ever really goes away.  Whether its origins lie in nature or nurture (or a combination of both), once you mature through life with it, you’ll always feel some of its effects on you.  But there are ways to lessen the sting of it, especially when it comes to your outward interactions with others.  It takes effort (a lot of effort), as no book, seminar, or inspirational quote can do the job of remolding the way you see yourself, and the way the world interacts around you.  But it can be done.  And if you are a self-identified introvert reading this, I would ask you to consider whether, for you too, it is perhaps simple shyness that is at the root of what you believe to be an inherently introverted character.

And if you are considering finding ways to overcome the negative aspects of shyness that are keeping you from being as happy in life as you could potentially be, a giant step forward will be to admit the fact of your shyness to yourself.  The next steps forward are more incremental, and involve making a combination of small and moderate changes to your way of thinking about socializing and interacting with others.  One giant step backward from any possible progress, however, is to cling to things that allow you to hide from the reality of your fears and insecurities about achieving the social life that would satisfy you (whatever extent or comfort level that may be), and to pretend that your lack of social interaction is the result of being an innate introvert, when it probably has more to do with simply being a person whose shyness has caused them to avoid the initial discomfort of socializing.  There is no shame in not wanting to be alone, but hiding from this want and continuing to deny it to ourselves out of a misguided sense of loyalty to an identity we have adopted to cope with our shyness is the best way to guarantee a lifelong refuge in a misery that need not be.

Dispatches from Gulfton

The first grocery store I saw when I moved to the United States was a meager-looking spectacle called Sellers Bros. in a rundown strip-mall area of southwest Houston, TX.  The store’s shelves were as overcrowded with bargain, generic-name products as its aisles were with patrons shuffling from one end of the building to the next, holding tightly to the Lone Star Cards needed to feed their families for the month.  The building’s somber-looking exterior bore a passing resemblance to the apartment complexes that surrounded it only a few paces away—one of which my family was living in at the time, serving as our first exposure to the realities of the inner-city American life we had immigrated to, and were gradually assimilating into.

The majority of the neighborhood was composed of immigrant families.  Though unlike my family, which originated east of the Atlantic Ocean, it was impossible not to notice that most of my neighbors hailed from south of the Rio Grande.  As a result, while I had come to this country with the advantage of being able to speak English reasonably well—well enough to understand, and be understood by, the general Anglophone population anyway—this advantage proved of little value on the very street I called home during these years of my adolescence.  It was an early education in a fact that many living in urban America are readily familiar with.  Namely, that within the reality of American life reside smaller sets of conflicting realities, many of which can neither communicate with nor understand one another, and are set up so that they will rarely meet.  Gulfton Street in Houston, Texas, occupies one such reality.

Tucked away between two major highways in southwest Houston, spanning a stretch of 3 to 4 miles of cracked concrete landscape, sits the street of Gulfton.  It is the epicenter of the Gulfton Ghetto, as the area is occasionally called by the local media and by other Houstonians (though never by the neighborhood’s own inhabitants).  To those who take a wrong turn off Bellaire and find themselves driving down Gulfton Street by accident, the insulting nickname will seem most warranted.

The immediate sights one is met with are panel after panel of gang graffiti, row upon row of low-rent apartment complexes, and concrete sidewalks that have been in desperate need of repair for a good few decades now.  Surprisingly, there is a park/recreational center meant to give some relief to the area’s ongoing problem with juvenile delinquency, though anyone who has ever set foot in the park itself will be quickly robbed of any hopefulness at the prospect of this endeavor.  In short, like many neighborhoods in urban America, Gulfton is a place that has been largely abandoned to the ravages of metropolitan entropy.

Under-funded and halfway fleshed-out improvement projects that have failed to live up to expectations are pointed to by the rest of the city as reasons not to bother with any future attempts at repairing the crumbling infrastructure.  This leaves the residents who have given up on the idea of moving away to either wall themselves off from the unsavory conditions that surround them within their private residences (however meager those may be), or to embrace those conditions by becoming a part of their destructive nature.

The first instinct any well-meaning person will have when confronted with a reality like Gulfton is to ask, “Can anything be done to fix this?”  It’s an honest question, but it betrays a lot about the person asking it.  The idea that there is any one thing that can resolve problems that are decades in the making is a part of the problem to begin with.  These sorts of problems have no one facet of origin, but are a delicate, interwoven mess of social, economic, and political barriers erected and maintained through complex systems with interests that themselves compete against and prop up each other in a multitude of ways.  The problems of Gulfton, like the problems of similar neighborhoods and populations throughout this country, have no single cause; hence they can have no single solution to curb the path they are currently on.

“Why don’t the people living there work to fix things?  It’s their neighborhood, after all.  Don’t they care?”

Unfortunately, the reality of all urban areas is that they are landlocked and dependent on the larger metropolis that surrounds them.  They don’t get to make decisions in a vacuum, and resources are finite and sparse in terms of what will be readily allocated to benefit them.  The further issue is that once a neighborhood has fallen far enough to be regarded as “hopeless” by the officials and administrators who could possibly make a difference, the very hopelessness of said neighborhood is used as the reason against committing long-term funds to improve its conditions, on the basis that it would be unfair to use tax dollars from well-behaved citizens in more savory parts of the city to fund the activities of no-good thugs and gangsters in these low-income, high-crime areas.  Local agencies will say they are not equipped to handle the expenses needed to undertake the sort of social projects necessary to overhaul the issues plaguing these sorts of areas, while Federal agencies see these issues as strictly a local concern.

In the absence of a robust social safety net provided by the city or state authorities to ensure the most basic of securities and public amenities, opportunistic forces will band together to construct their own safety nets.  For many young people this takes the form of turning to gangs, which prey on social instabilities by offering their quasi-organized crime structure as an alternative to festering in a decrepit social system.  The reason youths are most susceptible to this is that they are the most in need of some kind of functioning social order to orientate their lives (and relieve their boredom), and even the violent and dangerous structure of gang life is to many preferable to the instability of no visible structure at all.

Some people have a natural aversion to hearing that any issue constitutes a systemic problem, requiring a systemic approach to resolve.  To them, the very notion of entertaining such a thought is little more than an attempt to shift responsibility away from individuals and let them avoid the consequences of their actions and/or apathy, leaving them no incentive to make things better of their own accord.  I can understand the sentiment behind this aversion, though I find it largely misinformed.

In a place like Gulfton, how exactly do you expect the individuals living there to step up to fix the various problems that plague their environment?  Should they pool their meager earnings together to pay for the ongoing structural damage to their concrete sidewalks and street signs, despite the fact that we’re talking about city property, and as a result an issue that needs to be addressed by the local government?  How about the need to improve the resources available to the local schools so that there can be robust after-school programs and activities available for young people to occupy their time with, to discourage the need for delinquency and gang activity?  Should the low-income earning parents of these youths fund these programs directly, thereby taking away money that’s needed to pay for rent, utilities, food, clothing, etc.?  Would that be an example of individuals stepping up to take personal responsibility to improve the conditions around them, or a neglect of one’s obligation to provide basic necessities for one’s own family first?  If donating money is not the answer, surely we can get everyone to at least volunteer their time to improve their community, no?  It’s not as if the sort of people who have to live in these neighborhoods are undoubtedly also stuck working jobs with little to no flexibility in hours or time off, after all.

Perhaps the answer is that all these folks ought to work harder to increase their earnings, so they aren’t hostage to their economic conditions.  Yet, if they actually managed to do just that, what incentive would they have to spend their extra earnings on repairing a place like Gulfton, as opposed to–oh, I don’t know–simply moving away to a better part of town that already offers all the basics of having dignified living conditions?

Unless you are Bruce Wayne, sitting on an endless supply of inherited wealth, resources, and leisure time, individuals donating money and/or time will never be a solution to the problems that affect neighborhoods like Gulfton.  These are problems that took a long time to manifest, and they require long-term investment and planning to be resolved.  It requires layers upon layers of overarching organizational resources to properly oversee and track improvements, resources that no single individual or clustered group is capable of providing.  Private businesses, local or otherwise, also offer little help in the matter, since there is no business incentive in investing in a place simply to improve the lives and environment of its residents, residents who will not be able to return the gesture because, at the end of the day, they’ll still be too poor to ever turn a profit for these businesses.

And it takes an astounding level of naivete not to be able to realize this.  The same sort of naivete that leads certain people to make inane points like, “If you like public programs, and think taxes should be higher to pay for them, why don’t you just volunteer more of your money on an individual basis, instead of demanding everyone else do it through the tax code?”  Because individual actions and donations will not solve systemic problems like the ones affecting neighborhoods like Gulfton, that’s why.  Because many of the problems plaguing inner-city life are far too complex, and too interconnected with a multitude of surrounding factors, to be seriously brushed off with red herrings concerning individual responsibilities.

Areas like Gulfton are the way they are because they have become culturally and economically alienated from the rest of their metropolitan centers, and the rest of the country at large, and little is being done to incorporate them into the greater society that surrounds them.  The full reasons for this alienation are legion, and the solutions that will be necessary will by definition be just as extensive, which is a reality that must be acknowledged by those who purport to take the issues of working, urban, and immigrant communities seriously.

If, on the other hand, you simply don’t care about places like Gulfton, then just say you don’t care, and stand by the convictions of your apathy.  And stop pretending that there is a greater moral or ideological basis to what is essentially pure disinterest for the plight of people you can’t be bothered to give a shit about.  It will make for a much more honest conversation.

The Art of Rhetoric: Its Virtues & Flaws

In a not-too-distant previous life, when I still thought that standing in front of dozens of apathetic teenagers in the hope of teaching them proper grammar, writing, and argumentation skills was a worthwhile vocation to pursue, I came up with a nifty little speech to start off every semester.

I would say:

I know exactly what you are thinking right now.  It’s the same question every student, in every course, in every land thinks every time they enter a classroom.

Why do I need to learn this?

The simple answer is that it’s because the law requires you to; at least until you turn 18.  For most of you that’s a good enough answer to put up with my incessant talking for a few months, scrape together enough effort to satisfy the course requirement, and move on to your next classroom, until the law finally says that you’ve gone through the motions long enough to be let loose into the real world, full of non-classroom-type duties and responsibilities.  For most of you this answer is good enough.  But there are a few of you for whom this sort of reasoning is nowhere near good enough to make you put up with what the education system expects of you for an hour and fifteen minutes of your day.

If you fall within that group, I want you to listen very closely.  In life you will meet many people.  A great number of these people will make prejudgments about you from the first moment they see you–both good and bad.  The good prejudgments will work to your benefit, and the bad will be obstacles that can make your life very, very hard.

People will make prejudgments about you based on your height, your weight, your race, your gender, the way you dress, the way you stand, even the way you choose to cut your hair.  The negative opinions formed by these prejudgments, no matter how unfair or shallow, will for the most part be things you have little control over.  Except for one important component:  the way you communicate.  Yes, people will judge you by how you speak, too.  And while you can’t do much about someone who simply hates you for the way you look, you can sure as hell do everything to deny them the pleasure of dismissing you for the way you communicate.  Even if they still hate you at the end of the day for all the bigoted ways available to them, you should at the very least do everything in your power to make it impossible for them to dismiss you for the way you write, the way you argue–the way you speak!  That is entirely within your power, and it is a power that’s learned, not inherited.  This is your opportunity to learn it, if this is a power you wish to possess.  If you don’t, any prejudgments others make about your person as a result of your decision right now will be entirely on you.

I’m biased, but I like to think it got the point across as well as anything else could.  And while the point was of course to get the students to feel somewhat enthused about the lesson plan, there was also a deeper purpose to my little pep-talk.  Namely, I was demonstrating the use of rhetoric to argue the case for learning about rhetoric (none of the students ever really picked up on this, though).

Rhetoric has a few technical (read: boring) definitions floating around, but the basic gist is that rhetoric is a form of discourse aimed at persuasion (typically of a person or audience).  This is the part about rhetoric that most philosophical commentators agree on, anyway.  Opinions regarding the use or ethical standing of rhetoric have been more polarizing, however.  Plato looked down on rhetoric as mere flattery that could be used to manipulate the masses, as its primary purpose was to convince you to side with the argument, not to impart knowledge or truth.  His student Aristotle took a more favorable view, and considered rhetoric to be an important discipline (and art form), and a necessary part of any well-rounded civics education.  Much of the writing and many of the social revolutions that emerged from the Enlightenment relied heavily on rhetoric to persuade the public to a new way of thinking about life (and liberty, and even the pursuit of happiness).  The same goes for anti-Enlightenment reactionaries, who argued in favor of preserving the status quo in society.

In the modern world, rhetoric (in its purest form) is most readily seen in courtrooms and legislative bodies, and the political spheres that surround them.  It’s no surprise that so many politicians start out as lawyers, and go on to use the same rhetorical tricks they learned in law school on the campaign trail.  It’s for this reason that rhetoric takes on a negative connotation in many people’s minds.

Memorable (yet content-empty) slogans, propagated by conscience-devoid politicians whose only concern is scoring a victory in their (and their donors’) favor.  Arguments put forth by their mouthpieces in the form of public commentators and pundits, serving the sole purpose of winning over the electorate’s hearts, often at the expense of their critical thought and personal long-term interests.  Honorable mentions also go to the rhetorical tactics of self-professed experts who peddle pseudoscience and conspiracy theories, to the effect of fostering a perpetually misinformed populace for the sake of monetary gains.  These can all be counted as examples in support of Plato’s skepticism towards rhetoric as a virtuous mode of discourse.

Even my speech above is arguably laced with unwarranted rhetorical hyperbole.  (Honestly, most people you meet will probably not form good or bad opinions of you; they’ll probably look right past you with complete indifference, if you offer no value to them as a person).  However, one should refrain from getting distracted by unwarranted equivocations.  I sincerely believe there’s a big difference between educators using rhetoric to motivate their students to succeed in their coursework, and the sort of rhetoric that contributes to public policy meant to misinform the public (if you don’t, I hope you never get picked to serve on any jury).

I already mentioned the culpability of politicians making use of rhetoric to spread propaganda for ideological gains.  And while this is universally snubbed as somewhere on the edge of morally questionable behavior, the only reason it’s done is because it works so well.  In other words, people get manipulated by the bells and whistles of skilled rhetoricians because they don’t care to educate themselves about the hogwash they are being fed (usually because they agree with, and want to believe, what’s being said to them, even if it’s factually baseless).

The public (at least its voting component) is the primary check on politicians in a democratic republic.  However, given the ease with which we will readily be swayed by faint words of praise and reckless fearmongering, it’s not absurd to think that Plato may have been on to something when expressing doubts about the public’s ability to defend itself against rhetoricians whose only purpose is to persuade, with complete disregard for the truth of their words.

A secondary check on the rhetoric of public officials is the part of the voting public that makes up the free press.  The reason the founders of the United States explicitly mentioned protection of the free press from the government in the first amendment of the U.S. Constitution relates directly back to the role the press (ideally) ought to have as the fact-checkers holding those in power accountable.  Unlike the public, a respectable free press has several internal mechanisms in play that work to sift credible information from the dubious.  It’s also why the first thing clever rhetoricians do is undermine the very credibility of the free press.  “Fake News” is a beautiful example of manipulative rhetoric at its finest, as it plays on the public’s distrust of media sources (i.e. it’s only reasonable to believe that some news outlets fail to overcome the biases of their presenters) and gives it a credulous dose of self-serving generalization (i.e. all news outlets that disagree with me are the biased ones, regardless of any evidence they present to support their position).

Any reasonable amount of critical thought on the subject clearly shows that the fact that news sources can be mistaken (or even outright deceptive) does not therefore warrant the conclusion that all media must be wrong and lying whenever they report something you don’t want to be true.  Once again, it’s up to the public to follow up on the sources any reputable press will readily provide for them, to check the merits of what’s being reported.  Shouting “Fake News,” however, makes it easier to muddy this relationship between the public and the press, by painting all sectors of the press as untrustworthy in general, and allows people to lazily self-select only the media they are already disposed to agree with, without having to be burdened with doing any intellectual legwork.

Journalists are also rhetoricians by trade.  Unlike politicians and lawyers, however, members of the free press ought to strive to belong to Aristotle’s more virtuous sect of the rhetoric spectrum, which aims to persuade the masses towards truth and knowledge.  As journalism moves more towards competing for public viewership in order to continue to operate–thereby having to appease the whims and tastes of the public, rather than seeking simply to inform them–the concept of fact-based reporting threatens to descend completely into the realm of vacuous rhetoric, meant to do little more than keep up viewer support (which, as mentioned, is prone to succumb to some flimsy and fickle interests).

The elevation of online personalities, whose sole journalistic experience is being able to cultivate an audience around themselves on video-sharing sites like YouTube, under the neologism of “alternative media,” is an example of a free press where rhetoric takes precedence over fact-based reporting.  Without meaning to smear those personalities who make every effort to be a respectable source of information, the reality is that the environment of being an online news commentating source is inherently prone to undermine the fact-checking mechanism of traditional journalism, mostly by side-stepping it completely in favor of peddling rhetoric.

These online outlets have little in the way of field-based journalists doing the legwork to uncover newsworthy stories, let alone teams of fact-checkers tirelessly looking through sources and notes to determine the veracity of a story prior to its reporting.  In truth, they rely almost entirely on the work of traditional journalists, whose work they present and provide opinionated commentary over, while every so often throwing in jabs at how ineffective traditional journalism is, despite most (if not all) of their actual “news” content coming through the efforts of said traditional journalism.  The reason this matters is that it is a clear example of what could be a respectable profession, and a reliable venue of information for the public, sacrificing its responsibility to disseminate factual knowledge for the convenience of mindless rhetoric, because the latter offers popularity and financial gains in terms of viewer support and sponsorship.

Understanding the role of rhetoric–its values, its uses, and its prevalence–is vital to being able to identify the difference between an impassioned speaker fighting on behalf of a just cause, and a demagogue looking to manipulate the mob to his advantage.  It’s vital to being able to distinguish between journalists who go through many painstaking, sleepless nights to report a truth to the people as a public service, and pundit blowhards using the cover of journalism to propagate misinformation for their own gains and egos.  In general, to understand the use of rhetoric is to be able to identify it and (if need be) ward yourself against its more dire influences.

Rhetoric is not, and should not be, a dirty word.  Like most things, in benign and well-meaning hands it is a powerful tool of communication that can inspire immense good in the world.  In the wrong hands, however, it can be the barrier that keeps us permanently free-falling into the abyss of credulity and self-destruction.


Treatise on Blasphemy

Recently the Republic of Ireland held a referendum to repeal the country’s longstanding blasphemy offenses.  While blasphemy still stands as a finable offense in the Republic under the 2009 Defamation Act, the referendum is nonetheless a demonstration that, as far as the Irish people are concerned, charges of blasphemy ought not to be a part of punishable civil law in their nation.

Friends in my adopted homeland here in the United States usually have a conception of Western Europe as being made up of a set of predominantly secular and progressive cultures.  And speaking as someone who spent many years growing up in Western Europe, this conception isn’t wholly unfounded.  As a result, it might astound many Americans to hear that some of these secular, progressive, ultra-liberal, borderline lefty countries still have enforceable blasphemy laws in place.  Granted, the actual enforceability of such laws is largely theoretical in nature, given that they are usually undermined by far more salient laws allowing for the freedom of religious expression and the freedom to believe in accordance with one’s personal conscience.  Thus, blasphemy laws currently exist as a vestigial organ in European law books: without practical purpose or application, but still present nonetheless.

“If these laws are unworkable, then why even bother to fret about them with referendums at all?  Why not just continue to ignore them, and get on with your blaspheming ways?”

This could be a reasonable response, but it misses an important point concerning blasphemy laws.  Putting aside the fact that it makes perfect sense to oppose the criminality of blasphemy on principle alone, as unbecoming of any modern democratic nation, there is also the issue of the frail foundation on which the laxity of these laws currently rests.  To put it more plainly, the reason blasphemy charges are unworkable in most of the European nations that have them is precisely because the current sociopolitical climate is too secular and progressive to enforce them.  However, as any student of history knows, sociopolitical climates are anything but static.  So what happens if the political pendulum swings too far to the right, towards a political faction that views the protection of religious sensibilities as far more important to a nation’s cultural well-being than the free expression of its citizenry?  Suddenly, these outdated blasphemy laws that have had no real thrust in civil law for almost two centuries become a very powerful weapon in the hands of reactionaries all too eager to use the existing rule of law to conform society to their line of quasi-pious thinking.  And this is a potential threat both believers and unbelievers alike ought to be concerned about.

Blasphemy isn’t simply the act of professing one’s disbelief in religious claims, whole cloth.  Blasphemy is built into the very manner in which all religions profess the doctrines that make up their faiths.

Whenever polytheistic faiths, like certain sects of Hinduism, profess the existence of multiple gods, they are blaspheming against monotheistic religions which insist that there is only one god, and none other (and vice versa).  Within the monotheistic Abrahamic faiths, when Christians profess that Jesus Christ is the foretold messiah, they are blaspheming against the Jewish faith, which claims that the messiah is yet to come (and vice versa).  When Muslims claim that Jesus, though a prophet and a messiah, is not the son of God, they are blaspheming against a central claim of Christianity.  The Catholic Church’s stance on the supremacy of the Roman papacy is blasphemous to the Eastern Orthodox Churches, and the Protestant rejection of Catholic ecclesiastical authority is blasphemous to Catholics.  The Methodists are blasphemers to the Calvinists, and just about every Christian sect considers Mormonism a heresy.

The obvious point here is that to take away the right to blaspheme is to make it impossible for religious pluralism to exist within a society.  Perhaps this is fine as long as your religious opinion is the dominant one in the society you inhabit, but what happens if you find yourself just short of the majority opinion?  What if a population shift occurs, and the very laws that enforced the thin-skinned sensibilities of your religious persuasion become the means by which the new dominant line of thought undermines your right to religious expression?

I could stop writing now, and end on this appeal for mutual cordiality between people of all faiths, and on how it is in everyone’s self-interest to oppose blasphemy laws, but I fear it would leave things very much against the spirit of healthy discomfort that blasphemy really should elicit in a person when coming across it.  On that note, allow me to address the elephant in the room that needs to be brought up whenever concern regarding religious offense of any sort, in law or public discourse, rears its head.

Undeniably, religions make bold claims for themselves.  Claims that offer definitive answers on matters concerning life, death, and morality, all while wagering on a monopoly on Truth with a capital T.  And they are always keen to wrap this all-knowing, all-encompassing bit of absolutist wisdom in a garb of self-proclaimed humility, as if to say, “No, no, don’t mind me…I’m simply professing to know the answers to all of life’s mysteries, ready-made with the consequences (read: threats) that will befall you if you don’t follow along with my modest creed.”

In short, religions by their inherent design simply claim to know things they couldn’t possibly know.  But I, in turn, admit that I don’t know.  I don’t know what the answers to life’s mysteries are; nor do I know which of today’s mysteries will remain mysterious forever, and which might become common knowledge for subsequent generations to come.  I don’t know which moral answers yield the most objective good for humanity; nor can I say for sure that such answers are even completely knowable.  The truths I do know come with a lowercase t, held provisionally in accordance with forthcoming evidence and reasoned arguments, and I don’t know if I can do anything other than to reject the grammar of bolder Truth claims when confronted with them.

It is precisely because I don’t know that I am left with little recourse but to examine, question, dismiss, disbelieve, and (when I see fit) deride those who do claim to know, yet offer hardly a shred of evidence for their claim.  It took centuries of debate, and the bloodshed of previous generations of thinkers, for any of us to be able to enjoy this simple — yet powerful — privilege of skepticism.  A privilege I do hold up as my right, and which I will speak up for without hesitation or apology.  What you call blasphemy, I call critical thought.  And if anyone can appeal to tradition as a means to protect religious sensibilities through the law, I am fully within my rights to appeal to the tradition of cultural and intellectual pushback against religious doctrines and religious authorities that has made it possible for any sort of interfaith (and non-faith) social cohesion to exist in the modern world.  A tradition that includes the right to the profane and the blasphemous, which cannot be allowed to be abridged in a democratic republic, for as long as one wishes to be part of any nation worthy of the claim.

Egalitarianism: A Practice in Self-Scrutiny

Genuine self-scrutiny is a personal virtue that is much easier preached than practiced.  Usually the furthest most of us are willing to go is a relativistic acknowledgment that differing opinions exist and that, all things considered, we would be willing to change our minds if these alternative viewpoints were to persuade us sufficiently.  But, in my opinion, this sort of tacit relativism isn’t much in the way of self-scrutiny.  To self-scrutinize is to actively challenge the values and ideals we hold dear to our person–to dare to shake the foundation holding up our most cherished beliefs, and test if the structure in which we house our beliefs is sturdy enough to withstand a direct attack.  In contrast, the aforementioned acknowledgment that differing (and potentially equally valid) views to our own exist is a very passive stance, as it strictly relies on an external source to come along and challenge our position(s), with no actual self-scrutiny being involved in the process.

Up to this point, this very post can rightfully be characterized as the passive variant; i.e. it’s me (an external source) attempting to challenge you to question the manner in which you view the world around you.  Although there are occasionally posts on this blog in which I sincerely try to adopt stances opposed to my own, the truth is that I do this primarily to better strengthen my own position by being able to effectively understand what I’m arguing against.  This, too, is not self-scrutiny.  And it would be dishonest to pretend otherwise.

To truly self-scrutinize I would have to pick a position–a value, an ideal–around which I orientate my worldview, and mercilessly strip it to the bone.  The frustrating part of such a mental exercise is the inevitability of having to rely on generalizations of my own opinions in order to be able to paraphrase them thoroughly enough, without getting trapped in a game of petty semantics.  The important thing to remember is that the points I will be arguing over with myself in this post are admittedly stripped of their nuances regarding some obvious exceptions and caveats, so as to not lose focus on addressing the underlying principles being discussed.  Consider that a disclaimer for the more pedantic-minded among my readers (you know who you are).

First, it would be helpful if I stated a value around which I orientate my worldview, prior to trying to poke holes in it.  Above most else, for as long as I can remember, I have always valued the egalitarian approach to most facets of human interaction.  I truly do believe that the most effective, just, and fair means for society to function is for its sociopolitical and judiciary elements to strive for as equitable an approach to administering their societal roles as possible.  In this view, I also recognize that this can more realistically be considered an ideal for society to endeavor towards rather than an all-encompassing absolute–nonetheless, I still see it as a valuable ideal for modern society to strive towards, even if we must acknowledge that its perfect implementation may forever be out of our grasp.

Additionally, I should clarify that I do not necessarily claim this personal value of mine to be derived from anything higher than my own personal preferences for how I think society ought to be.  Yes, it is subjective, because it is subject to my desires and interests; however, I would argue that this is true of just about any alternative or opposing viewpoint that may be brought up.  Furthermore, the merits and benefits I believe to be implicit in my personal preference for an egalitarian society (though admittedly subjective) are, in my opinion, independently verifiable outside of just my own internal desires.  In short, I value egalitarianism because, having no just and tangible means by which to sift through who merits which position in the social hierarchy, I consider it important that (if nothing else, at least in the basic application of our political and judicial proceedings) we hold all members of society to an equal standard.  Moreover, not that it matters to determining the validity of the egalitarian viewpoint, but I’m convinced that the majority of the people reading this will have little trouble agreeing with the benefits of such a worldview (though probably more in principle, while leaving room for disagreement on the most practical means by which to apply said principle in a social framework).

Now, the immediate issue I see arising with this stance of mine is the objection that genuine egalitarianism can easily lead to outright conformity–especially enforced conformity–as a society built on the model of complete equality might find it difficult to function unless it actively sets out to maintain the equality it’s seeking to establish.

It is a harsh fact that large-scale human interaction is not naturally egalitarian; meaning that there is little in the historical evidence to suggest that a society of people, left to their own devices, will not diversify themselves into a multi-layered hierarchy, thereby instinctively creating the social disparity that the egalitarian mindset is aiming to combat.  The most obvious response would be to insist that egalitarianism simply means that the basic functions of society (i.e. the laws) have to be applied equally, and that as long as such measures are upheld, the system can self-correct to its default setting.  Yet this outlook is only convincing as long as one is inclined to have faith in the sincerity of the application of the law, in terms of holding all in society to an equal standard.  This also brings us to the issue of who is to be the arbiter warranted with upholding the principles of an egalitarian system.  The judicial system?  The policymakers?  The public at large?  And does this then bestow on these individuals a degree of authority (i.e. power and privilege) that thereby creates a disparity which in itself violates the very premise of a truly egalitarian model?

“In a democratic society, the authority rests with the people in the society to ultimately decide on who is to be the arbiter(s) to ensure that equality is being upheld in said society on the people’s behalf.”

But maintaining social equality by means of representative democracy brings us to the issue of having those in the minority opinion be subject to the whims of the majority.  And is this not also in itself a violation of what an egalitarian society ought to be striving for?

When we play out the potential pitfalls of every one of these concerns what we end up with is the realization that, in practice, egalitarianism seems to only function when applied on a selective basis.  Complete equality, across the board, on all matters, has the serious consequence of either ending up in a social gridlock (rendering all manners of progress on any issue impossible), or coercion (negating the benignity that is ideally associated with egalitarianism).

I’ve heard it said that in this sort of discussion it is important to differentiate between equality of outcome and equality of opportunity; that the latter is the truly worthwhile goal an egalitarian ought to be striving for in order to ensure a just and fair society.  I’m not sure this does much to address the primary issue at hand.

If there exists no disparity in opportunity, but we reserve room for an inequity in outcome, then will it not be the case that you will still end up with a select number of individuals occupying a higher role in the social hierarchy than others?  And once the foundation is laid for such a development, is it not just as likely that those who end up occupying a higher role could put in place measures that will be of interest to themselves alone, or even come at the expense of those who fall into lower social roles?  Meaning that even though in this model all opportunity was equally available at first, the caveat that different people can have different outcomes–fall into more favorable and less favorable social conditions–fails to safeguard against the potential dilemma of having those who manage to rise high enough manipulate matters in society to their advantage, thereby stifling the outcome and opportunity potentials of future generations.  If the rebuttal is that in a truly egalitarian society measures would be in place to prevent this, we fall back to the question of who exactly is to be the arbiter warranted with upholding the principles of an egalitarian system.  This brings us full-circle to the line of inquiry mentioned in the preceding paragraphs; hence, making an equality-of-outcome versus equality-of-opportunity distinction does little to nothing to resolve the issues being discussed here.

All these objections are ones that, even as someone who considers himself an egalitarian, I can sympathize with.  Mainly because I don’t have any way to refute them without appealing to a personal intuition that these concerns are not endemic to an egalitarian model, and that it’s ultimately feasible to avoid such potential pitfalls when we leave room within the social system for it to be amenable to debate and revision.  However, I have to also admit that I’m not always entirely sure of this myself.

This problem brings me directly to the confrontation of what should be valued more in society:  the complete equality of all people, or the value of the autonomous individual?  And whether creating such a dichotomy is necessary, or whether a balance can be struck in satisfying the interests of both?

The threat that removing all disparity between individuals might lead to a stifling of people’s distinct individuality is something I believe is worth worrying over.  What good is a world where equality is triumphant but reigns on the merits of absolute sameness?  Not to mention, what will happen to the human ingenuity all of us in modern life depend on for our survival as a society?  The prospect of attaining personal achievement is predicated on one’s ability to stand out above the fold, and create something unique and distinct from that which is common.  The possibility that this drive will be held suspect in a completely egalitarian world, in the name of preemptively combating all forms of perceived inequality, no matter how unpleasant it might be to my core values to acknowledge, is not something I can dismiss simply because it’s inconvenient to my worldview.  Essentially, I believe it would be unwise to simply brush off the point that a world safeguarded to the point where no one falls is also potentially a world where no one rises.

When I started writing this post I had a standard set of points I knew I would raise to fulfill my interest in demonstrating a genuine attempt at unrestrained self-scrutiny.  I know that some readers might wonder why I’m not doing more to combat the objections I’ve raised here against my own egalitarian perspective, and the simple truth is that I understand my desire for egalitarianism to be practical and feasible rests almost entirely on the fact that I want both of those things to be true, as it would validate my presupposed worldview by fiat.  Nonetheless, I do understand that reality does not depend on my personal whims and wishes.  In all honesty, having actually reasoned out the premises here, I’m left wondering why, if for the sake of practicality we will undoubtedly always be forced to be to some extent selective with our approach to egalitarianism, we (myself included) even bother calling it egalitarianism at all.  Perhaps there is a term out there that more honestly fits what most of us mean when we strive to uphold what we refer to as egalitarian principles.  That, however, is a wholly separate discussion from my intentions here.  My goal was to hold my own views and values to the fire and see where they end up.  In that goal, I think I’ve succeeded…what results from it will take a bit more thinking on my part to figure out.

The Power of Names

Shakespeare invited us to consider, “What’s in a name?  That which we call a rose, by any other word would smell as sweet.”  The Bard’s musings on the subject notwithstanding, the truth is that names do hold a fair bit of power in forging our perception of other people, as well as ourselves.

If you are a foreign-born individual who goes about in your adopted land of residence with a first name that points clearly to your nation of origin, you immediately know how vital a role a name can play when trying to integrate yourself with the local population (so much so that many foreigners will give in, and change their foreign-sounding names to something more palatable to the culture they aim to assimilate into).  Although few of us will readily admit to it, we are all susceptible to making generalizations about people we come across in our daily life based on superficial features.  Names are definitely one such feature.  That is not to say that every assumption made about someone based on such features is either wrong or malicious.  It’s not wrong (factually or morally) to deduce that a person with an obviously Asian-sounding name is in some way culturally connected to Asia.  Same with a man named Hans Gunterkind most likely being of some kind of Germanic heritage, or Jean-Pierre Neauvoix being French.  So on and so forth.

(It goes without saying that the contemptible part in forging a preconception about someone isn’t the initial preconception itself, it’s what you do with it from there on forward.  If recognizing that you’re about to speak with Chen Huiyin leads you to assume she is probably Asian before seeing her, no sensible person will raise an eyebrow at that assumption.  If, however, you take your preconception further and assume she is in some way personally inferior to someone who isn’t Asian, that’s where we run into issues of bigotry that will rightly be condemned by much of the public at large.)

Issues of what might be called ethnic names aside (are not all names relatively ethnic to different cultures, one might be inclined to ask here?), there are naming norms within American culture that occasionally shape our interactions with each other.  When you’re in the middle of everyday America and come across the name Kevin, it is unavoidable that you will imagine a man.  Unless you just happen to know a woman named Kevin, but even then you are likely to ascribe it to a rare anomaly.  What if over the course of the next three decades a swarm of new parents decide that Kevin makes for a great name for their baby girls, and the social paradigm shifts so that suddenly you run into more female Kevins than male ones?  Would you easily adjust to the new cultural trend, or still stick to the norm you had been accustomed to of Kevin being a predominantly male name?  If this sounds like an unlikely scenario to happen, think about how the name Ashley in America at the start of the 20th Century changed from mostly male to predominantly female by the start of the 21st Century.

Not to belabor a point past my humble reader’s generous patience, but it would feel disingenuous not to touch on my personal experience here.  Growing up in continental Europe as a boy named Sascha/Sasha, the social assumption was that my parents must be bland, unimaginative, and possibly even a tad bit conservative in their leanings, precisely because boys named Sascha/Sasha are so common there.  At the time, this formed a personal impression of myself as just another average lad going about my business, similar to how I imagine an American youth named Michael or David would feel on the matter in contemporary American culture.  When I moved to the U.S. in my early teens I came to find out that my name was somewhat of a peculiarity to my peers; one that definitely demanded further explanation on my part.  Suddenly, I was no longer merely a random guy with an average-to-boring name, I was a random guy whose androgynous-to-feminine name invited further conversation (occasionally schoolyard taunts, too, but I’m pretty good at deflecting unkind commentary and rolling with the punches, so I bear no negative grudges from it).

I would argue that your name is the most basic qualifier of your identity, and people’s reactions to it form a great deal of your learned behavior when interacting with others.  I can honestly say that the change in how people reacted to my name on moving to the U.S.–as opposed to the reaction I received for it back in Europe–did affect how I carry myself and interact with others to some non-trivial extent.  At least in that I know when I introduce myself to others, I can be sure of two things:  1. I will be pegged as foreign regardless of my citizenship status; 2. I may be asked an awkward follow-up question regarding my name (to which, when I’m feeling lazy, my typical response will be either “My parents were really hoping for a girl, and were surprised when I popped out, dick-swinging and all,” or “I wanted to be able to better relate to women, but Nancy Sunflowerseed sounded too butch, so Sascha had to do”).

Believe it or not, the purpose of this post was not to regale anyone with anecdotes about naming cultures, as a clever ruse to sneak in a dick-swinging joke.  It’s to touch on a greater point about forging better writing habits and being mindful of one’s intended audience’s social palate.  Sooner or later, just about all writers find themselves fretting over picking out the perfect name to convey their characters’ personalities and backgrounds effortlessly to the reader.  And there are definitely right and wrong names one can decide on, for the roundabout reasons stated above.

If you’re writing a story about a street-wise, inner-city black kid, born and bred in the Bronx, who is named Hans Jorgenson Gunterkind, well, you had better be ready to explain how the hell that came to be.  Same if you’re writing a story about a 15th-century samurai named Steven.  While clever names can add exotic intrigue to characters, and piece together unspoken–unwritten?–context about their personal interactions with their environments, they can also needlessly distract the reader if they’re not really meant to be a focal point of the narrative.

It’s perfectly fine to be bold and go for something unconventional when you’re crafting your written world, but don’t bend over backwards to convey uniqueness unnecessarily, to the point that it hinders the reader’s ability to become immersed within the narrative.  A story that has five characters named Mike to show the absurd commonality of the name can be witty and fun, or it can end up confusing and frustrating to the reader.  Take a moment to consider how the greater world you have created interacts with this dynamic, and whether it helps or hurts the story you’re setting out to tell.  Reading practicality should not be dispensed with for the sake of creativity; they should operate together to form a coherent story that can be enjoyably read.

You can’t please everyone, and someone will hate your work no matter what or how you write.  Which is why all my writing advice starts with being honest with every story’s first reader: its author.  And if, as you put pen to paper (or, more realistically, fingers to keyboard), what seemed like a great name in the first outline is becoming harder to work with as the story progresses, rather than forcing the narrative to conform, there is no shame in revising the basics–character names included.

Suck on that, Shakespeare, is what I’m really trying to say here.

The Cynic’s Political Dictionary

  • Centrist: n. the act of claiming not to care about identity politics in order to feed one’s own already narcissistic self-value.
  • Communism: adj. crippled by Progress (see Progress).
  • Conservative: n. a desire to recapture an imaginary Golden Age, and cease caring.
  • Corporation: n. the benchmark of personhood for Conservatives; n. the Great Satan of Liberals.
  • Economics: n. the act of attempting to predict the future through a broken crystal ball.
  • Elections: n. the greatest theater production money can buy.
  • Family Values: n. absolute control of the person (see Person), and her/his genitalia.
  • Fascism: n. the act of feigning fear.
  • Free-market: n. the omniscient, omnibenevolent, omnipotent God of Libertarianism (see Libertarianism).
  • Independent Voter: n. a disgruntled Conservative/Liberal; n. a committed Moderate (see Moderate).
  • Labo(u)r: n. an archaic animal of antiquity that invokes nostalgia in Liberals (see Liberal), and disdain in Conservatives (see Conservative).
  • Liberal: n. a state of perpetual inability to cease seeing faults everywhere in society.
  • Libertarianism: n. the completely rational belief that faceless, easily corruptible conglomerates are more honest and trustworthy than faceless, easily corruptible governments.
  • Middle-class: n. a mythical being with no clear definition; n. a rhetorical token point.
  • Moderate: n. white bread.
  • Person: n. the act of being valued by your monetary and/or societal contribution; n. a corporation (see Corporation).
  • Politics: n. the art of self-interest.
  • Progress: n. the infantilization of humanity; n. hope for change with no plan to act.
  • Religion: n. a source of false humility for the socially powerful, and a source of false power for the socially humiliated.
  • Socialism: n. the elder brother of Communism (see Communism); adj. beyond redemption.
  • The People: n. a device that creates the impression of human compassion.
  • Voting: n. a dramatic tragedy.