A Word on Class Warfare

You can tell we just got past an election year when buzzwords like “class warfare” are getting thrown around without reservation. I’m told class warfare is the hostile means by which we in the lower bracket of the economic hierarchy unfairly try to undermine the integrity and work ethic of the wealthier individuals in the country by redistributing their wealth down to ourselves. How we’re doing this I’m a little fuzzy on, since as the years have gone by the only redistribution of income I have seen is the consistent loss of my own to utilities, a rising rent, and various other life necessities. Whatever devious scheme we poor folks are supposed to be up to, I’m obviously doing it wrong. (Apparently the memo informing me of when, where, and how we are to topple the Bastille got lost with my shipment of freshly polished battle-axes. And, honestly, what fun is any kind of “war” without battle-axes, anyway?) 

I’m told that my distrust of both faceless conglomerates and faceless bureaucrats is contributing to all the vile, unjustified antagonism from my economic ranks against those who can afford to buy my home, car, and soul, thrice over. For that I’m sincerely sorry, and in the future I will keep in mind that just because these entities have the ability to influence significant factors in my personal life is no excuse to ignore how possessing such power must burden their fragile humanitarian hearts. To put their minds at ease, I hereby declare to these caring, faceless conglomerates and bureaucrats that anytime the stress of controlling my finances and civil freedoms becomes too much to bear, I will be more than willing to take some of the load off their hands. It’s a small gesture on my part, really, but I think ultimately it’s the thought that counts.

All joking aside, I’m getting the impression that a small segment of the population is getting somewhat paranoid that any day now their neighbors from outside the Country Club roster will come to storm their gated community’s ivory entrances, demanding some sort of economic overhaul, or whatever. Often the sentiment of concern lingers on the fear that we uncultured brutes might turn to violent rebellion to soothe our misguided aspirations. To ease these fears, let me inform any affluent citizens who might be reading this that they have nothing to worry about, because the majority of crimes we poor people commit are now, and will always be, against other poor people. Why? Well, firstly, because low-income individuals live in closer proximity to petty criminals (for instance, I’m almost certain that the young man who attempted, and failed, to rob me a few years ago lived within a few blocks of me). Secondly, robbing low-income individuals of the meager possessions they have carries little to no risk of getting caught, because essentially nobody gives a damn about Angelo’s stolen George Foreman Grill (honestly, he should take it as a compliment that anyone would even bother stealing that piece of crap) the way they do when some shady hedge fund manager swindles Joe Millionaire out of a few zeros from his bottom line.

If there is class warfare in this country, rest assured it is an intra-class warfare. We poor people will turn on each other before we will ever think of undermining our more socially powerful counterparts. As for the rich, your immediate fears ought to lie with your equally affluent competitors who actually have the means to put a real dent in your earnings. Trust me, waitress Susan asking for affordable healthcare coverage for her children will not be the catalyst that erodes your trust fund.

And if I’m wrong, I’ll see you all at the Bastille!

Discovering James Hogg

It should be a crime how little appreciated James Hogg’s The Private Memoirs and Confessions of a Justified Sinner (1824) is in Gothic literature.

James Hogg’s novel is a unique take on the subject of the material and spiritual world, in that it offers the reader both perspectives through two rival narratives of the same events. The first, “The Editor’s Narrative,” gives a strictly materialistic view of the seemingly supernatural events and characters. Unlike the second narrative (titled “The Private Memoirs and Confessions of a Justified Sinner” and told from the point of view of the main character, Robert Wringhim Cowan), the Editor’s account conspicuously gives little attention to Hogg’s devil-character, Gil-Martin. Instead, in the few scenes in which Robert is shown conversing with the as-yet-unnamed character, any unnatural occurrences observed are immediately brushed off and rationalized by the secondary characters: “It is a fantasy of our disturbed imaginations, therefore let us compose ourselves till we investigate this matter farther.”[1] This sets the mood for this part of Hogg’s novel, where the prose recognizes the presence of something perplexing in the atmosphere but is unable to acknowledge the extraordinary source behind it. The effect is to suggest to the reader that it makes no difference whether or not one chooses to believe that demonic forces are among us (and the Editor giving the first account appears not to), as our inability to perceive the supernatural has no binding effect on its ability to manipulate this world.

Although the devil-character, Gil-Martin, is admittedly incomprehensible in his demeanor and appearance to the characters who observe him,[2] there is no indication in the narrative that he has any restrictions on his ability to freely interact with those around him; moreover, it can be deduced that because he apparently transcends any physical form (this will become clearer in the second narrative), his existence is in no way shaped or bound by the material world. Thus, rather than being merely a religious concept, residing solely within the minds of convinced believers, Hogg’s devil is an agent operating entirely independent of our limited sensory and mental faculties.

The second narrative, structured as the personal memoir of Robert Wringhim Cowan as he unknowingly becomes an agent of Gil-Martin, gives a much more satisfying account of the devil, simply because Robert has no apprehensions about identifying his experiences with the spiritual realm. This is shown by his first encounter with Gil-Martin, whom he initially perceives to be his personal angel because of their uncanny resemblance[3]; this tendency of Robert to identify everything he encounters with his Christian faith serves as a major tool by which the devil comes to manipulate the young man’s actions. 

At one point, Gil-Martin himself explains the peculiarity of his changing facial features to Robert: “If I contemplate a man’s features seriously, mine own gradually assume the very same appearance and character.”[4] The basic message Hogg is conveying to the reader here is that the devil has no true face of his own, meaning that at any given moment he could take the form of anyone, and essentially be everywhere. Attaining someone’s likeness also gives Gil-Martin the ability to know that person’s mind and possess his thoughts and secrets, implying that the metamorphosis is entirely emotion-driven, and inaccessible to rationality.

Gil-Martin is completely aware that he is an anomaly to observers, and in devilish fashion he toys with and winks at suspicious laypeople on occasion, as if to affirm: yes, you are seeing a wholly unnatural part of this natural world, and try as you may, you are incapable of explaining me by any empirically logical standard. An example of this is his coy salute to Mrs. Calvert and Mrs. Logan,[5] both of whom are completely ignorant of his true identity, but nevertheless sense something irregular about the faux-man. All of this points to the notion that Gil-Martin, as an entity, is in no way dependent on anyone’s belief in his person for survival, because he knows that he exists independent of any mind’s perception of him; he is his own mind.

He even occasionally gives Robert hints as to his demonic identity, such as his explanation, “I have no parents save one, whom I do not acknowledge,”[6] an obvious reference to Lucifer’s fall from God’s grace. Here, Gil-Martin could simply be relying on the fact that even a person as spiritually inclined as Robert will not possess the ability to cope with the logical conclusion of his statement, and will instead rationalize it and conform it to his presupposed religious convictions. But it also reflects his nonchalant attitude toward keeping his demonic character hidden. Certainly, when it comes to Robert, Gil-Martin uses the young man’s strong Calvinist faith in predestination to corrupt his mind and get him to surrender his free will, but at no point is it insinuated that the devil needs Robert to believe him to be a man in order to carry out his sinister plot (and at times Robert seems to question this very notion). If anything, it is Robert who thoroughly goes mad by surrendering his identity to this devious doppelganger, who gains more and more control of his mental and physical recesses: “But the most singular instance of this wonderful man’s power over my mind was, that he had as complete influence over me by day as by night.”[7]

To James Hogg, the devil is a real agent operating in the material world. Although Gil-Martin’s face is defined by the individual observer, his identity is clearly not. One can even argue that besides giving him a means to enter the thoughts of those whose features he adopts, Gil-Martin’s metamorphosis also acts as a way to disarm those he seeks to manipulate by letting them believe that they themselves are the dominant personality between the two (since, after all, he is adopting their face), blinding them to the reality that the devil is subduing their very person.

Hogg’s devil-character is implied to be everywhere, manipulating people at any given time. Whether or not his influence is recognized as demonic has no bearing on his presence, and the end result will ultimately be the same (as can be seen in the inexplicable political fight stirred up in the Editor’s narrative, and the ease with which Robert is rhetorically swayed to commit one sin after another; both examples trace directly or indirectly back to Gil-Martin as the causal source).

Standing as a precursor to the Gothic novels that followed later in the same century, James Hogg’s The Private Memoirs and Confessions of a Justified Sinner is certainly a dark-genre read well worth looking into to get a feel for the early transition from Romantic to Gothic literature, and the various literary elements explored therein.


[1] Hogg, James.  The Private Memoirs and Confessions of a Justified Sinner, p. 110.

[2] Hogg, pp. 107, 109.

[3] Hogg, p. 133.

[4] Hogg, p. 137.

[5] Hogg, p. 108.

[6] Hogg, p. 141.

[7] Hogg, p. 144.

The Politics of Friedrich Nietzsche

Nietzsche writes in the first section of his autobiographical Ecce Homo, “Hear me! For I am such and such a person. Above all, do not mistake me for someone else.” The line possibly foreshadows the innumerable misinterpretations and false generalizations that politically minded individuals would be determined to make of the philosopher’s writings in the generations to come.

The most useful interpretation of Nietzsche’s politics is to simply reject the notion that the man had any clear political inclination to begin with, or at least none that fits clearly within the political models commonly referenced in his day, or ours. Indeed, over the past few decades, academia has done its best to instill just such a post-political framework into Nietzschean philosophy. Unfortunately, the effort has yet to trickle down to the self-styled public intellectuals, who have cleverly deduced that context-void quotations from context-heavy philosophers make for a more digestible expression of their own personal ideologies than actual self-reflection (why bother thinking up defenses for your own position on sociopolitical matters when someone long dead has already done all the work for you, right?).

Now, since there is little point disputing the fact that Nietzsche directly called himself anti-political (Ecce Homo, “Why I Am So Wise,” Section 3), the only reasonable question left to consider is what sort of political implications a person might be justified in deriving from the philosopher.

Above all else, if there is one consistent fact that must be understood about Nietzsche’s relation to the politics of his day, it’s that (in stark contrast to many of his claimed admirers today) the man loathed and ridiculed everything associated with his native Germany, from its culture right down to its cuisine:

Against the Germans I here advance on all fronts: you’ll have no occasion for complaints about “ambiguity.” This utterly irresponsible race which has on its conscience all the great disasters of civilizations and at all decisive moments of history had something “else” on its mind—now has “the Reich” on its mind—this recrudescence of petty state politics and cultural atomism (from Nietzsche’s letter to Overbeck, October 18, 1888).

Only the complete worthlessness of our German education—its “idealism”—explains to me to some extent why at precisely this point I was backward to the point of holiness (Ecce Homo, “Why I Am So Clever,” Section 1).

The German climate alone is enough to discourage strong, even inherently heroic intestines (Ecce Homo, “Why I Am So Clever,” Section 2).

The few cases of high culture I have encountered in Germany have all been of French origin (Ecce Homo, “Why I Am So Clever,” Section 3).

The Germans are incapable of any notion of greatness (Ecce Homo, “Why I Am So Clever,” Section 4).

The way I am, so alien in my deepest instincts to everything German that the mere proximity of a German retards my digestion (Ecce Homo, “Why I Am So Clever,” Section 5).

As far as Germany extends, she corrupts culture (Ecce Homo, “Why I Am So Clever,” Section 5).

This is just a small sample of the disdain Nietzsche repeatedly expresses for his place of origin in his writings.

This disdain is a clear reflection of the philosopher’s rejection of ideological identification, illustrated by his extensive attacks on what he considered its most evident mindless incarnation: the growing sentiment of German nationalism in the late 19th century. To Nietzsche this sentiment represented the antithesis of critical thought, and he was not shy about using the grand image of its idolatry (i.e., the German “Reich”) as the irredeemable symbol of all things decadent in modern civilization. Thus, it becomes highly ironic to consider how in popular thought today the man has been cast into the same ranks as nationalists and fascists, and their wannabe modern descendants; not to mention the bemusing fact that many of these nationalists and fascists will ignorantly promote Nietzsche as their intellectual muscle, baring to all just how sickly and illiterate their cognitive fitness truly is.

Very well, Nietzsche has no place in nationalist politics, or in any traditional Left/Right political spectrum. But what about something less categorically restrictive? After all, Nietzsche talks a lot about individualism and the need for self-creation; doesn’t this perhaps give credence to anarchist thinkers, or (in a more moderate tone) at least to libertarians? In short, no. Just as it is a mistake to lump Nietzsche in with fascist crackpots, it would be equally mistaken to romanticize the man as some sort of idol of individual strength and responsibility.

At its core, Nietzsche’s philosophy is not about individualism, nor does he promote the notion of self-governance; what he really aimed to promote was the message that one must be strong enough to conceive reality as it is, for “only in that way man can attain greatness” (Ecce Homo, “Why I Am a Destiny,” Section 5). Following a political narrative would have been pure poison to Nietzsche’s program, as the parameters of any such narrative are by definition restricted to the acceptable party platforms.

As far as individualism goes, the man clearly states in Thus Spoke Zarathustra, “For, my brothers, the best should rule, the best also want to rule” (“On Old and New Tablets”). It is true that Nietzsche believed that society placed too many restrictions on the individual, but it is also true that he considered human society to be a long trial, with the herd-mentality being an innate manifestation for most people. Nietzsche’s rejection of free will (Ecce Homo, “Why I Am So Wise,” Section 6; also see Nietzsche on Free Will) leaves no room for personal self-improvement. You are either one who rules or you are with the herd; hence, to act in any way other than your innate nature dictates would be nonsensical to Nietzsche. The majority of us cannot, and will not, rise above our herd-minded instincts, according to Nietzsche, hence a political model celebrating individualism (or emphasizing individual responsibility) would have seemed self-defeating to the philosopher.

The point of the matter is that you simply cannot defend your political ideology through anything Nietzsche wrote without negating one or more important aspects of his broader philosophy. And, on that note, you shouldn’t want to. Nor should you need to waste time defending your convictions by desperately attaching them to the musings of one philosopher or another. As is the repeated theme throughout this article, Nietzsche is not someone to be admired or canonized to infallible guru status. Like all thinkers, past and present, he is to be examined and scrutinized, allowing little to no romantic idolatry to cloud one’s judgment.

Whatever politics you personally support, defend it on the merit of its own tenets, not by the virtues you think some third party would approve of. Especially not by the virtues of Friedrich Nietzsche, who would no doubt instinctively scoff at and ridicule any such attempt.

CTE: Entertainment at Any Cost

There is a docu-series on Netflix about former NFL player Aaron Hernandez, who was arrested and charged with murder in 2013. In 2015, Hernandez was found guilty of first-degree murder and sentenced to life in prison without the possibility of parole. In 2017, he was found dead in his cell, having committed suicide. He was 27 years old at the time of his death.

The life of Aaron Hernandez is certainly interesting enough to look into, in and of itself, but the real story doesn’t end at the young man’s death, as he would be posthumously diagnosed with chronic traumatic encephalopathy (CTE), which is speculated to have contributed to the violent and irrational behavior that led to his homicidal crimes.

CTE is a neurodegenerative disease commonly found among individuals who sustain repeated (concussion-level) blows to the head. Hence, it is no surprise that the disease has been in the news for years to explain away the destructive behavioral problems exhibited by athletes who played contact sports known for their high frequency of head trauma.

Often-cited studies confirm a higher-than-average rate of CTE diagnoses among such athletes compared to the general population. However, because CTE can only be diagnosed through an autopsy (meaning the person in question must, by default, already be dead in order to confirm whether they had the disease), skeptics argue that the statistics used in these studies are bound to be inflated, since the dead athletes being tested for CTE most likely exhibited the behavioral issues indicative of the disease to begin with. An unbiased diagnosis rate would require a large and diverse sample of athletes who played concussion-prone contact sports, all tested posthumously for CTE, with the results compared to the rate of CTE diagnoses in the rest of the population that didn’t partake in such sports (which would also require a large and diverse sample of test subjects to avoid skewing the data through selection bias).

Obviously, this is an issue that will not reach a satisfying conclusion any time soon on the science alone, if ever, for the very cumbersome testing reasons outlined above. But how much data would even be sufficient to convince us that some percentage of these athletes are at risk of suffering irreversible brain damage before we are willing to draw any ethical conclusions on the subject? Moreover, what percentage is considered an acceptable sacrifice in this situation? 50%? 25%? What if it were definitively proven that only 5-10% of athletes who engage in these sports will sustain brain damage that could lead them to hurt others and/or themselves? Is that a number we are willing to accept as just part of an athlete’s life and experience?

I wasn’t personally raised in a household that cared a whole lot about sports, but I do still understand how all of us can get very attached to our preferred pastime, and get quite protective of it. And it’s not just about enjoying a game; it’s about the thrill of the competition, and the camaraderie between like-minded fans coming together to cheer for their team (at times with nothing in common except maybe a mutual dislike of the opposing team). Sports to a lot of people aren’t just games, but a form of community, and arguably even a shared worldview. And to be told that something that brings you joy in life is inherently harmful to the very group of people you’re idolizing (i.e., the athletes) can be enough to put anyone on the defensive, as it’s all too easy to interpret such arguments as a personal indictment against one’s very character.

Although I didn’t watch much conventional sports growing up, my home TV was often set to the bi-weekly professional wrestling shows from the 90s to the mid 2000s. I watched pro wrestling from a young age (possibly too young), and was enamored by the characters, storylines, theatrics, and yes, the violence of it all. If I’m being honest, I also did eventually grow bored of it year to year as the storylines got repetitive, and I became desensitized to the spectacle of watching people genuinely put their bodies through hell in scripted fights for my entertainment. But I continued to tune in despite my waning interest, because it was a point of shared interest with my family and friends that I did not want to let go of. And I didn’t, until mid-2007.

If you’re a wrestling fan, you probably already guessed what I’m about to reference. In June 2007, WWE wrestler Chris Benoit murdered his son and wife, before committing suicide in his Atlanta home. It was an event that shook the pro wrestling community, and left many people bewildered as to what could have compelled a man who so many fans admired as a decent guy to do something so heinous.

We may never know exactly what motivated Benoit to do the horrible things he did that day, but a leading theory points to CTE as an underlying cause, as an autopsy revealed the wrestler’s brain to be severely damaged, resembling that of an Alzheimer’s patient, the result of years of repeated head trauma and concussions. The findings sparked a new debate among wrestling fans over whether it was right to hold the man fully responsible for his actions, or whether his state of mind was such that he had no control over them. Meanwhile, a different sort of debate crept up in my own mind: Am I partly responsible for this?

After all, I cheered every head blow, steel chair collision, punch, kick, and fall for years and years right along with everybody else. It was done for my enjoyment, and I never once questioned the ethics of it. These are adults, after all. They know the risks they’re taking. I neither created this sport, nor controlled how it’s managed and presented. What they chose to do was beyond my control, and if I stopped watching, it would still exist, completely indifferent to and independent of me. All of this was and is true, yet it still didn’t feel right anymore. I simply couldn’t watch another match without feeling uncomfortable about the possible damage I was passively encouraging through my viewership.

My family and friends still watched, and I never tried to argue them out of it (nor anybody else). I didn’t go into detail about why I stopped watching, choosing to simply say I was bored with it (which was true enough) and not participating in the conversation if the topic came up. Everyone readily accepted that it wasn’t my thing anymore, and things moved on without issue.

The feeling of discomfort never left though. There are even residual traces of defensiveness still lurking, ready to stand up for my past viewing habits, so I’m not being flippant when I say I understand the reflexive agitation football fans, soccer fans, boxing fans, etc. etc. etc., are feeling nowadays from the scrutiny aimed at their favorite sports, and the implied judgment accompanying screeds about the physical, measurable harm done for their entertainment value.

Just as I had no intention of talking anybody out of watching pro wrestling 14 years ago, I have no intention of arguing for sports fans of any sort to give up their preferred pastime. Honestly, I don’t believe attempting such a thing would even be possible. And I also don’t believe that a legal ban on specific sports is a productive way to go about mitigating the perceived harm being committed here, either. The only thing I ask of anyone is to consider what your entertainment experience costs, and if that cost happens to be laced with bodily trauma, and pain, and agony, and tragedy for the athletes who make said entertainment possible, whether it is a price worth paying.

My answer is no, but your mileage may vary.  

Happy Singles Awareness Day!

Me:  (being polite) “Happy Valentine’s Day.”

Her:  (being grumpy) “You mean Singles Awareness Day Eve.”

Me:  “Funny, did you come up with that on the spot?”

Her:  “No, I read about it online, but the point still stands. Valentine’s Day is a sham.”

Me:  “Because you’re single?”

Her:  (getting defensive) “No, it’s not that. I hated it just as much last year when I was dating.”

Me:  “Why?”

Her:  “It’s a stupid marketing scam that tricks women into forcing men to ‘prove’ their love for them by buying cards with cliché sayings, chocolate that will be gone by the end of the day, and flowers that will be dead by the end of the week. I didn’t like it then, and I don’t like it now.”

Me:  “Now, when you’re single, you mean?”

Her:  “I told you it has nothing to do with that.”

Me:  “Did you still accept the gifts back when you weren’t single?”

Her:  “I didn’t have a choice. And it’s not the gesture that bothers me, it’s the fact that it’s so forced; so contrived. You know?”

Me:  “Well, I don’t know my feelings about all of that, but you have a point about the flowers being dead in a week thing. Which is why some time back I opted to give someone a large bouquet of roses, all made out of plastic. Symbolizing how our love will transcend the mortal limitations of life itself, and be everlasting. Explained all of that on the card and everything, too.”

Her:  “Nice. Did she like it?”

Me:  “I’m standing here celebrating Singles Awareness Day Eve with you, aren’t I? That should give you your answer.”

Final Verdict:  Don’t bother arguing about the meaning of it all; just buy your partner the stupid cards, chocolates, and flowers, and then get yourself laid. Happy Valentine’s/Singles Awareness Day.

How to Talk to People Without Hurting Yourself

Last week I found myself trapped (gleefully engaged, I mean) in conversation on a topic I cared nothing about, and could contribute nothing to. This apparently caused no grief to the woman who was torturing my eardrums (providing me with a pleasant new outlook on life, I mean)…by any means necessary. However, as my short attention span (the tolerance of which I had clearly underestimated up to that point) began to waver, I decided to mentally pen a short list of steps that can help others survive such an ordeal, and in the long run possibly even save society…possibly.

How to Talk to People Without Hurting Yourself

Step 1:  Find Person

Step 2:  Ask a question that implies interest in person’s life/activities/relationships.

Step 3:  Remember to look alert and adopt concerned/amused facial expressions as the situation demands. Note: there is no real need to actually listen to what the person says, since most people use the same tiresome set of topics/inquiries, which require minimal thought to respond to. Besides, the immense level of boredom ensured by actually listening carries a high risk of suicidal outcomes. Proceed with caution.

Step 4:  Keep asking vague questions that can be applied to anything or anyone. For example, “How was your day?”, “How was the movie?”, “How cloudy will it be tomorrow in your opinion?”

Step 5:  Ignore all answers.

Step 6:  Hum song to yourself to avoid possible suicidal/homicidal thoughts.

Step 7:  In the circumstance that Steps 1 to 6 cannot be completed, properly dispose of person and start over. [Methods of disposal vary and are limited only to one’s imagination and duct tape availability. No purchase necessary].

Humanity, you’re welcome.

Nietzsche on Free Will

There is some confusion and misunderstanding floating around concerning Friedrich Nietzsche’s thoughts on the concept of free will. By which I’m referring to the willful inability of many admirers of the philosopher to accept the fact that he wholeheartedly rejected the existence of anything akin to free will.

To Nietzsche, free will is a concept that cannot be separated from its religious underpinnings, thus: “God has been thoroughly refuted; ditto, ‘the judge,’ ‘the rewarder.’ Also his ‘free will'” (Beyond Good and Evil, “What is Religious,” section 53).

Since Nietzsche gives no credence to the religious worldview, he sees no reason why religious concepts ought not to be rejected right along with the rest of the divine packaging: “The desire for ‘freedom of the will’ in the superlative metaphysical sense, which still holds sway, unfortunately, in the minds of the half-educated” (Beyond Good and Evil, “On the Prejudices of Philosophers,” section 21).

He does acknowledge, however, that many of his irreligious peers still try to preserve some notion of a non-supernatural version of free will, a sentiment that Nietzsche describes as the need for individuals to hold onto a sense of personal responsibility: “some will not give up their ‘responsibility,’ their belief in themselves, the personal right to their merits at any price” (Beyond Good and Evil, “On the Prejudices of Philosophers,” section 21). The mindset of the naturalistic thinkers who hold to the existence of free will is an attempt to salvage the idea of accountability (their own, and that of others), and by extension, the institutions of justice and due punishment for one’s actions.

But Nietzsche rejects this desire as a misdirected conflation of two separate issues; namely, a conflation of justice with punishment, and a further conflation of both of these with free will:

The idea, now so obvious, apparently so natural, even unavoidable, that had to serve as the explanation of how the sense of justice ever appeared on earth–“the criminal deserves punishment because he could have acted differently”–is in fact an extremely late and subtle form of human judgment and inference: whoever transposes it to the beginning is guilty of a crude misunderstanding of the psychology of more primitive mankind (On the Genealogy of Morals, “Second Essay,” section 4).

Nietzsche proposes that the origin of justice can be more accurately characterized as a form of trade, serving as a method to equalize two competing parties, and not necessarily as a punishment for one’s freely chosen actions (i.e. free will). In fact, in such a framework the emphasis on punishing offenders is superseded by the notion that, “every injury has its equivalent and can actually be paid back, even if only through the pain of the culprit” (On the Genealogy of Morals, “Second Essay,” section 4).

As already mentioned, Nietzsche’s rejection of free will is tied in with his general rejection of theism. And he feels that the efforts of atheistic philosophers to retain the faulty concept, while still proposing a godless reality, are misguided, not to mention counterproductive:

Surely, that philosophers’ invention, so bold and so fateful, which was then first devised for Europe, the invention of “free will,” of the absolute spontaneity of man in good and in evil, was devised above all to furnish a right to the idea that the interest of the gods in man, in human virtue, could never be exhausted (On the Genealogy of Morals, “Second Essay,” section 7).

Nietzsche argues that free will was originally invented as a concept to give religiously minded philosophers a means by which to allow for unconstrained supernatural intervention on the part of the various gods man had hitherto created. In short, free will is a trump card conveniently utilized to give deities a reason to exist:

The course of a completely deterministic world would have been predictable for the gods and they would have quickly grown weary of it—reason enough for those friends of the gods, the philosophers, not to inflict such a deterministic world on their gods! (On the Genealogy of Morals, “Second Essay,” section 7).

Now, a fair question for a reader to ask is how Nietzsche’s rejection of free will does not also dismantle much of Nietzsche’s own philosophy, in particular his conception of “the will to power” and his continuous call for individuals to create their own values in life. Although a good point, it nonetheless rests on a superficial reading of Nietzsche’s thoughts on the subject.

It is true that Nietzsche heralded the idea of individuality, but not in any sense that would imply self-improvement. He fervently maintained that “independence was for the very few” (Beyond Good and Evil, “The Free Spirit,” section 29), and even these individuals had no choice in the matter, because their instinct for individualistic expression is also deterministically confined, just as the herd-instinct of the masses can’t help but subvert the independence of the few (On the Genealogy of Morals, “First Essay,” section 2). In this regard, there is nothing “free” about Nietzsche’s “will to power,” which is itself entirely instinctive, driven not by any conscious intent or choice, but by purely mechanical responses to environmental and genetic factors. Thus, in Nietzsche’s own language, the will to power is nothing more than the instinct for freedom (On the Genealogy of Morals, “Second Essay,” section 18), which of course is an instinct no one can freely choose to have.

Nietzsche understood how his views on this matter would make some uncomfortable (in particular his call for persons to abandon a concept like free will, upon which so much of the popular conception of personhood is based), to which he bluntly responded: “One should guard against thinking lightly of this phenomenon merely on account of its initial painfulness and ugliness” (On the Genealogy of Morals, “Second Essay,” section 18).

According to Nietzsche, free will, being fundamentally an illusion, is something we have no choice but to act as if we possess. Therefore, the disdain individuals feel about the fact that their actions are entirely deterministic is itself a causal result of the way human perception has evolved to relate to its environment. We have no free will, but we are determined to behave as if we do. Whatever “painfulness” or “ugliness” people imagine will result from acknowledging this point is moot on principle.

The Intellectual Value of Comic Books

Although the previous two decades saw a great surge in the respectability afforded to comic book characters adapted brilliantly to cinema screens, I don’t think the same level of appreciation carried over to the colorful, panel-style pages that all these characters originate from. What I mean is, while moviegoers might have cheered at the sight of the Avengers, I suspect very few people cared enough to go out and read up on the multitude of Avengers comics in publication since the mid-20th century. I would argue the same probably holds true for many of the other top comics-to-cinema franchises.

Some movie historians point to the success of the 1978 Superman movie, or Tim Burton’s 1989 Batman, as the beginning of the mainstream acceptance of comic book adaptations, but I’m not sure it’s reasonable to cast such a far-reaching net. Movie genres, I believe, come in arcs and trends, and I don’t think the recent rise of the comic book movie is any more linked to the success of those two films than it is to the general rise in popularity of action movies throughout the 1980s and 1990s.

I’d argue that the precursor to the current comic book movie craze started just at the close of the 20th century, with a movie called Blade.

For readers too young to remember 1998 all that well, the first Blade movie was a humongous hit at the time of its release. Despite most moviegoers probably not being aware that they were in fact watching a comic book movie, Blade set the stage for the Marvel superhero film adaptations that continue to this day. Moreover, it shifted the zeitgeist away from comic book movies needing an air of lightheartedness and child-friendly whimsy, and showed that superheroes can be dark, serious, and directed as if they’re grounded in a reality that could plausibly overlap with our own (Christopher Nolan’s Batman trilogy would utilize a similar formula when adapting the caped crusader to the big screen in 2005’s Batman Begins).

Nevertheless, the theatrical success of Blade the movie didn’t elevate the Blade comic book in the eyes of the wider audience. Nor did the mainstream embrace of the subsequent comic book movies that enjoyed massive commercial success lift most of their printed counterparts to an equal footing with their cinematic namesakes.

Now, don’t misunderstand me. I am not making some nerd-elitist “we true fans liked it before it was cool” argument, and in fact I’d argue that some comic book characters, like Thor and Iron Man, are not just well adapted, but are better works of storytelling in their big-screen form than they ever were on the printed page.

What I am saying is that, despite the mainstream acceptance and success of movies based on comic book characters, and the widespread enjoyment the public gets out of the stories being told therein, comic books themselves are still not afforded the intellectual respect of being viewed as something beyond children’s entertainment, regardless of the maturity or complexity of the actual story being told within the drawn panels. Furthermore, if a comic book does reach a point where it is mature enough, raw enough, complex enough that it does cross over into the domain of legitimate adult-approved entertainment, it immediately gets rebranded from mere “comic book” status up to the more reputable-sounding category of a “Graphic Novel.”


Arguably, the differentiation between what counts as a comic book and what counts as a graphic novel could very well have its place. However, the truth remains that, while a lot of people are willing to defend the intellectual worth of graphic novels like Watchmen, Maus, or Sin City, not too many bother to stand up for the literary value of the common comic book; often this includes those of us who grew up enjoying comic books. And I would argue this seemingly minor oversight causes us to ignore a major contributor to a child’s introduction to the world of literature, one which can and does give rise to a lifelong appreciation of storytelling as a whole. Stories that can, and ought to, still be enjoyed well into adulthood.

For me, comic books were a gateway into appreciating the written word at a young age, and they laid the groundwork for understanding the importance of syntax and structure when communicating one’s ideas through prose. I certainly didn’t realize it as a kid, as all I did was enjoy the stories I followed in the printed panels, but the seed was planted for me to have a foundation to grasp the classics of literature once I was mature enough to engage them firsthand. Nowadays, I am surrounded by the greats (and some not-so-greats) of the literary world on my bookshelves, but I still feel no shame in openly indulging in the cheap department-store comic I bought along with my morning Snickers bar.

To me, comic books are a form of literature. Like all literature, some of it is good and some of it is bad; some of it is fascinating, and some of it is corny; some of it is engaging, and some of it is dull. But to dismiss the entire medium, so critical in shaping one’s early sense of imagination and reading comprehension, just seems like a betrayal of the very foundation that introduced us to the world of literature to begin with.

Pronouncing Nietzsche

A reader sent a pretty good question to my inbox:

This will sound really really stupid but do you know how ‘Nietzsche’ is supposed to be pronounced? I mean the way he would have pronounced it himself. I always feel like I’m saying it wrong.

There is nothing stupid about asking something you genuinely don’t know the answer to, and I personally have little regard for individuals who make it a habit to put down anyone eager to correct their confusion on a particular issue. Now that I’ve got that out of the way, dear reader, let me address the question.

The most common mistake I hear is “NEE-chee” (with an ending that rhymes with “see” or “fee”), and it’s probably the way most native English speakers have been taught to say it; this includes both academic professors and the average layperson. I suppose the reason this mispronunciation is so widespread amongst Anglophones is that the pronunciation of the man’s name is of no real consequence when it comes to analyzing his philosophy–except to those who happen to have a particular fixation on these sorts of issues. That last bit was not meant to be judgmental, just an observation on my part. And I can actually see how such fixations can be a healthy sign of a person’s intellectual curiosity, as long as you don’t start thinking of other people as your intellectual inferiors over something as trivial as mispronouncing a name from a language they don’t happen to speak.

The other mistake is to simply pronounce the name as “Nitch” (with the false assumption that the “e” is silent); this one’s rarer, but I’ve heard it said once or twice in college so it’s worth mentioning.

The confusion people seem to have is over how the heck you’re supposed to say the ending of the philosopher’s name. This site gives a decent rendition of the standard German pronunciation (with audio included), and I encourage readers to follow the link to hear it for themselves. On the linked site, the pronunciation is transcribed as something close to “NEE-cheh”, but this can be confusing to some English speakers because the closing “h” syllable is relatively soft, coming across as a quick exhaling sound, so it sounds kind of like you’re saying it under your breath (as you’ll hear on the audio recording at the link provided). This can be made even more confusing by the fact that, depending on which German speaker you ask, the pronunciation you hear may come across sounding like “NEE-ché” (with the ending “é” used as it is in French, but with a guttural stress, which brings it very close to the “NEE-cheh” pronunciation shown in the link).

For all the years I’ve been fluent in German (i.e., since early childhood), and all the time I’ve spent talking to native Germans (also since early childhood), I have always used the former pronunciation (the guttural “é” sound at the end), but keep in mind that I learned to speak German in Hannover, Lower Saxony, which is often cited as being as close to an accent-neutral region as German can get (sort of the German equivalent of what Americans would call a “Midwestern accent” in their country). However, in college (here, in the U.S.) I ran into several professors who also spoke fluent German, and they vehemently insisted that it’s supposed to be “NEE-cha”. Rather than pointlessly argue over it, I’ll just let people know about the supposed discrepancy, even though I almost never encountered it myself while communicating with German speakers.

In closing, this is a common question English speakers have when looking into the writings of Friedrich Nietzsche, and it is always difficult to transcribe linguistic sounds from one language to another. I think the linked site’s phonetic transcription of “NEE-cheh” is a good compromise between the two (allegedly) disputing accounts of the German pronunciation of “Nietzsche”. Just keep in mind that the closing “h” is more of an ending breath than a proper syllable.

Or you can simply keep on pronouncing it as you always have, because how you say the name of any writer or philosopher shouldn’t have any bearing on how well you understand and analyze his/her ideas.

The Bum and the Professor: A Hypothetical Conversation

Bum:  “Spare some change?”

Professor:  “No.”

Bum:  “Not even a quarter, or a nickel?  No change at all?”

Professor:  “Sorry. If I had some, you can rest assured that I’d give it to you, but I just don’t have any.”

Bum:  “Why can I ‘rest assured’ of that? I don’t know you.”

Professor:  “True, but I know you, more or less. I have spent decades lecturing and writing on the plight of the underprivileged. So I understand your hardship enough to know that if I honestly had any money to spare, I wouldn’t hesitate to give it to you at once.”

Bum:  “All these decades you’ve spent lecturing and writing about someone like me, did no one ever pay you?”

Professor:  “Of course they did.”

Bum:  “And yet, you haven’t got a quarter or a nickel to spare for the guy who earned you a paycheck?”

Professor:  “I resent that remark. I’ll have you know that I have given a large sum of money over the years to various charities to help people in need.”

Bum:  “Good for you. That still doesn’t put either a quarter or a nickel in my hand, right now.”

Professor:  “You’re judging me for not being able to give you money, right now? A bit self-righteous for a man who spends his days begging for a portion of other people’s money, don’t you think?”

Bum:  “No judgment here, honestly. I’m just following your train of thought, which I admit can seem pretty ‘self-righteous’. Probably about as self-righteous as being told that someone knows me, just because they’ve written something about poor folks here and there.”

Professor:  “I see. Well, allow me to clarify: While I don’t know you personally, I do understand, because of my extensive research and studies on the subject, the hardship that comes along with residing within the parameters of today’s socioeconomic hegemony.”

Bum:  “Parameters of what?”

Professor:  “Socioeconomic hegemony.  It’s a phrase I coined in one of my papers. Roughly it means that the conditions of a person’s environment are so dominating that they are naturally set up to be disadvantageous to the underprivileged in said environment. You understand?”

Bum:  “I understand what you said. I don’t understand what good it does to have it said.”

Professor:  “Identifying and defining a problem is the first step to having it resolved.”

Bum:  “When did you first write this?”

Professor:  “About 30 years ago.”

Bum:  “How long until it starts to ‘resolve’ the problem?”

Professor:  “It doesn’t work that way.”

Bum:  “Why not?”

Professor:  “Because social theories aren’t meant to fix people’s problems just by the power of the pen. People have different perspectives, and one social theory can yield innumerable sub-theories on how to implement reforms. Not to mention, there is always nuance to consider.”

Bum:  “So some other guy can come up with a different ‘social theory’ about the exact same problem your social theory talks about, and his would be just as good as yours.”

Professor:  “I think you’re getting confused; remember, we’re talking about hypothetical thought experiments here.”

Bum:  “So they’re imaginary.”

Professor:  “No, they are normative descriptors of reality.”

Bum:  “How do you know they’re describing reality, if they haven’t been tried out yet?  That is what hypothetical means, right?”

Professor:  “It’s more abstract than that.”

Bum:  “I bet. But I still don’t see the point of coming up with all of these social theories, if they can’t actually resolve the problems they’re addressing. Seems to me like a man might as well be doing nothing and still get the same results.”

Professor:  “I told you, social theories recognize a problem and allow for the future assembly of working models to be implemented by society.”

Bum:  “Hypothetically.”

Professor:  “Yes, hypothetically.”

Bum:  “See that building over there? 30 years ago I was part of the crew assembling the foundation of dozens of buildings just like it, all over town. Most of them are still around. People can use them, live in them. They can like them or hate them. But they can’t ignore them. If they decide to get rid of them, they have to put some physical effort into removing them from the spot we put them on. You understand what I’m driving at?”

Professor:  “Not really, no.”

Bum:  “Before we put down the foundation, when we were barely carving out the dimensions on the ground, the buildings were what you would call hypothetical. Now, 30 years later, I guess someone a little better with words than me, would say that these buildings are ‘descriptors of reality’, at least in the little, tiny spot of reality where they stand. You couldn’t describe the area where these buildings are without mentioning the buildings themselves.”

Professor:  “Okay, I get what you’re driving at, but you’re wrong. This is completely different from my academic discipline; you’re simply not comparing like with like.”

Bum:  “Yeah, probably. All I know is that 30 years ago, we identified a problem: no building in this spot. Now, 30 years later, the problem is resolved: the building is there, whether someone likes it or not. 30 years ago, you identified a problem; now, 30 years later, your identifying of the problem all those years ago hasn’t done squat to resolve whatever problem it is you felt needed to be identified in the first place–because if it had, I wouldn’t be sitting here like this, would I? So, let me ask you, are you sure your social theories are actually describing reality, or are you just defining reality to your liking, and cramming your social theories into it so you can have something to lecture people on?”

Professor:  “My theory is sound, but to understand it properly would take many years of study.  Hence, this conversation is inconsequential. Here’s your quarter, and have a nice day.”

Bum:  “Much appreciated, good sir. You have yourself a good one, too.”