There is a docu-series on Netflix about former NFL player Aaron Hernandez, who was arrested and charged with murder in 2013. In 2015, Hernandez was found guilty of first-degree murder and sentenced to life in prison without the possibility of parole. In 2017, he was found dead in his cell, having committed suicide. He was 27 years old at the time of his death.
The life of Aaron Hernandez is certainly interesting enough to look into in and of itself, but the real story doesn't end at the young man's death: he would be posthumously diagnosed with chronic traumatic encephalopathy (CTE), which is speculated to have contributed to the violent and irrational behavior that led to his homicidal crimes.
CTE is a neurodegenerative disease commonly found among individuals who sustain repeated (concussion-level) blows to the head. Hence, it is no surprise that the disease has been invoked in the news for years to explain the destructive behavioral problems exhibited by athletes who played contact sports known for their high frequency of head trauma.
Often-cited studies report higher-than-average rates of CTE diagnoses among such athletes compared to the general population. However, because CTE can only be diagnosed through an autopsy (meaning the person in question has to already be dead in order to confirm whether they had the disease), skeptics argue that the statistics used in these studies are bound to be inflated, since the deceased athletes being tested for CTE most likely exhibited the behavioral issues indicative of the disease to begin with. An unbiased diagnosis rate would require a large and diverse sample pool of athletes who played concussion-prone contact sports, all tested posthumously for CTE, and the results would then need to be compared to the rates of CTE diagnoses in the rest of the population that didn't partake in such sports (which would also require a large and diverse sample pool of test subjects to avoid skewing the data through selection bias).
Obviously, this is an issue that will not reach a satisfying conclusion any time soon on the science alone, if ever, for the very cumbersome reasons of testing for the disease outlined above. But how much data would even be sufficient to convince us that some percentage of these athletes are at risk of suffering irreversible brain damage before we are willing to entertain any ethical considerations on the subject? Moreover, what percentage is considered an acceptable sacrifice in this situation? 50%? 25%? What if it were definitively proven that only 5-10% of athletes who engage in these sports will sustain brain damage that could lead them to hurt others and/or themselves? Is that a number we can accept as just part of an athlete's life and experience?
I wasn't personally raised in a household that cared a whole lot about sports, but I do still understand how all of us can get very attached to our preferred pastime, and get quite protective of it. And it's not just about enjoying a game; it's about the thrill of the competition, and the camaraderie between like-minded fans coming together to cheer for their team (at times with nothing in common except for maybe their mutual dislike of the opposing team). Sports to a lot of people aren't just games, but a form of community, and arguably even a shared worldview. And to be told that something that brings you joy in life is inherently harmful to the very group of people you're idolizing (i.e. the athletes) can be enough to put anyone on the defensive, as it's all too easy to interpret such arguments as a personal indictment against one's very character.
Although I didn't watch much conventional sports growing up, my home TV was often set to the weekly professional wrestling shows from the 90s to the mid-2000s. I watched pro wrestling from a young age (possibly too young), and was enamored with the characters, storylines, theatrics, and yes, the violence of it all. If I'm being honest, I also eventually grew bored of it year to year as the storylines got repetitive, and I became desensitized to the spectacle of watching people genuinely put their bodies through hell in scripted fights for my entertainment. But I continued to tune in despite my waning interest, because it was a point of shared interest with my family and friends that I did not want to let go of. And I didn't, until mid-2007.
If you're a wrestling fan, you probably already guessed what I'm about to reference. In June 2007, WWE wrestler Chris Benoit murdered his wife and son before committing suicide in his Atlanta home. It was an event that shook the pro wrestling community, and left many people bewildered as to what could have compelled a man whom so many fans admired as a decent guy to do something so heinous.
We may never know what exactly motivated Benoit to do the horrible things he did that day, but a leading theory as to the underlying cause is CTE: an autopsy revealed the wrestler's brain to be severely damaged, resembling that of an Alzheimer's patient, the result of years of repeated head trauma and concussions. The findings sparked a new debate among wrestling fans, who asked whether it was right to hold the man fully responsible for his actions, or whether his state of mind was such that he had no control over them. Meanwhile, a different sort of debate crept up in my own mind: Am I partly responsible for this?
After all, I cheered every head blow, steel chair collision, punch, kick, and fall for years and years right along with everybody else. It was done for my enjoyment, and I never once questioned the ethics of it. These are adults, after all. They know the risk they’re getting into. I neither created this sport, nor controlled how it’s managed and presented. What they chose to do is beyond me, and if I stopped watching, it would still exist, completely indifferent and independent of me. All of this was and is true, yet it still didn’t feel right anymore. I simply couldn’t watch another match without feeling uncomfortable about the possible damage I was passively encouraging through my viewership.
My family and friends still watched, and I never tried to argue them out of it (nor anybody else). I didn't go into detail about why I stopped watching, choosing to simply say I was bored with it (which was true enough) and not participating in the conversation if the topic came up. Everyone readily accepted that it wasn't my thing anymore, and things moved on without issue.
The feeling of discomfort never left, though. There are even residual traces of defensiveness still lurking, ready to stand up for my past viewing habits, so I'm not being flippant when I say I understand the reflexive agitation football fans, soccer fans, boxing fans, and so on are feeling nowadays from the scrutiny aimed at their favorite sports, and the implied judgment accompanying screeds about the physical, measurable harm done in the name of their entertainment.
Just as I had no intention of talking anybody out of watching pro wrestling 14 years ago, I have no intention of arguing for sports fans of any sort to give up their preferred pastime. I don't believe attempting such a thing to even be possible, honestly. And I also don't believe that a legal ban on specific sports is the productive way to go about mitigating the perceived harm being committed here, either. The only thing I ask of anyone is to consider what the value of your entertainment experience is, and if that value happens to be laced with bodily trauma, and pain, and agony, and tragedy for the athletes who make said entertainment possible, whether it is a cost worth paying.
I have an unhealthy obsession with conspiracy theories. Now, when I say this, please don't misunderstand me. I don't actually buy into the stated details of conspiracy theories; I'm just fascinated by how much devotion and faith people put into them. How a person will take several halfway demonstrable, halfway ludicrous details, and then loosely connect them into something which at first glance sounds like a plausible narrative, but falls apart under the most basic level of scrutiny.
Despite what some might think, I am wholly unconvinced that either intelligence or education plays a significant role in deterring people from believing in conspiracy theories, because such theories are not really about filling the gaps of our mind's ignorance and shortcomings. They are about satisfying a base desire to witness something greater, higher, that is closed to the majority of the "deluded" masses. This is what makes conspiracy theories appealing to their proponents.
I was still young when Princess Diana died in 1997, but I was old enough to take note of the reactions people around me had to the news. It took about four minutes after hearing the news for several members of my family to staunchly announce that they didn't accept the "mainstream" story. Why didn't they accept it? What tangible evidence did they have to make them doubt the news report? Essentially none, but it didn't matter. Their suspicion was that the simple answer must be a distraction to cover up the real story. Or, as one person put it, "I cannot believe that there isn't more to this whole thing." This sentence, I believe, captures the mindset most of us have, most of the time, when we are confronted with some astonishing piece of news.
Of course, the official report of the incident was that Diana and her boyfriend died after crashing in a road tunnel in Paris, due to the driver losing control of the vehicle. But this just wasn’t grand enough for some people, who to this day maintain that there has to be more to it. And no investigation will be enough to convince any of them otherwise, because any investigator who comes up with a different conclusion will simply be evidence of the greater conspiracy. Most conspiracy theories follow a similar line of reasoning, regardless of the facts or details presented to them to negate their favored narrative.
We have an innate aversion to simplicity. Just repeating a story we hear isn't enough; we need to add more complex details onto it to make it more digestible for wider consumption, to refine it and move the narrative forward with facts we think ought to be included with the official details.
It can’t be that politicians are simply corrupt and self-serving, they must also be secretly operating under the direction of an unknown shadow government, which is menacingly pulling the strings behind the curtain. And (occasionally) this shadow government has to be made up of shape-shifting, inter-dimensional lizards, whose bloodline traces back to ancient Babylon; or a cabal of cannibalistic pedophiles using the blood of their child victims to maintain their youth and power.
It's not enough to say that life on earth is simply adaptive to its environment; there has to be more to it, some kind of grand purpose and intent operating on a level too complex, too powerful for our meager minds to fathom. This line of thinking is even stronger when we don't have enough facts to draw any kind of clear conclusion; in such a case we'll reason that even a conspiracy theory is better than no theory.
Simple reasons and answers are often not enough to do the job for us, because simplicity can never meet the expectations of our innately suspicious imaginations. What does satisfy our suspicion is a narrative that goes counter to the mainstream, one that only those of us of the most elite intellect can grasp: "The Illuminati may be fooling you, but it'll never fool me."
Part of the appeal of conspiracy theories is the layer of excitement they bring to everyday facts. It is stimulating beyond belief to lose oneself in all the various plots and details of a hidden world, even if its veracity rests only on a very questionable set of complex circumstances; this just makes it more exciting. The other part of the appeal is the strange level of remote plausibility it brings to the table. For instance, there is no denying that people have conspired in the past (and still do today), often for ominous reasons (an example being the long, documented history of unethical human experimentation in the United States). And this air of remote plausibility is more than enough to keep people's suspicions on high alert, except when it comes to scrutinizing the various details being used to support the particular conspiracy theory they have chosen to embrace.
We know that the human mind is in many ways constrained in its ability to rationalize the world; thus we are constantly seeking the higher, the greater, the unimaginable as our answer of choice. The strange thing is that as the answer we are seeking becomes more nuanced and complex, the simpler it will begin to seem to us, and we will insist that our highly elaborate, immensely complicated and circumstantial answer is really the most simple and obvious of them all. Because by that point we have already accepted the narrative of the conspiracy, where the grand conclusion is being used to fill in the details, instead of the observable details being used to arrive at the most probable conclusion (be it simple or complex).
Throughout the history of American cinema in the 20th century, film narratives served as a decent reflection of where the general public consensus stood in regard to America's domestic and foreign affairs. Westerns in particular played a vital role in encapsulating the nation's mood, and broadening it by promoting a nostalgic longing for the country's simpler, if largely mythical, frontier past.
Although the initial tone of this cultural molding was done in favor of the American ideal by the likes of John Ford and Michael Curtiz, the impact of Vietnam, the collapse of President Johnson's Great Society, and the near universal betrayal felt by the nation through the Watergate scandal all worked together to gradually shift the tone of the public consciousness, and as a result, the movie narrative right along with it. John Carpenter's 1981 Escape from New York is the culminating product of this trend. Set in a dystopian America of the not-too-distant future (1997), the film depicts a country in which the once heralded ideals of lawfulness, respect, and responsibility in governance have vanished, leaving less than a handful of individuals who still embody the true rugged sense of American virtue.
The film begins by introducing the audience to the events that have led up to the dire world America finds itself in. In 1988, the crime rate has risen by 400% (no doubt an allusion to the growing crime rate seen in American urban centers in the 1970s), and Manhattan island, of the once great city of New York, has been turned into a maximum security prison to keep the dangerous forces of society at bay. Left to roam the streets of Manhattan on their own, the thugs, murderers, and crazies forge a Hobbesian social order in their own image, one which, while confined, is ultimately without constraints.
The central plot of the movie is a symbolic parallel to the disillusionment Americans had been experiencing towards their government for the better part of the preceding decades, and what happens when the authorities responsible for creating such an environment find themselves on the receiving end of the contempt they have created.
In the film, the President of the United States is forced to crash land in the Manhattan prison-state after his plane is hijacked by the anti-government terrorist group National Liberation Front, from which point on he is left at the mercy of the criminals running the area (primarily the self-appointed Duke of New York). Both of these causal events are brought about by policies the President himself either enacted or was associated with through the system that fostered them. Therefore, it is difficult to feel too much sympathy for the man, a message Carpenter may have intended, given that he opted to keep the character nameless throughout the plot, leaving him the ideal bureaucratic representation of any and every administrative and legislative figure of the 1960s and 1970s. Instead, the protagonist is the rogue fugitive Snake Plissken, whose role in trying to save the President is one of staunch reluctance, brought on through outright entrapment by the state authorities; a strong nod to the fuming Vietnam-draft generation.
Whereas in the past the heroes of cinema, in particular Westerns, fully displayed a sense of idealist fervor towards protecting and living up to the quasi-mythical notion of what America is and ought to be, Plissken shows no such romantic illusions. The sub-plot of having to rescue the President in time for him to attend a summit with the USSR and China to divulge information on nuclear fusion, vaguely explained as vital for "the survival of the human race," is treated with utter disinterest by Plissken, who sees his own personal survival as being of far greater importance than the political quarreling of despots.
This general mood is a clear indication of the cynicism the American public had been feeling about its government, and the breakdown of the American myth in cinema signaled an end to "the sanctioning of 'cowboy' or vigilante-style actions by public officials and covert operatives who defy public law and constitutional principles in order 'to do what a man's gotta do.'" However, rather than disappear completely, the envoy of the American spirit was simply transferred from the national scale to the disgruntled individual, which is what Plissken's character is meant to signify. He was a war hero turned criminal, in a country probably unrecognizable from the one he once fought for, and possibly once believed in. Hence, the old nostalgia characteristic of the Western is still present, but the prospect for hope in the future has been extinguished.
Snake Plissken is easily recognized by every character he happens to run into on his rescue mission in New York, often being met with the bemused statement, "I heard you were dead." To which he once tellingly responds, "I am." If Plissken is meant to be the stand-in for the American public at large in the midst of a corrupt, disengaged social order, then, as the remaining glow of what was once the shining light of American values, the aforementioned greeting takes on a highly pessimistic overtone. "In a healthy society the political and cultural leaders are able to repair and renew that myth by articulating new ideas, initiating strong action in response to crisis, or merely projecting an image of heroic leadership."
But in the dystopian society Escape from New York depicts, the political leadership is not so much portrayed as too tyrannical to project a heroic image as too impotent to even attempt it. The President is easily kidnapped, his life held at the will of the lowest sectors of society, and even with all the vast resources of the nation, the state is unable to do anything about it; this is not an image of a power that has over-asserted its might, but the measly shadow of a tamed and defanged creature. The fate of the country and the world is at stake, and the people (or person, in Plissken's case) are too disillusioned to give a damn.
The final conversation Plissken has with the President after rescuing him is the most revealing, as Plissken asks him, “We did get you out. A lot of people died in the process. How do you feel about that?” Coming from Plissken this sort of curiosity is interesting, because it shows that behind the cynicism and lost hope there is still at least a memory of a former ideal, when such things may have seemed to matter. Of course, the President’s response of mindless political rhetoric only works to further cement the disgust Plissken has for the public figures running the country. A sentiment many Americans in 1981 would have easily identified with.
In contrast to similar movies like Death Wish, which explore the widespread cynicism prevalent in America in the aftermath of Vietnam and Watergate, John Carpenter's Escape from New York leaves the viewer with no foreseeable remedy for the decadent situation. In fact, judging by Plissken's act of sabotage against the President's urgent message to the other superpowers of the world, the message Carpenter appears to be trying to convey is that although things are bad now, they will get worse, with no prospect of recapturing the optimism of a bygone era. No doubt this resonated with fears in the audience of an imminent last-man scenario, where the cherished ideals of yesterday are not just fading away, but ultimately not worth fighting for.
A lot of what passes for Nietzsche's image in popular thought is a caricature constructed by the Nazi propaganda machine in the 1930s (largely with the help of the philosopher's own nationalistic, anti-Semitic sister, Elisabeth). Of course, if blame is to be assigned, then it is only fair to point out that much of the misinterpretation surrounding Nietzsche stems from the man's own insistence on expressing his views in rather quick, often intentionally obscure musings and aphorisms, leaving his ideas wide open to be bastardized by opportunistic ideologues.
The reality is that even though it takes little effort to sanction an elitist system through Nietzsche's philosophy, the actual details that accompany the man's anti-egalitarian values—namely, anti-politics, anti-nationalism [especially anti-German], anti-group/herd mentality—are by definition incompatible with the belligerent, conformist, nationalistic fascism inherent to the Third Reich's state ideology. Nietzsche's views on the notion of nationalities and personal identities (and the oftentimes conflicted dynamics between the two) reveal a much more complex and nuanced perspective than the picture that has been (and still is) often presented of him as the patron saint of Nazism.
In Part Eight of Beyond Good and Evil (1886), titled “Peoples and Fatherlands”, Nietzsche outlines his analysis of European and Western development, and critiques the modern move towards democratic institutions as a step towards the cultivation of a true tyranny. Nietzsche comments that the tribal affiliations that once dominated Europe are eroding away in favor of a more borderless sentiment amongst the hitherto disconnected people:
The Europeans are becoming more similar to each other … an essentially supra-national and nomadic type of man is gradually coming up, a type that possesses, physiologically speaking, a maximum of the art and power of adaptation as its typical distinction.
For Nietzsche, this development is a direct result of the advent of modernity, and modern ideas, which has made a person’s allegiance to a trifling tribe or nation unsatisfactory in light of modern man’s greater awareness of the world. Thus, a grander identity is needed, and a newer, more encompassing, international personal ideology is required to escape the limitations of the narrow worldview of one’s regional clan. Moreover, as identities and ideologies extend beyond the old local boundaries, a person’s interests will also evolve from the tribal group to the global. Politically, one possible result from all of this will be the development of a pluralistic society, out of which democracy will ascend as a means of appeasing the diverging—and converging—interests arising amongst the new, modern populace. It is within this context, Nietzsche argues, that democracy is born.
Nietzsche understands how this rise of democracy is looked upon as great progress by contemporary society, but the philosopher himself is wary of the implications that such a system holds for humanity, stating that "this process will probably lead to results which would seem to be least expected by those who naively promote and praise it, the apostles of 'modern ideas.'" Nietzsche is distrustful of populist inclinations because they unduly empower the degenerate, weaker persons of society to impede the progress of the more innovative value-creators, who will be forced to reside amongst the lowly plebeian masses. This sentiment is directly tied in with Nietzsche's thesis on the dichotomy of master-slave moralities, the relevant part of which can be summarized as follows:
Our egalitarian sentiment, according to Nietzsche, is a result of the poison we have all blindly swallowed. Our demand for universal moderation, our value of humility, our aversion to boastfulness as being too impolite in the presence of weaker, stupider individuals, and our desire to soften the sting of an opponent's failures are all manifestations of the original slave revolt in morality, promulgated by those who seek to vindicate the virtue of their inferiority by means of social cohesion; to rationalize away personal failure in favor of mass victimization.
The democratization of society is to Nietzsche a move towards the promotion of mediocrity. It will condition us to be content with the will of others as reasonably equivalent to our own, instead of asserting our own interest in opposition to the whims of the masses. In short, our striving to achieve a more egalitarian mindset will leave us too eager to settle for compromises with positions we fundamentally disagree with, rendering us potentially incapable of identifying and combating the ascension of any tyrannical entity that might see fit to stealthily encroach its power over our person:
The very same new conditions that will on the average lead to the leveling and mediocritization of man—to a useful, industrious, handy, multi-purpose herd animal—are likely in the highest degree to give birth to the exceptional human beings of the most dangerous and attractive quality.
Nietzsche proposes that in a society where the primary aim is to create unanimous equality, the ultimate result will be an environment of obstinate complacency (the greatest form of oppression that can be leveled against a thinking person). All this will in turn lead to the sweeping infantilization of the individual, making them dependent on the body of the system as a whole for their survival, rather than on their own strength and merit. It is a trend that will lead to a population "who will be poor in will, extremely employable, and as much in need of a master and commander as of their daily bread."
However, the degeneration will not be universal amongst all individuals. Nietzsche explains that "while the democratization of Europe leads to the production of a type that is prepared for slavery in the subtlest sense, in single, exceptional cases the strong human being will have to turn out stronger and richer than perhaps ever before." According to Nietzsche, in nature there exist those who can only dominate by virtue of their own values, and those who can only be dominated as a result of their inability to create values (hence, they must leech off of the values of others). These two groups act as they do by the presence of their will to power, that is to say, the very nature of their existence. As long as they exist, they cannot choose to act differently than the manner in which their nature—i.e. their will to power—dictates.
The problem Nietzsche sees with modernity is that our egalitarian-minded moral system has turned all of this upside down, allowing the weaker plebeian caste (who cannot create any values of their own) to dominate the environment, within which the stronger noble caste (the natural value-creators) are conditioned to stoop to the level of the very masses they should be dominating. This causes a dilemma for those few contemporary men born possessing the noble character trait: their instinct (their will to power) tells them to reject the moral values of their surroundings and create their own, but their conscience (indoctrinated by the slave mentality of the lowly masses controlling the moral discourse) tells them that subverting their own will for the benefit of the herd is the highest virtue of the good modern man. Thus, when individuals do inevitably rise above the masses (because, in Nietzsche's view, the masses cannot help but unwittingly condition themselves to be dominated by some sort of master), the resulting value-creators who ascend to power will be as much a perversity of the noble character as the degenerate culture that produced them; what will ensue is absolute tyranny:
I meant to say: the democratization of Europe is at the same time an involuntary arrangement for the cultivation of tyrants—taking that word in every sense, including the most spiritual.
Reading these dire statements by Nietzsche through the privileged viewpoint of the 21st century, an observer would be justified in marveling at the prophetic nature of the philosopher's words in predicting the rise of the totalitarian systems that would follow a few decades after his death.
The rise of fascism in both Italy and Germany appeared to emerge out of relatively democratic phases in both nations' histories. Likewise, the 1917 October Revolution in Russia that brought the Bolshevik faction to power in the unstable country was enabled by the indecisiveness of the democratically-minded Provisional Government that arose from the February Revolution earlier that year. In all of these examples the presence of a democratic political institution did not hinder the advent of repressive totalitarian regimes. Moreover (Nietzsche might argue), the presence of said democracies was instrumental in opening the door to these malignant forces, as they had no mechanism by which to eject them from the political process besides the whims of a broken, infantilized population (whom Nietzsche describes as being "prepared for slavery in the subtlest sense").
However, if one wants to be critical about the possibly prophetic nature of Nietzsche's philosophy, it would also be apropos to point out that this sort of historical analysis is more the result of selective reasoning than objective inquiry. After all, it is equally true that every single one of the European democracies that yielded the totalitarian regimes of the 20th century was itself preceded by a non-democratic political entity, whose infrastructure crumbled despite its lack of concern for creating an egalitarian society. Furthermore, if the oppression of the totalitarian models of the last century is to be blamed on the insufficiency of the democratic institutions that preceded them, then consistency demands that we also blame the insufficiencies of these democratic institutions on the failures of the aristocratic power structures that preceded them; and so on, and so forth, ad infinitum.
A better way to approach Nietzsche's position here is to consider that the philosopher may not be referring to political power at all, but to a psychological development: "I hear with pleasure that our sun is swiftly moving toward the constellation of Hercules—and I hope that man on this earth will in this respect follow the sun's example?" Hercules, of course, is the Roman demigod who is described as having returned from the underworld and eventually ascended to the realm of the gods by virtue of his strength and valor—a character whose legend must have served for Nietzsche as a fitting representation of the philosopher's will to power. The fact that Nietzsche phrases the reference as a question indicates that he was doubtful that man would follow the example set forth by the Roman demigod.
I mentioned before that Nietzsche's popular image is heavily, and unjustifiably, linked with Nazism. The falsity of this association is exposed by Nietzsche's own rejection of the purity of the German people, a sentiment that is antithetical to Nazi ideology: "The German soul is above all manifold, of diverse origins, more put together and superimposed than actually built." To Nietzsche the idea that Germany is to be cleansed of foreign elements is an absurdity in and of itself, since all things German (for him) are a mixture of originally non-German elements [a truth that I personally believe aptly pertains to all nations and ethnicities]. Nietzsche views the German nationalism emerging in his time as the result of an undefined people attempting to become a coherent identity; it is a compensation for a fault, which in its path "is at work trying to Germanize the whole of Europe" [a statement that perhaps once again hints at Nietzsche's "prophetic" qualities in predicting the coming decades].
The most surprising fact to anyone whose opinion of Nietzsche has been largely shaped by the man’s false image as a Nazi precursor is the philosopher’s staunch abhorrence of European anti-Semitism. Nietzsche seems to have understood the potential for his writings to be utilized by opportunistic anti-Semites, which caused him to purposefully herald the Jewish people as a superior specimen, in contrast to the anti-Semites who sought to expel them from the continent:
The Jews, however, are beyond any doubt the strongest, toughest, and purest race now living in Europe; they know how to prevail even under the worst conditions (even better than under favorable conditions), by means of virtue that today one would like to mark as vices.
The irony here is that Nietzsche is attributing to the Jewish people every positive quality the anti-Semitic nationalists of Europe wish to claim for themselves. Just how much of this is motivated by Nietzsche’s preemptive desire to separate himself from the bigoted views of some of his potential admirers is an open question, but what is certain is the philosopher’s complete denunciation of the conspiratorial propaganda the anti-Semites are eager to spread into public consciousness:
That the Jews, if they wanted it—or if they were forced into it, which seems to be what the anti-Semites want—could even now have preponderance, indeed quite literally mastery over Europe, that is certain; that they are not working and planning for this is equally certain.
In other words, Nietzsche is of the opinion that if the Jewish people were as eager for world domination as the anti-Semites claim, they would already be dominating the world by now. The fact that they are neither planning nor interested in this is evidenced by the continued harassment they have to endure from people who claim (and have been claiming for a good few centuries now) to constantly be a knife-edge away from “Jewish dominance.” Instead, Nietzsche suggests that the history of the Jewish people in Europe indicates a desire to at long last be accepted within the public realm:
Meanwhile they want and wish rather, even with some importunity to be absorbed and assimilated by Europe; they long to be fixed, permitted, respected somewhere at long last.
He even goes so far as to insist that, to achieve the long overdue inclusion of the Jewish people, “it might be useful and fair to expel the anti-Semite screamers from the country.” I mentioned before the possibility that Nietzsche’s motivation for writing this screed against the anti-Semites of Europe is directly tied to his desire to counter any possible conflation between his views and those of some of his more questionable admirers (a move that, while well-intentioned, proved futile in the long run).
A more substantive intellectual challenge to Nietzsche’s passionate defense of the Jewish people is the seeming contradiction it creates with the man’s staunch attacks against religion, in particular against Abrahamic monotheism, of which Judaism is the founding faith. A reasonable counter Nietzsche could make is that nowhere in his defense of the Jewish people does he defend any of the religious tenets of Judaism; rather, he is aiming to point out the prejudice unduly leveled against the Jews as an ethnic group (which is what their most vitriolic defamers classify them as). Another point of consideration is that Nietzsche’s defense of the Jewish people, as an ethnic group, is completely compatible with his broader worldview regarding master-slave moralities. As a quick summary, Nietzsche divides human society into two distinct castes: the aristocratic nobility (the value-creating masters) and the plebeian masses (the herd-minded slaves). Among the aristocratic nobility, who–according to Nietzsche–are the rightful arbiters of what is morally good, a further distinction is made between the knightly aristocracy and the priestly aristocracy; the latter are the ones who have provided the intellectual means for the lowly plebeians to wage a slave revolt against the purer morality of the nobler caste—a slave revolt which has permeated and shaped the moral conscience of modern man. In the scenario described by Nietzsche, the ancient Hebrews occupy the role of the priestly aristocracy, which created the opportunity for the revolting slave morality of Christianity to pervert the nobleman’s superior morality.
But Germans and anti-Semites aren’t the only groups Nietzsche holds in low regard; his opinions of the English are equally negative, as he dismissively refers to the nation’s philosophical contributors as the archetypes of modern mediocrity:
There are truths that are recognized best by mediocre minds because they are most congenial to them; there are truths that have charm and seductive powers only for mediocre spirits: we come up against this perhaps disagreeable proposition just now, since the spirit of respectable but mediocre Englishmen…
Nietzsche’s sentiment here could be due to his perception of the historical influence English thinkers have had in fostering the atmosphere for what he considers to be harmful modern ideals. Nietzsche’s reasoning may partly be justified by the fact that English parliamentary-style government has served as a model for many forms of European democracy; a system which, as discussed earlier, Nietzsche views as contributing to the “mediocritization of man.” This reading is supported by the philosopher’s persistent equating of lowly plebeian values with the English nation, in contrast to the superior (in Nietzsche’s eyes) French culture: “European noblesse—of feeling, of taste, of manners, taking the word, in short, in every higher sense—is the work and invention of France; the European vulgarity, the plebeianism of modern ideas, that of England.” Here, Nietzsche’s personal biases leak through the prose, showing his preference for the Latin countries in which he spent a great deal of his creative career, in hopes that the temperate climate would alleviate his poor health. France, in particular, is a place he developed a great deal of fondness for, an affection that was further encouraged by the fact that the German nationalists of his time (à la Richard Wagner) held French culture in very low regard. In contrast to the barbarism of the northern cultures of Europe, Nietzsche described the French as possessing a more timid and sophisticated taste and mannerism:
Even now one still encounters in France an advance understanding and accommodation of those rarer and rarely contented human beings who are too comprehensive to find satisfaction in any fatherlandishness and know how to love the south in the north and the north in the south.
Of course, it can easily be argued that Nietzsche is engaging in a very selective form of cultural analysis in his heralding of France as a society that has transcended politics and nationalities. Furthermore, one is even justified in pointing out the apparent contradiction in Nietzsche’s reasoning, since the ideals of the French Revolution played a large part in nurturing the call for democratic reforms throughout the European continent—at least in spirit, if not in practice—a historical development Nietzsche claims to despise wholeheartedly. The inconsistency of Nietzsche’s condemning the English for their historic role in nurturing democratic principles while failing to acknowledge France’s equal part in this modernization effort is a shortcoming that cannot (and should not) be easily overlooked by even the casual reader.
On the face of things, Nietzsche’s opinions of nationalities and patriotism appear direct and concise, as he spends page after page polemically dissecting and chastising all who fall for such “infantile” ideals. However, the man’s mindset on the modern development of Western society seems to be somewhat murky at times. He writes as if he loathes the coming uniformity of society (a sentiment instilled through the growing influence of democratic institutions), but at the same time he condemns the narrow-minded tribalism on offer from the nationalists. This leaves open the question of what sort of political development Nietzsche would like to see come about to reverse the wrong course we are currently on. Moreover, is it even possible to develop any political ideals from a man whose philosophy is so staunchly anti-political to begin with? Will not any such attempt result in complete failure, on the grounds that one cannot successfully create an ideological foundation on inherently polemical premises? I think Nietzsche’s primary aim on the issue of modern politics ought to be viewed more as social criticism than as a social framework. For instance, when it comes to European affairs, the philosopher distances himself from both the nationalist and democratic factions, but is astute enough to realize that the former is the final gasp of a dying sentiment, and that the latter will be the ultimate trend amongst modern man, because (above all else) “Europe wants to become one.” Yet, despite the potential that lies within the aim of greater social unity, the underlying principles upon which this globalizing trend is based are something Nietzsche simply cannot support in good spirit.
 Nietzsche, Friedrich. Beyond Good and Evil, Part Eight “Peoples and Fatherlands,” section 242.
 Ibid., section 243.
 Virgil, Aeneid, 6.395.
 Nietzsche, Beyond Good and Evil, “Peoples and Fatherlands,” section 244.
 Ibid., section 251.
 Nietzsche, Friedrich. On the Genealogy of Morals, “First Essay: ‘Good and Evil,’ ‘Good and Bad,’” 1887, section 7.
 Nietzsche, Beyond Good and Evil, “Peoples and Fatherlands,” section 253.
Having taken as long a break as I did from posting updates to this site, and having done so just as a pandemic was getting started to boot, it would be weird if I didn’t address the 2,020-lb elephant in the room. Namely: I’m fine and healthy, and have remained so for this whole year (so far, at least).
Part of the reason I managed to avoid COVID like the proverbial plague is that, unlike many of my fellow Americans who serve as essential workers in our struggling economy, my place of employment was able to transition its staff into a work-from-home setup early on in this ordeal. Hence, I was fortunate that the burden of figuring out how to properly socially distance from my coworkers never seriously fell on me, even as the confirmed number of infections continued to climb.
When we consider that the people who had to remain in continuous contact with the public, day in and day out, putting their health at risk so that the rest of us could receive the services we needed to uphold some level of comfort during this trying time, are also by far among the lowest-paid workers in American society, it doesn’t take a whole lot of big-brain thinking to figure out that something in our society is seriously messed up. But I digress.
The point of this post isn’t to complain about the imbalance in the modern-day economic model. What I really want to discuss is a realization I made during my seven-month (and counting!) tenure as a remote worker. It’s a realization that many made long before me, and long before corona was on everyone’s lips (and mucus): the traditional office is obsolete and serves no purpose in a 21st-century workforce.
Now, hold on. Whether you agree or disagree with me, I do come carrying caveats, just in case.
Let’s just start out by saying that this statement is not absolute, of course. I’m putting that out there right off the bat, before readers start emailing me a list of office jobs that can’t be done remotely. I know these roles exist, I know that they’re vital, and I know that there will always be a place carved out for them in the white-collar workforce. However, anecdotal counterexamples don’t change the fact that, as a whole, a lot of what the average office worker does by going to a cubicle every day could just as well be adapted to a home office setting. And workers could do so at a reduction of costs for themselves (save on gas, save on meals…hell, save on clothes if you don’t feel like wearing pants anymore as you work–it’s your living room, Bob, go nuts!). But it’s also a cost reduction for the employer, as they would no longer need as extensive a physical office if most of their staff is working remotely. Note, I said they’d have no need for as extensive a physical office. I understand that there will always be a need for a skeleton crew of individuals to run the daily administrative responsibilities at a company’s corporate location, but the argument is that the space needed to house a handful of individuals ought to be a lot more affordable than the space that’s needed to house a staff of two dozen or more for forty hours a week.
The second thing I’d like to address is the appeal to the need to foster workplace camaraderie between coworkers, and how working remotely will cause us to lose this experience of bonding with the people we share an office with. While I don’t doubt that there are many out there who bond, socialize, and form lifelong companionship with their coworkers, I would guess that for every employee who falls into that category, there are at least six or seven employees who have little to no interest in viewing the persons sitting at the desks around them as people to get chummy with. That’s not to say that most people necessarily view their coworkers negatively, but there is a big difference between being friendly with others and being friends with them. I’d wager that for most of us, coworkers fall more into the former than the latter camp. I can’t help but think that this idealized notion of camaraderie between employees exists mostly in the minds of a management class that doesn’t really grasp just how little time the average American worker has to fraternize with their colleagues while they’re rushing to meet deadlines and process a full day’s workload.
I’ve also been told that productivity is a concern when it comes to work-from-home, and that employees’ productivity is demonstrably higher when they need to go to an office away from their homes, as it enforces the separation between one’s professional and private life. Granted, I’m single, live alone, and have no children. So I don’t want to lecture those whose living situation is different than mine, nor do I want to resort to deferring to testimonials from married parents living in multi-family homes who also happen to agree with me. I will simply say that I’m amazed employers are having trouble figuring out what to do with unproductive employees just because those employees happen to be working remotely. After all, if you’ve spent any amount of time grunting it out with us plebes on the floor of the office, you’d know that there are always one or two unproductive members of the staff sitting in a cubicle only a few feet away from management’s vigilant eye. And I have yet to hear anybody try to make the case that these individuals’ lax attitudes must be tied to having to put on a tie and sit in a box for eight hours a day. I’m not saying it is; I’m saying that short of raw data on the subject, we’re both just speculating to fit our narratives.
I could go on for much longer, but I want to finish by admitting that this time last year, I was fairly open to the idea that traveling to a work office every morning was more ideal for most company jobs than having the majority of such employees work from home. Face-to-face training, interpersonal meetings, and even just the casual “Hello!” in the break room seemed like integral parts of the working experience to me, and I could have been swayed into believing that they were necessary parts we shouldn’t abandon. But now that I’ve worked remotely for the better part of the year, I just don’t see the point of having people shuffle to and from desks and cubicles, where they’ll be immersed in work for hours on end, only to occasionally look up from their daily reports to nod at the equally overworked person sitting next to them. The amount of genuine human engagement most of us experience in the office isn’t enough to satisfy the basic socializing needs of the most introverted members of society, let alone the majority of people who fall closer to the median of that spectrum. And if human engagement is why we’re holding on to an increasingly outdated concept, we’re probably better off figuring out how to find it elsewhere.
Social life, and the social culture that surrounds it, is by necessity an idealization of extroverted personalities. Being outgoing, adventurous, flirtatious–i.e., sociable–is the go-to characteristic that storytellers revert to when they want to make a character likable. In contrast, if they want to convey the point that a character is not fully well-adjusted, the usual trope is to make her/him socially aloof (or downright inept), awkward, withdrawn, or not good at the basics of human interaction (somehow Sherlock Holmes can deduce all the intricacies of human behavior to get an accurate read on people’s personalities, right down to their favorite toilet paper brands, but can’t figure out that he himself is a total asshole, huh?). Given this pervasively negative portrayal of introversion by media and entertainment sources, it’s no surprise that many introverts will eagerly seek out any medium that affirms the humanity of the introverted individual.
Self-help books on Amazon that deal with introversion not as a maladaptive flaw, but as a perfectly valid state of personality, garner a lot of support, both in their reviews and in their number of sales. Online communities (which tend to skew heavily towards the introverted side of the personality scale anyway) will often share supportive words and studies showing that being an introvert doesn’t simply end at “not being social,” but encompasses a wide array of positive traits, too, such as thoughtfulness, self-sufficiency, and creative aptitude. One could even argue that, given the ease with which social media has taken over much of modern human communication, the digital age we’re enjoying caters much better to our introverted tendencies, since users of these platforms have the control to tailor interactions to their personal comfort levels.
Personally, I definitely lean more towards being an introvert than an extrovert, so I’m inclined to welcome any positive press bestowed upon my fellow shut-ins (relax; we’re allowed to ironically use these demeaning terms among ourselves). But going right along with the introvert’s supposed knack for thoughtful introspection, I would be doing my tribe a disservice if I didn’t point out that for many people the introvert label has become somewhat of a cop-out to avoid uncomfortable situations, or to avoid taking steps towards any semblance of self-improvement on the social front.
Everybody has bouts of introversion, even the most socially lively among us. Usually these show up while we’re in the midst of new social surroundings and experiences. What seems to separate the self-identified extroverts from the self-identified introverts is the way they respond to said experiences. Extroverts will use the initial discomfort to energize themselves and try to turn the unfamiliar setting into something familiar (thereby increasing their comfort level with it), while introverts tend to see these social settings as a drain on their energy and will approach them like a tedious chore (thereby not concerning themselves with increasing their comfort level in the situation, but focusing on the comfort they’ll get to enjoy once they’re finally able to be alone again). I’m admittedly generalizing here for the sake of brevity, so calm down with the caveats and nuances I know you’re preparing to angrily type my way (we introverts do have a penchant for pedantry, after all).
With this bit of pop psychology aside, I want to get to a matter I have observed pretty prominently for a while now. Many of us who identify as introverts often use the label as an excuse to cover for our shyness. As I said, everyone is introverted some of the time, but I’ve noticed that for many of us who define ourselves as introverts–not just as one of our personality traits, but as the defining trait of our identity–what we seem to be doing is using the now more socially acceptable fact of being an introvert to hide the still less acceptable fact of just being too shy.
What reason would any of us have to delude ourselves this way? Well, for starters, to say that you are an introvert is to say that avoiding social settings is a part of your nature, while admitting that you are just too shy for social settings might make you sound like you are fearful, and therefore make you feel like a coward. It goes without saying that being shy doesn’t make anyone a lesser person, but it’s also unavoidable that most of us would rather not advertise our fears and insecurities to the rest of the world. With the rise of respectability given to genuine introversion, many of us see it as an opportunity to mask our social fears and anxieties behind it. Meanwhile, we continue to feel withdrawn and isolated, and continue to fall deeper into the despair of loneliness; making it much worse for ourselves because we’ve now fooled all those around us into believing that being alone is our preferred state of being. And because we have convinced others (and, on a surface level, ourselves) that we are innate introverts, whose default nature is to be away from others as much as possible, we eventually find it impossible to seek out what we truly do crave at our core: companionship and camaraderie.
It took me some time to accept that deep down I wasn’t just an introvert comfortable in solitude, as much as I was also a shy kid who was afraid to engage in social settings, despite actually having a basic desire to do so. This shy kid eventually became a shy adult who embraced his more introverted qualities, because it was easier than having to confront my honest fears on the matter, and leave myself vulnerable to the very sort of judgment that caused my shyness (and nurtured my introversion) to begin with.
Much like stage fright, I can’t promise that shyness ever really goes away. Whether its origins are ultimately caused by nature or nurture (or a combination of both), once you mature through life with it, you’ll always feel some of its effects on you. But there are ways to lessen its sting, especially when it comes to your outward interactions with others. It takes effort (a lot of effort), as no book, seminar, or inspirational quote can do the job of remolding the way you see yourself, and the way you interact with the world around you. But it can be done. And if you are a self-identified introvert reading this, I would ask you to consider whether, for you too, it is perhaps simple shyness that is at the root of what you believe to be an inherently introverted character.
And if you are considering finding ways to overcome the negative aspects of shyness that are keeping you from being as happy in life as you could potentially be, a giant step forward will be to admit the fact of your shyness to yourself. The next steps forward are more incremental, and involve making a combination of small and moderate changes to your way of thinking about socializing and interacting with others. One giant step backward from any possible progress, however, is to cling to things that allow you to hide from the reality of your fears and insecurities about achieving the social life that would satisfy you (whatever extent or comfort level that may be), and to pretend that your lack of social interaction is the result of being an innate introvert, when it probably has more to do with simply being a person whose shyness has caused them to avoid the initial discomfort of socializing. There is no shame in not wanting to be alone, but hiding from this want, and continuing to deny it to ourselves out of a misguided sense of loyalty to an identity we have adopted to cope with our shyness, is the best way to guarantee a lifelong refuge in a misery that need not be.
I make no secret about the fact that I consider the self-help industry to be largely bullshit. That’s not to say that striving for personal improvement isn’t a worthwhile goal, and there is certainly no shame in seeking out sources that will help one achieve said improvement. In fact, I’m a firm proponent that everyone should go out and find personal fulfillment and work towards better clarity, understanding, and all that great stuff that makes a person a well-adjusted and psychologically healthy individual. Be it yoga, video games, sports teams, fitness, mountain climbing, elaborate cooking escapades–if it floats your boat and leads to a better version of you, then, by all means, ride that wave home to shore.
The problem is that the self-help industry is something very different from just a resource for genuine self-improvement. It’s a profit-driven marketing scheme, propagated by charlatans with a cult-like sense of self-importance, whose bottom line is to prey on people’s insecurities as a means to secure their own monetary success and celebrity status–where helping people overcome their actual problems is an afterthought, if it is given any real thought at all.
Noting the handful of (arguably) legit self-help trends that I’m sure some readers will be eager to point to as the exception to my condemnations above, I would hope that most of us can at least agree that what is commonly referred to as the pickup artist (PUA) community, largely operates as a racket.
PUA is the umbrella term for various seduction and attraction methods put forward by a loose-knit collection of self-styled experts in the field who claim to be able to help men get sex from women. Now, those within these communities will undoubtedly disagree with my description here, and will want to claim that their “techniques” actually span a variety of confidence-building and self-improvement exercises applicable to areas of a person’s life well beyond just sex and seduction. But in all honesty, I dare anybody to point to a single legit PUA source whose underlying material isn’t about showing men how to get laid with a higher quantity of attractive women. Go ahead, I’ll wait…
Books, seminars, workshops, blogs, podcasts–there is hardly a profitable venue the PUA market hasn’t reached. The gurus delivering the message will almost always be decently attractive men themselves, who will always claim to have at some point been just as clueless about approaching women as the love-shy men currently seeking their advice. The methods they teach are therefore tested and street-verified, with the transformation and testimonial of the now-suave pickup artist himself as the ultimate proof that you, too, can reach this level of Casanova sexual prowess with the ladies. These PUA gurus will offer samples of their services for free online, but to really get the full effect of their wisdom you will eventually need to commit to attending their in-field training camps, the cost of which can range up into the thousands (yes, thousands!) of dollars.
I’ve never been much of a businessman, so I will foolishly distill the basic message of all PUA methods, techniques, and skills down into two all-encompassing points:
1. Don’t be needy.
2. Play the odds.
The first point covers all the basics of not coming across as desperate, or fixating, or being too accommodating towards any one woman. And the second point emphasizes how, in a world of varying sexual appetites, simply approaching enough women will statistically increase the likelihood in your favor that at least a few of them will be willing to interact with you, and possibly even have sex with you.
That’s it. Those are all the tactics PUAs have to offer in a nutshell. All the jargon, all the insider terminology, essentially falls under points 1 and 2 above.
Now I’m going to go one step further and actually tell you the key universal truth about attraction. Are you ready? There is nothing you can do to make someone attracted to you, if they weren’t already inclined to feel attracted to you. In case you need it put more bluntly: There is no trick, method, or approach you could ever learn or master that will make someone who is otherwise not attracted to you, suddenly want to have sex with you.
Oh sure, you could wave millions of dollars in a woman’s face to entice her to pretend to be attracted to you. Hell, the incentive of gaining riches could very easily make a number of straight men agree to fondle your genitalia, too. But they still won’t be attracted to you; not really. Not any more than they were inclined to be when they first met you, and knew nothing about you.
If you need further convincing of the validity of this key universal truth of attraction, indulge me with this thought experiment. Think of a person you are just not sexually attracted to, at all. There doesn’t need to be anything physically wrong with them, and they could be a perfectly lovely and decent human being in their own right; they’re just not your cup of tea as far as sexual attraction goes. Now try to think of anything this person could ever do or say that would suddenly make you feel sexually attracted to them. Can you think of anything? No? Exactly.
Pickup artists know that this is the truth, and it’s part of their long con. They understand that it really doesn’t fucking matter what you say to a woman, just that you approach her in the first place. Because what do these PUA gurus say to the men who have spent 2–3 paychecks’ worth for their advice when those men still end up striking out with a woman under their tutelage?–“Don’t worry about it, man. Just go on to the next one.” Which is correct and good advice, but hardly worth the shitload of cash they had these men put in to receive it. But men who lack experience with talking to women in the first place–let alone dating them–don’t know that. They think there must be something more to it, like a secret code that can be deciphered. But there isn’t. No code, hence no cheat code; ergo, no shortcuts or tricks.
You will only be attractive to the women who find you attractive, and you will only find these women by talking to and approaching enough women in the first place. And as long as you act like a decent enough human being, you will manage to keep the attraction of these women long enough that they may agree to have sex with you. That’s it. No book, or method, or lecture, or dishonestly edited “in-field footage” will give you any more insight than that.
There will be some number of readers who will nod along in agreement with everything I have written in this post about PUAs and their tactics, but who will part ways with me over my unwillingness to outright attack the men these gurus prey on for personal and financial gain. They might say that if you are the sort of person who is so easily taken in by obvious grifters, you deserve little to no sympathy for it. If you are of this mindset, I can’t say anything to dissuade you of it, but I sincerely cannot find it in me to go along with this line of thinking.
If you are the sort of person who takes advantage of another’s self-conscious personal flaws, and seeks to make a livelihood out of other people’s pain and loneliness, it is you who is the bad person, not those who were unfortunate enough to fall into your predatory sights. And PUAs, like all these self-help guru charlatans, are essentially just predators who have found a venue by which to turn their predatory natures into a profitable market. They deserve the ire of any decent person who crosses their path for it, and they don’t deserve to have any of that ire deflected onto their victims; regardless of how gullible the latter group may seem in the grand scheme of things.
The first grocery store I saw when I moved to the United States was a meager-looking spectacle called Sellers Bros. in a rundown strip-mall area of southwest Houston, TX. The store’s shelves were as overcrowded with bargain, generic-name products as its aisles were with patrons shuffling from one end of the building to the next, holding tightly to the Lone Star Cards needed to feed their families for the month. The building’s somber-looking outer structure held a passing resemblance to the apartment complexes that surrounded it only a few paces away—one of which my family was living in at the time, serving as our first exposure to the realities of inner-city American life we had immigrated to, and were gradually assimilating into.
The majority of the neighborhood was composed of immigrant families. Though unlike my family, which originated east of the Atlantic Ocean, it was impossible not to notice that most of my neighbors hailed from south of the Rio Grande. As a result, while I had come to this country with the advantage of being able to speak English reasonably well—well enough to understand, and be understood by, the general Anglophone population anyway—this advantage proved of little value on the very street I called home during those years of my adolescence. It was an early education in a fact many living in urban America are readily familiar with. Namely, that within the reality of American life reside smaller sets of conflicting realities, many of which can neither communicate with nor understand one another, and are set up so that they will rarely meet. Gulfton Street in Houston, Texas, occupies one such reality.
Tucked away between two major highways in southwest Houston, spanning a stretch of 3 to 4 miles of cracked concrete landscape, sits the street of Gulfton. It is the epicenter of the Gulfton Ghetto, as the area is occasionally called by the local media and by other Houstonians (though never by the neighborhood’s own inhabitants). To those who take a wrong turn off Bellaire and find themselves driving down Gulfton Street by accident, the insulting nickname will seem most warranted.
The immediate sights one is met with are panel after panel of gang graffiti, row upon row of low-rent apartment complexes, and concrete sidewalks that have been in desperate need of repair for a good few decades now. Surprisingly, there is a park/recreational center meant to give some relief to the area’s ongoing problem with juvenile delinquency, though anyone who has ever set foot in the park itself will be quickly robbed of any hopefulness about the prospects of this endeavor. In short, like many neighborhoods in urban America, Gulfton is a place that has been largely abandoned to the ravages of metropolitan entropy.
Underfunded and halfway fleshed-out improvement projects that have failed to live up to expectations are pointed to by the rest of the city as reasons not to bother with any future attempts at repairing the crumbling infrastructure. This leaves the residents who have given up on the idea of moving away to either wall themselves off from the unsavory conditions that surround them within their private residences (however meager those may be), or embrace those conditions by becoming a part of their destructive nature.
The first instinct any well-meaning person will have when confronted with a reality like Gulfton is to ask, “Can anything be done to fix this?” It’s an honest question, but it betrays a lot about the person asking it. The idea that there is any one thing that can resolve problems decades in the making is part of the problem to begin with. These sorts of problems have no one facet of origin; they are a delicate, interwoven mess of social, economic, and political barriers, erected and maintained through complex systems whose interests compete against and prop up one another in a multitude of ways. The problems of Gulfton, like the problems of similar neighborhoods and populations throughout this country, have no single cause; hence they can have no single solution to curb the path they are currently on.
“Why don’t the people living there work to fix things? It’s their neighborhood, after all. Don’t they care?”
Unfortunately, the reality of all urban areas is that they are landlocked and dependent on the larger metropolis that surrounds them. They don’t get to make decisions in a vacuum, and the resources that will be readily allocated to benefit them are finite and sparse. The further issue is that once a neighborhood has fallen far enough to be regarded as “hopeless” by the officials and administrators who could possibly make a difference, the very hopelessness of said neighborhood is used as the reason against committing long-term funds to improve its conditions, on the basis that it would be unfair to use tax dollars from well-behaved citizens in more savory parts of the city to fund the activities of no-good thugs and gangsters in these low-income, high-crime areas. Local agencies will say they are not equipped to handle the expenses needed to undertake the sort of social projects necessary to overhaul the issues plaguing these areas, while federal agencies see these issues as strictly a local concern.
In the absence of a robust social safety net provided by city or state authorities to ensure the most basic of securities and public amenities, opportunistic forces will band together to construct their own safety nets. For many young people, this takes the form of turning to gangs that prey on social instabilities, offering their quasi-organized crime structure as an alternative to festering in a decrepit social system. The reason youths are most susceptible to this is that they are the most in need of some kind of functioning social order to orient their lives (and relieve their boredom), and even the violent and dangerous structure of gang life is to many preferable to the instability of no visible structure at all.
Some people have a natural aversion to hearing that any set of issues constitutes a systemic problem requiring a systemic approach to resolve. To them, the very notion of entertaining such a thought is little more than an attempt to shift responsibility away from individuals and let them avoid the consequences of their actions and/or apathy, leaving them no incentive to make things better of their own accord. I can understand the sentiment behind this aversion, though I find it largely misinformed.
In a place like Gulfton, how exactly do you expect the individuals living there to step up to fix the various problems that plague their environment? Should they pool their meager earnings together to pay for the ongoing structural damage to their concrete sidewalks and street signs, despite the fact that we’re talking about city property, and as a result an issue that needs to be addressed by the local government? How about the need to improve the resources available to the local schools, so that there can be robust after-school programs and activities for young people to occupy their time with, discouraging the pull of delinquency and gang activity? Should the low-income-earning parents of these youths fund these programs directly, thereby taking away money needed to pay for rent, utilities, food, clothing, etc.? Would that be an example of individuals stepping up to take personal responsibility to improve the conditions around them, or a neglect of one’s obligation to provide basic necessities for one’s own family first? If donating money is not the answer, surely we can get everyone to at least volunteer their time to improve their community, no? It’s not as if the sort of people who have to live in these neighborhoods are overwhelmingly stuck working jobs with little to no flexibility in hours or time off, after all.
Perhaps the answer is that all these folks ought to work harder to increase their earnings, so they aren’t hostage to their economic conditions. Yet, if they actually managed to do just that, what incentive would they have to spend their extra earnings on repairing a place like Gulfton, as opposed to–oh, I don’t know–simply moving away to a better part of town that already offers the basics of dignified living conditions?
Unless you are Bruce Wayne, sitting on an endless supply of inherited wealth, resources, and leisure time, individuals donating money and/or time will never be a solution to the problems that affect neighborhoods like Gulfton. These are problems that took a long time to manifest, and they require long-term investment and planning to be resolved. Resolving them requires layers upon layers of overarching organizational resources to properly oversee and track improvements, which no single individual or clustered group is capable of providing. Private businesses, local or otherwise, also offer little help in the matter, since there is no business incentive in investing in a place simply to improve the lives and environment of its residents; these residents will not be able to return the gesture because, at the end of the day, they’ll still be too poor to ever turn a profit for these businesses.
And it takes an astounding level of naivete not to realize this. The same sort of naivete that leads certain people to make inane points like, “If you like public programs, and think taxes should be higher to pay for them, why don’t you just volunteer more of your money on an individual basis, instead of demanding everyone else do it through the tax code?” Because individual actions and donations will not solve systemic problems like the ones affecting neighborhoods like Gulfton, that’s why. Because many of the problems plaguing inner-city life are far too complex and interconnected with a multitude of surrounding factors to be seriously brushed off with red herrings about individual responsibility.
Areas like Gulfton are the way they are because they have become culturally and economically alienated from the rest of their metropolitan centers, and the rest of the country at large, and little is being done to incorporate them into the greater society that surrounds them. The full reasons for this alienation are legion, and the solutions that will be necessary will by definition be just as extensive, which is a reality that must be acknowledged by those who purport to take the issues of working, urban, and immigrant communities seriously.
If, on the other hand, you simply don’t care about places like Gulfton, then just say you don’t care, and stand by the convictions of your apathy. And stop pretending that there is a greater moral or ideological basis to what is essentially pure disinterest for the plight of people you can’t be bothered to give a shit about. It will make for a much more honest conversation.
In a not-too-distant previous life, when I thought that standing in front of dozens of apathetic teenagers, in the hope of teaching them proper grammar, writing, and argumentation skills, was a worthwhile vocation to pursue, I came up with a nifty little speech to start off every semester.
I would say:
I know exactly what you are thinking right now. It’s the same question every student, in every course, in every land thinks every time they enter a classroom.
Why do I need to learn this?
The simple answer is that it’s because the law requires you to; at least until you turn 18. For most of you that’s a good enough answer to put up with my incessant talking for a few months, scrape together enough effort to satisfy the course requirement, and move on to your next classroom, until the law finally says that you’ve gone through the motions long enough to be let loose into the real world, full of non-classroom-type duties and responsibilities. For most of you this answer is good enough. But there are a few of you for whom this sort of reasoning is not anywhere near good enough to make you put up with what the education system expects of you for an hour and fifteen minutes of your day.
If you fall within that group, I want you to listen very closely. In life you will meet many people. A great number of these people will make prejudgments about you from the first moment they see you–both good and bad. The good prejudgments will work to your benefit, and the bad will be obstacles that can make your life very, very hard.
People will make prejudgments about you based on your height, your weight, your race, your gender, the way you dress, the way you stand, even the way you choose to cut your hair. The negative opinions formed by these prejudgments, no matter how unfair or shallow, will for the most part concern things you have little control over. Except for one important component: the way you communicate. Yes, people will judge you by how you speak, too. And while you can’t do much about someone who simply hates you for the way you look, you can sure as hell do everything to deny them the pleasure of dismissing you for the way you communicate. Even if they still hate you at the end of the day for all the bigoted reasons available to them, you should at the very least do everything in your power to make it impossible for them to dismiss you for the way you write, the way you argue–the way you speak! That is entirely within your power, and it is a power that’s learned, not inherited. This is your opportunity to learn it, if this is a power you wish to possess. If you don’t, any prejudgments others make about your person as a result of your decision right now will be entirely on you.
I’m biased, but I like to think it got the point across as well as anything else could. And while the point was of course to get the students to feel somewhat enthused about the lesson plan, there was also a deeper purpose to my little pep-talk. Namely, I was demonstrating the use of rhetoric to argue the case for learning about rhetoric (none of the students ever really picked up on this, though).
Rhetoric has a few technical (read: boring) definitions floating around, but the basic gist is that rhetoric is a form of discourse aimed at persuasion (typically of a person or audience). This is the part most philosophical commentators agree on, anyway. Opinions regarding the use and ethical standing of rhetoric have been more polarizing, however. Plato looked down on rhetoric as mere flattery that could be used to manipulate the masses, as its primary purpose was to convince you to side with the argument, not to impart knowledge or truth. His student Aristotle took a more favorable view, considering rhetoric an important discipline (and art form), and a necessary part of any well-rounded civics education. Much of the writing and many of the social revolutions that emerged from the Enlightenment relied heavily on rhetoric to persuade the public to a new way of thinking about life (and liberty, and even the pursuit of happiness). The same goes for anti-Enlightenment reactionaries, who argued in favor of preserving the status quo in society.
In the modern world, rhetoric (in its purest form) is most readily seen in courtrooms and legislative bodies, and the political spheres that surround them. It’s no surprise that so many politicians start out as lawyers, and go on to use the same rhetorical tricks they learned in law school on the campaign trail. It’s for this reason that rhetoric takes on a negative connotation in many people’s minds.
Memorable (yet content-empty) slogans, propagated by conscience-devoid politicians whose only concern is scoring a victory in their (and their donors’) favor. Arguments put forth by their mouthpieces in the form of public commentators and pundits, serving the sole purpose of winning over the electorate’s hearts, often at the expense of their critical thought and personal long-term interests. Honorable mention also goes to the rhetorical tactics of self-professed experts who peddle pseudoscience and conspiracy theories, to the effect of fostering a perpetually misinformed populace for the sake of monetary gain. These can all be counted as examples in support of Plato’s skepticism towards rhetoric as a virtuous mode of discourse.
Even my speech above is arguably laced with unwarranted rhetorical hyperbole. (Honestly, most people you meet will probably not form good or bad opinions of you; they’ll probably look right past you with complete indifference if you offer no value to them as a person.) However, one should refrain from getting distracted by unwarranted equivocations. I sincerely believe there’s a big difference between educators using rhetoric to motivate their students to succeed in their coursework, and the sort of rhetoric that contributes to public policy meant to misinform the public (if you don’t, I hope you never get picked to serve on any jury).
I already mentioned the culpability of politicians who use rhetoric to spread propaganda for ideological gain. And while this is universally snubbed as somewhere on the edge of morally questionable behavior, the only reason it’s done is that it works so well. In other words, people get manipulated by the bells and whistles of skilled rhetoricians because they don’t care to educate themselves about the hogwash they are being fed (usually because they agree with and want to believe what’s being said to them, even if it’s factually baseless).
The public (at least its voting component) is the primary check on politicians in a democratic republic. However, given the ease with which we will readily be swayed by faint words of praise and reckless fearmongering, it’s not absurd to think that Plato may have been on to something when he expressed doubts about the public’s ability to guard against rhetoricians whose only purpose is to persuade, with complete disregard for the truth of their words.
A secondary check on the rhetoric of public officials is the part of the voting public that makes up the free press. The reason the founders of the United States explicitly protected the free press from the government in the First Amendment of the U.S. Constitution relates directly to the role the press (ideally) ought to play as the fact-checker holding those in power accountable. Unlike the public at large, a respectable free press has several internal mechanisms in play that work to sift credible information from the credulous. It’s also why the first thing clever rhetoricians do is undermine the very credibility of the free press. “Fake News” is a beautiful example of manipulative rhetoric at its finest, as it plays on the public’s reasonable distrust of media sources (i.e., it’s only reasonable to believe that some news outlets fail to overcome the biases of their presenters) and gives it a credulous dose of self-serving generalization (i.e., all news outlets that disagree with me are the biased ones, regardless of any evidence they present to support their position).
Any reasonable amount of critical thought on the subject clearly shows that the fact that news sources can be mistaken (or even outright deceptive) does not warrant the conclusion that all media must be wrong and lying whenever they report something you don’t want to be true. Once again, it’s up to the public to follow up on the sources any reputable press will readily provide, to check the merits of what’s being reported. Shouting “Fake News,” however, makes it easier to muddy this relationship between the public and the press, by painting all sectors of the press as untrustworthy in general, and allows people to lazily self-select only the media they are already disposed to agree with, without being burdened with doing any intellectual legwork.
Journalists are also rhetoricians by trade. Unlike politicians and lawyers, however, members of the free press ought to strive to belong to Aristotle’s more virtuous end of the rhetorical spectrum, which aims to persuade the masses towards truth and knowledge. As journalism moves more towards competing for public viewership in order to continue to operate–thereby having to appeal to the whims and tastes of the public, rather than seeking simply to inform them–the concept of fact-based reporting threatens to descend completely into the realm of vacuous rhetoric meant to do little more than keep up viewer support (which, as mentioned, is prone to succumb to some flimsy and fickle interests).
The elevation of online personalities, whose sole journalistic experience is being able to cultivate an audience around themselves on video-sharing sites like YouTube, under the neologism of “alternative media,” is an example of a free press where rhetoric takes precedence over fact-based reporting. This is not to smear those personalities who make every effort to be a respectable source of information, but the reality is that the environment of online news commentary is inherently prone to undermine the fact-checking mechanisms of traditional journalism, mostly by side-stepping them completely in favor of peddling rhetoric.
These online outlets have little in the way of field-based journalists doing the legwork to uncover newsworthy stories, let alone teams of fact-checkers tirelessly looking through sources and notes to determine the veracity of a story prior to its reporting. In truth, they rely almost entirely on the work of traditional journalists, whose work they present and provide opinionated commentary over, while every so often throwing in jabs at how ineffective traditional journalism is, despite most (if not all) of their actual “news” content coming through the efforts of said traditional journalism. The reason this matters is that it is a clear example of what could be a respectable profession, and a reliable venue of information for the public, sacrificing its responsibility to disseminate factual knowledge for the convenience of mindless rhetoric, because rhetoric offers popularity and financial gain in terms of viewer support and sponsorship.
Understanding the role of rhetoric–its values, its uses, and its prevalence–is vital to being able to identify the difference between an impassioned speaker fighting on behalf of a just cause and a demagogue looking to manipulate the mob to his advantage. It’s vital to being able to distinguish between journalists who go through many painstaking, sleepless nights to report a truth to the people as a public service, and pundit blowhards using the cover of journalism to propagate misinformation for their own gain and ego. In general, to understand the use of rhetoric is to be able to identify it and (if need be) ward yourself against its more dire influences.
Rhetoric is not, and should not be, a dirty word. Like most things, in benign and well-meaning hands it is a powerful tool of communication that can inspire immense good in the world. In the wrong hands, however, it can be the barrier that keeps us permanently free-falling into the abyss of credulity and self-destruction.
Recently the Republic of Ireland held a referendum to repeal longstanding blasphemy offenses in its country. While blasphemy still stands as a finable offense in the Republic under the 2009 Defamation Act, the referendum nevertheless demonstrates that, as far as the Irish people are concerned, charges of blasphemy ought not to be a part of punishable civil law in their nation.
Friends of my adopted homeland here in the United States usually have a conception of Western Europe as being made up of a set of predominantly secular and progressive cultures. And speaking as someone who spent many years growing up in Western Europe, this conception isn’t wholly unfounded. As a result, it might astound many Americans to hear that some of these secular, progressive, ultra-liberal, borderline lefty countries still have enforceable blasphemy laws in place. Granted, the actual enforceability of such laws is largely theoretical in nature, given that they are usually undermined by far more salient laws allowing for the freedom of religious expression and the freedom to believe in accordance with one’s personal conscience. Thus, blasphemy laws currently exist as vestigial organs in European law books, without practical purpose or application, yet present nonetheless.
“If these laws are unworkable, then why even bother to fret about them with referendums at all? Why not just continue to ignore them, and get on with your blaspheming ways?”
This could be a reasonable response, but it misses an important point concerning blasphemy laws. Putting aside the fact that it makes perfect sense to oppose the criminality of blasphemy on principle alone, as unbecoming of any modern democratic nation, there is also the issue of the frailty on which the laxity of these laws currently rests. To put it more plainly, the reason blasphemy charges are unworkable in most of the European nations that have them is precisely that the current sociopolitical climate is too secular and progressive to enforce them. However, as any student of history knows, sociopolitical climates are anything but static. So what happens if the political pendulum swings too far to the right, towards a political faction that views the protection of religious sensibilities as far more important to a nation’s cultural well-being than the free expression of its citizenry? Suddenly, these outdated blasphemy laws, which have had no real thrust in civil law for a very long time, become a very powerful weapon in the hands of reactionaries all too eager to use the existing rule of law to conform society to their line of quasi-pious thinking. And this is a potential threat that believers and unbelievers alike ought to be concerned about.
Blasphemy isn’t simply the act of professing one’s disbelief in religious claims whole cloth. Blasphemy is built into the very manner in which all religions profess the doctrines that make up their faiths.
Whenever polytheistic faiths, like certain sects of Hinduism, profess the existence of multiple gods, they are blaspheming against monotheistic religions, which insist that there is only one god and none other (and vice versa). Within the monotheistic Abrahamic faiths, when Christians profess that Jesus Christ is the foretold messiah, they are blaspheming against the Jewish faith, which holds that the messiah is yet to come (and vice versa). When Muslims claim that Jesus, though a prophet and a messiah, is not the son of God, they are blaspheming against a central claim of Christianity. The Catholic Church’s stance on the supremacy of the Roman papacy is blasphemous to the Eastern Orthodox Churches, and the Protestant rejection of Catholic ecclesiastical authority is blasphemous to Catholics. The Methodists are blasphemers to the Calvinists, and just about every Christian sect considers Mormonism a heresy.
The obvious point here is that to take away the right to blaspheme is to make it impossible for religious pluralism to exist within a society. Perhaps this is fine as long as your religious opinion is the dominant one in the society you inhabit, but what happens if you find yourself just short of the majority opinion? What if a population shift occurs, and the very laws that enforced the thin-skinned sensibilities of your religious persuasion become the means by which the new dominant line of thought undermines your right to religious expression?
I could stop writing now and end on this appeal for mutual cordiality between people of all faiths, and on how it is in everyone’s self-interest to oppose blasphemy laws, but I fear that would run very much against the spirit of healthy discomfort that blasphemy really should elicit in a person who comes across it. On that note, allow me to address the elephant in the room that needs to be brought up whenever concern over religious offense of any sort, in law or public discourse, rears its head.
Undeniably, religions make bold claims for themselves. Claims that offer definitive answers on matters concerning life, death, and morality, with a wager on possessing a monopoly on Truth with a capital T. And they are always keen to wrap this all-knowing, all-encompassing bit of absolutist wisdom in a garb of self-proclaimed humility, as if to say, “No, no, don’t mind me…I’m simply professing to know the answers to all of life’s mysteries, ready-made with the consequences (read: threats) that will befall you if you don’t follow along with my modest creed.”
In short, religions by their inherent design simply claim to know things they couldn’t possibly know. But I, in turn, admit that I don’t know. I don’t know what the answers to life’s mysteries are; nor do I know which of today’s mysteries will remain mysterious forever, and which might become common knowledge for subsequent generations. I don’t know which moral answers yield the most objective good for humanity; nor can I say for sure that such answers are even completely knowable. The truths I do know come with a lowercase t, held provisionally in accordance with forthcoming evidence and reasoned arguments, and I don’t know if I can do anything other than reject the grammar of bolder Truth claims when confronted with them.
It is precisely because I don’t know that I am left with little recourse but to examine, question, dismiss, disbelieve, and (when I see fit) deride those who do claim to know, yet offer hardly a shred of evidence for their claims. It took centuries of debate, and the bloodshed of previous generations of thinkers, for any of us to be able to enjoy this simple — yet powerful — privilege of skepticism. A privilege I do hold up as my right, and one I will speak up for without hesitation or apology. What you call blasphemy, I call critical thought. And if anyone can appeal to tradition as a means to protect religious sensibilities by legal force, I am fully within my rights to appeal to the tradition of cultural and intellectual pushback against religious doctrines and religious authorities that has made it possible for any sort of interfaith (and non-faith) social cohesion to exist in the modern world. A tradition that includes the right to the profane and the blasphemous, which cannot be allowed to be abridged in a democratic republic, for as long as one wishes to be part of any nation worthy of the claim.