
The Christian Right’s Faustian Bargain With Donald Trump

“Gog and Magog are at work in the Middle East. The biblical prophecies are being fulfilled. This confrontation is willed by God who wants this conflict to erase his people’s enemies before a new age begins.”

According to former French President Jacques Chirac, these are the words former U.S. President George W. Bush said to him sometime prior to the colossal blunder that was (and is) the 2003 invasion of Iraq.  I should note that Bush himself has never confirmed nor denied saying these words.  But regardless of whether Bush’s words are being quoted verbatim, or are a paraphrasing on Chirac’s part, my reaction to quotes like this is the same as it is to all babble coming from the political mouthpieces of the Christian Right in this country:  “What the fuck is he even talking about?”

It’s the same reaction I always have when this same sect of self-appointed moral crusaders will in one breath espouse their belief in the sanctity of life, and in the next breath oppose legislation that would give people access to life-saving healthcare.  Or when they pontificate about the importance of upholding family values (read: their values), while working tirelessly to deprive families of any assistance that would actually help them feed and clothe their loved ones.  As far as I’m concerned, the only proper reaction to this sort of schizoid babbling is, “What the fuck are they even talking about?” as trying to humor these disjointed thought processes would be a disservice to the process of thought itself.

Given all this, one might believe that the way in which the Christian Right pledged their unwavering support for a man like Donald Trump is yet another example warranting a snide, rhetorical remark disguised as a question.  I disagree.  The reason I disagree is that, when it comes to Trump, I know exactly what the Christian Right is talking about.

Undeniably, President Donald J. Trump is a narcissistic, petty, mean-spirited, disgusting shell passing for a human being.  He is greedy, selfish, self-serving, self-aggrandizing, and incapable of holding the simplest of conversations without spouting an inarticulate string of lies that both mocks and puts to shame the very language he has such a painfully low grasp of.  He shows no sense of loyalty towards anyone or anything, let alone the basics of human decency when it comes to how he treats those he views as his adversaries (and, at times, even his supposed allies).  He has no qualms about breaking campaign promises, and then berating anyone who points out his inconsistencies to him as the dishonest party in the discussion.  On top of all this, Donald Trump is also the darling of the Christian Right, who praise his name and talk of him as if he truly is the Second Coming Christ promised (and, some would say, failed to deliver) to the followers of his generation nearly two millennia ago.  And when I hear them talk like this about Trump, I know exactly what they are talking about.

It’s not about the flaws of Trump’s character, either as a person or as a head of state.  Any and every fault can be dismissed under the nauseating cop-out, “Is all mankind not fallen and flawed?  Are we not all sinners?”  When faced with such a bald-faced heap of meaningless platitudes, one is apt to point out the fact that few of us–and no decent person in general–would ever walk up to unsuspecting women and “grab them by the pussy,” like President Trump has bragged about doing.  That is definitely one sin I can attest to having never committed, and, yes, I feel quite justified in saying that it morally places me on better footing than those who have.  But even mentioning that to this crowd is pointless, because ultimately it doesn’t matter to them if Donald Trump is a chauvinistic, perverted scoundrel.  The only thing they care about–the only thing they have ever cared about–is shaping the legal arm of the nation in accordance with their will, and imposing their set of hypocritical edicts on everyone else, whether they like it or not.

In layman’s terms:  The Christian Right is backing Trump because he will appoint the judges who will align with their views of how laws ought to be interpreted in this country.  He will give executive backing to legislation that will reshape this nation into what they always wanted it to have been from the start–a fundamentalist, conservative hallmark of Christendom, rife with the great tradition of hypocrisy and intolerance entailed by it.

In this context, the fact that they are undercutting their own sanctimonious virtues by throwing in their lot with a person as un-Christlike as Donald Trump is irrelevant.  The fact that their current actions are causing younger generations to walk away from their congregations is a moot point.  It ultimately does not matter what convictions anyone individually holds; as long as all are still forced to abide by the laws and legal precedents implemented by the Christian Right, victory has been ensured for generations to come, because once a matter becomes the judicial status quo (regardless of how draconian or unpopular) it becomes that much harder to overturn, socially and politically.  Rather than flailing in the wind towards irrelevance, this sect is playing what they believe to be the long game in the culture war to reshape American society.

And for once, I know exactly what the fuck they are talking about when they spout their babble, and there is nothing meek or humble about it, in either a Christian or secular sense of the words.  If the other side of the political aisle wishes to have a fighting chance against such blatant subversion of the democratic process, the pushback has to be equally biting, with a succinct and unrelenting, “Like hell you will!”


The Reason Stories are Written in Past Tense

One of the first things any decent creative writing class will teach an aspiring author is the importance of maintaining consistency throughout the text, and it’s something I’ve definitely mentioned before on this blog.  Although this often refers to maintaining plot consistency, grammatical and functional consistencies are equally crucial to creating a legible narrative.

Anyone who reads fiction regularly will have noticed that the overwhelming majority of these stories are written in past tense; e.g. “It was the best of times…”, “She figured it was all over…”, “He loved her like no other, but also saw no way to show it…” etc.  But why is this?  What makes a past tense narrative more grammatically correct than a present or future tense one?  To answer that question, one needs to first dispel the phrasing of it.  There is nothing inherently more grammatically correct about using past tense, as opposed to any other tense, as long as the narrative voice remains consistent in its use throughout the story (or there is a damn good reason why it doesn’t need to be).  Hence, the reason past tense is seen as the default has less to do with grammar, and more to do with functionality.

It shouldn’t be forgotten that writers by definition are also readers, meaning that they carry with them decades’ worth of literary conditioning, just like the audience they are trying to reach.  Most of the books a writer has read will have been written in a past tense narrative, and like every other reader, it is understandable if this structure naturally seeps into one’s own writing.  Nor should one underestimate the sheer amount of concentration it takes to catch inconsistencies when attempting experimental works that run counter to the norm; the potential for inconsistent prose goes up substantially when trying to do something out of the ordinary.  Therefore, defaulting to the more common past tense narrative is an easy way to ensure consistency throughout one’s plot, since it will feel the most natural for writers and readers alike.

Alternatively, rarely do you see whole plot narratives written in future tense; e.g.  “I will go see her tomorrow, after which we’ll talk…”, “They are going to take care of it later…” etc.  This sort of writing is reserved more for character dialogue, as it is more in line with casual conversations (not to mention people’s internal dialogues) wherein the discourse centers on planned actions (i.e. things yet to be done, spoken about by characters whose overall knowledge of events is limited).  In contrast, narrator voices—whether they are written in first person, or third person; whether they are limited, or omniscient—are instinctively read by the audience from a bird’s eye view perspective, detailing the happenings to them as an observer of events.  It wouldn’t be impossible to write a whole narrative in the future tense, but the risk you run is frustrating your readers because, in many ways, such phrasing stands so deeply in contrast with how most of us are attuned to differentiating between plot narrative and character dialogue that it may have the unfortunate effect of making the story too confusing and tiresome for most to bother following to the end.  And while challenging readers through provocative prose can be laudable, giving them a headache through cumbersome verb usage is anything but useful.

Lastly, there is present tense; e.g.  “She creates the world as she sees it…”, “He says what he thinks, and he thinks what he knows…”  It’s a very impactful form of narrative, which immediately frames the plot in an action mode—things are happening, and they are happening right freaking now!  It’s unique, and in the hands of a skilled writer, has the potential to serve as a creative alternative to its more common past tense counterpart.  On the other hand, in unseasoned hands, it also has the potential to wear out the reader; think sensory overload brought about by too much intensity.  There is a reason most stories follow the general setup of: introduction -> rising action -> climax -> falling action -> conclusion/resolution.  If the whole story is written in a narrative that denotes action throughout these distinct steps, then the writer will have to work doubly hard to make the impact of the climax (and the rising action that leads up to it) stand out to the reader.  I’m not saying that it’s an impossible task to accomplish, but it is harder, and takes considerable talent to get right.

I outlined why the prevalence of past tense narratives in fiction isn’t really an issue of grammar, but an issue of ease of writing and what readers are simply accustomed to.  In an obvious way, the situation is very much a Catch-22:  Readers are used to reading past tense narratives because most authors write in past tense narratives; authors write in past tense narratives because most readers are used to reading them.  And a prevailing orthodoxy is thereby sustained.  Now, I will never say not to attempt a heterodox approach that deviates from the norm, on the grounds that one never knows for certain what works until it’s tried (every new situation carries with it the prospect for new discovery, and all that).  I simply want to make the point that no reader expects you to re-invent the written word to be seen as a great storyteller, and it’s perfectly fine to stick with what has been tried-and-tested to work, and what will make it easier for you to write your story, rather than fret over structural details when you really don’t have to.

Dispatches from Gulfton

The first grocery store I saw when I moved to the United States was a meager-looking spectacle called Sellers Bros. in a rundown strip-mall area of southwest Houston, TX.  The store’s shelves were as overcrowded with bargain, generic-name products as its aisles were with patrons shuffling from one end of the building to the next, holding tightly to the Lone Star Cards needed to feed their families for the month.  The building’s somber-looking outer structure held a passing resemblance to the apartment complexes that surrounded it only a few paces away—one of which my family was living in at the time, serving as our first exposure to the realities of the inner-city American life we had immigrated to, and were gradually assimilating into.

The majority of the neighborhood was composed of immigrant families.  Though my family originated east of the Atlantic Ocean, it was impossible not to notice that most of my neighbors hailed from south of the Rio Grande.  As a result, while I had come to this country with the advantage of being able to speak English reasonably well—well enough to understand, and be understood by, the general Anglophone population anyway—this advantage proved of little value on the very street I called home during those years of my adolescence.  It was an early education in a fact many living in urban America are readily familiar with.  Namely, that within the reality of American life reside smaller sects of conflicting realities, many of which can neither communicate with nor understand one another, and are set up so that they will rarely meet.  Gulfton Street in Houston, Texas, occupies one such reality.

Tucked away between two major highways in southwest Houston, spanning a stretch of 3 to 4 miles of cracked concrete landscape, sits the street of Gulfton—the epicenter of the “Gulfton Ghetto,” as it’s occasionally called by the local media and by other Houstonians (though never by the neighborhood’s own inhabitants).  To those who take a wrong turn off Bellaire and find themselves driving down Gulfton Street by accident, the insulting nickname will seem most warranted.

The immediate sights one is met with are panel after panel of gang graffiti, row upon row of low-rent apartment complexes, and concrete sidewalks that have been in desperate need of repair for a good few decades now.  Surprisingly, there is a park/recreational center meant to give some relief to the area’s ongoing problem with juvenile delinquency, though anyone who has ever set foot in the park itself will quickly be robbed of any hope for the endeavor’s prospects.  In short, like many neighborhoods in urban America, Gulfton is a place that has been largely abandoned to the ravages of metropolitan entropy.

Underfunded and halfway fleshed-out improvement projects that have failed to live up to expectations are pointed to by the rest of the city as reasons not to bother with any future attempts at repairing the crumbling infrastructure, leaving the residents who have given up on the idea of moving away to either wall themselves off from the unsavory conditions that surround them within their private residences (however meager they may be), or embrace those conditions by becoming a part of their destructive nature.

The first instinct any well-meaning person will have when confronted with a reality like Gulfton is to ask, “Can anything be done to fix this?”  It’s an honest question, but it betrays a lot about the person asking it.  The idea that there is any one thing that can resolve problems decades in the making is a part of the problem to begin with.  These sorts of problems have no one facet of origin, but are a delicate, interwoven mess of social, economic, and political barriers erected and maintained through complex systems with interests that themselves compete against and prop up each other in a multitude of ways.  The problems of Gulfton, like the problems of similar neighborhoods and populations throughout this country, have no single cause; hence they can have no single solution to curb the path they are currently on.

“Why don’t the people living there work to fix things?  It’s their neighborhood, after all.  Don’t they care?”

Unfortunately, the reality of all urban areas is that they are landlocked and dependent on the larger metropolis that surrounds them.  They don’t get to make decisions in a vacuum, and resources are finite and sparse in terms of what will be readily allocated to benefit them.  A further issue is that once a neighborhood has fallen far enough to be regarded as “hopeless” by the officials and administrators who could possibly make a difference, the very hopelessness of said neighborhood is used as the reason against committing long-term funds to improve its conditions, on the basis that it would be unfair to use tax dollars from well-behaved citizens in more savory parts of the city to fund the activities of no-good thugs and gangsters in these low-income, high-crime areas.  Local agencies will say they are not equipped to handle the expenses needed to undertake the sort of social projects necessary to overhaul the issues plaguing these areas, while Federal agencies see these issues as strictly a local concern.

In the absence of a robust social safety net provided by city or state authorities to ensure the most basic of securities and public amenities, opportunistic forces will band together to construct their own safety nets.  For many young people, this takes the form of gangs that prey on social instability, offering their quasi-organized crime structure as an alternative to festering in a decrepit social system.  The reason youths are most susceptible to this is that they are the most in need of some kind of functioning social order to orient their lives (and relieve their boredom), and even the violent and dangerous structure of gang life is to many preferable to the instability of no visible structure at all.

Some people have a natural aversion to hearing that any issue constitutes a systemic problem, requiring a systemic approach to resolve.  To them, the very notion of entertaining such a thought is little more than an attempt to skirt responsibility away from individuals and let them avoid the consequences of their actions and/or apathy, leaving them no incentive to make things better of their own accord.  I can understand the sentiment behind this aversion, though I find it largely misinformed.

In a place like Gulfton, how exactly do you expect the individuals living there to step up and fix the various problems that plague their environment?  Should they pool their meager earnings together to pay for the ongoing structural damage to their concrete sidewalks and street signs, despite the fact that we’re talking about city property, and as a result an issue needing to be addressed by the local government?  How about the need to improve the resources available to the local schools, so that there can be robust after-school programs and activities for young people to occupy their time with, discouraging delinquency and gang activity?  Should the low-income parents of these youths fund these programs directly, thereby taking away money that’s needed to pay for rent, utilities, food, clothing, etc.?  Would that be an example of individuals stepping up to take personal responsibility to improve the conditions around them, or a neglect of one’s obligation to provide basic necessities for one’s own family first?  If donating money is not the answer, surely we can get everyone to at least volunteer their time to improve their community, no?  It’s not as if the sort of people who have to live in these neighborhoods are stuck working jobs with little to no flexible hours or time off, after all.

Perhaps the answer is that all these folks ought to work harder to increase their earnings, so they aren’t hostage to their economic conditions.  Yet, if they actually managed to do just that, what incentive would they have to spend their extra earnings on repairing a place like Gulfton, as opposed to–oh, I don’t know–simply moving away to a better part of town that already offers all the basics of dignified living conditions?

Unless you are Bruce Wayne, sitting on an endless supply of inherited wealth, resources, and leisure time, individuals donating money and/or time will never be a solution to the problems that affect neighborhoods like Gulfton.  These are problems that took a long time to manifest, and they require long-term investment and planning to be resolved.  It requires layers upon layers of overarching organizational resources to properly oversee and track improvements, which no single individual or clustered group is capable of providing.  Private businesses, local or otherwise, also offer little help in the matter, since there is no business incentive in investing in a place simply to improve the lives and environment of its residents–residents who will not be able to return the gesture because, at the end of the day, they’ll still be too poor to ever turn a profit for these businesses.

And it takes an astounding level of naivete not to realize this.  The same sort of naivete that leads certain people to make inane points like, “If you like public programs, and think taxes should be higher to pay for them, why don’t you just volunteer more of your money on an individual basis, instead of demanding everyone else do it through the tax code?”  Because individual actions and donations will not solve systemic problems like the ones affecting neighborhoods like Gulfton, that’s why.  Because many of the problems plaguing inner-city life are far too complex, and connected to far too many surrounding factors, to be seriously brushed off with red herrings concerning individual responsibility.

Areas like Gulfton are the way they are because they have become culturally and economically alienated from the rest of their metropolitan centers, and the rest of the country at large, and little is being done to incorporate them into the greater society that surrounds them.  The full reasons for this alienation are legion, and the solutions that will be necessary will by definition be just as extensive, which is a reality that must be acknowledged by those who purport to take the issues of working, urban, and immigrant communities seriously.

If, on the other hand, you simply don’t care about places like Gulfton, then just say you don’t care, and stand by the convictions of your apathy.  And stop pretending that there is a greater moral or ideological basis to what is essentially pure disinterest in the plight of people you can’t be bothered to give a shit about.  It will make for a much more honest conversation.

Wit Contra Sarcasm (Contra Douchebaggery)

Wit is hard to get.  Just as you think you get wit, they’ll come around and change what wit is.  Suddenly, what you thought was wit is no longer it, and what is wit will sound to you like a pile of shit!

Fortunately, wit has an easier-to-attain co-traveler in the world of rhetoric named sarcasm, which is much, much easier to pull off.  Much like pineapple on one’s pizza, people either love sarcasm or they don’t.  And those who love it really freaking love it.  I find this to be especially true of women during the initial courting process, because the women who appreciate good sarcastic banter will respond very favorably to any guy able to keep up with their own sarcastic quips, while the women who are turned off by sarcastic jokes will very quickly show you how unamused they are by your highbrow wit-lite ramblings.

Let me say from the outset that I’m not bashing sarcasm here—sarcasm is great people in my book (I can attest that some of my best friends are practically verbally drenched in nothing but sarcasm…also desperation and self-loathing, but sarcasm is a large ingredient in their person-stew, too).  My main problem is that a lot of people seem to think that simply saying something in a sarcastic tone ought to be treated on par with making a witty comment, seemingly unaware that it is not the sarcasm that makes a comment witty; it’s how clever and salient said comment is to the situation it speaks on.

I’m sure we all know at least one person who has unwittingly fallen into this trap, but for a notoriously bad offender look no further than Dennis Miller’s stand-up routines in the 90s, where in addition to pointlessly disjointed similes, a la “Man, this whole impeachment issue is becoming a stickier mess for Bill Clinton than Rutherford B. Hayes’ sauna sessions, daddio!  Amirite folks? Har har har” [note: not a real Dennis Miller quote, but can you honestly tell the difference?], he often relied on simply saying something in a sarcastic tone to imply that a witty comment had been made, hoping it could carry the point home for him.  It hadn’t, and it couldn’t.  As is the case for all things sarcasm-sans-wit related (and all things Dennis Miller related, for that matter), it’s essentially where the desperate nugget of any relevant point goes to die.

On a related note, think about all the times you have been in a situation where you made a suggestion regarding a course of action, only to get a response of, “Oh yeah, that’ll work reeeeeeal great, I’m sure of it,” accompanied by an eye-roll, and a few air-quotes thrown in to truly carry the point home.  While we can all recognize this as being far from anything resembling wit, I would even hesitate to deem it worthy of being called mere sarcasm.  It’s much closer to what I would refer to as “Douchebag Cynicism”, which is academically defined as: any and every action or comment made to identify and amplify one’s irredeemable douchebaggery, poorly masquerading as cleverness.  It’s a noun.

Really, my only point in this whole rant of a post is that if you feel the urge to be sarcastic, put a bit more thought into it than just adding a mocking inflection to your voice—try to actually have something noteworthy and clever to contribute to the conversation.  Also, always strive not to be a douchebag cynic.  Though that last bit is wisdom that can probably apply to most areas of one’s life.

Character Backgrounds: The Dilemma of Sharing Too Little, or Too Much

When writing a story, there exists a natural disconnect between how the author interprets the plot, and how the audience reads it.  The obvious reason being that the author has the (mis)fortune of knowing the intended background details of the events and characters before they ever make their way onto the page, in ways that are not readily available to the reader.  The task for any decent writer is to convey these details in a way that makes for a compelling narrative that will be neither overbearing for the reader, nor leave them stranded in the dark regarding important plot/character developments.

Spotting moments when an author is being too reserved with details is fairly easy.  Anytime you’ve come across a part of a story or book that left you wondering, “Wait, who is this, and why are they suddenly in the middle of everything?  Where the hell did they come from?” you were essentially exposed to underdeveloped writing.  Be sure not to misunderstand what I’m saying, though.  Introducing new characters, and strategically withholding information about them, can be an effective writing technique to invigorate interest in the plot, as a little mystery can go a long way in building much needed suspense in an otherwise stale story.

As an example, imagine a love story between two characters named Tom and Jill.  For over a hundred pages, you follow along as Tom sees Jill, falls in love with her, and tries desperately to impress her.  Jill is originally aloof regarding Tom’s advances, but slowly she starts to feel flattered by his affection for her, and agrees to give him a chance.  Things are going great for the two love birds for several more pages, then—just as the plot can’t bear the weight of any more Hallmark-moment clichés—a sudden wrench is thrown into the mix:

Nothing could tear Tom’s gaze away from Jill’s eyes.  The shape of them, their softness as she smiled, even the wrinkles that formed at the corners of her eyelids as she laughed, all worked to keep him in a hypnotic trance from which he could not—would not—escape.  Or so he thought.  Because the moment Susan Gallaghan walked by them, he felt his eyes wander from his beloved Jill’s enchanting eyes, to the rhythmic steps that paced along in front of him.

Let’s assume this is the first time this Susan character is ever mentioned in the plot.  The first thoughts any reader is going to have will be along the lines of:  “Who the hell is this Susan person?”, “Is she someone new to Tom?”, “Is she an old flame?”, “Is she a girl from his youth that he secretly pined after?”, “Is Tom actually a serial killer, and Susan his next victim?”  At this point, we, the audience, have no clue.  The fact that we have no clue is what makes it a brilliant writer’s trick, because now you are invested in the dilemma and subsequent resolution that is sure to follow.

But what if the drama never really follows the way you expect it to?  While the sudden introduction of this new character works to spark the reader’s interest in the development of the story, it can only carry the audience’s engagement so far.  If Susan keeps popping up in the same way, with the same vague acknowledgment from the established characters, the reader’s interest will quickly turn to frustration, and ultimately to disinterest.  You have to give the audience a reason as to why the things happening on the page are worth mentioning to begin with, and in the case of character development, this means divulging at the very least some connection between secondary plot-device characters (like Susan above) and the main protagonists.

Divulging a character’s background effectively in a narrative is not as easy as it may sound.  A lot of the time it can come across as bloated, a poor attempt to force-feed too much information into the plot, just for the sake of having the reader know why this person exists in the story.

Imagine if the mysterious introduction of Susan above was followed up with:

Tom immediately recognized Susan as his high school sweetheart, to whom he had lost his virginity on prom night.  The two of them went their separate ways soon after graduation, but Tom never quite got over his love for Susan.  Susan, for her part, had little trouble moving on from Tom.  So much so, that she moved away to study and travel abroad.  As she traveled the world, she gained an appreciation for herself, and how she didn’t need to define her identity by any one person that happened to be in her life.  Unlike Tom, Susan wasn’t validated by whether someone loved her; she felt complete knowing that she loved herself.  Even now as she walked past him with all the confidence of a young woman who intended to live her life to the fullest, Tom’s heart throbbed once again for the one that got away.  Though Susan didn’t recognize Tom, the two of them would be seeing a lot more of each other from here on out, since she was set to begin a new position in the very firm Tom worked at.

The problem here isn’t that this information is being revealed within the plot; it’s that there is no reason to have it laid out all at once, let alone right after the mysteriousness regarding Susan’s presence was so brilliantly executed.  All of this can be revealed over the course of several pages, if not several chapters.  Again, by all means give the necessary background to establish a character, but there is no need to lump it all together in one spot, because then your narrative will inevitably end up repeating itself again and again, every single time the information needs to be revisited.  Eventually, Tom and Susan will have a confrontation, where hints can be dropped regarding their past intimacy.  Rather than state that Susan is a confident and independent person, why not show it by the way she behaves and interacts with her surroundings and the other characters?  Pretty much everything stated in that one paragraph can be dispersed throughout the story piecemeal, without killing the suspense by revealing it all in one big swoop (especially right after the mystery character is introduced).

For a real literary example of an author doing a superb job of balancing the enigma of his characters with subtle background revelations throughout the plot, I would point to the characters of Mr. Croup and Mr. Vandemar in Neil Gaiman’s Neverwhere.  Even before the book’s otherworldly narrative is revealed, these two characters’ peculiar manners of dress and speech foreshadow a fantastical nature to their persons (and, by extension, the plot itself).  All of which is subtly explored in what essentially amounts to breadcrumbs’ worth of information over the course of a 300+ page story.  And at the end of it all, the mystery behind who/what Mr. Croup and Mr. Vandemar really are is never fully revealed, precisely because there is no reason for the story to do so.

Ultimately, it’s up to every writer to decide how much background exposition is too much for her/his characters, and how much is just enough not to stifle character and plot development.  That happy balance will largely depend on the sort of story you are trying to tell, and it may take several revisions to get it within the range you are aiming for.  But, while it’s not always straightforward in either case, being able to spot the problem in other written works means you are more than capable of applying that critical eye to your own.  Like a lot of writing advice, it simply starts with reading your writing not as an author, but as a reader, first and foremost.

Mindlessly Mindful: How Meditation Stifled my Creativity

Over the course of the last few years, the practice of mindfulness meditation has sparked a great deal of interest in private and public discourse.  For many, this discourse takes on the form of a full-scale spiritual reawakening in their lives–the rationale of looking back to what some would call time-tested wisdom as a guide for navigating modern life.  Still others, of a more pragmatic mindset, adopt meditation into their daily routine less to reach an esoteric sense of enlightenment, and more to find a means of focus for the cluttered thoughts they feel are clogging up their minds.

My own interest in mindfulness meditation began sometime in late-2016, and stemmed from a general curiosity regarding the positive results being attested to by its practitioners–ranging across all sorts of different personalities, including (but not limited to) self-appointed gurus, public intellectuals, corporate bosses, average laborers, and everyone in between.  What piqued my curiosity most was how the underlying message from this diverse group of people was a resounding agreement that: “Yes, indeed, meditation works!”  The full definition of how it “works!” and what it means for it to “work!” often varies as much as the individual backgrounds of meditation practitioners; however, there are some very clear commonalities among all the positive testimonials.

A greater sense of focus is one recurring benefit attested to by mindfulness meditators.  Specifically, a greater awareness and appreciation of the details encompassing the moment one happens to be currently occupying, as well as the multitude of thoughts that accompany it.  Another common theme among meditation circles is how it leads one to confront the (supposedly false) preconceptions surrounding the fundamental concept of the Self, and the illusory nature by which we think of our Self in relation to both our internal dialogue and the external world our Self interacts with (whether it is even coherent to think of the Self as an independent agent relating to the world, rather than another component in an endless string of interacting effects that make up existence).

I spent weeks researching the practice and philosophy of mindfulness meditation to get a better understanding of it, until finally, on January 1st, 2017, I decided to put theory into practice and devote a significant portion of my free time to gaining some firsthand experience of what it truly means to incorporate meditation into my daily life.  Recently, on January 1st, 2019, this personal experiment of mine came to a full stop.

When I first set out on this personal journey I expected the possible results to go one of two ways:  1.  A net positive, wherein I would enjoy the benefits of reaching some semblance of self-awareness, self-discovery, and hopefully even personal growth (like so many others testified to having experienced through meditation).  2.  A net neutral, the results of which would be no more dire than having wasted some portion of my time on a fruitless exercise that offered no real benefits, but ultimately no harm.

Having now gone through it, I can’t say what I experienced was neutral, since the practice definitely affected me on more than one level.  Unfortunately, from my perspective, the effects I felt leaned more towards a net negative as a whole; so much so that I decided to give up meditating completely, as something that may simply not be a suitable practice for someone like me.

Once I ceased meditating, a subsequent curiosity came over me: I wanted to find out if there were others who have had a similar (negative) experience to my own while practicing mindfulness meditation, but surprisingly enough the answer to that question seems to be a resounding, “No.”

I came across a few blog posts here and there of people saying they weren’t completely satisfied with what mindfulness meditation offered, or that it wasn’t what they expected, but they were still overall happy to have had the experience (even if they decided it wasn’t the right fit for them).  I also finally took the time to research the medical and psychological data regarding the long-term benefits of meditation (or, more aptly, the lack thereof), which I had intentionally avoided while engaging in the practice, so as not to be prematurely biased against it.  Yet, other than a general confirmation that little to no empirical evidence exists to validate its self-proclaimed benefits–possibly making meditation more comparable to a placebo effect than genuine self-awareness–I still didn’t come across reports that confirmed anything close to my personal (negative) experience.

I’m not going to go into deep detail regarding the exact nature of the mindfulness regimen I followed during this two-year period; partly because I’d rather be guilty of leaving details ambiguous than have every meditating Tom, Dick, and Mary who fancies her/himself a guru lecture me about how “real” meditation ought to be done.  If that is the sort of objection coming to mind as you read this, I am unfortunately failing to get the crux of my point across.

It’s not that I meditated and got no results from it, or that my results were drastically different from what I’ve read, heard, and observed others state about their own experiences while meditating.  In fact, my experiences were more or less in line with what the typical person claims to go through while practicing mindfulness exercises.  My problem with meditation–and mindfulness meditation, specifically–is what I view to be the negative impact it had on my creative wherewithal.

What exactly do I mean by this?  Allow me to explain.

A heightened awareness of the current moment is one of the major benefits promoted in favor of meditation.  While I can see how meditating might help those who have a habit of wearing their emotions on their sleeves–or maybe those who suffer from impulsive decision-making in general–I’m someone who came into meditation already relatively calm and collected, possessing a decent set of stress management skills to begin with.  Furthermore, I’m someone who relies on being able to construct imaginary plots, involving imaginary people, and projecting them into contrived scenarios that could resolve themselves any number of ways I see fit to write.  Now, seeing that creative writing is generally penned in the past tense, about things that have yet to be imagined, involving situations which do not exist, I never expected mindfulness meditation to offer much in the way of benefits to this part of my life.  But I also wasn’t prepared for how downright harmful it could be to it, either.

Prior to incorporating meditation into my daily routine, the feeling that gave me satisfaction at the end of the day was having sat at my desk, passionately typing away at my laptop’s keyboard long enough to lose my sense of self in the world I was creating.  And, slowly but surely, I felt this passion begin to erode the more progress I made with my meditative practice.  (Then subsequently return when I stopped meditating altogether.)

Sure, I got better at focusing on my breathing, as well as the various physical sensations that made up my moment-to-moment experience, which in turn made me more aware of not just my thoughts, but the process by which these thoughts seemed to spontaneously manifest into my conscious monologue.  But all of this came at a cost.  Being more aware of my thoughts–moreover, being conscious of the act of thinking–made it harder to lose myself within those thoughts when I needed to weave together thoughtful writing.

And it wasn’t just writing.  Other creative outlets like painting became harder, too, because a large part of my painting process revolves around being able to foresee and focus on what shapes and images can be created (rather than what is present in the moment), and what method/color scheme will illustrate them best.  Being aware of the moment, and of the act of what I’m doing (in this case, sitting in a chair while painting), offered no benefit to the act itself, and ironically often served to distract from letting my thoughts roam towards conjuring up the inspiration needed to complete the project.

Yes, inspiration.  That is the key ingredient that I felt slipping the deeper I delved into meditation.  Ironically, as a result I found myself feeling more frustrated and stressed as a person when I sat down to do my work; traits I largely did not possess (at least not to the level I developed) going into meditation.

Like a lot of bad side effects, it took time for the signs to come to the surface, at which point meditation had already become part of my daily routine (and, really, routines can be so hard to break once they’ve cemented into our daily lives).  So I carried forward through all of 2017, and the first half of 2018, somewhat oblivious to the source of my depleting creative spark.  Then, last summer, I wrote a post on this blog titled The Pitfalls of Self-Help, after which I started to consider the possibility that all the positive testimonials I had heard in praise of mindfulness (which got me interested in it) were just as vacuous as the testimonials of people following any other self-help/self-awareness fad.

I started to seek out other mindfulness practitioners to see what insights they had to share, and was largely met with not-fully-thought-through regurgitations from self-proclaimed meditation gurus, whose wisdom sounded more like buzzwordy slogans from the reject bin of yesterday’s fortune cookie stash.

One particular conversation proved most enlightening.  The gist of it went something like:

Meditator:  “How you perceive of the Self is an illusion.”

Me:  “I perceive of my Self as a collection of atoms that make up the matter that is me; occupying a specific space in time that only I occupy.  In what sense is this an illusion?”

Meditator: “That’s not how people define the Self.  When people talk about a Self, they speak of it in terms of a separate entity that’s observing their doings, instead of being a part of it.  That’s an illusion.”

Me:  “But I just told you that doesn’t apply to how I, personally, conceive of the Self, as it pertains to me or anyone else.”

Meditator:  “It does.  You’re trying to intellectually rationalize your perception.  In reality, you’re just not being honest with how you really perceive your Self, in everyday practice.”

I’m fine with accepting that I have blind spots regarding my own conscious and subconscious awareness.  What I take issue with is being told I have to accept the idea that someone else–with absolutely no firsthand access to my thoughts or perceptions–has figured out where all these blind spots are, how they pertain to my experiences, and how it all conveniently fits into her/his own preconceived generalizations and worldview.  In other words, feel free to tell me that I’m wrong in my opinion, but don’t condescendingly tell me you know what I’m really thinking, in order to make me and my thoughts conform to your philosophy.  That’s not awareness; that’s just bullshit.  And I hate to say it, but a lot of meditation seems to run very close to this level of discourse.

In the last half of 2018, as I drifted more and more away from seeing any value for keeping meditation in my life, I was given two further explanations by meditation practitioners for my lack of positive results:  1.  I’m not spiritual enough, and 2. I’m too straight-edge.

I’ll freely grant the truth of the first explanation as a strong possibility.  Even with the most elastic definition of the word “spiritual,” I can honestly say that it does not, and cannot, apply to me.  While I know there are efforts made to promote a secular form of spirituality, I still feel the need to point out that I have never believed in the supernatural, nor the mystical, and the values and passions I have in life I do not equate or think of in any deeper “spiritual” terms.  The things that give my life meaning and joy, are simply the things that give my life meaning and joy, and I see no reason why I need to lump on belabored spiritual terminologies that do little to further elucidate what is innately a tautological experience for everybody.  Apparently, this type of thinking doesn’t sit well with the sort of people who claim to get concrete benefits out of meditation.  In such circles, simply saying you appreciate any aspect of life, and your roles and perceptions in it, is an affirmation of your spirituality.  Which is fine, but to me that just redefines spiritual so broadly that it becomes meaningless as a term.  I’m not invested enough in the semantics behind it all to debate the issue, but it’s safe to say that I don’t personally consider myself to be a spiritual person (regardless of whether others want to see me as such).

As to the second point, concerning my lifestyle choices; on more than one occasion, it was suggested to me that meditation can only be truly of benefit when performed under the influence of psychedelics.  I have no way of knowing if this is true or not, as I do not partake in recreational drug use (though I support anyone else’s right to do so).  But I have to ask, how do you know that what you perceive to be a greater self-awareness while high on psychedelics isn’t just a drug-induced delusion that has no bearing on reality as it actually is?  If being on drugs, and then meditating, is the key to opening the door to a greater truth about life, how come no one has ever emerged from these drug-fueled meditative states with any tangible, verifiable realizations about the world?

How come in all the centuries of taking mushrooms and meditating in caves, none of these yogis and gurus came out of the experience with something like “E=mc^2”, or the formula for penicillin, or even something as basic as “hey guys, guess what, the world is actually round” (in fact, there is a growing following of people online, at least some of whom I imagine are very prone to getting baked, that argue in favor of a flat-earth).  It’s always some esoteric and vague platitude, like “the Self is an illusion” (as long as both “Self” and “illusion” are defined in very particular terms) or “states of happiness and suffering both depend on consciousness to be realized” (no shit, you’re telling me people who are brain dead can’t feel happy or sad?–Brilliant!).  So, I must ask, what exactly is the point of a greater awareness, if said awareness has nothing tangible to say about the most fundamental, verifiable facts regarding the reality we inhabit?

And, look, perhaps there are those for whom such musings and conversations are of great value, and whose personal experiences have been greatly enriched by them.  If meditation has brought these people happiness, and positively impacted their personal growth as individuals, I would never argue to take it away from them on the basis that it wasn’t my cup of tea.  One underlying message here is that we’re all different, and what works for you may not work for me.

The other reason for writing this post is to speak to anyone who may have had a similar experience with meditation to my own, and also struggled to find others voicing said experience.  Although I didn’t find much in the way of negative testimony regarding mindfulness meditation, I have a hard time believing that there isn’t someone–at least one person–in the world who, like myself, has tried this out and found it to have been more of a hindrance in her/his life than a benefit.  To this person(s) I’d like to say: there’s no point in struggling forward on a futile quest, and there’s no shame in walking away from something that is doing you no good.  There are many different ways to experience life and achieve personal fulfillment, and just because something is presented as a cure-all for what ails you doesn’t mean that there aren’t better alternatives out there more suitable for you.

And if you think everything I’ve written is unwarranted drivel, let me know, and I’ll be sure to meditate on your concerns posthaste.

Understanding Perspective in Writing

Writers can get so bogged down in what one could call the nuts and bolts of narrating a story–plot, setting, character development, etc.–that it becomes easy to overlook that narration itself is the very underpinning that defines the perspective by which a story is revealed to the reader.

Generally, most narratives are written from either a first-person or a third-person perspective.  Second-person exists, too, but is not often used as an exclusive character perspective because it’s hard to construct a long-form narrative with it (not impossible, but definitely hard).  As an example, in many blog posts [including this one] I’ll often utilize the rhetorical second-person “you” in reference to the hypothetical reader scrolling through the text, but when doing so I usually don’t take long to resort to the first-person “I” in order to keep the prose coherent.  By and large, if you are writing some kind of narrative, especially in the realm of fiction, you’ll probably be doing it in first-person, or third-person.

Regular readers of KR know that I hate all forms of jargon.  Philosophical, political, literary–all of them; if you’re someone who always feels the need to express yourself using pretentious ten-dollar words and terms in lieu of the more straightforward ones available, I will always assume that you are probably someone who doesn’t know what s/he is talking about.  With that in mind, if you are not 100% sure about all these terms, let’s simplify it by saying that if your story’s narrator speaks using “I,” “me,” and “we,” your story is written in first-person.  The strength of writing in first-person comes from the ease with which the reader gets to empathize with the narrator, and in turn, the narrative of the story being told.

“Tom went to the store, bought gum, and then shot himself with his revolver,” can be emotionally gripping, but not as emotionally gripping as, “I went to the store, bought gum, and then shot myself with my revolver,” because now you are not just being asked to read as a casual observer, but as the main character him/herself.  This is why first-person narratives are easier to immerse oneself in, as the prose has less of a descriptive barrier between narrator and reader, making it easier to become invested in the plot’s dilemmas and character arcs.

However, writing in first-person also has its drawbacks.  The perspective is by definition restricted to only one point-of-view.  Unless your character is some sort of clairvoyant deity, the narrative will be limited to whatever s/he sees and describes (and even if your character is an all-knowing god, written in first-person the story is still only told through one viewpoint, hence it’s still restricted).  Most stories have more than one character present; hence it’s not hard to see the issues that arise when you can only ever truly understand how one character is feeling, and have to rely on this one perspective to deduce the thoughts and intentions of all the other characters.

As an example, let’s say that the narrator character is in a conflict with side characters A and B.  What are character A and B’s thoughts on this conflict?  You don’t know.  You know what the narrator character thinks their thoughts might be, and that’s all.  This isn’t a problem, in and of itself.  It can be used to create a wonderful sense of tension and suspense.  But it also means that a writer has to keep in mind perspective consistency within the plot, so that it doesn’t violate the logic of the first-person perspective that’s been set up so far.  This means that if side characters A and B had a conversation somewhere far away from the narrator character, the narrative has to be worked so that the narrator character somehow gets wind of it if it’s going to be mentioned in the plot.  The narrator character can’t just mention it in mid-conversation, because we–as the readers who have had direct purview of the narrator’s perspective–know that that’s not knowledge that could have been available to her/him.  It breaks internal logic, and it’s rhetorically lazy.

Another glaring handicap of first-person narratives is that everything in the setting depends on the description given by the narrator character.  This means that if the narrator is presented as someone not particularly observant or articulate, it will seem weird to have her/him suddenly break into elaborately detailed descriptions of everything happening around her/him just so the reader can see what’s being looked at.  It can also be distracting, and work to undercut the immersion benefits of the first-person narrative mentioned earlier.

The ready alternative is to write in the third-person, and many writers’ workshops will tell you to do just that.  Third-person allows you to separate the narrator’s voice from the characters in your story.  This means that things like character actions and appearance, and setting descriptions, are not dependent on any one character’s observations.  They are instead voiced by an impartial, non-participatory “third person” giving all the details of the narrative’s happenings.  The obvious benefit of writing in the third-person is that it allows the writer to craft a multi-perspective plot that includes the inner thoughts of any character in the story, not just one narrator character.  Although a third-person narrative can have the effect of creating a buffer between the reader and a story’s protagonist, in contrast to how a first-person perspective can work to merge reader and character into one unified voice, it also gives the writer a greater sense of control in setting the details of the narrative, as well as a greater sense of freedom when it comes to how these details are to be dispensed to the reader.

The major setback to writing in a third-person perspective is the misstep of not understanding that the narrative comes in two very distinct forms, which for the sake of consistency should not be confused throughout the plot.

The first form is what is called third-person-limited.  The non-participatory narrator uses pronouns like “he,” “she,” and “they” (as opposed to the first-person “I,” “me,” and “we”), and will give descriptions from the aforementioned impartial point-of-view.  But, as the name implies, a third-person-limited perspective has its literary constraints.  Limited implies that while the narrative will give descriptive details to the reader independent of any one character’s subjective thoughts, its narrative scope is limited to the details of (usually) one main character, and the details shared will not step outside the purview of what is available to this main character.

If you’re thinking that this sounds a lot like a first-person perspective just with different pronoun usage, you are both right and wrong.  Similarities between the two are clearly present, but unlike a first-person perspective, third-person-limited does allow for the narrative to explore the inner thoughts and motivations of the secondary characters, because they are not being described through the main character’s subjective perspective.  The limitation is that the secondary characters have to be in some sort of interaction or connection with the main character.  Of course, it is also possible to avoid being tied down to one and only one character, by re-centering on a different main character throughout the different scenes that make up the plot.  One just has to be careful not to get confused about which character is currently occupying this role (i.e. if character A is the main character in Scene 1, and Scene 2 switches to character B as the focal point, the third-person-limited narrative in Scene 2 can’t suddenly start referencing details revealed in Scene 1, because its point of focus, character B, will as of this point be ignorant of said details–even simply stating in the narrative, “Character B is ignorant of this fact revealed to Character A,” is a violation of the internal logic of a third-person-limited perspective).

On the opposite side of all this stands the third-person-omniscient perspective.  For a writer, this perspective allows for the greatest amount of narrative freedom, in that you are not chained to the thoughts or whereabouts of any one character.  Think of the third-person-omniscient perspective as the god’s eye view to the third-person-limited’s bird’s eye view of storytelling.  Want to explore multiple characters’ thoughts and feelings, without needing to relate them back to any given main character’s role within the scene?  No problem.  Want to jump around between character perspectives, and reference back to the reader things that only they (as the audience) are aware of within the greater plot?  Your only limitation is your creativity here.  However (oh, come now, you knew it was coming), it is important to keep in mind that too much freedom within prose can also very easily tire out a reader.  Presenting multiple viewpoints can make it harder for readers to bond with any one character (let alone the intended main protagonists of the story), or to get invested in the dilemmas and outcomes that befall them.  In other words, too much information can create perspective fatigue, which is why even a narrative written from a third-person-omniscient perspective will often limit how frequently it utilizes its omniscience.

I’ve spent some time here going over some of the strengths and drawbacks of the different narrative perspectives available to writers, not in order to argue for using one form over the other, but simply to give an overview of why someone might wish to choose one over the other (depending on what sort of story is being written).  By far, the only real thing I am arguing for in this post is the importance of consistency in writing.  Meaning that whatever perspective you choose for your story’s narrative, you have to stick with it; otherwise you are setting yourself up for a grueling writing experience, and increasing the likelihood of the final draft being a frustrating mess to read (as much as it will be one to write).

It is perfectly fine to start out with one perspective, and then decide that the story is better served if written from a different perspective; but when faced with such a case, the correct action is to take the time to go back to the beginning and rewrite everything to match the now better-fitting narrative for the story.

Consistency. Consistency. Consistency.  That is the only true lesson here.