

Resume Writing 101: From Start to Finish

Writing a concise resume used to be a person’s first introduction to the competitive world of job hunting.  It didn’t matter whether the job being sought was entry-level or on the management track; knowing how to sell oneself via a 1-2 page formal summary of professional qualifications and achievements was the first (and, oftentimes, only) chance you would get at impressing a potential employer.  Despite the popularity of business-oriented social media sites like LinkedIn, a decent resume matters as much today as it did twenty years ago.

I can feel the collective eye-roll of most readers at this point, sighing in unimpressed unison, “Well, duh!”  If you are among this crowd, I assure you that I’m not trying to waste your time (or mine) by typing up a how-to on a matter that is common knowledge.  Given the volume of resumes and abstracts (if one can call them that) I go through on a regular basis, the glaring fact that stands out is how often the most basic resume-writing standards are ignored by job-seekers entering the workforce nowadays.  For their sake, and my own, I think it’s worthwhile to go over some of these basics, one point at a time.

  1. Use a simple word processor document.  There are many resume-writing programs and apps on the market now, but I have yet to come across one that’s worth its bandwidth when it comes to typing up a plain, to-the-point resume.  The simple word processor that most laptops and desktops already come equipped with is really all you need.
  2. Font and Style:  Times New Roman is the classic; Arial is acceptable, though slightly less business classy.  Anything else ought to be avoided.  Seeing as how your resume was typed on a computer, and is understood to be read as such, there is no need to use fonts that mimic handwritten or artsy text.  What makes resumes visually appealing to an employer is their legibility, not the amount of fancy swirls or loops you managed to imitate in your text fonts.  If anything, this could be seen as distracting and unprofessional.  Since you never know what quirky pet peeves a person might have, deciding to play it safe by sticking to plain script (i.e. Times New Roman, 12-point font) is just the smarter way to go.
  3. Write your name in bold at the top of the page, and center it.  I would advise that your name be the only thing written in bold on your whole resume, to make it pop from the rest of the text and thereby be more memorable to the person reviewing it.  It is also advisable that you write your name in a slightly larger font to further add to the effect (so if you’re using 12-pt. font for the body of your resume, go up to 14 or 16-pt. for your name, but nothing else).  While still centered on the page, write your phone number and email address beneath your name (no need to bold these; your name is the only part we’re trying to make pop on the page above anything else, remember?).  A lot of resume tips online will also say to include your address among your contact information, but I would have to disagree with this.  When employers are narrowing candidates down for callbacks, they start looking at the pettiest things to choose from among otherwise equally qualified candidates.  Hence, when they see that Candidate 1 lives 5 miles away, and Candidate 2 lives 15 miles away, they might consciously or subconsciously take this into consideration when making a final decision.  Best not to let it be factored into the equation at all; leave your exact location off the page rather than placing it front and center.  [This is a general bit of advice.  If you know that your address will not or cannot be a detrimental factor, and might even be an asset, by all means go for it, and list it.  I’m simply telling you what’s been helpful in my experience, having sat on both sides of the hiring table.]

Now, with the basics out of the way, let’s get into the actual meat of the matter.  All resumes need to detail the following four sections regarding your professional history:  1. Objective, 2. Qualifications/Skills, 3. Education, 4. Employment History (I’ll say a few words about how best to handle References towards the end of this post).

  1. Objective:  Your objective is your one-sentence pitch stating your career goal and why you’re seeking this position.  I say one sentence because you should really be able to explain your reason for wanting this job (and why it’s a perfect fit for you) as succinctly as possible, and usually when people start typing up two to three sentences’ worth about themselves, they are prone to letting irrelevant, rambling details seep into the text.  A lot of workplaces are fast-paced environments that value employees who don’t waste time, and demonstrating that you are someone who can communicate your intentions in one sentence, while others take four, goes a long way in your favor.  (And no, writing one long run-on sentence filled with commas and semicolons is not a convincing hack that will fool anybody; if anything it will just make you sound long-winded.)  As to what to actually say in your objective, it largely depends on what type of job hunt you are conducting.  If you are tailoring your resume to a very niche position, in a specific line of work, it’s better to speak directly to that.  If, however, you are job-hunting with more of a general idea of the sort of job you’d like to do, but know that you will be sending this same resume to a variety of different employers, a more versatile wording in your Objective might be appropriate.
    •  Acceptable example:  “Objective:  To obtain a competitive position in a field that will offer continued growth in proportion to my abilities and skills.”  Vague enough to apply to a variety of career fields, it displays a sense of ambition, but also pays lip service to the notion that this ambition will be of mutual benefit to the employer and the employee.
    • Unacceptable example:  “Objective:  To get a job that I will enjoy and with which I can forward my career in the long-term.”  Essentially says the same thing, but it’s far too casual for employers to read any further depth into beyond what’s stated, and, more importantly, it is entirely egocentric in its delivery, implying that this candidate is someone who will bail the moment they feel that things aren’t going their way at work (this may be true of most employees, and any competent employer will be aware of it, but subtlety and plausible deniability are also highly valued skills on the job market, even if your bosses know when you’re trying to use them.)
  2. Qualifications/Skills:  Right after your career objective, you should have a section listing your skill sets.  The best format is to list them in tidy bullet points, one after the other, with the most relevant at the top of the list (i.e. relevant as they pertain to the position you are applying for, so feel free to shuffle these bullet points around and personalize them to each position as you apply from one job to the next).  If you have any certifications or specialized training, this is the time to mention it.  If you know that the position you are applying for requires knowledge of a specific skill that you possess, write it out as plainly and obviously as possible (i.e. if the job will require you to work with spreadsheets all day, say “Proficient in all matters of Excel use, both on PC and Mac OS” instead of the more opaque “Proficient in Microsoft Office systems”–yes, the latter obviously includes Excel, but don’t overestimate the attention span of employers, or underestimate their need to have things explicitly spelled out to them at all times).  Towards the end of your list of skills it’s perfectly all right to mention something that, though not completely relevant, shows you to be an interesting and well-rounded person, but use a bit of common sense regarding what details to share.  Saying, “Extensive experience volunteering with youth groups to help foster a more positive community for at-risk students,” is a great humble brag, but saying “Leading figure in the online furry community, actively advocating inter-species acceptance and relations,” though potentially intriguing to discuss, is probably not appropriate to lay on a potential employer so early on.
  3. Education: State your education as plainly as possible, by which I mean:  Name of school, type of degree, area of study, and noteworthy honors or commendations.  Unless the position you’re applying for makes a point of mentioning an educational requirement, or your education reflects some unique or prestigious point, there is no reason to overwork this section beyond the basic points mentioned.

The above information should fit within 1 page of 12-pt typed font, or somewhere very close to it, leaving you free to type up the final section on a separate page.

4. Employment History:   As the section’s name implies, give a list of places where you worked.  Self-explanatory, really, but I’ll be painfully long-winded about it anyway [because I’m a pedantic son of a bitch, that’s why!].

    • The rule of thumb to follow is that if you have very little job experience, list whatever you can reasonably get away with passing off as “work experience”.  Have you ever done volunteer work?  List it, and give details of your responsibilities.  If you have done internships, student work-study, led meet-up groups, whatever…these are all experiences you could use to demonstrate your ability to be productive and efficient in an occupational environment, even if you weren’t technically being paid to do them.  And in terms of wanting to fluff your resume to supplement a lack of employment history, or fill in extensive gaps in your employment history, mentioning these specific activities looks much better than trying to camouflage it with vague concealers like “Worked freelance projects” or “Self-employed entrepreneur” (unless you’ve got a legit business card naming you the CEO of a registered company, please don’t ever use this designation for yourself–no one is impressed by it).  The point isn’t to lie, or to make things up in lieu of a robust work history to tout; it’s about showing that despite your lack of a standard 9-5 employment history, you are still a viable candidate who should be considered a serious contender for the job.
    • Now, some people will have the opposite problem, where they have way, way too many past jobs, volunteer work, extracurricular activities, etc., listed under their employment history to the point that they need several pages to fit it all.  If this sounds like your resume, you should definitely consider a rewrite.  If you’re applying for an administrative position, and you have several years of administrative experience, you don’t need to list that summer job at McDonald’s, or that year you spent as a delivery driver, or the side gig you’ve got going on entertaining children dressed as a clown (honestly, adult coulrophobia is so widespread these days that mentioning the last one might work against you full stop; no one wants to take the chance of getting murdered by a clown during business hours).  That’s not to say that you can’t or shouldn’t mention any unique work experiences that only tangentially relate to the job you are applying for, but if your employment history is already running well past 1 page of otherwise relevant work experience, just let said work experience do the heavy lifting at impressing your future employer, and wait to let all that quirky personal charm you’ve got shine through during the in-person interview.
    • Now, how do you best summarize a past or present job position on your resume?  Easy:  list the company, list your position, list the time worked there (month and year).  Beneath that, type up a bullet point summarizing all your duties and responsibilities.  Be as thorough as you can without resorting to word salads, or simply repeating what’s already listed under your Qualifications/Skills section–i.e. type in full sentences, as if you were relaying the information in person.  Depending on how much you did in the position, this may take one sentence or three, but try to keep it within 5 (you’re less likely to ramble into irrelevancies if you place this limitation on yourself).

Okay–I’ve teased it, so now let’s say a word about References.  Please, please, please, do not write these three words anywhere on your resume in regard to your references:  “Available upon request.”  If they are available, and you already have them on hand, and especially if you know this is a job that will request them from you, just have a separate page typed up and ready to show to the employer.  In fact, even if you aren’t directly asked for references, you can never go wrong by always attaching a list of references to the back of your resume.  Regardless of whether the employer will make use of the list or not, having it ready and available for them shows forethought and thoroughness, and leaves a very good impression in your favor as a professional and serious job-seeker.  I want to mention that your list need not include more than 3 up-to-date professional references (ideally past employers who are at least likely to still remember you by name); give their names, their positions, their relation to you, and their contact information (just a phone number will suffice in most cases).  That’s it as far as references go; real straightforward, no need to overthink this any more than anything else on your resume.

Keep in mind that nothing written here is the definitive word on resume writing, and I’m sure there are several caveats and exceptions I failed to mention simply for the sake of not wanting to take up more of your valuable time (or mine).  And although the above information is tailored to an old-school typed and printed resume format, it can just as easily apply to any other style of resume submission, and even serve as a rundown of how to organize the sections on one’s LinkedIn profile.  Though you should always, without question, have an actual typed resume on hand; if employers just wanted links to your social media they’d contact the third parties Facebook is selling all your personal data to for profit–Heyoo!  What?  Too soon?  Or have we all just moved on from that unethical bit of privacy invasion?

All right then, carry on, and good luck job hunting.


The Christian Right’s Faustian Bargain With Donald Trump

“Gog and Magog are at work in the Middle East. The biblical prophecies are being fulfilled. This confrontation is willed by God who wants this conflict to erase his people’s enemies before a new age begins.”

According to former French President Jacques Chirac, these are the words former U.S. President George W. Bush said to him sometime prior to what is now known as the colossal blunder that was/is the invasion of Iraq in 2003.  I should note that Bush himself has never confirmed nor denied saying these words.  But regardless of whether Bush’s words are actually being quoted verbatim, or are a paraphrase on Chirac’s part, my reaction to quotes like this is the same as it is to all babble coming from the political mouthpieces of the Christian Right in this country:  “What the fuck is he even talking about?”

It’s the same reaction I always have when this same sect of self-appointed moral crusaders will in one breath espouse their belief regarding the sanctity of life, and in another breath oppose legislation that would give people access to life-saving healthcare.  Or when they pontificate about the importance of upholding family values (read: their values), while working tirelessly to deprive families of any assistance that would actually help them feed and clothe their loved ones.  As far as I’m concerned the only proper reaction to this sort of schizoid babbling is, “What the fuck are they even talking about?” as trying to humor these disjointed thought processes would be a disservice to the process of thought itself.

Given all this, one might believe that the way in which the Christian Right pledged their unwavering support for a man like Donald Trump is yet another example warranting a snide, rhetorical remark disguised as a question.  I disagree.  The reason I disagree is that, when it comes to Trump, I know exactly what the Christian Right is talking about.

Undeniably, President Donald J. Trump is a narcissistic, petty, mean-spirited, disgusting shell passing for a human being.  He is greedy, selfish, self-serving, self-aggrandizing, and incapable of holding the simplest of conversations without spouting out an inarticulate string of lies that both mocks and puts to shame the very language he has such a painfully low grasp of.  He shows no sense of loyalty towards anyone or anything, let alone the basics of human decency when it comes to how he treats those he views as his adversaries (and, at times, even his supposed allies).  He has no qualms about breaking campaign promises, and then berating anyone who points out his inconsistencies to him as the dishonest party in the discussion.  On top of all these things, Donald Trump is also the darling of the Christian Right, who praise his name and talk of him as if he truly is the second coming Christ had promised (and, some would say, failed to deliver on) to the followers of his generation nearly two millennia ago.  And when I hear them talk like this about Trump, I know exactly what they are talking about.

It’s not about the flaws of Trump’s character, either as a person or as a head of state.  Any and every fault can be dismissed under the nauseating cop-out, “Is all mankind not fallen and flawed?  Are we not all sinners?”  When faced with such a boldfaced heap of meaningless platitudes, one is apt to point out the fact that few of us–and no decent person in general–would ever walk up to unsuspecting women and “grab them by the pussy,” like President Trump has bragged about doing.  That is definitely one sin I can attest to having never committed, and, yes, I feel quite justified in saying that it morally places me on better footing than those who have.  But even mentioning that to this crowd is pointless, because ultimately it doesn’t matter to them if Donald Trump is a chauvinistic, perverted scoundrel.  The only thing they care about–the only thing they have ever cared about–is shaping the legal arm of the nation in accordance with their will, and imposing their set of hypocritical edicts on everyone else, whether they like it or not.

In layman’s terms:  The Christian Right is backing Trump because he will appoint the judges who will align with their views of how laws ought to be interpreted in this country.  He will give executive backing to legislation that will reshape this nation into what they always wanted it to have been from the start–a fundamentalist, conservative hallmark for Christendom; rife with the great tradition of hypocrisy and intolerance that it entails.

In this context, the fact that they are undercutting their own sanctimonious virtues by throwing in their lot with a person as un-Christlike as Donald Trump is irrelevant.  The fact that their current actions are causing younger generations to walk away from their congregations is a moot point.  It ultimately does not matter what convictions anyone individually holds; as long as all are still forced to abide by the laws and legal precedents implemented by the Christian Right, victory has been ensured for generations to come, because once a matter becomes the judicial status quo (regardless of how draconian or unpopular) it becomes that much harder to overturn, socially and politically.  Rather than flailing in the wind towards irrelevance, this sect is playing what they believe to be the long game in the culture war to reshape American society.

And for once, I know exactly what the fuck they are talking about when they spout their babble, and there is nothing meek or humble about it, in either a Christian or secular sense of the words.  If the other side of the political aisle wishes to have a fighting chance against such blatant subversion of the democratic process, the pushback has to be equally biting, with a succinct and unrelenting, “Like hell you will!”

The Reason Stories are Written in Past Tense

One of the first things any decent creative writing class will teach an aspiring author is the importance of maintaining consistency throughout the text, and it’s something I’ve definitely mentioned before on this blog.  Although this often refers to the importance of maintaining plot consistencies, grammatical consistencies (and functional consistencies) are equally crucial parts of creating a legible narrative.

Anyone who reads fiction regularly will have noticed that the overwhelming majority of these stories are written in past tense; e.g. “It was the best of times…”, “She figured it was all over…”, “He loved her like no other, but also saw no way to show it…” etc.  But why is this?  What makes a past tense narrative more grammatically correct than a present or future tense syntax structure?  To answer that question, one needs to first dispel the phrasing of it.  There is nothing inherently more grammatically correct about using past tense, as opposed to any other tense, as long as the narrative voice remains consistent in its use throughout the story (or there is a damn good reason why it doesn’t need to be).  Hence, the reason past tense is seen as the default has less to do with grammar, and more to do with functionality.  It shouldn’t be forgotten that writers by definition are also readers, meaning that they carry with them decades’ worth of literary conditioning, just like the audience they are trying to reach.  Most of the books a writer has read will have been written in past tense narrative, and like every other reader, it is understandable if this structure naturally seeps into one’s own writing.  Nor should one underestimate the sheer amount of concentration it takes to catch inconsistencies when attempting experimental works that run counter to the norm; the potential for producing inconsistent prose goes up substantially when trying to do something out of the ordinary.  Therefore, defaulting to the more common past tense narrative is an easy way to ensure consistency throughout one’s plot, since it will feel the most natural for writers and readers alike.

Alternatively, rarely do you see whole plot narratives written in the future tense; e.g.  “I will go see her tomorrow, after which we’ll talk…”, “They are going to take care of it later…” etc.  This sort of writing is reserved more for character dialogue, as it is more in line with casual conversation (not to mention people’s internal dialogues) wherein the discourse centers on planned actions (i.e. things yet to be done, spoken about by characters whose overall knowledge of events is limited).  In contrast, narrator voices—whether they are written in first person or third person; whether they are limited or omniscient—are instinctively read by the audience from a bird’s eye view perspective, detailing the happenings to them as an observer of events.  It wouldn’t be impossible to write a whole narrative in the future tense, but the risk you run is frustrating your readers because, in many ways, such a choice of phrasing stands so deeply in contrast with how most of us are attuned to differentiating between plot narrative and character dialogue that it may have the unfortunate effect of making the story too confusing and tiresome for most to bother following along with to the end.  And while challenging readers through provocative prose can be laudable, giving them a headache through cumbersome verb usage is anything but useful.

Lastly, there is present tense; e.g.  “She creates the world as she sees it…”, “He says what he thinks, and he thinks what he knows…”  It’s a very impactful form of narrative, which immediately frames the plot into an action mode—things are happening, and they are happening right freaking now!  It’s unique, and in the hands of a skilled writer, has the potential to serve as a creative alternative to its more common past tense counterpart.  On the other hand, in unseasoned hands, it also has the potential to wear out the reader; think sensory overload brought about by too much intensity.  There is a reason most stories follow the general setup of: introduction -> rising action -> climax -> falling action -> conclusion/resolution. If the whole story is written in a narrative that denotes action all throughout these distinct steps, then the writer will have to work doubly hard to make the impact of the climax (and the rising action that leads up to it) stand out to the reader’s attention.  I’m not saying that it’s an impossible task to accomplish, but it is harder, and takes considerable talent to get it right.

I outlined why the prevalence of past tense narratives in fiction isn’t really an issue of grammar, but an issue of ease of writing and what readers are simply accustomed to.  In an obvious way, the situation is very much a Catch-22:  Readers are used to reading past tense narratives because most authors write in past tense; authors write in past tense because most readers are used to reading it. And a prevailing orthodoxy is thereby sustained.  Now, I will never say not to attempt a heterodox approach that deviates from the norm, on the grounds that one never knows for certain what works until it’s tried (every new situation carries with it the prospect for new discovery, and all that).  I simply want to make the point that no reader expects you to re-invent the written word to be seen as a great storyteller, and it’s perfectly fine to stick with what has been tried and tested to work, and what will make it easier for you to write your story, rather than fret over the structural details when you really don’t have to.

Dispatches from Gulfton

The first grocery store I saw when I moved to the United States was a meager-looking spectacle called Sellers Bros. in a rundown strip-mall area of southwest Houston, TX.  The store’s shelves were as overcrowded with bargain, generic-name products as its aisles were with patrons shuffling from one end of the building to the next, holding tightly to the Lone Star Cards needed to feed their families for the month.  The building’s somber-looking outer structure bore a passing resemblance to the apartment complexes that surrounded it only a few paces away—one of which my family was living in at the time, serving as our first exposure to the realities of the inner-city American life we had immigrated to, and were gradually assimilating into.

The majority of the neighborhood was composed of immigrant families.  Though unlike my family, which originated east of the Atlantic Ocean, it was impossible not to notice that most of my neighbors hailed from south of the Rio Grande.  As a result, while I had come to this country with the advantage of being able to speak English reasonably well—well enough to understand, and be understood by, the general Anglophone population anyway—this advantage proved of little value on the very street I called home for those years of my adolescence.  It was an early education in a fact many living in urban America are readily familiar with.  Namely, that within the reality of American life reside smaller sets of conflicting realities, many of which can neither communicate with nor understand one another, and are set up so that they will rarely meet.  Gulfton Street in Houston, Texas, occupies one such reality.

Tucked away between two major highways in southwest Houston, spanning a stretch of 3 to 4 miles of cracked concrete landscape, sits the street of Gulfton.  The epicenter of the Gulfton Ghetto, as it’s occasionally called by the local media and by other Houstonians (though never by the neighborhood’s own inhabitants).  To those who take a wrong turn off Bellaire and find themselves driving down Gulfton Street by accident, the insulting nickname will seem most warranted.

The immediate sights one is met with are panel after panel of gang graffiti, row upon row of low-rent apartment complexes, and concrete sidewalks that have been in desperate need of repair for a good few decades now.  Surprisingly, there is a park/recreational center meant to give some relief to the area’s ongoing problem with juvenile delinquency, though anyone who has ever stepped into the park itself will quickly be robbed of any hopefulness about the prospects of this endeavor.  In short, like many neighborhoods in urban America, Gulfton is a place that has been largely abandoned to the ravages of metropolitan entropy.

Underfunded and only halfway fleshed-out improvement projects that have failed to live up to expectations are pointed to by the rest of the city as reasons not to bother with any future attempts at repairing the crumbling infrastructure, leaving the residents who have given up on the idea of moving away to either wall themselves off from the unsavory conditions that surround them within their private residences (however meager they may be), or embrace those conditions by becoming a part of their destructive nature.

The first instinct any well-meaning person will have when confronted with a reality like Gulfton is, “Can anything be done to fix this?”  It’s an honest question, but it betrays a lot about the person asking it.  The idea that there is any one thing that can resolve problems that are decades in the making is a part of the problem to begin with.  These sorts of problems have no one facet of origin, but are a delicate, interwoven mess of social, economic, and political barriers erected and maintained through complex systems with interests that themselves compete against and prop up each other in a multitude of ways.  The problems of Gulfton, like the problems of similar neighborhoods and populations throughout this country, have no single cause; hence they can have no single solution to curb the path they are currently on.

“Why don’t the people living there work to fix things?  It’s their neighborhood, after all.  Don’t they care?”

Unfortunately, the reality of all urban areas is that they are landlocked and dependent on the larger metropolitan area that surrounds them.  They don’t get to make decisions in a vacuum, and resources are finite and sparse in terms of what will be readily allocated to benefit them.  The further issue is that once a neighborhood has fallen far enough to be regarded as “hopeless” by officials and administrators who could possibly make a difference, the very hopelessness of said neighborhood is used as the reason against committing long-term funds to improve its conditions, on the basis that it would be unfair to use tax dollars from well-behaved citizens in more savory parts of the city to fund the activities of no-good thugs and gangsters in these low-income, high-crime areas.  Local agencies will say they are not equipped to handle the expenses needed to undertake the sort of social projects necessary to overhaul the issues plaguing these sorts of areas, while Federal agencies see these issues as strictly a local concern.

In the absence of a robust social safety net provided by the city or state authorities to ensure the most basic of securities and public amenities, opportunistic forces will band together to construct their own safety nets; for many young people this takes the form of turning to gangs, which prey on social instabilities by offering their quasi-organized crime structure as an alternative to festering in a decrepit social system.  The reason youths are most susceptible to this is that they are the most in need of some kind of functioning social order to orient their lives (and relieve their boredom), and even the violent and dangerous structure of gang life is, to many, preferable to the instability of no visible structure at all.

Some people have a natural aversion to hearing that any issue constitutes a systemic problem, requiring a systemic approach to resolve.  To them, the very notion of entertaining such a thought is little more than an attempt to shift responsibility away from individuals and let them avoid the consequences of their actions and/or apathy, leaving them no incentive to make things better of their own accord.  I can understand the sentiment behind this aversion, though I find it largely misinformed.

In a place like Gulfton, how exactly do you expect the individuals living there to step up to fix the various problems that plague their environment?  Should they pool their meager earnings together to pay for the ongoing structural damage to their concrete sidewalks and street signs, despite the fact that we’re talking about city property, an issue that needs to be addressed by the local government?  How about the need to improve the resources available to the local schools so that there can be robust after-school programs and activities available for young people to occupy their time with to discourage the need for delinquency and gang activity?  Should the low-income parents of these youths fund these programs directly, thereby taking away money that’s needed to pay for rent, utilities, food, clothing, etc.?  Would that be an example of individuals stepping up to take personal responsibility to improve the conditions around them, or a neglect of one’s obligation to provide basic necessities for one’s own family first?  If donating money is not the answer, surely we can get everyone to at least volunteer their time to improve their community, no?  It’s not as if the sort of people who have to live in these sorts of neighborhoods are undoubtedly also stuck working jobs with little to no flexibility in hours or time off, after all.

Perhaps the answer is that all these folks ought to work harder to increase their earnings, so they aren’t hostage to their economic conditions.  Yet, if they actually managed to do just that, what incentive would they have to spend their extra earnings on repairing a place like Gulfton, as opposed to–oh, I don’t know–simply moving away to a better part of town that already offers all the basics of dignified living conditions?

Unless you are Bruce Wayne, sitting on an endless supply of inherited wealth, resources, and leisure time, individuals donating money and/or time will never be a solution to the problems that affect neighborhoods like Gulfton.  These are problems that took a long time to manifest, and they require long-term investment and planning to be resolved.  Resolving them requires layers upon layers of overarching organizational resources to properly oversee and track improvements, which no single individual or clustered group is capable of providing.  Private businesses, local or otherwise, also offer little help in the matter, since there is no business incentive in investing in a place simply to improve the lives and environment of its residents, and these residents will not be able to return the gesture because, at the end of the day, they’ll still be too poor to ever turn a profit for these businesses.

And it takes an astounding level of naivete to not be able to realize this.  The same sort of naivete that leads certain people to make inane points like, “If you like public programs, and think taxes should be higher to pay for them, why don’t you just volunteer more of your money on an individual basis, instead of demanding everyone else do it through the tax code?”  Because individual actions and donations will not solve systemic problems like the ones affecting neighborhoods like Gulfton, that’s why.  Because many of the problems plaguing inner-city life are far too complex and interconnected to a multitude of surrounding factors to be seriously brushed off with red herrings concerning individual responsibilities.

Areas like Gulfton are the way they are because they have become culturally and economically alienated from the rest of their metropolitan centers, and the rest of the country at large, and little is being done to incorporate them into the greater society that surrounds them.  The full reasons for this alienation are legion, and the solutions that will be necessary will by definition be just as extensive, which is a reality that must be acknowledged by those who purport to take the issues of working, urban, and immigrant communities seriously.

If, on the other hand, you simply don’t care about places like Gulfton, then just say you don’t care, and stand by the convictions of your apathy.  And stop pretending that there is a greater moral or ideological basis to what is essentially pure disinterest for the plight of people you can’t be bothered to give a shit about.  It will make for a much more honest conversation.

Wit Contra Sarcasm (Contra Douchebaggery)

Wit is hard to get.  Just as you think you get wit, they’ll come around and change what wit is.  Suddenly, what you thought was wit, is no longer it, and what is wit, will sound to you like a pile of shit!

Fortunately, wit has an easier-to-attain co-traveler in the world of rhetoric named sarcasm, which is much, much easier to pull off.  Much like pineapple on one’s pizza, people either love sarcasm or they don’t.  And for those who love it, they really freaking love it.  I find this to be especially true of women during the initial courting process, because the women who appreciate good sarcastic banter will respond very favorably to any guy able to keep up with their own sarcastic quips, while the women who are turned off by sarcastic jokes will very quickly show you how unamused they are by your highbrow, wit-lite ramblings.

Let me say from the outset that I’m not bashing sarcasm here—sarcasm is great people in my book (I can attest that some of my best friends are practically verbally drenched in nothing but sarcasm…also desperation and self-loathing, but sarcasm is a large ingredient in their person-stew, too).  My main problem with it is that a lot of people seem to think that simply saying something in a sarcastic tone ought to be treated on par with making a witty comment, seemingly unaware that it is not the sarcasm that makes a comment witty; it’s how clever and salient said comment is to the situation it is speaking on.

I’m sure we all know at least one person who has unwittingly fallen into this trap, but for a notoriously bad offender think no further than Dennis Miller’s stand-up routines in the 90s, where in addition to pointlessly disjointed similes, a la “Man this whole impeachment issue is becoming a stickier mess for Bill Clinton than Rutherford B. Hayes’ sauna sessions, daddio!  Amirite folks? Har har har” [note: not a real Dennis Miller quote, but can you honestly tell the difference?], he often relied on simply saying something in a sarcastic tone to give the implication that a witty comment had been made, hoping it could carry the point home for him.  It hadn’t, and it couldn’t.  As is the case for all things sarcasm-sans-wit related (and all things Dennis Miller related, for that matter), it’s essentially where the desperate nugget of any relevant point goes to die.

On a related note, think about all the times you have been in a situation where you made a suggestion regarding a course of action, only to get a response of, “Oh yeah, that’ll work reeeeeeal great, I’m sure of it.”  Accompanied by an eye-roll, and a few air-quotes thrown in to truly carry the point home.  While we all can recognize this as being far from anything resembling wit, I would even hesitate to deem it worthy enough of being called mere sarcasm.  It’s much closer to what I would refer to as “Douchebag Cynicism”.  Which is academically defined as any and every action or comment made to identify and amplify one’s irredeemable douchebaggery poorly masquerading as cleverness.  It’s a noun.

Really, my only point in this whole rant of a post is that if you feel the urge to be sarcastic, put a bit more thought into it besides just adding a mocking inflection to your voice—try to actually have something noteworthy and clever to contribute to the conversation.  Also, always strive not to be a douchebag cynic.  Though that last bit is wisdom that can probably apply to most areas of one’s life.

Character Backgrounds: The Dilemma of Sharing Too Little, or Too Much

When writing a story, there exists a natural disconnect between how the author interprets the plot, and how the audience reads it.  The obvious reason for this is that the author has the (mis)fortune of knowing the intended background details of the events and characters before they ever make their way onto the page, in ways that are not readily available to the reader.  The task for any decent writer is to convey these details in a way that makes for a compelling narrative that will neither be overbearing for the reader, nor leave them stranded in the dark regarding important plot/character developments.

Spotting moments when an author is being too reserved with details is fairly easy.  Anytime you’ve come across a part of a story or book that left you wondering, “Wait, who is this, and why are they suddenly in the middle of everything?  Where the hell did they come from?” you were essentially exposed to underdeveloped writing.  Be sure not to misunderstand what I’m saying, though.  Introducing new characters, and strategically withholding information about them, can be an effective writing technique to invigorate interest back into the plot, as a little mystery can go a long way in building much-needed suspense in an otherwise stale story.

As an example, imagine a love story between two characters named Tom and Jill.  For over a hundred pages, you follow along as Tom sees Jill, falls in love with her, and tries desperately to impress her.  Jill is originally aloof regarding Tom’s advances, but slowly she starts to feel flattered by his affection for her, and agrees to give him a chance.  Things are going great for the two lovebirds for several more pages, then—just as the plot can’t bear the weight of any more Hallmark-moment clichés—a sudden wrench is thrown into the mix:

Nothing could tear Tom’s gaze away from Jill’s eyes.  The shape of them, their softness as she smiled, even the wrinkles that formed at the corners of her eyelids as she laughed, all worked to keep him in a hypnotic trance from which he could not—would not—escape.  Or so he thought.  Because the moment Susan Gallaghan walked by them, he felt his eyes wander from his beloved Jill’s enchanting eyes, to the rhythmic steps that paced along in front of him.

Let’s assume this is the first time this Susan character is ever mentioned in the plot.  The first thoughts any reader is going to have will be along the lines of:  “Who the hell is this Susan person?”, “Is she someone new to Tom?”, “Is she an old flame?”, “Is she a girl from his youth that he secretly pined after?”, “Is Tom actually a serial killer, and Susan his next victim?”  At this point, we, the audience, have no clue.  The fact that we have no clue is what makes it a brilliant writer’s trick, because now you are invested in the dilemma and subsequent resolution that is sure to follow.

But what if the drama never really unfolds the way you expect it to?  While the sudden introduction of this new character works to spark the reader’s interest in the development of the story, it can only carry the audience’s engagement so far.  If Susan keeps popping up in the same way, with the same vague acknowledgment from the established characters, the reader’s interest will quickly turn to frustration, and ultimately to disinterest.  You have to give the audience a reason as to why the things that are happening on the page are worth being mentioned to begin with, and in the case of character development, this means divulging at the very least some connection between secondary plot-device characters (like Susan above) and the main protagonists.

Divulging a character’s background effectively in a narrative is not as easy as it may sound.  A lot of the time it can come across as bloated, a poor attempt to force-feed too much information into the plot just for the sake of having the reader know why this person exists in the story.

Imagine if the mysterious introduction of Susan above were followed up with:

Tom immediately recognized Susan as his high school sweetheart, to whom he had lost his virginity on prom night.  The two of them went their separate ways soon after graduation, but Tom never quite got over his love for Susan.  Susan, for her part, had little trouble moving on from Tom.  So much so, that she moved away to study and travel abroad.  As she traveled the world, she gained an appreciation for herself, and how she didn’t need to define her identity by any one person who happened to be in her life.  Unlike Tom, Susan wasn’t validated by whether someone loved her; she felt complete knowing that she loved herself.  Even now as she walked past him with all the confidence of a young woman who intended to live her life to the fullest, Tom’s heart throbbed once again for the one that got away.  Though Susan didn’t recognize Tom, the two of them would be seeing a lot more of each other from here on out, since she was set to begin a new position in the very firm Tom worked at.

The problem here isn’t that this information is being revealed within the plot; it’s that there is no reason to have it laid out all at once, let alone right after the mysteriousness regarding Susan’s presence was so brilliantly executed.  All of this can be revealed through the course of several pages, if not several chapters.  Again, by all means give the necessary background to establish a character, but there is no need to lump it all together in one spot, because then your narrative will inevitably end up repeating itself again and again, every single time the information needs to be revisited.  Eventually, Tom and Susan will have a confrontation, where hints can be dropped regarding their past intimacy.  Rather than state that Susan is a confident and independent person, why not show it by the way she behaves and interacts with her surroundings and the other characters?  Pretty much everything stated in that one paragraph can be dispersed piecemeal throughout the story, without having to kill the suspense by revealing it all in one big swoop (especially right after the mystery character is introduced).

For a real literary example of where an author does a superb job of balancing the enigma of his characters with their subtle background revelations throughout the plot, I would point to the characters of Mr. Croup and Mr. Vandemar in Neil Gaiman’s Neverwhere.  Even before the book’s otherworldly narrative is revealed, these two characters’ peculiar manner of dress and manner of speaking foreshadow a fantastical nature to their persons (and, by extension, the plot itself).  All of which is subtly explored in what essentially amounts to breadcrumbs’ worth of information through the course of a 300+ page story.  And at the end of it all, the mystery behind who/what Mr. Croup and Mr. Vandemar really are is never fully revealed, precisely because there is no reason for the story to do so.

Ultimately, it’s up to every writer to decide how much is too much background exposition for her/his characters, and how much is just enough to not stifle character and plot development.  That happy balance will largely depend on the sort of story you are trying to tell, and it may take several revisions to get it within the range you are aiming for.  But, while it’s not always straightforward in either case, being able to spot the problem in other written works means you are more than capable of applying that critical eye to your own.  Like a lot of writing advice, it simply starts with reading your writing not as an author, but as a reader, first and foremost.

Mindlessly Mindful: How Meditation Stifled my Creativity

Over the course of the last few years, the practice of mindfulness meditation has sparked a great deal of interest in private and public discourse.  For many, this discourse takes on the form of a full-scale spiritual reawakening in their lives–the rationale being to look back to what some would call time-tested wisdom as a guide for navigating modern life.  For others still, of a more pragmatic mindset, the adoption of meditation into their daily routine is less about reaching an esoteric sense of enlightenment, and more about finding a means of focus for the cluttered thoughts they feel are clogging up their minds.

My own interest in mindfulness meditation began sometime in late 2016, and stemmed from a general curiosity regarding the positive results being attested to by its practitioners–a range of all sorts of different personalities, including (but not limited to) self-appointed gurus, public intellectuals, corporate bosses, average laborers, and everyone in between.  What piqued my curiosity most was how the underlying message from this diverse group of people was a resounding agreement that: “Yes, indeed, meditation works!”  The full definition of how it “works!” and what it means for it to “work!” often varies as much as the individual backgrounds of meditation practitioners; however, there are some very clear commonalities among all the positive testimonials.

A greater sense of focus is one recurring benefit attested to by mindfulness meditators.  Specifically, a greater awareness and appreciation of the details encompassing the moment one happens to be currently occupying, as well as the multitude of thoughts that accompany it.  Another common theme among meditation circles is how it leads one to confront the (supposedly false) preconceptions surrounding the fundamental concept of the Self, and the illusory nature by which we think of our Self in relation to both our internal dialogue and the external world our Self interacts with (whether it is even coherent to think of the Self as an independent agent relating to the world, rather than another component in an endless string of interacting effects that make up existence).

I spent weeks researching the practice and philosophy of mindfulness meditation to get a better understanding of it, until finally, on January 1st, 2017, I decided to put theory into practice and devote a significant portion of my free time to trying to gain some firsthand experience of what it truly means to incorporate meditation into my daily life.  Recently, on January 1st, 2019, this personal experiment of mine came to a full stop.

When I first set out on this personal journey I expected the possible results to go one of two ways:  1.  A net positive, wherein I would enjoy the benefits of reaching some semblance of self-awareness, self-discovery, and hopefully even personal growth (like so many others testified to having experienced through meditation).  2.  A net neutral, the results of which would be no more dire than having wasted some portion of my time on a fruitless exercise that offered no real benefits, but ultimately no harm.

Having now gone through it, I can’t say what I experienced was neutral, since the practice definitely affected me on more than one level.  Unfortunately, from my perspective, the effects I felt leaned more towards a net negative as a whole; so much so that I decided to give up meditating completely, as something that may simply not be a suitable practice for someone like me.

Once I ceased meditating, a subsequent curiosity came over me: I wanted to find out if there were others who had had a similar (negative) experience to my own while practicing mindfulness meditation, but surprisingly enough the answer to that question seems to be a resounding, “No.”

I came across a few blog posts here and there of people saying they weren’t completely satisfied with what mindfulness meditation offered, or that it wasn’t what they expected, but they were still overall happy to have had the experience (even if they decided it wasn’t the right fit for them).  I also finally took the time to research the medical and psychological data regarding the long-term benefits of meditation (or, more aptly, the lack thereof) I had intentionally avoided while engaging in the practice, so as not to be prematurely biased against it.  Yet, other than a general confirmation that little to no empirical evidence exists to validate its self-proclaimed benefits–possibly making meditation more comparable to a placebo effect than genuine self-awareness–I still didn’t come across reports that confirmed anything close to my personal (negative) experience.

I’m not going to go into deep detail regarding the exact nature of the sort of mindfulness regimen I followed during this two-year period, partly because I’d rather be guilty of leaving details ambiguous than have every meditating Tom, Dick, and Mary who fancies her/himself a guru lecture me about how “real” meditation ought to be done.  If that is the sort of objection coming to mind as you read this, I am unfortunately failing to get the crux of my point across.

It’s not that I meditated and got no results from it, or that my results were drastically different from what I’ve read, heard, and observed others state about their own experiences while meditating.  In fact, my experiences were more or less in line with what the typical person claims to go through while practicing mindfulness exercises.  My problem with meditation–and mindfulness meditation, specifically–is what I view to be the negative impact it had on my creative wherewithal.

What exactly do I mean with this? Allow me to explain.

A heightened awareness of the current moment is one of the major benefits promoted in favor of meditation.  While I see how meditating might help those who have a habit of wearing their emotions on their sleeves–or maybe those who suffer from impulsive decision-making in general–I’m someone who came into meditation already relatively calm and collected, possessing a decent set of stress management skills to begin with.  Furthermore, I’m someone who relies on constructing imaginary plots, involving imaginary people, and projecting them into contrived scenarios that could resolve themselves any number of ways I see fit to write.  Now, seeing that creative writing is generally penned in the past tense, about things that have yet to be imagined, involving situations which do not exist, I never expected mindfulness meditation to offer much in the way of benefits in this part of my life.  But I also wasn’t prepared for how downright harmful it could be to it, either.

Prior to incorporating meditation into my daily routine, the feeling that gave me satisfaction at the end of the day, when I went to bed, was having sat at my desk passionately typing away at my laptop’s keyboard long enough to lose my sense of self in the world I was creating.  And, slowly but surely, I felt this passion begin to erode the more progress I made with my meditative practice.  (Then subsequently return when I stopped meditating altogether.)

Sure, I got better at focusing on my breathing, as well as the various physical sensations that made up my moment-to-moment experiences, which in turn made me more aware of not just my thoughts, but the process by which these thoughts seemed to spontaneously manifest into my conscious monologue.  But all of this came at a cost.  Being more aware of my thoughts–moreover, being conscious of the act of thinking–made it harder to lose myself within those thoughts when I needed to weave together thoughtful writing.

And it wasn’t just writing.  Other creative outlets like painting became harder, too, because a large part of my painting process revolves around being able to foresee and focus on what shapes and images can be created (rather than what is present in the moment), and what method/color scheme will illustrate them best.  Being aware of the moment, and of the act of what I was doing (in this case, sitting in a chair while painting), offered no benefit to the act itself, and ironically often served to distract from letting my thoughts roam towards conjuring up the inspiration needed to complete the project.

Yes, inspiration.  That is the key ingredient that I felt slipping the deeper I delved into meditation.  Ironically, as a result I found myself feeling more frustrated and stressed as a person when I sat down to do my work; traits I largely did not possess (at least not to the level I developed) going into meditation.

Like a lot of bad side effects, it took time for the signs to come to the surface, by which point meditation had already become part of my daily routine (and, really, routines can be so hard to break once they’ve cemented themselves into our daily lives).  So I carried forward through all of 2017, and the first half of 2018, somewhat oblivious to the source of my depleting creative spark.  Then, last summer, I wrote a post on this blog titled The Pitfalls of Self-Help, after which I started to consider the possibility that all the positive testimonials I had heard in praise of mindfulness (which got me interested in it in the first place) were just as vacuous as the testimonials of people following any other self-help/self-awareness fad.

I started to seek out other mindfulness practitioners to see what insights they had to share, and was largely met with not-fully-thought-through regurgitations from self-proclaimed meditation gurus, whose wisdom sounded more like buzzwordy slogans from the reject bin of yesterday’s fortune cookie stash.

One particular conversation proved most enlightening.  The gist of it went something like:

Meditator:  “How you perceive of the Self is an illusion.”

Me:  “I perceive of my Self as a collection of atoms that make up the matter that is me; occupying a specific space in time that only I occupy.  In what sense is this an illusion?”

Meditator: “That’s not how people define the Self.  When people talk about a Self, they speak of it in terms of a separate entity that’s observing their doings, instead of being a part of it.  That’s an illusion.”

Me:  “But I just told you that doesn’t apply to how I, personally, conceive of the Self; as it pertains to me, or anyone else.”

Meditator:  “It does.  You’re trying to intellectually rationalize your perception.  In reality, you’re just not being honest about how you really perceive your Self in everyday practice.”

I’m fine with accepting that I have blind spots regarding my own conscious and subconscious awareness.  What I take issue with is being told I have to accept the idea that someone else–with absolutely no firsthand access to my thoughts or perceptions–has figured out where all these blind spots are, how they pertain to my experiences, and how it all conveniently fits into her/his own preconceived generalizations and worldview.  In other words, feel free to tell me that I’m wrong in my opinion, but don’t condescendingly tell me you know what I’m really thinking, in order to make me and my thoughts conform to your philosophy.  That’s not awareness; that’s just bullshit.  And I hate to say it, but a lot of meditation seems to run very close to this level of discourse.

In the last half of 2018, as I drifted further and further away from seeing any value in keeping meditation in my life, I was given two further explanations by meditation practitioners for my lack of positive results:  1. I’m not spiritual enough, and 2. I’m too straight-edge.

I’ll freely grant the first explanation as a strong possibility.  Even with the most elastic definition of the word “spiritual,” I can honestly say that it does not, and cannot, apply to me.  While I know there are efforts to promote a secular form of spirituality, I still feel the need to point out that I have never believed in the supernatural or the mystical, and the values and passions I have in life I do not equate with, or think of in, any deeper “spiritual” terms.  The things that give my life meaning and joy are simply the things that give my life meaning and joy, and I see no reason to lump on belabored spiritual terminologies that do little to further elucidate what is innately a tautological experience for everybody.  Apparently, this type of thinking doesn’t sit well with the sort of people who claim to get concrete benefits out of meditation.  In such circles, simply saying you appreciate any aspect of life, and your roles and perceptions in it, is an affirmation of your spirituality.  Which is fine, but to me that just redefines “spiritual” so broadly that it becomes meaningless as a term.  I’m not invested enough in the semantics behind it all to debate the issue, but it’s safe to say that I don’t personally consider myself a spiritual person (regardless of whether others want to see me as such).

As to the second point, concerning my lifestyle choices: on more than one occasion, it was suggested to me that meditation can only be of real benefit when performed under the influence of psychedelics.  I have no way of knowing whether this is true, as I do not partake in recreational drug use (though I support anyone else’s right to do so).  But I have to ask: how do you know that what you perceive to be a greater self-awareness while high on psychedelics isn’t just a drug-induced delusion that has no bearing on reality as it actually is?  If being on drugs, and then meditating, is the key to opening the door to a greater truth about life, how come no one has ever emerged from these drug-fueled meditative states with any tangible, verifiable realizations about the world?

How come, in all the centuries of taking mushrooms and meditating in caves, none of these yogis and gurus came out of the experience with something like “E=mc^2”, or the formula for penicillin, or even something as basic as “hey guys, guess what, the world is actually round”?  (In fact, there is a growing following of people online, at least some of whom I imagine are very prone to getting baked, who argue in favor of a flat earth.)  It’s always some esoteric and vague platitude, like “the Self is an illusion” (as long as both “Self” and “illusion” are defined in very particular terms) or “states of happiness and suffering both depend on consciousness to be realized” (no shit–you’re telling me people who are brain dead can’t feel happy or sad?  Brilliant!).  So I must ask: what exactly is the point of a greater awareness, if said awareness has nothing tangible to say about the most fundamental, verifiable facts regarding the reality we inhabit?

And, look, perhaps there are those for whom such musings and conversations are of great value, and whose personal experiences have been greatly enriched by them.  If meditation has brought these people happiness, and positively impacted their personal growth as individuals, I would never argue to take it away from them on the basis that it wasn’t my cup of tea.  One underlying message here is that we’re all different, and what works for you may not work for me.

The other reason for writing this post is to speak to anyone who may have had a similar experience with meditation to my own, and who has also struggled to find others voicing that experience.  Although I didn’t find much in the way of negative testimony regarding mindfulness meditation, I have a hard time believing that there isn’t someone–at least one person–in the world who, like myself, has tried this out and found it to be more of a hindrance in her/his life than a benefit.  To this person(s) I’d like to say: there’s no point in struggling forward on a futile quest, and there’s no shame in walking away from something that is doing you no good.  There are many different ways to experience life and achieve personal fulfillment, and just because something is presented as a cure-all for what ails you doesn’t mean there aren’t better alternatives out there more suitable for you.

And if you think everything I’ve written is unwarranted drivel, let me know, and I’ll be sure to meditate on your concerns posthaste.