Resume Writing 101: From Start to Finish


Writing a concise resume used to be a person’s first introduction to the competitive world of job hunting.  It didn’t matter whether the job being sought was entry-level or on the management track; knowing how to sell oneself via a 1-2 page formal summary of professional qualifications and achievements was the first (and, oftentimes, only) chance you would get at impressing a potential employer.  Despite the popularity of business-oriented social media sites like LinkedIn, the importance of having a decent resume holds as true today as it did twenty years ago.

I can feel the collective eye-roll of most readers at this point, sighing in unimpressed unison, “Well, duh!”  If you are among this crowd, I assure you that I’m not trying to waste your time (or mine) by typing up a how-to on a matter of common knowledge.  Given the volume of resumes and abstracts (if one can call them that) I go through on a regular basis, the glaring fact that stands out is how often job-seekers entering the workforce nowadays neglect the most basic resume writing standards.  For their sake, and my own, I think it’s worthwhile to go over some of these basics, one point at a time.

  1. Use a simple word processor document.  There are many resume writing programs and apps on the market now, but I have yet to come across one that’s worth its bandwidth when it comes to typing up a plain, to-the-point resume.  A simple Word document, made with the software most laptops and desktops already come equipped with, is really all you need.
  2. Font and Style:  Times New Roman is the classic; Arial is acceptable, though slightly less business classy.  Anything else ought to be avoided.  Seeing as how your resume was typed on a computer, and is understood to be read as such, there is no need to use fonts that mimic handwritten or artsy text.  What makes a resume visually appealing to an employer is its legibility, not the amount of fancy swirls or loops you managed to imitate in your fonts.  If anything, these could be seen as distracting and unprofessional.  Since you never know what quirky pet peeves a person might have, playing it safe by sticking to a plain typeface (i.e. Times New Roman, 12-point font) is just the smarter way to go.
  3. Write your name in bold at the top of the page, and center it.  I would advise that your name be the only thing written in bold on your whole resume, to make it pop from the rest of the text, and thereby be more memorable to the person reviewing it.  It is also advisable to write your name in a slightly larger font to further add to the effect (so if you’re using 12-pt. font for the body of your resume, go up to 14 or 16-pt. for your name, but nothing else).  While still centered on the page, write your phone number and email address beneath your name (no need to bold these; your name is the only part we’re trying to make pop on the page above anything else, remember?).  A lot of resume tips online will also say to include your address among your contact information, but I have to disagree with this.  When employers are narrowing candidates down for callbacks, they start looking at the pettiest things to choose from among otherwise equally qualified candidates.  Hence, when they see that Candidate 1 lives 5 miles away, and Candidate 2 lives 15 miles away, they might consciously or subconsciously take this into consideration when making a final decision.  Best not to let it be factored into the equation at all by not listing your exact location so front and center.  [This is a general bit of advice.  If you know that your address will not or cannot be a detrimental factor, and might even be an asset, by all means go for it, and list it.  I’m simply telling you what’s been helpful in my experience, having sat on both sides of the hiring table.]

Now, with the basics out of the way, let’s get into the actual meat of the matter.  All resumes need to include the following four sections detailing your professional history:  1. Objective, 2. Qualifications/Skills, 3. Education, 4. Employment History (I’ll say a few words regarding how best to handle References towards the end of this post).

  1. Objective:  Your objective is your one-sentence pitch as to what your career goal is in terms of why you’re seeking this position.  I say one sentence, because you should be able to explain your reason for wanting this job (and the reason why it’s a perfect fit for you) as succinctly as possible; when people start typing up two-to-three sentences’ worth about themselves, they are prone to letting irrelevant, rambling details seep into the text.  A lot of jobs are in fast-paced environments that value employees who don’t waste time, and demonstrating that you can communicate your intentions in one sentence, while others take four, goes a long way in speaking in your favor.  (And no, writing one long run-on sentence, filled with commas and semicolons, is not a convincing hack that will fool anybody; if anything, it will just make you sound long-winded.)  As to what to actually say in your objective, it largely depends on what type of job-hunt you are conducting.  If you are tailoring your resume to a very niche position, in a specific line of work, it’s better to speak directly to that.  If, however, you are job-hunting with a more general idea of the sort of job you’d like to do, and know that you will be sending this same resume to a variety of different employers, more versatile wording in your Objective might be appropriate.
    •  Acceptable example:  “Objective:  To obtain a competitive position in a field that will offer continued growth in proportion to my abilities and skills.”  Vague enough to apply to a variety of career fields, it displays a sense of ambition, but also pays lip-service to the notion that this ambition will be of mutual benefit to the employer and the employee.
    • Unacceptable example:  “Objective:  To get a job that I will enjoy and with which I can forward my career in the long-term.”  Essentially says the same thing, but it’s far too casual for employers to read any further depth into beyond what’s stated, and, more importantly, it is entirely egocentric in its delivery, giving the implication that this candidate is someone who will bail the moment they feel that things aren’t going their way at work (this may be true of most employees, and any competent employer will be aware of it, but subtlety and plausible deniability are highly valued skills on the job market, and showing that you possess them counts in your favor, even if your bosses know when you’re trying to use these skills on them).
  2. Qualifications/Skills:  Right after your career objective, you should have a section listing your skill sets.  The best format is tidy bullet points, one after the other, with the most relevant at the top of the list (i.e. relevant as they pertain to the position you are applying for, so feel free to shuffle these bullet points around and personalize them to each position as you apply from one job to the next).  If you have any certifications or specialized training, this is the time to mention it.  If you know that the position you are applying for requires a specific skill that you possess, write it out as plainly and obviously as possible (i.e. if the job will require you to work with spreadsheets all day, say “Proficient in all matters of Excel use, both on PC and Mac OS” instead of the more opaque “Proficient in Microsoft Office systems”–yes, the latter obviously includes Excel, but don’t overestimate the attention span of employers, and don’t underestimate their need to have things explicitly spelled out for them at all times).  Towards the end of your list of skills it’s perfectly all right to mention something that, though not completely relevant, shows you to be an interesting, well-rounded person, but use a bit of common sense regarding what details to share.  Saying, “Extensive experience volunteering with youth groups to help foster a more positive community for at-risk students,” is a great humble brag; saying, “Leading figure in the online furry community, actively advocating inter-species acceptance and relations,” though potentially intriguing to discuss, is probably not appropriate to lay on a potential employer so early on.
  3. Education: State your education as plainly as possible, by which I mean:  name of school, type of degree, area of study, and noteworthy honors or commendations.  Unless the position you’re applying for makes a point of mentioning an educational requirement, or your education reflects some unique or prestigious point, there is no reason to overwork this section beyond the basics mentioned.

The above information should fit within 1 page of 12-pt typed font, or somewhere very close to it, leaving you open to type up the final section on a separate page.

4. Employment History:   As the section’s name implies, give a list of places where you worked.  Self-explanatory, really, but I’ll be painfully long-winded about it anyway [because I’m a pedantic son of a bitch, that’s why!].

    • The rule of thumb to follow is that if you have very little job experience, list whatever you can reasonably get away with passing off as “work experience”.  Have you ever done volunteer work?  List it, and detail your responsibilities.  If you have done internships, student work-study, led meet-up groups, whatever…these are all experiences you can use to demonstrate your ability to be productive and efficient in an occupational environment, even if you weren’t technically being paid.  And in terms of wanting to fluff up your resume to supplement a lack of employment history, or fill in extensive gaps in it, mentioning these specific activities looks much better than trying to camouflage things with vague concealers like “Worked freelance projects” or “Self-employed entrepreneur” (unless you’ve got a legit business card naming you the CEO of a registered company, please don’t ever use this designation for yourself–no one is impressed by it).  The point isn’t to lie, or make things up in lieu of a robust work history to tout; it’s to show that despite your lack of a standard 9-5 employment history, you are still a viable candidate who should be considered a serious contender for the job.
    • Now, some people will have the opposite problem, where they have way, way too many past jobs, volunteer work, extracurricular activities, etc., listed under their employment history, to the point that they need several pages to fit it all.  If this sounds like your resume, you should definitely consider a rewrite.  If you’re applying for an administrative position, and you have several years of administrative experience, you don’t need to list that summer job at McDonald’s, or that year you spent as a delivery driver, or the side-gig you’ve got going on entertaining children dressed as a clown (honestly, adult coulrophobia is so widespread these days that mentioning the last one might work against you full stop; no one wants to take the chance of getting murdered by a clown during business hours).  That’s not to say that you can’t or shouldn’t mention any unique work experiences that only tangentially relate to the job you are applying for, but if your employment history is already running well past 1 page of otherwise relevant work experience, just let said work experience do the heavy lifting at impressing your future employer, and wait to let all that quirky personal charm you’ve got shine through during the in-person interview.
    • Now, how do you best summarize a past or present job position on your resume?  Easy:  list the company, list your position, and list the time worked there (month and year).  Beneath that, type up a bullet point summarizing your duties and responsibilities.  Be as thorough as you can without resorting to word salads, or simply repeating what’s already listed under your Qualifications/Skills section–i.e. type in full sentences, as if you were relaying the information in person.  Depending on how much you did in the position, this may take one sentence or three, but try to keep it within five (you’re less likely to ramble irrelevancies if you place this limitation on yourself).

Okay–I’ve teased it, so now let’s say a word about References.  Please, please, please do not write these three words anywhere on your resume in regard to your references:  “Available upon request.”  If they are available, and you already have them on hand–moreover, if you know this is a job that will request them from you–just have a separate page typed up and ready to show to the employer.  In fact, even if you aren’t directly asked for references, you can never go wrong by attaching a list of references to the back of your resume.  Regardless of whether the employer will make use of the list or not, having it ready and available shows forethought and thoroughness, and leaves a very good impression in your favor as a professional and serious job-seeker.  Your list need not be more than 3 up-to-date professional references (ideally past employers who are at least likely to still remember you by name); give their names, their positions, their relation to you, and their contact information (just a phone number will suffice in most cases).  That’s it as far as references go; real straightforward, no need to overthink this any more than anything else on your resume.

Keep in mind that nothing written here is the definitive word on resume writing, and I’m sure there are several caveats and exceptions I failed to mention simply for the sake of not wanting to take up more of your valuable time (or mine).  And although the above information is tailored to an old-school typed and printed resume format, it can just as easily apply to any other style of resume submission, and even serve as a rundown of how to organize the sections on one’s LinkedIn profile.  Though you should always, without question, have an actual typed resume on hand; if employers just wanted links to your social media, they’d contact the third parties Facebook is selling all your personal data to for profit–Heyoo!  What?  Too soon?  Or have we all just moved on from that unethical bit of privacy invasion?

All right then, carry on, and good luck job hunting.

The Reason Stories are Written in Past Tense


One of the first things any decent creative writing class will teach an aspiring author is the importance of maintaining consistency throughout the text, and it’s something I’ve definitely mentioned before on this blog.  Although this often refers to plot consistency, grammatical consistency (and functional consistency) is an equally crucial part of creating a legible narrative.

Anyone who reads fiction regularly will have noticed that the overwhelming majority of these stories are written in past tense; e.g. “It was the best of times…”, “She figured it was all over…”, “He loved her like no other, but also saw no way to show it…” etc.  But why is this?  What makes a past tense narrative more grammatically correct than a present or future tense structure?  To answer that question, one needs to first dispel the phrasing of it.  There is nothing inherently more grammatically correct about using past tense, as opposed to any other tense, as long as the narrative voice remains consistent in its use throughout the story (or there is a damn good reason why it isn’t).  Hence, the reason past tense is seen as the default has less to do with grammar, and more to do with functionality.  It shouldn’t be forgotten that writers are, by definition, also readers, meaning that they carry with them decades’ worth of literary conditioning, just like the audience they are trying to reach.  Most of the books a writer has read will have been written in past tense, and like every other reader, it is understandable if this structure naturally seeps into one’s own writing.  One also shouldn’t underestimate the sheer amount of concentration it takes to catch inconsistencies when attempting experimental works that run counter to the norm; the potential for inconsistent prose goes up substantially when trying to do something out of the ordinary.  Therefore, defaulting to the more common past tense narrative is an easy way to ensure consistency throughout one’s plot, since it will feel the most natural, for writers and readers alike.

Alternatively, you rarely see whole plot narratives written in future tense; e.g.  “I will go see her tomorrow, after which we’ll talk…”, “They are going to take care of it later…” etc.  This sort of writing is reserved more for character dialogue, as it is more in line with casual conversation (not to mention people’s internal dialogues), wherein the discourse centers on planned actions (i.e. things yet to be done, spoken about by characters whose overall knowledge of events is limited).  In contrast, narrator voices—whether they are written in first person or third person; whether they are limited or omniscient—are instinctively read by the audience from a bird’s eye view perspective, detailing the happenings to them as an observer of events.  It wouldn’t be impossible to write a whole narrative in the future tense, but you run the risk of frustrating your readers because, in many ways, such phrasing stands so deeply in contrast with how most of us are attuned to differentiating between plot narrative and character dialogue that it may have the unfortunate effect of making the story too confusing and tiresome for most to bother following along with to the end.  And while challenging readers through provocative prose can be laudable, giving them a headache through cumbersome verb usage is anything but useful.

Lastly, there is present tense; e.g.  “She creates the world as she sees it…”, “He says what he thinks, and he thinks what he knows…”  It’s a very impactful form of narrative, which immediately frames the plot in action mode—things are happening, and they are happening right freaking now!  It’s unique, and in the hands of a skilled writer, it has the potential to serve as a creative alternative to its more common past tense counterpart.  On the other hand, in unseasoned hands, it also has the potential to wear out the reader; think sensory overload brought about by too much intensity.  There is a reason most stories follow the general setup of: introduction -> rising action -> climax -> falling action -> conclusion/resolution.  If the whole story is written in a voice that denotes action throughout these distinct steps, then the writer will have to work doubly hard to make the impact of the climax (and the rising action that leads up to it) stand out to the reader.  I’m not saying that it’s an impossible task to accomplish, but it is harder, and takes considerable talent to get right.

I have outlined why the prevalence of past tense narratives in fiction isn’t really an issue of grammar, but one of ease of writing and of what readers are simply accustomed to.  In an obvious way, the situation is very much a Catch-22:  Readers are used to reading past tense narratives because most authors write in past tense; authors write in past tense because most readers are used to reading it.  And a prevailing orthodoxy is therefore sustained.  Now, I will never say not to attempt a heterodox approach that deviates from the norm, on the grounds that one never knows for certain what works until it’s tried (every new situation carries with it the prospect for new discovery, and all that).  I simply want to make the point that no reader expects you to re-invent the written word to be seen as a great storyteller, and it’s perfectly fine to stick with what has been tried and tested to work, and what will make it easier for you to write your story, rather than fret over structural details when you really don’t have to.

Character Backgrounds: The Dilemma of Sharing Too Little, or Too Much


When writing a story, there exists a natural disconnect between how the author interprets the plot, and how the audience reads it.  The obvious reason for this is that the author has the (mis)fortune of knowing the intended background details of the events and characters before they ever make their way onto the page, in ways that are not readily available to the reader.  The task for any decent writer is to convey these details in a way that makes for a compelling narrative, one that will be neither overbearing for the reader, nor leave them stranded in the dark regarding important plot/character developments.

Spotting moments when an author is being too reserved with details is fairly easy.  Anytime you’ve come across a part of a story or book that left you wondering, “Wait, who is this, and why are they suddenly in the middle of everything?  Where the hell did they come from?” you were essentially exposed to underdeveloped writing.  Be sure not to misunderstand what I’m saying, though.  Introducing new characters, and strategically withholding information about them, can be an effective writing technique to invigorate interest back into the plot, as a little mystery can go a long way in building much needed suspense in an otherwise stale story.

As an example, imagine a love story between two characters named Tom and Jill.  For over a hundred pages, you follow along as Tom sees Jill, falls in love with her, and tries desperately to impress her.  Jill is originally aloof regarding Tom’s advances, but slowly she starts to feel flattered by his affection, and agrees to give him a chance.  Things are going great for the two love birds for several more pages, then—just as the plot can’t bear the weight of any more Hallmark moment clichés—a sudden wrench is thrown into the mix:

Nothing could tear Tom’s gaze away from Jill’s eyes.  The shape of them, their softness as she smiled, even the wrinkles that formed at the corners of her eyelids as she laughed, all worked to keep him in a hypnotic trance from which he could not—would not—escape.  Or so he thought.  Because the moment Susan Gallaghan walked by them, he felt his eyes wander from his beloved Jill’s enchanting eyes, to the rhythmic steps that paced along in front of him.

Let’s assume this is the first time this Susan character is ever mentioned in the plot.  The first thoughts any reader is going to have will be along the lines of:  “Who the hell is this Susan person?”, “Is she someone new to Tom?”, “Is she an old flame?”, “Is she a girl from his youth that he secretly pined after?”, “Is Tom actually a serial killer, and Susan his next victim?”  At this point, we, the audience, have no clue.  The fact that we have no clue is what makes it a brilliant writer’s trick, because now you are invested in the dilemma and subsequent resolution that is sure to follow.

But what if the drama never follows the way you expect it to?  While the sudden introduction of this new character works to spark the reader’s interest in the development of the story, it can only carry the audience’s engagement so far.  If Susan keeps popping up in the same way, with the same vague acknowledgment from the established characters, the reader’s interest will quickly turn to frustration, and ultimately to disinterest.  You have to give the audience a reason as to why the things happening on the page are worth mentioning to begin with, and in the case of character development, this means divulging at the very least some connection between secondary plot-device characters (like Susan above) and the main protagonists.

Divulging a character’s background effectively in a narrative is not as easy as it may sound.  A lot of times it can come across as bloated, a poor attempt to force-feed too much information into the plot, just for the sake of having the reader know why this person exists in the story.

Imagine if the mysterious introduction of Susan above were followed up with:

Tom immediately recognized Susan as his high school sweetheart, to whom he had lost his virginity on prom night.  The two of them went their separate ways soon after graduation, but Tom never quite got over his love for Susan.  Susan, for her part, had little trouble moving on from Tom.  So much so, that she moved away to study and travel abroad.  As she traveled the world, she gained an appreciation for herself, and how she didn’t need to define her identity by any one person who happened to be in her life.  Unlike Tom, Susan wasn’t validated by whether someone loved her; she felt complete knowing that she loved herself.  Even now, as she walked past him with all the confidence of a young woman who intended to live her life to the fullest, Tom’s heart throbbed once again for the one that got away.  Though Susan didn’t recognize Tom, the two of them would be seeing a lot more of each other from here on out, since she was set to begin a new position in the very firm Tom worked at.

The problem here isn’t that this information is being revealed within the plot; it’s that there is no reason to have it laid out all at once, let alone right after the mysteriousness regarding Susan’s presence was so brilliantly executed.  All of this can be revealed through the course of several pages, if not several chapters.  Again, by all means give the necessary background to establish a character, but there is no need to lump it all together in one spot, because then your narrative will inevitably end up repeating itself again and again, every single time the information needs to be revisited.  Eventually, Tom and Susan will have a confrontation, where hints can be dropped regarding their past intimacy.  Rather than state that Susan is a confident and independent person, why not show it by the way she behaves and interacts with her surroundings and the other characters?  Pretty much everything stated in that one paragraph can be dispersed throughout the story piecemeal, without killing the suspense by revealing it all in one fell swoop (especially right after the mystery character is introduced).

For a real literary example of an author doing a superb job of balancing the enigma of his characters with subtle background revelations throughout the plot, I would point to the characters of Mr. Croup and Mr. Vandemar in Neil Gaiman’s Neverwhere.  Even before the book’s otherworldly narrative is revealed, these two characters’ peculiar manners of dress and speech foreshadow a fantastical nature to their persons (and, by extension, the plot itself).  All of which is subtly explored in what essentially amounts to breadcrumbs’ worth of information through the course of a 300+ page story.  And at the end of it all, the mystery behind who/what Mr. Croup and Mr. Vandemar really are is never fully revealed, precisely because there is no reason for the story to do so.

Ultimately, it’s up to every writer to decide how much background exposition is too much for her/his characters, and how much is just enough not to stifle character and plot development.  That happy balance will largely depend on the sort of story you are trying to tell, and it may take several revisions to get it within the range you are aiming for.  But, while it’s not always straightforward in either case, being able to spot the problem in other written works means you are more than capable of applying that critical eye to your own.  Like a lot of writing advice, it simply starts with reading your writings not as an author, but as a reader, first and foremost.

Mindlessly Mindful: How Meditation Stifled my Creativity


Over the course of the last few years, the practice of mindfulness meditation has sparked a great deal of interest in private and public discourse.  For many, this discourse takes on the form of a full-scale spiritual reawakening in their lives–the rationale being to look back to what some would call time-tested wisdom as a guide for navigating modern life.  Still others, who might belong to a more pragmatic mindset, adopt meditation into their daily routine less to reach an esoteric sense of enlightenment, and more to find a means of focus for the cluttered thoughts they feel are clogging up their minds.

My own interest in mindfulness meditation began sometime in late-2016, and stemmed from a general curiosity regarding the positive results being attested to by its practitioners–ranging across all sorts of different personalities, including (but not limited to) self-appointed gurus, public intellectuals, corporate bosses, average laborers, and everyone in between.  What piqued my curiosity most was how the underlying message from this diverse group of people was a resounding agreement that: “Yes, indeed, meditation works!”  The full definition of how it “works!” and what it means for it to “work!” often vary as much as the individual backgrounds of meditation practitioners; however, there are some very clear commonalities among the positive testimonials.

A greater sense of focus is one recurring benefit attested to by mindfulness meditators.  Specifically, a greater awareness and appreciation of the details encompassing the moment one happens to be occupying, as well as the multitude of thoughts that accompany it.  Another common theme among meditation circles is how the practice leads one to confront the (supposedly false) preconceptions surrounding the fundamental concept of the Self, and the illusory nature by which we think of our Self in relation to both our internal dialogue and the external world our Self interacts with (whether it is even coherent to think of the Self as an independent agent relating to the world, rather than another component in an endless string of interacting effects that make up existence).

I spent weeks researching the practice and philosophy of mindfulness meditation to get a better understanding of it, until finally, on January 1st, 2017, I decided to put theory into practice and devote a significant portion of my free time to gaining some firsthand experience of what it truly means to incorporate meditation into my daily life.  Recently, on January 1st, 2019, this personal experiment of mine came to a full stop.

When I first set out on this personal journey I expected the possible results to go one of two ways:  1.  A net positive, wherein I would enjoy the benefits of reaching some semblance of self-awareness, self-discovery, and hopefully even personal growth (like so many others testified to having experienced through meditation).  2.  A net neutral, the results of which would be no more dire than having wasted some portion of my time on a fruitless exercise that offered no real benefits, but ultimately no harm.

Having now gone through it, I can’t say that what I experienced was neutral, since the practice definitely affected me on more than one level.  Unfortunately, from my perspective, the effects I felt leaned more towards a net negative as a whole; so much so that I decided to give up meditating completely, as something that may simply not be a suitable practice for someone like me.

Once I ceased meditating, a subsequent curiosity came over me: I wanted to find out if there were others who have had a similar (negative) experience to my own while practicing mindfulness meditation, but surprisingly enough the answer to that question seems to be a resounding, “No.”

I came across a few blog posts here and there of people saying they weren’t completely satisfied with what mindfulness meditation offered, or that it wasn’t what they expected, but they were still overall happy to have had the experience (even if they decided it wasn’t the right fit for them).  I also finally took the time to research the medical and psychological data regarding the long-term benefits of meditation (or, more aptly, the lack thereof), which I had intentionally avoided while engaging in the practice, so as not to be prematurely biased against it.  Yet, other than a general confirmation that little to no empirical evidence exists to validate its self-proclaimed benefits–possibly making meditation more comparable to a placebo effect than genuine self-awareness–I still didn’t come across reports that matched anything close to my personal (negative) experience.

I’m not going to go into deep detail regarding the exact nature of the mindfulness regimen I followed during this two-year period; partly because I’d rather be guilty of leaving details ambiguous than have every meditating Tom, Dick, and Mary who fancies her/himself a guru lecture me about how “real” meditation ought to be done.  If that is the sort of objection coming to mind as you read this, I am unfortunately failing to get the crux of my point across.

It’s not that I meditated and got no results from it, or that my results were drastically different from what I’ve read, heard, and observed others state about their own experiences while meditating.  In fact, my experiences were more or less in line with what the typical person claims to go through while practicing mindfulness exercises.  My problem with meditation–and mindfulness meditation, specifically–is what I view to be the negative impact it had on my creative wherewithal.

What exactly do I mean by this? Allow me to explain.

A heightened awareness of the current moment is one of the major benefits promoted in favor of meditation.  While I see how meditating might help those who have a habit of wearing their emotions on their sleeves–or maybe those who suffer from impulsive decision-making in general–I’m someone who came into meditation already relatively calm and collected, possessing a decent set of stress management skills to begin with.  Furthermore, I’m someone who relies on having to construct imaginary plots, involving imaginary people, and projecting them into contrived scenarios that could resolve themselves any number of ways I see fit to write.  Seeing that creative writing is generally penned in the past tense, about things that have yet to be imagined, involving situations which do not exist, I never expected mindfulness meditation to offer much in the way of benefits to this part of my life.  But I also wasn’t prepared for how downright harmful it could be to it, either.

Prior to incorporating meditation into my daily routine, the feeling that gave me satisfaction when I went to bed at the end of a day was having sat at my desk, passionately typing away at my laptop’s keyboard, immersed in the world I was creating long enough to lose my sense of self.  And, slowly but surely, I felt this passion begin to erode the more progress I made with my meditative practice.  (It subsequently returned when I stopped meditating altogether.)

Sure, I got better at focusing on my breathing, as well as the various physical sensations that made up my moment-to-moment experiences, which in turn made me more aware not just of my thoughts, but of the process by which these thoughts seemed to spontaneously manifest in my conscious monologue.  But all of this came at a cost.  Being more aware of my thoughts–moreover, being conscious of the act of thinking–made it harder to lose myself within those thoughts when I needed to weave together thoughtful writing.

And it wasn’t just writing.  Other creative outlets like painting became harder, too, because a large part of my painting process revolves around being able to foresee and focus on what shapes and images can be created (rather than what is present in the moment), and what method or color scheme will illustrate them best.  Being aware of the moment, and of the act of what I was doing (in this case, sitting in a chair while painting), offered no benefit to the act itself, and ironically often served to distract from letting my thoughts roam towards conjuring up the inspiration needed to complete the project.

Yes, inspiration.  That is the key ingredient I felt slipping away the deeper I delved into meditation.  Ironically, as a result I found myself feeling more frustrated and stressed when I sat down to do my work; traits I largely did not possess (at least not to the level I went on to develop) before taking up meditation.

Like a lot of bad side effects, it took time for the signs to come to the surface, at which point meditation had already become part of my daily routine (and, really, routines can be so hard to break once they’ve cemented into our daily lives).  So I carried forward through all of 2017, and the first half of 2018, somewhat oblivious to the source of my depleting creative spark.  Then, last summer, I wrote a post on this blog titled The Pitfalls of Self-Help, after which I started to consider the possibility that all the positive testimonials I had heard in praise of mindfulness (which got me interested in it) were just as vacuous as the testimonials of people following any other self-help/self-awareness fad.

I started to seek out other mindfulness practitioners to see what insights they had to share, and was largely met with not-fully-thought-through regurgitations from self-proclaimed meditation gurus, whose wisdom sounded more like buzzwordy slogans from the reject bin of yesterday’s fortune cookie stash.

One particular conversation proved most enlightening.  The gist of it went something like:

Meditator:  “How you perceive of the Self is an illusion.”

Me:  “I perceive of my Self as a collection of atoms that make up the matter that is me; occupying a specific space in time that only I occupy.  In what sense is this an illusion?”

Meditator: “That’s not how people define the Self.  When people talk about a Self, they speak of it in terms of a separate entity that’s observing their doings, instead of being a part of it.  That’s an illusion.”

Me:  “But I just told you that doesn’t apply to how I, personally, conceive of the Self; as it pertains to me, or anyone else.”

Meditator:  “It does.  You’re trying to intellectually rationalize your perception.  In reality, you’re just not being honest with how you really perceive your Self, in everyday practice.”

I’m fine with accepting that I have blind spots regarding my own conscious and subconscious awareness.  What I take issue with is being told I have to accept the idea that someone else–with absolutely no firsthand access to my thoughts or perceptions–has figured out where all these blind spots are, how they pertain to my experiences, and how it all conveniently fits into her/his own preconceived generalizations and worldview.  In other words, feel free to tell me that I’m wrong in my opinion, but don’t condescendingly tell me you know what I’m really thinking, in order to make me and my thoughts conform to your philosophy.  That’s not awareness; that’s just bullshit.  And I hate to say it, but a lot of meditation seems to run very close to this level of discourse.

In the last half of 2018, as I drifted more and more away from seeing any value in keeping meditation in my life, I was given two further explanations by meditation practitioners for my lack of positive results:  1.  I’m not spiritual enough, and 2. I’m too straight-edge.

I’ll freely grant the truth of the first explanation as a strong possibility.  Even with the most elastic definition of the word “spiritual,” I can honestly say that it does not, and cannot, apply to me.  While I know there are efforts made to promote a secular form of spirituality, I still feel the need to point out that I have never believed in the supernatural, nor the mystical, and the values and passions I have in life I do not equate or think of in any deeper “spiritual” terms.  The things that give my life meaning and joy are simply the things that give my life meaning and joy, and I see no reason why I need to lump on belabored spiritual terminologies that do little to further elucidate what is innately a tautological experience for everybody.  Apparently, this type of thinking doesn’t sit well with the sort of people who claim to get concrete benefits out of meditation.  In such circles, simply saying you appreciate any aspect of life, and your roles and perceptions in it, is an affirmation of your spirituality.  Which is fine, but to me that just redefines spiritual so broadly that it becomes meaningless as a term.  I’m not invested enough in the semantics behind it all to debate the issue, but it’s safe to say that I don’t personally consider myself to be a spiritual person (regardless of whether others want to see me as such).

As to the second point, concerning my lifestyle choices: on more than one occasion, it was suggested to me that meditation can only truly be of benefit when performed under the influence of psychedelics.  I have no way of knowing if this is true or not, as I do not partake in recreational drug use (though I support anyone else’s right to do so).  But I have to ask, how do you know that what you perceive to be a greater self-awareness while high on psychedelics isn’t just a drug-induced delusion that has no bearing on reality as it actually is?  If being on drugs, and then meditating, is the key to opening the door to a greater truth about life, how come no one has ever emerged from these drug-fueled meditative states with any tangible, verifiable realizations about the world?

How come, in all the centuries of taking mushrooms and meditating in caves, none of these yogis and gurus came out of the experience with something like “E=mc^2”, or the formula for penicillin, or even something as basic as “hey guys, guess what, the world is actually round” (in fact, there is a growing following of people online, at least some of whom I imagine are very prone to getting baked, who argue in favor of a flat earth)?  It’s always some esoteric and vague platitude, like “the Self is an illusion” (as long as both “Self” and “illusion” are defined in very particular terms) or “states of happiness and suffering both depend on consciousness to be realized” (no shit, you’re telling me people who are brain dead can’t feel happy or sad?–Brilliant!).  So, I must ask, what exactly is the point of a greater awareness, if said awareness has nothing tangible to say about the most fundamental, verifiable facts regarding the reality we inhabit?

And, look, perhaps there are those for whom such musings and conversations are of great value, and whose personal experiences have been greatly enriched by them.  If meditation has brought these people happiness, and positively impacted their personal growth as individuals, I would never argue to take it away from them on the basis that it wasn’t my cup of tea.  We’re all different, and what works for you may not work for me; that is one underlying message here.

The other reason for writing this post is to speak to anyone who may have had a similar experience with meditation to my own, and also struggled to find others voicing said experience.  Although I didn’t find much in the way of negative testimony regarding mindfulness meditation, I have a hard time believing that there isn’t someone–at least one person–in the world who, like myself, has tried this out and found it to be more of a hindrance in her/his life than a benefit.  To this person(s) I’d like to say: there’s no point in struggling forward on a futile quest, and there’s no shame in walking away from something that is doing you no good.  There are many different ways to experience life and achieve personal fulfillment, and just because something is presented as a cure-all for what ails you doesn’t mean that there aren’t better alternatives out there more suitable for you.

And if you think everything I’ve written is unwarranted drivel, let me know, and I’ll be sure to meditate on your concerns post haste.

Understanding Perspective in Writing


Writers so easily get bogged down in what one could call the nuts and bolts of narrating a story–plot, setting, character development, etc.–that it’s easy to overlook that narration itself is the very underpinning that defines the perspective by which a story is revealed to the reader.

Generally, most narratives are written from either a first-person or third-person perspective.  Second-person exists, too, but is not often used as an exclusive narrative perspective, on account of how hard it is to construct a long-form narrative with it (not impossible, but definitely hard).  As an example, in many blog posts [including this one] I’ll often utilize the rhetorical second-person “you” in reference to the hypothetical reader scrolling through the text, but when doing so I usually don’t take long to resort to the first-person “I” in order to keep the prose coherent.  By and large, if you are writing some kind of narrative, especially in the realm of fiction, you’ll probably be doing it in first-person or third-person.

Regular readers of KR know that I hate all forms of jargon.  Philosophical, political, literary–all of them; if you’re someone who always feels the need to express yourself using pretentious ten-dollar words and terms in lieu of the more straightforward ones available, I will always assume that you are probably someone who doesn’t know what s/he is talking about.  With that in mind, if you are not 100% sure about all these terms, let’s simplify it by saying that if your story’s narrator speaks using “I,” “me,” and “we,” your story is written in first-person.  The strength of writing in first-person comes from the ease with which the reader gets to empathize with the narrator, and in turn, with the narrative of the story being told.

“Tom went to the store, bought gum, and then shot himself with his revolver,” can be emotionally gripping, but not as emotionally gripping as, “I went to the store, bought gum, and then shot myself with my revolver,” because now you are not just being asked to read as a casual observer, but as the main character him/herself.  This is why first-person narratives are easier to immerse oneself into, as the prose has less of a descriptive barrier between narrator and reader, making it easier to become invested in the plot’s dilemmas and character arcs.

However, writing in first-person also has its drawbacks.  The perspective is by definition restricted to only one point-of-view.  Unless your character is some sort of clairvoyant deity, the narrative will be limited to whatever s/he sees and describes (and even if your character is an all-knowing god, a story written in first-person is still only told through one viewpoint, hence it’s still restricted).  Most stories have more than one character present; hence it’s not hard to see the issues that arise when you can only ever truly understand how one character is feeling, and have to rely on this one perspective to deduce the thoughts and intentions of all the other characters.

As an example, let’s say that the narrator character is in a conflict with side characters A and B.  What are character A and B’s thoughts on this conflict?  You don’t know.  You know what the narrator character thinks their thoughts might be, and that’s all.  This isn’t a problem in and of itself.  It can be used to create a wonderful sense of tension and suspense.  But it also means that a writer has to keep perspective consistency in mind, so that the plot doesn’t violate the logic of the first-person perspective that’s been set up so far.  This means that if side characters A and B had a conversation somewhere far away from the narrator character, the narrative has to be worked so that the narrator character somehow gets wind of it if it’s going to be mentioned in the plot.  The narrator character can’t just mention it mid-conversation, because we–as the readers who have had direct purview of the narrator’s perspective–know that that’s not knowledge that could have been available to her/him.  It breaks internal logic, and it’s rhetorically lazy.

Another glaring handicap with first-person narratives is that everything in the setting depends on the description given by the narrator character.  This means that if the narrator is presented as someone not keen on being too observant or articulate, it will seem weird to have her/him suddenly break into elaborately detailed descriptions of everything happening around her/him just so the reader can see what’s being looked at.  It can also be distracting, and work to undercut the immersion benefits that made the first-person narrative appealing to begin with.

The ready alternative is to write in the third-person, and many writers’ workshops will tell you to do just that.  Third-person allows you to separate the narrator’s voice from the characters in your story.  This means that things like character actions and appearance, and setting descriptions, are not dependent on any one character’s observations.  They are instead voiced by an impartial, non-participatory “third person” giving all the details of the narrative’s happenings.  The obvious benefit of writing in the third-person is that it allows the writer to craft a multi-perspective plot that includes the inner thoughts of any character in the story, not just one narrator character.  Although a third-person narrative can have the effect of creating a buffer between the reader and a story’s protagonist (in contrast to how a first-person perspective can work to merge reader and character into one unified voice), it also gives the writer a greater sense of control over the details of the narrative, as well as a greater sense of freedom when it comes to how these details are dispensed to the reader.

The major setback to writing in a third-person perspective is the misstep of not understanding that the narrative comes in two very distinct forms, which for the sake of consistency should not be confused throughout the plot.

The first form is what is called third-person-limited.  The non-participatory narrator uses pronouns like “he,” “she,” and “they” (as opposed to the first-person “I,” “me,” and “we”), and will give descriptions from the aforementioned impartial point-of-view.  But, as the name implies, a third-person-limited perspective has its literary constraints.  Limited implies that while the narrative will give descriptive details to the reader independent of any one character’s subjective thoughts, its narrative scope is limited to the details of usually one main character, and the details shared will not step outside the purview of what is available to this main character.

If you’re thinking that this sounds a lot like a first-person perspective with different pronoun usage, you are both right and wrong.  Similarities between the two are clearly present, but unlike a first-person perspective, third-person-limited does allow the narrative to explore the inner thoughts and motivations of the secondary characters, because they are not being described through the main character’s subjective perspective.  The limitation is that the secondary characters have to be in some sort of interaction or connection with the main character.  Of course, it is also possible to avoid being tied down to one and only one character, by re-centering on a different main character throughout the different scenes that make up the plot.  One just has to be careful not to get confused about which character is currently occupying this role (i.e. if character A is the main character in Scene 1, and Scene 2 switches to character B as the focal point, the third-person-limited narrative in Scene 2 can’t suddenly start referencing details revealed in Scene 1, because its point of focus, character B, will as of this point be ignorant of said details–even simply stating in the narrative, “Character B is ignorant of this fact revealed to Character A,” is a violation of the internal logic of a third-person-limited perspective).

On the opposite side of all of this stands the third-person-omniscient perspective.  For a writer, this perspective allows for the greatest amount of narrative freedom, in that you are not chained to the thoughts or whereabouts of any one character.  Think of the third-person-omniscient perspective as the god’s eye view to third-person-limited’s bird’s eye view of storytelling.  Want to explore multiple characters’ thoughts and feelings, without needing to relate it back to any given main character’s role within the scene?  No problem.  Want to jump around between character perspectives, and reference back to the reader things that only they (as the audience) are aware of within the greater plot?  Your only limitation is your creativity here.  However (oh, come now, you knew it was coming), it is important to keep in mind that too much freedom within prose can also very easily tire out a reader.  When you present multiple viewpoints, it might become harder for readers to bond with any characters (let alone the intended main protagonists of the story), or get invested in the dilemmas and outcomes that befall any of them.  In other words, too much information can create perspective fatigue, which is why even a narrative written from a third-person-omniscient perspective will often self-limit when to utilize its omniscience.

I have spent some time here going over the strengths and drawbacks of the different narrative perspectives available to writers, not in order to argue for using one form over the other, but simply to give an overview of why someone might wish to choose one over the other (depending on what sort of story is being written).  By far, the only real thing I am arguing for in this post is the importance of consistency in writing.  Meaning that whatever perspective you choose for your story’s narrative, you have to stick with it; otherwise you are setting yourself up for a grueling writing experience, and increasing the likelihood of the final draft being a frustrating mess to read (as much as it will be one to write).

It is perfectly fine to start out with one perspective, and then decide that the story is better served if written from a different one, but when faced with such a case the correct action is to take the time to go back to the beginning and rewrite everything to match the narrative that now better fits the story.

Consistency. Consistency. Consistency.  That is the only true lesson here.

The Muse, She Calls at Night


Depending on whom you ask, the severity of what it means to have writer’s block ranges from a minor annoyance to an anxiety-inducing migraine.  Everyone experiences a bit of writer’s block now and again.  Often it takes the form of not knowing how to verbalize going from Point A to Point B in prose; or, at least, not knowing how to write it seamlessly enough to count as decent writing.  In these cases, it can be something as simple as finding the ideal word or phrase that turns it all around to unclog the ol’ writer’s pipelines.  Other times, just the act of persistent writing (followed by heavy editing) is enough to help get the creative juices flowing back onto the page.

As is to be expected, the people who feel the most emotionally committed to what they are trying to write tend to feel the most distraught when their creativity experiences a slowdown, or has come to a complete halt altogether.  If you find yourself in a situation like this, then you are fortunate enough that there is plenty of advice out there for you.  Reading more (both related and unrelated works) to inspire your own writing is one of them.  As is the aforementioned idea to persevere through the block by sheer willpower, and keep typing away until something halfway decent starts manifesting itself.  Exercise, eating well-balanced meals, and getting enough sleep are probably somewhere on the list, too.  Someone once told me that it’s also worthwhile to try stepping away from one’s writing entirely to cure writer’s block.  Although I’m sure this might work for some, I’ve also seen it have the opposite effect of causing writers to lose the motivation to go back to an unfinished work the more time they spend away from it.

In this sea of helpful remedies for writer’s block, I would like to take a moment to share what helps me personally ward off this dreaded ailment.  It’s more of a writing guideline–or routine–that I have found to be the most conducive to getting me where I need to be when confronted with the heavy hurdle of staring at a blank page.  And it can be plainly stated as:

Write at night, and edit by day. 

For me, there’s just something about writing in those last few hours before bedtime that gets my creativity firing at full capacity.  Maybe it’s the fatigue of the day, where my mind has already spent several hours going through a few rough drafts long before I ever started to write a word down.  Or perhaps it’s a combination of the still of the night and the dreamlike state of slumber already taking hold of my senses, steering my imagination where it needs to go.  I don’t really know what it is, but for me the writing muse comes at night.

Now, I also added the bit about editing by day, which shouldn’t be ignored.  While I might feel most inspired to write at night, I’m also more prone to making avoidable grammatical errors when I’m already drifting off to sleep.  This is why, after I hit save and turn in for the night, I’ll spend the next day (or two) going through what I wrote to fix spelling mistakes and cluttered diction, or to revise anything that looked decent when first written, but in the light of day reads overworked, or off in some other way.

Is this an obsessive-compulsive routine that needs to be followed to the letter for me to be able to write anything?  Of course not!  Plenty of things get written and edited by day, too, within a few short hours, with no creative hindrance whatsoever.  Just as there are nights when the muse decides to turn in early and doesn’t bother to come at all for one project, and barely manages to phone it in for another.  However, outliers shouldn’t be used to negate a general trend.

I will freely admit, though, that I have always been somewhat of a night owl, laced with infrequent bouts of insomnia.  Hence, it’s possible that I just happen to be the personality type for whom a habit of nightly writing comes easiest, and you might not be.  But if you are struggling with writer’s block, and none of the other remedies have offered you much relief, do feel welcome to try my personal guideline for yourself.  Take the last two hours or so before your normal bedtime (no need to force wakefulness past your usual comfort level), and see if it helps unclog that cerebral blockage.  Just be prepared to possibly have to edit and revise a few things the next day, like a motherfucker!

Forcing the Narrative

So you’ve decided to write a story.  Before you begin, you put together a pretty coherent outline.  You have your protagonists and antagonists all clearly planned out.  You might not know exactly how long it will be, or all the minor details that will pull the whole plot together, but if there’s one thing you do know, it’s exactly how the major parts of this story will progress from beginning, to climax, to finish.  There’s just one itsy-bitsy problem–your characters aren’t behaving like they should.

It’s hard to pinpoint what it is, really.  The dialogue is crisp and clean, without too many overly excessive and cumbersome adjectives repetitively cluttering up the prose.  All the different personalities are well laid out, and totally not cliché or one-dimensional.  There’s a deep subtext noticeable throughout the work, though not of the rambling variant [yeah, suck on that, David Foster Wallace].  But there’s something that just is not working, and it’s driving you crazy trying to figure out why your narrative is not behaving as it should–as you so clearly planned it out from beginning, to middle, to end.

What the hell is going on?!

Well, you’re in luck, because I may just have the solution to your problem.  The problem probably isn’t that the story you’ve set out to write is unmanageable, or that the characters you’re eager to create aren’t as capable of being the greatest heroes and villains in fiction as you’ve imagined them to be.  More than likely, the problem is you.  By which I mean, rather than letting your story unfold, and your characters respond and adapt to their surroundings, you have allowed yourself to get stuck in one of the easiest pitfalls an author can fall into: you have forced the narrative.

Forcing the narrative can happen in many different ways, but the most common occurs when authors stubbornly refuse to follow the natural progression of the story they have set out to create, for no other reason than that doing so might deviate from the original blueprint they arbitrarily committed themselves to in their minds.  And, of course, character progression is often the first casualty of this stubbornness, left to stagnate as a result.

Say, for example, that you’re writing a story with two characters that you know you intend to have fall in love halfway through the plot.  You introduce them separately to the reader, so each can have a distinct personality that your potential audience will relate to.  They spend all these pages developing identities that are unique and self-sustaining (as they very well should be), but the moment you finally have them interact with each other–the moment the entirety of your plot hitherto was supposed to be leading up to–and…there’s nothing.

Where you thought the dialogue would flow smoothly between these two people you crafted to be perfect for one another, all their exchanges instead sound too contrived to be authentic.  You can force them to say all the things you think are necessary to convey the message that they are love-bound soulmates, but every time you do, everything that comes out of their mouths starts to read like a rejected script for a corny made-for-TV movie of the week.

So what gives?  Are you such a lousy writer that you can’t even get a genuine romance plot right?  Maybe…or maybe you’re working too hard to wedge a square-peg trope into a heart-shaped prose.  By which I mean, maybe the characters seemed perfect for each other before you put pen to paper…err…I mean, fingertip to keyboard?…whatever, the point I’m making is that a plot idea can seem perfect before you set out to write it, but once you get going it can become downright impossible to stay true to said idea without sacrificing the integrity of the narrative you have created up to that point.

As already mentioned, this dilemma can show itself in the most basic of details.  Including the very issue of whose story it’s going to be.  You might have a main character in mind from the start, but the more time you spend with him, the more writing intriguing dialogue for him starts to feel like a strenuous task taking up far more of your creative concentration than it should.  Perhaps you even find yourself preferring to spend time with secondary characters that have taken on more interesting lives compared to your once-great-now-bland protagonist.

“Well, what’s to be done in this case?”  Good question, hypothetical reader!  When you are in the thick of this frustrating writer’s conundrum, it’s easy to miss the simplest of solutions staring right back into your sleep-deprived, bloodshot eyes.  That is to say:  Screw your preliminary outline.  Tear up your rough draft notes (it’s called a rough draft for a reason, after all).  Go with what your instincts tell you as a reader first, and ignore the self-righteous indignation of your inner writer unwilling to deviate from an unworkable premise.

Are two characters not hitting it off as well as you thought they would?  Fine, try having them hate each other instead.  Or try pairing them up with side characters that showed more promise in the plot, and see where that goes.  Is your main character too wooden to lead the story the way you hoped?  Then why not sideline him, and shift the perspective onto a different character whose personality and dialogue carry your narrative forward with so much more ease than you ever thought possible?

All these options are readily available to you because, no matter what, it is your world–it can only exist as you wish it to.  But you need to trust your instincts, not just as a writer, but also as a reader, on what makes for compelling storytelling.  And you are allowed to change your mind about the details of the happenings in your fictional world, if these changes help bring about the greater narrative you set out to breathe life into.  Treat the initial bits of ideas that inspired you to start on your journey as just that–a cursory launching point to something better.  Nothing you write down–be it at the beginning, the middle, or the end–is sacred scripture.  It is not absolute, or inadaptable to the subsequent bursts of creativity that may strike you once you have already begun to feverishly churn out the bulk of your prose.

It’s important to be aware that if something doesn’t feel right about your story to you as you are writing it, it will definitely not seem right to your readers as they are reading it.  Even if they might not know how to articulate what’s so off-putting when they notice belabored prose, the audience can definitely sense when something isn’t working as well as it could be.  And forcing a narrative, in an otherwise great story, is a perfect way to ensure that it won’t be working for anyone; neither you, nor your characters–but, above all else, not the reading public.

The Power of Names

Shakespeare invited us to consider, “What’s in a name?  That which we call a rose, by any other word would smell as sweet.”  The Bard’s musings on the subject notwithstanding, the truth is that names do hold a fair bit of power in forging our perception of other people, as well as ourselves.

If you are a foreign-born individual who goes about in your adopted land of residence with a first name that points clearly to your nation of origin, you immediately know how vital a role a name can play when trying to integrate with the local population (so much so that many foreigners will give in, and change their foreign-sounding names to something more palatable to the culture they aim to assimilate into).  Although few of us will readily admit to it, we are all susceptible to making generalizations about people we come across in our daily lives based on superficial features.  Names are definitely one such feature.  That is not to say that every assumption made about someone based on such features is either wrong or malicious.  It’s not wrong (factually or morally) to deduce that a person with an obviously Asian-sounding name is in some way culturally connected to Asia.  The same goes for a man named Hans Gunterkind most likely being of some kind of Germanic heritage, or Jean-Pierre Neauvoix being French.  So on and so forth.

(It goes without saying that the contemptible part in forming a preconception about someone isn’t the initial preconception itself; it’s what you do with it from there on forward.  If recognizing that you’re about to speak with Chen Huiyin leads you to assume she is probably Asian before seeing her, no sensible person will raise an eyebrow at that assumption.  If, however, you take your preconception further and assume she is in some way personally inferior to someone who isn’t Asian, that’s where we run into issues of bigotry that will rightly be condemned by much of the public at large.)

Issues of what might be called ethnic names aside (are not all names ethnic relative to different cultures, one might be inclined to ask here?), there are naming norms within American culture that occasionally shape our interactions with each other.  When you’re in the middle of everyday America and come across the name Kevin, it is unavoidable that you will imagine a man.  Unless you just happen to know a woman named Kevin, but even then you are likely to chalk her up as a rare anomaly.  What if, over the course of the next three decades, a swarm of new parents decide that Kevin makes for a great name for their baby girls, and the social paradigm shifts so that suddenly you run into more female Kevins than male ones?  Would you easily adjust to the new cultural trend, or still stick to the norm you had been accustomed to, of Kevin being a predominantly male name?  If this sounds like an unlikely scenario, consider how the name Ashley in America changed from mostly male at the start of the 20th Century to predominantly female by the start of the 21st.

Not to belabor a point past my humble reader’s generous patience, but it would feel disingenuous not to touch on my personal experience here.  Growing up in continental Europe as a boy named Sascha/Sasha, the social assumption was that my parents must be bland, unimaginative, and possibly even a tad conservative in their leanings, precisely because boys named Sascha/Sasha are so common there.  At the time, it formed a personal impression of myself as just another average lad going about my business, similar to how I imagine an American youth named Michael or David would feel in contemporary American culture.  When I moved to the U.S. in my early teens, I came to find out that my name was somewhat of a peculiarity to my peers; one that definitely demanded further explanation on my part.  Suddenly, I was no longer merely a random guy with an average-to-boring name; I was a random guy whose androgynous-to-feminine name invited further conversation (occasionally schoolyard taunts, too, but I’m pretty good at deflecting unkind commentary and rolling with the punches, so I bear no grudges from it).

I would argue that your name is the most basic qualifier of your identity, and that people’s reactions to it form a great deal of your learned behavior when interacting with others.  I can honestly say that the change in how people reacted to my name after moving to the U.S.–as opposed to the reaction I received for it back in Europe–did affect how I carry myself and interact with others to some non-trivial extent.  At least in that I know, when I introduce myself to others, I can be sure of two things:  1. I will be pegged as foreign regardless of my citizenship status;  2. I may be asked an awkward follow-up question regarding my name (to which, when I’m feeling lazy, my typical response will be either “My parents were really hoping for a girl, and were surprised when I popped out, dick-swinging and all,” or “I wanted to be able to better relate to women, but Nancy Sunflowerseed sounded too butch, so Sascha had to do”).

Believe it or not, the purpose of this post was not to regale anyone with anecdotes about naming cultures as a clever ruse to sneak in a dick-swinging joke.  It’s to touch on a greater point about forging better writing habits, and being mindful of one’s intended audience’s social palate.  Sooner or later, just about all writers find themselves fretting over picking out the perfect name to convey their characters’ personalities and backgrounds effortlessly to the reader.  And there are definitely right and wrong names one can decide on, for the roundabout reasons stated above.

If you’re writing a story about a street-wise, inner-city black kid, born and bred in the Bronx, who is named Hans Jorgenson Gunterkind, well, you had better be ready to explain how the hell that came to be.  Same if you’re writing a story about a 15th Century samurai named Steven.  While clever names can add exotic intrigue to characters, and piece together unspoken–unwritten?–context about their personal interactions with their environments, they can also needlessly distract the reader if the name isn’t really meant to be a focal point of the narrative.

It’s perfectly fine to be bold and go for something unconventional when you’re crafting your written world, but don’t bend over backwards to convey uniqueness unnecessarily, to the point that it hinders the reader’s ability to become immersed in the narrative.  A story that has five characters named Mike to show the absurd commonality of the name can be witty and fun, or it can end up confusing and frustrating to the reader.  Take a moment to consider how the greater world you have created interacts with this dynamic, and whether it helps or hurts the story you’re setting out to tell.  Reading practicality should not be dispensed with for the sake of creativity; the two should operate together to form a coherent story that can be enjoyably read.

You can’t please everyone, and someone will hate your work no matter what or how you write.  Which is why the starting point for all my writing advice is to be honest with every story’s first reader: its author.  And if, as you put pen to paper (or, more realistically, fingers to keyboard), what seemed like a great name in the first outline becomes harder to work with as the story progresses, then rather than forcing the narrative to conform, there is no shame in revising the basics–character names included.

Suck on that, Shakespeare, is what I’m really trying to say here.

In Defense of Mary Sue

There are two distinct ways in which the term Mary Sue gets used in literary works (as well as in any other fictional medium, really).  The most common usage today is in the context of the perfect protagonist.  This could mean a character with a seemingly limitless aptitude for displaying or learning skills that go well beyond the realm of reason, even within the reality of the fanciful narrative in which s/he exists.

Think of characters described as physically flawless, toward whom all the other characters gravitate, whether the plot necessitates it or not.  Obvious examples are characters brought to life within the pages of fan-fiction, but I would say such writings are somewhat of a given, on account of their being meant as tributes to existing characters; overemphasizing said characters’ attributes might be unavoidable in that genre.  More worthwhile examples of Mary Sues are characters that are actually successful, and one could say well-respected, within literature.

Characters like James Bond and Nancy Drew, in their original literary inceptions, could very easily be argued to fit this description.  James Bond speaks every language of every country he sets foot in, can fight (and always win) in every fighting style he’s confronted with, and can (and will) seduce any woman he desires, because every woman he meets just naturally lusts after him without hesitation.  Likewise, Nancy Drew effortlessly picks up any activity she tries, is seemingly liked by everyone and often complimented on just how great she is by the other characters, and of course understands investigative deduction and forensic science well beyond what ought to be plausible for a person her age.

A word needs to be said about not going overboard and pinning the Mary Sue label on any character that just happens to be capable or powerful.  For example, although Superman is essentially a god-like character in many regards, he’s not really a Mary Sue as the term is commonly used.  Notwithstanding the fact that he has a fatal weakness in kryptonite, a lot of the narrative around Superman centers on the way his immense power keeps him on some scale separate–even isolated–from the very people he is dedicated to protecting.  No matter how humane he is, he is never going to be human, and will always be an outsider in the only world he knows as home (especially since his birth planet no longer exists).  In this sense, there is a genuine, ongoing tragedy underlying the Superman saga, whether it is explicitly stated or not, of a sort Mary Sues don’t really have to deal with.

There is a secondary definition of a Mary Sue, and it involves authors who essentially write themselves into the plot of their stories as a means of wish-fulfillment.  To put it simply, when the main character in a story is written as an idealized version of the author her/himself, and written to fulfill the perfect-protagonist archetype described above, then we have a Mary Sue on our hands.

I can see why people dislike either incarnation of the Mary Sue trope sneaking into the pages of a story.  Perfect characters can get stale very quickly, because they are largely unrelatable to the vast majority of readers.  Moreover, the overarching plot of a story will become very boring if we can tell from the start that the main character will always save the day, get the love, or that every obstacle encountered is just a superficial plot piece that offers no real danger in the long run.  However, despite all this reasonable criticism of why not to write characters this way, the fact is that Mary Sues can actually resonate with readers if they find the story engaging enough–compelling writing just has a way of trumping all tropes.  The two examples of James Bond and Nancy Drew can attest to this just by how prolific both characters have been through the decades.  (It should be noted that I am aware of how Bond has been greatly “de-Sued” in his cinematic portrayals over the years, in particular in the most recent Daniel Craig films, which show him as a far more vulnerable and broken person than he ever was in print.)

What this tells me is that people don’t mind Mary Sues so much as they like to use Mary Sues as a convenient way to write off a work of fiction they probably disliked to begin with.  And I get that, too.  Sometimes, characters in a book can just rub you the wrong way.  I for one absolutely loathed Holden Caulfield when I first read The Catcher in the Rye, and am still not too fond of the little shit to this day.  (I’ve mellowed out about him because I’ve come to terms with the possibility that he’s a character I’m not meant to like.)  If I discovered that Holden was written to serve as an idealized stand-in for J.D. Salinger, my opinion would not be swayed one way or the other.

This brings me to the final point I want to make on this topic, and it deals with the issue people have with authors writing themselves into their characters.  As anyone who has ever written fiction can confirm, it is unavoidable that some part of you will come through, in some way, in every character you create.  I’ll even go as far as to say that I have never written a character that didn’t reflect some aspect of my personality, morbid curiosities, lived experiences, faced dilemmas, overcome setbacks, learned failures, and hard-fought successes.  And I know people will object that I’m shamefully steering away from the genuine opposition leveled against Mary Sues (i.e. an author’s perfect-protagonist wish-fulfillment), but I would argue that the fear of creating a Mary Sue-type character may be holding some writers back from exploring the full depth they can push themselves to, because they are too paranoid about falling into this trope.  What I would urge instead is a different approach.

You shouldn’t see yourself as just the author of the story; remember that you are also its first reader.  You are the first one who will look through the characters’ eyes and see the world as it is written for them to see.  Regardless of whether you are a novice or have been doing this for years, it is no easy feat to create an entire world from whole cloth, and then give it a pair of eyes (several pairs, if we are being honest) for others to share in the experience.  It can be a rather frustrating task to even know where to start.  My take on the matter is simply to realize that, as you’re struggling to give sight to your story’s narrators, it is perfectly fine to start with the pair of eyes ready-made in your head, and expand from there without fear of breaking some unwritten rule of storytelling.

James Frey: A Lesson of Honesty in Writing

In 2003, James Frey published a widely acclaimed memoir, titled A Million Little Pieces, about his experiences as a young addict struggling to rehabilitate his life back to sobriety.  It is a dark and engaging account of the depths to which a person can fall as his inner demons–in this case, manifesting externally in the form of crack and booze–bring him to a crossroads where the next handful of decisions could literally be the determining factor between life and death.

Needless to say, the reading public responded very well to the book, and Frey was heralded not just as a great writer and storyteller, but as somewhat of a hero for those who have been affected by the horrors of addiction (either personally or vicariously), for whom he served as an eloquent communicator to the general public on how to empathize with their less-than-sober counterparts in society.

As time went by, praise and attention continued to grow, and Frey’s star shone brightly enough to warrant attention from Oprah Winfrey’s coveted Book Club (surely the hallmark of any author serious about actually selling her or his book in the industry).  Around this same time, however, a different sort of attention was also creeping up and casting a more accusatory shadow over Frey’s spotlight.  Eventually, after much push and pull, and pontificating about integrity and trust, it was revealed that a large chunk of the details concerning Frey’s lived experiences in the book were not lived experiences at all.  Although we don’t know exactly to what extent things were fabricated in the faux-memoir, we do know that just about every detailed event that ought to be verifiable (i.e. police records, specific people and interactions, etc.) simply isn’t.  So much so that the book nowadays sits in the fiction section of your local bookstore, and serves as a case in point of literary forgery.

Despite all the controversy, it needs to be said that James Frey is actually a decent writer, and A Million Little Pieces is not a badly written book.  Given his knack for storytelling, he has gone on to write several subsequent works that are equally engaging and enjoyable (though, since the incident, he has wisely kept both feet squarely within the realm of fiction, showing that a person truly can learn something from a degree of public shame).  Thus, the question I’m more interested in concerning this entire mess isn’t really about Frey’s standalone role in the matter, but rests more on the extent to which the writing world (writ large) has a professional obligation to maintain honesty with its readers.

The question should be an easy one at first sight, for who would say out loud that writers ought to be free to unabashedly lie to their audience?  This is especially true for writers whose prose rests in the realm of non-fiction.  Yet, although certainly true, I think just repeating a platitude on this matter does little to convey the seriousness of an incident like this.

Sticking with the example at hand, I read Frey’s book after the drama had already unfolded, and so was never in doubt about the faulty veracity of its claims, as one might have been coming to the work under the ruse of it being an honest memoir of a person’s private struggles.  I can see how someone who had become emotionally invested in the story of the flawed-yet-persistent person fighting to regain some semblance of meaning and sanity in his chaotic life would have felt more than a little betrayed on hearing that this “real” person was a mere sensationalized character in one author’s hopeful attempt at circumnavigating the competitive hoops of the publishing world.  They felt duped, and rightfully so, because in a very clear way they were.  And this one experience could very well sour the public and harden a cynical attitude toward the apparently appalling lack of a rigorous vetting process on the part of publishers more concerned with making a buck off of people’s empathy than with researching whether the “real-life” story they’re selling is in fact bunk.

This is where the responsibility lies, in my opinion.  The literary world as a whole has an obligation to, at the very least, accurately promote the product it is selling.  And to do so prior to publication, not post-public outcry, which (let’s be honest) will still push sales by virtue of secondhand curiosity alone.  I accept that I’m naive to expect a business to prioritize integrity and honesty over financial imperatives, but if I’m inclined to share an opinion on the matter, I feel obligated that it be an honest one; idealistic though it may be.