Like many others, I've always been suspicious of that old writing-workshop saw, "write what you know." Unexamined, it tends to lead to lots of stories about undergraduates at keg parties (if you are an undergraduate writing major), or about writers with writers' block (the rest of us). It discourages us from ever leaving our comfort zones and, god forbid, learning about other kinds of people or experiences. And if fiction writers aren't willing to explore, then what is the world coming to? Well, it becomes a vast blank space in which tiny, disconnected voices cry out for attention--without doing very much to earn it.
On the other hand, it has lately dawned on me that the stories I've written that seem most successful--both in terms of others' responses and my own feelings about them--*are* about what I know most intimately. True, and somewhat contrary to the rant above, they tend to have some external autobiographical elements. But what those elements have actually achieved, without my conscious intention, is to draw emotional authenticity into the story. I set up these trappings of self, I thought, out of laziness: why do research on, say, men who work on oil rigs, when I already know what it's like to be an underemployed, nearly ex-academic? Knowing the external context, though, is what allowed me to depict the emotional experience that became the heart of the story. I know not only how ex-academics looking for meaningful work spend their days, I know what it feels like to be in that position, and what kinds of awkward, bizarre emotional adventures can--or might--ensue when one is in such a state. Those experiences made readers (and me) care.
This doesn't at all mean that one can't or shouldn't branch out from one's own literal experiences. Quite the opposite. But I do think it helps to have some kind of authentic, *emotional* anchor--even if it's only visible to you--in your story. Maybe include a character based on your passive-aggressive father, only with a completely different job, or an interlude that reminds you of the time you thought your dog had run away for good. Even if the rest of the story is invented out of whole cloth, that anchor can be a wellspring of emotional authenticity for the whole enterprise. It helps ensure that you are writing what you know emotionally, and that, I think, is what we really want from stories.
Wednesday, December 12, 2012
Ask a question and delay the answer
I've been thinking about Lee Child's column in last Sunday's New York Times. For writers who want to learn how to create suspense, he provides a simple answer: ask a question, then make the reader wait to find out the answer. Because human beings, he says, are hard-wired to seek out answers, this set-up is all but foolproof. You can't put the book down until you know.
Obviously for the thriller genre--to the extent that such a thing exists in a pure form--that's a no-brainer. Who killed X? Will the spy get out of the country or be caught? Same with romance: here, perhaps the question is not whether the lovers will get together, but how.
But does this formula work for literary fiction? Child seems to imply that it does: writers are "told they should create attractive, sympathetic characters, so that readers will care about them deeply, and then to plunge those characters into situations of continuing peril, the descent into which is the mixing and stirring, and the duration and horrors of which are the timing and temperature." The focus on character is the standard definition for what constitutes literary fiction, although genre fiction surely benefits when its characters are believable and compelling. Yet Child says the matter is much simpler: ask, delay. That's all.
I've said before that I have trouble with the idea of character as the central feature of literary fiction. More to the point, I have a lot of trouble with the advice to "start your story with character." I know that many people do this--come up with one or more vivid characters first, and then see what they start doing. But I can't do this without knowing what circumstances the character exists within. What situation, what milieu? In other words, what questions are they going to respond to? How have such questions already shaped who they are? That's one reason I feel comfortable with Child's formula. I write to try to understand something--a belief system, a moral dilemma, a relationship between people or between people and place. Yet I know that I will never really understand it. Conversely, if I believe I can fully grasp the matter, I know I shouldn't write about it. Otherwise the piece will be didactic, self-righteous, and boring.
There's no doubt that literary fiction does ask questions; in fact, you might argue that this is all it does. Unlike a whodunnit, literary fiction opens out at the end, rather than closing down. You set down a work of literature feeling satisfied, but also wondering, ruminating, generating more questions of your own. This is not necessarily a more enjoyable experience, though it may be a more edifying one. (Not that we always want to be edified.)
In other words: yes, the formula does work for literary fiction, I believe. But somehow the answer you provide also has to be a non-answer--a believable, satisfying, and fair non-answer. Not a trick. Not a deflection, and not a refusal to grapple with the issues you've laid out. The answer should not just be unexpected in content (as it should be in any kind of suspense fiction), but also, perhaps, in form. It might call the original question into question. It might raise the question, what is an answer, anyway?
Labels: character, fiction, good ideas, literature, plot, writing
Tuesday, December 04, 2012
Writing and being still
I was very glad to see this piece in last Sunday's NYT, on writing when you're not actually writing. I've said something similar, but I didn't realize it was, you know, a thing. The author, Silas House, calls these various imaginative practices "being still." He also emphasizes that this isn't just an artistic method but a way of being in the world.
The piece lined up, perhaps coincidentally, with that Sunday's Modern Love column. The writer, Teresa Link, describes her former husband's inability to comprehend that she was actually doing something (writing) when she was sitting on a riverbank, outwardly doing nothing.
I do wonder how many creative people are shamed out of the very work they hope or need to do by these kinds of misunderstandings. Often--at least in my case--we ourselves don't realize that we're working. We ourselves think we are being lazy, just daydreaming or brooding, when we're actually doing preliminary work to writing or painting or dancing or whatever. Both authors convincingly remind us of the extent to which first-world cultures, the US in particular, celebrate visible, physical, even frenetic activity. If you're a writer, that means you at least should be throwing down actual words on some actual document (digital or otherwise).
Which is probably why novice writers are usually advised to write every day, write x number of words every day, and don't get up until you've done it. You are lazy! You need discipline! Famous Author got up every morning at 3 to write, before preparing her children's breakfasts and then heading off to her job as CEO of Everything! And you can't even cough out a thousand words? Why, you forty-seven-percenter / slug!
Yes, I've used the thousand-word axiom myself, and found it quite helpful, especially for overcoming the dreaded Internal Editor. But I don't think we need the guilt that comes with "not producing," especially when there are other ways to be a productive writer. You just need to get the thing into words at some point. If the daily routine is not possible or functional for you--or not always--that could very well be just fine.
Tuesday, November 27, 2012
Is it necessary to humiliate your characters?
So I was happily reading this new and highly acclaimed novel. And then I stopped. Because, well, I had reached my limit on watching characters get physically humiliated. I won't go into detail, largely because I am squeamish. But the fact that I decided not to read any further got me wondering. Am I just squeamish? Isn't humiliation a powerful and real experience, which deserves a place in fiction just as much as any other emotion?
I would answer yes on both counts. But I sensed something else going on here, similar to what I've suspected any number of times while watching various movies and TV shows. The writer (or director) seems somehow to be gloating--if not outright enjoying the scene, at least calling attention to his or her daring in creating the scene in the first place. In other words, the characters suffer on behalf of the writer's quest for authenticity. This seems especially egregious when live actors must actually undergo the experience, or some convincing simulacrum thereof. I can't help thinking that a power game is going on, not only in the story, but in the making of the story. And I don't like it.
Maybe that sense of sadism is a natural side effect of depicting humiliation; the author is in the odd position of both creating the humiliation and sort of standing over it, watching, unable to help (because the story demands that the humiliation occur). The reader's in that position, too, because she imagines and therefore recreates the scene. I suppose this complicity can be an informative experience.
It also seems to me that the artistic depiction of humiliation (not to mention its comic counterpart in mainstream movies) is a recent trend that is only picking up speed. A humiliation scene is the equivalent of having your characters use smartphones--it shows you're contemporary. You "get" how the world works today. It's a calling card of realism.
I'm not speaking of stories about war and torture, in which humiliation has a different valence--it's acknowledged as a tool of an oppressive regime. I'm talking about "first-world" stories, garden-variety suburban tales of alienation, where this ugliness seems like an attempt to fulfill some function that the story otherwise might not manage. It's almost an insistence on the story's significance, and on the power of the author's words to stir strong feelings. Maybe suburban life *is* humiliating in a way that can only be expressed through the exaggerated airing of intensely private experiences. Anyway, it's only happening to fictional characters, right? And authors don't owe their characters any protection.
I still don't like it.
Labels: character, discomfort, known unknowns, movies and tv, writing
Thursday, November 15, 2012
In fiction, voice is everything
I know I've said this before. But reading Junot Diaz's The Brief Wondrous Life of Oscar Wao just brought that lesson back home.
Your narrative voice not only creates the world in which your fiction operates, it defines the boundaries of that world. This doesn't just apply to what the narrator can realistically know or not know (this is especially an issue in the case of first-person narration, when the narrator is also an actor in the story). The even larger issue is what the narrator can and cannot do. And it's the nature of the voice itself that establishes those parameters.
In Wao, Diaz's narrator, Yunior, possesses a wide-ranging intellect, a powerful and outraged sense of history, a dark and hilarious sense of humor, and a command of many languages--not just English and Spanish, but academic-ese, science-fiction geek-speak, 1980s New Jersey teenager, and a bunch of others. Bakhtin would have really loved this guy.
The voice is so energetic and captivating that we as readers will let Yunior take us anywhere. We'll read lengthy footnotes on the history of the Dominican Republic and long passages of what attendees at a lesser sort of workshop would criticize as summary (no, don't tell us what happened to Oscar's mother, show us!). Well, he does get around to showing eventually, but there's lots of telling first--and the telling is fascinating because of the voice. The story isn't separate from the voice; the voice is the story, and the story's requirements made the voice.
How does Diaz do it? I mean, aside from being brilliant. That helps. But as authors, we can all be more aware of what our story needs its voice to do. There is no law against summaries and histories in fiction; the law is to make them interesting--which you must do via the voice.
Now, not every narrative voice is as prominent as Yunior's; many authors take a more minimalist or transparent approach to storytelling. But those voices may not be as capacious--the straightforward, "I'm-not-really-here" narrative voice would, I think, have a harder time inserting long passages of history into a fictional narrative, because they'd come off as dry and boring. That's not to disparage the minimalist style; it doesn't seek the kind of capacity I'm talking about in the first place. But if you need greater range in your story, your voice needs greater range, too: so give it more bravado, more curiosity, more offbeat humor, more fury. The more distinctive the voice, the more places it can go.
Friday, November 02, 2012
Dr. Who vs. Star Trek
On a 1-10 scale of science-fiction geekdom, I give myself a 5. I know not to call the genre "sci-fi," for instance. I know Margaret Atwood once said something about squids in space that didn't go over well. I once read half a Vernor Vinge novel and quite liked it, until I became unspeakably tired of it.
So I now offer these thoughts, knowing that the series in question are both so monstrously popular and intellectually picked-over that I may largely be revealing my own relative newbie-ness. Nevertheless. In the past month or so I have become completely obsessed with the new version of Dr. Who. (I never watched the old version; growing up, I knew it as just some odd thing that was apparently on from midnight till morning on the PBS station.) I've gone out of sequence, starting with the first season of Matt Smith, then going back and doing the three David Tennant seasons, and then backing up all the way to the single first season of Christopher Eccleston. (This first incarnation has been rightly judged as not great, especially in comparison to the others. But it wasn't really Eccleston's fault. They were starting an old series over, and learning as they went what worked and what didn't--for example, that the Doctor should not resemble, in the show's own words, a U-boat captain.)
So here's what I've observed. If Dr. Who is as huge in the UK as Star Trek is here--or huger--the two series seem to reveal fundamental differences in our respective national self-perceptions. While I like most of the various iterations of Star Trek just fine, I've always found them to be fundamentally dull. That's because the characters are fundamentally dull. From Next Generation on, the human characters are not allowed to be flawed in any serious way. They may have some superficial "quirks" (like blindness or, I dunno, a nagging sense that one hasn't measured up to one's father)--but any real flaws (greed, violence) are offloaded onto other races, who are then mocked and/or fought. This whole structure seems to track with the doctrine of American exceptionalism, in which we (Americans/humans) are always the good guys, whose good intentions always, in the end, produce good results. Because we're good, nothing can really go wrong.
In contrast, might Dr. Who represent the emotional realism of a former empire? Because in this series, the "good guys," while making us love and root for them, do not always do the right thing, for various reasons. Often the right thing to do isn't clear, or it's a choice between two or more equally troubling options. Also, because Dr. Who is focused on time travel, we can discover that what appears right at one moment turns out to cause problems far down the line--as when your apparent rescue of a civilization turns its members into TV-addicted bozos hundreds of years in the future. Your attempt not to cause someone pain causes them more pain, because it turned out you weren't thinking from their perspective at all, even though you told yourself you were. Sometimes, you are just kind of a jerk, because you are tired and stressed and lonely and jealous, like everyone else is.
Perhaps it's un-American of me to prefer this viewpoint. Perhaps it's even defeatist to think that every solution you create might also create problems. But--this is crucial--this does not imply that one shouldn't bother. Knowing better than anyone how things can go wrong, the Doctor never gives up trying to save the universe, and never gives up on humanity, either. He loves humanity. In this way the show might even be more optimistic than Star Trek, because it allows for real human flaws to play a part in the story--even to be the story--while still advocating for the good fight. It's about doing your best, even while you know that time and entropy and your own failings will undo your deeds.
Labels: character, known unknowns, movies and tv, point of view
Friday, October 12, 2012
Do writing prompts work?
Yes.
Why do I ask? Well, during the past few years, I've published some stories (and written a few more), but mostly I've worked on novels. And once I was well into a novel, a "prompt" for whatever I was going to do next didn't seem necessary. If I didn't feel inspired simply to go forward, I'd go back into earlier chapters and try to pull out something interesting or unresolved. That often worked, especially as I've learned to see the novel form as less rigid, more open. In novels, the writer can wander--and, I am reminded, she can do the same in short stories, too. As long as the story itself is "about" wandering in some way. More about that at another time.
But another reason I had for not using writing prompts is that it had started to feel like cheating to me. Or, to put it another way, I felt like a student ... and shouldn't I have "graduated" by now? I mean, does Jonathan Franzen use writing prompts? Does Alice Munro? Don't "real" writers have enough material in their own heads, all queued up, because they're fully aware of their own concerns and purposes as artists?
Well, I suspect these guys don't use "prompts" as such--meaning those you'd find in a "How to Write Fiction" book, or receive in a class. But I've just rediscovered them. And in the past month or so, two of those prompts have already helped me:
- prepare the ground for a new novel
- complete one short story
- start a new story
Here are the prompts I've been using, from the Poets and Writers site.
Tuesday, October 02, 2012
Selected Shorts as writing teacher
So you're a writer. That means you listen to NPR's Selected Shorts, right? And even if your station doesn't carry it, you know you can listen to the last five broadcasts here, right?
As I was reminded last Saturday evening, listening to Jane Levy read Aimee Bender's "Americca," there's perhaps no better way to improve your own writing than to listen to great actors reading great stories. That's because so much of literary fiction is getting the voice right. In many ways, the story is the voice that tells it. The voice creates and embodies the world of your story. And hearing the words, read with a trained actor's emotion, pacing, and enunciation, lets you take that voice into your own head.
Once it's in your head, you can then compare the voice to the one you "hear" when you're writing your own story. Maybe you can take on some of its pacing, borrow some of its interjections, emulate its balance of high and low diction. This is not to say that you copy the voice, but that you develop a more acute ear for the voice of your writing. You become more able to hear places in it where you can slow down or spice up--or notice when the voice doesn't yet sound authentic to the story. In other words, this kind of listening expands your range.
That was my experience with hearing Bender's story, anyway. Among other improvements, it inspired me to add a very satisfying "Duh!" to my story of a teenager facing the end of the world.
So listen to Selected Shorts, by all means. And don't forget to attend (and apply for) LA's New Short Fiction Series if you're on the West Coast.
Monday, September 17, 2012
Revision: cutting out the boring parts
Elmore Leonard famously said the key to writing well is to "skip the boring parts." This is sound advice. Boring the reader is about the worst thing a writer can do. I also think that writers, with experience, can actually tell when their own writing is boring. In drafting such sections, they feel mildly persecuted, as if their boss has just told them to redo the budget. Similarly, in revising, they find themselves skimming over the same sections their readers will skim. Simply responding to this unease--whether or not you know its source--by cutting out these sections and finding something to replace them with will go a long way toward improving your writing.
But what, in the case of literary fiction specifically, counts as "boring"? I would say it's writing that does not stir emotion. More specifically, readers of literary fiction seek emotional experiences that are both complex and powerful. This does not mean you need to be punching your unseen interlocutor in the gut at a rate of one hit per second. You want dynamics, as in music, fortes and pianos and everything in between. But it does mean that something emotionally interesting should be going on, virtually at all times.
This idea doesn't line up exactly with "show, don't tell." I believe "telling" (i.e., the well-done summary) can be emotionally satisfying. It also doesn't line up with "action," in the usual sense. Scenes of contemplation, descriptions of settings, and so forth can certainly move your reader. But you might ask yourself these sorts of questions: Is this section doing something, or just reporting something? Is this creating an experience, or is it merely duct tape holding the plot together? Above all: Do I feel anything here?
Tuesday, September 11, 2012
Tolerating--no, welcoming, dammit!--badness in writing
OK, so, not doing so well with the regular blogging thing. I think about it! I do! Just as I think about my current novel, and the beginnings of my new one, and maybe a short story I might get started on.
I won't say I'm blocked; I've mostly just been busy with freelance work that involves creative writing as well as very close editing. I love and welcome this work, except that it drains a lot of my energy for my own writing. My work also demands that I write well *quickly,* which feeds into a longstanding problem I have with first (and second and third) drafts. Basically, I can't allow them to be bad.
After all this time, after all this practice, after all this reading and listening to writers who are more successful and disciplined than I am, I still can hardly stand to write something that feels less than perfect. I spend a lot of time mulling over new scenes for my novel (for example), and rejecting them before I even attempt to commit them to the screen. As if anyone but me--and maybe a highly trusted friend or two--will ever see them! What the hell am I afraid of? I know from experience that writing the scene out--not just thinking about it--is the only way to discover whether it will work.
Even more important, during the writing process, new and better ideas *always* emerge. No matter how many notes I take or how much muttering I do to myself, I will *only* discover the ideas by physically writing out the sentences, paragraphs, pages, chapters. And then tearing these sentences, paragraphs, pages, and chapters apart, and rebuilding them, over and over. Or throwing them out, which is not the same thing as having wasted my time writing them. Because now I've seen what does *not* work, and instead of having an unworkable idea hovering tantalizingly in the background, it can be dispensed with once and for all, clearing a space for what does work. It even happens that what I originally think is bad turns out to be good.
The fear has to be that the bad writing represents my true potential. It's the rawest and therefore clearest indicator of an insurmountable lack of talent. Right? It's as if the revision process is somehow inauthentic, not "real" writing, and even a kind of trickery that covers up (rather than alters) the fundamental failure of what lies beneath. What an odd and yet persistent delusion.
But I've had enough experience to know that these recursive doubts never really go away. I know that most other writers have them, too. Sometimes the doubts lie low; other times they surge and nearly engulf me. I have to work in the presence of these doubts. They can loom behind my chair like a nosy coworker, whom I must acknowledge and then gently tell to go away.
Tuesday, August 21, 2012
Writing about states of mind: the telling that shows
This Iris Murdoch lady can write. Did you guys know? I had never read Murdoch before, but picked up The Good Apprentice recently, and am, like, captivated. Yes, it comes with a blurb from Harold Bloom, but don't let that dissuade you. It is that somewhat rare bird, the truly entertaining philosophical novel. It's about a college student who accidentally kills his friend by slipping him LSD. It's also about the student's family--all deranged in varying degrees, but, like Dostoevsky's characters, they've earned their nuttiness by sincerely trying to understand how to live. It's engrossing, creepy, and quite often hilarious.
I've been revising my own novel frantically as I read Murdoch. I know writers differ on whether it's a good idea to read similar works while writing one's own. I've gone back and forth on this myself, but at the moment I'm firmly in the yes! read them! camp. There's the worry about possibly copying, or--more likely--trying to fit your work into what seems like the other writer's more successful formula. I actually don't think there's much of a risk of either, as long as you understand that your work is different. Whatever the other writer is doing has to be translated into your circumstances and idiom--it must, and will, become yours. I am looking to Murdoch, whose subject and themes are quite similar to those in my new novel, for ways to solve certain problems with pacing. I am not borrowing her prose or even her plot lines, but I am learning patterns and structures.
Specifically, there's the matter of portraying a character's state of mind. As literary theorists, notably Jonathan Culler, remind us, fiction is the one mode that gives us seemingly transparent insight into other minds--though those other minds are, necessarily, fictional (because it's a fiction that we can have access to other minds). Anyway. My question, as a writer, is: how do you do that? What's the best way to represent another mind in action?
The dreaded "Show don't tell" mantra doesn't help here. You can show distress by having a character pace and run his fingers through his hair and mutter. You can have him tell another character he's distressed--but not in so many words, of course. But if you want to represent interior life, the life the character lives when he's alone--and I think you do, because otherwise your character is an automaton, simply a reactor to stimuli--you need another mode. You need a form of telling that shows.
Murdoch is a master of showing by telling. Here is just a sampling of Edward's thoughts, shortly after his friend's death:
If only he could have, somehow, somewhere, a clean pain, a vital pain, not a death pain, a pain of purgatory by which in time he could work it all away, as a stain which could be patiently worked upon and cleansed and made to vanish. But there was no time, he had destroyed time. This was hell, where there was no time.
The reason this passage (and pages and pages of similar passages) works is that it doesn't bore us with mere, abstract telling. The narrator shows us what's in Edward's mind by using the language and rhythm Edward himself uses when he's thinking. He's obsessed, so his language and thoughts are repetitive. He's overwhelmed, self-punishing and self-pitying, so his language is grandiose. At the same time, Edward is intelligent. He is the kind of character who strives to think through his problems, and he has access to concepts that allow him to do so in a sophisticated--if ineffectual--manner. (And his thinking should be ineffectual; there is no easy way to live with, or comprehend, what he's done.)
Reading Murdoch has opened my eyes to the possibilities of this kind of narration--the self-aware interior monologue, in which telling and showing merge. I think such possibilities offer themselves more easily once we learn to really respect our characters. We can give them the ability, which we ourselves possess, to wonder and marvel and obsess and worry inside their own heads. Let them seek truth, as we do, through them.
Thursday, August 16, 2012
Counting minutes, not words?
Via the PEN Center, Aimee Bender writes a nice piece on the importance of routine and structure for writers. Probably more people will be wowed by the information that she used to write in a closet. But I'm more interested in the fact that she sets a time limit, as opposed to a word count, for her writing sessions.
In my rule book, I don't have to do anything except sit at the computer, but I'm not allowed to do anything else, and I usually get so bored I start to work.
I think this strategy puts Bender in something of a minority. I'm used to hearing about word counts, like Stephen King's 3,000 per day (though he suggests 1,000 for mortals like us). I have used this myself. The idea is that you can't get up till you do your 1,000, so you might as well bang through it. Who knows, you could be done in half an hour, and off to more rewarding things like staring into the fridge and muttering "fuck."
But I'm interested in this time business. First, it accommodates those of us--most of us--with other jobs and/or obligations. If you "write" from 7-9 every morning, or 7-9 every evening, you can state with confidence when you will and will not be participating in the flesh-and-blood world. There is also, let's face it, something particularly draconian about the word count. On days when the words aren't coming, and you just can't bear to kick out your internal editor and pour out crap like you're supposed to, "597 to go" is a deeply lousy feeling. The time thing seems a little more gentle. Better to write from boredom than from obligation, perhaps.
Setting a time frame seems especially applicable during revisions, when a word count doesn't really make sense.
Tuesday, August 14, 2012
Is silence golden?
Insert usual excuses for not blogging here.
Plus:
I have been wondering lately about this whole imperative to say stuff on the Internet as often as possible. Where does this very recent, overwhelmingly powerful requirement to write in public come from? I think it's safe to say it comes from corporations, whose need to draw "eyeballs" to advertisements intersects powerfully with 1) the human need for connection and attention generally, and 2) writers' need to write and be read specifically.
Caveats:
1) This is not necessarily bad. There's a real discipline to rapid, public writing--which is still evolving, and whose shaky tenets many, many people don't practice. But still, discipline is good. Learning to write both reasonably well and reasonably fast is good. And who will finance this sort-of-real/sort-of-fake form of publication for aspiring writers, if not the makers of dangerous diet pills and fly-by-night, for-profit colleges?
2) My ambivalence about the compulsion to write is neither a satisfactory excuse for not writing, nor an indication I intend to give up blogging, or tweeting, or anything else.
3) I'm just saying.
What am I saying? I'm saying that last week I had in mind a post about the Ronco Rhinestone and Stud Setter, which was inspired by a pair of jeans I'd just bought at a thrift shop. They are Michael Kors, very nice, $14, but until I brought them home I was not fully aware of the large number of studs adorning the front and back pockets. Normally I'm not a fan of adornments on jeans; I'm sure these will make my life a living hell if I forget and wear them through an airport scanner or if I happen to amble past a giant magnet. Also, they reminded me of Barbara Stanwyck in The Thorn Birds, stomping around the ranch with her dentures and bowed legs. By which I mean, too much stuff on jeans is either for 1) very young people or 2) old people trying to look young. Neither of which I 1) am or 2) wish to be.
But then the Sikh temple shooting happened, on the heels of the Aurora shooting, and those are just the awful happenings I happened to be paying most attention to, thanks to the Internet. A breezy post about the Ronco Rhinestone and Stud Setter had to be out of the question for at least a period of time (how long? Is now OK?). But was I supposed to say something instead about the shootings? Does saying something mean you are automatically more concerned than if you say nothing? I started to feel like the better move was to say nothing. Or, rather, that speaking just for the sake of speaking was not helpful. I don't know what *is* helpful. But maybe the traditional "moment of silence" is more than just a mask for "let us pray," when you're not supposed to say that publicly. I'd like to think it also means: let us shut our traps for just a second. Let us pause. Let us not jump and react and flail our arms and demand attention. Perhaps only silence is as enormous as certain events, and speech is too small a thing in such circumstances.
So there. I said something about saying nothing. You're welcome, Internet.
Tuesday, July 31, 2012
Riffing and dwelling
I have *so* not been blogging. Obviously. As I recently learned from a highly scientific test, I am a creature of routine (but no less fascinating for being one!). When my routine is thrown off, say by lots of paying work, which, I hasten once again to add, is a very, very good thing, I start dropping regular tasks faster than famous young movie stars shed their spouses.
But this situation has allowed me to reflect, again, on the topic of how to write when you don't have time to write. It now seems to me that there is a benefit to being away from my writing. I don't mean taking a break from it; I mean being physically away from the text, which, in my case, resides on a computer. Because my other, paying work is on the computer, by the end of the day, I am really interested in being in a different room, if not a different county, from the screen. And that's when I've started to dream up ways to enhance my second novel.
My particular problem as a writer is that I actually write too little in early drafts. As Ann Patchett once said, my writing is like "concentrated orange juice," which needs water. I cram what should be extended scenes into summaries, asides, or complicated metaphors. Usually I am so enamored of these summaries and metaphors that I need someone else to point out the problem to me.
But going back and staring at the text doesn't always inspire expansion. The lines on the screen start to look like the wires of a chain-link fence. Whereas when I'm (say) lying on the sofa, or cooking dinner, or playing with the cats, I can start imagining how I might untangle a tight knot in the narrative. I can "feel" places where the story seems jammed up, and start playing around with dialog, images, etc. The stakes feel a lot lower, and the space for exploration a lot more open. Before I know it, I have written a scene. Then it's just a matter of sitting down and recording it, at which point even more ideas start to occur to me.
I made a note to myself on top of my notebook. It consisted of two words: Riff and Dwell. This is what I need to do in my writing--open up spaces to riff and dwell. And being physically distant from the writing itself, I find, is often a tremendous help.
Tuesday, July 17, 2012
Advertising, fiction, and Mad Men
So I recently figured out that people want the same thing from advertising as they do from fiction: an emotional experience. In advertising, that experience is designed to spur you to buy, or at least think favorably about, a product or service--which may or may not be an admirable goal. In fiction, that experience leads to ... what, exactly? If there's a purpose beyond giving the reader pleasure (and I don't think there needs to be), perhaps it's strengthening the reader's capacity for empathy. Some studies show fiction actually does do this, although it could still be a chicken-and-egg problem (maybe more empathic people tend to read more in the first place).
At any rate, at the heart of both successful fiction and successful advertising is people's hunger for emotional experience. Not just pure experience, though--it somehow needs to be at a remove, so that it's safe and understandable, as well as powerful. You want the experience plus the meaning, or the solution. Fulfill that hunger, and you've got yourself a happy reader, or a willing buyer.
All of this reminds me of a scene from the first season of Mad Men, which has always stuck with me. Here, Don Draper presents his campaign for the Kodak slide "wheel," which he has renamed the Carousel. I find this a tremendously moving, and telling, scene. It's layered by Don's own nostalgia for a childhood he never really had, and for a family that, even now, is not exactly his (because he cheats on his wife, and, as we begin discovering in this season, he's assumed another man's identity). His longing exists, as perhaps it does for all of us, because he can never have what he seeks: a true home, an ideal past. He can only have a substitute, or talisman--the slides, and the projector that lets him see them. But the scene is also powerful because it illustrates the genuine power of advertising. There may be something sinister or shallow in its motives--the bottom line is always shareholder value, and eventually the projector will end up in landfill, or, as in my family's case, in a box in the garage marked "SLIDE PROGECTOR." Yet the emotion the campaign generates, in Don and in us, is anything but false. And that's why we still, at least somewhat willingly, respond to ads and to fiction: we want this genuine experience, in any form we can get it.
Wednesday, July 11, 2012
Introverts unite! (No, wait, not so close ...)
Susan Cain's book about introversion made quite a splash in January, and the ripples are still going strong. Her work came back to my attention recently, as I've been doing some consulting for schools that help children "come out of their shells." There's much to be said for teaching introverts to advocate for themselves, and to present themselves in ways that don't accidentally put others off. We need basic social skills to function in any society, and enabling people to master them is honorable work.
On the other hand, Cain's right--our culture does have an extroversion bias. When we talk about bringing kids (or adults) out of their shells, we often imply that introversion needs to be treated and overcome, rather than worked with or even celebrated. I notice, too, that Cain points out a difference between introversion and shyness, where shyness is a fear of social judgment, and introversion is simply a preference for less stimulating environments (a glass of wine with a good friend, rather than a big party). This definition of shyness does suggest a problem, a form of self-tormenting that isn't necessary and doesn't do anybody any good. Whereas introversion, properly recognized, can be a great thing for all involved. We like solitude, and we get a lot done that way. We also, as Cain points out, are not anti-social, but differently social (see wine, above). We think before we speak, which others tend to appreciate.
It really has been only in the last few weeks that I've started to consider introversion as a positive trait, rather than something to be "dealt with," i.e., concealed. Perhaps we introverts could have calmer and more productive lives (though many of us have these already, because we seek them) by minimizing the struggle to appear extroverted. That is, we still need to go out in the world sometimes and be friendly. But I suspect there's some wasted effort in dissing ourselves for not being more extroverted, and in trying too hard to conceal our inclinations.
Thursday, July 05, 2012
I am the grumpiest optimist I know
Having weighed in on busyness the other day, I thought I'd take a crack at "optimism." This article, too, is from the NYT. It's by Jane Brody, who I generally think is a good person and a purveyor of useful advice, although the chirpy puritanism of her columns always puts me on edge. I also rather dislike the Good Examples she often serves up, as in this one--as if all we must do is be like these wonderful people (who often include Brody herself)--and all will be well.
OK, I know: would I take health advice from an ironic, crotchety health reporter? Yes, I would, but probably others wouldn't. And certainly the column format is part of the problem. In the piece in question, for instance, along with the Good Examples, we get a series of bullet points on how to be an optimist. To wit:
- Face your fears. [...]
- Re-evaluate events in your everyday life. [...]
- Practice mindful meditation. [...]
- Take control over how you feel instead of letting feelings control you. [...]
- Laugh. [...]
- Be fully engaged. [...]
Oh, palm to forehead! If only I had known about these bullet points! How could I have been so foolish, and been a pessimist? Optimism is so easy, and therefore I, a sometimes pessimist, am a jerk. If only I had known to laugh, to be engaged!
Here's the thing. I agree with these bullet points; and in the course of a lengthy treatment for depression, followed by finding a really cool guy to marry, and then very gradually figuring out what kind of life I wanted to live, and how to live it, I practiced all of them, and I continue to try to do so. I think being happy is better than being sad. But not because it's my duty to be happy. Being sad is not a failure--it's often an accurate and sensitive reading of your situation, and of the world more generally. If anything, being sad feels like the proper duty, except that it prevents you, often, from trying to make things better. And it sucks. So it's all kind of a feedback loop, and part of the problem is this terminology.
I propose we dispense with the terms "happy" and "optimistic" as the stated goals for ourselves and the surly loved ones we wish to help. As this article sort of gets around to explaining, what we are really talking about is "confidence" and "persistence." Carol Dweck's book Mindset, the source of the "growth mindset" idea, also talks about the centrality of persistence, and the new, confidence-building feedback loop that's created when we train ourselves--over time--to persevere.
In other words, optimism is not an attitude, but a practice. Sayings like "It's all in your attitude" give us the wrong idea--that we are thinking or perceiving wrongly, and all we have to do is make ourselves see the glass half full. This leads, in my experience, to a whole lot of mental self-kicking (see above), which makes the pessimism and paralysis worse. But persistence is often not delightful or easy. It's hard. That is the point. It is not a matter of flipping a switch in your brain. It means sticking with it for the long haul. It means learning to tolerate the unpleasantness of slogging through obstacles, even or especially when a reward is not certain. Yes, there are ways to do that, including meditation and laughing and so on. But the difficulty of persisting, especially for those who aren't used to it, somehow gets left out of these "just do it" articles.
I speak, of course, for the less chipper beings of the universe. I gather some people are either born with or endowed early on with persistence and confidence and even sunniness, and they can't understand why people like me are talking about having to learn these traits. Why wouldn't you just go out and get what you want? I can't really answer. But I also can't see how this column could help the type of person whom it's aimed at. Sunny just-do-it-ism is either going to bounce right off them, or make them feel worse.
Monday, July 02, 2012
"I am the laziest ambitious person I know."
I very much enjoyed Tim Kreider's piece on busyness in yesterday's NYT. Mostly because it validates me, and the many hours I spend draped on the couch, with or without a cat on my sternum. I am valuable! I have insights, not despite but because of my staggering capacity for sloth! Lazy ambitious people, unite! Oh, never mind. It's too much trouble.
Only one little question nags at me. I used to spend equal if not greater amounts of time in a very similar mode (couch, semi-dozing state, though no cat back in those days)--and it was a sign of depression. I do sense a difference in my current way of doing nothing, but the difference isn't quite clear enough to make me feel 100% confident in my new laziness. I suppose, in depression mode, my mind was actually racing and obsessing, rather than drifting and dreaming, as now. I also had a feeling of pointlessness, of just wanting the day to be over--of lying low, till the storm that was the day passed--which I don't have now. I like my days.
Still, I continue to suspect my idleness. And I continue to admire the busy, even though Kreider suggests a great deal of busyness is an expression of fear. Idleness can be fear-driven, too.
But hey, at least I wrote a blog post about it!
Thursday, June 21, 2012
Not dead
... just working a lot and traveling. Occasionally tweeting. I took a test and discovered that I am a creature of routine and an introvert, and also very much like Johnny Depp. And Mary Poppins. Other than that, not much news.
I hope to resume some form of regular blogging next week.
Tuesday, June 05, 2012
Mystery novels and the mystery of death
I expect to return many times to this nearly twenty-year-old interview with Don DeLillo in The Paris Review. It basically answers all my questions about writing and validates what I thought were some of my worst tendencies, especially in relation to character. So it's awesome!
But today I want to dwell on this almost throw-away insight DeLillo offers about mystery novels:
When I think of highly plotted novels I think of detective fiction or mystery fiction, the kind of work that always produces a few dead bodies. But these bodies are basically plot points, not worked-out characters. The book’s plot either moves inexorably toward a dead body or flows directly from it, and the more artificial the situation the better. Readers can play off their fears by encountering the death experience in a superficial way. A mystery novel localizes the awesome force of the real death outside the book, winds it tightly in a plot, makes it less fearful by containing it in a kind of game format.
For a long time I've wondered how and why mystery fiction transforms murder into a form of amusement. How does that dynamic operate? How does it make murder appealing rather than merely appalling? What makes us accept that the satisfaction of solving the crime seems to justify its occurrence, at least in fiction?
Well, of course. The tightly woven plot is a containment strategy. Part of the pleasure we derive from such stories is in experiencing that containment, that transformation. Murder and death become artifacts through the evident artifice of these plots.
In other words, the murder mystery is a proxy for the mystery of death itself. What happens, really? And why must it happen? By the strict terms of the human condition, we can't have these answers. But we can have more practical answers about death in a murder story: we can use our minds to discover who did it, and why. The criminal's motive replaces God's and/or the universe's. If we can't understand death in general, at least we can understand a particular death.
All of which suggests the mystery novel as a kind of ritual or amulet or charm against death, which goes a long way to explaining its appeal. And a character like Sherlock Holmes, a master at comprehending murder mysteries, becomes a kind of priest.
Tuesday, May 29, 2012
"Show, don't tell" and hoarding
I spent the past two days purging old clothes from my closets. I hauled five bags of stuff off to Goodwill, and upon returning home, I felt ... awful. My sense is that one is supposed to feel liberated on such occasions, and also not a little holy for contributing to charity (though one is also aware that clothing donations from the first world can interfere with nascent clothes-making businesses in the third). While I kept reminding myself of how lame it is to hang onto stuff I don't wear, when others might be able to use it, when I drove away from the Goodwill container I felt almost like I was fleeing the scene of a crime I had just committed.
Yes, things are just things. And yet they are not. In the same way that in fiction, the specific, telling detail is a window into a character's psyche, things--in our consumer society, anyway--are portals to the past. To me, even the most trivial piece of clothing I heaved into the Hefty bag had some bit of memory stuck to it, along with lint and cat hair. I almost always remembered where I'd bought the thing, and what life was like at that time (I really have hung onto things far too long). If the piece of clothing was a gift, well, so much the worse. How ungrateful I felt for never wearing that jacket (even if it didn't fit, or made me look like Carmela Soprano, or both). How I felt like I was stabbing the giver in the heart, like I was tossing a kitten out of a car into the rain.
It's my guess that although I may be nuttier than many in this area, I'm not alone. (I am fortunate, too, that our condo is small enough that full-on hoarding is simply not possible.) Consumerism, I think, depends on this fear of loss, especially of the loss of memory. We buy to ward off Alzheimer's, and we give to keep others from forgetting us. Those aren't the only reasons, of course, but they come into play--maybe more so when we feel ourselves isolated, and people close to us start dying, and we start to think of how many people we've already lost track of over the course of our lives.
What will keep our memories for us, if not objects? Stories? Facebook? The Cloud? Fine, but we can't touch and see these things in the same way I could have--but didn't--wear that TV-test-pattern sweater I bought in England two-plus decades ago. You can tell stories to kids and grandkids, but if they're not connected to tangible things, those stories mutate and dissipate over generations, even if they're written down. Objects don't change, although they do decay, and their original appeal or function can become questionable.
This is where Buddhism is supposed to help, right? Impermanence, impermanence. Further: giving up clothing is nothing compared to giving up one's life, which is what Memorial Day is about. Yet Memorial Day is also about sales. It really seems like this endless circulation of stuff through our lives is the presence (presents?) of death.
Anyway, I've moved more junk into the empty spaces in the closets, so there's lots more space in my office now. That's kind of nice.
Tuesday, May 22, 2012
Personification enlivens abstraction
Here's another thing writing teachers always tell us: Be concrete. Use words that create images in the reader's mind; make them feel or hear or see or smell something specific. (Smell is an especial favorite.) This dictum is a variation of the dreaded "Show, don't tell," and, like its counterpart, it has its merits. For example, concreteness leads to specificity, which is always a good thing, because specific details form the Lilliputian army with which you beat back the Giant Art-Killing Cliche Worm.
But, let's face it: abstractions exist. Language itself is an abstraction. Also, if a writer has any ambition beyond accurately rendering the physical experiences of daily life, she'll need to use some words denoting concepts, ideas, theories. Yet she doesn't want to end up with a dry philosophical treatise, drained of all life's blood. What to do?
Well, cleverly blending the abstract and the concrete is one way to go. Let's take a cue from Benjamin Black, the pen-name of John Banville in his thriller-writing mode. In Christine Falls, his main character, Quirke, is working his way through a bad meal in an overrated restaurant, while trying to extract some information from his brother-in-law. As Quirke mulls his interlocutor's evasive answers, Black tosses off this gem:
Quirke's palate recalled the salmon with a qualm.
First off, the personification here ratchets up the interest right away. Having Quirke's palate "recall" the salmon gives an otherwise dull and largely abstract concept, one's "palate," a life of its own, literally. We've all had this experience of unwillingly retrieving a bad food experience (I'm not talking about the more dramatic, literal possibilities here)--and having the palate do the honors, rather than Quirke himself, creates that involuntary dynamic.
The second word that benefits from this treatment, in this same sentence, is "qualm." This is another word that can't really attract any interest on its own. It's part of a cliched phrase, having qualms about x, and although it's an amusing-sounding word, it can't overcome this history by itself. But, again, it's the palate that has the qualm, not Quirke, and this makes all the difference. When something that we assume can't have a qualm (or whatever) is shown to do so, the all-but-dead word comes back to quivering life. The "qualm" now sounds like a little spasm of the throat or tongue, and is funny and vivid and very close to our own recalled experience.
So, personification of abstractions is one way to infuse them with specificity. You can't overdo personification, of course, or it will quickly grow preposterous--but it is a way of mixing the abstract and the concrete, so that you can use abstractions without floating off into the high desert of pure theory.
Monday, May 14, 2012
Flashbacks in fiction: Do they suck?
A writing teacher once told me that you should resist including flashbacks in your fiction at all costs. If you absolutely must add a flashback, each one can be no more than three lines (or was it sentences? Lines, probably, because with sentences you could cheat, spinning out subordinate clauses for pages, sprinkling liberally with commas and semicolons).
The reason for this perhaps extreme prohibition is that flashbacks can lead to a static narrative in the story's present. The reader can almost picture the character sitting on a couch in a dim waiting room, tapping her foot as you methodically plod through all the steps that got her to this point in the story. Perhaps worse, because you know you need to get back to the present, and rescue your character from that waiting room where she's growing more sullen and uncooperative by the minute, the flashback can fall into a weird neither-nor land: not quite summary, not quite scene. Especially if the flashback seems to you to have a primarily explanatory function--how did Suzy get to be so sensitive about her appearance?--you're especially likely to fall into this mode.
I find that in my own writing process, the early stages of a novel tend to be full of flashbacks. Probably even more flashback than present narrative. What that tells me is that I've either set the story at the wrong period in the character's life, or that I need to give some of these experiences to other characters. I have to find some way to get these flashbacks into the main story. Either that or I'll have to come up with some kind of lovely, stylistic move to weave memory itself into the story, without making the story about a person being struck by random flashbacks over the course of an otherwise ordinary day, or week, or ... OK, in the right hands, I can see that being a very good story.
In this case, though, I think I'm going to hand off some of these experiences to the other characters. Because that's the other thing I find that I do in the early stages--create a bunch of characters, and then not give most of them enough to do. So instead of having the mother's experience at her school be a flashback, I'll have it be her daughter's experience in the story.
More thoughts on flashbacks: the lack of them in the Odyssey (per Auerbach), and as a function of point of view.
Tuesday, May 08, 2012
In praise of the writerly surprise
I've been dipping into George Saunders's essays in The Braindead Megaphone. I've always admired Saunders as a writer of the kind of surreal, hilarious, and deeply sad fiction I wish I could come up with myself. But man, can he rock an essay.
I suppose that what makes his fiction great is something he also does--less often, and therefore perhaps more strikingly--in his nonfiction: He surprises us. I don't mean that his work is startlingly good, or shockingly original, though that's true. I'm talking about these little explosive surprises that he drops in, mid-sentence or mid-paragraph, that make you stop and go, Wow.
Case in point, from "Thank you, Esther Forbes." This is a celebration of Saunders's childhood discovery of Johnny Tremain, and with it, the fact that fiction did not have to be awful:
Before Johnny Tremain, writers and writing gave me the creeps. In our English book, which had one of those 1970s titles that connoted nothing (Issues and Perspectives, maybe, or Amalgam 109), the sentences ("Larry, aged ten, a tow-headed heavyset boy with a happy smile for all, meandered down to the ballfield, hoping against hope he would at last be invited to join some good-spirited game instigated by the other lads of summer") repulsed me the way a certain kind of moccasin-style house slipper then in vogue among my father's friends repulsed me.
Who on earth would think of comparing the mediocre prose in a 70s children's fiction anthology to a moccasin? But the thing is, I think we all make leaps like this. Only for most of us, these leaps remain in our subconscious. We'd never articulate them to ourselves or others, because it feels ridiculous that we get the same feeling--the creeps--from reading a stupid story as from looking at someone's slippers. Neither of these things should give us the creeps in the first place, right? The fact that both do is beyond wrong, even a little shameful. Yet it's true. I instantly recognized this feeling of getting the creeps from an apparently, indeed strenuously innocuous story. It's the insistence on innocuousness that causes the creeps to descend.
Later in the same essay Saunders drops a much more shocking bomb, quoting the bureaucratic prose of an SS officer on how it's better to leave the lights on in the gas chamber before turning on the jets, to keep "the load" from screaming and pushing against the door. Good God. Is it really a straight line from the story of the tow-headed boy, through the slippers, to this grotesque evasion of responsibility for mass murder? Well, Saunders suggests, kind of. An inauthentic relationship to language is no small matter. It means you can distance yourself from what you say, and what you think you mean. Using language well--giving it its full due--means taking responsibility.
Much better, then, to be honest about how a bad story reminds us of a house slipper, in that both made us feel deeply weird. That kind of honesty is a small salvo in the fight against the bureaucratization of the soul. Plus, when you say it out loud, it turns out to be so right, it's hilarious.
Thursday, May 03, 2012
Just a little more on 2666
...because I'm obsessed, still hung over, grasping at the fading glimmers this novel's explosion left in my psyche.
I came across this piece, In the Labyrinth: A User's Guide to Bolaño, on the New Yorker web site. Now, I actually receive the New Yorker at my home on a mostly regular basis, but I hadn't read this piece. I believe that's because, back in February, I was still resenting Bolaño for being dead and yet *still* getting published in the New Yorker more often than almost anyone else. This writing gig is hard enough, New Yorker editors! Must we compete with the deceased as well?
Anyway, the "User's Guide" is mostly interesting and helpful, although I plan to read all of Bolaño's work anyway. And then there was this:
Avoid “2666” for as long as possible, and for heaven’s sake, don’t start with it. The book is a desert of negative space across which the panting reader will search in vain for the traditional pleasures of the novel: form, character, coherence, meaning.
No, no, yes, yes, yes, and this is why you should read it. It's not a "traditional" novel, but what 2666 proves is that the novel is not synonymous with the bildungsroman or the romance. True, it's the rare writer who can pull off the *appearance* of formlessness and characterlessness. Melville, I'd say, was one. As in Moby-Dick, the only real character here is the universe (differently conceived, but still incomprehensible). And as for the piece's other complaint, that the huge section on the killings in Ciudad Juárez leads to nothing but "exhaustion," again--yes. I believe that is the intended and the appropriate response.
Anyway, read 2666!
Friday, April 27, 2012
Speaking of large, sprawling novels...
While we're on the subject of large, sprawling novels, why can't I think of any by women? Is there a female DFW, Dostoevsky, Melville, Tolstoy, or Bolaño? There is, right? Am I just drawing a blank, or is this really some kind of guy thing?
Middlemarch, maybe? Anne Rice doesn't count. I'm not talking about unedited novels.
Tuesday, April 24, 2012
Must the Great Novel be large, sprawling, frantic, and a bit of a mess?
No, I didn't disappear under a pile of fennel, but of work...which is like fennel, in that it is tough and large and sometimes hard to cut through, but very good roasted.
In addition, I have been racing to finish Roberto Bolaño's 2666, because I took it out of the library, and although I can renew it for another three weeks, there's a certain shame in that, a certain failure, and so I read and read most of last weekend, instead of working on my own stuff. OK, so I'm still not quite done. And I don't want to be done. This is a book that I'm going to miss when it's over, even though a great deal of it is concerned with war and murder, especially the mass murders of women in Ciudad Juárez, but also of Jews in World War II. In fact there is, at one point, an almost relentless cataloging of murders, a kind of theme with variations, that goes on for hundreds of pages, while various police officers, politicians, and reporters weave in and out, trying and failing to understand, and sometimes disappearing themselves. Yet, these depictions of atrocities never feel exploitative or sentimentalized; as a reader I felt a tremendous weight of confusion and sorrow, which, oddly, made me unable to put the book down.
The preface and afterword explain that 2666 was Bolaño's last work; he wrote it while becoming increasingly ill, and he knew he was dying as he was finishing it. This no doubt added a certain urgency to the prose, perhaps a certain wildness and willingness to plunge into depths that the rest of us either fear too much, or don't know. In addition there is a raggedness and repetitiveness to some sections, which Bolaño might have edited out, had he been given more time. Then again, his other work suggests polish and brevity were not his concerns. And in 2666, hearing of a friend's preference for "short, neatly shaped novels,"* one of the characters muses:
Now even bookish pharmacists are afraid to take on the great, imperfect, torrential works, books that blaze paths into the unknown. They choose the perfect exercises of the great masters. Or what amounts to the same thing: they want to watch the great masters spar, but they have no interest in real combat, when the great masters struggle against that something, that something that terrifies us all, that something that cows us and spurs us on, amid blood and mortal wounds and stench.
I do think this novel is that other kind, imperfect and torrential. In my opinion, it's right up there with The Brothers Karamazov and Moby-Dick. It has the same vast scope, the same existential cri de coeur at its center, the same rage to comprehend--to encompass and understand--the entire range of human experience. Interestingly, the novel is not much concerned with God or religion, at least not overtly. Literature seems to replace God as a transcendent, unifying principle. Now, normally I don't approve of novels whose theme is writing (and whose main characters are writers), because they suggest that the author simply can't imagine people with professions unlike his own--which is a blatant failure of the writer's task. However, in this case, writing is the ultimate human quest for answers. The writer, as Bolaño has made clear in any number of works, is a detective. He is not, if he's doing it right, some kind of cloistered, disengaged commentator, but a seeker of truth, a religious pilgrim, outlaw, and professional wrestler rolled into one.
But is it true that an authentically great novel must be of this type--big, messy, overtly trying and ultimately failing to embrace everything? My own preference is, in fact, for such novels, though I have never attempted one myself (as yet). Still, this might just be a personal preference. Or is there a reason that the beautifully crafted, smaller "gem" of a book really does fall short of greatness? Does the perfection of craft represent a failure of intellectual or emotional ambition? Should the writer always bite off more than she can chew, or do battle with monsters she can never hope to catch or tame?
*I am quoting from the Note to the First Edition by Ignacio Echevarría.
Labels: Brothers Karamazov, earthly deities, fiction, language, literature, Moby Dick, writing
Thursday, April 12, 2012
Instead of fiction, fennel
Not really feeling the literary life today; not sure why, but allow me to compensate by talking about fennel! It's awesome! Yeah, that stuff that grows in huge clumps along the freeway here in Northern California is just the best thing ever, roasted or sauteed. Nor does one need to park precariously on the shoulder and start hacking away at those bushes. Turns out they sell it at the grocery store, minus the coating of car exhaust (one hopes).
My fennel feeling started with this recipe from Bryant Terry's The Inspired Vegan* for Savory Grits with Sauteed Broad Beans, Roasted Fennel, and Thyme. It's vegan, and it's easy, and you should definitely make it either before or after you buy the book.**
So I used the bulbs for that, but hung onto the stalks and fronds, not sure what I was going to do with them--until just now. I chopped them up, sauteed them with some garlic, walnuts, salt, and pepper, and served the whole thing over penne for a quick and delicious lunch.
No photos, because I'm a lousy food stylist.
*a much-loved holiday gift from Amy and Doug!
**the only part that might slow you down is that to make the creamy grits properly, you have to soak raw cashews overnight before pureeing them. But the pureed cashews are actually delicious in their own right, for smoothies and other uses, and well worth making. And if you don't remember to soak them ahead of time, heck, just make polenta with some vegan margarine, even though Terry seems to frown on this...
Tuesday, April 10, 2012
An adverb of note
From time to time I like to say something nice about adverbs. Stephen King has said the road to hell is paved with them, and no one knows the way to perdition better than King. And it's true that writers often employ adverbs for the sole purpose of shoring up weak adjectives or verbs--at which task they are destined to fail every time. However, the rare, weird, unexpected and yet perfectly fitting adverb is cause for celebration.
Today's example comes from Roberto Bolaño's 2666:
Then came an assembly of Germanists in Berlin, a twentieth-century German literature congress in Stuttgart, a symposium on German literature in Hamburg, and a conference on the future of German literature in Mainz. Norton, Morini, Pelletier, and Espinoza attended the Berlin assembly, but for one reason or another all four of them were able to meet only once, at breakfast, where they were surrounded by other Germanists fighting doggedly over the butter and jam.
I absolutely love the "doggedly" there. Of course, I don't know what word was used in the original Spanish, or even if the sentence was constructed in the same way: credit for the adverb must be shared between Bolaño and his translator, Natasha Wimmer. The word just gives the sentence an additional little twist, like fine-tuning a guitar string, that nudges the whole scene into sublimity.
The test of an adverb's necessity is whether the sentence would be the same or stronger without it--or whether you can find a verb (or, sometimes, an adjective) that incorporates the adverb and becomes more powerful for having consumed it. The image of Germanists "fighting over butter and jam" would still be kind of funny and recognizable. Perhaps "squabbling" or "skirmishing" could be substituted for "fighting," which is not an especially vivid word in itself. But I can't think of a better way to convey the ritualized, determined, petty, hilarious, and hopeless nature of the fight depicted here than with "doggedly." The word gets a boost from the previous sentence, which reflects the relentless march of conferences at which the fight is played out over and over.
But adverbs are the Bigfoot (Bigfeet?) of grammar. They should appear rarely, and be rare and weird themselves. Spotting one should be memorable, and something to tell your friends or blog about.
Tuesday, April 03, 2012
A toned-down rant on education, with bonus crackpot theory
So this is very cool.
It's TED Curator Chris Anderson's animated talk, "Questions No One Knows the Answers To." Are kids everywhere watching this? And adults as well? I hope so.
As I have mentioned, I went to an excellent public school and had a presumably excellent science education therein. Yet never, not once, did it dawn on me that the purpose of this education was to be able to answer questions that were as yet unanswered. In doing countless experiments that turned out either right (yay, you got the blue foam!) or wrong (you idiot, you made black sludge!), and taking lots of multiple-choice tests, I was given to understand that science was about confirming what was already known. The black sludge was simply an error, not a result of processes that were just as real, and just as interesting in their own way, as those that made the foam. Science was proving you could follow the directions.
I can think of a number of reasons I got this message. One is that I was an extraordinarily rule-oriented child (I am only a little less so as an adult). After all, I also didn't get that art was about "expressing yourself," so much as properly rendering what you saw in front of you. So I may have simply missed the part about how we were learning scientific techniques through these repetitive experiments, so that we could later use them to make new discoveries. It may also be that my own teachers either didn't know or didn't think it important that science was about discovery. Their emphasis on mastery of the known might have reflected their own experiences and philosophies.
Or there may have been an even larger purpose behind this kind of teaching. The system cannot function if everyone in it is constantly innovating. Nothing would get done. In truth, we need people--lots of people--to implement known processes and principles. Maybe that's what this education was about: because most people will (by choice? by necessity? whose choice? whose necessity?) be implementers rather than visionaries, they must be trained as such.
Was there such a governing philosophy behind education? Is there one now? All the standardized testing going on now suggests that the answer to the second question is yes. Even if that is not the stated purpose of the testing, it will assuredly be a pronounced effect. Is that intentional on some deep, unacknowledged level? Or am I just being paranoid, or unrealistic? Should I just be grateful for the conscientious training of drones? I'm one, also, most of the time.
Tuesday, March 27, 2012
Freaks and Geeks and mining your childhood
We just watched Episode Fifteen of the eighteen total episodes of Freaks and Geeks. As we near the end, it's all starting to seem darker and sadder, because we know there will never be any more episodes, ever, and also because the later episodes explore darker themes (addiction, accidental pet death, accidental/deliberate human almost-death).
Still, it's a beautiful show, possibly the best TV show ever made. And one reason for that is mentioned in one of the commentaries. For months before they actually wrote the scripts, the writers sat in a room and dredged up all their most painful/embarrassing/astonishing high school experiences and shared them with each other. They then used these in the scripts, as they actually happened, or seemed to have happened in memory.
Now, sitting in a room and trading high school horror stories sounds like something I would gladly forgo in favor of a two-hour visit to the dentist. I also have a problem with fiction that sounds too much like someone reliving his or her personal victimization. (The Missouri Review has a blog post on the related matter of defining characters by their traumas.) However, F and G avoids such maudlin pitfalls, and instead just feels perfectly real and honest.
How did they do that? My guess is that it came about precisely because the writers refused to protect themselves when telling their stories to each other, and thus refused to protect the characters. Of course we are not talking hideous tragedies here, only the ordinary but highly significant failures of adolescence. Yet, speaking for myself, I can well imagine wanting to somehow "spin" these excruciating memories either to make myself look better, or to punish my youthful self more severely (which is another form of self-protection). But what happens if you just tell the story without trying to steer the reader's or viewer's reaction one way or the other? The F and G writers, as they mention in other commentaries, often didn't set out ahead of time to write comedy or tragedy; they just let the stories play out. Some turned out to be funny, some were sad, and most of the time they were both.
I'm not sure what it would take for me to be able to tell such stories honestly in fiction or--god forbid--memoir. Perhaps the group process the writers used helped: they all must have realized that everyone had experiences that still gouged their insides out when they thought of them. Their personal disaster wasn't so bad in the great scheme of things. In the same way, when we see the stories on TV, as viewers, we find them funny, forgivable, and ordinary in the most redeeming sense. Maybe that's the best kind of group therapy (mass group therapy) of all.
Thursday, March 22, 2012
The Hound of the Baskervilles: Pwned by Holmes
So, three weeks ago precisely, I wrote a terribly excited post about how Holmes and Watson had accidentally allowed their client, Henry Baskerville, to be killed! And it was really cool, because, see, Conan Doyle had introduced this really troubling moral dimension to the whole sleuthing business, and...
Well, so I was so excited about this development that I stopped reading and fired off the post, and then I got busy with other stuff, and only got back to the novel yesterday, at which point I discovered that the dead guy wasn't Henry at all, but the escaped convict, Selden:
"We must send for help, Holmes! We cannot carry him all the way to the Hall. Good heavens, are you mad?"
He had uttered a cry and bent over the body. Now he was dancing and laughing and wringing my hand. Could this be my stern, self-contained friend? These were hidden fires, indeed!
"A beard! A beard! The man has a beard!"
"A beard?"
"It is not the baronet—it is—why, it is my neighbour, the convict!"
With feverish haste we had turned the body over, and that dripping beard was pointing up to the cold, clear moon. There could be no doubt about the beetling forehead, the sunken animal eyes. It was indeed the same face which had glared upon me in the light of the candle from over the rock—the face of Selden, the criminal.
Then in an instant it was all clear to me. I remembered how the baronet had told me that he had handed his old wardrobe to Barrymore. Barrymore had passed it on in order to help Selden in his escape. Boots, shirt, cap—it was all Sir Henry's. The tragedy was still black enough, but this man had at least deserved death by the laws of his country. I told Holmes how the matter stood, my heart bubbling over with thankfulness and joy.

The troubling dimension to Holmes's rational zeal is still kinda there--the scene is "black enough," as it were. In fact the issue comes up again a few pages later, when Holmes uses Henry as bait to catch the hound, and the experience is traumatic enough that Henry has to take a trip around the world with his doctor to recover. Clearly, Holmes's primary concern is not with his clients' well-being, except as it represents his success in solving the case. Nevertheless, the ironic edge is considerably dulled. And I have to admit to having been pwned by Conan Doyle, whose main interest is not in Holmes's moral complexity, but in keeping the reader off balance till the end. I was an easy mark because of my literary training; I'm always going to leap into multilayered considerations of Theme whenever the slightest opening is given to me.
Ah, well. Of course, Conan Doyle and his characters also did a good job in selling the false identification, with lots of hand-wringing and self-berating. All of this goes to show us--and them--that nothing, nothing, is as it seems in a Holmes story, not until the last period has landed at the end of the last sentence. That is, looking at a piece of evidence once is not enough. Looking twice may not be, either. And one's emotions, even for Holmes, can get in the way of remembering this basic fact. In this instance, Conan Doyle used his detectives' overwrought emotions, which were especially surprising in Holmes's case, as a sleight of hand to distract the reader. We all forgot to look at the dead man's face.
I think that's a good and impressive technique that mystery (and literary mystery) writers can use. But probably only once in a novel.
Labels: Borrowed Fire, Hound of the Baskervilles, plot, writing
Tuesday, March 20, 2012
For the love of the flawed novel
From Emily St. John Mandel's review of Nick Harkaway's Angelmaker on The Millions:
[I]t seems to me that there’s something magnificent about sprawling and ever-so-slightly flawed novels.
It seems so to me, as well. Part of what I love about The Brothers Karamazov, for instance, is the sense that its author, genius as he is, is just barely in control of his material. His writing often has the feel of a man trying to wrap his arms around a large bag of demonically possessed squirrels. The source of his novels' grandeur is struggle, reflected in his characters' existential anguish. His greatest success, like theirs, is not in overcoming anguish but in giving it voice--sometimes seemingly by accident, as in convoluted prose, rushed scenes, and characters who appear and disappear for no evident reason. I suppose if the whole novel was nothing but these kinds of accidents, we'd call it a promising first draft, or simply a mess. But these flaws coexist with clearly defined conflicts with stakes even larger than life and death, masterful set pieces, and characters whose flesh and blood we come very close to actually touching. The flaws, in fact, make these successful parts even better: they roughen the edges of the masterpiece, making it a visceral experience.
In contrast, I love Marilynne Robinson's Gilead in a very different way. As my previous commentary on it shows, I experienced it as a Work of Art that more or less drove me to my knees. It inspired awe, and had the heft of cathedral tunes. As an artistic achievement, I think it deserves mention in the same breath as Brothers Karamazov, but on an emotional level I'm less drawn to it, because it is not flawed. There is no badly worded sentence, no misstep in the (admittedly much simpler) plot. The artist is in complete control, and there is something vaguely off-putting about that. Maybe because I suspect I can never achieve that level of mastery myself. Or maybe it's because consciousness itself is messy and surprising. As Annie Murphy Paul explains in Sunday's NYT editorial, a recent analysis of MRI data
concluded that there was substantial overlap in the brain networks used to understand stories and the networks used to navigate interactions with other individuals — in particular, interactions in which we’re trying to figure out the thoughts and feelings of others. Scientists call this capacity of the brain to construct a map of other people’s intentions “theory of mind.” Narratives offer a unique opportunity to engage this capacity, as we identify with characters’ longings and frustrations, guess at their hidden motives and track their encounters with friends and enemies, neighbors and lovers.
With its irruptions, dead ends, and coffee and blood stains, the map of Dostoevsky's mind looks far more like mine than Robinson's does. Hard to believe, since her work is set in twentieth-century Iowa and Dostoevsky's in tsarist Russia. But there you go.
Labels: books, Brothers Karamazov, earthly deities, fiction, language, setting, writing
Monday, March 19, 2012
A brief but passionate rant about education
I break my usual Monday silence to share some thoughts that came to me after reading this article. Its overall point is that homeschooling must be more closely regulated. While the author, Kristin Rawls, acknowledges that few formal studies have been done on the matter, anecdotal evidence suggests many homeschooled kids, particularly in fundamentalist movements like Quiverfull, are not achieving even basic levels of literacy. The reasons offered look compelling: overwhelmed mothers with large numbers of children to educate, along with full responsibility for household chores and an ideology that holds "moral" (i.e. religious) education far above the factual and intellectual kind.
Fair enough. But it's also true that public (and probably even private) schools across the country are also turning out illiterate kids at an embarrassingly high rate. The fact is, this country has always had a ferociously conflicted attitude about education: parents' aspirations for their children to better their circumstances crash head-on into the old frontier suspicion of "book learning" (you don't want some weakling with his nose in a book next to you when a bear, or an Indian, is charging). Although I have no evidence for this myself, I suspect our ongoing debates about education policy, and the current fad for threatening and punishing educators, speak directly to this ambivalence. American individualists (and I'm one, in my own way) hate that we need education and educators. We hate that someone else knows more than we do about something, and that we have to subordinate ourselves, or our kids, to that person's expertise. Hence the appeal of homeschooling: it says that no one knows what's better for kids than their own parents do--no matter what the subject. The rugged individual merges with the expert.
But it's not just homeschoolers who feel this way. The recent decision by the New York City Public Schools to publish their highly problematic rankings of teachers gives non-experts another weapon to attack those who think they're so damn smart. Look, I get that there are bad teachers who are hard to root out of the system. I get the need for some objective (or close to objective) measure of students' success, and for accountability to both the school system and parents. I was lucky enough to attend excellent suburban public schools, and I still had a handful of teachers who were out-and-out lunatics, who traumatized me. But in this country there's a deep suspicion and even loathing of anyone who dares to educate others. And while I lack expertise in education policy, let me suggest these two culturally based efforts, which all of us can apply this very moment and into the future:
1. Encourage the aspirations of children and adults who seek education. Do not mock them.
2. Encourage the aspirations and work of educators. Do not humiliate them.
Thursday, March 15, 2012
Justice and the satisfying ending
Today's writing lesson is nominally about Hound of the Baskervilles. But because I haven't read any further since last week, I will have to speak about Larger Issues rather than specific literary techniques. For example, the matter of justice: What does fiction have to do with it? Can it bring about justice literally, as Uncle Tom's Cabin supposedly did? Or does its real power lie in creating a sense of justice, which we rarely experience in real life? As I suspected, I have written about this before. You may wish to review that post, particularly if the term "altruistic punishment" strikes your fancy.
Today I'll go a little further and suggest that justice in fiction can also be an aesthetic experience. Writing workshops tell us that the ending of a story must be both surprising and inevitable to be satisfying. That's a paradox, certainly, and it raises the possibility that paradox itself is aesthetically pleasing. Now, it might not be ethically pleasing, not in real life anyway, where we want our good folks rewarded and our evildoers punished unambiguously. Yet, that hardly ever happens. In life, the ironies--e.g. the embezzler who spends a month in jail, and then gets a book deal and his own reality show--are often maddening. However, in literature, possibly because we know we can't do anything about it, and aren't expected to, a certain kind of success-in-failure (or vice versa) feels just right. Moral ambiguity can be safely viewed as an aesthetic problem.
That in itself isn't bad; it might allow a more nuanced and less heated consideration of the issues at hand. For example, in Hound, Holmes and Watson, in their zeal to solve the case, allow their client to get killed.* In real life, that would be shocking and galling; in the novel, it creates a sort of frisson, an uncanny halo around the supposedly good work of solving crimes. Part of the reader's satisfaction with the surprising/inevitable ending is that it feels real, while being explicitly unreal. Awareness of that aesthetic/ontological balance, I think, is one of the great pleasures of fiction. (There, I worked in the Hound reference.)
Which is also to say: one must resist the temptation to impose the aesthetic tenets of storytelling on real life. At that, our culture is failing miserably.
*UPDATE: Ah, the dangers of commenting on a Sherlock Holmes novel before finishing it. Wasn't Henry Baskerville after all. More on this later.
Labels: fiction, Hound of the Baskervilles, plot, writing
Tuesday, March 13, 2012
Staying in touch with your writing
Over the last few weeks I've been fairly consumed with editing work. Which is good! Very, very good! But it has left me a tad depleted on the verbal front, not to mention reluctant to spend any more time in front of a computer screen than necessary. So I didn't work on my novel that much.
However, this wasn't the same kind of hiatus that I've allowed to happen in the past, because I recalled some advice I read years ago. I really wish I could remember who wrote this, but the gist was, even if you can't actually write, find a way to "stay in touch with your writing." So I decided to do that. At least once a day, for maybe fifteen minutes, I thought about my novel--specifically, the point where I'd left off on my revisions. I did this while making dinner, or just before going to sleep. I tried to fully inhabit that place in my novel, without rushing off to take notes or open the file on the computer. So as not to overload the experience with layers of worry, I told myself I'd recall what I'd imagined when the time came, and that the important thing was to be in that place, if only for a few minutes every day.
That seems to have worked. Later in the week, and yesterday, I found an hour here or there to actually open the file and work. And unlike in the past, when I just shoved my own work completely aside in favor of the paid stuff, I was able to dive right back in. I did remember what I'd thought of (and expanded on it). Also, I suspect that setting aside the anxiety about not writing actually helped me find time to work. I accepted that I might only have an hour, or less, but that I could still do something with that time. I didn't spend time lamenting the time I didn't have.
In short, I think worrying about not writing is worse than simply not writing. This doesn't have to be an all-or-nothing enterprise, i.e. either you have four hours a day to work, or you have nothing. Writing can happen in the interstices. So find a way to stay in touch. That's my advice for today.
Wednesday, March 07, 2012
A small accomplishment, overblown
I'm short on time and brainpower this week, so I'll just show you this picture--
--which is of me at Pt. Reyes, just before I free-climbed that rock wall in the background.
Yes, well. I didn't climb the part that's over the water. Also, the same feat was achieved by many others, including women notably older than me, children, and at least one small, fluffy dog.
Still.
Thursday, March 01, 2012
The Hound of the Baskervilles: Holmes and the uncanny
Well, Hound gets very exciting in the section I read this week. A murderous convict loose on the moor! An encounter with a fallen woman! Horrid, blood-chilling screams and/or baying in the night! Another mysterious man loose on the moor, who appears silhouetted atop a tor in the moonlight, and when Watson traces him to his hiding place, he turns out to be...
...OK, you probably knew this before I did, but I admit, I was surprised to find out it was...
Sherlock Holmes! He has been out on the moor the whole time Watson has been conducting his investigations and proudly sending reports back to Holmes.
I stooped under the rude lintel, and there he sat upon a stone outside, his gray eyes dancing with amusement as they fell upon my astonished features. He was thin and worn, but clear and alert, his keen face bronzed by the sun and roughened by the wind. In his tweed suit and cloth cap he looked like any other tourist upon the moor, and he had contrived, with that cat-like love of personal cleanliness which was one of his characteristics, that his chin should be as smooth and his linen as perfect as if he were in Baker Street.
While glad to see his partner, Watson is understandably pissed that Holmes has been doing his own detective work while making him believe he was actually contributing something. Holmes reassures him that his reports, which he's had forwarded to him at the hut, were indeed valuable, because they allow the two of them to compare their observations. Watson is persuaded, and ends up, as usual, admiring Holmes's cunning.
What interests me in this section is how closely Holmes is tied to the spookiness--the uncanniness--of the moors. Before he knows the identity of the man on the tor, here's how Watson describes him:
And it was at this moment that there occurred a most strange and unexpected thing. We had risen from our rocks and were turning to go home, having abandoned the hopeless chase. The moon was low upon the right, and the jagged pinnacle of a granite tor stood up against the lower curve of its silver disc. There, outlined as black as an ebony statue on that shining back-ground, I saw the figure of a man upon the tor. Do not think that it was a delusion, Holmes. I assure you that I have never in my life seen anything more clearly. As far as I could judge, the figure was that of a tall, thin man. He stood with his legs a little separated, his arms folded, his head bowed, as if he were brooding over that enormous wilderness of peat and granite which lay before him. He might have been the very spirit of that terrible place. It was not the convict. This man was far from the place where the latter had disappeared. Besides, he was a much taller man. With a cry of surprise I pointed him out to the baronet, but in the instant during which I had turned to grasp his arm the man was gone. There was the sharp pinnacle of granite still cutting the lower edge of the moon, but its peak bore no trace of that silent and motionless figure.
This, combined with Holmes's odd, "cat-like" cleanliness while living in a hut on the moor, suggests that he's not exactly super-human, but maybe extra-human. Anyway, not human in the way Watson is. I wrote a few weeks ago about Holmes as an enchanter whose magic is reason. In these passages, Holmes appears as a spirit or specter, and perhaps some kind of sprite with dancing gray eyes. He may represent reason, but he's otherworldly all the same.
All this suggests that for all Holmes's brilliance, there is something not quite right about his enterprise. That notion is reinforced pronto, when, as he and Watson are comparing their discoveries of the past few weeks, they hear a terrible cry. Rushing to investigate, they find their client, Henry Baskerville,* dead.
A low moan had fallen upon our ears. There it was again upon our left! On that side a ridge of rocks ended in a sheer cliff which overlooked a stone-strewn slope. On its jagged face was spread-eagled some dark, irregular object. As we ran towards it the vague outline hardened into a definite shape. It was a prostrate man face downward upon the ground, the head doubled under him at a horrible angle, the shoulders rounded and the body hunched together as if in the act of throwing a somersault. So grotesque was the attitude that I could not for the instant realize that that moan had been the passing of his soul. Not a whisper, not a rustle, rose now from the dark figure over which we stooped. Holmes laid his hand upon him, and held it up again, with an exclamation of horror. The gleam of the match which he struck shone upon his clotted fingers and upon the ghastly pool which widened slowly from the crushed skull of the victim. And it shone upon something else which turned our hearts sick and faint within us—the body of Sir Henry Baskerville!
There was no chance of either of us forgetting that peculiar ruddy tweed suit—the very one which he had worn on the first morning that we had seen him in Baker Street. We caught the one clear glimpse of it, and then the match flickered and went out, even as the hope had gone out of our souls. Holmes groaned, and his face glimmered white through the darkness.
"The brute! the brute!" I cried with clenched hands. "Oh Holmes, I shall never forgive myself for having left him to his fate."
"I am more to blame than you, Watson. In order to have my case well rounded and complete, I have thrown away the life of my client. It is the greatest blow which has befallen me in my career. But how could I know—how could l know—that he would risk his life alone upon the moor in the face of all my warnings?"
That last self-accusation by Holmes sums it up: he has sacrificed his client for the case itself. (Notice how he says it's the "greatest blow which has befallen me...": it's still all about him, isn't it?) One suspects, despite his expression of regret here, that he'd do the same thing again in a minute.
Conan Doyle seems keen on showing us that there are consequences to Holmes's single-minded brilliance. He is not a simple hero, a seeker and defender of justice. He is driven by ratiocination for its own sake, and anyone who hires him for protection risks meeting the same fate as Baskerville.
The character of the flawed detective is itself a cliche by now. But a detective whose very brilliance is a danger to his clients is compelling. Holmes himself is the "hound" of the Baskervilles--a spectral and seemingly supernatural hunter.**
*UPDATE: Except it isn't Henry Baskerville. This is the second time I've been foiled by a plot twist in this novel. Note to self: in a Holmes story, there is no resolution until the end.
**I still think the overall point stands: Holmes's concern is with the facts, rather than with people. And he's uncanny, all right. More about these twists and turns in a subsequent post.