Tuesday, December 17, 2013

My book has a cover!

Coming from Bona Fide Books in June 2014 ...
Design by Vicky Shea at Ponderosa Pine Design.

Monday, December 16, 2013

Disturbing Christmas Carols, 2013 Edition

The holidays are upon us, and it's time to take a closer look at some of the strange messages being piped into our innocent ears and souls. This year, I'm finding "Frosty the Snowman" particularly unsettling, though perhaps no more so than "Rudolph" or "Little Drummer Boy."

To recap:

Frosty the Snowman, was a jolly happy soul,
With a corn cob pipe and a button nose, and two eyes made of coal.

Frosty the Snowman, is a fairytale, they say.
He was made of snow, but the children know he came to life one day.

There must have been some magic in that old silk hat they found,

For when they placed it on his head, he began to dance around!

Oh, Frosty, the Snowman, was alive as he could be;
and the children say he could laugh and play,
just the same as you and me.

Thumpety thump, thump, thumpety thump, thump,
look at Frosty go.

Thumpety thump, thump, thumpety thump, thump,
over the hills of snow.

Frosty the Snowman, knew the sun was hot that day,
so he said, "Let's run, and we'll have some fun now, before I melt away."

Down to the village, with a broomstick in his hand,
Running here and there, all around the square,
sayin', "Catch me if you can."

He led them down the streets of town, right to the traffic cop;

and only paused a moment, when he heard him holler, "Stop!"

For Frosty, the Snowman, had to hurry on his way,
But he waved goodbye, sayin' "Don't cry, I'll be back again some day."

I will not dwell on the post hoc, ergo propter hoc fallacy regarding the silk hat, though I'll note that the lyrics cleverly gloss over the dilemma with that odd hedging modal, "must have." Is the song firmly committing to the existence of magic, or only speculating, toying with our desire to know for certain? We cannot say. Like Frosty himself, the song dances on the knife edge of being/not being.

No, what strikes me in this season of enforced cheer is the idea of Frosty as memento mori, specifically aimed at children. As he begins to melt, his running and cavorting grow ever more frantic, until the weeping children watch their dwindling friend disappear over the hills. Here we sense the tragic nature of Frosty, who knows he's dying but is determined to entertain till the end; he is the archetypal sad clown. The carpe diem message comes through clearly, and that's a lesson it's never too early to learn.

Yet perhaps Frosty would have been more heroic had he not announced to the children that this was his last day (for now, anyway) on earth. Couldn't he have simply made an excuse that he had an appointment or something? Yes, the children would perhaps have felt confused and hurt. Where is he going? Doesn't he like us anymore? Are we too boring for Frosty, who appears afflicted with ADHD? But now, whenever they see snow, they will think of loss and death intertwined with, and perhaps overshadowing, the fun and beauty of the season.

On the other hand, as Wallace Stevens said, "Death is the mother of beauty." Perhaps winter and snow become more beautiful as we learn to connect them with aging, impermanence, and the witless brutality of nature. Perhaps Frosty sensed these children were old enough to understand this; and perhaps he's a Buddhist and expects to be reincarnated. As, indeed, he may be, every time someone builds a snowman.

All I'm saying is, this is a lot for a young person to take in. I don't believe in shielding kids from reality, but this deceptively jolly song is an odd way to deliver it.

Tuesday, July 23, 2013

I do not know how to write short stories

... in case you were wondering. I have published stories, but that doesn't mean I know what I'm doing.* In fact, each time I start one, I have less of an idea of what is supposed to happen. There's supposed to be an arc, I gather. You're supposed to create interesting characters that the reader--quickly!--comes to care about. The story should create an overall experience of surprised satisfaction--the reader did not see that ending coming, but at the same time realizes no other ending could be possible. Some kind of turning point should arise; some permanent (even if seemingly minor) change should occur.

Or not. You can throw together a bunch of seemingly unrelated fragments (although I'd appreciate it if you wouldn't number them). You can take on a persona and rant in character. You can obsessively attend to the details of the story's setting, and then never actually tell the story. You can write the story all in dialog, or write no dialog at all.

I find this lack of parameters disturbing. And I do not understand how some people can just produce one story after another, one collection after another, all of them quite good. Do they have some kind of basic formula in their heads, which they alter and bend and break, but still at least start with? Because I feel like I'm starting from scratch every time. I don't trust conventional arcs, but I still want something to hang my hat on. Lots of people will say that hat-hook is character, but I don't really trust that, either. To me, character--outside of a specific setting, situation, tone, voice, structure, and purpose--doesn't mean much. I need all of it to come together, and that never happens in the same way twice. I'm not sure how to make it happen, other than to be patient and make lots of mistakes, and accept that some stories, as originally conceived, will never succeed.

Which brings me to the only other technique that sometimes helps me: smushing together two failed stories or story fragments. On their own, these stories don't suffice, but together they generate enough friction to start a fire.

*I also don't know how to write novels, although I've written two (well, 1.75). But having more room to maneuver within the novel somehow makes this not-knowing less dire. All the fumbling around eventually becomes the novel itself. Which is why the novel imitates life, right? Fumbling = living. But a story can't fumble. Or can it?

Tuesday, July 16, 2013

Where does self-reflection come from?

This post on the Zimmerman verdict by William Saletan makes the insane suggestion that both parties in the murder of Trayvon Martin were equally at fault because each made a snap judgment about the other. Yes, Saletan says, Zimmerman "profiled" Martin, but Martin also "profiled" Zimmerman, assuming he was a pervert out to get him. Well, Zimmerman was out to get him--expressly so. Martin's assumption that Zimmerman was dangerous was entirely correct.

Nevertheless, Saletan did make one point that got me thinking:

In Zimmerman’s initial interrogation, the police expressed surprise that he hadn’t identified himself to Martin as a neighborhood watch volunteer. They suggested that Martin might have been alarmed when Zimmerman reached for an object that Zimmerman, but not Martin, knew was a phone. Zimmerman seemed baffled. He was so convinced of Martin’s criminal intent that he hadn’t considered how Martin, if he were innocent, would perceive his stalker.

Saletan wants us all to be more reflective, to pause to see ourselves as others might see us before we go flying off the handle and shoot somebody, or maybe call them a name. That's all well and fine. But what could have happened differently in Zimmerman's life that would have enabled him to consider how Martin might have seen him?

Here's where people like me often go into a high-minded spiel about the value of a humanities education. Sure, it won't get you a job, especially in this economy, but it's these intangible things that make the study of literature and philosophy so worthwhile. Studying fine literature (as opposed to, say, genre fiction populated by vigilante cop-heroes), presumably under the wise tutelage of an expert in the field, would have helped Zimmerman take a step outside himself. He would have thought, "Aha--if some stranger were chasing me and reaching for something, I might feel threatened! Therefore, I ought to back off!" Or, even better, he would never have blended the rather modest position of neighborhood-watch volunteer with his Dirty Harry fantasies. He would know the difference between life and (bad) art.

But that all sounds rather feeble to me right now. Because what would motivate someone like Zimmerman to take such classes in the first place? Why would he even care how he appears to others? (And by "how he appears" I do not mean Do I look cool? Do I look manly?--which he plainly does care about in abundance. It means genuinely seeing another person's point of view as legitimate and important.) Self-reflection holds little appeal in our culture; it seems opposed to action and suggests dithering and weakness. Why would someone who feels weak to begin with want to become more so?

Does it all come down to parenting and one's earliest experiences? And if so, what would motivate parents to teach their children empathy, as opposed to always being tough and, you know, standing your ground (which somehow seems to imply encroaching on others')? I don't think God and religion help here, either, since Zimmerman said his killing Martin was "God's plan." A different interpretation of God might help, but then we're back to asking where that interpretation comes from. So, what to do? How to make reflection--that least visible of human activities--cool? We can't make TV shows about people thinking. Even I would be bored by those. More shows that suggest negative consequences of violence and vigilantism might be nice. But, again, how to interpret those, assuming one already loves that craggy old vigilante sheriff in the battered old hat?

So I guess education--free, excellent, public education, from pre-k through, oh hell, college--remains our best bet. I still hope that education can help people use words rather than guns to solve conflicts.


Tuesday, June 25, 2013

Giving your characters an inner life

This lovely piece about James Gandolfini got me thinking about literary characters and what makes them seem real to us. My former students know I've been grinding this ax a long time. But this is another way that writing fiction very much resembles acting: in both cases, you can't reveal everything there is to know about a character. Good characters (like good stories, according to Hemingway) are icebergs; much more of them exists below the surface than above. And, let's face it, that's true of actual human beings as well. That's why, in fact, fictional characters who seem to have a lot going on inside them feel more real than those whose thoughts, beliefs, and actions are always transparent and aligned.

So, keeping Gandolfini as Tony Soprano in mind as an example, here are some ways we fiction writers can make our characters more real--as opposed to, God help us, more likeable.

  • Embrace contradiction. Tony was a wildly violent man who fretted over his violent tendencies, took his daughter to visit colleges, and became quite fond of a family of ducks. We all fret about certain characteristics we possess; we don't think they're quite honorable or consistent or helpful, yet we can't quite smooth ourselves out. Your characters can be the same way: an elementary school teacher who fears children, a doctor who's addicted to cocaine ... Let them be aware of these contradictions without knowing how to resolve them.
  • Use interior monologue. The psychiatrist in The Sopranos was a device that enabled us to hear what went on inside Tony's head. In fiction, we can just write that stuff out. Where else but in fiction do we have direct access to another person's thoughts? Take advantage of that opportunity. After all, I think, therefore I am: that's how I know that I, at least, am real. Of course, you can overdo interior monologue or use it as an excuse to tell rather than show. The test is whether it reveals character, rather than explaining it.
  • Let characters wonder what others are thinking. It's my sense (isn't it yours, too?) that we humans spend a lot of time wondering what other people think. It might be one reason why some of us read fiction--to at least have the sense that we're inside someone else's head. So a realistic fictional character would probably have this desire also--to know what her husband is really thinking about when he shrugs and says "nothing." Our common, everyday tragedy is that none of us can ever really know this for certain. In their world, your characters live this tragedy, too.
  • Explore emotion. As Gandolfini said, let your characters really feel emotions. Those emotions can be confusing, "wrong," in conflict with one another, etc. They can explode volcanically, or the character can work hard at suppressing them, but the emotions themselves never go away.

Monday, June 17, 2013

The "broken windows" theory of feminism?

I haven't read Lean In yet, because, until recently, I felt it didn't apply to me. I actually do work in corporate America, though as a consultant, and offsite, and for a small, woman-led company where all but one employee is a woman. Also, though I used to be "in management," I have since leaned out on that score altogether. I have determined that I do not like managing others. In my case, I would argue that this is not an entirely gender-based decision; my father always refused leadership positions throughout his career, preferring to focus on "the work itself." My mother stopped working outside the home after I was born. That is a whole other roiling kettle of fish and worms and what have you, which I will perhaps address elsewhere.

Anyway, I'm now thinking Lean In might have relevance for me, and for a lot of us, after all. As Frank Bruni put it in a recent column, we've received a deluge of reminders of "how often women are still victimized, how potently they’re still resented and how tenaciously a musty male chauvinism endures." But I still find myself shrugging my shoulders, thinking "what do you expect?," when I encounter such resentment and chauvinism personally. For me, such experiences feel like "no big deal" in the scheme of things. Millions of women deal with much harsher forms of belittlement and victimization than I do. Yet why shouldn't those of us who are relatively privileged stand up against those little, supposedly meaningless slights? Why wouldn't this lead, ever so quietly and gradually, to improvement, in the same way that repairing broken windows in crime-ridden neighborhoods seems to lead to a reduction in crime? Why not chip away, in our own small way, at the gigantic edifice of subtly demeaning rhetoric that we all participate in?

I'm thinking, for instance, of a moment a few months back when I went to observe a class of second-graders. The teacher introduced me to the class, and quickly asked me if I went by "Mrs." or "Ms." She actually might have offered "Miss" as the second option; I didn't quite hear. I have a Ph.D., so my correct honorific is "Doctor." But I felt I could not say that without sounding like a horrible snot. I don't use my husband's last name, so I can never be a Mrs. That left me with "Ms.," which I chose, hitting the "zzz" sound as hard as I comfortably could. The teacher introduced me, and the whole stupid moment was over in a second.

Except. Those second graders didn't get to see a woman called "Doctor." And I dismissed my own accomplishment in earning this degree, because I was afraid of looking obnoxious and possibly making the teacher herself feel diminished.

I gather that Lean In, though more focused on corporate settings, addresses these kinds of seemingly small snafus that professional women get entangled in all the time. Trying to avoid discomfort, our own and others' (and others' discomfort further contributes to our own, doesn't it?), we "lean out" of situations where we could be setting forth our qualifications and furthering our own--and other women's--interests.

Even now, I feel somewhat foolish for writing this post. Who cares what that class of second-graders thought my title was? I was only there for fifteen minutes; none of them will even remember me. And I did a nice thing, didn't I, by not contradicting their teacher in front of them?

But all these little stupid moments add up.

Friday, May 24, 2013

To Nook or not to Nook?

About six months ago I got myself a Samsung tablet. I intended to use it mostly for editing work, because at the time I was editing a lot of PDFs. And the Samsung had this very cool app for marking up PDFs and taking notes by hand.

But I've ended up using it mostly for the Nook app (and surfing the web, but that goes without saying). I realized that with any e-reader, the time lag between reading an interesting book review and owning the book in question goes from approximately a week (if one orders online, and even remembers to buy the book at all) to twenty seconds. After reading Broken Harbor, for instance, I immediately yanked from the ether all the remaining Tana French Dublin Murder Squad novels, the last of which I'm saving for my next long plane trip. I'm now champing at the bit for Suzanne Rindell's The Other Typist.

And yet. On the Samsung, at least, e-books can inject some serious aesthetic problems into the reading experience that you'd never experience with a physical book. As Joe Hill mentioned on Twitter, subtle but necessary formatting cues can go haywire in an e-book.

I'm about to finish Jo Walton's Among Others, which I happen to have in paperback form, and have found myself calmly immersed in a way I haven't achieved with the e-books. Much as I like Among Others, I don't think this is due to the writing alone. Hill goes on to make the point that while most of us don't really notice the formatting of novels (unless they have illustrations or other overt formatting effects), we do notice it when it's off. I now rather dread going back to the e-reader, even though I want The Other Typist asap.

There is also the awkwardness of holding the tablet. The Samsung is too heavy to hold with one hand, so I usually end up folding back the cover and balancing the thing on a pillow on my stomach while lying down. (Not that I wouldn't read lying down anyway, but the balance here is more delicate.) Finally there's the whole business of staring at a screen, which I've already been doing more or less all day, and the kind of cognitive buzzing that the screen creates in the background of the reading experience.

In short, add e-books to the list of Things I Am Ambivalent About. And add me to the list of People Who Are Ambivalent about E-books.

Tuesday, May 14, 2013

Bigfoot and the Baby ... coming from Bona Fide Books

I am thrilled to report that my first novel, Bigfoot and the Baby, will be published by Bona Fide Books in spring 2014. Bona Fide is an innovative, exuberant small press, committed to literary fiction, poetry, and nonfiction, with a particular interest in the environment and the American West.

Wednesday, May 08, 2013

Ray Harryhausen and writing

In the NYT obituary for legendary stop-motion animator Ray Harryhausen, I found myself dwelling on this statement:

“There’s a strange quality in stop-motion photography, like in ‘King Kong,’ that adds to the fantasy,” he said in 2006. “If you make things too real, sometimes you bring it down to the mundane.”

As a fiction writer--and reader--I've always agreed with this, without ever quite knowing why. Now I actually suspect it's because I grew up watching Harryhausen's movies, often on the big screen, in gorgeous Technicolor. The way Harryhausen's monsters moved impressed the hell out of me--at once faster and slower than humans, with starker contrasts between light and shadow. I don't remember precisely how they looked. The movement was what made them otherworldly, and therefore more real, in the sense of more plausible as monsters. Why would something from another world (or from the imagination) move as we do, and exist in precisely the same plane? The movements of Harryhausen's creatures made them seem both here and somewhere else at the same time--and that lent them real power and real magic. And as he suggests, the more realistic CG effects become, the more disappointing they become, even as we marvel at the technical achievement. The monsters really have been brought down to Earth.

That's probably not the *only* reason that I prefer a veneer of unreality in fiction. But if you're going to go ahead and create a world, why not let it shimmer and quake a little around the edges? Because you have made something. You have brought something into our world from another one--your imagination. That wonder and that strangeness, it seems to me, deserve attention.

Wednesday, May 01, 2013

Zap the "to be" verbs

Not until I started working for a marketing firm (yes!) did I realize the full insidiousness of "to be" verbs. Not only do they make your prose static, they can obscure the true meaning of your sentence by preventing you from finding the right verb--and with it, the right noun(s), adjective(s), adverb(s), and phrasing. I've noticed that, particularly when I write in a rush, "to be" verbs proliferate--because I don't take the time to ponder exactly what I want to say. And maybe that kind of rushing works for a first draft, when you don't want to pause and ponder, but simply "get it down." But during revisions, you can make a lot of amazing improvements by scrutinizing and replacing "to be" whenever possible.

Tuesday, April 09, 2013

OK, once and for all: Is getting a Ph.D. in literature a waste of time?

Yes. No. Oh, who can say?

Well, in theory, I should be able to say. I got my Ph.D. in comparative literature [redacted] years ago, and I'm one of those twisted yet oddly grateful souls who didn't become a professor. I ought to have warnings and/or encouragement to offer those on the front end of the process--not to mention solace to those, possibly drifting in adjunct limbo, who are now thinking of stepping all the way out.

But I must start by saying: are we really still having these arguments? Really? Literally these same laments, and dismissals thereof, have been flying about for longer than the [redacted] years since I first ignored the warnings aimed at me. Nothing has changed. Tenure-track jobs still prove more elusive than starring roles in feature films. Universities continue to admit more grad students than they can ever hope to place in such jobs--because they supply cheap academic labor, because they represent the next generation of a culture and philosophy that at least some people hope to preserve, and because they still want to come. So it shall ever be, evidently. And while we non-professors--eventually--generally find satisfying alternative careers, we still seem to have no good answer to the question, What is a literature Ph.D. for?

My experience is just one experience. And even [redacted] years later, I find myself unable to wrap it in a comprehensive, persuasive ball of wisdom. Here, then, are some random questions and the answers I would give you, today, if you asked.

Would you do it over again, knowing what you know now?
No.

Are you still glad you did it?
Most days, yes. It has undoubtedly opened doors, though the person on the other side has often been surprised to see me there. I've learned to use that surprise to my advantage.

Did you learn to think better?
Probably. Yes. What kind of a question is that? I will say I learned to pay very close attention to language, although the attention was focused through a very narrow lens. Emotional reactions, including simply taking pleasure in an author's beautiful words, were right out. I hope that's changing. I now believe that letting those reactions in is a much better way to think about--and with--literature.

Did you learn to write better?
God, no.

Could you have tried harder to get a tenure-track job?
Yes. But the answer is always yes, isn't it? The best advice I ever heard was: If you can do something else, you should. I felt that I could, indeed, do something else, though it took me a while to figure out what that was. Not everyone has the luxury of that kind of time, though.

Did others fail to warn you sufficiently?
I doubt it. We all think we're the exception. Sometimes we are. Warnings can scare off the ambivalent, but not the truly determined.


Monday, March 18, 2013

On writing and feeling "unhoused"

Via Andrew Sullivan, The Paris Review's lovely interview with Andrea Barrett offers these lines:

I’ve never known a writer who didn’t feel ill at ease in the world. Have you? We all feel unhoused in some sense. That’s part of why we write. We feel we don’t fit in, that this world is not our world, that though we may move in it, we’re not of it.

The question is, does(n't) everyone feel this way? Or just writers/artists? I suspect the circle of the emotionally "unhoused" is larger. But I'm really wondering who actually feels "housed." Wall Street types? Oncologists? Priests? Cheerleaders? And do they know they feel this way?

Tuesday, February 05, 2013

Affirming life with The Pale King, of all books

I have now officially finished reading The Pale King. I make a special point of mentioning that I've finished, because it has been my habit to opine at some length about a book I am only halfway through ... only to discover that you kinda do need to read the whole book to understand what it's about. Also, I am very proud of myself. This is a long book--ultimately unfinished by the author himself, as we know--and it is full of tax-law arcana, which is fascinating in that it was fascinating to DFW.* But not at all fascinating otherwise.

I mean, what sort of mind can engage itself with that stuff at that level of detail? Of course, that's the central question The Pale King asks. And the answers within the book differ from those we've learned from DFW's biography. We all know that the author committed suicide before finishing the novel. So that suggests the mind in question was not a conventionally healthy one. But I just went back and added "conventionally" before "healthy," because without the adverb, the adjective seemed to cast judgment on that mind. Despite the terrible end to DFW's life, it seems, from the evidence of this book alone, that there was something right about his mind, something good and powerful, which may or may not spring from the same well as his depression.

I also think it's a mistake to look for clues to "what pushed him over the edge" within this book--even though I had avoided reading it precisely because I thought it would be littered with such clues. Reading it seemed somehow ghoulish, and I feared being sucked into depression myself. Yet nothing of the sort happened. I was able to forget, or put aside, my knowledge of Wallace's death, because the book is so full of ... yes, life. It's energetic, hilarious, compassionate, fascinated and amused by the most ordinary aspects of human life, sometimes frightening, and also pretty stark raving crazy. If anything, I found myself wondering, Why would a person who saw the world this way want to die?

I don't know the answer. But if I had to guess, I'd say that he couldn't sustain this level of embracing joy. Perhaps he could not even reach it at all, and this book is an act of straining toward what he could never actually grasp. Or maybe (like most of us, really) he grasped it only for brief instants, and the slipping away of that joy was unbearable to him. I was thinking about an episode of To the Best of Our Knowledge on NPR last weekend, the topic of which was "Why Do We Love Sad Songs?" One segment concerned the supposedly saddest piece of music ever written, Samuel Barber's Adagio for Strings. The author of a book about that piece, Thomas Larson, explained how Barber wrote it during one of the happiest times of his life. An artist, he said, does not need to be in the same mood as the work he is creating. Rather, he needs to have access to that emotion--and that access might, in fact, depend on distance. In other words, if Barber had actually been that sad, he might not have been able to write that piece at that time. The reverse might be true of Wallace.

*Yes, I skimmed a lot of that. And some of the footnotes.


Monday, January 07, 2013

Yes, pictures and music are sometimes better than words

I have a vague, troubling memory from my graduate school days. OK, more than one memory. Actually dozens. Hundreds. Oh, God, the flood of vague, troubling memories ... ! But the one that's troubling me today is the sound of my voice, intoning to a classroom full of undergraduate composition students something to the effect of: "Anything that cannot be articulated in words is not worth articulating." In my defense, I think I was trying to counteract certain students' desire not to write anything about what they had just read--or seen or listened to--because putting their thoughts into words would "spoil it." Which, all you students out there must admit, could also be interpreted as a cover story for laziness.

Nevertheless, I was young and (relatively) brash myself in those days, and given to pronouncements that now make me cringe. So let me say for the record: Words are not always the best medium of communication. They are the most practical, and practiced, medium, but there are many, many occasions when another form works better.

All of this is by way of saying that I bought my husband a copy of Philip K. Dick's Flow My Tears, The Policeman Said for Christmas, and this particular edition came with this most extraordinary cover art by Chris Moore.

Of course, you also need to read the book beneath the art. As I understand it, Dick's novels all start out brilliantly (as far as I can tell) but then tend to collapse spectacularly after about p. 20; this is one of the few that makes it all the way through.

But if you want to experience the true spirit of 1970s science fiction, this cover art delivers, in a way mere words cannot. Take a look, and then tell me I'm wrong.