The good news of writing, I often tell my disbelieving students, is the good news of terrible first drafts. All first drafts are terrible. Every one. Kurt Vonnegut, who wrote 27 book-length works of fiction, non-fiction, and drama, said: “When I write, I feel like an armless, legless man with a crayon in his mouth.” First drafts are born as disappointments—and it really can’t be otherwise. It’s not their fault. Everyone who has engaged in any creative practice, from cooking to clogging, knows this is true. Your first attempt is usually not just bad, but awful. And that’s okay.

Terrible first drafts are the good news of writing because a) they are the great equalizer: all first drafts are terrible, and b) it means drafts usually get better. Even though I earn my paycheck by teaching people how to write more gooder, I don’t actually enjoy the process of writing all that much. Writing makes me feel like an armless, legless man with a broken crayon in his mouth. I love, however, to have written—because then, once I’ve got something, anything, on paper, then I have something to tinker with. Then I have something to disassemble and put back together. Then I have something to revise.

To revise means, literally, to re-envision. To reimagine the draft from the ground up, to chop and hack and cut and rearrange the pieces into a more effective or more pleasing order: some of my fellow composition teachers have brought scissors to class and actually had their students cut their papers up into pieces so they can play around with different arrangements. (Though I have not been brave enough to try this in my classroom yet, I admire their dedication to the pedagogy of creative destruction.)

I find in revision a pretty good metaphor for thinking about anything that involves some sort of craft or iterative practice—that is, anything we do again and again and again with the hope of improving, not by leaps and bounds, but by slow, methodical steps. If there is something in your life that you pay attention to and attempt to improve, then you are revising. When you join or leave a community, you revise that community. Today revises yesterday just as tomorrow will revise today.

If revision had a spirit guide, it would have to be, I think, Walt Whitman, the poet who revised a single book of poetry, titled Leaves of Grass, through successive editions over the course of nearly 40 years. The first edition, in 1855, contained 12 poems. The last edition (the so-called “deathbed” edition), published in 1892, contained nearly 400, and the editions in between contain revisions of the original poems. (These multiple editions are a great boon for literary critics like me, for they give us endless hairs to split and thus ensure job security.)

I look to Whitman to assuage one of the fears of revision, namely the fear of revising to the point of losing the essence of the original. Which version of Leaves of Grass is the correct version? Is it the original vision he had as a young man, or the culmination of nearly 40 years of practice? Or is it one of the middle editions, published when Whitman was at the height of his creative powers? Whitman lets us off the hook for that question toward the end of “Song of Myself,” when he pauses to declare: “Do I contradict myself? Very well then, I contradict myself. I am large. I contain multitudes.” That is, we don’t need to go full “ship of Theseus” here.

The truth is, I think, there is no vision without revision—or, to say it more accurately, all new visions are already revisions, for as I read in a good book somewhere, there is no new thing under the sun. Nothing is totally unprecedented, and even the boldest, most “original” visions are themselves revisions of what has come before. Karl Marx’s dream of a communist utopia is a revision of capitalist society. Whitman’s speaker, the singular “I” that somehow contains a plurality of “multitudes,” is a revision of the Cartesian “cogito.” Revision is good news because it frees us from the expectation of originality, opening the gates to the kingdom of play and experimentation. So let us be large, let us contain multitudes, let us envision and re-envision.


Words that Heal

Here is a slightly embarrassing story about me that my mother likes to tell: when I was two years old, she took me in to see our family doctor. While doing his routine checkup, he asked her if I was putting together two word phrases, like “Milk please.” Before she could answer, apparently, I asked the doctor, “Can I see your stethoscope?”

Words have always mattered a great deal to me, which is why, in some respects, it seems natural, possibly inevitable, that I majored in English and later became an English professor. Words get stuck in my head, sometimes just because I like their sound. It’s like having a popcorn kernel stuck in your teeth. My mind plays that word over and over and over again, tracing all of its little bumps and ridges. Stethoscope. STETH o scope.

Here is another slightly embarrassing fact—related to the first—about me: some of my earliest memories are of worrying incessantly.

I come from a great family of worriers. My mother worries, my grandmother worries. We’re good worrying stock. As a young child, I thought it was normal to lie awake for hours at night, gripped with fear about things that were far beyond my control. I recall, for instance, learning about an invasive species of snake somewhere in the Pacific—let’s say Guam. What were they going to do about all those snakes? They were killing all the birds! This fear wasn’t the idle sort, the kind that is really more akin to curiosity than true fear. What I felt about those snakes in Guam was the gut-twisting, bowel-punching fear that, evolutionarily speaking, is supposed to make you jump back from the rattlesnake hissing at your feet.

My natural aptitude for worrying went through something like boot camp in my early adolescence, when a string of bullies made me begin to hate—to hate my body, to hate my ethnicity, to hate most of all my weakness, my inability to stand up for myself. By then, I knew it wasn’t normal to curl up into a ball in the back of the school bus and spend the entire trip praying no one would speak to me; even worse, I had allowed myself to believe that somehow I deserved it, that it was my fault. The words that got stuck in my head then were not good words. Not chocolate words to roll around on your tongue, but acrid, astringent words that choked my throat as I swallowed them down.

If you’ve never had the particular pleasure of anxiety—which is the term a string of therapists I eventually turned to insisted on using, no matter how hard I argued against them—it’s like having Donald Trump’s Twitter feed running on repeat in your brain, spewing poisonous, terrifying words—way worse than “Crooked Hillary.” Like a Trumpian tirade, anxiety is also based on some set of “alternative facts” that have only the most tenuous grasp on reality. The worst thing is that you believe it.

Words were a symptom of my illness, and they were equally a sign of my cure. First through poetry, where I rekindled my love of language, spending hours obsessively writing and rewriting each poem—no joke, 30, 40 drafts easy—before it was right. I eventually worked up the courage to read my poetry in front of people, and late-night coffee houses became my first church, my first glimmer of psychic salvation. People said they liked my poems, and I secretly started to believe it might be possible for me to like myself. Later, in college, a beautiful girl said she loved me—she used those words!—and I loved her, and I told her, too, and years later we said two other short, powerful words to each other—“I do”—and those words healed me, too, more than any poem.


On (Not Being) Moved by the Spirit

A Reflection for 2/25/18

In 2003, when I was a junior in college, I got to spend a semester studying at Oxford University in England. The walk from the house that I shared with three other American undergrads to New College, where I met with my professors, took me down St. Giles’s Street past two significant landmarks: the Oxford Quaker Meeting House and The Eagle and Child, which is the pub where C.S. Lewis and J.R.R. Tolkien would meet with other writers to eat, drink, and read unfinished drafts of their work.

I’ll admit that I visited the pub before setting foot in the Quaker Meeting House, but from the start I was intrigued by the Quakers. I had read about their radical history, their pacifist reputation, and the egalitarian structure of their worship—with no official clergy or minister, on Sundays the Quaker society of “Friends” sits on simple benches arranged in a circle around the room. Quakerism holds that each individual has a direct relationship with the divine, with God, and so their “service” consists of a meditative silence punctuated by congregation members who speak “when the spirit moves them to speak.”

Up to this point in my life, I had attended church only when visiting my grandparents for extended periods, and even then, my family tried to weasel out of it. When asked about my religion or faith, I would identify as an “optimistic agnostic”—meaning that I had no idea if there was or wasn’t a God, but that I hoped for the best.

Before the first Quaker service I attended, the Friends welcomed me, explained the basics of their theology and worship practices, and even invited me to speak during their meeting—when the spirit moved me to do so.

“When the spirit moved me to speak?” What did that mean? How would I know? Would God whisper in my ear, and if he did, would it be the full text, or crib notes? Or would it be subtle? Would the ether begin to hum?

I was mostly drawn to Quaker meeting by the allure of sitting in silence with other people. (This same impulse led me to Buddhism later in life.) But I went to each Sunday’s meeting alive to the possibility that, sometime, the spirit might move me to speak. Sitting in silence with the congregation, I contemplated what it would mean for the spirit to move me to speak. I was 20 years old, so I knew just about everything, yet I somehow intuited that the intent of this spiritual practice was not for me to explain things to other people.

Truthfully, though several people would speak at each meeting, I don’t remember a single thing any of them said. But I do remember how they spoke. One woman, in particular, remains in my memory. Rising from the pew, her eyes half open, curly brown hair falling at her shoulders, she swayed gently back and forth and. Spoke. One. Word. At. A. Time. As though tugging at a thread. no one. could see. It reminded me of improvised jazz, but slower, halting, and without the safety net of an established rhythm or melody.

For three months, I sat in the meeting house and waited for the spirit to move me to speak. I never spoke. Whether I was not touched by the divine, or just shy, I can’t say. I wanted to speak, but the time never felt right. The call didn’t come.

In fact, not feeling called may have been a more important spiritual lesson, for the most profound moments of service occurred when I could hear nothing but the soft breath of the congregation. And this is fitting, because the Latin root of the word “spirit” is spirare, meaning “to breathe.” And it was while sitting in the meeting house, a 19th-century building of wood and stone, that I first learned that just breathing can be a spiritual practice—one that takes a lifetime to master at that.


Excess and Contradiction

Sermon for UUMAN, New Year’s Eve, 2017

Good morning, and happy new year. I want to begin this new year’s message in an unlikely place: with a poem by Edna St. Vincent Millay. If you don’t know Millay, do yourself a favor and look her up one day when you’re killing time online. She’s more interesting than Facebook, I promise.

Nowadays, if you hear someone is a poet, the image that probably pops to mind is a shabby, starving artist type whose “real job” probably has more to do with pulling espresso shots than weaving rhyme schemes. Or, maybe, more generously, we might imagine an earnest MFA student scraping along by teaching freshman comp and creative writing at community college.

But Millay is different. In her time, the 1910s and 20s, Millay’s poetry made her a bona fide celebrity, an “it girl.” She was an early bohemian and modernist who lived in Manhattan and contributed to a burgeoning subculture that embraced feminism, free love, non-normative gender roles, and art for art’s sake—all a full 40 years before the beatniks and hippies would take up similar causes.

Here is the first poem from her most famous book, titled A Few Figs from Thistles, which you can also find printed in your order of worship.

“First Fig”

MY candle burns at both ends;
It will not last the night;
But ah, my foes, and oh, my friends–
It gives a lovely light!

Here, the speaker uses the image of a candle lit at both ends, a stock image or cliché, to describe living with bright, hot intensity. In this context, she is referring specifically to pursuing passionate love and bodily pleasures. And yet, as Millay is aware, this is not a sustainable way to live. Her candle “will not last the night.” She’s going to burn out, and she knows it.

But—and this is a crucial “but”—she then pivots. First, she addresses her “foes,” reminding us that acts of beauty, creativity, and pleasure can also be acts of resistance. Then she addresses her friends and insists that they also acknowledge its “lovely light.”

I love this little poem. It’s hard not to. If you think about it, her message is pure rock n’ roll, right? Burn bright, flame out early, but create a moment of intense beauty as you do. This is Jimi Hendrix before Jimi Hendrix was Jimi Hendrix. Or, to adapt Willard Motley, this is “living fast, dying young, and leaving a good-looking corpse” behind when you go.

Millay continues this defiant tone in the book’s second poem, which is also in your order of worship.

“Second Fig”

SAFE upon the solid rock the ugly houses stand:
Come and see my shining palace built upon the sand!

Millay continues her defiance when she points to the “ugly” houses standing safe on solid rock. I imagine Millay might consider my own suburban house one of these “ugly houses.” In the second line, Millay again pivots, but here she invites us to come join her in appreciating the shining palace built upon the sand and waiting to be washed away. These ugly houses and shining palaces are, of course, metaphors for all creative works, but the message is the same.

And the rest of the book continues in the same vein of celebrating temporary, fleeting beauty and brief, intense pleasures: Wednesday’s passionate love that fades away, without any drama or regret, by Thursday.

For Millay, celebrating intense and unsustainable beauty was her way of thumbing her nose at prudish Victorian traditions. The Victorian era, generally speaking, highly valued propriety and austerity. Women’s dresses covered everything below the chin and above the ankle—but don’t worry, their shoes covered everything below the ankle. The rising middle class did everything in their power to bring “respectability” to the working class. Social and gender roles were clearly prescribed. Millay and her ilk burst on the scene as the Victorian era was slowly dying, causing massive scandals and creating incredible art. (The two are rarely separable.)

These are clearly not poems about Christmas, or Hanukkah, or Kwanzaa, or even New Year’s Eve. Nevertheless, I think these poems have something to teach us about this time of year.

The holiday season—which, if charted by my own average consumption of sugar, fat, and adult beverages, stretches from the end of October to January 1st—is also a season of intense and unsustainable indulgence in the pleasurable things in life. The winter holidays are when, among other traditions, we eat the best foods and drink the best drinks and spend money freely—often to excess—and generally are much more inclined to cast away our concerns for tomorrow.

When I read Millay, as when I listen to Iggy Pop or Jimi Hendrix, I fall in love with the idea of living for today, and according to my waistline, that attitude extends to my holiday plate as well. Yes, I think, that is the good life—burn the candle at both ends and grasp that brief moment of perfect, transcendent satisfaction.

Yet I quickly find that chasing after these fleeting pleasures is, frankly, exhausting. As much as I want to live in the shining palace built upon the sand, I also want to retreat to my safe, boring, “ugly” house built on solid rock. And I equally find myself drawn to the austere, stoic ideals. Like a Victorian, I think yes, the good life is mastering your desires and not giving in to empty temptations.

Now, a psychologist might look at my predicament and call it “cognitive dissonance.” Cognitive dissonance is psychology’s term for the discomfort that comes from holding two contradictory beliefs in mind at the same time—like a smoker who knows that smoking causes lung cancer but continues to smoke, or a poor person who votes for the politician vowing to cut welfare.

Our culture, it seems to me, has an advanced case of cognitive dissonance when it comes to the holidays and overconsumption.

On the one hand, we’re encouraged by friends, family, media, multinational corporations, and perhaps our own natural inclinations to go back for that second (or third) helping of cookies, or yet another glass of egg nog. Whether from nature or nurture, it seems right to pack on a few pounds come winter time.

On the other hand, we’re simultaneously afraid of overdoing it—or at least I am. Some of these anxieties are born out of real health concerns. We know the dangers of consuming sugar and alcohol excessively. Additionally, we in the United States have inherited many of the Puritan attitudes that saw indulging in pleasure as sinful. The original sin—eating the forbidden apple of knowledge—was a sin of overreaching appetite.

These are not new problems, of course. In fact, whole schools of philosophy have grown up to deal with these issues.

One name commonly invoked to justify overconsumption is Epicurus, the ancient Greek philosopher whose name is the origin of Epicureanism. You may have seen issues of Epicurean, a magazine devoted to fine cooking and dining, in the grocery store checkout line, or Epicurious, a website for aspiring gourmets. Most people understand the philosophy of Epicureanism as being about maximizing pleasure, especially the pleasure of eating and drinking, and as an adjective the word “Epicurean” denotes someone who is especially fond of luxurious consumption—so when we splurge on that prime rib roast, or throw an extra knob of butter into the skillet, or fill our glasses with another splash of punch, we’re all being little Epicureans. Just as when I read Millay’s injunction to live with fiery passion, I too feel a strong draw toward Epicurean delights. For instance, I’ve spent far more time than I’d ever admit studying—literally, studying—the techniques of great pizza making. (Aside: your dough must rise for at least 18 hours, don’t rush the proofing, slice toppings thin, super-hot oven.)

For Millay, as we saw a moment ago, embracing temporary beauty or pleasure was an act of defiance. She advocated bright, fiery passion in defiance of Victorian austerity. But is there anything defiant about eating more than you should during the holidays?

Maybe. The line that sprang to mind when I was lying awake one night pondering this problem was the injunction to “Eat, drink, and be merry, for tomorrow you may be dead.” The origin of this phrase is Biblical. In its Biblical context, the phrase was used to call out the hypocrisy of people who pursued worldly pleasure at the expense of their souls—that’s how St. Paul uses it—and its sentiment has been invoked in literature for more than a millennium. The Roman poet Horace, for instance, coined the phrase carpe diem in a poem advising people to make the most of today, since we can’t know what the future has in store. You’ve probably heard “carpe diem” translated as “seize the day,” but according to one source I consulted, “carpe” means “to pluck”—as in to pluck a fruit when it is ripe, and I like how “plucking” involves a much more visceral sense than the militaristic “seizing.” The moment, this moment, now, is a ripe fruit for you to pluck.

In the English Renaissance, carpe diem poems became a whole subgenre of poetry. The poet Robert Herrick, for example, famously advised the young:

Gather ye rosebuds while ye may,

Old Time is still a-flying;

And this same flower that smiles today

To-morrow will be dying.

An even starker and wonderfully morbid illustration of this idea comes from ancient Egypt. The Greek historian Herodotus describes the Egyptian feasting rituals, which were elaborate affairs in wealthy households. Remember, Egypt at that time was a major world power, so a feast at a wealthy Egyptian house would be like New Year’s Eve at the Kardashians’ home—or so I imagine. Unlike the Kardashians—again, I assume—Egyptian feasts featured what has been called the “skeleton at the feast.” As a feast would end, according to Herodotus, a man would enter the room carrying a replica corpse or skeleton that was designed to look as realistic as possible. He would carry this corpse from guest to guest, saying to each, “look on this, and drink, and be merry, for when thou art dead, such shalt thou be” (a 19th-century translation of Herodotus).

So in a way, maybe there is something defiant about how we celebrate the holidays with overindulgence. Maybe, as these ancient sources suggest, excessive eating and drinking helps us feel extra alive and thus defies the only true inevitable—death.

But even if that is true, it’s no way to live, at least not long term. A candle that burns at both ends will not last the night. Despite his reputation, Epicurus actually cautioned against overconsumption. Yes, he did think that the way to a good life was to maximize pleasure and minimize pain, but he also recognized that overindulgence leads to pain, as I’m sure the majority of the over-21 crowd has experienced on January 1st at least once. Ironically, in fact, what Epicurus actually advocated wouldn’t fit well within Epicurean magazine. What we today call Epicureanism might be better labeled hedonism—the pursuit of pleasure above all else. Epicurus, though, does not argue that we should only pursue pleasure and avoid pain. Rather, he advises us to work to attain a state of being he called ataraxia.

Ataraxia is usually translated as being tranquil or unperturbed. In a state of ataraxia, there would be no need to wheel out a skeleton at the end of a feast, because there would be no feast. What Epicurus actually taught is that, when we free ourselves from the unnecessary pain caused by our minds, we need very little to be satisfied—simple food shared with friends in a peaceful setting is pleasure enough. He is very Buddhist, in fact: pain causes us to seek pleasure, but overindulgence in pleasure causes more pain.

I’ve wrestled with the cognitive dissonance of indulgence and austerity for a long time, and the closest I’ve come to finding an answer is by thinking of it as seasonal. To every thing, there is a season. Millay’s season of indulgence and fun followed the Victorian season of propriety and restraint. So if you feel the holidays have been excessive, as mine certainly have, consider following the advice of the stoic philosopher Seneca, who encouraged a close friend to “vaccinate” himself against hardship by practicing austerity. Set aside a certain number of days, Seneca wrote, during which you eat meager, coarse food, all the while asking what it was about this condition that you feared. “Endure all this for three or four days at a time, sometimes for more, so that it may be a test of yourself instead of a mere hobby. Then, I assure you… you will leap for joy when filled with a pennyworth of food, and you will understand that a man’s peace of mind does not depend upon Fortune; for, even when angry she grants enough for our needs.”

And if you’ve planned a New Year’s Eve party for tonight, consider inviting a skeleton.

Thank you for listening, and happy new year.

Several people asked me about Millay after the service. If you’d like to read A Few Figs from Thistles, you can find a first edition online here for free:


Just Don’t Call Me Late for Dinner

A couple of weeks ago, one of my composition students asked me why I prefer to be called “Adam” rather than “Dr. Fajardo” in class. The question took me off guard, and I didn’t give her a great answer, but, in some respects, being taken off guard is exactly why I ask my students to call me by my first name. This student was asking because she had accidentally called another professor by their first name. That professor had scolded her, saying something along the lines of, “It took me years to earn this degree, so you need to show me the respect that it entails.” I told her that I go by “Adam” because the norm at my undergraduate school was to be on a first name basis with all of your professors. I valued that aspect of my college’s educational culture because it made the faculty feel much more open, available, and approachable, and I want my students to have a similar experience. That’s what I told her, but the question stuck with me.

Like my student’s other professor, I also want respect, but I’m somewhat suspicious of the respect that is automatically endowed through certifications, degrees, and prestige. This isn’t just an antiauthoritarian remnant from my short-lived punk phase. There is a real danger, I think, that comes from being at the top of your field. (People with terminal degrees are, in my view, basically at the top of their fields: while there is always someone who can publish more, write a more field-changing book, or land a bigger grant, we are the 1%-ers relative to the general population.) The danger is complacency. Recently, I listened to an interview with a chess prodigy and competitive martial artist who described how many martial arts champions go on to start their own schools. In their schools, however, they often train very little with their students because they are scared to lose. When they do train, they rely on the familiar tricks and techniques that helped them win before, which means that the masters aren’t regularly challenging themselves to keep growing. I think a similar phenomenon can happen in higher education. For academics, being right is similar to winning a sparring match. It’s good to be knowledgeable, of course, but I think we should be wary of complacency if the goal is to remain sharp and present to students.

None of this is to say that I’m not proud of my degree or that I question other professors who insist on having students address them by their titles. I learned firsthand at my undergraduate college that being on a first-name basis with a teacher does not guarantee that person will be approachable or will allow you to ask challenging questions. I could certainly go by “Dr. Fajardo” and still cultivate an open, engaged relationship with my students. Also, going by my first name does not by any means remove the hierarchy that exists between my students and me. (And I don’t want that hierarchy to go away, either—if a student were doing something dangerous or harmful, I would not hesitate to use the full power of that hierarchy to correct the situation.) I just believe that empowering students to ask hard, insightful questions is healthy for the student and the faculty member alike.

The opening of Chris Palmer’s “Reflections on Teaching: Mistakes I’ve Made” reminded me of a Zen saying popularized by Shunryu Suzuki in Zen Mind, Beginner’s Mind. Palmer admits that, when he began a second career as a teacher, he did not have a “teaching philosophy beyond some vague, unarticulated feeling that [he] wanted [his] students to do well.” While Palmer uses this unprepared feeling to frame a discussion of how he began to make a lot of mistakes as a teacher, it also perfectly illustrates the power of what Suzuki might call shoshin, or “beginner’s mind.” The power of “beginner’s mind,” Suzuki explains, is that “in the beginner’s mind there are many possibilities,” while in the “expert’s mind there are few.” “Beginner’s mind” names a state of awareness and attention that is tuned in to the present situation (and thus not encumbered by habit, preconceptions, or the desire to be perceived as knowledgeable). Palmer is able to turn around his initial self-doubts by asking lots of questions and continually assessing, in a much more systematic way than I ever have, what was and was not effective in his teaching practice. This seems to me a natural and healthy outcome of a pedagogical beginner’s mind: he didn’t come into the classroom armed with theories and “best practices” that should work. Instead, he tried lots of things, failed often, and course-corrected along the way.

I have been teaching composition and literature for about eight years—not so long in the big scheme, but more than long enough to become mired in habits that might limit the amount of learning that can happen in my classroom or could keep me from fully engaging with my students. Though I wasn’t thinking in these terms at the time, I can see that I made certain pedagogical choices this year that encouraged me to adopt a “beginner’s mind” in my classes. Going by my first name at GGC, when the norm seems to be to go by title, is one of those choices.


Getting out of the way.


Last week there was a day on my composition syllabus dedicated to teaching summary and paraphrase. These are two skills that are fundamental to a lot of the learning that we’ll do this semester, so it’s important to get them right. But let’s face it: as important as it is to be able to summarize and paraphrase well, talking about these skills can be pretty boring.

In the past, my lesson plan for teaching summary would go something like this: spend some time going over the textbook, then work on an activity that asked students to apply their new knowledge. Sounds fine, right? Only, it wasn’t. Inevitably, we’d get bogged down at the textbook phase. The problem was mismatched incentives: I felt responsible for conveying content, so even though I said we’d move through the book quickly, I’d get caught up in the particulars. This self-imposed sense of due diligence clashed with my students’ incentives. There was no real challenge in what I’d asked them to do, and it certainly wasn’t fun, so why would they put in extra effort just to be rewarded with more work?

This time around, I wanted to try something different, but how could I make sure that my students both understood the content and had time for real, hands-on practice? My classrooms this semester all have laptop carts in them, so I designed an experiment. Before class, I created a new forum called the “1101 Writer’s Toolbox” in the Discussions page of our Brightspace site. Inside the forum, I made a couple of new topics. The first, simply titled “Summary Vs. Paraphrase,” prompted students to write a 10-sentence explanation of the difference between the two skills. I didn’t bother asking for a definition of each since they’d end up doing that while explaining the differences. The prompt also asked students to consider things like how much of their opinion should go into each, how long each should be relative to the original text, and so on. Working in small groups, my class had 15 minutes to write their explanation and post it to Brightspace. To raise the stakes somewhat, I said we’d be reading each response aloud and voting on the best one.

Then I took a step back and let them get to work. At first the class was quiet. They puzzled over the textbook for a minute and asked each other and me a few clarifying questions. But in less than five minutes a pleasant buzz filled the room as they started talking about the concepts and debating which wording to use. I moved around the room, monitoring their progress and reading over their shoulders, but for the most part I just kept out of their way.

When the time came to read their responses, I was surprised and pleased by how good they were. In fact, there was only one concept that none of the groups had mentioned, which was the idea that summaries also need to show the logical connections between the main points. Taking a moment to mention this point to them helped me transition into our final short activity.

For the last 20 minutes of class, I showed my students a short animated film called “Death Sails” and asked them to write individual summaries of it that would also be posted to the Brightspace discussion thread. Before they began writing, I nudged them to include not just a “laundry list” of events, but also why events turn out as they do—that is, to point out the logical connections between events. This final activity also went well, and I was pleased to see them connect the dots with causal phrases.

An added benefit of this activity is that it substantially increased the number of iterations of the main ideas. Iteration increases retention, and the structure of this lesson plan more than doubled the iterations while also increasing student engagement. It was also a lot more interesting for me–a win-win-win!


My side gig.

I tend to scribble a lot

I have a guest post today over at the GradHacker blog. It talks about the benefits I’ve experienced from building up a small copy-editing gig and why I think grad students would benefit from spinning their academic work into a paying freelance job.

Check out the post here.
