The twentysomething brain (and beyond).

(from my manuscript-in-progress)

Significantly, Buddhists have a name for looking steadily at an object or emotion for some time and processing the emotions that arise: “sustaining the gaze.”  The ability to “sustain the gaze” without distraction from within or without is the ability to rest in the relative stability of a mature understanding of reality, to pay attention to the other person or thing as it is, to enter into a reality beyond the self, and to recognize – as I’ve said before – that the world is not a story with yourself at the center.  Walking this path, you find that your own neuroses, anxieties, and self-criticisms diminish, and your ability to engage the world with wonder and empathy and fairness increases.

Being able to “sustain the gaze” – including at yourself, to evaluate and renovate – is a sign of maturity, no matter what your actual age is.  Yet technology and media are only one part of what makes that type of maturity difficult – in many cases, achieving mindfulness means working against the structure of the brain itself, and the younger and/or the more technologically immersed you are, the more difficult this may be.  In her book The Defining Decade: Why Your Twenties Matter – And How To Make the Most of Them Now, clinical psychologist Meg Jay writes that since our brains develop from the bottom to the top and from the back to the front, among the most active portions of our brains when we’re in our twenties are the amygdalae – the deeply rooted seat of “the emotional brain,” where emotions and memories (particularly challenging or fearful ones) are assimilated into long-term memory storage.  “Evolutionary theorists believe,” Jay writes, “that the brain is designed to pay special attention to what catches us off-guard, so we can be better prepared to meet the world next time,” and since our twenties are full of experiences different in degree and in kind from what came before, the amygdalae can make us feel more than a bit as if we’re on an emotional rollercoaster, jolted up, down, or sideways by strong emotions connected to new experiences. “MRI studies show,” Jay writes, “that twentysomething brains simply react more strongly to negative information than do the brains of older adults.”  Our active amygdalae make us feel as if every emotion or event (particularly negative ones) is groundbreaking and crucial, while the frontal lobe – the last part of the brain to develop, in our thirties – hasn’t yet assumed its full role as mediator, the site of reflecting, evaluating, putting life events into perspective, and calming oneself down.  As college students, we can feel whipsawed by disappointments or uncertainties in work, school, or relationships that may seem minor even five years later, simply because, in neurological terms, that event is standing center stage in our brains, alone, demanding what later comes to seem a disproportionate amount of bandwidth but at the time feels simply like “the way things are.” Scary or uncomfortable things just get the brain’s attention more.

Therefore, just as we can’t afford to assume that our current techno-saturated lifestyles are natural or inevitable and cannot be changed or resisted, we can’t lose sight of the fact that what we experience as “reality” is being constructed by our malleable brains, which are always in flux, responding to stimuli from within and beyond ourselves.  We are always looking through a lens, and our lives become more centered, quiet, and powerful when we learn to see that the brain itself is where the lens is built, with and without our conscious consent.  The psychologist Douwe Draaisma, in his book Why Life Speeds Up As You Get Older, writes that the feeling he describes in the title comes from the brain having more experiences to put next to one another in memory, creating an interior sense of speeded-up or compressed time which can also lend perspective: the event we’re experiencing is and is not unique.  In our own lives as thirty- and forty-somethings and beyond, we become able to calm ourselves even in difficulty with memories of previous survival, experiences we can put next to new ones in our memories as we draw upon the reflective, judging, forecasting powers of the mature frontal lobe:  I went through this before, we think, and it didn’t kill me; this feeling is painful, but pain does pass.

By contrast, the relatively active and unmediated amygdala of a twentysomething and younger brain is seeking and seizing on new stimuli for the brain’s own development, then trumpeting to itself the importance of what it has found.  It wants to be plugged in.  It wants to connect.  It wants to be fed.  It can’t really help seeing itself as the most important thing in existence.  This, I’d argue, makes young brains especially susceptible to the rollercoaster of anxiety and stimulus-seeking that the Internet is designed to provide, and to the consumer anxiety that follows.  When every part of your interior life feels suffused by the imperative Grow! See more! Become a self! Figure out who you are! Build memories and experiences! – and when, as we’ve seen, these imperatives are arising from the literal structure of the brain itself – it can feel very logical to respond to those urges by checking Facebook and email obsessively, by clicking on the Zappos ad that pops up next to the Facebook post in which you’ve been chatting with your friends about a new pair of shoes for your job, or going out to shop for a new suit, or standing a round of drinks for friends and new social contacts at the bar, and swiping your credit card to make all of this happen and checking Facebook on your phone again to see what your friends are saying about you.  Jay’s The Defining Decade cites studies that reaffirm a familiar bit of financial-planning wisdom: in our twenties, it’s so easy to rack up debt because the future self that will be on the hook for it does not seem real, and because the actual and perceived pressures to form social and professional identity seem so urgent.  (Remember, the frontal lobe, where forecasting happens, isn’t quite grown up yet.)  In a study Jay cites, two groups of twentysomethings were shown two different kinds of pictures – one group saw current pictures of themselves, while the other was shown age-progressed versions – and then each group was asked to set aside retirement savings for the person in the picture.  Unsurprisingly, the subjects who literally “saw their future selves” set aside more than twice as much as those looking at their present selves.

This “present bias,” to which we’re susceptible at any age, is heightened by a consumer culture constantly shrieking buy, watch, consume this to become this type of person, reducing – in quintessential capitalist logic – people to things and identities to trappings of things.  Creating consumers is about creating or heightening anxieties, then offering purchasable commodities as “solutions” to those anxieties.  (Partisan media communities can work the same way, offering stories and imagined communities of the like-minded to stoke views of the world that the media outlet itself has helped to create, reinforcing all of this as “reality.”)  Thus before they know it, twentysomethings can get stuck in a perfect storm of debt and anxiety created by what after all could be called socially and even neurologically “natural” goals: creating social connections in community, building an identity in others’ eyes and your own, generating stimuli and experiences that construct the brain’s bank of memory – and thus, perhaps, the self – from the inside.  Perhaps that new handbag or pair of shoes or book or item of outdoor gear or dream vacation adventure is, for the younger brain, literally harder to resist, especially when no other experience of the world exists yet to set against it.  Just as it’s easier to fume against those on one side of an issue if you don’t actually know anyone affected by it in ways you aren’t, or anyone who disagrees with you, it’s easy to grasp at the available straw of consumer goods to cement that always-in-progress thing, your own identity, and dispel those most uncomfortable emotions – uncertainty and ambiguity and fear – when you have developed no other way to turn down the volume of that [perhaps imaginary] “need” shrieking in your ear.

As I write, as a tenured associate professor in a job I love, nine months away from age 40, I’m smiling a fond and bittersweet kind of smile: in my twenties and early thirties, I went through exactly what I’ve just described.  I got good grades in college, where I thrived in a double English and Journalism major and a newly formed Honors Program, edited the literary magazine, took as many creative writing classes as I could, and had the great fortune to learn through those experiences what I loved: writing.  An internship at a small but stellar advertising firm became a job there, from which I made the transition to graduate school in literature.  Yet although my performance in graduate school was outwardly as “successful” as my undergraduate career had been – perhaps more so, since I was publishing and teaching by that time – I couldn’t fully admit to myself how deeply I felt panicked and out of my depth.  I knew I was smart, but I’d always read and written more or less just what I wanted and been able to get away with it, seldom working systematically or with the kind of concentration I knew I was capable of.  Despite my high grades, great student rapport, happiness in the program, and conviction this was the right place for me in the world, I still felt like an imposter, bitten deep with the conviction most academics feel but few confess: they’re going to find out I don’t belong here and send me home.   I worked hard.  I read constantly.  I published scholarly articles and short stories while teaching undergraduates and taking courses and qualifying exams and going to conferences.  I made friends and threw parties.  But it never felt like enough.

So I went after the external trappings of professional academic identity with everything I had – specifically, with every credit card.  By the time I graduated with my Ph.D. and secured my first tenure-track job, I was carrying $50,000 in consumer debt.  Some was perhaps justifiable: I’d had to finance a new car and job-search suits and plane tickets to MLA.  But easily half of it was clothes and meals out and concert tickets, the trappings of a social and professional identity I wanted badly to believe in.  And most of all it was books, new and used, from the three different used bookstores and the excellent on-campus new bookstore I haunted so much that even their staffs started giving me bemused, incredulous smiles: you’re here again?  If I read about it or met the author at a reading or conference and could justify its purchase to myself for “professional reasons” – I could write about this! I could teach this! It will help me with my dissertation! – I usually bought it.  In addition to the neat little piles of new acquisitions on my apartment coffee table, I kept the library books I was also checking out “just to look at.”  Unsurprisingly, overextension in every direction meant unproductivity in any direction, and it meant too often that those optimistically purchased or borrowed books got read hastily or not at all.  I worked multiple jobs and earned a stipend and kept myself afloat, but largely because the bubble of credit card expenditures lifted me artificially high.  I alternated between exuberance and comfort and choking panic when I looked at all my books.  So much hope and possibility.  So much obligation I feared I’d never live up to.  So much enthusiasm.  So much fear.  Fueled by promises and hard work and following through and discovering but also by a series of white lies, especially to myself, that got darker and darker: I’ll get a job and pay all this back.  It’ll be okay.  I’m only twenty-____.  I have a lot of time.

As hard as it is to admit this now, I tell this story (especially to students and twentysomething friends) to emphasize how common it is and how quickly what seem like defensible choices can turn into something else you never meant to happen.  I’ve spent the years from job to tenure fighting to pay down that debt and get these old habits under control, and I feel now that I have – as much as any habits ever are under control.  Every year I get better at evaluating purchases and possessions by the truth of my own instincts and emotions, not others’ expectations: do I really like and want this in my life as it is right now, or is what I’m feeling some borrowed emotion or obligation? Will I still want this in a week? Do I feel a sense of real pleasure and possibility when I look at this, or do I feel primarily ‘I should do something with this?’ [‘Should,’ I’ve learned, tells me that something’s probably not going to happen.]  The book-buying has continued, but so now has the book-giving-away.  This past month I gave away six big boxes of books, from my office and from home.  When I saw how many of those books were practically new, good scholarly editions bought in fits of conference or dissertation optimism and barely (or never) opened, I thought about the anxious girl who bought them, and I forgave her.  It’s okay.  You have enough now, and will continue to have enough.  And that’s what it’s really all about: let your frontal lobe talk to the rest of your brain about reality, the person you actually are and the life and career you actually have and want.  Let yourself settle down and look at what is, and leave space for the good things still to come.

This is your brain on skis.

I learned to cross-country ski from a student of mine about three years ago, on an excursion with a bunch of other novices and a few ultra-experienced daredevils who could launch themselves off the side of a hill, spin in the air — skis flashing like juggled knives — and land upright.  I fell a lot and was proud just to shuffle forward on flat ground.  But by the end of the afternoon, I was sold.  As often as possible, that winter and the next, I rented a pair of boots and skis and poles from a shop in town and headed out on the winter-groomed trails and river bottoms around town. I do best on flat ground (once I stopped a downhill descent by just hurling myself into a snowbank), but it’s amazing how easy cross-country skiing really is.  Now snow is something to look forward to.

Last month, after the first snowfall, I finally decided that since a winter’s worth of rental is the cost of a good used pair of skis, and since I am determined to combat the effects of too many hours at the desk, it was time to get my own skis — cheapskatishly, of course.  I went to the shop I always rent from, where the owner remembers my boot size from year to year (that’s the kind of town this is). “You got anything you want to retire from your rental fleet?” I asked.  “Yep,” he said, “I could let these ones go… and these poles, how about $10 for the pair?”  And I took from the rack the same sturdy white and red and black skis I had rented every winter, thrilled with ownership: now they are mine.  Even with my brand-new boots — which broke in like a dream — I paid half the cost of a whole new setup.  Even when I’d only been on them a few minutes, I knew this was the best money I had spent all year.

Even aside from the exercise benefits, there’s so much to love about skiing, from the soft swish, swish sound to the mindless pleasure of the repeated motion and the changing scenery all around you to the haze of historical associations that come to mind: Norwegians fought the Nazis on skis.  Your tracks are just a few of many in the snow – goose, turkey, deer, dog, coyote, human – but often you are the only moving person in the landscape. A hawk or eagle wheels dreamily overhead, slowly.    You can watch a flock of geese bank and drop, wings high and still, down onto the water and think of Dante’s analogy in Inferno when Paolo and Francesca heed his call and turn back and drop out of the air with exactly that motion – he compares them to doves returning to the nest.

On a clear day the light is dazzling, working along with the cold air to blast the indoor, sedentary fog from your brain.  Like yoga, skiing loosens up your knotted thoughts and sets everything running forward again.  Forward and outward.  Again I am reminded: getting outside and moving flips the switch that converts us back into our Best Selves.  Things become clear, stress loosens its grip, problems edge closer to solutions, outlines of the life you want shimmer into view.  And the possibilities become calm, specific, clear: I will do things differently in 2014: fewer late nights in the office and more time with friends.  I will not worry so much.  I will save more money and go ahead and switch the 403(b) to the “aggressive” allocation.  I will eat less sugar and try not to freak out in head-balance pose in yoga class, because I know it won’t kill me, and learning that momentary fear won’t kill you is the point of yoga – one of many.  I will try a new haircut.  I will not be afraid of the future because I am doing my best to shape it now.  And I will definitely spend more time on skis.

“Learn, little cousin”: Seeing with Signorelli.

Something about Christmas mixes emotion and memory like no other time.  This is the hinge of the year, the liminal space where we could step one way or another way, where we can and can’t see what’s coming.  Where we cannot avoid thinking about our relationship to time and how we perceive the world as it flows and unfolds around us, never stopping.  Where we ask, with a mixture of trepidation and hope, what’s next?

A couple of weeks ago, I read this incredible essay.  I shared it with students and got some of the most moving and thoughtful responses I’ve ever received from them on anything.  Mortality, art, the heightening of emotion we get from coupling the two, the questions it raises about “how shall we then live” — not, apparently, the most optimistic thing to share with twenty-ish-year-olds, but questions they are eager to consider, and rightly so.

Luca Signorelli, “Man Carrying Corpse,” ca. 1500.

The centerpiece of the essay is this drawing by Italian Renaissance artist Luca Signorelli (c. 1445-1523), which prompts author Zadie Smith to sharp, compassionate, thoughtful meditations on art, life, and how we can live [that ancient human problem] fully and even joyfully in the presence of the knowledge of our own mortality.  She writes:

“How to be more present, more mindful? Of ourselves, of others? For others?

You need to build an ability to just be yourself and not be doing something. That’s what the phones are taking away, is the ability to just sit there…. That’s being a person…. Because underneath everything in your life there is that thing, that empty—forever empty.

That’s the comedian Louis C.K., practicing his comedy-cum-art-cum-philosophy, reminding us that we’ll all one day become corpses. His aim, in that skit, was to rid us of our smart phones, or at least get us to use the damn things a little less (“You never feel completely sad or completely happy, you just feel kinda satisfied with your products, and then you die”), and it went viral, and many people smiled sadly at it and thought how correct it was and how everybody (except them) should really maybe switch off their smart phones, and spend more time with live people offline because everybody (except them) was really going to die one day, and be dead forever, and shouldn’t a person live—truly live, a real life—while they’re alive?”

To those thinking, “Wow, this is depressing!” I’ll say, well, yeah.  But admitting it is the start of spiritual practice, and adulthood, and real life, and it keeps us from using and exploiting others to prop up illusions about ourselves; we are obliged to live the most just and responsible lives we can, while we’re here.  Pema Chodron: “Since death is certain and the time of death is uncertain, what’s the most important thing?” Marcus Aurelius: “Every moment think steadily…to do what you have in hand with perfect and simple dignity and feeling of affection and freedom and justice; and to give yourself relief from all other thoughts. And you will give yourself relief, if you do every act of your life as if it were the last, laying aside all carelessness, passionate aversion from the commands of reason, hypocrisy, self-love, and discontent with the portion that has been given to you.” Jesus, paraphrased: “This world is not all there is.  But, even so, remember, while you’re here, that you are not the center of it.  Act accordingly.”

But how can we keep being most alive?  How can we keep seeing our lives and our surroundings and our opportunities for contact with others as eternally fresh and new, not dulled by habit?  A few years ago at a writers’ conference, I learned about the Russian formalist Victor Shklovsky from the writer Anthony Doerr (whose recent piece quoting him is here).  “If we start to examine the general laws of perception,” Shklovsky writes, “we see that as perception becomes habitual, it becomes automatic….If one remembers the sensations of holding a pen or of speaking in a foreign language for the first time and compares that with his feeling at performing the action for the ten thousandth time, he will agree with us.”  Doerr goes on:

“But art, Shklovsky says, ought to help us recover the sensations of life, ought to revivify our understanding of things—clothes, war, marriage—that habit has made familiar. Art exists, he argues, “to make one feel things, to make the stone stony. The purpose of art is to impart the sensation of things as they are perceived and not as they are known.”  Sometimes you have to make yourself a stranger to your own life in order to recognize the things you take for granted. Like sunsets, or hot showers, or alphabets. My health, my family, the streams of photons sent from our star—how had I stopped actually seeing these things?”

Looking at Signorelli’s muscular man – so determined, so strong, so apparently eternally alive – carrying the corpse away to some destination we can’t see, I thought of Shklovsky again.  Art had made me re-see my living body’s presence in this world.  Who was Luca Signorelli, anyway?  Because I had been teaching Dante and Michelangelo, my head was already full of the Renaissance. But Signorelli was new to me.

Luca Signorelli, Self-portrait, detail from “The Preaching and Acts of the Antichrist,” 1500

What I’ve been learning about Signorelli is not much yet, but it seems to suggest one thing: a good craftsman, kind of quietly overshadowed, then and now (perhaps inevitably) by Michelangelo and Raphael.  In his Lives of the Artists (first published 1549-50), Michelangelo’s disciple Giorgio Vasari gives 75 pages to his master (whom he calls, basically, a spirit sent from God to make Florence, “the most worthy among all the other cities,” even more perfect than it already is) while giving Signorelli only five and a half.  The beginning of the chapter is a little mixed: “Luca Signorelli, an excellent painter about whom, following chronological order, we must now speak, was more famous in his day, and his works were held in higher esteem, than any other previous artist no matter the period, because in the pictures he painted, he demonstrated how to execute nude figures and how, although only with skill and great difficulty, they could be made to seem alive.” Most of his account of Signorelli is a list of what Luca painted and where.  “[D]uring his last years [in Cortona],” Vasari writes, “he worked more for pleasure than for any other reason like a man who, accustomed to toil, is unable or unwilling to remain idle.” Yet there’s also a charming anecdote about when Signorelli was a guest in Vasari’s own home and told the then eight-year-old Giorgio, who was obsessed with drawing figures instead of doing his other homework, “Learn, little cousin, learn.” (It’s kind of the Renaissance equivalent of the moment in the John Lee Hooker song when “Papa told Mama, let that boy boogie-woogie… it’s in him and it’s got to come out.”)

Yet Vasari also tells us, “It is said that when his much beloved son, who was extremely handsome in face and figure, was killed in Cortona, even as he grieved, Luca had the body stripped, and with the greatest constancy of heart, without crying or shedding a tear, he drew his portrait so that he could always see whenever he desired, through the work of his own hands, what Nature had given him and inimical Fortune had taken away.”  This is a sterner thing, a challenge.  In Peter Matthiessen’s The Snow Leopard (1978) – a classic nonfiction account of a part-spiritual-trek, part-naturalist-discovery-quest through Nepal – the real subject is not the elusive snow leopard of the title (glimpsed only once by Matthiessen’s scientist partner, never by the author himself) but what it means to confront reality without flinching, to immerse yourself in the present moment, to be in emotions without letting them bind or define you.  As he recounts, Matthiessen’s wife — a fellow author and student of Buddhism — has died of cancer.  Yet he presses on, intent on the nature of this or any pilgrimage: a confrontation with reality as it exists beyond any illusions or emotions, difficult and radiant.  “The sun is roaring,” he writes, “it fills to bursting each crystal of snow. I flush with feeling, moved beyond my comprehension, and once again, the warm tears freeze upon my face.  These rocks and mountains, all this matter, the snow itself, the air — the earth is ringing.  All is moving, full of power, full of light.”  There is a dignity and a nobility in using one’s humanity to see both that humanity itself and the larger reality of which it is only one part.

A closer look at Signorelli’s work reveals a kind of quiet, close-grained dignity and focus that refreshes our sight just by looking.  Consider the self-portrait above – one of those little self-portraits Italian Renaissance artists were always sneaking into larger works. The man looks right at us and leaps off the screen.  He’s as testy, reserved, yet forthright as any similar man we might know.  He’s a human, looking at us across more than 500 years.  The brushstrokes are not “refined.” This is not an artist as famous as many of his contemporaries.  But he is a painter who knows how to do a job right, and does it so that it lasts.  He’s a technician, a craftsman, taking pride in his work, as craftsmen do, and thereby ensuring its afterlife.  He doesn’t insist didactically on what his work should “mean”: in his clear quiet faces, in the muscularity or limpness of bodies, he just shows us what is.  Look at Gabriel, below, with his tender throat, his unreadable expression.  This is a divine being cloaked, for the painter’s purposes as for the poet’s (remember Milton describing in Paradise Lost how hungry Raphael gets, and how he eats?), in human flesh.  Which, spiritually, is kind of the point.

Take a walk to the river on a bonechilling winter afternoon and you see a Signorelli-ish austerity to the world.  The quality of the light, the starkness of the winter landscape, the small details that flash up: the white silt of snow among the roots of what’s still showing as golden-brown grass, seedpods rattling in the wind, river water slaty-cold and dark over the rocks.  There’s a quiet dignity in the cold and starkness of the oncoming winter, a reality that braces and restores and teaches us to see again in this moment as it is.  It’s important anytime: important now, at the hinge of the year, when something new can come.  Jesus came in just such an ordinary way, born to a tired and bone-chilled girl sliding off her donkey to rest in the only place left: It’s okay, I’ve always imagined her saying to Joseph, we can stay here.  I know you tried.  And Jesus’s birth in a human form in a barn — not some elegant palace — was not just an accident.  It renovates our vision of the divine in this ordinary world and in our own bodies, the bodies that sneeze and weep and shove their reddened hands into their pockets, squinting into the sun, determined to stay out and enjoy the severe and radiant light just as long as they can.

Luca Signorelli, “Head of Archangel Gabriel.”

Let there be light.

human retina

On an ordinary November afternoon I sit on a chair in my eye doctor’s dim exam room, chin in the camera-machine, straining not to blink against the stinging dilation drops leaking through my lashes.  A white flash jolts straight to the back of my head.  And then, there on the screen are photographs of my own two eyes as I can never otherwise see them: rosy, beige-pink retinas with elegant intersections of arteries and veins arching from bottom to top, meeting and twining in the middle before going on their separate ways – bringing the blood, taking the blood away.  The macula is a deeper rosy beige, a dark concentrated point like a pregnant woman’s nipple.  And in each retina a white light blazes: the head of the optic nerve, which carries the images formed in light directly to my brain.

Overpoweringly the images of Michelangelo’s Sistine Chapel ceiling come to me – Adam’s flesh illuminated by the God who strains to touch it, whose robes are not so far away from this same color pink.  The round shapes of the dome he made for St. Peter’s Basilica and the dome that Brunelleschi erected over the Duomo floor in Florence, where students and I have stood to gaze up and around and then at the fresco on the wall of sorrowing Dante stretching his hand toward Paradise – Florence itself – with Hell and Purgatory at his back.  Dante struggled with how a God who loved us could have created a world in which we’d be born to fall, and then a hell to punish us for our sins in that same world and in the same bodies He had made, ostensibly, in His image.  William Paley and Charles Darwin wrestled with whether bodies so intricately wrought as ours could have been made by a creator, by the actions of evolution over deep time, or some combination of both.  When Haydn’s oratorio “The Creation,” musically illustrating the Genesis story, was first performed, audiences leapt to their feet and applauded when the quiverings and mutterings of the first orchestral phrases resolved into the burst of choral voices proclaiming, “Let there be light!”  Let there be light: the ultimate example of word into thing.

Dante, Haydn, Paley, Darwin, Michelangelo:  I am studying each of these artists and thinkers with first-year students this semester, wondering as we talk what they are seeing, what they are putting together in their own minds with the words on the page or the overhead-projected, glowing image of Adam and God (fingers almost but not quite touching) we gaze at all together.  There is no way to know, completely, and that is as it should be.

I remember standing in the British Library and reading the words of an original copy of a Tyndale New Testament of 1526, a battered, hand-sized leatherbound book open to Matthew 7 in the case underneath me: “Care not therfore for the daye foloynge.  For the daye foloynge fhall care ffor yt fylfe.  Eche dayes trouble ys fufficient for the fame filfe day.”  From teaching at a Lutheran school – and from Hilary Mantel’s novels – I know how long it takes a human body to burn, and how righteous are the motivations of those who light the torch.  Tyndale burned ten years later for words on a page that need no translation to the literate sixteenth-century English layperson, or to me.  His working brain and his busy hands made this object that carries human language for the divine across nearly five centuries, unchanged.  And I bear the image of his book in the brain that rides unseen in my own head, the patient brain waiting at the other end of that blazing-white optic nerve.

I have been to a human anatomy lab and held a human brain in my hands; I have learned that ears are one of the two most disturbing things for new dissection students to see.  The other is hands.  Eyes can be easier, I’ve heard it said, because in death the light – the light – seems to go out of them, but I don’t think I could ever find it so.

Staring at those two rosy circles on the optometrist’s screen, those photographs of the apparatus of my actual sight — which so resemble twin suns — I’m overwhelmed by the fact of all I am made of and caught up in and do not and cannot see.  The flow of life that lit up Dante and Michelangelo and Haydn lights up me, too, flowing in to create my memories and thoughts and decisions from reflections of light on the surfaces of things, which this dot of white so bright it is beyond color conveys directly to my brain.  In looking at the retina and the root of the optic nerve, am I looking at the source of thought itself?  Am I looking at the source of the human self, what it knows and stores in the brain and calls forth as memory?  Am I looking at the raw material of my soul?

Encounters with the radically mysterious, the radically other thing, can derange us in ways that feel wondrous or painful, because they jostle us off our privileged and self-constructed positions at the center of the universe.  They remind us that life is bigger than we can know, and that it is not about us.  Yet avoiding any discomfort of spiritual or material kinds – as much as our culture urges us to do that, by consumerism, technology, and various other forms of self-help engineering – reduces our capacity not only for imagination and empathy but, simply, for existence.  “To be fully alive, fully human, and completely awake,” says the great Buddhist teacher Pema Chodron, “is to be continually thrown out of the nest.”  Thrown out of the nest – it’s a metaphor of disarrangement, of being nudged aside to make room for what else there is on the earth with us, which we are continually invited to see and to see again.  If the baby bird is not eventually nudged out of the nest into a strange and initially forbidding world, she’ll never develop the muscles to fly or discover that she is not the only being in a world not centered on her.  How rich our lives can be if we really consider this.  And how different the art we make, and our treatment of everything else in creation, can look.

- from a book-length manuscript-in-progress.

First snow, new light.

The first snow of the season started coming down as I hurried to substitute for a colleague’s 8 am class this morning – Seamus Heaney-esque sleet-milt came in pats and splats against the umbrella, white rims thickening.  By the time I left the building it had thickened to actual snow, soft and fast and intent on car hoods and bent, hurrying heads. Now it’s quieted and I can’t stop looking away from the papers in front of me to stare out my tower window at the white layered onto grass and trees by the wind.  I know it’s thicker in the open places. I know it’s thinner in the woods, where the wind might not yet have touched, and where the grass underfoot still holds a little of its last-of-summer-into-fall yield and spring. I had forgotten the particular light of a gray snowy day, iron and austere and somehow bracing.  Here it is again, the forward turn of the wheel.

Darwin’s Beetle: On geeking out.

I wrote this for another “official” college blog but want to share it here.

Darwin’s beetle: On geeking out

October 23, 2013

Our first-year common course doesn’t fit into neat disciplinary boxes, and that’s the point.  Ranging across literature, history, philosophy and science, it’s not designed to convey “expertise” in any one field but to help students develop the set of portable intellectual tools they need in any major, in future employment and in their lives as adults: strong reading and critical thinking skills, the ability to listen and discuss, and the curiosity to find answers and discern realities for themselves.  But when students ask me, “What is this class for?” I tell them it teaches a college student’s most important skill: geeking out.

The best predictor of college success is not test scores, number of AP classes, or even that elusive thing called “intelligence”–it’s the ability to fall in love with ideas, to hurl yourself into a subject so fully that it shapes how you spend your time inside and outside of class, reading and thinking and practicing and shaping yourself into a successful member of that field.  In exposing students to such a wide range of authors and texts, from Plato to Martin Luther to Martin Luther King and Mary Shelley, our first-year common course gives students lots of chances to geek out, to latch onto a fascinating concept and follow it into a lifelong enthusiasm.  It challenges students and faculty alike to step outside of known disciplinary bounds, learn something we never thought we would, and model that learning for each other.

This semester, we added a new text, excerpts from Charles Darwin’s “On the Origin of Species by Means of Natural Selection” (1859).  I was a little worried at first: although I’m a former farm kid, passionate environmentalist and omnivorous reader in the history of ideas, I’m also an English professor.  Would I be able to do justice to Darwin? Would class discussions get bogged down in the misunderstandings of “Darwinism” so prevalent in his time and ours? But in his opening lecture on Darwin, my colleague shared this passage from Darwin’s “Autobiography”:

“[No] pursuit at Cambridge was followed with nearly so much eagerness or gave me so much pleasure as collecting beetles. It was the mere passion for collecting, for I did not dissect them and rarely compared their external characters with published descriptions, but got them named anyhow.

I will give a proof of my zeal: one day, on tearing off some old bark, I saw two rare beetles and seized one in each hand; then I saw a third and new kind, which I could not bear to lose, so that I popped the one which I held in my right hand into my mouth. Alas it ejected some intensely acrid fluid, which burnt my tongue so that I was forced to spit the beetle out, which was lost, as well as the third one.”

Darwin's beetle

Just like that, my worries were over: I was in the presence of a fellow geek – so geeky that he spent five years as a naturalist on the ship Beagle, traveling round the world, collecting and learning and soaking in everything he could, so geeky that one of his friends even caricatured him riding on the back of one of his beloved beetles. The enthusiasm radiating from the page was utterly winning, slightly self-deprecating but passionately driven by the pursuit of knowledge. I knew I could trust this man. And my trust was not misplaced: over the next couple of weeks, as students and I worked carefully through the text, we came to know Darwin, through his words and ideas, as anything but the cold, hard atheist of stereotype. In their papers and in discussions, my students used words like “gentle,” “warm” and “humble,” pointing out the frequent occasions where Darwin hedges his conclusions with scientific modesty and suggests that such modesty is appropriate in the face of all that humans don’t yet know and understand–and maybe can never know–about the origins of life on earth.

But the main word we kept coming back to, in describing Darwin’s view of the world, was wonder.  A onetime-potential-future clergyman, Darwin can perhaps best be described as agnostic, although even that label’s a bit messy. Yet he never closes the door to beauty, mystery and the need for humans, in our investigations of nature and of our own origins, to open ourselves to the ineffable and challenge ourselves to see how–in the words of Shakespeare’s Hamlet–”there are more things in heaven and earth…than are dreamt of in [our] philosophy.” Even as Darwin describes the “struggle” for life between plants and animals competing for existence, even within “a plant which annually produces a thousand seeds, of which on an average only one comes to maturity,” he refers to “the beautiful and harmonious diversity of nature” and gives a simple, lyrical analogy of life as a giant tree: “As buds give rise by growth to fresh buds, and these, if vigorous, branch out and overtop on all sides many a feebler branch, so by generation I believe it has been with the great Tree of Life, which fills with its dead and broken branches the crust of the earth, and covers the surface with its ever branching and beautiful ramifications.” Wonder and humility shimmer around Darwin’s descriptions of the living world he so closely observed and so well loved, and they shape his diligent investigations toward the form “into which life was first breathed.”

As a mature thinker, Darwin learned to live with uncertainty – even to thrive on it, since it propelled him forward into all he had yet to discover and into the living world that brought him such joy. This spoke powerfully to my students and to me. How can this openness shape our own educations and personal quests? How can we discern meaning, even beauty, amid the confusion of our changing ideas and changing lives, and how can we remain open to, accepting of, and curious about that change? How can we “geek out,” like Darwin, taking full advantage of all the opportunities for discovery, challenge and change in front of us at this college? I’ve had lots of rich conversations over the years in this first-year course–but this was one I will never forget. And as we prepared to finish our reading of Darwin, I saw in the students’ eyes that they felt the same. We had come to know this distant historical figure as a person like ourselves–a humble, rigorous scientist who at heart always remained that excitable undergraduate, popping a beetle into his mouth to keep it and study it as long as he could.

The second disruptor: an open letter to our college’s president on the “digitization of learning.”

September 30, 2013

An Open Letter to Our College President on the Digitization of Learning

From The Cheapskate Intellectual

Dear President,

Thank you for being a thoughtful leader and serious colleague who has invited us to wrestle with the challenges and opportunities facing us and all other colleges these days.  At your State of the College Address this year, you introduced us to “the Five Disruptors” of higher education: the student cost-and-debt spiral; the digitization of learning; the regrouping of American communities, including faith communities; the changing profile of prospective students; and the measurement of educational excellence by learning results. I’m writing to speak in a more organized way than I can usually manage from the floor at a faculty meeting, because I’m hoping to serve and develop our conversation about the one “disruptor” that seems to me most intimately and problematically linked with the others: the digitization of learning.  More specifically, I’d like to name our frequent (and, to me, problematic) assumptions as faculty that technology, as it makes its way into our classrooms, our admissions efforts, and our and our students’ lives, is part of “progress” and therefore cannot (or should not) be resisted; that adopting the latest device or platform is necessarily productive in all areas of our lives and work; and that electronic media are cognitively, socially, and pedagogically neutral, always able to complement without co-opting the nature of the work we are privileged to engage in here.  I believe that each of these assumptions is wrong.

The notion that using technology is inevitably a marker of progressive thinking, good teaching, and general forward-looking-ness for a college seems to me to be one of the last unchallenged assumptions in higher education – ironic and sad for an enterprise that depends for its life on leading all of us to see and reevaluate our paradigms and assumptions, every day.  “Don’t just be a swimming fish,” I tell my own students (especially first-years), “learn to see the water, too – that’s what being an educated person is all about.”  Yet even as many of us shake our heads at K-12 initiatives that seek to improve cognitive skills by first putting an iPad in every student’s hands (at the same time as arts programs and library budgets are cut), we accept far too often that digital media should have a place in our classroom and are inevitably necessary and helpful in our work as they are in our lives beyond the classroom – to say nothing of their “necessity” for marketing purposes.  To accede too easily to prevailing modes of being and doing could mean acceding to what educator Lowell Monke has called “a common disregard for one of schooling’s most important tasks: to compensate for, rather than intensify, society’s excesses.”  Since college is one of the last (if not the last) places where our increasingly iPhone-enslaved students can be helped to examine cultural norms and their effect on individual minds (including the constant demand for entertainment and small bites of information that the Internet and electronic devices have helped create in them), we owe it to them to point out what else is possible for them, and what they are missing in a life conducted largely online.  As writers like Nicholas Carr have pointed out, the Internet is far from neutral in its effect on our individual consciousnesses and capacities for attention (which it flattens, fragments, or “shallows”), our capacities for presence, dignity, and awareness in interactions with others (which suffer from the cognitive reduction of people to things and ideas to soundbites that media like TV, video games, and the Internet, by their very nature, tend to wreak), and how we spend (and are able to spend) our time.  It is not neutral, and it is not without effects on our personal, educational, and civic lives, in more ways than we know.  Our students, often without quite realizing it, are seeking other ways of seeing, thinking, and being in the world than what has become their “new normal” of techno-stimulation and subsequent jadedness.  I’m coming to believe that if we, their college professors – especially within the liberal arts tradition of inquiry towards richer meaning in our world – cannot show them alternatives, then no one can, or will.

This is perhaps the place to point out that despite my techno-skepticism, I do use technology as a tool, and I’m not blind to the irony of my own doubts about it.  I run a blog, I use email and Facebook, I publish in online literary journals, and I use an online course module to disseminate course resources and enable online discussions, although like several of my colleagues I’m nervous about what would happen to us all if that module, on which we are now so dependent, were to crash, and I smile somewhat sadly at the professional reality that using these tools enriches my perceived “market value” as a teacher and a writer.  My issue is with what happens to professors and students when we allow ourselves to lose sight of the fact that technology is still just a tool, to be deployed with exceptional mindfulness and care in helping students develop intellectual and emotional independence, curiosity, and humility.  “But we are mindful of this,” colleagues will protest, and rightly so. “We do care.” Yet I wish that higher education in general and the humanities in particular (the natural home, I think, of the kind of fine-grained, humanly-complex, countercultural pressure I’m advocating) were more thoughtfully aggressive about pushing back against the new normal of omnipresent technology rather than (in various ways) acceding to it as “reality.”  When I have made deliberate efforts to discuss (and push back) against this “reality” in my own classrooms (and retooled my creative writing classes, for instance, along these lines), students have responded with initial bewilderment (“What do you mean, technology’s doing things to my brain?”) followed by dawning realization (“It’s true – it does feel different in my head when I take a walk, in silence, with no iPhone or headphones! I’ve never really done this before!”) and increasingly far-reaching and thoughtful personal transformation (“Clicking away to Facebook while I’m writing has gotten to be a bad habit – now I keep that window closed” or “I’ve never kept a notebook and pen in my pocket to write things down before – now I notice more and do a lot more writing because of that.”)  Students who have learned to unhook from and examine our new social assumptions about the way life and experience must be mediated through technology have come a long way toward the independence of thought and capacity for self-analysis that will make them flexible, curious people for the rest of their lives, able to navigate changing worlds based on internal rather than merely external compasses.  Surviving the twenty-first century, intellectually and psychologically, will depend, I think, on being able to interpret and make one’s own decisions about the tsunami of information and advertising and messages and “media” now washing over us from every direction; passivity, credulity, and discomfort with nuance, which are encouraged by the very nature of such media, are more dangerous to our lives as people and citizens in our complex world than they’ve ever been.

Yet for corporations – alarmingly influential in higher education, as in government – passivity, credulity, and adherence to the needs of the moment are profitable and desirable.  Almost the first thing one reads in Jeffrey Selingo’s College Unbound, which you have helpfully suggested to us, is a quote from Jennifer Fremont-Smith, described as “the cofounder of Smarterer, a Boston-based start-up that offers technology for validating technical skills on everything from social media to Microsoft Office programs.”  “In other industries, ‘those who don’t innovate go out of business,’” Fremont-Smith says.  “Higher ed shouldn’t be different.”  This quote is surrounded by descriptions of anxious universities partnering with industry to tailor curricula to that industry’s needs, producing workers for that industry: in 2011 a Stanford professor and Google’s director of research “offered their graduate-level artificial intelligence course online for free” and drew 160,000 students – the top thousand of whom were invited to send the professor their resumes so he could pass them on to Silicon Valley tech companies.  The University of North Texas “called in the management consulting firm Bain & Company, famous for helping corporate America restructure its operations, to assist the university in designing the college of the future for its branch campus in Dallas. The model shaped by Bain called for a limited number of majors tied to the needs of the local economy (such as business and information technology.)”  You needn’t be a confirmed corporate skeptic like me (I’m currently working against frac-sand mining in the anti-environmental “regulatory” environment born from the marriage of corporations and government) to question the sanity of pinning your institution’s future to the needs of a protean business world, and of becoming the handmaidens of entities that have – to say the least – not always acted responsibly.  Remember the first Internet boom and crash around the turn of the millennium? Remember when going to work on Wall Street was not only a “sure thing” but at least vaguely socially respectable?  If the “local economy” suddenly shifts, what internal equipment will those students be left with to respond to the changes?  Yet I have no doubt that the “forward thinking” that is driving those business-partnership decisions is born of the other great unquestioned assumption in American life, that corporate partnership is always a positive good. Universities are about helping students build independent minds, selves, even souls.  When we say “rich lives,” we’re not talking about money – but that’s the only language corporations recognize.  Like technology, the inherent qualities of corporate partnership undermine the nature of what is – necessarily – our countercultural work, focused on cultivating the unguessable, unquantifiable, inward, and defiantly unprofitable texture of the individual.  And technology is one of the forces that can smooth away that texture to make the individual little more than a good consumer.

As the late, great anti-mountaintop-removal-mining activist Judy Bonds asked, “If coal mining is so good for us hillbillies, why are we so poor?” If assessments, measurements, scores, and the like are such an all-encompassing and reliable way to measure what students are doing, how can I account for the unquantifiable thing I see among even the best-measured, highest-scoring students I teach – a longing for meaning, a sense that the affluent, technologically-smoothed society they’re living in is cheating them out of some rich texture of experience on this Earth? If technology is such an unmitigated good for human beings, why hasn’t it eradicated rather than (as the philosopher Hannah Arendt claimed) exacerbated war, oppression, and their fundamental cause – the notion that with technology all obstacles to whatever one desires, including other people, can be engineered away?  For instance, I believe another of the “disruptors” we’ve discussed – the “regrouping of American communities” – is linked to and even exacerbated by technology: even as American public life fragments into individual living rooms lit by TV sets, cognitively rich educational and even life experiences (digital and not) are becoming the domain of children and families with money.  Affluent schools at all levels are able to provide carefully designed activities and contact with teachers – the hands-on variety of pedagogies and experiences, including contact with caring adults, that has been shown to improve not only children’s social skills but the literal structures of their brains.  Poorer schools (especially preschools) park children in front of the television to keep them quiet.  (The famous open letter to Prof. Michael Sandel from the philosophy faculty of San Jose State University posits, correctly in my view, that MOOCs will create the same effect in colleges.  In today’s New York Times, an obituary of educator Jean Anyon describes her similar conclusions.)  And these differences in preparation are visible all the way up the academic chain – including in our classrooms.  To borrow a notion from that wild American “disruptor” John Brown, we as a society and as individuals are sowing the wind to reap the whirlwind if we cannot ask better questions about the way our tools are reshaping our selves and our lives.

Thank you very much for reading this letter.  I look forward to working with you as we deliberate, in a colleague’s words, about how to keep the “disruptors from disrupting the heart and soul” of our college.
