Dennis Paoli, the coordinator of the Reading & Writing Center at Hunter College in New York City, has a short but very effective definition of writing: Writing is thinking and vice versa. In other words, to write clearly you also need to think clearly, and clear thinking is often achieved through writing.
I’ve participated in numerous workshops over the years with Paoli and other faculty members at Hunter, and our task is always the same: helping students become better writers. We talk about teaching students how to develop “voice” in their writing and how to isolate their primary arguments and support those arguments with evidence. We talk about how we correct writing mistakes, how we help students spot those mistakes on their own, and even how we talk about what a “mistake” really is.
I, for example, spend a lot of time trying to undo the damage wrought by Microsoft Word’s often-wrong editing software and by high school English classes that instilled in students such lunacy as “Never start a sentence with ‘And,’ ‘But,’ or ‘Because’” and “Never write sentence fragments.” Because those rules are stupid. Plenty of great writers do those things all the time; the key is that they know they’re doing them, and they do them deliberately.
We also talk about our own experiences with writing, the formative moments that led us to careers that involve language in one way or another. When did we learn to care about words, sentences, and semicolons? How did we, as Roy Peter Clark puts it in his terrific 2010 book The Glamour of Grammar, learn to “live inside the language”?
The answers are different for everyone, but one thing is universal: We all learned before the age of Autocorrect.
Next week, Apple will release iOS 8 for the iPhone and iPad, and with it a new “predictive text” feature. It’s the most sophisticated form of Autocorrect to date: It can predict entire phrases and sentences that you personally might write by learning your writing style in a variety of contexts. It will “know,” for example, when you’re writing a casual email to a friend or a more formal one to your boss, and adjust its suggestions accordingly.
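Apple hasn’t published how its predictive engine actually works, so the following is only a toy sketch of the general idea: software that tallies which word you tend to type after which other words, in which contexts, and then offers the most frequent continuation. The ToySuggester class and the “friend” and “boss” context labels here are hypothetical illustrations, not anything from iOS 8.

```python
# Toy illustration only: a context-aware next-word suggester built on simple
# frequency counts. This is not Apple's method; it just shows the general idea
# of conditioning suggestions on both the recent words and the writing context.
from collections import Counter, defaultdict


class ToySuggester:
    def __init__(self):
        # counts[context][(previous two words)] -> Counter of next words
        self.counts = defaultdict(lambda: defaultdict(Counter))

    def learn(self, context, text):
        """Record which word tends to follow each pair of words in a given
        context (the context labels, like 'friend' or 'boss', are made up)."""
        words = text.lower().split()
        for a, b, nxt in zip(words, words[1:], words[2:]):
            self.counts[context][(a, b)][nxt] += 1

    def suggest(self, context, prefix, k=3):
        """Return up to k likely next words for the last two words typed."""
        words = prefix.lower().split()
        if len(words) < 2:
            return []
        key = (words[-2], words[-1])
        return [word for word, _ in self.counts[context][key].most_common(k)]


suggester = ToySuggester()
suggester.learn("friend", "see you at the game tonight")
suggester.learn("boss", "see you at the meeting tomorrow")
print(suggester.suggest("friend", "see you at the"))  # ['game']
print(suggester.suggest("boss", "see you at the"))    # ['meeting']
```

Even this crude version makes Selinger’s worry concrete: the same half-typed sentence gets finished differently depending on whom you’re writing to, and accepting the suggestion is always easier than overriding it.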
But as the philosopher Evan Selinger argues, the feature could very well do more harm than good — it might even hijack our souls.
“I’m horrified by this, to be honest with you,” he told David Berreby of the blog BigThink last week. “Rather than needing to fill out my thoughts to you, I’ll say something good enough that was recommended. And to put in the energy and effort to override a good-enough [phrase], you have to overcome a certain amount of inertia. It will require extra effort to do that. And so I think there’s going to be a natural temptation to rely on that tool rather than override it.”
In his next sentence, Selinger hit the nail squarely on the head: “The more we don’t autonomously struggle with language, grapple to find the right word, muscle through to bend language poetically, the less we’re able to really treat conversation as an intentional act. As something that really expresses what we’re trying to say.”
At its core, this is an ethical issue: “Effort is the currency of care,” Selinger said. “And by ‘effort’ I mean deliberate, focused presence. When we abdicate that, we inject less care into a relationship.”
This is just the beginning. Soon, new programs will be competing with Apple’s, and before we know it, we’ll be able to “write” exclusively by selecting entire sentences from drop-down menus instead of crafting each one from scratch and occasionally pausing to scroll through our psychic dictionaries for just the right word, as I just did for the word “psychic.”
For those of us who attempt to teach writing, meanwhile, our job is about to get much, much harder.