That question has been a long-standing puzzle for me.
Although the last thing that the world needs is another definition for literacy, mine is fairly simple: Literacy is the manipulation of symbols. If we want to talk about mathematical literacy, computer literacy, or a traditional alphabetic literacy, I think that my definition holds up. “Manipulation” is a broad enough word to cover reception (reading) and invention (writing), as well as rearrangement (remixing, if you’d like).
But “manipulation” in my simple definition also implies artful, studied symbolic action (hat-tip to Kenneth Burke). The New Oxford American Dictionary on my Mac here adds the qualifier “typically in a skillful manner” in its definition of manipulate.
This semester I’m teaching a course called Humanizing Technology. One of my opening remarks to the class at its first meeting last Tuesday night was that the course’s title is kind of a lie: technology, specifically the digital/computer technology that is the focus of the class, isn’t waiting for us to humanize it. It’s already a bit too humanized, insofar as technology reflects human imperfection, as all symbol systems do, as well as a human desire to control and perfect, or what Burke describes in his “Definition of Man” (which students are reading this week) as being both “goaded by the spirit of hierarchy” and “rotten with perfection.”
Symbolicity rules the terms of engagement with digital technologies. And digital technologies, no matter how magical, are both the product and the enabling force of specific symbol systems: computer languages.
Of all the challenges that I will be putting in front of students this semester, working through Bruce Tate’s Seven Languages in Seven Weeks: A Pragmatic Guide to Learning Programming Languages (7L7W for short) is the one that already has students most on edge (judging by the number of emails I’ve received this first week of class alone).
As I’ve reassured a number of students this week, 7L7W is not on the reading list because I think it will (or even can) make anyone a programmer; it’s there because I want to achieve two goals.
The first is to demystify programming languages, and the learning of them, by simple (over)exposure. Seven languages is a lot by any measure, especially spread out over seven weeks (more like ten, because of how the class is structured), and especially in a course offered in the humanities department. (For some students in the course, particularly my undergraduates coming in from the Information Technology and Management program, that demystification may not be so profound. Though I’m hoping the learning that Tate encourages in the book will be.)
The second and, to my mind, more important goal is to help students come to see programming languages (as well as other symbol systems, from number systems like hex and octal right down to binary and ASCII and its modern-day Unicode supersets) as designed things. Just as it’s tempting to think that spoken and written languages came down directly from the gods, so too is it tempting to think that computer languages have their origins in divinity, or in something outside of human invention and cooperation.
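To make “designed things” concrete, here is a quick Ruby sketch of my own (illustrative only, not from Tate’s book) that walks a single character through several of these notations:

```ruby
# The letter "A" as one designed encoding after another: the character,
# its ASCII/Unicode code point, and that code point written out in
# decimal, hexadecimal, octal, and binary.
char  = "A"
point = char.ord            # => 65, the code point

puts point                  # decimal:     65
puts point.to_s(16)         # hexadecimal: 41
puts point.to_s(8)          # octal:       101
puts point.to_s(2)          # binary:      1000001

# And back again: each notation is a different human-made name
# for the same value.
puts 0x41.chr               # => A
puts "1000001".to_i(2).chr  # => A
```

None of these representations is more “natural” than another; each is a convention that somebody designed and that other people agreed to use.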
One of the features that sold me on the 7L7W book (I’d also considered Chris Pine’s excellent Learn to Program) was the interviews that Tate arranged with the creators or lead developers of the languages in the book. It’s one thing to slog through the syntax and application of, say, Ruby; it’s another to do so alongside the words of Ruby’s creator, Yukihiro “Matz” Matsumoto, who recalled:
Right after I started playing with computers, I got interested in programming languages. They are the means of programming but also enhancers for your mind…
The idea of moving from play to an interest in programming languages, I believe, is unusual. Matz further observed that “the primary motivation” for designing Ruby was “to amuse myself.”
And this is where I return to my original question: Why don’t writers gravitate toward code? Writers, and not necessarily even good ones, all share a certain love for how amusing written language is. Otherwise, we wouldn’t have puns, double entendre, and other forms of word play. We likewise wouldn’t have style, or at least a sense of it. And less obviously, we probably wouldn’t have writing, period, unless it were well funded. (Samuel Johnson, by way of Boswell’s Life of Johnson, is quotable here: “No man but a blockhead ever wrote, except for money.”)
There are few writers I can think of, though, who would find much amusing about HTML5 or Ruby. And that is unfortunate, especially when there is so little to lose in being exposed (as my students will be) to computer languages. I’m not crazy enough to believe that all writers will become programmers and developers. But some might, and quite possibly to good effect. Although at first glance a computer language like Ruby lacks the flexibility and ambiguity of a natural human language, it is nevertheless full of subtlety and elegance, and it has a self-consciousness and even humor about itself that is absent in natural human language (minus maybe a fake one like Pig Latin).
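For a taste of that subtlety, here is a small sketch of my own (nothing out of 7L7W, just a few illustrative lines) showing how close Ruby can come to reading like English:

```ruby
# Method names ask questions, and blocks chain on like clauses.
3.times { puts "Writers, meet code." }

forms_of_play = %w[pun anagram palindrome]
puts forms_of_play.map(&:capitalize).join(", ") if forms_of_play.any?

# Even the questions are legible: is a word its own mirror image?
word = "level"
puts "#{word} is a palindrome" if word == word.reverse
```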
Put another way, the distance between writing and the digital-material conditions and technologies that support writing should be of far more concern than past distances between, say, writers and typists, or scribes and makers of parchment or stone tablets.
Why? Because the digital-material conditions that make Facebook or Tumblr or even Microsoft Word possible are grounded in symbolicity that, like traditional alphabetic writing or even media-based writing (images, film), is rooted in language itself. There is a continuity between communicating through digital technologies and programming them, despite the assumption that writers/communicators and other lute-playing humanistic wood nymphs comprise one camp, while cold, calculating, logical (a- or anti-humanistic) programmers comprise another. It’s much more complicated than that. And I still don’t have an answer for my question.