cross-posted from DML Central
In a recent piece at Locus, Cory Doctorow argues:
Computers are the children of the human race’s mind, and as they become intimately involved in new aspects of our lives, we keep stumbling into semantic minefields, where commonly understood terms turn out to have no single, well-agreed-upon meaning across all parts of society.
As an example, Doctorow gives the “real names” policies of social network sites like Facebook and Google+. While it may seem simple for a person to use his or her legal name on a website, Doctorow draws on the example of his family, immigrants from Belarus, to complicate this assumption, describing the convoluted processes by which they acquired transliterated names and nicknames when they moved to Canada. Here he describes his grandfather’s name:
My dad’s father was born Avram, which was anglicized as “Abraham” (naturally enough), but his first employer called him “Bill,” because that was a more “Canadian” name. It stuck, and his Canadian citizenship papers read “Abraham William Doctorow,” though no one ever called him “William.”
The complexity of a name—what I am called depends on where I am and who I am with, and those realities may not be reflected on official documents—can elude computers in ways that are unlikely to stump or confuse a human.
The title of Doctorow’s essay is “Teaching Computers Shows Us How Little We Understand About Ourselves,” and while I think he identifies a real problem in computer culture—he describes it as the “ambiguity that is inherent in our human lives…rub[bing] up against our computerized need for rigid categories”—my takeaway from his article was slightly different from his.
Doctorow uses this example to suggest that we don’t know enough about ourselves to code these relationships properly into computers. I think, however, that it shows the limitations of a certain way of thinking, a way of thinking that wants to apply rigid, rule-based codes to all situations. This is how computers “think,” and they can be quite inflexible because of it. As Doctorow puts it, “With a human bureaucrat, there was always the possibility of wheedling an exception; machines don’t wheedle.”
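A toy example can make this rigidity concrete. The snippet below sketches a hypothetical, deliberately naive “real name” rule (a made-up illustration, not any platform’s actual policy), applied to the names from Doctorow’s anecdote. A human reviewer could be wheedled past each failure; the rule cannot bend:

```python
import re

# Hypothetical, naive "real name" rule: exactly two capitalized
# ASCII words. This is an illustration of rigid rule-based thinking,
# not any actual platform's policy.
NAIVE_NAME = re.compile(r"^[A-Z][a-z]+ [A-Z][a-z]+$")

def looks_real(name: str) -> bool:
    """Return True if the name fits the rigid two-word pattern."""
    return bool(NAIVE_NAME.match(name))

looks_real("Abraham Doctorow")          # True: fits the rule
looks_real("Bill")                      # False: the name everyone actually used
looks_real("Abraham William Doctorow")  # False: the name on his citizenship papers
```

The rule accepts a name no one used and rejects both the everyday name and the legal one, which is exactly the kind of mismatch Doctorow describes.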
While “wheedle” carries some negative connotations, what Doctorow is describing is basic persuasion: the chance to convince another to see beyond rigid categories in a particular situation. Computers do not perform well at tasks like that.
Real name policies don’t show us what we don’t know about ourselves, but rather expose limitations of digital culture and our adaptation to it. Near the end of the piece, Doctorow warns against “encoding errors about the true shape of family in software,” implying that if we could only get the software right—if Facebook had the right form for his grandfather, for example—this would solve the problem with real name policies. But even if the correct software could be achieved, the real issue with computers would remain: computers can’t wheedle (at least for the foreseeable future), and wheedling is an important part of who we are and how we interact with each other.
I am an advocate of teaching programming and other forms of digital culture. But Doctorow underscores the (current) limitations of these systems and models the appropriate response: he conveys the complexity of real name policies by carefully explaining the issues involved and the limits of technical systems. That is, his essay emphasizes the importance of teaching writing and communication skills as part of digital media instruction.
This is not writing as simple grammatical correctness (a rule-based, machine view of language), but as the vast array of persuasive possibilities presented by language that are relevant not only to texts, but to podcasts and videos and other artifacts of digital communication. Certain technologies can constrain the range of such persuasive communication, but it will never become unimportant, and pointing out the limitations of our technical systems only serves to underscore this point.