24 November 2004

Computers as Authors? Literary Luddites Unite!


For some people, writing a novel is a satisfying exercise in self-expression. For me, it's a hideous blend of psychoanalysis and cannibalism that is barely potent enough to overcome a series of towering avoidance mechanisms - including my own computer. Writers and computers nowadays are locked in such an enduringly dysfunctional embrace that it can be hard to tell us apart. We both rely heavily on memory, for instance. We are both calculating, complex and crash-prone. And like Hebrew National hot dogs, we both seem to answer to a higher power: writers, according to Plato, were divinely inspired; computers have Bill Gates.

Occasionally you hear of a Luddite novelist who shuns computers, but the truth is that most of us would be lost without them. If I rail and curse at mine, it is partly out of resentment at our miserable co-dependence. Imagine, then, the blow to my scribbler's vanity when I discovered a while back that computers might get along just fine without writers.

This is not science fiction. With little fanfare and (so far) no appearances at Barnes & Noble, computers have started writing without us scribes. They are perfectly capable of nonfiction prose, and while the reputation of Henry James is not yet threatened, computers can even generate brief outbursts of fiction that are probably superior to what many humans could turn out - even those not in master of fine arts programs. Consider the beginning of a short story dealing with the theme of betrayal:

"Dave Striver loved the university - its ivy-covered clocktowers, its ancient and sturdy brick, and its sun-splashed verdant greens and eager youth. The university, contrary to popular opinion, is far from free of the stark unforgiving trials of the business world: academia has its own tests, and some are as merciless as any in the marketplace. A prime example is the dissertation defense: to earn the Ph.D., to become a doctor, one must pass an oral examination on one's dissertation. This was a test Professor Edward Hart enjoyed giving."

That pregnant opening paragraph was written by a computer program known as Brutus.1, which was developed by Selmer Bringsjord, a computer scientist at Rensselaer Polytechnic Institute, and David A. Ferrucci, a researcher at I.B.M.

Or consider this sensitive reinterpretation of a literary classic:

"The road to grandmother's house led through the dark forest, but Little Red Riding Hood was not afraid and she went on as happy as a lark. The birds sang her their sweetest songs while the squirrels ran up and down the tall trees. Now and then, a rabbit would cross her path."

What you just read is the work of StoryBook, "an end-to-end narrative prose generation system that utilizes narrative planning, sentence planning, a discourse history, lexical choice, revision, a full-scale lexicon and the well-known Fuf/Surge surface realizer." Believe it or not, that description was written not by a computer but by the humans who created StoryBook, Charles B. Callaway and James C. Lester, who are computer scientists.

That no computer has yet written the Great American Novel may be because computers are subject to some of the same handicaps that afflict human writers. First, writing is hard! Although computers can work unhindered by free will, bourbon or divorce, such advantages are outweighed by a lack of life experience or emotions. Second, and all too familiar to living writers of fiction, there is no money in it. Unable to teach creative writing or marry rich, computers have to depend on research grants. And why would anyone pay for a computer to do something that humans can still do better for peanuts?

Still, what has been accomplished so far is scary enough, and surely there is more to come, thanks to rapid advances in computing power and the rise of "narratology" (how stories are told) as an academic field of study, among other unwholesome trends that are making the novelist's life ever more perilous.

Computers have been doing literary work for a while now - helping nab plagiarists, for instance - and there is even fiction-writing software for people to use, in one case complete "with 2,363 narrative situations." Professor Bringsjord meanwhile is working on a logical framework for the problem of evil, hoping a computer can write fiction on that theme next. It is hard not to worry that sooner or later computers will be monopolizing the best-seller lists rather than focusing on such worthwhile goals as producing an intelligible royalty statement.

Fortunately, flesh-and-blood writers are nowhere near having to hang up their turtlenecks. When I called Steven Pinker, the Harvard University psychologist whose research focuses on language and cognition, he pointed out that the human brain consists of 100 trillion synapses that are subjected to a lifetime of real-world experience. While it is conceivable that computers will eventually write novels, Dr. Pinker says, "I doubt they'd be very good novels by human standards."

If we don't get much good fiction out of computers, we may at least gain some wholesome new perspective on the process of creating literature. The advent of storytelling computers suggests that thinking people and thinking machines confront many of the same problems in writing fiction, even if their solutions are different. Computers have to rely on a rigorous system of logic, while human writers try to turn their disorganized natures to advantage. Our traditional emphasis on inspiration promotes a reliance on serendipity, which, in turn, helps dampen the potentially paralyzing awareness of the infinite choices available when you create a fictional world.

The economist Herbert Simon, who reminded us of the futility of trying to consider every possible alternative in a world without end, might have had in mind the budding novelist in Albert Camus's "Plague," determined to create a perfect first sentence and therefore unable to advance beyond it.

It was Simon's ideas - particularly his notion of "satisficing" - that first got me interested in fiction-writing machines. Though in theory a person shopping for new shoes could consider all the pairs on the planet, in fact, the cost is way too high - an entire life spent shoe-shopping. So in the real world we visit one or two stores, try on a few in our size and buy a pair.

Satisficing in this way - settling, or even sensing, what is good enough - is something novelists must do as well. We think of an idea and go with it because pausing to systematically consider every plot twist, character or phrase that might come next would lead nowhere.
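For the technically inclined, here is a minimal sketch of that contrast in Python. The candidate plot twists, the quality scores and the "good enough" threshold are all invented for illustration; none of this is Simon's own formulation, it just shows the shape of the idea.

import random

# Toy contrast between exhaustive optimizing and Simon-style satisficing.
# The candidates, the quality scores and the threshold are made up.

def optimal_choice(candidates, score):
    # Weigh every option and keep the global best; the cost grows with
    # the size of the whole search space.
    return max(candidates, key=score)

def satisficing_choice(candidates, score, good_enough):
    # Take options as they come and settle for the first one that clears
    # the bar, the way a shopper buys the second decent pair of shoes.
    for candidate in candidates:
        if score(candidate) >= good_enough:
            return candidate
    return None  # nothing measured up; lower the bar and try again

random.seed(0)
plot_twists = ["twist-%d" % i for i in range(10_000)]
quality = {t: random.random() for t in plot_twists}   # pretend quality scores
print(satisficing_choice(plot_twists, quality.get, 0.99))

The satisficer typically looks at a few dozen options before stopping; the optimizer must score all ten thousand before it can say anything at all.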

Computers are just as subject as humans to Simon's "bounded rationality." Computers cannot create narratives by using brute computational force to mindlessly try every alternative. It may be fun to think that 10,000 monkeys typing for 10,000 years will sooner or later randomly produce "Paradise Lost," but evidently this is no more plausible for silicon than simians. Computers don't even play chess this way, Dr. Pinker told me, having noted elsewhere that the number of possible sentences of 20 words or less that the average person can understand is perhaps a hundred million trillion, or many times the number of seconds since the universe was born. "The possibilities boggle the mind very quickly," he says.
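A quick back-of-the-envelope check of those figures (the arithmetic is mine, not Dr. Pinker's: a hundred million trillion is 10^20, and the universe is taken to be roughly 13.7 billion years old):

# Rough arithmetic behind the comparison above: intelligible 20-word
# sentences versus seconds since the universe was born. Both figures
# are approximate.

sentences = 100 * 10**6 * 10**12              # a hundred million trillion = 1e20
seconds_per_year = 365.25 * 24 * 60 * 60      # about 3.16e7
age_of_universe_seconds = 13.7e9 * seconds_per_year   # about 4.3e17

print("%.1e possible sentences" % sentences)
print("%.1e seconds since the universe was born" % age_of_universe_seconds)
print("roughly %dx more sentences than seconds" % (sentences / age_of_universe_seconds))

The ratio comes out to a couple of hundred, so "many times the number of seconds since the universe was born" holds up.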

This doesn't mean nobody is trying. On the Internet, the Monkey Shakespeare Simulator (http://user.tninet.se/~ecf599g/aardasnails/java/Monkey/webpages/) generates random keystrokes and matches them against a database of Shakespeare's plays. The record, last time I looked, was 21 consecutive letters and spaces from - aptly enough - "Love's Labour's Lost."
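The basic mechanism is simple enough to sketch in a few lines of Python. The one-line stand-in corpus and the match-counting below are my own simplification, not the simulator's actual code, which matches against the complete plays and keeps an all-time record.

import random
import string

# Toy version of the monkey-typing idea: generate random keystrokes and
# find the longest run of consecutive characters that appears anywhere
# in a corpus. The single line of Hamlet is a stand-in corpus.

CORPUS = "to be or not to be that is the question"
KEYS = string.ascii_lowercase + " "

def longest_run(typed, corpus):
    # Longest block of consecutive keystrokes found in the corpus. Only
    # substrings long enough to beat the current record are tested; if
    # one fails, any longer substring from the same start would fail too.
    best = 0
    for start in range(len(typed)):
        end = start + best + 1
        while end <= len(typed) and typed[start:end] in corpus:
            best = end - start
            end += 1
    return best

random.seed(42)
typed = "".join(random.choice(KEYS) for _ in range(200_000))
print(longest_run(typed, CORPUS), "consecutive characters matched")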

From the NYTimes

Lud·dite [ lŭd′īt ]
n.


  1. Any of a group of British workers who between 1811 and 1816 rioted and destroyed laborsaving textile machinery in the belief that such machinery would diminish employment.

  2. One who opposes technical or technological change.


[After Ned Ludd, an English laborer who was supposed to have destroyed weaving machinery around 1779.]

From Your Dictionary
