boston.com Arts and Entertainment your connection to The Boston Globe

Perception, cognition, and a little detection

Novels are the ideal way to kibitz on science as well as on life. In recent years Richard Powers's ''Galatea 2.2" and David Lodge's ''Thinks . . ." have drawn very skillfully on computers and cognitive science. But although both communicated a lot of up-to-date information, they were traditional novels by literary artists, so human character, psychology, and emotion were primary (even if sometimes, as in ''Galatea," displayed by a computer). ''Radiant Cool" and ''Turing" have engaging characters, but the excitement is in the ideas they expound or embody.

''Radiant Cool" is a campus thriller. The flamboyant professor Maxwell Grue has disappeared. His graduate student Miranda Sharpe discovers that he was close to formulating a novel theory of consciousness. Naturally she suspects foul play. Miranda enlists the help of Grue's colleague ''Dan Lloyd" (who, like the book's author, is a philosophy professor at nearby Trinity College). Together they foil a dastardly KGB plot to hijack both the Internet and the Grue/Lloyd theory, which would have allowed them to control global consciousness, hence the world.

Part 2 of ''Radiant Cool," a hundred pages of straight exposition, is less fun but much meatier. Lloyd's theory is a scientific version of Edmund Husserl's phenomenology. The basis of this method is a relentless attention to context, especially temporal context. We are not, phenomenologists point out, a collection of independent instruments reporting to an overseer called ''consciousness" or ''mind." On the contrary, it is not possible to perceive an object, event, or other stimulus with anything less than one's entire self. This means that by the same act with which we perceive something we interpret it, i.e., situate it in some relation to the rest of our experience. And immediately, the experience of having experienced it in this instant, with whatever consequences (if any) that entails, becomes part of our experience, which is then brought to bear in the next instant.

It's not hard to see that this can get complicated, i.e., mathematical. Lloyd and others have adapted two of the newest techniques in brain science -- multidimensional scaling (MDS) and functional magnetic resonance imaging (fMRI) -- to the phenomenological method. Phenomenology demonstrates that the mind is a system in which ''the action of the entire network depends on the interaction of all its parts." To map such a network involves representing each of these parts as a separate dimension. The mathematics of MDS makes this possible, while fMRI, which measures what's happening in each part of an active brain, supplies the data.
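For readers curious what ''representing each part as a separate dimension" looks like in practice, here is a minimal sketch of classical multidimensional scaling applied to invented fMRI-style data. Everything in it is hypothetical: the ''brain states" are random vectors, not real scans, and this is an illustration of MDS in general, not of Lloyd's actual analysis.

```python
import numpy as np

# Hypothetical data: 6 "brain states," each described by activity
# levels in 20 regions (a stand-in for fMRI measurements).
rng = np.random.default_rng(0)
states = rng.normal(size=(6, 20))

# Pairwise Euclidean distances between the states.
d = np.linalg.norm(states[:, None, :] - states[None, :, :], axis=-1)

# Classical MDS: double-center the squared distance matrix...
n = d.shape[0]
j = np.eye(n) - np.ones((n, n)) / n
b = -0.5 * j @ (d ** 2) @ j

# ...then embed the states in 2 dimensions using the top eigenvectors,
# so that nearby points correspond to similar whole-brain states.
vals, vecs = np.linalg.eigh(b)
order = np.argsort(vals)[::-1][:2]
coords = vecs[:, order] * np.sqrt(np.maximum(vals[order], 0))

print(coords.shape)  # each 20-dimensional brain state becomes a 2-D point
```

The point of the exercise is the one the book makes: a pattern spread across the whole network can be redescribed as a position in a space whose geometry reflects similarity between states.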

As Lloyd's final pages make clear, consciousness may in principle be partly opaque. But by then, you'll see a lot farther into it than before.

Contemporary cognitive science would be impossible without computers, so ''Turing" makes a fitting companion to ''Radiant Cool." ''Turing," though, is a considerably more accomplished novel. Three brilliant, beautiful, glamorous, globe-girdling human characters form a love triangle that frames the main story, which is a series of lessons on the history and future of thought, delivered by a likable super-program named after the first and greatest of computer scientists, Alan Turing.

The lessons center on mathematics, but they're nontechnical and entirely accessible. (The author, a computer science professor at Berkeley, must be a superb lecturer.) There are occasional gleams of futuristic cybergear -- the novel is set a little further on in the 21st century, and two of the lovers are world-class code writers. But what's most delightful about ''Turing" is the charmed glow that Papadimitriou's prose sheds all around: on sunny Mediterranean islands, on the pangs of love, even on death, of which the all-wise ''Turing" gives an astonishing account to a dying character. And if you know anything of Turing's life -- one of the most interesting in the 20th century -- ''Turing" 's conclusion is exquisitely affecting.

''Natural-Born Cyborgs," though in some ways the most imaginative of these three books, is not fictional at all. The fact is, Andy Clark suggests, that ''the mind is less and less in the head." In a few decades, certainly by the end of the century, ''we shall be cyborgs," that is, ''human-technology symbionts: thinking and reasoning systems whose minds and selves are spread across biological brain and nonbiological circuitry." This is just evolution in action, Clark emphasizes. ''Homo faber," after all, means ''man (or woman) the toolmaker." And many of our tools, he argues, from papyrus and multiplication tables to ''user-adaptive home and office devices," are already ''deep and integral parts of the problem-solving systems we now identify as human intelligence." Cyberization began on the prehistoric savannah; it is simply human nature.

Although Clark explains all this carefully and sensitively, many readers will be uneasy, even frightened. Still, if you want to know something important about your grandchildren's great-grandchildren, you should take a deep breath and read ''Natural-Born Cyborgs."

George Scialabba is a freelance writer who lives in Cambridge.
