Monday, September 3, 2012

The Question Concerning Technology: Recycling Ellen Ullman

I'm an admirer of the writer Ellen Ullman, the software engineer turned novelist. Her 1997 memoir, Close to the Machine: Technophilia and Its Discontents, made a big impression on me.

Ullman recently wrote a commentary for the New York Times on the computerized stock market debacle triggered last August by the trading firm Knight Capital. In it she reaffirmed a crucial point she'd made in Close to the Machine, a point I find myself coming back to repeatedly in this space. To wit: If you think we're in control of our technologies, think again.

To refresh memories: Knight, one of the biggest brokers on Wall Street, had developed a new computerized trading program designed to take advantage of some soon-to-be-implemented changes in trading rules. Anxious to profit from getting in first, Knight set its baby loose the moment the opening bell sounded on the day the changes went into effect. It promptly went rogue, setting off an avalanche of errant trades that sent prices careening wildly all over Wall Street. In the 45 minutes it took to shut the system off, Knight's computer bought stocks that had to be sold at a loss of nearly half a billion dollars.

Much of the finger-pointing that followed was aimed at Knight's failure to adequately debug its new system before it went live. If only the engineers had been given the time they needed to triple-check their code, the story went, everything would have been fine. It was this delusion that Ullman torpedoed in her essay for the Times. It's impossible, she said, to fully test any computer system. We like to think that there's a team of engineers in charge who know the habits and eccentricities of their programs as intimately as they know the habits and eccentricities of their spouses. This is a misconception. Systems such as these don't run on a single body of code created by one company, Ullman says. Rather, they're a collection of interconnected "modules," purchased from multiple vendors, with proprietary software that the buyer (Knight Capital in this case) isn't allowed to see.

Each piece of hardware also has its own embedded, inaccessible programming. The resulting system is a tangle of black boxes wired together that communicate through dimly explained "interfaces." A programmer on one side of an interface can only hope that the programmer on the other side has gotten it right.
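
To make that concrete, here's a toy sketch in Python. It's entirely my own invention, not Knight's code or anything from Ullman's essay; the vendors, the functions, and the lots-versus-shares confusion are all hypothetical. Each module is internally correct, and the bug lives only in the interface between them:

    # Hypothetical sketch of the black-box interface problem Ullman
    # describes. Each module works as its own programmer intended;
    # the bug lives in the dimly explained interface between them.

    # --- Vendor A's module: builds orders, counting quantity in lots of 100 ---
    def build_order(symbol, lots):
        """Return an order message; 'quantity' here means lots of 100 shares."""
        return {"symbol": symbol, "quantity": lots}

    # --- Vendor B's module: executes orders, reading quantity as raw shares ---
    def execute_order(order):
        """Execute an order; 'quantity' here is read as individual shares."""
        print(f"Executing {order['quantity']} shares of {order['symbol']}")

    # Neither programmer can see the other's source; each can only hope
    # the other "got it right." Intended: 1,000 shares. Executed: 10.
    execute_order(build_order("XYZ", 10))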

The complexities inherent in such a configuration are all but infinite, as are the opportunities for error. Forget, in other words, about testing your way to perfection. "There is always one more bug," Ullman says. "Society may want to put its trust in computers, but it should know the facts: a bug, fix it. Another bug, fix it. The 'fix' itself may introduce a new bug. And so on."
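
That bug-begets-bug cycle is easy to dramatize in a few lines of code. The snippet below is a made-up illustration, not anything from Ullman's essay: a crash gets "fixed," and the fix quietly creates a subtler failure than the one it cured.

    # Hypothetical illustration of "a bug, fix it; the 'fix' itself may
    # introduce a new bug." Nothing here comes from Ullman's essay.

    def average_fill_price(prices):
        # Bug #1: the original version divided by len(prices) and crashed
        # with ZeroDivisionError whenever no trades had filled yet.
        if not prices:
            return 0.0  # The "fix" -- which creates bug #2: callers now
                        # mistake "no trades yet" for "a price of zero,"
                        # a quieter and arguably worse failure than a crash.
        return sum(prices) / len(prices)

    print(average_fill_price([]))            # 0.0 -- silently misleading
    print(average_fill_price([10.0, 11.0]))  # 10.5 -- correct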

As I say, this was an argument Ullman explored with terrific insight in Close to the Machine. Ullman's experience as a programming insider affirmed what so many of us on the outside sense intuitively: that computer systems (like lots of other technologies) follow their own imperatives, imperatives that make them at some fundamental level unresponsive to the more fluid needs of human beings.

I devoted an extended passage to Ullman's book in my 2004 master's thesis on the philosophy of technology. I was discussing technology's inherent tendency to deal with people in mass terms, rather than as individuals, a tendency that leads inevitably to dehumanization. Here's what I wrote:

Human beings can be more readily absorbed by technique if they are first turned into abstractions. A penetrating description of how this process occurs can be found in the book Close to the Machine...Ullman demonstrates by the perceptiveness of her book that you don't permanently or necessarily lose your capacity to be human upon entering into intimacy with machines. But it is precisely because she is so perceptive that she can document how difficult it is, while in the midst of the human-machine exchange, not to lose it.

"I'd like to think that computers are neutral, a tool like any other," she writes, "a hammer that can build a house or smash a skull. But there is something in the system itself, in the formal logic of programs and data, that recreates the world in its own image."

In her opening chapter, Ullman describes a meeting she has with a group of clients for whom she is designing a computer system, one that will allow AIDS patients in San Francisco to deal more smoothly with the various agencies that provide them services. Typically, this meeting has been put off by the project's initiating agency, so that the system's software is half completed by the time Ullman actually sits down with the people for whom it is ostensibly designed.

As the meeting begins, it quickly becomes apparent that none of the agency representatives feels their specific needs have been adequately incorporated into the system. Suddenly, the comfortable abstractions on which Ullman and her programmer colleagues based their system begin to take on "fleshly existence." That prospect terrifies Ullman. "I wished, earnestly, I could just replace the abstractions with the actual people," she writes.

But it was already too late for that. The system pre-existed the people. Screens were prototyped. Data elements were defined. The machine events already had more reality, had been with me longer, than the human beings at the conference table. Immediately, I saw it was a problem not of replacing one reality with another but of two realities. I was at the edge: the interface of the system, in all its existence, to the people, in all their existence.

The real people at the meeting continue to describe their needs, and how they haven't been accommodated by the prototype Ullman and her team have created. Ullman takes copious notes, pretending that she's outlining needed revisions. In truth she's trying to figure out how to save the system. Soon she is back in a room with her programmers, discussing which demands can be integrated into the existing matrix and which will have to be ignored. The talk is of "globals," "parameters," and "remote procedure calls." The fleshly existence of the end users is forgotten once more.

"Some part of me mourns," Ullman says,

but I know there is no other way: human needs must cross the line into code. They must pass through this semipermeable membrane where urgency, fear, and hope are filtered out, and only reason travels across. There is no other way. Real, death-inducing viruses do not travel here. Actual human confusions cannot live here. Everything we want accomplished, everything the system is to provide, must be denatured in its crossing to the machine, or else the system will die.

Source: http://thequestionconcerningtechnology.blogspot.com/2012/09/recycling-ellen-ullman.html
