University of Pittsburgh

March 6, 1997

Martha Pollack

Anybody who has ever tried to print every other page in a multi-page document knows how difficult it can be to communicate with a computer. In their frustration, some people have been known to scream: "Why don't they make a computer that understands English?" Although the day when computers will respond to all voice commands is still off in the future, a number of companies have developed systems capable of responding to "natural languages," according to Founders Day speaker Martha Pollack, associate professor of computer science and intelligent systems.

In her speech, "Talking Computers: Why Don't They Understand?" Pollack said several computer systems capable of communicating on a primitive level using natural languages already are on the market.

One of those systems is Clinical Reporting, which allows a physician to generate patient files by speaking into a microphone attached to a headset. The system does not simply transcribe what the doctor says, but interprets the material and uses the interpretation to fill out a patient report, generate a bill or send out letters to consulting physicians. Another natural language system currently available involves machine translation. Because neither humans nor machines can translate from one language to another unless they have an understanding of the languages involved, word-for-word substitution does not work. That means translation systems are limited to very specific areas of language, such as technical reports. Even when subject matter is limited, according to Pollack, the systems currently available do not produce perfect translations. They still need to be reviewed by a human versed in the languages involved, but they are good enough that some corporations have begun using them.
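The idea of interpreting dictation rather than merely transcribing it can be illustrated with a toy sketch. This is not the actual Clinical Reporting system; the field names and patterns below are invented for demonstration, standing in for the far richer interpretation a real system would perform:

```python
import re

def fill_patient_report(dictation):
    """Map an (already transcribed) dictation onto a structured report.

    A toy illustration only: real clinical systems use full linguistic
    interpretation, not hand-written patterns like these.
    """
    report = {"patient": None, "complaint": None, "prescription": None}
    m = re.search(r"patient (\w+ \w+)", dictation, re.IGNORECASE)
    if m:
        report["patient"] = m.group(1)
    m = re.search(r"complains of ([\w ]+?)(?:[.,]|$)", dictation, re.IGNORECASE)
    if m:
        report["complaint"] = m.group(1).strip()
    m = re.search(r"prescribed ([\w ]+?)(?:[.,]|$)", dictation, re.IGNORECASE)
    if m:
        report["prescription"] = m.group(1).strip()
    return report

print(fill_patient_report(
    "Patient John Smith complains of chest pain. Prescribed aspirin."))
```

The structured result, rather than the raw transcript, is what would then drive the report, the bill, or the letter to a consulting physician.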

Experimental machines also are available that, when attached to telephones, allow two people who speak different languages to communicate by translating the conversation into each language. Such systems are only 50 to 60 percent accurate and, again, are limited in subject matter, according to Pollack.

A third natural language system now in limited use is an information extraction system that can read a newspaper story or a report and generate a summary with about 75 percent accuracy.
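One simple flavor of the kind of summarization system described above can be sketched as an extractive summarizer: score each sentence by how frequent its words are in the whole text, then keep the highest-scoring sentences. This is an assumed, minimal approach for illustration, not the system Pollack described:

```python
import re
from collections import Counter

def summarize(text, num_sentences=1):
    """Toy extractive summary: keep the sentences whose words are most
    frequent across the whole text. Longer sentences naturally score
    higher here; real systems correct for that and for much more."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    freq = Counter(re.findall(r"[a-z']+", text.lower()))

    def score(sentence):
        return sum(freq[w] for w in re.findall(r"[a-z']+", sentence.lower()))

    chosen = set(sorted(sentences, key=score, reverse=True)[:num_sentences])
    # Preserve the original order of the chosen sentences.
    return " ".join(s for s in sentences if s in chosen)

story = ("Computers can translate text. "
         "Computers can translate text between languages with limited accuracy. "
         "Dogs bark.")
print(summarize(story))
```

Extraction of this kind can only reuse sentences that already appear in the story; generating a genuinely new summary requires the deeper understanding Pollack discusses below.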

"Computers can understand, at least to a limited extent, but they have two big limitations, both of which are subjects of research," Pollack said.

The first problem is that they only work when the domain of discourse is limited to such things as travel plans or patient reports. "Computers need to learn more about the world before they can talk more about the world," Pollack explained.

The second limitation is that it is difficult for computers to determine indirect and implicit meanings. A human might be talking about Microsoft Windows, for example, while the computer only knows the word "windows" in reference to the windows in a home.
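The "windows" ambiguity above is a classic word-sense disambiguation problem. A minimal sketch, in the spirit of the Lesk algorithm (a technique not mentioned in the speech, named here only to label the approach), picks whichever sense shares the most words with the surrounding sentence; the senses and signature words below are invented:

```python
# Invented sense inventory for the ambiguous word "windows".
SENSES = {
    "windows_software": {"microsoft", "software", "install", "program", "pc"},
    "windows_building": {"house", "glass", "open", "curtains", "home"},
}

def disambiguate(sentence):
    """Choose the sense whose signature overlaps most with the context."""
    context = set(sentence.lower().split())
    best_sense, best_overlap = None, -1
    for sense, signature in SENSES.items():
        overlap = len(context & signature)
        if overlap > best_overlap:
            best_sense, best_overlap = sense, overlap
    return best_sense

print(disambiguate("I need to install Windows on my PC"))
print(disambiguate("Please open the windows of the house"))
```

Even this crude overlap count gets the two examples right, but it fails as soon as the telling context words are missing, which is exactly why implicit meaning remains hard for computers.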

Solving both problems is difficult and it will be some time before computers can understand implicit meanings as well as literal meanings, according to Pollack.

"As we continue to see progress in other areas of artificial intelligence, notably machine learning, it seems likely that computers will learn from their experience with us and thereby become better communicators," she said. "Eventually, they may really understand us."

–Mike Sajna

