THE EVOLVING RELATIONSHIP BETWEEN INDUSTRY AND EDUCATION

LOUIS ROBINSON
Director of University Relations
IBM
Armonk, New York

Each morning on Seventh Avenue in New York City, in the heart of the garment district, an employee comes to work, puts on his white work coat, and steps up to his work table. On that table he unrolls the bolt of cloth he is to work on this morning. After he has smoothed the cloth out on the table, he examines it to see if there are any imperfections in the cloth. Hovering over this table is a device that looks somewhat like the movable x-ray machine that one sees in a doctor's office.

In fact, it is very much like that because it has in it a flashlight bulb that casts a cross-hair shadow down on the cloth. Now the tailor, in examining the cloth, sees an imperfection in the cloth. He turns a few dials, and this machine, which is cantilevered over the table, moves automatically until the cross-hair shines directly onto the imperfection. Then he presses a button on the side of the machine, and the computing system that is connected to this device memorizes the geometric location of that imperfection. Then he sees another imperfection, and he turns the dial until the cross-hairs shine down on that imperfection. He presses the button, and the machine memorizes the location of that imperfection. When he feels he has identified all of the imperfections in the cloth, he looks at the order form and he sees he's being asked to cut, say, a man's size 40, regular suit. He keys that information into the system.

All the rules, formulas and algorithms for taking the standard man's suit pattern and adjusting it to the order reside in the memory of the machine, so it calculates exactly what the pattern ought to be. Then the computing machine lays the pattern out on the cloth, guaranteeing that the pattern never intersects any of the known imperfections in the cloth that the tailor has already identified to the machine and allowing for the proper matching on the cloth. The system ensures that the minimum length of the bolt of cloth is used in the process. When that is done, the tailor presses a button, and a laser beam shines out of that device to cut the cloth. Now, when that gentleman goes home at night, and someone asks him what he does for a living, he says, "I'm a tailor."

And he is a tailor. In very dramatic ways, the information technology that we teach and learn in our institutions of business, industry, government and, certainly, education has started to change the way people work, the way people live, their vocations, their avocations, and the way they think; it has even affected the very language we use.

This change in our language was brought home to me recently. I received a call from an acquaintance who is a columnist and she said, "Dr. Robinson, I'm writing an article about computers in the home. Could you help me?" I said I would. She said, "Where is it?" and I said, "Where is what?" She said, "The revolution; the computer revolution." I asked her what exactly she was looking for. She said she still had to vacuum the rugs, she still had to drive the children to dance class, she still had to wash and dry her hair. "When will the computer change all that?"

I don't believe we'll find evidence of the computer revolution in such mechanical acts. The information revolution is many things, but we'll probably still have to vacuum the rugs, we'll still have to drive the children to dance class and we'll still have to dry our hair. The revolution may not automate the vacuuming of the rugs, it may not automate the driving of children to the dance class, and it may not automate the washing and drying of our hair; the revolution is something different. It's something that is much more important. Because we're not talking about mechanized automation, we're talking about information. Information has great value and is pervasive. It has nothing to do with driving children to school and yet it has everything to do with it. The information revolution has to do with the recognition of the value of information and the learning of the processes needed to record, transcribe, organize, manipulate and retrieve information usefully.

Let's go back five hundred years, to the time of Gutenberg and the invention of movable type. Now you and I know that one of the consequences of that invention today in 1984 is that we have our newspapers and our books, and our technical journals; we know that Gutenberg made it possible for our society to become a literate society. Imagine five hundred years ago asking Mr. Gutenberg to tell us of the dramatic consequences the invention of movable type will have in the home. What would he have answered? Then you had to sweep the carpet; today you have to sweep the carpet. Then you had to prepare the meals; today you have to prepare the meals.

Then you had to take the children to the dance class; today you have to take the children to the dance class. Then you had to comb your hair; today you have to comb your hair. So, what's the big impact of the invention? Except that the invention of printing changed the way we entertain ourselves, it's changed the way we educate ourselves, it's changed the way we think about life, it's changed the way we think about philosophy, it's changed the way we think about religion, it's changed the way we work; except for the fact that it has permeated every thought of every single human being on the planet, and every enterprise and activity in which we engage; except for that, it has no obvious impact in the home . . . !

And computers will play exactly the same kind of pervasive role. To expect an exact enumeration of what the impact will be would be the equivalent of asking Gutenberg, five hundred years ago, to predict the impact of printing today. That would be absurd. Information processing is compelling us to learn better how to apply these machines so that they are used to improve the human condition in different ways. That's the greatest challenge of all, a challenge that I think characterizes a great deal of the research that I see going on today.

Foreseeing the trends for the next ten or twenty years in information processing is very hard to do. The fact that, by some accident, some few of us happen to work in research or advanced technology doesn't mean that we have a better crystal ball than anybody else. We don't. Your best guess about what might happen in information processing in the next ten or twenty years is as good as, probably better than, anybody else's. As a matter of fact, technologists have historically been very poor forecasters of the longer-term implications of their own work. When I was a youngster, one of my favorite heroes (there was a time when we used to have heroes in the United States) was an engineer named Dr. Lee DeForest. He invented the triode vacuum tube, which was used in the early radios and, subsequently, television and other electronic devices. Back in 1926, DeForest was taken to a laboratory where he witnessed the use of his technology to build the first prototype of a working black-and-white, sending-and-receiving TV system.

"While, theoretically and technically, television may be feasi­ble," he wrote, "commercially and financially, I consider it an impossibility, a development of which we need waste little time dreaming."

That from the man who started the whole thing!
It's never easy to have a global perception of how things are going to be. Still, the attraction for all of us is always the challenge of the future. But what aspect of it? Antoine de Saint-Exupéry, the French poet and philosopher, once observed: "As for the future, your task is not to foresee but to enable."

The computer is an enabling instrument. It is an instrument that makes it possible for people to do what is so basic to the human enterprise: to process information in increasingly efficient and different ways. But therein lies a challenge: the beckoning call for learning to live with change, which some find a welcome opportunity and others find a threatening upset to old ways of doing things.

It's all too easy to have the conceit that computers are never going to be much more modern than they are now. We may believe we've seen the ultimate in our laboratories in the form of cryogenic circuits, bubble memories, ink jet printers, gas panel displays and the like. But what we've perceived over the years, and what we can envision today, is a guarantee of change. There will be better components; circuits that are denser, faster and more reliable; faster printers and display devices: a whole spectrum of things. There'll be breakthroughs in the programming sciences, in how we talk to machines, and in new solutions to business and institutional problems.

One of the prime reasons why people fear technology is that they don't understand it. To communicate with machines, techniques requiring intensive training have been developed. This has helped create a fear of technology and, as a result, probably an arm's-length dislike and rejection of it.

Some of that fear must be generated by the rapidly changing technology itself. Consider the constant need for increased memory capacity in computer systems. In silicon technology we saw the number of memory cells packed on a single silicon chip rise from 1,000 to 64,000 bits during the 1970s. In an experimental IBM silicon chip, one sixteenth of an inch square, we can pack 4,000 logic circuits, and other experiments have shown that such chips can contain 10,000 logic switches or 250,000 bits of memory. In another development, 288,000 bits of memory were designed onto a single chip, and now 512,000-bit memory chips have been demonstrated.
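The decade-long jump from 1,000 to 64,000 bits per chip implies a remarkably steady doubling rate, which a few lines of arithmetic make explicit (a sketch; the exact start and end years of the trend are assumptions):

```python
import math

# Growth in memory cells per silicon chip during the 1970s,
# using the figures quoted above (1,000 -> 64,000 bits).
bits_start = 1_000   # circa 1970
bits_end = 64_000    # circa 1979
years = 10           # assumed span of the decade

growth_factor = bits_end / bits_start   # 64x over the decade
doublings = math.log2(growth_factor)    # 6 doublings
doubling_time = years / doublings       # roughly 1.7 years per doubling

print(f"{growth_factor:.0f}x growth = {doublings:.0f} doublings, "
      f"one every {doubling_time:.1f} years")
```

A doubling roughly every year and a half is the pace the rest of this section takes for granted.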

In bubble memory technology, using garnet material made of an aluminum-chromium-copper compound, developed as an epitaxial growth on a low-defect single-crystal substrate, we can create in our laboratories the equivalent of a memory of a hundred million bits on a square inch of material. That is roughly the equivalent of putting all of the Manhattan telephone directory on something the physical size of a postage stamp.
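The directory comparison can be sanity-checked with a little arithmetic; the character size and stamp area below are assumptions, and the actual size of the Manhattan directory decides how snug the fit really is:

```python
# Characters implied by the quoted density of one hundred million
# bits per square inch (assumptions: 8-bit characters and a postage
# stamp of roughly one square inch).
bits_per_sq_inch = 100_000_000
stamp_area = 1.0                       # square inches, assumed

capacity_chars = bits_per_sq_inch * stamp_area / 8
print(f"about {capacity_chars / 1e6:.1f} million characters per stamp")
```

Twelve and a half million characters is on the order of a thick printed directory, which is all the comparison claims.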

In order to build such a memory technology we draw circuit logic on the substrate with lines about one micron wide. A micron is about a twenty-five-thousandth of an inch. A human hair is about 40 microns wide. That width is close enough to the wavelength of light that, in a sense, we cannot draw such lines with conventional light-dependent instrumentation. Instead, we must use technologies that depend on shorter-wavelength signals, such as x-ray lithographic methods.
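The quoted "twenty-five-thousandth of an inch" follows from the metric definition of the inch, and a quick check shows why a one-micron line crowds the wavelength of light (the visible-light figure is an added reference point, not from the text):

```python
# Scale check for one-micron lithography lines.
microns_per_inch = 25_400      # exact: 1 inch = 25.4 mm
line_width = 1.0               # microns
hair_width = 40.0              # microns, as quoted
red_light = 0.7                # microns, longest visible wavelength (assumed)

print(f"1 micron = 1/{microns_per_inch:,} inch")
print(f"a hair is {hair_width / line_width:.0f} line-widths across")
print(f"the line is about {line_width / red_light:.1f} red-light wavelengths wide")
```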

The availability of such a memory technology has made more meaningful the systems and programming research now going on in many laboratories in office automation, electronic mail and text editing systems. This is not because such studies couldn't be formulated ten or even twenty years ago; some such ideas were indeed formulated that long ago. But only now is the variety of technologies becoming available in laboratories that gives us assurance of the ultimate implementability of such systems. Large, local, reliable and inexpensive memories are important for such systems.

There is constant demand for computer systems to be faster and more responsive. Modern machines are built essentially out of nanosecond technology: the current technology of such machines switches in nanoseconds, billionths of a second. In some very expensive top-of-the-line systems, circuits switch in a few nanoseconds. In slower, less expensive machines, certain functions may take several thousands or tens of thousands of nanoseconds.

In general, however, the state of the art in current electronic technology is nanosecond, billionth-of-a-second, technology. In about a billionth of a second an electrical signal travels nominally about a foot. Therefore, in building modern machines today, we have to put most of the circuits in such a machine within about one cubic foot of each other. And that is feasible. In the IBM 4300 series machines we put seven hundred and four circuits on a quarter-inch silicon chip. If we put enough of those little chips together, we can get almost all the hundred thousand or so circuits of a contemporary machine within a cubic foot of space.
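The "about a foot per nanosecond" figure is not a convention but physics: it falls straight out of the speed of light, which bounds how far any signal can travel (real wiring is somewhat slower):

```python
# The "one foot per nanosecond" rule of thumb, derived from the
# speed of light in vacuum as an upper bound on signal travel.
c = 299_792_458          # metres per second
nanosecond = 1e-9
metres_per_foot = 0.3048

distance_ft = c * nanosecond / metres_per_foot
print(f"light travels {distance_ft:.2f} feet in one nanosecond")
```

Hence the cubic foot: circuits farther apart than that cannot exchange a signal within a single nanosecond cycle.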

However, in an interactive demand environment, yet faster systems may be required to provide the appropriate response time. Fortunately, there are much faster technologies in development in our laboratories.

There is field effect transistor (FET) technology which, using one-micron lines, can provide up to 10,000 logic gates per chip. There is also faster bipolar technology.

In cryogenic technology, circuits are embedded in an environment close to absolute zero in temperature. That is 273 degrees below zero Centigrade, or Celsius, or 459 degrees below zero Fahrenheit. At such temperatures the resistive properties of circuits seem to all but disappear and, using the proper circuit logic, such as Josephson junction tunneling effects, very fast switches can be created.

In recent experiments switches which operate in several picoseconds have been built. A picosecond is a trillionth of a second. Now a picosecond is a very short period of time. A picosecond is to a second what a second is to 31,710 years. And we are able to build experimental circuits which do things in a few picoseconds.
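The quoted comparison checks out with simple division (using a 365-day year, which is evidently the assumption behind the 31,710 figure):

```python
# Checking the quoted ratio: a picosecond is to a second
# what a second is to ~31,710 years.
picoseconds_per_second = 1e12
seconds_per_year = 365 * 86_400     # 365-day year assumed

years = picoseconds_per_second / seconds_per_year
print(f"{years:,.0f} years")
```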

There are certainly some problems involved in such an experiment. In this area we are pressing against the fundamental limits of thermal noise, line impedance and stray capacitance. There are problems with electron migration, and complex and sophisticated demands for the drawing of circuits using the electron beam of a scanning electron microscope. While in current nanosecond technology we must be concerned with hundreds of thousands of circuits packed essentially within a foot of each other, in a picosecond system we must be able to pack all such circuits nominally within one hundredth of an inch of each other.
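The hundredth-of-an-inch packing scale follows from the same speed-of-light argument as the foot-per-nanosecond rule, scaled down a thousandfold:

```python
# How far a signal can travel in one picosecond, which sets the
# packing scale for a picosecond machine (vacuum speed of light
# as an upper bound; real signals are slower).
c = 299_792_458        # metres per second
picosecond = 1e-12
metres_per_inch = 0.0254

distance_in = c * picosecond / metres_per_inch
print(f"about {distance_in:.3f} inch per picosecond")
```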

In order to build such fast circuits we must draw very thin lines. Using an electron beam we are able to draw lines on the order of 120 angstroms wide. An angstrom is one ten-thousandth of a micron. The distance from the nucleus to the electron in the hydrogen atom is about half an angstrom. The thickness of the membrane of the human nerve fiber is just under 100 angstroms. There are even alternative technologies under study that suggest that lines of 50 angstroms width may be feasible.

Another consideration in the rapidly changing technology is that systems today are relatively complicated. I have in mind programming languages such as FORTRAN, COBOL, APL, BASIC and PASCAL, and operating systems such as VM, MVS, UNIX and others. They all represent interfaces to systems which are far too complicated for the most general use we might envision in the future.

Machines must become much less complicated to use. I believe that is happening. The driving force in the simplification of computing systems is the work in programming and systems advanced technology in laboratories around the world. Of special importance is the basic work in the field of artificial intelligence and the current developments in knowledge-based systems. Knowledge-based systems, or expert systems, are interactive and are built around data structures which can represent all the relevant problem-domain facts and their interrelationships. Their interactive dialogue acts as an intelligent consultative system for the user. Such experimental systems already exist in a wide range of applications, including medical diagnosis and treatment, organic chemistry (for the determination of molecular structures based upon spectrographic data), geology (for mineral exploration), structural engineering, the design of molecular genetic experiments, and other areas.
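The fact-and-rule structure just described can be sketched in a few lines of forward chaining (a toy illustration only; the "rules" here are invented for the example and are not from any real diagnostic system):

```python
# A minimal forward-chaining sketch of a knowledge-based system:
# known facts plus if-then rules, applied until nothing new follows.
facts = {"fever", "cough"}
rules = [
    ({"fever", "cough"}, "possible_flu"),   # if both symptoms, infer flu
    ({"possible_flu"}, "recommend_rest"),   # if flu inferred, advise rest
]

changed = True
while changed:                  # repeat until no rule adds a new fact
    changed = False
    for conditions, conclusion in rules:
        if conditions <= facts and conclusion not in facts:
            facts.add(conclusion)
            changed = True

print(sorted(facts))
```

Real expert systems of the period added uncertainty handling and an interactive dialogue, but the core cycle of matching conditions against a growing fact base is the same.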

Systems will be much easier to use in the future. Programming advances that materially simplify the user interface will characterize the decade of the eighties in computer science. We should expect to see a growing simplicity of the interface between user and the complex systems that we build. As the philosopher and theologian Martin Buber stated, "It is more important that the machine reflect our humanness than we become the mirror of the machine."

The individual's access to computer technology depends on that access being simple enough to be manageable by the average person. Information processing has certain inherent complexities. There is no way to wish those complexities away. There seems to be a "conservation of complexity law" in information processing. The inherent complexity is there. It can be handled either by the system or by the user. Until now a great portion of that complexity was handled by the user. But personalized computing has created a market demand that a great deal of that complexity be absorbed by the computing system itself, thus making access to the computer system easier for the user. As a result, a great deal of development and research effort is devoted to creating a better understanding of the information processing function and figuring out how most efficiently and economically to build these required functions and capabilities into products.

There evolves from these considerations a natural evolutionary change in the relation of industry to education. The interaction between high-technology companies and universities is growing dramatically as a result of a natural and increasing interdependence. Scientific discoveries in universities can and often do trigger a whole new industry. But in many areas industrial technology has moved ahead more rapidly than has academic research and curricular development. Universities are dependent upon access to industry, its laboratories, processes and people in order to make engineering contributions to advanced technologies such as microelectronics, magnetics, recombinant DNA, and petroleum science.

The current relationship between universities and industry has deep roots in the economic realities of our times. Faced with declining U.S. competitiveness in worldwide markets, professional societies, the National Science Foundation, companies and universities are all looking for more effective bridges between universities and industry through research cooperation. There is a natural interdependence of universities and industry. This manifests itself in growing support of education by industry and more joint efforts in research and development. Both parties can benefit from such interrelations by access to the experience and skills of the other.

Many in universities seek science for science's sake, for a deeper understanding of the natural world and those who populate it. But many also seek to make this knowledge benefit mankind. For this to occur, the dialogue between those who create the science and those who apply the science needs to increase. Properly developed, this increased dialogue can provide new perspectives without jeopardizing the unique roles each party must play in our society.

Industry needs access to good basic science. We also need access to a pool of well-trained scientists and engineers. Industry now employs 60 percent of the scientists in the United States, and our needs are growing.

The universities' dependence on industry stems from functional and financial considerations. No university, no matter how large or rich, can possibly expect to replicate in its entirety the modern high-technology industrial environment. Even if a university could do this, it would be outdated by technological advance as soon as it were done. The time when technology changes could be quickly and easily accommodated by a rearrangement of instrumentation on a laboratory bench is past.

In industry, a parallel phenomenon is taking place. In 1963, just before the announcement of System/360, IBM had thousands of people working as computer scientists. But the first advanced degree in computer science was awarded in 1964. All these 1963 "computer scientists" were actually mathematicians, physicists, and engineers who learned their trades by apprenticeship.

Then, in the late 1960s and early 1970s, IBM became highly dependent upon microelectronics technology. Our integrated circuit technologists also learned that subject by apprenticeship. No degrees were granted in large-scale-integrated electronics until the late 1970s. The same is true today in our dependence on magnetic technology, CAD/CAM and other advanced areas.

An apprentice to an old master boot-maker can, indeed, learn to make the next pair of boots better than anyone else. But if there is a surprise or unexpected twist, which happens frequently in technology, the apprentice-trained technologist cannot fall back on core curriculum studies to solve those problems. New and formal scientific and engineering training is an essential need for the problem solver. Dependence upon apprentice training is thus a dangerous industrial practice.

As we have recognized each of these academic exposures and dependencies, we have turned to the academic community, offering our grant and contract support to encourage the development of educational plans and research in these much-needed areas. The educators have responded enthusiastically. Another source of interdependence comes from a revolution in materials and process engineering.

Thus, the dependence of industrial technology on leading-edge science grows increasingly acute. Similarly, the dependence of the university on industry continues to grow. The university is no longer bound by its ivy-covered walls and industry is no longer bound by its plant sites. There is emerging a natural symbiotic interdependence between the two. Modern science owes an increasing debt to the latest in technology, which provides the tools needed by an ever-more-sophisticated science. The computer is the most important of those tools.

It is difficult to view the trends and directions in technology separately from the phenomenon of the social acceptance of that technology. The social acceptance and use of the technology is as much a part of the technology as the circuits, memories, devices, printers and other components of the information system. The problem of social non-acceptance is compounded by the fact that technologists have always been poor forecasters, as have been our social thinkers.

Consider the time just before Plato and Aristotle. At that time the method of communicating information was by verbal communication from one person to another, because society was oriented toward an oral culture. All science, all philosophy, all manner of things were essentially communicated by passing information orally. Only at about the time of Plato did we start to understand that a new technology was emerging; only at that time did people learn about the technology of writing. Because of the oral heritage, Plato transcribed his philosophy as a dialogue so that the ideas could be understood. Plato felt that the people had to understand this new medium, this new technology called writing.

Scribes did the writing, in a role not unlike that of computer programmers today. Writing was viewed as a highly technical task, a special kind of task. Plato had not just a single scribe, but a battery of scribes. Scribes had to understand the methods of recording and the media. While Plato dictated his text to one scribe, after a while another had to take over. The quill pens being used on the paper or parchment wore quickly. The scribe would have to stop from time to time and sharpen his pen, and that is why to this very day that little device we carry around in our pocket is called a penknife. While one scribe was sharpening his pen, another would have to keep transcribing while Plato was verbalizing his ideas.

The great thinkers of that time began to feel that they had to improve the technology to make it easier to use. They learned to manufacture the parchment so it was smoother; thus the pen didn't wear out as fast. The phrase "gloss over" comes into the language from this development: as the paper and pens were made better, one could gloss over, and write down thoughts much more rapidly, permitting the speaker to communicate in a more fluid, stream-of-consciousness way than was possible before the technology improved. Ancient Greek philosophers worried that the effects of the technology of writing were so profound and so deep that writing might signal an end to human memory. There might never be a need to remember things any more, because we could write everything down on a parchment and, weeks, months, or years later, take down that scroll from the pillar and read each detail as it was written. It was a revolution in human culture.

As we've seen in the case of the vacuum tube, there will always be skeptics where something new is concerned. Winston Churchill was a young man on a tour of duty in India when the X-ray was discovered. He read about it in the Indian press and wrote a letter home, which appears in the first volume of his memoirs. In it, he feared that this X-ray machine would spell an end to all human privacy because it could see through walls and clothing. He and his friends were so taken with this fear that they created a ditty that went something like this: "What a terrible thing, an horrible thing, is the new photography."

No one today would think of the X-ray as a threat to privacy. First impressions of a technology may not be correct, and being a pessimist about technology is not an uncommon posture. Woody Allen wrote re­cently in an editorial in The New York Times: "More than any other time in history, mankind faces the crossroads. One path leads to despair and ut­ter hopelessness; the other to total extinction."

The poet Keats once said that Isaac Newton had taken the romance out of the rainbow when he told us about the structure of light. I think Keats was wrong. The romance that is in the rainbow is not there because of our lack of understanding about the structure of light. The romance of technology lies in the future potential of that technology to serve society.

Whatever their field, people need, and generate, information, and that accounts for the exploding need for information processing. Some say the use of information or symbols is the single attribute that distinguishes humans from animals, even more than the opposable thumb or the use of fire. In the early days, we expected people with problems to come to the machine. Today, we've learned that it's much more useful to bring the capability of information processing to the people.

Information systems are really fundamentally different from all the other inventions we've become familiar with. The difference tells us something about their ultimate usefulness. We intuitively know that all other machines associated with the Industrial Revolution were used only by the nations that had them. So, for example, it's quite reasonable to speculate that only the English, French and Americans were candidates for building the Panama and Suez Canals because they, and maybe a few other nations, had the technical know-how to solve the disease problems and the giant steam shovels to construct such large canals. Very naturally, the use of technology was almost always limited to those who had it.

But the computer is different. Once it exists anywhere, it exists everywhere. There are students in New York, Chicago and Los Angeles sitting at terminals trying to understand genetic principles, agricultural irrigation techniques, causes of cancer and world weather phenomena. There are also students sitting at terminals in Addis Ababa and Bangkok, often connected by satellite, using the same technology to study the same problems. It is possible that the great insight into the causes of cancer will be discovered by such a student in Bangkok or by another student in Los Angeles. The qualified researcher in Bangkok has no particular disadvantages.

The information machine has this unique characteristic: it's a democratizing instrument. It has a great leveling effect in making the entire society "information-literate," in making information available to people where they need it and can use it. Like Gutenberg's invention of movable type, modern information processing capability creates a society where everyone has equal opportunity to be information-literate.

When Gutenberg invented movable type several hundred years ago, for example, the printed word was no longer restricted to the privileged. In a somewhat similar way, the computing machine, because of its ever decreasing cost, has made all kinds of information accessible to people everywhere. That doesn't mean that everyone will necessarily have a machine, or will want or need one. It means that those who do have a need, who can benefit from its use, will have access to one where they work and, ultimately, where they live.

And this new technology will be as pervasive as Gutenberg's. And so the real question is not what kind of new automatic device I will have to clean my rug; the real question is what I have to do as a citizen to be prepared for this change, to use it to my advantage, to be able to use it in my life, to allow me to live as a contemporary person in the twentieth century. And the answer to that is the same answer we would have given that person in 1468, or Mr. Gutenberg himself. To understand the impact, we have to think about the implications of this technology on ballet or history or education.

If it all comes down to a robot that's going to sweep the floor automatically and drive the child to the dance school automatically, then what has changed? Instead of walking the child to the dance studio, or riding the child on horseback to the dance studio, or riding the child in a wagon to the dance studio, or taking the child in a Model T Ford to the dance studio, or taking the child in a Ferrari to the dance studio, tomorrow we'll take the child to the dance studio in a hovercraft. Those changes are really superficial. What I'm trying to suggest is that there's a human challenge in this technology; it may or may not change how you get your children to the dance studio. But it is going to change the way you think about teaching dancing, or indeed where the dance studio is, or even what dance is, and the purpose of dancing, and who does it, and who watches it, and whether it is being viewed in real time or through a recorded medium at a later date, and whether that recording is edited to change the original conception by the artist, and why. That is what's important.

Of course, it isn't dance per se we're talking about. It's all creative activity.

Picture Gutenberg, who has now invented movable type, saying to the few monks who then were able to read and write, and who were reading and writing only religious tracts and hymnals and related documents, that we're going to use this technology to put in everyone's home every morning three pages listing the opening, ask, bid and closing prices of every stock on the New York Stock Exchange. Now that would be, in 1460, outrageous. There was no stock exchange, there was no ask or bid price, there was no newspaper; there was no New York. The notion of bringing numbers to your home at 6 a.m. every morning for some meaningful purpose would have been outrageous.

One problem today is that we can't think of applications of information processing that are outrageous enough.

I suggest that there are some things information processing will not be. It will affect how we work, but it's not a surrogate for working in our offices and plants. It will affect education, but it's not a surrogate educational medium. Nor is it a surrogate entertainment medium. That doesn't mean we won't use it for entertainment, but we'll still go to the concert hall even though we own the records, and we'll still go to the movie theater even though we own the videotapes, and we'll still go to the baseball game even though it's broadcast live, in living color, with playback on television. Because there's something special that takes place in these activities. Biologists have said that when we finally understand every single bit of information in the genetic code, and we know where it resides, and what its chemical combination is, and where it is on the double helix, we still won't have a living thing.

So where is the "computer revolution" my columnist friend wanted to find? Everyone is looking for evidence of it in the kitchen. It's here in the mind. It's a revolution in the way people think. It's the way we think about law, privacy, society. That's not to say that there won't be automated gadgets all over the place. It's changing the minds of the students in Bangkok. It's changing the view of the stockbroker in Sweden. It's changing the plans of the fishing fleet owner in Boston; it's changing the life of the clothing manufacturer in Madrid. Again, the revolution is really here in our minds. When I walk into the kitchen, the revolution is in the kitchen. And wherever you're bringing your mind, you're bringing the revolution. You are part of that revolution if you're a housewife; you're a part of that revolution if you're a microbiologist; you're part of that revolution if you're a town alderman. You're a part of that revolution if you're a cop on the beat; you're part of that revolution if you're a computer programmer. You're a part of that revolution if you're an airline reservation clerk. You're part of that revolution if you're a teacher, student or tailor.

There's a popular fantasy about science and technology that brings the following image to mind. We call a meeting in a great coliseum of all the scientists and technologists in the world. And we have at the front of the room an army of scribes, or clerks. We start at the first seat of the first row and we ask the first scientist to stand up and list all the questions that he or she can think of that are unanswered, and they're all written down on the blackboard, one by one. And then we get to the next person and we say, "Can you think of some others?" and he or she adds some more, of course, being in a different field. And then we get to the last person, who adds a few more. And now we have on the blackboard every question that every scientist can think of. When a scientist makes a discovery, he or she calls in to the coliseum and tells the proper secretary to cross off the questions that have been answered. So we keep an inventory. Then we agree, in the UN, that on the day the last question is crossed off the board, and there are no more questions, we'll have a big four-day international holiday weekend to celebrate this great achievement. Well, that's absurd, and what's absurd about it is this: every time we cross off one question, we add ten more to the bottom of the list. The list doesn't get shorter; it gets longer all the time. When the discovery of the double helix was made, we were able to formulate questions about genetics which were unthinkable, unformulable, before we knew the double helix existed. When we fly by Saturn, we are able to formulate questions about the rings of Saturn that we could not formulate before. While we know more every day, what we don't know gets greater and greater, and so it becomes harder, not easier, to say what will happen next. When one thought the earth was the center of the universe and there was nothing beyond the visible stars, it was pretty easy to talk about cosmology. The minute we know there are galaxies out there, there are a lot more questions to be answered. The list doesn't get shorter; it gets longer.

H.G. Wells ended one of his adventure stories with its hero saying, "For man there is no rest and no ending. He must go on conquest beyond conquest. And when he has conquered all the deeps of space and all the mysteries of time he will still be but beginning." The challenge for all of us is always the challenge of the future. As Eugene O'Neill wrote, "A man's work is in danger of deteriorating when he thinks he has found the one best formula for doing it." The daring opportunity we face is in sustaining our openness to change. We cannot predict precisely what will happen in our technology. But even in the face of that uncertainty, we must live with that technology and advance it.

One thing that is certain is that this technology will continue to change. We have not heard the last word. The challenge is to recognize, I think, that there will be change, dramatic change, in the opportunity to apply information technology. You and I will be there. We'll have to know how to design to it, how to build to it, how to apply it, how to use it in our institutions and how to teach it to our students. And yet we will have to face that challenge with the knowledge that we cannot predict today precisely how that change will manifest itself in the future. It is in that climate of ignorance of what the change will be that we will have to work.

What is the total potential of computers? I think it is limited only by our imagination. It's like giving an artist a palette that has an infinite number of colors, some of them even invisible to the naked eye, like ultraviolet or x-rays. It's like putting into the hands of a sculptor a chisel of infinite sharpness that makes it possible to model the growth of trees, the flow of water, the distillation of petroleum, the configuration of computers, the structure of the earth's crust, the growth of cancer cells, the flight to Saturn or even the movement of chess pieces. It's like giving to a composer a set of musical notes and harmonics and chords, some of which have never been played before. It's like giving to a poet an alphabet that is not limited to our letters and ideograms, that is virtually infinite in scope. It is a tool for the realization of ideas, and it carries with it a challenge for all of us. Today, information processing invites us, summons us, dares us, commands us to respond in education, to apply these technologies to problems that affect the way we live and the quality of our lives.