----- history

In the early 1940s, when computers were still in their infancy, designers had many hardware and software problems to overcome, and bitter arguments ensued as new and radical prototypes appeared. One such area of concern was the machine's overall storage system.

Early computers used a two-level storage system consisting of main memory and secondary memory. Main memory (the role RAM plays today) was built from magnetic cores, and secondary memory (today's hard disks) was made up of magnetic drums. Back then, storage was neither inexpensive nor plentiful. Today, 128 megabytes of RAM is standard in most personal computers; sixty years ago, the most sophisticated computer filled a warehouse and was lucky to have 128 kilobytes, a fraction of what a floppy disk can hold.

This lack of space had serious implications. What happens if a program "grows" while it is being executed? Eventually it runs out of main memory. This was one of programmers' main complaints: they had to re-code their programs every time they switched machines or added more memory. It would be nice to have an "unlimited" amount of fast, cheap memory in order to do some serious computing. AHA! Main memory is not the only type of memory we have.

Although significantly slower, the non-volatile hard drive gives us far more room. When we run out of main memory, we can use the hard drive to store data and code. Because the hard drive is so slow, we would like to keep most of the current program in fast main memory while creating the illusion of an unlimited amount of fast memory. Doing this required a special, automatic combination of hardware and software; with it, all memory could be treated as being on the same level. Thus, the concept of virtual memory was born.

All new and great ideas usually come into the world on unfriendly terms, and virtual memory was no different. When the idea first surfaced, it was considered too radical by the conservative computing profession. The first virtual memory machine was developed in 1959; its designers called it the "one-level storage system." Although it was heavily criticized, it spurred many new prototypes during the early 1960s.

Before virtual memory could be regarded as a stable technique, many models, experiments, and theories had to be developed to overcome its numerous problems. Specialized hardware had to be designed to take a "virtual" address and translate it into an actual physical address in memory (primary or secondary). Some worried that this mechanism would be expensive and hard to build, and that address translation would consume too much processor power.
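To make the translation step concrete, here is a minimal sketch in Python of the kind of lookup that page-translation hardware performs. The page size, the page-table contents, and the translate function are illustrative assumptions for teaching purposes, not a description of any real machine of the era.

    PAGE_SIZE = 4096  # bytes per page (illustrative; real machines vary)

    # Toy page table: virtual page number -> physical frame number.
    # A missing entry means the page is not in main memory (a "page fault"),
    # and the system would have to fetch it from secondary storage.
    page_table = {0: 5, 1: 2, 3: 7}

    def translate(virtual_address):
        """Translate a virtual address into a physical address."""
        page_number = virtual_address // PAGE_SIZE   # which virtual page
        offset = virtual_address % PAGE_SIZE         # position within the page
        if page_number not in page_table:
            raise LookupError(f"page fault: virtual page {page_number} not resident")
        frame_number = page_table[page_number]
        return frame_number * PAGE_SIZE + offset

    print(hex(translate(0x1234)))  # virtual page 1 maps to frame 2 -> 0x2234

On a real machine this lookup happens in hardware on every single memory access, which is exactly why early critics worried that the translation would be too expensive.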

The final debate was laid to rest in 1969. The question was: could systems controlled by virtual memory perform better than manual programmer strategies? An IBM research team led by David Sayre showed that the virtual memory overlay system consistently worked better than the best manually controlled systems.

By the late 1970s the idea had been refined enough to appear in every commercial computer. Personal computers were a different matter: developers thought their machines would not be subject to the problems of large-scale commercial computers, so early PCs didn't include virtual memory. Ironically enough, PC developers ran into the same problems their predecessors had discovered during the 1950s and 1960s. In 1985 Intel built virtual memory and cache support into the 386 microprocessor, and Microsoft later used it to provide multitasking in Windows 3.1. Others followed, and virtual memory found its place in our everyday lives.


George Mason University / The Core of Information Technology