An Introduction to Programming Using Visual Basic 2005 (6th Edition)

1.5. Biographical History of Computing

The following people made important contributions to the evolution of the computer and the principles of programming. While we think of the computer as a modern technology, it is interesting to note that many of its technologies and concepts were developed decades before Silicon Valley became an address in American culture.

1800s

George Boole: a self-taught British mathematician; devised an algebra of logic that later became a key tool in computer design. The logical operators presented in Section 5.1 are also known as Boolean operators.
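Boole's algebra of logic lives on directly in modern languages. As a quick illustration (sketched in Python rather than Visual Basic, with made-up variable names), the three basic Boolean operators combine truth values like so:

```python
# The three basic operators of Boole's algebra of logic.
passed_exam = True
wrote_thesis = False

graduates = passed_exam and wrote_thesis   # conjunction: True only if both are True
eligible = passed_exam or wrote_thesis     # disjunction: True if at least one is True
must_retake = not passed_exam              # negation: flips the truth value

print(graduates, eligible, must_retake)    # False True False
```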

Charles Babbage: a British mathematician and engineer; regarded as the father of the computer. Although the mechanical "analytical engine" that he conceived was never built, it influenced the design of modern computers. It had units for input, output, memory, arithmetic, logic, and control. Algorithms were intended to be communicated to the computer via punched cards, and numbers were to be stored on toothed wheels.

Augusta Ada Byron: a mathematician and colleague of Charles Babbage; regarded as the first computer programmer. She encouraged Babbage to modify the design based on programming considerations. Together they developed the concepts of decision structures, loops, and a library of procedures. Decision structures, loops, and procedures are presented in Chapters 5, 6, and 4 of this text, respectively.

Herman Hollerith: the founder of a company that was later to become IBM; at the age of 20, he devised a computer that made it possible to process the data for the U.S. Census of 1890 in one-third the time required for the 1880 census. His electromagnetic "tabulating machine" passed metal pins through holes in punched cards and into mercury-filled cups to complete an electric circuit. Each location of a hole corresponded to a characteristic of the population.

1930s

Alan Turing: a gifted and far-sighted British mathematician; made fundamental contributions to the theory of computer science, assisted in the construction of some of the early large computers, and proposed a test for detecting intelligence within a machine. His theoretical "Turing machine" laid the foundation for the development of general-purpose programmable computers. He changed the course of the Second World War by breaking the German "Enigma" code, thereby making secret German messages comprehensible to the Allies.

John V. Atanasoff: a mathematician and physicist at Iowa State University; declared by a federal court in Minnesota to be the inventor of the first electronic digital special-purpose computer. Designed with the assistance of his graduate assistant, Clifford Berry, this computer used vacuum tubes (instead of the less efficient relays) for storage and arithmetic functions.

1940s

Howard Aiken: a professor at Harvard University; built the Mark I, a large-scale digital computer functionally similar to the "analytical engine" proposed by Babbage. This computer took five years to build and used relays for storage and computations. It was technologically obsolete before it was completed.

Grace M. Hopper: retired in 1986 at the age of 79 as a rear admiral in the United States Navy; wrote the first major subroutine (a procedure that was used to calculate sin x on the Mark I computer) and one of the first assembly languages. In 1947, she found that a moth trapped in a relay of the Mark II was causing the computer to malfunction, thus popularizing the term "debugging" for finding errors. As an administrator at Remington Rand in the 1950s, Dr. Hopper pioneered the development and use of COBOL, a programming language for the business community written in English-like notation.

John Mauchly and J. Presper Eckert: electrical engineers working at the University of Pennsylvania; built the first large-scale electronic digital general-purpose computer to be put into full operation. The ENIAC used 18,000 vacuum tubes for storage and arithmetic computations, weighed 30 tons, and occupied 1500 square feet. It could perform 300 multiplications of two 10-digit numbers per second, whereas the Mark I required 3 seconds to perform a single multiplication. Later they designed and developed the UNIVAC I, the first commercial electronic computer.

John von Neumann: a mathematical genius and member of the Institute for Advanced Study in Princeton, New Jersey; developed the stored program concept used in all modern computers. Prior to this development, instructions were programmed into computers by manually rewiring connections. Along with Herman H. Goldstine, he wrote the first paper on the use of flowcharts.

Maurice V. Wilkes: an electrical engineer at Cambridge University in England and student of von Neumann; built the EDSAC, the first computer to use the stored program concept. Along with D. J. Wheeler and S. Gill, he wrote the first computer-programming text, The Preparation of Programs for an Electronic Digital Computer (Addison-Wesley, 1951), which dealt in depth with the use and construction of a versatile subroutine library.

John Bardeen, Walter Brattain, and William Shockley: physicists at Bell Labs; developed the transistor, a miniature device that replaced the vacuum tube and revolutionized computer design. It was smaller, lighter, more reliable, and cooler than the vacuum tube.

1950s

John Backus: a programmer for IBM; in 1953, headed a small group of programmers who wrote the most extensively used early interpretive computer system, the IBM 701 Speedcoding System. An interpreter translates a high-level language program into machine language one statement at a time as the program is executed. In 1957, Backus and his team produced the compiled language Fortran, which soon became the primary academic and scientific language. A compiler translates an entire program into efficient machine language before the program is executed. (Visual Basic combines the best of both worlds. It has the power and speed of a compiled language and the ease of use of an interpreted language.)

Reynold B. Johnson: IBM researcher; invented the computer disk drive. His disk drive, known as the RAMAC, weighed a ton and stored five megabytes of data. Mr. Johnson's other inventions included an electromechanical device that can read pencil-marked multiple-choice exams and grade them mechanically, the technology behind children's "Talk to Me Books," and major advances in the quality of tapes used in VCRs. He was a 1986 recipient of the National Medal of Technology.

Donald L. Shell: in 1959, the year that he received his Ph.D. in mathematics from the University of Cincinnati, published an efficient algorithm for ordering (or sorting) lists of data. Sorting often consumes a significant amount of running time on computers. The Shell sort is presented in Chapter 7 of this text.
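Shell's idea can be sketched in a few lines (shown here in Python as an illustration; the text itself develops the Shell sort in Visual Basic in Chapter 7): an insertion sort is applied to gap-separated sublists, with the gap shrinking until it reaches 1.

```python
def shell_sort(data):
    """Shell's 1959 method: insertion sort over progressively smaller gaps."""
    items = list(data)          # sort a copy, leaving the original untouched
    gap = len(items) // 2
    while gap > 0:
        for i in range(gap, len(items)):
            current = items[i]
            j = i
            # Insert current into the gap-separated sublist behind it.
            while j >= gap and items[j - gap] > current:
                items[j] = items[j - gap]
                j -= gap
            items[j] = current
        gap //= 2
    return items

print(shell_sort([23, 5, 42, 9, 17, 1]))   # [1, 5, 9, 17, 23, 42]
```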

1960s

John G. Kemeny and Thomas E. Kurtz: professors of mathematics at Dartmouth College and the inventors of BASIC; led Dartmouth to national leadership in the educational uses of computing. Kemeny's distinguished career included serving as an assistant to both John von Neumann and Albert Einstein, serving as president of Dartmouth College, and chairing the commission to investigate the Three Mile Island nuclear power plant accident. In later years, Kemeny and Kurtz devoted considerable energy to the promotion of structured BASIC.

Corrado Böhm and Giuseppe Jacopini: European mathematicians; proved that any program can be written with the three structures discussed in Section 2.2: sequence, decisions, and loops. This result led to the systematic methods of modern program design known as structured programming.
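A tiny sketch (in Python, purely illustrative) shows the three structures working together in one small function:

```python
def count_passing(scores):
    # Sequence: statements simply execute one after another.
    passing = 0
    for score in scores:        # Loop: repeat the body once per score
        if score >= 60:         # Decision: choose whether to run the next line
            passing += 1
    return passing

print(count_passing([45, 72, 88, 59, 91]))   # 3
```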

Edsger W. Dijkstra: professor of computer science at the Technological University at Eindhoven, The Netherlands; stimulated the move to structured programming with the publication of a widely read article, "Go To Statement Considered Harmful." In that article, he proposed that GOTO statements be abolished from all high-level languages such as BASIC. The modern programming structures available in Visual Basic do away with the need for GOTO statements.

Harlan B. Mills: IBM Fellow and professor of computer science at the University of Maryland; advocated the use of structured programming. In 1969, Mills was asked to write a program creating an information database for the New York Times, a project that was estimated to require 30 person-years with traditional programming techniques. Using structured programming techniques, Mills single-handedly completed the project in six months. The methods of structured programming are used throughout this text.

Donald E. Knuth: professor of computer science at Stanford University; generally regarded as the preeminent scholar of computer science in the world. He is best known for his monumental series of books, The Art of Computer Programming, the definitive work on algorithms.

Ted Hoff, Stan Mazor, Robert Noyce, and Federico Faggin: engineers at the Intel Corporation; developed the first microprocessor chip. Such chips, which serve as the central processing units for microcomputers, are responsible for the extraordinary reduction in the size of computers. A computer with greater power than the ENIAC now can be held in the palm of the hand.

Douglas Engelbart: human interface designer at the Stanford Research Institute; inventor of the computer mouse. While most of us would believe that the mouse is a new technology, the prototype was actually developed in the 1960s. Funded by a government project, Engelbart and his team developed the idea of a mouse to navigate a computer screen with pop-up "windows" to present information to the user. In a contest to choose the best navigation tool, the mouse won over the light pen, a joystick, a "nose-pointing" device, and even a knee-pointing device!

1970s

Ted Codd: software architect; laid the groundwork for relational databases in his seminal paper, "A Relational Model of Data for Large Shared Data Banks," which appeared in the June 1970 issue of the Communications of the ACM. Relational databases are studied in Chapter 10 of this text.

Paul Allen and Bill Gates: cofounders of Microsoft Corporation; developed languages and the original operating system for the IBM PC. The operating system, known as MS-DOS, is a collection of programs that manage the operation of the computer. In 1974, Gates dropped out of Harvard after one year, and Allen left a programming job with Honeywell to write software together. Their initial project was a version of BASIC for the Altair, the first microcomputer. Microsoft is one of the most highly respected software companies in the world and a leader in the development of applications and programming languages.

Stephen Wozniak and Steven Jobs: cofounders of Apple Computer Inc.; started the microcomputer revolution. The two had met as teenagers while working summers at Hewlett-Packard. Another summer, Jobs worked in an orchard, a job that inspired the names of their computers. Wozniak designed the Apple computer in Jobs's parents' garage, and Jobs promoted it so successfully that the company was worth hundreds of millions of dollars when it went public. Both men resigned from the company in 1985. Jobs founded a new company that developed the NeXT computer. He later returned to Apple and incorporated aspects of the NeXT computer into the Apple operating system.

Dan Bricklin and Bob Frankston: cofounders of Software Arts; wrote VisiCalc, the first electronic spreadsheet program. An electronic spreadsheet is a worksheet divided into rows and columns, which analysts use to construct budgets and estimate costs. A change made in one number results in the updating of all numbers derived from it. For instance, changing a person's housing expenses will immediately produce a change in total expenses. Bricklin got the idea for an electronic spreadsheet after watching one of his professors at Harvard Business School struggle while updating a spreadsheet at the blackboard. VisiCalc became so popular that many people bought personal computers just so they could run the program.
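The ripple-through recalculation that made VisiCalc so useful can be sketched in a few lines (Python is used here for illustration, and the expense figures are invented):

```python
# A derived value recomputed from its inputs, spreadsheet-style.
expenses = {"housing": 1200, "food": 450, "transportation": 200}

def total_expenses(exp):
    return sum(exp.values())

print(total_expenses(expenses))    # 1850
expenses["housing"] = 1400         # change one number...
print(total_expenses(expenses))    # 2050  ...and the derived total updates with it
```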

Dennis Ritchie: member of the team at Bell Labs; creator of the C programming language. C is often referred to as a "portable assembly language." Programs developed in C benefit from speed of execution by being fairly low-level and close to assembly language, yet not being tied up in the specifics of a particular hardware architecture. This characteristic was particularly important to the development of the Unix operating system, which occurred around the same time as the development of C. Throughout the 1970s, 1980s, 1990s, and even today, C has been a widely used language, particularly in situations where very fast program execution time is important.

Ken Thompson: member of the team at Bell Labs that created the Unix operating system as an alternative to the operating system for IBM's 360 mainframe computers. Unlike many other earlier operating systems, Unix was written in C instead of assembly language. This allowed it to be adapted to a wide variety of computer architectures. Programmers could then develop programs in C that were intended to run on a Unix operating system, avoiding much of the rewriting involved in porting (adapting) these programs from one type of machine to another. Over the past 30 years, many variants upon Unix have emerged, often referred to as different "flavors" of Unix. Unix and its variants have played a tremendous role in the growth of the Internet, as well as being an operating system used by many commercial, scientific, and academic institutions.

Alan Kay: a brilliant programmer at the University of Utah; crystallized the concept of reusable building blocks of code to develop software programs. He developed a new language, Smalltalk, a pure object-oriented language, while at Xerox Palo Alto Research Center (PARC) in the 1970s. Most of today's programming languages such as C++, C#, Java, and Visual Basic make use of object-oriented features first developed in Smalltalk. Still, because of its conceptual purity, Kay believes that Smalltalk "is the only real object-oriented language."

Donald Chamberlin: a Stanford Ph.D. and National Science Foundation scholar working at IBM; created a database programming language, later known as SQL (Structured Query Language). This innovative language was built on a "relational" model for data, where related data groups could be put into tables, then linked in various ways for easy programming and access. Very few people know that one of the world's largest software companies, Oracle Corporation, was founded on this technology, developed by IBM and published for all to use. SQL is covered in Chapter 10 of this book.
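The table-linking idea is easy to see with Python's built-in sqlite3 module (the table names and sample rows below are invented purely for illustration):

```python
import sqlite3

# Two related tables linked by a shared key: the core of the relational model.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT)")
db.execute("CREATE TABLE books (title TEXT, author_id INTEGER)")
db.execute("INSERT INTO authors VALUES (1, 'Adams'), (2, 'Baker')")
db.execute("INSERT INTO books VALUES ('A First Book', 1), ('Second Thoughts', 2)")

# An SQL JOIN follows the shared key from one table to the other.
rows = db.execute("""SELECT authors.name, books.title
                     FROM authors JOIN books ON authors.id = books.author_id
                     ORDER BY authors.name""").fetchall()
for name, title in rows:
    print(name, "-", title)
```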

1980s

Philip "Don" Estridge: head of a product group at IBM; directly responsible for the success of the personal computer. The ubiquity of the PC today can be attributed to a marketing decision by Estridge to make off-the-shelf, easily producible computers for a mass market, and to back that with IBM's huge marketing resources. Estridge's "skunk-works" group in Boca Raton broke many established IBM rules for product introduction. The IBM PC, introduced in 1981, chose an operating system from Microsoft and a processor chip from Intel over other vendors. This licensing deal opened the way for Microsoft's and Intel's successes today.

Mitchell D. Kapor: cofounder of Lotus Corporation; wrote the business software program Lotus 1-2-3, one of the most successful pieces of software for personal computers in its time. Lotus 1-2-3 is an integrated program consisting of a spreadsheet, a database manager, and a graphics package.

Tom Button: group product manager for applications programmability at Microsoft; headed the team that developed QuickBasic, QBasic, and Visual Basic. These modern, yet easy-to-use, languages greatly increased the productivity of programmers.

Alan Cooper: director of applications software for Coactive Computing Corporation; considered the father of Visual Basic. In 1987, he wrote a program called Ruby that delivered visual programming to the average user. A few years later, Ruby was combined with QuickBasic to produce Visual Basic, the remarkably successful language that allows Windows programs to be written from within Windows easily and efficiently.

Tim Berners-Lee: British computer scientist; father of the World Wide Web. He proposed the Web project in 1989 while working in Switzerland. His brainchild has grown into a global phenomenon. Berners-Lee, currently a senior research scientist at MIT, was awarded the first Millennium Technology Prize in 2004.

Charles Simonyi: a Hungarian programmer; known to the industry as the "father of Word." He left his native Budapest as a 17-year-old prodigy to work at Xerox's prestigious Palo Alto Research Center (PARC), where he developed the capability of "What You See Is What You Get" (WYSIWYG) software. This technology, which allows users to define the fonts and presentations for computer output, opened the door to desktop publishing on the personal computer. In 1980, Simonyi joined a fledgling software company called Microsoft and developed Microsoft Word into one of the most widely used software programs ever.

Bjarne Stroustrup: a native of Denmark; creator of the C++ programming language. Stroustrup came to the United States to work for Bell Labs, during which time he created C++ to extend the C programming language with additional capabilities for object-oriented and generic programming. C++ has been one of the most widely used programming languages, combining the speed and efficiency of C with features that make the development of large-scale programs much simpler. Because of its ability to work at a low level in a manner similar to C, C++ remains the language of choice for many projects where other programming languages such as Java and Visual Basic are not suitable.

Richard M. Stallman: a star programmer at MIT's Artificial Intelligence Lab and a MacArthur Foundation Fellow; founded the Free Software Foundation (FSF). The FSF is an organization dedicated to promoting the free availability of software for public access, modification, and improvement. This philosophy contrasts with that of a large part of the commercial software development world, where software is developed for sale, but the full rights to the source code are maintained by the company writing the software. Among his many technical accomplishments, Stallman created free versions of EMACS (a highly popular text editor on Linux/Unix systems) and GCC (a free C language compiler).

1990s

Marc Andreessen: a former graduate student at the University of Illinois; inventor of the Web browser. At the university's National Center for Supercomputing Applications (NCSA), he led a small band of fellow students to develop Mosaic, a program that allowed the user to move around the World Wide Web by clicking on words and symbols. Andreessen went on to cofound Netscape Communications Corporation. Netscape Navigator was the leading Web browser through the mid-1990s before being surpassed by Microsoft's Internet Explorer.

James Gosling: corporate vice president and Sun Fellow at Sun Microsystems; creator of the Java programming language. What started as an attempt to create a simple language for a networked world grew into Java, an object-oriented language that became popular for Internet programming. Java has become the primary teaching language at many universities. Visual Basic has many of the features and capabilities of Java.

Linus Torvalds: a graduate of the University of Helsinki in Finland; developed the popular Linux operating system. Linux began as a project by Torvalds to create a Unix operating system that could be used on personal computers. In the early 1990s, he began sharing the Linux source code with other OS programmers over the Internet, allowing them to contribute to and improve it. This philosophy resonated with the Internet culture, and the popularity of Linux grew quickly. Today, Linux is widely used, particularly as an operating system for Web servers. It is an open-source operating system, meaning that the source code (instructions) is made freely available for anyone to obtain, view, modify, and use.