Randy Cannon: Computer Science Information

Randy Cannon & The History of Computing


Randy Cannon has studied the history of computing as part of computer science and is excited to share that information with you. In this post, Randy Cannon covers the technical history of computing, beginning with an introduction to some of the computer science pioneers. He then covers the hardware and software history of computing and concludes with a look at computing as a tool and as a discipline.

Many of you who grew up using the computers of today know about Elon Musk of Tesla, Mark Zuckerberg of Facebook, and Larry Page of Google. Some of you younger folks may even remember some of the pioneers of computing like Steve Jobs of Apple and Bill Gates of Microsoft. These are today’s celebrities in computing, thanks to their mega success in affecting the lives of billions and amassing a nice fortune in the process. While these individuals have made great contributions to the industry, many others are not well known outside of the computer science field. This post highlights some of those people and their contributions to computing and the field of computer science.


Technical History

The first forms of computing date back to astronomical Stonehenge and the abacus, which was developed for arithmetic. Blaise Pascal was a French mathematician who built a gear-driven calculating machine. Gottfried Wilhelm von Leibniz was a German mathematician who built the first calculator that could add, subtract, multiply, and divide. Joseph Jacquard built the first weaving machine controlled by punched cards. Charles Babbage was a British mathematician who envisioned the “analytical engine,” which included components found in modern computers, such as memory, numerical input, and mechanical steps driven by punched cards. Ada Augusta, also known as the Countess of Lovelace, was a mathematician who refined Babbage’s analytical engine. Regarded as the first programmer, she developed the loop, a series of repeating instructions. The U.S. Department of Defense later named the Ada programming language after her. William Burroughs produced and sold a mechanical adding machine. Dr. Herman Hollerith developed the first electro-mechanical tabulator, which read punched cards for the census; Hollerith also founded the company that later became IBM. In 1936, Alan M. Turing, a British mathematician, described the Turing machine, a foundation of computing theory (Dale & Lewis). The Turing Award, named in his honor, is the most prestigious award given in computer science. As this list of pioneers shows, many of them were mathematicians.
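Lovelace’s loop, a series of instructions repeated until some condition is met, remains the backbone of programming today. Here is a minimal sketch in modern Python (my illustration, not from the sources above):

```python
# Sum the numbers 1 through 10 by repeating the same two
# instructions (add, then advance) until the counter passes 10.
total = 0
n = 1
while n <= 10:
    total += n   # repeated instruction
    n += 1       # advance the counter
print(total)     # 55
```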

More complex computer systems soon followed; here are a few examples. The Harvard Mark I and the ENIAC were the two most famous machines of the World War II era. The UNIVAC I was the first commercial computer, used by the Bureau of the Census in 1951, and the first computer used to predict a presidential election (Dale & Lewis).

Computer Hardware Generations

According to Nell Dale and John Lewis, there are four generations of computer hardware: the first from 1951 to 1959, the second from 1959 to 1965, the third from 1965 to 1971, and the fourth from 1971 to the present.

The first generation used vacuum tubes to store information, with magnetic drums serving as memory. Magnetic tape drives were later developed as auxiliary/peripheral storage devices (Dale & Lewis).

The second generation included the invention of the transistor, which replaced the vacuum tube. John Bardeen, Walter H. Brattain, and William B. Shockley won a Nobel Prize for this invention. The magnetic disk replaced magnetic tape because it allowed information to be accessed directly rather than sequentially (Dale & Lewis).

In the third generation, integrated circuits replaced printed circuit boards, making machines smaller, cheaper, faster, and more reliable. This is also when Gordon Moore, co-founder of Intel, observed that circuit capacity doubled each year, an observation that later became known as Moore’s Law. This period also introduced the terminal, with a keyboard and monitor as input/output devices (Dale & Lewis).

The fourth generation brought large-scale integration: more power in a smaller package at a lower cost. Moore’s Law was revised to circuit capacity doubling every 18 months. This is the age of the personal computer and the microcomputer: Apple, Radio Shack, Atari, Commodore, Steve Wozniak, Steve Jobs, the IBM PC, the Macintosh, and business workstations networked in the workplace. RISC (Reduced Instruction Set Computer) architecture was also developed in this era, designing computers around a smaller, simpler set of machine language instructions. In 1987, Sun Microsystems introduced workstations with RISC chips and a UNIX operating system. Moore’s Law was restated again: every 18 months, twice the power at the same price, or the same power at half the price.

Parallel computing was also developed in this generation, with interconnected processing units organized as SIMD (Single Instruction, Multiple Data stream) or MIMD (Multiple Instruction, Multiple Data stream) architectures. Networking now allowed smaller machines to share resources like printers, software, and data, aided by Robert Metcalfe and David Boggs’s invention of Ethernet in 1973. In 1979, DEC, Intel, and Xerox established Ethernet as a networking standard. In 1989, Novell’s NetWare connected personal computers over local area networks to a file server, a PC with mass storage and good input/output capability. We also saw the Internet arise out of the ARPANET, a U.S. government network developed in the late 1960s. The Internet used packet switching to share lines for sending messages; today it uses TCP/IP (Transmission Control Protocol/Internet Protocol) (Dale & Lewis).
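As a rough illustration of the SIMD idea (mine, not Dale & Lewis’s), one instruction applied across a whole data stream can be mimicked in Python with an element-wise operation:

```python
# SIMD-style thinking: one instruction ("multiply by 2") applied
# across multiple data elements, instead of one scalar at a time.
data = [1, 2, 3, 4]
doubled = [x * 2 for x in data]  # same instruction, multiple data
print(doubled)  # [2, 4, 6, 8]
```

A real SIMD machine performs these multiplications simultaneously in hardware; the list comprehension only captures the programming model.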

Computer Software Generations

As computer hardware had several generations, so does computer software. According to Dale and Lewis, there are five generations of computer software: the first from 1951 to 1959, the second from 1959 to 1965, the third from 1965 to 1971, the fourth from 1971 to 1989, and the fifth from 1990 to the present (Dale & Lewis).

In the first generation, programs were written in machine language, the binary code of ones and zeroes built into the electrical circuits of a computer. Then artificial programming languages called assembly languages were created, using mnemonic codes to represent machine language instructions. A translator called an assembler read the instructions and translated them into machine language. The creators of these tools were the first “systems” programmers (Dale & Lewis).
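To make the assembler’s job concrete, here is a toy sketch in Python; the mnemonics and numeric opcodes are invented for illustration, not taken from any real machine:

```python
# Toy assembler: translate mnemonic instructions into numeric
# (opcode, operand) pairs, the way an assembler maps assembly
# language onto machine language.
OPCODES = {"LOAD": 1, "ADD": 2, "STORE": 3}

def assemble(lines):
    """Translate 'MNEMONIC operand' lines into (opcode, operand) pairs."""
    program = []
    for line in lines:
        mnemonic, operand = line.split()
        program.append((OPCODES[mnemonic], int(operand)))
    return program

print(assemble(["LOAD 7", "ADD 3", "STORE 12"]))
# [(1, 7), (2, 3), (3, 12)]
```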

In the second generation, more powerful high-level languages were developed that read more like English. FORTRAN, for example, was developed for numerical work and COBOL for business. Lisp was developed for artificial intelligence and research, and Scheme, a dialect of Lisp, was used as an introduction to programming. Each high-level language also had a translator, or compiler, to convert code from the high-level language to assembly language to machine language. Systems programmers wrote assemblers and compilers while application programmers used them to build their programs (Dale & Lewis).

In the third generation, operating systems were developed to put the computer’s resources under the computer’s own control, with utility programs for loading programs into memory and linking programs together. This system software made computers work more efficiently. This generation saw the invention of time sharing (many users at remote terminals sharing a single computer) and general-purpose applications, such as statistical packages for the social sciences written in FORTRAN. This is the generation in which more non-programmers became computer users (Dale & Lewis).

In the fourth generation, structured programming brought logic and discipline to program design, through languages such as Pascal, Modula-2, an upgraded BASIC, and C and C++, which combined high-level features with low-level statements. More powerful operating systems followed, such as UNIX, developed by AT&T for research, along with IBM’s PC DOS and Microsoft’s MS-DOS. The Apple Macintosh brought the mouse and the graphical user interface to a wide audience, with high-quality, affordable software for spreadsheets, word processing, and database management. Lotus 1-2-3 was the most commercially successful spreadsheet of the time, with WordPerfect and dBase IV gaining popularity as well (Dale & Lewis).

In the fifth generation, Microsoft came to dominate the computer industry, object-oriented programming was developed, and so was the World Wide Web. Office suites began to bundle desktop publishing software, and Java began to give C++ a run for its money. In 1990, Tim Berners-Lee, a researcher at CERN, set up the technical rules for the World Wide Web and created the Hypertext Markup Language (HTML). In 1993, Marc Andreessen and Eric Bina released Mosaic, the first widely used graphical web browser. Then came Netscape and Internet Explorer. AOL purchased Netscape in 1998, and Mozilla Firefox arrived in 2004, taking 28% of the market by 2011. Through the 2000s, social media such as Facebook, Twitter, blogs, and Wikipedia developed as part of the Web 2.0 concept. In this day and age, anyone can be a computer user without knowing how to program, and computers are embedded in almost everything, from vehicles to appliances (Dale & Lewis).

While Nell Dale and John Lewis did a wonderful job in their book, Computer Science Illuminated, I was a bit disappointed that UNIX by AT&T was included in the discussion but there was no mention of Linus Torvalds and his development of Linux in 1991, which had a major impact on open source computing and the open source community. Before AT&T was broken up, government fears of a monopoly kept it from selling computers to customers, so it licensed the source code of UNIX to universities. Berkeley, for example, took the source code and created BSD (the Berkeley Software Distribution). Linus took a similar approach, writing his own UNIX-like operating system in C that could be ported to many different machines. Today there are many variations of Linux, and most of them are free to download and install without any payment at all (Negus).

Computing as a Tool and a Discipline

Peter Denning defines the discipline of computer science as the body of knowledge and practice used by computer professionals at work. The fundamental question of computing is, “What can be efficiently automated?” Each practitioner must be skilled at algorithmic (step-by-step) thinking to solve problems, representation (storing data efficiently), programming (combining algorithmic thinking with representation), and design (building software that serves a useful purpose) (Dale & Lewis).
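As a small example of those skills working together (my example, not the book’s), binary search pairs algorithmic thinking (halve the search space at every step) with representation (keep the data sorted so halving is possible):

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if absent."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2            # algorithmic step: split in half
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1                # discard the lower half
        else:
            hi = mid - 1                # discard the upper half
    return -1

print(binary_search([2, 5, 8, 13, 21], 13))  # 3
```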

In 1989, the computer science curriculum included mathematical theory for understanding relationships, scientific experimentation for exploring models and making predictions, and engineering design for constructing systems to support work. In 2001 (and left unchanged in 2008), the curriculum was revised to include the following: discrete structures, programming fundamentals, algorithms and complexity, architecture and organization, operating systems, net-centric computing, programming languages, human-computer interaction, graphics and visual computing, intelligent systems, information management, social and professional issues, software engineering, and computational science (Dale & Lewis).

I hope you learned something from this post about the history of computing, and if you would like to learn more about computer science, check out some of my other posts on this site. The information in this post comes from notes taken while studying Computer Science Illuminated by Nell Dale and John Lewis. Some information also comes from the Linux Bible by Chris Negus and Christine Bresnahan, both of which can be purchased through the links below or in the sidebar widgets.


Thank you for reading my blog, and leave a comment below if you like what you’ve read. Also, if you would like assistance with development, design, publishing, or even tutoring for any of these services, please download and fill out the attached service request for a 25% discount off my hourly rate.


April 28, 2016