
A Short History of Electronic Computers

Computers have fascinated Kelley since he first started using them at work in the 1980s. He got his first desktop unit in 1999.

Charles Babbage's Analytical Engine

Antikythera mechanism

Atanasoff-Berry computer

The history of computers began in ancient times

The electronic, digital, programmable computer has only been around since the 1940s, yet it’s certainly changed our world in a profound way. These days, computers seem to be in just about everything. But when did people begin making them, and where will this technology lead humankind?

Please keep reading and find out!

Early Computers

The Sumerian abacus, a calculating tool used in arithmetic, was invented around 2400 B.C.E. and remained in everyday use well into the twentieth century. The Antikythera mechanism, a Greek device from about 100 B.C.E. sometimes attributed to the school of Archimedes, was used to calculate astronomical positions and may be the world's first analog computer. And slide rules, invented in the 1620s, were still being used by astronauts during the Apollo Moon missions.

Since a computer can be anything that computes, in the early decades of the 1900s computers were often people. Sitting in huge rooms, scores of people worked at identical desks and, using mechanical adding machines and plenty of pencil and paper, computed the specifications for aviation technology during World War Two.

Electronic Computers

In the late 1930s, John V. Atanasoff of Iowa State College, along with graduate student Clifford Berry, built the Atanasoff-Berry computer, now credited as the world's first electronic, digital computer. It had no CPU but could solve systems of up to 29 simultaneous linear equations.

Then, during World War Two, the electronic era of computers accelerated in the United States and the United Kingdom. The Colossus Mark II, built in the U.K. in 1944, was used to break German secret codes during the war, and Colossus is considered the world's first electronic, digital, programmable computer. It made its calculations with 2,400 vacuum tubes.

As computer technology tends to go, ENIAC, completed in 1946, soon eclipsed Colossus. ENIAC used some 18,000 vacuum tubes and was as big as some houses, and it is considered the world's first general-purpose electronic computer. Astonishingly, this marvel remained in service for nearly a decade!

Notable innovations in computerization included the Ferranti Mark 1 (1951), the first commercially produced general-purpose electronic computer. UNIVAC, delivered to the U.S. Census Bureau in 1951, was used to tabulate census data and famously predicted the 1952 presidential election. EDSAC (1949), a British computer, was among the first to run its own stored programs, using the so-called von Neumann architecture that computer designers still follow today.

Other innovations that changed computing included the invention of the transistor (1947), the integrated circuit (1958), the floppy disk (1971), the first commercial microprocessor, Intel's 4004 (1971), and the Apple I personal computer in 1976 (now a collector's item, by the way).

Commodore 64

Zuse Z3

Cray supercomputer

1998 iMac G3


Enter the Commodore 64

In the early 1980s, the age of personal computers (PCs) gained momentum. People wanted home computers and were willing to pay hundreds of dollars to buy them. Apple PCs were available, but they cost more than $1,000 apiece. Offering a cheaper alternative, a new PC was marketed in 1982. Priced at just under $600, it was called the Commodore 64 because it had 64 kilobytes of random-access memory (RAM). Believe it or not, that was a big deal in those days!

The Commodore 64 had an 8-bit microprocessor running at just over one megahertz. It also had an impressive sound and graphics package and eventually offered as many as 10,000 software titles. Commodore International sold roughly 17 million C64s, more than any other single PC model ever produced! (The model was discontinued in 1994.)

Intel 80486

Keeping in mind that a computer is no more advanced than its microprocessor or central processing unit (CPU), let’s continue:

First produced in 1989, Intel's 486 was the first microprocessor to use more than one million transistors; it eventually ran at 50 megahertz, had an on-chip SRAM cache and could execute about 40 million instructions per second. This was a monster of a microprocessor for its time! At any rate, it was a vast improvement over its predecessor, Intel's 386.

Intel Pentium Microprocessors

Since 1993, when Intel introduced the original Pentium, the company has continued producing inexpensive yet powerful microprocessors. The first Pentium used over three million transistors (more transistors generally means higher processing performance) and soon operated at around 100 megahertz. Pentium-family chips have since been used in desktop computers, laptops, servers and many other devices. Conceivably, billions of people have used these chips!

Macintosh Graphics

In 1984, Apple Inc. introduced the Macintosh, the first commercially successful PC with a graphical user interface (GUI), which uses images rather than typed commands. This computer also popularized the mouse, a pointing device that has revolutionized "picking and choosing" in the cyber world. The Macintosh soon became the industry standard for desktop publishing.

Unfortunately, the first Macintosh had little memory, no hard drive and could not be easily expanded. So it was modified over the next two years, producing the Macintosh Plus (1986), which operated at eight megahertz and cost about $2,600.

Microsoft Windows

In the late 1980s and early 1990s, PCs using the Microsoft Windows 3.0 operating system began to dominate the computer market. Windows offered features that many of us now take for granted: word processing, the text editor Notepad, a macro recorder, a paint program, a calculator, various games and many other programs.

In 1983, Microsoft Word was introduced, dramatically expanding the use of word processing at home and in business. Then, in 1985, came Microsoft Excel, a versatile commercial spreadsheet application that eventually displaced Lotus 1-2-3, the industry leader until the mid-1990s.

Word, Excel and many other Macintosh and Windows applications have made it possible for the average person to work at home using the same software used at the office. This capability has revolutionized education and productivity in the workplace and at home!

Cray Supercomputers

Cray Research, Inc. was established in 1972 to build computers used primarily for scientific work. The company's first product was the Cray-1 supercomputer (1976), the fastest computer in the world at the time, which sold for over $8 million.

Since Cray computers are very expensive, only elite companies or the governments of rich countries can afford to buy them; therefore, it is a mark of prestige to own one of these marvelous machines. Cray supercomputers produced in the present day have a quarter million processing cores and can perform quadrillions of computations per second!

Multi-Core Processors

A multi-core processor is a single processing unit containing at least two independent microprocessors, or "cores," each able to execute instructions. This configuration lets the cores run different instructions at the same time, a kind of multitasking, making the computer faster and enabling parallel processing. One drawback of parallel processing, however, is that it is more difficult to write efficient software for such complex processing.
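The idea of dividing one job among several workers can be sketched in a few lines of Python. This is purely illustrative and not tied to any machine mentioned in this article; it uses the standard-library concurrent.futures module to split a big sum into chunks, and for truly CPU-bound work one would swap in ProcessPoolExecutor so the chunks run on separate cores.

```python
# Minimal sketch of dividing one computation across several workers.
# Illustrative only: for CPU-bound Python code, ProcessPoolExecutor
# would be used so each chunk runs on its own core.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(bounds):
    """Sum the integers in [start, stop) -- one chunk of the full job."""
    start, stop = bounds
    return sum(range(start, stop))

def split_and_sum(n, workers=4):
    """Split 0..n-1 into chunks and hand each chunk to a worker."""
    step = n // workers
    chunks = [(i * step, n if i == workers - 1 else (i + 1) * step)
              for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # map() runs partial_sum on every chunk concurrently,
        # then the partial results are combined into one total.
        return sum(pool.map(partial_sum, chunks))

print(split_and_sum(1_000_000))  # same answer as sum(range(1_000_000))
```

The hard part the article alludes to is exactly this: the programmer must decide how to split the work, keep the chunks independent, and recombine the results correctly.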

Multi-Processor Systems

Some computers have more than one CPU, each handling a particular task. Such multiprocessing systems perform numerous tasks simultaneously rather than sequentially. Multiprocessing reduces cost and the time it takes to do a job; it also increases reliability, because if one CPU goes down, the others can keep working.

Computer Grids

If computers with multiple cores or multiple processors aren't enough to do the job, computers may be linked into a grid or cluster, perhaps a worldwide one, creating a kind of virtual supercomputer designed to study complex problems such as climate change, financial modeling and earthquake or tsunami simulation. The possibilities for such computing are mind-boggling indeed. Who knows what it may one day accomplish!

Computers in the Modern Era

New computers, and devices that use them to communicate, play games, watch movies or display information, seem to arrive on the market every six months or so. Some milestones of the modern era include:

2003: AMD's Athlon 64, the first 64-bit processor aimed at consumer PCs, is introduced to the market.

2007: The iPhone, one of many smartphones, includes many computer functions previously available only on desktop units.

2010: Apple introduces the iPad, advancing the market for computer tablets.

2015: Apple unveils the Apple Watch, which makes computing power available on one’s wrist.

Future of Computers

Technically, computers can be made of almost anything. Instead of electronic circuits, tennis balls could be used: the presence of a ball counts as a one and its absence as a zero. Many other types of computers are therefore theoretically possible, including optical computers, DNA computers, neural computers and quantum computers.
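The tennis-ball idea is simply binary representation: any row of two-state objects can store a number. A small Python sketch (purely illustrative, with made-up function names) makes the point:

```python
# Toy illustration of the tennis-ball idea: True means "ball present" (1),
# False means "ball absent" (0). Eight positions can store 0 through 255.
def to_balls(n, width=8):
    """Represent n as a row of ball positions, most significant first."""
    return [bool((n >> i) & 1) for i in range(width - 1, -1, -1)]

def from_balls(balls):
    """Read the number back from the row of balls."""
    value = 0
    for present in balls:
        value = (value << 1) | int(present)
    return value

row = to_balls(42)            # 42 is 00101010 in binary
assert from_balls(row) == 42  # the row faithfully stores the number
```

Nothing about the scheme depends on electronics, which is why optical, DNA or quantum systems can, in principle, compute too.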

In a quantum computer, data would be represented not by ordinary binary digits but by quantum bits, or qubits, which exploit atomic-scale properties such as spin, superposition and entanglement. And since atoms are very small, a quantum computer could be equally minute, revolutionizing miniaturization and providing invaluable insights into the growing field of nanotechnology.


As humankind continues producing more and more advanced computers, it may soon create computers that can think for themselves, utilizing what’s called artificial intelligence. Then these smart computers may one day create their own computers and perhaps their own computerized people as well. Who wouldn’t want to see that?

Please leave a comment.

© 2012 Kelley Marks


touseeq from Pakistan, bhalwal on February 20, 2019:

great article

FrancoEliass from Ciudad de Córdoba, Argentina. on April 28, 2018:

Great article, I love computers too.

Kelley Marks (author) from Sacramento, California on August 16, 2017:

Okay, thief12, I added Atanasoff's digital computer to my article. Thanks for the tip - now it's better. Later!

Kelley Marks (author) from Sacramento, California on April 21, 2013:

Thanks for the comment, Thief12. I'll have to check out Mr. Atanasoff, and perhaps I'll add his name to this article. Later!

Carlo Giovannetti from Puerto Rico on April 20, 2013:

Pretty good article, but no mention of John V. Atanasoff? He is actually the one credited with inventing the first digital computer.

Kelley Marks (author) from Sacramento, California on September 14, 2012:

Thanks for the comment, okecha solomon. Explaining something in the simplest of terms is not so simple. Later!

okecha solomon on September 13, 2012:

This is my first time to use this material. I have found it extremely good because it gives me easier way to understand and as a lecturer of basic computing, I could find nothing better . Thank you very much.

Kelley Marks (author) from Sacramento, California on May 16, 2012:

I remember when cars had push button starters. One time when I was a kid I pushed that button and the car jumped forward, scaring me. Later!

Lawrence Da-vid on May 16, 2012:

Computers used to be fascinating, however, virtually everything automotive, electronic today, uses them. I've worked with computers that took minutes, to seconds to process the simple 2 plus 2. Then we went to micro seconds, then nano and finally pico seconds....but the end result was the same.....unless the user couldn't add to verify the answer. In the short time of 70 or 60 years, we have come a long way. Haven't we?

Then again, over a half a century ago, I remember when you used a push button to start a car or pickup truck. Today, the latest invention is a push button to start the new cars....How convenient! I guess that's what progress is.....I imagine we'll go back to the abacus for computing before long. History, indeed, seems to repeat itself. By the way, I was in the building studying schematics for univac 1 trying to determine why it quit. History dictates that one of "Grace's" Waves, was behind a processor and discovered that famous "moth" that had its wings between the contacts of a relay. When the critter was removed......the machine started working. Guess what the term was called when the "bug" was removed.

Kelley Marks (author) from Sacramento, California on May 16, 2012:

Thanks for the comment, Lawrence Da-vid. I remember such computer related words as MSDOS and Cobol. I even wrote a program or two in Cobol, an elegant language, at least it was 30 years ago! Also, the first computer I used at the workplace was a Basic Four, the size of a refrigerator and very impressive for its time. Computers are fascinating, aren't they? Later!

Lawrence Da-vid on May 16, 2012:

I was fortunate to have worked with RADM lower half Grace Hopper, who, had the title of "mother" of the computer when she, myself and others worked on UNIVAC ONE. I too watched what took up the space of 3 Walmarts condensed into what fits on a table top. The "Trash 80," the "Commo 64," and a few others out there enabled the home user to compute. I also remember and have used the Frieden calculator and other antiques of the age. My abacus quit working when I was 10. I imagine you remember MSDOS and MSDOS 1 and so on. Windows followed a package developed by Microsoft called Enable. A package begged for by the Government.

Grace was famous for implementing Basic, Fortran, Cobol and a few other machine languages....in fact she could read binary.

The last huge system I became familiar was at LL Labs in California which was the CRAY super number cruncher.

Kelley Marks (author) from Sacramento, California on May 08, 2012:

Thanks for the comment, Avid Gardener. Yes, those were the early days of home computers, and they certainly weren't the good old days, were they? I still chuckle when I remember how excited people got when they realized they had 64K of RAM. OMG! It really happened! Later!

avid gardener from Florida on May 08, 2012:

I remember the c64 i also used a radio shack trs-80 in middle school and used a cassette tape recorder to load programs. To hook up to a modem you would actually take the phone and put it on this box that it fit into. Man those were the days. LoL. Nothing like Nostalgia. Nice article btw.

Kelley Marks (author) from Sacramento, California on May 08, 2012:

Thanks for the comment, DS Duby. Computers have always been awesome, ya know? Later!

DS Duby from United States, Illinois on May 07, 2012:

Great article on computer history, my past knowledge of computers is mostly limited to DOS programs from grade school in the early 80s lol. Good job, thumbs up with an awesome.
