The History of Computers and Coding

Computers are all around us. From laptops to smartphones to smartwatches, we all use computers, and some of us even know how to code. But do we all know the history of these complex inventions? Here I want to share the history of the machines that changed our world.

    Banū Mūsā brothers (850) & Al-Jazari (1200)

    Banu Musa's Autonomous Music Sequencer

    The history of computers and coding began with ideas and philosophies in the 9th Century. Among the pioneers were the Banū Mūsā brothers (meaning "Sons of Mūsā"): Abū Jaʿfar Muḥammad ibn Mūsā ibn Shākir, Abū al-Qāsim Aḥmad ibn Mūsā ibn Shākir, and Al-Ḥasan ibn Mūsā ibn Shākir, three ninth-century Persian scholars who lived and worked in Baghdad. They are known for their "Book of Ingenious Devices," which describes approximately 100 devices, with drawings and instructions on how to make and operate them. One of the devices is an autonomous music sequencer, known today as the first programmable device ever invented.



    In the 13th Century, another figure appeared: Ismail Al-Jazari, a Muslim polymath: a scholar, inventor, mechanical engineer, artisan, artist, and mathematician who served the Artuqid Dynasty of Jazira in Mesopotamia. He was part of a tradition of artisans and was thus more a practical engineer than a theoretician. He wrote the "Book of Knowledge of Ingenious Mechanical Devices," in which he documented his inventions; it was inspired by the earlier book of the Mūsā brothers. He is known as the "father of robotics," having invented numerous programmable automata and machines, and even an astronomical clock regarded as the first programmable analog computer.

    Al-Jazari's First Astronomical Clock

    Jacquard Machine (1803)

    In the 19th Century, punch cards were used in the textile industry with the Jacquard Machine. It was invented by a man named Joseph Marie Jacquard in 1803, based on earlier inventions by the Frenchmen Basile Bouchon (1725), Jean Baptiste Falcon (1728), and Jacques Vaucanson (1740). This machine is a mechanical loom that uses pasteboard cards with punched holes, each card corresponding to one row of the design. Multiple rows of holes are punched in the cards, and the many cards that compose the textile design are strung together in order. This made it possible to weave complex designs into textile products.


    Charles Babbage (1837)

    The Jacquard Machine inspired an English polymath named Charles Babbage to conceive of a programmable general-purpose computer. Babbage was a mathematician, philosopher, inventor, and mechanical engineer, and he is considered the "father of the computer" for his attempts to build a mechanical computer. He first designed his "Difference Engine," a special-purpose machine for tabulating logarithms and trigonometric functions by evaluating finite differences to create approximating polynomials. While working on the Difference Engine, he realized a much more general design, the Analytical Engine, was possible. He spent the rest of his life developing this machine but could not finish it because of funding disputes. His son, Henry Babbage, later found the parts at his late father's laboratory and continued what his father had started.

    Henry Babbage's Analytical Engine Mill (1910)

    Ada Lovelace (1843)

    Also during this time, an English mathematician and writer named Ada Lovelace saw the potential of the Analytical Engine. The Italian engineer Luigi Menabrea had written up one of Babbage's lectures as a paper in French, and Lovelace was asked to translate it into English. She augmented the translation with her own notes, spending the better part of a year on the work, assisted with input from Babbage. Her notes were labeled alphabetically from A to G. In Note G, she describes an algorithm for the Analytical Engine to compute Bernoulli numbers. It is considered the first published algorithm ever specifically tailored for implementation on a computer, and Ada Lovelace has often been cited as the first computer programmer for this reason. Unfortunately, Charles Babbage never completed the Analytical Engine, so Lovelace never tested her program. She also predicted that the theoretical computer would one day play music and chess, but never think for itself.

    Diagram for the computation of Bernoulli numbers - Ada Lovelace
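    Note G works Bernoulli numbers out step by step from a recurrence. As a rough modern sketch of the same computation (this is Python using the standard binomial-coefficient recurrence, not Lovelace's exact table of engine operations):

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0..B_n as exact fractions,
    using the recurrence sum_{k=0}^{m} C(m+1, k) * B_k = 0 for m >= 1."""
    B = [Fraction(1)]                      # B_0 = 1
    for m in range(1, n + 1):
        s = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(Fraction(-s, m + 1))      # solve the recurrence for B_m
    return B

print(bernoulli(6))  # B_1 = -1/2, B_2 = 1/6, odd indices past 1 give 0
```

    Each loop iteration uses only addition, multiplication, and division, which is exactly why the mechanical Analytical Engine could, in principle, have carried it out.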

    Herman Hollerith (1906)

    Herman Hollerith was an American businessman, inventor, and statistician who developed the tabulating machine, which used punch cards for instructions. The machine was designed to perform separate tasks without having to be reconstructed between each one. It was used to help count people for the United States Census: a count that had taken roughly eight years by hand was finished in about two years with the help of the Hollerith machine.


    The Z3 (1941)

    The Z3 was a German electromechanical computer designed by Konrad Zuse. It was the world's first working programmable, fully automatic digital computer. The Z3 was built with 2,600 relays, implementing a 22-bit word length and operating at a clock frequency of about 5–10 Hz. Its program code was stored on punched film. Zuse's original Z3 was used by the German army to compute aerodynamics, and it was destroyed by a bomb in 1943.

    ENIAC (1945)

    J. Presper Eckert and John Mauchly built the ENIAC at the University of Pennsylvania between 1943 and 1946; it would lead to the development of the UNIVAC in 1951. ENIAC was designed and primarily used to calculate artillery firing tables for the United States Army's Ballistic Research Laboratory (which later became part of the Army Research Laboratory); its first program, however, was a study of the feasibility of the thermonuclear weapon. The machine had to be manually programmed for each problem by connecting wires to plugboards. Its first programmers, a team of women, figured out the schematics and taught themselves how to do this.


    The Automatic Computing Engine (1950)

    The Automatic Computing Engine (ACE) was an early British electronic stored-program computer designed by Alan Turing. Its design led to the MOSAIC computer, the Bendix G-15, and other computers.

    UNIVAC I (1951)

    The UNIVAC I (Universal Automatic Computer I) was the first general-purpose electronic digital computer designed for business applications produced in the United States. It was designed principally by J. Presper Eckert and John Mauchly, the inventors of the ENIAC. The UNIVAC I mainframe became famous for predicting the election of President Eisenhower over Adlai Stevenson in 1952. It was also the first commercial computer to use magnetic tape for storage.

    FORTRAN (1957)

    FORTRAN (FORmula TRANslation) was originally developed at IBM by a team led by John Backus in the 1950s for scientific and engineering applications, and it came to dominate scientific computing. It has been in use for over six decades in computationally intensive areas such as numerical weather prediction, finite element analysis, computational fluid dynamics, geophysics, computational physics, crystallography, and computational chemistry. FORTRAN laid the foundation for later high-level languages such as BASIC, C, and C++.

    COBOL (1959)

    COBOL (Common Business-Oriented Language) is a compiled English-like computer programming language designed for business use. It is an imperative, procedural, and, since 2002, object-oriented language. COBOL is primarily used in business, finance, and administrative systems for companies and governments. COBOL was designed in 1959 by CODASYL and was partly based on the programming language FLOW-MATIC designed by Grace Hopper. It was created as part of a US Department of Defense effort to create a portable programming language for data processing. It was originally seen as a stopgap, but the Department of Defense promptly forced computer manufacturers to provide it, resulting in widespread adoption.

    ASCII Code (1963)

    ASCII, or the American Standard Code for Information Interchange, is a character encoding standard for electronic communication. ASCII codes represent text in computers, telecommunications equipment, and other devices. Most modern character-encoding schemes are based on ASCII, although they support many additional characters.
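    To see the idea concretely (a minimal Python sketch): each ASCII character is just a number from 0 to 127, and converting between characters and codes is a one-line operation.

```python
# ord() gives a character's ASCII code; chr() goes the other way.
message = "Hi!"
codes = [ord(c) for c in message]
print(codes)                            # [72, 105, 33]
print("".join(chr(n) for n in codes))   # Hi!
```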


    Hardware/Software Innovations 1930 - 1990

    Vacuum Tubes (the 1930s)

    The vacuum tube, or valve, is a device that controls electric current flow in a high vacuum between electrodes to which an electric potential difference has been applied. Vacuum tubes were used in televisions, radios, radar, electronic computers, and amplifiers. The first electronic computers were powered by these tubes; the ENIAC, for example, needed approximately 18,000 of them.


    Transistors (the 1950s)

    The transistor was invented in 1947 by William Shockley, John Bardeen, and Walter Brattain, who all worked at Bell Labs. They were awarded the 1956 Nobel Prize in Physics for "their research on semiconductors and their discovery of the transistor effect." The invention of the transistor led to the transistor radio, which was released during the 1950s.


    Integrated Circuits (the 1960s)

    An integrated circuit, also referred to as an IC, a chip, or a microchip, is a set of electronic circuits on one small flat piece (or "chip") of semiconductor material, usually silicon. Large numbers of tiny MOSFETs are integrated into a single chip. This results in circuits that are orders of magnitude smaller, faster, and less expensive than those constructed from discrete electronic components. We still use integrated circuits today; the only difference is that the ICs built now are far smaller and faster than before.


    The Personal Computer Revolution (the 1970s)

    After the "computer-on-a-chip" was commercialized, the cost to manufacture a computer system dropped dramatically. The arithmetic, logic, and control functions that previously occupied several costly circuit boards were now available in one integrated circuit, making it possible to produce them in high volume. This led to the Personal Computer Revolution Era. A few researchers at places such as SRI and Xerox PARC were working on computers that a single person could use, and that could be connected by fast, versatile networks: not home computers, but personal ones.


    Altair (1975)

    Popular Electronics magazine featured the Altair 8800, the "World's first minicomputer kit to rival commercial models." Paul Allen and Bill Gates wrote software for the Altair using a version of the BASIC language, and went on to form their own software company, Microsoft.

    Apple II (1976)

    Steve Jobs and Steve Wozniak started Apple Computer with the Apple I, a computer sold as a single assembled circuit board. The Apple II exploded onto the scene in 1977.


    IBM Model 5150 (1981)

    The IBM Model 5150 debuted on August 12, 1981, after a twelve-month development. Pricing started at $1,565 for a configuration with 16K of RAM, a Color Graphics Adapter, and no disk drives. The price was designed to compete with comparable machines on the market.


    Apple Lisa (1983)

    Apple's Lisa was the first commercially sold personal computer with a graphical user interface (GUI). Soon the Apple Macintosh and Microsoft Windows would adopt the GUI as their user interface, making it the new paradigm for personal computing.


    Apple Macintosh & IBM PC AT (1984)

    A year later, Apple and IBM launched the Macintosh and the IBM PC AT. The Macintosh shipped with its own GUI, and both machines attracted a growing library of applications, which expanded the market. 1984 marked the start of the PC becoming a worldwide phenomenon.


    Compaq Deskpro 386 (1986)

    The Compaq Deskpro was a line of business-oriented personal computers manufactured by Compaq and discontinued after the merger with Hewlett-Packard. Models were produced containing microprocessors from the 8086 up to the Intel Pentium 4. The Deskpro 386, released in 1986, was the first PC built around Intel's 80386 processor, which made it one of the fastest-performing PCs of its time.


    Global Acceptance (1988)

    By 1988, more and more people were becoming interested in these "personal computers." Specialized stores opened to sell them to the market. These resellers and distributors started in the US, but they soon appeared all around the world.


    The Age of the Internet (1990)

    The Internet (1990)

    By the 1990s, a new technology was opened to the public: the Internet, a technology for data distribution and communication. Specialized companies called ISPs, or Internet Service Providers, make sure that users have end-to-end connectivity to every part of the network, using physical links to connect computers from one end to another. Because of this technology, people can now communicate, buy, and sell using their computers.


    Apple iPhone (2007)

    In 2007, Steve Jobs announced the Apple iPhone, the device that popularized the smartphone around the world. Much of what is inside a personal computer is also inside the iPhone, which made it revolutionary: you are no longer stuck at your desk all day, and you can use the iPhone to send e-mails, check stocks and the weather, and even play games.


    Cryptocurrency (2009)

    Even before 2009, digital currencies were being explored by people like Nick Szabo, David Chaum, Wei Dai, and others. But it was not until 2008 that a pseudonymous programmer named Satoshi Nakamoto released a whitepaper describing Bitcoin, which gained traction and was launched in 2009. The main concept of cryptocurrency is that you can transfer money using your computer with no central authority, which makes cryptography essential. Since Bitcoin's code is open source, many people created their own cryptocurrencies with different styles of functionality; these are called "altcoins." Examples of altcoins are Ethereum (ETH), Litecoin (LTC), Ravencoin (RVN), and many others.
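    To give a feel for the cryptography involved, here is a toy "proof-of-work" sketch in Python. It is a heavily simplified illustration, not Bitcoin's actual implementation: real mining hashes a structured block header against a numeric target, while this simply searches for a nonce that makes a SHA-256 hash begin with a few zero digits.

```python
import hashlib

def mine(block_data: str, difficulty: int = 3) -> int:
    """Find a nonce so that SHA-256(block_data + nonce) starts with
    `difficulty` zero hex digits: a toy version of proof-of-work."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce   # this nonce "proves" the work was done
        nonce += 1

nonce = mine("example block")
digest = hashlib.sha256(f"example block{nonce}".encode()).hexdigest()
print(nonce, digest)  # the digest begins with three zero digits
```

    The key property is asymmetry: finding the nonce takes many hash attempts, but anyone can verify it with a single hash, which is what lets a network agree on history without a central authority.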


    Cloud Computing (the 2010s)

    Cloud computing, or "the cloud," refers to using remote computers via the Internet for things we previously did on our own machines, like storing data (Dropbox) and running web applications (Google Docs). The concept was already being explored in the 1970s, but it was not until the 2000s that it was popularized by companies like Amazon, Google, Microsoft, and many more.



    Ducky, 14 August 2021