Why we shouldn’t forget that the world’s first computers were humans

Source | LinkedIn | John Maeda | EVP Chief Experience Officer at Publicis Sapient, MBA/PhD, 6× Author, TED Speaker

The first computers were not machines, but humans who worked with numbers—a definition that goes back to 1613, when English author Richard Braithwaite described “the best arithmetician that ever breathed” as “the truest computer of times.” A few centuries later, the 1895 Century Dictionary defined “computer” as follows:

“One who computes; a reckoner; a calculator; specifically, one whose occupation is to make arithmetical calculations for mathematicians, astronomers, geodesists, etc. Also spelled computor”

At the beginning and well into the middle of the twentieth century, the word “computer” referred to a person who worked with pencil and paper. There might not have been many such human computers if the Great Depression hadn’t hit the United States. As a means to create work and stimulate the economy, the Works Progress Administration started the Mathematical Tables Project, led by mathematician Dr. Gertrude Blanch, whose objective was to employ hundreds of unskilled Americans to hand-tabulate a variety of mathematical functions over a ten-year period. These calculations were for the kinds of numbers you’d easily access today on a scientific calculator, like the natural constant e^x or the trigonometric sine value for an angle, but they were instead arranged in twenty-eight massive books used to look up the calculations as expressed in precomputed, tabular form. I excitedly purchased one of these rare volumes at an auction recently, only to find that Dr. Blanch was not listed as one of the coauthors—so if conventional computation has the problem of being invisible, I realized that human computation had its share of invisibility problems too.
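To get a feel for what those volumes encoded, here is a minimal sketch in Python (my own illustration, not from the book) of how a precomputed table of sine values can stand in for live calculation, much as the Project’s printed pages did. The one-row-per-tenth-of-a-degree layout and the linear interpolation between rows are assumptions of this sketch:

```python
import math

# A pocket "volume" of precomputed sine values, one row per tenth of a degree,
# in the spirit of the Mathematical Tables Project's hand-tabulated books.
SINE_TABLE = [math.sin(math.radians(t / 10)) for t in range(3601)]  # 0.0 to 360.0 deg

def sine_lookup(angle_deg: float) -> float:
    """Read sin(angle) out of the table, interpolating linearly between
    adjacent rows the way a user of the printed tables would."""
    tenths = angle_deg * 10
    row = int(tenths)               # index of the printed row at or below the angle
    if row >= len(SINE_TABLE) - 1:  # clamp at the table's final row
        return SINE_TABLE[-1]
    frac = tenths - row             # how far we sit between two printed rows
    return SINE_TABLE[row] + frac * (SINE_TABLE[row + 1] - SINE_TABLE[row])

print(sine_lookup(30.0))   # ~0.5, read straight from the table
print(sine_lookup(30.07))  # interpolated between the 30.0 and 30.1 degree rows
```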

Try to imagine many rooms filled with hundreds of people with a penchant for doing math, all performing calculations with pencil and paper. You can imagine how bored these people must have been from time to time, and also how they would have needed breaks to eat or use the bathroom or just go home for the evening. Remember, too, that humans make mistakes sometimes—so someone who showed up to work late after partying too much the night prior might have made a miscalculation or two that day. Put bluntly, compared with the computers we use today, the human computers were slow, at times inconsistent, and prone to occasional mistakes that the digital computer of today would never make. But until computing machines came along to replace the human computers, the world needed to make do. That’s where Dr. Alan Turing and the Turing machine came in.

How to Speak Machine: Computational Thinking for the Rest of Us

The idea for the Turing machine arose from Dr. Turing’s seminal 1936 paper “On Computable Numbers, with an Application to the Entscheidungsproblem,” which describes a way to use the two basic acts of writing and reading numbers on a long tape of paper, along with the ability to write or read from anywhere along that tape, as a means to describe a working “computing machine.” The machine would be fed a set of conditions that determined where the numbers on the tape would be written or rewritten based on what it could read—and in doing so, calculations could be performed. Although an actual computing machine could not be built with the technology available back then, Turing had invented the ideas that underlie all modern computers. He claimed that such a machine could universally enable any calculation to be performed by storing the programming codes on the processing tape itself. This is exactly how all computers work today: the memory that a computer uses to make calculations happen is also used to store the computer codes.
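To make that description concrete, here is a minimal sketch in Python of such a machine. The tape encoding, state names, and rule table below are illustrative assumptions of mine, not details from Turing’s paper; the program reads a cell, consults a rule table, rewrites the cell, and moves, exactly as the paragraph describes. In this case it adds one to a binary number:

```python
# A tiny Turing machine simulator. The tape is a dict from position to symbol,
# so the head can read or write anywhere along an unbounded "paper tape".
def run(tape, rules, state="start", head=0, blank=" "):
    while state != "halt":
        symbol = tape.get(head, blank)               # read the cell under the head
        write, move, state = rules[(state, symbol)]  # consult the rule table
        tape[head] = write                           # write or rewrite the cell
        head += 1 if move == "R" else -1             # step right or left
    return tape

# Rule table for adding one to a binary number (an illustrative program):
# walk right past the last digit, then carry leftward.
rules = {
    ("start", "0"): ("0", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", " "): (" ", "L", "carry"),  # ran off the right end; begin the carry
    ("carry", "1"): ("0", "L", "carry"),  # 1 plus carry is 0; the carry moves on
    ("carry", "0"): ("1", "L", "halt"),   # 0 plus carry is 1; done
    ("carry", " "): ("1", "L", "halt"),   # carried past the left end: a new digit
}

tape = {i: bit for i, bit in enumerate("1011")}  # eleven, in binary
result = run(tape, rules)
print("".join(result[i] for i in sorted(result)).strip())  # prints 1100 (twelve)
```

Note that in this sketch the rule table lives outside the tape; Turing’s universal machine goes one step further and encodes the rule table on the tape itself, which is precisely the stored-program idea the paragraph above ends with.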
