InfoArchive

The computer stands at the pinnacle of a long line of inventions on how we communicate an idea from one mind to another. The ability to share thoughts and ideas—beyond verbal communication—can be traced back to prehistoric cave paintings. After all, what is art, if not the act of sharing information visually?

Throughout thousands of years, we shared knowledge and information through hieroglyphs, pictograms, and cuneiform script, all the way to the modern alphabet, books, the printing press, Morse code, the telegraph, the telephone, fax machines, and, finally, the computer.

With every invention, we increased the speed at which information travels. A hand-written papyrus once traveled by foot for weeks; printed books were shipped by sea for weeks; today, we can send the entire works of Shakespeare from LA to Japan in less than a second.

I’m sharing all this because I think it’s worth defining what a computer does in the context of information. With the computer, we can break down any thought, idea, or work of art into a pattern of electrical signals, store it, and send it across the globe at the speed of light.

A unique invention stands at the heart of this pattern, the fundamental particle of information: the bit. Physically, a bit is an electrical signal held in place by tiny switches called transistors, and 8 bits make 1 byte. To get a sense of scale, Apple’s M1 Max chip packs 57 billion transistors.

This electrical signal is stored at a higher or lower voltage, which results in a binary sequence, 0 for lower voltage and 1 for higher voltage. With this two-state bit, we can represent just about anything. Take the simplest of text messages:

Hello is made of 5 individual characters. It’s been agreed, per the American Standard Code for Information Interchange (ASCII), that each character maps to a decimal number: H is 72, e is 101, l is 108, and o is 111.
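If you want to see the mapping for yourself, a few lines of Python (my sketch, not part of the standard) will print each character’s ASCII code:

```python
# Map each character of "Hello" to its ASCII decimal code.
for ch in "Hello":
    print(ch, "->", ord(ch))
# H -> 72, e -> 101, l -> 108, l -> 108, o -> 111
```

Note that the two l’s map to the same number; the code depends only on the character, not its position.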

Whereas a letter standing for a sound is ambiguous to represent, a number is more concrete. A number is a unit, and units can be counted and expressed in bits.

Take a single bit. If the current coming in is low, we can equate that to a binary 0. Increase the electrical charge so that the transistor can detect it, and you get a binary 1. With a single bit, we can represent the decimal numbers 0 and 1. What if we stick two bits together? We double the number of values we can represent: two bits give four patterns, the decimal numbers 0 through 3.
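A quick Python sketch enumerates all four two-bit patterns and the decimal value each one encodes:

```python
# Every pattern of two bits, and the decimal value it encodes.
# The left bit is worth 2, the right bit is worth 1.
for b1 in (0, 1):
    for b0 in (0, 1):
        value = b1 * 2 + b0
        print(f"{b1}{b0} -> {value}")
# 00 -> 0, 01 -> 1, 10 -> 2, 11 -> 3
```

Each extra bit doubles the count again, which is why capacities in computing grow as powers of two.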

Now, let’s scale things up to 8 bits.

Well, 128 + 64 + 32 + 16 + 8 + 4 + 2 + 1 = 255. With 1 byte (8 bits) we can represent 256 decimal numbers, from 0 to 255. That comfortably covers 72, the H in Hello.
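Here is that place-value arithmetic as a Python sketch, using 72 (the H in Hello) as the example byte:

```python
# Each of the 8 bit positions carries a power-of-two weight.
weights = [128, 64, 32, 16, 8, 4, 2, 1]
print(sum(weights))  # 255, the largest value a single byte can hold

# 72 written as a byte: only the 64 and 8 positions are switched on.
bits = format(72, "08b")
print(bits)  # 01001000

# Adding up the weights of the "on" bits recovers the decimal value.
print(sum(w for w, b in zip(weights, bits) if b == "1"))  # 72
```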

Scale that up to the entire word, and you have the bit pattern of zeros and ones for Hello. Thus we can send the text Hello from LA to Japan in electrical form at the speed of light: 186,000 miles per second.
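The full round trip, text to bits and back, fits in a few lines of Python (a sketch of the idea, not of any real wire protocol):

```python
# Encode "Hello" as the bit pattern that would travel down the wire.
message = "Hello"
bitstream = "".join(format(ord(ch), "08b") for ch in message)
print(bitstream)       # 40 bits: 8 per character

# Decode it back: slice the stream into bytes and reverse the mapping.
decoded = "".join(
    chr(int(bitstream[i:i + 8], 2))
    for i in range(0, len(bitstream), 8)
)
print(decoded)         # Hello
```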

Now, why have we gone to such lengths with the mechanics of a single word? Because this is the essence of the computer, a vast ocean of bits that can flip between 0 and 1 billions of times per second.

A text message, a photo, a book, a song: everything in a computer is built out of patterns of bits, of electricity. Images on screen are constellations of colored pixels, each mixing amounts of red, green, and blue that range from 0 to 255, and each of those amounts is itself a pattern of 0s and 1s.
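A single pixel works exactly like the text example: three bytes, one per color channel. A small Python sketch (the orange color here is just an arbitrary illustration):

```python
# One pixel: three bytes, one each for red, green, and blue (0-255).
r, g, b = 255, 165, 0  # an orange, chosen arbitrarily for illustration
pixel_bits = "".join(format(channel, "08b") for channel in (r, g, b))
print(pixel_bits)        # 24 bits per pixel
print(len(pixel_bits))   # 24
```

Multiply that by the millions of pixels on a modern display and you get a feel for the "vast ocean of bits" behind a single image.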

Everything in our digital life arises from long and fast-changing patterns of electricity, and the bit is such a fundamental unit that some believe it stands at the root of our ability to understand the universe.

John Archibald Wheeler, the last surviving collaborator of both Einstein and Bohr, put this manifesto in oracular monosyllables: “It from Bit”. Information gives rise to “every it—every particle, every field of force, even the spacetime continuum itself.”
This is another way of fathoming the paradox of the observer: that the outcome of an experiment is affected, or even determined, when it is observed. Not only is the observer observing, she is asking questions and making statements that must ultimately be expressed in discrete bits.
“What we call reality,” Wheeler wrote coyly, “arises in the last analysis from the posing of yes-no questions.” He added: “All things physical are information-theoretic in origin, and this is a participatory universe.” The whole universe is thus seen as a computer—a cosmic information-processing machine.

— The Information, James Gleick