The Three Big Ideas In a Programming Language

Everything is data: from the collection of characters in a name, to the digits in a phone number, all the way to images, videos, websites, and video calls on Zoom.

The ability to express data is, therefore, the first big idea in a programming language. A primitive expression, as we call it, can be as simple as a number, a word, a decimal like pi, or even something as elementary as a truth or a falsehood. What makes something an expression is that it evaluates—or adds up—to some final thing the computer can store in memory. Primitive expressions, such as the ones below, simply evaluate to themselves. That's what makes them primitive.
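For a concrete sketch (in Python here, though the idea is the same in any language), this is what a few primitive expressions look like at an interactive prompt. Each one evaluates to itself:

```python
>>> 33          # a number
33
>>> "Frank"     # a word, a string of characters
'Frank'
>>> 3.14159     # a decimal, like pi
3.14159
>>> True        # something as elementary as a truth
True
```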

But what do we mean by evaluate? Remember, a computer records data by imprinting electrical signals onto a physical piece of memory. Take, for example, the number 33 that we express using the decimal number system.

To store it, the computer will make electrical marks in memory slots. In other words, the computer has to evaluate the number to its binary equivalent, 00100001.
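You can ask for that translation explicitly. A quick sketch, again in Python:

```python
>>> format(33, "08b")   # the decimal number 33, written as eight binary digits
'00100001'
>>> 0b00100001          # and back again: the binary pattern evaluates to 33
33
```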

The same applies to something a little more sophisticated like a word. The name Frank, for example, is a collection of letters such as F, r, and a, which are expressed at a lower level by the character codes 70, 114, and 97 (46, 72, and 61 in hexadecimal), which in turn boil down to 01000110... you get the point. We express what the computer can evaluate.
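Sketched once more in Python, ord returns a character's code and format exposes the underlying bits:

```python
>>> [ord(c) for c in "Fra"]        # character codes for F, r, and a
[70, 114, 97]
>>> [hex(ord(c)) for c in "Fra"]   # the same codes in hexadecimal
['0x46', '0x72', '0x61']
>>> format(ord("F"), "08b")        # the raw bits behind the letter F
'01000110'
```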

Expressions can also be combined; this is the second big idea. Take the simple additions below. We use the plus symbol, or the add word, its equivalent in function notation, to combine two primitive expressions. The numbers 3 and 2 no longer evaluate to themselves individually but to 5, their sum. Not only have we combined two primitive expressions, but we have also given the computer our first instruction: add two numbers together and evaluate the result.
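In Python, for instance, the infix form uses the + symbol, and the function form is available as add in the standard operator module (a sketch; other languages spell it differently):

```python
>>> 3 + 2                   # infix notation: the + symbol combines two expressions
5
>>> from operator import add
>>> add(3, 2)               # function notation: the same combination, called by name
5
```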

It all seems simple and straightforward, right? Well, not so fast. There's something here that we're taking for granted. Adding two numbers together with the + symbol might seem trivial to us humans, but when you consider that a computer is just electricity, how does it know what to do with the symbol + or the word add?

Somewhere in your computer, a programmer left a series of instructions on how to add numbers together in binary and then return the result in decimal notation. These instructions are packaged in a "little container" with the name +. That way, we get to type + without having to think about how to add two numbers together using nothing but electricity.
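In Python you can even peek at that container: the + symbol is shorthand for calling a bundle of instructions that has a name of its own.

```python
>>> (3).__add__(2)   # what 3 + 2 calls behind the scenes
5
```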

This is the third and most powerful idea in a programming language: the ability to abstract; to make large collections of instructions available simply by their name. Just as, when we drive a car, mechanisms integral to our driving have been abstracted away into icons on our dashboard or pedals under our feet, so it is with computers and code.

Previously complex collections of instructions are made primitive to us in a "higher level" programming language. We get to stand on the shoulders (and code) of others and use their building blocks to make our own. We can, for example, combine a series of numbers and operators to recreate the process of converting temperatures from Celsius to Fahrenheit. We can do so using infix notation:
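Sketched in Python (100 degrees Celsius is the boiling point of water, so we expect 212 Fahrenheit):

```python
>>> 100 * 9 / 5 + 32   # 100 degrees Celsius, converted to Fahrenheit
212.0
```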

Or using function notation—slightly more complicated at first, but just the same as the infix expression above:
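Here the operator module's add, mul, and truediv stand in for the symbols (a Python sketch; the names vary by language):

```python
>>> from operator import add, mul, truediv
>>> add(truediv(mul(100, 9), 5), 32)   # the same conversion as nested function calls
212.0
```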

Finally, just like the programmers who created the instructions in the + symbol, we now get to abstract our instructions and give them a name of our own:
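For example, in Python (the name celsius_to_fahrenheit is ours to invent):

```python
>>> def celsius_to_fahrenheit(c):
...     return c * 9 / 5 + 32          # the instructions, packaged under a name
...
>>> celsius_to_fahrenheit(100)         # now we just use the name
212.0
```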

Only to arrive at where we started.

In short, a programming language is a series of symbols, keywords, and "glyphs", each standing for instructions at a lower level, which in turn stand for ever deeper collections of instructions until we're left with nothing but literal electricity.