A major driver for a lot of the things I have been thinking about of late comes from my participation in a major, popular Q&A site, where people ask and answer questions. There are some amazing people there. I have read answers by such people as Alan Kay and even Lee Felsenstein, and it’s mind blowing to think that I have a way to reach out to ask them questions if I so desired. I am so grateful for the time and perspective that they offer.
On the other side, you see questions. A lot of questions. And, not unexpectedly, a lot of questions that expose the vast differences in the way people think about things. It has been a real eye opener. Some of the wildest thoughts I have had have been in response to what might be considered “stupid” questions, as sometimes when you poke at the obvious, you uncover your own surprisingly myopic biases.
One such topic is the very nature of computer programming itself. And I arrived at something that surprised me when I first thought of it, but it fits in so perfectly now with how I view writing software that I have held onto it.
At a high level, this is what I consider computer programming to be:
Computer programming is the expression of human thought in a form that can be executed in a computer.
Now, that might come off as a bit pretentious at first… “Human thought.” But if you think about it, we express thought in many ways, every single day. Saying “It’s a lovely day” to someone is expressing a thought. Someone writing a book is expressing thoughts. A mathematical formula is the expression of thoughts. An architectural drawing is an expression of human thought, as well as what someday may manifest as a real building. A music score is an expression of thought, as well as of music. Giving someone the middle finger is the expression of a thought. I am expressing my thoughts right now writing this.
What differentiates computer programming from these other ways of expression, though, is key: you can then dynamically explore those ideas by bringing them to life within the innards of a computer and seeing what happens.
There is so much that happens implicitly when writing computer code, and many times the fact that there is meaning gets lost in the mix. However, I am hard pressed to think of a case where, for example, a value being operated on in a computer program doesn’t have some kind of meaning. Even a number in a register way down in the machine code being executed by a CPU means something. It might be the age of a person or an alphabetic character in a string or an index into an array or “is the switch connected to the GPIO pin on this Raspberry Pi pressed” or even (and this is where computers get magically self-referential) instructions for the computer itself, being acted on by other instructions. Underneath it all, they are all numbers, but we interpret those numbers differently depending on context, and context gives meaning.
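To make that concrete, here is a tiny sketch in Python - the values and interpretations are invented purely for illustration - of the same raw bits being read four different ways:

```python
# The same raw bits, read four different ways. The interpretations below are
# made up; the point is that the meaning comes from context, not the bits.
raw = 0b01000001          # just bits: decimal 65

as_age = raw                           # "this person is 65 years old"
as_character = chr(raw)                # 'A' - a character in a string
as_index = ["red", "green"][raw % 2]   # an index into an array (wrapped to fit)
as_switch = bool(raw & 0b00000001)     # "is the switch on the GPIO pin pressed?"

print(as_age, as_character, as_index, as_switch)   # 65 A green True
```

Nothing about the bits themselves changes from line to line; only the context we bring to them does.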
[A brief digression: even the statement “everything in a computer is a number” is a slight misstating of the situation. Underneath those numbers are digital signals, which are simply the movement of electrons through various semiconducting materials. So even what seems like the baseline of everything being a number is already the assignment of meaning (numeric value) to what is something else (electronic manipulation). Even at that level, we have built a layer of meaning on top of what the hardware is actually doing. That in itself is really interesting, as the computer doesn’t really know about meaning itself - it’s just manipulating the flow of electrons. But looked at from above, that flow - and how it’s manipulated - has meaning.]
I have seen many people asking if you have to be good at mathematics in order to write software. And my answer would be, “It depends on what kind of software you want to write.” It certainly helps to be comfortable with numbers, generally. You will encounter them so often, in so many different guises, that if they cause you panic or nausea, then you’re probably not going to enjoy your time writing software. And if you’re writing a 3D graphics engine, you’re going to be using a lot more intensive mathematics than you will if you’re working on, say, a website where artists can create online galleries by uploading images and giving them names.
It can be hard to quantify (no pun intended), but I would say that the vast majority of code I’m writing these days utilizes only the most basic forms of actual number manipulation: incrementing and decrementing numbers, checking ranges, things like that. And, of course, the movement of values around inside the software system, from one place to another.
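As a rough, made-up illustration of what I mean by “basic” (the gallery scenario here is hypothetical), most of that code looks more like this than like anything from a mathematics textbook:

```python
# A made-up gallery example: the "mathematics" here is counting, checking a
# range, and moving a value from one place to another - and that's about it.
def add_to_gallery(gallery, title, max_images=20):
    if len(gallery) >= max_images:   # checking a range
        return False                 # the gallery is full
    gallery.append(title)            # moving a value into its new home
    return True

gallery = []
for n in range(1, 4):                # incrementing a counter as we go
    add_to_gallery(gallery, f"Sunset {n}")
print(gallery)                       # ['Sunset 1', 'Sunset 2', 'Sunset 3']
```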
A far greater amount of the code I write is dealing with things at a higher level: structures, strings, processes, flow, and even more abstract types that have meaning that is highly specific to what I’m doing. They are the embodiments of ideas, ideas I can then manipulate, within the limits of what the computer - and my ability to express them - allows.
You can code anything you can think of, as long as you can work out how to express it in code. Some examples:
The attributes of an enemy in a video game
Roads and buildings in a virtual world
Address book data
Library catalogues, with information about books, their authors, their ISBNs, and which branches currently have them available
The temperature read from an external sensor
Climate models
How many times I tap the screen in one of those mobile phone games
The coordinates where I have tapped on a screen
The model of the mobile phone
How much battery is left
The temperature flow along a plate (I wrote code for this in Pascal in college)
The layout of text and images on a page
The flow of air across the wings of a model plane in a virtual wind tunnel
Where all the delivery trucks for a distributor are currently located
Where the cars are along the tracks of a rollercoaster
The weights in a deep learning neural net model
An interface that offers APIs for a particular cloud-based service
And on and on. Those were just random examples that came to me. The list is endless, limited only by our own imaginations. And it’s all writing software. You never know what you could be called upon to express in software.
And that’s not even mentioning non-data value aspects, like relationships among data, transformations of information, rules that control flow based on decisions being made, object hierarchies, and so much more.
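Here is a small, hypothetical sketch of what that looks like in practice, borrowing the video game enemy from the list above (all the names and numbers are invented): a structure that embodies an idea, a relationship to other data, and a rule that makes a decision.

```python
# A hypothetical game enemy: the structure embodies an idea, the loot list is a
# relationship to other data, and take_hit() is a rule that makes a decision.
from dataclasses import dataclass, field

@dataclass
class Enemy:
    name: str
    health: int
    loot: list = field(default_factory=list)

    def take_hit(self, damage):
        self.health -= damage              # a little arithmetic...
        if self.health <= 0:               # ...but mostly a decision
            return f"{self.name} is defeated, dropping {self.loot}"
        return f"{self.name} has {self.health} health left"

goblin = Enemy("Goblin", health=10, loot=["rusty key"])
print(goblin.take_hit(4))    # Goblin has 6 health left
print(goblin.take_hit(7))    # Goblin is defeated, dropping ['rusty key']
```

There is a little arithmetic in there, but the interesting part is the idea being modeled, not the numbers.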
So when I’m thinking about writing code, I’m not thinking about if’s or while loops. I’m thinking about things, and what I can do to those things, and how information is transformed, and how information flows, and what kinds of things will accomplish what I’m trying to write the code for.
I’m sure that all seems a bit vague, whether you’ve written software or not. Perhaps if you have written software, you’re getting at least some sense of what I mean. But at the end of the day, regardless of how experienced you are, it can still be a bit… squishy.
Writing code happens first in your head and then as an expression of what happened in your head in a way that the computer can execute. And then you can see whether you actually expressed what you thought you were expressing and, very often, once it’s expressed and run, whether the way you were thinking about things was valid to begin with. It can be this intense feedback loop, where you try out what’s inside your head inside the computer instead. And then when it doesn’t work, you either have to express it better or realize that what you expressed wasn’t quite right, so you change how you express it until you’re reasonably convinced that you’re at least replicating in code what was in your head. And then it still doesn’t work, so you go back and make changes to how you think and start the process over again, with revised thoughts fed in.
Programming computers keeps you honest and humble. You have to be able to admit you were wrong, because you will be wrong a lot. At some point, being wrong ceases to be a stigma - it’s a step toward better understanding, better thinking. Every programmer I have ever worked with - and I mean every single one - has gotten things wrong, not only once in a while but probably every single day, every time they sit down in front of that keyboard. A compiler error is annoying… but you just fix it and move on. A logic error can be challenging to track down, but it can also be rewarding in the end. Debugging code can be an exercise in exploring faulty thinking - the computer is only doing what you told it to. And in the end, when you work out what is going on, you have not only fixed the code, but you have a new understanding of the code and how you look at what you’re doing.
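Here is a made-up example of the kind of logic error I mean (in Python): it runs happily, it just doesn’t say what I was thinking.

```python
# A made-up logic error: the computer does exactly what I wrote, which is
# not what I meant. I wanted taps between low and high, *inclusive*.
def count_taps_in_range(taps, low, high):
    return sum(1 for t in taps if t in range(low, high))   # range() stops at high - 1

print(count_taps_in_range([1, 5, 10], 1, 10))   # prints 2; I expected 3

# Once I see where my expression (or my thinking) was off, the fix is easy:
def count_taps_in_range_fixed(taps, low, high):
    return sum(1 for t in taps if low <= t <= high)

print(count_taps_in_range_fixed([1, 5, 10], 1, 10))   # 3 - the code now matches the thought
```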
It’s humbling and exciting at once.
It’s not like “writing a recipe”, once you get beyond incredibly simplistic scenarios. It’s more about exploring how you can get what’s in your head into the computer, how you can describe what you see and imagine to both the computer and maybe others who look at your code, and how it all acts once you tell it to run… And then you go back, in this almost personal feedback loop, to make it as close to what you were thinking as you can.
And then you run it, and it works, and it all looks so easy to someone looking over your shoulder after the fact. But then I suppose working magic does make things look easy, even if the process itself can actually be hard at times.