by Nora Miller
I took my first programming class in the summer of 1976, mostly to get out of working between semesters. Summers in Vermont are short and rewarding, but this one changed my life.
The professor for the class, Phil Hooper, was a math PhD who had been programming since the late 50s. Unlike most science teachers I’d met, Doctor Hooper seemed genuinely thrilled to share what he knew and clearly had no doubt that every one of us could learn this stuff. He started out talking about algorithms and logic. Before we ever touched a keyboard, we learned about computer design and commands, sequencing, accumulators, buffers, stacks, and so on. We wrote “programs” in plain language, on paper, and then simulated how the computer would interpret the code using his “3B” computer—bins, balls, and beans. Step 1—take a bean from the input bin, step 2—put the bean in the accumulator bin, step 3—check to see if there are more beans to count. Just like that, we had executed a machine command! With a new appreciation of how computers “think,” we filed into the computer room and met The Machine, a DEC PDP-8 with 32k of memory.
We would be using this PDP-8 to learn the BASIC programming language. Before each class, the professor would load a 20 MB removable hard disk containing the BASIC operating system into a drawer on the front of the closet-sized computer. He would then toggle several switches, press buttons, and the computer would whir to life in 4-user, time-share mode. Time-share meant that the 32k of memory was divvied up into four 8k chunks, one for each of the four teletype machines in the computer room. This was after punch cards but before screen-based terminals. We typed our code into the teletype and the programs slowly scrolled up on the paper output. We experimented with simple statements like “Print ‘Hello’” and “10 Input A; 20 Print A; 30 GOTO 10”. We learned about the Do command and If-Then tests, and how to avoid infinite loops. In BASIC, each line of code was acted on as soon as the Return key was pressed, providing instant feedback that left us both reassured and frustrated: reassured when the computer simply accepted the command, frustrated when it did not. While the error messages we got were a little more meaningful than the sci-fi standard, “Does Not Compute,” they were no more helpful. But oh, the joy when the right thing happened!
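That second snippet, typed in exactly as shown, is itself an infinite loop; learning to spot that before pressing Return was the whole point. A minimal sketch of the kind of fix we practiced (a reconstruction, not a listing from the class, and the exact syntax varied by dialect) adds one If-Then test so the user can end the loop by entering zero:

    10 INPUT A
    20 IF A = 0 THEN 50
    30 PRINT A
    40 GOTO 10
    50 END

Trivial as it looks now, that single test on line 20 was the difference between a loop that ends politely and one that scrolls paper until someone intervenes.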
At home after class, I would sit with a blank piece of paper and think. Each assignment challenged my sense of myself as a logical, competent person, and together they transformed my way of thinking about almost everything. Phil’s confidence in me proved galvanizing. The first few tasks were easy, in retrospect: have the computer ask your name and print out “Hello <name>!” or have it accept two numbers, multiply them, and print the result. All we had to do was learn the required BASIC commands and apply them correctly. This was easy!
Then we ran smack into logic. The task: write a program that could tell if a number was odd or even. A child can do this. But the computer was unlike any child I knew. I began to see that programming a computer was much harder than telling a 10-year-old how to, say, put away dishes. It was much more like explaining the chore to an alien from outer space. If you want to say “take the clean glass from the sink and put it in the cupboard,” you first have to define what qualifies as a glass: does a mug count? How about a plastic “glass”? What counts as “clean”? Once you nail these little aspects down, you face a similar challenge to define a sink and a cupboard, explain how to open a door, and describe the best way to pick up and put down a glass. And that’s just the beginning. Every step has to be explicitly defined. The computer is not an omniscient problem solver. It is Spock without his human half: irritatingly literal-minded and lacking any flexibility or imagination. I had to find the precise wording to direct this stupid genius to my goal without misunderstandings or mutiny.
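The finished program, once the logic finally clicked, needed only a handful of lines. A sketch in the line-numbered BASIC of the day (again a reconstruction rather than an actual listing, using integer division to test for a remainder, a common trick in dialects without a MOD operator) might look like this:

    10 PRINT "TYPE A WHOLE NUMBER"
    20 INPUT N
    30 IF INT(N/2)*2 = N THEN 60
    40 PRINT N; "IS ODD"
    50 GOTO 70
    60 PRINT N; "IS EVEN"
    70 END

Of course, even this little program quietly assumes the user will type a number at all, an assumption that, as I was about to find out, would not survive contact with the professor.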
When I thought I had a working program to show Phil, he would run it the “right” way once—if the program asked for a number, he would enter one and see the results and nod—and then he’d run it again and do something silly, like enter a letter, or hit Enter without typing anything. Of course, that tiny little electronic brain acted like a two-year-old, refusing to respond or responding with some bizarre message like “Error 3”. With a rueful but slightly sly smile, the prof would stand up and head off to break somebody else’s heart.
My head ached for days after each class. But that was because my brain was on fire. I stared into space, straining to visualize the steps of today’s task, how to specify variables, assemble them into logical groups, and manipulate them to the conclusion while avoiding blind alleys. Since this was before the introduction of personal home computers, I could only write my programs on paper. Without immediate, interactive feedback, I had to think algorithmically, to write my code as precisely as I could and see if I could predict how the machine would actually execute it. Once I caught the scent of a solution, I itched to get into that cool computer room to try out my carefully planned instructions. As the weeks flew by, I gained confidence in my grasp of concepts, only to be pushed back into uncertainty by the next assignment. It was one of the most stringent and challenging and mind-bending and joyful learning experiences of my life, and I trace many of my subsequent career successes back to this one class.
My first two years of college had exposed me to new ideas and new perspectives, and I was learning and synthesizing new principles for how to see and think about the world. This apparently primed me to readily integrate Phil’s instruction on programming into my growing skill set. I embraced the logic in systems and processes, both digital and physical, and found examples everywhere: in cooking (a recipe is a program for processing food), in personal interactions (we receive information, process it based on our own programming, and output a result), and in the use of resources (when you run out of memory, your program stops, and when you run out of oil, your factories stop). These skills have since proven invaluable in myriad situations. My mantra was that deliberate steps produce reliable results. I started evaluating life experiences as if-then examples—if I take this shortcut, then I fail to prevent that mistake; if I make this bargain, then I stumble into that trap. Computing took a central role in my life, and my new world view transformed everything I did from then on.
When I started that summer class, I had no idea where programming would take me. By September, I knew I wanted to spend my life working with these monstrous, child-like “brains” that toss off mathematical feats in microseconds but stupidly refuse to recognize misspelled commands. I loved having to be precise in my thinking, to anticipate the machine’s responses, to second-guess where the processor would fall into the holes in my logic.
I also kind of loved how other people looked at me when I told them I was a computer programmer: with a mix of awe and horror and curiosity and fear. It clearly conditioned all their subsequent interactions with me, and not always for the better. I still get that reaction, even today. People who’ve used computers all their lives still get round-eyed and uneasy when I say I used to write code and do tech support. Even a whole generation of “digital natives” who teethed on iPads hasn’t integrated tech into our daily lives in quite the way a lot of us had envisioned. Many people still see the engineers, mathematicians, and programmers who build the machines and create the apps as a breed apart, brilliant and creepy at the same time.
Love the app, fear the app maker.
There’s an old computer room joke about how there are 10 kinds of people: those who understand binary and those who don’t. I don’t know if this bias existed before tech, but even if it did, the rapid bloom of technology in the past 50 years has thoroughly entrenched the assumption that technical people are fundamentally different from the rest of humankind.
I think this is a major source of the obstacles many people encounter when they want to explore their interest in and aptitude for technology. We get the message that we either qualify or we don’t.
But it’s not either/or. It’s not some inherent lack that prevents a person from engaging in a technical life. It’s the pervasive and unrealistic insistence that some people can achieve logical thinking and some cannot. Each of us encompasses whole spectrums of skills. We can be poetical about one thing and mute on another. We can work out the answer to one kind of problem and stall out on another. Each of us has the drive to piece together the causes and effects we see around us. What we achieve with that drive is often determined by how we are viewed by others. Are we welcomed and encouraged, or does someone slam the door shut because we don’t belong inside?
If I could change one thing about computing, I would make it more like learning language. Babies learn language simply because they are continuously surrounded by people using words with particular sounds in particular ways, with particular grammar and tones, while doing the things of living. They learn to speak because that is what people do in life—speaking connects us with the larger family of our species. Adults love to share language with babies and provide loving feedback and gentle correction, so the child feels completely connected and completely accepted by other speakers of their language.
Imagine if we grounded our tech society the same way, creating places where people use logic as a part of their daily life, where they think algorithmically, with precision, and map their actions deliberately. Those who wish to participate in that society would simply absorb these habits the way babies absorb grammar and vocabulary, by watching, listening, experimenting, and receiving feedback in the form of two-way communication, correction, and approval. They would grow to feel connected to the larger family of technical people, and be welcomed, as my professor welcomed me, without hesitation and without prejudice. Is it possible?
Acknowledgment: I give much credit for the quality of the article to my son, Nathan, who joined me and his father in the tech world well before the age of 10. His thoughtful and insightful comments on earlier versions of this article not only improved the readability but helped sharpen and focus the two main points of the story: that learning to program can greatly improve one’s worldview, and that we have the means to make coding an exciting, welcoming, and inclusive adventure, if we choose to do so as a community.
Nora Miller got into math to avoid chem labs. With a math degree in hand and a talent for FORTRAN, she embarked on what turned out to be a 25-year career in programming and IT management. After taking early retirement from tech, she now freelances as an editor for scientific writers and publishers.