Since setting up my first Raspberry Pi, I’ve become interested in running Linux on other platforms.

Over the holiday weekend I found my father’s old Acer Windows 7 laptop and pulled all the photos and other data files off it, preparatory to wiping it completely. The last time I saw it running was shortly before he died in 2015, and it was very slow and, I suspected, infested with viruses. There is a lot of crap out there that specifically preys on old people, getting them to download “PC cleaner” and similar utilities that promise to tackle malware but in fact load more malware of their own. My father’s laptop was just about unusable.

When I got it running again I found the hard drive was nearly full as well. I didn’t have any way of quickly figuring out what was taking up all the space, but I suspected a bad Windows setting. I’d run into this myself recently, with my work PC’s drive constantly filling up because of some wonky system restore or backup setting (I’ve forgotten which one now, even though this was only a week or so ago). I believe (but do not know) that a nearly full hard drive can hurt performance, since it leaves the system little room for the page file that backs virtual memory, or for applications’ temporary files.
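
Ironically, on the Linux systems I’m learning now, answering “what’s filling this disk?” is a CLI one-liner. A minimal sketch using GNU du and sort (the starting path is just an example; point it at wherever you suspect the bloat):

```bash
# Show the 20 biggest directories under /, staying on one filesystem (-x);
# -h prints human-readable sizes, and sort -rh sorts those sizes descending.
sudo du -xh / 2>/dev/null | sort -rh | head -n 20
```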

I was nervous about booting up the machine and getting files off it, since I didn’t want to infect my own PCs. But I reasoned that whatever malware was on the laptop had to be at least four years old, so by running an up-to-date virus scanner on it I should be able to confidently clean it up before copying files off it. I was also concerned about connecting it to my home network, in case there were any worms installed on it; but worms are actually kind of rare, and any I did encounter would be old and easily detected. So I booted it up, connected it to the internet via our WiFi, and downloaded and ran Malwarebytes. I was surprised Malwarebytes didn’t find any serious viruses, just a lot of PUPs (potentially unwanted programs), 450 of them.

I wasn’t very worried about the hard drive space issue, because all I wanted to do was get the photos and stuff off the machine before I wiped the drive and loaded Ubuntu. Still, it took several hours just to get to that point; the laptop was that messed up, and the scan took that long.

While the laptop was chugging away with the Malwarebytes scan, I used my PC to download Ubuntu and install it onto a bootable USB thumb drive. Then, once I got what I needed off the laptop, I booted it from the thumb drive and installed Ubuntu, wiping Windows and everything else on the hard drive in the process. When I rebooted, Ubuntu came right up; there was basically nothing else I had to do. The laptop had a USB wireless mouse attached that didn’t work, but I don’t yet know whether that’s because Ubuntu didn’t recognize the device or because the mouse batteries were dead. I’ll look into that later.
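
For the record I made the thumb drive with a Windows utility, but the same job can be done from an existing Linux box at the command line. A rough sketch using GNU dd (the ISO filename and /dev/sdX are placeholders; double-check the device with lsblk first, because dd will happily overwrite the wrong disk):

```bash
lsblk   # identify the USB stick first (say it shows up as /dev/sdX)

# Write the ISO image straight to the device, not to a partition.
sudo dd if=ubuntu-18.04-desktop-amd64.iso of=/dev/sdX bs=4M status=progress
sync    # flush buffered writes before unplugging the stick
```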

So now I have two Linux systems. And next month after we replace a couple of old laptops we use in the warehouse, I’ll have two more.

While Ubuntu and Raspbian both feature GUIs, I am far more interested in becoming proficient with the Unix/bash command line interface (CLI). Maybe it’s a middle-age thing, but learning a CLI is very exciting and nostalgic for me. While I am of course reminded of my abortive attempts to learn Unix in the 1980s and 1990s, more than anything else my new Linux machines take me back to my first encounter with personal computers, when I was put in front of a brand new IBM PC XT at Irvine Photographics in 1984.

I was in charge of the art department at IPG, and my boss was eager to get into digital imaging and production, which was in its infancy in those days. That summer I attended the SIGGRAPH 1984 show in Minneapolis, looking for 8088- or 8086-based computer graphics software (most vendors laughed me out of their booths). In fact, the most powerful DOS-based business graphics package on the market at that time was produced just a few minutes away in Newport Beach by Zenographics, but I’d left IPG before we got as far as installing and running any computer graphics packages. We didn’t have any application software at all, since the PC XT was never meant to be used for office applications. No, all I did with the machine, after learning how to turn it on, was immerse myself in the PC-DOS command line. And boy was that fun.

Anyone who was part of the computer industry in the 1980s is familiar with the legend of how MS-DOS came to be (and consequently how a tiny company called Microsoft became the most important software company in the world). You can paraphrase its technical evolution by noting that MS-DOS was modeled in part on CP/M, and that later versions borrowed concepts, like hierarchical directories and I/O redirection, from Unix. Superficially, there is a substantial resemblance between the Unix and MS-DOS command lines (I won’t comment on what, respectively, is going on under their hoods). I think this is part of the attraction Unix and Linux hold for me: I was very comfortable with the DOS command line, and in some ways the Unix shell simply offers a far more powerful variant of DOS.

I do remember sitting down at the IBM PC XT for the first time. It had a 360KB double-sided 5¼-inch floppy disk drive and a 10MB hard drive, very high-end specs for the time. I opened the manual to figure out how to turn it on, since I assumed there was more to it than flipping the orange power switch. But that was all you had to do. You flipped that switch and after a short time (a very short time compared to what people who grew up with Windows are used to) you saw a C> prompt (not a C:\> prompt). And that was it. What do you do now?

So I opened the loose-leaf three-ring PC-DOS manual and started reading. Probably the first DOS command I learned was DIR, to see a directory listing (equivalent to Linux’s ls). I’m pretty sure I saw the DIR command in the manual, carefully typed it in and hit return, and muttered, “Whoa!” This was really something. My first command on a PC.
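
Coming back to it from the Linux side, those first DOS commands map pretty directly onto bash. A quick, rough correspondence (the file names are made up for illustration):

```bash
ls -l                  # DIR                : directory listing
cd /tmp                # CD \TMP            : change directory
echo hello > f.txt     # ECHO hello > F.TXT : redirection worked in DOS too
cp f.txt g.txt         # COPY F.TXT G.TXT   : copy a file
mv g.txt h.txt         # RENAME G.TXT H.TXT (mv also moves between directories)
cat h.txt              # TYPE H.TXT         : print a file's contents
rm -i h.txt f.txt      # DEL / ERASE        (-i asks before each removal)
```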

Now this wasn’t my first experience with computers. A couple of years earlier I’d taken a BASIC programming class at Saddleback College, followed by a class in the Pascal language. While I was taking the Pascal class I also worked as a lab assistant in the Saddleback computer lab. I would help students with Saddleback’s Data General Eclipse system and with their BASIC code, not because I was any kind of a computer whiz at that time, but because I understood the system and coding just slightly better than the students did. In fact, I recall being quite mystified and even intimidated by the Eclipse minicomputer. All I knew about its workings was how to code and run a program, and how to print a listing.

But the IBM PC XT, well, it was different. Basically, it was my own personal computer, to do with as I pleased. And since we had no software for it, all I could do was learn to manipulate the operating system, PC-DOS. I’m sure that doesn’t sound very exciting in 2019, 35 years later, to work on a computer with no application software (not even games!), but DOS was rich with commands to learn, as well as concepts like batch files, the ANSI.SYS driver, and the PATH.

I vaguely remember the moment when I learned about the AUTOEXEC.BAT file, the batch file of commands the system ran at bootup. It’s almost unbelievable today, but when you started a new IBM PC for the first time, there was no AUTOEXEC.BAT at all. DOS simply loaded into memory and you were deposited in the root directory of the boot disk with a C> prompt (or an A> prompt if you booted from a floppy disk, which was usually the case, since hard drives were very expensive and kind of rare). So I started EDLIN (DOS’s built-in line editor) and wrote my own AUTOEXEC.BAT. I have no idea what it did. Probably set a PATH environment variable, modified the prompt, and changed the current directory. After all, there wasn’t much you could do on a machine with no other software installed. But there was power in that: I made the machine do something automatically, every time it started.
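
The nearest bash equivalent of AUTOEXEC.BAT is probably ~/.bashrc (or ~/.profile), and I’d guess my first attempt translates to something like this sketch, reconstructed from a 35-year-old memory rather than from the original file:

```bash
# ~/.bashrc: run for each new interactive shell, much as AUTOEXEC.BAT
# ran at every boot. All three lines are guesses at what that file did.
export PATH="$PATH:$HOME/bin"    # like PATH=C:\DOS;C:\UTIL
PS1='\u@\h:\w\$ '                # like PROMPT $p$g: show where you are
cd "$HOME/work" 2>/dev/null      # like CD \WORK (hypothetical directory)
```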

Over the next few years I got to the point where I could make MS-DOS sing and dance. The folks at Zenographics wrote a version of the Unix ls utility for DOS that added tremendous new functionality to the CLI, particularly recursion. With batch files, which are the equivalent of Unix shell scripts, I could combine command line utilities from Peter Norton with the ANSI.SYS driver to produce elaborate front-end and menu systems for DOS-based computers. People marveled at what I could do from the DOS command line.
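
In bash terms, one of those menu front ends would look something like the sketch below. The menu entries are invented, but the printf escape codes play the role ANSI.SYS played under DOS:

```bash
#!/usr/bin/env bash
# A rough bash analogue of a DOS batch-file menu system.
clear
printf '\e[1;37;44m%s\e[0m\n' '       MAIN MENU       '  # bold white on blue
printf '1) Show directory listing\n2) Show disk usage\n3) Quit\n'
read -rp 'Choice: ' choice
case "$choice" in
    1) ls -l ;;
    2) df -h . ;;
    3) exit 0 ;;
    *) printf '\e[31mInvalid choice\e[0m\n' ;;  # red error text
esac
```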

And then Windows 3.1 appeared and all that capability was rendered instantly obsolete, redundant, superfluous. The DOS CLI was dead.

But the Unix CLI never died. To be frank, Unix was probably on its way out around the same time, but it was revived by the explosion of the World Wide Web, most of the infrastructure of which was and is Unix-based. I’ve never seen this suggested anywhere else, but I believe the Web itself was responsible for a resurgence in Unix’s popularity and importance, which otherwise would have gone the way of the VAX, while the development of Linux simply accelerated the trend. And while the popularity of GUI-based desktop variants of Linux might impact the acceptance of the Unix CLI to some extent, I doubt it will ever go away the way the DOS CLI did following the ascendancy of Windows.

For one thing, the Unix shells are far more powerful than the DOS CLI ever was. For another, I don’t think there is much reason for a typical user to install Linux at all unless he or she intends to take advantage of the CLI. While starry-eyed Linux evangelists imagine a future utopia when the great mass of computer users abandon their Windows, Mac and Android clients for the open-source power and utility of Linux desktops, Linux clients are and always will be of primary interest to those who lean towards geekery, great or slight. And the geeks will keep the CLI alive.

And so here we are.