Category Archives: History

Late to the Mac

A few weeks ago I was asked by Stephen Hackett to contribute a story to his members-only newsletter about my early experiences with computers and with the Mac. I was happy to help Stephen out because he does great work at 512 Pixels, because he is a friend, and because I like to talk about myself.

I agreed with Stephen that my contribution would be exclusive to his newsletter for a few weeks, and that afterwards I could publish it on my own blog. Here it comes!

Stephen’s version in the newsletter was spruced up with cool pictures of the mentioned computers, and possibly edited in ways that made me sound better than I do here. This is the raw, uncut version. Enjoy!


I’ve been a rabid Mac user and developer for 25 years, so it’s fair to say I’ve been a part of this community “for a long time.” But in my heart, I still consider myself something of a newbie. After all, I didn’t get my first Mac until I was 18 years old. My entire childhood was spent using, enjoying, even obsessing over computers made by other companies.

In 1982, I was 7 years old and lived with my mom in the rural town of Dunsmuir, California. While Silicon Valley’s famous fixation with technology was well underway, it hadn’t substantially reached the folks in my small, relatively poor home town. Meanwhile, my dad was living separately from us in Santa Cruz. He had gone back to school as an older student to obtain a computer science degree, and started his professional career working for IBM in San Jose. His enthusiasm would turn out to be crucial to my own interest in computers, and to my eventually programming them.

On one of his visits, he brought my first computer: a Timex Sinclair 1000. It was barely the size of a small book, used a television set for a display, saved files to audio cassette tapes, and featured a whopping 2KB of RAM. It’s fascinating to realize that even the text of this article would not fit in that computer’s memory. My dad gave me some copies of Compute! magazine, which featured BASIC program listings, some of which I guess were applicable to the Timex Sinclair. One of my earliest programming memories is of him walking me through the rough notion of source code as a sequence of instructions, and the computer as the obedient executor of those commands.

That Timex Sinclair wasn’t much of a machine, but it may still have made me the very first person in Dunsmuir to own a “home computer.” Apart from typing in the occasional program from those magazines, I never really learned to program it. But having those computing concepts introduced when I was so young, and in such a fundamentally hands-on way, set me up for the deeper understanding of computers and programming that I would later gain.

In 1983, my mom and I moved to Santa Cruz to live with my dad. In the Bay Area, we were far from the only family with a home computer, but my dad’s enthusiasm certainly meant our home saw a greater variety of them than most! My dad bought me a Commodore 64, which I adored, but the real thrill was his Kaypro IV. This beast was a semi-portable, all-in-one computer that ran the CP/M operating system, and featured a monochrome green display. I’d sneak onto it while he was out of the house, scouring his piles of floppy disks for anything of interest. I spent hours playing Ladder, a Donkey Kong-like game whose graphics consisted entirely of ASCII characters animated on the screen.

When the Amiga 1000 was released in 1985, my dad bought one, and I threw myself completely into it. I experimented with programming, studying the “ROM Kernel Manual”, but never learned how to put together a whole program. Still, for many years, I was “an Amiga guy.” I met other Amiga users, traded software, and reveled in the various technically superior capabilities of that platform.

Through the Santa Cruz BBS community, I got hooked up with an extensive network of UC Santa Cruz computer nerds, and found my way onto the internet by around 1989. That’s how I fell in love with UNIX. Throughout my teenage years, I played with various UNIX systems through the ADM-3A terminals in the university’s computer labs, and eventually ended up owning both a Sun 3/50 and an SGI Indigo. Obviously, I loved computers, but in spite of my diverse experience with various systems, I’d never really learned to program, and I’d never really used a Mac.

By 1993, I had a few friends through the Santa Cruz computer scene who were then working at Apple, about a 45-minute drive away. David Van Brink was a software engineer on the QuickTime team, and he opened my eyes to the virtues of the Mac. As much as I had enjoyed other computers, from that Timex Sinclair, through the Amigas, UNIX systems, and occasional other platforms, I had never witnessed a computer seeming particularly empathetic to its users. The Amigas came closest, but even at their most charming, they were obviously meant for nerds. The Mac, I finally came to realize, truly was “the computer for the rest of us,” and I was ready to become one of us.

David shared his employee discount with me and, after convincing my dad it would be useful for my college work, I came up with the cash for my very first Mac: a PowerBook Duo 210. This curious machine came from an era when Apple supported the notion of a “dockable” computer that could be used as a portable, but also plugged into a desktop dock connected to a full monitor, keyboard, etc. I could only afford the computer itself, so I made do with its limitations, but it was marvelous. Finally, more than 10 years after first being introduced to programming and computers, I owned a Mac, and was eager to write software for it.

I used a program called THINK Reference to study the Macintosh Toolbox API, and got to work writing my first app, Super Robots, which was a blatant clone of a text-graphics based app for UNIX called “robots.” (Fun fact: because the PowerBook Duo had a grayscale screen, I developed and shipped the app without ever having seen it for myself in full color.)

After shipping an app, and becoming vaguely familiar with the Mac operating system, I took the step that would ultimately be most impactful on my career and life-long love of computers: I went to work for Apple! At the encouragement of Qarin Van Brink (David’s sister), I applied at a Silicon Valley contract agency that was known to supply many of Apple’s QA testers. A few improbably successful interviews later, I was working in Cupertino on the System 7 integration team.

I eventually made my way out of testing and into engineering, where at long last, I started to sort of maybe learn to program. I also started learning how to really use and appreciate the Mac, and how to empathize with users. Compared to many Apple fans, I was late to the Mac party, but I’ve been making up for lost time. For 25 years, I’ve devoted myself to using and programming Macs. Here’s hoping I can eke out at least another 25!

Paper Airplane Icons

A friend who is running the latest beta of Microsoft’s Outlook 2016 for Mac shared a screenshot of the app’s sidebar icons:

Screenshot of Outlook 2016 sidebar icons.

The paper airplane used for “Sent” really jumped out at me, and I felt compelled to consider how common the metaphor is, and for how long it has been used, to represent “sent email” in apps.

It seems obvious the metaphor is supposed to relate sending an email to the storybook notion of passing a note in class. Write your note on the paper, fold it into an airplane shape, launch it across the classroom, and hope against hope that you avoid the teacher’s gaze, aren’t ratted out by a classmate, and that you execute a perfect delivery so it doesn’t fall into the wrong hands.

Come to think of it, maybe this isn’t an icon that inspires confidence in a safe delivery. Nonetheless, I think it’s a pretty cute metaphor.

The app I most associate with paper airplane icons is the Mac’s built-in Mail app. Apple uses a mix of metaphors in the app, including a postage stamp for the app’s main icon:

Apple Mail app icon.

and physical envelopes in the icons for some of its preferences:

Image of toolbar icons in Mail Preferences featuring physical envelopes

But when it comes to drafting and sending mail? It’s all about the planes. Notice how they even leverage the playful symbolism to represent a draft message with a paper folding diagram:

Image of Mail.app sidebar icons.

I was curious to know whether any email app used paper airplanes in this way before Apple Mail did. I went out Googling and found all manner of representations, usually employing a paper envelope or another snail-mail related symbol. None of them, except Apple Mail, used a paper airplane.

So my modest research suggests that the use of a paper airplane was a pretty novel bit of design. Was it an Apple innovation, or did it debut in some prior app I haven’t been able to track down? Is Microsoft’s adoption of the symbol the next step towards making paper airplane icons the universal symbol of sent mail? I kind of hope so!

Can’t Take That Away

I was surprised by how much I enjoyed all the pro-Mac celebration today, marking the 30th anniversary of the Mac’s debut.

Apple’s home page is dedicated to the Mac, and links to an extensive special feature outlining, year by year, significant uses of the Mac by notable people, as well as the major shifts in design and functionality the computer has seen.

As a long-time Mac user and developer — I started working for Apple at 18, went indie at 26, and am 38 now, still working on Macs — I was touched by the amount of pride Apple exudes today in celebrating the triumph of the Mac. Especially over the past several years, as many people have shifted their attention to mobile devices based on iOS and Android, it’s easy to forget that the Mac is still an amazing device and that its hardware and software both continue to improve every year.

Topping off a great day, news came out of Cupertino that Apple has put up giant posters on campus, upon which are printed the names of “every employee who has ever worked for Apple.” Holy cow, my pride overflows. What a grand, yet humble gesture. My old friend Dan Curtis Johnson confirmed that I had made the cut:

In close proximity, both Dan Jalkut and Ellen Hancock. Such a faraway time, the long-long ago.

It was long enough ago that some people still called me Dan.

I learned so much at Apple, and had many profound experiences that shaped the way I see the world, both technically and otherwise. I was hired a year or two before Steve Jobs came back, and one of my first joys at Apple was adding my own name to the “About Box” for the Memory control panel, which I had taken charge of. And because I could, I also added the names of some friends. I was a little immature. When Steve came back, one of his company-wide edicts was that the names of individuals must be removed from about boxes. I had to commit the source code that wiped my own identity off the faces of the products I had worked on.

Steve’s explanation was something along the lines that it was unfair to put the names of a few in about boxes because it was a disservice to all the other employees who were not listed. He insisted that each of them was as important to the success of the company as the people who were listed. Of course he was right, but it didn’t feel great at the time.

It’s hard not to think of Steve when I read about this perfect gesture of gratitude to all the employees who helped, directly or indirectly, in making the Mac a success. The Mac is his baby, and it’s grown to be not only strong and robust, but in some respects unassailable, thanks to the hard work of the tens of thousands of people whose names are on those posters.

I started at Apple while I was still in college, barely having glimpsed the professional world outside of the company. I learned a lot in school, but I learned much, much more at Apple. My time there completely shaped not only the way I develop software but the reasons I develop software. Fundamentally: to serve and delight the people who use it.

I often think back wistfully, wondering what would have happened if I had stayed at the company. I might have gone on to do important, admirable work, or I might have become one of those (rare) old slackers at Apple who doesn’t earn his or her keep. Either way, my name would have been on one of those posters today. And either way, it would have been well-earned. People often say the great thing about education is that nobody can take it away from you. The same is true of working for Apple, and the company’s gestures today strongly underscore that fact.

Virtual ROM

My attention was drawn by Anil Dash on Twitter to two posts discussing the purported and actual capacities of devices that, for example, advertise “64GB of storage.”

Marco Arment linked to an article on The Verge, asserting Microsoft’s 64GB Surface Pro will have only 23GB of usable space. Arment went on to suggest that device manufacturers should be required to market devices based on the amount of space available for end-user data.

Ed Bott’s take, on the other hand, was that Microsoft’s Surface Pro is more comparable to a MacBook Air than other tablets, and its baseline disk usage should be considered in that context.

I think both Arment’s and Bott’s analyses are useful. It would be nice, as Arment suggests, if users were presented with a realistic expectation of how much capacity a device will have once they actually start to use it. And there is merit in Bott’s defense that a powerful tablet, using a more computer-scale percentage of a built-in disk’s storage, should be compared with other full-fledged computers.

Let’s just say that if fudging capacity numbers were patented, every tech company would be in hot water with the patent trolls. A quick glance at iTunes reveals that my allegedly 64GB iPhone actually has a capacity of 57.3GB.

Screenshot of iTunes showing the iPhone’s 57.3GB capacity.

I don’t know precisely what accounts for this discrepancy, but I can guess that technological detritus, such as the metadata the filesystem uses merely to manage the content on the disk, takes up a significant amount of space. On top of that, the discrepancy may include space allotted for Apple’s operating system, bundled frameworks, and apps. Additional features such as recovery partitions always come at the cost of that precious disk space. Nonetheless, Apple doesn’t sell this 64GB iPhone as the 57.3GB iPhone. No other company would, either.
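One likely contributor, and this is my own back-of-the-envelope guess rather than anything Apple documents, is the mismatch between decimal and binary units: storage is marketed in gigabytes of 10^9 bytes, while iTunes of that era reported capacity in binary units of 2^30 bytes. A quick sketch of the arithmetic:

```python
ADVERTISED_GB = 64  # marketing gigabytes: 10**9 bytes each

advertised_bytes = ADVERTISED_GB * 10**9

# Binary "gigabytes" (gibibytes) are 2**30 bytes each, the unit many
# tools of that era used when reporting capacity.
binary_gb = advertised_bytes / 2**30
print(f"64 GB advertised = {binary_gb:.1f} GB in binary units")

# The rest of the gap, down to the 57.3 GB iTunes showed, would be the
# operating system, bundled apps, and filesystem overhead.
overhead_gb = binary_gb - 57.3
print(f"Implied OS + filesystem overhead: about {overhead_gb:.1f} GB")
```

By this accounting, unit conversion alone eats nearly 4.4 of the missing gigabytes before the OS takes its share.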

It seems that in the marketing of computers, capacity has always been cited in the absence of any clarification about actual utility. One of my first computers (after my Timex Sinclair 1000) was the Commodore 64, a computer whose RAM capacity was built into the very marketing name of the product. Later, Apple followed this scheme with computers that would be known as the Mac 128K and Mac 512K, each alluding to its ever-increasing RAM capacity.

The purported RAM capacity was somewhat misleading. Sure, the Commodore 64 had 64K of RAM, but some of it was claimed by the operating system; you could not run a program that expected a full 64K of RAM and have it work. So was it misleading? Yes, but all marketing is misleading, and just as it’s easier to describe an iPhone 5 as having 64GB capacity, it was easier to describe a Commodore as having 64K, or a Mac as having 128K of RAM.
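The Commodore 64 even admitted as much at startup: its boot screen announced “38911 BASIC BYTES FREE”, because the BASIC interpreter, the KERNAL, and screen memory claimed the rest of the address space. A small sketch of that arithmetic (the 38911 figure is straight from the boot screen; the rest is just derived from it):

```python
total_bytes = 64 * 1024  # the "64K" of the product name: 65536 bytes
basic_free = 38911       # "38911 BASIC BYTES FREE" per the C64 boot screen

reserved = total_bytes - basic_free
print(f"Claimed by the system: {reserved} bytes")
print(f"Available to BASIC programs: {basic_free / total_bytes:.0%}")
```

In other words, a BASIC programmer saw only about 59% of the machine’s marquee number.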

But the capacity claims were more honest than they might have been, thanks to the pragmatic allure of storing much of a computer’s own private data in ROM instead of RAM. Because in those days ROM was much cheaper than RAM, there was a practical, competitive reason for a company to store as much of its “nuts & bolts” code as possible in ROM. The Mac 128K shipped with 128K of RAM, but also with 64K of ROM, on which much of the operating system’s own code could be stored and accessed.

Thanks to the ROM storage that came bundled with computers, more of the installed RAM was available to end users. And thanks to the slowness of floppy and hard disks, not to mention the question of whether a particular user would even have one, disk storage was also primarily at the user’s discretion. It was only after the performance of hard drives and RAM increased that the allure of ROM diminished, and computer makers gradually began storing their own data in the space previously reserved for users. With the increasing speed and size of RAM, and then with the advent of virtual memory on consumer computers, disk and RAM storage graduated into a sort of virtual ROM.

The transition took some time. In the years after that Mac 128K, for example, Apple continued to increase the amount of ROM it included in its computers. I started working at Apple in an era when a good portion of an operating system would be “burned in ROM”, with only the requisite bug fixes patched in from disk. I haven’t kept up with the latest developments, but I wouldn’t be surprised if the ROM on modern Macs is only sufficient to boot the computer and bootstrap the operating system from disk. For example, the technical specifications of the Mac Mini don’t even list ROM among its attributes. The vast majority of capacity for running code on modern computers is derived from leveraging the RAM and disk capacities of the device in tandem.

So we have transitioned from a time when the advertised capacity of a device was largely available to the user, to an era when the technical capacity may deviate greatly from what a user can realistically work with. The speed and affordability of RAM, magnetic disks, and most recently SSD storage have created a situation where the parts of a computer that the vendor most wants to exploit are the same ones the customer covets. Two parties laying claim to a single precious resource can be a recipe for disaster. Fortunately for us customers, RAM, SSDs, and hard disks are cheaper than they have ever been. Whether we opt for 64GB, 128GB, or 1TB, we can probably afford to lend the device makers a little space for their virtual ROM.