Touch Bar Everywhere

Apple announced on Thursday that many of their new MacBook Pros will ship with a “Touch Bar,” a narrow, high-resolution touch screen in place of the Mac keyboard’s traditional function key row.

Many people immediately wondered whether we can expect Apple to release an external keyboard with the Touch Bar. This would bring the technology to the much wider audience of Mac users who are not ready to update to the latest MacBooks, or who prefer desktop Macs, or who prefer the flexibility of using their MacBook in a desktop-style configuration.

The question was discussed on the latest Accidental Tech Podcast, in which, if memory serves, John Siracusa and Marco Arment argued different angles of the “no” position, citing the hardware cost, the extent to which a Touch Bar keyboard would complicate the accessories lineup, and, perhaps most significantly, the notion that Tim Cook does not care enough about the Mac to prioritize pushing any such technology.

Casey Liss watched football. Zing! Actually, Casey pushed back against the cynicism, suggesting that Apple’s apparent lack of enthusiasm for the Mac does not reflect a lack of commitment to improving it. I, on the other hand, take exception to the very suggestion that Apple lacks enthusiasm or is not investing heavily in the Mac. Or at least, not where this feature is concerned.

I think Apple intends to push the Touch Bar as widely as it possibly can. The current MacBook Pro lineup is the most practical computer on which to debut the feature, but as it becomes possible to bundle it with external keyboards, and on notebook computers at every price point, they will do so.

Why am I so confident about Apple’s big plans for the Touch Bar? Because while many people assert that Apple is not investing seriously in the Mac, the Touch Bar’s hardware and software support appear to have been a major priority for the company in the year or two leading up to its release.

A massive amount of design work must have gone into the Touch Bar’s physical hardware, structuring the information it represents, and deciding how users will most usefully interact with it. I suspect the Touch Bar merited an amount of design effort somewhat less than, but not incomparable to, that of a standalone product like the Watch.

Thanks to the Touch Bar simulator in Xcode 8.1, we can also already take stock of the sheer amount of engineering effort, across many disparate groups in the company, that went into supporting the Touch Bar from a wide range of different apps and modes in macOS. Leave the simulator running while you go about your work, and prepare to be repeatedly surprised by the variety of novel use cases that have already been identified and implemented.
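
The developer-facing API reflects the same care. As a rough sense of what adoption looks like, here is a minimal sketch (not Apple’s own code; the identifier, class, and action names are hypothetical) of a window controller vending a custom bar via the NSTouchBar API that accompanies Xcode 8.1:

    import AppKit

    // A hypothetical identifier for an app-specific Touch Bar item.
    fileprivate extension NSTouchBarItemIdentifier {
        static let publish = NSTouchBarItemIdentifier("com.example.touchbar.publish")
    }

    class DocumentWindowController: NSWindowController, NSTouchBarDelegate {

        // NSResponder asks for a bar on demand; describe the items we want,
        // and let the system merge in items from the rest of the responder chain.
        override func makeTouchBar() -> NSTouchBar? {
            let bar = NSTouchBar()
            bar.delegate = self
            bar.defaultItemIdentifiers = [.publish, .otherItemsProxy]
            return bar
        }

        // Items are instantiated lazily, by identifier.
        func touchBar(_ touchBar: NSTouchBar,
                      makeItemForIdentifier identifier: NSTouchBarItemIdentifier) -> NSTouchBarItem? {
            guard identifier == .publish else { return nil }
            let item = NSCustomTouchBarItem(identifier: identifier)
            item.view = NSButton(title: "Publish",
                                 target: self,
                                 action: #selector(publish(_:)))
            return item
        }

        @objc func publish(_ sender: Any?) {
            // A hypothetical app action triggered from the Touch Bar.
        }
    }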

I find it impossible to believe that Apple would go to all this work, both on the Touch Bar itself, and across the entire range of its own apps and OS features, unless it had a grand vision for the Touch Bar that extends way beyond the internal keyboard of its premium notebook computers.

Instead, I think Apple sees the Touch Bar as a long-term, distinguishing aspect of using a Mac. Users will always be able to get by without one, just as they do, for example, when a multi-touch trackpad is not available. But macOS, and nearly every app that users run on it, will work better with a Touch Bar. One day we’ll expect to always have access to one, and will feel that something is missing if we don’t.

It’s easy to see why Apple couldn’t come charging out of the gate with this vision fully realized. The Touch Bar hardware is no doubt expensive, and there are probably practical considerations with respect to the security of Touch ID, bandwidth between the device and the Mac, and managing its power needs in a user-friendly manner.

I say give Apple time. They’ve made a huge investment in the Touch Bar, and all indications are that they are prepared to continue prioritizing support for it down the road. We’re only on the brink of entering the early adopter phase, but in years to come I do think the Touch Bar will be everywhere.

The Price Of GPL

Matt Mullenweg, the founder of Automattic, downloaded his competitor Wix’s iOS app. It looked eerily familiar, and he confirmed that it contained source code stolen from WordPress. He called them out on his blog, getting right to the point in addressing the problem:

Your app’s editor is built with stolen code, so your whole app is now in violation of the license.

Wix’s CEO, Avishai Abrahami, responded with a round of non sequiturs that carefully evade the point that his product is built from source code for which they have not paid. One of his engineers likewise misses the point, focusing on the circumstances surrounding the violation rather than taking responsibility for the theft.

Some will take issue with the use of strong words like “stolen code,” and “theft,” with respect to a GPL violation. But that’s exactly what it is: software has been taken and deployed in Wix’s product, but the price for doing so has not been paid.

Many developers (and CEOs) seem to prefer remaining willfully oblivious to the consequences of using GPL code. They loosely interpret the terms of the GPL to suit their own wishes about what those terms imply:

  • “It’s OK for us to use GPL code anywhere, as long as we contribute back changes.”
  • “It’s only a small amount of GPL code, so the license doesn’t apply.”
  • “We contributed to this GPL code, so we have special rights to use it.”
  • “We give back to the community in other ways, so it balances out.”

All false, yet all common interpretations of GPL, and echoes of the poor arguments presented by Wix’s CEO and engineer.

The price of GPL is fairly obvious and easy to understand, even if there is some bickering about what constitutes “linked code.” You don’t have to be a legal expert to get the gist of it: if you want to link your software with GPL code, you must also make your software’s source code available. Specifically, you must make your software’s source code available to customers who install your software, under a GPL-compatible license. You have to give your code away. That’s the price of GPL.

Many developers understand the price of GPL, and view it as perfectly justified, while others (myself included) find it unacceptable. So what am I supposed to do? Not use any GPL source code at all in any of my proprietary products? Exactly. Because the price of GPL is too much for me, and I don’t steal source code.

Log Littering

Apple has dramatically revamped its standard logging mechanism. Unified Logging, available in macOS 10.12 and iOS 10, replaces various file-based logging approaches with a centralized, database-backed repository for log information of all levels of interest.

The big win, both for developers and for users, is performance. Where the simple act of logging some debugging information used to have enough of a cost to dissuade rampant use, the new system makes it so cheap that developers inside Apple and out are being encouraged to “log like the dickens” in the name of generating more evidence for posthumous debugging. The new system also offers impressive data mining abilities, making it far easier to filter for what you’re looking for and to track logging information across specific activities, even if the activity spans multiple processes.
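
The developer-facing side of this is pleasantly small. Here is a minimal sketch of logging through the new os_log API from Swift; the subsystem and category strings are hypothetical examples:

    import os.log

    // A log handle carries the subsystem and category annotations that the
    // unified system uses for later filtering and data mining.
    let uiLog = OSLog(subsystem: "com.example.myapp", category: "ui")

    // Default-level messages appear in Console by default. Info- and
    // debug-level messages are cheaper still, captured only when enabled.
    os_log("Opened document %{public}@", log: uiLog, type: .default, "Notes.txt")
    os_log("Restored scroll position %d", log: uiLog, type: .debug, 42)

    // Error- and fault-level messages receive special treatment. Dynamic
    // strings are redacted as <private> unless explicitly marked public,
    // in keeping with the system's privacy-by-design goal.
    os_log("Save failed: %{public}@", log: uiLog, type: .error, "disk full")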

The two big losses, in my opinion, are that the sheer size, number, and variety of logging messages make it impractical for users to skim the console for “real problems,” and that the resulting logging archives are so large that it’s impractical to casually include them with bug reports to Apple or 3rd-party developers.

The WWDC session on the topic goes into detail on these points, but perhaps most useful for framing my complaints here are Apple’s stated goals for the revamp:

  • One common, efficient logging mechanism for both user and kernel mode
  • Maximize information collected while minimizing observer effect [the phenomenon that enabling or adding logging changes the behavior of the system you are trying to understand]
  • We want as much logging on all the time as possible
  • Design privacy into the system

Two of the stated goals, “maximize information collected” and “as much logging on all the time as possible,” exacerbate the problems of there being so much log data that it’s both difficult to skim and cumbersome to share.

Imagine a society in which all packaging has been transformed into fast-composting, biodegradable materials. Your soda bottles, snack wrappers, cigarette packages, etc., all biodegrade to dirt within 48 hours of use. What a boon for the world: the major, global problem of trash accumulating in our towns and environment would be gone. Poof!

Or would it? When “littering has no cost,” I suspect that we’d face a new problem: far more people would litter. Why bother finding a place to throw out that bottle, when nature will take care of it within 48 hours? Multiply this attitude out over even a small portion of the world’s billions of people, and we’d be guaranteed to be buried in trash. All trash that will be gone in 48 hours, mind you, but constantly replenished with a fresh supply.

I think this metaphor points to a similar problem with unified logging: the sudden onslaught of “low cost logging” has left our developer society unprepared in some specific ways to deal with the consequences of the new reality.

So how can we, and Apple, fix this? We need specific solutions that make skimming for problems and sharing pertinent log data easier, yet don’t undermine the very positive goals Apple outlined above. Here is my advice:

  1. Don’t litter the logs. Just because you can log it, and just because doing so is cheap, doesn’t mean there isn’t a cost. Some of the recurring log messages littering my Console arrive at a rate of thousands per minute. There may be good arguments that certain subsystems, in certain states, should log this extensively, if the resulting data will facilitate easier diagnosis of problems, but much of the junk I see is redundant reiteration of relatively useless information:
Not switching as we're not in ~/Library/Keychains...
...
CSSM Exception:...
...
Etc. Etc. Etc.

Some of these seem to be well and truly noise, but some of them, even if they seem highly redundant and noisy to me, could in fact represent useful logging if you happened to be, for example, tracking down a subtle bug in Apple’s security libraries. That’s fine; I’ll accept some amount of log diarrhea in the name of better software, but only if the following accommodation is made…

  2. Annotate high-frequency logs for easier filtering. The logging system offers a variety of tools for annotating log messages, but even internal Apple groups do not seem to use them extensively or appropriately. The system supports three levels of log message: default, info, and debug. Only “default” level messages are displayed by default in the Console app, yet all of the above-described garbage shows up in that default mode. The Console will become significantly more useful when the worst offenders have marked their messages for severity, subsystem, and category, so that they can be omitted from the default view and mined later by folks who can actually make sense of them. (A sketch of the kind of filtering this enables follows this list.)

  3. Generate context-specific sysdiagnoses. One of the worst practical outcomes of unified logging is that capturing a “sysdiagnose” on my Mac now generates a compressed archive that is still larger than 400MB. These archives are typically requested by most groups at Apple in response to bug reports, regardless of how severe the bug is or how readily reproducible it appears to be. In the world of Apple bug reports, sysdiagnoses are treated like lightweight bits of information that should be appended to every issue, except they’re now anything but lightweight.

    All the annotation work advised above should really pay off when Apple provides a streamlined mechanism for capturing sysdiagnose information pertinent to a specific subsystem or product. The company already offers product-specific advice for capturing log information, sometimes requiring the installation of custom profiles and other shenanigans. The lightweight unified logging system empowers Apple both to require that internal groups properly annotate their log messages, and to facilitate smarter gathering of that data.

    Currently, if you want to capture a sysdiagnose at any time on a Mac, you simply press Ctrl-Opt-Cmd-Shift-Period. On an iOS device it’s done by holding both volume keys and the power key, but you have to enable logging first with a custom profile. This shortcut generates one of those mondo 400MB-style sysdiagnose archives, which you can upload to attach to your bug reports in your copious spare time.

    I envision a prompt that appears after invoking the existing shortcut, or else a new shortcut for an “interactive sysdiagnose,” where you could specify the category of bug you are reporting. The prompt would list categories corresponding to groups at Apple that had done the work of providing streamlined log filters, effectively stripping out all the (to that group) useless noise from your log data.

    In fact, sysdiagnose already takes a “process” parameter when invoked from the command line, and my understanding is that this enables it to capture data that is pertinent to the target process. The unified logging system seems to provide the infrastructure for even smarter capturing along these lines.

    I realize that in this time of flux, where much log data is not properly annotated, there is still an incentive for many groups to capture as much as possible and sort it out on Apple’s end. These groups should know, however, that the larger the sysdiagnose archive you ask users to upload, the lower the chances they will bother actually following through on filing the bug or on providing the pertinent information you have requested.

  4. Annotate specifically for user-concerning issues. Power users on the Mac have, for years, counted on being able to skim the console log for information that might explain a slow Mac, persistent crashes, or other inexplicable behavior. Given the huge increase in the number of log messages, and the lackluster annotation of those messages by Apple, the Console app is effectively useless to users for this purpose.

    On top of getting the annotation right in general, I think a new special level of log message should be created, indicating particular concern to end-users. These would identify log messages that developers specifically think users should take notice of and follow up about. I mentioned there are three basic levels of log message: default, info, and debug. Beyond those, there are two special levels called “fault” and “error,” which can be filtered on in the Console, and which receive special treatment from the logging system.

    These special levels seem close to what users might find interest in, but they’re not quite right either.

    A “user interest” level of logging message would facilitate the kind of high-level skimming that seems impossible in Sierra today. These log messages would convey information that a user might gain some insight from. They would stand in stark contrast to the noise of messages like this one, from the filecoordinationd process:

    Claim D32EDEBE-C702-4638-800C-E4BAB9B767F3 granted in server

    Nobody knows what a claim is, or cares whether it was “granted in server” … unless they are actively debugging filecoordinationd. On the other hand, a message indicating the failure to grant a claim “because the disk is full,” could be very useful to users indeed.

    I realize the line could be difficult to draw in some circumstances, but a huge number of messages currently being logged are obviously not of user interest. Perhaps then, the opposite approach could be taken, and log messages could be annotated as being specifically useful for posthumous, internal debugging by developers.
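
The pieces for this kind of filtering are already in place. As a sketch of what point 2’s annotations buy us, and of the process-targeted sysdiagnose just described, here is how Sierra’s log tool can slice the firehose by subsystem and level today; the subsystem string and process name are hypothetical examples:

    # Show the last hour of messages from one subsystem, including the
    # info-level detail that Console hides by default.
    log show --last 1h --info --predicate 'subsystem == "com.example.myapp"'

    # Watch only error-level messages as they arrive, ignoring the noise.
    log stream --predicate 'messageType == error'

    # Capture a sysdiagnose focused on a single process, per the "process"
    # parameter mentioned above.
    sudo sysdiagnose MyApp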

I tell you, things are bleak. We’re swimming in a morass of useless console noise, and there is little we can do about it. There is room for optimism, however: the logging infrastructure is good, and seems to support many thoughtful measures that could attenuate the problems and make the system more useful and less cumbersome for developers, end-users, and well-meaning bug reporters.

The console is littered with trash. Sure, it will biodegrade in 48 hours, but that doesn’t mean we shouldn’t be concerned with cleaning it up. A combination of more thoughtful logging, proper annotation, and appropriate filtering will get us out of this mess.

Twitter Optimism

Since its debut in 2006, Twitter has been free to use. As with so many other successful social networks, the strategy seems to have been to first attract a massive number of users, and then set to work figuring out how to survive financially off of them.

Having been conditioned over the years to not pay a dime directly to the company, Twitter’s customers would, I suspect, never have accepted a business plan that forced them to overtly pay for the service. Ads seem like a perfect fit, and we’re all so accustomed to trading our attention for free or heavily subsidized services that hardly any of us have complained.

Still, ads are obnoxious. They range from the merely obnoxious interruption of your personal timeline’s flow, graduate to the insidious obnoxiousness of ridiculous products you’re not interested in, and peak with the repulsive obnoxiousness of forcing the slogans of despised political candidates or other anathema concepts into your brain, by way of your ever-so-valuable eyeballs.

What if Twitter adopted a business model that allowed them to maximize financial gain from advertising, while minimizing the obnoxious intrusion into the sanctity of your personal timeline? In fact, I think they could be on to just such a model.

On the latest episode of Core Intuition, Manton and I chatted about rumors that Twitter might soon be acquired by a larger company such as Salesforce, Google, or Disney. We agreed that each suitor would bring its own advantages, and threats, to Twitter. On the whole, though, we also agreed that Disney would be the best of the three with respect to maintaining the status quo.

Disney is a media company, through and through. Coincidentally, Twitter has aligned itself with media outlets over the years, offering high-profile integrations with major events ranging from awards shows, to sporting events, to political debates and beyond.

I suspect that as Twitter focuses more and more on these kinds of enhanced event-based Twitter streams, they will find that advertisers are keenly interested to pay a premium price for ads that target that same audience. Just as many companies will pay a massive amount of money to target the Super Bowl audience, they should expect to pay a significant markup to target the Twitter-based Super Bowl “moment.”

I’m optimistic that Twitter will recognize that the massive advertising potential of sponsored events leaves them free to leave boring, everyday social Twitter relatively, or even completely, ad-free. This approach would take will: it’s hard to say no to advertising dollars, and there will always be somebody willing to pay a few cents to pop an ad into Joe Public’s private Twitter feed. But selling out private timelines may be a poor investment, when Twitter could instead capitalize on the user trust and loyalty that will come from having their own “personal” spaces on the service treated with respect.

There are parallels in other media. On television, for example, a few channels are left unscathed by the blight of advertising. I don’t know whether it’s by force of federal law, local cable contracts, un-marketability, or some combination of the three, but you don’t see ads on local cable access stations or C-SPAN, do you? The cable industry chalks up these losses and nonetheless makes a healthy living selling ads on all the other stations that viewers inevitably opt in to watching, because they value the content.

I think Twitter’s users will continue to opt in to these special events, because they value the content. And this self-selection is part of what makes the advertising opportunity so valuable. Finally, I don’t think users mind nearly so much when ads junk up the timeline of a public “moment,” because that content doesn’t feel, and isn’t meant to be, personal. It’s mass media. We are used to advertising on our mass media, but tend to get really annoyed when it shows up in our personal chats and feeds.

Hopefully Twitter will recognize the opportunity they have to satisfy both their financial needs, and the wishes of their customers, by focusing their advertising where it really counts.

App Store Maturity

Apple announced that they will be taking steps to improve the quality of apps available in the App Store:

We are implementing an ongoing process of evaluating apps, removing apps that no longer function as intended, don’t follow current review guidelines, or are outdated.

Developers have known since early in the App Store’s history that Apple may retroactively decide that a particular app no longer merits inclusion in the store. Because of the large number of apps in the store, it has widely been thought that such reconsideration would only occur when and if a new version of an app was submitted, and thus reviewed again. This announcement, however, hints at a much larger-scale procedure that could potentially cull thousands of products from the store.

Several months ago, developers noticed that the average review time for apps had dropped dramatically. Instead of taking several days, sometimes a week or more, it is now common for apps to be reviewed in only one or two days. Many wondered whether some technical breakthrough had made it easier to blaze through reviews. For example, improved static analysis, an automated fuzz-testing suite, or some combination of these and other techniques could reduce the need for stringent human involvement in some aspects of the review process.

There are over 2 million apps in the App Store, and Apple has effectively announced that they are prepared to re-review all of them in the name of improving overall quality in the store. This hints strongly that there has been some systematic improvement to the review process. It boggles the mind to imagine that all 2 million of those apps were in fact reviewed by humans, but that happened over the course of almost 10 years. Whatever process Apple is gearing up to apply, they claim apps will start dropping from the store as early as September 7.

It’s interesting to me that Apple feels comfortable dropping a potentially massive number of apps from the store. They have never shied away from boasting about the impact of the App Store, often focusing on its sheer size. They make a point in every WWDC keynote to talk about the vast numbers of developers, apps, downloads, and yes, dollars flowing through the store. If they measured success purely by the number of apps in the store, they would have made their review criteria much more lenient from the start. But if they cut the number of apps for sale by a significant degree, it will be the first time in the App Store era that I remember the company emphasizing “quality, not quantity.”

I see both the decision to ratchet up quality control, and the willingness to live with the consequences of smaller bragging numbers in the store, as signs of the App Store’s maturity. When Apple debuted the iOS App Store, one of its main challenges was justifying the very idea of an app store. Every enthusiastic update on the number of apps, or the amount of revenue generated by them, seemed almost paranoiacally intent on proving the concept of the App Store correct.

Apple’s willingness to now intentionally deflate those metrics strikes me as a sign of cool confidence. The App Store concept has been proven valid. It was proven valid years ago, but Apple’s famous paranoia may not have allowed the defensive posture to relax until now. I’m optimistic that this change is only the first of many, in which Apple will focus less on arguing that the idea of an App Store is good enough, and more on the possibility that such an App Store can be insanely great. Hey, a guy can dream, right?

The Apple

MacRumors pointed out that Apple seems to be dropping “Store” from its store branding. The new flagship store in San Francisco, for example, is “Apple Union Square.” This has led to some criticism and guffawing from friends who now jokingly refer to any Apple Store as simply “The Apple.”

John Gruber thinks it makes sense to drop “Store” from branding, and compares Apple with other major brands whose stores do sound ridiculous with the appendage:

The “Store” branding only made sense when the concept was novel. Now that Apple’s stores are well established, it makes sense to drop the “Store”.

And:

No one goes to the Tiffany Store or Gucci Store, they just go to Tiffany or Gucci. It’s not even just a premium thing — you say Target and Walmart, not Target Store and Walmart Store.

The difference between these brands and Apple is that Apple’s identity has long been independent of the notion of a store. Calling it the “Apple Store” was not only important because the stores were a novelty, but because Apple is a brand that transcends retail. This may be true as well for Tiffany or Gucci, but when you think about those brands, I suspect you think of them in their retail context. Target and Walmart? They’re stores to the core of their being, so of course it would sound strange to brand them as such.

Apple is a company whose products, hardware and software, have historically been sold separately from its own retail presence. Going to “Apple” will never make sense the way it does to go to “Target” or even to “Tiffany’s.” Where “Store” has been dropped, it’s essential that some other qualifier take its place. Going to “Apple Union Square” makes sense. Asking a hotel concierge whether there is “an Apple nearby” makes as much sense as asking where the nearest “Ford” or “Honda” is.

Of course, the vast majority of people will probably still refer to any of Apple’s stores as “the iPhone store.”

Named Watch Faces

It’s starting to look as though it will be common with watchOS 3 to configure multiple watch faces for different life contexts. It will now be easier than ever to swipe between watch faces, and multiple faces of the same base kind can even be configured with different complications suitable to a particular context.

To drive this home, and to help remind users of the intention behind each face, it would be useful to be able to apply our own names to them:

“Exercise”
“Work”
“Weekend”
“School”
“Relaxing”
“Childcare”

Best of all, giving names to the configured watch faces provides Siri with a handle for seamless integration, such that you could easily switch between the named faces with phrases like:

“Hey Siri, put on your Game face.”

Anything I expect to do frequently on my Watch, I will value being able to do hands-free with Siri. In order to achieve that, everything I would like to switch to, activate, or deactivate, needs to have a name. I wrote previously about this in Labeled Siri Alarms, so I guess my head is in a place where I’m looking for every opportunity to enlist Siri’s help. This is one area where I think I would use such help many times per day.

(Radar #26883175)

Labeled Siri Alarms

I’ve written before that Siri, in spite of its flaws, is improving quickly. To that end, I occasionally challenge myself to think of ways that it might make my repeated tasks even more effortless.

I alternate mornings with my wife, taking either our 4-year-old, Matthew, to school, or our 7-year-old, Henry, to a different school. The two responsibilities come with different wake-up times, so I have two pre-set alarms in my phone: one for 7:00AM, and one for 7:45AM.

On a typical weeknight I set one or the other, either by tapping into the Clock app and toggling it on, or by asking Siri to, for example, “wake me up at 7:45AM.” It’s smart enough, at least, to notice the existing alarm, and doesn’t set a redundant one. Today I thought it would be nice to add labels to my alarms, so that when I’m awakened, any doubt about what my responsibilities are will be quickly clarified:

Image of Clock app configured with labeled alarm

Having labeled the alarms, I thought I’d test Siri’s prowess: “Hey Siri, set my Henry School Day alarm.” Sigh. No dice:

Image of Siri interface with request to set an alarm not understood.

Here’s an example where Siri is frustratingly dense about something that seems so obvious, at least in retrospect. All is not lost, though. It turns out that although it doesn’t have a clue what “my Henry School Day alarm” is, it knows full well what “the Henry School Day alarm” is:

Image of Siri interface accepting request to set an alarm by label name.

I’ve filed Radar #26696594 requesting that “Siri should recognize MY alarms as readily as THE alarms”.

Not Perfected Here

Almost two years ago, Apple announced Swift, their next-generation language. Politically, it seems poised to imminently succeed Objective-C as the de facto standard language for Apple platforms. Practically, there are many questions.

Since the language’s debut, people have been pointing out the impedance mismatches between Swift, a type-safe and statically compiled language, and Objective-C, an unusually dynamic, almost “anything goes” language in which objects can be magically bound to do one another’s bidding at run time as opposed to compile time.
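
To make the mismatch concrete, here is a small sketch, with hypothetical class and method names, of the string-based runtime dispatch that Objective-C makes routine, expressed through the Foundation bridge in Swift, next to the statically bound call Swift itself favors:

    import Foundation

    class Greeter: NSObject {
        @objc func wave() { print("hello") }
    }

    let greeter = Greeter()

    // Objective-C style: the method is looked up by name at run time. The
    // name could come from a nib, a plist, or a server; nothing is checked
    // at compile time, and a typo fails only when the code actually runs.
    let selector = NSSelectorFromString("wave")
    if greeter.responds(to: selector) {
        _ = greeter.perform(selector)
    }

    // Swift style: the same call, bound and type-checked at compile time.
    greeter.wave()

Cocoa patterns like target/action, bindings, and key-value observing are built on exactly that first kind of by-name lookup, which is the void being debated.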

Brent Simmons recently ignited a round of (mostly) thoughtful analysis about the dynamic shortcomings of Swift, and whether Apple will eventually fill the dynamic void left by Swift, when they inevitably update their core frameworks to accommodate the new language.

Paul Kim, a long-time Mac developer whose experience with AppKit goes back to the NeXT days, wrote a balanced defense of Objective-C’s dynamism. He concludes with an acknowledgement that many of the people who might contribute practical feedback to the Swift team, about how it would best serve app developers, are too busy shipping apps:

What I do see makes me worry that it’s not the experienced app-writers that are being heard. Unfortunately, many of us are too busy to sit in the various mailing lists and forums. Yes, ok, we lose. But we all lose if those creating the platform don’t draw from the experience of those who have built upon it successfully.

It would be OK if a majority of established 3rd-party app developers couldn’t embrace and offer feedback about Swift, so long as Apple were buying into the language internally and ironing out all the kinks on their own. But how can they? Apple is in all likelihood the single largest owner of valuable, time-tested, customer-pleasing Objective-C code. Thus they face, as a company, the same challenges that Paul Kim and many other developers do: they can’t afford to put everything on the line, diverting all their attention from Objective-C, just to embrace a language that is not completely mature, and which doesn’t offer the same features as its predecessor.

If Apple’s long-time Mac and iOS developers, and the company’s own internal framework developers, are not yet empowered to dive head-long into Swift development, who is? Early adopters. Very early adopters. God bless them. Unfortunately, Swift’s early adopters will become experts in precisely the techniques that the language currently affords, as they aren’t motivated to push the language in the direction it needs to move to accommodate long-time Objective-C developers, and … Apple itself.

I have to imagine Apple is stewing on this problem internally. Hopefully they have some brilliant thinking on the subject, some of which will come to light at WWDC in June. In the meantime, Apple faces an unenviable conundrum: the growing number of Swift experts in the world are predominantly neither employed by Apple, nor familiar with the design patterns that have set Apple’s frameworks apart from the competition for at least 15 years.

Swift is a fascinating, beautiful language. How will it evolve to prove superior, in the long run, to Objective-C? By providing a suite of impedance-matched frameworks that fulfill all the needs of current iOS and Mac developers. How will those frameworks come to be in an environment where Apple’s most experienced framework developers, and Apple’s most experienced 3rd-party developers, are steeped in a tradition that favors patterns not supported by Swift? I’ll be very eager to see how that plays out.

Twenty Years

Ten years ago, I reflected on having been hired by Apple ten years before, when I was just twenty years old. “The Start Date“:

The day you start at Apple, be it as an administrative assistant or the CFO, you’re joining a proud legacy, and you know it. I still remember the thrill of receiving that offer letter. I grinned wide, stared down at the relatively meager salary I’d be earning, and signed away my agreement to start in two weeks.

That makes twenty years. Today, in fact.

Many of my colleagues from Apple in the mid-1990’s have moved on, as I did. But a very significant number of them remain. In a world where jumping from job to job has become expected of almost everybody, Apple maintains a curious lifelong employment appeal to many people.

Apple has always possessed an ineffable uniqueness among its corporate peers, from the moment of its founding as a scrappy, barely funded, home-made computer manufacturer, to forty years later, when its value and influence are almost impossible to comprehend.

This year, many new young people will stare down at the relatively meager salary they’ll be earning, sign away their agreement to start in two weeks, and be in for the twenty-year ride of their lives.

When Worse Is Better

On the latest episode of John Gruber’s The Talk Show, guest Ben Thompson tries to identify the ways in which Amazon’s Alexa speech recognition is better than Apple’s Siri.

One of his key points was that Alexa, by being theoretically less capable than Siri, manages to avoid the heightened expectations and subsequent disappointment that users feel when Siri fails to listen as well as it promises to. It may be less competent overall, but what it does do it does predictably and well.

A comparison that came immediately to my mind was Apple’s mid-1990’s failure with the Newton handheld computer. The ambitious handwriting recognition was pure magic when it worked, but failed to work a significant amount of the time. Meanwhile, Palm took a more pragmatic approach with Graffiti, an overtly limited interpretation of the Roman alphabet, Arabic numerals, and a few other widely used symbols. By dramatically diminishing the magic of its handwriting recognition technology, Palm dramatically increased its reliability. Users seemed to appreciate this compromise, as Newton sputtered, and Palm Pilots went on to define the whole genre of hand-held digital assistants.

As much as I like this comparison, I don’t think Siri is doomed in the same way Newton was. Handwriting recognition was a primary interface on Newton, while with iOS devices it’s usually considered an augmenting interface. You can, and many people do, get plenty of use out of an iPhone without ever relying upon Siri. Siri is also nowhere near as unreliable, in my opinion, as Newton handwriting was. I use Siri on a daily basis and, perhaps because I’ve rarely tried anything better, I still find it an overall boon to productivity.

I think we are still in the early days of speech recognition, which feels funny to write because back in the mid-1990’s when Apple was failing to perfect handwriting recognition, they were doing the very same thing with speech recognition. But as John and Ben said on the show, none of the existing technologies, whether from Apple, Amazon, Microsoft, Google, Nuance, or others, is even close to perfect. There is so much interest now in the technology, that it’s possible to at least imagine extremely fast, reliable, predictable speech recognition becoming the norm. Whether the standard ends up being a shorthand-based approach such as Amazon is taking with Alexa, or a more ambitious artificial intelligence, may depend on which company can close the gap faster using one approach or the other.