
On Moore’s Law and the Apple Watch

Today, Jean-Louis Gassée shared some thoughts on forecasting demand for the Apple Watch. He is thoughtful about the impact Moore’s Law will have on the higher-end models, writing:

But the biggest question is, of course, Moore’s Law. Smartphone users have no problem upgrading every two years to new models that offer enticing improvements, but part of that ease is afforded by carrier subsidies (and the carriers play the subsidy game well, despite their disingenuous whining).

There’s no carrier subsidy for the Apple Watch. That could be a problem when Moore’s Law makes the $5K high-end model obsolete. (Expert Apple observer John Gruber has wondered if Apple could just update the watch processor or offer a trade-in — that would be novel.)

Gassée’s comments were picked up elsewhere, including The Loop. So far, though, I haven’t seen anyone consider the impact of the WatchKit software architecture on this question. So I commented with my own thoughts, which I’m revising and publishing here.

The current version of WatchKit only gives developers access to the watch as a smart terminal communicating with code running on an iOS device over Bluetooth Low Energy. In this model, the processing demands on the watch are pretty flat across different apps and over hardware generations, since they are bounded primarily by display resolution (which is itself bounded by the optical characteristics of our eyeballs) and UI update rates. As such, Moore’s Law improvements would probably accrue to battery life and component cost. I doubt that will have a significant impact on the end-user experience; I suspect that the screen and wireless radios make up a large portion of the power budget, and neither is tightly linked to Moore’s Law improvements. As for expense, I doubt that annual SoC price reductions will have much impact on anything but the lowest-end models (if they impact them at all).

We’ll see what happens “later next year” when Apple allows native apps to run on the watch. My guess is that their execution will be tightly managed, as it was for web apps in iOS 1.0 and for native apps when they were enabled by iOS 2.0. As a result, Apple will have a lot of ability to manage the way the platform and apps take advantage of Moore’s Law.

I’d guess that there will still be major generational discontinuities, but they will come every 5–10 years, rather than every year, as they do with iOS devices. That still creates issues for a device that some may expect to last a lifetime, but perhaps that assumption must itself be revisited.

For all the talk about the timelessness of high-end timepieces among analog watch aficionados, it isn’t all that relevant in the larger sense. The fundamental questions, when deciding whether to spend thousands of dollars on a watch, are: do I have thousands of dollars to spend on a watch when a $50 watch would tell me the time just as well, and, if so, does spending thousands of dollars on a watch feel good to me (which is sometimes a question of whether it sends the “right” message to others)?

I suspect that of all the people who today spend thousands of dollars on a luxury watch because of its quality and timelessness, a significant portion could find a different reason to spend thousands of dollars on a watch. Alongside them are a significant number of other people who could spend thousands on a luxury watch but are unpersuaded by whatever appeal drives traditional high-end watch buyers. Some of these people could find other reasons to buy an Apple Watch.

I am, at this point, unlikely to spend thousands, or even hundreds, on any watch, particularly since I finally gave up on wrist watches altogether when I started carrying a “pocket watch” (read: cell phone). I will not be surprised, though, if I am wearing a lower-end Apple Watch a year from now. As a kid in Utah, when I used to go skiing, my friend and I would be freezing our asses off riding the ski lift, and would find ourselves checking our watches for the temperature. Our watches didn’t tell the temperature, and, as I recall, there was not yet any (affordable) watch on the market that did. Still, it was natural to us that a wrist device should provide useful information like that. So, I’m willing to give the (lower-end) Apple Watch a try.


The future of nations

This morning, in my car on the way to an appointment, I was listening to NPR’s The Takeaway. I felt like I caught a glimpse of the future of the nation-state in a common thread connecting their story about Indian Prime Minister Narendra Modi’s recent US visit, and another about ISIS.

In the piece on Modi, they talked with Arvind Rajagopal, a sociologist and media theorist at New York University. Rajagopal talked about how Modi made a strong nationalistic appeal to the Indian diaspora. What was notable was that the appeal wasn’t for them to come back to India to support its development, but rather to support India’s interests wherever they were.

In the piece on ISIS, they interviewed Louise Shelley, the executive director of the Terrorism, Transnational Crime and Corruption Center at George Mason University. Shelley talked about how ISIS draws on age-old smuggling routes to finance and supply its efforts.

What struck me is that both of these stories hint at an interesting disconnect between geography, identity, power, influence, and the economy, things that, in my naive view, are key ingredients of the classical nation-state. And yet, here they are, present in forms that don’t quite fit that mold:

  • India, a nation with borders, redefined as a global nation with a geographic center and other aspects superimposed on other nation-states.
  • ISIS, a broad-based enterprise with aspirations of geographic borders whose assets are also superimposed on other nation-states.

And somewhere, in the same primordial goo, mixed with cryptocurrency, and social networks, and ragefaces, what else might emerge?

Intel thinks your gadgets need a bowl of their own.

For years, people have been “recharging” with a bowl after a long, hard day. Soon, their electronic devices will be able to do the same, thanks to Intel’s Humboldt County product design studio.

If it ships at the end of the year, as expected, Intel’s charging bowl will be the first product from Intel’s Northern California outpost to make it to market since the studio opened in 1972, just a few short years after Intel’s own founding in 1968.

When we asked about the long gestation period, Randy Redwood, a senior vice-president of consumer product development, replied that they’d pursued a number of important initiatives in the past forty years. “Some of them were fuckin’ killer, man,” then, after a long pause, Mr. Redwood continued, “…but they didn’t work out…such a bummer… WHAT, oh yeah, caramel corn. Excuse me, I gotta get to the cafeteria before they run out of caramel corn.”

On Sacrifice, iPhones and Devotion

A friend on Facebook (who works for Motorola) is doing his level best to help people understand what people sacrifice when they choose an iPhone 6 over one of Motorola’s latest offerings.

I can only shake my head in pity, for clearly it is he who needs to understand, to understand people, and sacrifice, and devotion. For iPhone users are a devoted people, and there is no devotion without sacrifice. So, iPhone users sacrifice. They sacrifice by paying more, but on its own, such a sacrifice is a hollow indulgence.

True devotion requires daily sacrifice, and so iPhone users suffer fewer pixels on their displays and in their selfies. Perhaps, most of all, they suffer shorter battery life.

To the unenlightened, this sacrifice seems ridiculous, foolish, pathetic. Why would someone sacrifice their hard-earned money for a phone that is so clearly inferior? The answer is simple: their devotion, their sacrifice, is repaid daily, a thousandfold. As the battery gauge dwindles, they are reminded that, so too, the daylight dwindles. As the gauge turns red, they are reminded of the autumn of their years. As the screen blinks and goes black, they are reminded that life is precious, that the breadth and depth of life needs to be explored, and embraced, that life cannot be captured in pixels, no matter the quantity.

As for me, I only rarely run short of battery life on my old iPhone, so, to show my devotion as I anticipate the arrival of my new iPhone, I am collecting all the 30-pin cables I’ve accumulated over the years, cables that will not work with my new iPhone. Once collected, I’ll straighten them carefully, gather them together, and then slowly, methodically, braid them into a scourge, so I might flagellate my skin and demonstrate my devotion.

Apple Said to Reap Fees From Banks in New Payment System – Bloomberg

A few months after Apple released the first iPhone, they cut the price by $200 because…well, we don’t know. It could be that sales were below what they expected, or it could be that it was doing better than expected and they (and/or AT&T) wanted to build on the momentum.

Either way, the fact that the retail price of the phone was subsidized by other revenue streams is part of what made the price cut possible.

I note that a similar opportunity exists with the new Apple Watch, which plays an important role in Apple Pay, their new mobile and internet payment system. As Bloomberg reports, Apple Said to Reap Fees From Banks in New Payment System.

Software Developers

I note, with some interest, a collection of blog posts from earlier this summer.

It starts with a post by a developer named Ed Finkler titled “The Developer’s Dystopian Future.” An excerpt:

I find myself more and more concerned about my future as a developer.

As I’ve gotten older, I don’t stay up so late anymore. I spend more time with my family than on the computer (despite what they may say). I help in my community on a local school board, and I organize events for an open source interest group I started.

I think about how I used to fill my time with coding. So much coding. I was willing to dive so deep into a library or framework or technology to learn it.

My tolerance for learning curves grows smaller every day. New technologies, once exciting for the sake of newness, now seem like hassles. I’m less and less tolerant of hokey marketing filled with superlatives. I value stability and clarity.

Finkler’s piece prompted Marco Arment to echo similar feelings:

I feel the same way, and it’s one of the reasons I’ve lost almost all interest in being a web developer. The client-side app world is much more stable, favoring deep knowledge of infrequent changes over the constant barrage of new, not necessarily better but at least different technologies, libraries, frameworks, techniques, and methodologies that burden professional web development.

I even get some of this feeling with Swift so far. I’m about to ship Overcast, a sizable, complex Objective-C app that I’ve been writing for nearly two years with a huge variety of code, from high-level interfaces to low-level audio handling.

Arment’s piece prompted a post by Matt Gemmell, again with similar themes:

There’s a quote in Marco’s piece that struck a chord with me:

I’ve lost almost all interest in being a web developer. The client-side app world is much more stable, favoring deep knowledge of infrequent changes over the constant barrage of new, not necessarily better but at least different technologies, libraries, frameworks, techniques, and methodologies that burden professional web development.

I used to feel exactly the same way, and if I was still a developer, I’d also still feel that way.

Gemmel continues:

The truth is, I actually find the software development world really frightening.

I’m not worried about my skills, either – I’m pretty sure they’re still sharp enough, and fairly relevant – but rather the surrounding environment. The current context.

There’s a chill wind blowing, isn’t there? I know we don’t talk about it much, and that you’re crossing your fingers and knocking on wood right now, but you do know what I mean.

If you attend an iOS/Mac dev meetup and hang around long enough, you’ll start to hear the whispers and the nervous laughter. There are people making merry in the midst of plenty, but each of them occasionally steps away to the edge of the room, straining to listen over the music, trying to hear the barbarians at the gates. There’s something wrong with the world, Neo.

We’ve had our (latest) software Renaissance in the form of the mobile platforms and their App Stores, and I think the software biz is now starting to slide back towards consolidation and mega-corps again. It’s not a particularly great time to be an indie app developer anymore.

Small shops are closing. Three-person companies are dropping back to sole proprietorships all over the place. Products are being acquired every week, usually just for their development teams, and then discarded.

The implacable, crushing wheels of industry, slow to move because of their size, have at last arrived on the frontier. Our frontier, or at least yours now. I’ve relinquished my claim.

As I read these, I started to wonder about the ages of the authors. Arment, I knew, was relatively young; in fact, at the end of his piece, he notes that he’s 32. From his LinkedIn, I see that Gemmell is a little older, but like Arment, he probably started his tech career sometime after the dotcom crash and recession. Finkler is older, and it looks like he started his career in the heady days of the dotcom boom (though well away from its epicenters).

It’s interesting to consider them alongside Dave Winer, who got his start as a software developer and entrepreneur in the late 70s, at the beginning of the microcomputer era.

Winer did most of his software development over 20 years in Frontier, a language and development environment he first created as part of one of his startups in the late 1980s, until the deprecation of key Windows and MacOS APIs that Frontier depended on led him to embrace JavaScript in the past few years. I’m sure Winer can sympathize with their growing frustration with newness for the sake of newness, and he’s clearly seen several business and technology cycles over the years. And yet, as he says, he’s “Still Digging.”

What can young, and not-quite-as-young, software developers like Finkler, Arment, and Gemmell learn from Winer?