Monthly Archives: September 2014

The future of nations

This morning, in my car on the way to an appointment, I was listening to NPR’s The Takeaway. I felt like I caught a glimpse of the future of the nation-state in a common thread connecting their story about Indian Prime Minister Narendra Modi’s recent US visit, and another about ISIS.

In the piece on Modi, they talked with Arvind Rajagopal, a sociologist and media theorist at New York University. Rajagopal talked about how Modi made a strong nationalistic appeal to the Indian diaspora. What was notable was that the appeal wasn’t for them to come back to India to support its development, but rather to support India’s interests wherever they were.

In the piece on ISIS, they interviewed Louise Shelley, the executive director of the Terrorism, Transnational Crime and Corruption Center at George Mason University. Shelley talked about how ISIS draws on age-old smuggling routes in order to finance and supply its efforts.

What struck me is that both these stories hint at an interesting disconnect between geography, identity, power, influence, and the economy, things that, in my naive view, are key ingredients in the classical nation-state. And yet, here they are, present in forms that don’t quite fit that mold:

  • India, a nation with borders, redefined as a global nation with a geographic center, and other aspects superimposed on other nation-states.
  • ISIS, a broad-based enterprise with aspirations of geographic borders whose assets are also superimposed on other nation-states.

And somewhere, in the same primordial goo, mixed with cryptocurrency, and social networks, and ragefaces, what else might emerge?

Intel thinks your gadgets need a bowl of their own.

For years, people have been “recharging” with a bowl after a long, hard day. Soon, their electronic devices will be able to do the same, thanks to Intel’s Humboldt County product design studio.

If it ships at the end of the year, as expected, Intel’s charging bowl will be the first product from Intel’s Northern California outpost to make it to market since it was opened in 1972, just a few short years after Intel’s own founding in 1968.

When we asked about the long gestation period, Randy Redwood, a senior vice-president of consumer product development, replied that they’d pursued a number of important initiatives in the past forty years. “Some of them were fuckin’ killer, man,” then, after a long pause, Mr. Redwood continued, “…but they didn’t work out…such a bummer… WHAT, oh yeah, caramel corn. Excuse me, I gotta get to the cafeteria before they run out of caramel corn.”

On Sacrifice, iPhones and Devotion

A friend on Facebook (who works for Motorola) is doing his level best to help people understand what people sacrifice when they choose an iPhone 6 over one of Motorola’s latest offerings.

I can only shake my head in pity, for clearly it is he who needs to understand, to understand people, and sacrifice, and devotion. For iPhone users are a devoted people, and there is no devotion without sacrifice. So, iPhone users sacrifice. They sacrifice by paying more, but on its own, such a sacrifice is a hollow indulgence.

True devotion requires daily sacrifice, and so iPhone users suffer fewer pixels on their displays and in their selfies. Perhaps, most of all, they suffer shorter battery life.

To the unenlightened, this sacrifice seems ridiculous, foolish, pathetic. Why would someone sacrifice their hard-earned money for a phone that is so clearly inferior? The answer is simple: their devotion, their sacrifice, is repaid daily, a thousandfold. As the battery gauge dwindles, they are reminded that, so too, the daylight dwindles. As the gauge turns red, they are reminded of the autumn of their years. As the screen blinks and goes black, they are reminded that life is precious, that the breadth and depth of life needs to be explored, and embraced, that life cannot be captured in pixels, no matter the quantity.

As for me, I only rarely run short of battery life on my old iPhone, so, to show my devotion as I anticipate the arrival of my new iPhone, I am collecting all the 30-pin cables I’ve accumulated over the years, cables that will not work with my new iPhone. Once collected, I’ll straighten them carefully, gather them together, and then slowly, methodically, braid them into a scourge, so I might flagellate my skin and demonstrate my devotion.

Apple Said to Reap Fees From Banks in New Payment System – Bloomberg

A few months after Apple released the first iPhone, they cut the price by $200 because…well, we don’t know. It could be that sales were below what they expected, or it could be that it was doing better than expected and they (and/or AT&T) wanted to build on the momentum.

Either way, the fact that the retail price of the phone was subsidized by other revenue streams is part of what made the price cut possible.

I note that a similar opportunity exists with the new Apple Watch, which plays an important role in Apple Pay, their new mobile and internet payment system. As Bloomberg reports, Apple Said to Reap Fees From Banks in New Payment System.

Software Developers

I note, with some interest, a collection of blog posts from earlier this summer.

It starts with a post by a developer named Ed Finkler titled “The Developer’s Dystopian Future.” An excerpt:

I find myself more and more concerned about my future as a developer.

As I’ve gotten older, I don’t stay up so late anymore. I spend more time with my family than on the computer (despite what they may say). I help in my community on a local school board, and I organize events for an open source interest group I started.

I think about how I used to fill my time with coding. So much coding. I was willing to dive so deep into a library or framework or technology to learn it.

My tolerance for learning curves grows smaller every day. New technologies, once exciting for the sake of newness, now seem like hassles. I’m less and less tolerant of hokey marketing filled with superlatives. I value stability and clarity.

Finkler’s piece prompted Marco Arment to echo similar feelings:

I feel the same way, and it’s one of the reasons I’ve lost almost all interest in being a web developer. The client-side app world is much more stable, favoring deep knowledge of infrequent changes over the constant barrage of new, not necessarily better but at least different technologies, libraries, frameworks, techniques, and methodologies that burden professional web development.

I even get some of this feeling with Swift so far. I’m about to ship Overcast, a sizable, complex Objective-C app that I’ve been writing for nearly two years with a huge variety of code, from high-level interfaces to low-level audio handling.

Arment’s piece prompted a post by Matt Gemmell, again with similar themes:

There’s a quote in Marco’s piece that struck a chord with me:

I’ve lost almost all interest in being a web developer. The client-side app world is much more stable, favoring deep knowledge of infrequent changes over the constant barrage of new, not necessarily better but at least different technologies, libraries, frameworks, techniques, and methodologies that burden professional web development.

I used to feel exactly the same way, and if I was still a developer, I’d also still feel that way.

Gemmell continues:

The truth is, I actually find the software development world really frightening.

I’m not worried about my skills, either – I’m pretty sure they’re still sharp enough, and fairly relevant – but rather the surrounding environment. The current context.

There’s a chill wind blowing, isn’t there? I know we don’t talk about it much, and that you’re crossing your fingers and knocking on wood right now, but you do know what I mean.

If you attend an iOS/Mac dev meetup and hang around long enough, you’ll start to hear the whispers and the nervous laughter. There are people making merry in the midst of plenty, but each of them occasionally steps away to the edge of the room, straining to listen over the music, trying to hear the barbarians at the gates. There’s something wrong with the world, Neo.

We’ve had our (latest) software Renaissance in the form of the mobile platforms and their App Stores, and I think the software biz is now starting to slide back towards consolidation and mega-corps again. It’s not a particularly great time to be an indie app developer anymore.

Small shops are closing. Three-person companies are dropping back to sole proprietorships all over the place. Products are being acquired every week, usually just for their development teams, and then discarded.

The implacable, crushing wheels of industry, slow to move because of their size, have at last arrived on the frontier. Our frontier, or at least yours now. I’ve relinquished my claim.

As I read these, I started to wonder about the age of the authors. Arment, I knew, was relatively young; in fact, at the end of his piece, he notes that he’s 32. From his LinkedIn, I see that Gemmell is a little older, but like Arment, probably started his tech career sometime after the dotcom crash and recession. Finkler is older, and it looks like he started his career in the heady days of the dotcom boom (though well away from its epicenters).

Interesting to consider them alongside Dave Winer, who got his start as a software developer and entrepreneur in the late 70s, the beginning of the microcomputer era.

Winer did most of his software development for 20 years in Frontier, a language and development environment he first created as part of one of his startups in the late 1980s, until the deprecation of key Windows and MacOS APIs Frontier depended on led him to embrace JavaScript in the past few years. I’m sure Winer can sympathize with their growing frustration with newness for the sake of newness, and he’s clearly seen several business and technology cycles over the years. And yet, as he says, he’s “Still Digging.”

What can young and not-quite-as-young software developers like Finkler, Arment, and Gemmell learn from Winer?