Apple Said to Reap Fees From Banks in New Payment System – Bloomberg

A few months after Apple released the first iPhone, they cut the price by $200 because…well, we don’t know. It could be that sales were below what they expected, or it could be that the phone was doing better than expected and they (and/or AT&T) wanted to build on the momentum.

Either way, the fact that the retail price of the phone was subsidized by other revenue streams is part of what made the price cut possible.

I note that a similar opportunity exists with the new Apple Watch, which plays an important role in Apple Pay, their new mobile and internet payment system. As Bloomberg reports, Apple Said to Reap Fees From Banks in New Payment System.

Software Developers

I note, with some interest, a collection of blog posts from earlier this summer.

It starts with a post by a developer named Ed Finkler titled “The Developer’s Dystopian Future.” An excerpt:

I find myself more and more concerned about my future as a developer.

As I’ve gotten older, I don’t stay up so late anymore. I spend more time with my family than on the computer (despite what they may say). I help in my community on a local school board, and I organize events for an open source interest group I started.

I think about how I used to fill my time with coding. So much coding. I was willing to dive so deep into a library or framework or technology to learn it.

My tolerance for learning curves grows smaller every day. New technologies, once exciting for the sake of newness, now seem like hassles. I’m less and less tolerant of hokey marketing filled with superlatives. I value stability and clarity.

Finkler’s piece prompted Marco Arment to echo similar feelings:

I feel the same way, and it’s one of the reasons I’ve lost almost all interest in being a web developer. The client-side app world is much more stable, favoring deep knowledge of infrequent changes over the constant barrage of new, not necessarily better but at least different technologies, libraries, frameworks, techniques, and methodologies that burden professional web development.

I even get some of this feeling with Swift so far. I’m about to ship Overcast, a sizable, complex Objective-C app that I’ve been writing for nearly two years with a huge variety of code, from high-level interfaces to low-level audio handling.

Arment’s piece prompted a post by Matt Gemmell, again with similar themes:

There’s a quote in Marco’s piece that struck a chord with me:

I’ve lost almost all interest in being a web developer. The client-side app world is much more stable, favoring deep knowledge of infrequent changes over the constant barrage of new, not necessarily better but at least different technologies, libraries, frameworks, techniques, and methodologies that burden professional web development.

I used to feel exactly the same way, and if I was still a developer, I’d also still feel that way.

Gemmell continues:

The truth is, I actually find the software development world really frightening.

I’m not worried about my skills, either – I’m pretty sure they’re still sharp enough, and fairly relevant – but rather the surrounding environment. The current context.

There’s a chill wind blowing, isn’t there? I know we don’t talk about it much, and that you’re crossing your fingers and knocking on wood right now, but you do know what I mean.

If you attend an iOS/Mac dev meetup and hang around long enough, you’ll start to hear the whispers and the nervous laughter. There are people making merry in the midst of plenty, but each of them occasionally steps away to the edge of the room, straining to listen over the music, trying to hear the barbarians at the gates. There’s something wrong with the world, Neo.

We’ve had our (latest) software Renaissance in the form of the mobile platforms and their App Stores, and I think the software biz is now starting to slide back towards consolidation and mega-corps again. It’s not a particularly great time to be an indie app developer anymore.

Small shops are closing. Three-person companies are dropping back to sole proprietorships all over the place. Products are being acquired every week, usually just for their development teams, and then discarded.

The implacable, crushing wheels of industry, slow to move because of their size, have at last arrived on the frontier. Our frontier, or at least yours now. I’ve relinquished my claim.

As I read these, I started to wonder about the age of the authors. Arment, I knew, was relatively young; in fact, at the end of his piece, he notes that he’s 32. From his LinkedIn, I see that Gemmell is a little older, but like Arment, probably started his tech career sometime after the dotcom crash and recession. Finkler is older, and it looks like he started his career in the heady days of the dotcom boom (though well away from its epicenters).

Interesting to consider them alongside Dave Winer, who got his start as a software developer and entrepreneur in the late 70s, the beginning of the microcomputer era.

Winer did most of his software development for 20 years in Frontier, a language and development environment he first created as part of one of his startups in the late 1980s, until the deprecation of key Windows and MacOS APIs Frontier depended on led him to embrace JavaScript in the past few years. I’m sure Winer can sympathize with their growing frustration with newness for the sake of newness, and he’s clearly seen several business and technology cycles over the years. And yet, as he says, he’s “Still Digging.”

What can young and not-quite-as-young software developers like Finkler, Arment, and Gemmell learn from Winer?

Rapidics Accelerates my Machine Learning Learning

For the past week or so, I’ve been digging into the topic of machine learning. It’s something I’ve been interested in for a long time. I’ve done some reading on the subject, and collected links to informational resources and open source tools for years, but I long ago reached the point of diminishing returns, where I either needed to start actually experimenting with it on my own, or otherwise needed a specific reason to learn more.

Thanks to a chance meeting at a coffee shop, I now have the latter. Mark Seligman of Rapidics has helped me understand more about the application of machine learning techniques. What I find particularly interesting is that Mark and his company are in the business of providing machine learning infrastructure. They are doing some of the heavy lifting to make machine learning easier and more useful for others by taking a generally applicable algorithm called Random Forest and creating a solid, fast multicore implementation called Arborist that works with the popular, open source R statistical computing package. By doing so, they’ve achieved major speedups over the standard R implementation, and efficiency and scaling advantages over many of the coarse-grained approaches to speeding up R on parallel hardware.
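To give a flavor of what Random Forest actually does: the core idea is to train many decision trees, each on a bootstrap resample of the data, and have them vote on predictions. This is a toy pure-Python sketch of that idea (using single-split “stumps” in place of full trees, and a made-up toy dataset), not anything resembling Rapidics’ Arborist or the standard R implementation:

```python
import random
from collections import Counter

def best_stump(X, y):
    """Find the single (feature, threshold) split that best separates labels."""
    best, best_err = None, float("inf")
    for f in range(len(X[0])):
        for t in sorted({row[f] for row in X}):
            left = [yi for row, yi in zip(X, y) if row[f] <= t]
            right = [yi for row, yi in zip(X, y) if row[f] > t]
            if not left or not right:
                continue
            # Each side predicts its majority label; error = misclassified count
            lmaj = Counter(left).most_common(1)[0][0]
            rmaj = Counter(right).most_common(1)[0][0]
            err = sum(v != lmaj for v in left) + sum(v != rmaj for v in right)
            if err < best_err:
                best_err, best = err, (f, t, lmaj, rmaj)
    return best

def train_forest(X, y, n_trees=25, seed=42):
    """Train each stump on a bootstrap resample (sampling rows with replacement)."""
    rng = random.Random(seed)
    forest = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(X)) for _ in range(len(X))]
        forest.append(best_stump([X[i] for i in idx], [y[i] for i in idx]))
    return [s for s in forest if s is not None]

def predict(forest, row):
    """Majority vote across all the trained stumps."""
    votes = Counter()
    for f, t, lmaj, rmaj in forest:
        votes[lmaj if row[f] <= t else rmaj] += 1
    return votes.most_common(1)[0][0]

# Toy data: class 0 clusters near (1, 1), class 1 near (8, 8)
X = [[1, 1], [1, 2], [2, 1], [8, 8], [8, 9], [9, 8]]
y = [0, 0, 0, 1, 1, 1]
forest = train_forest(X, y)
```

The resampling-plus-voting structure is also why the algorithm parallelizes so naturally: each tree is independent of the others, which is exactly the kind of work a multicore implementation can spread across processors.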

What’s particularly interesting to me is that they’ve sped things up enough that it could fundamentally change the way people use Random Forest for machine learning, while at the same time making it useful to people who haven’t even heard of machine learning today. That makes the subject triply interesting to me, because I’m learning about machine learning and getting to think about infrastructure and user experience.

So, thanks to Mark, and his partner Mark, for the education!