Friday, November 9, 2012

Carl Sagan as a Candle in the Darkness.

I woke up this morning, checked the interwebs, and saw #carlsagan was trending. Curious, I clicked through and saw that today, November 9, 2012, would have been Carl's 78th birthday. This is personally relevant to me, and a flood of memories instantly filled my head. I knew Carl personally, and that turn of events happened like this...

When I was a teenage boy growing up in the frozen depths of Northern Minnesota in the 70's, I took a strong interest in science. There were four reasons for this:

  1. There were stars in the sky at night. There was practically nothing around my home in Minnesota, so the sky was unencumbered by man-made light. The sky was brilliantly lit. I saw things that I am sure most don't see anymore: meteor showers, lunar eclipses, the aurora borealis, and the big, beautiful Milky Way. It is a vivid memory even now, looking back 40 years, that I have never seen captured properly in photos or video. (Well, ok, maybe here.)
  2. Dale Gibbs. Dale was my astronomy teacher in high school - yes, we had an astronomy class. Several, actually. Dale taught me about a life in science both within and outside the frozen northland of Minnesota. His passion for what he did was infectious - a wonderful man with a quick sense of humor, a trait we shared. Dale, if you're out there, it's a debt I can never repay. Thank you.
  3. Star Trek. Don't laugh, you snarky people. I think most people my age would put that out there - and there's a reason why NASA christened the first Space Shuttle with the name Enterprise. It was a show that taught those of us trapped in a pre-internet world (back when "nerd" was not synonymous with "cool") that there were others out there who thought as we thought, believed as we believed. Not just about science, but about cultural integrity and putting aside racism and outdated beliefs. IDIC lives.
  4. Carl Sagan. When I was a teen, Carl started coming into his own, not just as an astronomer, but as a spokesperson for the power of science to change humanity. It was a grandiose thought, and one I had never heard expressed in the way he expressed it. To him, science was not a religion - it was a replacement for religion. He held the steadfast notion that humanity's belief in a higher power was really an attempt on the part of an adolescent human race to understand that which was unknowable. Paraphrasing from scripture, Carl felt it was time to put away childish things. To push aside a veil kept in place by dogma and really see what was going on. When he subtitled the now-famous "Cosmos" a "Personal Voyage," I found out later that he was serious. It was personal for him.

In 1984, I left graduate school with degrees in computer science, math and astrophysics - these things were important to me, they mattered. My first position in the working world was as a research science assistant at JPL - I was stationed at Brown University in Rhode Island, as part of an image processing and remote sensing team led by Jim Head, a Distinguished Professor of Geological Sciences at Brown.

I was thrilled. I was in it - working on spacecraft data, designing and building image processing systems, and meeting people I had idolized in my youth: Hal Masursky, Larry Soderblom, and Carl Sagan. (While others were following sports personalities in school, I was following these guys.) I was placed on two projects out of the gate: the Galileo spacecraft (which was to be launched in 1989), and the Russian Venus (Venera) landers, which were major milestones in the 80's. I was going to the Lunar and Planetary Science meetings, IKI in the former Soviet Union, JPL planning meetings, NASA planning seminars, and, of course, Cornell University.

It was in 1987 when I first met Carl - I was at Cornell for a Galileo planning meeting. Carl wasn't on that project, but he was a professor at Cornell, so of course he knew everyone on the Galileo project, and so stopped by. Actually, when I first met him he was fiddling with a VCR connected to a TV set, trying to ready his presentation. I didn't know who he was - his back was to me, and he was hunched over the old-timey video recorder making frustrated noises. I bent down to help him out, and let him know that it needed to be on channel 3 (remember that?). He laughed, and I immediately recognized him.

His participation in the meeting was interesting, especially looking back through a 2012 lens. In today's world, we understand the importance of marketing, social media and other tools to promote a cause. While some of it is frivolous, awareness of a cause is important in supporting that cause. In 1987, this was a concept that people relegated to a that's-how-you-sell-coca-cola mindset, not a that's-how-you-raise-awareness one. But this was 1987 - the Challenger had exploded the previous year, and the public was questioning our involvement in space exploration specifically, and in a larger way science in general. People needed to see what we in the field saw - they needed to feel why we were doing this.

Carl, more than anyone else on the Galileo science team, knew that instinctively in 1987. He came to us and proposed something preposterous. The Galileo spacecraft was forced to take a long-duration flight, due to restrictions that were placed on the craft at the time. Galileo had a small, radioactive payload to power it in the icy cold of the deep solar system - it was not understood at the time how tiny this payload was. The public heard "radioactive," and refused to allow Galileo the fuel required for a direct flight to Jupiter, in case that fuel ignited and exploded in the atmosphere like Challenger. There was no need for this fear for a number of reasons, but there was no talking Congress out of it: Galileo had to come up with a low-power way to get to Jupiter.

The Galileo engineering team came up with just such a solution: the long way. Galileo would be launched towards the inner solar system (yes, the wrong way), then use a Venus flyby and two earth flybys to play "crack the whip." (I told you science was cool.) The resulting final velocity would get it to Jupiter - but after several years instead of several months.
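The "crack the whip" trick can be sketched with a toy calculation. This is an idealized one-dimensional flyby, not actual Galileo trajectory math - the function and numbers are mine, purely for illustration:

```python
# Illustrative sketch: why a gravity-assist flyby adds speed.
# In the planet's frame the encounter is elastic -- the spacecraft
# leaves with the same speed it arrived with, only redirected.
# Seen from the Sun's frame, the planet's own orbital velocity is
# added back in, so the craft can "steal" up to twice that velocity.

def flyby_speed_gain(v_craft, v_planet):
    """Idealized head-on 1-D flyby: craft at v_craft, planet moving
    toward it at v_planet (both speeds in the Sun's frame, km/s).
    Returns the craft's outbound speed after the encounter."""
    # Speed relative to the planet on approach:
    v_rel = v_craft + v_planet
    # Elastic turnaround in the planet's frame, then transform back
    # to the Sun's frame:
    return v_rel + v_planet   # = v_craft + 2 * v_planet

# Earth orbits the sun at roughly 30 km/s; in this idealized limit a
# craft arriving at 10 km/s could leave the encounter at 70 km/s.
print(flyby_speed_gain(10.0, 30.0))  # 70.0
```

Real flybys are three-dimensional and gain only a fraction of this limit per pass, which is why Galileo needed three of them.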

Carl proposed using the earth flybys to create what he called "a postcard from space." Take a picture, or series of pictures, of the earth and the moon from an angle no one had seen before: the angle of someone from somewhere else approaching our home. Show the world why we were doing this, how alone we are, how we are "on our own." How we need each other to cope, to comfort, and - more importantly - to understand where we are in the universe.

There was a lot of nerd-fighting about this. In the 80's, spacecraft had less processing power than even a modest cell phone: less memory, a poorer camera, and - most importantly here - less power and bandwidth. What Carl was proposing seemed, to the distinguished men and women of the Galileo science team, a frivolous waste of precious power, memory and bandwidth. There was no science to be gained here, so why do it?

Carl, of course, won out - Galileo's imaging system was turned on twice. Once on the first approach to earth, capturing the earth and moon in full color in a single frame... lit by the sun, at once both beautiful and fragile. I've included it here - you be the judge: was it worth waking Galileo from its sleep to take this photo?

The second set of photos was taken during the second approach to earth - by that point, the Galileo earth/moon shot had sparked something in people throughout the world, and it wasn't a far reach from there for Galileo to do even more. Enough photos of the earth in space were taken to assemble a small video. I've included it at the end.

Over the years, Carl became a casual friend - whenever we were in the same city, we'd sit down somewhere and talk about the world, science, and the public's changing attitudes. These were some of the best talks I've had in my personal and professional career. How often does someone get to know, really know, a childhood hero - and have that hero know him back?

As time went on, I left JPL and Carl got sicker. We lost touch, and I didn't want to intrude on the family as things got bad for him... I never reached out, and that always haunts me. He passed just before Christmas in 1996... it was pre-internet-everywhere, so I read about it in the paper.

I think about him often, and wonder (a lot) about what he would think of the modern world. He was spared the vision of the world descending into a darkness of the spirit and the intellect - total science graduates dropping; US schools appearing lower and lower in the "best in science" rankings; the rise of creationism; reading books being replaced by listening to sound bites, which in turn are replaced by 140 characters; religious jihads destroying lives and property; countries being felled as collateral damage; and so on. Would he still be the carrier of the candle of science, providing that light in the darkness?

I believe so - I believe that, somehow, it would re-energize him. The need for knowledge is still there, it's just being fed by very different sources. Carl was a voice of reason and logic and poetry, and he had a certain something that people gravitated towards. A frequent guest on the Johnny Carson show in his day, he would, I suspect, make use of these new forums: YouTube videos, G+ hangouts... in some sense, getting his voice out would be easier, not harder.

So - on this, his 78th birthday, it's left up to the rest of us. He was there when we needed him; he needs us now. We need us now. Protect that candle; keep it lit. Don't do it through name-calling and posturing and illogic - keep it lit with eloquence, with intellectual discourse... with facts. To paraphrase Douglas Adams, the world is strange and wonderful enough as it is, without having to invent any more of it.

Happy 78th, Carl. Thank you for this video and everything else you've done and taught us. We should have gotten you something...

Friday, October 5, 2012

For the Record: Why my iPhone is in my Drawer

Is This Just Another Anti-Apple Blog Post?

No...well, yes... but, no, not really. Really.

Over the years, I've used (and owned) a wide variety of gadgetry, from laptops to phones to tablets to over-the-top connected TV boxes... and, with the exception of my laptop, I've consistently steered away from Apple products as personal go-to devices. Unfortunately, in the Apple community this behavior on my part is always labeled as some weird form of wrong-thinking Apple hatred. To be fair, my personal opinion of Apple The Company has taken a strong turn towards the cellar in recent years, but my opinion of the products has pretty much remained the same. With the recent release of the iPhone 5, I figured that this would be an excellent time to state, for the record, what I think about Apple the company vs. Apple the product line, and why - unless something radically changes over the course of the next few years - I will be unlikely to be swayed by an Apple product offering that isn't a laptop.

Full Disclosure: "What's in my Bag" Right Now 

In the spirit of openness for framing a dialog like this, it probably makes sense to explain what tech I use in my daily life. Currently, I own several Apple devices, since my position as CTO at my past several companies has always included making sure that my engineering teams properly support their products. To that end, I personally own an iPhone 4S, an iPod touch, a second-gen iPad, an Apple TV, and a MacBook Air. With the exception of the Air, which is a genuinely great device that I use 10 hours a day, these devices only come out of my bag for testing my engineering teams' product lines, or for studying the iPhone workflow when designing a new application.

In my day-to-day personal world, my phone of choice is a Samsung Galaxy Nexus, my tablet is an Asus Nexus 7 (or an Asus Transformer Prime if I need a 10"), and my "other" laptop is an Asus Zenbook UX21. Attached to my television are a Windows Media Center computer, an Xbox, a PS3 and a Roku - all of which get used with roughly the same frequency.

My Life With Apple

As I've stated frequently, I'm 900 years old. There's a certain perspective on history one gets from that height, which comforts me amid an otherwise strange sea of aches and pains that weren't there a few years ago. One of those perspectives comes from watching the birth of personal computers, and of the companies that formed around them.

In the 1970's, I got my first regular, professional computer-job-related paycheck - it came from MECC, the Minnesota Educational Computing Consortium. MECC was a state-sponsored organization with the goal of putting computer facilities within reach of every K-12 student in the Minnesota state educational system. This was done through a variety of means (including teletypes and CRT terminals connected to a central mainframe in Minneapolis), as well as the new kid on the block: the fledgling personal computer.

I was the youngster (in a very literal sense) on a Request for Proposal committee to evaluate potential purchases of some of these new personal computers (called microcomputers back in the day, you young whippersnapper!), the winner of which was the venerable Apple ][, beating out Atari, Commodore and Radio Shack. Being awarded this state contract was a seminal moment in the history of the young company, as MECC placed a large order for the machines.

Throughout the next three decades, I've had various flavors of Apple products in my homes, alongside equivalent DOS (and, later, Windows) PCs. My interest in the Apple product line was fairly strong, but I never saw them as anything other than another tool for my research or work.

It wasn't long after this that the now-familiar story played out: Jobs did or did not walk into Xerox PARC and did or did not see the Xerox Star system running a desktop analog with a new input device called a mouse. At that point, Jobs either did or did not return for a second look, and did or did not offer two lead engineers positions at Apple. Whatever happened is almost irrelevant, as a little while after those alleged visits, the hyper-expensive Apple Lisa appeared on the scene.

Few outside of Palo Alto had seen anything like it, with its desktop analog and "intuitive" operating system - and, at $10K a pop, few would. However, from that device the Macintosh arose, and we can all trace our own personal Apple stories from there.

While entertained and impressed by the little all-in-one Macintosh, I was more interested in other machines of that era, such as the Atari ST and the Commodore Amiga, due to their color graphics capabilities. If I think back, for right or wrong, this was the moment I lost interest in Apple as a primary machine in my life.

However, what I did see from the sidelines was a growing marketing presence from Apple that I would later learn was driven almost entirely by Steve Jobs. Steve came and went and then came again, and drove Apple to become a genius center for product-driven marketing, industrial design and graphic design. When Apple got into the consumer electronics business with the first iPod, I was right there with it. I loved the device, but hated iTunes. (I had hoped iTunes would change over time, but, sadly, it never did. The iTunes of 2012 is basically the iTunes of 2001.) Until two years ago, I had some form of an iPod as my primary music player.

In 2007, I co-founded a company to support video advertising on cell phones. 2007, it turns out, was also the year of the first iPhone. While the iPhone was later held up as a harbinger of things to come, at the time the industry viewed it as more of a "neat trick." Our company supported it, of course, and kept an eye on it - but most of our clients were more interested in support for RIM devices, Windows Mobile phones, Nokia handsets and other devices. This would change, of course, but it took time. iPhones have been a staple in my bag ever since that era - but even though I gave them a shot, I do not use them as my primary device.

My Inability to Separate the Device from the Company 

There is a valid argument that goes something like "I can separate the artist from the art." I may not like a particular actor's personal view of the world outside of his movies, but I definitely love his filmography. (Yeah, I'm looking at YOU, Tom Cruise.)

In the case of media-consumption devices created by companies that control the media pipeline itself, it's much harder to apply the "separate the artist from the art" philosophy. Business decisions made by a company's executive team necessarily dictate specifics about a device: the user experience, media aggregation and distribution, etc. Amazon became one such company when it created the Kindle, as did Google and Apple when they created their phone OSes.

Apple in particular has pursued a course of decisions and business models that dictate a closed ecosystem ("walled garden") which each user tacitly agrees to join when he or she purchases an Apple product. This is nothing new; the principle of a closed ecosystem has been present with Apple products from nearly the beginning of the company: Apple computers, for instance, originally would only work with Apple graphics cards, printers, etc. In the hardware arena, this sort of closed environment nearly strangled Apple from within, since it dictated that one company produce every hardware component for the product line, rather than publishing a set of standards that other OEMs could use to produce compatible products.

In the world of consumer electronics, however, that same philosophy worked to Apple's advantage. By taking baby steps towards a closed ecosystem, Apple eventually walled off its music service to apply to just Apple products, and carried that philosophy forward into the worlds of the iPhone, iPad and Apple TV. Apple devices were designed to take advantage of this philosophy, and the lure of the shiny "it just works" devices was too strong for the masses.

Less of a Liberal Arts and Technology Intersection and More of a GroupThink Cul de Sac 

Apple became the successful powerhouse that Jobs always dreamed it would be. It also created a successful mythos in the process: "Apple creates beautiful technology," the conversation begins, "and therefore attracts those that appreciate it." People wanted into the Apple club, so they bought into the shiny, and became part of the artificial ecosystem.

The results were brilliant, as far as the Apple shareholders were concerned: people showed up to the party and they never left. To be fair, there was a point in time when they could not leave: iTunes purchases, for instance, were protected by DRM that kept people from moving their investment in music from an iPod/iTunes combination to, well, anywhere else. Eventually, that closed door was blown open as iTunes removed most of its DRM restrictions (for a fee, of course) from its music selection, but apparently there are still enough remnants of it that it pisses off Bruce Willis. (iTunes does still DRM-protect its video content.)

So, what's left to keep people in the ecosystem? Two things combine to keep people inside the gate: The Shiny plus the "Members Only" effect.

Apple products are pretty, there's no doubt about it. There is a status-symbol quality to Apple products - you pay a premium for it, but (and this is the brilliant part) it's not that much of a premium... even minimum-wage workers can afford it if they squirrel away their rent money. What this does, of course, is create an insular environment composed of members of a community who "get it" and who have paid the entrance fee. In a way, it's structured very much like a gated residential community: there's no reason to stray outside of that community, so there's no information coming in from outside the community. Without any information flowing in, the members of the community feel that they are a representative sample of people outside the community. (It was amusing to see my niece's face when I showed her that neither iPhones nor Apple laptops dominated their respective marketplaces.)

This effect is called GroupThink, and it's been the rationale attributed to everything from the handling of the Bay of Pigs to Watergate. William H. Whyte, Jr. coined the term in Fortune magazine in 1952, and the psychologist Irving Janis, who later popularized it, described its main principle with this quote:

The more amiability and esprit de corps there is among the members of a policy-making ingroup, the greater the danger that independent critical thinking will be replaced by groupthink, which is likely to result in irrational and dehumanizing actions directed against outgroups.

In other words, Apple keeps its user base together by making them feel like members of an elite club. Positive reinforcement from the rest of the community for decisions made by any community member allows the members to feel that every choice they make is rational, that they have all the information needed to make a rational decision, and that they therefore operate on a higher plane of reasoning than anyone not privy to the exclusive information obtained within the group. Any contradictory decisions or thought processes are ridiculed as silly or ill-informed, and so members' thoughts turn back inward towards the community. The effect is only broken when paradoxical evidence of discord develops within the group itself - discord that cannot be explained away without external, or outgroup, reasoning.

It is important at this point to emphasize that I do not for a moment think that everyone who owns an iPhone is a victim of GroupThink - but when I see brand loyalty trumping logic in otherwise logical people, I do have to raise an eyebrow. A good example that illustrates the effects of GroupThink applied to Apple owners relates to the release of the iPhone 4. This example works on a couple of levels:
  • Internal to Apple.
    The engineers at Apple are very smart people. I know many of them, and I respect those people and their judgement. However, something very serious happened internally at Apple to result in the production and subsequent release of the iPhone 4. A number of very, very smart people, including hardware engineers steeped in knowledge of radio technology and electromagnetics, got together and as a group came to the conclusion that, for the sake of design esthetics, it would be a good idea to take a radio antenna and place it outside the phone, in a spot where it would be in constant contact with human flesh and, in essence, be grounded out.

    There's a great paper on antenna mechanics (written by AT&T, Apple's carrier partner for the iPhone at that time) which touches on the results of placing an antenna in an area where it is in contact with human hands...

    If the product is to be hand-held or otherwise in contact with the human body, remember that interaction with the human body will introduce power loss external to the product in both receive and transmit operation. As a separate issue, interaction with the human body will also cause de-tuning of the antenna. Both of these effects seriously degrade performance. Testing should be performed to properly quantify the effects of the human body.

    The GroupThink internal to Apple was so strong that the fundamental laws of physics were completely disregarded.
  • External to Apple.
    The iPhone 4 was released with an external antenna cleverly wrapped around the body of the phone as a design element, resulting in a degradation of connectivity with cell phone towers when in contact with human skin. This is not a software problem, or anything that could be fixed with a product-recall-style solution. The iPhone 4 was designed with a defect built into the phone.

    The effect was so pronounced that Consumer Reports could not recommend the phone for purchase. When alerted to the problem, Steve Jobs first declared the poor reception to be the user's fault ("You are holding the phone wrong.") Eventually, however, the company relented and sold consumers a $29 rubber "bumper" case that covered the antenna. Eventually (again), the company gave the bumper case away.

    Despite all of this, the iPhone 4 sold almost 20M units in the first quarter. People willingly spent several hundred dollars to purchase a product that was demonstrably defective at launch.
Lest you, dear reader, think this is an isolated event in the world of the iPhone, it just happened again...

The iPhone 5 shipped sans Google Maps, and in its place is Apple's mapping application - which is now the butt of many a joke, and has even prompted an unheard-of apology from Apple itself... in addition, the new operating system, iOS 6, is showing issues with WiFi connectivity. (At the time of the editing of this writing, Apple had just admitted a third fault with the phone, concerning the phone's camera.) Once again, smart engineers at Apple released a mapping product that not only couldn't compete with the existing Google Maps application, but didn't actually work. Once again, consumers lined up to buy 5M iPhone 5's on opening weekend.

It is difficult to have an objective conversation about products that are protected by the mindset of GroupThink, because it's difficult to convince the members of the group that you are actually trying to have an objective conversation. Any objective view that runs counter to the group's faith in the product or service is met with disinterest (at best) or name-calling (at worst). Any conversation that begins with "I purchased product X over product Y" can label one a proponent of "the other side."

Nonetheless, here I go....

My Issues with Apple Products

I make the distinction between Apple's CE products (iPhone, iPad, etc.) and Apple's computer products (MacBook, MacBook Air, etc.) because the two currently have very separate usage philosophies. Apple's computer products are well-made devices, with a rich, robust operating system built around a standard UNIX foundation originally developed for Steve Jobs's other computer company, NeXT. There are signs in recent releases of OS X that many of the issues with iOS that I list below are in the process of bubbling over into OS X, but we probably have a few more years before that happens completely.

So, ok...I'm talking about iOS devices specifically. Allow me to outline five points about these devices that will more than likely keep me from ever desiring one beyond what is required for my work.

  1. They just aren't that interesting.

    Let's admit it: iPhones and iPads are the same basic devices they've been since 2007. They have the same cell top, they have the same single button, they are just - as their patent case with Samsung recently pointed out - slabs of glass with rounded corners.

    The iOS user experience, which was inspiring in 2007, is uninspiring in 2012. It's the same as it's always been, really. There's a notification bar now, sure, but even that feels like an afterthought thrown in to play catch-up.

    Aesthetically, it just...kinda...lies there.
  2. What happened to "Think Different"?

    Every iPhone is the same as every other iPhone - and similarly for every iPad. A user can change the background image, create some folders on the cell top, customize their ringtone, and slap a "Hello Kitty" case on it... but it's the same system setup as your buddy's next to you. For the most personal device in the world, it's really not very personal.

  3. The devices claim to be something that they are not.

    The iOS operating system has supported multitasking since iOS 4, but for third-party applications it's a limited form better described as task suspension. With the exception of certain system-privileged applications, iOS stops a process and freezes its state when a new task is called to the foreground. The original task is resumed from where it left off when you recall it.

    There are good reasons for this - a single running task is faster, and avoids errant behavior affecting the rest of the system. The device's usefulness, however, suffers. iOS devices are only as interesting as the applications that run on them, but the device itself is a neutered version of what it could actually be.
  4. iOS devices feel constraining.

    Every time I pick up an iOS device, I feel like I've been tossed into a cramped box. I have to go through the same two-step to move from one application to another. Running multiple apps doesn't feel slick or smooth; it just feels like suspend-and-resume, which is, of course, my complaint from above.

    iOS doesn't allow me to easily switch default browsers, but that doesn't really matter since other browsers are famously hampered by being restricted from using iOS's Nitro technology, a form of just-in-time compilation. Nitro makes Mobile Safari speedy, but no one else has access to it...which smacks of something that the antitrust people may want to look into.

    The cell top is a rigid grid of squares, which is appropriate considering how it feels to use one.
  5. These Things Are Fragile

    Seriously fragile. The iPhone and iPad product line places esthetics and supply lines above the practicality of owning a portable computer and communications device. The things get wet, they get dropped, they get placed in a back pocket and get sat on. That's just the way it is... so, why not plan for it?

    Broken iPhones alone have cost consumers $6B... that's "billion" with a B... since 2007.
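The suspend-and-resume behavior described in point 3 above can be sketched as a tiny state machine. This is purely illustrative - the class names and states are my own invention, not Apple's actual scheduler:

```python
# Simplified sketch of the app-lifecycle model described in point 3.
# Only one app is "active"; bringing another forward suspends the
# current one, and a suspended app resumes exactly where it left off.

class App:
    def __init__(self, name):
        self.name = name
        self.state = "not running"
        self.progress = 0          # stand-in for saved app state

    def run_step(self):
        self.progress += 1         # only meaningful while active

class Screen:
    def __init__(self):
        self.foreground = None

    def bring_forward(self, app):
        if self.foreground is not None:
            self.foreground.state = "suspended"   # frozen, not killed
        app.state = "active"
        self.foreground = app

screen = Screen()
mail, maps = App("Mail"), App("Maps")

screen.bring_forward(mail)
mail.run_step(); mail.run_step()   # Mail does some work
screen.bring_forward(maps)         # Mail is now suspended...
screen.bring_forward(mail)         # ...and resumes with state intact
print(mail.progress, mail.state, maps.state)  # 2 active suspended
```

The point of the sketch is the asymmetry: background apps are frozen rather than running, which is exactly why the experience "feels pre-emptive" rather than truly concurrent.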

My Issues with Apple, Inc.

Again, being 900 years old, I've seen IBM come and go, and Microsoft come and (nearly) go. In both cases, there were complaints about these companies: too big, too bureaucratic, too out of touch with consumers, too monopolistic. All of these complaints were (and are) true - but Apple is in a different league altogether.

  1. The Reality Distortion Field

    It's a cute name, but I have a more accurate one: the P.T. Barnum Effect.

    Steve Jobs was a master showman - there really hasn't been anything like him before in the tech industry - but the willful disregard of reality (the iPhone 4 antenna issue is a great example), the flowery language describing industry-standard features and applications as though they were just invented (Facetime? Seriously?), and the misquoting of people and figures on stage without anyone fact-checking (the first public volley in the Samsung/Apple feud was probably fired by Jobs when he misquoted one of their VPs) have all taken their toll on Apple's credibility.
  2. I had a Mother, thank you.

    Look, I get it - Apple wants to vet every application submitted to the App Store to prevent malicious hackers from ruining your life. But the practice has another - probably intentional - side effect: Apple has control over what you are allowed to see and use, and what you aren't. I've had applications rejected for being "too much like" existing Apple applications, which sounds a little anti-competitive. Other apps are rejected for content - including Ulysses - which smacks a little of nanny-ism (at best) or censorship (at worst).

    Additionally, there's some evidence that Apple tries hard to keep out applications it deems too similar to its own product offerings (keeping Google Voice out in 2010, for example, required that the FCC get involved). Other applications, like browsers competitive with Mobile Safari, are denied access to the technology that gives Safari its speed advantage. These are the sort of tactics that forced the government to step in when Microsoft tied Internet Explorer too tightly to the Windows operating system.

    If you want to "protect the children" or "protect my phone," fine... put that control in my hands through ratings systems or "untested" categories, but don't make my decisions for me.
  3. The Birth of a Litigious Culture

    This is probably the biggest issue I have with the current incarnation of Apple. It has taken a stance that relies on lawyers to protect what it declares to be its IP, rather than on designers and engineers to invent legitimate IP.

    There's not a lot I can add to the public discourse on Apple vs. Samsung that hasn't been written before. However, I will use this platform to harp on two things:

    a) There is a difference between copyright infringement and patent infringement. Should Apple go after a competitor for copying its icons, user interface design, and so on? Absolutely. Should Apple go after a competitor over prior art (in the technology sense of the word "art") that they have both drawn on? No, of course not.

    There is not a single UI/UX feature in modern smartphones (be they iOS, Android or Windows Phone) that does not have prior art. Take, for example, two-fingered multi-touch interfaces: a 10-second Wikipedia search confirms what I remember from SIGGRAPH papers back in the day: multi-touch technology began at the University of Toronto in 1982. Yup, it's 30 years old. Just because you add "on a phone" to the end of a sentence doesn't make it unique.

    b) There's also a concept called "obvious art." Saying a phone is unique (or somehow infringes on a patent) because it is a certain shape or size falls into this category, especially since the iPhone was not the first cell phone to be a rectangular slab with rounded corners, just one of several. (LG, with the Prada, and Sony both produced similar phones in the same time frame as Apple.)
  4. The Proprietary Connector

    OK, I know this sounds like it should be in the "consumer devices" column rather than in the company column, but it really does belong here.

    Apple has a proprietary dock connector on all of its products. Apple isn't doing this for efficiency or for the throughput of the connector itself (the "Lightning" connector is essentially USB 2.0 with a different wiring pattern), so why are they doing it? What is the point of making devices with proprietary connectors that force users to pack one more adaptor into a bag?

    Before the iPhone 5, the proprietary connectors kept the OEMs that licensed the connector logic from Apple happy - whenever Apple changed the connector or configuration of a device, consumers had to buy new products or converters. Now, the Lightning connector on the iPhone 5 may keep OEMs from being able to create cheaper chargers and connectors, forcing consumers to purchase these products from Apple.

    Either way, the proprietary connectors have been contributing mightily to Apple's stockpile of cash...
  5. The Breaking Bad Syndrome

    ...and speaking of which, what, exactly, is Apple doing sitting on $117B of cash?

    In a stalled economy, amassing $117B in assets is an amazing feat, but it's not clear to me what they are doing with it. They are not re-investing it in M&A activity, they are not returning it to shareholders, and while they have an active R&D department, its R&D is geared towards building better-mousetrap versions of existing Apple products.

    At this stage of the game, IBM created a venture capital arm and re-invested half its holdings in an active R&D department (IBM Research). The IBM Research group was responsible for unleashing on the world: Fast Fourier Transforms (which allowed everything from voice recognition to Pandora to exist), magnetic disk storage, dynamic random access memory, RISC architecture computers, relational databases, and...of course, Deep Blue, the grandmaster chess computer. All of these creations, and too many more to mention, were released to the world with licensing fees rather than aggressive patent-protection litigation. The money spent on IBM's R&D not only fed back into IBM in the form of new product lines and licensing fees, but enriched the world.
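    To see why the Fast Fourier Transform mattered so much, consider what it computes: the discrete Fourier transform, which decomposes a signal into its component frequencies - the basis of everything from voice recognition to music analysis. A minimal sketch in Python, using the naive O(n²) transform for clarity (a real FFT produces the same answer in O(n log n), which is what made it practical; the sample values here are illustrative):

```python
import cmath
import math

def dft_magnitudes(samples):
    """Naive discrete Fourier transform magnitudes (what an FFT computes quickly)."""
    n = len(samples)
    return [abs(sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n)))
            for k in range(n // 2)]

# A 64-sample window containing a pure tone at frequency bin 5.
n = 64
samples = [math.sin(2 * math.pi * 5 * t / n) for t in range(n)]

mags = dft_magnitudes(samples)
print(mags.index(max(mags)))  # 5 -- the transform recovers the tone's frequency
```

That frequency-bin peak is, in miniature, how a service can "fingerprint" audio: the spectrum, not the raw waveform, is what gets compared.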

    Similarly, at this stage of the game, Microsoft was heavily involved in R&D through Microsoft Research, which has been involved in everything from data visualization to machine learning to computer vision. (Thank Microsoft Research for your Kinect.) In addition, since 2004, MS has been paying healthy dividends to its shareholders.

    ...and don't even get me started on Hewlett-Packard and Google.

    So again, I ask, what exactly is Apple doing sitting on $117B of cash?


...yeah. That's pretty much the story. With all of the choices out there for cell phones, tablets and convertible laptops, locking myself to a vendor whose business model includes ignoring standards, litigating its way to the top, making unexciting products, and locking you into its ecosystem makes little sense in my life.

If you are an Apple user and you get legitimate use and joy out of their products, more power to you. Really, I mean that with no facetiousness at all. Enjoy it. Really.

Wednesday, April 25, 2012

From Dinosaurs to Birds: Wither Gaming Consoles?

I'm a digital media, data mining and mobile advertising sort of fellow, and - although I am an active, non-apologetic gamer - I have never touched a line of game code in my life. (Well, I did write a Towers of Hanoi application in LISP for an AI course once, but that doesn't really count.) So imagine how strange I felt yesterday when I found myself at the LA Games 2012 Conference in Hollywood.

The experience was interesting - not quite "fish out of water," since I knew a surprising number of people there - but close. There were several people that I knew from my life in mobile, who apparently moved over to gaming through the mobile experience. Games on iPads, games on Android phones, that sort of thing. 

(Opening slide for Ben Cousins' GDC talk)
I got there early, which allowed me some time to walk around, network and - most importantly - duck into other panels. A theme ran through several of the panels, including the afternoon live debate on which monetization strategy will win: the fate of gaming consoles. Some of the panelists and industry insiders are predicting the death of high-end gaming consoles. One panelist even went so far as to say that the PS3, XBox and Wii consoles were "dead man walking," and would be supplanted by tablets, phones and PCs as early as 2016. (I suspect this sentiment was given credence last month when ngmoco's Ben Cousins called the death of the console an inevitability at the Game Developers Conference. It's a convincing argument - you can see Ben's talk over at Blog Games.)

Interesting theory - but it feels shortsighted. (Or, more to the point, surprisingly self-serving, since some of the advocates of this idea ran mobile gaming concerns.) There are a couple of reasons why it feels shortsighted to me. To begin with, there are two types of gamers. The first type I would call "arcade gamers" - sitting with your phone for 5 minutes at a shot to play relatively simple yet satisfying games, like Plants vs. Zombies or Angry Birds, is a fine way to kill a few moments of time. And while I fully realize that the sophistication, and therefore the gaming capabilities, of mobile devices will increase over time, there are inherent limitations to gaming on mobile: the device itself is the controller, which limits the activity choices and availability.

The second type of gamer can be thought of as the premium gamer: on a couch with a controller, a gaming console connected to a television (or high end PC), and a beer, a premium gamer will spend hours at a shot immersed in a game (either alone or online) playing a complicated, cinematic game through to completion. Both types of gaming are completely legitimate, but they are completely different experiences. 

However, the primary reason the belief that the console is going away doesn't quite feel right is the reason I was at LA Games in the first place. I was asked to sit on a panel called "Entertainment on Consoles: Reinventing the Media Hub of the Living Room," moderated by Chris Marlowe over at Digital Media Wire.

Game console penetration, for just the XBox and PS3 alone, is around 50M units. Unlike other consumer products, the hardware refresh cycle of a typical gaming console is around 10 years. The reason for this has to do with the complexity involved in developing and engineering games for these systems - a typical game for your iPhone can be constructed in months with just a few people, whereas a premium game for a high-end console is more like a movie production: there are hundreds of engineers and game designers, voice and motion talent, set design, etc. It takes upwards of a year or two to create, so a box that gets hardware-refreshed on any timeline faster than a decade is not going to attract a lot of developers. (Why spend 18 months and millions of dollars engineering something only to have the hardware requirements change in the 19th month?)

So, in order to combat consumer fatigue, modern gaming consoles are designed to be as future proof as possible for the technology of the day: high end processors (both CPU and GPU) are designed into systems that can be reprogrammed with new firmware. Constant internet connections to the mothership are made so that new software, operating system changes and business models can be injected onto these systems through upgrades. (For instance, the ability to buy games directly through the Playstation Network wasn't available when the PS3 was launched, but it is now.) The systems are typically sold at a loss to the company that is making/supporting the console, with promise of payouts on the backend for licensing deals, game developer fees, and consumer subscriptions.

All of this makes these boxes attractive purchases for consumers - you are almost guaranteed that the $200-$400 you spent on a box in 2006 will still be a viable device in 2012. 

    It also makes these boxes excellent trojan horses: they are quite powerful out of the box, and once installed in a consumer's home, more and more functionality can be added remotely. Modern gaming platforms, most notably the PS3 and XBox, have re-invented themselves to be more than just game platforms. These devices now allow users to rent movies from their in-device stores, or download applications such as Hulu+, Netflix and Amazon Instant Video to get film and television through these other sources.

Both the PS3 and the XBox have recently retooled themselves to reflect this additional tour of duty: the PS3 version of Netflix is the only one that outputs 5.1 audio, and the PS3 is the only gaming console at all to carry Amazon Instant Video. In January, Microsoft released a complete UI redesign that not only brings the "Metro Tile" look and feel to the XBox, but actually de-emphasizes games as the primary driver in favor of applications. The XBox app category downloaded most frequently? Video applications. (The HBO GO app on the XBox is a thing of beauty, especially when paired with the XBox Kinect.)

I suspect that the next game console hardware refresh we see (from both Sony and Microsoft) will contain quite a few changes. Some easy-to-guess predictions: no physical media, higher bandwidth connections, Thunderbolt output, easy mobile connectivity for session shifting (this has already started in the case of the PS3), options (either physical or wireless) to use mobile devices as controllers, and higher resolution output. They will be smaller, easier to connect, less power hungry, and more discreet devices - perhaps deals will get struck with cable operators similar to what Xfinity just released with XBox, freeing us from cable boxes forever.

So, is the gaming console really the dinosaur of gaming? Sure, but dinosaurs never really became extinct, they just morphed into birds...