Monday, December 28, 2009

The 2009 Rocket Prediction Tally...

God, I hate the year-end prediction wrap-up. The PREDICTIONS are fun, I can just sit here with a whiskey pounding them out, but the wrap-up? Gah!!! Fact checking, looking things up. This is actual WORK man...sigh.

Fine, fine....alrighty then. Let's see what sort of a score I can give myself this year - and see if I can outdo my 2008 75% hit ratio. I kinda doubt it since 2009 was so flippin' all over the place, but...let's get started:

I made 10 predictions at the start of 2009, and tried to cover the gamut from consumer electronics, to services, to the tech industry in general.

  • Economic recovery begins in early Q3 for the tech and housing industries.
    WIN


    It may not feel like it to everyone, but the economy is definitely picking up steam. In my professional life, I have seen ad revenues increase significantly, and the number of available startup opportunities is on the rise. Both of these things began around August-September of 2009.

    In more measurable, less subjective terms: layoffs dropped significantly in November, and the leading economic indicators began rising in the US around the end of the summer.

  • The Obama Administration revitalizes the tech industry within 6 months of taking office.
    TOSS UP


    OK, I'm going to play fair here. The tech industry is in the midst of a recovery, but this "prediction" I made was pretty vague. Could have meant anything - so I don't really want to claim it. (Of course, I'm not claiming it as a loss, either.) Also, Aneesh Chopra is currently defending himself against Jon Stewart, so, uh....

  • MEMS technology for low power / flexible displays hits the market.
    WIN

    OK, second time is the charm, but I'm still taking it. MEMS (microelectromechanical systems) display technologies are moving mainstream - whether it's from the cleverly named Qualcomm spinoff (although it's probably not a good sign that their COO just left at the end of December), or the nanotech from eInk and others, smaller, flatter, less-power-consuming displays are appearing everywhere. The technology powers your Kindle, Nook and Sony eReader, and it's making its way into still more displays - it's clear that the nano-based, low-power displays are here. (We'll know more after the Consumer Electronics Show in January.)

  • Android phone sales hit iPhone numbers before end of year
    LOSS

    Eh, chalk this one up to wishful thinking. Android, however, has started to show its promise in a major way as the year progressed. The plethora of Android-based phones that we were promised last year has started to make its way onto center stage. T-Mobile has the CLIQ and MyTouch, and, of course, the Verizon Droid needs no introduction. I can tell you from my professional experience that video access by Android devices is way up, and info from AdMob shows both the distribution of Android handsets and Android claiming 24% of all smartphone usage as of the end of November - mostly due to the Droid.

    Still, total Android units are well below those of the iPhone, although a number of industry research firms are claiming Android will move into the top two spots within the next year or two.

  • Digital delivery of home media makes a measurable change in broadcast TV numbers
    WIN

    The numbers are speaking for themselves here, which is why I liked this prediction - it's easy to show. It may not be your grandpa and grandma, or even your parents, but viewers are beginning their shift to online - or at least digitally stored - media. Not only have iTunes, Amazon VoD, and Netflix experienced rapid adoption this year, but so have streaming services like Hulu, which now gets as many views as pay cable. In addition, DVR (digital recording and local storage of broadcast television) content has finally started to be taken seriously.

    In the fall, The CW and Fox unwittingly entered an experiment. The two networks pitted popular genre shows (Fringe and Supernatural) against each other. The result was that the Nielsen ratings for both shows (along with CSI, Grey's Anatomy, Flash Forward, and others) tanked. In fact, they dropped so much that Fringe, bizarrely enough, was moved to an "on the bubble" (for cancellation) category. However, once people woke up and Nielsen published its DVR viewing numbers, it became clear that these shows maintained their viewership numbers, but were simply timeshifted. I suspect the viewing numbers will increase again once digital downloads from Amazon and iTunes, and digital streaming from Hulu and the network websites, are taken into account.

  • Significant drops in Blu-Ray player prices combined with content publisher pressure to release existing titles in a new format will push Blu-Ray disc sales past DVD disc sales
    WIN

    Note: all units in the graphs below are in millions. At first glance, this doesn't appear to be a win. Using sales information available from HMM/Nielsen, Blu-Ray's unit-sale market share is about 14%. However, when plotted on a dual axis alongside DVD sales, an interesting trend emerges.

    Spurred on by a 12% drop in Blu-Ray disc prices throughout the year, plus the availability of inexpensive Blu-Ray players and the ubiquity of Blu-Ray titles, the 2009 Blu-Ray sales trajectory is outpacing the 2009 DVD sales trajectory. This indicates the beginning of the adoption curve for Blu-Ray and the end of the adoption curve for DVD.
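
    For the curious, the dual-axis presentation is easy to reproduce. Here's a minimal matplotlib sketch of the kind of chart I'm describing - the numbers in it are placeholders I made up for illustration, NOT the HMM/Nielsen figures.

        import matplotlib.pyplot as plt

        # Placeholder monthly unit sales (millions) -- not the HMM/Nielsen data,
        # just dummy numbers to show the dual-axis trajectory comparison.
        months = ["Jan", "Mar", "May", "Jul", "Sep", "Nov"]
        dvd_units = [220, 210, 205, 200, 190, 185]   # big, flat-to-declining
        bluray_units = [8, 10, 13, 17, 22, 30]       # small, but climbing fast

        fig, ax_dvd = plt.subplots()
        ax_bluray = ax_dvd.twinx()                   # second y-axis, shared x-axis

        ax_dvd.plot(months, dvd_units, "b-o", label="DVD")
        ax_bluray.plot(months, bluray_units, "r-s", label="Blu-Ray")

        ax_dvd.set_ylabel("DVD units (millions)", color="b")
        ax_bluray.set_ylabel("Blu-Ray units (millions)", color="r")
        ax_dvd.set_title("2009 disc sales trajectories (placeholder data)")

        fig.tight_layout()
        plt.show()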

    Although the raw sales numbers won't catch up until mid-2010 at this rate, I'm a big enough of an asshole to still claim the win for this prediction.

  • As Apple pushes deeper into double-digit territory for laptop sales, several serious viral attacks begin in the Mac community. Lack of adequate protection combined with consumer hubris will make the problems significant.
    WIN

    It began right at the beginning of 2009, actually, in late January - coming on board your lovely OS X laptop with, you guessed it, pirated versions of iWork '09. First recognized as a threat by Intego on January 22, and calling itself Black Orange (these virus writers have awesome marketing departments, I must say!), it spread like wildfire through the community, indicating the number of people in the intersection of the Venn Diagram who think it's a) OK to cop a piece of software, and b) safe to be on a Mac. The virus was so efficient (well, the host was) that it was still prevalent as late as April.

    The iWork virus was followed by a parade of viruses, malware and other yummy bits on the Mac, whose market share - not coincidentally - briefly hit the magic 10% number in Q2. The heightened sense of reality finally caused the "gold standard" in Windows- and Linux-based antivirus protection, Kaspersky, to release a version of its antivirus software for the Mac....in frakin' October.

    Mac boards have been all aswirl with confusion this year, some folks still claiming it wasn't possible for Macs to get a virus, and some irresponsible download services blogging that anti-virus companies were fear-mongering to get Mac users to buy anti-virus software. Yeah, not so much.

  • At least one additional security exploit occurs in the basic structure of the aging internet protocols and backbones, forcing a rethink of the way packets are carried over the Internets
    WIN


    In 2008, Dan Kaminsky's now famous DNS security flaw was revealed to a stunned panel of internet backbone companies. Many complied (thank you, Comcast), many did not (screw you, Time Warner), but once patched, the 20+ year old security flaw seemed under control, and the fears were put to rest...

    ...until this year, when not one but three more DNS flaws were uncovered.


    Seriously, guys. It's over 25 years old. It was invented by a newly minted PhD at the request of a guy who just wanted to clear up his own bookkeeping for 12 friggin' computers.

    OK, I'm not giving these guys enough due, but come on. My bank records are using this thing. Let's clean it up and start again, please.

  • Windows 7 arrives in the latter half of the year, but the PR damage done by the mishandling of Vista's public perception plus the stillborn Microsoft marketing campaign PLUS John Hodgman ensures a tepid reception to the new OS.
    LOSS

    Ok, I'm kinda happy about being wrong about this one. I'm not a big Microsoft torch bearer, but I'm not an Apple apostle either. Competition is a good thing, and having a viable operating system on the market that hasn't been pre-tarred and feathered is an excellent thing. (Apologies, Linux ...but, come on...be serious please. And Google OS, you're still vaporware, at least in '09.)

    Windows 7 is pretty damn remarkable - it made a 1G, 5-year-old laptop of mine run like I just bought it yesterday, whereas Vista had it crawling to a stop upon boot up. All indications are that the market loves it too, and it's been a critical darling since the reviewers got ahold of the alpha versions of the OS. If Microsoft's ass-backwards, destined-to-get-in-its-own-way marketing crew couldn't stop this product, Hodgman never stood a chance.

  • Yahoo breaks up into its original component companies, or at least puts them on the auction block, before Q4.
    LOSS

    Effing Yahoos. No, they didn't divest....
    ...or split up.
    ...or fall apart.
    ...or grow.
    ...or shrink.
    ...or fade away.
    ...or come on strong.

    What they did do was spend the year playing c-tease with Microsoft, and coming up with this winning multi-bazillion dollar ad campaign. "Yahoo! It's You?"

    Eff You! Seriously. What EXACTLY do you guys do for a living? Search? Ads? Email? IM? WHAT? Really, I'd love to know. Oh, that's right, you reactivated rocketmail! Sweet! You know what? I've been missing Compuserve lately, think you could re-animate that dead tissue, too?

    Sorry...I'm just bitter at losing a point on this one.

OK...let's just total these puppies up and see how I did...

10 predictions...I was right or dead even....7 times. 7 out of 10. 70%. That's a drop from last year. Huh. Uh...well...uh....

...damn Yahoo.

Wednesday, December 23, 2009

Don't Cross the Streams...Why? It Would Be Bad.


Ok, this is less of a blog post and more like one of those public safety announcement thingies... I spent the better part of today, when I should have been making damn certain I remembered how to fillet a bronzini in time for Christmas dinner, straightening out a Chrome and Firefox mess.

It started as all of these things do: an impulsive swapping out of a beloved piece of software over a minor "difference of opinion." I thought if I strolled around town with Chrome for a few days, Firefox would see the error of her ways and come crawling back on hands and knees...

It, uh, didn't quite work out that way, of course.

Like a lot of folks out there, I use XMarks (formerly "FoxMarks," until they realized that locking themselves into a specific vendor in this market was most likely foolish) to sync my browser bookmarks cross-platform between instances of Firefox on Windows, Mac and Ubuntu. I had played with Chrome before, but didn't use it in earnest because it didn't have add-ons/extensions. Now, of course, it does - meaning I could use most of the tools I had previously used over in Firefox, including XMarks.

After installing Chrome on Ubuntu, I added the XMarks extension, fired it up, and...Voilà! Bookmarks in Chrome, nicely organized. It worked so well, I installed Chrome + XMarks on Windows7 and OSX. (See where this is going?) Look! HA! See that, Firefox? She's new and shiny, and she has all the accessories that you have. I don't need you anymore...and I never think about you....

...well...you were really nice to me all those years. Maybe I can forgive your weight problem. I mean, what's a half a megabyte of extra poundage anyway? It's just baby fat! Oh....come here, you saucy minx....I'm sorry. Chrome didn't mean anything, she was just a fling...

Uh oh.

When Firefox came back up, XMarks engaged....within 15 seconds it had consumed 99% of the CPU, and the hard drive was pounding.

Now, XMarks has never been very good about admitting problems: they have a tendency to ignore the really tricky, hard-to-reproduce stuff and concentrate on the easier issues in their forums. (Last year when Foxmarks was transitioning to XMarks, hundreds of us in their forums started complaining that XMarks would often not install in Firefox. There was little to no response from them on the topic, and the problem mysteriously disappeared during one of their releases.) So, I didn't expect to find much in their forums about this issue, and then I happened across this little ditty:

"...after syncing Chrome, Firefox gets an "Other Bookmarks" folder added, which is really just the name of Chrome's bookmark root..."

Oops.

The XMarks people have yet to respond to this thread, and the other folks in the forum have so far just noted that it's an "annoyance" to have Other Bookmarks in the bookmark tree - but, no kids, it's much more insidious than that. Since Other Bookmarks is the root of the Chrome bookmark tree, but to the Firefox bookmark manager it appears as a folder, when sync'ing a Chrome-written bookmark tree back to Firefox, you've just established this nice little recursion:

Firefox Tree ->
Other Bookmarks Folder ->
Chrome Root ->
Firefox Tree ->
Other Bookmarks Folder ->
Rinse ->
Repeat...

And, as an added bonus, if you use the "automatic sync" function inside of XMarks, you are guaranteed to pass this recursion rule on to the XMarks Mother Ship, who will propagate it down to all of your other Firefox instances that have auto sync turned on. Yay! The longer you let XMarks attempt additional syncs while you figure out what is happening (yeah, that's me) the deeper the recursion layer, and very shortly your CPU and hard drive will be maxed out. Double yay!
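
If you're morbidly curious how deep the hole already is, a quick diagnostic helps. This is just a sketch of mine, nothing to do with XMarks' actual code: it assumes you've grabbed a backup or export of your bookmarks in Firefox's JSON format (nested objects with "title" and "children"), and the file name below is a placeholder.

    import json

    def nesting_depth(node, target="Other Bookmarks", depth=0):
        """Return how deeply folders named `target` are nested inside each other."""
        here = depth + 1 if node.get("title") == target else depth
        deepest = here
        for child in node.get("children", []):
            deepest = max(deepest, nesting_depth(child, target, here))
        return deepest

    # Placeholder file name -- point it at one of your own bookmark backups.
    with open("bookmarks-2009-12-23.json", encoding="utf-8") as f:
        tree = json.load(f)

    print("'Other Bookmarks' folders nested", nesting_depth(tree), "levels deep")

Every extra level corresponds, roughly, to one more sync that wrapped your old tree inside a new copy.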

How do you get out of this mess? XMarks isn't sayin', and if you try to delete the recursive folder Other Bookmarks from your myxmarks.com account, XMarks refuses to let you do it because it's identified that folder as the root folder of your bookmark tree - which, of course, it is on Chrome.

After a day of fiddling, there are really only three ways to do this, ranging in frustration from "Oh Thank God I'm so lucky" to "Sigh." (Oh, and needless to say, all of these methods require you to uninstall XMarks - or disable it - from Chrome first, or you'll be back where you started.)

  1. Locate a machine that does not have XMarks set to auto sync, yet still has your most recent bookmark list. You can then use the XMarks "force overwrite of server" function to push your local bookmark tree up to the XMarks server, which will then propagate the corrected tree to your other Firefox instances. No harm, no foul.
  2. Log into myxmarks.com and restore your bookmark tree from a point prior to your first Chrome sync. This will, of course, lose your most recent bookmarks, but stop whining like a little baby and grow a set...you'll find that porn again.
  3. Turn off XMarks auto sync, then use the bookmark manager in Firefox to delete the Other Bookmarks folder. This will take some time if XMarks made many sync attempts before you figured out what was going on. Alternatively, you can delete the entry manually by going to the location of your Firefox bookmark backups on your drive and blowing away the Other Bookmarks entry (there's a rough sketch of this after the path below). On Windows 7 the backups live here:

    ..\Users\[your user name]\AppData\Roaming\Mozilla\Firefox\Profiles\[profile folder]\bookmarkbackups
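
Since we're already down in the weeds, here's a rough sketch of what that manual cleanup looks like in code. Big caveats, all assumptions on my part: Firefox's bookmark backups are JSON files (named something like bookmarks-2009-12-23.json), the tree is nested objects with "title" and "children" fields, and the junk folder is literally titled "Other Bookmarks". The script and file names are placeholders - work on a copy, with Firefox closed and XMarks disabled, and note that it throws away everything inside any folder with that title.

    import json
    import sys

    OFFENDER = "Other Bookmarks"  # the folder name Chrome's root gets mapped to

    def prune(node):
        """Drop any child folder titled OFFENDER, then recurse into what's left."""
        children = node.get("children")
        if children:
            node["children"] = [prune(child) for child in children
                                if child.get("title") != OFFENDER]
        return node

    if __name__ == "__main__":
        # Usage: python prune_other_bookmarks.py bookmarks-2009-12-23.json
        src = sys.argv[1]
        with open(src, encoding="utf-8") as f:
            tree = json.load(f)

        prune(tree)

        out = src + ".cleaned.json"
        with open(out, "w", encoding="utf-8") as f:
            json.dump(tree, f, indent=2)
        print("Wrote", out, "- restore it through the Firefox bookmark manager")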

So, there you have it. XMarks is a great tool if you have multiple browser installations, multiple laptops, or just want a good way to back up all those bookmarks...but, like all sync'ing solutions since the dawn of time, it's also a good way to propagate mistakes really, really fast.

Tuesday, December 1, 2009

The Nerdliness of Language

I spent my college years (undergraduate and graduate) steeped in science and math. Practically every waking moment which was not spent studying science and technology was spent reading about science and technology. (Well, ok... there were illicit substances, poker, certain ladies of my acquaintance, and God knows what else ....but...after all that then there was science! ...well...bratwurst....then science...er, bratwurst and sandwiches. Crap. Ok: Illicit substances, poker, women, bratwurst, and sandwiches...and THEN science.)

Those hours were spent both in the pursuit of science (experiments, study, history of science) and in the camaraderie of people who - while not necessarily like minded - believed in the same constructs and principles. It framed our conversations, and moved us to a common point of conversation where we could agree, argue, discuss, and laugh. Thought of another way, it gave us a reason to drink together.

Over all of those hours, days, weeks, and years of being in that community, the commonalities of prose became apparent. There is an elegance to this distortion of language that is very similar to any collection of people searching to find a common ground: sports fans, religious adherents, vegetarians, punk rockers, opera buffs, comic book fans...every subculture under the sun. It's a slang, of sorts, applied to descriptions of terms, ideas and concepts. A way of communicating complex ideas with a minimum of words. It's almost a subconscious attempt of the mind to contract the language. The slang is picked up from contractions of scientific, engineering or mathematics terms, of course, but more interestingly, some is picked up from the street, from pop culture, and from the daily banter of everyday life.

Limiting the conversation to computers for a moment, much of the slang is famously known - the world knows what a virus is when applied to computers, or morphing when it comes to computer graphics. Curiously, the language of the digerati sometimes migrates in the opposite direction, back out to the street: Crashing almost needed no explanation when it was introduced to a population newly enamoured with personal computers. People were more than happy to apply the term to themselves when they stayed up too late, or couldn't work any more for the day. Busy executives are quick to tell people that they are multitasking, even though few of them understand what that term actually means.

Perhaps it is because computers have a symbiotic relationship with the public that the language barrier between the geek and the street is two-way permeable, but for other, more esoteric fields of study, that isn't quite the case. To the outsider, common, everyday terms used in technical or scientific descriptions may sound odd or harsh, or lend themselves to misinterpretation. The field of mathematics is chock-a-block full of odd juxtapositions of language, phraseology and street slang. The use of the word trivial doesn't necessarily mean that something is easy, but rather that something is "well understood by everyone in the room, so shut the eff up so we can get on with the real conversation." ("There's no need to go into the proof of the prime-number theorem here, Bob, it's trivial.")

Because of this re-purposing of common words, whenever the outside world hears mathematicians, computer scientists, or comic book fanboys talk, the result is often confusion or misinterpretation. While this is understandable - and should be predictable by most "inside language" participants - real trouble begins when the press gets involved. Having worked at government science labs for the first half of my career, I was often interviewed by the press and media - and the resulting, published "interviews" were most illuminating. I very quickly learned to change the use of my language whenever I wished to convey information to anyone outside the group.

This misinterpretation of common words and terms can lead to amusing confusion in the public discourse, often revolving around computers and the internet - a "computer virus," although likened to a biological virus, is not a biological virus. Other times, this language makes its way out into the public discourse and is disastrously misinterpreted. The Large Hadron Collider has captured the public imagination, largely due to these language misinterpretations. (Thank you, Dan Brown, for adding to the mess....and I'm sure the Illuminati, the Knights Templar, and the Vatican also thank you for adding to the popularization of misinterpretation.)

Scientists, because of the nature of their work and the natural honesty that comes with scientific inquiry, cannot say anything is certain - even if all evidence brought to bear on a topic confirms a theory or model, a (good) scientist will always feel compelled to say "I am 99% certain that this is true." So, when LHC researchers were asked if the LHC could generate a singularity that could destroy the earth, the physicists responded that they were 99.999% sure that it would not. This, of course, was picked up by the news media and translated as THERE IS A SLIGHT POSSIBILITY THAT THE LHC COULD GENERATE A BLACK HOLE THAT WOULD DESTROY THE EARTH. Don't even get me started on the press and H1N1.

Occasionally, however, there are people who fully understand the disparity between the language inside a specific group and the language of laypeople, and deliberately exploit those differences to further their own agendas. This isn't comical misinterpretation or unfortunate misreading; it is a deliberate manipulation of the media and, by extension, public opinion by using what looks like corroboratory evidence. Most recently, we've had the e-mail that was stolen (I won't use the word "hacked") from the University of East Anglia's Climate Research Unit - apparently by...well, no one knows who, or at least no one is saying anything.

The American and British press were all over this, with the usual screaming headlines, like this doozy from, of all places, the New York Times:

In the NY Times article, Andrew Revkin discusses how the email appears to underscore a belief in a conspiracy of climatologists who seek to convince the world that our climate is crumbling before our eyes. The language device that Revkin uses to passive-aggressively reinforce the possibility of a conspiracy is the worst kind of direct manipulation: the infamous double-ditto! (Or, rather, the "double ditto.")

In one e-mail exchange, a scientist writes of using a statistical “trick” in a chart illustrating a recent sharp warming trend. In another, a scientist refers to climate skeptics as “idiots.”

There is more, of course. The article is laced with names, double-dittos, nefarious snippets from the stolen correspondence, and "I told you so's!" from fringe climatologists, like Patrick J. Michaels. Michaels may or may not have been a climatologist for the state of Virginia - no one really can tell - but he is most definitely a climatologist at The Cato Institute, a libertarian think tank cum lobby group in Washington DC that strives "to achieve greater involvement of the intelligent, lay public in questions of (public) policy and the proper role of government." Conveniently located in DC, Cato Institute members are frequent guests on chatty panel and talk shows aimed at public policy. Michaels himself has a number of books on the market about how climate change isn't gonna be so bad, and might even be beneficial, so between the books and the Face-The-Nation circuit, he's doin' just fine, thank you very much.

Swinging the spotlight back on Mr. Revkin and his word-play article in the NY Times - perhaps I can help him out a bit. The email that was stolen (again, not hacked) from the CRU contained hand-wringing (choice bits like “the fact is that we can’t account for the lack of warming at the moment and it is a travesty that we can’t..." are sprinkled throughout the article), insults lobbed at the anti-global warming camp, and - most importantly - the use of the word "trick" in conjunction with showing trending.

To his credit, Revkin does include a not-quite-a-quote from Michael Mann, a climatologist at Penn State, where Mann explains the use of the word "trick:"

He said the choice of words by his colleague was poor but noted that scientists often used the word “trick” to refer to a good way to solve a problem, “and not something secret.”

Even though Revkin didn't properly include a quote from Mann, that explanation is something any mathematician would understand. Having spent years in a math department at my university, I heard the word "trick" used over and over again. It is not used to imply that something is being covered up or that someone is being misled, but rather that something clever is being done to remove some steps from a process. Essentially, it means that a sort of mathematical shorthand is about to be employed - a way to get from point A to point C by skipping B.

As for the hand wringing, that's just good science. Predictions made in the 80's about warmer climates appearing in the 00's have not panned out. The implication is not that climate change is wrong necessarily, but rather that the model used to show the climate shift in the 80's was most likely faulty. (Actually, if climatologists had not mentioned the temperature not fitting the model, then there would be grounds for a conspiracy.) In my travels through science, business and public relations, I have found that the hardest concept for laypersons to understand about science is its most basic precept: science is, by its nature, self-correcting. It holds no public office, it has no allegiances, it is not loyal to the men and women who study it. No matter how beloved a theory is, no matter how many careers depend on a specific conjecture, no matter how old and established an idea is: if data surfaces to contradict the established model, the scientific method demands that you throw the model out or find a legitimate modification that accounts for the new data.

To the layperson, this mode of being is often interpreted as waffling, as having known all along that a theory or model was wrong, or simply as a reason why science doesn't work. Rather than a principle of great objectivity, it's often used as an excuse to doubt the validity of the scientific method in public discourse. Is it better to steadfastly believe in something that has long been proven to be inaccurate, or is it better to course correct as you move forward, adjusting to new information as it comes in?

Finally, the insults. Yup - I don't doubt it. I think that anti-climate change folks have been insulted by climatologists who believe in climate change. I think that climate change believers have been insulted by anti-climate changers... both, probably pretty frequently. In public and in private. It's human nature. If you're honest with yourself, you do it all the time - I sure as hell do. (Allow me to prove a point by throwing myself on the altar of demonstration: vegans are dinks. There. Was that so bad, really?)

The point here is not whether climate change is happening, or if there is a Giant Global Conspiracy (tm) of climatologists to scaremonger, as some believe - the point is that words are tools. If used as intended and left in the context in which they are placed, they are sharp, efficient, surgical. If, however, they are separated from their owner's intent through careless or malicious use, they become blunt, crude instruments causing more harm than good.

If you read something that sounds outrageous - in science or politics - chances are pretty good you should listen to your inner editor. Do yourself (and the originator of the words) a favor and google a few things: look up the author, look up the sources, look up the facts.

*Cartoon by Chris Madden http://www.chrismadden.co.uk/

Wednesday, October 14, 2009

LHC Whacked by Artifacts from the Future? ... or God? ...or... Something?

Two points of disclosure here before we continue:

1) I am not religious. Not even a tiny bit. I respect other people's right to be religious, as long as they respect my right to not be religious. I have Omnipotence Avoidance Issues. New term. You like it?

2) I like soup. Often for lunch, I go and grab a soup. It's healthy. It's nutritious. It's relatively low-fat. Well, except for the cream based soups, but that's another story.

Here's how those two life principles of mine come into play: Sometimes after I grab my lunch-soup, I come back to the office to eat and wipe my brain clean by reading websites that have nothing to do with my line of work - a guy eating soup needs a break, you know? So, there I was today, slurping my soup (minestrone, quite good actually) and reading IO9, a pop-culture science fiction website, and this little ditty popped up:

Is the Large Hadron Collider Being Sabotaged from the Future?

I, uh, choked on my soup a bit.

Ok, IO9 is a Gawker property, which is all about the snarky, so I put my tongue in my cheek as I read the article...which linked to a more serious New York Times article on the topic...and, that led me to research this a wee bit more.

For those that have missed the story up 'til this point, the Large Hadron Collider is one of the largest, and, as of today, still unrealized physics experiments in human history. It is a particle accelerator - a 17-mile-long loop of a giant circus ride used to slam high energy particles together, so that we can look at the wreckage to see from what the original particles were made. Particle accelerators are nothing new in physics, but the LHC is another beast entirely. At 3 billion all in, 10,000 collaborators strong, and 100 countries supporting the effort, the LHC has a lot of eyes on it, and a lot of tasks on its plate once it gets lit up - the most important of which (and this fits into our little story here) is the tracking down of the Higgs boson. Or, as the kids like to call it these days: The God Particle. (Fox News didn't even give it this name, physicist Leon Lederman did.)

To understand why the Higgs boson is called The God Particle requires just a cursory understanding of the current theory of the base-level construction of matter. This theory, referred to as the Standard Model, posits (oh lord, I used the word "posits" in my own blog. Kill. Me.) that there are four basic interactions between all matter in the Universe (the two you've heard of: gravity and electromagnetism, and the two with unsexy, unimaginative names: the weak nuclear force and the strong nuclear force). Those interactions are conveyed through physical particles operating at quantum-level scales: massless photons, W bosons, Z bosons, bleh bleh bleh. Physicists have observed all of the particles in this soup that are vector particles - i.e. elementary particles that have a vector or tensor component. Theoretically, there are also scalar particles (particles that impart no vector component to whatever system they belong to)...well, ok, really just one scalar particle.

Guess which one?

If the Higgs boson exists, and it can be observed by the LHC, it is the last piece of the jigsaw puzzle for understanding how everything is constructed. All mass. All matter. Everywhere. It's a big deal. It is the Thing That Binds Us All, and that is not an overstatement. Physicists with a more poetic bent like to say that observing the Higgs boson would be like looking into the face of God...

...but, hell. Poetic physicists say a lot of crap like that for dramatic effect, so that their spouses know that all those nights "working" down "at the lab" is really worth it...and...they're real sorry you can't relate to what they're doing. Sure, it's not making a new marketing slogan for beer...or...writing a new iPhone application that burps when you shake it...or...anything tangible, really....but...it's like looking at the face of God, dammit...doesn't that mean anything, Wanda? Wanda? No...don't leave...where are you taking the kids...? Hey!! Get back here!

...oh, sorry. Uh, where was I...?

Right... the Standard Model...right right... ok...

So, finding the Higgs boson is the missing link, and it has been a holy grail of particle physics since its existence was first hypothesized in 1964. Attempts have been made before, most notably at Fermilab, and the results have been tantalizing, implying that the particle does exist. Implication, however, is not enough to convince rabid physics wonks. Without a direct observation of the God Particle, the science community cannot accept its existence. (Which, honestly, is fair. I mean, the financial community accepted the viability of giving $1M home loans to people making $30,000 a year without direct proof they could pay them back, and look where that got us.)

Enter the Large Hadron Collider. First proposed in the 90's, the LHC kept costs down (hahah, I love saying that) by reusing a tunnel at CERN that housed its smaller cousin, the Large Electron–Positron Collider.

As the LHC startup date of Sept 2008 approached, the blogosphere and mainstream media alike were filled with crackpot theories about the LHC bringing about the end of the world because it could spontaneously call into existence a black hole, causing the earth to fall in upon itself in a huge Michael Bay-esque sorta deal.

The world waited, and on the morning of September 10th, 2008 the fuse was lit (just kidding) and two tiny particles were whipped through the 17 mile long circular tunnel, 3 kilometers at a time. Successful first test! Yay science, yay! No black hole, no Michael Bay, no Bruce Willis, just two subatomic particles goin' for a ride.

The official inauguration of the LHC was to take place on October 21, 2008 with continuous operation after that date. However, on September 19th, 2008, 6 tons of liquid helium was found venting through several of the bends in the magnets. This was the high-energy physics equivalent of the Challenger disaster, and the LHC was shut down until the problem was sussed out, and the system shaken down...

...the wait is almost over, as the LHC is scheduled to go into operation in a few weeks, sometime in mid-November, 2009.

Alright, back to my choking-on-my-soup story...

In early October, 2009, two of the 10,000 physicists connected to the LHC, Holger Bech Nielsen, of the Niels Bohr Institute in Copenhagen, and Masao Ninomiya of the Yukawa Institute for Theoretical Physics in Kyoto, Japan, published a series of papers on arXiv.org, the physics preprint site hosted by Cornell University, with titles like "Test of Effect From Future in Large Hadron Collider: a Proposal" and "Search for Future Influence From LHC." Yup, the wacky duo of Nielsen&Ninomiya are saying that some measurable, physical force from the future is preventing the LHC from starting up and showing humans the God Particle.

That isn't too preposterous - well, ok, it really is, but stick with me here for the sake of argument. The main thrust of the theory is that exposing the Higgs boson propagates events backwards through time, preventing the LHC from functioning correctly. There's precedent for this in physics already - it is not possible, for instance, to observe the actual physicality in spacetime that we like to call a singularity, an infinite gravity well caused by matter compressed to infinite density. A singularity can never be directly observed because it comes with a cosmic bathrobe called an event horizon, beneath which no observational evidence can escape. The outer boundary of the event horizon is observed as the object we call a black hole. (In other words, a black hole is not a thing in and of itself, but an effect caused by the thing at its center: the singularity. A singularity not enshrouded by an event horizon is called a naked singularity, and is considered to be impossible.)

The effect of the theory put forward by the comedy stylings of Nielsen&Ninomiya is the temporal equivalent. In essence, they propose that the exposure of the Higgs boson causes a ripple effect in spacetime that propagates backwards (rather than forwards) and extinguishes the cause of the Higgs boson exposure in the first place. In this case, I assume, by causing the venting of the aforementioned liquid helium.

I guess I would have been fine with a theory which was expressed along pure physical cause-and-effect (or in this case: effect-and-cause) terms, but....there's more to this story. In an unpublished essay referenced by the New York Times, Nielsen supposedly made the statement “Well, one could even almost say that we have a model for God...that He rather hates Higgs particles, and attempts to avoid them.” Yeah. God. That God. I'm hoping he's being glib, as when Einstein expressed his distaste for quantum mechanics with the now famous phrase "God doesn't play dice with the Universe." (Einstein was not literally saying "God would never do this," he was simply expressing a rabid distaste for any physical principle in which the outcome could not be mathematically predicted.) I'm hoping that's the sorta meaning that Nielsen had in mind, but...uh...I kinda doubt it.

The hunt for the Higgs boson had been attempted before, in the early 90's - that time as a solo effort by the United States. The Superconducting Supercollider was another BASC (Big Ass Supercollider) in a tunnel underneath Texas. $3B in, the US Congress canceled the project in 1993. An attempt by congress to control spending? Maybe. A panic move by reluctant Texas Governor Ann Richards? Possibly. A Bill Clinton "fuck you" to a project championed by Ronald Reagan and George H W Bush in Bush's home state of Texas? Sure, why not?

Calling the cancellation of the project an "anti-miracle," Nielsen has a different explanation: the future called, and they want their Higgs boson back. The cancellation of the project, he is suggesting, was an effect propagating backwards through time, with the cause being the eventual observation of the Higgs boson. Can a pure physical effect like a temporal shroud cause the US Congress to cancel funding? Or is Nielsen suggesting that God did it? Is he suggesting that people from the future did it? I'm not sure that he's sure.

For all my love of making fun of them (and I do so enjoy it), Nielsen&Ninomiya are not idiots. Nielsen was one of the co-founders of string theory, and Ninomiya won the Particle Physics Medal from the Japanese CiNii. These are smart guys, who sometimes take a road less traveled a bit too far, perhaps. Fortunately, they realize how their theories could be viewed - and they proposed an experiment:

CERN, it is argued, should engage in a game of chance - sort of a physics roulette. The activation of the LHC to look for the Higgs boson should be triggered by an unpredictable event. A random number generation method, of some sort, could be connected to the big, giant "GO" button on the device. They even wrote a paper on it: Card Game Restriction on LHC. If the experiment occurs as planned, there is no effect from the future; if it doesn't occur, then there is some sort of physical response propagating backwards through time to prevent the LHC from conducting the particle observation.
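
Just to make the shape of the idea concrete - and this is my toy sketch, not the scheme or the numbers from their paper - imagine the "GO" button wired to a draw from a huge deck where exactly one card says "don't run." Mundane hardware failures tell you nothing; the one-in-a-million veto card coming up would be the interesting result.

    import random

    def draw_go_card(deck_size=1000000):
        """Toy 'card game' trigger: one card in the deck vetoes the run,
        every other card says go ahead. The deck size is made up."""
        return random.randrange(deck_size) != 0  # True -> run the experiment

    if draw_go_card():
        print("Fire up the LHC and go hunt the Higgs.")
    else:
        print("The deck says no. Now THAT would be a paper worth writing.")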

What Would Einstein Do?

UPDATE: November 7th, 2009: Yeah. A bit of baguette. Dropped by a bird. I'm just sayin'...

UPDATE: Dec 10th, 2009: So far, no Hand of God or Future Meddling: LHC One Step Closer to Unlocking the Secrets of the Universe


Saturday, October 3, 2009

Relaxing with a Book in the Age of Digital

When I was a kid, I was (pretty) convinced that everything I was going to do for entertainment would be available to me in my pocket - or at least through some sort of magic panels in the walls of my home. This was back in the late 60's early 70's, so most people just assumed I was nuts. (Of course I also thought I'd be living on the moon, so they were sorta right.)

I blame Star Trek for these thoughts. People walked around the cardboard sets of the Enterprise with little "memory cards" (ok, painted pieces of wood) that they would place into ubiquitous slots in walls or desks, and entertainment, information, communication, etc. would appear on the nearest wall panel. When walking around the surface of a planet (or, more appropriately, the redressed backlots of Desilu studios), they would put their little wooden memory cards in their tricorders to get the same information. (Incidentally, when you are 10, a binoculars case makes an excellent tricorder.)

Now, of course, that scenario is pretty much my Life In Information. (Actually, I'm willing to bet it's pretty much the Lives of Information of all you folks that read blogs like this.) My home is wired for gigabit ethernet, which is wired to the outside world at whatever speed Comcast decides to give me for the day. My body is bathed in wifi signals capable of 300Mb/s transmission, and the little memory cards in my phone, laptop, camera and camcorders contain portable files that I just haven't moved to my house network yet. Whenever I wish I can call up information, communication or entertainment on panels throughout the walls of my house, or on portable devices when I'm not at home.

Gone are my CDs, DVDs, albums, photographic prints, and other paraphernalia of the era of physical media, which - for the record - lasted from 3100 BC until, oh, maybe a few years ago. 5000 years, give or take a few decades, is a good run for any technology trend, dontcha think?

One of the last pillars of the era of physical media to fall is the printed word. There's a myriad of conversations going on right now, of course, about the fall of newspapers and magazines - and as much as I love my beloved weekend New York Times, I easily made the transition to nytimes.com.

However, the one form of printed word that seems to be taking forever to make the transition from atoms to bits is the book. Ironically, this was the first physical-to-digital medium that came under attack back when the internet was young. It made sense that it should have been the first to go, since even Moby Dick can be compressed down to about 200K when converted to a text file. 200K was the perfect size for dialup modem transfer rates of the day. So, what happened? After music, television and now high-definition film have made the move, why did it take about 15 years before anyone considered digitally consuming literature seriously?

Back in the day, electronic books (eBooks, or digital books, or whatever you want to call them) were displayed first on computer screens, and later on PDAs. While there were adherents to this, they were mostly the bleeding edge crowd - people who didn't mind ruining their vision by staring at small, glowing screens showing maybe a few sentences at a time. It was a horrible experience, and a terrible way to read. (Society has a short memory, and it seems to have forgotten about this period of eReading - as is evidenced by the Kindle Reader for the iPhone. The type of folks that would use this little glowing perversion of a book are the modern day equivalents of those of us in the 90's that would stare at books on our Palm Pilots. Good luck with that.)

The Kindle, Sony eReader, Plastic Logic and others have improved upon the experience by making use of a display screen from eInk, which manipulates physical particles to display text on a screen. I've written about this experience before, but in short eInk technology duplicates the reflective properties of paper almost exactly. The effect is astonishing, and reading Moby Dick becomes a pleasure again.

So, why are Amazon, Sony and others hiding their sales figures? Obviously because the success of these devices is moderate, not groundbreaking as it was with the iPod's conquering of digital music. The reasons for lackluster sales are many: licensing deals with publishers are still strange (the publishers still think it's reasonable to charge 80% of the cost of a physical book), the eReaders themselves are still too expensive (think printer ink, Sony and Amazon), and the DRM issues are still too restrictive (why the hell can't I read something I bought on the Sony Reader store on my Kindle?). Marketing around these devices has also been terrible - there's still confusion in the market as to why someone would want a single-purpose device that doesn't display color images when they have their laptops, macbooks and iPhones. The explanation is simple (i.e. my rant on reading long form content on glowing screens), but I rarely hear any of these companies come out and talk about it.

Nonetheless, all of these reasons are really just business problems which will get sorted out...but even when those problems are solved, there is still more to the story on the slow adoption rate, and it may be emotional and very hard to duplicate digitally. It's really complicated. Ready? Here it is...

People like books.

Books are big, bulky, a bitch to move from home to home, they get lost at the beach or when you lend them to a friend, and they smell mildewy if left out in the backyard overnight. None of that matters.

People like books.

People like books more than they like DVDs, CDs, record albums, liner notes, or anything else that the digital revolution has supplanted. They line our walls, they tell people who we are and what we are about when they walk into our homes, they have authors' signatures, they just feel good when you pick them up and hold them. It's entrenched in us. In our culture. In all cultures. The oldest thing that you can call a book (no, it's not the Bible, chill the eff out) is probably the Epic of Gilgamesh, at around 2150BC. Books have been used for trade, for securing power, as seats of knowledge for kings, and have been the source of global memory since long before the internet. (The award for the Greatest Information Crash Without Backing Up has to go to the sacking of the Ancient Library of Alexandria.)

So, there is an emotional tie here that is going to be hard to move past - and I include myself in this mix. I have whole-heartedly embraced the eBook: you'd have a hard time prying my Sony eReader from my hands - it's more convenient, takes up almost no space, makes my business travel load a hell of a lot lighter, and my book consumption has gone WAY up in the last few years since owning it. But....I like books. They still cover my walls. I still schlepped them from Minnesota to Wisconsin to Pasadena to Boston to LA to San Francisco, and all the intercity moves in between, over the years. It was expensive. It was a pain in the ass. Yet, I still did it. We all do it.

What kind of marketing does it take to move past thousands of years of emotional attachment to a bulky, inefficient, easy to destroy form of media? Honestly, I don't have one of my glib, well-you-just-do-this, technology-will-solve-it answers. I just pose the question.

Update:

So I put the Star Trek reference in as sort of a joking referral to what a proto-geek I was growing up, but it turns out - Star Trek was precognizant about the durability of books in the human condition as well. Check out the clip below from the episode "Court Martial," starting at about 3:30 as Kirk's lawyer explains why he doesn't use computers.


Friday, July 31, 2009

Returning to Rapture, a Tale of Two Machines


Alright, let's get this preamble outta the way: I started out 2009 with a promise to blog once a week. Here we are 8 months in, and I've only done a handful of posts. Where the hell did I go? Well, it's been a busy year, obviously. Work has been intense as we put more and more into the marketspace, there was the move up to SF, and...

...fine, I've been playing video games. Lots of them. I now play games more than I watch TV or read during my precious few hours of downtime each week. For some reason, I find it relaxing to sit on my couch blasting the crap out of aliens and zombies...as Rome burns, Master Chief fiddles... or something like that. I know I should be blogging (readership is way down), and hell - even the number of my twitter posts is down.

So, here's the deal: I could write about MicroHooBing, or iPhone zombies, the need for Palm to get the hell out of Sprintville, or the emotional importance of the 40th anniversary of the moon landing - I am the RocketMan, after all - but others have covered that while I worked my joystick-cramped hands to the nubs. What prompted me to come out of the stupor and return to the blogging fold was the video gaming itself.

I have two systems at home: an XBox360 and a PS3. I bought them both for other primary purposes (the XBox is a great media extender, and the PS3 plays Blu-Ray discs), but, hell, the gaming is there and I used to be a computer graphics architect...and I do have a few hours to kill... where IS that Bioshock disc I bought...? So, I started to play - and I wanted to give both machines a shot, so - yes - I did eventually get Halo3 on the XBox...and, since the marketing of PS3 games was nowhere near as overwhelming in my consciousness as XBox games, I had to do a little more research on the PS3 to get the "right" game, so I bought Resistance: The Fall of Man.

As I began to play, I noticed myself on the PS3 more, which I thought was strange and chalked up to the gameplay. In fact, I finished up Resistance just as Resistance 2 became available. Released two years after the original, Resistance 2 was a leap forward in graphics, response time and interactivity, blowing Halo3 out of the water for the look and feel of the virtual world and the critters that inhabit it.

Even so, the similarities in the gameplay between Resistance 2 and Halo3 were uncanny. As in: they are basically the same game. Oh come now, fanboys, substitute the Chimera aliens and WWII soldiers of Resistance with the Covenant and UNSC of Halo3 and you have the same story. (Let the hate comments commence!) Why then, when I finished Resistance 2, was I so reluctant to return to Halo3? Whenever I tried, the colors of the Halo universe seemed washed out, the game play seemed (for lack of a better term) "wonkier," and programming errors (like polygon collision) seemed more frequent.

OK, OK... yeah, these are two different games from two different companies separated in time by almost 2 years. Of course Resistance2 seemed to be brighter, zippier and more photo-realistic. Still, something didn't seem quite right...

Shrugging it off, I went out and picked up Left4Dead...this game had pulled in a legion of fans, and was relatively new - surely it was a better way to show off what the XBox could do... So, I popped it in and played for an hour or so, but I couldn't get into it. The gameplay felt contrived, confusing and stifling, the engineering errors were everywhere (you can make your character "float" by standing on a raised surface and stepping off slightly, for instance) and, once again, that "washed out" feeling was present throughout the zombie-infected streets.

Now I was on a quest: was there a significant difference between these two consoles? As I mentioned, when I first got the XBox360, the game I took home with it was Bioshock. I have to say I was mesmerized. A lot has been written about the Ayn Rand nature of the story, and the moral choices that you make inside of the submerged city of Rapture. How many video games can you name that are set in 1960 and start with a monologue like this:

I am Andrew Ryan and I am here to ask you a question:
Is a man not entitled to the sweat of his brow?

No, says the man in Washington; it belongs to the poor.
No, says the man in the Vatican; it belongs to God.
No, says the man in Moscow; it belongs to everyone.

I rejected those answers. Instead, I chose something different.
I chose the impossible. I chose...
Rapture.
— Andrew Ryan



It's an interesting game providing the player with sometimes disturbing moral choices, up to and including killing infected children called Little Sisters in order to increase your potential for surviving through the game. This particular game plot point sparked controversy from the anti-videogame contingent at the time, and landed Bioshock in the list of controversial games. (For the record, I couldn't bring myself to kill the Little Sisters either - they squirm and scream, for god's sake! - so I found other ways through to the endgame.)

My interest in Bioshock gave me an opportunity: the game was available not only for the XBox360, but also for the PS3 - the moral choices made throughout the game allowed the plot to change significantly enough for me to play again from start to finish without getting bored. Developed by the same engineers, using the same graphics and physics engines, and released during the same time frame, Bioshock provided me with a reasonable way to compare two pieces of hardware against each other. So, back to Rapture I went...

Just a note for you whiny-one-system-or-another zealots out there: Both boxes route via HDMI through a Sony DA5200ES AV switching receiver, which is connected (also via HDMI) to a Pioneer Elite 50 inch plasma display. Audio is out through a Bose Acoustimass 16, 6.1 speaker system. I used the Sound&Vision calibration DVD to make sure that the consoles were set to as close to the same saturation and color levels as I could get. (Bioshock itself provides a simple, initial set up slider that allows you to set your relative black level.) So, as far as I am concerned, these systems run through the same audio and video pathways - the only difference is the obvious one: the pathways within the video game consoles themselves.

Popping the game into the PS3 console, the familiar strains of 1950's music playing on a scratchy record came out of my entertainment system speakers, and I immediately remembered the feeling of being engulfed by Rapture.... but, there was one more thing: quietness. I don't think I ever noticed it so acutely before, but the PS3 is damn quiet - especially when you compare it to the room-heater that is the XBox360. This time through Rapture I could actually hear the water lapping behind me, the sighs and insane ramblings of the splicers, and the echoing of footsteps down the hall.

Controlling the fictional character "Jack" through the game, it was easy for me to confirm that the PS3 version of the game was a direct port from the XBox version - it had the same software bugs. (Walk down the flooding "skywalk" leading away from the medical pavilion to the half-closed bulkhead door, for instance, and you can easily insert your POV inside the polygons forming the bulkhead to become part of the door.) I was glad to see the error which frustrated me so the first time I played - it indicated this was as close as I was going to get to an apples-to-apples comparison of machines.

Strange as it was to step back a few years in state-of-the-art graphics, the Art Deco world of Rapture still felt like moving through a living painting, with gorgeous gold-inlaid walls covered with deconstructionist period murals and creepy marketing slogans on billboards smeared with dried blood. The 5.1 audio is disquietingly convincing - especially playing with the lights out. All of these little cues made it easy to get lost in the game once again.

The controls on the PS3 version were similar to the XBox360 version, with Jack's genetically altered abilities available with your left hand, and the traditional weapons and tools available with the right. The PS3 DualShock controller vibrated in the same disturbing manner as the XBox controllers. (I do think the publisher, 2K, missed an opportunity by not taking advantage of the accelerometer in the DualShock, but that would be a rewrite of the UI rather than a direct port.) Because of this similarity, plus the speed of the processors on both consoles, motions through the world of Bioshock are seamless and quick on both systems. Still...the game just felt better on the PS3. What was that about?

Part of my subjective impression may be due to differences in how the boxes render the graphics on the screen. I can only imagine that the bitmaps and rendered flatfiles are the same on both versions of the game. However, on the XBox360 I had the same issue that I did with the worlds portrayed in Halo3, Left4Dead and others: colors and contrasts on the XBox360 seem muted and washed out to my eye. Contrasts are low, and similar color schemes blend together in my visual field. The same imagery on the PS3 is clear, crisp and high contrast. The splicers and "Big Daddies" in Bioshock popped off the screen at me, whereas on the XBox I often had a hard time pulling them out of the shadows. (Never a good scenario when, you know, they're running at you screaming.)

So what have we learned from this, aside from the fact that Uncle Robby needs to get out more? The consoles themselves seem to operate basically the same when given the same set of physics engines and other algorithms. The controllers are essentially identical, and gameplay on both feels good.

There is, however, a perceived difference in the unquantifiable "enjoyment" between the two due to the quieter PS3 console, plus the PS3's apparent ability to render crisper, high contrast graphics with a higher dynamic color range...

...oh, yeah...and the PS3 has never failed in 2 years of use; the XBox360, however, has already given me one red-eyed glare in the same time frame.

Alright - back to blinding people with science! Well....maybe after I pop just one more Big Daddy.


Tuesday, May 19, 2009

Stunning Timelapse of Galactic Center passing over a Texas Star Party

OK, no pithy titles or clever sayings here - this video from William Castleman is truly gorgeous.

Galactic Center of Milky Way Rises over Texas Star Party from William Castleman on Vimeo.

I'm glad that folks are still throwing star parties.

For those of you who have never been to one, they are at once highly geeky events akin to Star Trek conventions, enormous opportunities for learning and shared wonder, excuses to stand out in a field and drink beer/whiskey/hot chocolate, and wonderful chances to really understand what this universe is all about and how to observe and appreciate it.

When I was growing up in the 70's (1970's, smart ass) in northern Minnesota, opportunities for star parties were pretty plentiful - no light pollution (because who the hell would live up there), coupled with being basically equidistant between the equator and the north pole (think access to Aurora Borealis), and - let's face it - there being little else to do, led to 1-2 of these things a month. 20-30 people with the latest in telescopes, cameras, night vision gear, and beer (usually Hamm's) resulted in dozens of amazing experiences. Including non-astronomical experiences, such as the time that Dabs - my dad - nearly decked my high school astronomy teacher because he didn't quite get the concept of a grown man going out to a frozen field in the middle of the night with 20 high schoolers. (Thanks for not backing down, Dale!)

By the time I left that town when I was 18, I had seen eclipses (lunar and solar), meteor showers, transits, and more aurora displays than I can comfortably tell you about here. I learned how to photograph on night plates, constructed reflector telescopes (including spending a year grinding the primary mirror by hand), helped fund-raise for a planetarium, and - later - helped run the projector at that planetarium.

Looking at this video from Monsieur Castleman brings back a lot of memories - as well as making me green with envy that I didn't grow up in an era of high definition digital video and GPS-based telescopes. Growing up in the waning years of photographic negatives, all of my equipment was extremely analog and required patience - instant gratification was out of the question, as you had to wait until at least the next day to process the photos or video...uh, film. It was only after I went to college later in that decade that I was able to construct a then-new charge-coupled device (CCD) as a project for an astronomy class. It was extremely expensive to build, and was a whopping 100x100 pixels, each able to hold 64 shades of gray - but it was a start. (Alright, Castleman, you got me there - but do you know how to process a photographic glass plate in a darkroom? Hah! No. No you don't!)

Even more important than the amazing view of the Milky Way rising in that video, however, is what you see along the horizon: people. Dozens of them milling about, their red flashlights (LED based, of course) coming on and off, configuring their laptops and calibrating their GPS-ized telescopes, checking exposure times, and drinking beer (probably Lone Star, not Hamm's, since the video is from Texas). Star parties are still going on and are still popular, which means people and kids are still learning...which...gives me hope.

...well, at the very least it makes me feel better about the coming robot apocalypse.


Tuesday, April 28, 2009

The End of Human Life as We Know It: Great, Now They Don't Even Need an Exoskeleton

OK, yes - it's been 2 months since I have posted anything, despite my New Year's resolution to post once a week. (Hey, I've been occupied, ok? JOB. I have a JOB.... ok, fine... I also have a PS3. Like I said, I've been occupied!)

At any rate, I come with a warning. They've started to evolve. First they swam, then they could get back up when knocked down, now they don't even need metal. Seal the doors and windows, get the kids and the dogs in from the yard...


Tuesday, February 3, 2009

My Year With The Zune....


....comes to an end.

But before I go there - first my apologies for the lack of posts in January. As a twitter friend has been reminding me (daily, Maria!), I already blew my New Year's resolution to post weekly. Hey, I do have an excuse: I was moving cities again! (This time to San Francisco - and here I'll stay...well, at least for a while.)

Now, back to the end of the Rocket Zune.

As everyone who reads this little diatribe knows, despite all the jeers and being the butt of all jokes on this topic, I'm a Zune fan. What's more, I enjoy the Zune Marketplace a great deal - it's a pleasure to use, gives me access to millions of DRM-free MP3 files, and has a great interface for dealing with podcasts. Also, for as much maligning as "Welcome to the Social" has taken, the Zune is - well - extremely social. The built-in social networking aspect of the Zune actually works well. No, I've never "squirted" (ew), but I have taken advantage of the LastFM-esque aspects of the Zune Marketplace. Discovered a lot of good tunes that way.

So, if it's working for me, why stop now?

My current Zune is a Zune 80Gig model - works great, updated to version 3.0 of ZM without a hitch, and all the cool new features came along for the ride. However, my music collection has grown, as has my appetite for video-on-the-go, all of which pushed me toward a higher capacity model: the Zune 120. So, when it came down to another $250 outlay, I had to think carefully...

First, there was the bad news from Zuneland this past quarter: Zune revenue declined by a frightening 54%. You might be tempted to blame that on the ailing world economy, until you realize that Apple's iPod sales increased 3% during the same time frame. (I haven't sat down to work out the math, but I bet the numbers come close to balancing out.) People have jumped ship - or, rather, not gotten on board the ship - in record numbers. As a WSJ editorial states, the Zune's market share is now flirting with 0%.

I have a theory about the decline, BTW: it coincided with the release of Zune Marketplace 3.0, and the accompanying firmware upgrade, at the end of Q3 '08. Unlike Apple or any other media player maker on the market, Microsoft did not force you to buy a new Zune. All Zunes could be upgraded with the new software, and worked perfectly within the range of their older hardware limitations. (The equalizer software didn't work on the first gen Zunes, for instance, because they had no hardware to support it.) Everything worked: wireless syncing, OTA buys from the Zune Marketplace, clicking on FM songs to purchase... all of it. And that may have been the problem...

By respecting their current user base and applying the backward-compatibility ethos which, like it or not, worked as a strategy for PCs, Microsoft may have shot itself in the foot. Who would spend another $250 on a new Zune when they didn't need increased storage capacity and could get all the cool new features for free? Turns out: no one.

At any rate, even without the sales figure decline, I probably would have made the same call: the weight of the overwhelming market share of the iPod was taking its toll. My cars have iPod ports, not Zune ports, for instance...and getting something as simple as an armband for the gym was problematic. (As it turns out, the armbands for the iPhones work perfectly with the Zunes...who says we all can't get along?)

So, with a $250 upgrade to make, I set the Zune aside (I won't sell it, I will keep it in a nice little shrine) and headed over to the Apple store to pick up a 120gig 6th generation iPod. (The iPod touch stalled out at 32gig? I crap bigger than 32gig!) I sat down at my laptop, cleaned up my music collection, transferred my podcast subscriptions over to iTunes 8.x, sync'ed it and fired it up.

There it was: my shiny new iPod looking all... well, iPod-ish. After a year of absence, it's depressingly the same. Sure, there's cover flow and the sync icon is now orange (ooooo!), but other than that: the system is basically exactly the same. No wifi, no stereo bluetooth, no FM radio... no real changes of any kind. (The damn font still looks like it came from the first generation Macintoshes of the 80's.) Moving from the Zune interface and feature set back to the iPod is, well, a step backwards in look-n-feel and features.

...and then there is iTunes. The "music management" system, and front end to the iTunes store, still looks like it was written by a first-year college engineering student as a final project. Same old interface. Oh, sorry, it has "cover flow" too...right. (Do you really use cover flow to find albums, people? Really? I doubt it.) It also has "Genius" now, which doesn't seem to be using information from the Music Genome Project, like Pandora does, to get its relationships between songs. As best as I can tell, it does a simple statistical match between what you've got in your library and what other people have in their libraries to determine what songs you have that possibly sound like other songs you have. (What's a good playlist that sounds like "Dani California?" Well, here's the union of songs that you have in your collection with songs other people have in playlists containing "Dani California." Genius.)
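
If you're wondering what I mean by "simple statistical match," here's a minimal Python sketch of that kind of co-occurrence matching. To be clear: the playlists, song names, and function are all invented for illustration - I'm guessing at the general approach, not describing Apple's actual algorithm.

    from collections import Counter

    # Hypothetical data: other people's playlists. (Genius presumably mines
    # aggregate iTunes library data; these lists are made up for the example.)
    other_playlists = [
        ["Dani California", "Snow (Hey Oh)", "By The Way"],
        ["Dani California", "Tell Me Baby", "Seven Nation Army"],
        ["Snow (Hey Oh)", "Clocks", "Seven Nation Army"],
    ]

    # My local library.
    my_library = {"Dani California", "Snow (Hey Oh)", "Clocks", "Wonderwall"}

    def genius_ish_playlist(seed, library, playlists, size=10):
        """Naive co-occurrence match: count how often each song shows up in
        playlists that also contain the seed song, then keep only the songs
        I actually own. No audio analysis, no Music Genome - just counting."""
        counts = Counter()
        for playlist in playlists:
            if seed in playlist:
                for song in playlist:
                    if song != seed and song in library:
                        counts[song] += 1
        return [seed] + [song for song, _ in counts.most_common(size - 1)]

    print(genius_ish_playlist("Dani California", my_library, other_playlists))
    # -> ['Dani California', 'Snow (Hey Oh)']

Twenty-odd lines of counting. Like I said: Genius.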

The final affront to my logic centers? iTunes is on version 8, and it still can't tell that you've put new music into a watched directory. Moving from the Zune Marketplace to iTunes is like trading in the Porsche for a Volkswagen - sure, they are both German cars, but...come on. Seriously? I'm not the only one who thinks so - there have been a lot of articles about ZM lately, such as David Chartier's excellent piece in Ars Technica last week. (David: you almost had me reversing my decision.)
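
And the watched-folder thing really isn't rocket science. Here's a bare-bones polling sketch in Python, standard library only - the folder path and the "import" step are stand-ins I made up, not anything iTunes or the Zune software actually exposes:

    import os
    import time

    WATCHED_DIR = "/Users/me/Music/Incoming"   # hypothetical watched folder
    POLL_SECONDS = 30
    AUDIO_EXTENSIONS = (".mp3", ".m4a", ".aac")

    def import_track(path):
        # Stand-in for whatever "add it to the library" would really mean.
        print("New track found, importing:", path)

    def watch_folder():
        # Poll the folder and report any audio files that show up. A real
        # player would hook into the OS's file-change notifications instead
        # of polling, but the basic idea is the same.
        seen = set(os.listdir(WATCHED_DIR))
        while True:
            time.sleep(POLL_SECONDS)
            current = set(os.listdir(WATCHED_DIR))
            for name in sorted(current - seen):
                if name.lower().endswith(AUDIO_EXTENSIONS):
                    import_track(os.path.join(WATCHED_DIR, name))
            seen = current

    if __name__ == "__main__":
        watch_folder()

If a half-asleep blogger can poll a directory, an eight-revision media player can too.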

So, market forces win (remember when market competition was a good thing?) and I turn my back on the Zune to move back in with my old girlfriend, Apple. She has a new dress on and pretty shoes - and I suspect she still can't dance - but everyone seems to think she's just awesome, and she's kinda the only one at the party, so I'll give her one more chance.

...hmmm...wait, who's the iRiver girl over there by the bar...?

Sunday, January 4, 2009

Throwing in my Two Cents: The Rocket's 2009 Tech Predictions

Nostradamus hat? Check. Crystal ball? Check. Number of my bookie in Jersey? Check. OK, let me throw myself into the circus act of bloggers out there trying to stake some claims over the next 365 days...

Last year I hit a sad 75% on my predictions for 2008 -- let's see if we can crank it up a notch. (I'm also trying to beat the clock and get this crap out before CES this year.)



  • Economic recovery begins in early Q3 for the tech and housing industries.

    OK, this one goes against nearly every piece of bad press that I have read for the last 6 months, but I still believe it.

    Since this is a tech column, we'll leave the housing industry aside for a moment - aside from the incredible resiliency of the US economy, the tech sector in the US has fallen behind in several key areas: broadband penetration, high-speed wireless penetration, consumer electronic technology, low power technologies (drives, displays, etc), and battery technology. Any entity that misses one industrial or technological innovation tends to leap-frog over that innovation to the Next Big Thing. (Europe built an unparalleled infrastructure of train tracks, so the US created automobiles and roadways. The US created unparalleled wired phone lines, so Asian-Pacific markets surpassed the US in wireless phone systems. Etc etc.)

    Being behind in the technology leadership in the above (and other) areas will provide a window of opportunity for investors, entrepreneurs and established tech companies. Many of the missed opportunities above already exist as half-constructed ventures, lab experiments and business plans... execution out of desperation is sure to follow in the first half of 2009.

    Update: January 16th - Looks like the Federal Reserve Bank of Minneapolis agrees with me, at least. Thanks to J. McCartie at http://www.volasail.com for bringing this to my attention.


  • The Obama Administration revitalizes the tech industry within 6 months of taking office.

    Slightly related to the first bullet point, the Obama Administration is on top of things enough to realize what the economic engine in the US really is, and where we are failing to deliver. After 8 years of starved technology ecosystems, Obama will begin to place money and resources into crumbling technical infrastructures, and lower the barrier of entry for new companies to compete against established companies. The creation of a "Chief Technology Officer" position for his administration is the first sign that they will get this right.


  • MEMS technology for low power / flexible displays hits the market.

    OK, fine - I made this same prediction last year and it didn't pan out - but it's pissing me off that it didn't, so I'm plopping it back on the table, damn it.

    Qualcomm's spin off company (Qualcomm MEMS Technology) is one of the companies focusing on low-power displays through nanotechnology. (In this case, small shutters control the filtering of light from LEDs, as opposed to digital filters which require energy.) As mobile devices get more and more complex, power management becomes a ridiculously huge issue. (I thought the iPhone was bad, but the Android phone with its multitasking OS fires off its various radios in the background without you even being aware of it. The power meter looks like a sweep second hand on a watch.) One of the biggest consumers of power in these devices - minus the radios - is the display. QMT's nanotech shutter system produces the same screen brightness for 1/5th the power consumption.

    In Cambridge, Mass, eInk has created a flexible, color version of its digital ink technology currently found in the Kindle and the Sony eReader. Just because newspapers and magazines are headed the way of "instant photography," CD publishing, and terrestrial radio doesn't mean that people don't want the content - but reading a newspaper on a laptop or iPhone isn't for everyone, and it doesn't duplicate the newspaper's current distribution model. (Wake up, stumble outside, pick up newspaper off the wet pavement.) The first instances of eInk's flexible, color digital "paper" will no doubt be used to receive your subscription to the NY Times or Newsweek directly on a portable, always-on device. (It will put a crimp in the "taking the paper to the toilet" market, tho.)

    My bet (again): expect one or both of these technologies on the market in other OEMs' devices by Q4.


  • Android phone sales hit iPhone numbers before end of year

    The T-Mobile G1 is projected to hit 1,000,000 units sold when the final tally for Q4 2008 is done. Keep in mind that the phone has hit the 1M mark having only been on the market since late October 2008, and for sale in only 19 markets because of T-Mobile's late-to-the-game nascent 3G service. There are several more phones expected in the next few months that run on any 3G service out of the box, with form factors that mimic both the iPhone and the Blackberry.


  • Digital delivery of home media makes a measurable change in broadcast TV numbers

    I know, everyone is either predicting this or laughing at people who predict it - but you gotta take a stand, baby! We've seen UPN and The WB merge over the past 18 months into the oddly named "The CW." (I miss the Frog, actually.) It's not unreasonable to expect additional kerfuffle amongst the traditional TV networks due to erosion of a viable audience base.

    The additional changes could be consolidation or dissolution of one of the four remaining broadcasters, or it could be something more subtle - there is a very good chance that this is the year that a major network or studio-backed network begins using the internet as a direct means of distribution to its audiences.


  • Significant drops in Blu-Ray player prices combined with content publisher pressure to release existing titles in a new format will push Blu-Ray disc sales past DVD disc sales

    I know the debate: did Blu-Ray win too late in this era of media downloads? Do people really want to switch from their DVD collection when upscaling DVD players are just fine?

    Well, all things being equal: no. However, content companies are hungry for ways of monetizing existing content in their catalogs. As the price of Blu-Ray players falls below the $120 mark, content publishers will be incented to migrate more of their catalogs over to Blu-Ray in an attempt to sell you the Star Wars Trilogy...again. (This is the same logic that pushed music content publishers to move from album to tape to CD.)

    With all those brand-spanking-new flatscreens and sub-$120 Blu-Ray players out there, when consumers walk into a Tower Records and are faced with their favorite movie or TV show in both DVD and Blu-Ray, which do you think they'll choose?


  • As Apple pushes deeper into double-digit territory for laptop sales, several serious viral attacks begin in the Mac community. Lack of adequate protection combined with consumer hubris will make the problems significant.

    This one I'll take some heat from the fanboys for - but I'll get through it somehow. Again.

    You've all heard the argument before: the Apple ecosystem is unprepared for coordinated security attacks on their object of desire - but it's a valid argument. As Apple computers push deeper into double-digit territory, they become a target for virus writers. It's not really important that there isn't a lot of virus protection software out there for Macs; what's more damning is that the Mac demographic is woefully underprepared.

    Having not grown up in a culture of locking your back door, Mac denizens are not in the ritualistic habit of installing virus protection software, updating it, and taking all the usual precautions against Very Bad People that Windows and Linux people have had to deal with for years.

    Pwn2Own 2008 people: 2 minutes. MINUTES. 'Nuff said.


  • At least one additional security exploit occurs in the basic structure of the aging internet protocols and backbones, forcing a rethink of the way packets are carried over the Internets

    It was designed to survive a nuclear attack on the United States... it was not designed to power everything from television sets to light switches. Last year Dan Kaminsky discovered and reported a serious flaw deep in the bowels of the stalwart Domain Name System. That code has been in there since Christ was a corporal, and there's more - trust me.

    Like bad roads and crumbling bridges built during WWII, the internet substructure is due for an overhaul - and I'm not talking about the move from IPv4 to IPv6.


  • Windows 7 arrives in the latter half of the year, but the PR damage done by the mishandling of Vista's public perception plus the stillborn Microsoft marketing campaign PLUS John Hodgman ensures a tepid reception to the new OS.

    By all accounts, Windows 7 really is all that. All the reviewers who have advance copies are tripping over themselves to say lovely things to Leo Laporte on TWiT about Windows 7. "Better." "Faster." "Prettier."

    After the public shitfest that was heaped upon Vista over the past two years, it isn't gonna matter a whit. The OS will be released to the sound of crickets, and the Windows community will be stuck with having to support "downgrades" from Windows 7 to XP - and maintain the codeline for Vista at the same time.


  • Yahoo breaks up into its original component companies, or at least puts them on the auction block, before Q4.

    Yup - it's over kids. Do you Yahoo!?? Uh, no. No you don't. Prepare for recent and distant tech purchases like Flickr, Geocities and Maven to be divested and scattered upon the wind like seeds from a dandelion.

OK, there you go. Take me to task on Dec 31, 2009...but I'm shootin' to beat that 75%, dammit.

Have a great year everyone - let's go make some tech now, shall we?