Wednesday, June 9, 2010

Die Me, Dichotomy

"We did not enter the search business. They entered the phone business" 
- Steve Jobs, Spring 2010

There are about 2 gazillion blog postings and articles out there highlighting the differences between the iPhone 4 (and iOS 4) and Android 2.x - and you know what? This isn't going to be one of them.

What I do want to talk about, however, is the above quote by Jobs during an all-hands meeting this past spring. Reality is what Jobs spins it to be, but the essence of what he said is largely (if not entirely) true: Google, the search behemoth, and Apple, the restyled consumer device and services company, began life in two extremely different places. Yet, here they are a decade and some-spare-change later competing head to head in the mobile arena. How, and more importantly, why did that happen?

There's an interesting slide that Jobs keeps displaying during his keynotes - one that no one pays attention to unless he calls it out, as he did at this week's iPhone-a-palooza. It's a simple street sign meant to imply that Apple sits at the corner of technology and liberal arts. Jobs likes to spin a tale about Apple being this magical place that brings these two worlds into alignment. In reality, however, that story could be told about Google (technology) and Apple (liberal arts) being dead center at that intersection. Stretching the analogy to the breaking point, I'd throw in two cars headed towards a collision at the intersection: one piloted by an android, the other by an apple.

Google was founded by Sergey Brin and Larry Page in 1998 while they were at Stanford on the PhD track. It was a pure platform play from a couple of hard-core computer scientists. "This World Wide Web thing seems to be taking off," they reasoned, "and all the ways to find content on it really blow." (Hey! They were looking directly at you, AltaVista.) So, they figured out a new way to do it, one which involved web crawlers and indexers and taxonomy engines and warehouses full of servers. They slapped a simple front door on it, misspelled the word "googol," and the rest - as they say - is history.

The important takeaway here is that Google, from start to finish, was a research project born from the PhD thesis material of a couple of hard-core computer scientists. They focused on ideas as numbers - the world as a large, three-dimensional grid of information that could be searched, indexed and sorted at bizarrely fast speeds. For the first time in human history, it became physically possible to categorize, classify and search literally everything representing the human condition. As long as there was a way to digitize it, Google could organize it.

To Google, the world is The Matrix.

Along the way, Brin and Page realized that they had a perfect way to generate revenue: the digital version of targeted advertising. Caching the entire internet in memory was gonna be pricy, but if they could sell search trends and results back to advertisers, they might be able to keep this thing going. It worked, and the company became one of the most profitable on the planet, able to afford multiple research-projects-to-nowhere, and perform mind-boggling "busy work" tasks, such as photographing every single square foot of street space in the world, just so they could turn the physical into bits.

Jobs, Wozniak and Wayne (the missing Beatle) founded Apple in '76, and their story was a very different one from the Google boys.  As the legend goes, Wozniak, an engineering student who never finished college, hand-constructed a wood-wrapped computer to present at the Homebrew Computer Club, a computer hobby club in the Silicon Valley area that functioned as sort of a Lifehacker.com of the pre-internet, swingin' 70's. Jobs, a Reed College student, saw the potential, as he always does, and they incorporated and sold the first Apple computer  (the Apple I)  in 1976.

Now, I say this not as a denigration, but as a point of discussion: neither Wozniak nor Jobs finished college. When asked if he dropped out, Wozniak's response was a very odd "Not exactly." Jobs, on the other hand, is a self-professed dropout (in fact, I think he only attended a semester or two, and lists "calligraphy" as an example course that set him thinking about typography) - and, in true Jobsian style, in his 2005 Stanford commencement speech he goes on to market his "dropping out, and then dropping back in" as having had a positive effect on his life.

Jobs' brand of aesthetics, innate marketing savvy and forceful personality created a computer company that capitalized on societal lifestyle choices. Through 30 years of iterative refinement, Jobs (and it was almost entirely Jobs) drove the company's business and technical development via trial and error, measuring people's responses to products at different points in time. It was almost as though the man who never finished (or even really started) college was able to predict (some say define) the societal zeitgeist. He gave people what they never knew they wanted, and in return they consumed.

To Apple, the world is What Dreams May Come.

Initially, Apple computers focused on home hobby enthusiasts and gamers, but when the machines moved into the design and print publishing realm - utilizing designers' sensibilities and catering specifically to the needs of publishers - things began to change. People didn't just like Apple computers as tools; they began to classify them as necessary objects: machines that somehow "got" who they were as people and the problems they faced in their work and lives. Through that understanding, the company's adherents became almost fetishistic in their desire for all things Apple.

Google and Apple moved forward.

Google slowly became an information juggernaut fueled by an endless stream of advertising revenue. Fear of Google's knowledge about our every purchase, every web click, every move in the physical world was trumped by the public's craving for information about anything, anywhere. It was an uneasy truce with the devil: give them your innermost secrets, and you can find anything your heart desires.

Apple iteratively, and precariously (it famously almost went out of business), climbed its way past Microsoft's market cap. Amusement at Apple's inability to play seriously in the grown-up world of spreadsheets and word processors gave way to the ease and grace with which its software and devices charmed the populace. People were vaguely aware that as Apple moved deeper and deeper into consumer electronics its ecosystem was slowly becoming a walled garden, but it didn't seem to matter. If the faithful all vowed never to leave Apple, did it really matter that the walls were going up? Apple became the epitome of living in a gilded cage.

...and so, here we are. Two companies that, given their contrasting origins, would seem better positioned as comrades than foes, yet now stand face-to-face on the current field of battle: your pocket. How did the emergence of the new media marketplace (your mobile phone) become the site of a war that will make the Apple/Microsoft skirmish seem like frivolous playground politics?

Jobs didn't often find himself on the defensive prior to 2010, but when Android's quarterly sales numbers zoomed past the iPhone's in Q1 of 2010, that's exactly where he found himself. When he made the now-famous "We did not enter the search business. They entered the phone business" statement in response, it must have sounded alien to Google.

From Google's point of view, you see, they did not enter the phone business - they merely extended their search business. This was not about competing with Apple in the consumer marketplace; this was about adding an additional query tool to their bag of tricks. This time, however, it's a physical device, not a query field in a web browser. Google did not enter the phone business; they simply provided a mechanism for users to provide Google with more information about themselves.

Likewise, Apple really didn't enter the search business. What they did do, however, was enter the advertising business using these same little devices in everyone's pocket. Through these mobile devices, Apple knows what its users want, what its users do, what its users feel... all without resorting to search. They are able to know all these things because their users willingly tell them - with their dollars and their application downloads. Apple didn't need a search engine; they needed another "thing" they could present to the faithful and the soon-to-be-converted: something shiny and pretty that the loyal would offer up their innermost desires to possess, in order to remain part of the group.

Mobile phones have evolved to be statements of who we are as individuals. A mobile phone is an incredibly intimate device (insert your favorite sex toy joke here, please), more so than your television set or your laptop. You would allow a friend or family member to use your laptop to check email, but chances are you'd think twice before turning your phone over to someone. It has your music, your pictures, your memories, your conversations. Your phone is your inward self reflected outward.

They are also your tools - you need them to work. Calls cannot drop. Text messages need to go through. When you want information, you need the immediacy of action. The mobile web cannot pause, you don't have time for it to buffer. The applications on the phone need to tell you things before you know you need the information. Display it. Process it. Search it. Tell it to me. Do that for me now, because in 10 seconds I don't need it anymore.

This is the dichotomy of the mobile industry, perfectly represented by two companies that started in the last century doing very different things and solving very different problems. They were created by very different sets of people, one by a couple of Stanford eggheads looking to change how the world understands its information, and the other by a couple of college dropouts looking to change how society communicates with its technology.

To sum all this up another way:


Google users ask questions of Their Oracle.

Apple users share their desires with Their Creator.


These two world-views, which meet so sharply to define their mobile environments and devices, are perfectly captured in the ad campaigns for both products - both of which are worth watching, even if you've seen them before.

This 'droid ad is from Verizon, of course, but Google approved:



From Apple, ad-as-documentary:

Sunday, May 23, 2010

...as in "Oh God. Oh God. We're All Gonna Die."

Ok, then... while all of us were worried about the LHC and whether or not the iPad would have a porn app, we inched that much closer to that lovely Terminator-esque future. A charming, grey place where none of us have to worry about taxes, political differences, or racial tensions, because we are all grabbing the kids and running from the little brothers of BigDog.






Sheesh... at least it's just Terminators we have to worry about, not Cylons....






Ahhh, crap! Seriously?!

Sunday, March 14, 2010

Finding a Stellar Grandpapa

On March 3rd, Anna Frebel of the Harvard-Smithsonian Center for Astrophysics issued a press release announcing that her team had discovered a really old star. Really old. Dawn-of-time old. While not earth-shattering to those of us living in the twitter-timezone, Ms. Frebel's team completed a missing piece of a cosmological puzzle that's been plaguing astronomers and cosmologists since before I was in grad school: where the hell are the old stars? It's been a bit of an embarrassment, really, but before we go there I just want to explain the backstory, the silly bookkeeping, and why this discovery is so important.

For the past several decades, astronomers and cosmologists have based their operating model of the physical universe on a few principles:
  • The Universe started from a singularity in a massive explosion, commonly referred to as "the Big Bang." For many years there was a competing theory called "steady state," which said that the Universe always was and always will be, but the mountain of evidence to the contrary has basically shouted that viewpoint down. Steady State was poetic, and vaguely religious, but wasn't consistent with observed facts.
  • After the Big Bang, the Universe went through a rapid evolutionary process during its first 3 minutes of existence, in which the literal framework of the Universe was established: all of its physical dimensionality, physics constructs, the flow of time, energy distribution... oh yeah, it was quite a party.
OK, oversimplified, but that's the basic gist of it - everything flows from here. After those first few minutes, everything else began to shake out, including primitive stars. In the beginning of the Universe, you see, there weren't a hell of a lot of building materials. Well, really just hydrogen and a little helium. So, stars that arose from that first boom were composed almost entirely of hydrogen and helium (with a smattering of the early metals, lithium and beryllium, but in such trace amounts that they only count when dealing with really off-the-beaten-path cosmology issues).

After they burned through their fuel (these first monsters lived fast and died young, roughly 13 billion years ago), they exploded and sprayed crap all over the bran'-spankin'-new Universe. They were the frat boys of the universe, beer bottles everywhere the morning after the party. And, by "beer bottles" I mean "heavier elements." Helium, nitrogen... but more importantly: the beginning of the metals... well, "metalloids," actually. Check the periodic table, you'll see them there on the right.

OK, now the fun starts - when these bad boys pop, the crap they spill out is pretty much everything else you see around you: oxygen, iron, heavy metals, the gold in your teeth.... all the rest of the elements. Essentially, the now.

The weird bookkeeping comes into play when you consider how cosmologists categorized these three groups of stars: essentially, in the order of observation. The sun and all the stars you see when you look up on a cold night are called "Population I" stars. See? They were seen first...

...the second group? The bad boys above that exploded and filled the skies with all the current stuff? They are "Population II," cuz they were found next. Get it?

And the last group....well, yeah, "Population III" stars. When were they observed? They weren't ever observed. Not directly anyway. They are long gone corpses, dried up cinders of their former selves.

(As an aside: I call the bookkeeping "weird" because it never made intuitive sense to me. The first stars should be Pop I's, in my opinion, but that's just me.)

OK, back to Miss Anna at the CfA, and the missing link she just found. If you were following my stream-of-consciousness explanation above, you get the drift that modern stars arose from the ashes of older stars. Similarly, the formation of galaxies (collections of stars bound together by common gravity) such as our own Milky Way underwent its own evolution. Outside the Milky Way and the other big spiral galactic formations, which contain hundreds of billions of stars, are these weenie little malformed galaxies called Dwarf Galaxies. (I know, not Politically Correct, but "Little People Galaxies" didn't quite flow off the tongue.)

Dwarf Galaxies have a few 10's of 1000's of stars at best. You see, in the Population II era, there just weren't a lot of stars yet, so not too many buddies to gang together with... so, the Pop II's did the best they could... hung out together, went to the movies, and wondered why all the big galaxies made fun of them. Eventually, some of the Dwarfs hung around to the modern age, because when their Pop II contents exploded, they made some new Pop I friends to hang with. A lot of these bigger, heavier stars were ejected, and they banded together to form the larger spirals.

It wasn't an even distribution, of course, because life isn't like that. Some Dwarfs had Pop I stars in them, and some Spirals had Pop II's. But, as time passed, the distribution of old versus new stars began to change - heavily weighted to the newer Pop I's. If you look in our own galaxy, usually near the center, you'll find the Pop II's sitting in their stellar old-age retirement communities, taking Viagra and trying to be interested in the television.

And in the remaining Dwarf Galaxies? The ones older than our Milky Way? They should be chock-a-block full of Pop II's, right? Right? Yeah...uh...oops. There's the embarrassment. None. Nada.

Enter Anna and her team - 290,000 light years away, in the Dwarf Galaxy of Sculptor - which, I know, sounds like a Farscape villain - lies Lores: the first metal-poor (Pop II) star found in a Dwarf Galaxy.

Whew. The difficulty in finding a Dwarf Galaxy Pop II makes sense - a Dwarf has fewer stars, remember, so there's a higher probability that most, if not all, of the Pop II's would have been swapped out for Pop I's.... still, it made everyone nervous that no one had ever found one before. It called into question the theory of stellar evolution.

But, fortunately, Ms Frebel found Lores: old, decrepit, and pinching the nurses' asses at the nursing home. The Universe is as it should be.

Respect your elders, kids.


Side note: I don't want to leave you thinking that this story is as simple as I describe. Observing Pop II and Pop I stars becomes confusing and interesting the farther you look from earth. Taking the speed of light into account means that the farther you look from us, the farther back in time you can see - and the harder it becomes to register and understand the light (diffusion, red-shifting, and other interesting artifacts come into play). Distant Pop II stars have been observed in abundance by using this lens back into the past. The issues with finding Pop II's in Dwarf Galaxies arise when looking at Dwarf Galaxies near us, which are in the same relative "time frame" as we are here on earth.

Also, I'm hoping that the press release from Frebel's team misquoted her. She almost certainly did not say that Lores was as old as the Universe, since that would make it a Pop III star; she probably said it was as old as the Milky Way galaxy, which would make more sense.

There, I think that does it...

...well, unless you start talking about Multiverses.....eh, next time.

Friday, March 5, 2010

Another Timelapse of the Milky Way...this time over Mauna Kea

I posted one of these before, last year - taken by an amateur at a Texas star party. This one was taken at the optical observatories at Mauna Kea in Hawaii, and is far more "produced." It is, however, no less stunning.

It's a great way to end the week - enjoy.


The White Mountain from charles on Vimeo.

Thursday, February 18, 2010

The New(?) Microsoft Gives Us a Three-Way Horserace

For the last few months, my personal party-line has been that the cell phone OS wars are over: it's now iPhone and Android phones, with all other OS'es (BREW, Symbian, Windows Mobile, etc) playing the role of Dead Man Walking.

I had heard of the "Zune Phone," of course, as well as "Project Pink," "Windows Mobile 7" and a dozen other working names out of Redmond. However, like everyone else, I had made the mistake of counting Microsoft out of the game. They're old. They're slow. They have crappy marketing. Everyone hates Windows Mobile. (I mean, a stylus? Seriously? Who uses that?)

That was stupid not only of me, but of all the Apple Faithful out there who have been taking joy in the iPhone's trouncing of the mobile market. I don't blame them, I blame myself. I'm 900 years old; you would think I would have learned by now: Microsoft iterates towards a goal line. They take the criticism, the market hostility. And they wait and they watch and they learn. That's what they do. That's what they have always done.

They watched Atari and Commodore, and they built MS-DOS (tricking the Great IBM into both paying them to write the OS and allowing the proto-MS to keep it for themselves). They watched Apple and Atari and Commodore create GUIs as a new interface paradigm, and they slowly iterated their way into it with the horrible Windows 1.0, 2.0, 3.1 (all of which were just interfaces on top of MS-DOS), while the world laughed. Then Windows 95 showed up, and people stopped laughing. They watched as Sony and Nintendo duked it out with console game stations, and then they showed up with the XBox. What does Microsoft know about gaming and hardware? Apparently, a tremendous amount.

What's happening now is a renaissance for the company - again. Blaise Aguera y Arcas, an architect at Microsoft Live Labs (who is both a physical embodiment of the apparently hip, new crowd occupying One Microsoft Way, and representative of the "new thinking" going on there) garnered a standing ovation at the TED conference last week when he demo'ed the new Bing-based augmented reality maps. In the space of 15 minutes, Google Maps seemed old, stale and decidedly MapQuest-ish. This bears repeating: a hip, handsome, charismatic, non-nerdish, young Microsoft architect was given a standing ovation.

And then, back to my main focus here, there was this little ditty from the Mobile World Congress in Barcelona: Windows Phone 7. The Zune phone. Project Pink. The thing that had been the subject of behind-the-back snickering at every mobile gathering I've been to for the last 2 years. There it was... and it was...

  • ...not damned by faint praise in the press.
  • ...not thrown out for ridicule.
  • ...not considered to be "too little, too late"
  • ...not requiring you to use a stylus
It is, by all videos and hands-on experiences and advance reviews I can get my hands on, gorgeous. Intuitive. Fast. Easy. And, here's the kicker: undeniably hip. Hip? From Ballmer's Boys? Really?

OK, so the remnants of Microsoft of the last decade are there: "Windows Phone 7"? Seriously? Wake the eff up, Microsoft Marketing. Hire someone who didn't come from the enterprise software marketing world to name your software products. Hell, just walk down the hall to the hardware guys who named XBox, XBox 360, XBox Live, Zune... they could have called the Zune "Windows Media Portability 1.3," but they didn't. The world does not respond to the formulaic:

(Company Name) (Product Category) (Revision Number)

It's effing bullshit, we hate it, and it's killing your reputation. Just stop it!

Lame-ass marketing aside, everything about this combination of redesigned phone OS (Microsoft is wisely killing off the prior versions of WinMo OS'es, and starting fresh here - bad news for the developers, great news for everyone else in the world) plus strict OEM guidelines for phone construction screams that there's something new going on in Redmond.

Compared to its competition, this phone has also taken a completely different approach to its architectural philosophy: where the iPhone and Android are application-driven architectures, the Windows Phone 7 (dammit! OK, let me try "WP7" and see if that's easier) is data-driven. It's not a new philosophy; it's actually quite old, going back to the 70's. The idea surfaced a few times in a couple of consumer products, most notably the original Palm Pilot and the Apple Newton, the latter of which dubbed this architectural concept "data soup."

As opposed to an application-driven architecture - which relies on file transfers, data pipes and object passing at the OS level, and "copy and paste" exposed at the user interface level, to move information from application to application - information on a data-centric operating system lives together, with all applications sharing the same underlying "like" data structures.

How this manifests to the user is most beneficial on portable devices, where the user is often in a crowded environment, or harried. Rather than opening up a contacts entry and then locating a person's Twitter name, Last.fm neighborhood and phone number, the workflow on a data-centric device is more fluid. You may be listening to streaming music in the Zune marketplace on the WP7 device, notice that a friend who likes the same music is online at XBox Live, and so tweet her about her gaming choice. It's all together, live and connected all the time.
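To make the distinction concrete, here's a toy sketch of the data-soup idea (plain Python, and every name in it is mine - this illustrates the concept, not anything resembling the actual WP7 or Newton APIs): instead of each app owning its own copy of a contact, every app queries one shared pool of records.

```python
# A toy "data soup": one shared store of typed records, queried by any app.
# All names here are hypothetical -- a sketch of the concept, not a real API.

soup = []  # the single, shared pool of records

def add(record):
    soup.append(record)

def query(**criteria):
    """Return every record matching all of the given key/value pairs."""
    return [r for r in soup if all(r.get(k) == v for k, v in criteria.items())]

# One record, written once...
add({"type": "person", "name": "Alice", "twitter": "@alice",
     "lastfm": "alice_fm", "phone": "555-0100", "online": True})

# ...and read by any "app" that cares, with no copy-and-paste between silos:
music_friends = query(type="person", lastfm="alice_fm")   # the music app
chat_targets = query(type="person", online=True)          # the messaging app
print(chat_targets[0]["twitter"])  # -> @alice, ready to be tweeted at
```

The Twitter handle, the Last.fm neighborhood and the online status are all facets of one record, so any application can act on any facet without the file-transfer or copy-and-paste hop between silos.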

This sort of user workflow isn't for everyone, and some people will not want to adapt to it - but the point here is that it truly breaks the paradigm that we are all used to; the paradigm we have been taught (by Apple, Google, Nokia and others) is the way it has to be: that "there's an app for that." It's new, it's different, it is often more intuitive on a handheld device - and, most impressively, it comes from stodgy old Microsoft.

So, I'm changing my personal party-line slightly: the mobile OS wars are over, but now it's a three-way race: iPhone, Android and WP7... uh, WM7... uh, Windows Phone Mobi... Jesus... and that phone from Microsoft.

UPDATE: March 4, 2010. Sigh, bamboozled again. OK, this little piece of data doesn't obviate my contention that Windows Phone 7 makes it a three-way horse race, but it does kinda crap all over my assertion that Redmond has it together. I had assumed that when WP7 came out, all other phone projects from MS would be sent to the land of misfit toys. Not so, if today's post from Gizmodo is true: Confirmed: Project Pink Lives. Steve, Steve, Steve (no, not THAT Steve, the OTHER Steve)... what are you guys doing? Combine Pink and WP7 or whack one of them - when has marketplace confusion ever worked?

Monday, February 15, 2010

Hey - 2, Maybe 3 Years Late is Better than Nothing

Back in 2008, and again in 2009, I was all hot and bothered by the potential of OLED and MEMS display technologies for both roll-up color displays and low-power color eReaders. Well, neither year panned out, and so - like a bad gambler - I took my chips off the table for this year's predictions.

Well, great. The roll-up, flexible displays aren't to market yet, but the Qualcomm MEMS tech looks poised to make it to market this year. Watch as Qualcomm's marketing director, Cheryl Goodman, laughs at me for not sticking it out with a prediction:



Read more about this coolness at the LA Times.

Tuesday, February 9, 2010

Tales of the Sync Demon: Getting Android, PCs and Macs to Give It Up

If you're like me, your life is an amalgam of mobile devices, laptops, operating systems, and "clouds" (or whatever the effing buzz word is today). You've also got a job - maybe two - and a personal life - or, maybe not. (Sorry man...but you know who you are.) You aren't on your phone 100% of the time, and you aren't on your Mac or PC 100% of the time - yet you want your information on all devices.

My world consists of Androids, iPhones, Macs, PCs and Ubuntu boxes - so this conversation is confined to those worlds. It's the flip-side to the closed-ecosystem argument: dictatorships make the trains run on time, but openness requires diplomacy. However, since Apple, Microsoft and Google hate each other with the passion of 1000 white-hot burning suns, diplomacy isn't the easiest thing to come by...

So, after an evening of experimentation at the expense of my sanity, here's the deal for reliable sync'ing between Google Apps, Exchange, Android, and Exchange clients for BOTH the PC and the Mac. (I've left the iPhone and Windows Mobile out of this conversation since once your PC or Mac is sync'ed as described below, the iPhone and WinMo devices will just work properly. I left BlackBerrys out of this conversation because, well, what the hell are you doing with a BlackBerry?)

So, boiling it down to just Android, Macs and PCs, here's what you want:

  • ...to be able to use the native applications on the Android phone to interact with Exchange for work and Google Apps for personal stuff, and have all information available to all client applications - yet remain separated.
  • ...everything to happen in the background and over the air (OTA)
  • ...all three platforms (Android, PC and Mac) to sync within a reasonable time.

And, here are the problems with just setting things up using OEM-supplied applications:
  • ...Exchange sync for Android only works reliably for email, not for contacts. There is no calendar Exchange sync for the Nexus yet (although there is for the Droid, since Motorola modded the Exchange sync app)
  • ...other Android sync solutions (such as Touchdown) replace the email, calendar and contact apps on the handset, which blows enormous chunks
  • ...Google Apps-to-Exchange direct sync'ers require additional software on the server side, and since most people have IT organizations who will laugh at you if you suggest modifying their Exchange servers, you probably don't have that as an option.
Here's the solution methodology I decided to employ (yes, I said "solution methodology"):

The idea here is to have Google Apps, NOT Exchange, be the propagation root for calendar and contact information, while email is harvested directly from the Exchange server. This way, the Android phone is NOT the source of the sync - which is desirable. However, since we do not control our own Exchange servers, the calendar and contact sync relies on the PC and Mac client applications. This means that the syncs can only happen when the PC, the Mac or both are on and connected to the internet. However, in practice, you'll see this isn't such a big deal.

If you are using, say, a PC with Outlook when a meeting request comes through and you respond, the PC will sync with Google Calendar immediately and the meeting will appear on your phone. If your PC is off and you are just using your phone when a meeting request comes through, it will arrive as email in your phone's email client. You will accept the meeting and it will be placed immediately into your phone's calendar, and therefore into your Google calendar. It will not sync to your PC (and back to Exchange) until you turn your PC on, but who cares? The only downside here is that no one else will be able to see you have a meeting at the time you accepted until your PC is turned back on. (The same scenario works for the Mac as well.)
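If it helps to see why the hub-and-spoke arrangement works, here's a back-of-the-napkin sketch (plain Python, all names invented; this is the shape of the idea, not gsyncit's, SpanningSync's or any vendor's actual sync protocol). Each endpoint reconciles against the hub whenever it happens to be online, and endpoints never talk to each other:

```python
# Hub-and-spoke sync, sketched: Google Apps plays the propagation root (hub).
# Illustrative code only -- not anyone's real protocol.

hub = {}  # calendar/contact state on the hub: {item_id: (version, data)}

class Endpoint:  # the Android phone, the PC's Outlook, the Mac's iCal, etc.
    def __init__(self, name):
        self.name, self.state = name, {}

    def change(self, item_id, data):
        # a local edit bumps the item's version
        version = self.state.get(item_id, (0, None))[0] + 1
        self.state[item_id] = (version, data)

    def sync(self):
        # push anything newer than the hub's copy, then pull everything else
        for item_id, (version, data) in list(self.state.items()):
            if hub.get(item_id, (0, None))[0] < version:
                hub[item_id] = (version, data)
        self.state.update(hub)

phone, pc = Endpoint("android"), Endpoint("outlook")
phone.change("mtg-1", "Lunch w/ boss, noon")  # accepted on the phone...
phone.sync()                                  # ...lands on the hub immediately,
pc.sync()                                     # ...and on the PC when it wakes up.
print(pc.state["mtg-1"])                      # (1, 'Lunch w/ boss, noon')
```

The "who cares?" point above falls out of the topology: a sleeping endpoint just means a stale spoke, never a lost change, because the hub already holds the newest version.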

What you need on each platform:

On the PC:
  • Microsoft Outlook, talking to your company's Exchange server (available for $1B from http://www.microsoft.com/)
  • gsyncit ($15 at http://www.daveswebsite.com/software/gsync/) *
On the Mac:

  • iCal, Address Book, Mail.app (I could not find apps that worked for Entourage or the far superior Thunderbird)
  • SpanningSync ($25 at http://spanningsync.com/) *
On the Android:

  • Exchange for Android (for reliable Exchange mail) - if you want the email-calendar interaction described above, it is important for offline calendar sync'ing that you use Exchange for Android, and not IMAP, for your email
* Both gsyncit and SpanningSync allow for sync'ing to specific Google calendars and contact groups, allowing you to keep work and play separate.

To set everything up:
On the Android:
  • install Exchange for Android. This comes with Android 2.0 and above, and is available at Android Market for free for Android 1.5 and 1.6
  • set up Exchange for Android to point to your exchange server

On Google Apps:

  • Optional: split your calendar and your contacts into separate work and home groups

On the PC:
  • Install Outlook
  • Install gsyncit
  • Optional: have different Outlook calendars and contact groups pointing to the correct corresponding calendars and contact groups on Google Apps
On the Mac:
  • activate iCal, Address Book, and Mail.app
  • install SpanningSync
  • optional: have different iCal calendars and Address Book groups pointing to the correct corresponding calendars and contact groups on Google Apps
...and that, as they say, is that.

Now your phone, Google Apps online, your PC and your Mac will all be in sync within a delta measured in minutes. (If one or the other of your laptops is off, the delta is measured in the time it takes you to turn on your laptop(s).) If you also use an iPhone or WinMo phone, just set them up normally, and everything should work just fine.

Also, please be careful: a change in one object (say, a contact) will quickly propagate through everything you do. You can seriously whack out your information.... Outlook is the easiest to back up, of course, since you just copy the OST or PST file around. However, until you are comfortable that everything is working well, you should engage the syncs one at a time, until you are convinced they will do what you need them to do.

Anyway - that oughta do it. When Google fixes their Exchange sync application for Android, we'll all be able to turn off the 3rd-party sync'ers. (Actually, I'll leave gsyncit running, since one of its unintended consequences is to back up my personal contact info and calendars.)

This has been a public service announcement from your friendly IT department.

Saturday, January 30, 2010

My Humblest Apologies for a Few Bad Apples...

Just a quick post about Rocket Upkeep.

When I started this blog 3 years ago, I wanted to keep it very open. Anyone could comment on here, no requirement to log in or register, no moderation.

Unfortunately, over the past few months I've had to do quite a bit of comment deletion to remove spam postings. (Apparently, my posts attract the "how would you like to make zillions in foreign markets?" crowd.) So, as of today, I'm moving to a CAPTCHA method of verifying comment-posting authenticity. Still no need to register, still no moderation.

I'll leave the CAPTCHA in place for a few months, but if that doesn't do it, I'll have to put in a registration policy. Apologies for the extra inconvenience folks.

...I now return you to your regular scheduled tech rants.

Wednesday, January 27, 2010

The Macbook Air, Mark II....er, the iPad

OK, disclaimer: I have not seen this in person, I have not held one, I have not used one in any sort of fashion. This is a first impression from the specifications and cost factor only, and I am going to try to not be influenced by Apple media hype or naysayer responses.

So, like everyone else in the gadget-sphere, I was watching the various feeds this morning (ironically, on an Android phone) when the Jobster was out in front of the cameras, fondling the latest Apple fetish device: the unfortunately named iPad. (The NYTimes' "leaked" device name of iSlate was so much cooler.) When The Steve was finished, I was immediately struck by two impressions:

  • This is exactly what we all thought it was going to be...
  • This is nothing like what we all thought it was going to be...
Ultimately, the iPad is on a slider somewhere between an iPhone and a Macbook Air, with some of the capabilities of both, and a lot missing from each.

The Good

This is, like all Apple products, a pretty device. It's a perfect size for slipping into a briefcase or leaving on the kitchen table, and it looks as though it rests comfortably in your hand/lap when browsing the web. The optional 3G radio will increase its usability in the field (well, except for the whole AT&T thing again), allowing me to get Wired Online wherever I happen to be sitting.

It also seems to be fast - the live demos and marketing spiels showed this thing zipping through files, photos and film (you like my alliteration techniques, don't'cha?) like a device should behave. Time will tell if it bogs down as you start to fill it up with apps and images, but we'll give it the speed points for now.

The user interface experience has that Apple magic to it: the company has more than learned a few lessons from the iPhone, iPod Touch and the new Macbook multitouch trackpads. All of the gestures and motions look completely intuitive, now that we've been trained by 3 years of iPhoneage.

The always-on nowness of it is perfect... being able to pick it up to check movie times and order tickets as you're dashing out the door is excellent.

The Bad

There was a lot of build up to the iPad in the days leading up to this morning's announcement, some of it was justified (Apple getting into the tablet business), but some of it was misleading. The Apple invite ads (like the one to the right here) don't actually say what the device is about, but they imply a lot.

People and analysts have been reading into these Apple invitations for a long time, and for the most part the invitations have given a clear indication as to what was coming - so it was inevitable that the same logic would be applied to this multi-hued beauty above. The unspoken message is: We hear you artists, thanks for being our champions through the years, here's something for you now...

...except, of course, that's not what the iPad was about. There's no stylus for drawing, no iPaint specifically designed for the iPad (at least none that I've read about), no iMovie for the iPad... nothing specific to the creative artist at all. The iPad is designed for the consumer of art, not the creator of art. You can flip through photos, view the web, watch videos and listen to music. Yet, bizarrely, it lacks even a $10 pen stylus; that simple inclusion, plus enhanced iPaint applications, would have gone a long way to making the invitation fit the device - not to mention cementing "Apple" with "Creative Freedom" in the minds of the digerati for another generation.

Instead what we are left with is either a large iPhone or a keyboardless Macbook Air - and strangely, it has left out some of the best features of both. It borrowed the iPhone's overall looks, applications and slick UIX, but left out its core functionality: no voice or communications at all? A simple webcam and a relationship with Skype would have been all people wanted.

And from the Macbook Air? It's got the thin-and-light thing going on, plus the integration with all your Apple products at home - but it doesn't have a keyboard, seems significantly more fragile, and 64GB (the maximum storage) would get chewed up by even the most casual business user in about a day.

The other thing it doesn't have from the Macbook Air? Its operating system. The iPad is based on the same OS X derivation that the iPhone is - which means: no multitasking. How's that supposed to work if this is a slip-in replacement for a Macbook Air?

How about reading a book? The new iPad bookstore looks sweet. Yes, it does, but people are forgetting the main purpose behind eInk: eyestrain. eInk, as I've written about before, is all about mimicking paper reflectivity. It doesn't generate eyestrain. Staring at a backlit screen (even one as pretty as the iPad's) does. A backlit screen is awesome for magazines, less so for Moby Dick.

But, let's swallow that one and say I'm wrong. Let's say the iPad is perfect for reading Moby Dick; then what Apple is telling us is that you now need three products:
  • iPhone - for communications
  • MacBook Air or MacBook for laptop functionality
  • iPad for media consumption
Now my briefcase has less room when I travel than when I started?

Middle Ground?

Ok, so maybe the iPad is for the home. It's meant to be shared by families and friends as a kitchen or family-room utility object for reading newspapers and magazines, controlling your home iTunes center, or ordering takeout food. That would be fine at CrunchPad prices, but what we have instead is this:

Wifi only models:

  • 16 GB - $499
  • 32 GB - $599
  • 64 GB - $699

3G plus WiFi models:

  • 16 GB - $629
  • 32 GB - $729
  • 64 GB - $829
And, since it's AT&T, you're looking at additional pricing. Apparently up to 250 MB for $14.99, or unlimited data for $29.99. Free use of AT&T hotspots.

So...how is all of that middle ground? Is the average family going to spend another $500-$800 on a device that allows you to consume media (as a single viewer consumer...well, two if you snuggle), surf the web (like that laptop you already own), or just "leave lying around" ready to order Fandango tickets?

I'll stick by my previous prediction: It will sell to the faithful, but not be a major hit for Apple or an influencer for a "new category" of devices. (Unless that category is Tablet Computer That Can't Quite Do Everything A Computer Can.)

Maybe I'll change my mind when I get one in my hands...but I kinda doubt it...

Saturday, January 23, 2010

The Rocket 2010 (minus 3 Weeks) Predictions!

Remember how all those other tech blogs were doing 2010 reports back in 2009? Remember how you were reading them in that awkward place between Christmas and New Years, when you had nothing else to do but see Avatar again? Ha! Plebes! Anyone can do a prediction list in the weeks before the year being predicted...but it takes a true prognosticator to peer into the future year while that year is actually underway!!

OK, fine. I'm late. This prediction list is late by three weeks. Whatever. At least now I can claim to be right if I "predict" something that happened last week, right? No? Fine. Here we go...

  • Apple Tablet Lives, Receives Traditional Apple Fanboy Responses

    Considering that the Apple event happens in, oh, 5 days now and that you'd have to be completely asleep at the switch to not realize there's a tablet just around the corner, all that remains is the prediction over the response to the unit. It's going to be priced in Apple dollars, that's for damn sure, so will people buy it? Undoubtedly yes. There's the Apple contingent who would buy anything and everything that Cupertino puts out, so the market success of the product will carry it through release #1. What remains to be seen here is whether this will be placed in the "iPhone" or "Apple TV" category of success.

    The reality will be that it lands somewhere in the middle. While portable consumer devices are an overwhelming success story for the company - and for technology at large - Apple's market acceptance gets murkier as its devices become productivity tools. The Apple TV and Macbook Air products were not a success by any real measure, and for all the hype around traditional Mac computers, they still can't seem to break into double-digit market share and stay there.

    The success of the Apple Tablet will depend on a few points that none of us will be clear on for a few days: pricing (people are guessing in the upper $800's at the time of this writing), connectivity (AT&T? No thank you.), content (Apple does amazing things with content-partner deals), usability (long their forte), and lifestyle niche. (How does this fit into the user's current computer ecosystem?)

    We'll know the full extent of the story after all the hullabaloo of the launch dies down and the numbers start to roll in - definitely by the end of Q2 - as to whether or not this thing is an iPhone or an Apple TV.... but for my money, it's going to be more like their Macbook line: popular among the cool kids (and so a money and PR generator for Apple), but too expensive for the mainstream to be considered "a hit."

  • T-Mobile Moves to the Second Place Spot Behind Verizon
    Every year I shoot from the hip (and often hit my own foot in the process - thanks, Yahoo!) on at least one prediction, and this one is it. Dissatisfaction over AT&T's service and arrogance has reached an all-time high, Verizon's attack ads on the lackluster AT&T service are having a measurable impact, analysts have come out and said that AT&T needs to spend $5B just to reach par with its competition, and de la Vega has lost his fraking mind by putting the "blame" for the network's woes on the cutting-edge early adopters of AT&T technology. (The 10-point PowerPoint presentation the charmless de la Vega gave as a "keynote" at CTIA back in October was mind-numbing in both its blindness to the future and its stumping for why we need to charge for tiering.)

    What does all this have to do with T-Mobile? Well, while de la Vega is out there blaming all of you iPhone users for breaking his network, T-Mobile has quietly built out the most stable 3G network outside of Verizon in less than 2 years, and is currently retrofitting its brand new infrastructure to support 4G (via LTE) and HSPA+. Combined with the continued rollout of Android handsets, better marketing, GSM reliance, and a still failing Sprint, it won't take long before frustrated subscribers jump ship.

  • eBooks Open Up
    With the eBook floodgates opened (you should have seen the eBook section at CES 2010... companies I'd never heard of had some form of eBook reader), and the impending semi-thud of the Apple tablet showing up, eBook OEMs are going to have to stop their divvying up of content providers and open the doors to all comers.

    Steps were taken in this direction late last year: although it had supported the open EPUB format since 2008, Sony changed its primary distribution format from its proprietary Sony DRM format to exclusively use EPUB. This means that books I purchase from Sony's eReader store can now be used on any eBook reader that supports EPUB. Similarly, Amazon recently announced that it was cutting new deals with publishers and allowing 3rd-party developers to develop "apps" for the Kindle. (What a weird, lame step sideways.)

    All of this will culminate this year with the Kindle, Sony eReaders, Nook and other entrants focusing on selling hardware on its own merits, rather than locking readers into certain hardware based on which book or periodical you want to buy.

    Apple, of course, will continue to do its own thing and lock you into their ecosystem...and you'll just continue to buy into it, won't'ya? Sheesh.

  • 3D Television Makes a Frightening Sound When It Explodes on Impact, Scares the Children

    I just bitched about this in my CES 2010 review, but Jesus Christmas... seriously, Sony / LG / Pioneer / Toshiba / DirecTV / etc? No one wants this. No one.

    Sure, we all loved Avatar. I loved Avatar. The new TRON is going to be awe inspiring in 3D. It really will. But, good God, man...we all just spent a crapload of money on flatscreens in 2008, and you just convinced us to buy Lawrence of Arabia again on Blu-Ray. Do you think we're all gonna throw those out to buy $3000 3D monitors and new content in 2010 so we can, do what, watch the occasional effects movie or sporting event while wearing dork-vision spectacles? Do you seriously expect us to put those freakin' things on to watch Jay rape Conan?

    I only hope that the amount of money you have put into R&D and rushing these things to production, as well as the money you are about to spend on marketing, doesn't lead us all into another recession as your rocket flames out at high altitude and slams into the pavement below at supersonic speeds.

    Jeez, call me when you can do this 3D crap without the glasses, please. I'll be about ready to replace my flatscreen by then.

  • The New Cold War Begins

    Google and Hillary Clinton on the same side of the fence. Oh, what wonders have we wrought?

    Unless you have been living under a rock, or really absorbed in NBC late-night politics, by now you've heard that... uh, China has allegedly been trying to hack Gmail to get email from potential dissidents. This prompted a response from Google (ok, it was self-serving for a number of reasons, but still... a response) as well as the US State Department. For its part, the Chinese Foreign Ministry has issued a "bite me" statement, claiming that China has an "open internet" and the US should mind its own business if it doesn't want to harm Sino-American relations.

    Part of the benefit of being 900 years old is that I have a bit o' a memory of the last time this sort of talk was bandied about - all that's missing at this point is someone banging their shoe on a podium claiming that they will bury us.

    China is huge, and becoming an economic superpower in a world where information can no longer be tightly controlled by nation-states. The United States is an economic superpower recovering from a huge blow to its populace and ego. The stage is set for a showdown that has happened before, and the results are often not pretty - patriotism takes a back seat to xenophobia, the good of the people takes a backseat to national principles. Global economic recovery in 2010 could be hurt by an internet-fueled cold war redux.

    By the way, Ms Clinton? China? My gmail password is "WishIHadAProperGavel2." Just trying to save everybody some time.

  • Porn Goes the Way of Do-Do-Dildo

    Jeez, I had a lot I could have gone with for the header.

    "Porn No Longer Thrusts Hard into Consumers"

    ...or...

    "Porn Runs Out of Viagra, Millions Disappointed"

    ...or...

    Alright, alright...I'll get to the point. CES shares its venue each year with the Adult Entertainment Expo. While not as large or throbbing (alright, alright...I'll stop) as CES, the AVN is an important event in the porn industry. Like every other industry, they spend a few days doling out awards, patting themselves on their backs, and discussing new tech and film potentials. The event is seriously hyped, and traditionally takes up a few floors of the Sands Convention Center in Las Vegas.

    While attendance was up over last year by 10%, mostly due to the low $10 admission price to attract walk-ins, the size of the venue was significantly reduced, taking up only a quarter of last year's space. In an excellent piece in the Daily Beast, Richard Abowitz discusses the "Top 5 Reasons Porn-for-Profit is Dying." Among his arguments is, of course, file sharing - but more interestingly, he ranks the confluence of two other issues higher than piracy (or at least equal to it) in explaining the rapid demise:

    1) It's no longer taboo. The internet has demystified porn and made it "acceptable" to a certain extent. (Parents, close your eyes for the next part.) It's no longer considered a big deal if your girlfriend lets it all hang out online; sometimes it's even a matter of pride. (Oh sure, you'll never get a white-collar job again, but why quibble?) The end result is the same effect you are seeing in online media vs traditional media: there's more free content out there, so why pay?

    2) World of Warcraft. OK, not World of Warcraft specifically, but online gaming. Let's face it, the vast majority of porn-obsessed fans are the adolescent-to-20-something crowd who have a limited amount of time and money. It would appear that the discretionary dollars that used to go to "Nurse Nancy's Bedtime Videos" (I made that up, I swear!) now get converted to Linden Dollars to buy a virtual pair of jeans. It turns out when competing for dollars and time, porn loses.

    Online pay-for porn killed the back-alley porn shops, and this new combination of pressures looks like it is doing the same to online pay-for-porn. Still a financial powerhouse, porn is no longer the 800-pound gorilla it once was. Rocket prediction: by Q4 of this year, online pay-for-porn will realize less than 25% of its 2007 numbers.

    What will Ron Jeremy do now? Become a Starbucks Barista? (All together now: ewwww!)

  • Pressure is on IPv6 to Not Screw Up

    A lot of natural resources are scarce these days: clean water, oil and gas, IP addresses.

    Quick recap: IPv4 is the addressing scheme behind the four dot-separated numbers (each up to 3 digits) that most of you see when you are setting up a new computer or internet device (xxx.xxx.xxx.xxx), and it is the primary "phone number" system used over the internet for moving traffic from place to place. IPv4 was developed in the late 70's and put into place in January of 1980, replacing its aging predecessor. Like a lot of technology from that era, it was planned with remarkable shortsightedness and a lack of imagination for the future.

    OK, that's a little cruel - IPv4 is based on a 32-bit addressing scheme, and anything more back then would have been expensive to implement. Still, I maintain it was a failure of imagination to not conceive of a future where every lightswitch and RFID tag would have its own internet address.

    Well, like the old MS-DOS 640K memory limit, time's up, kids. IPv4's 32 bits yield only about 4.3 billion addresses in total (of which only about 18M are set aside as private addresses, and about 270M for multicast), and it doesn't take a research analyst to imagine that the world is dangerously close to that many internet-addressable devices. Not everyone can just use these addresses; there is an agency in existence, the NRO (Number Resource Organization), that doles these things out, and according to their most recent accounting this month, there's less than 10% of the addressable space left... which is annoying since just a few months prior, they were saying we had 18% of the addressable space left. Why? Uh, the geek equivalent of an accounting error.

    Fortunately, there is a savior. IPv6 is the next iteration of the IP addressing standard, and it widens addresses from 32 bits to 128 bits. That effectively gives IPv6 enough room for 3.4×10^38 (340 trillion trillion trillion) unique addresses - you can check the arithmetic yourself with the quick sketch at the end of this item. Yay! We're saved.

    Sort of. IPv6 has been around for a few years: the 2008 Summer Olympic Games, for instance, were a notable event in terms of IPv6 deployment, being the first time a major world event had a presence on the IPv6 Internet at http://ipv6.beijing2008.cn/en (IP addresses 2001:252:0:1::2008:6 and 2001:252:0:1::2008:8), and all network operations of the Games were conducted using IPv6. As cool as that was, OS's have been slow to adopt, and when they have, it's not been pretty. Vista, Windows 7 and recent OS X incarnations have included IPv6 in their protocol stacks, but have added it in a ham-handed way, often causing collisions between in-use IPv4 protocols and the newer IPv6 protocols.

    Nothing like a little pressure to help clear the mind, and running out of IP addresses might be just what the doctor ordered. Rocket prediction: 40% of traffic carried via IPv6 addresses by end of 2010.

    Yeah. I think I'll lose this one too.
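    As promised above, here's the quick address-space arithmetic you can run yourself (a Python sketch using only the standard library's ipaddress module; the RFC 1918 ranges are where the ~18M private-address figure above comes from):

    ```python
    import ipaddress

    # IPv4 is 32-bit addressing; IPv6 is 128-bit addressing.
    print(f"IPv4 total space: {2**32:,}")      # 4,294,967,296
    print(f"IPv6 total space: {2**128:.2e}")   # ~3.40e+38

    # The RFC 1918 private IPv4 ranges -- roughly the 18M figure quoted above:
    private = [ipaddress.ip_network(n) for n in
               ("10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16")]
    print(f"IPv4 private space: {sum(n.num_addresses for n in private):,}")  # 17,891,328
    ```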

  • Bye-bye Hulu

    At least in its current viewership numbers. I take no great pride in this prediction, but I do think that Hulu will be seriously crippled by year's end, if not completely shuttered.

    With Comcast's purchase of NBC, Hulu's future is in serious doubt. NBC is a major stakeholder in the internet streaming service, and Comcast has just rolled out Fancast as its own streaming solution for its customer base. You do the corporate math.

    And while you're cypherin', Jethro, factor in the dozens of other papercuts that Hulu has been dealt recently: content providers pressuring it to shut off alternative access channels, like Boxee; worse-than-expected ad revenue causing it to consider charging; and restricting content to less than a season's worth for many of the shows that it carries.

    Again, it doesn't fill me with joy to make this prediction, but I hope Hulu didn't spend all of its dollars on Super Bowl ads and Seth MacFarlane.

  • FCC Notwithstanding, Tiered Pricing for Internet and Mobile is Coming.
    In January the DC district court looked poised to blow the legs out from underneath the FCC, by suggesting that the Comcast/BitTorrent throttling case was a compelling argument for keeping the FCC's hands out of internet regulation. (See, if I had written this in December, like I was supposed to, I would have come to a different conclusion. Yeah yeah... I know.)

    Oddly, rather than doing handstands, Comcast's response to this was to back away slowly while muttering... uh, hahaha... uh... that's not exactly what we meant. Harold Feld has a good analysis of this on PublicKnowledge.org, where he claims that the reason for Comcast's reaction is that ISPs essentially do not want the FCC pulled from the debate, because they need an adult in the room. If the FCC is removed from the conversation, then when a crisis occurs the blame and responsibility fall on the ISPs, not the federal government.

    Suffice it to say, the story is far from over - but it will culminate in a tiered pricing structure from at least one of the major ISPs (land or mobile) by end of 2010. Let's hope it doesn't wind up being a disaster - a tiered structure is fine, actually, if implemented correctly. If not, however, it could mean the difference between the promise of the internet and the greed of big business.

  • Good Out of Horrible: Technology Breaks the Logjam of Giving

    The recent earthquake in Haiti was horrible. People's futures and lives will never be the same - and the outpouring of understanding, solidarity and support from the rest of the world (except for that jag-off Pat Robertson) has been heartening and awe-inspiring.

    Now, we've seen this story played out before: natural disasters and human atrocities claiming the attention and sympathy of the world. Yet, while people are concerned and give support, few take action. The reasons for this are entirely human. When tragedy strikes someone you don't know half a world away, there is a disconnect between the empathy and a call to action. Those who do act should be praised, but all too often, most of us do not. Finding your credit card and a number to call to send relief funds takes time, and that time often jars your mind out of the empathy you are feeling. If we are honest with ourselves, we've all experienced this.

    This time, however, there was a different component: the American Red Cross set up an SMS toll road for its International Response Fund. By texting “HAITI” to 90999 from your cell phone, $10 is donated to the relief effort and added to your cell phone bill. In today's cell-obsessed environment, it was a no-brainer. The message was distributed by the Red Cross via Twitter and Facebook, and the response was immediate. By January 19th, the SMS campaign had raised $22 million - a full fifth of the relief dollars generated by the Red Cross for Haiti. (For the curious, a rough sketch of the plumbing behind a campaign like this follows this list.)

    This was a watershed moment - prior to this, the Red Cross' highest bar for a technology-based giving campaign was $400,000. (In other words, an increase of 55 times.) Social networking platforms like Twitter and Facebook, the ubiquity of cell phones, the willingness and generosity of ordinary citizens, and the ease of the method have all converged at the right point in time to create a new way of giving - one that decades of tax breaks, philanthropy, and doe-eyed children pimped by last year's actors on late-night television have been unable to accomplish.

    Over the next year, the Red Cross, UNICEF, and other organizations will begin to combine their databases and cross-promote, using cell phone technologies and social networking to steer donations to the areas of the world that can best use them.

    And it bears repeating: By texting “HAITI” to 90999 from your cell phone, $10 is donated to the relief effort and added to your cell phone bill.
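
For the nerds in the audience, here's roughly what the receiving end of a keyword campaign like this might look like. Fair warning: this is a back-of-the-napkin sketch, not the Red Cross' (or any SMS aggregator's) actual system - the keyword table, the handler, and the billing helper are all my inventions. The real magic is that a premium-SMS aggregator asks your carrier to add the charge to your bill and then remits the funds to the charity.

    # Hypothetical sketch of a keyword-to-shortcode donation handler.
    # Everything here is conjecture; no real carrier or aggregator API.

    DONATION_KEYWORDS = {"HAITI": 10}  # keyword -> dollars added to the bill

    def handle_inbound_sms(sender_number, message_body):
        """Called by the (imaginary) SMS gateway for each text sent to 90999."""
        keyword = message_body.strip().upper()
        amount = DONATION_KEYWORDS.get(keyword)
        if amount is None:
            return  # not a donation keyword; ignore it
        # In reality, the aggregator bills the carrier, the carrier adds the
        # charge to the subscriber's bill, and the funds go to the charity.
        queue_premium_charge(sender_number, amount)

    def queue_premium_charge(phone_number, dollars):
        print(f"Queue a ${dollars} charge to {phone_number} for the relief fund")

    handle_inbound_sms("+15555550123", "HAITI")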
OK...that's it for this year, minus 3 weeks. Let's see how I do in 11 months. (The Apple thing's a shoo-in tho...)

Sunday, January 10, 2010

The Newest Who in Whoville...


....ok, a brief break from the usual techie stuff so we can spend a moment acknowledging David Tennant's passing into Who history with an unfortunately muddled final two-parter. Despite what was easily the worst-written send-off for a Doctor, Tennant is a tough act to follow - but I said that about Christopher Eccleston as well, so we'll see if the youngest actor ever to man the TARDIS, Matt Smith, can pull it off. The BBC press release pictures of him (on the right here) make him look like a direct response to "Edward" in the Twilight films, but last week's few moments of Smith after the transformation placed a wee bit of doubt in my skepticism. Smith seems to have absorbed the same manic brilliance, human alienness, and neurotic clear-thinking that have blessed/plagued the ten who came before him.

So, I'm up for it, let's see what he's got. To check it out for yourself, here's the new season trailer:

[Embedded video: the new season trailer]

...and check out IO9 for a nifty breakdown of the above.

Thursday, January 7, 2010

CES 2010...now in Amazing 3D!!!!!!!

Ok, I'm gonna 'fess up to cheating. My 2010 prediction post is late. Very late. I was waiting until I got to CES so that I could take a peek under the covers to see what there is to offer. So sue me.

I had assumed it would happen - so had everyone. This year's CES was gonna be all about EYE POPPING, EXCITING 3D TV!! I guess I wasn't mentally prepared for it though... Walking into the Las Vegas Convention Center was a little like a trip to a Best Buy in some alternate reality. I have no idea how many bazillions of dollars worth of display technology was littering the walls, ceilings, and the occasional floor, but it sure as hell didn't look like we ever had a recession. (Hell, even the drifters from the Adult Entertainment Expo at the Venetian were floored, and some of those people have been "3D augmented" for years.)

I would like to blame all of these panels on the success of Avatar, but the reality is that it takes years for this stuff to hit production, and 3D television has been bandied about for at least a decade. So call it a happy coincidence for James Cameron that his 3D Blu-ray edition of Avatar will now have something to play on.

Nonetheless, the question remains - will this transparent attempt by the consumer electronics industry to get you to ditch your plasmas and LCD screens pay off? The OEMs seem to have convinced themselves that all of us have convinced ourselves that television sets are commodity items, requiring repurchase every few years like laptops. I don't think that way (I love my energy-gobbling plasma, thank you very much), and I suspect that without a compelling reason to switch, most of you don't either.

So, is 3D compelling enough for you to ditch the $3000 machine you just hung on your wall a scant 2 years ago? I'll get to that in a minute, but let's just go over what I saw this afternoon...

I didn't have to try very hard. Every major OEM is out here in full force promoting 3D, thin, or, yes, even transparent display technology. LG, Samsung, Sony, and even no-one-has-ever-heard-of-it TCL are all vying for top "WOW" factor. The opening image above was from the double-billboard-sized display for LG's new Infinia line of panels. These things are thin. Razor thin. At only about a quarter inch in depth, they are beveled in glass, causing the display to appear to float off the wall mounting. The Infinia line is more a marketing play than a technology, as Infinias come in both LCD and plasma flavors.

The Infinia line also boasts 3D technology, in both active (LCD shutter glasses required) and passive (polarized, non-shuttered glasses required) flavors - both behave about the same, with a slight dip in color brightness on the passive models. As far as I know, the Infinia line of monitors from LG is available now....
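If you're wondering what "active" versus "passive" actually means under the hood, here's a toy way to think about it - my own sketch, not LG's implementation. Active is time-multiplexed: the panel alternates whole left/right frames while the shutter glasses black out one eye at a time. Passive is space-multiplexed: alternating pixel rows carry opposite polarization, one eye per row, which is also why passive gives up a little brightness and vertical resolution.

    import numpy as np

    # Toy stereo pair: constant "images" standing in for real camera views.
    h, w = 4, 8
    left = np.full((h, w), 1, dtype=np.uint8)   # pixels meant for the left eye
    right = np.full((h, w), 2, dtype=np.uint8)  # pixels meant for the right eye

    # Active (shutter glasses): the panel alternates full left/right frames
    # at high refresh while the glasses black out the opposite eye.
    frame_sequence = [left, right, left, right]  # L, R, L, R, ...

    # Passive (polarized glasses): alternating rows carry opposite
    # polarization, so each eye sees every other row simultaneously.
    passive_frame = left.copy()
    passive_frame[1::2, :] = right[1::2, :]

    print(passive_frame)  # rows of 1s and 2s - one eye per row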

Samsung is introducing a similar fleet of thin, 3D-capable displays - although these are active-matrix OLED (AMOLED) powered displays, stealing the thunder from Sony's OLED attempt a few years ago. Unlike the Sony, the Samsungs come in sizes bigger than 11 inches. (This would be a perfect time for another crack at the Adult Expo going on next door, but it's too easy....oh, the hell with it: "...yet, next door at the Adult Expo, 11-inch technology is considered to be all the rage..." There. Happy now?)

The Samsung AMOLED displays are stunning. Not only are the specs on these things off the charts (100,000:1 contrast ratio for starters), but Samsung developed a sense of design somewhere along the way. These things range in size from 20" up to what looks like a 50" display, encased in stainless steel. Really. I know I drool over electronics a lot, but this is something else entirely. Click on the images I provided and take a look for yourself.

Samsung also had a transparent display. Really. Straight out of Minority Report, Dollhouse, or Avatar. It was unnamed, and connected to a standard Windows 7 laptop to give the display something to drive it, but the applications for this type of thing go way beyond the "cool prop for a TV show" category. I couldn't get a clear picture of it, because this guy below was filming for Wired - but he did a great job, so let's just use him, shall we?

[Embedded video: Wired's footage of the Samsung transparent display]

The most interesting 3D display technology for my buck, though, came from a company I'd never heard of: TCL ("The Creative Life") out of China has created a 3D display that does not require glasses. The photo I've attached here does not do the display justice. You'll have to trust me when I say that the images popped off the screen and allowed for about 100 degrees of viewing.

The "however" part of this announcement (you knew there had to be one) had to do with the tech they used to achieve this effect. It order to trick your eye in to seeing a 3D image, TCL turned their entire display surface in to a giant, lenticular lens.

The resulting images look great in 3D - although I have my doubts as to whether the system can properly support 2D, since the lenticular patterns are cut into the display itself. The whole thing would probably look like it was being shown through a giant Fresnel lens. The literature for the device claims it does display 2D, but the booth ba...uh, helpful TCL personnel couldn't really answer my question and were unable to play a 2D video on these displays...so.... I dunno.
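For the curious, the column-slicing trick behind a lenticular display is simple enough to fake in a few lines. This is my own toy two-view version - TCL is presumably interleaving many more views per lenslet to get that 100-degree arc - and the solid-color "views" are stand-ins for real stereo imagery:

    import numpy as np
    from PIL import Image

    # Stand-in stereo pair: solid red and blue frames instead of camera views.
    height, width = 1080, 1920
    left = np.zeros((height, width, 3), dtype=np.uint8)
    left[..., 0] = 255                      # left-eye view: all red
    right = np.zeros((height, width, 3), dtype=np.uint8)
    right[..., 2] = 255                     # right-eye view: all blue

    # The lenticular trick: alternate single-pixel columns from each view.
    # Cylindrical lenslets bonded over the panel steer even columns toward
    # one eye and odd columns toward the other - no glasses required.
    interleaved = left.copy()
    interleaved[:, 1::2] = right[:, 1::2]

    Image.fromarray(interleaved).save("interleaved.png")

It also makes the 2D question obvious: the lens array that splits those columns apart is physically part of the screen, so a flat image has to fight it.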

Which brings us to the whole question of the practicality and desirability of 3D displays that I raised at the start of this posting. Is the tech compelling enough for hordes of people to run down to Best Buy next Christmas? There's the content argument (there isn't much of it), but that's not really important - if there's a big rush of orders for these things, there will be content. It's more a question of: does anyone care? There are specialized, niche cases where 3D makes sense - football games, over-the-top "event" movies - but, really, are you gonna put on a pair of goofy glasses to watch Modern Family or Craig Ferguson? I doubt it - but lord knows I've been wrong before.

There is one exception to this, though: gaming. I wandered over to the Sony booth, and played around with PS3's running 3D versions of existing games. The results were pretty incredible (Little Big Planet is already addicting enough without the third dimension), and gamers are already used to festooning (Yes, it's a freakin' word! Look it up.) themselves with all sorts of crap. They also spend huge amounts of money on their gaming systems, so it isn't out of the question that they would toss down another $5K on a 3D monitor.

So that's the 3D and thinness fun from today's little CES excursion. What else was there?

Well, for one thing, everyone seemed to be all about the blue this year. All the giant Veridian Dynamics types must have shared the same space designers, because everything looked like a crappy nightclub...or a Virgin America flight.

And...let's see....oh yes, Sony needs to fire the product marketing genius who came up with the name "Bravia Monolithic Design" for their new line of high performance displays. In a show filled with wafer-thin little anorexic displays, was "Monolithic" really the best word you could come up with?

Finally, there was poor Toshiba. Still feeling the sting of watching HD-DVD get the living crap kicked out of it by Blu-ray, they refuse to give up...kinda like the ugly American tourist who thinks that if you just talk louder and slower at the Spanish, they'll understand what you're saying. In the face of all odds, Toshiba was touting its "Cell TV" technology.

What's "Cell TV" technology? I have no idea. Furthermore, neither did anyone at the Toshiba booth. Really. I was passed from booth person to booth person in a sort of "that guy over there knows" fashion. It was very bizarre. (Fortunately, Gizmodo figured it out.) However, gleaning what I could from obliquely phrase posters and marketing hype videos, it has something to do with a combination of a number of technology to make the pixels brighter, convert 2D to 3D and, ahem, upscale a 1920×1080 image to a 4K image.

Yeah. I crap you not. Apparently, someone at Toshiba thinks that 1920x1080 on a 50" monitor just isn't going to cut it with the kids anymore. So we need to uprez! More pixelizerers! More! Better!! Yeah! And then...they'll grow tired of uprezzing and want the real deal! Then they'll need physical media to support it! YES!! HD-DVD is BACK, baby! HD-DVD II!!! YEAH! 4K.
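And for the record, here's the dumbest possible version of an "uprez" - a hedged sketch on my part, since whatever secret sauce Toshiba's Cell processor adds is beyond me. The point stands regardless: resampling only blends the pixels you already have; it can't conjure detail the 1080p source never captured.

    from PIL import Image

    # Stand-in 1080p frame (a real one would come off a disc or a broadcast).
    frame = Image.new("RGB", (1920, 1080), (40, 90, 160))

    # "Uprezzing" to quad-HD is just resampling: every new pixel is a weighted
    # blend of its existing 1080p neighbors. No new detail is created.
    uprezzed = frame.resize((3840, 2160), Image.BICUBIC)
    uprezzed.save("frame_4k.png")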

Sigh. Here's the deal. They had this display running. Split screen. 1920x1080 on one side, uprezzed 4K on the other. As God is my witness, I couldn't tell the difference. At all. I tried, I really did. I stared and squinted, I got closer and farther away, and I tilted my head. I even have bionic eyes now. Nothing.

Dear Toshiba: Enough is enough. It's over. Just...stop it. You're embarrassing me and your mother. Just...I dunno. Make thin screens or something. Love, Rocketman.