
Legerdemain

CES looms, as it annually does, and soon we will all be awash in the deluge; the international carnival of gadgetry shows no sign of slowing. But beyond this yearly cycle, a longer pattern is about to reach an inflection point.


Mainstream technology is not exactly a paragon of ingenuity. The advances that trickle down to us as consumers are quite prosaic, really, compared to the high-risk world of startups (a few of them, anyway) or the churning erudition of academia and serious R&D. This lack of ingenuity manifests itself in a dozen ways, from acquisition culture to a general failure to grasp the zeitgeist, but the one I think matters the most at the moment is the tendency to advance by accretion.


Basically, it’s bullet-list syndrome. When the underlying technology doesn’t change much, one adds features so that people think the new thing is better than the old thing. Cars have always been a good example of this: a phase occurs between major changes (the seatbelt, for instance, or electronic fuel injection, or dash computers) when manufacturers compete on widgets, add-ons, luxuries, customizations — things inconsequential in themselves, but a moonroof or short-throw shifter is a useful psychological tool to make the pot look sweeter without adding any honey.


That’s what we’ve been seeing the last few years in consumer tech. Certainly there have been quantitative improvements in a few individual components, notably displays, wireless bandwidth, and processors, but beyond that our computers, phones, tablets, hi-fis, headsets, routers, coffee makers, refrigerators, webcams, and so on have remained largely the same.


Of course, one may reasonably say, that could be because of the greater amount of “innovation” being achieved in the area of software. But innovation isn’t a limited quantity that must be expended in one direction or another. Besides, Internet-connected apps and services have blown up mostly because of ubiquity, a consequence of ease of adoption, itself a result of microprocessors and flash storage reaching a certain efficiency and price.


At any rate, stagnation is occurring, which historically can be recognized by how different you are told things are. The iPhone and the Galaxy S 4 — what could be less alike, judging by the Super Bowl ads to which we will no doubt soon be subjected? Except that they perform the exact same tasks, use almost identical interactions, access the same 10 or 20 major Internet services, and are, in many important ways, as physically indistinguishable as two peas in a pod.


The aspects in which we are told they differ, from pixel density to virtual assistant quality to wireless speed, are red herrings designed to draw the consumer’s attention; like a laugh track or “applause” sign, they’re signals that these, and not the innumerable similarities, are what you must consider. That they are not self-evident and you must therefore be told about them is testament to their negligibility.


These are the parlor tricks that Apple and Nokia and Samsung are attempting to foist upon a neophilic customer base, one that desperately wants real magic but will accept sleight of hand if it's convincing enough.


Tablets, too, are this way, and TVs, and fitness bracelets, and laptops, and gaming consoles, and so on and so forth.


This isn’t exactly a problem for consumers, since generally it means things have reached a high degree of effectiveness. I don’t know if you’ve noticed, but everything is great! TVs are huge and have excellent pictures. You have coverage just about everywhere and can watch HBO shows in HD on your phone on the train. Laptops can do serious work, even cheap ones, and not just Excel and email — video editing, high quality gaming.


But when everything is great, people stop buying new versions of things. And if you can’t do to the iPhone what the iPhone did to the Treo, you need to start putting bullets on lists.


Yet at some point, the list gets so long that people stop reading it, or else stop believing it. This is the inflection point I think we’re approaching. No one bought fridges that tweet whenever they’re opened, and no one buys a Galaxy S 4 because of some obscure networked dual-camera selfie stamp book, or whatever other garbage they’ve crammed into that awful thing.


At some point, things have to change in ways other than “more.” Sometimes less is the answer (as I’ve written perhaps too often), even within high tech: the Kindle, for instance, was (and remains) a very limited device; originally it wasn’t even better than the paperbacks it was meant to replace. And the original iPhone, let us not forget, was notoriously feature-poor, lacking rudimentary functionality found in flip phones worldwide. But both were new in a deeper sense: they leveraged a powerful and promising technology to change the way people thought about what devices could be used for.


The next logical step along the path of proliferation (due to small, cheap microprocessors and memory) after devices that do a lot is devices that do too much — and after that, it’s devices that do very little. This last is the category that is making its real debut this year, in the guise of “wearables” and, more broadly, the “Internet of things.” The fundamental idea here is imbuing simple things with simple intelligence, though trifles like digital pedometers and proximity-aware dongles look for all the world like parlor tricks. There is reason to think that this trend will in fact create something truly new and interesting, even if the early results are a little precious.


Punctuated equilibrium is the rule in tech, and we haven’t seen any decisive punctuation in quite some time. Meanwhile the bland run-on sentence encompassing today’s most common consumer electronics is growing ungrammatical as the additions make less and less sense. And my guess is it will drone on for another couple years (not unlike some columns).


What will jump-start the next phase? Is it, as some suggest, the ascension of coffee mugs, toasters, and keychains to a digital sentience? Will it accommodate and embrace the past or make a clean break? Have we heard of it, or is it taking shape in the obscure skunkworks of Apple or IBM? I don’t know — and I suspect the prestidigitators at CES don’t know either.






