Posted by: Brij Bhushan, Saturday 19 October 2013


It's not often that I use a new 'photo capture' app and find myself impressed within seconds. It's not that there isn't a lot of cool stuff being built out there; it's just that the frontiers are getting closer and easier to predict.


That's not true with Seene, an app by computer vision company Obvious Engineering that leverages smartphone sensors and WebGL to present curious and eerie 3D scenes. The app is the product of a four-man team that includes CEO Andrew McPhee and CTO Sam Hare. The 'seenes' themselves are images mapped onto a rough 3D model of your subject, giving you the feeling of being able to shift perspective even after you've shot the image.


This produces small three-dimensional digital dioramas of a moment in time and space.


The capture process is simple. You tap on the capture button to shoot an image and then turn your device to capture the sides, top and bottom of your subject. Just a few degrees will do. The image is then processed and mapped onto a simple object that approximates 3D space. You can then view it in 3D or share it with others.
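
Seene hasn't published how its capture pipeline actually works, but as a rough, hypothetical sketch of the general idea, here is what motion-triggered multi-frame capture could look like using standard browser APIs (getUserMedia and DeviceOrientationEvent). The angle step, frame count, and function name are made up for illustration, and the captured frames would still need a depth-estimation and meshing step that isn't shown here.

```ts
// Sketch: grab extra video frames whenever the device has rotated a few
// degrees past the last captured pose, mimicking multi-angle capture.
// These browser APIs stand in for whatever Seene uses natively on iOS;
// the thresholds are illustrative assumptions, not Seene's values.

const ANGLE_STEP_DEG = 3; // capture a new frame roughly every 3 degrees of rotation
const MAX_FRAMES = 8;     // a handful of viewpoints for a rough model

async function captureSweep(video: HTMLVideoElement): Promise<ImageBitmap[]> {
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { facingMode: "environment" },
  });
  video.srcObject = stream;
  await video.play();

  const frames: ImageBitmap[] = [];
  let lastAlpha: number | null = null;
  let lastBeta: number | null = null;

  return new Promise((resolve) => {
    // Note: iOS Safari also requires DeviceOrientationEvent.requestPermission()
    // before these events fire.
    const onOrientation = async (e: DeviceOrientationEvent) => {
      if (e.alpha == null || e.beta == null) return;
      const moved =
        lastAlpha === null || lastBeta === null ||
        Math.abs(e.alpha - lastAlpha) >= ANGLE_STEP_DEG ||
        Math.abs(e.beta - lastBeta) >= ANGLE_STEP_DEG;
      if (!moved) return;

      lastAlpha = e.alpha;
      lastBeta = e.beta;
      frames.push(await createImageBitmap(video)); // snapshot the current frame

      if (frames.length >= MAX_FRAMES) {
        window.removeEventListener("deviceorientation", onOrientation);
        stream.getTracks().forEach((t) => t.stop());
        resolve(frames); // hand off to depth estimation / mesh fitting (not shown)
      }
    };
    window.addEventListener("deviceorientation", onOrientation);
  });
}
```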



There are a couple of interesting components to Seene, in my view. First, it has the same sort of post-capture feel as Lytro, the light-field camera that everyone loves but that has failed to gain much traction in its current hardware form. The power of that kind of experience lies in the way it 'explodes' static images out into things that approximate human vision. In the case of Lytro, it's the way our eyes nearly instantly re-focus as they travel from object to object. With Seene, it's the simple but compelling parallax of our binocular vision that creates a feeling of 'being there'.


Seene is also an experience that couldn’t exist in the way that it does without the smartphone, something it has in common with other interesting services like FrontBack and Vine. You couldn’t capture a Seene without a mobile camera in your pocket attached to accelerometers and a powerful dual-core processor that renders the images. The only mass-produced product like this that’s ever been made is the smartphone.



The processing power required is one reason that only the iPhone 4S and newer devices can create Seenes, though most other devices can view them.


There have been plenty of other experiments using computer vision to model 3D scenes on the web, but Seene doesn't use cloud processing to accomplish the unique images it produces. Instead, everything from capture to mapping to processing is done right on the device. And the processing time is nearly instantaneous: on new devices like the iPhone 5s, a fully rendered Seene pops up almost immediately after shooting.


Obvious Engineering was founded in March of 2012 and typically works on client projects using its computer vision expertise. Seene, says McPhee, came out of the team's desire to build "something that was our idea."


"Photos drive social communication," McPhee says, and that made the team want to do something on a 'mass scale', something with the potential to reach hundreds of millions of users. It's the first thing they've attempted on this scale.



The experience of viewing a Seene in the hand is fairly visceral: tilting your hand or body moves the 3D image around as if you were 'looking around the corner' of a photograph. I've felt the desire to do that with really compelling images before, but this is the first time I've actually been able to, and the effect has really impressed me.


You can also view Seenes in WebGL-compatible browsers like Chrome and newer versions of Safari.
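
As a concrete illustration of the viewing side, here is a minimal three.js sketch of the general 'photo draped over a rough 3D model' technique: a coarse plane is displaced by a depth map, textured with the photo, and the camera is nudged by pointer position (or device tilt on a phone) to give the look-around parallax. The photo.jpg and depth.png files, the displacement scale, and the mesh resolution are placeholder assumptions; this is the generic technique, not Seene's own renderer.

```ts
// Sketch of a Seene-like viewer: a photo draped over a coarse depth mesh,
// with the camera nudged by the pointer to fake the "look around the corner"
// parallax. Inputs are placeholders.
import * as THREE from "three";

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(45, innerWidth / innerHeight, 0.1, 10);
camera.position.z = 2;

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

const loader = new THREE.TextureLoader();
const photo = loader.load("photo.jpg"); // the captured image (placeholder)
const depth = loader.load("depth.png"); // coarse depth map (placeholder)

// A low-poly plane is enough: the point is a rough model, not exact geometry.
const material = new THREE.MeshStandardMaterial({
  map: photo,
  displacementMap: depth, // push vertices outward according to the depth map
  displacementScale: 0.3, // illustrative value
});
const mesh = new THREE.Mesh(new THREE.PlaneGeometry(1.5, 1.5, 64, 64), material);
scene.add(mesh);
scene.add(new THREE.AmbientLight(0xffffff, 1));

// Shift the camera slightly with the pointer so the foreground slides against
// the background; that motion parallax is what sells the 3D effect.
let targetX = 0;
let targetY = 0;
addEventListener("pointermove", (e) => {
  targetX = (e.clientX / innerWidth - 0.5) * 0.4;
  targetY = (0.5 - e.clientY / innerHeight) * 0.4;
});

renderer.setAnimationLoop(() => {
  camera.position.x += (targetX - camera.position.x) * 0.1;
  camera.position.y += (targetY - camera.position.y) * 0.1;
  camera.lookAt(mesh.position);
  renderer.render(scene, camera);
});
```

On a phone, the pointer handler would simply be swapped for device-orientation input, which is what makes the tilt-to-peek effect described above feel so direct.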


Hare says testers in London, including directors and photographers, have been producing compelling material that 'feels' like a photograph while using the app in ways the team hadn't foreseen. McPhee says these results come from users who have "different ways of looking at the world."


And indeed, some of the Seenes out there are pretty clever, though it does take a bit of experimentation to get results that look great. In my experience, the best subjects are shot at a medium distance, not close up. Moving subjects aren't really an option at this point, though an image of a fountain I shot did give the impression that water droplets were hovering in mid-air, which was very cool.


The team is currently bootstrapped but looking for funding. I'm not sure what kind of future an app like Seene has at scale without the welcoming arms of a larger entity, but the initial experience is fairly compelling.


The danger here, of course, is that there are all sorts of compelling silos and feeds out there vying for our attention. Instagram, Vine, FrontBack and more all create vertical streams of cool, clever things. But there are only so many hours in the day. Is Seene compelling enough to slice off a chunk of that time?


If it gets enough traction and people take to the unique 'seenes' it presents, there could be something here. I hope so: the team says they have a bunch of ideas for making the app better, plus some cool features that they need resources to execute. So far, I'm intrigued.


You can grab Seene on the App Store here.






