News and discussions about image sensors
Seems kinda cool, except that the website's refocus thing doesn't work. The depth maps also don't look right: texture from the images tends to bleed into the depth map, which makes no sense to me. Delineations between squares on the chessboard should not show up in the depth map. What the company has demonstrated is that they're not even capable of implementing published algorithms to extract depth from multiple views.
The IQ is bad. Also, who cares? Why would anyone refocus pics that aren't theirs? Pelican: first you make a camera. Then you sell it. Then the buyer takes a picture. Then he might play with the focus (I doubt it). You got the order wrong.
Refocusing images is a ridiculous use case, and Pelican's continued focus on it shows they have no imagination. Everyone will have post-capture refocus in their mobile camera by next year without using a power-hungry, large, and expensive Pelican camera. Ultimately, Pelican has no real innovation, and it will be gobbled up by Rambus so they can sue people using their bogus patents. Really, how useful are patents on a device that sucks?
Didn't the new (current) CEO come from Rambus?
Pickett was at Scambus and Imaging is their next target.
The delineations are suspicious. I assume they show up because there is high contrast in those areas, so the imager can actually produce a somewhat accurate depth at those points. The rest of each square is a solid colour because it can't provide any depth there, so it averages. It may yet be useful for decorators and for shopping by size, so someone may yet create the killer app.
If you're right, then most of the depth map is actually "averaged data" (aka fake). A lot of depth camera companies have tried to market to the consumer a shopping experience for size measurements. This is certainly a good use. The problem is that the error for something like what Pelican Imaging provides is probably too high to be useful. In order to measure a 1.5m table, I would want to stand 1-2 meters away, and the farthest point is going to be 2-3m away from the camera. At 3 meters away, based on their own (presumably) empirical data (http://2.bp.blogspot.com/-MoY5EQ5IZMQ/UuFoHAy-yRI/AAAAAAAAIRk/sWTvZcse3O8/s1600/PiCam+Depth+Resolution.jpg), the error is 600cm! If I have allocated a 1-meter-wide hole for a table, then buying a 1.2m-wide table by accident due to measurement error is a serious problem.
You should learn math :-)
You're probably referring to the typo of 600cm instead of 600mm. It doesn't change the argument: an error that large at close range (a few meters) makes it worthless for most applications.
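The quadratic growth of depth error with distance that the chart above shows is a general property of stereo/multi-view ranging, and can be sketched with the textbook disparity model. The parameter values below (focal length in pixels, 1cm baseline, sub-pixel disparity error) are illustrative assumptions, not Pelican's actual specs:

```python
# Standard stereo depth-error model: depth z = f*b/d (d = disparity in
# pixels), so an uncertainty of delta_d pixels in matching gives
# delta_z ≈ z^2 * delta_d / (f * b) -- error grows with the SQUARE of range.
# All numbers here are hypothetical, chosen only to show the trend.

def stereo_depth_error(z_m, baseline_m=0.01, focal_px=800.0, disp_err_px=0.25):
    """Approximate depth uncertainty (meters) at range z_m (meters)."""
    return (z_m ** 2) * disp_err_px / (focal_px * baseline_m)

if __name__ == "__main__":
    for z in (1.0, 2.0, 3.0):
        err_cm = stereo_depth_error(z) * 100.0
        print(f"range {z:.0f} m -> depth error ~ {err_cm:.0f} cm")
```

With a baseline of only about a centimeter, even optimistic sub-pixel matching leaves tens of centimeters of error at 3m, which is consistent with the point being argued: fine for segmenting a scene, not for measuring furniture.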
The depth map is pretty reasonable compared to what a Kinect would output; a Kinect might even fail outdoors. That's not bad for a stereo camera with a baseline this small (about 1cm). Too bad these folks are unable to show off anything worthwhile despite having a reasonable interpolated depth map.
If that's a joke, then hah. The accuracy of Kinect's depth map is over an order of magnitude better than this. Kinect fails when there is extremely bright sunlight, but it works in complete darkness... there's no free lunch.