
I sometimes look over the latest SBIR (Small Business Innovation Research) topics that come out to see if there's anything interesting. Normally the DOD SBIRs (from the Army, Navy, Air Force, and other agencies) have some way-out stuff they are looking to fund.
Better photo analysis
Something about this National Geospatial-Intelligence Agency (NGA) SBIR topic caught my eye. Essentially, what they are looking for is more ways to interact with images taken from overhead.
In the old days, film would be laid on a light table and a scope would be hauled over a spot to examine it in detail. Because the whole photo was spread out in front of the analyst, they had an overview of the entire scene to lend context to their detailed analysis.
Today, this is all done on desktop view stations, using a mouse or trackpad for photo manipulation and examination. However, this approach limits the analyst to using one hand or the other on a 2D surface to analyze a scene. What NGA is looking for is a better approach, one that uses more of a person's capabilities to help navigate and analyze a scene in detail.
Enter the iPad
I can’t help but wonder if this couldn’t be an application for the iPad.
I foresee an approach where the iPad is used as a sort of magnifying glass for a photo or image, with the scene projected onto a floor or a table. The analyst positions the iPad over the scene to view a particular portion in detail. The iPad could hang from a suspension system that records its movements in 3D and provides precise positioning of the iPad relative to the photo, so the app knows where over the scene the iPad sits and can magnify that view. Of course, the analyst could still use the standard iPad hand gestures to zoom in or out of the scene. Possibly, the iPad's vertical (z) position could zoom in or out of the whole scene, leaving the gesture magnification alone, while other positions and orientations (the angle relative to the scene) could move the scene underneath the iPad and analyst.
Using a suspension system is probably the easiest way to interface with the iPad app, but there's no reason some sort of WiFi- or GPS-augmented location triangulation couldn't provide the same sorts of information. It would seem to me that an X, Y, and Z location could be had with such a system, and perhaps even the orientation of the screen could be supplied to provide a proper overlay of the scene being shown.
The nice thing about such a system (without the suspension) is that it would potentially work for multiple analysts using multiple iPads, and possibly other iOS devices. Such a capability would not be unlike what was available with real film and magnifying glasses. The other advantage to having the iPad, or any tablet, above a simulated light table is that the analyst could look around the iPad to see more context if needed.
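To make the idea a little more concrete, here's a rough sketch in Swift of how a tracked pose could be turned into a view of the scene. Everything in it is my own assumption rather than anything from the topic write-up: the tracking system is assumed to report position in meters relative to the projection, the mapping from height to magnification is a simple linear one, and all of the names (TabletPose, Viewport, and so on) are made up for illustration.

```swift
import Foundation

// A minimal sketch of the pose-to-view mapping described above. The names,
// units, and linear mapping are all assumptions on my part; they presume some
// tracking system (suspension rig, WiFi triangulation, etc.) that reports the
// tablet's position in meters relative to the projected scene.

/// Tracked pose of the tablet over the projected scene.
struct TabletPose {
    var x: Double      // meters from the scene's left edge
    var y: Double      // meters from the scene's top edge
    var z: Double      // height above the table, in meters
    var yaw: Double    // rotation about the vertical axis, in radians
}

/// The portion of the full-resolution image the tablet should display.
struct Viewport {
    var centerX: Double   // pixel coordinates in the source image
    var centerY: Double
    var scale: Double     // fraction of the full overview (1.0 = whole scene, smaller = zoomed in)
    var rotation: Double  // radians, so the overlay stays aligned with the scene below
}

/// Maps a physical pose to a viewport, assuming the projection covers
/// tableWidth x tableHeight meters and the source image is
/// imageWidth x imageHeight pixels.
func viewport(for pose: TabletPose,
              tableWidth: Double, tableHeight: Double,
              imageWidth: Double, imageHeight: Double,
              minHeight: Double = 0.1, maxHeight: Double = 1.0) -> Viewport {
    // Position over the table maps linearly to a position in the image.
    let centerX = (pose.x / tableWidth) * imageWidth
    let centerY = (pose.y / tableHeight) * imageHeight

    // Height above the table maps to magnification: lowering the tablet
    // zooms in, raising it zooms out (clamped to a sensible range).
    let clampedZ = min(max(pose.z, minHeight), maxHeight)
    let scale = clampedZ / maxHeight

    return Viewport(centerX: centerX, centerY: centerY,
                    scale: scale, rotation: pose.yaw)
}

// Example: the tablet held 30 cm above a 2 m x 1.5 m projection of a
// 20,000 x 15,000 pixel image, a quarter of the way in from the top-left.
let pose = TabletPose(x: 0.5, y: 0.375, z: 0.3, yaw: 0)
let view = viewport(for: pose, tableWidth: 2.0, tableHeight: 1.5,
                    imageWidth: 20_000, imageHeight: 15_000)
print("center: (\(view.centerX), \(view.centerY)), scale: \(view.scale)")
```

In a real app the same viewport could just as easily be nudged by pinch gestures, with the tracked pose only supplying the starting point.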
So what's an SBIR program?
SBIRs are research topics that the Federal government wishes to fund. The government sets aside 2% of its R&D budget (billions of dollars) to devote to small businesses (under 500 employees).
I don't want to discourage anyone from doing an SBIR, but I found the effort to put together a proposal to be significant, and after 7 submissions on different topics with 0 successes, I stopped. However, I still find some of the topics to be interesting reading.
Commercial applications for a tablet magnifier
I suppose there are plenty of other opportunities for such a device in photo analysis. The device and app would be useful for commercial satellite imagery, lithographic prints of electronic circuitry, and any large-format photographic work that requires detailed analysis. In any event, now that 16-megapixel cameras are becoming commonplace, it would seem to be a growing market.
----
Comments?
I've got the solution for SBIR NGA11-002 all mocked up. Check out my tweet on this subject @JasonAFE.
Jason, Thanks for your comment. It looks a little like what I learned in Boy Scouts for my semaphore merit badge. I like the iPad idea better but there's no reason one couldn't combine the two approaches. Ray