I need an application built that streams frames captured by the iPhone camera through one of a set of custom filters we've already implemented, and renders the result to the screen. The application would have the following extra features:
1) The ability to switch the active filter
2) The ability to apply the value of an on-screen slider to a parameter of the active filter
3) The ability, on the iPhone 4, to activate the LED torch
4) The ability to switch from rendering from the camera to rendering from a set of JPEGs built into the application
5) Possibly, the ability to sample every other pixel and then zoom, if performance on the iPhone 4 at full resolution is too low. This probably won't be necessary, given the performance seen on the Android port.
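To make the core pipeline concrete, here is a minimal sketch of the camera-to-filter frame flow and the torch toggle (features 1–3 above). It uses the modern Swift AVFoundation API for illustration rather than the iOS 3-era API in the existing prototype, and names such as `CameraPipeline` and the commented-out `filter.apply` call are placeholders, not part of our codebase:

```swift
import AVFoundation

// Sketch only: CameraPipeline is a hypothetical name, not from the prototype.
final class CameraPipeline: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let device = AVCaptureDevice.default(for: .video)

    // Wire camera input to a video-data output that delivers frames to us.
    func start() throws {
        guard let device = device else { return }
        session.beginConfiguration()
        session.addInput(try AVCaptureDeviceInput(device: device))
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "frames"))
        session.addOutput(output)
        session.commitConfiguration()
        session.startRunning()
    }

    // Feature 3: toggle the LED torch on devices that have one (iPhone 4 and later).
    func setTorch(on: Bool) {
        guard let device = device, device.hasTorch else { return }
        try? device.lockForConfiguration()
        device.torchMode = on ? .on : .off
        device.unlockForConfiguration()
    }

    // Each captured frame arrives here and would be handed to the active filter.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // activeFilter.apply(to: sampleBuffer) — the existing custom filter chain
    }
}
```

Switching the active filter (feature 1) and applying the slider value (feature 2) would then amount to swapping or reconfiguring whatever object the delegate callback hands frames to.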
You would not be starting from nothing. We already have:
a) Full source code to a working iOS 3 prototype
b) Full source code to a working implementation of the final UI and feature set, implemented in Processing for PC or Mac.
Performance matters for this project, so an interest in optimization and experience with AVCapture would be valuable.
Please send me your proposal explaining how you would achieve this.