I need help with a simple prototype implementing a workflow across 3 screens, coded as an Xcode 3.1 project.
Audio files, selected based on location and user input, need to be played while a specific image is shown. Another screen will display images based on input from the accelerometer.
The NIB, sample images, and audio files will be provided.
Specific expertise that I am seeking:
iPhone SDK: UIKit, Core Audio, localization