I need a couple of prototypes, built on two different technologies, to confirm the technical capabilities of the platforms. Depending on the outcome, there could be follow-up work on the 'real' app.
6/27 - I am not looking for a minimum viable product; these are technology prototypes to validate the differences between Cordova and Ionic, and to educate myself on how these tasks would be approached on each technology.
1. Produce an app on the Cordova technology stack for the tasks defined below, which runs on Android, iOS, and Windows phones and tablets.
Build the app incrementally, task by task, saving a zip of the project at the end of each task (so the app gets built up as you complete the tasks). The final zip file should contain an app with all the tasks completed.
2. Produce a second app on the Ionic technology stack for the same tasks, which runs on Android, iOS, and Windows phones and tablets.
3. Code should be fully commented, and include documentation on how to set up my environment to test and to rebuild/alter the app as needed. It's OK to point to any applicable build and deploy documentation (don't rewrite material that already exists).
I will consider developers who can only produce one of the two requested apps.
1. Load HTML content (text, embedded picture, embedded view) from an external URL, and store it locally on the device.
Note: do not load the webview directly from the URL; the URL contents should be downloaded and stored on the device first.
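To illustrate how task #1 could be approached (this is a sketch, not a deliverable): the download logic can be kept storage-agnostic, with `fetchFn` and `storage` injected so the same code can use `window.fetch` plus cordova-plugin-file on the device, or stubs in a test. The function names here are hypothetical.

```javascript
// Sketch: download HTML from a URL and cache it locally.
// `fetchFn` and `storage` are injected so this logic can run against
// window.fetch + cordova-plugin-file on device, or stubs in tests.
async function cacheContent(url, fetchFn, storage) {
  const response = await fetchFn(url);
  if (!response.ok) throw new Error(`Download failed: ${response.status}`);
  const html = await response.text();
  storage.set(url, html); // on device: write to cordova.file.dataDirectory instead
  return html;
}

async function loadCached(url, storage) {
  // Serve from the local copy only; never hit the network here.
  return storage.get(url);
}
```

On the device, `storage` would be backed by the file plugin's data directory so the cached content survives restarts.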
2. Load the locally stored HTML content into a webview.
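One possible approach to task #2 (a sketch under assumptions): wrap the stored HTML in a full document with a `<base>` tag pointing at the local cache directory, so relative image/audio paths resolve locally, then hand the string to the webview (e.g. an `<iframe srcdoc>` or an equivalent native webview API). `baseHref` is a hypothetical local path.

```javascript
// Sketch: prepare locally stored HTML for display in a webview.
// The returned string would be assigned to an <iframe srcdoc=...>
// or loaded through the platform webview; `baseHref` is assumed to
// point at the local cache directory.
function buildDocument(storedHtml, baseHref) {
  // The <base> tag makes relative image/audio paths resolve to the
  // local cache instead of the original server.
  return `<!DOCTYPE html><html><head><base href="${baseHref}"></head>` +
         `<body>${storedHtml}</body></html>`;
}
```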
3. Based on the user's scroll position, store the offset (or whatever positioning data is available), establishing the exact position of the HTML content visible in the webview.
Show that the app can be stopped and restarted with the content placed in the exact same position. This should be done programmatically (reloading the content and repositioning it correctly in the webview), not by having the app go to sleep and wake up.
6/27 - Assume the content is a large HTML file with text, some tags for sounds/audio files, etc., as in task #4. As the user scrolls through the content (scrolling the webview), the app needs to understand where the user is in that content (what's visible), in order to reposition correctly upon app restart, and/or to detect tags that have come into the visible portion and should therefore be triggered. Think of any e-book reader: the user scrolls through the content and the reader remembers where they left off, repositioning to the right spot.
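The save/restore behavior described above could be sketched as follows (illustrative only; `storage` stands in for localStorage or a Cordova storage plugin, and the function names are assumptions). Persisting the position as a ratio of total scroll height makes it robust to small layout differences after reload.

```javascript
// Sketch: persist the reading position as a ratio of the scrollable
// height, e-book-reader style, so it survives app restarts.
function savePosition(storage, key, scrollTop, scrollHeight, viewportHeight) {
  const max = Math.max(1, scrollHeight - viewportHeight);
  storage.set(key, Math.min(1, scrollTop / max));
}

function restorePosition(storage, key, scrollHeight, viewportHeight) {
  const ratio = storage.get(key) || 0;
  // Returns the scrollTop to apply after the webview reloads the content.
  return Math.round(ratio * Math.max(1, scrollHeight - viewportHeight));
}
```

On device, `scrollTop`/`scrollHeight` would come from the webview document, captured on a scroll event and again on app pause.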
4. Assume the HTML will contain tags indicating actions the app should take when a tag becomes visible (though the tag itself won't be visible to the user).
As the user scrolls the webview, when a tag in the HTML scrolls into the visible part of the webview, the app should play a sound (any sound will do for this prototype).
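The tag-detection part of task #4 could be approached roughly like this (a sketch; tag offsets would be collected once inside the webview via something like `element.offsetTop`, and `fired` ensures each tag triggers only once). All names here are hypothetical.

```javascript
// Sketch: given the document offsets of hidden action tags and the
// current scroll window, return tags that have newly scrolled into
// view so the app can trigger their action (e.g. play a sound).
// `fired` remembers tags already triggered so each fires once.
function newlyVisibleTags(tags, scrollTop, viewportHeight, fired) {
  const bottom = scrollTop + viewportHeight;
  return tags.filter(t => {
    const visible = t.offset >= scrollTop && t.offset <= bottom;
    if (visible && !fired.has(t.id)) {
      fired.add(t.id);
      return true;
    }
    return false;
  });
}
```

This same check doubles as the hook point for the 'Audio On' setting in task #6.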
5. Assume the HTML will contain an HTML button or small graphic, embedded within the textual content, for the user to click.
When the user clicks the button (which appears embedded in the displayed text), a window pops up containing some text, text entry fields, and OK and Cancel buttons. If the user clicks OK, the data entered is stored by the app, using the id of the button as the storage key.
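The storage side of task #5 might look like this (a sketch; the popup itself would be a plain HTML overlay or an Ionic alert, and `storage` stands in for localStorage on the device). As the task requires, the button's id is the storage key.

```javascript
// Sketch: handle the OK action of the popup dialog. The entered
// field values are stored under the clicked button's id.
function onPopupOk(storage, buttonId, fields) {
  storage.set(buttonId, JSON.stringify(fields));
}

function readEntry(storage, buttonId) {
  const raw = storage.get(buttonId);
  return raw ? JSON.parse(raw) : null;
}
```

Cancel simply dismisses the popup without calling `onPopupOk`.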
6. Create a menu for the app with a toggle setting for 'Audio On'.
When the setting is turned off, the sound played in task #4 above should not play.
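Task #6 reduces to gating the sound call behind the setting; a minimal sketch, assuming `settings` is the persisted app settings and `playFn` wraps the actual media call (e.g. a media plugin on device). The `audioOn` key and function name are assumptions.

```javascript
// Sketch: play the task-#4 sound only when the 'Audio On' setting
// is enabled. Returns true if the sound was actually played.
function playIfEnabled(settings, playFn) {
  if (!settings.get("audioOn")) return false;
  playFn();
  return true;
}
```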