A Python-based CLI script that downloads all products' firmware (including all versions) from the web pages of a given list of predefined vendors and stores the metadata in SQLite [login to view URL]. Mandatory metadata fields include (Manufacturer, Model, Version, Type, Name, Release Date (if available), Download Link), e.g. (Cisco, Video Surveillance 6030 IP Camera, 2.7.0, IP Camera, [login to view URL], 21/08/2015, "link"). There is also a non-mandatory binary field indicating whether the device is discontinued, depending on whether the vendor mentions that on the website. The firmware files themselves will be stored on the file system and referenced by index ID in SQLite.
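The metadata table described above could be sketched as follows (table and column names here are assumptions for illustration, not a final schema):

```python
import sqlite3

# Hypothetical schema sketch for the firmware metadata table.
# Column names mirror the mandatory fields listed in the brief; the
# 'discontinued' column is the optional binary field, and 'file_path'
# references the firmware file stored on the filesystem.
SCHEMA = """
CREATE TABLE IF NOT EXISTS firmware (
    id            INTEGER PRIMARY KEY AUTOINCREMENT,  -- index ID referencing the stored file
    manufacturer  TEXT NOT NULL,
    model         TEXT NOT NULL,
    version       TEXT NOT NULL,
    type          TEXT NOT NULL,
    name          TEXT NOT NULL,
    release_date  TEXT,              -- optional, e.g. '21/08/2015'
    download_link TEXT NOT NULL,
    discontinued  INTEGER,           -- 1/0, NULL if the vendor does not say
    file_path     TEXT               -- location of the firmware file on disk
);
"""

def init_db(path):
    """Open (or create) the SQLite database and ensure the table exists."""
    con = sqlite3.connect(path)
    con.executescript(SCHEMA)
    con.commit()
    return con
```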
The arguments to the script should be either a comma-separated list of vendor names or the location of a text file containing the vendor names.
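The two argument forms could be handled with a single argparse type, along these lines (a sketch; function and flag names are assumptions):

```python
import argparse
import os

def parse_vendors(value):
    """Accept either a comma-separated vendor list or the path to a
    text file containing one vendor name per line."""
    if os.path.isfile(value):
        with open(value) as fh:
            return [line.strip() for line in fh if line.strip()]
    return [v.strip() for v in value.split(",") if v.strip()]

def build_parser():
    parser = argparse.ArgumentParser(description="Vendor firmware scraper")
    parser.add_argument("vendors", type=parse_vendors,
                        help="comma-separated vendor names, or a text file path")
    return parser
```

This keeps the one-line Linux invocation required later in the brief, e.g. `python scraper.py cisco,netgear` or `python scraper.py vendors.txt`.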
There are no GUI components on the server where the script will run, so the script should use the browser's headless mode.
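For a Chrome-based Selenium setup, headless operation would come down to a handful of flags passed to `ChromeOptions.add_argument()`; the selection below is a sketch and the exact flags are assumptions to tune per environment:

```python
def headless_flags():
    """Chrome flags a headless scraper would add via ChromeOptions.add_argument().
    This list is an assumption, not a final configuration."""
    return [
        "--headless=new",           # run with no GUI at all
        "--no-sandbox",             # often required on minimal server images
        "--disable-dev-shm-usage",  # avoid /dev/shm exhaustion in containers
        "--window-size=1920,1080",  # realistic viewport for human-like browsing
    ]
```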
1. A script will be written per vendor. This is required because each vendor website has its own implementation of the firmware download page. However, effort will be made to identify and implement reusable components, if any.
2. The script will only download firmware newly added by the vendor. The first execution will therefore download all available firmware, while subsequent runs will only download newly added files. This will be achieved by analysing the data in SQLite and skipping files that have already been downloaded and processed.
3. Each vendor that is provided will be analysed manually to identify the following, which is required to develop the script:
a. URL for the firmware download page
b. Credential Requirements (Simple Signups, Specific Signups, No Signups)
c. Any CAPTCHA on the page
d. Any honeypot traps
4. If credentials are required to download the firmware and only a simple sign-up is needed, the sign-up will be done manually as part of the manual analysis, using a Gmail account dedicated to this work.
5. The script will try to imitate human-like behaviour (to a limit) while scraping the web page, and will use Tor, so that any scraper/crawler detection logic on the vendor site can be evaded. This will be achieved by adding random delays and random view times, and by avoiding honeypot traps identified during manual analysis.
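The incremental-download logic from point 2 can be sketched as a single SQLite lookup before each download (table and column names are assumptions matching a hypothetical `firmware` table):

```python
import sqlite3

def already_downloaded(con, manufacturer, model, version):
    """Return True if this firmware is already recorded in SQLite, so the
    scraper can skip it on subsequent runs. A sketch; the table/column
    names are assumptions, not a final schema."""
    row = con.execute(
        "SELECT 1 FROM firmware WHERE manufacturer=? AND model=? AND version=?",
        (manufacturer, model, version),
    ).fetchone()
    return row is not None
```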
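The human-like behaviour and Tor routing from point 5 could look roughly like this; the delay bounds and the assumption that a local Tor daemon listens on its default SOCKS port (9050) are both placeholders to tune per deployment:

```python
import random
import time

# Assumption: a local Tor daemon on its default SOCKS port; the proxy
# string would be applied to the Selenium browser profile.
TOR_PROXY = "socks5://127.0.0.1:9050"

def human_delay(low=2.0, high=8.0):
    """Sleep for a randomised 'view time' between page actions to imitate
    human browsing. The bounds are assumptions; tune per vendor."""
    delay = random.uniform(low, high)
    time.sleep(delay)
    return delay
```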
A Python, Selenium and SQLite based solution will be developed with the following features/components:
1. File Management Module: Responsible for storing and managing the downloaded files and metadata. Firmware and installer files will be stored on the filesystem in a structured folder hierarchy. File metadata will be stored in SQLite and will reference the stored files through their filesystem paths and file index/name.
2. Vendor Scrapers: A Python/Selenium-based scraper will be written for each vendor, responsible for downloading the files and grabbing the metadata from the vendor's site. Each scraper will use the file management module to store the files and write the metadata to SQLite.
3. Configuration File: All configuration for the framework (including vendor-specific settings such as credentials, URL, etc.) will be stored in a JSON file which can be easily modified manually.
4. Execution Script: The configuration file can define a polling interval for each vendor scraper; when the execution script is run, it will schedule each vendor script individually according to the polling interval defined in the config.
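The structured folder hierarchy the file management module maintains could be derived from the metadata fields, for example vendor/model/version (a sketch; the layout and the sanitising rule are assumptions):

```python
from pathlib import Path

def firmware_path(root, manufacturer, model, version, filename):
    """Build the on-disk location for a downloaded firmware file using a
    vendor/model/version hierarchy. The layout is an assumption."""
    def safe(s):
        # Replace characters that are awkward in folder names.
        return "".join(c if c.isalnum() or c in "._- " else "_" for c in s)
    return Path(root) / safe(manufacturer) / safe(model) / safe(version) / filename
```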
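A possible shape for the JSON configuration and the per-vendor polling schedule derived from it is sketched below; every key name, URL and credential here is a hypothetical placeholder, not a final format:

```python
import json

# Hypothetical config layout; all keys and values are placeholders.
EXAMPLE_CONFIG = """
{
  "download_root": "/data/firmware",
  "vendors": {
    "cisco": {
      "url": "https://example.invalid/downloads",
      "credentials": {"user": "scraper@example.invalid", "password": "..."},
      "polling_interval_hours": 24
    }
  }
}
"""

def load_config(text):
    """Parse the JSON configuration file contents."""
    return json.loads(text)

def polling_schedule(config):
    """Map each vendor to its polling interval in seconds, as the
    execution script would when scheduling the vendor scrapers."""
    return {name: cfg["polling_interval_hours"] * 3600
            for name, cfg in config["vendors"].items()}
```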
1) Python source code with comments explaining each function and its details. It should be possible to supply any required input as an argument and execute the script as a one-line command in the Linux terminal.
3) A manual describing how to install, configure and use the scraper