Hi, we have recently migrated our data from Salesforce to Amazon RDS. We are now looking for someone who can build a UI and business logic in AWS. The UI and functionality should mirror our existing Salesforce system.
We are currently looking for someone familiar with building Scrapy web crawlers who understands the intricacies of XPath, to build web crawlers for us on a regular basis. Please apply only if you are familiar with XPath and Scrapy. We pay $30 for each spider and have a working template, so if you understand XPath you can fill in the blanks.
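For context, the "fill in the blanks" work described above is essentially writing XPath-style selectors that pull fields out of a page. A minimal sketch of that idea, using only the Python standard library (a real spider would use Scrapy's `response.xpath()` with full XPath 1.0; the sample markup and field names here are hypothetical):

```python
import xml.etree.ElementTree as ET

# Sample snippet standing in for a crawled listing page.
HTML = """
<html><body>
  <div class="item"><span class="name">Widget A</span><span class="price">9.99</span></div>
  <div class="item"><span class="name">Widget B</span><span class="price">4.50</span></div>
</body></html>
"""

def extract_items(html: str):
    """Pull (name, price) pairs using XPath-style selectors.

    ElementTree supports only a limited XPath subset; Scrapy's
    response.xpath() accepts full XPath 1.0 expressions, but the
    "fill in the selectors" workflow is the same.
    """
    root = ET.fromstring(html)
    items = []
    for div in root.findall('.//div[@class="item"]'):
        name = div.find('span[@class="name"]').text
        price = div.find('span[@class="price"]').text
        items.append((name, float(price)))
    return items

print(extract_items(HTML))  # → [('Widget A', 9.99), ('Widget B', 4.5)]
```

In a Scrapy template, each new site would typically only need the two selector strings swapped out.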
We are seeking the names, titles, and email addresses of officials involved in the marketing of law schools, medical schools,...management. Delivered in an Excel file with name, title, email, and school name. Bid on a price-per-1,000 basis. We are unsure whether this job can be automated with a web-crawling strategy, but if so, we can provide website URLs.
I have a cryptocurrency wallet project that supports BTC/ETH/.. options. Most of the payment methods are currently done; only the buy/sell logic needs to be implemented. I want to implement the buy/sell logic with third-party libraries to save time and budget. Please apply only if you have real experience, and please share the URL of a project you have built.
Hi, I need to develop a simple game about math logic, based on Venn diagrams. The graphic style I would like is low poly. It is a personal, non-commercial project. It is about a character who needs to gather fish in the sea for an aquarium; the tanks are filled depending on the captured fish, with restrictions on each tank, following the Venn diagram.
...crawler will crawl only those URLs that are entered on a given list. Re-crawling takes place at specified intervals. An example of a search vertical would be [login to view URL]. Many of the pages that need to be crawled are dynamic (AJAX etc.), so the crawler needs to overcome those issues (beyond crawling static HTML pages). Looking for someone smart who understands web crawling.
I am an Industrial Automation Engineer. We need a report in Excel format that is generated directly from a PLC (Programmable Logic Controller). This is a customized report: we insert a date and time, and the corresponding data is then shown in Excel.
Hi, we are running a dedicated server on OVH and need someone who can set up a proxy server for our crawling purposes. We can set up to 256 IPs per dedicated server. As we only need a proof of concept, we will do a test with 10 IPs. Check the file attachment for the basic concept. Looking forward to your application!
...long term. Our website connects buyers and sellers all over the world (antique products). SEO, Social media marketing, Digital marketing, Data processing, Sales, Data Crawling, Data mining, Campaigns (Facebook, Twitter, YouTube), Google AdWords, Virtual Assistant, Data Extraction, Excel, Bulk Marketing, Email Handling, Email Marketing, Telemarketing
We need API development based on crawling/scraping. The app will take real-time data from the Grab mobile app ([login to view URL]) through crawling. You can use any technology or programming language, such as Python, Node.js, PHP, .NET, etc. By applying, you agree to complete a simple test.
The quote for the next 3 pages (content crawling and extraction) is:
[login to view URL] -- Large -- 2 GBP/month (from foundation * 2 GBP = 120 GBP)
[login to view URL] -- Middle -- 60 GBP (total) from foundation
[login to view URL] -- Small -- 20
I have a Node.js web application that requires functionality several developers have so far been unable to deliver. If you are an audio API expert, this might be your project. Please be an expert in Node.js before replying to this post. Thank you.
Hello, I created 2 bash scripts. The first script saves to a file everything I type in an SSH session, and the second script uses this file for crawling and saves all the raw HTML source code to a txt file. I used the elinks binary, but for the past 2 days elinks no longer works with Cloudflare. I need someone to modify my second script to get around the Cloudflare issue.