[Automated 'SEO' spammers: do NOT post link building, article writing, or other non-software bids. Those will be reported to the admins of the board as spam. I need software, not SEO article-writing services.]
I'm looking for a _simple_ mass keyword position checker. It must visit the urls I enter or import into the software, grab keywords from the metatags, then check the site's position in Google for each keyword. It must support a nearly unlimited number of urls (resources permitting).
Most similar software works with only one url at a time and requires manual entry of keywords. I need a mass keyword checking tool, with keyword search capability similar to these two apps but with MASS keyword and MASS url search capabilities:
- [url removed, login to view]
- [url removed, login to view]
I just need a simple mass keyword position checking software. Nothing fancy. All those extra features of the above two apps are overkill for my needs.
Also, the search results need to be saved in a way that allows exporting the data to txt or html so I can pull the data into other programs or reports. Whether the search results go to an Access database, a text file, or some open-source database is not important to me. The end result is what's most important (provided it's a flexible, distributable solution not requiring any additional licensing).
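For what it's worth, the export step could be as simple as this minimal sketch in Python (the record layout here, url/keyword/position, is my own assumption about what a result row would contain, not something specified above):

```python
import html

def export_txt(results, path):
    """Write results as tab-separated text, one row per (url, keyword, position)."""
    with open(path, "w", encoding="utf-8") as f:
        f.write("url\tkeyword\tposition\n")
        for url, keyword, position in results:
            f.write(f"{url}\t{keyword}\t{position}\n")

def export_html(results, path):
    """Write results as a plain HTML table, easy to paste into reports."""
    rows = "\n".join(
        f"<tr><td>{html.escape(url)}</td><td>{html.escape(keyword)}</td>"
        f"<td>{position}</td></tr>"
        for url, keyword, position in results
    )
    with open(path, "w", encoding="utf-8") as f:
        f.write("<table>\n<tr><th>url</th><th>keyword</th><th>position</th></tr>\n")
        f.write(f"{rows}\n</table>\n")
```

Tab-separated text opens cleanly in Excel and Word, and the HTML table pastes into most report tools, which covers the "manipulate the data into other programs" goal without committing to any particular database.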
This software MUST work in Windows XP and should also work in Vista, since Vista is in such large-scale use. Source code must also be provided so I can have it updated or renamed or whatever if my needs change in the future.
Contact me if you have questions or solutions I may be overlooking (even off the shelf software that already exists or you've already created).
The software MUST read the keywords from the web pages I enter or import (or cut and paste) into the software AND must automatically check the search engines for the position of each URL for each keyword 'grabbed' from the web page.
The software must read the keywords directly from the 'live' pages, not _require_ entering them by hand. Having manual entry as an option would be great, though, for cases where no metatags exist in the web page. But the software must not require entering keywords by hand; it must read the actual metatags if they exist on the page.
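To illustrate, the metatag-grabbing step might look like this minimal sketch (Python, standard library only; the function names and the keyword cap are my own illustration, not part of any existing tool):

```python
from html.parser import HTMLParser

class MetaKeywordParser(HTMLParser):
    """Collects the content of <meta name="keywords"> tags from a page."""
    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        if attrs.get("name", "").lower() == "keywords" and attrs.get("content"):
            # Meta keywords are conventionally comma-separated.
            self.keywords.extend(
                kw.strip() for kw in attrs["content"].split(",") if kw.strip()
            )

def grab_keywords(html_text, limit=3):
    """Return up to `limit` keywords from a page's metatags.
    Pass limit=None to remove the cap (per the case-by-case override
    mentioned in the feature list below)."""
    parser = MetaKeywordParser()
    parser.feed(html_text)
    return parser.keywords if limit is None else parser.keywords[:limit]
```

Fetching the 'live' page itself would be a `urllib.request` call with a sensible User-Agent header; when `grab_keywords` returns an empty list (no metatags on the page), that's the cue to fall back to the optional manual-entry path.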
Here's a summary of what I'm looking for (much of this I mentioned already):
- import urls (and/or cut and paste) as well as accept manual entry of urls
- automatically visits selected urls and grabs keywords from the 'live' metatags
- automatically searches Google for web position for each keyword
- saves all search results internally or to external database or file system of some sort
- option to export or save search results to txt or html so I can cut and paste the results into other programs or MS Word docs
- ignore robot exclusion since the software will NOT spider sites, but instead will only look at the specific pages or urls entered into the software
- netiquette features to avoid hammering the search engines and risking a ban
- supports proxy servers, to keep my IP from getting banned
- must provide source code so changes can be made in the future if needed. I'm not a programmer and have no idea what my needs will be a year from now, so this needs to stay an open option.
- I own the rights to give away or do whatever I want with the program and source
- limits searches to the first 100 results for each keyword, to avoid getting IP-banned by search engines
- limits searches to a maximum of the first 3 keywords grabbed per url (in case a page is spamming useless keywords), but must provide a feature to remove this limit, preferably on a case-by-case basis
- each new search is independent from previous searches. In other words, if I do a new search with a new set of urls, I don't want past searches getting searched again [I hope that makes sense]
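The position-check loop described above could be structured like this minimal sketch (Python, standard library only). One caution: Google's result markup changes often and scraping it may violate their terms of service, so the actual fetch below is deliberately a placeholder; it just shows where the randomized delay, proxy support, and 100-result cap would plug in. All names and the example proxy value are hypothetical:

```python
import random
import time
import urllib.request
from urllib.parse import urlparse

def _domain(url):
    """Normalize a url to its domain for comparison (drops a leading 'www.')."""
    netloc = urlparse(url).netloc.lower()
    return netloc[4:] if netloc.startswith("www.") else netloc

def find_position(target_url, result_urls, max_results=100):
    """Return the 1-based rank of target_url's domain within the first
    max_results result urls, or None if it doesn't appear (the 100-result
    cap from the feature list above)."""
    target = _domain(target_url)
    for rank, url in enumerate(result_urls[:max_results], start=1):
        if _domain(url) == target:
            return rank
    return None

def polite_fetch(url, proxy=None, min_delay=5.0, max_delay=15.0):
    """Placeholder fetch with the 'netiquette' features: a randomized
    delay between requests and optional proxy support. Parsing Google's
    result page into a url list is intentionally left out here."""
    time.sleep(random.uniform(min_delay, max_delay))
    handlers = []
    if proxy:  # e.g. "http://127.0.0.1:8080" (hypothetical example value)
        handlers.append(urllib.request.ProxyHandler({"http": proxy, "https": proxy}))
    opener = urllib.request.build_opener(*handlers)
    opener.addheaders = [("User-Agent", "Mozilla/5.0")]
    return opener.open(url).read()
```

Comparing by domain rather than the exact url is a design choice worth confirming with the buyer: Google may rank an inner page of the site rather than the exact url that was entered.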
Additional preferred optional features; I'm open to omitting these or hearing other suggestions:
- gets keyword positioning results for Yahoo and MSN for each url as well as Google. Google is a MUST; the others are optional
- preferably a Windows XP look and feel, not the cheesy '3d' look of some recent programs that looks horrible and is confusing to use
- preferably a Windows desktop application; php/mysql as a _last_ resort. I very heavily prefer a Windows desktop application
Standard legalese junk:
1) I require complete and fully-functional working program(s) in executable form as well as complete source code of all work done (so that I may modify it in the future).
2a) If there are any server-side deliverables then they must be installed by the Seller in ready-to-run condition.
2b) All other software (including but not limited to any desktop software or software the buyer intends to distribute) must include a software installation package that will install the software in ready-to-run condition on the platform(s) specified in this bid request.
3) All deliverables will be considered "work made for hire" under U.S. Copyright law. Buyer will receive exclusive and complete copyrights to all work purchased.
3b) No part of the deliverable may contain any copyright restricted 3rd party components (including GPL, GNU, Copyleft, etc.) unless all copyright ramifications are explained AND AGREED TO by the buyer on the site per the seller's Seller Legal Agreement.