We have several blogs running on our main site. Google Webmaster Tools is picking up non-existent URLs and flagging them as 404 errors; the same URLs also show up in other tracking software such as SEOmoz. Looking at the actual pages, the links the spiders are following are not present in the markup.
Google explains it as follows:
Unexpected 404 errors
For example, your site may use the following code to track file downloads in Google Analytics:
<a href="[url removed, login to view]" onClick="[url removed, login to view](['_trackPageview','/download-helloworld']);">Hello World PDF</a>
When it sees this, as an example, Googlebot may try to crawl the URL [url removed, login to view], even though it’s not a real page. In this case, the link may appear as a 404 (Not Found) error in the Crawl Errors feature in Webmaster Tools.
Google strives to detect these types of issues and resolve them so that they will disappear from Crawl Errors. In general, 404 errors won’t impact your site’s search performance, and you can safely ignore them if you’re certain that the URLs should not exist on your site. It’s important to make sure that these and other invalid URLs return a proper 404 HTTP response code, and that they are not blocked by the site’s [url removed, login to view] file.
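The quoted example points at the likely fix on our side: move the tracking call out of the inline onClick attribute so a crawler scanning the raw HTML never sees a JavaScript fragment it might mistake for a URL. A minimal sketch, assuming the classic `_gaq` Google Analytics queue (the function name and selector below are illustrative, not taken from our code):

```javascript
// Sketch: keep analytics calls out of inline onClick attributes so crawlers
// parsing the HTML source cannot mistake script fragments for crawlable URLs.
// `_gaq` stands in for the classic asynchronous Google Analytics queue.
var _gaq = _gaq || [];

// Queue a virtual pageview for a file download.
function trackDownload(virtualPath) {
  _gaq.push(['_trackPageview', virtualPath]);
}

// In the page, bind the handler after the DOM loads instead of writing
// onClick into the markup, e.g. (selector is illustrative):
//
//   document.querySelector('a.pdf-download').addEventListener('click',
//     function () { trackDownload('/download-helloworld'); });

trackDownload('/download-helloworld');
console.log(JSON.stringify(_gaq));
```

With the handler attached via `addEventListener`, the anchor's `href` is the only URL visible in the served HTML, which is what Google's note recommends.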
While it may not impact our search results, I am looking for someone to go through the WordPress installations, identify the cause of these errors, and correct it.