We currently have a search function on our website that returns a dynamic page with information about different companies. Each page makes a number of API requests to our separate backend database. Because these URLs are generated dynamically, they are not currently indexed by Google. For example, [login to view URL]
We would like to manually upload a list of URLs to our sitemap to make them discoverable in Google, and point these to static pages loaded with our keywords. These static pages would be used by Google purely for indexing and wouldn't require any calls to our API, which would increase the number of pages Google can index on each visit. From a user's perspective, clicking the URL in the search results will take them to the dynamic page, which will call our API to provide the required information.
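To illustrate the sitemap part, here is a minimal sketch of how a plain list of company URLs could be turned into a sitemap file. The input file name, output path, and changefreq value are placeholders, not our real setup:

```python
# Minimal sketch: build a sitemap.xml from a plain list of company-page URLs.
# File names and the changefreq value below are illustrative only.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls, output_path="sitemap-companies.xml"):
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
        ET.SubElement(entry, "changefreq").text = "monthly"
    ET.ElementTree(urlset).write(output_path, encoding="utf-8", xml_declaration=True)

if __name__ == "__main__":
    with open("company_urls.txt") as f:
        urls = [line.strip() for line in f if line.strip()]
    build_sitemap(urls)
```

Note that the sitemap protocol caps each file at 50,000 URLs, so at larger scale the list would need to be split across multiple files referenced from a sitemap index.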
To achieve the above, we would like to create a WordPress plugin that lets Google crawl our sitemap to find the selected URLs and serves the search crawler a flat HTML version of each page instead of the dynamically rendered one.
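As a rough illustration of the serving logic the plugin would need: the actual plugin would be written in PHP for WordPress, so the Python sketch below only shows the idea, and the user-agent pattern, snapshot directory, and function names are assumptions for illustration:

```python
# Sketch of the crawler-vs-visitor branching, assuming pre-built HTML snapshots
# are stored on disk keyed by a company slug. Paths and names are illustrative.
import re

CRAWLER_UA = re.compile(r"Googlebot|bingbot", re.IGNORECASE)

def render_dynamic_page(company_slug: str) -> str:
    # Placeholder for the existing dynamic page, which calls our backend API.
    return f"<html><body>Dynamic page for {company_slug} (API-driven)</body></html>"

def serve_company_page(user_agent: str, company_slug: str) -> str:
    """Return flat HTML to crawlers, the normal dynamic page to everyone else."""
    if CRAWLER_UA.search(user_agent or ""):
        # Crawler: serve the pre-built, keyword-loaded snapshot (no API calls).
        with open(f"snapshots/{company_slug}.html", encoding="utf-8") as f:
            return f.read()
    # Regular visitor: fall through to the dynamic page.
    return render_dynamic_page(company_slug)

if __name__ == "__main__":
    print(serve_company_page("Mozilla/5.0 ...", "example-company"))
```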
The objective of this test is to measure user-level appetite for the data we hold. If it proves successful, we would expand the offering to cover all of our data, which could include up to 20 million URLs.