Basically, there is a website I would like to scrape, and I need to save a URL from a link on it (the URL changes every time the page loads). A portion of the URL stays the same, so locating it in the source code will be easy, and it is always the same length, so it will be easy to save. The only problem is that the website checks the user agent, sets cookies, requires a login, etc. You will need to use cURL to set your user agent, handle cookies, and log in. Once logged in, I need the page scraped and the URL saved to a variable. This should all be done within a function that I can call whenever I need to get the URL.
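The steps above can be sketched roughly as below. This is only an outline, not a working solution: the posting asks for cURL, so this Python sketch shells out to the curl binary; the site address, login fields, URL prefix, and suffix length are all made-up placeholders, since none of those details are given here.

```python
import re
import subprocess

# Placeholder values -- the real site, credentials, and URL prefix
# are not specified in the posting.
LOGIN_URL = "https://example.com/login"
PAGE_URL = "https://example.com/page"
USER_AGENT = "Mozilla/5.0 (X11; Linux x86_64)"
COOKIE_JAR = "cookies.txt"
URL_PREFIX = "https://example.com/download/"  # the stable portion
SUFFIX_LEN = 12                               # the changing part has a fixed length

def curl(url, data=None):
    """Fetch a URL with curl, spoofing the user agent and
    reading/writing cookies so the session persists across calls."""
    cmd = ["curl", "-s", "-A", USER_AGENT,
           "-b", COOKIE_JAR, "-c", COOKIE_JAR]
    if data:
        cmd += ["-d", data]  # POST the login form fields
    cmd.append(url)
    return subprocess.run(cmd, capture_output=True, text=True).stdout

def extract_url(html):
    """Pull the fixed-length URL out of the page source:
    the known prefix followed by SUFFIX_LEN arbitrary characters."""
    m = re.search(re.escape(URL_PREFIX) + "." * 0 + r".{%d}" % SUFFIX_LEN, html)
    return m.group(0) if m else None

def get_url():
    """The requested function: log in, scrape the page,
    and return the changing URL."""
    curl(LOGIN_URL, data="user=me&pass=secret")  # sets the session cookies
    return extract_url(curl(PAGE_URL))

# The extraction step can be checked against a canned page:
sample = '<a href="https://example.com/download/a1b2c3d4e5f6">get</a>'
print(extract_url(sample))  # prints https://example.com/download/a1b2c3d4e5f6
```

A regex works here only because the stable prefix and fixed length make the target unambiguous; for anything less regular, an HTML parser would be the safer choice.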
This really should be an easy project, as there are lots of scripts out there already that do most of the work for you. As such, the budget for this project is very low.