I have a Google Sheet that I use as an index for all my product stock. Inside this sheet I already have two working Apps Script functions, but I am running into some small bugs, so I am looking for a fresh pair of eyes to debug and fix them. The first function, createFolderSubFolders, runs on an onEdit trigger and, as the name suggests, creates folders and files based on edited values in columns A and B. The second function, WriteURLs, runs on an onOpen trigger; it searches the parent folder for the folder and file whose names contain the text from columns A and B, gets their URLs, and writes them into columns L and M. 1. The createFolderSubFolders script already has a check implemented in the code to test whether the folder already exists bu...
I need an automation wherein, using details such as Device ID and domain from the sheet, an admin will be able to enroll multiple Chromebooks. There has to be a front end with 3 specific functions: 1. Enroll device (single, or multiple via CSV) 2. Unenroll device (single, or multiple via CSV) 3. Report (for any selected domain)
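The bulk half of functions 1 and 2 starts with parsing the uploaded CSV. Below is a minimal Python sketch of just that step; the `device_id` column name is an assumption, not taken from the actual file. The device actions themselves would go through Google's Admin SDK Directory API (note that initial Chromebook enrollment normally happens on the device itself; the API side mainly covers deprovisioning, disabling, and reporting).

```python
import csv
import io

def read_device_ids(csv_text, column="device_id"):
    """Parse an uploaded CSV export and return the list of device IDs to act on.
    The column name is a placeholder; adjust it to the real CSV header."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row[column].strip() for row in reader if row.get(column, "").strip()]

# Example CSV as it might come in through the front end
sample = "device_id,domain\nabc-123,example.com\ndef-456,example.com\n"
print(read_device_ids(sample))  # → ['abc-123', 'def-456']
```

Each returned ID would then be fed into one Directory API call per device, with failures collected and shown back in the front end.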
Hi, I have a crawler built in Python. First, we need to find a solution to the issue of the server killing the process halfway through when I run the crawler. Second, we need to schedule automatic runs of the crawler through crontab. The crawler gets data and uploads it to a Google Sheet. The server runs Linux.
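Two common causes of the killed process are the SSH session ending (the shell sends SIGHUP to its children) or the system's OOM killer; detaching the process from the terminal and scheduling it via cron addresses the first cause and both requests. A sketch, with the paths, Python binary, and schedule as placeholders:

```shell
# One-off run, detached from the terminal so logging out does not kill it
nohup /usr/bin/python3 /home/user/crawler/crawler.py >> /home/user/crawler/crawler.log 2>&1 &

# crontab -e entry: run the crawler automatically every day at 02:00
0 2 * * * /usr/bin/python3 /home/user/crawler/crawler.py >> /home/user/crawler/crawler.log 2>&1
```

If the log shows the process still being killed when detached, memory is the likely culprit and the crawler itself needs to be made lighter or the job moved to a systemd unit with explicit limits.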
I have a CSV file that needs to be imported via an API. - The CSV has to be split into 3 different parts: organization, addresses, and products. - A file is placed on an FTP server on a daily basis. - The file needs to be imported every day at a time we decide. - It should skip records that already exist. - It should overwrite a record if the data is different. - I have Google Apps Script in mind, but other solutions are also not a problem. Here is the API information:
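A minimal Python sketch of the split step (the column names here are assumptions, not from the actual file); the skip/overwrite logic would then compare each payload against what the API already holds for that record and only send it when the record is missing or different:

```python
import csv
import io

# Hypothetical column layout; adjust these to the real CSV header
ORG_COLS = ["org_id", "org_name"]
ADDR_COLS = ["org_id", "street", "city", "zip"]
PROD_COLS = ["org_id", "sku", "product_name", "price"]

def split_rows(csv_text):
    """Split each CSV row into the three payloads the API expects:
    organization, address, and product records."""
    reader = csv.DictReader(io.StringIO(csv_text))
    orgs, addrs, prods = [], [], []
    for row in reader:
        orgs.append({c: row[c] for c in ORG_COLS})
        addrs.append({c: row[c] for c in ADDR_COLS})
        prods.append({c: row[c] for c in PROD_COLS})
    return orgs, addrs, prods

sample = ("org_id,org_name,street,city,zip,sku,product_name,price\n"
          "1,Acme,Main St 1,Berlin,10115,A-1,Widget,9.95\n")
orgs, addrs, prods = split_rows(sample)
print(orgs)  # → [{'org_id': '1', 'org_name': 'Acme'}]
```

The daily schedule would then be either a time-driven Apps Script trigger or a cron job on a server, whichever solution is picked.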
Anti-spy app for Android and iOS. Protection against malware, adware, and spyware; detect whether someone is listening to the microphone (which apps or services are using the microphone); geolocation (which apps use it); alerts for new apps installed on the phone; etc. Users pay a $9.99 subscription to make the app work, via the Apple and Android payment systems. Admin panel to see users (paying / not paying) and dynamic parameters. Users come from a webpage with 2 dynamic parameters, e.g. "dynamic parameter 1"&s3="dynamic parameter 2". S2S pixel with dynamic URL parameters from web to app (the pixel fires after the user subscribes, carrying dynamic parameter 1). 2 landing pages (web).
Hi, I need to identify the filesizes of a large collection of URLs (100 initially). I know there are scripts out there that have attempted this; see the references below. Can you build me a working Google Sheet and Apps Script version? Ideally it should obtain the filesize of a URL without needing to download the content. But in the worst case, the deliverable would be a script and sheet that identify the filesize of any URL by downloading the pages. References:
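The "no download" approach is an HTTP HEAD request that reads the Content-Length response header. A Python sketch of the idea (the Sheet-side version would make an equivalent fetch per URL from Apps Script); note that some servers omit Content-Length or reject HEAD, which is exactly where the download fallback comes in:

```python
import urllib.request

def parse_content_length(headers):
    """Pull the byte size out of a response-header mapping, if present."""
    value = headers.get("Content-Length")
    return int(value) if value is not None else None

def head_content_length(url, timeout=10):
    """Ask the server for a URL's filesize via a HEAD request, without
    downloading the body. Returns None when the server does not report
    Content-Length (the fallback would then be a full GET)."""
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return parse_content_length(resp.headers)
```

For 100+ URLs the script would loop over a column of the sheet, writing either the reported size or a "needs download" marker next to each URL.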
The crawler is built in Python and runs on a Linux server. It is set to run and scrape data twice a day, auto-updating the data to a Google Sheet. The Google Sheet has two sheets: Sheet1 updates as the crawler runs, and as soon as Sheet1 has finished updating completely, all the data is copied straight over to Sheet2. The issue is that either the Google Sheet is not auto-updating or the Python script is not starting automatically; each time, I am having to run the script manually.
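To tell the two failure modes apart, it helps to make cron log every attempt: if the log never gains a new entry, cron is not starting the script at all; if entries appear but the sheet does not change, the failure is inside the script's Sheets authentication or upload step (often expired credentials). A sketch with placeholder paths and times, using `flock` so a still-running crawl is never started a second time:

```shell
# crontab -e entry: run twice a day at 06:00 and 18:00, logging every attempt
0 6,18 * * * /usr/bin/flock -n /tmp/crawler.lock /usr/bin/python3 /home/user/crawler/crawler.py >> /home/user/crawler/cron.log 2>&1
```

If cron.log stays empty at the scheduled times, check that the cron daemon is running and that the entry is installed under the right user.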
Hello, I want to connect the ValueSerp API to Google Sheets so that I can automatically generate calls and pull data through it. This should be straightforward, as ValueSerp has extensive documentation on how to make API calls through their platform.
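As a sketch of the call shape: ValueSerp's documented search endpoint takes the API key and query as URL parameters, so the Sheets side mostly reduces to building a URL and fetching it (in Apps Script, via `UrlFetchApp.fetch`). The Python below shows the URL-building half; the parameter names follow ValueSerp's documentation, and `location` is just one example of an optional parameter:

```python
from urllib.parse import urlencode

VALUESERP_ENDPOINT = "https://api.valueserp.com/search"  # per ValueSerp's docs

def build_search_url(api_key, query, **extra):
    """Assemble a ValueSerp search call; extra keyword arguments become
    additional query parameters (e.g. location, num)."""
    params = {"api_key": api_key, "q": query, **extra}
    return f"{VALUESERP_ENDPOINT}?{urlencode(params)}"

print(build_search_url("demo", "pizza", location="London"))
# → https://api.valueserp.com/search?api_key=demo&q=pizza&location=London
```

In the sheet, each row's keyword would feed `q`, and the JSON response would be parsed into the neighboring columns.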