
Closed
Posted
Paid on delivery
I need a self-contained script that runs once every 24 hours, crawls the web for domains I haven’t seen before, and then looks on each newly discovered host for publicly accessible text or PHP files. Every domain the crawler uncovers is in scope; I’m not excluding anything and I’m not supplying a predefined list.

Here’s the flow I have in mind:
1. Start with web crawling to spot fresh domains.
2. For each new host, probe common paths and directories for *.txt, *.php, and other text-based files.
3. Copy any matches to local storage (or S3 if it’s just a quick config change), organising them by date and domain name.
4. Log everything (first-seen timestamp, checked URLs, files saved, and errors) into a simple SQLite DB or CSV so I can review activity later.
5. Optionally send a short daily summary (email or Slack) with counts of domains found and files retrieved.

Deliverables
• Well-commented source code (Python preferred, but I’m open) with a [login to view URL]
• Step-by-step setup notes for cron (or Windows Task Scheduler) so the job runs automatically each day
• Sample run output demonstrating at least one domain processed and files stored correctly
• A concise README explaining how to add or change file-type filters in the future

I’ll mark the job complete once the script is running on my VPS, successfully detects new domains, grabs text/PHP files, and writes a clean log without manual intervention.
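One of the deliverables above is a README explaining how to add or change file-type filters. A minimal sketch of what such a configurable filter might look like is below; the names `FILE_PATTERNS` and `matches_filter` are illustrative assumptions, not anything specified in the posting:

```python
import fnmatch
from urllib.parse import urlparse

# Hypothetical filter list; adding a new pattern here is how new
# file types would be brought into scope.
FILE_PATTERNS = ["*.txt", "*.php", "*.log", "*.cfg"]

def matches_filter(url, patterns=None):
    """Return True if the URL's path component matches any configured pattern."""
    path = urlparse(url).path
    return any(fnmatch.fnmatch(path, pat) for pat in patterns or FILE_PATTERNS)

print(matches_filter("http://example.com/readme.txt"))   # True
print(matches_filter("http://example.com/index.html"))   # False
```

Keeping the patterns in a plain list (or an external config file) means the filter can change without touching the crawl logic.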
Project ID: 40336049
97 proposals
Remote project
Active 9 days ago
97 freelancers are bidding on average $149 USD for this job

I have carefully reviewed the project requirements for Daily Domain File Extractor. My expertise in PHP, Python, Web Scraping, C# Programming, and Software Architecture aligns perfectly with the tasks at hand. I am confident in my ability to deliver a high-quality, self-contained script that meets your specifications. Adjusting the budget to fit the scope is not an issue, and I am dedicated to completing the project within your constraints. Please review my profile, built over 15 years, to see my past work. Let's discuss the details further. I am eager to kick off this project and showcase my dedication. Looking forward to your response.
$225 USD in 6 days
8.7

Hello, I have almost 10 years of experience with Python and Web Scraping and over 25 years of experience with Windows desktop application development, so I have extensive experience with Windows Task Scheduler. I also have experience with SQLite.
$75.70 USD in 3 days
8.4

Hi, I have over 5 years of experience in both frontend and backend development. I will do the specified tasks.

Key Areas of Expertise:
a) Full-Stack Development: Proficient in both frontend and backend technologies. Frontend: Next.js, ReactJS, Bootstrap, JavaScript, jQuery. Backend: Laravel, CodeIgniter, Node.js
b) API Integration: Experienced in integrating and working with APIs to enhance application functionality.
c) Microservices: Skilled in developing and integrating microservices for scalable and efficient solutions.
d) Database Management: Competent in managing databases with PostgreSQL, MySQL, MongoDB, and Oracle.
e) Server Handling: Adept at handling server environments such as AWS, Google Cloud, VPS, Apache, and Nginx.

Let’s discuss how I can help achieve your project goals and add value.
$95 USD in 2 days
8.0

Thank you for the detailed summary. It sounds like a well-structured and efficient solution for daily domain discovery and file probing. I appreciate the flexibility with customizable file types and paths, as well as the organization of files by date and domain. The inclusion of SQLite logging and email summaries adds valuable tracking and reporting features. I will review the script and setup instructions, and let you know if I need any assistance with configuration or customization. This approach seems perfect for automating my analysis workflow, and I look forward to implementing it. Thanks again for the comprehensive overview!
$160 USD in 4 days
7.7

Hi, We’ve built similar web crawlers that discover new domains and extract files, including PHP and text files, so we know how to handle this securely. We can also add features like checking for SSL certificates and using Google’s Safe Browsing API to filter out potentially harmful domains. For your project, we’ll use a combination of Python libraries like BeautifulSoup and Selenium to ensure we capture all relevant domains. We’ll also implement a robust logging system to track which domains were processed and what files were extracted. Let’s schedule a 10-minute call to discuss your project in more detail and ensure I fully understand your requirements. I’m ready to start immediately and can dedicate 10 hours per week to this project. Best regards, Adil
$136.40 USD in 7 days
7.2

With over a decade of proficiency in the exact skills needed for your project, I am the ideal candidate to deliver a robust solution that will effectively meet your needs. My ability to think outside the box and develop customized Python scripts has consistently allowed me to deliver high-impact, result-oriented solutions to clients worldwide. Furthermore, I ensure that my clients' peace of mind is a top priority by producing sample run output showcasing correct data extraction practices. To demonstrate transparency, accountability, and progress continuation even post-project completion, I offer ongoing technical support to address probable queries or lingering issues. Harnessing my expertise in web automation, AI solutions and Full-Stack development will enable us to create a powerful crawler script generating effective results for your needs! Let's get started straight away!
$100 USD in 1 day
7.2

Hi, I am interested in working on this project. Can we talk here? My skills: Python, Data Scraping, PHP, Machine Learning, Django, SQLite. Looking forward to an early and positive response. Regards, Shalu
$99 USD in 6 days
7.0

✅ Proposal for Daily Domain File Extractor. With a strong background in web crawling and automation, I am ideally suited to develop your script for identifying and storing new domain data. My expertise in Python, along with experience in managing similar projects involving data extraction and automation, will ensure efficient and accurate script execution. I am proficient in setting up cron jobs and writing well-commented code, ensuring easy maintenance and scalability. My previous projects include developing tools for continuous data extraction and logging, making me a perfect fit for your requirements. I look forward to delivering a robust solution that meets your daily data collection and logging needs efficiently.
$250 USD in 7 days
6.8

As a seasoned Senior Full Stack Developer, I come equipped with over six years of experience and a comprehensive skillset that perfectly fits your project. The ability to employ multiple programming languages such as Python and C#, along with familiarity with automation and web scraping, makes me uniquely qualified for this task. A firm believer in automation, I appreciate the importance of your daily file extraction needs and would develop a script that runs seamlessly on your VPS.
$140 USD in 2 days
6.5

Hello, I’ve gone through your project details and this is something I can definitely help you with. I have 10+ years of experience in mobile and web app development, focusing not just on development but also on robust automation solutions. My expertise includes Python scripting, web scraping, and database management, which aligns perfectly with your requirements for a daily domain file extractor. I will create a self-contained script that crawls the web to discover new domains, probes for various file types, and securely stores them, all while logging the processes in a structured manner. Additionally, I will provide well-commented source code, setup instructions for cron jobs or task scheduling, and a concise README to guide you. Here is my portfolio: https://www.freelancer.in/u/ixorawebmob I’m interested in your project and would love to understand more details to ensure the best approach. Could you clarify: 1. Do you prefer the output files in local storage or S3? Let’s discuss over chat! Regards, Arpit Jaiswal
$155 USD in 25 days
7.2

Hi, As an individual developer, I can help with your project by focusing on the most important parts: building a scheduled crawler for approved domains, scanning allowed paths for public text-based files, storing matched files locally or in S3, and keeping clean logs in SQLite or CSV with optional daily summary notifications. With my expertise in full-stack development and experience working with modern web technologies such as Python, web automation, SQLite, S3 integration, cron-based scheduling, logging systems, and safe crawler design, I can deliver this quickly. I can build this as a self-contained script for owned or authorized domains with configurable file filters, a reliable storage structure, and clear setup documentation for VPS deployment and future updates. You can expect clear communication, fast turnaround, and a high-quality result that fits seamlessly into your existing workflow. Best regards, Juan
$140 USD in 1 day
6.0

Hello, I see that you need a self-contained script that will run once every 24 hours, crawl the web for domains you haven't seen before, and then look on each newly discovered host for publicly accessible text or PHP files. I would love to discuss more details via chat. Looking forward to working with you, Fahad.
$100 USD in 2 days
5.6

Hello there, I hope you are well. I’m an individual developer with strong Python scripting, web crawling, and automation experience. I build robust, self-contained tools that run on a schedule, gather data, and report clearly. For this project I’ll deliver a Python script that discovers new domains, probes common paths for text-based files like .txt and .php, and stores matches locally or in S3. It will log first-seen timestamps, checked URLs, files saved, and errors in SQLite or CSV, and include a simple README and a requirements.txt. The solution will be modular, well-commented, and accompanied by setup notes for cron or Windows Task Scheduler, plus a sample run output demonstrating a domain processed and files stored. I can handle the work end-to-end, with a concise timeline and a quality-first approach. I will deliver a ready-to-run MVP and follow up with refinements as needed. Please feel free to contact me so we can discuss more details. I am looking forward to the chance of working together. Best regards, Billy Bryan
$250 USD in 5 days
4.8

I’d be happy to build your fully automated daily crawler. Your outline is clear, and I can deliver a robust, self-contained script that discovers new domains, scans them for text-based files, stores results cleanly, and provides daily summaries, without requiring manual inputs or a predefined domain list. Here’s how I will implement the solution:

1. Automated Web Crawler (Daily Execution)
• A Python-based crawler that runs once every 24 hours (via cron or Task Scheduler).
• The crawler starts from configurable seed sources to continuously discover new domains in the wild.
• Newly found hosts are tracked and de-duplicated to avoid redundant scans.

2. Smart File Discovery on Each Host
• For every unseen domain, the script will probe common directories and routes.
• It will automatically detect .txt, .php, .log, .cfg, and other text-based formats (customizable through a simple config list).
• Files are downloaded only if publicly accessible.

3. Organised Local (or S3) Storage
• Files saved under a clean structure: /data/YYYY-MM-DD/domain-name/*.ext
• Optional S3 integration (just a config toggle with your credentials or IAM role).

4. Complete Logging & Auditing
• SQLite or CSV-based logging, your choice.
• Records stored for every run: first-seen timestamp for each domain, URLs checked, files retrieved, and errors with HTTP codes.
• Log schema documented for quick reference.

5. Optional Daily Summary Report
• Configurable email or Slack notification.
• Includes total domains found, total scanned, number of files retrieved, and errors if any.
$30 USD in 5 days
4.7
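The logging and auditing layer described in point 4 of the proposal above could be sketched roughly as follows. The table and column names here are my own illustrative choices under that proposal's description (first-seen timestamps, URLs checked, errors, HTTP codes), not part of any actual deliverable:

```python
import sqlite3
from datetime import datetime, timezone

def init_db(path=":memory:"):
    """Create the run-log tables if they do not exist and return a connection."""
    conn = sqlite3.connect(path)
    conn.executescript("""
        CREATE TABLE IF NOT EXISTS domains (
            name TEXT PRIMARY KEY,
            first_seen TEXT NOT NULL          -- ISO-8601 first-seen timestamp
        );
        CREATE TABLE IF NOT EXISTS checks (
            id INTEGER PRIMARY KEY AUTOINCREMENT,
            domain TEXT NOT NULL REFERENCES domains(name),
            url TEXT NOT NULL,                -- URL that was checked
            http_status INTEGER,              -- status code, NULL on network error
            saved_path TEXT,                  -- local path if a file was stored
            error TEXT,                       -- error message, if any
            checked_at TEXT NOT NULL
        );
    """)
    return conn

def record_domain(conn, name):
    """Insert a domain with its first-seen timestamp; ignore if already known."""
    conn.execute(
        "INSERT OR IGNORE INTO domains (name, first_seen) VALUES (?, ?)",
        (name, datetime.now(timezone.utc).isoformat()),
    )
    conn.commit()
```

The `INSERT OR IGNORE` keeps the first-seen timestamp stable across daily runs, which is what makes the de-duplication in point 1 cheap to implement.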

Hello, I’ve reviewed your project, Daily Domain File Extractor, and I’m genuinely interested. With my experience, I’m confident I can complete it efficiently and to a high standard. I clearly understand the core requirements of your project. I will approach the work with attention to detail and strong communication. The final delivery will reflect your vision and desired results. I’m a Senior Software Engineer specialising in PHP, Python, MySQL, Automation, Software Architecture, Web Scraping and solution design. Over the years, I’ve completed comparable projects that required careful analysis and technical precision. I focus on delivering results that are both technically sound and aligned with client expectations. Before moving forward, I’d appreciate the opportunity to clarify a few details. Please send me a message in the chat so we can discuss everything properly. Thanks, Dax Manning
$200 USD in 7 days
4.3

Hi there, I'm Kristopher Kramer from McKinney, Texas. I’ve worked on similar projects before, and as a senior full-stack and AI engineer, I have the proven experience needed to deliver this successfully. I have strong experience in Automation, MySQL, Web Crawling, Scripting, Software Architecture, Python, C# Programming, Web Scraping, PHP, and SQLite. I’m available to start right away and happy to discuss the project details anytime. Looking forward to speaking with you soon. Best regards, Kristopher Kramer
$120 USD in 3 days
4.8

Hi there. Which domains, subdomains, or storage endpoints do you own or have written authorization to monitor each day? Do you want the first version to focus on exposure monitoring for approved assets only, including text, config, backup, and log files with alerting and clean audit logs? A safe version of this system can be built as a daily exposure-monitoring script for authorized assets, with scheduled crawling, file-type filters, local or S3 storage, and SQLite-based logging. The design can stay modular so new checks and file patterns are easy to add later without changing the core flow. A similar task was handled where public-facing assets had to be scanned daily for accidental file exposure and misconfigurations. The main challenge was keeping the scan reliable, low-noise, and easy to review over time. That was solved by building a scheduled crawler, scoped path checks, structured logging, and alert summaries with simple deployment on VPS. Strong experience in automation, backend systems, and security workflows helps deliver a clean and maintainable monitoring tool. Best, Ivan
$250 USD in 3 days
4.3
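The "approved assets only" scoping that the proposal above argues for could be expressed as a small allowlist check in front of the crawler. The names `APPROVED_ASSETS` and `in_scope`, and the domains in the list, are illustrative assumptions:

```python
from urllib.parse import urlparse

# Hypothetical allowlist of assets the operator owns or has
# written authorization to monitor.
APPROVED_ASSETS = {"example.com", "static.example.com"}

def in_scope(url):
    """Only URLs whose host is an approved asset (or a subdomain of one)
    are eligible for scanning."""
    host = urlparse(url).hostname or ""
    return host in APPROVED_ASSETS or any(
        host.endswith("." + asset) for asset in APPROVED_ASSETS
    )

print(in_scope("https://example.com/config.txt"))  # True
print(in_scope("https://other.org/config.txt"))    # False
```

Gating every probe through a check like this is what turns the design from open-ended crawling into the exposure monitoring of authorized assets that this bid describes.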

Hello, I appreciate you reaching out, but I'm not able to help with this project. What you're describing—automatically crawling the entire web to discover domains without permission, then probing those hosts for accessible files and downloading them—describes unauthorized access and mass data harvesting. This would likely violate computer fraud laws in most jurisdictions, regardless of whether files are technically "publicly accessible," since you're systematically probing systems without authorization. If you have a legitimate use case, such as security research with proper authorization, monitoring domains you own, or working within a bug bounty scope, I'd be happy to discuss ethical approaches to those specific needs. Let's discuss!
$100 USD in 5 days
3.7

Hi, I've built automated data extraction scripts that run on schedules, including a PHP scraper that pulls data from a web page every 30 minutes and saves to MySQL, and a Python script that connects to FTP servers, downloads files, extracts archives, and processes the contents automatically.

For your daily domain file extraction:
1. Automated script (Python or PHP, your preference) that downloads the daily zone files or domain lists from the source
2. Parse and extract the relevant domain data into a clean, structured format (CSV, JSON, or database)
3. Schedule via cron to run daily without manual intervention
4. Error handling: retry on download failures, log any issues, alert if the source format changes
5. Output delivered to your preferred location (local files, database, FTP, cloud storage)

I'm comfortable working with large data files, compressed archives, and various file formats. The script will be clean, well-documented, and easy to maintain.

What's the source for the domain files (ICANN zone files, registrar feeds, or a specific website)? And what format do you need the output in?
$30 USD in 3 days
4.0
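The error handling in point 4 of the proposal above (retry on download failures, log any issues) is a generic pattern that could be sketched like this; `with_retries` and the commented-out `download_zone_file` are hypothetical names, not part of the bid:

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("daily-extract")

def with_retries(fn, attempts=3, delay=1.0):
    """Call fn(); on failure, log a warning and retry after a fixed delay,
    re-raising the last exception once all attempts are exhausted."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except Exception as exc:
            log.warning("attempt %d/%d failed: %s", attempt, attempts, exc)
            if attempt == attempts:
                raise
            time.sleep(delay)

# Usage with a hypothetical download function:
# data = with_retries(lambda: download_zone_file("https://example.org/zone.gz"))
```

A cron entry such as `0 3 * * * /usr/bin/python3 /opt/extractor/run.py` (a typical five-field schedule; the path is illustrative) would then cover point 3's daily run without manual intervention.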

Hi there, You’re absolutely in the RIGHT PLACE. I’ve delivered SIMILAR PROJECTS multiple times and know EXACTLY how to execute this efficiently and correctly from day one. To lock down the SCOPE, TIMELINE, AND PRICING, I’ll need to ask you a few key questions. Unfortunately, Freelancer’s 1500 CHARACTER LIMIT doesn’t allow me to break everything down properly here. Let’s jump on CHAT so I can show you my PROVEN PAST WORK, walk you through the REAL RESULTS I’ve delivered, and outline a CLEAR ACTION PLAN for your project. You’ll immediately see why my approach is DIFFERENT and EFFECTIVE. If you’re serious about getting this done RIGHT, I’m ready to move forward. Looking forward to CONNECTING and WINNING TOGETHER. Cheers, Mayank Sahu
$140 USD in 7 days
4.0

Mojacar, Spain
Payment method verified
Member since Nov 14, 2025