
Closed
Posted
Paid on delivery
I need a highly experienced OpenClaw specialist to build and deploy a complete, production-ready, autonomous public records scraping and lead generation system on AWS EC2. The goal is monthly automated pulls of various public records from multiple county/jurisdiction government portals to identify motivated-seller triggers in the real estate space (e.g., tax delinquencies, probate filings, liens, court records).

Key Requirements & Deliverables:

Infrastructure Setup
- Provision and secure an Ubuntu 24.04 LTS EC2 instance ([login to view URL] or equivalent: 2 vCPU, 4 GB RAM recommended).
- Harden security: key-only SSH, UFW firewall (SSH from my IP only or via Tailscale), no unnecessary ports open.
- Install OpenClaw as a persistent daemon (systemd service) with auto-start on reboot. Use official install methods (e.g., curl installer or npm) and run onboarding securely.
- Optional but preferred: set up Tailscale for secure remote access without exposing ports.

Core Automation
- Install and configure a Playwright-based scraper skill (e.g., playwright-mcp or playwright-scraper-skill) with stealth mode, residential proxy rotation, CAPTCHA handling (2Captcha or similar integration), and anti-bot evasion.
- Build a master orchestrator agent that reads from a configurable JSON/CSV list of jurisdictions/portals (I will provide the list privately after hire).
- Create reusable, agentic skills for scraping various public records types: tax delinquent lists, probate/estate filings, divorce/civil court records, liens/judgments, sheriff sales, code violations, and other motivated-seller indicators.
- Agents should handle login/guest search, form filling, pagination, table extraction, and PDF/CSV downloads where available, using LLM-driven dynamic selectors from page snapshots.

Scheduling & Execution
- Set up monthly scheduled runs (e.g., the 1st and/or 15th of the month) via cron or OpenClaw's built-in scheduler.
- Include robust error handling, retries, logging, and failure alerts.
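The retry and failure-isolation requirements above can be sketched as a small wrapper around each jurisdiction's scrape. This is a minimal illustration in Python with exponential backoff; `run_with_retries` and the task callable are hypothetical names for this sketch, not part of any real OpenClaw API.

```python
import time
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("orchestrator")

def run_with_retries(task, name, attempts=3, base_delay=1.0):
    """Run one scrape task, retrying with exponential backoff.

    Returns the task's result, or None if every attempt failed,
    so one failing jurisdiction never aborts the whole monthly run.
    """
    for attempt in range(1, attempts + 1):
        try:
            return task()
        except Exception as exc:
            log.warning("%s failed (attempt %d/%d): %s",
                        name, attempt, attempts, exc)
            if attempt < attempts:
                # back off 1s, 2s, 4s, ... between attempts
                time.sleep(base_delay * 2 ** (attempt - 1))
    log.error("%s: all %d attempts failed", name, attempts)
    return None  # caller records the failure and fires the alert
```

In a real deployment, the `None` branch is where the Slack failure alert the spec asks for would be sent.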
Output & Storage
- After each jurisdiction scrape (or full run), generate clean CSV/JSON files with the extracted data (e.g., owner name, addresses, parcel ID, filing date, trigger type).
- Upload files automatically to my AWS S3 bucket (I will provide credentials/access keys securely after hire). Use organized folders like s3://bucket/leads/YYYY-MM/jurisdiction-trigger.csv.
- Send a completion summary notification to my Slack channel (bot token provided) with stats (e.g., number of leads per type) and direct S3 links.

Interaction & Monitoring
- Primary interaction: Slack integration (configure the OpenClaw gateway for a Slack bot). I should be able to message the agent from Slack to trigger manual runs, check status, or get reports.
- Optional: set up local dashboard access via SSH tunnel (e.g., http://localhost:18789) for logs/skill management.
- Provide simple commands/docs for basic checks (e.g., status, restart).

Extras
- Full documentation: setup overview, agent prompts/skills explanation, troubleshooting guide, one-line commands.
- Short video walkthrough (Loom/screen recording) of the running system.
- 30 days of post-handover support for minor tweaks/bug fixes (e.g., when a portal changes its UI).
- My internal team will handle all database ingestion from S3 onward; no Postgres or other DB setup is needed on this EC2.

Must-Have Qualifications:
- Proven experience with OpenClaw deployments on Linux/EC2/VPS (please share links/screenshots of previous OpenClaw + Playwright projects, especially anything government/public-records or real-estate related).
- Strong browser automation skills (Playwright/Puppeteer) with proxy/CAPTCHA handling.
- Experience with AWS (EC2 provisioning, S3 uploads, security groups).
- Slack bot integration experience is a plus.

Budget & Timeline:
- Fixed price preferred (depending on your experience, robustness of error handling, and extras like advanced scoring).
- Target delivery: 7–14 days from start.
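The s3://bucket/leads/YYYY-MM/jurisdiction-trigger.csv layout requested above can be generated deterministically from the run date and jurisdiction metadata. A minimal sketch; the helper name `lead_key` is my own, not from the posting, and the actual upload would pass this key to an S3 client such as boto3.

```python
from datetime import date

def lead_key(jurisdiction: str, trigger: str, run_date: date) -> str:
    """Build the S3 object key for one jurisdiction/trigger export,
    e.g. leads/2025-03/harris-county-tax-delinquent.csv."""
    def slug(s: str) -> str:
        # lowercase, trim, and replace spaces so keys stay URL-safe
        return s.strip().lower().replace(" ", "-")
    return f"leads/{run_date:%Y-%m}/{slug(jurisdiction)}-{slug(trigger)}.csv"
```

Keeping key generation in one pure function means the downstream ingestion team can rely on a stable, testable naming scheme.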
To Apply:
Please bid with your fixed price, estimated timeline, and a brief one-paragraph plan for this setup. Reply with:
- Link/proof of 1–2 relevant OpenClaw or similar automation projects (especially public/government sites).
- Your proposed fixed price and delivery timeline.
- A short plan outline (e.g., how you'll handle portal variability, proxies, Slack, S3).

Automated proposals will be ignored. To qualify, send a quick Loom video link of yourself explaining why you believe you are a fit for this job. Only serious OpenClaw experts; generic scraping bids will be ignored. This is a proven workflow in real estate investing, and I'm looking for someone who has built something similar before. Thanks!
Project ID: 40257654
130 proposals
Remote project
Active 6 days ago
130 freelancers are bidding an average of $2,225 USD for this job

I understand the importance of building a fully automated monthly public records scraper on AWS EC2 using OpenClaw to identify motivated-seller leads in the real estate space. Your project requires expertise in Linux/EC2 deployments, browser automation with Playwright, AWS services, and Slack bot integration. With over 10 years of experience in web and mobile development, including blockchain and AI/ML, I have delivered tailored solutions in the real estate, FinTech, and eCommerce domains. My past projects demonstrate my ability to handle complex automation tasks and deliver high-quality results. I propose a fixed price of $2,400 and a delivery timeline of 14 days. My plan includes setting up OpenClaw on a secure Ubuntu 24.04 LTS EC2 instance, configuring Playwright-based scraper skills with proxy rotation and CAPTCHA handling, scheduling monthly runs via cron, and ensuring seamless Slack integration for interaction. For proof of my capabilities, please refer to my previous OpenClaw projects and similar automation tasks. I am excited about the opportunity to work on this project and look forward to discussing the details further.
$2,400 USD in 30 days
8.8

Hello, I will provision and harden Ubuntu 24.04 on EC2, deploy OpenClaw as a systemd daemon, architect reusable Playwright agent skills with proxy rotation and CAPTCHA integration, implement a JSON-driven master orchestrator, configure cron/scheduler runs, automate S3 uploads and Slack summaries, and deliver full documentation plus a walkthrough with rollback-safe configuration. With 10+ years of experience building secure AWS-based automation systems, browser automation with Playwright (stealth/proxy flows), and production scraping infrastructures with monitoring and failure recovery, I focus on resilient, maintainable architectures rather than fragile scripts. Let's connect on chat so I can share relevant automation case studies, confirm a realistic fixed-price range within your 7–14 day window, and outline a precise phased execution plan tailored to your jurisdiction list. Thank you. Regards, Gaurav Garg
$2,250 USD in 12 days
8.5

Hello! We have strong, hands-on experience with exactly this type of infrastructure and automation setup. Our team has deployed complex scraping and orchestration systems on Ubuntu-based EC2 instances, including hardened server configurations and long-running daemonized services managed via systemd. We have deep experience with Playwright-based automation, including stealth configurations, proxy rotation, CAPTCHA handling, dynamic selector handling, and anti-bot evasion techniques. We've worked with government-style portals and highly variable public-facing systems where pagination, session handling, and form-driven workflows require adaptive logic rather than static scraping. On the infrastructure side, we're fully comfortable with AWS and Slack bot integrations for command triggers, status monitoring, and automated reporting. We've built systems where structured outputs are generated, versioned, and pushed to S3 with clear folder hierarchies for downstream ingestion. We understand the importance of resilience in monthly automated workflows: retries, logging, isolation of failed jurisdictions, and alerting mechanisms are standard practice in our deployments. Please review our profile https://www.freelancer.com/u/tangramua, where you can find detailed information about our company, our portfolio, and recent client reviews. Please contact us via Freelancer Chat to discuss your project in detail. Best regards, Kateryna, Sales Department, Tangram Canada Inc.
$2,450 USD in 7 days
8.8

⭐⭐⭐⭐⭐ Build and Deploy Public Records Scraping System with OpenClaw ❇️ Hi My Friend, I hope you are doing well. I reviewed your project and see you are looking for an OpenClaw specialist. Look no further; Zohaib is here to help you! My team has completed over 50 similar projects in public records scraping. I will set up your AWS EC2 instance, install OpenClaw, and create a fully automated system for monthly data pulls. I will ensure security and provide detailed documentation, all within your budget. ➡️ Why Me? I can easily build your autonomous public records scraping system as I have 5 years of experience in web scraping, AWS management, and automation. My skills include browser automation, security hardening, and data storage solutions. Moreover, I have a strong grip on Playwright and Slack integrations. ➡️ Let's have a quick chat to discuss your project in detail and show you samples of my past work. I look forward to our conversation! ➡️ Skills & Experience: ✅ OpenClaw Deployment ✅ AWS EC2 Setup ✅ Playwright Automation ✅ Public Records Scraping ✅ Security Hardening ✅ Data Extraction ✅ JSON/CSV Handling ✅ Slack Integration ✅ Error Handling ✅ Scheduling Tasks ✅ Proxy Management ✅ Documentation Creation Waiting for your response! Best Regards, Zohaib
$1,800 USD in 2 days
7.8

Hello. I am an experienced OpenClaw and browser automation developer with 9+ years of hands-on experience, and I can assist you in building and deploying a secure, production-ready autonomous public records scraping and lead generation system on AWS EC2. To proceed, I kindly request the following information: 1. Can you share sample target portals so I can assess anti-bot complexity and login requirements? 2. Do you already have residential proxy and CAPTCHA service accounts, or should I provision and configure them? 3. What specific data fields must be standardized across all jurisdictions in the output files? 4. Should the Slack bot support advanced commands such as per-county reruns and log exports from chat? Please initiate a chat so we can discuss the project thoroughly. I look forward to collaborating with you on delivering a robust, secure, and fully automated public records scraping system on AWS.
$2,250 USD in 35 days
7.8

Hello, as a seasoned full-stack developer with a strong background in Python, JavaScript, and web scraping, I am well prepared to tackle the challenges of your OpenClaw project. I have worked extensively on automating web scraping tasks for dynamic data extraction, making me proficient with tools like Playwright and Puppeteer; I have even developed my own Python libraries for web scraping. This expertise will prove valuable for retrieving various types of public records from multiple county/jurisdiction portals. With a thorough understanding of AWS, I can confidently set up and provision the recommended Ubuntu instance with strong security and vulnerability protection. I am adept at handling the proxy rotation and CAPTCHA situations that often hinder automation, and experienced in providing secure remote access without exposing sensitive ports, an added advantage for system security. Moreover, my proficiency in back-end development makes me capable of building a reusable and scalable orchestration agent per your requirements. I will ensure a robust error-handling system with effective logging and intelligent failure alerts, and implementing monthly schedules using cron or OpenClaw's own scheduler is well within my skill set. After each jurisdiction scrape, you will have well-organized output files automatically uploaded. Thanks!
$2,500 USD in 20 days
7.1

Hi there, I am a seasoned Full-Stack Developer with extensive experience in building and deploying automated systems. I have reviewed your project requirements for developing a fully-automated public records scraping system on AWS EC2 using OpenClaw. I am confident in my ability to provision a secure Ubuntu 24.04 EC2 instance, install OpenClaw as a persistent daemon, set up Playwright-based scraping skills, orchestrate agent actions, schedule monthly runs, handle data output & storage, and configure interactions with Slack and monitoring. My approach focuses on robust error handling, consistent delivery of clean data, and seamless integration with your existing tools. For the next steps, I propose a detailed plan to execute the setup efficiently and effectively. Looking forward to your response. What specific motivated seller triggers are you primarily looking to identify through this automated public records scraper?
$2,550 USD in 53 days
6.9

Hi, I am excited to apply for your project to build a fully-automated public records scraper using OpenClaw on AWS EC2. With extensive experience as a top California freelancer, I have successfully completed multiple OpenClaw deployments and browser automation projects specifically tailored for real estate purposes, earning five-star reviews from clients. I understand the critical need for secure and reliable scraping of public records to identify motivated seller leads. I will provision a secure Ubuntu EC2 instance, implement rigorous security measures, install OpenClaw as a systemd service, and develop a comprehensive scraping solution that incorporates error handling and integration with Slack for seamless notifications and interaction. My plan includes using Playwright for robust scraping capabilities while ensuring that all outputs are securely stored in your AWS S3 bucket. Let's discuss how I can get started and ensure a smooth implementation that meets your requirements. What specific jurisdictions will the scraper need to access, and are there any particular data fields that are essential for your leads? Thanks,
$2,750 USD in 16 days
6.4

Sure. I have recently completed two OpenClaw AI projects, installing it on Windows and macOS. In those projects, I used a Gemini API key for the AI agent and successfully connected message channels. Please reach out to me for further details.
$1,500 USD in 5 days
6.9

Having been involved in systems administration and DevOps for well over a decade, I have developed the skills needed to set up and optimize IT infrastructures that are efficient and scalable. My deep understanding of Linux system administration across distributions such as Ubuntu 24.04 LTS (a requirement for this project) places me in an excellent position to lead this task. My proficiency with AWS and experience with EC2 provisioning align directly with your needs, as does my command of browser automation agents such as OpenClaw, which is essential for this multipurpose scraper. I have also worked on projects involving real-time data processing; my consistent maintenance habits combined with my commitment to security will be valuable in guaranteeing successful monthly runs. I have deployed my own agents using OpenClaw.
$2,500 USD in 1 day
6.0

Your scraper will fail the moment a county portal changes its DOM structure or implements Cloudflare's bot detection. I've seen three real estate scraping projects collapse because they hardcoded CSS selectors instead of building adaptive agents that can reason through page changes.

Before architecting this, I need clarity on two things: What's your expected monthly volume per jurisdiction (are we scraping 10 counties or 500)? And do you already have a list of portals that require authenticated logins versus guest access? This determines whether we need session management with cookie persistence or can run stateless scrapes.

Here's the architectural approach:
- OpenClaw + Playwright: Deploy a self-healing agent system where the LLM analyzes page snapshots in real time and generates dynamic selectors. When a portal redesigns, the agent adapts without code changes. I'll configure residential proxy rotation through Bright Data or Smartproxy with automatic IP cycling every 5 requests.
- AWS infrastructure: Provision a hardened Ubuntu 24.04 EC2 instance with Tailscale mesh networking (no exposed SSH ports), a systemd service for the OpenClaw daemon with auto-restart on failure, and CloudWatch alarms for CPU spikes or memory leaks during long scrapes.
- CAPTCHA + anti-bot: Integrate the 2Captcha API with retry logic and implement Playwright's stealth plugin to mask automation signatures. I'll add random mouse movements and typing delays to mimic human behavior; this reduced detection rates by 80% on my last county records project.
- S3 + Slack pipeline: After each jurisdiction completes, the agent uploads structured CSVs to S3 with metadata tagging (jurisdiction, trigger type, record count) and posts a Slack message with pre-signed download links. I'll include a daily summary report showing success/failure rates per portal.
- Scheduling: Use systemd timers (more reliable than cron for long-running tasks) with staggered execution windows to avoid hitting multiple portals simultaneously and triggering rate limits.

I've built similar multi-jurisdiction scrapers for two PropTech companies that pulled 40K+ records monthly from 150+ county sites. One system has run for 18 months with 94% uptime despite constant portal changes. Let's schedule a 15-minute technical call to walk through your jurisdiction list and identify high-risk portals before I finalize the fixed price.
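The "IP cycling every 5 requests" idea in this bid can be expressed as a tiny rotation helper. A sketch assuming a pre-configured proxy pool; the class and its names are illustrative, not any proxy provider's API.

```python
from itertools import cycle

class ProxyRotator:
    """Hand out the same proxy for `per_proxy` consecutive requests,
    then advance to the next one, wrapping around the pool."""

    def __init__(self, proxies, per_proxy=5):
        self._pool = cycle(proxies)       # endless iterator over the pool
        self._per_proxy = per_proxy
        self._count = 0                   # requests served on current proxy
        self._current = next(self._pool)

    def next_proxy(self):
        if self._count == self._per_proxy:
            # quota reached: rotate to the next proxy in the pool
            self._current = next(self._pool)
            self._count = 0
        self._count += 1
        return self._current
```

Each Playwright context would then be launched with `rotator.next_proxy()` as its proxy setting.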
$2,030 USD in 30 days
6.0

Hi, I came across your project "OpenClaw Expert: Build Fully-Automated Monthly Public Records Scraper on AWS EC2 (Multi-Jurisdiction Motivated Seller Leads)" and I'm confident I can help you with it. About me: I'm an agency owner with over 8 years of experience in PHP, Amazon Web Services, Automation, JSON, and Real Estate, and I understand exactly what's needed to deliver high-quality results on time. Why choose me? ✅ Expertise in the required technologies and 1 year of post-deployment free support ✅ On-time delivery and excellent communication ✅ 100% satisfaction guarantee. Let's discuss your project in more detail. I'm available to start immediately and would love to hear more about your goals. Looking forward to working with you! Best regards, Deepak
$2,200 USD in 45 days
5.7

Hello, I've carefully reviewed your job description and have strong experience with PHP, Amazon Web Services, JSON, Ubuntu, automation, web scraping, data extraction, real estate, and OpenClaw. I can build a reliable web scraping solution tailored specifically to your needs. Whether using Node.js with Puppeteer/Cheerio or Python with Selenium/BeautifulSoup, I will extract, clean, and organize your data efficiently. I also handle anti-bot protections, pagination, and full automation as required. As you can see from my profile, my web scraping reviews are excellent, reflecting my commitment to quality work. I focus on writing clean, maintainable, and scalable code because I know the difference between 99% and 100%. If you hire me, I'll do my best until you're completely satisfied with the result. Let's discuss your target websites and preferred data format. Thanks, Denis
$1,500 USD in 15 days
5.5

Hello, I've deployed production OpenClaw systems on EC2 for real estate lead generation including tax delinquency and probate scraping pipelines with Playwright, residential proxy rotation, and 2Captcha integration - so I understand exactly where government portals break standard scrapers and how to handle it. My approach: EC2 Ubuntu 24.04 hardened with key-only SSH and Tailscale, OpenClaw installed as a systemd daemon with auto-restart. Playwright-MCP skill configured with stealth mode, proxy rotation, and CAPTCHA handling baked in from day one. Master orchestrator reads your jurisdiction JSON list, routes to the correct reusable skill per record type - tax delinquent, probate, liens, sheriff sales, code violations - handling login flows, pagination, and PDF/CSV extraction dynamically using LLM-driven selectors from page snapshots. Output lands in your S3 bucket under organized folders by month and jurisdiction, and a Slack completion summary fires automatically with lead counts and direct S3 links. Manual triggers and status checks work directly from Slack as well. Monthly cron handles scheduled runs with retry logic and failure alerts. Deliverables include full documentation, one-line command reference, Loom walkthrough, and 30 days post-handover support for portal UI changes. Fixed price: $2000. Delivery: 10 days. Happy to share relevant project screenshots privately. Thank you, John
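The "orchestrator reads your jurisdiction JSON list, routes to the correct reusable skill per record type" design this bid describes can be sketched as a dispatch table keyed on the trigger type. The skill functions here are placeholders of my own invention, not real OpenClaw skills.

```python
import json

def make_orchestrator(skills):
    """Return a runner that reads a jurisdiction config (a JSON string)
    and dispatches each entry to the skill registered for its trigger type."""
    def run(config_json):
        results = {}
        for entry in json.loads(config_json):
            skill = skills.get(entry["trigger"])
            if skill is None:
                # unknown trigger: record and move on instead of crashing
                results[entry["name"]] = "skipped: no skill for " + entry["trigger"]
                continue
            results[entry["name"]] = skill(entry)
        return results
    return run
```

Registering one function per record type (tax delinquent, probate, liens, and so on) keeps each portal family's logic isolated and independently testable.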
$2,000 USD in 10 days
5.5

Hi I’ll deploy a production-ready OpenClaw system on AWS EC2 with hardened Ubuntu 24.04, persistent services, and monthly autonomous runs that pull motivated-seller triggers from county portals. Your requirements for Playwright with stealth, proxy rotation, CAPTCHA handling, and LLM-driven selectors will be built into reusable scraping skills, orchestrated from a configurable jurisdiction list. I’ll wire scheduled runs with retries, logs, and Slack alerts, then auto-export clean CSV/JSON files to S3 using your YYYY-MM folder structure. Slack will be the control plane for manual runs, status checks, and summaries, with optional Tailscale for secure access and a local dashboard via SSH tunnel. I’ve delivered Playwright-based scrapers on EC2 with proxy/CAPTCHA workflows and Slack + S3 pipelines for public-records style sources. If you’re open to a quick Loom exchange, I can share relevant examples and confirm fixed price, 7–14 day delivery, and the rollout plan. Best Regards, Fizza Nadeem K
$2,250 USD in 7 days
5.6

Hello, I have reviewed the details of your project. I will provision an Ubuntu 24.04 LTS EC2 instance on AWS, secure it with key-only SSH and UFW restricted to your IP or Tailscale, and install OpenClaw as a systemd service with auto-restart on reboot. Playwright will be configured with stealth settings, residential proxy rotation, CAPTCHA solving via 2Captcha, and dynamic selector extraction using LLM-driven page snapshots to handle portal variability. I will design a master orchestrator agent that reads a configurable JSON or CSV of jurisdictions and dispatches reusable scraping skills for tax delinquencies, probate, liens, court filings, and other trigger types. Monthly execution will run via cron with structured logging, retry logic, and Slack alerts for failures and completion summaries. Extracted data will be normalized into clean CSV and JSON files and uploaded to your S3 bucket in organized year-month folders. Slack bot integration will allow manual triggers, status checks, and report summaries directly from chat. Let's have a detailed discussion; it will help me give you a complete plan, including a timeline and estimated budget. I will share my portfolio in chat. I look forward to hearing from you. Thanks, best regards, Mughira
$2,250 USD in 7 days
5.2

I can build your fully automated OpenClaw + Playwright scraping system with a strong focus on reliability, security, and scalability. With 13+ years of backend and automation experience, I’ve deployed secure AWS EC2 environments (Ubuntu) with key-only SSH access, hardened UFW firewall rules, and production-ready configurations. I regularly build browser automation systems using Playwright for complex data extraction across multiple sources, including real estate datasets and public record systems. For your project, I can: • Configure and secure EC2 (key-only SSH, firewall, fail2ban if needed) • Implement OpenClaw + Playwright automation pipelines • Handle stealth scraping techniques and rate control • Integrate APIs where applicable • Structure clean data pipelines and upload results to S3 • Add logging, retry logic, and error monitoring I design scrapers to be resilient — with queue systems, modular architecture, and clear documentation — so they can run unattended and scale as lead volume grows. You’ll receive a well-structured, documented system that’s secure, maintainable, and ready for continuous operation. Happy to discuss target sources and volume expectations to fine-tune the architecture.
$2,250 USD in 7 days
5.3

I’ve built production-ready OpenClaw systems on EC2 that automate multi-jurisdiction scraping of public records for real estate leads, including tax delinquencies and probate filings. In one project, I set up Playwright scrapers with proxy rotation and 2Captcha integration to handle anti-bot measures across different county portals. My plan is to provision a secured Ubuntu 24.04 EC2 instance with strict SSH/UFW rules plus optional Tailscale access. I will install OpenClaw as a systemd service using the official methods. For each jurisdiction, I’ll build reusable agentic skills that adapt via LLM-driven selectors to portal changes, handling logins, pagination, and file downloads. Monthly cron jobs will trigger scrapes with robust retries and alerts. Extracted CSV/JSON data will upload automatically to your S3 bucket under organized folders. Slack bot integration will enable you to trigger runs and get reports instantly. I’ll deliver full setup docs plus a short screen walkthrough and provide 30 days of support for tweaks after handoff. Does your jurisdiction list usually include portals with heavy login/anti-bot controls? Also, are PDF downloads generally consistent in format or highly variable? I can deliver within 12 days for $2800. Ready to start as soon as you share the portal list.
$1,500 USD in 7 days
5.1

Hello, I am Vishal Maharaj, a seasoned professional with 20 years of expertise in PHP, Amazon Web Services, Automation, JSON, Ubuntu, and Web Scraping. I have carefully reviewed your project requirements for building a fully-automated public records scraper on AWS EC2 using OpenClaw. To accomplish this task, I will start by setting up a secure Ubuntu EC2 instance, installing OpenClaw as a persistent daemon, and configuring Playwright-based scraper skills for automated data extraction. I will create a master orchestrator agent to handle scraping from multiple jurisdictions, schedule monthly runs, and automate the storage of extracted data in your AWS S3 bucket. Additionally, I will integrate Slack for interaction and monitoring purposes. I am eager to discuss the project details further. Please initiate the chat to explore how I can assist in achieving your objectives. Cheers, Vishal Maharaj
$2,000 USD in 20 days
5.8

Hello, I’m excited about the opportunity to contribute to your OpenClaw-based public records automation project. With deep experience in AWS EC2 hardening, Playwright-driven scraping with proxy and CAPTCHA handling, and building orchestrated automation pipelines with S3 and Slack integrations, I can deliver a secure, production-ready system that runs autonomously and scales reliably across multiple jurisdictions. I’ll architect the EC2 environment with hardened Ubuntu 24.04, systemd-managed OpenClaw services, stealth scraping skills, structured S3 exports, and Slack-controlled orchestration so your monthly lead generation runs are resilient, monitored, and easy to trigger or audit. You can expect clean infrastructure setup, robust error handling, clear documentation, and a stable deployment that your team can confidently operate long term. Best regards, Juan
$1,500 USD in 7 days
4.9

Media, United States
Payment method verified
Member since Nov 5, 2018