
Closed
Posted
Paid on delivery
Nationwide Property Auction Web Scraping & Intelligent Alert System (Ongoing)

About Us
We're a commercial real estate investment firm that acquires distressed properties nationwide. We have the capital to close on any deal in the U.S.; our bottleneck is finding opportunities before competitors do. We're building an automated system that monitors every property auction source in the country, filters against our criteria, and alerts us only on qualified deals.

This is not a data-dump project. We don't want spreadsheets with thousands of rows. We want a smart radar system that scans everything, filters ruthlessly, and pings us only when something matches. This is a long-term, ongoing engagement: we build incrementally and need one reliable developer who grows with us.

The Problem
U.S. distressed property auctions are fragmented across 3,143 counties, 11+ federal agencies, and 15+ online platforms. County tax sales live on individual county websites or SaaS providers like [login to view URL] (500+ counties) and Grant Street Group. Sheriff sales are on county sheriff sites. Federal seized properties are on [login to view URL], [login to view URL], and others. No single database captures everything.

What You'll Build
Two components: a Data Ingestion Engine (scraping) and an Alert & Filter System (intelligence).
PART 1: Data Ingestion (Phased)

Phase 1 - Major Platform Scrapers (Month 1-2):
- [login to view URL] (500+ county subdomains, 15+ states; #1 priority)
- Grant Street Group / LienHub / DeedAuction (FL, AZ, MD, CA)
- [login to view URL] (county tax sales, sheriff sales, federal forfeiture)
- [login to view URL] (Apify pre-built scraper exists)
- [login to view URL], [login to view URL], [login to view URL], [login to view URL]
- SRI Services / ZeusAuction (Indiana, 92 counties)
- CivicSource (Louisiana)

Phase 2 - Federal Agency Monitoring (Month 2-3):
- HUD HomeStore, Fannie Mae HomePath, Freddie Mac HomeSteps
- IRS Auctions, U.S. Treasury/TEOAF ([login to view URL])
- GSA ([login to view URL]; has a REST API)
- FDIC asset sales, VA REO ([login to view URL]), USDA

Phase 3 - Individual County Websites (Month 3+, Ongoing):
- Custom crawlers for county tax collector, sheriff, and clerk of court sites
- Start with the top 200 counties by population, expand over time
- Many counties post PDF lists, which requires OCR/PDF parsing

PART 2: Alert & Filter System (Critical)
Raw scraped data is worthless to us. Our team is small and cannot review thousands of listings. You must build a filtering and scoring pipeline.

Our Investment Criteria:
- Commercial properties: 8%+ cap rate, 12%+ cash-on-cash, 70%+ occupancy, $2M+ deal size.
- Target types: retail, office, industrial, multifamily, medical office, government-leased, NNN, mixed-use.
- Residential/smaller: estimated value must be 30%+ higher than the opening bid. Flag anything under $100K as lower priority.

What the Filter System Must Do:
- For every listing, auto-look up the estimated market value via the ATTOM Data API (we provide access), county assessor data, or comparable sources.
- Calculate the spread between the opening bid and the estimated value.
- Score each property 1-100 based on: discount to value (highest weight), property type match, deal size, location, auction competition level, and days until auction.
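For illustration, the weighted 1-100 scoring described above could be sketched as follows. The weight values, factor names, and the 50%-discount cap are assumptions for this sketch only; the post says the client supplies the real scoring weights.

```python
# Illustrative scoring sketch. Weights and factor names are assumptions;
# the brief says the client provides the actual scoring weights.
WEIGHTS = {
    "discount": 0.40,      # discount to value carries the highest weight
    "type_match": 0.20,    # property type matches the target list
    "deal_size": 0.15,
    "location": 0.10,
    "competition": 0.05,   # lower auction competition scores higher
    "urgency": 0.10,       # fewer days until auction scores higher
}

def score_listing(factors: dict) -> int:
    """Combine per-factor sub-scores (each 0.0-1.0) into a 1-100 score."""
    raw = sum(WEIGHTS[name] * factors.get(name, 0.0) for name in WEIGHTS)
    return max(1, min(100, round(raw * 100)))

def discount_factor(opening_bid: float, est_value: float) -> float:
    """Spread between opening bid and estimated value, scaled so a bid at
    half of value (or better) saturates the factor at 1.0 (an assumption)."""
    if est_value <= 0:
        return 0.0
    spread = (est_value - opening_bid) / est_value
    return max(0.0, min(1.0, spread * 2))
```

A listing with a large discount and a full type match lands in RED ALERT territory; a listing with no matching factors bottoms out at a score of 1 and is only stored.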
Categorize alerts into tiers:
- RED ALERT (80+): matches all criteria, big discount, auction soon. Send immediately via email + webhook.
- HIGH PRIORITY (60-79): matches most criteria. Include in the daily summary.
- WATCHLIST (40-59): partial match. Weekly report only.
- Below 40: store in the database, don't alert.

Daily summary email by 7 AM ET: new listings scraped, number passing filters, and a ranked top 10-20 opportunities with address, auction date, opening bid, estimated value, discount %, property type, score, and a direct link to the listing. Must be scannable in 30 seconds.

Weekly report: total volume by state, best opportunities, scraper health status, coverage stats.

Deduplication: if the same property appears on multiple platforms, merge it into one record noting all sources.

Data Fields Per Listing
Property address, parcel ID/APN, auction type, auction date/time, opening bid, assessed value, estimated market value, discount %, property type, sqft, lot size, occupancy (if available), county/state, source URL, status (upcoming/active/sold/postponed/canceled), priority score (1-100), alert tier.

Output
- Structured JSON via webhook to our Supabase REST API (we provide the schema + credentials).
- Alert emails as formatted HTML via SendGrid/SES.
- No duplicates; dedup by address + parcel ID.
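A minimal sketch of the tiering and the address-plus-parcel deduplication described above. The tier thresholds come from the brief; the field names (`address`, `parcel_id`, `source_url`) are illustrative assumptions, not the client's actual schema.

```python
def alert_tier(score: int) -> str:
    """Map a 1-100 priority score to the alert tiers defined in the brief."""
    if score >= 80:
        return "RED ALERT"       # immediate email + webhook
    if score >= 60:
        return "HIGH PRIORITY"   # daily summary
    if score >= 40:
        return "WATCHLIST"       # weekly report only
    return "STORE ONLY"          # database only, no alert

def dedup_key(listing: dict) -> tuple:
    """Dedup by address + parcel ID, as the brief requires. The address is
    normalized so trivial whitespace/case differences still match."""
    addr = " ".join(listing.get("address", "").upper().split())
    return (addr, listing.get("parcel_id", "").upper())

def merge_listings(listings: list[dict]) -> list[dict]:
    """Merge records sharing a dedup key, keeping every source URL."""
    merged: dict[tuple, dict] = {}
    for item in listings:
        key = dedup_key(item)
        if key in merged:
            merged[key]["sources"].append(item["source_url"])
        else:
            record = dict(item)
            record["sources"] = [item["source_url"]]
            merged[key] = record
    return list(merged.values())
```

The same property scraped from two platforms collapses into one record whose `sources` list names both, which is what the "merge into one record noting all sources" requirement asks for.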
Technical Requirements
- Python (Scrapy, BeautifulSoup, Selenium) and/or JavaScript (Puppeteer, Playwright)
- API integration (Supabase REST API, ATTOM Data API)
- Anti-bot handling: CAPTCHA solving, IP rotation, proxy management
- Experience scraping government websites (fragile, inconsistent sites)
- PDF parsing and OCR
- Email automation (formatted HTML alerts)
- Scheduling/orchestration (cron, n8n, Airflow, or similar)
- Error handling: scrapers must alert YOU when they break, not fail silently

What We Provide
- ATTOM Data API access for property valuations
- Supabase database schema and API credentials
- Detailed platform documentation and URL structures
- Prioritized county/platform target lists per phase
- Investment criteria and scoring weights
- Fast, responsive communication

Ideal Candidate
- Has scraped real estate or government auction sites (show examples)
- Has built alert/scoring systems on top of scraped data
- Writes maintainable code; sites change constantly, so updates must be easy
- Proactive communicator who tells us immediately when something breaks
- Thinks like an architect, not just a scripter; this system will eventually monitor 3,000+ sources
- Available 20-40 hrs/week, scaling as we expand

To Apply, Include:
- Examples of scraping projects, especially real estate/auctions/government data
- Your approach to building the scoring/alert layer: how do you process thousands of listings daily and surface only the top 20?
- Tools/frameworks you prefer and why
- How you handle anti-bot protections at scale
- Availability and rate
- How you'd approach scraping [login to view URL]'s 500+ county subdomains, structured so adding new counties is trivial

Generic copy-paste proposals will be skipped. If your application doesn't address the filter/alert system, we'll pass.
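The final application question hints at the expected design: a config-driven registry where each county subdomain is one data entry rather than new code. A minimal sketch under that assumption; the subdomain pattern, URL path, and county slugs here are hypothetical placeholders, not the platform's real structure.

```python
# Config-driven scraper registry sketch: adding a county means adding one
# config entry, not writing a new spider. All names/URLs are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class CountyConfig:
    slug: str            # e.g. "broward" -> broward.example-auction.com
    state: str
    list_format: str     # "html" or "pdf" (the PDF path goes through OCR)

COUNTIES = [
    CountyConfig("broward", "FL", "html"),
    CountyConfig("maricopa", "AZ", "pdf"),
]

def listing_url(cfg: CountyConfig) -> str:
    """Build a county's auction-calendar URL from its config entry."""
    return f"https://{cfg.slug}.example-auction.com/auction-calendar"

def scrapers_for_state(state: str) -> list[CountyConfig]:
    """Select every configured county scraper for one state."""
    return [c for c in COUNTIES if c.state == state]
```

One generic spider can then iterate over `COUNTIES`, pick the HTML or PDF/OCR extraction path per entry, and scale from 50 to 500+ subdomains by editing configuration only.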
Project ID: 40297515
22 proposals
Remote project
Active 1 day ago
22 freelancers are bidding on average ₹28,260 INR for this job

I have over 18 years of experience in data mining, web scraping, scraping bots, and Chrome/Opera extensions. Tell us your sources and we will put the data into Excel for you, or we can give you filtered results per your requirements, in the format you want: Excel, JSON, MySQL, databases, XML, you name it. We can also integrate the data with your databases and create JSON outputs. We are not only good at scraping but also at the tools you may need afterwards: we deliver 99% data accuracy, a duplicate finder, statistics on the data, APIs built from the data, software to manage the data, and sites built around it.
₹25,000 INR in 2 days
6.9

Hi there, I hope you're doing well. I reviewed your project, and this is exactly the kind of system I love building. Look no further; Suryansh is here to help! I built something very similar for Dubai real estate listings: we scraped multiple auction platforms, enriched the data with pricing APIs, calculated per-square-foot values, scored properties using OpenAI, and sent alerts only when deals matched the investment criteria. Your project is right in my wheelhouse. Why me? I have 5 years of scraping experience covering 1,000+ websites, including government sites, auction platforms, and fragile county portals. I handle CAPTCHAs, IP rotation, persistent sessions, and all the anti-bot headaches that come with this territory. I currently run a system that scrapes 230 websites daily at midnight using Scrapy with zero manual intervention, saving everything to our servers automatically. I understand that your bottleneck is not data collection but intelligent filtering. I will build a two-part system: robust scrapers for Realauction, Grant Street, Bid4Assets, federal agencies, and county sites, plus a smart alert engine using the ATTOM API that scores properties 1-100, deduplicates across platforms, and sends you only RED ALERT and HIGH PRIORITY deals in a clean daily email. You will not drown in spreadsheets. Skills & experience: ✅ Scrapy & BeautifulSoup ✅ Government site scraping ✅ CAPTCHA & IP rotation ✅ API integration ✅ Property data scoring
₹25,000 INR in 7 days
6.7

Hello there, I have strong experience building large-scale web scraping systems and automation pipelines using Python tools like Scrapy, Playwright, and Selenium. I’ve handled multi-source scraping, data normalization, scoring logic, and alert systems. I can build a scalable ingestion and filtering system with APIs, alerts, and reliable monitoring.
₹35,000 INR in 7 days
6.1

Hello, I will build the ingestion engine using the requested Python and JavaScript tools to navigate the fragmented auction sites, with a robust scraping architecture that handles anti-bot measures and OCR for PDF lists. For the intelligence layer, I will develop a scoring pipeline that integrates the ATTOM Data API to calculate value spreads and rank properties against your criteria. The system will deduplicate records and push high-priority alerts directly to your Supabase instance, while automated summaries are delivered via email as requested. This phased approach scales coverage from major platforms to individual counties while maintaining a clean, actionable data flow. Three questions: 1) Which OCR tool or library do you prefer for the county PDF parsing? 2) Do you have an existing proxy provider for the anti-bot handling? 3) Should the scoring weights be manageable via a configuration file or a database table? Thanks, Bharat
₹35,000 INR in 14 days
4.9

Hello, I bring 12+ years of experience in Python, Scrapy, BeautifulSoup, Selenium, and API integration, specializing in web scraping, data processing, and intelligent alert systems. I can build a scalable property auction scraper across county, federal, and commercial platforms, with PDF/OCR parsing, anti-bot handling, and structured JSON outputs to your Supabase API. Beyond scraping, I’ll implement a tiered alert and scoring system aligned to your investment criteria, complete with daily/high-priority notifications. I deliver clean, maintainable code and proactive communication. Let’s collaborate to create a reliable, intelligent system that surfaces only the top deals efficiently.
₹37,000 INR in 7 days
4.3

Drawing on my extensive experience as a tech-savvy freelancer, I'm confident I'm a strong fit for your ambitious Nationwide Property Auction Intelligence Platform. I offer a broad range of technical skills in API integration, Python, and web scraping. My understanding of your problem statement and key goals will enable me to develop both the Data Ingestion Engine and the Alert & Filter System you require, swiftly and effectively. My in-depth experience with Python web scraping will ensure I gather data from county auctions, online platforms, and federal sources with precision, and I guarantee efficient filtering against your investment criteria. As an experienced software engineer, creating intelligent algorithms to categorize alerts into tiers is second nature to me. I'll generate daily email summaries by 7 AM ET that can be scanned in 30 seconds, highlighting the top 10-20 opportunities with metrics such as discount % and estimated value. For a project of this scale and complexity, requiring long-term engagement and incremental development, I'm a reliable choice. Drawing on a proven track record of consistently delivering high-quality solutions under demanding conditions, I am eager for us to get started on this journey together.
₹25,000 INR in 7 days
4.2

Hi! This is a really interesting project — building a nationwide auction “deal radar” instead of just another scraper is exactly the right approach. I’ve built large-scale scraping and data pipelines before, and I’d structure this with modular scrapers (Scrapy + Playwright when needed) feeding into a centralized processing layer. Raw listings would be normalized, deduplicated by address/parcel ID, enriched using the ATTOM API, then passed through a scoring engine (1–100) based on discount, property type, deal size, location, and auction timing. Only high-value opportunities would trigger alerts (Red / High Priority / Watchlist) via email and webhook to Supabase, while daily summaries highlight the top deals. For platforms like Realauction, I’d build a config-driven scraper template so adding new county subdomains is trivial. The system would also include proxy rotation, CAPTCHA handling when needed, PDF/OCR parsing for county lists, and monitoring so scrapers alert when something breaks. Happy to share similar scraping/data-pipeline work and discuss how I’d start Phase 1 with the major platforms. Looking forward to hearing from you.
₹25,000 INR in 7 days
4.2

₹25,000 INR in 7 days
4.7

⭐ Hello there, my availability is immediate. I read your project post for a Python developer for the web scraping & alert system. We are experienced full-stack Python developers with skills in: Python, Django, Flask, FastAPI, Jupyter Notebook, Selenium, data visualization, ETL; React, JavaScript, jQuery, TypeScript, NextJS, React Native; NodeJS, ExpressJS; web app development, data science, web/API scraping; API development, authentication, authorization; SQLAlchemy, PostgreSQL, MySQL, SQLite, SQL Server; Docker, Azure, AWS, GCP, Digital Ocean, GoDaddy web hosting; Python libraries including NumPy, pandas, scikit-learn, and TensorFlow. Please send a message so we can quickly discuss your project and proceed. I look forward to hearing from you. Thanks
₹36,200 INR in 10 days
4.4

Hi, I am an IIT grad and PMP-certified professional, ex-BFSI, and have worked at Fortune 500 companies. I will make this a reality for you. As a web scraping & alert system engineer, I will implement web scraping in Python with BeautifulSoup and Scrapy to extract property auction data from the various sources, then use machine learning with TensorFlow or scikit-learn to filter and alert on qualified deals based on your predefined criteria. Kindly click the chat button so we can discuss and get started. I will share my prior projects and my resume. I have been freelancing since 2019 and have worked at top MNCs in both the USA and India. Let's connect.
₹12,500 INR in 7 days
3.4

With 7+ years of experience as a full-stack developer specializing in database management and Python, I am an excellent candidate to build the web scraping & alert system for your Nationwide Property Auction Intelligence Platform. I have a proven track record of handling large data sets and extracting valuable, actionable information from them. Your project involves scraping data from multiple sources and transforming it into meaningful alerts; my proficiency in Python enables me to navigate complex web platforms efficiently and process diverse data formats, including OCR/PDF parsing. What sets me apart is my ability to deliver end-to-end solutions: in addition to robust data ingestion through efficient web scrapers, I will create an intelligent alert system that filters the scraped data against your investment criteria, with deduplication so you don't waste time analyzing the same property twice. I'll keep you updated with daily progress reports, set realistic timelines, and remain available for clarification or discussion throughout the project. My commitment to clean, documented, testable code guarantees ease of maintenance in the long run. Timing is crucial for your business; I take pride in delivering projects on or before the deadline 98% of the time. Let's leverage my skills and experience to build an invaluable solution for your company's growth. Let's get started today!
₹35,000 INR in 7 days
3.1

Hello, This project is less about scraping and more about building a reliable ingestion and intelligence system that filters thousands of listings and surfaces only high value deals. For the scraping layer I would build modular collectors using Python with Scrapy and Playwright. Platforms like Realauction, Bid4Assets, and Grant Street can be handled through reusable crawler templates where each county is defined by configuration rather than new code. This makes expanding from 50 to 500+ counties simple. For government and county sites I would also support PDF parsing and OCR since many listings are posted as documents. The processing layer will normalize the scraped data, remove duplicates using address and parcel ID, and enrich each listing using the ATTOM Data API. A scoring pipeline will then calculate the spread between opening bid and estimated value and apply weighted criteria such as property type, deal size, location, and auction timing. Only listings above your threshold will trigger alerts. Alerts will be delivered as structured JSON to your Supabase endpoint and formatted HTML emails via SendGrid. Daily summaries and weekly reports can be generated automatically. The system will include monitoring and scraper health checks so failures are detected immediately rather than silently breaking. This architecture allows the system to scale gradually toward monitoring thousands of auction sources nationwide.
₹37,500 INR in 5 days
2.8

At Paper Perfect, we don't just understand how to build scraper systems and compile data; we understand how businesses work and what you need. Your property intelligence platform requires a dependable, knowledgeable developer who can work with a complex range of platforms, automate extensive data processing, and deliver timely alerts, and that's exactly what we offer. As skilled API integrators, Python specialists, and experts in database management, we've built similar solutions for clients across diverse industries, which proves our versatility and ability to adapt to unique project requirements. Finally, we understand the importance of long-term partnerships. As you grow, innovate, and expand your platform, you'll need a reliable partner who can scale alongside you and provide ongoing support. At Paper Perfect, that's exactly what we promise: consistent availability and continuous optimization so your automated system starts delivering value ASAP and keeps doing so as it grows. Choose us for a tailored experience that delivers exceptional results consistently!
₹12,500 INR in 7 days
2.4

Hi — your brief makes it clear you don’t need “more data,” you need a deal radar that surfaces only actionable opportunities. That’s exactly the kind of systems work I do. I’ve built large-scale scrapers plus scoring/alert pipelines for fragmented sources (including government sites and marketplaces) where reliability and signal-to-noise matter more than raw volume. My approach would be to build this as a modular pipeline, not ad-hoc scripts. Each platform (e.g., Realauction subdomains) gets a reusable spider template driven by configuration, so adding new counties is trivial. Playwright/Scrapy hybrid for dynamic pages, proxy rotation + anti-bot handling, and a PDF/OCR path for counties that publish lists instead of pages. All listings would be normalized, deduplicated (address + APN), enriched with ATTOM data, then scored using your weighted criteria. Only high-value properties trigger alerts — immediate for RED tier, daily/weekly summaries for others — delivered via SendGrid/SES and webhook to Supabase. Scraper health monitoring is included so failures are detected immediately. I’m available 20–40 hrs/week and comfortable with long-term ownership as coverage expands. If helpful, I can outline a concrete Phase-1 plan for Realauction + one federal source to validate the pipeline before scaling nationwide.
₹15,000 INR in 4 days
1.9

Chhatrapati Sambhajinagar, India
Member since Feb 24, 2026
min £36 GBP / hour