Informatica PowerCenter ETL jobs

    5,000 Informatica PowerCenter ETL jobs found

    ...data management system that can collect, clean, store, and surface large volumes of information in a way that’s secure, fast, and easy to extend. The primary goal here is, quite simply, data management—everything we build should serve that purpose. Here’s what I have in mind: • Centralised, well-structured database architecture (SQL, NoSQL, or a hybrid—open to your recommendation) • Automated ETL pipelines for reliable data ingestion and transformation • A lightweight admin interface for monitoring, querying, and exporting datasets • Role-based access controls and encryption to keep everything compliant and secure • Clear documentation so future contributors can step in without guesswork You’re welcome to lean on t...

    $450 Average bid
    4 bids

    The problem is as follows: my previous programmer built a Task Scheduler job for ETL on SQL Server, and it can no longer be run automatically. It used to run normally, but then the automatic scheduling stopped working. Please inspect the SQL Server.

    $30 - $250
    0 bids

    The problem is as follows: my previous programmer built a Task Scheduler job for ETL on SQL Server, and it can no longer be run automatically. It used to run normally, but then the automatic scheduling stopped working. Please inspect the SQL Server.

    $130 Average bid
    1 bid

    ...AI-powered natural language query interface on top. ## Scope of Work **Phase 1: Data Integration to BigQuery** Connect and automate data pipelines from: - Google Analytics 4 (GA4) - Google Ads - Google Search Console - Google Tag Manager - Google Merchant Center Requirements: - Set up automated, near real-time data transfers - Design efficient BigQuery schema with proper data modeling - Implement ETL/ELT processes with data quality checks - Create unified views combining data across platforms - Import historical data - Document all data flows and transformation logic **Phase 2: AI Integration Layer** Implement AI-powered interface for natural language querying of BigQuery data. Requirements: - Configure secure connection between AI platform and BigQuery - Set up authentica...

    $189 Average bid
    Featured
    41 bids

    ...hybrid OLTP and analytics workloads Cloud Platforms * Strong operational experience with: * AWS RDS (PostgreSQL) * Azure Cosmos DB * Expertise in: * VPC peering * Secure networking setup * IAM policy management * Secure cross-cloud data transfer Migration and Tooling * Proven experience leading large-scale PostgreSQL migrations * Expertise with: * pg_dump * COPY * AWS DMS * ETL pipelines * Audit-based delta synchronization strategies * Focus on zero or near-zero downtime migrations Connection Pooling and Performance * Experience with: * RDS Proxy * PgBouncer * High-concurrency workload optimization * Performance tuning for distributed PostgreSQL clusters Ideal Candidate Profile * 7+ years of database engineering experience * Demonstrated su...

    $1170 Average bid
    10 bids

    I need an experienced Python engineer who works confidently with AWS Glue to build and manage a small suite of data-integration jobs for a Hyderabad-based project. The core of the work is to design and automate Glue ETL pipelines that pull data from our production databases, catalog it accurately, and transform it into analytics-ready tables. Here is what I expect from the engagement: • Develop, test, and deploy Glue ETL jobs in Python. • Populate and maintain the Glue Data Catalog so new tables are discoverable and properly version-tracked. • Implement efficient transformation logic that cleans, enriches, and partitions data for downstream reporting. • Optimise job performance and cost by selecting the right worker types, job parameters, and data...
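As a rough illustration of the clean/enrich/partition step this brief describes — a minimal, self-contained sketch, not Glue-specific code (a real job would run as a PySpark script inside Glue, and all field names here, such as `sku` and `order_date`, are hypothetical):

```python
from datetime import date

def clean_enrich_partition(rows):
    """Clean raw records, enrich them with an audit load date, and
    group them by a partition key (year-month of the order date) --
    the same shape of logic a Glue ETL job would apply before
    writing partitioned output. Field names are illustrative."""
    partitions = {}
    for row in rows:
        sku = str(row.get("sku", "")).strip().upper()
        if not sku:                      # drop records with no usable key
            continue
        cleaned = {
            "sku": sku,
            "qty": int(row.get("qty") or 0),
            "order_date": row["order_date"],          # 'YYYY-MM-DD'
            "load_date": date.today().isoformat(),    # audit field
        }
        partitions.setdefault(row["order_date"][:7], []).append(cleaned)
    return partitions

parts = clean_enrich_partition([
    {"sku": " ab-1 ", "qty": "3", "order_date": "2024-05-02"},
    {"sku": "", "qty": 1, "order_date": "2024-05-03"},    # dropped
    {"sku": "cd-2", "qty": None, "order_date": "2024-06-01"},
])
```

In Glue proper the same grouping would be expressed as a `partitionKeys` option when writing the DynamicFrame, but the validation and enrichment logic is the part worth unit-testing.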

    $94 Average bid
    12 bids

    ...challenges through research, collaboration, and application of best practices. • Continuous Improvement: Stay updated on product features and developments, provide feedback for tool improvements, and contribute to process optimization. Qualifications • Technical Proficiency: o Experience with data migration processes and working with large datasets (Excel proficiency is essential). o Experience with ETL (Extract, Transform and Load) processes. o Experience running data-cleaning/data-manipulating scripts. o Experience interacting with data in XML or JSON formats. o Familiarity with n8n, Zapier, Make or similar integration tools for building data pipelines and custom workflows. o Knowledge of internal tools for configuring, deploying, and managing custom software solutio...

    $16 / hr Average bid
    66 bids

    Please help me organise the large volume of numeric data I have into MongoDB so it can be viewed easily in table form. Scope of work • After I hand over the source files, design an optimal MongoDB collection/field structure • Check data integrity and cleanse it (deduplication, format unification, etc.) • Set up efficient indexes and implement basic aggregation pipelines • Build a table view that can be seen at a glance in MongoDB Compass or a web-based dashboard Completion criteria - All records import without errors and pass query tests - Collection structure and key example queries documented (PDF or Markdown) - Table-view screenshots and usage instructions delivered Required skills: MongoDB, data modelling, ETL scripting (JavaScript or Python, your choice) I expect a systematic approach so the data can be queried quickly and cleanly.
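The deduplication and format-unification step described above might look like the following before the records are handed to `insert_many` — a sketch only, with hypothetical field names (`value_id`, `amount`):

```python
def dedupe_and_normalize(records, key="value_id"):
    """Drop duplicate records by key and unify number formats
    (strings like '1,234' -> int) before loading into MongoDB.
    The key and field names are illustrative assumptions."""
    seen, out = set(), []
    for rec in records:
        k = rec.get(key)
        if k in seen:
            continue                     # duplicate: skip
        seen.add(k)
        cleaned = dict(rec)
        v = cleaned.get("amount")
        if isinstance(v, str):
            cleaned["amount"] = int(v.replace(",", ""))
        out.append(cleaned)
    return out

docs = dedupe_and_normalize([
    {"value_id": 1, "amount": "1,234"},
    {"value_id": 1, "amount": "1,234"},   # duplicate, dropped
    {"value_id": 2, "amount": 99},
])
# docs is now ready for collection.insert_many(docs)
```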

    $25 Average bid
    2 bids

    I’m overhauling a Power Apps Canvas app and power query and want the interface to look and feel far slicker. The sole focus is a dashboard that surfaces our sales data in a way managers can grasp at a glance. That means: • Designing a responsive Canvas screen with charts, KPI cards and slicers that load fast on desktop and mobile • Shaping and scheduling the underlying data with Power Query so the numbers update automatically—no manual refreshes or copy-pasting • Leaving me with a short hand-over video or doc so I can tweak visuals or extend the query later Everything sits in SharePoint lists and an Excel workbook today; use whatever combination of Dataverse, collections or direct connectors keeps things fast and maintainable. Native Power Apps contro...

    $146 Average bid
    84 bids

    I need strategic support to launch a pre-seed round in the deep tech and IT sector. The main objective is to highlight the startup's strong innovation and technology component and to obtain targeted introductions to venture capitalists and angel investors aligned with this focus. I need: • Creation from scratch of a clear, convincing pitch deck covering problem, solution, technological advantage, business model, go-to-market and key metrics. • Optimisation of the message to emphasise innovation and market impact. • Identification of European or international VCs and angels specialising in deep tech. • Warm introductions or calls arranged with the selected investors. P...

    $2184 Average bid
    14 bids

    ...Pydantic v2 Modular service-based architecture Database: local PostgreSQL Vector store: local Qdrant (or a previously approved alternative) Containerisation: Docker mandatory, minimal docker-compose for deployment Installing dependencies manually on the server is prohibited 3.3 Mandatory Project Architecture Minimum structure: /backend /api /services /connectors /rag /etl /agents /frontend /infrastructure /docs /tests Structures outside this standard will not be accepted without prior approval. 3.4 Development Standards Git repository owned by the client Minimum branching: main / dev / feature/* Descriptive commits mandatory Structured logging (JSON) Exhaustive error handling Syste...

    $13 / hr Average bid
    26 bids

    ...unifying those streams into a clean, continuously updated dataset. On top of that data layer, the build must train, evaluate, and deploy the best-performing predictive models—whether regression, decision-tree, neural-network, or any other technique that proves superior—then surface the results through a lightweight web interface and an API our teams can call in real time. Key deliverables • Automated ETL jobs and data-quality checks for the three sources mentioned above • Modular training pipeline with experiment tracking, lift/ROC reporting, and feature-importance visuals • Scoring service exposed via REST (or GraphQL) endpoints plus an intuitive dashboard for non-technical users • Deployment scripts, environment setup notes, and a live han...

    $15565 Average bid
    34 bids

    I have several disparate sources holding our inventory information and I want it all pulled together into one clean, well-structured Microsoft SQL Server database file. The job is straightforward in principle: extract every piece of inventory data you can reach, reconcile d...complete when I can: • restore or run the script on my Microsoft SQL Server instance without errors, • query a consolidated Items table and see one row per unique SKU, • join stock levels to their warehouse locations with no orphan records, and • export a sample CSV of current inventory that matches today’s live figures. T-SQL proficiency is essential; SSIS, Azure Data Studio, or any other ETL tooling you prefer is fine as long as the final deliverable is the ready-to-go SQL Se...
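The acceptance criteria above (one row per unique SKU, no orphan stock records) can be sketched as pure merge logic before being expressed in T-SQL — a minimal illustration with hypothetical table and column names:

```python
def consolidate(items_a, items_b, stock, warehouses):
    """Merge two item sources into one record per SKU and flag any
    stock row whose warehouse_id has no matching warehouse (an
    'orphan'), mirroring what the final T-SQL constraints would
    enforce. All names are illustrative assumptions."""
    items = {}
    for source in (items_a, items_b):
        for it in source:
            # later sources fill in or overwrite fields for the same SKU
            items.setdefault(it["sku"], {}).update(it)
    orphans = [s for s in stock if s["warehouse_id"] not in warehouses]
    return items, orphans

items, orphans = consolidate(
    [{"sku": "A1", "name": "Widget"}],
    [{"sku": "A1", "qty_on_hand": 5}, {"sku": "B2", "name": "Bolt"}],
    stock=[{"sku": "A1", "warehouse_id": "W1"},
           {"sku": "B2", "warehouse_id": "W9"}],   # W9 unknown -> orphan
    warehouses={"W1"},
)
```

In SQL Server the orphan check would typically become a foreign key plus a validation query (`LEFT JOIN ... WHERE w.id IS NULL`).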

    $147 Average bid
    36 bids

    I am about to start a migration with Informatica PowerCenter and need hands-on support building the ETL processes. The dataset we will work with is in Oracle and we will extract it from a SQL database; the goal is a clean, controlled, auditable migration to the new schema. What I expect from you • Design and develop PowerCenter mappings, sessions and workflows covering the full extract, transform and load flow. • Include error handling, quality validations and detailed event logging. • Document each mapping in the repository and deliver a short deployment/rollback manual. We will accept the work when: -...

    $14 / hr Average bid
    18 bids

    ...through Power BI, so once the model is validated I’ll ask you to craft intuitive dashboards that highlight drivers, confidence ranges and any red-flag anomalies the model detects. Solid statistical grounding is essential; I want clear explanations of feature importance, assumptions and limitations that business stakeholders can grasp quickly. Big-data exposure, cloud familiarity (Azure, AWS or GCP), ETL pipeline design and MLOps practices are all welcome extras—you’ll have room to propose improvements if they make the solution more robust or scalable. Deliverables I need from you: • A well-documented predictive model with reproducible code and clear version control • Cleaned and transformed datasets stored back into SQL (or a recommended alternativ...

    $2365 Average bid
    90 bids

    ...social protection, and legal frameworks. Below are the positions and their respective qualifications: 1. Senior Project Manager - PMP or PRINCE2 certification - 10 years of experience managing complex projects in IT/e-government 2. Principal IT Architect - Degree in Computer Engineering or equivalent, TOGAF certification - 10 years of experience as an enterprise architect with expertise in API, ETL, cloud technologies 3. Business Architects/Analysts - Degree or Master’s in Computer Security, Data Science, or Computer Science - Expertise in implementing Digital Public Infrastructure (DPI) approaches 4. Cybersecurity and Data Security Expert - Degree or Master’s in Cybersecurity or IT Governance - 10 years of experience with CSIRT/SOC, PKI, IAM 5. Data Management E...

    $63 / hr Average bid
    41 bids

    ...recorded purchase prices Notifications or alerts for discrepancies or missing data Secure storage of credentials (where needed) and compliance with data protection/privacy guidelines Easy scalability as the number of suppliers increases Required skills/experience: API or data source integration (REST/SOAP, CSV/JSON/XML imports; web scraping only when permissible) Experience with data pipelines, ETL processes, scheduling (Cron, CI/CD tools, Zapier/Integromat or similar) Knowledge of price and stock data reconciliation, logging, error handling Handling secure authentication, token management, secrets management Ability to implement discrepancy alerts (email/Slack/chat alerts) and provide a simple dashboard or reporting interface What we offer: Clear list of suppliers with access ...

    $21 / hr Average bid
    76 bids

    ...auditability, integrity, and security 3. HIGH-LEVEL CLOUD ARCHITECTURE Core components: Network Layer AWS VPC (multi-AZ) Private subnets per regulated institution Central supervisory subnet Data Layer S3 Data Lake (Raw / Processed / Curated) Redshift / Aurora (analytics storage) Object Lock for integrity Compute & Processing Lambda (validation, rules engine) EC2 (stress testing engines, Monte Carlo) Glue ETL (transform pipelines) Step Functions (workflow orchestration) Streaming & APIs API Gateway (data submissions) Kinesis (real-time data ingestion) AI / ML SageMaker (fraud detection, early warning models) Neptune (graph AML network analytics) Redshift ML (ratio prediction) Monitoring & Security IAM / RBAC KMS encryption CloudTrail / GuardDuty CloudWatch Visualizati...

    $33 / hr Average bid
    40 bids
    Databricks ETL CI Framework
    17 hours left
    Verified

    I need a reusable ETL framework built inside Databricks notebooks, version-controlled in Bitbucket and promoted automatically through a Bitbucket Pipeline. All source data arrives via GraphQL APIs, so the job includes handling authentication, pagination, and schema inference before landing raw payloads in Delta tables. A dedicated cleaning stage must then standardise and validate the data before it moves on to the curated layer. The structure should be modular—ideally a bronze/silver/gold notebook hierarchy—so I can slot in new sources or extra transformations without touching the core logic. I also want a lightweight Python package (wheel) that wraps the GraphQL connector and can be attached to any cluster. Acceptance criteria • Parameter-driven notebooks organ...
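The GraphQL pagination handling mentioned above can be reduced to a small cursor-draining loop. This sketch assumes a Relay-style connection shape (`edges`/`pageInfo`) and injects the query function so the loop is testable without a network; in the real connector `run_query` would POST to the API with an auth header:

```python
def fetch_all_pages(run_query, page_size=2):
    """Drain a cursor-paginated GraphQL connection into a flat list
    of records. `run_query(after, first)` is an injected callable;
    the Relay connection shape is an assumption about the source API."""
    records, cursor = [], None
    while True:
        page = run_query(after=cursor, first=page_size)
        records.extend(edge["node"] for edge in page["edges"])
        if not page["pageInfo"]["hasNextPage"]:
            return records
        cursor = page["pageInfo"]["endCursor"]

# Stub backend standing in for the GraphQL API (cursors simplified
# to integer offsets purely for the demo):
DATA = [{"id": i} for i in range(5)]
def stub(after, first):
    start = 0 if after is None else after
    chunk = DATA[start:start + first]
    return {
        "edges": [{"node": n} for n in chunk],
        "pageInfo": {"hasNextPage": start + first < len(DATA),
                     "endCursor": start + first},
    }

rows = fetch_all_pages(stub)
```

Landing `rows` as a raw Delta table would then be the bronze step of the bronze/silver/gold hierarchy described.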

    $343 Average bid
    117 bids

    Job Title: NestJS Backend Developer – High-Performance Car Bulk Import (ETL) The Challenge We are looking for a senior-level NestJS developer to build a robust, production-ready car data import engine for our vehicle marketplace. This isn't a simple "upload and save" task; we require a sophisticated streaming pipeline capable of processing massive datasets (CSV, XML, JSON) with minimal memory footprint and high reliability. Core Task Build a POST /imports/cars endpoint that: Automatic Format Detection: Handles multipart/form-data and identifies the file type (CSV, XML, or JSON) programmatically. Stream-Based Processing: Processes data using Node.js Streams / AsyncIterables. The application should never load the full file into RAM. Data Pipeline: Implements a...
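The posting targets Node.js Streams/AsyncIterables; the same never-hold-the-whole-file idea can be shown language-neutrally with a Python generator over a CSV source (column names such as `vin` are hypothetical):

```python
import csv
import io

def stream_rows(fileobj):
    """Yield one validated car record at a time so the full file is
    never loaded into memory -- the analogue of the stream-based
    pipeline the posting asks for. Columns are illustrative."""
    for row in csv.DictReader(fileobj):
        if not row.get("vin"):
            continue                     # reject invalid rows early
        yield {"vin": row["vin"], "make": row["make"],
               "price": float(row["price"])}

sample = io.StringIO("vin,make,price\nV1,Audi,100.5\n,BMW,1\nV2,Kia,2\n")
cars = list(stream_rows(sample))
```

In NestJS the equivalent would be a `for await` loop over a parser stream piped from the multipart upload, batching inserts as rows arrive.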

    $153 Average bid
    52 bids

    - Designed automated ETL routines to standardize disparate source formats, reducing manual reconciliation time by 40%. Performed root‑cause analysis and cohort studies to identify process bottlenecks and cost drivers. Built enterprise dashboards with row‑level security, incremental refresh, and performance tuning; improved executive visibility into KPIs. Implemented and supported ERP reporting modules, mapped master data across modules, and led data migration and validation during upgrades. - Established validation rules, lineage documentation, and data quality KPIs to maintain trust in analytics. - Replaced manual spreadsheets with parameterized Power BI reports and scheduled dataflows, saving recurring effort and reducing errors. - Performance Metrics: Delivered dashboards that i...

    $196 Average bid
    13 bids

    ...always one step away. I’m comfortable with tools such as Python, Pandas, LangChain, Node, SQL, Power BI, Tableau, or any similar stack you can justify. Key deliverables • Deployed WhatsApp agent(s) connected through the WhatsApp Business API - WhatsApp channel is ready. • Retrieval-augmented knowledge base so the bots surface the latest information without hallucinations .. Critical • Automated ETL jobs (n8n, Airflow, or your suggested alternative) feeding a structured data store • Reusable analysis scripts/notebooks with documented logic • Interactive dashboards and geo-visualisations accessible via web link • Deployment guide and a brief hand-off walkthrough Please respond ASAP with links to past conversational AI or data-analyti...

    $6 / hr Average bid
    30 bids

    ... deployment, and monitoring of integration solutions Required Skills Strong hands-on experience with SnapLogic Integration Platform Experience with REST/SOAP APIs and Web Services Knowledge of JSON, XML, and data transformation techniques Experience working with databases such as SQL Server, MySQL, or Oracle Understanding of cloud platforms (AWS, Azure, or GCP is a plus) Familiarity with ETL/ELT concepts and integration patterns Strong debugging and problem-solving skills Preferred Qualifications Experience with CI/CD and version control tools Exposure to enterprise applications like Salesforce, NetSuite, SAP, or similar Good communication and documentation skills Ability to work independently and in a collaborative team environment Engagement Details Long-term opp...

    $608 Average bid
    14 bids

    ...and loads it downstream for ETL processing. The transfer works, but network-level performance is far below what I need. Here is what I’m looking for: • Diagnose the current ADO.NET-to-Oracle connection, identify any network, buffer, or packet-size bottlenecks, and benchmark the baseline throughput. • Tune the SSIS data flow (buffers, rows per batch, commit size, async settings, etc.) and, if necessary, adjust the Oracle driver or provider configuration. • Provide an updated package or detailed change list so I can reproduce the performance gains in other environments. • Produce a concise report summarizing findings, before-and-after metrics, and next-step recommendations. Source: another Oracle database accessed via ADO.NET. Goal: reliable, hi...

    $163 Average bid
    24 bids

    Job Description We are looking for an experienced Azure Data Engineer / Data Integration Specialist to design and implement a robust, scalable data pipeline that pulls data from the Blackbaud API and loads it into an Azure SQL Database using Azure Data Factory (ADF). The goal is to have a fully automated, secure, and monitored ETL pipeline that runs on a scheduled basis and supports future scaling. Project Scope 1. Data Ingestion Connect to Blackbaud REST APIs (OAuth authentication) Handle pagination, rate limits, and API throttling Extract multiple endpoints (e.g., constituents, gifts, transactions, etc.) 2. Data Transformation Clean, normalize, and structure raw API JSON Handle nulls, schema drift, and data type conversions Add audit fields (load date, source system, batch id) 3. ...
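The transformation phase above (null handling, type conversion, audit fields) reduces to a small per-record normaliser regardless of whether it runs in an ADF data flow or a custom activity. A sketch, with every field except the audit columns being a hypothetical example:

```python
from datetime import datetime, timezone

def normalize(record, batch_id):
    """Coerce raw API JSON into the target Azure SQL row shape:
    fill nulls, fix types, and append the audit fields the scope
    lists (load date, source system, batch id). Non-audit field
    names are illustrative assumptions."""
    return {
        "constituent_id": str(record.get("id", "")),
        "amount": float(record.get("amount") or 0.0),
        "load_date": datetime.now(timezone.utc).isoformat(),
        "source_system": "blackbaud",
        "batch_id": batch_id,
    }

row = normalize({"id": 42, "amount": None}, batch_id=7)
```

Schema drift is then a matter of treating any unknown keys explicitly (log, quarantine, or map) rather than letting them silently pass through.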

    $132 Average bid
    69 bids

    Top Ranked Requirements 1. Adobe/AEM Architect: Expert-level structuring of Adobe Analytics (segments, dimensions, templates). This is the "heavy lift" of the role. 2. The "Storyteller" (Strategy Bridge): 5+ years of experience turning data into strategic recommendations for Creative and Strategy teams. 3. Technical Data Handling: Hands-on proficiency with SQL/Python for ETL and data cleaning, and BI tools (Domo, Tableau, Looker) for automation. 4. Governance Lead: The ability to create "durable" frameworks—naming conventions, tagging plans, and intake processes—that ensure data stays clean year-over-year. Core Responsibilities (The "What") • Design & Implement: Build measurement frameworks and "always-on" tr...

    $36 / hr Average bid
    56 bids

    ### Overview We are looking for **Python engineers focused on web scraping, data extraction, and data cleaning**. This is **NOT** a large system or customer-facing role. The work consists of: * Small, clearly scoped Python scripts * Web scraping (HTML, PDFs, APIs) * Data cleaning and transformation * ETL-style utilities All work is: * Async-first * Internal tools only * Clearly scoped with written requirements This is **ongoing contract work**. Strong performers may receive long-term work. --- ### What You’ll Be Doing * Build Python scripts to scrape public websites * Parse HTML, JSON, CSV, and PDF files * Clean and normalize messy real-world data * Write clear, maintainable utility scripts * Deliver working code (not just prototypes) --- ### Required Skills * Str...
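"Clean and normalize messy real-world data" usually starts with text normalisation; a minimal sketch of the kind of utility the role describes:

```python
import re
import unicodedata

def clean_text(value):
    """Normalise messy scraped text: unicode compatibility
    normalisation (NBSP -> space, etc.), stray HTML-entity removal,
    and whitespace collapse."""
    value = unicodedata.normalize("NFKC", value)
    value = re.sub(r"&nbsp;?", " ", value)   # leftover entities
    return re.sub(r"\s+", " ", value).strip()

assert clean_text("  Hello\u00a0 &nbsp;world \n") == "Hello world"
```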

    $11 / hr Average bid
    104 bids

    ...speak Spanish fluently. Please do not apply if you do not meet this requirement, as constant communication with the team is vital. We are looking for a freelance Power BI and n8n expert for a long-term collaboration. We have multiple automation projects queued, but we will start with an operational Productivity Dashboard. We are looking for a technical profile who masters data integration (ETL) and high-level visualisation. The first project: Productivity Dashboard The goal is to add a section to an existing report to calculate productivity per operator. Data sources: Odoo 17 Enterprise: extraction via API of absences (holidays, sick leave), public holidays and a list of...

    $31 / hr Average bid
    67 bids

    ...privacy-preserving transformations so that multiple parties can collaborate on sensitive datasets without risk. What I expect from you is a focused, hands-on transfer of know-how rather than a generic primer. Walk me through real-world patterns, preferred storage layers, and the way you wire up connectors and workflow engines. If you have battle-tested approaches using platforms such as Apache NiFi, Talend, Informatica—or any equivalent stack—please weave those in, but feel free to recommend superior alternatives if they serve the clean-room model better. Deliverables • 3–4 live or recorded sessions (screen-share) illustrating architecture, pipeline setup, and governance checkpoints • A concise reference document summarising the clean-room componen...

    $32 / hr Average bid
    30 bids

    ...to extract and aggregate content from my website, including blogs, podcasts, YouTube videos, books, and articles. The goal is to populate a structured spreadsheet (Excel or Google Sheets) with this data, making it easy for AI tools to analyze themes, trends, summaries, etc. Background on the Process: This is essentially a data extraction or content aggregation task (also known as web scraping or ETL: Extract, Transform, Load). It involves systematically collecting unstructured content from my site and organizing it into a spreadsheet. Each entry should include metadata like title, URL, summaries, tags, and more. I have a template spreadsheet ("new come and reason ") with columns such as: id type (e.g., blog, podcast, video, book, article) title date_published source_na...

    $460 Average bid
    119 bids

    ...for me. Each paper must be: • 8 pages (≈ 5 000 words) • fully formatted in IEEE style and delivered as clean, compilable LaTeX • accompanied by the final PDF Scope for the three papers 1. AWS services and cloud-native architecture—cover core building blocks such as VPC, IAM, S3, Lambda, and how they interoperate in production-grade designs. 2. Data Engineering methodologies—drill into ETL processes and modern data-pipeline patterns on AWS (Glue, Step Functions, Kinesis, etc.) with diagrams and code snippets where helpful. 3. AI/ML algorithms and applications—tie in SageMaker, feature engineering, and model deployment on AWS, illustrating at least one end-to-end use case. Purpose These are personal papers for my own portfolio, ...

    $254 Average bid
    21 bids

    ...search stack that starts with an ETL flow pulling exclusively from our internal PostgreSQL databases. The pipeline must ingest and transform 38 000+ B2B category records and 5 000–10 000 company profiles, then run cleaning, vectorization, and enrichment steps so every record is categorized and stored in a pgvector-enabled schema. Once the data is in place, a separate microservice should expose a REST API that supports hybrid search: dense vectors (OpenAI text-embedding-3-small) combined with BM25 and blended with RRF scoring. Results have to work equally well in Hungarian and English; huspacy, spaCy, and Open AI are the preferred tools for language handling and any fallback generation. I expect the codebase in Python 3.10+, organised as two deployable units: • ET...
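The RRF blending this brief specifies is compact enough to sketch directly. Given two ranked id lists (dense-vector hits and BM25 hits), Reciprocal Rank Fusion scores each id by summed reciprocal ranks; `k=60` is the commonly used default constant:

```python
def rrf_fuse(dense_ranked, bm25_ranked, k=60):
    """Reciprocal Rank Fusion: blend two ranked lists of document
    ids into a single ranking. Each appearance at rank r (1-based)
    contributes 1/(k + r) to that id's score."""
    scores = {}
    for ranking in (dense_ranked, bm25_ranked):
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

fused = rrf_fuse(["a", "b", "c"], ["b", "d"])
# 'b' ranks first: it is the only id that scores in both lists
```

In the pgvector setup described, the two input rankings would come from an ANN query over the embedding column and a full-text/BM25 query, fused in the API layer.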

    $312 Average bid
    15 bids

    ...surface company or product metadata Verification rules must anchor to two primary sources: the official manufacturer websites themselves and relevant government databases. When information conflicts, the system should automatically weigh each source and assign a confidence score, storing the rationale so we can trace every decision later. What I expect you to deliver • Clean, well-documented ETL scripts (Python, SQL or comparable) that ingest, normalise and enrich my current tables • A modular rules engine where I can tweak source priority, matching logic and thresholds without touching core code • A confidence-scoring function that explains how each record was resolved, including the exact URLs or API records consulted • Logging and error-handling...
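The source-weighing and confidence-scoring behaviour described above can be sketched as a small, tunable function. The weight values and field shapes here are illustrative assumptions, not the brief's actual priorities:

```python
SOURCE_WEIGHTS = {"manufacturer": 0.6, "government": 0.4}  # tunable priorities

def resolve(values_by_source):
    """Weigh conflicting values by source priority and return the
    winning value, a confidence score, and a rationale object so
    every decision stays traceable, as the brief requires."""
    totals = {}
    for source, value in values_by_source.items():
        weight = SOURCE_WEIGHTS.get(source, 0.1)   # low default weight
        totals[value] = totals.get(value, 0.0) + weight
    best = max(totals, key=totals.get)
    confidence = totals[best] / sum(totals.values())
    rationale = {"candidates": totals, "sources": values_by_source}
    return best, confidence, rationale

value, conf, why = resolve({"manufacturer": "Model X",
                            "government": "Model X",
                            "blog": "Model Y"})
```

Keeping the weights in data (here a module-level dict; in practice a config table) is what makes the rules engine tweakable without touching core code.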

    $159 Average bid
    97 bids

    KEY RESPONSIBILITIES - Build an AI chatbot infrastructure on Amazon Bedrock using Anthropic Claude - Develop Knowledge Base infrastructure and ETL pipelines - Implement security controls: VPC isolation, PrivateLink endpoints, KMS encryption - Configure Bedrock Guardrails (content filters, PII masking, threat detection) - Set up monitoring and logging systems REQUIRED SKILLS - AWS: Bedrock, VPC/PrivateLink, S3, OpenSearch, IAM, KMS, Lambda, CloudTrail - Security: Entra ID/IAM Identity Center SSO, OIDC, encryption, network isolation, SIEM - Data Engineering: ETL pipelines, API integrations, data classification - IaC: Terraform or CloudFormation - Monitoring: CloudWatch, log analysis, KQL queries - ISO 27001, GDPR, data privacy principles - AI security (prompt injection, guardr...

    $142 / hr Average bid
    88 bids

    I have direct access to our e-commerce sale...connections or scheduled refresh against the database. I’ll provide the connection string and a sample export; you’ll propose the data model, set up the transformations, and surface the insights in clean, intuitive visuals. Exact metrics and calculated fields can be finalised together once the framework is in place. Deliverables • Secure connection to the sales database, including any necessary ETL or query optimisation • Fully-interactive dashboard with filter panels, drill-through views, and export options • Clear hand-off documentation covering data model, refresh schedules, and how to extend the report Please outline the tool you prefer, the timeline you need, and one example of a similar dashboar...

    $261 Average bid
    41 bids

    ...supporting its modernisation - Implementing and overseeing data migration and data transformation processes - Developing SQL-based and ETL-style solutions in PostgreSQL/MSSQL environments - Establishing migration validation, data quality and control mechanisms: reconciliation, verification reports, idempotent loads - Preparing technical decisions: batch vs. incremental approaches, reprocessing, rollback, cut-over strategies - Close collaboration with business- and IT-side stakeholde...

    $34 / hr Average bid
    22 bids

    Project Description We need a person to work on ETL development (Informatica IDMC). Work Type Full-time or part-time, 3–5 days per week, around 4 hours per day, remote work. Required Skills Informatica PowerCenter & IDMC, ETL development, SQL. Payment Based on work delivered; more pay if performance is very good.

    $120930 / hr Average bid
    25 bids

    Project Title Data Integration & Visualization Specialist Project Description We need a person to work on ETL development (Informatica IDMC). Work Type Full-time or part-time, 3–5 days per week, around 4 hours per day, remote work. Required Skills Informatica PowerCenter & IDMC, ETL development, SQL. Payment Based on work delivered; more pay if performance is very good.

    $741 / hr Average bid
    5 bids

    ...indexes) * Import setup from CSV/JSON (including ID strategy and validation) * 10–20 Cypher queries covering typical usage patterns (lookup, filtering, “context retrieval”) * Short documentation + recommended next structure Skills / Requirements Must-have * Neo4j + Cypher (strong hands-on experience) * Graph data modeling (ontology / domain modeling; clean labels, relationships, and boundaries) * ETL / ingestion from CSV/JSON (ID strategy, validation, deduplication) * Performance & quality fundamentals: constraints, indexing, and query profiling/optimization using EXPLAIN / PROFILE Nice-to-have * Python for scripting and transformations (import utilities, data cleanup, automation) * LLM / RAG integration (chunking strategies, metadata design, retrieval ...

    $24 / hr Average bid
    Non-Disclosure Agreement
    70 bids

    I need my separate databases to behave as one reliable source of truth. The job starts with assessing the current schemas and finishes when a single, well-documented repository is live, fully populated, and syncing automatically. Core tasks include mapping tables and fields, building the ETL pipelines, migrating historical records, validating row-level accuracy, and putting monitoring in place so future updates flow without manual intervention. I am open to the tech stack—whether you prefer native SQL scripts, Python with Pandas and SQLAlchemy, Talend, Airflow, or another proven toolset—as long as the choice is justified and scalable. Please attach a detailed project proposal that walks through your approach, milestones, estimated timeline, and any comparable integ...

    $7161 Average bid
    62 bids

    ...search stack that starts with an ETL flow pulling exclusively from our internal PostgreSQL databases. The pipeline must ingest and transform 38 000+ B2B category records and 5 000–10 000 company profiles, then run cleaning, vectorization, and enrichment steps so every record is categorized and stored in a pgvector-enabled schema. Once the data is in place, a separate microservice should expose a REST API that supports hybrid search: dense vectors (OpenAI text-embedding-3-small) combined with BM25 and blended with RRF scoring. Results have to work equally well in Hungarian and English; huspacy, spaCy, and Open AI are the preferred tools for language handling and any fallback generation. I expect the codebase in Python 3.10+, organised as two deployable units: • ET...

    $7842 Average bid
    96 bids

    Project Title: Data Integration & Visualization Specialist for ETL, IDMC, Informatica & Qlik Project Description: We are seeking an experienced Data Integration & Visualization Specialist for a project involving ETL development, Informatica, IDMC, and Qlik. The goal of the project is to design, implement, and maintain data pipelines and dashboards for seamless data processing and reporting. Responsibilities: Design and develop ETL pipelines for data extraction, transformation, and loading. Work with Informatica PowerCenter and IDMC to manage and optimize data workflows. Develop and maintain Qlik dashboards and reports for data visualization. Ensure data quality, accuracy, and consistency across systems. Collaborate with pr...

    $569 Average bid
    21 bids
    Developer
    Ended

    Platinum by ETL: combining our experience in the crypto/digital asset space with ETL Global's expertise in legal, accountancy & tax matters, we are looking to target HNW/UHNW individuals with significant assets in cryptocurrency. The goal is to assist them in incorporating digital assets into their day-to-day accounting & legal matters, with an additional focus on how these assets can be passed along to heirs as part of a trust/inheritance set-up. Core product: “The Vault” - this product will be a secure multi-sig/MPC wallet (hosted by DFNS). Core functionalities: frontend design to be high-class, slick, professional. (Please see below a link to the website which is in development: , for you to get an idea of the design we have gone for)...

    $10866 Average bid
    Urgent
    118 bids

    ...land price in the area. The goal is for the system to process this data, generate a reliable estimated value, and present it in downloadable PDF reports, ready to share with investors or internal clients. I need: 1. A clear, documented valuation model (it can be a statistical algorithm or machine learning, as long as the accuracy is justified). 2. A small ETL flow to load and clean the three data sources. 3. A PDF report template with the main charts and metrics already embedded. 4. A usage and installation guide so the engine can run in my own environment (preferably Python with common libraries such as pandas, scikit-learn, and reportlab, but open to sugges...
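    As one example of the "statistical algorithm" option, a comparables-based estimate fits in a few lines: price the parcel from the median price per square metre of nearby sales. The function and sample figures below are purely illustrative, not the client's data or model:

    ```python
    import statistics

    def estimate_value(area_m2, comparable_sales):
        """Estimate a parcel's value from the median price-per-m2 of
        comparable sales in the zone. comparable_sales: (price, area_m2) pairs."""
        per_m2 = [price / area for price, area in comparable_sales]
        return area_m2 * statistics.median(per_m2)

    # e.g. three hypothetical recent sales in the same zone
    sales = [(120_000, 100), (95_000, 80), (210_000, 150)]
    ```

    A production model would add more features (location, zoning, land-price index) and a validation step to justify the precision the listing asks for.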

    $19 / hr Average bid
    22 bids

    ...data operations tasks. Any experience building AI agents into the workflows, or with n8n, is a nice-to-have add-on. We are looking exclusively for people in Egypt. We agree weekly on 10-hour work packages and will assess the work quality and deliverables on a weekly basis prior to payout. Skills & experience required: - Proven experience designing and implementing robust ETL pipelines for large-scale, heterogeneous data sources - Strong proficiency in building and maintaining custom web crawlers and data scrapers using tools like Power Automate Desktop, JavaScript, or similar frameworks - Expertise in handling unstructured and semi-structured data (e.g., JSON, APIs, flat files) - Familiarity with SQL, Excel, Power BI - Strong problem-solving skills an...

    $12 / hr Average bid
    38 bids

    We are a fast-growing technology company building intelligent, AI-driven products that solve real-world business problems. Our focus is on automation, scalable systems, and p...workflows. You will work closely with product, engineering, and business stakeholders to develop reliable, production-grade AI solutions—not just experiments. What You’ll Do Design and implement AI-powered features and automation workflows Build and integrate LLM-based applications (OpenAI, Hugging Face, etc.) Develop scalable backend services and APIs Work with structured and unstructured data (ETL pipelines, embeddings, vector databases) Optimize performance, reliability, and cost of AI systems Collaborate on system architecture and technical decision-making Take ownership of features from con...

    $21 / hr Average bid
    98 bids

    ...similar), ingest the relevant tables, and then handle missing values, outliers, type casting, and duplicate detection. Once cleaned, the data should be written back to a new table in the same database and optionally exported to CSV so that downstream teams can verify the results in Excel if they choose. Key deliverables – Python script (Pandas, NumPy, SQLAlchemy preferred) that performs the full ETL/cleaning routine – Re-usable functions or class-based structure so future data drops can be processed with one command – Clear inline comments plus a short README explaining installation, execution, and configurable parameters Acceptance criteria 1. Script connects to the database with credentials provided at run-time or via .env file 2. All missing o...
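    The cleaning steps named above (duplicate detection, type casting, missing-value imputation) can be sketched with the standard library alone. The `(id, amount)` record shape is an assumption for illustration; the real script would take its columns from the database:

    ```python
    import statistics

    def clean_rows(rows):
        """Cleaning pass over [(id, amount_str), ...] records: drop duplicate
        ids, cast types, and impute missing amounts with the column median."""
        seen, numeric, cleaned = set(), [], []
        for rid, amount in rows:
            if rid in seen:              # duplicate detection
                continue
            seen.add(rid)
            try:
                value = float(amount)    # type casting
            except (TypeError, ValueError):
                value = None             # mark as missing
            cleaned.append([int(rid), value])
            if value is not None:
                numeric.append(value)
        median = statistics.median(numeric) if numeric else 0.0
        for row in cleaned:
            if row[1] is None:           # impute missing values
                row[1] = median
        return cleaned
    ```

    Wrapping steps like this in small functions is what makes the "one command for future data drops" acceptance criterion cheap to meet.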

    $42 / hr Average bid
    54 bids

    I’m looking for a Data Engineer with strong AWS native services experience to help build and support an event-driven data platform. This project focuses on automated batch data pipelines, data governance, and making data available in a secure and scalable way. This is not ad-hoc ETL — it’s a platform-style setup. Tech stack involved: • AWS: S3, SQS, Lambda, MWAA (Airflow), EMR Serverless • Data Processing: PySpark, Apache Spark • Data Lake: Apache Iceberg, AWS Glue Catalog • Governance & Security: Lake Formation, IAM, KMS • Querying: Amazon Athena
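    In an event-driven setup on this stack, a typical first hop is a Lambda that unwraps S3 event notifications delivered through SQS before triggering downstream Spark/Airflow work. The handler below is a sketch of that plumbing only; the field names follow the standard S3/SQS event shapes, and everything around it (queue wiring, IAM, the EMR hand-off) is assumed:

    ```python
    import json

    def handler(event, context=None):
        """Collect (bucket, key) pairs from S3 event notifications that
        arrive wrapped in SQS messages, ready to hand off to a batch job."""
        objects = []
        for message in event.get("Records", []):
            # Each SQS message body holds one S3 notification as JSON.
            notification = json.loads(message["body"])
            for rec in notification.get("Records", []):
                objects.append((rec["s3"]["bucket"]["name"],
                                rec["s3"]["object"]["key"]))
        return objects
    ```

    Keeping the Lambda to pure parsing like this, and pushing the heavy lifting to MWAA/EMR Serverless, is what makes the setup a platform rather than ad-hoc ETL.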

    $17 / hr Average bid
    41 bids

    ...Delivery Driver, Shoes for a Nurse). D. Compensation System (Data-to-Cash) Earnings: 1.00 per invoice / 1.50 per contract submitted. Purpose: pay the user for feeding the database. 3. The B2B Side: Altro Insight (The Data Platform) Objective: an automated data-sales factory (SaaS + E-commerce). A. The Processing Engine (The Internal Factory) Automated Cleaning (ETL): every incoming record is cleaned, standardised, and anonymised before being stored. Organisation: data is classified by tags: age bracket, gender, occupation, region, salary, employer. B. The Mod...

    $16 / hr Average bid
    57 bids

    ...analysts, engineers, or BI specialists with strong communication and writing skills to contribute technical articles to our blog. Your mission? Share hands-on experience, real-world examples, and practical tips to help fellow data professionals work smarter. Who are we? ClicData is an all-in-one data management and business intelligence platform (SaaS), offering data connectivity, warehousing, ETL, data visualization, and automation. Our audience includes data professionals and data-savvy business leaders, primarily in mid-market companies across North America. Why write with us? - We're looking for long-term collaborators, not one-off gigs. That means predictable, recurring income for you. - You'll be credited as the author of each piece you write; your expertise will be sho...

    $256 Average bid
    44 bids