1. Hands-on programming experience in Java.
2. Strong production experience with Spark (minimum of 1 to 2 years).
3. Experience building data pipelines with big data technologies (Hadoop, Spark, Kafka) on large unstructured data sets.
4. Working experience with, and a good understanding of, public cloud environments (AWS, Azure, or Google Cloud).
5. Python knowledge is a plus.
5 freelancers are bidding an average of ₹250/hour for this job
Hi, I am available to do this. I have expertise in big data technologies with 7+ years of experience. Let me know if I can work on my flexible timing, and let me know if we can chat. Thanks
8 years of big data developer experience with Hadoop, Spark, Kafka, Snowflake, and AWS. Worked for many clients such as Bank of America, Home Depot, Bank of New York Mellon, and Flex in both batch and stream processing.