Closed

Spark application on Google Cloud Platform for fetching and processing data from HDFS

Hello, we are looking for a Scala developer who has experience handling data in Parquet (.parquet) form on Spark clusters on Google Cloud Platform. The task is to access data from HDFS in Parquet form, query the data for relevant UIDs, fetch specific fields for those UIDs, compute parameters by performing mathematical computations on those fields, and store the processed values in a separate Parquet file on HDFS. Further aggregation then needs to be performed on the computed values, and the final summary needs to be stored in MongoDB.
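As a rough illustration of the processing step only, here is a minimal Spark/Scala sketch. The HDFS paths, the column names (uid, field_a, field_b) and the weighted-sum computation are placeholders, since the actual schema and mathematical computations are not specified in the brief.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object UidProcessingJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("uid-parquet-processing").getOrCreate()

    // Read the raw Parquet data from HDFS (path is a placeholder).
    val raw = spark.read.parquet("hdfs:///data/raw/events.parquet")

    // Relevant UIDs, here assumed to come from a small lookup Parquet file.
    val uids = spark.read.parquet("hdfs:///data/lookup/relevant_uids.parquet")

    // Keep only the relevant UIDs, select the needed fields, and derive a value
    // from them (the weighted sum stands in for the real computation).
    val processed = raw
      .join(uids, Seq("uid"))
      .select(col("uid"), col("field_a"), col("field_b"))
      .withColumn("score", col("field_a") * 0.7 + col("field_b") * 0.3)

    // Store the processed values in a separate Parquet file on HDFS.
    processed.write.mode("overwrite").parquet("hdfs:///data/processed/uid_scores.parquet")

    spark.stop()
  }
}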

The technologies you need to be comfortable with: Dataproc on Google Cloud (cloud-native Hadoop and Spark), Airflow (used for scheduling), Google Cloud Platform in general, Scala (for scripts), and MongoDB (for data export).
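The aggregation and MongoDB export step could then look roughly like the sketch below. It assumes the MongoDB Spark connector 10.x option names ("connection.uri", "database", "collection"), a hypothetical connection string, and the placeholder paths and columns from the sketch above; older connector versions use different format and option names.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object SummaryExportJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("uid-summary-export").getOrCreate()

    // Read the per-UID processed values written by the processing job.
    val processed = spark.read.parquet("hdfs:///data/processed/uid_scores.parquet")

    // Aggregate the computed values into a final summary per UID.
    val summary = processed
      .groupBy("uid")
      .agg(avg("score").as("avg_score"), count(lit(1)).as("records"))

    // Write the summary to MongoDB via the Spark connector
    // (option names follow the 10.x connector; adjust for older versions).
    summary.write
      .format("mongodb")
      .mode("overwrite")
      .option("connection.uri", "mongodb://mongo-host:27017")
      .option("database", "analytics")
      .option("collection", "uid_summary")
      .save()

    spark.stop()
  }
}

On Dataproc, both jobs would typically be packaged as a jar and submitted as Spark jobs, with Airflow (for example via Cloud Composer) triggering them on a schedule.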

Skills: Google App Engine, Hadoop, NoSQL Couch & Mongo, Scala, Spark


About the Employer:
(80 reviews) BANGALORE, India

Project ID: #16298138

8 freelancers are bidding an average of ₹12156 for this job

lokeshyadav0005

Hello, I have extensive experience working with various data formats and using Spark to deal with them. I believe I'll be able to complete this task successfully. About me: - 3 years of experience working in the f…

₹35000 INR in 7 days
(6 Reviews)
3.7
gkbhardwaj87

I have 5 years of work experience in big data technology. I have experience in Elasticsearch, Java, Scala, and Spark. For more info, ping me.

₹11111 INR in 7 days
(3 Reviews)
4.1
AmolZinjadeP

Hi, I have more than 3 years of experience in Hadoop technologies like MapReduce, Spark, HDFS, etc. I can complete your project; contact me for more details.

₹10000 INR in 3 days
(4 Reviews)
3.6
haadfreelancing

I am interested in working on this project as I have relevant experience in Big Data, Sqoop, Hadoop, Spark, Hive, Kafka, Spark Streaming, RDD, DataFrame, Dataset, Python, Scala, Google Cloud, Azure, and AWS. I am well versed i…

₹11111 INR in 6 days
(4 Reviews)
0.8
₹12222 INR in 3 days
(1 Review)
0.2
dineshrajputit

Hi, I am a Hadoop, Spark, and NoSQL engineer with 6 years of experience. I can do this; I am comfortable with Spark, Scala, Airflow, and Google Cloud. I can manage Dataproc.

₹2250 INR in one day
(1 Review)
0.0
khannanav

We have 8 years of experience working in Machine Learning. We have built various recommendation engines, web apps, crawlers, analytical dashboards, etc. We have rich experience in Python, Spark, R, Scala, Cassandra, Hiv…

₹7777 INR in 3 days
(0 Reviews)
0.0
bigdatabear

Languages: Java. Java/J2EE: Core Java, JavaFX, Advanced Java, Servlet, JSP, JSTL, EJB, JDBC, JUnit, Web Services, XML, XSD, JAX-RS, DOM, SAX, Multithreading, JTA, Custom Tags, JPA APIs. Web Technologies: HTML, DHTML…

₹7777 INR in 3 days
(0 Reviews)
0.0