UpGrad is looking for a Big Data/Cloud systems freelancer to resolve student issues in our Big Data Engineering program.
Here is the list of basic requirements for the role:
- 3-6 years of experience
- Strong Linux/Unix background
- Experience in sizing and designing a Hadoop Cluster on AWS
- Building and operating Big Data clusters (EMR, Hortonworks, Cloudera, etc.)
- Performance benchmarking and performance-tuning of Hadoop Clusters
- Installing and configuring Hadoop ecosystem components like Sqoop, Spark, Spark2, Hive, Kafka, Oozie, HBase, and Zeppelin
- Experience with Hadoop Security - Kerberos, Ranger, Knox, Data Encryption
- Troubleshooting in a Java environment
This role will require work on weekends. Above all, we're looking for a passion for education, and a desire to improve the learning experience of students in Big Data Engineering.
15 freelancers are bidding on average ₹1936/hour for this job
I have hands-on experience with Apache/Hadoop technologies and have been administering clusters of 200+ nodes for more than 5 years. We can easily solve your problems and deliver a hassle-free project.