Here's an interesting job that we think might be relevant for you.
Click here to apply
Job Title: Big Data Developer
Location: Hyderabad
Long Term / Permanent
Experience: 6-8 Years
Mandatory Skills: Airflow, Python, PySpark, AWS S3, Jenkins, Bitbucket
Optional Skills: Scala, Java, Hive
Roles and Responsibilities:
- Big Data: good programming skills (Shell and Python)
- Should have 4+ years of strong skills in Hive, Sqoop, Spark, and SQL (Python & Spark)
- Strong understanding of Hadoop architecture, HDFS, and the ecosystem
- Strong experience in SQL is mandatory
- Should have knowledge of shell scripting, Python scripting, strong SQL, and Kafka
- Knowledge of Presto is a plus
- Should be able to adapt to the client's environment and culture
- Should be flexible and able to build a strong team in the Big Data space
PS: Please ignore this email if you have already applied or are not interested in this job.
Best regards,
Team hirist.com
info@hirist.com
_________________________________________________________
Copyright © 2022 hirist.com. All rights reserved.
Sent by hirist.com | 6th Floor, Kings Mall, Sector - 10, Rohini, Delhi-85
You are receiving this email because you are registered to hirist.com.
If you don't want to receive emails like these anymore, you can unsubscribe.