Job Description
Overview: We are looking for an experienced senior Hadoop developer to help design, develop, and secure Big Data solutions, someone with a broad understanding of Big Data tools and technologies and the fundamentals of how they operate.
Job duties:
- Design and develop applications using Big Data technologies such as Hive, Spark, HBase, and the Hadoop framework.
- Extract, transform, stage, and load data to multiple targets, including Hadoop and other databases (see the sketch after this list).
- Translate complex functional and technical requirements into detailed design.
- Migrate existing data processing from standalone or legacy technology scripts to Hadoop framework processing.
- Identify and apply performance tuning in Hive, Spark, HBase, and Kafka.
- Perform proof-of-concept (POC) deployments and conversions.
- Maintain security and data privacy.
- Propose best practices/standards.
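
As a rough illustration of the ETL work described above, here is a minimal PySpark sketch that reads raw data from HDFS, applies a simple transformation, and stages the result into a Hive table. The paths, column names, and table name are placeholders for illustration, not details from the posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hive support is needed to write managed Hive tables from Spark.
spark = (
    SparkSession.builder
    .appName("orders_etl_example")   # hypothetical job name
    .enableHiveSupport()
    .getOrCreate()
)

# Extract: read raw CSV data from HDFS (placeholder path).
raw = spark.read.option("header", True).csv("hdfs:///data/raw/orders")

# Transform: cast types, drop bad rows, derive a partition column.
orders = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull())
       .withColumn("order_date", F.to_date("order_ts"))
)

# Load: stage into a partitioned Hive table (placeholder database/table).
(orders.write
       .mode("overwrite")
       .partitionBy("order_date")
       .saveAsTable("analytics.orders_staged"))

spark.stop()
```

The same pattern extends to other targets in the duties above (HBase, Kafka, relational databases) by swapping the final write step for the appropriate connector.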
Requirements:
- 5 years of experience designing and developing enterprise-level data, integration, and reporting/analytics solutions, with a proven track record of delivering backend systems that operate within a complex ecosystem.
- Minimum 3 years of development experience on the Big Data/Hadoop platform, including Hive, Spark, Sqoop, HBase, Kafka, and related tools.
- Experience with Hadoop 2.0+ and YARN applications.
- Experience working with multiple Hadoop distributions (e.g., Hortonworks, MapR).
- Current knowledge of Unix/Linux/Python scripting, with solid experience in code optimization and high-performance computing.
- Knowledge of Solr/Elasticsearch and NiFi is preferred.
- Strong written and verbal communication skills.
- Excellent analytical and problem-solving skills.
via developer jobs - Stack Overflow