
Hadoop/ Spark Developer with Scala at PruTech Solutions, Inc. (Charlotte, NC)

Job Description

Overview:
We are looking for an experienced Hadoop/Spark developer to help design, develop, and secure Big Data solutions. The ideal candidate has a broad understanding of Big Data tools and technologies and the fundamentals of how they operate.

Job duties:
  • Design and develop applications using Big Data technologies such as Hive, Spark, HBase, and the Hadoop framework.
  • Read, extract, transform, stage, and load data to multiple targets, including Hadoop and other databases (see the sketch after this list).
  • Translate complex functional and technical requirements into detailed design.
  • Migrate existing data processing from standalone or legacy technology scripts to Hadoop framework processing.
  • Identify and apply performance tuning in Hive, Spark, HBase, and Kafka.
  • Perform POC deployments and conversions
  • Maintain security and data privacy.
  • Propose best practices/standards.
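
For illustration, a minimal Spark ETL sketch in Scala is shown below: it reads a Hive table, applies a simple aggregation, and loads the result to two targets (partitioned files on HDFS and a relational database over JDBC). The table name, output path, and JDBC URL are hypothetical placeholders, not details taken from this posting.

  // Minimal Spark ETL sketch in Scala. Table names, paths, and the JDBC URL
  // are illustrative placeholders, not part of the job description.
  import org.apache.spark.sql.{SparkSession, SaveMode}
  import org.apache.spark.sql.functions._

  object CustomerEtl {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder()
        .appName("customer-etl")
        .enableHiveSupport()          // read source data from the Hive metastore
        .getOrCreate()

      // Extract: read a Hive table (hypothetical name)
      val raw = spark.table("staging.customer_events")

      // Transform: drop bad records and aggregate events per customer per day
      val daily = raw
        .filter(col("event_ts").isNotNull)
        .withColumn("event_date", to_date(col("event_ts")))
        .groupBy(col("customer_id"), col("event_date"))
        .agg(count(lit(1)).as("event_count"))

      // Load target 1: partitioned Parquet on HDFS
      daily.write
        .mode(SaveMode.Overwrite)
        .partitionBy("event_date")
        .parquet("/data/curated/customer_daily")

      // Load target 2: a relational database over JDBC (URL is a placeholder)
      daily.write
        .mode(SaveMode.Append)
        .format("jdbc")
        .option("url", "jdbc:postgresql://db-host:5432/analytics")
        .option("dbtable", "customer_daily")
        .save()

      spark.stop()
    }
  }
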
Requirements:
  • 5 years of experience designing and developing enterprise-level data, integration, and reporting/analytics solutions; proven track record of delivering backend systems that operate within a complex ecosystem
  • Minimum of 3 years of development experience on the Big Data/Hadoop platform, including Hive, Spark, Sqoop, HBase, Kafka, and related tools
  • Experience working with multiple Hadoop distributions (e.g., Hortonworks, MapR)
  • Current knowledge of Unix/Linux/Python scripting, with solid experience in code optimization and high-performance computing
  • Scala development experience is a big plus
  • Strong written and verbal communication skills
  • Excellent analytical and problem-solving skills

via developer jobs - Stack Overflow