
Big Data Hadoop Operations Administrator at Comcast (Englewood, CO)

Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.

Summary

We are seeking an experienced Hadoop Administrator with an understanding of the Big Data Hadoop Ecosystem (HDFS, MapReduce2, Accumulo, Pig, Hive, HBase, Cassandra, etc.) to work within a DevOps team. This position is responsible for installing and maintaining Hadoop infrastructure, interacting with tool vendors to resolve problems, and monitoring and troubleshooting issues within the Hadoop Ecosystem. The administrator will work closely with customer teams to ensure business applications are highly available and performing within agreed-upon service levels.

Key Responsibilities

-Troubleshoot Hadoop technologies including HDFS, MapReduce2, YARN, Hive, Pig, HBase, Accumulo, Tez, Sqoop, Zookeeper, Spark, Kafka, and Storm (a sketch of a routine health check appears after this list).

-Maintain security with Kerberos, Knox and Ranger.

-Install and maintain platform-level Hadoop infrastructure, including additional tools such as SAS, Informatica, Presto, and R.

-Analyze data stores and uncover insights.

-Ensure solutions developed adhere to security and data privacy policies.

-Conduct investigations and proofs of concept as Big Data technology evolves.

-Test prototypes and oversee handover to operational teams.

-Design, build, install, and configure applications that lend themselves to a continuous integration environment.

-Performance-tune applications and systems for high-volume throughput.

-Translate complex functional and technical requirements into detailed design.

-Define best practices/standards.

-Maintain clear documentation to help increase overall team productivity.
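
As a rough illustration of the troubleshooting and monitoring work described above, here is a minimal, hypothetical Python sketch of a routine cluster health check. It is not part of the posting: the helper names are invented, and it simply shells out to the standard hdfs dfsadmin -report and yarn node -list commands.

# Hypothetical health-check sketch: polls basic HDFS and YARN status by
# shelling out to standard Hadoop CLI commands (illustrative only).
import subprocess

def run(cmd):
    """Run a CLI command and return its stdout, or None if it fails."""
    try:
        return subprocess.run(cmd, capture_output=True, text=True,
                              check=True, timeout=120).stdout
    except (subprocess.SubprocessError, FileNotFoundError):
        return None

def cluster_snapshot():
    """Collect a quick snapshot of HDFS capacity and YARN node health."""
    checks = {
        "hdfs_report": ["hdfs", "dfsadmin", "-report"],
        "yarn_nodes": ["yarn", "node", "-list", "-all"],
    }
    return {name: run(cmd) for name, cmd in checks.items()}

if __name__ == "__main__":
    for name, output in cluster_snapshot().items():
        print(f"{name}: {'ok' if output else 'FAILED'}")

In practice a check like this would feed a monitoring or alerting system rather than print to the console.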

Required:

-3+ years of administration experience with the Big Data Hadoop Ecosystem and its components (HDFS, Sqoop, Hive, Pig, HBase, etc.)

-Java programming experience; familiarity with frameworks, Scrum/Agile, SOA, and software architecture

-Experience in administering high performance and large Hadoop clusters

-Strong understanding of Hadoop architecture, storage and IO subsystems, network and distributed systems

-Experience with Kerberized Hadoop clusters

-Experience managing and developing with open source technologies and libraries

-Experience with scripting languages (Bash, PHP, Perl, Python, etc.)

-In-depth understanding of system-level resource consumption (memory, CPU, OS, storage, and networking data) and Linux commands such as sar and netstat (see the sketch after this list)

-Familiarity with version control, job scheduling, and configuration management tools such as GitHub, Puppet, and UC4

-Ability to lead and take ownership of projects
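
As a rough illustration of the system-level visibility mentioned above, the following hypothetical Python sketch samples CPU utilization with sar and per-protocol network counters with netstat. It is not drawn from the posting and assumes the sysstat and net-tools packages are installed on the node.

# Hypothetical sketch: samples system-level resource data using the same
# tools the posting names (sar for CPU, netstat for network statistics).
import subprocess

def sample(cmd):
    """Return command output, or an error marker if the tool is unavailable."""
    try:
        return subprocess.run(cmd, capture_output=True, text=True,
                              check=True, timeout=30).stdout
    except (subprocess.SubprocessError, FileNotFoundError) as exc:
        return f"unavailable: {exc}"

if __name__ == "__main__":
    print(sample(["sar", "-u", "1", "1"]))   # CPU utilization, one 1-second sample
    print(sample(["netstat", "-s"]))         # per-protocol network counters

Output from commands like these is typically collected over time and correlated with Hadoop-level metrics when diagnosing throughput problems.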

Desired Skills/Experience:

-Experience with RDBMS technologies and SQL; Oracle and MySQL highly preferred

-Knowledge of NoSQL platforms

-Data modeling (entity-relationship diagrams)

-Hands on experience with open source management tools including participation in the community

-Large scale data warehousing

-Hadoop Certified

Comcast is an EOE/Veterans/Disabled/LGBT employer


via developer jobs - Stack Overflow
 
