Job Description
JOB SUMMARY:
Monitor and maintain all data platform and analytical processes, including the Hadoop Data Lake ecosystem. This is a support/administration role with the potential for some development responsibilities on minor enhancements and break/fixes. Ensure all system issues are resolved in a timely manner. Leverage the standard ITIL Issue, Problem, and Change Management toolset to track and report status on support work activities. Coordinate the operations documentation and standards required for compliance and identified best practices. Perform all software upgrades and patching in partnership with the necessary Infrastructure teams. Provide work estimates as needed. Facilitate and lead meetings with end users.
ESSENTIAL DUTIES AND RESPONSIBILITIES:
- Monitor and maintain all data platform and analytical processes, including the Hadoop Data Lake ecosystem, and ensure the stability of these platforms/environments/processes to meet defined SLAs.
- Partner with various business partners to gather requirements for minor enhancements.
- Work effectively with Technology Operations/Infrastructure and the responsible technology/third-party support teams to ensure continued operations and maintenance of analytical and data management platforms/environments/processes.
- Support Release Management and Change Control processes along with other UPT Compliance Initiatives.
- Partner with various business partners to debug, test, develop, and implement fixes for the customer/campaign management data platforms/environments/processes.
- Keep management, team members, and business stakeholders informed of critical issues, status, and changes.
REQUIREMENTS:
- 5+ years of experience in development or operational support of Enterprise BI analytics systems.
- Core understanding of EDW architecture.
- Strong SQL skills on RDBMS platforms (Microsoft SQL Server, DB2, etc.).
- 2+ years of experience in operational support of the Linux server OS.
- Red Hat Enterprise Linux (RHEL) 6 or higher certified administrator preferred.
- Experience establishing and troubleshooting JDBC and ODBC connections.
- Ability to read/interpret Java.
- Ability to produce high quality technical documentation.
- Strong knowledge of Agile project management methodologies/processes.
- Able to understand and interpret Entity-Relationship, logical and physical data models.
- Able to work with various platforms and databases (e.g., DB2, SQL Server); able to write complex SQL and leverage backend databases.
- Able to participate in an on-call rotation and to work off-hours and weekends.
- Strong interpersonal skills, with a demonstrated ability to make effective decisions while working through complex system issues.
- Must be able to utilize and effectively communicate functional and technical components of an initiative to applicable parties both verbally and through documentation.
- Attention to detail, strong analytical and problem-solving skills, and critical thinking.
- Self-starter and self-motivator with a proactive, agile, and strategic mindset.
- Bachelor's degree or higher in computer science or a related field.
PREFERRED:
- 7+ years of demonstrated experience working as part of large Information Technology teams and/or consulting organizations, partnering with clients/business groups to support a complex Big Data platform or BI analytics environment.
- 2+ years of experience in installation and administration of the Hadoop ecosystem.
- 2+ years of experience in data integration with the Hadoop ecosystem.
- Certifications in Hortonworks HDP and HDF administration or development preferred.
- Experience in operational support of HDFS and edge applications, including Hive (Tez, LLAP), Spark, Ranger, HBase, Knox, and Kerberos.
- Experience with NiFi as an ETL tool preferred.
- Strong skills in troubleshooting and resolving HDFS cluster issues using Ambari.
THIRD PARTY CANDIDATES WILL NOT BE CONSIDERED. THANK YOU.
via developer jobs - Stack Overflow