As a member of our Software Engineering Group, you will dive head-first into creating innovative solutions that advance businesses and careers. You'll join an inspiring and curious team of technologists dedicated to improving the design, analytics, development, coding, testing, and application programming that go into creating high-quality software and new products. You'll be tasked with keeping the team and other key stakeholders up to speed on the progress of what's being developed. And best of all, you'll be working with and sharing ideas, information, and innovation with our global team of technologists from all over the world.
The Data Acquisition team is a technology group within Risk Finance and Technology that is developing a Data Convergence Platform. This strategic reengineering program will build a platform to process information from various upstream systems related to various risk stripes (Credit Risk, Market Risk, Portfolio Risk, Stress Test, and Country Risk) and make it available to downstream consumers such as applications and analysts.
We are actively seeking a Java/Hadoop Software Engineer with deep knowledge of the Spark and Hadoop frameworks.
- Potential candidates MUST have experience with, and be enthusiastic about, developing Hadoop processes using Spark and Sqoop, as well as Core Java programming for both back-end and middle-tier processing.
- The responsibilities of this position include but are not limited to:
- Writing large-scale Spark jobs in Java, creating Oozie workflows, transferring data from relational databases to our Hadoop store using Sqoop, and writing pure Core Java programs to facilitate both real-time and batch processing.
- All processes will be deployed to Unix/Linux environments; therefore, a working understanding of Unix/Linux is required.
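To give a flavor of the Core Java batch-processing work described above, here is a minimal sketch (all class, method, and record names are hypothetical, not from any actual team codebase) of partitioning records across a thread pool, the kind of scalable, multi-threaded processing this role involves:

```java
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.stream.Collectors;

// Hypothetical sketch: submit each record to a fixed thread pool and
// collect results in submission order, a common Core Java batch pattern.
public class BatchSketch {
    // Stand-in for real enrichment/transformation logic on one record.
    static String process(String record) {
        return record.toUpperCase();
    }

    public static void main(String[] args) throws Exception {
        List<String> records = List.of("credit", "market", "country");
        ExecutorService pool = Executors.newFixedThreadPool(2);

        // Submit all records for concurrent processing.
        List<Future<String>> futures = records.stream()
                .map(r -> pool.submit(() -> process(r)))
                .collect(Collectors.toList());

        // Futures preserve submission order, so output order is deterministic.
        for (Future<String> f : futures) {
            System.out.println(f.get());
        }
        pool.shutdown();
    }
}
```

In a real pipeline the per-record logic would be far heavier (parsing, validation, enrichment against reference data), but the pool-and-futures structure is the same.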
Technical Skills:
- 5+ years' experience developing web applications and integrating with databases
- Core Java/J2EE Design Patterns
- WebLogic/WebSphere/Tomcat
- Java-related work experience
- Spring or Enterprise Java Beans 3 (EJB3)
- ORM Frameworks (Java Persistence API/JPA, Hibernate, TopLink)
- SQL & databases (Oracle/Exadata, Teradata, Netezza, etc.)
- Java Message Service (JMS)
- HTML, JavaScript, Cascading Style Sheets (CSS)
- Testing Frameworks (JUnit, Selenium)
- Strong in Java performance tuning, debugging, and profiling tools
- Experience with an ETL tool (Ab Initio or Informatica) is a plus
- Experience developing scalable, multi-threaded systems
- Good knowledge of Hadoop architecture
- Good knowledge of Spark architecture and Spark programming
- Experience with or knowledge of the Hadoop ecosystem (MapReduce, Spark, Hive, and Pig) would be a huge plus
- Exposure to rules and workflow engines, preferably Drools/jBPM, is a plus
- Knowledge of Financial or Retail Banking products, or BCBS, is a plus
- Understanding of the Web Content Accessibility Guidelines (WCAG 2.0) and assistive technologies (JAWS, NVDA, ZoomText), and a proven track record of incorporating them into your day-to-day work, is a plus but not required