Placement papers | Freshers Walkin | Jobs daily: Big Data Engineer at JP Morgan Chase (Houston, TX)



Big Data Engineer at JP Morgan Chase (Houston, TX)

Description

JPMorgan Chase & Co. (NYSE: JPM) is a leading global financial services firm with assets of $2 trillion and operations in more than 60 countries.  The firm is a leader in Investment & Corporate Banking; Financial Services for consumers, small business and commercial banking; financial transaction processing; asset management and private equity.

The Treasury/Chief Investment Office (T/CIO) is responsible for firm-wide asset and liability management, including:
  • Aggregating and managing the interest rate risk of the firm's four main lines of business (LOBs), primarily through the global investment securities portfolio.
  • Managing the firm's funding and liabilities through long-term debt and other funding sources, and managing short-term cash deployment activities.
  • Aggregating and managing the firm's liquidity risk, including deploying the firm's excess liquidity, and ensuring compliance with both internal and regulatory liquidity requirements.
  • Aggregating and managing the firm's structural foreign exchange risk.
  • Risk-managing Mortgage Servicing Rights on behalf of the Mortgage Bank.
  • Managing the company-sponsored retirement plan.

The Treasury and CIO technology team is looking for a Big Data developer with expertise in designing and implementing high-data-volume solutions for the Forecasting and Analytics workstreams of Global Finance.

Responsibilities:

  • Build distributed, scalable, and reliable data pipelines that ingest and process data at scale and in real-time.

  • Own product features from the requirements gathering phase through to production deployment

  • Identify and specify complex business requirements and processes. 

  • Research, evaluate & recommend solutions to achieve those requirements.

  • Model, design, develop, code, test, debug, document and deploy applications to production through standard processes

  • Plan delivery of work, including creating story plans, and being an active member of team stand-ups, retrospectives and sprint planning

  • Partner with other technology teams to deliver end to end solutions

  • Conduct design & code reviews in a Continuous Deployment environment

  • Support system testing, user testing and production implementation.
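The first responsibility above, building pipelines that ingest and process data at scale and in real time, is easiest to picture with a toy example. The sketch below is a minimal, pure-Python stand-in (no Spark dependency; all names are illustrative, not from any JPMorgan codebase) for the micro-batch pattern used by engines such as Spark Structured Streaming: records arrive in small batches, and a stateful aggregation is updated after each batch.

```python
from collections import defaultdict

def micro_batches(records, batch_size):
    """Yield fixed-size micro-batches, mimicking how a streaming engine
    groups incoming records before processing them."""
    batch = []
    for record in records:
        batch.append(record)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # flush any partial final batch
        yield batch

def aggregate(batches):
    """Maintain running per-key totals across micro-batches: a toy
    stand-in for a stateful streaming aggregation."""
    totals = defaultdict(float)
    for batch in batches:
        for key, amount in batch:
            totals[key] += amount
        yield dict(totals)  # snapshot of state after each batch

# Hypothetical trade records: (ticker, notional)
trades = [("JPM", 10.0), ("MSFT", 5.0), ("JPM", 2.5),
          ("MSFT", 1.0), ("JPM", 7.5)]
snapshots = list(aggregate(micro_batches(trades, batch_size=2)))
# The final snapshot holds the cumulative total per ticker.
```

In a production Hadoop/Spark stack of the kind listed under Technical Skills below, the batching, state management, and fault tolerance would all be handled by the framework; the sketch only illustrates the shape of the computation.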

Basic Skills:

  • Bachelor's or Master's degree in Computer Science, Computer Engineering, or equivalent

  • Ability to interact with technology and business partners

  • Ability to lead team in geographically distributed locations across multiple time zones

  • Excellent written and verbal communication skills



Technical Skills:

  • 5+ years of experience with Big data tools and technologies

  • Experience with Hadoop and related technologies such as Spark, Hive, and Impala

  • Hands-on experience with related/complementary open source platforms and languages such as Python, Java, and Scala

  • Strong relational database and SQL skills

  • Experience with agile development methodologies

  • Experience building scalable and flexible software frameworks

  • Financial experience is a must

Preferred Skills:

  • Experience with ETL

  • Experience with reporting tools such as Tableau and QlikView

  • NoSQL/Big Data: Cassandra





via developer jobs - Stack Overflow