
DevOps Lead, Managing Director - Data Ops Engineering at BlackRock (New York, NY)

About BlackRock

BlackRock helps investors build better financial futures. As a fiduciary to investors and a leading provider of financial technology, our clients turn to us for the solutions they need when planning for their most important goals. As of December 31, 2018, the firm managed approximately $5.98 trillion in assets on behalf of investors worldwide. For additional information on BlackRock, please visit www.blackrock.com | Twitter: @blackrock | Blog: www.blackrockblog.com | LinkedIn: www.linkedin.com/company/blackrock.

Job Description:

Technology & Operations

When BlackRock was started in 1988, its founders envisioned a company that combined the best of financial services with cutting-edge technology. In the years since, BlackRock has become a FinTech platform delivering investment management and technology services to thousands of users around the world. Data is at the heart of everything we do. The DataOps Engineering team is responsible for building and maintaining a cutting-edge data platform that provides quality data to all users of the Aladdin platform, including investors, operations staff, data scientists, Aladdin applications and engineers.

Team Overview

Data is at the core of the Aladdin platform and our ability to consume, store, analyze and gain insight from data is a key component of our competitive advantage. The DataOps Engineering team is responsible for the data ecosystem within BlackRock. Our goal is to provide highly available, consistent data of the highest quality to our clients, while evolving our platform to deliver exponential scale to the firm and powering the future growth of Aladdin. We engineer high performance data pipelines, provide a fabric to discover and consume data and continually evolve our data storage capabilities. We believe in writing small, testable code with a focus on innovation. We are committed to open source and we regularly contribute our work back to the community.

Role Responsibility

The DevOps Chapter Lead will be responsible for end-to-end data pipeline and messaging fabric infrastructure. In practice this means accelerating technologies such as cloud-native platforms, Kubernetes (K8s) and Kafka from R&D to production in support of key data supply chain technologies such as ALE and Astra. This newly formed chapter will include embedded DevOps engineers reporting to the Chapter Lead. The Lead and the engineers will need to work in close coordination with teams across the firm working in these areas, ensuring that we hit our target project milestones while staying closely aligned with the broader firm's evolving technology strategy. This role has a seat on the DataOps management team, and likely on the TES extended ExCo and APG OpCo.
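As a rough illustration of the kind of R&D-to-production work described above, the sketch below provisions a Kafka release on Kubernetes with Helm from a Python script. The chart, release name and namespace are assumptions made for the example; the posting does not name the charts or clusters actually in use.

    #!/usr/bin/env python3
    """Illustrative sketch: stand up a Kafka release on Kubernetes via Helm."""
    import subprocess

    RELEASE = "data-pipeline-kafka"    # hypothetical release name
    NAMESPACE = "dataops"              # hypothetical namespace
    CHART = "bitnami/kafka"            # public Bitnami chart, assumed for this example


    def run(cmd):
        """Echo and run a command, raising if it fails."""
        print("+", " ".join(cmd))
        subprocess.run(cmd, check=True)


    # Register the chart repository and refresh the local index.
    run(["helm", "repo", "add", "bitnami", "https://charts.bitnami.com/bitnami"])
    run(["helm", "repo", "update"])

    # Install the release, or upgrade it in place if it already exists.
    run(["helm", "upgrade", "--install", RELEASE, CHART,
         "--namespace", NAMESPACE, "--create-namespace"])

In practice this kind of step would live in a CI/CD pipeline rather than be run by hand, with the chart values kept in version control following GitOps principles.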

The DevOps Chapter Lead and DevOps engineers will be deployed within DataOps squads to assist in the build-out and deployment of the future-state data supply chain. Examples include:

Acquisition: Help evolve the Aladdin Loading Engine into a full-featured, cloud- and container-ready ETL platform capable of supporting all data loading and transformation requirements for reference data, index data and client data in new and flexible data schemas

Quality: Assist in sourcing real-time performance monitoring and data-driven KPIs for data and system quality by providing the right infrastructure

Production: Support the transition from batch to event-driven production by re-underwriting the messaging fabric of the future-state supply chain, enabling seamless, persistent messaging within the next-generation orchestration framework (e.g. Instamatic, Astra); a minimal consumer sketch follows this list

Distribution: Support the selection and build-out of fit-for-purpose data stores to power the distribution of data via service and API gateways to Aladdin apps, Aladdin engineers and data scientists
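To make the Production example above concrete, here is a minimal sketch of an event-driven consumer, assuming Kafka and the kafka-python client. The topic name, broker address and consumer group are placeholders, and the posting's own orchestration components (Instamatic, Astra) are not modeled.

    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "reference-data-updates",               # hypothetical topic name
        bootstrap_servers=["localhost:9092"],   # placeholder broker address
        group_id="dataops-production",          # hypothetical consumer group
        enable_auto_commit=False,               # commit only after a record is handled
        value_deserializer=lambda v: v.decode("utf-8"),
    )

    for message in consumer:
        # Replace this print with real validation / transformation logic.
        print(f"partition={message.partition} offset={message.offset} value={message.value}")
        consumer.commit()  # persist progress so reprocessing is bounded on restart

Committing offsets explicitly after each record trades some throughput for at-least-once processing, which is usually the right default when data quality matters more than raw speed.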

Experience

* 10-15+ years of experience in financial technology or a related technical field, implementing a DevOps methodology and tool chain
* Experience with cloud-native platforms, both hosted and on-premises
* Experience working closely with Systems Engineers and Technology teams
* Working knowledge of building and deploying distributed systems
* Experience with scripting languages such as Python or Ruby
* Experience with software deployment and orchestration technologies such as Helm, Docker, Kubernetes, OpenStack, etc. (see the sketch after this list)
* Experience supporting databases and datastores such as MongoDB, Redis, Cassandra, Ignite, Hadoop and S3
* Experience in messaging and streaming platforms such as NATS or Kafka
* Experience in creating and evolving CI/CD pipelines with GitLab or GitHub following GitOps principles
* A passion for engineering highly available, performant systems
* Experience in security fundamentals
* Experience in troubleshooting and system administration tasks in Linux or Unix
* Experience in networking, coordination and service discovery
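As a small illustration of the scripting, orchestration and troubleshooting items above, the sketch below uses the official kubernetes Python client to flag pods that are not running. The namespace is a placeholder, and cluster access is assumed to come from an existing kubeconfig.

    from kubernetes import client, config

    config.load_kube_config()   # reads the local kubeconfig context
    v1 = client.CoreV1Api()

    NAMESPACE = "dataops"       # hypothetical namespace

    # Flag pods that are not in the Running phase so an engineer can investigate.
    for pod in v1.list_namespaced_pod(namespace=NAMESPACE).items:
        if pod.status.phase != "Running":
            print(f"{pod.metadata.name}: {pod.status.phase}")

The same check could be wired into a CI/CD pipeline or a monitoring job, feeding the kind of system-quality KPIs mentioned earlier in the posting.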

BlackRock is proud to be an Equal Opportunity and Affirmative Action Employer.  We evaluate qualified applicants without regard to race, color, national origin, religion, sex, sexual orientation, gender identity, disability, protected veteran status, and other statuses protected by law.

BlackRock will consider for employment qualified applicants with arrest or conviction records in a manner consistent with the requirements of the law, including any applicable fair chance law.


via developer jobs - Stack Overflow

 
