
Software Engineer - Data Acquisition at Moz (Seattle, WA)

Moz is growing its Pro Services team under the Data Acquisition Services org. This team is focused on all aspects of managing data. We build systems to acquire, extract, transform, score, and serve terabytes of SEO data from a large variety of sources.

If you like the challenge of streamlining data collection processes, managing and improving Big Data infrastructure, and wrangling hundreds of millions of data points, consider joining us as we build our next-generation data procurement platform to produce the foundational data that fuels Moz’s products, services, and research.

About the position

  • Build, maintain, and support large-scale collection systems.

  • Optimize and expand our HBase infrastructure, which serves our keyword relationship metrics as well as our suggestion and SERP corpora.

  • Be part of a Scrum team that designs, swarms, and pairs together.

  • Implement and improve deployment, monitoring, testing, operational, and foundational tools.

  • Mentor junior developers.

  • Be an excellent problem solver, willing to roll up your sleeves to tackle any issue thrown your way.

Experience We Want to See

  • 5+ years of experience architecting and developing large-scale distributed systems

  • B.S. or higher in Computer Science or equivalent training and experience

  • Familiarity with Ruby, Python and/or Node.js

  • Familiarity with source control; GitHub preferred

  • Experience with Linux and cloud-based environments

  • Ability to communicate with management, software architects, and developers about project status, APIs, and overall system health

  • Ability to work in an agile process: break work down into bite-sized chunks and communicate them to the team

  • Working knowledge of modern web technologies, including cloud-based APIs and protocols (REST, JSON)
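The REST/JSON requirement above is the bread and butter of data acquisition work: fetch an endpoint, decode the payload, and extract the fields you care about. A minimal sketch in Python, using only the standard library (the endpoint shape and the `results`/`url` field names are hypothetical, for illustration only):

```python
import json
import urllib.request


def fetch_json(url, timeout=10):
    """GET a REST endpoint and return its decoded JSON body."""
    req = urllib.request.Request(url, headers={"Accept": "application/json"})
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.loads(resp.read().decode("utf-8"))


def extract_result_urls(raw):
    """Decode a JSON payload and pull out the result URLs.

    The 'results' and 'url' field names are hypothetical; a real
    collector would match whatever schema the source API returns.
    """
    payload = json.loads(raw)
    return [item["url"] for item in payload.get("results", [])]
```

In a production collector the same pattern would gain retries, rate limiting, and error handling, but the decode-and-extract core stays this small.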

Nice to Have

  • Experience running HBase clusters on Hadoop’s HDFS and MapReduce infrastructure

  • Data Science background or experience

  • Amazon Web Services (AWS) experience with EC2 and S3

  • Advanced HTTP protocol knowledge and/or networking experience

  • Participation in a technical community is a major plus: e.g., contributions to open-source projects, published works, or presentations at academic or industry conferences

Here is what you will be working with!

  • Ruby

  • Node.js (ES6)

  • Java

  • Python

  • HDFS/HBase clusters, Riak, Redis, MySQL

  • Thousands of servers

via developer jobs - Stack Overflow
