Placement papers | Freshers Walkin | Jobs daily: Principal Data Engineer at Realtor.com (Santa Clara, CA)




At realtor.com, data is very important to us, and we have lots of it! Our goal is to use this data to make the home-buying experience a breeze for our consumers and customers.


Our Data Engineering team lives and breathes data and has fun doing it. We are not afraid to break and rebuild code to make it better... stronger... faster, to overcome the Big Data challenges of today and tomorrow. We are in the final stages of building out a brand-new platform on the AWS cloud using cutting-edge technologies. We work on the coolest data projects and products that report, predict, and affect outcomes - all using data.


This impactful role will promote the build-out and use of the data platform, data-driven decisions, and data products for realtor.com, its customers, partners, and consumers. Additionally, this role will be part of the core team contributing to the design and development of realtor.com's new data and analytics platform.


Duties and Responsibilities:



  • Define data processing patterns that will be needed in the data platform

  • Define needed capabilities in the platform and then lead engineering teams to deliver these capabilities

  • Design, develop, and deliver building blocks for products based on data and analytics

  • Design, develop and orchestrate data pipelines for real-time and batch data processing

  • Design optimal storage, data structures, security, and retrieval mechanisms for data at rest in the data lake and analytics data store, and for data in motion to meet real-time processing requirements

  • Design and develop reusable components and frameworks for ingestion, cleansing, and data quality

  • Collaborate with upstream sources and downstream consumers to define extensible data contracts

  • Design and develop REST APIs or web-service clients for data push/pull between upstream and downstream applications

  • Help guide and grow the technical depth of junior members of the team

  • Collaborate with the data team, product owners, and Scrum Master to refine and estimate stories/epics

  • Be an integral part of the scrum team, delivering on commitments on time and with high quality


Education, Skills and Experience:



  • B.S. in Computer Science or an engineering discipline

  • 10+ years of experience, or a Master's degree in a relevant field with 8+ years of experience, in data engineering / Big Data

  • Solid understanding of distributed programming and experience with EMR or Hadoop

  • Expert in SQL and query performance tuning

  • Deep understanding of cloud technologies (preferably AWS services) and security, and how they can be combined to design scalable cloud solutions

  • Experience with columnar storage and MPP/analytical databases (Redshift, Netezza, Greenplum, Teradata)

  • Expert Python skills; experience with Bash scripting and one or more of Java, Scala, or Node.js

  • Strong prioritization skills and sense of urgency

  • Strong analytical and problem-solving skills; takes pride in efficient designs and accurate results

  • Objectively analyzes the pros, cons, and tradeoffs of a design path and helps the team arrive at the best solution, which may not be their own

  • Loves to learn and experiment with new technologies and shares findings with the team

  • Effective team player. Honest and respectful of others


Additional Preferred Skills:



  • AWS Certifications

  • Functional programming experience

  • Experience with Apache Storm or another CEP tool

  • Familiar with agile development and sprints

  • Experience with at least one ETL tool (Informatica Cloud, Talend, etc.)

  • Familiarity with machine learning and data analysis packages (SciPy/NumPy/Matplotlib, R, SAS, SPSS) is nice to have but not required

  • Familiarity with reporting software (Tableau/MSTR/SSRS, etc.)

  • Open source software contribution


via developer jobs - Stack Overflow