
Data Engineer DSP at VideoAmp (Los Angeles, CA)

About Us

VideoAmp's mission is to bridge the gap between television and digital advertising with campaign measurement, planning, and audience-targeting solutions that dramatically improve the performance and cost-efficiency of advertising investment. Our engineers, data scientists, designers, and media strategists are inspired by tackling one of the most challenging problems in media and marketing: bridging the divide between TV and digital media.


The Role

Your mission is to extend and maintain our real-time data platform. As part of a 10-person team, you'll build, deploy, and maintain scalable, reusable pipelines that are mission-critical for our Fortune 100 clients, which means getting to know our complex data ecosystem. You'll work with a wide variety of datasets, including TV viewership and digital advertising data, to help customers optimize their marketing spend. No two days will be the same, and you'll be supported by some of the best and brightest in the industry (don't take my word for it; check out our LinkedIn/Glassdoor). Don't wait: we're taking off like a rocket and want you to be a part of our journey.


Tech We Use

Scala, Go, Redis, Airflow, Spark, Kafka, Cassandra, Druid, SQL, Hadoop, and Hive
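
For a flavor of how a few of these pieces fit together, here is a minimal sketch, assuming a hypothetical Kafka broker and topic name, of a Spark Structured Streaming job in Scala that counts TV impression events per minute. It illustrates the kind of real-time pipeline work described above, not any actual VideoAmp code.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    object ImpressionCounts {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder
          .appName("impression-counts")
          .getOrCreate()
        import spark.implicits._

        // Read raw impression events from Kafka (the broker address and
        // topic name are hypothetical placeholders).
        val raw = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "tv-impressions")
          .load()

        // Kafka delivers values as bytes; cast to string, then count
        // events in one-minute tumbling windows on the Kafka timestamp.
        val counts = raw
          .selectExpr("CAST(value AS STRING) AS event", "timestamp")
          .groupBy(window($"timestamp", "1 minute"))
          .count()

        // Print running counts; a production job would sink to Druid,
        // Cassandra, or another store instead of the console.
        counts.writeStream
          .outputMode("complete")
          .format("console")
          .start()
          .awaitTermination()
      }
    }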


We are currently only accepting applicants based in the United States.


Prerequisites



  • Bachelor of Science in Computer Science preferred

  • 4+ years of hands-on coding experience building data pipelines

  • Strong experience using distributed computing frameworks for high-volume data processing

  • Experience working with at least one of the following: Scala, Go, Java, or Spark

  • A solid and demonstrable understanding of ETL workflows, data warehousing, and big data principles

  • A solid understanding of NoSQL datastores

  • SQL fluency and an understanding of relational data models

  • Experience with AWS, GCP, Azure, or another cloud provider


Responsibilities



  • Designing and building out new data pipelines

  • Analyzing, scrubbing, and integrating third-party data (see the sketch after this list)

  • Building new ETL applications and scaling out existing ones

  • Collaborating with data scientists and productionizing data science models

  • Coordinating data models with other engineering teams

  • Developing and releasing software using agile methodologies
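
As a concrete illustration of the analyze/scrub/integrate responsibilities above, here is a minimal batch ETL sketch in Scala with Spark. The S3 paths, table, and column names are hypothetical placeholders, not VideoAmp's actual schema.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    object ThirdPartyEtl {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder
          .appName("third-party-etl")
          .getOrCreate()

        // Extract: load a third-party feed (path and columns are hypothetical).
        val raw = spark.read
          .option("header", "true")
          .csv("s3://example-bucket/third_party/ad_events.csv")

        // Scrub: drop rows with no device id, parse timestamps, deduplicate.
        val clean = raw
          .filter(col("device_id").isNotNull)
          .withColumn("event_ts", to_timestamp(col("event_time")))
          .withColumn("event_date", to_date(col("event_ts")))
          .dropDuplicates("device_id", "event_ts")

        // Integrate: join against an internal device dimension table.
        val devices = spark.read.parquet("s3://example-bucket/dims/devices")
        val joined = clean.join(devices, Seq("device_id"), "left")

        // Load: write date-partitioned Parquet for downstream consumers.
        joined.write
          .mode("overwrite")
          .partitionBy("event_date")
          .parquet("s3://example-bucket/warehouse/ad_events")

        spark.stop()
      }
    }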


Perks



  • Top compensation

  • Comprehensive health benefits

  • Meaningful equity

  • 401k

  • Unlimited vacation, with a $2,000/year stipend for travel and accommodations

  • Unlimited in-office gym use with a personal trainer

  • Childcare stipend

  • Plenty of snacks and beverages

  • Corporate support for hackathons, lunch-and-learns, and conference attendance

  • Personal and professional development


via developer jobs - Stack Overflow
 
