Data Engineer at Ultimate Software (San Francisco, CA)

Ultimate Software is looking for a Data Engineer to join our analytics experts within the Business Intelligence and Data Warehousing product development organization. The individual will be responsible for optimizing our data pipelines and helping to design our next-generation analytics environment.


The ideal candidate is an experienced data pipeline builder who enjoys creating new solutions and optimizing existing ones. You must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products, and excited by the prospect of optimizing or even re-designing our company’s data architecture to support our next generation of products and data initiatives.


Primary/Essential Duties and Key Responsibilities:



  • Assemble large, complex data sets that meet functional and non-functional business requirements.

  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability.

  • Build software required for optimal extraction, transformation, and loading of data from a wide variety of data sources using numerous technologies (see the sketch after this list).

  • Build analytics tools that utilize data to provide actionable insights into customer behavior, operational efficiency and other key business performance metrics.

  • Work with key personnel to assist with data-related technical issues and product support.

  • Create data tools that help analytics and data science team members build and optimize our product into an innovative industry leader.

  • Work with product teams to strive for greater functionality in our products.
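To give a concrete flavor of the extract-transform-load work described above, here is a minimal batch ETL sketch using PySpark (one of the big data tools listed in the qualifications below). The bucket paths, column names, and aggregation are illustrative assumptions, not details from the posting.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("events_etl").getOrCreate()

    # Extract: read raw event data (hypothetical source path)
    events = spark.read.json("s3://example-bucket/raw/events/")

    # Transform: drop malformed rows and aggregate events per customer per day
    daily = (
        events
        .where(F.col("event_type").isNotNull())
        .groupBy("customer_id", F.to_date("event_ts").alias("event_date"))
        .agg(F.count("*").alias("event_count"))
    )

    # Load: write partitioned Parquet for downstream analytics tools
    (daily.write
        .mode("overwrite")
        .partitionBy("event_date")
        .parquet("s3://example-bucket/curated/daily_events/"))

In practice the actual sources, formats, and warehouse targets would be dictated by the existing pipeline architecture.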


Required Qualifications: 



  • 5+ years of experience in a Data Engineer role

  • Experience supporting and working with cross-functional teams in a dynamic environment.

  • Experience with big data tools: Hadoop, Spark, Kafka, etc.

  • Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.

  • Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. (see the DAG sketch after this list).

  • Experience with public cloud services.

  • Experience with stream-processing systems: Storm, Spark-Streaming, etc.

  • Experience with Business Intelligence tools and platforms.
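As a rough illustration of the workflow-management experience called out above, here is a minimal Airflow DAG sketch (assuming the Airflow 2.x Python API); the DAG name, schedule, and placeholder tasks are assumptions for illustration only.

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    # Placeholder callables standing in for real extract/transform/load logic
    def extract():
        print("extracting from source systems")

    def transform():
        print("transforming and validating data")

    def load():
        print("loading into the warehouse")

    with DAG(
        dag_id="daily_events_pipeline",  # hypothetical name
        start_date=datetime(2021, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)

        # Run the three steps in sequence
        extract_task >> transform_task >> load_task

An equivalent pipeline could be expressed in Luigi or Azkaban; the point is simply declaring task dependencies so the scheduler can run, retry, and backfill them.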


Education:



  • Bachelor’s degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field, or relevant work experience.


Preferred Qualifications:



  • Advanced working SQL knowledge and experience working with numerous large-scale SQL and NoSQL databases.

  • Experience building and optimizing data pipelines, architectures and data sets.

  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.

  • Strong analytic skills related to working with structured and unstructured datasets.


Travel Requirements:



  • 25%


Check out how we give our employees the chance to work on whatever project they want for 48 hours! https://youtu.be/2Aw55CP1IO8  


Typical Interview Process:



  • If your application is selected, a Talent Acquisition Manager will reach out to schedule a phone screen with you.

  • If selected to move forward, you will complete a HackerRank Coding Assessment.

  • If you pass, you will either move forward to a technical phone call for additional screening or go directly to an onsite interview.

  • Offer stage.


via developer jobs - Stack Overflow
 
