
Senior Data Engineer -- Do Good. Do Well. Have Fun Doing It. at LiteracyPro Systems, Inc. (Boulder, CO)

LiteracyPro creates software for social good. Our data collection and reporting software helps government and community-based organizations serve millions of disadvantaged people, improving their opportunities to find jobs with sustainable, living wages for themselves and their families. Enjoy being part of a fun group of bright, passionate folks, in a great working environment with generous salary and benefits, while helping those who help others.


Our newest product, CommunityPro Suite (CPS), is a growing SaaS case management, referral, reporting, and analysis enterprise application that enables hundreds of agencies to securely match, consolidate, and share data in real time. The software reduces organizational friction by helping local agencies collaborate more effectively so that people can achieve economic self-sufficiency for themselves and their families. The end goal of CPS is to play an important role in creating healthy, vibrant communities in this country and around the world.


Our company is currently experiencing explosive growth due to national demand for our software, and we’re looking for that rare, accomplished leader who wants to do good, get paid well, and have fun doing it. We’re seeking someone with an energizing leadership style who has the smarts, passion, and people skills to guide our efforts across all aspects of the data management process, including on-boarding, acquisition, attribution, transformation, and reporting of newly acquired data sets, so that we deliver high-quality data supporting analytics and our clients’ needs in a timely fashion.


The Senior Data and Reporting Engineer will be responsible for expanding and optimizing our data and data pipeline architecture, as well as data flow, governance, management, and collection for our clients and cross-functional teams. Given the highly execution-focused nature of the work, the ideal candidate will roll up their sleeves to ensure that their projects meet deadlines and will always look for ways to optimize processes in future cycles.


Responsibilities:



  • Select and integrate data tools and frameworks required to maintain and manage data integrity across multiple external and internal data sources

  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer needs and requests, operational efficiency and other key business performance metrics

  • Build data reporting structures to meet the needs of our BAs, clients, and other members of the data team

  • Monitor, identify, investigate, and resolve data discrepancies by finding the root cause of issues (an illustrative sketch of this kind of check appears after this list)

  • Drive the on-boarding and ingestion of newly acquired data sets in relation to the governance of existing data; manage all aspects of the loading process; monitor attribution throughout the process with quality checks; identify ways to optimize and solidify process cycles
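
For illustration only, here is a minimal sketch of the kind of ingestion quality check described in the responsibilities above, written in Python with pandas and SQLAlchemy; the connection string, table names, and column names are hypothetical and not part of the role description.

    # Minimal sketch of an ingestion quality check (illustrative only).
    # The connection string, tables, and columns below are hypothetical.
    import pandas as pd
    from sqlalchemy import create_engine

    engine = create_engine("mysql+pymysql://etl_user:***@db-host/clients_db")

    # Pull a newly on-boarded batch from a hypothetical staging table.
    batch = pd.read_sql("SELECT * FROM staging_referrals WHERE load_id = 42", engine)

    issues = {}
    # Basic quality checks before the batch is promoted downstream.
    issues["missing_client_id"] = int(batch["client_id"].isna().sum())
    issues["duplicate_rows"] = int(batch.duplicated(["client_id", "referral_date"]).sum())

    # Referential check against an existing dimension table.
    agencies = pd.read_sql("SELECT agency_id FROM agencies", engine)["agency_id"]
    issues["unknown_agency"] = int((~batch["agency_id"].isin(agencies)).sum())

    # Flag discrepancies for root-cause investigation instead of loading silently.
    problems = {name: count for name, count in issues.items() if count}
    if problems:
        print(f"Load 42 held back for review: {problems}")
    else:
        batch.to_sql("referrals", engine, if_exists="append", index=False)

In practice a check like this would typically run inside a workflow tool such as Pentaho/Hitachi Vantara rather than as a standalone script.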


Experience and Qualifications:



  • 5+ years of experience in a similar role as a Senior Data and Reporting Engineer

  • 3+ years of extensive experience in data flow and ETL processes

  • 3+ years of experience transforming and transferring data from transactional/relational databases to analytical databases (an illustrative sketch of such a roll-up follows this section)

  • 3+ years of experience implementing, setting up, and configuring dynamic reporting tools

  • 4+ years of expert-level SQL experience with relational databases, including query authoring and query analysis/debugging

  • Demonstrated ability to build and optimize processes that support data transformation, data structures, metadata, dependency and workload management, and the integration of data from external and internal data sources

  • 5+ years of experience performing root cause analysis on internal and external data and processes to identify bugs, answer specific business questions and highlight opportunities for improvement

  • Required:

    • Transactional/Relational (specifically MySQL) & Analytical databases

    • AWS RDS, S3, and EC2

    • Big Data Analytics & Reporting

    • Ad-hoc reporting

    • Data pipeline/workflow management tools such as Pentaho/Hitachi Vantara



  • Preferred:

    • Other AWS services: Redshift, Lambda, EMR, etc.

    • Programming Languages (Python, R, Java, Scala, PHP, C, C++, JavaScript, Julia, Mathematica, etc.)

    • Strong interpersonal skills and the ability to manage projects and work with cross-functional teams

    • Focused attention to detail and high standards for quality and accuracy



  • Degree in Computer Science, Information Systems, Mathematics, or an equivalent quantitative field
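
Again purely for illustration, and not a statement of how our systems are built: a sketch of the kind of transactional-to-analytical roll-up and ad-hoc query authoring referenced in the requirements above, in Python with SQLAlchemy; the connection string, tables, and columns are hypothetical.

    # Sketch of a transactional-to-analytical roll-up (illustrative only).
    # The connection string, tables, and columns are hypothetical.
    from sqlalchemy import create_engine, text

    engine = create_engine("mysql+pymysql://etl_user:***@db-host/clients_db")

    # Aggregate transactional referral rows into a reporting-friendly summary table.
    rollup = text("""
        INSERT INTO rpt_monthly_referrals (agency_id, yearmonth, referral_count)
        SELECT agency_id,
               EXTRACT(YEAR_MONTH FROM referral_date) AS yearmonth,
               COUNT(*) AS referral_count
        FROM referrals
        WHERE referral_date >= :since
        GROUP BY agency_id, EXTRACT(YEAR_MONTH FROM referral_date)
    """)

    with engine.begin() as conn:  # run the load in a single transaction
        conn.execute(rollup, {"since": "2019-01-01"})

    # Ad-hoc query against the summary table for a quick business question.
    with engine.connect() as conn:
        rows = conn.execute(text(
            "SELECT agency_id, SUM(referral_count) AS total "
            "FROM rpt_monthly_referrals GROUP BY agency_id "
            "ORDER BY total DESC LIMIT 10"
        )).fetchall()
    for agency_id, total in rows:
        print(agency_id, total)

At scale, steps like these would typically be orchestrated by a pipeline tool such as Pentaho/Hitachi Vantara or target an analytical store such as Redshift.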


via developer jobs - Stack Overflow
 
