Homegrown stories and Hollywood hits. At HOOQ we're telling millions of stories to billions of people across Singapore, the Philippines, Thailand, Indonesia, and India. Just like our content, our team comes from around the world; we're ambitious, driven and unique, and we embrace our differences. There are many paths people take to join us; what links us together is our love of stories.
HOOQ is backed by some of the biggest players in entertainment: we're a joint venture between Singtel, Sony and Warner Brothers. We build for the customer first to deliver original, local and international content on their phone, tablet, computer and television, wherever they are.
It's an exciting time to be at HOOQ! We are currently seeking energetic Data Engineers at different levels for our Singapore office, with experience handling data at high volume, variety and velocity. The Data team is tasked with absorbing billions of rows of data from dozens of sources, then organizing, analyzing, and visualizing them to help inform both short- and long-term decision-making.
Primary responsibilities
- Design, implement and support an analytical data infrastructure providing ad-hoc access to large datasets and computing power.
- Interface with other technology teams to extract, transform, and load data from a wide variety of data sources using SQL and AWS big data technologies.
- Create and support real-time data pipelines built on AWS technologies including EMR, Glue, Kinesis, Redshift/Spectrum and Athena (see the sketch after this list).
- Support existing ETL/ELT infrastructure built on Pentaho, Python and EMR.
- Continually research the latest big data and Elasticsearch technologies to provide new capabilities and increase efficiency.
- Work closely with team members to drive real-time model implementations for system monitoring and alerting.
- Collaborate with other tech teams to implement advanced analytics algorithms that exploit our rich datasets for statistical analysis, prediction, clustering and machine learning.
- Help continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
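To give candidates a flavor of the pipeline work described above, here is a minimal sketch of one step: reading a batch of events from a Kinesis stream and landing it in S3 as newline-delimited JSON so it can be queried downstream with Athena or Redshift Spectrum. The stream and bucket names are hypothetical and this is not HOOQ's actual code; a production pipeline would use the Kinesis Client Library or Glue/EMR rather than a single-shard polling call.

```python
import json
import boto3

# Hypothetical resource names for illustration only -- not HOOQ's actual stack.
STREAM_NAME = "playback-events"
BUCKET = "example-analytics-landing"

kinesis = boto3.client("kinesis")
s3 = boto3.client("s3")


def read_one_batch(stream_name: str) -> list:
    """Read a single batch of records from the first shard of a Kinesis stream."""
    shards = kinesis.describe_stream(StreamName=stream_name)["StreamDescription"]["Shards"]
    iterator = kinesis.get_shard_iterator(
        StreamName=stream_name,
        ShardId=shards[0]["ShardId"],
        ShardIteratorType="TRIM_HORIZON",
    )["ShardIterator"]
    records = kinesis.get_records(ShardIterator=iterator, Limit=100)["Records"]
    # Each record's Data field is raw bytes; here we assume JSON-encoded events.
    return [json.loads(r["Data"]) for r in records]


def land_batch_to_s3(events: list, key: str) -> None:
    """Write parsed events to S3 as newline-delimited JSON for Athena/Spectrum queries."""
    body = "\n".join(json.dumps(e) for e in events)
    s3.put_object(Bucket=BUCKET, Key=key, Body=body.encode("utf-8"))


if __name__ == "__main__":
    batch = read_one_batch(STREAM_NAME)
    if batch:
        land_batch_to_s3(batch, "events/batch-000.json")
```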
Your CV should show:
- 4+ years of industry experience in software development, data engineering, business intelligence, data science, or a related field, with a track record of manipulating, processing, and extracting value from large datasets
- Demonstrated strength in data modeling, ETL development, and data warehousing
- Experience programming in Python
- Experience using big data technologies (Hadoop, Hive, HBase, Spark, etc.)
- Experience using business intelligence reporting tools (Tableau, Cognos, etc.)
- Knowledge of data management fundamentals and data storage principles
- Knowledge of distributed systems as they pertain to data storage and computing
- Experience working with AWS big data technologies (Redshift, S3, EMR)
- Knowledge of software engineering best practices across the development lifecycle, including agile methodologies, coding standards, code reviews, source management, build processes, testing, and operations