


AGODA HIRING: Hadoop Application Developer - Bangkok at Big Wednesday Digital (Bangkok, Thailand)



Our Bangkok team is looking for top-quality, passionate engineers to build products across our next-generation data platform.
  • Our systems scale across multiple data centres, handling a few million writes per second and managing petabytes of data. We deal with problems spanning real-time data ingestion, replication, enrichment, storage, and analytics. We are not just using Big Data technologies; we are pushing them to the edge.
  • In this competitive world of online travel agencies, finding even the slightest advantage in the data can make or break a company. That is why data systems are among our top priorities.
  • While we are proud of what we have built so far, there is still a long way to go to fulfil our vision for data. We are looking for people like you who are as excited about data technology as we are to join the fight.
    You can be part of designing, building, deploying (and probably debugging) all aspects of our core data platform products.




Why Agoda Hadoop Apps Team?

Joining this team, you will be solving some of the most difficult challenges out there today for the Hadoop ecosystem. We focus largely on building user tools and applications for other teams to use; most of these are built on top of YARN, utilising Apache Spark as well as other cutting-edge technologies.

We are also the engine that drives a fully functional, world-class data warehouse on top of Hadoop. This means providing tools to define data cubes and syncing high volumes of data from the processing Hadoop cluster to other systems using in-house-built applications.
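For a flavour of what such a sync job can look like, here is a minimal sketch using Spark's Scala API, assuming a hypothetical Hive source table and a Postgres target (the table names, partition column, and connection details are all made up; Spark, Hive and Postgres appear in the tech list further down):

    import org.apache.spark.sql.{SaveMode, SparkSession}

    // Minimal sketch: copy one day's partition of a Hive table into Postgres.
    // Table names, partition column, and connection settings are hypothetical.
    object BookingsSync {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("bookings-sync")
          .enableHiveSupport()          // read tables managed by the Hadoop warehouse
          .getOrCreate()

        val bookings = spark.table("dwh.bookings")
          .where("ds = '2024-01-01'")   // sync a single partition per run

        bookings.write
          .mode(SaveMode.Append)
          .format("jdbc")
          .option("url", "jdbc:postgresql://reporting-db:5432/analytics")
          .option("dbtable", "public.bookings")
          .option("user", sys.env("PG_USER"))
          .option("password", sys.env("PG_PASSWORD"))
          .save()

        spark.stop()
      }
    }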

Day to Day:
  • You will design, build, test and deploy scalable and efficient tools and applications to process large amounts of data, while keeping to the highest standards of testing and code quality
  • You will improve scalability, stability, accuracy, speed and efficiency of our existing data systems
  • You will build monitoring to track the health of the data warehouse and other SLAs (a rough sketch follows after this list)
  • You will work with experienced engineers and product owners to identify and build tools to automate large-scale data management and analysis tasks
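As a rough illustration of the monitoring bullet above, here is a minimal sketch of pushing a warehouse-lag metric to Graphite (listed in the tech stack below) over its plaintext protocol; the host, metric name, and lag value are invented for the example:

    import java.io.PrintWriter
    import java.net.Socket

    // Minimal sketch: report how far the warehouse lags behind real time,
    // using Graphite's plaintext protocol ("metric value timestamp\n").
    // Host, metric name, and the lag value itself are hypothetical.
    object WarehouseLagReporter {
      def main(args: Array[String]): Unit = {
        val lagSeconds = 420L                          // would normally be computed from the newest loaded partition
        val nowSeconds = System.currentTimeMillis() / 1000
        val metricLine = s"dwh.bookings.lag_seconds $lagSeconds $nowSeconds\n"

        val socket = new Socket("graphite.internal", 2003)
        try {
          val out = new PrintWriter(socket.getOutputStream, true)
          out.print(metricLine)
          out.flush()
        } finally {
          socket.close()
        }
      }
    }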


About You:
  • You'll probably hold a bachelor's degree in Computer Science / Information Systems / Engineering or a related field
  • You'll have 3+ years' experience designing and implementing large, scalable distributed systems
  • You'll be a proficient coder in either Java or (preferably) Scala
  • You'll have operational experience debugging production issues
  • You'll have a good understanding of data architecture principles
  • You'll have strong oral and written English communication skills


Nice to Haves:
  • Working knowledge of the Hadoop ecosystem or other "Big Data" experience
  • Any experience with Apache Spark (Scala API preferred)
  • Strong design and OOP skills
  • Python/Django skills
  • Experience working with open source products
  • Experience working in an agile environment using test-driven methodologies




What else?

With Agoda you can grow rapidly as an engineer:
  • Work with top Hadoop engineers
  • Use and expand your experience
  • Have a big impact on the business


Some tech you will use:

Hadoop, Spark, Hive, Impala, Scala, Avro, Parquet, Sensu, ElasticSearch, Python, Django, Postgres, Vertica, Graphite, Grafana, MSSQL, AtScale.

If that's the kind of team you want to join, let's talk!



Agoda is an equal opportunity employer and values diversity. We do not discriminate on the basis of race, religion, colour, national origin, gender, sexual orientation, age, marital status or disability status.



Please note this role is open to both local and international applicants. Full visa sponsorship and relocation assistance are available.
via developer jobs - Stack Overflow
 
