Your Responsibilities:
- Develop and troubleshoot using various Hadoop technologies
- Deploy and extend large-scale Hadoop clusters for our clients
- Fine-tune existing clusters for higher performance and throughput
- Work in our offices and on-site at the premises of our clients
- Enjoy being challenged and solving complex problems on a daily basis
- Be part of our newly formed team in Berlin and help drive its culture and work attitude
Job Requirements:
- You are a structured problem solver who is able to work in a team as well as independently
- Experience administering a Linux environment (RHEL)
- At least 2 years of experience with the Hadoop ecosystem and its tools
- Understanding of software development methodologies and processes
- Sound knowledge of SQL, relational concepts, and relational database systems is a plus
- Knowledge of Docker and related tools is a plus
- Ability to work in an English-speaking, international environment
We offer:
- Fascinating tasks and interesting Big Data projects in various industries
- Benefit from our 10 years of experience delivering excellence to our customers
- Work with our open-source Big Data gurus, such as our Apache HBase committer and Release Manager
- Work in the open-source community and become a contributor
- Learn from open-source enthusiasts whom you will find nowhere else in Germany!
- Fair pay and bonuses
- Enjoy our additional benefits such as a free BVG ticket and fresh fruits in the office
- Possibility to work remotely or in one of our development labs throughout Europe
- Work with cutting-edge equipment and tools
via developer jobs - Stack Overflow