
Mid/Senior/Lead Scala Engineer - Scala, Spark, Hadoop at The Quantium Group (Sydney, Australia)

For over 15 years Quantium have combined the best of human and artificial intelligence to power possibilities for individuals, organisations and society. Our solutions make sense of what has happened and what will, could or should be done to re-shape industries and societies around the needs of the people they serve.


Times and technology have changed, but this remains our goal. Instead of wrangling single, SQL-based databases, our MapR Hadoop platform runs across 200 nodes with multiple clusters using the latest big data technology.


Working with Scala, Spark and the rest of the Hadoop ecosystem, you’ll be building applications to work with unique data sets (some of the largest and most complex in Australia) to make a real difference to our clients.
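For flavour, here is a minimal sketch of the kind of Spark-on-Scala application described above. It is only an illustration: the input path, column names and aggregation are hypothetical and are not taken from Quantium's actual pipelines.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Hypothetical example: summarise a large transaction dataset for analytic use.
object TransactionSummary {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("transaction-summary")
      .getOrCreate()

    // Assumed input: parquet files with (customerId, category, amount) columns.
    val transactions = spark.read.parquet("/data/transactions")

    // Aggregate spend and distinct customers per category.
    val spendByCategory = transactions
      .groupBy("category")
      .agg(
        sum("amount").as("total_spend"),
        countDistinct("customerId").as("customers")
      )

    spendByCategory.write.mode("overwrite").parquet("/data/summaries/spend_by_category")

    spark.stop()
  }
}
```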


FAQ:


Is your Scala fully FP?


Not always. We focus on building highly maintainable code, so there's often a decision to make about the best approach. We're not purists, but we love to use FP where it's the best solution.
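As a hedged illustration of that trade-off (a sketch, not the team's actual code), the same word count can be written in a purely functional style or with local mutation, and either can be the more maintainable choice depending on context:

```scala
object WordCount {
  // Functional style: immutable data, a fold, no side effects.
  def countFunctional(words: Seq[String]): Map[String, Int] =
    words.foldLeft(Map.empty[String, Int]) { (acc, w) =>
      acc.updated(w, acc.getOrElse(w, 0) + 1)
    }

  // Imperative style: a local mutable map, sometimes clearer or faster in hot paths.
  def countImperative(words: Seq[String]): Map[String, Int] = {
    val counts = scala.collection.mutable.Map.empty[String, Int]
    for (w <- words) counts(w) = counts.getOrElse(w, 0) + 1
    counts.toMap
  }
}
```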


How do your teams work?


We work in multi-discipline teams, so you'll be working alongside Data Scientists, Analysts, Testers and DevOps engineers.


What are you looking for in the ideal candidate?


We’re looking for Big Data Software engineers:


You’ve been working with Scala and love to play with it; you know your way around the Hadoop ecosystem, with Spark at the top of your go-to frameworks.


You’re a pragmatist and a true engineer, and you love to solve complex problems.


You can also have a conversation about Scalaz without alienating anyone!


You haven't ticked "Do programmers have quiet working conditions?" in the Joel Test?


We have an open-plan office, which is normally pretty quiet, but if you want to head to a meeting room or use headphones, that's totally fine.


Skills & Requirements


Do you have a bullet point checklist for me to check off my suitability?


We know people are not bullet points, but sure thing!


You have:



  • Experience developing applications using Apache Spark or similar Hadoop-based big data technologies

  • Experience building Scala applications, preferably in distributed contexts (if you have a real love of Scala and are itching to move over from Java, we’ll also consider you)

  • A solid foundation in functional and object-oriented programming with data structures

  • A passion for solving problems and writing efficient algorithms

  • An awareness of considerations around structuring data on distributed systems to support analytic use cases

  • A passion for delivering high-quality, peer-reviewed, well-tested code

  • A love for knowledge sharing: you know what works, but you’re also happy to learn new methods and technologies


via developer jobs - Stack Overflow
 
