YotaScale is the pioneer in AI-powered Cloud Infrastructure Management. It empowers CloudOps and DevOps engineering teams to optimize effectively around cost, performance, and availability considerations in real time. As a Senior Data Platform Engineer at YotaScale, you will build, evolve, and scale YotaScale's data platform and pipelines. You will work with engineering and ML teams to deploy their code, provide operational support for new and existing applications, and debug production issues. You will contribute directly by helping deliver our product to some of the most technically sophisticated customers in the world. Join us on this exciting journey as we build the future of cloud infrastructure with the latest breakthroughs in artificial intelligence.
The organization: Data Platform
The Data Platform team's mission is to build, evolve, and scale YotaScale's data platform and pipelines. The team charter is to (1) develop, improve, and deploy highly scalable distributed data pipelines that ingest, analyze, and store billions of time-series metrics and events in our downstream storage and analytics services; (2) ensure the system delivers high throughput and low latency to enable highly available, resilient, and durable services; and (3) build and maintain our real-time data processing streams and frameworks (Spark).
Major Projects
- A distributed computation engine capable of handling ad hoc analysis requests while self-scaling to accommodate varying loads
- A data ingestion pipeline that can receive, analyze, and annotate millions of events per day
- A solid testing infrastructure for validating both the computational and interactive components of our products repeatedly and reliably
- An experimental platform for quantifying ML algorithms across changes in platform, libraries, and techniques
- Packaged delivery and deployment of customized microservice architectures
Requirements
- Bachelor's degree or higher, in Computer Science or related field
- 8+ years of professional software engineering experience with consumer-facing/external-facing products, including 2+ years of demonstrated experience building REST APIs and microservices-based applications
- Proficiency with Java, Python, Bash, Postgres, Hadoop, Redis
- Experience with Kubernetes, Memcached, Docker, RabbitMQ, Kafka, Hive, and Spark is a big plus
- Experience with a major cloud provider (AWS, Azure, or Google Cloud Platform)
- Experience deploying scalable web systems with architectural requirements for high availability, high security, high throughput, low response time, etc.
- Emphasis on writing clear, readable, testable, deployable, and monitored code for server-side applications
- Excellent communication and problem-solving skills
- An inherent drive to solve problems, collaborate with others, and mentor team members in a fast-paced, high-growth, ever-changing startup environment