


Sr. Software Engineer - Data Platform at Twilio (San Francisco, CA)

Twilio's Data Platform group delivers a self-service platform of powerful, scalable, reliable, and high-performance data services and infrastructure to keep up with our rapid growth and the increasing demands of our product teams, business units, and external customers. As a Senior Engineer, you will be a core contributor in this group, helping us deliver the world-class data platform Twilio needs to succeed. You and your team will face some of the most complex challenges in distributed data systems at scale.


About the job:



  • Develop low latency, highly available, globally distributed services in the flight path of every Voice Call and SMS message.

  • Design and build data pipelines for handling millions of events per second.

  • Work in a small, empowered team. Move fast: ship to production multiple times in a two-week sprint.

  • Own and operate production services in AWS cloud infrastructure using the latest tools, like Datadog and Rollbar.

  • Gain exposure to industry-leading open source data technologies (Spark, Kafka, Presto).


Responsibilities:



  • Lead development of data products, services, and tools in Java and Scala.

  • Drive quality by writing unit tests, functional tests, and performance tests in a continuous delivery environment.

  • Break down requirements, estimate tasks, and plan work accurately. The definition of done is production.

  • Support development operations: building, releasing, and assisting with the team's on-call rotation.

  • Collaborate with other teams and mentor junior engineers. Work cross-functionally for product launches.

  • Champion best practices within the team.


Requirements:


In the Twilio Data Platform group, we believe that versatility and complementary proficiencies are the key to forming a better team. We are therefore looking for people with a variety of skills and specializations. You should have at least 3 of the proficiencies below to be considered for the role:



  1. Data persistence. You are an expert in designing, operating, and fine-tuning highly available data persistence layers. You teach and mentor engineers on how to apply the CAP theorem in everyday operations.
  2. Data compliance. You understand compliance requirements and have built systems compliant with HIPAA, GDPR, PCI, or similar standards.
  3. Scalable backend. You understand the scalability challenges and performance characteristics of server-side code. You can design and develop horizontally scalable, resilient, and efficient systems.
  4. Java or Scala. You are a Java or Scala enthusiast and professional. There is at least one framework in which you can be called an expert.
  5. Cloud. You are proficient in cloud technologies and are hands-on in at least one cloud platform: GCP, AWS, or Azure. You are able both to design and develop cloud-based systems and to operate them in an automated fashion.
  6. Security. You are able to design secure distributed systems and to assure the operational processes around security. You understand the fitness for purpose of different security techniques.
  7. Elasticsearch.

via developer jobs - Stack Overflow
 
