
Python Developer – Data Infrastructure Engineering at Akuna Capital (Chicago, IL)

What you’ll do as a Python Developer on the Data Infra team at Akuna:


We are a data-driven organization and are seeking Python Data Infrastructure Engineers to take our data to the next level. We collect large volumes of data from both internal and external sources, and need talented individuals to identify opportunities to improve and expand our data capabilities. Working on our data infrastructure is a high-impact position, and you will have the opportunity to work closely with our world-class team of Quants, Traders, and Management.



  • Lead the effort to democratize data access at Akuna

  • Architect, implement, and improve tools that build and interact with our diverse data

  • Standardize our data management best practices

  • Work closely with stakeholders throughout the firm to identify how data is consumed

  • Build and deploy pipelines to collect and transform our rapidly growing Big Data set

  • Propose and effect changes to our data generation processes

  • Produce clean, well-tested, and documented code with a clear design to support mission-critical applications

  • Challenge the status quo and help push our organization forward, as we grow beyond the limits of our current tech stack
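The collect-and-transform pipeline work described above might look, in miniature, like the following plain-Python sketch. The record schema (`symbol`, `bid`, `ask`) and the derived `mid` field are purely illustrative assumptions, not Akuna's actual data model:

```python
import json

def collect(raw_lines):
    """Parse raw JSON lines into records, skipping malformed input."""
    for line in raw_lines:
        try:
            yield json.loads(line)
        except json.JSONDecodeError:
            continue  # drop lines that fail to parse

def transform(records):
    """Normalize fields and derive a mid price (hypothetical schema)."""
    for rec in records:
        yield {
            "symbol": rec["symbol"].upper(),
            "mid": (rec["bid"] + rec["ask"]) / 2,
        }

raw = [
    '{"symbol": "abc", "bid": 9.5, "ask": 10.5}',
    'not json',
    '{"symbol": "xyz", "bid": 99.0, "ask": 101.0}',
]
result = list(transform(collect(raw)))
# → [{'symbol': 'ABC', 'mid': 10.0}, {'symbol': 'XYZ', 'mid': 100.0}]
```

Generator-based stages like these compose lazily, which keeps memory flat as data volume grows; in production such stages would typically sit behind a streaming system such as Kafka rather than an in-memory list.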


Qualities that make great candidates:



  • BS/MS/PhD in Computer Science, Engineering, Physics, Math, or equivalent technical field

  • 3+ years of professional experience developing software applications in Python; experience with other languages such as C++ is a plus

  • Highly motivated and willing to take ownership of high-impact projects upon arrival

  • Experience architecting a solution end to end, from data collection through storage and access

  • Excellent communication, analytical, and problem-solving skills

  • Demonstrated experience working with diverse data sets and frameworks across multiple domains – financial data experience not required

  • Experience with common Big Data technologies, such as Kafka and Spark, is a big plus

  • Interest or experience in building scalable, containerized workflows using Docker

  • Demonstrated experience using software engineering best practices to deliver complex software projects


via developer jobs – Stack Overflow