Responsibilities:
- Work with discipline leads and other engineers on module/unit and interface specification, implementation, integration, and testing of the product.
- Rapidly architect, design, prototype, and implement solutions to tackle Big Data and Data Science needs.
- Research, experiment with, and apply leading Big Data technologies, such as HDInsight, Hadoop, Spark, Azure Data Lake, Power BI, Azure Data Factory, Redshift, and Microsoft Azure PaaS.
- Architect, implement, and test data processing pipelines and data mining / data science algorithms in a variety of hosted settings, such as Azure, client technology stacks, and Crestron's own clusters.
- Work as part of an agile team.
Qualifications:
- Bachelor's degree is required, in Computer Science, Business Information Systems, or another relevant field.
- A minimum of 5 years of experience in building enterprise scale systems and system architecture is required.
- A minimum of 2 years of experience with C# and .NET, including source code management systems such as SVN, is required.
- Fluency with Agile methodologies such as Scrum is required.
- Experience with large-scale big data methods, such as MapReduce, Hadoop, Spark, Hive, Impala, or Storm, is required.
- Experience with cloud and distributed systems principles, including load balancing, networking, scaling, and in-memory vs. disk storage, is required.
Must be able to work in the US without sponsorship.