Placement papers | Freshers Walkin | Jobs daily: 05/01/2024 - 06/01/2024


Intraedge Technologies Ltd. is hiring for: Java Developer - Data Structure/Algorithm (5-8 yrs)

Hi Mohamed Bilal,

Here's an interesting job that we think might be relevant for you -

Job Title: Java Developer - Data Structure/Algorithm (5-8 yrs)


Job Description:

Experience: 5-8 years

- Strong grasp of data structures and algorithms (see the illustrative sketch after this list)

- Strong hands-on experience in Java, with an extensive understanding of the JVM, its garbage collectors, and the JVM memory model

- Extensive experience in designing and implementing large-scale, distributed, data-intensive platforms with high resiliency, availability, and reliability.

- Strong experience in maintaining high-throughput, low-latency applications.

- Strong written and verbal communication, problem-solving, and analytical skills.

- Proven collaboration skills along with the ability to influence without authority.

- Bachelor's degree in Computer Science, Computer Science Engineering, or a related field is required; an advanced degree is preferred, along with 7 or more years of experience in software engineering, design, and architecture.

- Good knowledge of Couchbase.


- Knowledge of security and cryptography is a plus.

- Notice period: immediate joiners, or candidates currently serving notice that ends by month end
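The data-structures-and-algorithms bar above is typically probed with small coding exercises. Purely as an illustration (and not part of the job description itself), here is a minimal sketch of one such exercise; it is written in Python for brevity even though the role is Java-centric, and the function name and sample data are invented.

import heapq
from collections import Counter

def top_k_frequent(words, k):
    """Return the k most frequently occurring words.

    Counting is O(n); selecting the top k via a bounded heap is O(n log k),
    avoiding a full O(n log n) sort -- the kind of complexity trade-off a
    data-structures/algorithms screen typically probes.
    """
    counts = Counter(words)                           # hash map: word -> frequency
    return heapq.nlargest(k, counts, key=counts.get)  # k-sized heap under the hood

if __name__ == "__main__":
    sample = ["jvm", "gc", "jvm", "heap", "gc", "jvm"]
    print(top_k_frequent(sample, 2))  # ['jvm', 'gc']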





To view the job description, please refer to the link below:
https://www.hirist.com/j/java-developer-data-structurealgorithm-5-8-yrs-1337364.html?ref=tm

PS: Please ignore this email if you have already applied or are not interested in this job.

Best regards,
Team hirist.com
info@hirist.com

Indium Software is hiring for: Data Scientist - NLP/Deep Learning (4-7 yrs)

Hi Mohamed Bilal,

Here's an interesting job that we think might be relevant for you -

Job Title: Data Scientist - NLP/Deep Learning (4-7 yrs)


Job Description:

We are looking for a highly skilled and experienced Data Scientist to join our team.

The ideal candidate should have a strong background in data science, with a focus on Natural Language Processing (NLP), Deep Learning, and experience working with AWS.

Additionally, experience in implementing Gen AI solutions in projects is highly desirable.

As a Data Scientist, you will play a key role in developing advanced analytics solutions that drive business insights and innovation.

Responsibilities:

- Apply advanced statistical and machine learning techniques to analyze large datasets and extract meaningful insights.

- Develop and deploy NLP models for text classification, sentiment analysis, named entity recognition, and other NLP tasks (a minimal sentiment-analysis sketch follows this list).

- Utilize Deep Learning frameworks such as TensorFlow or PyTorch to build and train neural network models for various data science applications.

- Collaborate with cross-functional teams to understand business requirements and develop data-driven solutions to address them.

- Design and implement scalable and reliable data pipelines for data ingestion, processing, and transformation using AWS services.

- Leverage AWS cloud infrastructure and services such as S3, EC2, SageMaker, and Lambda for building and deploying data science solutions.

- Explore and implement Gen AI techniques to automate and optimize processes, enhance user experiences, and drive business outcomes.

- Stay updated on the latest trends and advancements in data science, machine learning, and AI technologies.
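To ground the NLP items above in something concrete: a sentiment-analysis pass with the Hugging Face Transformers pipeline API might look like the sketch below. This is a hedged illustration only; the model name and the sample reviews are assumptions, not anything specified in this listing.

# Minimal sketch: sentiment analysis with the Hugging Face `transformers` pipeline.
# Requires `pip install transformers` plus a backend such as PyTorch.
from transformers import pipeline

# Model choice is an assumption -- any sentiment model from the Hub would do.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

reviews = [
    "The onboarding flow was quick and painless.",
    "Support took three days to reply to a simple question.",
]

for review, result in zip(reviews, classifier(reviews)):
    # Each result is a dict like {"label": "POSITIVE", "score": 0.99}.
    print(f"{result['label']:8s} ({result['score']:.2f})  {review}")

The same pipeline API covers text classification and named entity recognition by swapping the task string and model.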

Requirements:

- Bachelor's or Master's degree in Computer Science, Statistics, Mathematics, or related field.

- 4-7 years of experience working as a Data Scientist, with a proven track record of delivering impactful data science projects.

- Strong expertise in NLP techniques and libraries such as NLTK, spaCy, Transformers, etc. (see the spaCy sketch after this list).

- Hands-on experience with Deep Learning frameworks like TensorFlow, PyTorch, or Keras.

- Proficiency in working with AWS cloud services for data analytics and machine learning.

- Experience in implementing Gen AI solutions in real-world projects is a plus.

- Solid programming skills in languages such as Python, R, or Scala.

- Excellent analytical and problem-solving abilities.

- Strong communication and collaboration skills, with the ability to work effectively in a team environment.
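Since spaCy is named explicitly above, here is an equally minimal named-entity-recognition sketch. It assumes the small English pipeline has been downloaded (python -m spacy download en_core_web_sm); the example sentence is invented.

# Minimal sketch: named entity recognition with spaCy.
# Assumes: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("The analytics team opened a new office in Chennai in March 2023.")

for ent in doc.ents:
    # ent.text is the matched span, ent.label_ its type (ORG, GPE, DATE, ...).
    print(ent.text, "->", ent.label_)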

If you are a talented Data Scientist with a passion for leveraging advanced analytics and AI technologies to drive business value, we encourage you to apply for this exciting opportunity!





To view the job description, please refer to the link below:
https://www.hirist.com/j/data-scientist-nlpdeep-learning-4-7-yrs-1332880.html?ref=tm

PS: Please ignore this email if you have already applied or are not interested in this job.

Best regards,
Team hirist.com
info@hirist.com

Ganit Inc is hiring for: Ganit Inc. - Senior Data Engineer - Python/Java/Scala (7-12 yrs)

Hi Mohamed Bilal,

Here's an interesting job that we think might be relevant for you -

Job Title: Ganit Inc. - Senior Data Engineer - Python/Java/Scala (7-12 yrs)


Job Description:

About Ganit:

At Ganit, analytics is our playing field.

We're not just data geeks - we're a family of scientists, refiners and artisans.

We manage data from the source and extract valuable information, working with businesses to develop state-of-the-art solutions that push the boundaries of data science.

And it just so happens that we excel at translating data into actions and insights - turning cacophony into melody.

Being a data science and data analytics company that effectively bridges intelligence and action is no walk in the park, but here at Ganit our people make it possible.

Our thought framework helps us craft a data-backed response to the most pressing concerns of organizational leaders today.

Roles & Responsibilities:

- Design, implement, and improve the analytics platform

- Implement and simplify self-service data query and analysis capabilities of the BI platform


- Develop and improve the current BI architecture, emphasizing data security, data quality and timeliness, scalability, and extensibility

- Deploy and use various big data technologies and run pilots to design low latency data architectures at scale

- Collaborate with business analysts, data scientists, product managers, software development engineers, and other BI teams to develop, implement, and validate KPIs, statistical analyses, data profiling, prediction, forecasting, clustering, and machine learning algorithms

- Experience with complex data modelling, ETL design, and using large databases in a business environment

- Proficiency with Linux command line and systems administration

- Experience with languages like Python/Java/Scala

- Experience with Big Data technologies such as Hive/Spark (see the PySpark sketch after this list)

- Proven ability to develop unconventional solutions, see opportunities to innovate, and lead the way

- Good experience working with cloud platforms like AWS, GCP, and Azure.

- Experience working on projects involving the creation of a data lake or data warehouse

- Excellent verbal and written communication.

- Proven interpersonal skills and ability to convey key insights from complex analyses in summarized business terms.

- Ability to effectively communicate with multiple teams

- Ability to work with shifting deadlines in a fast-paced environment.
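To make the Spark-and-Python expectation above concrete, a typical batch aggregation in PySpark could look like the sketch below. It is illustrative only: the input path, column names, and output location are hypothetical placeholders.

# Minimal PySpark sketch: a batch aggregation feeding a BI-style metric.
# Paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-order-metrics").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/raw/orders/")  # hypothetical path

daily_metrics = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "region")
    .agg(
        F.count("*").alias("order_count"),
        F.sum("order_amount").alias("gross_revenue"),
    )
)

# Write back in a partitioned, query-friendly layout for downstream BI tools.
daily_metrics.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/daily_order_metrics/"
)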

Educational Qualification:

At Ganit we are building an elite team, so we are seeking candidates who possess the following background:

- 7+ years of relevant experience

- Expert-level skills in writing and optimizing complex SQL (see the sketch after this list)

- Knowledge of data warehousing concepts

- Experience in data mining, profiling, and analysis
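The "complex SQL" expectation above usually means analytic constructs such as window functions rather than plain SELECTs. As a self-contained, hedged illustration (the table and data are invented, and it uses Python's built-in sqlite3, which needs SQLite 3.25+ for window functions):

# Sketch: a windowed "largest order per customer" query, run via sqlite3.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer_id INTEGER, order_id INTEGER, amount REAL);
    INSERT INTO orders VALUES (1, 10, 120.0), (1, 11, 80.0), (2, 20, 45.5), (2, 21, 99.0);
""")

query = """
    SELECT customer_id, order_id, amount
    FROM (
        SELECT customer_id, order_id, amount,
               ROW_NUMBER() OVER (
                   PARTITION BY customer_id ORDER BY amount DESC
               ) AS rn
        FROM orders
    ) AS ranked
    WHERE rn = 1
"""

for row in conn.execute(query):
    print(row)  # (1, 10, 120.0) and (2, 21, 99.0)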

Good to have:

- AWS/GCP/Azure Data Engineer Certification





To view the job description, please refer to the link below:
https://www.hirist.com/j/ganit-inc-senior-data-engineer-pythonjavascala-7-12-yrs-1332081.html?ref=tm

PS: Please ignore this email if you have already applied or are not interested in this job.

Best regards,
Team hirist.com
info@hirist.com

TecHHire Global is hiring for: Senior Data Engineer - GCP (4-9 yrs)

Hi Mohamed Bilal,

Here's an interesting job that we think might be relevant for you -

Job Title: Senior Data Engineer - GCP (4-9 yrs)


Job Description:


We are looking for a Senior Data Engineer - GCP who can design and build production data pipelines, from ingestion to consumption, within a big data architecture, using Python, Scala, PySpark, and Java to implement data engineering, ingestion, and curation functions on GCP.
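As a loose illustration of the ingestion-to-consumption idea (and not this employer's actual stack), the sketch below pulls a raw CSV from a Google Cloud Storage bucket and reduces it to a small curated summary. The bucket name, object name, and column are all assumptions, and credentials are expected to come from the environment.

# Illustrative ingestion-to-consumption sketch on GCP.
# Assumes `pip install google-cloud-storage` and ambient credentials
# (e.g. GOOGLE_APPLICATION_CREDENTIALS). Bucket, object, and column are invented.
import csv
import io
from collections import defaultdict

from google.cloud import storage


def summarize_daily_events(bucket_name, blob_name):
    """Ingest a raw CSV from GCS and curate it into per-event-type counts."""
    client = storage.Client()
    raw_text = client.bucket(bucket_name).blob(blob_name).download_as_text()

    counts = defaultdict(int)
    for row in csv.DictReader(io.StringIO(raw_text)):
        counts[row["event_type"]] += 1  # hypothetical column
    return dict(counts)


if __name__ == "__main__":
    # Hypothetical locations -- replace with real ones.
    print(summarize_daily_events("example-raw-zone", "events/2024-01-05.csv"))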


Responsibilities:

- Must be able to write quality code and build secure, highly available systems.

- Assemble large, complex data sets that meet functional/non-functional business requirements.

- Identify, design, and implement internal process improvements: automating manual processes, optimising data delivery, re-designing infrastructure for greater scalability, etc., with guidance.

- Create data tools for analytics and data scientist team members that assist them in building and optimising our product into an innovative industry leader.

- Monitor performance and advise on any necessary infrastructure changes.

- Define data retention policies.

- Implement the ETL process and an optimal data pipeline architecture.

- Build analytic tools that utilise the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.

- Create design documents that describe the functionality, capacity, architecture, and process.

- Develop, test, and implement data solutions based on finalised design documents.

- Work with data and analytics experts to strive for greater functionality in our data systems.

- Proactively identify potential production issues and recommend and implement solutions.

Requirements:

- Should have at least 2 years of relevant experience as a GCP Data Engineer.

- Educational qualification: Bachelor's/Master's degree.

- Good understanding of optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and GCP big data technologies.

- Strong in SQL, with a good understanding of and experience in writing stored procedures.

- Good experience in GCP.

- Good experience in Hadoop and big data.

- Proficient understanding of distributed computing principles.

- Experience working with batch-processing/real-time systems using various open-source technologies like NoSQL, Spark, Pig, Hive, and Apache Airflow.

- Has implemented complex projects dealing with considerable data sizes (PB scale).

- Optimisation techniques (performance, scalability, monitoring, etc.).

- Experience with integration of data from multiple data sources, preferably AWS.

- Experience with NoSQL databases, such as HBase, Cassandra, MongoDB, etc.

- Knowledge of various ETL techniques and frameworks, such as Flume.

- Experience with various messaging systems, such as Kafka or RabbitMQ.

- Good understanding of Lambda Architecture, along with its advantages and drawbacks.

- Creation of DAGs for data engineering (a minimal Airflow example follows this list).

- Expert at Python/Scala programming, especially for data engineering/ETL purposes.
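Since Airflow and DAG authoring are called out above, here is a minimal, hedged Airflow 2.4+ sketch of a daily two-step pipeline. The dag_id, schedule, and task bodies are placeholders, not anything from this role.

# Minimal Airflow 2.4+ sketch of a daily ingestion DAG; all names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_raw_data(**context):
    # Placeholder: pull the previous day's files into the raw zone.
    print("ingesting partition", context["ds"])


def publish_curated_table(**context):
    # Placeholder: transform raw data and publish a curated table.
    print("publishing curated data for", context["ds"])


with DAG(
    dag_id="example_daily_ingestion",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest_raw_data", python_callable=ingest_raw_data)
    publish = PythonOperator(task_id="publish_curated_table", python_callable=publish_curated_table)

    ingest >> publish  # simple linear dependency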





To view the job description, please refer to the link below:
https://www.hirist.com/j/senior-data-engineer-gcp-4-9-yrs-1331874.html?ref=tm

PS: Please ignore this email if you have already applied or are not interested in this job.

Best regards,
Team hirist.com
info@hirist.com