Job Description :

Hello 

 

This is Rasika from Ztek Consulting Inc.
Hope you're doing great. 
We are assisting our client in hiring a "Data Engineer."
Location: Remote
Experience: 10+ years


If this is of interest, please email your resume along with your present work/visa status, compensation/rate, and any other relevant details for immediate consideration.

Below is a brief synopsis of the position.

Data Engineer (ex-Infosys, IBM, CTS, Accenture, TCS, Wipro, or Lead)

Remote

 

Requirements:

 

Experience:

Strong understanding of data structures and algorithms

Strong understanding of solution and technical design

Has a strong problem-solving and analytical mindset

Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders

Able to quickly pick up new programming languages, technologies, and frameworks

Advanced experience building cloud-scalable, real-time, and high-performance data lake solutions

In-depth understanding of microservice architecture

Strong understanding of developing complex data solutions

Experience working on end-to-end solution design

Able to lead others in solving complex problems by taking a broad perspective to identify innovative solutions

Willing to learn new skills and technologies

Has a passion for data solutions

 

Required and Preferred Skill Sets:

3+ years of hands-on, expert-level experience with AWS (EMR [Hive, PySpark], S3, Athena) or an equivalent cloud platform

3+ years of hands-on, expert-level experience with Spark batch processing and some familiarity with Spark Structured Streaming

3-4 years' working experience with the Hadoop stack, handling huge volumes of data in a scalable fashion

5+ years of hands-on experience with SQL, ETL, data transformation, and analytics functions

5+ years of hands-on Python experience, including batch scripting, data manipulation, and distributable packages

5+ years' experience working with batch orchestration tools such as Apache Airflow or equivalent

5+ years working with code versioning tools such as GitHub or Bitbucket; expert-level understanding of repo design and best practices

5+ years working with deployment automation tools such as Jenkins, with familiarity with containerization tools such as Docker and Kubernetes

5+ years of hands-on experience designing and building ETL pipelines; expert with data ingestion, change data capture, and data quality; hands-on experience with API development; some exposure to NiFi or Kafka

5+ years designing and developing relational database objects; knowledgeable in logical and physical data modelling concepts; some experience with Snowflake

3+ years of experience supporting Tableau or Cognos use cases; familiarity with their tools and capabilities

Familiarity with Agile; working experience preferred

 
