Job Description - Senior Data Engineer
• Strong understanding of data structures and algorithms
• Strong understanding of solution and technical design
• Has a strong problem-solving and analytical mindset
• Able to influence and communicate effectively, both verbally and in writing, with team members and
business stakeholders
• Able to quickly pick up new programming languages, technologies, and frameworks
• Advanced experience building scalable, real-time, high-performance cloud data lake solutions
• In-depth understanding of microservice architecture
• Strong understanding of developing complex data solutions
• Experience working on end-to-end solution design
• Able to lead others in solving complex problems by taking a broad perspective to identify innovative
solutions
• Willing to learn new skills and technologies
• Has a passion for data solutions
Required and Preferred Skill Sets:
• 1-2 years of hands-on experience in AWS - EMR [Hive, PySpark], S3, Athena, or an equivalent cloud platform;
Ability to solve complex problems
• 1-2 years of hands-on experience with Spark batch processing and some familiarity with Spark Structured
Streaming; Ability to solve complex issues
• 1-2 years' working experience with the Hadoop stack, dealing with huge volumes of data in a scalable manner
• 2-3 years of hands-on experience with SQL, ETL, data transformation, and analytics functions; Ability to
solve complex problems
• 2-3 years of hands-on Python experience, including batch scripting, data manipulation, and distributable
packages; Ability to solve complex problems
• 2-3 years' experience working with batch orchestration tools such as Apache Airflow or equivalent,
preferably Airflow
• 2-3 years working with code versioning tools such as GitHub or Bitbucket; expert-level understanding of
repo design and best practices
• 2-3 years working with deployment automation tools such as Jenkins, and familiarity with containerization
concepts such as Docker and Kubernetes
• 2-3 years of hands-on experience designing and building ETL pipelines; expert with data ingestion, change
data capture, and data quality; hands-on experience with API development; some exposure to NiFi or Kafka
• 2-3 years designing and developing relational database objects; knowledgeable about logical and physical
data modelling concepts; some experience with Snowflake
• Preferred 1+ years of experience supporting Tableau or Cognos use cases
• Familiarity with Agile; working experience preferred
• Bachelor's degree in IT or related field
• One of the following alternatives may be accepted:
o Associate's degree + 6 years of experience