Job Description - Senior Data Engineer

• Strong understanding of data structures and algorithms
• Strong understanding of solution and technical design
• Has a strong problem-solving and analytical mindset
• Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders
• Able to quickly pick up new programming languages, technologies, and frameworks
• Advanced experience building scalable, real-time, high-performance cloud data lake solutions
• In-depth understanding of microservice architecture
• Strong understanding of developing complex data solutions
• Experience working on end-to-end solution design
• Able to lead others in solving complex problems by taking a broad perspective to identify innovative solutions
• Willing to learn new skills and technologies
• Has a passion for data solutions

Required and Preferred Skill Sets:
• 1-2 years of hands-on experience with AWS - EMR [Hive, PySpark], S3, Athena - or any other equivalent cloud; ability to solve complex problems
• 1-2 years of hands-on experience with Spark batch processing and some familiarity with Spark Structured Streaming; ability to solve complex issues
• 1-2 years' experience working with the Hadoop stack, dealing with huge volumes of data in a scalable fashion
• 2-3 years of hands-on experience with SQL, ETL, data transformation, and analytics functions; ability to solve complex problems
• 2-3 years of hands-on Python experience, including batch scripting, data manipulation, and distributable packages; ability to solve complex problems
• 2-3 years' experience working with batch orchestration tools such as Apache Airflow or equivalent, preferably Airflow
• 2-3 years working with code versioning tools such as GitHub or Bitbucket; expert-level understanding of repo design and best practices
• 2-3 years working with deployment automation tools such as Jenkins, and familiarity with containerization concepts such as Docker and Kubernetes
• 2-3 years of hands-on experience designing and building ETL pipelines; expert with data ingest, change data capture, and data quality; hands-on experience with API development; some exposure to NiFi or Kafka
• 2-3 years designing and developing relational database objects; knowledgeable on logical and physical data modeling concepts; some experience with Snowflake
• Preferred: 1+ years of experience supporting Tableau or Cognos use cases
• Familiarity with Agile; working experience preferred

Education:
• Bachelor's degree in IT or related field
• One of the following alternatives may be accepted:
  o Associates + 6 yrs;