Job Description:

Requirements

  • Strong understanding of data structures and algorithms
  • Strong understanding of solution and technical design
  • Has a strong problem-solving and analytical mindset
  • Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders
  • Able to quickly pick up new programming languages, technologies, and frameworks
  • Advanced experience building scalable, real-time, high-performance cloud data lake solutions
  • In-depth understanding of microservice architecture
  • Strong understanding of developing complex data solutions
  • Experience working on end-to-end solution design
  • Able to lead others in solving complex problems by taking a broad perspective to identify innovative solutions
  • Willing to learn new skills and technologies
  • Has a passion for data solutions

 Required and Preferred Skill Sets:

  • Hands-on, expert-level experience in AWS (EMR [Hive, PySpark], S3, Athena) or any other equivalent cloud platform
  • Hands-on, expert-level experience with Spark batch processing and some familiarity with Spark Structured Streaming
  • Working experience with the Hadoop stack, handling huge volumes of data in a scalable fashion
  • Hands-on experience with SQL, ETL, data transformation, and analytics functions
  • Hands-on Python experience, including batch scripting, data manipulation, and distributable packages
  • Experience working with batch orchestration tools such as Apache Airflow or equivalent
  • Experience working with code versioning tools such as GitHub or Bitbucket; expert-level understanding of repo design and best practices
  • Experience working with deployment automation tools such as Jenkins, and familiarity with containerization concepts such as Docker and Kubernetes
  • Hands-on experience designing and building ETL pipelines; expertise with data ingestion, change data capture, and data quality; hands-on experience with API development; some exposure to NiFi or Kafka
  • Experience designing and developing relational database objects; knowledgeable about logical and physical data modeling concepts; some experience with Snowflake
  • Experience supporting Tableau or Cognos use cases; familiarity with their tools and capabilities
  • Familiarity with Agile; working experience preferred
             
