Job Description:

Hadoop ETL Developer

Preferred location is Atlanta, GA; other locations include, but are not limited to, Richmond, VA; Norfolk, VA; Mason, OH; Woodland Hills, CA; and Thousand Oaks, CA.


Description

Looking for a Hadoop Developer to build end-to-end business solutions and to work with one of the leading healthcare providers in the US. The ideal candidate must possess an excellent background in Hadoop development with ETL, along with excellent written and verbal communication skills and the ability to collaborate effectively with the domain and technical experts on the team.

Primary Skills

  • Should have thorough understanding of Hadoop concepts
  • A mixture of Software Development, Data Engineering, and Data Science skills
  • Strong Software Development and Engineering skills with some basic knowledge in Machine Learning and Deep Learning
  • Ability to write efficient code; a good understanding of core data structures and algorithms is critical for engine development
  • Good Python skills, following software engineering best practices
  • Comfort and familiarity with SQL and the Hadoop ecosystem of tools, including Spark
  • Understanding of foundational Machine Learning concepts and some Deep Learning basics
  • 5+ years' experience with any flavor of SQL, dealing with complex queries, analytics, and data models
  • 3+ years' experience in a modern programming language
  • Must have 4+ years of experience in Spark
  • Should have Python programming experience
  • Should have experience with AWS services such as Glue, S3, Athena, and EMR
  • Should have experience in Windows PowerShell scripting
  • Good Analytical skills and experience in cross functional team environment
  • Healthcare domain knowledge is preferred
  • The work will entail heads-down coding, testing, data analysis, and component packaging/deployment
  • Strong SQL, Teradata, and Unix skills; strong standard Big Data/Hadoop skill set (Hive, Sqoop, HDFS, etc.); strong Spark/Python/Scala
  • 3+ years of solid experience with the above skill set
  • 7+ years of ETL application development and implementation experience using Teradata and Informatica or another ETL tool
  • Must be an expert in script development using BTEQ
  • Must have 5+ years of experience in Unix scripting
  • Must have 5+ years of experience in Informatica / ETL tools
  • Must have experience in performance tuning, query optimization
  • Must have experience in health care industry
  • Must have experience in building data warehouses and/or data marts
  • Must have experience with database design, data modelling, dimensional modelling concepts, ETL, data development, data integration, data distribution, data quality, and analytics
  • Must have hands-on experience with BTEQ, TPT load utilities, Viewpoint, writing complex SQL queries involving aggregation and OLAP functions, query tuning, and Teradata performance tuning
  • Should have experience at the architectural level
  • Should have worked in Agile development and Scrum methodology
  • Should have experience in onshore/offshore coordination
  • Experience in Medicare business is an added advantage
  • Good verbal, written, and interpersonal communication skills
  • Should have experience leading a 10+ member team
  • Should have worked with a job scheduling tool such as Control-M

Responsibilities

  • Design, code, install, and maintain appropriate systems software programs.
  • This is a lower-level development role, which means absolute hands-on expertise is mandatory.
  • Identify, evaluate, tailor, and direct the implementation of vendor-supplied software packages.
  • Ensure the maintenance of adequate software systems documentation.
  • Conduct quality assurance activities such as peer reviews and test plan development.
  • Work on a sprint team in an agile, rapid development and deployment environment.