Job Description:

Job Title: Apache Hadoop / Hive

Location: Hartford, CT

Duration: 1 year

L2 support for Hadoop, Hive, Python, Unix shell scripting, and GCP services

Job Summary:

  • Strong Hadoop, Hive, Python, and Unix shell scripting experience.
  • Scala and Spark experience is an added advantage.
  • Knowledge of Google Cloud Platform services such as Cloud Shell commands, Composer, PostgreSQL, and BigQuery, plus Snowflake and other GCP services.
  • Exceptional troubleshooting skills in Unix shell scripting and Hive.
  • Comprehensive understanding of modern data architecture and enterprise data management frameworks, concepts, and best practices.
  • Extensive experience working with cloud and modern data platforms such as Azure, Google Cloud, and Snowflake.
  • Experience working with relational databases such as Oracle, MySQL, PostgreSQL, IBM DB2, and SQL Server.
  • An active Google Cloud Data Engineer certification or Google Professional Cloud Architect certification is an advantage.
  • Minimum of 1 year of hands-on GCP experience, including at least one end-to-end solution designed and implemented at production scale.

Roles & Responsibilities:

  • Provide L2 support for Hadoop, Hive, Python, Unix shell scripting, and GCP services.
  • Respond to ServiceNow incident tickets in case of production job failures.
  • Resolve customer technical issues through diligent research, reproduction, and troubleshooting, followed by root cause analysis (RCA).
  • Work flexible schedules, which may include evenings, weekends, or holidays.
  • Construct and maintain ELT/ETL job processes sourcing data from disparate systems throughout the enterprise and loading it into an enterprise data lake.
  • Effectively acquire and translate user requirements into technical specifications to develop automated data pipelines that satisfy business demand.
