Job Description:

Job Title: Big Data Developer with Hadoop, Hive and Spark

Location: Hartford, CT

Duration: 1 year

Roles & Responsibilities:

  • Good understanding of the Hadoop ecosystem and YARN architecture; writing high-performance Hive queries.
  • Hands-on experience in Spark with Python/Scala; hands-on with loading and manipulating large data sets using Spark SQL and Hive (an illustrative sketch follows this list).
  • Knowledge of debugging and troubleshooting Hadoop jobs.
  • Good communication and client-interfacing skills; prepare implementation plans as per the need and build the in-scope applications in Big Data technologies.
  • Responsible for all technical deliveries of the project; good understanding of Agile and DevOps methodology.
  • Good communication and soft skills.
  • Prior experience with US customers is nice to have. Should have worked in an offshore delivery model. Should be strong in Unix, SQL, and PL/SQL.
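As an illustration of the Spark SQL and Hive work referenced above, here is a minimal PySpark sketch. It assumes a Hive-enabled Spark deployment; the database, table, and column names (claims_db.policy_claims, state, claim_date, claim_amount) are hypothetical placeholders, not part of this role's actual environment.

    # Minimal PySpark sketch: load a Hive table via Spark SQL, aggregate it,
    # and write the result back to Hive. Names below are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (
        SparkSession.builder
        .appName("hive-sparksql-example")
        .enableHiveSupport()   # enables querying Hive tables through Spark SQL
        .getOrCreate()
    )

    # Load a large Hive table through Spark SQL.
    claims = spark.sql("SELECT * FROM claims_db.policy_claims")

    # Example manipulation: total claim amounts per state and year.
    summary = (
        claims
        .groupBy("state", F.year("claim_date").alias("claim_year"))
        .agg(F.sum("claim_amount").alias("total_claims"))
    )

    # Persist the result as a managed Hive table.
    summary.write.mode("overwrite").saveAsTable("claims_db.claims_summary")

    spark.stop()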
             
