Job Description:
IKCON TECHNOLOGIES INC delivers exceptional IT services and solutions that give clients a definite edge over competitors while promoting the highest standards of quality. We are currently looking for a Datalake/Hadoop Developer with AWS migration experience for one of our clients in Boston, MA. If you are actively looking for opportunities, please send us your updated resume with your contact details.

JOB TITLE Datalake/Hadoop Developer with AWS migration experience
CITY Boston
STATE MA
TAX TERMS C2C/W2
EXPERIENCE 8+ Years
INTERVIEW MODE Telephonic/Skype/In-person


JOB DESCRIPTION
Implementation and administration of the on-prem data lake environment
Monitoring and managing Hadoop services across 3 clusters
Installing new hosts (head nodes, compute nodes, and worker nodes) into the existing cluster and decommissioning hosts from the cluster
Maintenance and monitoring of jobs in the Production, UAT, and Development environments
Making code changes and deploying updated code in the UAT and Production environments
Deploying code changes on the Shiny Server and RStudio Server as per user requests
Implementing patching activities and applying fixes provided by Hortonworks to the data lake environment
Working on job failures, mostly Hive and Spark jobs, across the data lake environment
Onboarding new users to the Hadoop data lake environment
Supporting developers in executing ad hoc jobs in Hive environments for existing POCs such as enrollment_forecaster
Managing HDFS home directories and enforcing access-based policies at the Hive schema, table, and column level through Ranger
Implementing security and management of Ambari and other HDP services in the Hortonworks environment, along with Active Directory-based Kerberos authentication across data lake clusters
Managing encryption and decryption of user data using Ranger KMS across the data lake clusters
Installing and upgrading JupyterHub and Python packages to support developers implementing code in on-prem environments
Hail-Spark implementation and analysis of UK Biobank genotype and phenotype datasets
Working with the Hortonworks team on the planned upgrade of HDP from version 2.6 to 3.0
Support and maintenance of MongoDB servers in the data lake, and source code repository maintenance in Bitbucket

In addition to the above tasks, the resource will also perform the following AWS activities:

Support of the Cloudbreak server in AWS for the Hortonworks Cloudbreak deployment
Support of software upgrades for Cloudbreak and HDP package installation in the AWS cluster
Supporting data scientists with any technical issues during the execution of Spark-Hail jobs in the Cloudbreak AWS cluster
Setup of the latest versions of Spark and Hail in the AWS Spark cluster

MINIMUM QUALIFICATIONS

Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.
