Job Description:

Big Data Platform Engineer (Hadoop, AWS)

San Jose, CA

6+ months

Client is seeking a passionate and talented technical platform engineer with extensive experience managing and administering AWS and Hadoop clusters.

The Role Demands:

· To work closely with Hadoop developers, data scientists, and IT; the key responsibility is to build and support the Hadoop platform from scratch on cloud (AWS) or on-premise infrastructure, and to develop and provision the associated DevOps operations

· To provide long-term, effective solution design when architecting platform solutions: helping pick the right product for the problem, positioning it in the right spot, and demonstrating the platform's success in serving business needs

· To be instrumental in effectively owning day-to-day platform operations in a dynamically changing environment

· To enforce security compliance at each layer of the platform architecture

· To document all environment settings and configurations

· To proactively plan and upgrade the environment, both hardware and software (wherever applicable)

· To provide front-line support to the various teams using the Hadoop environments

Essential Skills:

· Candidates with 5 to 12 years of relevant IT experience will be considered

· Should possess very strong solution design capabilities and experience with major AWS provisioning services or on-premise cloud infrastructure services

· Should have production-grade Hadoop administration experience on a major distribution such as Cloudera (CDH 5.x), Hortonworks (HDP 2.x), Apache open source, or comparable

· Should have strong competency with key Hadoop stack components such as HDFS, YARN, MR2, Hive, Pig, HCatalog, Oozie, Impala, Spark, Kafka, Sentry, and Hue

· Should have strong knowledge of NoSQL and analytical databases such as HBase and HP Vertica

· Should have strong experience in automation through Shell or Python scripting

· Should have strong experience with DevOps tools such as Jenkins, Git, Ansible, Chef, and Puppet

· Proven ability to devise backup strategies for big data systems

· Should have strong Linux capabilities for server build activities involving LDAP, proxy servers, Active Directory, SSH tunneling, SSL and SAML authentication, Kerberos, MFA, load balancing, and HAProxy

· Proven ability to set up real-time streaming applications with big data ecosystem tools such as Kinesis or Kafka

· Expert knowledge of Hadoop hardware and network infrastructure

· Proven ability to install and configure software binaries for key BI/stats products (e.g. Qlik®, SAS®, Tableau®, Cognos®, Excel®)

· Should have experience with critical implementations and cluster build/upgrade activities on big data systems

· Proven experience in defining and developing administration standards, policies, and procedures

· Strong communication, influencing, and collaboration skills across all levels of the organization

Basic Qualifications:

· Bachelor's or Master's degree in Computer Science or a related field from a reputed institution

· 5-8 years of professional experience in software development, most of it at a product company