Job Description:
Qualifications
BS Degree in Computer Science/Engineering preferred.
7+ years of IT experience
2+ years of experience in Hadoop required
4+ years of Java experience preferred
4+ years of experience in ETL with tools such as Informatica, DataStage, Talend, or Ab Initio preferred
Exposure to data wrangling tools an advantage
Exposure to Java and procedural languages such as PL/SQL, C, and C++
Understanding of storage, filesystems, disks, mounts, and NFS
Development experience with Sentry, Cloudera Manager, Hive, HBase, and Impala preferred
Excellent customer service attitude, communication skills (written and verbal), and interpersonal skills.
Experience working in cross-functional, multi-location teams.
Excellent analytical and problem-solving skills.
Experience in the financial services industry preferred
Responsibilities
Lead the build-out of the Hadoop platform and Java applications.
Support the application until it is handed over to Production Operations.
Provide direction to junior programmers
Create technical documentation, including architecture, data flow, and server configuration diagrams
Complete Risk questionnaires for the application
Drive the project team to meet planned dates and deliverables
Work with big data developers to design scalable, supportable infrastructure.
Work with the Linux server administration team to administer the server hardware and operating system
Assist with developing and maintaining the system runbooks.
Develop and manage tools to migrate data from traditional databases to the Big Data environment.
If you are comfortable with the requirements, please forward your profile to murali@keylent.com or feel free to reach me at (407) 401-7718.