Job Description:
Hadoop and Cloud Engineer for Big Data Infrastructure Team:
Required Experience:
§ Bachelor’s degree or technical college diploma in a technical field or equivalent experience
§ Strong working experience administering Hadoop, Kafka, and the Hadoop ecosystem
§ Working experience integrating big data platforms
§ Proficiency in building Hadoop clusters in the cloud and/or on AWS EMR
§ Understanding of the configuration and internal workings of Kerberized Cloudera clusters
§ Experience moving data across big data platforms such as Hadoop, Kafka, Spark, Cassandra, MongoDB, Greenplum, and relational databases
§ Understanding of user access management, authentication, and authorization
§ Strong knowledge of networking (subnets, VLANs, firewalls, security, interface configuration, bonding, DNS, proxies, TLS) and networking tools
§ Strong working experience with and understanding of Linux-based operating systems
§ Proficiency in shell scripting and Python
§ Familiarity with configuration automation tools (Chef, Puppet, Ansible, SaltStack, etc.); we use Chef
§ Familiarity with continuous integration and delivery
§ Strong communication skills
§ Must be a team player (DevOps)
§ Boundless passion for technology and a willingness to learn and teach
§ Strong work ethic, attention to detail, and problem-solving skills

Desired Experience:
§ DevOps-focused engineer with experience in Lean principles
§ Managing/administering Cloudera platforms
§ Experience working within an agile framework
§ Familiarity with encryption at rest and in transit
