Job Description:
Required Experience:
Bachelor’s degree or technical college diploma in a technical field, or equivalent experience
Strong working experience administering Hadoop, Kafka, and the Hadoop ecosystem
Working experience in integrating big data platforms
Experience moving data across big data platforms such as Hadoop, Kafka, Spark, Cassandra, MongoDB, Greenplum, and relational databases
Understanding of the configuration and internal workings of Cloudera Kerberized clusters
Understanding of user access management, authentication, and authorization
Strong knowledge of networking (subnets, VLANs, firewalls, security, interface configuration, bonding, DNS, proxies, TLS) and networking tools
Strong working experience with, and understanding of, Linux-based operating systems
Proficiency in shell scripting and Python

Familiarity with configuration automation tools such as Chef, Puppet, Ansible, or SaltStack (we use Chef)
Familiarity with continuous integration and delivery
Strong communication skills
Must be a team player (DevOps culture)
Boundless passion for technology and a willingness to learn and teach
Strong work ethic, attention to detail, and problem-solving skills

Desired Experience:
DevOps-focused engineer with experience in Lean principles
Managing/administering Cloudera platforms
Experience working within an agile framework
Familiarity with encryption at rest and in transit