Job Description:
  • Write software to interact with HDFS and MapReduce.
  • Assess requirements and evaluate existing solutions.
  • Build, operate, monitor, and troubleshoot Hadoop infrastructure.
  • Develop tools and libraries, and maintain processes for other engineers to access data and write MapReduce programs.
  • Develop documentation and playbooks to operate Hadoop infrastructure.
  • Evaluate and use hosted Hadoop solutions on AWS, Google Cloud, or Azure (where hosted offerings are preferred).
  • Write scalable, maintainable ETL pipelines (where ETL workloads are required).
  • Understand Hadoop’s security mechanisms and implement them (where fine-grained security is required within the organization).
  • Write software to ingest data into Hadoop.
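To illustrate the kind of MapReduce programming the role involves, here is a minimal word-count sketch in pure Python — the canonical MapReduce example. This is a toy: it uses no Hadoop APIs, and in a real Hadoop Streaming job the mapper and reducer would be separate scripts reading from stdin and writing to stdout, with Hadoop handling the sort-and-shuffle between them.

```python
from itertools import groupby
from operator import itemgetter

def mapper(lines):
    """Map phase: emit a (word, 1) pair for every word, as a
    Hadoop Streaming mapper would write key/value lines to stdout."""
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reducer(pairs):
    """Reduce phase: sum counts per word. The sort() stands in for
    Hadoop's shuffle, which delivers pairs to reducers grouped by key."""
    for word, group in groupby(sorted(pairs), key=itemgetter(0)):
        yield word, sum(count for _, count in group)

if __name__ == "__main__":
    lines = ["the quick brown fox", "jumps over the lazy dog"]
    print(dict(reducer(mapper(lines))))
```

In production the same mapper/reducer logic would be submitted via Hadoop Streaming (or written in Java against the MapReduce API), with HDFS paths as input and output.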
