Job Description:
Mandatory Skills:
Apache Kafka and Apache NiFi message brokering systems
Expert knowledge of the Hortonworks Big Data platform
ETL job development using Apache Sqoop
Elasticsearch implementation on Big Data platforms
Hadoop Big Data platform administration

Desired Skills:
Knowledge of Oracle, SQL, and Sybase databases
Azure Data Lake and data warehousing
Tableau for Big Data analytics
Strong communication skills
Healthcare IT knowledge

Job Responsibilities:
Develop and maintain solutions on Apache NiFi and Kafka
Develop, analyze, and maintain all Sqoop ETL jobs according to Mayo standards
Schedule and monitor ETL jobs
Administer the Hadoop platform to support large-volume ETL jobs
Interact with system stakeholders on the archival of old data
Work actively with Architects, Business Analysts, Data Analysts, and Report Developers
Troubleshoot and fix code