Job Description:
Required Skills & Responsibilities:

Extremely strong T-SQL experience

At least 5 years of experience working in Big Data Technologies

At least 5 years of experience in Data warehousing

Bachelor's or Master's degree in Computer Science, Computer Science & Engineering, or an equivalent field.

Experience working with, or a strong understanding of, one or more ETL/Big Data tools, e.g. Informatica, Talend, Qubole, AtScale, Zaloni Data Platform, Infoworks, etc.

Strong understanding of and hands-on experience with the Big Data stack (HDFS, Sqoop, Hive, Java, etc.)

Big Data solution design and architecture

Design, sizing and implementation of Big Data platforms based on Cloudera or Hortonworks

Experience in configuring Azure or AWS components and managing data flows.

Deep understanding of the Cloudera and/or Hortonworks stack (Spark, installation and configuration, Navigator, Oozie, Ranger, etc.)

Experience in extracting data from feeds into a Data Lake using Kafka and other open-source components

Understanding of and experience with data ingestion patterns and building pipelines.
