Job Description:
Required Skills:

At least 7 years of related, organizationally-based experience is required, including:
Experience working on large-volume databases is required.
Hadoop experience is required.
Shell scripting experience is required.
Teradata, Data Integration, and cloud technologies experience is preferred.
Experience working on complex projects with multi-tier architecture is required.
Expert knowledge of the Big Data ETL / Hadoop stack.
Minimum 5 years of development experience on the Hadoop platform, including Pig, Hive, Sqoop, HBase, Flume, Spark, and related tools.
Minimum 1 year of development experience with AWS EMR, Lambda, DynamoDB, and S3.
Minimum 7 years of Data Integration development experience.
Expert knowledge of relational and dimensional data concepts.
Expert knowledge of ETL tools such as SSIS, Informatica PowerCenter, or SnapLogic.
Working knowledge of SQL including DDL and DML.
             
