Job Description:
Minimum of 5 years of hands-on experience with Data Integration, ETL, and/or Data Engineering

Minimum of 5 years of hands-on experience with Big Data tools and techniques

Minimum of 3 years of hands-on experience with Talend used in conjunction with Hadoop MapReduce, Spark, and Hive

Ability and willingness to work hands-on, including using SQL, ETL tools, and scripts to tune performance and data transformation routines when necessary

Creating data flow documentation and business process models

Designing and developing data integrations

Leveraging Big Data capabilities and understanding the schema-on-read approach and its implications

Hands-on experience with the Hadoop ecosystem (Cloudera preferred), including MapReduce, Spark, Hive/Impala, HBase, and Kafka