Job Description:
Bachelor’s degree or equivalent work experience, with a minimum of 5 years of IT experience.

Required Skills:
Hands-on experience building Big Data solutions using the Hadoop ecosystem
Experience with Big Data ingestion and integration tools such as Hadoop, Spark, Sqoop, Kafka, and Flume, in cloud and on-premises environments
Demonstrable understanding of and experience with enterprise data warehousing, big data, cloud, BI and analytics, content management, and data management
Programming experience with tools/technologies such as Hadoop/HDFS, Sqoop, Pig, Hive, HBase, Flume, YARN, MapReduce, Spark, etc.
Demonstrated excellent communication skills, including the ability to communicate effectively with internal and external customers
Ability to use strong industry knowledge to relate to customer needs and resolve customer concerns; high level of focus and attention to detail
Strong work ethic and good time management, with the ability to work with diverse teams and lead meetings