Job Description:
1. 6 to 10 years of experience in relevant technologies.
2. Hands-on knowledge of Big Data technologies – Hadoop, Spark, Hive, Impala, HBase, Cloudera Manager, Sqoop, Flume, Pig, Kafka, Python, shell scripts, etc.
3. Experience with solution architecture, design, development, and delivery across the full software development life cycle.
4. Experience performing detailed data analysis using technical tools.
5. Should be able to develop code and contribute individually.
6. Hands-on working experience with one or more Hadoop distributions from Cloudera, Hortonworks, or MapR.
7. Experience with data wrangling tools and BI/reporting technologies – Paxata, MicroStrategy, Cognos, Datameer, Tableau, Arcadia, etc.
8. Understanding of the data life cycle – data acquisition, data quality management, data governance, and metadata management.
9. Experience with large-scale data warehouse implementations and knowledge of ETL technologies.
10. Excellent written and verbal communication skills.
11. Capable of building, articulating, and presenting new ideas to technical, non-technical, and business audiences.

Education and Experience:
12. Minimum bachelor's degree required; master's degree preferred.

