Job Description:
We need a resource with strong working knowledge of Spark, Hive, and Pig.
· Ability to design and implement end-to-end solutions.
· Build libraries, user-defined functions, and frameworks around Hadoop
· Exposure to AWS, Airflow, and Snowflake is desired.
· Research, evaluate, and utilize new technologies/tools/frameworks around the Hadoop ecosystem
· Develop user-defined functions to provide custom Hive and Pig capabilities
· Define and build data acquisition and consumption strategies
· Define & develop best practices
· Work with support teams in resolving operational & performance issues
· Work with architecture/engineering leads and other teams on capacity planning
· Work with Site-Operations team on configuration/upgrades of the cluster
· Excellent communication skills
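To give a flavor of the user-defined-function work mentioned above, here is a minimal sketch of a custom UDF as it might be written for Spark SQL. The function name and masking logic are hypothetical illustrations, not part of the role description; the Spark registration calls are shown in comments since they require a running SparkSession.

```python
# Hypothetical example of UDF logic: mask the local part of an email
# address (a common PII-handling transformation in data pipelines).
def mask_email(email):
    """Return the email with its local part masked, or None if invalid."""
    if email is None or "@" not in email:
        return None
    local, domain = email.split("@", 1)
    return local[0] + "***@" + domain

# In a Spark job, this plain function would be registered as a SQL UDF:
#   from pyspark.sql.types import StringType
#   spark.udf.register("mask_email", mask_email, StringType())
# and then used in queries, e.g.:
#   SELECT mask_email(email) FROM users
```

Hive UDFs serve the same purpose but are typically written in Java by extending Hive's UDF classes and packaged as a jar added to the session.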