Job Description:
Understand detailed requirements, provide inputs for test cases, prepare test data, and execute the test cases with the objective of ensuring the fewest defects within the client and project specifications.
Good testing experience; should have strong hands-on skills in Hive, Sqoop, Pig, Shell Scripting, Scala, and Spark (Core, Spark SQL, PySpark).
Good communication skills.
Interact with the customer to gather requirements, prepare the requirement analysis documents, and contribute to design and development.
Understand business requirements and build a framework that loads data from the source systems into the data lake.
Integrate transformation and processing with the ingestion framework.
Build an end-to-end solution, from acquiring the data to making it available for reporting and dashboards. We used shell scripting, Scala, Python, Spark, PySpark, and MySQL.
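The following is a minimal PySpark sketch of such an ingestion step, reading a source table from MySQL over JDBC and landing it in the data lake as Parquet; the host, database, table, credentials, partition column, and paths are illustrative assumptions, not the actual project configuration:

# Minimal ingestion sketch: MySQL source table -> data lake (Parquet).
# All connection details, names, and paths below are hypothetical.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("mysql-to-datalake-ingestion")
         .getOrCreate())

# Read the source table over JDBC (the MySQL driver jar must be on the classpath).
source_df = (spark.read.format("jdbc")
             .option("url", "jdbc:mysql://source-host:3306/sales_db")
             .option("dbtable", "orders")
             .option("user", "etl_user")
             .option("password", "etl_password")
             .load())

# Land the data in the lake as Parquet, partitioned by a date column
# (assumes the source table exposes an order_date column).
(source_df
 .write
 .mode("append")
 .partitionBy("order_date")
 .parquet("hdfs:///datalake/raw/sales_db/orders"))

spark.stop()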
Worked on a data ingestion program, involved in the requirements gathering, analysis, design, development, testing, and deployment phases of the project life cycle.
Proposed design solutions to process the ingested data in the Hadoop data lake in varying file formats such as text, JSON, and sequence files.
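As an illustration of handling these formats, the sketch below reads text, JSON, and sequence files with Spark; the directory paths and the key/value layout of the sequence files are assumptions rather than the project's actual lake layout:

# Sketch of reading the varying landing formats mentioned above.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("multi-format-processing").getOrCreate()

# Plain text files: each record becomes one string column named "value".
text_df = spark.read.text("hdfs:///datalake/raw/events/text/")

# JSON files: schema is inferred from the records.
json_df = spark.read.json("hdfs:///datalake/raw/events/json/")

# Sequence files: read (key, value) pairs through the RDD API,
# then convert to a DataFrame (assumes string-like keys and values).
seq_rdd = spark.sparkContext.sequenceFile("hdfs:///datalake/raw/events/seq/")
seq_df = seq_rdd.toDF(["key", "value"])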
Designed and developed Sqoop scripts to import the data from various source databases.
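A hypothetical example of such an import, wrapped in Python in the spirit of the scripting used on the project; the connection URL, table, password file, mapper count, and target directory are all assumptions:

# Hypothetical wrapper that assembles and runs a Sqoop import command.
import subprocess

sqoop_cmd = [
    "sqoop", "import",
    "--connect", "jdbc:mysql://source-host:3306/sales_db",
    "--username", "etl_user",
    "--password-file", "hdfs:///user/etl/.mysql_password",
    "--table", "orders",
    "--target-dir", "hdfs:///datalake/raw/sales_db/orders_sqoop",
    "--num-mappers", "4",
    "--as-textfile",
]

# Run the import and fail loudly if Sqoop returns a non-zero exit code.
subprocess.run(sqoop_cmd, check=True)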