Job Description:
Mandatory Skills : Talend, PySpark, ETL, AWS, Java/Python, SQL, SparkSQL

· Analyze, design, implement, and maintain Talend applications.

· Advanced ETL and EAI development using the Talend Big Data platform.

· Serve as a senior technical contributor on complex ETL and EAI development projects with multiple team members.

· 2+ years with Big Data Hadoop clusters (HDFS, YARN, Hive, MapReduce) and Spark.

· 2+ years of recent experience building and deploying applications in AWS (S3, Hive, Glue, EMR, AWS Batch, DynamoDB, Redshift, CloudWatch, RDS, Lambda, SNS, SQS, etc.).

· 4+ years of Java/Python, SQL, SparkSQL, and PySpark.

· Excellent problem-solving skills; strong verbal and written communication skills.

· Ability to work independently as well as part of a team.

· Develop relevant functional and technical documentation.

· Hands-on experience with data transformation logic, validation, and troubleshooting.

· Knowledge of data warehousing is preferred.

· SQL knowledge is preferred.

· Good communication.

Experience:

10+ years is mandatory.
