Job Description:
1. You should be passionate about the development and modernization of enterprise web applications.
2. Hands-on experience with Hive, Apache Hadoop, Spark, and Java/Python is a must.
3. Work collaboratively as part of a team and be able to architect applications using modern cloud-native principles.
4. Have participated in, and be familiar with, Agile (Scrum) project methodology and practices.
5. Should have hands-on experience in data architecture, Big Data, and ETL environments.
6. Should have experience with ETL design using tools such as Informatica, Talend, Oracle Data Integrator (ODI), Dell Boomi, or equivalent, along with a hands-on understanding of Java development for building enterprise applications.
7. Should have extensive experience with Big Data and analytics solutions such as Hadoop, Hive, Spark, Spark SQL, Storm, AWS (EMR, Redshift, S3, Glue), Snowflake, data lake design, and machine learning (SageMaker, TensorFlow).
8. Knowledge of how to set up a data science environment and prepare it for development and testing activities in AWS.
9. Strong hands-on knowledge of solution languages such as Java, Scala, or Python; any one of these is fine.
10. Experience with continuous integration/deployment tools and standard DevOps methodologies.
11. Experience working with JIRA, Git, Bitbucket, JUnit, and other code management toolsets.
12. Ability to quickly understand business requirements and propose reference architectures and technology solutions.
13. Capable of presenting options and weighing implementation complexities and risks to recommend the right technical decision.


Client: NA
