Job Description:
10+ years of experience, with the last 3 years in data architecture covering traditional data warehouses and Hadoop data lakes
Has hands-on Hadoop experience
2-3 years' experience working with data models using Erwin or an equivalent tool is a must
Experience in data warehousing, driving data design and end-to-end processing
Experience with moderately complex SQL scripts (creating, reading, running, customizing)
Excellent verbal and written communication skills
Hands-on experience using AWS compute, networking, storage (EBS), and database services (RDS, NoSQL, Search)
Good understanding of elasticity and scalability concepts
Understanding of network technologies as they relate to AWS
Good understanding of the security features and tools that AWS provides
Hands-on experience with migration of on-prem data to AWS
Self-driven; requires minimal supervision
10+ years of experience with Java-based enterprise systems, including 4+ years specializing in Big Data solutions architecture
Able to document and clearly explain technical solutions to both technical and non-technical teams
Must have 10+ years of experience in Java and related technologies
Must have 3+ years of hands-on experience with one or more major Hadoop distributions and various ecosystem components (e.g. HDFS, Sqoop, Impala, Spark, Flume, Kafka, etc.)
Experience with the ELK stack
Experience developing web services (SOAP and RESTful)
Exceptionally strong coding and unit-testing skills to deliver defect-free code
Experience with UML modeling tools
Onsite/offshore coordination experience
Life insurance experience is a plus

Client: NA