Job Description:
Responsible for the modernization of enterprise data solutions on the AWS cloud, integrating native AWS services and 3rd-party data technologies.
Solid experience with, and understanding of, the considerations for large-scale delivery, solutioning, and operationalization of data lakes, data warehouses, data services, and analytics platforms on AWS is a must.
We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate the ability to design the right solutions with an appropriate combination of AWS and 3rd-party technologies for deployment on the AWS cloud.
Strong AWS, Hadoop, and data warehousing skills are required to enable an efficient move to the cloud.

Basic Qualification:
3-4 years of building and operationalizing large-scale enterprise data solutions, data lakes, and applications using one or more AWS data and analytics services in combination with 3rd-party technologies: EC2, EMR, S3, Kinesis, DynamoDB, Redshift, RDS, Lambda, Glue, Spark, Snowflake, etc.
Experience integrating 3rd-party KMS and HSM offerings with AWS data services to build secure data solutions
Experience with the Hadoop stack (HDFS, Pig, Hive, Spark, Ambari, Sqoop, MapReduce, Tez, Ranger, etc.)
Experience with data lake management tools like Podium Data, Diyotta, Informatica BDE, etc. is a plus
Minimum 3 years of hands-on experience analyzing, re-architecting, and re-platforming on-premises data lakes to data platforms on the AWS cloud using AWS/3rd-party services
Minimum 3 years of designing and building production data pipelines from ingestion to consumption within a hybrid big data architecture, using Podium, Java, Python, Scala, C++, etc.
Minimum 3 years of architecting and implementing next-generation data and analytics platforms on the AWS cloud, serving analytics and BI application integrations
Hands-on AWS experience, with a minimum of 3 years of solution design, build, and implementation at production scale
5-8 years of demonstrated knowledge and application of ETL and data warehousing best practices
5-8 years of experience with SQL against relational databases, preferably SQL Server and Oracle (10g and above) on Linux/Unix
1-2 years' exposure to Logi, SAS, Tableau, R, or other dashboarding/reporting tools is a plus
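The "ingestion to consumption" pipeline work described above can be illustrated with a minimal local sketch. This is a hypothetical, self-contained example using only the Python standard library; a production pipeline would instead read from sources such as S3 or Kinesis and serve results through Redshift or a BI tool, and all names and data here are illustrative.

```python
# Minimal sketch of an ingestion -> transformation -> consumption pipeline.
# Hypothetical data; production code would use AWS services, not in-memory strings.
import csv
import io

RAW_EVENTS = """user_id,amount,currency
u1,10.50,USD
u2,3.25,USD
u1,7.00,USD
"""

def ingest(raw: str) -> list[dict]:
    """Ingestion: parse raw CSV records into dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(records: list[dict]) -> dict[str, float]:
    """Transformation: aggregate spend per user."""
    totals: dict[str, float] = {}
    for rec in records:
        totals[rec["user_id"]] = totals.get(rec["user_id"], 0.0) + float(rec["amount"])
    return totals

def serve(totals: dict[str, float]) -> list[tuple[str, float]]:
    """Consumption: expose sorted results to a reporting/BI layer."""
    return sorted(totals.items())

if __name__ == "__main__":
    print(serve(transform(ingest(RAW_EVENTS))))
```

The three stages map to the hybrid architecture named above: ingestion from a raw source, transformation within the pipeline, and a consumption interface for analytics applications.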

Responsibilities:
Work with implementation teams from concept to operations, providing deep technical subject-matter expertise for successfully deploying large-scale data solutions in the enterprise using modern data/analytics technologies, both on-premises and in the cloud.
Create detailed target-state technical, security, data, and operational architecture and design blueprints that incorporate modern data technologies and cloud data services and demonstrate the modernization value proposition.
Build, test, and deploy solutions using cloud data services.
Provide business analysis and programming expertise within an assigned business unit/area in the analysis, design, and development of business applications.
Participate in business and IT project estimation activities. Provide technical leadership for small to medium-scale projects. Utilize business knowledge to collaborate and offer technical solutions.
Be self-guided and complete tasks with minimum assistance.

Other Requirements:
Under minimal supervision, effectively analyze and resolve medium- to moderately large-risk production problems related to assigned applications, assess alternatives, and implement long-term solutions.
Successful performance includes demonstrated ownership and timely responses to production problems and business unit inquiries.
Ability to communicate effectively with business partners.
Ability to provide fundamental technical and business analysis on projects.
Strong technical knowledge and leadership, with hands-on experience managing systems development in new computing architectures and environments.
Knowledge of relevant technology and tools is critical, including development methodologies and programming/scripting languages.
Ability to accurately estimate project development activities.
Experience with design, development, and implementation of new computing architectures.

Preferred Academic Skills:
Bachelor’s degree in Computer Science, Information Systems, or other technical fields
AWS/Hadoop certifications in any of the relevant technical skills are preferred.