Job Description:

Responsibilities:
- Develop data ingestion processes and scripts from on-premise data systems into AWS S3 buckets
- Develop data warehousing, data quality, ELT, and data cataloging capabilities in AWS
- Perform tests to optimize performance and resource utilization
- Gather and address AWS data lake technical design requirements
- Recommend and/or validate AWS data lake solution architecture
- Provide AWS training and support to internal teams
- Build reusable CloudFormation code and libraries for future use
- Liaise with developers, designers, and system administrators to identify new features
- Evaluate emerging technologies
- Develop transformation scripts using Redshift, RDS, and/or ETL tools

Experience:
- 2+ years of hands-on experience with Amazon Web Services (AWS)
- 2+ years of AWS experience, including Kinesis, Lambda, RDS, Redshift, S3, Glacier, DynamoDB, SQS, SNS, ElastiCache, Elasticsearch, EC2, and ECS
- 3+ years building data lake solutions
- AWS architecture certification is a plus
- Significant experience with ELT tools such as SnapLogic, Alooma, Fivetran, etc.
- Experience with a variety of data stores, including AWS S3, AWS RDS (Oracle), MongoDB, and AWS ElastiCache
- Experience with message queueing platforms such as SQS, Kinesis, and Kafka
- Experience with other data technologies such as Spark and Kibana
- Demonstrated work on highly scalable, distributed applications
- Experience budgeting and cost modeling AWS solutions
- Proven track record of identifying, architecting, and building new technology solutions to solve complex business problems
- Experience in a consulting environment
- Good presentation and communication skills
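To make the first responsibility concrete, here is a minimal sketch of ingesting an on-premise extract into an S3 bucket. The date-partitioned key layout (`raw/<system>/<table>/year=.../month=.../day=.../`) and the function names are illustrative assumptions, not part of the posting; the upload itself uses boto3's standard `upload_file` call and requires AWS credentials to run.

```python
from datetime import date

def s3_object_key(source_system: str, table: str, as_of: date) -> str:
    """Build a date-partitioned S3 key for an ingested extract.

    The layout is an assumed convention for illustration:
    raw/<system>/<table>/year=YYYY/month=MM/day=DD/<table>.csv
    """
    return (
        f"raw/{source_system}/{table}/"
        f"year={as_of:%Y}/month={as_of:%m}/day={as_of:%d}/{table}.csv"
    )

def ingest_to_s3(local_path: str, bucket: str, key: str) -> None:
    """Upload one extract file to S3 (needs boto3 and AWS credentials)."""
    import boto3  # imported here so the key helper stays dependency-free
    boto3.client("s3").upload_file(local_path, bucket, key)
```

A nightly job would call `s3_object_key("erp", "orders", date.today())` and pass the result to `ingest_to_s3`; the date partitions make the landed files easy to catalog and query downstream.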
