Job Description:
Solid understanding of data migration from on-premises to cloud data warehouse (DW) solutions. Strong understanding of Snowflake features, including warehouses, clustering, etc. Exposure to Agile and DevOps tools.

Skills:

Minimum Qualifications:
5+ years' experience working in an environment focused on data initiatives/projects
5+ years' experience in software development using a variety of languages such as Java, Python, Scala, etc.
3+ years' experience architecting solutions within a Big Data environment (preferably in the cloud)
1-2 years' experience designing, deploying, and/or managing services in AWS, with an understanding of how to manage cloud resources economically
1-2 years' experience implementing Snowflake on AWS
Excellent communication and presentation skills

Preferred Qualifications:
Experience within the tech industry with a strong understanding of related data
Ability to define and architect future-state data architectures incorporating the latest industry best practices, using Snowflake and AWS
Experience with AWS data warehouse services (e.g., S3, Glue/Crawlers, Snowflake, Athena, EMR)
Experience designing and implementing development frameworks in PySpark
Understanding of various BI tools (e.g., Cognos, Tableau, AWS QuickSight)
Experience with continuous integration tools (e.g., Git, Jenkins)
Experience working with containers and container orchestration systems
Experience implementing data security using database and non-database technologies
Understanding of Snowflake architecture
