Job Description:
Data Warehouse Architect

- Strong background in architecting and developing technical solutions for data warehouses and data lakes.
- Experience with the Snowflake cloud data warehouse.
- Experience working in an AWS cloud computing environment (S3, EMR, Lambda, RDS, DynamoDB, etc.).
- Good understanding of and experience with ETL and reporting tools such as Informatica (on-premises and cloud), OBIEE, and Tableau.
- Experience designing and troubleshooting complex data extraction routines and real-time/batch integrations.
- Experience migrating a data warehouse from on-premises to the cloud (Snowflake preferred).
- Understanding of and experience with Python and shell scripting.
- Experience extracting and processing data from multiple data sources, such as databases (Exadata, Oracle, Snowflake), applications, web services (SOAP/REST), flat files (.csv, .txt, .xml, .json), and streaming data.
- Define DWH and data lake standards, guidelines, and best practices for business groups and technical teams.
- Diagnose and resolve DWH and data lake tool capacity issues.
- Recommend strategies to improve the performance and capacity of DWH tools.
- Exposure to version control (GitLab, GitHub, Bitbucket) and CI/CD (GitHub, Jenkins, etc.) tools and methodologies.
- Experience writing and troubleshooting complex SQL queries.
- Must be able to quickly understand existing architecture and application requirements.
- Must be able to follow software development processes, prepare detailed documentation, and generate work estimates in Agile and SDLC environments.
- Excellent verbal and written communication skills, including the ability to explain technical concepts and technologies to business leaders.
             
