Job Description:
Senior Data Engineer
Location: Charlotte, NC
Type: Contract
Client: Cognizant

We are currently looking for a Senior Data Engineer to join our Asset and Liability Technology team at Wells Fargo Corporate Treasury. The Senior Data Engineer is responsible for providing application design guidance and consultation, drawing on a thorough understanding of the applicable technology, tools, and existing designs. The candidate will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data analyst, data pipeline builder, and data wrangler who enjoys analyzing data and building and optimizing data systems from the ground up. Duties include solution design and development, and may include ongoing operational support of the code line.

Key Responsibilities:
- End-to-end ownership of ETL data pipelines, from data ingestion to consumption by business intelligence and advanced analytics teams.
- Design and build an automated, self-service data platform, freeing teams to focus on customer features and analysis.
- Evolve existing tools and frameworks to support new scalability requirements, as well as new functionality as needed.
- Identify and drive new solutions that enhance the development cycle and increase development productivity.
- Work with product owners to identify and mature upcoming business needs, and develop a technical backlog to answer those needs in a timely manner.
- Work with the team to identify and resolve technical debt, improving the team's throughput.

Skills:
- Strong communication skills.
- Deep experience designing and implementing highly scalable, distributed application systems.
- 5+ years' experience building data pipelines.
- 5+ years' experience programming in Python.
- Extensive knowledge of fine-tuning SQL, including query optimizers and execution plans.
- Extensive experience architecting complex data models that handle millions of transactions.
- Experience in application design and implementation using agile practices and TDD.
- Experience leveraging open-source data infrastructure projects such as Apache Spark, Kafka, and Flink.
- Strong understanding of the software development life cycle and release management.
- Past experience integrating with Oracle and Microsoft SQL Server.
- Self-motivated, independent team player.

Education:
Bachelor's Degree in Computer Science, Math, Engineering, MIS, or a related field, or equivalent experience.