Job Description:

Position: Data Engineer

Location: Charlotte, NC

Full-Time Role

Responsibilities

  • Develop data services and integrations on both on-premises and cloud-native stacks
  • Collaborate with architects, analysts, modelers, and domain SMEs
  • Work within a team to deliver data integration and engineering projects across a wide variety of systems, supporting business functions and processes to improve the end-user experience

What does your success look like in the first 90 days? What are we looking for?

  • We want strong collaborators who can deliver a world-class client experience
  • We are looking for people who thrive in a fast-paced environment, are client-focused and team-oriented, and are able to execute in a way that encourages creativity and continuous improvement

Requirements:

  • 7 years of experience in ELT and ETL technologies, including at least 1 year designing and implementing workflows and complex mappings using Informatica
  • 1 year of experience working with StreamSets
  • 1 year of experience working with Spark (Scala or Python)
  • 1 year of experience working with Apache Sqoop
  • Must have experience with batch process design, implementation, scheduling, monitoring, and enhancement using industry-standard tools such as AutoSys or Tivoli
  • Good hands-on experience with ANSI SQL and T-SQL programming, including SQL performance tuning, and working knowledge of NoSQL
  • Design, develop, document, and test data engineering solutions using industry-standard tools
  • Understand and support metadata management and information practices that improve data usability throughout its lifecycle
  • Present ETL documentation and designs to team members, conveying complex information in a clear and concise manner
  • Extract data from multiple sources, integrate disparate data into a common data model, and load data into a target database, application, or file using efficient ETL processes
  • Maintain technical documentation that facilitates understanding, maintenance, reuse, and refactoring of code
  • Experience using one or more ETL tools to orchestrate and ingest data
  • Experience with a DevOps toolchain that enables a CI/CD pipeline (e.g., GitHub, TFS, Jenkins, TeamCity, Octopus, Puppet)
  • Experience with the Agile development process

Preferences:
  • Experience working with AWS Lambda
  • Experience with AWS Redshift
  • Experience developing with cloud-based ETL tools such as Talend or AWS Glue is an added advantage
