Job Description :

We have an immediate need for several Data Engineers with Spark/Java and Ab Initio and/or Informatica experience. The position is based out of Charlotte and can be done remotely.

Requirements:

  • 5+ years' experience with ETL development tools: Spark/Java (preferred), Python, Ab Initio, Informatica
  • Advanced experience and demonstrated proficiency in Core Java and Spark.
  • Experience with Apache Beam, Cloud Dataflow, Cloud Composer, and BigQuery
  • 5+ years' experience in Big Data technologies, distributed multi-tier application development, database design, data processing, and data warehousing.
  • Strong experience with SQL queries and stored procedures; data profiling, data analysis, and data validation skills.
  • Deep DBMS experience (incl. Oracle, Teradata, and SQL Server) and advanced understanding of data warehousing ETL concepts (esp. change data capture).
  • Strong experience with database design, architecture best practices, normalization, and dimensional modeling.
  • Experience developing complex UNIX shell scripts.
  • Experience with Git or similar source code versioning tools and coding standards.
  • Experience with scheduling tools such as Autosys.