Job Description:

Senior Data Engineer

**Role will be remote until the office opens in Westlake, TX 76262 or Lone Tree, CO 80124**

12-18 Month Contract

Must Haves:

  1. Data Warehousing
  2. Expert in SQL
  3. Experience with ETL tools: Informatica preferred
  4. Understanding of big data
  5. Understanding of cloud: AWS preferred
  • Bachelor's degree in Computer Science or related discipline
  • Experience with a structured application development methodology using an industry-standard Software Development Lifecycle, in particular Agile methodologies, is required
  • 6+ years of overall IT experience with a strong understanding of best practices for building and designing ETL code is required
  • 5+ years of experience with ETL tools is required; specific expertise implementing Informatica or Talend in an enterprise environment is a plus
  • Experience architecting end-to-end data consumption from all source systems of interest
  • Expertise in schema design, developing data models and proven ability to work with complex data is required
  • Hands-on experience in Java object-oriented programming (at least 2 years)
  • Hands-on experience with Hadoop, MapReduce, Hive, Pig, Flume, Storm, Spark, Kafka and HBase (at least 3 years)
  • Understanding of Hadoop file formats and compression is required
  • Familiarity with MapR distribution of Hadoop is preferred
  • Understanding of best practices for building a Data Lake and analytical architecture on Hadoop is required
  • Strong scripting/programming skills in UNIX shell, Python, Java, Scala, etc. are required
  • Strong SQL experience with the ability to develop, tune and debug complex SQL applications is required
  • Experience with real-time data ingestion into Hadoop is required
  • Experience with, or a deep understanding of, cloud-based data technologies (GCP/AWS) is preferred
  • Proven experience in working in large environments such as RDBMS, EDW, NoSQL, etc. is preferred
  • Knowledge of Big Data ETL tools such as Talend and Informatica IICS/BDM is preferred
  • Understanding of security, encryption and masking using Kerberos, MapR tickets, Vormetric and Voltage is preferred
  • Experience with Test-Driven Development and SCM/CI tools such as Git and Jenkins is preferred
  • Experience with graph databases is preferred
  • Strong experience with SQL Server, Oracle and MongoDB preferred
  • Experience with ActiveBatch and Control-M job scheduling preferred
  • Excellent analysis, debugging, troubleshooting and problem-solving skills
  • Good verbal and written communication skills
  • Ability to thrive in a flexible and fast-paced environment across multiple time zones and locations
  • Experience in Financial Services industry a plus.