Job Description:
Position: ETL Hadoop
Location: Jersey City, NJ
Job Type: Full Time & C2H

Job Description:
Bachelor’s degree or foreign equivalent required. Will also consider one year of relevant work experience in lieu of every year of education.
Minimum of 5 years of work experience in the information technology field.
Minimum of 4 years of hands-on experience with DW/BI and ETL tools such as Informatica and Microsoft SSIS, and strong knowledge of databases such as Oracle, SQL Server, and Teradata.
Minimum of 2 years of hands-on experience with Big Data technologies.
Experience in the full software development lifecycle of data warehousing projects.
Experience with a self-service BI tool is preferred.
Develop ETL jobs using ETL tools and SQL objects/procedures based on requirements, applying various transformations.
Good conceptual knowledge of systems analysis and design methodologies.
Experience with Hadoop ecosystem products such as HDFS, MapReduce, Hive, Avro, ZooKeeper, Oozie, Flume, Impala, and Sqoop.
Expertise in building distributed systems, query processing, database internals, or analytic systems.
Expertise with data schemas, including logical and physical data modeling.
Experience with Spark, HBase, Java (MapReduce), and Python development.
Experience loading data into HDFS from heterogeneous databases (DB2, Oracle, and SQL Server) using Apache Sqoop.
Experience analyzing data using Hive and Impala.
Work with Oozie, Flume, Sqoop, Spark, and Solr for data loading and analytics.
Efficient in writing SQL queries with complex joins and aggregations, as well as UNIX scripts.
Good understanding of Data Warehouse modeling concepts.
Must be able to provide solutions or enhancements quickly to fix issues reported by clients or users.
Flexibility to self-learn and understand the system, and to assist with query tuning and application performance.

Preferred

At least 4 years of experience with software development life cycle stages.
At least 2 years of experience with Big Data technologies and the ecosystem.
At least 4 years of experience in project life cycle activities on development and maintenance projects.
At least 2 years of experience in design and architecture review.
At least 2 years of experience in application support and maintenance, including some on-call support.
Good analytical skills.
High-impact communication skills.
Ability to ramp up on new technologies.
Ability to work in a team and in diverse, multi-stakeholder environments.
Experience in, and a desire to work in, a global delivery environment.