Job Description:
Senior/Lead Hadoop Developer

Location: Irving, TX
Duration: 6 months

Description:
Required Functions:
The Senior/Lead Hadoop Developer is responsible for designing, developing, testing, tuning, and building a large-scale data processing system for data ingestion and data products that allow the client to improve the quality, velocity, and monetization of its data assets for both operational applications and analytical needs. This position supports that goal through strong experience in software engineering and in developing solutions within the Hadoop ecosystem.

Responsible for the design, development, and delivery of data from operational systems and files into ODSs (operational data stores), downstream data marts, and files.
Troubleshoot and develop on Hadoop technologies including HDFS, Hive, Pig, Flume, HBase, Spark, and Impala, as well as Hadoop ETL development via tools such as Informatica.
Translate, load, and present disparate data sets in multiple formats and from multiple sources, including JSON, Avro, text files, Kafka queues, and log data.
Implement quality logical and physical ETL designs optimized to meet the operational performance requirements of our multiple solutions and products. This includes implementing sound architecture, design, and development standards.
Has the experience to design the optimal performance strategy and manage the technical metadata across all ETL jobs.
Responsible for building solutions involving large data sets using SQL methodologies and data integration tools such as Informatica in any database, preferably on an MPP platform.
Has strong Core Java programming experience to apply to data integration.
Works with BAs, end users, and architects to define and process requirements, build code efficiently, and collaborate with the rest of the team on effective solutions.
Deliver projects on time, to specification, and with quality.
8 years’ experience in managing data lineage and performing impact analyses.
5 years' experience with ETL tool development.
4 years' experience with the Hadoop ecosystem.
Experience working in Data Management projects.
Experience working with Hive or related tools on Hadoop, including performance tuning, file formats, designing and executing complex Hive HQL queries, and data migration/conversion.
Experience working with Spark for data manipulation, preparation, and cleansing.
Experience working with ETL tools (Informatica/DS/SSIS) for data integration.
Experience designing and developing automated analytic software, techniques, and algorithms.
Ability to handle multiple tasks and adapt to a constantly changing environment.
Self-starter with the ability to work independently and take initiative. Ability to translate ideas and business requirements into fully functioning ETL workflows.
Strong analytical and problem-solving skills.
Excellent written and oral communication skills, with the ability to articulate and document processes and workflows for use by various individuals of varying technical abilities.
Excellent organizational skills.
Knowledge of healthcare is a plus.

Minimum Education:
MS/BS in Computer Science, Information Systems, or a related field preferred, or equivalent experience.
Ability to apply mastery-level knowledge of one of the relational databases (DB2, MSSQL, Teradata, Oracle 8i/9i/10g/11i).
Ability to apply mastery-level knowledge of one of the data integration tools (Informatica, SSIS).
Expert ability and hands-on experience in SQL and Core Java a must.
Experience with Unix/Linux and shell scripting.
Ability to demonstrate experience with distributed UNIX environments.
Ability to work both independently and in a collaborative environment.
Excellent problem-solving, communication, and interpersonal skills.
Ability to analyze information and use logic to address work-related issues and problems.
Ability to demonstrate proficiency in Microsoft Access, Excel, Word, PowerPoint, and Visio.
Ability to present to a group.
Experience working in an Agile, DevOps environment is a plus.
Experience with or knowledge of web architecture (JavaScript, SOAP/XML, WebLogic, Tomcat) is a plus.
Experience with an ORM framework, SOA architecture, or microservices is a plus.
Experience with Middleware components (ESB, API Gateway) is a plus.
             
