Job Description:
The Senior Hadoop Developer is responsible for designing, developing, testing, tuning, and building a large-scale data processing system for data products that allow HMS to improve the quality, velocity, and monetization of our data assets for both operational applications and analytical needs.

Responsibilities:
Responsible for the design, development, and delivery of data from operational systems and files into the ODS, downstream data marts, and files.
Works with BAs, end users, and architects to define and process requirements, build code efficiently, and collaborate with the rest of the team on effective solutions.
Has strong analytical SQL experience working with dimensional modeling.
Research, develop, and modify ETL processes and jobs according to requirements.
Troubleshoot and develop on Hadoop technologies including HDFS, Hive, Pig, Flume, HBase, Spark, and Impala, and perform Hadoop ETL development via tools such as Informatica and Talend.
Knowledge of and experience with any Azure Data Platform components – Azure Data Lake, Data Factory, Data Management Gateway, Azure Storage options, DocumentDB, Data Lake Analytics, Stream Analytics, Event Hubs, Azure SQL.
Translate, load and present disparate data sets in multiple formats and multiple sources including JSON, Avro, text files, Kafka queues, and log data.
Will implement quality logical and physical ETL designs optimized to meet the operational performance requirements of our multiple solutions and products; this includes implementing sound architecture, design, and development standards.
Designs optimal performance strategies and manages the technical metadata across all ETL jobs.
Responsible for building solutions involving large data sets using SQL methodologies and data integration tools such as Informatica, in any database.
Deliver projects on-time and to specification with quality.
Requirements:
Strong Spark development experience in a Hadoop environment (2-3 years).
8+ years of experience in managing data lineage and performing impact analysis.
5+ years of experience with any ETL tool development.
4+ years of experience with Hadoop Eco System.
Experience working in Data Management projects.
Experience working with Hive or related tools on Hadoop, including performance tuning, file formats, designing and executing complex Hive HQL queries, and data migration/conversion.
Experience working with programming languages such as Java, Scala, or Python.
Experience working in an agile environment.
Experience working with Spark for data manipulation, preparation, and cleansing.
Experience working with ETL Tools (Informatica/DS/SSIS) for data Integration.
Experience designing and developing automated analytic software, techniques, and algorithms.
Ability to handle multiple tasks and adapt to a constantly changing environment.
Self-starter with the ability to work independently and take initiative.
Ability to translate ideas and business requirements into fully functioning ETL workflows.
Ability to apply mastery-level knowledge of at least one relational database (DB2, MSSQL, Teradata, Oracle 8i/9i/10g/11g).
Expert-level, hands-on SQL experience is a must.
Experience with Unix/Linux and shell scripting.
Strong analytical and problem-solving skills.
Excellent written and oral communication skills, with the ability to articulate and document processes and workflows for use by various individuals of varying technical abilities.
Knowledge of the healthcare industry is a plus.
Minimum Education: MS/BS in Computer Science, Information Systems, or related field preferred and/or equivalent experience.
Ability to work both independently and in a collaborative environment.
Excellent problem-solving skills, communication skills and interpersonal skills.
Ability to analyze information and use logic to address work related issues and problems.
Ability to demonstrate proficiency in Microsoft Access, Excel, Word, PowerPoint and Visio.
Experience working in DevOps environment is a plus.
Experience with or knowledge of web architecture (JavaScript, SOAP/XML, WebLogic, Tomcat) is a plus.
Experience with ORM frameworks, SOA architecture, and microservices is a plus.
Experience with Middleware components (ESB, API Gateway) is a plus.