Job Description :
Role : Senior Big Data/Hadoop Consultant

Location : Wilmington, DE


Technical/Functional Skills :
5+ years’ experience with Hadoop, Azure HDInsight, Databricks, Hive, and SQL Server technologies.
Provide the client with recommendations and guidance on leveraging their existing Hadoop platform.
Strong abstract, analytical, problem-solving, and critical-thinking skills, applied in a Big Data environment.
Strong leadership, communication, and interpersonal skills.
Manufacturing industry experience is required.
Working knowledge of SAP is preferred.

Experience Required :
Architected and designed 2-3 end-to-end big data, cloud (Azure, AWS, or other), or BI solutions across multiple technologies and platforms.
Hadoop technologies – HDFS, Hive, Pig, Sqoop, MapReduce, Flume, Spark, HBase, and distributed processing concepts.
Solid understanding of, and experience using, at least one of the following programming languages: Python or Java.
4+ years of Java development experience for batch, web services/API, or web application development.
3+ years of experience with Teradata (or equivalent) and complex SQL.
One to two years of hands-on experience with the Azure HDInsight platform.

Roles & Responsibilities :
Lead and advise on diverse areas of strategy development and the big data design strategy roadmap.
Design solutions that take advantage of existing assets while balancing architecture requirements with specific client needs.
Work with the development team to ensure the architecture is implemented to meet both functional and non-functional requirements.
Create prototypes, conduct proofs of concept, evaluate options with pros and cons, and provide recommendations.
Build consensus on the solution, and work with other senior architects and managers to influence decisions.
Understand big data use cases and recommend standard design patterns commonly used in Hadoop-based deployments.
Understand and translate customer requirements into technical requirements.
Design data queries against data in the HDFS environment using tools such as Apache Hive.