Job Description:
Position: Sr. Hadoop Developer/Lead
Location: MA
Duration: 6+ Months
Interview Process: Zoom video interview; on-site is possible, but the hiring decision will be made from the video interview
Lead experience is a must
Requirements:
Architectural knowledge
Hadoop experience
AWS experience
Hortonworks distribution experience
Hands on development experience
Lead experience
Hortonworks tools knowledge
Hive
Sqoop
Flume
SQL to HQL experience
Must haves:
At least 6 to 8 years of experience architecting the complete end-to-end design of enterprise-wide big data solutions
Should be able to differentiate among and recommend the tools that can be used to solve business problems
Strong Scala, Spark, and Hadoop technology experience is a must.
Hortonworks distribution experience is a must.
Amazon Web Services experience is a must.
Big data development experience on the Hadoop platform, including Hive, Hive LLAP, Sqoop, Flume, and Spark.
Application development experience in Core Java/Python.
Should be able to cover end-to-end BI and data strategy, including partnership with internal and external stakeholders.
Experience with data modeling, complex data structures, data processing, data quality and data lifecycle.
Should be able to lead critical aspects of data management and application management.
Experience in UNIX shell scripting, batch scheduling and version control tools.
Experience with analytical programming and the ability to work with EDW architecture to bridge the gap between a traditional DB architecture and a Hadoop-centric architecture.
Highly organized and analytical, capable of solving business problems using technology.
Should be an individual with in-depth technical knowledge and hands-on experience in the areas of Data Management, BI Architecture, Product Development, RDBMS and non-RDBMS platforms.
Should have excellent analytical skills and be able to recognize data patterns and troubleshoot data issues.
Have a thorough understanding of the implications of software design and implementation choices on performance and maintainability.
Experience in large scale server-side application development that includes the design and implementation of high-volume data processing jobs.
Willing to work in big data development initially and then move to the support operations group.
Plus:
Experience building solutions for streaming applications is a plus
Functional programming experience is desired.
Day to Day:
Design and develop big data solutions using the Hadoop and Spark platforms
Will be responsible for the design and delivery of data solutions to support data migration initiatives, BI initiatives, dashboard development, etc.
Will design ETL processes and must be able to explore POC/prototype options.


Client: Confidential