Job Description:
Hi,
Hope you are doing great!

Job Designation: Big Data Developer
Job Location: Phoenix, AZ
Job Type: Full Time

Job Description:

Wanted: Global Innovators To Help Us Build Tomorrow’s Enterprise
In the role of Technology Lead, you will interface with key stakeholders and apply your technical proficiency across different stages of the Software Development Life Cycle, including requirements elicitation, application architecture definition, and design. You will play an important role in creating high-level design artifacts, deliver high-quality code for a module, lead validation for all types of testing, and support activities related to implementation, transition, and warranty. You will be part of a learning culture where teamwork and collaboration are encouraged, excellence is rewarded, and diversity is respected and valued.
The location for this position is Phoenix, AZ, USA. This position may require 100% relocation.
Qualifications
Basic
Bachelor’s degree or foreign equivalent required. Will also consider one year of relevant work experience in lieu of every year of education.
At least 5 years of design and development experience in Big Data, Java, or data warehousing related technologies.
At least 3 years of hands-on design and development experience with Big Data technologies – Pig, Hive, MapReduce, HDFS, HBase, YARN, Spark, Oozie, Java, and shell scripting.
Should be a strong communicator, able to work independently with minimal involvement from client SMEs.
Should be able to work in a team in a diverse, multi-stakeholder environment.
Mandatory Technical Skills
Background in all aspects of software engineering with strong skills in parallel data processing, data flows, REST APIs, JSON, XML, and microservice architecture.
Must have strong programming knowledge of Core Java or Scala – Objects & Classes, Data Types, Arrays and String Operations, Operators, Control Flow Statements, Inheritance and Interfaces, Exception Handling, Serialization, Collections, Reading and Writing Files (a brief Scala sketch follows this list).
Must have hands-on experience in the design, implementation, and build of applications or solutions using Core Java/Scala.
Strong understanding of Hadoop fundamentals.
Must have experience working with Big Data processing frameworks and tools – MapReduce, YARN, Hive, Pig.
Strong understanding of RDBMS concepts; must have good knowledge of writing SQL and of interacting programmatically with both RDBMS and NoSQL databases such as HBase.
Strong understanding of file formats – Parquet and other Hadoop file formats.
Proficient with application build and continuous integration tools – Maven, SBT, Jenkins, SVN, Git.
Experience working in Agile and with the Rally tool is a plus.
Strong understanding of and hands-on programming/scripting experience with UNIX shell, Python, Perl, and JavaScript.
Should have worked on large data sets and have experience with performance tuning and troubleshooting.
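For illustration only, here is a minimal, hypothetical Scala sketch of the core-language skills named above (classes, collections, exception handling, file reading); the file name is a placeholder, not part of any real project:

```scala
import scala.io.Source
import scala.util.{Try, Success, Failure}

// A simple record type: case classes give equals/hashCode/toString for free.
case class WordCount(word: String, count: Int)

object CoreScalaSketch {
  // Read a text file and count word occurrences using the collections API.
  // Try wraps the I/O so a missing file surfaces as a Failure, not a crash.
  def countWords(path: String): Try[Seq[WordCount]] = Try {
    val source = Source.fromFile(path)
    try {
      source.getLines()
        .flatMap(_.split("\\s+"))
        .filter(_.nonEmpty)
        .toSeq
        .groupBy(_.toLowerCase)
        .map { case (word, occurrences) => WordCount(word, occurrences.size) }
        .toSeq
        .sortBy(-_.count)
    } finally source.close() // always release the file handle
  }

  def main(args: Array[String]): Unit = {
    // "input.txt" is a placeholder path used purely for illustration.
    countWords("input.txt") match {
      case Success(counts) => counts.take(10).foreach(println)
      case Failure(e)      => println(s"Could not read file: ${e.getMessage}")
    }
  }
}
```

Wrapping the I/O in Try keeps failure handling explicit without checked exceptions, which is one common Scala idiom for the exception-handling skills listed above.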
Preferred
Knowledge of Java Beans, Annotations, Logging (log4j), and Generics is a plus.
Knowledge of Design Patterns - Java and/or GOF is a plus.
Knowledge of Spark, Spark Streaming, Spark SQL, and Kafka is a plus (see the Spark sketch after this list).
Experience in the financial domain is preferred.
Experience with and a desire to work in a global delivery environment.
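As a rough sketch of the preferred Spark skills above, the following word count assumes Spark 2.x on the classpath; the app name and HDFS paths are hypothetical placeholders:

```scala
import org.apache.spark.sql.SparkSession

// Minimal Spark word count: the classic illustration of parallel data
// processing over HDFS-style input.
object SparkWordCountSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("word-count-sketch")
      .master("local[*]") // local mode for illustration; a real job runs on YARN
      .getOrCreate()

    val counts = spark.sparkContext
      .textFile("hdfs:///tmp/input")   // placeholder input path
      .flatMap(_.split("\\s+"))
      .filter(_.nonEmpty)
      .map(word => (word, 1))
      .reduceByKey(_ + _)              // shuffle + aggregate across partitions

    counts.saveAsTextFile("hdfs:///tmp/output") // placeholder output path
    spark.stop()
  }
}
```

In a real deployment the job would be submitted to a YARN cluster via spark-submit rather than run in local mode.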
             
