Job Description:

Job Title: Kafka Developer
Location: Phoenix, Arizona, USA
Duration: Full Time
Experience: 2-10 years
Work Authorization: US Citizen, GC Holder, H4 EAD

Job Description

Client is seeking a Kafka Developer with development experience in the Hadoop ecosystem. In this position, you will primarily be responsible for interfacing with key stakeholders and applying your technical proficiency across different stages of the Software Development Life Cycle, including requirements elicitation, application architecture definition, and design. You will play an important role in creating high-level design artifacts. You will also deliver high-quality code for a module, lead validation for all types of testing, and support activities related to implementation, transition, and warranty. You will be part of a learning culture, where teamwork and collaboration are encouraged, excellence is rewarded, and diversity is respected and valued.

Required Qualifications:

    Candidate must be located within commuting distance of Phoenix, AZ or be willing to relocate to the area. This position may require travel in the US and Canada.
    Bachelor's degree or foreign equivalent; work experience will be considered in lieu of a degree
    2+ years of experience with Information Technology
    Experience with the Hadoop ecosystem, e.g. Hadoop, Cloudera, Scala, Spark, Kafka
    Strong knowledge of object-oriented concepts, data structures, and algorithms
    Good experience with end-to-end implementation of DW/BI projects, especially data warehouse and data mart development
    Knowledge of and experience with the full SDLC
    Experience with Lean/Agile development methodologies
    U.S. citizenship or permanent residency required; we are not able to sponsor at this time


Preferred Qualifications:

    5+ years of experience in software development life cycle
    5+ years of experience in Project life cycle activities on development and maintenance projects
    1+ year of experience with the Hadoop ecosystem, e.g. Hadoop, Spark, Kafka, Python
    At least 1 year of experience in Relational Modeling, Dimensional Modeling and Modeling of Unstructured Data
    Good understanding of data integration, data quality, and data architecture
    Expertise in analyzing the impact of changes and issues
    Experience preparing test scripts and test cases to validate data and maintain data quality
    Strong understanding of and hands-on programming/scripting experience with UNIX shell, Perl, and JavaScript
    Experience designing and implementing ETL/ELT frameworks for complex warehouses/marts; knowledge of large data sets and experience with performance tuning and troubleshooting
    Hands-on development, with a willingness to troubleshoot and solve complex problems
    CI/CD exposure
    Ability to work in a team in a diverse, multi-stakeholder environment
    Ability to communicate complex technology solutions to diverse audiences, namely technical, business, and management teams
    Excellent verbal and written communication skills
    Experience in and desire to work in a global delivery environment

The job may entail extensive travel. It may also entail sitting and working at a computer for extended periods of time. Candidates should be able to communicate effectively by telephone, email, and face-to-face.

