Job Description :

Requirements:

Bachelor's degree, typically in Computer Science, Management Information Systems, Mathematics, Business Analytics, or another STEM field.

5+ years of professional data development experience

3+ years of development experience with Hadoop/HDFS

3+ years of development experience with Java or Python

3+ years of experience with Spark/PySpark

Continuous Integration/Continuous Delivery (CI/CD) experience

Full understanding of ETL concepts

Exposure to version control systems (Git, SVN)

Proficient with relational data modeling

Key Responsibilities

Take ownership of features and drive them to completion through all phases of the entire 84.51 SDLC. This includes internal and external-facing applications as well as process improvement activities:

Participate in the design, development, and support of Oracle- and Hadoop-based solutions

Perform unit and integration testing

Collaborate with architects and lead/senior engineers to ensure consistent development practices

Collaborate with other engineers to solve and bring new perspectives to complex problems

Drive improvements in people, practices, and procedures

Embrace new technologies and an ever-changing environment


Required Skills : Azure, Snowflake, Python
Basic Qualification :
Additional Skills :
Background Check :Yes
Drug Screen :Yes
Notes :
Selling points for candidate :
Project Verification Info :
Candidate must be your W2 Employee :Yes
Exclusive to Apex :No
Face to face interview required :No
Candidate must be local :No
Candidate must be authorized to work without sponsorship :No
Interview times set :No
Type of project :Development/Engineering
Master Job Title :Big Data: DBA
Branch Code :Cincinnati
             
