Job Description:
Position: Hadoop developer
Location: Atlanta, GA
Duration: 6 months
Interview: Phone and Skype
Client: CareCentrix
Visa Required: GC & USC
Remote; candidates cannot be located in Florida or on the West Coast due to the time difference
Notes: We are waiting on approval to purchase and implement a big data platform here at CareCentrix. We currently handle parts of the Big Data workload (data movement, data analytics) and hope to do much more once the platform is approved.

Bachelor’s Degree in Computer Science, Computer Engineering, Information Technology, or another related field, in addition to relevant work experience, is required for the job
Minimum 3+ years of development experience writing SQL and working with Hadoop, YARN, Sqoop, Spark SQL, Hive, Impala, and other ETL tools (Informatica, Talend, etc.)
Minimum 2+ years of development experience delivering DataStage-based ETL solutions
2-5 years of experience in Python, Java, or Scala development
Experience developing Spark processes and performance tuning
Experience performing data analytics on Hadoop-based platforms and implementing complex ETL transformations.
Strong experience with UNIX shell scripting to automate file preparation and database loads
Experience in data quality testing; adept at writing test cases and scripts, and at presenting and resolving data issues
Familiarity with relational database environments (Oracle, DB2, etc.), leveraging databases, tables/views, stored procedures, and agent jobs, and integrating with a Salesforce backend
Experience analyzing and designing deliverables in an agile environment is required.
Demonstrated independent problem-solving skills and ability to develop solutions to complex analytical/data-driven problems