Job Description:
Experience with IBM Spectrum Conductor with Spark for job scheduling and resource management.
Development experience with languages such as Python and R.
Experience with Anaconda and conda environments, including building Python packages from source in Unix-based Anaconda environments.
Experience with JupyterLab, including enabling extensions and adding kernels for Spark, R, or H2O Sparkling Water.
Excellent understanding of Hadoop ecosystem tools for data ingestion, processing, and provisioning, such as Apache Spark and Hive.
Hadoop administration experience is a plus.
Acts in the highest-level technical role as an individual contributor and/or team lead for the most complex computer applications and application initiatives. Utilizes a thorough understanding of available technology, tools, and existing designs. Works on the most complex problems, where analysis of situations or data requires evaluation of intangible variance factors. Plans, performs, and acts as the escalation point for the most complex platform designs, coding, and testing. Leads the most complex modeling, simulation, and analysis efforts. Acts as an expert technical resource to programming staff in the program development, testing, and implementation process.

Requires 10+ years of application development and implementation experience.