Job Description:
Title: Talend/Big Data Developer
Location: Chicago, IL
Duration: 10+ Months

The Talend/Big Data developer will be responsible for building and implementing new data pipelines and reusable frameworks for Big Data and Kafka.

This position requires the following:
Experienced in working with enterprise data platforms and data lakes using big data technologies.
Minimum 5 years of proficient experience in designing and developing mappings, transformations, sessions, and workflows, and in deploying integration solutions using the Talend tool.
Experienced in implementing best practices around Talend development methodology and in building ETL prototypes using subjobs, joblets, and reusable components.
Experienced in Talend Big Data edition (Talend Standard, Big Data Batch, and Streaming jobs) and in developing pipelines on HDFS, Hive, Impala, and Kafka platforms.
Experienced in working with custom Java, Scala, and PySpark code within Talend integrations.
Strong knowledge in performance tuning of Talend big data processes.
Experienced in handling Kafka topics using the Talend tool.
Experience in Unix, SQL queries, HQL, and database scripting (procedures, functions).
Knowledge of CI/CD (Git/Jenkins).
Knowledge of Cloudera Hadoop installation.
Ability to identify, recommend, and implement ETL process and architecture improvements.
Looking for a team player who is open to working with newer technologies and with cross-functional teams.
Additionally, the resource will be expected to build prototypes to hand over to developers; this part of the role is hands-on.