Job Description:
Hi,

Greetings from XTGlobal, Inc.!

We at XTGlobal, Inc. are currently sourcing for a Talend requirement. Please review the job description below and reply if you are interested in pursuing this opportunity.

Title: Talend Big Data Developer [XTGL_20832]
Location: St. Louis, MO
Type: 6+ Months Contract


Job Summary
Responsible for the design, development, and implementation of Big Data projects using Spark and Scala.
Resolve issues regarding development, operations, implementations, and system status.

Responsibilities:
Strong knowledge of Hadoop architecture and its implementation.
Strong understanding of best practices in Talend coding on large-scale Hadoop clusters.
Proficiency with the Software Development Lifecycle (SDLC).
Solid knowledge of the programming language(s), application server, database server, and/or architecture of the system being developed.
Good communication skills and a problem-solver mentality.
Solid understanding of current programming languages, employing any or all of them to solve the business needs of the Client's internal customers.
Strong functional programming skills in Scala and Java.
Strong experience with Talend Big Data Real-Time or other functional languages.
Excellent understanding of data engineering concepts.
Experience working with Spark for data manipulation, preparation, and cleansing.
Experience across the Hadoop ecosystem: HDFS, Hive, YARN, Flume, Oozie, Cloudera Impala, ZooKeeper, Hue, Sqoop, Kafka, Storm, Spark, and Spark Streaming, including NoSQL database knowledge.
Good knowledge of Windows/Linux/Solaris operating systems and shell scripting.
Strong desire to learn a variety of technologies and processes with a "can do" attitude.

Required Skills :
Talend development experience in a Big Data/Hadoop environment
Ability to load data to and from HDFS
Experience with RDBMS
Experience with the general Hadoop ecosystem

Required Qualifications
8-10 years of hands-on experience in handling large-scale software development and integration projects.
2+ years of experience working with Hadoop cluster environments and tools ecosystem: Spark/Spark Streaming/Sqoop/HDFS/Kafka/Zookeeper
Experience with Java, Python, Pig, Hive, or other languages is a plus

Preferred:
Experience working with RDBMS and Java
Exposure to NoSQL databases such as MongoDB, Cassandra, etc.
Experience with cloud technologies (AWS)
Certification in Hadoop development is desired


If you are interested in pursuing this opportunity, please reply with your resume attached in Word format.

Please refer any friends or colleagues who are looking for job opportunities.

Ashwin | Recruiter
2701 Dallas Parkway, Suite 550
Plano TX, 75093
Direct
Email:
