Job Description :
Hope you are doing great!
Mandatory Skills: Talend, Hadoop, ETL, NoSQL.
This is Mahendra from Sumeru Inc. We are sourcing a Talend Lead for one of our clients in Media, PA.
Please go through the job description, and if you are comfortable with the required skills and responsibilities, reply with your updated resume or contact me for more details.
Role: Talend Lead
Duration: 12+ Months (Contract)
Location: Media, PA
Responsibilities -
  • Looking for a Tech Lead experienced in designing, building, and maintaining Big Data workflows/pipelines for collecting, storing, processing, and analyzing large data sets moving into and out of a data lake
  • Engage in application design and data modeling discussions
  • Build and test data workflows/pipelines
  • Troubleshoot and resolve application and data processing issues
  • Optimize code and fine-tune application performance


Required Skills & Qualifications -
  • BS/BA degree in Computer Science, Information Systems, or a related field
  • 10+ years of experience with data integration tools such as Talend, developing data pipelines and workflows

  • Strong understanding of data quality processes and procedures, such as definition, discovery, profiling, remediation, and monitoring

  • Experience designing and developing ETL processes using Talend (including data load performance optimization)

  • Knowledge of storage design concepts, including partitioning

  • Ability to maintain, modify, and improve large sets of structured and unstructured data

  • Experience monitoring and troubleshooting data integration jobs

  • Experience handling JSON data sources and ingesting API responses using Talend

  • Must have programming experience in the Big Data/Hadoop technology area

  • Highly skilled in Spark and Scala, preferably on the Databricks platform

  • Experience working in an AWS environment, storing and processing data on S3 and transforming it with complex computations into other data models

  • Strong knowledge of SQL and Unix/Linux scripting

  • Exposure to other Hadoop ecosystem technologies such as YARN, ZooKeeper, HDFS, Avro, and Parquet

  • Experience cleansing and preparing large, complex data sets for reporting and analytics



Preferred Skills -
  • Exposure to Databricks is highly preferred
  • NoSQL databases such as HBase

  • Distributed messaging systems such as Kafka

  • Data architecture

  • Experience in a DevOps environment

  • Cloud platforms such as AWS

