Job Description:


We have an urgent requirement for Spark/Scala/Big Data Developers in New York.

Role: Spark/Scala/Big Data Developer
Location: New York
Duration: 12+ months

Mandatory Technical/Functional Skills:
6+ years of experience with Big Data technologies, i.e., the Hadoop platform (Hive, HDFS, and Spark)
Experience developing big data systems using the Spark API and UDFs in Scala/Java
Hands-on experience with UNIX, SQL, the Spark framework, and Java

Roles & Responsibilities
Expertise in functional programming with Scala.
Hands-on experience in Scala development; Spark is a must.
Adept at Apache Spark programming using Scala.
Good knowledge of configuring and working with multi-node clusters and the distributed data processing framework Spark.
Experience designing data pipelines, complex event processing, and analytics components using big data technologies (Scala/Spark).
Experience working with large volumes of data (terabytes), analyzing data structures, and designing effectively within a Hadoop cluster.