Job Description:
Position: Big Data Architect – Hadoop/Scala/Elastic
Location: Paramus, NJ
Duration: Long-Term Contract (Contract-to-Hire also acceptable)

Primary Skills:
Hadoop Distributions (Cloudera, Hortonworks, MapR), Hadoop Core (HDFS, MapReduce), YARN, ZooKeeper, Oozie, Sqoop, NoSQL (Hive, HBase, Cassandra, MongoDB), Hadoop Security (Knox, Kerberos, Sentry), Hadoop Data Governance (Apache Falcon, Cloudera Navigator, Sentry, Atlas), Event Processing (Flume, Kafka, Storm, Spark Streaming), Apache Spark (Spark Core, Spark SQL, Spark Streaming), Programming Language: Scala, Elastic Stack (Elasticsearch, Logstash, Kibana)

Roles and Responsibilities:
As a Big Data Architect with a minimum of 12 years of experience, you will be responsible for architecting, designing, and developing multiple big data utilities that automate various aspects of data acquisition, ingestion, storage, access, and transformation at data volumes that scale up to petabytes. You will be part of a multi-disciplinary technical team.

* At least 3 years of experience architecting, designing, and implementing solutions using the big data technologies listed under Primary Skills
* Hands-on experience designing and developing data-intensive applications
* Excellent understanding of software development life cycle and quality processes
* Strong analytical and programming skills
* Unix and Linux experience
* Experience producing design documentation using UML and other visual modeling tools
* Ability to collaborate and thrive in a fast-paced, high-performance environment
* Excellent communication skills
* Ability to develop high-quality code with special consideration for performance

Desirable Skills:
* Expertise in open-source Big Data technologies
* Experience in Cloud-based Big Data implementations
* Experience designing and developing applications using microservices
* Certifications in Big Data Technologies