Job Description:

Role: Big Data Developer

Location: Chicago, IL

Job Qualifications:
• Must have extensive hands-on experience designing, developing, and maintaining software solutions on a Hadoop cluster.
• Must have strong experience with UNIX shell scripting, Sqoop, Eclipse, and HCatalog.
• Must have experience with NoSQL databases such as HBase, MongoDB, or Cassandra.
• Must have experience developing Pig scripts, HiveQL, and UDFs for analyzing structured, semi-structured, and unstructured data flows.
• Must have working experience developing MapReduce programs in Java or Python that run on a Hadoop cluster.
• Must have working experience with Spark and Scala.
• Must have knowledge of cloud computing infrastructure (e.g., Amazon Web Services EC2) and design considerations for scalable, distributed systems.
• Must demonstrate Hadoop best practices.
• Must have working experience with data warehousing and Business Intelligence systems.
• Must have experience with an SDLC methodology (Agile / Scrum / iterative development).
• Must have experience with systems change and configuration management.
