Job Description:
Big Data Engineer - Hadoop
Minneapolis, MN
6 months, CTH or DP
What you’ll get to do:
· Design and develop Big Data solutions, including data sourcing and ingestion, data quality processing, data provisioning, and analytics
· Perform all phases of software engineering, including application design, code development, and testing
· Establish and improve internal processes for leveraging a multitenant Data Lake
· Contribute to the architecture and design of the Data Lake
· Design reusable components, frameworks and libraries in Java, Scala, or Python
· Implement a data management framework for the Data Lake
· Review code and provide feedback on best practices, performance improvements, etc.
· Troubleshoot production support issues
What you’ll need to succeed:
· 2+ years of hands-on Hadoop implementation and configuration experience
· 2+ years of hands-on experience working with a variety of Hadoop features and tools (Python, Spark, Ambari, HDFS, MapReduce, Java, etc.)
· 2+ years of hands-on expertise with Big Data technologies (HBase, Hive, Sqoop, Pig, etc.)
· 5+ years of hands-on expertise writing SQL queries against an RDBMS
· 5+ years of Business Intelligence / report development experience (e.g., SSRS)
· 3+ years of hands-on expertise with UNIX shell scripts and commands
· Bachelor’s Degree or equivalent experience
· Preferred Qualifications
o Hortonworks Certification