Job Description :
We have 3 openings for Sr. Hadoop Developers with the following skills and experience.

Senior level: 9-10+ years of overall experience with excellent communication skills. If you are interested and available, please send me your updated resume.

Duration : Long Term
Location: Minneapolis, MN

As a Senior Data Engineer you will work on a product team using Agile Scrum methodology to design, develop, deploy and support solutions that leverage the Client's big data platform. The Sr. Data Engineer will work with Enterprise Architecture, D&BI Solution Architects, and Business Analysts to understand business unit requirements and to build solutions to meet their needs and objectives. This role requires the ability to interpret and apply data ingestion/storage/usage patterns developed by the architecture team in order to build solutions. The Sr. Data Engineer is expected to leverage common solutions and services, and to follow Cargill development standards and principles. The Senior Data Engineer will also be responsible for troubleshooting complex incidents related to development.

Primary Accountabilities:

Solution Analysis & Design:
Work with businesses, process owners, and product team members to design the Client's big data and Advanced Analytics solutions.
Perform data modeling and prepare data in databases for reporting through various analytics tools.
Create or modify design documentation as defined by team development standards, processes, and tools.
Ensure the solution designed and built is supportable as part of a DevOps model.

Development, Testing, and Quality:
Perform integration development to move data from production systems to database/data warehouses using ETL tools.
Develop technical solutions.
Support testing by fixing defects and making necessary back end design changes.
Ensure adherence to development and architecture standards and best practices.
Provide necessary technical support through all phases of testing and incident handling after deployment.
Provide support for product solutions as part of a DevOps model

Required Skills:
Ability to work well in cross-functional teams and foster team commitment to meeting objectives
Proactive, creative problem solving skills in ambiguous and changing environments
Strong data modeling and query writing skills
Experience building tables/views or data warehouses in Oracle or MS SQL Server environments
Skills and/or experience in database programming on Oracle or MS SQL Server
Understanding of structured data, dimensional models or cubes and various forms of ETL for reporting
Experience preparing data for use in analytics and reporting
Familiarity with unstructured data
Familiarity with object oriented programming
Business fluency in English

Preferred Technical Skills:
3+ years of experience developing in SAP HANA, SAP BW, Oracle, or SQL Server
Experience building Big Data Solutions using a secure Hadoop environment and NoSQL technology.
Experience with data modeling using ETL/ingestion tools (SLT, StreamSets, Business Objects Data Services (BODS), Sqoop, and Flume).
Advanced proficiency in at least one functional programming language.
Experience with scripting languages (SQL, Scala, Pig, Bash/Python) to manipulate data.
Experience in a JVM language.
Experience working with front-end visualization tools like Power BI, Tableau, and Business Objects.
Experience with NoSQL data stores (especially Cassandra).
Experience with Spark, Hive, Impala.
Experience with Kafka.
Comfortable scripting in *NIX environment (ssh and standard commands)
Understanding of object-oriented and functional programming paradigms.
Contributions to large open-source projects.
Version control, particularly Git.