Job Description:
Position: Big Data/Hadoop Developer

Location: Lafayette, LA

Duration: 12 Months

Big Data Engineer

Position Description
Client is looking for a team player with excellent analytical and technical skills and the ability to deliver complex big data projects, with a focus on collecting, parsing, managing, analyzing, and visualizing large data sets to turn information into insights across multiple platforms.

The big data engineers on our project will develop, maintain, test, and evaluate big data solutions in support of our business objectives. Much of their time will be spent designing big data solutions, so they need solid experience with Hadoop-based technologies such as MapReduce and Hive, as well as NoSQL databases such as MongoDB or Cassandra. Experience with object-oriented design, coding, and testing patterns, and with engineering (commercial or open source) software platforms and large-scale data infrastructures, is a must. Additionally, the engineer should understand how algorithms work and have experience building high-performance algorithms.

As a big data engineer, the team member should know how to apply technologies to solve big data problems and to develop innovative big data solutions. The engineer should have extensive knowledge of programming and scripting languages such as Java, Python, and/or R, along with Linux. Expert knowledge of different databases (NoSQL or RDBMS), such as MongoDB or Redis, is also expected. Building data processing systems with Hadoop and Hive using Java or Python should be second nature to the big data engineer.

Your future duties and responsibilities

To enjoy being challenged and to solve complex problems on a daily basis;

To have excellent oral and written communication skills;

To be proficient in designing efficient and robust ETL workflows;

To be able to work with cloud computing environments;

To have a Bachelor’s or Master’s degree in computer science or software engineering;

To be able to work in teams and collaborate with others to clarify requirements;

To be able to assist in documenting requirements as well as resolve conflicts or ambiguities;

To be able to tune Hadoop solutions to improve performance and end-user experience;

To have strong coordination and project management skills to handle complex projects.

Having a background in Information Security will help in this role.

MUST be well-spoken and comfortable presenting to managers and team members alike.

MUST have participated in Agile project execution.

Required qualifications to be successful in this role

Big data engineer is a technical role that requires substantial expertise across a broad range of software development and programming fields. In particular, the big data engineer should have sufficient knowledge of big data solutions to be able to implement them on premises or in the cloud.

Bachelor’s or Master’s degree in computer science or software engineering;

Enjoy being challenged and solving complex problems on a daily basis;

Have excellent oral and written communication skills;

Be proficient in designing efficient and robust ETL workflows;

Be able to work with cloud computing environments;

Be able to work in teams and collaborate with others to clarify requirements;

Be able to assist in documenting requirements as well as resolve conflicts or ambiguities;

Be able to tune Hadoop solutions to improve performance and end-user experience;

A minimum of 1-3 years’ experience working with health care data, from both a technical (IT) and a non-technical perspective;

Must be a team player, able to work with stakeholders and team members without conflict;

Be able to present solutions to the team and to leaders.