Job Description:
Position: Big Data

Location: Memphis

Duration: 6 months

Rate: $50/hr, depending on experience (DOE)

Minimum 12 years of experience in the IT space and minimum 4 years of Big Data platform experience

Should be well versed in the overall IT landscape and technologies, and able to analyze how different technologies integrate with each other

Well versed in presales, creating proposals, and responding to RFIs/RFPs

Should be able to provide scalable and robust solution architectures based on business needs

Should be able to compare tools and technologies and recommend the appropriate one

Must have a good understanding of data structures and distributed processing frameworks

Must have experience executing multiple big data projects end to end

Strong technical expertise and hands-on experience with Hive, Flume, HBase, and other Hadoop components

Candidate must have a strong background and hands-on experience with Spark, Java, or data warehouse/ETL/BI/DWH appliances

Must have strong customer service skills and excellent verbal and written communication skills.

Excellent problem-solving and analytical skills

Ability and desire to work in a fast-paced environment, staying motivated and flexible

Ability to work cross-functionally to deliver appropriate resolution of technical, procedural, and operational issues

Deep knowledge of at least one of Cloudera, Hortonworks, MapR, or Pivotal

Good knowledge of NoSQL databases such as HBase and MongoDB

Very good understanding of in-memory databases and distributed databases

Thorough understanding of how to handle real-time data integration

Good understanding of security aspects, including Kerberos, Sentry, etc.

Good experience with databases and SQL

Good experience with at least one programming language such as Scala, Java, or Python, or with scripting in Pig

Good experience with analytics use cases and the R language

Proven expertise delivering end-to-end projects while working with virtual teams