Job Description:
TECHNOGEN, Inc. has been a proven leader in providing full IT services, software development, and solutions for 15 years.



TECHNOGEN is a Small & Woman-Owned Minority Business with GSA Advantage Certification. We have offices in VA and MD and offshore development centers in India. We have successfully executed 100+ projects for clients ranging from small businesses and non-profits to Fortune 50 companies and federal, state, and local agencies.
Apply: Ron(at)

Note: Candidates must be screened at the senior level for both Hadoop and Core Java before submission.

Project Details (size, scale, scope) / Day-to-Day Responsibilities:
This position will support the Enterprise Data Lake team in ongoing development of the Hadoop platform. The project has been in flight for several years, and the majority of the platform is built out; the team is now at the critical stage of adding security checks to comply with regulators.

Top 3 MUST-HAVE requirements:
- 10+ years of Java development experience
- 4+ years of Hadoop development experience
- Good communication skills; able to work with other teams

NICE TO HAVES (OR WHAT GETS THE WIN):
- Any additional big data experience: MapReduce, HDFS, Spark, Kafka, and Hive

Title: Senior Hadoop Developer
Duration: 24 months
Client: Large financial client
Location: North Brunswick, NJ

The client's Solution Engineering team leverages big data technologies and cutting-edge innovations to solve business-critical problems. We work with business and technology teams and provide solutions using the vast data and processing power of the data lake. The position is for a senior engineer with solid experience in big-data-related technologies. The new team member will focus on helping other teams across the company use the data lake and will also work on enhancing the data lake itself. The position is located in New Jersey. A successful candidate will have a service-oriented mentality and a strong sense of ownership of the problems and requests assigned. The role will focus on evaluating technology solutions, building up and enhancing the platform, and offering solutions and support to business users.
The expertise we are looking for includes:
- Deep understanding of big data and Hadoop architecture
- Strong experience in Hadoop and related technologies such as MapReduce, HDFS, Spark, Kafka, and Hive
- Hands-on experience in Java, Scala, Python, and Unix shell scripting
- Solid experience with NoSQL databases such as HBase, MongoDB, and Cassandra
- Experience designing and developing technical solutions with big data technologies
- Proven track record of strong verbal/written communication and presentation skills; ability to articulate technical solutions to both technical and business audiences
- Excellent planning, project management, leadership, and time management skills
- Ability to work independently
- Delivery focus and willingness to work in a fast-paced, mission-critical production environment
- Hadoop Certification is preferred