Job Description:
This is Neha from Apetan Consulting.
Please find the job description below and send me your updated resume matching it, along with your contact details, current location, visa status, and availability ASAP.

POSITION: Lead Hadoop Developer/ Architect
INTERVIEW: Phone/Skype Interview
LOCATION: San Antonio, TX

Job Overview:

The team is looking for individuals with experience in Big Data technologies such as Hadoop HDFS, MapReduce, Spark, Scala, Python, Hive, Impala, and SQL who can provide technical leadership in architecting a highly scalable, cost-effective, high-performing platform for CCB RFT. The vision is to create a reliable and scalable data platform that provides standard interfaces for querying and supports risk models, analytics, and consumer applications.

The Technical Architect will also build data marts and data warehouses. He/she will design a transient data mart to serve until the long-term strategic Hadoop-based data platform can replace it.

Minimum of 8 years of overall experience, including experience in a Lead/Architect role
3+ years of development experience with Big Data technologies such as Hadoop and Scala/Python
Ability to own and establish the physical architecture for a Big Data platform
Ability to design and support development of a data platform for data processing (data ingestion and data transformation) and a data repository using Big Data technologies from the Hadoop stack, including an HDFS cluster, MapReduce, Spark, Scala, Hive, and Impala
Experience building proofs of concept using Big Data technologies to test various use cases
Ability to support logical data model design and convert it into a physical data model
Ability to design and support RESTful API-based web services for data distribution to downstream applications
Experience working with best practices/standards for Big Data platforms and web services
Experience translating functional and technical requirements into detailed designs