Job Description:
Cloudera, Hive, Impala, Azure, and Real-Time Data Streaming Concepts

The Sr. Java/Hadoop Developer position will provide expertise in a wide range of technical areas, including but not limited to: the Cloudera Hadoop ecosystem, Java, integration of collaboration toolsets using SSO, configuration management, hardware and software configuration and tuning, software design and development, and the application of new technologies and languages aligned with other FordDirect internal projects.
1. Design and develop data ingestion pipelines.
2. Perform data migration and conversion activities.
3. Develop and integrate software applications using suitable development methodologies and standards, applying standard architectural patterns while accounting for critical performance characteristics and security measures.
4. Collaborate with Business Analysts, Architects, and Senior Developers to establish the physical application framework (e.g., libraries, modules, execution environments).
5. Perform end-to-end automation of the ETL process for the various datasets being ingested into the big data platform.

Required Skills:
1. Java/J2EE
2. Web Applications, Tomcat (or any equivalent app server), RESTful Services, JSON
3. Spring, Spring Boot, Struts, Design Patterns
4. Hadoop (preferably Cloudera CDH): HDFS, Hive, Impala, Spark, Oozie, HBase
5. Scala
6. SQL
7. Linux

Good to Have:
1. Google Analytics, Adobe Analytics
2. Python, Perl
3. Flume, Solr
4. Strong database design skills
5. ETL tools
6. NoSQL databases (MongoDB, Couchbase, Cassandra)

If you are available, please reach me at SWAROOP AT KEYLENT DOT COM.