Job Description:
Big Data Solution Architect
Milford, OH
Full Time

Experience: 12+ Years

We are looking for a self-motivated Big Data Architect to drive innovation on interoperable technology platforms with Hadoop (MapR, Cloudera, Hortonworks, EMR, HDInsight, etc.) for enterprise customers in diverse industry verticals. Key focus areas would be real-time analytics, in-memory processing, Next Best Action, and the Internet of Things.

Responsibilities:
Design and implement Big Data solutions, including architecture, design, and code reviews, to address business problems in various industry verticals.
Drive Proof of Concept (POC) and Proof of Technology (POT) evaluations on interoperable technology platforms
Focus on competency development in technology areas such as Hadoop and associated frameworks, Big Data appliances, in-memory processing, and NoSQL databases
Support presales engineering activities for Big Data-based RFPs
Support projects/delivery teams on technical issues/solutions
Participate in external and internal branding of Big Data Thought Leadership

Must have:
Minimum 12 years of solid IT consulting experience in data warehousing, operational data stores, and large-scale implementations
Minimum 5 years of experience in Core Java, Python, or Scala.
Minimum 4 years of hands-on experience with Hadoop technologies
Design and implementation of Big Data solutions, including a leadership role in designing shared/reusable components.
Hands-on experience architecting and implementing Hadoop applications, including complete detailed design of the Hadoop solution: data ingestion, data storage/management, and data transformation.
Hands-on experience with the Hadoop stack: MapReduce, Sqoop, Pig, Hive, Flume, Spark, Kafka
Hands-on experience with related/complementary open-source software platforms and languages (e.g., Java, Linux, Apache, Perl/Python/PHP, Chef, Scala)
Hands-on experience with web application frameworks and technologies such as Spring, Ajax, AngularJS, and O/R mapping
Experience with the Hadoop security model, consisting of authentication, service-level authorization, authentication for web consoles, and data confidentiality.
Experience in designing solutions for multiple large data warehouses with a good understanding of cluster and parallel architecture.
Using Big Data technology and the customer's business requirements, design and document a comprehensive technical architecture.
Analysis and documentation of source system data from traditional (RDBMS) and new data sources (web, machine-to-machine, geospatial, etc.)
Using business SLAs and the technical architecture, calculate performance and volumetric requirements for infrastructure components
Design an architecture using cloud and/or virtualization technology
Plan and execute technology proofs of concept (POCs) using Big Data technology
Experience handling structured and unstructured data using Big Data tools, best practices, and industry trends

Hands-on technical competencies:
Java/J2EE, Linux, PHP, Perl, Python, C, C++, Scala
Hadoop, Hive, HBase, Pig, MapReduce, Spark, Kafka and other Hadoop eco-system components
NoSQL databases: Cassandra, MongoDB, MariaDB, Couchbase
Data warehouse, BI, and ETL tools
Detailed knowledge of RDBMS data modeling and SQL
AWS, Azure knowledge would be an advantage.
             