Job Description:

We have an immediate Hadoop Architect opening with one of our clients in VA, OH, and CA.

Position : Hadoop Architect
Experience : 12+ years
Locations : VA, OH, CA
Hire Type : C2C
Visa Status : H1B / GC / USC / L2EAD / H4EAD
Pay Rate : DOE
Duration : Long Term


Job Summary

Perform architecture design, data modeling, and implementation of Big Data platforms and analytic applications for our clients
Analyze the latest Big Data analytics technologies and their innovative applications in both business intelligence analysis and new service offerings, and bring these insights and best practices to our clients
Architect and implement complex big data solutions
Drive architecture engagement models and be an ambassador for partnerships with delivery teams and external vendors
Effectively communicate complex technical concepts to non-technical business and executive leaders
Assist with scoping, pricing, architecting, and selling large project engagements
Responsibilities and Duties

On-premises Big Data platforms such as Cloudera and Hortonworks
Big Data Analytic frameworks and query tools such as Spark, Storm, Hive, Impala
Streaming data tools and techniques such as Kafka, AWS Kinesis, Microsoft Streaming Analytics
ETL (Extract-Transform-Load) tools such as Informatica, Pentaho or Talend
Continuous delivery and deployment using Agile methodologies
Data Warehouse and DataMart design and implementation
NoSQL environments such as MongoDB, Cassandra
Data modeling of relational and dimensional databases
Metadata management, data lineage, data governance, especially as related to Big Data
Structured, unstructured, and semi-structured data techniques and processes
Required Experience, Skills and Qualifications

8+ years of engineering and/or software development experience, with demonstrable architecture experience in a large organization
At least 5 years of combined architecture-support experience across these environments: data warehouse, data mart, business intelligence, and big data
Hands-on experience with Big Data components/frameworks such as Hadoop, Spark, Storm, HBase, HDFS, Pig, Hive, Scala, Kafka, Python scripts, and Unix shell scripts
Experience in architecture and implementation of large and highly complex big data projects
Demonstrated ability to communicate highly technical concepts in business terms and articulate business value of adopting Big Data technologies