Job Description:
Position: Big Data Architect
Location: Washington, DC Metro
Duration: Long Term
Interview: Face to Face

Requirements:

Architect end-to-end Big Data solutions from both an application and an infrastructure point of view
Identify the right Big Data technology stack (MapReduce, YARN, Pig, Sqoop, Hive, Impala, Kafka, Oozie, HBase, Cassandra, MongoDB, CouchDB, Hue, Flume, Solr, NoSQL databases, APIs, Spark, Storm, Samza, Avro) based on business scenarios.
Provide design patterns for building frameworks for data ingestion, loading and transformation
Provide design patterns for building frameworks for analytics and reporting on Big Data
Provide best practices for performance tuning to handle large-scale data volumes, real-time data, and transformations.
Provide best practices for security integration, backup/recovery, change management processes, and batch job scheduling.
Identify use cases for the Big Data technology stack
Design complex highly scalable statistical models and solutions that comply with security requirements
Define/Design APIs for integration with various data sources in the enterprise
Actively collaborate with other architects and developers in developing client solutions
Work with the Project Manager on detailed planning and risk/issue escalation.

Qualifications:

6+ years of experience working with batch-processing tools in the Hadoop tech stack (e.g., MapReduce, YARN, Pig, Hive, HDFS, Oozie)
6+ years of experience working with tools in the stream-processing tech stack (e.g., Spark, Storm, Samza, Kafka, Avro)
Experience developing applications that work with NoSQL stores (e.g., Elasticsearch, HBase, Cassandra, MongoDB, CouchDB)
Experience developing for TB-level data stores and/or 10Gbps+ ingest speeds
High-capacity data ingest into Hadoop or Spark is highly desired
Hands-on experience with at least one major Hadoop distribution, such as Cloudera, Hortonworks, MapR, or IBM BigInsights
Experience with system usage and optimization tools such as Splunk is a plus
At least 6 years of experience delivering enterprise IT solutions as a solutions architect
8+ years of experience with SQL and at least two major RDBMSs
8+ years as a systems integrator with Linux systems and shell scripting
8+ years of data-related benchmarking, performance analysis, and tuning
6+ years of Java experience
Solid programming experience with a preference towards Java or Python
5+ years of DBA and/or data modeling experience
Bachelor's degree in Computer Science, Information Systems, Information Technology, or a related field, and 10+ years of DW & BI experience.
Experience with operational and business-level metadata management
Health care experience is a plus
Excellent verbal and written communication skills

Love to Have:

Hands-on experience with Cloudera 4.5 and higher, Hortonworks 2.1 and higher, or MapR 4.0.1 and higher
Experience with MapReduce solution design and development
ETL solution experience, preferably on Hadoop
Experience with industry leading Business Intelligence tools

Interested candidates may send their resume to mohanattechnetllcdotcom or reach out by phone.


Client: Technet LLC

             
