Job Description:
Title: Big Data Architect [Candidate required to travel 60% to Warren, NJ]
Location: Warren, NJ
Duration: Full Time - Perm

Big Data Architect

The Big Data Architect will be responsible for designing, architecting and implementing data processing systems capable of processing, storing and distributing data in robust Big Data solutions. Working with multi-technology and cross-functional teams and clients, you should be able to manage the entire life cycle of a solution. You should be able to make architectural decisions and provide technology leadership and direction to the organization.

Position Activities and Tasks
Should be able to design complex, high-performance architectures
Developing and maintaining strong client relations with senior and C-level executives—developing new insights into the client’s business model and pain points, and delivering actionable, high-impact results
Participating in and leading client engagements to develop plans and strategies for data management processes and IT programs, providing hands-on assistance with data modeling and the technical implementation of vendor products and practices
Facilitating, guiding, and influencing clients and teams towards the right information technology architecture, and acting as the interface between business leadership, technology leadership and the delivery teams
Leading and mentoring other IT consultants within the practice and across business units
Supporting business development and ensuring high levels of client satisfaction during delivery
Contributing to the thought capital through the creation of executive presentations, architecture documents, and IT position papers
Scope client requirements
Specify solutions and articulate value to customers
Provide best practice advice to customers and team members
Work with end users to gather requirements and convert them into technical documentation
Identify and resolve performance bottlenecks
Knowledge of Big Data features and Plug-ins

Mandatory People Skills
Should be able to work in a team
Should be able to mentor other team members on Big Data
Ability to produce high quality work products under pressure and within deadlines
Coordinate with developers, other architects, stakeholders and cross-functional teams on both the organization and customer sides

Mandatory Functional Skills
Knowledge of the CPG or Retail domain will be appreciated
Participate in full Software Development Life Cycle (SDLC) of the Big Data Solution

Education
Bachelor's degree in Computer Science or a related field preferred (BE/B.Tech)
Master's degree in a related field preferred (ME/M.Tech, MS, MCA)

Professional Experience Required
12-15 years of experience in designing, architecting and implementing large-scale data processing/data storage/data distribution systems
Extensive experience working with large data sets with hands-on technology skills to design and build robust Big Data solutions
Ability to work with multi-technology/cross-functional teams and customer stakeholders to guide and manage the full life cycle of a Hadoop solution
Extensive experience in data modeling and database design, involving any combination of:
Data warehousing and Business Intelligence systems and tools
Relational and MPP database platforms like Netezza, Teradata
Open source Hadoop stack
Hands-on administration, configuration management, monitoring, and performance tuning of Hadoop/distributed platforms
Strong understanding of Big Data Analytics platforms and ETL in the context of Big Data
Ability to frame architectural decisions, provide technology leadership & direction
Excellent problem solving, hands-on engineering skills and communication skills

Professional Experience Preferred
Knowledge/experience of cloud computing infrastructure (e.g. Amazon Web Services EC2, Elastic MapReduce)
Considerable understanding and experience of real-time analytics
Should be willing to independently code and review modules (role is not limited to solution architecture)

Technical Skills Required
Any combination of the below technical skills:
Hadoop: HDFS, MapReduce, Hive, HBase, Pig, Mahout, Avro, Oozie
Stream Processing: Spark, Storm, Samza
NoSQL: Cassandra, MongoDB, HBase
Appliances: Teradata, Netezza, Greenplum, Aster Data, Vertica
Languages/Platforms: Java, Scala, Linux, Apache, Perl/Python/PHP
Cloud: AWS, IBM SoftLayer, Microsoft Azure, Google Cloud
Any RDBMS/DWBI technologies
             
