Job Description:
DirectClient: TXDOT/NTTDATA
Job Title: Senior Big Data Solution Architect
Job ID: 2016-182661
Duration: 7+ Months Contract (CTH)
Location: Austin, TX

Job Responsibilities:
- Document and understand business requirements, environment dependencies and integration points
- Develop end-to-end architecture designs for big data solutions based on a variety of business use cases
- Lead end-to-end Hadoop implementations in a large enterprise environment, integrating with multiple legacy applications in heterogeneous technologies (Microsoft, Java, PowerBuilder, Oracle, SQL Server, Mainframe, GIS (Point Cloud), Sensors, etc.)
- Implement a Hadoop ecosystem that enables a big data storage repository, data warehouse and data mart capabilities, business intelligence (BI), and big data analytics
- Present the architecture design to, and win buy-in from, the various stakeholders (Customer, Server, Network, Security, and other teams)
- Provide technical leadership and governance of the big data team and the implementation of the solution architecture across the following Hadoop ecosystem: Hadoop (Hortonworks), MapReduce, Pig, Hive, HCatalog, Tez, Spark, Phoenix, Presto, HBase, Accumulo, Storm, Kafka, Flume, Falcon, Atlas, Oozie, Ambari, Hue; Security – Kerberos, Ranger, Knox, Oracle ASO, HDFS encryption, AD/LDAP; hosting platform – AWS
- Manage architecture design changes driven by business requirement and other interface integration changes
- Provide overall architecture responsibilities, including roadmaps, leadership, planning, technical innovation, security, IT governance, etc.
- Design, lay out, and deploy Hadoop clusters in the cloud using the Hadoop ecosystem and open-source platforms
- Configure and tune production and development Hadoop environments with their various interacting Hadoop components
- Deliver end-to-end system implementations, addressing data security and privacy concerns
- Design and implement geospatial big data ingestion, processing, and delivery
- Provide cloud-computing infrastructure solutions on Amazon Web Services (AWS: EC2, VPC, S3, IAM)

Basic Qualifications:
- Minimum of 8 years work experience with Enterprise Architecture
- Minimum of 5 years work experience with Big Data Architecture
- Minimum of 5 years work experience with Hadoop setup and implementation
- Minimum of 3 years work experience with Active Directory, LDAP, and Identity Management Integration
- Hands-on experience implementing big data solutions for geospatial data

Preferred Skills:
- Expertise in architecting end-to-end big data solutions based on a variety of business use cases
- Past experience leading the implementation and operationalization of an enterprise-wide big data solution in a cross-functional and cross-technology/domain environment
- Strong preference for expertise in administration, configuration management, monitoring, debugging, performance tuning, and technical resolution across the Hadoop application suite (Hadoop platform, MapReduce, Hive, HBase, Spark, Flume, Oozie, Tez, Ambari, Kafka, Pig, Accumulo, Storm, Falcon, Atlas, Sqoop, NFS, WebHDFS, Hue, Knox, Ranger, Impala, ZooKeeper)
- Database familiarity (Oracle, SQL Server, MySQL)
- Knowledge of and/or experience with Java, Linux, PHP, Ruby, Python and/or R, Informatica, Tableau


Client : TXDOT
