Job Description:

Direct Client: Texas Department of Transportation (TxDOT)
Solicitation: 4
Title: Big Data Architect
Location: 3712 Jackson Ave, Austin, TX 78731
Duration: 6 Months with possible extension
Last date for submission: June 16, 2021 (2:00 PM CST)
DESCRIPTION OF SERVICES:

The Department of Information Resources (DIR) requires the services of 1 Data Modeler, referred to as Worker, who meets the general qualifications of Data Modeler 3 and the specifications outlined in this document for ITD.

Seeking a candidate with strong data architecture skills and expertise in dimensional data modeling, preferably in a big data or enterprise data warehouse environment. Experience with Informatica PowerExchange for CDC-based replication, or a similar tool, is a plus. The candidate should have:

Expertise in developing data ingestion to a Hadoop-based data lake (HDFS, Hive) and S3 storage, complex data transformations, and data security.
Programming expertise in a major programming/scripting language, including but not limited to Java, C++, Scala, Python, or Go.
Hands-on delivery capability working in a team across the full lifecycle: data source analysis, design, development, testing, and implementation. This includes providing input into requirements, platform development, technical design of the project-level technical architecture for ETL, big data application design and development, testing, and deployment of the proposed solution.
Proficiency with Hadoop ecosystem components such as Hive, YARN, Knox, Ranger, Sqoop, Oozie, HiveQL, Spark, and SQL/data access.
Experience in ETL development using Informatica BDM, Kafka, or comparable software.
Understanding of and skills in data governance, security architecture, load balancing, and troubleshooting.
Deep development experience with Big Data Platform clusters on AWS, HDFS storage optimization, and Data Lake/Data Warehouse foundation capabilities.

The Data Architect will work with a team of developers, analysts, project managers, and solution architects. This person is expected to have a good understanding of data lake and data warehousing architectures, including loading to dimensional data structures and ETL development for many types of data, such as RDBMS sources, API/streaming, structured, and unstructured data. The Worker will design, develop, support, optimize, and provide technical oversight on all phases of the development life cycle on data lake project engagements, including but not limited to participation in data sourcing, analysis, requirements gathering, design, development, testing, deployment, transition, and support. The Worker will help monitor and manage the technical scope on engagements to prevent scope creep and deviation from the contract, and will follow good practices and project guidelines in all cases, including the use of source code control, automated testing and deployment approaches, test-case-based development, and proper documentation practices.

WORKER SKILLS AND QUALIFICATIONS
Minimum (Required):
Years Skills/Experience
10 Data warehouse management, data architecture, dimensional data modeling
8 Experience in ETL development; should be able to guide ETL developers
8 Standard concepts, practices, and procedures within solution architecture development. Relies on experience and judgment to plan and accomplish goals; a degree of creativity and latitude is required. Works under limited supervision with considerable latitude for the use of initiative and independent judgment. Performs deep analysis of systems and their interactions and defines the most efficient solutions for the interaction between systems.
8 Database development (such as Oracle, MS SQL Server, PostgreSQL) and complex SQL
3 Troubleshooting Hadoop configuration and optimization of ETL/Informatica
2 Working knowledge of MuleSoft integration with the Hadoop Data Platform

Preferred (Optional):
Years Skills/Experience
1 Experience with TxDOT systems



Client: TxDOT
