Job Description:
Role: Big Data Administrator
Location: Westlake, TX
Tenure: 12+ Months/ Contract
Mode of Interview: 2 video rounds (1st with the manager, followed by 2nd with a Lead on the team)

Job Summary:
This role is responsible for supporting data ingestion/integration platforms such as NiFi, Kafka, and Informatica. The team installs, secures, manages, and monitors the entire ecosystem that comes with data ingestion and/or streaming requirements.
This role will focus primarily on NiFi administration for Finance systems such as MAP and other hubs that are currently on the path to production. The resource will also support the Data Lake Platform with provisioning of environments, installation and maintenance of tools, and onboarding support for new teams and projects.

The Expertise You Have:
Bachelor's or Master's degree in Engineering, Information Systems, Computer Science, or Information Technology, or equivalent experience.
Strong admin experience supporting NiFi, Kafka, Informatica, Talend, or other standard ETL tools.
Scripting and automation experience using Java, Shell, Perl, or Python.
Working experience on Cloud platforms like AWS, AZURE, GCP.
Familiarity/experience with standard DBMS systems like Oracle, SQL Server, DB2.
Working experience with MPP systems such as Netezza, Teradata, Greenplum, etc.
Nice to have: experience with cloud data warehouses like Snowflake.
Working experience with, and a good understanding of, ecosystems like Hadoop, Hive, Spark, ZooKeeper, etc.

The Skills You Bring:
5+ years of NiFi, Kafka, or Informatica administration experience.
SQL and PL/SQL experience, plus other database programming languages.
Must have scripting and automation experience using Java, Shell, Perl, and Python.
The ability to install, configure, manage, and maintain NiFi systems on-prem and in AWS.
Experience supporting multiple RDBMS, including Oracle, Postgres, and SQL Server.
Ability to implement audit, security, and risk controls to secure data in both on-prem and cloud environments.
Expertise in managing and processing large data sets on multi-server distributed systems, from inception to execution.
Expertise in integration patterns with various internal and external systems. Experience with data structures, ETL, and real-time communication.
Identify NiFi flow issues and help developers prioritize their data flows.
Be the domain specialist and point of contact for core technical capabilities, onboarding new projects, operationalizing procedures, preparing SOP docs, and defining the RACI matrix.
Participate in design discussions and offer consulting services to the Business and Architecture community, including logical/physical design discussions with Architects and Application teams.
Proficiency in architectures related to High Availability, Disaster Recovery and Business Continuity, and Backup and Recovery procedures.
Experience in backend programming languages like PL/SQL, T-SQL, NZSQL, etc.
Scripting experience with Unix shell, Python, Perl, C.
Familiarity with Java, J2EE, Embedded SQL, Forms, Reports, etc.
Bring application owners and other infrastructure teams together to find efficient solutions to issues related to capacity, security, and performance.

Client: Compunnel