Job Description:
Title: Big Data Administrator
Location: Westlake, TX
Duration: 6 months (expected to be a long-term contract)

Special Instruction:
Kafka or NiFi Administration
Apache Flume
Apache Sqoop

The Team:
Corporate Technology provides back-office systems support for business groups such as Finance, Accounting, Procurement, Enterprise Risk Management, Compliance, IT Enablement, Human Resources, Communications, Legal & Learning within Client.
This role is part of the CTG Data Services group and is responsible for supporting data ingestion/integration platforms such as NiFi, Kafka, and Informatica.
The team is responsible for installing, securing, managing, and monitoring the entire ecosystem that supports the data ingestion and streaming requirements of the Corporate Technology Group.

The Expertise You Have:
Bachelor's or Master's degree in Engineering, Information Systems, Computer Science, or Information Technology, or equivalent experience.
Strong administration experience supporting NiFi, Kafka, Informatica, Talend, or other standard ETL tools.
Scripting and automation experience using Shell, Perl, or Python (a minimal sketch of this kind of admin automation follows this list).
Working experience on cloud platforms such as AWS, Azure, and GCP.
Familiarity/experience with standard DBMS platforms such as Oracle, SQL Server, and DB2.
Working experience with MPP systems such as Netezza, Teradata, and Greenplum.
Nice to have: experience with cloud data warehouses such as Snowflake.
Working experience with, and a good understanding of, ecosystems such as Hadoop, Hive, Spark, and ZooKeeper.
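By way of illustration, here is a minimal sketch of the routine admin automation referenced above, assuming the kafka-python client (Confluent's tooling or Kafka's stock CLI would serve equally well); the broker address and topic settings are placeholders, not site values.

    # Minimal sketch of idempotent Kafka topic administration. Assumes the
    # kafka-python package; broker address and settings are placeholders.
    from kafka.admin import KafkaAdminClient, NewTopic
    from kafka.errors import TopicAlreadyExistsError

    BOOTSTRAP = "broker1.example.com:9092"  # hypothetical broker address

    def ensure_topic(admin, name, partitions=6, replication=3):
        """Create a topic if it does not already exist."""
        topic = NewTopic(name=name, num_partitions=partitions,
                         replication_factor=replication)
        try:
            admin.create_topics([topic])
            print(f"created topic {name}")
        except TopicAlreadyExistsError:
            print(f"topic {name} already exists")

    if __name__ == "__main__":
        admin = KafkaAdminClient(bootstrap_servers=BOOTSTRAP)
        ensure_topic(admin, "ingest.finance.events")  # hypothetical topic
        admin.close()

Scripting the check this way, rather than running kafka-topics ad hoc, makes topic creation idempotent and schedulable as part of project onboarding.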

The Skills You Bring:
You bring 5+ years of NiFi, Kafka, or Informatica administration experience.
SQL and PL/SQL experience, along with other database programming languages.
Scripting and automation experience using shell, Perl, and Python.
The ability to install, configure, manage, and maintain NiFi systems on-prem and in AWS.
Experience supporting multiple RDBMS platforms such as Oracle, Postgres, and SQL Server.
The ability to implement audit, security, and risk controls to secure data both on-prem and in the cloud.
Expertise in managing and processing large data sets on multi-server, distributed systems from inception to execution.
Your expertise in integration patterns with various internal and external systems, plus experience with data structures, ETL, and real-time communication.
Your ability to identify NiFi flow issues and help developers prioritize their data flows (a minimal monitoring sketch follows this list).
You will be the domain expert and point of contact for core technical capabilities: onboarding new projects, operationalizing procedures, preparing SOP documents, and defining the RACI matrix.
You participate in design discussions and offer consulting to the Business and Architecture communities, including logical/physical design discussions with Architects and Application teams.
Proficiency in architectures for High Availability, Disaster Recovery & Business Continuity, and Backup and Recovery procedures.
You bring experience in backend programming languages such as PL/SQL, T-SQL, and NZSQL.
Scripting experience with UNIX shell, Python, Perl, and C.
Familiarity with Java, J2EE, Embedded SQL, Forms, Reports, etc.
You are able to bring application owners and other infrastructure teams together to find efficient solutions to issues of capacity, security, and performance.
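As a concrete illustration of the flow-monitoring skill above, here is a minimal health-probe sketch against NiFi's REST API. It assumes an unsecured NiFi 1.x instance at a placeholder URL and uses the /nifi-api/flow/status endpoint; the queue threshold is an invented tuning value, and a production version would handle TLS and authentication.

    # Minimal sketch of a NiFi queue/health probe. Assumes an unsecured
    # NiFi 1.x instance; the URL and threshold are placeholders.
    import requests

    NIFI_URL = "http://nifi.example.com:8080"  # hypothetical host
    QUEUE_THRESHOLD = 50_000                   # flowfiles; tune per flow

    def check_flow_status():
        resp = requests.get(f"{NIFI_URL}/nifi-api/flow/status", timeout=10)
        resp.raise_for_status()
        status = resp.json()["controllerStatus"]
        queued = status["flowFilesQueued"]
        if queued > QUEUE_THRESHOLD:
            # In practice this would page the on-call admin or open a ticket.
            print(f"WARNING: {queued} flowfiles queued ({status['queued']})")
        if status.get("invalidCount", 0) > 0:
            print(f"WARNING: {status['invalidCount']} invalid components")
        print(f"active threads: {status['activeThreadCount']}, queued: {queued}")

    if __name__ == "__main__":
        check_flow_status()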

The Value You Deliver:
Primarily supporting data ingestion/integration platforms built on NiFi, Kafka, Informatica, or Talend.
Onboarding new projects to NiFi and setting up clusters on-prem or in the cloud.
Developing automation for deployment, re-hydration, and cluster management.
Establishing standard backup/recovery policies and procedures (a minimal backup sketch follows this list).
End-to-end accountability for NiFi environment stability, performance, and availability.
Supporting peripheral tools within the data lake ecosystem as required.
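To make the backup/recovery item concrete, here is a minimal sketch of one nightly step: archiving a NiFi node's conf/ directory, which in the default 1.x layout holds flow.xml.gz, nifi.properties, and the authorizer configuration. NIFI_HOME and BACKUP_DIR are placeholder paths, and a real procedure would also cover state, repositories, and any NiFi Registry content.

    # Minimal sketch of a nightly NiFi configuration backup. Assumes the
    # default NiFi 1.x layout; both paths are placeholders.
    import tarfile
    import time
    from pathlib import Path

    NIFI_HOME = Path("/opt/nifi")        # hypothetical install location
    BACKUP_DIR = Path("/backup/nifi")    # hypothetical backup target

    def backup_conf():
        BACKUP_DIR.mkdir(parents=True, exist_ok=True)
        stamp = time.strftime("%Y%m%d-%H%M%S")
        archive = BACKUP_DIR / f"nifi-conf-{stamp}.tar.gz"
        # conf/ is the minimum needed to rebuild a node's flow definition;
        # repositories and cluster state need their own procedures.
        with tarfile.open(archive, "w:gz") as tar:
            tar.add(NIFI_HOME / "conf", arcname="conf")
        print(f"wrote {archive}")

    if __name__ == "__main__":
        backup_conf()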
