Job Description :
Job Title: Big Data Administrator (Kafka, Flume, Sqoop)
Job Location: Westlake, TX 76262
Duration: 6 months (with strong possibility of extension)

Description:
Client’s Corporate Technology Group (CTG) is seeking a NiFi / Kafka or Big Data administrator who can configure, monitor, and take end-to-end ownership of the data ingestion platform using NiFi / Kafka or other tools as required.

The Team:
Corporate Technology provides back-office systems support for various business groups such as Finance, Accounting, Procurement, Enterprise Risk Management, Compliance, IT Enablement, Human Resources, Communications, Legal & Learning.
This role sits within the CTG Data Services group and is responsible for supporting data ingestion / integration platforms like NiFi, Kafka, and Informatica. The team is responsible for installing, securing, managing, and monitoring the entire ecosystem that supports the data ingestion and streaming requirements of the Corporate Technology Group.
The Expertise You Have:
Bachelor's or Master's degree in Engineering, Information Systems, Computer Science, or Information Technology, or equivalent experience.
Strong admin experience supporting NiFi, Kafka, Informatica, Talend, or other standard ETL tools.
Scripting and automation experience using Shell / Perl / Python.
Working experience on cloud platforms like AWS, Azure, GCP.
Familiarity / experience with standard DBMS systems like Oracle, SQL Server, DB2.
Working experience with MPP systems like Netezza, Teradata, Greenplum, etc.
Nice to have: experience with cloud data warehouses like Snowflake.
Working experience with, and a good understanding of, ecosystems like Hadoop, Hive, Spark, ZooKeeper, etc.
The Skills You Bring:
You bring 5+ years of NiFi, Kafka, or Informatica administration experience.
Experience with SQL, PL/SQL, and other database programming languages.
Must have scripting and automation experience using Shell, Perl, and Python.
The ability to install, configure, manage, and maintain NiFi systems on-prem and in AWS.
Experience supporting multiple RDBMS platforms like Oracle, Postgres, SQL Server.
Ability to implement audit, security, and risk controls to secure data both on-prem and in the cloud.
Expertise in managing and processing large data sets on multi-server, distributed systems from inception to execution.
Expertise in integration patterns with various internal and external systems, along with experience in data structures, ETL, and real-time communication.
The ability to identify NiFi flow issues and help developers prioritize their data flows.
You will be the domain specialist and point of contact for core technical capabilities: onboarding new projects, operationalizing procedures, preparing SOP documents, and defining the RACI matrix.
Participate in design discussions and offer consulting to the Business and Architecture community, including logical/physical design discussions with architects and application teams.
Proficiency with architectures related to High Availability, Disaster Recovery & Business Continuity, and Backup and Recovery procedures.
Experience with backend programming languages like PL/SQL, T-SQL, NZ SQL, etc.
Scripting experience with Unix shell, Python, Perl, C.
Familiarity with Java, J2EE, Embedded SQL, Forms, Reports, etc.
You are able to bring application owners and other infrastructure teams together to find efficient solutions to issues related to capacity, security, and performance.
The Value You Deliver:
Primarily supporting data ingestion / integration platforms using Kafka, Informatica, or Talend.
Onboarding new projects to NiFi, and setting up clusters on-prem or in the cloud.
Developing automation for deployment, re-hydration and cluster management.
Establishing standard backup / recovery policies and procedures.
End-to-end accountability for NiFi environment stability, performance, and availability.
Supporting peripheral tools within the Data Lake ecosystem as required.