Job Description:
1. Monitor and coordinate all data system operations, including security procedures, and liaise with the Infrastructure, Security, DevOps, Data Platform, and Application teams.
2. Ensure that necessary system backups are performed and that backup storage and rotation are accomplished.
3. Monitor and maintain records of system performance and capacity, arrange vendor services or other actions for reconfiguration, and anticipate requirements for system expansion.
4. Assist managers in monitoring and complying with State data security requirements.
5. Coordinate software development, user training, network management, and major/minor software installation, upgrades, and patch management.
6. Must demonstrate a broad understanding of client IT environment issues and solutions and be a recognized expert within the IT industry.
7. Must demonstrate advanced teamwork and mentoring abilities and possess demonstrated excellence in written and verbal communication skills.

A Bachelor's Degree from an accredited college or university with a major in Computer Science, Information Systems, Engineering, Business, or another related scientific or technical discipline. A Master's Degree is preferred.

General Experience:
1. At least ten (10) years of experience administering a cloud-based, multi-user environment, with expertise in planning, designing, building, and implementing IT systems.
2. At least ten (10) years of product administration experience in a RHEL (Red Hat Enterprise Linux) environment or a Windows Server environment.

Special Qualifications:

Big Data Admin (Two positions)
1. Must have a minimum of 10 years of related experience as a Hadoop administrator, with expert-level knowledge of Cloudera Hadoop components such as HDFS, Sentry, HBase, Impala, Hue, Spark, Hive, Kafka, YARN, ZooKeeper, and Postgres supporting an Enterprise Data Warehouse (EDW) environment.
2. Prior Hadoop cluster deployment experience: adding and removing nodes, troubleshooting failed jobs, configuring and tuning clusters, and monitoring critical parts of the cluster.
3. Must have knowledge of Data Warehousing best practices, including business intelligence, predictive analytics, and business continuity planning.
4. Must have experience with Data Marts, Online Analytical Processing (OLAP), Online Transaction Processing (OLTP), and predictive analysis.
5. Hands-on experience with Cloudera, working with data delivery teams to set up new Hadoop users. This includes setting up Linux users, setting up Kerberos principals, and testing HDFS, Hive, Pig, and MapReduce access for the new users.
6. Responsible for the implementation and ongoing administration of Hadoop infrastructure.
7. Competency in Windows Server and Red Hat Linux administration (security, configuration, tuning, troubleshooting, and monitoring).
8. Experience supporting a variety of databases.

Client: Direct