Job Description :
Client is a leading, “global ten” provider of custom information technology, consulting, and business process outsourcing services, serving primarily Global 2000 companies. The firm employs more than 150,000 people and works with 805 active clients across banking & financial services, insurance, healthcare, life sciences, retail/consumer, manufacturing, energy, communications, and media. Since being spun out as a public entity in 1998, the company has grown at an unprecedented rate, with anticipated revenue of more than $8B, making it the fastest-growing IT services company over the last 10 years and, with a market capitalization greater than $18B, certainly the most profitable. Client is a member of the NASDAQ-100 Index and the S&P 500 Index and is part of the Fortune 500 list.

We currently have openings for a Cloud Engineer.

Position Type: Contract

Location: Chicago, IL

Job Description:

Primary Skills: Cloud Engineer with Hadoop

Manage large-scale, multi-tenant Hadoop cluster environments residing on Azure/AWS

Experience with one or more leading cloud platforms (SaaS and PaaS), including Amazon AWS, Pivotal Cloud Foundry, Microsoft Azure, and Google Cloud Platform

Handle all Hadoop environment builds, including design, security, capacity planning, cluster setup, performance tuning and ongoing monitoring

Perform high-level, day-to-day operational maintenance, support, and upgrades for the Hadoop Cluster.

Research and recommend innovative and, where possible, automated approaches for system administration tasks.

Create key performance metrics measuring the utilization, performance, and overall health of the cluster.

Deploy new/upgraded hardware and software releases and establish proper communication channels.

Work with appropriate stakeholders to ensure solid capacity planning and to manage our total cost of ownership (TCO).

Collaborate with product managers, lead engineers, and data scientists on all facets of the Hadoop ecosystem.

Ensure existing data/information assets are secure and adhere to a best-in-class security model.

At least 3 years of experience managing a multi-tenant production Hadoop environment;

A deep understanding of Hadoop internals, design principles, cluster connectivity, security, and the factors that affect distributed system performance;

Proven experience with identifying and resolving hardware and software related issues;

Knowledge of best practices related to security, performance, and disaster recovery;

Expert experience with at least two of the following: SQL, Python, Java, Scala, Spark, or Bash.

Experience managing cloud services (IaaS, PaaS); AWS certification is preferred;

Experience with Terraform, CloudFormation, Git, Jenkins, and Ansible is a plus

Experience with complex networking infrastructure including firewalls, VLANs, and load balancers.

Experience as a DBA or Linux Admin.