Job Description :
Position : Sr. Big Data Engineer
Location : Cincinnati, OH
Duration : Long term
Local candidates only; in-person interview required. GC or USC candidates only.

Report To :
Required Skills :

Sr Big Data Engineer
Typically requires a minimum of 5 years of related experience with a Bachelor's degree; or 3 years and a Master's degree; or a PhD without experience; or equivalent work experience.
Bachelor's degree in Data Analytics, Computer Science, Information Systems, or Business; prior work experience in Business Intelligence/Data Integration required.
Experience with Spark, Hadoop, MapReduce, HDFS.
Experience with various ETL techniques and frameworks in a high-volume data environment.
Experience with various messaging systems, such as Kafka or RabbitMQ.
Knowledge of workflow/schedulers like Oozie.
Familiarity with DevOps practices and tooling.
Good knowledge of database structures, theories, principles, and practices in a warehouse/analytic platform environment.
Strong analytical and problem-solving skills.
Experience in Agile development methodologies desired.
Job Description:

This Senior Technology Engineer uses his/her specialized knowledge to evaluate the current framework, design, and infrastructure and identify areas to improve the stability and efficiency of the Big Data environment. The Senior Technology Engineer will also plan, design, develop, and test product enhancements and new offerings. With a wide array of services, Worldpay needs to ensure that our products deliver a seamless and intuitive experience, which begins with thoughtful and smart engineering. He/she will be a passionate and informed individual who keeps on top of developments and trends in the payments industry.

The Day-to-Day Responsibilities:
Uses Scrum Agile methodology and the Scaled Agile Framework (SAFe) to maximize productivity and ensure quality
Works in all phases of Software Development Life Cycle (SDLC) such as Requirements Gathering, Design, Development, All Testing (Unit, Integration, Regression, and User Acceptance), Production Deployment, and Support
Collaborates closely with stakeholders and groups (Billing, Product, Settlement, Customer, Accounting, Finance, File Transfer, and Support) to understand, refine and translate the business requirements into technical specifications
Evaluates poorly performing processes and designs short- and long-term resolutions.
Builds Extract-Transform-Load (ETL) solutions for Data Integration and Data Warehousing (DW) using the appropriate data transformation tools (Spark, Sqoop, IBM InfoSphere DataStage) for a wide variety of source and target formats (Mainframe, DB2, Oracle, Exadata, MS SQL Server, Salesforce Web API, XML, MQ, Hadoop)
Assists Solution Architects in designing and estimating solutions as necessary
Creates objects in relational databases and builds moderate-to-complex SQL for data validation, manipulation, and reporting
Serves as a mentor to other members of the development team
Develops UNIX shell scripts to enhance ETL solutions
Assists Production Support in triaging problems by performing root cause analysis and resolving issues in a timely manner to minimize impact to systems and environments
Uses change requests (CRs) to release the code to production without negatively impacting or disrupting workflow, systems, and end users
Works closely with the IBM Tivoli Workload Scheduler (TWS) team to automate and schedule various jobs, taking into account complex system interdependencies
Exercises judgment within defined procedures and practices to determine appropriate action
Documents ETL processes and solutions following company standards for Support and Production Release
Uses excellent communication and interpersonal skills to build lasting working relationships
Self-starter who can carry out a given project/task with little or no assistance