Job Description:


I hope you are doing well. I'm Ashok from Nityo Infotech Corp. We are actively looking for an SDET with ETL/Big Data experience; please share your consultant's resume.


Role: SDET with ETL/Big Data

Location: San Ramon, CA


Essential Job Functions

•  Create and execute software test designs, test strategies, and test cases for the Bank's MDM, Digital, Marketing, and Transactional systems' data ingestion through batch and streaming frameworks.

•  Perform detailed Data Analysis, Data Reconciliation, and Data Integrity checks in databases and file systems using SQL, Hive, Spark, and other programming languages/tools.

•  Validate the data ingestion pipeline built using NiFi, Spark, Kafka, Airflow, and Denodo.

•  Create test data in various file formats and databases.

•  Analyze the logs created by data ingestion jobs and report any failures or performance issues in detail.

•  Build reusable, standard TCOE automation frameworks for functional, regression, performance, and E2E testing using DevOps tools and scripts.

•  Define and document the software test plan, perform multiple phases of testing cycles, and record test results and test metrics, tracking defects in Jira-Xray to ensure applications, products, and/or releases comply with Bank of the West's QA standards.

•  Partner end-to-end with Product Managers, Architects, Tech Leads, and the Data Governance team to understand business requirements and recommend quality-improvement best practices and processes.

Other Job Duties

•  High level of personal commitment to each task, a can-do attitude, and a drive to deliver.

•  Strong communication skills; able to communicate at all levels.

•  Ability to understand banking and financial requirements.

•  High level of responsibility and ownership, from inception through implementation.

•  A proactive approach to problem solving.

•  Good analytical skills to aid in troubleshooting and problem solving.

•  Good testing principles as well as good defect management skills.

•  Experience handling multiple assignments at the same time and working alongside other team members.


Required Experience

•  7+ years of overall IT experience with ETL/Data Warehouse testing or development; Big Data experience is a definite plus.

•  Experience writing complex SQL and Python/shell scripts to test a data ingestion framework against the data mapping and requirements provided, and performing extensive data analysis to identify defects.

•  Strong grasp of Data Analytics, ETL, Data Warehouse, Data Virtualization, and BI Dashboard concepts.

•  Experience working with large-scale Big Data/Enterprise Data Warehouse, Data Integration, Data Migration, and upgrade projects.

•  Experience testing complex data systems and data ingestion pipelines through batch and real-time/streaming frameworks.

•  Experience building/updating automation frameworks using programming languages such as Python, Java, or Shell, or proven prior programming experience in any relevant scripting language.

•  Experience setting up test data in various file formats and databases.

•  Experience with database upgrades, tool upgrades, and interface testing.

•  Experience creating test plans and test cases, and applying engineering best practices for software test engineering, both manual and automated.

•  Experience coordinating testing activities and optimizing test cycles while working with project teams.

•  Experience applying appropriate test methodologies and using test management tools like Jira.

•  Experience conducting and running defect triage meetings with project teams.

•  Experience working with financial services applications.

•  Strong in the SDLC process, test strategy/planning, and test estimation, with experience working in Agile methodology.

•  Effective project and people management skills, with exposure to knowledge management. Solid time management and prioritization skills.

•  Excellent verbal and written communication skills.


Preferred Skills

•  Domain knowledge and previous experience in Banking and Financial Services.

•  Passion for delivering technical solutions and using different testing tools.

•  Experience with or knowledge of Big Data tools such as Spark, NiFi, Kafka, Denodo, and Hive; NoSQL databases like HBase, Cassandra, and MongoDB; and BI tools like Power BI, Zeppelin, etc.

•  Experience testing data integration pipelines built on heterogeneous source systems such as transactional databases and file systems (JSON, delimited, COBOL, Parquet, Avro, etc.), HDFS, APIs, and web services.

•  Experience with source control tools like GitLab or GitHub, and with building DevOps CI/CD pipelines or similar.

•  Experience using Data Governance, metadata, and data lineage tools like Schema Registry, Atlas, ABACUS, etc.

•  Experience with data masking, tokenization, and detokenization processes, and with testing the same.

•  Experience handling multiple assignments; a team player.



Looking forward to working with you.



Ashok Raju

Nityo Infotech Corp.
Suite 1285, 666 Plainsboro Road
Plainsboro, NJ 08536



Desk:  EXT: 4029    


“If you feel you received this email by mistake or wish to unsubscribe, kindly reply to this email with ‘UNSUBSCRIBE’ in the subject line.”
