Job Description:

Dear candidate,

This is Shahista Naaz from DEG. We have an urgent need for a Software Engineering - AB Initio - Advanced position in Columbus, OH with our client. If you are interested, please share your updated resume along with the details mentioned below.

Call: five one six nine two seven seven two one

Position: Software Engineering - AB Initio - Advanced

Duration: 4 Months

Location: Columbus, OH

Job Family Description:

• Expert in Ab Initio and design techniques, with experience working across large environments with multiple operating systems/infrastructure for large-scale programs (e.g., Expert Engineers); starting to serve as a firm-wide resource working on projects across JPMC
• Is multi-skilled, with expertise across the software development lifecycle and toolset
• May be recognized as a leader in Agile and in cultivating teams working in Agile frameworks
• Sought out as a coach for at least one technical skill
• Strong understanding of techniques such as Continuous Integration, Continuous Delivery, Test-Driven Development, Cloud Development, resiliency, and security
• Stays abreast of cutting-edge technologies/trends and uses experience to influence the application of those technologies/trends to support the business; may give speeches outside the firm and write articles

Additional Skills:

Update on 7/29/2021: The manager has updated the job requirements; see below. The rate and location remain the same.
Title: Data Integration Developer (Java/Spark)
• Bachelor's degree or equivalent in Computer Science, Engineering (any), or related field.
• 4+ years of experience in the design and delivery of Data Pipeline solutions using Hadoop/Java/Spark
• 1+ years of experience in cloud deployment (AWS, Kubernetes), Kafka, CI/CD, and automation
• 2+ years of experience in writing and modifying complex SQL queries
• Extensive knowledge of application, design-pattern, data, and infrastructure architecture disciplines
• Experience with Kubernetes and AWS Glue is preferred
• Working experience as an Agile developer and a good understanding of SDLC methodologies/guidelines
• Knowledge of big data technologies such as Kafka/Hadoop/Hive
A data integration developer is responsible for the following key areas:
• Creation of ETL Data Pipeline processes to validate, transform, enrich, and integrate data
• Adopt cutting-edge technologies such as cloud/containers as part of application evolution and modernization
• Understand business requirements and collaborate with the architecture team to translate them into technical design
• Participate in end-to-end development lifecycle activities for the application, including design, coding, testing, and deployment.
• Produce comprehensive tests for all developed code. Support and participate in system and integration testing across sub-systems as the need arises.
• Provide technical support for the application on a rotational basis, including meeting service-level and performance requirements and diagnosing and evaluating inefficient processes/code.
