Job Description:

Job Title: DataStage Architect
Location: Concord, New Hampshire, 03301
Experience Required: 12+ Years
Employment Type: Contract

Job Description

We are seeking an experienced DataStage Architect to lead the design, implementation, and optimization of enterprise-level ETL and data integration solutions. In this role, you will work closely with business and technical teams to analyze requirements, architect solutions, and ensure the performance, scalability, and reliability of data workflows across the organization.

Key Responsibilities
  • Design and architect end-to-end ETL solutions using IBM DataStage.

  • Lead the planning, modeling, and execution of large-scale data integration projects.

  • Work with stakeholders to translate business needs into technical solutions.

  • Define architecture standards, frameworks, metadata management, and data governance practices.

  • Optimize ETL workflows for performance, reliability, and scalability.

  • Support data integration with cloud, on-premises databases, and enterprise applications.

  • Provide technical leadership, mentoring, code reviews, and development support.

  • Manage troubleshooting, performance tuning, and production issue resolution.

  • Collaborate with data warehouse, analytics, infrastructure, and QA teams.

  • Prepare technical documentation and architecture diagrams.

Required Skills and Experience
  • 12+ years of experience working with IBM DataStage and ETL development.

  • Strong background in data warehousing concepts, design, and architecture.

  • Expertise with ETL pipeline optimization and performance tuning.

  • Experience with relational databases such as Oracle, DB2, SQL Server, Teradata, or PostgreSQL.

  • Strong SQL and stored procedure development skills.

  • Hands-on experience with Unix/Linux scripting and scheduling tools.

  • Familiarity with cloud data platforms (AWS, Azure, GCP) is preferred.

  • Experience working on large enterprise-scale projects.

  • Strong communication and stakeholder management skills.

Good to Have
  • Knowledge of Kafka, Python, or other integration tools.

  • Experience with Agile methodologies and DevOps practices.

  • Background in big data ecosystems or modern data platform architecture.

Education
  • Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field.
