Job Summary
12 years of data migration and data architecture experience.
Able to capture requirements from source and target applications.
Able to create and understand data relationships.
Experience in handling data exports from SQL Server, Postgres, and file systems.
Experience in data profiling, data cleansing, and data mapping.
Able to lead a team of data architects, data analysts, and data engineers to develop data extraction pipelines.
We are an equal opportunity employer.
|
 |
Launch Your Career with Cook Systems
Ready to elevate your career? Cook Systems, a certified veteran owned IT consulting firm, has been transforming businesses and careers since 1990. Whether you're aiming to work with a Fortune 500 company or a small business, we've got you covered.
Our core values of integrity, investment, and innovation drive everything we do, ensuring you grow and succeed in a dynamic, supportive environment. We understand the importance of work-life balance and personal growth
|
 |
Hi, hope all is well. Please respond with an updated resume if you would like to apply for this contract position.
Title: Maximo Data Architect (100% remote)
Location: Columbus, OH
Duration: 20 Oct 2025 - 27 Mar 2026
Travel Type: remote, or will travel with expenses covered
Key Responsibilities:
- Define and implement enterprise data architecture and standards for IBM Maximo systems.
- Develop logical and physical data models for Maximo and its integrations with ERP, GIS, and other systems.
|
 |
Role: Data Architect
Bill Rate: $90/hour C2C
Location: Remote
Duration: 12+ months / long-term
Interview Criteria: Telephonic + Zoom
Direct Client Requirement
Key Responsibilities:
- Define the data architecture framework and standards.
- Design reference architecture and functional design of Enterprise Data Management solutions, including broader Enterprise solutions.
- Develop key architecture artifacts such as conceptual/logical data models, data flow diagrams, etc.
- Gain a deep understanding of various
|
 |
Job Title: Data Architect
Location: On-Site (Tallahassee, FL)
Job Summary:
The Data Architects, under the working job title of Extract, Transform, Load (ETL) Architects, will serve as the principal line of communication for the project team. The ETL Architects will drive the development of data integration pipelines, enabling efficient, reliable access to critical data within the Correction Information Management System (CIMS) Data Warehouse/Data Lake on Azure. They will work with Azure
|
 |
Hi friends, I have a job post for a Data Architect / Data Modeling role.
Data Architect Engineer role and responsibilities include the following:
• Data modeling
• Data integration
• Data security
• Performance optimization
• Data governance
• Data strategy
• Data migration
• Continuous improvement
• ETL processes
• Cloud development experience (GCP, Azure)
|
 |
Data Scientist - Snowflake SME
Location: Remote
Duration: Long-term contract
10+ years of experience required; Snowflake experience is mandatory. Candidates who can work independently are preferred.
Job Description: The Snowflake resource needs to have a blend of Data Scientist, Architect, and Engineer skills. Able to work with the Snowflake AI Data Cloud, leveraging SQL and Python to extract valuable insights from complex datasets. You develop and optimize machine learning models
|
 |
Hi, hope you are doing well! I have an urgent position. Kindly go through the job description and let me know if this would be of interest to you.
Title: Data Architect - EPIC VBPM & Population Health (Remote)
Duration: 6-12 months
Location: Remote
Note: Travel follows the project travel schedule, usually once a quarter, plus Go-Live on 10/4/25 (travel will be covered by the client). Possibly open to conversion.
Hands-on and certified expert Data Architect specialized in population health data,
|
 |
Role: Data Architect
Bill Rate: $90/hour C2C
Location: Waco, TX
Duration: 12+ months/ long-term
Interview Criteria: Telephonic + Zoom
Direct Client Requirement
Job Description:
Position: Data Architect
The client mainly wants a senior-level data architect who can design and modernize Magnolia's retail/eCommerce data infrastructure on cloud platforms (Snowflake/Databricks/Redshift), build scalable pipelines (ETL/ELT), ensure governance and security, and support analytics/ML use cases.
|
 |