
Datawarehousing ETL Jobs in USA - 1778

($) : Market
Requirements:
• 8+ years of experience with data engineering; 4+ years of experience working with AWS data platforms
• Experience with Agile and Scrum methodologies, Confluence, and Jira
• Experience building and deploying streaming and batch data pipelines
• Significant development and delivery experience with Python and PySpark
• Deployment experience with CI/CD tools: Jenkins, Bitbucket, Kubernetes, Docker, Salt
• Automation experience with CloudFormation, Docker, Kubernetes, and Jenkins
• Extensive hands-on experience implementing data migration and data processing using AWS services such as Athena, Glue, Lambda, S3, DynamoDB, NoSQL, Relational Database Service (RDS), Amazon EMR, and Amazon Redshift
• Experience and detailed knowledge of ETL, data quality, metadata management, data profiling, micro-batches, and streaming data loads
• Experience with the development lifecycle (development, testing, documentation, and versioning) on an AWS platform
Aug-07-20
($) : Market
Job Role: Master Data Operations Analyst
Location: Chicago, IL (remote until the Covid restrictions are lifted)
Duration: 6 Months+
Rate: Market/Open, All Inclusive
Role Summary: Performs duties necessary to maintain the integrity of Master Data records, according to established procedures and guidelines. Responsible for daily review and resolution of tasks, and for communicating data quality concerns or other production issues.
Responsibilities:
• Responsible for daily review and resolution of duplicate/linkage tasks
• Reviews all task types based on the established workflow hierarchy (potential overlays, potential duplicates, potential linkages, and identifier reviews) and updates tasks as appropriate according to established procedures and guidelines, while meeting set expectations and metrics
• Provides timely and clear communication regarding production issues and business impacts, as needed
• Notifies the Data Steward/Manager of errors and potential problems and provides appropriate follow-up
• Maintains and protects confidential information
• Complies with all organizational and departmental policies and procedures
• Performs other duties as assigned
Qualifications:
• 5+ years' experience in related data operations or analysis functions
• Experience with health care operations and data preferred
• Experience with Master Data analysis
• Excellent decision-making abilities and effective problem-solving skills; ability to analyze data and make decisions based on the information gathered
• Analytical experience (e.g., data and process analysis, quality metrics, policies, standards, and processes)
• Strong time management skills; organized, with strong focus and excellent attention to detail
• Strong verbal and written communication skills
Aug-07-20
($) : Market
Position: Data Modeler
Location: Boston, MA
Duration: 6 months
Minimum years of experience: 8
Must Have Skills: The end client is looking for a Data Modeler with at least 8+ years of data modeling experience in Informatica BDM and AWS cloud, which includes but is not limited to:
• Work with the business and BAs to understand the data requirements for dashboards and reporting
• Develop and maintain the conceptual, logical, and physical data models, along with corresponding metadata
• Work with the ETL team to implement data strategies and build data flows
• Develop best practices for naming conventions and coding standards to ensure data model consistency
• Create data dictionaries
• Convert business requirements to technical requirements
Aug-07-20
($) : Market
Position: Stibo Developer
Location: Houston, TX
Duration: 6 months
Minimum years of experience: 6+
Must Have Skills: 1. Stibo STEP 2. Java 3. MDM Architecture
Nice to Have Skills: 1. AWS Cloud
Detailed Job Description: An experienced Stibo resource with extensive hands-on experience in Stibo STEP MDM. Strong knowledge of MDM architecture, design, and development, and awareness of MDM best practices. Strong understanding of MDM concepts, object-oriented design and programming, data modeling, and entity relationship diagrams. Strong understanding of Java, JavaScript, XML, XSD, and JSON. Strong understanding of pattern matching and regular expressions. The resource onboarded will work alongside the client team and BT Director to help architect master data ingestion into the system. The candidate will design, develop, and transform data from multiple sources to arrive at the golden record, and will help develop, implement, maintain, and enforce enterprise data management policies, standards, guidelines, and operating procedures.
Aug-07-20
($) : USD 60 / Hourly / C2C
VDart: We are a Global Information Technology Services & Workforce Solutions firm headquartered in Atlanta, GA, with a presence in the US, Canada, Mexico, the UK, Belgium, Japan, and India. Founded in 2007, our team of over 2,550 professionals continually creates impact for our customers worldwide, solving complex technology challenges with cutting-edge technologies. We specialize in providing Fortune 1000 companies niche, hard-to-find skills in technologies including Social, Mobile, Big Data Analytics, Data Sciences, Cyber Security, IoT, Cloud, Machine Learning, and Artificial Intelligence. With delivery centers in the UK, Mexico, Canada, and India, we provide global workforce solutions to our customers covering EMEA, APAC, and the Americas. VDart is an award-winning organization recognized by the Inc 5000 Hall of Fame; Atlanta Business Chronicle's Fastest Growing Companies; NMSDC's National Supplier of the Year; Ernst & Young's Regional Entrepreneur of the Year; and more.
Position: Data Analyst - Actuarial
Location: Houston, TX
Type of Hire: Contract
Roles & Responsibilities:
1. Need a strong Data Analyst (8-10+ years of experience) with an insurance life/actuarial background who understands business processes and can prepare business requirement documentation
2. Experience in data modeling, data warehousing, PL/SQL, or ETL processing
3. Familiarity with all phases of the software development life cycle, and experience documenting requirements for large and complex data processing and analysis projects
4. Strong attention to detail in both work performed and presentation of work products, with experience communicating and interacting with all levels of management while gathering requirements
5. Understand the monthly manual tasks/activities for the actuarial processes
6. Perform data analysis on the policy admin systems, databases, Excel sheets, and other sources of data required for the various processes
7. Experience with GGY AXIS and/or another valuation system a plus
Referral Program: Ask our recruiting team about how you can be a part of our referral program. If you refer a candidate with the desired qualifications and your candidate accepts the role, you can earn a generous referral fee. We want to hire the best talent available and are committed to building great teams and partnerships. We are an Equal Employment Opportunity Employer. VDart Inc, Alpharetta, GA. Follow us on Twitter for the hottest positions: @VDart_Jobs. Follow us on Twitter: @vdartinc
Aug-07-20
($) : DOE
Job Title: ETL MDM Consultant
Location: Baton Rouge, LA
Visa: USC, GC
Expertise and/or relevant experience in the following areas is mandatory:
• Minimum of 3-4 years' experience in a technology field
• Subject matter expertise in the financial industry desired
• Facilitation of executive-level resources
• Understanding of business processes in financial services preferred
• Understanding of privacy regulations and applying best practices
• Implementation of Master Data Management (MDM) solutions
• Experience working across multiple departments/programs to develop a 360-degree view
• Experience supporting data governance implementation or expansion
• Business process design
• Relational database interaction using SQL (or SQL-based tools)
• Requirements gathering, documentation, and analysis
• Building Concept of Operations documents
• Contributing to the development of Technical Design documentation
Aug-07-20
($) : DOE
Greetings! This is Eva from TechnoGen Inc., and I am writing to see whether you are interested in an exciting and challenging opportunity in Wilmington, Delaware.
Job Details:
Job Title: Big Data Engineer
Location: Wilmington, Delaware
Duration: Long term on W2
Job Description - Top Skills:
• Spark (very strong)
• AWS
• Python
• SQL
• Airflow (plus to have; preference will be given to candidates with Airflow experience)
• Ex-Capital One candidates preferred
Best Regards, Eva
Aug-07-20
Data Analyst  Columbia, SC
Job Title: Data Analyst
Location: Columbia, SC
Duration: Long term
Knowledge & Experience:
• Overall data management and governance frameworks such as DAMA/DMBOK and DCAM/DGI
• Business glossary & taxonomy
• Data architecture & modeling
• Master Data Management
• Data warehousing & business intelligence
• Data custodianship
• Experience with Collibra or a similar data governance tool
Strong expertise in the following:
Metadata Management:
• Metadata management from ingestion to consumption, supporting all data governance functions
• Data classification and prioritization
• Data tagging and custom regular expressions
• Data access governance
• Import and maintenance of metadata across platforms
• Reverse engineering of data models
• Integration of data transformation jobs
• Change control mechanisms (following deployment schedules)
• Bulk import of glossaries
Data Lineage:
• Responsible for lineage that is consumable and actionable for business users
• Generate lineage for all identified CDEs
• Discover broken links and create manual overrides
• Custom scripting for generating lineage on parameterized jobs
• Additional tagging for discovering business lineage
Data Quality:
• Strong experience with the entire data quality lifecycle: profile, standardize, enrich, and remediate
• Custom profiling on data elements beyond basic dimensions to find anomalies
• Ability to define data quality rules based on governance policies and/or business rules
• Schedule and monitor batch data quality processes for standardization and enrichment
• Logging and tracking of data quality issues through to remediation by working with the respective stakeholders
• Dashboarding of aggregated data quality exception results and respective metrics
Qualifications:
• Bachelor's degree in Information Technology or a similar subject area
• A minimum of 10 to 15 years of relevant experience
Technical Toolset (strong hands-on expertise with):
• Databases: SQL Server, Oracle, Essbase, SSAS
• Data Modeling: ER Studio (or Erwin), with deep operational and dimensional modeling and mapping experience
• Data Governance & MDM: Collibra or similar
• Data Integration: SSIS
• BI: Tableau, Hyperion, Cognos, SSRS, or similar
• Environment: On-prem and Cloud
Aug-07-20
Hi, Greetings! We have an immediate opening for an Ab Initio Developer in Houston, TX. Please go through the requirement and reply with your updated profile, contact details, and your availability if you are interested.
Title: Ab Initio Developer
Location: Houston, TX
Duration: 6-9 months, contract to hire
Job Description:
• 8+ years of experience with the Ab Initio Graphical Development Environment
• Experience designing graphs and parameterizing graphs using psets
• Scheduling jobs through Control-M
• Writing Unix shell scripts
• Knowledge of performance tuning of the ETL process by writing optimal SQL code
• Knowledge of writing stored procedures and functions in the database
• Exposure to command-line utilities; creation of automation scripts such as .bat and .sh
• Should have knowledge of working on Agile projects
• Should possess good communication and interpersonal skills; should be able to effectively interact with customers, understand requirements, and pass the information on to offshore
• Should be able to handle the deliverables with the help of the offshore team
• 5+ years of experience with the following is a plus: Hadoop ecosystem technology stacks such as HDFS, HBase, Hive, Pig, Spark, Scala, Cloudera, etc.
Thanks & Regards, Krishna Kishore, United Software Group Inc, 6000 Venture Dr, Suite # C & D, Dublin, Ohio 43017. USG is an Equal Opportunity Employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, age, disability, military status, national origin, or any other characteristic protected under federal, state, or applicable local law.
Aug-07-20
($) : Market
Position: AWS Architect with ETL
Location: Houston, TX
Duration: 6 months
Must-have skills include:
• Hands-on experience on complex projects implementing end-to-end data ingestion/ETL, data quality, data transformation, and data integration
• Hands-on data engineering on AWS (development of data pipelines)
• Hands-on experience in AWS, including Snowflake and Matillion
• Define AWS-based end-to-end solution architecture for data processing through data ingestion, data quality, data transformation, and data integration
• Define the overall AWS framework for a data-on-cloud solution
• Drive discussions with the client's various technical teams
• Experienced in driving requirement elicitation; conduct meetings with various client technical and business teams
• Perform gap analysis vis-à-vis requirements
• Requirements and architecture documentation
• Provide inputs on feasibility and scope fitment
• Worked with offshore teams (based in India)
Aug-07-20
($) : USD 45 / Hourly / C2C
Job Title: Ab Initio Developer
Location: Bay Area, CA / Charlotte, NC
Duration: Long Term Contract
Mandatory Skills: Ab Initio; Collibra; Oracle or Teradata
Required Skills:
• Developer with strong experience in Ab Initio, ETL development, and support
• Independently develops and maintains banking applications using Ab Initio
• Detail-oriented, with an aptitude for solving unstructured problems
• Previous experience in the financial services/banking domain is highly regarded
• Higher level of professional maturity, with a great sense of urgency and follow-through
• Experience with Big Data, Oracle, Teradata, Collibra, MS SQL, and Unix scripting will be considered an added advantage
Aug-07-20
($) : USD 90 / Hourly / C2C
Responsibilities:
• Execute standard onboarding of technical and business metadata into the Informatica CDQ/EDC and Axon environments, ensuring the population of data lineage and linkage between the technical and business metadata
• Participate in the development and implementation of enterprise metadata standards, guidelines, and processes to ensure quality metadata and support for ongoing data governance
• Provide administration support for Informatica CDQ/Axon and, in the future, other data-governance-related products (MDM/Data Quality), including installation, configuration, upgrades, and business continuity support
• Facilitate discussion with data stakeholders on data governance processes, and translate those requirements into Axon workflows
• Participate in the Metadata Management scrum team to partner in work efforts, providing data management experience, including an understanding of concepts, practices, procedures, and tools, along with strong analytical, innovative, and creative problem solving
• Participate in training sessions for business and technology partners covering enterprise metadata standards, guidelines, and processes
• Participate in product evaluations for future data governance initiatives
Qualifications:
• Minimum of 5-6+ years of enterprise data integration and management experience working with data warehouse technologies and data governance solutions (Data Governance, Data Catalog, MDM, and Data Quality)
• Must have 5+ years of hands-on Informatica Data Governance (Axon), CDQ, EDC, Data Quality, and MDM experience, including executing at least 2 large Data Governance, Quality, and MDM projects from inception to production, working as the technology expert
• Must have 5+ years of hands-on developer/designer experience, as well as experience working with Informatica's Data Governance, MDM, and Data Quality products
• Must have 5+ years of practical experience configuring CDQ, EDC, and Axon, including business glossaries, dashboards, policies, search, and Axon maps
• Experience with data quality tools, including data profiling, cleansing, and identity resolution
• Experience defining solution and technical architectures for new solutions, including experience driving project team execution from an architectural standpoint across the complete project life cycle within defined and finite time frames
• Must be a team player, as this role requires working with multiple teams
• Knowledge of AWS and cloud setup of Informatica products will be a big plus
• Excellent communication, presentation, interpersonal, and organizational skills
Aug-07-20
($) : USD 55 / Hourly / C2C
Informatica Developer
Location: Baton Rouge, LA
Responsibilities:
• Expertise in PowerCenter development
• Familiar with Informatica PowerCenter 10.x or above
• Develop ETL processes
• Write SQL statements
• Document and gather business, functional, or technical requirements
• Create, develop, and code from technical specifications
• Familiar with Oracle DB
• Familiar with flat file and XML sources and targets
• Able to install the client and configure the repository
• Participate in code reviews
Aug-07-20
($) : USD 70 / Hourly / C2C
• Strong GL recon experience, along with the Trintech Cadency GL recon tool
• 8+ years of relevant work experience in regulatory reporting, data quality, data analytics, or data management within the financial services industry required, preferably at a large financial institution
• Strong knowledge of regulatory reporting and BCBS 239 compliance
• Understanding of current accounting principles, policies, and financial institution management reporting, the General Ledger, and reconciliation
• Good SQL skills (SQL and advanced SQL)
• Exposure to ETL tools like Informatica (preferred)
• Solid analytical, problem-solving, and root cause analysis skills required, with an ability to deliver analyses in a concise and logical manner
• Solid communication skills (oral and written) to clearly articulate findings, defend a point of view among various groups, and present to senior management
• Must be assertive and have strong follow-up skills
• Strong General Ledger platforms and data warehouse experience
• Must be a proactive and goal-oriented strategic thinker, with the ability to identify creative solutions
• Ability to multi-task, demonstrating independent leadership skills
• Strong Microsoft Office skills, including Microsoft Excel and PowerPoint, to develop reports and presentations
Aug-07-20
Data Analyst  Chandler, AZ
($) : USD 55 / Hourly / C2C
• 5+ years of experience in the IT and banking field, with 3+ years of experience in SQL Server, Oracle, or another database query language
• Understand and update business reporting and processes to reduce duplication of data, reduce data entry into multiple systems, and facilitate quicker data-driven business decisions, documenting the details for the technology team
• Work with end users, business analysts, and developers to streamline data flowing into the applications
• Export and manipulate data from business systems to be used in reports and analysis
• Set up graphs and presentations to visually represent information using Excel and/or Tableau
• Experience creating, documenting, and maintaining processes
• Develop presentations and produce clear data visualizations for multiple levels of staff, management, and a variety of stakeholders
• Implement methods to improve data reliability and quality
• Creative in solving problems, especially thinking outside the box to come up with solutions
• An ideal resource loves working with data
• Strong analytical skills with high attention to detail and accuracy
• Excellent verbal, written, and interpersonal communication skills
Aug-07-20
Data Architect  Cerritos, CA
($) : USD Negotiable / Hourly / C2C
Role: Data Architect
Location: Cerritos, CA
Type of Hire: Contract
Strong EDW design experience and a solid data background, with SQL Server modeling experience.
Aug-07-20
($) : USD 95 / Hourly / C2C
Position: Data Engineer
Location: Remote
Duration: 4-week contract
Deliverables:
• Perform multi-year transactional data set compares, determining anomalies and analyzing root cause(s)
• Create and implement an interim business-user process to operationalize a daily data compare function and exception outputs
• Create and implement interim daily reporting utilizing existing datasets
• Turn learnings into business requirements documentation for future development
Timeline: Some of this work can be done with flexible hours. Coordination between business and technology partners will at times be required between 9 am and 7 pm Pacific, M-F. The expectation is that this work will take a few months to complete, but it is not expected to go beyond this general timeframe.
Skills:
• 7+ years of data analytics experience and skilled scripting. We use Java, Python, JavaScript, and Ruby throughout the business, but welcome experience in different programming languages
• Experience with transactional RDBMSs such as MySQL, Postgres, MS SQL Server, or Oracle
• Experience creating and/or designing data imports and schemas, tools, and processes to build a foundation for reports, reconciliation analysis, and exception management
• Ability to write code from scratch and resolve any vulnerabilities in a timely manner
• Proficiency in encoding, testing, debugging, and modifying existing programming to support current applications
• Experience working in large data environments
• Ability to influence manager and teammates with the development strategy
• Strong communication skills, including the ability to listen to the needs of others and comprehend complex matters, articulate issues in a clear and concise manner, and present effectively in both oral and written presentations to all levels in the organization
• Team-oriented approach: can effectively lead a project or participate as an effective team member, as well as work cross-functionally with other organizations
Bonus points for: Experience with payments and banking transactions (i.e., BAI2, NACHA, card transaction life cycle)
Aug-07-20
($) : USD 110000 / Yearly / Full Time
Role: Kronos Application Engineer / Data Engineer
Location: Seattle, WA (Initially Remote)
Type of Hire: Full-time
Roles and Responsibilities: The role will work on location-specific requirements for production deployment within the project timeline.
Key Skills:
• Bachelor's Degree in Computer Science or a related field, or equivalent work experience
• 5+ years of Kronos configuration experience required
• Hands-on experience with Workforce Integration Manager
• Experience building inbound and outbound interfaces; able to analyze and handle complicated interfaces, including calculations
• Capable of understanding the flow of scheduled jobs related to interfaces
• Strong knowledge of SQL or Oracle Database
• 5+ years of data engineering experience with SQL and/or NoSQL environments; functional knowledge of Kronos is preferred
• Experience building complex SQL queries and stored procedures based on requirements
• Understanding of how to model data, build pipelines, write optimized SQL, and create reports and dashboards
• Business requirements gathering and analysis experience
• Computer science fundamentals in data structures, algorithms, problem solving, and complexity analysis
• Knowledge of technical design documentation for the whole implementation
• Ability to work with end users on testing and user training
• Ability to do pay file comparison/validation
• Configuration knowledge of Kronos version 6.3
• Able to understand the flow of unit testing, QA, SIT, and parallel testing
Skills Summary:
• Experience with large-scale distributed systems
• Understanding of Big Data technologies and solutions
• Requires interaction with Business Analysts, Functional Analysts, technical staff, end users, and HRA for a successful project launch
• Ability to deal well with ambiguous/undefined problems; ability to think abstractly
• Knowledge of professional software engineering practices and best practices for the full software development life cycle (SDLC), including coding standards, code reviews, source control management, build processes, testing, and operations
• Quick understanding of business requirements and design analysis
• Must be able to work independently and complete tasks in a timely manner
• Experience with Agile software development
Aug-07-20
Data Architect  Lake Oswego, OR
($) : USD 70 / C2C
Role: Data Architect
Location: Lake Oswego, OR 97035
Duration: 11 months
Top 3 skills: data analytics, design, health care experience
Primary Skills:
• Data analytics: define the data requirements and structure for the application; model and design the application data structure, storage, and integration
• Architectural experience and a solid understanding of data warehouses
• Excellent knowledge of data modeling and SQL
• In-depth knowledge of end-to-end ETL solutions
• Hands-on experience creating mapping documents and design documents
• Understanding of and knowledge of Big Data technologies
Aug-07-20
AWS Data Engineer  Baton Rouge, LA
($) : USD 60 / Hourly / C2C
Role: AWS Data Engineer
Location: Baton Rouge, LA
Hiring Mode: Contract
Job Description:
• AWS (S3, EMR, Glue), Python, SQL, shell scripting, Hive, JSON, Postgres
• Good experience in ETL development
• Experience working in a Big Data environment
Aug-07-20

Understanding Data Warehouse & ETL

A data warehouse is a large database designed solely for data analysis. It is not used for standard database processing or business transactions. ETL (Extract, Transform, and Load) is the process by which data is extracted from various sources, transformed to remove redundancy and inconsistency, and loaded into a data warehouse or repository, where it becomes available for querying and analysis. ETL supports effective decision making and analytics based on the consolidated data. Slices of data from the data warehouse can be stored in a data mart, which enables quick access to specific data such as sales summaries or finance reports.
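The extract, transform, load flow described above can be sketched in a few lines of Python. This is a minimal, illustrative sketch only: the source records, the `sales` table, and the in-memory SQLite "warehouse" are all invented for the example.

```python
import sqlite3

# Hypothetical records extracted from two source systems (names are illustrative).
crm_rows = [("Acme Corp", 1200.0), ("Beta LLC", 800.0)]
erp_rows = [("acme corp", 1200.0), ("Gamma Inc", 450.0)]

def transform(rows):
    # Transform step: normalize customer names and drop duplicates,
    # removing the redundancy and inconsistency mentioned above.
    seen, out = set(), []
    for name, amount in rows:
        key = name.strip().lower()
        if key not in seen:
            seen.add(key)
            out.append((key.title(), amount))
    return out

def load(conn, rows):
    # Load step: write the cleaned rows into the warehouse table.
    conn.execute("CREATE TABLE IF NOT EXISTS sales (customer TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")        # stand-in for a real warehouse
load(conn, transform(crm_rows + erp_rows))  # extract -> transform -> load
print(conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0])  # 3 distinct customers
```

Real pipelines swap the lists for source connectors and SQLite for a warehouse such as Redshift or Snowflake, but the three stages stay the same.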

Data Warehouse Features & Capabilities

A data warehouse has features and capabilities that make data analysis easy. A good data warehouse should be able to: • Interact with other sources and inputs, extracting data using data management tools. • Extract data from a variety of sources: files, Excel workbooks, applications, and so on. • Allow cleansing so that duplication and inconsistency can be removed. • Reconcile data to standard naming conventions. • Allow both native and autonomous storage of data for an optimized process.
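As a small illustration of the "standard naming conventions" point, a cleansing step often maps variant field names from different sources onto one convention before loading. The mapping and field names below are assumptions made for the sketch, not a real standard.

```python
# Hypothetical rename map: variant source field names -> one warehouse convention.
STANDARD_NAMES = {
    "cust_nm": "customer_name",
    "CustomerName": "customer_name",
    "ord_dt": "order_date",
    "OrderDate": "order_date",
}

def standardize(record):
    # Rename known variants; pass unknown fields through lowercased.
    return {STANDARD_NAMES.get(k, k.lower()): v for k, v in record.items()}

print(standardize({"CustomerName": "Acme", "ord_dt": "2020-08-07"}))
# {'customer_name': 'Acme', 'order_date': '2020-08-07'}
```

Once every source feeds through the same rename map, downstream queries can join on one column name regardless of where the record originated.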

Top ETL Tools to excel in Data Warehousing Jobs

There are many ETL tools available on the market. The most commonly used include: • Sybase • Oracle Warehouse Builder • CloverETL • MarkLogic. There are also excellent data warehousing platforms such as Teradata, Oracle, Amazon Web Services, Cloudera, and MarkLogic; expertise in any of these can fetch you a good job in the field of data warehousing.

Salary Snapshot for Data warehousing Jobs in US

A senior Data Warehouse developer earns an average of $123,694 a year. Depending on skill and expertise, salaries in this field can range anywhere from $83,000 to $193,000. Most Senior Data Warehouse Developers in the United States receive a salary between $103,500 and $138,000. There are currently plenty of Data Warehouse developer jobs in the USA.

Career Path for a Data Warehouse Professional

Data warehousing offers immense opportunities for an IT professional. There are a plethora of roles and designations required to manage this vast application area and its different modules. Data warehouse managers are software engineers who build storage mechanisms that meet the needs of the organization. Entry-level roles are Software Developer, Software Engineer, Business Intelligence (BI) Developer, and Data Warehouse ETL Developer. People who use the data in the warehouse to drive decisions include Data Analysts, Data Scientists, and Business Intelligence (BI) Analysts. Senior roles in this field are Data Warehouse Manager, Senior Financial Analyst, Senior Software Engineer / Developer / Programmer, and Senior Business Analyst. Data warehousing jobs in the USA remain prevalent, and if you specialize in this field, you can make a great career out of it.
Data Warehouse Skills & Tools
To be a data warehousing professional, you need an in-depth understanding of database management systems and their functions. Experience developing databases with any database application is an added advantage. Apart from this, the other technical skills required for a data warehousing job are: • Tools for developing ETL. You can either build ETL jobs quickly through mappings or write them from scratch. Commonly used ETL tools are Informatica, Talend, and Pentaho. • Structured Query Language (SQL) is the backbone of ETL. You must know SQL, as it is the technology used to build ETL processes. • Parameterization is another crucial skill to master. • Knowledge of a scripting language commonly used alongside databases, such as Python, Perl, or Bash, will come in handy. • Debugging is another essential technical skill, as nothing ever goes exactly as planned.
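Since the list above calls out SQL and parameterization together, here is a minimal sketch of a parameterized query using Python's built-in sqlite3 module. The table, data, and function name are invented for illustration; the same placeholder idea applies in any database driver or ETL tool.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("east", 100.0), ("west", 250.0), ("east", 50.0)])

def total_for_region(conn, region):
    # The "?" placeholder keeps the query reusable and safe from injection --
    # the kind of parameterization an ETL job uses to run the same logic
    # per region, per date partition, and so on.
    cur = conn.execute("SELECT SUM(amount) FROM orders WHERE region = ?", (region,))
    return cur.fetchone()[0]

print(total_for_region(conn, "east"))  # 150.0
```

Swapping the hard-coded value for a parameter is what lets one mapping serve many loads instead of maintaining a near-duplicate query for each.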
Data Engineer  Fremont, CA
($) : USD Negotiable / Hourly / C2C
Role: Data Engineer. Location: Fremont, CA. Rate: Negotiable. Job Type: Long Term Contract. Job Description: Should be comfortable working with complex SQL queries and fine-tuning them, using functions like maps and arrays. Should have good knowledge of Python and Hive. Experience working on data pipelines. Must have exposure working with SFDC and be familiar with its table structures (deals, quotes, service orders, subscription billing, etc. Should be comfortable creating tables (preferably in dimensional modeling) and should also know normalization. Good to know Tableau or any other reporting tool. Requirement gathering and coordinating with multiple clients and cross-functional teams.
Aug-07-20
($) : open
Hi, Title: database developer III Location: NYC, will be remote though Contract: 9 months extendable Must have: becubic or Rochade DB Dev III (38184) Becubic / Rochade 10 spots – Data Lineage (tracking the metadata for flow – in rest as well as flight – files and reports in the lifecycle) Have “factories” Communication is very important User ID and password to allow for scanning Lineage diagrams ? what the tech is, how they connect etc. ? Oracle ? CSV file ? Table ? Report Data is moved with multiple programming languages / tools so they outline and pass to the next team for scanning (Stitchers) Apply patches / triage issues They have migrations that they are heavily involved in and are responsible for the environment to support the application What are the top 3 skills that you’ll be looking for on a resume? Data Lineage, Data Scanning, ASG Becubic, ASG Rochade Responsibility : Responsible for determining systems requirements for new or modified database application programs, creates the system specifications and Is responsible for the development, testing and implementation of efficient, cost effective application solutions. Will receive general direction from the Manager, work closely with business analysts to identify and specify complex business requirements and processes. May co-ordinate the activities of the section with the client area and other IT sections (e.g., data base, operations, technical support A good fundamental database developer who can work in conjunction with the data architect/modeler on the data warehouse reporting solution. Possess expertise in writing and tuning SQL queries, view, stored procedures, and functions. Experience interpreting data models and developing database structures; using standard diagramming techniques to design and develop computer data models; and implementing and troubleshooting programming changes and modifications. 
Ability to work within a team and lead database efforts and to document customer requirements, translate into technical designs, and explain technical issues clearly and accurately to both technical and nontechnical audiences. Strong problem-solving and interpersonal skills, as well as strong written and oral communications skills and demonstrated ability to adjust to changing priorities and handle multiple tasks simultaneously. Prepares deliverables such as source to target mappings and transformation rules documentation. 10-15 years of relevant work experience is required.
Aug-07-20
($) : Market
Title: Data Analyst Duration: 1 Year + Location: Piscataway, New Jersey Job Description Description Seeking an experienced Data Visualization Specialist to join the Digital Analytics team. The Visualization Specialist will work closely with the GCP/BQ team, Data Engineering team and Business teams to understand the data and reporting requirements & leverage their Visualization expertise to build Executive dashboards. Requirements: *Knowledge of best practices in data visualization; Shows creativity in creating visualizations which communicate effectively *Implement data visualizations using Google Data Studio. *Create KPIs, dashboards and alerts based on business requirements. *Transform data into a usable state for analytics and visualizations. *Experience defining meaningful metrics for an operations or support team to provide insights and visibility into key strategic and operational performance of the organization *Excellent SQL scripting & query skills * 5-10 years of experience
Aug-07-20
Role: ETL Informatica Developer Location: 100% Remote (2 video interviews) (No WEST Coast, EAST coast Preferred) Duration: 12-18mth+ Location: Remote VISA: Citizen, GC · Skills: Informatica ETL · Oracle SQL · SQL Server TSQL · Microsoft SSIS Packages for ETL Provides advanced professional input to complex Data Science assignments/projects. Responsible for the research, extraction and analysis of data. Evaluates and writes reports. May research and analyze algorithms. Provides statistical data trends to business partners such as medical management and underwriting. May also assist in the designing and implementation of systems to analyze and report findings. Combines IT capabilities with advanced clinical knowledge to determine trends, cost/benefit ratios, and forecasting of costs, management, and economics. Supports and provides direction to more junior professionals. Works autonomously, only requiring expert level technical support from others. Exercises judgment in the evaluation, selection, and adaptation of both standard and complex techniques and procedures. Utilizes in-depth professional knowledge and acumen to develop models and procedures, and monitor trends, within Data Science. Builds and maintains ETL processes. Monitors running processes and recommends continual improvements.
Aug-07-20
($) : Market
Title: Senior Data Architect Duration: 1 Year + Location: Washington DC Job Description: Responsible for establishing guidelines and standards for the Medicaid Enterprise Management Solution (MEMS) data architecture, design reviews, data exchange planning, and implementation and data migration to the new system. He/she should have similar experience in MEMS solution development of similar size and complexity. He/she should have at least seven (7) years of similar experience. Research and develop data solutions for our healthcare clients. Work on proof-of-concept architecture and features before they are integrated into data systems. Influence data design decisions, features, and APIs based on real-world usage experience. Work to create and work within a data governance structure. Independently lead projects from inception to completion. Assist other CLIENT teams in the evaluation of future technical needs. Mandatory Qualifications: 7 years or greater in data architecture, data management, data integrity and/or data warehousing. Familiarity with HL7, FHIR or other healthcare data transmission standards. College degree or equivalent business experience, preferably within State Healthcare. Requested Qualifications: Additional experience with healthcare data and analytics systems as well as BI and reporting tools such as Power BI, Sisense, QlikView. Knowledge of federal and state healthcare IT programs (i.e., Medicaid, CHIP, TANF). Strong analytical and critical thinking skills. Ability to work independently and manage work to a defined schedule. Capable of conducting meetings and making presentations.
Aug-07-20
Hi, This is Sarath from Avtech solution Inc, Hope you are doing great! Please review the following job description and let me know if you are available and willing to apply Title: Senior Data Architect Location: Costa Mesa, CA Duration: 6 Months + Visa: USC and GC only Employment: W2 & 1099 only Job Description: Bachelor's degree in Computer Science or relevant work expertise 10+ years of experience in critical business database modeling, design, development, and management, including ETL 5+ years of experience with non-SQL databases and big data technologies, such as Apache Hadoop, Spark, Mongo, Airflow 5+ years supporting and developing data governance processes, including provenance Recent experience with Cloud Native databases, such as DynamoDB, RedShift, BigQuery, Snowflake, etc Expert skills in SQL/query development and analysis, physical/logical database design, and performance optimization Ability to create clear, detailed, concise documentation - architecture diagrams, presentations, and design documents Strong attention to detail with matching experience in translating business requirements to data solutions, excellent written and oral communication skills Solid architectural knowledge of relevant OS, storage, networking, virtual machines, high availability Ability to work in a team with highly motivated people. Ability to prioritize and work on multiple projects concurrently Thanks & Regards, Sarath | IT Recruiter AVTECH Solutions Inc. Phone-Ext(1006) Email: Web: www.avtechsol.com
Aug-07-20
Greetings from Avtech Solutions! We have an opening for a "Senior Data Architect" requirement for one of our client engagements; the job description is below. If you are interested, kindly call or email me ASAP. Position: Senior Data Architect. Location: Costa Mesa, CA. Duration: 6 Months Contract, W2. Experience: 10+ years. Experience in critical business database modeling, design, development, and management, including ETL. Experience with non-SQL databases and big data technologies, such as Apache Hadoop, Spark, Mongo, Airflow. Supporting and developing data governance processes, including provenance. Recent experience with cloud-native databases, such as DynamoDB, Redshift, BigQuery, Snowflake, etc. Expert skills in SQL/query development and analysis, physical/logical database design, and performance optimization. Ability to create clear, detailed, concise documentation - architecture diagrams, presentations, and design documents. Strong attention to detail with matching experience in translating business requirements to data solutions; excellent written and oral communication skills. Solid architectural knowledge of relevant OS, storage, networking, virtual machines, high availability. Thanks & Regards, Raj - Rajesh Kumar AVTECH Solutions Inc Ext - 1009 (Voice)
Aug-07-20
($) : Market
Title: Senior Data Architect Duration: 1 Year + Location: Waterbury, Vermont Job Responsibilities: Responsible for establishing guidelines and standards for the Medicaid Enterprise Management Solution (MEMS) data architecture, design reviews, data exchange planning, and implementation and data migration to new system. He/she should have similar experience in MEMS solution development of similar size and complexity. He/she should have at least seven (7) years of similar experience. Research and develop data solutions for our healthcare clients Work on proof-of-concept architecture and features before they are integrated into data systems Influence data design decisions, features, and APIs based on real-world usage experience Work to create and work within a data governance structure Independently lead projects from inception to completion Assist other CLIENT teams in the evaluation of future technical needs Mandatory Qualifications: 7 years or greater in data architecture, data management, data integrity and/or data warehousing Familiarity with HL7, FHIR or other healthcare data transmission standards College degree or equivalent business experience, preferably within State Healthcare Requested Qualifications: Additional experience with Healthcare data and analytics systems as well as BI and reporting such Power BI, PsiSense, QlikView Knowledge of federal and state healthcare IT programs (i.e., Medicaid, CHIP, TANF) Strong analytical and critical thinking skills Ability to work independently and manage work to a defined schedule Capable of conducting meetings and making presentations Experience using SharePoint and Microsoft Office applications
Aug-07-20
Specific position duties include: • Work with business teams to understand requirements for PDO Backlog features and deliver ETL solutions with high business value using Agile. • Work with Data Architects (DAs) to define, create and modify EDW/GDW data models. • Assist with analyzing commonality in EDW/GDW information models and identify needs for Extract, Transform and Load (ETL). • Participate in PDO product team ceremonies such as standups, iteration planning, demo days and release planning. Skills Required: Must Have Skills: ETL development experience with IBM InfoSphere DataStage (v11.5). ETL and development experience with Informatica PowerCenter. SQL experience involving multiple, complex queries. Good understanding of data modeling and data quality solutions for large programs. Exceptional analytical experience with complex applications and data relationships. Team oriented with strong interpersonal skills and able to work as part of a product team. Strong written/oral communication skills. Strong drive for results and ability to work independently. Self-starter with proven innovation skills. Demonstrated commitment to quality and project timing. Demonstrated ability to document complex systems. Demonstrated ability to multi-task and adjust resources/assignments based on changing priorities. Experience in creating and executing detailed test plans. If you are interested, please share your profile to
Aug-07-20
SAS Developer  Raleigh, NC
Required: Experience building ETL process flows in the SAS BI Tool Data Integration Studio (SAS/DI) or a similar ETL tool. Advanced level programming in SQL, SAS Data Step and SAS Macro. Proficiency with joining tables from an RDBMS such as Oracle. Proven experience delivering Business Intelligence solutions Desired: Knowledge of health insurance data. Experience with metadata-driven SAS code generation. Experience with SAS Enterprise Guide and Management Console. Experience with Tableau, dashboard design and development
Aug-07-20
($) : Market
Title: Data/Technical Architect Duration: 1 Year + Location: Cary, North Carolina Job Description As an Enterprise Architect (EA) in the Network Systems Strategy and Architecture team, you will drive the architecture, design, and engineering practices and be a key contributor to the target application architecture of the portfolio. The EA/SA will be working across portfolios and with the Development teams to ensure that the progression of our technical stack is kept in mind while balancing the needs of business programs. The EA/SA will also facilitate the reuse of ideas, components, services, and proven patterns across various solutions in a portfolio. The EA/SA will be supporting current and near-term business needs, defining architectural and i as well as enforcing them by working with Development Teams. The EA/SA will work at both a Portfolio, Program level and with the Development Agile Teams. The EA/SA will be working with other Solution Architects, Development Teams and Project Managers to define and implement synergies and savings by consolidating OSS (Operations Support Systems) applications in alignment with our North Star Architecture Roadmap. The EA/SA will also be assisting with AWS (Amazon Web Services) migration by working with SAs, Development Teams and DBAs (Database Administrators) to implement rehosted and refactored systems as the most cost effective AWS Cloud solutions. The candidate must be able to establish relationships and work with managers at all levels without supervision. You must be able to articulate the problem statement and openly bring up challenges/roadblocks and impediments, negotiate, influence and drive solutions with the OSS (Operations Support Systems Responsibilities include: Partnering & Communication Skills. Partners with Business and IT stakeholders to vet benefits associated with OSS Application Consolidations. Represents the NTS portfolio in IT-wide coordination, etc. 
Key Skills: Excellent Communication and Presentation skills, Consensus Building skills Business Understanding. Understands the OSS domains and the language of the telecommunication industry. Key Skills: Domain knowledge. Pragmatism Analytic Skills. Leads the oversight of financial research for cost saving initiatives within the OSS and cost. Key Skills: Modeling and Critical thinking Technology Strategy. Understand and build on emerging industry trends. Contributes to and influences, IT-wide definition and enhancements of AWS and other public cloud platforms customization, technology and product standards and best practices for Verizon. Define NTS portfolio specific technology requirements and architecture enablers and drive solutions to address them. Key Skills: Thought Leadership, Architectural practices, Research Architectural Compliance. Defines and enforces in the Network Systems portfolio architecture and best practices; availability; redundancy schemes, performance, Disaster Recovery, DB and Interfacing Systems, etc. Key Skills: Strategic Alignment ADDITIONAL MUST HAVE SKILLS: 10+ years of experience creating architectural artifacts, system designs, and associated executive level presentations 10+ years of demonstrable experience leading business analysts, systems or process analysts, and software developers and facilitate group meetings/discussions. 
Understanding of EA frameworks and a wide range of systems/solution engineering life-cycle methods and artifacts (e.g., agile, waterfall and associated artifacts) 5+ years of experience designing cloud based architectures with a preference for AWS 5+ years of experience developing ESB, SOA, and microservice architectures with a preference for Kafka/Pulsar, Apigee and RESTful services DESIRED SKILLS Experience in telecommunications OSS (Operations Support Systems) domains and functions related to Engineering, Provisioning, Activation, Service Assurance, Trouble Ticketing, & Network Managed Systems Experience with SAP S/4HANA with a preference for AM, PM, and MDG modules. 6+ years of experience as a Java developer
Aug-07-20
Ab Initio Developer Location: Minneapolis, MN and Chandler, AZ Duration: 6-24 Months Zoom/Web-EX interview! Need someone with very good development exp. 6+ years of Ab Initio experience 4+ years of ETL experience Locations are Minneapolis, MN and Chandler, AZ Email: Peter at softsages dot com
Aug-07-20
($) : Market
Title: Data Analyst Specialist Duration: 6 Months + Location: Eden Prairie Minnesota Job Description Expertise in Provider domain. Proven ability to discover new information, form conclusions and support decision-making. Self-starter who can embrace challenges and take initiatives for difficult tasks. Experience working with large datasets from various sources. Strong analytical and problem solving skills. Experience in collecting data requirements, data exploration, ingesting, cleansing, transformation and modelling. Strong technical skills. Strong proficiency with SQL. Advance query writing skills. Experience in Data Integration methods such as ETL/ELT/ELTL, Data Pipeline etc. Knowledge of best practices, schema management and CRUD operations. Preferred Experience with Python. Basic understanding of data science. Experience in Data Life Cycle(DLC) such as Data Acquisition, Data Curation, Data Governance, Data Quality, Data Normalization, Data Externalization Pattern etc. Experience in Data Analytics such as Descriptive Analytics and Predictive Analytics is most preferred Lead data analyst experience. Additional Notes to Vendor : Need senior and Experienced Data Analyst. Mandatory Skills : Healthcare and Provider Data Experience, Python, Data Integration Methods, SQL Queries.
Aug-07-20
Data Modeler  Hartford, CT
($) : Market
Job Title: Data Modeler. Location: Hartford, CT. Job Description: Data modeling, data warehousing, data mapping, business collaboration, reverse engineering, AWS, Snowflake, MicroStrategy, Agile methodologies. Develop the logical data model based on an understanding of the conceptual model, and conduct data profiling with the objective of delivering high-quality data models and high-quality data within the limits of the standards defined by Cognizant, the client, and the industry. Map KPIs to the conceptual model if required. Validate the conceptual model with the senior data modeler, SMEs, etc. Create a logical model capturing the descriptions of entities, tables, columns, etc., based on the conceptual model. Create tables and relationships between the tables. Map the conceptual model to the logical model.
Aug-07-20
($) : Market
We have an urgent opening on ETL lead. It's a remote position until covid 19. interested candidates can send resume to srini(at)nexusitinc(dot)com 5 – 10 years of hands-on experience working with Informatica power center Must have experience in leading offshore based teams to deliver software solutions to customers Strong knowledge of working with relational databases like DB2, Oracle, Sql server Hands-on experience in writing shell scripts on Unix platform Understanding of Data Models: Conceptual, Logical, and Physical & Dimensional/Relational Data Model Design. Analyze functional specifications and assist in designing potential technical solutions Identifies data sources and works with source system team and data analyst to define data extraction methodologies Proficient in ETL Informatica and possesses working knowledge of Informatica MDM Defines, develops, documents and maintains Informatica ETL mappings and need to be well versed in Unix shell scripting Good knowledge in writing complex queries in DB2/Oracle PL/SQL Maintain batch processing jobs and respond to critical production issues communicate well with stakeholders on his/her proposal/recommendations Knowledge, status/risks regarding delivering solution on time Strong experience with Data Analysis, Data Profiling, Root Cause Analysis Should able to understand Banking system/processes and data flow Can work independently, lead and mentor the team
Aug-06-20
($) : Market
We have an urgent position on ETL Lead. Interested candidates can send resumes to (srini at nexusitinc dot com The ETL Lead will be responsible for analyzing the business requirements, design, develop and implement highly efficient, highly scalable ETL processes. Candidate is required to perform daily project functions with a focus on meeting the business objectives on time in rapidly changing work environment and should be able to lead and drive globally located team to achieve business objectives Required Skills: 5 – 10 years of hands-on experience working with Informatica power center Strong knowledge of working with relational databases like DB2, Oracle, Sql server Hands-on experience in writing shell scripts on Unix platform Basic knowledge of working with Hadoop technology stack like HDFS, Impala, MapReduce, Spark, Hive etc. Understanding of Data Models: Conceptual, Logical, and Physical & Dimensional/Relational Data Model Design. Analyze functional specifications and assist in designing potential technical solutions Identifies data sources and works with source system team and data analyst to define data extraction methodologies • Defines, develops, documents and maintains Informatica ETL mappings and need to be well versed in Unix shell scripting Good knowledge in writing complex queries in DB2/Oracle PL/SQL Maintain batch processing jobs and respond to critical production issues communicate well with stakeholders on his/her proposal/recommendations Knowledge, status/risks regarding delivering solution on time Strong experience with Data Analysis, Data Profiling, Root Cause Analysis Should able to understand Banking system/processes and data flow Can work independently, lead and mentor the team
Aug-06-20
Data analyst  Columbia, SC
Job title: Data Analyst Location: Columbia ,SC. Duration: Long term Knowledge & Experience: Overall Data Management and Governance Frameworks like: DAMA/DMBOK and DCAM/DGI Business Glossary & Taxonomy Data Architecture & Modeling Master Data Management Data Warehousing & Business Intelligence Data Custodianship Experience with Collibra or similar Data Governance Tool Strong Expertise in: Metadata Management: Metadata management from ingestion to consumption supporting all data governance functions Data Classification and Prioritization Data Tagging and custom regular expressions Data Access Governance Import & maintenance of metadata across platforms Reverse engineering of Data models Integrate Data transformation jobs Change control mechanisms (following deployment schedules) Bulk Import of Glossaries Data Lineage: Responsible for Lineage that is consumable & actionable for Business users Generate Lineage for all identified CDEs Discover broken links and create manual overrides Custom scripting for generating Lineage on parameterized jobs Additional tagging for discovering Business Lineage Data Quality: Strong experience with entire data quality lifecycle: Profile, Standardize, Enrich and Remediate Custom profiling on data elements beyond basic dimensions for finding anomalies Ability to define data quality rules based on Governance Policies and/or Business Rules Schedule & monitor batch data quality processes for standardization and enrichment Further logging & tracking of data quality issues to remediation by working with respective stakeholders Dash-boarding of aggregated data quality exception results and respective metrics Qualifications: Bachelor s degree in Information Technology or similar subject area A minimum of (10-15) seven years of relevant experience Technical Toolset: (Strong hands-on expertise with ) Databases: SQL Server, Oracle, Essbase, SSAS Data Modeling: ER Studio (or Erwin) with Deep Operational & Dimensional modeling and mapping experience 
Data Governance & MDM: Collibra or similar. Data Integration: SSIS. BI: Tableau, Hyperion, Cognos, SSRS or similar. Environment: On-prem and Cloud
Aug-06-20
($) : $ANNUAL
Job Title: Cloud Big Data Engineer. Job Location: Reston, VA. Duration: 12 Months Contract. Required Skills: 9+ years of experience. Must have AWS and data transformation/Spark experience. Must have Python experience. Must have SQL experience.
Aug-06-20
Data Stage Developer  White plains, NY
Job title: Data Stage Developer Location: White plains, NY Duration: Long term Experience in Datastage with full project life cycle development process, involving analysis, planning, designing, development, implementation, support and administrative phases. Strong experience in developing jobs using different stages in DataStage like link collector, join, merge, lookup, remove duplicates, XML Stage, filter, dataset, transformer, aggregator Experience creating process using various operational sources like Oracle, SQLServer, Flat Files, Excel Files, into a staging area. High experience creating tables and databases in Oracle and SQL Server Expertise in data warehousing techniques like surrogate keys Excellent skills in problem-solving and trouble-shooting capabilities, Quick Learner, highly motivated, result oriented and good team player. Good interpersonal skills, experience in handling communication and interactions between different teams.
Aug-06-20
Job Details: BI Developer / Data Architect Detroit, MI Long Term Job Title: BI Developer Job Description: BI Developers work with various organizations and departments to support strategic decisions through optimized reporting and data visualization. The developer will be expected to translate business needs into analytics/reporting requirements, generate the desired visualizations and provide them to the organization. The BI Developer will be a clear-thinking problem solver who demonstrates an ability to transform data into practical insights and can communicate those insights with confidence. This role will work with key stakeholders to gather and analyze business performance requirements as well as to develop monitors for key performance indicators. Additionally, the developer will participate in a team environment to support the design, development, and delivery of reporting solutions and provide training to others on how to interpret and use data for business analysis. Duties and Responsibilities: Provide business customers with visualizations of data to enable business value and drive self-serve functionality. Generate and publish reports for general use using approved BI visualization tools. Develop and maintain Cognos, Power BI or Tableau reports. Generate ad hoc analysis and reports. Work with the data teams to set up appropriate data structures/content for use in developing BI reports. Create and maintain user guides/training manuals. Maintain queries and reports after deployment. Identify opportunities for process improvement and automation. Top Skills: 3-5+ years' experience with Power BI, Tableau, and Cognos (client currently uses all three BI tools; Power BI is most important to the manager). 3-5+ years' experience with SQL. Experience pulling from different data sources. Call center or after-sales experience. Required Skills: Advanced communication skills with demonstrated ability to communicate clearly with charts, graphs, text, and oral presentations 
Strong analytical skills with ability to provide internal and/or external data analysis, consultation, and decision support Ability to recognize and solve data problems and inconsistencies Ability to prepare, analyze, and select methods of presenting various data Technically savvy; experience with SQL, Power BI, Tableau, Cognos and/or other reporting/analytics software is required Ability to report metrics on a detail or summary level and manipulate data into meaningful performance measurements Proficiency with Microsoft Office, especially advanced Microsoft Excel Experience in metrics analysis and process improvement
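The SQL and dashboarding skills listed above boil down to feeding aggregated KPIs into a visualization tool. A minimal sketch, assuming a hypothetical call-center table (the table and column names are illustrative only, not the client's schema):

```python
import sqlite3

# Hypothetical call-center data; table and column names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE calls (agent TEXT, handled INTEGER, duration_min REAL)")
conn.executemany(
    "INSERT INTO calls VALUES (?, ?, ?)",
    [("ana", 1, 4.5), ("ana", 1, 6.0), ("ben", 0, 2.0), ("ben", 1, 3.5)],
)

# KPI query: per-agent handle rate and average handle time, the kind of
# summary a Power BI/Tableau/Cognos dashboard would visualize.
rows = conn.execute(
    """
    SELECT agent,
           AVG(handled)      AS handle_rate,
           AVG(duration_min) AS avg_duration
    FROM calls
    GROUP BY agent
    ORDER BY agent
    """
).fetchall()
for agent, rate, dur in rows:
    print(agent, rate, round(dur, 2))
```

In practice the same aggregate query would run against the warehouse and the BI tool would render the result; the sketch only shows the shape of the SQL involved.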
Aug-05-20
DirectClient: Texas Health and Human Services Commission (HHSC) Solicitation#: 52908096R2 Title: Informatica Administrator Location: Cross Park Location, 701 W. 51st Street, Austin, Texas 78751 Duration: Until 8/31/2021 with possible extension

Job Description: The Department of Information Resources (DIR) requires the services of one (1) Informatica Administrator, hereafter referred to as Worker, who meets the general qualification of Systems Analyst 3, Emerging Technologies, and the specifications outlined in this document for Texas Health and Human Services. HHSC IT is currently implementing a Performance Management visualization portal pilot and is in the planning stage of an HHS Performance Management and Data Analytics system, with a goal to accomplish the following: development and publication of project management documents and deliverables in compliance with DIR Framework directives; an in-house assessment of HHS data analytics and reporting needs; creation of statements of work that clearly define the services and deliverables required of a vendor in support of the implementation of the data and analytics solution; obtainment of matching federal funds for this initiative through the development of federally approved IAPD(s); and design, development, and implementation of the HHS performance portal using an agile methodology for all standard SDLC phases, including but not limited to: validation of performance metric requirements; creation of epics/user stories; creation and validation of dashboard and report mock-ups; automation of data acquisition from a variety of data sources; dashboard and report development; testing (integration, load and stress, and user); deployment/publication internally and externally; and operations support and enhancement of the Performance Portal pilot. The Informatica Administrator position will administer Informatica tools including Data Quality and PowerCenter.
Administration includes helping govern best practices, fine-tuning the application and server, and overseeing environment controls. High-level responsibilities may include: monitoring performance and uptime of the Informatica application and domain services; ensuring Informatica services are up to date with upgrades and applicable patches; migration of code between different environments; working closely with developers and educating them on best practices related to building mappings, workflows, and sessions; troubleshooting performance issues and resolving them in a timely manner; creation and maintenance of technical documents and specifications; and all other duties as assigned.

CANDIDATE SKILLS AND QUALIFICATIONS

Minimum Requirements:
8 years (Required): Demonstrated experience with System/Unix/Linux system administration and troubleshooting.
4 years (Required): Experience configuring and maintaining domain and application services related to Informatica PowerCenter, PowerExchange, and Data Quality.
3 years (Required): Experience with MS Office (Word and Excel) and Visio.
2 years (Required): Experience creating batch scripts to automate Informatica administration, schedules, and deployment activities.
2 years (Required): Demonstrated experience in optimizing and troubleshooting performance of ETL mappings, sessions, and workflows.

Preferred:
2 years: Experience with administration of Tableau Server.
2 years: Experience with administration of Informatica Intelligent Cloud Services.
1 year: Prior experience in the Healthcare Industry.
Aug-05-20
Role: Informatica MDM Architect Location: Florida, MA Experience: 8+ Years Duration: Long term BR/H on C2C: $60 Hiring Mode: Contractual through IMPLEMENTATION PARTNER End Client: Confidential Skills: ETL, MDM Education: Any Bachelor's degree or equivalent academic qualification Mode of interview: Video

Job Description: Informatica tools (ETL, MDM); Oracle (SQL/PL SQL), Unix; MDM match & merge properties and match rule configuration; knowledge of MDM server log files to investigate issues; INFA MDM data model configuration and infrastructure tables; MDM repositories and associated tables in the MDM Hub; MDM data load and job execution; cleansing and standardization concepts, cleanse functions and cleanse lists; MDM mapping, delta, and delete detection concepts; IDD setup/user exit code and SIF API framework understanding. Collaborate extensively with Business and Architect groups. Take complete ownership of delivering technical deliverables, including the codebase, and actively participate in the rest of the deliverables. Coordinate with various teams for all MDM-related requirements, design, development, and testing. Apply knowledge of technologies and application methodologies. Mail: neha (AT) sierrasoln (DOT) com with updated resume, expected salary, and current visa status & validity
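Match & merge, mentioned above, is core Informatica MDM Hub functionality configured through match rules rather than hand-written code. Purely to illustrate the underlying idea (group records on a normalized match key, then survive the most complete record), a hypothetical sketch with made-up data:

```python
# Illustrative match & merge: group source records by a normalized match key,
# then keep (survive) the record with the fewest missing fields per group.
# This sketches the concept only; real Informatica MDM match rules are
# configured in the Hub, not coded like this.
records = [
    {"name": "ACME Corp",   "phone": "555-0100", "city": "Boston"},
    {"name": "Acme Corp.",  "phone": "",         "city": "Boston"},
    {"name": "Widgets Inc", "phone": "555-0199", "city": ""},
]

def match_key(rec):
    # Normalize: lowercase and strip punctuation/whitespace from the name.
    return "".join(ch for ch in rec["name"].lower() if ch.isalnum())

groups = {}
for rec in records:
    groups.setdefault(match_key(rec), []).append(rec)

# Survivorship: keep the most complete record in each match group.
golden = [
    max(grp, key=lambda r: sum(bool(v) for v in r.values()))
    for grp in groups.values()
]
print(len(golden))
```

Here the two "Acme" variants collapse to one golden record; production match rules add fuzzy matching, trust scores, and cell-level survivorship beyond this sketch.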
Aug-05-20
($) : DOE
Remote until COVID restrictions come down. Role: Enterprise Data Quality Analyst Location: Chicago, IL Duration: 12 months Healthcare experience is a MUST. The Enterprise Data Quality Analyst - Customer Data Domain requires a candidate with a strong data management background who understands data, data ingestion, proper use/consumption, data quality, and stewardship. In this role you will perform data quality processes, measurements, and analyses to assess patterns, identify root causes, define data quality rules, champion automated measurement, and partner with stakeholders to identify data improvement opportunities.

Responsibilities: Implement data quality rules, automated measurement/monitoring, an issues management framework, operational dashboards, and predictive models for proactive data quality assessment and identification of improvement opportunities. Work with source systems and downstream data consumers to apply advanced data quality techniques through automation to remediate issues, and implement processes to monitor data quality risks.

Qualifications: 5+ years' experience performing data quality analysis, root cause investigations, and remediation (not application testing). 5+ years' experience working with data tools such as SQL, R, Python, SAS, Looker, Tableau, Hue, and TOAD. Experience and working knowledge of data in a healthcare data management environment. Experience in migrating manual data quality processes to an automated, sustainable framework.
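The "data quality rules" and "automated measurement" responsibilities above can be sketched in a few lines; the field names and rules here are hypothetical placeholders, not the client's actual rule set:

```python
import re

# Hypothetical customer records; field names are illustrative only.
records = [
    {"member_id": "M001", "dob": "1980-04-12", "email": "a@example.com"},
    {"member_id": "",     "dob": "1975-13-01", "email": "not-an-email"},
]

# Data quality rules: each returns True when the record passes the rule.
rules = {
    "member_id_present": lambda r: bool(r["member_id"]),
    "dob_is_iso_date":   lambda r: bool(
        re.fullmatch(r"\d{4}-(0\d|1[0-2])-\d{2}", r["dob"])
    ),
    "email_has_at_sign": lambda r: "@" in r["email"],
}

# Automated measurement: pass rate per rule -- the metric an operational
# data quality dashboard would track over time.
pass_rates = {
    name: sum(rule(r) for r in records) / len(records)
    for name, rule in rules.items()
}
print(pass_rates)
```

A sustainable framework would run such rules on a schedule against the source tables and feed the pass rates into dashboards and issue workflows, rather than checking records in memory.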
Aug-05-20
($) : DOE
W2 ONLY Role: Hadoop / Informatica Technical Lead Location: Detroit, MI Duration: 12 months+ Must have hands-on Hadoop lead experience. Informatica 9.x and above (MUST) Informatica PowerCenter (MUST) Informatica Data Quality (STRONGLY PREFERRED) Big Data Hadoop ecosystem, preferably Cloudera distribution (MUST) Hadoop HDFS / Pig / Spark / Oozie NoSQL databases - Hive/Impala/MongoDB/Cassandra Oracle 10g and above (MUST) Unix shell scripting - AIX or Linux (MUST) Experience with any of these scheduling tools: Tivoli, Autosys, Ctrl-M
Aug-05-20
Role: Data Analyst II Location: Detroit, MI RESPONSIBILITIES: Engagement Description: The IT Data Analyst is responsible for creating dashboards for business clients that support and improve IT processes, ensuring processes are repeatable and measurable. They also ensure that IT processes and procedures are in alignment with the security framework and meet business requirements. The Data Analyst works with cross-functional IT teams to design, develop, and maintain IT processes and procedures utilizing best practices and industry-standard frameworks. The overarching goals of the IT Data Analyst's work are to manage monthly budget and actual variance drivers; coordinate and conduct audit reviews/findings and prepare reports; and work closely on the development of Service Level Agreements to improve the IT organization's performance. Individuals in this role require the ability to foster a work environment in which individuals collaborate in pursuit of a common mission and mutual goals. To be successful, this individual must have the ability to develop well-defined processes and procedures with clearly documented accountability for each activity. QUALIFICATIONS: Top 3 Required Skills/Experience: Minimum of 3 years of experience in financial assessment and planning. Minimum of 3 years of experience collecting data, developing reports, and creating dashboards for management. Strong Excel, PowerPoint, Tableau, Visio, and SharePoint skills along with excellent note-taking, documentation, and issue/action/decision tracking experience. Required Skills/Experience - the rest of the required skills/experience.
Include: At least 3 years of experience in financial assessment and planning. Experience with identifying and understanding budget variance drivers. Working knowledge of managing a cost center's BPR monthly and year-to-date performance processes and procedures. Working knowledge of cost transparency models or supporting existing applications such as iServer or other tools. Experience with audit reviews and assessments. Preferred Skills/Experience - optional but preferred skills/experience. Include: Strong analytical, technical, interpersonal, and communication skills. Ability to work independently or within a team environment. Ability to support month-end and year-end close processes such as identifying accruals and reclasses. Other related skills and/or abilities may be required to perform this job. Education/Certifications - Include: Bachelor's degree in a related field preferred. Finance or Accounting experience required.
Aug-04-20
Data Engineer  Milwaukee, WI
Logisoft Technologies, Inc is a global Information Technology, Consulting, and Services company with a 100% satisfaction rating from clients across the USA. Logisoft was established with a commitment to fulfill the Information Technology resource management needs of organizations. In the current industry trend, a growing number of organizations focus on their core business processes and outsource their IT business processes. We represent Logisoft Technologies Inc. with pride, presenting ourselves as a premier Technology, Consulting, Product Development, and Software Services company. Our head office is located in South Plainfield, NJ, with offices in Hyderabad, India; Accra, Ghana; and South Africa. We are an official Microsoft partner - a Microsoft Certified Partner that helps customers with a range of IT projects and specific IT solutions. Job ID: LS_DE_DR_WI_04 Job Title: Data Engineer Location: Milwaukee, WI Duration: Full Time Required Skills: Experience with databases (DB2, SQL, UDB). Experience with ETL (Informatica preferred). Experience with Linux scripting and Autosys jobs. Experience with CI/CD pipelines. Experience working in a Scrum/Agile environment. Experience working in large environments with an understanding of complexity, business rules, and transformations. LOGISOFT Technologies is committed to a policy of Equal Employment Opportunity and will not discriminate against an applicant or employee on the basis of age, sex, sexual orientation, race, color, creed, national origin, ancestry, disability, marital status, or any other legally protected basis under federal, state, or local law.
Aug-04-20
Data Engineer Location: Menlo Park, CA Long term Required skills: 10+ years of experience. Experience in advanced SQL, Python, and Tableau or any BI tool. Should be well versed in creating data pipelines using Python. Should be very strong in writing advanced SQL queries. Please read through the email trail: source only experts from top companies (the client is looking only for Google, Facebook, Amazon, LinkedIn, Cisco, Walmart, Adobe, Microsoft, Yahoo, or Apple as a current or recent customer). Rates are open. Junior profiles should be from a Computer Science background (FTEs only).
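The "data pipelines using Python" requirement above can be sketched as a minimal extract-transform-load flow; the stage names and data here are hypothetical:

```python
# Minimal data-pipeline sketch: compose extract/transform/load stages as
# plain functions. Stage names and data are illustrative only.
def extract():
    # Stand-in for reading from a source system (API, file, database).
    return [{"user": "u1", "spend": "19.99"}, {"user": "u2", "spend": "5.00"}]

def transform(rows):
    # Cast string amounts to integer cents, a typical normalization step
    # (round before int() to avoid float truncation errors).
    return [{"user": r["user"], "spend_cents": int(round(float(r["spend"]) * 100))}
            for r in rows]

def load(rows, sink):
    # Stand-in for writing to a warehouse table.
    sink.extend(rows)

warehouse = []
load(transform(extract()), warehouse)
print(warehouse)
```

Production pipelines wrap the same extract/transform/load shape in an orchestrator (e.g., an Airflow-style scheduler) with retries and monitoring; the function composition is the core idea.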
Aug-04-20
($) : Market
Job Role: Policy/Framework Data Steward Location: Chicago, IL Duration: 6 Months+ Rate: Market Open Client: Healthcare Company Role Summary: Define, develop, and finalize Data Governance policies, charters, and framework documents in alignment with the enterprise data governance strategy and goals. Lead cross-enterprise working groups to solicit input and feedback to develop and revise policy and framework content. Develop subject matter expertise in related policies (Privacy, CIP, IT, Architecture, etc.) across the enterprise. Partner with Domain and Business unit leads to facilitate policy and framework implementation and monitoring. Develop, implement, and manage a Policy exception process. Identify areas where new policies and procedures need to be developed or refined based on current and future enterprise or data management initiatives, drive the written knowledge expansion development process, and ensure appropriate documentation. Skills: Sound understanding of data governance and management practices. Demonstrated experience with the creation of foundational data governance documents: policies, frameworks, operating models, and playbooks. Excellent command of grammar, usage, style, and tone in written and verbal communication. Proficient in leveraging visual depictions (using Microsoft PowerPoint, Visio, or other products) to convey complex concepts. First-rate organizational skills and meticulous attention to detail. Ability to multitask and work under tight deadlines. Ability to foster collaboration, value others' perspectives, and gain support and buy-in for organizational proposals. Organizational agility; able to effectively manage through the various internal functions and organizations. Qualifications: Bachelor's degree required at a minimum; Master's degree a plus.
3-5 years of experience in Data Governance. 2+ years' experience with policy writing. Experience with Jira Agile, SAFe, Kanban, and Waterfall work management methodologies. Experience with Collibra or other metadata management systems is a plus.
Aug-04-20
Role: EDC Programming SME - JReview Location: Cambridge, MA / Remote Bill Rate: $75/hr Job Description: Functional knowledge required. Strong SQL is a must. Deliverables: Clinical reporting background is a must. PL/SQL expertise is a must. Knowledge of any reporting tool is a must. JReview skills needed. Lead-level experience and good coding skills. Strong knowledge of databases, such as Oracle. Knowledge of clinical reporting tools (e.g., Cognos, JReview, Power BI, SQL, PL/SQL, SAS, etc.). Knowledge of visualization tools, such as TIBCO Spotfire. Knowledge of industry-specific data standards, e.g., CDASH, CDISC ODM/SDTM. Ability to work with diverse user groups and lead discussions on and creation of User Requirements and Design documentation. Support creation and maintenance of standard report libraries. Liaise with other technical personnel in related functions, as needed.
Aug-04-20
Data Engineer  Saint Louis, MO
($) : Market
RESPONSIBILITIES: Improve time to market by enhancing current ETL processes and data preparation workflows. Design and develop solutions to support ingestion, provisioning, and visualization of complex enterprise data to achieve analytics, reporting, and data science goals. Analyze data and system issues to design and implement effective, extensible solutions. Support end-to-end development of departmental self-service applications, including designing, coding, testing, debugging, and release. Provide operational support of departmental self-service applications, including change management, incident response, and documentation upkeep. Identify and recommend appropriate continuous improvement opportunities. Integrate into an Agile (Scrum-ban) team within a SAFe framework. Actively participate in collaborative design and review sessions, pushing innovation while observing best practices. SKILLS & EXPERIENCE: Excellent written and verbal communication skills with the ability to capture and articulate technical and non-technical details, as well as elaborate process flows. A strong, team-oriented spirit, a mindset focused on learning and achieving objectives, and the discipline to work rigorously while unmonitored. An appetite for working in a fast-paced, quick-changing environment and spending time on multiple unrelated tasks/projects. TECHNICAL EXPERIENCE: 3+ years of data engineering. 2+ years on a reporting/BI team at an enterprise company. 1+ years of data or business systems analysis. Fluency in multiple relevant languages such as SQL and Python. Expertise using data manipulation tools such as Alteryx, Tableau Prep, Stitch, AWS Glue, and SSIS. Experience working with structured and unstructured data. Expertise using data visualization tools such as Tableau, Informatica, Spotfire, and Power BI is a plus. Python application development experience is a plus. Business user expertise with other telemetry tools is a plus.
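The data-preparation workflows mentioned above typically begin with steps like deduplication and type normalization. A minimal sketch on hypothetical CSV data (not tied to Alteryx, Glue, or the other tools named):

```python
import csv
import io

# Hypothetical raw extract with a duplicate id and inconsistent casing.
raw = """id,region,amount
1,east,100
2,WEST,250
1,east,100
"""

seen, prepared = set(), []
for row in csv.DictReader(io.StringIO(raw)):
    key = row["id"]
    if key in seen:  # drop rows whose id was already seen (dedup step)
        continue
    seen.add(key)
    # Normalization step: cast numeric fields and standardize casing.
    prepared.append({"id": int(row["id"]),
                     "region": row["region"].lower(),
                     "amount": float(row["amount"])})
print(prepared)
```

Tools like Tableau Prep or Alteryx express these same dedup/cast/standardize steps as visual workflow nodes; the sketch shows the underlying logic each node performs.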
Aug-03-20