
Data Warehousing ETL Jobs in USA - 2376

Position: SAS Administrator Location: Jersey City, NJ Duration: 12+ months We are looking for a SAS Admin for one of our clients located in New Jersey. Please find the job description below and let me know if you are interested and available for projects. This is a long-term onsite contract role in New Jersey. If you are interested, please send me your updated Word-format resume along with your contact information. Please see the job responsibilities below. Role: SAS Admin. Installation, administration, monitoring, maintenance, and tuning of SAS platforms in Linux and AWS Cloud environments. Provide technical support, troubleshooting, and patch updates for SAS solutions. Manage folder structures for SAS datasets. Integrate SAS/ACCESS interfaces to various data sources such as Redshift, S3, and Snowflake. SAS Grid experience and strong SQL ability. Strong Windows understanding, including scripting ability and shared drives. SAS server knowledge. Knowledge of Base SAS, SQL, and the SAS macro language. Experience administering applications on Linux servers. Understanding of SAS products and how they function together in a distributed environment. Experience with PC-SAS and SAS Studio. Self-starter who can work under general direction in a highly collaborative, team-based environment. Strong communication skills. SAS and OS-level security knowledge; able to develop and create security schemes. Able to create libraries (SAS and RDBMS), users, groups, and roles. Able to administer applications. Web application server administration and configuration knowledge. General knowledge of project management. Setting up data access in SAS, including registering libraries and tables in the metadata, updating table metadata, setting up data access, and troubleshooting data access. SAS Management Console. Managing a multi-tier SAS platform and its products: SAS server deployment and administration, SAS configuration files, administering SAS mid-tier servers and troubleshooting potential configuration problems. Sateesh
Apr-07-20
Informatica Developer – Little Rock, AR Responsibilities Analyze legacy data that is stored in RDBMS and Hierarchical data models Proficient in Informatica with a good knowledge of its ecosystems Proficient in Java, Cognos with a good knowledge of its ecosystems Proficient in writing complex stored procedures Hands on experience in performance tuning Informatica and Stored procedures code Commitment to team success and positive team dynamics Passion for growing and applying technical skills in service to customers Experience with Agile Development Participate in the research, proof of concept, and implementation of cutting-edge ETL Technologies. Required Skills: Informatica, DB2, Java JEE, SQL, JavaScript Preferred Skills: Experience with working and configuring on Informatica tool Experience working on RDBMS such as Oracle/DB2 required. Experience with working on Cognos tool Knowledge of Integrated Eligibility domain/Experience working on IE projects preferred. Expected Deliverable(s): Implement modular, reusable components for application with high quality
Apr-07-20
Job Responsibilities: Implement data quality rules, automated measurement/monitoring, an issues management framework, operational dashboards, and predictive models for proactive data quality assessment and identification of improvement opportunities. Work with source systems and downstream data consumers to apply advanced data quality techniques through automation to remediate issues, and implement processes to monitor data quality risks, including: analyzing data quality results, measuring and auditing large volumes of data for quality issues and improvements, performing root cause analysis of data anomalies, triaging data quality issues and creating remediation plans, and evaluating and quantifying the business impact of data quality issues and making recommendations for data improvements, including required process and system changes. Develop data quality policies, procedures, best practices, and related knowledge content. Collaborate with stakeholders to ensure data quality best practices are implemented across enterprise data assets. Lead efforts to communicate the quality of enterprise data assets and the value realized through data improvement efforts. Orchestrate data quality work within the Agile framework and other methodologies where required for on-time delivery of data quality measures and results to stakeholders. Qualifications: 5+ years' experience performing data quality analysis, root cause investigations, and remediation (not application testing). 5+ years' experience working with data tools such as SQL, R, Python, SAS, Looker, Tableau, Hue, and TOAD. Experience and working knowledge of data in a healthcare data management environment. Experience in migrating manual data quality processes to an automated, sustainable framework.
Apr-07-20
Job Qualifications: Bachelor's degree in Computer Science or related education or practical experience. 7+ years of advanced SAS experience (macros, data management, data profiling, and reporting). SAS programming experience required. Understanding of and experience using various SAS methods for merging, joining, and performing lookups. Experience in creating performance reports using SAS, SQL, and Microsoft Office tools. Demonstrated experience producing accurate and detailed work on multiple projects under time pressure. Strong verbal, written, and interpersonal communication skills desired. Experience in a financial or banking environment preferred. Should know basic Python coding.
Apr-07-20
Database Developer/ ETL, Data Analysis, Data Modeling/(627913) Atlanta, GA Duration: 15 Months Web Cam Interview Only Short Description: Under broad supervision, designs, codes, tests, modifies & debugs computer software. Researches & analyzes program or systems problems & develops program documentation. Translates business requirements into development activities secure & maintainable. Complete Description: Under broad supervision, designs, codes, tests, modifies and debugs computer software for the Georgia Dept. of Transportation. Researches and analyzes program or systems problems and develops program documentation. Translates business requirements into development activities in secure and maintainable code. - 100% Remote. Applicants will be expected to be onsite once remote mandate is lifted. - Bachelor''s degree from an accredited college or university with coursework in computer science or management information systems AND Six years of related experience. Skill Proven working experience in the software development life cycle Required 6 Years Proven working experience with ETL, Data Analysis, Data Modeling, Logical and Physical Database Required 5 Years Working experience with business requirement documents, functional design and testing documents Required 5 Years Oracle Workflow and Application Management Engine highly desired Required 5 Years Questions Description Question 1 Absences greater than two weeks MUST be approved by CAI management in advance, and contact information must be provided to CAI so that the resource can be reached during his or her absence. The Client has the right to dismiss the resource if he or she does not return to work by the agreed upon date. Do you accept this requirement? Question 2 Please list candidate''s email address that will be used when submitting E-RTR. Question 3 Remote Work Permitted: Due to COVID-19, the client has agreed to allow the selected candidate to work remotely for the time being. However, the selected candidate must be available to report onsite as directed by the client. Do you accept this requirement?
Apr-07-20
Role: ETL Data Engineer - Architect Location: St. Louis, MO Job: Corp-to-Corp Description: Contact me for the description.
Apr-07-20
Should have good experience in Informatica
Apr-07-20
Position: Informatica ETL Architect Secondary: Azure Location: Chicago, IL Contract Hands-on experience in data management ETL solutions Experience in cloud integration (Azure) for a data integration platform Experience in agile implementation and understanding of the role/responsibility of an ETL architect in an agile program Ability to work with key stakeholders (such as Business SMEs, BAs, enterprise solution architects, data modelers, and BI solution architects) to assess business requirements and understand the accuracy and completeness of the requirements Ability to assess the data-flow use cases covering end-to-end business requirements Ability to assess the business objective of the program and understand how the data integration platform (on-premise / cloud) fits within the overall program landscape Understand customer expectations on the Azure cloud integration platform and deep dive on the technical/infrastructural feasibility and work-arounds Estimate the ETL chunk of work and align with the program timeline Ability to determine the areas of complexity of historical data migration in terms of data volume, nature and variety of source systems, data quality, etc. Ability to propose a best-fit ETL solution keeping in mind the on-premise / cloud / hybrid nature of the overall solution platform Ability to work closely with the enterprise data modeler in deciding the best-fit data model and data structure for the data-integration layer Design the ETL solution in Informatica adhering to best practices Assist the ETL Development team in deciding the cost-effective approach to data processing (load balance between the DB server and the Informatica server while processing data from source to target) Guide the team on crucial deployment/implementation planning in higher environments (such as UAT / Production)
Apr-07-20
($) : Market
Job Title: Data Scientist | IBM Watson Location: Hartford, CT Duration: 6+ Months Need at least 10+ years of IT experience Must Have Skills: IBM Watson, NLP Services, ML, Text Analytics, Deep Learning using TensorFlow Detailed Job Description: 10 years of experience in machine learning and data analysis. Highly skilled and experienced in NLP Text Analytics on the IBM Watson Platform. Develop, construct, test and maintain architectures and align architecture with business requirements. Will be responsible for proposing, training, validating, and shipping AI/ML models from concept to production. Work with data analysts, data engineers, data scientists and product SMEs. Requirements gathering and assessment. Ability to improve data r Responsibilities: Design and develop NLP-based AI models. Gather requirements and clarify queries. Demonstrate good business understanding. Thanks and Regards, Harish Reddy
Apr-07-20
Should have strong Informatica experience
Apr-07-20
Python Engineer  San Francisco, CA
($) : Market
Python Engineer San Francisco, CA 1. Experience coding in Python 2. Background in data science / familiarity with data science principles 3. Desired: Familiarity with grid topology and power system fundamentals 4. Desired: Available for on-site work/cross-training in San Francisco through August 2020 5. Preferable: Experience in ESRI technology, especially ArcGIS Desktop and ArcGIS Server 6. Preferable: Electric domain knowledge 7. Excellent communication skills and good problem-solving skills 8. Must be a strong team player with the ability to communicate and collaborate effectively in a geographically dispersed working environment. Experience building and maintaining data pipelines and/or analytical or reporting systems at scale. Knowledge of Data Science, Machine Learning and Statistical Models is desirable. Experience in Python, Scala, or R for large-scale data analysis. Experience with Relational Database Systems and Graph Databases. Experience with Cloud computing (AWS). Experience with Spark. Experience in developing high-volume transaction processing solutions.
Apr-07-20
Candidate should have data science and data modeling experience.
Apr-07-20
SAS Data Engineer  California City, CA
($) : Market
Location: Mountain View, CA Need local consultants C2C Project The SBSEG Marketing Data Operations team uses SAS to manage a variety of data processes to support the business. These processes include SAS extracts and macros to streamline data extractions from various data warehouses so the team can easily obtain data for marketing use. There is an initiative to move away from SAS and use alternate industry-standard methods to extract and house the data. We are looking for a technical expert to help understand these processes and rebuild/rewrite the SAS programs using SQL (AWS EMR Hive). Responsibilities: Partner with members of the team to develop a deep understanding of what data is used and how. Collaborate with others in building innovative data capture and real-time customer data analytic capabilities needed by the business (e.g. requirements identification, design specification, prototype testing). Work side-by-side with a cross-functional team including other members of the SAS community, SBSEG Analytics, and partner DBAs to develop the most optimal data experience. Rewrite/rebuild processes existing in SAS using SQL (AWS EMR Hive). Solid communication skills: demonstrated ability to explain complex technical issues to both technical and non-technical audiences. Educates and provides guidance to business stakeholders on how best to harness available data in support of business needs, makes recommendations, and provides alternatives to meet business needs. Creates complex software programs and applications for management of massive quantities of data (big data) using high-level programming languages. Develop and test automated data extractions and data feeds to our e-mail vendors in support of trigger-based e-mail marketing campaigns. Develop and test ad hoc data extraction queries in support of one-time e-mail marketing campaigns and market research requests. Automate repeat data requests to create sustainability and efficiencies. Extract and aggregate data for business analysts to enable measurement of key performance indicators. Ensure all automated data routines run as scheduled with expected results and troubleshoot as necessary when outages occur. Partner with cross-functional data teams to identify and implement new data capture and aggregation mechanisms to enable more efficient data processing. Partner with the immediate team to improve process, documentation, and data ingestion tools.
Apr-07-20
ETL Data Engineer  California City, CA
($) : Market
ETL Data Engineer, Location: Mountain View, CA Contract: C2C with Implementer Duration: 12+ months Must Have: Informatica ETL, Oracle SQL, Linux. Must be able to bring up new architecture/design for ETL processing. Minimum 8+ years' experience in Data Warehousing. Python or Scala with good ETL/ELT skills. Good exposure to and knowledge of using Hive and MPP databases (like Redshift, Vertica, etc.). Candidate must be able to use Python/Scala with Spark and AWS for writing ETLs. Good exposure to writing ETL using Spark. Good SQL query writing skills. AWS services like Redis, EMR, EC2, Glue, S3, CloudWatch, etc.
Apr-07-20
($) : 65,000 / year
Job Duties: Provide reporting and data analysis support for various business partners. Support data visualization and dashboard efforts across subject areas within the Insurance domain (Submission, Policy/Premium, or Claim-level analysis). Ability to learn complex data sources and merge data across disparate systems to meet business requirements. Ability to work directly with business users to understand requirements and provide recommendations on how best to display data in a report or visual dashboard. Must have the ability to bridge the communication gap between business users and technical support resources. Create standard operating procedures for new requirements and manage the transition to offshore support teams or junior staff. Process change requests on steady-state reports with attention to reconciliation expectations. Automate existing manual reports using database and reporting tool functionality. Execute standard testing and reconciliation for new or revised standard reports. Share knowledge with other junior team members. Skills: SQL expertise required. Requires advanced understanding of data modeling, table relationships, and query optimization. Ability to own report development effort with minimal supervision. Problem solving; data-related trouble-shooting such as identifying reconciliation variances or determining the cause of data anomalies. Ability to interview users and understand requirements. Expert knowledge of Business Objects, PowerBI, Tableau, or a similar reporting tool required. Skilled in merging data from multiple disparate sources. Data warehousing experience important. Expert in MS Excel and MS Access; VBA knowledge a plus. Be able to prioritize based on due dates and task demands. Experience: 3+ years insurance data experience. 3+ years SQL. 3+ years' experience with data warehousing and/or relational databases. Finance or Planning data experience is a plus.
Apr-07-20
J.O#: 60305, Data Analyst, Ft. Worth, TX (locals and OPT, H4 EAD, GC Holders) Greetings, Business Partners, We have multiple Data Analyst openings with our clients. Please submit your consultants. Please do not share profiles of candidates who feel uncomfortable sharing legal name, DOB, copies of ID proof, work authorization, and passport details (must for H1B, EAD, etc.). Job Title: Data Analyst Master Job Title: Administrative Location: Ft. Worth, TX Client: BNSF Railway Required Skills: Tableau and Power BI are the two priorities. Azure is a plus. Job Description: Role: Data Analyst (Rapid Insightful Analytics) Team: Intelligent Automation RPA, NLP, AI and Advanced Analytics Job Summary: Provide specialization in advanced analytics to extract insight from information by iterating rapidly, to summarize and visualize large data sets. This position will be working closely with business SMEs, operations research and will be learning / leveraging AI tools & techniques. Responsibilities & Tasks: The Data Analyst will assist in our efforts to create analytical solutions for initiatives. Candidate will be responsible for data exploration, discovery, and presentation of gathered insight from data. This role will be responsible for visualizing and communicating insight extracted from data to stakeholders at various levels across the company. This role will construct, test, maintain and architect supporting datasets. Experienced candidate needs proven ability to work independently on big projects. Candidate must have excellent interpersonal skills and ability to work with several stakeholders across multiple organizations. Must be very organized and detailed in development efforts. Required Experience: Excellent interpersonal skills and ability to work with several stakeholders across multiple organizations. Ability to facilitate conversations with business SMEs to understand the problem, rapidly iterate proposed solutions and clearly present new findings / solutions. Lead-level experience providing technical & functional guidance. Hands-on & expert experience with enterprise data visualization tools like Tableau and / or Power BI. Leveraging Azure AI platform tools like Databricks & Auto ML, etc. Experience with relational database management system development. Solid analysis and problem-solving skills. Able to build analytics solutions including data exploration, extraction, cleaning, transformation, testing and implementation. Open to learning new tools and technologies. Able to adapt to a fast-paced working environment. Preferred Experience: Project leadership experience leading collaborative efforts. Master's Degree in Management Information Systems, Computer Science or equivalent. Ability to write code in Python, Java and R. Vinith Ailam Technical Recruiter Phone Email: WAFTS SOLUTIONS INC. 32969 HAMILTON COURT, SUITE 123, FARMINGTON HILLS, MI 48334. eFax Website: www.waftssolutions.com
Apr-07-20
Hadoop Data Analyst Location – Hartford, CT. 6+ months Job Description: Continuously optimize, enhance, monitor, support and maintain all Talend data integration processes and should be an expert in Talend Big Data jobs. Use APIs or source XML Type columns to dynamically extract, integrate and load data in target schema using Talend Should be responsible for building extraction and mapping rules for loading data from multiple sources for greenfield data warehouse implementation based on Talend, AWS and Snowflake. Should have experience working with multiple file formats especially with Parquet and Avro Should have experience with moving data into S3 folder structures and working with Talend Spark jobs on AWS EMR. Should contribute to logical data model for data warehousing and making data available for downstream consumption Maintain documentation, manage source code and deployments, and implement best-practices
Apr-07-20
Role: Informatica/Hadoop Technical Lead Location: Detroit, MI Interview type: Skype. Informatica/Hadoop Technical Lead The Informatica/Hadoop Technical Lead must have prior hands-on (MUST) experience delivering/leading successful data warehousing projects (Informatica) as well as a broad background and experience with IT application development (Hadoop). This individual is responsible for working with other teams to deliver the artifacts and the code. This person must have strong professional consulting skills and the ability to communicate well at all levels of the organization. REQUIRED EXPERTISE IN TOOLS & TECHNOLOGIES: * Informatica 9.x and above (MUST) * Informatica Power Center (MUST) * Informatica Data Quality (STRONGLY PREFERRED) * Big Data Hadoop Eco-system (preferably Cloudera distribution) (MUST) * Hadoop HDFS / Pig / Spark / Oozie * NoSQL Databases - Hive / Impala / MongoDB / Cassandra * Oracle 10g and above (MUST) * Unix Shell Scripting - AIX or Linux (MUST) * Experience in any of the Scheduling Tools - Tivoli, Autosys, Ctrl-M * Bachelor's Degree in a related field OTHER SKILLS/EXPERIENCE REQUIRED: * More than 3 years of experience as a Senior ETL Developer * More than 5 years' experience as an ETL Developer using Informatica and Oracle 10g/11g to implement data warehousing projects * Working knowledge of Informatica Data Quality is preferred * More than 2 years of experience in leading/designing Informatica/Hadoop projects * Excellent understanding of data warehousing concepts. * Candidate should be able to clearly communicate fundamental concepts during the interview and demonstrate previous experience in all aspects. * MUST HAVE strong SQL skills in Oracle 10g/11g * Experience in Oracle database programming using Partitioning, Materialized Views and OLAP * Experience in tuning Oracle queries/processes and performance management tools * Strong data modeling skills (normalized and multidimensional) * Strong business and communication skills * Knowledge of Health Care Insurance Payer Data Warehousing preferred * Preferred Certifications: * Cloudera (CCP / CCA) RESPONSIBILITIES: * The Senior Developer should be able to perform the following with minimal supervision: * Understand Business Requirements and Conceptual Solution * Convert Business Requirements into Technical Requirements and Design * Create High Level Design and Detailed Technical Design * Create Informatica Mappings, Workflows and Sessions * Create Shell Scripts for automation * Understand the source systems and conduct the data profiling * Co-ordinate Data Modeling of Sources and Targets * Create Source to Target Mapping Specifications from the Business Requirements * Review Unit Testing and Unit Testing Results Documents * Provide Support for QA/UA Testing and Production code migrations * Provide warranty support by assisting/resolving production issues The Senior Developer should have the following leadership skills: * Provide hands-on technical leadership * Lead technical requirements, technical and data architectures for the data warehouse projects * Direct the discovery process * Provide knowledge guidance to Business Analysts and ETL Developers * Provide subject matter expertise and knowledge guidance to a team of analysts and ETL Developers * Lead the design and development of the ETL process, including data quality and testing * Follow the standards and processes defined * Contribute to process and performance improvements * Ensure compliance with metadata standards for the data warehouse
Apr-07-20

Understanding Data Warehouse & ETL

A Data Warehouse is a huge database designed solely for data analysis. It is not used for standard database processing or business transactions. ETL (Extract, Transform, and Load) is the process by which data is extracted from various sources, transformed to remove redundancy and inconsistency, and loaded into a data warehouse or repository, where it is made available for analysis and querying. ETL supports effective decision making and analytics based on the composite data. Slices of data from the data warehouse can be stored in a data mart, which enables quick access to specific data such as a sales summary or finance reports. A minimal sketch of these three steps appears below.
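To make the extract, transform, and load steps concrete, here is a minimal Python sketch that reads rows from a CSV export, cleans and deduplicates them, and appends the result to a warehouse table. The file name, the column names, and the SQLite database standing in for a real warehouse are all hypothetical.

import csv
import sqlite3

# Hypothetical source export and warehouse file, used for illustration only.
SOURCE_FILE = "daily_sales.csv"
WAREHOUSE_DB = "warehouse.db"

def extract(path):
    """Extract: read raw rows from an operational export."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    """Transform: standardize values and drop duplicate order IDs."""
    seen, clean = set(), []
    for row in rows:
        if row["order_id"] in seen:   # remove redundancy before loading
            continue
        seen.add(row["order_id"])
        clean.append((row["order_id"], row["order_date"],
                      row["region"].strip().upper(), float(row["amount"])))
    return clean

def load(rows, db_path=WAREHOUSE_DB):
    """Load: append the cleansed rows to a warehouse fact table."""
    with sqlite3.connect(db_path) as conn:
        conn.execute("CREATE TABLE IF NOT EXISTS fact_sales "
                     "(order_id TEXT PRIMARY KEY, order_date TEXT, region TEXT, amount REAL)")
        conn.executemany("INSERT OR IGNORE INTO fact_sales VALUES (?, ?, ?, ?)", rows)

if __name__ == "__main__":
    load(transform(extract(SOURCE_FILE)))

A production pipeline would usually run inside a dedicated ETL tool or scheduler, but the shape of the work, pulling from a source, cleaning in the middle, and writing to the warehouse, is the same.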

Data Warehouse Features & Capabilities

A data warehouse has features and capabilities that make data analysis easier. A good data warehouse should be able to:
• Interact with other sources and inputs, extracting data using data management tools.
• Extract data from a variety of sources, such as files, Excel spreadsheets, and applications.
• Allow cleansing, so that duplication and inconsistency can be removed.
• Reconcile data to standard naming conventions (a small example follows this list).
• Allow both native and autonomous storage of data for an optimized process.
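As a small illustration of the cleansing and reconciliation abilities listed above, the pandas sketch below maps two hypothetical source extracts onto a single naming convention and drops duplicate customer records; the systems, columns, and values are all made up.

import pandas as pd

# Hypothetical extracts from two source systems with inconsistent column names.
crm = pd.DataFrame({"CustID": [1, 2, 2], "CustName": ["Acme", "Globex", "Globex"]})
erp = pd.DataFrame({"customer_id": [2, 3], "customer_name": ["Globex", "Initech"]})

# Reconcile the CRM extract to the standard naming convention.
crm = crm.rename(columns={"CustID": "customer_id", "CustName": "customer_name"})

# Combine the sources, then cleanse duplicates so each customer appears once.
customers = (
    pd.concat([crm, erp], ignore_index=True)
      .drop_duplicates(subset="customer_id", keep="first")
      .sort_values("customer_id")
      .reset_index(drop=True)
)
print(customers)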

Top ETL Tools to excel in Data Warehousing Jobs

There are many ETL tools available in the market. The most commonly used ETL tools include:
• Sybase
• Oracle Warehouse Builder
• CloverETL
• MarkLogic
There are also excellent data warehousing tools like Teradata, Oracle, Amazon Web Services, Cloudera, and MarkLogic. Expertise in any of these can fetch you a good job in the field of data warehousing.

Salary Snapshot for Data Warehousing Jobs in the US

A senior Data Warehouse developer earns an average of $123,694 a year. Depending on skill and expertise, salaries in this field can range anywhere from $83,000 to $193,000. Most Senior Data Warehouse Developers in the United States receive a salary between $103,500 and $138,000. There are currently plenty of Data Warehouse developer jobs in the USA.

Career Path for a Data Warehouse Professional

Data Warehousing offers immense opportunities for an IT professional. There are a plethora of roles and designations required to manage this vast application and its different modules. Data warehouse managers are software engineers who build storage mechanisms for organizations to meet the needs of the company. Entry-level roles in Data Warehousing are Software Developer, Software Engineer, Business Intelligence (BI) Developer, and Data Warehouse ETL Developer. People who make use of the data in the Data Warehouse to arrive at various decisions are Data Analysts, Data Scientists, and Business Intelligence (BI) Analysts. Senior roles in this field are Data Warehouse Manager, Senior Financial Analyst, Senior Software Engineer / Developer / Programmer, and Senior Business Analyst. Data warehousing jobs in the USA are still prevalent, and if you are a specialist in this field, you can make a great career out of it.

Data Warehouse Skills & Tools

To be a Data Warehousing professional, you need an in-depth understanding of database management systems and their functions. Experience in developing databases with any database application is an added advantage. Apart from this, the other technical skills required for a Data Warehousing job are discussed below:
• ETL development tools. You can either develop ETLs quickly by creating mappings or build them from scratch. Some commonly used ETL tools are Informatica, Talend, and Pentaho.
• Structured Query Language (SQL) is the backbone of ETL. You must know SQL, as it is the technology used to build ETLs.
• Parameterization is another crucial skill to master (a simple example follows this list).
• Knowledge of any of the scripting languages used with a database application, such as Python, Perl, or Bash, will come in handy.
• Debugging is another essential technical skill, as nothing ever goes as planned.
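Parameterization in practice means keeping literal values out of the query text and binding them at run time, so one extraction query can be reused for any region or date window. The sqlite3 sketch below is a minimal, hypothetical example; the fact_sales table and its columns are assumptions carried over from the earlier sketch.

import sqlite3

def extract_sales(conn, region, start_date):
    """Run a parameterized extraction; the driver binds the values at run time."""
    query = ("SELECT order_id, region, amount "
             "FROM fact_sales "
             "WHERE region = ? AND order_date >= ?")
    return conn.execute(query, (region, start_date)).fetchall()

# Hypothetical usage: the same query serves any region or reporting period.
with sqlite3.connect("warehouse.db") as conn:
    west_q1 = extract_sales(conn, "WEST", "2020-01-01")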
($) : DOE
Data Engineer with INFORMATICA AND KAFKA Horsham, PA 6+ Months Interview: phone to Skype/WebEx - will eventually require the candidate to work ONSITE. Looking at candidates in the PA, NJ, NY, DE, VA area. Top Skills MUST HAVE: Informatica AND Kafka Job Description: We are currently seeking a replacement for a resigned Data Engineer. The position primarily requires strength in the following: Bachelor's degree or higher required. Big Data technology experience including HDFS, Pig, Hive, Sqoop, Python. Experience with Machine Learning, Artificial Intelligence, and Data Science is a plus. Informatica expertise with an emphasis on working with a diverse set of sources and targets, implementing auditing, error trapping/tracking, reusability and restartability, and the ability to troubleshoot and performance tune Informatica mappings, sessions and workflows. Informatica Power Exchange CDC experience on Oracle, DB2, and Mainframe preferred. Exadata, Teradata, or Netezza appliance database expertise. PL/SQL or T-SQL experience. Deep understanding of Data Warehousing principles with hands-on experience with slowly changing dimensions and fact tables. Detailed work ethic around analysis and coding practices. ETL and database tuning experience. Provide scalable solutions for handling large data volumes (terabytes of data). Develop design specifications, unit test plans, and troubleshoot client issues. Experience working in an Agile methodology environment. Accountability in deliverables with the ability to work independently. Excellent communications and collaboration skills. Ability to work in a fast-paced environment and meet deadlines. Thanks and Regards, Shivangi Singh | Team Lead | KPG99, INC Certified Minority Business Enterprise (MBE) Direct| | www.kpgtech.com
Apr-07-20
Position: Informatica MDM Developer Location: Boston, MA Mandatory Required Skills: Master Data Management (MDM), PL-SQL, Unix, Shell Scripting Job Description: 6 to 8 years of experience in Informatica. Candidate must be from development background. Strong in Oracle Database Concepts, SQL. PL-SQL. Unix, Shell Scripting Master Data Management process knowledge. Extensive Experience in Informatica MDM (Master Data Management) i.e., Configuring stage and load process Configuring merge process and data access Configure the match process Configure data access views Various Data Management Tools, User exits and log files Hierarchy management and security access manager Conversant in Agile methodology of project execution is preferred. Customer facing experience w.r.t technical discussions. Team Handling/Generic Skills: Should be able to lead at least 3 to 4 members team in all phases of Development life cycle. Self-driven, ready to learn and adopt depending on customer/organization needs. Excellent communication skills. Good working, understanding of investment/custodian banking exposure for financial institutions preferable. Should be able to work in Agile model and some exposure to Agile will be preferable.
Apr-07-20
Role: Lead Data Modeling with Snowflake Location: San Francisco, CA (locals only) Duration: Long-term contract Job description: Candidates with 15 to 20 years of experience are not a fit; we need 10 to 12 years of experience. A Data Architect who has done Data Modeling, with experience on Snowflake Cloud Data Warehouse software. Needs to have experience in data migration from on-premise to cloud. Needs to be more hands-on with data rather than just an architect: 60% hands-on, 40% Business Analysis / Architect / Lead.
Apr-07-20
DATA ANALYST  Fort Worth, TX
($) : Negotiable
Role: Data Analyst Location: Fort Worth, TX Intelligent Automation – RPA, NLP, AI and Advanced Analytics Job Summary: Provide CLIENT specialization in advanced analytics to extract insight from information by iterating rapidly, to summarize, visualize large data sets. This position will be working closely with business SMEs, operations research and will be learning / leveraging AI tools & techniques. Responsibilities & Tasks: The Data Analyst will assist in our efforts to create analytical solutions for CLIENT initiatives. Candidate will be responsible for data exploration, discovery, and presentation of gathered insight from data. This role will be responsible for visualizing and communicating insight extracted from data to stakeholders at various levels across the company. This role will construct, test, maintain and architect supporting datasets. Experienced candidate needs proven ability to work independently on big projects. Candidate must have excellent interpersonal skills and ability to work with several stakeholders across multiple organizations. Must be very organized and detailed in development efforts. Required Experience: • Excellent interpersonal skills and ability to work with several stakeholders across multiple organizations • Ability to facilitate conversations with business SMEs to understand the problem, rapidly iterate proposed solutions and clearly present new findings / solutions • Lead level experience providing technical & functional guidance • Hands on & expert experience with enterprise data visualization tools like Tableau and / or Power BI • Leveraging Azure AI platform tools like Databricks & Auto ML, etc. • Experience with relational database management system development • Solid analysis and problem-solving skills • Able to build analytics solution including data exploration, extraction, cleaning, transformation, testing and implementation • Open to learn new tools and technologies • Able to adapt to fast-paced working environment Preferred Experience: • Project leadership experience leading collaborative efforts • Master's Degree in Management Information Systems, Computer Science or equivalent • Ability to write code in python, java and R
Apr-07-20
Data Architect We have an immediate opportunity with a large F500 client in the Alpharetta, GA area. We are looking for a Data Architect at Alpharetta, GA with one of our major clients. Please go over the details and let me know. Data Architect Alpharetta, GA Big Data Architect (Big Data and Cloud (GCP) Management Practice) Description: The client is looking for a Big Data and Cloud Architect to build end-to-end business solutions and to work with one of the leading financial services organizations in the US. This job offers a unique opportunity to work in a high-growth company, with multiple recent acquisitions, a rapidly maturing Big Data and Cloud data practice, and a need to shape the future of Data Architecture across the enterprise to support the organization's goal of becoming the industry standard for the Financial Services industry. The Big Data Architect (Domain: Finance, Marketing, Sales, Consumers, Products, etc.) is accountable for architecting and designing comprehensive solutions that meet business and functional requirements in support of a given initiative. The Architect plays a role in establishing architectural vision and direction, architects solutions, provides advice and guidance, monitors emerging technologies, and assists in software and service. The Architect must be process oriented, results driven and focused on delivering high quality solutions. The ideal candidate has demonstrated experience in functional design and implementing (hands-on experience) the full life-cycle of both Enterprise Data Warehouse/Lake (EDW) on Cloud platform. In addition, the Data Architect will passionately drive a transformation initiative which will guide and influence key business decisions. This role will partner with Finance, Sales, Marketing, and Product management teams as well as other enterprise customers to understand business needs, drive requirements, propose solutions and deliver multiple cross-functional projects and integrations between different systems. This is a unique opportunity to build and institutionalize a best-in-class data ecosystem. The Data Architect will be accountable for partnering with key roles (e.g. project managers, architects, business analysts, etc.) to develop solution blueprints that are aligned to the organization's architecture standards and principles, leverage common solutions and services, and meet the financial targets (cost and benefits). To be successful in this role one must be able to work effectively in a fluid, fast-paced environment. This role requires strong communication skills with delivery and engineering team members, operations support staff and business customers. In addition, the successful Data Architect must be able to work with minimal supervision on multiple concurrent projects. Responsibilities: Work closely with management and key business stakeholders to determine business needs. Develop and maintain current and target state data architectures, define strategy and roadmaps to drive the discovery, design and implementation of our EDW/EDH. Serve as the functional expert responsible for the data architecture, design and implementation of data solutions, with complete and accurate information/data delivery using maintainable, systematic, and automated procedures. Make recommendations about data collection methods, data management, data definitions, and evaluation methods in collaboration with internal stakeholders.
Establish and maintain data marts where appropriate. Partner to define and enforce data architecture standards, procedures, metrics, and policies to ensure consistency across the portfolio. Help establish and define validation, data cleansing, integration, and data transformation practices. Also lead or co-lead data governance and quality standards. Facilitate reporting and analytics design review sessions. Stay current with contemporary technology trends/concepts and serve as an SME for the business teams. Manage issues and bugs within the system using tracking/support systems; liaise with internal and external resources to facilitate resolution and closure. Define reference and data solution architectures that support cloud initiatives, big data and data lake use cases, and traditional data platforms. Present architecture deliverables to stakeholders at all levels of the organization. Experience: Successfully architected complex, large Big Data solutions in the Cloud. Experience implementing Big Data on the GCP platform. Highly skilled in applying data governance processes and best practices. Working experience designing, developing, and delivering data-related functional and technology solutions. Working experience in delivering solutions. Working experience architecting complex, multi-system solutions. Working experience developing architecture principles and standards. Qualifications: 7+ years' experience as a practicing data architect/engineer with 3+ years' experience in a Data Architect function. Significant experience in data architecture services, designing, developing, and delivering technology solutions. Working experience in delivering solutions. Thorough understanding of Cloud and Big Data platforms, specifically GCP. Working knowledge of Oracle BRM, MDM, OPH and adjacent technology including SOA and Micro Services. Working experience in cloud technologies, specifically AWS and Google Cloud, and integration experience with Oracle ERP, BRM, RMB, SFDC and other home-grown applications and platforms. Bachelor's Degree in Computer Science, MIS, Business, or equivalent experience. For immediate consideration please contact: Anu UpStream Global Services. Reply to: www.upstreamgs.com
Apr-07-20
($) : DOE
TECHNOGEN, Inc. is a Proven Leader in providing full IT Services, Software Development and Solutions for 15 years. TECHNOGEN is a Small & Woman Owned Minority Business with GSA Advantage Certification. We have offices in VA, MD & offshore development centers in India. We have successfully executed 100+ projects for clients ranging from small businesses and non-profits to Fortune 50 companies and federal, state and local agencies.
Position: Data Analyst
Location: Rockville, Maryland (initially will be remote due to corona virus; after everything is normal you have to go onsite)
Duration: Long Term
Job Description:
• Expert in data analysis
• Data modelling / Data Architecting
• Understanding data
• How to map, move the data, how to architect data
• Heavy SQL and ability to write complex queries from scratch
• Experienced in Agile
• Prior experience with AHRQ/NIH etc. is preferred
Best Regards, Kevin Sr. Talent Acquisition Specialist Phone Email: kevin@technogeninc.com Web: www.technogeninc.com 4229 Lafayette Center Dr, Suite 1880, Chantilly, VA 20151
Apr-07-20
Job title: Big Data Architect
Location: Stamford, CT
Duration: Long term
Strong Big Data Architect with knowledge of PySpark, Hive, Pig, Spark, Administration of AWS EMR (preferred), Airflow, Lambda, Oozie
Experience in building scalable big data ingestion frameworks
Define and build scalable and futuristic architecture for data platforms
Good hands-on experience in PySpark
Work closely with the customer on data exploration & provide technology guidance (Technical) on enabling a Data foundation for analytics
Build multiple PoCs as part of the data framework build, e.g. help users query unstructured data for formulating the requirements
Exposure to other Big Data technologies is preferred as this is a greenfield implementation: Hive, Pig, PySpark, Administration of AWS EMR (preferred)
Exposure to other Big Data technologies is preferred as there is a lot of scope for experimentation & adoption of new technologies
Apr-07-20
Data Analyst  Santa Clara, CA
($) : 65000 / year
Data Analyst Responsibilities: · Managing master data, including creation, updates, and deletion. · Provide quality assurance of imported data, working with the quality assurance analyst if necessary. · Commissioning and decommissioning of data sets. · Processing confidential data and information according to guidelines. · Helping develop reports and analysis. · Supporting initiatives for data integrity. · Evaluating changes and updates to source production systems. · Providing technical expertise on data storage structures, data mining, and data cleansing. · Understand business requirements from operations users and translate them into work items. Key Skills: · Experience as a Data Analyst in Insurance is mandatory, preferably Life Insurance. · Hadoop experience with Hive SQL mandatory. · High-level experience in methodologies and processes for managing large-scale databases. · Experience with Redshift, Business Objects, and Tableau would be an added advantage. · Experience with Python is preferable. · Demonstrated experience in handling large data sets and relational databases. · Understanding of addressing and metadata standards. · High-level written and verbal communication skills.
Apr-07-20
Description: Experienced Ab Initio Developer who has worked in end to end SDLC life cycle. Develop and foster a positive relationship with team members, team leads and business partners Develops and updates documentation, departmental technical procedures and user guides Responsibilities: 6-8 years of work experience in Ab Initio Good analytical and logical skills in writing SQLs, Stored Procedures and creating Marts/Views in Oracle Should have worked in an Agile delivery environment Capable of understanding of business requirement/mappings and converting them into design
Apr-07-20
Data Analyst  Dallas, TX
($) : 65000 / year
Responsibilities of the Data Analyst: Analysis of structured and unstructured data for data quality issues Perform data steward activities such as cleansing data which has errored off to keep enterprise master data accurate Presentation development for senior management supporting the Data Team Gather and synthesize functional and non-functional business requirements ensuring alignment to data strategy Analysis of current communication processes and identify opportunities for enhancements Centralize and streamline activities to promote agility and process improvements Create process and data flow diagrams for data movement capture Work with assigned Data Architects or development team members related to data questions Work as a liaison between the business and project teams related to data questions or concerns Collaborate/communicate with project team and business users as required Support functional testing and performance testing Work with technical delivery lead on project activities Ensure assigned work is implemented within project schedules Requirements of the Data Analyst: At least 2-3 years of experience with Data Analysis Proficient in writing SQL queries to analyze databases Excellent communication and coordination techniques Experience working with senior leadership Demonstrated ability to work in a high-intensity, multi-project environment Excellent interpersonal and communication skills (technical and non-technical) Extremely detail-oriented individual with the ability to multi-task Ability to work autonomously towards a goal Strong verbal and written communications skills Strong analytical, problem-solving, and conceptual skills Excellent interpersonal skills Strong Microsoft Office tools (PowerPoint, Excel, Word, Visio) knowledge. Strong understanding of data-related concepts such as master data management, data warehousing and analytics Education: Bachelor's degree in Business (or Management), Computer Science, or related discipline, or equivalent work experience is required
Apr-07-20
We are currently looking for a SAS Programmer in the Boston, MA area to work with actuarial. Duties: • Ability to transform verbal direction and business requirements into meaningful reports and compile data for analysis and ad hoc projects • The positions include writing and modifying SAS code to produce complicated combinations of detailed data • The key requirement for the role is the ability to understand complex business requirements, develop data-driven solutions to address business problems, and provide data and SAS programming expertise in supporting Actuarial team requirements • Hands-on experience in writing complex queries, macros and stored procedures • Use SAS to extract data per specifications to create analytic datasets • Merge different types of data (e.g. transaction-level data with account-level data) • Quickly create and run ad hoc queries to answer urgent business questions • Ability to maintain and enhance existing SAS processes • Able to understand complex macro/Do loops in existing code, able to make changes if needed • Ability to query data from SQL databases, large datasets, and files using SAS and/or SQL • Ability to research reporting variances or business issues to identify root cause and build appropriate audits • Able to decode Python scripts and convert them to SAS Requirements Job Qualifications: • Bachelor's degree in Computer Science or related education or practical experience • 7+ years of advanced SAS experience (macros, data management, data profiling and reporting) • SAS programming experience required • Understanding of and experience using various SAS methods for merging, joining, and performing lookups • Experience in creating performance reports using SAS, SQL, and Microsoft Office tools • Demonstrated experience producing accurate and detailed work on multiple projects under time pressure • Strong verbal, written, and interpersonal communication skills desired • Experience in a financial or banking environment preferred • Should know basic Python coding
Apr-06-20
Job Title: Principal Data Scientist Duration: 6+ months Location: Durham, NC Looking for a Lead Data Scientist for Durham, NC working in Health Care Technology business units. MUST HAVE Health Care Technology experience or the candidate will not be considered (experience with Medicare/Medicaid, etc.). The associate's key differentiating abilities will be outstanding data science, programming and problem-solving skills. Absolutely critical is the individual's ability to carry an initiative from ideation to execution. A natural programmer, with demonstrated industry experience in statistics, machine learning, data modeling and building associated pipelines Experience with one or more of the following tools/frameworks – python, scikit-learn, nltk, pandas, numpy, R, pyspark, scala, SQL/big data tools, TensorFlow, PyTorch, etc. Proven track record, over several years, in conceptualizing, planning, leading, and developing data science in the Medicare domain Past accomplishments include leading project teams; knowledge of Agile ceremonies a plus Education – At least one advanced degree (Master or PhD level) in a technical or mathematically oriented discipline, e.g., coursework or experience in fields such as statistics, machine learning, computer science, applied mathematics, econometrics, engineering, etc. Analytic Skills: In addition to the core regression, classification and time series skills that accompany the data science role, experience with A/B testing, causal inference and experimentation methods is preferred Consulting experience is a plus Experience with financial data sets is nice to have Experience with the full stack of data science: data extraction, cleaning, and preparation; feature extraction; AI/ML algorithm development; model training and evaluation; putting applications into production Past work includes developing data science roadmaps Demonstrated experience working with healthcare data, including claims with their various codes such as ICD-10 and CPT, as well as the rules around Medicare plans, is preferred
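As a generic illustration of the scikit-learn tooling listed above, the following sketch fits a small classification pipeline on synthetic data; the dataset and model choice are assumptions, not the client's actual healthcare use case.

```python
# Illustrative scikit-learn pipeline; synthetic data, not real claims data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

pipe = Pipeline([
    ("scale", StandardScaler()),       # standardize features
    ("clf", LogisticRegression(max_iter=1000)),
])
pipe.fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, pipe.predict_proba(X_test)[:, 1]))
```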
Apr-06-20
($) : Market
Experience of MDM and the ability to apply those principles to our existing platforms. Experience of Informatica Hub, IDQ and IDD. Working experience of general data management practices, including but not limited to: data management standards and approaches, data quality management (validation, data profiling, etc.), data governance practices, including data governance program lifecycles, data stewardship practices, and data sourcing practices. Hands-on experience in developing and implementing data quality rules in IDQ or another data quality tool. Excellent SQL query skills and understanding of relational databases.
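Data quality rules in IDQ are configured in the tool itself; the snippet below is only a plain-Python analogue of the rule idea (completeness and format checks), with made-up column names.

```python
# Plain-Python analogue of simple data-quality rules; not IDQ itself.
import pandas as pd

df = pd.DataFrame({
    "customer_id": [1, 2, None, 4],
    "email": ["a@x.com", "not-an-email", "c@x.com", None],
})

rules = {
    "customer_id_not_null": df["customer_id"].notna(),
    "email_format_valid": df["email"].fillna("").str.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
}

# Report how many rows pass each rule, as a basic data-quality scorecard.
for name, passed in rules.items():
    print(f"{name}: {passed.sum()}/{len(df)} rows pass")
```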
Apr-06-20
We require a local candidate … Hi, please go through the job description below and let me know your interest. In case you have any queries, feel free to call me. Job Role: Informatica Developer | AWS | Power BI Location: Houston, TX Contract duration: 6+ months Technical Hiring Criteria (Must Haves) Programming Languages: ETL Informatica PowerCenter and Power BI Platform/Environment: Informatica on AWS cloud Database Management System: Oracle and Postgres Application Packages, etc.: Well versed with databases Years of experience on each of the technical must-have skills: Minimum 8 years in ETL technology, mainly Informatica PowerCenter and Power BI Minimum years of experience: 8 years
Apr-06-20
Job Title: IBM Watson Consultant Duration: Long Term Location: Hartford, CT 06156 * 10 years of experience in machine learning and data analysis. * Highly skilled and experienced in NLP Text Analytics on the IBM Watson Platform. * Develop, construct, test and maintain architectures and align architecture with business requirements. * Will be responsible for proposing, training, validating, and shipping AI/ML models from concept to production. * Work with data analysts, data engineers, data scientists and product SMEs. * Requirements gathering and assessment.
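Watson NLP models are built against IBM's own services; as a neutral stand-in, this small scikit-learn example only shows the train-and-predict loop for a toy text classifier, with invented example sentences and labels.

```python
# Toy text-classification loop with scikit-learn; not the IBM Watson API.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

texts = [
    "claim denied due to missing information",
    "payment processed successfully",
    "appeal submitted for denied claim",
    "deposit posted to member account",
]
labels = ["claims", "payments", "claims", "payments"]

clf = Pipeline([("tfidf", TfidfVectorizer()), ("lr", LogisticRegression())])
clf.fit(texts, labels)
print(clf.predict(["the claim was rejected"]))
```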
Apr-06-20
($) : Market
Informatica ETL Developer Sunnyvale, CA 12 Months LinkedIn ID Must - If currently working, need Official Email ID or ID card Pharma background Role: Responsibilities: * Analyze business requirements, follow standard change control and configuration management practices, and conform to departmental application development standards and the systems life cycle. * Lead the design, development and implementation of new system components or fixes to resolve system defects. * Incorporate source code reuse wherever possible. * Understand data ETL concepts and best practices. * Set up and execute component tests as well as track and document system defects. * Participate in software design and programming reviews. * Design and build data models to conform with our existing EDW architecture * Work with teams to deliver effective, high-value reporting solutions by leveraging an established delivery methodology. * Perform data mining and analysis to uncover trends and correlations to develop insights that can materially improve our decisions. Skills: Skillset - Must have: * Looking for an ETL developer with 10+ years of Informatica experience. * Position requires advanced knowledge of Informatica and SQL experience. * Must have strong knowledge of database concepts * Experience visualizing data in business intelligence tools such as Tableau * Must have experience understanding source systems and recommending solutions from a data integration standpoint * Must have performed ETL projects end to end * Ability to lead the project and provide technical guidance to the team * Domain experience in the Pharmaceutical Commercial area is a great plus!! Domnic Uliyano Insigma Inc 24805 Pinebrook Rd, Suite 315 Chantilly, VA 20152
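Informatica mappings are developed in the PowerCenter client rather than in code, so the sketch below only mimics the extract-transform-load flow this posting describes, using pandas and sqlite3; the table and column names are hypothetical.

```python
# Pandas/sqlite3 stand-in for an extract-transform-load flow; not PowerCenter.
import sqlite3
import pandas as pd

conn = sqlite3.connect(":memory:")

# Extract: source rows (created inline here instead of read from a source system)
source = pd.DataFrame({
    "order_id": [1, 2, 3],
    "order_date": ["2020-04-01", "2020-04-02", "2020-04-02"],
    "amount_usd": ["100.50", "75.00", "NaN"],
})

# Transform: type conversion and rejection of error rows, similar in spirit to
# an expression transformation plus an error-row flow.
source["amount_usd"] = pd.to_numeric(source["amount_usd"], errors="coerce")
clean = source.dropna(subset=["amount_usd"])
rejects = source[source["amount_usd"].isna()]

# Load: write conforming rows to the target table.
clean.to_sql("fact_orders", conn, index=False, if_exists="replace")
print(f"loaded={len(clean)}, rejected={len(rejects)}")
```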
Apr-05-20
Hi, Please find the job description below and let me know. TITLE: Ab Initio Developer Location: Wilmington, DE, Lewisville, TX Duration: Contract Interview Mode: Telephonic/Skype Skillset: 1. Ab Initio experience is mandatory 2. Abi BRE Rules Engine experience preferred 3. Cards domain experience is a positive 4. Other rules engine experience if Abi BRE experience is not available - Blaze, ODM, etc. (ODM is the IBM rules engine) 5. The right blend of BA and programming experience will be preferred 6. Candidate should be motivated to work in the Rules Engine area 7. Open to considering resources trained in Abi BRE if past experience is not available and they are willing to learn 8. A blend of senior, mid-level and junior years of experience is acceptable Thanks & Regards, Hasan Khan | IT Recruiter AVTECH Solutions Inc Phone Email: hasan@avtechsol.com Web: www.avtechsol.com
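Ab Initio BRE rules are authored in the BRE interface; the following Python sketch is just a minimal illustration of the condition/action rule idea for a cards-style decision, with invented rule names and thresholds.

```python
# Minimal rules-engine sketch; rule names and thresholds are invented,
# and this is not Ab Initio BRE.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    condition: Callable[[dict], bool]
    action: str

rules = [
    Rule("high_utilization", lambda a: a["balance"] / a["limit"] > 0.9, "decline"),
    Rule("delinquent", lambda a: a["days_past_due"] > 30, "decline"),
    Rule("default", lambda a: True, "approve"),
]

def evaluate(account: dict) -> str:
    # First matching rule wins, mirroring an ordered ruleset.
    for rule in rules:
        if rule.condition(account):
            return f"{rule.action} ({rule.name})"
    return "no decision"

print(evaluate({"balance": 9500, "limit": 10000, "days_past_due": 0}))
```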
Apr-05-20
Should have strong Informatica / ETL skills.
Apr-05-20
Job Roles/Responsibilities: 8-10 years total experience in Informatica PowerCenter; strong in ETL development processes; extensive knowledge of data identification and test data preparation; strong knowledge of data warehouse concepts; ETL architecture knowledge for developing technical specification documents and design documents and for providing application support and maintenance; strong in UNIX scripting and Oracle PL/SQL; strong analytical and troubleshooting skills; ability to provide technical support to other team members during project execution; good understanding of DW/BI skills; excellent communication, documentation and presentation skills; data warehousing testing experience is good to have
Apr-05-20
Data Architect  Houston, TX
We are looking for a Data Architect; please send resumes.
Apr-04-20
($) : Market
Job Description Senior Statistical Programmer - SDTM Location: Remote U.S. *Strong focus will be on SDTM experience* BA/BSc or higher degree in Computer Science, Statistics, Mathematics, Life Sciences or other related scientific subject, or equivalent work experience Greater than 5 years of relevant career experience programming in a clinical development environment Excellent SAS data manipulation, analysis and reporting skills Ability to provide quality output and deliverables Ability to write, test, and validate SAS programs, and review resulting output and data Excellent oral and written English communication skills Familiarity with the drug development life cycle and experience with the manipulation, analysis and reporting of clinical trials data Good knowledge of Clinical Data Interchange Standards Consortium (CDISC) Study Data Tabulation Model (SDTM) and Analysis Data Model (ADaM) standards and the principles on which they are founded Ability to work effectively and successfully in a team environment Ability to manage challenging timelines Willingness and ability to learn and follow standard processes and procedures Ability to work effectively on multiple tasks or projects Ability to effectively perform complex statistical programming and related tasks Willingness and ability to provide guidance to team members on technical and process questions Responsibilities for this role involve, but are not limited to, data manipulation, analysis and reporting of clinical trial data. Specifically: creation of analysis files, tables, figures and listings (TFLs), validation of those files, and maintenance of associated tracking and validation documentation. Work is to be performed in a team environment, where team members may be distributed globally across several locations. The Senior Programmer is expected to be able to provide technical and process-related guidance to team members.
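Production SDTM work is done in SAS, but as a rough illustration of the kind of mapping involved, the pandas sketch below derives a few DM-style variables from invented raw data; the raw column names and the subset of DM variables shown are assumptions.

```python
# Hedged sketch of an SDTM-style mapping in pandas rather than SAS;
# raw columns and the DM variables shown are illustrative only.
import pandas as pd

raw_dm = pd.DataFrame({
    "subject": ["001", "002"],
    "birth_date": ["1980-05-01", "1975-11-23"],
    "sex_raw": ["Male", "Female"],
})

dm = pd.DataFrame({
    "STUDYID": "ABC-123",
    "DOMAIN": "DM",
    "USUBJID": "ABC-123-" + raw_dm["subject"],
    "BRTHDTC": raw_dm["birth_date"],               # already ISO 8601 here
    "SEX": raw_dm["sex_raw"].map({"Male": "M", "Female": "F"}),
})
print(dm)
```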
Apr-03-20
($) : Market
Title: Informatica Developer Location: Thousand Oaks, CA Duration: 6 months contract Experience working with Informatica PowerCenter 9.x or higher Requirements gathering and creating architectural, detail design documents and mapping documents. Transformations of Informatica including complex lookups, stored procedures, update strategy, mapplets, etc. Skilled in developing Informatica objects - mappings, sessions, and workflows - based on the prepared low-level design documents. Experience in debugging mappings and identifying errors and error rows so that they can be corrected and re-loaded into a target system. Performance tuning of Informatica mappings, processes, and load routines. Experience with Data Integration Web Services, Data Quality, and Data Validation Option. Work with business requirements to identify and understand source data systems; provide resolutions to all data issues and coordinate with the data analyst to validate all requirements; perform interviews with all users and developers. Map source system data to data warehouse tables. Develop and perform tests and validate all data flows and prepare all ETL processes according to business requirements and incorporate all business requirements into all design specifications. Define and capture metadata and rules associated with ETL processes. Adapt ETL processes to accommodate changes in source systems and new business user requirements. Collaborate with all developers and business users to gather required data, execute all ETL programs and scripts on systems, implement all data warehouse activities and prepare reports for the same. Provide support and maintenance on ETL processes and documentation. Experience automating ETL processes with Informatica PowerCenter. Ability to perform relational and multidimensional modeling and schema creation. Development of cube dimensions, measure groups, calculations, and any data management code for the partitioning schemes. XML and corresponding XSDs. Database query/performance optimizations. Experience with SQL Server, Oracle or DB2 in the areas of database administration, as well as schema design, database object development, management, optimization, and SQL programming
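As a loose analogue of the lookup transformations described above, the pandas sketch below resolves a natural key against a dimension table and flags unmatched rows; the tables and columns are made up and this is not Informatica itself.

```python
# Surrogate-key lookup sketch in pandas; not an Informatica mapping.
import pandas as pd

dim_customer = pd.DataFrame({
    "customer_sk": [1, 2],
    "customer_id": ["C001", "C002"],
})
staging = pd.DataFrame({
    "customer_id": ["C001", "C003"],
    "sale_amount": [120.0, 75.0],
})

# Connected-lookup equivalent: resolve the natural key to a surrogate key and
# route unmatched rows for insertion into the dimension (late-arriving member).
fact = staging.merge(dim_customer, on="customer_id", how="left")
missing = fact[fact["customer_sk"].isna()]
print(fact)
print(f"{len(missing)} row(s) need a new dimension member")
```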
Apr-03-20
($) : DOE
Sr. Data Analyst with Install Base Experience · Need a candidate with any Install Base implementation experience (any homegrown, Oracle EBS or SAP domain is fine) · Strong SQL skills · Can talk to stakeholders, business users, customers · Understanding of the sales domain.
Apr-03-20
($) : DOE
For one of our ongoing projects we are looking for a Data Analyst (T) Data Science and Analytics. Skills: 7+ years of experience in the software industry, with extensive knowledge of implementing the Medallia Solution. Should possess expertise in Medallia's product capabilities and address a wide variety of business concerns through a customized and differentiated product capability demonstration using Medallia's software platform Ability to understand data flows that are aligned with customer information systems. Passion for helping clients; empathy for their challenges, ability to build relationships and effectively communicate with client stakeholders. Hands-on knowledge of common web technologies, e.g. JavaScript, CSS, HTML, and integration technologies and protocols (APIs, REST, HTTP, SFTP, etc.). Understanding of common security concepts and standards (SSO, SAML, OAuth, RBAC, etc.). Solution-oriented; passion for creative problem solving, comfortable tackling new and undefined problem spaces.
Apr-03-20
Position: ETL Developer Location: Dallas, TX Please find the job description for ETL Developer Job Description: - Extensive knowledge of SQL Server - Expert in Informatica PowerCenter - In-depth knowledge of DWH concepts - Knowledge of Control-M - Knowledge of scripting (PowerShell and batch scripting) - Good analytical and communication skills - Knowledge of a scheduling tool like CTM (Control-M) is an added advantage.
Apr-02-20
($) : DOE
For a long-term, multiyear project we require a Data Analyst - Hadoop/AWS EMR. Responsibilities: Continuously optimize, enhance, monitor, support and maintain all Talend data integration processes and should be an expert in Talend Big Data jobs. Use APIs or source XML Type columns to dynamically extract, integrate and load data into the target schema using Talend. Should be responsible for building extraction and mapping rules for loading data from multiple sources for a greenfield data warehouse implementation based on Talend, AWS and Snowflake. Should have experience working with multiple file formats, especially Parquet and Avro. Should have experience with moving data into S3 folder structures and working with Talend Spark jobs on AWS EMR. Should contribute to the logical data model for data warehousing and make data available for downstream consumption. Maintain documentation, manage source code and deployments, and implement best practices.
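Talend Big Data jobs ultimately generate Spark code, so the standalone PySpark sketch below shows an equivalent read-transform-write against Parquet; the s3:// paths and column names are placeholders and assume an EMR cluster with S3 access configured.

```python
# Standalone PySpark read-transform-write against Parquet; bucket paths and
# column names are hypothetical and assume EMR's S3 connectivity.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("parquet_load_sketch").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/raw/orders/")   # placeholder path
daily = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))             # derive a date column
    .groupBy("order_date")
    .agg(F.sum("amount").alias("total_amount"))
)
daily.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_orders/")
```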
Apr-02-20
($) : Market
Data Scientist III World Wide Technology Holding Co, LLC. (WWT) has an opportunity available for a Data Scientist to support our client located in Phoenix, AZ. This position will be focused on utilizing large sets of data pools to solve complex and meaningful problems. Location: Phoenix, AZ (30-50% travel to mining sites, possible international travel) Duration: 6-month contract to hire End Client: Freeport McMoran Copper and Gold Inc Visa Type: USC & GC Responsibilities/Job Duties/Job Description/Qualifications: Lead the deployment of ensemble machine learning methods, deep learning, and advanced optimization techniques to drill into our toughest challenges. Test hypotheses and draw insights to support our goal of being industry-leading and resource-efficient. Utilize modern cloud technologies, such as Microsoft Azure, to deliver innovative analytics solutions at scale and craft the policies to support them as a thought leader. Steer and manage cross-functional teams through all stages of building practical, analytical assets and contribute to their growth. Minimum Qualifications Bachelor's degree in a related field AND seven (8) years of relevant work experience OR Master's degree in a related field AND five (6) years of relevant work experience OR Ph.D. in a related field and two (4) years of relevant work experience Preferred Some of the skills needed for Freeport Data Scientists are (must-haves among them): Databases and SQL Data ETL: missing data treatment Advanced decision trees like Random Forest and/or Gradient Boosting Machine (GBM, XGBoost) Culturally aware of artificial neural networks K-Means clustering Anomaly detection R and Python; Python is becoming more important than R lately BI (Business Intelligence) tools like Tableau, PowerBI, SAP BusinessObjects, etc. Cloud computing experience would be good to have: AWS, Azure, GCP (Google Cloud Platform) Culturally aware of modern AI / Deep Learning; hands-on experience would be great: CNN (Convolutional Neural Network), LSTM (Long Short Term Memory) machine, Autoencoder Culturally aware of NLP (Natural Language Processing): word embedding (word2vec, doc2vec), topic classification, sentiment analysis Image/Video analytics is a plus Culturally aware of the four families of optimization methods: (1) Linear Programming, Integer Programming, Mixed Integer Programming (2) Gradient Descent, Newton-Raphson, Quasi-Newton (3) Genetic Algorithm (4) Simulated Annealing
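To make one item on the list above concrete, the snippet below runs a basic anomaly detection pass with scikit-learn's IsolationForest on synthetic data; the data and contamination setting are assumptions, not the client's.

```python
# Basic anomaly detection with IsolationForest on synthetic sensor-style data.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
normal = rng.normal(loc=0.0, scale=1.0, size=(500, 3))    # typical readings
outliers = rng.normal(loc=6.0, scale=1.0, size=(10, 3))   # injected anomalies
X = np.vstack([normal, outliers])

model = IsolationForest(contamination=0.02, random_state=0).fit(X)
labels = model.predict(X)          # -1 flags anomalies, +1 marks inliers
print("flagged anomalies:", int((labels == -1).sum()))
```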
Apr-02-20
Job Title: Informatica TDM Consultant Work Location & Reporting Address: Weehawken, NJ 07086 Contract duration: 12 Months Job Details: Must Have Skills Informatica TDM Test Data Management Oracle SQL Detailed Job Description Candidate MUST have experience implementing a Test Data Management initiative for a Financial Services company or a Bank. Should have 2 years of experience in Test Data Management. Experience with setting up and configuring the Informatica TDM tool is a MUST. Strong knowledge of and experience in Oracle and SQL databases. Experience in data profiling, subsetting and validation of test data copied to QA environment systems. Experience with setup and configuration.
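Subsetting and masking are configured inside Informatica TDM; the pandas sketch below only illustrates the underlying idea, a referentially consistent subset plus a deterministic mask of a sensitive column, with hypothetical tables.

```python
# Test-data subsetting and masking sketch in pandas; not Informatica TDM.
import hashlib
import pandas as pd

customers = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "name": ["Alice", "Bob", "Carol"],
    "ssn": ["111-11-1111", "222-22-2222", "333-33-3333"],
})
orders = pd.DataFrame({
    "order_id": [10, 11, 12],
    "customer_id": [1, 1, 3],
})

# Subset: keep a sample of customers and only their orders (referential integrity).
subset_customers = customers[customers["customer_id"].isin([1, 3])].copy()
subset_orders = orders[orders["customer_id"].isin(subset_customers["customer_id"])]

# Mask: replace direct identifiers with deterministic, non-reversible surrogates.
subset_customers["ssn"] = subset_customers["ssn"].map(
    lambda s: hashlib.sha256(s.encode()).hexdigest()[:11]
)
print(subset_customers)
print(subset_orders)
```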
Apr-02-20