Top Skills > Datawarehousing

Datawarehousing ETL Jobs in USA - 2381

SAS Developer/Lead  San Antonio, TX
($) : Depends on experience
Role: SAS Developer/Lead Location: San Antonio, TX Type: Contractual through IMPLEMENTATION PARTNER Duration: Long term BR/H on C2C: DOE Exp: 5+ years Skill set: Advanced SAS, Control-M, UNIX, SAS/VBA(Basics), Mainframe(Basics) Job Description: -Should have minimum 5+ years of experience as SAS Developer in Linux environment -Should have hands on experience with SAS Enterprise Guide - Automation Skills, Develop shell scripts - Schedule the SAS Jobs in Control-M, Troubleshooting the abends, Error Resolutions - Basic Mainframe knowledge with navigation and file system experience. - Working with SAS Connections in Microsoft tools If interested in exploring the opportunity, please revert with resume, visa status & BR/H on C2C Neha Doshi Sr. Technical Recruiter Sierra Business Solution (five two zero) five eight eight - eight one two eight neha(at)sierrasoln(dot)com
Apr-01-20
($) : DOE
Data Engineer with INFORMATICA AND KAFKA Horsham, PA 6+ Months Interview: phone to Skype/WebEx - will eventually require candidate to work ONSITE. Looking at candidates in the PA, NJ, NY, DE, VA area. Top Skills MUST HAVE: Informatica AND Kafka. Job Description: We are currently seeking a replacement for a resigned Data Engineer. The position primarily requires strength in the following: Bachelor's degree or higher required. Big Data technology experience including HDFS, Pig, Hive, Sqoop, Python. Experience with Machine Learning, Artificial Intelligence, and Data Science is a plus. Informatica expertise with an emphasis on working with a diverse set of sources and targets, implementing auditing, error trapping/tracking, reusability and restartability, and the ability to troubleshoot and performance-tune Informatica mappings, sessions and workflows. Informatica Power Exchange CDC experience on Oracle, DB2, and Mainframe preferred. Exadata, Teradata, or Netezza appliance database expertise. PL/SQL or T-SQL experience. Deep understanding of Data Warehousing principles with hands-on experience with slowly changing dimensions and fact tables. Detailed work ethic around analysis and coding practices. ETL and database tuning experience. Provide scalable solutions for handling large data volumes (terabytes of data). Develop design specifications and unit test plans, and troubleshoot client issues. Experience working in an Agile methodology environment. Accountability in deliverables with the ability to work independently. Excellent communication and collaboration skills. Ability to work in a fast-paced environment and meet deadlines. Thanks and Regards, Shivangi Singh | Team Lead | KPG99, INC Certified Minority Business Enterprise (MBE) Direct| | www.kpgtech.com
Apr-01-20
Position: Informatica MDM Developer Location: Boston, MA Mandatory Required Skills: Master Data Management (MDM), PL-SQL, Unix, Shell Scripting Job Description: 6 to 8 years of experience in Informatica. Candidate must be from development background. Strong in Oracle Database Concepts, SQL. PL-SQL. Unix, Shell Scripting Master Data Management process knowledge. Extensive Experience in Informatica MDM (Master Data Management) i.e., Configuring stage and load process Configuring merge process and data access Configure the match process Configure data access views Various Data Management Tools, User exits and log files Hierarchy management and security access manager Conversant in Agile methodology of project execution is preferred. Customer facing experience w.r.t technical discussions. Team Handling/Generic Skills: Should be able to lead at least 3 to 4 members team in all phases of Development life cycle. Self-driven, ready to learn and adopt depending on customer/organization needs. Excellent communication skills. Good working, understanding of investment/custodian banking exposure for financial institutions preferable. Should be able to work in Agile model and some exposure to Agile will be preferable.
Apr-01-20
SQL Data Analyst  Santa Clara, CA
Title: SQL Data Analyst Location: Santa Clara, CA (local candidates only) Experience: 3-5 yrs Duration: 1 yr Job Description: · Strong communication skills – both written & oral. Should be able to speak directly with business customers, gather requirements, translate them to technical requirements, solution & deliver. · End-to-end delivery ownership – work with offshore teams, flexible, get-it-done attitude. · IT experience in Microsoft Business Intelligence (MSBI). · Experience in SQL Server 2008/2012, Business Intelligence, SQL Server Integration Services (SSIS), SQL Server Analysis Services (SSAS), SQL Server Reporting Services (SSRS), PL/SQL and Oracle. · Writing and optimizing complex SQL queries. · Strong knowledge of RDBMS concepts. · Nice to have – Hive, Impala or HiveQL.
Apr-01-20
Role: Lead Data Modeling with Snowflake Location: San Francisco, CA (locals only) Duration: Long term contract Job description: Candidates should have 10 to 12 years of experience (not 15-20 years). Data Architect who has done Data Modeling, with experience on the Snowflake Cloud Data Warehouse software. Needs to have experience in data migration from on-premise to Cloud. Needs to be hands-on with data rather than just an architect: 60% hands-on, 40% Business Analysis/Architect/Lead.
Apr-01-20
DATA ANALYST  Fort Worth, TX
($) : Negotiable
Role: Data Analyst Location: Fort Worth, TX Intelligent Automation – RPA, NLP, AI and Advanced Analytics. Job Summary: Provide CLIENT specialization in advanced analytics to extract insight from information by iterating rapidly to summarize and visualize large data sets. This position will work closely with business SMEs and operations research and will be learning/leveraging AI tools & techniques. Responsibilities & Tasks: The Data Analyst will assist in our efforts to create analytical solutions for CLIENT initiatives. The candidate will be responsible for data exploration, discovery, and presentation of insight gathered from data. This role will be responsible for visualizing and communicating insight extracted from data to stakeholders at various levels across the company. This role will construct, test, maintain and architect supporting datasets. The candidate needs a proven ability to work independently on big projects, must have excellent interpersonal skills and the ability to work with several stakeholders across multiple organizations, and must be very organized and detailed in development efforts.
Required Experience:
• Excellent interpersonal skills and ability to work with several stakeholders across multiple organizations
• Ability to facilitate conversations with business SMEs to understand the problem, rapidly iterate proposed solutions and clearly present new findings/solutions
• Lead-level experience providing technical & functional guidance
• Hands-on, expert experience with enterprise data visualization tools like Tableau and/or Power BI
• Experience leveraging Azure AI platform tools like Databricks & Auto ML, etc.
• Experience with relational database management system development
• Solid analysis and problem-solving skills
• Able to build analytics solutions including data exploration, extraction, cleaning, transformation, testing and implementation
• Open to learning new tools and technologies
• Able to adapt to a fast-paced working environment
Preferred Experience:
• Project leadership experience leading collaborative efforts
• Master's Degree in Management Information Systems, Computer Science or equivalent
• Ability to write code in Python, Java and R
Apr-01-20
Role: Informatica/Hadoop Technical Lead Location: Detroit, MI Interview type: Skype. The Informatica/Hadoop Technical Lead must have prior hands-on (MUST) experience delivering/leading successful data warehousing projects (Informatica) as well as a broad background and experience with IT application development (Hadoop). This individual is responsible for working with other teams to deliver the artifacts and the code. This person must have strong professional consulting skills and the ability to communicate well at all levels of the organization. REQUIRED EXPERTISE IN TOOLS & TECHNOLOGIES: Informatica 9.x and above (MUST) Informatica Power Center (MUST) Informatica Data Quality (STRONGLY PREFERRED) Big Data Hadoop Eco-system, preferably Cloudera distribution (MUST) Hadoop HDFS / Pig / Spark / Oozie NoSQL Databases - Hive / Impala / MongoDB / Cassandra Oracle 10g and above (MUST) Unix Shell Scripting - AIX or Linux (MUST) Experience in any of the Scheduling Tools - Tivoli, Autosys, Ctrl-M Bachelor's Degree in a related field OTHER SKILLS/EXPERIENCE REQUIRED: More than 3 years of experience as a Senior ETL Developer More than 5 years' experience as an ETL Developer using Informatica and Oracle 10g/11g to implement data warehousing projects Working knowledge of Informatica Data Quality is preferred More than 2 years of experience in leading/designing Informatica/Hadoop projects Excellent understanding of data warehousing concepts. The candidate should be able to clearly communicate fundamental concepts during the interview and demonstrate previous experience in all aspects.
MUST HAVE strong SQL skills in Oracle 10g/11g Experience in Oracle database programming using Partitioning, Materialized Views and OLAP Experience in tuning Oracle queries/processes and performance management tools Strong data modeling skills (normalized and multidimensional) Strong business and communication skills Knowledge of Health Care Insurance Payer Data Warehousing preferred Preferred Certifications: Cloudera (CCP / CCA) RESPONSIBILITIES: The Senior Developer should be able to perform the following with minimal supervision: Understand Business Requirements and Conceptual Solution Convert Business Requirements into Technical Requirements and Design Create High Level Design and Detailed Technical Design Create Informatica Mappings, Workflows and Sessions Create Shell Scripts for automation Understand the source systems and conduct the data profiling Coordinate Data Modeling of Sources and Targets Create Source to Target Mapping Specifications from the Business Requirements Review Unit Testing and Unit Testing Results Documents Provide Support for QA/UA Testing and Production code migrations Provide warranty support by assisting/resolving production issues The Senior Developer should have the following leadership skills: Provide hands-on technical leadership Lead technical requirements, technical and data architectures for the data warehouse projects Direct the discovery process Provide subject matter expertise and knowledge guidance to Business Analysts and ETL Developers Lead the design and development of the ETL process, including data quality and testing Follow the standards and processes defined Contribute to process and performance improvements Ensure compliance with metadata standards for the data warehouse
Apr-01-20
Data Architect We have an immediate opportunity with a large F500 client in the Alpharetta, GA area. We are looking for a Data Architect at Alpharetta, GA with one of our major clients. Please go over the details and let me know. Data Architect Alpharetta, GA Big Data Architect (Big Data and Cloud (GCP) Management Practice) Description: The client is looking for a Big Data and Cloud Architect to build end-to-end business solutions and to work with one of the leading financial services organizations in the US. This job offers a unique opportunity to work in a high-growth company, with multiple recent acquisitions, a rapidly maturing Big Data and Cloud data practice, and a need to shape the future of Data Architecture across the enterprise to support the organization's goal of becoming the industry standard for the Financial Services Industry. The Big Data Architect (Domain: Finance, Marketing, Sales, Consumers, Products, etc.) is accountable for architecting and designing comprehensive solutions that meet business and functional requirements in support of a given initiative. The Architect plays a role in establishing architectural vision and direction, architects solutions, provides advice and guidance, monitors emerging technologies, and assists in software and service. The Architect must be process oriented, results driven and focused on delivering high-quality solutions. The ideal candidate has demonstrated experience in functional design & implementation (hands-on experience) of the full life-cycle of an Enterprise Data Warehouse/Lake (EDW) on a Cloud platform. In addition, the Data Architect will passionately drive a transformation initiative which will guide and influence key business decisions. This role will partner with Finance, Sales, Marketing, and Product management teams, as well as other enterprise customers, to understand business needs, drive requirements, propose solutions and deliver multiple cross-functional projects and integrations between different systems.
This is a unique opportunity to build and institutionalize a best-in-class data ecosystem. The Data Architect will be accountable for partnering with key roles (e.g. project managers, architects, business analysts, etc.) to develop solution blueprints that are aligned to the organization's architecture standards and principles, leverage common solutions and services, and meet the financial targets (cost and benefits). To be successful in this role one must be able to work effectively in a fluid, fast-paced environment. This role requires strong communication skills with delivery and engineering team members, operations support staff and business customers. In addition, the successful Data Architect must be able to work with minimal supervision on multiple concurrent projects. Responsibilities: Work closely with management and key business stakeholders to determine business needs. Develop and maintain current and target state data architectures; define strategy and roadmaps to drive the discovery, design and implementation of our EDW/EDH. Serve as the functional expert responsible for the data architecture, design and implementation of data solutions, with complete and accurate information/data delivery using maintainable, systematic, and automated procedures. Make recommendations about data collection methods, data management, data definitions, and evaluation methods in collaboration with internal stakeholders. Establish and maintain data marts where appropriate. Partner to define and enforce data architecture standards, procedures, metrics, and policies to ensure consistency across the portfolio. Help establish and define validation, data cleansing, integration, and data transformation practices. Also lead or co-lead data governance and quality standards.
Facilitate reporting and analytics design review sessions. Stay current with contemporary technology trends/concepts and serve as an SME for the business teams. Manage issues and bugs within the system using tracking/support systems; liaise with internal and external resources to facilitate resolution and closure. Define reference and data solution architectures that support cloud initiatives, big data and data lake use cases, and traditional data platforms. Present architecture deliverables to stakeholders at all levels of the organization. Experience: Successfully architected complex, large Big Data solutions in the Cloud. Experience implementing Big Data on the GCP platform. Highly skilled in applying data governance processes and best practices. Working experience designing, developing, and delivering data-related functional and technology solutions. Working experience in delivering solutions. Working experience architecting complex, multi-system solutions. Working experience developing architecture principles and standards. Qualifications: 7+ years' experience as a practicing data architect/engineer, with 3+ years' experience in a Data Architect function. Significant experience in data architecture services, designing, developing, and delivering technology solutions. Thorough understanding of Cloud and Big Data platforms, specifically GCP. Working knowledge of Oracle BRM, MDM, OPH and adjacent technology including SOA and Micro Services. Working experience in cloud technologies, specifically AWS and Google Cloud, and integration experience with Oracle ERP, BRM, RMB, SFDC and other homegrown applications and platforms. Bachelor's Degree in Computer Science, MIS, Business, or equivalent experience. For immediate consideration please contact: Anu, UpStream Global Services. Reply to: www.upstreamgs.com
Apr-01-20
($) : DOE
TECHNOGEN, Inc. is a Proven Leader in providing full IT Services, Software Development and Solutions for 15 years. TECHNOGEN is a Small & Woman Owned Minority Business with GSA Advantage Certification. We have offices in VA and MD & offshore development centers in India. We have successfully executed 100+ projects for clients ranging from small businesses and non-profits to Fortune 50 companies and federal, state and local agencies. Position: Data Analyst Location: Rockville, Maryland (initially remote due to the coronavirus; once everything is back to normal you will have to go onsite) Duration: Long Term Job Description: Expert in data analysis. Data modelling/data architecting. Understanding data: how to map and move the data, how to architect data. Heavy SQL and the ability to write complex queries from scratch. Experienced in Agile. Prior experience with AHRQ/NIH etc. is preferred. Best Regards, Kevin Sr. Talent Acquisition Specialist Email: kevin@technogeninc.com Web: www.technogeninc.com 4229 Lafayette Center Dr, Suite 1880, Chantilly, VA 20151
Apr-01-20
Job title: Big Data Architect Location: Stamford, CT Duration: Long term. Strong Big Data Architect with knowledge of PySpark, Hive, Pig, Spark, administration of AWS EMR (preferred), Airflow, Lambda, Oozie. Experience in building scalable big data ingestion frameworks. Define and build scalable and futuristic architecture for data platforms. Good hands-on experience in PySpark. Work closely with the customer on data exploration & provide technology guidance on enabling a data foundation for analytics. Build multiple PoCs as part of the data framework build, e.g. help users query unstructured data for formulating the requirements. Exposure to other Big Data technologies is preferred, as this is a greenfield implementation with a lot of scope for experimentation & adoption of new technologies.
Apr-01-20
Data Analyst  Santa Clara, CA
($) : 65000 / year
Data Analyst Responsibilities: · Managing master data, including creation, updates, and deletion. · Providing quality assurance of imported data, working with a quality assurance analyst if necessary. · Commissioning and decommissioning of data sets. · Processing confidential data and information according to guidelines. · Helping develop reports and analysis. · Supporting initiatives for data integrity. · Evaluating changes and updates to source production systems. · Providing technical expertise on data storage structures, data mining, and data cleansing. · Understanding business requirements from operations users and translating them into work items. Key Skills: · Experience as a Data Analyst in Insurance is mandatory, preferably Life Insurance. · Hadoop experience with Hive SQL mandatory. · High-level experience in methodologies and processes for managing large-scale databases. · Experience with Redshift, Business Objects, Tableau would be an added advantage. · Experience with Python is preferable. · Demonstrated experience in handling large data sets and relational databases. · Understanding of addressing and metadata standards. · High-level written and verbal communication skills.
Apr-01-20
Description: Experienced Ab Initio Developer who has worked in end to end SDLC life cycle. Develop and foster a positive relationship with team members, team leads and business partners Develops and updates documentation, departmental technical procedures and user guides Responsibilities: 6-8 years of work experience in Ab Initio Good analytical and logical skills in writing SQLs, Stored Procedures and creating Marts/Views in Oracle Should have worked in an Agile delivery environment Capable of understanding of business requirement/mappings and converting them into design
Apr-01-20
Data Analyst  Dallas, TX
($) : 65000 / year
Responsibilities of the Data Analyst: Analysis of structured and unstructured data for data quality issues. Perform data steward activities such as cleansing data which has errored off to keep enterprise master data accurate. Presentation development for senior management supporting the Data Team. Gather and synthesize functional and non-functional business requirements, ensuring alignment to the data strategy. Analysis of current communication processes to identify opportunities for enhancements. Centralize and streamline activities to promote agility and process improvements. Create process and data flow diagrams for data movement capture. Work with assigned Data Architects or development team members on data questions. Work as a liaison between the business and project teams on data questions or concerns. Collaborate/communicate with the project team and business users as required. Support functional testing and performance testing. Work with the technical delivery lead on project activities. Ensure assigned work is implemented within project schedules. Requirements of the Data Analyst: At least 2-3 years of experience with data analysis. Proficient in writing SQL queries to analyse databases. Excellent communication and coordination techniques. Experience working with senior leadership. Demonstrated ability to work in a high-intensity, multi-project environment. Excellent interpersonal and communication skills (technical and non-technical). Extremely detail-oriented individual with the ability to multi-task. Ability to work autonomously towards a goal. Strong verbal and written communication skills. Strong analytical, problem-solving, and conceptual skills. Excellent interpersonal skills. Strong Microsoft Office tools (PowerPoint, Excel, Word, Visio) knowledge.
Strong understanding of data-related concepts such as master data management, data warehousing and analytics. Education: Bachelor's degree in Business (or Management), Computer Science, or a related discipline, or equivalent work experience is required.
Apr-01-20
($) : Market
Informatica ETL Developer Sunnyvale, CA 12 Months LinkedIn ID Must - If currently working, need Official Email ID or ID card Pharma background Role: Responsibilities: * Analyze business requirements, follow standard change control and configuration management practices and conform to departmental application development standards and systems life cycle. * Lead the design, development and implementation of new system components or fixes to resolve system defects. * Incorporate source code reuse wherever possible. * Understand data ETL concepts and best practices. * Set up and execute component tests as well as track and document system defects. * Participate in software design and programming reviews. * Design and build data models to conform with our existing EDW architecture. * Work with teams to deliver effective, high-value reporting solutions by leveraging an established delivery methodology. * Perform data mining and analysis to uncover trends and correlations to develop insights that can materially improve our decisions. Skills: Skillset - Must have: * Looking for an ETL developer with 10+ years of Informatica experience. * Position requires advanced knowledge of Informatica and SQL experience. * Must have strong knowledge of database concepts. * Experience visualizing data in business intelligence tools such as Tableau. * Must have experience understanding source systems and recommending solutions from a data integration standpoint. * Must have performed ETL projects end to end. * Ability to lead the project and provide technical guidance to the team. * Domain experience in the Pharmaceutical Commercial area is a great plus!! Domnic Uliyano Insigma Inc 24805 Pinebrook Rd, Suite 315 Chantilly, VA 20152
Apr-01-20
Hi, Please find the job description below and let me know. TITLE: Ab Initio Developer Location: Wilmington, DE / Lewisville, TX Duration: Contract Interview Mode: Telephonic/Skype Skillset: 1. Ab Initio experience is mandatory 2. Ab Initio BRE Rules Engine experience preferred 3. Cards domain experience is a positive 4. Other rules engine experience if Ab Initio BRE experience is not available - Blaze, ODM, etc. (ODM is the IBM rules engine) 5. The right blend of BA and programming experience will be preferred 6. Candidate should be motivated to work in the Rules Engine area 7. Open to considering resources trained in Ab Initio BRE if past experience is not available and they are willing to learn 8. A blend of senior, mid-level and junior years of experience is acceptable. Thanks & Regards, Hasan Khan | IT Recruiter AVTECH Solutions Inc Email: hasan@avtechsol.com Web: www.avtechsol.com
Apr-01-20
Should have strong Informatica / ETL skills.
Apr-01-20
Job Roles/Responsibilities: 8-10 years total experience in Informatica Power Center. Strong in ETL development processes. Extensive knowledge in data identification and test data preparation. Strong knowledge in data warehouse concepts. ETL architecture knowledge for developing Technical Specification Documents and Design Documents and providing Application Support/Maintenance. Strong in UNIX scripting and Oracle PL/SQL. Strong analytical and troubleshooting skills. Ability to provide technical support to other team members during project execution. Good understanding of DW/BI skills. Excellent communication, documentation and presentation skills. Good to have: data warehousing testing experience.
Apr-01-20
($) : DOE
Sr. Data Analyst with Install Base Experience · Need a candidate with any Install Base implementation experience (Any homegrown, Oracle EBS or SAP domain is fine) · Strong SQL skills · Can talk to stakeholders, business users, customers. · Understanding on the sales domain.
Apr-01-20
Data Analyst  McLean, VA
Role: Data Analyst Duration: 12 months Location: McLean, VA GLIDER Required: Data Analyst Top skills: 1. Heavy SQL 2. Python (write scripts, work with Jupyter notebooks); not looking for Python development 3. Pandas (library used in Python to create rectangular data sets) 4. Compile the results in Google Sheets; document how the analysis was completed 5. Tableau: create dashboards to replicate results. Must be able to use SQL extremely well. Most of the data will be in relational databases of various kinds. All of these will be in the cloud. Project Description: Team of 12 analysts that's growing to 15 analysts. Part of a larger effort to grow the team by 10. Line of Business: Enterprise Services (EP2) Original Description: Work with product managers to understand what they need in an analysis. Pull data from databases using complex queries. Analyze data with aggregations and basic statistics. Compile results into a coherent story. Present results in a spreadsheet or email. Document how the analysis was done and the results. Develop dashboards for ongoing reporting. Required Skills: Very strong SQL. Python scripting. Spreadsheets (Excel / Google Sheets). Ability to tell a story with the results of an analysis.
Apr-01-20
Role: Ab Initio Developer. Location: Phoenix, AZ 1. Candidate needs to have a minimum of 2 years of work experience in continuous flow and hands-on experience with Ab Initio 2. Should be aware of web services, REST and SOAP 3. Should have total ETL experience of more than 6 years. Any visa is fine.
Mar-31-20

Understanding Data Warehouse & ETL

A Data Warehouse is a large database designed solely for data analysis. It is not used for standard database processing or business transactions. ETL (Extract, Transform, and Load) is the process by which data is extracted from various sources, transformed to remove redundancy and inconsistency, and loaded into a data warehouse or repository, where it is then made available for analysis and querying. ETL supports effective decision making and analytics based on the composite data. Slices of data from the data warehouse can be stored in a data mart, which enables quick access to specific data such as sales summaries or finance reports.
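The extract-transform-load flow described above can be sketched in a few lines of Python. This is a minimal illustration only, using the standard library's sqlite3 as a stand-in for a real warehouse; the table name, columns, and sample rows are invented for the example.

```python
import sqlite3

# Extract: rows as they might arrive from a source system (illustrative data)
source_rows = [
    {"order_id": 1, "region": " east ", "amount": "125.50"},
    {"order_id": 2, "region": "WEST",   "amount": "80.00"},
    {"order_id": 2, "region": "WEST",   "amount": "80.00"},  # duplicate row
]

def transform(rows):
    """Normalize values and drop duplicate business keys."""
    seen, clean = set(), []
    for r in rows:
        if r["order_id"] in seen:
            continue  # remove redundancy before loading
        seen.add(r["order_id"])
        clean.append((r["order_id"], r["region"].strip().lower(), float(r["amount"])))
    return clean

# Load: write the conformed rows into the warehouse table
conn = sqlite3.connect(":memory:")  # stand-in for a real warehouse
conn.execute("CREATE TABLE fact_orders (order_id INTEGER, region TEXT, amount REAL)")
conn.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)", transform(source_rows))

# The loaded data is now available for analysis and querying
total = conn.execute("SELECT SUM(amount) FROM fact_orders").fetchone()[0]
print(total)  # 205.5
```

In a production pipeline the extract step would read from files, APIs, or source databases, and the load target would be a dedicated warehouse platform, but the three-phase shape stays the same.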

Data Warehouse Features & Capabilities

A Data Warehouse has features and capabilities that make data analysis easy. A good data warehouse should be able to:
• Interact with other sources and inputs, extracting data using data management tools.
• Extract data from a wide variety of sources: files, Excel, applications, and so on.
• Support cleansing so that duplication and inconsistency can be removed.
• Reconcile data to standard naming conventions.
• Allow both native and autonomous storage of data for an optimized process.
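Reconciling data to standard naming conventions, one of the capabilities above, can be illustrated with a small Python sketch. The column-name mappings and sample records here are hypothetical, chosen only to show the idea of mapping each source's native field names onto one warehouse convention.

```python
# Map each source system's native column names onto the warehouse's
# standard naming convention (these mappings are illustrative)
NAME_MAP = {
    "cust_nm": "customer_name",
    "CustomerName": "customer_name",
    "cust_ph": "phone",
    "PhoneNumber": "phone",
}

def reconcile(record):
    """Rename source-specific fields to the standard convention."""
    return {NAME_MAP.get(k, k): v for k, v in record.items()}

crm_row = {"CustomerName": "Acme Corp", "PhoneNumber": "555-0100"}
erp_row = {"cust_nm": "Acme Corp", "cust_ph": "555-0100"}

# Both sources now produce identically shaped records, so downstream
# cleansing can recognize them as the same customer
assert reconcile(crm_row) == reconcile(erp_row)
```

Once records from every source share one shape, the cleansing step can detect duplicates and inconsistencies that would otherwise hide behind differing field names.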

Top ETL Tools to excel in Data Warehousing Jobs

There are many ETL tools available on the market. The most commonly used include: • Sybase • Oracle Warehouse Builder • CloverETL • MarkLogic. There are also excellent data warehousing platforms such as Teradata, Oracle, Amazon Web Services, Cloudera, and MarkLogic. Expertise in any of these can fetch you a good job in the field of data warehousing.

Salary Snapshot for Data warehousing Jobs in US

A senior Data Warehouse developer receives an average pay of $123,694 a year. Depending on skill and expertise, salaries in this field can range anywhere from $83,000 to $193,000. Most Senior Data Warehouse Developers receive a salary between $103,500 and $138,000 in the United States. There are currently plenty of Data Warehouse developer jobs in the USA.

Career Path for a Data Warehouse Professional

Data warehousing offers immense opportunities for an IT professional. There are a plethora of roles and designations required to manage this vast application and its different modules. Data warehouse managers are software engineers who build storage mechanisms for organizations to meet the needs of the company. Entry-level roles in Data Warehousing are Software Developer, Software Engineer, Business Intelligence (BI) Developer, and Data Warehouse ETL Developer. People who make use of the data in the Data Warehouse to arrive at decisions are Data Analysts, Data Scientists, and Business Intelligence (BI) Analysts. Senior roles in this field include Data Warehouse Manager, Senior Financial Analyst, Senior Software Engineer/Developer/Programmer, and Senior Business Analyst. Data warehousing jobs in the USA are still prevalent, and if you are a specialist in this field, you can make a great career out of it.
Data Warehouse Skills & Tools
To be a Data Warehousing professional, you need an in-depth understanding of database management systems and their functions. Experience developing databases with any database application is an added advantage. Apart from this, other technical skills required for a Data Warehousing job are discussed below: • Tools for developing ETL. You can either develop ETL mappings quickly with a tool or build them from scratch. Some commonly used ETL tools are Informatica, Talend, and Pentaho. • Structured Query Language (SQL) is the backbone of ETL. You must know SQL, as it is the technology used to build ETL processes. • Parameterization is another crucial skill to master. • Knowledge of any of the scripting languages used with a database application, such as Python, Perl, or Bash, will come in handy. • Debugging is another essential technical skill, as nothing ever goes exactly as planned.
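To illustrate the parameterization point above, here is a minimal Python sketch in which an ETL load window is passed in as parameters rather than hard-coded, using SQL placeholders. The sales table and dates are made up for the example.

```python
# Parameterization sketch: the load window is an argument, not a literal,
# and SQL placeholders keep the query reusable and injection-safe.
# Table name and data are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (sale_date TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("2020-03-01", 10.0), ("2020-03-15", 20.0), ("2020-04-01", 30.0)],
)

def extract_window(conn, start_date, end_date):
    """Pull only the rows for one load window; dates are parameters, not literals."""
    cur = conn.execute(
        "SELECT sale_date, amount FROM sales WHERE sale_date >= ? AND sale_date < ?",
        (start_date, end_date),
    )
    return cur.fetchall()

march = extract_window(conn, "2020-03-01", "2020-04-01")
print(len(march))  # 2: only the March rows fall in the window
```

The same job can then be rerun for any month (or restarted after a failure) just by changing the parameters, which is exactly why ETL tools expose mapping and session parameters.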
Data Analyst  Seattle, WA
8+ years of industry experience as an Analyst or related specialty. 3+ years of programming experience manipulating and analyzing data (Python or Scala) Experience building robust and scalable data integration (ETL) pipelines using Airflow, SQL, Python and Spark. Experience in data modeling, ETL development, and data warehousing. Data warehousing experience with Oracle, Redshift, Teradata, Snowflake, etc. Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy Adept at queries, report writing and presenting findings Defining new data collection and analysis processes An analytical mind and inclination for problem-solving Experience building data products incrementally and integrating and managing datasets from multiple sources. Experience with Big Data technologies (Hadoop, Hive, HBase, Pig, Spark, Apache Druid) and AWS services such as S3, EC2, and EMR (Spark). Lead the transformation of a petabyte-scale batch-based processing platform to a near real-time streaming platform using technologies such as Apache Kafka, Cassandra, Spark and other open source frameworks.
Mar-31-20
Role: Informatica MDM Admin Location: Minneapolis. Job Description: Design, Install, Configuration, and Administration of Informatica Platform v10 or higher (currently on v10.2) on Linux, Install experience with Informatica MDM 10.x preferred. Leads software upgrades, Implementation of Hot fixes, implementation of new software offerings and infrastructure, maintenance, and coordinates testing activities with project teams. Researches and provides recommendations for capacity modifications, collaborates with PM to document tasks and update status. Creates and maintains architecture diagrams, Informatica/Data Integration/Data Quality tools & UNIX troubleshooting and Automating Daily Tasks. Informatica/Data Integration/Data Quality Tools Security. Informatica MDM Platform administration and Integration support. Coordinates patching and other infrastructure related activities with different teams. Monitoring of servers and services.
Mar-31-20
($) : Depends on experience
Role: Informatica Developer Location: Malvern, Pa Experience: 8 years Type: Contractual through IMPLEMENTATION PARTNER Duration: Long term BR/H on C2C: DOE Need a strong consultant With 8 years’ experience in Informatica Strong on sql, pl sql and query With 1-2 year’s data warehouse exp If interested in exploring the opportunity, please revert with resume, visa status & BR/H on C2C Neha Doshi Sr. Technical Recruiter Sierra Business Solution Five two zero - two eight eight - eight one two eight neha(at)sierrasoln(dot)com
Mar-31-20
Work with product managers, analysts and clients to better understand the business problem Think through a business problem and formulate an approach based on success criteria; design experiments Collaborate with other Data Scientists and Analysts on contemporary data science research that could be applied to valuable business opportunities; discover potential enrichment data Perform exploratory data analysis to validate business understanding and context Test data for signal; feature selection and correlation analysis Train and tune the model and set up meaningful tests for assessing model health Collaborate with product managers to find the best way to present results REQUIRED SKILLS/EXPERIENCE Advanced understanding of time-series forecasting (ARIMA models) for commodities/equities trading or supply chain demand planning Experience with advanced techniques in non-linear dimensionality reduction and/or manifold learning to leverage large amounts of enrichment data for feature extraction Experience with cloud-based development environments for storage and model training; AWS, Google Cloud, Azure An advanced degree (Master's or PhD) in Physics, Operations Research, Mathematics, Neuroscience, Computer Science, or Statistics. PhD
Mar-31-20
Position: Data Architect - Tableau Duration: 6+ months Location: Merrimack, NH Description: This role will assist the WI Workplace Consulting – Client Technology Team with database design and architecture to support external, web-based data visualization projects. The candidate will demonstrate proficiency in developing optimized data architectures and database modeling to ensure the data warehouse is scalable and efficient. Coordinating with Tableau Data Stewards, this role will assist with modeling, structuring, and optimizing sources of data that connect to Tableau. The Team: Workplace Consulting is a group within the client's Workplace Investing business unit and was created to serve and partner with the client's strategic external clients on their data and technology needs. The Client Technology Team, housed within Workplace Consulting, is primarily staffed with Technical Consultants, whose dynamic and diverse skillsets are leveraged to complete advanced and challenging projects in partnership with both internal and external partners.
The Expertise You Have: Expertise in relational database modeling, data warehousing, and performance tuning using MariaDB, Microsoft SQL Server, Oracle, and Microsoft Azure Expertise in ETL concepts, specifically with Informatica PowerCenter In-depth knowledge of a variety of analytics techniques including data mining, modeling, statistical analysis, data visualization, and SQL code Experience with Tableau Desktop, Tableau Server, Tableau Prep Excellent written, oral communication, project management and presentation skills, and the ability to present complex data and statistics in a simple and clear way Strong background in Microsoft Office products Strong verbal and written communication skills A willingness to find uncommon solutions to complex problems, and an ability to work with little or ambiguous direction The Skills You Bring: You are a seasoned Data Architect who strives for the best outcomes and change while being cautious by understanding risks and threats Experience in defining technology strategies, blueprints, roadmaps and collaboratively defining solutions and enabling architectures Proven hands-on expertise in Data Architecture, Data Modeling, Database architecture, DB Design, Database programming (SQL/PL-SQL, etc.) and batch processing in a distributed client-server environment Expertise in architecting Master Data Management, Operational Data Stores, and Data Warehousing solutions Expertise designing batch and event-driven processes, ETL and Data Quality processes Suggest architectural, design and coding changes to improve the performance of the Tableau dashboards.
Working knowledge of service architecture, API, Security infrastructure Proven leadership skills, demonstrated ability to mentor, influence and partner with application architects, engineering, and product teams to deliver scalable robust data solutions Collaborate with Data Leads and Consultants to define the technical solution for data access, data movement, and data transformation Promote opportunities to provide data reuse, managing the issues of centralization and replication The Value You Deliver You deliver products that allow internal and external partners to make informed business decisions You assist Client Technology Team members as needed during the project lifecycle You provide opportunities through innovation through advanced analytics and implementation of new technologies and techniques
Mar-30-20
Data Modeler  Durham, NC
Job Title: Data Modeler Location: Durham, NC Duration: 6+ months Description: We are seeking a candidate who will provide data analysis leadership on complex data analysis projects, often across systems and companies. They will be responsible for gathering data requirements and translating those requirements into business solutions which ensure coordinated, accurate data within the organization. Data modeling work will include design documentation, logical and physical models, including OLTP, dimensional and object models, data strategies, standards and procedures, and data definition language (DDL). Are you ready to take your career to the next level? Primary Responsibilities: Capture requirements from business and technical staff to analyze data requirements and recommend appropriate solutions. Document database solutions and present solutions to project teams, the Data Engineering team and architecture review boards; Integrate data into the existing enterprise logical model and physical data stores to avoid data redundancy. Ensure all technical database decisions result in coherent system designs which use the most effective methods and tools; Use a modeling tool to capture clear definitions of data elements and produce code to effect database changes. Implement data solutions to satisfy business requirements and act as a mentor, providing guidance and leadership to junior members of the team. Perform research into emerging technologies, define data solution standards and guidelines, and drive technical approaches at a tactical level, serving as the data engineering member on FI and cross-Client systems projects.
Provide data warehousing / dimensional modeling and OLTP modeling expertise as well as write and maintain the business rules, logic and SQL queries Education and Experience: Degree required along with 10 years of experience in a technical field (analysis, development, database administration, report development). 6 years' experience in relational data modeling and data analysis preferred. Skills and Knowledge: Knowledge of the financial services business; knowledge of the institutional brokerage business preferred. Excellent requirements gathering and written and verbal communication skills; Ability to manage workload across multiple projects and balance deadlines Excellent presentation and negotiation skills; Expert knowledge of systems development methodologies, specifically Agile. Dexterity in using data modeling tools such as PowerDesigner or Erwin; experience with PowerDesigner preferred. Knowledge and understanding of different database platforms, such as Oracle, Postgres, Hadoop, Snowflake, Netezza, DB2; Advanced ability to use SQL to query data on any of the above DBMSs. Knowledge of and in-depth exposure to modeling strategies, such as relational (logical, physical), star, snowflake, and unstructured, to support OLTP, Warehousing and Data Marts, Data Lakes, MDM
Mar-30-20
SAS Consultant  Minneapolis, MN
SAS - SAS/Macro, SAS/SQL, Enterprise Guide (SAS cert) SQL RDBMS BA/PM/Analytics - Medicare exp nice to have Description: This position is part of a team responsible for management and execution of our provider incentive programs known as Additional Compensation Programs (ACP) including system setup and maintenance; incentive reporting, calculation and distribution; and incentive research and reconciliation. As part of this team, the Business Analyst Consultant will be involved in the processes around the analysis, design, development, implementation, testing, support and maintenance of the systems. This includes system development, detail testing, data cleansing and data analysis for new or revised business/system processes. The Business Analyst Consultant will be responsible for the delivery of accurate, network specific claims detail including all support, both internal and external related to the data. This position is also responsible for turning business problems into understandable, actionable analyses; identifying underlying causes and potential mitigation strategies for medical cost trends, based on broad knowledge of healthcare issues including benefit designs, contracting methodologies, and reimbursement policies; and perform statistical analyses to identify root causes of medical cost trends such as unit cost issues and provider issues. Major Responsibilities Provide accurate and complete claims detail to networks participating in select M & R incentive agreements Understand, research, and assist in problem resolution as needed. Develop/maintain training programs and strategies for staff. Design and develop programs in SAS required to administer compensation agreements. Create/maintain programs supporting the compensation calculation and payment process. Ensure systems and applicable environments are stable in order to prepare for potential growth in compensation arrangements. 
Ability to research both internal and external questions and provide input and suggestions for system sustainability, programming, new processes, etc. Communicate with other teams both internally and externally involved with the process and provide ongoing support for request tracker Independently develop programs and queries to run analytics for the compensation program. Administer/Set up new networks in Mainframes. Identify and solve potential program/data problems ensuring appropriate follow through and resolution. Ensure sufficient documentation is available (and provided) to support audits including current/historical documentation of programming and system changes and additions to support the ACP program. Perform analyses of issues, participate in development of potential solutions, and make recommendations to ensure accurate and timely resolution. Recommend and drive process improvements. Generally work is self-directed and not prescribed. Serves as a technical resource to others. Experience with financial operations is beneficial. Other projects as assigned. Qualifications Undergraduate degree in Information Technology (or related field) or equivalent experience. Exceptional analytical and critical thinking skills. Ability to quickly analyze, interpret and implement incentive plans. Working knowledge in Base SAS/ SAS Macro, SAS/SQL, Enterprise Guide and UNIX SAS Programmer Certification preferred Part D / Rx claims knowledge preferred Relational Database system knowledge : DB2, Oracle and Teradata Ability to learn and understand the interdependencies of computer systems quickly. Strong verbal/written communication skills. Ability to communicate effectively with multiple levels within and outside the company. Possesses written communication skills enabling independent handling of correspondence and proofreading of complex legal agreements, letters, reports and other documents. Ability to help others work independently, prioritize work and meet deadlines. 
Demonstrated customer service skills in a professional business environment. Tact, diplomacy and sensitivity to respect confidential information. Work schedule flexibility, including availability for overtime as needed. Participation in the development/maintenance of computer systems. Previous experience with incentive or commission processes and systems is preferred. Previous experience with Medicare and/or the healthcare insurance industry preferred. Anticipates customer needs and proactively identifies solutions. Solves complex problems independently. Competencies and Best Practice for High Performers (list competencies and at least two behavioral anchors for each): Job Related Knowledge o Strong knowledge of all Medicare & Retirement policies and regulatory requirements for physician incentive programs. o Strong knowledge of compensation plans and processes. o Working knowledge of Medicare products. o Strong knowledge of other finance processes. Management o Strong abilities in training, development and monitoring of staff, new hire and ongoing. o Strong personnel and team development and management. o Develop strategies to achieve team goals. o Ability to make and communicate difficult or unpopular decisions Internal Operations & Capabilities o Ability to navigate Medicare & Retirement internal departments to produce positive results. o Extensive knowledge of incentive program roles/responsibilities and processes. Personal Attributes o Detail-oriented. o Consistent, accurate performer. o Ability to multi-task. o Quickly grasps new and/or complex ideas and products. o Strong communication skills (written and verbal). o Drives change. o Consistently demonstrates flexibility. o Strong planning and organization skills. o Strong problem solving ability. o Strong analytical skills. o Drives toward deadlines. o Ability to work in a fast-paced environment. o Team player. o Ability to adapt to change. o Ability to deal with difficult people. o Exceptional computer skills.
o Can be relied upon to act ethically, to safeguard confidential information and to adhere to company code of conduct and all legal regulatory requirements.
Mar-30-20
Data Modeling/Architecture Brooklyn, NY Duration: 24+ months Requirements: 84 Months experience in Data Modeling/Architecture (conceptual, logical, physical, understanding of the data) 84 Months experience in Database Management performing installation, configuration, customization, implementation and support of SQL Server Database. 84 Months experience in creating Data Dictionaries and Data Mapping documents. 84 Months experience and expertise in monitoring and troubleshooting ETL, DTS, SSIS and Data Warehouse queries. 84 Months experience in SQL Server Performance Tuning, Query Optimization and Production support. 84 Months experience writing complex Stored Procedures/Functions in PL/T-SQL and MS SQL. Writing complex Shell Scripts to support SQL and other functions. Bachelor's Degree
Mar-30-20
Role: Informatica Developer Location: Milwaukee, WI Interview: Phone & Skype Duration: 6-12+ Months Tax Term: W2 or 1099 Only Note: Initially it would be 1-2 months remote, then onsite once the situation is under control. Job Responsibilities: Develop and facilitate functional design documents for the interfaces based on BRDs (business requirement documents) submitted from the business. Follow SDLC best practices in the development of technical specifications and other supporting documents for the project Assure document quality and coherence across the technical design. Refine documents to hand over to the development team(s). Work with the System Design Architect, Business Analysts and other development team(s) to ensure design coherency. Skills and Experience Required: Mandatory Skills: 6-8 years of experience using the Informatica tool Experience in the development of interfaces Logical thinking capability and strong documentation skills. Excellent communication skills to coordinate with different groups Desired Skills: Projects/experience in the financial industry preferable.
Mar-30-20
POSITION: INFORMATICA CONSULTANT with Power Center LOCATION: RICHMOND, VA DURATION: LONG TERM Informatica Powercenter with SQL or PL/SQL. Unix/Perl will be add-on
Mar-30-20
Job Description: Develop and manage big data pipelines using the Microsoft technology stack, such as Cosmos and Azure Data Factory. Manage high-volume, high-traffic GDPR solutions built using Azure Functions. The role provides an excellent opportunity to work with a top-class data team which focuses on business growth using data. Must Have Skills: C# required SQL required Azure Functions
Mar-30-20
Data Analyst  Summit, NJ
($) : Market
Data Analyst Summit NJ 12+ months (remote to start due to COVID, then onsite in NJ) 7+ years of experience writing SQL queries to perform complex data analysis independently using Teradata and/or Oracle Experience with Informatica Master Data Management tools Experience with Customer/Party Master and Reference Data Management
Mar-29-20
Hi, Please find the below JD & let me know your response ASAP!! Title: Informatica ETL/Hadoop/UNIX/DB Team Lead Location: NYC Type: Full Time Below is the JD: There is no proper JD; client need: Primary Must Have Tech Skills with minimum years of experience in each: Hadoop, Sqoop, Hive, Spark, Dataiku, Scala, SQL Secondary Tech Skills with minimum years of experience in each: Informatica ETL, Unix, DB Other nice to have requirements: Kafka, excellent business communication Note: Please send me your response to prasanth@cyspacetech.com or please call back to CST provides its clients with complete, cost-effective, end-to-end personnel solutions across a range of industrial domains. CST's mission is to empower businesses around the world to make better, faster operational decisions.
Mar-29-20
Hi, Please find the below JD & let me know your response ASAP!! Title: Informatica ETL/Hadoop/UNIX/DB Team Lead Location: NYC Type: Full Time Below is the JD: There is no proper JD; client need: Primary Must Have Tech Skills with minimum years of experience in each: Informatica ETL/Unix/DB, SQL Secondary Tech Skills with minimum years of experience in each: Basic Hadoop with 1+ years of experience Note: Please send me your response to prasanth@cyspacetech.com or Prasanth@cyspacetech.com CST provides its clients with complete, cost-effective, end-to-end personnel solutions across a range of industrial domains. CST's mission is to empower businesses around the world to make better, faster operational decisions.
Mar-29-20
Data Analyst  San Jose, CA
($) : Market
Job Title: Data Analyst Location: San Jose, CA Exp: 8 to 10 years Duration: Long Term Millions of customers around the world use Adobe products. With a growing myriad of consumers, small businesses and large enterprises trying out Adobe products every day, making repeated purchases and managing their subscriptions, these commerce activities generate a ton of invaluable data. This data will translate into actionable insights to improve customer satisfaction, informing the next multi-million dollar opportunity for Adobe. As a Data Analyst in the Commerce product management team, you will be responsible for leading this data-driven movement to radically change the way we operate. You need to be able to rise above the numbers and focus on the most important questions and insights. You enjoy finding relationships among disparate data and generating absolute clarity out of a muddied data environment. You are highly proficient in statistical methods, well-versed in visualization techniques and building dashboards, and feel at ease with cloud-based big data environments. While you rely on data to prove your point, you also love to solve problems creatively. You thrive operating in an ambiguous environment, and you are excited by the challenge of unveiling new insights. To be successful, you constantly ask important questions, remove noise in the data, keep learning and fine-tuning your analyses/dashboards, and seek opportunities to share knowledge with others. If you fit the description, come join us on this exciting journey!
Responsibilities: Acquire data from primary or secondary sources, and maintain databases and dashboards to unlock operational and exploratory analyses Dig in deep to analyze root causes of unusual trends, dips and spikes for all Commerce metrics Summarize your findings in an accurate, concise, easy-to-understand manner Handle the detailed execution of data gathering, dashboard implementation and bug fixing Foster a culture of easy access to data and autonomy in obtaining answers by implementing tools that help others Maintain existing data visualizations, data pipelines and dashboards Requirements: Senior candidate with 8-10+ years of data analysis experience, focused on data massaging, extraction, filtering and visualization Experience working with multiple different data sources Very strong experience with Hadoop, SQL queries, Power BI, Tableau Experience working with large revenue data would be a good plus Proven track record as a high-performing data analyst who can thrive in a fast-paced environment Comfortable working with proxy data, incomplete data, normalizing and joining datasets from different sources Well versed with databases (MySQL, SQL Server, Hadoop) and adept with modern Business Intelligence and Visualization tools (Microsoft Power BI, Tableau, Amazon QuickSight) Familiar with web analytics technologies and techniques (Adobe Analytics, Google Analytics, digital pixel tracking, site tagging, etc.) Strong analytical abilities; Collect, prioritize, analyze, and disseminate critical information with attention to detail and accuracy Excellent communication skills with expertise in data visualization, trend analysis, forecasts, statistical testing and data storytelling Collaborate with others to understand, identify and translate business challenges into data projects Bachelor's and/or Master's degree specializing in Statistics or Data Science. Based at the Adobe headquarters in San Jose, California.
Mar-29-20
($) : Market
• 7+ years total work experience • 5+ years of work experience in MDM • Experience must include defining standardization rules, match & merge rules, configuring thresholds, configuring events and integration of data to & from MDM • Experience with Informatica MDM v10 preferred; 9.x is a must • Experience in messaging and web service integrations • Java experience and hands-on experience in creating user exits for MDM • Experience with multi-domain implementations • Experience with Oracle RDBMS • Experience tuning all aspects of MDM processing • Knowledge of or experience with DQ tools like Informatica IDQ or IBM QualityStage • Prefer some experience with Oracle DB tuning • Unix and/or other scripting experience • Prefer experience developing operations support guides • An agile mindset with experience working in an agile environment • A spirit of collaboration and transparent communication Need the below information: How many years as an Informatica MDM Developer How many years in MDM - MUST HAVE! How many years in JAVA - MUST HAVE! How many years in Informatica Data Quality (IDQ) How many years in ETL - MUST HAVE! How many years in SQL How many years in PL/SQL Candidate hourly rate Candidate visa status Candidate current location (city/state) Skype timeslots for the next 3 business days post 6 pm EST, please specify Whether the candidate will relocate or travel (NO T&E PAID BY CTS); if travel, what percentage (be precise: which days/months onsite) Total IT experience Will he relocate?
Mar-29-20
Data Analyst  Sunnyvale, CA
($) : Market
Position: Data Analyst Work Location: Sunnyvale, CA Contract duration: 6 months Must Have Skills: - Must be proficient in data insights, data visualization and data analysis. - Define data standardization, consistency and taxonomy; drive data warehouse enhancements. - Drive definition and design of Power BI reporting and dashboarding needs. Detailed Job Description: - Understand data elements and requirements to achieve business needs, identify dependencies and drive the implementation roadmap. - Develop and implement reporting to track key metrics and quickly identify trends to drive proactive resolutions
Mar-29-20
($) : DOE
For one of our ongoing projects we are looking for a Software Engineer - ETL. Responsibilities: Description Looking for ETL Developer with 3 to 4 years of experience and at least 1-2 years of Production Support experience. The hire will be responsible for supporting Data ingestion into Enterprise Data Lake using ETL tools Ab initio, Talend, handling production support operations, incident management and root cause analysis of issues. Responsibilities: • Hands-on experience to perform L1, L2 production support 24X7 from USA EST/PST time. Work with various Ab Initio tools including Conduct It, Control Center Work on Monitoring tools like Splunk or Tableau Support Cluster monitoring, maintenance and troubleshooting. Communicating on production support issues and managing tickets, resolutions and reporting metrics Requirements: • Bachelor’s degree in Computer Science or a related field and/or equivalent experience. 2 to 4 years of experience in ETL Ab initio tools 1-2 years of experience on production support projects Flexible to work in support Shifts 24X7 Strong in Unix, SQL, any 1 of scheduling tool and should have basics of Hadoop Good oral and written communication skills Experience on data ingestion into Enterprise Data Lake on Big Data platform (Horton works Data platform) This role requires strong technical skills specific to Ab initio, Unix, SQL, Autosys and should have good knowledge and experience of the software development lifecycle. The candidate must be a team player, self-motivated with the ability to meet deadlines, problem solve, and learn quickly within a mature technology environment. Good understanding of Business Intelligence solutions for Customers
Mar-28-20
($) : Market
JD · Developing a Test Data Management strategy and solution to adhere to compliance/regulations on PHI/PII/PCI data usage in non-prod environments and to optimize the overall efficiency of the testing strategy, defining or applying data generation, data masking, and data sub-setting following various data provisioning strategies. · Data model analysis of large databases with up to 1000 tables, and sensitive data analysis for large and varied databases and files. · Create subset plans and build subset queries for large databases with more than 1000 tables · Build data generation algorithms/rules and manage seed lists using CA TDM/Informatica TDM/Delphix/IBM Optim. · Develop detailed TDM solutions for database platforms (SQL Server, Oracle, Sybase, DB2, VSAM, Mongo, etc.), messaging platforms (e.g. MQ), and feeds (e.g. XML, flat files) · Provisioning test data through data-mining/generation processes for the various testing teams as per their data needs using CA TDM/Informatica TDM/Delphix/IBM Optim. · Build custom code as needed using Java and Windows scripting to support out-of-box TDM solution builds · Building or maintaining a gold copy repository; developing, maintaining and enhancing TDM capabilities by employing repeatable processes, a governance framework, automation, and various tools/utilities. · High-level knowledge of Test Data Automation (self-service provisioning of test data) and TDOD · Interaction and coordination with development and testing stakeholders, business analysts and end-users over environmental and data issues.
• Architect TDM (Test Data Management) solutions for data provisioning for domain- and enterprise-wide applications, including data generation, data discovery, data masking, and data subsetting; architect gold-copy databases and data stores for quick data provisioning; implement and support technical capabilities, repeatable processes, and best practices for test data management.
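The masking duties above hinge on one property worth illustrating: masked values must stay deterministic so that foreign keys still join after provisioning a subset. A minimal Python sketch, with invented table and column names (real work would use CA TDM, Informatica TDM, Delphix, or IBM Optim):

```python
import hashlib

def mask_value(value: str, salt: str = "tdm-demo") -> str:
    """Deterministically mask a sensitive value: the same input always
    yields the same token, preserving referential integrity across tables."""
    digest = hashlib.sha256((salt + value).encode()).hexdigest()
    return "MASKED-" + digest[:10]

# Hypothetical customer and order rows; names are illustrative only.
customers = [
    {"id": 1, "ssn": "123-45-6789", "name": "Alice"},
    {"id": 2, "ssn": "987-65-4321", "name": "Bob"},
]
orders = [{"order_id": 10, "cust_ssn": "123-45-6789"}]

for row in customers:
    row["ssn"] = mask_value(row["ssn"])
for row in orders:
    row["cust_ssn"] = mask_value(row["cust_ssn"])

# The masked SSN still joins customers to orders.
assert customers[0]["ssn"] == orders[0]["cust_ssn"]
```

Because the token is derived from the value plus a salt rather than generated randomly, the same mask can be reapplied across databases and refresh cycles without breaking joins.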
Mar-28-20
Hi, this is Sarath from AVTECH Solutions Inc. Hope you are doing great! Please review the following job description and let me know if you are available and willing to apply.

Role: ETL Developer
Location: Carlsbad, CA
Duration: Long Term

Job description:
• 8+ years of ETL experience, Informatica preferred
• Good data analysis and data profiling experience
• Ability to work directly with business users to understand their requirements to perform data engineering work
• Good communication skills
• Advanced skills in writing SQL queries
• Experienced as tech lead for an onsite/offshore group of data engineers or ETL developers

Thanks & Regards, Sarath | IT Recruiter AVTECH Solutions Inc. Phone-Ext(509) Email:
Mar-28-20
Logisoft Technologies, Inc. is a global information technology, consulting, and services company with a 100% satisfaction rating from clients across the USA. Logisoft was established with a commitment to fulfill the IT resource management needs of organizations. In the current industry trend, a growing number of organizations focus on their core business processes and outsource their IT business processes. We represent Logisoft Technologies Inc. with pride as a premier technology, consulting, product development, and software services company. Our head office is located in South Plainfield, NJ, with offices in Hyderabad, India; Accra, Ghana; and South Africa. We are an official Microsoft partner; as a Microsoft Certified Partner we help customers with a range of IT projects and specific IT solutions.

JOB ID: LS_IE_DR_WI_27
Job Title: Informatica Engineer
Location: Milwaukee, WI
Duration: Long Term Contract

Required Skills:
• Experience using the Informatica tool
• Experience in the development of interfaces
• Logical thinking capability and strong documentation skills
• Excellent communication skills to coordinate with different groups
• Projects/experience in the financial industry preferable

LOGISOFT Technologies is committed to a policy of Equal Employment Opportunity and will not discriminate against an applicant or employee on the basis of age, sex, sexual orientation, race, color, creed, national origin, ancestry, disability, marital status, or any other legally protected basis under federal, state or local law.
Mar-27-20
($) : DOE
Job Description:
• Good experience in data modeling concepts
• Good knowledge in data analysis
• Well versed in OLTP and OLAP data modeling; strong knowledge of entity-relationship concepts
• Strong database knowledge, RDBMS concepts, and data warehousing skills, including SCD-type transformations
• Strong experience in data analysis and implementation of business rules
• Good experience in all phases of the software development life cycle (analysis of requirements, design, development, verification and validation, deployment)
• Expertise in ETL (Informatica PowerCenter), Unix/shell programming, and SQL scripting
• Expertise in scheduling tools such as Maestro, Jenkins, and Control-M
• Experience in design/development of RDBMS using Teradata and Vertica
• Ability to work effectively and efficiently in a team and individually, with excellent interpersonal, technical, and communication skills
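The SCD transformation mentioned above is most commonly the Type 2 pattern: instead of overwriting a changed attribute, the current dimension row is expired and a new current row is inserted, preserving history. A minimal in-memory sketch under assumed column names (production implementations live in Informatica mappings or SQL MERGE statements):

```python
from datetime import date

def apply_scd2(dimension, incoming, today):
    """Slowly Changing Dimension Type 2: expire the current row when an
    attribute changes, then insert a fresh current row."""
    current = next((r for r in dimension
                    if r["customer_id"] == incoming["customer_id"]
                    and r["is_current"]), None)
    if current and current["city"] != incoming["city"]:
        current["end_date"] = today           # close out the old version
        current["is_current"] = False
        current = None                        # force an insert below
    if current is None:
        dimension.append({"customer_id": incoming["customer_id"],
                          "city": incoming["city"], "start_date": today,
                          "end_date": None, "is_current": True})
    return dimension

dim = [{"customer_id": 1, "city": "Austin", "start_date": date(2019, 1, 1),
        "end_date": None, "is_current": True}]
apply_scd2(dim, {"customer_id": 1, "city": "Dallas"}, date(2020, 3, 27))
# dim now holds the expired Austin row plus a current Dallas row.
```

An unchanged incoming record leaves the dimension untouched, so the routine is safe to rerun, which matters for the restartable batch loads these roles describe.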
Mar-27-20
Data Analyst  Summit, NJ
($) : TBD
Title: Data Analyst
Location: Summit, NJ
Duration: Long Term
• 7+ years of experience writing SQL queries to perform complex data analysis independently using Teradata and/or Oracle
• Experience with Informatica Master Data Management tools
• Experience with Customer/Party Master and Reference Data Management
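The kind of analytical SQL this role asks for often leans on window functions. A small self-contained sketch, using Python's bundled sqlite3 purely as a stand-in for Teradata/Oracle, with invented table and column names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer_id INT, order_date TEXT, amount REAL);
INSERT INTO orders VALUES
  (1, '2020-01-05', 100.0), (1, '2020-02-10', 250.0),
  (2, '2020-01-20', 80.0),  (2, '2020-03-01', 40.0);
""")

# Rank each customer's orders by amount with a window function --
# the per-partition analysis typical of this kind of role.
rows = conn.execute("""
SELECT customer_id, amount,
       RANK() OVER (PARTITION BY customer_id ORDER BY amount DESC) AS rnk
FROM orders
ORDER BY customer_id, rnk
""").fetchall()
# rows -> [(1, 250.0, 1), (1, 100.0, 2), (2, 80.0, 1), (2, 40.0, 2)]
```

The same `RANK() OVER (PARTITION BY ...)` shape runs on Teradata and Oracle, where it is usually combined with qualify/filter clauses to pull, for example, each customer's largest order.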
Mar-27-20
ETL Developer  San Jose, CA
($) : DOE
8-10 years of BI experience with an ETL background. An ETL background with Oracle EBS as a source is required; Oracle Order-to-Cash domain knowledge is a plus. 5 years of experience with some ETL tool; BODS experience is a plus. Will lead a team from offshore.
Mar-27-20
($) : Market
Role: Data Scientist
Location: Mooresville, NC
Duration: Long Term

Required:
• Data scientists with a PhD in a quantitative field (Physics, Chemistry, Biology, Statistics, Computer Science)
• Should know how to code in Python
• Some knowledge of machine learning operations
• Experience building machine learning models
• Will be building supply chain, merchandising, and demand forecasting models

Quantitative and technical interviews will be held. Thanks, Bhaskar
Mar-27-20
Title: Data Analytics Developer
Location: Atlanta, GA
Duration: Long Term
MOI: In-person

Job Description: The Georgia Department of Human Services (DHS) is seeking qualified candidates for the position of Data Analytics Developer in Atlanta, Georgia. We seek candidates with a combination of strong knowledge of data models, software engineering experience, and machine learning skills. The Data Analytics Developer will create innovative reports, data visualizations, and analytical solutions that deliver actionable information to the business.
• Has implemented advanced analytical models in Python or R
• Expertise and solid experience in BI tools: Power BI and OBIEE
• Strong data visualization concepts and techniques
• Excellent analytical, conceptual, and problem-solving abilities
• Highly self-motivated, self-directed, and attentive to detail
• Strong Informatica technical knowledge in design, development, and management of complex Informatica mappings
• Strong programming and relational database skills, with expertise in advanced SQL and PL/SQL, indexing, and query tuning
• Experienced in Business Intelligence and data warehousing concepts and methodologies
• Extensive experience in data analysis and root cause analysis, with proven problem-solving and analytical-thinking capabilities
• Analytical capability to slice and dice data and display it in reports for the best user experience
• Demonstrated ability to review business processes and translate them into BI reporting and analysis solutions
• Ability to follow the Software Development Lifecycle (SDLC) process and to work under whatever project management methodology is used
• Ability to follow best practices and standards
• Ability to identify and tune BI application performance bottlenecks
• Ability to work quickly and accurately under pressure and project time constraints
• Ability to prioritize workload and work with minimal supervision
• Basic understanding of software engineering principles; skills working on Unix/Linux/Windows operating systems, version control, and office software
• Experience with Big Data Lake / Hadoop implementations

Required Qualifications: A bachelor's degree in Computer Science or a related field

Preferred Qualifications:
• Experience with relational, multidimensional, and OLAP techniques and technology
• Experience with Python, building predictive models
• Experience with visualization tools like MS Power BI, Tableau, Oracle DVD
• Experience with OBIEE tools version 10.x
• Working knowledge of Informatica Change Data Capture installed on DB2 z/OS
• Working knowledge of Informatica Power Exchange

Soft Skills:
• Strong written and oral communication skills in the English language
• Ability to work with the business and communicate the technical solution to solve business problems

Skill (Required / Desired, Amount of Experience):
• Bachelor's degree in Computer Science or a related field (Required)
• Experience with relational, multidimensional, and OLAP techniques and technology (Highly desired, 3 years)
• Experience with visualization tools like MS Power BI, Tableau, Oracle DVD (Highly desired, 3 years)
• Knowledge of Informatica Change Data Capture installed on DB2 z/OS (Highly desired, 3 years)
• Knowledge of Informatica Power Exchange (Highly desired, 3 years)
Mar-27-20
($) : Market
Job Title: Informatica MDM Consultant
Location: Washington, DC
Duration: Full time
No. of Positions: 01

Role Description:
• Should have at least 5-7 years of hands-on experience in Informatica MDM
• Should have good communication skills, analytical ability, and problem-solving skills
• Should be able to articulate issues and propose viable solutions
• Should be familiar with ITSM concepts such as Incident/Service/Change/Problem/Knowledge Management
• Should be a self-starter who can work independently
• Excellent hands-on experience with the Informatica MDM tool to create master data from various source systems in real time
• Excellent skills with the MDM Services Integration Framework, Business Entity Services, configuration, and Data Director
• Work with the Informatica DQ tool to profile, monitor, and control data quality
• Experience with Informatica Analyst
• Good to have: Java programming knowledge

Skills Required:
• Experience with the Spring Boot framework
• Experience with SIF SOAP API calls for real-time data integration
Minimum Experience: 7 years

Thank you, Best regards, Morris Kemp D: Office EXT: -143 | E: Certified Women Owned Minority Business Enterprise {WMBE} 3868 Carson Street, Suite 204, Torrance, CA 90503 | Offices: USA, India, Australia W: http://www.Vedainfo.com | This email and any files transmitted with it are confidential and intended solely for the use of the individual or entity to whom they are addressed. Please note that any views or opinions presented in this email are solely those of the author and do not necessarily represent those of the organization. If you are not the intended recipient of this email, you must neither take any action based upon its contents, nor copy or show it to anyone. Please contact the sender if you believe you have received this email in error.
Mar-27-20
Data Analyst  Irvine, CA
($) : Market
Role: Data Analyst
Location: Irvine, CA
Duration: Contract
Must-Have Skills:
1. Data warehouse
2. AWS
3. Data analytics
4. Capital markets (any financial services or accounting knowledge is workable)
Mar-27-20
Role: Informatica MDM Admin and Developer
Location: Minneapolis, MN
Job Description:
• Participates in administrative activities related to the MDM platform, including MDM Hub, Process Server, ActiveVOS, Provisioning, and IDD
• Design, installation, configuration, and administration of Informatica Platform v10 or higher (currently on v10.2) on Linux; installation experience with Informatica MDM 10.x preferred
• Leads software upgrades, implementation of hotfixes, implementation of new software offerings and infrastructure, and maintenance, and coordinates testing activities with project teams
• Provides recommendations for capacity modifications, and documentation for installation and configuration, including network and supporting data stores
• Creates and maintains architecture diagrams; Informatica/Data Integration/Data Quality tools and UNIX troubleshooting; automating daily tasks
• Informatica/Data Integration/Data Quality tools security
• Informatica MDM platform administration and integration support
• Coordinates patching and other infrastructure-related activities with different teams
• Monitoring of servers and services
• Responds quickly and effectively to production issues by taking responsibility for seeing those issues through to resolution
Mar-26-20
($) : open
• Requirement analysis, estimation, scoping, design, review, and performance tuning
• Highlighting and tracking issues and risks in a timely fashion
• Code, test, modify, debug, document, and implement Ab Initio graphs utilizing the GDE environment and EME
• Develop scripts to automate the execution of Ab Initio graphs using shell scripts under a UNIX environment
• Develop technical specification documents
• Help develop test cases and plans to complete unit testing and support system testing
• Participate in design reviews, code reviews, unit testing, and integration testing
• Develop code migration strategies from current Java Batch/PL-SQL code to Ab Initio
• Experience with other products in the Ab Initio suite, such as Control Center, ACE, BRE, and Metadata Hub
• Exposure to open-source ETL tools like Talend and Pentaho, and to the Hadoop concepts of HDFS and HBase

Basic Qualifications:
• 6+ years' experience in distributed systems development using Ab Initio software
• Bachelor's degree or higher in computer science, engineering, or a related technical field
• Strong shell scripting skills, with experience encapsulating Ab Initio code for execution via automated schedulers
• Working knowledge of relational database management systems and experience writing SQL
• Good knowledge of core Java and JEE, and web services
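The graph-automation duty above usually means wrapping the deployed graph's generated script for a scheduler. A minimal sketch in Python (the `ksh` interpreter and the graph script path are assumptions; deployed Ab Initio graphs are typically executed through the .ksh wrapper scripts the GDE generates):

```python
import subprocess
import sys

def run_graph(graph_script: str, interpreter: str = "ksh",
              max_retries: int = 1) -> int:
    """Execute a deployed graph wrapper script, retrying once on failure
    and surfacing stderr for abend troubleshooting."""
    result = None
    for attempt in range(max_retries + 1):
        result = subprocess.run([interpreter, graph_script],
                                capture_output=True, text=True)
        if result.returncode == 0:
            return 0
        print(f"attempt {attempt + 1} failed: {result.stderr.strip()}",
              file=sys.stderr)
    return result.returncode
```

A scheduler such as Autosys or Control-M would call an entry point like this once per job; the retry count and logging here are placeholders for real operational policy.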
Mar-26-20