Job Description:

Hi

 

We have multiple AWS Data Engineer openings with our client. Please let us know if you are comfortable with any of the roles below, and share your updated resume along with your expected pay rate and contact details.

 

 

Position 1:

 

AWS Data Engineer

Location: West Chester, PA

Experience: 6 to 10 years

Skill Description: AWS Glue, PySpark, Python, DynamoDB, Neptune DB

 

 

• Looking for an AWS Data Engineer who can understand the AWS landscape, analyze data issues, and work with the team to resolve them.

• Hands-on experience with AWS services such as Glue, EMR, Lambda, Step Functions, EventBridge, CloudTrail, CloudWatch, SNS, SQS, S3, VPC, EC2, RDS, and IAM roles; graph database experience.

• Proficient in at least one programming language (Python, Java, Scala); experience with version control systems such as Git.

• Experience in ETL / data application development; experience working with Spark and real-time analytics frameworks; willingness to continuously learn and share learnings with others.

• 5+ years’ experience delivering and operating large-scale, highly visible distributed systems.

• Strong troubleshooting and performance-tuning skills. Experience building multi-tenant, virtualized infrastructure is a strong plus.

• Staying up-to-date on the latest process and IT advancements to automate and modernize systems.

• Conducting meetings and presentations to share ideas and findings. Performing requirements analysis. Documenting and communicating the results of your efforts.

• AWS Glue • PySpark • Python • DynamoDB • Neptune DB • AWS CLI • Analytical Skills & Problem Solving • Communication

 

 

 

Position 2:

 

AWS Data Architect

Location: Chattanooga, TN

Experience: 6 to 10 years

Skill Description: AWS, Data Architect, AWS Big Data, Snowflake

 

 

Sr. Architect - Snowflake AWS Data Architect

Lead solution/application architecture for medium to large engagements, deliver technically complex applications, identify and institutionalize best practices across multiple accounts, and manage multiple customers and architects, with the objective of delivering technically sound projects across one or multiple customers within the guidelines of the customer and Cognizant standards and norms.

Innovation and Thought Leadership

- Participate in external forums (seminars, paper presentations, etc.) to showcase Cognizant capabilities and insights

- Interact and engage with customers/partners around new and innovative ideas, concepts, and assets, as well as industry trends and implications

- Participate in beta testing of products / joint lab setups with customers to explore the application of new technologies/products

- Identify areas where components/accelerators or design patterns could be reused across different accounts

- Create documents, reports, and white papers (international/national) on the research findings

 

Technology Consulting

- Define the problem statement for the customer

- Analyze the application/technology landscape, processes, and tools, and arrive at the solution options that best fit the client

- Analyze the cost vs. benefits of solution options

- Define the technology/architecture roadmap for the client

- Articulate the cost vs. benefits of options to key customer stakeholders

 

Alliance Management

- Identify alliance partners based on the understanding of service offerings and client requirements

- Identify areas for joint GTM with the partner

- Develop internal capabilities/complementary toolsets to support the GTM strategy

- Maintain the relationship with partners

- Act as the Cognizant technical POC for the specific technology/solution area

 

 

 

Position 3:

 

Senior AWS Data Analyst

Location: Stamford, CT

Skill Description: AWS DevOps, Talend Platform, Talend Data Quality

 

1. Job Title: Cognizant is looking for a Sr. Data Analyst.

2. Job Summary: Understand the end-to-end flow of the data pipeline development and deployment process using Talend and DBT, and automate the whole build and release process of the developed Talend and DBT data pipelines across the SDLC.

3. Shift: 9 AM to 7 PM IST

4. Roles & Responsibilities: Sr. Developer. Bachelor's in science, engineering, or equivalent. Understand requirements, build code, support testing, and fix defects; guide developers in the course of development activities. Need a good understanding of DevOps flows, branching strategies, and build and deployment flows as per industry standards; need to be good at writing YAML scripts. Understand various functional and non-functional requirements and the HLD in order to provide inputs to create the LLD and review it. Recommend make/buy or alternate solutions. Provide inputs in designing the end-to-end solution from a technical perspective. Create a code development checklist. Prepare UI specifications / mock-ups for reports and dashboards, and provide guidance to team members/designers for creating data mapping documents or unit documents for ETL & BI. Provide inputs to data modeling based on the project understanding. Walk the customer through the design along with the PM and coordinate to seek sign-off on the design and other artifacts.

5. Demand requires travel?: No

6. Certification(s) required: No

 

 

 

Position 4:

 

AWS Data Architect

Location: Ashville, AL

Experience: 11 to 15 years

Skill Description: Informatica EDC, Azure Data Lake, IICS, Axon

 

Build the technical architecture in EDC (Data catalogue).

Perform scans of the data (Parquet, CSV) in Azure Data Lake and create the data catalogue in EDC

Define standards and best practices for EDC usage

Analyze the technical data architecture and existing business rules with client SMEs

Analyze the data sources, data types, and frequency required for ETL jobs

Help identify domains and connectors using the Informatica tool set.

Validate the technical feasibility of business requirements

Finalize business and technical requirements and constraints for design

Preferred Skills: IICS, Axon

 

 

Position 5:

 

Senior Data Engineers

Location: Waltham, MA / Atlanta, GA

Must have: Python and AWS

• AWS and big data skills: AWS, Spark, Python, HBase, Kafka, HDFS, etc.

• Experience building data pipelines for enterprise-grade data and analytics platforms that are highly scalable, compliant, and secure, with robust data quality, data governance, data discovery, catalog, and visualization capabilities.

• Experience in all facets of the software development life cycle, such as analysis, design, development, data conversion, data security, system integration, and implementation.

• Relational SQL and NoSQL databases, including Postgres and Cassandra

 

 

 

 

Position 6:

 

Sr. Data Engineer/ Architect

Java, J2EE, Spring Boot, Hadoop, Cassandra, Spark

Mandatory Skills

•  8 to 10 years of experience in data analysis and data management

•  Build and deploy data pipelines on Google Cloud to enable data standardization and data quality capabilities.

•  Responsible for implementation and ongoing support of Big Data & Cassandra infrastructure.

•  Work with data delivery teams to set up new big data pipelines. This includes setting up users, setting up Kerberos principals, and testing data pipelines.

•  Strong experience in Java, J2EE, Spring Boot, web services, Spark, Solr, Cassandra

•  Lead and drive the development of cloud-based data warehouses and data management platforms

•  Design, build, and manage data pipelines to ingest structured and semi-structured data.

•  Experience leading data warehousing, data ingestion, and data profiling activities

•  Collaborate with various business and technical teams to gather requirements around data management rules and processes

•  Understand the business requirement or the downstream process requirement

•  Provide impact analysis whenever changes or production events occur that may affect data quality

•  Monitor throughout the project delivery life cycle in order to manage the scope, schedule, and cost of IT projects

 

Desirable Skills:

•  Demonstrate expertise in Snowflake data modeling and ETL using Snowflake SQL, implementing complex stored procedures and best practices with data warehouse and ETL concepts

•  Propose the optimization of these rules if applicable, then design and develop these rules with ETL logic

•  Deliver ETL solution implementation and support

 

 

 

 

Position 7:

 

Senior Data Engineer - production – NBD

Location: Remote

Total industry experience between 12 and 15 years

Experience in the HLS domain plus end-to-end (E2E) solution architecture

Experience capturing customer requirements and performing platform demos for customers

Should have experience working with enterprise big data teams to handle issue resolution in the DWBI analytics environment, including experience delegating issues to the right team.

Extensive experience defining and rationalizing complex data models within a Snowflake/Amazon data warehousing environment

Required Skills: Data Architecture, Snowflake, AWS, Big Data, SQL

 

             
