Job Description:

Must Have Skills: Neo4j

Job Title: Sr. AWS Data Engineer

Location: Remote

Duration:

Description:

As an AWS Data Engineer, you will lead a team of data engineers in developing and maintaining scalable data pipelines and in implementing solutions with graph databases and machine learning. Your role will be pivotal in processing data and validating its integrity for analysis, leveraging a variety of tools and technologies to meet business needs. With a hands-on approach, you will apply Agile methodologies to translate business requirements into technical specifications, partnering with cross-functional teams to deliver data-driven solutions that optimize operations and contribute to our organization's goals.

Key Responsibilities:

- Lead and mentor a team of data engineers in designing, building, and maintaining efficient, scalable data pipelines.

- Ensure the integrity and accuracy of data used for analysis, implementing robust validation processes.

- Utilize graph databases like Neo4j for complex data modeling and analysis.

- Collaborate with business and IT teams to understand requirements and translate them into actionable data engineering specifications.

- Drive the adoption of best practices in data engineering, including continuous integration and delivery (CI/CD) pipelines using tools like GitHub, Jenkins, and UCD.

- Oversee the implementation and use of data storage solutions, including PostgreSQL, MS SQL, Amazon RDS/Aurora, Databricks, and Snowflake.

- Manage cloud-based data solutions utilizing AWS services like EC2, S3, IAM, SQS, SNS, Lambda, and Step Functions.

- Advocate for Agile methodologies, fostering a culture of rapid iteration, feedback, and continuous improvement.

- Ensure team alignment with disaster recovery plans and practices.

- Facilitate communication and collaboration within the team and across departments, leveraging tools like SharePoint, Confluence, and JIRA.

Skills and Qualifications:

- Proven experience in data engineering and leadership, with a strong portfolio of successful data pipeline creation and management.

- Expertise in the Python and Java programming languages.

- Deep understanding of SQL and experience with Neo4j.

- Familiarity with CI/CD tools (GitHub, Jenkins, UCD) and data storage solutions (PostgreSQL, MS SQL, Amazon RDS/Aurora, Databricks, Snowflake).

- Certified in AWS technologies with hands-on experience in EC2, S3, Lambda, and Step Functions.

- Proficiency in Apache Airflow, ECS/EKS, Kafka, Docker, Terraform, and disaster recovery strategies.

- Strong leadership skills, with the ability to mentor, inspire, and lead a technical team.

- Excellent problem-solving, communication, and team collaboration skills.

- Agile methodology advocate with a track record of applying Agile practices to improve team performance and outcomes.
