Job Description:
AWS Architect

Philadelphia, PA

6 - 12 month contract



· The Solution Architect will primarily develop Big Data solutions in AWS, covering, but not limited to, the responsibilities below:

· Create roadmaps and prototype design solutions for systems, educate others on solution design, establish standards, review code and provide constructive feedback for improvement, and research and recommend changes to software development standards and governance processes.

· Make recommendations on Big Data platform, Cloud, and technology adoption, including database servers, application servers, libraries, and frameworks.

· Collaborate on projects with the architecture, strategy, and information security teams to onboard the Big Data platform (application patterns).

· Business Communication – engage with internal teams, project teams, and business areas to discuss issues and document or explain technical solutions in a simplified, meaningful way.

· Collaborate on projects with the architecture, strategy, and information security teams to onboard the Cloud platform (Cloud patterns).

· Participate in design reviews of architecture patterns for service/application deployment in the AWS cloud.

· Collaborate with platform pillar leads (Cloud, Big Data, Database, middleware, etc.) to build out the components for the Big Data and Cloud platforms.

· Mentor junior members of the development and operations teams.

· Help coordinate project deployments with other engineers and make schedule recommendations.



Qualifications:

· 10+ years of application solution architecture and development experience, with 5+ years of experience in AWS Cloud and Big Data tools and technologies such as Lambda, Databricks, and EMR.

· Bachelor’s degree or higher in Computer Science, IT, or a related field, or equivalent work experience.

· A passion for data and several years of hands-on experience leading data engineering projects, such as end-to-end data architecture and data warehouse implementation projects.

· Experience building solutions on large, complex projects in a high-tech development environment with multi-functional teams.

· Strong technical background/subject matter expertise – experience as a technical lead or architect on a Big Data development team.

· Experience developing systems based on key principles (consistency and availability, liveness and safety, durability, reliability, fault tolerance, and consensus algorithms).

· Strong solution delivery experience in the Hortonworks Hadoop ecosystem; must be very familiar with Hive, ZooKeeper, Pig, Spark, HDFS, Apache Ranger, etc.

· Hands-on experience building and managing end-to-end data pipelines on on-premise and AWS Cloud data platforms using Sqoop, Kafka, Kinesis, AWS Step Functions, X-Ray, CloudTrail, CloudWatch, the AWS CLI, Python, Scala, Node.js, Bash, job orchestration tools, etc.

· Experience in designing and implementing multi-terabyte data warehouses in MPP columnar database systems, such as Redshift and Teradata.

· Experience working with data streaming technologies (Kinesis, Kafka, etc.).

· Expertise with the AWS technology stack and CI/CD deployment (Docker, Jenkins, etc.).

· Prior experience migrating on-premise data platform implementations to cloud-based implementations.
             
