Job Description:
Job Responsibilities for Big Data Engineer with AWS and Python
•  In this role, you will work with our data engineering team to build big data platforms that deliver value to our clients.
•  Responsible for data ingestion, processing, storage, extraction, and orchestration in the big data ecosystem, specifically using AWS and serverless services, to meet business objectives.
•  Work with the data validation team to address data defects and ensure defects are closed in an efficient and timely manner.
•  Apply your expertise in building world-class solutions, solving business problems, and addressing technical challenges using big data platforms and technologies.
•  Utilize existing frameworks, standards, and patterns to create the architectural foundation and services necessary for building data pipelines that scale, and establish yourself as an expert by actively researching and identifying new ways to solve data management problems in this emerging area.
•  Ensure assigned tasks are completed within timeline and budget requirements.
•  Understand the client's business problems, and define, execute, and deliver assigned tasks to meet those requirements.
•  Develop POCs and create POVs as required to achieve program objectives.
•  Your day-to-day interactions are with peers, clients, and management.
•  You will be given minimal instruction on daily work/tasks and a moderate level of instruction on new assignments.
•  You will need to consistently seek and provide meaningful, actionable feedback in all interactions.
•  You will be expected to constantly look for ways to enhance value for your stakeholders/clients.
•  Decisions you make will impact your own work and may impact the work of others.
•  You will be an individual contributor and/or oversee a small work effort and/or team.
•  Be proactive toward organizational initiatives and contribute to people activities such as training and developing people by creating growth plans.

Job Requirements
•  B.Tech in computer science or equivalent.
•  8+ years of relevant experience in building data pipelines, ETL, ELT, and data engineering.
•  Experience working with distributed computing, cloud-native architecture, big data, and the AWS ecosystem; AWS certification preferred.
•  Experience with AWS and serverless services: S3, AWS Glue, Step Functions, Redshift.
•  Experience handling healthcare and clinical data and familiarity with these domains.
             
