Job Description:

Role: Senior Graph Database Engineer

Location: Anywhere in EST Zone

Duration: 12+ Months/ Contract

# of Openings: 5

Start Date: ASAP


Must Have Skills:

  • Graph data modeling: experience with graph data models (LPG, RDF) and graph query languages (Cypher, Gremlin, SPARQL); exposure to various graph data modeling techniques
  • Optimizing complex queries
  • AWS


Ideal Candidate:

  • Someone with hands-on experience in the graph space, preferably Neo4j, and strong analytical and data skills.
  • Someone who understands data and can apply analytical skills to it, with a solid grasp of graph data modeling.


Job Summary:

  • As a Graph Database Engineer, you will design and build graph database load processes to efficiently populate the knowledge graphs using large-scale datasets to solve various business use cases.
  • You will also contribute to advanced analytics and machine learning platforms. You will partner closely with various business and engineering teams to drive adoption of, and integration with, the graph database.
  • This role is critical to harnessing the power of data to deliver the Client’s promise of creating the best customer experiences in financial services!


The Expertise You Have:

  • Bachelor’s or Master’s Degree in a technology related field (e.g. Engineering, Computer Science, etc.).
  • Demonstrable experience implementing Big Data solutions in the data analytics space.
  • Working experience with, and expertise in, knowledge graph/graph database technology (Neo4j preferred).
  • Experience tuning graph databases (Neo4j preferred).
  • Experience with graph data models (LPG, RDF) and graph query languages (Cypher, Gremlin, SPARQL); exposure to various graph data modeling techniques for common use cases (e.g., Customer 360, data lineage, recommenders) for analytical (OLAP) workloads.
  • Solid understanding of graph data modeling, graph schema development, graph data design.
  • Experience developing graph algorithms, optimizing complex queries, designing indexes and constraints.
  • Experience developing APIs over knowledge graph data.
  • Experience designing and building highly scalable knowledge graphs in production.
  • Experience building data pipelines to gather the data required to build and evaluate ML models, using tools like Apache Spark or other distributed data processing frameworks.
  • Experience with data movement technologies (ETL/ELT); messaging/streaming technologies (AWS SQS, Kinesis, Kafka); relational and NoSQL databases (e.g., DynamoDB, graph databases); and API and in-memory technologies.
  • Solid understanding of developing highly scalable distributed systems using open-source technologies.
  • Experience with CI/CD tools (e.g., Jenkins or equivalent), version control (Git), orchestration/DAGs tools (AWS Step Functions, Airflow, Luigi, Kubeflow, or equivalent).
  • Solid experience with Agile methodologies (Kanban and Scrum).


The Skills You Bring:

  • You have the ability to deal with ambiguity and work in a fast-paced environment.
  • You have experience supporting critical applications.
  • You have data wrangling experience with structured, semi-structured, and unstructured data.
  • You have superb communication skills, both written and verbal.
  • You have excellent collaboration skills for working with multiple teams across the organization.
  • You have the ability to understand and adapt to changing business priorities and technology advancements in the Big Data and Data Science ecosystem.


The Value You Deliver:

  • Build knowledge graph solutions leveraging large-scale datasets to solve various business use cases.
  • Design and build graph database schemas to support a variety of use cases, including knowledge graphs.
  • Design and build graph database load processes to efficiently populate the knowledge graphs.
  • Use AWS to host and manage the knowledge graphs.
  • Support ML activities against the knowledge graph to expose insights.
  • Build both batch and real-time update processes to keep the knowledge graphs in sync.
  • Collaborate across Data Platform teams to use existing data assets in the knowledge graph and to drive the creation of additional data assets.
  • Guide teams to improve development agility and productivity.
  • Resolve technical roadblocks and mitigate potential risks.
  • Deliver system automation by setting up continuous integration/continuous delivery (CI/CD) pipelines.


Client: Compunnel

