5+ years of experience in data analysis, engineering, architecture and operations roles, including experience with transformational efforts
Strong database skills with RDBMS (e.g., Oracle, SQL Server) as well as modern relational and unstructured data sources (e.g., NoSQL), including cloud services (AWS/GCP/Azure); hands-on experience with these tools is strongly preferred
Experience with tools such as (or similar to) the Hadoop stack, Airflow, Kafka, NiFi, PostgreSQL, Oracle, SQL Server, and Elasticsearch (ELK); data storage formats such as JSON, Parquet, and Avro; visualization tools such as Tableau and Superset; and Apache Atlas and other data-centric Apache packages
Extensive knowledge of design patterns for software and data engineering
Coding experience with Java, JavaScript (Node.js), Python, Go, Rust, and similar languages
Experience in on-prem and hybrid cloud infrastructure, including service and cost optimization
Experience with production and analytics data, both batch and real-time/streaming
Experience in regulated industries (such as financial services, insurance, or healthcare) preferred
Familiarity with optimization tools and techniques, including Bayesian modeling and a variety of machine learning techniques
Ability to manage large programs and projects is essential