Job Description:
*Position: Solution Architect*

*Location: Hartford, CT (we are open to any location as long as the person
can travel to Hartford for 2 days per week)*

*Legal status of candidates: no limitations*

*Permanent, Full-Time*

*Must-Have Skills:*

a) Expert-level, hands-on experience with advanced
integration/streaming/messaging technologies.
b) Java/Spring
c) High-performance/multithreading/real-time
d) Rule engines, especially Drools
*Nice-to-Have Skills (cover as many as possible):*
a) Integration tools like Camel/Red Hat Fuse or Mule
b) Solid relational DB/persistence experience
c) Data integration/ETL
d) CDC (change data capture)
e) Kafka
f) Docker/Kubernetes
g) Graph technologies - Neo4J or similar
h) Agile
i) CI/CD
*Important soft skills:*
- Good communication skills
- Ability to work with a hybrid client/EPAM team
- Leadership
- Ability to work in a fast-paced, agile environment
*Hard skills:*
- Focus on technical prowess and hands-on coding capacity
*Project Description:*
1. A large-scale project to build an integration framework that connects
several upstream systems (data providers) with many (150) consumers of the data.
2. Data updates will arrive from the source systems via Kafka and be
consumed by the new system.
3. The data will be validated and merged across the sources using
metadata-driven rules and transformations (Drools, Camel, Spark).
4. The data will be distributed into different data repositories (MongoDB,
Neo4J, Oracle, Hadoop).
5. The system will provide near-real-time, event-driven processing of the
data updates and distribution of the data to consumers.
6. The system will also provide a REST API and data distribution via Kafka
for subscribed consumers.
7. The system will consist of multiple Spring Boot-based components
deployed via Docker on OpenShift/Kubernetes infrastructure.
8. The system will require substantial custom Java development to integrate
with Kafka, MongoDB, Neo4J, and Drools, and to implement the metadata-driven
rules and flows and the REST API (see the illustrative sketch after this list).
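For a flavor of the hands-on work, below is a minimal sketch of one such
component, assuming a Spring Boot service with a Kafka listener that fires
Drools rules and writes valid records to MongoDB; the topic, consumer group,
fact type, and collection name are hypothetical placeholders rather than
project specifics.

```java
// Illustrative sketch only: one Spring Boot component that consumes update
// events from Kafka, runs them through a Drools session, and persists valid
// results to MongoDB. All names below are placeholders, not project details.
package com.example.integration;

import com.fasterxml.jackson.databind.ObjectMapper;
import org.kie.api.KieServices;
import org.kie.api.runtime.KieContainer;
import org.kie.api.runtime.KieSession;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class DataUpdateListener {

    /** Hypothetical fact type inserted into the Drools session. */
    public static class DataUpdate {
        public String sourceSystem;
        public String entityId;
        public boolean valid;
    }

    private final KieContainer kieContainer;   // compiled Drools rules from the classpath
    private final MongoTemplate mongoTemplate; // Spring Data access to the target store
    private final ObjectMapper objectMapper = new ObjectMapper();

    public DataUpdateListener(MongoTemplate mongoTemplate) {
        this.kieContainer = KieServices.Factory.get().getKieClasspathContainer();
        this.mongoTemplate = mongoTemplate;
    }

    // Consume raw update events from an assumed upstream topic.
    @KafkaListener(topics = "upstream.data.updates", groupId = "integration-framework")
    public void onDataUpdate(String payload) throws Exception {
        DataUpdate update = objectMapper.readValue(payload, DataUpdate.class);

        // Validate/enrich the update by firing the metadata-driven Drools rules.
        KieSession session = kieContainer.newKieSession();
        try {
            session.insert(update);
            session.fireAllRules();
        } finally {
            session.dispose();
        }

        // Persist validated records into one of the downstream repositories.
        if (update.valid) {
            mongoTemplate.save(update, "merged_records");
        }
    }
}
```

In the real system, the same pattern would be repeated across many components,
with Camel routes, CDC feeds, and additional sinks (Neo4J, Oracle, Hadoop)
replacing or complementing the single MongoDB write shown here.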
             
