Job Description:
Data Engineer (Data Platform)
Engineer, Information (Senior)

Location: Linthicum MD

Rate: Depends on experience (DOE)

Requires FTF

W2 requirements

Duties:
1. Provides design recommendations based on long-term IT organization strategy.
2. Develops enterprise-level application and custom integration solutions, including major enhancements and interfaces, functions, and features. Uses a variety of platforms to provide automated systems applications to customers.
3. Provides expertise regarding the integration of applications across the business.
4. Determines specifications, then plans, designs, and develops the most complex and business-critical software solutions, utilizing appropriate software engineering processes, either individually or in concert with a project team. Assists with the most difficult support problems.
5. Develops programming and development standards and procedures, as well as programming architectures for code reuse. Has in-depth knowledge of state-of-the-art programming languages and object-oriented approaches to designing, coding, testing, and debugging programs.
6. Understands and consistently applies the attributes and processes of current application development methodologies.
7. Researches and maintains knowledge in emerging technologies and possible application to the business.
8. Viewed both internally and externally as a technical expert and critical technical resource across multiple disciplines. Acts as an internal consultant, advocate, mentor and change agent.

Education:
1. A Bachelor's Degree from an accredited college or university with a major in Computer Science, Information Systems, Engineering, Business, or other related scientific or technical discipline. A Master's Degree is preferred.

General Experience:
1. At least 8 years of experience developing cloud-based, multi-user applications, with expertise in designing, building, testing, and implementing IT applications.
2. Must have a strong background in software engineering principles and techniques.

Special Qualifications:

MDM (Two positions)
1. The candidate must have experience designing and developing MDM solutions in large-scale deployments.
2. Should be able to develop for a robust MDM platform that matches and consolidates key data subject areas.
3. This individual must produce MDM solutions that meet technical specifications and business requirements according to the established designs, conduct MDM unit tests and code reviews, and participate in MDM design reviews.
4. This individual must also have a strong background in software engineering principles and techniques.
5. Specialized experience using the Informatica MDM suite of products.

ETL (Six positions)
1. Provides technical direction in the maintenance and development of the extract, transform, and load (ETL) aspects of the data platform.
2. Maintains an understanding of the inputs received from the data source providers.
3. Performs analysis, design, development, and implementation of new ETL requirements.
4. Recommends changes to enhance the data platform cleansing and conversion processes.
5. Supports testing and validation of the new data conversion processes.
6. Responsible for planning and monitoring schedule milestones and deliverables for assignments.
7. Possesses strong SDLC experience and manages appropriate levels of systems documentation as required.
8. Specialized experience using the Informatica BDM (Big Data Management) suite of products.

MarkLogic (One position)

1. 8 years of overall experience in information technology.
2. 2 years of implementation experience with MarkLogic.
3. Experience translating business requirements into a technology solution roadmap.
4. Ability to consult with and advise customers on NoSQL implementations.
5. Excellent communication skills.
6. Experience with Java development, XML and Web Technologies.
7. Experience with XQuery implementation and MarkLogic API development.
8. Experience rolling out large NoSQL implementations.
9. Excellent design, development, implementation, documentation, and problem-solving skills.
10. Experience with integration methodologies and tools.
11. Experience with big data technologies (Hadoop and NoSQL) and with DevOps functions.
12. Experience defining best practices and patterns for ingestion and retrieval of data from MarkLogic.
13. Familiarity with other NoSQL and big data technologies.

Big Data Java/C++ (One position)

1. 8 years of hands-on experience as a C/C++/Java programmer with object-oriented analysis, design, and implementation expertise.
2. Experience utilizing open-source technologies such as Kafka, Docker, and relational and NoSQL databases to build cloud-based products.
3. Experience with integration development using REST APIs and message queuing / integration platforms such as Apache Kafka.

Qlik (Four positions)
1. The candidate must have a minimum of 3 years of experience in BI and data warehousing.
2. The BI Developer will contribute directly by developing and maintaining dashboards and reports that turn raw data into easily digestible stories.
3. They will work to improve and enrich the existing reporting systems while working to extend the platforms by incorporating new data sources and requirements.
4. They will maintain a self-service data model and teach end-users across the organization how to create basic reports themselves.
5. Needs to have experience designing BI solutions using QlikView and Qlik Sense.

Apache Spark Developer (Four positions)
1. Senior developer with 5 years of hands-on experience in Spark and Hadoop.
2. 10 years of overall IT experience.
3. Proven experience building data-driven applications using a combination of Java/Scala and the Spark framework.
4. Hands-on experience with MapReduce.
5. Ability to work independently and drive solutions end to end leveraging various technologies to solve data problems and develop innovative big data solutions.
6. Proven expertise in leveraging big data components (including but not limited to Hive, HBase, Oozie, and Kafka) to build large-scale data processing systems.
7. Expertise in two or more of the following programming languages: Java, Spark, Python, R.

Java/Python Developer (Two positions)
1. Experience working with cloud platforms and services such as Amazon Web Services, DevOps, and containerized cloud environments is preferred.
2. Experience developing web applications using AngularJS or an equivalent JavaScript framework, consuming RESTful APIs developed on the J2EE platform.
3. Practical knowledge of Jenkins, Maven, and Sonar.
4. Proficiency in scripting languages and front-end technologies such as JavaScript, jQuery, and JSON.
5. Ability to generate SQL scripts and ETL packages to migrate data from legacy databases into a PostgreSQL database.
6. Ability to design and write ETL code using Java, Python, and jQuery to transform legacy data.
7. Must have knowledge of building applications with concurrency.
8. Must have knowledge of Spring components and the Spring Framework.

Data Scientist (Two positions)
1. 5 years of experience as a data scientist.
2. Excellent understanding of machine learning techniques and algorithms (such as k-NN, Naive Bayes, etc.).
3. Experience with a varied range of data visualization tools.
4. Experience with varied data querying languages (SQL, Pig, Hive).
5. Expertise in two or more of the following programming languages: Java, Spark, Python, R.
6. Experience leveraging and harvesting data using various Relational and NoSQL Database Technologies.
7. Ability to work independently and drive solutions end to end leveraging various technologies to solve data problems and develop innovative big data solutions.
8. Data-oriented, with proven experience in discovering information from vast amounts of data, providing value to the business, and enabling smart decision-making.
9. Experience working in a Cloud based environment.
             
