Job Description :
TECHNOGEN, Inc. has been a proven leader in providing full IT services, software development and solutions for 15 years.

 

TECHNOGEN is a Small & Woman-Owned Minority Business with GSA Advantage Certification. We have offices in VA and MD, and offshore development centers in India. We have successfully executed 100+ projects for clients ranging from small businesses and non-profits to Fortune 50 companies and federal, state and local agencies.
Please share resumes with me at john@technogeninc.com and call me.

Position: Data Engineer
Duration: 6-month contract
Location: Minnetonka, MN

Client seeks a Data Engineer within its Human Capital Systems team. The Data Engineer must have experience with modern database technologies and API-driven design and development. The Data Engineer makes design decisions and partners across the organization in a collaborative manner to achieve results for our Human Capital applications.

Primary Responsibilities:
* Architect all phases of software engineering, including requirements analysis, application design, code development and testing, with a particular focus on DevOps deployment
* Design reusable components, frameworks and libraries
* Contribute to the design and architecture to enable secure, scalable and maintainable solutions, and clearly articulate the implications of design/architectural decisions, issues and plans to technology leadership
* Collaborate on the design with other team members and product owners
* Conduct design and code reviews to ensure code meets business needs and follows guidelines for coding best practices, unit testing, security, scalability and maintainability
* Work closely with architecture groups and drive solutions
* Use engineering best practices within an Agile methodology to deliver high-quality emerging-tech solutions
* Communicate with impact: influence and negotiate effectively with all internal and external stakeholders to achieve win-win solutions that advance organizational goals
* Grow and maintain knowledge of emerging technologies, and leverage them
* Develop and analyze highly complex system standards, thresholds and recommendations to maximize system performance
* Analyze project requirements and develop detailed specifications for new data warehouse reporting requirements
* Coordinate to support Bulk API work and make necessary changes to meet business, contractual, security and performance needs

To be considered for this position, applicants need to meet the required qualifications listed in this posting.

Required Qualifications:
* Taleo API experience is ideal; batch API experience is preferred
* Bachelor's degree in a related area, or equivalent experience
* 2+ years of data engineering experience
* 5+ years of full-lifecycle application and software development experience
* 2+ years of SDLC experience in an Agile environment
* 4+ years with a modern programming language such as Python, Java or Scala
* Working knowledge of the following business and technology concepts: APIs, CI/CD, Big Data, data architecture and governance
* Experience with Jenkins, GitHub, and Big Data technologies such as Spark
* Experience using IDEs such as Eclipse, JBoss or IntelliJ
* Relational database experience
* Experience ingesting and working with large and complex datasets
* Experience gathering requirements from end users

Preferred Qualifications:
* Bachelor's degree in Computer Science, Engineering, or Technology
* Experience with cloud technologies and platforms such as Docker, OSE, Kubernetes, AWS and Azure
* Taleo knowledge
* Experience with disaster recovery models
* DevOps experience

Required Skills: SDLC