Full job description
Duration: 6-month contract
Key Responsibilities:
Refactor and optimize existing Python code for scalability, modularity, and maintainability.
Implement software engineering best practices, including CI/CD pipelines, testing frameworks, and DevOps principles.
Collaborate with team members to deliver high-quality Python solutions.
Work on large-scale projects, ensuring robust architecture and performance improvements.
Utilize libraries such as Pandas, NumPy
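The refactoring-for-scalability responsibility above often comes down to replacing Python-level loops with vectorized Pandas/NumPy operations. A minimal sketch, with illustrative column names, of the before-and-after shape of such a refactor:

```python
import pandas as pd

# Hypothetical example: refactoring row-by-row Python for scalability.
# The "price"/"qty" columns are illustrative, not from the posting.

def total_slow(df: pd.DataFrame) -> pd.Series:
    # Original style: a Python-level loop over rows.
    return pd.Series(
        [row.price * row.qty for row in df.itertuples()], index=df.index
    )

def total_fast(df: pd.DataFrame) -> pd.Series:
    # Refactored: one vectorized multiply, pushed down to NumPy.
    return df["price"] * df["qty"]

df = pd.DataFrame({"price": [1.5, 2.0, 3.0], "qty": [2, 1, 4]})
```

Both functions return the same Series, but the vectorized version avoids per-row Python overhead as the frame grows.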
Full job description
Basic Requirements:
14 years of experience as a software engineer
Bachelor’s degree in a technical discipline
4 additional years of experience as a software engineer may be substituted for a degree
Job Description:
Sourcing for a candidate with experience in Python, Docker, and AWS (app deployment, S3, OpenSearch, Terraform).
Experience with Kotlin is desired.
Excellent benefits package including 25 days PTO, 11 paid holidays, and 100% employer-paid healthcare.
Role: Google Cloud (GCP) data engineer with Python
Hartford, CT
Full Time
Required Qualifications:
At least 4 years of Information Technology experience.
Experience working with GCP data engineering technologies such as Dataflow/Airflow, Pub/Sub/Kafka, Dataproc/Hadoop, and BigQuery.
ETL development experience and a strong SQL background, with tools and languages such as Python, R, Scala, Java, Hive, Spark, and Kafka.
Strong knowledge of Python program development to build reusable frameworks.
Role: Python Developer
Bill Rate: $80/hour C2C
Location: Wilmington, DE
Duration: 12+ months / long-term
Interview Criteria: Telephonic + Zoom
Direct Client Requirement
Job Description:
Executes creative software solutions, design, development, and technical troubleshooting with ability to think beyond routine or conventional approaches to build solutions or break down technical problems.
Develops secure high-quality production code, and reviews and debugs code written by others.
Identifi
Full job description
Responsibilities:
Write clean, efficient, and reusable Python code.
Build and maintain back-end services, APIs, and data pipelines.
Collaborate with front-end developers, designers, and product managers.
Develop, test, and deploy new features and fixes.
Integrate with databases, third-party services, and external APIs.
Identify and address technical challenges, find optimal solutions, and optimise application performance.
Write unit and integration tests.
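The "clean, reusable code" and unit-testing responsibilities above can be sketched together: a small pure function plus an assertion-style test, as a posting like this would expect. The function and its name are illustrative.

```python
# Hypothetical sketch: a reusable helper with a unit test alongside it.

def deduplicate(items: list) -> list:
    """Return items with duplicates removed, preserving first-seen order."""
    seen = set()
    out = []
    for item in items:
        if item not in seen:
            seen.add(item)
            out.append(item)
    return out

def test_deduplicate():
    # Unit test: duplicates dropped, original order kept, empty input safe.
    assert deduplicate([3, 1, 3, 2, 1]) == [3, 1, 2]
    assert deduplicate([]) == []

test_deduplicate()
```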
Full job description
Key Responsibilities
Agent Logic & Tooling: Develop and maintain the backend "tools" (APIs, scrapers, database connectors) that AI agents use to perform tasks.
Orchestration Implementation: Use frameworks like LangChain, LangGraph, or CrewAI to implement complex reasoning chains and multi-agent coordination.
RAG Pipeline Engineering: Build and optimize data ingestion and retrieval systems using vector databases, ensuring the agent has the right context at the right time.
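The retrieval step of the RAG responsibility above can be sketched without any real vector database or embedding model: both are stubbed here with NumPy so only the control flow (embed, score by cosine similarity, return top-k) is shown. The hash-based `embed` and all names are illustrative stand-ins, not a real system.

```python
import numpy as np

def embed(text: str, dim: int = 8) -> np.ndarray:
    # Stand-in for a learned embedding model: hash characters into a
    # fixed-size unit vector. Purely illustrative.
    vec = np.zeros(dim)
    for i, ch in enumerate(text.lower()):
        vec[i % dim] += ord(ch)
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    # Rank documents by cosine similarity to the query embedding and
    # return the k best, i.e. the context handed to the agent.
    q = embed(query)
    scores = [float(q @ embed(d)) for d in docs]
    top = np.argsort(scores)[::-1][:k]
    return [docs[i] for i in top]
```

A production version swaps `embed` for a model call and the score loop for a vector-database query, but the interface stays this shape.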
Job Description
Walmart is seeking a highly skilled Data Engineer to design, build, and optimize large-scale data pipelines and platforms. The ideal candidate will have strong hands-on experience with Apache Spark and solid programming expertise in Java, Python, and/or Scala, working in high-volume, enterprise data environments.
Key Responsibilities
· Design, develop, and maintain scalable data pipelines using Apache Spark
· Build and optimize ETL/ELT workflows for large datasets
· Write
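The ETL/ELT workflow described above has a standard three-stage shape. A minimal pure-Python sketch of that shape follows; Spark itself is not assumed here, and the record fields and toy data are illustrative. In the real role each stage would be a Spark job reading from and writing to actual storage.

```python
def extract() -> list[dict]:
    # Stand-in for reading from a source (S3, Kafka, a warehouse, ...).
    return [
        {"user": "a", "amount": "10.5"},
        {"user": "b", "amount": "4.0"},
        {"user": "a", "amount": "2.5"},
    ]

def transform(rows: list[dict]) -> dict:
    # Cast types and aggregate: total amount per user.
    totals: dict[str, float] = {}
    for row in rows:
        totals[row["user"]] = totals.get(row["user"], 0.0) + float(row["amount"])
    return totals

def load(totals: dict, sink: dict) -> None:
    # Stand-in for writing results to a table or object store.
    sink.update(totals)

sink: dict = {}
load(transform(extract()), sink)
```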
Full job description
Key Responsibilities:
Architect and implement scalable Full Stack AI Agents, Agentic Workflows, and Gen AI applications to address diverse and complex business use cases.
Develop, optimize, and maintain the front-end using Angular, ensuring a seamless user experience.
Design and deploy Python-based microservices for robust orchestration and integration with AI models.
Collaborate with Gen AI scientists to integrate machine learning models such as LLMs, RAG, and multi-modal models.
Full job description:
Key Responsibilities:
Software Development:
Write clean, scalable, and efficient Python code for various applications.
Design and implement high-availability and low-latency applications.
Develop RESTful APIs and integrate with third-party services.
Collaborate with front-end developers to integrate user-facing elements with server-side logic.
Data Processing and Analysis:
Work with data science teams to implement data processing pipelines.
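The RESTful-API responsibility above can be sketched with only the standard library; a production service would use a framework such as FastAPI or Flask, and the `/health` endpoint here is illustrative. The block starts a throwaway server, hits the endpoint over HTTP, and shuts it down.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Minimal JSON endpoint; any other path returns 404.
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):
        pass  # silence per-request logging in this sketch

# Bind port 0 so the OS picks a free port, then serve in the background.
server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]
with urllib.request.urlopen(f"http://127.0.0.1:{port}/health") as resp:
    payload = json.load(resp)
server.shutdown()
```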
Role Name: Senior Python Developer
Client Name: Cognizant
Location: Denver, CO / St. Louis, MO (onsite)
Experience: 10+ years, W2
Job Description:
We are seeking a skilled Python Developer with strong Apache Airflow experience to design, develop, and optimize workflow orchestration solutions. The role focuses on DAG development, dynamic workflow generation, and improving existing systems. The candidate will support application teams and assist with Airflow cluster administration.
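The DAG development this role centers on is, underneath Airflow's operators and scheduler, tasks ordered by their dependencies. Airflow is not assumed available here; a minimal stdlib sketch of that idea, with illustrative task names, is:

```python
from graphlib import TopologicalSorter

# Dependency map: each task lists the tasks it must run after,
# mirroring how Airflow DAGs declare upstream dependencies.
deps = {
    "transform": {"extract"},
    "load": {"transform"},
    "notify": {"load"},
}

# static_order() yields every task (including pure upstream nodes like
# "extract") in a valid execution order, and raises on cycles.
order = list(TopologicalSorter(deps).static_order())
```

An Airflow DAG adds scheduling, retries, and operators on top, but the validity of a run order is exactly this topological constraint.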