Digital analytics
Deerfield, IL 60015
Posted: Dec-21-18 (2018-12-21) | Valid through: 2019-12-11
Work Authorization: US Citizen, GC, H1B, L2 EAD, H4 EAD

Preferred Employment: Corp-Corp, W2-Permanent, W2-Contract, 1099-Contract, Contract to Hire

Job Details
Experience: Architect, Senior
Rate/Salary ($): Market
Duration: 6 Months
Sp. Area: Others
Sp. Skills: Others
Job Type: Permanent Direct Hire; Consulting / Contract (FULL_TIME, CONTRACTOR)
Direct Client Requirement
Required Skills: Digital analytics
Preferred Skills:
Domain:
ZTEK CONSULTING
Duluth, GA
Job Description:
Should have strong Digital analytics experience.
Similar jobs you may be interested in:
Data Management Developer
Chicago, IL | Apr-10-24 | tanishasystems
Rate/Salary ($): Based on experience

Role: Data Management Developer. Location: Chicago, IL (full-time only). Job summary: The Data Management Specialist will be responsible for managing and optimizing our organization's data infrastructure, ensuring data quality, and implementing data solutions using PySpark, Databricks, Snowflake, and/or Redshift. The ideal candidate will have a strong background in data management, programming, and cloud-based data platforms. Key responsibilities: 1. Develop and maintain data pipelines using PySpark to
Remote - Data Engineer
Chicago, IL | Apr-02-24 | Technogen, Inc
Rate/Salary ($): USD 80

Job Title: Data Analytics Engineer. Location: Chicago, IL (hybrid). Duration: contract. Skills and experience: a degree, preferably in Economics, Econometrics, Engineering, Physics, (Applied) Mathematics, Accounting, or Information Systems; minimum 5 years of professional experience, with a minimum of 2 years of relevant data analytics experience; extensive experience in technical business analysis in the fintech industry; exceptional communication skills toward facilitating the use of
Snowflake Data Engineer
Chicago, IL | Mar-28-24 | Technogen, Inc
Rate/Salary ($): USD 70 / Hourly / C2

Please look for EST and CST time zone profiles. 253160. Need immediate support on the below role ASAP. Location: Chicago, IL (hybrid, 3 days a week). Let me know in case of any questions. Responsibilities: Design, implement, and maintain data pipelines on Snowflake, ensuring scalability, reliability, and performance. Develop and optimize data ingestion processes from various sources, including Azure Blob Storage, Azure Data Lake, databases, APIs, and streaming data sources. Imple
Data Engineer
Chicago, IL | Mar-24-24 | Technogen, Inc
Rate/Salary ($): USD 65 / Hourly / C2

Data Engineer (remote). Develop EL/ELT/ETL pipelines to make data available in a BigQuery analytical data store from disparate batch and streaming data sources for the Business Intelligence and Analytics teams. Work with on-prem data sources (Hadoop, SQL Server), understand the data model and the business rules behind the data, and build data pipelines (with GCP, Informatica) for one or more business verticals. This data will be landed in GCP BigQuery. Build cloud-native services and APIs to support and e
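Several of the listings above ask for batch ETL pipelines that land data in an analytical store. As a rough sketch of that extract-transform-load pattern only (not any employer's actual stack), the skeleton can be shown in plain Python; the sample rows, the in-memory sink, and every function name here are hypothetical stand-ins for real warehouse connectors such as a BigQuery or Snowflake client.

```python
# Minimal batch ETL sketch: extract -> transform -> load.
# All data and targets are hypothetical; a real pipeline would read from
# an upstream source (Hadoop, SQL Server, ...) and write via a warehouse
# client instead of using in-memory lists.

def extract():
    # Stand-in for pulling one batch of raw rows from a source system.
    return [
        {"user_id": "1", "amount": "10.50"},
        {"user_id": "2", "amount": "3.25"},
        {"user_id": "2", "amount": "1.00"},
    ]

def transform(rows):
    # Typical lightweight "T" work: cast types, then aggregate per user.
    totals = {}
    for row in rows:
        uid = int(row["user_id"])
        totals[uid] = totals.get(uid, 0.0) + float(row["amount"])
    return totals

def load(totals, sink):
    # Stand-in for writing the transformed batch to an analytical store.
    for uid, total in sorted(totals.items()):
        sink.append({"user_id": uid, "total_amount": round(total, 2)})

def run_pipeline(sink):
    load(transform(extract()), sink)
    return sink

if __name__ == "__main__":
    print(run_pipeline([]))
```

In an ELT variant, the `transform` step would instead run as SQL inside the warehouse after raw rows are loaded; the orchestration skeleton stays the same.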