Job Description:
This is Sachin from Saxon Global.
This is a reference for the following position.
* Please reply with your resume, work authorization, rate, and contact details ASAP *

Position: Data Engineers
Location: Sunnyvale, CA
Duration: 1+ year, with possible extension

Job Description

1) Teradata/Hive (data ingestion)
2) Data modeling: data models, star schema, conceptual models, basic modeling questions.
3) Domain experience; e-commerce preferred.
4) Dashboard building with Tableau/MicroStrategy.
5) Shell/Python scripting.
6) Experience with ETL tools such as Talend and Tableau.

Position Summary:
* Very strong engineering skills, an analytical approach, and good programming skills.
* Provide business insights while leveraging internal tools, systems, databases, and industry data.
* Minimum of 5 years of experience; experience in the retail business is a plus.
* Excellent written and verbal communication skills for varied audiences on engineering subject matter
* Ability to document requirements, data lineage, and subject matter in both business and technical terminology.
* Guide and learn from other team members.
* Demonstrated ability to transform business requirements into code, analytical reports, and tools.
* This role involves coding, analytical modeling, root-cause analysis, investigation, debugging, testing, and collaboration with business partners, product managers, and other engineering teams.

Must Have
* Strong analytical background
* Self-starter
* Must be able to reach out to others and thrive in a fast-paced environment.
* Strong background in transforming big data into business insights

Technical Requirements
* Knowledge of/experience with Teradata physical design and implementation, and Teradata SQL performance optimization
* Experience with Teradata Tools and Utilities (FastLoad, MultiLoad, BTEQ, FastExport)
* Advanced SQL (preferably Teradata)
* Experience working with large data sets and with distributed computing frameworks (MapReduce, Hadoop, Hive, Pig, Apache Spark, etc.)
* Strong Hadoop scripting skills to process petabytes of data
* Experience in Unix/Linux shell scripting or similar programming/scripting knowledge
* Experience with ETL processes
* Real-time data ingestion (Kafka)

Nice to Have
* Development experience with Java, Scala, Flume, Python
* Cassandra
* Automic scheduler
* R/RStudio, SAS experience a plus
* Presto
* HBase
* Tableau or a similar reporting/dashboarding tool
* Modeling and Data Science background
* Retail industry background

Education
* BS degree in a technical field such as computer science, math, or statistics preferred
             
