Role – Data Engineer
Need to be onsite - 2 locations: Menlo Park, CA and Austin, TX
Must Have: SQL and Python
Visa: Open to OPT/CPT candidates (kindly perform due diligence when screening OPT/CPT candidates)
Rates - Open; ranges between $55 and $85 (depending upon experience)
Experience – 5+ years
Client – Meta (Facebook)
Duration – 12+ months contract
Note: Candidates should be willing to take an online coding test in SQL and Python.
Interview Process: After clearing the online assessment, a Trianz tech discussion plus a customer call (2 rounds)
Data Engineers
Menlo Park, CA, New York, NY, Austin, TX, Seattle, WA
Day 1 Onsite Job
6-12 months duration
About the Role:
Trianz is looking for passionate Data Engineers who are eager to tackle challenges and build solutions.
We are looking for a Data Engineer to not only build data pipelines but also extend the next generation of our data tools. As a Data Engineer, you will develop a clear sense of connection with our organization and leadership, as Data Engineering is the lens through which they see the product.
This is a partnership-heavy role. As a member of Infrastructure Strategy Data Engineering, you will belong to a centralized Data Science/Data Engineering team that partners closely with teams in the client’s Infrastructure organization. Given the consulting nature of our team, you will contribute to a variety of projects and technologies, depending on partner needs. Projects include analytics, ML modeling, tooling, services, and more. The broad range of partners equates to a broad range of projects and deliverables: ML models, datasets, measurements, services, tools, and processes.
Responsibilities
· Manage data warehouse plans for a business vertical or a group of business verticals.
· Build data expertise and own data quality for allocated areas of ownership.
· Design, build, optimize, launch, and support new and existing data models and analytical solutions.
· Partner with internal stakeholders to understand business requirements, work with cross-functional data and product teams, and build efficient and scalable data solutions.
· Conduct design and code reviews.
· Work with the data infrastructure team to triage infrastructure issues and drive them to resolution.
· Manage the delivery of high-impact dashboards, tools, and data visualizations.
Minimum Qualifications:
· BS/B.Tech./M.Tech. in Computer Science, Math, or a related field
· 2+ years of experience in the data warehouse space, including custom ETL design, implementation, and maintenance
· 2+ years of experience in SQL or similar languages, and development experience in at least one language (Python, PHP, etc.)
· Experience with data architecture, data modeling, schema design and software development
· Experience leading data-driven projects from definition through execution and interpretation.
· Experience with large data sets, Hadoop, and data visualization tools
· Experience initiating and driving projects and communicating data warehouse plans to internal clients/stakeholders.