Job Description :

We have an opening for a Data Architect position

Location is San Francisco, CA

This is a full-time position


Responsible for designing and implementing a data lake and the applications that ingest data, perform ETL, and run analytics on top of it.

Able to evaluate and recommend query engines, reporting tools, and data ingestion and management tools, and to
support movement of high volumes of data with the best possible performance.

Required Skills

10 or more years of hands-on experience with systems, databases, programming, and architecture

Experience with database and data warehousing systems such as Oracle, SQL Server, Teradata, and AWS Database
Migration Service

Experience in designing and working with cloud-based datastores such as AWS Redshift, Snowflake, and Azure
SQL

Experience with Tableau, QlikView, Power BI, or Spotfire

Experience with Big Data Technologies:

Sqoop, Flume, Talend/ETL, HDFS, Hive, HBase, Phoenix, Ignite, Cassandra, or any other popular NoSQL/document DBMS

SQL knowledge and familiarity with query engines such as Impala, Tez, Presto, and Drill

Data integration experience: integrating a big data lake with systems of record (ERP, CRM, BPM, Salesforce, etc.)
Working knowledge of in-memory databases such as Redis, Memcached, and Hazelcast

Experience building highly available, high-performance systems

Strong communication, presentation, and collaboration skills

Minimum Qualifications

Bachelor's Degree in Computer Science

5+ years of hands-on experience working with extremely high-throughput, high-availability data processing systems

A focus on high quality deliverables and meeting deadlines

Someone who brings ideas to the table

A collaborative style and a focus on continuous improvement, quality, planning, teamwork, and strong communication

Good understanding of big data database architecture, indexing, and partitioning