Job Description:

• Data migration from on-premises to cloud and cloud to cloud (GCP required)
• Experience setting up data pipelines with Apache Airflow and Cloud Composer
• Experience with Dataproc, Bigtable, and BigQuery clusters on GCP, as well as Cloud Dataflow, Cloud SQL, and Cloud Storage
• Experience with big data tools such as Hadoop, Spark, Kafka, Pub/Sub, etc.
• Work with the data team to use Hadoop/cloud infrastructure efficiently to analyze data, build models, and generate reports/visualizations
• Extracting, loading, transforming, cleaning, and validating data
• Experience with Data Catalog, Cloud Data Fusion, Dataprep, and Cloud Data Loss Prevention
• Designing pipelines and architectures for data processing
• Querying datasets, visualizing query results and creating reports
• Data modeling and Data Catalog
• Experience with reporting tools such as Tableau, Looker, and Data Studio
• Experience with Python scripting (scikit-learn, pandas, NumPy, etc.)