Sr. DataOps Engineer 1
- Build and maintain data pipelines to extract, transform and load (ETL) data in a data warehouse (BigQuery).
- Must have recent work experience with BigQuery or one of the following: Snowflake, Redshift, Azure SQL Data Warehouse, or Vertica.
- Develop tools to manage configuration and deployment of our GCP environment.
- 5 years of recent cloud DataOps experience, including proficiency with data warehouse technologies, preferably BigQuery or one of the following: Snowflake, Redshift, Azure SQL Data Warehouse, or Vertica.
- Experience with infrastructure-as-code (IaC) technologies such as Terraform or ArgoCD.
Updated 6 days ago