GCP Data Engineer
- The successful candidate will be responsible for designing and developing the transformation and modernization of big data solutions on GCP, integrating native GCP services and third-party data technologies, and building new data products on GCP. We are looking for candidates with a broad set of technology skills across these areas who can demonstrate the ability to design and develop the right solutions with an appropriate combination of GCP and third-party technologies for deployment on GCP.
- Experience with large-scale solutioning and operationalization of data warehouses, data lakes, and analytics platforms on GCP.
- Design and build production data engineering solutions to deliver our pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow (Apache Beam), Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer (Apache Airflow), Cloud SQL, Compute Engine, Cloud Functions, and App Engine.
- Migrate existing Big Data pipelines into Google Cloud Platform.
- Minimum 2 years of experience in data engineering pipelines and building data warehouse systems, with the ability to understand ETL principles and write complex SQL queries.
Updated 15 days ago