GDIA Data Engineer
- Work as part of an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment of the Data Platform.
- Implement methods to automate all parts of the pipeline, minimizing labor in development and production.
- Experience with large-scale solutioning and operationalization of data warehouses, data lakes, and analytics platforms on GCP.
- Design and build production data engineering solutions to deliver our pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow (Apache Beam), Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer (Apache Airflow), Cloud SQL, Compute Engine, Cloud Functions, and App Engine.
- Migrate existing Big Data pipelines into Google Cloud Platform.
- Minimum 3 years of in-depth experience with Java/Python.
- Minimum 2 years of experience building data engineering pipelines and data warehouse systems, with the ability to understand ETL principles and write complex SQL queries.
Updated 29 days ago