DataOps Engineer
- We collaborate with customers to strategize, navigate, and accelerate an ideal path forward for digital transformation via assessments, migration, or modernization.
- The data lake is built on Amazon S3 and Amazon EMR, with Apache Airflow for workflow management.
- You have experience building and running data lake platforms on AWS, exposure to operating PySpark-based ETL jobs in Apache Airflow and Amazon EMR, and expertise in monitoring services such as Amazon CloudWatch.
- Operate the current data lake deployed on AWS with Amazon S3, Amazon EMR, and Apache Airflow
- Overall 5+ years of experience in the software industry, including experience developing and architecting data applications using Python or Scala, Airflow, and Kafka on the AWS data platform.
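The operational stack described above (Airflow orchestrating PySpark ETL jobs on Amazon EMR) typically submits work to a cluster as an EMR "step" wrapping `spark-submit`. A minimal sketch of that step definition is below; the function name, script location, and bucket are hypothetical, but the dict shape matches what boto3's `emr.add_job_flow_steps` and Airflow's `EmrAddStepsOperator` expect.

```python
def pyspark_emr_step(name: str, script_s3_uri: str, *extra_args: str) -> dict:
    """Build the step definition EMR expects for a spark-submit job.

    `command-runner.jar` is EMR's built-in launcher; its Args are the
    command line run on the cluster's master node.
    """
    return {
        "Name": name,
        "ActionOnFailure": "CONTINUE",
        "HadoopJarStep": {
            "Jar": "command-runner.jar",
            "Args": [
                "spark-submit",
                "--deploy-mode", "cluster",
                script_s3_uri,          # PySpark script staged in S3
                *extra_args,            # arguments passed to the script
            ],
        },
    }

# Hypothetical job: bucket name and script path are placeholders.
step = pyspark_emr_step(
    "daily_ingest",
    "s3://example-datalake/jobs/ingest.py",
    "--date", "2024-01-01",
)
```

In an Airflow DAG this dict would be passed via the operator's `steps=[step]` parameter, and a sensor would then poll the step until it completes or fails.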
Active Job · Updated 7 days ago