Principal Software Engineer - Data Pipelines
Responsibilities:
- Architect and develop large-scale, distributed data processing pipelines using technologies such as Apache Spark and Apache Beam, with Apache Airflow for orchestration.
- Implement best practices for data governance, data quality, and data security across the entire data lifecycle.
- Stay up-to-date with the latest trends, technologies, and industry best practices in the big data and data engineering domains.
Qualifications:
- Minimum of 10 years of experience in backend software development, with a strong focus on data engineering and big data technologies.
- Proven expertise in Apache Spark, Apache Beam, and Apache Airflow, with a deep understanding of distributed computing and data processing frameworks.