Experience working with job orchestration (e.g., Airflow / Databricks Workflows). Experience with compute technologies like EMR and Databricks. RVO Health was created by joining teams from Red Ventures and UnitedHealth Group’s Optum Health.
Full-time · Updated 4 days ago
Good to have: Airflow, Splunk, Kubernetes, Power BI, Git, Azure DevOps. Deep understanding of one or more big data computing technologies such as Databricks or Snowflake. Disaster recovery strategies for Databricks environments, ensuring data resilience and minimal downtime in case of failure.
Updated 4 days ago
1+ years of experience working on AWS (Kinesis / S3 / Redshift / DMS / Athena). Experience with other data platforms like DBT, Snowflake, Delta Lake, etc. 3+ years of experience working on Spark (RDDs / DataFrames / Dataset API) using Scala/Python to build and maintain complex ETL pipelines.
Full-time · Updated 4 days ago
8+ years of data engineering experience focused on Snowflake, Databricks, Kafka, AWS Glue, Airflow, and AWS services. Hands-on experience designing, building, and maintaining complex ETL pipelines using Snowflake, Databricks, Kafka, and AWS Glue.
$130,000 - $160,000 a year · Full-time · Updated 4 days ago
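The ETL pipeline work these listings describe follows the same basic pattern regardless of engine. As a rough, engine-agnostic sketch of that extract-transform-load split (the field names, CSV layout, and in-memory "warehouse" here are all hypothetical stand-ins for sources and sinks like Kafka, Glue, or Snowflake):

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Parse raw CSV text into records (stand-in for a Kafka/Glue source)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(records: list[dict]) -> list[dict]:
    """Normalize types and drop malformed rows, as a Spark/Glue job might."""
    out = []
    for r in records:
        try:
            out.append({
                "user_id": int(r["user_id"]),
                "amount_usd": round(float(r["amount"]), 2),
            })
        except (KeyError, ValueError):
            continue  # skip rows that fail validation
    return out

def load(rows: list[dict], sink: list) -> None:
    """Append to an in-memory sink (stand-in for Snowflake/Databricks)."""
    sink.extend(rows)

raw = "user_id,amount\n1,19.999\n2,not_a_number\n3,5.5\n"
warehouse: list[dict] = []
load(transform(extract(raw)), warehouse)
```

In production the same three stages would typically become separate Airflow tasks or Spark jobs, with the orchestrator handling scheduling and retries rather than a direct function call.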
Direct hands-on experience with highly scalable data pipelines using big data technologies (Spark, Hive, Airflow, DBT, Parquet / ORC, Kafka / streaming, etc.). Database / warehouse: PostgreSQL, DynamoDB, Databricks Unity Catalog.
Full-time · Updated 5 days ago
Experience with Apache Spark, Databricks, or similar distributed processing systems. Implement data pipelines using Apache Spark/Databricks. You will primarily be working with Apache Spark, data lake storage, and Airflow (in Python), along with other cloud-based technologies.
Full-time · Updated 5 days ago
Strong knowledge of Extraction, Transformation and Loading (ETL) processes using frameworks such as Azure Data Factory, Synapse, Databricks, and Informatica; able to gather requirements from stakeholders or analyze existing code, then perform enhancements or new development.
Updated 4 days ago
Proficiency in designing and implementing ETL processes and data integration workflows using tools like Apache Airflow, Informatica, or Talend. Snowflake or Databricks certifications and/or hands-on-keyboard experience.
Updated 4 days ago
Design and implement high-performance data processing pipelines using Spark Streaming, Databricks, Snowflake, and other technologies, while ensuring data security and privacy. Expertise in Spark Streaming, Apache Airflow, cloud (Azure preferred), and on-premises open-source data systems.
$215,000 - $322,250 a year · Full-time · Updated 4 days ago
Advanced experience in SQL in big data warehouse systems such as Snowflake, BigQuery, Databricks, etc. Experience with workflow orchestration management engines such as Airflow, Dagster, DBT, etc.
Full-time · Updated 3 days ago
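The "advanced SQL in big data warehouse systems" these roles ask for mostly means writing aggregations and rankings over large tables. A minimal sketch of a typical warehouse-style query, using SQLite purely as a stand-in for Snowflake or BigQuery (the `orders` table and its contents are hypothetical):

```python
import sqlite3

# SQLite stands in for a warehouse engine such as Snowflake or BigQuery;
# the `orders` table and its rows are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("east", 10.0), ("east", 15.0), ("west", 7.5)],
)

# A typical warehouse-style aggregation: total revenue per region,
# largest first.
rows = conn.execute(
    """SELECT region, SUM(amount) AS revenue
       FROM orders
       GROUP BY region
       ORDER BY revenue DESC"""
).fetchall()
```

The same `GROUP BY` / `ORDER BY` shape carries over directly to warehouse dialects; what changes at scale is mostly clustering, partitioning, and cost-aware query design.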
Experienced in implementing standardized pipelines with automated testing, Airflow scheduling, Azure DevOps for CI/CD, Terraform for infrastructure as code, and Splunk for monitoring. Technologies: IaaS (AWS, Azure, or GCP), Databricks platform, Delta Lake storage, Spark (PySpark, Spark SQL).
Updated 4 days ago
Fluent in Python, SQL, PySpark, Spark SQL, Databricks, AWS. Experience with Terraform (AWS, Databricks), shell scripting, Git, the Databricks CLI, and other command-line tools. Working knowledge of a modern orchestration platform such as DBT, Airflow, etc.
Updated 4 days ago
Experience with workflow orchestration management engines such as Airflow, Dagster, DBT, etc. Experience with cloud services (AWS, Google Cloud, Microsoft Azure, etc.).
Updated 4 days ago
Our infrastructure – including USDC, a blockchain-based dollar – helps businesses, institutions and developers harness these breakthroughs and capitalize on this major turning point in the evolution of money and technology.
Full-time · Updated 3 days ago
Experience in optimizing data models and access patterns for modern warehouse and orchestration platforms such as Snowflake, Redshift, Databricks, Spark, DBT, Apache Airflow, and Prefect.
Full-time · Remote · Updated 5 days ago
FEATURED BLOG POSTS
4 Ways to Make Your Job Posting More Inclusive
According to a Glassdoor survey,
How to Calculate Net Income
Understanding your finances can be daunting even if you’re good with numbers. Your net income, in particular, is a key metric for determining how well you’re doing financially and whether your current way of operating is sustainable or not.
To ATS or not to ATS
As hiring is becoming more analytical and data-driven, companies have found ways to incorporate technology to help hire and recruit more efficiently. ATS, also known as an applicant tracking system, has become one of the most widely adopted technological recruiting tools to date. In fact, according to data from Capterra:
6 Best Ways to End a Cover Letter with Examples
Including a cover letter with your resume is a great way to introduce yourself to the hiring manager, tell them why you’re the ideal fit for the role, and provide context about your personal situation. A strong cover letter will give you an advantage over other applicants. But it’s important that you structure it properly and write it powerfully so that it carries an impact. This article will discuss how to end a cover letter effectively so you catch the eye of a hiring manager and increase your odds of landing an interview. Read on to learn more.
How to Write a Follow-Up Email for a Job Application?
Most times, we have to do more than submit a "sugar-coated" resume to land our dream jobs. Going the extra mile to follow up on your job application can increase your chances of employment. Additionally, it may even help you get confirmation sooner on whether you are seriously being considered for the job or not.
How to Address a Cover Letter With Examples
It’s easy to get caught up in focusing on your resume – how it looks, what it says, and whether it’s going to land you a job interview. Because there is a big focus on building the perfect resume, job searchers often overlook the importance of a high-quality cover letter. Your cover letter plays a huge role in your first impression. It humanizes you and provides context for your resume.
How to Call Out of Work
No matter how happy we are with our jobs, there are days when we feel overwhelmed and want to call out of work. No, don't feel guilty. It's expected because we're humans, and we can't control the uncertainties of life. But the problem lies in how to call out of work without seeming uncommitted to work, especially if you seldom get work-free days.