Tech Skills: Scala, Spark, GCP, Dataproc, Hadoop, Airflow, SBT, Maven, Docker, Kubernetes, PySpark, Jenkins, BigQuery. Experience with workflow management tools such as Jenkins and Airflow.
ETL/BI: Airflow, dbt, Fivetran, Matillion, Looker, Tableau. Experience working with a cloud-based data warehousing and analytics stack (Airflow, dbt, Snowflake, AWS).
Full-time
Familiar with Spark, MLlib, Databricks MLflow, Apache Airflow, and similar related technologies. Good foundation in Machine Learning (ML), Deep Learning, Large Language Models (LLMs), and Natural Language Processing (NLP).
$144,000 - $169,000 a year, Full-time
Tech stack: Google Cloud, HDFS, Spark, Scala, Python (optional), Automic/Airflow, BigQuery, Kafka, APIs, Druid. Expertise in building idempotent workflows using orchestrators like Automic, Airflow, Luigi, etc.
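The "idempotent workflows" requirement above is orchestrator-agnostic: a task is idempotent when re-running it for the same logical date leaves the target in the same state, typically by overwriting a partition rather than appending. As an illustrative sketch (the function name and in-memory warehouse dict are hypothetical, not from any listing or from the Airflow API):

```python
# Idempotent-load pattern, in plain Python with no orchestrator
# dependency. Each run owns the partition for its run date and
# replaces it wholesale, so a retry or backfill of the same date
# cannot duplicate rows. The `warehouse` dict stands in for a table.

def load_partition(warehouse, run_date, rows):
    """Idempotent load: replace the partition for run_date wholesale."""
    warehouse[run_date] = list(rows)  # overwrite, never append
    return warehouse

warehouse = {}
load_partition(warehouse, "2024-01-01", [{"id": 1}, {"id": 2}])
# Re-running the same task for the same date leaves the state unchanged:
load_partition(warehouse, "2024-01-01", [{"id": 1}, {"id": 2}])
assert warehouse == {"2024-01-01": [{"id": 1}, {"id": 2}]}
```

In a real DAG the same idea usually appears as `INSERT OVERWRITE` into a date partition, or delete-then-insert keyed on the scheduler's logical date.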
Experience using developer-oriented data pipeline and workflow orchestration tools (e.g. Airflow (preferred), dbt, Dagster, or similar). Manage and maintain Airflow ETL jobs. Strong experience with data warehousing (e.g. Snowflake (preferred), Redshift, BigQuery, or similar).
Apache Kafka, Apache Flink, Apache Spark, Trino (Presto), Apache Airflow/Dagster, Apache Superset, AWS S3, Snowflake, Amplitude, CDP (e.g. Segment.com).
Full-time
Strong understanding of data processing technologies and distributed computing frameworks (e.g. Hadoop, Hive, Airflow, Spark, dbt, Kafka, Flink). Hands-on experience with database systems (e.g. SQL, NoSQL, vector, search) and data warehousing solutions (e.g. Snowflake, Redshift, BigQuery).
Knowledge of thermal airflow testing: fans, pumps, liquid loop cold plate testing, and augmentation of system thermal simulation models. Experience in numerical modeling based on Computational Fluid Dynamics (CFD) and Finite Element Analysis (FEA) (convergence, discretization, meshing).
We use various open-source technologies (such as Apache Spark, Presto, Comet, and Apache Airflow, to name a few) to develop services for Adobe customers and partners.
Full-time
Expertise in designing, implementing, and operating data engineering platforms and scalable web services, including schema management, transport (Kafka, SQS), warehousing and storage (AWS Redshift, AuroraDB, MySQL, Postgres), processing (streaming and batch ETL/ELT pipelines, Kinesis, Airflow, Spring Batch), analytics (Spark, Hadoop, BigQuery, Databricks, Snowflake), and business intelligence reports (Tableau, QuickSight).
$127,500 - $150,000 a year, Full-time
Working knowledge of data platform technologies such as: AWS Services (RDS, ECS/EKS, Lambda, S3), ETL tools (dbt, Airflow), and data warehouses (Snowflake, Databricks).
Our stack: Python, SQLAlchemy, a little Bash, JavaScript/React, AWS (S3, EC2), PostGIS. The infrastructure and cloud orchestration pipeline uses Terraform, Apache Airflow, and Astronomer.
Experience with modern data processing pipelines and technologies (e.g. Hadoop, Spark, Airflow). NVIDIA is developing groundbreaking solutions in some of the world's most exciting technology areas, including Virtual Reality, Artificial Intelligence, Deep Learning, and Autonomous Vehicles.
Design and implement distributed data processing pipelines using Spark, Hive, Python, Airflow, and other tools and languages prevalent in the Hadoop ecosystem. Expert knowledge of and experience in Databricks, Lakehouse, Structured Streaming, Kafka, Delta Lake, Delta Live Tables, Delta Sharing, etc.
Remote
Experience with scientific libraries in Python (numba, pandas) and machine learning tools and frameworks (scikit-learn, tensorflow, torch, etc.).
Airflow jobs in Cupertino, CA
FEATURED BLOG POSTS
How to Write an Address Correctly: Explained with Examples
It's hard to imagine a scenario where a text or phone call just won't do these days. With communication at our fingertips, you may think learning how to write an address is a superfluous skill. But it's a skill that will come in handy when you need to fill out healthcare forms, ship a package, order food delivery, or even apply for new jobs.
What is Employment Participation Rate
According to economists, there are four factors of production that go into creating higher quality goods at lower prices. These are
How to Get Pay Stubs (From Previous Employers Also!)
Pay stubs are important documents that show your earnings in a given period, as well as any deductions made toward your health insurance or pension contributions. They're also excellent for finding out how much your recent salary raise has bumped up your monthly net income.
How to Write a Job Description?
It might be tempting to overlook the importance of a well-written job description. After all, if you’ve posted job ads before and ended up with tons of resumes in hand, it’s easy to assume that this will always be the case, regardless of how your job ad reads. But, in reality, you really can’t take getting an influx of resumes for granted.
How to Get a W2 From Previous Employers
When tax time rolls around, the last thing you want to worry about is having to track down a W-2 from your former employer. Many times you won't have to, because the IRS requires companies to send these forms to all current and former employees who have earned more than $600 in the last year. Unfortunately, some employers don't do what they're supposed to. There are even times when something else happens that prevents the W-2 from getting where it's supposed to go.
4 Ways to Make Your Job Posting More Inclusive
According to a Glassdoor survey,
To ATS or not to ATS
As hiring is becoming more analytical and data-driven, companies have found ways to incorporate technology to help hire and recruit more efficiently. ATS, also known as an applicant tracking system, has become one of the most widely adopted technological recruiting tools to date. In fact, according to data from Capterra: