Familiarity with the following technologies: Hadoop, Kafka, Airflow, Hive, Presto, Athena, S3, Aurora, EMR, Spark. Experience utilizing or leading implementations leveraging ETL tools (Informatica/Talend); BI reporting tools such as MicroStrategy and Microsoft Power BI; data modeling tools such as Erwin; Oracle, SQL Server, NoSQL, JDBC, UNIX shell scripting, Perl and Java, XML/JSON files, SAS, Python; and AWS cloud-native technologies such as S3, Athena, and Redshift.
Full-time
Tech stack: Google Cloud, HDFS, Spark, Scala, Python (optional), Automic/Airflow, BigQuery, Kafka, API, Druid. Experience building complex near-real-time (NRT) streaming data pipelines using Apache Kafka, Spark Streaming, and Kafka Connect, with a strong focus on stability, scalability, and SLA adherence.
Expert knowledge of and experience with Databricks, the Lakehouse architecture, Structured Streaming, Kafka, Delta Lake, Delta Live Tables, Delta Sharing, etc. Explore and build proofs of concept using open-source NoSQL technologies such as HBase, DynamoDB, and Cassandra, and distributed stream processing frameworks like Apache Spark and Kafka Streams.
Remote
The Data Architect will use their skills and experience with technologies including Kubernetes, Kafka, Airflow, Linux, AWS, and Python to spearhead the evolution and enhancement of their existing data infrastructure as they scale up adoption of their software-as-a-medical-device product.
Data engineering experience with tools such as: big data tools (Hadoop, Spark, Kafka, etc.), relational SQL and NoSQL databases (including Postgres and Cassandra), and data pipeline and workflow management tools (Azkaban, Luigi, Airflow, etc.).
Full-time
Our tech stack includes (but is not limited to) languages and technologies like Golang, Python, SQL, shell scripts, AWS EC2, Athena, Aurora / PostgreSQL, Kafka / MSK, Kubernetes, SQLite, Airflow, Spark, and more.
Includes Snowflake, Apache Airflow, AWS cloud infrastructure (e.g., Kinesis, Kubernetes/EKS, Lambda, Aurora RDS PostgreSQL), Python, and TypeScript. Experience building and managing production-grade data pipelines using tools such as Airflow, dbt, DataHub, and MLflow.
Full-time
Design and implement distributed data processing pipelines using Spark, Hive, Python, Airflow, and other tools and languages prevalent in the Hadoop ecosystem.
Remote
Experience working with Airflow as a workflow management tool and building operators to connect, extract, and ingest data as needed. Experience in real-time data ingestion using GCP Pub/Sub, Kafka, Spark, or similar.
Design, develop, and modernize/migrate pipelines to Databricks. As a Sr. Big Data Engineer, you will work with a variety of talented teammates and be a driving force in technical initiatives that will accelerate analytics at Client.
Remote
US Tech Solutions is a global staff augmentation firm providing a wide range of talent on-demand and total workforce solutions. Big Data Engineer. We’re looking to expand our Big Data Engineering team to keep pace.
Remote
Programming skills in Java/Scala, Python, Shell scripting, and SQL. You will be working on projects that build data artifacts to answer questions about consumer behavior, commerce trends, consumer touchpoint preferences and more.
Remote
Working, hands-on knowledge of modern data pipeline tools (dbt, Snowflake, Airflow) and data visualization tools (Looker). Practical hands-on experience with building large-scale event-driven systems and relevant services (Kafka, Confluent, RabbitMQ) and streaming-based data pipelines.
Remote
Experience with business rule management systems like Drools will also come in handy. Development skills in Hadoop, Spark, PySpark, and Hive. Beaverton, OR; remote for some time, with the candidate open to relocating to be onsite in the future.
Remote
The team uses a variety of tools, technologies, and languages to build software like Delta Lake, Hadoop, Kinetica, Solr, Spark, Kafka, Python, Java, Ruby, Scala, Airflow, Luigi, EMR, Databricks, etc.
$124,000 - $204,250 a year
FEATURED BLOG POSTS
The Effects of Workplace Racism and Sexism
One day it's a covert statement to a mother returning to work after maternity leave. Another day it's a lingering gaze at an employee enjoying a culturally rich meal. These microaggressions (or sometimes macroaggressions) can take an employee from a confident, high-performer to one that feels insecure being themselves at work. Your employees engage with people with different ideas and feel most comfortable and valued when they can work without losing their cultural, racial, and gender identity. While most employers know this, why have workplace racism and sexism often been neglected?
When Rage Applying Strikes: How to Identify Unserious Candidates
As the job market remains highly competitive, we have seen a surge in "rage applying." This is when candidates apply to multiple jobs, often without considering whether they are truly interested in the role. Rage applying goes hand-in-hand with quiet quitting. Often, employees want to entertain the thoughts and feelings of leaving their job, but they aren't necessarily serious about leaving yet. Meanwhile, other employees engaging in this trend are actually trying to find a better role. As a recruiter, it can be hard to identify who are the real applicants in a sea full of quiet quitters, but understanding rage applying and identifying red flags will certainly help.
How to Increase Job Ad Exposure
In today's competitive job market, writing quality job ads is critical for attracting top talent to your organization. While networking and candidate referrals are prime real estate for finding qualified candidates, nothing beats the tried-and-true method of writing an extraordinary job ad. But while writing a great job ad is the first step, what's more important is increasing visibility. You could have the most detailed, well-written ad on the internet, but if no one sees it, then you are wasting time (and potentially money!). Employers often believe that job boards are the root of the problem, but you can learn how to increase job ad exposure by tweaking a few steps of your recruitment process.
How to Navigate Hiring Out of State
The job market has shifted significantly in recent years. The accelerated adoption of technology has not only pushed many companies into remote working arrangements but also increased the availability of supporting tools and technologies (i.e., video conferencing and collaboration software).
Building a Candidate Pipeline Through Internships
Building a candidate pipeline through a great internship program for local college students and recent graduates at local universities is a great and cost-effective way to attract and retain top talent. By offering meaningful and impactful work experiences, regular feedback, coaching, and mentorship, you can create a positive internship experience that will make your organization a sought-after destination for future employees. This not only benefits the organization in the short-term but also in the long-term, as you'll have a pool of well-trained and experienced candidates who may be interested in full-time employment once they graduate. Furthermore, building relationships with local universities and college students can increase brand awareness and build a positive reputation for your organization in the local community.
Hiring Transparency
Transparency in hiring refers to the open and honest communication and information sharing that takes place between employers and job candidates. It encompasses all aspects of the hiring process, from posting job descriptions to providing feedback on performance during and after the interview process. In today's job market, hiring transparency has become increasingly important for both employers and candidates alike.
Recruitment strategies that are weird, but actually work
In the current candidate-driven job market, recruiters are looking for unique ways to attract talent. Some have even resorted to (dare we say it?) recruitment strategies on the border of weird and wacky. What can we learn from the unusual recruitment tactics that are being used and actually getting results? Here's a rundown of some unique recruitment strategies that actually work.