Expertise in designing and implementing scalable data pipelines using tools such as Apache Spark, Databricks, and Kafka. This role requires a highly skilled, hands-on data professional with expertise in Databricks, Azure, AWS, Kafka, and both SQL and NoSQL databases.
Full-time
AWS Cloud, Kubernetes, Microservices, MongoDB, Kafka, Spring Boot, DevOps, Jenkins, React, Node.js, GraphQL, OAuth.
Experience with open-source technologies like Java, Spring Boot, Kafka, etc. Experience in Jenkins/CloudBees, GitLab, Apigee, Istio, Kubernetes, Rancher, Splunk.
$150,480 a year, Full-time
Design and implement scalable data pipelines using tools like Apache Kafka or Apache Airflow to facilitate real-time data processing and batch data workflows.
● Experience in designing and implementing data pipelines using tools like Apache Kafka or Apache Airflow.
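As a rough illustration of the kind of pipeline work these listings describe, here is a minimal sketch of one transform stage that could run inside a Kafka consumer loop or an Airflow task. All names and the event schema below are hypothetical, not taken from any listing:

```python
import json

def transform_event(raw: bytes) -> dict:
    """Parse one raw event (as it might arrive from a Kafka topic) and
    normalize it into the record shape a downstream job expects.
    The field names here are illustrative assumptions."""
    event = json.loads(raw)
    return {
        "user_id": str(event["user_id"]),
        "action": event.get("action", "unknown").lower(),
        "amount_cents": int(round(float(event.get("amount", 0)) * 100)),
    }

def run_pipeline(raw_events):
    """Apply the transform to an iterable of raw byte strings, skipping
    malformed input instead of failing the whole batch. In production
    the iterable would be a Kafka consumer or an Airflow task's input."""
    out = []
    for raw in raw_events:
        try:
            out.append(transform_event(raw))
        except (ValueError, KeyError):
            continue  # drop bad events; a real pipeline might dead-letter them
    return out
```

The same transform function serves both the real-time path (called per message) and the batch path (mapped over a file of events), which is one common way to keep streaming and batch workflows consistent.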
Data Technologies: Kafka, Spark, Hadoop, Presto, Alloy – a data management and data governance platform. Kafka, Spark, Hadoop, MongoDB, ElasticSearch, MemSQL, Sybase IQ / ASE.
Our primary tech stack consists of Golang, Kubernetes, Postgres, and Elasticsearch, but we also use tools like ScyllaDB, Redis, Kafka, SQS, and various other AWS services. Experience with Docker, Kubernetes, GraphQL, Postgres, Elasticsearch, ScyllaDB, Redis, Kafka, SQS, databases, or AWS services.
Full-time, Remote
Experience in designing, building, and deploying production-level data pipelines using tools from the Hadoop stack (HDFS, Hive, Spark, HBase, Kafka, NiFi, Oozie, Apache Beam, Apache Airflow, etc.). 2+ years of experience with streaming using Spark, Flink, Kafka, or NoSQL.
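The core aggregation behind the streaming experience this listing asks for is windowing. Below is a plain-Python sketch of a tumbling-window count, the basic operation that Spark Structured Streaming or Flink performs over a Kafka topic; here it runs over an ordinary iterable so the idea is visible without a cluster:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Group (timestamp_secs, key) events into fixed, non-overlapping
    ("tumbling") windows and count occurrences per key.
    Returns {(window_start, key): count}."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_secs)  # floor to window boundary
        counts[(window_start, key)] += 1
    return dict(counts)
```

In a real engine the same logic is distributed, checkpointed, and tolerant of late data; the window assignment itself is no more than this modulo arithmetic.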
$188,640 a year, Full-time
Relevant certifications in cloud platforms (AWS, Azure, GCP) and data engineering tools (Databricks, Kafka) are a plus. Leverage Kafka (or a similar messaging queue) as an enterprise integration tool to ingest and integrate near-real-time, high-volume, diverse data from multiple sources.
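One common pattern when using Kafka as an enterprise integration layer, as this listing describes, is to wrap each source's payload in a shared envelope before publishing, so downstream consumers can handle diverse sources uniformly. A minimal sketch (the envelope field names are assumptions, not from the listing):

```python
import json
import time
import uuid

def to_envelope(source: str, payload: dict) -> bytes:
    """Wrap a source-specific payload in a common envelope for an
    integration topic. The bytes returned are what a Kafka producer
    would send as the message value."""
    envelope = {
        "event_id": str(uuid.uuid4()),   # unique id for dedup downstream
        "source": source,                # e.g. "crm", "erp", "web"
        "ingested_at": time.time(),      # ingestion timestamp, epoch secs
        "payload": payload,              # original source-specific data
    }
    return json.dumps(envelope).encode("utf-8")
```

Using the `source` field as the message key is one way to keep each system's events ordered within a partition while still mixing sources on one topic.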
Full-time
In-depth knowledge of big data technologies such as Hadoop, Spark, Kafka, and cloud platforms such as AWS, Azure, GCP, Snowflake, Databricks, etc.
Full-time
Big data technologies like Hadoop (Hortonworks, Cloudera, Azure HDInsight, Amazon EMR), Spark, Kafka, Elasticsearch, and others.
Full-time
Knowledge of data streaming products such as Kafka, Confluent Platform, Azure Event Hubs, Azure Stream Analytics, Azure Data Factory, Azure Synapse, Databricks, Google BigQuery and Google Pub/Sub, SAP Data Intelligence, and SAP DataSphere.
$65 - $70
Strong expertise in programming languages - Python, R, or Scala - and technologies like TensorFlow, PyTorch, Kubernetes, Spark, Airflow, Kafka, and open-source LLMs.
Full-time
Senior Data Engineer (Python, Kafka, Databricks, AWS). 3+ years of experience with distributed data/computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or MySQL).
The ideal candidate will have a minimum of 10 years of experience in Java development, with extensive expertise in Java 8 or Java 11, microservices architectures, Spring Boot, and real-time data streaming with Kafka.
$125,000 a year, Full-time
Providing support for the engineering team on Kubernetes clusters, Dataproc, Kafka, HDP, and CDP cluster-related issues. Two or more years of experience with Big Data on GCP - Dataproc, Dataflow, Cloud Functions, BigQuery, Composer, GKE, IA, etc.
Full-time
FEATURED BLOG POSTS
Learn How to Respond to an Interview Request With Templates
Job interviews are an inevitable part of any job search, so handling them well is key to building a fulfilling career. Regardless of whether you’re looking at a more junior role or striving for a role as a business executive, you need to maintain your professionalism every step of the way in order to stay in the game.
How to Write an Address Correctly: Explained with Examples
It's hard to imagine a scenario where a text or phone call just won't do these days. With communication at our fingertips, you may think learning how to write an address is a superfluous skill. But it's a skill that will come in handy when you need to fill out healthcare forms, ship a package, order food delivery, or even apply for new jobs.
What Is the Employment Participation Rate?
According to economists, there are four factors of production that go into creating higher quality goods at lower prices. These are
How to Get Pay Stubs (From Previous Employers Also!)
Pay stubs are an important document showing your earnings in a given period, as well as any deductions made towards your health insurance or pension contributions. They’re also excellent for finding out how much your recent salary raise has bumped up your monthly net income.
How to Write a Job Description?
It might be tempting to overlook the importance of a well-written job description. After all, if you’ve posted job ads before and ended up with tons of resumes in hand, it’s easy to assume that this will always be the case, regardless of how your job ad reads. But, in reality, you really can’t take getting an influx of resumes for granted.
How to Get a W2 From Previous Employers
When tax time rolls around, the last thing you want to worry about is having to track down a W-2 from your former employer. Many times you won’t have to because the IRS requires companies to send these forms to all current and former employees who have earned more than $600 in the last year. Unfortunately, there are employers who don’t do what they’re supposed to. There are even times where something else may happen that prevents the W-2 from getting where it’s supposed to go.
How to Ask Someone to be a Reference + Email Templates
One part of the job-hunting process that frequently gets overlooked is putting together a list of good references. Most of the time we focus on creating the perfect resume, writing an awesome cover letter, and getting our hands on letters of recommendation. We think about what outfit we’ll wear to the job interview, how we’ll answer those tricky questions, and what our career plan looks like. But, in fact, having multiple references lined up who will speak favorably about you to a potential employer is critical to landing a job. This aspect of job searching really can’t be ignored.