- Experience using modern containerization software including Docker, OpenShift, and Kubernetes
- Experience with Snowflake, Microsoft Power BI, Splunk, Securonix, and Cyware
- Proficiency with automation tools such as Ansible, Chef, or Puppet
- Project management experience
- Industry-related certifications such as PMP, GSLC, GISF, GSEC, CISSP
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
$128,900 - $195,400 a year, Full-time, Remote
Proficiency in data technologies, such as relational databases, data warehousing, big data platforms (e.g., Hadoop, Spark), data streaming (e.g., Kafka), and cloud services (e.g., AWS, GCP, Azure).
Experience with distributed data and computing tools, including Spark, Databricks, Hadoop, Hive, AWS EMR, or Kafka. As a big data engineer at Booz Allen, you'll implement data engineering activities on some of the most mission-driven projects in the industry.
$93,300 - $212,000 a year, Full-time
Architect, design, and build distributed systems that process and apply business logic for Big Data workloads, serve thousands of clients, and support advanced analytics using tools such as Snowflake, BigQuery, Databricks, Kafka, Airflow, dbt, Looker, MongoDB, etc.
Full-time
Extensive hands-on experience implementing data migration and data processing using AWS services: VPC/SG, EC2, S3, Auto Scaling, CloudFormation, Lake Formation, DMS, Kinesis, Kafka, NiFi, CDC processing, Redshift, Snowflake, RDS, Aurora, Neptune, DynamoDB, CloudTrail, CloudWatch, Docker, Lambda, Spark, Glue, SageMaker, AI/ML, API Gateway, etc.
The ideal candidate will have a strong background in data engineering and be proficient in using the ELK (Elasticsearch, Logstash, Kibana) stack and Databricks for data processing and analytics. Experience working with data streaming technologies like Apache Kafka or Amazon Kinesis. Familiarity with big data technologies such as Apache Spark is a plus.
Full-time
Experience with at least 3 of the technologies/tools mentioned here: HAProxy, Kafka, Big Data/Hadoop, Presto, Spark, Airflow, Pinot, Druid, OpenSearch, GCP Dataproc. Experience scaling production systems running Big Data tools like Spark, Hadoop, Apache Druid, Looker.
Full-time
Data platform: Hadoop, HBase, OpenSearch, Kafka, Spark, Flink, Go, Java, SQL, Databricks, Snowflake, BigQuery. Experience with big data and workflow-management technologies like Hadoop, Spark, Redshift, Athena, Airflow, etc.
$145,000 - $190,000 a year
Big Data & Streaming: 2+ years using big data and/or streaming technologies (e.g., Apache Spark, Apache Kafka, Apache Flink). Data: Good understanding of data and data-processing tools (e.g., Spark, Kafka, SQL), of relational database technologies, and of analytics databases (e.g., Redshift, Vertica, Snowflake).
Full-time
Experience using big data technologies (Snowflake, Airflow, Kubernetes, Docker, Helm, Spark, PySpark). Research and evaluate new technologies in the big data space to guide our continuous improvement.
Experience designing and building highly available, distributed systems for data extraction, ingestion, normalization, and processing of large data sets, in real time as well as in batch, to be used across engineering teams, with orchestration frameworks like Airflow, Kubeflow, or other pipeline tools.
Experience building stream-processing systems using solutions such as Kafka, Storm, or Spark Streaming. As a tech lead specializing in data engineering, you are expected to code and contribute to the stack.
You have experience building systems that orchestrate and execute complex big-data workflows leveraging Apache Spark, Apache Kafka, and the Hadoop stack, preferably on Google Cloud Platform.
Full-time
Any Big Data certification (e.g., Cloudera's CCP or CCA). Experience with Big Data analytics and business intelligence, and with industry-standard tools integrated with the Hadoop ecosystem (R, Python).
$51.73 - $59.73 an hour
Experienced in delivering highly scalable solutions across Java, Big Data, and cloud platforms (AWS, PCF, Kubernetes). Tools: Java 8, Spring Boot, Python, PyCharm, IntelliJ, Node.js, npm, VS Code, Express.js, microservices, PCF, AWS EC2, Glue, AWS Lambda, Step Functions, EKS, VPC, CloudWatch, S3, SQS, SNS, Apache Kafka, Kubernetes, GitLab, Jenkins, Grafana, Prometheus, Hadoop, Spark, Hive, Sqoop, Terraform.
kafka big data jobs
FEATURED BLOG POSTS
How to Describe Your Personality with Examples
Imagine you’re in an elevator with the CEO of your dream company and you get to talking. The conversation is going well and you start to imagine yourself working for their company when the CEO turns around and asks you, “Tell me a bit about yourself.” Would this catch you off guard, or would you be able to give a clear and succinct description of who you are?
4 Ways to Make Your Job Posting More Inclusive
According to a Glassdoor survey,
How to Calculate Net Income
Understanding your finances can be daunting even if you’re good with numbers. Your net income, in particular, is a key metric for determining how well you’re doing financially and whether your current way of operating is sustainable or not.
To ATS or not to ATS
As hiring is becoming more analytical and data-driven, companies have found ways to incorporate technology to help hire and recruit more efficiently. ATS, also known as an applicant tracking system, has become one of the most widely adopted technological recruiting tools to date. In fact, according to data from Capterra:
6 Best Ways to End a Cover Letter with Examples
Including a cover letter with your resume is a great way to introduce yourself to the hiring manager, tell them why you’re the ideal fit for the role, and provide context about your personal situation. A strong cover letter will give you an advantage over other applicants. But it’s important that you structure it properly and write it powerfully so that it carries an impact. This article will discuss how to end a cover letter effectively so you catch the eye of a hiring manager and increase your odds of landing an interview. Read on to learn more.
How to Write a Follow-Up Email for a Job Application?
Most times, we have to do more than submit a "sugar-coated" resume to land our dream jobs. Going the extra mile to follow up on your job application can increase your chances of employment. Additionally, it may even help you get confirmation sooner on whether you are seriously being considered for the job or not.
How to Address a Cover Letter With Examples
It’s easy to get caught up in focusing on your resume – how it looks, what it says, and whether it’s going to land you a job interview. Because there is a big focus on building the perfect resume, job searchers often overlook the importance of a high-quality cover letter. Your cover letter plays a huge role in your first impression. It humanizes you and provides context for your resume.