Knowledge and experience working with Spark or Databricks is a plus. We help organizations like Uber, GoDaddy, MGM, Siemens, Stanford University, and the State of California build distributed software development teams and deliver transformational digital solutions.
Implementation experience with Amazon EMR or Big Data technologies such as Hadoop, Spark, Presto, Hive, and Hue will be a major advantage.
We use proprietary and open-source technologies: Kafka, Spark, Iceberg, Airflow, Presto, etc. Proficiency in at least one of the following programming languages: Python, Scala, or Java. Experience in multi-threaded, concurrent programming and synchronization. Cloud technology experience on platforms like AWS, Microsoft Azure, or Google Cloud. Experience developing Big Data applications using Java, Spark, and Kafka is a huge plus.
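The synchronization experience mentioned above can be illustrated with a minimal sketch: a counter shared across threads, guarded by a lock. The class and figures here are invented for illustration, not taken from any of these employers' codebases.

```python
import threading

# A counter guarded by a lock; without the lock, concurrent increments
# would race and silently lose updates.
class SafeCounter:
    def __init__(self):
        self._value = 0
        self._lock = threading.Lock()

    def increment(self):
        with self._lock:
            self._value += 1

    @property
    def value(self):
        return self._value

def worker(counter, n):
    for _ in range(n):
        counter.increment()

counter = SafeCounter()
threads = [threading.Thread(target=worker, args=(counter, 1000)) for _ in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter.value)  # 8000
```

The same discipline (protect shared mutable state, join before reading results) carries over to the JVM languages these listings name.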
Have experience with both proprietary and open-source big data technologies and platforms (Snowflake, Vertica, Hive, Spark, Presto, Airflow).
Explore and build proofs of concept using open-source NoSQL technologies such as HBase, DynamoDB, and Cassandra, and distributed stream processing frameworks like Apache Spark and Kafka Streams. Design and implement distributed data processing pipelines using Spark, Hive, Python, Airflow, and other tools and languages prevalent in the Hadoop ecosystem.
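The pipeline shape this listing describes — extract raw events, transform them, load the result — can be sketched in plain Python. The stage names and sample data below are illustrative; in a real deployment each stage would be a Spark job or Hive query orchestrated by Airflow.

```python
from collections import Counter

def extract():
    # Stand-in for reading raw event lines from HDFS or S3.
    return ["click home", "click search", "view home"]

def transform(lines):
    # Tokenize and count events: the map/reduce shape that Spark
    # generalizes across a cluster.
    return Counter(word for line in lines for word in line.split())

def load(counts):
    # Stand-in for writing to a warehouse table.
    return dict(counts)

result = load(transform(extract()))
print(result["click"])  # 2
```

Keeping each stage a pure function like this is what makes the logic portable between a local prototype and a distributed Spark job.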
As an integral part of our team, you will not only tackle operational challenges at every layer of the system stack, but also set up essential data science infrastructure tools like Spark, Databricks, Snowflake, Kubernetes, and Kafka.
Experience with AWS, Spark, Databricks, data pipeline frameworks, streaming tools (Kafka), and API design.
What you will likely bring: Strong experience in Microsoft Azure. Proficiency in data processing frameworks such as Apache Spark, or in cloud-native data processing services (Azure Data Lake, Azure Data Factory, Azure Databricks, Azure Synapse, Snowflake, Cosmos DB). Experience with data integration and ETL (Extract, Transform, Load) processes, including tools like cloud-native orchestration services.
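The ETL process this listing names can be shown as a toy Extract-Transform-Load pass over CSV text. The source string, column names, and threshold are made up for illustration; in Azure this mapping would typically live in a Data Factory pipeline or a Databricks notebook.

```python
import csv
import io

# Invented sample data standing in for a source extract.
RAW = "order_id,amount\n1,19.99\n2,5.00\n3,42.50\n"

def extract(text):
    # Parse CSV rows into dicts of strings.
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Cast types and keep only orders above an (arbitrary) threshold.
    return [{"order_id": int(r["order_id"]), "amount": float(r["amount"])}
            for r in rows if float(r["amount"]) > 10]

def load(rows):
    # Stand-in for writing to a sink table: report the total loaded amount.
    return round(sum(r["amount"] for r in rows), 2)

total = load(transform(extract(RAW)))
print(total)  # 62.49
```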
Experience with data warehouse and big data technologies such as Azure Synapse Analytics, Azure Data Lake Storage, Spark, Kafka, Databricks, and containers. Implement data processing pipelines using Azure Data Factory, Databricks, Azure Machine Learning, Azure Logic Apps, Azure Function Apps, Azure Kubernetes Service containers, and other big data services.
Strong programming skills in languages such as Python, Java, or Scala, with experience in data processing frameworks like Apache Spark or Apache Flink. Data Processing Frameworks: Lead the implementation and optimization of data processing frameworks and technologies, such as Apache Hadoop, Apache Spark, and Apache Flink, to enable efficient data processing and analysis.
GCP Data Engineer (Standard) with skills in Big Data, Kafka, Python, Scala, and Apache Spark, for location Bangalore, India.
6-8 years of strong experience in Spark, Python, shell scripting, PostgreSQL, Hadoop, AWS (S3, EC2, SNS, SQS, Lambda, ECS, Glue, IAM, and CloudWatch), and Databricks (Delta Lake, notebooks, pipelines, cluster management, Azure/AWS integration).
Proficiency with the architecture, deployment, performance tuning, and troubleshooting of open source data analytics and data governance technologies, especially Apache Hive, Hadoop/HDFS, Trino, Druid, Spark, and related software.
Configure and maintain systems such as Kubernetes (Helm, Charts, ArgoCD), Kafka, Spark, Elastic, Bitbucket / GitLab, Nexus, and Jenkins. Radian is helping to ensure the American dream of homeownership in even bigger and better ways with industry-leading mortgage insurance and a comprehensive suite of mortgage, risk, real estate, and title services.
5+ years of validated ability with distributed data technologies (Hadoop, Hive, Presto, Spark, etc.). 3+ years of experience with cloud-based technologies (Databricks, S3, Azure Blob Storage, Notebooks, AWS EMR, Athena, Glue, etc.).
Spark Jobs in Austin, TX
FEATURED BLOG POSTS
5 Common Interview Mistakes
Everyone's interview process is unique in some form or fashion. Like most, your interview process is crafted so you can get the most information out of your candidates to increase hiring confidence and make the right hiring decisions. However, there are often small problems in interview processes that could ultimately affect the success of hiring decisions.
Structured vs Unstructured Interviews
The goal of an interview is to evaluate candidates based on their skills, personality, and knowledge. You want to choose the BEST candidate from your candidate pool, so the interview is something you can't mess up. As you begin planning your interview process, one of the major decisions you'll face is whether the interview should be a structured vs unstructured interview. So let's take a dive into the differences and sort out which circumstances warrant which interview process.
How to Describe Your Personality with Examples
Imagine you’re in an elevator with the CEO of your dream company and you get to talking. The conversation is going well and you start to imagine yourself working for their company when the CEO turns around and asks you “tell me a bit about yourself.” Would this catch you off guard or would you be able to give a clear and succinct description of who you are?
4 Ways to Make Your Job Posting More Inclusive
According to a Glassdoor survey,
How to Calculate Net Income
Understanding your finances can be daunting even if you’re good with numbers. Your net income, in particular, is a key metric for determining how well you’re doing financially and whether your current way of operating is sustainable or not.
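The net income calculation the post refers to reduces to simple arithmetic: revenue minus all expense categories. The categories and figures below are invented for illustration.

```python
# Net income = total revenue minus all expenses
# (cost of goods sold, operating expenses, interest, taxes).
def net_income(revenue, cogs, operating_expenses, interest, taxes):
    return revenue - cogs - operating_expenses - interest - taxes

# Hypothetical figures:
print(net_income(120_000, 40_000, 30_000, 2_000, 10_000))  # 38000
```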
To ATS or not to ATS
As hiring is becoming more analytical and data-driven, companies have found ways to incorporate technology to help hire and recruit more efficiently. ATS, also known as an applicant tracking system, has become one of the most widely adopted technological recruiting tools to date. In fact, according to data from Capterra:
6 Best Ways to End a Cover Letter with Examples
Including a cover letter with your resume is a great way to introduce yourself to the hiring manager, tell them why you’re the ideal fit for the role, and provide context about your personal situation. A strong cover letter will give you an advantage over other applicants. But it’s important that you structure it properly and write it powerfully so that it carries an impact. This article will discuss how to end a cover letter effectively so you catch the eye of a hiring manager and increase your odds of landing an interview. Read on to learn more.