Proficiency in, and day-to-day work with, data engineering platforms and tools such as Databricks, Apache Spark, Delta Lake, MLflow, and SQL.
Full-time
Advanced knowledge of cloud computing technologies such as Apache Spark, Azure Data Factory, Azure DevOps, Azure ML (Machine Learning), Hadoop, Microsoft Azure, Databricks, AWS, and Google Cloud. Leverage a broad set of modern technologies, including Python, R, and Spark, to analyze and gain insights from large data sets.
Full-time
A high level of proficiency in ETL processes, and demonstrated hands-on experience with technologies such as Databricks, Python, Spark, Scala, JavaScript/JSON, SQL, and Jupyter Notebooks, will be essential in ensuring efficient data processing and analysis.
Your expertise in ETL, Databricks, Python, Spark, Scala, JavaScript/JSON, SQL, and Jupyter Notebooks will be essential in ensuring efficient data processing and analysis. Job responsibilities: design, develop, and implement end-to-end data pipelines, utilizing ETL processes and technologies such as Databricks, Python, Spark, Scala, JavaScript/JSON, SQL, and Jupyter Notebooks.
Deep understanding of Apache Spark, Delta Lake, and their integration within the Databricks environment. Solid experience with SQL, Python, and Spark for data processing and transformation.
Full-time
Experience with distributed data and computing tools, including Spark, Databricks, Hadoop, Hive, AWS EMR, or Kafka.
$75,600 - $172,000 a year · Part-time
Demonstrates advanced functional knowledge of data visualization libraries such as matplotlib or ggplot2, and knowledge of other visualization tools such as Microsoft Power BI and Tableau. Advanced knowledge of techniques such as dimension reduction, natural language processing, sentiment analysis, anomaly detection, and geospatial analytics.
Advanced knowledge of cloud computing technologies such as Apache Spark, Azure Data Factory, Azure DevOps, Azure ML (Machine Learning), Hadoop, Microsoft Azure, Databricks, AWS, and Google Cloud.
This position requires an active TS/SCI clearance and will be an onsite role reporting to the Pentagon in Arlington, VA. What you will do: design, develop, and implement end-to-end data pipelines, utilizing ETL processes and technologies such as Databricks, Python, Spark, Scala, JavaScript/JSON, SQL, and Jupyter Notebooks.
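The "end-to-end pipeline" pattern these listings describe boils down to three stages: parse raw records (the JSON step), clean and filter them, and write them into a SQL store. A minimal, illustrative sketch using only the Python standard library is below; in practice these pipelines would run as Spark jobs on Databricks, and the table name, field names, and sample records here are hypothetical, chosen only to show the shape of the stages.

```python
import json
import sqlite3

def extract(raw: str) -> list[dict]:
    """Parse newline-delimited JSON records (the JSON ingestion step)."""
    return [json.loads(line) for line in raw.splitlines() if line.strip()]

def transform(records: list[dict]) -> list[tuple]:
    """Drop invalid rows and normalize fields."""
    return [(r["id"], r["name"].strip().lower()) for r in records if "id" in r]

def load(rows: list[tuple], conn: sqlite3.Connection) -> int:
    """Write the cleaned rows into a SQL table and report the row count."""
    conn.execute("CREATE TABLE IF NOT EXISTS users (id INTEGER, name TEXT)")
    conn.executemany("INSERT INTO users VALUES (?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]

# Hypothetical raw input: one record per line, one of them missing its id.
raw = '{"id": 1, "name": " Ada "}\n{"name": "no id"}\n{"id": 2, "name": "Grace"}'
conn = sqlite3.connect(":memory:")
count = load(transform(extract(raw)), conn)
print(count)  # → 2 (the record without an id is filtered out)
```

On Databricks the same three stages would typically be `spark.read.json`, DataFrame transformations, and a Delta Lake write, but the structure is the same.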
Minimum of 5 years of hands-on skills in Python, Apache Spark, Hive, Cassandra, Snowflake, Elastic, Lucene, Databricks, Hadoop, Redshift, Scala, Java, or other relevant data engineering technologies.
Big data technologies: distributed computing tools like Apache Spark and Hadoop. Familiarity or experience with developing solutions for proposal within any U.S. Government agencies, including but not limited to: Department of Transportation (e.g., FAA), NASA (including JPL), Department of Commerce (e.g., NOAA), or Department of Energy (e.g., HQ, NNSA).
$150
Experience with distributed data or computing tools, including Spark, Databricks, Hadoop, Hive, AWS EMR, or Kafka. Experience with developing in Spark (Scala), PySpark, Python, Spark SQL, or SQL.
Experience developing applications and utilities using Python, Java, or Scala, leveraging tools like Presto, AWS Athena, Spark, or AWS Glue. Understanding of data lakehouse technologies like Iceberg, Databricks, Redshift Spectrum, Athena, and Snowflake.
Full-time
Basic to substantial experience in one or more of the following commercial/open-source data discovery/analysis platforms: RStudio, Spark, KNIME, RapidMiner, Alteryx, Dataiku, H2O, SAS Enterprise Miner (SAS EM) and/or SAS Visual Data Mining and Machine Learning, Microsoft AzureML, IBM Watson Studio or SPSS Modeler, Amazon SageMaker, Google Cloud ML, SAP Predictive Analytics.
$184,000 - $248,400 a year · Full-time
Familiarity with "big data" tooling (Hadoop, Spark, Redshift, HDFS). Infrastructure automation: develop and maintain infrastructure as code (IaC) using tools such as Terraform, Ansible, and CFT. Our client, a leading provider of Artificial Intelligence Integration Services and innovative data platforms for government and defense clients, is recruiting a full-time Cyber Security Engineer.
$120 - $200
Spark job · Company: Boys And Girls Club Of Bluffton in Springfield, VA
FEATURED BLOG POSTS
How to Write an Address Correctly: Explained with Examples
It's hard to imagine a scenario where a text or phone call just won't do these days. With communication at our fingertips, you may think learning how to write an address is a superfluous skill. But it's a skill that will come in handy when you need to fill out healthcare forms, ship a package, order food delivery, or even apply for new jobs.
What is Employment Participation Rate
According to economists, there are four factors of production that go into creating higher quality goods at lower prices. These are
How to Get Pay Stubs (From a Previous Employer, Too!)
Pay stubs are important documents that show your earnings in a given period, as well as any deductions made toward your health insurance or pension contributions. They're also useful for finding out how much a recent salary raise has bumped up your monthly net income.
Structured vs Unstructured Interviews
The goal of an interview is to evaluate candidates based on their skills, personality, and knowledge. You want to choose the best candidate from your candidate pool, so the interview is something you can't mess up. As you begin planning your interview process, one of the major decisions you'll face is whether to run a structured or unstructured interview. So let's dive into the differences and sort out which circumstances warrant which interview process.
How to Describe Your Personality with Examples
Imagine you’re in an elevator with the CEO of your dream company and you get to talking. The conversation is going well and you start to imagine yourself working for their company when the CEO turns around and asks you “tell me a bit about yourself.” Would this catch you off guard or would you be able to give a clear and succinct description of who you are?
4 Ways to Make Your Job Posting More Inclusive
According to a Glassdoor survey,
To ATS or not to ATS
As hiring is becoming more analytical and data-driven, companies have found ways to incorporate technology to help hire and recruit more efficiently. ATS, also known as an applicant tracking system, has become one of the most widely adopted technological recruiting tools to date. In fact, according to data from Capterra: