Integrate Python, AWS services (such as S3 and EMR), Redshift, Snowflake, Hadoop, Hive, Spring, Hibernate, Cassandra, Kafka, and ETL processes into Spark applications. Advanced proficiency in these technologies is expected.
Full-time · Updated 0 days ago
Advanced Scala, Python, AWS, Redshift, Snowflake, Hadoop, Hive, Spring, Hibernate, Cassandra, Kafka, and ETL. Strong proficiency in Scala and Apache Spark, including Spark SQL, Spark Streaming, and Spark MLlib.
Full-time · Updated 0 days ago
Your expertise in ETL, Databricks, Python, Spark, Scala, JavaScript/JSON, SQL, and Jupyter Notebooks will be essential in ensuring efficient data processing and analysis. Requires a high level of proficiency in ETL processes and demonstrated, hands-on experience with these technologies.
Full-time · Updated 0 days ago
We are seeking a Senior Data Architect with extensive experience in Databricks, Spark, Python, and ETL technologies to join our dynamic team. Responsibilities include leading the architecture and implementation of Spark- and Databricks-based ETL frameworks for large-scale enterprise systems.
Full-time · Updated 0 days ago
Design, develop, and implement end-to-end data pipelines, utilizing ETL processes and technologies such as Databricks, Python, Spark, Scala, JavaScript/JSON, SQL, and Jupyter Notebooks. Knowledge of Qlik/Qlik Sense, QVD/QlikView, and Qlik Production Application Standards (QPAS) is a huge plus.
Full-time · Updated 1 day ago
Knowledge of Qlik/Qlik Sense, QVD/QlikView, and Qlik Production Application Standards (QPAS) is a significant plus. Location & Travel: The Data Engineer position will be an on-site role based at the Pentagon in Northern VA.
Updated 1 day ago
Senior-level experience in designing, building, and operationalizing large-scale enterprise data solutions and applications using one or more GCP data and analytics services in combination with third-party tools: Spark, Hive, Databricks, Cloud Dataproc, Cloud Dataflow, Apache Beam/Cloud Composer, Bigtable, BigQuery, Cloud Pub/Sub, Cloud Storage, Cloud Functions, and GitHub.
Full-time · Updated 0 days ago
Hadoop, SQL database coding, Apache Spark, machine learning, natural language processing, and visualization tools such as Tableau. Assist with migrating existing ETL jobs into an Azure/Databricks cloud environment.
Full-time · Updated 0 days ago
Skill: Databricks Tech Lead. Python, Databricks, Perl, Spark, Kubernetes, Docker, and other cloud-native tools and technologies.
Updated 1 day ago
Familiarity with data integration techniques, ETL frameworks (e.g., Apache Spark), and workflow management tools (e.g., Airflow); blob storage, Redshift, Kafka, Hadoop, Spark, Hive, etc.
Updated 1 day ago
You have hands-on experience with the Databricks platform: creating ETL pipelines using notebooks, scheduling jobs, and building optimized Spark cluster configurations for various ETL workloads.
Full-time · Updated 1 day ago
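A scheduled job with a tuned cluster of the kind this listing describes might look roughly like the following Databricks Jobs API-style JSON fragment. Every name, node type, path, and value here is an illustrative assumption, not taken from any listing:

```json
{
  "name": "nightly-etl",
  "schedule": { "quartz_cron_expression": "0 0 2 * * ?", "timezone_id": "UTC" },
  "tasks": [
    {
      "task_key": "run_etl_notebook",
      "notebook_task": { "notebook_path": "/Repos/etl/nightly" },
      "new_cluster": {
        "spark_version": "13.3.x-scala2.12",
        "node_type_id": "i3.xlarge",
        "autoscale": { "min_workers": 2, "max_workers": 8 },
        "spark_conf": { "spark.sql.shuffle.partitions": "64" }
      }
    }
  ]
}
```

Autoscaling bounds and a shuffle-partition setting sized to the workload are typical of the "optimized cluster configuration" such roles ask for.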
Experience building ETL/ELT pipelines for complex data engineering projects (Airflow, dbt, and Great Expectations would be a plus). 2+ years of experience using a modern data stack (Spark, Snowflake) on cloud platforms such as AWS.
Updated 0 days ago
High level of proficiency in ETL processes and demonstrated, hands-on experience with technologies such as Databricks, Python, Spark, Scala, JavaScript/JSON, SQL, and Jupyter Notebooks.
Full-time · Updated 1 day ago
Technologies: IaaS (AWS, Azure, or GCP), the Databricks platform, Delta tables, Delta Lake storage, and Spark (PySpark, Spark SQL). Develop and optimize ETL pipelines from various data sources using Databricks on the cloud (AWS, Azure, etc.).
Updated 1 day ago
Data warehousing, data modeling, data mastering, data quality, ETL, Informatica, Alteryx, MSSQL, Oracle, Teradata, Hadoop, Snowflake, SSIS, SSRS, Power BI, Tableau, Python, R, Scala, Spark, and other programming languages; reporting, ETL, and other data tools.
Updated 1 day ago
spark etl jobs
FEATURED BLOG POSTS
To ATS or not to ATS
As hiring becomes more analytical and data-driven, companies have found ways to incorporate technology to hire and recruit more efficiently. An ATS, or applicant tracking system, has become one of the most widely adopted technological recruiting tools to date. In fact, according to data from Capterra:
6 Best Ways to End a Cover Letter with Examples
Including a cover letter with your resume is a great way to introduce yourself to the hiring manager, tell them why you’re the ideal fit for the role, and provide context about your personal situation. A strong cover letter will give you an advantage over other applicants. But it’s important that you structure it properly and write it powerfully so that it carries an impact. This article will discuss how to end a cover letter effectively so you catch the eye of a hiring manager and increase your odds of landing an interview. Read on to learn more.
How to Write a Follow-Up Email for a Job Application?
Most times, we have to do more than submit a "sugar-coated" resume to land our dream jobs. Going the extra mile to follow up on your job application can increase your chances of employment. Additionally, it may even help you get confirmation sooner on whether you are seriously being considered for the job or not.
How to Hire Remote Workers
Remote work used to be a thing of the future. However, with social, economic, and cultural events taking place across the country, it has now evolved into something that both job searchers AND companies are benefiting from. Remote work is multifaceted and can come in handy in a variety of situations. So, to help your small business take advantage of all of remote work's benefits, here is a short guide on how to hire remote workers.
How to Address a Cover Letter With Examples
It’s easy to get caught up in focusing on your resume – how it looks, what it says, and whether it’s going to land you a job interview. Because there is a big focus on building the perfect resume, job searchers often overlook the importance of a high-quality cover letter. Your cover letter plays a huge role in your first impression. It humanizes you and provides context for your resume.
How to Call Out of Work
No matter how happy we are with our jobs, there are days when we feel overwhelmed and want to call out of work. No, don't feel guilty. It's expected because we're humans, and we can't control the uncertainties of life. But the problem lies in how to call out of work without seeming uncommitted to work, especially if you seldom get work-free days.
What is Seasonal Employment?
Depending on where you are in your career, you might have first-hand experience with seasonal employment. Seasonal employment can be a great way to expand your skill set and earn extra cash while helping businesses meet seasonal increases in demand.