Technologies: IaaS (AWS, Azure, or GCP), the Databricks platform, Delta Lake storage, and Spark (PySpark, Spark SQL). Develop and optimize ETL pipelines from various data sources using Databricks in the cloud (AWS, Azure, etc.).
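Listings like this one center on the extract-transform-load pattern. A minimal, library-free sketch of that pattern is below; on Databricks the same steps would typically use spark.read, DataFrame transformations, and a Delta Lake write, and every name here (extract, transform, load, the sample fields) is illustrative, not taken from any posting.

```python
def extract(rows):
    """Extract: pull raw records from a source (here, an in-memory list)."""
    return list(rows)

def transform(records):
    """Transform: normalize fields and drop malformed rows."""
    out = []
    for r in records:
        if r.get("amount") is None:
            continue  # skip rows that fail validation
        out.append({"id": r["id"], "amount": round(float(r["amount"]), 2)})
    return out

def load(records, target):
    """Load: append cleaned rows to the target store (a list stand-in)."""
    target.extend(records)
    return len(records)

raw = [{"id": 1, "amount": "19.99"}, {"id": 2, "amount": None}]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
print(loaded, warehouse)  # 1 [{'id': 1, 'amount': 19.99}]
```

The three-stage split is what lets each stage be optimized or swapped independently, which is what "optimize ETL pipelines" in these postings usually means in practice.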
Engineer ETL frameworks with cutting-edge technologies such as Snowflake, Airflow, Apache Spark, and Python.
Strong understanding of current data engineering concepts, techniques, and technologies, e.g. data contracts, software engineering design patterns, Airflow, Dataflow, Spark, BigQuery, Snowflake, Databricks, etc.
Knowledge of modern big data approaches preferred, e.g. PySpark, Apache Spark, Azure Databricks, Azure SQL, Azure ML, R, Python, Java, etc. Working knowledge of data ETL (Extract, Transform, Load), and experience with SQL, Alteryx, Power Query, and other agile technologies.
6 years of experience must include: Apache Spark, Hive, Kafka; Scala, Python; Ab Initio, Oracle; cloud technologies: ORAAS, IBM S3 (Simple Storage Service); IntelliJ, Eclipse, SQL Developer, Arcadia, Tectia, Tableau, Hue, Jenkins, Sonar; OpenShift, Unravel, Cloudera Manager; and Red Hat Enterprise Linux 8, TIBCO.
iPaaS integration platform (Boomi), Databricks (Spark, Delta Lake, etc.). Design, develop, and implement data integration solutions using sales/marketing and ERP APIs and Databricks ETL/ELT tools.
Proficiency in Databricks for data analytics, ETL, and data engineering. Familiarity with big data technologies such as Apache Spark is a plus. The ideal candidate will have a strong background in data engineering and be proficient in using the ELK (Elasticsearch, Logstash, Kibana) stack and Databricks for data processing and analytics.
Experience working across the AWS ecosystem as a whole, including migrating ETL pipelines from Talend/Informatica/Ab Initio to Hadoop/Glue/Spark. Should be competent in designing and developing architectures for data migration, data ingestion, and data storage; building data lakes and their various layers; and ETL using Hadoop tools like Spark and AWS tools like Glue and EMR.
Proficiency in data integration technologies, ETL/ELT processes, data mapping tools, and MDM platforms like Reltio, Informatica MDM, or similar. Experience with messaging software, AWS Hadoop, and Apache Spark.
Extensive experience in designing, engineering, and managing data lake ingestion, validation, transformation, and consumption services, leveraging cloud data tools like Hive, Spark, EMR, Glue ETL and Catalog, Snowflake, etc.
SQL/NoSQL/Postgres, Amazon Redshift, Hadoop/Spark, Azure, and data pipeline and ETL development (Python, Scala, AWS Glue, Data Factory).
PySpark, SQL, CI/CD pipelines for Databricks, ETL processes, project management, and Delta Lake storage. 8+ years of experience in developing and implementing ETL pipelines from various data sources using Databricks in the cloud.
Skilled in creating MLOps pipelines, employing tools like Apache Spark or Databricks. As the Senior Data Engineer, you will be responsible for designing and building additional ETL pipelines to broaden core datasets.
Experience with Databricks/Apache Spark Structured Streaming and/or Kafka. Develop ETL workflows and data pipelines to ingest data, using AWS Data Migration Service (DMS), Scala, Kafka, RESTful APIs, and other technologies as determined by the client, from multiple transactional systems to the target (including ODS, data marts, and the data lake) according to documented logic and source-to-target mappings.
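The "documented source-to-target mappings" mentioned here are usually just a lookup from source field names to target field names, applied to every ingested row. A hypothetical sketch follows; the mapping table and field names are invented for illustration and not taken from any client's specification.

```python
# Hypothetical source-to-target mapping table: source field -> target field.
SOURCE_TO_TARGET = {
    "cust_id": "customer_id",
    "ord_ts": "order_timestamp",
    "amt": "order_amount",
}

def apply_mapping(source_row, mapping=SOURCE_TO_TARGET):
    """Rename source fields to their documented target names,
    dropping any field the mapping does not cover."""
    return {target: source_row[src]
            for src, target in mapping.items()
            if src in source_row}

row = {"cust_id": 42, "ord_ts": "2023-01-05T10:00:00Z", "amt": 99.5, "debug": "x"}
print(apply_mapping(row))
# {'customer_id': 42, 'order_timestamp': '2023-01-05T10:00:00Z', 'order_amount': 99.5}
```

Keeping the mapping as data rather than code is what lets the same ingestion job serve multiple transactional systems: each source system gets its own mapping table while the pipeline logic stays unchanged.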
4+ years of Spark experience using Scala/Python for ETL pipelines. Document database designs and ETL processes. Proficiency in AWS services like Kinesis, S3, Redshift, DMS, and Athena.
spark etl jobs in Fort Lee, NJ
FEATURED BLOG POSTS
How to Professionally Reschedule a Job Interview Without Destroying Your Chances
You’ve practiced answering common interview questions and refined your “greatest weakness.” Nothing can stop you until BAM! The flu hits your household. Or you ran over a nail and popped a tire en route to the interview. When you need to pivot, there’s a good, better, and best way to reschedule a job interview. Here’s how to do it professionally, so you can nail the gig when the timing is right.
What is a W-9 and How to Fill One Out
When you began working for yourself, you probably didn’t account for the tax reporting work that will fall on your shoulders each year. If you’re a freelancer, independent contractor, or business owner, filing your taxes is not as simple as uploading your W-2 form into some online tax preparation software. Most self-employed people need to complete a W-9 as a step for accurately reporting their earnings to the IRS. Below, you can learn how to fill out a W-9 and when to submit it.
Tightening the HR budget in 2023
With the state of the economy still uncertain, 2023 is expected to be approached with much anticipation. Human Resource leaders have many concerns, including how they will manage to accomplish their goals with budget belts already getting snug. Let’s look at some of the factors that the new year is projected to bring for HR and how to prioritize budgets to reach human capital objectives.
A Comprehensive Guide to Becoming a Better Conversationalist
Have you ever stood awkwardly next to someone at a party because you didn’t know what to say to them? How about at a networking event or on a first date? You're not alone if you’ve ever experienced this uncomfortable silence. Many people struggle to master the art of being a great conversationalist.
Why is Non-Verbal Communication Really Important?
In a world where words and phrases rule daily communication, you may wonder why non-verbal communication is important. Whether you realize it or not, you communicate more with nonverbal actions than you do with verbal communication. When you interact with your peers, people are reading your body language, facial expressions, voice, and many other factors that help fill in blanks that words can't fill.
Making Hybrid Work More Efficient
Covid was a catalyst for change in the work environment. Keeping people safe and healthy was the initial goal for employers, but an unintended result was the considerable demand for remote work. Onsite work shifted dramatically to remote work, which is now transforming into a combination of the two: hybrid work.
How Can HR Technology Help Retain Employees?
Human resources' rapid adoption of technology has led to new ways of streamlining human capital management. Based on the IEE Global Study, these technologies changed how HR handled recruitment and retention in 2022. This includes tech like