Experience using ETL/ELT tools and technologies such as Talend, Informatica, SSIS, or dbt is a plus. Experience using Spark, Kafka, Hadoop, or similar distributed data technologies is a plus. Expertise in building cloud data warehouses in Snowflake, Redshift, BigQuery, or analogous architectures.
Hands-on working experience with the AWS Glue ETL tool and good knowledge of any of the other ETL tools (Informatica Cloud, Talend, etc.). Expertise with the Python language and Apache Spark. Strong AWS development experience for data ETL/pipeline/integration/automation work.
Full-time
Data Engineer with 6-10 years' experience in ETL, data ingestion, and implementation of data warehousing and data lake solutions. Strong PySpark/Python programming experience: partitioning, parallel distributed computation, and Spark cluster concepts.
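The partitioning concept this listing asks about can be illustrated without a cluster: Spark-style hash partitioning assigns each record to a partition by hashing its key modulo the partition count, so all records sharing a key land together. A minimal pure-Python sketch of the idea (the `partition_by_key` helper and the sample records are illustrative, not from any posting; Spark's real `HashPartitioner` differs in detail):

```python
from collections import defaultdict

def partition_by_key(records, key_fn, num_partitions):
    """Assign each record to a partition by hash(key) % num_partitions,
    mimicking the behavior of a Spark-style hash partitioner."""
    partitions = defaultdict(list)
    for record in records:
        pid = hash(key_fn(record)) % num_partitions
        partitions[pid].append(record)
    return dict(partitions)

events = [("user_a", 1), ("user_b", 2), ("user_a", 3), ("user_c", 4)]
parts = partition_by_key(events, key_fn=lambda r: r[0], num_partitions=4)

# Records sharing a key always land in the same partition, which is what
# makes per-key operations (groupByKey, reduceByKey) cheap after a shuffle.
```

Note that Python salts string hashes per process, so the concrete partition IDs vary between runs, but the co-location guarantee within a run always holds.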
3+ years' experience in data storage/big data platform implementation, with a preference for hands-on experience implementing and performance-tuning Hadoop/Spark deployments. Monitor performance, troubleshoot, and tune ETL processes as appropriate using tools in the AWS ecosystem.
Solid understanding of data engineering principles, data modeling, data warehousing, and ETL/ELT processes, encompassing data testing, validation, and reconciliation procedures. Proficiency in SQL and in Spark with either Python or Scala.
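The ETL steps this listing names (extract, validate, load, reconcile) follow a common shape that can be sketched in a few lines. A minimal illustration using Python's built-in sqlite3 as a stand-in warehouse; the table, column names, and sample rows are invented for the example:

```python
import sqlite3

# Extract: raw rows as they might arrive from a source system.
raw_rows = [
    {"id": 1, "amount": "19.99"},
    {"id": 2, "amount": "5.00"},
    {"id": 3, "amount": "not-a-number"},  # bad record, should be rejected
]

def transform(rows):
    """Validate and cast; return (clean_rows, rejects) so the load
    can be reconciled against the extract."""
    clean, rejects = [], []
    for row in rows:
        try:
            clean.append((row["id"], float(row["amount"])))
        except ValueError:
            rejects.append(row)
    return clean, rejects

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, amount REAL)")

clean, rejects = transform(raw_rows)
conn.executemany("INSERT INTO sales VALUES (?, ?)", clean)
conn.commit()

# Reconciliation: loaded + rejected must equal extracted.
loaded = conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0]
assert loaded + len(rejects) == len(raw_rows)
```

The reconciliation assertion at the end is the part these postings keep emphasizing: every extracted record must be accounted for, either loaded or explicitly rejected.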
Minimum 0-2 years' experience. Data analysis skills: strong in querying with SQL and Kusto (Spark and Azure ML can be good add-ons on a CV but don't seem to be must-haves). Design, implement, automate, and maintain large-scale enterprise data ETL processes.
Spark and Airflow for ETL and DAG management. Python, Django, JavaScript, React, and gRPC as the main languages and frameworks of choice. Postgres, S3, Elasticsearch, Dynamo, and Snowflake as data stores.
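"DAG management," as in Airflow, essentially means running tasks in dependency order and refusing cycles. The ordering part can be sketched with the standard library's `graphlib` (the task names here are made up, and Airflow's real API looks quite different):

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on,
# like upstream tasks in an Airflow DAG.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

# static_order() yields every task after all of its dependencies;
# it raises CycleError if the graph is not actually acyclic.
order = list(TopologicalSorter(dag).static_order())
```

A scheduler like Airflow layers retries, scheduling intervals, and parallel execution of independent tasks on top of exactly this topological-ordering core.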
Experience with ETL orchestration tools such as Airflow and Jenkins. Required qualifications, capabilities, and skills: Bachelor's or Master's degree in Computer Science, Engineering, or a related field; developing and implementing data ETL pipelines within AWS. Preferred qualifications, capabilities, and skills: experience with distributed computing frameworks such as Apache Spark.
Full-time
Minimum of 3 years' experience in data engineering, with substantial work on ETL pipeline construction, preferably in environments utilizing Fivetran, dbt, Snowflake, and Looker. Deep understanding of the complete data stack, including Apache Hadoop, Apache Spark, Spark Streaming, and Kafka, and the ability to adapt to and learn new technologies.
Use of Spark, Java, and Python to build secure, scalable, and reliable ETL pipelines on the AWS, GCS, and Azure cloud platforms; use of SQL and data-warehouse data modeling concepts to design, transform, and store modeled data in data-warehouse tools including Redshift, Google BigQuery, and Snowflake.
#DataScience #MachineLearning #ExperimentalDesign #AI #Python #PyTorch #Tensorflow #Algorithms #MLModeling #DataEngineering #DataWrangling #ETL #DataMatching #DeepLearning #LargeLanguageModel #GenerativeAI #InsuranceIndustry #BigData #Hadoop #Spark #CloudComputing #Agile #DataVisualization #Tableau #QlikView #D3js
You’ve built and maintained an ETL pipeline using a data warehouse like BigQuery or Redshift. Experience working with streaming and batch data processing tools like Apache Beam, Spark, Flink, etc.
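The batch-versus-streaming distinction this listing draws often comes down to windowing: a streaming engine like Beam or Flink aggregates unbounded input in fixed time windows rather than over a finished dataset. A tumbling-window sketch in pure Python (the event format, values, and 10-second window size are all illustrative):

```python
from collections import defaultdict

def tumbling_window_sums(events, window_seconds):
    """Group (timestamp, value) events into fixed, non-overlapping
    windows and sum values per window, as a streaming engine would."""
    windows = defaultdict(float)
    for ts, value in events:
        window_start = (ts // window_seconds) * window_seconds
        windows[window_start] += value
    return dict(windows)

events = [(0, 1.0), (5, 2.0), (12, 3.0), (19, 4.0), (25, 5.0)]
print(tumbling_window_sums(events, window_seconds=10))
# {0: 3.0, 10: 7.0, 20: 5.0}
```

A batch job would compute one total over the whole list; the streaming formulation emits a running result per window, which is why the two modes need different tooling.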
Remote
3+ years of Spark experience using Scala/Python for ETL pipelines. Document database designs and ETL processes. Proficiency in AWS services like Kinesis, S3, Redshift, DMS, and Athena.
$140,000 - $170,000 a year, Full-time
Strong knowledge of Apache Spark and Databricks architecture. Collaborate with data engineers, data scientists, and analysts to optimize and streamline data workflows, ETL processes, and analytical pipelines on the Databricks platform.
$120,000 - $140,000 a year, Full-time
Based out of the US. Some experience in the insurance domain/data is a must. Programming languages: SQL, Python. Technologies: IaaS (AWS, Azure, or GCP), the Databricks platform, Delta Lake storage, Spark (PySpark). Good to have: Airflow, Splunk, Kubernetes, Power BI, Git, Azure DevOps.
spark etl jobs in Weehawken, NJ
FEATURED BLOG POSTS
How to Call Out of Work
No matter how happy we are with our jobs, there are days when we feel overwhelmed and want to call out of work. No, don't feel guilty. It's expected because we're humans, and we can't control the uncertainties of life. But the problem lies in how to call out of work without seeming uncommitted to work, especially if you seldom get work-free days.
What is Seasonal Employment?
Depending on where you are in your career, you might have first-hand experience with seasonal employment. Seasonal employment can be a great way to expand your skill set and earn extra cash while helping businesses meet seasonal increases in demand.
How to Avoid a Bad Hire
"A bad hire is a new employee who doesn't meet the minimum performance, quality, and culture-fit standards you set when you began sourcing and recruiting. Additionally, bad hires will immediately show signs of self-interest instead of an interest in their role and the company."
How to Ask for a Letter of Recommendation
When the job board you subscribe to finally posts your dream job, you may feel like the stars have aligned. But part of securing a position that matches your career plan is ensuring you address all the application basics. You know, the resume, the cover letter, the portfolio. It seems like you've got this in the bag — until you realize they want a letter of recommendation, too!
16 Tech Jobs You can Get Without a College Degree
You might think that if you don’t have a computer science, information technology, or related degree, then there’s absolutely no way you can break into the technology field and score a high-paying tech job. But this is a misconception. There are actually tons of tech jobs out there that don’t require a college degree. Instead, employers are more interested in the skills that you can offer. So, read on to learn more about how to land tech jobs without a degree.
What Are SMART Goals?
When it comes to achieving our goals, there’s a lot of noise to work through. A study by the University of Scranton has found that only 8% of people who set New Year’s resolutions actually achieve them. Our busy lives might be one reason for this. Another, even more important reason, is our approach to goal setting. Being too vague, too ambitious or simply unclear on the timeframe can set us up for failure.
In-House vs Outsourcing Recruiting: Which is Better?
When looking at in-house vs outsourcing recruiting, it is important to nail down the benefits for each and whether those benefits outweigh the risks that follow.