3 to 5 years of ETL development experience with any leading platform (Informatica, SnapLogic, Dell Boomi, Talend, or Sterling Integrator).
GCP Data Engineer (Standard) with skills in Big Data, Kafka, Python, Scala, and Apache Spark, for location Poland.
Knowledge of and/or experience working with SQL-on-Hadoop tools and technologies, including Hive, Impala, Presto, and others from an open-source perspective, and Hortonworks Data Flow (HDF), Dremio, Informatica, Talend, and others from a commercial-vendor perspective.
Experience designing and maintaining data warehouses and/or data lakes with big data technologies such as Spark/Databricks or distributed databases such as Redshift and Snowflake, along with experience housing, accessing, and transforming data in a variety of relational databases.
Hands-on experience with Hadoop, MapReduce, Hive, Pig, Sqoop, Spark, Kafka, and HBase (preferred). Hands-on experience with Talend, Informatica, or IICS integration tools and other ETL tools (at least 6 years).
Experience with data engineering frameworks: Apache Airflow, dbt, MLflow, Apache Spark, AWS services, Docker, Kubernetes.
Process streaming data by developing real-time data processing and streaming analytics solutions with Apache Spark Streaming and Structured Streaming in Databricks. Strong understanding of distributed computing principles and Apache Spark architecture.
Proficiency in the Cloudera suite, including Kafka, HDFS, HBase, Kudu, ZooKeeper, Hive, Impala, NiFi, Spark, Flink, Oozie, YARN, Atlas, Ranger, Ranger KMS, and KTS.
Big data distributed programming languages and ecosystems: Spark, Hadoop, MapReduce, Pig, Kafka. Common data science tools such as Python, R, PyTorch, TensorFlow, Keras, NLTK, spaCy, or Neo4j, and a good understanding of modelling platforms (Azure AutoML, SageMaker, Databricks, DataRobot, and H2O.ai).
Experience with cloud computing environments (AWS, Azure, or GCP) and data/ML platforms (Databricks, Spark). Python ecosystem preferred (R is acceptable), machine learning libraries and frameworks (e.g., TensorFlow, PyTorch, scikit-learn), and familiarity with data processing and visualization tools (e.g., SQL, Tableau, Power BI).
Familiarity with relevant technology, such as Big Data (Hadoop, Spark, Hive, BigQuery); Data Warehouses; Business Intelligence; and Machine Learning. Experience with Operational Technology equipment and related software, such as Honeywell Connected Plant, Emerson Plantweb/AMS, GE/Meridum APM, Aveva, Bentley, and OSIsoft PI.
Ability to use the in-memory capabilities of TIBCO Spotfire and Spark, as well as rich machine learning applications with easy-to-use data visualization of models using the built-in TERR (TIBCO Enterprise Runtime for R) engine and Revolution R/RStudio.
Strong programming skills in languages such as Python, Java, or Scala, with experience in data processing frameworks and technologies such as Apache Spark, Apache Kafka, or Hadoop. Knowledge of cloud platforms and services such as AWS, Azure, or Google Cloud Platform, and experience with cloud-based data services such as Amazon Redshift, Google BigQuery, or Azure SQL Database.
Monitoring and optimizing data infrastructure performance, reliability, and cost efficiency through automation, monitoring tools, and infrastructure as code (IaC) practices. Professional development opportunities to enhance skills and knowledge in data engineering, cloud computing, and analytics.
Tool expertise: Python, R, SQL, SAS. Good to have: experience working on big data, cloud, Hadoop, Spark, and end-to-end deployment.
talend spark jobs in Houston, TX
FEATURED BLOG POSTS
Making Hybrid Work More Efficient
Covid was a catalyst for change in the work environment. Keeping people safe and healthy was the initial goal for employers, but an unintended result was considerable demand for remote work. Onsite work has since been dramatically altered by remote work, which is now transforming into a combination of the two: hybrid work.
How Can HR Technology Help Retain Employees?
Human resources' rapid adoption of technology has led to new ways of streamlining human capital management. According to the IEEE Global Study, these technologies changed how HR handled recruitment and retention in 2022. This includes tech like
Why Is Time Management Important? 10 Crucial Reasons
We’ve all been there before. What starts as a relaxing evening scroll quickly becomes a full-blown binge. You blink, and it’s midnight, throwing off your entire next day before it even starts. At its worst, this indulgence might leave you feeling behind on things you planned to finish that night. This is why time management is important.
Minimizing Candidate Renegs During the Hiring and Onboarding Process
Candidates reneging on job offers or during the onboarding process can be a frustrating experience for any recruiter. In a talent-driven job market, it’s common for candidates to have more than one job offer to consider. It becomes a race against time to see which organization can offer the best career experience, compensation, and circumstances that secure the right employees.
10 Reasons to Be on Time at Work
Being punctual at work may not be something you’ve given much thought to, but it’s the foundation for building a successful career. All of your technical or job-specific skills will be in vain if your peers and superiors can’t trust you to show up on time and do the work. In fact, Simon Sinek once famously said that
10 Reasons Why Setting Realistic Goals Is Important
We’ve all heard how important it is to set professional and personal goals. Developing and establishing goals keeps us motivated and moving forward in life. But not all goals are created equal. If you’re chasing goals that are too lofty, you’ll end up disappointed when you cannot reach them. Setting goals that are achievable and measurable is the key to success.
Email Etiquette Principles - Why is it Important
Why is email etiquette important? Let's imagine you're hiring for a new role, and you’ve just received the email below.