Develop Tradeweb’s data science platform based on open-source software and cloud services. Build and run Tradeweb’s data platform using technologies such as public cloud infrastructure (AWS and GCP), Kafka, databases, and containers.
$100,000 - $250,000 a year, Full-time
End-to-end data supply chain including, but not limited to: AI and Advanced Analytics, Business Intelligence, Cloud enablement, Data Privacy, Data Science, Digital Transformation, Data Governance/Data Management, Enterprise Data Architecture, RPA/Intelligent Automation, and Risk and Regulatory Compliance.
PDO - The Data Software Engineer position will involve the design, implementation, testing, and launch of new applications for loading dealer data and generating analytical insights.
Full-time
Title: Data Engineer. Significant experience building Data Warehouses (Azure Synapse Analytics or similar), Data Lakes (Azure Data Lake or similar), and ETL/ELT pipelines (Databricks or similar); nice-to-have skills in data streaming (Azure Event Hub, Kafka, Cosmos DB, MongoDB, or similar).
QTS Data Centers is a Blackstone REIT portfolio company. QTS is seeking a DCIM Engineer II / Controls Engineer II (Senior Data Center Infrastructure Management (DCIM) Engineer) to be based at one of our brand-new mega data center campuses in the US. Quality Technology Services (QTS) is a leading provider of data center solutions across a diverse footprint spanning more than 9 million square feet of QTS Mega Data Centers throughout North America and Europe.
Bachelor’s or Master’s degree in Data Science, Computer Science, Computational Chemistry, or a related discipline.
$50 - $60 an hour
Will maintain, move, and manipulate data between applications using appropriate software and code: Apache Spark, Elasticsearch, R, Python, Kibana, and others as technology evolves. Experience working with commercial-off-the-shelf (COTS) statistical software or tools for data visualization (e.g., SPSS, SAS, MATLAB, Tableau).
Full-time
Experience working with large datasets and big data technologies, preferably cloud-based (e.g., Redshift, Databricks, Azure SQL Data Warehouse, MongoDB). Design and implement data integration solutions, leveraging ETL/ELT tools such as Apache Spark, Informatica PowerCenter/BDM, and SSIS.
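The ETL/ELT duty described above follows a standard three-stage pattern. The sketch below is a generic, tool-agnostic illustration in plain Python (no particular employer's pipeline, and the record fields such as `dealer_id` and `sales` are hypothetical); in practice the same stages would run on an engine like Apache Spark or an integration tool like Informatica or SSIS.

```python
# Minimal ETL sketch: extract raw records, transform them, load into a target.
# Field names ("dealer_id", "sales") are hypothetical, for illustration only.

def extract(rows):
    """Extract: pull raw records from a source (here, an in-memory list)."""
    return list(rows)

def transform(records):
    """Transform: drop invalid rows and normalize the sales figure to float."""
    return [
        {"dealer_id": r["dealer_id"], "sales": float(r["sales"])}
        for r in records
        if r.get("dealer_id") and r.get("sales") is not None
    ]

def load(records, target):
    """Load: write transformed records into the target store (a dict keyed by id)."""
    for r in records:
        target[r["dealer_id"]] = r["sales"]
    return target

raw = [
    {"dealer_id": "D1", "sales": "100.5"},
    {"dealer_id": None, "sales": "7"},   # dropped by transform (no dealer id)
    {"dealer_id": "D2", "sales": 42},
]
warehouse = load(transform(extract(raw)), {})
print(warehouse)  # {'D1': 100.5, 'D2': 42.0}
```

In an ELT variant, the `load` step would run first (raw rows landed in the warehouse) and the `transform` step would execute inside the warehouse engine itself.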
Job Title: Sr. Hadoop Architect / Big Data Architect. The Big Data Engineer shall lead the Big Data Engineering team. Hands-on experience in Big Data and Cloudera Distribution 7 required.
Remote
Experience reviewing in-vivo and in-vitro data including, but not limited to, inhalation, pathology, chemistry, PCR, and ELISA. The Data Coordinator is responsible for maintaining study files, reviewing data in real time, and ensuring that SOP, Protocol, and GLP requirements are met.
Full-time
Familiarity with one of the cloud data warehouses such as Snowflake, Google BigQuery, AWS Redshift, Azure Synapse, or Databricks. Good understanding of data engineering fundamentals: ELT/ETL, latency, observability, lineage, distributed storage, and distributed computing.
Full-time
Stay updated with the latest trends and technologies in data management, cloud computing, and big data analytics. Job Summary: The Data Management Specialist will be responsible for managing and optimizing our organization's data infrastructure, ensuring data quality, and implementing data solutions using PySpark, Databricks, Snowflake, and/or Redshift.
Design, implement, and optimize data solutions using Databricks for data analytics and machine learning applications. Hands-on experience with Databricks for building and optimizing data pipelines.
Proficiency in managing and administering cloud-based data platforms such as Snowflake and/or Redshift. Manage and administer cloud-based data platforms such as Snowflake and Redshift, ensuring high availability, scalability, and performance.
Full-time
The Data Management Specialist will be responsible for managing and optimizing our organization's data infrastructure, ensuring data quality, and implementing data solutions using PySpark, Databricks, Snowflake, and/or Redshift.
Job Title: Big Data Software Engineer. Company: Change Healthcare.
FEATURED BLOG POSTS
5 Common Interview Mistakes
Everyone's interview process is unique in some form or fashion. Like most, your interview process is crafted so you can get the most information out of your candidates to increase hiring confidence and make the right hiring decisions. However, there are often small problems in interview processes that could ultimately affect the success of hiring decisions.
Job Rejection Email Response with Examples
Glassdoor estimates that, on average, there are about 250 applicants for every job vacancy out there. If you’ve ever applied for a job, the odds are that you’ve received the dreaded job rejection email.
Structured vs Unstructured Interviews
The goal of an interview is to evaluate candidates based on their skills, personality, and knowledge. You want to choose the BEST candidate from your candidate pool, so the interview is something you can't mess up. As you begin planning your interview process, one of the major decisions you'll face is whether the interview should be a structured vs unstructured interview. So let's take a dive into the differences and sort out which circumstances warrant which interview process.
How to Describe Your Personality with Examples
Imagine you’re in an elevator with the CEO of your dream company and you get to talking. The conversation is going well, and you start to imagine yourself working for their company when the CEO turns around and asks you, “Tell me a bit about yourself.” Would this catch you off guard, or would you be able to give a clear and succinct description of who you are?
4 Ways to Make Your Job Posting More Inclusive
According to a Glassdoor survey,
How to Calculate Net Income
Understanding your finances can be daunting even if you’re good with numbers. Your net income, in particular, is a key metric for determining how well you’re doing financially and whether your current way of operating is sustainable or not.
To ATS or not to ATS
As hiring is becoming more analytical and data-driven, companies have found ways to incorporate technology to help hire and recruit more efficiently. ATS, also known as an applicant tracking system, has become one of the most widely adopted technological recruiting tools to date. In fact, according to data from Capterra: