Data warehousing tooling and services (Apache Airflow, AWS DMS, Snowflake); authoring and maintaining IaC with Terraform and using it to deploy resources in AWS, Azure, GCP, or any other public cloud provider.
$227,500 a year · Full-time · Updated Today
Manage architectural standards for the AWS cloud-based UT Data Hub and Integration Hub. Ensure data management processes meet compliance, quality, and efficiency standards. Knowledge of ETL (Extract, Transform, Load) processes and data integration tools (e.g., Informatica, AWS Glue).
$138,000 a year · Full-time · Updated Today
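The ETL pattern spelled out in the listing above can be sketched in a few lines. This is a minimal, tool-agnostic illustration in plain Python; the `sales` table and the cleaning rules are invented for the example, not taken from any listing:

```python
import sqlite3

# Minimal ETL sketch: extract raw records, transform (clean/normalize)
# them, and load the results into a SQLite table.

def extract(rows):
    """Extract: yield raw records from a source (here, an in-memory list)."""
    yield from rows

def transform(records):
    """Transform: normalize names and drop records with invalid amounts."""
    for name, amount in records:
        if amount is not None and amount >= 0:
            yield name.strip().title(), round(float(amount), 2)

def load(records, conn):
    """Load: insert the cleaned records into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales (name TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", records)
    conn.commit()

source = [("  alice ", 10.5), ("BOB", -3), ("carol", 7.25)]
conn = sqlite3.connect(":memory:")
load(transform(extract(source)), conn)
loaded = conn.execute("SELECT name, amount FROM sales ORDER BY name").fetchall()
```

Tools like Informatica or AWS Glue manage the same three stages at scale, adding scheduling, connectors, and monitoring on top.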
You may also work on workflow, ETL, data governance, and visualization tools like Apache Superset, dbt, and Temporal, or data warehouse solutions such as Trino or ClickHouse.
Full-time · Updated Yesterday
The individual will also lead a team of data engineers and data scientists, mentor junior members, and oversee project planning, resource allocation, and time management, coordinating with stakeholders to ensure that projects are delivered on time, within budget, and to specification.
Full-time · Updated Today
SFL Scientific, a Deloitte Business, is a data science professional services practice focused on strategy, technology, and solving business challenges with Artificial Intelligence (AI). Mentor, motivate, and coach junior data scientists on technical best practices and inspire professional development.
$253,000 a year · Full-time · Updated Today
Maintain and improve backend services on platforms like Kubernetes, utilizing cloud providers such as AWS or Google Cloud. Python Backend Data Collections Developer for a global consumer device company in Austin, TX. Summary: in this role, the successful candidate will work on developing data-centric tools that scale well for internal usage.
$80 an hour · Updated Today
We support all AWS data centers and all of the servers, storage, networking, power, and cooling equipment that ensure our customers have continual access to the innovation they rely on.
Full-time · Updated Today
Experience with cloud platforms (e.g., AWS, Snowflake, Databricks) for large-scale data processing and model hosting. A minimum of 3 years in data science focused on marketing, with extensive experience in developing Bayesian models for MMM/MTA (marketing mix modeling / multi-touch attribution).
Full-time · Updated 16 days ago
As a Data Engineer on the AWS Marketing Insights team, you will design, build, and maintain robust data pipelines and infrastructure to drive data-driven insights.
Full-time · Updated 4 days ago
Minimum of 4 years' experience in Data Architecture and delivery experience on one of the cloud platforms (Azure, AWS, or GCP). Minimum of 4 years' experience in Databricks engineering solutions on one of the cloud platforms (Azure, AWS, or GCP).
Full-time · Updated 15 days ago
Hands-on experience developing data pipelines using tools such as Airflow, AWS Step Functions, Dagster, Prefect, Stitch Data, Fivetran, and Airbyte. You will be responsible for the end-to-end implementation of the data stack from collection to reporting, with a focus on infrastructure and technical processes.
$223,150 a year · Full-time · Updated 19 days ago
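Orchestrators like Airflow, Dagster, and Prefect all model a pipeline as a directed acyclic graph (DAG) of tasks. Below is a toy sketch of that idea in plain Python using the standard library's `graphlib`, not any orchestrator's API; the task names are invented:

```python
from graphlib import TopologicalSorter

# Toy DAG runner: each task declares its upstream dependencies, tasks run
# in topological order, and upstream results are passed downstream.

def run_pipeline(tasks, deps):
    """tasks: name -> callable(upstream_results); deps: name -> set of upstream names."""
    results = {}
    for name in TopologicalSorter(deps).static_order():
        upstream = {d: results[d] for d in deps.get(name, ())}
        results[name] = tasks[name](upstream)
    return results

tasks = {
    "collect": lambda up: [3, 1, 2],                 # pull raw values
    "clean":   lambda up: sorted(up["collect"]),     # normalize them
    "report":  lambda up: sum(up["clean"]),          # aggregate for reporting
}
deps = {"collect": set(), "clean": {"collect"}, "report": {"clean"}}
results = run_pipeline(tasks, deps)
```

Production orchestrators add what this sketch omits: scheduling, retries, backfills, and persistence of task state.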
Develop, implement, and optimize scalable data pipelines on AWS to ensure efficient processing and storage of large datasets. A minimum of 3 years of hands-on experience with Apache Spark for large-scale data processing, including building data visualizations using PySpark and Jupyter Notebooks.
Full-time · Updated 21 days ago
Proficiency and hands-on knowledge of a variety of technologies such as SQL, Bash, Python, Java, Presto, Spark, and AWS, and of data streaming systems like Kafka and RabbitMQ. Hands-on experience with data stacks including Airflow, Databricks, and dbt, as well as data stores such as Cassandra, Aurora, and ZooKeeper.
$250,000 a year · Full-time · Updated 21 days ago
Apache Spark and AWS platform data services; exploratory data analysis (EDA), data profiling, and data cleaning on a variety of datasets. Operations Research, Physics, or a related field with a substantial research component and a focus on data.
Full-time · Updated 14 days ago
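The data profiling mentioned above boils down to per-column summary statistics computed before cleaning: null counts, distinct values, and basic numeric stats. A minimal stdlib-only sketch (the sample rows are invented):

```python
from statistics import mean, median

# Minimal data-profiling pass: per-column null counts, distinct counts,
# and summary stats -- the kind of check an EDA step runs before cleaning.

def profile(rows):
    """rows: list of dicts sharing the same keys; returns per-column stats."""
    report = {}
    for col in rows[0].keys():
        values = [r[col] for r in rows]
        numeric = [v for v in values if isinstance(v, (int, float))]
        report[col] = {
            "nulls": sum(v is None for v in values),
            "distinct": len(set(values) - {None}),
            "mean": mean(numeric) if numeric else None,
            "median": median(numeric) if numeric else None,
        }
    return report

rows = [
    {"age": 34,   "city": "Austin"},
    {"age": None, "city": "Austin"},
    {"age": 28,   "city": "Dallas"},
]
report = profile(rows)
```

At scale the same pass would run in Spark or a warehouse query rather than in-memory Python, but the per-column metrics are the same.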
Extensive hands-on experience implementing serverless real-time/near-real-time architecture using cloud-native services (i.e., the Azure, AWS, or GCP tech stack) and Spark technologies (e.g., Spark Streaming, Spark ML).
Full-time · Updated 12 days ago
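Real-time and near-real-time pipelines like the one above typically aggregate events over time windows. The sketch below shows a tumbling-window sum in plain Python to illustrate the concept only; it is not the Spark Streaming API, and the window size and events are invented:

```python
from collections import defaultdict

# Tumbling-window aggregation: each event carries an epoch timestamp and
# a value, and every 10-second window accumulates a running sum.

WINDOW_SECONDS = 10

def window_sums(events):
    """events: iterable of (epoch_seconds, value); returns window_start -> sum."""
    sums = defaultdict(float)
    for ts, value in events:
        window_start = ts - (ts % WINDOW_SECONDS)  # align to window boundary
        sums[window_start] += value
    return dict(sums)

events = [(100, 1.0), (104, 2.0), (111, 5.0), (119, 0.5), (123, 4.0)]
sums = window_sums(events)
```

A streaming engine adds the hard parts this sketch ignores: late and out-of-order events, watermarks, and fault-tolerant state.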
Title: Data on AWS jobs in Austin, TX
FEATURED BLOG POSTS
How to Write an Address Correctly: Explained with Examples
It's hard to imagine a scenario where a text or phone call just won't do these days. With communication at our fingertips, you may think learning how to write an address is a superfluous skill. But it's a skill that will come in handy when you need to fill out healthcare forms, ship a package, order food delivery, or even apply for new jobs.
What is Employment Participation Rate
According to economists, there are four factors of production that go into creating higher-quality goods at lower prices. These are land, labor, capital, and entrepreneurship.
How to Get Pay Stubs (From Previous Employers Also!)
Pay stubs are an important document that shows your earnings in a given period, as well as any deductions made toward your health insurance or pension contributions. They're also excellent for finding out how much your recent salary raise has bumped up your monthly net income.
How to Write a Job Description
It might be tempting to overlook the importance of a well-written job description. After all, if you’ve posted job ads before and ended up with tons of resumes in hand, it’s easy to assume that this will always be the case, regardless of how your job ad reads. But, in reality, you really can’t take getting an influx of resumes for granted.
How to Get a W2 From Previous Employers
When tax time rolls around, the last thing you want to worry about is having to track down a W-2 from your former employer. Many times you won't have to, because the IRS requires companies to send these forms to all current and former employees who have earned more than $600 in the last year. Unfortunately, there are employers who don't do what they're supposed to. There are even times when something else may happen that prevents the W-2 from getting where it's supposed to go.
Structured vs Unstructured Interviews
The goal of an interview is to evaluate candidates based on their skills, personality, and knowledge. You want to choose the BEST candidate from your candidate pool, so the interview is something you can't mess up. As you begin planning your interview process, one of the major decisions you'll face is whether the interview should be a structured vs unstructured interview. So let's take a dive into the differences and sort out which circumstances warrant which interview process.
How to Describe Your Personality with Examples
Imagine you’re in an elevator with the CEO of your dream company and you get to talking. The conversation is going well and you start to imagine yourself working for their company when the CEO turns around and asks you “tell me a bit about yourself.” Would this catch you off guard or would you be able to give a clear and succinct description of who you are?