Proficiency with modern data architectures and concepts such as cloud services (AWS, Azure, GCP), real-time data distribution (Kafka, Dataflow), and modern data warehouse tools (Snowflake, Databricks).
Stay up to date on GCP offerings (Dataflow, Dialogflow, Cloud Composer, Cloud Functions, Gemini, Looker, BigQuery, DocAI, Vertex AI). Design and develop large-scale data solutions using GCP services (e.g., Dataflow, Cloud Bigtable, BigQuery, Cloud SQL, Pub/Sub, Cloud Data Fusion, Cloud Composer, Cloud Functions, Cloud Storage, Compute Engine).
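The Pub/Sub-to-BigQuery pattern named above can be illustrated with a minimal transform sketch. This is a plain-Python stand-in for the per-message logic a Dataflow job would run at scale; the event schema (`event_id`, `ts`, `amount`) is hypothetical, not from any specific posting.

```python
import json
from datetime import datetime, timezone

def pubsub_to_bigquery_row(message_bytes):
    """Turn a raw Pub/Sub message payload into a BigQuery-ready row dict.

    Field names here are illustrative assumptions, not a real schema.
    """
    event = json.loads(message_bytes.decode("utf-8"))
    return {
        "event_id": event["event_id"],
        # BigQuery accepts ISO 8601 strings for TIMESTAMP columns.
        "event_time": datetime.fromtimestamp(
            event["ts"], tz=timezone.utc
        ).isoformat(),
        "amount_usd": round(float(event["amount"]), 2),
    }

row = pubsub_to_bigquery_row(
    b'{"event_id": "e1", "ts": 1700000000, "amount": "19.994"}'
)
```

In a real pipeline this function would sit inside a Dataflow (Apache Beam) `DoFn` between a Pub/Sub source and a BigQuery sink.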
The position involves a range of responsibilities, but the focus is cloud security and IAM within the company's GCP environment. Requires strong experience with cloud security engineering, specifically on GCP.
Knowledge of cloud computing concepts and leading platforms such as Kubernetes, AWS, Azure, and GCP. Architect cloud infrastructure solutions like Kubernetes, Kubeflow, OpenStack, and Spark to help global companies embrace AI in their business, using the latest open-source capabilities on public and private cloud infrastructure, Linux, and Kubernetes.
Deep understanding of cloud computing, specifically IaaS and PaaS offerings from major providers such as AWS, Azure, and GCP. Cloud-focused certifications from AWS, Azure, or GCP.
Knowledge of master data, metadata, reference data, data warehousing, database structure, and business intelligence principles and processes, including technical architecture. Support data governance program adoption and effectiveness across the enterprise, aiding with program metrics and monitoring, program scoping and resource requirements, communication, collaboration, and ideas for improving program efforts.
Experience working with cloud technologies like Snowflake on Microsoft Azure, AWS, or GCP. Be sure to highlight your experience with cloud, ETL, SQL, and Python. Prior experience supporting data governance initiatives desired: Data Quality, Metadata Management (Data Cataloging, Data Lineage), Master Data Management, Data Security.
Previous experience with cloud data tools on AWS, Azure, or GCP, such as EMR, S3, EC2, Glue, and ECS. You'll work to ingest and process both streaming and batch data using leading-edge cloud tools from AWS and Databricks in a purpose-built architecture.
Support development and maintenance of ETL data pipelines with SQL- and Python-based transformations running on Google Cloud Platform (GCP). Your work on data collection, analysis, and integration using cloud data pipelines will advance our enterprise data capabilities.
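The SQL-plus-Python transformation style this posting describes can be sketched in miniature. The example below uses an in-memory SQLite database as a stand-in for a warehouse table; the table and column names are hypothetical.

```python
import sqlite3

# Stand-in "warehouse": an in-memory SQLite database with illustrative data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 10.0, "complete"), (2, 5.5, "canceled"), (3, 7.25, "complete")],
)

# SQL does the filtering and aggregation; Python applies a final
# formatting rule before the result is loaded downstream.
total = conn.execute(
    "SELECT SUM(amount) FROM raw_orders WHERE status = 'complete'"
).fetchone()[0]
report = {"completed_revenue": round(total, 2)}
conn.close()
```

On GCP the SQL step would typically run in BigQuery and the Python step in a Cloud Function or Composer task, but the division of labor is the same.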
Experience with cloud providers (e.g. AWS, Azure, or GCP). Senior Software Engineer - Hub Cloud Platform (Minneapolis, MN) Department: Edge Devices & Hub. The Hub to Cloud (H2C) team is responsible for maintaining connectivity between all of the hubs and the SmartThings Platform.
Knowledge of Google Cloud Platform (GCP) or Azure would be nice to have. Proficiency with the following or similar technologies: AWS cloud native services, Infrastructure as Code (Terraform), Python, PostgreSQL, GitLab, Git, Bitbucket, Bamboo, Maven, Nexus, Fortify, Sonar, etc.
Certifications for any of the major cloud platforms, such as AWS, Snowflake, GCP, or Azure. Experience in architecting data pipelines and solutions for both streaming and batch integrations using tools/frameworks like Glue ETL, Lambda, Google Cloud Dataflow, Azure Data Factory, Spark, Spark Streaming, etc.
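The streaming-versus-batch distinction these frameworks handle can be shown with a minimal micro-batching sketch: grouping an unbounded event stream into fixed-size chunks is the same windowing idea Spark Streaming and Dataflow apply at scale.

```python
from itertools import islice

def batches(stream, size):
    """Group an (possibly unbounded) event stream into fixed-size
    micro-batches; a toy version of streaming-framework windowing."""
    it = iter(stream)
    while chunk := list(islice(it, size)):
        yield chunk

# A bounded stand-in for an event stream.
result = list(batches(range(7), 3))
```

Real frameworks window by time and watermark rather than count, but the consumer-side contract is the same: downstream code sees bounded batches.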
Collaborating with team members across several geographies, our Cloud practitioners engineer cloud-based analytics solutions on AWS, Azure, Databricks, GCP, Snowflake, Oracle, Informatica Cloud, and a combination of native cloud technologies, including computing at the edge and curating data in motion.
You will be the trusted advisor to your customers for all things related to cloud security across AWS, Azure, and GCP. We are passionate about technical sales and helping our customers achieve maximum value from our solution.
Hands-on AWS/Azure/GCP experience. Cloud security experience. Experience with traditional CSPM tools and/or cloud infrastructure security. Technical SaaS selling experience. Leadership and management of a team of Solutions Engineers focused on cloud accounts.
GCP cloud jobs in St Louis Park, MN