Data Engineer Level 2 - 26-00273
Dayton, OH | March 22nd, 2026
Job Description

LeadStack Inc. is an award-winning, certified minority-owned (MBE) staffing services provider of contingent workforce solutions and one of the nation's fastest growing. As a recognized industry leader in contingent workforce solutions and a certified Great Place to Work, we're proud to partner with some of the most admired Fortune 500 brands in the world.

Data Engineer Level 2
Location: Blue Ash, OH 45241
Duration: 6 months (Contractor)
Pay Rate: $55–$75/hr (W2)
Interview Process: In-person interviews required at the Blue Ash, OH location.

Overview

Join our team to build modern data solutions in Azure! We're seeking a skilled Data Engineer with hands-on expertise in Databricks, Spark, Python, and cloud DataOps. You'll design scalable data pipelines, automate infrastructure with Terraform and GitHub Actions, and treat data as an enterprise asset, collaborating on CI/CD, governance, and optimization for reliable, secure analytics.

Key Responsibilities
- Analyze, design, and develop Azure-based data products, pipelines, and architecture using Databricks, Spark, PySpark, Python, and SQL.
- Optimize Spark/PySpark pipelines for performance (e.g., data skew, partitioning, caching, shuffles).
- Build and maintain Delta Lake tables/models for analytical and operational use cases, including Delta Live Tables (DLT) or Databricks SQL.
- Provision cloud/Databricks resources via Terraform (IaC) and manage GitHub-based CI/CD workflows with GitHub Actions.
- Implement Git workflows for notebooks/jobs; troubleshoot clusters, jobs, and pipelines for reliability.
- Collaborate on data governance (e.g., Purview, Unity Catalog), lineage, cataloging, and enterprise standards.
- Deploy Azure fixes/upgrades; mentor on best practices; create diagrams/specs; support stakeholders and data strategy.

Requirements
- 5+ years as a Data Engineer.
- Strong hands-on experience with Azure Databricks, Spark/PySpark, Python, SQL, and databases.
- Experience with Delta Live Tables (DLT), Databricks SQL, Azure Functions, and messaging/orchestration tools.
- Proficiency in Terraform (IaC) and GitHub/GitHub Actions (CI/CD, version control).
- Azure cloud data services integration; monitoring and optimizing Databricks clusters/workflows.
- Knowledge of distributed computing (partitions, joins, shuffles) and data governance tools (Purview, Unity Catalog).
- SDLC familiarity; ability to manage priorities independently.

To know more about current opportunities at LeadStack, please visit us at https://leadstackinc.com/careers/. Should you have any questions, feel free to call me at (513) 318-4502 or send an email to waseem.ahmad@leadstackinc.com.
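The responsibilities above call out tuning Spark/PySpark pipelines for data skew via partitioning. As a minimal plain-Python sketch (not Spark itself; the function names and record counts are illustrative), key salting shows the idea: a hot key that would pile onto one partition gets spread across several by appending a random salt.

```python
import random
from collections import Counter

def partition_for(key: str, num_partitions: int) -> int:
    """Naive hash partitioning: every record with the same key lands
    on the same partition, so one hot key causes skew."""
    return hash(key) % num_partitions

def salted_partition_for(key: str, num_partitions: int, salt_buckets: int) -> int:
    """Key salting: append a random salt so records for a hot key are
    spread across up to `salt_buckets` partitions. The trade-off is that
    any per-key aggregation then needs a second combine step."""
    salt = random.randrange(salt_buckets)
    return hash(f"{key}#{salt}") % num_partitions

# 10,000 records, 90% of which share a single hot key.
records = ["hot_key"] * 9_000 + [f"key_{i}" for i in range(1_000)]

plain = Counter(partition_for(k, 8) for k in records)
salted = Counter(salted_partition_for(k, 8, salt_buckets=8) for k in records)

# Without salting, one partition holds all 9,000 hot-key records;
# with salting, the hot key is spread across several partitions.
print(max(plain.values()), max(salted.values()))
```

In PySpark the same technique is typically applied by adding a salt column before a skewed join or aggregation, then combining the partial results in a second pass.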
Similar jobs in Calio, ND:
- Azure Data Engineer (NO Sponsorship)
- Data Engineer - Snowflake
- Engineer, Big Data - Power BI/Databricks/SQL/Data Modeling - Remote
- Data Engineer
- Senior Data Engineer - Data Platform and Semantic Architecture and Engineering
- Fabric & Power BI Analytics Engineer (Hybrid)
- ETL Data Engineer - SpringBatch | AWS, Oracle, SQL
- Staff Data Engineer - Vehicle Telemetry and Data Infrastructure
- Data Engineer - Databricks
- Senior Data Engineer
- Hybrid Chicago Data Engineer: ETL, Spark & Snowflake
- Senior Data Analyst - ETL, Snowflake & AWS Expert
- Big Data Architect - TS/SCI | Scalable Cloud Data Solutions
- Principal Data Architect
- Data Engineer - Databricks
- Cloud Data Platform Engineer for AI/ML & BI
- Sr Principal Engineer - Data
- Staff Data Engineer
- Senior Snowflake Data Engineer & Tech Lead
- Senior Data Platform Engineer: Snowflake Ops & Automation
- Databricks Data Engineer
- Data Engineer ID50062
- Tableau BI Engineer
- Staff Data Engineer (Databricks) - Remote - USA | Charleston, SC | March 20th, 2026
- Data Integration & AI Engineer
- Sr Data Engineer
- Senior Staff Software Engineer, Data Platform
- Senior Developer II - Data Science & Enterprise Data Platform Development
- Forward Deployed Engineer
- Senior Data Engineering Lead - ETL, SSIS & SQL
- Data Engineer: AI-Driven Pipelines & Cloud Solutions
- Data Engineering Intern - Analytics Platform
- Data Integration Co-op
- Data Specialist
- Remote Data Engineer Consultant | Snowflake Pipelines
- Analytics ETL Developer (Ab Initio/Boomi)
- Data Architect - Power & Utilities - Senior Manager- Consulting - Location OPEN
- Data Product Engineer
- Data Architect
- Data Architect