Data Engineer - Databricks & AWS Lakehouse

Pauwels Consulting

Hybrid
Full Time
€75,000
Ghent, Flemish Region, Belgium
Job Description

Job Summary

Pauwels Consulting is seeking a skilled Data Engineer to join its data organization. The role focuses on designing and building scalable, reliable, and cost-efficient data pipelines on a governed lakehouse platform built on Databricks and AWS.

Key Responsibilities

  • Design and maintain production-ready data pipelines using medallion architecture.
  • Optimize ETL/ELT processes for large-scale data processing on Delta Lake.
  • Implement data governance through Unity Catalog, managing access controls and data lineage.
  • Build CI/CD pipelines for automated testing and deployment of data workloads.
  • Configure Databricks workspaces and compute resources for optimal performance and cost.
  • Collaborate with cross-functional teams in an Agile environment to translate requirements into technical solutions.
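The medallion architecture mentioned above can be made concrete with a small sketch. In production this would be PySpark DataFrames on Delta Lake; plain Python dicts are used here purely to illustrate what each layer is responsible for, and the sample records are hypothetical.

```python
# Hypothetical illustration of medallion layering (bronze -> silver -> gold).
# Real pipelines would use PySpark on Delta Lake; plain Python structures
# are used here only to make each layer's responsibility concrete.

RAW_EVENTS = [
    {"order_id": "1", "amount": "19.99", "country": "BE"},
    {"order_id": "2", "amount": "oops", "country": "BE"},   # malformed row
    {"order_id": "1", "amount": "19.99", "country": "BE"},  # duplicate
]

def bronze(events):
    """Bronze: land raw records as-is, adding only ingestion metadata."""
    return [dict(e, _ingested=True) for e in events]

def silver(bronze_rows):
    """Silver: validate types and deduplicate on the business key."""
    seen, out = set(), []
    for row in bronze_rows:
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # a real pipeline would quarantine malformed rows
        if row["order_id"] in seen:
            continue
        seen.add(row["order_id"])
        out.append({"order_id": row["order_id"], "amount": amount,
                    "country": row["country"]})
    return out

def gold(silver_rows):
    """Gold: aggregate into a business-level metric (revenue per country)."""
    totals = {}
    for row in silver_rows:
        totals[row["country"]] = totals.get(row["country"], 0.0) + row["amount"]
    return totals

print(gold(silver(bronze(RAW_EVENTS))))  # {'BE': 19.99}
```

The layering keeps raw data replayable (bronze), cleaned data trustworthy (silver), and business metrics cheap to query (gold).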

Required Skills and Experience

  • Experience with Python, PySpark, and SQL for data engineering.
  • In-depth knowledge of Databricks, Delta Lake, and Unity Catalog.
  • Experience with AWS services including S3, IAM, KMS, and VPC.
  • Proficiency in CI/CD and Infrastructure as Code using GitHub Actions, Terraform, and Databricks Asset Bundles.
  • Knowledge of medallion architecture, ETL/ELT patterns, and CDC.
  • Ability to work effectively in an Agile/Scrum environment.
  • Fluency in English; Dutch is a plus.
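The CDC (change data capture) pattern listed above can be sketched briefly. On Databricks this is typically done with a Delta Lake MERGE; the event shape (`{"op", "key", "data"}`) below is an assumption for illustration, not a real connector format.

```python
# Hedged sketch of applying CDC (change data capture) events to a keyed
# table. On Delta Lake this would be a MERGE INTO statement; the event
# shape used here is hypothetical and chosen only for illustration.

def apply_cdc(table, events):
    """Apply ordered change events (insert/update/delete) to dict `table`."""
    for ev in events:
        if ev["op"] in ("insert", "update"):
            table[ev["key"]] = ev["data"]   # upsert semantics
        elif ev["op"] == "delete":
            table.pop(ev["key"], None)      # idempotent delete
    return table

state = {"c1": {"name": "Ann"}}
changes = [
    {"op": "update", "key": "c1", "data": {"name": "Anne"}},
    {"op": "insert", "key": "c2", "data": {"name": "Bart"}},
    {"op": "delete", "key": "c1", "data": None},
]
print(apply_cdc(state, changes))  # {'c2': {'name': 'Bart'}}
```

Treating inserts and updates identically (upsert) and making deletes idempotent keeps replays of the change stream safe.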

Nice to Haves

  • Experience with data quality frameworks such as Great Expectations.
  • Familiarity with Lakehouse monitoring or CloudWatch dashboards.
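The data quality frameworks mentioned above work by running declarative checks ("expectations") against a dataset. The hand-rolled sketch below shows the idea in plain Python; it is NOT the Great Expectations API, and the check names and result shape are made up for illustration.

```python
# Hand-rolled sketch of row-level data quality checks, in the spirit of
# frameworks like Great Expectations. This is NOT that library's API;
# the check names and result shape are hypothetical.

def check_not_null(rows, column):
    """Expect every row to have a non-null value in `column`."""
    return all(row.get(column) is not None for row in rows)

def check_in_range(rows, column, lo, hi):
    """Expect every value in `column` to fall within [lo, hi]."""
    return all(lo <= row[column] <= hi for row in rows)

def run_suite(rows):
    """Run a small expectation suite and report per-check results."""
    return {
        "order_id_not_null": check_not_null(rows, "order_id"),
        "amount_in_range": check_in_range(rows, "amount", 0.0, 10_000.0),
    }

rows = [{"order_id": "1", "amount": 19.99}, {"order_id": "2", "amount": 5.0}]
print(run_suite(rows))  # {'order_id_not_null': True, 'amount_in_range': True}
```

In a real pipeline such a suite would gate promotion from bronze to silver, failing the run (or quarantining rows) when an expectation is violated.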

Position Details

  • Start date: ASAP; long-term engagement
  • Location: Gent/Zwijnaarde - hybrid (minimum 2 days/week onsite)
  • Contract: open to both permanent employees and freelancers

Key Skills and Competencies

  • Data Engineering
  • Databricks
  • AWS
  • Delta Lake
  • Unity Catalog
  • Python
  • PySpark
  • SQL
  • ETL/ELT
  • CI/CD

Tags:

Data Engineer
Databricks
AWS
Data Lakehouse
ETL
ELT
Python
PySpark
SQL
Delta Lake
Unity Catalog
CI/CD
Terraform
GitHub Actions
Data Governance
Cloud
Big Data
Data Pipelines
Agile
Medallion Architecture

How to Get Hired at Pauwels Consulting

  • Research Pauwels Consulting's culture: Study their mission, values, recent news, and employee testimonials on LinkedIn and Glassdoor.
  • Tailor your Data Engineer resume: Highlight Databricks, AWS, Python, PySpark, and SQL experience directly relevant to lakehouse architecture.
  • Showcase your project portfolio: Provide examples of scalable data pipelines, ETL/ELT optimizations, and CI/CD implementations.
  • Prepare for technical interviews: Expect questions on Databricks, AWS services, SQL, Python, and data pipeline design principles.
  • Demonstrate Agile collaboration: Be ready to discuss your experience working effectively within Agile/Scrum team environments.
