Senior Data Platform Engineer

GSCF

On Site · Full Time
Budapest, Hungary

Job Overview

Job Title: Senior Data Platform Engineer
Job Type: Full Time
Category: Commerce
Experience: 5 Years
Degree: Master
Offered Salary: HUF 0
Location: Budapest, Hungary


Job Description

About GSCF

GSCF is the leading global provider of working capital solutions. It empowers companies and their financial institution partners to accelerate growth, unlock liquidity, and manage the risk and complexity of the end-to-end working capital cycle. GSCF’s innovative Working Capital-as-a-Service offering combines an end-to-end connected capital technology platform with expert managed services and alternative capital solutions.

GSCF’s team of working capital experts operates in over 75 countries, offering a truly global and holistic perspective to solve working capital efficiency challenges. Visit www.gscf.com to learn more.

The Role of a Senior Data Platform Engineer at GSCF

We are expanding and elevating our data infrastructure, and we’re looking for a highly skilled Senior Data Platform Engineer to join our growing Data Platform team in Budapest. This is a hands-on, high-impact role in which you will design, build, and operate the scalable data infrastructure, data warehousing, and lakehouse solutions that power decision‑making and client-facing data products across the entire company.

Our Data Platform team is fully internal, co‑located in Budapest, and works closely with Product, Operations, Finance, Engineering, and leadership. You won’t be siloed; you will be part of a core function that supports the whole business. To succeed here, technical excellence is only half the story. The other half is developing a strong understanding of how our business works and how data can drive our strategy.

If you enjoy ownership, solving complex data challenges, and shaping a modern data ecosystem, we’d love to hear from you.

How You Will Make an Impact:

Data Infrastructure & Architecture
  • Build and optimize scalable data warehousing and lakehouse solutions, ensuring data integrity end‑to‑end
  • Create and own robust ELT pipelines connecting multiple internal and external data sources
  • Design and implement data models aligned with industry best practices
  • Contribute to architectural decisions and bring senior-level technical depth
  • Track, structure, and maintain key business metrics and KPIs
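
As a loose illustration of the ELT pattern described above (not GSCF's actual stack), the extract-load-transform flow can be sketched with Python's built-in sqlite3 module: records are loaded raw first, then typed and reshaped in SQL inside the warehouse. All table and field names here are hypothetical.

```python
import json
import sqlite3

# Hypothetical source extract: in practice this would come from an API or file drop.
RAW_PAYLOAD = json.dumps([
    {"invoice_id": "INV-1", "amount": "1200.50", "currency": "EUR"},
    {"invoice_id": "INV-2", "amount": "815.00", "currency": "HUF"},
])

def run_elt(conn: sqlite3.Connection, payload: str) -> None:
    """Load raw records as-is (EL), then transform inside the warehouse (T)."""
    conn.execute("CREATE TABLE IF NOT EXISTS raw_invoices (record TEXT)")
    conn.executemany(
        "INSERT INTO raw_invoices VALUES (?)",
        [(json.dumps(rec),) for rec in json.loads(payload)],
    )
    # Transform step: parse and type the raw JSON entirely in SQL.
    conn.execute("""
        CREATE TABLE invoices AS
        SELECT json_extract(record, '$.invoice_id') AS invoice_id,
               CAST(json_extract(record, '$.amount') AS REAL) AS amount,
               json_extract(record, '$.currency') AS currency
        FROM raw_invoices
    """)

conn = sqlite3.connect(":memory:")
run_elt(conn, RAW_PAYLOAD)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM invoices").fetchone())
# → (2, 2015.5)
```

Loading raw before transforming keeps the original records available for reprocessing when transformation logic changes.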
Data Governance & Quality
  • Build and maintain governance processes: data lineage, metadata, quality monitoring
  • Implement automated data quality checks, validation, and testing
  • Define and enforce data standards, naming conventions, and documentation practices
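
The automated quality checks mentioned above can be as simple as composable validation functions run against each batch. A minimal sketch in plain Python (hypothetical column names; real platforms would typically use dbt tests or a framework like Great Expectations):

```python
from typing import Callable

Row = dict[str, object]

def check_not_null(rows: list[Row], column: str) -> list[str]:
    """Flag rows where a required column is missing or None."""
    return [f"row {i}: {column} is null"
            for i, r in enumerate(rows) if r.get(column) is None]

def check_unique(rows: list[Row], column: str) -> list[str]:
    """Flag duplicate values in a column that should be a key."""
    seen: set[object] = set()
    errors: list[str] = []
    for i, r in enumerate(rows):
        value = r.get(column)
        if value in seen:
            errors.append(f"row {i}: duplicate {column}={value!r}")
        seen.add(value)
    return errors

def run_checks(rows: list[Row],
               checks: list[Callable[[list[Row]], list[str]]]) -> list[str]:
    """Run every check and collect all failures rather than stopping early."""
    failures: list[str] = []
    for check in checks:
        failures.extend(check(rows))
    return failures

rows = [
    {"invoice_id": "INV-1", "amount": 100.0},
    {"invoice_id": "INV-1", "amount": None},  # duplicate key, null amount
]
failures = run_checks(rows, [
    lambda r: check_not_null(r, "amount"),
    lambda r: check_unique(r, "invoice_id"),
])
print(failures)  # two failures: a null amount and a duplicate invoice_id
```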
Collaboration & Leadership
  • Partner with Product, Operations, Finance, and other teams to understand data needs
  • Mentor and guide team members to elevate engineering standards
  • Translate business requirements into robust technical solutions
  • Ensure compliance with data governance, privacy, and security standards
  • Drive adoption of best practices across the organization

What You Bring to the Team:

Required Experience
  • 5+ years of hands-on SQL experience (analytical queries, performance tuning)
  • 3+ years of designing and operating large-scale data warehouse systems
  • Strong knowledge of data modeling: dimensional modeling, schema design, architecture patterns
  • 3+ years of experience with a major cloud provider (AWS / Azure / GCP)
  • Experience with orchestration tools (Airflow, Dagster, etc.)
  • 5+ years of programming experience in a general-purpose language (e.g., Python, Java, Scala)
  • Experience with modern cloud data platforms: Snowflake, Redshift, Databricks, BigQuery
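
For context on the dimensional modeling called for above, here is a toy star schema and a typical analytical query, sketched with Python's built-in sqlite3 (all table and column names are illustrative, not part of any real GSCF schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# A toy star schema: one fact table with foreign keys into dimension tables.
conn.executescript("""
    CREATE TABLE dim_client   (client_id INTEGER PRIMARY KEY, name TEXT, country TEXT);
    CREATE TABLE dim_date     (date_id INTEGER PRIMARY KEY, iso_date TEXT, year INTEGER);
    CREATE TABLE fact_invoice (invoice_id INTEGER PRIMARY KEY,
                               client_id INTEGER REFERENCES dim_client,
                               date_id   INTEGER REFERENCES dim_date,
                               amount    REAL);
    INSERT INTO dim_client VALUES (1, 'Acme', 'HU'), (2, 'Globex', 'DE');
    INSERT INTO dim_date   VALUES (1, '2024-01-15', 2024), (2, '2024-02-20', 2024);
    INSERT INTO fact_invoice VALUES (10, 1, 1, 500.0), (11, 1, 2, 250.0), (12, 2, 2, 900.0);
""")

# Typical analytical query: aggregate facts, sliced by dimension attributes.
rows = conn.execute("""
    SELECT c.country, d.year, SUM(f.amount) AS total
    FROM fact_invoice f
    JOIN dim_client c USING (client_id)
    JOIN dim_date   d USING (date_id)
    GROUP BY c.country, d.year
    ORDER BY c.country
""").fetchall()
print(rows)  # → [('DE', 2024, 900.0), ('HU', 2024, 750.0)]
```

Keeping measures in narrow fact tables and descriptive attributes in dimensions is what makes this kind of slicing cheap at scale.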
Preferred Experience
  • Deep dbt experience (modeling, testing, CI/CD, performance tuning)
  • Hands-on experience integrating dbt with orchestration tools and cloud workflows
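
As a small illustration of the dbt testing mentioned above, column-level data tests are declared in a model's schema file; `unique`, `not_null`, and `relationships` are dbt built-ins, while the model and column names below are hypothetical:

```yaml
# models/marts/schema.yml -- model/column names are illustrative only
version: 2
models:
  - name: fct_invoices
    columns:
      - name: invoice_id
        tests:
          - unique
          - not_null
      - name: client_id
        tests:
          - relationships:
              to: ref('dim_clients')
              field: client_id
```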
Nice to Have
  • Ability to understand the business and translate needs into solutions
  • Snowflake expertise (data loading, table design, permission models, DDL/DML, views)
  • Experience with BI tools (Power BI, Tableau, etc.; DAX, Power Query)
  • Infrastructure as code (Terraform, AWS CDK, etc.)
  • Streaming/Kafka experience (event driven architectures, real time data pipelines, schema design, and operating streaming systems at scale)

Key Skills/Competencies

  • Data Warehousing
  • Lakehouse Solutions
  • ELT Pipelines
  • Data Modeling
  • SQL
  • Cloud Platforms (AWS/Azure/GCP)
  • Orchestration (Airflow/Dagster)
  • Python
  • Data Governance
  • Data Quality

Tags:

Data Platform Engineer
data infrastructure
data warehousing
ELT pipelines
data modeling
data governance
data quality
architectural design
team collaboration
technical leadership
KPI tracking
SQL
AWS
Azure
GCP
Airflow
Dagster
Python
Snowflake
dbt
Power BI

How to Get Hired at GSCF

  • Research GSCF's culture: Study their mission, values, recent news, and employee testimonials on LinkedIn and Glassdoor.
  • Customize your resume for GSCF: Highlight experience with data warehousing, cloud platforms (AWS/Azure/GCP), SQL, and Python. Tailor to the Senior Data Platform Engineer role.
  • Prepare for technical depth: Showcase expertise in data modeling (dimensional, schema design), ELT pipelines, and modern data platforms like Snowflake/Redshift.
  • Demonstrate business acumen: Articulate how your data solutions drive strategic decision-making and support business growth in past roles.
  • Practice behavioral questions: Focus on collaboration, problem-solving complex data challenges, and leadership in data governance within a cross-functional team.
