Data Engineer

Netconomy

On Site
Full Time
€53,802
Graz, Styria, Austria

Job Overview

Job Title: Data Engineer
Job Type: Full Time
Category: Commerce
Experience: 5 Years
Degree: Master
Offered Salary: €53,802
Location: Graz, Styria, Austria

Job Description

About Netconomy

As a leading expert in Digital Platform Building and Customer Experience Innovation, NETCONOMY is shaping the digital leadership of its clients. We help brands build flexible and scalable digital platforms using top-notch technologies from SAP, Google Cloud, and Microsoft Azure. By introducing and driving innovation initiatives around customer experience, we support clients as they expand their core business in the digital world.

Over more than 20 years, NETCONOMY has grown from a startup into a team of close to 500 professionals across 10 European locations. We believe in the power of agile, cross-functional collaboration, bringing together people from diverse backgrounds to build outstanding digital solutions.

Your Role as a Data Engineer

As a Data Engineer, you’ll play a key role in building modern, scalable, and high-performance data solutions on Google Cloud Platform (GCP). You’ll be part of our growing Data & AI team, designing and implementing data architectures that help clients unlock the full potential of their data.

Key Responsibilities

  • Building efficient and scalable ETL/ELT processes to ingest, transform, and load data from various structured and unstructured sources (databases, APIs, streaming platforms) into BigQuery and Cloud Storage
  • Implementing data ingestion and real-time processing using Dataflow (Apache Beam) and Pub/Sub for batch and streaming workflows (a minimal pipeline sketch follows this list)
  • Developing SQL transformation workflows with Dataform, including version control, testing, and automated scheduling with built-in quality assertions
  • Creating efficient, cost-optimized BigQuery queries with proper partitioning, clustering, and denormalization strategies
  • Orchestrating complex workflows using Cloud Composer (Apache Airflow) and Cloud Functions for event-driven data processing
  • Implementing centralized data governance and metadata management using Dataplex with automated cataloging and lineage tracking
  • Monitoring and optimizing data pipelines for performance, scalability, and cost using Cloud Monitoring and Cloud Logging
  • Collaborating with data scientists and analysts to understand data requirements and deliver actionable insights
  • Staying up to date with GCP advancements in data services, BigQuery features, and data engineering best practices
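
To make the ingestion and streaming responsibilities above concrete, here is a minimal Dataflow (Apache Beam) sketch that reads JSON events from a Pub/Sub subscription and streams them into BigQuery. It is an illustration only; the project, subscription, table, and schema names are hypothetical placeholders, not details from this posting.

    # Minimal streaming sketch: Pub/Sub -> parse JSON -> BigQuery.
    # All resource names below are illustrative placeholders.
    import json

    import apache_beam as beam
    from apache_beam.io.gcp.pubsub import ReadFromPubSub
    from apache_beam.options.pipeline_options import PipelineOptions


    def parse_event(message: bytes) -> dict:
        # Decode a Pub/Sub payload into a row matching the BigQuery schema.
        event = json.loads(message.decode("utf-8"))
        return {
            "event_id": event["id"],
            "event_ts": event["ts"],
            "payload": json.dumps(event),
        }


    options = PipelineOptions(streaming=True)  # pass --runner=DataflowRunner etc. on the CLI

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadEvents" >> ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                schema="event_id:STRING,event_ts:TIMESTAMP,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )

Because Beam uses the same pipeline model for batch and streaming, this structure carries over to the batch workflows mentioned above with only the source and options changing.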

Your Skills

Essential Skills
  • 3+ years of hands-on experience as a Data Engineer with proven expertise in Google Cloud Platform (GCP)
  • Strong experience with BigQuery (SQL, partitioning, clustering, optimization) and Dataflow (Apache Beam); a short partitioning example follows this list
  • Strong programming skills in Python with experience in data manipulation libraries (PySpark, pandas)
  • Expert-level SQL proficiency for complex transformations, optimization, and analysis
  • Proficiency with Dataform for modular SQL-based data transformations and data pipeline management
  • Solid understanding of data warehousing principles, ETL/ELT processes, dimensional modeling, and data governance
  • Experience integrating data from various APIs and streaming systems (Pub/Sub)
  • Cloud Composer experience for workflow orchestration (see the DAG sketch after these lists)
  • Excellent communication and collaboration skills in English (min. B2 level)
  • Ability to work independently and as part of an agile team
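
As a hypothetical illustration of the partitioning and clustering skills listed above, the following sketch uses the google-cloud-bigquery Python client to create a partitioned, clustered table and run a partition-pruned query. Dataset, table, and column names are placeholders, not details from this posting.

    # Sketch: partitioned, clustered BigQuery table plus a cost-aware query.
    from google.cloud import bigquery

    client = bigquery.Client()  # assumes application-default credentials and project

    # Daily partitioning bounds how much data a query can scan;
    # clustering co-locates rows that share a customer_id.
    client.query(
        """
        CREATE TABLE IF NOT EXISTS analytics.orders (
          order_id STRING,
          customer_id STRING,
          amount NUMERIC,
          order_ts TIMESTAMP
        )
        PARTITION BY DATE(order_ts)
        CLUSTER BY customer_id
        """
    ).result()

    # Filtering on the partitioning column lets BigQuery prune partitions,
    # so only the last 7 days of data is scanned (and billed).
    rows = client.query(
        """
        SELECT customer_id, SUM(amount) AS total
        FROM analytics.orders
        WHERE DATE(order_ts) >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
        GROUP BY customer_id
        """
    ).result()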
Beneficial Skills
  • Google Professional Data Engineer certification
  • Knowledge of BigLake for unified access and management of structured and unstructured data
  • Experience with Dataplex for managing metadata, lineage, and data governance
  • Familiarity with Infrastructure-as-Code (Terraform) for automating GCP resource provisioning and CI/CD pipelines
  • Experience with data visualization tools such as Looker, Looker Studio, or Power BI
  • Interest in or experience with machine learning workflows using Vertex AI or similar platforms
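
Finally, a minimal Cloud Composer (Apache Airflow) DAG sketch tying together the orchestration skills above: it waits for a daily export in Cloud Storage, then runs a BigQuery transformation. Bucket, object, and procedure names are hypothetical, and the operators assume the Google provider package that ships with Composer (Airflow 2.x).

    # Sketch of a daily ELT DAG: GCS sensor -> BigQuery transformation.
    import pendulum
    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
    from airflow.providers.google.cloud.sensors.gcs import GCSObjectExistenceSensor

    with DAG(
        dag_id="daily_orders_elt",
        schedule="@daily",  # "schedule" needs Airflow 2.4+; older versions use schedule_interval
        start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
        catchup=False,
    ) as dag:
        # Wait until the upstream system has dropped the daily export file.
        wait_for_export = GCSObjectExistenceSensor(
            task_id="wait_for_export",
            bucket="my-landing-bucket",
            object="exports/orders/{{ ds }}/orders.json",
        )

        # Run the SQL transformation inside BigQuery (ELT style).
        # The stored procedure is a hypothetical placeholder.
        transform_orders = BigQueryInsertJobOperator(
            task_id="transform_orders",
            configuration={
                "query": {
                    "query": "CALL analytics.refresh_daily_orders('{{ ds }}')",
                    "useLegacySql": False,
                }
            },
        )

        wait_for_export >> transform_orders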

Our Offer

While we’re one company, each location has unique aspects and benefits. Here’s a glimpse of what to expect:

  • Flexible Working Models: Early bird or night owl? Thanks to our flexible working models, you start your workday when and where it suits you best. Plus, our hybrid work option lets you choose whether you prefer working more from the office or from home.
  • Career Development and Onboarding: Start your journey at NETCONOMY with a structured onboarding and mentoring phase, and continue with individual training opportunities. Our People Enablement team will support you in finding the best solution for you!
  • Company Summit: Exchange professional and personal experiences at our annual in-house conference! One of the highlights of our Company Summit is the vibrant networking environment it cultivates.
  • Social Events: Join our colleagues and build new connections at social events such as pizza evenings, sports activities, Christmas parties, or milestone celebrations.
  • Snacks and Wellbeing: Fuel your productivity and taste buds with our monthly meal allowance and discounts at partner restaurants.
  • Mobility Support: Choose eco-friendly transportation! As we take our responsibility for the environment seriously, we help cover your climate-friendly transportation costs.

The actual salary depends on your qualifications and experience. It is also important to us that we pay our employees internally comparable salaries. For legal reasons, we are required to disclose the minimum annual salary for full-time employment as stated in the current collective IT agreement, which is €53,802 gross per year for this position.

Key Skills/Competencies

  • Google Cloud Platform
  • BigQuery
  • Dataflow
  • Python
  • SQL
  • ETL/ELT
  • Data Warehousing
  • Dataform
  • Cloud Composer
  • Data Governance

Tags:

Data Engineer
ETL
ELT
Data Modeling
Data Governance
Data Orchestration
Data Transformation
Data Ingestion
Real-time Processing
Performance Optimization
Scalability
Google Cloud Platform
GCP
BigQuery
Dataflow
Apache Beam
Python
SQL
Dataform
Pub/Sub
Cloud Composer
Apache Airflow

How to Get Hired at Netconomy

  • Research Netconomy's culture: Study their mission, values, recent news, and employee testimonials on LinkedIn and Glassdoor.
  • Tailor your Data Engineer resume: Highlight GCP expertise, BigQuery, Python, SQL, and data warehousing relevant to Netconomy's platforms.
  • Showcase project portfolio: Prepare to discuss specific data engineering projects demonstrating ETL, data modeling, and GCP services.
  • Master Google Cloud Platform: Deepen your knowledge of BigQuery, Dataflow, Pub/Sub, and Cloud Composer for technical interviews.
  • Demonstrate agile collaboration: Emphasize teamwork and communication skills, as Netconomy values cross-functional agile teams.
