Senior Data Engineer
@ Suvoda

Iași, Romania
On Site
Posted 19 days ago

Job Details

Senior Data Engineer

Suvoda is seeking a skilled and driven Senior Data Engineer to help evolve our data platform towards a data mesh architecture. This role is remote (based in Romania) and reports to the Manager, Data Engineering in the Product Development department.

Responsibilities

  • Design and implement a data mesh architecture using GraphQL APIs.
  • Build and maintain an AWS-based data lake with S3, Glue, Lake Formation, Athena, and Redshift.
  • Develop and optimize ETL/ELT pipelines using AWS Glue and PySpark.
  • Implement AWS DMS pipelines for near real-time replication to Aurora PostgreSQL.
  • Ensure data governance, quality, observability, and API design best practices.
  • Collaborate with cross-functional teams (product, engineering, analytics) to deliver robust data solutions.
  • Contribute to automation and CI/CD practices for data pipelines.
  • Stay updated with emerging technologies and industry trends.
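To give a flavor of the GraphQL-fronted data mesh work described above, here is a minimal, self-contained sketch of how a consumer might build a GraphQL request for a domain "data product". The endpoint shape, types, and field names (`trial`, `sites`, `enrolledSubjects`) are invented for illustration and are not Suvoda's actual schema.

```python
import json

# Hypothetical GraphQL query against a domain team's data product.
# All type and field names here are illustrative only.
QUERY = """
query TrialEnrollment($trialId: ID!) {
  trial(id: $trialId) {
    id
    sites {
      siteId
      enrolledSubjects
    }
  }
}
"""

def build_graphql_payload(query: str, variables: dict) -> str:
    """Serialize a GraphQL request body as JSON, ready for an HTTP POST."""
    return json.dumps({"query": query, "variables": variables})

payload = build_graphql_payload(QUERY, {"trialId": "TRIAL-001"})
body = json.loads(payload)
print(body["variables"]["trialId"])  # TRIAL-001
```

In a data mesh, each domain would expose a contract like this so consumers depend on the published schema rather than on internal tables.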

Requirements

  • Bachelor’s degree in Computer Science, Mathematics, or related technical field.
  • Minimum 5 years of experience in data engineering.
  • Experience with GraphQL APIs and complex data systems.
  • Strong expertise in AWS Glue and PySpark.
  • Solid knowledge of AWS data lake technologies including S3, Lake Formation, Athena, and Redshift.
  • Hands-on experience with AWS DMS and Aurora PostgreSQL.
  • Understanding of data mesh principles.
  • Proficiency in Python, SQL, and infrastructure-as-code tools like Terraform or CloudFormation.
  • Experience with data modeling, orchestration tools (e.g., Airflow), and CI/CD pipelines.
  • Strong communication and collaboration skills.
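As a small illustration of the Python/SQL and data-modeling skills listed above, here is a runnable extract-transform-load sketch using Python's built-in sqlite3 module. The in-memory database stands in for a warehouse target such as Redshift or Aurora, and the table and column names are hypothetical.

```python
import sqlite3

# In-memory database standing in for a warehouse target.
conn = sqlite3.connect(":memory:")

# Extract/load: raw events land in a staging table.
conn.execute("CREATE TABLE raw_events (subject_id TEXT, visit_date TEXT)")
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?)",
    [("S-1", "2024-01-05"), ("S-1", "2024-02-10"), ("S-2", "2024-01-20")],
)

# Transform: aggregate raw events into a modeled fact table.
conn.execute(
    """
    CREATE TABLE fact_subject_visits AS
    SELECT subject_id, COUNT(*) AS visit_count
    FROM raw_events
    GROUP BY subject_id
    """
)
rows = dict(conn.execute("SELECT subject_id, visit_count FROM fact_subject_visits"))
print(rows)  # {'S-1': 2, 'S-2': 1}
```

The same staging-then-model pattern scales up to Glue/PySpark jobs writing to S3 and Redshift; only the engine changes, not the idea.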

Preferred Qualifications

  • Master’s degree in data engineering, distributed systems, or cloud architecture.
  • Experience with event-driven architectures (e.g., Kafka, Kinesis).
  • Familiarity with data cataloging and metadata management tools.
  • Knowledge of data privacy and compliance standards (e.g., GDPR, HIPAA).
  • Background in agile development and DevOps practices.

Important Information

Beware of fraudulent recruiters. Suvoda will never ask for sensitive personal information or payments during the hiring process. Genuine emails will come from an @suvoda.com address. For California residents, additional information is provided as per state guidelines.

Key skills/competencies

  • Data Mesh
  • GraphQL
  • AWS Glue
  • PySpark
  • ETL/ELT
  • Aurora PostgreSQL
  • Data Lake
  • CI/CD
  • Airflow
  • Terraform

How to Get Hired at Suvoda

🎯 Tips for Getting Hired

  • Research Suvoda's culture: Study their mission and values online.
  • Tailor your resume: Highlight AWS, data lake, and ETL expertise.
  • Showcase projects: Include GraphQL and data mesh experience.
  • Prepare for technical rounds: Review AWS Glue, PySpark, and CI/CD practices.

📝 Interview Preparation Advice

Technical Preparation

  • Review the AWS Glue and PySpark documentation.
  • Practice building ETL pipelines on AWS.
  • Familiarize yourself with GraphQL API integrations.
  • Refresh your skills in Aurora PostgreSQL and AWS DMS.
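When practicing pipeline work, it also helps to rehearse the kind of simple data-quality check a Glue or Airflow task might run before loading data. A library-free sketch, with column names invented for illustration:

```python
def check_not_null(rows, column):
    """Return the indices of rows where the given column is missing or None."""
    return [i for i, row in enumerate(rows) if row.get(column) is None]

# Hypothetical sample records; "subject_id" and "site" are illustrative names.
sample = [
    {"subject_id": "S-1", "site": "RO-01"},
    {"subject_id": None, "site": "RO-02"},
]
failures = check_not_null(sample, "subject_id")
print(failures)  # [1]
```

Being able to explain where checks like this sit in a pipeline (and what happens on failure) is a common theme in data-engineering interviews.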

Behavioral Questions

  • Describe a complex project challenge you solved.
  • Explain how you collaborated during data architecture design.
  • Discuss time management in prior roles.
  • Demonstrate adaptability to changing technology.

Frequently Asked Questions