Data Platforms & DevOps Engineer

Tether

On Site
Full Time
€90,000
Barcelona, Catalonia, Spain

Job Overview

Job Title: Data Platforms & DevOps Engineer
Job Type: Full Time
Category: Commerce
Experience: 5 Years
Degree: Master
Offered Salary: €90,000
Location: Barcelona, Catalonia, Spain

Job Description

About Tether

Tether enables EV charging operators (CPOs) to earn new revenue from flexibility/balancing markets and through e-credits, without extra complexity. We’re working with several clients, close to our first market qualification in Sweden, and expanding across the Nordics and selected EU markets.

Role summary

We are looking for a hybrid Data Platforms & DevOps Engineer to manage our end-to-end data lifecycle. You will not only maintain the cloud infrastructure but also build and optimize the ingestion engines that feed our Databricks Lakehouse. A key part of this role involves interacting with REST APIs, managing serverless ingestion (AWS Lambda), and ensuring data quality from the moment it hits S3.

Key responsibilities

  • IaC & security hardening: Expand our AWS infrastructure using Terraform or CloudFormation. You will implement rigorous security measures, including VPC peering/PrivateLink for Databricks, KMS encryption at rest, and IAM least-privilege policies.
  • API ingestion & engineering: Build and maintain Python-based ingestion services. You will manage API authentication, handle rate limiting, and ensure efficient data partitioning in S3.
  • CI/CD evolution: Scale our GitHub Actions workflows to handle multi-environment deployments (Dev/Sandbox/Prod) for both cloud infrastructure (Glue and Kinesis) and Databricks DLT pipelines.
  • Spark performance & optimization: Monitor and tune Spark configurations (shuffling, partitioning, caching) to ensure our DLT and AutoLoader pipelines run efficiently.
  • MLOps support: Partner with Data Scientists to automate ML model deployments, managing feature store integrations and model serving infrastructure.
  • Security & governance: Implement SSE-KMS encryption, IAM policies, and lifecycle rules to ensure our data lake is compliant and cost-effective.
  • Observability & monitoring: Build a "single pane of glass" using CloudWatch and Datadog. You’ll create dashboards that track pipeline latency, AutoLoader costs, and system health.
  • Documentation & knowledge transfer: Produce high-quality architectural diagrams and runbooks. You aren't just building; you are also mentoring the internal team to ensure long-term operational success.
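To make the ingestion and partitioning responsibilities above concrete, here is a minimal, illustrative sketch of the kind of serverless ingestion logic the role describes: a Lambda-style handler that derives a Hive-partitioned S3 key for an incoming record, plus an exponential-backoff schedule for API rate limiting. All function names, the partition layout, and the event shape are assumptions for illustration, not Tether's actual codebase.

```python
import json
from datetime import datetime, timezone


def partition_key(source: str, ts: datetime) -> str:
    """Hive-style S3 prefix (e.g. raw/ocpi/year=2024/month=01/day=15/).

    Partitioning raw data by ingestion date keeps downstream AutoLoader
    and Glue scans cheap by pruning irrelevant prefixes.
    """
    return f"raw/{source}/year={ts:%Y}/month={ts:%m}/day={ts:%d}/"


def backoff_delays(retries: int, base: float = 1.0, cap: float = 30.0) -> list[float]:
    """Capped exponential backoff schedule for HTTP 429 (rate limit) retries."""
    return [min(cap, base * 2 ** i) for i in range(retries)]


def handler(event: dict, context=None) -> dict:
    """Illustrative Lambda entry point: compute the S3 key for one payload.

    A real handler would authenticate against the upstream REST API,
    PUT the payload to S3 with boto3, and sleep/retry on throttling
    using the schedule from backoff_delays().
    """
    ts = datetime.fromisoformat(event["timestamp"]).astimezone(timezone.utc)
    key = partition_key(event["source"], ts) + f"{event['id']}.json"
    return {"statusCode": 200, "body": json.dumps({"key": key})}
```

The date-partitioned layout is one common convention; the important property is that it matches whatever partition scheme the Databricks AutoLoader stream expects on the other side.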

Requirements

  • AWS Serverless: Hands-on with Lambda, EventBridge, SQS, SNS, and S3.
  • Databricks: Experience with Delta Live Tables (DLT), AutoLoader, and Unity Catalog.
  • DevOps: Proficient in GitHub Actions and Terraform.
  • Monitoring: Hands-on experience with Datadog and CloudWatch Logs/Metrics.
  • Languages: Strong Python (for Lambda/PySpark) and SQL (for data validation/modelling).
  • Data prep: Knowledge of AWS Glue (Catalog/ETL) and Kinesis Firehose is a major plus.

Compensation & structure

Competitive salary appropriate for an early-stage European startup. Company equity.

Join Tether for…

  • A central role in defining how EV charging flexibility is sold in Europe
  • Direct collaboration with the management team on product, data, and cloud infrastructure
  • High autonomy, fast decision cycles, and meaningful equity/ownership discussions for the right profile

Key skills/competency

  • AWS Infrastructure
  • DevOps
  • Databricks
  • Python
  • Terraform
  • CI/CD
  • Data Pipelines
  • CloudWatch
  • Datadog
  • Spark Optimization

Tags:

Data Platforms Engineer
DevOps Engineer
AWS
Databricks
Python
Terraform
CI/CD
Data Ingestion
Spark
MLOps
Serverless
Cloud Infrastructure
Data Engineering
CloudWatch
Datadog
S3
Lambda
Glue
Kinesis
SQL

How to Get Hired at Tether

  • Research Tether's mission: Study their focus on EV charging flexibility and market expansion in the Nordics/EU.
  • Tailor your resume: Highlight AWS Serverless, Databricks, Python, and DevOps experience to match the role.
  • Showcase your projects: Prepare to discuss real-world examples of IaC, data ingestion, and Spark optimization.
  • Prepare for technical depth: Brush up on AWS services, Databricks DLT, GitHub Actions, and data quality practices.
  • Understand startup culture: Emphasize autonomy, collaborative spirit, and a proactive approach in an early-stage environment.
