
Data Engineer

BTSE


Job Overview

Job Title: Data Engineer
Job Type: Full Time
Offered Salary: $130,000
Location: Hybrid


Job Description

About BTSE

BTSE Group is a global leader in fintech and blockchain technology, anchored by three core business pillars: Exchange, Payments, and Infrastructure Development. Serving over 100 corporate clients worldwide, we provide white-label exchange and payment solutions. Our offerings encompass everything from exchange infrastructure hosting and development to custody, wallets, payments, blockchain integration, trading, and more. We are looking for talented professionals in marketing, operations, customer support, and other departments. The roles offered may be on-site, remote, or hybrid, in collaboration with our local partner.

About The Opportunity

You own the storage layer and cloud infrastructure provisioning. You design the multi-tenant data architecture that makes IP isolation possible and auditable: shared public data stores for market data, and per-tenant isolated stores for proprietary data. Every table, bucket, and index is either explicitly shared or explicitly tenant-scoped. You provision all infrastructure as code and ensure that onboarding a new tenant is a scripted operation, not a manual one.

Responsibilities

  • Provision all cloud infrastructure via Terraform: object storage, vector databases, event streaming, Kubernetes, time-series databases, authentication. All reproducible for new tenants.
  • Design multi-tenant storage: shared vector indices for public data, per-tenant indices for proprietary data. Row-level security or schema-level isolation.
  • Design per-tenant storage structure with bucket policies enforcing isolation.
  • Build market data storage pipeline: exchange feeds → event bus → time-series database.
  • Build monitoring dashboards for data pipeline health across all data sources.
  • Design feedback data storage: per-tenant schema for feedback events and training data candidates.
  • Build data archival pipelines for cost-efficient long-term storage.
  • Automate tenant provisioning: a script that creates a new tenant’s storage, network policies, and service accounts.
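
The last responsibility, scripted tenant onboarding, might look something like the following. The naming scheme and returned plan are assumptions for illustration, not BTSE's real provisioning layout; in practice each entry would map to a Terraform module invocation.

```python
import re

# Tenant ids: lowercase, alphanumeric with hyphens, 3-31 chars (an assumed convention).
TENANT_ID_RE = re.compile(r"^[a-z][a-z0-9-]{2,30}$")

def provision_plan(tenant_id: str) -> dict:
    """Return the set of isolated resources a new tenant needs.

    Deterministic names make onboarding repeatable and auditable:
    running the script twice for the same tenant yields the same plan.
    """
    if not TENANT_ID_RE.match(tenant_id):
        raise ValueError(f"invalid tenant id: {tenant_id!r}")
    return {
        "bucket": f"tenant-{tenant_id}-data",               # per-tenant object storage
        "schema": f"tenant_{tenant_id.replace('-', '_')}",  # per-tenant DB schema
        "service_account": f"svc-{tenant_id}",              # scoped credentials
        "network_policy": f"np-isolate-{tenant_id}",        # deny cross-tenant traffic
    }
```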

Requirements

  • 4+ years of data engineering experience; strong SQL, Python, and cloud infrastructure skills.
  • Experience designing multi-tenant data architectures with isolation requirements.
  • Infrastructure as Code with Terraform or Pulumi (required).
  • PostgreSQL experience (vector extensions, partitioning, row-level security a plus).
  • Kafka consumer/producer development.
  • Time-series data storage and querying experience.
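
The Kafka and time-series requirements come together in the market-data pipeline named under Responsibilities (exchange feeds → event bus → time-series database). A minimal in-memory stand-in, with a queue and a dict substituting for Kafka and the time-series store purely for illustration:

```python
from collections import deque

event_bus: deque = deque()        # stands in for a Kafka topic
timeseries: dict[str, list] = {}  # symbol -> ordered (ts, price) rows

def publish_tick(symbol: str, ts: int, price: float) -> None:
    """Producer side: an exchange feed handler appends one tick."""
    event_bus.append({"symbol": symbol, "ts": ts, "price": price})

def drain_to_store() -> int:
    """Consumer side: move all queued ticks into the time-series store."""
    written = 0
    while event_bus:
        tick = event_bus.popleft()
        timeseries.setdefault(tick["symbol"], []).append((tick["ts"], tick["price"]))
        written += 1
    return written
```

In production the producer and consumer would be separate services, and the decoupling via the event bus is what lets storage lag without dropping feed data.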

Nice to have

  • Experience with financial data: time-series, tick data, on-chain events.
  • Understanding of data sovereignty and compliance requirements.
  • Experience with tenant provisioning automation.
  • Blockchain or crypto data pipeline experience.

We may use artificial intelligence (AI) tools to support parts of the hiring process, such as reviewing applications, analyzing resumes, or assessing responses. These tools assist our recruitment team but do not replace human judgment. Final hiring decisions are ultimately made by humans. If you would like more information about how your data is processed, please contact us.

Key skills/competencies

  • Data Engineering
  • Cloud Infrastructure
  • Terraform
  • Python
  • SQL
  • Multi-tenant Architecture
  • PostgreSQL
  • Kafka
  • Time-Series Databases
  • Data Pipelines


How to Get Hired at BTSE

  • Tailor your resume: Highlight your 4+ years of data engineering experience, focusing on SQL, Python, and cloud infrastructure, especially Terraform and multi-tenant architecture design.
  • Showcase IaC skills: Emphasize your experience with Infrastructure as Code tools like Terraform or Pulumi in your application and resume.
  • Quantify achievements: Provide specific examples of building data pipelines, automating tenant provisioning, and managing complex data storage solutions.
  • Prepare for technical questions: Be ready to discuss your experience with PostgreSQL (including vector extensions, partitioning, row-level security), Kafka, and time-series databases.
  • Highlight relevant experience: Mention any experience with financial data, data sovereignty, or blockchain/crypto data pipelines if applicable.
