Data Engineer
Crossing Hurdles
Hybrid
Original Job Summary
Overview
Crossing Hurdles, a recruitment firm, refers top candidates to our partners working with the world’s leading AI research labs. The Data Engineer role is full-time and remote with compensation up to $100,000/year.
Role Responsibilities
- Build robust data pipelines to ingest, transform, and consolidate diverse data sources like MongoDB, Airtable, PostHog, and production databases.
- Design dbt models and transformations to unify disparate data tables into production-ready schemas.
- Implement scalable, fault-tolerant workflows using Fivetran, dbt, SQL, and Python.
- Collaborate with engineers, data scientists, and business stakeholders to ensure data availability and accuracy.
- Own data quality and reliability across the entire data stack from ingestion to consumption.
- Continuously improve pipeline performance, monitoring, and scalability.
Requirements
- Proven experience in data engineering with strong proficiency in SQL and Python.
- Experience with modern data stack tools such as Fivetran, dbt, Snowflake, or similar platforms.
- Skilled in building and maintaining large-scale ETL/ELT pipelines.
- Strong understanding of data modeling, schema design, and transformation best practices.
- Familiarity with data governance, monitoring, and quality assurance tools.
- Bonus: Experience supporting machine learning workflows or analytics platforms.
- Must have access to a desktop or laptop computer (Chromebooks are not supported).
Application Process
Submit your resume highlighting relevant data engineering experience. Selected candidates will proceed with interview and onboarding steps. Compensation and role details will be shared during the process.
Key Skills/Competencies
- Data Pipelines
- SQL
- Python
- dbt
- Fivetran
- Snowflake
- ETL
- Data Modeling
- Collaboration
- Data Quality
How to Get Hired at Crossing Hurdles
🎯 Tips for Getting Hired
- Customize resume: Tailor your experience to data engineering.
- Include keywords: Emphasize SQL, Python, and ETL skills.
- Review tools: Highlight Fivetran, dbt, and Snowflake expertise.
- Prepare for interviews: Practice technical and situational questions.
📝 Interview Preparation Advice
Technical Preparation
- Review SQL query optimization techniques.
- Practice Python coding for data tasks.
- Gain hands-on experience with ETL tool setups.
- Study dbt and Fivetran integration.
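One way to practice the Python point above is a small, self-contained exercise of the kind often asked in data engineering interviews: deduplicate raw records by id, keep the most recent update, and normalize a field. This is a generic practice sketch; the field names are illustrative, not taken from the posting.

```python
def latest_unique_records(rows):
    """Return one record per id, keeping the row with the latest updated_at,
    and normalizing email casing/whitespace."""
    latest = {}
    for row in rows:
        key = row["id"]
        # Keep whichever row has the later updated_at for each id.
        if key not in latest or row["updated_at"] > latest[key]["updated_at"]:
            latest[key] = row
    # Normalize the email field on the surviving rows.
    return [
        {**row, "email": row["email"].strip().lower()}
        for row in latest.values()
    ]

raw = [
    {"id": 1, "email": " Alice@Example.COM ", "updated_at": "2024-01-02"},
    {"id": 1, "email": "alice@example.com", "updated_at": "2024-01-05"},
    {"id": 2, "email": "Bob@Example.com", "updated_at": "2024-01-03"},
]
print(latest_unique_records(raw))
```

Being able to talk through trade-offs here (dict lookup vs. sorting, in-memory vs. SQL window functions like `ROW_NUMBER()`) tends to matter as much as the code itself.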
Behavioral Questions
- Describe a challenging project collaboration.
- Explain your problem-solving approach.
- Discuss managing deadlines under pressure.
- Share teamwork experiences in data projects.