Data Platform Engineer
@ Lean IT Inc.

Hybrid
$120,000
Contractor
Posted 11 hours ago

Job Details

Company Overview

Welcome to Lean IT Inc., a leader in technological innovation and an official Salesforce Ridge Partner. We excel in data visualization, big data implementation, data migration, and data modernization, transforming data into actionable insights. We also contribute to global philanthropy as a member of Pledge 1%.

Role Overview

The Data Platform Engineer is a contract role for a seasoned professional with 6+ years of experience. The work splits roughly 60% administration and 40% development/support: you will design, build, and maintain scalable data platforms and DataOps pipelines with cutting-edge technologies.

Key Responsibilities

  • Design, develop, and maintain scalable ETL pipelines and integration frameworks (a minimal PySpark sketch follows this list).
  • Administer and optimize Databricks and Apache Spark environments.
  • Build and manage AWS-based data workflows using Lambda, Glue, Redshift, SageMaker, and S3.
  • Support and troubleshoot DataOps pipelines to ensure reliability and performance.
  • Automate operations with Python, PySpark, and infrastructure-as-code tools.
  • Collaborate with cross-functional teams on data ingestion, transformation, and deployment.
  • Provide technical leadership and mentorship.
  • Create and maintain technical documentation and training materials.
  • Troubleshoot issues and implement long-term resolutions.
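
To make the day-to-day concrete, here is a minimal PySpark ETL sketch in the spirit of the first few bullets. The bucket paths, column names, and app name are hypothetical placeholders, not Lean IT Inc. specifics, and the Delta writer assumes a Databricks or otherwise Delta-enabled Spark runtime.

  # Minimal PySpark ETL sketch -- bucket paths, column names, and app
  # name are hypothetical placeholders, not Lean IT Inc. specifics.
  from pyspark.sql import SparkSession
  from pyspark.sql import functions as F

  spark = SparkSession.builder.appName("orders-etl").getOrCreate()

  # Extract: read raw JSON events landed in S3.
  raw = spark.read.json("s3://example-raw-bucket/orders/")

  # Transform: drop malformed rows, normalize types, stamp a load date.
  clean = (
      raw.dropna(subset=["order_id", "amount"])
         .withColumn("amount", F.col("amount").cast("double"))
         .withColumn("load_date", F.current_date())
  )

  # Load: append to a Delta table, partitioned for downstream queries.
  (clean.write.format("delta")
        .mode("append")
        .partitionBy("load_date")
        .save("s3://example-curated-bucket/orders_delta/"))

On Databricks this pattern would typically run as a scheduled job; the same code works on any Spark cluster with the Delta Lake package installed.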

Minimum Qualifications

Bachelor’s or Master’s degree in Computer Science or a related field with 5+ years of relevant experience and strong expertise in Databricks, AWS, and ETL development.

Required Technical Skills

  • Proficiency in Python and PySpark.
  • Expertise in Databricks, Apache Spark, and Delta Lake.
  • Strong experience with AWS CloudOps and Cloud Security.
  • Solid SQL skills and hands-on experience with Amazon Redshift.
  • Experience in ETL development, data transformation, and orchestration.

Nice to Have / Working Knowledge

  • Kafka for real-time data streaming (see the streaming sketch after this list).
  • Fivetran for data ingestion and dbt for in-warehouse transformation.
  • Familiarity with DataOps practices and open-source tools.
  • Experience with Apache Camel and MuleSoft integration tools.
  • Understanding of RESTful APIs, message queuing, and event-driven architectures.
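
For the streaming item above, this is roughly what real-time ingestion looks like with Spark Structured Streaming. The broker address, topic name, and S3 paths are hypothetical, and the Kafka source assumes the spark-sql-kafka connector is available on the cluster.

  # Sketch of real-time ingestion with Spark Structured Streaming.
  # Broker address, topic, and S3 paths are hypothetical placeholders.
  from pyspark.sql import SparkSession

  spark = SparkSession.builder.appName("events-stream").getOrCreate()

  # Source: subscribe to a Kafka topic (key/value arrive as binary).
  events = (
      spark.readStream.format("kafka")
           .option("kafka.bootstrap.servers", "broker:9092")
           .option("subscribe", "example-events")
           .load()
           .selectExpr("CAST(value AS STRING) AS payload", "timestamp")
  )

  # Sink: append raw payloads to a Delta table; the checkpoint directory
  # tracks progress so the stream can restart without data loss.
  query = (
      events.writeStream.format("delta")
            .option("checkpointLocation", "s3://example-bucket/checkpoints/events/")
            .outputMode("append")
            .start("s3://example-bucket/bronze/events/")
  )
  query.awaitTermination()

The checkpoint location is the key design choice here: it is what lets the stream resume exactly where it left off after a restart.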

Key Skills/Competencies

Data Engineering, ETL, AWS, Python, Databricks, Apache Spark, Delta Lake, CloudOps, DataOps, SQL

How to Get Hired at Lean IT Inc.

🎯 Tips for Getting Hired

  • Customize your resume: Highlight data platform and AWS experiences.
  • Research Lean IT Inc.'s culture: Review their Salesforce partnership and tech innovations.
  • Showcase technical projects: Demonstrate expertise in Python and Spark.
  • Prepare for interviews: Practice explaining ETL processes and troubleshooting methodologies.

📝 Interview Preparation Advice

Technical Preparation

  • Review Python and PySpark syntax and core libraries.
  • Practice AWS service configuration and deployment.
  • Study Databricks and Apache Spark optimization techniques (a short rehearsal snippet follows this list).
  • Rehearse ETL pipeline design and troubleshooting.
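
As a rehearsal aid for the optimization item, here is a short, hypothetical snippet: inspecting a physical plan and caching a reused intermediate result are common talking points in Spark tuning discussions. The table path and column names are placeholders.

  # Rehearsal snippet for Spark tuning questions; the table path and
  # column names are hypothetical placeholders.
  from pyspark.sql import SparkSession

  spark = SparkSession.builder.appName("tuning-demo").getOrCreate()
  df = spark.read.format("delta").load("/tmp/example_orders")

  # Inspect the physical plan to spot full scans and wide shuffles.
  df.groupBy("customer_id").count().explain()

  # Cache a hot intermediate before reusing it across several queries;
  # the count() materializes the cache.
  hot = df.filter("load_date >= '2024-01-01'").cache()
  print(hot.count())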

Behavioral Questions

  • Describe a time you solved a difficult technical challenge.
  • Explain your approach to mentoring team members.
  • Discuss how you handle conflicting project requirements.
  • Share experiences managing remote team collaboration.

Frequently Asked Questions