Data Engineer - Project Delivery Analyst
Deloitte

Job Description
Are you an experienced, passionate pioneer in technology who wants to work in a collaborative environment? As a Data Engineer - Project Delivery Analyst, you will share new ideas and collaborate on projects as a consultant without the extensive demands of travel. If this sounds like you, consider an opportunity with Deloitte under our Project Delivery Talent Model, which is tailored specifically for long-term, onsite client service delivery.
Recruiting for this role ends on April 10th, 2026.
Work You'll Do/Responsibilities
You will support a Data & Analytics Foundry across numerous business product teams (a scaled program with approximately 235 onshore/offshore resources), building reliable pipelines and curated datasets for analytics and downstream consumption.
- Build and enhance data pipelines on AWS using Python to ingest, transform, and deliver data to Snowflake and downstream consumers.
- Develop and maintain Snowflake objects (schemas, tables, views) and performant SQL transformations to produce curated, analytics-ready datasets.
- Implement workflow automation and scheduling (e.g., Airflow/MWAA, Step Functions, Glue) with proper dependencies, retries, and logging.
- Apply data quality checks and basic observability (validation rules, reconciliation, alerts) and support incident triage and remediation.
- Optimize pipeline and query performance with guidance (efficient Python, partitioning/file formats in S3, Snowflake warehouse usage and query tuning).
- Follow CI/CD and IaC standards (e.g., Git-based workflows, Terraform/CloudFormation changes) to promote code across environments.
- Collaborate with analysts, product owners, and source-system teams to clarify requirements and validate outputs; participate in sprint ceremonies and estimations.
- Contribute to code reviews (give/receive), unit tests, and peer debugging; learn and apply team engineering standards.
- Communicate regularly with Engagement Managers (Directors), project team members, and representatives from functional and technical teams, escalating any matters that require additional attention from engagement management.
- Lead client engagement workstreams, both independently and collaboratively, focused on process improvement, optimization, and transformation, including implementing leading-practice workflows, addressing quality deficits, and driving operational outcomes.
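To make the data-quality responsibility above concrete, here is a minimal sketch in pure Python of the kind of validation-rule and reconciliation check a pipeline step might apply before loading curated data. The rule names, record shape, and thresholds are hypothetical, not part of the job description; a production pipeline would likely express these checks in a framework or in Snowflake SQL.

```python
from dataclasses import dataclass, field


@dataclass
class ValidationResult:
    passed: list = field(default_factory=list)
    failed: dict = field(default_factory=dict)  # rule name -> offending records


# Hypothetical validation rules for an "orders" feed.
RULES = {
    "order_id_present": lambda r: bool(r.get("order_id")),
    "amount_non_negative": lambda r: isinstance(r.get("amount"), (int, float))
    and r["amount"] >= 0,
}


def validate(records):
    """Apply each rule to each record; quarantine records that fail any rule."""
    result = ValidationResult()
    for rec in records:
        failures = [name for name, rule in RULES.items() if not rule(rec)]
        if failures:
            for name in failures:
                result.failed.setdefault(name, []).append(rec)
        else:
            result.passed.append(rec)
    return result


def reconcile(source_count, loaded_count, tolerance=0):
    """Source-to-target row-count reconciliation; returns (ok, drift)."""
    drift = source_count - loaded_count
    return abs(drift) <= tolerance, drift
```

Quarantining failed records (rather than dropping them) keeps the curated dataset clean while leaving an audit trail for incident triage and remediation.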
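Workflow automation "with proper dependencies, retries, and logging," as listed above, is normally configured in the orchestrator itself (Airflow/MWAA retry policies, Step Functions retry blocks). As an engine-agnostic sketch of the same idea, a task can be wrapped in a small retry helper; the function name and backoff scheme here are illustrative assumptions, not any orchestrator's API.

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")


def run_with_retries(task, max_attempts=3, backoff_seconds=0.0):
    """Run a task callable, retrying on failure with linear backoff and
    logging each attempt, similar to an orchestrator's retry policy."""
    for attempt in range(1, max_attempts + 1):
        try:
            result = task()
            log.info("task succeeded on attempt %d", attempt)
            return result
        except Exception as exc:
            log.warning("attempt %d/%d failed: %s", attempt, max_attempts, exc)
            if attempt == max_attempts:
                raise  # exhausted retries: surface the failure to the scheduler
            time.sleep(backoff_seconds * attempt)
```

The final re-raise matters: a task that swallows its last failure would report success to the scheduler and silently skip downstream dependencies.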
The Team
The AI & Data - AI & Engineering team leverages cutting-edge engineering capabilities to build, deploy, and operate integrated/verticalized sector solutions in software, data, AI, network, and hybrid cloud infrastructure. These solutions are powered by engineering for business advantage, transforming mission-critical operations. We enable clients to stay ahead with the latest advancements by transforming engineering teams and modernizing technology & data platforms. Our delivery models are tailored to meet each client's unique requirements.
Qualifications
Required
- 1+ year of experience building/enhancing data pipelines and curated datasets for analytics/downstream consumers.
- 1+ year of hands-on experience with SQL and Python, including Snowflake and/or PySpark for transformations and scalable processing.
- 1+ year of experience with cloud data engineering on AWS (preferred) or Azure/GCP, including orchestration/scheduling (e.g., Airflow/MWAA, Step Functions, Glue, ADF/Fabric Data Factory).
- Understanding of ELT patterns and Lakehouse/warehouse concepts; familiarity with S3 file formats/partitioning (e.g., Parquet/Delta).
- Working knowledge of DevOps practices (Git-based workflows, CI/CD) and exposure to Infrastructure-as-Code (Terraform/CloudFormation).
- Understanding of data quality, basic observability, and metadata/governance fundamentals.
- Bachelor's degree, preferably in Computer Science, Information Technology, Computer Engineering, or related IT discipline; or equivalent experience.
- Limited immigration sponsorship may be available.
- Ability to travel 10%, on average, based on the work you do and the clients and industries/sectors you serve.
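The required familiarity with S3 file formats and partitioning usually means hive-style key layouts for Parquet data. As a hedged illustration (the bucket, table, and partition column names are hypothetical), such keys can be generated as follows:

```python
from datetime import date


def partition_key(bucket, table, run_date, part_index, fmt="parquet"):
    """Build a hive-style S3 object key, e.g.
    s3://my-bucket/orders/dt=2024-06-01/part-00000.parquet.
    Partitioning on a date column lets query engines prune whole
    prefixes instead of scanning every file in the table."""
    return (
        f"s3://{bucket}/{table}/dt={run_date.isoformat()}/"
        f"part-{part_index:05d}.{fmt}"
    )
```

Pruning by partition prefix is what makes daily incremental loads cheap: a reprocessing job touches one `dt=` prefix rather than the full history.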
Preferred
- Agile delivery experience.
- Analytical ability to manage multiple projects and prioritize tasks into manageable work products.
- Ability to operate independently or with minimal supervision.
- Excellent written and verbal communication skills.
- Ability to deliver technical demonstrations.
A reasonable estimate of the current wage range for this role is $57,300 to $95,500, not adjusted for geographic differentials.
Key Skills/Competencies
- Data Pipeline Development
- AWS Cloud Services
- Snowflake Data Warehouse
- Python Programming
- SQL Querying
- ETL/ELT Processes
- Workflow Automation
- Data Quality Assurance
- DevOps Practices
- Project Delivery
How to Get Hired at Deloitte
- Research Deloitte's culture: Study their mission, values, recent news, and employee testimonials on LinkedIn and Glassdoor to align your application.
- Tailor your resume: Customize your resume to highlight experience in AWS, Python, Snowflake, and data pipeline development relevant to Deloitte's Data Engineer roles.
- Showcase project delivery skills: Emphasize your ability to lead workstreams, collaborate with clients, and drive operational outcomes in a consulting environment.
- Prepare for technical assessments: Practice advanced SQL, Python scripting, and demonstrate proficiency in cloud data engineering concepts, especially within AWS.
- Demonstrate agile and communication skills: Be ready to discuss your experience in Agile environments and effective communication with diverse project stakeholders and engagement managers.