Principal Data Engineer, AWS Snowflake
Deloitte

Job Description
Our Purpose
At Deloitte, our purpose is to make an impact. We exist to inspire and help our people, organizations, communities, and countries prosper by creating a better future. Our work supports a thriving society where people can flourish and seize opportunities. It strengthens consumer and business confidence, helps organizations find creative ways to deploy capital, empowers fair, reliable, and effective social and economic institutions, and allows our friends, families, and communities to enjoy the quality of life that comes with a sustainable future. As the largest professional services firm 100% owned and operated by Canadians in our country, we are proud to work alongside our clients to make a positive impact on all Canadians.
By living our purpose, we will make an impact that matters.
Diversify your career within the Firm. Take advantage of flexible, proactive, and practical benefits that foster a culture of well-being and strong connections. Deepen your knowledge through expert mentorship and on-the-job coaching.
What Your Day Will Look Like
We are looking for an experienced Data Engineer with in-depth expertise in AWS cloud data services or Snowflake to lead a high-performing team. In this role, you will oversee the design, development, and optimization of scalable, secure, and resilient data platforms and pipelines. You will guide technical strategy, ensure engineering excellence, and collaborate with analytics, product, and business teams to drive data enablement across the enterprise.
This position is ideal for a principal-level engineer who combines strong technical capabilities with strategic thinking and a passion for building modern data ecosystems.
Key Responsibilities
Strategy
- Define the data engineering roadmap considering business priorities, cloud strategy, and enterprise architecture standards.
- Establish best practices, coding standards, and governance frameworks for data engineering in AWS and Snowflake ecosystems.
Technical Delivery
- Design and oversee the development of scalable data pipelines leveraging AWS services (e.g., Glue, Lambda, S3, Step Functions, Redshift) and/or Snowflake.
- Lead the design and implementation of robust Extract, Transform, Load (ETL) and Extract, Load, Transform (ELT) frameworks with a focus on performance, security, and maintainability.
- Ensure enterprise-wide data integration across internal and external systems, ensuring reliable, high-quality data flows.
Data Quality and Automation
- Support the implementation of automated frameworks for data quality, profiling, and validation.
- Drive automation to improve reliability, reduce operational costs, and eliminate manual data processing steps.
- Oversee the development of cleaning, transformation, and enrichment strategies using SQL, Python, and other relevant tools.
Operational Excellence
- Drive continuous optimization of data pipelines to ensure efficiency, performance, and cost-effectiveness.
- Establish monitoring, alerting, and incident management processes.
- Maintain clear documentation for architecture, workflows, pipelines, and operational procedures.
Collaboration and Stakeholder Management
- Collaborate with stakeholders, analytics teams, and data governance groups to understand requirements and deliver scalable solutions.
- Communicate progress, risks, and technical decisions to leaders and cross-functional associates.
Required Skills
- Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field.
- At least three years of hands-on experience in data engineering, including developing ETL/ELT frameworks and cloud data platforms.
- Strong expertise in AWS data services (e.g., Glue, Lambda, EMR, Step Functions, S3).
- Proven experience in designing, implementing, and optimizing solutions in Snowflake (warehousing, performance tuning, cost management, security).
- Advanced proficiency in SQL and scripting languages (Python preferred).
- Strong understanding of modern data architecture practices, e.g., data lakes, lakehouses, data orchestration, as well as Continuous Integration/Continuous Deployment (CI/CD) and metadata management.
- Experience with data integration, quality monitoring, and data observability frameworks.
- Excellent communication and problem-solving skills.
Preferred Skills
- Experience with big data technologies (Spark, Hadoop, EMR, Databricks).
- Knowledge of data governance, lineage, and security frameworks.
- Familiarity with modern orchestration tools (Airflow, dbt, Dagster).
- AWS or Snowflake certification (Data Engineer, Solutions Architect, SnowPro).
Due to the nature of the role requiring interactions with national and global clients, bilingualism in French and English is required for this position.
Total Compensation
The salary range for this position is $72,000 to $138,000. Some individuals may be eligible for our bonus program. Deloitte's approach to salary centres on fairness and competitiveness: we regularly benchmark against market data across positions, industries, and levels. We recognize each person's unique strengths and contributions and reward them for the value they bring.
Deloitte's total compensation extends well beyond traditional compensation and benefits programs and is designed to recognize employee contributions, foster personal well-being, and support the Firm's growth. In addition to our regular paid time off, some examples include: $4,000 per year for mental health support benefits, a $1,300 flexible spending account, firm-wide closures known as 'Deloitte Days', dedicated learning days (known as Development and Innovation Days), flexible work arrangements, and a hybrid work structure.
Key Skills/Competencies
- Data Engineering
- AWS
- Snowflake
- ETL/ELT
- Data Pipelines
- SQL
- Python
- Data Architecture
- Cloud Data Platforms
- Data Governance
How to Get Hired at Deloitte
- Research Deloitte's Culture: Study their mission, values, recent news, and employee testimonials on LinkedIn and Glassdoor.
- Tailor Your Resume: Highlight AWS, Snowflake, ETL/ELT, and data architecture experience for data engineering roles.
- Showcase Technical Prowess: Be prepared to discuss advanced SQL, Python scripting, and cloud data platform optimization.
- Emphasize Strategic Thinking: Demonstrate your ability to define data roadmaps and solve complex data challenges.
- Prepare for Behavioral Questions: Focus on collaboration, leadership in data projects, and stakeholder management experiences.