Lead Data Engineer, Global Security

RBC

On Site
Full Time
CA$150,000
Toronto, ON

Job Overview

Job Title: Lead Data Engineer, Global Security
Job Type: Full Time
Offered Salary: CA$150,000
Location: Toronto, ON

Job Description

What is the opportunity?

At RBC, our data engineering team enhances visibility into assets across the Public Cloud and Application Security landscape. Our mission is to provide clear insights into digital infrastructure, enabling effective identification and management of security risks. We harness industry-leading tools like Databricks, Python, PySpark, and Tableau, transforming data into strategic assets. Our approach goes beyond traditional security by analyzing complex datasets to generate actionable business insights, thereby strengthening our cyber resilience. Collaboration is key to our success, fostering an innovative environment where team members leverage their narrative and technical skills to drive continuous advancements in cloud security.

What will you do?

  • Design, develop, and maintain end-to-end data pipelines in Azure Databricks using Spark (SQL, PySpark) to transform large datasets efficiently.
  • Develop and optimize ETL/ELT workflows using Databricks Workflows or Apache Airflow, ensuring data integrity, quality, and reliability.
  • Design and manage Delta Lake solutions for data versioning, incremental data loads, and efficient data storage.
  • Collaborate with cross-functional teams to understand data requirements, create robust data models, and deliver actionable insights.
  • Implement Site Reliability Engineering (SRE) practices for data pipelines by building automated monitoring, alerting, and incident management solutions to ensure data reliability, availability, and performance.
  • Apply best practices in data governance, ensuring compliance using Unity Catalog for access management and data lineage tracking.
  • Monitor, troubleshoot, and optimize Spark jobs for performance, addressing data pipeline bottlenecks and ensuring cost efficiency.
  • Implement infrastructure-as-code solutions using Terraform for automated resource provisioning and management.
  • Develop and maintain comprehensive documentation for data pipelines, transformations, and data models.
  • Lead and mentor a team of data engineers, providing technical guidance, fostering professional development, and building a culture of learning and best practices in data engineering.
  • Oversee the design and implementation of complex data solutions, ensuring alignment with business objectives.
  • Drive the adoption of best practices in data engineering, including code reviews, testing, and documentation.
  • Collaborate with stakeholders to define and prioritize data engineering projects, ensuring timely delivery and high-quality outcomes.
  • Stay updated on emerging technologies and trends in data engineering, recommending and implementing innovative solutions.
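
To give a concrete flavour of the infrastructure-as-code responsibility above, here is an illustrative Terraform fragment using the Databricks provider. The resource name, runtime version, and VM sizing are hypothetical examples, not RBC's actual configuration:

```hcl
# Illustrative sketch only: provisions a small autoscaling Databricks cluster.
# Provider configuration and workspace credentials are assumed to exist elsewhere.
resource "databricks_cluster" "pipeline_cluster" {
  cluster_name            = "security-data-pipeline" # hypothetical name
  spark_version           = "14.3.x-scala2.12"       # example LTS runtime
  node_type_id            = "Standard_DS3_v2"        # example Azure VM size
  autotermination_minutes = 30

  autoscale {
    min_workers = 2
    max_workers = 8
  }
}
```

Managing clusters this way keeps environments reproducible and reviewable through version control, which is the point of the Terraform requirement in this role.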

What do you need to succeed?

Must-have
  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
  • 8+ years of proven experience in data engineering, delivering business-critical software solutions for large enterprises with a consistent track record of success.
  • Strong expertise in Databricks (Delta Lake, Unity Catalog, Lakehouse Architecture, Table Triggers, Delta Live Pipelines, Databricks Runtime, Cluster management, etc.)
  • Proficiency in Azure Cloud Services.
  • Solid understanding of Spark and PySpark for big data processing.
  • English fluency, verbal and written.
  • Knowledge of SCM, Infrastructure-as-code, and CI/CD pipelines.
  • Experience leading and mentoring a team of data engineers.
  • Strong project management skills, with the ability to prioritize tasks and manage multiple projects simultaneously.
  • Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams.
  • Experience with Agile methodologies and DevOps practices.
Nice-to-have
  • Databricks certifications (e.g., Databricks Certified Data Engineer Associate, Databricks Certified Associate Developer for Apache Spark).
  • Exposure to Kubernetes, Docker, and Terraform.
  • Strong understanding of business intelligence and reporting tools.
  • Familiarity with Cyber Security Concepts.

What’s in it for you?

We thrive on the challenge to be our best, progressive thinking to keep growing, and working together to deliver trusted advice to help our clients thrive and communities prosper. We care about each other, reaching our potential, making a difference to our communities, and achieving success that is mutual.

  • A comprehensive Total Rewards Program including bonuses and flexible benefits, competitive compensation, commissions, and stock where applicable
  • Leaders who support your development through coaching and managing opportunities
  • Work in a dynamic, collaborative, progressive, and high-performing team
  • A world-class training program in financial services
  • Flexible work/life balance options
  • Opportunities to do challenging work
  • Opportunities to take on progressively greater accountabilities
  • Opportunities to build close relationships with clients

Key skills/competency

  • Data Pipeline Development
  • Azure Databricks
  • Spark and PySpark
  • Delta Lake Management
  • Site Reliability Engineering (SRE)
  • Data Governance (Unity Catalog)
  • Infrastructure-as-Code (Terraform)
  • Big Data Processing
  • Team Leadership & Mentorship
  • Cyber Security Concepts

Tags:

Lead Data Engineer
Data Pipeline
ETL
Databricks
Spark
Data Modeling
SRE
Data Governance
Troubleshooting
Mentorship
Leadership
Azure
Python
PySpark
Delta Lake
Unity Catalog
Terraform
Kubernetes
Docker
Airflow

How to Get Hired at RBC

  • Research RBC's culture: Study their mission, values, recent news, and employee testimonials on LinkedIn and Glassdoor.
  • Tailor your resume: Highlight your 8+ years of data engineering experience, Azure, Databricks, and leadership skills for RBC's Lead Data Engineer, Global Security role.
  • Showcase relevant projects: Prepare to discuss complex data pipeline projects, SRE implementation, and security-focused data solutions during interviews.
  • Master Databricks & Azure: Demonstrate deep expertise in Databricks (Delta Lake, Unity Catalog) and Azure Cloud services relevant to global security data.
  • Practice leadership scenarios: Be ready to share examples of mentoring, team leadership, and driving best practices in a data engineering environment.
