Senior Data Engineer

McAfee

Hybrid
Full Time
CA$130,000
Waterloo, ON

Job Overview

Job Title: Senior Data Engineer
Job Type: Full Time
Category: Commerce
Experience: 5 Years
Degree: Master's
Offered Salary: CA$130,000
Location: Waterloo, ON

Job Description

Role Overview: Senior Data Engineer at McAfee

As a Senior Data Engineer at McAfee, you will be a pivotal member of our data innovation team, tasked with designing, building, and overseeing the deployment and operation of critical technology architecture, solutions, and software. Your work will unlock the full potential of McAfee's data assets, combining hands-on technical implementation with strategic problem-solving to drive data-driven innovation across the organization.

This role involves establishing and building processes and structures to capture, manage, store, and utilize structured and unstructured data from diverse internal and external sources. You will create scalable solutions spanning from cloud-based architectures to traditional databases, operating at the intersection of data engineering, data science, and data quality. Leveraging artificial intelligence, machine learning, and big-data techniques, you will transform raw data into actionable insights that provide significant business value.

This is a collaborative position where you will partner closely with business stakeholders, data scientists, and product teams to solve complex problems, enable company-wide data solutions, and establish the foundational framework for data-driven decision making across McAfee.

This is a hybrid position based in either Waterloo or Toronto, Canada. Candidates must live within commutable distance of one of these locations, with onsite presence required on an as-needed basis.

About The Role

  • Partner with business stakeholders to understand data requirements and translate them into scalable technical solutions that drive operational efficiency and strategic insights.
  • Lead data innovation initiatives by identifying opportunities to leverage data assets for new business capabilities and competitive advantages.
  • Review internal and external business and product requirements for data operations and recommend strategic changes and upgrades to systems and storage.
  • Collaborate with data scientists to enable advanced analytics, predictive modeling, and machine learning initiatives that solve complex business problems.
  • Work with Professional Services teams on client-focused data solutions, ensuring alignment with business objectives and customer needs.
  • Design and oversee the deployment of comprehensive data architecture that captures, manages, and stores structured and unstructured data from multiple internal and external sources.
  • Build resilient ETL/ELT pipelines that channel data from multiple inputs, route appropriately, and store using cloud structures, local databases, and other applicable storage forms.
  • Establish processes and structures based on business and technical requirements to ensure optimal data flow across systems.
  • Create and maintain well-documented data services and interfaces for efficient data access across the organization.
  • Develop company-wide, web-enabled solutions that democratize data access and empower self-service analytics.
  • Develop technical tools and programming leveraging artificial intelligence, machine learning, and big-data techniques to cleanse, organize, and transform data on an automated basis.
  • Implement comprehensive data quality frameworks including validation checks, monitoring, and automated recovery strategies to maintain data accuracy, completeness, and freshness.
  • Apply business logic to cleanse, enrich, and structure raw data, ensuring consistency and quality across domains.
  • Leverage Model Context Protocol (MCP) to connect with top enterprise applications, enabling seamless automation of data flows and improving operational efficiency.
  • Utilize Copilot and Anthropic models to accelerate development, automate documentation, and enhance code quality and review processes.
  • Create and establish design standards and assurance processes for software, systems, and applications development to ensure compatibility and operability of data connections, flows, and storage requirements.
  • Ensure secure, scalable, and auditable data ingestion processes, with appropriate handling of PII and compliance requirements.
  • Uphold SDLC best practices across development and delivery stages to ensure reliability, maintainability, and scalability.
  • Maintain and defend data structures and integrity on an automated basis, implementing proactive monitoring and alerting systems.
  • Troubleshoot pipeline issues and collaborate with platform teams to optimize performance and recovery strategies.
  • Participate in on-call rotations to ensure 24/7 reliability of critical data systems.
  • Continuously evaluate and implement new technologies and methodologies to improve data engineering capabilities.
  • Mentor junior team members and contribute to the growth of the data engineering practice.
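
As an illustration of the pipeline and data-quality responsibilities above, here is a minimal Python sketch of an extract-validate-transform flow. All names and checks are hypothetical examples, not McAfee's actual stack or business logic:

```python
# Minimal ETL sketch: extract records, apply data-quality checks, and keep
# only rows that pass. Every name here is illustrative.
from dataclasses import dataclass

@dataclass
class Record:
    user_id: str
    event: str
    score: float

def extract(raw_rows):
    """Parse raw dicts into typed records, skipping malformed input."""
    records = []
    for row in raw_rows:
        try:
            records.append(Record(str(row["user_id"]), str(row["event"]),
                                  float(row["score"])))
        except (KeyError, TypeError, ValueError):
            continue  # production pipelines would route these to a dead-letter queue
    return records

def validate(record):
    """Data-quality checks: completeness and range validation."""
    return bool(record.user_id) and bool(record.event) and 0.0 <= record.score <= 1.0

def transform(record):
    """Apply business logic: normalize the event name."""
    return Record(record.user_id, record.event.strip().lower(), record.score)

def run_pipeline(raw_rows):
    """Extract -> validate -> transform; returns (clean rows, rejected count)."""
    records = extract(raw_rows)
    clean = [transform(r) for r in records if validate(r)]
    return clean, len(records) - len(clean)
```

In a real deployment the same shape would typically run as orchestrated tasks (e.g., Airflow operators or Spark jobs), with the rejected count feeding monitoring and alerting rather than being returned inline.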

About You

  • 5+ years of hands-on experience in developing ETL/ELT pipelines across varied data sources, with demonstrated ability to work across the full spectrum of data engineering challenges.
  • Experience with Copilot and Anthropic's Claude models to enhance development speed, code quality, and documentation.
  • Strong programming skills in languages such as Python, Scala, or Java, with ability to write production-quality code.
  • Experience with modern data platforms and tools (e.g., Snowflake, Databricks, Apache Spark, Kafka, Airflow).
  • Practical knowledge of Model Context Protocol (MCP) to connect enterprise applications and automate data workflows.
  • Experience with cloud platforms (AWS, Azure, GCP) and their native data services.
  • Knowledge of containerization and orchestration technologies (Docker, Kubernetes).
  • Strong expertise in data integration, transformation, and curation with a focus on quality and consistency.
  • Experience with real-time data processing and streaming architectures.
  • Background in data science or analytics, with ability to collaborate effectively with data scientists.
  • Experience in client-facing or Professional Services roles.
  • Familiarity with DataOps and MLOps practices.
  • A mindset focused on operational efficiency, automation, and continuous improvement.
  • Strong business acumen with ability to translate technical capabilities into business value.
  • Commitment to SDLC best practices and structured development processes.
  • Excellent communication and collaboration skills, with ability to work effectively with both technical and non-technical stakeholders.
  • Proactive approach to problem-solving with strong analytical and critical thinking skills.
  • Passion for innovation and staying current with emerging technologies and industry trends.
  • Proven experience with both structured and unstructured data, including design and implementation of solutions that leverage both traditional databases and modern cloud architectures.
  • Experience managing sensitive data, including PII, with attention to compliance and governance requirements.
  • Demonstrated ability to work with artificial intelligence, machine learning, and big-data techniques.
  • Solid understanding of data modeling, data warehousing concepts, and dimensional modeling.

Key Skills/Competency

  • ETL/ELT Pipeline Development
  • Cloud Data Platforms
  • Python/Scala/Java Programming
  • Data Architecture Design
  • Big Data Technologies
  • Data Quality & Governance
  • AI/ML Data Integration
  • Data Modeling
  • Real-time Data Processing
  • Containerization (Docker/Kubernetes)

Tags:

Senior Data Engineer
ETL
ELT
Data Pipelines
Data Architecture
Cloud Platforms
Big Data
Data Quality
AI
Machine Learning
Data Modeling
Python
Scala
Java
Snowflake
Databricks
Apache Spark
Kafka
Airflow
AWS
Azure
GCP
Docker
Kubernetes

How to Get Hired at McAfee

  • Research McAfee's culture: Study their mission, values, recent news, and employee testimonials on LinkedIn and Glassdoor to understand their commitment to consumer security and innovation.
  • Tailor your resume: Customize your resume to highlight experience in ETL/ELT pipelines, cloud platforms (AWS, Azure, GCP), big data technologies (Spark, Kafka), and AI/ML integration, directly addressing the Senior Data Engineer requirements.
  • Showcase your technical skills: Be prepared to discuss your proficiency in Python, Scala, or Java, and practical experience with data platforms like Snowflake and Databricks. Emphasize projects demonstrating data architecture design and data quality frameworks.
  • Prepare for behavioral questions: Reflect on experiences where you collaborated with cross-functional teams, solved complex data problems, or mentored junior colleagues, aligning your responses with McAfee's collaborative and innovative environment.
  • Demonstrate problem-solving acumen: During interviews, be ready to articulate your approach to troubleshooting data pipeline issues, optimizing performance, and ensuring data integrity, showcasing your proactive and analytical thinking.
