Data Engineer - Google Data Platforms
IBM

Job Description
Introduction
A career in IBM Consulting is built on long-term client relationships and close collaboration worldwide. You’ll work with leading companies across industries, helping them shape their hybrid cloud and AI journeys. With support from strategic partners, robust IBM technology, and Red Hat, you’ll have the tools to drive meaningful change and accelerate client impact. IBM Consulting encourages curiosity, challenging the norm, exploring new ideas, and creating innovative solutions that deliver real results. The culture emphasizes growth, empathy, and long-term career development, valuing your unique skills and experiences.
Your Role And Responsibilities
As a Data Engineer specializing in Google's data platforms, you will design, build, and maintain data engineering solutions on Google's Cloud ecosystem. Key responsibilities include:
- Design Data Pipelines: Develop batch and real-time data pipelines using Google Cloud services such as Dataproc, Dataflow, Pub/Sub, BigQuery, and Bigtable.
- Develop Data Engineering Solutions: Use Google Cloud Storage, Bigtable, Dataproc with Spark and Hadoop, and Dataflow with Apache Beam (Java or Python).
- Manage Data Platforms: Schedule and manage data pipelines with Cloud Scheduler and Cloud Composer (Airflow).
- Implement Data Migration: Develop solutions for seamless data transfer between systems.
- Optimize the Data Layer: Design and optimize data storage using BigQuery, Bigtable, and Cloud Spanner.
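As a purely illustrative aside (not part of the posting), the batch-pipeline work above usually reduces to extract/transform/load stages. The sketch below shows that shape in plain Python; in a real Dataflow job these steps would be Apache Beam transforms (ParDo, GroupByKey), and every record format and name here is hypothetical.

```python
# Hypothetical sketch of the transform logic a batch pipeline stage might run.
# In Dataflow this would be expressed as Beam PTransforms; plain Python is
# used here only to show the shape of the work.
from collections import defaultdict

def parse(line: str):
    """Parse a 'user_id,event,amount' record; return None for bad rows."""
    parts = line.split(",")
    if len(parts) != 3:
        return None
    user, event, amount = parts
    try:
        return user, event, float(amount)
    except ValueError:
        return None

def aggregate(lines):
    """Sum amounts per user, dropping malformed records (the ETL 'T' step)."""
    totals = defaultdict(float)
    for line in lines:
        rec = parse(line)
        if rec is not None:
            user, _event, amount = rec
            totals[user] += amount
    return dict(totals)

raw = ["u1,click,1.5", "u2,buy,10", "u1,buy,2.5", "bad_row"]
print(aggregate(raw))  # {'u1': 4.0, 'u2': 10.0}
```

The same parse/filter/aggregate structure carries over whether the runner is Dataflow (Beam) or Dataproc (Spark); only the framework wrapping changes.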
Preferred Education
Master's Degree
Required Technical And Professional Expertise
- Expertise in the Google Cloud ecosystem, including Dataproc, Dataflow, Pub/Sub, BigQuery, Bigtable, Cloud Spanner, Cloud SQL, and AlloyDB.
- Experience developing and managing batch and real-time data pipelines for data warehouses and data lakes.
- Proficiency in Google Cloud Storage, Bigtable, Dataproc (with Spark and Hadoop), and Dataflow with Apache Beam (Java or Python).
- Knowledge of pipeline scheduling and management using Cloud Scheduler and Cloud Composer (Airflow).
- Understanding of data layer design using Google Cloud services.
Preferred Technical And Professional Experience
- Exposure to open-source technologies such as Apache Airflow, dbt, Spark/Python, or Spark/Scala.
- Experience in developing and implementing data migration solutions.
- Expertise using Cloud Composer (Airflow) for efficient pipeline scheduling and management.
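For a concrete sense of the Cloud Composer (Airflow) work mentioned above, here is a minimal, hypothetical DAG-definition sketch of a daily batch schedule. It is configuration-as-code that only runs inside an Airflow/Composer deployment, and every identifier (DAG id, task names, callables) is illustrative, not taken from the posting.

```python
# Hypothetical Cloud Composer (Airflow 2.x) DAG sketch: a daily batch
# pipeline with an extract -> transform -> load dependency chain.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**_): ...    # e.g. pull files from Cloud Storage
def transform(**_): ...  # e.g. run a Dataproc/Spark or Dataflow job
def load(**_): ...       # e.g. write results to BigQuery

with DAG(
    dag_id="daily_batch_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Run order: extract, then transform, then load.
    t_extract >> t_transform >> t_load
```

Cloud Composer hosts the Airflow scheduler, so once a file like this lands in the environment's DAGs bucket, the `@daily` schedule and task dependencies are handled by the platform.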
Key Skills/Competency
- Google Cloud
- Data Pipelines
- Data Engineering
- Batch Processing
- Real-time Processing
- Data Migration
- Pipeline Scheduling
- BigQuery
- Apache Beam
- Airflow
How to Get Hired at IBM
- Customize your resume: Highlight Google Cloud and data engineering skills.
- Research IBM: Understand IBM Consulting’s culture and tech partnerships.
- Show technical prowess: Demonstrate experience with Google data platforms.
- Prepare for interviews: Focus on technical challenges and pipeline design.