Data Engineer - Data Platforms Google
IBM

Job Description
Introduction
A career in IBM Consulting is built on long-term client relationships and global collaboration. As a Data Engineer - Data Platforms Google at IBM, you will help top companies accelerate their hybrid cloud and AI journeys, supported by strategic partners, robust IBM technology, and Red Hat.
Your Role And Responsibilities
In this role, you will design, build, and maintain data engineering solutions on the Google Cloud ecosystem. Your main responsibilities include:
- Design Data Pipelines: Develop batch and real-time pipelines for data warehouses and data lakes using services such as Dataproc, Dataflow, Pub/Sub, BigQuery, and Bigtable.
- Develop Engineering Solutions: Build solutions with Cloud Storage, Bigtable, Dataproc, and Dataflow, using Apache Beam or Python.
- Manage Data Platforms: Schedule and oversee pipeline operations with Cloud Scheduler and Cloud Composer (Airflow).
- Implement Data Migration: Develop seamless data migration solutions using Google Cloud services.
- Optimize the Data Layer: Design and improve data storage and retrieval using BigQuery, Bigtable, and Cloud Spanner.
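As a rough illustration of the batch-pipeline pattern these responsibilities describe, here is a minimal sketch in plain Python. It has no GCP dependencies; the source and sink functions are hypothetical stand-ins for the Cloud Storage read and BigQuery write stages a Dataflow job would perform.

```python
from collections import defaultdict

# A minimal read -> transform -> aggregate -> write batch pipeline,
# mirroring the stages a Dataflow/Beam job would run. All data is
# inline; in a real pipeline the source would be Cloud Storage and
# the sink a BigQuery table (both stand-ins here).

def read_source():
    # Stand-in for reading raw events from Cloud Storage.
    return [
        {"user": "a", "amount": 10},
        {"user": "b", "amount": 5},
        {"user": "a", "amount": 7},
    ]

def transform(records):
    # Per-record cleanup, analogous to a Beam ParDo step.
    return [{**r, "amount": int(r["amount"])} for r in records]

def aggregate(records):
    # Group-and-sum, analogous to GroupByKey + Combine.
    totals = defaultdict(int)
    for r in records:
        totals[r["user"]] += r["amount"]
    return dict(totals)

def write_sink(totals):
    # Stand-in for writing results to a BigQuery table.
    return sorted(totals.items())

if __name__ == "__main__":
    print(write_sink(aggregate(transform(read_source()))))
```

In an actual Dataflow job, each of these functions would become a Beam transform, and the runner, not your code, would decide how the work is parallelized.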
Preferred Education
Master's Degree
Required Technical And Professional Expertise
Expertise in the Google Cloud ecosystem is essential. Required experience includes:
- Designing, building, and maintaining data engineering solutions with Google Cloud services.
- Developing batch and real-time data pipelines using tools such as Apache Airflow, dbt, Spark/Python, or Spark/Scala.
- Proficiency with Cloud Storage, Bigtable, Dataproc, and Dataflow.
- Managing data platforms with scheduling tools such as Cloud Scheduler and Cloud Composer.
- Designing the data layer with Google Cloud services for effective data storage and retrieval.
Preferred Technical And Professional Experience
Additional valuable experience includes:
- Utilizing open-source technologies like Apache Airflow, dbt, Spark/Python, or Spark/Scala.
- Developing data migration solutions for seamless data transfer.
- Expertise with Cloud Composer for scheduling and managing data pipelines.
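Cloud Composer's core abstraction is the Airflow DAG: named tasks with explicit dependencies, executed in topological order. A dependency-free sketch of that idea using only the Python standard library (the task names are hypothetical examples, not part of this role):

```python
from graphlib import TopologicalSorter

# Airflow-style task graph: each task maps to the set of tasks it
# depends on. Task names here are hypothetical.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load_bigquery": {"transform"},
    "data_quality_check": {"load_bigquery"},
}

def run_order(graph):
    # Cloud Composer / Airflow resolves dependencies the same way:
    # a task runs only after everything it depends on has finished.
    return list(TopologicalSorter(graph).static_order())

if __name__ == "__main__":
    print(run_order(dag))
```

In a real Composer environment the same graph would be declared with Airflow operators and `>>` dependency syntax, and the scheduler would handle retries, backfills, and timing.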
Key Skills/Competencies
- Google Cloud
- Data Pipelines
- Data Engineering
- Batch Processing
- Real-time Processing
- Data Migration
- Cloud Composer
- BigQuery
- Dataflow
- Apache Beam
How to Get Hired at IBM
- Research IBM Consulting: Understand their consulting culture and recent projects.
- Tailor Your Resume: Highlight experience with Google Cloud and data engineering.
- Showcase Technical Skills: Emphasize proficiency in Google services and open-source tools.
- Prepare for Interviews: Practice problem-solving and data pipeline scenarios.