Databricks Expert - Data Engineer/Data Architect

NTT DATA Europe & Latam

On Site
Full Time
€75,000
Turin, Piedmont, Italy
Job Description

About the Databricks Expert - Data Engineer/Data Architect Role at NTT DATA Europe & Latam

The Databricks Expert at NTT DATA understands the challenges of a cloud-native environment and can think outside the box, finding innovative yet simple solutions to everyday issues.

We are looking for an engineer/architect who can serve as the bridge between Data Science and Data Engineering, with a clear understanding of both worlds.

If you are a Data Scientist seeking more experience in data preparation, or a Data Engineer with an interest in Data Science, and you embrace a KAIZEN approach of continuously improving your technical skills and knowledge, you can become a Databricks Expert in NTT DATA's Data & Intelligence Service Line, in an environment where Japanese and European cultural elements are intertwined.

Your Contribution In This Role

Your role as a Databricks Expert at NTT DATA will include supporting customers in Big Data projects, working in partnership with business analysts and solution architects to understand use cases, data needs, and outcome objectives. In this role, you will deliver data acquisition, transformation, cleansing, conversion, compression, and loading of data into data lakes so it can be consumed by analytics and machine learning models.

The Ideal Profile

  • Working experience in a cloud-native environment on at least one of the three major public clouds (GCP/AWS/Azure); 5 years of experience with Databricks is preferred
  • Knowledge of Big Data architectures, plus working experience with cloud and on-premises Azure infrastructure and networking
  • Experience in building and delivering proofs of concept to address specific business needs, using the most appropriate techniques, data sources, and technologies
  • Working experience migrating workloads from on-premises to cloud environments
  • Experience monitoring distributed infrastructure using Azure tools or open-source alternatives (e.g., Grafana, Prometheus)
  • Proven experience in: Java, Scala, Python, and shell scripting
  • SQL language knowledge
  • Experience leading medium-sized teams
  • Working experience with: Apache Spark, Databricks, Azure Data Factory, Azure Synapse, and other Azure related ETL/ELT tools
  • Microsoft Certification: at least one certification on Azure or Databricks (Azure Data Engineer Associate, Azure Solutions Architect Expert, Databricks Spark Developer, Databricks Data Engineer, Databricks Machine Learning)

Nice To Have

  • Working experience with Agile Methodology and Kanban
  • Other Big Data certifications (e.g., Cloudera) are welcome
  • Experience with non-Azure ETL/ELT solutions (Talend, Oracle Data Integrator, Informatica PowerCenter, etc.)
  • Experience with Data Visualization solutions (PowerBI, Qlik, Tableau, etc.)

Key Skills/Competencies

  • Databricks
  • Azure
  • Apache Spark
  • Data Engineering
  • Data Architecture
  • Cloud Native
  • Big Data
  • Python
  • Java/Scala
  • SQL

Tags:

Databricks Expert
Data Engineering
Data Architecture
Big Data
Cloud Native
ETL
ELT
Machine Learning
Data Science
Solutions Architecture
Team Leadership
Databricks
Azure
Apache Spark
Python
Java
Scala
SQL
Azure Data Factory
Azure Synapse
Grafana

How to Get Hired at NTT DATA Europe & Latam

  • Research NTT DATA Europe & Latam's culture: Study their mission, values, recent news, and employee testimonials on LinkedIn and Glassdoor.
  • Customize your resume for Data Engineering: Highlight Databricks, Azure, Spark, and Big Data architecture experience, aligning with the Databricks Expert role.
  • Showcase cloud-native expertise: Emphasize experience in GCP, AWS, or Azure, especially Azure tools like Data Factory and Synapse.
  • Prepare for technical and behavioral interviews: Focus on your experience with Java, Scala, Python, SQL, and team leadership, demonstrating a KAIZEN approach.
  • Demonstrate your problem-solving skills: Be ready to discuss how you've found innovative, yet simple, solutions to complex data challenges in a cloud environment.
