Azure Data Engineer

Procter & Gamble
Mid
Online interview
Employment contract
Prosta 20, Warsaw

Project description

The Azure Data Engineer will focus on Supply Chain systems and work in partnership with business analysts and application managers to understand use cases, data needs, and outcome objectives. The role will deliver data acquisition, transformation, cleansing, conversion, compression, and loading of data into Data and Analytics models residing in the Core Data Lake and the Product Supply (PS) Data Hub.

Responsibilities:

  • Collect data from Supply Chain systems, including SAP ERP, SAP BW, and other systems, and develop methods to automatically pull data of varying size at a pre-determined frequency into the Azure-based Core Data Lake (CDL), then load it from the CDL into the PS Data Hub.
  • Deliver data modeling and optimization of data and analytics solutions at scale.
  • Own the automation and scale-up of the solutions, assist operations in diagnosing ad hoc user issues quickly and efficiently, and ensure consistent data via synchronization between multiple environments.
  • Traverse multiple pipelines across Azure environments to identify the root causes of platform data issues and determine how to resolve them expediently.
  • Define, analyze, and improve Platform Service Operations metrics and ITIL Service Management processes executed by various teams. Collaborate with the Security Engineer to improve compliance metrics.
  • Develop automation scripts (e.g. PowerShell, Azure Automation) to optimize cost, capacity, and performance, drive compliance, and understand Platform health.
  • Work with P&G Corporate and Application teams, Cloud Service Partners, and other vendors to understand, verify, improve, and fix shared Platform capabilities as needed across Data & Analytics Organization (D&A) applications. This includes L3 support for critical incidents in major capabilities and "deep" Problem Management support.
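The incremental-pull pattern described above (automatically pulling data of varying size at a pre-determined frequency into the CDL) is commonly implemented with a high-water-mark filter. The sketch below illustrates the idea in plain Python; the function name, the `changed_at` field, and the sample rows are illustrative assumptions, not P&G's actual implementation.

```python
from datetime import datetime, timezone

def pull_incremental(records, watermark):
    """Return records changed since `watermark`, plus the new watermark.

    `records` is an iterable of dicts with a `changed_at` datetime field
    (an assumed, illustrative schema). Only rows newer than the watermark
    are pulled; the watermark advances to the newest row seen.
    """
    new_batch = [r for r in records if r["changed_at"] > watermark]
    new_watermark = max((r["changed_at"] for r in new_batch), default=watermark)
    return new_batch, new_watermark

# Hypothetical run: of two source rows, only the second is newer than
# the stored watermark, so only it is pulled.
wm = datetime(2024, 1, 1, tzinfo=timezone.utc)
rows = [
    {"id": 1, "changed_at": datetime(2023, 12, 31, tzinfo=timezone.utc)},
    {"id": 2, "changed_at": datetime(2024, 1, 2, tzinfo=timezone.utc)},
]
batch, wm = pull_incremental(rows, wm)
# batch now holds only id 2, and wm has advanced to 2024-01-02
```

In a real pipeline this filter would typically be pushed down to the source (e.g. as a delta extract from SAP BW or a parameterized Azure Data Factory copy activity) rather than applied in application code.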


What we offer:

  • Responsibilities from day 1. You will have project ownership and the autonomy to deliver change and results from the beginning.
  • A dynamic and encouraging work environment. At P&G, our employees are at the core: we value every individual and encourage initiative, promoting agility and work/life balance.
  • Continuous mentoring. You will work with hardworking people and receive ongoing coaching and mentoring from your line manager and other colleagues. Corporate and functional training will enable you to succeed and develop from day one.
  • Industry certifications (ITIL, DevOps, MS portfolio, etc.) and a full additional benefits program: private health care, P&G Dynamic Living programs (sport cards, in-office fitness center), P&G stock options, saving plans, a lunch subsidy, regular salary increases and possible promotions, flexible work arrangements, and mentoring programs and training.
  • A big-picture understanding of P&G's IT and Product Supply organization and its services, gained in global multi-functional teams spanning several locations across continents.

Who are we looking for?

To fit well into this role, you must have technical experience with Databricks and some business understanding of supply chain processes.

Below we present the profile of an ideal candidate:

  • Sophisticated programming skills on the Databricks technology stack (PySpark, SparkSQL, etc.) and the ability to construct detailed data models.
  • Hands-on skills with a suite of applications and ETL tools (Azure Data Factory, MS SQL Server).
  • Azure-related logging, monitoring, and alerting systems (cloud-native tools, Log Analytics and Kusto Query Language, Azure Diagnostics, metrics, event logs, application logs).
  • Understanding of infrastructure and Platform components (network, servers), hosting and related technologies (data center, cloud, computing, Windows, Linux, storage, backup, virtualization, etc.), and cloud provisioning (ARM, Terraform); a passion for these domains; and the ability to learn new technologies quickly.
  • DevOps tools such as Azure DevOps and GitHub for CI/CD. This includes Repos, Pipelines, Testing, Story Boards, Backlogs, etc.
  • English proficiency and at least a Bachelor's degree in Computer Science, Computer / Systems / Industrial Engineering, Business / Management Information Systems or Software Development.
Technologies: PySpark, SparkSQL, and Azure offerings including Databricks, SQL DB, Azure Data Factory, Key Vault, Virtual Machines, and ADLS Gen2 Storage.
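The cleansing and conversion work mentioned above typically includes standardizing source column names and collapsing duplicate business keys. The sketch below shows the idea in dependency-free Python; in the role itself this would run as PySpark DataFrame operations on Databricks, and the `Material ID` / `Plant` columns are invented for illustration.

```python
def cleanse(rows, key):
    """Normalize column names to lowercase snake_case and keep the last
    record seen for each business key.

    `rows` is a list of dicts; `key` is the normalized business-key column.
    Illustrative only -- a real pipeline would express this in PySpark.
    """
    seen = {}
    for row in rows:
        norm = {k.strip().lower().replace(" ", "_"): v for k, v in row.items()}
        seen[norm[key]] = norm  # later records overwrite earlier duplicates
    return list(seen.values())

# Hypothetical extract with one duplicate business key.
raw = [
    {"Material ID": "M-1", "Plant": "WAW"},
    {"Material ID": "M-1", "Plant": "WRO"},  # duplicate key: last wins
    {"Material ID": "M-2", "Plant": "WAW"},
]
clean = cleanse(raw, "material_id")
# two rows remain: M-1 (Plant WRO) and M-2 (Plant WAW)
```

The "last record wins" choice mirrors a common change-data-capture convention where the most recent version of a key supersedes earlier ones.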


Additionally, we look for:

  • Stewardship experience, ensuring services follow P&G policies (security, governance, etc.).
  • A problem-solving attitude: proactive, taking initiative, and not afraid to challenge the status quo.
  • Being an extraordinary teammate. Big projects aren't developed by individuals; we work well with others!

Skills
ETL
Azure
