Big Data Developer

Online interview
B2B Employment contract

Your tasks

  • Work as a Big Data developer in a self-organizing Scrum team
  • Implement data sourcing and transformation code, perform semantic data modelling, and build APIs
  • Manage the deployment, maintenance, and L3 user support of the reporting / data access tool(s)
  • Create technical documentation of all delivered artifacts
  • Perform other duties as assigned
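
The data sourcing and transformation tasks above largely mean SQL-style work on the Hadoop stack. As a flavour of that kind of transformation, here is a minimal sketch using in-memory SQLite in place of Hive/Impala; the table and column names (hr_events, employee_id, dept, hours) are hypothetical, not taken from the project.

```python
import sqlite3

# Illustrative only: a tiny SQL aggregation of the kind the role involves,
# run against in-memory SQLite rather than Hive/Impala.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE hr_events (employee_id INTEGER, dept TEXT, hours REAL);
INSERT INTO hr_events VALUES
  (1, 'IT', 8.0), (2, 'IT', 7.5), (3, 'HR', 6.0);
""")

# Aggregate hours and headcount per department.
rows = conn.execute("""
    SELECT dept, COUNT(*) AS headcount, SUM(hours) AS total_hours
    FROM hr_events
    GROUP BY dept
    ORDER BY dept
""").fetchall()

for dept, headcount, total_hours in rows:
    print(dept, headcount, total_hours)
```

In the actual role the same query shape would be expressed in HiveQL or Impala SQL over HDFS-backed tables.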

Project description

HRDS (Human Resources Data Solutions) is a bank IT project to build a strategic data-sourcing and analytics hub. The hub will provide data for the bulk of current and future HR integrations and data analytics needs, supplying additional data for cross-domain HR analytics and for the client's wider enterprise. The work includes building semantic (web-ontology) domain models. The core project team is based in Poland and delivers new functionality using the Agile (Scrum) methodology.

The successful candidate will join a Scrum team of up to 15 people focused on developing a particular product (HRDS). The team includes several developers with DWH and BI expertise, some with Cloudera/Hortonworks Big Data platform experience, a dedicated tester at 0.5 FTE (with an open position for 1 FTE), and an experienced Scrum Master (0.5 FTE).

DM & Reporting consists of several teams, 30+ people in total, developing different products.

Who we're looking for

  1. Project experience with at least one of the following Big Data platforms is a must: Cloudera, Hortonworks (min. 3 years)
  2. SQL data engineering skills
  3. Knowledge of Hadoop ETL tools (Sqoop, Impala, Hive, Oozie)
  4. Bash scripting experience (min. 2 years)
  5. Working knowledge of at least one of the programming languages: Python, PySpark, R, Scala, Java (min. 2 years)
  6. Self-motivated and a team player with good problem solving skills
  7. Ability to meet tight deadlines and work under pressure
  8. Experience with continuous integration
Nice to have

  1. Pentaho skills are a plus
  2. Experience with Tableau reporting is a plus
  3. Experience in semantic data modelling is a big plus
  4. Knowledge of Anzo mapping tool is a big plus
  5. Knowledge about reporting solutions modeling is a plus
  6. Familiarity with scrum methodology is appreciated
  7. Experience with containers for Cloudera is a plus
  8. Experience with REST APIs, ESB and/or Apigee is a plus
  9. Experience with cloud-based data solutions (Azure, AWS) is a plus
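
Several of the must-have skills (Bash scripting, Hadoop ETL tooling) typically come together in ingestion scripts. A minimal sketch of one common Bash pattern, deriving daily partition paths for a landing zone; the HDFS path is hypothetical and GNU date is assumed:

```shell
#!/usr/bin/env bash
# Illustrative only: compute daily partition paths of the kind a
# Sqoop/Oozie ingestion job would consume. The base path is hypothetical.
set -euo pipefail

BASE="/data/raw/hr_events"   # hypothetical HDFS landing-zone path
for offset in 0 1 2; do
  day=$(date -d "2024-01-10 -${offset} days" +%Y-%m-%d)
  echo "${BASE}/dt=${day}"
done
```

In practice the anchor date would come from the scheduler (e.g. an Oozie coordinator variable) rather than being hard-coded.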

Our company


Gdańsk, Wrocław, Warsaw, Krakow, Zug
Tech skills
  • Java
  • JavaScript
  • .Net
