DevOps Engineer - Hadoop
- HSBC Service Delivery (Polska)
- Employment contract
- General HDP knowledge (HDFS, YARN, Ranger).
- General HDF knowledge (NiFi, Kafka, Storm).
- Experience with secured HDP (Kerberos/LDAP integration).
- Real-world management of the HDP/HDF platform.
- Dashboarding metrics using Ambari/Grafana.
- DevOps Tools: Terraform, Ansible, Jenkins, Git
Project you can join
The Data and Analytics IT team is part of the overall Commercial Banking (CMB) IT team under the remit of the CMB CIO.
The team partners with stakeholders across CMB Business Transformation and other lines of business and functions to deliver robust, secure, leading-edge data analytics solutions that scale and give the business a competitive advantage in a changing regulatory and competitive landscape across the world.
- Troubleshooting issues on the cluster: Spark, HDFS, YARN, Hive.
- Supporting our users: guiding them so they get the most out of the platform.
- Working on automation (DevOps), monitoring, and reporting around the platform.
- Upgrades in Hortonworks Data Platform (HDP).
- Upgrades in Hortonworks Data Flow (HDF).
- Migrating on-prem Hadoop to GCP.
How do we manage our projects?
- Methodology: Scrum
- Who makes architectural decisions? Development Team
- Additional monitor
- Personal container
- Tech supervisor
- Open space
- Flexible working hours
- Remote possible: 30%
- Paid vacation: 26 days
- Healthcare package
- Healthcare package for families
- Financial bonus
- Hot beverages
- Language courses
- Visa Services
- Car parking
- Bicycle parking
- Chill room
- Integration events