You will be responsible for developing new products and further developing the cluster's existing ones. You'll use the latest Big Data and Cloud technologies to process large volumes of data in our Data Lake and deliver services on-premises or in the cloud.
- Take over responsibility for existing applications.
- Design and implement new features to enable new business cases.
- Together with operations, ensure stable operation and maintain the applications.
- Prototype and pitch new ideas.
Who we're looking for:
- Python (strong working knowledge)
- UNIX / LINUX
- SOA, Web services, REST
- Data warehouse, relational, and non-relational database solutions
- Strong structured, analytical, and conceptual skills
- Customer-oriented, innovative team player with strong solution/target orientation
- Experience in working in an agile environment
- English at B2 level (mandatory)
Nice to have:
- Hadoop (Spark/PySpark)
- Experience with GraphQL
- Experience with Cloud technologies (Kubernetes)
- Data Science experience (Python: pandas) or equivalent
- Experience with container technology
- German optional
What we offer:
- Healthcare package
- Healthcare package for families
- Leisure package
- Leisure package for families
- Financial bonus
- Cold beverages
- Hot beverages
- Car parking
- Bicycle parking