It is a great time to join the GSK Tech Hub in Poznan. We value courage, accountability, development, and teamwork. You will be encouraged to experiment and collaborate across teams to bring innovation to our everyday work. We are open to candidates with various levels of experience, whether you are a seasoned specialist with deep expertise or a graduate looking to kick-start your career. Help us lead in technology to improve the lives of patients and consumers around the world.
Are you excited at the prospect of working on ground-breaking projects that leverage and develop novel, cutting-edge Big Data and AI/ML technologies with a huge impact on health, wellness, and patient outcomes around the globe? If so, read on to learn more about an exciting opportunity for you in GSK’s R&D Innovation Team!
As a Big Data AI Engineer in R&D Innovation, you will work on the design, development, and implementation of the Enterprise Big Data Platform for GSK using container technologies (e.g., Docker and Kubernetes) and cloud technologies (e.g., Azure). You will also apply Site Reliability Engineering and infrastructure automation techniques (e.g., Terraform) to create, automate, and use AI/ML ecosystems.
You will be part of a multi-disciplinary team within GSK’s R&D Technology vertical with experience in entrepreneurship, data science, and high-performance computing. Our members are passionate about technology innovation. Some hold multiple PhDs and have worked in Silicon Valley (California) and at the IBM T.J. Watson Research Center (New York) for the US government. We have consistently delivered excellent results on mission-critical problems using scalable expertise and bleeding-edge technologies and methods. GSK leadership is leveraging us to ideate, create, and productionize novel, ground-breaking, high-value solutions for increasingly complex and critical challenges.
This role will provide YOU the opportunity to lead key activities to progress YOUR career.
Considering your skillset and abilities, your growth opportunities will include some or all of the following:
- Designing automated infrastructure with new auto-healing capabilities
- Creation and integration of storage technologies and distributed file system (DFS) independence into the solution landscape
- Data Pipeline development leveraging DevOps standards
- Use of Continuous Integration (CI) and Continuous Deployment (CD) to build Data Engines
- Creation of secure, privacy-preserving data anonymization systems using declarative programming languages that interface between Data Silos, Data Engines, and Graph Databases. These systems are fundamental for executing AI/ML workflows that accelerate drug discovery and optimize manufacturing processes
- Creation of holistic (i.e., integrated) data views through the ingestion, cleaning, linking, harmonization, and contextualization of data from multiple systems. These views will enable our AI/ML work on complex, high-value, multi-root-cause problems
- Active involvement in all stages of the project lifecycle, from ideation to industrialization, in an Agile development environment. You will discover and develop promising new technologies in a collaborative way, creating Proofs-of-Concept (POCs), Proofs-of-Value (POVs), and Minimum Viable Products (MVPs). Whatever we design and prototype, we make it scalable, flexible, and robust. Our projects do not sit on the shelf! They are industrialized by us and later handed over to the R&D Support teams, who drive their further adoption across GSK
Who are we looking for?
We are looking for professionals with these required skills to achieve our goals:
- Bachelor’s degree in Engineering, Mathematics, Statistics, or Computer Science
- Minimum 5 years as a full-time software engineer
- Expertise with Data Engineering or Site Reliability Engineering
- Expertise with non-imperative paradigms: Scala, Haskell, F#, TypeScript or OPA Rego
- Minimum 2 years working on Big Data platforms, preferably Spark
- Minimum 3 years deploying solutions on Cloud Platforms, preferably Azure or GCP
- Infrastructure-as-Code experience: Terraform, Ansible or Cloud templates (Azure, GCP)
- Expertise with container technologies: Kubernetes, Helm or Docker
- Professional DevOps experience: Jenkins, Azure DevOps, CI/CD or JUnit
- Ability to design and implement logging, tracing, and application monitoring systems
- Experience building and maintaining APIs
If you have the following characteristics, it would be a plus:
- Streaming-data experience with technologies such as Apache Kafka
- Cryptography / Cyber Security experience
- Experience operating in a highly regulated and secure environment
What we offer:
- Career at one of the leading global healthcare companies
- Contract of employment
- Attractive reward package (annual bonus & awards for outstanding performance, recognition awards for additional achievements and engagement, holiday benefit)
- Life insurance and pension plan
- Private medical package with additional preventive healthcare services for employees and their eligible dependents
- Sports cards (Multisport)
- Possibilities of development within the role and company’s structure
- Personalized learning approach (mentoring, online training platforms: Pluralsight, Business Skills, Harvard ManageMentor, Skillsoft, and external training)
- Extensive support for work-life balance (flexible working solutions, short Fridays option, health & wellbeing activities)
- Supportive community and integration events
- Modern office with creative rooms and fresh fruit every day
- Free car and bike parking, locker rooms and showers
- Healthcare package
- Healthcare package for families
- Leisure package
- Hot beverages