Data Scientist

Online interview
Remote possible

Project description

About the project

For our client, an international pharmaceutical company, we are looking for a talented Data Scientist. The successful candidate will establish processes and implementations to enable and facilitate the mining of digital biomarkers from sensor data collected in clinical studies. In addition, the candidate will help staff set up and maintain new digital biomarker studies. The position is fully remote.

Tasks and Responsibilities

  • Develop the computational back-end of applications used to discover Digital Biomarkers
  • Help automate the creation of reports and dashboards supporting analytics-driven decisions based on the collected sensor data
  • Participate in technical decisions regarding the implementation of new algorithms to mine sensor data
  • Enhance the existing infrastructure that handles the data collected from sensors used in clinical trials
  • Proactively collaborate with a team comprising data scientists, software engineers and life science experts

Our offer

  • Transparent wage model with a disclosed 7N margin; the quoted wages are the target wages paid to the consultant for the subcontracted work
  • Stable and long-term collaboration with various client projects
  • Professional freedom; we are one of the few IT companies that do not use non-compete clauses or retention agreements
  • Career development support, training and technical certification subsidies, conference participation, etc.
  • Collaboration with experts
  • Large client and project portfolio of over 40 companies, prioritizing project continuity and ongoing personal agent support
  • Full integration into the client company structure (e.g. participation in all company events, such as 7N Kick Off 2018)

Who we're looking for

Required skills

  • Bachelor's degree with an emphasis on quantitative coursework (e.g. Computer Science, Engineering, Mathematics, Data Science)
  • 3 years of software development experience
  • Experience in, or eagerness to learn, writing and maintaining ETL (extract, transform, load) pipelines that operate on a variety of structured and unstructured sources
  • Experience with SQL and NoSQL data stores
  • Deep knowledge of Python and the frameworks around it
  • Working experience with the common Python data analysis libraries (e.g. NumPy/SciPy, Pandas, Scikit-learn, SQLAlchemy) and, ideally, data pipelining libraries (e.g. Luigi)
  • Knowledge of UNIX internals and workload management systems (SLURM, SGE/UGE)
  • Ability to write standards-compliant database-related code for MongoDB and MySQL
  • Strong working knowledge of best coding practices (versioning, TDD, debugging)
  • Strong analytical skills combined with conceptual thinking and structured working style; ability to work in a multicultural team
  • Fluent in English

Benefits
  • Healthcare package
  • Healthcare package for families
  • Leisure package
  • Trainings

Our company

Warsaw, Gdańsk · 800+

Tech skills
  • C#
  • Java
  • JavaScript