7N is an agent for high-end IT professionals. Over 30 years of operation have proven that a clear and transparent financial model, collaboration exclusively with experts in their respective fields, and taking good care of them make up the best possible IT consulting model. We act as an individual agent for our consultants, promoting their competences to our clients and offering them a wide range of projects to participate in. We add wage transparency, career development support and professional stability. Our main goal is long-term collaboration; that is why the majority of our staff have been with us for many years.
We're looking for an ambitious Data Engineer who wants to join a project in the pharmaceutical industry with a multi-technology application stack. This person will take a leading role in implementing new tools and analyses in the project, using the Python programming language.
Fully remote work
- Transparent wage model with a disclosed 7N margin; the quoted wages are the target wages paid to the consultant for the subcontracted work
- Stable and long-term collaboration with various client projects
- Professional freedom; we are one of the few IT companies that do not use non-compete clauses or retention agreements
- Career development support, training and technical certification subsidies, conference participation, etc.
- Private healthcare and the Benefit Multisport card
- Collaboration with experts
- A large client and project portfolio of over 40 companies, prioritizing project continuity and ongoing personal agent support
- Full integration into the client company structure (e.g. participation in all company events, 7N Kick Off 2019: https://www.youtube.com/watch?v=i5KjJpFBNpI)
Who are we looking for?
- 3+ years of experience working with SQL
- 3+ years of experience working with Python or R
- 3+ years of experience working on GCP, AWS or another cloud platform
- 1+ year of experience with different types of storage (filesystem, relational, MPP, NoSQL) and with various kinds of data (structured, unstructured, metrics, logs, etc.)
- 2+ years of experience with data architecture concepts (in any of the following areas: data modeling, metadata management, workflow management, ETL/ELT, real-time streaming, data quality, distributed systems)
- Exposure to open-source and proprietary cloud data pipeline tools such as Airflow, Glue and Dataflow
- Very good knowledge of relational databases
- Very good knowledge of code management tools (e.g. Git, SVN) and DevOps tools (e.g. Docker, Bamboo, Jenkins)