Appodeal
Appodeal is a dynamic US-based product company with a truly global presence.
We have offices in Warsaw, Barcelona and Parkland (FL), along with remote team members located around the world.
Our company thrives on diversity, collaboration, and innovation, making us a leader in the mobile app monetization space.
At Appodeal, we’re more than just a company—we’re a team united by a common mission: Help people discover and grow their talents through products that enable successful mobile app businesses!
We take pride in our cutting-edge product and our internationally dispersed team of talented professionals.
Here’s what we value, and what we hope you do too:
- Continuous Learning and Growth: We are passionate about learning, growing personally, and building rewarding careers.
- Making an Impact: We are committed to building a history-defining company that leaves a lasting impact on the mobile app industry.
- Solving Exciting Challenges: We tackle complex problems every day, supported by a team of world-class professionals and mentors.
- Enjoying the Journey: We believe in having fun while working toward our goals.
We are seeking a skilled Data Engineer to join our Mobile Growth Platform UA team in Barcelona. The ideal candidate will be responsible for building, maintaining, and optimizing our data infrastructure, ensuring seamless data flow and accessibility for machine learning, analytics, and business needs.
Key Responsibilities:
- Design and build scalable data pipelines: Architect, construct, and maintain robust ETL/ELT pipelines using the Databricks platform (Apache Spark, Delta Lake, Delta Live Tables) for batch and streaming data ingestion from various sources.
- Manage data architecture: Define and maintain a scalable and secure data platform architecture that integrates Databricks (for data lakehouse management and complex transformations) with specialized OLAP engines (ClickHouse/Druid for fast querying).
- Design, develop, test, and orchestrate data workflows to streamline pipelines.
- Optimize performance and reliability: Monitor, troubleshoot, and fine-tune data workflows and database performance (e.g., Spark job optimization, ClickHouse schema tuning, Druid data partitioning) to ensure maximum efficiency and reliability.
- Contribute to the development of internal tools for automating data processes and business workflows, enhancing efficiency and scalability.
- Collaborate closely with product, MLOps, and data science teams to ensure an optimal data experience for all ML practitioners and data consumers.