Location: Remote - Spain
Departments: Professional Services Operations
As a Big Data Solutions Architect (Resident Solutions Architect) in our Professional Services team, you will work with clients on short- to medium-term customer engagements, addressing their big data challenges using the Databricks platform. You will deliver data engineering, data science, and cloud technology projects, integrating with client systems, providing training, and performing other technical tasks to help customers derive maximum value from their data. RSAs are billable and excel at completing projects according to specifications with excellent customer service. You will report to the regional Manager/Lead.
The impact you will have:
- Work on a variety of impactful customer technical projects, including designing and building reference architectures, creating how-to guides, and productionizing customer use cases.
- Collaborate with engagement managers to scope professional services work, incorporating customer input.
- Guide strategic customers in implementing transformational big data projects and third-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications.
- Consult on architecture and design; bootstrap or implement customer projects to ensure successful understanding, evaluation, and adoption of Databricks.
- Provide an escalated level of support for customer operational issues.
- Work with the Databricks technical team, Project Manager, Architect, and Customer team to ensure the technical components of the engagement meet customer needs.
- Collaborate with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement-specific product and support issues.
What we look for:
- Proficiency in data engineering, data platforms, and analytics, with a strong track record of successful projects and in-depth knowledge of industry best practices.
- Comfortable writing code in either Python or Scala.
- Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP), with expertise in at least one.
- Deep experience with distributed computing using Apache Spark™ and knowledge of Spark runtime internals.
- Familiarity with CI/CD for production deployments.
- Working knowledge of MLOps.
- Experience in the design and deployment of performant end-to-end data architectures.
- Experience with technical project delivery, including managing scope and timelines.
- Strong documentation and white-boarding skills.
- Experience working with clients and managing conflicts.
- Ability to build skills in technical areas that support the deployment and integration of Databricks-based solutions to complete customer projects.
- Ability to travel up to 10% of the time, potentially more during peak periods.
- Databricks Certification is a plus.
About Databricks
Databricks is the data and AI company. More than 10,000 organizations worldwide — including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500 — rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics, and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of Lakehouse, Apache Spark™, Delta Lake, and MLflow. To learn more, follow Databricks on Twitter, LinkedIn, and Facebook.
Benefits
At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all our employees. For specific details on the benefits offered in your region, please visit https://www.mybenefitsnow.com/databricks.