Are you the expert who transforms data into strategic insights to drive our success?
As a Senior Data Developer, you will be at the heart of developing our data platform. Your main role? Designing foundational data models, building data warehouses that support business analysis, and improving our data monitoring and observability solutions. Using modern data technologies, you will build efficient, scalable data pipelines.
What’s exciting about this role?
Your efforts will have a direct impact on uncovering essential product insights and supporting the development of data-driven product features. Through your expertise, you will empower our teams to fully leverage data for strategic decision-making and optimized solutions.
Here is a glimpse of your responsibilities:
- Collaborate with cross-functional teams to define data requirements and implement efficient data pipelines for comprehensive analysis of user interactions within applications.
- Build scalable data pipelines that can handle hundreds of millions of events daily.
- Optimize data processing by maintaining pre-computed aggregates in our data storage solutions to improve the performance of reporting dashboards.
- Continuously improve system performance and ensure the analytics tools are current with industry standards.
- Provide support to team members by assisting with data extraction and query analysis, fostering a collaborative environment.
- Coordinate with teams responsible for product integration to ensure adherence to data standards and protocols across applications.
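To give a flavor of the pre-aggregation work mentioned above, here is a minimal sketch of rolling raw events up into a daily summary table so dashboards query the small rollup instead of scanning the raw event log. All table and column names here are invented for illustration; in practice this would be expressed in SQL against a warehouse such as Snowflake or as a dbt model.

```python
import sqlite3

# Hypothetical example: an in-memory SQLite database stands in for the warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, event_date TEXT, action TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [
        ("u1", "2024-01-01", "click"),
        ("u2", "2024-01-01", "click"),
        ("u1", "2024-01-02", "view"),
    ],
)

# Pre-aggregate once, instead of scanning raw events on every dashboard load.
conn.execute(
    """
    CREATE TABLE daily_event_counts AS
    SELECT event_date, action, COUNT(*) AS n_events
    FROM events
    GROUP BY event_date, action
    """
)

rows = conn.execute(
    "SELECT event_date, action, n_events FROM daily_event_counts ORDER BY event_date"
).fetchall()
print(rows)  # [('2024-01-01', 'click', 2), ('2024-01-02', 'view', 1)]
```

At scale, the same idea applies with incremental materializations rather than full rebuilds, so each run only processes new events.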
Here is what will qualify you for the role:
- You have 5 years of experience in a similar role, with expertise in building data pipelines, data models, and analytics for large-scale distributed systems.
- You possess a strong engineering background, with hands-on experience in Python, dbt, Snowflake, and/or Databricks.
- You have in-depth knowledge of SQL, ETL processes, data warehousing, and Data Lake architecture.
- You have experience with automated testing frameworks and a commitment to code quality, including unit testing, integration testing, and code coverage metrics.
- You excel at working collaboratively with cross-functional teams and possess exceptional written and verbal communication skills on both technical and non-technical subjects.
- You are proficient with at least one major cloud provider: AWS, Google Cloud Platform, or Azure.
What would make you stand out:
- Experience with PySpark and structured streaming.
- Experience orchestrating complex workflows using tools such as Airflow, Dagster, or Prefect.
- Familiarity with infrastructure as code and with CI/CD tools such as Jenkins or GitHub Actions to automate testing, deployment, and integration processes.
- Hands-on experience with containerizing applications and managing them at scale using orchestration platforms.
Do you think you can bring this role to life? Or add your own color? You don’t need to check every single box; passion goes a long way and we appreciate that skillsets are transferable.
Send us your application; we want to hear from you!
Join the Coveolife!
We encourage all qualified candidates to apply regardless of, for example, age, gender, disability, gaps in CV, national or ethnic background.
#li-hybrid