As a Senior Data Engineer, you will architect intelligent data ecosystems, build ETL pipelines, mentor team members, and optimize data processes using AI technologies.
We build the tech that moves industries forward. We have our eyes set on AI, energy, logistics, sports, and other complex, exciting segments.
We believe in an innovative approach to solving deep issues and encourage our people to find their own solutions. We are constantly rethinking processes, business models, architecture, and tech stacks.
We foster a sense of curiosity, experimentation, and passion beyond code. With us, you can easily deepen your knowledge in any field you’re curious about. And because we work across many industries, you’ll be gaining the experience others can only dream of.
At the forefront of reimagining how industries operate, we are a team of builders and thinkers reshaping e-commerce, ticketing, and logistics from first principles. Our work is grounded in curiosity, experimentation, and a drive for real business impact. We eliminate inefficiencies—not just in code, but in legacy models and outdated assumptions. For those who seek to solve complex problems and mentor others in the process, this is a place to thrive.
We are looking for a Senior Data Engineer who combines deep technical expertise with a strategic mindset. In this role, you’ll architect and build intelligent data ecosystems that power autonomous workflows—integrating Generative and Agentic AI to help businesses move faster, think smarter, and operate more efficiently. Equal parts architect and builder, you’ll be instrumental in delivering high-impact, AI-powered solutions across diverse industries.
In this role, you will
- Analyze and optimize business processes by collaborating with stakeholders to uncover inefficiencies and define data requirements for automation
- Design scalable, modular data architectures that integrate with Generative AI and Agentic AI systems to support real-time decision-making
- Engineer robust ETL/ELT pipelines using Python, cloud-native services, and orchestration tools, supporting both batch and streaming data needs
- Architect RAG and vector database solutions using semantic search to enable LLMs to retrieve curated, context-rich business data
- Build intelligent data products, from predictive models and decision engines to AI-driven insights platforms
- Implement data quality, validation, and governance frameworks to ensure data integrity, lineage, and compliance across systems
- Lead technical discovery sessions with clients to transform complex business challenges into AI and data-driven opportunities
- Mentor team members on best practices in data engineering, AI integration, and modern cloud architectures
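The RAG and vector-search responsibilities above boil down to ranking stored document embeddings against a query embedding. A minimal, dependency-free sketch of that retrieval step follows; the toy documents and their three-dimensional embeddings are hypothetical stand-ins for what an embedding model and a vector database (Pinecone, Weaviate, Chroma, Milvus) would provide in production.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Toy "vector store": in practice, embeddings would come from an embedding
# model and live in a managed vector database rather than a dict.
DOCUMENTS = {
    "refund policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.8, 0.2],
    "api rate limits": [0.0, 0.2, 0.9],
}

def retrieve(query_embedding, top_k=1):
    # Rank stored documents by similarity to the query embedding and return
    # the best matches -- the retrieval half of a RAG pipeline. The retrieved
    # text would then be injected into the LLM prompt as context.
    ranked = sorted(
        DOCUMENTS.items(),
        key=lambda item: cosine_similarity(query_embedding, item[1]),
        reverse=True,
    )
    return [name for name, _ in ranked[:top_k]]
```

A query embedding close to the "refund policy" vector, e.g. `retrieve([0.85, 0.15, 0.05])`, returns that document first; swapping in a real embedding model and vector index changes the storage and scale, not the shape of the logic.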
What you will bring
- Expert-level Python proficiency for data engineering, including API integrations, data transformations (Pandas, PySpark), and automation
- Proven experience designing and deploying large-scale data platforms on AWS, GCP, or Azure
- Strong foundation in building production-grade ETL/ELT pipelines using Apache Airflow, Kafka, Spark, or cloud-native tools
- Hands-on experience with vector databases (e.g., Pinecone, Weaviate, Chroma, Milvus) and implementing semantic search
- Demonstrated knowledge of Generative AI and LLMs, with practical experience in RAG architectures and prompt engineering
- Deep understanding of data governance, quality, and documentation, with a focus on lineage, metadata, and compliance
- Familiarity with cloud services including serverless computing, managed databases, and data warehouses such as BigQuery, Redshift, or Snowflake
- Experience working with complex real-world data environments, including legacy systems, SaaS integrations, APIs, and databases
- Fluency in English, both written and spoken
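The ETL/ELT and data-quality expectations above can be sketched as an extract-transform-load flow with validation baked into the transform step. This is a minimal illustration, not any particular pipeline from the role: the sample order feed, field names, and helper functions are all invented for the example, and a production version would run under an orchestrator such as Apache Airflow and load into a warehouse like BigQuery, Redshift, or Snowflake.

```python
import json
from io import StringIO

# Hypothetical raw feed: a real pipeline would pull this from an API,
# message queue, or legacy database rather than an in-memory string.
RAW_ORDERS = StringIO(json.dumps([
    {"order_id": "A-1", "amount": "19.99", "currency": "pln"},
    {"order_id": "A-2", "amount": "bad-value", "currency": "PLN"},
    {"order_id": "A-3", "amount": "5.00", "currency": "pln"},
]))

def extract(source):
    # Read raw records from the source system.
    return json.load(source)

def transform(records):
    # Normalize types and drop rows that fail validation -- a stand-in for
    # the data-quality and governance checks a production pipeline enforces.
    clean = []
    for row in records:
        try:
            clean.append({
                "order_id": row["order_id"],
                "amount": float(row["amount"]),
                "currency": row["currency"].upper(),
            })
        except (KeyError, ValueError):
            continue  # a real pipeline would quarantine and log bad rows
    return clean

def load(records):
    # Here loading just returns the rows; a real pipeline would write them
    # to a data warehouse or lake table.
    return records

orders = load(transform(extract(RAW_ORDERS)))
```

Running the sketch yields two clean rows (the malformed `amount` is rejected), showing how validation sits between extraction and loading rather than being bolted on afterwards.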
What we offer
- A working culture that is high-performing, ambitious, collaborative, and fun
- Private medical care and life insurance
- Yearly training budget (local and international conferences, language courses) and employee-led workshops
- Medicover sport package
- Flexible working hours
- Unlimited WFH (work from home) policy
- Bonus for referrals
- For those who dream of traveling: WFA (work from anywhere) possibilities in NFQ-approved countries
- B2B contracts include paid annual service break and paid public holidays in Poland
- Office perks and team activities
Salary range:
103 - 164 PLN/h + VAT (B2B)
14 270 - 22 200 PLN gross (Permanent)
If you have any questions, please contact me at [email protected] or via LinkedIn.
Check all our career opportunities here.
Top Skills: Apache Airflow, AWS, Azure, BigQuery, Chroma, GCP, Kafka, Milvus, Pinecone, Python, Redshift, Snowflake, Spark, Weaviate