
LotLinx, Inc.

Analytics Engineer

In-Office
3 Locations
Senior level

Since our founding in 2012, Lotlinx has consistently pioneered advancements in the automotive landscape. We specialize in empowering automobile dealers and manufacturers with cutting-edge data and technology, delivering a distinct market advantage for every vehicle transaction. Today, we stand as the foremost automotive AI- and machine-learning-powered technology company, excelling in digital marketing, risk management, and strategic inventory management.

Lotlinx provides employees with a dynamic work environment that is challenging, team-oriented, and full of passionate people. We offer great incentives to our employees, such as competitive compensation and benefits, flex time off, and career development opportunities.

Job Summary

We are seeking an experienced Analytics Engineer to join our growing Data team. You will play a pivotal role in architecting, building, and optimizing the data foundations that directly power our customer-facing products. Unlike traditional internal-only BI roles, your work here is the product. You will be responsible for pre-processing, integrating, and rigorously modeling complex data from a multitude of external and internal sources, transforming it into the highly reliable, performant datasets our customers use to drive their business decisions.

Reporting to the Director of Data Analytics, you will collaborate closely with Product Managers, Data Engineers, and Software Engineers to seamlessly integrate these data models into our product suite. This is a key position where you'll have significant ownership over pipelines that directly impact our customers' success and our bottom line.

Why Join Us?

Direct Impact: You won't be stuck maintaining internal dashboards. You will own the data models that power our customer-facing products, and your architectural decisions will have a highly visible, direct impact on our bottom line.

Autonomy & Fast Shipping: We dislike red tape just as much as you do. We trust our engineers. You will have the autonomy to make high-level technical decisions and ship code quickly without waiting on month-long committee approvals.

Outcome-Driven Culture: We measure success by the impact of your code, not the hours you sit at your desk. We value an async-friendly approach that treats you like an adult.

The Best of Both Worlds (Hybrid Flexibility): Enjoy the perk of working from home two days a week, while joining your team for three days of in-person collaboration at our Winnipeg, Hamilton, or Vancouver offices.

Top-Tier Developer Experience: To ensure you have the best tools for the job, we provide top-of-the-line laptops and let you choose your preferred hardware environment (Mac, Windows, or Linux). Furthermore, we actively encourage leveraging cutting-edge technologies, including LLMs and AI coding assistants to supercharge your daily workflows.

Key Responsibilities
  • Architect Product-Facing Data Models: Design, develop, and maintain scalable data models in our data warehouse (Google BigQuery) using modern transformation frameworks to serve as the backend engine for our customer-facing applications.
  • Pre-process & Integrate Disparate Sources: Build robust logic to ingest, clean, and unify messy data from a wide variety of third-party APIs, external systems, and internal databases.
  • Data Validation & QA: Conduct rigorous data validation and testing across massive datasets (billions of rows, terabytes) to ensure pipeline integrity. Because this data is customer-facing, zero downtime and high data fidelity are critical.
  • Develop Large Data Pipelines: Develop, monitor, and troubleshoot ELT/ETL pipelines processing high-volume data streams, ensuring reliability and SLA adherence at the terabyte scale.
  • Optimize Pipeline Performance: Write, tune, and debug complex SQL queries. Analyze execution plans and usage patterns to optimize performance and cost-efficiency across multi-terabyte datasets in BigQuery and Apache Pinot, ensuring low-latency product experiences.
  • Champion Data Governance & CI/CD: Implement strict data quality checks, automated testing frameworks, and maintain CI/CD best practices to ensure the trustworthiness of our production data assets.
  • Collaborate & Mentor: Work effectively within a collaborative, cross-functional product environment, mentor junior team members, and advocate for the adoption of new analytics engineering best practices.
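To give a flavor of the validation work described above, here is a minimal, illustrative sketch in Python. The column names and thresholds are hypothetical, and at warehouse scale checks like these would typically run as SQL assertions or inside a testing framework rather than in-process:

```python
# Illustrative data-quality checks of the kind described above.
# Column names ("vin", "price") and thresholds are hypothetical examples.

def validate_rows(rows, key="vin", required=("vin", "price"), max_null_rate=0.01):
    """Return a list of human-readable data-quality failures (empty if clean)."""
    failures = []
    if not rows:
        return ["dataset is empty"]

    # Null-rate check: each required column must be almost fully populated.
    for col in required:
        null_rate = sum(1 for r in rows if r.get(col) is None) / len(rows)
        if null_rate > max_null_rate:
            failures.append(
                f"column {col!r} null rate {null_rate:.1%} exceeds {max_null_rate:.1%}"
            )

    # Uniqueness check: the business key must not repeat.
    keys = [r.get(key) for r in rows if r.get(key) is not None]
    if len(keys) != len(set(keys)):
        failures.append(f"duplicate values found in key column {key!r}")

    return failures
```

In a pipeline, a step like this would fail fast and alert when the returned list is non-empty, rather than publishing suspect data to customer-facing tables.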
Qualifications
  • Experience: 3+ years of professional experience in Analytics Engineering, Data Engineering, or a highly related role, with a proven track record of managing complex data systems—preferably powering user-facing applications.
  • Advanced SQL & Optimization: Deep expertise in writing, tuning, and debugging complex SQL in multi-terabyte cloud environments (BigQuery, Apache Pinot). Strong understanding of query execution plans, partitioning, and clustering.
  • Data Modeling Mastery: Practical experience designing and implementing warehouse schemas tailored for both analytical processing and application integration.
  • ETL/ELT & Orchestration: Proven experience building data pipelines using relevant tools (e.g., Airflow) and frameworks. Python proficiency for scripting, API interaction, and data manipulation is essential.
  • Cloud Data Warehousing: Significant experience working with Google Cloud Platform (GCP) and BigQuery, understanding its underlying architecture and cost-optimization techniques.
  • Communication: Proven ability to explain complex technical pipelines to product managers and software engineering teams.
Nice-to-Haves
  • Previous experience in the Automotive or AdTech industry.
  • Familiarity with workflow orchestration tools like Apache Airflow.
  • Experience with Apache Pinot.
  • Experience with data quality and testing frameworks.
  • Familiarity with Google Cloud Platform (GCP) services beyond BigQuery.
Our Tech Stack
  • Cloud/Warehouse: Google Cloud Platform (GCP), AWS, Google BigQuery, Apache Pinot
  • Transformation: SQL, Python
  • Orchestration: Airflow, Cloud Composer
  • Ingestion: Custom scripts, Google Pub/Sub, AWS Lambda

The salary range for this position is $103,000 - $157,000, plus an annual target bonus.

Lotlinx provides a comprehensive benefits package.

This job posting is for an existing vacancy.

Lotlinx is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.

Lotlinx is not currently able to offer sponsorship for employment visa status.

Lotlinx is headquartered in Peterborough, NH, with offices in Holmdel, NJ, and in Manitoba, Ontario, and British Columbia, Canada, in addition to a large distributed team across the US and Canada.

Our success relies not only on our customers but also on the dedicated talent that continuously moves our platform forward. We value our employees and their abilities, and we seek to foster an open, cooperative, dynamic environment where the team and the company alike can thrive.

Top Skills

Google BigQuery, Apache Pinot, SQL, Python, Google Cloud Platform (GCP), AWS, Google Pub/Sub, AWS Lambda, Apache Airflow, Cloud Composer
