
LotLinx, Inc.

Data Engineer

Posted 2 Days Ago
In-Office
3 Locations
Senior level

Since our founding in 2012, Lotlinx has consistently pioneered advancements in the automotive landscape. We specialize in empowering automobile dealers and manufacturers with cutting-edge data and technology, delivering a distinct market advantage for every vehicle transaction. Today, we stand as the foremost AI- and machine-learning-powered automotive technology platform, excelling in digital marketing, risk management, and strategic inventory management.

Lotlinx provides employees with a dynamic work environment that is challenging, team-oriented, and full of passionate people. We offer great incentives to our employees, such as competitive compensation and benefits, flex time off, and career development opportunities.

Job Summary

We are seeking an experienced Data Engineer to join our growing Data team. In this role, you will be the primary architect of the cloud-first data infrastructure that powers LotLinx's Automotive Digital Advertising platform. You will be responsible for designing, building, and scaling the robust data pipelines that ingest, process, and store massive volumes of data from highly disparate sources.

Reporting to the Senior Director of Data Analytics, you will thrive in a fast-paced environment where your infrastructure decisions directly enable Analytics, Product, and Design teams to extract maximum value from our data. This is a highly collaborative, high-impact role where you will not just maintain pipelines, but actively optimize, secure, and innovate our entire data aggregation architecture.

Why Join Us?

Direct Impact: You hold the keys to the engine. The infrastructure you build directly impacts our ability to deliver real-time insights to our automotive clients. Your architectural decisions will have a highly visible, direct impact on the company's success.

Autonomy & Fast Shipping: We dislike red tape just as much as you do. We trust our engineers. You will have the autonomy to make high-level technical decisions and deploy infrastructure quickly without waiting on long committee approvals.

Outcome-Driven Culture: We measure success by the reliability and scalability of your pipelines, not the hours you sit at your desk. We value an async-friendly approach.

The Best of Both Worlds (Hybrid Flexibility): Enjoy the perk of working from home two days a week, while joining your team for three days of engaging, in-person collaboration at our Winnipeg, Oakville, or Vancouver offices.

Top-Tier Developer Experience: To ensure you have the best tools for the job, we provide top-of-the-line laptops and let you choose your preferred hardware environment (Mac, Windows, or Linux). Furthermore, we actively encourage leveraging cutting-edge technologies, including LLMs and AI coding assistants, to supercharge your daily workflows.

Key Responsibilities
  • Architect High-Scale Data Infrastructure: Design, build, and maintain robust, scalable, and highly available data pipelines to process massive datasets from internal and external sources.
  • Build Robust Ingestion & Transformation: Develop and manage complex ELT/ETL workflows for data ingestion, transformation, and loading into our data lakes and cloud data warehouses.
  • Cloud Architecture & Security: Architect and implement modern cloud-based solutions (AWS, GCP) while partnering closely with DevOps and Security to ensure strict compliance with data governance, privacy, and security standards.
  • Deep Performance Optimization: Proactively identify and resolve performance bottlenecks, scaling challenges, and technical issues. Engineer solutions for large-scale data storage and compute efficiency.
  • Cross-Functional Enablement: Work closely with Analytics, Product, and Design teams to assist with data-related technical issues and build the infrastructure they need to succeed.
  • Ownership of Data Quality: Act as the internal expert for our core data sources. Explore new technologies to continuously improve workflow reliability, data quality, and system scalability.
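As a hedged illustration of the ingestion-and-transformation work described above (not Lotlinx's actual pipeline), here is a minimal Python sketch of an ELT-style step: parse raw records, apply a data-quality gate, and stamp the surviving rows with load metadata. The field names (`vin`, `price`, `loaded_at`) are hypothetical.

```python
import json
from datetime import datetime, timezone

def extract(raw_lines):
    """Parse newline-delimited JSON records, skipping malformed lines."""
    records = []
    for line in raw_lines:
        try:
            records.append(json.loads(line))
        except json.JSONDecodeError:
            continue  # in production, route to a dead-letter queue instead
    return records

def validate(record):
    """Basic quality gate: a 17-character VIN and a positive price."""
    return (
        isinstance(record.get("vin"), str)
        and len(record["vin"]) == 17
        and isinstance(record.get("price"), (int, float))
        and record["price"] > 0
    )

def transform(records):
    """Keep valid rows and stamp them with load metadata."""
    loaded_at = datetime.now(timezone.utc).isoformat()
    return [{**r, "loaded_at": loaded_at} for r in records if validate(r)]

# Example: two good rows, one malformed line, one row failing validation.
raw = [
    '{"vin": "1HGCM82633A004352", "price": 18999}',
    'not json at all',
    '{"vin": "1HGCM82633A004353", "price": -5}',
    '{"vin": "1HGCM82633A004354", "price": 21500}',
]
clean = transform(extract(raw))
print(len(clean))  # 2 rows survive the quality gate
```

In a real pipeline the `transform` output would be staged to a data lake or warehouse table rather than kept in memory, and rejected rows would be logged for data-quality monitoring.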
Qualifications
  • Experience: 3+ years of professional experience in data engineering, with a proven track record of designing, building, and managing scalable distributed data systems.
  • Cloud Infrastructure Mastery: Proven, hands-on expertise with major cloud platforms, specifically AWS and/or GCP.
  • Advanced Programming & SQL: Strong software engineering skills in Python, Scala, or Java, coupled with proficiency in advanced SQL and query optimization.
  • Big Data & Streaming: Deep experience with big data processing frameworks (Apache Spark, Hadoop, Beam) and real-time event streaming platforms (Apache Kafka, Pub/Sub, Kinesis).
  • Orchestration & Transformation: Extensive experience with workflow orchestration (Airflow, Dataflow) and modern transformation tools (dbt).
  • DevOps & CI/CD: Strong working knowledge of CI/CD pipelines, Git, and containerization/orchestration technologies (Docker, Kubernetes).
  • Modern Warehousing: Experience managing massive datasets within modern data platforms like Snowflake, BigQuery, or Redshift.
  • Problem Solving: Demonstrated ability to creatively troubleshoot complex distributed systems and innovate within modern cloud ecosystems.
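The "advanced SQL and query optimization" qualification can be illustrated with a small, self-contained example. This sketch uses SQLite from the standard library rather than the warehouses named above (BigQuery, Snowflake, Redshift), and the table and column names are invented; the point is simply that adding an index changes the query plan from a full table scan to an index search.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE listings (vin TEXT, dealer_id INTEGER, price REAL)")
conn.executemany(
    "INSERT INTO listings VALUES (?, ?, ?)",
    [(f"VIN{i:014d}", i % 50, 10000.0 + i) for i in range(1000)],
)

# Without an index, filtering on dealer_id scans the whole table.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM listings WHERE dealer_id = 7"
).fetchall()

# Adding an index lets the engine seek directly to matching rows.
conn.execute("CREATE INDEX idx_dealer ON listings (dealer_id)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM listings WHERE dealer_id = 7"
).fetchall()

print(plan_before[0][-1])  # e.g. "SCAN listings"
print(plan_after[0][-1])   # e.g. "SEARCH listings USING INDEX idx_dealer (dealer_id=?)"
```

The exact plan wording varies by SQLite version, and cloud warehouses use very different execution engines, but the habit of reading query plans before and after a change carries over directly.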
Nice-to-Haves
  • Previous experience in the Automotive or AdTech industry.
  • Experience with Apache Pinot.
  • Experience with data quality and testing frameworks.
  • Familiarity with Google Cloud Platform (GCP) services beyond BigQuery.
Our Tech Stack
  • Cloud/Warehouse: Google Cloud Platform (GCP), AWS, Google BigQuery, Apache Pinot
  • Transformation: SQL, Python
  • Orchestration: Airflow, Cloud Composer
  • Ingestion: Custom Scripts, Pub/Sub, Lambda
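To show how the orchestration layer ties this stack together, here is a pure-Python sketch of dependency-ordered task execution, which is the core idea behind an Airflow/Cloud Composer DAG. A real deployment would define this with `airflow.DAG` and operators; the task names (ingest, transform, publish) and their mapping to the stack are illustrative assumptions only.

```python
from graphlib import TopologicalSorter

# DAG: publish depends on transform, which depends on ingest.
dag = {
    "ingest": set(),            # e.g. pull events from Pub/Sub
    "transform": {"ingest"},    # e.g. SQL/Python transformation in BigQuery
    "publish": {"transform"},   # e.g. load serving tables into Apache Pinot
}

executed = []

def run(name):
    """Stand-in for a real task: just record that it ran."""
    executed.append(name)

# static_order() yields tasks only after all their upstream
# dependencies, exactly how a scheduler walks a DAG.
for task in TopologicalSorter(dag).static_order():
    run(task)

print(executed)  # ['ingest', 'transform', 'publish']
```

Airflow adds scheduling, retries, backfills, and observability on top of this ordering guarantee, but the dependency graph is the same mental model.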

The salary range for this position is $108,000 - $162,000 with an annual target bonus.

Lotlinx provides a comprehensive benefits package.

This job posting is for an existing vacancy.

Lotlinx is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.

Lotlinx is not currently able to offer sponsorship for employment visa status.

Lotlinx is headquartered in Peterborough, NH, with locations in Holmdel, NJ, and in Manitoba, Ontario, and British Columbia, Canada, in addition to a large distributed team across the US and Canada.

Our success relies not only on our customers but also on the dedicated talent that continuously moves our platform forward. We value our employees and their abilities, and we seek to foster an open, cooperative, dynamic environment where both the team and the company can thrive.

Top Skills

Python, Scala, Java, SQL, Apache Spark, Hadoop, Apache Beam, Apache Kafka, Google Pub/Sub, Amazon Kinesis, Airflow, Dataflow, dbt, Git, Docker, Kubernetes, Snowflake, BigQuery, Redshift, Apache Pinot, Cloud Composer, AWS, GCP, AWS Lambda


