
E2open

Data Engineer

Sorry, this job was removed at 01:47 a.m. (PST) on Friday, Feb 21, 2025
Remote
3 Locations

E2open is the connected supply chain platform that enables the world’s largest companies to transform the way they make, move, and sell goods and services. We connect more than 400,000 partners as one multi-enterprise network. Powered by the network, data, and applications, our SaaS platform anticipates disruptions and opportunities to help companies improve efficiency, reduce waste, and operate sustainably. Our employees around the world are focused on delivering enduring value for our clients.

Job Summary:

E2open seeks a Data Engineer with approximately 3-5 years of experience building and maintaining scalable data pipelines, architectures, and infrastructure. The ideal candidate will have hands-on experience with Databricks and/or Snowflake, as well as a strong understanding of data governance, regulatory requirements, and global data hosting.


***This is a hybrid role requiring 3 days in the office.***

Key Responsibilities:

  • Design, build, and maintain scalable data pipelines, architectures, and infrastructure using Databricks and/or Snowflake
  • Work with large data sets, ensuring data quality, integrity, and compliance with regulatory requirements
  • Collaborate with cross-functional teams, including data science, product, and engineering, to identify and prioritize data requirements
  • Develop and implement data governance policies, procedures, and standards to ensure data quality, security, and compliance
  • Ensure compliance with global data hosting regulations such as GDPR
  • Optimize data infrastructure for performance, scalability, and reliability
  • Develop and maintain technical documentation for data infrastructure and pipelines
  • Stay current with industry trends, best practices, and emerging technologies in data engineering

Requirements:

  • 5+ years of experience in data engineering, with a focus on building and maintaining scalable data pipelines, architectures, and infrastructure
  • Hands-on experience with Databricks and/or Snowflake
  • Strong understanding of data governance, regulatory requirements, and global data hosting
  • Experience working with large data sets, ensuring data quality, integrity, and compliance
  • Strong programming skills in languages such as Python or Java
  • Experience with data warehousing, ETL/ELT, and data modeling
  • Strong understanding of data security, access controls, and compliance
  • Excellent problem-solving skills, with the ability to work in a fast-paced environment
  • Strong communication and collaboration skills, with the ability to work with cross-functional teams

Nice to Have:

  • Experience with cloud-based data platforms, such as AWS, Azure, or GCP
  • Knowledge of data discovery, metadata management, and data cataloging
  • Experience with agile development methodologies and version control systems, such as Git
  • Certification in data engineering, data governance, or related fields



E2open is proud to be an Equal Employment Opportunity employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, reproductive health decisions, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, genetic information, political views or activity, or other applicable legally protected characteristics.

E2open participates in the E-verify program in certain locations, as required by law.

E2open does not accept unsolicited referrals or resumes from any source other than directly from candidates or preferred vendors. We will not consider unsolicited referrals.


