
Rackspace Technology

Data Architect (Azure and Databricks)

Posted 12 Days Ago
Remote
Hiring Remotely in Gurgaon, Gurugram, Haryana
Senior level

Overview 

We are seeking an experienced Data Architect with extensive expertise in designing and implementing modern data architectures. This role requires strong software engineering skills, hands-on coding ability, and experience building data engineering frameworks. The ideal candidate will have a proven track record of implementing Databricks-based solutions in the healthcare industry, with expertise in data catalog implementation and governance frameworks.

 

About the Role 

As a Data Architect, you will be responsible for designing and implementing scalable, secure, and efficient data architectures on the Databricks platform. You will lead the technical design of data migration initiatives from legacy systems to modern Lakehouse architecture, ensuring alignment with business requirements, industry best practices, and regulatory compliance. 

 

Key Responsibilities 


Design and implement modern data architectures using Databricks Lakehouse platform 

Lead the technical design of Data Warehouse/Data Lake migration initiatives from legacy systems 

Develop data engineering frameworks and reusable components to accelerate delivery 

Establish CI/CD pipelines and infrastructure-as-code practices for data solutions 

Implement data catalog solutions and governance frameworks 

Create technical specifications and architecture documentation 

Provide technical leadership to data engineering teams 

Collaborate with cross-functional teams to ensure alignment of data solutions 

Evaluate and recommend technologies, tools, and approaches for data initiatives 

Ensure data architectures meet security, compliance, and performance requirements 

Mentor junior team members on data architecture best practices 

Stay current with emerging technologies and industry trends 
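To illustrate the "reusable components" responsibility above, here is a minimal, hypothetical sketch of a registry-based pipeline-step pattern in plain Python. All names are illustrative; in practice such steps would typically wrap Spark/Delta Lake transformations on Databricks, but the framework pattern is the same.

```python
from typing import Callable, Dict, List

# Hypothetical registry mapping step names to transformations over rows.
# A production framework would register Spark transformations instead.
STEP_REGISTRY: Dict[str, Callable[[List[dict]], List[dict]]] = {}

def register_step(name: str):
    """Decorator that registers a transformation under a name."""
    def wrap(fn):
        STEP_REGISTRY[name] = fn
        return fn
    return wrap

@register_step("drop_nulls")
def drop_nulls(rows):
    # Remove records that contain any missing value.
    return [r for r in rows if all(v is not None for v in r.values())]

@register_step("uppercase_ids")
def uppercase_ids(rows):
    # Normalise the (hypothetical) 'id' column to upper case.
    return [{**r, "id": r["id"].upper()} for r in rows]

def run_pipeline(rows, steps):
    """Apply registered steps in order: the core of a config-driven framework."""
    for name in steps:
        rows = STEP_REGISTRY[name](rows)
    return rows
```

Because steps are looked up by name, a pipeline can be declared as data (e.g. `["drop_nulls", "uppercase_ids"]` in a config file), which is what makes such components reusable across teams.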


Qualifications 

Extensive experience in data architecture design and implementation 

Strong software engineering background with expertise in Python or Scala 

Proven experience building data engineering frameworks and reusable components 

Experience implementing CI/CD pipelines for data solutions 

Expertise in infrastructure-as-code and automation 

Experience implementing data catalog solutions and governance frameworks 

Deep understanding of Databricks platform and Lakehouse architecture 

Experience migrating workloads from legacy systems to modern data platforms 

Strong knowledge of healthcare data requirements and regulations 

Experience with cloud platforms (AWS, Azure, GCP) and their data services 

Bachelor's degree in Computer Science, Information Systems, or related field; advanced degree preferred 


Technical Skills 

Programming languages: Python and/or Scala (required) 

Data processing frameworks: Apache Spark, Delta Lake 

CI/CD tools: Jenkins, GitHub Actions, Azure DevOps 

Infrastructure-as-code (optional): Terraform, CloudFormation, Pulumi 

Data catalog tools: Databricks Unity Catalog, Collibra, Alation 

Data governance frameworks and methodologies 

Data modeling and design patterns 

API design and development 

Cloud platforms: AWS, Azure, GCP 

Container technologies: Docker, Kubernetes 

Version control systems: Git 

SQL and NoSQL databases 

Data quality and testing frameworks 
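As a small illustration of the "data quality and testing frameworks" skill above, the sketch below shows two hypothetical column-level checks in plain Python. Function and column names are assumptions for illustration; production checks would usually run via Spark or a dedicated tool rather than over in-memory lists.

```python
def check_not_null(rows, column):
    """Return the indices of rows where `column` is missing."""
    return [i for i, r in enumerate(rows) if r.get(column) is None]

def check_unique(rows, column):
    """Return True if every value in `column` is distinct across rows."""
    values = [r.get(column) for r in rows]
    return len(values) == len(set(values))
```

Checks like these are typically parameterised in a suite (column name, severity, threshold) and executed as a gate in the data pipeline, failing the run or quarantining bad records.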


Optional - Healthcare Industry Knowledge 

Healthcare data standards (HL7, FHIR, etc.) 

Clinical and operational data models 

Healthcare interoperability requirements 

Healthcare analytics use cases 

Top Skills

Alation
Spark
AWS
Azure
Azure DevOps
CloudFormation
Collibra
Databricks Unity Catalog
Delta Lake
Docker
GCP
GitHub Actions
Jenkins
Kubernetes
NoSQL
Pulumi
Python
Scala
SQL
Terraform
