
Senior Data Engineer

Wroots Global Private Limited

Bangalore | Hybrid
Full-time, Permanent
5–8 Years
₹30–35 Lakh

The employer is no longer accepting applications for this job.


Posted on: Feb 1, 2026

Applicants: 11

Vacancies: 20


Job Description

Role Overview

As a Senior Data Engineer, you will be a key contributor to the enterprise-wide initiative to build a unified analytics foundation across Digital, Stores, and Marketplace experiences.

You will design, build, and operate scalable data pipelines and data products on AWS and Databricks, supporting Nike’s transition away from Snowflake toward a modern Spark Lakehouse architecture. This role is ideal for an engineer who thrives in complex environments, enjoys ownership of critical data assets, and is ready to grow into broader technical leadership.

 

Key Responsibilities

Data Engineering & Pipeline Development

  • Design, develop, and maintain robust ETL/ELT pipelines using Python, Spark, and SQL.
  • Build and optimize data pipelines on AWS (S3, Lambda, EMR) and Databricks for large-scale analytics workloads.
  • Implement batch and incremental data processing patterns aligned to enterprise standards.
  • Ensure high data quality through testing, validation, and automated checks.
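The batch-and-incremental pattern mentioned above can be sketched in plain Python. This is an illustration only; the row shape, column names, and watermark logic are invented for the sketch, and a real pipeline in this role would run as Spark jobs on Databricks:

```python
from datetime import datetime

def incremental_load(source_rows, target_rows, watermark):
    """Append only rows newer than the last processed watermark,
    then advance the watermark -- the core of an incremental load."""
    new_rows = [r for r in source_rows if r["updated_at"] > watermark]
    target_rows.extend(new_rows)
    if new_rows:
        watermark = max(r["updated_at"] for r in new_rows)
    return target_rows, watermark

source = [
    {"id": 1, "updated_at": datetime(2026, 1, 1)},
    {"id": 2, "updated_at": datetime(2026, 1, 5)},
]
# Watermark of Jan 2 means only row 2 (updated Jan 5) is new
target, wm = incremental_load(source, [], datetime(2026, 1, 2))
```

The same watermark idea scales to partitioned S3 data or Delta tables: track the latest processed timestamp and filter on it at read time.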

Analytics Enablement & Data Modeling

  • Develop analytical data models that support KPI reporting and advanced analytics use cases.
  • Write advanced SQL involving complex joins, window functions, and performance optimization.
  • Collaborate with analytics, product, and business teams to translate requirements into reliable datasets.
  • Contribute to achieving high KPI accuracy and metric consistency across domains.
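As a small, self-contained example of the window-function SQL the role calls for, here is a per-customer ranking query run against an in-memory SQLite database (the `orders` table and its columns are made up for the sketch; production queries would target warehouse tables):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('a', 10.0), ('a', 30.0), ('b', 20.0);
""")
# RANK() restarts within each customer partition, ordered by spend
rows = conn.execute("""
    SELECT customer, amount,
           RANK() OVER (PARTITION BY customer ORDER BY amount DESC) AS rnk
    FROM orders
""").fetchall()
```

The same `PARTITION BY ... ORDER BY` structure carries over directly to Spark SQL and Databricks.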

Orchestration, CI/CD & Operations

  • Build and manage workflows using Apache Airflow for reliable pipeline orchestration.
  • Contribute to CI/CD pipelines using Git and Jenkins, following engineering best practices.
  • Support containerized deployments using Docker, with exposure to Kubernetes.
  • Assist in infrastructure provisioning and configuration using Terraform / CloudFormation.

Streaming & Real-Time Processing

  • Develop or support real-time and near-real-time pipelines using Kafka, Kinesis, and Spark Streaming.
  • Understand event-driven architectures and their application in retail and digital analytics.
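To make the streaming bullet concrete, here is a toy version of the stateful micro-batch aggregation that Spark Streaming applies to a Kafka or Kinesis topic. The event shape is invented and the "stream" is just two lists; real code would use Structured Streaming with a checkpointed state store:

```python
from collections import Counter

def process_micro_batch(events, running_counts):
    """Fold one micro-batch of events into a running per-key count --
    the basic stateful-aggregation step of a streaming job."""
    for event in events:
        running_counts[event["product"]] += 1
    return running_counts

counts = Counter()
counts = process_micro_batch([{"product": "shoe"}, {"product": "shirt"}], counts)
counts = process_micro_batch([{"product": "shoe"}], counts)
```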

Data Governance & Observability

  • Adhere to enterprise data governance standards, including data lineage, cataloging, and access controls.
  • Contribute to metadata management and dataset documentation using Unity Catalog or similar tools.
  • Implement logging, monitoring, and alerting to ensure pipeline reliability and SLA adherence.
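A minimal sketch of the logging-and-SLA idea in the last bullet (the wrapper name and the 5-second threshold are illustrative; a real platform would emit these metrics to its monitoring stack rather than the log alone):

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def run_with_sla(task, sla_seconds):
    """Run a pipeline step, log its duration, and flag SLA breaches."""
    start = time.monotonic()
    result = task()
    elapsed = time.monotonic() - start
    breached = elapsed > sla_seconds
    if breached:
        log.warning("SLA breach: %.2fs > %.2fs", elapsed, sla_seconds)
    else:
        log.info("Step finished in %.2fs", elapsed)
    return result, breached

result, breached = run_with_sla(lambda: sum(range(1000)), sla_seconds=5.0)
```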

 

Must-Have Qualifications

  • 5–8 years of hands-on experience in data engineering or big data development.
  • Strong experience with AWS cloud services, especially S3, Lambda, and EMR.
  • Hands-on expertise with Databricks and Apache Spark for large-scale data processing.
  • Advanced Python skills for data transformation, automation, and testing.
  • Strong SQL skills, including complex queries, window functions, and performance tuning.
  • Experience with Apache Airflow for pipeline orchestration.
  • Hands-on with streaming technologies such as Kafka, Kinesis, or Spark Streaming.
  • Working knowledge of Docker and modern deployment practices.

 

Required Certifications

  • Databricks Certified Data Engineer – Associate or Professional
  • AWS Certified Data Analytics – Specialty or AWS Solutions Architect – Associate

 

Preferred Qualifications

  • Experience in retail, e‑commerce, or consumer analytics environments.
  • Exposure to digital, physical store, or marketplace data domains.
  • Experience supporting platform modernization or cloud migration initiatives.
  • Familiarity with data governance frameworks, metric definitions, and enterprise analytics standards.
  • Prior experience working in large, global data platforms with multiple stakeholders.

 

Why This Role Matters

As a Senior Data Engineer on Project AGNI, your work will directly power how Nike understands its consumers and athletes—enabling faster decisions, trusted metrics, and scalable innovation. This role offers high visibility, complex technical challenges, and a clear growth path toward staff or lead-level engineering roles.

 

Education

Any Graduate

Work Mode

Hybrid

Key Skills

Kafka, Apache Airflow, AWS Cloud Services, Advanced SQL, Python

About the company

Wroots Global Private Limited

We craft innovative, high-end digital solutions for clients worldwide. Wroots Global is a new-age, comprehensive recruitment solution provider of Talent Acquisition and Recruitment Process Outsourcing services. We operate across various industry sectors, with each division of our business providing a knowledgeable and trustworthy service to specialised areas of the market.

  351 Kunnur High Road, Chitanya, Shanthi Niketan, Sunny Vel, Block A34, 3rd Floor, Ayanavaram, Chennai – 600023, Tamil Nadu, India

Gurdeep Gurjal

Director

Get more job offers

Create a FREE account and get job notifications and personalised job offers.
