
DATA ENGINEER (ETL DEVELOPER)
Employer is no longer accepting applications for this job
Posted on : Apr 14, 2025
Applicants : 40
Vacancies : 10
Job Description
IMMEDIATE JOINERS OR CANDIDATES WITH A NOTICE PERIOD OF 30 DAYS OR LESS ARE REQUIRED
DUTIES AND RESPONSIBILITIES:
• Build ETL (extract, transform, load) jobs using Fivetran and dbt for our internal projects and for customers using platforms such as Azure, Salesforce, and AWS
• Monitor active ETL jobs in production
• Build out data lineage artifacts to ensure all current and future systems are properly documented
• Assist with building out design/mapping documentation to ensure development is clear and testable for QA and UAT purposes
• Assess current and future data transformation needs to recommend, develop, and provide training on new data integration tools
• Identify efficiencies in shared data processes and batch schedules to eliminate redundancy and ensure smooth operations
• Assist the Data Quality Analyst in implementing checks and balances across all current and future batch jobs to ensure data quality throughout the environment
• Hands-on experience in developing and implementing large-scale data warehouses, Business Intelligence, and MDM solutions, including Data Lakes/Data Vaults
QUALIFICATIONS:
• Bachelor's Degree in Computer Science, Math, Software Engineering, Computer Engineering, or a related field AND 6+ years' experience in business analytics, data science, software development, data modeling, or data engineering work
• 3–7 years' experience with strong SQL query/development skills
• Ability to develop ETL routines that manipulate and transfer large volumes of data and perform quality checks
• Hands-on experience with ETL tools (e.g. Informatica, Talend, dbt, Azure Data Factory, Fivetran)
• Experience working in the healthcare industry with PHI/PII
• Creative, lateral, and critical thinker
• Excellent communicator
• Well-developed interpersonal skills
• Good at prioritizing tasks and time management
• Ability to describe, create and implement new solutions
• Experience with related or complementary open source software platforms and languages (e.g. Java, Linux, Apache, Perl/Python/PHP, Chef)
• Knowledge / Hands-on experience with BI tools and reporting software (e.g. Cognos, Power BI, Tableau)
• Big Data stack (e.g. Snowflake (Snowpark), Spark, MapReduce, Hadoop, Sqoop, Pig, HBase, Hive, Flume)
Education
B.Tech/B.E.
Work Mode
In-office
About the company
Wroots Global Private Limited
We craft innovative, high-end digital solutions for clients worldwide. Wroots Global is a new-age, comprehensive recruitment solutions provider offering Talent Acquisition and Recruitment Process Outsourcing services. We operate across various industry sectors, with each division of our business providing a knowledgeable and trustworthy service to specialized areas of the market.
351 Kunnur High Road Chitanya, Shanthi Niketan, Sunny Vel, Block A34, 3rd Floor, Ayanavaram, Chennai – 600023, Tamil Nadu, India
Gurdeep Gurjal