The International Rescue Committee (IRC) responds to the world's worst humanitarian crises, helping to restore health, safety, education, economic wellbeing, and power to people devastated by conflict and disaster. Founded in 1933 at the call of Albert Einstein, the IRC is one of the world's largest international humanitarian non-governmental organizations (INGOs), at work in more than 40 countries and 29 U.S. cities, helping people to survive, reclaim control of their future, and strengthen their communities. A force for humanity, IRC employees deliver lasting impact by restoring safety, dignity, and hope to millions. If you're a solutions-driven, passionate change-maker, come join us in positively impacting the lives of millions of people worldwide for a better future.

Background/IRC Summary:

Technology and Operations support the organization’s work by providing reliable and scalable solutions for the IRC’s offices around the world. The Data Team at IRC is responsible for the design and delivery of global data strategies and the systems and products that deliver on them.

Job Overview/Summary:

The Data Engineer will support the implementation, configuration, and maintenance of data systems and pipelines across IRC’s data environment. This role assists in building and operating ETL/ELT processes, data integrations, and cloud-based data platforms such as Azure Databricks, Synapse, and Fabric.

The successful candidate will help maintain Lakehouse data environments by monitoring pipeline execution, supporting data loads, and assisting in data modeling tasks under guidance from senior team members. This is a hands-on technical role that requires foundational data engineering knowledge, willingness to learn, and strong collaboration skills.

The Data Engineer will work closely with senior engineers and architects but will not be responsible for deputizing for the Data Architect or owning critical security responsibilities.

 

Major Responsibilities:

1. Design, build, and maintain reliable ETL/ELT data pipelines for batch and near-real-time processing from internal and external sources using tools such as Azure Data Factory or Databricks workflows.

2. Implement data validation, testing, and reconciliation checks, including dbt tests where applicable (an illustrative sketch follows this list).

3. Monitor pipeline health, performance, and reliability.

4. Identify issues and escalate or collaborate with senior engineers to resolve them.

5. Write SQL and Python queries for data extraction and transformation.

6. Support documentation of processes, standards, and improvements.

7. Support solution design by preparing data samples, documentation, or prototype queries.
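
By way of illustration only, here is a minimal sketch of the kind of validation and reconciliation check referred to in item 2. The table names, columns, and tolerance are hypothetical; in practice such checks would typically run in Databricks, Azure Data Factory, or as dbt tests against the Lakehouse rather than against an in-memory SQLite database.

    # Minimal reconciliation sketch: compare row counts and totals between a
    # "source" extract and a "target" table after a load. Table names, columns,
    # and the tolerance are hypothetical and exist only for this example.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()

    # Simulate a source extract and the table a pipeline loaded from it.
    cur.execute("CREATE TABLE source_donations (id INTEGER, amount REAL)")
    cur.execute("CREATE TABLE target_donations (id INTEGER, amount REAL)")
    rows = [(1, 100.0), (2, 250.5), (3, 75.25)]
    cur.executemany("INSERT INTO source_donations VALUES (?, ?)", rows)
    cur.executemany("INSERT INTO target_donations VALUES (?, ?)", rows)

    def reconcile(cur, source, target):
        """Compare row counts and summed amounts between two tables."""
        src_count, src_sum = cur.execute(
            f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {source}").fetchone()
        tgt_count, tgt_sum = cur.execute(
            f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {target}").fetchone()
        issues = []
        if src_count != tgt_count:
            issues.append(f"row count mismatch: {src_count} vs {tgt_count}")
        if abs(src_sum - tgt_sum) > 0.01:
            issues.append(f"amount total mismatch: {src_sum} vs {tgt_sum}")
        return issues

    problems = reconcile(cur, "source_donations", "target_donations")
    if problems:
        # In a real pipeline this would fail the run or raise an alert so the
        # issue can be escalated to senior engineers.
        print("Reconciliation failed:", "; ".join(problems))
    else:
        print("Reconciliation passed: counts and totals match.")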

 

 

Key Working Relationships:

  • Data Team

  • Business/Departmental Priority Setters

  • Enterprise Systems Owners

Position Reports to: Omar Bouidel

Travel Requirements:


Minimum Requirements:

  • Experience: 2–4 years of hands-on experience in data engineering, data processing, or software engineering.

  • Technical Skills:

    • SQL (advanced): joins, window functions, CTEs, performance tuning

    • Python: data processing, APIs, automation, PySpark basics

    • Data modeling: star/snowflake schemas, fact & dimension tables (see the brief sketch after this list)

    • ETL/ELT pipelines: building, monitoring, and optimizing pipelines

    • dbt Core / dbt Cloud: developing, scheduling, and maintaining models

  • Version Control & CI/CD: Familiarity with Git or other version control systems.

  • Problem-Solving: Strong problem-solving skills and attention to detail.

  • Communication: Good communication and teamwork skills.
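
As a brief illustration of several of the skills listed above (advanced SQL with CTEs and window functions, star-schema modeling with fact and dimension tables, and Python for data processing), the sketch below builds a toy star schema in an in-memory SQLite database and queries it. All table names and figures are invented for this example, and it assumes SQLite 3.25 or later for window-function support.

    # Toy star schema: one dimension table and one fact table, queried with a
    # CTE and a window function. All names and values are invented; requires
    # SQLite 3.25+ for ROW_NUMBER() OVER (...).
    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()

    # Dimension table: one row per country office.
    cur.execute("CREATE TABLE dim_office (office_id INTEGER PRIMARY KEY, country TEXT)")
    # Fact table: one row per expenditure, keyed to the dimension.
    cur.execute("""CREATE TABLE fact_expenditure (
                       expenditure_id INTEGER PRIMARY KEY,
                       office_id INTEGER REFERENCES dim_office(office_id),
                       amount_usd REAL)""")

    cur.executemany("INSERT INTO dim_office VALUES (?, ?)",
                    [(1, "Jordan"), (2, "Kenya")])
    cur.executemany("INSERT INTO fact_expenditure VALUES (?, ?, ?)",
                    [(1, 1, 1200.0), (2, 1, 800.0), (3, 2, 950.0), (4, 2, 1500.0)])

    # CTE + window function: rank expenditures within each country by amount.
    query = """
    WITH spend AS (
        SELECT d.country, f.amount_usd
        FROM fact_expenditure AS f
        JOIN dim_office AS d ON d.office_id = f.office_id
    )
    SELECT country,
           amount_usd,
           ROW_NUMBER() OVER (PARTITION BY country ORDER BY amount_usd DESC) AS rank_in_country
    FROM spend
    ORDER BY country, rank_in_country
    """
    for row in cur.execute(query):
        print(row)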

Preferred Additional Requirements:

  • Experience with cloud platforms (Azure preferred), Azure Data Factory, or similar cloud data tools.


Working Environment:

  • Remote

PROFESSIONAL STANDARDS

All International Rescue Committee workers must adhere to the core values and principles outlined in the IRC Way - Standards for Professional Conduct. Our Standards are Integrity, Service, Equality and Accountability. In accordance with these values, the IRC operates and enforces policies on Safeguarding, Conflicts of Interest, Fiscal Integrity, and Reporting Wrongdoing and Protection from Retaliation. The IRC is committed to taking all necessary preventive measures and creating an environment where people feel safe, and to taking all necessary actions and corrective measures when harm occurs. The IRC builds teams of professionals who promote critical reflection, power sharing, debate, and objectivity to deliver the best possible services to our clients.


