Data Engineer (Contractor)

What You’ll Do
Success in this role is determined by meeting these key objectives:

[20%] Ingest all relevant data sources into our data lakehouse (AWS, Databricks).

We have existing pipelines, metrics, and dashboards leveraging data from our donor CRM and email provider. You will build new pipelines and jobs to ingest data from our website analytics platform and donor ticketing system.
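For illustration only, a new ingestion job for one of these sources might look roughly like the sketch below. The bucket, path, and table names are placeholders rather than our actual systems.

```python
# Minimal sketch of a batch ingestion job into the lakehouse, assuming raw
# website-analytics exports land in S3 as JSON and are stored as a Delta table.
# All names here (bucket, path, schema, table) are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

raw_events = (
    spark.read
    .json("s3://example-raw-bucket/website_analytics/*/*.json")  # hypothetical path
    .withColumn("ingested_at", F.current_timestamp())            # track load time
)

(
    raw_events.write
    .format("delta")
    .mode("append")
    .saveAsTable("bronze.website_analytics_events")  # hypothetical bronze table
)
```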

[40%] Build unified datasets to fully understand donors and fundraising interventions.

Once all data sources are in Databricks, you will work with stakeholders to define metrics, facts, and dimensions necessary for new dashboards, analysis, ML models, and experimentation that will drive fundraising strategy. Then you will build pipelines to clean, transform, and combine data from all platforms into these actionable datasets.
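As a rough illustration of the kind of unified dataset this involves, the sketch below combines donation, donor, and email-engagement tables into a single donor-level dataset. Table and column names are hypothetical and do not reflect our actual schema.

```python
# Hedged sketch of building a donor-level dataset from already-ingested tables.
# Table and column names are hypothetical; they do not reflect the real schema.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

donors = spark.table("bronze.crm_donors")
donations = spark.table("bronze.crm_donations")
email = spark.table("bronze.email_engagement")

# Aggregate donation history per donor
donation_facts = (
    donations.groupBy("donor_id")
    .agg(
        F.sum("amount").alias("lifetime_giving"),
        F.max("donated_at").alias("last_donation_at"),
        F.count("*").alias("donation_count"),
    )
)

# Aggregate email engagement per donor
email_facts = email.groupBy("donor_id").agg(F.sum("opens").alias("email_opens"))

unified_donors = (
    donors
    .join(donation_facts, "donor_id", "left")
    .join(email_facts, "donor_id", "left")
)

unified_donors.write.format("delta").mode("overwrite").saveAsTable("gold.dim_donor")
```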

[30%] Reduce data quality incidents with automated data quality tests and monitoring.

Implement data quality tests and alerting for key donation, donor, and donor engagement variables (see the sketch below).
Monitor job and pipeline performance.
Proactively identify and implement improvements to our existing pipelines.
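A minimal sketch of one such automated check is shown below, assuming a Delta table of donations. The table and column names are placeholders; in practice the check would run as a scheduled job whose failure triggers an alert.

```python
# Hedged sketch of an automated data quality check on a donations table.
# Table and column names are placeholders; the checks are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

donations = spark.table("gold.fct_donations")  # hypothetical table

# Count rows that violate basic expectations
summary = donations.agg(
    F.sum(F.when(F.col("donor_id").isNull(), 1).otherwise(0)).alias("null_donor_ids"),
    F.sum(F.when(F.col("amount") <= 0, 1).otherwise(0)).alias("non_positive_amounts"),
).first()

failures = {name: n for name, n in summary.asDict().items() if n and n > 0}
if failures:
    # Raising here fails the scheduled job, which is what surfaces the alert.
    raise ValueError(f"Data quality check failed: {failures}")
```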

[10%] Make it easy to maintain your pipelines and tools.

Create comprehensive, easy-to-understand documentation to ensure effective knowledge transfer of your work.
Build within our existing configurable pipeline framework and identify ways to improve this process.
Leave our systems and processes better than you found them.

What You’ll Bring

Exceptional alignment with GiveDirectly Values and active demonstration of our core competencies: emotional intelligence, problem solving, project management, follow-through, and fostering inclusivity. We welcome and strongly encourage applications from candidates who have personal or professional experience in the low-income and/or historically marginalized communities that we serve.
Language Requirement: English
Language Preferences: No additional language preferences
Critical thinking and an analytical approach necessary to develop technical solutions that scale and are resilient to changes over time
Entrepreneurial mindset and stakeholder management skills required to identify, design, and execute technical solutions that solve important, ambiguous organizational problems
Python, SQL, and Spark expertise, along with the core competencies required to ship high-quality data pipelines and data tools fast
Extensive experience with Databricks preferred; experience with Tableau is a plus
Intellectual humility, curiosity, and a commitment to being part of an exceptional team

Apply via the link(s) on the company website.