Data Engineer Delivery Lead

As a member of CloudFactory’s engineering team, you will be tasked with building and growing a world-class distributed workforce management platform that will connect one million people in developing countries to basic computer work.
Roles and Responsibilities
Technical

Help to design and develop a data estate that is performant, accessible, secure, scalable, maintainable, and extensible.
Help to implement true CI/CD using GitHub Actions, etc.
Design and develop an EDW using Snowflake and dbt.
Design and develop an AWS Data Lake using S3, Athena, and Snowflake.
Design and develop data ingestion pipelines using Snowpipe, Fivetran, etc.
Model EDW entities and ensure all data is complete, accurate, timely, and documented within the Data Dictionary and Ubiquitous Terms documents.
Work towards the implementation of a true Self-Service BI platform.
Implement good practices to ensure that Git is used in all circumstances. This includes reviewing others’ work and having your own work reviewed and approved by colleagues.

Governance & Data Protection

Ensure that all work follows best security practices and fully adheres to GDPR, PECR, and other data regulations.
Ensure that all work follows the correct approval and sign-off process before it is pushed into Production.
Ensure that all work is documented and, where needed, has a runbook so that the business can continue to support it.
Where required, ensure that a PIA (Privacy Impact Assessment) is completed.
Work with others in the team to keep the Data Dictionary and Ubiquitous Language complete and up to date.

Process

Follow existing processes, and work to identify and close gaps in them.
Ensure the correct SDLC promotion processes are followed.
Follow the correct sign-off processes to ensure that only approved releases are deployed into Production.
Ensure that all AWS development follows CI/CD processes and is repeatable.

Team/People

Evangelise the Data Team and its estate across the business.
Build relationships with members of the Data Team and the wider Engineering Team.
Work closely and collaboratively with all members of the Data Team and wider Engineering Team.
Work closely with and learn from tech and team leads and challenge proposed solutions with your own ideas.

Requirements
Behavioral Skills

Ability to work across global teams and different cultures in different time zones, with strong communication and collaboration skills.
Tendency to go above and beyond to make things work; manage your own and others’ work to meet deadlines and assist other team members with their deliverables.
Ability to identify solutions and make the complex simple.
Ability to break down complex problems into simple solutions.

Technical Skills

Some experience and knowledge of a coding language such as Python.
Good experience and knowledge of the SQL query language.
Some understanding of star schemas and data warehouse concepts.
Some knowledge of AWS tools and technologies (e.g. Lambda, S3, SQS, SNS, DynamoDB, RDS).
Beneficial – ETL and ELT experience, both batch and microservices-led.
Beneficial – Some Snowflake experience.

Tech Stack

Snowflake (including Snowpipe, streams, security, integrations, etc.)
AWS Data Lake (SNS, SQS, S3, Glue, Athena, DynamoDB, etc.)
AWS Data Streams (Kinesis, Elasticsearch, Logstash, Lambda, API Gateway, etc.)
Fivetran
dbt
QuickSight
