As one of our first data engineers, you’ll work with our Business, Analytics, Product and other Engineering teams to enable them to do more with KOKO’s data, faster. You’ll formulate a strategy for our data processing and data stores, and own the implementation and maintenance of these designs, ensuring continued high performance and availability.
KOKO’s suite of products spans multiple technical domains and disciplines in software and hardware, so end-to-end systems thinking across distributed components is key for this role. IoT and real-time data processing feature heavily in our technology suite.
What You Will Do
Design, implement and support KOKO’s data infrastructure
Design and implement ETL pipelines, selecting the most appropriate tools and products in collaboration with KOKO’s analytics, product and engineering teams
Build logical and understandable data models and data stores to house data from KOKO’s various live products and other diverse data sources for consumption by other teams
Build systems to track data quality and consistency, ensuring that our data is accurate & up to date
Help continually improve ongoing reporting and analysis processes, automating or simplifying self-service use of KOKO’s data by creating tools that assist teams in this process
Keep our data secure across national boundaries, multiple data centers, and AWS regions
As the team grows, help recruit and manage additional data engineers
Represent the Data Engineering discipline during sprint planning meetings and retrospectives
Help build the engineering culture at KOKO, attracting like-minded people with a passion for using technology to tackle hard challenges
What You Will Bring to KOKO
At least three years’ experience in software development, data engineering, business intelligence, data science, or a related field
Advanced SQL knowledge and experience with relational databases, including query authoring and working familiarity with a variety of database systems
A track record of manipulating, processing, and extracting value from large datasets
Demonstrated strength in data modeling, ETL development and optimization, and data warehousing
Experience working with big data technologies (preferably within AWS)
Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
Strong analytical skills related to working with unstructured datasets
Knowledge of software engineering best practices across the development lifecycle, including agile methodologies, coding standards, code reviews, source management, build processes, testing, and operations
Strong initiative, project management and organizational skills
Strong communicator, both written and verbal, who is calm and decisive under pressure
Proactive approach to knowledge sharing and developing best practices
Strong experience taking part in sprints, being involved in planning, retrospectives and estimations with cross-functional teams
Apply via:
jobs.lever.co