Job Summary
The role holder is responsible for creating algorithms that automate the extraction, analysis, and presentation of complex data, and for improving the efficiency with which data and insights are presented for end-user consumption. They will translate business requirements into automated solutions using self-service platforms and analytics tools, and will ensure development standards are maintained across all solutions produced by the data team.
Key accountabilities/Deliverables/Outcomes
Lead the design, development, and maintenance of software solutions using technologies such as MS PowerApps, PowerBI, SharePoint, ReactJs, PHP, and AWS.
Build web-based applications that integrate with core banking systems using RESTful APIs.
Work closely with business stakeholders and analysts to identify and prioritize software requirements.
Review requirement documents and propose timelines and architecture for delivery
Facilitate automation by designing systems to collect data, maintain its quality, enrich it, and present insights drawn from the data warehouse, PowerApps, AWS, and APIs, among other sources
Adhere to and maintain the standards set by the data governance team in the development of data artefacts and BI solutions
Review solutions developed by junior developers for governance compliance
Conduct regular training for junior developers on development standards
Support BI analysts in responding to complex data requests by creating and curating advanced structured queries, algorithms, dashboards, and reports
Support Data Science team in data sourcing, classification, and exploratory data analysis in support of AI/ML use cases
Transform large, disparate datasets into reusable artefacts such as multidimensional cubes, fact tables, data models, and views, to be maintained in the data warehouse.
Monitor tasks and projects assigned on Helpdesk and DevOps backlogs
Undertake research and development and apply new techniques in solving business reporting problems.
Actively challenge status quo and offer ideas to improve operations and existing solutions deployed by colleagues
Work within multidisciplinary teams, including data engineers, data scientists, product managers, and agile delivery managers, to scope, plan, and deliver data-driven insights
Preferred Qualification
Bachelor’s degree in IT, technology, data science, business analytics or math focused fields is preferred (or equivalent on-the-job experience and personal analytics projects).
Certified in SQL, AWS or Hadoop Data management
Preferred Experience
A minimum of 5–6 years' technical experience
Knowledge and Skills
Strong analytical and diagnostic skills
Ability to work in teams and remotely
Experience in working within a large complex organisation with multiple stakeholders
Experience in object-oriented programming and functional scripting languages
Experience in the use of project management tools such as JIRA, Planner, DevOps, and Git
Behavioral Competency
Problem-solving skills- Ability to identify and solve complex problems using analytical thinking and creativity
Attention to detail- Ability to maintain accuracy and thoroughness in all tasks, from data collection to analysis and reporting.
Curiosity and continuous learning- Passion for learning and keeping up with the latest trends and developments in the field of data science.
Strong analytical thinking- Ability to approach problems in a logical and systematic manner, using data-driven insights to guide decision-making.
Effective communication skills- Ability to articulate complex ideas and findings in a clear and concise manner, both verbally and in writing.
Ability to work in a team environment- Ability to collaborate with others, share knowledge and expertise, and contribute to a positive team culture.
Time management and prioritization- Ability to manage multiple projects and deadlines, prioritize tasks effectively, and deliver high-quality work on time.
Adaptability and flexibility- Ability to adjust to changing priorities, work in a dynamic environment, and thrive in a fast-paced setting.
Technical Competencies
Visual Analytics- Ability to use visual intelligence tools such as PowerBI, Tableau, QlikView, Power Query, IDEA to create business solutions
Query language proficiency- Knowledge of query languages such as SQL, Hive, Pig, and the ability to write efficient queries to extract, transform, and load data from relational and non-relational databases
Scripting and object-oriented programming skills- Proficiency in scripting languages such as Python, R, or PHP, as well as ReactJs, and knowledge of object-oriented programming
Data warehouse management- Proficiency in creating views, tables, procedures, and dimensions, and in working with Kimball marts and tabular models to create reusable data artefacts in the warehouse
Streaming data analytics- Proficiency in working with streaming data from Kafka, Hadoop, and APIs to ingest, analyse, and output data through APIs or into warehouses in support of data solutions
Structured and unstructured data analysis
Agile project management- Ability to manage projects and develop technology solutions in an Agile way, using agile tools and Git for CI/CD
Workflow data analysis- Ability to use workflow-based data tools such as Knime, Alteryx, SSIS for data blending, cleansing, analysis, creation of data pipelines and algorithms in support of business solutions
Cloud computing experience- Experience with Cloud offerings such as AWS, Azure, and GCP, and the ability to set up, manage, and optimize cloud-based data infrastructure and analytics solutions.
Big data engineering tools- Ability to work with Apache Spark, Hadoop, Impala, Hive, Hue, and Databricks to create and maintain warehouses and data pipelines