Who is Ezra?
Ezra provides B2B digital lending solutions for emerging markets in partnership with mobile and digital wallet operators and financial service providers. Ezra supports 24 operations in 23 countries, across Africa, the Middle East and Asia. Our key office locations are in Nairobi, Kenya and Dubai, UAE.
Our flagship products are Advance Credit Service (ACS), Nano and BNPL.
- ACS is an airtime or data advance offered to prepaid mobile subscribers at the point of low credit.
- Nano is a micro cash advance offered to mobile wallet users on demand.
- BNPL facilitates payment installments for products and services.
As a FinTech company, we are entirely technology and data driven: from determining subscriber eligibility and generating relevant offers to managing risk, issuing and recovering loans, optimizing performance, and handling reporting, reconciliation and billing.
Each day we process approximately 21M loan requests and 1.4 TB of data across our markets. This process needs to be robust, reliable and secure.
But it doesn’t end there. We’re exploring new ways of using our platform and transactional data to improve our products and develop new product opportunities.
About the Role
We are seeking a highly skilled and experienced Senior Database Engineer to join our dynamic team. The ideal candidate has a strong background in database engineering, coupled with a solid understanding of data engineering principles. Proficiency in database automation tools, Linux, and Bash scripting is essential.
As a Senior Database Engineer, you will be responsible for designing, implementing, and maintaining our database systems to ensure high performance, scalability, and reliability. You will collaborate with different teams, clients and stakeholders, advise as the expert on database design and implementation, and serve as a focal point in escalations.
Key Responsibilities
- Design, implement, and maintain robust database solutions.
- Plan capacity in line with the infrastructure, and design and implement databases that can scale.
- Optimize and tune database performance to ensure efficient data processing and retrieval.
- Develop and maintain ETL (Extract, Transform, Load) processes for data integration and migration.
- Ensure data integrity, consistency, and security across all database systems.
- Collaborate with software engineers, data scientists, and other stakeholders to define data requirements and develop solutions.
- Monitor and troubleshoot database issues, ensuring minimal downtime and quick resolution.
- Automate database management tasks using tools such as Ansible, Terraform and Bash scripting.
- Implement backup and recovery strategies to safeguard critical data; develop, manage and test backup and recovery plans.
- Monitor performance and manage parameters to provide fast query responses to front-end users.
- Refine the logical design so that it can be translated into a specific data model.
- Maintain data standards, including adherence to the Data Protection Act.
- Write database documentation, including data standards, procedures and definitions for the data dictionary (metadata).
- Control access permissions and privileges; establish the needs of users and monitor user access and security.
- Ensure that storage, archiving, backup and recovery procedures are functioning correctly.
- Work directly with development and infrastructure teams to enhance the performance and observability of various database services through monitoring solutions (Grafana, ELK).
- Build data integrations using both API-based and file-based protocols.
Key Requirements
- BSc degree in Computer Science, Business Administration, Information Technology or a related field preferred.
- 4–5 years of IT operations experience with a strong understanding of database structures, theories, principles, and practices.
- 4–5 years of PostgreSQL database administration experience.
- 5+ years of experience in database engineering or a similar role.
- Understanding of, and experience with, client-server computing and relational database environments.
- Experience with data management and data processing flowcharting techniques.
- Knowledge of reporting and query tools and practices.
- Proficiency in SQL and experience with database management systems (e.g., MySQL, PostgreSQL, Oracle, SQL Server).
- Strong knowledge of data engineering concepts and ETL processes.
- Extensive experience with Linux operating systems and Bash scripting.
- Familiarity with cloud-based database solutions (e.g., AWS RDS, Google Cloud SQL, Azure SQL Database).
- Experience with NoSQL databases (e.g., MongoDB, Cassandra) is a plus.
- Strong problem-solving skills and the ability to work independently and as part of a team.
- Excellent communication skills and the ability to convey complex technical concepts to non-technical stakeholders.
- Understanding of big data technologies (Apache Hadoop, Spark) and data warehouse solutions (Google BigQuery, Snowflake, Azure Synapse Analytics).
- Knowledge of Python would be an added advantage.