Design Principles and Scenario

This page explains the design scenario for an API connector approach between your new Fineract instance and a legacy suite of CBS/LMS/transaction processors.

It describes a possible design scenario with which the import process can be automated. The same approach is also applicable to DB migration events between two live external services and Fineract. We divide the complete process into the following stages:

1. Ingest and Enrich

Files are received via a storage function, which keeps track of the files received from the legacy CBS/LMS/transaction processor, since the number of files and their naming conventions are pre-defined in the legacy suite.

We need a data layer which handles operations such as the following:

2. Observe files

The files are observed continuously: when new files are added, the method rotates the file, stores the files already processed, and uses a stored hash to identify whether a file still needs to be processed.
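
A minimal sketch of such a hash-based check, assuming the incoming directory and the hash store file named below are placeholders for whatever your storage function actually uses:

```python
import hashlib
from pathlib import Path

INCOMING_DIR = Path("incoming")            # hypothetical landing directory for legacy exports
HASH_STORE = Path("processed_hashes.txt")  # hypothetical store of already-processed file hashes


def file_hash(path: Path) -> str:
    """Return the SHA-256 digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def load_processed() -> set:
    """Load the hashes of files that were already processed."""
    return set(HASH_STORE.read_text().split()) if HASH_STORE.exists() else set()


def new_files_to_process() -> list:
    """Only files whose hash is not yet in the store need processing."""
    processed = load_processed()
    return [p for p in sorted(INCOMING_DIR.glob("*")) if file_hash(p) not in processed]


def mark_processed(path: Path) -> None:
    """Record the file's hash so it is skipped on the next pass (the 'rotation' step)."""
    with HASH_STORE.open("a") as fh:
        fh.write(file_hash(path) + "\n")
```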

3. Transform and mutate

Once a file is identified as needing processing, it must be cleaned and parsed into an open Excel format supported by the Fineract API.

Filters are added at this stage, such as removing fields which are not used, or adding default values, before submitting to the Fineract Excel API.
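
For illustration, a small transformation step using pandas; the unused column names and default values shown here are assumptions, not the actual fields of the Fineract import template:

```python
import pandas as pd

# Columns in the legacy export that the Fineract template does not use (illustrative names)
UNUSED_COLUMNS = ["legacy_branch_code", "internal_flag"]

# Default values to inject before submitting to the Fineract Excel/bulk import API (illustrative)
DEFAULTS = {"locale": "en", "dateFormat": "dd MMMM yyyy"}


def transform_to_template(legacy_csv: str, template_xlsx: str) -> None:
    """Clean a legacy CSV export and write it out as an .xlsx workbook."""
    df = pd.read_csv(legacy_csv)
    df = df.drop(columns=[c for c in UNUSED_COLUMNS if c in df.columns])
    for column, value in DEFAULTS.items():
        if column not in df.columns:
            df[column] = value
    df.to_excel(template_xlsx, index=False)  # requires openpyxl to be installed
```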

Validation

Validation must be ensured before submitting to the bulk import API: one needs to confirm that all checks and serialisation requirements are fulfilled.
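
A minimal pre-submission check along those lines; the required columns listed here are illustrative, and the real bulk import template defines the authoritative set:

```python
import pandas as pd

REQUIRED_COLUMNS = ["clientId", "productId", "principal", "submittedOnDate"]  # illustrative


def validate_rows(df: pd.DataFrame) -> list:
    """Return human-readable problems; an empty list means the frame can be submitted."""
    problems = []
    for column in REQUIRED_COLUMNS:
        if column not in df.columns:
            problems.append(f"missing column: {column}")
        elif df[column].isna().any():
            problems.append(f"empty values in column: {column}")
    return problems
```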

4. Use Apache Kafka

Why Kafka? Kafka is a good fit because this is an event-driven application, e.g.:

A consumer making a request to fetch all the memos, including the latest one. A consumer making a new transfer using an online banking app.

Both of the above scenarios can lead to multi-service events where some action is taking place. For example:

Fetching memos from the CBS: Kafka can return the state, notifying the application in real time.

Apache Kafka is also useful for Fineract API backend calls, e.g. where one job sends out an API service call to update Fineract with a new client and another job updates the CBS as well. Kafka can handle state updates from both.
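
As an illustration, one way to publish such state updates with the kafka-python client; the topic name, broker address, and event fields are assumptions for this sketch:

```python
import json
from kafka import KafkaProducer  # kafka-python client

# Broker address and topic name are assumptions; adjust for your deployment.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)


def publish_state_update(entity: str, entity_id: str, state: str) -> None:
    """Emit one state-update event, e.g. after the Fineract call or the CBS call completes."""
    producer.send("import-state-updates", {"entity": entity, "id": entity_id, "state": state})
    producer.flush()


# Example: both downstream jobs report their outcome on the same topic.
publish_state_update("client", "1042", "CREATED_IN_FINERACT")
publish_state_update("client", "1042", "UPDATED_IN_CBS")
```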

Job Processing Using Redis

A Redis-based queue can handle job processing and manage the updates on those jobs. This comes in handy where multiple services are working together concurrently, and it helps the developer decide, using the business logic, what to do next.

Jobs can be, for example:

Perform a latest MEMO pull, where developers can watch which jobs are in action; once the states are updated via Kafka, the jobs can advance to the next step.
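
A sketch of such a queue using the RQ library on top of Redis; the queue name and the job function body are placeholders:

```python
from redis import Redis
from rq import Queue

# Queue name and connection details are assumptions for this sketch.
queue = Queue("fineract-imports", connection=Redis())


def pull_latest_memos(since: str) -> int:
    """Placeholder job body: fetch new MEMO records from the legacy CBS since a given date."""
    return 0


# Enqueue the MEMO pull; other services can inspect job.get_status() to decide the next step.
job = queue.enqueue(pull_latest_memos, "2024-01-01")
print(job.id, job.get_status())
```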

Example: importing, say, 100 new loans up to the stage of disbursal.

  1. Create a pipeline and extract the data.

  2. Extract client values and map them to an account number or clientId (see the sketch after this list).

  3. Fetch the product ID.

  4. Set the right dates.

  5. Start the job.
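
For steps 2 and 3, a sketch of the lookups against the Fineract REST API; the base URL, credentials, tenant header, and response field names are assumptions to adapt to your deployment:

```python
import requests

# Base URL, tenant and credentials are assumptions; adjust for your deployment.
BASE = "https://localhost:8443/fineract-provider/api/v1"
AUTH = ("mifos", "password")
HEADERS = {"Fineract-Platform-TenantId": "default"}


def client_id_by_external_id(external_id: str) -> int:
    """Map a legacy account number / external id to the Fineract clientId."""
    resp = requests.get(f"{BASE}/clients", params={"externalId": external_id},
                        auth=AUTH, headers=HEADERS, verify=False)
    resp.raise_for_status()
    # Response field names assumed from the default paged client response shape.
    return resp.json()["pageItems"][0]["id"]


def product_id_by_name(name: str) -> int:
    """Fetch the loan product id to use in the import rows."""
    resp = requests.get(f"{BASE}/loanproducts", auth=AUTH, headers=HEADERS, verify=False)
    resp.raise_for_status()
    return next(p["id"] for p in resp.json() if p["name"] == name)
```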

Approval Process and beyond

  1. Set the Approval date

  2. Set the Disbursal date

Loans are imported completely and effectively.

Role of Middleware:

The middleware sends out events to the API management layer.

The role of the middleware is to:

  1. Observe changes and maintain a changelog

  2. Expose a data transform API

  3. Expose an API for starting a job and checking its status (see the sketch after the methods list below)

  4. Finalise the data interchange API

  5. Expose an API to mutate information

These APIs should contain the following methods:

  1. When processing a file or a new record directly with the job processing entity:

    1. Make sure that you store the job unit details.

    2. Return updates to the listening parties to ensure consistent state updates.

  2. Maintain distinct logs (separate from DevOps logs):

    1. These logs can be read and written by other services.

    2. These logs should be maintained using a consensus approach, so that if a service goes down the data migration process can resume from where it was left off, interrupted, or suspended.

    3. Keep the middleware context-aware of the other services.

  3. Fineract-related activities should be handled in separate queues, even when sending just a single record.
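
As a sketch of the job start/status API mentioned above, a minimal FastAPI service with an in-memory job store; the endpoint paths and field names are illustrative, not a fixed contract:

```python
from uuid import uuid4

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()
jobs = {}  # in-memory store of job unit details; a real middleware would persist this


class JobRequest(BaseModel):
    kind: str      # e.g. "loan-import" (illustrative)
    payload: dict  # the data unit to process


@app.post("/jobs")
def start_job(req: JobRequest) -> dict:
    """Start a job and store its unit details so listening parties can follow its state."""
    job_id = str(uuid4())
    jobs[job_id] = {"kind": req.kind, "payload": req.payload, "state": "QUEUED"}
    return {"id": job_id, "state": "QUEUED"}


@app.get("/jobs/{job_id}")
def job_status(job_id: str) -> dict:
    """Return the current state of a job, giving consistent updates to listeners."""
    if job_id not in jobs:
        raise HTTPException(status_code=404, detail="unknown job")
    return {"id": job_id, **jobs[job_id]}
```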

Note:

Fineract has a process-oriented API, e.g.:

When creating a new client, one needs to follow it up with activation of the client and approval of the savings account; many other similar steps are required when dealing with the portfolio.

You need to keep this in mind while automating such an import from an external tool: make sure that you handle these processes linearly, or else it can create cascading negative effects on the data import exercise. An example sequence for the above import of 100 loans:

Step 1. Create Loan Request.

Step 2. Approve Loan Request.

Step 3. Disburse Loan Request.
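
A sketch of these three calls against the Fineract REST API; the base URL, credentials, tenant header, payload fields, and response field names are assumptions and vary with your Fineract version, so consult the API docs for your deployment:

```python
import requests

# Base URL, tenant and credentials are assumptions; adjust for your deployment.
BASE = "https://localhost:8443/fineract-provider/api/v1"
AUTH = ("mifos", "password")
HEADERS = {"Fineract-Platform-TenantId": "default"}


def post(path: str, body: dict, params: dict = None) -> dict:
    """POST a command to Fineract and return the parsed command result."""
    resp = requests.post(f"{BASE}{path}", json=body, params=params,
                         auth=AUTH, headers=HEADERS, verify=False)
    resp.raise_for_status()
    return resp.json()


# Step 1: create the loan request (payload values are illustrative only).
loan = post("/loans", {
    "clientId": 1, "productId": 1, "principal": 1000,
    "loanTermFrequency": 12, "loanTermFrequencyType": 2,
    "numberOfRepayments": 12, "repaymentEvery": 1, "repaymentFrequencyType": 2,
    "interestRatePerPeriod": 2, "amortizationType": 1, "interestType": 0,
    "interestCalculationPeriodType": 1,
    "expectedDisbursementDate": "01 March 2024", "submittedOnDate": "01 March 2024",
    "loanType": "individual", "locale": "en", "dateFormat": "dd MMMM yyyy",
})
loan_id = loan["loanId"]  # field name assumed from the command response

# Step 2: approve, then Step 3: disburse, strictly in this order.
post(f"/loans/{loan_id}",
     {"approvedOnDate": "01 March 2024", "locale": "en", "dateFormat": "dd MMMM yyyy"},
     params={"command": "approve"})
post(f"/loans/{loan_id}",
     {"actualDisbursementDate": "01 March 2024", "locale": "en", "dateFormat": "dd MMMM yyyy"},
     params={"command": "disburse"})
```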


Maintained by © Muellners Foundation. All Rights Reserved.