Abstract: Methods and systems for determining intent of cardholders to perform repayment(s) are disclosed. Method performed by a server system includes accessing features for each payment transaction of cardholder based on unique account identifier (ID) received from financial service provider for ongoing payment transaction. Method includes segregating features into a set of authorization features and a set of non-authorization features based on authorization information associated with the unique account ID. Method includes generating, by a trained temporal Machine Learning (ML)-based model, a set of temporal embeddings from set of authorization features. Method includes generating a set of combined embeddings from the set of temporal embeddings, the set of non-authorization features, and ticket size of the ongoing payment transaction. Method includes computing, by an intent-to-pay ML-based model, an intent-to-pay score of the cardholder for ongoing payment transaction based on the set of combined embeddings, indicating likelihood of the cardholder to perform repayment(s).
Description: The present disclosure relates to artificial intelligence-based processing systems and, more particularly, to electronic methods and complex processing systems for determining the intent or willingness of a cardholder to perform repayments.
BACKGROUND
Over the years, rapid technological advancements in the financial sector, including the banking, financial services, and insurance (BFSI) industry, have increased the acceptance of digital banking across many end-user sectors, such as E-commerce, shopping, the education sector, the lending sector, etc. As a result, the digital lending market is also growing, especially among Small and Medium Size Enterprises (SMEs). Digital lending is the practice of offering financial loans and credit services using digital platforms, typically employing internet technologies and data analytics, and bypassing traditional intermediaries such as banks. Digital lending includes several credit models, such as credit card services, point-of-sale financing services, Buy Now Pay Later (BNPL) services, and the like. Several financial institutions, such as banks, non-banking financial companies (NBFCs), peer-to-peer lending platforms, Financial Technology (FinTech) institutions, and the like, are involved in utilizing the benefits of such credit models.
In recent times, it is observed that such institutions have been shifting the interest of end users toward interest-free installment-based services, such as BNPL, and away from credit card-based services, which require end users to pay a certain amount of interest in return for a loan. For instance, some financial institutions are involved in providing such interest-free credit services to end users via merchant platforms such as shopping platforms. These services allow end users to purchase goods and services and pay for them at a later date. As a result, the end users or cardholders can buy items that they may not be able to afford upfront and pay what they owe in installments over an agreed time. This practice positively impacts the user experience while also attracting young consumers, thus improving cash flow, boosting consumer loyalty, and the like.
However, it is observed that, with an increase in the count of cardholders opting for BNPL services, there is an increase in late BNPL payments, missed BNPL payments, declined transactions, default rates, and declines due to non-sufficient funds (NSF), resulting in a fall in the credit scores of such cardholders. The reason for such issues is that such interest-free credit services are often unregulated and do not involve a thorough affordability check of the end users. In some scenarios, cardholders can sign up for BNPL services with low-balance payment cards without any intent of making payments for their purchases. Further, some cardholders might perform most of their repayments using a debit payment card, which may lack sufficient balance for making the repayments on the loan, leading to a decline in repayment due to NSF. As per a recent study, cardholders who experienced declines at one BNPL provider may have seen NSF declines at other BNPL providers and/or on E-commerce or recurring payment transactions as well.
Conventionally, in the space of structured loans, there exist multiple approaches for quantifying the intent and creditworthiness of cardholders before a loan is sanctioned to them. This ensures that the financial service providers do not face the above-mentioned problems. In such approaches, the receiver of such structured loans, i.e., the cardholder, is profiled using details, such as a social security number, and personal information, such as email, phone number, and the like. However, in the case of unstructured loans such as BNPL loans, the intent and/or the creditworthiness of the cardholders to repay the loan amount cannot be accurately determined using the conventional approaches. This is because unstructured loans are not backed by thorough profiling of the cardholders or by collateral.
For instance, conventionally, in the case of BNPL loans, the BNPL provider often uses emails to improve the client base, reduce friction, and provide a better onboarding process for the cardholder. However, it is difficult for the BNPL provider to determine whether the cardholder has any intent or willingness to pay back this loan based on the limited information available to the BNPL provider.
Thus, there exists a need for technical solutions, such as methods and systems for determining the intent or willingness of cardholders to perform one or more repayments while overcoming the aforementioned technical drawbacks.
SUMMARY
Various embodiments of the present disclosure provide methods and systems for capturing the willingness of cardholders to perform one or more repayments.
In an embodiment, a computer-implemented method for capturing willingness of cardholders to perform one or more repayments is disclosed. The computer-implemented method performed by a server system includes accessing a plurality of features for each payment transaction from a plurality of payment transactions performed by a cardholder with at least one of at least one financial service provider and at least one merchant from a database associated with the server system based, at least in part, on a unique account identifier (ID) received from a financial service provider for an ongoing payment transaction. The unique account ID is linked to the cardholder. The method further includes segregating the plurality of features into a set of authorization features and a set of non-authorization features for each payment transaction based, at least in part, on authorization information associated with the unique account ID. Further, the method includes generating, by a trained temporal Machine Learning (ML)-based model associated with the server system, a set of temporal embeddings for the plurality of payment transactions based, at least in part, on the set of authorization features corresponding to each payment transaction. Each temporal embedding in the set of temporal embeddings is generated for each payment transaction. Furthermore, the method includes generating a set of combined embeddings based, at least in part, on concatenating the set of temporal embeddings, the set of non-authorization features for each payment transaction, and a ticket size of the ongoing payment transaction. 
The method includes computing, by a trained intent-to-pay ML-based model associated with the server system, an intent-to-pay score of the cardholder for the ongoing payment transaction based, at least in part, on the set of combined embeddings, the intent-to-pay score indicating a likelihood of the cardholder to perform one or more repayments to the financial service provider for the ongoing payment transaction.
In another embodiment, a server system is disclosed. The server system includes a communication interface and a memory including executable instructions. The server system also includes a processor communicably coupled to the memory. The processor is configured to execute the instructions to cause the server system, at least in part, to access a plurality of features for each payment transaction from a plurality of payment transactions performed by a cardholder with at least one of at least one financial service provider and at least one merchant from a database associated with the server system based, at least in part, on a unique account identifier (ID) received from a financial service provider for an ongoing payment transaction. The unique account ID is linked to the cardholder. The server system is further caused to segregate the plurality of features into a set of authorization features and a set of non-authorization features for each payment transaction based, at least in part, on authorization information associated with the unique account ID. Further, the server system is caused to generate, by a trained temporal Machine Learning (ML)-based model associated with the server system, a set of temporal embeddings for the plurality of payment transactions based, at least in part, on the set of authorization features corresponding to each payment transaction. Each temporal embedding in the set of temporal embeddings is generated for each payment transaction. The server system is further caused to generate a set of combined embeddings for the plurality of payment transactions based, at least in part, on concatenating the set of temporal embeddings, the set of non-authorization features for each payment transaction, and a ticket size of the ongoing payment transaction.
Furthermore, the server system is caused to determine, by a trained intent-to-pay ML-based model associated with the server system, an intent-to-pay score of the cardholder for the ongoing payment transaction based, at least in part, on the set of combined embeddings, the intent-to-pay score indicating a likelihood of the cardholder to perform one or more repayments to the financial service provider for the ongoing payment transaction.
In yet another embodiment, a non-transitory computer-readable storage medium is disclosed. The non-transitory computer-readable storage medium includes computer-executable instructions that, when executed by at least a processor of a server system, cause the server system to perform a method. The method includes accessing a plurality of features for each payment transaction from a plurality of payment transactions performed by a cardholder with at least one of at least one financial service provider and at least one merchant from a database associated with the server system based, at least in part, on a unique account identifier (ID) received from a financial service provider for an ongoing payment transaction. The unique account ID is linked to the cardholder. The method further includes segregating the plurality of features into a set of authorization features and a set of non-authorization features for each payment transaction based, at least in part, on authorization information associated with the unique account ID. Further, the method includes generating, by a trained temporal Machine Learning (ML)-based model associated with the server system, a set of temporal embeddings for the plurality of payment transactions based, at least in part, on the set of authorization features corresponding to each payment transaction. Each temporal embedding in the set of temporal embeddings is generated for each payment transaction. Furthermore, the method includes generating a set of combined embeddings based, at least in part, on concatenating the set of temporal embeddings, the set of non-authorization features for each payment transaction, and a ticket size of the ongoing payment transaction. 
The method includes computing, by a trained intent-to-pay ML-based model associated with the server system, an intent-to-pay score of the cardholder for the ongoing payment transaction based, at least in part, on the set of combined embeddings, the intent-to-pay score indicating a likelihood of the cardholder to perform one or more repayments to the financial service provider for the ongoing payment transaction.
BRIEF DESCRIPTION OF THE FIGURES
For a more complete understanding of example embodiments of the present technology, reference is now made to the following descriptions taken in connection with the accompanying drawings in which:
FIG. 1 illustrates a schematic representation of an environment related to at least some example embodiments of the present disclosure;
FIG. 2 illustrates a simplified block diagram of a server system, in accordance with an embodiment of the present disclosure;
FIG. 3 illustrates a block diagram of an overall model architecture used for generating the intent-to-pay score, in accordance with an embodiment of the present disclosure;
FIG. 4A illustrates a schematic representation of an architecture of a temporal ML-based model, in accordance with an embodiment of the present disclosure;
FIG. 4B illustrates a schematic representation of an architecture of an intent-to-pay ML-based model, in accordance with an embodiment of the present disclosure;
FIG. 5A illustrates a graphical representation of a variation of an intent-to-pay score with an average ticket size, in accordance with an embodiment of the present disclosure;
FIG. 5B illustrates a graphical representation of a variation of an intent-to-pay score with an average ticket size across different segments of cardholders, in accordance with an embodiment of the present disclosure;
FIG. 6 illustrates a sequence flow diagram depicting a process flow of an example scenario that requires the generation of an intent-to-pay score, in accordance with an embodiment of the present disclosure;
FIG. 7 illustrates a sequence flow diagram depicting a detailed process flow of generating the intent-to-pay score, in accordance with an embodiment of the present disclosure;
FIG. 8 illustrates a flow diagram depicting a method for capturing the willingness of cardholders to perform one or more repayments, in accordance with an embodiment of the present disclosure; and
FIG. 9 illustrates a simplified block diagram of a payment server, in accordance with an embodiment of the present disclosure.
The drawings referred to in this description are not to be understood as being drawn to scale except if specifically noted, and such drawings are only exemplary in nature.
DETAILED DESCRIPTION
In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. It will be apparent, however, to one skilled in the art that the present disclosure can be practiced without these specific details. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
Reference in this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. The appearances of the phrase “in an embodiment” in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Moreover, various features are described which may be exhibited by some embodiments and not by others. Similarly, various requirements are described which may be requirements for some embodiments but not for other embodiments.
Moreover, although the following description contains many specifics for the purposes of illustration, anyone skilled in the art will appreciate that many variations and/or alterations to said details are within the scope of the present disclosure. Similarly, although many of the features of the present disclosure are described in terms of each other, or in conjunction with each other, one skilled in the art will appreciate that many of these features can be provided independently of other features. Accordingly, this description of the present disclosure is set forth without any loss of generality to, and without imposing limitations upon, the present disclosure.
Embodiments of the present disclosure may be embodied as an apparatus, a system, a method, or a computer program product. Accordingly, embodiments of the present disclosure may take the form of an entire hardware embodiment, an entire software embodiment (including firmware, resident software, micro-code, etc.), or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit”, “engine”, “module”, or “system”. Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer-readable storage media having computer-readable program code embodied thereon.
The terms “account holder”, “user”, “cardholder”, and “buyer” are used interchangeably throughout the description and refer to a person who has a payment account or at least one payment card (e.g., credit card, debit card, etc.), which may or may not be associated with the payment account, and which will be used to complete a payment transaction, initiated by the cardholder, with a merchant. The payment account may be opened via an issuing bank or an issuer server.
The term “merchant”, used throughout the description generally refers to a seller, a retailer, a purchase location, an organization, or any other entity that is in the business of selling goods or providing services, and it can refer to either a single business location or a chain of business locations of the same entity.
The term “payment account” used throughout the description refers to a financial account that is used to fund a financial transaction interchangeably referred to as “payment transaction” or “transaction”. Examples of the financial account include but are not limited to a savings account, a credit account, a checking account, and a virtual payment account. The financial account may be associated with an entity, such as an individual person, a family, a commercial entity, a company, a corporation, a governmental entity, a non-profit organization, and the like. In some scenarios, the financial account may be a virtual or temporary payment account that can be mapped or linked to a primary financial account, such as those accounts managed by payment wallet service providers, and the like.
The term “issuer”, used throughout the description, refers to a financial institution normally called an “issuer bank” or “issuing bank” in which an individual or an institution may have an account. The issuer also issues a payment card, such as a credit card or a debit card, etc. Further, the issuer may also facilitate online banking services, such as electronic money transfer, bill payment, etc., to the account holders through a server called “issuer server” throughout the description. Thus, the terms “issuer”, “issuer bank”, “issuing bank” or “issuer server” will be used interchangeably throughout the description.
Further, the term “acquirer” refers to a financial institution (e.g., a bank) that processes financial transactions for merchants. In other words, this can be an institution that facilitates the processing of payment transactions for physical stores, merchants, or institutions that own platforms that make either online purchases or purchases made via software applications possible (e.g., the shopping cart platform providers and the in-app payment processing providers). The acquirers process the payment transactions by providing merchants with merchant accounts that facilitate electronic payments. The terms “acquirer”, “acquirer bank”, “acquiring bank” or “acquirer server” will be used interchangeably herein.
The term “merchant account” used throughout the description, refers to a kind of commercial bank account that enables merchants or companies to accept and handle credit and debit card payments electronically. To open a merchant account, a company must collaborate with a merchant-acquiring bank, which handles all correspondence related to an electronic payment transaction. Merchant accounts are essential for online business transactions; however, they involve added costs that the merchants have to pay the acquirers and/or the payment service providers.
The terms “payment network” and “card network” are used interchangeably throughout the description and refer to a network or collection of systems used for the transfer of funds through the use of cash substitutes. Payment networks may use a variety of different protocols and procedures in order to process the transfer of money for various types of transactions. Payment networks are companies that connect an issuing bank with an acquiring bank to facilitate online payment. Transactions that may be performed via a payment network may include product or service purchases, credit purchases, debit transactions, fund transfers, account withdrawals, etc. Payment networks may be configured to perform transactions via cash substitutes that may include payment cards, letters of credit, checks, financial accounts, etc. Examples of networks or systems configured to perform as payment networks include those operated by Mastercard®.
The term “payment card”, used throughout the description, refers to a physical or virtual card that may or may not be linked with a financial or payment account that may be presented to a merchant or any such facility to fund a financial transaction via the associated payment account. Examples of payment cards include, but are not limited to, debit cards, credit cards, prepaid cards, virtual payment numbers, virtual card numbers, forex cards, charge cards, e-wallet cards, and stored-value cards. Alternatively, or additionally, the payment card may be embodied in the form of data stored in a user device, where the data is associated with a payment account such that the data can be used to process the financial transaction between the payment account of the cardholder and a merchant’s financial account.
The term “payment transaction” refers to an agreement that is carried out between a buyer and a seller to exchange goods or services in exchange for assets in the form of a payment (e.g., cash, fiat-currency, digital asset, cryptographic currency, coins, tokens, etc.).
OVERVIEW
Various embodiments of the present disclosure provide methods, systems, electronic devices, and computer program products for capturing the willingness of cardholders to make repayments. In a specific embodiment, the server system may be embodied within a payment server associated with a payment network. The server system includes a processor and a memory. In a non-limiting implementation, the server system may receive an intent-to-pay propensity request from a financial service provider for an ongoing payment transaction. Herein, the ongoing payment transaction is at least one of a loan request transaction from a cardholder and a loan standing instruction (SI) transaction initiated by at least one financial service provider with an issuing bank of the cardholder. The intent-to-pay propensity request may indicate a request to determine a likelihood of the cardholder to perform one or more repayments to the financial service provider for the ongoing payment transaction. The intent-to-pay propensity request may include at least a unique account identifier (ID) and a ticket size for the ongoing payment transaction. In one embodiment, the server system may receive the intent-to-pay propensity request via a communication interface associated with the server system.
In one embodiment, the server system may be configured to access a cardholder-related dataset associated with the cardholder from a database associated with the server system. The server system may access the cardholder-related dataset based, at least in part, on the unique account ID. As may be understood, the unique account ID is linked to the cardholder and received from the financial service provider for the ongoing payment transaction. Further, in one embodiment, the cardholder-related dataset may include historical information corresponding to a plurality of payment transactions performed by the cardholder with at least one of at least one financial service provider and at least one merchant. In a non-limiting example, the historical information for each payment transaction may include: the authorization information, Three Domain Secure 2.0 (3DS2) data, Buy-now-pay-later (BNPL) transaction data, postpaid merchant transaction data, card Recurrent-Payment Cancellation Service (RPCS) data, and the like. Upon accessing the cardholder-related dataset, the server system may extract a plurality of features for each payment transaction and store the same in the database.
In one embodiment, the server system is configured to access the plurality of features for each payment transaction of the plurality of payment transactions from the database. The plurality of payment transactions may be performed by the cardholder with at least one of the at least one financial service provider and the at least one merchant. In one embodiment, the server system may access the features based, at least in part, on the unique account ID. The server system may further be configured to segregate the plurality of features into a set of authorization features and a set of non-authorization features for each payment transaction based, at least in part, on authorization information associated with the unique account ID.
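As a purely illustrative sketch (the field names and the rule for what counts as an authorization feature are hypothetical; the disclosure does not prescribe them), segregating one transaction's features might look like:

```python
# Hypothetical sketch: split per-transaction features into authorization and
# non-authorization sets. The field names below are illustrative only.
AUTH_FIELDS = {"response_code", "decline_reason", "auth_amount"}

def segregate_features(transaction_features):
    """Return (auth_features, non_auth_features) for one transaction."""
    auth = {k: v for k, v in transaction_features.items() if k in AUTH_FIELDS}
    non_auth = {k: v for k, v in transaction_features.items() if k not in AUTH_FIELDS}
    return auth, non_auth

txn = {"response_code": 0, "decline_reason": None,
       "merchant_category": "5411", "ticket": 120.0}
auth, non_auth = segregate_features(txn)
# auth keeps the authorization fields; non_auth keeps the rest.
```

In practice, the segregation rule would be driven by the authorization information associated with the unique account ID rather than a fixed field list.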
Upon segregation, the server system may be configured to generate a set of temporal embeddings for the plurality of payment transactions based, at least in part, on the set of authorization features corresponding to each payment transaction. Each temporal embedding in the set of temporal embeddings may be generated for each payment transaction. In a non-limiting implementation, the server system may generate the set of temporal embeddings using a trained temporal Machine Learning (ML)-based model associated with the server system.
In a non-limiting implementation, for generating the trained temporal ML-based model, the server system may further be configured to access a training set of authorization features for each payment transaction of the plurality of payment transactions for a training period from the database. Further, the server system may be configured to generate training sequential data for each cardholder based, at least in part, on the training set of authorization features and a time stamp associated with each payment transaction. The training sequential data may include one or more card-specific sequences of the plurality of payment transactions.
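The construction of card-specific sequences described above can be sketched as follows; the record layout is hypothetical:

```python
# Illustrative sketch: group timestamped authorization-feature vectors into
# one time-ordered sequence per card (unique account ID).
from collections import defaultdict

def build_sequences(records):
    """records: iterable of dicts with 'account_id', 'timestamp', 'features'."""
    sequences = defaultdict(list)
    for rec in records:
        sequences[rec["account_id"]].append((rec["timestamp"], rec["features"]))
    # Sort each card's transactions chronologically, then drop the timestamps.
    return {acc: [f for _, f in sorted(txns)] for acc, txns in sequences.items()}

records = [
    {"account_id": "PAN-1", "timestamp": 2, "features": [0.1, 1.0]},
    {"account_id": "PAN-1", "timestamp": 1, "features": [0.0, 0.5]},
    {"account_id": "PAN-2", "timestamp": 1, "features": [0.3, 0.0]},
]
seqs = build_sequences(records)
# seqs["PAN-1"] is the chronologically ordered sequence for that card.
```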
Further, the server system may train a temporal ML-based model for obtaining the trained temporal ML-based model based, at least in part, on performing a first set of operations iteratively till the performance of the temporal ML-based model converges to first predefined criteria. The first set of operations may include: (i) generating the temporal ML-based model based, at least in part, on one or more temporal model parameters; (ii) generating, via the temporal ML-based model, a training set of temporal embeddings for the training sequential data for each cardholder based, at least in part on the one or more temporal model parameters; (iii) generating, via the temporal ML-based model, a prediction for each payment transaction in the training sequential data based, at least in part, on the training set of temporal embeddings, the prediction corresponds to a hidden state of the temporal ML-based model; (iv) computing a temporal loss value for each payment transaction in the training sequential data using a temporal loss function based, at least in part, on the prediction and an actual outcome; and (v) optimizing the one or more temporal model parameters based, at least in part, on backpropagating the temporal loss value.
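A minimal stand-in for this iterative loop might look as below: a single-unit recurrent model whose hidden states play the role of the temporal embeddings, with finite-difference gradient steps standing in for the backpropagation of operation (v). The model, data, and hyperparameters are toy values, not the disclosure's actual architecture.

```python
import math

# Toy recurrent model: h_t = tanh(w*x_t + u*h_{t-1}), prediction y_t = v*h_t.
def run(params, xs):
    w, u, v = params
    h, preds, hidden = 0.0, [], []
    for x in xs:
        h = math.tanh(w * x + u * h)
        hidden.append(h)      # hidden states serve as the temporal embeddings
        preds.append(v * h)   # per-transaction prediction (operation (iii))
    return preds, hidden

def loss(params, xs, ys):
    # Mean squared temporal loss against actual outcomes (operation (iv)).
    preds, _ = run(params, xs)
    return sum((p - y) ** 2 for p, y in zip(preds, ys)) / len(ys)

def train(xs, ys, steps=300, lr=0.1, eps=1e-4):
    params = [0.5, 0.1, 0.5]          # initial temporal model parameters
    for _ in range(steps):            # iterate toward the predefined criteria
        grads = []
        for i in range(len(params)):  # finite-difference stand-in for backprop
            bumped = params[:]
            bumped[i] += eps
            grads.append((loss(bumped, xs, ys) - loss(params, xs, ys)) / eps)
        params = [p - lr * g for p, g in zip(params, grads)]  # operation (v)
    return params

xs = [0.0, 0.5, 1.0, 1.0]   # toy authorization-feature sequence for one card
ys = [0.0, 0.2, 0.6, 0.8]   # toy per-transaction actual outcomes
trained = train(xs, ys)
final_loss = loss(trained, xs, ys)
```

A production implementation would instead use an established sequence model (e.g., an LSTM) trained by true backpropagation through time over many card-specific sequences.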
Furthermore, for the server system to generate the set of temporal embeddings, the server system may be configured to generate sequential data for the cardholder based, at least in part, on the set of authorization features corresponding to each payment transaction and a time stamp associated with each payment transaction performed by the cardholder for a predefined period. The sequential data may include one or more card-specific sequences of the plurality of payment transactions performed by the cardholder within the predefined period. The server system may process the sequential data using the trained temporal ML-based model. Finally, the server system may extract the set of temporal embeddings from the at least one hidden layer of the trained temporal ML-based model, each temporal embedding indicating a pattern of the sequential data.
Moreover, the server system may be configured to generate a set of combined embeddings for the plurality of payment transactions based, at least in part, on concatenating the set of temporal embeddings, the set of non-authorization features for each payment transaction, and a ticket size of the ongoing payment transaction. The server system may further be configured to compute an intent-to-pay score of the cardholder for the ongoing payment transaction based, at least in part, on the set of combined embeddings. Herein, the intent-to-pay score may indicate a likelihood of the cardholder to perform one or more repayments to the financial service provider for the ongoing payment transaction. In a non-limiting implementation, the server system may compute the intent-to-pay score using a trained intent-to-pay ML-based model associated with the server system.
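The concatenation and scoring steps can be sketched as below; the logistic readout is merely a stand-in for the trained intent-to-pay ML-based model, and all weights and feature values are toy numbers:

```python
import math

def combined_embeddings(temporal, non_auth, ticket_size):
    # Concatenate each transaction's temporal embedding with its
    # non-authorization features and the ongoing transaction's ticket size.
    return [t + n + [ticket_size] for t, n in zip(temporal, non_auth)]

def intent_to_pay_score(combined, weights, bias=0.0):
    # Average a linear score over transactions and squash into (0, 1).
    z = sum(sum(w * x for w, x in zip(weights, row)) for row in combined)
    z = z / len(combined) + bias
    return 1.0 / (1.0 + math.exp(-z))

temporal = [[0.4], [0.6]]            # per-transaction temporal embeddings
non_auth = [[0.1, 0.0], [0.2, 1.0]]  # per-transaction non-auth features
combined = combined_embeddings(temporal, non_auth, ticket_size=0.5)
score = intent_to_pay_score(combined, weights=[1.0, 0.5, -0.5, 0.2])
```

The resulting score lies in (0, 1), matching its interpretation as a likelihood of repayment.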
In a non-limiting implementation, for generating the trained intent-to-pay ML-based model, the server system may further be configured to train an intent-to-pay ML-based model based, at least in part, on performing a second set of operations iteratively till the performance of the intent-to-pay ML-based model converges to second predefined criteria. The second set of operations may include: (i) generating the intent-to-pay ML-based model based, at least in part, on one or more intent-to-pay model parameters; (ii) generating, by the intent-to-pay ML-based model, an intent-to-pay score for each payment transaction based, at least in part, on a training dataset of the set of combined embeddings and the one or more intent-to-pay model parameters; (iii) generating, via the intent-to-pay ML-based model, a predicted outcome, based at least in part, on the intent-to-pay score; (iv) computing an intent-to-pay loss value for each payment transaction using an intent-to-pay loss function based, at least in part, on the predicted outcome and an actual outcome; and (v) optimizing the one or more intent-to-pay model parameters based, at least in part, on backpropagating the intent-to-pay loss value.
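As a hedged sketch of this second loop, consider a one-layer logistic model over toy combined embeddings, for which backpropagation reduces to a single gradient expression; all data, dimensions, and hyperparameters are illustrative:

```python
import math

def predict(weights, x):
    z = sum(w * xi for w, xi in zip(weights, x))
    return 1.0 / (1.0 + math.exp(-z))

def log_loss(weights, X, y):
    # Intent-to-pay loss over the training set (operation (iv)).
    return -sum(yi * math.log(predict(weights, xi)) +
                (1 - yi) * math.log(1.0 - predict(weights, xi))
                for xi, yi in zip(X, y)) / len(y)

def train_intent_model(X, y, lr=0.5, max_iters=500, tol=1e-6):
    weights = [0.0] * len(X[0])        # intent-to-pay model parameters
    prev = float("inf")
    for _ in range(max_iters):
        loss = log_loss(weights, X, y)
        if prev - loss < tol:          # the "second predefined criteria"
            break
        prev = loss
        grads = [sum((predict(weights, xi) - yi) * xi[j]
                     for xi, yi in zip(X, y)) / len(y)
                 for j in range(len(weights))]
        weights = [w - lr * g for w, g in zip(weights, grads)]  # operation (v)
    return weights

# Toy combined embeddings: [bias term, temporal signal, decline signal]
X = [[1.0, 0.9, 0.1], [1.0, 0.8, 0.2], [1.0, 0.2, 0.9], [1.0, 0.1, 0.8]]
y = [1, 1, 0, 0]   # 1 = repaid, 0 = defaulted (toy labels)
weights = train_intent_model(X, y)
```

A deeper network would follow the same iterative pattern, with the gradient step replaced by full backpropagation of the intent-to-pay loss.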
In one embodiment, the server system may transmit the intent-to-pay score of the cardholder to the financial service provider once the intent-to-pay score is computed for the ongoing payment transaction. The server system may transmit the intent-to-pay score via the communication interface. In some embodiments, the server system may also be configured to label the cardholder as a willing-to-pay cardholder for the ongoing payment transaction based, at least in part, on the intent-to-pay score of the cardholder being at least equal to a predefined intent threshold. Alternatively, the server system may be configured to label the cardholder as a not-willing-to-pay cardholder for the ongoing payment transaction based, at least in part, on the intent-to-pay score of the cardholder being less than the predefined intent threshold.
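The labeling rule in this paragraph can be sketched directly; the threshold value below is hypothetical and would in practice be the predefined intent threshold configured for the deployment:

```python
# Illustrative labeling of a cardholder from the intent-to-pay score.
INTENT_THRESHOLD = 0.6  # hypothetical predefined intent threshold

def label_cardholder(intent_to_pay_score):
    # A score "at least equal to" the threshold -> willing-to-pay.
    if intent_to_pay_score >= INTENT_THRESHOLD:
        return "willing-to-pay"
    return "not-willing-to-pay"

print(label_cardholder(0.72))  # willing-to-pay
print(label_cardholder(0.60))  # willing-to-pay (boundary case)
print(label_cardholder(0.41))  # not-willing-to-pay
```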
Further, in some cases, the server system may generate one or more recommendations for the financial service provider based, at least in part, on the intent-to-pay score of the cardholder for the ongoing payment transaction. The one or more recommendations may include a suggestion to the financial service provider to provide a much lower ticket size option to the cardholder having the intent-to-pay score less than a predefined intent threshold.
Various embodiments of the present disclosure offer multiple advantages and technical effects. For instance, the present disclosure aims to solve the technical problem of accurately capturing the intent of the cardholders to perform repayments both during on-boarding and between the installment repayment periods. The present disclosure utilizes a unique account ID (e.g., PAN) of the payment card used by the cardholder to understand the historical behavior of the cardholder with different merchants and different financial service providers (including lenders). For instance, the historical behavior may provide insights into how many NSF-related declines in recurring payments the cardholder encounters. Similarly, Recurrent Payment Cancellation Service (RPCS) data provides insights into historical instances where the cardholder has explicitly instructed to stop recurring payments to merchants. Insights into the number of open loans (such as BNPL loans) the cardholder is repaying can also be gathered, which may otherwise not be visible to the financial service provider. Hence, once the PAN of the payment card using which the cardholder will potentially be paying back is known, the history of that card’s transactions can be utilized towards understanding the cardholder’s ability and intent/willingness to repay. The present approach utilizes an AI/ML model to quantify the intent of the cardholder into a score that can be easily understood by the financial service provider. This score can then be used by the financial service provider to determine whether to disburse the loan amount to the cardholder for making the purchase. Additionally, for an ongoing loan, this score may be used by the financial service provider to determine whether to send additional early reminders to the cardholders for repayments.
In addition, the present disclosure supports financial service providers who do not have access to the credit history of the cardholders. Further, the present disclosure is also applicable for POS, ECOM, recurring payment transactions, and the like. Moreover, as the score is dependent on the ticket size of the loan amount, the most preferred ticket size of the cardholder can be determined by the financial service provider. For instance, it can be determined that the cardholder is most likely to pay for a ticket of 30 dollars and is less likely to pay when the ticket size goes beyond 50 dollars. Based on this, recommendations can be provided to the financial service provider suggesting that if the cardholder is not willing to pay for a particular ticket size, then a much lower ticket size option may be provided to the cardholder.
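By way of a non-limiting illustration, the determination of a most preferred ticket size may be sketched as follows, assuming the intent-to-pay score can be evaluated as a function of candidate ticket sizes; the helper names are hypothetical.

```python
def preferred_ticket_size(score_fn, candidate_ticket_sizes, intent_threshold=0.5):
    """Return the largest candidate ticket size for which the cardholder's
    intent-to-pay score stays at or above the intent threshold, or None
    if no candidate qualifies. `score_fn(ticket_size)` stands in for a
    call to the trained intent-to-pay model."""
    acceptable = [t for t in sorted(candidate_ticket_sizes)
                  if score_fn(t) >= intent_threshold]
    return acceptable[-1] if acceptable else None
```

For instance, a score function that drops sharply beyond a 50-dollar ticket would yield 50 as the preferred ticket size, matching the 30-versus-50-dollar example above.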
Also, it may be noted that the approach of the present disclosure is not only applicable to BNPL loans but also to other types of loans with a higher ticket size, different installment periods, different repayment durations, different industries, and the like. Moreover, the approach of the present disclosure is applicable to different types of payment cards, such as debit cards, credit cards, etc. The approach of the present disclosure is also applicable to different payment channels, such as an API, a query via a payment card number, etc. Additionally, the approach of the present disclosure also captures recent decline trends, newer open credit lines, higher on-boarding frictions, and the like, along with capturing the intent to pay.
Various example embodiments of the present disclosure are described hereinafter with reference to FIGS. 1 to 9.
FIG. 1 illustrates a schematic representation of an environment 100 related to at least some example embodiments of the present disclosure. Although the environment 100 is presented in one arrangement, other embodiments may include the parts of the environment 100 (or other parts) arranged otherwise depending on, for example, receiving an intent-to-pay propensity request, determining an intent-to-pay score based on the unique account ID, a ticket size of an ongoing payment transaction, and a time slot for which the intent-to-pay score and/or an intent-to-pay propensity may be needed, transmitting the intent-to-pay score, and the like.
The environment 100 generally includes a plurality of entities, such as a server system 102, a plurality of cardholders 104(1), 104(2), … 104(N) (collectively referred to hereinafter as a ‘plurality of cardholders 104’ or simply, ‘cardholders 104’), a plurality of merchants 106(1), 106(2), … 106(N) (collectively referred to hereinafter as a ‘plurality of merchants 106’ or simply, ‘merchants 106’), a plurality of issuer servers 108(1), 108(2), …, 108(N) (collectively referred to hereinafter as a ‘plurality of issuer servers 108’ or simply ‘issuer servers 108’), a plurality of acquirer servers 110(1), 110(2), …, 110(N) (collectively referred to hereinafter as a ‘plurality of acquirer servers 110’ or simply ‘acquirer servers 110’), a payment network 112 including a payment server 114, a database 116, a plurality of financial service providers 118(1), 118(2), …, 118(N) (collectively referred to hereinafter as a ‘plurality of financial service providers 118’ or simply, ‘financial service providers 118’) associated with a plurality of electronic devices 120(1), 120(2), … 120(N) (collectively referred to hereinafter as a ‘plurality of electronic devices 120’ or simply ‘electronic devices 120’), respectively, a plurality of financial service provider banks 122(1), 122(2), … 122(N) (collectively referred to hereinafter as a ‘plurality of financial service provider banks 122’ or simply ‘financial service provider banks 122’) each coupled to, and in communication with (and/or with access to) a network 124. Herein, it may be noted that ‘N’ is a non-zero natural number and may be different for each distinct entity. 
The network 124 may include, without limitation, a Light Fidelity (Li-Fi) network, a Local Area Network (LAN), a Wide Area Network (WAN), a Metropolitan Area Network (MAN), a satellite network, the Internet, a fiber optic network, a coaxial cable network, an infrared (IR) network, a Radio Frequency (RF) network, a virtual network, and/or another suitable public and/or private network capable of supporting communication among two or more of the parts or users illustrated in FIG. 1, or any combination thereof.
Various entities in the environment 100 may connect to the network 124 in accordance with various wired and wireless communication protocols, such as Transmission Control Protocol and Internet Protocol (TCP/IP), User Datagram Protocol (UDP), 2nd Generation (2G), 3rd Generation (3G), 4th Generation (4G), 5th Generation (5G) communication protocols, Long Term Evolution (LTE) communication protocols, New Radio (NR) communication protocol, any future communication protocol, or any combination thereof. In some instances, the network 124 may utilize a secure protocol (e.g., Hypertext Transfer Protocol Secure (HTTPS), Secure Socket Layer (SSL), and/or any other protocol or set of protocols) for communicating with the various entities depicted in FIG. 1.
In an embodiment, a cardholder (e.g., the cardholder 104(1)) may be any individual, representative of a corporate entity, a non-profit organization, or any other person who is presenting payment account details during an electronic payment transaction. The cardholder (e.g., the cardholder 104(1)) may have a payment account issued by an issuing bank (not shown in figures) associated with an issuer server (e.g., the issuer server 108(1)) and may be provided with a payment card (e.g., payment card 126) with financial or other account information encoded onto the payment card 126 such that the cardholder (i.e., the cardholder 104(1)) may use the payment card 126 to initiate and complete a payment transaction using a bank account at the issuing bank. In one embodiment, the financial account information associated with the payment card 126 may be referred to as payment card information which may include, but is not limited to, a unique payment card number, a unique account Identifier (ID) such as a Primary Account Number (PAN), a cardholder name, an expiration date, and the like. It should be noted that the payment card information may be stored in the database 116 as a cardholder-related dataset.
In another embodiment, the cardholders 104 may use their corresponding cardholder electronic devices (not shown in figures) to access a mobile application or a website associated with the issuing bank, or any third-party payment application to perform a payment transaction. In various non-limiting examples, the cardholder electronic devices may refer to any electronic devices, such as, but not limited to, Personal Computers (PCs), tablet devices, smart wearable devices, Personal Digital Assistants (PDAs), voice-activated assistants, Virtual Reality (VR) devices, smartphones, laptops, and the like.
In an embodiment, the merchants 106 may include retail shops, restaurants, supermarkets or establishments, government and/or private agencies, or any such places equipped with Point-of-sale (POS) terminals, where the cardholders 104 visit to perform financial transactions in exchange for any goods and/or services or any financial transactions. In an embodiment, the merchants 106 are generally associated with financial institutions such as acquiring banks that are associated with the acquirer servers 110. Herein, the acquiring bank can be an institution that facilitates the processing of payment transactions for physical stores, merchants, or institutions that own platforms that make either online purchases or purchases made via software applications possible.
In one scenario, the cardholders 104 may use their corresponding payment accounts to conduct payment transactions with the merchants 106. Moreover, it may be noted that each of the cardholders 104 may use their corresponding payment cards differently or make the payment transaction using different means of payment, such as net banking, Unified Payments Interface (UPI) payment, card transaction, cheque transaction, etc. For instance, the cardholder 104(1) may enter payment account details on an electronic device (not shown) associated with the cardholder 104(1) to perform an online payment transaction. In another instance, the cardholder 104(2) may utilize a payment card to perform an offline payment transaction. In yet another instance, the cardholder 104(3) may enter details of the payment card to transfer funds in the form of fiat currency on an e-commerce platform to buy goods.
In some embodiments, as the cardholders 104 are involved in making purchases with the corresponding merchants 106, if the ticket size of a payment transaction is large, some of the cardholders 104 might not perform the corresponding payment transaction due to a lack of funds in their financial accounts. To address such a situation, the merchants 106 can provide an additional payment option to the cardholders 104 on their merchant platforms to make the payment in installments over a period of time. To provide such payment options, the merchants 106 can partner with one or more financial service providers that may lend money to the cardholders 104 for performing the payment in full, while requiring the cardholders 104 to return the money to them in installments. This way, the cardholders 104 are able to purchase products that they may not be able to afford upfront. It should be noted that although the terms ‘financial service provider’ and ‘lender’ are used interchangeably throughout the present disclosure, the term ‘financial service provider’ may refer to other forms of suitable financial institutions as well.
In an embodiment, the lenders 118 may include merchants, Financial Technology (FinTech) institutions, banks, non-banking financial companies (NBFC), issuers, postpaid merchants, or any financial institution that provides credit services to the merchants and the cardholders, without limiting the scope of the invention. In a non-limiting example, the credit services may include, but are not limited to, point-of-sale financing services, Buy Now Pay Later (BNPL) services, structured loan services, and the like. Further, in an embodiment, the electronic devices 120 that are associated with the corresponding lenders 118 may include electronic devices, such as, but not limited to, PCs, tablet devices, smart wearable devices, PDAs, voice-activated assistants, VR devices, smartphones, laptops, and the like.
In one embodiment, the lenders 118 may provide such a lending service to the cardholders via the merchant platforms. In such an embodiment, the merchants 106 pay service charges to the lenders 118 in return for the add-on of the lending service provided by the lenders 118 on the merchant platforms. In another embodiment, the lenders 118 may provide the lending service to the cardholders directly via lending platforms. In either of the scenarios, most of the lenders 118 in the lending industry make additional profit by charging interest on the payment transactions made by the lenders 118 to the merchants 106 on behalf of the cardholders 104. Such a payment transaction is considered a loan amount given to the cardholders 104 by the lenders 118. For instance, when the cardholder 104(1) uses a credit card to make the payment for the purchase made at the merchant 106(1), the issuing bank makes the complete payment to the acquiring bank of the merchant 106(1). Later, the cardholder 104(1) pays the issuing bank in installments, i.e., Equated Monthly Installments (EMIs), along with a predefined percentage (%) of interest, such as about 8-12%, over the principal amount (i.e., the loan amount corresponding to the purchase made by the cardholder 104(1)).
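By way of a non-limiting illustration, the installment amount described above may be computed with the standard EMI formula, EMI = P * r * (1+r)^n / ((1+r)^n - 1), where P is the principal, r is the monthly interest rate, and n is the number of monthly installments; the 8-12% figure in the example above is an annual rate assumption.

```python
def equated_monthly_installment(principal, annual_rate_pct, months):
    """Standard EMI formula: P * r * (1+r)^n / ((1+r)^n - 1),
    with r the monthly rate derived from the annual percentage rate."""
    r = annual_rate_pct / 100.0 / 12.0
    if r == 0:
        # interest-free case (e.g., a BNPL-style plan)
        return principal / months
    factor = (1.0 + r) ** months
    return principal * r * factor / (factor - 1.0)
```

For example, a 1000-unit principal at 12% annual interest over 12 months yields an EMI of roughly 88.85 units.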
In recent times, as mentioned earlier, the focus has shifted to interest-free installment-based services such as Buy Now Pay Later (BNPL) services. The terms “Buy Now Pay Later service” and “BNPL” are used interchangeably throughout the description and refer to a service that offers cardholders the ability to purchase an item and pay the amount due over a period of time. A credit check could be necessary for this, but most merchants will not ask for one or will just perform a ‘soft inquiry’ that will not have an impact on a credit score of the cardholders 104. It should be noted that the lenders 118 may facilitate the BNPL service on the lending platforms as preapproved offers before the purchase is made by the cardholders 104 or at a merchant checkout as real-time offers during purchase on the merchant platforms. Merchants sign a contract with the lenders or the BNPL service providers to integrate the BNPL service option in a checkout page of the merchant platforms. In most cases, the BNPL service is free for the cardholders 104; however, the merchants 106 must pay a fee for every payment transaction. Nonetheless, the benefits that the merchants 106 gain from such services are far greater than the fees they have to pay. For instance, it increases sales of the merchants 106 as it allows the cardholders 104 to make purchases that they may not be able to afford upfront, increases the average ticket size of purchases, improves cash flow, boosts customer loyalty, and the like.
However, due to high default rates, high decline rates, high NSF-based decline rates, and the like, that are experienced by cardholders (e.g., the cardholders 104) that choose BNPL services either on the merchant platforms or on the lending platforms, the merchants (e.g., merchants 106) and the lenders (e.g., the lenders 118) face continued loss in their profits. Therefore, there is a need to check for the willingness (otherwise, also referred to as ‘intent’) of the cardholders 104 either while on-boarding the cardholders 104 or between repayments and/or installments to be paid by the cardholders 104. By determining the willingness of the cardholders 104, the BNPL providers, such as the merchants 106 and/or the lenders 118 can take the necessary actions upfront or during each repayment. As used herein, the term “BNPL provider” refers to a service provider that gives an additional payment option to cardholders at checkout i.e., to facilitate the payment of checkout amount in installments over a period of time.
In one non-limiting implementation, the BNPL provider (e.g., the lender 118(1)) pays a merchant (e.g., the merchant 106(1)) in full upon completion of checkout at a merchant’s platform. After that, the cardholders (e.g., cardholder 104(1)) can perform a number of installment payments to the BNPL provider due to the availability of a BNPL service option on the merchant’s platform.
In another non-limiting implementation, the BNPL provider (e.g., the lender 118(1)) may launch such an additional payment option on their lending platform rather than on the merchant’s platform. This option can be accessed by the cardholders 104 while making a purchase at a merchant (e.g., the merchant 106(1)) by visiting the lending platform and selecting the corresponding option. Upon selecting the corresponding option, the BNPL provider makes the complete payment to the merchant 106(1) and facilitates the cardholder 104(1) to pay the same amount in small installments to the BNPL provider.
Further, the term “repayment” refers to the amount of money to be paid back to a lender by a borrower. Herein, the amount may be paid back either completely or in installments. Thus, throughout the description, the term “repayment” may be used to represent the total loan amount that is supposed to be paid back to the lender or the smaller installment amount that is less when compared to a ticket size of the total loan amount to be paid to the lender at constant time intervals until the total loan amount is paid back. Thus, in one embodiment, determining the willingness of the cardholders 104 to make repayments may correspond to determining the willingness of the cardholders 104 to make a repayment of the total loan amount taken by the cardholders 104. In another embodiment, determining the willingness of the cardholders 104 to make repayments may correspond to determining the willingness of the cardholders 104 to make a repayment of each installment amount associated with the total loan amount.
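By way of a non-limiting illustration, splitting a total loan amount into smaller installment repayments due at constant time intervals may be sketched as follows; the 30-day interval and the rounding convention (last installment absorbs the remainder) are assumptions.

```python
from datetime import date, timedelta

def installment_schedule(total_loan_amount, num_installments, start, interval_days=30):
    """Split the total loan amount into equal repayments due at constant
    time intervals; the last installment absorbs any rounding remainder
    so the installments sum back to the total loan amount."""
    base = round(total_loan_amount / num_installments, 2)
    schedule = []
    for i in range(num_installments):
        if i < num_installments - 1:
            amount = base
        else:
            amount = round(total_loan_amount - base * (num_installments - 1), 2)
        schedule.append((start + timedelta(days=interval_days * (i + 1)), amount))
    return schedule
```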
The above-mentioned technical problems, among other problems, are addressed by one or more embodiments implemented by the server system 102 and the methods thereof provided in the present disclosure. In an embodiment, it may be noted that the methods and systems proposed in the present disclosure can be used in any domain or industry to capture the intent of the cardholder 104(1) to make the repayments having different ticket sizes. However, for the sake of explanation, analysis, and performance comparison, the example of using the proposed system in the payment industry that involves providing BNPL services is considered in the present disclosure. On that note, the present disclosure is applicable to various credit providers with different ticket sizes, different installment periods, and different repayment durations, such as virtual rent-to-own, off-card financing, card-linked installments, focused large ticket verticals, and the like. Further, the present disclosure may be applicable in various industries, such as, but not limited to, the apparel industry, footwear, fitness, electronics, furniture, mattresses, travel, home goods, sports, etc. In one embodiment, the server system 102 is configured to perform one or more of the operations described herein.
In a specific embodiment, the server system 102 may facilitate payment processors (such as Mastercard®) to determine the willingness of the cardholders 104 to perform the repayments by computing an intent-to-pay score of the cardholders 104 for an ongoing payment transaction while training one or more Artificial Intelligence (AI) or Machine Learning (ML) models to perform a specific classification task. In an embodiment, for determining the willingness of the cardholder 104(1), the server system 102 may compute an intent-to-pay score of the corresponding cardholder 104(1) for the ongoing payment transaction based at least on processing details of the corresponding cardholder 104(1) available at the payment processors. Moreover, in a non-limiting implementation, the server system 102 may generate the intent-to-pay score upon receiving a request for the same from the lenders (e.g., the lender 118(1)). The server system 102 may transmit the intent-to-pay score to the lender 118(1) on their electronic device (e.g., the electronic device 120(1)). The lender 118(1) may generate such a request for one or more of the repayments to be performed by the cardholder 104(1) at the merchants 106 either during on-boarding or between the installment payments.
It should be noted that, in the present disclosure, for simplicity of explanation, a scenario is considered where the lender 118(1) requests the intent-to-pay score of the cardholder 104(1) for an ongoing payment transaction, either while on-boarding the cardholder 104(1) or in between installment payments, from an entity where the methods and systems proposed in the present disclosure are deployed. In an embodiment, the entity may correspond to the payment processors. However, the server system 102 may also generate the intent-to-pay score upon receiving a similar request from the merchants 106, the issuers 108, other BNPL providers, other lenders, and the like.
Further, upon receiving the intent-to-pay score, the lender 118(1) may take necessary actions upfront or during each repayment. In a non-limiting implementation, the necessary actions that the lender 118(1) can take based on the intent-to-pay score of the cardholder 104(1) may include, but are not limited to: deciding, while on-boarding cardholders (especially new cardholders), whether or not to disburse the loan; providing higher credit to credit-worthy cardholders who actually have the intention to make the repayments; assisting in better liquidity management; performing preventive rather than reactive measures for loan reconciliation; discontinuing credit lines if the intent-to-pay of a cardholder is found to be low; and the like.
In a specific embodiment, the server system 102 may be configured to receive an intent-to-pay propensity request from the lender 118(1) for an ongoing payment transaction via the electronic device 120(1). The intent-to-pay propensity request may indicate a request to determine a propensity corresponding to the likelihood of the cardholder 104(1) to perform the one or more repayments to the lender 118(1) for the ongoing payment transaction. The ongoing payment transaction may correspond to a payment transaction corresponding to a purchase made by the cardholder 104(1) which is performed by the lender 118(1) to the merchant 106(1) on behalf of the cardholder 104(1). Thus, the one or more repayments may correspond to one or more installment payments to be made by the cardholder 104(1) to the lender 118(1). Further, the intent-to-pay propensity request may include at least a unique account ID (e.g., PAN) corresponding to a payment card (e.g., the payment card 126) used by the cardholder 104(1) to perform the one or more repayments and a ticket size related to the ongoing payment transaction. In one embodiment, the intent-to-pay propensity request may further include other payment card information as mentioned above.
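By way of a non-limiting illustration, the contents of an intent-to-pay propensity request may be modeled as follows; the field names are illustrative and do not represent a normative message format.

```python
from dataclasses import dataclass, asdict

@dataclass
class IntentToPayPropensityRequest:
    """Minimal shape of the request sent by a lender for an ongoing
    payment transaction; field names are illustrative only."""
    unique_account_id: str   # e.g., PAN of the payment card used for repayments
    ticket_size: float       # ticket size of the ongoing payment transaction
    time_slot: str = ""      # optional time slot for which the score is needed

def validate_request(req):
    """Basic sanity check before the server system processes the request."""
    return bool(req.unique_account_id) and req.ticket_size > 0
```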
Later, upon receiving the intent-to-pay propensity request, the server system 102 may access the cardholder-related dataset from the database 116 based, at least in part, on the unique account ID of the cardholder 104(1). In one embodiment, the cardholder-related dataset may include historical information corresponding to a plurality of payment transactions performed by the cardholder 104(1) with at least one of at least one financial service provider (e.g., the financial service provider 118(1)) and at least one merchant (e.g., the merchant 106(1)). In a non-limiting example, the historical information may include, but is not limited to, authorization data, Three Domain Secure 2.0 (or 3DS2) data, BNPL/postpaid transaction data, card Recurrent Payment Cancellation Service (RPCS) data, and the like related to the cardholder 104(1).
In one embodiment, the authorization data may include data related to historical payment transactions performed by the cardholder 104(1) using a particular payment card such as the payment card 126 for a predefined interval. The predefined interval may correspond to the last one year, last five years, last three months, last one month, last 7 days, last three days, or the like. The authorization data may include transaction time, transaction amount, a payment cycle period, card PAN, name associated with a financial account linked to the payment card, card expiry date, account balance, card type, and the like. From the authorization data, variation in a payment transaction pattern associated with the payment card 126 with time can be captured.
Further, the 3DS-2 data may include a cardholder email ID, multiple payment cards having the same email ID, a count of approved payment transactions performed using the payment card 126, a count of declined payment transactions, payment transactions performed using other payment cards, card details of different payment cards owned by the cardholder 104(1), and the like. From the 3DS-2 data, the behavior of payment cards can be observed and compared based on payment transactions performed by the corresponding payment cards. Further, based on the 3DS-2 data, it can be determined that if one payment card behaves in a certain way, then other payment cards that have the same email ID and/or PAN may behave in the same way.
In one embodiment, the BNPL/postpaid transaction data may include an amount associated with BNPL transactions, a count of BNPL transactions, a count of open credit lines, a number of times an installment was not paid, and the like. From the BNPL/postpaid transaction data, the expected behavior of the cardholder for the current BNPL transaction can be predicted based on the behavior of the cardholder at different BNPL merchants and/or postpaid merchants. Also, it can be determined whether or not the cardholder has a history of paying back money on time.
Further, the RPCS data may include information related to any recurring service that the cardholder has opted for, whether any such recurring service was canceled in the past, and the like. From the RPCS data, it can be determined whether the cardholder has canceled a particular recurring service even while having sufficient balance in their financial account, which may indicate that the cardholder had no intention of making the payment and hence should be given less priority when sanctioning any kind of loan in the future.
In an example, the historical information may include, but is not limited to, transaction attributes, such as transaction amount, source of funds such as bank accounts, debit cards or credit cards, transaction channel used for loading funds such as Point-of-sale (POS) terminal or Automated Teller Machine (ATM), transaction velocity features such as count and transaction amount sent in the past ‘x’ number of days to a particular user, transaction location information, external data sources, merchant country, merchant Identifier (ID), cardholder ID, cardholder product, cardholder PAN, Merchant Category Code (MCC), merchant location data or merchant co-ordinates, merchant industry, merchant super industry, and other transaction-related data.
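By way of a non-limiting illustration, transaction velocity features such as the count and transaction amount sent in the past ‘x’ number of days may be computed as follows; the data layout (a list of date-amount pairs) is an assumption.

```python
from datetime import date, timedelta

def velocity_features(transactions, as_of, window_days):
    """Count and total amount of transactions within the past `window_days`
    ending at `as_of`. `transactions` is a list of
    (transaction_date, amount) tuples."""
    cutoff = as_of - timedelta(days=window_days)
    in_window = [amount for txn_date, amount in transactions
                 if cutoff < txn_date <= as_of]
    return len(in_window), sum(in_window)
```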
In various non-limiting examples, the database 116 may include one or more Hard Disk Drives (HDD), Solid-State Drives (SSD), an Advanced Technology Attachment (ATA) adapter, a Serial ATA (SATA) adapter, a Small Computer System Interface (SCSI) adapter, a Redundant Array of Independent Disks (RAID) controller, a Storage Area Network (SAN) adapter, a network adapter, and/or any component providing the server system 102 with access to the database 116. In one implementation, the database 116 may be viewed, accessed, amended, updated, and/or deleted by an administrator (not shown) associated with the server system 102 through a database management system (DBMS) or relational database management system (RDBMS) present within the database 116.
In other various examples, the database 116 may also include multifarious data, for example, social media data, Know Your Customer (KYC) data, payment data, trade data, employee data, Anti Money Laundering (AML) data, market abuse data, Foreign Account Tax Compliance Act (FATCA) data, and fraudulent payment transaction data. In addition, the database 116 provides a storage location for data and/or metadata obtained from various operations performed by the server system 102.
Further, it may be noted that, in a specific example, the server system 102 coupled with the database 116 is embodied within a payment server associated with the payment processor; however, in other examples, the server system 102 can be a standalone component (acting as a hub) connected to the issuer servers and the acquirer servers. The database 116 may be incorporated in the server system 102, may be an individual entity connected to the server system 102, or may be a database stored in cloud storage.
Further, in one embodiment, the server system 102 may train one or more AI or ML models for generating the intent-to-pay score for the cardholder 104(1) for the ongoing payment transaction based on the cardholder-related dataset. Thus, the one or more AI or ML models may also be stored in the database 116. In a non-limiting implementation, the one or more AI or ML models may include a temporal ML-based model and an intent-to-pay ML-based model. Further, for training the one or more AI or ML models, the cardholder-related dataset may have to be prepared which involves pre-processing the cardholder-related dataset. One of the pre-processing steps includes extracting or generating features for each payment transaction from the cardholder-related dataset. Thus, in one embodiment, the server system 102 is further configured to extract a plurality of features from the cardholder-related dataset associated with each payment transaction and store the features corresponding to the unique account ID in the database 116 for future use. The process of training the one or more AI or ML models is described in further parts of the present disclosure.
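By way of a non-limiting illustration, the two-stage scoring flow (temporal embeddings generated from authorization features, combined with non-authorization features and the ticket size, then scored by the intent-to-pay model) may be sketched as follows; the stand-in models are placeholders, not the disclosed architecture.

```python
import numpy as np

def intent_to_pay_pipeline(auth_features_seq, non_auth_features, ticket_size,
                           temporal_model, intent_model):
    """Sketch of the scoring flow: (1) the temporal model turns the
    sequence of authorization features into a temporal embedding,
    (2) the embedding is combined with non-authorization features and
    the ticket size of the ongoing payment transaction, and (3) the
    intent-to-pay model maps the combined embedding to a score."""
    temporal_embedding = temporal_model(auth_features_seq)
    combined = np.concatenate([temporal_embedding,
                               np.asarray(non_auth_features, dtype=float),
                               [float(ticket_size)]])
    return intent_model(combined)
```

Any callables with compatible shapes can stand in for the two trained models when exercising the flow.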
In a non-limiting implementation, the server system 102 is configured to access the plurality of features for each payment transaction from a plurality of payment transactions performed by the cardholder 104(1) with at least one of at least one financial service provider (e.g., the lender 118(1)) and at least one merchant (e.g., the merchant 106(1)) from the database 116 based, at least in part, on the unique account ID received from the lender 118(1) for the ongoing payment transaction. In one embodiment, the plurality of features may include authorization-related card features, 3DS-2-related card features, BNPL/postpaid merchants-related card features, RPCS-related card features, and the like.
In one embodiment, the authorization-related card features may include, but are not limited to, amount and count related to: a plurality of payment transactions performed by the cardholder, a plurality of declines, a plurality of approvals, a plurality of DNH declines, a plurality of NSF declines, and the like. In another embodiment, the authorization-related card features may include, but are not limited to, features across different channels, such as card present (CP), card not present (CNP), E-commerce, recurring, domestic, cross border, and the like. In yet another embodiment, the authorization-related card features may include, but are not limited to, features across different time frames, such as previous one, three, seven, fourteen, twenty-eight, etc., days, previous three, six, nine, etc., months, previous one, three, seven, fourteen, etc., payment transactions, and the like. Further, in yet another embodiment, the authorization-related card features may include, but are not limited to, a count of days since the last NSF decline, approval amount since the last NSF decline, average approval amount between NSFs, and the like. For example, the last fourteen days' CNP and NSF count of payment transactions, the last three months recurring approval amount, and the like.
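By way of a non-limiting illustration, features such as the count of days since the last NSF decline and the approval amount since the last NSF decline may be derived from an authorization event stream as follows; the event layout (day index, event kind, amount) is an assumption.

```python
def nsf_decline_features(events, as_of_day):
    """`events` is a chronological list of (day_index, kind, amount) where
    kind is 'approval' or 'nsf_decline'. Returns (days since the last NSF
    decline, approval amount since that decline); (None, 0.0) if no NSF
    decline has been seen."""
    last_nsf_day = None
    approved_since = 0.0
    for day, kind, amount in events:
        if kind == "nsf_decline":
            last_nsf_day = day
            approved_since = 0.0   # reset the running approval amount
        elif kind == "approval" and last_nsf_day is not None:
            approved_since += amount
    if last_nsf_day is None:
        return None, 0.0
    return as_of_day - last_nsf_day, approved_since
```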
Further, in one embodiment, the 3DS-2-related card features may include, but are not limited to, authorization, RPCS, BNPL merchants-related features from payment cards sharing the same: email ID, IP address, shipping address, MAC address, phone number, and the like. In another embodiment, the 3DS-2-related card features may include, but are not limited to, features across channels, such as CNP, recurring, and the like. In yet another embodiment, the 3DS-2-related card features may include, but are not limited to, features across different time frames, such as previous one, three, seven, fourteen, twenty-eight, etc., days, previous three, six, nine, etc., months, previous seven, fourteen, etc., payment transactions, and the like. For example, the last fourteen days' CNP and NSF count of payment transactions performed by payment cards having the same email ID.
Furthermore, in one embodiment, the BNPL/postpaid merchants-related card features may include, but are not limited to, amount and count related to: a plurality of BNPL payment transactions, a plurality of BNPL declines, a plurality of BNPL approvals, a plurality of BNPL and DNH declines, a plurality of BNPL and NSF declines, and the like. In another embodiment, the BNPL/postpaid merchants-related card features may include, but are not limited to, features across different payment channels, such as CP, CNP, E-commerce, recurring, and the like. In yet another embodiment, the BNPL/postpaid merchants-related card features may include, but are not limited to, features across different time frames, such as previous one, three, seven, fourteen, twenty-eight, etc., days, previous three, six, nine, etc., months, previous seven, fourteen, etc., payment transactions, and the like. Further, in yet another embodiment, the BNPL/postpaid merchants-related card features may include, but are not limited to, a count of unique BNPL merchants, a percentage of amount and/or count across BNPL merchants relative to the overall amount and/or count, a count of active BNPL merchants, and the like. For example, the count of BNPL, CNP, and NSF declines during the last fourteen days, the last three months' recurring postpaid approval amount, etc.
Moreover, in one embodiment, the RPCS-related card features may include, but are not limited to, amount and count related to: overall RPCS declined payment transactions, unique merchants with RPCS declined payment transactions, MCC-specific RPCS declined payment transactions, BNPL-specific RPCS declined payment transactions, and the like. In another embodiment, the RPCS-related card features may include, but are not limited to, features across a recurring payment channel, and features across different time frames, such as previous one, three, seven, fourteen, twenty-eight, etc., days, previous three, six, nine, etc., months, previous seven, fourteen, etc., payment transactions, and the like. In yet another embodiment, the RPCS-related card features may include, but are not limited to, approval count and amount before an RPCS decline on merchants, partial approval before RPCS on BNPL merchants, and the like. For example, the last twenty-eight days' RPCS amount, the last three months' BNPL and RPCS count of payment transactions, and the like.
In a non-limiting implementation, a temporal pattern associated with the authorization features from the plurality of features may have to be determined prior to computing the intent-to-pay score of the cardholder 104(1), who is expected to perform the one or more repayments to the lender 118(1), for the ongoing payment transaction. Thus, in an embodiment, the server system 102 may be configured to segregate the plurality of features into a set of authorization features and a set of non-authorization features for each payment transaction based, at least in part, on authorization information associated with the unique account ID. Herein, the authorization information that is associated with the unique account ID of the cardholder 104(1) may correspond to the above-mentioned authorization data that is extracted from the database 116 upon receiving the unique account ID along with the intent-to-pay propensity request from the lender 118(1).
It should be noted that the set of authorization features may include at least the above-mentioned authorization-related card features. Further, all the other features, such as the 3DS-2-related card features, the BNPL/postpaid merchants-related card features, the RPCS-related card features, and the like may be segregated under the set of non-authorization features for each payment transaction.
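A minimal sketch of the segregation step described above is given below; the `auth_` key prefix is a hypothetical naming convention assumed purely for illustration, and any key not matching it is treated as a non-authorization feature (3DS-2, BNPL/postpaid, RPCS, and the like):

```python
# Hypothetical prefix convention: authorization features carry an "auth_" prefix.
AUTH_PREFIXES = ("auth_",)

def segregate(features):
    """Split a per-transaction feature dict into a set of authorization
    features and a set of non-authorization features."""
    auth = {k: v for k, v in features.items() if k.startswith(AUTH_PREFIXES)}
    non_auth = {k: v for k, v in features.items() if not k.startswith(AUTH_PREFIXES)}
    return auth, non_auth

features = {
    "auth_nsf_count_14d": 2,
    "auth_cnp_approved_amt_3m": 940.0,
    "3ds2_shared_email_nsf_14d": 1,
    "bnpl_decline_count_28d": 0,
    "rpcs_decline_amt_28d": 35.0,
}
auth_feats, non_auth_feats = segregate(features)
```

In practice the split would be driven by the authorization information associated with the unique account ID rather than by key names; the prefix check merely stands in for that lookup.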
Further, in one embodiment, the server system 102 may be configured to train one or more AI or ML models such as the temporal ML-based model to determine the temporal pattern associated with the payment transactions performed by the cardholder 104(1) in the past. Further, data corresponding to the temporal pattern may have to be prepared in a way that it can be used further by the intent-to-pay ML-based model for generating the intent-to-pay score of the cardholder 104(1) for the ongoing payment transaction. Thus, the server system 102 may be configured to generate a set of temporal embeddings for the plurality of payment transactions based, at least in part, on the set of authorization features corresponding to each payment transaction. Each temporal embedding in the set of temporal embeddings is generated for each payment transaction. In one embodiment, the server system 102 generates the set of temporal embeddings using the temporal ML-based model. The process of training the temporal ML-based model is described in further parts of the present disclosure.
Subsequently, in an embodiment, the server system 102 may be configured to generate a set of combined embeddings for the plurality of payment transactions based, at least in part, on concatenating the set of temporal embeddings, the set of non-authorization features for each payment transaction, and the ticket size of the ongoing payment transaction. Lastly, the server system 102 may be configured to compute the intent-to-pay score of the cardholder 104(1) for the ongoing payment transaction based, at least in part, on the set of combined embeddings. As may be understood, the intent-to-pay score indicates the likelihood of the cardholder 104(1) to perform the one or more repayments to the lender 118(1) for the ongoing payment transaction. Moreover, the server system 102 computes the intent-to-pay score using the intent-to-pay ML-based model. The process of computing the intent-to-pay score is explained in further parts of the present disclosure.
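The concatenation that produces a combined embedding can be sketched as follows; the vector contents and dimensions are illustrative placeholders, not values prescribed by the disclosure:

```python
def combine(temporal_embedding, non_auth_features, ticket_size):
    """Concatenate a per-transaction temporal embedding, the corresponding
    non-authorization feature values, and the ongoing transaction's ticket
    size into a single combined embedding vector."""
    return list(temporal_embedding) + list(non_auth_features) + [ticket_size]

temporal_embedding = [0.12, -0.40, 0.73]   # e.g., from the temporal model's hidden state
non_auth_features = [1.0, 0.0, 35.0]       # e.g., 3DS-2 / BNPL / RPCS feature values
combined = combine(temporal_embedding, non_auth_features, ticket_size=250.0)
```

The resulting vector is what the intent-to-pay ML-based model would consume for scoring.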
In some embodiments, the server system 102 may be configured to label the cardholder 104(1) as one of a willing-to-pay cardholder and a not-willing-to-pay cardholder based, at least in part, on one of the intent-to-pay score being at least equal to a predefined intent threshold and less than the predefined intent threshold, respectively.
In some other embodiments, the server system 102 may also be configured to generate recommendations for the financial service provider 118(1) based, at least in part, on the intent-to-pay score of the cardholder 104(1) for the ongoing payment transaction. The recommendations may include a suggestion to the financial service provider 118(1) to provide a much lower ticket size option to the cardholder 104(1) having the intent-to-pay score less than the predefined intent threshold.
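The labeling and recommendation logic of the two preceding paragraphs can be sketched as below; the threshold value and the ticket-size reduction factor are hypothetical parameters chosen for the example, since the disclosure does not fix them:

```python
def label_and_recommend(intent_score, intent_threshold=0.5,
                        ticket_size=250.0, reduction_factor=0.25):
    """Label the cardholder as willing-to-pay when the intent-to-pay score
    is at least the predefined intent threshold; otherwise label them
    not-willing-to-pay and suggest a much lower ticket size option."""
    if intent_score >= intent_threshold:
        return {"label": "willing-to-pay", "recommendation": None}
    return {
        "label": "not-willing-to-pay",
        "recommendation": {"suggested_ticket_size": ticket_size * reduction_factor},
    }

decision = label_and_recommend(0.31, intent_threshold=0.5, ticket_size=400.0)
```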
In one embodiment, the payment network 112 may be used by the payment card issuing authorities as a payment interchange network. Examples of the payment cards include debit cards, credit cards, etc. Similarly, examples of payment interchange networks include but are not limited to, a Mastercard® payment system interchange network. The Mastercard® payment system interchange network is a proprietary communications standard promulgated by Mastercard International Incorporated® for the exchange of electronic payment transaction data between issuers and acquirers that are members of Mastercard International Incorporated®. (Mastercard is a registered trademark of Mastercard International Incorporated located in Purchase, N.Y.).
It should be understood that the server system 102 is a separate part of the environment 100, and may operate apart from (but still in communication with, for example, via the network 124) any third-party external servers (to access data to perform the various operations described herein). However, in other embodiments, the server system 102 may be incorporated, in whole or in part, into one or more parts of the environment 100.
The number and arrangement of systems, devices, and/or networks shown in FIG. 1 are provided as an example. There may be additional systems, devices, and/or networks; fewer systems, devices, and/or networks; different systems, devices, and/or networks; and/or differently arranged systems, devices, and/or networks than those shown in FIG. 1. Furthermore, two or more systems or devices shown in FIG. 1 may be implemented within a single system or device, or a single system or device shown in FIG. 1 may be implemented as multiple, distributed systems or devices. In addition, the server system 102 should be understood to be embodied in at least one computing device in communication with the network 124, which may be specifically configured, via executable instructions, to perform steps as described herein, and/or embodied in at least one non-transitory computer-readable media.
FIG. 2 illustrates a simplified block diagram of a server system 200, in accordance with an embodiment of the present disclosure. The server system 200 is identical to the server system 102 of FIG. 1. In one embodiment, the server system 200 is a part of the payment network 112 or integrated within the payment server 114. In some embodiments, the server system 200 is embodied as a cloud-based and/or SaaS-based (software as a service) architecture.
The server system 200 includes a computer system 202 and a database 204. The computer system 202 includes at least one processor 206 (herein, referred to interchangeably as ‘processor 206’) for executing instructions, a memory 208, a communication interface 210, a user interface 212, and a storage interface 214. The one or more components of the computer system 202 communicate with each other via a bus 216. The components of the server system 200 provided herein may not be exhaustive and the server system 200 may include more or fewer components than those depicted in FIG. 2. Further, two or more components depicted in FIG. 2 may be embodied in one single component, and/or one component may be configured using multiple sub-components to achieve the desired functionalities.
In some embodiments, the database 204 is integrated into the computer system 202. In one non-limiting example, the database 204 is configured to store a cardholder-related dataset such as a cardholder-related dataset 218. In a non-limiting example, as mentioned earlier in the present disclosure, the cardholder-related dataset 218 may include historical information corresponding to the plurality of payment transactions performed by a cardholder (e.g., the cardholder 104(1)). Herein, the cardholder 104(1) may have a financial account at one or more issuers (e.g., the issuers 108). Further, in an embodiment, since the cardholder-related dataset 218 may be used for training one or more AI or ML models for performing a classification task, such as determining a probability of paying and a probability of not paying, the one or more AI or ML models, such as a trained temporal ML-based model 220 and a trained intent-to-pay ML-based model 222, may also be stored in the database 204. The trained temporal ML-based model 220 and the trained intent-to-pay ML-based model 222 may be used by the server system 200 for further processing the cardholder-related dataset 218. In addition, the database 204 provides a storage location for data and/or metadata obtained from various operations performed by the server system 200. In one embodiment, the database 204 is substantially similar to the database 116 of FIG. 1. Thus, it should be understood that all the details, data, or information mentioned in the description of FIG. 1 as being stored in the database 116 are applicable to the database 204 as well.
Further, the computer system 202 may include one or more hard disk drives as the database 204. The user interface 212 is an interface, such as a Human Machine Interface (HMI) or a software application, that allows users such as an administrator to interact with and control the server system 200 or one or more parameters associated with the server system 200. It may be noted that the user interface 212 may be composed of several components that vary based on the complexity and purpose of the application. Examples of components of the user interface 212 may include visual elements, controls, navigation, feedback and alerts, user input and interaction, responsive design, user assistance and help, accessibility features, and the like. More specifically, these components may correspond to icons, layout, color schemes, buttons, sliders, dropdown menus, tabs, links, error/success messages, mouse and touch interactions, keyboard shortcuts, tooltips, screen readers, and the like.
The storage interface 214 is any component capable of providing the processor 206 access to the database 204. The storage interface 214 may include, for example, an Advanced Technology Attachment (ATA) adapter, a Serial ATA (SATA) adapter, a Small Computer System Interface (SCSI) adapter, a RAID controller, a SAN adapter, a network adapter, and/or any component providing the processor 206 with access to the database 204.
The processor 206 includes suitable logic, circuitry, and/or interfaces to execute operations for accessing the cardholder-related dataset 218, generating and storing the plurality of features in the database 204, accessing the plurality of features, segregating the plurality of features, generating a set of temporal embeddings for a set of authorization features of the plurality of features, generating a set of combined embeddings by concatenating the set of temporal embeddings, the set of non-authorization features, and the ticket size of the ongoing payment transaction, determining an intent-to-pay score of the cardholder for the ongoing payment transaction, and the like. Examples of the processor 206 include, but are not limited to, an Application-Specific Integrated Circuit (ASIC) processor, a Reduced Instruction Set Computing (RISC) processor, a Graphical Processing Unit (GPU), a Complex Instruction Set Computing (CISC) processor, a Field-Programmable Gate Array (FPGA), and the like.
The memory 208 includes suitable logic, circuitry, and/or interfaces to store a set of computer-readable instructions for performing operations. Examples of the memory 208 include a Random-Access Memory (RAM), a Read-Only Memory (ROM), a removable storage drive, a Hard Disk Drive (HDD), and the like. It will be apparent to a person skilled in the art that the scope of the disclosure is not limited to realizing the memory 208 in the server system 200, as described herein. In another embodiment, the memory 208 may be realized in the form of a database server or a cloud storage working in conjunction with the server system 200, without departing from the scope of the present disclosure.
The processor 206 is operatively coupled to the communication interface 210, such that the processor 206 is capable of communicating with a remote device 224, such as the issuer servers 108, the acquirer servers 110, the payment server 114, electronic devices 120 associated with the lenders 118, financial service provider banks 122, or communicating with any entity connected to the network 124 (as shown in FIG. 1).
It is noted that the server system 200 as illustrated and hereinafter described is merely illustrative of an apparatus that could benefit from embodiments of the present disclosure and, therefore, should not be taken to limit the scope of the present disclosure. It is noted that the server system 200 may include fewer or more components than those depicted in FIG. 2.
In one implementation, the processor 206 includes a data pre-processing module 226, a training module 228, a data processing module 230, and an intent-to-pay score computation module 232. It should be noted that components, described herein, such as the data pre-processing module 226, the training module 228, the data processing module 230, and the intent-to-pay score computation module 232 can be configured in a variety of ways, including electronic circuitries, digital arithmetic, and logic blocks, and memory systems in combination with software, firmware, and embedded technologies. Moreover, it may be noted that the data pre-processing module 226, the training module 228, the data processing module 230, and the intent-to-pay score computation module 232 may be communicably coupled with each other to exchange information with each other for performing the one or more operations facilitated by the server system 200.
In one scenario, the lender 118(1) has paid the merchant 106(1), on behalf of the cardholder 104(1), a purchase amount corresponding to a purchase made by the cardholder 104(1), and the lender 118(1) is expecting to receive the repayments and/or the installments from the cardholder 104(1); however, the cardholder 104(1) can choose not to pay the lender 118(1). In some scenarios, the cardholder 104(1) might choose not to pay, not necessarily because the cardholder 104(1) is not capable of making the payment but because the cardholder 104(1) is not willing to pay. In such scenarios, the lender 118(1) can register with the server system 200 to receive a facility for determining the willingness of the cardholder 104(1) to perform the repayments. In an embodiment, the lender 118(1) can use this feature of the server system 200 by making Application Programming Interface (API) calls to the server system 200. Thus, the lender 118(1) may send the intent-to-pay propensity request to the server system 200 through an API call. In one embodiment, the communication interface 210 includes suitable logic and/or interfaces for receiving the intent-to-pay propensity request from the lender 118(1) via the electronic device 120(1). The intent-to-pay propensity request indicates a request to determine a likelihood of the cardholder 104(1) to perform the one or more repayments to the lender 118(1) for the ongoing payment transaction (i.e., the purchase amount). In an embodiment, the intent-to-pay propensity request may include payment card information and the ticket size related to the ongoing payment transaction. The communication interface 210 transfers the intent-to-pay propensity request to the data processing module 230.
Further, in one embodiment, the data processing module 230 includes suitable logic and/or interfaces for extracting the unique account ID (e.g., PAN) corresponding to the cardholder 104(1) from the payment card information. The data processing module 230 transfers the unique account ID to the intent-to-pay score computation module 232.
In one embodiment, the intent-to-pay score computation module 232 includes suitable logic and/or interfaces for computing the intent-to-pay score of the cardholder 104(1) for the ongoing payment transaction based, at least in part, on the ticket size related to the ongoing payment transaction and processing the plurality of features associated with each payment transaction performed by the cardholder 104(1) having the extracted unique account ID.
In one embodiment, the data processing module 230 transfers the unique account ID to the data pre-processing module 226. The data pre-processing module 226 includes suitable logic and/or interfaces for accessing the cardholder-related dataset 218 from the database 204 based, at least in part, on the unique account ID. The cardholder-related dataset 218 may include historical information corresponding to the plurality of payment transactions performed by the cardholder 104(1) with at least one of the plurality of merchants 106 and the plurality of lenders 118.
In a non-limiting implementation, for computing the intent-to-pay score, the data pre-processing module 226 may further be configured to extract the plurality of features from the cardholder-related dataset 218 associated with each payment transaction. The data pre-processing module 226 may then store the plurality of features for each payment transaction corresponding to the unique account ID in the database 204.
Further, the data processing module 230 is configured to access the plurality of features for each payment transaction performed by the cardholder 104(1) from the database 204 based, at least in part, on the unique account ID received from the lender 118(1) of the ongoing payment transaction to the cardholder 104(1) via the electronic device 120(1). Upon accessing the plurality of features from the cardholder-related dataset 218, a temporal ML-based model and an intent-to-pay ML-based model may have to be trained and validated for their respective classification task. Thus, the plurality of features may be provided to the training module 228. The training module 228 includes suitable logic and/or interfaces for training the temporal ML-based model and the intent-to-pay ML-based model to generate the trained temporal ML-based model 220 and the trained intent-to-pay ML-based model 222.
Further, it should be noted that before training any AI or ML models, input data fed to the AI or ML models may have to be prepared for training. Therefore, initially, in an embodiment, for training the temporal ML-based model, the data pre-processing module 226 may further be configured to perform one or more data pre-processing operations, such as data cleaning, feature scaling (e.g., normalization, standardization, etc.), handling missing values, encoding categorical variables if needed, and the like. It is noted that since the data pre-processing operations described earlier are well-known in the art, they are not explained here in detail for the sake of brevity. Herein, it may be noted that the data pre-processing module 226 is configured to perform the one or more data pre-processing operations on the plurality of features before providing the plurality of features to the training module 228 for training the temporal ML-based model and the intent-to-pay ML-based model. In addition, an architecture for the temporal ML-based model and the intent-to-pay ML-based model may have to be selected from several predetermined architectures or algorithms. It should be noted that the architecture of the trained temporal ML-based model 220 and the trained intent-to-pay ML-based model 222 is explained further in detail in the present disclosure.
In one embodiment, along with the cardholder-related dataset 218, since a temporal pattern of the payment transactions performed by the cardholder 104(1) is also required for generating the intent-to-pay score for the cardholder 104(1), the data processing module 230 is configured to segregate the plurality of features into a set of authorization features and a set of non-authorization features for each payment transaction of the plurality of payment transactions performed by the cardholder 104(1) based, at least in part, on authorization information associated with the unique account ID. Further, the data processing module 230 is also configured to generate, via the trained temporal ML-based model 220, the set of temporal embeddings for the plurality of payment transactions based, at least in part, on the set of authorization features corresponding to each payment transaction. Each temporal embedding in the set of temporal embeddings is generated for each payment transaction.
In one embodiment, for generating the set of temporal embeddings, the data processing module 230 may be configured to generate sequential data for the cardholder 104(1) based, at least in part, on the set of authorization features corresponding to each payment transaction and a time stamp associated with each payment transaction performed by the cardholder 104(1) for a predefined period. The sequential data may include one or more card-specific sequences of the plurality of payment transactions performed by the cardholder 104(1) within the predefined period. Further, the data processing module 230 may be configured to process the sequential data using the trained temporal ML-based model 220. The data processing module 230 may further be configured to extract the set of temporal embeddings from at least one hidden layer of the trained temporal ML-based model 220. Each temporal embedding indicates a pattern (i.e., a temporal pattern) of the sequential data. More specifically, the set of temporal embeddings may be extracted from the final hidden state of the trained temporal ML-based model 220.
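A minimal numpy sketch of this step is given below, assuming (purely for illustration) a single-layer GRU with randomly initialized, untrained weights; the final hidden state after consuming a card-specific, time-ordered sequence of per-transaction authorization feature vectors serves as the temporal embedding:

```python
import numpy as np

rng = np.random.default_rng(0)

def gru_cell(x, h, W, U, b):
    """One GRU step. W, U, and b stack the update (z), reset (r), and
    candidate (n) gate parameters along their first axis."""
    def sigmoid(a):
        return 1.0 / (1.0 + np.exp(-a))
    z = sigmoid(W[0] @ x + U[0] @ h + b[0])
    r = sigmoid(W[1] @ x + U[1] @ h + b[1])
    n = np.tanh(W[2] @ x + U[2] @ (r * h) + b[2])
    return (1.0 - z) * n + z * h

def temporal_embedding(sequence, hidden_dim=4):
    """Run a card-specific sequence of per-transaction authorization
    feature vectors through the GRU; the final hidden state is returned
    as the temporal embedding. Weights here are random stand-ins for a
    trained model's parameters."""
    input_dim = sequence.shape[1]
    W = rng.normal(scale=0.1, size=(3, hidden_dim, input_dim))
    U = rng.normal(scale=0.1, size=(3, hidden_dim, hidden_dim))
    b = np.zeros((3, hidden_dim))
    h = np.zeros(hidden_dim)
    for x in sequence:          # transactions ordered by time stamp
        h = gru_cell(x, h, W, U, b)
    return h

# A toy sequence: 5 transactions, each with 6 authorization features.
seq = rng.normal(size=(5, 6))
emb = temporal_embedding(seq, hidden_dim=4)
```

The dimensions and the choice of a single recurrent layer are assumptions for the sketch; the disclosure only requires that the embedding come from a hidden state of the trained temporal model.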
Furthermore, the data processing module 230 is configured to generate a set of combined embeddings for the plurality of payment transactions based, at least in part, on concatenating the set of temporal embeddings for each payment transaction, the set of non-authorization features for each payment transaction, and the ticket size related to the ongoing payment transaction. In one embodiment, the data processing module 230 may concatenate the set of temporal embeddings, the set of non-authorization features, and the ticket size using a concatenation mechanism. As used herein, the term “concatenation mechanism” refers to a mechanism of joining vectors together in a new dimension.
Considering the training of the temporal ML-based model, it may be noted that the training module 228 may train the temporal ML-based model by performing a first set of operations iteratively until the performance of the temporal ML-based model converges to the first predefined criteria. The process of training the temporal ML-based model to obtain the trained temporal ML-based model 220 is explained in detail in further parts of the present disclosure. Upon training the temporal ML-based model, the trained temporal ML-based model 220 may be provided to the data processing module 230 for generating the set of temporal embeddings.
Similarly, considering the training of the intent-to-pay ML-based model, it may be noted that the training module 228 may train the intent-to-pay ML-based model by performing a second set of operations iteratively until the performance of the intent-to-pay ML-based model converges to the second predefined criteria. The process of training the intent-to-pay ML-based model to obtain the trained intent-to-pay ML-based model 222 is explained in detail in further parts of the present disclosure. Upon training the intent-to-pay ML-based model, the trained intent-to-pay ML-based model 222 may be provided to the intent-to-pay score computation module 232 for generating the intent-to-pay score for the cardholder 104(1).
In one embodiment, the intent-to-pay score computation module 232 includes suitable logic and/or interfaces for computing, via the trained intent-to-pay ML-based model 222, the intent-to-pay score of the cardholder 104(1) for the ongoing payment transaction based, at least in part, on the set of combined embeddings. The intent-to-pay score indicates the likelihood of the cardholder 104(1) to perform one or more repayments to the lender 118(1) for the ongoing payment transaction.
Upon determining the intent-to-pay score for the cardholder 104(1), the intent-to-pay score computation module 232 may further be configured to transfer the intent-to-pay score to the communication interface 210. The communication interface 210 may be configured to transmit the intent-to-pay score of the cardholder 104(1) to the lender 118(1) through the electronic device 120(1), in response to receiving the intent-to-pay propensity request from the lender 118(1).
In some embodiments, the intent-to-pay score computation module 232 may be configured to label the cardholder 104(1) as a willing-to-pay cardholder for the ongoing payment transaction based, at least in part, on the intent-to-pay score of the cardholder 104(1) being at least equal to a predefined intent threshold. Alternatively, the intent-to-pay score computation module 232 may be configured to label the cardholder 104(1) as a not-willing-to-pay cardholder for the ongoing payment transaction based, at least in part, on the intent-to-pay score of the cardholder 104(1) being less than the predefined intent threshold.
In some other embodiments, the intent-to-pay score computation module 232 may also be configured to generate one or more recommendations for the financial service provider 118(1) based, at least in part, on the intent-to-pay score of the cardholder 104(1) for the ongoing payment transaction. The one or more recommendations may include a suggestion to the financial service provider 118(1) to provide a much lower ticket size option to the cardholder 104(1) having the intent-to-pay score less than the predefined intent threshold.
FIG. 3 illustrates a block diagram 300 of an overall model architecture used for generating the intent-to-pay score, in accordance with an embodiment of the present disclosure. In a non-limiting implementation, the temporal ML-based model 220 can be a gated recurrent unit (GRU)-based Neural Network (NN) (e.g., GRU-based NN 302) as shown in FIG. 3. Further, the intent-to-pay ML-based model 222 can be a Multi-layer perceptron (MLP)-based NN (e.g., MLP-based NN 304) as shown in FIG. 3. In a non-limiting implementation, it should be noted that both the GRU-based NN 302 and the MLP-based NN 304 work together as the overall model architecture to generate the intent-to-pay score (e.g., intent-to-pay score 306) as shown in FIG. 3. Herein, the model architecture of each of the GRU-based NN 302 and the MLP-based NN 304 is explained in further parts of the present disclosure.
Referring to FIG. 3, it may be understood that the authorization data 308, the 3DS 2.0 data 310, the BNPL and postpaid transaction data 312, the card RPCS data 314, the ticket size from the lender (see, 316), and the like are provided to the overall model architecture in the manner shown in FIG. 3. It may be understood that the authorization data 308 is initially processed by the GRU-based NN 302 for obtaining the set of temporal embeddings. Then, along with the other data, the authorization data 308 and the set of temporal embeddings are provided to the MLP-based NN 304. Also, in an embodiment, all the data shown in FIG. 3 is fed to the GRU-based NN 302 and the MLP-based NN 304 in the form of features extracted from the corresponding data, as any AI or ML model processes features extracted from the data rather than the data itself. Thus, pre-processing of the data may be performed by the server system 200 for generating the features. In one embodiment, the features may include the set of authorization features and the set of non-authorization features. In a specific embodiment, the set of non-authorization features may include the 3DS-2-related card features, the BNPL/postpaid merchants-related card features, the RPCS-related card features, and the like.
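The MLP-based scoring head at the end of the overall architecture can be sketched as below; the layer sizes, activations, and random (untrained) weights are assumptions made for illustration, and the temporal embedding is represented here as a precomputed vector rather than a live GRU output:

```python
import numpy as np

rng = np.random.default_rng(1)

def mlp_intent_score(combined, hidden_dim=8):
    """Two-layer MLP head: combined embedding -> hidden ReLU layer ->
    sigmoid output interpreted as an intent-to-pay score in (0, 1).
    Weights are random stand-ins; in practice they come from training."""
    d = combined.shape[0]
    W1 = rng.normal(scale=0.1, size=(hidden_dim, d))
    b1 = np.zeros(hidden_dim)
    W2 = rng.normal(scale=0.1, size=(1, hidden_dim))
    b2 = np.zeros(1)
    h = np.maximum(0.0, W1 @ combined + b1)     # hidden layer with ReLU
    logit = (W2 @ h + b2)[0]
    return 1.0 / (1.0 + np.exp(-logit))         # sigmoid -> score in (0, 1)

temporal_emb = rng.normal(size=4)     # stand-in for the GRU-based NN 302 output
non_auth = rng.normal(size=6)         # stand-in for 3DS-2 / BNPL / RPCS features
ticket_size = np.array([250.0])
combined = np.concatenate([temporal_emb, non_auth, ticket_size])
score = mlp_intent_score(combined)
```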
In an embodiment, the GRU-based NN 302 is trained to generate the set of temporal embeddings. The server system 200 may be configured to train the GRU-based NN 302 to generate the set of temporal embeddings. Before training the GRU-based NN 302, the server system 200 may be configured to access a training set of authorization features for each payment transaction of the plurality of payment transactions performed by a plurality of cardholders 104 with at least one of the at least one financial service provider and the at least one merchant for a training period from the database 204. The server system 200 may further be configured to generate training sequential data for each cardholder based, at least in part, on the training set of authorization features and a time stamp associated with each payment transaction. The training sequential data may include one or more card-specific sequences of the plurality of payment transactions.
Upon obtaining the training sequential data, the server system 200 may be configured to train the temporal ML-based model, i.e., the GRU-based NN 302, for obtaining the trained temporal ML-based model 220 based, at least in part, on performing a first set of operations iteratively till the performance of the temporal ML-based model converges to first predefined criteria. In one embodiment, the first set of operations may include: (i) generating the temporal ML-based model based, at least in part, on one or more temporal model parameters; (ii) generating, via the temporal ML-based model, a training set of temporal embeddings for the training sequential data for each cardholder based, at least in part, on the one or more temporal model parameters; (iii) generating, via the temporal ML-based model, a prediction for each payment transaction in the training sequential data based, at least in part, on the training set of temporal embeddings, where the prediction corresponds to a hidden state of the temporal ML-based model; (iv) computing a temporal loss value for each payment transaction in the training sequential data using a temporal loss function based, at least in part, on the prediction and an actual outcome; and (v) optimizing the one or more temporal model parameters based, at least in part, on backpropagating the temporal loss value.
Once the temporal ML-based model is trained and the trained temporal ML-based model 220 is obtained, the server system 200 may be configured to generate, by the trained temporal ML-based model 220, the set of temporal embeddings. For generating the set of temporal embeddings, the server system 200 may be configured to generate sequential data for the cardholder 104(1) based, at least in part, on the set of authorization features corresponding to each payment transaction and a time stamp associated with each payment transaction performed by the cardholder 104(1) for a predefined period. The sequential data may include one or more card-specific sequences of the plurality of payment transactions performed by the cardholder 104(1) within the predefined period. The server system 200 may further process the sequential data using the trained temporal ML-based model 220.
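The sequence-generation step described above can be sketched in Python as follows. This is a minimal illustration, not the disclosed implementation; the record keys ('card_id', 'timestamp', 'features') and the helper name are hypothetical, chosen only to show the grouping and time-ordering:

```python
from collections import defaultdict

def build_card_sequences(transactions):
    """Group transactions by payment card and sort each group by
    timestamp, yielding one card-specific feature sequence per card.
    The keys 'card_id', 'timestamp', and 'features' are illustrative."""
    by_card = defaultdict(list)
    for txn in transactions:
        by_card[txn["card_id"]].append(txn)
    # Order each card's transactions by time of occurrence
    return {
        card: [t["features"] for t in sorted(group, key=lambda t: t["timestamp"])]
        for card, group in by_card.items()
    }

# Toy authorization-feature records for two hypothetical PANs
txns = [
    {"card_id": "pan-1", "timestamp": 3, "features": [0.2, 0.9]},
    {"card_id": "pan-1", "timestamp": 1, "features": [0.5, 0.1]},
    {"card_id": "pan-2", "timestamp": 2, "features": [0.7, 0.4]},
]
sequences = build_card_sequences(txns)
```

Each resulting sequence can then be fed, in time order, to the temporal ML-based model.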
Upon processing, the server system 200 may be configured to extract the set of temporal embeddings from the at least one hidden layer of the trained temporal ML-based model 220. Each temporal embedding indicates a pattern (i.e., temporal pattern) of the sequential data. Basically, the temporal ML-based model and the trained temporal ML-based model 220 have the same model architecture, such as the GRU-based NN 302. However, the GRU-based NN 302 in the trained temporal ML-based model 220 is trained to perform a specific task which is to generate the set of temporal embeddings, whereas the GRU-based NN 302 in the temporal ML-based model is yet to be trained.
In a non-limiting example, the one or more temporal model parameters may include a loss function, an optimizer, an evaluation metric, and the like. In another example, the one or more temporal model parameters may include a batch selected from a training dataset of the cardholder-related dataset 218, a count of epochs, a count of iterations, weights, a learning rate, and other hyperparameters. In a specific embodiment, for training the temporal ML-based model, the server system 200 may use an Adaptive Moment Estimation (Adam) optimizer with about 500 epochs. Herein, the Adam optimizer is an iterative optimization algorithm used to minimize the loss function during the training of neural networks.
The Adam optimizer can be considered as a combination of Root Mean Squared Propagation (RMSprop) and stochastic gradient descent (SGD) with momentum. It may be noted that it uses the squared gradients to scale the learning rate like RMSprop, and it takes advantage of momentum by using the moving average of the gradient instead of the gradient itself, like SGD with momentum. This combines dynamic learning rate and smoothening to reach the global minimum. According to the Adam optimizer, weights of the temporal ML-based model may be computed using the following formula:
w_t = w_(t-1) − η·m̂_t/(√(v̂_t) + ε) … Eqn. 1
Herein, w_t and w_(t-1) refer to the weights of the model at time t and t-1, respectively, and η refers to a step size that depends on the iteration and is also known as the learning rate of the model. Further, m̂_t and v̂_t refer to the moving averages of the first and second moments, which are generally initialized to zero. Furthermore, ε refers to a small constant epsilon which is used for numerical stability. Upon setting the weights as per Eqn. 1, the gradient and the temporal loss function may be computed. Further, the moving averages may be updated using exponentially decaying averages. At every iteration, all the temporal model parameters may be updated until the performance convergence of the model.
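The weight update of Eqn. 1, together with the exponentially decaying moment averages described above, may be sketched for a single scalar weight as follows. The function name and default hyperparameter values are illustrative assumptions, not values taken from the disclosure:

```python
import math

def adam_step(w, grad, m, v, t, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a single scalar weight per Eqn. 1.
    m and v are the running first- and second-moment estimates,
    initialized to zero; t is the 1-based iteration count."""
    # Exponentially decaying moving averages of the moments
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad * grad
    m_hat = m / (1 - beta1 ** t)   # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)   # bias-corrected second moment
    # Eqn. 1: w_t = w_(t-1) - lr * m_hat / (sqrt(v_hat) + eps)
    w = w - lr * m_hat / (math.sqrt(v_hat) + eps)
    return w, m, v

w, m, v = 1.0, 0.0, 0.0
w, m, v = adam_step(w, grad=2.0, m=m, v=v, t=1)  # w moves toward the minimum
```

At every iteration, the updated moments and weight are carried forward, mirroring how all the temporal model parameters are updated until convergence.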
In some embodiments, the learning rate may be about 0.01 to about 0.05, and weight decay may be about 0.001 to about 0.005. Further, in an embodiment, a learning rate scheduler, such as a cosine scheduler, a step scheduler, or the like may also be used for training the temporal ML-based model. As may be understood, the cardholder-related dataset is split into a training dataset, a testing dataset, and a validation dataset that are used for training, testing, and validation of the temporal ML-based model, respectively. Herein, it should be noted that the training dataset, the testing dataset, and the validation dataset are taken from different timeframes.
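The timeframe-based split mentioned above may be sketched as follows. The record layout and the cutoff values are hypothetical, chosen only to show non-overlapping time windows for the three datasets:

```python
def split_by_timeframe(records, train_end, test_end):
    """Partition records into train/test/validation sets by
    non-overlapping timeframes; timestamps are assumed to be
    ordered integers (e.g., days)."""
    train = [r for r in records if r["timestamp"] < train_end]
    test = [r for r in records if train_end <= r["timestamp"] < test_end]
    val = [r for r in records if r["timestamp"] >= test_end]
    return train, test, val

# Ten toy records spanning ten consecutive days
data = [{"timestamp": d} for d in range(10)]
train, test, val = split_by_timeframe(data, train_end=6, test_end=8)
```

Splitting by time rather than at random avoids leaking future transaction behavior into the training window.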
Further, the process of generating the predictions using the temporal ML-based model, where each prediction corresponds to the hidden state in the model architecture of the temporal ML-based model, is explained further in the present disclosure. However, the predictions may have to be checked for their correctness, which is performed by comparing the predictions with the actual outcome. Herein, the actual outcome may be known to the temporal ML-based model based on the historical data associated with past payment transactions that may have been performed by the cardholder 104(1) with other merchants 106 and the lender 118.
In a non-limiting example, the server system 200 compares the predictions with the actual outcome and calculates the temporal loss value using a temporal loss function. The temporal loss function can be any loss function, such as Mean Squared Error (MSE), cross-entropy loss, KL Divergence loss, or the like. In a specific embodiment, if the cross-entropy loss is considered, as convergence can be achieved faster with it due to its convex properties, then the temporal loss value can be computed using the following formula:
"logloss "=-1/N ?_i^N¦? ?_j^M¦? y_ij log?(p_ij ) … Eqn. 2
Herein, the logloss formula for computing the temporal loss value may be used for binary classification tasks. Further, ‘N’ refers to the number of rows, ‘M’ refers to the number of classes, y_ij refers to the actual labels, and p_ij refers to the predicted probabilities.
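The logloss computation of Eqn. 2 may be sketched as follows. This is a minimal illustration assuming one-hot actual labels; the function name and the toy values are hypothetical:

```python
import math

def log_loss(y, p):
    """Multi-class log loss per Eqn. 2: y is an N x M matrix of
    one-hot actual labels, p an N x M matrix of predicted
    probabilities; returns the mean negative log-likelihood."""
    n = len(y)
    total = 0.0
    for y_row, p_row in zip(y, p):
        for y_ij, p_ij in zip(y_row, p_row):
            if y_ij:  # only the true class contributes
                total += y_ij * math.log(p_ij)
    return -total / n

y = [[1, 0], [0, 1]]          # actual one-hot outcomes for two transactions
p = [[0.8, 0.2], [0.3, 0.7]]  # predicted class probabilities
loss = log_loss(y, p)         # about 0.29
```

A lower value indicates the predicted probabilities agree more closely with the actual outcomes.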
The temporal loss value may be obtained using the formula in Eqn. 2, and the one or more temporal model parameters may have to be updated if the temporal loss value is too high. Thus, one of the steps of training the temporal ML-based model is to optimize the one or more temporal model parameters by backpropagating the temporal loss value. The first set of operations repeats until the one or more temporal model parameters are optimized to the extent that the performance of the temporal ML-based model converges to the first predefined criteria. In one embodiment, the first predefined criteria may include saturation of the temporal loss value. Herein, it may be noted that the temporal loss value may get saturated after a plurality of iterations of the first set of operations. It is understood that during each iteration, the one or more temporal model parameters are updated before they are used to reinitialize the model in the subsequent iteration.
Once the temporal loss value saturates, the server system 200 may be configured to extract, by the trained temporal ML-based model 220, the set of temporal embeddings for each payment transaction of the sequential data based, at least in part, on a final hidden state of the trained temporal ML-based model 220. The set of temporal embeddings indicates the summary of information from the sequential data.
Once the set of temporal embeddings is obtained, the set of temporal embeddings is concatenated with the set of non-authorization features and the ticket size as shown in FIG. 3. Upon concatenation, the set of combined embeddings (otherwise, also referred to as a combined feature vector) may be generated by the server system 200. Further, the server system 200 is configured to train the intent-to-pay ML-based model having the MLP-based NN 304 to generate the intent-to-pay score for the cardholder 104(1), for obtaining the trained intent-to-pay ML-based model 222. The server system 200 may train the intent-to-pay ML-based model based, at least in part, on performing a second set of operations iteratively till the performance of the intent-to-pay ML-based model converges to second predefined criteria. The second set of operations may include: (i) generating the intent-to-pay ML-based model based, at least in part, on one or more intent-to-pay model parameters; (ii) generating, by the intent-to-pay ML-based model, an intent-to-pay score for each payment transaction based, at least in part, on a training dataset of the set of combined embeddings and the one or more intent-to-pay model parameters; (iii) generating, via the intent-to-pay ML-based model, a predicted outcome based, at least in part, on the intent-to-pay score; (iv) computing an intent-to-pay loss value for each payment transaction using an intent-to-pay loss function based, at least in part, on the predicted outcome and an actual outcome; and (v) optimizing the one or more intent-to-pay model parameters based, at least in part, on backpropagating the intent-to-pay loss value.
Upon completion of the training process, the trained intent-to-pay ML-based model 222 may be obtained. Basically, the intent-to-pay ML-based model and the trained intent-to-pay ML-based model 222 have the same model architecture as the MLP-based NN 304. However, the MLP-based NN 304 in the trained intent-to-pay ML-based model 222 is trained to perform a specific task which is to generate the intent-to-pay score, whereas the MLP-based NN 304 in the intent-to-pay ML-based model is yet to be trained.
In a non-limiting example, the one or more intent-to-pay model parameters may be similar to the one or more temporal model parameters with the values for each parameter being different. Thus, the intent-to-pay loss function may also be similar to the temporal loss function. Also, Eqn. 1 can be used to compute the weights associated with the intent-to-pay ML-based model, and Eqn. 2 can be used to compute the intent-to-pay loss value. Moreover, some of the initialization of the one or more intent-to-pay model parameters may be similar to that of the one or more temporal model parameters and some may vary. Further, the second predefined criteria may include saturation of the intent-to-pay loss value. Herein, it may be noted that the intent-to-pay loss value may get saturated after a plurality of iterations of the second set of operations. Thus, in a non-limiting implementation, a pseudocode for the overall algorithm used for capturing the willingness of the cardholder 104(1) to perform the repayments may be as follows:
Input: Training Dataset D = {x_i, y_i}
       The classifier g and its parameters W
       Epochs T and iterations l
Procedure:
    For t = 1 to T do:
        For i = 1 to l do:
            Fetch a random batch B from D
            Segregate the authorization features and the rest of the features (RPCS + BNPL + 3DS)
            Pass the authorization features to the temporal ML-based model
            Once the output from the temporal ML-based model is ready, pass the embeddings from the temporal ML-based model and the non-authorization features to the MLP model
            Calculate the loss of the current batch using the cross-entropy loss function
            Update the network parameters W via gradient descent
Output: Learned classifier G
From the above-mentioned pseudo-code, it may be noted that the classifier and learned classifier are similar to an AI or ML model and a trained AI or ML model (e.g., temporal ML-based model and trained temporal ML-based model 220, and intent-to-pay ML-based model and trained intent-to-pay ML-based model 222), respectively.
FIG. 4A illustrates a schematic representation 400 of an architecture of a temporal ML-based model (e.g., the trained temporal ML-based model 220), in accordance with an embodiment of the present disclosure. As may be understood, the trained temporal ML-based model 220 is used for determining the temporal pattern associated with the payment transactions performed by the cardholder 104(1) in the past by generating the set of temporal embeddings. It should be noted that the set of temporal embeddings may correspond to a representation that captures long-term dependencies in sequential data, such as time series data, speech, text, or the like. In the present disclosure, the input to the trained temporal ML-based model 220 is the set of authorization features.
Further, as the trained temporal ML-based model 220 can process input data that is sequential in nature, the payment transactions that are fed as the set of authorization features to the trained temporal ML-based model 220 may have to be arranged in the order of the time of occurrence of the corresponding payment transactions. Thus, in an embodiment, the server system 200 may be configured to generate sequential data based, at least in part, on the set of authorization features corresponding to each payment transaction and a time stamp associated with each payment transaction performed by the cardholder 104(1) for a predefined period. In a non-limiting example, the sequential data may include one or more card-specific sequences of the payment transactions performed by the cardholder 104(1) within the predefined period. Herein, each card-specific sequence of the one or more card-specific sequences corresponds to a particular payment card (e.g., the payment card 126) of a plurality of payment cards that the cardholder 104(1) may have owned. The plurality of payment cards may be a debit card and/or a credit card from different issuer banks 108 with different PANs.
Furthermore, during a training process, the training sequential data may be generated and fed to the temporal ML-based model to train the temporal ML-based model to generate the set of temporal embeddings. In a non-limiting implementation, the temporal ML-based model may have a GRU-based NN (e.g., the GRU-based NN 302). As used herein, the term “Gated Recurrent Unit” refers to a type of recurrent neural network (RNN) architecture that is similar to a Long Short-Term Memory (LSTM) model architecture. The difference between the LSTM architecture and the GRU architecture is that the LSTM architecture has an input gate, an output gate, and a forget gate which update a memory cell state and a hidden state, whereas the GRU architecture has a candidate activation vector which is updated using a reset gate and an update gate. Herein, the reset gate determines how much of the previous hidden state to forget, while the update gate determines how much of the candidate activation vector to incorporate into the new hidden state.
It should be noted that, the temporal ML-based model having the GRU architecture processes the training sequential data one element at a time, updating its hidden state based on the current input and the previous hidden state. At each time step, the temporal ML-based model computes a “candidate activation vector” that combines information from the input and the previous hidden state. This candidate activation vector is then used to update the hidden state for the next time step.
In a non-limiting implementation, the input x_t, which corresponds to the training sequential data obtained from the training set of authorization features at time t, is provided to the temporal ML-based model after standardization of values between 0-1. The reset gate r_t (see, 402) and the update gate z_t (see, 404) are computed using the current input x_t and the previous hidden state h_(t-1), and the formulae that may be used are as follows:
z_t = σ(W_z·[h_(t-1), x_t]) … Eqn. 3
r_t = σ(W_r·[h_(t-1), x_t]) … Eqn. 4
h̃_t = tanh(W·[r_t*h_(t-1), x_t]) … Eqn. 5
h_t = (1 − z_t)*h_(t-1) + z_t*h̃_t … Eqn. 6
Herein, h_t corresponds to the hidden state and/or the actual output state at time t after combining the previous hidden state h_(t-1) and the input x_t. Further, h̃_t corresponds to the candidate output state at time t. The symbol * stands for element-wise multiplication, σ is the sigmoid function, and [,] stands for the vector-concatenation operation. W_r, W_z, W ∈ R^(d_h×(d+d_h)) are the temporal model parameters of the reset and update gates, where d_h is the dimension of the hidden state and d is the dimension of the input vector.
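One GRU time step per Eqns. 3-6 may be sketched as follows. For readability the sketch uses scalar input and hidden state (d = d_h = 1) rather than the full vector case, and the weight values, biases (omitted), and toy input sequence are illustrative assumptions:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_step(x_t, h_prev, W_z, W_r, W):
    """One GRU time step for a scalar input and hidden state.
    Each weight is a pair (w_h, w_x) applied to the concatenation
    [h_(t-1), x_t]; biases are omitted for brevity."""
    z_t = sigmoid(W_z[0] * h_prev + W_z[1] * x_t)            # update gate (Eqn. 3)
    r_t = sigmoid(W_r[0] * h_prev + W_r[1] * x_t)            # reset gate (Eqn. 4)
    h_cand = math.tanh(W[0] * (r_t * h_prev) + W[1] * x_t)   # candidate state (Eqn. 5)
    return (1 - z_t) * h_prev + z_t * h_cand                 # new hidden state (Eqn. 6)

h = 0.0  # initial hidden state
for x in [0.2, 0.8, 0.5]:  # a toy standardized authorization-feature sequence
    h = gru_step(x, h, W_z=(0.5, 0.5), W_r=(0.5, 0.5), W=(1.0, 1.0))
```

The final hidden state h summarizes the whole sequence, which is the role the temporal embeddings play in the disclosed architecture.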
Further, it should be noted that, in a non-limiting implementation, t takes values between 0-90, which signifies considering the card’s transaction-level features of the cardholder 104(1) for the last 90 days. Also, the dimensions (d and d_h) of each vector, i.e., x_t, h_t, and h_(t-1), may be about 117. This may mean that 117 features of the set of authorization features may be considered for training the temporal ML-based model to generate the set of temporal embeddings.
As may be understood, the GRU architecture may include components, such as an input layer, a hidden layer, the reset gate (see, 402), the update gate (see, 404), the candidate activation vector, and an output layer. The input layer takes the sequential data, and the hidden layer is where the recurrent computation occurs, then the hidden state is updated at each time step. The reset gate takes as input the previous hidden state and the current input and produces a vector of numbers between 0 and 1 that controls the degree to which the previous hidden state is “reset” at the current time step. The update gate takes as input the previous hidden state and the current input and produces a vector of numbers between 0 and 1 that controls the degree to which the candidate activation vector is incorporated into the new hidden state. The candidate activation vector is a modified version of the previous hidden state that is “reset” by the reset gate and combined with the current input. It is computed using a tanh activation function that squashes its output between -1 and 1. The output layer takes the final hidden state as input and produces the network’s output. This could be a single number, a sequence of numbers, or a probability distribution over classes, depending on the task at hand.
FIG. 4B illustrates a schematic representation 420 of an architecture of an intent-to-pay ML-based model (e.g., the trained intent-to-pay ML-based model 222), in accordance with an embodiment of the present disclosure. As may be understood, the trained intent-to-pay ML-based model 222 is used for capturing the set of non-authorization features for each payment transaction, such as the 3DS-2-related card features, the BNPL/postpaid merchants-related card features, the RPCS-related card features, and the like. In one embodiment, the trained intent-to-pay ML-based model 222 may also receive the set of authorization features in the form of the set of temporal embeddings. Further, the trained intent-to-pay ML-based model 222 may also receive the ticket size of the ongoing payment transaction for which the cardholder 104(1) is supposed to perform the one or more repayments to the lender 118(10). Based on all these factors, the intent-to-pay ML-based model is trained to generate the intent-to-pay score for the cardholder 104(1) which translates to the willingness of the cardholder 104(1) to perform the one or more repayments.
It should be noted that, of all the above-mentioned features, the ticket size of the ongoing payment transaction has a major impact on the change in the intent-to-pay score. The variation of the intent-to-pay score with the ticket size is explained further in the present disclosure. Further, in one embodiment, the trained intent-to-pay ML-based model 222 may correspond to the MLP-based NN (e.g., the MLP-based NN 304). As used herein, the term “Multi-layer Perceptron” refers to a type of feedforward Neural Network (NN) with multiple layers including an input layer, one or more hidden layers, and an output layer, each layer containing artificial neurons (otherwise also referred to as ‘nodes’ or ‘units’).
As used herein, the term ‘input layer’ corresponds to a layer in the NN that receives raw input data or features as inputs, and each neuron in the input layer represents one feature of the data. Similarly, the term ‘hidden layers’ refers to layers that are located between the input and output layers that perform complex transformations on the input data received from the input layer. It may be noted that the number of hidden layers and neurons in each hidden layer can vary based on the network’s architecture and problem complexity. Further, the term ‘output layer’ refers to a layer that produces a final output of the network, which is typically a classification result. Herein, the number of neurons in the output layer depends on the number of classes or categories in the classification task.
To that end, it is understood that a neuron in the hidden layers of the NN includes several components. In a non-limiting example, the components may include weighted connections from neurons in the previous layer. Herein, the previous layer may correspond to one or more layers positioned before a current layer from the one or more hidden layers or the input layer. In case the weighted connections are coming from the input layer, inputs coming from each connection may be associated with a respective weight, which determines the strength of the corresponding connection. In an embodiment, a bias value may also be introduced. It should be noted that the weights and the bias value are learned during the training of the network. Further, in an embodiment, each neuron in the hidden layers may be associated with a summation function and an activation function. It should also be noted that the connection between the hidden layers is also associated with weights and optionally a bias value. Herein, the summation function includes performing a summation of the inputs received at each neuron in the hidden layers that are multiplied by their respective weights. Further, the activation function may introduce non-linearity into the model when applied to the summation results. In a non-limiting example, different activation functions may include sigmoid, Rectified Linear Unit (ReLu), softmax, and the like. To that note, outputs from each neuron on the hidden layers are obtained upon the application of the activation function. Further, results obtained from the hidden layers are transmitted to the output layer, where similar computation is performed by each neuron, to produce final classification results.
Further, in an embodiment, as the architecture chosen for the intent-to-pay ML-based model is based on MLP-based NN 304, a forward propagation operation may be performed. Herein, forward propagation operation may refer to a flow of input data in a forward direction from an input layer such as an input layer 422 of the intent-to-pay ML-based model, one or more hidden layers such as a hidden layer 430, and an output layer such as an output layer 432 to produce the final classification result. Herein, the input data may correspond to a training dataset of the set of combined embeddings. Moreover, while training the network to generate the classification results, the weights in each neuron in all the layers are adjusted to minimize the difference between its predictions and actual labels in the training data by using backpropagation and gradient descent. This ability to capture relevant information is a key strength of deep neural networks for performing various tasks, including image recognition, natural language processing, fraud detection, multi-class categorization, and the like. Upon training the intent-to-pay ML-based model, the server system 200 may generate the trained intent-to-pay ML-based model 222 which is further used for generating the intent-to-pay score for the cardholder 104(1) in real time.
In a non-limiting implementation, since all the features are to be provided to the trained intent-to-pay ML-based model 222, the server system 200 may be configured to generate a set of combined embeddings based, at least in part, on concatenating the set of temporal embeddings for each payment transaction, the set of non-authorization features for each payment transaction, and the ticket size related to the ongoing payment transaction. Herein, the ticket size may be used as another feature for training the intent-to-pay ML-based model to determine the intent-to-pay score. Further, it should be noted that the input layer 422 is formed by the training set of the set of combined embeddings.
However, it may be noted that prior to concatenation, dimensions of the training set of the set of temporal embeddings for each payment transaction, the set of non-authorization features for each payment transaction, and the ticket size related to the ongoing payment transaction may have to be matched. For instance, if the set of temporal embeddings is two-dimensional, then it has to be reduced to one-dimensional so that it is feasible to concatenate it with the set of non-authorization features and the ticket size which is one-dimensional. To that note, in an embodiment, several techniques that can be used for reducing the dimensions of embeddings may include, but are not limited to, flattening, averaging, pooling, etc. Further, in one embodiment, for generating the set of combined embeddings, the server system 200 may use the concatenation mechanism. Moreover, in an embodiment, the set of combined embeddings can be interchangeably referred to as a combined feature vector.
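The dimension matching and concatenation described above may be sketched as follows, using flattening to reduce a 2-D set of temporal embeddings to 1-D before concatenation. The function name and the toy values are hypothetical:

```python
def combine_embeddings(temporal_embeddings, non_auth_features, ticket_size):
    """Flatten a 2-D set of temporal embeddings and concatenate it
    with the 1-D non-authorization features and the scalar ticket
    size into a single combined feature vector."""
    # Flattening: one of several dimension-reduction options
    # (averaging or pooling could be used instead)
    flat = [value for row in temporal_embeddings for value in row]
    return flat + list(non_auth_features) + [ticket_size]

temporal = [[0.1, 0.2], [0.3, 0.4]]  # toy 2-D temporal embeddings
non_auth = [0.9, 0.8, 0.7]           # toy non-authorization features
combined = combine_embeddings(temporal, non_auth, ticket_size=250.0)
```

The resulting combined feature vector is what the intent-to-pay ML-based model consumes as input.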
Further, the server system 200 is configured to determine the intent-to-pay score for the cardholder 104(1) based, at least in part, on the set of combined embeddings. The intent-to-pay score indicates the likelihood of the cardholder 104(1) to perform one or more repayments to the lender 118(1) for the ongoing payment transaction. In one embodiment, based on the intent-to-pay score, it may be determined whether the cardholder 104(1) is willing to perform the one or more repayments or not. As may be understood, the trained intent-to-pay ML-based model 222 is used by the server system 200 to determine the intent-to-pay score for the cardholder 104(1). The output of the trained intent-to-pay ML-based model 222 would be a probability of willing to pay and a probability of not willing to pay corresponding to the cardholder 104(1). In a non-limiting example, the probability value varies from 000 to 999. If the probability value is at least equal to a predefined intent threshold, the intent-to-pay score may indicate the probability of willing to pay, indicating that it is more likely that the cardholder 104(1) is willing to perform the one or more repayments. If the probability value is less than the predefined intent threshold, the intent-to-pay score may indicate the probability of not willing to pay, indicating that it is more likely that the cardholder 104(1) is not willing to perform the one or more repayments.
In a non-limiting implementation of the present disclosure, if 323 features are considered, then it may be noted that the input layer 422 of the trained intent-to-pay ML-based model 222 may have 323 neurons, each neuron corresponding to each feature. Moreover, the input layer 422 is formed by the set of combined embeddings (otherwise, also referred to as ‘combined feature vector’). As may be understood, the combined feature vector is formed by the concatenation of the set of temporal embeddings, the set of non-authorization features, and the ticket size. Thus, in the example considered, the set of temporal embeddings can have a dimension of about 117, indicating 117 neurons of the 323 neurons in the input layer (see, 424). Similarly, the set of non-authorization features can have a dimension of about 205, indicating 205 neurons of the 323 neurons (see, 426). Further, the ticket size, which is one-dimensional, indicates a single neuron in the input layer (see, 428). Thus, the dimension of the input layer 422, which is 323, can alternatively be represented as 117 || 205 || 1.
This may be followed by at least one hidden layer such as a hidden layer 430 having, for example, 50 neurons which are obtained from the 323 neurons by performing the summation and the activation functions. Further, this is followed by an output layer such as the output layer 432 having 2 neurons which are obtained from the 50 neurons of the hidden layer 430 by performing the summation and the activation functions. Herein, it should be noted that the 2 neurons represent logits of the probability of the cardholder 104(1) paying or not paying the one or more repayments for which the request was received from the lender 118(1). In a non-limiting example, the intent-to-pay score can be computed by multiplying the probability of paying by 1000.
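The forward pass described above (input layer, hidden layer with summation and activation, output layer with two logits, and the score scaled to 0-999) may be sketched as follows. For brevity the sketch uses a 4-feature toy input and a 3-neuron hidden layer instead of the 323 and 50 neurons of the example; all weight values are illustrative assumptions:

```python
import math
import random

def mlp_intent_score(x, w1, b1, w2, b2):
    """Forward pass of a small MLP: input -> hidden (ReLU) ->
    2 output logits -> softmax. The intent-to-pay score is the
    'willing to pay' probability scaled to an integer in 0-999."""
    # Hidden layer: weighted summation plus bias, then ReLU activation
    hidden = [max(0.0, sum(wi * xi for wi, xi in zip(w_row, x)) + b)
              for w_row, b in zip(w1, b1)]
    # Output layer: two logits (willing to pay, not willing to pay)
    logits = [sum(wi * hi for wi, hi in zip(w_row, hidden)) + b
              for w_row, b in zip(w2, b2)]
    # Softmax over the logits (shifted by the max for stability)
    m = max(logits)
    exps = [math.exp(l - m) for l in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    return int(probs[0] * 1000), probs

random.seed(0)
x = [random.random() for _ in range(4)]     # toy combined feature vector
w1 = [[0.1] * 4, [0.2] * 4, [-0.1] * 4]     # 4 -> 3 hidden-layer weights
b1 = [0.0, 0.0, 0.0]
w2 = [[0.5, 0.5, 0.5], [-0.5, -0.5, -0.5]]  # 3 -> 2 logit weights
b2 = [0.0, 0.0]
score, probs = mlp_intent_score(x, w1, b1, w2, b2)
```

The two softmax outputs always sum to one, and the score lands in the 0-999 range the disclosure describes.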
Thus, it may be noted that inputs to a neuron can be represented as a vector x = {x1, x2, x3, … xn} with connections (or weights) for each element of the vector x being represented as another vector w = {w1j, w2j, w3j, … wnj}, respectively. Then, the results obtained upon applying the summation function may be indicated as follows:
z_j = Σ_i w_ij·x_i + b_j … Eqn. 7
Herein, the term ‘zj’ indicates the results obtained at jth neuron upon the application of the summation function. Similarly, the term ‘bj’ refers to a bias value that is used to fine-tune the behavior of individual neurons, allowing the network to learn and make better predictions. Further, an activation function is applied to the results obtained after the application of the summation function at each neuron in each layer. In a non-limiting implementation, the activation function may be indicated as follows:
y_j = f(z_j) … Eqn. 8
Herein, the term ‘yj’ indicates a result obtained at the jth neuron upon the application of the activation function on the results obtained from Eqn. 7, i.e., ‘zj’. In a non-limiting example, the activation function that is applied for each neuron of each layer of the hidden layers such as the hidden layer 430 may introduce non-linearity into the network, allowing the network (i.e., the trained intent-to-pay ML-based model 222) to learn complex patterns in the input data i.e., to learn complex transaction behavior patterns from the plurality of features for the cardholder 104(1). As mentioned above, several activation functions that can be applied at each neuron may include ReLu, tangent (tanh), sigmoid function, softmax, etc.
To that note, each neuron in the hidden layer 430 receives connections such as connections 434 from neurons of the previous layer, i.e., the input layer 422 having 323 neurons. Upon receiving the connections 434, which are weighted features from the 323 neurons, a summation function may be applied to each neuron of the hidden layer 430. This is followed by an activation function for generating 50 outputs that may be termed as hidden layer results hereinafter. Similarly, each neuron in the output layer 432 receives connections 436 from neurons of the hidden layer 430 having 50 neurons. Upon receiving the connections 436 that are weighted hidden layer results from the 50 neurons, a summation function may be applied to each neuron of the output layer. This is followed by an activation function for generating 2 outputs. This is applicable in a scenario where two probability values are expected for determining whether the cardholder 104(1) is willing to pay or is not willing to pay.
Further, a softmax function may be applied on each neuron of the output layer 432 for categorizing the cardholder 104(1) as being either a cardholder willing to pay or a cardholder not willing to pay. It may be noted that in the context of the feed-forward neural network, the two scores obtained from the output layer 432 correspond to raw scores (otherwise also referred to as logits). Upon obtaining 2 logits, the softmax function is applied on each neuron to obtain a prediction including probability distributions as outputs at each neuron in the output layer 432 such that the sum of all the probabilities is equal to 1. Herein, it may be noted that each value represents a likelihood of the input (i.e., the cardholder 104(1)) having intentions to perform the one or more repayments.
For instance, if the two classes i.e., willing to pay and not willing to pay are represented as a vector of probability distributions and if the vector generated upon application of the softmax function is [0.7, 0.3], then a class with the highest probability value will be assigned to the cardholder 104(1). In the current scenario, since the class ‘willing to pay’ has the highest probability value, the cardholder 104(1) under consideration may be classified under the ‘willing to pay’ class. Similarly, different cardholders 104 may be classified under these classes accordingly.
Thus, in other words, it may be stated that the server system 200 is configured to generate the intent-to-pay score corresponding to the cardholder 104(1) based, at least in part, on the set of combined embeddings using the trained intent-to-pay ML-based model 222. Herein, the intent-to-pay score may correspond to one score of a vector of probability distributions indicating a likelihood of labeling the cardholder 104(1) with a ‘willing to pay’ label. In an embodiment, for determining the likelihood of labeling the cardholder 104(1) with the ‘willing to pay’ label, the predefined intent threshold may be defined. For instance, the server system 200 may assign the ‘willing to pay’ label to the cardholder 104(1) if the intent-to-pay score is at least equal to the predefined intent threshold and if the intent-to-pay score is less than the predefined intent threshold, then the ‘not willing to pay’ label will be assigned. In a non-limiting embodiment, the predefined intent threshold may correspond to the application of the activation function such as the softmax function. In a non-limiting implementation, the softmax function may be represented as follows:
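The threshold-based labeling described above may be sketched as follows. The numeric threshold value 0.5 is an illustrative assumption; the disclosure states only that a predefined intent threshold is used.

```python
# A minimal sketch of assigning the 'willing to pay' / 'not willing to pay'
# label from the intent-to-pay score and the predefined intent threshold.
def label_cardholder(intent_to_pay_score, intent_threshold=0.5):
    # 'willing to pay' if the score is at least equal to the threshold,
    # otherwise 'not willing to pay'.
    if intent_to_pay_score >= intent_threshold:
        return "willing to pay"
    return "not willing to pay"

print(label_cardholder(0.7))  # willing to pay
print(label_cardholder(0.3))  # not willing to pay
```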
S(a)_i = exp(a_i) / (∑_(j=1)^n exp(a_j)) … Eqn. 9
Herein, the term ‘a’ represents an input vector, i.e., a vector of raw scores (logits) provided to the softmax function, S. It may include ‘n’ elements for ‘n’ classes. Similarly, the term ‘ai’ refers to the ith element of the input vector. Further, ‘exp(ai)’ refers to the standard exponential function applied to ‘ai’. Similarly, the denominator ‘∑_(j=1)^n exp(a_j)’ refers to a normalization term. It ensures that the values of the output vector ‘S(a)’ sum to 1 and that each value lies in the range of 0 to 1, which makes up a valid probability distribution.
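A direct implementation of the softmax function of Eqn. 9 may be sketched as follows, applied to two illustrative logit values such as those produced by the output layer 432.

```python
import math

def softmax(a):
    # Normalization term: sum over j = 1..n of exp(a_j)
    exps = [math.exp(a_i) for a_i in a]
    total = sum(exps)
    # Eqn. 9: S(a)_i = exp(a_i) / sum_j exp(a_j)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0])   # two raw scores (logits), illustrative values
print(probs)                  # each value lies in (0, 1)
print(sum(probs))             # sums to 1 up to floating-point error
```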
FIG. 5A illustrates a graphical representation 500 of a variation of an intent-to-pay score with an average ticket size of an ongoing payment transaction, in accordance with an embodiment of the present disclosure. The graphical representation 500 shows two random payment transactions with varying ticket sizes in accordance with an experiment that is conducted to validate the operation of the proposed approach of generating the intent-to-pay score of the cardholders 104 for an ongoing payment transaction. X-axis represents variation in ticket size (i.e., query ticket size (in USD) 502) and Y-axis represents variation in the intent-to-pay score (i.e., model score 504).
Further, curve 506 represents a payment transaction that experienced a decline at a ticket size of 10 USD and hence has an intent-to-pay score that is very low (less than 250). Since a decline is observed at the 10 USD ticket size, the server system 200 assigns intent-to-pay scores lower than the score at the 10 USD ticket size for the payment transaction having a ticket size of more than 10 USD, i.e., 15 USD, 25 USD, 30 USD, etc. On the other hand, for the same transaction that observed a decline at the 10 USD ticket size, the server system 200 assigns intent-to-pay scores higher than the score at the 10 USD ticket size for ticket sizes less than 10 USD, i.e., 5 USD with a score of around 500, 2 USD with a score of around 1000, etc.
Similarly, curve 508 represents a payment transaction that experienced approval at a ticket size of around 5 USD and hence has an intent-to-pay score that is very high (around 1000). Since an approval is observed at around the 5 USD ticket size, the server system 200 assigns intent-to-pay scores lower than the score at the 5 USD ticket size for the payment transaction having a ticket size of more than 5 USD, i.e., 15 USD, 25 USD, 30 USD, etc. However, ticket sizes such as 10 USD, 15 USD, and 25 USD would be assigned slightly better intent-to-pay scores that are close to about 1000 and greater than 750. On the other hand, ticket sizes of more than 25 USD, such as 30 USD, 35 USD, 50 USD, etc., would be assigned intent-to-pay scores that are lower than 750, such as around 500, around 250, and even less than 250. Further, for the same transaction that observed an approval at around the 5 USD ticket size, the server system 200 assigns intent-to-pay scores higher than the score at the 5 USD ticket size for ticket sizes less than 5 USD, i.e., 2 USD and less, with scores of around 1000 and more.
FIG. 5B illustrates a graphical representation 520 of variation of an intent-to-pay score with an average ticket size of an ongoing payment transaction across different segments of cardholders, in accordance with an embodiment of the present disclosure. The graphical representation 520 shows score (otherwise, also referred to as an average predicted score) behavior for different cardholder segments, such as a low spender, a medium spender, and a high spender. X-axis represents variation in ticket size (i.e., query ticket size (in USD) 522) and Y-axis represents variation in the intent-to-pay score (i.e., model score 524).
Curve 526 represents the payment transaction with varied ticket sizes and corresponding intent-to-pay scores for different ticket sizes for a low spender (i.e., a cardholder that has a low spending behavior). Similarly, curve 528 represents the payment transaction with varied ticket sizes and corresponding intent-to-pay scores for different ticket sizes for a medium spender (i.e., a cardholder that has a medium spending behavior). Further, curve 530 represents the payment transaction with varied ticket sizes and corresponding intent-to-pay scores for different ticket sizes for a high spender (i.e., a cardholder that has a high spending behavior).
In a non-limiting implementation, considering that the model predicts a BNPL transaction to be an NSF decline if the score is below a threshold such as about 450 (see, 532), the three cardholder segments formed from a dataset selected for conducting an experiment reach that mark on average at the following dollar amounts:
Low spender – at about $35
Medium spender – at about $145
High spender – at about $300
The above variation establishes the model’s ability (e.g., the trained temporal ML-based model 220 and the trained intent-to-pay ML-based model 222) to learn cardholder’s spending behavior and appropriately predict the intent to pay of the cardholders 104 based on the query ticket size 522.
Further, in an example implementation, the inference made on the selected dataset for the experiment for a particular time period is as follows:
Segment        | Average ticket size of past approved payment transactions | Percent (%) of total payment transactions
Low spender    | < $5                                                      | 25.3
Medium spender | $5-$50                                                    | 50.6
High spender   | > $50                                                     | 24.1
Table 1: Inference made on a selected dataset
Referring to Table 1, it may be understood that the average ticket size of past approved payment transactions for a low spender is less than about $5, and the percent of the total payment transactions that had this ticket size for the low spender is about 25.3%. Similarly, the average ticket size of past approved payment transactions for a medium spender varies between about $5 and about $50, and the percent of the total payment transactions that had this ticket size for the medium spender is about 50.6%. Further, the average ticket size of past approved payment transactions for a high spender is more than about $50, and the percent of the total payment transactions that had this ticket size for the high spender is about 24.1%.
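The segmentation of Table 1 may be sketched as follows. The boundary values ($5 and $50) come from Table 1; the function name and the treatment of ticket sizes exactly at the boundaries are assumptions made for illustration.

```python
# A minimal sketch of assigning a cardholder segment from the average
# ticket size of past approved payment transactions, per Table 1.
def spender_segment(avg_ticket_size_usd):
    if avg_ticket_size_usd < 5:
        return "low spender"
    if avg_ticket_size_usd <= 50:
        return "medium spender"
    return "high spender"

print(spender_segment(3))    # low spender
print(spender_segment(20))   # medium spender
print(spender_segment(120))  # high spender
```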
FIG. 6 illustrates a sequence flow diagram 600 depicting a process flow of an example scenario that requires the generation of an intent-to-pay score, in accordance with an embodiment of the present disclosure. The sequence of operations of the sequence flow diagram 600 may not be necessarily executed in the same order as they are presented. Further, one or more operations may be grouped and performed in the form of a single step, or one operation may have several sub-steps that may be performed in parallel or in a sequential manner. It is to be noted that to explain the sequence flow diagram 600, references may be made to elements described in FIGS. 1 to 2. It is understood that some of the various steps of the sequence flow may have already been explained with reference to the foregoing figures; therefore, an explanation of the same is not repeated herein for the sake of brevity. The sequence flow begins at step 602.
At step 602, a cardholder (e.g., the cardholder 104(1)) visits a checkout page on a merchant’s platform while purchasing a product via the merchant’s platform. Upon visiting the merchant’s platform, the cardholder 104(1) enters payment transaction-related information on the checkout page. In a non-limiting example, the payment transaction-related information may be related to the payment transaction initiated by the cardholder 104(1) and may include mobile number, email ID, billing for a final transaction, etc.
At step 604, the merchant (e.g., the merchant 106(1)) checks if the payment transaction is eligible for installments. In a non-limiting example, the merchant 106(1) may use a pay-in-installments software development kit (SDK) to determine if the payment transaction is eligible for the installments. In other words, the merchant sends this request to the server system (e.g., the server system 200). Then the server system 200 checks the account details of the cardholder 104(1) and the ticket size of the ongoing payment transaction initiated by the cardholder 104(1) and determines whether the corresponding payment transaction is eligible for installments.
At step 606, upon matching the account details of the cardholder 104(1) and the ticket size of the payment transaction, the server system 200 responds to the merchant 106(1) on the checkout page of the merchant’s platform with the applicable financial service provider (otherwise, also referred to as ‘lender’) offers. The server system 200 sends the applicable lender offers to the merchant 106(1) at the merchant’s platform if the payment transaction is determined to be eligible for conversion to installments.
At step 608, the merchant 106(1) via the checkout page on the merchant’s platform, presents the applicable lender offers to the cardholder 104(1) on an electronic device of the cardholder 104(1).
At step 610, upon receiving the applicable lender offers, the cardholder 104(1) selects one lender offer of the presented lender offers.
At step 612, once the cardholder 104(1) selects one of the presented lender offers, the merchant 106(1) requests the server system 200 to initialize an installment flow for the selected offer. Upon receiving this request, the server system 200 facilitates converting the total amount of the payment transaction to installments for the cardholder 104(1) to pay the total amount to the lender 118(1) in installments.
At step 614, upon initializing the installment flow, the server system 200 notifies the lender 118(1) about the initialization by sending a notification. In a non-limiting example, the notification may be sent in the form of a text message, an email, a pop-up notification on an electronic device of the lender 118(1), or the like.
At step 616, upon receiving the notification about the initialization of the installment flow, the lender 118(1) initiates the payment of the total or complete amount of the payment transaction that was initiated by the cardholder 104(1) to the merchant 106(1). Moreover, the lender 118(1) completes the payment.
At step 618, once the merchant 106(1) receives the complete payment from the lender 118(1), the merchant 106(1) notifies about the completion of the payment transaction that was initiated by the cardholder 104(1), to the cardholder 104(1) by sending the notification to the cardholder 104(1) on the electronic device of the cardholder 104(1).
At step 620, the merchant 106(1) also notifies the server system 200 about the completion of the payment transaction by the lender 118(1).
At step 622, once the installment flow is initiated, the lender 118(1) has an option to revert the transaction based on the willingness of the cardholder 104(1) to perform the repayments via the installments having a specific ticket size. Thus, in an embodiment, the lender 118(1) can send a request to the server system 200 to check the willingness of the cardholder 104(1) to pay. The lender 118(1) is able to do this when the lender 118(1) registers with the server system 200 to use the feature of checking the willingness of the cardholder 104(1) to pay. Thus, the server system 200 receives the intent-to-pay propensity request from the lender 118(1) for the payment transaction that was initiated by the cardholder 104(1) on the merchant’s platform while making a purchase, i.e., the ongoing payment transaction. In one embodiment, the ongoing payment transaction is at least one of a loan request transaction from the cardholder 104(1) and a loan standing instruction (SI) transaction initialized by the at least one financial service provider (e.g., the financial service provider 118(1)) with an issuing bank of the cardholder 104(1). Herein, the term ‘loan request transaction’ refers to a payment transaction performed by the lender 118(1) to the merchant 106(1) on behalf of the cardholder 104(1). Similarly, the term ‘SI transaction’ refers to an automatic payment of a fixed amount at regular intervals by debiting from a borrower’s financial account to a lender’s financial account, for example, installment payments, recurrent payments, or the like. At step 622, the ongoing payment transaction refers to the payment of the first installment.
At step 624, upon receiving the intent-to-pay propensity request from the lender 118(1), the server system 200 determines an intent-to-pay score of the cardholder 104(1) for the ongoing payment transaction using the process explained above in the present disclosure. As may be understood, the intent-to-pay score indicates the likelihood of the cardholder 104(1) to perform the repayments in installments for the payment transaction initiated by the cardholder 104(1) on the merchant’s platform.
At step 626, once the intent-to-pay score is determined, the server system 200 sends it to the lender 118(1) on the electronic device of the lender 118(1). The lender 118(1), based on the intent-to-pay score of the cardholder 104(1), can decide whether to go ahead and wait to receive the repayments from the cardholder 104(1) or decline the repayments and take actions to avoid facing any kind of losses in the future.
At step 628, suppose the lender 118(1) does not take actions to stop the repayments because the intent-to-pay score of the cardholder 104(1) is high or good; then the lender 118(1) receives the repayment of a first installment from the cardholder 104(1).
At step 630, similarly to how the first installment was received from the cardholder 104(1), a second installment is also received by the lender 118(1) from the cardholder 104(1).
At step 632, the lender 118(1) decides to check the intent-to-pay score of the cardholder 104(1) for the next installment. As a result, the lender 118(1) sends another intent-to-pay propensity request to the server system 200.
At step 634, upon receiving the request, the server system 200 determines the intent-to-pay score of the cardholder 104(1) for the next installment.
At step 636, upon determining the intent-to-pay score of the cardholder 104(1), the server system 200 sends it to the lender 118(1) on the electronic device of the lender 118(1).
At step 638, the lender 118(1) receives the next installment i.e., a third installment from the cardholder 104(1).
At step 640, the lender 118(1) again decides to check the intent-to-pay score of the cardholder 104(1) for the next installment and sends the intent-to-pay propensity request to the server system 200.
At step 642, upon receiving the request, the server system 200 determines the intent-to-pay score of the cardholder 104(1) for the next installment.
At step 644, upon determining the intent-to-pay score of the cardholder 104(1), the server system 200 sends it to the lender 118(1) on the electronic device of the lender 118(1).
At step 646, the lender 118(1) receives the next installment, i.e., a fourth installment, from the cardholder 104(1). However, in the flow explained above, there is a possibility that at any moment the cardholder 104(1) can drop out from performing the repayment of the installments. Thus, the lender 118(1), with the feature provided by the server system 200 of checking the intent-to-pay score of the cardholder 104(1) either while on-boarding or between the installments, determines whether the cardholder 104(1) will pay the next installment or not, and takes preventive actions accordingly.
FIG. 7 illustrates a sequence flow diagram 700 depicting a detailed process flow of generating the intent-to-pay score, in accordance with an embodiment of the present disclosure. The sequence of operations of the sequence flow diagram 700 may not be necessarily executed in the same order as they are presented. Further, one or more operations may be grouped and performed in the form of a single step, or one operation may have several sub-steps that may be performed in parallel or in a sequential manner. It is to be noted that to explain the sequence flow diagram 700, references may be made to elements described in FIGS. 1 to 2. It is understood that some of the various steps of the sequence flow may have already been explained with reference to the foregoing figures; therefore, an explanation of the same is not repeated herein for the sake of brevity. The sequence flow begins at step 702.
At step 702, the lender 118(1) sends the intent-to-pay propensity request to the server system 200. The intent-to-pay propensity request can be initiated by the lender 118(1) either during onboarding the cardholder 104(1) or between the repayments. As mentioned earlier, the intent-to-pay propensity request includes the payment card information and the ticket size related to the ongoing payment transaction.
At step 704, upon receiving the intent-to-pay propensity request, the server system 200 stores the payment card information and the ticket size that are associated with the intent-to-pay propensity request in the database 204. The payment card information and the ticket size are stored in the database 204 so that they can be accessed in the future whenever necessary.
At step 706, the server system 200 extracts the unique account ID from the payment card information that is stored in the database 204. The server system 200 performs this step, so that the server system 200 can access features corresponding to a payment card (e.g., the payment card 126) that may be used by the cardholder 104(1) to perform the repayments.
At step 708, upon extracting the unique account ID, the server system 200 sends a request to the database 204 for the features associated with the unique account ID.
At step 710, upon receiving the request, the database 204 transmits the features to the server system 200.
At step 712, upon accessing the features from the database 204, the server system 200 segregates the features into a first set and a second set. Herein, in one embodiment, the first set corresponds to the set of authorization features and the second set corresponds to the set of non-authorization features.
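The segregation at step 712 may be sketched as follows. The specific feature names and the list of fields treated as authorization features are hypothetical placeholders for illustration; the disclosure states only that the split is based on authorization information associated with the unique account ID.

```python
# A minimal sketch of splitting the accessed features into a set of
# authorization features and a set of non-authorization features.
AUTHORIZATION_FIELDS = {"auth_decision", "decline_reason", "auth_amount"}

def segregate_features(features):
    auth = {k: v for k, v in features.items() if k in AUTHORIZATION_FIELDS}
    non_auth = {k: v for k, v in features.items() if k not in AUTHORIZATION_FIELDS}
    return auth, non_auth

features = {
    "auth_decision": "approved",
    "auth_amount": 25.0,
    "merchant_category": "5411",
    "days_since_last_txn": 3,
}
auth, non_auth = segregate_features(features)
print(sorted(auth))      # ['auth_amount', 'auth_decision']
print(sorted(non_auth))  # ['days_since_last_txn', 'merchant_category']
```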
At step 714, the server system 200 requests for the trained temporal ML-based model 220 from the database 204. As mentioned above, the server system 200 is also configured to train the temporal ML-based model for generating the set of temporal embeddings, thereby obtaining the trained temporal ML-based model 220. Later, the server system 200 stores the trained temporal ML-based model 220 in the database 204 for future use.
At step 716, upon sending the request, the server system 200 receives the trained temporal ML-based model 220 from the database 204.
At step 718, upon receiving the trained temporal ML-based model 220, the server system 200 generates the set of temporal embeddings for the first set of features using the trained temporal ML-based model 220.
At step 720, the server system 200 concatenates the set of temporal embeddings, the second set of features, and the ticket size of the ongoing payment transaction provided by the lender 118(1) to the cardholder 104(1). Upon concatenation, the server system 200 generates the set of combined embeddings.
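The concatenation at step 720 may be sketched as follows. All of the dimensions shown (a 16-dimensional temporal embedding, 4 non-authorization features) are illustrative assumptions; only the three inputs being joined into one combined embedding come from the disclosure.

```python
import numpy as np

# A minimal sketch of concatenating the set of temporal embeddings, the
# second set (non-authorization features), and the ticket size of the
# ongoing payment transaction into one combined embedding.
temporal_embedding = np.random.default_rng(1).normal(size=16)  # from model 220
non_auth_features = np.array([3.0, 1.0, 0.0, 7.5])             # second set
ticket_size = np.array([25.0])                                 # ongoing txn (USD)

combined_embedding = np.concatenate(
    [temporal_embedding, non_auth_features, ticket_size]
)
print(combined_embedding.shape)  # (21,)
```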
At step 722, the server system 200 requests for the trained intent-to-pay ML-based model 222 from the database 204. As mentioned earlier, the server system 200 is configured to train the intent-to-pay ML-based model to generate the intent-to-pay score of the cardholder 104(1) for the ongoing payment transaction, thereby obtaining the trained intent-to-pay ML-based model 222. The server system 200 may then store the trained intent-to-pay ML-based model 222 in the database 204 for future use.
At step 724, the server system 200 receives the trained intent-to-pay ML-based model 222 from the database 204.
At step 726, upon receiving the trained intent-to-pay ML-based model 222 from the database 204, the server system 200 may generate the intent-to-pay score of the cardholder 104(1) for the ongoing payment transaction using the trained intent-to-pay ML-based model 222.
At step 728, finally, in response to the intent-to-pay propensity request received from the lender 118(1), the server system 200 sends the intent-to-pay score to the lender 118(1) on the electronic device 120(1) of the lender 118(1). Upon receiving the intent-to-pay score, the lender 118(1) can take necessary actions to make sure the lender 118(1) does not face any kind of losses in the future.
FIG. 8 illustrates a flow diagram depicting a method 800 for capturing the willingness of cardholders to perform one or more repayments, in accordance with an embodiment of the present disclosure. The method 800 depicted in the flow diagram may be executed by, for example, the server system 200. The sequence of operations of the method 800 may not be necessarily executed in the same order as they are presented. Further, one or more operations may be grouped and performed in the form of a single step, or one operation may have several sub-steps that may be performed in parallel or in a sequential manner. Operations of the method 800, and combinations of operations in the method 800 may be implemented by, for example, hardware, firmware, a processor, circuitry, and/or a different device associated with the execution of software that includes one or more computer program instructions. The plurality of operations is depicted in the process flow of the method 800. The process flow starts at operation 802.
At operation 802, the method 800 includes accessing, by a server system (such as the server system 200), a plurality of features for each payment transaction from a plurality of payment transactions performed by a cardholder (e.g., the cardholder 104(1)) with at least one of at least one financial service provider (e.g., the financial service provider 118(1)) and at least one merchant (e.g., the merchant 106(1)) from a database (e.g., the database 116) associated with the server system 200 based, at least in part, on a unique account identifier (ID) received from the financial service provider 118(1) for an ongoing payment transaction. The unique account ID is linked to the cardholder 104(1).
At operation 804, the method 800 includes segregating, by the server system 200, the plurality of features into a set of authorization features and a set of non-authorization features for each payment transaction based, at least in part, on authorization information associated with the unique account ID.
At operation 806, the method 800 includes generating, by a trained temporal Machine Learning (ML)-based model (e.g., the trained temporal ML-based model 220) associated with the server system 200, a set of temporal embeddings for the plurality of payment transactions based, at least in part, on the set of authorization features corresponding to each payment transaction, each temporal embedding in the set of temporal embeddings being generated for each payment transaction.
At operation 808, the method 800 includes generating, by the server system 200, a set of combined embeddings for the plurality of payment transactions based, at least in part, on concatenating the set of temporal embeddings, the set of non-authorization features for each payment transaction, and a ticket size of ongoing payment transaction.
At operation 810, the method 800 includes computing, by a trained intent-to-pay ML-based model (e.g., the trained intent-to-pay ML-based model 222) associated with the server system 200, an intent-to-pay score of the cardholder 104(1) for the ongoing payment transaction based, at least in part, on the set of combined embeddings. The intent-to-pay score indicates a likelihood of the cardholder 104(1) to perform one or more repayments to the financial service provider 118(1) for the ongoing payment transaction.
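Operations 802 to 810 of the method 800 may be sketched end to end as follows. The model calls are stubbed with trivial placeholder functions, and the features are assumed to be pre-split per transaction; in the disclosure these roles are played by the trained temporal ML-based model 220 and the trained intent-to-pay ML-based model 222.

```python
# A high-level sketch of method 800 with hypothetical stub models.
def method_800(features_per_txn, ticket_size, temporal_model, intent_model):
    combined = []
    for auth, non_auth in features_per_txn:            # operations 802, 804
        temporal_emb = temporal_model(auth)            # operation 806
        # operation 808: concatenate temporal embedding, non-authorization
        # features, and the ticket size of the ongoing payment transaction
        combined.append(temporal_emb + non_auth + [ticket_size])
    return intent_model(combined)                      # operation 810

# Placeholder stand-ins for the trained models, for illustration only.
temporal_stub = lambda auth: [sum(auth)]
intent_stub = lambda combined: sum(sum(row) for row in combined) / len(combined)

txns = [([1.0, 2.0], [0.5]), ([2.0, 0.0], [1.5])]
score = method_800(txns, ticket_size=25.0,
                   temporal_model=temporal_stub, intent_model=intent_stub)
print(score)  # 28.5
```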
FIG. 9 illustrates a simplified block diagram of the payment server 900, in accordance with an embodiment of the present disclosure. The payment server 900 is an example of the payment server 114 of FIG. 1. The payment server 900 and the server system 200 may use the payment network 112 as a payment interchange network. Examples of payment interchange networks include, but are not limited to, the Mastercard® payment system interchange network.
The payment server 900 includes a processing module 902 configured to extract programming instructions from a memory 904 to provide various features of the present disclosure. The components of the payment server 900 provided herein may not be exhaustive, and the payment server 900 may include more or fewer components than that depicted in FIG. 9. Further, two or more components may be embodied in one single component, and/or one component may be configured using multiple sub-components to achieve the desired functionalities. Some components of the payment server 900 may be configured using hardware elements, software elements, firmware elements, and/or a combination thereof.
Via a communication module 906, the processing module 902 receives a request from a remote device 908, such as the issuer servers 108, the acquirer servers 110, or the server system 102. The request may be a request for conducting the payment transaction. The communication may be achieved through API calls, without loss of generality. The payment server 900 includes a database 910. The database 910 also includes transaction processing data, such as issuer ID, country code, acquirer ID, and merchant identifier (MID), among others.
When the payment server 900 receives a payment transaction request from the acquirer servers 110 or a payment terminal (e.g., IoT device), the payment server 900 may route the payment transaction request to the issuer servers 108. The database 910 stores transaction identifiers for identifying transaction details, such as transaction amount, IoT device details, acquirer account information, transaction records, merchant account information, and the like.
In one example embodiment, the acquirer servers 110 are configured to send an authorization request message to the payment server 900. The authorization request message includes, but is not limited to, the payment transaction request.
The processing module 902 further sends the payment transaction request to the issuer servers 108 for facilitating the payment transactions from the remote device 908. The processing module 902 is further configured to notify the remote device 908 of the transaction status in the form of an authorization response message via the communication module 906. The authorization response message includes, but is not limited to, a payment transaction response received from the issuer servers 108. Alternatively, in one embodiment, the processing module 902 is configured to send an authorization response message for declining the payment transaction request, via the communication module 906, to the acquirer servers 110. In one embodiment, the processing module 902 executes similar operations performed by the server system 200, however, for the sake of brevity, these operations are not explained herein.
The disclosed method with reference to FIG. 8, or one or more operations of the server system 200 may be implemented using software including computer-executable instructions stored on one or more computer-readable media (e.g., non-transitory computer-readable media, such as one or more optical media discs, volatile memory components (e.g., DRAM or SRAM), or nonvolatile memory or storage components (e.g., hard drives or solid-state nonvolatile memory components, such as Flash memory components) and executed on a computer (e.g., any suitable computer, such as a laptop computer, netbook, Web book, tablet computing device, smartphone, or other mobile computing devices). Such software may be executed, for example, on a single local computer or in a network environment (e.g., via the Internet, a wide-area network, a local-area network, a remote web-based server, a client-server network (such as a cloud computing network), or other such networks) using one or more network computers. Additionally, any of the intermediate or final data created and used during the implementation of the disclosed methods or systems may also be stored on one or more computer-readable media (e.g., non-transitory computer-readable media) and are considered to be within the scope of the disclosed technology. Furthermore, any of the software-based embodiments may be uploaded, downloaded, or remotely accessed through a suitable communication means. Such a suitable communication means includes, for example, the Internet, the World Wide Web, an intranet, software applications, cable (including fiber optic cable), magnetic communications, electromagnetic communications (including RF, microwave, and infrared communications), electronic communications, or other such communication means.
Although the invention has been described with reference to specific exemplary embodiments, it is noted that various modifications and changes may be made to these embodiments without departing from the broad scope of the invention. For example, the various operations, blocks, etc., described herein may be enabled and operated using hardware circuitry (for example, Complementary Metal Oxide Semiconductor (CMOS) based logic circuitry), firmware, software, and/or any combination of hardware, firmware, and/or software (for example, embodied in a machine-readable medium). For example, the apparatuses and methods may be embodied using transistors, logic gates, and electrical circuits (for example, Application-Specific Integrated Circuit (ASIC) circuitry and/or in Digital Signal Processor (DSP) circuitry).
Particularly, the server system 200 and its various components may be enabled using software and/or using transistors, logic gates, and electrical circuits (for example, integrated circuit circuitry such as ASIC circuitry). Various embodiments of the invention may include one or more computer programs stored or otherwise embodied on a computer-readable medium, wherein the computer programs are configured to cause a processor or the computer to perform one or more operations. A computer-readable medium storing, embodying, or encoded with a computer program, or similar language, may be embodied as a tangible data storage device storing one or more software programs that are configured to cause a processor or computer to perform one or more operations. Such operations may be, for example, any of the steps or operations described herein. In some embodiments, the computer programs may be stored and provided to a computer using any type of non-transitory computer-readable media. Non-transitory computer-readable media include any type of tangible storage media. Examples of non-transitory computer-readable media include magnetic storage media (such as floppy disks, magnetic tapes, hard disk drives, etc.), optical magnetic storage media (e.g., magneto-optical disks), Compact Disc Read-Only Memory (CD-ROM), Compact Disc Recordable (CD-R), Compact Disc Rewritable (CD-R/W), Digital Versatile Disc (DVD), BLU-RAY® Disc (BD), and semiconductor memories (such as mask ROM, programmable ROM (PROM), Erasable PROM (EPROM), flash memory, Random Access Memory (RAM), etc.). Additionally, a tangible data storage device may be embodied as one or more volatile memory devices, one or more non-volatile memory devices, and/or a combination of one or more volatile memory devices and non-volatile memory devices. In some embodiments, the computer programs may be provided to a computer using any type of transitory computer-readable media.
Examples of transitory computer-readable media include electric signals, optical signals, and electromagnetic waves. Transitory computer-readable media can provide the program to a computer via a wired communication line (e.g., electric wires, and optical fibers) or a wireless communication line.
Various embodiments of the invention, as discussed above, may be practiced with steps and/or operations in a different order, and/or with hardware elements in configurations different from those disclosed. Therefore, although the invention has been described based on these exemplary embodiments, it is noted that certain modifications, variations, and alternative constructions may be apparent and well within the scope of the invention.
Although various exemplary embodiments of the invention are described herein in a language specific to structural features and/or methodological acts, the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as exemplary forms of implementing the claims.
Claims:
1. A computer-implemented method, comprising:
accessing, by a server system, a plurality of features for each payment transaction from a plurality of payment transactions performed by a cardholder with at least one of at least one financial service provider and at least one merchant from a database associated with the server system based, at least in part, on a unique account identifier (ID) received from a financial service provider for an ongoing payment transaction, the unique account ID being linked to the cardholder;
segregating, by the server system, the plurality of features into a set of authorization features and a set of non-authorization features for each payment transaction based, at least in part, on authorization information associated with the unique account ID;
generating, by a trained temporal Machine Learning (ML)-based model associated with the server system, a set of temporal embeddings for the plurality of payment transactions based, at least in part, on the set of authorization features corresponding to each payment transaction, each temporal embedding in the set of temporal embeddings being generated for each payment transaction;
generating, by the server system, a set of combined embeddings for the plurality of payment transactions based, at least in part, on concatenating the set of temporal embeddings, the set of non-authorization features for each payment transaction, and a ticket size of the ongoing payment transaction; and
computing, by a trained intent-to-pay ML-based model associated with the server system, an intent-to-pay score of the cardholder for the ongoing payment transaction based, at least in part, on the set of combined embeddings, the intent-to-pay score indicating a likelihood of the cardholder to perform one or more repayments to the financial service provider for the ongoing payment transaction.
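Purely as a non-limiting illustration (not part of the claims), the data flow of claim 1 can be sketched in Python. The feature names, the moving-average stand-in for the trained temporal ML-based model, and the fixed logistic stand-in for the trained intent-to-pay ML-based model are all assumptions introduced for this sketch.

```python
import math

# Assumed feature names; the claimed segregation is driven by the
# authorization information, not by a hard-coded key set.
AUTH_KEYS = {"auth_amount", "auth_decline"}

def segregate(features):
    """Split one transaction's features into authorization and
    non-authorization parts (the segregating step of claim 1)."""
    auth = {k: v for k, v in features.items() if k in AUTH_KEYS}
    non_auth = {k: v for k, v in features.items() if k not in AUTH_KEYS}
    return auth, non_auth

def temporal_embedding(auth_history):
    """Stand-in for the trained temporal ML-based model: an
    exponentially weighted summary of the authorization features."""
    state = [0.0, 0.0]
    for auth in auth_history:
        x = [auth["auth_amount"], auth["auth_decline"]]
        state = [0.7 * s + 0.3 * v for s, v in zip(state, x)]
    return state

def intent_to_pay_score(embedding, non_auth, ticket_size):
    """Stand-in for the trained intent-to-pay ML-based model: a logistic
    function over the combined embedding, i.e., the concatenation of the
    temporal embedding, non-authorization features, and ticket size."""
    combined = embedding + list(non_auth.values()) + [ticket_size]
    z = sum(combined) / (1 + len(combined))
    return 1.0 / (1.0 + math.exp(-z))
```

In practice the two stand-ins would be the trained models of the disclosure; the sketch only shows how the segregated features flow through concatenation into a single score in the interval (0, 1).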
2. The computer-implemented method as claimed in claim 1, further comprising:
accessing, by the server system, a cardholder-related dataset associated with the cardholder from the database based, at least in part, on the unique account ID, the cardholder-related dataset comprising historical information corresponding to the plurality of payment transactions performed by the cardholder with the at least one of at least one financial service provider and at least one merchant;
extracting, by the server system, the plurality of features from the cardholder-related dataset associated with each payment transaction; and
storing, by the server system, the plurality of features for each payment transaction corresponding to the unique account ID in the database.
3. The computer-implemented method as claimed in claim 2, wherein the historical information for each payment transaction comprises: the authorization information, Three Domain Secure 2.0 (3DS2) data, Buy-now-pay-later (BNPL) transaction data, postpaid merchant transaction data, and card Recurrent-Payment Cancellation Service (RPCS) data.
4. The computer-implemented method as claimed in claim 1, wherein the ongoing payment transaction is at least one of a loan request transaction from the cardholder and a loan standing instruction (SI) transaction initiated by the at least one financial service provider with an issuing bank of the cardholder.
5. The computer-implemented method as claimed in claim 1, further comprising:
receiving, via a communication interface associated with the server system, an intent-to-pay propensity request from the financial service provider for the ongoing payment transaction, the intent-to-pay propensity request indicating a request to determine a likelihood of the cardholder to perform the one or more repayments to the financial service provider for the ongoing payment transaction, the intent-to-pay propensity request comprising at least the unique account ID and the ticket size of the ongoing payment transaction.
6. The computer-implemented method as claimed in claim 1, further comprising:
transmitting, via the communication interface, the intent-to-pay score of the cardholder to the financial service provider once the intent-to-pay score is computed for the ongoing payment transaction.
7. The computer-implemented method as claimed in claim 1, further comprising:
accessing, by the server system, a training set of authorization features for each payment transaction of the plurality of payment transactions performed by a plurality of cardholders with at least one of the at least one financial service provider and the at least one merchant for a training period from the database;
generating, by the server system, training sequential data for each cardholder based, at least in part, on the training set of authorization features and a time stamp associated with each payment transaction, the training sequential data comprising one or more card-specific sequences of the plurality of payment transactions;
training, by the server system, a temporal ML-based model for obtaining the trained temporal ML-based model based, at least in part, on performing a first set of operations iteratively until the performance of the temporal ML-based model converges to a first predefined criterion, the first set of operations comprising:
generating the temporal ML-based model based, at least in part, on one or more temporal model parameters;
generating, via the temporal ML-based model, a training set of temporal embeddings for the training sequential data for each cardholder based, at least in part, on the one or more temporal model parameters;
generating, via the temporal ML-based model, a prediction for each payment transaction in the training sequential data based, at least in part, on the training set of temporal embeddings, the prediction corresponding to a hidden state of the temporal ML-based model;
computing a temporal loss value for each payment transaction in the training sequential data using a temporal loss function based, at least in part, on the prediction and an actual outcome; and
optimizing the one or more temporal model parameters based, at least in part, on backpropagating the temporal loss value.
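The first set of operations in claim 7 amounts to an iterate-until-convergence training loop. The following is a minimal sketch under strong simplifying assumptions: a one-parameter exponential-smoothing "temporal model" whose smoothing weight is fit to card-specific sequences, with a finite-difference gradient standing in for backpropagation; the data format and convergence criterion are illustrative only.

```python
def sequence_loss(w, sequences):
    """Temporal loss summed per transaction: each sequence is one
    card-specific sequence of (authorization feature, actual outcome)."""
    total = 0.0
    for seq in sequences:
        state = 0.0                    # hidden state (temporal embedding)
        for x, y in seq:
            state = w * state + (1 - w) * x   # one smoothing step
            total += (state - y) ** 2         # squared temporal loss
    return total

def train_temporal(sequences, w=0.5, lr=0.05, eps=1e-4, tol=1e-8):
    """Iterate until the loss improvement falls below a predefined
    criterion (stand-in for the claimed convergence check); the
    finite-difference gradient stands in for backpropagation."""
    prev = float("inf")
    for _ in range(1000):
        grad = (sequence_loss(w + eps, sequences)
                - sequence_loss(w - eps, sequences)) / (2 * eps)
        w -= lr * grad                 # optimize the temporal model parameter
        cur = sequence_loss(w, sequences)
        if prev - cur < tol:
            break
        prev = cur
    return w
```

A real temporal model would carry many parameters and use true backpropagation through time; the loop structure (predict, compute loss, update parameters, check convergence) is the point of the sketch.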
8. The computer-implemented method as claimed in claim 1, wherein generating the set of temporal embeddings comprises:
generating, by the server system, sequential data for the cardholder based, at least in part, on the set of authorization features corresponding to each payment transaction and a time stamp associated with each payment transaction performed by the cardholder for a predefined period, the sequential data comprising one or more card-specific sequences of the plurality of payment transactions performed by the cardholder within the predefined period;
processing, by the trained temporal ML-based model, the sequential data; and
extracting, by the server system, the set of temporal embeddings from at least one hidden layer of the trained temporal ML-based model, each temporal embedding indicating a pattern of the sequential data.
9. The computer-implemented method as claimed in claim 1, further comprising:
training, by the server system, an intent-to-pay ML-based model for obtaining the trained intent-to-pay ML-based model based, at least in part, on performing a second set of operations iteratively until the performance of the intent-to-pay ML-based model converges to a second predefined criterion, the second set of operations comprising:
generating the intent-to-pay ML-based model based, at least in part, on one or more intent-to-pay model parameters;
generating, by the intent-to-pay ML-based model, an intent-to-pay score for each payment transaction based, at least in part, on a training dataset of the set of combined embeddings and the one or more intent-to-pay model parameters;
generating, via the intent-to-pay ML-based model, a predicted outcome, based at least in part, on the intent-to-pay score;
computing an intent-to-pay loss value for each payment transaction using an intent-to-pay loss function based, at least in part, on the predicted outcome and an actual outcome; and
optimizing the one or more intent-to-pay model parameters based, at least in part, on backpropagating the intent-to-pay loss value.
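Claim 9's second set of operations can likewise be sketched as a training loop. Here a one-layer logistic model over the combined embeddings stands in for the intent-to-pay ML-based model; for this single layer, the analytic cross-entropy gradient is the entirety of the backpropagation step. The data shapes and hyperparameters are assumptions.

```python
import math

def train_intent_model(embeddings, outcomes, lr=0.1, epochs=200):
    """Fit a logistic stand-in for the intent-to-pay model.

    embeddings: list of combined-embedding vectors (training dataset)
    outcomes:   list of actual outcomes (1 = repaid, 0 = did not repay)
    """
    weights = [0.0] * len(embeddings[0])
    bias = 0.0
    for _ in range(epochs):
        for x, y in zip(embeddings, outcomes):
            z = sum(w * v for w, v in zip(weights, x)) + bias
            p = 1.0 / (1.0 + math.exp(-z))   # intent-to-pay score
            err = p - y                      # d(cross-entropy)/dz
            # optimize the model parameters (backpropagation step)
            weights = [w - lr * err * v for w, v in zip(weights, x)]
            bias -= lr * err
    return weights, bias
```

The claimed model could be any trainable architecture; the sketch shows only the score-loss-backpropagate cycle applied to combined embeddings.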
10. The computer-implemented method as claimed in claim 1, further comprising:
labeling, by the server system, the cardholder as a willing-to-pay cardholder for the ongoing payment transaction based, at least in part, on the intent-to-pay score of the cardholder being at least equal to a predefined intent threshold; and
labeling, by the server system, the cardholder as a not-willing-to-pay cardholder for the ongoing payment transaction based, at least in part, on the intent-to-pay score of the cardholder being less than the predefined intent threshold.
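The labeling rule of claim 10 reduces to a single threshold comparison; in this illustrative snippet the default of 0.5 is an assumed value for the predefined intent threshold, not one stated in the disclosure.

```python
def label_cardholder(score, threshold=0.5):
    """Claim 10's labeling rule: 'willing-to-pay' when the intent-to-pay
    score is at least equal to the predefined intent threshold,
    'not-willing-to-pay' otherwise."""
    return "willing-to-pay" if score >= threshold else "not-willing-to-pay"
```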
11. The computer-implemented method as claimed in claim 1, further comprising:
generating, by the server system, one or more recommendations for the financial service provider based, at least in part, on the intent-to-pay score of the cardholder for the ongoing payment transaction, the one or more recommendations comprising a suggestion to the financial service provider to provide a much lower ticket size option to the cardholder having the intent-to-pay score less than a predefined intent threshold.
12. A server system, comprising:
a communication interface;
a memory comprising executable instructions; and
a processor communicably coupled to the communication interface and the memory, the processor configured to cause the server system to at least:
access a plurality of features for each payment transaction from a plurality of payment transactions performed by a cardholder with at least one of at least one financial service provider and at least one merchant from a database associated with the server system based, at least in part, on a unique account identifier (ID) received from a financial service provider for an ongoing payment transaction, the unique account ID being linked to the cardholder;
segregate the plurality of features into a set of authorization features and a set of non-authorization features for each payment transaction based, at least in part, on authorization information associated with the unique account ID;
generate, by a trained temporal Machine Learning (ML)-based model associated with the server system, a set of temporal embeddings for the plurality of payment transactions based, at least in part, on the set of authorization features corresponding to each payment transaction, each temporal embedding in the set of temporal embeddings being generated for each payment transaction;
generate a set of combined embeddings for the plurality of payment transactions based, at least in part, on concatenating the set of temporal embeddings, the set of non-authorization features for each payment transaction, and a ticket size of the ongoing payment transaction; and
compute, by a trained intent-to-pay ML-based model associated with the server system, an intent-to-pay score of the cardholder for the ongoing payment transaction based, at least in part, on the set of combined embeddings, the intent-to-pay score indicating a likelihood of the cardholder to perform one or more repayments to the financial service provider for the ongoing payment transaction.
13. The server system as claimed in claim 12, wherein the server system is further caused at least to:
access a cardholder-related dataset associated with the cardholder from the database based, at least in part, on the unique account ID, the cardholder-related dataset comprising historical information corresponding to the plurality of payment transactions performed by the cardholder with at least one of at least one financial service provider and at least one merchant;
extract the plurality of features from the cardholder-related dataset associated with each payment transaction; and
store the plurality of features for each payment transaction corresponding to the unique account ID in the database.
14. The server system as claimed in claim 12, wherein the server system is further caused at least to:
receive, via the communication interface associated with the server system, an intent-to-pay propensity request from the financial service provider for the ongoing payment transaction, the intent-to-pay propensity request indicating a request to determine a likelihood of the cardholder to perform the one or more repayments to the financial service provider for the ongoing payment transaction, the intent-to-pay propensity request comprising at least the unique account ID and the ticket size of the ongoing payment transaction.
15. The server system as claimed in claim 12, wherein the server system is further caused at least to:
transmit, via the communication interface, the intent-to-pay score of the cardholder to the financial service provider.
16. The server system as claimed in claim 12, wherein the server system is further caused at least to:
access a training set of authorization features for each payment transaction of the plurality of payment transactions performed by a plurality of cardholders with the at least one of the at least one financial service provider and the at least one merchant for a training period from the database;
generate training sequential data for each cardholder based, at least in part, on the training set of authorization features and a time stamp associated with each payment transaction, the training sequential data comprising one or more card-specific sequences of the plurality of payment transactions;
train a temporal ML-based model for obtaining the trained temporal ML-based model based, at least in part, on performing a first set of operations iteratively until the performance of the temporal ML-based model converges to a first predefined criterion, the first set of operations comprising:
generating the temporal ML-based model based, at least in part, on one or more temporal model parameters;
generating, via the temporal ML-based model, a training set of temporal embeddings for the training sequential data for each cardholder based, at least in part, on the one or more temporal model parameters;
generating, via the temporal ML-based model, a prediction for each payment transaction in the training sequential data based, at least in part, on the training set of temporal embeddings, the prediction corresponding to a hidden state of the temporal ML-based model;
computing a temporal loss value for each payment transaction in the training sequential data using a temporal loss function based, at least in part, on the prediction and an actual outcome; and
optimizing the one or more temporal model parameters based, at least in part, on backpropagating the temporal loss value.
17. The server system as claimed in claim 12, wherein to generate the set of temporal embeddings, the server system is further caused at least to:
generate sequential data for the cardholder based, at least in part, on the set of authorization features corresponding to each payment transaction and a time stamp associated with each payment transaction performed by the cardholder for a predefined period, the sequential data comprising one or more card-specific sequences of the plurality of payment transactions performed by the cardholder within the predefined period;
process, by the trained temporal ML-based model, the sequential data; and
extract the set of temporal embeddings from at least one hidden layer of the trained temporal ML-based model, each temporal embedding indicating a pattern of the sequential data.
18. The server system as claimed in claim 12, wherein the server system is further caused at least to:
train an intent-to-pay ML-based model for obtaining the trained intent-to-pay ML-based model based, at least in part, on performing a second set of operations iteratively until the performance of the intent-to-pay ML-based model converges to a second predefined criterion, the second set of operations comprising:
generating the intent-to-pay ML-based model based, at least in part, on one or more intent-to-pay model parameters;
generating, by the intent-to-pay ML-based model, an intent-to-pay score for each payment transaction based, at least in part, on a training dataset of the set of combined embeddings and the one or more intent-to-pay model parameters;
generating, via the intent-to-pay ML-based model, a predicted outcome, based at least in part, on the intent-to-pay score;
computing an intent-to-pay loss value for each payment transaction using an intent-to-pay loss function based, at least in part, on the predicted outcome and an actual outcome; and
optimizing the one or more intent-to-pay model parameters based, at least in part, on backpropagating the intent-to-pay loss value.
19. The server system as claimed in claim 12, wherein the server system is further caused at least to:
label the cardholder as a willing-to-pay cardholder for the ongoing payment transaction based, at least in part, on the intent-to-pay score of the cardholder being at least equal to a predefined intent threshold; and
label the cardholder as a not-willing-to-pay cardholder for the ongoing payment transaction based, at least in part, on the intent-to-pay score of the cardholder being less than the predefined intent threshold.
20. The server system as claimed in claim 12, wherein the server system is further caused at least to:
generate one or more recommendations for the financial service provider based, at least in part, on the intent-to-pay score of the cardholder for the ongoing payment transaction, the one or more recommendations comprising a suggestion to the financial service provider to provide a much lower ticket size option to the cardholder having the intent-to-pay score less than a predefined intent threshold.
| # | Name | Date |
|---|---|---|
| 1 | 202441020643-STATEMENT OF UNDERTAKING (FORM 3) [19-03-2024(online)].pdf | 2024-03-19 |
| 2 | 202441020643-POWER OF AUTHORITY [19-03-2024(online)].pdf | 2024-03-19 |
| 3 | 202441020643-FORM 1 [19-03-2024(online)].pdf | 2024-03-19 |
| 4 | 202441020643-FIGURE OF ABSTRACT [19-03-2024(online)].pdf | 2024-03-19 |
| 5 | 202441020643-DRAWINGS [19-03-2024(online)].pdf | 2024-03-19 |
| 6 | 202441020643-DECLARATION OF INVENTORSHIP (FORM 5) [19-03-2024(online)].pdf | 2024-03-19 |
| 7 | 202441020643-COMPLETE SPECIFICATION [19-03-2024(online)].pdf | 2024-03-19 |
| 8 | 202441020643-Proof of Right [07-05-2024(online)].pdf | 2024-05-07 |