Specification
FORM 2
THE PATENTS ACT 1970
(39 of 1970)
&
The Patent Rules 2003
COMPLETE SPECIFICATION
(refer section 10 & rule 13)
TITLE OF THE INVENTION:
METHODS AND SYSTEMS FOR IMPROVING FRAUD DETECTION IN ELECTRONIC TRANSACTIONS
APPLICANT(S):
Name:
Nationality:
Address:
MASTERCARD INTERNATIONAL INCORPORATED
United States of America
2000 Purchase Street, Purchase, NY 10577, United States of America
PREAMBLE TO THE DESCRIPTION
The following specification particularly describes the invention and the manner in which it is to be performed.
DESCRIPTION
(See next page)
METHODS AND SYSTEMS FOR IMPROVING FRAUD DETECTION IN ELECTRONIC TRANSACTIONS
TECHNICAL FIELD
The present disclosure relates to artificial intelligence processing systems and, more particularly, to electronic methods and complex processing systems for improving fraud risk models for detecting fraud in electronic payment transactions.
BACKGROUND
Tracking fraud or default in payment transactions is a very challenging task. Fraudsters continually employ sophisticated techniques in online payment account fraud, such that fraudulent transactions do not appear fraudulent to the parties involved in the transactions. Fraudsters can look and behave exactly as an authentic customer might be expected to look and behave while performing online and/or offline transactions.
In existing risk control systems, the fraud and credit risk detection models used are passive in nature. Once fraud/default patterns are captured, the fraud and credit risk models are trained to react to them in the future. Such reactive strategies are no longer effective against fraudsters or emerging default patterns. Too often, financial institutions learn about fraud very late. It is no longer realistic to attempt to stop fraudsters by defining new detection rules after the fraudulent act, as one can never anticipate and respond to every new fraud pattern. This results in an immense loss of revenue and impacts customer experience negatively. The fraud risk models take into consideration a list of locations, points of sale, transaction amounts, etc., that were marked as fraudulent in the past in order to detect fraudsters and defaulters in the future.
Transactions may happen at any point in time; there is no fixed interval of time or pattern according to which a user performs a transaction. Further, there might always be some dependence on past transactions. In addition, the existing fraud risk models do not take the time of occurrence of a payment transaction into account for predicting fraudulent transactions. As a result, these models are unable to spot fraud based on the difference between the expected time of a transaction and its actual time of occurrence.
Thus, there is a technological need for a technical solution for improving existing fraud detection models with a higher degree of accuracy.
SUMMARY
Various embodiments of the present disclosure provide methods and systems for improving fraud detection in electronic transactions.
In an embodiment, a computer-implemented method is disclosed. The method includes accessing, by a server system, historical transaction data of payment transactions performed with a payment card associated with a cardholder during a particular time segment from a transaction database. In addition, the method includes determining, by the server system via implementation of a deep neural network (DNN) model, a predicted time of occurrence of an upcoming payment transaction based, at least in part, on the historical transaction data. The method includes receiving, by the server system, timestamp information associated with the actual time of occurrence of the upcoming payment transaction performed with the payment card. The method includes calculating, by the server system, a deviation value based, at least in part, on a comparison of the predicted time of occurrence with the actual time of occurrence of the upcoming payment transaction performed with the payment card. The method includes determining, by the server system, whether the upcoming payment transaction is fraudulent based, at least in part, on a concatenation of the calculated deviation value and a hidden representation. The hidden representation is generated as an output upon execution of the DNN model.
Other aspects and example embodiments are provided in the drawings and the detailed description that follows.
BRIEF DESCRIPTION OF THE FIGURES
For a more complete understanding of embodiments of the present technology, reference is now made to the following descriptions taken in connection with the accompanying drawings in which:
FIG. 1 is an example representation of an environment, related to at least some embodiments of the present disclosure;
FIG. 2 is a simplified block diagram of a server system, in accordance with one embodiment of the present disclosure;
FIG. 3 is a schematic block diagram of a process of fraud detection in payment transactions, in accordance with an embodiment of the present disclosure;
FIG. 4 is a schematic representation of a deviation-based marked temporal point process (DMTPP) model, in accordance with an embodiment of the present disclosure;
FIG. 5 is a block diagram representation of the deviation-based marked TPP model, in accordance with an embodiment of the present disclosure;
FIG. 6 is a flow chart of the training phase of the DMTPP model, in accordance with an embodiment of the present disclosure;
FIG. 7 is a flow chart of the execution phase of the DMTPP model, in accordance with an embodiment of the present disclosure;
FIG. 8 is a simplified block diagram of a payment server, in accordance with an embodiment of the present disclosure; and
FIG. 9 is a simplified block diagram of a server system, in accordance with an embodiment of the present disclosure.
The drawings referred to in this description are not to be understood as being drawn to scale except if specifically noted, and such drawings are only exemplary in nature.
DETAILED DESCRIPTION
In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. It will be apparent, however, to one skilled in the art that the present disclosure can be practiced without these specific details.
Reference in this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. The appearance of the phrase “in an embodiment” in various places in the specification is not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Moreover, various features are described which may be exhibited by some embodiments and not by others. Similarly, various requirements are described which may be requirements for some embodiments but not for other embodiments.
Moreover, although the following description contains many specifics for the purposes of illustration, anyone skilled in the art will appreciate that many variations and/or alterations to said details are within the scope of the present disclosure. Similarly, although many of the features of the present disclosure are described in terms of each other, or in conjunction with each other, one skilled in the art will appreciate that many of these features can be provided independently of other features. Accordingly, this description of the present disclosure is set forth without any loss of generality to, and without imposing limitations upon, the present disclosure.
The term "payment network", used herein, refers to a network or collection of systems used for the transfer of funds through the use of cash substitutes. Payment networks may use a variety of different protocols and procedures in order to process the transfer of money for various types of transactions. Transactions that may be performed via a payment network may include product or service purchases, credit purchases, debit transactions, fund transfers, account withdrawals, etc. Payment networks may be configured to perform transactions via cash substitutes that may include payment cards, letters of credit, checks, financial accounts, etc. Examples of networks or systems configured to perform as payment networks include those operated by Mastercard®.
The term "merchant", used throughout the description generally refers to a seller, a retailer, a purchase location, an organization, or any other entity that is in the business of selling goods or providing services, and it can refer to either a single business location or a chain of business locations of the same entity.
The terms "cardholder", “user”, and “customer” are used interchangeably throughout the description and refer to a person who holds a payment card (e.g., credit card, debit card, etc.) that will be used by a merchant to perform a payment transaction.
The terms “event”, “transaction”, and “payment transaction” are used interchangeably throughout the description and refer to a payment transaction being initiated by the cardholder.
OVERVIEW
Various embodiments of the present disclosure provide methods, systems, electronic devices, and computer program products for improving fraud detection in electronic transactions.
Artificial intelligence (AI) models help improve the accuracy of fraud risk models. There have been some limited attempts to use AI models to learn the transaction behavior of a plurality of users in order to predict fraudulent transactions. While such efforts may alleviate some false positives in fraud detection, technical limitations/problems remain in existing fraud risk models: (a) they do not consider adaptation in fraud patterns over time, and (b) since fraud does not occur at fixed time intervals, these models cannot predict the time of occurrence of fraudulent transactions.
In view of the foregoing, various embodiments of the present disclosure provide methods, systems, user devices, and computer program products for improving fraud detection in electronic payment transactions. The present disclosure predicts the inter-event time for the next payment transaction to be performed by a user and determines whether there is a deviation between the predicted inter-event time and the actual time of occurrence of the next payment transaction, to detect fraudulent payment transactions. The present disclosure uses long short-term memory (LSTM) for modeling an end-to-end nonlinear dependency over the event history. The present disclosure takes additional meta information (e.g., transaction type, amount, card not present, etc.) into account for fraud detection on real-time transactional data.
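Although the disclosure does not provide source code, the preprocessing implied above may be sketched as follows. This is a minimal illustrative sketch: the function name `inter_event_times` and the choice of hours as the time unit are assumptions, and the resulting gap sequence is the quantity a sequence model such as an LSTM would consume.

```python
from datetime import datetime

def inter_event_times(timestamps):
    """Convert a cardholder's transaction timestamps (ISO-8601 strings)
    into the sequence of inter-event gaps, in hours, between
    consecutive transactions."""
    times = sorted(datetime.fromisoformat(t) for t in timestamps)
    return [
        (later - earlier).total_seconds() / 3600.0
        for earlier, later in zip(times, times[1:])
    ]

history = [
    "2023-01-01T09:00:00",
    "2023-01-01T18:00:00",
    "2023-01-02T09:00:00",
]
gaps = inter_event_times(history)
print(gaps)  # [9.0, 15.0]
```

A trained model would then predict the next gap in this sequence, from which the predicted time of occurrence of the upcoming transaction follows by adding the gap to the last observed timestamp.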
In one example, the present disclosure describes a server system that is configured to predict fraudulent transactions by modeling the inter-dependency of a past transaction sequence. In one non-limiting example, the server system is a payment server. The server system is configured to utilize a deviation-based marked temporal point process (DMTPP) model for modeling the inter-dependency of the past transaction sequence.
Initially, the server system is configured to access historical transaction data of payment transactions performed with a payment card associated with a cardholder during a particular time segment from a transaction database. The historical transaction data may include data of payment transactions occurring at irregular intervals of time. The historical transaction data includes a time-series transaction sequence of at least one of fraudulent or non-fraudulent payment transactions performed with the payment card.
Additionally, the server system is configured to determine a predicted time of occurrence of an upcoming payment transaction based, at least in part, on the historical transaction data and a deep neural network (DNN) model. The server system is also configured to generate a feature vector corresponding to the payment transactions performed with the payment card based, at least in part, on the historical transaction data. In one non-limiting example, the DNN model is created based on a recurrent neural network (RNN) and a deviation-based marked temporal point process (DMTPP) with a conditional intensity function. In one non-limiting example, the RNN includes a long short-term memory (LSTM) module.
Further, the server system is configured to receive timestamp information associated with the actual time of occurrence of the upcoming payment transaction performed with the payment card. Furthermore, the server system is configured to calculate a deviation value based, at least in part, on a comparison of the predicted time of occurrence with the actual time of occurrence of the upcoming payment transaction performed with the payment card. The deviation value is based on a time difference between the predicted time of occurrence and the actual time of occurrence of the upcoming payment transaction. Moreover, the server system is configured to determine whether the upcoming payment transaction is fraudulent based, at least in part, on the concatenation of the calculated deviation value and a hidden representation. The hidden representation is generated as an output upon execution of the DNN model.
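The deviation-and-concatenation step described above can be sketched in a simplified form. All names, the linear-plus-sigmoid output head, and the numeric values below are illustrative assumptions; in the disclosure, the hidden representation and the classification weights would come from the trained DNN model.

```python
import math

def deviation_value(predicted_time, actual_time):
    """Absolute time difference (e.g., in hours) between the predicted
    and actual times of occurrence of the upcoming transaction."""
    return abs(actual_time - predicted_time)

def fraud_probability(deviation, hidden_repr, weights, bias):
    """Concatenate the deviation with the model's hidden representation,
    then apply a linear layer followed by a sigmoid to obtain a fraud
    probability. Weights and bias are illustrative placeholders."""
    features = [deviation] + list(hidden_repr)
    z = sum(w * f for w, f in zip(weights, features)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# A transaction predicted 14h after the last one actually arrives 2.5h later.
dev = deviation_value(predicted_time=14.0, actual_time=2.5)
prob = fraud_probability(dev, hidden_repr=[0.1, -0.4],
                         weights=[0.3, 1.0, 1.0], bias=-2.0)
```

A large deviation (a transaction far earlier or later than the cardholder's learned rhythm) pushes the score toward the fraudulent class.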
In one embodiment, the DNN model is trained prior to execution. The training includes accessing a time series sequence of past payment transaction data and corresponding past transaction markers associated with a plurality of payment transactions from the transaction database. The training is performed by implementing a plurality of operations in an iterative manner. The plurality of operations includes generating training vectors corresponding to the plurality of payment transactions based, at least in part, on the time-series sequence of past payment transaction data along with the corresponding past transaction markers.
In addition, the plurality of operations includes providing the time-series sequence of past payment transaction data along with the corresponding past transaction markers as input to the DNN model. Further, the plurality of operations includes determining the next inter-event transaction time and the next marker corresponding to each payment transaction of the time series sequence of past payment transaction data.
Furthermore, the plurality of operations includes updating neural network parameters (e.g., weights and biases) of the DNN model based, at least in part, on a loss function. The loss function depends on a deviation parameter indicating the time difference between the next inter-event transaction time and the actual inter-event transaction time.
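The disclosure does not give the loss function in closed form. The toy sketch below assumes a mean-squared deviation between predicted and actual inter-event times, with a single scalar parameter standing in for the DNN's weights and biases and updated by gradient descent; the real training would update full network parameters.

```python
def deviation_loss(predicted_gaps, actual_gaps):
    """Mean squared deviation between predicted and actual inter-event
    transaction times; a simplified stand-in for the loss function."""
    n = len(predicted_gaps)
    return sum((p - a) ** 2
               for p, a in zip(predicted_gaps, actual_gaps)) / n

def train_scale(actual_gaps, epochs=200, lr=0.01):
    """Toy training loop: a single scale parameter maps each observed
    gap to a predicted next gap, updated iteratively by the gradient
    of the deviation-based loss."""
    scale = 0.0
    prev, nxt = actual_gaps[:-1], actual_gaps[1:]
    for _ in range(epochs):
        preds = [scale * p for p in prev]
        grad = sum(2.0 * (y_hat - y) * p
                   for y_hat, y, p in zip(preds, nxt, prev)) / len(prev)
        scale -= lr * grad
    return scale
```

For a perfectly regular sequence of gaps, the learned scale converges toward 1.0, i.e., the model predicts the next gap to equal the previous one.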
Various embodiments of the present disclosure offer multiple advantages and technical effects. For instance, the present disclosure provides a system for improving fraud detection in electronic payment transactions using machine learning algorithms. In addition, the system calculates a first fraud risk score using one or more fraud risk models and calculates a second fraud risk score using the DMTPP model. By performing a fusion of the first and second fraud risk scores, the system improves fraud detection in electronic payment transactions.
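The fusion of the first and second fraud risk scores mentioned above is not specified in detail; one simple possibility is a convex combination, sketched below. The weighting parameter `alpha` is an assumption, not something given by the disclosure.

```python
def fuse_scores(first_score, second_score, alpha=0.5):
    """Fuse the score from the existing fraud risk models with the
    DMTPP-based score as a convex combination. `alpha` in [0, 1]
    controls the relative weight of the first score."""
    return alpha * first_score + (1.0 - alpha) * second_score

fused = fuse_scores(first_score=0.2, second_score=0.8)
```

Other fusion strategies (e.g., taking the maximum, or feeding both scores into a second-stage classifier) would also fit the description; the convex combination is merely the simplest instance.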
Various embodiments of the present disclosure are described hereinafter with reference to FIGS. 1 to 9.
FIG. 1 illustrates an exemplary representation of an environment 100 related to at least some embodiments of the present disclosure. Although the environment 100 is presented in one arrangement, other embodiments may include the parts of the environment 100 (or other parts) arranged otherwise depending on, for example, improving fraud risk models employed for detecting fraudulent card-present and/or card-not-present payment transactions. The environment 100 generally includes a server system 102, a plurality of user devices 104a, 104b, and 104c associated with a plurality of cardholders 106a, 106b, and 106c, an issuer server 108, and a payment network 112 including a payment server 114, each coupled to, and in communication with (and/or with access to) a network 110. The network 110 may include, without limitation, a light fidelity (Li-Fi) network, a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), a satellite network, the Internet, a fiber optic network, a coaxial cable network, an infrared (IR) network, a radio frequency (RF) network, a virtual network, and/or another suitable public and/or private network capable of supporting communication among the entities illustrated in FIG. 1, or any combination thereof. The terms "customer", "user", or "cardholder" may be interchangeable throughout the description. These terms may relate to a direct customer of an issuer (e.g., the issuer server 108) or a person or entity that has the authorization to act on behalf of the direct customer or user (i.e., indirect customer).
Various entities in the environment 100 may connect to the network 110 in accordance with various wired and wireless communication protocols, such as Transmission Control Protocol and Internet Protocol (TCP/IP), User Datagram Protocol (UDP), 2nd Generation (2G), 3rd Generation (3G), 4th Generation (4G), 5th Generation (5G) communication protocols, Long Term Evolution (LTE) communication protocols, any combination thereof or any future communication protocols. For example, the network 110 may include multiple different networks, such as a private network or a public network (e.g., the Internet, etc.) through which the server system 102 and the payment server 114 may communicate.
In one embodiment, the payment network 112 may be used by the payment card issuing authorities as a payment interchange network. Examples of payment interchange networks include but are not limited to, Mastercard® payment system interchange network. The Mastercard® payment system interchange network is a proprietary communications standard promulgated by Mastercard International Incorporated® for the exchange of electronic payment transaction data between issuers and acquirers that are members of Mastercard International Incorporated®. (Mastercard is a registered trademark of Mastercard International Incorporated located in Purchase, N.Y.).
The issuer server 108 is a computing server that is associated with the issuer bank. The issuer bank is a financial institution that manages accounts of multiple users. Account details of the accounts established with the issuer bank are stored in user profiles of the users in a memory of the issuer server 108 or on a cloud server associated with the issuer server 108.
In one embodiment, the issuer server 108 is associated with a financial institution normally called an "issuer bank" or "issuing bank" or simply "issuer", in which the plurality of cardholders 106a, 106b, and 106c, may have at least one payment account, (which also issues payment cards, such as credit cards or debit cards), and provides banking services (e.g., payment transaction using credit/debit cards) for processing payment transactions using a card, to the plurality of cardholders 106a, 106b, and 106c. More specifically, each of the plurality of customers 106a, 106b, and 106c may be any individual buyer, representative of a corporate entity, or any other person that is presenting a credit or debit card during a payment transaction with a merchant representative or other seller. In one embodiment, the plurality of customers 106a, 106b, and 106c can perform card present (CP) or card-not-present (CNP) transactions.
In an embodiment, the cardholder (i.e., “the plurality of cardholders 106a, 106b, or 106c”) may operate a user device (e.g., user devices 104a, 104b, or 104c) to conduct a payment transaction through a payment gateway application. Generally, “payment transaction” is an agreement that is carried out between a buyer and a seller to exchange assets as a form of payment (e.g., cash, currency, etc.). In another embodiment, the cardholder (e.g., “the cardholder 106a”) may use a payment card (e.g., “swipe” or present a payment card) at a point-of-sale (POS) terminal. The cardholder (e.g., “the cardholder 106a”) may be any individual, representative of a corporate entity, non-profit organization, or any other person that is presenting a credit or debit card during an electronic payment transaction. The cardholder (e.g., “the cardholder 106a”) may have a payment account issued by an issuing bank (associated with the issuer server 108) and may be provided a payment card with financial or other account information encoded onto the payment card such that the cardholder (i.e., “the cardholder 106a”) may use the payment card to initiate and complete a transaction using a bank account at the issuing bank.
The user device 104a is a communication device of the cardholder (e.g., “the cardholder 106a”). The cardholder 106a uses the user device 104a to access a mobile application or a website of the issuer server 108, or any third-party payment application. The terms “user device” and “mobile device” are used interchangeably throughout the present description. The user device 104a may be any electronic device such as, but not limited to, a personal computer (PC), a tablet device, a Personal Digital Assistant (PDA), a voice-activated assistant, a Virtual Reality (VR) device, a smartphone, and a laptop.
The environment 100 also includes a transaction database 116 communicatively coupled to the server system 102. In one embodiment, the transaction database 116 may include multifarious data, for example, social media data, Know Your Customer (KYC) data, payment data, trade data, employee data, Anti Money Laundering (AML) data, market abuse data, Foreign Account Tax Compliance Act (FATCA) data, and fraud risk score data.
In an example, the transaction database 116 stores user profile data associated with each cardholder (e.g., the cardholder 106a). In one embodiment, the user profile data may include account balance, credit line, details of the cardholder (e.g., “the cardholder 106a”), account identification information, payment card number, or the like. Further, the details of the cardholder 106a may include, but are not limited to, name, age, gender, physical attributes, location, registered contact number, family information, alternate contact number, registered e-mail address, or the like of the cardholder 106a.
In another example, the transaction database 116 stores real-time transaction data of the plurality of cardholders 106a, 106b, and 106c. The transaction data may include, but is not limited to, transaction attributes, such as transaction amount, source of funds such as bank or credit cards, transaction channel used for loading funds such as POS terminal or ATM machine, transaction velocity features such as count and transaction amount sent in the past ‘x’ number of days to a particular user, transaction location information, external data sources, and other internal data to evaluate each transaction.
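The transaction velocity features mentioned above (count and amount sent in the past ‘x’ days) can be computed as sketched below. The function name, the pair-based record format, and the default window are illustrative assumptions.

```python
from datetime import datetime, timedelta

def velocity_features(transactions, now, days=7):
    """Compute velocity features: the count and total amount of
    transactions within the past `days` days before `now`.
    `transactions` is a list of (iso_timestamp, amount) pairs."""
    cutoff = now - timedelta(days=days)
    recent = [amount for ts, amount in transactions
              if datetime.fromisoformat(ts) >= cutoff]
    return {"count": len(recent), "total_amount": sum(recent)}

features = velocity_features(
    [("2023-06-08T12:00:00", 50.0), ("2023-06-01T12:00:00", 20.0)],
    now=datetime(2023, 6, 10),
)
```

Here only the first transaction falls within the 7-day window, so the feature values are a count of 1 and a total of 50.0.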
In one embodiment, the server system 102 is configured to perform one or more of the operations described herein. The server system 102 may be a computing server configured to improve fraud risk models by utilizing a deep neural network (DNN) model. The DNN model is a deviation-based marked temporal point process (DMTPP) model. The terms “DNN model” and “DMTPP model” may be used interchangeably throughout the description. The server system 102 is configured to model the temporal inter-dependency of transaction behavior to predict whether a payment transaction performed in real time is fraudulent. In one example, the server system 102 is configured to train the DMTPP model based on historical transaction data of the plurality of cardholders 106a, 106b, and 106c.
The number and arrangement of systems, devices, and/or networks shown in FIG. 1 are provided as an example. There may be additional systems, devices, and/or networks; fewer systems, devices, and/or networks; different systems, devices, and/or networks; and/or differently arranged systems, devices, and/or networks than those shown in FIG. 1. Furthermore, two or more systems or devices shown in FIG. 1 may be implemented within a single system or device, or a single system or device shown in FIG. 1 may be implemented as multiple, distributed systems or devices. Additionally, or alternatively, a set of systems (e.g., one or more systems) or a set of devices (e.g., one or more devices) of the environment 100 may perform one or more functions described as being performed by another set of systems or another set of devices of the environment 100.
Referring now to FIG. 2, a simplified block diagram of a server system 200 is shown, in accordance with an embodiment of the present disclosure. The server system 200 is an example of the server system 102. In one embodiment, the server system 200 is a part of the payment network 112 or integrated within the payment server 114. In some embodiments, the server system 200 is embodied as a cloud-based and/or SaaS-based (software as a service) architecture.
The server system 200 includes a computer system 202 and a database 206. The computer system 202 includes at least one processor 204 for executing instructions, a communication interface 216, a user interface 218, a memory 220, and a storage interface 222 that communicate with each other via a bus 224. The processor 204 includes a data pre-processing engine 208, a risk decision engine 210, a neural network engine 212, and a fusion engine 214.
The processor 204 includes suitable logic, circuitry, and/or interfaces to execute operations for accessing various transaction data and utilizing trained machine learning models. Examples of the processor 204 include, but are not limited to, an application-specific integrated circuit (ASIC) processor, a reduced instruction set computing (RISC) processor, a graphical processing unit (GPU), a complex instruction set computing (CISC) processor, a field-programmable gate array (FPGA), and the like. The memory 220 includes suitable logic, circuitry, and/or interfaces to store a set of computer-readable instructions for performing operations. Examples of the memory 220 include a random-access memory (RAM), a read-only memory (ROM), a removable storage drive, a hard disk drive (HDD), and the like. It will be apparent to a person skilled in the art that the scope of the disclosure is not limited to realizing the memory 220 in the server system 200, as described herein. In another embodiment, the memory 220 may be realized in the form of a database server or a cloud storage working in conjunction with the server system 200, without departing from the scope of the present disclosure.
The processor 204 is operatively coupled to the communication interface 216, such that the processor 204 is capable of communicating with a remote device 230 such as the issuer server 108 or communicating with any entity connected to the network 110 (as shown in FIG. 1).
It is noted that the server system 200 as illustrated and hereinafter described is merely illustrative of an apparatus that could benefit from embodiments of the present disclosure and, therefore, should not be taken to limit the scope of the present disclosure. It is noted that the server system 200 may include fewer or more components than those depicted in FIG. 2.
In some embodiments, the database 206 is integrated within the computer system 202. For example, the computer system 202 may include one or more hard disk drives as the database 206. The storage interface 222 is any component capable of providing the processor 204 with access to the database 206. The storage interface 222 may include, for example, an Advanced Technology Attachment (ATA) adapter, a Serial ATA (SATA) adapter, a Small Computer System Interface (SCSI) adapter, a RAID controller, a SAN adapter, a network adapter, and/or any component providing the processor 204 with access to the database 206. In one non-limiting example, the database 206 is configured to store a deviation-based marked temporal point process (DMTPP) model 228.
In one embodiment, the processor 204 includes a data pre-processing engine 208, a risk decision engine 210, a neural network engine 212, and a fusion engine 214. It should be noted that the components, described herein, can be configured in a variety of ways, including electronic circuitries, digital arithmetic and logic blocks, and memory systems in combination with software, firmware, and embedded technologies.
The data pre-processing engine 208 includes suitable logic and/or interfaces for accessing historical transaction data or past transaction sequence of payment transactions performed with a payment card associated with a cardholder (e.g., the cardholder 106a) during a particular time segment (e.g., daily, weekly, monthly, annually, etc.) from the transaction database 116. The data pre-processing engine 208 then performs the featurization process over the historical transaction data for generating a feature vector corresponding to the payment transactions performed with the payment card based, at least in part, on the historical transaction data.
The past transaction sequence may denote payment transactions performed at different times (i.e., irregular times) in the past for the particular time segment (e.g., 3 months). Each transaction record is augmented with any number of predictive variables and their values. Any previous time period may be used to gather the historical transaction data from which the past transaction sequence is created. In one example, the time segment extends from the current time and date when the past transaction sequence is created (or a practical equivalent such as the morning of the current day or the day before) to a previous date such as three to six months ago as mentioned. By including data up until the current date, fraud risk models will be trained, validated, and tested using the most current transaction data that best reflects any recent fraud patterns.
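Selecting the past transaction sequence for a given time segment can be sketched as a simple windowing step. The function name, the record format, and the 30-day approximation of a month are illustrative assumptions.

```python
from datetime import datetime, timedelta

def in_time_segment(transactions, end, months=3):
    """Keep the transactions whose timestamp falls within the last
    `months` months before `end` (a month is approximated as 30 days
    here for simplicity)."""
    start = end - timedelta(days=30 * months)
    return [t for t in transactions if start <= t["time"] <= end]

segment = in_time_segment(
    [{"time": datetime(2023, 5, 1)}, {"time": datetime(2022, 1, 1)}],
    end=datetime(2023, 6, 1),
)
```

With a 3-month segment ending 1 June 2023, only the May 2023 transaction is retained; the 2022 transaction falls outside the window.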
In an example, during the training phase, the data pre-processing engine 208 may be configured to access a time-series sequence of past payment transaction data and corresponding past transaction markers associated with a plurality of payment transactions from the transaction database 116. In one example, the issuer server 108 may assign the plurality of payment cards to the plurality of cardholders 106a, 106b, and 106c. The time-series sequence of past payment transaction data includes time-series information of transaction time (i.e., time of transaction) and transaction marker (e.g., fraudulent, or non-fraudulent) corresponding to each payment transaction performed with each of the plurality of payment cards.
The time-series sequence of past payment transaction data may be accessed for an interval of time (such as 1 year, 2 years, 3 years, etc.). In some examples, the data pre-processing engine 208 is configured to generate training vectors corresponding to the plurality of payment transactions based, at least in part, on the time-series sequence of past payment transaction data along with the corresponding past transaction markers.
In an example, the data pre-processing engine 208 is configured to generate the feature vector based at least on the customer spending behaviors, payment behavior, and customer credit bureau information (for example, credit score), etc.
In one embodiment, the historical transaction data includes not only past transaction sequences but also numerous predictive variables and a binary flag indicating whether or not the payment transaction was fraudulent (the “fraud” variable). In general, a fraudulent payment transaction may be reported by the cardholder 106a sometime after the payment transaction occurs, may be reported by the cardholder's bank, may be indicated by a chargeback, and may be discovered in other ways. The binary flag may be set for a transaction in the database at any suitable time. Predictive variables known in the art include, for example, whether the dollar amount of the transaction is in a particular range, how many orders have been completed by the particular user device in the last thirty days, and the like.
In one embodiment, the data pre-processing engine 208 is configured to generate a fraud feature vector for the past transaction sequence based on the binary flag associated with each payment transaction included in the past transaction sequence. The fraud feature vector is utilized for modeling the deviation-based marked TPP model (i.e., the DMTPP model 228).
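One way such a fraud feature vector could be assembled from the per-transaction binary flags is sketched below; the exact composition (raw flag sequence plus fraud count and fraud rate) is an assumption for illustration, not the disclosed design.

```python
def fraud_feature_vector(flags):
    """Binary fraud flags of a past transaction sequence -> feature vector.

    A minimal sketch: the 0/1 marker sequence itself, plus summary
    statistics (fraud count and fraud rate) that a downstream model
    could consume alongside the raw sequence.
    """
    flags = [int(bool(f)) for f in flags]
    count = sum(flags)
    rate = count / len(flags) if flags else 0.0
    return flags + [count, rate]
```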
In an example, the historical transaction data represents spend transactions performed by cardholders across various merchant categories such as grocery, airlines, and the like. In another example, the historical transaction data includes spending transactions across various merchant industries such as retail clothing, hotel industry, and the like. In yet another example, the historical transaction data includes spend transactions across various locations where the spend transactions occurred, and payment transaction types such as contactless, card present, and the like.
In one embodiment, the data pre-processing engine 208 is configured to provide the past transaction sequence along with the predictive variables to the risk decision engine 210 for modeling the one or more fraud risk models 226 based on the historical transaction data.
In one embodiment, the risk decision engine 210 includes suitable logic and/or interfaces for predicting a fraud risk score associated with a payment transaction based at least on the one or more fraud risk models 226. The one or more fraud risk models 226 are trained based on the historical transaction data of the plurality of cardholders. As is known, a variety of machine learning technologies (including nonlinear systems) may be used to implement one or more fraud risk models in the detection of fraudulent transactions. These fraud risk models may include a regression model, a neural network, a decision tree, a rule-based model, a random forest model, a support vector machine model, and the like.
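To make the rule-based variety of fraud risk model concrete, here is a toy scoring function. The predictive variables (`amount`, `orders_last_30d`, `new_device`), thresholds, and weights are all hypothetical; a production model of any of the listed types would be trained on the historical transaction data rather than hand-written.

```python
def rule_based_fraud_score(txn):
    """A toy rule-based fraud risk model of the kind listed above.

    `txn` is a dict of hypothetical predictive variables; the score in
    [0, 1] rises with each triggered rule. Thresholds and weights are
    illustrative, not taken from any real model.
    """
    score = 0.0
    if txn.get("amount", 0.0) > 1000.0:      # unusually large amount
        score += 0.4
    if txn.get("orders_last_30d", 0) > 20:   # burst of recent orders
        score += 0.3
    if txn.get("new_device", False):         # previously unseen device
        score += 0.3
    return min(score, 1.0)
```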
In one example, the one or more fraud risk models 226 may represent Mastercard® Decision Intelligence (registered as a trademark) product powered by Mastercard® that provides decision and fraud detection service. The product uses artificial intelligence technology to help financial institutions increase the accuracy of real-time approvals of genuine transactions and reduce false declines.
The one or more fraud risk models 226 are trained based upon calculated performance metrics in view of the operational objectives of the enterprise and the client. In this specific example, the three metrics of sensitivity, false positive rate, and manual review rate are not independent; choosing a value for one of these metrics for a particular fraud risk model necessarily dictates a value for each of the other two metrics, and thus dictates how the model will perform in terms of sensitivity, false positives, and review rate. Also, each performance metric for a particular fraud risk model is not necessarily a single value, but rather a range of values, one of which may be selected in order to dictate how the model performs.
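The coupling among these three metrics can be illustrated with a small calculation. The two-threshold operating scheme below (decline above one threshold, manually review between the two) and the metric definitions are illustrative assumptions; moving either threshold shifts all three metrics at once.

```python
def performance_metrics(scores, labels, review_threshold, decline_threshold):
    """Sensitivity, false-positive rate, and manual-review rate at one
    operating point of a hypothetical fraud risk model.

    Transactions scoring at or above `decline_threshold` are flagged as
    fraud; those between the two thresholds go to manual review.
    """
    flagged = [s >= decline_threshold for s in scores]
    reviewed = [review_threshold <= s < decline_threshold for s in scores]
    tp = sum(f and y for f, y in zip(flagged, labels))         # true positives
    fp = sum(f and not y for f, y in zip(flagged, labels))     # false positives
    pos = sum(labels)
    neg = len(labels) - pos
    sensitivity = tp / pos if pos else 0.0
    false_positive_rate = fp / neg if neg else 0.0
    review_rate = sum(reviewed) / len(scores) if scores else 0.0
    return sensitivity, false_positive_rate, review_rate
```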
In real time, the risk decision engine 210 is configured to generate a fraud risk score associated with real-time payment transactions performed by the cardholder 106a. Further, the data pre-processing engine 208 is configured to provide the past transaction sequence, the fraud feature vector, and/or predictive variables associated with the past transaction sequence to the deviation-based marked TPP model (see, DMTPP model 228).
The neural network engine 212 includes suitable logic and/or interfaces for determining a predicted time of occurrence of an upcoming payment transaction based, at least in part, on the historical transaction data and the DMTPP model 228. The neural network engine 212 is further configured to determine whether the upcoming payment transaction is fraudulent or not based, at least in part, on a deviation between the predicted time of occurrence of the upcoming payment transaction and the actual time of occurrence at which the payment transaction is executed. In one non-limiting example, the neural network engine 212 implements the DMTPP model 228. The DMTPP model 228 is based on a recurrent neural network (RNN) and a deviation-based marked temporal point process. In one non-limiting example, the RNN includes a long short-term memory (LSTM) module.
In general, a recurrent neural network is a class of artificial neural networks. In addition, recurrent neural networks include a memory state to remember past data and decisions taken by the network. The recurrent neural network is a feed-forward neural network structure to which additional edges, referred to as recurrent edges, are added such that the outputs from the hidden units at the current time step are fed back into the hidden units as inputs at the next time step. In addition, temporal point process (TPP) models facilitate the modeling of event sequences that do not occur at regular time intervals. In an example, TPP models are used in modeling social media activities, financial transactions, the occurrence of earthquakes, and the like.
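The recurrent edge described above can be sketched with a minimal scalar recurrent cell; the scalar weights and `tanh` activation are illustrative simplifications, and an LSTM, as used by the DMTPP model 228, adds gating on top of this same recurrence.

```python
import math

def rnn_step(x, h_prev, w_x, w_h, b):
    """One step of a plain recurrent cell: the previous hidden state
    h_prev is fed back in alongside the current input x (the
    'recurrent edge')."""
    return math.tanh(w_x * x + w_h * h_prev + b)

def run_rnn(sequence, w_x=0.5, w_h=0.8, b=0.0):
    """Unroll the cell over a sequence, threading the hidden state
    forward so each step remembers the past."""
    h = 0.0
    for x in sequence:
        h = rnn_step(x, h, w_x, w_h, b)
    return h
```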
In general, TPPs have mostly been used to predict the time of occurrence of the next event, with limited focus on the type/category of the event, termed the marker. Further, limited focus has been given to modeling the inter-dependency of the event time and marker information for more accurate predictions.
In one embodiment, the TPP models are used to predict both the time of occurrence of the next event and the type or category of the next event. The type or category of the next event is also termed a marker. The TPP models learn the inter-dependency between the time of the event and the prediction of the marker to make accurate predictions.
In general, the temporal point process is a stochastic process that models a sequence of discrete events occurring in a continuous time interval. In addition, TPP is modeled using a conditional intensity function. The conditional intensity function measures the number of events that are expected in a specific time interval, given the historical sequence of event information. Generally, the historical sequences are modeled to predict the occurrence of the next event, and a categorical value is associated with it, referred to as the event marker.
Mathematically, the intensity function of a TPP is defined as the probability that an event will occur in the [t, t + dt] time interval given the event history h_t till time t:

λ*(t)dt = λ(t | h_t) = P(event in [t, t + dt] | h_t) … Eqn. (1)
where dt refers to a small window of time, and P(·) refers to the probability function. Further, the conditional density function f(t | h_t) of an event occurring at time t can be specified as:

f(t | h_t) = λ*(t) exp(−∫_{t_n}^{t} λ*(s) ds) … Eqn. (2)
where t_n refers to the time of the last event and dt corresponds to an infinitesimally small window of time. The conditional intensity function is generally modeled using various parametric forms.
In one embodiment, the Poisson process is used. In the Poisson process, events are assumed to be independent of their history, such that λ(t | h_t) = λ(t).
In another embodiment, the Hawkes process is used. In the Hawkes process, the conditional intensity function incorporates a time-decay kernel to take the event history into account. The intensity function is assumed to be a linear function (φ(·)) of the history, along with a base intensity value (λ_0) and a weight parameter (a), as:

λ*(t) = λ_0 + a Σ_{t_j < t} φ(t − t_j) … Eqn. (3)
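The Hawkes intensity above can be evaluated directly once a kernel is chosen. The snippet below assumes an exponential time-decay kernel φ(s) = exp(−βs) and illustrative parameter values; both are common modeling choices, not values taken from this disclosure.

```python
import math

def hawkes_intensity(t, event_times, lam0=0.2, a=0.8, beta=1.0):
    """Evaluate a Hawkes conditional intensity at time t with an
    exponential time-decay kernel phi(s) = exp(-beta * s).

    Every past event t_j < t excites the intensity by a * phi(t - t_j),
    and the excitation decays as time passes since the event.
    """
    return lam0 + a * sum(
        math.exp(-beta * (t - tj)) for tj in event_times if tj < t
    )
```

Note how, with no past events, the intensity falls back to the base value λ_0, while each recent event raises it temporarily, capturing the self-exciting, bursty arrival pattern typical of payment transactions.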
Documents

Application Documents

| # | Name | Date |
| --- | --- | --- |
| 1 | 202141037610-STATEMENT OF UNDERTAKING (FORM 3) [19-08-2021(online)].pdf | 2021-08-19 |
| 2 | 202141037610-PROVISIONAL SPECIFICATION [19-08-2021(online)].pdf | 2021-08-19 |
| 3 | 202141037610-POWER OF AUTHORITY [19-08-2021(online)].pdf | 2021-08-19 |
| 4 | 202141037610-FORM 1 [19-08-2021(online)].pdf | 2021-08-19 |
| 5 | 202141037610-DRAWINGS [19-08-2021(online)].pdf | 2021-08-19 |
| 6 | 202141037610-DECLARATION OF INVENTORSHIP (FORM 5) [19-08-2021(online)].pdf | 2021-08-19 |
| 7 | 202141037610-Correspondence_Power of Attorney_26-08-2021.pdf | 2021-08-26 |
| 8 | 202141037610-Proof of Right [18-11-2021(online)].pdf | 2021-11-18 |
| 9 | 202141037610-Correspondence And Assignment_06-12-2021.pdf | 2021-12-06 |
| 10 | 202141037610-DRAWING [17-08-2022(online)].pdf | 2022-08-17 |
| 11 | 202141037610-CORRESPONDENCE-OTHERS [17-08-2022(online)].pdf | 2022-08-17 |
| 12 | 202141037610-COMPLETE SPECIFICATION [17-08-2022(online)].pdf | 2022-08-17 |
| 13 | 202141037610-FORM 18 [11-08-2025(online)].pdf | 2025-08-11 |