
Registration And Verification Of Biometric Modalities Using Encryption Techniques In A Deep Neural Network

Abstract: Conventionally, biometric template protection has been sought, alongside improved matching performance and high levels of security, through the use of deep convolutional neural network models. However, such attempts have prominent security limitations: the model mapping images to binary codes is stored in an unprotected form. Given this model and access to the stolen protected templates, an adversary can exploit the False Accept Rate (FAR) of the system. Secondly, once the server system is compromised, all the users need to be re-enrolled. Unlike conventional systems and approaches, the present disclosure provides systems and methods that implement encrypted deep neural network(s) for biometric template protection during enrollment and verification, wherein the encrypted deep neural network(s) is utilized for mapping feature vectors to a randomly generated binary code, and the deep neural network model learnt is itself encrypted, thus achieving security and privacy for data protection.


Patent Information

Filing Date: 08 January 2020
Publication Number: 28/2021
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Grant Date: 25 April 2024

Applicants

Tata Consultancy Services Limited
Nirmal Building, 9th Floor, Nariman Point, Mumbai - 400021, Maharashtra, India

Inventors

1. JINDAL, Arun Kumar
Tata Consultancy Services Limited, Block C, Kings Canyon, ASF Insignia, Gurgaon - Faridabad Road, Gawal Pahari, Gurgaon - 122003, Haryana, India
2. SHAIK, Imtiyazuddin
Tata Consultancy Services Limited, Deccan Park, Cubicle: GS2-10, Plot No 1, Survey No. 64/2, Software Units Layout, Serilingampally Mandal, Madhapur, Hyderabad - 500081, Telangana, India
3. NARUMANCHI, Harika
Tata Consultancy Services Limited, 2nd Floor - Block A - Phase II, IIT Madras Research Park, Kanagam Road, Tharamani, Chennai - 600113, Tamil Nadu, India
4. KUMARI, Vasudha
Tata Consultancy Services Limited, Tata Research Development & Design Centre, 54-B, Hadapsar Industrial Estate, Hadapsar, Pune - 411013, Maharashtra, India
5. CHALAMALA, Srinivasa Rao
Tata Consultancy Services Limited, GS2-55, Deccan Park, Plot No 1, Survey No. 64/2, Software Units Layout, Serilingampally Mandal, Madhapur, Hyderabad - 500081, Telangana, India
6. BHATTACHAR, Rajan Mindigal Alasingara
Tata Consultancy Services Limited, Unit-III, No 18, SJM Towers, Seshadri Road, Gandhinagar, Bangalore - 560 009, Karnataka, India
7. LODHA, Sachin Premsukh
Tata Consultancy Services Limited, Tata Research Development & Design Centre, 54-B, Hadapsar Industrial Estate, Hadapsar, Pune - 411013, Maharashtra, India

Specification

Claims: 1. A processor implemented method, comprising: capturing (202), via one or more hardware processors, a first image comprising a biometric modality of a user for enrollment; pre-processing (204), via the one or more hardware processors, the first captured image comprising the biometric modality to obtain a first set of augmented images; generating (206), via the one or more hardware processors, a feature vector for (a) each of the first set of augmented images and (b) the first captured image comprising the biometric modality of the user, and encrypting the generated feature vector thereof using a fully homomorphic encryption (FHE) technique; generating (208), via the FHE technique executed by the one or more hardware processors, an encrypted reduced dimensionality feature vector using (i) the encrypted feature vector for (a) each of the first set of augmented images and (b) the first captured image comprising the biometric modality of the user, and (ii) an encrypted Random Projection Matrix (RPM) being identified and assigned to the user; and mapping (210), via an encrypted deep neural network executed by the one or more hardware processors, the encrypted reduced dimensionality feature vector for (a) each of the first set of augmented images and (b) the first captured image comprising the biometric modality of the user with a corresponding randomly generated binary code assigned to the user, wherein during the mapping, one or more encrypted parameters of a mapping network model are learnt by the encrypted deep neural network, and wherein a key pair is generated, the key pair comprising a private key and a public key. 2. The processor implemented method as claimed in claim 1, wherein size of an input layer (fc1_size) of the encrypted deep neural network is equal to size of the encrypted reduced dimensionality feature vector. 3. 
The processor implemented method as claimed in claim 1, wherein size of an output layer of the deep neural network is based on number of bits comprised in each of a plurality of binary codes assigned to the user. 4. The processor implemented method as claimed in claim 1, wherein (i) the encrypted RPM and (ii) the one or more encrypted parameters of the mapping network model, and (iii) the corresponding randomly generated binary code in an encrypted form along with a unique label assigned to the user are stored in a database comprised in the memory. 5. The processor implemented method as claimed in claim 1, further comprising: capturing (212) a second image comprising a biometric modality of the user for validating an identity of the user; pre-processing (214) the captured second image comprising the biometric modality to obtain a second set of augmented images; generating (216), via the one or more hardware processors, a feature vector for (i) each of the second set of augmented images and (ii) the second captured image comprising the biometric modality of the user; performing (218), by using the generated public key, the FHE technique on the feature vector for (i) each of the second set of augmented images and (ii) the second captured image comprising the biometric modality of the user to obtain an encrypted feature vector for (a) each of the second set of augmented images and (b) the second captured image comprising the biometric modality of the user; generating (220) an encrypted reduced dimensionality feature vector for (a) each of the second set of augmented images and (b) the second captured image comprising the biometric modality of the user based on (i) the encrypted feature vector for (a) each of the second set of augmented images and (b) the second captured image comprising the biometric modality of the user and (ii) the encrypted RPM assigned to the user during the enrollment; generating (222), via the encrypted deep neural network, a set of encrypted 
prediction based binary codes based on (i) the encrypted reduced dimensionality feature vector for (a) each of the second set of augmented images and (b) the second captured image comprising the biometric modality of the user and (ii) the one or more encrypted parameters learnt by the deep neural network; decrypting (224), via the encrypted deep neural network, the set of encrypted prediction based binary codes by using the generated private key to obtain a set of decrypted binary codes; performing (226), via the encrypted deep neural network, a cryptographic hash function on the set of decrypted binary codes to obtain a set of cryptographic hash of binary code and performing a comparison of each of the set of cryptographic hash of binary code with a corresponding cryptographic hash of the randomly generated binary code assigned to the user comprised in the memory; and validating (228), via the encrypted deep neural network, the identity of the user based on the comparison. 6. A system (100), comprising: a memory (102) storing instructions; one or more communication interfaces (106); and one or more hardware processors (104) coupled to the memory (102) via the one or more communication interfaces (106), wherein the one or more hardware processors (104) are configured by the instructions in a trusted execution environment (TEE) to: capture a first image comprising a biometric modality of a user for enrollment; pre-process the first captured image comprising the biometric modality to obtain a first set of augmented images; generate, via the one or more hardware processors, a feature vector for (a) each of the first set of augmented images and (b) the first captured image comprising the biometric modality of the user, and encrypt the generated feature vector thereof using a fully homomorphic encryption (FHE) technique; generate, via the FHE technique executed by the one or more hardware processors, an encrypted reduced dimensionality feature vector using (i) the 
encrypted feature vector for (a) each of the first set of augmented images and (b) the first captured image comprising the biometric modality of the user, and (ii) an encrypted Random Projection Matrix (RPM) being identified and assigned to the user; and map, via an encrypted deep neural network executed by the one or more hardware processors, the encrypted reduced dimensionality feature vector for (a) each of the first set of augmented images and (b) the first captured image comprising the biometric modality of the user with a corresponding randomly generated binary code assigned to the user, wherein during the mapping, one or more encrypted parameters of a mapping network model are learnt by the deep neural network, and wherein a key pair is generated, the key pair comprising a private key and a public key. 7. The system as claimed in claim 6, wherein size of an input layer (fc1_size) of the deep neural network is equal to size of the encrypted reduced dimensionality feature vector. 8. The system as claimed in claim 6, wherein size of an output layer of the deep neural network is based on number of bits comprised in each of a plurality of binary codes assigned to the user. 9. The system as claimed in claim 6, wherein the one or more hardware processors are further configured by the instructions to store, in a database comprised in the memory, (i) the encrypted RPM and (ii) the encrypted parameters of the mapping network model, and (iii) the corresponding randomly generated binary code in an encrypted form along with a unique label assigned to the user. 10.
The system as claimed in claim 6, wherein the one or more hardware processors are further configured by the instructions to: capture a second image comprising a biometric modality of the user for validating an identity of the user; pre-process the captured second image comprising the biometric modality to obtain a second set of augmented images; generate, via the one or more hardware processors, a feature vector for (a) each of the second set of augmented images and (b) the second captured image comprising the biometric modality of the user; perform, by using the generated public key, the FHE on the feature vector for (a) each of the second set of augmented images and (b) the second captured image comprising the biometric modality of the user to obtain an encrypted feature vector for (a) each of the second set of augmented images and (b) the second captured image comprising the biometric modality of the user; generate an encrypted reduced dimensionality feature vector for (a) each of the second set of augmented images and (b) the second captured image comprising the biometric modality of the user based on (i) the encrypted feature vector for (a) each of the second set of augmented images and (b) the second captured image comprising the biometric modality of the user, and (ii) the encrypted RPM assigned to the user during the enrollment; generate, via the encrypted deep neural network, a set of encrypted prediction based binary codes based on (i) the encrypted reduced dimensionality feature vector for (a) each of the second set of augmented images and (b) the second captured image comprising the biometric modality of the user and (ii) the one or more encrypted parameters learnt by the deep neural network; decrypt the set of encrypted prediction based binary codes by using the generated private key to obtain a set of decrypted binary codes; perform a cryptographic hash function on the set of decrypted binary codes to obtain a set of cryptographic hash of binary code; 
perform a comparison of each of the set of cryptographic hash of binary code with a corresponding cryptographic hash of the randomly generated binary code assigned to the user comprised in the memory; and validate the identity of the user based on the comparison.

Description: FORM 2 THE PATENTS ACT, 1970 (39 of 1970) & THE PATENT RULES, 2003 COMPLETE SPECIFICATION (See Section 10 and Rule 13) Title of invention: REGISTRATION AND VERIFICATION OF BIOMETRIC MODALITIES USING ENCRYPTION TECHNIQUES IN A DEEP NEURAL NETWORK Applicant: Tata Consultancy Services Limited, a company incorporated in India under the Companies Act, 1956, having address: Nirmal Building, 9th Floor, Nariman Point, Mumbai 400021, Maharashtra, India. The following specification particularly describes the invention and the manner in which it is to be performed. TECHNICAL FIELD [001] The disclosure herein generally relates to biometric template protection, and, more particularly, to registration and verification of biometric modalities using encryption techniques in a deep neural network. BACKGROUND [002] The term biometrics is defined as the automated recognition of individuals based on their unique behavioral and biological characteristics. A typical biometric system obtains these unique behavioral and physical characteristics by acquiring the user’s biometric trait (such as fingerprints, iris, face, voice, gait, etc.) via a sensor. The acquired data is processed to extract the salient information (feature set). During the enrollment phase, the extracted feature set is stored in the database as a template. During verification, a similar process is carried out, and the template generated during verification is matched against the template generated during enrollment. Further, a matching score S is output, indicating the similarity between the two templates. However, the above conventional approach has its own security limitations.
For instance, the biometric templates are stored in an unprotected form on a server system. It is important to secure biometric templates because, unlike credit cards and passwords, which when compromised can be revoked and reissued, biometric data (a template) is permanently associated with a user and cannot be replaced. If a biometric template is exposed once, it is lost forever. Further, a compromised biometric template can be misused for cross-matching across databases. Moreover, in case of an attack on the server system, there is a possibility of all the original biometric templates being lost or stolen. Using such stolen information, an attacker can perform model inversion attacks to retrieve the image corresponding to each biometric template with reasonable accuracy, and the stolen biometric templates can be used for cross-application matching. [003] To overcome the above, attempts have conventionally been made to improve matching performance with high levels of security by use of deep convolutional neural network models. However, such attempts have prominent security limitations: the deep convolutional neural network (CNN) model used to map images (e.g., face) to binary codes (256-bit / 1024-bit) is stored in an unprotected form on the server. Given this model and access to the stolen protected templates, the adversary can exploit the False Accept Rate (FAR) of the system. Secondly, once the server system is compromised, all the users need to be re-enrolled. Further, the above conventional approaches also have privacy limitations; for instance, it is easily possible to identify whether a particular user is enrolled in the system or not: an attacker simply has to provide an image (collected from online social media) to the biometric system. If the authentication is successful, the user is enrolled. Moreover, during the enrollment (training) phase, unprotected feature vectors of the user are used to train the deep CNN.
If the system administrators collude, then they can use a model inversion attack to know about the users who enrolled in the biometric system, thus compromising both privacy and security. SUMMARY [004] Embodiments of the present disclosure present technological improvements as solutions to one or more of the above-mentioned technical problems recognized by the inventors in conventional systems. For example, in one aspect, there is provided a processor implemented method for registration and verification of biometric modalities using encryption techniques in a deep neural network. The method comprises capturing a first image comprising a biometric modality of a user for enrollment; pre-processing the first captured image comprising the biometric modality to obtain a first set of augmented images; generating, via the one or more hardware processors, a feature vector for (a) each of the first set of augmented images and (b) the first captured image comprising the biometric modality of the user, and encrypting the generated feature vector thereof using a fully homomorphic encryption (FHE) technique; generating, via the FHE technique executed by the one or more hardware processors, an encrypted reduced dimensionality feature vector using (i) the encrypted feature vector for (a) each of the first set of augmented images and (b) the first captured image comprising the biometric modality of the user, and (ii) an encrypted Random Projection Matrix (RPM) being identified and assigned to the user; and mapping, via an encrypted deep neural network executed by the one or more hardware processors, the encrypted reduced dimensionality feature vector for (a) each of the first set of augmented images and (b) the first captured image comprising the biometric modality of the user with a corresponding randomly generated binary code assigned to the user, wherein during the mapping, one or more encrypted parameters of a mapping network model are learnt by the encrypted deep neural 
network, and wherein a key pair is generated that comprises a private key and a public key. [005] In an embodiment, size of an input layer (fc1_size) of the encrypted deep neural network is equal to size of the encrypted reduced dimensionality feature vector. [006] In an embodiment, size of an output layer of the encrypted deep neural network is based on number of bits comprised in each of a plurality of binary codes assigned to the user. [007] In an embodiment, the method further comprises storing, in a database comprised in the memory, (i) the encrypted RPM, (ii) the encrypted parameters of the mapping network model, and (iii) the corresponding randomly generated binary code in an encrypted form along with a unique label assigned to the user. In other words, a cryptographic hash is performed on the corresponding randomly generated binary code, the cryptographic hash of the randomly generated binary code is stored in the memory, and the corresponding (original) randomly generated binary code is then discarded.
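The hash-and-discard step of paragraph [007] can be illustrated with a minimal plaintext sketch. SHA-256 is assumed here as the cryptographic hash function, the encrypted RPM and encrypted model parameters are omitted, and all names (such as enroll_binary_code) are illustrative rather than taken from the disclosure:

```python
import hashlib
import secrets

def enroll_binary_code(user_label: str, code_bits: int = 256) -> dict:
    """Assign a random binary code to a user and store only its hash."""
    # Randomly generate the binary code assigned to the user at enrollment.
    binary_code = "".join(secrets.choice("01") for _ in range(code_bits))
    # Store the cryptographic hash of the code alongside the user's label;
    # the original (plaintext) binary code is then discarded.
    code_hash = hashlib.sha256(binary_code.encode()).hexdigest()
    return {"label": user_label, "code_hash": code_hash}

record = enroll_binary_code("user_001")
```

Because only the hash survives, a stolen database entry reveals neither the binary code nor anything about the underlying biometric image.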
[008] In an embodiment, the method further comprises capturing a second image comprising a biometric modality of the user for validating an identity of the user; pre-processing the captured second image comprising the biometric modality to obtain a second set of augmented images; generating, via the one or more hardware processors, a feature vector for (a) each of the second set of augmented images and (b) the second captured image comprising the biometric modality of the user; performing, by using the generated public key, the FHE on the feature vector for each of the second set of augmented images to obtain an encrypted feature vector for (a) each of the second set of augmented images and (b) the second captured image comprising the biometric modality of the user; generating an encrypted reduced dimensionality feature vector for (a) each of the second set of augmented images and (b) the second captured image comprising the biometric modality of the user based on (i) the encrypted feature vector for (a) each of the second set of augmented images and (b) the second captured image comprising the biometric modality of the user and (ii) the encrypted RPM assigned to the user during the enrollment; generating, via the encrypted deep neural network, a set of encrypted prediction based binary codes based on (i) the encrypted reduced dimensionality feature vector for (a) each of the second set of augmented images and (b) the second captured image comprising the biometric modality of the user and (ii) the one or more encrypted parameters learnt by the encrypted deep neural network; decrypting the set of encrypted prediction based binary codes by using the generated private key to obtain a set of decrypted binary codes; performing a cryptographic hash function on the set of decrypted binary codes to obtain a set of cryptographic hash of binary code and performing a comparison of each of the set of cryptographic hash of binary code with a corresponding cryptographic hash of 
the randomly generated binary code assigned to the user comprised in the memory; and validating the identity of the user based on the comparison. [009] In another aspect, there is provided a system for registration and verification of biometric modalities using encryption techniques in a deep neural network. The system comprises a memory storing instructions; one or more communication interfaces; and one or more hardware processors coupled to the memory via the one or more communication interfaces, wherein the one or more hardware processors are configured by the instructions to execute the (programmed) instructions in a Trusted Execution Environment (TEE) to: capture a first image comprising a biometric modality of a user for enrollment; pre-process the first captured image comprising the biometric modality to obtain a first set of augmented images; generate, via the one or more hardware processors, a feature vector for each of the first set of augmented images and encrypt the generated feature vector thereof using a fully homomorphic encryption (FHE) technique; generate, via the FHE technique executed by the one or more hardware processors, an encrypted reduced dimensionality feature vector using (i) the encrypted feature vector for (a) each of the first set of augmented images and (b) the first captured image comprising the biometric modality of the user and (ii) an encrypted Random Projection Matrix (RPM) being identified and assigned to the user; and map, via an encrypted deep neural network executed by the one or more hardware processors, the encrypted reduced dimensionality feature vector for (a) each of the first set of augmented images and (b) the first captured image comprising the biometric modality of the user with a corresponding randomly generated binary code assigned to the user, wherein during the mapping, one or more encrypted parameters of a mapping network model are learnt by the encrypted deep neural network, and wherein a key pair is generated 
that comprises a private key and a public key. [010] In an embodiment, size of an input layer (fc1_size) of the encrypted deep neural network is equal to size of the encrypted reduced dimensionality feature vector. [011] In an embodiment, size of an output layer of the encrypted deep neural network is based on number of bits comprised in each of a plurality of binary codes assigned to the user. [012] In an embodiment, the one or more hardware processors are further configured by the instructions to store, in a database comprised in the memory, (i) the encrypted RPM and (ii) the encrypted parameters of the mapping network model, and (iii) the corresponding randomly generated binary code in an encrypted form along with a unique label assigned to the user. [013] In an embodiment, the one or more hardware processors are further configured by the instructions to capture a second image comprising a biometric modality of the user for validating an identity of the user; pre-process the captured second image comprising the biometric modality to obtain a second set of augmented images; generate, via the one or more hardware processors, a feature vector for (a) each of the second set of augmented images and (b) the second captured image comprising the biometric modality of the user; perform, by using the generated public key, the FHE on the feature vector for (a) each of the second set of augmented images and (b) the second captured image comprising the biometric modality of the user to obtain an encrypted feature vector for (a) each of the second set of augmented images and (b) the second captured image comprising the biometric modality of the user; generate an encrypted reduced dimensionality feature vector for (a) each of the second set of augmented images and (b) the second captured image comprising the biometric modality of the user based on (i) the encrypted feature vector for (a) each of the second set of augmented images and (b) the second captured image comprising 
the biometric modality of the user and (ii) the encrypted RPM assigned to the user during the enrollment; generate, via the encrypted deep neural network, a set of encrypted prediction based binary codes based on (i) the encrypted reduced dimensionality feature vector for (a) each of the second set of augmented images and (b) the second captured image comprising the biometric modality of the user and (ii) the one or more encrypted parameters learnt by the encrypted deep neural network; decrypt the set of encrypted prediction based binary codes by using the generated private key to obtain a set of decrypted binary codes; perform a cryptographic hash function on the set of decrypted binary codes to obtain a set of cryptographic hash of binary code and perform a comparison of each of the set of cryptographic hash of binary code with a corresponding cryptographic hash of the randomly generated binary code assigned to the user comprised in the memory; and validate the identity of the user based on the comparison.
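The hash-comparison step of the verification flow above can be sketched in plaintext. SHA-256 again stands in for the cryptographic hash, decryption of the prediction-based binary codes under FHE is assumed to have already taken place, and the function name is illustrative:

```python
import hashlib

def validate_identity(decrypted_codes, stored_code_hash):
    """Hash each decrypted prediction-based binary code and compare it
    with the stored hash of the code assigned to the user at enrollment."""
    for code in decrypted_codes:
        if hashlib.sha256(code.encode()).hexdigest() == stored_code_hash:
            return True  # at least one prediction matches: identity validated
    return False

# Toy usage: the enrolled code's hash matches one of the predictions.
enrolled_code = "0101" * 64  # illustrative 256-bit code
stored_hash = hashlib.sha256(enrolled_code.encode()).hexdigest()
ok = validate_identity(["1111" * 64, enrolled_code], stored_hash)
bad = validate_identity(["1111" * 64], stored_hash)
```

Comparing hashes rather than codes means the matcher never needs the enrolled binary code itself, which is consistent with it being discarded at enrollment.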
[014] In yet another aspect, there are provided one or more non-transitory machine-readable information storage mediums comprising one or more instructions which when executed by one or more hardware processors in a Trusted Execution Environment (TEE) cause registration and verification of biometric modalities using encryption techniques in a deep neural network by capturing a first image comprising a biometric modality of a user for enrollment; pre-processing the first captured image comprising the biometric modality to obtain a first set of augmented images; generating, via the one or more hardware processors, a feature vector for (a) each of the first set of augmented images and (b) the first captured image comprising the biometric modality of the user and encrypting the generated feature vector thereof using a fully homomorphic encryption (FHE) technique; generating, via the FHE technique executed by the one or more hardware processors, an encrypted reduced dimensionality feature vector using (i) the encrypted feature vector for (a) each of the first set of augmented images and (b) the first captured image comprising the biometric modality of the user and (ii) an encrypted Random Projection Matrix (RPM) being identified and assigned to the user; and mapping, via an encrypted deep neural network executed by the one or more hardware processors, the encrypted reduced dimensionality feature vector for (a) each of the first set of augmented images and (b) the first captured image comprising the biometric modality of the user with a corresponding randomly generated binary code assigned to the user, wherein during the mapping, one or more encrypted parameters of a mapping network model are learnt by the encrypted deep neural network, and wherein a key pair is generated that comprises a private key and a public key.
[015] In an embodiment, size of an input layer (fc1_size) of the encrypted deep neural network is equal to size of the encrypted reduced dimensionality feature vector. [016] In an embodiment, size of an output layer of the encrypted deep neural network is based on number of bits comprised in each of a plurality of binary codes assigned to the user. [017] In an embodiment, the instructions when executed by the one or more hardware processors further cause storing, in a database comprised in the memory, (i) the encrypted RPM and (ii) the encrypted parameters of the mapping network model, and (iii) the corresponding randomly generated binary code in an encrypted form along with a unique label assigned to the user. [018] In an embodiment, the instructions when executed by the one or more hardware processors further cause capturing a second image comprising a biometric modality of the user for validating an identity of the user; pre-processing the captured second image comprising the biometric modality to obtain a second set of augmented images; generating, via the one or more hardware processors, a feature vector for (a) each of the second set of augmented images and (b) the second captured image comprising the biometric modality of the user; performing, by using the generated public key, the FHE on the feature vector for (a) each of the second set of augmented images and (b) the second captured image comprising the biometric modality of the user to obtain an encrypted feature vector for (a) each of the second set of augmented images and (b) the second captured image comprising the biometric modality of the user; generating an encrypted reduced dimensionality feature vector for (a) each of the second set of augmented images and (b) the second captured image comprising the biometric modality of the user based on (i) the encrypted feature vector for (a) each of the second set of augmented images and (b) the second captured image comprising the biometric modality of the 
user and (ii) the encrypted RPM assigned to the user during the enrollment; generating, via the encrypted deep neural network, a set of encrypted prediction based binary codes based on (i) the encrypted reduced dimensionality feature vector for (a) each of the second set of augmented images and (b) the second captured image comprising the biometric modality of the user and (ii) the one or more encrypted parameters learnt by the encrypted deep neural network; decrypting the set of encrypted prediction based binary codes by using the generated private key to obtain a set of decrypted binary codes; performing a cryptographic hash function on the set of decrypted binary codes to obtain a set of cryptographic hash of binary code and performing a comparison of each of the set of cryptographic hash of binary code with a corresponding cryptographic hash of the randomly generated binary code assigned to the user comprised in the memory; and validating the identity of the user based on the comparison. [019] It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed. BRIEF DESCRIPTION OF THE DRAWINGS [020] The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, serve to explain the disclosed principles: [021] FIG. 1 depicts an exemplary block diagram of a system for registration and verification of biometric modalities using encryption techniques in an encrypted deep neural network, in accordance with an embodiment of the present disclosure. [022] FIG. 2A depicts an exemplary functional block diagram for registration of biometric modalities of users using the encryption techniques in the (encrypted) deep neural network, in accordance with an embodiment of the present disclosure. [023] FIG. 
2B depicts an exemplary functional block diagram for verification of biometric modalities of the users using the encryption techniques in the (encrypted) deep neural network, in accordance with an embodiment of the present disclosure. [024] FIGS. 3A-3B depict an exemplary flow chart for registration and verification of biometric modalities of users using the encryption techniques in the encrypted deep neural network as implemented by the system of FIG. 1 and components depicted in FIGS. 2A-2B, in accordance with an embodiment of the present disclosure. DETAILED DESCRIPTION OF EMBODIMENTS [025] Exemplary embodiments are described with reference to the accompanying drawings. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. Wherever convenient, the same reference numbers are used throughout the drawings to refer to the same or like parts. While examples and features of disclosed principles are described herein, modifications, adaptations, and other implementations are possible without departing from the scope of the disclosed embodiments. It is intended that the following detailed description be considered as exemplary only, with the true scope being indicated by the following claims. [026] As mentioned above, conventionally, attempts have been made to improve matching performance with high levels of security by use of deep convolutional neural network models. However, such attempts have some prominent security limitations; for example, the deep CNN model used to map images (e.g., faces) to binary codes (256-bit / 1024-bit) is stored in an unprotected form on the server. Given this model and access to the stolen protected templates, the adversary can exploit the False Accept Rate (FAR) of the system. Secondly, once the server system is compromised, all the users need to be re-enrolled.
Further, the above conventional approaches also have privacy limitations: it is easily possible to identify whether a particular user is enrolled in the system or not, since an attacker simply has to provide an image (collected from online social media) to the biometric system. If the authentication is successful, the user is enrolled. Moreover, during the enrollment (training) phase, unprotected feature vectors of the user are used to train the deep CNN. If the system administrators collude, then they can use a model inversion attack to learn about the users who enrolled in the biometric system, thus compromising both privacy and security. Conventional biometric registration and verification systems pose a challenge with respect to intra-user variability caused by variations in users' pose in a constrained environment (e.g., illumination, expression(s), and the like). Further, several known biometric verification systems utilize Partial Homomorphic Encryption with support for binarized data for encryption purposes. Such systems require quantization of feature vectors, which may lead to information loss due to lossy computations and thereby degraded matching performance at the time of verification. Moreover, in conventional systems and approaches, feature vectors were stored on the server and matching was performed in the unprotected domain, which poses both security and privacy concerns for the registration as well as the verification process (refer above). Unlike conventional systems and approaches, the present disclosure provides systems and methods that implement encrypted deep neural network(s) for mapping feature vectors to a randomly generated binary code. The deep neural network model learnt is encrypted.
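The mapping-and-matching principle described above — a network maps a feature vector to a binary code, and verification compares only cryptographic hashes of codes, never raw templates — can be illustrated with a minimal plaintext sketch. The sizes, the one-hidden-layer architecture, and the use of SHA-256 are assumptions chosen for illustration only; in the disclosed system the network parameters, inputs and activations are FHE-encrypted, and the predicted codes are decrypted with the user's private key before hashing.

```python
import hashlib
import numpy as np

rng = np.random.default_rng(7)

k, n_bits = 64, 256   # illustrative: reduced feature dimension, binary-code length

# Toy mapping network (one hidden layer assumed; the disclosure leaves the
# exact architecture open). In the disclosed system these parameters, the
# inputs and the activations are all FHE-encrypted.
W1 = rng.standard_normal((128, k)); b1 = np.zeros(128)
W2 = rng.standard_normal((n_bits, 128)); b2 = np.zeros(n_bits)

def predict_code(x):
    """Map a k-dim feature vector to an n_bits binary code (threshold at 0)."""
    h = np.maximum(W1 @ x + b1, 0.0)                      # ReLU hidden layer
    return "".join("1" if v > 0 else "0" for v in (W2 @ h + b2))

def sha256_hex(code):
    return hashlib.sha256(code.encode()).hexdigest()

# Enrollment: only the hash of the user's randomly generated binary code is
# stored; the network is trained so its predictions reproduce that code.
# Here the hash of the prediction itself stands in for that stored hash.
x = rng.standard_normal(k)
stored_hash = sha256_hex(predict_code(x))

# Verification: hash the (decrypted) predicted code and compare hashes.
accepted = sha256_hex(predict_code(x)) == stored_hash
print(accepted)   # True
```

Because only hashes of binary codes are stored, a compromised database reveals neither the biometric template nor the assigned code.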
[027] More specifically, embodiments of the present disclosure herein provide a method and system for biometric registration and verification (also referred as ‘biometric modality registration and verification’) in a secure manner using fully homomorphic encryption and an encrypted neural network. For example, in various embodiments, the disclosed system utilizes homomorphic encryption computations for encrypting the feature vectors of the captured biometric template. As can be seen in the detailed description of various figures, the present disclosure acquires only one biometric sample (i.e., one-shot enrollment) for registration/verification, where only one biometric image of the user is used for enrollment. Data augmentation is performed on the segmented image set of the acquired biometric template (or biometric image of the user) to increase the number of samples per user for enrollment, since deep learning-based methods require a lot of data for training. The increase in the number of samples per user enables the system of the present disclosure to improve accuracy. Further, in the present disclosure the system performs dimensionality reduction of the feature vector generated based on the augmented images set, in the encrypted domain using fully homomorphic encryption. Additionally, the system of the present disclosure utilizes an encrypted random projection matrix as a security feature, wherein a random projection matrix is identified and assigned to each user during registration and fully homomorphic encryption is performed to obtain the encrypted random projection matrix. [028] Referring now to the drawings, and more particularly to FIG. 1 through 3B, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments and these embodiments are described in the context of the following exemplary system and/or method. [029] FIG.
1 depicts an exemplary block diagram of a system for registration and verification of biometric modalities using encryption techniques in a deep neural network, in accordance with an embodiment of the present disclosure. In an embodiment, the system 100 includes one or more hardware processors 104, communication interface device(s) or input/output (I/O) interface(s) 106 (also referred as interface(s)), and one or more data storage devices or memory 102 operatively coupled to the one or more hardware processors 104. The one or more processors 104 may be one or more software processing components and/or hardware processors. In an embodiment, the hardware processors can be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor(s) is configured to fetch and execute computer-readable instructions stored in the memory. In an embodiment, the system 100 can be implemented in a variety of computing systems, such as laptop computers, notebooks, hand-held devices, workstations, mainframe computers, servers, a network cloud and the like. [030] The I/O interface device(s) 106 can include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like and can facilitate multiple communications within a wide variety of networks N/W and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. In an embodiment, the I/O interface device(s) can include one or more ports for connecting a number of devices to one another or to another server. 
[031] The memory 102 may include any computer-readable medium known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. In an embodiment, a database 108 is comprised in the memory 102, wherein the database 108 comprises information, for example, biometric modalities (e.g., face image, iris (or retina), fingerprint (or palm print), and the like) of one or more users, segmented images pertaining to the biometric modalities captured via one or more sensors (e.g., image capturing device(s) or device(s) that are capable of capturing the biometric modalities of users), and the like. The database 108 may further comprise augmented images that are obtained by pre-processing the segmented images. The database 108 further comprises the feature vector generated for each user, and the encrypted feature vector associated therewith. The database 108 further comprises an encrypted reduced dimensionality feature vector specific to each user, the encrypted Random Projection Matrix (RPM) being identified and assigned to the user, mapping information pertaining to the encrypted reduced dimensionality feature vector for each of the first set of augmented images being mapped with a corresponding randomly generated binary code assigned to the user, and encrypted parameters of a mapping network model (that is generated and comprised in the memory 102) that are learnt by an encrypted deep neural network. The memory 102 may further store a key pair comprising a private key and a public key that is generated for each user. Similarly, the database 108 further stores encrypted prediction based binary codes and decrypted binary codes specific to each user.
The database 108 further stores a set of cryptographic hash of binary code for each user, and information pertaining to validation of user(s). The memory 102 further comprises various technique(s) for performing registration and verification of users in the deep neural network wherein information is stored in a protected form (e.g., encrypted form). The various technique(s) include, but are not limited to, pre-processing technique(s) such as (a) segmentation technique(s), (b) augmentation technique(s) and the like. Other techniques that are stored in the memory 102 include, for example, encryption technique(s) (e.g., fully homomorphic encryption (FHE) technique(s)), decryption technique(s), binary code generation technique(s), random projection matrix generation technique(s) and the like, which when invoked as appropriate perform corresponding operations for registration and verification of biometric modalities of users using the encrypted deep neural network. In an embodiment, the present disclosure may utilize the above mentioned techniques as known in the art for performing methodologies described herein. The encrypted deep neural network refers to a deep neural network (DNN) wherein FHE is performed on the deep neural network, and the FHE of the DNN is comprised in the memory 102 and executed to perform the methodology/methodologies described herein. The memory 102 further comprises (or may further comprise) information pertaining to input(s)/output(s) of each step performed by the systems and methods of the present disclosure. In other words, input(s) fed at each step and output(s) generated at each step are comprised in the memory 102 and can be utilized in further processing and analysis. [032] FIGS. 2A-2B, with reference to FIG. 1, depict an exemplary functional block diagram for registration and verification of biometric modalities of users using the encryption techniques in the deep neural network. More specifically, FIG.
2A depicts an exemplary functional block diagram for registration of biometric modalities of users using the encryption techniques in the (encrypted) deep neural network, in accordance with an embodiment of the present disclosure. FIG. 2B depicts an exemplary functional block diagram for verification of biometric modalities of the users using the encryption techniques in the (encrypted) deep neural network, in accordance with an embodiment of the present disclosure. [033] FIGS. 3A-3B, with reference to FIGS. 1-2B, depict an exemplary flow chart for registration and verification of biometric modalities of users using the encryption techniques in the deep neural network as implemented by the system 100 of FIG. 1 and components depicted in FIGS. 2A-2B, in accordance with an embodiment of the present disclosure. In an embodiment, the system(s) 100 comprises one or more data storage devices or the memory 102 operatively coupled to the one or more hardware processors 104 and is configured to store instructions (also referred as programmed instructions) for execution of steps of the method by the one or more processors 104 in a trusted execution environment (TEE). The steps of the method of the present disclosure will now be explained with reference to components of the system 100 of FIG. 1 and FIGS. 2A-2B, and the flow diagram as depicted in FIGS. 3A-3B. At step 202 of the present disclosure, a first image comprising a biometric modality of a user is obtained. For instance, the first image comprising the biometric modality (e.g., a face image or an iris, or fingerprint(s) and the like) may be captured through an image capturing device (or sensor) attached (e.g., either internally configured, or externally attached via available communication interface(s)) to the system 100.
The image capturing device may also be referred as a biometric modality capturing device or sensing device that is configured to capture the biometric modality (or modalities) of various users for enrollment and verification thereof. For instance, an image (e.g., the first image) captured at the sensor is of jpeg type (e.g., say abc.jpeg). For better understanding of the embodiments of the present disclosure, the face image of a user (e.g., say John Doe) is considered and the same is captured during the enrollment and verification process. Assume the captured face image (e.g., the first image being the face image of a specific user) has a width of 640 pixels and a height of 480 pixels. In the present disclosure, the systems and methods are implemented wherein only a single high-resolution image of the biometric modality is captured at the sensor. This one-shot enrollment (where only one image of the user is used for the enrollment process) makes the entire biometric system user-friendly. [034] Upon obtaining the first captured image, at step 204 of the present disclosure, the one or more hardware processors 104 pre-process the first captured image comprising the biometric modality to obtain a first set of augmented images. The step of pre-processing as described herein includes segmenting the first captured image to obtain a plurality of sub-images; and augmenting each of the plurality of sub-images by applying a plurality of image processing techniques (e.g., image processing technique(s) as known in the art) to obtain the first set of augmented images. In other words, the captured jpeg image (say the face image of the user John Doe) is subjected to a face detection algorithm, wherein the detected face is extracted and resized to a 160*160 image. Each 160*160 face image is subjected to a data augmentation process to generate ‘m’ augmented images (e.g., 6 augmented images), each of size 160*160 pixels.
The segmented images undergo augmentation corresponding to one or more operations, for example, but not limited to, zoom, horizontal shift, vertical shift, horizontal flip, rotation, brightness adjustment and the like. More specifically, the segmented image (also referred as sub-image of the captured image post segmentation and may be interchangeably used hereinafter) can be randomly zoomed in within a certain pre-defined range (e.g., the zoom range may include [0.5, 1.5]). Similarly, positive and negative horizontal shifts may be randomly selected within a certain pre-defined range (e.g., the positive/negative horizontal shift range may include [-10, 10]) and the pixel values at the end of the image are duplicated to fill in the empty part of the image created by the shift. Similarly, the segmented image may undergo a vertical shift operation wherein positive and negative vertical shifts are randomly selected within a certain pre-defined range (e.g., the positive/negative vertical shift range may include [-0.5, 0.5]). Another augmentation operation may include applying a horizontal flip to the biometric image (segmented image or captured image). Alternatively, the image may be rotated clockwise (or anti-clockwise) within a certain pre-defined rotation range argument. Moreover, brightness adjustment (darkening and brightening) may be carried out within a certain pre-defined range depending on the image (e.g., the first image) being captured in a given environment (e.g., constrained environment or controlled environment). Post augmentation, all possible 157*157 crops of each 160*160 image are taken. Each 157*157 cropped image is then resized to 160*160. Thus, the total number of augmented images is 7 * (160-157+1) * (160-157+1) = 112.
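The crop-based augmentation arithmetic above can be checked with a short sketch (NumPy used for illustration; the zero array stands in for an actual 160*160 face image):

```python
import numpy as np

H = W = 160          # size of each segmented/augmented face image
CROP = 157           # crop size taken post augmentation
m = 6                # augmented images generated per face image

def all_crops(img: np.ndarray, size: int):
    """Every possible size*size crop of a square image (stride 1)."""
    n = img.shape[0] - size + 1
    return [img[r:r + size, c:c + size] for r in range(n) for c in range(n)]

img = np.zeros((H, W))                        # stand-in for one face image
crops_per_image = len(all_crops(img, CROP))   # (160-157+1)**2 = 16

# m augmented images plus the original give 7 images per user; each yields
# 16 crops (each later resized back to 160*160 before feature extraction).
total = (m + 1) * crops_per_image
print(total)   # 112
```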
[035] At step 206 of the present disclosure, the hardware processors 104 generate a feature vector for (a) each of the first set of augmented images and (b) the first captured image comprising the biometric modality of the user and perform encryption thereof to obtain an encrypted feature vector for (a) each of the first set of augmented images and (b) the first captured image comprising the biometric modality of the user. In an embodiment, the present disclosure may utilize a pre-trained model (comprised in the system 100) which when executed by the one or more hardware processors 104 generates the feature vector. In another embodiment, the present disclosure may utilize standard feature descriptors (comprised in the system 100) which when executed by the one or more hardware processors 104 generate the feature vector. In the experiments conducted by the present disclosure, each of the 7 * (160-157+1) * (160-157+1) 160*160 face images was fed into the system 100 (or a pre-trained feature extraction model for faces). A 128-dimensional feature vector corresponding to each of the 7*(160-157+1)*(160-157+1) face images was outputted by the system 100 and the same 128-dimensional feature vector was encrypted using the fully homomorphic encryption (FHE) technique stored in the memory 102 to output an encrypted feature vector thereof for (a) each of the first set of augmented images and (b) the first captured image comprising the biometric modality of the user. [036] At step 208 of the present disclosure, the FHE technique (executed by the one or more hardware processors 104) generates an encrypted reduced dimensionality feature vector using (i) the encrypted feature vector for (a) each of the first set of augmented images and (b) the first captured image comprising the biometric modality of the user and (ii) an encrypted Random Projection Matrix (RPM) being identified and assigned to the user.
In other words, the encrypted feature vector generated for (a) each of the first set of augmented images and (b) the first captured image comprising the biometric modality of the user is used for generation of the encrypted reduced dimensionality feature vector along with the encrypted Random Projection Matrix (RPM). Each user who is being enrolled is assigned a Random Projection Matrix (RPM). An RPM of dimension ‘k*d’ was used by embodiments of the present disclosure during experiments to project the original ‘d’-dimensional data to a ‘k’-dimensional (k<d) space.
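In plaintext form, the dimensionality reduction of step 208 is simply the k*d RPM applied to the d-dimensional feature vector; because the operation is linear, it carries over to the homomorphic domain. A toy sketch with illustrative sizes d=128 (matching the feature vectors of step 206) and k=64 (an assumption; the disclosure only requires k<d), and a Gaussian RPM (the entry distribution is likewise an assumption):

```python
import numpy as np

rng = np.random.default_rng(42)

d, k = 128, 64       # original and reduced dimensions (k < d); k chosen for illustration

# Per-user Random Projection Matrix of dimension k*d; Gaussian entries
# scaled by 1/sqrt(k) (a common choice, assumed here for illustration).
rpm = rng.standard_normal((k, d)) / np.sqrt(k)

feature_vector = rng.standard_normal(d)   # stand-in for a 128-dim feature vector

# Plaintext projection; in the disclosed system both rpm and feature_vector
# are FHE-encrypted and this product is computed homomorphically.
reduced = rpm @ feature_vector
print(reduced.shape)   # (64,)
```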

Documents

Application Documents

# Name Date
1 202021000863-IntimationOfGrant25-04-2024.pdf 2024-04-25
2 202021000863-STATEMENT OF UNDERTAKING (FORM 3) [08-01-2020(online)].pdf 2020-01-08
3 202021000863-REQUEST FOR EXAMINATION (FORM-18) [08-01-2020(online)].pdf 2020-01-08
4 202021000863-PatentCertificate25-04-2024.pdf 2024-04-25
5 202021000863-FORM 18 [08-01-2020(online)].pdf 2020-01-08
6 202021000863-CLAIMS [22-11-2021(online)].pdf 2021-11-22
7 202021000863-FORM 1 [08-01-2020(online)].pdf 2020-01-08
8 202021000863-COMPLETE SPECIFICATION [22-11-2021(online)].pdf 2021-11-22
9 202021000863-FIGURE OF ABSTRACT [08-01-2020(online)].jpg 2020-01-08
10 202021000863-FER_SER_REPLY [22-11-2021(online)].pdf 2021-11-22
11 202021000863-OTHERS [22-11-2021(online)].pdf 2021-11-22
12 202021000863-DRAWINGS [08-01-2020(online)].pdf 2020-01-08
13 202021000863-FER.pdf 2021-10-19
14 202021000863-COMPLETE SPECIFICATION [08-01-2020(online)].pdf 2020-01-08
15 Abstract1.jpg 2020-01-10
16 202021000863-FORM 3 [23-02-2021(online)].pdf 2021-02-23
17 202021000863-Proof of Right [14-03-2020(online)].pdf 2020-03-14
18 202021000863-CERTIFIED COPIES TRANSMISSION TO IB [15-12-2020(online)].pdf 2020-12-15
19 202021000863-Covering Letter [15-12-2020(online)].pdf 2020-12-15
20 202021000863-FORM-26 [12-11-2020(online)].pdf 2020-11-12
21 202021000863-Form 1 (Submitted on date of filing) [15-12-2020(online)].pdf 2020-12-15
22 202021000863-Request Letter-Correspondence [15-12-2020(online)].pdf 2020-12-15
23 202021000863-Power of Attorney [15-12-2020(online)].pdf 2020-12-15

Search Strategy

1 search_strategyE_31-07-2021.pdf

ERegister / Renewals

3rd: 23 Jul 2024

From 08/01/2022 - To 08/01/2023

4th: 23 Jul 2024

From 08/01/2023 - To 08/01/2024

5th: 23 Jul 2024

From 08/01/2024 - To 08/01/2025

6th: 08 Jan 2025

From 08/01/2025 - To 08/01/2026