Specification
FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003
COMPLETE SPECIFICATION
(See Section 10 and Rule 13)
Title of invention:
GENERATING USABLE HIGH ENTROPY PASSPHRASES IN LOCAL LANGUAGE FROM PERSONALIZED INTERSECTION CORPUS
Applicant:
Tata Consultancy Services Limited
A company Incorporated in India under the Companies Act, 1956
Having address:
Nirmal Building, 9th Floor,
Nariman Point, Mumbai 400021,
Maharashtra, India
The following specification particularly describes the invention and the manner in which it is to be performed.
TECHNICAL FIELD
[001] The embodiments herein generally relate to the field of device authentication set up techniques and, more particularly, to a method and system for generating usable high entropy passphrases in local language from personalized intersection corpus for device authentication set up for a Basic Emergent User (BEU).
BACKGROUND
[002] With user devices functioning as gateways to personal, financial, and other information of an individual, device authentication techniques need to be robust. However, setting unique passwords or passphrases that are challenging to guess still remains a challenge for Basic Emergent Users (BEUs), who are in the less-literate or non-tech-savvy user category. As of today, there are no robust tools addressing the challenges of BEU in password or passphrase generation, where the BEU is assisted to generate higher entropy passphrases from the limited vocabulary they have. This leads to passphrases which are simple and easy to recall for them, making them vulnerable to dictionary attacks or attacks from people known to them in their social circle.
[003] There have been attempts such as the literature titled ‘Improving security and usability of passphrases with guided word choice’ by Nikola K. Blanchard et al. This work proposes a way of making more memorable, more secure passphrases by choosing from a randomly generated set of words presented as a two-dimensional array. However, the above work is based on a random corpus and targets a general user.
[004] However, there is no focus on the BEU category, where there are challenges such as limited vocabulary, language barrier, and local accent to be addressed. Furthermore, one of the important requirements for BEU specific passphrase generation is that the suggested passphrase should have high entropy for an adversary, yet the recall should be easy for the BEU, and thus it should effectively be a low entropy passphrase for the BEU.
SUMMARY
[005] Embodiments of the present disclosure present technological improvements as solutions to one or more of the above-mentioned technical problems recognized by the inventors in conventional systems.
[006] For example, in one embodiment, a method for high entropy passphrase generation is provided. The method includes prompting a user to speak out a recallable phrase comprising a set of words using the natural language of the user, wherein the recallable phrase is converted to a text using an Automated Speech Recognition (ASR) engine.
[007] Further, the method includes filtering a set of Seed List Words (SLWs) from the recallable phrase based on a personalized intersection corpus comprising a plurality of words correctly identifiable by the ASR engine.
[008] Further, the method includes generating a passphrase distance matrix for the set of SLWs by referring to an intersection corpus distance matrix generated for the personalized intersection corpus based on a vector distance between each of the plurality of words in the personalized intersection corpus, wherein column elements of the passphrase distance matrix comprise the set of SLWs in alphabetical order and row elements comprise a predefined number of passphrase words, for each SLW among the set of SLWs, identified based on descending order of the vector distance between each SLW and the plurality of words of the personalized intersection corpus, and wherein the descending order of vector distance arranges the predefined number of passphrase words from high entropy to low entropy values.
[009] Further, the method includes performing column wise splitting of the passphrase distance matrix to generate a plurality of splits, wherein each of the plurality of splits comprises a predefined number of words for each SLW, and wherein a first split to a last split comprises words with higher vector distance resulting in high entropy words, gradually shifting to lower vector distance resulting in low entropy words.
[010] Furthermore, the method includes generating a passphrase corpus comprising a plurality of passphrases generated from at least one of the first split and a subsequent split by a row wise selection of words of the passphrase distance matrix one at a time.
[011] Further, the method includes randomly reading out and displaying a preset number of passphrases from among the plurality of passphrases on a display screen of the user device, wherein a passphrase among the preset number of passphrases associated with the high entropy words is positioned at a display screen position having highest usability in context of the user. Furthermore, the method includes setting a user selected passphrase for device access authentication of the user device.
[012] In another aspect, a system for high entropy passphrase generation is provided. The system comprises a memory storing instructions; one or more Input/Output (I/O) interfaces; and one or more hardware processors coupled to the memory via the one or more I/O interfaces, wherein the one or more hardware processors are configured by the instructions as described below.
[013] In yet another aspect, there are provided one or more non-transitory machine-readable information storage mediums comprising one or more instructions, which when executed by one or more hardware processors cause a method for high entropy passphrase generation to be performed.
[014] The one or more hardware processors are configured to prompt a user to speak out a recallable phrase comprising a set of words using the natural language of the user, wherein the recallable phrase is converted to a text using an Automated Speech Recognition (ASR) engine.
[015] Further, the one or more hardware processors are configured to filter a set of Seed List Words (SLWs) from the recallable phrase based on a personalized intersection corpus comprising a plurality of words correctly identifiable by the ASR engine.
[016] Further, the one or more hardware processors are configured to generate a passphrase distance matrix for the set of SLWs by referring to an intersection corpus distance matrix generated for the personalized intersection corpus based on a vector distance between each of the plurality of words in the personalized intersection corpus, wherein column elements of the passphrase distance matrix comprise the set of SLWs in alphabetical order and row elements comprise a predefined number of passphrase words, for each SLW among the set of SLWs, identified based on descending order of the vector distance between each SLW and the plurality of words of the personalized intersection corpus, and wherein the descending order of vector distance arranges the predefined number of passphrase words from high entropy to low entropy values.
[017] Further, the one or more hardware processors are configured to perform column wise splitting of the passphrase distance matrix to generate a plurality of splits, wherein each of the plurality of splits comprises a predefined number of words for each SLW, and wherein a first split to a last split comprises words with higher vector distance resulting in high entropy words, gradually shifting to lower vector distance resulting in low entropy words.
[018] Furthermore, the one or more hardware processors are configured to generate a passphrase corpus comprising a plurality of passphrases generated from at least one of the first split and a subsequent split by a row wise selection of words of the passphrase distance matrix one at a time.
[019] Further, the one or more hardware processors are configured to randomly read out and display a preset number of passphrases from among the plurality of passphrases on a display screen of the user device, wherein a passphrase among the preset number of passphrases associated with the high entropy words is positioned at a display screen position having highest usability in context of the user. Furthermore, the one or more hardware processors are configured to set a user selected passphrase for device access authentication of the user device.
[020] The method includes prompting a user to speak out a recallable phrase comprising a set of words using the natural language of the user, wherein the recallable phrase is converted to a text using an Automated Speech Recognition (ASR) engine.
[021] Further, the method includes filtering a set of Seed List Words (SLWs) from the recallable phrase based on a personalized intersection corpus comprising a plurality of words correctly identifiable by the ASR engine.
[022] Further, the method includes generating a passphrase distance matrix for the set of SLWs by referring to an intersection corpus distance matrix generated for the personalized intersection corpus based on a vector distance between each of the plurality of words in the personalized intersection corpus, wherein column elements of the passphrase distance matrix comprise the set of SLWs in alphabetical order and row elements comprise a predefined number of passphrase words, for each SLW among the set of SLWs, identified based on descending order of the vector distance between each SLW and the plurality of words of the personalized intersection corpus, and wherein the descending order of vector distance arranges the predefined number of passphrase words from high entropy to low entropy values.
[023] Further, the method includes performing column wise splitting of the passphrase distance matrix to generate a plurality of splits, wherein each of the plurality of splits comprises a predefined number of words for each SLW, and wherein a first split to a last split comprises words with higher vector distance resulting in high entropy words, gradually shifting to lower vector distance resulting in low entropy words.
[024] Furthermore, the method includes generating a passphrase corpus comprising a plurality of passphrases generated from at least one of the first split and a subsequent split by a row wise selection of words of the passphrase distance matrix one at a time.
[025] Further, the method includes randomly reading out and displaying a preset number of passphrases from among the plurality of passphrases on a display screen of the user device, wherein a passphrase among the preset number of passphrases associated with the high entropy words is positioned at a display screen position having highest usability in context of the user. Furthermore, the method includes setting a user selected passphrase for device access authentication of the user device.
[026] It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
BRIEF DESCRIPTION OF THE DRAWINGS
[027] The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, serve to explain the disclosed principles:
[028] FIG. 1 is a functional block diagram of a system, for generating usable high entropy passphrases in local language from personalized intersection corpus for device authentication set up for a Basic Emergent User (BEU), in accordance with some embodiments of the present disclosure.
[029] FIGS. 2A through 2B (collectively referred to as FIG. 2) is a flow diagram illustrating a method for generating usable high entropy passphrases in local language from personalized intersection corpus for device authentication set up for a Basic Emergent User (BEU), using the system depicted in FIG. 1, in accordance with some embodiments of the present disclosure.
[030] FIGS. 3A through 3D are example illustrations of personalized intersection corpus generation, intersection corpus distance matrix generation, and passphrase selection process of the system of FIG. 1, in accordance with some embodiments of the present disclosure.
[031] FIGS. 4A through 4D are example illustrations of display tests executed by the system of FIG. 1 to determine the screen usability for the user, in accordance with some embodiments of the present disclosure.
[032] It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems and devices embodying the principles of the present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, and the like represent various processes which may be substantially represented in computer readable medium and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
DETAILED DESCRIPTION OF EMBODIMENTS
[033] Exemplary embodiments are described with reference to the accompanying drawings. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. Wherever convenient, the same reference numbers are used throughout the drawings to refer to the same or like parts. While examples and features of disclosed principles are described herein, modifications, adaptations, and other implementations are possible without departing from the scope of the disclosed embodiments.
[034] There are technical challenges to be addressed, arising from the technical capability limitations of a Basic Emergent User (BEU), in order to enable and assist a BEU in generating passphrases that have high entropy yet are easier for self-recall. For a robust BEU specific passphrase set up, a BEU customized or personalized corpus is needed, which takes care of the BEU’s language capability and accent, and further the ease of selecting and setting the passphrase. As of today, there are no tools for BEUs to generate higher entropy passphrases from the limited vocabulary they possess. This leads to passphrases which are simple and easy to recall, making them vulnerable to dictionary attacks or attacks from people known to them in their social circle. The keyboard layout on the smartphone for password or passphrase selection is also not optimal for input from their physiology perspective.
[035] Embodiments of the present disclosure provide a method and system for generating usable high entropy passphrases in local languages from a personalized intersection corpus for device authentication set up of a user device of a Basic Emergent User (BEU) (also referred to as the user). A recallable phrase (the user’s favorite phrase) spoken by the BEU in a local language is converted to a text, and Seed List Words (SLWs) are filtered based on a pre-generated personalized intersection corpus. A passphrase distance matrix is generated for the SLWs by referring to an intersection corpus distance matrix generated for the personalized intersection corpus. Words associated with each SLW are arranged in descending order of vector distance, or entropy. Words of the same order are concatenated for each SLW to generate the passphrase corpus. Randomly selected passphrases are read out and displayed on the user device, with the highest entropy passphrase positioned at the most usable display screen position, nudging the user to select a high entropy passphrase.
[036] It can be noted that the recallable phrase that the user is asked to read out, and the passphrase generated from the user read out phrase, have a word limitation of five words. This limitation is obtained based on Miller's criteria for memory capacity of a person (herein, the BEU).
[037] Referring now to the drawings, and more particularly to FIGS. 1 through 4D, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments, and these embodiments are described in the context of the following exemplary system and/or method.
[038] FIG. 1 is a functional block diagram of a system 100, also referred to as user device 100, for generating usable high entropy passphrases in local language from personalized intersection corpus for device authentication set up for a Basic Emergent User (BEU), in accordance with some embodiments of the present disclosure. In an embodiment, the system 100 includes a processor(s) 104, communication interface device(s), alternatively referred as input/output (I/O) interface(s) 106, and one or more data storage devices or a memory 102 operatively coupled to the processor(s) 104. The system 100 with one or more hardware processors is configured to execute functions of one or more functional blocks of the system 100.
[039] Referring to the components of system 100, in an embodiment, the processor(s) 104, can be one or more hardware processors 104. In an embodiment, the one or more hardware processors 104 can be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the one or more hardware processors 104 are configured to fetch and execute computer-readable instructions stored in the memory 102. In an embodiment, the system 100 can be implemented in a variety of computing systems including laptop computers, notebooks, hand-held devices such as mobile phones, workstations, mainframe computers, servers, and the like.
[040] The I/O interface(s) 106 can include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, a microphone and speaker interface for reading out to user and receiving user voice commands, phrases during passphrase generation, passphrases spoken for device authentication and the like. The I/O interface 106 can facilitate multiple communications within a wide variety of networks N/W and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular and the like. In an embodiment, the I/O interface (s) 106 can include one or more ports for connecting to a number of external devices or to another server or devices.
[041] The memory 102 may include any computer-readable medium known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes.
[042] In an embodiment, the memory 102 includes a plurality of modules 110. The plurality of modules 110 include programs or coded instructions that supplement applications or functions performed by the system 100 for executing different steps involved in the process of high entropy passphrase generation for the BEU, being performed by the system 100. The plurality of modules 110, amongst other things, can include routines, programs, objects, components, and data structures, which performs particular tasks or implement particular abstract data types. The plurality of modules 110 may also be used as, signal processor(s), node machine(s), logic circuitries, and/or any other device or component that manipulates signals based on operational instructions. Further, the plurality of modules 110 can be used by hardware, by computer-readable instructions executed by the one or more hardware processors 104, or by a combination thereof. The plurality of modules 110 can include various sub-modules (not shown) such as an Automated Speech Recognition (ASR) engine or the ASR engine can be accessed via Application Programming Interfaces (APIs).
[043] Further, the memory 102 may comprise information pertaining to input(s)/output(s) of each step performed by the processor(s) 104 of the system 100 and methods of the present disclosure.
[044] Further, the memory 102 includes a database 108. The database (or repository) 108 may include a plurality of abstracted pieces of code for refinement and data that is processed, received, or generated as a result of the execution of the plurality of modules in the module(s) 110. The database 108 can store the intersection passphrase corpus, passphrase distance matrix, the personalized intersection corpus, the passphrase corpus and the like.
[045] Although the database 108 is shown internal to the system 100, it will be noted that, in alternate embodiments, the database 108 can also be implemented external to the system 100 and communicatively coupled to the system 100. The data contained within such an external database may be periodically updated. For example, new data may be added into the database (not shown in FIG. 1) and/or existing data may be modified and/or non-useful data may be deleted from the database. In one example, the data may be stored in an external system, such as a Lightweight Directory Access Protocol (LDAP) directory or a Relational Database Management System (RDBMS). Functions of the components of the system 100 are now explained with reference to steps in flow diagrams in FIG. 2 through FIG. 4D.
[046] FIGS. 2A through 2B (collectively referred to as FIG. 2) is a flow diagram illustrating a method 200 for generating usable high entropy passphrases in a local language from a personalized intersection corpus for device authentication set up for a Basic Emergent User (BEU), using the system depicted in FIG. 1, in accordance with some embodiments of the present disclosure.
[047] In an embodiment, the system 100 comprises one or more data storage devices or the memory 102 operatively coupled to the processor(s) 104 and is configured to store instructions for execution of steps of the method 200 by the processor(s) or one or more hardware processors 104. The steps of the method 200 of the present disclosure will now be explained with reference to the components or blocks of the system 100 as depicted in FIG. 1 and the steps of flow diagram as depicted in FIG. 2. Although process steps, method steps, techniques or the like may be described in a sequential order, such processes, methods, and techniques may be configured to work in alternate orders. In other words, any sequence or order of steps that may be described does not necessarily indicate a requirement that the steps be performed in that order. The steps of processes described herein may be performed in any order practical. Further, some steps may be performed simultaneously.
[048] The method 200 pre-generates the personalized intersection corpus and the intersection corpus distance matrix, which is a one-time setup. FIGS. 3A, 3B and 3C are example illustrations of personalized intersection corpus generation. The intersection corpus is unique per BEU but, once generated, is standardized for all repeat operations for that BEU. The intersection corpus distance matrix generation process of the system of FIG. 1 comprises the steps of:
1. Generation of an original corpus by extracting words from preidentified literature comprising local language newspaper literature and local language schoolbook literature accessible to the user, among other such literature. In an embodiment, school books of the fifth standard are used. The same textbook corpus is administered to all BEU users to maintain standardization of the intervention.
2. Prompting a user to speak out a plurality of phrases from the preidentified literature, wherein the user is allowed to speak in a natural language, and is not restricted to follow an accent or to be present in a defined environment while speaking out the plurality of phrases. Thus, during corpus generation the user can be in his or her natural environment, so the surrounding noise, if any, and the typical accent of the user are captured for the specific user. This makes the corpus specific to the user's environment and accent, which also increases the entropy for an adversary attempting to guess the passphrase.
3. Generating text for each of the plurality of phrases to generate a recorded corpus, comprising words recognized by the ASR engine. The user spoken phrases are thus digitized.
4. Performing intersection of the original corpus and the recorded corpus to generate the personalized intersection corpus, which represents the dataset common to the ground truth and the ASR engine output. Say the number of words in the personalized intersection corpus is N. The personalized corpus, once generated, remains the same for the specific user.
5. Generating the intersection corpus distance matrix (N×N) based on the vector distance between each word of the personalized intersection corpus. Techniques well known in the art, such as Word2vec, represent a word as a high-dimension vector of numbers which captures relationships between words. In particular, words which appear in similar contexts are mapped to vectors which are nearby as measured by cosine similarity. This indicates the level of semantic similarity between the words. Thus, an N×N matrix is created, where N is the number of words in SetInterSectCorpus (also referred to as the personalized intersection corpus). The SetInterSectCorpus is sorted alphabetically, and each row is populated with the words w1, w2, ..., wN from this sorted list, with each column likewise having words w1, w2, ..., wN. That is, word W(C,R)=W11 is at location C1R1. The next column word is chosen as the word whose distance from W11 is maximum as per Word2vec; the word at C2R1 is hence W21=W11Max. The next word at C3R1 is chosen such that it is at maximum distance, W31=W11Max-Sorted Entropy Index. Thus generated is an N×N matrix where each word in it is at C1Ri, wherein 'i' is the index from 1 to N.
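As an illustrative, non-limiting sketch (not part of the claimed subject matter), the pairwise distance computation of step 5 may be expressed in Python. The three-dimensional toy vectors and the function name are hypothetical stand-ins for actual Word2vec embeddings of the intersection corpus:

```python
import math

def cosine_distance(u, v):
    # Cosine distance = 1 - cosine similarity; larger means less similar,
    # i.e., higher entropy with respect to the seed word.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return 1.0 - dot / (nu * nv)

# Toy 3-dimensional vectors standing in for word2vec embeddings.
vectors = {
    "earth":  [0.9, 0.1, 0.2],
    "planet": [0.8, 0.2, 0.3],
    "sphere": [0.1, 0.9, 0.4],
}

words = sorted(vectors)  # alphabetical order, as the method prescribes
# N x N intersection corpus distance matrix keyed by word pairs.
matrix = {w: {v: cosine_distance(vectors[w], vectors[v]) for v in words}
          for w in words}
```

In a deployment, the toy vectors would be replaced by embeddings trained on, or looked up for, the N words of the personalized intersection corpus.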
[049] Some works in the literature refer to certain matrix generation approaches, for example, ‘Linguistic matrix theory’ (NPL) by Dimitrios Kartsaklis et al. However, that work merely cites a Gaussian matrix method for a corpus of data. It does not specify the precise construction approach of structuring the words in the corpus based on the distance to create an entropy representation for a given set of contextualized data, as explained in Tables 3 and 4.
[050] Referring to the steps of the method 200, at step 202 of the method 200, the one or more hardware processors 104 are configured by the instructions to prompt the user of the user device (system 100) to speak out a recallable phrase comprising a set of words using the natural language of the user. The user can select an easily recallable phrase (a favorite phrase), wherein the recallable phrase is converted to a text using the ASR engine. The entire steps 202 through 214 are explained using a use case example. Let the recallable phrase, hereinafter also referred to as the phrase, be "Kashmir offers glimpse of paradise to people on earth" as spoken by the user.
[051] At step 204 of the method 200, the one or more hardware processors 104 are configured by the instructions to filter a set of Seed List Words (SLWs) from the phrase based on the personalized intersection corpus, which comprises a plurality of words correctly identifiable by the ASR engine (N words as mentioned above). The set of selected recognized words (SLWs) is: Kashmir, Offers, Glimpse, Paradise, People, Earth.
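The SLW filtering of step 204 can be sketched as follows; this is a minimal illustration, where the function name and the six-word toy corpus are hypothetical, and the real corpus would contain the N intersection words for the specific BEU:

```python
def filter_slws(recognized_text, intersection_corpus):
    # Keep only the ASR-recognized words that are present in the
    # personalized intersection corpus; other words are dropped.
    return [w for w in recognized_text.split()
            if w.lower() in intersection_corpus]

# Toy personalized intersection corpus (lowercased for matching).
corpus = {"kashmir", "offers", "glimpse", "paradise", "people", "earth"}
phrase = "Kashmir offers glimpse of paradise to people on earth"
slws = filter_slws(phrase, corpus)
# 'of', 'to', 'on' are not in the corpus and are filtered out,
# leaving the six SLWs of the use case example.
```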
[052] At step 206 of the method 200, the one or more hardware processors 104 are configured by the instructions to generate the passphrase distance matrix for the set of SLWs by referring to the intersection corpus distance matrix generated for the personalized intersection corpus based on a vector distance between each of the plurality of words in the personalized intersection corpus. The row elements of the passphrase distance matrix comprise the set of SLWs in alphabetical order. The column elements comprise a predefined number of passphrase words mapped to each of the SLWs, which are associated words from the intersection corpus distance matrix. The words for each SLW are identified based on descending order of the vector distance between the SLW and the plurality of words of the personalized intersection corpus. The descending order of vector distance indicates arranging words from less similar to more similar with respect to the SLW. Entropy in security can be understood as “a measure of the amount of uncertainty an attacker faces to determine the content of interest”; it is also a measure of unpredictability in a string of a data set. Thus, an increase in vector distance indicates that the word is less similar to the seed word and harder for an attacker to guess, and hence has high entropy; vice versa for words referred to as low entropy with respect to the SLW. Thus, the method herein arranges the predefined number of passphrase words based on vector distance, generating a high entropy to low entropy word sequence for each SLW. The example in Table 2 explains the same, wherein for the word ‘Earth’, ‘Planet’ is the highest similarity word while ‘Sphere’ is the least similarity word. Thus, ‘Sphere’ is placed earlier and ‘Planet’ later (highest to lowest entropy).
[053] For the user spoken phrase in the example above, the alphabetical order is: Earth, Glimpse, Kashmir, Offers, Paradise, People.
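A minimal sketch of the passphrase distance matrix construction of step 206 is given below; the toy distance values loosely follow Table 1, and the function name, `k` parameter, and single-SLW example are hypothetical illustrations rather than the claimed implementation:

```python
def passphrase_distance_matrix(slws, dist, k=10):
    # For each SLW (taken in alphabetical order), rank the corpus words
    # by descending vector distance, so the least similar (highest
    # entropy) candidate words come first, and keep the top k.
    rows = {}
    for seed in sorted(slws, key=str.lower):
        ranked = sorted((w for w in dist[seed] if w != seed),
                        key=lambda w: dist[seed][w], reverse=True)
        rows[seed] = ranked[:k]
    return rows

# Toy distances for the SLW 'Earth' (cf. Table 1).
dist = {"Earth": {"Earth": 0.0, "Sphere": 0.98, "Globe": 0.39, "Planet": 0.11}}
rows = passphrase_distance_matrix(["Earth"], dist)
```

With these toy values, ‘Sphere’ (distance 0.98) is ranked before ‘Globe’ and ‘Planet’, matching the highest-to-lowest entropy ordering described above.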
[054] At step 208 of the method 200, the one or more hardware processors 104 are configured by the instructions to perform column wise splitting of the passphrase distance matrix into a plurality of splits, with each split covering 5 words, the number identified based on Miller’s criteria. Thus, the maximum number of splits is N/5. An example first split and a subsequent split (interchangeably also referred to as the second split) of the passphrase distance matrix is depicted in the use case example explained herein. Each split comprises a predefined number of words (herein, 5 is the split length derived from Miller’s criteria for memory capacity for recall) for each SLW, as seen in Table 1 and Table 2 below with 5 words per split (in accordance with Miller’s criteria). The first split comprises words with higher vector distance resulting in high entropy words, the second split comprises words with lower vector distance resulting in low entropy, and this continues till the end of the N/5 splits (all of the plurality of splits).
Table 1
Max Distance Vector for the Selected SeedListWords (SLWs)
(Intersection corpus distance matrix )
Words (SLWs) Word 1 Word 2 Word 3 Word 4 Word 5 Word 6 Word 7 Word 8 Word 9 Word 10
Earth 0.98 0.92 0.85 0.76 0.66 0.54 0.47 0.39 0.26 0.11
Glimpse 0.88 0.83 0.72 0.66 0.41 0.39 0.31 0.24 0.19 0.07
Kashmir 0.78 0.72 0.52 0.43 0.31 0.24 0.19 0.15 0.09 0.04
Offers 0.93 0.91 0.87 0.72 0.57 0.39 0.31 0.23 0.12 0.08
Paradise 0.97 0.89 0.76 0.62 0.55 0.51 0.49 0.36 0.26 0.12
People 0.82 0.74 0.71 0.56 0.48 0.39 0.36 0.22 0.11 0.09
Generate = 0 (higher entropy for adversary); Generate = 1 (comparatively lower entropy)
Table 2
5 Passphrase illustrative words generated in each iteration for the words in the SLWs
Words (SLWs) Word 1 Word 2 Word 3 Word 4 Word 5 Word 6 Word 7 Word 8 Word 9 Word 10
Earth Sphere Geography Land Ground Bhu Bhumi Prithvi Globe World Planet
Glimpse View Gaze Look Eyeball Espy Notice Perceive Sight Peek Glance
Kashmir Shikara Kahwa Paradise Lakes Mountains Valleys Tourist Apples Saffron Jammu
Offers Propose Pass Grant Proffer Extend Recommend Suggest Present Give Provide
Paradise Lotusland Fantasyland Dreamland Bliss Fairyland Eden Utopia Nirvana Wonderland Heaven
People Masses Crowd Society Tribe Human Citizen Ethnic Public Community Person
Generate = 0 Generate = 1
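The column wise splitting of step 208 can be sketched as below; this is an illustrative fragment in which the function name is hypothetical, using the ‘Earth’ row of Table 2 as sample data:

```python
def make_splits(ranked_words, split_len=5):
    # Chunk one SLW's ranked word list into consecutive splits of
    # split_len words (5, following Miller's criteria). The first split
    # carries the highest-entropy words; later splits carry
    # progressively lower-entropy (more similar) words.
    return [ranked_words[i:i + split_len]
            for i in range(0, len(ranked_words), split_len)]

# 'Earth' row from Table 2, ranked from highest to lowest entropy.
earth_row = ["Sphere", "Geography", "Land", "Ground", "Bhu",
             "Bhumi", "Prithvi", "Globe", "World", "Planet"]
splits = make_splits(earth_row)  # splits[0] ~ Generate = 0, splits[1] ~ Generate = 1
```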
[055] At step 210 of the method 200, the one or more hardware processors 104 are configured by the instructions to generate the passphrase corpus comprising a plurality of passphrases generated from at least one of the first split and the subsequent split by row wise selection of words of the passphrase distance matrix one at a time. An example passphrase corpus for the example recallable phrase spoken by the user is provided in Table 3 below. This process continues for each tap of the regenerate button till exhaustion of the loop count = N/5.
Table 3
Select 5 Passphrases for the Mobile application
Passphrases SLWs Generate = 0 Generate = 1
Passphrase 1 Earth Sphere Geography Land Ground Bhu Bhumi Prithvi Globe World Planet
Passphrase 2 Glimpse View Gaze Look Eyeball Espy Notice Perceive Sight Peek Glance
Passphrase 3 Kashmir Shikara Kahwa Paradise Lakes Mountains Valleys Tourist Apples Saffron Jammu
Passphrase 4 Offers Propose Pass Grant Proffer Extend Recommend Suggest Present Give Provide
Passphrase 5 Paradise Lotusland Fantasyland Dreamland Bliss Fairyland Eden Utopia Nirvana Wonderland Heaven
Passphrase 6 (sixth row generated but not used since Miller’s criteria is applied) People Masses Crowd Society Tribe Human Citizen Ethnic Public Community Person
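The passphrase corpus generation of step 210, as reflected in Table 3, can be sketched as follows; the function name, the `generate` counter, and the two-row sample are hypothetical illustrations of the regenerate behavior described above:

```python
def passphrase_corpus(matrix_rows, generate=0, split_len=5):
    # Each candidate passphrase is the split_len-word slice of one SLW
    # row, taken from the split selected by the 'generate' counter.
    # Each tap of the regenerate button advances 'generate', moving to
    # the next, lower-entropy split (up to N/split_len splits).
    corpus = []
    for seed in matrix_rows:
        start = generate * split_len
        corpus.append(" ".join(matrix_rows[seed][start:start + split_len]))
    return corpus

# Two SLW rows from Table 2, ranked highest to lowest entropy.
rows = {
    "Earth":   ["Sphere", "Geography", "Land", "Ground", "Bhu",
                "Bhumi", "Prithvi", "Globe", "World", "Planet"],
    "Glimpse": ["View", "Gaze", "Look", "Eyeball", "Espy",
                "Notice", "Perceive", "Sight", "Peek", "Glance"],
}
high_entropy = passphrase_corpus(rows, generate=0)  # first split
low_entropy = passphrase_corpus(rows, generate=1)   # subsequent split
```

With `generate=0` this yields the Generate = 0 passphrases of Table 3 (e.g. "Sphere Geography Land Ground Bhu" for ‘Earth’), and incrementing `generate` yields progressively lower-entropy candidates.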
[056] Referring to Table 2, for each seed word in the list, find the row (i) where the word SLW(i) is present. For each word SLW(i) at Row(j, column=1), select the words at Row(j, 1