
Method, System And Apparatus For Conducting A Reliable Survey

Abstract: A survey system in an embodiment of the present invention monitors the integrity of the data collected by the surveyor. The surveyor is equipped with a mobile device to collect responses from a plurality of respondents along with integrity data. A server is configured to receive the responses and integrity data from the mobile device of the surveyor, and assigns a quality score to each response based on the integrity data. According to one embodiment of the present invention, a response collector in the survey system may receive survey data comprising a plurality of responses and a plurality of integrity data from a mobile device. Each response comprises a plurality of answers to a corresponding plurality of questions, and the integrity data may comprise at least one of: the time taken to answer a question in the survey, a first location where a respondent answered the question, and biometric data of the respondent.


Patent Information

Application #
Filing Date
21 November 2012
Publication Number
22/2014
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Parent Application

Applicants

MUBBLE NETWORKS PRIVATE LIMITED
#16, IIA, SBI COLONY, 3RD BLOCK, KORAMANGALA, BANGALORE

Inventors

1. PRANAV KUMAR JHA
295 PH 2, PALM MEADOWS, VARTHUR ROAD, BANGALORE 560 066
2. RAGHVENDRA VARMA
I-1101, SPRINGFIELDS, NEAR TOTAL MALL, SARJAPUR ROAD, BANGALORE - 560 102

Specification

DESCRIPTION

FIELD OF INVENTION / TECHNICAL FIELD

[0001] Embodiments of the present disclosure relate generally to survey systems, and more specifically to a method, system and apparatus for conducting a reliable survey.

RELATED ART

[0002] A survey is a method of gathering quantitative information about items from a number of individuals. Surveys have a variety of purposes and can be conducted in many ways: through a printed questionnaire, over the telephone, by mail, in person, by diskette, or on the web. The information is collected through standardized procedures so that every participant is asked the same questions in the same way; it involves asking people for information in some structured format. Depending on what is being analyzed, the participants being surveyed may be representing themselves, their employer, or some organization to which they belong.

[0003] A survey conducted on paper has several disadvantages, such as paper cost, postage and labour. These disadvantages may be eliminated through an online survey. Online surveys provide instant feedback to researchers: data collection is instantaneous and the results are automatically sorted. This technology saves researchers considerable time and money. Researchers can quickly view the results of the survey and go directly to the data analysis portion of the study.

[0004] But, unlike a study in which the researcher interviews a subject, online surveys depend on people being honest about basic demographic information such as age, gender and race. Since people are not always honest, this can create inaccuracy in the data. Occasionally, technical problems can also affect the user experience, and subsequently the quality, of online surveys: pages can time out and servers can become overloaded.

[0005] In an online/web survey, not everyone may be connected, and even among those who are, not all potential respondents are equally computer literate, so this survey method will not work with all populations. Also, screen configurations may appear significantly different from one respondent to another, depending on the settings of individual computers. Selection of e-mail addresses is also difficult, because there may sometimes be more than one e-mail address per respondent.

[0006] A reliable, user-friendly and cost-effective method of surveying that eliminates these existing problems is therefore required.

SUMMARY

[0007] A survey system in an embodiment of the present invention monitors the integrity of the data collected by the surveyor. The surveyor is equipped with a mobile device to collect responses from a plurality of respondents along with integrity data. A server is configured to receive the responses and integrity data from the mobile device of the surveyor, and assigns a quality score to each response based on the integrity data.

[0008] According to one embodiment of the present invention, a response collector in the survey system may receive survey data comprising a plurality of responses and a plurality of integrity data from a mobile device. Each response comprises a plurality of answers to a corresponding plurality of questions, and the integrity data may comprise at least one of: the time taken to answer a question in the survey, a first location where a respondent answered the question, and biometric data of the respondent.

[0009] According to another embodiment, the mobile device may comprise a display, an input device and a processor, among other components. The display is configured to display a plurality of questions of a survey to the plurality of respondents. The input device receives answers for the displayed questions, and the processor may be configured to generate integrity data for each answer without respondent intervention. The mobile device may be configured to send the plurality of answers and the integrity data to a central server.

[0010] According to yet another embodiment, a response analyzer in the survey system may determine the integrity of a response from the integrity data associated with that response. The response analyzer may mark an answer as suspect if the time taken to answer the question is greater than a threshold time.

[0011] According to yet another embodiment, the response analyzer may mark answers as suspect if the distance between the first location and a second location is greater than an allowable distance, where the second location is where the respondent answered another question in the response. The response analyzer may mark the whole response as suspect if the number of suspect answers in the response is greater than 40% of the answers in the response.

[0012] According to yet another embodiment, the response analyzer may assign a quality score to each response after the response quality analysis. A quality score of 10 indicates a good quality response, and a response with a quality score below 7 may be excluded from reports, summaries, conclusions and the like.

BRIEF DESCRIPTION OF DRAWINGS

[0013] The present invention is described with reference to the following accompanying drawings.

[0014] FIG. 1 is a block diagram depicting survey system according to an embodiment.

[0015] FIG. 2 is a flow chart illustrating the steps involved in the working of the present invention according to one embodiment.

[0016] FIG. 3 is a flowchart illustrating the measurement of time and position by the system.

[0017] FIG. 4 is a table showing the survey data comprising time taken to answer, location of the respondent and other integrity data.

[0018] FIG. 5 is a flowchart illustrating the overall working of a server according to one embodiment.

[0019] FIG. 6 is a flowchart for determining whether the answers are good or suspect based on time, according to one embodiment.

[0020] FIG. 7 is a flowchart for determining whether the answers are good or suspect based on location, according to one embodiment.

[0021] FIG. 8 is a flowchart illustrating the calculation of score of the responses after analyzing MTTA and answer location.

[0022] FIG. 9 is a flowchart for determining whether the answers are good or suspect based on the pause time of responses.

[0023] FIG. 10 is a flowchart illustrating measurement of quality score for each response.

[0024] FIG. 11 is an example diagram depicting the menu icons in the survey mobile device.

[0025] FIG. 12 is a respondent interface window in the survey mobile device for receiving user login details.

[0026] FIG. 13 is an example respondent's registration page in the survey mobile device for receiving respondent's details.

[0027] FIG. 14 is a diagram depicting the questionnaire page in the survey mobile device.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

[0028] FIG. 1 is a block diagram depicting a survey system according to an embodiment. The block diagram is shown comprising a survey mobile device 110, a network 120, a firewall 130 and a server 140. A mobile device may be a small hand-held computing device, typically having a touch pad and/or a miniature keyboard, which has an operating system and can run various types of application software.

The mobile device 110 may be connected to the server 140 via the network 120 and the firewall 130. The network 120 may be a collection of computers and/or mobile devices and other hardware components interconnected by communication channels that allow sharing of resources; it may be a LAN, a WAN (such as the Internet), a WLAN, a MAN, etc. The firewall 130 may be either software based or hardware based, and may be used to help keep a network secure by controlling incoming and outgoing network traffic: it analyzes the data packets and determines whether they should be allowed through, based on a predetermined rule set. The server 140 generally refers to a physical computer (a computer hardware system) dedicated to running one or more services (as a host) to serve the needs of the users of other computers on the network. A server may be a single physical computer or a group of computers, with one or more applications installed on each computer.

[0029] According to one embodiment of the present invention, survey application software may be installed on the survey mobile device 110. The survey mobile device 110 may be the surveyor's mobile device, with which the surveyor may approach various respondents and perform the survey. A respondent is an individual who provides answers to all the questions in a given questionnaire (a set of well-defined questions). The survey mobile device 110 may receive the questionnaire from the server 140 and display it to the respondent through the survey application software. The respondent may respond to the displayed questionnaire. The survey mobile device 110 may receive the response along with integrity data such as voice, image, answering style, answer pattern, minimum time field, etc., and may send them to the server 140. Responses to the questionnaire may be collected from several respondents and sent instantly to the server 140. The server computers 150 and 160 may be deployed with response collection system software and response analysis system software, respectively. The response collection system or response collector 150 may collect all the responses along with their integrity data and store them. The response analysis system or response analyser 160 may perform quality analysis of the responses stored in the response collection system 150 based on certain criteria and develop a quality score for each response. The survey mobile device 110 may comprise one or more of mobile phones, tablet PCs, hand-held devices or laptops, for example.

[0030] The manner in which the survey mobile device 110 may be configured to operate in the survey system 100 is explained below with the help of a flowchart.

[0031] FIG. 2 is a flow chart illustrating the steps involved in the working of the present invention according to one embodiment. The flowchart begins at step 201 and control passes to step 210.

[0032] In step 210, the survey may be started when the surveyor approaches a respondent. During a survey, multiple respondents may be asked questions from the same questionnaire. After approaching a respondent, the surveyor may collect the respondent's registration details using the survey mobile device 110. Once the respondent is registered, the server 140 may stream the questionnaire to the survey mobile device 110 and the control passes to step 220.

[0033] In step 220, the survey mobile device 110 may collect answers for the questionnaire.

The surveyor may collect answers for each question in the questionnaire from the respondent using the survey mobile device 110. The respondent may deliver the answer by marking it on the survey mobile device 110 using a device such as a stylus or joystick, or may deliver the answer by speech. The control passes to step 230.

[0034] In step 230, the survey mobile device 110 may collect additional integrity data. In addition to collecting answers for each question in the questionnaire, the survey mobile device 110 may also record the location of the respondent, the time taken by the respondent to answer each question, the respondent's voice, etc. during a survey. These integrity data may be used for calculating the quality score of each response. After receiving the integrity data, the control passes to step 240.

[0035] In step 240, the survey mobile device 110 may send the answers along with the integrity data to the server. After completing a survey, all the answers received for a questionnaire may be grouped as a response, and the time taken for the entire response may be calculated. The survey mobile device 110 may send all the answers after forming a response (a collection of answers for the set of questions in the questionnaire), or as soon as each answer is received for each question. Along with the response or each answer, the survey mobile device 110 may send the corresponding integrity data to the server 140. The integrity data may be sent instantly or after being stored temporarily on the device for a predetermined number of surveys.

Upon sending the responses along with integrity data, the control passes to step 299. The flowchart ends in step 299.

[0036] The manner in which the survey mobile device 110 may collect integrity data while a respondent provides a response to a survey is described in further detail below.

[0037] FIG. 3 is a flowchart illustrating the measurement of time and position by the system.

The flowchart starts in step 301 and the control passes to step 310.

[0038] In step 310, the survey mobile device 110 may present the question to the respondent.

Once the respondent has registered with the required details for the survey, the server 140 may stream the corresponding questionnaire to the survey mobile device 110. After receiving the questionnaire, the control passes to step 320.

[0039] In step 320, the survey mobile device 110 may receive the response from the respondent. A response is a set of answers received on a given questionnaire from a single respondent. The respondent may answer each question either by marking on the mobile device or by speech or through any other means. When a survey is completed, all the answers may be grouped together as a response.

[0040] According to one embodiment, the respondent's answer pattern may be sent to the server for determining the integrity of the answers. For example, repeating answer patterns such as a/a/a or b/b/b, or a sequential pattern such as a/b/c, may represent a non-serious response given by a respondent. The server or the mobile device may be configured to mark answers provided in one or more such patterns as suspect, for example after receiving the respondent's response.
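As an illustrative, non-limiting sketch of such a pattern check in Python (the function name, the run length of three and the single-letter choice encoding are assumptions, not part of the specification):

```python
def is_suspect_pattern(answers, min_run=3):
    """Flag a list of multiple-choice answers that contains a trivial
    run such as a/a/a (identical choices) or a/b/c (sequential
    choices). min_run = 3 is an assumed configurable default."""
    if len(answers) < min_run:
        return False
    codes = [ord(a) for a in answers]  # choice letters as numbers
    for i in range(len(codes) - min_run + 1):
        # identical run, e.g. a/a/a or b/b/b
        if all(codes[i + k] == codes[i] for k in range(min_run)):
            return True
        # sequential run, e.g. a/b/c
        if all(codes[i + k] == codes[i] + k for k in range(min_run)):
            return True
    return False
```

A response whose answers trip this check would then be marked suspect by the server or the device, per the paragraph above.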

[0041] In another example embodiment, the mobile device may capture the answering style of the respondent, such as, but not limited to, tapping style, the location or area of the touch screen in which the answer is received, tapping pressure or tapping frequency. In one embodiment, the answers are presented such that the respondent may answer by touching any position on the answer line/choice (row-wise over the touch screen). For example, while answering a multiple-choice question, different respondents may answer in different ways or styles, such as touching at the left end of the choice, at the centre, or at the right end. The touching style is sent to the server as integrity data. The server may analyse the touching style in survey responses to predict the integrity of the answers. If the same style is found in more than three (or some other configurable number of) survey responses, the server may conclude that the same individual has answered the survey questions, and those responses may be marked as suspect. As a further alternative, if no consistent response style is found within one response, the server may again mark the answers as suspect. The control passes to step 330.

[0042] In step 330, the survey mobile device 110 may calculate the time taken to answer each question. The time may be calculated using reference time T1 and reference time T2. The reference time T1 may be the time at which a question is posed to the respondent, and reference time T2 may be the time at which the respondent marks the corresponding answer on the mobile device. The time taken to answer the question may be the difference between reference time T2 and reference time T1, that is, T2 − T1. After calculating the time taken to answer the question, the control passes to step 340.
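A minimal sketch of the T2 − T1 computation in Python (the function name is illustrative):

```python
from datetime import datetime

def time_to_answer_seconds(t1: datetime, t2: datetime) -> float:
    """Time taken to answer = T2 - T1, where T1 is when the question
    was posed and T2 is when the respondent marked the answer."""
    return (t2 - t1).total_seconds()
```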

[0043] In step 340, the survey mobile device 110 may collect the location information for each answer. The location information may be collected through any positioning system such as GPS, Inertial navigation, cell phone reference, etc. The location information may be collected while answering each question or for entire response. While recording an answer, the survey mobile device 110 may note the location where the answer was recorded. After locating the position of the respondent, the control passes to step 350.

[0044] In step 350, the survey mobile device 110 may repeat the above steps for all questions in the questionnaire. All the answers, along with the time taken for each answer and the location of each answer, may be recorded and sent to the server 140. The control passes to step 399, where the flowchart ends.

[0045] FIG. 4 is a table showing example survey data comprising the time taken to answer, the location of the respondent and other integrity data. The data received from the survey mobile device 110 may be stored in the server in the form of a table. Column 410 may represent the questions in the questionnaire. The questions in a questionnaire may be a well-defined set of questions, arranged in a predetermined order from general to specific. The questions may be grouped according to content, which may help the respondent organise his/her thoughts and reactions, leading to more accurate responses. Column 420 may represent the reference time T1, the time at which a question is posed to the respondent, and column 430 the reference time T2, the time at which the answer is marked by the respondent. The difference between the two reference times may later be calculated for each question to find the time taken to answer that particular question. Column 440 may represent the location of each answer. The locations of the answers in a questionnaire may be verified to check whether they are within a configurable minimum distance. Column 450 may represent additional information such as the respondent's voice, image, biometric data, etc. Voice and image may be recorded additionally to check the integrity of the response.
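One row of the FIG. 4 table might be represented on the server as follows; this is a sketch, and the class name, field names and unit choices are assumptions:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class AnswerRecord:
    """One row of the FIG. 4 table: a question, its answer, the two
    reference times, the answer location and optional extra media."""
    question: str                   # column 410
    answer: str
    t1: float                       # column 420, epoch seconds (question posed)
    t2: float                       # column 430, epoch seconds (answer marked)
    location: Tuple[float, float]   # column 440, (latitude, longitude)
    voice: Optional[bytes] = None   # column 450, additional integrity data

    @property
    def time_taken(self) -> float:
        """Time taken to answer this question, T2 - T1."""
        return self.t2 - self.t1
```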

[0046] The manner in which the server 140 operates on the data received from the survey mobile device 110 is described in further detail below.

[0047] FIG. 5 is a flowchart illustrating example operation of a server according to one embodiment. The flowchart begins at step 501 and the control passes to step 510.

[0048] In step 510, the server 140 may receive responses from the survey mobile device 110 corresponding to a plurality of respondents. During a survey, the surveyor may approach several respondents with the survey mobile device to collect answers for the same questionnaire. The collected responses, along with the integrity data, may be sent to the server 140 instantly or after being stored temporarily for a predetermined number of surveys. The server 140 may receive all the responses along with the integrity data and store them in the response collection system 150. After receiving the responses, the control passes to step 520.

[0049] In step 520, the response analysis system 160 may analyse the response and determine its integrity. In one embodiment, the response analysis system 160 analyses the data and may compute certain predetermined parameters, such as the mean time to answer (MTTA) and the distance between answer locations, using the integrity data sent by the mobile device 110. The manner in which these parameters are determined/computed is described in later sections. Based on the computed parameters, the response analysis system 160 may assign a quality score to the responses. After analysing the response, the control passes to step 530.

[0050] In step 530, the server 170 may provide a survey report with integrity data information.

After analysing the responses and assigning quality scores, server 170 may generate a survey report that indicates the quality or reliability of the survey. The generated report may be stored in the database. The control passes to step 599, where the flowchart ends.

[0051] The manner in which the server may determine a score for the answers is described in further detail below.

[0052] FIG. 6 is a flowchart illustrating the manner in which the response analysis system 160 determines whether an answer is good or suspect based on time, in an embodiment. The flowchart begins at step 601 and the control passes to step 610.

[0053] In step 610, the response analysis system 160 analyses the time T taken to answer a question in the response. The time taken to answer a question may be calculated as the difference between the corresponding reference times T1 and T2, as described with reference to FIG. 3. After analysing the time taken to answer each question, the control passes to step 620.

[0054] In step 620, the response analysis system 160 may calculate the mean time to answer (MTTA) each question across all the responses. The MTTA may be computed by taking the mean across all the responses received for a particular questionnaire. After calculating the MTTA, the control passes to step 630.

[0055] In step 630, the response analysis system 160 may check whether the time T taken to answer a question exceeds the MTTA. Specifically, the response analysis system 160 may check whether the time taken to answer a question exceeds a configurable number of standard deviations with respect to its corresponding MTTA. This may be done for all the answers stored for a particular survey.

If the result is true, the control passes to step 640, otherwise the control passes to step 650.

[0056] In step 640, the response analysis system 160 may mark the answer as suspect. If the response analysis system 160 finds that the time taken for answering any of the questions exceeds the configurable standard deviation relative to MTTA, the corresponding answer may be marked as suspect, otherwise the answer may be marked as good in step 650.

[0057] According to one embodiment, a minimum time to answer may also be used to check the integrity of the response. A minimum time field may be set for each question in the questionnaire for minimum time analysis. If a question is answered too quickly (the time taken is below the minimum time field), it may indicate a non-serious response. A minimum time analysis may be performed for each question in the questionnaire, and answers whose time taken is less than this minimum time may be marked as suspect. After determining the answers as suspect or good, the control passes to step 699. The flowchart ends in step 699.
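The time-based checks of steps 610 to 650, combined with the minimum-time analysis, might be sketched as follows. The deviation multiplier k and the minimum time are illustrative configurable defaults, not values fixed by the description:

```python
import statistics

def mark_suspect_by_time(times, k=1.0, min_time=1.0):
    """For one question across all responses: an answer is suspect if
    its time-to-answer exceeds the MTTA by more than k standard
    deviations, or falls below the per-question minimum time.
    Both k and min_time (seconds) are assumed defaults."""
    mtta = statistics.mean(times)   # mean time to answer
    sd = statistics.pstdev(times)   # population standard deviation
    return [t < min_time or t > mtta + k * sd for t in times]
```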

[0058] The manner in which the server determines whether the answers are good or suspect based on location is described below with the help of a flowchart.

[0059] FIG. 7 is a flowchart for determining whether the answers are good or suspect based on location, according to one embodiment. The flowchart begins at step 701 and the control passes to step 710.

[0060] In step 710, the response analysis system 160 may analyse the location of each answer in a response from the table. After evaluating each answer of a questionnaire across all responses based on time, the response analysis system 160 may check each response in detail based on the location of each answer. After finding the location of each answer in a response, the control passes to step 720.

[0061] In step 720, the response analysis system 160 may determine whether all answer locations are within a minimum distance. The configurable minimum distance may be set to 50 metres by default, for example. The response analysis system 160 may calculate the distances between all the answer locations in a response. The distance may be calculated by simple summation. After calculating the distances, the control passes to step 730.

[0062] In step 730, the response analysis system 160 may check whether the distance between two or more answer locations is greater than the minimum distance. If the distance between any two or more answer locations exceeds the configurable minimum distance, the control passes to step 740; otherwise the control passes to step 750.

[0063] In step 740, the response analysis system 160 may mark as suspect the two or more answers whose separation exceeds the default configurable minimum distance. After marking the answers as suspect, the control passes to step 760.

[0064] In step 750, the response analysis system 160 may mark the two or more answers as good. The good answers are those which fall within the configurable minimum distance. After marking the answers as good, the control passes to step 799.

[0065] In step 760, the response analysis system 160 may calculate the distance of the suspect answers from the locations of the remaining answers in the response. The response analysis system 160 may check the suspect answers for their distance from all other answers. The distance may be a simple summation of the distances from the locations of all answers other than the suspect ones. The control passes to step 770.

[0066] In step 770, the response analysis system 160 may re-mark the answer with the least distance as good and leave the other marked as suspect. After checking the suspect answers for their distance from all other answers, the answer that scores least on distance may be retained as good, and the other answers may be left marked as suspect. Then the control passes to step 799. The flowchart ends in step 799.
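The location checks of FIG. 7 might be sketched as below. The haversine distance and the 50-metre default are assumptions consistent with the description; the spherical-Earth radius is a standard approximation:

```python
import math

def pairwise_distance_m(loc_a, loc_b):
    """Approximate great-circle distance in metres between two
    (latitude, longitude) pairs, via the haversine formula."""
    r = 6371000.0  # mean Earth radius in metres
    lat1, lon1, lat2, lon2 = map(math.radians, (*loc_a, *loc_b))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def mark_suspect_by_location(locations, max_distance_m=50.0):
    """Per FIG. 7: any pair of answer locations farther apart than
    max_distance_m marks both answers suspect (steps 730-740); the
    suspect answer with the least summed distance to the other
    answers is then re-marked as good (steps 760-770)."""
    n = len(locations)
    suspect = [any(pairwise_distance_m(locations[i], locations[j]) > max_distance_m
                   for j in range(n) if j != i)
               for i in range(n)]
    flagged = [i for i, s in enumerate(suspect) if s]
    if flagged:
        totals = {i: sum(pairwise_distance_m(locations[i], locations[j])
                         for j in range(n) if j != i)
                  for i in flagged}
        suspect[min(totals, key=totals.get)] = False  # retain as good
    return suspect
```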

[0067] The manner in which the response analysis system 160 may compute the score of a response is described below in further detail.

[0068] FIG. 8 is a flowchart illustrating the manner in which the score may be assigned to an answer based on MTTA and answer location in one embodiment. The flowchart begins at step 801 and the control passes to step 810.

[0069] In step 810, the response analysis system 160 may analyze all responses. The response analysis system 160 may analyze all the suspected and good answers based on MTTA and answer location among all responses. Then, the control passes to step 815.

[0070] In step 815, the response analysis system 160 may check whether more than 40% of the answers in the response are marked as suspect. After the analysis of all responses for mean time to answer (MTTA) and answer location, the response analysis system 160 may calculate a score for each response. The score may be calculated based on the percentage of suspect answers. If the percentage of suspect answers is more than 40, the control passes to step 820; otherwise the control passes to step 825.

[0071] In step 820, the response analysis system 160 may mark the score as 5, and the control passes to step 899.

[0072] In step 825, the response analysis system 160 checks whether the percentage of suspect answers is greater than 30% and less than 40%. If so, the control passes to step 830; otherwise the control passes to step 835. In step 830, the response analysis system 160 marks the score as 6 and the control passes to step 899.

[0073] In step 835, the response analysis system 160 checks whether the percentage of suspect answers is greater than 20% and less than 30%. If yes, the control passes to step 840; otherwise the control passes to step 845. In step 840, the response analysis system 160 marks the score as 7 and the control passes to step 899.

[0074] In step 845, the response analysis system 160 checks whether the percentage of suspect answers is greater than 10% and less than 20%. If so, the control passes to step 850; otherwise it passes to step 855. In step 850, the response analysis system 160 may mark the score as 8 and the control passes to step 899.

[0075] In step 855, the response analysis system 160 checks whether the percentage of suspect answers is greater than 5% and less than 10%. If the percentage of suspect answers lies between 5 and 10, the control passes to step 860; otherwise the control passes to step 865. In step 860, the response analysis system 160 may mark the score as 9 and the control passes to step 899. In step 865, the response analysis system 160 may mark the score as 10 and the control passes to step 899. The flowchart ends in step 899.
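The score bands of steps 815 to 865 reduce to a simple mapping from the suspect-answer percentage to a score. A sketch, treating the lower band boundaries as exclusive (one way of resolving the wording; the exact boundary behaviour is an assumption):

```python
def response_score(num_suspect: int, num_answers: int) -> int:
    """Map the fraction of suspect answers in a response to the
    score bands of FIG. 8: >40% -> 5, 30-40% -> 6, 20-30% -> 7,
    10-20% -> 8, 5-10% -> 9, otherwise 10."""
    pct = 100.0 * num_suspect / num_answers
    if pct > 40:
        return 5
    if pct > 30:
        return 6
    if pct > 20:
        return 7
    if pct > 10:
        return 8
    if pct > 5:
        return 9
    return 10
```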

[0076] FIG. 9 is a flowchart for determining whether the answers are good or suspect based on the pause time of responses. The flowchart begins at step 901 and the control passes to step 910. In step 910, the response analysis system 160 may calculate the pause time of each response. The pause time may be the difference between the total time taken to complete a response and the sum of the times taken for each answer. For example, suppose a survey is started at 2 PM and the last question is answered at 3 PM, but the sum total of the time taken to answer each question is 40 minutes. In this case, the pause time is 1 hour minus 40 minutes, which is equal to 20 minutes. In a similar way, the response analysis system 160 may calculate the pause time for all the responses. After calculating the pause times, the control passes to step 920.

[0077] In step 920, the response analysis system 160 may calculate the mean pause time across all responses. The mean pause time may be the average pause time across all the responses. After calculating the mean pause time, the control passes to step 930.
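The pause-time computation of steps 910 and 920, reproducing the 2 PM to 3 PM worked example, might be sketched as follows (the function names are illustrative):

```python
import statistics

def pause_time(total_minutes, answer_minutes):
    """Pause time = total time to complete a response minus the
    sum of the per-answer times."""
    return total_minutes - sum(answer_minutes)

def mean_pause_time(responses):
    """Mean pause time across all responses; each response is a
    (total_minutes, per-answer-minutes) pair."""
    return statistics.mean(pause_time(t, a) for t, a in responses)
```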

[0078] In step 930, the response analysis system 160 may compare the pause time of each response with the mean pause time. The pause time of each response may be compared with the mean pause time. Then, the control passes to step 940.

[0079] In step 940, the response analysis system 160 may check whether the pause time is more than a threshold value. The threshold may be a default value expressed as a configurable number of standard deviations from the mean pause time, and may be set to three by default. In this step, the response analysis system 160 may check whether the pause time exceeds this threshold. If yes, the control passes to step 950; otherwise the control passes to step 960.

[0080] In step 950, the response analysis system 160 may mark the response as suspect. If the pause time of any response exceeds the threshold value, the response may be marked as suspect and the control passes to step 970. In step 960, the response analysis system 160 may mark the response as good and the control passes to step 999.

[0081] In step 970, the response analysis system 160 may reduce the score of the response.
After analysing each answer in a response based on the time taken to answer and the location, the response analysis system 160 may also analyse each response based on pause time. If any response is marked as suspect after the pause time analysis, the score of that response may be reduced by 1. After changing the score, the control passes to step 999. The flowchart ends in step 999.

[0082] The manner in which the response analysis system 160 measures the quality score for each response is explained with reference to the following flowchart.

[0083] FIG. 10 is a flowchart illustrating the measurement of the quality score for each response. The flowchart begins at step 1001 and the control passes to step 1010. In step 1010, the response analysis system 160 analyzes the score of each response. At the end of the response quality analysis, the response analysis system 160 may have a quality score for each response. After analyzing the score, the control passes to step 1020.

[0084] In step 1020, the response analysis system 160 checks whether the score is equal to K1. K1 may be a constant which may be set to a default value of 10. If the score is equal to K1, the control passes to step 1030; otherwise the control passes to step 1040. In step 1030, the response analysis system 160 determines the response to be a good quality response.

[0085] In step 1040, the response analysis system 160 checks whether the score is less than K2. K2 may be another constant which may be set to a default value of 7. If the score is less than K2, the control passes to step 1050; otherwise the control passes to step 1060.

[0086] In step 1050, the response analysis system 160 excludes the response. If the quality score is less than K2, the corresponding response may be excluded from reports, summaries and conclusions, and the control passes to step 1099. If the quality score is greater than K2, the response may be included while making reports, conclusions etc., which is represented in step 1060. In step 1060, the response analysis system 160 may include the response and the control passes to step 1099. The flowchart ends at step 1099.
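The classification in FIG. 10 can be sketched as follows, using the default values K1 = 10 and K2 = 7 from paragraphs [0084] and [0085]; the function name is illustrative:

```python
def classify_response(score, k1=10, k2=7):
    """Classify a response by its quality score (FIG. 10):
    score == K1 -> good quality response (step 1030);
    score <  K2 -> excluded from reports and summaries (step 1050);
    otherwise   -> included in reports (step 1060)."""
    if score == k1:
        return "good"
    if score < k2:
        return "excluded"
    return "included"

print(classify_response(10))  # good
print(classify_response(6))   # excluded
print(classify_response(8))   # included
```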

[0087] The working procedure explained above may be executed with a mobile device installed with a survey application and a server installed with the response collection system and the response analysis system. The mobile device installed with the survey application is further explained below.

[0088] The mobile device shown may comprise functional units such as, but not limited to, a display, an input device, a processor and other peripheral devices providing the desired functionality to the user. The functional units of the mobile device are not shown in the figure for simplicity. The display may be configured to display the questionnaire to the respondent. The input device may receive answers to the questionnaire displayed on the display. The processor in the mobile device may be configured to generate the integrity data for each answer without respondent intervention. In the present invention, the display, the input device and the processor may together represent a mobile device.
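The processor's generation of integrity data without respondent intervention might be sketched as follows; this is a minimal illustration, and the `get_gps_fix` helper is a hypothetical stand-in for the device's location API:

```python
import time

def get_gps_fix():
    """Hypothetical stand-in for the mobile device's GPS API."""
    return (12.9716, 77.5946)  # latitude, longitude

def record_answer(question_id, answer, started_at):
    """Attach integrity data (time taken and location) to an answer
    automatically, without any action by the respondent."""
    return {
        "question_id": question_id,
        "answer": answer,
        "integrity": {
            "time_taken": time.time() - started_at,  # seconds
            "location": get_gps_fix(),
        },
    }
```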

[0089] FIG. 11 is an example diagram depicting the menu icons in the survey mobile device. The mobile device is shown comprising a survey application 1110, a voice recorder, GPS, contacts, settings, messaging etc. According to one embodiment, the survey application 1110 may be installed on the mobile device and may be used to present the questionnaire to the respondent and record all answers. The voice recorder may be used to record spoken voice, singing or any sound effects. The GPS installed in the mobile device may provide the location and time information of the respondent.

[0090] FIG. 12 is an example respondent interface window in the survey mobile device for receiving user login details. The survey application in the mobile device 110 receives the user name and password of the respondent. Upon receiving the login details, the survey application may send them to the server. The server may authenticate the user with the furnished login details and may allow the respondent to access the survey application.

[0091] FIG. 13 is an example respondent's registration page in the survey mobile device for receiving the respondent's details. The registration page is shown containing the respondent's details, which may include the respondent's name, designation, address, phone number, email ID, survey ID 1310, survey location etc. According to an embodiment, the survey ID 1310 may represent the type of organisation/industry, institution etc. After entering the details, the respondent may click the submit button, which causes the details to be stored in the server database. If the respondent needs to make any changes, he/she may click the reset button and edit the details. Once the respondent clicks the submit button, the server may stream the questionnaire corresponding to the survey ID and the respondent details furnished, which is shown in the figure below.

[0092] FIG. 14 is a diagram depicting an example questionnaire page in the survey mobile device. The questionnaire page 1410 is shown comprising a set of well-defined questions displayed on the mobile device. Each question in a questionnaire may be displayed on the mobile device with multiple answer options. The respondent may choose any of the answers provided based on his/her wish or requirement. The respondent may deliver the answer for each question either by marking on the survey mobile device 110 using devices such as a stylus, joystick etc., or by speech. After marking an answer, the respondent may click the next button to move on to the next question.

[0093] While various examples of the present disclosure have been described above, it should be understood that they have been presented by way of example, and not limitation. Thus, the breadth and scope of the present disclosure should not be limited by any of the above-described examples, but should be defined in accordance with the following claims and their equivalents.

CLAIMS

I/we claim,

1. A system comprising:

a response collector configured to receive survey data from a mobile device, wherein the survey data comprises a plurality of responses and a plurality of integrity data; and
a response analyser configured to determine the integrity of a response from the integrity data associated with the response, wherein the response comprises a plurality of answers to a corresponding plurality of questions.

2. The system of claim 1, wherein the integrity data comprises at least one of a time taken to answer a question in a survey, a first location where a respondent answered the question, and biometric data of the respondent.

3. The system of claim 2, wherein the response analyser is configured to mark an answer as a suspect answer if the time taken to answer the question is greater than a threshold time.

4. The system of claim 3, wherein the response analyser is configured to mark the answers as suspect answers if the distance between the first location and a second location is greater than an allowable distance, where the second location is where the respondent answered another question in the response.

5. The system of claim 4, wherein the response analyser is configured to mark a response as suspect if the number of suspect answers in the response is greater than 40% of the answers in the response.

6. A mobile device comprising:

a display configured to display a plurality of questions of a survey to a plurality of respondents;
an input device configured to receive plurality of answers to the plurality of questions displayed on the display; and
a processor configured to generate integrity data for each answer when a question is answered by a respondent.

7. The mobile device of claim 6, wherein the integrity data comprises at least one of a time taken to answer, a location from where the answer is recorded, and biometric data of the respondent.

8. The mobile device of claim 7, wherein the processor is configured to generate the integrity data without respondent intervention.

9. The mobile device of claim 8, further comprising a transceiver configured to send the plurality of answers and the integrity data to a central server.

10. The mobile device of claim 9, wherein the plurality of questions of the survey is received from the server.

Documents

Application Documents

# Name Date
1 4861-CHE-2012 ASSIGNMENT 21-11-2012.pdf 2012-11-21
2 4861-CHE-2012 FORM-3 21-11-2012.pdf 2012-11-21
3 4861-CHE-2012 POWER OF ATTORNEY 21-11-2012.pdf 2012-11-21
4 4861-CHE-2012 FORM-2 21-11-2012.pdf 2012-11-21
5 4861-CHE-2012 ABSTRACT 21-11-2012.pdf 2012-11-21
6 4861-CHE-2012 FORM-1 21-11-2012.pdf 2012-11-21
7 4861-CHE-2012 CLAIMS 21-11-2012.pdf 2012-11-21
8 4861-CHE-2012 DRAWINGS 21-11-2012.pdf 2012-11-21
9 4861-CHE-2012 DESCRIPTION (COMPLETE) 21-11-2012.pdf 2012-11-21
10 4861-CHE-2012 CORRESPONDENCE OTHER 21-11-2012.pdf 2012-11-21