
Method And Electronic Device For Determining Information As Accurate Or Fake

Abstract: “METHOD AND ELECTRONIC DEVICE FOR DETERMINING INFORMATION AS ACCURATE OR FAKE” Accordingly, embodiments herein disclose a method for determining information as accurate or fake by an electronic device (100). The method includes receiving one or more first information and processing the one or more first information using a machine learning model. Further, the method includes identifying a location of the first information and determining one or more users present in the identified location. Further, the method includes acquiring one or more second information similar to the first information. The one or more second information is acquired from the one or more users present in the identified location. Further, the method includes matching the one or more second information with the one or more first information. Further, the method includes determining that the one or more first information is accurate or fake based on the matching.


Patent Information

Application #
202121024513
Filing Date
02 June 2021
Publication Number
49/2022
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
yasirdgku@gmail.com
Parent Application

Applicants

Lieko Technologies Pvt Ltd
401 B, Surubhi Enclave Nagras Road Pune Maharashtra India 411007

Inventors

1. Jai Shankar Vishwakarma
A101, First Floor, Tinseltown, Hinjewadi Phase 2 Next to Embassy Quadron IT Park Pune Maharashtra India 411057

Specification

CLAIMS
We claim:
1. A method for determining information as accurate or fake, comprising:
receiving, by an electronic device (100), one or more first information;
processing, by the electronic device (100), the one or more first information using at least one of a machine learning (ML) model or an artificial intelligence (AI) model;
identifying, by the electronic device (100), a location of the one or more first information;
determining, by the electronic device (100), one or more users present in the identified location;
acquiring, by the electronic device (100), one or more second information similar to the first information, wherein the one or more second information is acquired from the one or more users present in the identified location;
matching, by the electronic device (100), the one or more second information with the one or more first information; and
determining, by the electronic device (100), that the one or more first information is accurate or fake based on the matching.

2. The method as claimed in claim 1, further comprising:
classifying, by the electronic device (100), one or more accurate portions of the one or more first information, one or more inaccurate portions of the one or more first information, and fake information associated with the one or more first information;
causing to display, by the electronic device (100), the one or more accurate portions of the one or more first information, the one or more inaccurate portions of the one or more first information, and the fake information associated with the one or more first information to a user; and
receiving, by the electronic device (100), an input from the user of the electronic device (100).

3. The method as claimed in claim 1, wherein determining, by the electronic device (100), the one or more users present in the identified location comprises:
analyzing, by the electronic device (100), one or more previous information posted by the one or more users;
providing, by the electronic device (100), a higher reputation index based on the analyzing; and
determining, by the electronic device (100), the one or more users present in the identified location based on the higher reputation index.

4. The method as claimed in claim 3, wherein the higher reputation index indicates a trust level of the user, and wherein the higher reputation index is determined on the basis of expertise associated with the first information, a demography, and knowledge of the first information.

5. The method as claimed in claim 1, wherein processing, by the electronic device (100), the one or more first information using the machine learning model comprises:
consolidating, by the electronic device (100), the one or more first information;
parsing, by the electronic device (100), the one or more first information;
validating, by the electronic device (100), the one or more first information;
tagging, by the electronic device (100), the one or more first information;
updating, by the electronic device (100), the one or more first information; and
filtering, by the electronic device (100), the one or more first information.

6. The method as claimed in claim 1, wherein the one or more first information is received from one or more of an online content, a social media content, a news information, and a user input.

7. An electronic device (100) for determining information as accurate or fake, comprising:
a processor (140);
a memory (130); and
an artificial intelligence (AI) based information fact check controller (110) coupled with the processor (140) and the memory (130), configured to:
receive one or more first information;
process the one or more first information using at least one of a machine learning (ML) model or an artificial intelligence (AI) model;
identify a location of the one or more first information;
determine one or more users present in the identified location;
acquire one or more second information similar to the first information, wherein the one or more second information is acquired from the one or more users present in the identified location;
match the one or more second information with the one or more first information; and
determine that the one or more first information is accurate or fake based on the matching.

8. The electronic device (100) as claimed in claim 7, wherein the artificial intelligence (AI) based information fact check controller (110) is further configured to:
classify one or more accurate portions of the one or more first information, one or more inaccurate portions of the one or more first information, and fake information associated with the one or more first information;
display the one or more accurate portions of the one or more first information, the one or more inaccurate portions of the one or more first information, and the fake information associated with the one or more first information to a user; and
receive an input from the user of the electronic device (100).

9. The electronic device (100) as claimed in claim 7, wherein, to determine the one or more users present in the identified location, the artificial intelligence (AI) based information fact check controller (110) is configured to:
analyze one or more previous information posted by the one or more users;
provide a higher reputation index based on the analyzing, wherein the higher reputation index indicates a trust level of the user, and wherein the higher reputation index is determined on the basis of expertise associated with the first information, a demography, and knowledge of the first information; and
determine the one or more users present in the identified location based on the higher reputation index.

10. The electronic device (100) as claimed in claim 7, wherein, to process the one or more first information using the machine learning model, the artificial intelligence (AI) based information fact check controller (110) is configured to:
consolidate the one or more first information, wherein the one or more first information is received from one or more of an online content, a social media content, a news information, and a user input;
parse the one or more first information;
validate the one or more first information;
tag the one or more first information;
update the one or more first information; and
filter the one or more first information.

11. A method for discriminating information, comprising:
receiving, by an electronic device (100), at least one information;
identifying, by the electronic device (100), the at least one information as real information or fake information; and
performing, by the electronic device (100), one of:
causing to display to a user if the at least one information is the real information, and
discarding the at least one information if the at least one information is the fake information.

12. The method as claimed in claim 11, further comprising:
performing, by the electronic device (100), at least one of:
ranking a fake information providing source based on the identification,
publishing a fake information providing source based on the identification,
notifying a fake information providing source to a central authority based on the identification, and
blocking a fake information providing source based on the identification.

13. The method as claimed in claim 11, wherein identifying, by the electronic device (100), the at least one information as the real information or the fake information comprises:
parsing the at least one information;
validating the at least one parsed information; and
identifying the at least one information as the real information or the fake information based on the validation, wherein the at least one parsed information is validated using at least one of an artificial intelligence (AI) model and a machine learning (ML) model.

14. An electronic device (100) for discriminating information, comprising:
a processor (140);
a memory (130); and
a data driven based information fact check controller (110), coupled with the processor (140) and the memory (130), configured to:
receive at least one information;
identify the at least one information as a real information or a fake information; and
perform one of:
display to a user if the at least one information is the real information, and
discard the at least one information if the at least one information is the fake information.

Dated this 2nd June, 2021
Signatures:
Name of the Signatory: Yasir Arafath
Patent Agent No- 3798

FORM 2
The Patent Act 1970
(39 of 1970)
&
The Patent Rules, 2005

COMPLETE SPECIFICATION
(SEE SECTION 10 AND RULE 13)

TITLE OF THE INVENTION

“METHOD AND ELECTRONIC DEVICE FOR DETERMINING INFORMATION AS ACCURATE OR FAKE”

APPLICANT:
Name : Lieko Technologies Pvt Ltd

Nationality : India

Address : 401 B, Surubhi Enclave, Nagras Road, Pune, Maharashtra, India, 411007

The following specification particularly describes and ascertains the nature of this invention and the manner in which it is to be performed:-

FIELD OF INVENTION
[0001] The present disclosure relates to a fact checking method, and more specifically, to a method and an electronic device for determining information as accurate or fake.

BACKGROUND OF INVENTION
[0002] Information is easily dispersed through social media, the Internet, television, and many other online sources, so the accuracy of the information is often questionable or even incorrect. Although there are many fact checkers, they typically suffer from various issues.
[0003] The internet has given people access to new tools and platforms to build communities and new capabilities to speak truth to power. Yet these same spaces are being abused to spread division, fear and mistrust, and to sow the seeds of disinformation. There is therefore a need to disrupt, defund and down-rank disinformation sites and sources, while giving people a way to trust news, facts, information and sources.
[0004] In an example of existing methods, a Global Disinformation Index (GDI) aims to disrupt, defund and down-rank disinformation sites, working collectively with governments, business and civil society. The GDI operates on three core principles of neutrality, independence and transparency.
[0005] Thus, it is desired to address the above mentioned disadvantages or other shortcomings or at least provide a useful alternative.

OBJECT OF INVENTION
[0006] The principal object of the embodiments herein is to provide a method and an electronic device for determining information as accurate or fake.
[0007] Another object of the embodiment herein is to receive one or more first information and process the one or more first information using a machine learning (ML) model or an artificial intelligence (AI) model.
[0008] Another object of the embodiment herein is to identify a location of the one or more first information.
[0009] Another object of the embodiment herein is to determine one or more users present in the identified location.
[0010] Another object of the embodiment herein is to acquire one or more second information similar to the first information, where the one or more second information is acquired from the one or more users present in the identified location.
[0011] Another object of the embodiment herein is to match the one or more second information with the one or more first information and determine that the one or more first information is accurate or fake based on the matching.
[0012] Another object of the embodiment herein is to classify one or more accurate portions of the one or more first information, one or more inaccurate portions of the one or more first information, and fake information associated with the one or more first information.

SUMMARY OF INVENTION
[0013] Accordingly, embodiments herein disclose a method for determining information as accurate or fake. The method includes receiving, by an electronic device, one or more first information. Further, the method includes processing, by the electronic device, the one or more first information using a machine learning model or an AI model. Further, the method includes identifying, by the electronic device, a location of the first information. Further, the method includes determining, by the electronic device, one or more users present in the identified location. Further, the method includes acquiring, by the electronic device, one or more second information similar to the first information. The one or more second information is acquired from the one or more users present in the identified location. Further, the method includes matching, by the electronic device, the one or more second information with the one or more first information. Further, the method includes determining, by the electronic device, that the one or more first information is accurate or fake based on the matching.
[0014] In an embodiment, the method further includes classifying, by the electronic device, one or more accurate portions of the one or more first information, one or more inaccurate portions of the one or more first information, and fake information associated with the one or more first information. Further, the method includes causing to display, by the electronic device, the one or more accurate portions of the one or more first information, the one or more inaccurate portions of the one or more first information, and the fake information associated with the one or more first information to a user. Further, the method includes receiving, by the electronic device, an input from the user of the electronic device.
[0015] In an embodiment, determining, by the electronic device, the one or more users present in the identified location comprises analyzing, by the electronic device, one or more previous information posted by the one or more users, providing, by the electronic device, a higher reputation index based on the analyzing, and determining, by the electronic device, the one or more users present in the identified location based on the higher reputation index.
[0016] In an embodiment, the one or more first information is received from one or more of an online content, a social media content, a news information, and a user input.
[0017] In an embodiment, processing, by the electronic device, the one or more first information using the machine learning model includes consolidating, by the electronic device, the one or more first information, parsing, by the electronic device, the one or more first information, validating, by the electronic device, the one or more first information, tagging, by the electronic device, the one or more first information, updating, by the electronic device, the one or more first information, and filtering, by the electronic device, the one or more first information.
[0018] In an embodiment, the higher reputation index indicates a trust level of the user, where the higher reputation index is determined on the basis of expertise associated with the first information, a demography, and knowledge of the first information.
[0019] Accordingly, embodiments herein disclose an electronic device for determining information as accurate or fake. The electronic device includes a data driven based information fact check controller coupled with a processor and a memory. The data driven based information fact check controller is configured to receive one or more first information and process the one or more first information using a machine learning model. Further, the data driven based information fact check controller is configured to identify a location of the first information. Further, the data driven based information fact check controller is configured to determine one or more users present in the identified location. Further, the data driven based information fact check controller is configured to acquire one or more second information similar to the first information. The one or more second information is acquired from the one or more users present in the identified location. Further, the data driven based information fact check controller is configured to match the one or more second information with the one or more first information. Further, the data driven based information fact check controller is configured to determine that the one or more first information is accurate or fake based on the matching.
[0020] Accordingly, embodiments herein disclose a method for discriminating information. The method includes receiving, by an electronic device, at least one information. Further, the method includes identifying, by the electronic device, the at least one information as real information or fake information. Further, the method includes performing, by the electronic device, one of: causing to display to a user if the at least one information is the real information, and discarding the at least one information if the at least one information is the fake information.
[0021] In an embodiment, the method further includes performing, by the electronic device, at least one of: ranking a fake information providing source based on the identification, publishing a fake information providing source based on the identification, notifying a fake information providing source to a central authority based on the identification, and blocking a fake information providing source based on the identification.
[0022] In an embodiment, identifying, by the electronic device, the at least one information as real information or fake information includes parsing, by the electronic device, the at least one information, validating, by the electronic device, the at least one parsed information, and identifying, by the electronic device, the at least one information as real information or fake information based on the validation.
[0023] In an embodiment, the at least one parsed information is validated using at least one of an Artificial intelligence (AI) model and a machine learning (ML) model.
[0024] Accordingly, embodiments herein disclose an electronic device for discriminating an information. The electronic device includes a data driven based information analyzer coupled with a processor and a memory. The data driven based information analyzer is configured to receive at least one information and identify the at least one information as a real information or a fake information. Further, the data driven based information analyzer is configured to perform one of: display to a user if the at least one information is the real information, and discard the at least one information if the at least one information is the fake information.
[0025] These and other aspects of the embodiments herein will be better appreciated and understood when considered in conjunction with the following description and the accompanying drawings. It should be understood, however, that the following descriptions, while indicating preferred embodiments and numerous specific details thereof, are given by way of illustration and not of limitation. Many changes and modifications may be made within the scope of the embodiments herein without departing from the scope thereof, and the embodiments herein include all such modifications.

BRIEF DESCRIPTION OF FIGURES
[0026] The method and the system are illustrated in the accompanying drawings, throughout which like reference letters indicate corresponding parts in the various figures. The embodiments herein will be better understood from the following description with reference to the drawings, in which:
[0027] FIG. 1 shows various hardware components of an electronic device for determining information as accurate or fake, according to embodiments as disclosed herein;
[0028] FIG. 2 is an overview of a system for determining the information as accurate or fake, according to embodiments as disclosed herein;
[0029] FIG. 3 is a flow chart illustrating a method for determining the information as accurate or fake, according to embodiments as disclosed herein;
[0030] FIG. 4 is a flow chart illustrating a method for discriminating information, according to embodiments as disclosed herein; and
[0031] FIG. 5 and FIG. 6 are example scenarios in which the system determines the information as accurate or fake, according to embodiments as disclosed herein.

DETAILED DESCRIPTION OF INVENTION
[0033] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. Also, the various embodiments described herein are not necessarily mutually exclusive, as some embodiments can be combined with one or more other embodiments to form new embodiments. The term “or” as used herein, refers to a non-exclusive or, unless otherwise indicated. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein can be practiced and to further enable those skilled in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[0034] As is traditional in the field, embodiments may be described and illustrated in terms of blocks which carry out a described function or functions. These blocks, which may be referred to herein as units or modules or the like, are physically implemented by analog or digital circuits such as logic gates, integrated circuits, microprocessors, microcontrollers, memory circuits, passive electronic components, active electronic components, optical components, hardwired circuits, or the like, and may optionally be driven by firmware and software. The circuits may, for example, be embodied in one or more semiconductor chips, or on substrate supports such as printed circuit boards and the like. The circuits constituting a block may be implemented by dedicated hardware, or by a processor (e.g., one or more programmed microprocessors and associated circuitry), or by a combination of dedicated hardware to perform some functions of the block and a processor to perform other functions of the block. Each block of the embodiments may be physically separated into two or more interacting and discrete blocks without departing from the scope of the invention. Likewise, the blocks of the embodiments may be physically combined into more complex blocks without departing from the scope of the invention.
[0035] The accompanying drawings are used to help easily understand various technical features and it should be understood that the embodiments presented herein are not limited by the accompanying drawings. As such, the present disclosure should be construed to extend to any alterations, equivalents and substitutes in addition to those which are particularly set out in the accompanying drawings. Although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are generally only used to distinguish one element from another.
[0036] Accordingly, embodiments herein achieve a method for determining information as accurate or fake. The method includes receiving, by an electronic device, one or more first information. Further, the method includes processing, by the electronic device, the one or more first information using a machine learning model. Further, the method includes identifying, by the electronic device, a location of the first information. Further, the method includes determining, by the electronic device, one or more users present in the identified location. Further, the method includes acquiring, by the electronic device, one or more second information similar to the first information. The one or more second information is acquired from the one or more users present in the identified location. Further, the method includes matching, by the electronic device, the one or more second information with the one or more first information. Further, the method includes determining, by the electronic device, that the one or more first information is accurate or fake based on the matching.
[0037] Conventional methods and systems depend only on AI models to decipher the accuracy of information, primarily using clustering or correlation methods. All of this is dependent on the underlying data, and if the source itself is spreading a rumour, then the AI alone would not help. Hence, a human factor is necessary, and geo-location tagging of the information is a must for achieving more accurate data/information.
[0038] The proposed method can be used to verify the chain of people/profiles who would work together to validate information and facts. In an example, Person A is from a local community and has a higher reputation index on a social media platform; hence, he/she will have the privilege to tag information as accurate or rumour. The machine learning model or the AI model would act as a parser and first-level filter for such information, trying to curate, compare and validate online content.
[0039] Based on the proposed method, if a news item carries a seal of trustworthy information, the user of the electronic device would be open to pay for or read that information with complete trust, without worrying about being misled. The method can be used to provide disinformation risk ratings of a media site. The method can be used to identify adversarial narrative topics and high-risk domains for the advertisement tech industry. The method can be used to provide policy advice and guidance to key global efforts and institutions to combat disinformation.
[0040] Based on the proposed method, misinformation can be stopped or reduced to a minimum. Trustworthy information can be tagged to stop both misinformation and disinformation, thereby increasing people's trust and confidence in the information, brand and data being shared. In an example, a trustworthiness seal will be more than enough for people to believe the information to be free from misinformation and disinformation.
[0041] The trustworthiness seal will be awarded to sites and information sources after due diligence establishes that they are unbiased, non-political and provide information that is factually correct, has been vetted, and has no tonality of being pro or con toward anyone or anything. The trustworthiness seal means the data being shown is accurate, unbiased and has been verified at all levels and by all sources.
[0042] Referring now to the drawings, and more particularly to FIGS. 1 through 6, there are shown preferred embodiments.
[0043] FIG. 1 shows various hardware components of an electronic device (100) for determining information as accurate or fake, according to embodiments as disclosed herein. The electronic device (100) can be, for example, but not limited to, a cellular phone, a smart phone, a Personal Digital Assistant (PDA), a tablet computer, a laptop computer, an Internet of Things (IoT) device, a virtual reality device, an immersive system, or the like.
[0044] In an embodiment, the electronic device (100) includes a data driven based information fact check controller (110), a communicator (120), a memory (130), and a processor (140). The processor (140) is operated with the data driven based information fact check controller (110), the communicator (120), and the memory (130).
[0045] The data driven based information fact check controller (110) is configured to receive one or more first information. The one or more first information can be, for example, but not limited to, a political news, a religious news, a disaster related news, a medicine related news, or the like. The one or more first information is received from one or more of an online content, a social media content, a news information, and a user input. After receiving the information, the data driven based information fact check controller (110) is configured to process the one or more first information using a machine learning model or the AI model. The ML model or the AI model can be used to check profanity, and to identify and match data and features across various online sources. In an embodiment, the one or more first information is processed by consolidating the one or more first information, parsing the one or more first information, validating the one or more first information, tagging the one or more first information, updating the one or more first information, and filtering the one or more first information.
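In an illustrative, non-limiting example, the six processing steps of [0045] may be sketched as follows. All class, function and variable names in the sketch are assumptions made for illustration and are not defined by the specification:

```python
from dataclasses import dataclass, field

@dataclass
class FirstInformation:
    text: str
    source: str
    tags: list = field(default_factory=list)

def process(items, validator):
    # consolidating: merge duplicate items arriving from different sources
    unique = {i.text.strip().lower(): i for i in items}.values()
    results = []
    for item in unique:
        # parsing: split the raw text into candidate claims
        claims = [c.strip() for c in item.text.split(".") if c.strip()]
        # validating: apply the ML/AI first-level filter to each claim
        if not all(validator(c) for c in claims):
            continue
        # tagging and updating: record the outcome on the item
        item.tags.append("validated")
        # filtering: only validated items flow to the next stage
        results.append(item)
    return results
```

Here `validator` stands in for any trained profanity/claim classifier; the pipeline ordering follows the six steps named in the paragraph above.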
[0046] Further, the data driven based information fact check controller (110) is configured to identify a location of the one or more first information and determine one or more users present in the identified location. In an embodiment, the one or more users present in the identified location are determined by analyzing one or more previous information posted by the one or more users, providing a higher reputation index based on the analyzing, and determining the one or more users present in the identified location based on the higher reputation index. The higher reputation index indicates a trust level of the user, wherein the higher reputation index is determined on the basis of expertise associated with the first information, a demography, and knowledge of the first information.
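In a further non-limiting example, the higher reputation index of [0046] may be sketched as a weighted combination of the three named factors. The weights and the selection threshold below are arbitrary assumptions:

```python
def reputation_index(expertise, demography, knowledge, w=(0.5, 0.2, 0.3)):
    """Each factor is a score in [0, 1]; returns a weighted trust level."""
    return w[0] * expertise + w[1] * demography + w[2] * knowledge

def select_local_verifiers(users, location, threshold=0.7):
    """Keep users in the identified location whose index clears a threshold."""
    return [u for u in users
            if u["location"] == location
            and reputation_index(u["expertise"], u["demography"],
                                 u["knowledge"]) >= threshold]
```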
[0047] Further, the data driven based information fact check controller (110) is configured to acquire one or more second information similar to the first information. The one or more second information is acquired from the one or more users present in the identified location. Further, the data driven based information fact check controller (110) is configured to match the one or more second information with the one or more first information. Based on the matching, the data driven based information fact check controller (110) is configured to determine that the one or more first information is accurate or fake.
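In a further non-limiting example, the matching of [0047] may be realized as a text-similarity comparison. The specification does not prescribe a particular matching technique, so the TF-IDF cosine-similarity approach and both thresholds below are assumptions:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def is_accurate(first_info, second_infos, match_threshold=0.6):
    """Match the first information against reports from local users."""
    vec = TfidfVectorizer().fit([first_info] + second_infos)
    first = vec.transform([first_info])
    second = vec.transform(second_infos)
    sims = cosine_similarity(first, second)[0]
    # Deem accurate if a majority of local reports agree with the claim.
    return (sims >= match_threshold).mean() > 0.5
```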
[0048] Further, the data driven based information fact check controller (110) is configured to classify one or more accurate portions of the one or more first information, one or more inaccurate portions of the one or more first information, and fake information associated with the one or more first information. Further, the data driven based information fact check controller (110) is configured to display the one or more accurate portions of the one or more first information, the one or more inaccurate portions of the one or more first information, and the fake information associated with the one or more first information to the user, and to receive an input from a user of the electronic device (100).
[0049] In an example, when verified people on the social media platform validate information on the basis of expertise, demography, knowledge of the subject and location, alongside the AI, the AI model would hold more accurate information that is free from bias and represents facts, rather than adulterated information.
[0050] In another embodiment, the data driven based information fact check controller (110) is configured to receive an information. The information can be, for example, but not limited to, a political news, a religious news, a disaster related news, a medicine related news, or the like. The information is received over the internet, an online source, a social media site, or the like. After receiving the information, the data driven based information fact check controller (110) is configured to identify the information as a real information (i.e., trustworthy information) or a fake information. The information is identified as the real information or the fake information by parsing the information and validating the parsed information. The parsed information is validated using an AI model and an ML model. Based on the identification, the data driven based information fact check controller (110) is configured to display or notify the information to a user if the information is the real information, and discard the information if the information is the fake information.
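In a further non-limiting example, the parse-validate-discriminate flow of [0050] may be sketched as follows, where `model.predict_real` is a hypothetical hook standing in for any trained AI/ML classifier:

```python
def discriminate(information, model, display, discard):
    # parsing: split the information into sentences
    sentences = [s.strip() for s in information.split(".") if s.strip()]
    # validating: every parsed sentence must pass the model's check
    real = all(model.predict_real(s) for s in sentences)
    if real:
        display(information)   # real information: show/notify the user
    else:
        discard(information)   # fake information: discard it
    return real
```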
[0051] Further, the data driven based information fact check controller (110) is configured to perform various actions. The actions can be, for example, but not limited to, ranking a fake information providing source, publishing a fake information providing source, notifying a fake information providing source to a central authority, and blocking a fake information providing source. The central authority can be, for example, but not limited to, a central government, a police department, an investigation department, a state government, or the like.
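In a further non-limiting example, the source-level actions of [0051] (ranking, publishing, notifying and blocking) may be sketched as follows. The offence threshold and data structures are assumptions:

```python
def handle_fake_source(source, offence_count, blocklist, notify_authority,
                       block_after=3):
    offence_count[source] = offence_count.get(source, 0) + 1  # ranking
    notify_authority(source)                                  # notifying
    if offence_count[source] >= block_after:
        blocklist.add(source)                                 # blocking
    # publishing: return the sources ordered worst-first
    return sorted(offence_count, key=offence_count.get, reverse=True)
```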
[0052] The processor (140) is configured to execute instructions stored in the memory (130) and to perform various processes. The communicator (120) is configured for communicating internally between internal hardware components and with external devices via one or more networks.
[0053] The memory (130) also stores instructions to be executed by the processor (140). The memory (130) may include non-volatile storage elements. Examples of such non-volatile storage elements may include magnetic hard discs, optical discs, floppy discs, flash memories, or forms of electrically programmable memories (EPROM) or electrically erasable and programmable (EEPROM) memories. In addition, the memory (130) may, in some examples, be considered a non-transitory storage medium. The term “non-transitory” may indicate that the storage medium is not embodied in a carrier wave or a propagated signal. However, the term “non-transitory” should not be interpreted that the memory (130) is non-movable. In certain examples, a non-transitory storage medium may store data that can, over time, change (e.g., in Random Access Memory (RAM) or cache).
[0054] Further, at least one of the plurality of modules may be implemented through the AI model. A function associated with AI model may be performed through the non-volatile memory, the volatile memory, and the processor (140). The processor (140) may include one or a plurality of processors. At this time, one or a plurality of processors may be a general purpose processor, such as a central processing unit (CPU), an application processor (AP), or the like, a graphics-only processing unit such as a graphics processing unit (GPU), a visual processing unit (VPU), and/or an AI-dedicated processor such as a neural processing unit (NPU).
[0055] The one or a plurality of processors control the processing of the input data in accordance with a predefined operating rule or the AI model stored in the non-volatile memory and the volatile memory. The predefined operating rule or artificial intelligence model is provided through training or learning.
[0056] Here, being provided through learning means that a predefined operating rule or AI model of a desired characteristic is made by applying a learning algorithm to a plurality of learning data. The learning may be performed in the device itself in which the AI according to an embodiment is performed, and/or may be implemented through a separate server/system.
[0057] The AI model may comprise a plurality of neural network layers. Each layer has a plurality of weight values, and performs a layer operation through calculation on the output of a previous layer and an operation on the plurality of weights. Examples of neural networks include, but are not limited to, a convolutional neural network (CNN), a deep neural network (DNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), generative adversarial networks (GAN), and deep Q-networks.
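In a further non-limiting example, the layer operation described in [0057] may be sketched as a plain forward pass; the architecture and activation choice are arbitrary assumptions:

```python
import numpy as np

def forward(x, layers):
    """layers: list of (W, b) pairs; each layer operates on the previous
    layer's output using its own weight values (ReLU between layers)."""
    for W, b in layers[:-1]:
        x = np.maximum(0.0, W @ x + b)   # hidden layer operation
    W, b = layers[-1]
    return W @ x + b                     # output layer (logits)
```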
[0058] The learning algorithm is a method for training a predetermined target device (for example, a robot) using a plurality of learning data to cause, allow, or control the target device to make a determination or prediction. Examples of learning algorithms include, but are not limited to, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning.
[0059] In another embodiment, the method can be implemented by using a machine learning model. The machine learning model can be, for example, but not limited to, a linear regression model, a logistic regression model, a classification and regression tree (CART) model, a naïve Bayes model, a k-Nearest Neighbors (KNN) model, or the like.
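In a further non-limiting example, one of the models named in [0059], a logistic regression classifier over TF-IDF features, may be sketched as follows. The two-sample training data is a placeholder; the specification provides no dataset:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Placeholder training data; labels 1 = accurate, 0 = fake.
texts = ["city flooded after heavy storm", "miracle cure heals all illness"]
labels = [1, 0]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(texts, labels)
print(clf.predict(["heavy storm floods city streets"]))
```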
[0060] Although FIG. 1 shows various hardware components of the electronic device (100), it is to be understood that other embodiments are not limited thereto. In other embodiments, the electronic device (100) may include a lesser or greater number of components. Further, the labels or names of the components are used only for illustrative purposes and do not limit the scope of the invention. One or more components can be combined together to perform the same or a substantially similar function in the electronic device (100).
[0061] FIG. 2 is an overview of a system (1000) for determining information as accurate or fake, according to embodiments as disclosed herein. In an embodiment, the system (1000) includes the electronic device (100) and a server (200). The operations and functions of the electronic device (100) are already explained in FIG. 1. In an embodiment, the server (200) is configured to collect a third party information related to the first information and share the third party information with the electronic device (100) based on the requirement. The server (200) can be, for example, but not limited to, a third party server, a cloud server, an edge server, or the like.
[0062] FIG. 3 is a flow chart (300) illustrating a method for determining information as accurate or fake, according to embodiments as disclosed herein. The operations (302-314) are performed by the data driven based information fact check controller (110).
[0063] At 302, the method includes receiving the one or more first information. At 304, the method includes processing the one or more first information using the machine learning model or the AI model. At 306, the method includes identifying the location of the first information. At 308, the method includes determining the one or more users present in the identified location. At 310, the method includes acquiring the one or more second information similar to the first information. The one or more second information is acquired from the one or more users present in the identified location. At 312, the method includes matching the one or more second information with the one or more first information. At 314, the method includes determining that the one or more first information is accurate or fake based on the matching.
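In a further non-limiting example, operations 302-314 of the flow chart (300) may be tied together as follows, with each helper standing in for the corresponding operation described above; all helper names are assumptions:

```python
def fact_check(first_info, ml_filter, locate, local_users, collect, match):
    # 302: first_info is the received one or more first information
    if not ml_filter(first_info):              # 304: ML/AI first-level filter
        return "fake"
    location = locate(first_info)              # 306: identify the location
    users = local_users(location)              # 308: users in that location
    second_infos = collect(users, first_info)  # 310: similar second information
    agreed = match(first_info, second_infos)   # 312: matching
    return "accurate" if agreed else "fake"    # 314: final determination
```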
[0064] FIG. 4 is a flow chart (400) illustrating a method for discriminating information, according to embodiments as disclosed herein. The operations (402-408) are performed by the data driven based information fact check controller (110). At 402, the method includes receiving the information. At 404, the method includes identifying the information as the real information or the fake information. At 406, the method includes causing to display to the user if the information is the real information. At 408, the method includes discarding the information if the information is the fake information.
[0065] The various actions, acts, blocks, steps, or the like in the flow charts (300 and 400) may be performed in the order presented, in a different order or simultaneously. Further, in some embodiments, some of the actions, acts, blocks, steps, or the like may be omitted, added, modified, skipped, or the like without departing from the scope of the invention.
[0066] FIG. 5 and FIG. 6 are example scenarios in which the system determines the information as accurate or fake, according to embodiments as disclosed herein.
[0067] As shown in FIG. 5 and FIG. 6, the news information is received from various sources. The data driven based information fact check controller (110) refines the information and performs stance detection, and users with a high trustworthiness get to perform fact-check verification. Based on the verification, the aggregated content is scored and verified. The verified content is provided to the user.
[0068] The data driven based information fact check controller (110) reduces media distortion by distilling topical news into salient information with confidence, while also identifying the slant and alignment of each content source. Topical news is ingested from various sources. The data driven based information fact check controller (110) is configured to analyze and compare the content, adjusted for a source bias score. The content is presented as simplified salient information. A source content rating is provided for bias and alignment with any political agenda.
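In a further non-limiting example, the source-bias adjustment of [0068] may be sketched as a bias-weighted confidence score; the formula and field names are assumptions:

```python
def adjusted_score(reports):
    """reports: dicts with 'confidence' and 'bias', both in [0, 1];
    a more biased source contributes less to the aggregated score."""
    weights = [1.0 - r["bias"] for r in reports]
    if not sum(weights):
        return 0.0
    return sum(w * r["confidence"]
               for w, r in zip(weights, reports)) / sum(weights)
```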
[0069] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the embodiments as described herein.

Documents

Application Documents

# Name Date
1 202121024513-STATEMENT OF UNDERTAKING (FORM 3) [02-06-2021(online)].pdf 2021-06-02
2 202121024513-POWER OF AUTHORITY [02-06-2021(online)].pdf 2021-06-02
3 202121024513-FORM FOR STARTUP [02-06-2021(online)].pdf 2021-06-02
4 202121024513-FORM FOR SMALL ENTITY(FORM-28) [02-06-2021(online)].pdf 2021-06-02
5 202121024513-FORM 1 [02-06-2021(online)].pdf 2021-06-02
6 202121024513-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [02-06-2021(online)].pdf 2021-06-02
7 202121024513-EVIDENCE FOR REGISTRATION UNDER SSI [02-06-2021(online)].pdf 2021-06-02
8 202121024513-DRAWINGS [02-06-2021(online)].pdf 2021-06-02
9 202121024513-DECLARATION OF INVENTORSHIP (FORM 5) [02-06-2021(online)].pdf 2021-06-02
10 202121024513-COMPLETE SPECIFICATION [02-06-2021(online)].pdf 2021-06-02
11 202121024513-FORM 18 [14-06-2021(online)].pdf 2021-06-14
12 Abstract1..jpg 2021-11-18
13 202121024513-FER.pdf 2022-12-20
14 202121024513-AbandonedLetter.pdf 2024-03-05

Search Strategy

1 202121024513E_20-12-2022.pdf