
A System And Apparatus For Augmented Reality Based Education

Abstract: The present disclosure provides a system and apparatus for augmented reality (AR) enabled education comprising a mobile computing device, such as a mobile phone, mounted onto a table. The mobile device is configured to scan a two-dimensional representation of a character of a language and, using an augmented reality module, convert it to a three-dimensional representation to be displayed on the screen of the mobile device. The mobile device is configured to receive input to visually manipulate the character displayed on the screen. One or more characters can be scanned and the three-dimensional representations can be manipulated to form words and sentences, which can be validated. Further, pronunciations for the characters, words and sentences are also provided.


Patent Information

Application #:
Filing Date: 08 June 2019
Publication Number: 50/2020
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Email: info@khuranaandkhurana.com
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2025-03-24
Renewal Date:

Applicants

Chitkara Innovation Incubator Foundation
SCO: 160-161, Sector -9c, Madhya Marg, Chandigarh- 160009, India.

Inventors

1. TULI, Neha
Chitkara University, Chandigarh Patiala National Highway (NH-64), Tehsil - Rajpura, District Patiala-140401, Punjab, India.
2. SHARMA, Shivam
Chitkara University, Chandigarh Patiala National Highway (NH-64), Tehsil - Rajpura, District Patiala-140401, Punjab, India.
3. MANTRI, Archana
Chitkara University, Chandigarh Patiala National Highway (NH-64), Tehsil - Rajpura, District Patiala-140401, Punjab, India.
4. SINGH, Narinder
Chitkara University, Chandigarh Patiala National Highway (NH-64), Tehsil - Rajpura, District Patiala-140401, Punjab, India.

Specification

TECHNICAL FIELD
[1] The present disclosure relates generally to the field of augmented reality based educational tools. In particular, the present disclosure relates to an augmented reality based educational apparatus for children.
BACKGROUND
[2] Background description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
[3] Conventionally, flashcards or books are used to teach children the characters or alphabets of a language. However, these media comprise only a two-dimensional representation of the characters or alphabets and do not provide the children with a three-dimensional visualisation.
[4] Further, books and flashcards do not allow children to combine multiple characters to form words and sentences while also providing inputs regarding the validity of the words and sentences formed.
[5] There is, therefore, a requirement in the art for a comprehensive education system for children that does not rely on just two-dimensional representations of characters for their learning. Further, the system should allow the children to manipulate the characters to learn word and sentence construction.
[6] All publications herein are incorporated by reference to the same extent as if each individual publication or patent application were specifically and individually indicated to be incorporated by reference. Where a definition or use of a term in an incorporated reference is inconsistent or contrary to the definition of that term provided herein, the definition of that term provided herein applies and the definition of that term in the reference does not apply.
[7] In some embodiments, the numbers expressing quantities or dimensions of items, and so forth, used to describe and claim certain embodiments of the invention are to be understood as being modified in some instances by the term “about.” Accordingly, in some embodiments, the numerical parameters set forth in the written description and attached claims are approximations that can vary depending upon the desired properties sought to be obtained by a particular embodiment. In some embodiments, the numerical parameters should be construed in light of the number of reported significant digits and by applying ordinary rounding techniques. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of some embodiments of the invention are approximations, the numerical values set forth in the specific examples are reported as precisely as practicable. The numerical values presented in some embodiments of the invention may contain certain errors necessarily resulting from the standard deviation found in their respective testing measurements.
[8] As used in the description herein and throughout the claims that follow, the meaning of “a,” “an,” and “the” includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein, the meaning of “in” includes “in” and “on” unless the context clearly dictates otherwise.
[9] The recitation of ranges of values herein is merely intended to serve as a shorthand method of referring individually to each separate value falling within the range. Unless otherwise indicated herein, each individual value is incorporated into the specification as if it were individually recited herein. All methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or exemplary language (e.g. “such as”) provided with respect to certain embodiments herein is intended merely to better illuminate the invention and does not pose a limitation on the scope of the invention otherwise claimed. No language in the specification should be construed as indicating any non-claimed element essential to the practice of the invention.
[10] Groupings of alternative elements or embodiments of the invention disclosed herein are not to be construed as limitations. Each group member can be referred to and claimed individually or in any combination with other members of the group or other elements found herein. One or more members of a group can be included in, or deleted from, a group for reasons of convenience and/or patentability. When any such inclusion or deletion
occurs, the specification is herein deemed to contain the group as modified thus fulfilling the written description of all groups used in the appended claims.
OBJECTS
[11] A general object of the present disclosure is to provide an educational apparatus for children based on augmented reality.
[12] Another object of the present disclosure is to provide an educational apparatus for children to enable better visualisation.
[13] Another object of the present disclosure is to provide an educational apparatus for children with special needs.
[14] Another object of the present disclosure is to provide an educational apparatus for children which can be monitored in real-time.
SUMMARY
[15] The present disclosure relates generally to the field of augmented reality based educational tools. In particular, the present disclosure relates to an augmented reality based educational apparatus for children.
[16] In an aspect, the present disclosure provides an apparatus for augmented reality (AR) based education, said apparatus comprising: a working surface coupled with a stand, said stand configured to be adjustable along any or all of three axes; a mobile computing device comprising: a scanner configured to scan one or more two-dimensional representations of characters of a language; one or more processors operatively coupled to a memory storing a set of instructions executable by the one or more processors to: identify the scanned one or more characters by comparing each scanned character with a first dataset operatively coupled to the mobile computing device, said first dataset comprising two-dimensional representations of characters of the language; retrieve, from a second dataset operatively coupled to the mobile computing device, corresponding three-dimensional representations of the scanned one or more characters; receive, as input through a touch enabled screen coupled with the mobile computing device, one or more instructions to manipulate the three-dimensional representations of the scanned one or more characters to arrange the one or more characters in a combination to form any or a combination of words
and sentences; and validate the formed any or a combination of words and sentences based on comparison with a fourth dataset operatively coupled to the mobile computing device, said fourth dataset comprising a list of both valid words and valid sentences, wherein the any or a combination of the scanned one or more characters, any or a combination of the formed word and sentence, and the validity of any or a combination of the formed word and sentence are displayed on the screen.
[17] In an embodiment, the mobile computing device can be a personal mobile device selected from a group comprising a mobile phone and a tablet.
[18] In another embodiment, the mobile computing device is configured to retrieve, from a fifth dataset operatively coupled to the mobile computing device, corresponding aural representations of the scanned one or more characters. In another embodiment, the mobile computing device is configured with a speaker unit to emit the aural representation of the scanned one or more characters.
[19] In another embodiment, aural representations of the any or a combination of formed words and sentences are emitted from the speaker unit.
[20] In another embodiment, the scanner is configured to scan one or more colour attributes of the two-dimensional representation of the characters and display the one or more colour attributes on the display.
[21] In another embodiment, the mobile computing device is configured to receive, as input through the touch enabled screen to manipulate the scanned one or more characters to alter their orientation in any or a combination of the three axes.
[22] In another embodiment, the mobile device is remotely accessible by one or more users using a unique access code provided to each user.
[23] In an aspect, the present disclosure provides a system for augmented reality (AR) based education, said system comprising: one or more processors operatively coupled to a memory storing a set of instructions executable by the one or more processors to: receive scanned two-dimensional representations of one or more characters of a language; identify the scanned one or more characters by comparing each scanned character with a first dataset operatively coupled to the one or more processors, said first dataset comprising two-dimensional representations of characters of the language; retrieve, from a second dataset operatively coupled to the one or more processors, corresponding three-dimensional representation of the scanned one or more characters; receive, one or more instructions to
manipulate the three-dimensional representations of the scanned one or more characters to arrange the one or more characters in a combination to form any or a combination of words and sentences; and validate the formed any or a combination of words and sentences based on comparison with a fourth dataset operatively coupled to the one or more processors, said fourth dataset comprising a list of both valid words and valid sentences, wherein the any or a combination of the scanned one or more characters, any or a combination of the formed word and sentence, and the validity of any or a combination of the formed word and sentence are displayed on the screen.
[24] In an embodiment, the system is a mobile device mounted on a table using a stand.
[25] Various objects, features, aspects and advantages of the inventive subject matter will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like components.
BRIEF DESCRIPTION OF DRAWINGS
[26] The accompanying drawings are included to provide a further understanding of the present invention and are incorporated in and constitute a part of this specification. The drawings illustrate exemplary embodiments of the present invention and, together with the description, serve to explain the principles of the present invention.
[27] FIG. 1 illustrates an exemplary module diagram for a system for augmented reality (AR) based education, in accordance with an embodiment of the present disclosure.
[28] FIG. 2 illustrates an exemplary apparatus for augmented reality (AR) based education, in accordance with an embodiment of the present disclosure.
[29] FIG. 3 illustrates a computer system in which or with which embodiments of the present invention can be utilized in accordance with embodiments of the present disclosure.
DETAILED DESCRIPTION
[30] The following is a detailed description of embodiments of the disclosure depicted in the accompanying drawings. The embodiments are in such detail as to clearly communicate the disclosure. However, the amount of detail offered is not intended to limit
the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure as defined by the appended claims.
[31] If the specification states a component or feature “may”, “can”, “could”, or “might” be included or have a characteristic, that particular component or feature is not required to be included or have the characteristic.
[32] As used in the description herein and throughout the claims that follow, the meaning of “a,” “an,” and “the” includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein, the meaning of “in” includes “in” and “on” unless the context clearly dictates otherwise.
[33] Exemplary embodiments will now be described more fully hereinafter with reference to the accompanying drawings, in which exemplary embodiments are shown. These exemplary embodiments are provided only for illustrative purposes and so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those of ordinary skill in the art. The invention disclosed may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Various modifications will be readily apparent to persons skilled in the art. The general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the invention. Moreover, all statements herein reciting embodiments of the invention, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future (i.e., any elements developed that perform the same function, regardless of structure). Also, the terminology and phraseology used is for the purpose of describing exemplary embodiments and should not be considered limiting. Thus, the present invention is to be accorded the widest scope encompassing numerous alternatives, modifications and equivalents consistent with the principles and features disclosed. For purpose of clarity, details relating to technical material that is known in the technical fields related to the invention have not been described in detail so as not to unnecessarily obscure the present invention.
[34] The use of any and all examples, or exemplary language (e.g., “such as”) provided with respect to certain embodiments herein is intended merely to better illuminate the invention and does not pose a limitation on the scope of the invention otherwise claimed.
No language in the specification should be construed as indicating any non – claimed element essential to the practice of the invention.
[35] Embodiments described herein relate generally to the field of augmented reality based educational tools and, in particular, to an augmented reality based educational apparatus for children.
[36] In an aspect, the present disclosure provides a system and apparatus for augmented reality (AR) enabled education comprising a mobile computing device, such as a mobile phone, mounted onto a table. The mobile device is configured to scan a two-dimensional representation of a character of a language and, using an augmented reality module, convert it to a three-dimensional representation to be displayed on the screen of the mobile device. The mobile device is configured to receive input to visually manipulate the character displayed on the screen.
[37] In another aspect, one or more characters can be scanned and the three-dimensional representations can be manipulated to form words and sentences which can be validated. Further, pronunciations for the characters, words and sentences are also provided.
[38] FIG. 1 illustrates an exemplary module diagram for a system for augmented reality (AR) based education, in accordance with an embodiment of the present disclosure. In an embodiment, the system 100 comprises an AR module 106 and a database 112. It would be appreciated that the database 112 of the system can be configured at a remote location, such as a cloud or a server.
[39] In another embodiment, the database 112 can include a non-transitory storage device embodied with one or more subroutines for an augmented reality (AR) based education. In a further embodiment, the system 100 comprises one or more processors 102 and one or more memory units 104 coupled to the storage device to execute the one or more subroutines for an augmented reality (AR) based education. In another embodiment, the memory units 104 can be cache memory.
[40] In another embodiment, the database 112 holds information pertaining to characters or alphabets of a language. The database 112 further stores three-dimensional (3D) representations of the characters or alphabets obtained through an AR-based transformation.
[41] In another embodiment, the database 112 further holds information pertaining to words and sentences formed by compounding one or more characters or letters. The database
112 further holds information pertaining to validity of the word or sentence in any given language.
[42] In another embodiment, the database 112 further holds information pertaining to audio or aural representations of the characters, alphabets, words and sentences formed.
[43] In another embodiment, the database 112 and the AR module 106 are in bidirectional communication in real time.
[44] In another embodiment, the AR module 106 comprises: an input unit 108; and an AR unit 110.
[45] In another embodiment, the input unit 108 receives a character or an alphabet in a language. The character or the alphabet can be scanned from a two-dimensional (2D) representation of the character or alphabet on a flashcard by a scanner that is operatively coupled to the input unit 108.
[46] In another embodiment, the AR unit 110 is configured to identify the scanned character or alphabet and retrieve, from the database 112, the associated 3D representation of the character or alphabet, along with its aural representation. The 3D representation can be displayed on a screen and the aural representation played through a speaker, the screen and speaker being operatively coupled to the AR unit 110.
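The identification and retrieval steps described above can be sketched as follows. The tiny bitmap templates, file paths and function names are illustrative assumptions for the sketch, not part of the disclosure.

```python
# Hypothetical sketch of the AR unit's lookup step: identify a scanned
# character against a first dataset of 2D templates, then retrieve its
# 3D model and aural representation from further datasets.

# First dataset: 2D templates, keyed by character (tiny 3x3 bitmaps here).
TEMPLATES_2D = {
    "A": [[0, 1, 0], [1, 1, 1], [1, 0, 1]],
    "B": [[1, 1, 0], [1, 1, 0], [1, 1, 1]],
}

# Further datasets: 3D asset and audio clip per character (paths assumed).
ASSETS_3D = {"A": "models/A.glb", "B": "models/B.glb"}
AUDIO = {"A": "audio/A.mp3", "B": "audio/B.mp3"}

def identify(scan):
    """Return the character whose 2D template best matches the scan
    (fewest differing cells), mimicking comparison with the first dataset."""
    def distance(template):
        return sum(
            t != s
            for t_row, s_row in zip(template, scan)
            for t, s in zip(t_row, s_row)
        )
    return min(TEMPLATES_2D, key=lambda ch: distance(TEMPLATES_2D[ch]))

def retrieve(character):
    """Fetch the 3D model path and audio clip for an identified character."""
    return ASSETS_3D[character], AUDIO[character]

scan = [[0, 1, 0], [1, 1, 1], [1, 0, 1]]   # a clean scan of "A"
ch = identify(scan)
model, clip = retrieve(ch)
```

In practice the comparison would run on camera frames rather than bitmaps, but the dataset-lookup structure is the same.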
[47] In another embodiment, the AR unit 110 can receive as input instructions to visually manipulate the 3D representation of the character or the alphabet displayed.
[48] In another embodiment, the system 100 can be coupled to a memory configured to store each scanned character or alphabet. The input unit 108 can be configured to scan one or more characters or alphabets, which are then stored in the memory. The AR unit 110 can be configured to display 3D representations of the one or more characters or alphabets.
[49] In another embodiment, the AR unit 110 can receive, as input, instructions to visually manipulate the 3D representations to move any of the one or more characters or alphabets to form different combinations of the scanned one or more characters or alphabets.
[50] In another embodiment, the AR unit 110 is further configured to identify the word or sentence formed by the combination of characters or alphabets and determine the validity of the formed word or sentence by comparing it with the data stored in the database 112. The result of the validity determination can be displayed visually and aurally. Further,
options for a valid word or sentence can be accordingly suggested if the formed word or sentence is not valid.
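The validity check and suggestion behaviour described above might be implemented along these lines; the word list and the use of `difflib` for suggesting near matches are assumptions for illustration.

```python
# Illustrative sketch of paragraph [50]: the formed word is compared
# against a (fourth) dataset of valid words; if invalid, close valid
# words are suggested. Dataset contents and suggestion rule are assumed.
import difflib

VALID_WORDS = {"CAT", "COT", "CAR", "BAT"}

def validate(characters):
    """Join the arranged characters into a word, check it against the
    dataset, and return (is_valid, suggestions)."""
    word = "".join(characters)
    if word in VALID_WORDS:
        return True, []
    # Suggest close valid words when the formed word is not valid.
    return False, difflib.get_close_matches(word, sorted(VALID_WORDS))

ok, _ = validate(["C", "A", "T"])        # a valid word: no suggestions
bad, hints = validate(["C", "A", "X"])   # invalid: near matches offered
```

Sentence validation would follow the same pattern with a dataset of valid sentences in place of `VALID_WORDS`.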
[51] In another embodiment, aural representations of the word or sentence formed can also be provided.
[52] In another embodiment, the flashcards can be coloured by the user using one or more colours, and the display screen can represent the 3D character or alphabet with the colours used on the flashcards.
[53] In another embodiment, the system 100 can be remotely accessed by an administrator to monitor the use of the system 100. A unique access code can be provided to both the user and the administrator such that the progress of each user can be monitored by the administrator in real-time.
[54] FIG. 2 illustrates an exemplary apparatus for augmented reality (AR) based education, in accordance with an embodiment of the present disclosure. In an embodiment, the apparatus 200 comprises a table 202 with one or more demarcated areas on its functional surface. In an exemplary embodiment, the demarcations can be two, namely, a working area 204 and a play area 206. In another embodiment, the table 202 can be foldable.
[55] In an exemplary embodiment, the working area 204 can be configured to hold flashcards with characters or alphabets that can be arranged to provide a first predetermined function, such as studying alphabets or characters according to the order of the alphabet of the language. In another exemplary embodiment, the play area 206 can be configured to hold flashcards with characters or alphabets that can be arranged to provide a second predetermined function, such as studying combinations of characters or alphabets to build words or sentences.
[56] In another embodiment, the table 202 can be coupled to a stand 208 which can be configured to hold a mobile device 210. The stand 208 can be adjustable so as to allow orientation of the mobile device to be easily altered.
[57] In another embodiment, the mobile device 210 is provided with a display screen and speakers. The mobile device 210 is also provided with a scanner configured to scan flashcards each comprising a character or alphabet of the language. The mobile device 210 is provided with an AR module configured to convert the scanned 2D character or alphabet to a 3D representation of the character or alphabet. The mobile device 210 is further configured to
receive, as input, directions to manipulate the 3D representations. The input can be received by the mobile device 210 through a touch enabled screen.
[58] In another embodiment, the scanned character or alphabet is identified and the associated 3D representation of the character or alphabet is retrieved, along with its aural representation. The 3D representation can be displayed on the screen and the aural representation played through a speaker.
[59] In another embodiment, the touch enabled screen can receive as input instructions to visually manipulate the 3D representation of the character or the alphabet displayed.
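The orientation manipulation described above amounts to rotating the 3D representation about one of the three axes. A minimal sketch, with the gesture-to-angle mapping assumed:

```python
# Sketch of three-axis orientation manipulation: a drag gesture on the
# touch screen is mapped to a rotation of the 3D character about an axis.
# The 90-degree swipe mapping below is an assumption for illustration.
import math

def rotate_z(point, degrees):
    """Rotate a 3D point about the z axis; analogous functions would
    cover the x and y axes for full three-axis manipulation."""
    x, y, z = point
    a = math.radians(degrees)
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a),
            z)

# A horizontal swipe interpreted as a 90-degree turn of the character.
p = rotate_z((1.0, 0.0, 0.0), 90)
```

An AR framework would apply such a rotation to every vertex of the character's 3D model (or, equivalently, to its model matrix).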
[60] In another embodiment, the apparatus 200 can be coupled to a memory configured to store each scanned character or alphabet. One or more characters or alphabets can be scanned from one or more flashcards and then stored in the memory. The 3D representations of the one or more characters or alphabets can then be displayed.
[61] In another embodiment, the touch enabled screen can receive, as input, instructions to visually manipulate the 3D representations to move any of the one or more characters or alphabets to form different combinations of the scanned one or more characters or alphabets.
[62] In another embodiment, the mobile device 210 is further configured to identify the word or sentence formed by the combination of characters or alphabets and determine the validity of the formed word or sentence. The result of the validity determination can be displayed visually and aurally. Further, options for a valid word or sentence can be accordingly suggested if the formed word or sentence is not valid.
[63] In another embodiment, aural representations of the word or sentence formed can also be provided.
[64] In another embodiment, the flashcards can be coloured by the user using one or more colours, and the display screen can represent the 3D character or alphabet with the colours used on the flashcards.
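Carrying the flashcard's colouring over to the 3D render could reduce to picking the dominant non-background colour of the scan and using it to tint the model. The white background and RGB pixel format are assumptions for this sketch.

```python
# A minimal sketch of the colour carry-over in paragraph [64]: find the
# most frequent non-background colour among the scanned pixels and apply
# it as the 3D character's material tint.
from collections import Counter

WHITE = (255, 255, 255)   # assumed background colour of the flashcard

def dominant_colour(pixels):
    """Most frequent non-background pixel colour in the scan."""
    counts = Counter(p for row in pixels for p in row if p != WHITE)
    colour, _ = counts.most_common(1)[0]
    return colour

scan = [
    [WHITE, (200, 30, 30), WHITE],
    [(200, 30, 30), (200, 30, 30), (10, 10, 200)],
]
tint = dominant_colour(scan)   # would be applied to the 3D model's material
```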
[65] In another embodiment, the mobile device 210 can be remotely accessed by an administrator, such as a teacher or a parent, to monitor the use of the apparatus 200. A unique access code can be provided to both the user and the administrator such that the progress of each user can be monitored by the administrator in real-time.
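The unique-access-code monitoring can be sketched as follows; the code format, in-memory store, and class names are illustrative assumptions.

```python
# Hypothetical sketch of remote monitoring via unique access codes: each
# learner and the administrator receive a code; the administrator's code
# can read any learner's progress, while learners can read only their own.
import secrets

class ProgressMonitor:
    def __init__(self):
        self.progress = {}            # access code -> list of events
        self.admin_code = secrets.token_hex(4)

    def register_user(self):
        code = secrets.token_hex(4)   # unique access code per learner
        self.progress[code] = []
        return code

    def record(self, code, event):
        self.progress[code].append(event)

    def view(self, requester_code, user_code):
        # Only the administrator (or the learner themselves) may view.
        if requester_code not in (self.admin_code, user_code):
            raise PermissionError("invalid access code")
        return list(self.progress[user_code])

mon = ProgressMonitor()
child = mon.register_user()
mon.record(child, "formed word: CAT (valid)")
seen = mon.view(mon.admin_code, child)   # teacher/parent monitors remotely
```

A deployed version would keep the store on the remote server or cloud mentioned for the database 112 so monitoring works across devices.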
[66] FIG. 3 illustrates a computer system in which or with which embodiments of the present invention can be utilized in accordance with embodiments of the present disclosure.
[67] As shown in FIG. 3, the computer system includes an external storage device 310, a bus 320, a main memory 330, a read only memory 340, a mass storage device 350, a communication port 360, and a processor 370. A person skilled in the art will appreciate that the computer system may include more than one processor and communication ports. Examples of processor 370 include, but are not limited to, an Intel® Itanium® or Itanium 2 processor(s), or AMD® Opteron® or Athlon MP® processor(s), Motorola® lines of processors, FortiSOC™ system on a chip processors or other future processors. Processor 370 may include various modules associated with embodiments of the present invention. Communication port 360 can be any of an RS-232 port for use with a modem-based dialup connection, a 10/100 Ethernet port, a Gigabit or 10 Gigabit port using copper or fibre, a serial port, a parallel port, or other existing or future ports. Communication port 360 may be chosen depending on a network, such as a Local Area Network (LAN), a Wide Area Network (WAN), or any network to which the computer system connects.
[68] Memory 330 can be Random Access Memory (RAM), or any other dynamic storage device commonly known in the art. Read only memory 340 can be any static storage device(s), e.g., but not limited to, Programmable Read Only Memory (PROM) chips for storing static information, e.g., start-up or BIOS instructions for processor 370. Mass storage 350 may be any current or future mass storage solution, which can be used to store information and/or instructions. Exemplary mass storage solutions include, but are not limited to, Parallel Advanced Technology Attachment (PATA) or Serial Advanced Technology Attachment (SATA) hard disk drives or solid-state drives (internal or external, e.g., having Universal Serial Bus (USB) and/or Firewire interfaces), e.g. those available from Seagate (e.g., the Seagate Barracuda 7200 family) or Hitachi (e.g., the Hitachi Deskstar 7K1000), one or more optical discs, Redundant Array of Independent Disks (RAID) storage, e.g. an array of disks (e.g., SATA arrays), available from various vendors including Dot Hill Systems Corp., LaCie, Nexsan Technologies, Inc. and Enhance Technology, Inc.
[69] Bus 320 communicatively couples processor(s) 370 to the other memory, storage and communication blocks. Bus 320 can be, e.g. a Peripheral Component Interconnect (PCI) / PCI Extended (PCI-X) bus, Small Computer System Interface (SCSI), USB or the like, for connecting expansion cards, drives and other subsystems, as well as other buses, such as a front side bus (FSB), which connects processor 370 to the software system.
[70] Optionally, operator and administrative interfaces, e.g. a display, keyboard, and a cursor control device, may also be coupled to bus 320 to support direct operator interaction with the computer system. Other operator and administrative interfaces can be provided through network connections connected through communication port 360. External storage device 310 can be any kind of external hard-drive, floppy drive, IOMEGA® Zip Drive, Compact Disc - Read Only Memory (CD-ROM), Compact Disc - Re-Writable (CD-RW), or Digital Video Disk - Read Only Memory (DVD-ROM). Components described above are meant only to exemplify various possibilities. In no way should the aforementioned exemplary computer system limit the scope of the present disclosure.
[71] It should be apparent to those skilled in the art that many more modifications besides those already described are possible without departing from the inventive concepts herein. The inventive subject matter, therefore, is not to be restricted except in the spirit of the appended claims. Moreover, in interpreting both the specification and the claims, all terms should be interpreted in the broadest possible manner consistent with the context. In particular, the terms “includes” and “including” should be interpreted as referring to elements, components, or steps in a non-exclusive manner, indicating that the referenced elements, components, or steps may be present, or utilized, or combined with other elements, components, or steps that are not expressly referenced. Where the specification or claims refer to at least one of something selected from the group consisting of A, B, C ….and N, the text should be interpreted as requiring only one element from the group, not A plus N, or B plus N, etc. The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practised with modification within the spirit and scope of the appended claims.
[72] While the foregoing describes various embodiments of the invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof. The scope of the invention is determined by the claims that follow. The invention is
not limited to the described embodiments, versions or examples, which are included to enable a person having ordinary skill in the art to make and use the invention when combined with information and knowledge available to the person having ordinary skill in the art.
ADVANTAGES
[73] The present disclosure provides an educational apparatus for children based on augmented reality.
[74] The present disclosure provides an educational apparatus for children to enable better visualisation.
[75] The present disclosure provides an educational apparatus for children with special needs.
[76] The present disclosure provides an educational apparatus for children which can be monitored in real-time.

We Claim:
1. An apparatus for augmented reality (AR) based education, said apparatus comprising:
a working surface coupled with a stand, said stand configured to be adjustable along any or all of three axes;
a mobile computing device comprising:
a scanner configured to scan one or more two-dimensional representations of characters of a language;
one or more processors operatively coupled to a memory storing a set of instructions executable by the one or more processors to:
identify the scanned one or more characters by comparing each scanned character with a first dataset operatively coupled to the mobile computing device, said first dataset comprising two-dimensional representations of characters of the language;
retrieve, from a second dataset operatively coupled to the mobile computing device, corresponding three-dimensional representation of the scanned one or more characters;
receive, as input through a touch enabled screen coupled with the mobile computing device, one or more instructions to manipulate the three-dimensional representations of the scanned one or more characters to arrange the one or more characters in a combination to form any or a combination of words and sentences; and
validate the formed any or a combination of words and sentences based on comparison with a fourth dataset operatively coupled to the mobile computing device, said fourth dataset comprising a list of both valid words and valid sentences,
wherein any or a combination of the scanned one or more characters, any or a combination of the formed word and sentence, and the validity of any or a combination of the formed word and sentence are displayed on the screen.
2. The apparatus as claimed in claim 1, wherein the mobile computing device can be a personal mobile device selected from a group comprising a mobile phone and a tablet.
3. The apparatus as claimed in claim 1, wherein the mobile computing device is configured to retrieve, from a fifth dataset operatively coupled to the mobile computing device, corresponding aural representations of the scanned one or more characters.
4. The apparatus as claimed in claim 3, wherein the mobile computing device is configured with a speaker unit to emit the aural representation of the scanned one or more characters.
5. The apparatus as claimed in claim 1, wherein aural representations of the any or a combination of formed words and sentences are emitted from the speaker unit.
6. The apparatus as claimed in claim 1, wherein the scanner is configured to scan one or more colour attributes of the two-dimensional representation of the characters and display the one or more colour attributes on the display.
7. The apparatus as claimed in claim 1, wherein the mobile computing device is configured to receive, as input through the touch enabled screen, one or more instructions to manipulate the scanned one or more characters to alter their orientation in any or a combination of the three axes.
8. The apparatus as claimed in claim 1, wherein the mobile device is remotely accessible by one or more users using a unique access code provided to each user.
9. A system for augmented reality (AR) based education, said system comprising:
one or more processors operatively coupled to a memory storing a set of instructions executable by the one or more processors to:
receive scanned two-dimensional representations of one or more characters of a language;
identify the scanned one or more characters by comparing each scanned character with a first dataset operatively coupled to the one or more processors, said first dataset comprising two-dimensional representations of characters of the language;
retrieve, from a second dataset operatively coupled to the one or more processors, corresponding three-dimensional representation of the scanned one or more characters;
receive one or more instructions to manipulate the three-dimensional representations of the scanned one or more characters to arrange the one or more characters in a combination to form any or a combination of words and sentences; and
validate the formed any or a combination of words and sentences based on comparison with a fourth dataset operatively coupled to the one or more processors, said fourth dataset comprising a list of both valid words and valid sentences,
wherein any or a combination of the scanned one or more characters, any or a combination of the formed word and sentence, and the validity of any or a combination of the formed word and sentence are displayed on the screen.
10. The system as claimed in claim 9, wherein the system is a mobile device mounted on a table using a stand.
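The claimed pipeline (identify a scanned character against a first dataset, retrieve its three-dimensional representation from a second dataset, arrange characters, and validate the result against a fourth dataset of valid words and sentences) can be sketched as follows. This is a minimal illustrative sketch only, not the patented implementation; all dataset contents, file names, and function names are hypothetical.

```python
# Hypothetical sketch of the claimed steps. Dataset contents are placeholders.
FIRST_DATASET = {"A": "2d_A.png", "C": "2d_C.png", "T": "2d_T.png"}   # 2-D representations
SECOND_DATASET = {"A": "3d_A.obj", "C": "3d_C.obj", "T": "3d_T.obj"}  # 3-D representations
FOURTH_DATASET = {"CAT", "ACT"}                                        # valid words/sentences

def identify(scanned_label: str) -> str:
    """Identify a scanned character by comparison with the first dataset."""
    if scanned_label not in FIRST_DATASET:
        raise ValueError(f"Unrecognized character: {scanned_label}")
    return scanned_label

def retrieve_3d(character: str) -> str:
    """Retrieve the corresponding 3-D representation from the second dataset."""
    return SECOND_DATASET[character]

def validate(arrangement: list[str]) -> bool:
    """Validate the arranged characters against the fourth dataset."""
    return "".join(arrangement) in FOURTH_DATASET

# Example flow: scan "C", "A", "T", fetch 3-D models, validate the word.
scanned = [identify(c) for c in ["C", "A", "T"]]
models = [retrieve_3d(c) for c in scanned]
print(validate(scanned))  # "CAT" is in the fourth dataset, so this prints True
```

In this sketch the datasets are in-memory dictionaries and sets; in the apparatus they are described only as datasets "operatively coupled" to the device, so any storage backend could stand in for them.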

Documents

Application Documents

# Name Date
1 201911022811-IntimationOfGrant24-03-2025.pdf 2025-03-24
2 201911022811-STATEMENT OF UNDERTAKING (FORM 3) [08-06-2019(online)].pdf 2019-06-08
3 201911022811-FORM FOR STARTUP [08-06-2019(online)].pdf 2019-06-08
4 201911022811-PatentCertificate24-03-2025.pdf 2025-03-24
5 201911022811-FORM FOR SMALL ENTITY(FORM-28) [08-06-2019(online)].pdf 2019-06-08
6 201911022811-CLAIMS [25-08-2022(online)].pdf 2022-08-25
7 201911022811-FORM 1 [08-06-2019(online)].pdf 2019-06-08
8 201911022811-COMPLETE SPECIFICATION [25-08-2022(online)].pdf 2022-08-25
9 201911022811-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [08-06-2019(online)].pdf 2019-06-08
10 201911022811-CORRESPONDENCE [25-08-2022(online)].pdf 2022-08-25
11 201911022811-FER_SER_REPLY [25-08-2022(online)].pdf 2022-08-25
12 201911022811-EVIDENCE FOR REGISTRATION UNDER SSI [08-06-2019(online)].pdf 2019-06-08
13 201911022811-FORM-26 [25-08-2022(online)].pdf 2022-08-25
14 201911022811-DRAWINGS [08-06-2019(online)].pdf 2019-06-08
15 201911022811-FER.pdf 2022-03-02
16 201911022811-DECLARATION OF INVENTORSHIP (FORM 5) [08-06-2019(online)].pdf 2019-06-08
17 201911022811-COMPLETE SPECIFICATION [08-06-2019(online)].pdf 2019-06-08
18 201911022811-FORM 18 [22-05-2021(online)].pdf 2021-05-22
19 201911022811-Correspondence-180719.pdf 2019-07-26
20 201911022811-Proof of Right (MANDATORY) [16-07-2019(online)].pdf 2019-07-16
21 201911022811-FORM-26 [16-07-2019(online)].pdf 2019-07-16
22 201911022811-OTHERS-180719.pdf 2019-07-26
23 201911022811-Power of Attorney-180719.pdf 2019-07-26
24 abstract.jpg 2019-07-22

Search Strategy

1 SearchSE_25-02-2022.pdf
