ABSTRACT
A GLOBE, AND A METHOD AND A SYSTEM FOR ENABLING AUGMENTED REALITY INTERACTIONS WITH A GLOBE
A method (200) for enabling augmented reality interactions with a globe (107) comprises steps of receiving an image of a portion of an outer shell (1072) of the globe (107), from an image capturing device (105) of a computing device (101), identifying a geographical region from the image, generating a plurality of graphical elements (310) related to the geographical region and displaying the plurality of graphical elements (310) on a display device (102) of the computing device (101). [Figure 2]
FORM 2
THE PATENTS ACT 1970
(39 of 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
[See section 10 and rule 13]
“A GLOBE, AND A METHOD AND A SYSTEM FOR ENABLING AUGMENTED REALITY INTERACTIONS WITH A GLOBE”
We, GOYAL, Vivek, an Indian citizen, resident of JC 104, Salarpuriya Greenage, Hosur Road, Bangalore, Karnataka-560068, India, and ADVANI, Dinesh, an Indian citizen, resident of B201, Mantri Sarovar, HSR Layout Sector 4, Bangalore, Karnataka-560102, India
The following specification particularly describes the invention and the manner in which it is to be performed.
TECHNICAL FIELD
Embodiments of the present invention relate generally to augmented reality based learning and gaming and more specifically to a globe, and a method and a system for enabling augmented reality interactions with a globe.
BACKGROUND ART
The globe has been around for more than five centuries but has not changed much, except for the boundaries between nations. The level of engagement, information and interactivity that a traditional globe of the nineteenth or twentieth century provides is not enough to engage kids and help them learn today. A globe has a lot of information printed on its outer shell. Kids have to recognise and memorise the oceans, the continents and the countries within each continent. Even if the kids recognise the countries or their names, they always have to manually turn the globe to find a particular country. This makes it difficult for the kids to learn. The major problem is that an interactive learning environment is missing in the current situation.
Therefore, in light of the discussion above, there is a need for a globe, and a method and a system for enabling augmented reality interactions with the globe.
OBJECT OF THE INVENTION
An aspect of the present invention provides a globe for enabling augmented reality interactions.
Another aspect of the present invention provides a method for enabling augmented reality interactions with a globe.
Yet another aspect of the present invention provides a system for enabling augmented reality interactions with a globe.
SUMMARY OF THE PRESENT INVENTION
The present invention is described hereinafter by various embodiments. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiment set forth herein. Rather, the embodiment is provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art.
Embodiments of the present invention aim to provide a globe, and a method and a system for enabling augmented reality interactions with the globe. With the present invention described here, the Globe is brought to the twenty-first century. The Augmented Reality, using household smart devices like smart phones, tablets or head mounted devices, audio control and self-rotation features allow infinite learning and playtime value for kids and adults alike.
In accordance with an embodiment of the present invention, a globe includes an outer shell, a rotation mechanism provided with an actuator, a microphone and a control module. The microphone is configured to receive an audio signal. The control module is configured to transmit an actuation signal to the actuator, in response to the microphone receiving the audio signal. The actuator is configured to actuate on receiving the actuation signal and cause the outer shell to rotate, using the rotation mechanism.
In accordance with an embodiment of the present invention, the outer shell is a hollow shell made up of a material selected from a metal or a light weight polymer.
In accordance with an embodiment of the present invention, the outer shell has an outer surface, the outer surface includes a map of the world having a plurality of geographical regions marked out.
In accordance with an embodiment of the present invention, the geographical regions are selected from the group comprising oceans, continents, countries, states, cities, mountain ranges, plateaus, grasslands and rivers.
In accordance with an embodiment of the present invention, a method for enabling augmented reality interactions with a globe, comprises steps of receiving an image of a portion of an outer shell of the globe, from an image capturing device of a computing device, identifying a geographical region from the image, generating a plurality of graphical elements related to the geographical region and displaying the plurality of graphical elements on a display device of the computing device.
In accordance with an embodiment of the present invention, the outer shell is a hollow shell made up of a material selected from a metal or a light weight polymer.
In accordance with an embodiment of the present invention, the outer shell has an outer surface, the outer surface includes a map of the world having a plurality of geographical regions marked out.
In accordance with an embodiment of the present invention, the geographical regions are selected from the group comprising oceans, continents, countries, states, cities, mountain ranges, plateaus, grasslands and rivers.
In accordance with an embodiment of the present invention, the plurality of graphical elements include one or more of 2-Dimensional (2D) and 3-Dimensional (3D) illustrations of entities selected from a group comprising animals, monuments, national flags, landmarks, inventions and foods related to the geographical regions.
In accordance with an embodiment of the present invention, the method further comprises a step of displaying information related to the plurality of graphical elements on the display device of the computing device.
In accordance with an embodiment of the present invention, a system for enabling augmented reality interactions with a globe comprises an imaging module, an image processing module, a graphics generation module and an interface module. The imaging module is configured to receive an image of a portion of an outer shell of the globe, from an image capturing device of a computing device. The image processing module is configured to identify a geographical region from the image. The graphics generation module is configured to generate a plurality of graphical elements related to the geographical region. The interface module is configured to display the plurality of graphical elements on a display device of the computing device.
In accordance with an embodiment of the present invention, the outer shell is a hollow shell made up of a material selected from a metal or a light weight polymer.
In accordance with an embodiment of the present invention, the outer shell has an outer surface, the outer surface including a map of the world having a plurality of geographical regions marked out.
In accordance with an embodiment of the present invention, the geographical region is selected from the group comprising oceans, continents, countries, states, cities, mountain ranges, plateaus, grasslands and rivers.
In accordance with an embodiment of the present invention, the plurality of graphical elements include one or more of 2-Dimensional (2D) and 3-Dimensional (3D) illustrations of entities selected from a group comprising animals, monuments, national flags, landmarks, inventions and foods related to the geographical regions.
In accordance with an embodiment of the present invention, the interface module is further configured to display information related to the plurality of graphical elements on the display device of the computing device.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
So that the manner in which the above recited features of the present invention can be understood in detail, a more particular description of the invention, briefly summarized above, may be had by reference to examples, some of which are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only typical examples of this invention and are therefore not to be considered limiting of its scope, for the invention may admit to other equally effective examples.
These and other features, benefits, and advantages of the present invention will become apparent by reference to the following text and figures, with like reference numbers referring to like structures across the views, wherein:
Fig. 1A illustrates an exemplary environment to which various embodiments of the present invention may be implemented;
Fig. 1B illustrates a side sectional view of a globe, in accordance with an embodiment of the present invention;
Fig. 2 illustrates a method for enabling augmented reality interactions with a globe, in accordance with an embodiment of the present invention;
Fig. 3 illustrates a plurality of graphical elements being displayed at a display device, in accordance with an embodiment of the present invention;
Fig. 4A illustrates a plurality of graphical elements being displayed at a display device, in accordance with another embodiment of the present invention;
Fig. 4B illustrates selection of a first graphical element from the plurality of graphical elements being displayed in Fig. 4A and display of information related to the first graphical element, in accordance with an embodiment of the present invention;
Fig. 5A illustrates a plurality of graphical elements being displayed at a display device, in accordance with another embodiment of the present invention;
Fig. 5B illustrates selection of a second graphical element from the plurality of graphical elements being displayed in Fig. 5A and display of information related to the second graphical element, in accordance with an embodiment of the present invention;
Fig. 6A illustrates a plurality of graphical elements being displayed at a display device, in accordance with another embodiment of the present invention;
Fig. 6B illustrates selection of a third graphical element from the plurality of graphical elements being displayed in Fig. 6A and display of information related to the third graphical element, in accordance with an embodiment of the present invention;
Fig. 7A illustrates a plurality of graphical elements being displayed at a display device, in accordance with another embodiment of the present invention;
Fig. 7B illustrates selection of a fourth graphical element from the plurality of graphical elements being displayed in Fig. 7A and display of information related to the fourth graphical element, in accordance with an embodiment of the present invention;
Fig. 8 illustrates a plurality of graphical elements being displayed at a display device, in accordance with another embodiment of the present invention; and
Fig. 9 illustrates a system for enabling augmented reality interactions with a globe, in accordance with an embodiment of the present invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
While the present invention is described herein by way of example using embodiments and illustrative drawings, those skilled in the art will recognize that the invention is not limited to the embodiments or drawings described, which are not intended to represent the scale of the various components. Further, some components that may form a part of the invention may not be illustrated in certain figures, for ease of illustration, and such omissions do not limit the embodiments outlined in any way. It should be understood that the drawings and detailed description thereto are not intended to limit the invention to the particular form disclosed; on the contrary, the invention is to cover all modifications, equivalents, and alternatives falling within the scope of the present invention as defined by the appended claims. As used throughout this description, the word "may" is used in a permissive sense (i.e. meaning having the potential to), rather than the mandatory sense (i.e. meaning must). Further, the words "a" or "an" mean "at least one" and the word "plurality" means "one or more" unless otherwise mentioned. Furthermore, the terminology and phraseology used herein is solely for descriptive purposes and should not be construed as limiting in scope. Language such as "including," "comprising," "having," "containing," or "involving," and variations thereof, is intended to be broad and encompass the subject matter listed thereafter, equivalents, and additional subject matter not recited, and is not intended to exclude other additives, components, integers or steps. Likewise, the term "comprising" is considered synonymous with the terms "including" or "containing" for applicable legal purposes. Any discussion of documents, acts, materials, devices, articles and the like is included in the specification solely for the purpose of providing a context for the present invention.
It is not suggested or represented that any or all of these matters form part of the prior art base or were common general knowledge in the field relevant to the present invention.
In this disclosure, whenever a composition or an element or a group of elements is preceded with the transitional phrase "comprising", it is understood that we also contemplate the same composition, element or group of elements with the transitional phrases "consisting of", "consisting", "selected from the group consisting of", "including", or "is" preceding the recitation of the composition, element or group of elements, and vice versa.
The present invention is described hereinafter by various embodiments with reference to the accompanying drawing, wherein reference numerals used in the accompanying drawing correspond to the like elements throughout the description. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiment set forth herein. Rather, the embodiment is provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art. In the following detailed description, numeric values and ranges are provided for various aspects of the implementations described. These values and ranges are to be treated as examples only, and are not intended to limit the scope of the claims. In addition, a number of materials are identified as suitable for various facets of the implementations. These materials are to be treated as exemplary, and are not intended to limit the scope of the invention.
A globe represents a map of the world on a spherical outer shell. However, due to size constraints, the amount of information that can be maintained on the globe is rather limited. There may be other sources, such as online repositories, directories, encyclopaedias and the like, which hold much more information with regards to the various geographical regions illustrated on the globe. The present invention intends to integrate these different sources with the information presented on a conventional globe, while adding additional functionalities to convert the conventional globe into the globe of the present invention.
It is in this regard that the present invention has been elucidated with the help of an exemplary environment discussed below. However, a person skilled in the art would appreciate that the present invention is not limited to the exemplary environment, and many variations to the implementation of the present invention are possible, without departing from the scope of the present invention.
Figure 1A illustrates an exemplary environment (100) in which various embodiments of the present invention may be implemented. The environment (100) comprises a computing device (101) associated with a user. In various embodiments of the invention, the computing device (101) is selected from a group comprising a mobile handheld device, a PDA, a personal computer (such as a desktop or a laptop) or a tablet etc. The computing device (101) comprises a display device (102). The display device (102) may be one of, but not limited to, an LCD screen or an LED screen. Additionally, the computing device (101) includes an input device (104). In various embodiments, the input device (104) is one of, but not limited to, a keypad, a joystick, a mouse and a trackball etc. In various other embodiments, the display device (102) and the input device (104) are integrated into a capacitive or a resistive or an equivalent touch based screen device. The computing device (101) also includes an image capturing device (105), such as a camera or a combination of one or more cameras.
The computing device (101) is envisaged to have further computing capabilities, such as a local processor (106) and a local memory (108). In various embodiments, the local processor (106) is one of, but not limited to, a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), a general purpose or an ARM based processor. Additionally, the local memory (108) is one of, but not limited to, EPROM, EEPROM and Flash memory etc. The computing device (101) is envisaged to have additional storage capabilities in the form of additional local storage (110). The local storage (110) is envisaged to store reference data, at least in part, in the computing device (101), for the user's access. The reference data here is envisaged to include data relevant to various geographical locations across the world, including images, text, media etc. of monuments, famous landscapes, rivers, political scenarios, economic scenarios, population data, cultural information, occupational information etc. The reference data may be sourced from various online and offline sources such as image repositories, mapping services, encyclopedias, local governing bodies, international forums and their online and offline journals and libraries etc.
The computing device (101) is connected to a network (112). The network (112) is one of, but not limited to, a Local Area Network (LAN) or a Wide Area Network (WAN) implemented through a number of protocols, such as but not limited to, 802.x, Bluetooth, ZigBee or the like. Preferably, the network (112) is the Internet. Further connected to the network (112) is a central server (113). The central server (113) is envisaged to have computing capabilities such as a server memory (114) and a server processor (116). In various embodiments, the server processor (116) is one of, but not limited to, a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), a general purpose or an ARM based processor. Additionally, the server memory (114) is one of, but not limited to, EPROM, EEPROM and Flash memory etc. An external storage device (118) connected with the network (112) is also envisaged to include at least a portion of the reference data (or the complete reference data). The external storage device may be a local storage device or a cloud based storage device. When any portion of the reference data is requested at the computing device (101) (for example, by a browser application or a standalone application, for the purposes of the present invention), the central server (113) fetches the portion of the reference data from the external storage device (118) and delivers the portion to the computing device (101) through the network (112). Further illustrated in figure 1A is a globe (107), of which an image is being captured by the image capturing device (105).
As shown in figure 1B, the globe (107) includes an outer shell (1072). The outer shell (1072) may be a hollow shell made up of metal or a light weight polymer. An outer surface of the outer shell (1072) may include a map of the world having various geographical regions (such as oceans, continents, countries, states, cities, mountain ranges, plateaus, grasslands, rivers etc.) marked out. The outer surface may include additional graphics like animals, cultural elements, food items and monuments that are specific to a geo location on the globe (107). Figure 1B illustrates a side sectional view of the globe (107), in accordance with an embodiment of the present invention. As shown in figure 1B, a rotation mechanism (1074) provided with an actuator (1076) is located inside the outer shell (1072). The actuator (1076) may be a D.C. motor. Further, a microphone (1078) has been provided within the outer shell (1072). Also, a control module (not shown) has been provided within the outer shell (1072).
The microphone (1078) is configured to receive an audio signal. The audio signal may be received from the user, from the computing device (101) or from any other device. Further, the audio signal may include a word identifying a geographical region, such as a name of a city or a mountain range or an ocean or a country or a state etc. The control module is configured to transmit an actuation signal to the actuator (1076), in response to the microphone (1078) receiving the audio signal. The actuator (1076) is configured to actuate on receiving the actuation signal and cause the outer shell (1072) to rotate, using the rotation mechanism (1074). In that manner, if the user speaks 'Africa' or an audio command from a speaker of the computing device (101) includes the word 'Africa', the outer shell (1072) would be rotated in order to point the map of Africa towards the user or the computing device (101), as the case may be. The rotation mechanism is completely separate, and does not affect the augmented reality features as such. Embodiments of the present invention can now be understood with the exemplary environment (100) as a reference. It is to be noted here that although the embodiments of the method have been described using the globe (107) having automation features such as the microphone (1078) and the rotation mechanism (1074), a conventional globe with just an outer shell having geographical regions marked out on it would also be applicable, to varying extents, for the various embodiments of the method described below, for example where the automation features are not implicitly required.
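By way of a simplified, purely illustrative sketch of the control module's behaviour, a recognised spoken word may be mapped to a signed rotation for the actuator. The region names, their bearings (the hypothetical longitude at which each region faces the user) and the shortest-path wrapping are assumptions for illustration only, not the disclosed control logic.

```python
# Hypothetical mapping of recognised region names to the bearing (in degrees)
# at which the region faces the user; values are illustrative placeholders.
REGION_BEARINGS = {"africa": 20.0, "asia": 90.0, "america": -100.0}

def actuation_signal(spoken_text, current_bearing):
    """Return the signed rotation (degrees) the actuator should apply so the
    named region faces the user; 0.0 when no region word is recognised."""
    for word in spoken_text.lower().split():
        if word in REGION_BEARINGS:
            # wrap into [-180, 180) so the outer shell takes the shortest path
            return (REGION_BEARINGS[word] - current_bearing + 180) % 360 - 180
    return 0.0
```

For example, the command "show me Africa" with the shell at bearing 0 would yield a rotation of +20 degrees under the hypothetical table above.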
Figure 2 illustrates a method (200) for enabling augmented reality interactions with the globe (107), in accordance with an embodiment of the present invention. The method begins at step 210 by receiving an image of a portion of the outer shell (1072) of the globe (107), from the image capturing device (105). The image may either be stored locally inside the local storage (110) and/or may be transmitted to the central server (113) for storage in the external storage device (118), through the network (112).
At step 220, a geographical region is identified from the image. Again, the geographical region may be identified locally at the computing device (101) or at the central server (113). The geographical region may be identified by comparing the image with the reference data. Further, different image recognition algorithms may be used to compare various points in the image, as well as any textual information, with the reference data, to identify the geographical region. One such algorithm may include sampling of a predetermined number of points along a boundary defining the geographical region to recreate the boundary, internally, in a digital form. The digital form may then be compared with the reference data to identify the geographical region. Any form of textual information present on the geographical region may be helpful in its identification. For example, a country may have the very name of the country or the names of major cities marked out, and an ocean might have well known islands and archipelagos marked out.
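The boundary-sampling comparison described above can be sketched in simplified form as follows. This is an assumption-laden illustration, not the claimed algorithm: the fixed point count, the centroid/scale normalisation and the nearest-shape scoring are hypothetical choices, the reference shapes are toy polygons, and a real implementation would also need rotation handling and the textual cues mentioned above.

```python
import math

def resample_boundary(points, n=32):
    """Resample a closed polygon to n points evenly spaced by arc length,
    recreating the boundary 'internally, in a digital form'."""
    pts = list(points) + [points[0]]                     # close the loop
    seg = [math.dist(pts[i], pts[i + 1]) for i in range(len(pts) - 1)]
    total = sum(seg)
    out, i, acc = [], 0, 0.0
    for k in range(n):
        target = total * k / n
        while acc + seg[i] < target:                     # advance to segment
            acc += seg[i]
            i += 1
        t = (target - acc) / seg[i] if seg[i] else 0.0   # interpolate within it
        (x0, y0), (x1, y1) = pts[i], pts[i + 1]
        out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return out

def normalize(pts):
    """Make the sampled boundary translation- and scale-invariant."""
    cx = sum(x for x, _ in pts) / len(pts)
    cy = sum(y for _, y in pts) / len(pts)
    scale = max(math.dist((x, y), (cx, cy)) for x, y in pts) or 1.0
    return [((x - cx) / scale, (y - cy) / scale) for x, y in pts]

def identify_region(image_boundary, reference_shapes):
    """Return the name of the reference shape closest to the captured boundary."""
    probe = normalize(resample_boundary(image_boundary))
    def score(name):
        ref = normalize(resample_boundary(reference_shapes[name]))
        return sum(math.dist(p, r) for p, r in zip(probe, ref))
    return min(reference_shapes, key=score)
```

Under these assumptions, a boundary traced from the image is matched against each stored reference boundary, and the best-scoring region name is reported.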
In one embodiment of the invention, if no geographical region is identified from the image, an alert may be provided to the user, in the form of a text or voice notification for example. The user may then reorient the computing device (101) and/or the globe (107), so that a more accurate and/or appropriate image of the portion of the outer shell (1072) may be captured.
At step 230, a plurality of graphical elements related to the geographical region, are generated. It is envisaged here, that the plurality of graphical elements includes one or more of 2-Dimensional (2D) and 3-Dimensional (3D) illustrations of entities selected from a group comprising animals, monuments, national flags, landmarks, inventions and foods related to the geographical regions. For example, if the identified geographical region is ‘Paris’ city, a 2-Dimensional (2D) view (or an Orthogonal view) or a 3-Dimensional (3D) view (or a Perspective or an Isometric View) of Eiffel Tower may be generated. In another example, if the identified geographical region is ‘Egypt’, a 2D or a 3D view of Sphinx or Pyramids may be generated. Again the plurality of graphical elements may be generated at the computing device (101) and/or the central server (113).
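By way of illustration only, the generation step can be modelled as a look-up from the reference data into renderable entities per category. The region names, category labels and entries below are hypothetical placeholders, not actual reference data, and the optional category filter anticipates the category-wise display discussed at step 240.

```python
# Hypothetical reference data mapping a region to categorised entities; in
# practice this would come from the local storage (110) or the external
# storage device (118) via the central server (113).
REFERENCE_DATA = {
    "Paris": {"monuments": ["Eiffel Tower"], "foods": ["croissant"]},
    "Egypt": {"monuments": ["Sphinx", "Pyramids"]},
}

def generate_graphical_elements(region, categories=None):
    """Collect the entities to render for an identified region, optionally
    restricted to selected categories."""
    entries = REFERENCE_DATA.get(region, {})
    selected = categories if categories is not None else entries.keys()
    return [entity for cat in selected for entity in entries.get(cat, [])]
```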
At step 240, the plurality of graphical elements is displayed at the display device (102) of the computing device (101). In various embodiments, the plurality of graphical elements is displayed at the display device (102) in a manner that they appear to be located upon the geographical region of the globe (107). For example, the plurality of graphical elements may be angularly oriented in a manner that they appear to be normal to the curved outer shell (1072) of the globe (107). This gives the appearance that the plurality of graphical elements actually originate from, or are located on, the globe (107) in 3-Dimensional space; hence the term "augmented reality". Another point to be noted about the plurality of graphical elements is that the entities such as animals, monuments, national flags, landmarks, inventions and foods, in themselves, constitute different categories. Therefore, the plurality of graphical elements may be divided into a plurality of categories, and one or more categories may be displayed on selection. This would prevent overcrowding of the screen space of the display device (102) and allow for greater clarity while viewing the plurality of graphical elements.
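The "normal to the curved outer shell" orientation can be derived from the latitude and longitude of the identified geographical region. The sketch below assumes a unit sphere and the standard geographic-to-Cartesian conversion; it illustrates the geometric idea only and is not the disclosed rendering method.

```python
import math

def surface_normal(lat_deg, lon_deg):
    """Outward unit normal of a unit sphere at the given latitude/longitude;
    a graphical element drawn along this vector appears to stand on the globe."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    return (math.cos(lat) * math.cos(lon),
            math.cos(lat) * math.sin(lon),
            math.sin(lat))
```

A renderer would place each element at the region's surface point and align its vertical axis with this vector, producing the effect of elements standing upright on the curved shell.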
Additionally, other information such as media or text (historical or cultural information) related to the plurality of graphical elements may also be displayed at the display device (102). It is envisaged here, that in some embodiments, the information may be displayed concurrently, while the plurality of graphical elements is being displayed. In various alternate embodiments, the information is displayed on selection (by use of the input device (104)) of any one or more of the plurality of graphical elements. The information may also be arranged in a cascading manner, that is, one piece of information leading to another (via a hyperlink, for example), so that more and more information can be associated and displayed with any graphical element, without using a lot of the screen space of the display device (102).
Figure 3 illustrates the plurality of graphical elements (310) (in this case Eiffel Tower) being displayed at the display device (102), in accordance with an embodiment (300) of the present invention. The plurality of graphical elements may be displayed along with the identified geographical region (such as a 3D image of Eiffel Tower being located on a map of Paris, at the display device (102)) or individually. It is further envisaged that additional inputs may be received from the user, through the input device (104) and the plurality of graphical elements may then be augmented in response to the additional inputs. The plurality of graphical elements may also be provided with additional features such as 3D animations, videos, audios and text etc.
The additional features may be actuated and controlled in response to the additional inputs. The additional inputs may include, for example, instruction like rotate, pan, zoom, orient, activate animation, activate audio and display text etc. It is also envisaged that the plurality of graphical elements may also be augmented in response to a movement detection of the computing device (101). The movement detection may be facilitated by a plurality of sensors (not shown) available with the computing device (101), such as gyroscopes and accelerometers etc. It is also envisaged that the additional elements, along with the plurality of graphical elements, may also be displayed as being a part of the globe (107), at the display device (102). This may be achieved by integrating the image of the globe (107) with the plurality of graphical elements and/or the additional elements.
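As a simplified sketch of the movement-responsive augmentation, successive sensor deltas (for example from a gyroscope) may be accumulated into the displayed orientation of an element. The per-axis degree convention and wrap-around below are assumptions for illustration only.

```python
def augment_orientation(current_angles, gyro_delta):
    """Apply device-movement deltas (degrees about x, y, z, e.g. from a
    gyroscope) to a displayed element's orientation, wrapped to [0, 360)."""
    return tuple((a + d) % 360.0 for a, d in zip(current_angles, gyro_delta))
```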
In accordance with another embodiment (400) shown in figure 4A, the plurality of graphical elements (310) are displayed at the display device (102) over the geographical region of the United States of America (USA) (402). The plurality of graphical elements (310) include, but are not limited to, the national flag, famous landmarks, locations, inventions and popular food items associated with the various cities of the USA, for example, Hollywood, the telephone, the television, the iPhone, peanut butter etc. Further, any of the plurality of graphical elements (310) displayed may be selected by the user to access an augmented illustration and information of the selected graphical element. The selection may be made by tapping any of the plurality of graphical elements (310) on the touch (input) based display device (102).
Figure 4B illustrates a first graphical element (3102) selected from the plurality of graphical elements (310) being displayed in figure 4A, in accordance with an embodiment (450) of the present invention. The first graphical element (3102) here is peanut butter. As previously discussed, the display device (102) enables the user to rotate, pan, zoom, orient and/or animate the first graphical element (3102). Also, an information box (452) is displayed at the display device (102) providing information about peanut butter.
In yet another embodiment (500) shown in figure 5A, the plurality of graphical elements (310) are displayed on the display device (102) over the geographical region of Africa (502). The plurality of graphical elements (310) include, but are not limited to, national flags of the countries in the African continent, animals and popular food items associated with the various countries of Africa. Further, any of the plurality of graphical elements (310) displayed may be selected by the user to access an augmented illustration and information of the selected graphical element. The selection may be made by tapping any of the plurality of graphical elements (310) on the touch (input) based display device (102).
Figure 5B illustrates a second graphical element (3104) selected from the plurality of graphical elements (310) being displayed in figure 5A, in accordance with an embodiment (550) of the present invention. The second graphical element (3104) is hippopotamus (animal). As previously discussed, the display device enables the user to rotate, pan, zoom, orient and/or animate the second graphical element (3104). Also, an information box (552) is displayed at the display device (102) providing information about the hippopotamus.
In yet another embodiment (600) shown in figure 6A, the plurality of graphical elements (310) are displayed on the display device (102) over the geographical region of Asia (602). The plurality of graphical elements (310) include, but are not limited to, national flags of the countries in Asia, monuments, locations, birds and animals associated with the various countries of Asia. Further, any of the plurality of graphical elements (310) displayed may be selected by the user to access an augmented illustration and information of the selected graphical element. The selection may be made by tapping any of the plurality of graphical elements (310) on the touch (input) based display device (102).
Figure 6B illustrates a third graphical element (3106) selected from the plurality of graphical elements (310) displayed in figure 6A, in accordance with an embodiment (650) of the present invention. The third graphical element (3106) is a peacock (a bird). As previously discussed, the display device (102) enables the user to rotate, pan, zoom, orient and/or animate the third graphical element (3106). Also, an information box (652) is displayed on the display device (102), providing information about the peacock.
In yet another embodiment (700), shown in figure 7A, the plurality of graphical elements (310) are displayed on the display device (102) over the geographical region of Russia (702). The plurality of graphical elements (310) include, but are not limited to, monuments, landmarks, animals and popular food items of Russia. Further, any of the displayed graphical elements (310) may be selected by the user to access an augmented illustration of, and information about, the selected graphical element. The selection may be made by tapping any of the plurality of graphical elements (310) on the touch (input) based display device (102).
Figure 7B illustrates a fourth graphical element (3108) selected from the plurality of graphical elements (310) displayed in figure 7A, in accordance with an embodiment (750) of the present invention. The fourth graphical element (3108) is Saint Basil’s Cathedral. As previously discussed, the display device (102) enables the user to rotate, pan, zoom, orient and/or animate the fourth graphical element (3108). Also, an information box (752) is displayed on the display device (102), providing information about Saint Basil’s Cathedral.
In yet another embodiment (800), shown in figure 8, a plurality of states (804), for example, Western Australia, South Australia, Queensland, Victoria etc., along with their geographical boundaries, are displayed on the display device (102) over the geographical region of Australia (802). Also, an information box (808) is displayed on the display device (102), providing information about Australia. Further, additional information about the weather of the states (804) displayed on the display device (102) may also be provided.
The method steps described above are capable of being performed by either the local processor (106) or the server processor (116). For example, in case any portion of the reference data is accessed locally from the local storage (110), the method steps would be performed by the local processor (106). In case any portion of the reference data is fetched from the external storage device (118) through the network (112), the method steps would be performed by either one or both of the local processor (106) and the server processor (116). Additionally, the method steps described above may also be performed through a number of modules, as described in the following discussion.
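The processor-selection logic described above can be illustrated, purely for explanatory purposes, by the following sketch. All function and marker names here are hypothetical and do not appear in the specification:

```python
# Illustrative sketch only: the function name and the "local"/"remote"
# markers are hypothetical, not part of the specification.

def select_processors(data_sources):
    """Return the set of processors that would perform the method steps.

    data_sources: iterable of markers, "local" for reference data accessed
    from the local storage (110) and "remote" for data fetched from the
    external storage device (118) through the network (112).
    """
    # The local processor (106) can always participate.
    processors = {"local"}
    if any(src == "remote" for src in data_sources):
        # Remotely fetched data may involve the server processor (116),
        # alone or together with the local processor.
        processors.add("server")
    return processors


print(sorted(select_processors(["local"])))            # ['local']
print(sorted(select_processors(["local", "remote"])))  # ['local', 'server']
```

The sketch only captures the allocation rule stated in the text; a real system would of course dispatch actual computation rather than return labels.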
Figure 9 illustrates a system (900) for enabling augmented reality interactions with the globe (107), in accordance with an embodiment of the present invention. The system (900) comprises an imaging module (910), an image processing module (920), a graphics generation module (930) and an interface module (940). The imaging module (910) is configured to receive the image of the portion of the outer shell (1072) of the globe (107) from the image capturing device (105) of the computing device (101). The image processing module (920) is configured to identify the geographical region from the image. The graphics generation module (930) is configured to generate the plurality of graphical elements (310) related to the geographical region. The interface module (940) is configured to display the plurality of graphical elements (310) on the display device (102) of the computing device (101).
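The four-module pipeline described above can be sketched, for illustration only, in Python. All class, method and data names below are hypothetical stand-ins and do not appear in the specification; in particular, region identification is simulated by a simple lookup rather than actual image processing:

```python
# Hypothetical sketch of the system (900) pipeline; names are illustrative.

class ARGlobeSystem:
    """Minimal model of the imaging -> identify -> generate -> display flow."""

    def __init__(self, reference_data):
        # Maps an identified region to its associated graphical elements,
        # standing in for reference data held in local or server storage.
        self.reference_data = reference_data

    def receive_image(self, image):
        # Imaging module (910): accept a frame from the image capturing device.
        return image

    def identify_region(self, image):
        # Image processing module (920): in this sketch, a stand-in lookup
        # keyed on a label embedded in the test frame.
        return image.get("region")

    def generate_elements(self, region):
        # Graphics generation module (930): fetch elements for the region.
        return self.reference_data.get(region, [])

    def display(self, elements):
        # Interface module (940): hand elements to the display device.
        return ["render:" + e for e in elements]

    def run(self, image):
        frame = self.receive_image(image)
        region = self.identify_region(frame)
        return self.display(self.generate_elements(region))


system = ARGlobeSystem({"Africa": ["flag", "hippopotamus", "food"]})
print(system.run({"region": "Africa"}))
# ['render:flag', 'render:hippopotamus', 'render:food']
```

The sketch shows only the division of responsibilities among the four modules; each stage would be backed by real camera input, image recognition and AR rendering in an actual implementation.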
In accordance with an embodiment of the present invention, the outer shell (1072) is a hollow shell made up of a material selected from a metal or a light weight polymer.
In accordance with an embodiment of the present invention, the outer shell (1072) has the outer surface. The outer surface includes the map of the world having the geographical regions marked out.
In accordance with an embodiment of the present invention, the geographical regions are selected from the group comprising the oceans, the continents, the countries, the states, the cities, the mountain ranges, the plateaus, the grasslands and the rivers.
In accordance with an embodiment of the present invention, the plurality of graphical elements (310) include one or more of 2-Dimensional (2D) and 3-Dimensional (3D) illustrations of entities selected from a group comprising animals, monuments, national flags, landmarks, inventions and foods related to the geographical regions.
In accordance with an embodiment of the present invention, the interface module (940) is further configured to display information related to the plurality of graphical elements (310) on the display device (102) of the computing device (101).
The present invention offers a number of advantages. The present invention brings a globe to life. It enables augmented reality interactions with the globe using household smart devices such as smart phones, tablets or head mounted devices, together with audio control and self-rotation features, providing infinite learning and playtime value for kids and adults alike. Recognising and memorising different continents, countries, oceans etc., which is otherwise a tedious task, becomes interesting and interactive using the present invention. Further, additional information such as national animals, national flags, famous landmarks, popular food items etc. related to a particular geographical region, which is generally not provided on a globe, is also displayed on the smart devices. This way, kids learn a lot of new things with good understanding, without having to specifically cram or study the globe or maps.
In some examples, the systems described herein, may include one or more processors, one or more forms of memory, one or more input devices/interfaces, one or more output devices/interfaces, and machine-readable instructions that when executed by the one or more processors cause the system to carry out the various operations, tasks, capabilities, etc., described above.
In some embodiments, the disclosed techniques can be implemented, at least in part, by computer program instructions encoded on a non-transitory computer-readable storage media in a machine-readable format, or on other non-transitory media or articles of manufacture. Such computing systems (and non-transitory computer-readable program instructions) can be configured according to at least some embodiments presented herein, including the processes described in above description.
The programming instructions can be, for example, computer executable and/or logic implemented instructions. In some examples, a computing device is configured to provide various operations, functions, or actions in response to the programming instructions conveyed to the computing device by one or more of the computer readable medium, the computer recordable medium, and/or the communications medium. The non-transitory computer readable medium can also be distributed among multiple data storage elements, which could be remotely located from each other. The computing device that executes some or all of the stored instructions can be a microfabrication controller, or another computing platform. Alternatively, the computing device that executes some or all of the stored instructions could be a remotely located computer system, such as a server.
In general, the word “module,” as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions, written in a programming language, such as, for example, Java, C, or assembly. One or more software instructions in the modules may be embedded in firmware, such as an EPROM. It will be appreciated that modules may comprise connected logic units, such as gates and flip-flops, and may comprise programmable units, such as programmable gate arrays or processors. The modules described herein may be implemented as software and/or hardware modules and may be stored in any type of computer-readable medium or other computer storage device.
Further, while one or more operations have been described as being performed by or otherwise related to certain modules, devices or entities, the operations may be performed by or otherwise related to any module, device or entity. As such, any function or operation that has been described as being performed by a module could alternatively be performed by a different server, by the cloud computing platform, or a combination thereof.
Further, the operations need not be performed in the disclosed order, although in some examples, an order may be preferred. Also, not all functions need to be performed to achieve the desired advantages of the disclosed system and method, and therefore not all functions are required.
Various modifications to these embodiments will be apparent to those skilled in the art from the description. The principles associated with the various embodiments described herein may be applied to other embodiments. Therefore, the description is not intended to be limited to the embodiments but is to be accorded the broadest scope consistent with the principles and the novel and inventive features disclosed or suggested herein. Accordingly, the invention is intended to embrace all such alternatives, modifications, and variations that fall within the scope of the present invention.
CLAIMS
We Claim:
1. A globe (107), comprising:
an outer shell (1072);
a rotation mechanism (1074) provided with an actuator (1076), a microphone (1078) and a control module;
wherein the microphone (1078) is configured to receive an audio signal;
wherein the control module is configured to transmit an actuation signal to the actuator (1076), in response to the microphone (1078) receiving the audio signal; and
wherein the actuator (1076) is configured to actuate on receiving the actuation signal and cause the outer shell (1072) to rotate, using the rotation mechanism (1074).
2. The globe (107) as claimed in claim 1, wherein the outer shell (1072) is a hollow shell made up of a material selected from a metal or a light weight polymer.
3. The globe (107) as claimed in claim 2, wherein the outer shell (1072) has an outer surface, the outer surface includes a map of the world having a plurality of geographical regions marked out.
4. The globe (107) as claimed in claim 3, wherein the geographical regions are selected from the group comprising oceans, continents, countries, states, cities, mountain ranges, plateaus, grasslands and rivers.
5. A method (200) for enabling augmented reality interactions with a globe (107) comprising the steps of:
receiving an image of a portion of an outer shell (1072) of the globe (107), from an image capturing device (105) of a computing device (101);
identifying a geographical region from the image;
generating a plurality of graphical elements (310) related to the geographical region; and
displaying the plurality of graphical elements (310) on a display device (102) of the computing device (101).
6. The method (200) as claimed in claim 5, wherein the outer shell (1072) is a hollow shell made up of a material selected from a metal or a light weight polymer.
7. The method (200) as claimed in claim 6, wherein the outer shell (1072) has an outer surface, the outer surface including a map of the world having a plurality of geographical regions marked out.
8. The method (200) as claimed in claim 5, wherein the geographical region is selected from the group comprising oceans, continents, countries, states, cities, mountain ranges, plateaus, grasslands and rivers.
9. The method (200) as claimed in claim 5, wherein the plurality of graphical elements (310) include one or more of 2-Dimensional (2D) and 3-Dimensional (3D) illustrations of entities selected from a group comprising animals, monuments, national flags, landmarks, inventions and foods related to the geographical region.
10. The method (200) as claimed in claim 5, further comprising a step of displaying information related to the plurality of graphical elements (310) on the display device (102) of the computing device (101).
11. A system (900) for enabling augmented reality interactions with a globe (107) comprising:
an imaging module (910);
an image processing module (920);
a graphics generation module (930); and
an interface module (940);
wherein the imaging module (910) is configured to receive an image of a portion of an outer shell (1072) of the globe (107), from an image capturing device (105) of a computing device (101);
wherein the image processing module (920) is configured to identify a geographical region from the image;
wherein the graphics generation module (930) is configured to generate a plurality of graphical elements (310) related to the geographical region; and
wherein the interface module (940) is configured to display the plurality of graphical elements (310) on a display device (102) of the computing device (101).
12. The system (900) as claimed in claim 11, wherein the outer shell (1072) is a hollow shell made up of a material selected from a metal or a light weight polymer.
13. The system (900) as claimed in claim 12, wherein the outer shell (1072) has an outer surface, the outer surface including a map of the world having a plurality of geographical regions marked out.
14. The system (900) as claimed in claim 11, wherein the geographical region is selected from the group comprising oceans, continents, countries, states, cities, mountain ranges, plateaus, grasslands and rivers.
15. The system (900) as claimed in claim 11, wherein the plurality of graphical elements (310) include 2-Dimensional (2D) or 3-Dimensional (3D) illustrations of entities selected from a group comprising animals, monuments, national flags, landmarks, inventions and foods related to the geographical regions.
16. The system (900) as claimed in claim 11, wherein the interface module (940) is further configured to display information related to the plurality of graphical elements (310) on the display device (102) of the computing device (101).
Dated this the 22nd day of January 2018
[VIVEK DAHIYA]
OF SAGACIOUS RESEARCH
AGENT FOR THE APPLICANT- IN/PA 1491