Abstract: Disclosed is a method and system for generating a map of an enclosed area by utilizing a plurality of unmanned vehicles. The method comprises identifying entry points to an enclosed area using a set of unmanned vehicles based on a global positioning system and assigning an entry point of the entry points to each of the unmanned vehicles for traversing and scanning. The method further comprises identifying a plurality of sub-paths corresponding to a plurality of primary paths traversed collectively by the set of unmanned vehicles based on the scanning, and designating a sub-path from the plurality of sub-paths to an unmanned vehicle from the set of unmanned vehicles for further scanning. The method furthermore comprises generating a map of the enclosed area based on an aggregation of the scanning of each of the primary paths and the sub-paths scanned by the set of unmanned vehicles.
FIELD
[001] The present subject matter described herein, in general, relates to a system and a method for generating a map, and more particularly to a system and a method for generating a map of an enclosed area by utilizing a plurality of unmanned vehicles.
BACKGROUND
[002] Search and rescue operations may be understood as searching for and providing aid to people who are in distress or imminent danger. The general field of search and rescue includes many specialty sub-fields, usually determined by the type of terrain the search is conducted over. These include mountain rescue; ground search and rescue, including the use of search and rescue dogs; urban search and rescue in cities; combat search and rescue on the battlefield; and air-sea rescue over water. In some conditions, such as a building fire, search and rescue operations are dangerous and thus can greatly benefit from the use of unmanned vehicles (UV) to survey the environment, develop a map, and collect evidence about the position of a missing person.
[003] In search and rescue operations in an enclosed space, such as a building or a cave, detailed information and maps of the enclosed area are generally unavailable. Thus, implementing a search and rescue operation is a challenging task. Conventional systems using a single UV require a lot of time to perform a search and generate a map. Further, since time is of the essence in emergencies, conventional systems fail to execute the searching and mapping operation in a short period of time. Thus, there exists a need to map and search an enclosed area in a short amount of time.
[004] Furthermore, conventional UV and mapping methodologies have been developed based on the availability of GPS throughout the mapping operation. Such conventional systems fail when GPS is unavailable, as is typically the case inside an enclosed area such as a cave. Thus, there is a need to develop a system and method to search and generate a map of an enclosed area in the absence of GPS.
SUMMARY
[005] Before the present systems and methods for generating a map of an enclosed area by utilizing a plurality of unmanned vehicles are described, it is to be understood that this
application is not limited to the particular systems and methodologies described, as there can be multiple possible embodiments which are not expressly illustrated in the present disclosure. It is also to be understood that the terminology used in the description is for the purpose of describing the particular implementations or versions or embodiments only, and is not intended to limit the scope of the present application. This summary is provided to introduce aspects related to a system and a method for generating a map of an enclosed area by utilizing a plurality of unmanned vehicles. This summary is not intended to identify essential features of the claimed subject matter, nor is it intended for use in determining or limiting the scope of the claimed subject matter.
[006] In one implementation, a system for generating a map of an enclosed area by utilizing a plurality of unmanned vehicles is disclosed. In one aspect, the system may identify one or more entry points to an enclosed area using a set of unmanned vehicles based on a global positioning system. Upon cooperative identification by the unmanned vehicles, the system may assign an entry point of the one or more entry points to each of the unmanned vehicles. Further, each of the unmanned vehicles may be configured to traverse a primary path associated with the assigned entry point, and scan the primary path using a camera and one or more sensors mounted on the unmanned vehicle, without GPS. Subsequent to assigning, the system may identify a plurality of sub-paths corresponding to a plurality of primary paths traversed collectively by the set of unmanned vehicles based on the scanning. Further, each of the primary paths comprises one or more sub-paths. Further to identifying, the system may designate a sub-path from the plurality of sub-paths to one or more unmanned vehicles from the set of unmanned vehicles. Further, the one or more unmanned vehicles are configured to traverse the designated sub-path, and scan the designated sub-path using the camera and the one or more sensors mounted on the unmanned vehicle, without GPS. Upon designating, the system may generate a map of the enclosed area based on an aggregation of the scanning of each of the primary paths and the sub-paths scanned by the set of unmanned vehicles.
[007] In one implementation, a method for generating a map of an enclosed area by utilizing a plurality of unmanned vehicles is disclosed. In one aspect, the method may comprise identifying one or more entry points to an enclosed area using a set of unmanned vehicles based on a global positioning system. Upon cooperative identification by the unmanned vehicles, the method may comprise assigning an entry point of the one or more entry points
to each of the unmanned vehicles. Further, each of the unmanned vehicles is configured to traverse a primary path associated with the assigned entry point, and scan the primary path using a camera and one or more sensors mounted on the unmanned vehicle, without GPS. Subsequent to assigning, the method may comprise identifying a plurality of sub-paths corresponding to a plurality of primary paths traversed collectively by the set of unmanned vehicles based on the scanning. Further, each of the primary paths comprises one or more sub-paths. Upon identifying, the method may comprise designating a sub-path from the plurality of sub-paths to one or more unmanned vehicles from the set of unmanned vehicles. Further, the one or more unmanned vehicles are configured to traverse the designated sub-path, and scan the designated sub-path using the camera and the one or more sensors mounted on the unmanned vehicle, without GPS. Subsequent to designating, the method may comprise generating a map of the enclosed area based on an aggregation of the scanning of each of the primary paths and the sub-paths scanned by the set of unmanned vehicles.
[008] In yet another implementation, a non-transitory computer readable medium embodying a program executable in a computing device for generating a map of an enclosed area by utilizing a plurality of unmanned vehicles is disclosed. In one aspect, the program may comprise a program code for identifying one or more entry points to an enclosed area using a set of unmanned vehicles based on a global positioning system. The program may comprise a program code for assigning an entry point of the one or more entry points to each of the unmanned vehicles. Further, each of the unmanned vehicles is configured to traverse a primary path associated with the assigned entry point, and scan the primary path using a camera and one or more sensors mounted on the unmanned vehicle, without GPS. The program may comprise a program code for identifying a plurality of sub-paths corresponding to a plurality of primary paths traversed collectively by the set of unmanned vehicles based on the scanning. Further, each of the primary paths comprises one or more sub-paths. The program may comprise a program code for designating a sub-path from the plurality of sub-paths to one or more unmanned vehicles from the set of unmanned vehicles. Further, the one or more unmanned vehicles are configured to traverse the designated sub-path, and scan the designated sub-path using the camera and the one or more sensors mounted on the unmanned vehicle, without GPS. The program may comprise a program code for generating a map of the enclosed area based on an aggregation of the scanning of each of the primary paths and the sub-paths scanned by the set of unmanned vehicles.
BRIEF DESCRIPTION OF THE DRAWINGS
[009] The foregoing detailed description of embodiments is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the present subject matter, an example of construction of the present subject matter is provided as figures; however, the invention is not limited to the specific method and system disclosed in the document and the figures.
[010] The present subject matter is described in detail with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to refer to various features of the present subject matter.
[011] Figure 1 illustrates a network implementation of a system for generating a map of an enclosed area by utilizing a plurality of unmanned vehicles, in accordance with an embodiment of the present subject matter.
[012] Figure 2 illustrates the system generating a map of an enclosed area by utilizing a plurality of unmanned vehicles, in accordance with an embodiment of the present subject matter.
[013] Figure 3 illustrates a method for generating a map of an enclosed area by utilizing a plurality of unmanned vehicles, in accordance with an embodiment of the present subject matter.
DETAILED DESCRIPTION
[014] Some embodiments of this disclosure, illustrating all its features, will now be discussed in detail. The words "comprising," "having," "containing," and "including," and other forms thereof, are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It must also be noted that as used herein and in the appended claims, the singular forms "a," "an," and "the" include plural references unless the context clearly dictates otherwise. Although any systems and methods for generating a map of an enclosed area by utilizing a plurality of unmanned vehicles, similar or equivalent to those described herein can be used in the practice or testing
of embodiments of the present disclosure, the exemplary systems and methods for generating a map of an enclosed area by utilizing a plurality of unmanned vehicles are now described. The disclosed embodiments for generating a map of an enclosed area by utilizing a plurality of unmanned vehicles are merely examples of the disclosure, which may be embodied in various forms.
[015] Various modifications to the embodiments will be readily apparent to those skilled in the art, and the generic principles herein may be applied to other embodiments for generating a map of an enclosed area by utilizing a plurality of unmanned vehicles. However, one of ordinary skill in the art will readily recognize that the present disclosure for generating a map of an enclosed area by utilizing a plurality of unmanned vehicles is not intended to be limited to the embodiments described, but is to be accorded the widest scope consistent with the principles and features described herein.
[016] In an implementation, a system and method for generating a map of an enclosed area by utilizing a plurality of unmanned vehicles (UV) is described. In the implementation, one or more entry points to an enclosed area, such as a building, a house, a warehouse, a cave, or a storehouse, are identified using a set of unmanned vehicles based on a global positioning system. Upon identification of the one or more entry points, an entry point of the one or more entry points is assigned to each of the unmanned vehicles, or vice versa. Further, each unmanned vehicle is configured to traverse a primary path connected to the entry point and scan the primary path using a camera and one or more sensors mounted on the unmanned vehicle, without GPS. In one example, the sensor may be a proximity sensor, an accelerometer, a gyroscope, a temperature sensor, a laser, and the like.
[017] Further to assigning the entry points, a plurality of sub-paths corresponding to a plurality of primary paths traversed collectively by the set of unmanned vehicles is identified based on the scanning. Further, each of the primary paths comprises one or more sub-paths. Subsequently, a sub-path from the plurality of sub-paths is designated to one or more unmanned vehicles from the set of unmanned vehicles. Further, the one or more unmanned vehicles are configured to traverse the designated sub-path, and scan the designated sub-path using the camera and the one or more sensors mounted on the unmanned vehicle, without GPS. Finally, a map of the enclosed area is generated based on an aggregation of the scanning of each of the primary paths and the sub-paths scanned by the set of unmanned vehicles.
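The flow described above can be illustrated with a minimal sketch. All function names and data shapes below are illustrative assumptions for explanation only; they are not the disclosed implementation, which operates on real camera and sensor data:

```python
def assign_entry_points(uvs, entry_points):
    """Assign one distinct entry point to each unmanned vehicle (UV)."""
    # One-to-one assignment in order; a real system could assign
    # cooperatively, e.g. by distance from each UV to each entry.
    return dict(zip(uvs, entry_points))

def generate_area_map(primary_scans, sub_scans):
    """Aggregate per-path scans into a single map of the enclosed area."""
    # Each scan is modeled as a dict of grid cell -> observation; later
    # scans of the same cell overwrite earlier ones in this sketch.
    area_map = {}
    for scan in list(primary_scans) + list(sub_scans):
        area_map.update(scan)
    return area_map

# Usage: two UVs take two entry points, scan two primary paths, then
# one sub-path discovered along the first primary path.
uvs = ["uv-1", "uv-2"]
entries = [(12.97, 77.59), (12.98, 77.60)]   # GPS fixes at the entries
assignment = assign_entry_points(uvs, entries)
primary_scans = [{(0, 0): "corridor", (0, 1): "corridor"},
                 {(5, 0): "hall"}]
sub_scans = [{(0, 2): "room"}]
area_map = generate_area_map(primary_scans, sub_scans)
```

The aggregation step is the key idea: each UV contributes only the paths it scanned, and the union of those scans forms the map of the whole enclosed area.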
[018] Referring now to Figure 1, a network implementation of a system 102 for generating a map of an enclosed area by utilizing a plurality of unmanned vehicles (UV) 104, in accordance with an embodiment of the present subject matter, may be described. In one embodiment, the present subject matter is explained considering that the system 102 may be implemented as a system 102 installed within a central server 110 connected to a network 106. It may be understood that the system 102 may also be implemented in a variety of computing systems, such as a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, a server, a network server, a cloud-based computing environment, and the like. It may be understood that the system 102 may also be implemented in an unmanned vehicle 104-1 connected to other unmanned vehicles 104-2 … 104-N, hereinafter referred to as unmanned vehicles (UV) 104, via the network 106. In the embodiment, the unmanned vehicle 104 with the on-board installed system 102 may act as a master UV 104, and the other UVs 104 in communication with it may act as slave UVs. In one example, the UV may comprise a camera 112 and one or more sensors, and may operate without GPS.
[019] In another embodiment, the system 102 may also be implemented on a display device 108. It may be understood that the system implemented on the display device supports a plurality of browsers and all viewports. Examples of the plurality of browsers may include, but are not limited to, Chrome™, Mozilla™, Internet Explorer™, Safari™, and Opera™. It will also be understood that the system 102 may be accessed by multiple users through one or more display devices 108, or applications residing on the display devices 108. Examples of the display devices 108 may include, but are not limited to, a portable computer, a personal digital assistant, a handheld device, a television, and a workstation. The display devices 108 are communicatively coupled to the system 102 and the UV 104.
[020] In one implementation, the network 106 may be a wireless network, a wired network or a combination thereof. The network 106 can be implemented as one of the different types of networks, such as intranet, local area network (LAN), Wireless Personal Area Network (WPAN), Wireless Local Area Network (WLAN), wide area network (WAN), the internet, and the like. The network 106 may either be a dedicated network or a shared network. The shared network represents an association of the different types of networks that use a variety of protocols, for example, MQ Telemetry Transport (MQTT), Extensible Messaging and Presence Protocol (XMPP), Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to
communicate with one another. Further, the network 106 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like. In one embodiment, the system 102 may be communicatively coupled via a network to a database 110. In one example, the display devices may obtain the multimedia to be viewed from the database 110 via a network or via an electromagnetic signal transmission.
[021] Referring now to Figure 2, the system 102 is illustrated in accordance with an embodiment of the present subject matter. In one embodiment, the system 102 may include at least one processor 202, an input/output (I/O) interface 204, and a memory 206. The at least one processor 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the at least one processor 202 may be configured to fetch and execute computer-readable instructions stored in the memory 206.
[022] The I/O interface 204 may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like. The I/O interface 204 may allow the system 102 to interact with the user directly or through the client devices 104. Further, the I/O interface 204 may enable the system 102 to communicate with other computing devices, such as web servers and external data servers (not shown). The I/O interface 204 can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface 204 may include one or more ports for connecting a number of devices to one another or to another server.
[023] The memory 206 may include any computer-readable medium or computer program product known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. The memory 206 may include modules 208 and data 210.
[024] The modules 208 include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement particular abstract data types. In one implementation, the modules 208 may include a primary path module 212, a sub-path module
214, a generation module 216, and other modules 218. The other modules 218 may include programs or coded instructions that supplement applications and functions of the system 102. The modules 208 described herein may be implemented as software modules that may be executed in the cloud-based computing environment of the system 102.
[025] The memory 206, amongst other things, serves as a repository for storing data processed, received, and generated by one or more of the modules 208. The memory 206 may include data generated as a result of the execution of one or more of the other modules 218. In one implementation, the memory may include data 210. Further, the data 210 may include system data 220 for storing data processed, computed, received, and generated by one or more of the modules 208. Furthermore, the data 210 may include other data 224 for storing data generated as a result of the execution of one or more of the other modules 218.
[026] In one implementation, at first, a user may use the device 108 to access the system 102 via the I/O interface 204. The user may register using the I/O interface 204 in order to use the system 102. In one aspect, the user may access the I/O interface 204 of the system 102 for obtaining information or providing input information. In one implementation, the system 102 may automatically provide information to the user through the I/O interface 204.
PRIMARY PATH MODULE 212
[027] Referring to Figure 2, in an embodiment, the primary path module 212 may identify one or more entry points to an enclosed area using a set of unmanned vehicles based on a global positioning system. The Global Positioning System (GPS) may be understood as a space-based navigation system that provides location and time information in all weather conditions, anywhere on or near the Earth where there is an unobstructed line of sight to four or more GPS satellites. Further, the enclosed area may be understood as an area without direct access to the sky, for example, a cave, a house, a warehouse, a building, and the like, where GPS is not accessible. Furthermore, the unmanned vehicle may be an unmanned aerial vehicle (UAV), a robot, a drone, or a quad-copter. In one example, the unmanned vehicle may be understood as a vehicle without a human pilot/driver aboard. The motion of UVs may be controlled either autonomously by on-board/off-board computers or by the remote control of a pilot/driver. Further, in the said embodiment, the primary path module 212 may
store the identified entry points along with their GPS coordinates and other data in the system data 220.
[028] Upon identification of the entry points to the enclosed area, the primary path module 212 may assign an entry point from the entry points to each of the unmanned vehicles. In one example, the assigned entry points are distinct from each other. Further, in one example, each of the unmanned vehicles is configured to traverse a primary path associated with the assigned entry point and scan the primary path using a camera 112 and one or more sensors mounted on the unmanned vehicle, without GPS. In one example, the sensors may comprise an accelerometer, a gyroscope, a temperature sensor, a laser, and a proximity sensor. Further, in the said embodiment, the primary path module 212 may store the assignment along with scanning data and other data in the system data 220.
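Inside the enclosed area, where GPS is unavailable, a UV's position along the primary path can be estimated from the on-board inertial sensors. The sketch below shows one assumed approach (dead reckoning: integrating accelerometer readings for speed and gyroscope yaw rate for heading); the disclosure does not specify a particular localization algorithm, and a real system would also need to correct for sensor drift:

```python
import math

def dead_reckon(start, samples, dt):
    """Estimate a 2-D position without GPS by integrating inertial samples.

    start   -- (x, y) position at the entry point (from the last GPS fix)
    samples -- sequence of (forward_accel, yaw_rate) readings from the
               accelerometer and gyroscope
    dt      -- sampling interval in seconds
    """
    x, y = start
    heading = 0.0   # radians, relative to the initial orientation
    speed = 0.0     # forward speed along the current heading
    for accel, yaw_rate in samples:
        heading += yaw_rate * dt             # gyroscope: integrate yaw rate
        speed += accel * dt                  # accelerometer: integrate accel
        x += speed * math.cos(heading) * dt  # advance along the heading
        y += speed * math.sin(heading) * dt
    return x, y
```

For example, ten one-second samples of 1 m/s² forward acceleration with no turning advance the UV 55 m straight ahead in this discrete model.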
[029] In one other example, the primary path module 212 may monitor the scanning and detect a conflict between two or more unmanned vehicles from the set of unmanned vehicles during scanning of the primary paths, based on the one or more sensors and the camera and without GPS. In one example, the conflict may be detected by a proximity sensor configured to sense proximity between two UVs. In another example, the conflict may be detected by the camera 112 configured to detect a UV in its vicinity. Further, the conflict may be understood as an overlap of the primary path being scanned by the two or more unmanned vehicles. Further to detection, the primary path module 212 may assign a new primary path to at least one of the unmanned vehicles in conflict based on the detection.
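A conflict check of this kind can be sketched as a pairwise proximity test over the UVs' estimated positions. The threshold value and the position representation are assumptions for illustration; the disclosure only requires that the proximity sensor or camera reveal an overlap:

```python
def detect_conflicts(positions, min_separation=2.0):
    """Return pairs of UVs closer than a separation threshold.

    positions      -- dict of uv_id -> (x, y) estimated position
    min_separation -- assumed proximity threshold in metres
    """
    conflicts = []
    ids = sorted(positions)
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            ax, ay = positions[a]
            bx, by = positions[b]
            # Euclidean distance between the two position estimates.
            if ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5 < min_separation:
                conflicts.append((a, b))
    return conflicts

# uv-1 and uv-2 have converged onto the same stretch of path.
conflicts = detect_conflicts({"uv-1": (0.0, 0.0),
                              "uv-2": (1.0, 0.0),
                              "uv-3": (10.0, 10.0)})
```

Here `conflicts` contains only the pair `("uv-1", "uv-2")`, and the primary path module would then assign a new primary path to one vehicle of that pair.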
SUB-PATH MODULE 214
[030] In the implementation, further to assigning the entry points, the sub-path module 214 may identify a plurality of sub-paths corresponding to a plurality of primary paths traversed collectively by the set of unmanned vehicles based on the scanning. Further, each of the primary paths may comprise one or more sub-paths. Further, the sub-path module 214 may store the identified sub-paths in the system data 220.
[031] Upon identification of the sub-paths, the sub-path module 214 may designate a sub-path from the plurality of sub-paths to one or more unmanned vehicles from the set of unmanned vehicles. Further, the one or more unmanned vehicles are configured to traverse the designated sub-path and scan the designated sub-path using the camera 112 and the one or more sensors mounted on the unmanned vehicle, without GPS. In one more
embodiment, the UV may be configured to scan sub-paths/sub-areas located inside the enclosed area on a priority predefined by a user. In one example, the sensors may comprise an accelerometer, a gyroscope, a temperature sensor, a laser, and a proximity sensor.
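Priority-driven designation of sub-paths can be sketched as sorting by the user-defined priority and spreading the work across the available UVs. The round-robin spreading and the default priority of 0 are assumptions for illustration; the disclosure only states that user-prioritized sub-paths/sub-areas are scanned first:

```python
def designate_sub_paths(uvs, sub_paths, priority):
    """Designate sub-paths to UVs, highest user-defined priority first.

    uvs       -- list of available UV identifiers
    sub_paths -- list of identified sub-path identifiers
    priority  -- dict of sub_path -> priority (higher is scanned first;
                 unlisted sub-paths default to 0)
    """
    ordered = sorted(sub_paths, key=lambda p: priority.get(p, 0),
                     reverse=True)
    designation = {uv: [] for uv in uvs}
    # Round-robin so the highest-priority sub-paths start immediately
    # and the load is spread across the set of UVs.
    for i, path in enumerate(ordered):
        designation[uvs[i % len(uvs)]].append(path)
    return designation

designation = designate_sub_paths(
    ["uv-1", "uv-2"],
    ["s1", "s2", "s3"],
    {"s2": 5, "s3": 3})   # the user marks s2 as most urgent
```

With this input, `uv-1` is designated `s2` first and `uv-2` takes `s3`, so the user's most urgent sub-paths are scanned before the unprioritized `s1`.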
[032] In one implementation, the sub-path module 214 may detect a conflict between two or more unmanned vehicles from the set of unmanned vehicles during scanning of the sub-paths, based on the one or more sensors and the camera 112. In one example, the conflict may be detected by a proximity sensor configured to sense proximity between two UVs. In another example, the conflict may be detected by the camera 112 configured to detect a UV in its vicinity. Further, the conflict may be understood as an overlap of the sub-path being scanned by the two or more unmanned vehicles. Upon detection of a conflict, the sub-path module 214 may designate a new sub-path to at least one of the unmanned vehicles based on the detection.
GENERATION MODULE 216
[033] In the implementation, upon designating and scanning the sub-paths, the generation module 216 may generate a map of the enclosed area. In an embodiment, the generation may be based on an aggregation of the scanning of each of the primary paths and the sub-paths scanned by the set of unmanned vehicles. Further, the generation module 216 may store the map in the system data 220.
[034] In one other embodiment, the generation module 216 may generate a map in real-time and provide the map to a user. In one more embodiment, the generation module 216 may identify one or more objects in the enclosed area based on the scanning. In one example, the one or more objects may be one of an animate object, such as a human or an animal, or an inanimate object, such as a laptop or a book, as predefined by a user. Upon identifying the object, the generation module 216 may notify the user with a location of the object in the map of the enclosed area, enabling easy retrieval. Furthermore, the generation module 216 may compute a distance between the entry points, a distance between the sub-paths, a length of the primary paths, and a length of the sub-paths based on the scanning, for providing an accurate map and the distance between two points.
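The distance and length computations can be sketched over the waypoints recorded during scanning, expressed in a local coordinate frame anchored at the entry points. That coordinate representation is an assumption for illustration; the disclosure does not fix one:

```python
import math

def path_length(waypoints):
    """Length of a scanned path given its ordered (x, y) waypoints."""
    return sum(math.dist(a, b) for a, b in zip(waypoints, waypoints[1:]))

def point_distance(p, q):
    """Straight-line distance between two points, e.g. two entry points."""
    return math.dist(p, q)

# An L-shaped primary path: 3 m east, then 4 m north.
length = path_length([(0.0, 0.0), (3.0, 0.0), (3.0, 4.0)])
gap = point_distance((0.0, 0.0), (6.0, 8.0))
```

Summing segment-by-segment Euclidean distances gives a path length of 7 m here, while the straight-line gap between the two example entry points is 10 m (`math.dist` requires Python 3.8+).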
[035] Exemplary embodiments for generating a map of an enclosed area by utilizing a plurality of unmanned vehicles discussed above may provide certain advantages. Though not required to practice aspects of the disclosure, these advantages may include those provided by the following features.
[036] Some embodiments of the system and the method enable generating a map in the absence of GPS/GPRS.
[037] Some embodiments of the system and the method enable simultaneous mapping and searching of a new location.
[038] Some embodiments of the system and the method reduce repeated scanning and time wastage.
[039] Referring now to Figure 3, a method 300 for generating a map of an enclosed area by utilizing a plurality of unmanned vehicles is shown, in accordance with an embodiment of the present subject matter. The method 300 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types.
[040] The order in which the method 300 for generating a map of an enclosed area by utilizing a plurality of unmanned vehicles is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method 300 or alternate methods. Additionally, individual blocks may be deleted from the method 300 without departing from the spirit and scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the embodiments described below, the method 300 may be considered to be implemented in the above-described system 102.
[041] At block 302, one or more entry points to an enclosed area may be identified using a set of unmanned vehicles based on a global positioning system. In an implementation, the primary path module 212 may identify one or more entry points to an enclosed area using a set of unmanned vehicles based on a global positioning system and store the entry points data in the system data 220.
[042] At block 304, an entry point of the one or more entry points may be assigned to each of the unmanned vehicles. Further, each unmanned vehicle may be configured to traverse a primary path associated with the assigned entry point, and scan the primary path using a camera and one or more sensors mounted on the unmanned vehicle, without GPS. In the
implementation, the primary path module 212 may assign an entry point of the one or more entry points, for scanning of the primary path associated with the entry point, to each of the unmanned vehicles. The primary path module 212 may store the assignment and the scanned data in the system data 220.
[043] At block 306, a plurality of sub-paths corresponding to a plurality of primary paths traversed collectively by the set of unmanned vehicles may be identified based on the scanning. Further, each of the primary paths comprises one or more sub-paths. In the implementation, the sub-path module 214 may identify a plurality of sub-paths corresponding to a plurality of primary paths traversed collectively by the set of unmanned vehicles and store the sub-paths in the system data 220.
[044] At block 308, a sub-path from the plurality of sub-paths may be designated to one or more unmanned vehicles from the set of unmanned vehicles, wherein the one or more unmanned vehicles are configured to traverse and scan the designated sub-path. Furthermore, the scanning is based on the camera and the one or more sensors mounted on the unmanned vehicle, without GPS. In the implementation, the sub-path module 214 may designate a sub-path from the plurality of sub-paths to one or more unmanned vehicles from the set of unmanned vehicles. The sub-path module 214 may store the designation and the scanned data in the system data 220.
[045] At block 310, a map of the enclosed area may be generated based on an aggregation of the scanning of each of the primary paths and the sub-paths scanned by the set of unmanned vehicles. In the implementation, the generation module 216 may generate a map of the enclosed area and store the map in the system data 220.
[046] Exemplary embodiments discussed above may provide certain advantages. Though not required to practice aspects of the disclosure, these advantages may include a method and system for generating a map of an enclosed area by utilizing a plurality of unmanned vehicles.
[047] Although implementations for methods and systems for generating a map of an enclosed area by utilizing a plurality of unmanned vehicles have been described in language specific to structural features and/or methods, it is to be understood that the appended claims are not necessarily limited to the specific features or methods described. Rather, the specific
features and methods are disclosed as examples of implementations for generating a map of an enclosed area by utilizing a plurality of unmanned vehicles.
WE CLAIM:
1. A method for generating a map of an enclosed area by utilizing a plurality of unmanned vehicles, the method comprising:
identifying, by a processor, one or more entry points to an enclosed area using a set of unmanned vehicles based on a global positioning system;
assigning, by the processor, an entry point of the one or more entry points to each of the unmanned vehicles, wherein each unmanned vehicle is configured to traverse a primary path associated with the assigned entry point, and wherein each unmanned vehicle is configured to scan the primary path using a camera and one or more sensors mounted on the unmanned vehicle;
identifying, by the processor, a plurality of sub-paths corresponding to a plurality of primary paths traversed collectively by the set of unmanned vehicles based on the scanning, wherein each of the primary paths comprises one or more sub-paths;
designating, by the processor, a sub-path from the plurality of sub-paths to one or more unmanned vehicles from the set of unmanned vehicles, wherein the one or more unmanned vehicles are configured to traverse the designated sub-path, and wherein each unmanned vehicle is configured to scan the designated sub-path using the camera and the one or more sensors mounted on the unmanned vehicle; and
generating, by the processor, a map of the enclosed area based on an aggregation of the scanning of each of the primary paths and the sub-paths scanned by the set of unmanned vehicles.
2. The method of claim 1, further comprising scanning a sub-area located inside the enclosed area on a priority predefined by a user.
3. The method of claim 1, further comprising:
identifying one or more objects, wherein each object is one of an animate object or an inanimate object predefined by a user; and notifying the user of a location of the object in the map of the enclosed area.
4. The method of claim 1, further comprising:
detecting a conflict between two or more unmanned vehicles from the set of unmanned vehicles during scanning based on the one or more sensors and the camera, wherein the conflict is indicative of an overlap of the primary path or the sub-path being scanned by the two or more unmanned vehicles.
5. The method of claim 4, further comprising assigning a new primary path to at least one of the unmanned vehicles in conflict based on the detection.
6. The method of claim 4, further comprising designating a new sub-path to at least one of the unmanned vehicles based on the detection.
7. The method of claim 1, further comprising computing a distance between the entry points, a distance between the sub-paths, a length of the primary paths, and a length of the sub-paths based on the scanning.
8. The method of claim 1, wherein the one or more sensors comprise an accelerometer, a gyroscope, a temperature sensor, a laser, and a proximity sensor.
9. The method of claim 1, wherein the unmanned vehicle is one of an unmanned aerial vehicle (UAV), a robot, a drone, and a quad-copter.
10. The method of claim 1, wherein each of the unmanned vehicles is configured to scan the primary paths and the sub-paths without GPS.
11. A system for generating a map of an enclosed area by utilizing a plurality of unmanned vehicles, the system comprising:
a memory; and
a processor coupled to the memory, wherein the processor is capable of executing instructions to perform steps of:
identifying one or more entry points to an enclosed area using a set of unmanned vehicles based on a global positioning system;
assigning an entry point of the one or more entry points to each of the unmanned vehicles, wherein each unmanned vehicle is configured to traverse a primary path associated with the assigned entry point, and wherein each unmanned vehicle is configured to scan the primary path using a camera and one or more sensors mounted on the unmanned vehicle;
identifying a plurality of sub-paths corresponding to a plurality of primary paths traversed collectively by the set of unmanned vehicles based on the scanning, wherein each of the primary paths comprises one or more sub-paths;
designating a sub-path from the plurality of sub-paths to one or more of the unmanned vehicles from the set of unmanned vehicles, wherein the one or more unmanned vehicles are configured to traverse the designated sub-path, and wherein each unmanned vehicle is configured to scan the designated sub-path using the camera and the one or more sensors mounted on the unmanned vehicle; and
generating a map of the enclosed area based on an aggregation of the scanning of each of the primary paths and the sub-paths scanned by the set of unmanned vehicles.
12. The system of claim 11, further comprising scanning a sub-area located inside the enclosed area on a priority predefined by a user.
13. The system of claim 11, further comprising:
identifying one or more objects, wherein each object is one of an animate object or an inanimate object predefined by a user; and notifying the user of a location of the object in the map of the enclosed area.
14. The system of claim 11, further comprising:
detecting a conflict between two or more unmanned vehicles from the set of unmanned vehicles during scanning based on the one or more sensors and the camera, wherein the conflict is indicative of an overlap of the primary path or the sub-path being scanned by the two or more unmanned vehicles.
15. The system of claim 14, further comprising assigning a new primary path to at least one of the unmanned vehicles in conflict based on the detection.
16. The system of claim 14, further comprising designating a new sub-path to at least one of the unmanned vehicles based on the detection.
17. The system of claim 11, further comprising computing a distance between the entry points, a distance between the sub-paths, a length of the primary paths, and a length of the sub-paths based on the scanning.
18. The system of claim 11, wherein the one or more sensors comprise an accelerometer, a gyroscope, a temperature sensor, a laser, and a proximity sensor.
19. The system of claim 11, wherein each of the unmanned vehicles is configured to scan the primary paths and the sub-paths without GPS.
20. The system of claim 11, wherein the unmanned vehicle is one of an unmanned aerial vehicle (UAV), a robot, a drone, and a quad-copter.
21. A non-transitory computer program product having embodied thereon a computer program for generating a map of an enclosed area by utilizing a plurality of unmanned vehicles, the computer program product storing instructions comprising instructions for:
identifying one or more entry points to an enclosed area using a set of unmanned vehicles based on a global positioning system;
assigning an entry point of the one or more entry points to each of the unmanned vehicles, wherein each unmanned vehicle is configured to traverse a primary path associated with the assigned entry point, and wherein each unmanned vehicle is configured to scan the primary path using a camera and one or more sensors mounted on the unmanned vehicle;
identifying a plurality of sub-paths corresponding to a plurality of primary paths traversed collectively by the set of unmanned vehicles based on the scanning, wherein each of the primary paths comprises one or more sub-paths;
designating a sub-path from the plurality of sub-paths to one or more of the unmanned vehicles from the set of unmanned vehicles, wherein the one or more unmanned vehicles are configured to traverse the designated sub-path, and wherein each unmanned vehicle is configured to scan the designated sub-path using the camera and the one or more sensors mounted on the unmanned vehicle; and
generating a map of the enclosed area based on an aggregation of the scanning of each of the primary paths and the sub-paths scanned by the set of unmanned vehicles.
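The mapping method recited in claims 1, 11, and 21 can be illustrated with a minimal simulation sketch. All names and the floor-plan data below are hypothetical stand-ins, not taken from the specification; the round-robin designation of sub-paths is one possible assignment strategy, chosen here only for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Vehicle:
    vid: int
    scans: list = field(default_factory=list)  # cells recorded by camera/sensors

# Hypothetical floor plan: each entry point opens a primary path (a list of
# grid cells), and scanning a primary path reveals branching sub-paths.
PRIMARY_PATHS = {
    "door_A": [(0, 0), (0, 1), (0, 2)],
    "door_B": [(5, 0), (5, 1)],
}
SUB_PATHS = {
    "door_A": [[(1, 1), (2, 1)]],
    "door_B": [[(4, 1)], [(6, 1)]],
}

def map_enclosed_area(entry_points, num_vehicles):
    vehicles = [Vehicle(i) for i in range(num_vehicles)]
    discovered = []
    # Assign one entry point (and its primary path) per vehicle, then "scan" it.
    for v, ep in zip(vehicles, entry_points):
        v.scans.extend(PRIMARY_PATHS[ep])
        discovered.extend(SUB_PATHS[ep])  # sub-paths identified during scanning
    # Designate each discovered sub-path to a vehicle (round-robin stand-in).
    for i, sub in enumerate(discovered):
        vehicles[i % num_vehicles].scans.extend(sub)
    # Generate the map by aggregating every vehicle's scans.
    return set().union(*(set(v.scans) for v in vehicles))
```

For the two hypothetical entry points above, `map_enclosed_area(["door_A", "door_B"], 2)` aggregates the five primary-path cells and four sub-path cells scanned across both vehicles into a single nine-cell map.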
| # | Name | Date |
|---|---|---|
| 1 | Form 9 [29-01-2016(online)].pdf | 2016-01-29 |
| 2 | Reply from DRDO.pdf | 2022-08-30 |
| 3 | Form 3 [29-01-2016(online)].pdf | 2016-01-29 |
| 4 | 201611003348-FER.pdf | 2022-02-03 |
| 5 | 201611003348-reply from DRDO-(16-09-2019).pdf | 2019-09-16 |
| 6 | Form 18 [29-01-2016(online)].pdf | 2016-01-29 |
| 7 | 201611003348-Response to office action (Mandatory) [11-09-2018(online)].pdf | 2018-09-11 |
| 8 | Drawing [29-01-2016(online)].pdf | 2016-01-29 |
| 9 | 201611003348-Defence Letter-(22-09-2016).pdf | 2016-09-22 |
| 10 | Other Patent Document [30-07-2016(online)].pdf | 2016-07-30 |
| 11 | Description(Complete) [29-01-2016(online)].pdf | 2016-01-29 |
| 12 | abstract.jpg | 2016-07-11 |
| 13 | 201611003348- Defence Letter - (17-02-2016).pdf | 2016-02-17 |
| 14 | 201611003348-GPA-(13-05-2016).pdf | 2016-05-13 |
| 15 | 201611003348-Correspondence Others-(13-05-2016).pdf | 2016-05-13 |
| 16 | 201611003348-Form-1-(13-05-2016).pdf | 2016-05-13 |
| 17 | searchE_01-02-2022.pdf | |