
System And Method For Managing Mobile Robot

Abstract: A system and method for managing a mobile robot operating in a work area comprising a matrix of ground markers is provided. The system comprises an odometry control arrangement to control movement of the mobile robot, a sensing arrangement to estimate a position of the mobile robot, and a scanning arrangement to scan space around the mobile robot. The system further comprises a processing unit configured to: determine an odometry error in movement of the mobile robot; command an assistive robot to travel to a position in a neighbourhood of the mobile robot; detect presence of features of the assistive robot and a relative position thereof; localize the mobile robot based on the position of the assistive robot and the determined relative position; determine a nearest ground marker to the mobile robot based on the localization; and control the mobile robot to travel to the determined nearest ground marker. FIG. 5


Patent Information

Filing Date: 24 November 2022
Publication Number: 22/2024
Publication Type: INA
Invention Field: ELECTRONICS

Applicants

Addverb Technologies Limited
Plot No. 5, Sector-156, Phase-II, Noida, Gautam Buddha Nagar, Uttar Pradesh, India, 201310

Inventors

1. Sarthak Upadhyay
E2003, Akriti Shantiniketan, Sector 143B, Noida, Uttar Pradesh 201306
2. Devnath Nair
Vadakkedath, Velloor, Kottayam, Kerala, India – 686609
3. Sunil Sulania
2 Ka 179, Shivaji Park, Alwar, Rajasthan - 301001

Specification

Description: SYSTEM AND METHOD FOR MANAGING MOBILE ROBOT

FIELD OF THE PRESENT DISCLOSURE
[0001] The present disclosure generally relates to autonomous guided vehicles, such as a mobile robot, implemented to move in a work area comprising a matrix of ground markers, and particularly to a system and method for managing the mobile robot operating in the work area, and specifically to automatically correcting the orientation of the mobile robot operating in the work area upon determination of an odometry error.

BACKGROUND
[0002] Autonomous guided vehicles (AGVs), also known as mobile robots, are increasingly being employed for transporting goods and materials from one place to another in constrained environments, such as a factory or a warehouse. For example, mobile robots are used in warehouse environments to assist with inventory management by transporting goods from one area of the warehouse to another. In the warehouse, the mobile robot may travel from a loading area to a dropping area based on a control system and without intervention from users. In a manufacturing plant, mobile robots can transport items, such as heavy vehicle components like engines and chassis, along a route on a floor of the manufacturing plant to deliver the payload from one location to another or to allow various manufacturing operations to be performed thereon. Mobile robots can carry payloads too heavy for a person, do so without a person's supervision, and offer the flexibility to be reconfigured to follow a different route or to carry different types of payloads.
[0003] Most systems involving such mobile robots implement ground markers placed on a floor, usually in the form of a matrix, to enable the mobile robots to follow a path defined using a combination of such ground markers. The mobile robot determines its position with respect to the floor based on the ground marker in its vicinity (specifically, directly underneath it). The very essence of such mobile robots is that their movements are accurately predetermined, so that they accurately follow the predefined path. However, a mobile robot may sometimes deviate from the predefined path during its operation due to one factor or another. Traditionally, the mobile robot is moved (or brought) back to the predefined path by positioning (or placing) it on one of the adjacent ground markers that is part of the predefined path, but this process is usually manual, which may be cumbersome and not cost-effective. Further, it may be noted that since unimpeded system operation depends heavily upon the mobile robot's availability, downtime or maintenance is undesirable and can incur high costs.
[0004] Therefore, in light of the foregoing discussion, there exists a need to overcome the problems associated with conventional techniques and to provide systems and/or methods for managing mobile robots in the work area, and specifically for assisting the mobile robot to correct its orientation in a manner which is automated, while minimally affecting the system's throughput.

SUMMARY
[0005] In an aspect, a system for managing a mobile robot operating in a work area comprising a matrix of ground markers is provided. The system comprises an odometry control arrangement provided in the mobile robot. The odometry control arrangement is configured to control movement of the mobile robot in the work area based on the ground markers therein. The system also comprises a sensing arrangement configured to estimate a position of the mobile robot based on the ground markers, when the mobile robot is moved in the work area using the odometry control arrangement. The system also comprises a scanning arrangement provided in the mobile robot. The scanning arrangement is configured to scan space around the mobile robot to generate scan data comprising features of one or more objects and relative position of the one or more objects with respect to the mobile robot in a vicinity of the mobile robot. The system further comprises a processing unit. The processing unit is configured to determine an odometry error in the movement of the mobile robot in the work area based on the estimated position of the mobile robot. The processing unit is further configured to command an assistive robot to travel in the work area to a designated position corresponding to one of the ground markers in a neighbourhood of the mobile robot based on a last estimated position of the mobile robot in the work area, in response to determination of the odometry error. The processing unit is further configured to implement the scanning arrangement in the mobile robot to scan the space around the mobile robot to generate scan data therefor. The processing unit is further configured to process the generated scan data to detect presence of features of the assistive robot and a relative position of the assistive robot with respect to the mobile robot in the work area. The processing unit is further configured to localize the mobile robot in the work area based on the position of the assistive robot in the work area and the determined relative position of the assistive robot with respect to the mobile robot. The processing unit is further configured to determine a nearest ground marker, as part of a predefined path, to the mobile robot and a trajectory for the mobile robot to travel from a current position thereof to the determined nearest ground marker based on the localization of the mobile robot. The processing unit is further configured to implement the odometry control arrangement to control the mobile robot to travel to the determined nearest ground marker by following the determined trajectory.
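By way of illustration only, the localization step recited above may be sketched as follows. This is a minimal numeric sketch, assuming the scan data yields the assistive robot's relative position as a range, a bearing in the mobile robot's own frame, and a relative heading; all values and names below are hypothetical and not part of the disclosure.

```python
import math

# Known world pose (x, y, theta) of the assistive robot, parked on a
# ground marker whose absolute position is known.
assistive_pose = (4.0, 3.0, math.pi / 2)

# Hypothetical measurement from the stranded robot's scan: range to the
# assistive robot, bearing in the stranded robot's frame, and relative
# heading (assistive robot's heading minus the stranded robot's heading).
rng, bearing, dtheta = 1.2, math.radians(30.0), math.radians(-5.0)

# Invert the observation model: a robot at (x, y, theta) observing a
# target at (rng, bearing) places that target at
# (x + rng*cos(theta + bearing), y + rng*sin(theta + bearing)).
theta = assistive_pose[2] - dtheta
x = assistive_pose[0] - rng * math.cos(theta + bearing)
y = assistive_pose[1] - rng * math.sin(theta + bearing)
print(f"localized pose: x={x:.2f} m, y={y:.2f} m, heading={math.degrees(theta):.1f} deg")
```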
[0006] In one or more embodiments, the processing unit is configured to select one of other mobile robots from a fleet of mobile robots operating in the work area to designate as the assistive robot.
[0007] In one or more embodiments, the processing unit is configured to determine a suitable time to command the assistive robot to travel in the work area to the designated position corresponding to one of the ground markers in the neighbourhood of the mobile robot based on at least one of: an operational cycle of the other mobile robot designated as the assistive robot and operational cycles of mobile robots from the fleet of mobile robots operating in the work area.
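As an illustrative sketch of this timing decision only (the robot identifiers and cycle times below are hypothetical), the processing unit might designate the fleet member whose current operational cycle completes soonest and dispatch it once that cycle ends:

```python
# Each tuple: (robot id, seconds until its current operational cycle ends);
# 0.0 means the robot is already idle.
fleet = [("robot-A", 45.0), ("robot-B", 0.0), ("robot-C", 120.0)]

# Designate the robot that frees up soonest as the assistive robot, so
# ongoing operations (and system throughput) are minimally affected.
assistive_id, dispatch_delay = min(fleet, key=lambda entry: entry[1])
print(f"designate {assistive_id}; dispatch in {dispatch_delay:.0f} s")
```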
[0008] In one or more embodiments, the processing unit is configured to command two or more assistive robots to travel in the work area to respective designated positions corresponding to different ground markers in the neighbourhood of the mobile robot based on the last estimated position of the mobile robot in the work area, in response to determination of the odometry error.
[0009] In one or more embodiments, the sensing arrangement comprises an optical recognizer configured to capture an image of a portion of the work area underneath the mobile robot when the mobile robot is operating in the work area to detect the ground marker directly underneath the mobile robot, from the matrix of ground markers. Further, the odometry control arrangement is disposed in signal communication with the optical recognizer and configured to control movement of the mobile robot to enable the mobile robot to traverse the predefined path in the work area, defined by virtually linking two or more ground markers from the matrix of ground markers therein, based on detection of the ground markers by the optical recognizer.
[0010] In one or more embodiments, the processing unit is configured to determine the odometry error in the movement of the mobile robot in the work area based on non-detection of the ground marker, from the matrix of ground markers, by the optical recognizer in accordance with the predefined path in the work area.
[0011] In another aspect, a method for managing a mobile robot operating in a work area comprising a matrix of ground markers is provided. The method comprises estimating a position of the mobile robot based on the ground markers, when the mobile robot is moved in the work area. The method further comprises determining an odometry error in a movement of the mobile robot in the work area based on the estimated position of the mobile robot. The method further comprises commanding an assistive robot to travel in the work area to a designated position corresponding to one of the ground markers in a neighbourhood of the mobile robot based on a last estimated position of the mobile robot in the work area, in response to determination of the odometry error. The method further comprises scanning a space around the mobile robot to generate scan data therefor. The method further comprises processing the generated scan data to detect presence of features of the assistive robot and a relative position of the assistive robot with respect to the mobile robot in the work area. The method further comprises localizing the mobile robot in the work area based on the position of the assistive robot in the work area and the determined relative position of the assistive robot with respect to the mobile robot. The method further comprises determining a nearest ground marker, as part of a predefined path, to the mobile robot and a trajectory for the mobile robot to travel from a current position thereof to the determined nearest ground marker based on the localization of the mobile robot. The method further comprises moving the mobile robot to travel to the determined nearest ground marker by following the determined trajectory.
[0012] In one or more embodiments, the method further comprises selecting one of other mobile robots from a fleet of mobile robots operating in the work area to designate as the assistive robot; and determining a suitable time to command the assistive robot to travel in the work area to the designated position corresponding to one of the ground markers in the neighbourhood of the mobile robot based on at least one of: an operational cycle of the other mobile robot designated as the assistive robot and operational cycles of mobile robots from the fleet of mobile robots operating in the work area.
[0013] In one or more embodiments, the method further comprises commanding two or more assistive robots to travel in the work area to respective designated positions corresponding to different ground markers in the neighbourhood of the mobile robot based on the last estimated position of the mobile robot in the work area, in response to determination of the odometry error.
[0014] In one or more embodiments, the method further comprises determining the odometry error in the movement of the mobile robot in the work area based on non-detection of the ground marker, from the matrix of ground markers, by the optical recognizer in accordance with the predefined path in the work area.
[0015] The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.

BRIEF DESCRIPTION OF THE FIGURES
[0016] For a more complete understanding of example embodiments of the present disclosure, reference is now made to the following descriptions taken in connection with the accompanying drawings in which:
[0017] FIG. 1 illustrates a schematic of a system that may reside on and may be executed by a computer, which may be connected to a network, in accordance with one or more embodiments of the present disclosure;
[0018] FIG. 2 illustrates a schematic of an exemplary computing system for managing a mobile robot operating in a work area, in accordance with one or more embodiments of the present disclosure;
[0019] FIG. 3 illustrates an exemplary implementation of the system for the work area in which a fleet of mobile robots are operated, in accordance with one or more embodiments of the present disclosure;
[0020] FIG. 4 illustrates an exemplary implementation of the system for the work area in which one of the mobile robots is determined to have an odometry error, in accordance with one or more embodiments of the present disclosure;
[0021] FIG. 5 illustrates an exemplary implementation of the system for the work area in which an assistive robot is provided to assist the mobile robot determined to have an odometry error to correct its position and orientation, in accordance with one or more embodiments of the present disclosure;
[0022] FIG. 6 illustrates a flowchart listing steps involved in a method for managing the mobile robot operating in a work area, in accordance with one or more embodiments of the present disclosure.

DETAILED DESCRIPTION
[0023] In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. It will be apparent, however, to one skilled in the art that the present disclosure is not limited to these specific details.
[0024] Reference in this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Further, the terms “a” and “an” herein do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced items. Moreover, various features are described which may be exhibited by some embodiments and not by others. Similarly, various requirements are described which may be requirements for some embodiments but not for other embodiments.
[0025] Furthermore, in the following detailed description of the present disclosure, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. However, it will be understood that the present disclosure may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the present disclosure.
[0026] Embodiments described herein may be discussed in the general context of computer-executable instructions residing on some form of computer-readable storage medium, such as program modules, executed by one or more computers or other devices. By way of example, and not limitation, computer-readable storage media may comprise non-transitory computer-readable storage media and communication media; non-transitory computer-readable media include all computer-readable media except for a transitory, propagating signal. Generally, program modules include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types. The functionality of the program modules may be combined or distributed as desired in various embodiments.
[0027] Some portions of the detailed description that follows are presented and discussed in terms of a process or method. Although steps and sequencing thereof are disclosed in figures herein describing the operations of this method, such steps and sequencing are exemplary. Embodiments are well suited to performing various other steps or variations of the steps recited in the flowchart of the figure herein, and in a sequence other than that depicted and described herein. Some portions of the detailed descriptions that follow are presented in terms of procedures, logic blocks, processing, and other symbolic representations of operations on data bits within a computer memory. These descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. In the present disclosure, a procedure, logic block, process, or the like, is conceived to be a self-consistent sequence of steps or instructions leading to a desired result. The steps are those utilizing physical manipulations of physical quantities. Usually, although not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated in a computer system. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as transactions, bits, values, elements, symbols, characters, samples, pixels, or the like.
[0028] In some implementations, any suitable computer usable or computer readable medium (or media) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. The computer-usable, or computer-readable, storage medium (including a storage device associated with a computing device) may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable medium may include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fibre, a portable compact disc read-only memory (CD-ROM), an optical storage device, a digital versatile disk (DVD), a static random access memory (SRAM), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, a media such as those supporting the internet or an intranet, or a magnetic storage device. Note that the computer-usable or computer-readable medium could even be a suitable medium upon which the program is stored, scanned, compiled, interpreted, or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory. In the context of the present disclosure, a computer-usable or computer-readable, storage medium may be any tangible medium that can contain or store a program for use by or in connection with the instruction execution system, apparatus, or device.
[0029] In some implementations, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. In some implementations, such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. In some implementations, the computer readable program code may be transmitted using any appropriate medium, including but not limited to the internet, wireline, optical fibre cable, RF, etc. In some implementations, a computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
[0030] In some implementations, computer program code for carrying out operations of the present disclosure may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object-oriented programming language such as Java®, Smalltalk, C++ or the like. Java and all Java-based trademarks and logos are trademarks or registered trademarks of Oracle and/or its affiliates. However, the computer program code for carrying out operations of the present disclosure may also be written in conventional procedural programming languages, such as the "C" programming language, PASCAL, or similar programming languages, as well as in scripting languages such as JavaScript, PERL, or Python. In present implementations, the language used for training may be one of Python, TensorFlow™, Bazel, C, or C++. Further, a decoder in the user device (as will be discussed) may use C, C++ or any processor-specific ISA. Furthermore, assembly code inside C/C++ may be utilized for specific operations. Also, an ASR (automatic speech recognition) and G2P decoder, along with the entire user system, can be run on embedded Linux (any distribution), Android, iOS, Windows, or the like, without any limitations. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the internet using an Internet Service Provider). In some implementations, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGAs) or other hardware accelerators, micro-controller units (MCUs), or programmable logic arrays (PLAs) may execute the computer readable program instructions/code by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present disclosure.
[0031] In some implementations, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus (systems), methods and computer program products according to various implementations of the present disclosure. Each block in the flowchart and/or block diagrams, and combinations of blocks in the flowchart and/or block diagrams, may represent a module, segment, or portion of code, which comprises one or more executable computer program instructions for implementing the specified logical function(s)/act(s). These computer program instructions may be provided to a processor of a general-purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the computer program instructions, which may execute via the processor of the computer or other programmable data processing apparatus, create the ability to implement one or more of the functions/acts specified in the flowchart and/or block diagram block or blocks or combinations thereof. It should be noted that, in some implementations, the functions noted in the block(s) may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
[0032] In some implementations, these computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function/act specified in the flowchart and/or block diagram block or blocks or combinations thereof.
[0033] In some implementations, the computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed (not necessarily in a particular order) on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions/acts (not necessarily in a particular order) specified in the flowchart and/or block diagram block or blocks or combinations thereof.
[0034] Referring now to the example implementation of FIG. 1, there is shown a system 100 that may reside on and may be executed by a computer (e.g., computer 12), which may be connected to a network (e.g., network 14) (e.g., the internet or a local area network). Examples of computer 12 may include, but are not limited to, a personal computer(s), a laptop computer(s), mobile computing device(s), a server computer, a series of server computers, a mainframe computer(s), or a computing cloud(s). In some implementations, each of the aforementioned may be generally described as a computing device. In certain implementations, a computing device may be a physical or virtual device. In many implementations, a computing device may be any device capable of performing operations, such as a dedicated processor, a portion of a processor, a virtual processor, a portion of a virtual processor, a portion of a virtual device, or a virtual device. In some implementations, a processor may be a physical processor or a virtual processor. In some implementations, a virtual processor may correspond to one or more parts of one or more physical processors. In some implementations, the instructions/logic may be distributed and executed across one or more processors, virtual or physical, to execute the instructions/logic. Computer 12 may execute an operating system, for example, but not limited to, Microsoft® Windows®; Mac® OS X®; Red Hat® Linux®, or a custom operating system. (Microsoft and Windows are registered trademarks of Microsoft Corporation in the United States, other countries or both; Mac and OS X are registered trademarks of Apple Inc. in the United States, other countries or both; Red Hat is a registered trademark of Red Hat Corporation in the United States, other countries or both; and Linux is a registered trademark of Linus Torvalds in the United States, other countries or both).
[0035] In some implementations, the instruction sets and subroutines of system 100, which may be stored on storage device, such as storage device 16, coupled to computer 12, may be executed by one or more processors (not shown) and one or more memory architectures included within computer 12. In some implementations, storage device 16 may include but is not limited to: a hard disk drive; a flash drive, a tape drive; an optical drive; a RAID array (or other array); a random-access memory (RAM); and a read-only memory (ROM). In some implementations, network 14 may be connected to one or more secondary networks (e.g., network 18), examples of which may include but are not limited to: a local area network; a wide area network; or an intranet, for example.
[0036] In some implementations, computer 12 may include a data store, such as a database (e.g., relational database, object-oriented database, triplestore database, etc.) and may be located within any suitable memory location, such as storage device 16 coupled to computer 12. In some implementations, data, metadata, information, etc. described throughout the present disclosure may be stored in the data store. In some implementations, computer 12 may utilize any known database management system such as, but not limited to, DB2, in order to provide multi-user access to one or more databases, such as the above noted relational database. In some implementations, the data store may also be a custom database, such as, for example, a flat file database or an XML database. In some implementations, any other form(s) of a data storage structure and/or organization may also be used. In some implementations, system 100 may be a component of the data store, a standalone application that interfaces with the above noted data store and/or an applet / application that is accessed via client applications 22, 24, 26, 28. In some implementations, the above noted data store may be, in whole or in part, distributed in a cloud computing topology. In this way, computer 12 and storage device 16 may refer to multiple devices, which may also be distributed throughout the network.
[0037] In some implementations, computer 12 may execute application 20 for managing a mobile robot operating in a work area. In some implementations, system 100 and/or application 20 may be accessed via one or more of client applications 22, 24, 26, 28. In some implementations, system 100 may be a standalone application, or may be an applet / application / script / extension that may interact with and/or be executed within application 20, a component of application 20, and/or one or more of client applications 22, 24, 26, 28. In some implementations, application 20 may be a standalone application, or may be an applet / application / script / extension that may interact with and/or be executed within system 100, a component of system 100, and/or one or more of client applications 22, 24, 26, 28. In some implementations, one or more of client applications 22, 24, 26, 28 may be a standalone application, or may be an applet / application / script / extension that may interact with and/or be executed within and/or be a component of system 100 and/or application 20. Examples of client applications 22, 24, 26, 28 may include, but are not limited to, a standard and/or mobile web browser, an email application (e.g., an email client application), a textual and/or a graphical user interface, a customized web browser, a plugin, an Application Programming Interface (API), or a custom application. The instruction sets and subroutines of client applications 22, 24, 26, 28, which may be stored on storage devices 30, 32, 34, 36, coupled to user devices 38, 40, 42, 44, may be executed by one or more processors and one or more memory architectures incorporated into user devices 38, 40, 42, 44.
[0038] In some implementations, one or more of storage devices 30, 32, 34, 36, may include but are not limited to: hard disk drives; flash drives, tape drives; optical drives; RAID arrays; random access memories (RAM); and read-only memories (ROM). Examples of user devices 38, 40, 42, 44 (and/or computer 12) may include, but are not limited to, a personal computer (e.g., user device 38), a laptop computer (e.g., user device 40), a smart/data-enabled, cellular phone (e.g., user device 42), a notebook computer (e.g., user device 44), a tablet (not shown), a server (not shown), a television (not shown), a smart television (not shown), a media (e.g., video, photo, etc.) capturing device (not shown), and a dedicated network device (not shown). User devices 38, 40, 42, 44 may each execute an operating system, examples of which may include but are not limited to, Android®, Apple® iOS®, Mac® OS X®; Red Hat® Linux®, or a custom operating system.
[0039] In some implementations, one or more of client applications 22, 24, 26, 28 may be configured to effectuate some or all of the functionality of system 100 (and vice versa). Accordingly, in some implementations, system 100 may be a purely server-side application, a purely client-side application, or a hybrid server-side / client-side application that is cooperatively executed by one or more of client applications 22, 24, 26, 28 and/or system 100.
[0040] In some implementations, one or more of client applications 22, 24, 26, 28 may be configured to effectuate some or all of the functionality of application 20 (and vice versa). Accordingly, in some implementations, application 20 may be a purely server-side application, a purely client-side application, or a hybrid server-side / client-side application that is cooperatively executed by one or more of client applications 22, 24, 26, 28 and/or application 20. As one or more of client applications 22, 24, 26, 28, system 100, and application 20, taken singly or in any combination, may effectuate some or all of the same functionality, any description of effectuating such functionality via one or more of client applications 22, 24, 26, 28, system 100, application 20, or combination thereof, and any described interaction(s) between one or more of client applications 22, 24, 26, 28, system 100, application 20, or combination thereof to effectuate such functionality, should be taken as an example only and not to limit the scope of the disclosure.
[0041] In some implementations, one or more of users 46, 48, 50, 52 may access computer 12 and system 100 (e.g., using one or more of user devices 38, 40, 42, 44) directly through network 14 or through secondary network 18. Further, computer 12 may be connected to network 14 through secondary network 18, as illustrated with phantom link line 54. System 100 may include one or more user interfaces, such as browsers and textual or graphical user interfaces, through which users 46, 48, 50, 52 may access system 100.
[0042] In some implementations, the various user devices may be directly or indirectly coupled to network 14 (or network 18). For example, user device 38 is shown directly coupled to network 14 via a hardwired network connection. Further, user device 44 is shown directly coupled to network 18 via a hardwired network connection. User device 40 is shown wirelessly coupled to network 14 via wireless communication channel 56 established between user device 40 and wireless access point (i.e., WAP) 58, which is shown directly coupled to network 14. WAP 58 may be, for example, an IEEE 802.11a, 802.11b, 802.11g, Wi-Fi®, RFID, and/or Bluetooth™ (including Bluetooth™ Low Energy) device that is capable of establishing wireless communication channel 56 between user device 40 and WAP 58. User device 42 is shown wirelessly coupled to network 14 via wireless communication channel 60 established between user device 42 and cellular network / bridge 62, which is shown directly coupled to network 14.
[0043] In some implementations, some or all of the IEEE 802.11x specifications may use Ethernet protocol and carrier sense multiple access with collision avoidance (i.e., CSMA/CA) for path sharing. The various 802.11x specifications may use phase-shift keying (i.e., PSK) modulation or complementary code keying (i.e., CCK) modulation, for example. Bluetooth™ (including Bluetooth™ Low Energy) is a telecommunications industry specification that allows, e.g., mobile phones, computers, smart phones, and other electronic devices to be interconnected using a short-range wireless connection. Other forms of interconnection (e.g., Near Field Communication (NFC)) may also be used.
[0044] For the purposes of the present disclosure, the system 100 may include a fleet management system. Herein, FIG. 2 is a block diagram of an example of a computing system representing the fleet management system 200 capable of implementing embodiments according to the present disclosure, with the two terms being interchangeably used without any limitations. The fleet management system 200 is implemented for issuing commands for managing and controlling operations of a fleet of mobile robots (as will be described later in more detail), which, in turn, may be utilized in a warehouse environment, a manufacturing plant and the like. In one embodiment, the application 20 for managing a mobile robot as described above may be executed as a part of the fleet management system 200 as described herein. Thereby, for example in case of a warehouse, the system 100 may be a broader system such as the warehouse management system (WMS) as known in the art, in which the fleet management system 200 may be executed for managing and controlling operations of a fleet of mobile robots. Hereinafter, the terms “system 100” and “fleet management system 200” have been broadly interchangeably used to represent means for managing and controlling operations of a fleet of mobile robots in the warehouse environment, without any limitations.
[0045] In the example of FIG. 2, the fleet management system 200 includes a processing unit 205 for running software applications (such as, the application 20 of FIG. 1) and optionally an operating system. Memory 210 stores applications and data for use by the processing unit 205. Storage 215 provides non-volatile storage for applications and data and may include fixed disk drives, removable disk drives, flash memory devices, and CD-ROM, DVD-ROM or other optical storage devices. An optional user input device 220 includes devices that communicate user inputs from one or more users to the fleet management system 200 and may include keyboards, mice, joysticks, touch screens, etc. A communication or network interface 225 is provided which allows the fleet management system 200 to communicate with other computer systems via an electronic communications network, including wired and/or wireless communication and including an Intranet or the Internet. In one embodiment, the fleet management system 200 receives instructions and user inputs from a remote computer through communication interface 225. Communication interface 225 can comprise a transmitter and receiver for communicating with remote devices. An optional display device 250 may be provided which can be any device capable of displaying visual information in response to a signal from the fleet management system 200. The components of the fleet management system 200, including the processing unit 205, the memory 210, the data storage 215, the user input devices 220, the communication interface 225, and the display device 250, may be coupled via one or more data buses 260.
[0046] In the embodiment of FIG. 2, a graphics system 230 may be coupled with the data bus 260 and the components of the fleet management system 200. The graphics system 230 may include a physical graphics processing unit (GPU) 235 and graphics memory. The GPU 235 generates pixel data for output images from rendering commands. The physical GPU 235 can be configured as multiple virtual GPUs that may be used in parallel (concurrently) by a number of applications or processes executing in parallel. For example, mass scaling processes for rigid bodies or a variety of constraint solving processes may be run in parallel on the multiple virtual GPUs. Graphics memory may include a display memory 240 (e.g., a framebuffer) used for storing pixel data for each pixel of an output image. In another embodiment, the display memory 240 and/or additional memory 245 may be part of the memory 210 and may be shared with the processing unit 205. Alternatively, the display memory 240 and/or additional memory 245 can be one or more separate memories provided for the exclusive use of the graphics system 230. In another embodiment, graphics system 230 includes one or more additional physical GPUs 255, similar to the GPU 235. Each additional GPU 255 may be adapted to operate in parallel with the GPU 235. Each additional GPU 255 generates pixel data for output images from rendering commands. Each additional physical GPU 255 can be configured as multiple virtual GPUs that may be used in parallel (concurrently) by a number of applications or processes executing in parallel, e.g., processes that solve constraints. Each additional GPU 255 can operate in conjunction with the GPU 235, for example, to simultaneously generate pixel data for different portions of an output image, or to simultaneously generate pixel data for different output images. Each additional GPU 255 can be located on the same circuit board as the GPU 235, sharing a connection with the GPU 235 to the data bus 260, or each additional GPU 255 can be located on another circuit board separately coupled with the data bus 260. Each additional GPU 255 can also be integrated into the same module or chip package as the GPU 235. Each additional GPU 255 can have additional memory, similar to the display memory 240 and additional memory 245, or can share the memories 240 and 245 with the GPU 235. It is to be understood that the circuits and/or functionality of GPU as described herein could also be implemented in other types of processors, such as general-purpose or other special-purpose coprocessors, or within a CPU.
[0047] Referring to FIG. 3, illustrated is an implementation of the system 100 for a work area 300 in which a fleet of mobile robots are operated, in accordance with one or more embodiments of the present disclosure. Herein, the work area 300 is shown to include the fleet of mobile robots operating therein. In the illustration of FIG. 3, two mobile robots have been shown and represented by numerals 302, 303, although it may be appreciated that there may be more than two mobile robots, as part of the fleet of mobile robots, operating in the work area 300. Hereinafter, the embodiments of the present disclosure have been described with reference to the mobile robot 302 in terms of the problem being solved, and with reference to the mobile robot 303 as part of the disclosed solution. It may be appreciated that the work area 300 may be part of a larger floor space, e.g., in a warehouse environment (not shown) or the like. The mobile robot 302 (as well as the mobile robot 303) may be utilized for various operations in the work area 300, like transferring goods, such as cartons, in the work area 300, which is typical, e.g., for the warehouse environment. The mobile robot 302 may be configured to perform at least one operation in a cycle, which may involve the mobile robot 302 travelling from one position in the work area 300 to another, and this may be defined as an "operational cycle" of the mobile robot 302.
[0048] In the present embodiments, as shown in FIG. 3, the work area 300 includes a matrix of ground markers 306. In other words, the ground markers 306 are arranged in a manner to define a rectangular array of ground markers 306. Herein, the term "ground marker" is meant to include any number and all types of markers that may serve the distinguishing function, either in isolation or in combination. Such ground markers may include, but are not limited to, geometric shapes or characters that superficially and/or structurally alter the appearance of the work area 300 and that may be easily recognized by compatible sensing means (as discussed later in the description) provided in the mobile robot 302. In the present illustration, the ground markers 306 are shown as regularly sized squares; however, other shapes including, but not limited to, circular, hexagonal, etc. may be contemplated without any limitations. Further, in some examples, each of the ground markers 306 may be unique. This may be achieved by providing the ground markers 306 with unique identification codes, like QR codes, barcodes, etc.
[0049] As may be seen from FIG. 3, the system 100 comprises a respective grid 308 defined for one or more of the ground markers 306 from the matrix of ground markers 306 in the work area 300, with each of the defined grids 308 having the corresponding ground marker 306 positioned inside thereof. That is, as may be seen, the matrix of ground markers 306 virtually divides the work area 300 into the plurality of grids 308. Herein, each such grid 308 may, generally, be equal in area and may further, generally, have the same size as the mobile robot 302. That is, the grid 308 may be defined to have an area equivalent to an area of the mobile robot 302. In an example, the grid 308 may have an area of 1 metre by 1 metre. In an embodiment, each of the defined grids 308 has a square shape. Further, the corresponding ground marker 306 is positioned at a centre (not labelled) of the square shape of the respective defined grid 308. Herein, the centre may be a diagonal centre of the square shape of the respective defined grid 308. It may be appreciated that each of the defined grids 308 may have other polygonal shapes which may provide a centre at which the respective ground marker 306 may be placed, including rectangular, hexagonal, octagonal or the like without departing from the spirit and the scope of the present disclosure.
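For illustration, the relation between a grid index and its marker centre may be expressed as in the following sketch; the 1 metre grid pitch follows the example above, while the marker identifiers and the 4 x 4 extent are assumptions.

```python
GRID = 1.0  # metres; grid pitch, per the 1 m x 1 m example above

# Map each unique marker id (e.g., a decoded QR payload) to the absolute
# position of its grid centre in the work area.
marker_map = {
    f"M{row}-{col}": (col * GRID + GRID / 2, row * GRID + GRID / 2)
    for row in range(4)
    for col in range(4)
}
print(marker_map["M2-1"])  # (1.5, 2.5): centre of the grid in row 2, column 1
```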
[0050] Herein, the system 100 may define a path, i.e., a predefined path (such as, an exemplary predefined path 310 as shown in FIG. 3) to be followed by the mobile robot 302 in the work area 300. The predefined path 310 may be defined by virtually linking multiple ground markers 306 (as a virtual track), in various possible combinations, for the mobile robot 302 to travel thereon. Typically, the predefined path 310 as provided by the system 100 is a navigation path including a set of straight lines passing through centres of the ground markers 306, in the matrix of ground markers 306 in the work area 300. Such arrangement using the ground markers may be contemplated by a person skilled in the art and thus has not been described further for the brevity of the present disclosure.
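A sketch of such a predefined path, reusing the hypothetical marker map above: the navigation path reduces to the polyline of straight segments through the linked marker centres.

```python
marker_map = {"M0-0": (0.5, 0.5), "M0-1": (1.5, 0.5), "M1-1": (1.5, 1.5)}

# The predefined path as an ordered sequence of virtually linked markers.
path = ["M0-0", "M0-1", "M1-1"]

# Straight-line segments passing through the marker centres.
waypoints = [marker_map[marker] for marker in path]
segments = list(zip(waypoints, waypoints[1:]))
print(segments)  # [((0.5, 0.5), (1.5, 0.5)), ((1.5, 0.5), (1.5, 1.5))]
```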
[0051] The system 100 includes an odometry control arrangement 320 (as schematically shown in FIG. 3). The odometry control arrangement 320 is provided in the mobile robot 302. Herein, “odometry” refers to the use of data from motion sensors to estimate change in position over time. The odometry control arrangement 320 is configured to control movement of the mobile robot 302 in the work area 300 based on the ground markers 306 therein. In particular, the odometry control arrangement 320 may be implemented to control movements of the mobile robot 302 in the work area 300, such that the mobile robot 302 may be able to follow the predefined path 310 (as discussed earlier). In the present embodiments, the odometry control arrangement 320 is configured to control movement of the mobile robot 302 in the work area 300 based on the ground markers 306 positioned (laid out) therein. Specifically, the odometry control arrangement 320 in the mobile robot 302 may enable the mobile robot 302 to move (change its position) in the work area 300 from a current ground marker 306 to a next ground marker 306, and thereby follow the predefined path 310 as provided by the system 100.
[0052] It may be appreciated that the odometry control arrangement 320 may be in the form of a controller which may be any processing device, system or part thereof that controls at least one operation of the device. Such controller may be implemented in hardware, firmware or software, or some combination of at least two of the same. It should be noted that the functionality associated with any particular controller may be centralized or distributed, whether locally or remotely. Such controller may be a multi-core processor, a single core processor, or a combination of one or more multi-core processors and one or more single core processors. For example, the one or more processors may be embodied as one or more of various processing devices, such as a coprocessor, a microprocessor, a controller, a digital signal processor (DSP), a processing circuitry with or without an accompanying DSP, or various other processing devices including integrated circuits such as, for example, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like. Further, the memory may include one or more non-transitory computer-readable storage media that can be read or accessed by other components in the device. The memory may be any computer-readable storage media, including volatile and/or non-volatile storage components, such as optical, magnetic, organic or other memory or disc storage, which can be integrated in whole or in part with the device. In some examples, the memory may be implemented using a single physical device (e.g., optical, magnetic, organic or other memory or disc storage unit), while in other embodiments, the memory may be implemented using two or more physical devices without any limitations.
[0053] The system 100 also includes a sensing arrangement 330. The sensing arrangement 330 is configured to estimate a position of the mobile robot 302 based on the ground markers 306, when the mobile robot 302 is moved in the work area 300 using the odometry control arrangement 320. In other words, the sensing arrangement 330 is configured to estimate a position of the mobile robot 302 with respect to one of the ground markers 306, from the matrix of ground markers 306, in vicinity thereof when the mobile robot 302 is moved using the odometry control arrangement 320 during the operational cycle thereof. In particular, the sensing arrangement 330 may determine a relative position of the mobile robot 302 with respect to the ground marker 306 in vicinity thereof. Herein, the term “the ground marker 306 in vicinity thereof” means either the ground marker 306 from which the mobile robot 302 may have started to be moved to the next ground marker 306 as per the predefined path 310, or the next ground marker 306 which the mobile robot 302 is supposed to reach as per the predefined path 310.
[0054] In an embodiment, the sensing arrangement 330 is provided in the mobile robot 302. That is, the sensing arrangement 330 may be designed to monitor movement of the mobile robot 302 from within to estimate the position of the mobile robot 302 relative to its starting location in the work area 300. In the present embodiments, the sensing arrangement 330 may include an optical recognizer (marked by the same numeral 330 and generally represented in FIG. 3). Such optical recognizer 330 may be provided in the mobile robot 302. In the present embodiments, the optical recognizer 330 is configured to capture an image of a portion of the work area 300 underneath the mobile robot 302 when the mobile robot 302 is operating in the work area 300. The optical recognizer 330 may be configured to recognize presence of the ground markers 306, specifically the ground marker 306 underneath the corresponding mobile robot 302, based on the captured image. In an example embodiment, with each of the ground markers 306 being unique, by detecting the unique ground marker 306 (i.e., detecting the identification code, like a QR code, bar code, etc.), and with prior information about the absolute position of each unique ground marker 306 in the work area 300, the present system 100 may be able to estimate the position of the corresponding mobile robot 302 in the work area 300. In the present example, the optical recognizer 330 may be in the form of, but not limited to, a camera (or generally any optical arrangement) provided in a body of the mobile robot 302 and pointed to a floor of the work area 300, and/or a scanner configured to distinguish colours when the ground markers, including the ground markers 306, may be of a colour substantially different from that of the floor of the work area 300, or the like. Such optical recognizer 330 may be contemplated by a person skilled in the art and thus has not been described in any more detail herein for the brevity of the present disclosure.
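One possible realisation of such an optical recognizer is sketched below using OpenCV's QR-code detector; the marker map, payloads, and function name are assumptions for illustration, and the disclosure does not mandate any particular library.

```python
import cv2  # OpenCV; used here only as one possible QR-code decoder

# Hypothetical prior knowledge: decoded payload -> absolute marker position.
marker_map = {"M2-1": (1.5, 2.5)}

def detect_ground_marker(downward_image):
    """Decode the ground marker directly underneath the robot, if any."""
    detector = cv2.QRCodeDetector()
    payload, corners, _ = detector.detectAndDecode(downward_image)
    if payload and payload in marker_map:
        return marker_map[payload]  # absolute position of the detected marker
    return None  # nothing decoded: a possible symptom of an odometry error
```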
[0055] In some examples, the optical recognizer 330 may be configured to monitor movement of the mobile robot 302 to estimate the position of the mobile robot 302 in the work area 300. The optical recognizer 330 may recognize the ground markers 306, and by keeping count of the number of such recognized ground markers 306 and known changes in direction as per the predefined path 310, the processing unit 205 may use the recognized ground markers 306 to estimate the position of the mobile robot 302 (either absolute or relative to its known starting location) in the work area 300. Herein, a sensitivity of the optical recognizer 330 may be dependent on a field-of-view (FoV) of the optical recognizer 330, including the horizontal FoV as well as the vertical FoV therefor. In general, the greater the sensitivity of the optical recognizer 330, the higher the density of ground markers 306 that could be used in the work area 300. Further, the geometric area of the ground markers 306 may be fixed based on the sensitivity of the optical recognizer 330; i.e., the smaller the geometric area of the ground markers 306 that may be recognized by the optical recognizer 330, the higher the density of ground markers 306 that could be used in the matrix of ground markers 306 in the work area 300.
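As a small sketch of how the recognizer's FoV ties into position estimation (the image size, floor coverage, and detected pixel coordinates are all assumptions): the offset of the decoded marker's centre from the image centre gives the robot's offset from the grid centre.

```python
# Assumed camera model: the downward-facing image spans 0.30 m of floor
# across its width, at a uniform scale.
IMG_W, IMG_H = 640, 480          # image size in pixels
METRES_PER_PX = 0.30 / IMG_W     # assumed uniform pixel scale

# Hypothetical centre of the decoded marker, in pixel coordinates.
marker_px = (352, 210)

# Offset of the robot (image centre) from the marker centre, in metres.
dx = (IMG_W / 2 - marker_px[0]) * METRES_PER_PX
dy = (IMG_H / 2 - marker_px[1]) * METRES_PER_PX
print(f"robot offset from marker centre: {dx:+.3f} m, {dy:+.3f} m")
```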
[0056] In other examples, the sensing arrangement 330 may include one or more of: a set of wheel encoders associated with drive wheels of the mobile robot 302, an odometer, or an inertial measurement unit, for estimating the position of the mobile robot 302. That is, in one example, for a wheel-driven mobile robot 302, the wheel encoders (not shown) may be associated with one or more of its drive wheels, and such wheel encoders may determine the distance travelled by the mobile robot 302 and thereby estimate the position of the mobile robot 302 relative to its known starting location in the work area 300. In another example, the odometer (not shown) associated with the mobile robot 302 (or specifically, the drive wheels of the mobile robot 302) may perform the same function to estimate the position of the mobile robot 302 relative to its known starting location in the work area 300. In still another example, the inertial measurement unit (IMU) (not shown) associated with the mobile robot 302 may perform the same function (as may be contemplated by a person skilled in the art) to estimate the position of the mobile robot 302 relative to its known starting location in the work area 300.
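The wheel-encoder case admits the classic differential-drive dead-reckoning update, sketched below; the wheel radius, wheelbase, and encoder resolution are made-up figures.

```python
import math

R, B, N = 0.05, 0.40, 1024  # wheel radius (m), wheelbase (m), ticks per rev

def integrate(pose, left_ticks, right_ticks):
    """One dead-reckoning update from incremental encoder ticks."""
    x, y, theta = pose
    dl = 2 * math.pi * R * left_ticks / N   # left wheel travel (m)
    dr = 2 * math.pi * R * right_ticks / N  # right wheel travel (m)
    d, dtheta = (dl + dr) / 2.0, (dr - dl) / B
    return (x + d * math.cos(theta + dtheta / 2.0),
            y + d * math.sin(theta + dtheta / 2.0),
            theta + dtheta)

pose = integrate((0.0, 0.0, 0.0), 3260, 3260)  # equal ticks: straight line
print(pose)  # ~(1.0, 0.0, 0.0); small tick errors accumulate as odometry error
```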
[0057] In some embodiments, the sensing arrangement 330 may be external to the mobile robot 302. That is, the sensing arrangement 330 may not be disposed in the mobile robot 302 itself, but may be located outside the mobile robot 302. In such a case, the sensing arrangement 330 may monitor the movement of the mobile robot 302 to estimate the position of the mobile robot 302 relative to its starting location in the work area 300. For this purpose, the sensing arrangement 330 includes one or more imaging devices (not shown) arranged to provide a view covering at least the work area 300 to capture image frames thereof while the mobile robot 302 may be performing operations therein. In an example, the imaging device may include a camera or the like. By using image analysis on the captured image frames from the imaging device, the sensing arrangement 330 may estimate a position of the mobile robot 302 with respect to one of the ground markers 306, from the matrix of ground markers 306, in vicinity thereof, as would be understood by a person skilled in the art.
[0058] In some embodiments, the sensing arrangement 330 may be disposed both in the mobile robot 302 as well as external to the mobile robot 302. It may be understood that in case the optical recognizer 330 (as part of the sensing arrangement 330 internal to the mobile robot 302) is damaged, which may be causing an odometry error (as discussed in the subsequent paragraphs), then the external sensing arrangement 330 may be used to estimate the position of the mobile robot 302 with respect to one of the ground markers 306, from the matrix of ground markers 306, in vicinity thereof when the mobile robot 302 is moved using the odometry control arrangement 320 during the operational cycle thereof. In general, the odometry control arrangement 320 is disposed in signal communication with the optical recognizer 330 and configured to control movement of the mobile robot 302 to enable the mobile robot 302 to traverse the predefined path 310 in the work area 300, defined by virtually linking two or more ground markers 306 from the matrix of ground markers 306 therein, based on detection of the ground markers 306 by the optical recognizer 330.
[0059] The system 100 further includes a scanning arrangement 340. In an embodiment, the scanning arrangement 340 is provided in the mobile robot 302. The scanning arrangement 340 is configured to scan space around the mobile robot 302 to generate scan data. Herein, the scan data comprises features of one or more objects and relative positions of the one or more objects with respect to the mobile robot 302 in a vicinity of the mobile robot 302. In an example, the scanning arrangement 340 may be a point (one-dimensional) scanning device or a three-dimensional scanning device without any limitations. In the present examples, the scanning arrangement 340 may be a laser scanner as known in the art. The laser scanner may be in the form of a light detection and ranging (LIDAR) device. Such a LIDAR device includes a light source configured to emit light within a wavelength range. The light source may include a fiber laser. The LIDAR device also includes a scanning portion configured to direct the emitted light in a reciprocating manner about a first axis, and a plurality of detectors configured to sense light within the wavelength range. The LIDAR device further includes a rotational mount configured to rotate (such as through 360 degrees) about a second axis. At least the scanning portion is disposed within a housing, and a wall of the housing includes a light filter configured to allow light within the wavelength range to propagate through the light filter. The LIDAR device is additionally coupled to the processing unit 205, which is configured to receive target information indicative of at least one of: a type of object, a size of an object, a shape of an object, a position, a location, or an angle range. In general, the mobile robot 302 may be configured to perform a pre-defined sequence of motions for completing the scan operation. Such a scanning arrangement 340 may be contemplated by a person skilled in the art and thus further details have not been explained herein for the brevity of the present disclosure.
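For illustration only, scan data of the kind described may be represented as Cartesian points computed from range-and-bearing readings of a planar sweep; the function and parameter names below are assumptions and do not describe any particular LIDAR device of the disclosure.

```python
import math

# Illustrative sketch only: converting a planar LIDAR sweep (ranges sampled
# at uniform angular increments) into (x, y) points in the robot frame, one
# possible form of the "scan data" the scanning arrangement could produce.

def polar_scan_to_points(ranges, angle_min, angle_increment):
    """Convert range readings into (x, y) points in the robot frame."""
    points = []
    for i, r in enumerate(ranges):
        angle = angle_min + i * angle_increment
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points

# Example: three beams over a 90-degree sweep, all returning 2 m.
print(polar_scan_to_points([2.0, 2.0, 2.0], angle_min=0.0,
                           angle_increment=math.pi / 4))
```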
[0060] As discussed, most systems involving the mobile robots (such as the mobile robot 302) require that the movements of the mobile robot 302 be accurate in the work area (such as the work area 300) for proper operation. That is, the mobile robot 302 may need to accurately follow the predefined path (such as the predefined path 310) using the ground markers (such as the ground markers 306). However, due to operational wear, mechanical degradation, electrical degradation, temperature variation, etc., the mobile robot 302 may accumulate position error over its various operational cycles. As illustrated in FIG. 4, this can cause the mobile robot to deviate from the predefined path 310 during its operation or to have a certain bias while following the predefined path 310. Further, the mobile robot 302 may deviate from its predefined path 310 due to some obstruction, collision, etc. in the work area 300 during the operational cycle thereof. Therefore, the mobile robot 302 needs to be reoriented over its operational cycle to correct these variations, such that the mobile robot 302 could accurately follow the predefined path 310 so as to achieve the defined purpose therefor.
[0061] According to embodiments of the present disclosure, as illustrated in FIG. 5, the processing unit 205 may assist with re-orienting the mobile robot 302 (from its disoriented position as shown in FIG. 4). For this purpose, the processing unit 205 is first configured to determine an odometry error in the movement of the mobile robot 302 in the work area 300 based on the estimated position of the mobile robot 302. Herein, the term “odometry error” represents a navigation error of the mobile robot 302 in the work area 300. As discussed, the predefined path 310 as provided by the system 100 is a navigation path including a set of straight lines passing through centres of the ground markers 306, in the matrix of ground markers 306 in the work area 300. In the illustration of FIG. 4, the odometry error has been depicted as a deviated position (represented by the numeral 402) for the mobile robot 302 from the predefined path 310. Such navigation error may occur when the mobile robot 302 deviates from such straight lines (e.g., missing the centres of the ground markers 306) while supposedly following the predefined path 310 during the operational cycle thereof. In an example implementation, the sensing arrangement 330 may utilize the processing unit 205 of the system 100 for performing the computations and calculations (as described in the following paragraphs) required for confirming the odometry error in the movement of the mobile robot 302 as per embodiments of the present disclosure.
[0062] In an embodiment of the present disclosure, the processing unit 205 is configured to determine the odometry error in the movement of the mobile robot 302 in the work area 300 based on non-detection of the ground marker 306, from the matrix of ground markers 306, by the optical recognizer 330 in accordance with the predefined path 310 in the work area 300. That is, the system 100 may conclude that there may be an odometry error in the movement of the mobile robot 302 in the work area 300 if the ground marker 306, from the matrix of ground markers 306, is not detected by the sensing arrangement 330 (or specifically the optical recognizer 330) when it is expected as per the predefined path 310 being traversed as defined by virtually linking the ground markers 306. It may be appreciated that, in this case, the processing unit 205 may be embodied as a controller (as described above) in the mobile robot 302 itself. Further, the processing unit 205 may send a signal to the broader fleet management system 200 indicating the presence of the odometry error, so that corrective action may be taken thereby as described hereinafter.
[0063] In some embodiments, the sensing arrangement 330 is configured to determine one or more of: a cross track error, a number of missed ground markers, and an average of goal reaching tolerances, based on the estimated position of the mobile robot 302 with respect to one of the ground markers 306 during the operational cycle thereof, to determine the odometry error. In an example, the odometry error in the movement of the mobile robot 302 may be determined based on lateral error (cross track error), which is the distance between the geometric centre of the optical recognizer 330 on the mobile robot 302 and the closest point on the predefined path 310. Lateral error is the principal measure of how close the position of the mobile robot 302 is to the desired position along the predefined path 310. In an example, the odometry error may be determined based on the number of emergency stop incidents of the mobile robot 302 during the operational cycle. Such emergency stop incidents of the mobile robot 302 may occur when the mobile robot 302 stops or has to be stopped (either manually or automatically by the system 100) due to improper functioning of the odometry control arrangement 320 therein. In another example, the odometry error may be determined based on the average goal reaching error of the mobile robot 302. Herein, the average goal reaching error may include a longitudinal error, which is defined as a difference between the centre of the ground marker 306 and the centre of the optical recognizer 330 along a direction of movement of the mobile robot 302, and an orientation error, which is defined as the angular difference between the heading of the mobile robot 302 (in the direction of movement) and the ground marker 306. In yet another example, the odometry error may be determined based on the number of ground markers 306 missed by the mobile robot 302 during its operational cycle while supposedly following the predefined path 310. During conventional operation, the ground marker 306 is to be detected at a fixed/variable set of distances, and if the ground marker 306 is not detected (e.g., by the optical recognizer 330) at the said fixed/variable distance (or the ground marker count threshold) during the operational cycle in the work area 300, the incident is counted as a missed ground marker. It may be appreciated that all these determined errors may generally be determined based on the estimated position of the mobile robot 302 with respect to one of the ground markers 306. These different errors are recorded and quantified by the sensing arrangement 330 to determine average/rolling values of such errors, in isolation or in combination, during the operational cycle of the mobile robot 302, and if such an average exceeds a predetermined threshold, the sensing arrangement 330 may confirm the odometry error in the movement of the mobile robot 302 in the work area 300.
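The rolling-average threshold check described above could be sketched as follows; the window size, threshold values, and class name are illustrative assumptions only, not values prescribed by the disclosure.

```python
from collections import deque

# Illustrative sketch only: rolling averages of cross-track error, goal-reach
# error, and a missed-marker count, compared against assumed thresholds to
# confirm an odometry error in the spirit of paragraph [0063].

class OdometryErrorMonitor:
    def __init__(self, window=50, cross_track_limit=0.05,
                 goal_error_limit=0.03, missed_marker_limit=3):
        self.cross_track = deque(maxlen=window)  # metres
        self.goal_errors = deque(maxlen=window)  # metres
        self.missed_markers = 0
        self.cross_track_limit = cross_track_limit
        self.goal_error_limit = goal_error_limit
        self.missed_marker_limit = missed_marker_limit

    def record(self, cross_track_error, goal_error, marker_missed):
        """Record one sample of the monitored error quantities."""
        self.cross_track.append(abs(cross_track_error))
        self.goal_errors.append(abs(goal_error))
        if marker_missed:
            self.missed_markers += 1

    def odometry_error_confirmed(self):
        """True if any rolling average or count exceeds its threshold."""
        avg = lambda q: sum(q) / len(q) if q else 0.0
        return (avg(self.cross_track) > self.cross_track_limit
                or avg(self.goal_errors) > self.goal_error_limit
                or self.missed_markers > self.missed_marker_limit)

# Example: repeated large lateral deviations trip the monitor.
monitor = OdometryErrorMonitor()
for _ in range(10):
    monitor.record(cross_track_error=0.12, goal_error=0.01, marker_missed=False)
print(monitor.odometry_error_confirmed())  # -> True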
[0064] Further, the processing unit 205 is configured to command an assistive robot (in this case, the mobile robot 303 as explained later) to travel in the work area 300 to a designated position (as represented by the numeral 502 in FIG. 5) corresponding to one of the ground markers 306 in a neighbourhood of the mobile robot 302 based on a last estimated position of the mobile robot 302 in the work area 300, in response to determination of the odometry error. In an embodiment, the processing unit 205 is configured to select one of the other mobile robots (in this case, the mobile robot 303) from the fleet of mobile robots operating in the work area 300 to designate as the assistive robot. Herein, the mobile robot 303 is selected based on the operational cycle thereof. That is, if one of the other mobile robots in the fleet of mobile robots operating in the work area 300, like the mobile robot 303, has completed its operational cycle, then such mobile robot 303 may be designated as the assistive robot. Hereinafter, the assistive robot has also been referenced by the same numeral as the said other designated mobile robot 303 without any limitations. Subsequently, the assistive robot 303 may help the mobile robot 302 to correct its orientation, as discussed in detail in the following paragraphs.
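A minimal sketch of this designation step follows, assuming a hypothetical fleet record with `idle` and `position` fields; the disclosure does not prescribe this data structure or the nearest-idle selection rule.

```python
import math

# Illustrative sketch only: designating as assistive robot the idle fleet
# member closest to the lost robot's last estimated position. The fleet
# record structure and field names are assumptions for illustration.

def select_assistive_robot(fleet, last_estimated_position):
    """fleet: list of dicts like {"id": ..., "idle": bool, "position": (x, y)}."""
    lx, ly = last_estimated_position
    idle = [r for r in fleet if r["idle"]]
    if not idle:
        return None  # no robot has completed its operational cycle yet
    return min(idle, key=lambda r: math.hypot(r["position"][0] - lx,
                                              r["position"][1] - ly))

fleet = [{"id": 303, "idle": True, "position": (4.0, 1.0)},
         {"id": 304, "idle": False, "position": (1.0, 1.0)}]
print(select_assistive_robot(fleet, (3.0, 0.0))["id"])  # -> 303
```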
[0065] First, the assistive robot 303 is commanded to travel in the work area 300 to the designated position 502 corresponding to one of the ground markers 306 in the neighbourhood of the mobile robot 302 based on the last estimated position of the mobile robot 302 in the work area 300, in response to determination of the odometry error. That is, when the odometry error is confirmed for the mobile robot 302, the system 100 may designate the assistive robot 303 and generate a path (represented by the numeral 504, and which may be different from its original predefined path) for the assistive robot 303 to reach the position in the neighbourhood (vicinity) of the mobile robot 302 based on its last estimated position, that is, the last ground marker 306 that the mobile robot 302 may have detected in its operational cycle using the corresponding optical recognizer 330, before or while the mobile robot 302 deviated from the predefined path 310. It may be understood that the position for the assistive robot 303 may be a different ground marker 306 which may be adjacent to the last detected ground marker 306 for the mobile robot 302, so that the mobile robot 302 and the assistive robot 303 may not collide with each other. In some examples, the assistive robot 303 may employ the scanning arrangement (similar to the scanning arrangement 340) therein to scan the surrounding space to ensure that it may not collide with the mobile robot 302 while travelling to the designated position 502 therefor.
[0066] In an embodiment, the processing unit 205 is configured to determine a suitable time to command the assistive robot 303 to travel in the work area 300 to the designated position 502 corresponding to one of the ground markers 306 in the neighbourhood of the mobile robot 302 based on at least one of: an operational cycle of the other mobile robot designated as the assistive robot 303 and operational cycles of mobile robots from the fleet of mobile robots operating in the work area 300. That is, the assistive robot 303 may be given the command to travel to the designated position 502 (i.e., a global pose for the assistive robot 303) only when the current operational cycle thereof has been completed. Further, the command may be given in consideration of the other mobile robots operating in the work area 300, such that the assistive robot 303 may not hinder (i.e., come into) the path of any of the other mobile robots operating in the work area 300 while travelling to the designated position 502 by following the defined path 504 therefor. In an embodiment, the processing unit 205 is configured to command two or more assistive robots 303 to travel in the work area 300 to respective designated positions 502 corresponding to different ground markers 306 in the neighbourhood of the mobile robot 302 based on the last estimated position of the mobile robot 302 in the work area 300, in response to determination of the odometry error. That is, instead of one, two or more other mobile robots may be designated as assistive robots 303 and positioned in the neighbourhood of the mobile robot 302 which may have deviated from the predefined path 310. Such a configuration of using two or more assistive robots 303 may be helpful for more quickly and efficiently correcting the orientation and position of the lost mobile robot 302 in the work area 300, as would be contemplated by a person skilled in the art by reading the following disclosure.
[0067] Further, the processing unit 205 is configured to implement the scanning arrangement 340 in the mobile robot 302 to scan the space around the mobile robot 302 to generate scan data therefor. Herein, the scanning process has been represented by a scanned space (referenced by the numeral 506 in FIG. 5). The mobile robot 302 may receive communication from the fleet management system 200 that the assistive robot 303 has arrived, so that the scanning arrangement 340 may be initialized. Also, the processing unit 205 is configured to process the generated scan data to detect presence of features of the assistive robot 303 and a relative position of the assistive robot 303 with respect to the mobile robot 302 in the work area 300. As discussed, by using the scanning arrangement 340, the mobile robot 302 is able to detect features of one or more objects and relative positions of the one or more objects with respect thereto in the vicinity thereof. Thus, by using the scanning arrangement 340, the mobile robot 302 would be able to detect any assistive robot(s) 303 in the vicinity thereof, by distinguishing the assistive robot(s) 303 from other objects detected during the scanning, as the features of the assistive robot(s) 303 may be pre-known to the scanning arrangement 340 of the mobile robot 302. Further, once the assistive robot(s) 303 have been detected, the scanning arrangement 340 may determine the relative position(s) of the assistive robot(s) 303 with respect to the mobile robot 302 in which it may be disposed. Such techniques for the scanning arrangement 340 in the mobile robot 302 may be contemplated by a person skilled in the art and thus have not been described in any further detail. By using two or more assistive robots 303, the mobile robot 302 may be able to determine its relative position with respect to each one of the implemented two or more assistive robots 303.
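By way of example only, one simple way to distinguish the assistive robot in the scan is to cluster scan points and match each cluster's extent against the pre-known robot footprint, taking the matching cluster's centroid as the relative position. The gap and width tolerances below are illustrative assumptions, not values from the disclosure.

```python
import math

# Illustrative sketch only: pick out the assistive robot in a planar scan by
# matching point-cluster extent against its pre-known footprint width, then
# take the cluster centroid as the relative position. Tolerances are assumed.

def detect_assistive_robot(points, robot_width=0.6, width_tol=0.1, gap=0.3):
    """Cluster consecutive scan points (assumed ordered by bearing) and return
    the centroid of the first cluster whose extent matches the robot width."""
    clusters, current = [], [points[0]]
    for p, q in zip(points, points[1:]):
        if math.dist(p, q) < gap:
            current.append(q)
        else:
            clusters.append(current)
            current = [q]
    clusters.append(current)
    for c in clusters:
        extent = math.dist(c[0], c[-1])
        if abs(extent - robot_width) < width_tol:
            cx = sum(x for x, _ in c) / len(c)
            cy = sum(y for _, y in c) / len(c)
            return (cx, cy)  # relative position in the scanning robot's frame
    return None

# Example: a 0.6 m-wide cluster about 2 m ahead is recognized as the robot.
scan = [(2.0, -0.3), (2.0, -0.1), (2.0, 0.1), (2.0, 0.3), (5.0, 2.0)]
print(detect_assistive_robot(scan))  # -> approximately (2.0, 0.0)
```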
[0068] Further, the processing unit 205 is configured to localize the mobile robot 302 in the work area 300 based on the position of the assistive robot 303 in the work area 300 and the determined relative position of the assistive robot 303 with respect to the mobile robot 302. In case of two or more assistive robots 303, the mobile robot 302 may localize itself using the relative positions with respect to each one of the implemented two or more assistive robots 303. This is possible since the system 100 would have information about the absolute position(s) of the assistive robot(s) 303 in the work area 300, being the position(s) where the assistive robot(s) 303 have been commanded to travel to; and further, since the system 100 would have information about the relative position(s) of the assistive robot(s) 303 with respect to the mobile robot 302 (i.e., from processing of the generated scan data), the system 100 would be able to determine the absolute position of the mobile robot 302 in the work area 300. In general, the mobile robot 302 may be localized based on the global pose of the assistive robot 303, as pre-known to the system 100 and determined by the scanning arrangement 340 of the mobile robot 302 from the scan operation. It may be appreciated that by implementing the assistive robots 303, the mobile robot 302 may have multiple reference points and thus may be able to more accurately localize its position in the work area 300.
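This localization step amounts to a 2D rigid-transform computation, sketched below under the assumption that the lost robot retains a usable heading estimate; with two or more assistive robots as reference points, the heading could in principle be solved for as well. All names are illustrative.

```python
import math

# Illustrative sketch only: recover the lost robot's absolute position from
# the assistive robot's known global position and the relative position
# measured in the lost robot's own frame. Names/assumptions are not from
# the disclosure.

def localize(assistive_global, relative_in_robot_frame, robot_heading):
    """Given the assistive robot's global (x, y), the measured relative
    (dx, dy) of the assistive robot in the lost robot's frame, and the lost
    robot's heading estimate, return the lost robot's global (x, y)."""
    ax, ay = assistive_global
    dx, dy = relative_in_robot_frame
    # Rotate the relative measurement into the global frame, then subtract.
    gx = dx * math.cos(robot_heading) - dy * math.sin(robot_heading)
    gy = dx * math.sin(robot_heading) + dy * math.cos(robot_heading)
    return (ax - gx, ay - gy)

# Example: assistive robot 2 m dead ahead, heading 0 -> robot at (2.0, 1.0).
print(localize(assistive_global=(4.0, 1.0),
               relative_in_robot_frame=(2.0, 0.0),
               robot_heading=0.0))
```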
[0069] Once the position of the mobile robot 302 has been determined, the system 100 may correct the position and/or the orientation of the mobile robot 302, so that the mobile robot 302 may be able to traverse the predefined path 310 as originally defined therefor, for completing its defined operational cycle or the like. FIG. 5 illustrates a diagrammatic representation of an exemplary implementation of the system 100 for the work area 300 in which the orientation of the mobile robot 302 is being corrected. For this purpose, the processing unit 205 is configured to determine a nearest ground marker (represented by the numeral 508), as part of the predefined path 310, to the mobile robot 302 and a trajectory (represented by the numeral 510) for the mobile robot 302 to travel from a current position (which is generally the same as the deviated position 402) thereof to the determined nearest ground marker 508 based on the localization of the mobile robot 302. Further, the processing unit 205 is configured to implement the odometry control arrangement 320 to control the mobile robot 302 to travel to the determined nearest ground marker 508 by following the determined trajectory 510. Herein, the nearest ground marker 508 may be one of the ground markers 306 in the predefined path 310 for the mobile robot 302 and is in closest proximity to the current position of the mobile robot 302 relative to the other ground markers 306 in the matrix of ground markers 306. Further, the trajectory 510 may be defined as a path between the current position of the mobile robot 302 and the nearest ground marker 508. In an example, the nearest ground marker 508 and the trajectory 510 may be selected/defined such that the mobile robot 302 is already in a suitable orientation to be able to continue traversing the predefined path 310 from the said nearest ground marker 508 when reached thereat by following the defined trajectory 510.
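A minimal sketch of the nearest-marker and trajectory determination follows; the straight-line waypoint construction and spacing are illustrative assumptions, as the disclosure does not fix a trajectory shape.

```python
import math

# Illustrative sketch only: choose the nearest marker on the predefined path
# and build a straight-line trajectory of waypoints towards it, in the spirit
# of paragraph [0069]. Marker list and waypoint spacing are assumed.

def nearest_marker_and_trajectory(current_position, path_markers, step=0.1):
    """Return the closest path marker and straight-line waypoints to it."""
    cx, cy = current_position
    marker = min(path_markers, key=lambda m: math.hypot(m[0] - cx, m[1] - cy))
    distance = math.hypot(marker[0] - cx, marker[1] - cy)
    n = max(1, int(distance / step))
    trajectory = [(cx + (marker[0] - cx) * i / n,
                   cy + (marker[1] - cy) * i / n) for i in range(1, n + 1)]
    return marker, trajectory

# Example: from a deviated position, head for the closer of two path markers.
marker, traj = nearest_marker_and_trajectory((2.3, 0.4),
                                             [(2.0, 0.0), (3.0, 0.0)])
print(marker, traj[-1])  # nearest marker (2.0, 0.0); trajectory ends on it
```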
[0070] This way, the present system 100 may assist the mobile robot 302 to correct its orientation if deviated from the predefined path 310, to be able to get back to following the predefined path 310 and perform the necessary designated functions in the work area 300. That is, the processing unit 205 is further configured to instruct the mobile robot 302 to move in the work area 300 after correction of its orientation by the odometry control arrangement 320, to perform regular operations thereof. With the orientation of the mobile robot 302 now being corrected, the mobile robot 302 may be able to precisely follow the predefined path 310 therefor (as provided by the system 100) during the operational cycle thereof, thereby contributing to even more efficient operation of the present system 100. Further, the assistive robot 303 may be released from this task and may then continue to perform its assigned operations as may be defined by the system 100.
[0071] The present disclosure further provides a method for managing the mobile robot 302, or specifically correcting orientation of the mobile robot 302, operating in the work area 300 comprising the matrix of ground markers 306. Various embodiments and variants disclosed above, with respect to the aforementioned system 100, apply mutatis mutandis to the present method. FIG. 6 is a flowchart 600 of a method for managing the mobile robot 302 operating in the work area 300 comprising the matrix of ground markers 306, as described herein. The various steps involved in the present method have been depicted as blocks in the flowchart 600 of FIG. 6, and the details for the same have been provided hereinafter.
[0072] At step 602, the method includes estimating a position of the mobile robot 302 based on the ground markers 306, when the mobile robot 302 is moved in the work area 300. As discussed, the sensing arrangement 330 is configured to estimate the position of the mobile robot 302 based on the ground markers 306, when the mobile robot 302 is moved in the work area 300 using the odometry control arrangement 320. In particular, the sensing arrangement 330 may determine a relative position of the mobile robot 302 with respect to the ground marker 306 in vicinity thereof.
[0073] At step 604, the method includes determining an odometry error in the movement of the mobile robot 302 in the work area 300 based on the estimated position of the mobile robot 302. As discussed, the sensing arrangement 330 comprises the optical recognizer (referred to by the same numeral 330) configured to capture an image of a portion of the work area 300 underneath the mobile robot 302 when the mobile robot 302 is operating in the work area 300 to detect the ground marker 306 directly underneath the mobile robot 302, from the matrix of ground markers 306. Further, as discussed, the odometry control arrangement 320 is disposed in signal communication with the optical recognizer 330 and configured to control movement of the mobile robot 302 to enable the mobile robot 302 to traverse the predefined path 310 in the work area 300, defined by virtually linking two or more ground markers 306 from the matrix of ground markers 306 therein, based on detection of the ground markers 306 by the optical recognizer 330. Herein, the method further comprises determining the odometry error in the movement of the mobile robot 302 in the work area 300 based on non-detection of the ground marker 306, from the matrix of ground markers 306, by the optical recognizer 330 in accordance with the predefined path 310 in the work area 300. That is, it may be concluded that there may be an odometry error in the movement of the mobile robot 302 in the work area 300 if the ground marker 306, from the matrix of ground markers 306, is not detected when it is expected as per the predefined path 310 being traversed as defined by virtually linking the ground markers 306.
[0074] At step 606, the method includes commanding the assistive robot 303 to travel in the work area 300 to a designated position 502 corresponding to one of the ground markers 306 in a neighbourhood of the mobile robot 302 based on a last estimated position of the mobile robot 302 in the work area 300, in response to determination of the odometry error. In an embodiment, the method includes selecting one of other mobile robots from a fleet of mobile robots operating in the work area 300 to designate as the assistive robot 303. The method may further include determining a suitable time to command the assistive robot 303 to travel in the work area 300 to the designated position 502 corresponding to one of the ground markers 306 in the neighbourhood of the mobile robot 302 based on at least one of: an operational cycle of the other mobile robot designated as the assistive robot 303 and operational cycles of mobile robots from the fleet of mobile robots operating in the work area 300. In some examples, the method further includes commanding two or more assistive robots 303 to travel in the work area 300 to respective designated positions 502 corresponding to different ground markers 306 in the neighbourhood of the mobile robot 302 based on the last estimated position of the mobile robot 302 in the work area 300, in response to determination of the odometry error.
[0075] At step 608, the method includes scanning a space around the mobile robot 302 to generate scan data therefor. At step 610, the method includes processing the generated scan data to detect presence of features of the assistive robot 303 and a relative position of the assistive robot 303 with respect to the mobile robot 302 in the work area 300. As discussed, by using the scanning arrangement 340, the mobile robot 302 is able to detect features of one or more objects and relative positions of the one or more objects with respect thereto in the vicinity thereof. Thus, by using the scanning arrangement 340, the mobile robot 302 would be able to detect any assistive robot(s) 303 in the vicinity thereof, by distinguishing the assistive robot(s) 303 from other objects detected during the scanning, as the features of the assistive robot(s) 303 may be pre-known to the scanning arrangement 340 of the mobile robot 302. Further, once the assistive robot(s) 303 have been detected, the scanning arrangement 340 may determine the relative position(s) of the assistive robot(s) 303 with respect to the mobile robot 302 in which it may be disposed.
[0076] At step 612, the method includes localizing the mobile robot 302 in the work area 300 based on the position of the assistive robot 303 in the work area 300 and the determined relative position of the assistive robot 303 with respect to the mobile robot 302. This is possible since the system 100 would have information about the absolute position(s) of the assistive robot(s) in the work area 300, being the position(s) where the assistive robot(s) have been commanded to travel to; and further, since the system 100 would have information about the relative position(s) of the assistive robot(s) with respect to the mobile robot 302 (i.e., from processing of the generated scan data), the system 100 would be able to determine the absolute position of the mobile robot 302 in the work area 300.
[0077] At step 614, the method includes determining a nearest ground marker 508, as part of the predefined path 310, to the mobile robot 302 and a trajectory (such as the trajectory 510 in FIG. 5) for the mobile robot 302 to travel from a current position thereof to the determined nearest ground marker 508 based on the localization of the mobile robot 302. At step 616, the method includes moving the mobile robot 302 to travel to the determined nearest ground marker 508 by following the determined trajectory 510 using the odometry control arrangement 320. That is, once the position of the mobile robot 302 has been determined, the system 100 may correct the position and/or the orientation of the mobile robot 302, so that the mobile robot 302 may be able to traverse the predefined path 310 as originally defined therefor.
[0078] The system and the method of the present disclosure provide for correcting the orientation of a mobile robot operating in a work area comprising a matrix of ground markers. Herein, the mobile robot may be commanded to undergo reorientation by the present system when it is detected that the mobile robot has deviated from the predefined path. The system and the method of the present disclosure utilize other mobile robot(s) in the work area to reorient the mobile robot to bring it back to the predefined path. It may be appreciated that the mobile robots may already have means (like the sensing arrangement) for providing information about exceptions, contingencies, recorded incidents, faults, and runtimes (such as max path deviation, average goal reach accuracy, total distance run, etc.), which can help with determining deviation from the predefined path, and which can be used as supplementary to the techniques described in the present disclosure. The present disclosure addresses and corrects for the fact that systemic and non-systemic odometry errors can take place over the operational cycle of an autonomous robot, like the present mobile robot. The present disclosure provides that the odometry errors are corrected without affecting the operational cycle of the mobile robot.
[0079] The system and the method of the present disclosure solve the problem of lost robots with a system of “assisted robot recovery” which guides the lost robot to the nearest ground marker in the event of exceptions and errors with the assistance of multiple robots. The present disclosure addresses the lost mobile robot problem by augmenting the recovery operation with assistive robots acting as fixed references for the mobile robot operating in the same work area. The disclosure provides opportunistic allocation of assistive recovery robots to the requested assisted recovery area by the fleet management system so as not to disrupt time-critical operations, by assigning assistive robot(s) subject to various constraints such as, but not limited to, system throughput, load, energy usage, and utilization. Thereby, the present system and method safely handle multiple deviations of mobile robots operating in the same work area, provide unimpeded operation of multiple mobile robots operating in the same work area, and thus improve system throughput by eliminating operational downtime for robots operating in the same work area and ensure time-bound operations for robots operating in the same work area.
[0080] The foregoing descriptions of specific embodiments of the present disclosure have been presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the present disclosure to the precise forms disclosed, and obviously many modifications and variations are possible in light of the above teaching. The exemplary embodiment was chosen and described in order to best explain the principles of the present disclosure and its practical application, to thereby enable others skilled in the art to best utilize the present disclosure and various embodiments with various modifications as are suited to the particular use contemplated.
Claims:
WE CLAIM:
1. A system for managing a mobile robot operating in a work area comprising a matrix of ground markers, the system comprising:
an odometry control arrangement provided in the mobile robot, the odometry control arrangement configured to control movement of the mobile robot in the work area based on the ground markers therein;
a sensing arrangement configured to estimate a position of the mobile robot based on the ground markers, when the mobile robot is moved in the work area using the odometry control arrangement;
a scanning arrangement provided in the mobile robot, the scanning arrangement configured to scan space around the mobile robot to generate scan data comprising features of one or more objects and relative position of the one or more objects with respect to the mobile robot in a vicinity of the mobile robot; and
a processing unit configured to:
determine an odometry error in the movement of the mobile robot in the work area based on the estimated position of the mobile robot;
command an assistive robot to travel in the work area to a designated position corresponding to one of the ground markers in a neighbourhood of the mobile robot based on a last estimated position of the mobile robot in the work area, in response to determination of the odometry error;
implement the scanning arrangement in the mobile robot to scan the space around the mobile robot to generate scan data therefor;
process the generated scan data to detect presence of features of the assistive robot and a relative position of the assistive robot with respect to the mobile robot in the work area;
localize the mobile robot in the work area based on the position of the assistive robot in the work area and the determined relative position of the assistive robot with respect to the mobile robot;
determine a nearest ground marker, as part of a predefined path, to the mobile robot and a trajectory for the mobile robot to travel from a current position thereof to the determined nearest ground marker based on the localization of the mobile robot; and
implement the odometry control arrangement to control the mobile robot to travel to the determined nearest ground marker by following the determined trajectory.

2. The system as claimed in claim 1, wherein the processing unit is configured to select one of other mobile robots from a fleet of mobile robots operating in the work area to designate as the assistive robot.

3. The system as claimed in claim 2, wherein the processing unit is configured to determine a suitable time to command the assistive robot to travel in the work area to the designated position corresponding to one of the ground markers in the neighbourhood of the mobile robot based on at least one of: an operational cycle of the other mobile robot designated as the assistive robot and operational cycles of mobile robots from the fleet of mobile robots operating in the work area.

4. The system as claimed in claim 1, wherein the processing unit is configured to command two or more assistive robots to travel in the work area to respective designated positions corresponding to different ground markers in the neighbourhood of the mobile robot based on the last estimated position of the mobile robot in the work area, in response to determination of the odometry error.

5. The system as claimed in claim 1, wherein the sensing arrangement comprises an optical recognizer configured to capture an image of a portion of the work area underneath the mobile robot when the mobile robot is operating in the work area to detect the ground marker directly underneath the mobile robot, from the matrix of ground markers, and wherein the odometry control arrangement is disposed in signal communication with the optical recognizer and configured to control movement of the mobile robot to enable the mobile robot to traverse the predefined path in the work area, defined by virtually linking two or more ground markers from the matrix of ground markers therein, based on detection of the ground markers by the optical recognizer.

6. The system as claimed in claim 5, wherein the processing unit is configured to determine the odometry error in the movement of the mobile robot in the work area based on non-detection of the ground marker, from the matrix of ground markers, by the optical recognizer in accordance with the predefined path in the work area.

7. A method for managing a mobile robot operating in a work area comprising a matrix of ground markers, the method comprising:
estimating a position of the mobile robot based on the ground markers, when the mobile robot is moved in the work area;
determining an odometry error in a movement of the mobile robot in the work area based on the estimated position of the mobile robot;
commanding an assistive robot to travel in the work area to a designated position corresponding to one of the ground markers in a neighbourhood of the mobile robot based on a last estimated position of the mobile robot in the work area, in response to determination of the odometry error;
scanning a space around the mobile robot to generate scan data therefor;
processing the generated scan data to detect presence of features of the assistive robot and a relative position of the assistive robot with respect to the mobile robot in the work area;
localizing the mobile robot in the work area based on the position of the assistive robot in the work area and the determined relative position of the assistive robot with respect to the mobile robot;
determining a nearest ground marker, as part of a predefined path, to the mobile robot and a trajectory for the mobile robot to travel from a current position thereof to the determined nearest ground marker based on the localization of the mobile robot; and
moving the mobile robot to travel to the determined nearest ground marker by following the determined trajectory.

8. The method as claimed in claim 7 further comprising:
selecting one of other mobile robots from a fleet of mobile robots operating in the work area to designate as the assistive robot; and
determining a suitable time to command the assistive robot to travel in the work area to the designated position corresponding to one of the ground markers in the neighbourhood of the mobile robot based on at least one of: an operational cycle of the other mobile robot designated as the assistive robot and operational cycles of mobile robots from the fleet of mobile robots operating in the work area.

9. The method as claimed in claim 7 further comprising commanding two or more assistive robots to travel in the work area to respective designated positions corresponding to different ground markers in the neighbourhood of the mobile robot based on the last estimated position of the mobile robot in the work area, in response to determination of the odometry error.

10. The method as claimed in claim 7, further comprising determining the odometry error in the movement of the mobile robot in the work area based on non-detection of the ground marker, from the matrix of ground markers, in accordance with the predefined path in the work area.
