
Kits Capable Of Being Retrofitted On Robots And Methods For Autonomous Navigation

Abstract: Disclosed is a kit that is capable of being retrofitted on a robot, the kit comprising: a housing; a sensor arrangement comprising a plurality of sensors; a communication arrangement comprising at least one communication interface; and a processing arrangement comprising at least one first processor communicably coupled with the sensor arrangement and a second processor of the robot via the at least one communication interface. The sensors and the communication interface(s) are detachably attached to the housing, to provide form factor flexibility to the kit. The first processor(s) is configured to: receive specification data related to a configuration of the robot; receive sensor data; process the sensor data to determine a state of the robot with respect to a frame of reference and to generate an occupancy grid map; generate a cost map of traversing the environment; generate a navigation plan for the robot; generate control signal(s) for moving the robot, based on the navigation plan and the specification data; and send the control signal(s) to the second processor of the robot, wherein when the control signal(s) is executed by the second processor, the robot moves according to the navigation plan for achieving a navigation objective. [FIG. 1]


Patent Information

Application #
Filing Date
21 February 2024
Publication Number
36/2025
Publication Type
INA
Invention Field
ELECTRONICS
Status
Parent Application

Applicants

Flo Mobility Pvt Ltd
F-705, Springfields apartment, Ambalipura, Sarjapur road, Bangalore, Karnataka - 560102

Inventors

1. Manesh Jain
F-705, Springfields apartment, Ambalipura, Sarjapur road, Bangalore, Karnataka - 560102
2. Clay Motupalli
#29-37-31, House of Bread, Eluru Road Vijayawada, Andhra Pradesh - 520002
3. Ashok Kumar Dhaker
Village - Swaroop ji ki kheri ,post - rayta, Tehsil- begun Chittorgarh, Rajasthan 312023

Specification

Description:TECHNICAL FIELD
The present disclosure relates to kits that are capable of being retrofitted on robots. The present disclosure also relates to methods for autonomous navigation, using kits that are capable of being retrofitted on the robots.
BACKGROUND
Robotics, like many other technological fields, experiences rapid advancement. Modernization demands and the accelerated pace of technological progress, spanning both hardware and software domains, frequently render established robotic systems obsolete.
Despite many recent advancements in the field of robotics, the existing techniques and equipment used for updating robots have several limitations. Firstly, there are limitations in the capabilities of sensors, processors, and communication protocols when updating an existing robot. Conventional robots comprise sensors with restricted range and sensitivity, constraining their ability to perceive and respond to the surrounding environment with the required granularity. This poses a significant challenge for industries and applications relying on robotics, as legacy systems struggle to keep up with contemporary demands (for example, conventional robots may be equipped with outdated cameras and sensors that lack the resolution and features of modern counterparts). Secondly, conventional robots used for navigation often have fixed, rigid structures, which limit their adaptability to a particular environment or task. Such structures make it challenging to deploy conventional robots where constraints or requirements vary continuously. Moreover, conventional robots may lack scalability, making it difficult to incorporate additional sensors or actuators to meet evolving task requirements. This restricts a robot's ability to evolve and adapt according to changing operational needs.
Therefore, in light of the foregoing discussion, there exists a need to overcome the aforementioned drawbacks.
SUMMARY
The present disclosure seeks to provide a kit that is capable of being retrofitted on the robot. The present disclosure also seeks to provide a method for autonomous navigation, using a kit that is capable of being retrofitted on the robot. An aim of the present disclosure is to provide a solution that overcomes at least partially the problems encountered in prior art.
In one aspect, an embodiment of the present disclosure provides a kit that is capable of being retrofitted on a robot, the kit comprising:
a housing;
a sensor arrangement comprising a plurality of sensors, wherein the plurality of sensors are detachably attached to the housing, to provide form factor flexibility to the kit;
a communication arrangement comprising at least one communication interface, wherein the at least one communication interface is detachably attached to the housing, to provide form factor flexibility to the kit;
a processing arrangement comprising at least one first processor, the at least one first processor being communicably coupled with the sensor arrangement and a second processor of the robot, via the at least one communication interface, wherein the at least one first processor is configured to:
receive specification data that is related to a configuration of the robot;
receive sensor data, collected by the plurality of sensors, wherein the sensor data pertains to a state of the robot and to an environment surrounding the robot;
process the sensor data to determine the state of the robot with respect to a frame of reference and to generate an occupancy grid map of the environment;
generate a cost map of traversing the environment, based at least on the occupancy grid map;
generate a navigation plan for the robot, based on the cost map and a navigation objective;
generate at least one control signal for moving the robot, based on the navigation plan and the specification data; and
send the at least one control signal to the second processor of the robot, wherein when the at least one control signal is executed by the second processor, the robot moves according to the navigation plan for achieving the navigation objective.
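For illustration only, and not as a limitation of any claim, the processing pipeline recited above (occupancy grid → cost map → navigation plan → control signal) may be sketched as follows. The function name, the trivial straight-line planner, and all numeric values are hypothetical stand-ins, not the disclosed implementation:

```python
import numpy as np

def plan_and_control(occupancy_grid, start, goal, max_speed):
    """Derive a cost map from an occupancy grid, form a navigation plan,
    and emit a velocity command clamped to the robot's specification."""
    # Cost map: occupied cells (probability > 0.5) are impassable; free cells cost 1.
    cost_map = np.where(occupancy_grid > 0.5, np.inf, 1.0)

    # Navigation plan: a trivial straight-line waypoint list stands in for
    # a real planner (e.g., A*/Dijkstra run over the cost map).
    waypoints = [start, goal]

    # Control signal: head toward the goal, respecting the specified max speed.
    direction = np.subtract(goal, start).astype(float)
    distance = np.linalg.norm(direction)
    if distance == 0:
        return cost_map, waypoints, (0.0, 0.0)
    velocity = direction / distance * min(max_speed, distance)
    return cost_map, waypoints, tuple(velocity)
```

The clamping step is where the specification data (here reduced to `max_speed`) constrains the generated control signal, mirroring the claim's "based on the navigation plan and the specification data".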
Optionally, for flexibly transitioning the kit between different form factors, at least one of: a quantity, an arrangement, of constituents of at least one of: the sensor arrangement, the communication arrangement, the processing arrangement, is modifiable.
Optionally, the at least one first processor is further configured to:
receive a feedback from a plurality of actuators of the robot, via the second processor of the robot;
determine movement information of the robot, by translating the feedback to odometry information of the robot; and
determine a current position of the robot, by fusing the odometry information with the sensor data,
wherein the at least one control signal is generated based also on the current position of the robot.
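As a non-limiting sketch of the optional steps above, the snippet below translates encoder feedback into odometry information and fuses it with a sensor-derived pose estimate via a simple weighted average. The function names, the fixed weighting factor, and the differential-drive assumption are illustrative choices only:

```python
def wheel_feedback_to_odometry(ticks_left, ticks_right, ticks_per_m, wheel_base):
    """Translate encoder-tick feedback into differential-drive odometry:
    forward distance travelled and heading change."""
    d_left = ticks_left / ticks_per_m
    d_right = ticks_right / ticks_per_m
    distance = (d_left + d_right) / 2.0        # forward motion (m)
    d_theta = (d_right - d_left) / wheel_base  # heading change (rad)
    return distance, d_theta

def fuse_position(odom_pose, sensor_pose, alpha=0.8):
    """Fuse an odometry-derived pose with a sensor-derived pose estimate
    (e.g., from scan matching) using a fixed complementary weighting."""
    return tuple(alpha * o + (1 - alpha) * s
                 for o, s in zip(odom_pose, sensor_pose))
```

In practice the fusion would typically be a probabilistic filter (e.g., an extended Kalman filter) rather than a fixed weighted average; the weighted average is used here only to keep the sketch self-contained.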
Optionally, the kit further comprises at least one input device communicably coupled to the at least one first processor via the at least one communication interface, wherein the at least one first processor is further configured to receive at least one user input pertaining to the specification data, via the at least one input device, wherein the at least one user input is provided by a user associated with the at least one input device.
Optionally, the at least one first processor is further configured to:
process the at least one user input to determine a drive mechanism of the robot specified therein;
determine whether the drive mechanism matches any pre-defined drive mechanism amongst a plurality of pre-defined drive mechanisms;
when it is determined that the drive mechanism matches a pre-defined drive mechanism amongst the plurality of pre-defined drive mechanisms, access a pre-defined kinematic model based on the pre-defined drive mechanism; and
when it is determined that the drive mechanism does not match any pre-defined drive mechanism amongst the plurality of pre-defined drive mechanisms, receive an additional user input pertaining to a customised plugin for the drive mechanism, wherein an input of the customized plugin is odometry information and an output of the customized plugin is at least one of: linear velocity, angular velocity, of the robot.
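The matching-and-fallback logic described above can be sketched as a plugin dispatch. The registry contents, dictionary keys, and kinematic formulas below are hypothetical placeholders, not disclosed models:

```python
# Hypothetical registry of pre-defined kinematic models keyed by drive mechanism.
# Each model maps odometry information to (linear velocity, angular velocity).
PREDEFINED_KINEMATICS = {
    "differential": lambda odom: (odom["v"], odom["w"]),
    "ackermann":    lambda odom: (odom["v"],
                                  odom["v"] * odom["steer"] / odom["wheelbase"]),
}

def resolve_kinematic_model(drive_mechanism, custom_plugin=None):
    """Return the pre-defined kinematic model for a matching drive mechanism;
    otherwise fall back to a user-supplied customised plugin."""
    model = PREDEFINED_KINEMATICS.get(drive_mechanism)
    if model is not None:
        return model
    if custom_plugin is None:
        raise ValueError(f"No kinematic model for '{drive_mechanism}'; "
                         "a customised plugin is required")
    return custom_plugin
```

The custom-plugin branch corresponds to the claim's additional user input: the plugin consumes odometry information and outputs linear and/or angular velocity of the robot.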
Optionally, the specification data comprises at least one of: a kinematic model based on a drive mechanism of the robot, a format of the at least one control signal, a format of a feedback received from the robot, physical dimensions of the robot, constraints of the robot, a maximum speed of the robot, a maximum acceleration of the robot, a plurality of specifications of at least one another communication interface onboard the robot.
Optionally, when the specification data comprises the plurality of specifications of the at least one another communication interface onboard the robot, the at least one first processor is further configured to selectively load any one of: a communication interface plugin, a library, that is associated with the at least one another communication interface, to enable establishment of at least one communication channel between the kit and the robot.
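The selective loading described above resembles dynamic module import keyed by an interface specification. The registry mapping below is a hypothetical example; only the standard-library `importlib` call is assumed:

```python
import importlib

def load_interface_plugin(interface_name, registry):
    """Selectively load the communication-interface plugin (or library)
    registered for the robot's onboard interface, enabling establishment
    of a communication channel between the kit and the robot."""
    module_name = registry.get(interface_name.lower())
    if module_name is None:
        raise KeyError(f"No plugin registered for interface '{interface_name}'")
    return importlib.import_module(module_name)
```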
Optionally, the at least one first processor is further configured to:
process at least one raw data stream of at least one of: a log of processing outputs of the at least one first processor, a log of navigation missions undertaken by the robot, information pertaining to navigation missions undertaken by the robot, to filter the at least one raw data stream, based on predefined criteria; and
generate a package comprising the at least one raw data stream that is filtered.
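A minimal sketch of the filtering-and-packaging step, assuming JSON-serialisable log records and gzip compression purely for illustration (the actual stream formats and criteria are not specified in the disclosure):

```python
import gzip
import json

def package_logs(raw_streams, predicate):
    """Filter raw log/mission data streams by a predefined criterion,
    then bundle the surviving records into a compressed package."""
    filtered = [record
                for stream in raw_streams
                for record in stream
                if predicate(record)]
    return gzip.compress(json.dumps(filtered).encode("utf-8"))
```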
Optionally, the plurality of sensors comprises at least two of: an image sensor, a motion sensor, a distance sensor.
In another aspect, an embodiment of the present disclosure provides a method for autonomous robotic navigation, using a kit that is capable of being retrofitted on a robot, wherein the kit comprises a housing that is adjustable to provide form factor flexibility to the kit, a sensor arrangement comprising a plurality of sensors that are detachably attached to the housing, and a communication arrangement comprising at least one communication interface that is detachably attached to the housing,
and wherein the method comprises:
receiving specification data that is related to a configuration of the robot;
receiving sensor data, collected by the plurality of sensors, wherein the sensor data pertains to a state of the robot and to an environment surrounding the robot;
processing the sensor data to determine the state of the robot with respect to a frame of reference and to generate an occupancy grid map of the environment;
generating a cost map of traversing the environment, based at least on the occupancy grid map;
generating a navigation plan for the robot, based on the cost map and a navigation objective;
generating at least one control signal for moving the robot, based on the navigation plan and the specification data; and
sending the at least one control signal to a second processor of the robot, wherein when the at least one control signal is executed by the second processor, the robot moves according to the navigation plan for achieving the navigation objective.
Optionally, the method further comprises:
receiving a feedback from a plurality of actuators of the robot, via the second processor of the robot;
determining movement information of the robot, by translating the feedback to odometry information of the robot; and
determining a current position of the robot, by fusing the odometry information with the sensor data,
wherein the at least one control signal is generated based also on the current position of the robot.
Optionally, the kit further comprises at least one input device communicably coupled to the at least one first processor via the at least one communication interface, the method further comprising receiving at least one user input pertaining to the specification data, via the at least one input device, wherein the at least one user input is provided by a user associated with the at least one input device.
Optionally, the method further comprises:
processing the at least one user input to determine a drive mechanism of the robot specified therein;
determining whether the drive mechanism matches any pre-defined drive mechanism amongst a plurality of pre-defined drive mechanisms;
when it is determined that the drive mechanism matches a pre-defined drive mechanism amongst the plurality of pre-defined drive mechanisms, accessing a pre-defined kinematic model based on the pre-defined drive mechanism; and
when it is determined that the drive mechanism does not match any pre-defined drive mechanism amongst the plurality of pre-defined drive mechanisms, receiving an additional user input pertaining to a customised plugin for the drive mechanism, wherein an input of the customized plugin is odometry information and an output of the customized plugin is at least one of: linear velocity, angular velocity, of the robot.
Optionally, the specification data comprises a plurality of specifications of at least one another communication interface onboard the robot, the method further comprising selectively loading any one of: a communication interface plugin, a library, that is associated with the at least one another communication interface, to enable establishment of at least one communication channel between the kit and the robot.
Optionally, the method further comprises:
processing at least one raw data stream of at least one of: a log of processing outputs of the at least one first processor, a log of navigation missions undertaken by the robot, information pertaining to navigation missions undertaken by the robot, to filter the at least one raw data stream, based on predefined criteria; and
generating a package comprising the at least one raw data stream that is filtered.
BRIEF DESCRIPTION OF THE DRAWINGS
The summary above, as well as the following detailed description of illustrative embodiments, is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the present disclosure, exemplary constructions of the disclosure are shown in the drawings. However, the present disclosure is not limited to specific methods and instrumentalities disclosed herein. Moreover, those skilled in the art will understand that the drawings are not to scale. Wherever possible, like elements have been indicated by identical numbers.
Embodiments of the present disclosure will now be described, by way of example only, with reference to the following diagrams wherein:
FIG. 1 illustrates a block diagram of an architecture of a kit that is capable of being retrofitted on a robot, in accordance with an embodiment of the present disclosure;
FIG. 2 illustrates an internal view of an exemplary kit retrofitted on an exemplary robot, in accordance with an embodiment of the present disclosure;
FIG. 3 illustrates steps of a method for autonomous navigation using a kit that is capable of being retrofitted on a robot, in accordance with an embodiment of the present disclosure.
DETAILED DESCRIPTION
The following detailed description illustrates embodiments of the present disclosure and ways in which they can be implemented. Although some modes of carrying out the present disclosure have been disclosed, those skilled in the art would recognize that other embodiments for carrying out or practicing the present disclosure are also possible.
The present disclosure provides a kit that is capable of being retrofitted on a robot and the aforementioned method for autonomous robotic navigation. Beneficially, the kit has an architecture that allows seamless integration of an arbitrary quantity of sensors and actuators. This enables the kit to adapt effortlessly to different form factors, thereby enabling retrofitting of the kit onto different robots of varying shapes and sizes, without a need for extensive modifications. Advantageously, integrating the plurality of sensors in the kit enables comprehensive sensing of the robot's state and the environment surrounding the robot. The sensor data collected by the plurality of sensors contributes accurate and detailed information which is useful for making the robot autonomous. Furthermore, the communication arrangement ensures seamless communication between the kit and the robot, facilitating the exchange of data and control signals, and enabling effective coordination and integration with the robot's onboard systems. Moreover, the presence of at least one first processor in the kit allows for onboard processing of the sensor data and, eventually, generation of the at least one control signal to control the robot. This autonomy reduces the dependence on the existing robot's processing capabilities. The generation of the navigation plan enhances the robot's capabilities, contributing to safer and more effective operation of the robot.
Referring to FIG. 1, illustrated is a block diagram of an architecture of a kit 100 that is capable of being retrofitted on a robot 102, in accordance with an embodiment of the present disclosure. The kit 100 comprises a housing 104, a sensor arrangement 106 comprising a plurality of sensors (depicted as sensors 108A, 108B, and 108C), a communication arrangement 110 comprising at least one communication interface (depicted as a communication interface 112), and a processing arrangement 114 comprising at least one first processor (depicted as a first processor 116). Also shown is a second processor 118 of the robot 102. The first processor 116 is communicably coupled with the sensor arrangement 106 and the second processor 118 via the communication interface 112. Optionally, the kit 100 further comprises at least one input device (depicted as an input device 120) that is coupled to the first processor 116 via the communication interface 112.
It may be understood by a person skilled in the art that FIG. 1 includes a simplified architecture of the kit 100 for the sake of clarity, which should not unduly limit the scope of the claims herein. The person skilled in the art will recognize many variations, alternatives, and modifications of embodiments of the present disclosure.
Throughout the present disclosure, the term "kit" refers to a specialized set of components that is designed to enhance or upgrade existing robots. The kit 100 is compatible with, and augments the capabilities of, pre-existing robots. It is used to provide new functionalities to the robot 102 or to improve a performance of the robot 102. Notably, the kit 100 offers the capability of connecting with an arbitrary quantity of sensors 108A-C and actuators through the at least one communication interface 112 that utilizes industry-standard communication protocols. Furthermore, the kit 100 is engineered to facilitate a smooth transition between different form factors (as described later).
Throughout the present disclosure, the term "housing" refers to a structural covering encasing components (namely, the sensor arrangement 106, the communication arrangement 110, the processing arrangement 114) of the kit 100. The housing 104 is fabricated to protect the components of the kit 100 from damage that may be caused due to falling, bumping, or any such impact to the kit 100. Examples of materials used to manufacture the housing 104 include, but are not limited to, polymers (such as polyvinyl chloride, high density polyethylene, polypropylene, polycarbonate), metals and their alloys (such as aluminium, steel, copper), non-metals (such as carbon fibre, toughened glass) or any combination thereof.
Throughout the present disclosure, the term "sensor arrangement" refers to an arrangement of the plurality of sensors 108A-C, and peripheral components required for operation of the plurality of sensors 108A-C and for transmittance or communication of sensor data captured by each of the plurality of sensors 108A-C. Herein, the plurality of sensors 108A-C are devices that detect signals, stimuli, or changes in quantitative and/or qualitative features of an environment surrounding the robot 102. The plurality of sensors 108A-C are detachably attached to the housing 104 of the kit 100, i.e., the plurality of sensors 108A-C can be easily connected to or disconnected from the housing 104, using detachable attachment means, for example, connectors, mounts, and the like. The detachable attachment of the plurality of sensors 108A-C allows flexibility in configuring the sensor arrangement 106 based on the requirements of the robot 102, ensuring that the kit 100 can be used for different tasks and/or exemplary use-case scenarios that are to be implemented in different environments surrounding the robot 102. Beneficially, the detachable attachment of the plurality of sensors 108A-C facilitates scaling the kit 100, which can further be utilized for upgrading the robot 102. In such upgradation of the robot 102, a user can add, remove, or replace any sensor from the plurality of sensors 108A-C in the sensor arrangement 106 to meet at least one of: evolving technological advancements, changing operational requirements, related to the robot 102, without a need to replace the kit 100 entirely. Furthermore, such addition, removal, or replacement of any sensor from the plurality of sensors 108A-C in the sensor arrangement 106 provides the form factor flexibility.
Herein, the term "form factor" refers to at least one of: a physical size, a shape, a configuration of the sensor arrangement 106 in relation to at least one of: a compatibility, a design, a functionality, of the kit 100. Moreover, the term "form factor flexibility" refers to an ability of the sensor arrangement 106 to adapt to at least one of: a different physical size, a different shape, a different configuration. Hence, by detachably attaching the plurality of sensors 108A-C to the housing 104, the kit 100 can accommodate variations in arrangement and configuration of the plurality of sensors 108A-C. This is beneficial for adapting the kit 100 to robots that have at least one of: distinct shapes, distinct size, and distinct structural layouts, wherein the robots can be used in different environments, or as per preferences of the user.
Optionally, the plurality of sensors 108A-C comprises at least two of: an image sensor, a motion sensor, a distance sensor. Herein, the term "image sensor" refers to a device which detects light from the environment at its photosensitive surface, when light is incident thereupon. Upon such detection of the light, at least one image signal is generated, which is processed to generate an image. The image sensor is configured to capture at least one image of the environment surrounding the robot 102, wherein such at least one image is processed for at least one of: obstacle detection, object recognition, spatial mapping of the environment surrounding the robot 102. The at least one image is processed using processing algorithms to at least one of: identify obstacles based on visual characteristics (for example, edges, shapes, textures), classify objects based on their visual characteristics, fuse images captured from different perspectives to create a three-dimensional (3D) representation (for example, a point cloud, a depth map) of the environment. Such processing algorithms are well-known in the art. It will be appreciated that the image sensor is commonly used in a camera, for example, a stereo camera, a monocular camera, and other imaging devices. The stereo camera comprises two image sensors that are positioned so as to mimic human binocular vision, whereas the monocular camera comprises a single image sensor.
The term "motion sensor" refers to a device that at least one of: detects a change of position of an object within the environment that surrounds the robot 102, determines a position of the robot 102 in the environment. When the motion sensor is used for detecting the change of position of the object within the environment, the object is at least one of: a static object, a dynamic object. When the motion sensor is used for determining the position of the robot 102 within the environment, the motion sensor is configured to measure at least one of: a distance traveled by the robot 102, angular acceleration of the robot 102, absolute positioning information of the robot 102. Examples of the motion sensor may include, but are not limited to, encoders, Global Navigation Satellite System (GNSS), Inertial Measurement Unit (IMU), and Light Detection and Ranging (LiDAR).
Furthermore, the term "distance sensor" refers to a device that measures a distance between said distance sensor and a surface and/or an object in the environment surrounding the robot 102. The distance sensor is configured to measure at least one of: a time taken for a signal emitted by the distance sensor to reflect after hitting the surface and/or the object, an intensity of the signal, changes in capacitance between the distance sensor and the surface and/or the object as the distance changes. Such signals are processed for obstacle detection, by perceiving the environment surrounding the robot 102 and thereby utilizing this information to avoid collisions with objects in proximity to the robot 102, and/or to create a map of the environment. Examples of distance sensors include Sound Navigation and Ranging (SONAR) sensors, a two-dimensional (2D) LiDAR, a 3D LiDAR, a capacitive sensor, and an ultrasonic sensor.
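The time-of-flight measurement mentioned above reduces to simple arithmetic: the emitted signal travels to the surface and back, so the one-way distance is half the round-trip path. A minimal sketch (the function name and example values are illustrative):

```python
def time_of_flight_distance(round_trip_s, wave_speed_m_s):
    """Distance measured by a time-of-flight sensor: the signal travels
    to the reflecting surface and back, so halve the round-trip path."""
    return wave_speed_m_s * round_trip_s / 2.0
```

For an ultrasonic sensor in air (speed of sound ~343 m/s), a 10 ms round trip corresponds to a surface roughly 1.7 m away.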
A technical effect of the sensor arrangement 106 comprising the plurality of sensors 108A-C is that it provides enhanced adaptability, versatility, and functionality for navigating the robot 102 in an autonomous manner.
Throughout the present disclosure, the term "communication arrangement" refers to an arrangement of the at least one communication interface 112, and peripheral components required for operation of the at least one communication interface 112. The at least one communication interface 112 is detachably attached to the housing 104 of the kit 100, i.e., it can be easily connected to or disconnected from the housing 104, using another detachable attachment means, for example, other connectors, other mounts, and the like. The detachable attachment of the at least one communication interface 112 allows flexibility in configuring the communication arrangement 110 based on the requirements of the robot 102, ensuring that the kit 100 can be used for different tasks and/or exemplary use-case scenarios that are to be implemented in different environments surrounding the robot 102. Herein, the term "communication interface" refers to an electronic circuit that provides a physical or virtual interface that is configured to facilitate data communication between the at least one first processor 116 of the kit 100 and the second processor 118 of the robot 102. The at least one communication interface 112 comprises at least one of: a set of hardware components, a set of software components. The data communication includes wired communication or wireless communication that can be carried out via any number of known protocols, including, but not limited to, Internet Protocol (IP), Wireless Access Protocol (WAP), Frame Relay, or Asynchronous Transfer Mode (ATM). Moreover, any other suitable protocols using voice, video, data, or combinations thereof, can also be employed.
Optionally, the at least one communication interface 112 comprises at least one of: a Universal Serial Bus (USB), a Controller Area Network (CAN), a Universal Asynchronous Receiver-Transmitter (UART), a Gigabit Multimedia Serial Link (GMSL), a Camera Serial Interface (CSI) of the Mobile Industry Processor Interface (MIPI) Alliance, an Inter-Integrated Circuit/Serial Peripheral Interface (I2C/SPI), Ethernet. Optionally, for establishing communication between the at least one first processor 116 and the second processor 118, the at least one communication interface 112 comprises: USB, CAN, UART, I2C/SPI, Ethernet. Optionally, the at least one communication interface 112 that the plurality of sensors 108A-C use within the kit 100 comprises at least one of: USB, UART, MIPI CSI, GMSL. Optionally, for establishing auxiliary communication between the second processor 118 and a communication network, the at least one communication interface 112 comprises at least one of: Wireless Fidelity (Wi-Fi), Bluetooth®, fourth generation mobile network (4G), fifth generation mobile network (5G), Ethernet. This auxiliary communication between the second processor 118 and the communication network allows accessing external data, updates, or commands from a network source, which can be incorporated by the at least one first processor 116 to enhance the capabilities of the robot 102 when the kit 100 is retrofitted on the robot 102. For example, the kit 100 may be retrofitted onto the robot 102, and to establish communication between the at least one first processor 116 and the second processor 118, the second processor 118 may be connected to the at least one first processor 116 via the at least one communication interface 112 (for example, the USB).
Throughout the present disclosure, the term "processing arrangement" refers to an arrangement of the at least one first processor 116, wherein the processing arrangement 114 is configured to handle computational tasks, data processing, and/or decision-making within the kit 100. The processing arrangement 114 is a structured setup involving the at least one first processor 116 and associated components responsible for managing and executing computational functions related to navigating the robot 102 in the autonomous manner. Notably, the at least one first processor 116 controls an overall operation of the robot 102 when the kit 100 is retrofitted on the robot 102. Herein, the at least one first processor 116 is responsible for handling tasks such as processing of the sensor data, decision-making based on the processed sensor data, and/or generating control signals. Beneficially, the at least one first processor 116 plays an essential role in orchestrating the autonomous navigation capabilities of the robot 102 when the kit 100 is retrofitted on the robot 102. Optionally, the at least one first processor 116 comprises at least one of: a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a Neural Processing Unit (NPU). The second processor 118 is configured to control only an operation of the robot 102. In this regard, the second processor 118 is distinct from the at least one first processor 116. The second processor 118 is configured to at least one of: control a motor of the robot 102, establish communication with other components on the robot 102, coordinate functions within the robot 102.
The at least one first processor 116 and the second processor 118 could be implemented as any one of: a microprocessor, a microcontroller, or a controller. As an example, the at least one first processor 116 could be implemented as a central computing unit based on existing Advanced RISC Machine (ARM®) chip. As another example, the second processor 118 could be implemented as an application-specific integrated circuit (ASIC) chip or a reduced instruction set computer (RISC) chip.
Optionally, the processing arrangement 114 comprises at least one of: a perception module, a planning module, a control module, a data engine module, a conductor module. Each of these modules is explained in depth below. The at least one first processor 116 is configured to receive the specification data as at least one input provided by a user, or from the second processor 118 of the robot 102. Herein, the term "specification data" refers to information or parameters that describe various features of the robot’s configuration. Subsequently, the specification data encompasses physical attributes and/or functional attributes of the robot 102, which are used for determining how the robot 102 operates and interacts with the environment. Herein, the specification data is processed by the at least one first processor 116 to determine characteristics and capabilities of the robot 102. This facilitates modifying the characteristics of the robot 102 so as to adapt said robot 102 to make it autonomous.
Throughout the present disclosure, the sensor arrangement 106 comprised in the kit 100 enables a functionality of the perception module. The plurality of sensors 108A-C collectively captures information about the robot's surroundings. Herein, the plurality of sensors 108A-C provides sensor data which is comprehensive and diverse, thereby configuring the at least one first processor 116 of the kit 100 to process the sensor data to determine different scenarios within the environment and the robot's condition in each scenario. Moreover, in the perception module, the at least one first processor 116 is configured to primarily interpret and understand the surroundings (namely, a local environment) in which the robot 102 operates. This involves collecting data from the plurality of sensors 108A-C to build a representation of the robot's surroundings. Herein, this sensor data comprises information at least about the object surrounding the robot 102, the object being at least one of: the static object, the dynamic object. Subsequently, the plurality of sensors 108A-C collects the sensor data which includes information about an internal state of the robot 102. This internal state information comprises at least one of: a position, an orientation, a speed, an acceleration, of the robot 102. In this regard, each sensor from amongst the plurality of sensors 108A-C has a unique functionality which contributes specific types of sensor data that collectively form a comprehensive understanding of the surroundings and/or the internal state of the robot 102. For example, the plurality of sensors 108A-C may comprise a distance sensor (for example, such as a LiDAR), an image sensor (comprised in a stereo camera and a monocular camera), and a motion sensor (for example, such as an IMU). Herein, the LiDAR may scan the surroundings of the robot 102, detect obstacles, and create a detailed map.
The stereo cameras and monocular cameras may work together to identify and track moving objects, such as humans or other robots, in the robot's path. Furthermore, the IMU may continuously measure the robot's accelerations and angular rates, providing real-time data to the at least one first processor 116 on the motion of said robot 102.
The sensor data received from the plurality of sensors 108A-C is then processed by the at least one first processor 116 by providing the sensor data as input to any one of: a set of heuristic algorithms, neural networks. An output received after processing the sensor data is used to determine the state of the robot 102 relative to the frame of reference. Herein, the term "frame of reference" refers to any one of: a coordinate system, a set of axes, that is used to define a position and an orientation of the robot 102 within the surroundings. The sensor data is utilized to generate the occupancy grid map of the environment. In this regard, the occupancy grid map divides the environment into a grid of cells, wherein each cell is labeled as any one of: occupied (i.e., the given cell is occupied by an obstacle), free space, based on the sensor data. Herein, the cell refers to a discrete unit within a grid-based representation of an environment. Beneficially, determining the state of the robot 102 with respect to the frame of reference and generating the occupancy grid map are essential to enable the robot 102 to be autonomous. Such robots can then navigate, understand, and interact with their environments on their own.
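By way of a non-limiting illustration, the labelling of grid cells as occupied or free may be sketched as follows in Python. The function name, the grid size, the cell size, and the obstacle coordinates are assumptions made purely for this example; a practical implementation would additionally ray-trace free space between the sensor and each return.

```python
def build_occupancy_grid(lidar_points, grid_size=10, cell_size=1.0):
    """Label each cell of a grid_size x grid_size map as occupied (1) or free (0).

    lidar_points: iterable of (x, y) obstacle positions in the robot's
    frame of reference, with the origin at the grid's lower-left corner.
    """
    grid = [[0] * grid_size for _ in range(grid_size)]
    for x, y in lidar_points:
        col = int(x // cell_size)
        row = int(y // cell_size)
        if 0 <= row < grid_size and 0 <= col < grid_size:
            grid[row][col] = 1  # the given cell is occupied by an obstacle
    return grid
```

For instance, a single LiDAR return at (2.5, 3.5) would mark only the cell in row 3, column 2 as occupied, leaving all other cells free.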
The occupancy grid map is then used to generate the cost map. Throughout the present disclosure, the term "cost map" refers to a spatial representation of the environment wherein each cell of the occupancy grid map is assigned a cost value. This cost value indicates a difficulty associated with traversing a particular cell in the occupancy grid map. Herein, some cells with obstacles are assigned a high cost value to discourage the robot 102 from traversing through those cells whereas other cells with free space are assigned a low cost value to encourage the robot 102 to navigate through the other cells. Moreover, the cost map of traversing the environment can be generated based also on at least one of: the sensor data, the specification data. For example, each cell in the occupancy grid map may be assigned a numerical cost value, wherein a high numerical value (for example, such as 9 in a scale of 1-10) may typically indicate greater difficulty or risk, while a low value (for example, such as 2 in the scale of 1-10) may suggest more favorable conditions for traversal for the robot 102.
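Purely for illustration, the assignment of cost values (on the 1-10 scale of the example above) to occupancy grid cells may be sketched as follows. The specific cost values and the one-cell "inflation" around obstacles are assumptions for this sketch, not features recited by the disclosure.

```python
def build_cost_map(occupancy_grid, obstacle_cost=9, free_cost=2, inflation_cost=5):
    """Assign a traversal cost to each cell of the occupancy grid.

    Occupied cells receive a high cost, cells adjacent to obstacles
    receive an intermediate (inflated) cost as a safety margin, and
    free cells receive a low cost.
    """
    rows, cols = len(occupancy_grid), len(occupancy_grid[0])
    cost_map = [[free_cost] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            if occupancy_grid[r][c] == 1:
                cost_map[r][c] = obstacle_cost
                # Inflate cost into neighbouring free cells.
                for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    nr, nc = r + dr, c + dc
                    if 0 <= nr < rows and 0 <= nc < cols and occupancy_grid[nr][nc] == 0:
                        cost_map[nr][nc] = max(cost_map[nr][nc], inflation_cost)
    return cost_map
```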
The navigation plan is generated by the at least one first processor 116 by utilizing the planning module. Herein, the term "navigation plan" refers to a detailed strategy or a sequence of actions that guides the robot 102 from its current position to a particular destination within an environment, wherein such particular destination could be defined in the navigation objective. The navigation plan is generated by the at least one first processor 116 using algorithms, wherein the algorithms are employed to consider a layout of the environment, obstacles, and various constraints to determine the optimal path for the robot 102 to follow. Such algorithms are well-known in the art. Notably, the term "navigation objective" refers to a specific target that the robot 102 needs to achieve during its movement within the environment. The navigation objective could include start and end points, ancillary navigation details (for example, such as, what paths to avoid, which trajectories to use while performing surveillance, repetition of certain routes), and the like. Herein, the navigation plan ensures the robot 102 takes a path which is optimized based on the cost map and the navigation objective. Thereby, this is useful in conserving resources such as time and energy, which enhances battery life. Furthermore, by considering the cost map and the navigation objective, the robot 102 can navigate safely, by avoiding obstacles and adhering to predefined paths for surveillance. For example, the navigation objective may be to reach a particular destination by being energy efficient. Hence, the at least one first processor 116 may be configured to generate a navigation plan that may comprise particular cells with low cost values and a low energy consumption, while considering constraints and avoiding high-cost areas.
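As one non-limiting example of the well-known planning algorithms referred to above, a minimum-cost path over the cost map may be found with Dijkstra's search. The function below is an illustrative sketch only; the returned list of (row, column) cells stands in for the navigation plan.

```python
import heapq

def plan_path(cost_map, start, goal):
    """Find a minimum-cost cell sequence from start to goal (Dijkstra's search).

    start and goal are (row, col) tuples on the cost map.
    """
    rows, cols = len(cost_map), len(cost_map[0])
    best = {start: 0}
    came_from = {}
    frontier = [(0, start)]
    while frontier:
        cost, cell = heapq.heappop(frontier)
        if cell == goal:
            # Reconstruct the path by walking back through predecessors.
            path = [cell]
            while cell in came_from:
                cell = came_from[cell]
                path.append(cell)
            return path[::-1]
        r, c = cell
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                new_cost = cost + cost_map[nr][nc]
                if new_cost < best.get((nr, nc), float("inf")):
                    best[(nr, nc)] = new_cost
                    came_from[(nr, nc)] = cell
                    heapq.heappush(frontier, (new_cost, (nr, nc)))
    return None  # no traversable path found
```

On a cost map whose middle column carries high cost values, such a search routes the plan around the high-cost cells, consistent with the energy-efficient behaviour described above.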
Throughout the present disclosure, the term "control signal" refers to a specific set of commands or instructions sent to the second processor 118 of the robot 102 to control motion of the robot 102. Herein, generating the at least one control signal for moving the robot 102 involves translating the navigation plan, which is in a high-level language, into specific commands that are in a low-level language executable by the second processor 118 of the robot 102. The at least one control signal is any one of: a voltage signal, a current signal. Additionally, the at least one control signal is generated by combining the navigation plan and the specification data to align with the robot's capabilities and operational requirements. Using the specification data ensures that the at least one control signal is generated based on characteristics of the robot 102, thereby allowing for precise and effective control of movements of the robot 102. Without the specification data, controlling the robot 102 could become challenging and may lead to unpredictable or undesired behavior. Furthermore, the at least one control signal typically includes parameters related to the robot's movement, such as linear and angular velocities, that are necessary for following the navigation plan. Advantageously, the at least one control signal is generated after one configures the kit 100 with an understanding of the robot’s configuration, kinematic model, drive train, and hardware APIs.
The at least one control signal generated by configuring the at least one first processor 116, which at least specifies motion parameters (such as linear and angular velocities) of the robot 102, is sent from the at least one first processor 116 to the second processor 118 of the robot 102. The second processor 118 is responsible for the motion of the robot 102 and helps achieve the navigation objective. The second processor 118 (namely, a motor control unit) receives the at least one control signal and processes the at least one control signal to produce specific commands for the movement of the robot 102. Subsequently, once the second processor 118 executes the at least one control signal, the robot 102 moves based on the navigation plan generated earlier. The overall objective is for the robot 102 to follow the navigation plan to reach its destination or fulfill a specific task. In an exemplary use-case scenario, the robot 102 may be used for navigation in a warehouse. Herein, the navigation objective of the robot 102 may be to move from a storage area to a packing station, within the warehouse, while avoiding obstacles. The at least one first processor 116 may be configured to generate at least one control signal which at least specifies a linear velocity of 0.5 metres per second (m/s) and an angular velocity of 0.2 radians per second (rad/s). The at least one control signal may be sent to the second processor 118 responsible for the motion of the robot 102, wherein the second processor 118 executes the at least one control signal to provide the robot 102 with the linear velocity of 0.5 m/s and the angular velocity of 0.2 rad/s. The robot 102 may successfully navigate to the packing station from the storage area, by avoiding obstacles to achieve the navigation objective.
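For illustration only, the translation of a navigation-plan velocity command into low-level commands, using the specification data to respect the robot's capabilities, may be sketched as follows. A differential-drive robot is assumed, and the wheel separation and speed limit in `spec` are hypothetical values, not part of the disclosure; the returned dictionary merely stands in for the control signal sent to the second processor.

```python
def generate_control_signal(linear_v, angular_v, spec):
    """Translate (linear, angular) velocities into per-wheel commands.

    spec carries specification data: wheel separation in metres and
    the maximum wheel speed in m/s (assumed values for this sketch).
    """
    half_track = spec["wheel_separation"] / 2.0
    left = linear_v - angular_v * half_track
    right = linear_v + angular_v * half_track
    # Clamp to the robot's capabilities, as described by the specification data.
    limit = spec["max_wheel_speed"]
    left = max(-limit, min(limit, left))
    right = max(-limit, min(limit, right))
    return {"left_wheel_mps": left, "right_wheel_mps": right}

# The warehouse example above: 0.5 m/s linear, 0.2 rad/s angular.
signal = generate_control_signal(
    0.5, 0.2, {"wheel_separation": 0.4, "max_wheel_speed": 1.0}
)
```

Without the specification data (here, the wheel separation and speed limit), the same velocity command could not be mapped to safe wheel speeds, which matches the point made above about unpredictable behavior.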
Optionally, for flexibly transitioning the kit 100 between different form factors, at least one of: a quantity, an arrangement, of constituents of at least one of: the sensor arrangement, the communication arrangement, the processing arrangement, is modifiable. Herein, the term "quantity" refers to a number of constituents within a particular arrangement of the kit. For example, the quantity of sensors may refer to a number of individual sensors in the sensor arrangement, and the quantity of communication interfaces may represent a number of communication interfaces in the communication arrangement. Moreover, the term "arrangement" refers to a configuration, or positioning, of the constituents within the kit. Advantageously, the kit can be scaled up or scaled down based on operational requirements of the robot; for example, a robot with higher computational needs may require additional processing units, and the aforementioned kit allows for such adjustments.
Beneficially, the ability to rearrange the constituents of at least one of: the sensor arrangement, the communication arrangement, the processing arrangement provides a high degree of flexibility in adjusting an overall shape and layout of the kit. This facilitates transition of the kit between different form factors. Herein, the housing could have variable mounting points or attachment mechanisms that allow the constituents to be positioned at different locations. A technical effect of flexibly transitioning the kit between different form factors is the enhancement of the kit's versatility, making it adaptable to a wide range of robotic platforms and applications by allowing users to customize the quantity and arrangement of its constituents.
Optionally, the at least one first processor 116 is further configured to:
receive a feedback from a plurality of actuators of the robot 102, via the second processor 118 of the robot 102;
determine movement information of the robot, by translating the feedback to odometry information of the robot 102; and
determine a current position of the robot 102, by fusing the odometry information with the sensor data,
wherein the at least one control signal is generated based also on the current position of the robot 102.
Herein, the term "plurality of actuators" refers to a collection or group of multiple actuators within the robot 102 for controlling various mechanical components of the robot 102 responsible for the motion of said robot 102. It will be appreciated that the plurality of actuators convert electrical signals into physical movements, thereby allowing the robot 102 to perform actions, for example, such as moving, turning, and similar. Herein, the at least one first processor 116 receives feedback from the plurality of actuators of the robot 102 through the second processor 118, wherein this feedback is used to determine the movement information of the robot 102. This involves translating the feedback into odometry information, wherein the odometry information comprises position and orientation of the robot 102 based on the movement of said robot 102. In this regard, fusion algorithms are used for such translation. Such fusion algorithms are well-known in the art. Thereby, for determining the current position of the robot 102, the odometry information obtained from the actuators is fused (combined) with the sensor data, thereby enhancing the accuracy of determining the current position of the robot 102. Optionally, the feedback from the plurality of actuators is fused with sensor data received from the motion sensors (for example, such as the IMU, the GNSS) which provides accurate information of the current position of the robot 102. Furthermore, the current position of the robot 102 is also used to generate at least one control signal to achieve the navigation objective. Advantageously, by incorporating feedback from the plurality of actuators and fusing it with the sensor data, the system gains a more accurate understanding of how the robot 102 is moving and where it is located in its environment. This improved awareness is then used to refine and adjust the control signals generated by the at least one first processor 116.
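As a non-limiting sketch of translating actuator feedback into odometry information, the function below assumes the feedback takes the form of wheel-encoder tick counts on a differential-drive robot; the encoder resolution and wheel separation are illustrative assumptions only.

```python
import math

def feedback_to_odometry(left_ticks, right_ticks, pose,
                         ticks_per_metre=1000.0, wheel_separation=0.4):
    """Translate wheel-encoder feedback into odometry (x, y, heading).

    pose is the previous (x, y, theta) of the robot; the returned tuple
    is the updated odometry estimate after the encoded wheel motion.
    """
    d_left = left_ticks / ticks_per_metre
    d_right = right_ticks / ticks_per_metre
    d_centre = (d_left + d_right) / 2.0          # distance moved by robot centre
    d_theta = (d_right - d_left) / wheel_separation  # change in heading
    x, y, theta = pose
    x += d_centre * math.cos(theta + d_theta / 2.0)
    y += d_centre * math.sin(theta + d_theta / 2.0)
    return (x, y, theta + d_theta)
```

For example, equal tick counts on both wheels yield straight-line motion with no change of heading.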
Subsequently, by fusing odometry with the sensor data, the system improves its ability to accurately determine the robot’s position, contributing to better localization. Herein, the integration of feedback allows the control signals to be dynamically adjusted based on the robot's actual movements, enabling real-time adaptation to changes in the environment. This means that the control signals are not only based on high-level navigation plans and specification data but are dynamically adjusted based on real-time information about the robot’s position. This adaptability is essential for precise and responsive robotic movements. Beneficially, fusing feedback from actuators and the plurality of sensors 108A-C helps to mitigate errors and position drift that may occur over time, providing a more reliable basis for control signal generation. For example, if the robot 102 encounters an obstacle that was not part of the initial plan, the control signals can be adjusted based on the fused information to navigate around it. A technical effect of configuring the at least one first processor 116 in such a manner is that the movement of the robot 102 can be dynamically adjusted based on the current position of the robot 102.
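The fusion step itself may be sketched, purely for illustration, as a weighted combination of the odometry-derived position with a sensor-derived position. A simple weighted average is used here as a stand-in for the well-known fusion algorithms mentioned above (for example, a Kalman filter); the weight value is an assumption for the sketch.

```python
def fuse_position(odometry_pos, sensor_pos, odometry_weight=0.7):
    """Fuse an odometry-derived position with a sensor-derived position.

    Both positions are same-length tuples (e.g. (x, y)); the weight
    controls how much the odometry estimate is trusted relative to
    the sensor estimate.
    """
    w = odometry_weight
    return tuple(w * o + (1.0 - w) * s for o, s in zip(odometry_pos, sensor_pos))
```

In practice a filter would also weight each source by its uncertainty, which is how drift in the odometry is corrected by the sensor data over time.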
Optionally, the kit 100 further comprises at least one input device 120 communicably coupled to the at least one first processor 116 via the at least one communication interface 112, wherein the at least one first processor 116 is further configured to receive at least one user input pertaining to the specification data, via the at least one input device 120, wherein the at least one user input is provided by a user associated with the at least one input device 120.
Herein, the term "input device" refers to an electronic device associated with (or used by) a user that is capable of enabling the user to provide input to the kit 100. Furthermore, the input device 120 is intended to be broadly interpreted to include any electronic device that may be used for voice and/or data communication over a wireless communication network. Examples of the at least one input device 120 may include, but are not limited to, a keyboard, a mouse, a touchscreen, a trackpad, and a microphone. Subsequently, a means of communication is established here between the at least one input device 120 and the at least one first processor 116, thereby allowing data exchange. The at least one first processor 116 is configured to receive at least one user input related to the specification data, which could include information essential for configuring and operating the robot 102. Herein, the at least one user input could be used to customize or specify certain parameters associated with the robot's configuration. This means of communication that is established could optionally ensure that the at least one user input received is attributed to a specific user interacting with the at least one input device 120. For example, the at least one input device 120 may be a touchscreen integrated into the kit 100. A user may interact with the touchscreen to provide at least one user input pertaining to the specification data. A technical effect of the inclusion of the at least one input device 120 is that it provides a user-friendly means for dynamically configuring and customizing the robotic system, wherein the user can directly provide at least one user input related to the specification data. This user interaction contributes to the versatility of the robotic kit 100 and makes the kit 100 accessible for users to operate the robot 102 according to operational requirements.
Optionally, the at least one first processor 116 is further configured to:
process the at least one user input to determine a drive mechanism of the robot 102 specified therein;
determine whether the drive mechanism matches any pre-defined drive mechanism amongst a plurality of pre-defined drive mechanisms;
when it is determined that the drive mechanism matches a pre-defined drive mechanism amongst the plurality of pre-defined drive mechanisms, access a pre-defined kinematic model based on the pre-defined drive mechanism; and
when it is determined that the drive mechanism does not match any pre-defined drive mechanism amongst the plurality of pre-defined drive mechanisms, receive an additional user input pertaining to a customised plugin for the drive mechanism, wherein an input of the customized plugin is odometry information and an output of the customized plugin is at least one of: linear velocity, angular velocity, of the robot 102.
In this regard, the at least one user input is processed to identify the drive mechanism that is employed by the robot 102. Herein, the term "drive mechanism" refers to a configuration or a type of mechanism responsible for controlling the movement of the robot 102. Thereafter, the drive mechanism is compared against any pre-defined drive mechanism amongst a plurality of pre-defined drive mechanisms to determine whether the robot 102 employs the pre-defined drive mechanism or employs a new drive mechanism altogether. Herein, the term "pre-defined drive mechanism" refers to a standardized drive mechanism that represents a particular way in which the drive mechanism of the robot 102 is designed. Additionally, the term "plurality of pre-defined drive mechanisms" refers to a collection of distinct predetermined drive mechanisms that are available for the robot 102.
When it is determined that the drive mechanism matches the pre-defined drive mechanism amongst the plurality of pre-defined drive mechanisms, the at least one first processor 116 is configured to access the pre-defined kinematic model associated with that drive mechanism. Herein, the term "pre-defined kinematic model" refers to a standardized representation (for example, such as a mathematical representation) describing the robot's motion in relation to the at least one control signal. Some examples of pre-defined kinematic models corresponding to different drive mechanisms are, but are not limited to, a differential drive kinematics model, an Ackermann steering kinematics model, a tracked drive kinematics model.
When it is determined that the drive mechanism does not match any pre-defined drive mechanism amongst the plurality of pre-defined drive mechanisms, it indicates that the user has selected a non-standardized drive mechanism. The at least one first processor 116 may generate a prompt for the user to provide the additional user input to create the customized plugin. Herein, the term "customized plugin" refers to a software module that is specifically modified to accommodate unique characteristics of the drive mechanism that is non-standardized. Herein, the customized plugin takes odometry information as input. Subsequently, the output of the customized plugin includes at least one of the following: a linear velocity, an angular velocity, of the robot 102. This output is essential for controlling the robot's movement accurately. Beneficially, creating the customized plugin allows the kit 100 to adapt to the specific characteristics of the drive mechanism, ensuring accurate and effective control of the robot’s motion. This approach ensures that the kit 100 remains versatile and can accommodate a wide range of robotic configurations, even those outside standard drive models. A technical effect involves enhancing the kit's adaptability, user customization, and seamless integration of non-standard drive mechanisms, ultimately contributing to the kit’s effectiveness in diverse robotic applications.
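The match-or-fallback logic described above may be sketched, as a non-limiting illustration, with a registry of pre-defined drive mechanisms and a callable customized plugin. The registry contents and the string stand-ins for kinematic models are hypothetical; the plugin is modelled as a callable taking odometry information and returning linear/angular velocity, as specified above.

```python
# Hypothetical registry mapping pre-defined drive mechanisms to kinematic models.
PREDEFINED_KINEMATIC_MODELS = {
    "differential": "differential drive kinematics model",
    "ackermann": "Ackermann steering kinematics model",
    "tracked": "tracked drive kinematics model",
}

def resolve_kinematic_model(drive_mechanism, custom_plugin=None):
    """Return a kinematic model for the user-specified drive mechanism.

    Falls back to a user-supplied customized plugin (a callable that
    takes odometry information and returns (linear_velocity,
    angular_velocity)) when the mechanism is not pre-defined.
    """
    if drive_mechanism in PREDEFINED_KINEMATIC_MODELS:
        return PREDEFINED_KINEMATIC_MODELS[drive_mechanism]
    if custom_plugin is None:
        raise ValueError(
            f"no pre-defined model for {drive_mechanism!r}; "
            "a customised plugin is required"
        )
    return custom_plugin
```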
Optionally, the specification data comprises at least one of: a kinematic model based on a drive mechanism of the robot 102, a format of the at least one control signal, a format of a feedback received from the robot 102, physical dimensions of the robot 102, constraints of the robot 102, a maximum speed of the robot 102, a maximum acceleration of the robot 102, a plurality of specifications of at least one another communication interface 112 onboard the robot 102.
Throughout the present disclosure, the term "kinematic model" refers to a representation (for example, a mathematical representation) based on characteristics of the drive mechanism. Herein, the term "drive mechanism" refers to components (for example, such as, wheels, tracks, legs) responsible for a movement of the robot 102. The kinematic model accounts for factors associated with the movement of the robot 102. Examples of the drive mechanism may include, but are not limited to, a differential drive mechanism, a skid steering drive mechanism, an Ackermann steering drive mechanism. For example, the drive mechanism may be a differential drive mechanism. The kinematic model for the robot 102 using this drive mechanism may define how a position or an orientation of the robot 102 may change with respect to time based on a velocity with which the robot 102 moves.
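As a minimal, non-limiting sketch of such a kinematic model, a single time-step of the unicycle approximation for a differential-drive robot is shown below: given a pose and commanded linear/angular velocities, it returns how the position and orientation change over a small time interval. The function name and time-step are illustrative assumptions.

```python
import math

def differential_drive_step(pose, linear_v, angular_v, dt):
    """One time-step of a differential-drive kinematic model.

    pose is (x, y, theta); returns the pose after dt seconds given
    the commanded linear velocity (m/s) and angular velocity (rad/s).
    """
    x, y, theta = pose
    x += linear_v * math.cos(theta) * dt
    y += linear_v * math.sin(theta) * dt
    theta += angular_v * dt
    return (x, y, theta)
```

For instance, a robot at the origin facing along the x-axis and commanded with 1.0 m/s linear velocity and zero angular velocity advances 2 metres along x in 2 seconds.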
The phrase "format of the at least one control signal" refers to the structure or arrangement of data that constitutes the at least one control signal. The format may include how this data is organized, encoded, and transmitted from the at least one first processor 116 to the second processor. The phrase "format of the feedback received from the robot 102" refers to a structure and arrangement of information sent back by the plurality of actuators of the robot 102. Moreover, the physical dimensions of the robot 102 include parameters such as at least one of: a length, a width, and a height, a weight of the robot 102. The constraints of the robot 102 comprises at least one of: a maximum turning radius, a minimum clearance height, restrictions on certain types of terrain.
The term "maximum speed of the robot" refers to a highest achievable speed that the robot can attain, when in use. The maximum speed of the robot is an essential parameter for controlling the robot’s movement and ensuring that the robot 102 operates within safe and predefined speed limits. Subsequently, the term "maximum acceleration of the robot 102" refers to the highest rate at which the robot 102 can change its velocity, when in use. Advantageously, setting constraints, maximum speed, and maximum acceleration ensures that the kit 100 generates control signals that keep the robot's motion within safe operational limits. This contributes to safe and controlled navigation, preventing excessive speeds or accelerations that could lead to undesirable outcomes.
The phrase "plurality of specifications of at least one another communication interface" refers to detailed specifications of additional communication interfaces present on the robot 102, which are different from the at least one communication interface 112. A technical effect of specifying the specification data is that it facilitates the at least one first processor 116 to provide a customized and adaptable retrofitting solution that enhances the autonomous capabilities of the robot 102 by considering its specific characteristics, constraints, and communication requirements.
Optionally, when the specification data comprises the plurality of specifications of the at least one another communication interface 112 onboard the robot 102, the at least one first processor 116 is further configured to selectively load any one of: a communication interface plugin, a library, that is associated with the at least one another communication interface 112, to enable establishment of at least one communication channel between the kit 100 and the robot 102.
Throughout the present disclosure, the term "communication interface plugin" refers to a modular piece of software designed to extend or enhance the capabilities of the kit 100 in communicating with specific types of communication interfaces. This communication interface plugin can be dynamically loaded or plugged into a software framework of the kit 100. Each communication interface plugin is customized to support a particular communication protocol or interface. Herein, the term "library" refers to a set of pre-compiled software modules or functions that offer standardized and reusable routines for communication with the specific communication interface or protocol. In this regard, when the specification data includes details about the at least one another communication interface onboard the robot 102 (e.g., its type, protocol, communication parameters), this information is essential for the at least one first processor 116 of the kit 100 to establish communication with the second processor 118 of the robot 102. The at least one first processor 116 is designed to adapt to different communication scenarios; thereby, based on the provided specifications, the at least one first processor 116 can selectively load either the communication interface plugin or the library associated with the identified communication interface. It will be appreciated that different robots may employ diverse communication interfaces like proprietary protocols, custom hardware etc.
The at least one first processor 116, upon receiving the specification data, selectively loads the at least one of: the communication interface plugin, the library specifically designed for the at least one another communication interface. This facilitates establishment of the communication channel allowing seamless interaction between the at least one first processor 116 and the second processor 118. A technical effect of configuring the at least one first processor 116 in such a manner is that it allows the at least one first processor 116 to establish communication channels in a dynamic manner, with different robots, thereby enhancing the kit's versatility.
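The selective-loading behaviour described above may be sketched, purely for illustration, with registries of loader callables keyed by protocol name. The protocol names, registry contents, and return values are hypothetical placeholders, not part of the disclosure; a real implementation might instead load modules dynamically (for example, via Python's importlib).

```python
def load_comm_support(interface_spec, plugin_registry, library_registry):
    """Selectively load a communication interface plugin or a library.

    interface_spec is the specification of the robot's onboard
    interface (e.g. {"protocol": "CAN"}); each registry maps a
    protocol name to a loader callable. Plugins are preferred, with
    pre-compiled libraries as a fallback.
    """
    protocol = interface_spec["protocol"]
    if protocol in plugin_registry:
        return plugin_registry[protocol]()   # dynamically load the plugin
    if protocol in library_registry:
        return library_registry[protocol]()  # fall back to a library wrapper
    raise ValueError(f"unsupported communication interface: {protocol}")

channel = load_comm_support(
    {"protocol": "CAN"},
    plugin_registry={"CAN": lambda: "CAN plugin loaded"},
    library_registry={"UART": lambda: "UART library loaded"},
)
```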
Optionally, the at least one first processor 116 is further configured to:
process at least one raw data stream of at least one of: a log of processing outputs of the at least one first processor 116, a log of navigation missions undertaken by the robot 102, information pertaining to navigation missions undertaken by the robot 102, to filter the at least one raw data stream, based on predefined criteria; and
generate a package comprising the at least one raw data stream that is filtered.
Throughout the present disclosure, the term "raw data stream" refers to the unprocessed and unfiltered data that is generated during the robot’s operation, specifically related to movements and performance of the robot 102. Herein, the term "log" refers to a systematic recording of information over a period of time, while the robot 102 is in operation. Furthermore, the phrase "log of processing outputs" is a detailed record that captures the information and decisions made by the at least one first processor 116 of the kit 100. Subsequently, in the context of autonomous robotic navigation, decisions can be rule-based, or decisions of an artificial intelligence (AI) model executed by the at least one first processor 116. The term "log of navigation missions" refers to the recorded information related to the execution of various navigation tasks or missions undertaken by the robot 102.
In this regard, the kit 100 is equipped with the capability to handle at least one raw data stream. These raw data streams originate from various sources, including logs of processing outputs from the at least one first processor 116 or logs/information related to navigation missions undertaken by the robot 102. Subsequently, the raw data streams are subjected to a filtering process which involves applying predefined criteria to selectively include or exclude specific data elements from the raw data streams. The predefined criteria may be predetermined based on the requirements or objectives of the user. Thereby, after filtering the raw data streams, the at least one first processor 116 generates the package containing the processed and filtered data, organized in a structured manner. Such processing is made possible using the data engine module comprised in the at least one first processor 116.
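As a non-limiting sketch of this filtering-and-packaging step, the function below applies a user-defined predicate (standing in for the predefined criteria) to each record of each raw data stream and serializes the result. The stream names, record fields, and JSON packaging are assumptions made for the example.

```python
import json

def package_filtered_logs(raw_streams, predefined_criteria):
    """Filter raw data streams by predefined criteria and package the result.

    raw_streams maps a stream name (e.g. "navigation_missions") to a
    list of record dicts; predefined_criteria is a predicate applied
    to each record. The JSON string returned stands in for the
    generated package.
    """
    filtered = {
        name: [record for record in records if predefined_criteria(record)]
        for name, records in raw_streams.items()
    }
    return json.dumps(filtered, sort_keys=True)

package = package_filtered_logs(
    {"navigation_missions": [
        {"mission": 1, "status": "completed"},
        {"mission": 2, "status": "aborted"},
    ]},
    predefined_criteria=lambda record: record["status"] == "completed",
)
```

Here, only records satisfying the criteria survive into the package, so a user reviewing the package sees the condensed, relevant subset rather than the full raw stream.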
The at least one first processor 116 is configured to extract meaningful insights, and then organize or package them in a structured format for further use. Subsequently, one of the primary tasks of the data engine is to ensure that relevant and valuable data is collected. This may involve filtering out irrelevant information and focusing on capturing data that is essential for the intended purposes of the robot 102. Furthermore, the collected data serves multiple purposes, including analytics, crashlytics, and training.
A technical effect involves the kit’s ability to manage and refine large volumes of raw data generated during robot’s operations. By processing and filtering this data, the kit 100 provides a user with condensed and relevant information, facilitating efficient analysis, or monitoring of the robot’s behaviour and missions.
Referring to FIG. 2, there is illustrated an internal view of an exemplary kit 200 retrofitted on an exemplary robot 202, in accordance with an embodiment of the present disclosure. The kit 200 comprises a housing 204, a sensor arrangement 206, a communication arrangement 206, and a processing arrangement 208. The robot 202 comprises the kit 200, a second processor 210, and a plurality of actuators (depicted as actuators 212A, 212B, 212C, and 212D). The robot 202 is capable of autonomous robotic navigation.
It may be understood by a person skilled in the art that FIG. 2 shows a simplified architecture of the kit 200 for the sake of clarity, which should not unduly limit the scope of the claims herein. The person skilled in the art will recognize many variations, alternatives, and modifications of embodiments of the present disclosure.
Referring to FIG. 3, illustrated are steps of a method for autonomous navigation using a kit that is capable of being retrofitted on a robot, in accordance with an embodiment of the present disclosure. At step 302, specification data that is related to a configuration of the robot is received. At step 304, sensor data, collected by the plurality of sensors, is received, wherein the sensor data pertains to a state of the robot and to an environment surrounding the robot. At step 306, the sensor data is processed to determine the state of the robot with respect to a frame of reference and to generate an occupancy grid map of the environment. At step 308, a cost map of traversing the environment is generated, based at least on the occupancy grid map. At step 310, a navigation plan of the robot is generated, based on the cost map and a navigation objective. At step 312, at least one control signal for moving the robot is generated, based on the navigation plan and the specification data. At step 314, the at least one control signal is sent to a second processor of the robot, wherein when the at least one control signal is executed by the second processor, the robot moves according to the navigation plan for achieving the navigation objective.
The aforementioned steps are only illustrative and other alternatives can also be provided where one or more steps are added, one or more steps are removed, or one or more steps are provided in a different sequence without departing from the scope of the claims herein.
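Purely as an illustrative sketch of the flow of steps 302 to 314, the pipeline might look as below. All functions, data layouts, and numeric choices here are assumptions for illustration, not the claimed implementation.

```python
# Step 306: process sensor data into a robot state (pose) and an occupancy
# grid of the environment (1 = occupied cell, 0 = free cell).
def process_sensor_data(sensor_data):
    return sensor_data["pose"], sensor_data["occupancy_grid"]

# Step 308: generate a cost map from the occupancy grid; occupied cells are
# given a prohibitively high traversal cost, free cells a unit cost.
def generate_cost_map(occupancy_grid):
    return [[1000 if cell else 1 for cell in row] for row in occupancy_grid]

# Step 310: generate a navigation plan (a trivial two-point plan standing in
# for a real planner that would search the cost map).
def generate_navigation_plan(cost_map, state, objective):
    return [state, objective]

# Step 312: generate control signals bounded by the specification data; these
# would then be sent to the robot's second processor (step 314).
def generate_control_signals(plan, specification):
    return [{"linear_v": min(0.5, specification["max_speed"]), "angular_v": 0.0}
            for _ in plan[1:]]

specification = {"max_speed": 0.3}                       # step 302
sensor_data = {"pose": (0.0, 0.0),                       # step 304
               "occupancy_grid": [[0, 1], [0, 0]]}
state, grid = process_sensor_data(sensor_data)           # step 306
cost_map = generate_cost_map(grid)                       # step 308
plan = generate_navigation_plan(cost_map, state, (1.0, 1.0))  # step 310
signals = generate_control_signals(plan, specification)  # step 312
```

Note how the specification data caps the commanded velocity, so the same kit can drive robots with different capabilities.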
The present disclosure also relates to the aforementioned second aspect as described above. Various embodiments and variants disclosed above, with respect to the aforementioned first aspect, apply mutatis mutandis to the aforementioned second aspect.
Optionally, the method further comprises:
receiving a feedback from a plurality of actuators of the robot, via the second processor of the robot;
determining movement information of the robot, by translating the feedback to odometry information of the robot; and
determining a current position of the robot, by fusing the odometry information with the sensor data,
wherein the at least one control signal is generated based also on the current position of the robot.
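One simple way to realise this feedback-to-odometry translation and fusion is sketched below for a differential-drive robot with wheel encoders. The encoder model and the complementary-filter fusion are assumptions for illustration; the disclosure does not limit the fusion technique used.

```python
def feedback_to_odometry(ticks_left, ticks_right, ticks_per_metre, wheel_base):
    """Translate actuator feedback (wheel encoder ticks) into odometry
    information: distance travelled and change in heading."""
    d_left = ticks_left / ticks_per_metre
    d_right = ticks_right / ticks_per_metre
    distance = (d_left + d_right) / 2.0
    delta_theta = (d_right - d_left) / wheel_base
    return distance, delta_theta

def fuse_position(odometry_position, sensor_position, alpha=0.8):
    """Fuse the odometry-derived position with a sensor-derived position
    using a simple complementary filter weighted by alpha."""
    return tuple(alpha * o + (1.0 - alpha) * s
                 for o, s in zip(odometry_position, sensor_position))

# Both wheels advance 1000 ticks at 1000 ticks/metre: straight-line motion.
distance, delta_theta = feedback_to_odometry(1000, 1000,
                                             ticks_per_metre=1000,
                                             wheel_base=0.4)
current_position = fuse_position((1.0, 0.0), (1.2, 0.0))
```

The fused current position then feeds back into control-signal generation, as stated above.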
Optionally, the kit further comprises at least one input device communicably coupled to the at least one first processor via the at least one communication interface, the method further comprising receiving at least one user input pertaining to the specification data, via the at least one input device, wherein the at least one user input is provided by a user associated with the at least one input device.
Optionally, the method further comprises:
processing the at least one user input to determine a drive mechanism of the robot specified therein;
determining whether the drive mechanism matches any pre-defined drive mechanism amongst a plurality of pre-defined drive mechanisms;
when it is determined that the drive mechanism matches a pre-defined drive mechanism amongst the plurality of pre-defined drive mechanisms, accessing a pre-defined kinematic model based on the pre-defined drive mechanism; and
when it is determined that the drive mechanism does not match any pre-defined drive mechanism amongst the plurality of pre-defined drive mechanisms, receiving an additional user input pertaining to a customised plugin for the drive mechanism, wherein an input of the customised plugin is odometry information and an output of the customised plugin is at least one of: linear velocity, angular velocity, of the robot.
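The matching logic above might be sketched as a registry lookup with a customised-plugin fallback. The mechanism names, the plugin signature, and the model contents are assumptions for illustration only.

```python
# Pre-defined kinematic models keyed by drive mechanism. Each model maps
# odometry information to (linear velocity, angular velocity).
PREDEFINED_KINEMATIC_MODELS = {
    "differential": lambda odometry: (odometry["v"], odometry["w"]),
    "ackermann": lambda odometry: (odometry["v"],
                                   odometry["v"] * odometry["curvature"]),
}

def resolve_kinematic_model(user_input, customised_plugin=None):
    """Access a pre-defined kinematic model when the specified drive mechanism
    matches; otherwise fall back to the user-supplied customised plugin."""
    mechanism = user_input["drive_mechanism"]
    if mechanism in PREDEFINED_KINEMATIC_MODELS:
        return PREDEFINED_KINEMATIC_MODELS[mechanism]
    if customised_plugin is None:
        raise ValueError(f"unknown drive mechanism {mechanism!r}: "
                         "a customised plugin is required")
    return customised_plugin

model = resolve_kinematic_model({"drive_mechanism": "differential"})
linear_v, angular_v = model({"v": 0.4, "w": 0.1})
```

A robot with an unlisted mechanism would instead supply its own plugin taking odometry information in and returning linear and/or angular velocity, matching the interface described above.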
Optionally, the specification data comprises a plurality of specifications of at least one another communication interface onboard the robot, the method further comprising selectively loading any one of: a communication interface plugin, a library, that is associated with the at least one another communication interface, to enable establishment of at least one communication channel between the kit and the robot.
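Selective loading of a communication interface plugin or library could look like the following registry-based sketch. The interface names and plugin classes are purely illustrative assumptions.

```python
class CanInterface:
    """Hypothetical plugin for a CAN bus interface onboard the robot."""
    def establish_channel(self):
        return "channel:can"

class UartInterface:
    """Hypothetical plugin for a UART serial interface onboard the robot."""
    def establish_channel(self):
        return "channel:uart"

# Registry of available communication interface plugins/libraries.
COMM_PLUGINS = {"can": CanInterface, "uart": UartInterface}

def load_comm_plugin(specifications):
    """Selectively load the plugin associated with the communication
    interface described in the robot's specification data."""
    interface = specifications["interface_type"]
    if interface not in COMM_PLUGINS:
        raise ValueError(f"no plugin or library available for {interface!r}")
    return COMM_PLUGINS[interface]()

channel = load_comm_plugin({"interface_type": "can"}).establish_channel()
```

Only the plugin matching the robot's onboard interface is instantiated, which is what enables one kit to talk to robots with differing communication hardware.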
Optionally, the method further comprises:
processing at least one raw data stream of at least one of: a log of processing outputs of the at least one first processor, a log of navigation missions undertaken by the robot, information pertaining to navigation missions undertaken by the robot, to filter the at least one raw data stream, based on predefined criteria; and
generating a package comprising the at least one raw data stream that is filtered.
CLAIMS
What is claimed is:
1. A kit that is capable of being retrofitted on a robot, the kit comprising:
a housing;
a sensor arrangement comprising a plurality of sensors, wherein the plurality of sensors are detachably attached to the housing, to provide form factor flexibility to the kit;
a communication arrangement comprising at least one communication interface, wherein the at least one communication interface is detachably attached to the housing, to provide form factor flexibility to the kit;
a processing arrangement comprising at least one first processor, the at least one first processor being communicably coupled with the sensor arrangement and a second processor of the robot, via the at least one communication interface, wherein the at least one first processor is configured to:
receive specification data that is related to a configuration of the robot;
receive sensor data, collected by the plurality of sensors, wherein the sensor data pertains to a state of the robot and to an environment surrounding the robot;
process the sensor data to determine the state of the robot with respect to a frame of reference and to generate an occupancy grid map of the environment;
generate a cost map of traversing the environment, based at least on the occupancy grid map;
generate a navigation plan for the robot, based on the cost map and a navigation objective;
generate at least one control signal for moving the robot, based on the navigation plan and the specification data; and
send the at least one control signal to the second processor of the robot, wherein when the at least one control signal is executed by the second processor, the robot moves according to the navigation plan for achieving the navigation objective.
2. The kit as claimed in claim 1, wherein for flexibly transitioning the kit between different form factors, at least one of: a quantity, an arrangement, of constituents of at least one of: the sensor arrangement, the communication arrangement, the processing arrangement, is modifiable.
3. The kit as claimed in claim 1, wherein the at least one first processor is further configured to:
receive a feedback from a plurality of actuators of the robot, via the second processor of the robot;
determine movement information of the robot, by translating the feedback to odometry information of the robot; and
determine a current position of the robot, by fusing the odometry information with the sensor data,
wherein the at least one control signal is generated based also on the current position of the robot.
4. The kit as claimed in claim 1, further comprising at least one input device communicably coupled to the at least one first processor via the at least one communication interface, wherein the at least one first processor is further configured to receive at least one user input pertaining to the specification data, via the at least one input device, wherein the at least one user input is provided by a user associated with the at least one input device.
5. The kit as claimed in claim 4, wherein the at least one first processor is further configured to:
process the at least one user input to determine a drive mechanism of the robot specified therein;
determine whether the drive mechanism matches any pre-defined drive mechanism amongst a plurality of pre-defined drive mechanisms;
when it is determined that the drive mechanism matches a pre-defined drive mechanism amongst the plurality of pre-defined drive mechanisms, access a pre-defined kinematic model based on the pre-defined drive mechanism; and
when it is determined that the drive mechanism does not match any pre-defined drive mechanism amongst the plurality of pre-defined drive mechanisms, receive an additional user input pertaining to a customised plugin for the drive mechanism, wherein an input of the customised plugin is odometry information and an output of the customised plugin is at least one of: linear velocity, angular velocity, of the robot.
6. The kit as claimed in claim 1, wherein the specification data comprises at least one of: a kinematic model based on a drive mechanism of the robot, a format of the at least one control signal, a format of a feedback received from the robot, physical dimensions of the robot, constraints of the robot, a maximum speed of the robot, a maximum acceleration of the robot, a plurality of specifications of at least one another communication interface onboard the robot.
7. The kit as claimed in claim 6, wherein when the specification data comprises the plurality of specifications of the at least one another communication interface onboard the robot, the at least one first processor is further configured to selectively load any one of: a communication interface plugin, a library, that is associated with the at least one another communication interface, to enable establishment of at least one communication channel between the kit and the robot.
8. The kit as claimed in claim 1, wherein the at least one first processor is further configured to:
process at least one raw data stream of at least one of: a log of processing outputs of the at least one first processor, a log of navigation missions undertaken by the robot, information pertaining to navigation missions undertaken by the robot, to filter the at least one raw data stream, based on predefined criteria; and
generate a package comprising the at least one raw data stream that is filtered.
9. The kit as claimed in claim 1, wherein the plurality of sensors comprises at least two of: an image sensor, a motion sensor, a distance sensor.
10. A method for autonomous robotic navigation, using a kit that is capable of being retrofitted on a robot, wherein the kit comprises a housing that is adjustable to provide form factor flexibility to the kit, a sensor arrangement comprising a plurality of sensors that are detachably attached to the housing, and a communication arrangement comprising at least one communication interface that is detachably attached to the housing,
and wherein the method comprises:
receiving specification data that is related to a configuration of the robot;
receiving sensor data, collected by the plurality of sensors, wherein the sensor data pertains to a state of the robot and to an environment surrounding the robot;
processing the sensor data to determine the state of the robot with respect to a frame of reference and to generate an occupancy grid map of the environment;
generating a cost map of traversing the environment, based at least on the occupancy grid map;
generating a navigation plan for the robot, based on the cost map and a navigation objective;
generating at least one control signal for moving the robot, based on the navigation plan and the specification data; and
sending the at least one control signal to a second processor of the robot, wherein when the at least one control signal is executed by the second processor, the robot moves according to the navigation plan for achieving the navigation objective.
11. The method as claimed in claim 10, further comprising:
receiving a feedback from a plurality of actuators of the robot, via the second processor of the robot;
determining movement information of the robot, by translating the feedback to odometry information of the robot; and
determining a current position of the robot, by fusing the odometry information with the sensor data,
wherein the at least one control signal is generated based also on the current position of the robot.
12. The method as claimed in claim 10, wherein the kit further comprises at least one input device communicably coupled to the at least one first processor via the at least one communication interface, the method further comprising receiving at least one user input pertaining to the specification data, via the at least one input device, wherein the at least one user input is provided by a user associated with the at least one input device.
13. The method as claimed in claim 12, further comprising:
processing the at least one user input to determine a drive mechanism of the robot specified therein;
determining whether the drive mechanism matches any pre-defined drive mechanism amongst a plurality of pre-defined drive mechanisms;
when it is determined that the drive mechanism matches a pre-defined drive mechanism amongst the plurality of pre-defined drive mechanisms, accessing a pre-defined kinematic model based on the pre-defined drive mechanism; and
when it is determined that the drive mechanism does not match any pre-defined drive mechanism amongst the plurality of pre-defined drive mechanisms, receiving an additional user input pertaining to a customised plugin for the drive mechanism, wherein an input of the customised plugin is odometry information and an output of the customised plugin is at least one of: linear velocity, angular velocity, of the robot.
14. The method as claimed in claim 10, wherein the specification data comprises a plurality of specifications of at least one another communication interface onboard the robot, the method further comprising selectively loading any one of: a communication interface plugin, a library, that is associated with the at least one another communication interface, to enable establishment of at least one communication channel between the kit and the robot.
15. The method as claimed in claim 10, further comprising:
processing at least one raw data stream of at least one of: a log of processing outputs of the at least one first processor, a log of navigation missions undertaken by the robot, information pertaining to navigation missions undertaken by the robot, to filter the at least one raw data stream, based on predefined criteria; and
generating a package comprising the at least one raw data stream that is filtered.

Documents

Application Documents

# Name Date
1 202441012172-STATEMENT OF UNDERTAKING (FORM 3) [21-02-2024(online)].pdf 2024-02-21
2 202441012172-POWER OF AUTHORITY [21-02-2024(online)].pdf 2024-02-21
3 202441012172-FORM FOR STARTUP [21-02-2024(online)].pdf 2024-02-21
4 202441012172-FORM FOR SMALL ENTITY(FORM-28) [21-02-2024(online)].pdf 2024-02-21
5 202441012172-FORM 1 [21-02-2024(online)].pdf 2024-02-21
6 202441012172-FIGURE OF ABSTRACT [21-02-2024(online)].pdf 2024-02-21
7 202441012172-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [21-02-2024(online)].pdf 2024-02-21
8 202441012172-EVIDENCE FOR REGISTRATION UNDER SSI [21-02-2024(online)].pdf 2024-02-21
9 202441012172-DRAWINGS [21-02-2024(online)].pdf 2024-02-21
10 202441012172-DECLARATION OF INVENTORSHIP (FORM 5) [21-02-2024(online)].pdf 2024-02-21
11 202441012172-COMPLETE SPECIFICATION [21-02-2024(online)].pdf 2024-02-21
12 202441012172-Proof of Right [22-02-2024(online)].pdf 2024-02-22
13 202441012172-FORM-26 [22-02-2024(online)].pdf 2024-02-22