
A Robotic System And Method Thereof

Abstract: The present invention relates to a robotic system and method thereof. More particularly, the present invention relates to a robotic system which is capable of identifying, localizing, grasping, and transporting a plurality of items by means of an actuated framing, a shelf system, and a plurality of sensors, all in conjunction with the said robotic arm which utilizes the said end-effector and the grasper. Figure 1


Patent Information

Application #
Filing Date
29 September 2022
Publication Number
18/2024
Publication Type
INA
Invention Field
BIO-MEDICAL ENGINEERING
Status
Email
Parent Application

Applicants

Pankaj Patil
403, Department of Electronics Engineering, Ramrao Adik Institute of Technology, D Y Patil deemed to be University, Nerul, Navi Mumbai - 400706
Kaival Trapasia
403, Department of Electronics Engineering, Ramrao Adik Institute of Technology, D Y Patil deemed to be University, Nerul, Navi Mumbai - 400706
Pradyumn Pathak
403, Department of Electronics Engineering, Ramrao Adik Institute of Technology, D Y Patil deemed to be University, Nerul, Navi Mumbai - 400706
Dr. Vishwesh A. Vyawahare
403, Department of Electronics Engineering, Ramrao Adik Institute of Technology, D Y Patil deemed to be University, Nerul, Navi Mumbai - 400706
Ms. Divya K. Shah
403, Department of Electronics Engineering, Ramrao Adik Institute of Technology, D Y Patil deemed to be University, Nerul, Navi Mumbai - 400706
Ramrao Adik Institute of Technology, DY PATIL DEEMED TO BE UNIVERSITY
DY Patil Deemed to be University, Nerul, Navi Mumbai 400 706, Maharashtra [Email: deanrnd@rait.ac.in]

Inventors

1. Pankaj Patil
403, Department of Electronics Engineering, Ramrao Adik Institute of Technology, D Y Patil deemed to be University, Nerul, Navi Mumbai - 400706
2. Kaival Trapasia
403, Department of Electronics Engineering, Ramrao Adik Institute of Technology, D Y Patil deemed to be University, Nerul, Navi Mumbai - 400706
3. Pradyumn Pathak
403, Department of Electronics Engineering, Ramrao Adik Institute of Technology, D Y Patil deemed to be University, Nerul, Navi Mumbai - 400706
4. Dr. Vishwesh A. Vyawahare
403, Department of Electronics Engineering, Ramrao Adik Institute of Technology, D Y Patil deemed to be University, Nerul, Navi Mumbai - 400706
5. Ms. Divya K. Shah
403, Department of Electronics Engineering, Ramrao Adik Institute of Technology, D Y Patil deemed to be University, Nerul, Navi Mumbai - 400706

Specification

FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(See section 10 and rule 13)
1. TITLE OF THE INVENTION
A ROBOTIC SYSTEM AND METHOD THEREOF
2. APPLICANT(S):
a) Pankaj Patil, Kaival Trapasia, Pradyumn Pathak, Dr. Vishwesh A. Vyawahare, and Ms. Divya K. Shah
b) All Indian Nationals
c) 403, Department of Electronics Engineering, Ramrao Adik Institute of Technology, D Y Patil deemed to be University, Nerul, Navi Mumbai - 400706
a) Ramrao Adik Institute of Technology, DY Patil deemed to be University
b) An Indian Educational Institute
c) Ramrao Adik Institute of Technology, D Y Patil deemed to be University, Nerul, Navi Mumbai – 400706
3. PREAMBLE TO THE DESCRIPTION
The following specification particularly describes the invention and the manner in which it is to be performed.

FIELD OF INVENTION:
The present invention relates to a robotic system and method thereof. More particularly, the present invention relates to a robotic system which is capable of identifying, localizing, grasping, and transporting a plurality of items by means of an actuated framing, a shelf system, and a plurality of sensors, all in conjunction with the said robotic arm which utilizes the said end-effector and the grasper.
BACKGROUND OF INVENTION:
Today, across all sectors, manual labour alone consumes 31% of effort time in producing any material - the largest share among all types of tasks, despite years of advances in automation. That is because, to date, only repetitive tasks handling simple, symmetrical objects have been technically feasible to automate.
While every kind of manual labour can be generalized as a series of repetitive pick-orient-place actions on objects, these objects in the real world are present in considerable clutter and randomness. This makes the physical task unpredictable and the automation of such tasks difficult or impossible.
The field of robotic manipulation comprises numerous fully integrated (mobile and non-mobile) systems designed for performing grasping and manipulation tasks. Although the existing manipulation platforms share similarities in hardware design, they are mainly distinguished by their software architecture, the variety of manipulation tasks they can accomplish, and their level of autonomy.
Patent US5447403A discloses an anatomically correct, humanlike, mechanical arm and hand that an operator can control to perform with the dexterity and compliance of a human hand. Being humanlike and robotic enhances the device's control and gripper dexterity. Control of the movement of the arm and hand is performed or guided by a "teachglove" worn by the operator. As he or she performs some hand manipulation, a controller stores signals from sensors on the exoskeleton. The sensors monitor the operator's finger-joint movement positions. These values are later translated into actuator control signals for servomotors, eventually duplicating the operator's movement.
However, there is a need for an improved robotic arm which is capable of identifying, localizing, grasping, and transporting a plurality of items in a more controlled manner without any human effort. More particularly, there is a need to establish and implement the full working system with all hardware installed and controlled. Using a multi-dimensional segregation approach, coordinated and adaptive acquisition enabled by an integrated eye-brain platform, and deep neural network models reinforced with motion feedback, the system can accurately pick and place even objects of complex geometry, even when presented in a random bin with occlusions and entanglements.
OBJECT OF THE INVENTION:
The primary objective of the present invention is to disclose a robotic system and method thereof.
Another primary objective of the present invention is to disclose the robotic system and method wherein a controller is configured in the said robotic arm for receiving the sensor signals, wherein the said controller includes: tangible, non-transitory memory on which is recorded a plurality of classes of sensory signals and computer-executable instructions for executing a plurality of tasks; and an operating system configured for executing the instructions from the tangible, non-transitory memory to thereby cause the controller to perform control, motion planning and networking of the said robotic arm.
Yet another objective of the present invention is to disclose the robotic system and method wherein a trajectory planning mechanism is used for specifying the motion of the said robot as it traverses the path.
Yet another objective of the present invention is to disclose the robotic system and method wherein the said robotic system is capable of identifying, localizing, grasping and transporting a plurality of items by means of an actuated framing, shelf system, and a plurality of sensors, all in conjunction with the said robotic arm which utilizes the said end-effector and the grasper.
Yet another objective of the present invention is to disclose the robotic system and method wherein the response time to identify, localize, grasp and transport a plurality of items is less than 60 seconds.
Other objects and advantages of the present invention will become apparent from the following description taken in connection with the accompanying drawings, wherein, by way of illustration and example, the aspects of the present invention are disclosed.

SUMMARY OF INVENTION:
In accordance with the present invention, a robotic system and method is disclosed wherein the said system comprises a robotic arm, and the said robotic arm comprises: a plurality of actuators, an optical encoder, a gearing system, a plurality of motors, an image capturing device, and a plurality of sensors. A controller, in the present invention, is configured in the said robotic arm for receiving the sensor signals, wherein the said controller includes: tangible, non-transitory memory on which is recorded a plurality of classes of sensory signals and computer-executable instructions for executing a plurality of tasks; and an operating system configured for executing the instructions from the tangible, non-transitory memory to thereby cause the controller to perform control, motion planning and networking of the said robotic arm.
In accordance with the present invention, a plurality of serial links are affixed between a plurality of joints, from a base frame through to an end-effector of the said robotic arm; and wherein the position and orientation of the said end-effector is calculated, and a coordinate frame is attached to each joint of the said end-effector to determine the Denavit-Hartenberg (DH) parameters and the movement of the joints of the said end-effector, and a trajectory planning mechanism is used for specifying the motion of the said robot as it traverses the path.
Further, a first data processing framework is disclosed, wherein the said first data processing framework comprises a filtering option, a feature estimating option and a surface reconstructing option; wherein the said plurality of options are used to filter outliers from noisy data, stitch 3D point clouds together, segment relevant parts of a scene, and extract key points and compute descriptors to recognize objects. A second data processing framework is also disclosed, wherein the said second data processing framework assists in recognizing a plurality of small objects, and the said second data processing framework comprises a feature extraction technique and a filtration technique for image recognition.
Moreover, in accordance with the present invention, the said robotic system and method thereof is capable of identifying, localizing, grasping and transporting a plurality of items by means of an actuated framing, shelf system, and a plurality of sensors, all in conjunction with the said robotic arm which utilizes the said end-effector and the grasper.

Other objects and advantages of the present invention will become apparent from the following description taken in connection with the accompanying drawings, wherein, by way of illustration and example, the aspects of the present invention are disclosed.
BRIEF DESCRIPTION OF DRAWINGS:
The present invention will be better understood after reading the following detailed description of the presently preferred aspects thereof with reference to the appended drawings, in which the features, other aspects and advantages of certain exemplary embodiments of the invention will become more apparent:
Figure 1 illustrates a preferred embodiment of the robotic arm.
Figure 2 illustrates an interface for object pose detection.
Figure 3 illustrates an interface for encoder feedback.
Figure 4 illustrates an interface for encoder feedback using joint states.
DETAILED DESCRIPTION:
The following description describes various features and functions of the disclosed device and methods with reference to the accompanying figures. In the figures, similar symbols identify similar components, unless context dictates otherwise. The illustrative aspects described herein are not meant to be limiting. It may be readily understood that certain aspects of the disclosed system, method and apparatus can be arranged and combined in a wide variety of different configurations, all of which are contemplated herein.
These and other features and advantages of the present invention may be incorporated into certain embodiments of the invention and will become more fully apparent from the following description and claims or may be learned by the practice of the invention as set forth hereinafter.
Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope of the invention. In addition, descriptions of well-known functions and constructions are omitted for clarity and conciseness.

The terms and words used in the following description and claims are not limited to their bibliographical meanings, but are merely used to enable a clear and consistent understanding of the invention. Accordingly, it should be apparent to those skilled in the art that the following description of exemplary embodiments of the present invention is provided for illustration purposes only and not for the purpose of limiting the invention.
It is to be understood that the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise.
It should be emphasized that the term “comprises/comprising” when used in this specification is taken to specify the presence of stated features, integers, steps or components but does not preclude the presence or addition of one or more other features, integers, steps, components or groups thereof.
In accordance with the present invention, a robotic system and method is disclosed wherein the said system comprises a robotic arm, and the said robotic arm comprises: a plurality of actuators, an optical encoder, a gearing system, a plurality of motors, an image capturing device, and a plurality of sensors. A controller, in the present invention, is configured in the said robotic arm for receiving the sensor signals, wherein the said controller includes: tangible, non-transitory memory on which is recorded a plurality of classes of sensory signals and computer-executable instructions for executing a plurality of tasks; and an operating system configured for executing the instructions from the tangible, non-transitory memory to thereby cause the controller to perform control, motion planning and networking of the said robotic arm.
In accordance with the present invention, a plurality of serial links are affixed between a plurality of joints, from a base frame through to an end-effector of the said robotic arm; and wherein the position and orientation of the said end-effector is calculated, and a coordinate frame is attached to each joint of the said end-effector to determine the DH parameters and the movement of the joints of the said end-effector, and a trajectory planning mechanism is used for specifying the motion of the said robot as it traverses the path.
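As general background to the preceding paragraph, the classic Denavit-Hartenberg convention attaches a coordinate frame to each joint and relates consecutive frames by four parameters per joint: the joint angle θ_i, link offset d_i, link length a_i and link twist α_i. The expression below is the standard textbook form and is shown only for illustration; the specification does not state the arm's particular parameter values.

\[
A_i =
\begin{bmatrix}
\cos\theta_i & -\sin\theta_i\cos\alpha_i & \sin\theta_i\sin\alpha_i & a_i\cos\theta_i \\
\sin\theta_i & \cos\theta_i\cos\alpha_i & -\cos\theta_i\sin\alpha_i & a_i\sin\theta_i \\
0 & \sin\alpha_i & \cos\alpha_i & d_i \\
0 & 0 & 0 & 1
\end{bmatrix},
\qquad
T_{\text{end-effector}} = A_1 A_2 \cdots A_n
\]

The product of the per-joint matrices gives the position and orientation of the end-effector with respect to the base frame.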
Further, a first data processing framework is disclosed, wherein the said first data processing framework comprises a filtering option, a feature estimating option and a surface reconstructing option; wherein the said plurality of options are used to filter outliers from noisy data, stitch 3D point clouds together, segment relevant parts of a scene, and extract key points and compute descriptors to recognize objects. A second data processing framework is also disclosed, wherein the said second data processing framework assists in recognizing a plurality of small objects, and the said second data processing framework comprises a feature extraction technique and a filtration technique for image recognition.
Moreover, in accordance with the present invention, the said robotic system and method thereof is capable of identifying, localizing, grasping and transporting a plurality of items by means of an actuated framing, shelf system, and a plurality of sensors, all in conjunction with the said robotic arm which utilizes the said end-effector and the grasper.
Figure 1. In accordance with Figure 1 of the present invention, a preferred embodiment of the said robotic arm is disclosed. The said robotic arm comprises a plurality of actuators, an optical encoder, a gearing system, a plurality of motors, an image capturing device, and a plurality of sensors.
Further, a controller is configured in the said robotic arm for receiving the sensor signals, wherein the said controller includes: tangible, non-transitory memory on which is recorded a plurality of classes of sensory signals and computer-executable instructions for executing a plurality of tasks; and an operating system configured for executing the instructions from the tangible, non-transitory memory to thereby cause the controller to perform control, motion planning and networking of the said robotic arm.
Figure 2 of the present invention discloses a first data processing framework, wherein the said first data processing framework comprises a filtering option, a feature estimating option and a surface reconstructing option, wherein the said plurality of options are used to filter outliers from noisy data, stitch 3D point clouds together, segment relevant parts of a scene, and extract key points and compute descriptors to recognize objects.
Figure 3 of the present invention discloses a second data processing framework, wherein the said second data processing framework assists in recognizing a plurality of small objects, and the said second data processing framework comprises a feature extraction technique and a filtration technique for image recognition.
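As an illustration of the kind of feature extraction and filtering step described for image recognition, the sketch below detects a small object by matching ORB features from a stored template against the current camera frame and filtering weak matches. The specification does not name a particular feature type or library; OpenCV, ORB and the file names are assumptions made only for this example.

import cv2

# Stored template of the object and current camera frame
# (file names are illustrative, not from the specification).
template = cv2.imread("object_template.png", cv2.IMREAD_GRAYSCALE)
scene = cv2.imread("camera_frame.png", cv2.IMREAD_GRAYSCALE)

# Extract keypoints and binary descriptors with ORB.
orb = cv2.ORB_create(nfeatures=1000)
kp_t, des_t = orb.detectAndCompute(template, None)
kp_s, des_s = orb.detectAndCompute(scene, None)

# Match descriptors, then filter weak matches with Lowe's ratio test.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
matches = matcher.knnMatch(des_t, des_s, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]

# A sufficient number of surviving matches indicates the small object
# is present in the scene.
print("filtered matches:", len(good))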
Path Planning and Trajectory Planning.

A trajectory planning mechanism is used for specifying the motion of the said robot as it traverses the path. In robotics, trajectory planning means moving from point A to point B while avoiding collisions over time; this can be computed with both discrete and continuous methods. Trajectory planning is a major area in robotics, as it paves the way for applications such as autonomous vehicles.
Trajectory planning is sometimes referred to as motion planning and, erroneously, as path planning. Trajectory planning is distinct from path planning in that it is parametrized by time. Essentially, trajectory planning encompasses path planning in addition to planning how to move based on velocity, time, and kinematics.
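To make the time-parametrization point concrete, the following sketch generates a cubic-polynomial joint trajectory with zero velocity at both ends. It is a generic illustration only; the specification does not state which planner the disclosed system uses.

import numpy as np

def cubic_trajectory(q0, qf, T, n=100):
    """Time-parametrized joint trajectory from q0 to qf over T seconds,
    with zero velocity at the start and end."""
    t = np.linspace(0.0, T, n)
    a2 = 3.0 * (qf - q0) / T**2
    a3 = -2.0 * (qf - q0) / T**3
    q = q0 + a2 * t**2 + a3 * t**3          # position along the path
    q_dot = 2.0 * a2 * t + 3.0 * a3 * t**2  # velocity profile over time
    return t, q, q_dot

# Example: move one joint from 0 rad to 1.2 rad in 2 seconds.
t, q, q_dot = cubic_trajectory(0.0, 1.2, 2.0)

Unlike a purely geometric path, the output specifies where the joint should be at every instant, which is precisely what distinguishes a trajectory from a path.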
Implementation of the operating system in the present invention.
The first main part of the invention is the Robot Operating System and the second main part is the Point Cloud Library.
Robot Operating System (ROS): The Robot Operating System (ROS) is a flexible framework for writing robot software. It is a collection of tools, libraries, and conventions that aim to simplify the task of creating complex and robust robot behaviour across a wide variety of robotic platforms.
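A minimal sketch of a ROS node, assuming ROS 1 with the rospy client library; the /joint_states topic is the conventional channel for encoder feedback of the kind shown in Figures 3 and 4, and the node name is an assumption made only for this example.

import rospy
from sensor_msgs.msg import JointState

def joint_state_callback(msg):
    # Encoder feedback arrives as joint names and positions.
    rospy.loginfo("Joint positions: %s", list(msg.position))

rospy.init_node("arm_feedback_listener")
rospy.Subscriber("/joint_states", JointState, joint_state_callback)
rospy.spin()  # process callbacks until the node is shut down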
Point Cloud Library (PCL): The Point Cloud Library (PCL) is a large-scale framework for 2D/3D image and point cloud processing. The PCL framework contains numerous state-of-the-art algorithms including filtering, feature estimation, surface reconstruction, registration, model fitting and segmentation.
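A minimal sketch of the filtering stage described above, assuming the python-pcl binding is available (the underlying PCL algorithms are more commonly used through the C++ API); the file names and thresholds are illustrative, not values from the specification.

import pcl

cloud = pcl.load("scene.pcd")  # captured point cloud (illustrative file name)

# Statistical outlier removal: discard points far from their neighbours.
outlier_filter = cloud.make_statistical_outlier_filter()
outlier_filter.set_mean_k(50)               # neighbours considered per point
outlier_filter.set_std_dev_mul_thresh(1.0)  # distance threshold in std deviations
filtered = outlier_filter.filter()

# Voxel-grid downsampling to speed up later segmentation and feature extraction.
voxel_filter = filtered.make_voxel_grid_filter()
voxel_filter.set_leaf_size(0.005, 0.005, 0.005)  # 5 mm voxels
downsampled = voxel_filter.filter()

pcl.save(downsampled, "scene_filtered.pcd")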

We Claim:
1. A robotic system, comprising:
a. a robotic arm, the said robotic arm comprising:
i. a plurality of actuators,
ii. an optical encoder,
iii. a gearing system,
iv. a plurality of motors,
v. an image capturing device,
vi. a plurality of sensors;
b. a controller configured in the said robotic arm for receiving the sensor signals, wherein the said controller includes:
i. tangible, non-transitory memory on which is recorded a plurality of classes of sensory signals and computer-executable instructions for executing a plurality of tasks;
ii. an operating system configured for executing the instructions from the tangible, non-transitory memory to thereby cause the controller to perform control, motion planning and networking of the said robotic arm;
c. a plurality of serial links affixed between a plurality of joints, from a base frame through to an end-effector of the said robotic arm, wherein the position and orientation of the said end-effector is calculated, and a coordinate frame is attached to each joint of the said end-effector to determine the DH parameters and the movement of the joints of the said end-effector;
d. a trajectory planning mechanism, wherein the said mechanism is used for specifying the motion of the said robot as it traverses the path;
e. a first data processing framework, wherein the said first data processing framework comprises a filtering option, a feature estimating option and a surface reconstructing option, wherein the said plurality of options are used to filter outliers from noisy data, stitch 3D point clouds together, segment relevant parts of a scene, and extract key points and compute descriptors to recognize objects;
f. a second data processing framework, wherein the said second data processing framework assists in recognizing a plurality of small objects, and the said second data processing framework comprises a feature extraction technique and a filtration technique for image recognition.
2. The robotic system as claimed in claim 1, wherein the said robotic system is capable of identifying, localizing, grasping and transporting a plurality of items by means of an actuated framing, shelf system, and a plurality of sensors, all in conjunction with the said robotic arm which utilizes the said end-effector and the grasper.
3. The robotic system as claimed in claim 1, wherein the object detection and travel distance is between 40 and 50 cm.
4. The robotic system as claimed in claim 1, wherein the response time to identify, localize, grasp and transport a plurality of items is less than 60 seconds.

5. The robotic system as claimed in claim 1, wherein the said controller is a microcontroller.
6. The robotic system as claimed in claim 1, wherein the operating system is the Robot Operating System.
7. A method, comprising:
a. affixing a plurality of serial links between a plurality of joints, from a base frame through to an end-effector of the said robotic arm, wherein the position and orientation of the said end-effector is calculated, and a coordinate frame is attached to each joint of the said end-effector to determine the DH parameters and the movement of the joints of the said end-effector;
b. using a trajectory planning mechanism for specifying the motion of the said robot as it traverses the path;
c. controlling, motion planning and networking of the said robotic arm by means of an operating system;
d. providing a first data processing framework, wherein the said first data processing framework comprises a filtering option, a feature estimating option and a surface reconstructing option, wherein the said plurality of options are used to filter outliers from noisy data, stitch 3D point clouds together, segment relevant parts of a scene, and extract key points and compute descriptors to recognize objects;
e. providing a second data processing framework, wherein the said second data processing framework assists in recognizing a plurality of small objects, and the said second data processing framework comprises a feature extraction technique and a filtration technique for image recognition.
8. The method as claimed in claim 7, wherein the said robotic system is capable of identifying, localizing, grasping and transporting a plurality of items by means of an actuated framing, shelf system, and a plurality of sensors, all in conjunction with the said robotic arm which utilizes the said end-effector and the grasper.
9. The method as claimed in claim 7, wherein the object detection and travel distance is between 40 and 50 cm.
10. The method as claimed in claim 7, wherein the response time to identify, localize, grasp and transport a plurality of items is less than 60 seconds.

Documents

Application Documents

# Name Date
1 202221056032-STATEMENT OF UNDERTAKING (FORM 3) [29-09-2022(online)].pdf 2022-09-29
2 202221056032-REQUEST FOR EXAMINATION (FORM-18) [29-09-2022(online)].pdf 2022-09-29
3 202221056032-FORM 18 [29-09-2022(online)].pdf 2022-09-29
4 202221056032-FORM 1 [29-09-2022(online)].pdf 2022-09-29
5 202221056032-DRAWINGS [29-09-2022(online)].pdf 2022-09-29
6 202221056032-DECLARATION OF INVENTORSHIP (FORM 5) [29-09-2022(online)].pdf 2022-09-29
7 202221056032-COMPLETE SPECIFICATION [29-09-2022(online)].pdf 2022-09-29
8 Abstract1.jpg 2022-12-08
9 202221056032-FORM-9 [13-02-2023(online)].pdf 2023-02-13
10 202221056032-FORM-26 [22-08-2023(online)].pdf 2023-08-22