
System And Method For Material Handling And Inventory Management

Abstract: A system and method for material handling and inventory management is disclosed. The system may comprise a cart 100, a central server 400, and a user device 501. The ACU 206 of the cart may be configured to receive and process the input data captured from the plurality of sensors and the camera to identify an occurrence of an event from a group of events at a location of one or more shelves. The motion control unit (MCU) 205 may be configured to navigate the cart to the location of one or more shelves according to the location transmitted by the central server 400. The central server 400 may be configured to update the inventory data based on the information received from the ACU 206. [To be published with Figure 1]


Patent Information

Application #:
Filing Date: 18 March 2017
Publication Number: 38/2018
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Email: ip@stratjuris.com
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2020-05-06
Renewal Date:

Applicants

Autonomous Logistics Technologies Private Limited
6-101, Hill Ridge Springs, Gachibowli, Telangana, Hyderabad, India - 500019.

Inventors

1. Raghuram Nanduri
6-101, Hill Ridge Springs, Gachibowli, Telangana, Hyderabad, India - 500019.

Specification

DESC:FORM 2

THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003

COMPLETE SPECIFICATION

(See Section 10 and Rule 13)

Title of invention:
SYSTEM AND METHOD FOR MATERIAL HANDLING AND INVENTORY MANAGEMENT

APPLICANT:
Autonomous Logistics Technologies Private Limited
An Indian entity
having address,
6-101, Hill Ridge Springs, Gachibowli
Telangana, Hyderabad, India - 500019

The following specification describes the invention and the manner in which it is to be performed.

CROSS-REFERENCE TO RELATED APPLICATIONS AND PRIORITY
The present application claims priority from Indian Provisional Patent Application No. 201741009468 filed on 18th March 2017.

TECHNICAL FIELD
The present invention in general relates to a system and method for material handling and inventory management.
BACKGROUND
Retail, e-commerce, manufacturing and logistics companies employ millions of workers for material handling in warehouses, distribution or fulfilment centres and retail stores. These workers perform monotonous and tiring work such as unloading inventory, putting it away in storage racks, picking, packing and shipping. They walk several miles every shift pushing heavy carts to pick or put away inventory, which leads to several health and social issues. The process is also slow and inefficient, especially in retail and e-commerce where margins are thin and customers expect same-day delivery. Retail stores rely on their staff to monitor shelf stock manually, report stock-outs or empty shelves and replenish them with inventory, but this has proven quite inefficient, especially in large stores or hypermarkets where empty shelves and stock-outs are common problems. Inventory items also get damaged and misplaced as consumers often handle them and leave them in other places, leading to further losses for the retailer.
Existing warehouse automation solutions are inflexible and highly capital intensive. Their mechanization-led approach is suited for handling materials in bulk (such as pallets, drums, etc.) but not for cases and individual items or eaches. Besides, most of these systems are limited to specific warehouse operations such as order picking or fulfilment and cannot be easily adapted to different uses and environments. Also, existing automated systems cannot be operated manually and do not allow users to ride them. Similarly, existing manually operated systems do not support automatic operation.
Therefore, there is a long-standing need for a system and method for material handling and inventory management that is highly flexible, multi-purpose, easy to implement and use, and low cost, and that reduces the workload on workers and makes the overall process fast and efficient.

SUMMARY

Before the present system and its method of use are described, it is to be understood that this disclosure is not limited to the particular apparatus and its arrangement as described, as there can be multiple possible embodiments which are not expressly illustrated in the present disclosure. It is also to be understood that the terminology used in the description is for the purpose of describing the particular versions or embodiments only, and is not intended to limit the scope of the present application. This summary is not intended to identify essential features of the claimed subject matter, nor is it intended for use in determining or limiting the scope of the claimed subject matter.

In one implementation, the present subject matter pertains to a system for material handling and inventory management. The system may comprise a cart, a central server, and a user device. The user device may be communicatively coupled with the central server. The cart may further comprise a payload platform, a camera, a plurality of sensors, a chassis and body, a human machine interface (HMI), an autonomous control unit (ACU), a motion control unit (MCU) and a riding board. The cart and the central server may be communicatively coupled with each other. The plurality of sensors and the camera may be configured to capture input data. The ACU may be configured to receive the input data captured from the plurality of sensors and the camera. The ACU may further be configured to process the input data in order to identify an occurrence of an event from a group of events at a location of one or more shelves. The central server may further be configured to receive a message indicative of the occurrence of the event from the group of events from the ACU through a communication network. The central server may be configured to transmit a location of available stock to the cart and to a user associated with the user device, based on the identified location of the available stock in the warehouse. The user associated with the user device may be assigned one or more tasks associated with the cart. The motion control unit (MCU) may further be configured to navigate the cart to the location of one or more shelves. The ACU may be configured to transmit information to the central server upon the completion of the one or more tasks associated with the cart. The central server may be configured to update the data on inventory items, empty shelves, storage density, misplaced or damaged inventory and counting of inventory based on the information received from the ACU.
In one embodiment, the present subject matter pertains to a method for material handling and inventory management. The method may comprise receiving, via an autonomous control unit (ACU), input data, wherein the input data is captured using a plurality of sensors and a camera. The method may comprise processing, via the autonomous control unit (ACU), the input data in order to identify an occurrence of an event from a group of events at a location of one or more shelves. The method may comprise receiving, via a central server, a message indicative of the occurrence of the event from the group of events from the ACU through a communication network. The method may comprise transmitting, via the central server, a location of available stock to the cart and to a user associated with the user device, based on the identified location of the available stock in the warehouse, wherein the user is assigned one or more tasks associated with the cart. The method may comprise navigating, via a motion control unit (MCU), the cart to the location of one or more shelves. The method may comprise transmitting, via the ACU, information to the central server upon the completion of the one or more tasks associated with the cart. The method may comprise updating, via the central server, the data on inventory items, empty shelves, storage density, misplaced or damaged inventory and counting of inventory based on the information received from the ACU.

BRIEF DESCRIPTION OF DRAWINGS
The detailed description is described with reference to the accompanying Figures. In the Figures, the left-most digit(s) of a reference number identifies the Figure in which the reference number first appears. The same numbers are used throughout the drawings to refer to like features and components.
Figure 1 illustrates a cart 100 for material handling and inventory management, in accordance with an embodiment of the present subject matter.
Figure 2 illustrates a block diagram 200 of the components of a cart for material handling and inventory management, in accordance with an embodiment of the present subject matter.
Figure 3 illustrates a schematic of a PCB 300 with components of the motion control unit (MCU), in accordance with an embodiment of the present subject matter.
Figure 4 illustrates the components of the central server 400, in accordance with an embodiment of the present subject matter.
Figure 5 illustrates an implementation of a system 500 for material handling and inventory management, in accordance with an embodiment of the present subject matter.
Figure 6 illustrates a method 600 implemented by the system 500, in accordance with an embodiment of the present subject matter.
Figure 7 illustrates an exemplary embodiment for material handling and inventory management, in accordance with an embodiment of the present subject matter.

DETAILED DESCRIPTION

The present subject matter in general relates to a system and method used for material handling and inventory management.
In one embodiment, the present subject matter may be directed towards a system for material handling and inventory management. The system may comprise a cart 100, a central server 400 and a user device 501. The user device 501 may be communicatively coupled with the central server 400. The cart 100 may further comprise a payload platform 103, a camera 203, a plurality of sensors, a chassis and body 105, a human machine interface (HMI) 101, an autonomous control unit (ACU) 206, a motion control unit (MCU) 205 and a riding board 102. The cart 100 and the central server 400 may be communicatively coupled with each other. The plurality of sensors and the camera 203 may be configured to capture input data. The ACU 206 may be configured to receive the input data captured from the plurality of sensors and the camera 203. The ACU 206 may further be configured to process the input data in order to identify an occurrence of an event from a group of events at a location of one or more shelves. The central server 400 may further be configured to receive a message indicative of the occurrence of the event from the group of events from the ACU 206 through a communication network. The central server 400 may be configured to transmit a location of available stock to the cart 100 and to a user associated with the user device 501, based on the identified location of the available stock in the warehouse. The user associated with the user device 501 may be assigned one or more tasks associated with the cart 100. The motion control unit (MCU) 205 may further be configured to navigate the cart 100 to the location of one or more shelves. The ACU 206 may be configured to transmit information to the central server 400 upon the completion of the one or more tasks associated with the cart. The central server 400 may be configured to update the data on inventory items, empty shelves, storage density, misplaced or damaged inventory and counting of inventory based on the information received from the ACU 206.
Now referring to Figure 1, the cart 100 for material handling and inventory management, in accordance with an embodiment of the present subject matter, is illustrated. In one embodiment, the cart 100 may comprise a human machine interface (HMI) 101, a riding board 102, a drive train and suspension 106, a chassis and body 105, a sensor bay 104 and a payload platform 103. In one embodiment, the cart 100 further comprises a motion control unit (MCU) and an autonomous control unit (ACU) (not shown in Figure 1).
In one embodiment, the human machine interface (HMI) 101 may consist of a set of capacitive touch sensors and switches, wherein the set of capacitive touch sensors and switches is mounted on both sides of the handle bar, with a touchscreen display in the middle. The touch sensors are used to manually control the motion of the cart 100. In one exemplary embodiment, the user may control the stop, start, left-turn and right-turn functions of the cart 100. The user or operator of the cart 100 may touch the designated touch sensor with his palm or fingers. The set of switches may also be included in order to perform functions such as power on, shut down and emergency stop. The touchscreen display is configured to display instructions, task details and information to the user of the cart. The touchscreen may receive this information from the control circuitry.
In one embodiment, the cart 100 comprises the riding board 102 in order to provide a standing platform for the user or operator. The riding board 102 has an anti-slip surface. In one embodiment, the riding board 102 may be a fixed part of the chassis. In another embodiment, the riding board 102 may be collapsible or a separate attachment of the cart 100. The riding board 102 may not be present in some embodiments.
In one embodiment, the cart 100 comprises a drive train and suspension 106. The drive train and suspension 106 may comprise two wheels driven by two independent motors in a typical two-wheel differential drive configuration. The wheels are on either side of the cart 100 and are attached to the chassis using a suspension system consisting of a leaf and/or coil spring. In one embodiment, the placement and number of wheels may vary as per requirement. The motors are powered by the battery pack (not shown in Figure 1) and controlled by the motion control unit (MCU).
In one embodiment, the chassis 105 is the skeleton of the cart, designed to support the weight and stresses of the payload platform 103, the riding board 102 and the battery pack. The chassis may hold the wheels, motors, plurality of sensors, camera, motion control unit (MCU), autonomous control unit (ACU) and human machine interface (HMI). The body 105 may serve as a skin or cover in order to enclose all the components.
In one embodiment, the cart 100 may comprise a sensor bay 104 at the front of the cart 100, which holds the plurality of sensors required for obstacle detection and navigation. The cart further comprises the payload platform 103, where materials are kept for transport. The payload platform 103 has an anti-slip surface and/or rollers and/or a custom-made rack or shelf.
In one embodiment, the cart 100 may operate in a manual or autonomous mode. In one exemplary embodiment, the user may stand on the riding board 102 and use the human machine interface (HMI) 101 on the handle bar in order to drive the cart 100 manually. The user may log into the warehouse management system (hereinafter referred to as WMS), which is stored on the central server 400, using the touchscreen display to receive tasks, displayed one after another. In one exemplary embodiment, the orders to be picked are displayed on the touchscreen display one at a time with details of the items, quantity, their location in the warehouse and other instructions. The user may read the details, drive the cart 100 to the pick location, pick the item and verify the details using the touchscreen. In one embodiment, a barcode scanner attached to the cart is used for the verification of the details. After successful verification, the warehouse management system (WMS) may assign a new task to the user and the process repeats.
In one embodiment, the cart 100 may operate in an autonomous mode. The autonomous control unit (ACU) may control the cart in the autonomous mode. The autonomous control unit (ACU) may receive a task from the warehouse management system (WMS) stored on the central server 400, and autonomously navigate to the pick location. In the autonomous mode, the user may stand on the riding board 102 but is not required to control the cart. In one exemplary embodiment, the task details are displayed on the touchscreen. The cart 100 autonomously navigates to the pick location and the orders are picked from the pick location. After successful completion of the task, the warehouse management system stored on the central server 400 may assign a new task to the cart 100 and the process repeats. In one embodiment, if any indication of malfunction or unexpected behaviour occurs, the user may use the emergency stop button in order to stop the cart or take control of the cart 100.
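The fetch-task, navigate, report, repeat cycle described above can be sketched as a simple loop. This is an illustrative sketch only, assuming a queue-based WMS; the names SimpleWMS, SimpleCart and run_autonomous_cycle are hypothetical and are not part of the disclosure.

```python
class SimpleWMS:
    """Minimal stand-in for the warehouse management system: a task queue."""
    def __init__(self, tasks):
        self.tasks = list(tasks)
        self.done = []

    def next_task(self):
        # Return the next pending task, or None when the queue is empty.
        return self.tasks.pop(0) if self.tasks else None

    def report_done(self, task_id):
        self.done.append(task_id)


class SimpleCart:
    """Minimal stand-in for the cart: records where it was sent."""
    def __init__(self):
        self.visited = []

    def navigate_to(self, location):
        # In the real system the ACU plans the path and the MCU drives the motors.
        self.visited.append(location)


def run_autonomous_cycle(wms, cart):
    """Fetch tasks from the WMS, navigate the cart to each pick location,
    and report completion until no tasks remain."""
    while True:
        task = wms.next_task()
        if task is None:
            return wms.done
        cart.navigate_to(task["location"])
        wms.report_done(task["id"])   # WMS then assigns the next task
```

In this sketch, the emergency stop and manual takeover described above would interrupt the loop; that logic is omitted for brevity.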
In one embodiment, the cart 100 is communicatively coupled with the central server 400 through a wireless network. The central server 400 is configured to constantly track multiple carts and assign them tasks.
Now referring to Figure 2, a block diagram 200 of the components of a cart 100 for material handling and inventory management, in accordance with an embodiment of the present subject matter, is illustrated. In one embodiment, the cart may comprise the plurality of sensors, the camera 203, the human machine interface (HMI) 101, the motion control unit (MCU) 205, the autonomous control unit (ACU) 206, a left motor 207, a right motor 209 and a position encoder 208. The plurality of sensors comprises a proximity sensor 201, an IMU sensor 202 and a depth sensor 204. The plurality of sensors and the camera 203 may be configured to detect obstacles in the path while navigating through the warehouse. In one embodiment, the human machine interface (HMI) 101 and the autonomous control unit (ACU) 206 are electronically coupled with the motion control unit (MCU) 205. The motion control unit (MCU) 205 may receive commands from the human machine interface (HMI) 101 in order to operate in the manual mode. The motion control unit (MCU) 205 may receive commands from the autonomous control unit (ACU) 206 in order to operate in the autonomous mode.
Now referring to Figure 3, a schematic of the PCB 300 with components of the motion control unit (MCU) 205, in accordance with an embodiment of the present subject matter, is illustrated. In one embodiment, the motion control unit (MCU) 205 may comprise a microcontroller 301 and related control circuitry. The microcontroller 301 is electronically coupled with the sensor port 302. The sensor port 302 may provide a connection interface to the plurality of sensors and the camera 203. The microcontroller 301 is electronically coupled with the communication port 303 in order to communicate with other systems. Further, the motion control unit (MCU) 205 is configured to process the commands received from the human machine interface (HMI) 101 or the autonomous control unit (ACU) 206 along with the input data received from the plurality of sensors. The motion control unit (MCU) 205 may control the velocities of the left motor 207 and the right motor 209, wherein the left motor 207 and the right motor 209 are communicatively coupled with the motion control unit (MCU) 205. In one embodiment, the position encoder 208 is electronically coupled with the left and right motors. Further, the motion control unit (MCU) 205 may use a PID control loop in order to maintain the same velocity in both motors. The motion control unit (MCU) 205 may provide the required differential velocity for the desired trajectory in order to turn the cart 100. The motion control unit (MCU) 205 may send and receive all commands on a serial communication bus using a proprietary protocol. In one exemplary embodiment, if any obstacle is detected within a set range, the motion control unit (MCU) 205 may automatically blow the horn and flash LED lights as a warning. In another exemplary embodiment, if the obstacle is detected within a set distance, the motion control unit (MCU) 205 automatically stops the motors by overriding all manual and/or autonomous commands.
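The differential velocity for turning and the PID velocity loop described above can be illustrated with textbook formulas. This is a hedged sketch, not the MCU's actual firmware; the function names, gains and units are assumptions for illustration.

```python
def wheel_velocities(v, omega, track_width):
    """Differential-drive mixing: map a commanded linear velocity v (m/s)
    and angular velocity omega (rad/s) to left/right wheel speeds.
    A positive omega makes the right wheel faster, turning the cart left."""
    v_left = v - omega * track_width / 2.0
    v_right = v + omega * track_width / 2.0
    return v_left, v_right


class PID:
    """Textbook PID controller; the MCU would run one per motor to hold
    the commanded wheel velocity against load variation, using feedback
    from the position encoder."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured, dt):
        error = setpoint - measured
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

For example, `wheel_velocities(1.0, 0.0, 0.5)` commands both wheels to 1.0 m/s for straight-line motion, while any non-zero omega produces the differential velocity mentioned above.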
In one embodiment, the autonomous control unit (ACU) 206 is configured to perform path planning, localization and obstacle avoidance in the autonomous mode. The autonomous control unit (ACU) 206 may use advanced robotics algorithms for path planning, localization and obstacle avoidance in the autonomous mode. The autonomous control unit (ACU) 206 may use a 2D map for navigation. In one embodiment, the tasks are assigned to the cart 100 from the warehouse management system (WMS), which is stored on the central server 400, through the wireless network. A task is specified as a point or location on the 2D map to which the cart has to travel autonomously. In one embodiment, the local path planning algorithm may determine the desired path or trajectory and accordingly send motor commands to the motion control unit (MCU) 205. The autonomous control unit (ACU) 206 may process data received from the plurality of sensors in order to estimate the cart's position on the 2D map and further localize itself whenever it reads the physical markers. In one embodiment, barcodes, QR codes or AR markers may be used to identify these points or locations physically.
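The position estimation and marker-based correction described above can be sketched with standard differential-drive odometry plus a lookup of known marker poses. The marker IDs and map coordinates below are hypothetical, and the disclosure does not specify the localization algorithm.

```python
import math

# Hypothetical map: marker ID -> known (x, y, heading) pose on the 2D map.
MARKER_MAP = {
    "QR_A1": (0.0, 0.0, 0.0),
    "QR_B3": (10.0, 5.0, 1.5708),
}


def dead_reckon(pose, v_left, v_right, track_width, dt):
    """Advance an (x, y, heading) estimate from wheel velocities over dt
    seconds; standard differential-drive odometry, which drifts over time."""
    x, y, theta = pose
    v = (v_left + v_right) / 2.0          # forward velocity
    omega = (v_right - v_left) / track_width  # turn rate
    return (x + v * math.cos(theta) * dt,
            y + v * math.sin(theta) * dt,
            theta + omega * dt)


def localize(pose_estimate, marker_id=None):
    """On reading a floor marker (barcode/QR/AR), snap the drifting
    odometry estimate to the marker's known map pose; otherwise keep it."""
    return MARKER_MAP.get(marker_id, pose_estimate)
```

In practice, a real system would fuse odometry with marker readings probabilistically (e.g. a Kalman or particle filter) rather than snapping to the marker pose; the snap is a deliberate simplification.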
In one embodiment, the autonomous control unit (ACU) may use a line-following system for navigation. In one exemplary embodiment, lines may be drawn on the floor in order to serve as paths for the cart 100 to follow. Further, the autonomous control unit (ACU) 206 may use the camera 203 and/or the depth sensor 204 in order to detect the line. The autonomous control unit (ACU) 206 may send commands to the motion control unit (MCU) 205 in order to make the cart follow the line.
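A minimal line-following scheme of the kind described above can be sketched as: threshold one row of the camera image to find the dark line, then steer proportionally toward it. The threshold, gain and sign convention below are illustrative assumptions, not values from the disclosure.

```python
def line_offset(row, threshold=50):
    """Given one row of grayscale pixel intensities (0-255), find the dark
    line's centre and return its offset in pixels from the image centre.
    Returns None when no line pixel is found (line lost)."""
    dark = [i for i, p in enumerate(row) if p < threshold]
    if not dark:
        return None
    centre = sum(dark) / len(dark)
    return centre - (len(row) - 1) / 2.0


def steering_command(offset, gain=0.01):
    """Proportional steering: turn toward the line; go straight if lost.
    The returned value would be sent to the MCU as an angular velocity."""
    return 0.0 if offset is None else -gain * offset
```

A real implementation would average several image rows and handle line loss by slowing or stopping; this sketch shows only the core geometry.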

Now referring to Figure 4, the components of the central server 400 are illustrated, in accordance with an embodiment of the present subject matter. As shown in Figure 4, the central server 400 may include at least one processor 401, an input/output (I/O) interface 402, a memory 403, modules 404 and data 409. In one embodiment, the at least one processor 401 may be configured to fetch and execute computer-readable instructions stored in the memory 403.

In one embodiment, the I/O interface 402 may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like. The I/O interface 402 may allow the central server 400 to interact with the cart or the user device 501. Further, the I/O interface 402 may enable the central server 400 to communicate with other computing devices, such as web servers and external data servers (not shown). The I/O interface 402 can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface 402 may include one or more ports for connecting to another server.

In an implementation, the memory 403 may include any computer-readable medium known in the art including, for example, volatile memory, such as static random-access memory (SRAM) and dynamic random-access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and memory cards. The memory 403 may include modules 404 and data 409.

The modules 404 include routines, programs, objects, components, data structures, etc., which perform particular tasks or functions or implement particular abstract data types. In one implementation, the modules 404 may include a communication module 405, a location identification module 406, an inventory management module 407 and other modules 408. The other modules 408 may include programs or coded instructions that supplement applications and functions of the central server 400. In one embodiment, the processor 401 may execute the instructions stored in the modules 404 in order to operate as the warehouse management system.

In one embodiment, the data 409 may comprise a repository 410 and other data 411. In one exemplary embodiment, the repository 410 may be configured to store data processed, received, and generated by one or more of the modules 404. The other data 411 may include data generated as a result of the execution of one or more modules 404.

Now referring to Figure 5, an implementation of the system 500 for material handling and inventory management is illustrated. As shown, the cart 100 may be communicatively coupled with the central server 400. In one embodiment, the user device may be communicatively coupled with the central server 400. Although the present subject matter is explained considering that the central server is implemented on a server, it may be understood that the central server 400 may also be implemented in a variety of computing systems, such as a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, a server, a network server, and the like. It will be understood that the central server 400 may comprise the WMS, which is accessed by multiple users through one or more user devices, collectively referred to as the user hereinafter, or the WMS may reside on the user devices. Examples of the user device used to access the WMS may include, but are not limited to, a portable computer, a personal digital assistant, a handheld device, and a workstation. The WMS implemented on the user device is communicatively coupled to the central server 400 through a network 502.

In one implementation, the network 502 may be a wireless network, a wired network or a combination thereof. The network 502 can be accessed by the user device 501 using wired or wireless network connectivity means, including updated communications technology. The network 502 can be implemented as one of the different types of networks, such as an intranet, local area network (LAN), wide area network (WAN), the internet, and the like. The network 502 may either be a dedicated network or a shared network. A shared network represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further, the network 502 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.

In one embodiment, the plurality of sensors and the camera of the cart 100 may capture input data. In one exemplary embodiment, the input data may comprise images or video captured at the location of one or more shelves. In one embodiment, the ACU 206 may be configured to receive the input data captured from the plurality of sensors and the camera. In one embodiment, the ACU 206 may be configured to process the input data in order to identify an occurrence of an event from the group of events at a location of one or more shelves. In one exemplary embodiment, the group of events may comprise empty shelves, storage density, misplaced or damaged inventory and counting of inventory. In one exemplary embodiment, the ACU 206 may process the images and video from the camera and vision sensors of the cart to identify empty shelves. In one exemplary embodiment, the ACU 206 may process the images and video from the camera and vision sensors of the cart to estimate the storage density and stock level in the shelf. In one embodiment, the ACU 206 may process the images and video from the camera and vision sensors of the cart 100 to identify misplaced and damaged inventory items. In one embodiment, the ACU 206 may be configured to transmit a message to the central server 400 after identification of the event.
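The event identification described above might, for example, reduce to mapping an estimated shelf fill ratio to an event label. The thresholds, labels and the idea of a binary segmentation mask below are illustrative assumptions; the disclosure does not specify the vision algorithm.

```python
def estimate_fill_ratio(mask):
    """Fraction of 'product' pixels (1s) in a binary segmentation mask of
    a shelf image; a stand-in for whatever vision model the ACU would run."""
    flat = [p for row in mask for p in row]
    return sum(flat) / len(flat)


def classify_shelf(fill_ratio, low_threshold=0.3):
    """Map an estimated shelf fill ratio (0.0-1.0) to one of the event
    labels the ACU would report to the central server."""
    if fill_ratio <= 0.0:
        return "empty_shelf"
    if fill_ratio < low_threshold:
        return "low_stock"
    return "ok"
```

Detecting misplaced or damaged items would require a recognition model rather than a fill ratio; this sketch covers only the empty-shelf and storage-density events.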

Now again referring to Figure 4, the communication module 405 of the central server 400 may be configured to receive the message indicative of the occurrence of the event from the group of events through a communication protocol. In one embodiment, the communication protocol may be a wireless communication protocol. In one embodiment, the location identification module 406 may identify the location of the available stock in the warehouse. In one embodiment, the central server 400 may be configured to transmit a location of the available stock to the cart 100 and to a user associated with the user device 501, based on the identified location of the available stock in the warehouse. Now again referring to Figure 5, in one embodiment, the user associated with the user device 501 may be assigned one or more tasks associated with the cart 100. In one embodiment, the one or more tasks associated with the cart 100 may comprise arrangement of replenishment stock delivered by the cart 100, placement of misplaced stocks/items at the right place, and replacement of damaged stocks/items with new stocks/items. In one embodiment, the ACU 206 may display instructions on the HMI to the user associated with the cart. The user may fulfil and confirm the instructions displayed on the HMI 101. In one embodiment, the motion control unit (MCU) may further be configured to navigate the cart to the location of one or more shelves based on the instructions received from the HMI 101 or the ACU 206. The motion control unit (MCU) 205 may receive commands from the human machine interface (HMI) 101 in order to operate in the manual mode. The motion control unit (MCU) 205 may also receive commands from the autonomous control unit (ACU) 206 in order to operate in the autonomous mode. In one embodiment, the motion control unit (MCU) 205 may control the velocities of the left motor 207 and the right motor 209, wherein the left motor 207 and the right motor 209 are communicatively coupled with the motion control unit (MCU) 205.
The position encoder 208 is electronically coupled with the left and right motors. In one embodiment, the motion control unit (MCU) 205 may use a PID control loop in order to maintain the same velocity in both motors. In one embodiment, the motion control unit (MCU) 205 may provide the required differential velocity for the desired trajectory in order to turn the cart 100. The motion control unit (MCU) 205 may send and receive all commands on a serial communication bus using a proprietary protocol. In one exemplary embodiment, if any obstacle is detected within a set range, the motion control unit (MCU) 205 may automatically blow the horn and flash LED lights as a warning. In one embodiment, if the obstacle is detected within a set distance, the motion control unit (MCU) 205 automatically stops the motors by overriding all manual and/or autonomous commands. In one embodiment, the ACU 206 may be configured to transmit information to the central server 400 upon the completion of the one or more tasks associated with the cart. Now again referring to Figure 4, the inventory management module 407 may be configured to update the data on inventory items, empty shelves, storage density, misplaced or damaged inventory and counting of inventory based on the information received from the ACU.
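The two-stage obstacle behaviour described above (warn within one range, force a stop within a closer one, overriding manual and autonomous commands) can be sketched as a simple override on the commanded speed. The warn/stop distances below are illustrative assumptions, not values from the disclosure.

```python
def motor_command(requested_speed, obstacle_distance,
                  warn_range=2.0, stop_range=0.5):
    """Safety override sketch: returns (speed, horn_on, leds_on).
    Within warn_range the MCU sounds the horn and flashes the LEDs while
    still moving; within stop_range it forces the motors to stop,
    overriding any manual or autonomous command."""
    if obstacle_distance <= stop_range:
        return 0.0, True, True              # hard stop, warning active
    if obstacle_distance <= warn_range:
        return requested_speed, True, True  # warn but keep moving
    return requested_speed, False, False    # clear path
```

Because the override is applied after command mixing, it takes precedence regardless of whether the command originated from the HMI or the ACU, matching the behaviour described above.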

Now referring to figure 6, a method 600 implemented by the system 500 for material handling and inventory management, in accordance with an embodiment of the present subject matter is illustrated.

At step 601, the autonomous control unit (ACU) 206 may receive input data, wherein the input data is captured using the plurality of sensors and the camera 203.

At step 602, the autonomous control unit (ACU) 206 may process the input data in order to identify an occurrence of an event from a group of events at a location of one or more shelves.

At step 603, the central server 400 may receive the message indicative of the occurrence of the event from the group of events from the ACU 206 through a communication network.

At step 604, the central server 400 may transmit a location of the available stock to the cart and to a user associated with the user device 501, based on the identified location of the available stock in the warehouse, wherein the user is assigned one or more tasks associated with the cart.

At step 605, the motion control unit (MCU) 205 may navigate the cart to the location of the one or more shelves.

At step 606, the ACU 206 may transmit information to the central server 400 upon completion of the one or more tasks associated with the cart.

At step 607, the central server 400 may update the data of inventory items, empty shelves, storage density, misplaced or damaged inventory, and counting of inventory based on the information received from the ACU.
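Steps 601 to 607 above can be sketched as a message flow between the ACU, the central server, and the MCU. The sketch below uses simple in-memory stand-ins; the class names, method names, and event strings are illustrative assumptions, not the specification's actual interfaces.

```python
# Minimal sketch of the method 600 flow (steps 601-607). The classes
# and their method names are illustrative assumptions only.

EVENT_GROUP = {"empty_shelf", "low_storage_density",
               "misplaced_inventory", "damaged_inventory"}

class CentralServer:
    def __init__(self, stock_locations):
        self.stock_locations = stock_locations  # item -> location in warehouse
        self.inventory_data = {}

    def on_event(self, event, item):
        # Steps 603-604: receive the event message and transmit the
        # location of available stock to the cart / user device.
        assert event in EVENT_GROUP
        return self.stock_locations[item]

    def on_task_complete(self, update):
        # Step 607: update inventory data from the ACU's report.
        self.inventory_data.update(update)

class Cart:
    def __init__(self, server):
        self.server = server
        self.position = None

    def run(self, sensor_frame):
        # Steps 601-602: the ACU receives and processes input data
        # to identify an event at a shelf location.
        event, item = sensor_frame["event"], sensor_frame["item"]
        target = self.server.on_event(event, item)           # steps 603-604
        self.position = target                               # step 605 (MCU navigates)
        self.server.on_task_complete({item: "replenished"})  # steps 606-607
        return target
```

For example, `Cart(server).run({"event": "empty_shelf", "item": "widget"})` would return the widget's stock location and leave the server's inventory data updated.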

Now referring to figure 7, an exemplary embodiment for material handling and inventory management, in accordance with an embodiment of the present subject matter, is illustrated. At step 701, the camera and vision sensors of the cart may transmit images and videos to the ACU 206 while navigating at the location of one or more shelves. At step 702, the ACU 206 may process the images and videos using a computer vision algorithm. In one embodiment, the ACU 206 may process the images and videos from the cart's cameras and vision sensors to identify empty shelves and arrange replenishment stock delivered by the cart. In one embodiment, the ACU 206 may process the images and videos from the cart's cameras and vision sensors to estimate the storage density and stock level in the shelf and arrange replenishment stock delivered by the cart if below the desired level. In one embodiment, the ACU 206 may process the images and videos from the cart's cameras and vision sensors to identify misplaced and damaged inventory items and alert the store/warehouse system implemented in the user device 501. At step 703, the ACU 206 may identify any empty shelf and send an alert message to the central server 400 over a wireless network. At step 704, the central server 400 may transmit a message to the store/warehouse management system implemented on the user device 501 through the communication interface. At step 705, the store/warehouse management system of the user device 501 may alert a human supervisor and/or generate a stock replenishment order and send it to the central server 400. In one exemplary embodiment, the stock replenishment order is generated automatically or by a human supervisor and sent to the central server 400, which then assigns the task to another cart to bring inventory from the back room of the warehouse to the said shelf. At step 706, the central server 400 may identify the location of the stock and direct the cart 100 and a human worker to that location.
At step 707, the human worker reads the instructions displayed on the HMI, places the items/stock on the cart, and confirms in the HMI. In one embodiment, the necessary information may be displayed on the touchscreen for the user or the human worker to place the required inventory on the cart and confirm. At step 708, the cart navigates to the empty shelf location autonomously, avoiding obstacles. In one embodiment, the cart may use the camera and/or depth sensor in order to detect the user associated with the user device 501. In one embodiment, the ACU 206 may transmit instructions to the MCU 205 to follow the human worker associated with the user device 501. At step 709, the human worker reads the instructions displayed on the HMI, places the items/stock in the shelf, and confirms in the HMI. In one embodiment, the MCU 205 may navigate the cart to the empty shelf and wait for the user or the operator to put away the inventory in the shelf and confirm. At step 710, the cart may transmit an update of task completion to the central server 400 and wait for the next task. At step 711, the central server 400 may update the warehouse management system and transmit updates to the user device 501.
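As one possible illustration of the shelf-analysis step 702 (a sketch only; the specification does not disclose the actual computer vision algorithm), storage density could be estimated from a binary occupancy grid derived from a shelf image, with an alert raised when the shelf is empty or below a desired level. The grid representation, function names, and the threshold value are assumptions.

```python
# Illustrative sketch of one way the ACU's vision step (step 702)
# could score a shelf: given a binary occupancy grid derived from a
# shelf image (1 = stock detected, 0 = empty cell), compute storage
# density and flag shelves below a desired level. The representation
# and the 0.3 threshold are assumptions, not the disclosed algorithm.

DESIRED_DENSITY = 0.3  # assumed replenishment threshold

def storage_density(occupancy_grid):
    """Fraction of shelf cells that contain stock."""
    cells = [cell for row in occupancy_grid for cell in row]
    return sum(cells) / len(cells) if cells else 0.0

def analyze_shelf(shelf_id, occupancy_grid):
    """Return an alert message for the central server, or None."""
    density = storage_density(occupancy_grid)
    if density == 0.0:
        return {"shelf": shelf_id, "event": "empty_shelf"}
    if density < DESIRED_DENSITY:
        return {"shelf": shelf_id, "event": "low_storage_density",
                "density": density}
    return None
```

An all-zero grid yields an `empty_shelf` alert (step 703), a sparsely filled grid yields a `low_storage_density` alert, and a sufficiently stocked shelf yields no message.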
Exemplary embodiments discussed above may provide certain advantages. Some embodiments of the present disclosure may enable a system and method for material handling and inventory management.
In one embodiment, the cart may include a software application for inventory management using computer vision technologies to identify empty shelves, storage density, misplaced or damaged inventory, and counting of inventory.
In some embodiments, the system may include a proprietary PCB and software that allow the cart to be operated in a manual mode (human controlled) as well as in an autonomous mode (system controlled).
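The dual-mode operation described above can be sketched as a simple command arbitration rule; this is an assumed design for illustration, not the proprietary PCB firmware, and the mode names and function signature are not from the specification.

```python
# Sketch of manual/autonomous mode arbitration (assumed design): the
# cart takes commands from either a human controller or the system,
# and a detected close obstacle overrides both modes, consistent with
# the MCU's safety override described earlier in the specification.

MANUAL, AUTONOMOUS = "manual", "autonomous"

def select_command(mode, manual_cmd, autonomous_cmd, obstacle_close):
    """Pick the motor command for this control tick."""
    if obstacle_close:
        return "stop"  # safety override beats both operating modes
    return manual_cmd if mode == MANUAL else autonomous_cmd
```

The safety override is checked first so that neither operating mode can drive the cart into a detected obstacle.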
CLAIMS:
We claim:
1. A system 500 for material handling and inventory management, the system comprising:
a cart 100;
a central server 400; and
a user device 501, wherein the user device 501 is communicatively coupled with the central server 400, wherein the cart further comprises a payload platform 103, a camera 203, a plurality of sensors, a chassis and body 105, a human machine interface (HMI) 101, an autonomous control unit (ACU) 206, a motion control unit (MCU) 205 and a riding board 102; wherein the cart 100 and the central server 400 are communicatively coupled with each other; wherein the plurality of sensors and the camera 203 are configured to capture input data; wherein the ACU 206 is configured to receive the input data captured from the plurality of sensors and the camera; wherein the ACU 206 is further configured to process the input data in order to identify an occurrence of an event from a group of events at a location of one or more shelves; wherein the central server 400 is configured to receive a message indicative of the occurrence of the event from the group of events from the ACU 206 through a communication network; wherein the central server 400 is configured to transmit a location of an available stock to the cart and a user associated with the user device 501, based on the identified location of the available stock in the warehouse, wherein the user associated with the user device 501 is assigned one or more tasks associated with the cart; wherein the motion control unit (MCU) is further configured to navigate the cart to the location of the one or more shelves; wherein the ACU 206 is configured to transmit information to the central server 400 upon completion of the one or more tasks associated with the cart; and wherein the central server 400 is configured to update the data of inventory items, empty shelves, storage density, misplaced or damaged inventory and counting of inventory based on the information received from the ACU 206.

2. The system of claim 1, wherein the group of events comprises empty shelves, storage density, misplaced or damaged inventory and counting of inventory.

3. The system of claim 1, wherein the ACU 206 is configured to identify the occurrence of the event using a computer vision algorithm.

4. The system of claim 1, wherein the ACU 206 is configured to use the camera and/or depth sensor in order to detect the user associated with the user device 501.

5. The system of claim 1, wherein the ACU 206 is configured to transmit instructions to the motion control unit (MCU) in order to make the cart follow the user.

6. A method 600 for material handling and inventory management, the method comprising:
receiving, via an autonomous control unit (ACU) 206, input data, wherein the input data is captured using a plurality of sensors and a camera;
processing, via the autonomous control unit (ACU) 206, the input data in order to identify an occurrence of an event from a group of events at a location of one or more shelves;
receiving, via a central server 400, a message indicative of the occurrence of the event from the group of events, from the ACU 206, through a communication network;
transmitting, via the central server 400, a location of an available stock to the cart and a user associated with the user device 501, based on the identified location of the available stock in the warehouse, wherein the user is assigned one or more tasks associated with the cart;
navigating, via a motion control unit (MCU) 205, the cart to the location of the one or more shelves;
transmitting, via the ACU, information to the central server 400 upon completion of the one or more tasks associated with the cart; and
updating, via the central server 400, the data of inventory items, empty shelves, storage density, misplaced or damaged inventory and counting of inventory based on the information received from the ACU.
7. The method of claim 6, wherein the group of events comprises empty shelves, storage density, misplaced or damaged inventory and counting of inventory.

8. The method of claim 6, wherein the ACU 206 further transmits instructions to the motion control unit (MCU) in order to make the cart follow the user.

9. The method of claim 6, wherein the identification of the occurrence of the event is performed using a computer vision algorithm.

Dated 16th day of March 2018

Documents

Application Documents

# Name Date
1 FORM28 [18-03-2017(online)].pdf_27.pdf 2017-03-18
2 FORM28 [18-03-2017(online)].pdf 2017-03-18
3 Form 1 [18-03-2017(online)].pdf 2017-03-18
4 EVIDENCE FOR SSI [18-03-2017(online)].pdf_26.pdf 2017-03-18
5 EVIDENCE FOR SSI [18-03-2017(online)].pdf 2017-03-18
6 Drawing [18-03-2017(online)].pdf 2017-03-18
7 Description(Provisional) [18-03-2017(online)].pdf 2017-03-18
8 Form 3 [21-03-2017(online)].pdf 2017-03-21
9 Other Patent Document [06-04-2017(online)].pdf 2017-04-06
10 Form 26 [06-04-2017(online)].pdf 2017-04-06
11 Correspondence By Agent_Form1_11-04-2017.pdf 2017-04-11
12 201741009468-FORM 3 [16-03-2018(online)].pdf 2018-03-16
13 201741009468-ENDORSEMENT BY INVENTORS [16-03-2018(online)].pdf 2018-03-16
14 201741009468-DRAWING [16-03-2018(online)].pdf 2018-03-16
15 201741009468-CORRESPONDENCE-OTHERS [16-03-2018(online)].pdf 2018-03-16
16 201741009468-COMPLETE SPECIFICATION [16-03-2018(online)].pdf 2018-03-16
17 Correspondence by Agent_Form 5_22-03-2018.pdf 2018-03-22
18 201741009468-FORM 18A [11-03-2019(online)].pdf 2019-03-11
19 201741009468-FER.pdf 2019-04-23
20 201741009468-FER_SER_REPLY [17-09-2019(online)].pdf 2019-09-17
21 201741009468-COMPLETE SPECIFICATION [17-09-2019(online)].pdf 2019-09-17
22 201741009468-HearingNoticeLetter-(DateOfHearing-08-01-2020).pdf 2019-12-11
23 201741009468-Written submissions and relevant documents (MANDATORY) [17-01-2020(online)].pdf 2020-01-17
24 201741009468-PatentCertificate06-05-2020.pdf 2020-05-06
25 201741009468-Marked up Claims_Granted 336541_06-05-2020.pdf 2020-05-06
26 201741009468-IntimationOfGrant06-05-2020.pdf 2020-05-06
27 201741009468-Drawings_Granted 336541_06-05-2020.pdf 2020-05-06
28 201741009468-Description_Granted 336541_06-05-2020.pdf 2020-05-06
29 201741009468-Claims_Granted 336541_06-05-2020.pdf 2020-05-06
30 201741009468-Abstract_Granted 336541_06-05-2020.pdf 2020-05-06

Search Strategy

1 search_201741009468_05-04-2019.pdf

ERegister / Renewals

3rd: 13 Jul 2020

From 18/03/2019 - To 18/03/2020

4th: 13 Jul 2020

From 18/03/2020 - To 18/03/2021

5th: 13 Jul 2020

From 18/03/2021 - To 18/03/2022

6th: 29 Apr 2022

From 18/03/2022 - To 18/03/2023

7th: 20 May 2022

From 18/03/2023 - To 18/03/2024

8th: 20 May 2022

From 18/03/2024 - To 18/03/2025

9th: 20 May 2022

From 18/03/2025 - To 18/03/2026

10th: 20 May 2022

From 18/03/2026 - To 18/03/2027