
Auto Navigation Vehicle System And Method Of Operating The Same

Abstract: Auto-navigation vehicle system (S) and method for transportation of patients, differently abled and elderly people in a multi-storeyed building via an elevator (L) by a single touch software application (101) on a portable Input/Output device (100) connected to auto-navigation vehicle (Av) via a wireless communication module (W). The auto-navigation vehicle system (S) is based on the Robot Operating system (ROS) navigation stack and is capable of generating and loading maps (M1, M2…Mn) of each floor (Fl1, Fl2…Fln) in a multi-storied building using 360-degree LiDAR sensor (203), with Simultaneous Localization And Mapping (SLAM) (304), Adaptive Monte Carlo Localization (AMCL) (305) and odometry (208). The auto-navigation vehicle system (S) provides a built-in isolation hood (400) for transportation of infected patients with proper isolation while maintaining social distancing.


Patent Information

Application #
Filing Date
20 May 2020
Publication Number
48/2021
Publication Type
INA
Invention Field
PHYSICS
Status
Email
sunita@skslaw.org
Parent Application
Patent Number
Legal Status
Grant Date
2024-07-22
Renewal Date

Applicants

AMRITA VISHWA VIDYAPEETHAM
Amritapuri, Clappana PO, Kollam - 690525, Kerala, India

Inventors

1. MEGALINGAM, Rajesh Kannan
MA Math. Amritapuri, Kollam, Kerala 690525
2. RAJ, Akhil
Thekkevazhappallil Veedu, Thekkumcherry, Puthoor (P.O), Kollam, Kerala 691507
3. RAJENDRAPRASAD, Anandu
Bhargavanivas, Punukkannoor, Perumpuzha PO Kollam, Kerala 691504

Specification

DESC:FIELD OF THE INVENTION
The present invention relates to an auto navigation vehicle system and a method of operating the same. More specifically, the present invention relates to a novel auto navigation vehicle system, such as a wheelchair, which can auto-navigate using multiple maps, especially maps of different floors in a multi-storied building, and can navigate via an elevator, enabling multi-floor access. The present invention is also capable of manual navigation and provides a built-in isolation hood to assist in transporting patients with communicable diseases (like COVID-19) while maintaining social distancing.

BACKGROUND OF THE INVENTION
Unsupervised auto navigation systems are a boon to people who are otherwise challenged. Wheelchairs are the primary means of mobility for people with different kinds of impairments. Over the last decade there has been considerable development in the field of science and technology. Presently, there are many issues related to the control of transportation vehicles such as wheelchairs, especially in multi-level buildings or other such structures.

For instance, people with different physical issues cannot perfectly control a powered wheelchair and have to depend on others, reducing their independence and quality of life. In a situation like that of the COVID pandemic, moving a patient autonomously from an isolation ward to the radiology lab for an X-ray, to the blood test lab for a blood test, etc., within the hospital, is a required task, and these labs might be on different floors. Using the proposed wheelchair, the nurse/caretaker need not come in direct contact with patients or the wheelchair.

There are two basic requirements for any auto navigating vehicle system. First, an auto-navigating vehicle must navigate safely for long periods of time with minimum human intervention. The system should be programmed efficiently so that it is able to prevent harm from coming to the user. Second, the auto-navigation system should interact effectively with the user and the environment by mapping the environment and communicating the same to the user. Besides these two requirements, desirable features may include outdoor as well as indoor navigation, automatic mode selection based upon the current environment and task to reduce the cognitive overhead of the user, and easily adaptable user interfaces. Most people take low-level control for granted when walking or driving. For example, when walking down a busy corridor, a person is not usually aware of all of the small changes he makes to avoid people and other obstacles. So, there is an urgent need for an auto navigation vehicle and a method for operating the same.

Reference is made to US Patent Application US20120316884A1, which discloses a personal mobility vehicle, such as a wheelchair system, that includes an input audio transducer having an output coupled to a speech recognition system and an output audio transducer having an input coupled to a speech synthesis system. The wheelchair system further includes a control unit having a data processor and a memory. The data processor is coupled to the speech recognition system and to the speech synthesis system and is operable in response to a recognized utterance made by a user to present the user with a menu containing wheelchair system functions. The data processor is further configured in response to at least one further recognized utterance made by the user to select from the menu at least one wheelchair system function, to activate the selected function and to provide audible feedback to the user via the speech synthesis system.
Another reference is made to Chinese utility model application no. CN210605467U, which discloses an intelligent wheelchair and control system based on vision and three-point positioning. The wheelchair comprises a seat, armrests, a handle, front wheels, rear wheels, pedals, a driving device, and an intelligent following module, wherein the intelligent following module comprises an intelligent label, an upper computer, and a lower computer. The lower computer comprises a drive control main board, a laser radar, an inertia measurement unit, at least one group of ultrasonic radar, at least one group of visualization device, a handle control module, a button assembly and a three-point positioning base station. The utility model possesses certain environmental perception and response capability and can be used as a wheelchair and an intelligent transport.

Another reference is made to US Patent Application US20160052139A1, which discloses systems, devices, and methods for providing, among other things, a wheelchair-assist robot for assisting a wheelchair user with everyday tasks or activities at work, at home, and the like. Further, it discloses that the mobile wheelchair-assist robot includes a wheelchair interface component configured to exchange control information with a wheelchair controller. Moreover, it provides a wheelchair-assist robot mount assembly for electrically and physically coupling to an associated wheelchair.

Another reference is made to PCT international application no. PCT/CN2017/072100, which discloses systems and methods for controlling an intelligent wheelchair. The system may include a processor that performs operations including receiving information, building a map, planning a route, and generating control parameters; a movement module that executes the control parameters to move around and holds sensors for sensing information; and a gimbal for holding sensors to sense information. However, the application discloses building a map using an image processing method and planning a route. The application does not provide dynamic obstacle detection and avoidance, or re-planning of the route to avoid dynamic obstacles. The application requires networking for connecting multiple robots, and GPS, GLONASS, COMPASS, QZSS and WIFI for positioning, all of which are used for outdoor use of the wheelchair.

However, the existing inventions are costly, and none of the existing prior arts provides a user-friendly, self-propelled system that can map the environment to provide ready information to the user to navigate the immediate environment with minimum human intervention and maximum accuracy.

So, there is a dire need for an auto-navigation vehicle system which is affordable and adaptable to any indoor environment, with an easier-to-operate user interface than existing technologies, which usually require frequent concentration or support from the user for the movement of the vehicle system.

OBJECT OF THE INVENTION
In order to obviate the drawbacks in the existing state of the art, the present invention aims to provide a user-friendly auto-navigation vehicle system that can map the environment to provide ready information to the user to navigate the immediate environment with multiple levels connected by an elevator, which can function with minimum human intervention and maximum accuracy. The present invention also provides a built-in isolation hood along with multi-map, multi-floor auto-navigation feature in the vehicle system to transport patients with communicable diseases like COVID-19. The main object of the present invention is to provide for an auto-navigation vehicle system such as a wheelchair that can auto navigate using multiple maps.

Another object of the present invention is to enable an auto-navigating vehicle to use the elevator to navigate a multi-storeyed building.

Yet another object of the present invention is to provide an auto-navigating vehicle system with a built-in isolation hood to ensure social distancing of infected persons during the course of the movement or transportation.

Yet another object of the present invention is to provide an auto-navigation vehicle system capable of providing transportation by a single touch using a suitable software application on a portable Input/Output device.

Yet another object of the present invention is to provide an auto navigation vehicle system capable of functioning in multiple driving modes as per the requirement of the user.

Yet another object of the present invention is to provide an auto-navigation vehicle system adaptable to any indoor environment with easy to operate user interface.

SUMMARY OF THE INVENTION
Accordingly, the present invention provides an auto-navigation vehicle capable of auto-navigating through a multi-storeyed building using multiple maps via elevator enabling multi-floor access in said building. The present invention provides for a built-in isolation hood for purposes of enabling easy transport of infected persons with isolation while maintaining social distancing. Thus, the present invention provides for transportation of persons, patients, differently abled persons, and elderly people in a multi-storied building without any external intervention.

The present invention reduces the burden of manual navigation and can make transportation easier by a single touch using a suitable software application on a portable Input/Output device, including but not limited to an Android or iOS mobile handset. The transportation of infected patients is risky in the case of manual or powered wheelchairs without proper isolation. This auto-navigation vehicle system includes a manual mode which can be controlled using the same software application installed on the Input/Output device.

The system software is based on the Robot Operating System (ROS) navigation stack, and it can map each floor separately in a multi-storied building and load the maps according to the user's need. A 360-degree laser scanner, Simultaneous Localization and Mapping (SLAM) and Adaptive Monte Carlo Localization (AMCL) are used for localization, mapping, and navigation of the auto-navigation vehicle system.

The present system comprises rotary encoders, an Inertial Measurement Unit (IMU) and a 360-degree LiDAR (Light Detection and Ranging) sensor to identify the objects in the environment, whereby point cloud data of the laser scan can be generated and the distance between the LiDAR and the obstacles can be observed. The rotary encoders, the Inertial Measurement Unit (IMU) and odometry provide the current position and orientation of the auto-navigating vehicle.
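
As a simplified illustration of this sensing step (the beam count and angular resolution below are assumptions for the example, not values from this specification), a laser scan can be converted into point cloud data and the nearest obstacle distance read off as follows:

```python
import math

def scan_to_points(ranges, angle_min=0.0, angle_increment=math.pi / 180):
    """Convert LiDAR range readings into Cartesian (x, y) points."""
    points = []
    for i, r in enumerate(ranges):
        if math.isinf(r) or math.isnan(r):
            continue  # no return for this beam
        theta = angle_min + i * angle_increment
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

def nearest_obstacle(ranges):
    """Distance from the LiDAR to the closest detected obstacle."""
    valid = [r for r in ranges if not (math.isinf(r) or math.isnan(r))]
    return min(valid) if valid else float("inf")

# Example: four beams at 0, 90, 180 and 270 degrees
ranges = [2.0, 1.5, float("inf"), 3.0]
pts = scan_to_points(ranges, angle_increment=math.pi / 2)
print(nearest_obstacle(ranges))  # 1.5
```

The point list corresponds to the point cloud the navigation stack consumes; beams with no return (infinite range) are simply dropped.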
The auto-navigation vehicle system is capable of operating in multiple modes including fully autonomous also referred to as auto mode, semi-autonomous mode also referred to as fixed mode and manual mode. The user can access any floor in a multi-storied building via an elevator by choosing the multi-map mode using the software application installed in the Input/Output device. When the elevator reaches a particular floor, the user can select said floor from the drop down menu in selected mode of navigation in said application to download the navigation map of said floor and the auto-navigating vehicle can be navigated to that floor autonomously. The map server in Robot Operating System (ROS) Navigation stack is used to publish different maps on the I/O device screen as per the user input.

The auto-navigation vehicle system has a control unit and a pair of 320 watt motors to power it. The control unit may be based on Linux platform. The vehicle movement in the environment is capable of being powered by an independent source of energy such as batteries or solar cells. The vehicle movement monitored by a pair of rotary encoders attached to the motors and a six-axis Inertial Measurement Unit which tracks the orientation of the wheelchair with respect to the environment.

The present invention has wide application, including in households, hospitals, universities, airports and other public places; a very specific use of the present invention is in multi-level hospitals for the transportation of persons, including patients having infectious diseases, across multiple floors.

BRIEF DESCRIPTION OF DRAWINGS
Fig. 1 depicts the auto-navigating vehicle (Av) with multi-map navigation.
Fig. 2 depicts the system architecture of the auto-navigating vehicle system (S).
Fig. 3a, 3b, 3c and 3d depict different modes of navigation in the auto-navigation vehicle system.
Fig. 4a depicts the navigation map generated in fixed mode (Fm) of navigation, showing pre-defined destinations (D1, D2…Dn) for navigation of the auto-navigating vehicle.
Fig. 4b depicts the navigation map generated in auto mode (Am) of navigation.
Fig. 5 depicts the operational flow diagram of the auto-navigating vehicle with multi-map navigation.
Fig. 6 depicts a navigation map showing obstacles (O1, O2…On) in an indoor environment.
Fig. 7a, 7b and 7c depict maps of different floors generated in multi-map navigation in a multi-storied building.

DETAILED DESCRIPTION OF THE INVENTION WITH ILLUSTRATIONS AND REFERENCE TO DRAWINGS
It should be noted that the particular description and embodiments set forth in the specification below are merely exemplary of the wide variety and arrangement of instructions which can be employed with the present invention.

The present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. All the features disclosed in this specification may be replaced by similar other or alternative features performing similar or same or equivalent purposes. Thus, unless expressly stated otherwise, they all are within the scope of present invention. Various modifications or substitutions are also possible without departing from the scope or spirit of the present invention.

Therefore, it is to be understood that this specification has been described by way of the most preferred embodiments and for the purposes of illustration and not limitation.

The present invention provides for an auto-navigation vehicle system (S) and method for transportation of patients, differently abled and elderly people in a multi-storeyed building without any external intervention. The auto-navigating vehicle (Av) of the present system (S) is capable of navigating any multi-storied building and providing multi-floor access using an elevator (L). The auto-navigation vehicle system (S) provides a built-in isolation hood (400) which enables the transportation of infected patients with proper isolation while maintaining social distancing. Figure 1 shows the auto-navigating vehicle (Av) of the present auto-navigation vehicle system (S) with multi-map navigation.

Figure 2 depicts the system architecture of the auto-navigation vehicle system (S). The auto-navigation vehicle system (S) comprises an auto-navigating vehicle (Av); an Input/Output device (100) comprising a software application (101) installed on said Input/Output device (100) and a wireless communication module (W); a Control unit (200) comprising a master-controller unit (201), a sub-controller unit (202), at least one sensor (s), and at least one motor driver (206) to drive at least one motor (207); a Software unit (300) comprising the Robot Operating System (ROS), at least one Extended Kalman Filter (EKF) (301), at least one Laser filter (302) and at least one Map server (303); and an Isolation hood (400) to enable the transportation of infected patients while maintaining social distancing.
The auto-navigation vehicle system (S) is operated through the portable Input/Output device (100) via the installed software application (101) and wireless communication module (W) to enable the user (U) to access the preferred mode of navigation as well as to control and monitor the navigation of the auto-navigating vehicle (Av). The wireless communication module (W) couples the sub-controller unit (202) with the Input/Output device (100), which in turn couples the auto-navigating vehicle (Av) with said Input/Output device (100). The operating system for the Input/Output device (100) is selected from, but not limited to, Android and iOS.

The Control unit (200) of the present auto-navigation vehicle system (S) is based on the Linux platform. The master-controller unit (201) is the main control unit of the auto-navigation vehicle system (S) and supervises the entire Control unit (200). The sensors (s) in the Control unit (200) comprise at least one LiDAR sensor (203) capable of identifying the surroundings and obstacles (O1, O2…On) in the path; at least one pair of rotary encoders (204) attached to the motor (207), identifying the direction of rotation of the wheels of the auto-navigating vehicle (Av) by computing the displacement made by the auto-navigating vehicle (Av); and at least one six-axis inertial measurement unit (IMU) (205) configured to collect data from the surrounding environment and determine the pose and orientation of the auto-navigating vehicle (Av) with respect to the environment.
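
The displacement computation from encoder ticks can be sketched for a differential-drive base as follows; the tick resolution, wheel radius and wheel base below are assumed values for illustration, not the vehicle's actual parameters:

```python
import math

# Illustrative parameters (assumptions, not the vehicle's real values)
TICKS_PER_REV = 1024   # encoder pulses per wheel revolution
WHEEL_RADIUS = 0.15    # metres
WHEEL_BASE = 0.55      # distance between the two driven wheels, metres

def odometry_step(x, y, theta, left_ticks, right_ticks):
    """Update the pose (x, y, theta) from one interval of encoder ticks."""
    per_tick = 2 * math.pi * WHEEL_RADIUS / TICKS_PER_REV
    d_left = left_ticks * per_tick
    d_right = right_ticks * per_tick
    d_centre = (d_left + d_right) / 2           # displacement of the vehicle
    d_theta = (d_right - d_left) / WHEEL_BASE   # change in heading
    x += d_centre * math.cos(theta + d_theta / 2)
    y += d_centre * math.sin(theta + d_theta / 2)
    return x, y, theta + d_theta
```

Equal tick counts on both wheels advance the vehicle in a straight line; unequal counts rotate it, which is how the encoders reveal the direction of wheel rotation and the displacement made.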

The sub-controller unit (202) is used as an interface (I) for the wireless communication module (W), the rotary encoders (204) and the inertial measurement unit (IMU) (205). The sub-controller (202) receives data from the sensors (s) and I/O device (100) through data acquisition (P-1), processes said data and transmits the processed data to the master-controller unit (201) via serial communication (P-2).

The software unit (300) comprises the Robot Operating System (ROS) configured to visualize 2D and 3D values from ROS data using Simultaneous Localization and Mapping (SLAM) (304) and Adaptive Monte Carlo Localization (AMCL) (305), providing real-time motion planning (P-6) through the creation of multiple navigation maps (M1, M2…Mn), obstacle (O1, O2…On) avoidance and localization of the auto-navigating vehicle (Av) in a multi-storied building. The created navigation maps (M1, M2…Mn) are saved in the master-controller unit (201) and the Input/Output device (100).

The software unit (300) further comprises at least one Extended Kalman Filter (EKF) (301) capable of combining measurement data from the IMU (205) and odometry (208), at least one Laser filter (302) capable of processing scan data from said LiDAR (203), and at least one Map server (303) capable of downloading said navigation maps (M1, M2…Mn) to the Input/Output device.

The Simultaneous Localization and Mapping (SLAM) (304) in the Robot Operating System (ROS) is a gmapping tool which receives the point cloud data from the LiDAR (203), IMU (205) and odometry (208) via ROS serial communication (P-3), creates multiple navigation maps (M1, M2…Mn) based on user (U) input and saves said navigation maps (M1, M2…Mn) to the ROS file system using said map server (303).

The Adaptive Monte Carlo Localization (AMCL) (305) in the Robot Operating system (ROS) determines the position and orientation of auto-navigating vehicle (Av) on created navigation maps (M1, M2…Mn) during navigation of the auto-navigating vehicle (Av). The Adaptive Monte Carlo Localization (AMCL) (305) is integrated with said EKF (301) to reduce the uncertainties in estimation of the position of auto-navigating vehicle (Av).
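
To illustrate the particle-filter idea behind AMCL (this is a toy one-dimensional sketch, not the AMCL implementation itself; the corridor model and noise value are assumptions), each particle is a pose hypothesis whose weight is updated by how well its expected sensor reading matches the actual LiDAR reading:

```python
import math

def mcl_update(particles, weights, measured, expected_fn, sigma=0.2):
    """One Monte Carlo localization measurement update: reweight each
    particle by a Gaussian likelihood of the range measurement."""
    new_w = []
    for p, w in zip(particles, weights):
        err = measured - expected_fn(p)
        new_w.append(w * math.exp(-err * err / (2 * sigma * sigma)))
    total = sum(new_w)
    return [w / total for w in new_w]  # normalized weights

# Toy 1-D corridor: a particle's state is its distance from a wall,
# so its expected range reading equals its own position.
particles = [1.0, 2.0, 3.0]
weights = [1 / 3] * 3
weights = mcl_update(particles, weights, measured=2.1,
                     expected_fn=lambda p: p)
best = particles[weights.index(max(weights))]
print(best)  # 2.0 — the particle closest to the measurement
```

In the real system the particles live on the 2-D navigation map and the likelihood comes from scan matching against the map, but the reweight-and-normalize cycle is the same.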

The Inertial measurement unit (IMU) (205) data from the sub-controller unit (202) is calibrated (P-4) based on the physical position of the IMU (205), and said data is sent to the Extended Kalman Filter (EKF) (301). The rotary encoder (204) data from the sub-controller unit (202) is converted, and the converted data is sent to the EKF (301). The Adaptive Monte Carlo Localization (AMCL) (305) receives filtered LiDAR (203) data from the laser filter (302), and filtered IMU (205) and odometry (208) data from the EKF (301), to conduct the scan matching process and provide accurate localization and path planning of the auto-navigation vehicle (Av). The Robot Operating System (ROS) generates the motor control (P-7) commands based on the dynamics (P-5) of the auto-navigating vehicle (Av) and sends said commands to the sub-controller unit (202) to drive the motors (207) via the motor drivers (206) to navigate the auto-navigating vehicle (Av).
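
For intuition, a scalar version of such a filter (a didactic sketch with made-up noise values, not the system's actual multi-state EKF) fuses an odometry-derived heading change with an IMU heading reading in a predict/update cycle:

```python
def ekf_heading(est, var, odo_delta, q, imu_heading, r):
    """One predict/update cycle of a scalar Kalman filter fusing an
    odometry heading change (prediction) with an IMU heading reading."""
    # Predict: propagate the heading by the odometry-derived change
    est = est + odo_delta
    var = var + q                  # process noise grows the uncertainty
    # Update: blend in the IMU measurement
    k = var / (var + r)            # Kalman gain
    est = est + k * (imu_heading - est)
    var = (1 - k) * var            # measurement shrinks the uncertainty
    return est, var

# One cycle with illustrative noise values (q, r are assumptions)
est, var = 0.0, 1.0
est, var = ekf_heading(est, var, odo_delta=0.10, q=0.01,
                       imu_heading=0.12, r=0.02)
```

The fused estimate lands between the odometry prediction (0.10 rad) and the IMU reading (0.12 rad), weighted towards whichever source is currently trusted more, which is exactly the role the EKF (301) plays in reducing estimation uncertainty.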

Figure 3a depicts the various modes of navigation in the auto-navigation vehicle system (S). The auto-navigating vehicle (Av) of the present system can be operated in three modes namely
• fully autonomous mode or Auto-Mode (Am) [Fig. 3b];
• semi-autonomous mode or Fixed mode (Fm) [Fig. 3c]; and
• manual mode [Fig. 3d]

The auto-mode (Am) and fixed mode (Fm) are multi-map modes of navigation which enables the user (U) to navigate in a multi-storied building via an elevator (L) using the software application (101) installed on a portable Input/Output device (100) connected to the auto-navigating vehicle (Av). The user (U) can select said auto mode (Am) or fixed mode (Fm) on the application (101) on being present at point of reference (POR), where the point of reference (POR) is either the reference position (R) or an elevator (L). It allows the user (U) to initialize the localization system used by the ROS navigation stack by setting the pose of the auto-navigating vehicle (Av) in the environment.

Auto mode:
Figure 3b depicts the Auto mode (Am) of navigation in the auto-navigation vehicle system (S). In Auto mode (Am) the user (U) selects auto mode (Am) from the drop-down menu on said software application (101) installed on the portable Input/Output device (100) connected to the auto-navigating vehicle through the wireless communication module (W). The drop-down menu for selecting the floor (Fl1, Fl2…Fln) is displayed on the screen of the Input/Output device (100) and the user (U) selects the desired floor (Fl1, Fl2…Fln) in the multi-storied building, whereupon the multiple navigation maps (M1, M2…Mn) of the building are downloaded on the screen of the I/O device (100), indicating the point of reference (POR), which is either the reference position (R) or the elevator (L).

In case the user (U) wishes to navigate within the same floor on which he is present, said user (U) will move to the reference position (R) and choose the desired floor (Fl1, Fl2…Fln) from the options displayed on the screen of the I/O device (100).

In case, the user (U) wishes to navigate to another floor, the said user (U) moves to the elevator (L) and chooses the desired floor from the options displayed on the screen of the I/O device (100).

The system (S) checks whether the auto-navigating vehicle (Av) is at the point of reference (POR), and if the auto-navigating vehicle (Av) is at the point of reference (POR), the pre-existing map of the selected floor (Fl1, Fl2…Fln) is loaded into the map server (303) and displayed on the screen of the I/O device (100).

Figure 4b depicts the navigation map (M1, M2…Mn) displayed in auto mode (Am) of the auto-navigation vehicle system (S). The user (U) can provide destinations for navigation on the displayed map (M1, M2…Mn) and the auto-navigating vehicle (Av) shall auto-navigate to said destination. The system (S) also provides a stop button for manual stop in said auto mode (Am).

Fixed mode:
Figure 3c depicts the Fixed mode (Fm) of navigation in the auto-navigation vehicle system (S). In Fixed mode (Fm), the user (U) selects fixed mode (Fm) from the drop-down menu on the software application (101) installed on an Input/Output device (100) connected to the auto-navigating vehicle through the wireless communication module (W). The drop-down menu for selecting the floor (Fl1, Fl2…Fln) is displayed on the screen of the Input/Output device (100) and the user (U) selects the desired floor (Fl1, Fl2…Fln) in the multi-storied building, whereupon the multiple navigation maps (M1, M2…Mn) of the building are downloaded on the screen of said I/O device (100), indicating the point of reference (POR), which is either the reference position (R) or the elevator (L).

In case the user (U) wishes to navigate within the same floor on which said user (U) is present, the user (U) will move to the reference position (R) and choose the desired floor (Fl1, Fl2…Fln) from the options displayed on the screen of the Input/Output device (100).
In case the user (U) wishes to navigate to another floor, said user (U) moves to the elevator (L) and chooses the desired floor from the options displayed on the screen of the Input/Output device (100).

The system (S) checks whether the auto navigating vehicle (Av) is at the point of reference (POR) and if the auto-navigating vehicle (Av) is at the point of reference (POR), the pre-existing map of the selected floor (Fl1, Fl2…Fln) is downloaded into the map server (303) and displayed on the screen of the Input/Output device (100).

Figure 4a depicts the navigation map (M1, M2…Mn) displayed in Fixed mode (Fm) of navigation in the auto-navigation vehicle system (S). The navigation map (M1, M2…Mn) displayed in the fixed mode (Fm) shows pre-defined destinations (D1, D2…Dn) along with the obstacles (O1, O2…On) (Fig. 6) in the chosen path or route to said destination. The user (U) can select any of said pre-defined destinations (D1, D2…Dn) on the displayed map (M1, M2…Mn), whereupon the auto-navigating vehicle (Av) shall navigate directly towards said destination (D1, D2…Dn).

Figure 5 depicts the operational flow diagram of auto-navigating vehicle system (S) providing multi-map navigation. In the flow diagram,
• x represents input for selecting mode
• m represents map switch / input for selecting map
• L1, L2, L3, L4, L5 and L6 are the elevator positions of each floor; and
• R1, R2, R3, R4, R5 and R6 are the reference positions on each floor

The navigation maps (M1, M2…Mn) can be selected from the software application (101) installed on the Input/Output device (100) via the wireless communication module (W). Any of the six navigation maps (M1, M2…Mn) can be downloaded according to the user's (U) input. When a navigation map (M1, M2…Mn) is selected as per the input of the user (U), its corresponding YAML file is loaded into the map server (303).
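
Such a YAML file follows the standard ROS map_server metadata format; the file name and values below are illustrative placeholders, not data from this specification:

```yaml
# Map metadata for one floor (illustrative values)
image: floor1.pgm            # occupancy grid image produced by SLAM
resolution: 0.05             # metres per pixel
origin: [-10.0, -10.0, 0.0]  # (x, y, yaw) of the lower-left map pixel
negate: 0
occupied_thresh: 0.65        # cells darker than this are obstacles
free_thresh: 0.196           # cells lighter than this are free space
```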

The initial position of the auto-navigating vehicle (Av) in any of the navigation maps (M1, M2….Mn) shall be the corresponding elevator (L1, L2…Ln). The user (U) shall ensure that the auto-navigating vehicle (Av) is inside the corresponding elevator (L) at the required floor before choosing or changing a map. The initial position of the auto-navigating vehicle (Av) is published to the topic ‘/initialpose’ as ‘PoseWithCovarianceStamped’ message.

The user (U) can choose the navigation map (M1, M2…Mn) of the desired floor (Fl1, Fl2…Fln) if the auto-navigating vehicle (Av) is at the corresponding point of reference (POR), which could be either a reference position or an elevator (L). The user (U) can provide destination inputs if said user (U) satisfies the above criterion. The goal positions or destinations of the auto-navigating vehicle (Av) are published to the topic '/move_base_simple/goal' as 'PoseStamped' messages.
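
A 'PoseStamped' goal carries a planar position plus a heading encoded as a quaternion. As a small self-contained sketch (the goal coordinates below are made up for illustration), the yaw-to-quaternion conversion used when constructing such a message is:

```python
import math

def yaw_to_quaternion(yaw):
    """Planar heading (yaw, radians) to the (x, y, z, w) quaternion
    used in the orientation field of a ROS PoseStamped message."""
    return (0.0, 0.0, math.sin(yaw / 2), math.cos(yaw / 2))

# A goal on the map: position plus heading, as the navigation stack expects
x, y, yaw = 3.5, 1.2, math.pi / 2   # illustrative coordinates
qx, qy, qz, qw = yaw_to_quaternion(yaw)
```

Only the z and w components are non-zero because the wheelchair rotates about the vertical axis only; the same encoding applies to the '/initialpose' message set at the point of reference.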

Figure 6 depicts a navigation map created for an indoor environment where O1, O2, O3, O4, O5 represent the obstacles in the path/route of the chosen destination by the user (U).

Figures 7a, 7b and 7c depict navigation maps (M1, M2…Mn) of different floors (Fl1, Fl2…Fln) generated in the multi-map navigation mode of the present auto-navigation vehicle system (S) in a multi-storied building.

Manual mode:
Figure 3d depicts the Manual mode (Mm) of navigation in the auto-navigation vehicle system (S). The user (U) selects the Manual mode (Mm) option from the drop-down menu of the software application installed on the Input/Output device (100). The Manual mode (Mm) enables the user (U) to maneuver the auto-navigating vehicle (Av) by using the direction tabs and stop button visible in manual mode (Mm). The said direction tabs maneuver the auto-navigating vehicle (Av) in the Forward, Backward, Left and Right directions.

It is evident from the above disclosure that if the user (U) wishes to maneuver the auto-navigating vehicle (Av) in a particular direction, the user (U) can simply choose the manual mode (Mm) and move accordingly in forward, backward, left, or right direction.

,CLAIMS:1. Auto-navigation vehicle system (S) for providing multi-map navigation in a multi-storied building to the user (U), said system comprising:
• auto-navigating vehicle (Av) comprising an Input/Output device (100) and an Isolation hood (400)
- said Input/Output device (100) comprising a software application (101) installed on said Input/Output device (100) and a wireless communication module (W)
- said Isolation hood (400) enabling secure transportation of the user (U), including infected patients, while maintaining social distancing
• Control unit (200) comprising a master-controller unit (201), a sub-controller unit (202), at least one sensor (s), and at least one motor driver (206) to drive at least one motor (207),
• Software unit (300) comprising the Robot Operating System (ROS), at least one Extended Kalman Filter (EKF) (301), at least one Laser filter (302) and at least one Map server (303)

wherein
• said software application (101) enables said user (U) to access the preferred mode of navigation selected from auto mode (Am), fixed mode (Fm) and manual mode (Mm)
• said wireless communication module (W) is a short-range wireless interconnection and facilitates duplex wireless communication between said Input/Output device (100) and said sub-controller unit (202)
• said master-controller unit (201) runs said Robot Operating System (ROS) and supervises said Control unit (200)
• said sub-controller unit (202) receives data from said sensors (s) and transmits processed data to said master-controller unit (201)
• said sensors (s) comprising
- at least one LiDAR (Light Detection and Ranging) sensor (203) capable of identifying the surroundings and objects to avoid any obstacle (O1, O2…On) in the path identified in said navigation map (M1, M2…Mn);
- at least one pair of rotary encoders (204) attached to the motors (207); and
- at least one inertial measurement unit (IMU) (205) capable of identifying the pose of the auto-navigating vehicle (Av)

• said Robot Operating System (ROS) being configured to visualize 2D and 3D values from ROS data using Simultaneous localization and mapping (SLAM) (304) and Adaptive Monte Carlo Localization (AMCL) (305) for localization and creation of multiple navigation maps (M1, M2…Mn),
• said Extended Kalman Filter (EKF) (301) being capable of combining measurement data from said IMU (205) and odometry (208),
• said Laser filter (302) being capable of processing scan data from said LiDAR (203),
• said Map server (303) being capable of storing and loading of said navigation maps (M1, M2…Mn)
said auto-navigation vehicle system (S) being deployed in a manner such that said auto-navigating vehicle (Av) is capable of accurate navigation in a multi-storied building by a single touch using said application (101) on said I/O device (100) based on said navigation maps (M1, M2…Mn) without any external intervention while maintaining social distancing.

2. The auto-navigation vehicle system (S) as claimed in claim 1 wherein said modes of navigation are fully autonomous or auto mode (Am), semi-autonomous or fixed mode (Fm) and manual mode (Mm) of which said auto-mode (Am) and fixed mode (Fm) are multi-map modes and auto-navigate the vehicle (Av) in a multi-storeyed building via elevator (L).

3. The auto-navigation vehicle system (S) as claimed in claim 1 wherein the operating system for said Input/Output device (100) is selected from, but is not limited to, Android and Apple iOS.

4. The auto-navigation vehicle system (S) as claimed in claim 1 wherein said wireless communication module (W) is capable of operating in a closed network environment having short-range wireless interconnection such as Bluetooth, as well as in an open network environment such as the internet or Wi-Fi.

5. The auto-navigation vehicle system (S) as claimed in claim 1 wherein said Robot Operating System (ROS) is installed on Ubuntu core platform.

6. The auto-navigation vehicle system (S) as claimed in claim 1 wherein said sub-controller unit (202) is used as an interface (I) for said wireless communication module (W), said rotary encoders (204) and said Inertial measurement unit (IMU) (205).

7. The auto-navigation vehicle system (S) as claimed in claim 1 wherein said LiDAR (203) is a 360-degree Laser sensor being capable of identifying the surrounding environment, detecting the obstacles (O1, O2…On), and localizing said auto-navigating vehicle (Av).

8. The auto-navigation vehicle system (S) as claimed in claim 1 wherein said rotary encoders (204) are capable of identifying the direction of rotation of the wheels of said auto-navigating vehicle (Av) by computing the displacement made by said auto-navigating vehicle (Av).
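Illustrative note (not part of the claims): claim 8 describes recovering direction and displacement from the rotary encoders (204). A minimal differential-drive odometry sketch of that computation, with encoder resolution, wheel radius and wheel base as assumed values not taken from the specification:

```python
import math

# Illustrative parameters only; not specified in the claims.
TICKS_PER_REV = 1024      # encoder counts per wheel revolution (assumed)
WHEEL_RADIUS = 0.10       # wheel radius in metres (assumed)
WHEEL_BASE = 0.55         # distance between driven wheels in metres (assumed)

def odometry_step(x, y, theta, left_ticks, right_ticks):
    """Advance a differential-drive pose estimate from one pair of signed
    encoder tick counts; the sign of the ticks captures the direction of
    rotation of each wheel."""
    per_tick = 2 * math.pi * WHEEL_RADIUS / TICKS_PER_REV
    d_left = left_ticks * per_tick           # left wheel displacement
    d_right = right_ticks * per_tick         # right wheel displacement
    d_centre = (d_left + d_right) / 2.0      # displacement of the vehicle centre
    d_theta = (d_right - d_left) / WHEEL_BASE
    x += d_centre * math.cos(theta + d_theta / 2.0)
    y += d_centre * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta
```

Equal tick counts on both wheels drive the pose straight ahead; opposite signs rotate it in place, which is how the encoder pair yields both displacement and direction.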

9. The auto-navigation vehicle system (S) as claimed in claim 1 wherein said Inertial measurement unit (IMU) (205) is at least a six-axis inertial measurement unit capable of determining the orientation of said auto-navigating vehicle (Av) with respect to the environment.

10. The auto-navigation vehicle system (S) as claimed in claim 1 wherein said Simultaneous localization and mapping (SLAM) (304) receives point cloud data from said LiDAR (203) and said IMU (205) for creating said navigation maps (M1, M2…Mn) and saving said navigation maps (M1, M2…Mn) to ROS file system using said map server (303).
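Illustrative note (not part of the claims): in the ROS navigation stack, the map server (303) of claim 10 corresponds to the standard map_server node, which serves an occupancy-grid image described by a small YAML file saved alongside it. A typical map YAML of that form (file name and values assumed):

```yaml
image: floor1.pgm            # occupancy-grid image saved by SLAM (assumed name)
resolution: 0.05             # metres per pixel
origin: [-10.0, -10.0, 0.0]  # x, y, yaw of the lower-left map pixel
occupied_thresh: 0.65        # cells darker than this are treated as obstacles
free_thresh: 0.196           # cells lighter than this are treated as free space
negate: 0
```

One such pair of files per floor (Fl1, Fl2…Fln) is what allows a map to be loaded on demand when the user selects a floor.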

11. The auto-navigation vehicle system (S) as claimed in claim 1 wherein said Adaptive Monte Carlo Localization (AMCL) (305) is capable of determining the position and orientation of said auto-navigating vehicle (Av) on said navigation maps (M1, M2…Mn) during navigation.
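Illustrative note (not part of the claims): AMCL as in claim 11 is a particle filter — it keeps many candidate poses, weights each by how well it explains the laser scan, and resamples so that good hypotheses survive. A minimal sketch of one such cycle, with the weighting function left abstract:

```python
import random

def mcl_update(particles, weight_fn):
    """One Monte Carlo localization cycle: weight each pose hypothesis
    by how well its predicted scan matches the real one (weight_fn),
    then resample in proportion to weight so low-weight poses die out."""
    weights = [weight_fn(p) for p in particles]
    total = sum(weights)
    probs = [w / total for w in weights]
    return random.choices(particles, weights=probs, k=len(particles))
```

After a few cycles the particle cloud collapses around the vehicle's true position and orientation on the loaded map.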

12. The auto-navigation vehicle system (S) as claimed in claim 1 wherein said Adaptive Monte Carlo Localization (AMCL) (305) is integrated with said Extended Kalman Filter (EKF) (301) and capable of reducing uncertainties in estimation of the position of said auto-navigating vehicle (Av).

13. The auto-navigation vehicle system (S) as claimed in claim 1 wherein IMU (205) data from said sub-controller unit (202) is calibrated based on the physical position of said IMU (205) and said data is provided to said Extended Kalman Filter (EKF) (301).

14. The auto-navigation vehicle system (S) as claimed in claim 1 wherein said Adaptive Monte Carlo Localization (AMCL) (305) receives filtered LiDAR (203) data from said laser filter (302), and filtered IMU (205) and odometry (208) data from said Extended Kalman Filter (EKF) (301), for scan matching, and provides accurate localization of the auto-navigating vehicle (Av).
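Illustrative note (not part of the claims): the uncertainty reduction attributed to the EKF (301) in claims 12 and 14 comes from variance-weighted fusion of redundant estimates, here odometry and IMU. A one-dimensional Kalman-style sketch of that mechanism (variances are hypothetical inputs):

```python
def fuse_heading(odom_theta, odom_var, imu_theta, imu_var):
    """Fuse two heading estimates: the estimate with the smaller
    variance receives the larger weight, and the fused variance is
    smaller than either input's, which is the uncertainty reduction."""
    k = odom_var / (odom_var + imu_var)        # Kalman gain
    theta = odom_theta + k * (imu_theta - odom_theta)
    var = (1.0 - k) * odom_var
    return theta, var
```

With equal variances the result is the plain average; as the IMU variance shrinks, the fused heading moves toward the IMU reading.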

15. A method for operating auto-navigation vehicle system (S) as claimed in claim 1 wherein said user (U) is capable of navigating said auto-navigating vehicle (Av) in auto mode (Am), fixed mode (Fm), and manual mode (Mm).

16. The method for operating auto-navigation vehicle system (S) as claimed in claim 15, wherein said user (U) selects said auto mode (Am) or fixed mode (Fm) on said application (101) on being present at said point of reference (POR).

17. The method for operating auto-navigation vehicle system (S) as claimed in claim 16, wherein said point of reference (POR) is either the reference position (R) or an elevator (L).

18. The method for operating auto-navigation vehicle system (S) as claimed in claim 15, wherein said user (U), on reaching said point of reference (POR), selects auto mode (Am) in said application (101) installed on said Input/Output device (100), wherein said auto mode (Am) comprises the steps of
• checking whether said Input/Output device (100) is connected with said wireless communication module (W)
• selecting auto mode (Am) displayed on screen from the dropdown menu of said application (101) installed on said I/O device (100)
• downloading navigation map (M1, M2…Mn) of said multi-storied building via said map server (303) on said I/O device (100) indicating said point of reference (POR), whereupon said user (U) moves to either said reference position (R) or said elevator (L)
• where said user (U) is navigating to the same floor,
- maneuvering said auto-navigating vehicle (Av) to said reference position (R)
- choosing said floor (Fl1, Fl2…Fln) from the dropdown menu in said application (101) on said I/O device (100)
- downloading said navigation map (M1, M2…Mn) on the screen of said I/O device (100) from said map server (303)
• where said user (U) is navigating to another floor,
- maneuvering said auto-navigating vehicle (Av) to said elevator (L)
- ensuring that said auto-navigating vehicle (Av) is inside said elevator (L)
- choosing the desired floor (Fl1, Fl2…Fln) from the dropdown menu in said application (101) on said I/O device (100)
- downloading said navigation map (M1, M2…Mn) on the screen of said I/O device (100) from said map server (303)

wherein said auto mode (Am) enables said user (U) to choose a destination for navigation on said navigation map (M1, M2…Mn) of said floor (Fl1, Fl2…Fln), whereupon said auto-navigating vehicle (Av) navigates towards said destination.
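Illustrative note (not part of the claims): the same-floor versus cross-floor branching in claim 18 can be sketched as a small decision function. All names and the returned step list are hypothetical, chosen only to mirror the claimed sequence:

```python
def plan_route(current_floor, target_floor, reference_position, elevator):
    """Sketch of the auto-mode decision: same-floor goals start from the
    reference position; cross-floor goals route the vehicle through the
    elevator before the target floor's map is downloaded."""
    if current_floor == target_floor:
        return ["move_to", reference_position, "load_map", target_floor]
    return ["move_to", elevator, "confirm_inside_elevator",
            "select_floor", target_floor, "load_map", target_floor]
```

Either branch ends with loading the map of the chosen floor, after which the destination is picked on that map and autonomous navigation begins.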

19. The method for operating auto-navigation vehicle system (S) as claimed in claim 15, wherein said user (U), on reaching said point of reference (POR), selects fixed mode (Fm) in said application (101) installed on said Input/Output device (100), wherein said fixed mode (Fm) comprises the steps of
• checking whether said Input/Output device (100) is connected with said wireless communication module (W)
• selecting fixed mode (Fm) displayed on screen from the dropdown menu of said application (101) installed on said I/O device (100)
• downloading navigation map (M1, M2…Mn) of said multi-storied building via said map server (303) on said I/O device (100) indicating said point of reference (POR), whereupon said user (U) moves to either said reference position (R) or said elevator (L)
• where said user (U) is navigating to the same floor,
- maneuvering said auto-navigating vehicle (Av) to said reference position (R)
- choosing said floor (Fl1, Fl2…Fln) from the dropdown menu in said application (101) on said I/O device (100)
- downloading said navigation map (M1, M2…Mn) on the screen of said I/O device (100) from said map server (303)
• where said user (U) is navigating to another floor,
- maneuvering said auto-navigating vehicle (Av) to said elevator (L)
- ensuring that said auto-navigating vehicle (Av) is inside said elevator (L)
- choosing the desired floor (Fl1, Fl2…Fln) from the dropdown menu in said application (101) on said I/O device (100)
- downloading said navigation map (M1, M2…Mn) on the screen of said I/O device (100) from said map server (303)
wherein
• said navigation map (M1, M2…Mn) displayed in said fixed mode (Fm) shows obstacles (O1, O2…On) on said floor (Fl1, Fl2…Fln)
• said fixed mode (Fm) enables said user (U) to navigate said auto-navigating vehicle (Av) to selected pre-defined destinations (D1, D2…Dn) on said navigation map (M1, M2…Mn) of said floor (Fl1, Fl2…Fln), whereupon said auto-navigating vehicle (Av) navigates towards the selected pre-defined destination (D1, D2…Dn).

20. The method for operating auto-navigation vehicle system (S) as claimed in claim 19, wherein on selection of said pre-defined destination (D1, D2…Dn), navigation mode of said auto-navigating vehicle (Av) cannot be changed mid-navigation.

21. The method for operating auto-navigation vehicle system (S) as claimed in claim 15, wherein said user (U) selects said manual mode (Mm) when said user (U) is maneuvering said auto-navigating vehicle (Av) in a particular direction, wherein said manual mode (Mm) comprises the steps of
• checking whether said Input/Output device (100) is connected with said wireless communication module (W)
• selecting manual mode (Mm) displayed on screen from the dropdown menu of said application (101) installed on said Input/Output device (100)
• maneuvering said auto-navigating vehicle (Av) towards its destination by using the direction tabs (forward, backward, left, right) visible on said Input/Output device (100).
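Illustrative note (not part of the claims): in a ROS-based vehicle, each direction tab in manual mode ultimately becomes a (linear, angular) velocity pair of the kind published on the cmd_vel topic. A sketch of that mapping, with the speed values assumed rather than taken from the specification:

```python
# Hypothetical linear and angular speeds; not specified in the claims.
SPEED = 0.4   # m/s
TURN = 0.8    # rad/s

def manual_command(tab):
    """Map a direction tab on the I/O device to a (linear, angular)
    velocity pair: forward/backward translate, left/right rotate."""
    return {"forward": (SPEED, 0.0),
            "backward": (-SPEED, 0.0),
            "left": (0.0, TURN),
            "right": (0.0, -TURN)}[tab]
```

Releasing the tab would publish (0.0, 0.0), stopping the vehicle; only the four listed tabs are accepted.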

Documents

Application Documents

# Name Date
1 202041021163-STATEMENT OF UNDERTAKING (FORM 3) [20-05-2020(online)].pdf 2020-05-20
2 202041021163-PROVISIONAL SPECIFICATION [20-05-2020(online)].pdf 2020-05-20
3 202041021163-FORM 1 [20-05-2020(online)].pdf 2020-05-20
4 202041021163-DECLARATION OF INVENTORSHIP (FORM 5) [20-05-2020(online)].pdf 2020-05-20
5 202041021163-Proof of Right [13-07-2020(online)].pdf 2020-07-13
6 202041021163-FORM-26 [13-07-2020(online)].pdf 2020-07-13
7 202041021163-ENDORSEMENT BY INVENTORS [13-07-2020(online)].pdf 2020-07-13
8 202041021163 _Correspondence_02-09-2020.pdf 2020-09-02
9 202041021163-DRAWING [20-05-2021(online)].pdf 2021-05-20
10 202041021163-COMPLETE SPECIFICATION [20-05-2021(online)].pdf 2021-05-20
11 202041021163-FORM 18 [27-06-2022(online)].pdf 2022-06-27
12 202041021163-EDUCATIONAL INSTITUTION(S) [27-06-2022(online)].pdf 2022-06-27
13 202041021163-FER.pdf 2022-12-14
14 202041021163-FER_SER_REPLY [13-06-2023(online)].pdf 2023-06-13
15 202041021163-FORM-26 [14-06-2023(online)].pdf 2023-06-14
16 202041021163-US(14)-HearingNotice-(HearingDate-10-05-2024).pdf 2024-04-18
17 202041021163-REQUEST FOR ADJOURNMENT OF HEARING UNDER RULE 129A [08-05-2024(online)].pdf 2024-05-08
18 202041021163-US(14)-ExtendedHearingNotice-(HearingDate-14-06-2024).pdf 2024-05-10
19 202041021163-Correspondence to notify the Controller [05-06-2024(online)].pdf 2024-06-05
20 202041021163-Correspondence to notify the Controller [13-06-2024(online)].pdf 2024-06-13
21 202041021163-FORM-26 [14-06-2024(online)].pdf 2024-06-14
22 202041021163-Written submissions and relevant documents [26-06-2024(online)].pdf 2024-06-26
23 202041021163-MARKED COPIES OF AMENDEMENTS [26-06-2024(online)].pdf 2024-06-26
24 202041021163-FORM-26 [26-06-2024(online)].pdf 2024-06-26
25 202041021163-FORM 13 [26-06-2024(online)].pdf 2024-06-26
26 202041021163-AMMENDED DOCUMENTS [26-06-2024(online)].pdf 2024-06-26
27 202041021163-FORM-8 [28-06-2024(online)].pdf 2024-06-28
28 202041021163-PatentCertificate22-07-2024.pdf 2024-07-22
29 202041021163-IntimationOfGrant22-07-2024.pdf 2024-07-22

Search Strategy

1 202041021163E_13-12-2022.pdf

ERegister / Renewals

3rd: 19 Oct 2024

From 20/05/2022 - To 20/05/2023

4th: 19 Oct 2024

From 20/05/2023 - To 20/05/2024

5th: 19 Oct 2024

From 20/05/2024 - To 20/05/2025

6th: 08 May 2025

From 20/05/2025 - To 20/05/2026

7th: 08 May 2025

From 20/05/2026 - To 20/05/2027

8th: 08 May 2025

From 20/05/2027 - To 20/05/2028

9th: 08 May 2025

From 20/05/2028 - To 20/05/2029

10th: 08 May 2025

From 20/05/2029 - To 20/05/2030