
A Steering Responsive Camera Control System For Autonomous Navigation And Method Thereof

Abstract: The present invention discloses a steering-responsive camera control system for autonomous navigation and a corresponding method. The system (100) comprises a steering angle sensor (101) connected to the steering of an autonomous vehicle to detect and transmit steering angle data to a controller (102), which processes the received data to determine the region of interest of a camera (104) attached to the autonomous vehicle. The camera (104) dynamically adjusts its region of interest based on the command signals received from the controller (102) and transmits the captured image data back to the controller (102) for obstacle detection. Upon detection of obstacles within a range of four to six meters from the autonomous vehicle, the controller (102) generates brake actuation signals and transmits them to the brake actuation module to engage the brakes. (Figure 1)


Patent Information

Application #: 202341079263
Filing Date: 22 November 2023
Publication Number: 21/2025
Publication Type: INA
Invention Field: MECHANICAL ENGINEERING
Status:
Parent Application:

Applicants

NMICPS Technology Innovation Hub on Autonomous Navigation Foundation
C/O Indian Institute of Technology (IIT) Hyderabad, Kandi-502285, Sangareddy, Telangana, India.
Indian Institute of Technology (IIT) Hyderabad
A417, Academic Block A, Indian Institute of Technology (IIT) Hyderabad, Kandi-502285, Sangareddy, Telangana, India.

Inventors

1. Prof. Rajalakshmi Pachamuthu
C/O Indian Institute of Technology (IIT) Hyderabad, Kandi-502285, Sangareddy, Telangana, India.
2. Dr. Swapnil Dipak Shinde
C/O Indian Institute of Technology (IIT) Hyderabad, Kandi-502285, Sangareddy, Telangana, India.

Specification

Priority Claim:

[0001] This application claims priority from the provisional application number 202341079263 filed with the Indian Patent Office, Chennai, on 22nd November 2023, entitled “A steering-responsive camera control system for autonomous navigation and method thereof”, the entirety of which is expressly incorporated herein by reference.
Preamble to the Description

[0002] The following specification particularly describes the invention and the manner in which it is to be performed:
DESCRIPTION OF THE INVENTION
Technical field of the invention
[0003] The present invention relates to an autonomous navigation system for an Unmanned Ground Vehicle (UGV) equipped with a steering angle sensor and a camera for obstacle detection. More specifically, the present invention relates to a control system for the UGV that utilizes data from the steering angle sensor to dynamically adjust the field of view of the camera for detecting obstacles and actuating a braking force, thereby enhancing the safety of the UGV. The present invention further discloses a method for obstacle detection through the fusion of steering angle data and the region of interest of the camera.
Background of the invention
[0004] In recent years, the development of Unmanned Ground Vehicles (UGVs) has gained significant momentum, driven by their potential applications in various domains, such as military operations, agriculture, transportation, and surveillance. The development of autonomous navigation in UGVs is largely attributable to advances in multiple interconnected fields, such as robotics, artificial intelligence, and sensor technology, including Light Detection and Ranging (LiDAR), Radio Detection and Ranging (RADAR), and ultrasonic sensors, which help a UGV perceive the environment, detect obstacles, and map the surroundings. Further, the integration of Artificial Intelligence (AI) for interpreting sensor data, coupled with machine learning algorithms, enables UGVs to make informed decisions, while computer vision analyses visual data to aid in path detection, sign reading, and obstacle identification. The integration of the Global Positioning System (GPS) and Global Navigation Satellite Systems (GNSS) has further enhanced navigational accuracy, allowing for precise positioning over large areas.
[0005] However, current autonomous navigation systems face challenges in their approach to environmental monitoring. Current systems typically maintain constant surveillance of their entire field of view, regardless of the vehicle’s steering angle or immediate path. This uniform surveillance leads to inefficient resource allocation and delayed responses to obstacles in the vehicle’s current trajectory. Further, the power requirements of the UGV, owing to the integration of multiple sensors and actuators, reduce the vehicle’s operational range. Additionally, memory storage capacity issues arise from processing and storing data from numerous sensors.
[0006] Several attempts have been made to address these challenges. For instance, Patent Application No. IN202321029544A, entitled “Unmanned ground vehicle with advanced sensor and navigation systems for autonomous operation”, discloses an unmanned ground vehicle (UGV) for autonomous functionality. The UGV, built on a commercial off-road platform, comprises an electrical actuator, a dual-voltage energy unit, and a sensor suite with multiple cameras and laser scanners. The invention features a navigation system that autonomously generates routes based on environmental data, ideal for remote or hazardous locations. The autonomous vehicle incorporates a motion control system for obstacle avoidance. The UGV further includes a graphical interface for control and monitoring, and is capable of extended operation with a supplementary charger for convenient power replenishment.
[0007] Patent Application No. KR101454153B1, entitled “Navigation system for unmanned ground vehicle by sensor fusion with virtual lane”, discloses a navigation system for a self-driving vehicle that combines multiple sensors with virtual lane technology. The system ensures positional accuracy and stable autonomous operation by fusing sensor data and image sensor-detected lane information. The invention comprises several key units, such as a location information measuring unit for real-time vehicle positioning, a virtual lane setting unit that establishes a driving path using stored map data and waypoints, and a driving information detection unit that captures the actual lane information as the vehicle follows the set path. The system further comprises a location information correction unit that adjusts any discrepancies in the measured location data, and a location information output unit that provides the corrected vehicle location. The correction is achieved through map matching between the virtual lane information and the actual location data.
[0008] Despite the advancements in autonomous navigation systems, there remains a need for an improved environmental monitoring system for UGVs that optimizes sensor utilization, enhances obstacle detection during turns, and reduces computational complexity, energy usage, and the cost of operation while maintaining operational efficiency.
Summary of the invention
[0009] The present invention addresses the limitations of the prior art by disclosing a steering-responsive camera control system for autonomous navigation that dynamically adjusts the region of interest of a camera attached to the UGV for the detection of obstacles.
[0010] The system comprises a steering angle sensor that detects and transmits steering angle data to a controller, which processes the steering angle data to determine the region of interest of the camera. Further, the camera, connected to the controller, dynamically adjusts its field of view according to commands received from the controller, based on the angle of movement of the steering.
[0011] In an embodiment, for steering angles between -50 degrees and +50 degrees, the camera maintains a centralized polygonal boundary as its region of interest. Further, for steering angles exceeding +50 degrees during a right turn, the camera adjusts the region of interest to a trapezium-shaped boundary extending towards the right side of the UGV. Additionally, for steering angles below -50 degrees during a left turn, the camera adapts the region of interest to a trapezium-shaped boundary towards the left side of the UGV. The camera captures and transmits image data back to the controller for further processing and obstacle detection. The camera functions as both an imaging device and a distance sensor, enabling image capture and distance measurement within the adjusted region of interest.
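For illustration only, the threshold mapping described in this embodiment may be sketched in Python as follows. The ±50 degree thresholds and the boundary shapes are taken from the specification; the frame dimensions, vertex coordinates, and the function name select_roi are illustrative assumptions and do not form part of the disclosed system.

    # Sketch of the steering-angle-to-ROI mapping described in [0011].
    # The +/-50 degree thresholds and boundary shapes follow the text;
    # frame size and vertex coordinates are illustrative assumptions.
    def select_roi(steering_angle_deg, frame_w=1280, frame_h=720):
        """Return ROI polygon vertices (x, y) for a given steering angle."""
        if steering_angle_deg > 50:    # right turn: trapezium toward the right
            return [(frame_w // 2, frame_h // 2), (frame_w, frame_h // 2),
                    (frame_w, frame_h), (frame_w // 3, frame_h)]
        if steering_angle_deg < -50:   # left turn: trapezium toward the left
            return [(0, frame_h // 2), (frame_w // 2, frame_h // 2),
                    (2 * frame_w // 3, frame_h), (0, frame_h)]
        # between -50 and +50 degrees: centralized polygonal boundary
        return [(frame_w // 3, frame_h // 2), (2 * frame_w // 3, frame_h // 2),
                (5 * frame_w // 6, frame_h), (frame_w // 6, frame_h)]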
[0012] The system further comprises a brake actuation module that engages upon detection of obstacles within a range of four to six meters from the UGV, ensuring timely intervention for safety.
[0013] The present invention further discloses a method for steering-responsive camera control in autonomous navigation. The method comprises receiving steering angle data from the steering angle sensor attached to the steering of a UGV and transmitting it to a controller for processing and determining the region of interest for the camera. Further, the camera dynamically adjusts its region of interest based on the command signals received from the controller and transmits the captured image data back to the controller for obstacle detection. Furthermore, the controller processes the image data to detect obstacles and, on detecting obstacles within a range of four to six meters from the UGV, generates brake actuation signals and transmits them to the brake actuation module. Consequently, the brake actuation module engages the vehicle brakes to ensure safe operation of the UGV.
Brief Description of drawings
[0014] Figure 1 illustrates a block diagram of a steering-responsive camera control system for autonomous navigation, in accordance with an embodiment of the invention.
[0015] Figure 2 illustrates a flowchart of a method used in the steering-responsive camera control for autonomous navigation, in accordance with an embodiment of the invention.
Detailed description of the invention
[0016] In order to describe and point out the subject matter of the invention, the following definitions are provided for specific terms, which are used in the following written description more clearly and concisely.
[0017] The term “Autonomous vehicle” refers to a vehicle equipped with a combination of multiple sensors, camera, and artificial intelligence that allow the vehicle to navigate and operate without any human intervention.
[0018] The term “Unmanned ground vehicle (UGV)” refers to a ground-based autonomous vehicle that operates without any direct human control.
[0019] The term “Steering angle sensor” refers to a sensing device that measures the angular displacement of a steering wheel of a vehicle.
[0020] The term “Region of Interest (ROI)” refers to a focused area within the field of view of a camera.
[0021] The term “Brake actuator” refers to a mechanical or electromechanical component of a vehicle that executes braking commands received from a controller.
[0022] The present invention discloses a steering-responsive camera control system for autonomous navigation. The system comprises a steering angle sensor connected to the steering of a UGV to detect and transmit steering angle data to a controller, which processes the data to determine the region of interest of the camera. Further, the camera dynamically adjusts its region of interest based on commands from the controller and transmits the captured image data to the controller for obstacle detection. Furthermore, on detection of obstacles within a range of four to six meters from the UGV, the controller generates brake actuation signals and transmits them to the brake actuation module to engage the brakes.
[0023] The present invention further discloses a method for steering-responsive camera control in autonomous navigation. The method involves receiving steering angle data from the steering angle sensor; transmitting the steering angle data to the controller to determine the region of interest of the camera; adjusting the region of interest of the camera based on the commands received from the controller; transmitting the captured image data to the controller; processing the image data to detect obstacles; and generating brake actuation signals on detection of obstacles within a range of four to six meters from the UGV.
[0024] In an embodiment, the steering angle sensor captures the degree of rotation of the vehicle's steering corresponding to the radius of curvature of the road during a turn. The steering angle data is then processed by the controller using an algorithm that aligns the region of interest of the camera accordingly.
[0025] In an embodiment of the invention, for a forward movement of the UGV, the camera frame showcases the region of interest at a straight-ahead steering angle (θ) of zero (0) degrees, defining a polygonal boundary positioned centrally within the frame of the camera. The boundary of the region of interest encloses the area immediately in front of the UGV for obstacle detection and navigation assistance.
[0026] In another embodiment of the invention, for a right turn of the UGV, the region of interest is defined by a trapezium-shaped boundary extending towards the right side of the camera frame, corresponding to the projected path of the vehicle during the turn. Within the trapezium, the controller actively tracks potential obstacles, with a secondary bounding box placed around the identified obstacles, along with numerical data indicating the distance of the obstacle from the camera.
[0027] In yet another embodiment of the invention, for a left turn of the UGV, the region of interest is defined by a trapezium-shaped boundary that aligns with the expected trajectory of the vehicle into the adjacent lane towards the left side of the camera frame. Within the trapezium, the controller actively tracks potential obstacles, with a secondary bounding box placed around the identified obstacles, along with numerical data indicating the distance of the obstacle from the camera.
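For illustration only, the overlay described in these embodiments (the trapezium boundary, the secondary bounding box around an identified obstacle, and the numerical distance indication) may be sketched with the OpenCV library as follows. The detection inputs are assumed to come from an upstream detector and are placeholders here; the function name and drawing parameters are illustrative assumptions.

    # Illustrative overlay of the turn ROI and tracked obstacles per
    # [0026]-[0027]: ROI boundary, secondary bounding box, distance label.
    import cv2
    import numpy as np

    def draw_roi_and_obstacles(frame, roi_polygon, detections):
        """Draw the ROI; box and label each obstacle that falls inside it.
        detections: list of ((x, y, w, h), distance_in_meters)."""
        pts = np.array(roi_polygon, dtype=np.int32)
        cv2.polylines(frame, [pts], isClosed=True, color=(0, 255, 0), thickness=2)
        for (x, y, w, h), dist_m in detections:
            cx, cy = x + w // 2, y + h // 2
            # keep only obstacles whose center lies within the ROI boundary
            if cv2.pointPolygonTest(pts, (float(cx), float(cy)), False) >= 0:
                cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 0, 255), 2)
                cv2.putText(frame, f"{dist_m:.1f} m", (x, y - 8),
                            cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 0, 255), 2)
        return frame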
[0028] Figure 1 illustrates a block diagram of a steering-responsive camera control system for autonomous navigation, in accordance with an embodiment of the invention. The system (100) comprises a steering angle sensor (101) connected to the steering of the UGV to detect and transmit steering angle data to a controller (102). The controller (102) processes the received data to determine the region of interest for a camera (104). Further, the camera (104), connected to the controller (102), dynamically adjusts its region of interest according to the commands received from the controller (102). In an embodiment, the camera (104) functions as both an imaging device and a distance sensor, capturing and transmitting image signals to the controller (102) for obstacle detection. Furthermore, the controller (102) processes the image signals and generates brake actuation signals upon detection of obstacles within a range of four to six meters from the UGV. Further, the brake actuation module (103) receives the signals from the controller (102) and engages the vehicle brakes. This safety mechanism ensures that the UGV promptly responds to near-field obstacles detected during operation, thereby enhancing its autonomous driving capabilities.
[0029] Figure 2 illustrates a flowchart of a method used in the steering-responsive camera control for autonomous navigation, in accordance with an embodiment of the invention. The method (200) for steering-responsive camera control for autonomous navigation comprises the steps of receiving the real-time steering angle data from the steering angle sensor and transmitting it to the controller in step (201). The steering angle data is processed by the controller in step (202). Further, in step (203), a dynamic region of interest for the camera is determined and adjusted based on the processed steering angle data. Subsequently, the image data of the adjusted region of interest is captured and transmitted back to the controller by the camera in step (204). Further, in step (205), the presence of one or more potential obstacles within the adjusted region of interest is evaluated by the controller from the received image data. Additionally, in step (206), a brake actuation signal is sent by the controller to a brake actuation module upon detection of at least one obstacle within a range of four to six meters from the UGV, initiating a timely braking action to avoid the risk of collision.
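For illustration only, the method (200) may be expressed as the following Python control-loop sketch. The sensor, camera, detector, and brake interfaces are hypothetical stand-ins for the hardware drivers; select_roi is the illustrative sketch given earlier, and the braking condition reflects the four-to-six-meter window stated in the specification (Example 1 brakes for any obstacle under six meters).

    # Control-loop sketch of method (200), steps (201)-(206). The
    # read_steering_angle, capture_roi, detect_obstacles, and engage
    # interfaces are hypothetical stand-ins for hardware drivers.
    BRAKE_DISTANCE_M = 6.0  # upper bound of the 4-6 m actuation window

    def control_loop(sensor, camera, detector, brakes):
        while True:
            angle = sensor.read_steering_angle()          # step (201)
            roi = select_roi(angle)                       # steps (202)-(203)
            frame = camera.capture_roi(roi)               # step (204)
            obstacles = detector.detect_obstacles(frame)  # step (205): [(box, dist), ...]
            # step (206): actuate the brakes when an obstacle is in range
            if any(dist <= BRAKE_DISTANCE_M for _, dist in obstacles):
                brakes.engage()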
[0030] The present invention offers numerous advantages that enhance the operation of Unmanned Ground Vehicles (UGVs). The fusion of steering angle sensor (101) data and camera (104) data enables a synergistic approach in which the system (100) dynamically shifts the camera's (104) region of interest in accordance with the turning of the UGV. By reducing the dependency on multiple cameras, the system (100) reduces both complexity and the number of components, lowering cost and power consumption and providing a more economical and energy-efficient solution.
[0031] Further, the fusion of camera (104) and steering angle data provides the system (100) with a comprehensive understanding of the environment of the UGV, aiding in collision prevention and road safety.
[0032] Furthermore, the ability of the system (100) to adjust the camera’s (104) region of interest to different driving scenarios improves its adaptability, ensuring optimal navigation performance across various terrains and traffic conditions.
[0033] Additionally, the prioritization of detected objects based on the steering angle minimizes the occurrence of false alarms. Further, by focusing on areas directly in the projected path of the UGV, the system (100) enhances operational reliability.
[0034] Moreover, the system (100) reduces memory storage requirements by processing data from fewer sensors, simplifying the algorithm and improving the overall efficiency of the system (100).
[0035] Having generally described this invention, a further understanding can be obtained by reference to a specific example, which is provided herein for the purpose of illustration only and is not intended to be limiting unless otherwise specified.
Example 1: Demonstrative illustration of the usage of the proposed invention
[0036] As an illustrative example, consider a UGV equipped with the steering-responsive camera control system (100) navigating a curved road. As the UGV approaches a turn, the steering angle sensor (101) detects the steering angle and transmits this data to the controller (102). For a right turn exceeding 50 degrees, the camera (104) dynamically adjusts its region of interest to form a trapezium-shaped boundary extending towards the right side, aligning with the projected path of the UGV. If a pedestrian or any other obstacle appears within this adjusted region at less than six meters, the controller (102) processes this information from the received image signals and immediately transmits brake actuation signals to the brake actuation module (103). The brake actuation module (103) engages the vehicle brakes, ensuring a timely braking action to avoid the risk of collision.
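For illustration only, the scenario above may be traced numerically with the earlier sketches. The 55 degree angle, the bounding box, and the 5.2 m distance are invented values consistent with the example, not measurements from the disclosed system.

    # Worked instance of Example 1: right turn of 55 degrees, pedestrian
    # detected 5.2 m ahead inside the adjusted ROI (values illustrative).
    roi = select_roi(55)                     # > +50 deg: right-side trapezium
    pedestrian = ((880, 420, 60, 140), 5.2)  # (bounding box, distance in m)
    if pedestrian[1] <= 6.0:                 # within the braking window
        print("brake actuation signal sent to brake actuation module (103)")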
Reference Numbers:
Component    Reference Number
System 100
Steering angle sensor 101
Controller 102
Brake actuation module 103
Camera 104

CLAIMS
We claim:
1. A steering-responsive camera control system for autonomous navigation of a vehicle, the system (100) comprising:
a. a steering angle sensor (101) connected to a steering of an Unmanned Ground Vehicle (UGV) to continuously receive a plurality of real-time steering angle data; and

b. a controller (102) connected to the steering angle sensor (101) to receive and process the steering angle data to determine and control a region of interest of a camera (104) connected to the controller (102);
wherein the camera (104) continuously captures the real-time image data of the region of interest and continuously transmits the captured image data to the controller (102).

2. The system (100) as claimed in claim 1, wherein the controller (102) is configured to evaluate the presence of one or more obstacles within the adjusted region of interest of the camera (104) and to generate a brake actuation signal on detection of at least one obstacle.

3. The system (100) as claimed in claim 1, wherein the controller (102) is configured to send a brake actuation signal to a brake actuation module (103) on detection of at least one obstacle within a range of 4 to 6 meters from the UGV.

4. The system (100) as claimed in claim 1, wherein the brake actuation module (103) connected to the controller (102) is configured to apply braking force on one or more wheels of the UGV in response to the brake actuation signal received from the controller (102).

5. The system (100) as claimed in claim 1, wherein the camera (104) is configured to function as both an imaging device and a distance sensor.

6. The system (100) as claimed in claim 1, wherein the controller (102) is configured to adjust the region of interest of the camera (104) by defining a polygonal-shaped boundary within the field of view of the camera (104).

7. The system (100) as claimed in claim 1, wherein the controller (102) divides the region of interest of the camera (104) into one or more regions based on the steering angle thresholds:

a. a first region of interest for the steering angle between -50 degrees and +50 degrees;
b. a second region of interest for the steering angle greater than +50 degrees; and
c. a third region of interest for the steering angle less than -50 degrees.

8. A method for steering-responsive camera control in autonomous navigation, the method (200) comprising the steps of:
a. receiving real-time steering angle data of an autonomous vehicle from a steering angle sensor (201);
b. processing the steering angle data by a controller (202);
c. determining and adjusting a dynamic region of interest for a camera (203);
d. capturing and sending the image data of the adjusted region of interest by the camera to the controller (204);
e. evaluating the presence of one or more potential obstacles within the adjusted region of interest by the controller (205); and
f. sending a brake actuation signal to a brake actuation module on detection of at least one obstacle within a range of four to six meters from the autonomous vehicle (206).

9. The method (200) as claimed in claim 8, wherein the determining and adjusting the dynamic region of interest of the camera comprises:
a. establishing a polygonal boundary for a straight path navigation of the autonomous vehicle;
b. establishing a right-oriented trapezium-shaped boundary for a right turn; and
c. establishing a left-oriented trapezium-shaped boundary for a left turn of the autonomous vehicle.

Documents

Application Documents

# Name Date
1 202341079263-PROVISIONAL SPECIFICATION [22-11-2023(online)].pdf 2023-11-22
2 202341079263-PROOF OF RIGHT [22-11-2023(online)].pdf 2023-11-22
3 202341079263-POWER OF AUTHORITY [22-11-2023(online)].pdf 2023-11-22
4 202341079263-FORM 1 [22-11-2023(online)].pdf 2023-11-22
5 202341079263-DRAWINGS [22-11-2023(online)].pdf 2023-11-22
6 202341079263-FORM-5 [22-11-2024(online)].pdf 2024-11-22
7 202341079263-FORM 3 [22-11-2024(online)].pdf 2024-11-22
8 202341079263-DRAWING [22-11-2024(online)].pdf 2024-11-22
9 202341079263-COMPLETE SPECIFICATION [22-11-2024(online)].pdf 2024-11-22
10 202341079263-FORM 18 [26-03-2025(online)].pdf 2025-03-26
11 202341079263-RELEVANT DOCUMENTS [18-11-2025(online)].pdf 2025-11-18
12 202341079263-POA [18-11-2025(online)].pdf 2025-11-18
13 202341079263-FORM 13 [18-11-2025(online)].pdf 2025-11-18