
System And Method For Real Time Vehicle Occupancy Monitoring And Alert Generation

Abstract: Disclosed is a system (100) for monitoring vehicle occupancy in real-time. The system (100) includes at least one imaging unit (110) disposed over at least one door of a vehicle (102) for capturing real-time images or videos. A location tracking unit (112) obtains real-time location and corresponding time. An edge device (104) with a processing unit (106) receives the real time images or videos, the real-time location data and the time, and processes the images or videos, determines passenger entry/exit events, and calculates real-time occupancy count. An information processing apparatus (120) with processing circuitry (122) receives the occupancy count and location data, generates alerts based on vehicle conditions and occupancy levels, and identifies occupancy patterns. The system (100) further includes a local database (128) at a vehicle depot (126) for storing data, and an interface (114) for notifying the vehicle managing authority about alerts.


Patent Information

Application #
Filing Date
08 August 2025
Publication Number
45/2025
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Parent Application

Applicants

IITI DRISHTI CPS FOUNDATION
IIT Indore, Khandwa Road, Simrol, Indore, Madhya Pradesh, 453552, India
Indian Institute of Technology Roorkee
Indian Institute of Technology Roorkee, Roorkee, Haridwar, Uttarakhand, 247667, India

Inventors

1. Amit Agarwal
Indian Institute of Technology Roorkee, Roorkee, Haridwar, Uttarakhand, 247667, India
2. Karthik Krishnan O
Indian Institute of Technology Roorkee, Roorkee, Haridwar, Uttarakhand, 247667, India
3. Ritesh Singh
Indian Institute of Technology Roorkee, Roorkee, Haridwar, Uttarakhand, 247667, India

Specification

Description:
FIELD OF DISCLOSURE
The present disclosure relates to vehicle occupancy monitoring systems, and more particularly to a system and method for real-time vehicle occupancy monitoring and alert generation.
BACKGROUND
Public transportation systems play a crucial role in urban mobility, providing essential services for commuters and contributing to the reduction of traffic congestion and environmental impact. These systems typically include various modes of transport such as buses, trains, and metros, which serve large numbers of passengers daily.
Conventional public transportation systems often rely on manual methods or basic technologies for tracking vehicle locations and estimating passenger occupancy. Some existing solutions utilize GPS-based tracking systems to provide real-time location information of vehicles. However, these systems generally lack the capability to accurately monitor and report passenger occupancy in real-time.
Certain advanced public transportation systems have implemented passenger counting technologies, such as infrared sensors or weight-based mechanisms. While these solutions offer some level of occupancy monitoring, they often face challenges in accurately counting passengers in crowded scenarios, particularly in densely populated urban areas. Additionally, many of these systems struggle to provide real-time data updates, leading to delays in information dissemination to both operators and passengers.
Current public transportation systems often face difficulties in efficiently managing vehicle schedules and routes based on real-time demand. The lack of accurate, up-to-date occupancy data can result in overcrowded vehicles during peak hours and underutilized resources during off-peak times. This inefficiency can lead to passenger discomfort, increased wait times, and reduced overall system performance.
Furthermore, existing systems may not adequately address the need for timely alerts regarding vehicle breakdowns, off-route movements, or extended periods of idling. The absence of such real-time monitoring and alert mechanisms can result in delayed responses to operational issues, potentially impacting service quality and passenger satisfaction.
Another significant challenge faced by urban transport agencies is the efficient management and storage of surveillance data collected from transit vehicles. While occupancy counting data may be transmitted wirelessly due to its relatively small size, the abundant video data captured by onboard camera systems cannot be feasibly transferred through wireless networks due to bandwidth limitations and associated costs. Transit agencies require access to recorded video footage for monitoring and surveillance purposes, particularly in cases of security incidents or operational investigations. However, the lack of efficient data transfer mechanisms often results in the need for manual data collection or expensive wired infrastructure, creating operational inefficiencies and increased maintenance costs.
Existing solutions typically require either physical retrieval of storage devices or establishment of permanent wired connections at depot locations, both of which present logistical challenges and may compromise system reliability. The absence of automated, wireless data transfer capabilities for video files with longer duration creates a gap between the need for comprehensive surveillance data and the practical limitations of current data management approaches in dynamic transportation environments.
Therefore, there exists a need for a technical solution that solves the aforementioned problems of conventional systems and methods for monitoring vehicle occupancy and generating alerts in public transportation.
SUMMARY
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
In an aspect of the present disclosure, a system for monitoring vehicle occupancy in real-time is disclosed. The system includes at least one imaging unit disposed over at least one door of a vehicle. The imaging unit is configured to capture real time images or videos of a field of view proximate to the at least one door of the vehicle, such that the at least one door of the vehicle allows the one or more passengers to enter into or exit from the vehicle. The system includes a location tracking unit configured to obtain real-time location data of the vehicle and time corresponding to the real-time location data. The system includes an edge device coupled to the at least one imaging unit and the location tracking unit. The edge device comprises a processing unit configured to receive the real time images or videos, the real-time location data of the vehicle and the time, determine speed data based on the real-time location data of the vehicle and the time, process the real time images or videos based on the real-time location data and the speed data, determine one of an entry event, an exit event, or a combination thereof, of one or more passengers through the at least one door of the vehicle based on the processed real time images or videos, and determine a real time occupancy count based on the determined one of the entry event, the exit event, or the combination thereof, of the one or more passengers. The system includes an information processing apparatus coupled to the edge device and the location tracking unit. The information processing apparatus comprises processing circuitry configured to receive the real time occupancy count from the processing unit and the real-time location data from the location tracking unit, generate one of a first alert, a second alert, or a combination thereof, and identify occupancy patterns of the vehicle.
The first alert is generated based on one of breakdown of the vehicle, off-route movements of the vehicle, extended idling of the vehicle, or a combination thereof. The second alert is generated when the real time occupancy count exceeds a predefined occupancy threshold.
In some aspects of the present disclosure, the processing unit is further configured to determine a tampering of the edge device or the location tracking unit, or change in the field of view of at least one imaging unit. The processing unit is configured to generate a third alert based on the tampering of the edge device, or the location tracking unit, or change in the field of view of the at least one imaging unit.
In some aspects of the present disclosure, the processing circuitry is further configured to generate an adjusted vehicle scheduling or routing based on the occupancy patterns of the vehicle.
In some aspects of the present disclosure, the system further includes a local database disposed at a vehicle depot. The local database is configured to be coupled to the edge device by way of a local network when the vehicle comes within the vicinity of the vehicle depot such that the local database is configured to receive and store the real time images or videos stored in a memory of the edge device.
In some aspects of the present disclosure, the system further includes an interface configured to receive and notify a vehicle managing authority of the vehicle about one of the first alert, the second alert, the third alert, or a combination thereof.
In an aspect of the present disclosure, a method for monitoring vehicle occupancy in real-time is disclosed. The method includes capturing, by at least one imaging unit disposed over at least one door of a vehicle, real time images or videos of a field of view proximate to the at least one door of the vehicle, such that the at least one door of the vehicle allows the one or more passengers to enter into or exit from the vehicle. The method includes obtaining, by a location tracking unit, real-time location data of the vehicle and time corresponding to the real-time location data. The method includes receiving, by a processing unit of an edge device coupled to the at least one imaging unit and the location tracking unit, the real time images or videos, the real-time location data and the time. The method includes determining, by the processing unit, speed data based on the real-time location data of the vehicle and the time. The method includes processing, by the processing unit, the real time images or videos based on the real-time location data and the speed data. The method includes determining, by the processing unit, one of an entry event, an exit event, or a combination thereof, of one or more passengers through the at least one door of the vehicle based on the processed real time images or videos. The method includes determining, by the processing unit, a real time occupancy count based on the determined one of the entry event, the exit event, or the combination thereof, of the one or more passengers. The method includes receiving, by processing circuitry of an information processing apparatus, the real time occupancy count from the processing unit and the real-time location data from the location tracking unit. The method includes generating, by the processing circuitry, one of a first alert, a second alert, or a combination thereof.
The first alert is generated based on one of breakdown of the vehicle, off-route movements of the vehicle, extended idling of the vehicle, or a combination thereof. The second alert is generated when the real time occupancy count exceeds a predefined occupancy threshold. The method includes identifying, by the processing circuitry, occupancy patterns of the vehicle.
In some aspects of the present disclosure, the method further includes determining, by the processing unit or the location tracking unit, a tampering of the edge device or change in the field of view of at least one imaging unit. The method includes generating, by the processing unit, a third alert based on the tampering of the edge device or change in the field of view of at least one imaging unit.
In some aspects of the present disclosure, the method further includes generating, by the processing circuitry, an adjusted vehicle scheduling or routing based on the occupancy patterns of the vehicle.
In some aspects of the present disclosure, the method further includes receiving and storing, by a local database disposed at a vehicle depot, the real time images or videos stored in a memory of the edge device when the vehicle comes within the vicinity of the vehicle depot. The local database is coupled to the edge device by way of a local network.
In some aspects of the present disclosure, the method further includes receiving and notifying, by an interface, a vehicle managing authority of the vehicle about one of the first alert, the second alert, the third alert, or a combination thereof.
The foregoing general description of the illustrative aspects and the following detailed description thereof are merely exemplary aspects of the teachings of this disclosure and are not restrictive.
BRIEF DESCRIPTION OF FIGURES
The following detailed description of the preferred aspects of the present disclosure will be better understood when read in conjunction with the appended drawings. The present disclosure is illustrated by way of example, and not limited by the accompanying figures, in which like references indicate similar elements.
FIG. 1 illustrates a block diagram of a system for monitoring vehicle occupancy in real-time, according to aspects of the present disclosure.
FIG. 2 illustrates a block diagram of a processing unit for the system of FIG. 1, according to an exemplary aspect of the present disclosure.
FIG. 3 illustrates a block diagram of an information processing apparatus for the system of FIG. 1, according to an exemplary aspect of the present disclosure.
FIG. 4 illustrates a flowchart of a method for monitoring vehicle occupancy in real-time, according to aspects of the present disclosure.
DETAILED DESCRIPTION
The following description sets forth exemplary aspects of the present disclosure. It should be recognized, however, that such description is not intended as a limitation on the scope of the present disclosure. Rather, the description also encompasses combinations and modifications to those exemplary aspects described herein.
The present disclosure introduces a system and method to enhance public transportation efficiency and safety. The system and method integrate real-time imaging, location tracking, and data processing technologies to provide accurate passenger counting, occupancy monitoring, and vehicle status tracking. The system includes vehicle-mounted components including imaging units for capturing passengers’ entry and exit, a location tracking unit for determining vehicle position and speed, and an edge device for local data processing. The system further includes an information processing apparatus that receives and analyzes data from multiple vehicles, generating alerts for various conditions such as overcrowding, vehicle breakdowns, off-route movements, and extended idling. The system employs speed-based filtering techniques to optimize image processing and focus computational resources on relevant data during passenger boarding and alighting. Additionally, the system identifies occupancy patterns, enabling dynamic adjustment of vehicle scheduling and routing. The system includes a local database at a vehicle depot to facilitate efficient storage of video files with longer duration. The system furthermore includes a manager or supervisor interface that provides real-time notifications. The system also incorporates tampering detection techniques to ensure data integrity. By leveraging advanced computer vision and machine learning techniques, this system offers a comprehensive solution for modern transit management, improving operational efficiency, passenger experience, and resource allocation in public transportation networks.
FIG. 1 illustrates a block diagram of a system 100 for monitoring vehicle occupancy in real-time. The system 100 includes a vehicle 102, an information processing apparatus 120, a vehicle depot 126, a communication network 118, and a local network 130.
The vehicle 102 may include an edge device 104, at least one imaging unit 110 (hereinafter referred to as “the imaging unit 110”) disposed over at least one door of the vehicle 102, a location tracking unit 112, an interface 114, and a communication unit 116.
The edge device 104 may be coupled to the imaging unit 110, the location tracking unit 112, the interface 114, and the communication unit 116. The edge device 104 may include a processing unit 106 and a memory 108. The processing unit 106 may be configured to receive the real time images or videos from the imaging unit 110, process the real time images or videos based on the speed data from the location tracking unit 112, and determine one of an entry event, an exit event, or any combination thereof, of one or more passengers through the at least one door of the vehicle 102.
Examples of the processing unit 106 may include, but are not limited to, an Application-Specific Integrated Circuit (ASIC) processor, a Reduced Instruction Set Computing (RISC) processor, a Complex Instruction Set Computing (CISC) processor, a Field-Programmable Gate Array (FPGA), a Programmable Logic Control unit (PLC), and the like. Aspects of the present disclosure are intended to include or otherwise cover any type of the processing unit 106 including known, related art, and/or later developed technologies.
The memory 108 may be configured to store the real time images or videos from the imaging unit 110, and the real-time location data and the speed data from the location tracking unit 112. The memory 108 may further be configured to store logic, instructions, circuitry, interfaces, and/or codes of the processing unit 106. Examples of the memory 108 may include, but are not limited to, a Read Only Memory (ROM), a Random Access Memory (RAM), a Flash Memory (FM), a Removable Storage Drive (RSD), a Hard Disk Drive (HDD), a Solid-State Memory (SSM), a Magnetic Storage Drive (MSD), a Programmable Read Only Memory (PROM), an Erasable PROM (EPROM), and/or an Electrically EPROM (EEPROM). Aspects of the present disclosure are intended to include or otherwise cover any type of the memory 108 including known, related art, and/or later developed technologies.
The imaging unit 110 may be configured to capture real time images or videos of a field of view proximate to the at least one door of the vehicle to detect the entry events and exit events of one or more passengers. Examples of the imaging unit 110 may include, but are not limited to, a digital camera, an analog camera, an infrared camera, a stereoscopic camera, or the like. Aspects of the present disclosure are intended to include and/or otherwise cover any type of the imaging unit 110, including known, related art, and later developed technologies, without deviating from the scope of the present disclosure.
The location tracking unit 112 may be configured to obtain real-time location data of the vehicle 102 and speed data of the vehicle 102 corresponding to the real-time location data. In some aspects, the location tracking unit 112 may be a Global Navigation Satellite System (GNSS) Module.
The interface 114 may be configured to enable the vehicle managing authority to receive information or notifications.
The communication unit 116 may be configured to enable the edge device 104 to communicate with the information processing apparatus 120, local database 128 and other components of the system 100 over the communication network 118 and the local network 130. The communication unit 116 may support various wireless communication protocols, including but not limited to cellular networks (e.g., 4G, 5G), Wi-Fi, Bluetooth, or satellite communication.
The information processing apparatus 120 may be a network of computers, a framework, or a combination thereof, that may provide a generalized approach to create a server implementation. In some embodiments of the present disclosure, the information processing apparatus 120 may be a server. Examples of the information processing apparatus 120 may include, but are not limited to, personal computers, laptops, mini-computers, mainframe computers, any non-transient and tangible machine that can execute a machine-readable code, cloud-based servers, distributed server networks, or a network of computer systems. The information processing apparatus 120 may be realized through various web-based technologies such as, but not limited to, a Java web-framework, a .NET framework, a personal home page (PHP) framework, or any other web-application framework. The information processing apparatus 120 may include one or more processing circuitries of which processing circuitry 122 is shown and a database 124.
The processing circuitry 122 may be configured to receive the real time occupancy count from the processing unit 106 and the real-time location data from the location tracking unit 112. The processing circuitry 122 may further be configured to generate one of a first alert, a second alert, or a combination thereof, wherein the first alert is generated based on one of breakdown of the vehicle 102, off-route movements of the vehicle 102, intentional trip curtailment of the vehicle 102, extended idling of the vehicle 102, or a combination thereof, and the second alert is generated when the real time occupancy count exceeds a predefined occupancy threshold. The processing circuitry 122 may be configured to identify occupancy patterns of the vehicle 102.
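The alert rules described above can be sketched as a simple rule check. This is a minimal illustration only: the status keys, the 600-second idling cutoff, and the function names are assumptions for the sketch, not values fixed by the disclosure.

```python
def generate_alerts(vehicle_status, occupancy_count, occupancy_threshold):
    """Sketch of the first/second alert rules: a first alert for
    vehicle-condition events, a second alert for overcrowding."""
    alerts = []
    # First alert: breakdown, off-route movement, or extended idling.
    if (vehicle_status.get("breakdown")
            or vehicle_status.get("off_route")
            or vehicle_status.get("idle_seconds", 0) > 600):  # assumed cutoff
        alerts.append("first_alert")
    # Second alert: occupancy exceeds the predefined threshold.
    if occupancy_count > occupancy_threshold:
        alerts.append("second_alert")
    return alerts
```

Both alerts can be raised for the same vehicle at the same time, which matches the "or a combination thereof" wording of the disclosure.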
In some aspects of the present disclosure, the processing circuitry 122 may further be configured to generate an adjusted vehicle scheduling or routing based on the occupancy patterns of the vehicle 102.
Examples of the processing circuitry 122 may include, but are not limited to, an ASIC processor, a RISC processor, a CISC processor, a FPGA, and the like. Embodiments of the present disclosure are intended to include and/or otherwise cover any type of the processing circuitry 122 including known, related art, and/or later developed technologies.
The database 124 may be configured to store the raw data, processed data, location data, and occupancy data of the processing circuitry 122 for executing various operations. The database 124 may further be configured to store therein the data corresponding to the real time occupancy count from the processing unit 106 and the real-time location data from the location tracking unit 112. It will be apparent to a person having ordinary skill in the art that the database 124 may be configured to store various types of data associated with the system 100, without deviating from the scope of the present disclosure. Examples of the database 124 may include, but are not limited to, a Relational database, a NoSQL database, a Cloud database, an Object-oriented database, and the like. Further, the database 124 may include associated memories that may include, but are not limited to, a ROM, a RAM, a flash memory, a removable storage drive, a HDD, a solid-state memory, a magnetic storage drive, a PROM, an EPROM, and/or an EEPROM. Embodiments of the present disclosure are intended to include or otherwise cover any type of the database 124 including known, related art, and/or later developed technologies.
The communication network 118 may include suitable logic, circuitry, and interfaces that may be configured to provide a plurality of network ports and a plurality of communication channels for transmission and reception of data related to operations of various entities in the system 100. Each network port may correspond to a virtual address (or a physical machine address) for transmission and reception of the communication data. For example, the virtual address may be an Internet Protocol version 4 (IPv4) address (or an IPv6 address) and the physical address may be a Media Access Control (MAC) address. The communication network 118 may be associated with an application layer for implementation of communication protocols based on one or more communication requests from the edge device 104, the location tracking unit 112, the interface 114, the communication unit 116, and the information processing apparatus 120.
The communication data may be transmitted or received, via the communication protocols. Examples of the communication protocols may include, but are not limited to, Hypertext Transfer Protocol (HTTP), File Transfer Protocol (FTP), Simple Mail Transfer Protocol (SMTP), Domain Network System (DNS) protocol, Common Management Interface Protocol (CMIP), Transmission Control Protocol and Internet Protocol (TCP/IP), User Datagram Protocol (UDP), Long Term Evolution (LTE) communication protocols, or any combination thereof.
The vehicle depot 126 may include a local database 128. The local database 128 may be configured to connect to the vehicle 102 through a local network 130 when the vehicle 102 is within range of the vehicle depot 126. This allows transfer of stored data, particularly large video files, between the edge device 104 and the local database 128 without the need for internet facilities or manual intervention.
In some embodiments, the local network 130 may include suitable logic, circuitry, and interfaces that may be configured to provide a plurality of network ports and a plurality of communication channels for transmission and reception of real time images or videos stored in the memory 108 of the edge device 104. The real time images or videos stored in the memory 108 of the edge device 104 may be transmitted or received via at least one communication channel of a plurality of communication channels in the local network 130. The communication channels may include, but are not limited to, a wireless channel, a wired channel, or a combination thereof. The wireless or wired channel may be associated with a data standard which may be defined by one of a Local Area Network (LAN), a Personal Area Network (PAN), a Wireless Local Area Network (WLAN), a Wireless Sensor Network (WSN), a Wide Area Network (WAN), a Wireless Wide Area Network (WWAN), a Metropolitan Area Network (MAN), a Satellite Network, the Internet, a Fiber Optic Network, a Coaxial Cable Network, an Infrared (IR) network, a Radio Frequency (RF) network, and a combination thereof. Embodiments of the present disclosure are intended to include or otherwise cover any type of communication channel, including known, related art, and/or later developed technologies.
In some aspects of the present disclosure, the system 100 may include a power distribution unit (not shown) to supply power to components of the system 100 and peripherals thereof while minimizing space usage and heat dissipation.
In operation, the system 100 captures image data of passengers entering and exiting the vehicle 102 using the imaging unit 110, which captures real time images or videos of the field of view proximate to the at least one door of the vehicle 102. The location tracking unit 112 provides the real-time location data and time corresponding to the real-time location data of the vehicle 102 to the edge device 104. The edge device 104 determines speed data based on the real-time location data and the time, and operates the at least one imaging unit 110 to capture the real time images or videos based on the speed data. In some aspects, the at least one imaging unit 110 thus captures the real time images or the videos when the vehicle 102 slows down for boarding and/or alighting of the one or more passengers. The edge device 104 further determines one of an entry event, an exit event, or a combination thereof, of the one or more passengers.
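One plausible way for the edge device to derive speed data from successive GNSS fixes is the haversine distance over the elapsed time. This is a sketch under assumptions: the fix format (latitude, longitude, Unix time) and the function names are illustrative, not taken from the disclosure.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GNSS fixes."""
    r = 6371000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def speed_kmh(prev_fix, curr_fix):
    """Speed in km/h between two (lat, lon, unix_time) fixes."""
    lat1, lon1, t1 = prev_fix
    lat2, lon2, t2 = curr_fix
    dt = t2 - t1
    if dt <= 0:
        return 0.0  # guard against duplicate or out-of-order fixes
    return haversine_m(lat1, lon1, lat2, lon2) / dt * 3.6
```

In practice a GNSS module often reports speed directly; deriving it from position fixes, as here, is a fallback when only location and time are available.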
The information processing apparatus 120, via the communication network 118, receives the real time occupancy count from the edge device 104 and the real-time location data from the location tracking unit 112. The system 100 generates alerts based on predefined occupancy thresholds or vehicle status conditions, which may include notifications for overcrowding, vehicle breakdowns, off-route movements, extended idling, or unsafe passenger position within the vehicle. These alerts can be displayed to the supervisor/manager via the interface 114 or sent to the information processing apparatus 120 for further action.
When the vehicle 102 arrives at the depot 126 after its service, the edge device 104 may compress and encrypt the video files for safe transfer. The edge device 104 automatically connects to the local network 130 at the depot and establishes a connection to the local database 128, allowing for efficient transfer of large video files stored in the memory 108 of the edge device 104.
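The compress-and-transfer step can be sketched with standard compression plus an integrity checksum so the depot side can detect corrupted transfers. This is an illustrative sketch only: the disclosure also mentions encrypting the video files for safe transfer, and that step (e.g. a symmetric cipher applied to the compressed blob) is deliberately omitted here; function names are assumptions.

```python
import gzip
import hashlib

def prepare_for_depot_transfer(video_bytes):
    """Compress a recorded video blob and attach a SHA-256 checksum so
    the depot side can verify integrity after the local-network
    transfer. (An encryption step would be layered on top.)"""
    compressed = gzip.compress(video_bytes, compresslevel=6)
    checksum = hashlib.sha256(compressed).hexdigest()
    return compressed, checksum

def verify_at_depot(compressed, checksum):
    """Depot-side check: reject corrupted transfers, then decompress."""
    if hashlib.sha256(compressed).hexdigest() != checksum:
        raise ValueError("transfer corrupted")
    return gzip.decompress(compressed)
```

Note that already-encoded video (H.264 and the like) compresses poorly with general-purpose compressors; the checksum and automated handover over the depot's local network are the more important parts of this step.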
This comprehensive system enables efficient monitoring of vehicle occupancy, facilitates improved transit management, enhances passenger experience, and provides a solution for secure and automated transfer of large video files in dynamic transportation environments.
FIG. 2 illustrates a block diagram of the processing unit 106 for the system 100 of FIG. 1. The processing unit 106 includes a data collection engine 200, a data processing engine 202, an occupancy count determination engine 204, and a tampering/FOV change determination engine 206, which are interconnected via a data communication bus 208.
The data collection engine 200 may be configured to receive input data from the imaging unit 110 and the location tracking unit 112. In some aspects of the present disclosure, the input data from the imaging unit 110 may be the real time images or videos of the field of view proximate to the at least one door of the vehicle 102 while the input data from the location tracking unit 112 may be the real-time location data of the vehicle 102 and the time corresponding to the real time location data. The data collection engine 200 serves as the primary interface for incoming data streams. The data collection engine 200 may implement various data compression techniques to efficiently handle the large volume of image data from multiple cameras. The data collection engine 200 may also synchronize the incoming video streams with the corresponding location and time data, ensuring temporal alignment of all collected information. In some aspects of the present disclosure, the data collection engine 200 may optimize data collection in different operational scenarios.
The data processing engine 202 may be configured to process the collected data. In some aspects, the data processing engine 202 may be configured to determine speed data based on the real-time location data and the time to operate the at least one imaging unit 110 to capture the real time images or videos based on the speed data. The data processing engine 202 may be configured to form the core of the image analysis system. The data processing engine 202 may employ advanced computer vision techniques, such as convolutional neural networks or region-based convolutional neural networks (R-CNN), vision transformers, or hybrid techniques to detect and track passengers in the video streams. The data processing engine 202 may also utilize the speed and location data to contextualize the image analysis, adjusting detection parameters based on the vehicle's operational state (e.g., stopped at a station, moving between stops). In some aspects of the present disclosure, the data processing engine 202 may implement fine-tuning, supervised learning, transfer learning, or reinforcement learning techniques to adapt to different vehicle layouts or lighting conditions, enhancing the system's versatility across various vehicle types.
In some aspects of the present disclosure, to operate the at least one imaging unit 110, the data processing engine 202 may be configured to implement a speed filter approach, which optimizes the processing of image data based on the vehicle's movement. The speed filter operates as follows:
The data processing engine 202 enables the at least one imaging unit 110 to capture the image data, and then receives and processes the image data primarily when the vehicle is moving slowly or is stationary, typically during boarding and alighting of passengers. This conditional processing significantly reduces unnecessary computations when passenger entry or exit is unlikely. The real-time location data from the location tracking unit 112 is utilized to determine the vehicle's speed, acting as a conditional trigger for image processing.
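By way of a non-limiting illustration, the speed filter described above may be sketched as follows. This is a minimal Python sketch, not the claimed implementation; the 2 m/s gate value and all helper names are assumptions introduced for illustration only.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius, metres

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def speed_mps(fix_a, fix_b):
    """Speed between two (lat, lon, unix_time) fixes, in m/s."""
    (lat1, lon1, t1), (lat2, lon2, t2) = fix_a, fix_b
    dt = t2 - t1
    return haversine_m(lat1, lon1, lat2, lon2) / dt if dt > 0 else 0.0

SPEED_GATE_MPS = 2.0  # hypothetical gate: roughly walking pace

def should_process_frames(fix_a, fix_b, gate=SPEED_GATE_MPS):
    """Speed filter: trigger image processing only when the vehicle is
    slow or stationary, i.e. when boarding/alighting is plausible."""
    return speed_mps(fix_a, fix_b) <= gate
```

In this sketch, consecutive fixes from the location tracking unit 112 act as the conditional trigger: a stationary vehicle passes the gate, while a vehicle moving between stops does not.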
In some aspects, when the location tracking unit 112 indicates appropriate conditions for passenger movement, the data processing engine 202 focuses on specific regions of interest (ROI) in the image frame, typically areas near the vehicle doors. Within these ROIs, the engine employs computer vision techniques to detect and track passenger movements. The direction of motion is analyzed to distinguish between entering and exiting passengers.
Based on the detected entry and exit events, the data processing engine 202 may enable the occupancy count determination engine 204 to update the real-time occupancy count of the vehicle. This approach allows for efficient use of computational resources by processing image data only when relevant to passenger counting, while still maintaining accurate occupancy tracking.
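The direction-of-motion analysis and the resulting count update may be illustrated with the following minimal Python sketch. The virtual door line, the inside/outside convention, and the function names are assumptions for illustration; the disclosed engine may use any suitable tracking technique.

```python
def classify_crossing(track, door_y):
    """Classify one tracked centroid trajectory -- a list of (x, y)
    points inside the door ROI -- as an 'entry', an 'exit', or None.
    Convention (an assumption of this sketch): y < door_y is outside
    the vehicle, y >= door_y is inside."""
    if len(track) < 2:
        return None
    start_inside = track[0][1] >= door_y
    end_inside = track[-1][1] >= door_y
    if not start_inside and end_inside:
        return "entry"
    if start_inside and not end_inside:
        return "exit"
    return None  # lingering near the door without crossing

def update_count(count, events):
    """Fold a batch of classified events into the running occupancy count."""
    for event in events:
        if event == "entry":
            count += 1
        elif event == "exit":
            count -= 1
    return max(count, 0)  # guard: never report a negative occupancy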
In some aspects of the present disclosure, the data processing engine 202 may implement fine-tuning, supervised learning, transfer learning, or reinforcement learning techniques to adapt to different vehicle layouts or lighting conditions, enhancing the system's versatility across various vehicle types. The engine may also employ advanced computer vision techniques, such as convolutional neural networks or region-based convolutional neural networks (R-CNN), vision transformers, or hybrid techniques to improve the accuracy of passenger detection and tracking.
This use of speed data ensures that the system 100 can adapt to various operational scenarios, from busy urban routes with frequent stops to express services with longer intervals between passenger boarding events. The speed filter approach not only optimizes processing efficiency but also contributes to the overall accuracy of the occupancy monitoring system by focusing computational resources on the most relevant data.
The occupancy count determination engine 204 may be configured to determine passenger entry and exit events and calculate real-time occupancy counts. The occupancy count determination engine 204 may aggregate the processed data to maintain an accurate count of passengers. The occupancy count determination engine 204 may implement sophisticated tracking techniques to handle scenarios such as partial occlusions, temporary loss of detection, or complex passenger movements (e.g., a passenger entering and immediately exiting). The occupancy count determination engine 204 may also incorporate historical data and statistical models to predict and correct for potential counting errors. In some aspects of the present disclosure, the occupancy count determination engine 204 may generate confidence scores for its counts, allowing the system to flag situations where a possible ticketing theft may occur based on the electronic ticketing data.
The tampering/FOV change determination engine 206 may monitor for any unauthorized changes to the edge device 104 or alterations in the field of view of the at least one imaging unit 110. In some aspects, for determining the change in FOV of the at least one imaging unit 110, the tampering/FOV change determination engine 206 may be configured to capture, by way of the at least one imaging unit 110, a reference image when the vehicle 102 is parked at the vehicle depot 126 at the end of a day. Further, the tampering/FOV change determination engine 206 may be configured to capture another image by way of the at least one imaging unit 110 when the vehicle 102 is parked after a whole day of operation (i.e., the day following the day on which the reference image was captured), and compare the other image with the reference image to identify deviation in the pixels of the other image with respect to the reference image. Further, the tampering/FOV change determination engine 206 may generate a third alert representing a change in FOV of the at least one imaging unit 110 when the deviation in the pixels is more than a predefined threshold.
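The pixel-deviation comparison described above may be sketched as follows. This is a minimal pure-Python illustration; the per-pixel tolerance of 30 grayscale levels and the 25% deviation threshold are assumptions, not values from the disclosure.

```python
def fov_changed(reference, current, pixel_tol=30, threshold=0.25):
    """Compare the current depot image against the reference image.

    reference/current: equal-length flat sequences of grayscale pixel
    values (0-255). A pixel 'deviates' when its absolute difference
    exceeds pixel_tol; the function returns True (i.e. the third alert
    should fire) when the deviating fraction exceeds the predefined
    threshold."""
    if len(reference) != len(current):
        raise ValueError("images must have the same resolution")
    deviating = sum(1 for r, c in zip(reference, current)
                    if abs(r - c) > pixel_tol)
    return deviating / len(reference) > threshold
```

A production implementation would typically operate on full image arrays (e.g. via an image-processing library) rather than flat lists, but the thresholding logic is the same.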
In some aspects, the tampering/FOV change determination engine 206 may be configured to determine tampering in components of the system 100, such as unplugging of the input power supply for one or more components of the system 100, loosening of the one or more components and/or cables of the one or more components, theft of some items from a casing of the vehicle 102, etc. For determining the tampering, the tampering/FOV change determination engine 206 may be configured to detect a cut-off or deviation of the input power supply and transmit an operating signal to an inverter unit (not shown) and a battery unit (not shown) for providing power to the components of the system 100 for a predefined period of time. Further, the tampering/FOV change determination engine 206 may be configured to generate the third alert, which may represent tampering in the one or more components of the system 100, to the vehicle operating authorities.
In some aspects of the present disclosure, the tampering/FOV change determination engine 206 may employ anomaly detection algorithms to identify unusual patterns in the data streams that might indicate tampering. The tampering/FOV change determination engine 206 may also perform regular system integrity checks to ensure all components are functioning as expected. In some aspects of the present disclosure, the tampering/FOV change determination engine 206 may implement blockchain or other distributed ledger technologies to create an immutable record of system configurations and changes, enhancing security and auditability.
FIG. 3 illustrates a block diagram of the information processing apparatus 120 for the system 100 of FIG. 1. The information processing apparatus 120 includes processing circuitry 122 that is coupled to a database 124 via a data communication bus 304. The processing circuitry 122 includes a network interface 300 and an input/output interface 302 that are connected through the data communication bus 304.
The processing circuitry 122 comprises multiple processing engines connected via data communication bus 316. Specifically, the processing circuitry 122 may include a data collection engine 306 for gathering data from multiple vehicles, an occupancy count comparison engine 308 for analyzing occupancy levels, and a vehicle scheduling/routing engine 310 for managing vehicle operations.
The data collection engine 306 may be configured to receive and organize data from multiple vehicles equipped with the system 100. The data collection engine 306 may implement load balancing techniques to efficiently handle data streams from numerous vehicles simultaneously. The data collection engine 306 may also perform initial data validation and error checking to ensure the integrity of incoming information. In some aspects of the present disclosure, the data collection engine 306 may implement edge computing principles, performing preliminary data aggregation at local hubs before transmitting to the central server, thus optimizing network usage.
The occupancy count comparison engine 308 may analyze occupancy data against predefined thresholds and historical patterns. The occupancy count comparison engine 308 may compare the real time occupancy count with a predefined occupancy threshold and, when the real time occupancy count exceeds the predefined occupancy threshold, generate a signal indicating the same to an alert generation engine 312. The occupancy count comparison engine 308 may employ machine learning techniques, such as time series analysis or anomaly detection models, to identify unusual occupancy patterns. The occupancy count comparison engine 308 may also perform cross-vehicle comparisons to detect system-wide trends or anomalies. In some aspects of the present disclosure, the occupancy count comparison engine 308 may implement predictive analytics to forecast occupancy levels for different routes and times, aiding in proactive resource allocation.
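The threshold comparison that produces the second alert may be illustrated with the following minimal sketch. The record shape and field names are assumptions for illustration; the disclosure does not prescribe an alert format.

```python
def second_alert(occupancy_count, occupancy_threshold):
    """Return an alert record when the real-time occupancy count exceeds
    the predefined occupancy threshold (the 'second alert' condition),
    else None."""
    if occupancy_count > occupancy_threshold:
        return {
            "type": "second_alert",
            "count": occupancy_count,
            "threshold": occupancy_threshold,
        }
    return None
```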
The vehicle scheduling/routing engine 310 may be configured to fetch a pre-defined route and schedule for the vehicle 102 from the database 124. In some aspects, the vehicle scheduling/routing engine 310 may be coupled to the occupancy count comparison engine 308 and generate adjusted vehicle schedules or routes based on real-time occupancy data. In some other aspects of the present disclosure, the vehicle scheduling/routing engine 310 may generate adjusted vehicle schedules or routes based on overcrowding within the vehicle 102 while following a particular route. In some other aspects, for adjustments in vehicle scheduling or routing, the vehicle scheduling/routing engine 310 may be configured to generate an alert to the vehicle managing authority based on the real-time occupancy data and/or the overcrowding in the vehicle 102.
The vehicle scheduling/routing engine 310 may utilize optimization techniques to balance passenger demand with available resources. The vehicle scheduling/routing engine 310 may consider factors such as historical occupancy data, current traffic conditions, and special events to suggest route modifications or additional vehicle deployments. In some aspects of the present disclosure, the vehicle scheduling/routing engine 310 may implement reinforcement learning techniques to continuously improve its decision-making based on the outcomes of previous scheduling decisions.
The processing circuitry 122 further includes the alert generation engine 312 for generating various system alerts, a tampering detection engine 314 for monitoring system integrity, and a notification engine 314 for communicating with a vehicle managing authority.
The alert generation engine 312 may produce alerts for various conditions such as overcrowding, vehicle breakdowns, or off-route movements. The alert generation engine 312 may implement a multi-tiered alert system, categorizing alerts based on urgency and required response. The alert generation engine 312 may also incorporate machine learning techniques to reduce false positives and prioritize alerts based on their potential impact on service quality. In some aspects of the present disclosure, the alert generation engine 312 may implement natural language processing techniques to generate human-readable alert descriptions for different stakeholders (e.g., vehicle managing authority).
The notification engine 314 may manage the delivery of alerts and information to the interface 114. The notification engine 314 may implement context-aware notification systems, prioritizing and formatting information based on the vehicle's current status and the vehicle managing authority's cognitive load. The notification engine 314 may also incorporate text-to-speech technology for hands-free information delivery. In some aspects of the present disclosure, the notification engine 314 may implement adaptive user interface techniques, customizing the presentation of information based on individual preferences and interaction patterns.
FIG. 4 illustrates a flowchart of a method 400 for monitoring vehicle occupancy in real-time. The method 400 begins with a step 402 of capturing real-time images or videos of the field of view proximate to the at least one door of the vehicle 102 using the imaging unit 110, such that the at least one door of the vehicle 102 allows the one or more passengers to enter into or exit from the vehicle 102.
In a step 404, the method 400 obtains the real-time location data and the time corresponding to the real-time location data using the location tracking unit 112. This step may involve obtaining data from one or more sensors, such as GPS, GNSS etc., to provide accurate and robust location and corresponding time information.
The method 400 proceeds to a step 406 where the images or videos, and the location data with the time are received at the edge device 104. This step may involve data buffering and initial format conversion to prepare the data for processing. In some aspects of the present disclosure, this step may also include preliminary data compression to optimize storage and transmission.
In a step 408, the method 400 may determine speed data based on the real-time location data of the vehicle 102 and the time to process the images or videos based on the speed data. This step may employ various computer vision techniques, such as background subtraction, object detection, and tracking. The processing may be adaptive, adjusting parameters based on the current speed of the vehicle. For example, different processing techniques might be applied when the vehicle is stationary at a stop versus when it's moving between stops.
The method 400 then moves to a step 410 where passenger entry and exit events through the vehicle door are determined. This step may involve analyzing the trajectories of detected objects (passengers) relative to the door area. It may also incorporate spatio-temporal analysis to distinguish between passengers entering, exiting, or merely standing near the door.
Following this, in a step 412, the method 400 determines a real-time occupancy count based on the determined entry and/or exit events by way of the processing unit 106. This step may involve maintaining a running tally of passengers, with various error correction mechanisms in place. For example, the step may involve implementing confidence scoring for each count update, allowing the system to flag and potentially correct uncertain counts.
The method 400 then proceeds to a step 414 where the occupancy count and location data are received by the information processing apparatus 120 for processing. This step may involve data validation and initial aggregation, preparing the data for higher-level analysis.
The method 400 branches into two parallel steps. In a step 416, alerts are generated based on vehicle conditions and occupancy levels. This step may involve comparing current occupancy against predefined thresholds, analyzing vehicle location and speed patterns for anomalies, and generating appropriate alerts. The alerts may be prioritized based on urgency and potential impact on service quality.
Simultaneously, in a step 418, the method 400 identifies occupancy patterns based on the received data. This step may involve various data mining and machine learning techniques to extract meaningful patterns from the accumulated data. It may analyze both short-term trends (e.g., occupancy patterns during a single journey) and long-term trends (e.g., weekly or monthly patterns across multiple vehicles and routes).
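A simple short-term pattern of the kind contemplated in step 418 may be sketched as follows: mean occupancy per hour of day, aggregated from timestamped counts. This is a minimal illustration only; the function name and the hour-of-day bucketing are assumptions, and real pattern identification may involve far richer models.

```python
from collections import defaultdict
from datetime import datetime, timezone
from statistics import mean

def hourly_pattern(samples):
    """samples: iterable of (unix_time, occupancy_count) pairs.
    Returns the mean occupancy per hour of day (UTC) -- a minimal
    stand-in for short-term occupancy pattern identification."""
    buckets = defaultdict(list)
    for ts, count in samples:
        hour = datetime.fromtimestamp(ts, tz=timezone.utc).hour
        buckets[hour].append(count)
    return {hour: mean(counts) for hour, counts in sorted(buckets.items())}
```

The same aggregation, keyed instead by day of week or by route, would serve the long-term trend analysis mentioned above.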
Thus, the system 100 and method 400 provides several technical advantages that represent significant advancements in vehicle occupancy monitoring and transit management. The system enables real-time, accurate passenger counting through advanced edge computing and computer vision techniques, overcoming limitations of traditional infrared or weight-based systems in crowded scenarios. By integrating location tracking and speed filtering, the system contextualizes occupancy data, allowing for more precise analysis and decision-making. The innovative local database setup at vehicle depots facilitates efficient transfer of large video files without requiring internet connectivity or manual intervention, addressing a critical data management challenge in dynamic transportation environments. The system's ability to generate real-time alerts for various vehicle conditions and occupancy thresholds enhances operational responsiveness and passenger safety. Furthermore, the implementation of tampering detection mechanisms and the capability to identify long-term occupancy patterns contribute to improved system reliability and strategic transit planning. These technical advancements collectively enable more efficient resource allocation, enhanced passenger experience, and data-driven optimization of public transportation systems.
Aspects of the present disclosure are discussed here with reference to flowchart illustrations and block diagrams that depict methods, systems, and apparatus in accordance with various aspects of the present disclosure. Each block within these flowcharts and diagrams, as well as combinations of these blocks, can be executed by computer-readable program instructions. The various logical blocks, modules, circuits, and technique steps described in connection with the disclosed aspects may be implemented through electronic hardware, software, or a combination of both. To emphasize the interchangeability of hardware and software, the various components, blocks, modules, circuits, and steps are described generally in terms of their functionality. The decision to implement such functionality in hardware or software is dependent on the specific application and design constraints imposed on the overall system. Persons having ordinary skill in the art can implement the described functionality in different ways depending on the particular application, without deviating from the scope of the present disclosure.
The flowcharts and block diagrams presented in the figures depict the architecture, functionality, and operation of potential implementations of systems, methods, and apparatus according to different aspects of the present disclosure. Each block in the flowcharts or diagrams may represent an engine, segment, or portion of instructions comprising one or more executable instructions to perform the specified logical function(s). In some alternative implementations, the order of functions within the blocks may differ from what is depicted. For instance, two blocks shown in sequence may be executed concurrently or in reverse order, depending on the required functionality. Each block, and combinations of blocks, can also be implemented using special-purpose hardware-based systems that perform the specified functions or tasks, or through a combination of specialized hardware and software instructions.
Although the preferred aspects have been detailed here, it should be apparent to those skilled in the relevant field that various modifications, additions, and substitutions can be made without departing from the scope of the disclosure. These variations are thus considered to be within the scope of the disclosure as defined in the following claims.
Features or functionalities described in certain example aspects may be combined and re-combined in or with other example aspects. Additionally, different aspects and elements of the disclosed example aspects may be similarly combined and re-combined. Further, some example aspects, individually or collectively, may form components of a larger system where other processes may take precedence or modify their application. Moreover, certain steps may be required before, after, or concurrently with the example aspects disclosed herein. It should be noted that any and all methods and processes disclosed herein can be performed in whole or in part by one or more entities or actors in any manner.
Although terms like "first," "second," etc., are used to describe various elements, components, regions, layers, and sections, these terms should not necessarily be interpreted as limiting. They are used solely to distinguish one element, component, region, layer, or section from another. For example, a "first" element discussed here could be referred to as a "second" element without departing from the teachings of the present disclosure.
The terminology used here is intended to describe specific example aspects and should not be considered as limiting the disclosure. The singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. The terms "comprises," "includes," "comprising," and "including," as used herein, indicate the presence of stated features, steps, elements, or components, but do not exclude the presence or addition of other features, steps, elements, or components.
As used herein, the term "or" is intended to be inclusive, meaning that "X employs A or B" would be satisfied by X employing A, B, or both A and B. Unless specified otherwise or clearly understood from the context, this inclusive meaning applies to the term "or."
Unless otherwise defined, all terms used herein (including technical and scientific terms) have the same meaning as commonly understood by one of ordinary skill in the relevant art. Terms should be interpreted consistently with their common usage in the context of the relevant art and should not be construed in an idealized or overly formal sense unless expressly defined here.
The terms "about" and "substantially," as used herein, refer to a variation of plus or minus 10% from the nominal value. This variation is always included in any given measure.
In cases where other disclosures are incorporated by reference and there is a conflict with the present disclosure, the present disclosure takes precedence to the extent of the conflict, or to provide a broader disclosure or definition of terms. If two disclosures conflict, the later-dated disclosure will take precedence.
The use of examples or exemplary language (such as "for example") is intended to illustrate aspects of the invention and should not be seen as limiting the scope unless otherwise claimed. No language in the specification should be interpreted as implying that any non-claimed element is essential to the practice of the invention.
While many alterations and modifications of the present invention will likely become apparent to those skilled in the art after reading this description, the specific aspects shown and described by way of illustration are not intended to be limiting in any way.
A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made without departing from the scope of the disclosure. Accordingly, other implementations are within the scope of the following claims.

Claims:

1. A system (100) for monitoring vehicle occupancy in real-time, the system (100) comprising:
at least one imaging unit (110) disposed over at least one door of a vehicle (102) that is configured to capture real time images or videos of a field of view proximate to the at least one door of the vehicle (102), wherein the at least one door of the vehicle (102) allows the one or more passengers to enter into, or exit from, the vehicle (102);
a location tracking unit (112) that is configured to obtain real-time location data of the vehicle (102) and time corresponding to the real time location data of the vehicle (102);
an edge device (104) coupled to the at least one imaging unit (110) and the location tracking unit (112), the edge device (104) comprising a processing unit (106) that is configured to:
receive the real time images or videos and the real-time location data of the vehicle (102) and the time;
determine speed data based on the real-time location data and the time;
process the real time images or videos based on the real-time location data and the speed data;
determine one of an entry event, an exit event, or a combination thereof, of the one or more passengers through the at least one door of the vehicle (102) based on the processed real time images or videos; and
determine a real time occupancy count based on determined one of the entry event, exit event, or the combination thereof, of the one or more passengers; and
an information processing apparatus (120) coupled to the edge device (104) and the location tracking unit (112), the information processing apparatus (120) comprising processing circuitry (122) that is configured to:
receive the real time occupancy count from the processing unit (106) and the real-time location data from the location tracking unit (112);
generate one of a first alert, a second alert, or a combination thereof, wherein the first alert is generated based on one of breakdown of the vehicle (102), off-route movements of the vehicle (102), extended idling of the vehicle (102), or a combination thereof, and the second alert is generated when the real time occupancy count exceeds a predefined occupancy threshold; and
identify occupancy patterns of the vehicle (102).
2. The system (100) as claimed in claim 1, wherein the processing unit (106) is further configured to:
determine a tampering of the edge device (104) or the location tracking unit (112), or change in the field of view of at least one imaging unit (110); and
generate a third alert based on the tampering of the edge device (104), or the location tracking unit (112), or change in the field of view of the at least one imaging unit (110).
3. The system (100) as claimed in claim 1, wherein the processing circuitry (122) is further configured to:
generate an adjusted vehicle scheduling or routing based on the occupancy patterns of the vehicle (102).
4. The system (100) as claimed in claim 1, wherein the system (100) further comprising:
a local database (128) disposed at a vehicle depot (126) wherein the local database (128) is configured to be coupled to the edge device (104) by way of a local network (130) when the vehicle (102) comes within the vicinity of the vehicle depot (126) such that the local database (128) is configured to receive and store the real time images or videos stored in a memory (108) of the edge device (104).
5. The system (100) as claimed in claim 1, further comprising:
an interface (114) that is configured to receive and notify a vehicle managing authority of the vehicle (102) about one of the first alert, the second alert, the third alert, or a combination thereof.

6. A method (400) for monitoring vehicle occupancy in real-time, the method (400) comprising:
capturing, by at least one imaging unit (110) disposed over at least one door of a vehicle (102), real time images or videos of a field of view proximate to the at least one door of the vehicle (102), wherein the at least one door of the vehicle (102) allows the one or more passengers to enter into or exit from the vehicle (102);
obtaining, by a location tracking unit (112), real-time location data of the vehicle (102) and time corresponding to the real-time location data;
receiving, by a processing unit (106) of an edge device (104) coupled to the at least one imaging unit (110) and the location tracking unit (112), the real time images or videos from the at least one imaging unit (110), and the real-time location data and the time from the location tracking unit (112);
determining, by the processing unit (106), speed data based on the real-time location data of the vehicle (102) and the time;
processing, by the processing unit (106), the real time images or videos based on the speed data;
determining, by the processing unit (106), one of an entry event, an exit event, or a combination thereof, of one or more passengers through the at least one door of the vehicle (102) based on the processed real time images or videos;
determining, by the processing unit (106), a real time occupancy count based on determined one of the entry event, exit event, or the combination thereof, of the one or more passengers;
receiving, by processing circuitry (122) of an information processing apparatus (120), the real time occupancy count from the processing unit (106) and the real-time location data from the location tracking unit (112);
generating, by the processing circuitry (122), one of a first alert, a second alert, or a combination thereof, wherein the first alert is generated based on one of breakdown of the vehicle (102), off-route movements of the vehicle (102), extended idling of the vehicle (102), or a combination thereof, and the second alert is generated when the real time occupancy count exceeds a predefined occupancy threshold; and
identifying, by the processing circuitry (122), occupancy patterns of the vehicle (102).
7. The method (400) as claimed in claim 6, further comprising:
determining, by the processing unit (106), a tampering of the edge device (104) or the location tracking unit (112), or change in the field of view of the at least one imaging unit (110); and
generating, by the processing unit (106), a third alert based on the tampering of the edge device (104) or change in the field of view of at least one imaging unit (110).
8. The method (400) as claimed in claim 6, further comprising:
generating, by the processing circuitry (122), an adjusted vehicle scheduling or routing based on the occupancy patterns of the vehicle (102).
9. The method (400) as claimed in claim 6, further comprising:
receiving and storing, by a local database (128) disposed at a vehicle depot (126), the real time images or videos stored in a memory (108) of the edge device (104) when the vehicle (102) comes within the vicinity of the vehicle depot (126), wherein the local database (128) is coupled to the edge device (104) by way of a local network (130).
10. The method (400) as claimed in claim 6, further comprising:
receiving and notifying, by an interface (114), a vehicle managing authority of the vehicle (102) about one of the first alert, the second alert, the third alert, or a combination thereof.

Documents

Application Documents

# Name Date
1 202521075676-STATEMENT OF UNDERTAKING (FORM 3) [08-08-2025(online)].pdf 2025-08-08
2 202521075676-FORM FOR SMALL ENTITY(FORM-28) [08-08-2025(online)].pdf 2025-08-08
3 202521075676-FORM FOR SMALL ENTITY [08-08-2025(online)].pdf 2025-08-08
4 202521075676-FORM 1 [08-08-2025(online)].pdf 2025-08-08
5 202521075676-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [08-08-2025(online)].pdf 2025-08-08
6 202521075676-EVIDENCE FOR REGISTRATION UNDER SSI [08-08-2025(online)].pdf 2025-08-08
7 202521075676-DRAWINGS [08-08-2025(online)].pdf 2025-08-08
8 202521075676-DECLARATION OF INVENTORSHIP (FORM 5) [08-08-2025(online)].pdf 2025-08-08
9 202521075676-COMPLETE SPECIFICATION [08-08-2025(online)].pdf 2025-08-08
10 Abstract.jpg 2025-08-22
11 202521075676-FORM-26 [01-09-2025(online)].pdf 2025-09-01
12 202521075676-FORM-9 [04-11-2025(online)].pdf 2025-11-04
13 202521075676-Proof of Right [24-11-2025(online)].pdf 2025-11-24