Abstract: A method and system for real-time monitoring of vehicles in a vehicle yard is disclosed. In some embodiments, the method includes identifying at least one of: at least one vehicle attribute associated with a vehicle and at least one parking slot allocated to the vehicle (202); and at least one container attribute associated with a container and at least one container slot allocated to the container (302). The method further includes rendering to a user, at least one of: first directions (204) and second directions (304). The method further includes determining compliance of at least one of the directions in response to an action performed by the user. The method further includes generating a notification in response to determining compliance of the at least one of the directions. To be published with FIG. 1.
Description: Technical Field
[001] Generally, the invention relates to Augmented Reality (AR) and Virtual Reality (VR). More specifically, the invention relates to a method and a system for real-time monitoring of vehicles in a vehicle yard using smart glasses.
Background
[002] In today's competitive business environment, warehouse efficiency has become crucial to the overall efficiency of the supply chains that warehouses are a part of. As a result, new technologies are being explored and implemented in the industry to increase warehouse performance. Due to recent advancements in the field of Augmented Reality (AR) and Virtual Reality (VR) technologies, these technologies have gained popularity as a platform for designing systems and devices that may assist users in a variety of sectors, including warehouse management. There is no doubt that the intelligent use of AR and VR devices may improve warehouse operations, thereby enhancing business status. To this end, many AR and VR enabled devices are used to perform warehouse planning. However, no effective advancement with respect to these technologies has taken place for monitoring movement of vehicles or containers available in a yard that may improve overall yard management.
[003] Therefore, there is a need for an efficient and reliable method and system for performing real-time monitoring of vehicles in a vehicle yard.
SUMMARY OF INVENTION
[004] In one embodiment, a method for real-time monitoring of vehicles in a vehicle yard is disclosed. The method may include identifying at least one of: at least one vehicle attribute associated with a vehicle and at least one parking slot allocated to the vehicle, and at least one container attribute associated with a container and at least one container slot allocated to the container. The method may include rendering at least one of first directions and second directions to a user. The first directions may be rendered to perform navigation of the vehicle to a parking slot from the at least one parking slot and parking of the vehicle in the parking slot. The second directions may be rendered to perform one of loading the container onto the vehicle or a cargo carrier and unloading the container to a container slot from the at least one container slot. The method may include determining compliance of at least one of the directions in response to an action performed by the user. The method may include generating a notification in response to determining compliance of the at least one of the directions.
[005] In another embodiment, a system for real-time monitoring of vehicles in a vehicle yard is disclosed. The system includes smart glasses. The smart glasses include a processor and a memory communicatively coupled to the processor. The memory may store processor-executable instructions, which, on execution, may cause the processor to identify at least one of: at least one vehicle attribute associated with a vehicle and at least one parking slot allocated to the vehicle, and at least one container attribute associated with a container and at least one container slot allocated to the container. The processor-executable instructions, on execution, may further cause the processor to render at least one of first directions and second directions to a user. The first directions may be rendered to perform navigation of the vehicle to a parking slot from the at least one parking slot and parking of the vehicle in the parking slot. The second directions may be rendered to perform one of loading the container onto the vehicle or a cargo carrier and unloading the container to a container slot from the at least one container slot. The processor-executable instructions, on execution, may further cause the processor to determine compliance of at least one of the directions in response to an action performed by the user. The processor-executable instructions, on execution, may further cause the processor to generate a notification in response to determining compliance of the at least one of the directions.
[006] It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
BRIEF DESCRIPTION OF THE DRAWINGS
[007] The present application can be best understood by reference to the following description taken in conjunction with the accompanying drawing figures, in which like parts may be referred to by like numerals.
[008] FIG. 1 illustrates a system for performing real-time monitoring of vehicles in a vehicle yard, in accordance with an embodiment.
[009] FIG. 2 illustrates a flowchart of a method for performing real-time monitoring of vehicles in a vehicle yard, in accordance with an embodiment.
[010] FIG. 3 illustrates a flowchart of a method for performing real-time monitoring of containers in a vehicle yard, in accordance with an embodiment.
[011] FIG. 4 illustrates a flowchart of a method for identifying at least one vehicle attribute associated with a vehicle, in accordance with an embodiment.
[012] FIG. 5 illustrates a flowchart of a method for identifying at least one container attribute associated with a container, in accordance with an embodiment.
[013] FIG. 6 illustrates a pictorial representation of identification of at least one vehicle attribute associated with a vehicle, in accordance with an exemplary embodiment.
[014] FIG. 7 illustrates a pictorial representation of identification of at least one container attribute associated with a container, in accordance with an exemplary embodiment.
[015] FIG. 8 illustrates a pictorial representation of rendering first directions to a user, in accordance with an exemplary embodiment.
[016] FIG. 9 illustrates a pictorial representation of rendering second directions to a user for performing unloading of a container in a vehicle yard, in accordance with an exemplary embodiment.
[017] FIG. 10 illustrates a pictorial representation of an elaborated view of second directions rendered on smart glasses of a user for performing unloading of a container, in accordance with an exemplary embodiment.
[018] FIG. 11 illustrates a pictorial representation of an elaborated view of second directions rendered on smart glasses of a user for performing loading of a container, in accordance with an exemplary embodiment.
DETAILED DESCRIPTION OF THE DRAWINGS
[019] The following description is presented to enable a person of ordinary skill in the art to make and use the invention and is provided in the context of particular applications and their requirements. Various modifications to the embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the invention. Moreover, in the following description, numerous details are set forth for the purpose of explanation. However, one of ordinary skill in the art will realize that the invention might be practiced without the use of these specific details. In other instances, well-known structures and devices are shown in block diagram form in order not to obscure the description of the invention with unnecessary detail. Thus, the invention is not intended to be limited to the embodiments shown but is to be accorded the widest scope consistent with the principles and features disclosed herein.
[020] While the invention is described in terms of particular examples and illustrative figures, those of ordinary skill in the art will recognize that the invention is not limited to the examples or figures described. Those skilled in the art will recognize that the operations of the various embodiments may be implemented using hardware, software, firmware, or combinations thereof, as appropriate. For example, some processes can be carried out using processors or other digital circuitry under the control of software, firmware, or hard-wired logic. (The term “logic” herein refers to fixed hardware, programmable logic and/or an appropriate combination thereof, as would be recognized by one skilled in the art to carry out the recited functions.) Software and firmware can be stored on computer-readable storage media. Some other processes can be implemented using analog circuitry, as is well known to one of ordinary skill in the art. Additionally, memory or other storage, as well as communication components, may be employed in embodiments of the invention.
[021] A system 100 for performing real-time monitoring of vehicles in a vehicle yard, is illustrated in FIG. 1. In particular, the system 100 may include smart glasses 102 that may be responsible for performing real-time monitoring of vehicles in a vehicle yard. In an embodiment, the smart glasses 102 may correspond to one of a Virtual Reality (VR) device and an Augmented Reality (AR) device. As will be appreciated, despite having a similar device design, such as the smart glasses 102, the VR device and the AR device accomplish things in two very different ways. In a case when the smart glasses 102 are the AR device, the smart glasses 102 may enable a user to see his surroundings along with required information associated with vehicles and containers present in the vehicle yard, overlaid on the smart glasses 102. In other words, the AR enabled smart glasses may enable the user to maintain his presence in the real world by projecting information on top of what the user is already seeing. In another case when the smart glasses 102 are the VR device, the smart glasses 102 may provide a virtual view of surroundings along with required information associated with vehicles and containers present in the vehicle yard. In other words, the VR enabled smart glasses may isolate the user from reality by taking him to a fictional world.
[022] In order to perform real-time monitoring of vehicles in the vehicle yard, in one embodiment, a user wearing the smart glasses 102 may identify at least one vehicle attribute associated with a vehicle and at least one parking slot allocated to the vehicle. The at least one vehicle attribute may be identified by capturing an image of the vehicle via a camera 104. The camera 104 may be an inbuilt camera or a camera hosted on another electronic device coupled with the smart glasses 102. In this embodiment, a user wearing the smart glasses 102 may capture an image of the vehicle via the camera 104 to identify the at least one vehicle attribute. The at least one vehicle attribute may include, but is not limited to, at least one of vehicle number, vehicle type, vehicle dimensions, or vehicle make. In order to identify the at least one vehicle attribute, upon capturing the image of the vehicle, information associated with the vehicle may be retrieved from a server 116. The retrieved information may correspond to the at least one vehicle attribute.
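The retrieval step above may be sketched in Python as follows. This is a minimal illustrative sketch, not the claimed implementation: the image-recognition step is stubbed out, and `VEHICLE_DB`, `recognize_vehicle_number`, and `retrieve_vehicle_attributes` are hypothetical names standing in for the camera pipeline and the server 116 lookup.

```python
# Hypothetical sketch: once a vehicle is recognized in a captured image,
# its attributes are looked up on the server. The server database is
# modeled as a dictionary keyed by vehicle number; values are illustrative.
VEHICLE_DB = {
    "AU 45CF 8195": {
        "vehicle_type": "heavy vehicle",
        "dimensions": "21L x 7.2W x 7H",
        "make": "ABCD",
    },
}

def recognize_vehicle_number(image):
    """Stand-in for the plate-recognition pipeline on the smart glasses."""
    # A real system would run number-plate recognition on the captured frame.
    return "AU 45CF 8195"

def retrieve_vehicle_attributes(image):
    """Look up the recognized vehicle in the (simulated) server database."""
    number = recognize_vehicle_number(image)
    attributes = VEHICLE_DB.get(number, {})
    return {"vehicle_number": number, **attributes}
```

The same pattern would apply to container attributes, with a container number as the lookup key.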
[023] Once the information is retrieved, the retrieved information may be rendered on the smart glasses 102 of the user. Upon receiving the retrieved information, first directions may be rendered on the smart glasses 102 of the user to perform an action. The first directions rendered on the smart glasses 102 of the user may enable the user to perform navigation of the vehicle to a parking slot from the at least one parking slot. Based on navigation of the vehicle to the parking slot, the user may park the vehicle in the parking slot. Moreover, based on the first directions rendered to the user, compliance of the first directions may be determined in response to the action performed by the user. Further, based on the determined compliance, a notification may be generated via the smart glasses 102 of the user. The generated notification may enable the user to be informed regarding the action performed with respect to the first directions. In an embodiment, the user may be a yard supervisor, a yard operator, or a driver of the vehicle. A method for performing real-time monitoring of the vehicle with respect to the at least one vehicle attribute has been explained in greater detail in conjunction with FIG. 2.
[024] In order to perform real-time monitoring of vehicles in the vehicle yard, in another embodiment, the smart glasses 102 may identify at least one container attribute associated with a container. The at least one container attribute may be identified by capturing an image of the container via the camera 104. In this embodiment, the user wearing the smart glasses 102 may capture an image of the container to identify the at least one container attribute. The at least one container attribute may include, but is not limited to, at least one of container number, container type, container dimensions, weight of the container, or content within the container. In order to identify the at least one container attribute, upon capturing the image of the container, information associated with the container may be retrieved from the server 116. The retrieved information may correspond to the at least one container attribute.
[025] Once the information is retrieved, the retrieved information may be rendered on the smart glasses 102 of the user. Upon receiving the at least one container attribute, second directions may be rendered on the smart glasses 102 of the user. The second directions may be rendered on the smart glasses 102 of the user to enable the user to perform one of loading of the container onto the vehicle or a cargo carrier and unloading the container to a container slot from the at least one container slot. Based on an action performed by the user of loading or unloading of the container, the user may determine compliance of the second directions via the smart glasses 102. In response to the determined compliance, the notification may be generated by the smart glasses 102 of the user and may be rendered to other users. A method for performing real-time monitoring of the vehicle with respect to the at least one container attribute has been explained in greater detail in conjunction with FIG. 3. The complete process followed by the system 100 is further explained in detail in conjunction with FIG. 2 to FIG. 11.
[026] The smart glasses 102 may further include a memory 106, a processor 108, and the display 110. The display 110 may further include a User Interface (UI) 112. As described above, the user may interact with the smart glasses 102 and vice versa through the display 110. By way of an example, the display 110 may be used to display results (i.e., the at least one vehicle attribute, the at least one container attribute, the at least one parking slot allocated to the vehicle, the first directions, the second directions, etc.,) based on actions performed by the smart glasses 102, to the user. Moreover, the display 110 may be used to display the notification to the user in response to determining compliance of the at least one of the directions.
[027] By way of another example, the user interface 112 may be used by the user to provide inputs to the smart glasses 102. Thus, for example, in some embodiments, the user may provide an input via the smart glasses 102 that may include identification of a particular vehicle or a container in the vehicle yard. Further, for example, in some embodiments, the smart glasses 102 may render intermediate results (e.g., the at least one vehicle attribute, the at least one container attribute, the at least one parking slot allocated to the vehicle, the at least one container slot, the first directions, the second directions) or final results (e.g., the notification) to the user via the user interface 112.
[028] The memory 106 may store instructions that, when executed by the processor 108, may cause the processor 108 to perform real-time monitoring of vehicles in the vehicle yard. As will be described in greater detail in conjunction with FIG. 2 to FIG. 11, in order to perform real-time monitoring of vehicles in the vehicle yard, the processor 108 in conjunction with the memory 106 may perform various functions including identification of the at least one vehicle attribute and the allocated parking slot, identification of the at least one container attribute and the allocated container slot, rendering of the first directions and the second directions, determination of the compliance of at least one of the directions, generation of the notification based on the determined compliance, etc.
[029] The memory 106 may also store various data (e.g., the at least one vehicle attribute, the at least one container attribute, the first directions, the second directions, the allocated parking slot of the vehicle, the allocated container slot of the container, etc.,) that may be captured, processed, and/or required by the smart glasses 102. The memory 106 may be a non-volatile memory (e.g., flash memory, Read Only Memory (ROM), Programmable ROM (PROM), Erasable PROM (EPROM), Electrically EPROM (EEPROM) memory, etc.) or a volatile memory (e.g., Dynamic Random-Access Memory (DRAM), Static Random-Access Memory (SRAM), etc.).
[030] Further, the smart glasses 102 may interact with a video capturing equipment 114, the server 116, or user devices 122 over a network 120 for sending and receiving various data. The video capturing equipment 114 may be used to collect information associated with vehicles and containers present in the vehicle yard. In an embodiment, the video capturing equipment 114 may be a smart camera. Further, the user devices 122 may be used by a plurality of users to provide their selection of the action that needs to be performed by the user, based on which the smart glasses 102 may render appropriate directions. In addition, the user devices 122 may be used by the plurality of users to receive the notification generated in response to the determined compliance. Examples of the user devices 122 may include, but are not limited to, a computer, a tablet, a mobile phone, and a laptop. The network 120, for example, may be any wired or wireless communication network and examples may include, but are not limited to, the Internet, Wireless Local Area Network (WLAN), Wi-Fi, Long Term Evolution (LTE), Worldwide Interoperability for Microwave Access (WiMAX), and General Packet Radio Service (GPRS).
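Delivery of the notification to the registered user devices may be sketched as follows. This is a minimal sketch under the assumption that each device is addressed by an identifier; `broadcast_notification` is a hypothetical name, and the actual transport (WLAN, LTE, etc.) is abstracted away.

```python
def broadcast_notification(notification, user_devices):
    """Deliver a compliance notification to every registered user device.

    A real system would push the message over the network 120 (e.g., WLAN
    or LTE); this sketch only records which device received which message.
    """
    return {device: notification for device in user_devices}
```

For example, once compliance is determined, the same success or alert message would be fanned out to the yard supervisor's tablet and the yard operator's mobile phone.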
[031] In some embodiments, the smart glasses 102 may fetch each of the at least one vehicle attribute and each of the at least one container attribute from the server 116. In addition, the server 116 may provide access to information (i.e., the at least one vehicle attribute and the at least one container attribute) associated with each of the vehicle and the container to the user, based on his requirement. The server 116 may further include a database 118. The database 118 may store information associated with the vehicles and the containers present in the vehicle yard. The database 118 may be periodically updated with new information available for new vehicles and containers present in the vehicle yard.
[032] Referring now to FIG. 2, a flowchart of a method 200 for performing real-time monitoring of vehicles in a vehicle yard is illustrated, in accordance with an embodiment. At step 202, at least one vehicle attribute associated with a vehicle may be identified. In an embodiment, the at least one vehicle attribute may include at least one of vehicle number, vehicle type, vehicle dimensions, or vehicle make. In addition to identification of the at least one vehicle attribute, at step 202, at least one parking slot allocated to the vehicle may be identified. As will be appreciated, each of a plurality of vehicle attributes associated with each vehicle may be stored in a server database. In reference to FIG. 1, the server database may correspond to the database 118 of the server 116. Moreover, each of the at least one parking slot may be randomly allocated to each vehicle during each entry of the vehicle in the vehicle yard. A method for identifying the at least one vehicle attribute associated with the vehicle has been explained in greater detail in reference to FIG. 4.
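The random slot allocation described above may be sketched as follows. This is an illustrative sketch, assuming slots are identified by simple labels; `allocate_parking_slot` is a hypothetical name, and the optional `seed` parameter exists only to make the example reproducible.

```python
import random

def allocate_parking_slot(free_slots, seed=None):
    """Randomly pick one free slot for a vehicle entering the yard.

    Mirrors the random allocation at step 202: a slot is drawn from the
    currently free slots each time a vehicle enters the vehicle yard.
    """
    if not free_slots:
        raise ValueError("no free parking slots available in the yard")
    return random.Random(seed).choice(free_slots)
```

The same allocation logic would apply to container slots when a container enters the yard.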
[033] Upon identifying the at least one vehicle attribute and the at least one parking slot, at step 204, first directions may be rendered to a user. Upon receiving the first directions, at step 206, the user may perform navigation of the vehicle to a parking slot from the at least one parking slot. In order to perform the navigation of the vehicle, a path to be followed by the vehicle and a direction to reach the parking slot may be displayed to the user. In addition to the path and direction, location of the parking slot from the at least one parking slot may be highlighted. The location of the parking slot may be highlighted in order to enable the user to easily identify the parking slot. Further, based on the received navigation details, at step 208, the user may park the vehicle in the parking slot.
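Assembling the first directions for display may be sketched as follows. This is a minimal sketch under simplifying assumptions: slot positions are grid coordinates, the "path" is just a start and end point rather than a planned route, and `build_first_directions` is a hypothetical name.

```python
def build_first_directions(current, slot_positions, allocated):
    """Build a simple display payload for the first directions.

    Returns a path toward the allocated slot and marks that slot as the
    highlighted one, so the user can easily identify it among all slots.
    A real system would run path planning over a yard map.
    """
    target = slot_positions[allocated]
    return {
        "path": [current, target],
        "highlighted_slot": allocated,
        "other_slots": [s for s in slot_positions if s != allocated],
    }
```

The returned payload would then be rendered on the smart glasses as an overlay on the user's view of the yard.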
[034] Further, at step 210, compliance of the first directions may be determined. In an embodiment, the compliance of the first directions may be determined in response to an action performed by the user. In other words, the compliance may be determined to confirm that the action performed by the user conforms to requirements of the vehicle yard. Thereafter, at step 212, a notification may be generated in response to determining compliance of the first directions. In an embodiment, the notification may include one of an alert message or a success message. The alert message may be generated upon determining non-compliance with the first directions. In contrast, the success message may be generated in response to determination of the compliance with the first directions.
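The compliance check and the resulting alert or success message may be sketched as follows. This is an illustrative sketch that reduces compliance to a single condition (parking in the allocated slot); `determine_compliance` is a hypothetical name, and the message texts are assumptions.

```python
def determine_compliance(allocated_slot, parked_slot):
    """Compare the user's action against the rendered first directions.

    Returns a success message on compliance and an alert message on
    non-compliance, mirroring steps 210 and 212.
    """
    if parked_slot == allocated_slot:
        return f"success: vehicle parked in allocated slot {allocated_slot}"
    return f"alert: vehicle parked in {parked_slot}, expected {allocated_slot}"
```

An analogous check would cover the second directions, comparing the loading or unloading performed against the directed container slot.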
[035] Referring now to FIG. 3, a flowchart of a method 300 for performing real-time monitoring of containers in a vehicle yard is illustrated, in accordance with an embodiment. At step 302, at least one container attribute associated with a container may be identified. In an embodiment, the at least one container attribute may include at least one of container number, container type, container dimensions, weight of the container, or content within the container. A method for identifying the at least one container attribute has been explained in greater detail in conjunction with FIG. 5. In addition to identification of the at least one container attribute, at step 302, at least one container slot allocated to the container may be identified.
[036] As will be appreciated, each of a plurality of container attributes associated with each container may be stored in a server database. In reference to FIG. 1, the server database may correspond to the database 118 of the server 116. Moreover, each of the at least one container slot may be randomly allocated to each container during each entry of the container in the vehicle yard. Upon identifying the at least one container attribute and the at least one container slot, at step 304, second directions may be rendered to a user. In one embodiment, upon receiving the second directions, at step 306, the user may perform loading of the container onto the vehicle or a cargo carrier. In another embodiment, upon receiving the second directions, at step 308, the user may perform unloading of the container to a container slot from the at least one container slot.
[037] Further, at step 310, compliance of the second directions may be determined. The compliance of the second directions may be determined in response to an action performed by the user. In an embodiment, the action performed by the user may correspond to one of the loading or the unloading of the container performed by the user. In an embodiment, the compliance may be determined to confirm that the action performed by the user conforms to requirements of the vehicle yard.
[038] Thereafter, at step 312, a notification may be generated in response to determining compliance of the second directions. In an embodiment, the notification may include one of an alert message or a success message based on the determined compliance. The alert message may be generated upon determining non-compliance with the second directions. In contrast, the success message may be generated in response to determination of the compliance with the second directions.
[039] Referring now to FIG. 4, a flowchart of a method 400 for identifying at least one vehicle attribute associated with a vehicle is illustrated, in accordance with an embodiment. In reference to FIG. 2, as mentioned in step 202, in order to identify the at least one vehicle attribute associated with the vehicle, at step 402, a field of vision may be directed towards the vehicle, via the smart glasses. In other words, the field of vision of the user wearing the smart glasses may be directed towards the vehicle to identify the at least one vehicle attribute. By way of example, in order to direct the field of vision of the user towards the vehicle, upon wearing the smart glasses, the user may face towards the vehicle to identify the at least one vehicle attribute associated with the vehicle. In reference to FIG. 1, the smart glasses may correspond to the smart glasses 102.
[040] By way of an example, the user may correspond to the yard supervisor or the yard operator. Upon directing the field of vision towards the vehicle, at step 404, information associated with the vehicle may be retrieved from a server. In reference to FIG. 1, the server may correspond to the server 116. The information may correspond to the at least one vehicle attribute. Once the at least one vehicle attribute has been identified and retrieved, then, at step 406, the at least one vehicle attribute may be rendered to the user. In reference to FIG. 1, the at least one vehicle attribute may be rendered on the smart glasses 102. A method of identifying and rendering the at least one vehicle attribute has been explained in greater detail in conjunction with FIG. 6.
[041] Referring now to FIG. 5, a flowchart of a method 500 for identifying at least one container attribute associated with a container is illustrated, in accordance with an embodiment. In reference to FIG. 3, as mentioned in step 302, in order to identify the at least one container attribute associated with the container, at step 502, a field of vision may be directed towards the container, via the smart glasses. In other words, the field of vision of the user wearing the smart glasses may be directed towards the container to identify the at least one container attribute. By way of example, in order to direct the field of vision of the user towards the container, upon wearing the smart glasses, the user may face towards the container to identify the at least one container attribute associated with the container. In reference to FIG. 1, the smart glasses may correspond to the smart glasses 102.
[042] By way of an example, the user may correspond to the yard supervisor or the yard operator. Upon directing the field of vision towards the container, at step 504, information associated with the container may be retrieved from a server. In reference to FIG. 1, the server may correspond to the server 116. The information may correspond to the at least one container attribute. Once the at least one container attribute has been identified and retrieved, then at step 506, the at least one container attribute may be rendered to the user. In reference to FIG. 1, the at least one container attribute may be rendered on the smart glasses 102. A method of identifying and rendering the at least one container attribute has been explained in greater detail in conjunction with FIG. 7.
[043] Referring now to FIG. 6, a pictorial representation 600 of identification of at least one vehicle attribute associated with a vehicle 606 is illustrated, in accordance with an exemplary embodiment. As will be appreciated, FIG. 6 is explained in reference to FIG. 4. Initially, in order to identify the at least one vehicle attribute associated with the vehicle 606, a user 602 wearing smart glasses 604 may face towards the vehicle 606. The user 602 may correspond to the yard operator or the yard supervisor. In reference to FIG. 1, the smart glasses 604 may correspond to the smart glasses 102.
[044] Initially, the user 602 may identify the at least one vehicle attribute associated with the vehicle 606. In order to identify the at least one vehicle attribute of the vehicle, the user 602 may face towards the vehicle 606. Upon facing towards the vehicle 606, the user 602 may retrieve the information associated with the vehicle 606 from the server. In an embodiment, the information may correspond to the at least one vehicle attribute. The server may correspond to the server 116 of FIG. 1. Upon retrieving the at least one vehicle attribute from the server, each of the at least one vehicle attribute retrieved may be rendered on the smart glasses 604 of the user 602.
[045] In the present FIG. 6, each of the at least one vehicle attribute rendered on the smart glasses 604 of the user 602 may be represented as depicted via an elaborated view 608 of the smart glasses 604. In an embodiment, the at least one vehicle attribute may include at least one of vehicle number, vehicle type, vehicle dimensions, or vehicle make. As depicted via the elaborated view 608, the first attribute, i.e., ‘vehicle number’ representing vehicle registration number may be represented as ‘AU 45CF 8195’. The second attribute, i.e., ‘vehicle type’ representing capacity may be represented as ‘heavy vehicle’. The third attribute, i.e., ‘vehicle dimensions’ representing dimensions may be represented as ‘21L x 7.2W x 7H’. Further, the fourth attribute, i.e., ‘vehicle make’ representing manufacturing brand of the vehicle may be represented as ‘ABCD’.
[046] In addition to rendering of each of the at least one vehicle attribute, some additional information may also be rendered on the smart glasses 604. The additional information rendered on the smart glasses 604 as depicted via the elaborated view 608 may include vehicle identification (ID), i.e., ‘12’, vehicle status, i.e., ‘unloaded’, date, i.e., ‘12/09/21’, time, i.e., ‘10:05 AM’, and weather, i.e., ‘33 degree’. In an embodiment, the vehicle ID may be a temporary ID assigned to the vehicle while entering the vehicle yard. In addition, the vehicle status may depict whether the vehicle is loaded with the container or not.
[047] Referring now to FIG. 7, a pictorial representation 700 of identification of at least one container attribute associated with a container ‘C5’ is illustrated, in accordance with an exemplary embodiment. FIG. 7 is explained in reference to FIG. 5. Initially, in order to identify the at least one container attribute associated with the container ‘C5’, a user 702 wearing smart glasses 704 may face towards the container ‘C5’. In an embodiment, the user 702 may be either the yard supervisor or the yard operator. In reference to FIG. 1, the smart glasses 704 may correspond to the smart glasses 102. Upon facing towards the container ‘C5’, the user 702 may retrieve the at least one container attribute associated with the container ‘C5’ from a server. The server may correspond to the server 116 of FIG. 1.
[048] Upon retrieving each of the at least one container attribute, each of the at least one container attribute retrieved may be rendered on the smart glasses 704 of the user 702. In the present FIG. 7, each of the at least one container attribute rendered on the smart glasses 704 of the user 702 may be represented as depicted via an elaborated view 706 of the smart glasses 704. In an embodiment, the at least one container attribute may include at least one of container number, container type, container dimensions, weight of the container, or content within the container. As depicted via the elaborated view 706, the first attribute, i.e., ‘container number’ may be represented as ‘C5’. The second attribute, i.e., ‘container type’ may be represented as ‘loaded’. The third attribute, i.e., ‘container dimensions’ may be represented as ‘20 feet (ft)’. The fourth attribute, i.e., ‘container weight’ may be represented as ‘27600 Kg’. Further, the fifth attribute, i.e., ‘content within the container’ may be represented as ‘marble slabs’.
[049] It should be noted that the content within the container attribute may be displayed since the container ‘C5’ is a loaded container. In case of an unloaded container, the content within the container attribute may not be displayed. In some embodiments, in addition to rendering of the at least one container attribute, some additional information may also be rendered on the smart glasses 704 of the user 702. The additional information rendered on the smart glasses 704, as depicted via the elaborated view 706, may include date, i.e., ‘12/09/21’, time, i.e., ‘10:05 AM’, and weather, i.e., ‘33 degree’. As will be appreciated, for ease of explanation, the process of retrieving each of the at least one container attribute corresponding to the container ‘C5’ is explained in the present FIG. 7. However, using a similar process to that described in the present FIG. 7, one or more of the at least one attribute associated with any number of containers may be retrieved.
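By way of a non-limiting illustration only, the attribute-rendering behavior of paragraphs [048]–[049] may be sketched as follows. The names `ContainerAttributes` and `build_overlay` are hypothetical helpers introduced here purely for explanation and form no part of the disclosure; the attribute values mirror the example of FIG. 7.

```python
# Illustrative sketch only: ContainerAttributes and build_overlay are
# hypothetical names, not part of the disclosed system.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ContainerAttributes:
    number: str
    container_type: str      # e.g., 'loaded' or 'unloaded'
    dimensions: str
    weight_kg: int
    contents: Optional[str]  # meaningful only for a loaded container

def build_overlay(attrs: ContainerAttributes,
                  date: str, time: str, weather: str) -> dict:
    """Assemble the fields rendered on the smart glasses."""
    overlay = {
        "Container number": attrs.number,
        "Container type": attrs.container_type,
        "Container dimensions": attrs.dimensions,
        "Container weight": f"{attrs.weight_kg} Kg",
        # Additional information rendered alongside the attributes ([049]).
        "Date": date,
        "Time": time,
        "Weather": weather,
    }
    # Per [049], the contents field is shown only for a loaded container.
    if attrs.container_type == "loaded" and attrs.contents:
        overlay["Content within the container"] = attrs.contents
    return overlay

c5 = ContainerAttributes("C5", "loaded", "20 ft", 27600, "marble slabs")
print(build_overlay(c5, "12/09/21", "10:05 AM", "33 degree"))
```

As the sketch shows, an unloaded container simply yields an overlay without the contents field, consistent with the conditional display described above.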
[050] As will be appreciated, in some embodiments, the container ‘C5’ depicted in the present FIG. 7 may be loaded on a vehicle when parked in the at least one container slot. In reference to FIG. 6, the vehicle may correspond to the vehicle 606. The vehicle 606 displayed in FIG. 6 may correspond to a vehicle with an attached trailer on which the container ‘C5’ may be loaded, when the container ‘C5’ is parked in the at least one container slot. In another embodiment, when the container ‘C5’ is parked in the at least one container slot, the container ‘C5’ may be loaded on the trailer detached from the vehicle. In yet another embodiment, when the container ‘C5’ is parked in the at least one container slot, the container ‘C5’ may be loaded on a box type truck.
[051] Referring now to FIG. 8, a pictorial representation 800 of rendering first directions to a user 802 is illustrated, in accordance with an exemplary embodiment. Consider a scenario where a driver, i.e., the user 802, may want to park a vehicle 806. In order to park the vehicle 806, the at least one vehicle attribute and the at least one parking slot allocated to the vehicle 806 may be identified. The process of identifying the at least one vehicle attribute has been explained in detail in reference to FIG. 6. In an embodiment, the at least one parking slot may be randomly allocated to the vehicle 806 upon identifying the at least one vehicle attribute. In some embodiments, the at least one parking slot may be allocated initially, upon first entry of the vehicle 806 into the vehicle yard.
[052] Once the at least one vehicle attribute and the at least one parking slot are identified, the first directions may be rendered to the user 802 via smart glasses 804 worn by the user 802. The first directions may be rendered to the user 802 on the smart glasses 804 to perform navigation of the vehicle 806 to a parking slot (e.g., P5) from the at least one parking slot (e.g., P2 and P5), and to perform parking of the vehicle 806 in the parking slot (i.e., P5). The first directions rendered to the user 802 on the smart glasses 804 may be represented as depicted via an elaborated view 808. In order to perform navigation of the vehicle 806 to the parking slot ‘P5’, a path to be followed by the vehicle 806 and a direction of movement to reach the parking slot ‘P5’ may be displayed to the user 802, as depicted via the elaborated view 808. By way of an example, the path may be depicted as: keep moving straight for ‘1 kilometer (km)’; after moving 1 km, turn right and move straight for ‘200 m’ at a speed of 20 miles per hour (mph). In addition, a location of the parking slot ‘P5’ may be highlighted for enabling the user 802 to easily identify the parking slot ‘P5’.
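As a non-limiting sketch, the turn-by-turn presentation of the first directions described above may be composed as a list of display strings ending with the highlighted target slot. The function `render_first_directions` is a hypothetical name introduced for illustration only; the step values mirror the example path of FIG. 8.

```python
# Hypothetical sketch of composing the first directions of [052];
# render_first_directions is not part of the disclosed system.
from typing import Optional

def render_first_directions(slot_id: str,
                            steps: list[tuple[str, str, Optional[str]]]) -> list[str]:
    """Turn (distance, action, speed) tuples into the display strings
    rendered on the smart glasses, ending with the highlighted slot."""
    lines = []
    for distance, action, speed in steps:
        line = f"Move {action} for {distance}"
        if speed:
            line += f" at {speed}"
        lines.append(line)
    # The target parking slot is highlighted for easy identification.
    lines.append(f"Park in highlighted slot {slot_id}")
    return lines

path = [("1 km", "straight", None),
        ("200 m", "right, then straight", "20 mph")]
for line in render_first_directions("P5", path):
    print(line)
```

An analogous composition could be used for the second directions towards a container slot, with the final line indicating the loading or unloading target instead of a parking slot.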
[053] Referring now to FIG. 9, a pictorial representation of rendering second directions to a user for performing unloading of a container in a vehicle yard 900 is illustrated, in accordance with some exemplary embodiments. As depicted via the present FIG. 9, the vehicle yard 900 may include a plurality of vehicle parking slots 902 and a plurality of container slots 904. As will be appreciated, for ease of explanation, a set of five vehicle parking slots, i.e., ‘P1’, ‘P2’, ‘P3’, ‘P4’, and ‘P5’, and a set of five container slots, i.e., ‘S1’, ‘S2’, ‘S3’, ‘S4’, and ‘S5’, are illustrated. In addition to the vehicle parking slots 902 and the container slots 904, a vehicle 906 loaded with a container is depicted. In order to perform unloading of the container loaded on the vehicle 906, the second directions towards an appropriate container slot may be rendered on the smart glasses of the user. In the present embodiment, the user may correspond to the driver of the vehicle 906. In addition, the appropriate container slot for performing unloading of the container may be ‘S4’. An elaborated view of the second directions towards the container slot ‘S4’, rendered on the smart glasses of the user for performing unloading of the container, is explained in detail in conjunction with FIG. 10.
[054] Referring now to FIG. 10, a pictorial representation 1000 of an elaborated view 1008 of the second directions rendered on smart glasses 1004 of a user 1002 for performing unloading of a container ‘C4’ is illustrated, in accordance with an exemplary embodiment. FIG. 10 is explained in reference to FIG. 9. As depicted via the elaborated view 1008, the second directions towards the container slot ‘S4’ may be rendered on the smart glasses 1004 of the user 1002 of the vehicle 1006 (same as the vehicle 906) to perform unloading of the container ‘C4’. The second directions rendered on the smart glasses 1004 of the user 1002 may display a path to be followed by the vehicle 1006 and a direction of movement to reach the container slot ‘S4’. By way of an example, the path to be followed by the vehicle 1006 to unload the container ‘C4’ may be depicted as: keep moving straight for ‘800 meters (m)’; after moving 800 m, turn right and move straight for ‘100 m’ at a speed of 10 miles per hour (mph). Moreover, as represented via the elaborated view 1008 of the smart glasses 1004, a location of the container slot ‘S4’ in which the container ‘C4’ needs to be unloaded may be highlighted in order to enable the user 1002 to easily identify the container slot ‘S4’.
[055] Referring now to FIG. 11, a pictorial representation 1100 of an elaborated view 1108 of the second directions rendered on smart glasses 1104 of a user 1102 for performing loading of a container ‘C4’ is illustrated, in accordance with some exemplary embodiments. In the present FIG. 11, the elaborated view 1108 may depict the second directions rendered on the smart glasses 1104 of the user 1102 (i.e., the driver) of a vehicle 1106 to perform loading of the container ‘C4’ from a container slot ‘S4’. As depicted via the elaborated view 1108, the vehicle 1106 may be an unloaded vehicle. Moreover, the second directions rendered on the smart glasses 1104 may display a path to be followed by the vehicle 1106 to reach the container slot ‘S4’ along with a direction of movement to reach the container slot ‘S4’. By way of an example, the path to be followed by the vehicle 1106 to load the container ‘C4’ from the container slot ‘S4’ may be depicted as: keep moving straight for ‘800 m’; after moving 800 m, turn right and move straight for ‘100 m’ at a speed of 10 mph. In an embodiment, a location of the container slot ‘S4’ from which the container ‘C4’ needs to be loaded may be highlighted and rendered to the user 1102 on the smart glasses 1104 for easy identification of the container slot ‘S4’.
[056] Various embodiments provide a method and system for real-time monitoring of vehicles in a vehicle yard. The disclosed method and system may identify at least one of: at least one vehicle attribute associated with a vehicle and at least one parking slot allocated to the vehicle; and at least one container attribute associated with a container and at least one container slot allocated to the container. Further, the disclosed method and system may render to a user at least one of first directions and second directions. The first directions may be rendered to the user to perform navigation of the vehicle to a parking slot from the at least one parking slot and parking of the vehicle in the parking slot. The second directions may be rendered to the user to perform one of loading the container onto the vehicle or a cargo carrier and unloading the container to a container slot from the at least one container slot. Moreover, the disclosed method and system may determine compliance of at least one of the directions in response to an action performed by the user. Additionally, the disclosed method and system may generate a notification in response to determining compliance of the at least one of the directions.
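As a non-limiting sketch of the compliance determination and notification generation described above, the alert/success split of the notification (an alert message on non-compliance, a success message on compliance) may be illustrated as follows. The function `determine_compliance` and its message strings are hypothetical and do not represent the claimed implementation.

```python
# Hedged illustrative sketch: determine_compliance is a hypothetical helper.
# The action performed by the user is modeled as the slot actually used,
# compared against the slot to which the directions pointed.
def determine_compliance(allocated_slot: str, occupied_slot: str) -> str:
    """Compare the user's action against the rendered directions and
    generate the corresponding notification message."""
    if occupied_slot == allocated_slot:
        # Compliance: generate a success message.
        return f"Success: vehicle correctly placed in slot {allocated_slot}"
    # Non-compliance: generate an alert message.
    return (f"Alert: non-compliance detected, expected slot "
            f"{allocated_slot}, found {occupied_slot}")

print(determine_compliance("P5", "P5"))
print(determine_compliance("S4", "S2"))
```

In a deployed system, the occupied slot might instead be inferred from sensors or camera input on the smart glasses; the comparison-and-notify structure above is the essential pattern.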
[057] The method and system provide several advantages. For example, the method and system may automate location assignment of vehicles and containers by providing real-time visibility of the vehicle yard (inventory). Further, the method and system may enable automated guidance for handling vehicles and containers, thereby avoiding manual search. This in turn may increase velocity of the vehicle yard, improve performance, and reduce congestion. In addition, the method and system may facilitate timely departure of vehicles (e.g., trains, ships, trucks, trailers, etc.), and timely unloading or loading of containers. Moreover, the method and system may reduce vehicle dwell time inside the vehicle yard and shuttle driver turn time for containers, which in turn may reduce the demurrage cost that a merchant needs to pay for use of containers. Additionally, the method and system may improve customer experience and satisfaction along with improved safety of employees of the vehicle yard.
[058] It will be appreciated that, for clarity purposes, the above description has described embodiments of the invention with reference to different functional units and processors. However, it will be apparent that any suitable distribution of functionality between different functional units, processors or domains may be used without detracting from the invention. For example, functionality illustrated to be performed by separate processors or controllers may be performed by the same processor or controller. Hence, references to specific functional units are only to be seen as references to suitable means for providing the described functionality, rather than indicative of a strict logical or physical structure or organization.
[059] Although the present invention has been described in connection with some embodiments, it is not intended to be limited to the specific form set forth herein. Rather, the scope of the present invention is limited only by the claims. Additionally, although a feature may appear to be described in connection with particular embodiments, one skilled in the art would recognize that various features of the described embodiments may be combined in accordance with the invention.
[060] Furthermore, although individually listed, a plurality of means, elements or process steps may be implemented by, for example, a single unit or processor. Additionally, although individual features may be included in different claims, these may possibly be advantageously combined, and the inclusion in different claims does not imply that a combination of features is not feasible and/or advantageous. Also, the inclusion of a feature in one category of claims does not imply a limitation to this category, but rather the feature may be equally applicable to other claim categories, as appropriate.
Claims: WHAT IS CLAIMED IS:
1. A method for real-time monitoring of vehicles in a vehicle yard, the method comprising:
identifying, by smart glasses (102), at least one of:
at least (202) one vehicle attribute associated with a vehicle and at least one parking slot allocated to the vehicle; and
at least (302) one container attribute associated with a container and at least one container slot allocated to the container;
rendering to a user, by the smart glasses, at least one of:
first directions (204), via a Graphical User Interface (GUI) on the smart glasses (102), to perform:
navigation (206) of the vehicle to a parking slot from the at least one parking slot; and
parking (208) of the vehicle in the parking slot; and
second directions (304), via the GUI, to perform one of:
loading (306) the container onto the vehicle or a cargo carrier; and
unloading (308) the container to a container slot from the at least one container slot;
determining, by the smart glasses (102), compliance of at least one of the directions in response to an action performed by the user; and
generating, by the smart glasses (102), a notification in response to determining compliance of the at least one of the directions.
2. The method as claimed in claim 1, wherein:
the smart glasses (102) are one of a Virtual Reality (VR) device and an Augmented Reality (AR) device;
the at least one vehicle attribute comprises at least one of vehicle number, vehicle type, vehicle dimensions, or vehicle make;
the at least one container attribute comprises at least one of container number, container type, container dimensions, weight of the container, or content within the container; and
the notification comprises one of:
an alert message generated in response to non-compliance with at least one of the first directions and the second directions; and
a success message generated in response to compliance with at least one of the first directions and the second directions.
3. The method as claimed in claim 1, wherein:
identifying (202) the at least one vehicle attribute comprises retrieving (404), by the smart glasses (102), information from a server in response to identifying one of the at least one vehicle attribute, and wherein the information corresponds to the at least one vehicle attribute; and
identifying (302) the at least one container attribute comprises retrieving (504), by the smart glasses (102), information from a server in response to identifying one of the at least one container attribute, and wherein the information corresponds to the at least one container attribute.
4. The method as claimed in claim 1, further comprising rendering, via the GUI, the at least one vehicle attribute and the at least one container attribute to the user, in response to the identifying.
5. The method as claimed in claim 1, wherein:
rendering the first directions to perform navigation of the vehicle to the parking slot comprises displaying, via the GUI, a path to be followed by the vehicle and a direction of movement to reach the parking slot; and
rendering the first directions to perform parking of the vehicle in the parking slot comprises highlighting, via the GUI, location of the parking slot.
6. A system (100) for real-time monitoring of vehicles in a vehicle yard, the system (100) comprising:
smart glasses (102), wherein the smart glasses (102) comprise:
a processor (108); and
a memory (106) communicatively coupled to the processor (108), wherein the memory (106) stores processor executable instructions, which, on execution, cause the processor (108) to:
identify at least one of:
at least (202) one vehicle attribute associated with a vehicle and at least one parking slot allocated to the vehicle; and
at least (302) one container attribute associated with a container and at least one container slot allocated to the container;
render to a user, at least one of:
first directions (204), via a Graphical User Interface (GUI), to perform:
navigation (206) of the vehicle to a parking slot from the at least one parking slot; and
parking (208) of the vehicle in the parking slot; and
second directions (304), via the GUI, to perform one of:
loading (306) the container onto the vehicle or a cargo carrier; and
unloading (308) the container to a container slot from the at least one container slot;
determine compliance of at least one of the directions in response to an action performed by the user; and
generate a notification in response to determining compliance of the at least one of the directions.
7. The system (100) as claimed in claim 6, wherein:
the smart glasses (102) are one of a Virtual Reality (VR) device and an Augmented Reality (AR) device;
the at least one vehicle attribute comprises at least one of vehicle number, vehicle type, vehicle dimensions, or vehicle make;
the at least one container attribute comprises at least one of container number, container type, container dimensions, weight of the container, or content within the container; and
the notification comprises one of:
an alert message generated in response to non-compliance with at least one of the first directions and the second directions; and
a success message generated in response to compliance with at least one of the first directions and the second directions.
8. The system (100) as claimed in claim 6, wherein the processor executable instructions further cause the processor (108) to:
identify (202) the at least one vehicle attribute by retrieving (404) information from a server in response to identifying one of the at least one vehicle attribute, and wherein the information corresponds to the at least one vehicle attribute; and
identify (302) the at least one container attribute by retrieving (504) information from a server in response to identifying one of the at least one container attribute, and wherein the information corresponds to the at least one container attribute.
9. The system (100) as claimed in claim 6, wherein the processor executable instructions further cause the processor (108) to:
render, via the GUI, the at least one vehicle attribute and the at least one container attribute to the user, in response to the identifying.
10. The system (100) as claimed in claim 6, wherein the processor executable instructions further cause the processor (108) to:
render the first directions to perform navigation of the vehicle to the parking slot by displaying, via the GUI, a path to be followed by the vehicle and a direction of movement to reach the parking slot; and
render the first directions to perform parking of the vehicle in the parking slot by highlighting, via the GUI, location of the parking slot.
| # | Name | Date |
|---|---|---|
| 1 | 202211026490-ABSTRACT [14-03-2023(online)].pdf | 2023-03-14 |
| 2 | 202211026490-STATEMENT OF UNDERTAKING (FORM 3) [06-05-2022(online)].pdf | 2022-05-06 |
| 3 | 202211026490-CORRESPONDENCE [14-03-2023(online)].pdf | 2023-03-14 |
| 4 | 202211026490-REQUEST FOR EXAMINATION (FORM-18) [06-05-2022(online)].pdf | 2022-05-06 |
| 5 | 202211026490-REQUEST FOR EARLY PUBLICATION(FORM-9) [06-05-2022(online)].pdf | 2022-05-06 |
| 6 | 202211026490-DRAWING [14-03-2023(online)].pdf | 2023-03-14 |
| 7 | 202211026490-PROOF OF RIGHT [06-05-2022(online)].pdf | 2022-05-06 |
| 8 | 202211026490-FER_SER_REPLY [14-03-2023(online)].pdf | 2023-03-14 |
| 9 | 202211026490-POWER OF AUTHORITY [06-05-2022(online)].pdf | 2022-05-06 |
| 10 | 202211026490-OTHERS [14-03-2023(online)].pdf | 2023-03-14 |
| 11 | 202211026490-FORM-9 [06-05-2022(online)].pdf | 2022-05-06 |
| 12 | 202211026490-FER.pdf | 2022-09-19 |
| 13 | 202211026490-FORM 18 [06-05-2022(online)].pdf | 2022-05-06 |
| 14 | 202211026490-COMPLETE SPECIFICATION [06-05-2022(online)].pdf | 2022-05-06 |
| 15 | 202211026490-FORM 1 [06-05-2022(online)].pdf | 2022-05-06 |
| 16 | 202211026490-DECLARATION OF INVENTORSHIP (FORM 5) [06-05-2022(online)].pdf | 2022-05-06 |
| 17 | 202211026490-DRAWINGS [06-05-2022(online)].pdf | 2022-05-06 |
| 18 | 202211026490-FIGURE OF ABSTRACT [06-05-2022(online)].jpg | 2022-05-06 |
| 19 | 202211026490E_16-09-2022.pdf | |