
A Lane Marking System And A Method To Operate The Same

Abstract: A lane marking system (100) is provided. The system includes a machine (102). The machine includes a drivetrain (104) enabling the movement of the machine on a surface; a wire-laying unit (106) configured to lay a wire on the surface for the machine to follow a path; a navigation unit (108) including a position-based navigating system for automating the movement of the machine; a path definer (112) to navigate the drivetrain; a camera (122) mounted on a variable offset boom (124) for capturing the image of a preferred reference point; a visual-based marking unit (116) to mark a lane with respect to a plurality of reference points on the surface; an image processing module (132) to process an image and predict a trajectory on the surface; a user interface (118) that receives commands from a user for the operation of the machine; and an end effector (120) that generates the lane markings. FIG. 1


Patent Information

Application #: 202341047345
Filing Date: 13 July 2023
Publication Number: 35/2023
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Grant Date: 2024-06-28

Applicants

ETERNAL ROBOTICS PRIVATE LIMITED
181 & 202, RAGHAVENDRA COLONY, KONDAPUR, HYDERABAD- 500084, TELANGANA, INDIA

Inventors

1. MIR AMAN ALI
ETERNAL ROBOTICS PRIVATE LIMITED, 181 & 202, RAGHAVENDRA COLONY, KONDAPUR, HYDERABAD- 500084, TELANGANA, INDIA
2. SUBBA RAO TADIKONDA
ETERNAL ROBOTICS PRIVATE LIMITED, 181 & 202, RAGHAVENDRA COLONY, KONDAPUR, HYDERABAD- 500084, TELANGANA, INDIA
3. POOJIT MADDINENI
ETERNAL ROBOTICS PRIVATE LIMITED, 181 & 202, RAGHAVENDRA COLONY, KONDAPUR, HYDERABAD- 500084, TELANGANA, INDIA
4. RAJARAM BANDA
ETERNAL ROBOTICS PRIVATE LIMITED, 181 & 202, RAGHAVENDRA COLONY, KONDAPUR, HYDERABAD- 500084, TELANGANA, INDIA
5. SOPAN SHRIRANG KOTBAGI
ETERNAL ROBOTICS PRIVATE LIMITED, 181 & 202, RAGHAVENDRA COLONY, KONDAPUR, HYDERABAD- 500084, TELANGANA, INDIA
6. AJITH ARAVINDAKSHAN NAIR
ETERNAL ROBOTICS PRIVATE LIMITED, 181 & 202, RAGHAVENDRA COLONY, KONDAPUR, HYDERABAD- 500084, TELANGANA, INDIA
7. PASAM PREM KUMAR
ETERNAL ROBOTICS PRIVATE LIMITED, 181 & 202, RAGHAVENDRA COLONY, KONDAPUR, HYDERABAD- 500084, TELANGANA, INDIA
8. THALLAPALLY SOLOMON
ETERNAL ROBOTICS PRIVATE LIMITED, 181 & 202, RAGHAVENDRA COLONY, KONDAPUR, HYDERABAD- 500084, TELANGANA, INDIA
9. NITESH BOYINA
ETERNAL ROBOTICS PRIVATE LIMITED, 181 & 202, RAGHAVENDRA COLONY, KONDAPUR, HYDERABAD- 500084, TELANGANA, INDIA
10. SAI SRIKAR REDDY METTU
ETERNAL ROBOTICS PRIVATE LIMITED, 181 & 202, RAGHAVENDRA COLONY, KONDAPUR, HYDERABAD- 500084, TELANGANA, INDIA
11. HARINI ASHOK BURRA
ETERNAL ROBOTICS PRIVATE LIMITED, 181 & 202, RAGHAVENDRA COLONY, KONDAPUR, HYDERABAD- 500084, TELANGANA, INDIA

Specification

Description:

FIELD OF INVENTION
[0001] Embodiments of the present disclosure relate to marking devices, and more particularly to a lane marking system and a method to operate the same.
BACKGROUND
[0002] Surface markings are lines, patterns, words, and signs applied to surfaces such as roads. They are applied to the carriageway, or to objects adjacent to the carriageway, to control, warn, guide, and inform road users. Road markings are an important component of a highway and function to guide and control traffic. Marking machines are used for marking on surfaces such as roads. Marking machines have played a major role in urban planning and highway construction owing to advantages such as speed, efficiency, and accuracy, minimizing the construction period and the economic investment of road construction.
[0003] Existing marking machines, such as manual lane marker machines, have poor ergonomics and cumbersome handling. Current machines rely heavily on human skill and hand-eye coordination, which may lead to inaccuracy in marking. Existing machines also deposit lower-quality material because of overheating or underheating of the thermoplastic paint. Further, direct exposure to hot thermoplastic fumes is a health hazard to the operator.
[0004] There is a need for a system for screeding, spraying, and extrusion of hot and cold thermoplastic and other materials used for marking on surfaces such as roads, fields, parking lots, and the like. There is also a need for a system that may be operated manually as well as automatically.
[0005] Hence, there is a need for a lane marking system which addresses the aforementioned issues.

OBJECTIVE OF THE INVENTION
[0006] An objective of the invention is to provide a system for lane marking on a surface.
[0007] Another objective of the invention is to provide good ergonomics and easy handling for an operator of the system.
[0008] Yet another objective of the invention is to automate the lane marking system.
[0009] Additionally, an objective of the invention is to provide high accuracy in lane marking.
[0010] Further, an objective of the invention is to maintain the temperature of the paint used for the lane marking, to avoid health hazards to the operator of the system.
BRIEF DESCRIPTION
[0011] In accordance with one embodiment of the disclosure, a lane marking system is disclosed. The lane marking system includes a machine. The machine includes a drivetrain, a wire-laying unit, a navigation unit, a path definer, a user interface, an end effector, and a visual-based marking module. The drivetrain is configured to enable the movement of the machine on a surface. The wire-laying unit is configured to lay a wire on the surface for the machine to follow a pre-defined path. The navigation unit is electronically coupled to the drivetrain and is a position-based navigating system for automating the movement of the machine. The path definer is configured to navigate the drivetrain corresponding to an identified path for lane marking and is sensed by a camera sensor. A visual-based marking module is operatively coupled with the path definer, wherein the visual-based marking module is configured to mark a lane with a marking material with respect to a plurality of reference points on a surface, and wherein the plurality of reference points includes a starting reference point and an ending reference point for the line marking. The user interface is operatively coupled with the navigation unit, wherein the user interface is configured to receive commands from a user for the operation of the machine. The end effector is operatively coupled with the navigation unit and the user interface, wherein the end effector is configured to generate the lane markings by using a contact-based applicator mechanism and a colour paint. The end effector includes a level sensor for sensing the quantity of the paint available at the end effector. An actuator actuates a process of transferring the paint from a main chamber to the end effector for actuating the lane marking. The camera is mounted on a variable offset boom mechanically coupled to a machine frame for capturing the image of a preferred reference point in an operative condition. The variable offset boom aids the camera to be placed on a reference line to be tracked during the lane markings. The visual-based marking module includes an image processing module. The image processing module is configured to process at least one of an image and a video captured by the camera to generate a reference line with respect to the reference point for navigation and marking on the surface. The image processing module is also configured to segment images to detect line markings on the surface, extract white stripes from the segmentation mask, and filter and fine-tune the result to obtain a single line. The image processing module is also configured to convert the plurality of reference points in the generated reference line from two-dimensional coordinates to three-dimensional points using a deep-learning neural network model. Further, the image processing module is configured to predict a trajectory on the surface, wherein the trajectory is predicted by joining the three-dimensional points, to obtain an orientation and a position of the machine. Furthermore, the image processing module is configured to update a plurality of line coordinates in real time, interpolate a start of painting, and synchronize the plurality of line coordinates with the reference line.
[0012] In accordance with another embodiment, a method for operating the lane marking system is disclosed. The method includes moving a drivetrain of a machine on a surface. The method also includes following a path laid by a wire-laying unit. Further, the method includes automating, by a navigation unit, the movement of the machine. Furthermore, the method includes navigating, by a path definer of the navigation unit, the drivetrain corresponding to a defined path for lane marking, wherein the path definer is sensed by a camera sensor. Furthermore, the method includes marking, by a visual-based marking module of the navigation unit, a lane with a marking material with respect to a plurality of reference points on a surface, wherein the plurality of reference points includes a starting reference point and an ending reference point for the line marking. Furthermore, the method includes receiving, by a user interface of the machine, commands from a user for the operation of the machine. Furthermore, the method includes generating, by an end effector of the machine, the lane markings by using a contact-based applicator mechanism and a colour paint. Furthermore, the method includes providing a camera positioned on the machine with an extended arm, wherein the extended arm aids the camera to be placed on a reference line to be tracked during the lane markings. Furthermore, the method includes processing, by an image processing module of the visual-based marking module, at least one of an image and a video captured by the camera to generate a reference line with respect to the reference point for navigation and marking on the surface. Furthermore, the method includes segmenting, by the image processing module of the visual-based marking module, an image to detect line markings on the surface, extracting white stripes from the segmentation mask, and filtering and fine-tuning the result to obtain a single line. Furthermore, the method includes converting, by the image processing module of the visual-based marking module, the plurality of reference points in the generated reference line from two-dimensional coordinates to three-dimensional points using a deep-learning neural network model. Furthermore, the method includes predicting, by the image processing module of the visual-based marking module, a trajectory on the surface, wherein the trajectory is predicted by joining the three-dimensional points, to obtain an orientation and a position of the machine. Furthermore, the method includes updating, by the image processing module of the visual-based marking module, a plurality of line coordinates in real time, interpolating a start of painting, and synchronizing the plurality of line coordinates with the reference line.
[0013] To further clarify the advantages and features of the present disclosure, a more particular description of the disclosure will follow by reference to specific embodiments thereof, which are illustrated in the appended figures. It is to be appreciated that these figures depict only typical embodiments of the disclosure and are therefore not to be considered limiting in scope. The disclosure will be described and explained with additional specificity and detail with the appended figures.
BRIEF DESCRIPTION OF THE DRAWINGS
The disclosure will be described and explained with additional specificity and detail with the accompanying figures in which:
[0014] FIG. 1 is a schematic representation of a top view of a lane marking system in accordance with an embodiment of the present disclosure;
[0015] FIG. 2a is a rear view of the lane marking system of FIG.1 in accordance with an embodiment of the present disclosure;
[0016] FIG. 2b is a front view of the lane marking system of FIG.1 in accordance with an embodiment of the present disclosure;
[0017] FIG. 2c is a top view of the lane marking system of FIG.1 in an inoperative condition in accordance with an embodiment of the present disclosure;
[0018] FIG. 2d is a bottom view of the lane marking system of FIG.1 in accordance with an embodiment of the present disclosure;
[0019] FIG. 3 is a perspective view of the lane marking system of FIG.1 in accordance with an embodiment of the present disclosure;
[0020] FIG. 4 is a perspective view of an exemplary embodiment of the lane marking system of FIG.1 in accordance with an embodiment of the present disclosure;
[0021] FIG. 5 is a top view of the lane marking system in an operative condition of FIG.1 in accordance with an embodiment of the present disclosure;
[0022] FIG. 6a is a flow chart representing steps involved in a method for the operation of the lane marking system in accordance with an embodiment of the present disclosure; and
[0023] FIG. 6b is the continued steps involved in the method for operation of the lane marking system in accordance with an embodiment of the present disclosure.
[0024] Further, those skilled in the art will appreciate that elements in the figures are illustrated for simplicity and may not necessarily have been drawn to scale. Furthermore, in terms of the construction of the device, one or more components of the device may have been represented in the figures by conventional symbols, and the figures may show only those specific details that are pertinent to understanding the embodiments of the present disclosure, so as not to obscure the figures with details that will be readily apparent to those skilled in the art having the benefit of the description herein.
DETAILED DESCRIPTION
[0025] For the purpose of promoting an understanding of the principles of the disclosure, reference will now be made to the embodiment illustrated in the figures and specific language will be used to describe them. It will nevertheless be understood that no limitation of the scope of the disclosure is thereby intended. Such alterations and further modifications in the illustrated system, and such further applications of the principles of the disclosure as would normally occur to those skilled in the art are to be construed as being within the scope of the present disclosure.
[0026] The terms “comprises”, “comprising”, or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such a process or method. Similarly, one or more devices or sub-systems or elements or structures or components preceded by “comprises” does not, without more constraints, preclude the existence of other devices, sub-systems, elements, structures, components, additional devices, additional sub-systems, additional elements, additional structures or additional components. Appearances of the phrase “in an embodiment”, “in another embodiment” and similar language throughout this specification may, but not necessarily do, all refer to the same embodiment.
[0027] Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by those skilled in the art to which this disclosure belongs. The system, methods, and examples provided herein are only illustrative and not intended to be limiting.
[0028] In the following specification and the claims, reference will be made to a number of terms, which shall be defined to have the following meanings. The singular forms “a”, “an”, and “the” include plural references unless the context clearly dictates otherwise.
[0029] Embodiments of the present disclosure relate to a lane marking system. The lane marking system includes a machine. The machine includes a drivetrain, a wire-laying unit, a navigation unit, a user interface, an end effector, and a visual-based marking module. The drivetrain is configured to enable the movement of the machine on a surface. The wire-laying unit is configured to lay a wire on the surface for the machine to follow a pre-defined path. The navigation unit is electronically coupled to the drivetrain and is a position-based navigating system for automating the movement of the machine. A path definer navigates the drivetrain corresponding to a defined path for lane marking, wherein the path definer is sensed by a camera sensor. A visual-based marking module is operatively coupled with the path definer, wherein the visual-based marking module is configured to mark a lane with a marking material with respect to a plurality of reference points on a surface, and wherein the plurality of reference points includes a starting reference point and an ending reference point for the line marking. The user interface is operatively coupled with the navigation unit, wherein the user interface is configured to receive commands from a user for the operation of the machine. The end effector is operatively coupled with the navigation unit and the user interface, wherein the end effector is configured to generate the lane markings by using a contact-based applicator mechanism and a colour paint. The end effector includes a level sensor for sensing the quantity of the paint available at the end effector. An actuator actuates a process of transferring the paint from a main chamber to the end effector for actuating the lane marking. The camera is positioned on the machine with an extended arm, wherein the extended arm aids the camera to be placed on a reference line to be tracked during the lane markings. The visual-based marking module includes an image processing module. The image processing module is configured to process at least one of an image and a video captured by the camera to generate a reference line with respect to the reference point for navigation and marking on the surface. The image processing module is also configured to segment images to detect line markings on the surface, extract white stripes from the segmentation mask, and filter and fine-tune the result to obtain a single line. The image processing module is also configured to convert the plurality of reference points in the generated reference line from two-dimensional coordinates to three-dimensional points using a deep-learning neural network model. Further, the image processing module is configured to predict a trajectory on the surface, wherein the trajectory is predicted by joining the three-dimensional points, to obtain an orientation and a position of the machine. Furthermore, the image processing module is configured to update a plurality of line coordinates in real time, interpolate a start of painting, and synchronize the plurality of line coordinates with the reference line.
[0030] FIG. 1 is a schematic representation of a top view of a lane marking system (100) in accordance with an embodiment of the present disclosure. The lane marking system includes a machine (102), which further includes a drivetrain (104) (shown in FIG. 2d), a wire-laying unit (106), a navigation unit (108), a path definer (112), a user interface (118), an end effector (120), a camera (122), and an image processing module (132).
[0031] The drivetrain (104) is configured to enable the movement of the machine (102) on a surface. In one embodiment, the drivetrain (104) may be a 3-wheel (134) setup with a differential coupled with a brushless direct-current electric motor at the front and a motor-controlled steering wheel at the rear end.
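As a hedged illustration of how such a drivetrain may be commanded (the wheelbase value, function name, and sign convention below are assumptions, not details from the specification), a bicycle-model approximation maps a desired speed and path curvature to a drive speed and a steering angle:

```python
import math

def drive_commands(speed_mps, curvature_per_m, wheelbase_m=1.2):
    """Map a desired forward speed and path curvature to drivetrain commands.

    A minimal kinematic sketch for the 3-wheel layout described above
    (driven front axle with differential, motor-steered rear wheel). The
    1.2 m wheelbase and the sign convention are illustrative assumptions.
    """
    steer_rad = math.atan(wheelbase_m * curvature_per_m)  # bicycle-model steering angle
    return speed_mps, steer_rad

# Example: follow a gentle curve of 20 m radius at 0.5 m/s.
print(drive_commands(0.5, 1 / 20.0))
```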
[0032] The wire-laying unit (106) is configured to lay a wire on the surface for the machine (102) to follow a pre-defined path. In one embodiment, the wire-laying unit (106) is detachably connected to the machine (102).
[0033] The navigation unit (108) is electronically coupled to the drivetrain (104). The navigation unit (108) is configured for automating the movement of the machine (102) and includes a position-based navigating system. The path definer (112) is configured to navigate the drivetrain (104) corresponding to a pre-defined path for lane marking. The path definer (112) is sensed by a camera sensor. In one embodiment, the path definer (112) is sensed by an inductive sensor. In one embodiment, the navigation unit (108) includes a navigation sensor. In one embodiment, the navigation sensor is connected to a pickup coil arrangement on the machine (102).
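The specification names a pickup coil arrangement but not a control law; a common wire-guidance scheme, sketched below under that assumption, compares the EMF induced in two coils straddling the energized wire and steers toward the null point:

```python
def steering_correction(left_emf, right_emf, gain=0.8):
    """Proportional steering correction from a two-coil pickup arrangement.

    Hedged sketch: the coils are assumed to straddle the charged wire laid by
    the wire-laying unit (106); their normalized EMF difference approximates
    the lateral offset from the wire and is zero when the machine is centred.
    `gain` is an assumed tuning constant, not a value from the specification.
    """
    total = left_emf + right_emf
    if total == 0:
        return 0.0                               # no wire signal; hold heading
    offset = (left_emf - right_emf) / total      # roughly -1 .. 1 lateral estimate
    return -gain * offset                        # steering command toward the wire
```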
[0034] Further, the machine (102) includes a processing subsystem (126). The visual-based marking module (116) includes an image processing module (132). The visual-based marking module (116) is operatively coupled with the path definer (112) and is configured to mark a lane with a marking material with respect to a plurality of reference points on a surface. The plurality of reference points includes a starting reference point and an ending reference point for the line marking.
[0035] The image processing module (132) is configured to process at least one of an image and a video captured by the camera (122) to generate a reference line with respect to the reference point for navigation and marking on the surface. The image processing module (132) is also configured to segment the image to detect line markings on the surface, extract white stripes from the segmentation mask, and filter and fine-tune the result to obtain a single line. Further, the image processing module (132) is configured to convert the plurality of reference points in the generated reference line from two-dimensional coordinates to three-dimensional points using a deep-learning neural network model. Furthermore, the image processing module (132) is configured to predict a trajectory on the surface, wherein the trajectory is predicted by joining the three-dimensional points, to obtain an orientation and a position of the machine (102). Furthermore, the image processing module (132) is configured to update a plurality of line coordinates in real time, interpolate a start of painting, and synchronize the plurality of line coordinates with the reference line.
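A minimal classical sketch of the segmentation-to-single-line step follows; the module may equally use a learned segmentation model, and the brightness threshold and kernel size below are assumptions rather than values from the specification:

```python
import cv2
import numpy as np

def extract_reference_line(bgr_frame):
    """Reduce one camera frame to a single fitted reference line."""
    gray = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2GRAY)
    # Segment bright stripes (white paint reflects strongly); 200 is assumed.
    _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
    # Filter speckle noise from the mask with a morphological opening.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    ys, xs = np.nonzero(mask)
    if len(xs) < 2:
        return None                              # no stripe found in this frame
    # Fine-tune thick stripes down to one line: total-least-squares fit.
    vx, vy, x0, y0 = cv2.fitLine(
        np.column_stack([xs, ys]).astype(np.float32),
        cv2.DIST_L2, 0, 0.01, 0.01,
    ).ravel()
    return (x0, y0), (vx, vy)                    # point on the line + unit direction
```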
[0036] The user interface (118) is operatively coupled with the navigation unit (108), wherein the user interface (118) is configured to receive commands from a user for the operation of the machine (102).
[0037] The end effector (120) is operatively coupled with the navigation unit (108) and the user interface (118). The end effector (120) is configured to generate the lane markings by using a contact-based applicator mechanism and a colour paint. In one embodiment, the applicator mechanism is contactless. The end effector (120) includes a level sensor (138) for sensing the quantity of the paint available at the end effector (120). An actuator actuates a process of transferring the paint from a main chamber (136) to the end effector (120) for actuating the lane marking. In one embodiment, the actuators may be pneumatic-based actuators that perform timing operations. In one embodiment, the system includes a tank level sensor (140) for indicating the level of paint available in the tank.
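The paint-path supervision implied by the level sensors and actuator can be sketched as a polling loop; the callback names, threshold, and polling rate here are hypothetical stand-ins for the sensors (138, 140), the actuator, and the user interface (118):

```python
import time

LOW_LEVEL = 0.15    # assumed fraction below which the end effector needs paint

def manage_paint(read_effector_level, read_tank_level, open_transfer_valve, notify_user):
    """Hedged supervisory sketch of the paint transfer and notification logic."""
    while True:
        if read_effector_level() < LOW_LEVEL:
            if read_tank_level() > LOW_LEVEL:
                open_transfer_valve()            # transfer paint from the main chamber
            else:
                notify_user("paint unavailable at the end effector and the main chamber")
        time.sleep(0.5)                          # assumed 2 Hz polling rate
```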
[0038] The camera (122) is positioned on the machine (102) with an extended arm (124). The extended arm (124) aids the camera (122) to be placed on a reference line to be tracked during the lane markings. In one embodiment, the camera (122) may be rotated or set to obtain the road view.
[0039] FIG. 2a is a rear view of the lane marking system (100) of FIG.1 in accordance with an embodiment of the present disclosure. FIG. 2b is a front view of the lane marking system (100) of FIG.1 in accordance with an embodiment of the present disclosure. FIG. 2c is a top view of the lane marking system (100) of FIG.1 in accordance with an embodiment of the present disclosure. FIG. 2d is a bottom view of the lane marking system (100) of FIG.1 in accordance with an embodiment of the present disclosure. In one embodiment, the drivetrain (104) includes one of a wheel (134), track, or bipedal-based arrangement for the movement of the machine (102) in an operative condition. In one embodiment, the drivetrain (104) is fuel-powered. In another embodiment, the drivetrain (104) is electrically powered.
[0040] FIG. 3 is a perspective view of the lane marking system (100) of FIG.1 in accordance with an embodiment of the present disclosure. In one embodiment, the path definer (112) may follow a path with reference to a tape, a quick response follower, a reflector, a wireless system, or a light- or acoustic-based navigation unit (108) for automatically detecting the position of the machine (102). In one embodiment, the system (100) is used for screeding, spraying, and extrusion of hot and cold thermoplastic or other materials, which are used for marking on surfaces such as roads, fields, parking lots, and the like. In another embodiment, the system (100) may be configured to operate manually as well as automatically.
[0041] FIG. 4 is a perspective view of an exemplary embodiment of the lane marking system (100) of FIG.1 in accordance with an embodiment of the present disclosure. In one embodiment, the user interface (118) is at least one of onboard keypads, a handheld human-machine interface, a plurality of buttons, a switch, a joystick, and a web application-based device.
[0042] FIG. 5 is a top view of the lane marking system (100) of FIG.1 in an operative condition in accordance with an embodiment of the present disclosure. In one embodiment, the camera (122) captures multiple images and videos. In one embodiment, the captured image and video include at least one of a plurality of painted lines, a plurality of chalk markings, a rope, and a plurality of faded lines. In another embodiment, the camera (122) is mounted on the variable offset boom (124), which is configured to be rotatable and length-adjustable, for directing the camera (122) to the preferred reference point. In yet another embodiment, the camera (122) is used to trace a mark on the surface to enable the machine (102) to paint a line at an offset distance from a previously painted line, and to trace an old marking on the surface to enable the machine (102) to paint a new line over an old, faded line. In one embodiment, the path definer (112) includes an inductive wire laid on the surface and is configured to generate a magnetic field upon charging. In one embodiment, the navigation unit (108) includes a controller configured to control the navigation of the machine (102).
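Painting at an offset distance from a previously painted line can be illustrated by displacing each traced point along its local normal; the specification does not detail this computation, so the sketch below is a geometric assumption:

```python
import numpy as np

def offset_polyline(points_xy, offset_m):
    """Shift an ordered ground-plane polyline sideways by a fixed offset.

    Assumes the traced points are distinct and ordered along the line,
    with coordinates in metres.
    """
    pts = np.asarray(points_xy, dtype=float)
    tangents = np.gradient(pts, axis=0)                      # local direction per vertex
    normals = np.column_stack([-tangents[:, 1], tangents[:, 0]])
    normals /= np.linalg.norm(normals, axis=1, keepdims=True)
    return pts + offset_m * normals                          # the new line to paint
```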
[0043] FIG. 6a is a flow chart representing steps involved in a method (200) for the operation of the lane marking system in accordance with an embodiment of the present disclosure, and FIG. 6b shows the continued steps involved in the method (200) for the operation of the lane marking system in accordance with an embodiment of the present disclosure. The method includes moving a drivetrain of a machine on a surface in step (202). The method also includes moving the machine by one of a wheel, track, or bipedal-based arrangement.
[0044] The method also includes following, by a wire-laying unit, a pre-defined path by a wire laid on the surface in step (204). The method also includes removably coupling the wire-laying unit with the machine. The method also includes controlling, by a controller, the navigation of the machine corresponding to the strength of the magnetic field.
[0045] Further, the method includes automating, by a navigation unit, the movement of the machine in step (206). The method also includes providing a position-based navigation system. The method also includes adjusting the variable offset boom, which is rotatable and length-adjustable, for directing the camera to the preferred reference point. The method also includes providing a navigation sensor connected to a pickup coil arrangement to pick up a magnetic field used for navigation.
[0046] Furthermore, the method includes navigating, by a path definer of the navigation unit, the drivetrain corresponding to a defined path for lane marking, wherein the path definer is sensed by a camera in step (208).
[0047] Furthermore, the method includes marking, by a visual-based marking module of the navigation unit, a lane with a marking material with respect to a plurality of reference points on a surface, wherein the plurality of reference points includes a starting reference point and an ending reference point for the line marking in step (210).
[0048] Furthermore, the method includes receiving, by a user interface of the machine, commands from a user for the operation of the machine in step (212). The method also includes providing at least one of onboard keypads, a handheld human-machine interface, a plurality of buttons, a switch, a joystick, and a web application-based device as the user interface.
[0049] Furthermore, the method includes generating, by an end effector of the machine, the lane markings by using a contact-based applicator mechanism in step (214). The method also includes capturing, frame by frame, the images of striped lines or complete lines, which are fed into the visual-based marking module to formulate the line equation with respect to the position of the machine.
[0050] Furthermore, the method includes sensing, by a level sensor of the end effector of the machine, the quantity of the paint available at the end effector in step (216). The method also includes maintaining the temperature of the paint.
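The temperature maintenance mentioned here is not detailed in the specification; a simple hysteresis (bang-bang) controller, with an assumed setpoint typical for hot thermoplastic and hypothetical callback names, would look like:

```python
def maintain_paint_temperature(read_temp_c, heater_on, heater_off,
                               setpoint_c=190.0, band_c=5.0):
    """Hedged hysteresis sketch keeping the paint within a temperature band."""
    t = read_temp_c()
    if t < setpoint_c - band_c:
        heater_on()       # underheating degrades deposition quality
    elif t > setpoint_c + band_c:
        heater_off()      # overheating produces hazardous fumes
```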
[0051] Furthermore, the method includes actuating, by an actuator of the machine, a process of transferring the paint from a main chamber to the end effector for actuating the lane marking in step (218). The method also includes notifying the user about the unavailability of the paint at the end effector and the main chamber.
[0052] Furthermore, the method includes providing a camera mounted on a variable offset boom mechanically coupled to a machine frame, for capturing the image of a preferred reference point in an operative condition, wherein the variable offset boom aids the camera to be placed on a reference line to be tracked during the lane markings in step (220). The method also includes adjusting the variable offset boom, which is rotatable and length-adjustable, for directing the camera to the preferred reference point. The method also includes tracing a mark on the surface to enable the machine to paint a line at an offset distance from a previously painted line, and tracing an old marking on the surface to enable the machine to paint a new line over an old, faded line.
[0053] Furthermore, the method includes processing, by an image processing module of the visual-based marking module, at least one of an image and a video captured by the camera to generate a reference line with respect to the reference point for navigation and marking on the surface in step (222).
[0054] Furthermore, the method includes segmenting, by the image processing module of the visual-based marking module, an image to detect line markings on the surface, extracting white stripes from the segmentation mask, and filtering and fine-tuning the result to obtain a single line in step (224).
[0055] Furthermore, the method includes converting, by the image processing module of the visual-based marking module, the plurality of reference points in the generated reference line from two-dimensional coordinates to three-dimensional points using a deep learning neural network model in step (226).
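The specification calls for a deep-learning neural network model without giving its architecture, so the layer sizes, training pairing, and normalization in this sketch are all illustrative assumptions:

```python
import torch
import torch.nn as nn

class PointLifter(nn.Module):
    """Tiny regression network lifting 2-D pixel coordinates to 3-D points.

    Assumed training data would pair reference-line pixels with surveyed
    ground coordinates in the machine frame; none of this detail is from
    the specification.
    """

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, 3),                    # (x, y, z) in the machine frame
        )

    def forward(self, uv):                       # uv: (N, 2) normalized pixel coords
        return self.net(uv)

# Usage sketch: lift three detected reference-line points from one frame.
lifter = PointLifter()
pixels = torch.tensor([[0.42, 0.80], [0.45, 0.62], [0.47, 0.45]])
points_3d = lifter(pixels)                       # (N, 3) tensor of x, y, z estimates
```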
[0056] Furthermore, the method includes predicting, by the image processing module of the visual-based marking module, a trajectory on the surface, wherein the trajectory is predicted by joining the three-dimensional points, to obtain an orientation and a position of the machine in step (228). The method also includes predicting arrays of lines and lifting the lines from the two-dimensional coordinate system to the real-world coordinate system with respect to the machine.
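Joining the lifted points into a trajectory and reading off an orientation reduces, in the flat-surface case assumed for this sketch, to differencing consecutive ground-plane coordinates:

```python
import numpy as np

def trajectory_pose(points_3d):
    """Derive target positions and headings from the lifted 3-D points.

    The heading at each vertex is the direction to the next vertex (atan2 on
    the ground-plane components); z is ignored, which assumes a flat surface.
    """
    pts = np.asarray(points_3d, dtype=float)
    deltas = np.diff(pts[:, :2], axis=0)
    headings = np.arctan2(deltas[:, 1], deltas[:, 0])   # radians, one per segment
    return pts[:-1, :2], headings                       # positions and orientations
```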
[0057] Furthermore, the method includes updating, by the image processing module of the visual-based marking module, a plurality of line coordinates in real time and interpolating a start of painting and synchronizing the plurality of line coordinates with the reference line in step (230).
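Interpolating the start of painting can be sketched as projecting the nozzle position onto the most recently updated line coordinates; the single-segment projection below is an assumption for brevity:

```python
import numpy as np

def interpolate_paint_start(line_xy, nozzle_xy):
    """Project the nozzle onto the live reference line's first segment."""
    pts = np.asarray(line_xy, dtype=float)
    seg = pts[1] - pts[0]                                # first segment of the line
    t = np.dot(np.asarray(nozzle_xy, dtype=float) - pts[0], seg) / np.dot(seg, seg)
    t = np.clip(t, 0.0, 1.0)                             # stay within the segment
    return pts[0] + t * seg                              # start-of-paint coordinate
```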
[0058] Furthermore, the method includes utilizing modern computer vision techniques, such as image segmentation, to detect any kind of markings on the road. The method also includes extracting the segmentation mask of the white stripes, which are filtered based on the distance from the machine and fine-tuned to obtain a single line instead of thick stripes. The method also includes taking multiple points on the detected line and extending them based on the line equation they form.
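The line-equation step can be illustrated with a first-order fit; this sketch assumes the detected line is not vertical in the chosen frame:

```python
import numpy as np

def extend_detected_line(points_xy, x_ahead):
    """Fit y = m*x + c to detected points and evaluate it at positions ahead."""
    pts = np.asarray(points_xy, dtype=float)
    slope, intercept = np.polyfit(pts[:, 0], pts[:, 1], 1)
    x_ahead = np.asarray(x_ahead, dtype=float)
    return np.column_stack([x_ahead, slope * x_ahead + intercept])
```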
[0059] Various embodiments of the present disclosure enable effective line marking. The system disclosed in the present disclosure provides good ergonomics and easy handling of lane marking machines, which may also be handled manually. The system avoids heavy reliance on human skill and hand-eye coordination by automating the lane marking process. The lane marking system prevents the deposition of low-quality material caused by overheating or underheating of the thermoplastic paint. The temperature of the paint is maintained in the main chamber to avoid health hazards to the user.
[0060] Further, the lane marking system of the present disclosure provides high accuracy while marking on the surface. The system disclosed in the present disclosure is used for screeding, spraying, and extrusion of hot and cold thermoplastic or other materials. The system is also used for marking on surfaces such as roads, fields, parking lots, and the like.
[0061] While specific language has been used to describe the disclosure, any limitations arising on account of the same are not intended. As would be apparent to a person skilled in the art, various working modifications may be made to the method in order to implement the inventive concept as taught herein.
[0062] The figures and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment. For example, the order of processes described herein may be changed and are not limited to the manner described herein. Moreover, the actions of any flow diagram need not be implemented in the order shown; nor do all of the acts need to be necessarily performed. Also, those acts that are not dependent on other acts may be performed in parallel with the other acts. The scope of embodiments is by no means limited by these specific examples.
Claims:

1. A system (100) for lane marking on a surface comprises:
a machine (102) comprising:
a drivetrain (104) configured to enable the movement of the machine (102) on a surface;
a wire-laying unit (106) configured to lay a wire on the surface for the machine (102) to follow a pre-defined path;
a navigation unit (108) electronically coupled to the drivetrain (104), wherein the navigation unit (108) is configured to automate the movement of the machine (102);
a path definer (112) configured to navigate the drivetrain (104) corresponding to a defined path for lane marking, wherein the path definer (112) is sensed by a camera sensor, and wherein the path definer (112) is an external object;
a user interface (118) operatively coupled with the navigation unit (108) wherein the user interface (118) is configured to receive commands from a user for the operation of the machine (102);
an end effector (120) operatively coupled with the navigation unit (108) and the user interface (118),
wherein the end effector (120) is configured to generate the lane markings by using a contact-based applicator mechanism and a colour paint,
wherein the end effector (120) comprises a level sensor (138) for sensing the quantity of the paint available at the end effector (120) and notifying the user with an alarm if the paint falls below the required quantity, and
wherein an actuator actuates a process of transferring the paint from a main chamber (136) to the end effector (120) for actuating the lane marking;
characterized in that:
the camera (122) mounted on a variable offset boom (124) mechanically coupled to a machine frame, for capturing the image of a preferred reference point in an operative condition, wherein the variable offset boom (124) aids the camera (122) to be placed on a reference line to be tracked during the lane markings;
a visual-based marking module (116) operatively coupled with the path definer (112) and the camera (122), wherein the visual-based marking module (116) is configured to mark a lane with a marking material with respect to a plurality of reference points on a surface, and wherein the plurality of reference points includes a starting reference point and an ending reference point for the line marking; and
an image processing module (132) configured to:
process at least one of an image and a video captured by the camera (122) to generate a reference line with respect to the reference point for navigation and marking on the surface;
segment an image to detect line markings on the surface, extract white stripes from the segmentation mask, and filter and fine-tune the result to obtain a single line;
convert the plurality of reference points in the generated reference line from two-dimensional coordinates to three-dimensional points using a deep learning neural network model;
predict a trajectory on the surface, wherein the trajectory is predicted by joining the three-dimensional points, to get an orientation and a position of the machine (102); and
update a plurality of line coordinates in real time, interpolate a start of painting, and synchronize the plurality of line coordinates with the reference line.
2. The system (100) as claimed in claim 1, wherein the drivetrain (104) comprises one of a wheel (134) or track or bipedal-based arrangement for the movement of the machine (102) in an operative condition.
3. The system (100) as claimed in claim 1, wherein the path definer (112) comprises an inductive wire laid on the surface and is configured to generate a magnetic field upon charging.
4. The system (100) as claimed in claim 1, wherein the navigation unit (108) comprises a controller configured to control the navigation of the machine (102).
5. The system (100) as claimed in claim 1, wherein the path definer (112) follows a path with reference to a tape, a quick response follower, a reflector, a wireless system, light or acoustic-based navigation unit (108) for automatically detecting the position of the machine (102).
6. The system (100) as claimed in claim 1, wherein the user interface (118) is at least one of onboard keypads, a handheld human machine interface, a plurality of buttons, a switch, a joystick, and a web application-based device.
7. The system (100) as claimed in claim 1, wherein the captured image and video comprises at least one of a plurality of painted lines, a plurality of chalk markings, a rope, and a plurality of faded lines.
8. The system (100) as claimed in claim 1, wherein the variable offset boom (124) is configured as rotatable and length adjustable, for directing the camera (122) to the preferred reference point.
9. The system (100) as claimed in claim 1, wherein the camera (122) is used to trace a mark on the surface to enable the machine (102) to paint a line at an offset distance from a previously painted line and trace an old marking on the surface to enable the machine (102) to paint a new line over an old, faded line.
10. A method (200) for operating the system for lane marking on the surface comprises:
moving, a drivetrain of a machine on a surface; (202)
following, by a wire-laying unit, a pre-defined path by a wire laid on the surface; (204)
automating, by a navigation unit, the movement of the machine; (206)
navigating, by a path definer of the navigation unit, the drivetrain corresponding to a defined path for lane marking, wherein the path definer is sensed by a camera sensor; (208)
identifying, by a visual-based marking module of the navigation unit, a lane to be marked with a marking material with respect to a plurality of reference points on a surface, wherein the plurality of reference points includes a starting reference point and an ending reference point for the line marking; (210)
receiving, by a user interface of the machine, commands from a user for the operation of the machine; (212)
generating, by an end effector of the machine, the lane markings by using a contact-based applicator mechanism and a colour paint; (214)
sensing, by a level sensor of the end effector of the machine, the quantity of the paint available at the end effector; (216)
actuating, by an actuator of the machine, a process of transferring the paint from a main chamber to the end effector for actuating the lane marking; (218)
providing, a camera mounted on a variable offset boom mechanically coupled to a machine frame, for capturing the image of a preferred reference point in an operative condition, wherein the variable offset boom aids the camera to be placed on a reference line to be tracked during the lane markings; (220)
processing, by an image processing module of the visual-based marking module, at least one an image and a video captured by the camera to generate a reference line with respect to the reference point for navigation and marking on the surface; (222)
segmenting, by the image processing module of the visual-based marking module, an image to detect line markings on the surface, extract white stripes from the segmentation, mask, filter, and fine-tuned to get one single line; (224)
converting, by the image processing module of the visual-based marking module, the plurality of reference points in the generated reference line from two-dimensional coordinates to three-dimensional points using a deep learning neural network model; (226)
predicting, by the image processing module of the visual-based marking module, a trajectory on the surface, wherein the trajectory is predicted by joining the three-dimensional points, to get an orientation and a position of the machine; (228) and
updating, by the image processing module of the visual-based marking module, a plurality of line coordinates in real time, interpolating a start of painting, and synchronizing the plurality of line coordinates with the reference line. (230)
Dated this 13th day of July 2023

Signature

Jinsu Abraham
Patent Agent (IN/PA-3267)
Agent for the Applicant

Documents

Application Documents

# Name Date
1 202341047345-STATEMENT OF UNDERTAKING (FORM 3) [13-07-2023(online)].pdf 2023-07-13
2 202341047345-REQUEST FOR EARLY PUBLICATION(FORM-9) [13-07-2023(online)].pdf 2023-07-13
3 202341047345-PROOF OF RIGHT [13-07-2023(online)].pdf 2023-07-13
4 202341047345-POWER OF AUTHORITY [13-07-2023(online)].pdf 2023-07-13
5 202341047345-FORM-9 [13-07-2023(online)].pdf 2023-07-13
6 202341047345-FORM FOR STARTUP [13-07-2023(online)].pdf 2023-07-13
7 202341047345-FORM FOR SMALL ENTITY(FORM-28) [13-07-2023(online)].pdf 2023-07-13
8 202341047345-FORM 1 [13-07-2023(online)].pdf 2023-07-13
9 202341047345-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [13-07-2023(online)].pdf 2023-07-13
10 202341047345-EVIDENCE FOR REGISTRATION UNDER SSI [13-07-2023(online)].pdf 2023-07-13
11 202341047345-DRAWINGS [13-07-2023(online)].pdf 2023-07-13
12 202341047345-DECLARATION OF INVENTORSHIP (FORM 5) [13-07-2023(online)].pdf 2023-07-13
13 202341047345-COMPLETE SPECIFICATION [13-07-2023(online)].pdf 2023-07-13
14 202341047345-STARTUP [14-07-2023(online)].pdf 2023-07-14
15 202341047345-FORM28 [14-07-2023(online)].pdf 2023-07-14
16 202341047345-FORM 18A [14-07-2023(online)].pdf 2023-07-14
17 202341047345-FORM-26 [09-08-2023(online)].pdf 2023-08-09
18 202341047345-FER.pdf 2023-11-15
19 202341047345-OTHERS [07-05-2024(online)].pdf 2024-05-07
20 202341047345-FORM 3 [07-05-2024(online)].pdf 2024-05-07
21 202341047345-FER_SER_REPLY [07-05-2024(online)].pdf 2024-05-07
22 202341047345-US(14)-HearingNotice-(HearingDate-12-06-2024).pdf 2024-05-24
23 202341047345-FORM-26 [07-06-2024(online)].pdf 2024-06-07
24 202341047345-Correspondence to notify the Controller [07-06-2024(online)].pdf 2024-06-07
25 202341047345-Written submissions and relevant documents [25-06-2024(online)].pdf 2024-06-25
26 202341047345-ENDORSEMENT BY INVENTORS [25-06-2024(online)].pdf 2024-06-25
27 202341047345-PatentCertificate28-06-2024.pdf 2024-06-28
28 202341047345-IntimationOfGrant28-06-2024.pdf 2024-06-28
29 202341047345-RENEWAL OF PATENTS [12-06-2025(online)].pdf 2025-06-12
30 202341047345-RELEVANT DOCUMENTS [12-06-2025(online)].pdf 2025-06-12
31 202341047345-FORM 28 [12-06-2025(online)].pdf 2025-06-12

Search Strategy

1 searchstrategy_202341047345E_20-09-2023.pdf

ERegister / Renewals

3rd: 12 Jun 2025 (from 13/07/2025 to 13/07/2026)