Abstract: An automatic fog removing system for vehicles includes a first artificial intelligence based camera 1 installed within a vehicle 3 to authenticate the user, track the user's eye position and store the data within a microcontroller, a computing unit connected with the microcontroller to enter information about the driver, a fog detection module 4 that detects the presence of fog ahead of the vehicle 3, an aerial unit 5 having multiple propellers 6 and sensors, wherein the aerial unit comprises a processing unit that connects with the microcontroller to fetch the signal received from the fog detection module 4, a fog removing unit 7 activated by the processing unit upon receiving the signal from the microcontroller, which actuates the aerial unit 5 to fly towards the fog and remove it based upon the eyesight and driving skill of the user, and a second artificial intelligence based camera 8 mounted over the aerial unit 5 that aids in providing a pathway to the vehicle 3.
FIELD OF THE INVENTION
[0001] The present invention relates to an automatic fog removing system for vehicles that is equipped with a drone based fog removal means to clear fog particles in the path of the vehicle by using dry air, providing better situational awareness to the driver of the vehicle in order to assist in driving the vehicle.
BACKGROUND OF THE INVENTION
[0002] Vehicles have become a day-to-day need in the modern lifestyle, as people tend to prefer personal vehicles over public transportation. One of the most common causes of traffic accidents is bad weather, and the resulting accidents can be severe. Fog and mist have the greatest impact among adverse weather conditions, frequently resulting in freeway speed limits or closures, delayed operating times, or even chain collisions. Such accidents result in significant financial losses. Due to the severity of haze (dry) or mist (wet) in recent years, and especially when visibility is reduced to practically zero (visibility below 100 metres is often regarded as zero), driving has become unusually dangerous.
[0003] Fog is a natural weather condition that can reduce visibility to nearly zero and can cause accidents on normally safe roads. Existing devices and methods include the use of visibility meters or fog sensors to ensure traffic safety on highways, especially in mist. However, visibility meters are very expensive, and detection requires a dense arrangement of monitoring devices, so the cost is high while real-time performance and portability are poor.
[0004] CN106548463B discloses a sea fog image automatic defogging method and system based on the dark channel prior and Retinex, belonging to the technical field of image information processing. The method includes the following steps: obtaining the dark channel image of an input picture, determining the proportion of pixels with low dark channel values, determining the brightness and contrast features of the input picture, automatically classifying the image according to the obtained proportion and features, and processing the image according to its classification. The document also discloses a sea fog image automatic defogging system based on the dark channel prior and Retinex. The method can classify an image according to its attributes and adaptively select a corresponding processing method, greatly improving the contrast and clarity of foggy marine images with low algorithm complexity and a fast operation speed, so that it can be applied to maritime intelligent traffic systems.
[0005] US9360556B2 discloses methods and systems for detecting weather conditions, including fog, using vehicle onboard sensors. An example method includes receiving laser data collected from scans of an environment of a vehicle, and associating, by a computing device, laser data points with one or more objects in the environment. The method also includes comparing laser data points that are unassociated with the one or more objects in the environment with stored laser data points representative of a pattern due to fog and, based on the comparison, identifying by the computing device an indication that a weather condition of the environment of the vehicle includes fog.
[0006] Existing fog removal systems are either installed over roads or need to be installed on a specific platform to perform the fog removing task by exhausting dry air into the atmosphere. Existing techniques fail to solve the problems faced by drivers while driving on roads in foggy weather conditions.
[0007] In order to overcome the aforementioned drawbacks, there is a need for an automatic fog removing system which can remove fog particles in the path of a vehicle to provide clear sight to the driver of the vehicle.
OBJECTS OF THE INVENTION
[0008] The principal object of the present invention is to overcome the disadvantages of the prior art.
[0009] An object of the present invention is to remove or eliminate fog particles in the path of a vehicle to ensure clear vision for the driver of the vehicle.
[0010] Another object of the present invention is to avoid accidents caused by unclear vision, by providing better situational awareness of nearby objects or vehicles in foggy weather conditions.
[0011] Another object of the present invention is to assist the driver of the vehicle to drive on roads in foggy weather conditions.
[0012] Yet another object of the present invention is to provide a vehicle based portable fog removing system.
[0013] The foregoing and other objects, features, and advantages of the present invention will become readily apparent upon further review of the following detailed description of the preferred embodiment as illustrated in the accompanying drawings.
SUMMARY OF THE INVENTION
[0014] The present invention relates to an automatic fog removing system for vehicles that is configured with an aerial unit by which the system first detects and clears fog in the vicinity of the vehicle in real time and simultaneously monitors the driving skill of the driver in order to provide a safer driving experience for the user.
[0015] According to an embodiment of the present invention, the automatic fog removing system for vehicles includes: a first artificial intelligence based camera installed on the dashboard of a vehicle, the camera configured to authenticate the user, track the eye position of the user along with the driving skill, and store the data within a microcontroller; a computing unit operated by the user and wirelessly connected to the microcontroller via a communication module, the unit providing options for the user to enter information including name, license and photo and save it into a server; a fog detection module integrated with the vehicle and linked to the microcontroller, the module detecting the presence of fog ahead of the vehicle and, upon detecting the fog, sending a signal to the microcontroller; an aerial unit installed over the vehicle and fitted with a body, multiple propellers and sensors, wherein the aerial unit comprises a processing unit integrated within the aerial unit that connects with the microcontroller with the help of a Bluetooth module to fetch the signal received from the fog detection module, a fog removing unit fitted with the aerial unit, wherein the processing unit actuates the aerial unit to fly towards the fog with the aid of the propellers and sensors to remove fog particles in the path of the vehicle based upon the eyesight and driving skill of the user, and a second artificial intelligence based camera fitted over the aerial unit that aids in providing the pathway along with the presence of fog to the aerial unit.
[0016] While the invention has been described and shown with particular reference to the preferred embodiment, it will be apparent that variations might be possible that would fall within the scope of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
[0017] These and other features, aspects, and advantages of the present invention
will become better understood with regard to the following description, appended
claims, and accompanying drawings where:
Figure 1 illustrates a perspective view of a vehicle's cabin;
Figure 2 illustrates an isometric view of an aerial unit in deployed state; and
Figure 3 illustrates a side view of the vehicle with installed aerial unit.
DETAILED DESCRIPTION OF THE INVENTION
[0018] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood, that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention as defined in the claims.
[0019] In any embodiment described herein, the open-ended terms "comprising," "comprises," and the like (which are synonymous with "including," "having" and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of," "consists essentially of," and the like, or the respective closed phrases "consisting of," "consists of," and the like.
[0020] As used herein, the singular forms "a," "an," and "the" designate both the singular and the plural, unless expressly stated to designate the singular only.
[0021] The present invention relates to an automatic fog removing system for vehicles that employs a drone which can be attached to and detached from the vehicle to eliminate fog particles present in the path of the vehicle by determining the real time position of the vehicle, thereby providing clear sight and assisting the driver through audio alerts to avoid collisions with other vehicles or objects.
[0022] Referring to Figure 1, a perspective view of the vehicle's cabin is illustrated. The system includes a first Artificial Intelligence (AI) based camera 1 installed within a user's vehicle 3. The artificial intelligence based camera 1 includes, but is not limited to, a processor and a camera lens. The camera 1 is integrated with the processor for processing images.
[0023] The first camera 1 captures images of the user seated inside the vehicle's cabin and performs facial recognition on the user's facial image. The processor is configured with facial recognition protocols (machine readable instructions) and object classification protocols (machine readable instructions). At first, the processor is configured to convert the coloured image into a greyscale image and further remove noise from the greyscale image. The processor marks different portions of the facial image and compares regions of the user's face with pre-stored images in order to match and verify the facial recognition; upon matching the regions, the user is considered an authorized person. The camera 1 is also configured to track the eye movement and position of the user based on captured images while driving, in order to analyse the driving skill of the user.
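By way of a non-limiting illustration, the following sketch outlines the greyscale conversion, noise removal and comparison flow described above using the OpenCV library; the file handling, cascade detector and match threshold are assumptions made for illustration only and do not form part of the claimed system.

```python
# Illustrative sketch of the authentication flow described above, using OpenCV.
# The cascade detector and match threshold are assumptions, not part of the specification.
import cv2

def authenticate_driver(frame_bgr, stored_gray, match_threshold=0.7):
    """Return True if the captured frame matches the pre-stored driver image."""
    # Convert the coloured camera frame to greyscale, as described above.
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # Remove noise from the greyscale image before comparison.
    gray = cv2.GaussianBlur(gray, (5, 5), 0)
    # Locate the face region with a standard Haar cascade detector.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return False
    x, y, w, h = faces[0]
    face = cv2.resize(gray[y:y + h, x:x + w], stored_gray.shape[::-1])
    # Compare the detected face region against the pre-stored image.
    score = cv2.matchTemplate(face, stored_gray, cv2.TM_CCOEFF_NORMED).max()
    return score >= match_threshold
```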
[0024] The pre-stored images are saved into a server based database. The server is connected with a microcontroller via a communication module. The microcontroller is selected from, but not limited to, an Advanced RISC Machine (ARM), Alf and Vegard RISC (AVR), Snapdragon processor, and Mixed Signal Processor (MSP). The microcontroller includes a memory. The memory may be a Random Access Memory (RAM), a Read Only Memory (ROM), or both. The memory may include various instructions. The instructions may comprise code from any suitable computer-programming language, including but not limited to C, C++, C#, Visual Basic, Java, Python, Perl, and JavaScript.
[0025] The communication module is selected from, but not limited to, a Wireless Fidelity (Wi-Fi) module, a Bluetooth module, and a Global System for Mobile Communications (GSM) module. In one case, the module is the Wi-Fi module, and the Wi-Fi module is configured to support Wi-Fi versions including, but not limited to, 802.11a, 802.11b, 802.11c, 802.11g, 802.11n, 802.11ac, 802.11ax, 802.11h, and 802.11i. In a second case, the module is preferably the Bluetooth modem, configured to support Bluetooth versions including, but not limited to, 4, 4.1, 4.2, 5, 5.1 and 5.2. In a third case, the module is a Global System for Mobile Communications (GSM) module.
[0026] In an embodiment, a computing unit is connected with the microcontroller over a network via the communication module. The computing unit may be a smartphone, a Personal Computer (PC), or a laptop. The computing unit is adapted with a user interface that allows the user to enter information, including but not limited to the name, license number and photo(s)/image(s) of the user, and to save it into the server. It is to be noted that the user needs to enter the information manually. The user interface is preferably a Graphical User Interface (GUI). The user can also enter eyesight data into the server.
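By way of a non-limiting illustration, the sketch below shows how the computing unit might transmit the manually entered profile to the server; the endpoint URL and field names are assumptions made for illustration only.

```python
# Illustrative sketch: the computing unit posts the driver's profile to the server.
# The endpoint URL and field names are assumptions for illustration only.
import requests

def save_driver_profile(name, license_number, photo_path, eyesight,
                        server_url="http://example.com/api/drivers"):
    """Upload the driver's name, license number, photo and eyesight data."""
    with open(photo_path, "rb") as photo:
        response = requests.post(
            server_url,
            data={"name": name,
                  "license": license_number,
                  "eyesight": eyesight},
            files={"photo": photo},
        )
    response.raise_for_status()
    return response.json()
```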
[0027] The GUI is configured to support different operating systems. Those of skill in the art will recognize that suitable computer operating systems include, by way of non-limiting examples, Microsoft® Windows®, Apple® Mac OS X®, UNIX®, and UNIX-like operating systems such as GNU/Linux®.
[0028] Referring to Figure 2, an isometric view of an aerial unit in a deployed state is illustrated. A fog detection module 4 is integrated with the vehicle 3 and linked to the microcontroller. The fog detection module 4 is preferably installed on the bonnet of the vehicle 3. The fog detection module includes, but is not limited to, a transmitter and a receiver. The fog detection module 4 is sensitive to fog particles in a zone approximately ahead of the module's placement, to detect limited visibility in that area. Fog particles are typically the microscopic water particles constituting fog, but may also include snowflakes, raindrops, and dust or air pollutants. The transmitter is programmed to transmit a beam of modulated infrared light for a pre-defined time limit; in one case the time limit is 30 seconds. The receiver is configured to measure the amount of light scattered by the atmosphere to the receiver in the instrument and is capable of detecting fog in the range of 20 m to 4 km. The fog detection module 4 is configured to detect the presence and level of fog around and before the vehicle 3 and, upon detecting the fog, sends a signal to the microcontroller.
[0029] It is to be noted that fog consists of liquid water droplets suspended in the air near the earth's surface. The fog detection module 4 is configured to measure scattering in the forward direction, commonly at angles of 42° or 45°. The transmitter and receiver of the fog detector are offset from one another to create a sensing volume where their two fields of view overlap. When fog droplets are present in the sensing volume, light is scattered toward the receiver, which generates a signal that is converted to a visual range.
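By way of a non-limiting illustration, the sketch below shows how the receiver signal of such a forward-scatter arrangement might be converted to a visual range and compared against a fog threshold; the calibration constant and the threshold value are assumptions made for illustration only.

```python
# Illustrative conversion of the forward-scatter receiver signal to a visual range.
# The calibration constant and fog threshold are assumptions for illustration.
import math

def visual_range_m(scatter_signal, calibration=0.05):
    """Estimate the meteorological optical range from a forward-scatter signal."""
    # Scattered intensity is taken as proportional to the extinction coefficient.
    extinction = calibration * scatter_signal          # per metre
    if extinction <= 0:
        return float("inf")
    # Koschmieder relation at the 5 % contrast threshold: MOR = -ln(0.05) / sigma.
    return -math.log(0.05) / extinction

def fog_detected(scatter_signal, visibility_threshold_m=1000.0):
    """Signal the microcontroller when visibility drops below the fog threshold."""
    return visual_range_m(scatter_signal) < visibility_threshold_m
```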
[0030] The vehicle 3 is integrated with a primary ultrasonic sensor that measures the distance between the vehicle 3 and an object in front of the vehicle 3 and sends the derived data to the microcontroller.
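By way of a non-limiting illustration, the distance reported by the primary ultrasonic sensor follows from the echo time of flight, as sketched below; hardware pin handling is omitted and the speed of sound is taken at approximately 20 °C.

```python
# Illustrative time-of-flight calculation for the primary ultrasonic sensor.
# Hardware pin handling is omitted; the speed of sound is taken at roughly 20 degC.
SPEED_OF_SOUND_M_S = 343.0

def distance_to_object_m(echo_round_trip_s):
    """Distance to the object ahead, from the echo round-trip time in seconds."""
    # The pulse travels to the object and back, so halve the round-trip distance.
    return SPEED_OF_SOUND_M_S * echo_round_trip_s / 2.0
```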
[0031] Referring to Figure 3, a side view of the vehicle with the installed aerial unit is illustrated. An aerial unit 5 is fitted over the vehicle 3 within a body housed on the rooftop of the vehicle 3. The aerial unit 5 is adapted to remove fog (or smog) from the air.
[0032] The aerial unit 5 is employed with multiple propellers 6. The propellers 6 are connected to the frame of the aerial unit 5 via connecting members. The aerial unit 5 may have a quadcopter structure, with four connecting members connected to the main frame of the unit 5. A propeller is coupled with the shaft of a motor at the distal end of each connecting member. The motor's shaft provides rotational motion to the propellers 6, which allows the aerial unit 5 to hover in the air and fly in different directions. The aerial unit 5 is also employed with sensors, including multiple secondary ultrasonic sensors installed over the aerial unit 5 to allow it to manoeuvre in the air by dodging obstructions.
[0033] The aerial unit 5 is employed with a processing unit fitted over it. The frame of the aerial unit 5 is the main structure, or skeleton, upon which the rest of the components are attached. The processing unit is preferably fitted over a Printed Circuit Board (PCB), and the PCB is fitted over the frame. The processing unit includes, but is not limited to, a processor, a memory module, and one or more rechargeable batteries. The aerial unit 5 is equipped with multiple sensors, namely the multiple secondary ultrasonic sensors, which assist the aerial unit 5 in flying by determining the position of nearby objects and avoiding collisions with them.
[0034] The processing unit connects with the microcontroller with the help of a Bluetooth module in order to fetch the signal received from the fog detection module 4. The fog removing unit 7 is arranged on the aerial unit 5 and is configured to remove fog present in the way of the vehicle 3. The fog removing unit 7 includes a heating unit, a blowing unit and an outlet. The heating unit uses a heating element to generate dry air, which reduces the relative humidity of the atmospheric air and thereby removes the fog. The heating unit is configured to transfer the generated dry air to the blowing unit via a connection unit through a heat transfer blowing part. The heating unit may be a fuel-fired heating unit or an electrical heating unit. The blowing unit may include a blowing housing that receives the dry air generated by the heating unit via the connection unit and blows the received dry air with a strong blowing pressure, and a blowing driver that is formed on a rear end portion of the blowing housing and generates the strong blowing pressure through rotation of a propeller by a motor acting on the dry air received from the heating unit. The blowing unit is connected to the outlet, which exhausts the hot air to eliminate fog particles in the atmospheric air present in the path of the vehicle 3. Upon removing the fog, the aerial unit 5 returns to the vehicle 3 and is automatically installed over the vehicle 3 via a carrier 9.
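By way of a non-limiting illustration, the defogging principle above relies on heated (dry) air lowering the relative humidity of the surrounding air; the sketch below uses the standard Magnus approximation to show how relative humidity falls as the blown air is heated, with the temperatures chosen purely for illustration.

```python
# Illustrative check of the defogging principle: heating air lowers its relative
# humidity because the saturation vapour pressure rises with temperature.
import math

def saturation_vapour_pressure_hpa(temp_c):
    """Magnus approximation for saturation vapour pressure over water (hPa)."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def relative_humidity_after_heating(rh_percent, temp_in_c, temp_out_c):
    """Relative humidity of the blown air after the heating unit raises its
    temperature, assuming the absolute moisture content stays the same."""
    ratio = (saturation_vapour_pressure_hpa(temp_in_c) /
             saturation_vapour_pressure_hpa(temp_out_c))
    return rh_percent * ratio

# Example: saturated roadside air at 5 degC heated to 25 degC drops to roughly
# 28 % RH, well below saturation, so the exhausted air can absorb fog droplets.
print(round(relative_humidity_after_heating(100.0, 5.0, 25.0), 1))
```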
[0035] A second Artificial Intelligence (AI) based camera 8 is mounted over the aerial unit 5 and aids in providing the pathway, along with the presence of fog, to the aerial unit 5. The second camera 8 includes, but is not limited to, a processor and a camera lens. The second camera 8 is configured to monitor the speed and movement of the vehicle 3 by performing object classification over captured images or videos and identifying the vehicle 3 in the images.
[0036] The second camera 8 transmits the recorded data about the speed and movement of the vehicle 3, and based on this data the microcontroller is configured to analyse the driving performance of the user and provide reward points accordingly. The aerial unit 5 remains in a power sleep mode and activates only after receiving a signal from the fog detection module. The processing unit receives the location of the vehicle 3 from the microcontroller in order to allow the aerial unit 5 to hover in proximity to the vehicle 3.
[0037] The system awards points to the driver based on driving skill efficiency, e.g. avoiding collisions and rash driving. The system tracks the user's performance and updates it in the database. The processing unit sends updates to a display screen 2 via the microcontroller about approaching potholes and weather conditions detected by the second camera 8. The display screen 2 may be fitted over the dashboard of the vehicle 3 so as to provide alerts or notifications to the user/driver of the vehicle 3.
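By way of a non-limiting illustration, the sketch below shows one possible way the microcontroller could award or deduct points from the speed and movement data reported by the second camera 8; the event names, point values and speed limit are assumptions made for illustration only.

```python
# Illustrative scoring of driving performance from the second camera's reports.
# Event names, point values and the speed limit are assumptions for illustration.
def update_reward_points(points, speed_kmh, collision_avoided, harsh_braking,
                         speed_limit_kmh=80):
    """Return the updated reward-point total for the driver."""
    if collision_avoided:
        points += 10          # reward for avoiding a collision
    if harsh_braking:
        points -= 5           # penalise rash driving
    if speed_kmh > speed_limit_kmh:
        points -= 2           # penalise overspeeding in low visibility
    return max(points, 0)
```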
[0038] A brief working of the present invention is explained as follows. Once the driver (or user) enters the vehicle 3, the first AI camera 1 recognizes the user based upon the profile stored in the server. Once the fog detection module 4 detects fog (or smog), the microcontroller transmits a signal to the processing unit of the aerial unit 5 via the communication module to activate the aerial unit 5 in real time. The aerial unit 5 takes off from the roof and removes the fog up to a distance from the vehicle 3 determined by the eyesight and driving skill of the driver (or user), so as to provide clear visibility for driving in real time. The aerial unit 5 automatically adjusts its position in real time so that it maintains clear vision for the driver (or user). Simultaneously, the microcontroller alerts and guides the driver (or user) through the display screen 2, based upon the eyesight and driving skill of the user and the data from the aerial unit 5.
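By way of a non-limiting illustration, the working described above may be summarised as the following high-level control loop; all function names are placeholders for the modules of this specification and are not limiting.

```python
# High-level sketch of the working described above. All functions are placeholders
# for the modules of this specification (camera, fog detector, aerial unit, screen).
import time

def fog_removal_loop(camera, fog_detector, aerial_unit, screen, period_s=1.0):
    driver = camera.authenticate_driver()           # first AI camera (1)
    while True:
        if fog_detector.fog_present():              # fog detection module (4)
            aerial_unit.wake_up()                   # leave power-sleep mode
            aerial_unit.take_off()
            # Clearing distance scales with the driver's eyesight and skill.
            aerial_unit.clear_fog(distance_m=driver.eyesight_range_m)
            screen.show_alert("Fog ahead - aerial unit deployed")
            aerial_unit.return_to_carrier()         # dock on carrier (9)
        time.sleep(period_s)
```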
[0039] Although the field of the invention has been described herein with limited reference to specific embodiments, this description is not meant to be construed in a limiting sense. Various modifications of the disclosed embodiments, as well as alternate embodiments of the invention, will become apparent to persons skilled in the art upon reference to the description of the invention.
We Claim:
1) An automatic fog removing system for vehicles, comprising:
i) a first artificial intelligence based camera 1 installed within a user's vehicle 3, wherein said camera 1 captures multiple images of said user for authentication, tracks the eye position of said user while driving to analyse driving skill, and stores the data within a microcontroller;
ii) a computing unit operated by said user and wirelessly paired to said microcontroller via a communication module, wherein said unit allows said user to enter information including name, license and photo and save it into a server;
iii) a fog detection module 4 integrated with said vehicle 3 and linked to said microcontroller, wherein said module 4 detects presence of fog ahead of said vehicle 3 and upon detecting said fog, sends a signal over said microcontroller;
iv) an aerial unit 5 installed over said vehicle 3 and fitted with a body having a plurality of propellers 6 and sensors, wherein said aerial unit comprises:
a) a processing unit integrated within said aerial unit 5 that connects with said microcontroller with help of a Bluetooth module in order to fetch signal received from said fog detection module 4;
b) a fog removing unit 7 fitted with said aerial unit 5, wherein upon receiving the signal from said microcontroller, said processing unit actuates said aerial unit 5 to fly towards said fog with the aid of said propellers 6 and sensors to remove fog from the direction ahead, based upon the eyesight and driving skill of said user; and
c) a second artificial intelligence based camera 8 mounted over said aerial unit 5 that aids in providing the pathway along with the presence of fog to said aerial unit 5, wherein said camera 8 monitors and sends data regarding the movement and speed of said vehicle 3 to said microcontroller, based upon which said microcontroller analyses the driving performance of said user and provides reward points accordingly.
2) The system as claimed in claim 1, wherein said processing unit receives location of said vehicle 3 from said microcontroller in order to allow said aerial unit 5 to hover in proximity to said vehicle 3.
3) The system as claimed in claim 1, wherein said aerial unit 5 remains in power sleep mode until said processing unit receives a signal from said microcontroller.
4) The system as claimed in claim 1, wherein said vehicle 3 is also integrated with a primary ultrasonic sensor that measures the distance between said vehicle 3 and an object in front of said vehicle 3 and sends the derived data to said microcontroller.
5) The system as claimed in claim 1, wherein upon removing said fog, said aerial unit 5 returns and gets automatically installed over said vehicle 3 via a carrier 9.
6) The system as claimed in claim 1, wherein said processing unit sends updates to a display screen 2 via said microcontroller about approaching potholes, weather condition detected by said second camera 8.
7) The system as claimed in claim 1, wherein said sensors include a plurality of secondary ultrasonic sensors installed over said aerial unit 5 to allow manoeuvring in the air by dodging obstructions.
8) The system as claimed in claim 1, wherein said fog removing unit 7 includes, but is not limited to, a heating unit, a blowing unit and an outlet.