
Automatic Intensity Control of an Illumination Apparatus in an Automobile

Abstract: A system and method for automatic maneuvering of at least one illumination apparatus is described. The invention discloses a system and method wherein the variation in intensity is commensurate with at least one parameter, or a combination of parameters, associated with a vehicle. Input parameters such as distance, speed, weather, road type and the driver's action are provided by sensors and detectors for processing by the controller. The controller then calculates an output intensity value by applying one or more control rules by means of its one or more modules and/or components. Fig. 1


Patent Information

Application #
Filing Date
24 November 2011
Publication Number
26/2013
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
ip@legasis.in
Parent Application

Applicants

TATA CONSULTANCY SERVICES LIMITED
NIRMAL BUILDING, 9TH FLOOR, NARIMAN POINT, MUMBAI 400021, MAHARASHTRA, INDIA.

Inventors

1. B KUMAR, KRISHNA
TATA CONSULTANCY SERVICES LIMITED, 69/2 JAL BUILDING, SALARPURIA, GR TECH PARK, MAHADEVAPURA, WHITEFIELD ROAD, BANGALORE-560066, INDIA.
2. SHAW KUMAR, PRABODH
TATA CONSULTANCY SERVICES LIMITED, BENGAL INTELLIGENT PARK BLDG,"D" PLOT-A2,M2,N2 BLOCK-GP,SECTOR-V, SALT LAKE KOLKATA-700009, INDIA.

Specification

FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003
COMPLETE SPECIFICATION
(See Section 10 and Rule 13)
Title of invention:
AUTOMATIC INTENSITY CONTROL OF AN ILLUMINATION APPARATUS IN AN AUTOMOBILE
Applicant
TATA Consultancy Services Limited, a company incorporated in India under The Companies Act, 1956
Having address:
Nirmal Building, 9th Floor,
Nariman Point, Mumbai 400021,
Maharashtra, India
The following specification particularly describes the invention and the manner in which it is to be performed.

FIELD OF THE INVENTION
The present invention relates to the field of vehicle safety control by controlling vehicle safety devices. More particularly, the invention relates to a system and method for controlling the intensity of an illumination apparatus of a vehicle.
BACKGROUND OF THE INVENTION
Dependency on automobiles has increased drastically because they provide a faster mode of transport. Roads are now flooded with vehicles, pushing traffic to ever higher levels. Road vehicles have also become advanced enough to serve long-distance journeys. With this increase in road transport, the probability of road accidents has also increased, and night-time accidents have grown far more than daytime accidents. One of the prime reasons for this increase is head-light control, especially in heavy vehicles where instant control is very difficult to achieve. Another important reason is the driver's action at the time of driving the vehicle, along with his pattern of driving.
Many solutions have been proposed to address the above-mentioned problem. All of these systems focus on gathering surrounding information by means of radar, ultrasonic devices or cameras in order to control the movement of the vehicle so that the probability of an accident may be reduced. Vehicles these days are also provided with control systems and sensor assemblies adapted to sense parameters which may affect the vision of a driver while driving the vehicle. Such systems process the sensed parameters and provide an intensity value which may be displayed on the dashboard. Using this intensity value, the driver may control the head light by setting its intensity manually.

Other systems provide sensors which communicate the traffic situation to a microprocessor, which then activates a preferred forward light distribution. Such a system also provides an apparatus for controlling the rotation of the head lights so that they can be rotated with respect to the speed and direction of an oncoming vehicle. However, none of these conventional systems provides automatic control of head-light intensity with respect to surrounding parameters. These systems also remain silent on taking the driver's action into consideration for controlling the movement or the head-light intensity of the vehicle.
Therefore, there is a need for a system and method capable of providing automatic control over head-light intensity. The system should also be capable of responding to the action of the driver driving the vehicle and of adapting to a particular driving style.
OBJECTS OF THE INVENTION
It is the primary object of the invention to provide a system and method for automatic control of intensity of an illumination apparatus.
It is another object of the invention to provide a detector for detecting driver's action and driving pattern while driving a vehicle.
It is another object of the invention to provide one or more sensors for sensing one or more parameters of the surrounding affecting the vision of the driver.
It is yet another object of the invention to provide an output intensity value by applying one or more control rules.
It is yet another object of the invention to select the most suitable rule out of one or more overlapping rules.

SUMMARY OF THE INVENTION
The present invention relates to a system for automatic maneuvering of the intensity of at least one illumination apparatus such that the variation in intensity is commensurate with at least one parameter, or a combination of parameters, associated with a vehicle. The system comprises one or more sensors for sensing one or more parameters associated with the vehicle and its surroundings, a detector for detecting a value associated with the action of a driver driving the vehicle, and a controller configured to process an input matrix to instantaneously generate an illumination descriptor value. The controller further comprises a classifier configured to detect and classify at least one vehicle encountered while driving; an artificial intelligent module configured to handle the plurality of parameters and the detected value and derive at least one control rule applicable to the said combination of parameters and detected value; a comparator for comparing the classified parameters and detected value with a set of predetermined control rules stored in a storage medium, the controller further applying one or more control rules thereto; and an output generating means for calculating a numerical value of the intensity of the illumination apparatus with respect to the illumination descriptor value based on the applied control rule.
A method for automatic maneuvering of at least one illumination apparatus such that the variation in intensity is commensurate with at least one parameter, or a combination of parameters, associated with a vehicle is also described. The method comprises the processor-implemented steps of sensing one or more parameters associated with the vehicle and its surroundings, detecting a value associated with the action of a driver driving the vehicle, and controlling the illumination by processing an input matrix to instantaneously generate an illumination descriptor value. The controlling further comprises detecting and classifying at least one vehicle encountered while driving; deriving at least one control rule applicable to the said combination of parameters and detected value; comparing the said combination of classified parameters and detected value with a set of predefined control rules and applying one or more control rules; and generating an output by calculating a numerical value of the intensity of the illumination apparatus with respect to the illumination descriptor value based on the applied control rule.
DETAILED DESCRIPTION OF THE INVENTION
Some embodiments of this invention, illustrating its features, will now be discussed:
The words "comprising", "having", "containing", and "including", and other forms thereof, are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items.
It must also be noted that as used herein and in the appended claims, the singular forms "a", "an", and "the" include plural references unless the context clearly dictates otherwise. Although any systems, methods, apparatuses, and devices similar or equivalent to those described herein can be used in the practice or testing of embodiments of the present invention, the preferred systems and parts are now described. In the following description, for the purpose of explanation and understanding, reference has been made to numerous embodiments, with no intent to limit the scope of the invention.
One or more components of the invention are described as modules for the understanding of the specification. For example, a module may include a self-contained component in a hardware circuit comprising logic gates, semiconductor devices, integrated circuits or any other discrete components. A module may also be a part of any software programme executed by any hardware entity, for example a processor. The implementation of a module as a software programme may include a set of logical instructions to be executed by the processor or any other hardware entity. Further, a module may be incorporated with the set of instructions or a programme by means of an interface.
The disclosed embodiments are merely exemplary of the invention, which may be
embodied in various forms.
The present invention relates to a system and method for automatic maneuvering of the intensity of at least one illumination apparatus. The present invention also relates to a system and method for automatic maneuvering of intensity where the variation in intensity is commensurate with at least one parameter, or a combination of parameters, associated with a vehicle.
In accordance with an embodiment of the invention, referring to figure 1, the system (100) for automatic maneuvering of the intensity of at least one illumination apparatus comprises one or more sensors (102), a detector (104) and a controller (106). The controller (106) further comprises a classifier (108), an artificial intelligent module (110), a comparator (112) and an output generating means (116). The comparator (112) further comprises an evaluation module (114). The output generating means (116) further comprises a fuzzifier (118), an inference engine (120) and a defuzzifier (122).
Still referring to figure 1, the one or more sensors (102) present in the vehicle sense the light signal coming from an oncoming vehicle. The sensor (102) further comprises a bi-convex lens and a photodiode. The sensors (102) include, but are not limited to, an ultrasonic sensor, a humidity sensor, a vibration sensor or a combination thereof. The ultrasonic sensor is adapted to measure the distance between the vehicle and the oncoming vehicle, i.e. whether the oncoming vehicle is very near, near, medium, far or very far from the vehicle. The humidity sensor is adapted to measure the weather condition while driving the vehicle; it senses whether the weather is normal, rainy or cloudy. The vibration sensor is adapted to measure the type of road, i.e. whether the road has bends, is terrain, a highway, has blind turns, etc. Another sensor senses the speed, i.e. whether it is low, medium or high.
The detector (104) present in the vehicle captures an action of a driver driving the vehicle. The driver action is considered normal and is represented with a binary value of zero (0) or one (1). The action of the driver is an important parameter for carrying out the vehicle movements. A driver action of zero (0) specifies that the illumination apparatus control is completely automatic, and vice versa. The process of applying brakes and resuming speed varies from driver to driver. The detector (104) captures the driver's action with respect to each new driver and decides the corresponding binary value for the same. Other important parameters are the driving style and driving pattern of each driver. The driving style refers to braking practice, whereas the driving pattern refers to maintaining speed, changing gears, etc.
In accordance with an embodiment, the illumination apparatus of the vehicle includes a head light.
The inputs captured by all the sensors and the detector are supplied to the controller (106) as input parameters for further processing. The controller (106) derives an input matrix out of these input parameters to instantaneously generate an illumination descriptor value. The illumination descriptor value provides a threshold for the output intensity, ranging from a minimum value to a maximum value.
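Purely as an illustration, the input matrix described above can be pictured as a simple record that collects the sensed parameters together with the driver-action bit from the detector. The field names and value encodings below are assumptions introduced for this sketch and are not prescribed by the specification.

# Illustrative sketch only: one possible shape for the controller's input matrix.
# Field names and encodings are assumed for this example, not taken from the patent.
from dataclasses import dataclass

@dataclass
class InputMatrix:
    distance_m: float      # from the ultrasonic sensor
    speed_kmph: float      # from the speed sensor
    weather: str           # "normal", "rainy" or "cloudy" (humidity sensor)
    road_type: str         # "city", "highway", "terrain" or "blind turn" (vibration sensor)
    driver_action: int     # 0 = fully automatic control, 1 = driver intervention detected
    oncoming_class: str    # "heavy" or "light", supplied by the classifier

# Example reading corresponding to the working example given later in the specification:
sample = InputMatrix(distance_m=67, speed_kmph=45, weather="normal",
                     road_type="highway", driver_action=0, oncoming_class="light")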
Referring to figures 2, 3 and 4, the controller also stores a membership function for each of the one or more parameters sensed by the sensors and for the output intensity. The membership function for distance is divided into five regions (very far, far, medium, near and very near). The membership function for speed is divided into three regions (low, medium and high), and the membership function for output intensity is divided into five regions (very high, high, medium, low and very low). The controller (106) processes the plurality of input parameters by way of its one or more components.
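By way of a non-limiting sketch, triangular membership functions of the kind commonly used in fuzzy controllers could realise the regions listed above. The break-points chosen below are assumptions for illustration only; the patent's figures 2, 3 and 4 define the actual shapes.

# Sketch of triangular membership functions for the regions named in the text.
# The break-points are illustrative assumptions, not values taken from the figures.
def triangular(x, a, b, c):
    """Triangular membership: rises from a to a peak at b, falls back to zero at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Distance (m): five regions (very near, near, medium, far, very far)
DISTANCE_MF = {
    "very near": lambda d: triangular(d, -1, 0, 25),
    "near":      lambda d: triangular(d, 0, 25, 50),
    "medium":    lambda d: triangular(d, 25, 50, 75),
    "far":       lambda d: triangular(d, 50, 75, 100),
    "very far":  lambda d: triangular(d, 75, 100, 201),
}

# Speed (km/h): three regions (low, medium, high); the output intensity would be
# handled the same way with its five regions (very low ... very high).
SPEED_MF = {
    "low":    lambda s: triangular(s, -1, 0, 60),
    "medium": lambda s: triangular(s, 30, 60, 90),
    "high":   lambda s: triangular(s, 60, 120, 181),
}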

The classifier (108) classifies whether the oncoming vehicle is a heavy vehicle or a light vehicle. Heavy vehicles include trucks, buses, trolleys, etc., and light vehicles include passenger cars, etc.
After the vehicle is classified and all other input parameters are obtained, the artificial intelligent module (110) processes the input parameters in order to support the calculation of the intensity value and derives one or more control rules applicable to the combination of the parameters sensed by the sensors (102) and the binary value obtained from the detector (104). The control rule derived by the artificial intelligent module (110) and the input matrix are passed to the comparator (112). The comparator (112) stores a set of predetermined control rules in its storage medium. The comparator (112) further compares the derived control rule with the predetermined rules based on the input matrix and selects a suitable control rule.
By way of specific example, table 1 illustrates a type of control rule:

Weather \ Road type | City | Highway | Terrain | Blind turn
Normal              | L    | H       | VH      | H
Cloudy              | M    | H       | H       | VH
Rainy               | H    | H       | H       | H
Table 1
In accordance with an embodiment, the comparator further includes an evaluation module (114) for selecting the most appropriate control rule to be applied to the input matrix. When the comparator (112) provides more than one set of matching or overlapping rules for one derived rule, then the evaluation module (114) selects the most appropriate rule out of those overlapping rules.
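A minimal sketch of how a rule base of this kind might be held and queried is given below. It transcribes only the road-type/weather slice shown in Table 1 and uses a plain dictionary lookup for the comparator step; the data structure and the function name are assumptions for this illustration and are not prescribed by the specification.

# Illustrative rule-base lookup (L = low, M = medium, H = high, VH = very high).
# Combining all inputs (5 distance x 3 speed x 3 weather x 4 road type x 2 driver action)
# gives the 360 rule combinations mentioned in the working example below.
RULES = {
    ("city", "normal"): "L",    ("highway", "normal"): "H",
    ("terrain", "normal"): "VH", ("blind turn", "normal"): "H",
    ("city", "cloudy"): "M",    ("highway", "cloudy"): "H",
    ("terrain", "cloudy"): "H",  ("blind turn", "cloudy"): "VH",
    ("city", "rainy"): "H",     ("highway", "rainy"): "H",
    ("terrain", "rainy"): "H",   ("blind turn", "rainy"): "H",
}

def select_rule(road_type, weather, rules=RULES):
    """Comparator step: return the stored consequent for the matched antecedent.
    Where several rules overlap, an evaluation step would pick the most appropriate
    one; this sketch simply performs an exact-match lookup."""
    return rules[(road_type.lower(), weather.lower())]

# Example: highway in normal weather -> "H" (high output-intensity region)
print(select_rule("Highway", "Normal"))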

The output generating means (116) further applies the control rule selected by the comparator, or by the evaluation module in the case of overlapping rules, and calculates a numerical value of the intensity of the illumination apparatus with respect to the illumination descriptor value based on the applied control rule. The output generating means (116) further comprises the fuzzifier (118), the inference engine (120) and the defuzzifier (122). The fuzzifier (118) converts the numerical inputs into linguistic fuzzy values. These linguistic fuzzy values are operated upon by the inference engine (120), which fires the applicable control rule. Since the linguistic values cannot be used directly for deriving a value of intensity, they are converted back into a numerical value by the defuzzifier (122). Minimum-maximum composition is used in the inferencing procedure, while the weight counting method is used for defuzzification.
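The three stages described above can be pictured with the short sketch below, which uses minimum-maximum composition for inferencing and a weighted average for the weight-counting style of defuzzification. The membership functions, rule table and helper names are assumptions introduced only for this illustration, not the controller's actual implementation.

# Sketch of the fuzzify -> infer -> defuzzify pipeline described above.
def fuzzify(value, membership_functions):
    """Fuzzifier: convert a crisp input into {linguistic label: grade}."""
    return {label: mf(value) for label, mf in membership_functions.items()}

def infer(rules, fuzzy_inputs):
    """Inference engine with min-max composition.
    rules: {(label per input, ...): output label}; fuzzy_inputs: one grade dict per input.
    Each rule fires with the minimum of its antecedent grades, and rules sharing a
    consequent keep the maximum firing strength."""
    strengths = {}
    for antecedent, consequent in rules.items():
        grade = min(fuzzy_inputs[i][label] for i, label in enumerate(antecedent))
        strengths[consequent] = max(strengths.get(consequent, 0.0), grade)
    return strengths

def defuzzify(strengths, output_peaks):
    """Defuzzifier: weighted average over the peak value of each fired output region,
    in the spirit of equation (2) of the specification."""
    num = sum(w * output_peaks[label] for label, w in strengths.items())
    den = sum(strengths.values())
    return num / den if den else 0.0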
The output intensity calculated by the output generating means is based on the illumination descriptor value. This output intensity is in terms of voltage and is further used for supplying an illumination current to the illumination apparatus; the voltage is converted into current by using a voltage-to-current converter. In the case of an oncoming vehicle, when the distance between the vehicle and the oncoming vehicle reduces, the intensity of the illumination apparatus of the vehicle reduces to a threshold value defined by the illumination descriptor value. Also, when the oncoming vehicle is not near, the increase in intensity is likewise limited to a threshold value defined by the illumination descriptor value.
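In other words, the illumination descriptor value acts as a pair of bounds on the computed voltage before the voltage-to-current conversion stage; a one-line clamp of the kind sketched below captures that behaviour. The bound values are assumptions for illustration only.

# Illustrative clamp of the computed output voltage to the descriptor-defined bounds
# before it is handed to the voltage-to-current converter.
def limit_intensity(voltage, descriptor_min=0.5, descriptor_max=3.5):  # assumed bounds
    return max(descriptor_min, min(voltage, descriptor_max))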
BEST MODE/EXAMPLE FOR WORKING OF THE INVENTION
The process illustrated in the above paragraphs can be supported by the working example shown in the following paragraphs; the process is not restricted to the said example only:

Referring to figures 5 and 6, the system is tested by implementing it in a vehicle. The illumination intensity for variations of road type and weather is shown in the table below. Also assume that a total of 5 × 3 × 3 × 4 × 2 = 360 control rules are stored in the comparator.
Road    | Distance (m) | Speed (km/h) | Weather | O/P intensity (volts)
City    | 89           | 89           | Normal  | 1.25
City    | 100          | 110          | Rainy   | 2.0
Terrain | 20           | 34           | Rainy   | 3.5
Highway | 67           | 45           | Normal  | 2.18
Highway | 95           | 95           | Normal  | 1.5
Table 2
Input parameters: Highway, normal weather, no change in driver's action, distance = 67 m, speed = 45 km/h.
Referring to figure 6, the output intensity will be calculated by using the following equation:
I_ij = min{ µ_di(x0), µ_sj(y0) }    (1)
Where x0, y0 are the normalized points for the inputs (here distance and speed) and µ_di, µ_sj are the corresponding membership grades.
Therefore, I = (f1(k) × 0.2 + f2(k) × 0.7) / (0.2 + 0.7)
             = 2.166 Volts
For the cases where more than one rule is applied due to overlapping of linguistic variables, the output of each rule is calculated by matching the rule's antecedent part with the predefined control rules, and the corresponding grade is calculated. For defuzzification, the peak value of each linguistic variable of the output membership function is calculated, and its weighted contribution is calculated by using equation (2) below.
I = Σ µA(w) × f(k) / Σ µA(w)    (2)
Where f(k) is the normalized peak output intensity for a given linguistic variable with grade µA(w).
The value of the speed may lie in the low-medium, medium-high or high region. In the above case, where the speed is 45 km/h, it falls in the low-to-medium range. First the speed is assumed to be low and the corresponding grade µA1 is found from the distance-speed membership function. For this grade the corresponding output intensity f1(k) is found from the membership function for output intensity, assuming that if the speed is low the output intensity will also be low. Similarly, the speed is assumed to be medium and the corresponding grade µA2 is found from the distance-speed membership function; the corresponding output intensity for this grade is f2(k), and here the output intensity is medium. The output intensity is then calculated using equation (2) as shown above. The results indicate that the output intensity varies and is affected by the input parameters, as intended. The resultant output intensity value is in terms of voltage. This is so for the simulation; in the experimental system, however, the signal is applied to a voltage-to-current converter before being applied to the headlight.
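A small numerical sketch of this procedure is given below. The grades 0.2 and 0.7 are the ones used in the worked calculation above; the peak output intensities f1(k) and f2(k) are not stated in the specification, so the values used here are assumptions chosen only to show the mechanics of equation (2).

# Worked illustration of equation (2): weighted-average defuzzification over the
# low and medium speed regions that a speed of 45 km/h falls into.
grades = {"low": 0.2, "medium": 0.7}   # grades from the worked example above
peaks  = {"low": 1.2, "medium": 2.4}   # assumed peak output intensities f1(k), f2(k)

intensity = sum(grades[r] * peaks[r] for r in grades) / sum(grades.values())
print(round(intensity, 3), "Volts")    # with the specification's own peaks the result is 2.166 Volts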

We claim:
1. A system for automatic maneuvering of the intensity of at least one illumination apparatus, the variation in intensity being commensurate with at least one or a combination of parameters associated with a vehicle, the system comprising: one or more sensors for sensing one or more parameters associated with the vehicle and surroundings thereof;
a detector for detecting a value associated with a driver's action driving the vehicle; and
a controller configured to process an input matrix to instantaneously generate an illumination descriptor value, the controller further comprising:
a classifier configured to detect and classify at least one vehicle encountered while driving;
an artificial intelligent module configured to handle plurality of parameters and detected value and derive at least one control rule applicable to the said combination of parameters and detected value; a comparator for comparing the classified parameters and detected value with a set of predetermined control rules stored in a storage medium; and
an output generating means for applying the control rule thereto and calculating a numerical value of the intensity of the illumination apparatus with respect to the illumination descriptor value based on the applied control rule.
2. The system as claimed in claim 1, wherein the one or more sensors include an ultrasonic sensor, a humidity sensor, a vibration sensor, or a combination thereof.

3. The system as claimed in claim 1, wherein the input matrix further comprises of one or more parameters sensed by the one or more sensors and the value detected by the detector.
4. The system as claimed in claim 1, wherein the artificial intelligent module processes the input matrix in order to supply a differentiative illumination current to the illumination apparatus.
5. The system as claimed in claim 1, wherein the illumination descriptor value varies from a minimum value to a maximum value with respect to a threshold value.
6. The system as claimed in claims 1 and 5, wherein the controller further stores a membership function defined for one or more of the parameters sensed by the sensor and for the numerical value of intensity calculated by the output generating means.
7. The system as claimed in claim 1, wherein the comparator further comprises of an evaluation module configured for selecting the most suitable control rule out of the one or more overlapping control rule.
8. The system as claimed in claim 1, wherein the illumination apparatus includes a headlight.
9. The system as claimed in claim 1, wherein the output generating means further comprises of a fuzzifier, an inference engine and a defuzzifier.
10. A method for automatic maneuvering of at least one illumination apparatus, the variation in intensity being commensurate with at least one or a combination of parameters associated with a vehicle, the method comprising processor-implemented steps of:
sensing one or more parameters associated with the vehicle and surroundings
thereof;
detecting a value associated with a driver's action driving the vehicle; and
controlling the illumination by processing an input matrix to instantaneously
generate an illumination descriptor value, the controlling further comprising:
detecting and classifying at least one vehicle encountered while driving;
deriving at least one control rule applicable to the said combination of
parameters and detected value;
comparing the said combination of classified parameters and detected
value with a set of predefined control rules and applying one or more
control rule; and
generating an output by calculating a numerical value of the intensity
of the illumination apparatus with respect to the illumination
descriptor value based on the applied control rule.
11. The method as claimed in claim 10, wherein the method further comprises of the processor implemented step of sensing distance of the vehicle with respect to an encountered vehicle, weather condition, road type measurement and a combination thereof.
12. The method as claimed in claim 10, wherein the input matrix further comprises of one or more sensed parameters and the detected value.
13. The method as claimed in claim 10, wherein the processing of the input matrix supplies a differentiative illumination current to the illumination apparatus.

14. The method as claimed in claim 10, wherein the illumination descriptor value
varies from a minimum value to a maximum value with respect to a threshold
value.
15. The method as claimed in claim 10 and 14, wherein a membership function is defined for one or more sensed parameters and value of output intensity.
16. The method as claimed in claim 10, wherein the comparing of the classified parameters further comprises of the processor implemented step of selecting the most suitable control rule out of the one or more overlapping control rule.
17. The method as claimed in claim 10, wherein the generating of the output further comprises of the processor implemented step of converting the value of the parameters into a linguistic fuzzy value and again converting the linguistic fuzzy value to a numerical value.

Documents

Orders

Section Controller Decision Date

Application Documents

# Name Date
1 3311-MUM-2011-ABSTRACT.pdf 2018-08-10
2 ABSTRACT1.jpg 2018-08-10
3 3311-MUM-2011-CLAIMS.pdf 2018-08-10
4 3311-MUM-2011-CORRESPONDENCE.pdf 2018-08-10
5 3311-MUM-2011-CORRESPONDENCE(6-2-2012).pdf 2018-08-10
6 3311-MUM-2011-CORRESPONDENCE(7-5-2012).pdf 2018-08-10
7 3311-MUM-2011-DESCRIPTION(COMPLETE).pdf 2018-08-10
8 3311-MUM-2011-DRAWING.pdf 2018-08-10
9 3311-MUM-2011-FER.pdf 2018-08-10
10 3311-MUM-2011-FORM 1.pdf 2018-08-10
11 3311-MUM-2011-FORM 1(7-5-2012).pdf 2018-08-10
12 3311-MUM-2011-FORM 18.pdf 2018-08-10
13 3311-MUM-2011-FORM 2.pdf 2018-08-10
14 3311-MUM-2011-FORM 2(TITLE PAGE).pdf 2018-08-10
15 3311-MUM-2011-FORM 26(6-2-2012).pdf 2018-08-10
16 3311-MUM-2011-FORM 3.pdf 2018-08-10
17 3311-MUM-2011-CLAIMS [06-12-2018(online)].pdf 2018-12-06
18 3311-MUM-2011-COMPLETE SPECIFICATION [06-12-2018(online)].pdf 2018-12-06
19 3311-MUM-2011-DRAWING [06-12-2018(online)].pdf 2018-12-06
20 3311-MUM-2011-FER_SER_REPLY [06-12-2018(online)].pdf 2018-12-06
21 3311-MUM-2011-OTHERS [06-12-2018(online)].pdf 2018-12-06
22 3311-MUM-2011-Correspondence to notify the Controller [12-10-2020(online)].pdf 2020-10-12
23 3311-MUM-2011-FORM-26 [12-10-2020(online)].pdf 2020-10-12
24 3311-MUM-2011-Response to office action [12-10-2020(online)].pdf 2020-10-12
25 3311-MUM-2011-Written submissions and relevant documents [28-10-2020(online)].pdf 2020-10-28
26 3311-MUM-2011-US(14)-HearingNotice-(HearingDate-15-10-2020).pdf 2021-10-03

Search Strategy

1 search_13-12-2017.pdf