Abstract: Embodiments of the present disclosure relate to a method and a direction estimation system (101) for estimating a direction of motion of an object. The direction estimation system (101) receives data items from sensors along with information associated with a plurality of parts of the object. The direction estimation system (101) divides each data item into respective components and determines a first information change and a second information change for each part of the object. Further, the direction estimation system (101) selects a component for each part of the object based on the first and second information changes associated with the corresponding part. The direction estimation system (101) identifies a part in a data item, and a corresponding part in a subsequent data item, using the respective selected component. Thereafter, the direction estimation system (101) estimates the direction of motion of the object based on the identified part. The estimated direction may be utilised by an ADAS controller of a vehicle to increase the reaction zone between the vehicle and the object. Thus, the present disclosure helps avoid accidents during driving. Figure 4
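As a purely illustrative aid for the ADAS use mentioned in the abstract, the sketch below shows one hypothetical way an ADAS controller could widen the reaction zone when the estimated direction of motion points toward the vehicle; the function name, threshold and scaling factor are assumptions and not part of the disclosure.

```python
def adjust_reaction_zone(base_zone_m, heading_deg, toward_vehicle_deg=0.0,
                         tolerance_deg=30.0, scale=1.5):
    """Hypothetical reaction-zone adjustment using the estimated direction.

    base_zone_m: nominal reaction zone between vehicle and object, in metres.
    heading_deg: estimated direction of motion of the object, in degrees.
    If the heading lies within tolerance_deg of the direction pointing toward
    the vehicle, the reaction zone is enlarged by the given scale factor.
    """
    # Smallest angular deviation between the object's heading and the
    # direction that points toward the vehicle, in the range [0, 180].
    deviation = abs((heading_deg - toward_vehicle_deg + 180.0) % 360.0 - 180.0)
    return base_zone_m * scale if deviation <= tolerance_deg else base_zone_m
```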
We claim:
1. A method of estimating a direction of motion of an object, the method comprising:
receiving, by a direction estimation system (101), a plurality of data items of an object along with information associated with a plurality of parts of the object, from one or more sensors located on a vehicle;
dividing, by the direction estimation system (101), each data item of the plurality of data items into a respective plurality of components based on a type of the plurality of data items;
determining, by the direction estimation system (101), for each part of the plurality of parts of the object, a first information change and a second information change in each of the plurality of components, for each data item of the plurality of data items based on one or more features of the plurality of parts of the object, wherein the first information change and the second information change are determined by identifying a difference between the one or more features of a part of the object and one or more features of a preceding part of the object and a succeeding part of the object, respectively, for each of the plurality of parts of the object;
selecting, by the direction estimation system (101), for each part of the plurality of parts of the object, for each data item of the plurality of data items, a component from the plurality of components based on the first information change and the second information change associated with the corresponding part;
identifying, by the direction estimation system (101), for at least one part in a data item from the plurality of parts, a corresponding part in a subsequent data item from the plurality of data items using the respective selected component; and
estimating, by the direction estimation system (101), a direction of motion of the object based on the identified at least one part.
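To make the change computation in claim 1 concrete, the following is a minimal sketch, assuming per-part feature vectors and a Euclidean difference measure; the function name `information_changes`, the array layout and the norm are illustrative assumptions only.

```python
import numpy as np

def information_changes(features):
    """First/second information change for each part in one component.

    features: array of shape (num_parts, feature_dim), the feature vector of
    each part of the object within a single component of one data item.
    Returns two arrays of length num_parts: the difference of each part's
    features with those of its preceding part (first information change) and
    of its succeeding part (second information change). Boundary parts reuse
    their only available neighbour, which is an assumption of this sketch.
    """
    prev_feats = np.roll(features, 1, axis=0)
    next_feats = np.roll(features, -1, axis=0)
    prev_feats[0] = features[0]      # first part has no preceding part
    next_feats[-1] = features[-1]    # last part has no succeeding part
    first_change = np.linalg.norm(features - prev_feats, axis=1)
    second_change = np.linalg.norm(features - next_feats, axis=1)
    return first_change, second_change
```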
2. The method as claimed in claim 1, wherein selecting the component corresponding to each part of the plurality of parts of the object comprises:
comparing, by the direction estimation system (101), the first information change and the second information change associated with each part of the plurality of parts of a component with the first information change and the second information change associated with each part of the plurality of parts of another component from the plurality of components, for each data item of the plurality of data items; and
selecting, by the direction estimation system (101), a component from the plurality of components, having a minimum first information change and a minimum second information change, based on the comparison.
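A minimal sketch of the selection step in claim 2, assuming the per-part change values computed above and using the sum of the first and second information changes as the combined criterion; the name `select_component` and the summed score are assumptions of this sketch rather than the claimed method.

```python
import numpy as np

def select_component(first_changes, second_changes):
    """Select, per part, the component with the smallest information changes.

    first_changes, second_changes: arrays of shape (num_components, num_parts)
    holding the first/second information change of every part in every
    component of one data item.
    Returns an array of length num_parts giving, for each part, the index of
    the component whose combined change is minimal.
    """
    combined = np.asarray(first_changes) + np.asarray(second_changes)
    return np.argmin(combined, axis=0)

# Example with two components and three parts: the second component is picked
# only for the part whose changes are smaller there.
first = [[0.2, 0.9, 0.4], [0.3, 0.1, 0.5]]
second = [[0.3, 0.8, 0.2], [0.2, 0.2, 0.6]]
print(select_component(first, second))   # -> [0 1 0]
```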
3. The method as claimed in claim 1, wherein estimating the direction of motion of the object comprises projecting a line from the at least one part in the data item to the corresponding at least one part in the subsequent data item.
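As a sketch of the line projection in claim 3, the snippet below derives a heading from the position of a part in one data item and the position of the matched part in the subsequent data item; the coordinate convention and the angle representation are assumptions.

```python
import math

def estimate_direction(part_pos_current, part_pos_next):
    """Heading of the line projected between matched part positions.

    part_pos_current, part_pos_next: (x, y) coordinates of the same part in a
    data item and in the subsequent data item. Returns the heading in degrees,
    measured counter-clockwise from the positive x-axis.
    """
    dx = part_pos_next[0] - part_pos_current[0]
    dy = part_pos_next[1] - part_pos_current[1]
    return math.degrees(math.atan2(dy, dx))

# Example: a part moving from (100, 240) to (112, 236) gives a heading of
# about -18 degrees, i.e. the projected line points slightly downward in
# image coordinates.
print(round(estimate_direction((100, 240), (112, 236)), 1))   # -> -18.4
```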
4. The method as claimed in claim 1, wherein the type of the plurality of data items comprises image data and depth data.
5. The method as claimed in claim 1, wherein the information associated with the plurality of parts of the object comprises position information.
6. The method as claimed in claim 1, wherein the information associated with the plurality of parts of the object is mapped in each of the plurality of components.
7. A direction estimation system (101) for estimating a direction of motion of an object, comprising:
a processor (104); and
a memory (106) communicatively coupled to the processor (104), wherein the memory (106) stores processor-executable instructions, which, on execution, cause the processor (104) to:
receive a plurality of data items of an object along with information associated with a plurality of parts of the object, from one or more sensors located on a vehicle;
divide each data item of the plurality of data items into a respective plurality of components based on a type of the plurality of data items;
determine, for each part of the plurality of parts of the object, a first information change and a second information change in each of the plurality of components, for each data item of the plurality of data items based on one or more features of the plurality of parts of the object, wherein the first information change and the second information change are determined by identifying a difference between the one or more features of a part of the object and one or more features of a preceding part of the object and a succeeding part of the object, respectively, for each of the plurality of parts of the object;
select, for each part of the plurality of parts of the object, for each data item of the plurality of data items, a component from the plurality of components based on the first information change and the second information change associated with the corresponding part;
identify, for at least one part in a data item from the plurality of parts, a corresponding part in a subsequent data item from the plurality of data items using the respective selected component; and
estimate a direction of motion of the object based on the identified at least one part.
8. The direction estimation system (101) as claimed in claim 7, wherein the processor (104) is configured to select the component corresponding to each part of the plurality of parts of the object by:
comparing the first information change and the second information change associated with each part of the plurality of parts of a component with the first information change and the second information change associated with each part of the plurality of parts of another component from the plurality of components, for each data item of the plurality of data items; and
selecting a component from the plurality of components, having a minimum first information change and a minimum second information change, based on the comparison.
9. The direction estimation system (101) as claimed in claim 7, wherein estimating the direction of motion of the object comprises projecting a line from the at least one part in the data item to the corresponding at least one part in the subsequent data item.
10. The direction estimation system (101) as claimed in claim 7, wherein the type of the plurality of data items comprises image data and depth data.
11. The direction estimation system (101) as claimed in claim 7, wherein the information associated with the plurality of parts of the object comprises position information.
12. The direction estimation system (101) as claimed in claim 7, wherein the information associated with the plurality of parts of the object is mapped in each of the plurality of components.
| # | Name | Date |
|---|---|---|
| 1 | 202341007384-STATEMENT OF UNDERTAKING (FORM 3) [06-02-2023(online)].pdf | 2023-02-06 |
| 2 | 202341007384-REQUEST FOR EXAMINATION (FORM-18) [06-02-2023(online)].pdf | 2023-02-06 |
| 3 | 202341007384-PROOF OF RIGHT [06-02-2023(online)].pdf | 2023-02-06 |
| 4 | 202341007384-FORM 18 [06-02-2023(online)].pdf | 2023-02-06 |
| 5 | 202341007384-FORM 1 [06-02-2023(online)].pdf | 2023-02-06 |
| 6 | 202341007384-DRAWINGS [06-02-2023(online)].pdf | 2023-02-06 |
| 7 | 202341007384-DECLARATION OF INVENTORSHIP (FORM 5) [06-02-2023(online)].pdf | 2023-02-06 |
| 8 | 202341007384-COMPLETE SPECIFICATION [06-02-2023(online)].pdf | 2023-02-06 |
| 9 | 202341007384-FORM-26 [07-02-2023(online)].pdf | 2023-02-07 |
| 10 | 202341007384-Power of Attorney [31-10-2023(online)].pdf | 2023-10-31 |
| 11 | 202341007384-Form 1 (Submitted on date of filing) [31-10-2023(online)].pdf | 2023-10-31 |
| 12 | 202341007384-Covering Letter [31-10-2023(online)].pdf | 2023-10-31 |
| 13 | 202341007384-FER.pdf | 2025-10-07 |
| 1 | 202341007384_SearchStrategyNew_E_METHODANDSYSTEMOFESTIMATINGADIRECTIONOFMOTIONOFANOBJECTE_18-09-2025.pdf | 2025-09-18 |