
Device And Method For Distance Computation

Abstract: Devices and methods for computing a distance of a target object are described herein. In one embodiment, the method includes computing a first distance between a first reference point and a second reference point of the target object (104) in a first captured image, using a camera (210), from a first location, and computing a second distance between the first reference point and the second reference point of the target object (104) in a second captured image, using the camera (210), from a second location. The method further comprises computing a displacement between the first location and the second location, and determining the distance between the target object (104) and at least one of the first location and the second location based on the first distance, the second distance, and the displacement.


Patent Information

Application #:
Filing Date: 30 November 2012
Publication Number: 33/2014
Publication Type: INA
Invention Field: COMMUNICATION
Status:
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2021-12-27
Renewal Date:

Applicants

SAMSUNG INDIA ELECTRONICS PVT. LTD.
Logix Cyber Park, Plot No. C-28 & 29, Tower D, 2nd Floor, Sector-62, Noida

Inventors

1. DOGRA, Debi Prosad
Ramaganja, P.O. Jayantipur, Dist. Paschim Medinipur, West Bengal 721201, India
2. TYAGI, Saurabh
H/No. 6/159, Sector-2, Rajendra Nagar, Ghaziabad 201005, Uttar Pradesh, India

Specification

FIELD OF INVENTION
[0001] The present subject matter relates to devices and, particularly, but not exclusively,
to a method and device for computing distance of a target object.
BACKGROUND
[0002] Devices, such as cellular phones, smart phones, personal digital assistants
(PDAs), tablets, chipsets, and laptops, provide users with a variety of applications, services, and
networking capabilities. Most of the devices include various sensors which provide the users
with additional functions. For example, a majority of the devices may include a global
positioning system (GPS) module which facilitates the user to navigate from one geographical
location to another. Some devices also facilitate the user to estimate the distance of a target
object from the user’s current location, which may facilitate the user in travelling or selecting a
probable destination.
[0003] Most of the conventional devices implement computer stereo vision to estimate
the distance of the target object from the user’s current location. Computer stereo vision uses the
mechanism of stereopsis. Stereopsis may be defined as the perception of depth by a human being
having normal binocular vision, when viewing a scene with both his eyes. In human beings, the
left eye and the right eye, because of their different positions, create two different images of the
scene. These differences, usually referred to as binocular disparity, facilitate the brain to
determine depth in the scene, and provide depth perception.
[0004] Similarly, the devices which implement stereo vision include two cameras on the
same side of the device. The two cameras capture images of the same scene. Due to the different
positions of the cameras, the images created by each of the two cameras are different. The
devices then implement conventionally known image processing techniques to analyze the
binocular disparity between the two images and extract depth information about the objects
present in the images. Based on the analysis, the devices determine the distance of a target object
from the location from which the images were captured.
[0005] However, the devices which implement stereo vision have additional cameras
which lead to higher costs of such devices. Further, implementing image processing techniques
to analyze the binocular disparity leads to higher processing overheads on the processor and
lowers battery life. Thus, the market penetration of such devices is quite low.
SUMMARY
[0006] This summary is provided to introduce concepts related to computing distance of
a target object. This summary is not intended to identify essential features of the claimed subject
matter nor is it intended for use in determining or limiting the scope of the claimed subject
matter.
[0007] According to an embodiment, a method for computing a distance of a target
object is described. The method includes computing a first distance between a first reference
point and a second reference point of the target object in a first captured image, using a camera,
from a first location, computing a second distance between the first reference point and the
second reference point of the target object in a second image captured, using the camera, from a
second location. Further, the method includes computing a displacement between the first
location and the second location, and determining the distance between the target object and
either of the first location and the second location based on the first distance, the second distance,
and the displacement.
[0008] In another embodiment, a device for computing a distance of a target object is
described. The device includes a processor, a camera, and a displacement computing module
coupled to the processor. The displacement computing module is configured to determine a
displacement between a first location and a second location. The first location refers to a location
where a first image of the target object is captured by the camera. The second location refers to a
location where a second image of the target object is captured by the camera. Further, the device
includes an image processing module coupled to the processor. In one implementation, the image
processing module is configured to compute a first distance between a first reference point and a
second reference point of the target object in the first image, compute a second distance between
the first reference point and the second reference point of the target object in the second image.
Furthermore, the device includes a distance computing module configured to determine the
distance between the target object and either of the first location and the second location based
on the first distance, the second distance, and the displacement.
BRIEF DESCRIPTION OF THE FIGURES
[0009] The detailed description is described with reference to the accompanying figures.
In the figures, the left-most digit(s) of a reference number identifies the figure in which the
reference number first appears. The same numbers are used throughout the figures to reference
like features and components. Some embodiments of system and/or methods in accordance with
embodiments of the present subject matter are now described, by way of example only, and with
reference to the accompanying figures, in which:
[0010] Figure 1a illustrates a working environment implementing a device for computing
distance of a target object, in accordance with an embodiment of the present subject matter;
[0011] Figure 1b illustrates the variations in images of the target object, in accordance
with an embodiment of the present subject matter.
[0012] Figure 2 illustrates a device for computing distance of the target object, in
accordance with an embodiment of the present subject matter;
[0013] Figures 3a, 3b, and 3c illustrate three scenarios for computing distance of the
target object, in accordance with embodiments of the present subject matter; and
[0014] Figure 4 illustrates a computer implemented method for computing distance of the
target object, in accordance with an embodiment of the present subject matter.
[0015] It should be appreciated by those skilled in the art that any block diagrams herein
represent conceptual views of illustrative systems embodying the principles of the present
subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, state
transition diagrams, pseudo code, and the like represent various processes which may be
substantially represented in computer readable medium and so executed by a computer or
processor, whether or not such computer or processor is explicitly shown.
DESCRIPTION OF EMBODIMENTS
[0016] Devices and methods for computing distance of a target object are described
herein. The methods can be implemented in various devices, such as cellular phones, smart
phones, personal digital assistants (PDAs), tablets, laptops, digital cameras, and digital single
lens reflex (DSLR) cameras. Although the description herein is with reference to certain specific
devices, the methods and devices may be implemented in any other device, albeit with a few
variations, as will be understood by a person skilled in the art.
[0017] The devices implementing computer stereo vision are usually costly as they have
additional cameras. Further, these devices implement processor intensive image processing
techniques which lead to high response time and drain battery power rapidly. Moreover, a large
number of users own devices which have only a single camera. Even when devices, such as
smartphones, include multiple cameras, the cameras are placed on opposite faces, i.e., a primary
camera, primarily used for capturing images, on the back, and a secondary, front-facing camera
for facilitating video calls. Thus, implementing conventional computer stereo vision
techniques becomes difficult in a majority of the devices.
[0018] According to an implementation of the present subject matter, devices and
methods for computing distance of a target object are described herein. As mentioned earlier,
devices are usually equipped with at least one camera to facilitate capturing of images by the
user. Additionally, the devices also include various other modules (hardware or software) for
providing additional functionalities, such as a gyroscope for measuring or maintaining
orientation, and a Global Positioning System (GPS) module for navigation and determining or
detecting displacement. The techniques described herein may be implemented in any device
which includes at least one camera.
[0019] In operation, a user of the device captures a first image of a target object from a
first location, at a distance D1 from the target object, using a camera of the device. In one
implementation, the GPS module of the device may be configured to mark a first set of
coordinates of the first location, for example in terms of latitude and longitude. In another
implementation, various applications, such as a geo-tagging application may be used to mark the
first location where the first image was captured. In yet another implementation, the device may
prompt the user to enter the first set of coordinates of the first location, for example, in terms of
latitude and longitude.
[0020] The user may then move towards or move away from the target object and reach a
second location, at a distance D2 from the target object. The user then captures a second image
of the target object from the second location. Further, the device may also be configured to mark
a second set of coordinates of the second location using the GPS module, or applications such as
geo-tagging application, or based on user input. Based on the first set of coordinates of the first
location and the second set of coordinates of the second location, the device may be configured
to determine the displacement, represented by DP, of the user. If the user moves towards the
target object, the relation between D1, D2, and DP, is as per equation (1) given below; whereas if
the user moves away from the target object, the relation between D1, D2, and DP, is as per
equation (2) given below. For the purpose of explanation, it is assumed that the user is moving
away from the target object.
DP = D1 - D2 ……………. Equation (1)
DP = D2 – D1 ……………. Equation (2)
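
As an illustration of how the displacement DP might be obtained from the two marked sets of coordinates, the following sketch converts two latitude/longitude fixes into a ground distance in metres using the haversine formula. This is a hedged example only: the function name, the spherical-Earth assumption, and the sample coordinates are not part of the described embodiment.

import math

def displacement_from_coordinates(lat1, lon1, lat2, lon2):
    # Illustrative sketch: approximate ground distance (metres) between two
    # GPS fixes, assuming a spherical Earth of radius ~6371 km. The result is
    # used as the displacement DP between the first and second location.
    r_earth = 6371000.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    d_phi = math.radians(lat2 - lat1)
    d_lambda = math.radians(lon2 - lon1)
    a = (math.sin(d_phi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(d_lambda / 2) ** 2)
    return 2 * r_earth * math.asin(math.sqrt(a))

# Example with assumed coordinates roughly 15 m apart:
# dp = displacement_from_coordinates(28.6270, 77.3720, 28.6271, 77.3721)
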
[0021] In said implementation, the device may be configured to select two reference
points, a first reference point and a second reference point, of the target object in the first image.
In one configuration, the device may be configured to select two points on the target object as
seen on the first image as the reference points based on various image parameters, such as color,
brightness, hue, saturation, sharpness, and exposure. In another configuration, the device may
prompt the user to select the two reference points on the target object as seen on the first image.
The device may then compute a first distance, represented by DR1, between the first reference
point and the second reference point in the first image, for example say in terms of number of
pixels.
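
The text leaves open how the two reference points are chosen from image parameters. One conceivable, purely illustrative realization is to pick two strong corner features; the sketch below uses OpenCV's goodFeaturesToTrack, which is an assumption of this example and not the described implementation. In practice the same two physical points must be located in both images.

import cv2

def pick_reference_points(image_bgr):
    # Illustrative sketch: select two prominent corner points as the first and
    # second reference points. Parameter values are arbitrary assumptions; a
    # real implementation would also verify that two corners were found.
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    corners = cv2.goodFeaturesToTrack(gray, maxCorners=2,
                                      qualityLevel=0.3, minDistance=20)
    (x1, y1), (x2, y2) = corners.reshape(-1, 2)
    return (float(x1), float(y1)), (float(x2), float(y2))
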
[0022] The device may further select the same two points of the target object on the
second image as the two reference points, a first reference point and a second reference point, of
the second image. In one implementation, the device may be configured to process the second
image for neutralizing the error caused by any of image capturing parameters, such as light
conditions, exposure, and zoom, which may have changed from the instant of capturing the first
image and the instant of capturing the second image. This is done to reduce any difference
between the two images other than the binocular disparity. A second distance (represented by DR2, the distance between
the first reference point and the second reference point in the second image) is calculated in
terms of number of pixels.
[0023] As will be understood by those skilled in the art, the size of an object in an image
is inversely proportional to the distance of the object from the camera capturing the image of the
object. For example, consider a camera having a focal length F. In said example, the object is at a
distance, say, X. It is assumed that the length of the object is LO, and the length of the image is
LI. Then, according to the laws of optics, as well known in the art, the relation between F, X,
LO, and LI is as given in equation (3).
LO / X = LI / F ……………. Equation (3)
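
As a quick numeric illustration of equation (3) with assumed values: an object of length LO = 10 m at X = 50 m, imaged through a lens of focal length F = 50 mm, yields an image of length LI = F × LO / X = 10 mm.

# Illustrative check of equation (3), LO / X = LI / F, with assumed numbers.
F = 0.050          # focal length, metres (assumed 50 mm)
LO = 10.0          # object length, metres (assumed)
X = 50.0           # object distance, metres (assumed)
LI = F * LO / X    # image length: 0.010 m, i.e. 10 mm
assert abs(LO / X - LI / F) < 1e-12
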
[0024] Thus, if the focal length of the camera of the device is represented as FCD, and the
actual distance between the two reference points on the target object is represented as Dactual, then based
on equation (3), the relation between DR1, D1, FCD, and Dactual is as per equation (4) given below.
Similarly, the relation between DR2, D2, FCD, and Dactual is as per equation (5) given below.
Dactual / D1 = DR1 / FCD ……………. Equation (4)
Dactual / D2 = DR2 / FCD ……………. Equation (5)
[0025] Based on the equation (4) and the equation (5), the relation between DR1, D1, DR2,
and D2 as per the equation (6) is provided below.
D1 / D2 = DR2 / DR1 ……………. Equation (6)
[0026] Using the property of ratios, as would be known to those skilled in the art, the
equation (6) may be rewritten as equation (7).
(D2 - D1) / D2 = (DR1 - DR2) / DR1 ……………. Equation (7)
[0027] By combination of equation (2), and the equation (7), the distance D2 may be
determined as given below in equation (8). In one implementation, the device may be configured
to implement the equation (8) to determine the distance D2, which is the distance of the second
location from the target object.
D2 = (DP × DR1) / (DR1 - DR2) ……………. Equation (8)
[0028] Thus, by combining equation (8) and either of equation (1) or equation (2), as per
the case, the device may determine D1, the distance of the first location from the target object.
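
The chain from equations (1)/(2) and (6) to equation (8) can be captured in a short sketch. The function below is an illustrative rendering only, not the claimed implementation; it also covers the moving-towards case via equation (11), which is derived later in the description.

def distances_from_images(dr1, dr2, dp):
    # Illustrative sketch combining equations (1), (2), (8) and (11).
    # dr1, dr2: distance (e.g. in pixels) between the two reference points in
    #           the first and second image (DR1, DR2).
    # dp:       displacement DP between the first and second location.
    # Returns (D1, D2), the distances of the first and second location from
    # the target object, in the same unit as dp.
    if dr1 == dr2:
        raise ValueError("no change in apparent size; distance cannot be resolved")
    if dr1 > dr2:
        # User moved away from the target: equation (8), then equation (2).
        d2 = dp * dr1 / (dr1 - dr2)
        d1 = d2 - dp
    else:
        # User moved towards the target: equation (11), then equation (1).
        d2 = dp * dr1 / (dr2 - dr1)
        d1 = d2 + dp
    return d1, d2

# Worked example with assumed numbers: DR1 = 200 px, DR2 = 160 px, DP = 10 m
# gives D2 = 10 * 200 / (200 - 160) = 50 m and D1 = 50 - 10 = 40 m.
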
[0029] In the aforementioned case, it has been assumed that the first image and the
second image have been captured in parallel planes, henceforth referred to as parallel image
planes. However, in a majority of practical scenarios, the image plane of the first image and the
image plane of the second image may not be parallel but may be at an angle, say α. In an
embodiment, the angle α may be determined by a sensor, such as the gyroscope, of the device.
In such scenarios, the parameter DR2 has to be amended as per the equation (9) given below. The
new value of DR2 is represented as DR2corrected and is the product of DR2 and the value of the
cosine function of α (cos α).
DR2corrected = DR2 × cos α ……………. Equation (9)
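
A one-line illustration of equation (9), assuming the gyroscope reading α is available in degrees:

import math

def correct_dr2(dr2, alpha_degrees):
    # Equation (9): DR2corrected = DR2 x cos(alpha). The degree-valued input
    # is an assumption of this sketch.
    return dr2 * math.cos(math.radians(alpha_degrees))
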
[0030] The above described techniques and methods facilitate the computation of
distance of the target object from at least one of the first location and the second location, using a
device having a single camera. The aforementioned techniques and methods further reduce the
computational overhead on the processor of the device, and enhance battery life. These and
other features of the aforementioned techniques and methods are described in greater detail in
conjunction with the figures.
[0031] The described methodologies can be implemented in hardware, firmware,
software, or a combination thereof. For a hardware implementation, the processing units can be
implemented within one or more application specific integrated circuits (ASICs), digital signal
processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices
(PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers,
microprocessors, electronic devices, other electronic units designed to perform the functions
described herein, or a combination thereof. Herein, the term "system" encompasses logic
implemented by software, hardware, firmware, or a combination thereof.
[0032] For a firmware and/or software implementation, the methodologies can be
implemented with modules (e.g., procedures, functions, and so on) that perform the functions
described herein. Any machine readable medium tangibly embodying instructions can be used in
implementing the methodologies described herein. For example, software codes and programs
can be stored in a memory and executed by a processing unit. Memory can be implemented
within the processing unit or may be external to the processing unit. As used herein the term
"memory" refers to any type of long term, short term, volatile, or other storage devices and is not
to be limited to any particular type of memory or number of memories, or type of media upon
which memory is stored.
[0033] It should be noted that the description merely illustrates the concept of the present
subject matter. It will thus be appreciated that those skilled in the art will be able to devise
various arrangements that, although not explicitly described herein, embody the concepts of the
present subject matter and are included within its spirit and scope. Furthermore, all examples
recited herein are principally intended expressly to be only for pedagogical purposes to aid the
reader in understanding the principles of the invention and the concepts contributed by the
inventor(s) to furthering the art, and are to be construed as being without limitation to such
specifically recited examples and conditions. Moreover, all statements herein reciting principles,
aspects, and embodiments of the invention, as well as specific examples thereof, are intended to
encompass equivalents thereof.
[0034] The manner in which the systems and methods shall be implemented has been
explained in detail with respect to Figure 1a, Figure 1b, Figure 2, Figure 3a, Figure 3b, Figure 3c, and Figure 4.
While aspects of described systems and methods can be implemented in any number of different
devices, transmission environments, and/or configurations, the embodiments are described in the
context of the following exemplary system(s).
[0035] Figure 1a illustrates a working environment 100 implementing a device 102-1
for computing a distance of a target object, in accordance with an embodiment of the
present subject matter. The device 102-1 described herein can be implemented in any device
with an image capturing capability, such as cellular phones, smart phones, personal digital
assistants (PDAs), tablets, laptops, digital cameras, and digital single lens reflex (DSLR)
cameras. In one implementation, the device 102-1 is configured to compute the distance of a
target object from at least one location of the user.
[0036] In one example, say the user of the device 102-1 wants to determine its distance
from a target object 104, such as a tower. In operation, the user captures a first image of the
target object 104 from a first location, which is at a distance D1 from the target object 104. In one
implementation, the device 102-1 marks the first set of coordinates of the first location using
inbuilt sensors or by prompting for input of the first set of coordinates by the user. The user then
changes his location by moving towards or moving away from the target object 104. For the sake
of explanation of the present subject matter, it is assumed that the user is moving away from the
target object 104. On reaching a second location, at a distance D2 from the target object 104, the
user then uses the camera of the device 102-1 to capture a second image of the target object 104.
Further, the device 102-1 may be configured to mark the second set of coordinates of the second
location using inbuilt sensors or by prompting for input of the second set of coordinates by the
user.
[0037] Based on the first set of coordinates of the first location and the second set of
coordinates of the second location, the device 102-1 may determine the displacement of the user
between the first location and the second location, as indicated by the line 106. In one example, the
displacement 106, as represented by DP, may be computed using equation (2), which is provided
below:
DP = D2 – D1 ……………. Equation (2)
[0038] In one implementation, the device 102-1 may select two reference points on the
first image of the target object 104. In one implementation, the device 102-1 may select the two
reference points based on various image parameters, such as color, brightness, hue, saturation,
sharpness, and exposure, whereas in another configuration, the device 102-1 may request for user
input to select the two reference points. In one example, say the point 108-1 on the target object
104 is selected as the first reference point and the point 108-2 on the target object 104 as the second
reference point. The portion of the first image formed by the part of the target object 104 bounded by
the reference points 108-1 and 108-2 is represented by 110-1. In said implementation, the
device 102-1 may compute a first distance, represented by DR1, between the first reference point
and the second reference point in the first image. The first distance may be measured in any unit,
such as millimeters as measurable on the display screen of the device 102-1, and number of
pixels.
[0039] Similarly, the device 102-1 selects the same two reference points, 108-1 and 108-
2, of the target object 104 on the second image as the two reference points, a first reference point
and a second reference point, of the second image. The portion of the second image formed by
the part of the target object 104 bounded by the reference points 108-1 and 108-2 is represented by
110-2. A second distance, represented by DR2, between the first reference point 108-1 and the
second reference point 108-2 in the second image is then computed, for example in terms of
number of pixels.
[0040] Based on the parameters DP, DR1, and DR2, a distance computing module 112 of
the device 102-1 may be configured to determine the distance of the second location from the
target object 104 as per the equation (8), as provided below. The distance computing module 112
may also be configured to compute the distance D1, based on the equation (2) as mentioned
earlier.
D2 = (DP × DR1) / (DR1 - DR2) ……………. Equation (8)
[0041] Figure 1b illustrates the variation of the images of the target object 104 in
accordance with an embodiment of the present subject matter. As mentioned earlier, the portion
of the first image formed by the first reference point 108-1 and the second reference point 108-2
on the target object 104 is 110-1. In the portion of the first image, the image of the first reference
point 108-1 is 152-1, and the image of the second reference point 108-2 is 152-2. Thus, the first
distance DR1 is the length of the straight line joining the points 152-1 and 152-2. In one
implementation, say the point 152-1 is represented by coordinates (x1, y1) and the point 152-2 is
represented by the coordinates (x2, y2). In one implementation, the coordinates may be measured
in terms of number of pixels by taking any point on the first image as the reference point
represented by the coordinates (0, 0). Based on the principles of coordinate geometry, as is well
known to those skilled in the art, the first distance DR1 may be computed as per equation (10-1)
given below.
DR1 = √((x1 - x2)² + (y1 - y2)²) ……………. Equation (10-1)
[0042] Similarly, the portion of the second image formed by the first reference point 108-
1 and the second reference point 108-2 on the target object 104 is 110-2. In the portion of the
second image, the image of the first reference point 108-1 is 154-1, and the image of the second
reference point 108-2 is 154-2. The point on the second image corresponding to the origin of the
first image is selected as the origin of the second image and designated as
(0, 0). Thus, the second distance DR2 is the length of the straight line joining the points 154-1 and
154-2. In one implementation, say the point 154-1 is represented by coordinates (x3, y3) and the
point 154-2 is represented by the coordinates (x4, y4). Based on the principles of coordinate
geometry, as is well known to those skilled in the art, the second distance DR2 may be computed
as per equation (10-2) given below.
DR2 = √((x3 - x4)² + (y3 - y4)²) ……………. Equation (10-2)
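
Equations (10-1) and (10-2) are the ordinary Euclidean distance between two pixel coordinates; a minimal sketch (the helper name is illustrative):

import math

def pixel_distance(p, q):
    # Euclidean distance between two pixel coordinates, as in equations
    # (10-1) and (10-2); e.g. DR1 = pixel_distance((x1, y1), (x2, y2)).
    (xa, ya), (xb, yb) = p, q
    return math.hypot(xa - xb, ya - yb)
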
[0043] Thus, the device 102-1 as described above facilitates the user to determine the
distance of a target object 104 using a single camera. Further, the computational load on the
device 102-1 is low, which results in enhanced battery life.
[0044] Figure 2 illustrates the components of the device 102-1 configured for computing
a distance of a target object 104. The working of the various modules of the device 102-1 is
described in conjunction with Figure 3a, Figure 3b and Figure 3c which illustrate three scenarios
for computing distance of the target object 104, in accordance with embodiments of the present
subject matter.
[0045] In one implementation, the device 102-1 includes processor(s) 202. The processor
202 may be implemented as one or more microprocessors, microcomputers, microcontrollers,
digital signal processors, central processing units, state machines, logic circuitries, and/or any
devices that manipulate signals based on operational instructions. Among other capabilities, the
processor(s) 202 is configured to fetch and execute computer-readable instructions stored in a
memory.
[0046] The functions of the various elements shown in the figure, including any
functional blocks labeled as “processor(s)”, may be provided through the use of dedicated
hardware as well as hardware capable of executing software in association with appropriate
software. When provided by a processor 202, the functions may be provided by a single dedicated
processor, by a single shared processor, or by a plurality of individual processors, some of which
may be shared.
[0047] In another embodiment of the present subject matter, the device 102-1 may also
include a memory 204. The memory 204 may be communicatively coupled to the processor 202.
The memory 204 can include any computer-readable medium known in the art including, for
example, volatile memory, such as static random access memory (SRAM) and dynamic random
access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM),
erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes.
[0048] Further, the device 102-1 may include module(s) 206 and data 208. The modules
206 and the data 208 may be coupled to the processors 202. The modules 206, amongst other
things, include routines, programs, objects, components, data structures, etc., which perform
particular tasks or implement particular abstract data types. The modules 206 may also be
implemented as signal processor(s), state machine(s), logic circuitries, and/or any other device
or component that manipulate signals based on operational instructions.
[0049] Further, the modules 206 can be implemented in hardware, instructions executed
by a processing unit, or by a combination thereof. The processing unit can comprise a computer,
a processor, a state machine, a logic array or any other suitable devices capable of processing
instructions. In another aspect of the present subject matter, the modules 206 may be machine-readable
instructions (software) which, when executed by a processor/processing unit, perform
any of the described functionalities. The machine-readable instructions may be stored on an
electronic memory device, hard disk, optical disk or other machine-readable storage medium or
non-transitory medium. In one implementation, the machine-readable instructions can also be
downloaded to the storage medium via a network connection.
[0050] In one implementation, the device 102-1 includes an image capturing device, such
as a camera 210. In said implementation, the module(s) 206 includes a displacement computing
module 212, an image processing module 214, the distance computing module 112, and other
module(s) 218. The other module(s) 218 may include programs or coded instructions that
supplement applications or functions performed by the device 102-1. In said implementation, the
data 208 includes computation data 220, and other data 222. The other data 222 amongst other
things, may serve as a repository for storing data that is processed, received, or generated as a
result of the execution of one or more modules in the module(s) 206.
[0051] Also, the device 102-1 includes interface(s) 224. The interfaces 224 may include a
variety of software and hardware interfaces that allow the device 102-1 to interact with various
networks or with other devices. The interfaces 224 may facilitate multiple communications
within a wide variety of networks and protocol types, including wire networks, for example,
LAN, cable, etc., and wireless networks, for example, WLAN, cellular, satellite-based network,
etc.
[0052] In operation, the user would capture a first image of the target object 104 using
the camera 210 from the first location. As depicted in Figure 3a, say the first location 302-1 is at
a distance, D1, as depicted by line 304-1, from the target object 104. In said implementation, the
displacement computing module 212 may be configured to receive the first set of coordinates of
the first location from an inbuilt module, such as a GPS module, of the device 102-1 or may
prompt the user to input the first set of coordinates of the first location. As depicted, the image
306-1 is the image of the target object 104 captured by the camera 210.
[0053] In said implementation, the image processing module 214 may be configured to
select two points on the target object 104 as seen on the first image 306-1 as the reference points
based on various image parameters, such as color, brightness, hue, saturation, sharpness, and
exposure. Alternatively, the image processing module 214 may be configured to prompt the user
to select the reference points of the target object 104. In one implementation, the reference points
selected on the target object 104 are 108-1, and 108-2. As depicted, the corresponding reference
points of the target object 104, on the first image 306-1, are 308-1 and 308-2. In one
implementation, the image processing module 214 determines the length of the portion of the
image covered by the reference points, 308-1 and 308-2, as DR1.
[0054] In one scenario, the user may move towards the target object 104 and reach the
second location 302-2, which is at a distance, D2, as depicted by line 304-2, from the target
object 104. In said implementation, the displacement computing module 212 may be configured
to receive the second set of coordinates of the second location from an inbuilt module, such as a
GPS module, of the device 102-1 or may prompt the user to input the second set of coordinates
of the second location. The displacement computing module 212 may then determine the
displacement, DP, of the user as per the equation (1) which is reproduced below for the ease of
readability.
DP = D1 - D2 ……………. Equation (1)
[0055] At the second location, 302-2, the user may capture a second image, 306-2, of the
target object 104 using the camera 210. The image processing module 214 may be configured to
process the second image 306-2 for neutralizing the effect of any image capturing parameter,
such as light conditions, exposure, and zoom, which may have changed from the instant of
capturing the first image 306-1 and the instant of capturing the second image 306-2. This is done
to reduce any difference between the two images 306-1 and 306-2 other than the binocular disparity. In
one implementation, the image processing module 214 may be configured to identify the
reference points 108-1 and 108-2 of the target object 104 on the second image 306-2. As
illustrated in Figure 3a, the corresponding reference points of the target object 104, on the second
image 306-2, are 310-1 and 310-2. In said implementation, the image processing module 214
determines the length of the portion of the image covered by the reference points, 310-1 and 310-
2, as DR2.
[0056] Based on above determined DP, DR1, and DR2, the distance computing module 112
may be configured to determine the distance of the second location 302-2, D2, using equation
(11) as given below. Further, based on equation (1), the distance computing module 112 may
determine the distance of the first location 302-1, which is D1.
D2 = (DP × DR1) / (DR2 - DR1) ……………. Equation (11)
[0057] In another scenario, the user may move away from the target object 104 and reach
a new second location 302-3, which is at a distance, D2, as depicted by line 304-3, from the
target object 104. In said implementation, the displacement computing module 212 may be
configured to receive the set of coordinates of the new second location 302-3 from an inbuilt
module or may prompt the user to input the set of coordinates of the new second location 302-3.
The displacement computing module 212 may then determine the displacement, DP, of the user
as per the equation (2) which is reproduced below for the ease of readability.
DP = D2 – D1 ……………. Equation (2)
[0058] At the new second location 302-3, the user may capture a new second image, 306-
3, of the target object 104 using the camera 210. The image processing module 214 may be
configured to process the new second image 306-3 for neutralizing the effect of any image
capturing parameter. In one implementation, the image processing module 214 may be
configured to identify the reference points 108-1, and 108-2 of the target image 104 on the new
second image 306-3. As illustrated in Figure 3a, the corresponding reference points of the target
object 104, on the new second image 306-3, are 312-1 and 312-2. In said implementation, the
image processing module 214 determines the length of the portion of the image covered by the
reference points, 312-1 and 312-2, as DR2.
[0059] Based on above determined DP, DR1, and DR2, the distance computing module 112
may be configured to determine the distance of the new second location 302-3, D2, using
equation (8), which is reproduced below. Further, based on equation (2), the distance computing
module 112 may determine the distance of the first location 302-1, which is D1.
D2 = (DP × DR1) / (DR1 - DR2) ……………. Equation (8)
[0060] In the scenarios explained in conjunction with Figure 3a, it has been presumed
that the first image 306-1 and the second image 306-2 or the new second image 306-3 have been
captured in parallel image planes. However, in a majority of practical scenarios, the image plane of the
first image 306-1 and the image plane of the second image 306-2 may not be parallel but may be
at an angle.
[0061] Figure 3b depicts another scenario for computing distance of the target object
104, in accordance with embodiments of the present subject matter.
[0062] In operation, the user would capture a first image 306-1 of the target object 104
using the camera 210 from the first location. As depicted in Figure 3b, say the first location 302-
1 is at a distance, D1, as depicted by line 304-1, from the target object 104. As depicted, the
image 306-1 is the first image of the target object 104 captured by the camera 210.
[0063] In one scenario, say the user moves to a second location 302-2 which is at a
distance, D2, as depicted by line 304-2, from the target object 104. Now instead of capturing the
second image of the target object 104 as 306-2, which is in the same image plane as the first
image 306-1, the user captures the second image of the target object 104 as 354-1, whose image
plane is at an angle α, as denoted by curved line 356, with the image plane of the first image 306-
1. In one implementation, the image processing module 214 may be configured to determine the
angle between the image plane of the first image 306-1 and the image plane of the second image 354-1, from the
measurements made by an inbuilt sensor of the device 102-1, such as a gyroscope.
[0064] Based on the determined angle α, the image processing module 214 may correct
the distance between the two reference points in the second image 354-1 as per the equation (9)
reproduced below. The new value of DR2 is represented as DR2corrected. In one implementation, the
image processing module 214 may retrieve the value of the cosine function of the angle α,
represented as cos α, from the computation data 220.
DR2corrected = DR2 × cos α ……………. Equation (9)
[0065] Thus, based on above determined DP, DR1, and DR2corrected, the distance computing
module 112 may be configured to determine the distance of the second location 302-2, D2, using
equation (12) below. Further, based on equation (2), the distance computing module 112 may
determine the distance of the first location 302-1, which is D1.
D2 = (DP × DR1) / (DR1 - DR2corrected) ……………. Equation (12)
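
Equation (12) is simply equation (8) with DR2 replaced by DR2corrected from equation (9); composed with the illustrative helpers sketched earlier (whose names are assumptions):

def distance_with_plane_correction(dr1, dr2, dp, alpha_degrees):
    # Equation (12): apply the cos(alpha) correction of equation (9) to DR2,
    # then evaluate equation (8)/(11) via the earlier distances_from_images sketch.
    return distances_from_images(dr1, correct_dr2(dr2, alpha_degrees), dp)
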
[0066] Figure 3c depicts another scenario for computing distance of the target object 104,
in accordance with embodiments of the present subject matter.
[0067] In certain cases, it is possible that the line of sight from the first location 302-1
and the line of sight from the second location 302-2 may not be the same. In some scenarios, the two
lines of sight may even vary drastically. As depicted in
Figure 3c, the target object 104, the first location 302-1, and the second location 302-2 do not lie
in the same straight line.
[0068] In one implementation, the image processing module 214 may be configured to
determine an angle β, as depicted by curve 360, between the line of sight from the first location
302-1, i.e., the straight line joining the first location 302-1 and the target object 104, and the line
of sight from the second location 302-2, i.e., the straight line joining the second location 302-2
and the target object 104. In case the determined angle β is less than a pre-defined threshold
value δ, then the image processing module 214 may be configured to ignore the effect of angle
between the two lines of sight. However, in case the angle β exceeds the pre-defined threshold
value δ, then the image processing module 214 may be configured to introduce a correction
parameter, which is the value of the cosine function of the angle β, to neutralize the effect of the
angle between the two lines of sight on the computed value of the distance between the target
object 104 and at least one of the first location 302-1 and the second location 302-2.
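
The description does not state exactly how the cos β correction parameter is applied; the sketch below shows one plausible reading, in which the second measurement is scaled by cos β only when β exceeds the threshold δ. The default threshold and the choice of applying the factor to DR2 are assumptions of this example.

import math

def apply_line_of_sight_correction(dr2, beta_degrees, threshold_degrees=5.0):
    # Illustrative sketch of paragraph [0068]: ignore the line-of-sight angle
    # beta when it is below the threshold delta; otherwise apply a cos(beta)
    # correction factor. The 5-degree default is an assumed value.
    if abs(beta_degrees) < threshold_degrees:
        return dr2
    return dr2 * math.cos(math.radians(beta_degrees))
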
[0069] Thus, the device 102-1 facilitates determining of the distance between the target
object 104 and at least one of the first location 302-1 and the second location 302-2 of the user
with high precision. The device 102-1 also takes into account the possible rotational effect which
may be induced while capturing images of the target object 104 from different locations, such as
the first location 302-1, and the second location 302-2, to enhance the accuracy of the computed
distance.
[0070] Figure 4 illustrates method 400 for computing a distance of a target object 104,
according to an embodiment of the present subject matter. The order in which the method 400 is
described is not intended to be construed as a limitation, and any number of the described
method blocks can be combined in any order to implement the method 400, or any alternative
methods. Additionally, individual blocks may be deleted from the method without departing
from the spirit and scope of the subject matter described herein. Furthermore, the method can be
implemented in any suitable hardware, software, firmware, or combination thereof.
[0071] The method may be described in the general context of computer executable
instructions. Generally, computer executable instructions can include routines, programs, objects,
components, data structures, procedures, modules, functions, etc., that perform particular
functions or implement particular abstract data types. The method may also be practiced in a
distributed computing environment where functions are performed by remote processing devices
that are linked through a communications network. In a distributed computing environment,
computer executable instructions may be located in both local and remote computer storage
media, including memory storage devices.
[0072] A person skilled in the art will readily recognize that steps of the method can be
performed by programmed computers. Herein, some embodiments are also intended to cover
program storage devices, for example, digital data storage media, which are machine or
computer readable and encode machine-executable or computer-executable programs of
instructions, where said instructions perform some or all of the steps of the described method.
The program storage devices may be, for example, digital memories, magnetic storage media,
such as a magnetic disks and magnetic tapes, hard drives, or optically readable digital data
storage media.
[0073] Referring to Figure 4, at block 402, a first image of the target object captured
from a first location is received. In one implementation, the user may capture the first image 306-
1 of the target object 104 from the first location 302-1 using the camera 210 of the device 102-1.
[0074] As depicted in block 404, a first distance, DR1, is computed between a first
reference point and a second reference point of the target object 104 in the first image 306-1. In
one implementation, the image processing module 214 of the device 102-1 may be configured to
determine two reference points, the first reference point 108-1 and the second reference point
108-2, of the target object 104 based on various image parameters, such as color, brightness, hue,
saturation, sharpness, and exposure. In another implementation, the image processing module 214
may prompt the user to select the first reference point 108-1 and the second reference point 108-2
on the target object 104 as seen on the first image 306-1. The image processing module 214 may
then determine the first distance, DR1, between the two reference points on the first image 306-1.
[0075] As illustrated in block 406, a second image of the target object captured from a
second location is received. In one implementation, the user may capture the second image 306-2
of the target object 104 from the second location 302-2 using the camera 210 of the device 102-
1.
[0076] As shown in block 408, a second distance, DR2, is computed between a first
reference point and a second reference point of the target object 104 in the second image. In one
implementation, the image processing module 214 may be configured to process the second
image 306-2 for neutralizing the effect of any image capturing parameter, such as light
conditions, exposure, and zoom, which may have changed from the instant of capturing the first
image 306-1 and the instant of capturing the second image 306-2, so as to reduce any difference
between the two images 306-1 and 306-2 other than the binocular disparity. A second distance, represented by
DR2, between the first reference point and the second reference point in the second image 306-2,
for example say in terms of number of pixels is then computed by the image processing module
214.
[0077] At block 410, it is ascertained if the image planes of the first image 306-1 and the
second image 306-2 are parallel. In one implementation, the image processing module 214 may
be configured to ascertain whether the image planes of the first image 306-1 and the second
image 306-2 are parallel based on measurements made by inbuilt sensors of the device 102-1,
such as a gyroscope.
[0078] As illustrated in block 412, the second distance, DR2, computed between the
first reference point and the second reference point of the target object 104 in the second image
306-2, may be corrected based on the ascertaining performed at block 410. In one example, say
the image processing module 214 determines the angle between the image planes of the first
image 306-1 and the second image 306-2 to be α. The image processing module
214 may then multiply the second distance, DR2, with the value of the cosine function of α to
obtain the corrected value of the second distance, DR2.
[0079] As depicted in block 414, the displacement of the user between the first location
302-1 and second location 302-2 is determined. In one implementation, the displacement
computing module 212 of the device 102-1 may be configured to determine the displacement
between the first location 302-1 and second location 302-2 based on the coordinates of the
two locations.
[0080] At block 416, the distance between the target object 104 and at least one of the
first location 302-1 and second location 302-2 is determined, based on the first distance DR1, the
second distance DR2, and the displacement between the first location 302-1 and second location
302-2. In one implementation, the distance computing module 112 may be configured to
determine the distance between the target object 104 and at least one of the first location 302-1
and second location 302-2 using the equations as described earlier.
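
Putting blocks 402 to 416 together, one possible end-to-end sketch of method 400 is given below. It reuses the illustrative helpers sketched earlier (pixel_distance, correct_dr2, displacement_from_coordinates, distances_from_images); all names are assumptions rather than parts of the described device.

def method_400(img1_points, img2_points, loc1, loc2, alpha_degrees=0.0):
    # Illustrative end-to-end sketch of Figure 4.
    # img1_points, img2_points: ((x, y), (x, y)) reference points in the first
    #                           and second image.
    # loc1, loc2:               (lat, lon) of the first and second location.
    # alpha_degrees:            angle between the two image planes.
    dr1 = pixel_distance(*img1_points)                    # block 404
    dr2 = pixel_distance(*img2_points)                    # block 408
    if alpha_degrees:                                     # blocks 410 and 412
        dr2 = correct_dr2(dr2, alpha_degrees)
    dp = displacement_from_coordinates(*loc1, *loc2)      # block 414
    return distances_from_images(dr1, dr2, dp)            # block 416
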
[0081] Although embodiments for methods and systems for computing a distance of a
target object 104 have been described in a language specific to structural features and/or
methods, it is to be understood that the invention is not necessarily limited to the specific
features or methods described. Rather, the specific features and methods are disclosed as
exemplary embodiments for computing a distance of a target object 104.

I/We claim:
1. A computer implemented method for computing a distance of a target object (104)
comprising:
measuring a first distance, by a device (102-1), between a first reference point and
a second reference point of the target object (104) in a first captured image, using a
camera (210) mounted on the device (102-1), from a first location;
measuring a second distance, by a device (102-1), between the first reference
point and the second reference point of the target object (104) in a second captured
image, using the camera (210), from a second location;
determining a displacement between the first location and the second location;
and
determining the distance between the target object (104) and at least one of the
first location and the second location based on the first distance, the second distance, and
the displacement.
2. The computer implemented method as claimed in claim 1, wherein the method further
comprises:
ascertaining whether an image plane of the first image is parallel to an image
plane of the second image;
determining an angle between the image plane of the first image and the image
plane of the second image on ascertaining the image plane of the first image and the
image plane of the second image not to be parallel; and
correcting the second distance by a correction factor based on the angle.
3. The computer implemented method as claimed in claim 1, wherein the method further
comprises determining at least one of the first reference point and the second reference point of
the target object (104) in the first image based on at least one of one or more image parameter
and a user input.
4. The computer implemented method as claimed in claim 1, wherein the method further
comprises processing the second image for neutralizing an effect of at least one image capturing
parameter.
5. The computer implemented method as claimed in claim 1, wherein the computing the
displacement further comprises
determining a first set of coordinates of the first location;
determining a second set of coordinates of the second location; and
evaluating the displacement based on the first set of coordinates, and the second
set of coordinates.
6. A device (102-1), for computing a distance of a target object (104) comprising:
a processor (202);
a camera (210);
a displacement computing module (212) coupled to the processor (202), the
displacement computing module (212) configured to determine a displacement between a
first location where a first image of the target object (104) is captured by the camera
(210) and a second location where a second image of the target object (104) is captured
by the camera (210);
an image processing module (214) coupled to the processor (202), the image
processing module (214) configured to:
measure a first distance between a first reference point and a second
reference point of the target object (104) in the first image;
measure a second distance between the first reference point and the second
reference point of the target object (104) in the second image; and
a distance computing module (112) coupled to the processor (202), the distance
computing module (112) configured to determine the distance between the target object
(104) and at least one of the first location and the second location based on the first
distance, the second distance, and the displacement.
7. The device (102-1) as claimed in claim 6, wherein the image processing module (214) is
further configured to:
ascertain whether an image plane of the first image is parallel to the image plane
of the second image, based on measurements obtained from at least one inbuilt sensor of
the computing device (102-1);
determine an angle between the image plane of the first image and the image
plane of the second image on ascertaining the image plane of the first image and the
image plane of the second image not to be parallel; and
correct the second distance by a factor of cosine function of the angle.
8. The device (102-1) as claimed in claim 6, wherein the image processing module (214) is
further configured to compute the first distance and the second distance in terms of number of
pixels.
9. The device (102-1) as claimed in claim 6, wherein the image processing module (214) is
further configured to determine at least one of the first reference point and the second reference
point of the target object (104) in the first image based on at least one of one or more image
parameter and a user input.
10. The device (102-1) as claimed in claim 6, wherein the image processing module (214) is
further configured to:
ascertain an angle between a line of sight of the target object (104) from the first
location (302-1) and a line of sight of the target object (104) from the second location (302-2);
determine whether the angle exceeds a predefined threshold value;
compute a correction factor based on the angle, on determining the angle to exceed
the predefined threshold value; and
correct the distance between the target object (104) and at least one of the first
location and the second location based on the correction factor.
Dated this 30 November 2012

Documents

Orders

Section Controller Decision Date

Application Documents

# Name Date
1 3693-del-2012-Correspondence Others-(10-12-2012).pdf 2012-12-10
2 3693-del-2012-Form-1-(26-12-2012).pdf 2012-12-26
3 3693-del-2012-Correspondence-others-(26-12-2012).pdf 2012-12-26
4 Form-5.pdf 2013-01-16
5 Form-3.pdf 2013-01-16
6 Form-1.pdf 2013-01-16
7 Drawings.pdf 2013-01-16
8 3693-DEL-2012-RELEVANT DOCUMENTS [08-05-2018(online)].pdf 2018-05-08
9 3693-DEL-2012-Changing Name-Nationality-Address For Service [08-05-2018(online)].pdf 2018-05-08
10 3693-DEL-2012-AMENDED DOCUMENTS [08-05-2018(online)].pdf 2018-05-08
11 3693-DEL-2012-FER.pdf 2019-07-30
12 3693-DEL-2012-PA [19-09-2019(online)].pdf 2019-09-19
13 3693-DEL-2012-ASSIGNMENT DOCUMENTS [19-09-2019(online)].pdf 2019-09-19
14 3693-DEL-2012-8(i)-Substitution-Change Of Applicant - Form 6 [19-09-2019(online)].pdf 2019-09-19
15 3693-DEL-2012-Correspondence-101019.pdf 2019-10-14
16 3693-DEL-2012-OTHERS-101019.pdf 2019-10-14
17 3693-DEL-2012-CLAIMS [28-01-2020(online)].pdf 2020-01-28
18 3693-DEL-2012-COMPLETE SPECIFICATION [28-01-2020(online)].pdf 2020-01-28
19 3693-DEL-2012-DRAWING [28-01-2020(online)].pdf 2020-01-28
20 3693-DEL-2012-FER_SER_REPLY [28-01-2020(online)].pdf 2020-01-28
21 3693-DEL-2012-OTHERS [28-01-2020(online)].pdf 2020-01-28
22 3693-DEL-2012-US(14)-HearingNotice-(HearingDate-15-11-2021).pdf 2021-10-17
23 3693-DEL-2012-Correspondence to notify the Controller [10-11-2021(online)].pdf 2021-11-10
24 3693-DEL-2012-FORM-26 [15-11-2021(online)].pdf 2021-11-15
25 3693-DEL-2012-Written submissions and relevant documents [25-11-2021(online)].pdf 2021-11-25
26 3693-DEL-2012-PatentCertificate27-12-2021.pdf 2021-12-27
27 3693-DEL-2012-IntimationOfGrant27-12-2021.pdf 2021-12-27
28 3693-DEL-2012-RELEVANT DOCUMENTS [09-09-2023(online)].pdf 2023-09-09

Search Strategy

1 3693_DEL_2012_search_26-07-2019.pdf

ERegister / Renewals

3rd: 04 Jan 2022 (From 30/11/2014 To 30/11/2015)
4th: 04 Jan 2022 (From 30/11/2015 To 30/11/2016)
5th: 04 Jan 2022 (From 30/11/2016 To 30/11/2017)
6th: 04 Jan 2022 (From 30/11/2017 To 30/11/2018)
7th: 04 Jan 2022 (From 30/11/2018 To 30/11/2019)
8th: 04 Jan 2022 (From 30/11/2019 To 30/11/2020)
9th: 04 Jan 2022 (From 30/11/2020 To 30/11/2021)
10th: 04 Jan 2022 (From 30/11/2021 To 30/11/2022)
11th: 06 Apr 2022 (From 30/11/2022 To 30/11/2023)
12th: 28 Oct 2023 (From 30/11/2023 To 30/11/2024)
13th: 14 Nov 2024 (From 30/11/2024 To 30/11/2025)
14th: 23 Oct 2025 (From 30/11/2025 To 30/11/2026)